Have you ever wondered how Google handles paid directories? Matt Cutts answers the question “Are paid directories held to the same standards as paid links?” submitted by the well-known Blind Five Year Old blog, and I personally liked what Matt had to say. Before you decide to pay to be in an online directory or […]

Follow SEJ on Twitter @sejournal


Matt explains this newer feature and tells you about all the options you have with Fetch as Googlebot.



Google’s Matt Cutts answers a question about how SafeSearch works. The link Matt mentions is below the video: “If you’re a website owner and you think that your content is mistakenly being filtered by SafeSearch, you can let us know here: http://support.google.com/webmasters/bin/request.py?contact_type=safe_search.”






Google updated its search algorithm this week to help reduce webspam in its search results.

These changes were made in response to increased criticism of Google and its search engine results. The criticism has been partly inspired by the emergence of newer forms of webspam alongside traditional webspam (pages stuffed with keywords and phrases, without context or meaning, that “cheat” their way to higher search rankings).

The latest webspam outbreaks commonly come from content farms and sites that syndicate content. Earlier this month, Stack Overflow‘s Jeff Atwood pointed out that over the past year, some content syndicators have routinely begun outranking Stack Overflow on Google. In other words, the syndicators are outranking the originals.

In Stack Overflow‘s case, the problem was bad enough that a community member built a Google Chrome extension designed to redirect users from the spammier syndicators back to Stack Overflow.
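The core of such an extension is simple: recognize a scraped question page by its URL and send the visitor to the original. A minimal sketch of that redirect logic is below; the scraper domain and URL pattern are hypothetical examples, not the actual sites the extension targeted.

```javascript
// Hypothetical helper: many scrapers preserve the original question id
// and slug in their own URLs, e.g.
//   https://example-scraper.com/questions/12345/some-title
// which maps directly back to the Stack Overflow original.
function toStackOverflowUrl(scraperUrl) {
  const match = scraperUrl.match(/\/questions\/(\d+)(?:\/([\w-]+))?/);
  if (!match) return null; // not a recognizable scraped question page
  const [, id, slug] = match;
  return `https://stackoverflow.com/questions/${id}` + (slug ? `/${slug}` : "");
}

// In a Chrome extension content script, the redirect itself would be:
//   const original = toStackOverflowUrl(window.location.href);
//   if (original) window.location.replace(original);
```

The design choice worth noting is that the mapping relies on scrapers copying URLs verbatim; a scraper that rewrites its paths would need a per-site rule instead.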

Matt Cutts, principal engineer at Google and head of the webspam team, responded to some of the criticism in a blog post and said Google would be “evaluating multiple changes that should help drive spam levels even lower, including one change that primarily affects sites that copy others’ content and sites with low levels of original content.” On his personal blog, Cutts confirmed that those changes have indeed gone into effect.

Cutts writes that this was a “pretty targeted launch” and that the “net effect is that searchers are more likely to see the sites that wrote the original content rather than a site that scraped or copied the original site’s content.”

More About: Google, matt cutts, Search Spam, spam, stack overflow, webspam