DSL Marketing Myrtle Beach

Tuesday, December 1, 2009

Four New Signals in Search

Brought to you by Eric Enge

Search engines constantly look for new signals they can use to improve the quality of the results they provide to users. Ultimately, user satisfaction is a critical component in retaining or increasing their market share, especially over the long term. Let's explore some of these new potential signals and the way search engines evaluate and make decisions to use a new signal.

Back in the days of AltaVista, search engines were keyword-centric. These were the days when spammers loaded meta tags with large numbers of keywords, and also used invisible text to jack up the perceived relevance and value of a page.

Google drove the next generation with its link-centric algorithm. Spammers attacked and manipulated this algorithm as well, but as Google tuned it, the impact of spam stayed much lower than it was in the keyword-centric days of search.

All the search engines rely heavily on links today, and these will remain critically important for the future. However, the complexity of the link algorithms in use today far exceeds that of the original PageRank paper by Larry Page and Sergey Brin.

Spammers still have an impact on search results, and the search engines want to continue reducing that impact as much as possible. To aid in this, they continue to evaluate new potential signals that can improve search quality while making life harder for spammers. When good ideas are found, they are implemented.

Page Load Time

Google engineer Matt Cutts discussed this factor at Pubcon in November. Cutts indicated that there is a strong movement within Google to make page load time a ranking factor because pages that load quickly improve the user experience.

Because that is the case, why not make it a ranking factor? Cutts indicated that Google could start using this within the next year.
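
To make the idea concrete, here is a minimal Python sketch of timing how long a page takes to deliver its HTML, using only the standard library. It measures raw server response and transfer time for a single URL; whatever Google actually measures (full rendering, images, scripts, real-user data) would be far more involved, so treat this purely as an illustration.

    # Minimal sketch: time how long a page takes to return its HTML.
    # This only captures server response plus transfer time for the raw
    # document, not full rendering -- an illustration, not Google's method.
    import time
    import urllib.request

    def measure_load_time(url, timeout=10):
        """Return seconds elapsed fetching the page body, or None on failure."""
        start = time.monotonic()
        try:
            with urllib.request.urlopen(url, timeout=timeout) as response:
                response.read()  # pull the full body so transfer time is included
        except OSError:
            return None
        return time.monotonic() - start

    if __name__ == "__main__":
        elapsed = measure_load_time("https://www.example.com/")
        if elapsed is not None:
            print(f"Fetched in {elapsed:.2f} seconds")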

Clicks

In an interview, Josh Cohen of Google News indicated that click data is used as a ranking signal in Google News. In rough terms, Google knows what a normal distribution of clicks across the results looks like.

Data leaked by AOL in 2006 suggested that the first result gets about 42 percent of the clicks, the second 12 percent, the third 9 percent, the fourth 6 percent, and so forth. If one particular result in the fourth position gets 10 percent of the clicks instead of something closer to 6 percent, that could be a sign that it deserves to move up in the SERP.
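
As a toy illustration of that idea (not anything Google has published), the sketch below compares the observed share of clicks at each position against the rough AOL figures above and flags results that draw noticeably more clicks than their position would predict. The 1.5x margin is an arbitrary assumption.

    # Toy illustration: flag results that draw more clicks than their
    # position would predict, using the rough leaked-AOL shares as a
    # baseline. The margin is illustrative, not a published threshold.
    EXPECTED_SHARE = {1: 0.42, 2: 0.12, 3: 0.09, 4: 0.06}

    def flag_overperformers(observed_clicks, margin=1.5):
        """observed_clicks maps position -> click count for one query's results."""
        total = sum(observed_clicks.values())
        flagged = []
        for position, clicks in sorted(observed_clicks.items()):
            expected = EXPECTED_SHARE.get(position)
            if expected is None or total == 0:
                continue
            share = clicks / total
            # A result getting ~10% of clicks in position 4 (expected ~6%)
            # would be flagged as a candidate to move up.
            if share > expected * margin:
                flagged.append((position, round(share, 3), expected))
        return flagged

    print(flag_overperformers({1: 420, 2: 120, 3: 90, 4: 100, 5: 50}))
    # -> [(4, 0.128, 0.06)]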

Cohen also indicated that Google News doesn't use links as a ranking factor. But if click data works in the Google News environment, it isn't a stretch to imagine that it would be helpful in Web search as well.

Web References

It's well known that Google's Local Search results use Web references as a ranking factor. A Web reference is a mention of a business that isn't implemented as a link.

Web references count as votes in a manner similar to the way links are used. As with click data, it isn't a stretch to imagine that they could start to carry some weight in Web search as well.
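
For illustration, here is a minimal Python sketch of spotting unlinked mentions on a page: it counts occurrences of a business name that appear in page text but not as anchor text of a link. The business name and markup are made up for the example, and a real system would also handle name variants, addresses, and phone numbers.

    # Minimal sketch: count "Web references" -- mentions of a business name
    # that are not the anchor text of a link. Sample name and markup are
    # invented for illustration.
    from html.parser import HTMLParser

    class MentionCounter(HTMLParser):
        def __init__(self, name):
            super().__init__()
            self.name = name.lower()
            self.in_link = 0
            self.unlinked_mentions = 0

        def handle_starttag(self, tag, attrs):
            if tag == "a":
                self.in_link += 1

        def handle_endtag(self, tag):
            if tag == "a" and self.in_link:
                self.in_link -= 1

        def handle_data(self, data):
            if not self.in_link:
                self.unlinked_mentions += data.lower().count(self.name)

    html = '<p>Visit Acme Plumbing on Main St. <a href="/x">Acme Plumbing</a> site.</p>'
    counter = MentionCounter("Acme Plumbing")
    counter.feed(html)
    print(counter.unlinked_mentions)  # 1 -- the linked mention is not counted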

Closely related to this is the treatment of nofollowed links. Just because a link has the nofollow attribute doesn't mean that it counts for absolutely nothing. Certainly, nofollow links in blog and forum comments will count for nothing. Nofollowed links that are implemented in something that looks like an ad will likewise pass no link juice.

However, other sites implement nofollow policies on all external links, such as many U.S. government sites. These sites are trying to identify resources that they consider valuable, even though they nofollow the links. The search engines could choose to associate some value with these links anyway. Remember, the goal is search quality.
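
If you want to see exactly what the engines key off, the sketch below separates followed from nofollowed links on a page by checking each anchor's rel attribute. The sample markup is invented for the example.

    # Minimal sketch: classify links as followed or nofollowed based on the
    # rel attribute. Sample markup is illustrative only.
    from html.parser import HTMLParser

    class LinkClassifier(HTMLParser):
        def __init__(self):
            super().__init__()
            self.followed, self.nofollowed = [], []

        def handle_starttag(self, tag, attrs):
            if tag != "a":
                return
            attrs = dict(attrs)
            href = attrs.get("href")
            if not href:
                return
            rel = (attrs.get("rel") or "").lower().split()
            (self.nofollowed if "nofollow" in rel else self.followed).append(href)

    page = ('<a href="https://example.com/a">ad</a>'
            '<a rel="nofollow" href="https://example.gov/resource">resource</a>')
    classifier = LinkClassifier()
    classifier.feed(page)
    print(classifier.followed)    # ['https://example.com/a']
    print(classifier.nofollowed)  # ['https://example.gov/resource']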

Social Media

Facebook and Twitter are all the rage these days, and there are a lot of potential signals available from these sites. These can be treated as a type of Web reference by the search engines.

What makes them interesting is the "freshness" of the signal. A surge of discussion on Twitter about some world event could indicate that the topic of the discussion is a hot story. The real-time responsiveness of these sites can provide a strong signal.
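
As a rough sketch of how such a freshness signal might be detected (the window and multiplier here are arbitrary assumptions, not anything the engines disclose), the code below flags a topic as surging when the latest hour's mention count far exceeds the recent hourly average.

    # Rough sketch: flag a topic as "surging" when mentions in the latest
    # hour far exceed the recent hourly average. Window and multiplier are
    # arbitrary assumptions for illustration.
    def is_surging(hourly_mentions, window=24, multiplier=5.0):
        """hourly_mentions: oldest-to-newest counts of mentions per hour."""
        if len(hourly_mentions) < window + 1:
            return False
        baseline = sum(hourly_mentions[-(window + 1):-1]) / window
        latest = hourly_mentions[-1]
        return latest > max(baseline, 1.0) * multiplier

    history = [3, 2, 4, 3, 5, 2, 3, 4, 2, 3, 5, 4,
               3, 2, 4, 3, 2, 5, 3, 4, 2, 3, 4, 3, 60]
    print(is_surging(history))  # True -- 60 mentions vs. a baseline of ~3 per hour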

Summary

How and when search engines will use these signals isn't clear. Of course, the search engines will never spell it out for us.

An important goal for them is to reduce the impact of spam, and a lack of clarity about how they use the signals available to them helps their cause. Also, just because we can identify and talk about a potential signal doesn't mean it will be useful.

Search engines have to evaluate a signal's impact on the results before relying on it. Certain types of signals are "noisy," meaning that they provide incomplete, inaccurate, or biased information. For example, a Web site used primarily by one segment of the population (e.g., teenage girls) may generate a lot of usage data, yet emit many signals that don't hold up well for retired people.

One key thing to take away from all this is the search engines' focus on user satisfaction. If you create a Web site that is genuinely useful to users, it's likely your site will emit many signals that tell the search engines it's a good result.

Meet Eric Enge at SES Chicago on December 7-11, 2009. Now in its 11th year, the only major Search Marketing Conference and Expo in the Midwest will be packed with 70+ sessions covering PPC management, keyword research, Search Engine Optimization (SEO), social media, local, mobile, link building, duplicate content, video optimization and usability, while offering high-level strategy, keynotes, an exhibit floor, networking events and more.
