Robots vs Humans

I have focused on optimising for human readers so far. So what about optimising for search engines and the robots that crawl and index a site? Visitors need an enjoyable and informative experience on your site if they are to return, but they also need to be able to find you through Google and other search engines in the first place.

Fortunately, the two are quite easy to reconcile. Search engine robots prefer written content above all: plain body text, without tables, frames, Flash, or other unnecessary clutter. Humans, likewise, on most of the sites they visit, want information in the form of easily understandable content. Flash can of course be used, and since 2008 Google, for example, has been able to index it quite well. But if you build your site's entire navigation in Flash, it becomes cumbersome for users to get around, and the same goes for search engine robots.

Humans will mostly scan your copy for the keywords they searched for on Google and rapidly pick out whatever information they need. So you do not need exceptional writing skills to produce good, sensible copy, but people will notice if something does not make sense, or if you stuff your text with keywords without giving useful information. Keyword-stuffing (repeating your keywords excessively) is not a useful tactic with search engines either. Search engines constantly change their algorithms, and nowadays they have far more sophisticated methods of deciding whether a site's content is relevant to the keyword a user types in.

So does optimising for humans take care of the robots by itself? Not entirely, but partially, yes. It is luckily not a question of Google and other search engines making up arbitrary rules that you have to follow to succeed in the natural search results. Search engines want users to find their results relevant and satisfactory, because, like any business, they want returning customers. So they are constantly improving their algorithms to provide relevant results for humans, and we can expect even more of this in the future, with ever more sophisticated search algorithms and user-behaviour studies. If you optimise only for search engine robots and neglect humans, your methods will soon be outdated. But you still need a few tweaks just for the benefit of search engines.

Bizarre things still happen with Google searches, though. The "miserable failure" Google bomb, which for years sent searchers to George W. Bush's official biography, is an amusing case. And some search engines still cannot always cope effectively with keyword-stuffing and certain duplicate content issues.

The key, then, is to optimise consciously for visitors while tweaking a few further things (HTML, on-site optimisation, keyword density) specifically for search engines. This provides a firm foundation for further SEO and higher rankings, and it also leads to good conversion rates.
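
To make those HTML tweaks concrete, here is a minimal sketch of the kind of on-page elements meant above. The shop name, title text, and description are hypothetical examples, not taken from any real site:

<head>
  <!-- a descriptive, keyword-relevant title and meta description -->
  <title>Handmade Leather Bags | Example Shop</title>
  <meta name="description" content="Handmade leather bags, crafted in small batches and shipped worldwide.">
</head>
<body>
  <!-- one clear heading per page, matching what users search for -->
  <h1>Handmade Leather Bags</h1>
  <!-- plain, crawlable body text rather than Flash-only navigation -->
  <p>Every bag is cut and stitched by hand in our workshop.</p>
</body>

Nothing exotic: the same title, description, heading, and body text that help a robot judge relevance also tell a human visitor at a glance what the page is about.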

You no longer have to choose between robots and humans. It may have been an important question once, but with ever more sophisticated algorithms, this old dilemma is bound to disappear.
