Five reasons why some websites remain low in search engine results pages have been listed, urging site owners to review everything from the technical side to the content side.

With firms hoping to appear high in search engine rankings so as to be more likely to be clicked on, the fifth reason for ranking low was that the host server was too slow, meaning that the search bot cannot index the site, Glenn McCarthy said.
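
The article does not say how to gauge server speed, but a minimal sketch in Python can time how long a page takes to serve its HTML, which is roughly what a crawler experiences (the URL and the multi-second red-flag threshold mentioned in the comment are illustrative assumptions, not figures from the source):

```python
import time
import urllib.request

def time_fetch(url: str, attempts: int = 3) -> float:
    """Return the average time in seconds to fetch a URL's raw HTML."""
    total = 0.0
    for _ in range(attempts):
        start = time.monotonic()
        with urllib.request.urlopen(url, timeout=10) as response:
            response.read()
        total += time.monotonic() - start
    return total / attempts

if __name__ == "__main__":
    # Crawlers tend to give slow hosts less attention; fetches that
    # routinely take several seconds are a warning sign (assumed threshold).
    avg = time_fetch("https://example.com/")
    print(f"Average fetch time: {avg:.2f}s")
```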

The internet consultant believes that switching hosts could solve this, but the next problem is poor optimisation. In the fast-changing online world, many sites still use the old technique of hiding text consisting purely of keywords in the same colour as the background, hoping that search spiders will read it but consumers will not. However, most search engines now ignore this trick, and he instead urges sites to publish good content rich in relevant keywords.
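
One way to sanity-check the legitimate approach is to measure how often your target keywords actually appear in a page's visible text. A rough sketch, with made-up sample text and keywords (the source names no metric or tooling):

```python
import re
from collections import Counter

def keyword_density(text: str, keywords: list[str]) -> dict[str, float]:
    """Share of total words accounted for by each keyword (case-insensitive)."""
    words = re.findall(r"[a-z']+", text.lower())
    counts = Counter(words)
    total = len(words) or 1  # avoid dividing by zero on an empty page
    return {kw: counts[kw.lower()] / total for kw in keywords}

# Hypothetical page copy and keywords for illustration only.
page_text = "Fresh flowers delivered daily. Order flowers online for same-day flower delivery."
for kw, share in keyword_density(page_text, ["flowers", "delivery"]).items():
    print(f"{kw}: {share:.1%}")
```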

The third reason for a low search ranking is low link popularity: links must not simply point to other sites, but to relevant or similar ones, particularly to websites that already rank highly.

The second problem is content, in that there is little text for search engines to find. Even if a page is rich in text, content reached via JavaScript menus, or presented as text within graphics or images, will not be found, rendering it invisible to search engines.
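
A basic crawler that does not execute JavaScript sees only the HTML the server returns, so a quick check is to fetch the raw source and confirm that key phrases appear in it. A minimal sketch, assuming a reachable URL and illustrative phrases (phrases injected by JavaScript, or drawn into images, will come back missing):

```python
import urllib.request

def phrases_in_raw_html(url: str, phrases: list[str]) -> dict[str, bool]:
    """Check which phrases appear in the raw HTML a non-JS crawler would see."""
    with urllib.request.urlopen(url, timeout=10) as response:
        html = response.read().decode("utf-8", errors="replace").lower()
    return {p: p.lower() in html for p in phrases}

# The URL and phrase here are placeholders for illustration.
results = phrases_in_raw_html("https://example.com/", ["Example Domain"])
for phrase, found in results.items():
    print(f"{'FOUND' if found else 'MISSING'}: {phrase}")
```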

Finally, Mr McCarthy urges site owners to have a little patience, warning that websites, particularly new ones, can wait up to a year before seeing results, but that if his tips are followed, results do come.

In addition, he urges designers not just to consider this advice as a way of attracting search bots, but to think about readers: making sure a site is attractive and worth reading will itself help propel it up the rankings.
