I have many examples of mistakes I have made myself, and of others who stumbled before me and whose mistakes I learned from. In this chapter we will talk about:
Complex JavaScript Menus
Dynamic Code on Pages
Bad XML Site Maps
Abnormal Keyword Placement
IP Delivery/Page Cloaking
What to Do if You Have Been Banned
Problem Pages and Workarounds
Validating Your HTML
When used correctly, Flash can enhance a visitor's experience, though it creates compatibility problems on mobile devices. The non-mobile side of your website shouldn't be built entirely in Flash, nor should your site navigation rely on Flash alone. Search engines have claimed for a couple of years now that they're better at crawling Flash, but it's still no substitute for good, crawlable site menus and content.
The same issues mentioned regarding Flash apply to AJAX. Google now claims it can read and index AJAX content, but I have not been able to find examples of this. AJAX can add to your site's user experience, but it has historically not been visible to search engine crawlers. Google offers guidelines to help make AJAX-based content crawlable, but the process is complicated, and the SEO best-practice recommendation remains the same: don't put important or unique content in AJAX.
Complex JavaScript Menus
That is still best practice today: Make sure your site navigation is presented in simple, easy-to-crawl HTML links.
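As a simple illustration, a crawl-friendly navigation menu is just ordinary HTML anchor links (the page names and URLs below are invented for the example):

```html
<!-- Plain HTML navigation: every link is a normal anchor tag
     that any search engine spider can follow. Example URLs only. -->
<ul>
  <li><a href="/index.html">Home</a></li>
  <li><a href="/services.html">Services</a></li>
  <li><a href="/about.html">About Us</a></li>
  <li><a href="/contact.html">Contact</a></li>
</ul>
```

No scripting is required; a quick sanity check is to view the page with JavaScript disabled and confirm that every menu link still works.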
A dynamic URL is most simply defined as one that has a “?” in it, like http://clickperfect.co.in/page.src?ID=3456
That’s a very simple dynamic URL, and today’s search engines have no trouble crawling something like that. But as dynamic URLs get longer and more complicated, search engines may be less likely to crawl them (for a variety of reasons, one of which is that studies show searchers prefer short URLs). So if your URLs look anything like this, you may have crawlability problems.
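For example, a parameter-heavy URL of the kind I mean might look something like this (the domain and parameters here are invented for illustration):

```
http://www.example-store.com/catalog.php?sid=8f3a2c&cat=12&subcat=78&prod=3456&color=blue&ref=nav&page=4
```

Compare that to the short, clean URL above: the more parameters and session IDs a URL carries, the less attractive it is to both crawlers and searchers.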
Google’s webmaster help page says it well: “…be aware that not every search engine spider crawls dynamic pages as well as static pages. It helps to keep the parameters short and the number of them few.”
Dynamic Code On Pages
Code that is held in a database and rendered dynamically, so that each page delivers unique output, can have issues being indexed by the search engines. Some pages also suffer from what is termed “code bloat.”
Code bloat describes situations where the code required to render your page is dramatically larger than the actual content of the page. In many cases this is not something you’ll need to worry about; search engines have gotten better at dealing with pages that have heavy code and little content.
First, you are not required to have a robots.txt file on your website; millions of websites are doing just fine without one. But if you use one (perhaps because you want to make sure your Admin or Members Only pages are not crawled), be careful not to completely block spiders from your entire website. It’s easy to do with just a single line of code.
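A correctly targeted robots.txt blocks only the directories you actually want kept out of the index. A minimal sketch (the directory names here are hypothetical) might look like this:

```
# Block only the private areas; everything else stays crawlable
User-agent: *
Disallow: /admin/
Disallow: /members/
```

The `User-agent: *` line applies the rules to all crawlers, and each `Disallow` line names one path to keep them out of.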
Under no circumstances should your robots.txt file have something like this:
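The dangerous directive in question is the classic block-everything rule:

```
User-agent: *
Disallow: /
```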
That code will block all bots, crawlers, and spiders from accessing your website. If you ever have questions about using a robots.txt file, visit robotstxt.org.