In a recent analysis of over 500 websites ranking on the first page of search results with a PageRank of 4 or higher, some interesting and useful information came through. Here are a few of the many things I discovered.
Page Content Length
Google does not have a page length maximum that Googlebot will crawl. There is a false rumor that a page won’t be completely crawled if it’s over 1101Kb or 1000 words; I have heard both numbers from people. The analysis shows this to be completely wrong. However, there should be at least 400 words: all high-ranking pages in Google SERPs (positions 1–10) average 400 or more words. The main reason not to overstuff a page is that it tends to turn off some visitors. Long pages with lots of copy result in shorter visit times in the majority of cases. Keep it relevant and to the point and you’ll be okay.
This is the first year I’ve seen pages ranked high with a keyword density in the double digits (>15%), compared to last year’s figure of <8%. These high-density sites are usually ones such as Amazon, eBay, Nextag, etc. I believe they are getting a pass from Google because of the revenue they generate; Amazon and eBay spend hundreds of thousands of dollars every day on AdWords. The best recommendation is to keep it natural and not stuff keywords. Use the primary keyword more than the rest, but do sprinkle in some of the non-primary ones; this reinforces the subject of the content. I would aim for around 3% to 5% for the primary keyword phrase, if it works naturally.
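To make the 3% to 5% target concrete, here is a minimal sketch of the underlying arithmetic (my own illustration, not any official tool): a phrase’s density is the share of the total word count taken up by its occurrences.

```python
def keyword_density(text, phrase):
    """Percentage of the text's words accounted for by occurrences of a phrase."""
    words = text.lower().split()
    phrase_words = phrase.lower().split()
    n = len(phrase_words)
    if not words or n == 0:
        return 0.0
    # Count occurrences of the phrase as a sliding window over the word list.
    hits = sum(words[i:i + n] == phrase_words
               for i in range(len(words) - n + 1))
    # Each hit contributes n words toward the phrase's share of the text.
    return 100.0 * hits * n / len(words)
```

For example, a one-word keyword appearing once in 20 words of copy scores 5%, right in the recommended band.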
Location of Keywords on a Page
The only true rule we know is that keywords in any section of the page outside the <body> </body> tags don’t count toward keyword density. So everything shown in a Meta tag is looked at separately and compared against the <body> content. Keywords appearing in the gutters of a page are also not counted by the search engines. Most automated keyword density tools I have used include every instance of the keyword anywhere on a page and therefore should not be relied upon; manually calculating the density score is the best method.
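Since only <body> content counts, a manual calculation needs the visible body text first. Here is a rough, regex-based sketch of that extraction step (my own illustration, adequate only for simple well-formed pages; a real HTML parser would be more robust):

```python
import re

def visible_body_text(html):
    """Extract the text between <body>...</body>, dropping scripts and tags."""
    m = re.search(r"<body[^>]*>(.*?)</body>", html, re.S | re.I)
    body = m.group(1) if m else html
    # Remove script/style blocks first, then any remaining tags.
    body = re.sub(r"<(script|style)[^>]*>.*?</\1>", " ", body, flags=re.S | re.I)
    body = re.sub(r"<[^>]+>", " ", body)
    return " ".join(body.split())
```

Running a density count over this output, rather than over the raw page source, avoids inflating the score with Meta tags, titles, and markup.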
It has been said that the primary keyword should appear at least once at the start, in the middle, and at the end of the content, but I have no proof of this at all. It is now believed that Google uses a technique called Latent Semantic Indexing (LSI), which lets it make associations between the actual keywords and words that are similar in meaning. So, again, the key is to keep it natural and it will be okay.
The same study of over 500 websites with a PR of 4 or higher shows that on-site SEO contributes about 20% of that PR; the remaining 80% depends on the overall site structure and on outgoing and inbound links. On-site work is the place to start your SEO tasks, since it gives the search engines a clear understanding of what your site is about. Once your site is in decent shape (perfection not required), begin building off-site backlinks from authority sites with a higher PR than your own.
Post by guest blogger Hal Major of I’m Just Sayin’