26 Aug 2012

SEO Competition Facts: Straight From The Horse’s Mouth …



“So behind every algorithm, and therefore behind every search result, is a team of people responsible for making sure Google search makes the right decisions when responding to your query. Obviously, there’s no other way it could have happened: Google is a living example of what’s possible when brilliant people devise a smart algorithm and marry it to limitless computing resources.” – Tom Krazit, The human process behind Google’s algorithm, CNET, 09/07/10

Obviously, only so many pages can rank prominently for any given keyword or key phrase. Google provides us with the answers if we look for them. So what are the basic SEO competition facts, and what are the points of note to target for a new website or web page?

Google relies on roughly 200 different algorithmic signals, many of which are well known in the SEO industry. Examples include how often the search terms occur on the web page, whether they appear in the title, and whether synonyms of the search terms occur on the page. The best known of these signals is PageRank (PR), actually named for Larry Page (Google’s co-founder and CEO).
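The exact weighting of these signals is Google’s secret, but a toy sketch makes the idea concrete. Everything in the snippet below (the weights, the synonym table, and the scoring function itself) is hypothetical, invented purely to illustrate term frequency, title matches, and synonym matches as ranking inputs; it is not Google’s actual formula:

```python
# Illustrative only: a toy scorer for the kinds of on-page signals
# described above. All weights and names here are hypothetical.

SYNONYMS = {"seo": {"search engine optimization", "search optimization"}}

def on_page_score(query: str, title: str, body: str) -> float:
    """Score a page for one query using three simple signals."""
    terms = query.lower().split()
    body_lower = body.lower()
    body_words = body_lower.split()
    score = 0.0

    for term in terms:
        # Signal 1: how often the search term occurs in the body text.
        score += body_words.count(term)
        # Signal 2: a boost if the term appears in the page title.
        if term in title.lower():
            score += 5.0
        # Signal 3: partial credit when a known synonym appears instead.
        for synonym in SYNONYMS.get(term, set()):
            if synonym in body_lower:
                score += 2.0
    return score

print(on_page_score("seo facts",
                    title="SEO Competition Facts",
                    body="These search engine optimization facts matter."))
```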

PageRank operates by totaling the number and quality of links to a page in order to produce a rough estimate of how important a particular website is; the basic assumption here being that more important websites tend to receive more links from other websites.
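The mechanics are easier to see in miniature. Below is a minimal sketch of the classic PageRank power iteration over a toy three-page link graph; the damping factor of 0.85 follows the original Brin and Page paper, and the graph itself is invented for illustration. Google’s production signal is, of course, far more involved:

```python
# Minimal sketch of the classic PageRank power iteration on a toy link graph.

def pagerank(links, damping=0.85, iterations=50):
    """links maps each page to the list of pages it links out to."""
    pages = list(links)
    n = len(pages)
    rank = {page: 1.0 / n for page in pages}  # start with a uniform score

    for _ in range(iterations):
        new_rank = {page: (1.0 - damping) / n for page in pages}
        for page, outlinks in links.items():
            if not outlinks:  # dangling page: spread its rank everywhere
                for p in pages:
                    new_rank[p] += damping * rank[page] / n
            else:
                share = damping * rank[page] / len(outlinks)
                for target in outlinks:
                    new_rank[target] += share
        rank = new_rank
    return rank

# Toy graph: the pages that receive more links accumulate a higher score.
toy_web = {
    "home": ["about", "blog"],
    "about": ["home"],
    "blog": ["home", "about"],
}
for page, score in sorted(pagerank(toy_web).items(), key=lambda kv: -kv[1]):
    print(f"{page}: {score:.3f}")
```

Note how "home" and "about", which each receive two inbound links, end up outranking "blog", which receives none; that is the "more links from other websites" assumption in action.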

The Google “Panda” update, as defined by Google itself:

“…recently we launched a pretty big algorithmic improvement to our ranking—a change that noticeably impacts 11.8% of Google searches” … “The Panda update was designed to improve the user experience by catching and demoting low-quality sites that did not provide useful original content or otherwise add much value.”

As our previous articles on the Google “Penguin” updates have demonstrated, the algorithms Google employs are in a near-constant state of flux, refined under the banner of ‘constant improvement’ through ongoing testing and evaluation.

Google tends to apply manual intervention to websites only to the following extent:

1. Security issues and concerns
2. Legal issues (X-rated content and copyright issues)
3. Exception lists (false identification of websites)
4. Spam (putting defenses in place against cloaking, link stuffing, paid links, etc.)

By following the rules of the road, a new website that provides legitimate value in its topic and subject matter will, 99 times out of 100, encounter no difficulty ranking organically for its associated keywords. It’s fine to target keywords and write content around the phrases you would like customers to find; it’s how you go about it that determines whether your website appears at the top of page 1 or is buried deep within the results.
You can read more on this topic here.