Google search engine optimisation and the 80/20 rule

Search engine optimisation, or optimization (with a ‘z’, or is that a ‘zee’, if you’re from across ‘the pond’), techniques are constantly evolving. This evolution is in response to the evolution of search engines such as Google, Yahoo and MSN. Google in particular has come to be seen as the most advanced and innovative search engine, armed as it is with an array of anti-spam technology.

Google’s increasing use of anti-spam features has meant that optimising websites for Google has become much harder, and it’s now not simply a case of opening your website’s source files in Notepad, adding some keywords to your various HTML tags, uploading your files and waiting for the results. In fact, in my opinion (and I’m sure others will agree with me), this type of optimisation, commonly referred to as onpage optimisation, will only ever be 20% effective at achieving rankings for any keywords which are even mildly competitive. Those of us who aced maths at school will know this leaves 80% unaccounted for.

That 80% corresponds to offpage optimisation. Offpage optimisation is all about the number of links pointing to your site and its pages, the actual linking text (anchor text) of those links, and the quality of the pages the links sit on. Offpage optimisation is now, without question, the overwhelmingly dominant factor deciding where a website will rank in Google. That, then, is what I mean by the 80/20 rule. I’m not talking about the Pareto principle, which says that in anything a few things (20 percent) are vital and many (80 percent) are trivial; I’m not sure that applies to SEO.

So what is the reasoning behind this? Why does Google give so much ‘weight’ (80%) to offpage optimisation efforts and so little (20%) to onpage optimisation? Put simply, it is all about the quality of their results. Whereas onpage optimisation is completely controlled by the webmaster, and can therefore be abused by an unscrupulous one, offpage optimisation is not controlled by the site owner as such, but rather by other webmasters, other websites and indeed the Web in general. This means it is much harder to pull off any deceptive or spammy offpage optimisation techniques in the hope of gaining an unfair advantage for a website in the Google SERPs (Search Engine Results Pages), though it does not mean it is impossible.
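To make that 80/20 idea a little more concrete, here is a minimal, purely illustrative sketch in Python of what weighting offpage signals far more heavily than onpage signals might look like. The weights and scores are assumptions based only on the rough split described above; Google’s real ranking algorithm is far more complex and is not public.

```python
# Illustrative only: a toy scoring function reflecting the rough 80/20 split
# discussed above. The weights and the 0..1 score inputs are assumptions,
# not anything Google has published.

def toy_rank_score(onpage_score: float, offpage_score: float) -> float:
    """Combine onpage and offpage signals (each assumed to be in 0..1)."""
    ONPAGE_WEIGHT = 0.2   # keywords in tags, titles, body copy
    OFFPAGE_WEIGHT = 0.8  # inbound links, anchor text, linking-page quality
    return ONPAGE_WEIGHT * onpage_score + OFFPAGE_WEIGHT * offpage_score

# A perfectly tweaked page with weak links still scores lower than a
# modestly optimised page with strong links:
print(toy_rank_score(onpage_score=1.0, offpage_score=0.1))  # 0.28
print(toy_rank_score(onpage_score=0.5, offpage_score=0.9))  # 0.82
```

The point of the sketch is simply that, under a weighting like this, no amount of onpage tweaking can outweigh a strong offpage profile.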

Let’s spend a paragraph or two on just why offpage factors such as inbound links are considered by Google to be such a good measure of relevancy, making offpage optimisation by far the most effective method of optimisation. Take the anchor text of inbound links, for example. If Google sees a link from SITE A to SITE B with the actual linking text being the words ‘data recovery london’, then SITE B has just become more relevant, and therefore more likely to appear higher in the rankings, when somebody searches for ‘data recovery london’. SITE B has no control over SITE A (in most cases…) and Google knows this. Google can then look at the link text and ask itself: why would SITE A link to SITE B with the exact words ‘data recovery london’ if SITE B wasn’t ‘about’ ‘data recovery london’? There is no good answer, so Google deems SITE B to be ‘about’ ‘data recovery london’.

I said ‘in most cases’ above because webmasters often have several sites and will crosslink them with keyword-rich anchor text. However, there are only so many websites and crosslinks any one webmaster can manage; again, Google knows this, and so as the number of backlinks and occurrences of keyword-rich anchor text grows (and with it the unlikelihood of anything unnatural like crosslinking going on), so too does the relevance of the website all those backlinks point to. Imagine hundreds or thousands of sites all linking to a site X with variations of ‘data recovery london’ type phrases as the linking text; Google can then be pretty damn sure that site X is ‘about’ ‘data recovery london’ and feel confident about returning it in the top 10 results. This is why Google puts so much value (80%) on offpage ranking factors such as links; they are simply the most reliable way of checking what a site is about and, indeed, how well it covers what it is about. This reliance on hard-to-cheat offpage factors is what produces the quality search results we all know, love and use daily.
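As a rough illustration of the reasoning in the last two paragraphs, here is a small, hypothetical Python sketch that aggregates the anchor text of inbound links. The link data, the example domains and the idea of simply counting phrases are made up for the example; a real search engine weighs many more signals, such as the quality and independence of the linking pages.

```python
# Illustrative only: a toy aggregation of inbound-link anchor text, showing
# why many independent links using similar phrases make a strong relevance
# signal. The data below is invented for the example.
from collections import Counter

inbound_links = [
    {"from": "site-a.example", "anchor": "data recovery london"},
    {"from": "site-b.example", "anchor": "data recovery london"},
    {"from": "site-c.example", "anchor": "london data recovery services"},
    {"from": "site-d.example", "anchor": "click here"},
]

def anchor_phrase_counts(links):
    """Count how often each anchor phrase points at the target site."""
    return Counter(link["anchor"].lower() for link in links)

counts = anchor_phrase_counts(inbound_links)
distinct_sources = len({link["from"] for link in inbound_links})

# The more distinct sites that independently use a keyword-rich anchor,
# the less likely it is to be one webmaster crosslinking their own sites,
# and the more confident a search engine can be that the target site is
# 'about' that phrase.
print(counts.most_common(1))   # [('data recovery london', 2)]
print(distinct_sources)        # 4
```

In this toy view, it is the combination of phrase frequency and the number of distinct linking sites that builds confidence, which mirrors the article’s point that scale makes the signal hard to fake.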

The moral of the story from an SEO perspective, then, is to spend less time on those little onsite tweaks which you think might make a big difference (but won’t) and work hard on what really counts: how the web ‘sees’ your website. The more quality (keyword-rich) inbound links your website has, the better the web’s ‘view’ of it will be, and therefore the better Google’s view of your site will be. What Google thinks of your site matters enormously, as Google ‘looks after’ the sites it likes.