For the past few months I have been writing SEO website content on assignment. It is great fun, and although the search engine algorithms have undoubtedly grown more complex, I suspect that the basic concepts and methodology I studied and formulated back in 2002 still hold true. Let’s explore this theory …
General Posits
First, it is necessary to define some general posits and realities regarding search engine optimisation, which will be presented as separate, seemingly unrelated concepts.
Content is King
Way back then, this was the rule when creating a website. The mantra was that search engines loved content. The more you had on your website, the more Google and Yahoo! loved your website, which then resulted in a high ranking.
For all that content to be king, it also had to be original and quality content. Relevancy was important. If you had a website about wheel nuts, then creating pages about squirrels loaded with nutty keyphrases would penalise your overall site ranking. Now keep in mind that this relevancy was checked by humans visiting your site and reading the content.
Computers Cannot Think
For all the glory and advances in computers, the machines are still only capable of doing what they are programmed to do by humans and still cannot learn anything by themselves. Machine learning may be hot science, but in reality it’s not true learning. It’s only recognising patterns that the computer was programmed to recognise.
All this means is that computers have been programmed to determine the relevancy of the copy on a website to its postulated purpose and, very importantly, to determine the uniqueness of that copy compared to other websites. In short, they’ve replaced the humans in a more efficient manner.
Statistical Patterns
This is where computers shine, provided they are correctly programmed. They can analyse a piece of text for keyword frequency, keyword spacing, and keyword distribution, which are then compared to predetermined values.
Now these values are what differentiate the old hands from the young bucks, and they are the reason for this post.
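To make the idea concrete, here is a minimal sketch of that kind of analysis. The function and the sample text are purely illustrative: real search engines guard their actual thresholds closely, so no target values are shown here.

```python
def keyword_stats(text, keyword):
    """Measure keyword frequency, average spacing, and distribution
    across a piece of text. Illustrative only -- real ranking systems
    use far more signals and undisclosed target values."""
    words = text.lower().split()
    positions = [i for i, w in enumerate(words) if w == keyword.lower()]
    total = len(words)

    # Frequency: share of all words that are the keyword.
    frequency = len(positions) / total if total else 0.0

    # Spacing: average gap (in words) between consecutive occurrences.
    gaps = [b - a for a, b in zip(positions, positions[1:])]
    avg_spacing = sum(gaps) / len(gaps) if gaps else None

    # Distribution: count of occurrences in each third of the text.
    thirds = [0, 0, 0]
    for p in positions:
        thirds[min(p * 3 // total, 2)] += 1

    return frequency, avg_spacing, thirds

text = ("Wheel nuts secure the wheel to the hub. Quality wheel nuts "
        "resist corrosion. Always torque wheel nuts to specification.")
freq, spacing, spread = keyword_stats(text, "nuts")
```

An optimiser would compare numbers like these against predetermined ranges; an even distribution across the text, for instance, reads more naturally than a cluster of keywords stuffed into the opening paragraph.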
Then and Now
Google did not exist when I first studied at a university. We searched using AltaVista, and Yahoo! was still a hand-edited directory. Courses and degrees in SEO and website design did not exist. We learned HTML in an afternoon and never spent time on the design, because we had information to share with the world.
We learned keyword frequency and emotive writing by studying the speeches of Cicero. Structure was learned from reading academic journals and the importance of brevity by searching for information using the card catalogue of the library. In short, we learned the how and why of language that was eventually programmed into search engines like Google and Bing.
Today, if you want a job in copywriting or SEO, then you must be in possession of a university degree majoring in … copywriting and SEO. Really? Yes, with total shock I saw that they actually offer B.Sc. degrees in website design and M.A. degrees in editing. Really really?
This means that the old hands, like me, who have all the knowledge, but the wrong qualifications, are precluded from obtaining jobs in copywriting and SEO work, even though we actually know more than the new graduates. Bummer.
So next time someone applies for a job, look beyond the new qualifications and remember that a B.A. degree obtained twenty years ago still meant something and entailed difficult language studies that eventually formed the foundation of modern SEO.