
A Look at Google’s Patent Application for a New Search Engine

» What’s Going on at Google Again?

Imagine an Internet without search engines. It’s hard to believe, but prior to 1994, we were all stumbling around cyberspace without a map or an address book. The only way to access a site was if you already had the URL! E-commerce was in its infancy, in part, because no one could find the door to the stores. Then, Yahoo! changed everything.

In 1994, Yahoo! introduced one of the first widely used search tools. It was a primitive, directory-style system that relied, almost exclusively, on key words. E-tailers, quick to see the advantage of search engines, sped up the recognition process by submitting their URLs, along with up to seven key words, which would then be entered into the Yahoo! database within six to eight weeks. Six to eight weeks! Today, some on-line businesses open and close in six to eight weeks, but then, 1994 was a long, long time ago in our digital age. Ancient history, in fact.

Anyway, back to our story. It didn’t take long for analysts to figure out how the Yahoo! SE did what it did. To start, it accepted the key words submitted by site owners. Then, once the site was recognized, the SE would stop by, count key words and rank the site, which, in turn, determined where that site showed up on search engine results pages, or SERPs as they’re known in the e-marketing industry.

Almost overnight, the search engine optimization (SEO) text industry was born, and every hack who ever thought him/herself a writer came crawling out of the woodwork producing key-word-rich text by the ton! By the ton! It didn’t have to make much sense, it didn’t have to be at all useful, it just had to be key word dense.

Naturally, the search results were less than high quality. In fact, they were horrible. All you had to do was add the key word ‘sex’ to your submitted list to show up on SERPs for porn sites — even if you were selling hand-crafted clothes for pampered pets.
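
To make the mechanics concrete, here is a minimal sketch of that kind of keyword-counting ranker in Python. It is a toy illustration only: the pages, keywords, and scoring function are invented for this example and have nothing to do with Yahoo!’s actual code.

```python
# Toy illustration of mid-90s-style keyword ranking: count how often a page
# mentions the query's key words and sort by the raw count.
# (Hypothetical example data; not any real engine's algorithm.)
def keyword_score(page_text: str, keywords: list[str]) -> int:
    words = page_text.lower().split()
    return sum(words.count(kw.lower()) for kw in keywords)

pages = {
    "pet-clothes.example.com": "pets pets pets sex sex sex clothes clothes",
    "honest-store.example.com": "quality hand-crafted clothes for pampered pets",
}
query = ["pets", "clothes", "sex"]

ranked = sorted(pages, key=lambda url: keyword_score(pages[url], query), reverse=True)
print(ranked)  # the keyword-stuffed page wins, which is exactly the problem
```

The stuffed page outranks the useful one on raw counts alone, which is why the early SERPs looked the way they did.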

Fast forward a few years to 1996. Stanford computer science graduate students Larry Page and Sergey Brin decided to go at this search engine thing differently by looking at the back links pointing to a site, on the premise that a site with lots of back links must be a quality site, since other like-minded sites had linked to it. The initial program, called BackRub (get it, back links, BackRub?), eventually evolved into the massive behemoth of search engines, Google, which currently has more than a billion pages in its database.
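
The back link idea is easier to see on a tiny example. The sketch below runs a simplified PageRank-style iteration over a made-up link graph; it is a rough illustration of the general technique, not Google’s actual formula, and the damping factor and site names are assumptions for the example.

```python
# Simplified PageRank-style iteration over a hypothetical link graph.
# links[page] lists the pages that `page` links out to. The 0.85 damping
# factor is the value commonly cited for PageRank; everything else here
# is invented for illustration.
links = {
    "a.example": ["b.example", "c.example"],
    "b.example": ["c.example"],
    "c.example": ["a.example"],
    "d.example": ["c.example"],   # an extra inbound link boosts c.example
}

damping = 0.85
pages = list(links)
rank = {p: 1.0 / len(pages) for p in pages}

for _ in range(50):  # iterate until the scores settle
    new_rank = {}
    for p in pages:
        inbound = sum(rank[q] / len(links[q]) for q in pages if p in links[q])
        new_rank[p] = (1 - damping) / len(pages) + damping * inbound
    rank = new_rank

print(sorted(rank.items(), key=lambda kv: -kv[1]))  # c.example, with the most inbound links, ranks first
```

The page with the most (and best-connected) inbound links floats to the top, which is exactly the premise BackRub started from.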

Over the years, e-marketers, site designers and just plain old geeks tried to figure out the formula (called an algorithm, or algo) Google used to weight various aspects of a site. Back links remained an important factor, but in its ongoing effort to improve the quality of search results, Google kept its algo more secret than nuclear launch codes. In addition, Google was constantly tinkering with weighting factors, rocketing some sites to the head of the class while dropping others down the pipe.

No longer was it enough to have lots of back links; they had to be quality (trusted) links that were, somehow, related to the content of your site. This eliminated the high rankings of link farms, which would link to anything from used construction equipment to knockoff perfumes. Over the years since 1996, the Google algo has grown more complicated and more sophisticated, providing users with relevant, useful SERPs. Which brings us up to March 31, 2005.

On that day, the United States Patent and Trademark Office published Google’s patent application 20050071741, describing an entirely new search engine entity: an SE that thinks and evaluates, and ultimately ranks your site based on its history. For the first time in the short history of SEs, historical site data will be used in determining page rank. And we’re not just talking key word and back link history. SE 125, as it’s known in the public documentation, looks at literally dozens of factors in determining your PageRank.

How’s this for a menu of ranking criteria (a rough scoring sketch follows the list):

  1. How long your site’s been up and running. Of course, this doesn’t count the time you were operational but still undiscovered by the Googlebot.
  2. How frequently you add, update and remove documents from your site. If Google’s objective is to deliver quality results, an article about how to soup up your Coleco isn’t going to move you up in PageRank.
  3. How often you move your site from one host to another. Perhaps this is viewed as an indication of shady dealings, even though you found a host that charges 30% less than your previous hosting service. A lot of moves from physical location to physical location subtract points from your ranking score, according to Google’s patent on ol’ SE 125.
  4. The new SE tracks visitor traffic — everything from how many visitors arrive, to how long they stay, to where they go after leaving your site. The new algo figures that if visitors to your site move on to a similar site, your site didn’t have what they were looking for — a reasonable assumption. Why keep looking once you’ve found what you’re looking for?
  5. The rate of key word density change. If you’re swapping one pile of SEO text for another, but not improving the quality of the information contained in the text, the Googlebot will give you a C for effort, but don’t expect to see a major jump in PageRank. The new algo actually factors in the quality of information each site contains.
  6. How often you change your anchor text. If your anchor text hasn’t changed since Y2K was a concern, you lose points for what the patent describes as ‘staleness’. Google wants fresh text and fresh text is what it will get — if you want to maintain or increase your PR.
  7. Both outbound and inbound links are still a major weighting factor according to the patent application. SE 125 wants to know how many inbound links point to your site, how long they have been in place and how trustworthy they are. Junk links do you no good. Non-reciprocal inbound links from high-trust sites, or expert sites, are like money in the bank to the small e-business owner.
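
To give a feel for how signals like these might roll up into one number, here is a purely speculative sketch. The field names, weights, and decay constants are invented for illustration; the patent application describes kinds of historical signals, not a concrete formula.

```python
# Hypothetical blend of history-based signals into a single score.
# Everything here (the fields, the weights, the shapes of the curves)
# is an assumption for illustration; it is not taken from the patent.
from dataclasses import dataclass
import math

@dataclass
class SiteHistory:
    days_since_first_crawl: int      # criterion 1: how long the site has been indexed
    updates_per_month: float         # criterion 2: how often documents are added or changed
    host_moves_last_year: int        # criterion 3: hops between hosting providers
    bounce_to_similar_rate: float    # criterion 4: share of visitors who leave for a similar site
    anchor_text_age_days: int        # criterion 6: staleness of anchor text
    trusted_inbound_links: int       # criterion 7: non-reciprocal links from trusted sites

def history_score(h: SiteHistory) -> float:
    score = 0.0
    score += 1.0 - math.exp(-h.days_since_first_crawl / 365)   # age helps, with diminishing returns
    score += min(h.updates_per_month, 10) / 10                 # steady freshness helps
    score -= 0.5 * h.host_moves_last_year                      # frequent host moves look shady
    score -= h.bounce_to_similar_rate                          # visitors leaving for a similar site hurts
    score -= min(h.anchor_text_age_days / 2000, 1.0)           # stale anchor text hurts
    score += math.log1p(h.trusted_inbound_links)               # trusted inbound links help the most
    return score

# Example: a two-year-old site, updated a few times a month, never moved hosts.
print(history_score(SiteHistory(730, 4.0, 0, 0.35, 120, 25)))
```

Criterion 5, the rate of key word density change, would slot in the same way: another signal with its own weight in the sum.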

After reading through the pages and pages of legal mumbo-jumbo, it’s obvious that Google has made a great leap forward in SE technology by tracking and incorporating historical data on every site. Less obvious, but just as clear, is that SE 125 is also designed to raise the bar and make site owners take the steps required to keep their sites current, fresh, relevant and helpful to Google users.

Perhaps it’s time to review your site through the eyes of SE 125. To read the entire patent application, first put on a pot of strong coffee, take the phone off the hook and click on the link below.

» United States Patent Application: Acharya, Anurag — March 31, 2005
