Back to “Old-School”: Why Google-Safe White Hat SEO Works

The Google Endgame: why there’s no more room for SEO cheats.

If you were online between 1996 and 1999, you remember AltaVista and the utter chaos of purely machine-driven search results, with none of the semantic relevance we’ve come to rely on.

On AltaVista – and about ten other top search engines – results were based purely on the keywords you typed. If you searched “dog trainers,” you’d probably see a list of trainers in every city in the country – except yours. I once entered “vegan restaurants” (to arrange a lunch for a vegan relative) and got pages of various vegan organizations, but nary a restaurant.

Search engines have always relied on software (called “spiders”) that crawls every bit of content on the net. Early spiders did a good job of recording the links and words used on each webpage – and because rankings rested on those raw counts, the system was easy to game. All you had to do was join a link farm or link network. There was very little incentive to create useful content.
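To make that concrete, here is a minimal sketch of what those early spiders did: parse a page and record its links and its words. This uses Python’s standard-library `HTMLParser` and is purely illustrative – real crawlers add fetch queues, politeness rules, and deduplication.

```python
# Minimal "spider" sketch: record the links and words on a page.
# Illustrative only -- real crawlers are far more involved.
from html.parser import HTMLParser

class Spider(HTMLParser):
    def __init__(self):
        super().__init__()
        self.links = []   # every href found on the page
        self.words = []   # every word of visible text

    def handle_starttag(self, tag, attrs):
        # Record the target of every anchor tag.
        if tag == "a":
            for name, value in attrs:
                if name == "href":
                    self.links.append(value)

    def handle_data(self, data):
        # Record the raw words -- early engines ranked on these counts.
        self.words.extend(data.split())

spider = Spider()
spider.feed('<p>Best <a href="/dogs">dog trainers</a> in town</p>')
```

Because ranking was little more than counting what the spider recorded, stuffing pages with keywords and farm links was enough to win.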

Google’s algorithms now watch for old SEO cheats (like link farms) – so don’t try them. You’ll get yourself banned. This vigilance is a big part of what’s driving Google’s seemingly unstoppable success. In the last few years, Google has generated the most relevant search results I’ve ever seen. Now if I search “vegan restaurants,” I may still see some vegan organizations, but “vegan” and “restaurant” are weighted against my actual geographic location. I see lists of local restaurants that serve vegan food, along with photos, menus, ingredients, and directions to the restaurants. This is Google’s next-level search world: every query deserves more accurate and relevant results.

They’ve moved away from delivering results based purely on keywords. They’re producing results that are heavily weighted in favor of the searcher. Everything about the searcher counts: the context of the search, the searcher’s location, the time of day, search history, and the device being used. Google even checks word usage to weigh context against the requested search. It’s called semantic search: software ranks content on the fuzzy value of the searcher’s intent rather than on fixed values like the number of times keywords appear in the text. Therein lies the Google Endgame – all of this is in service of the true master: the advertisers. Google HAS to get this right.
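The shift is easy to illustrate. Here is a toy sketch of blending a classic keyword signal with searcher context – the weights, fields, and helper names are all my own assumptions, not Google’s actual algorithm:

```python
# Toy semantic-ranking sketch: keyword match plus searcher context.
# Weights and signals are invented for illustration.

def keyword_match(text, query):
    """Classic signal: fraction of query words found in the page text."""
    words = text.lower().split()
    terms = query.lower().split()
    return sum(t in words for t in terms) / len(terms)

def relevance_score(doc, query, searcher):
    score = 0.5 * keyword_match(doc["text"], query)
    # Context signal: favor pages near the searcher, so "vegan
    # restaurants" surfaces local listings, not national organizations.
    if doc["city"] == searcher["city"]:
        score += 0.3
    # Context signal: favor topics the searcher has engaged with before.
    if doc["topic"] in searcher["history"]:
        score += 0.2
    return score

docs = [
    {"text": "Vegan advocacy organization news",
     "city": "national", "topic": "advocacy"},
    {"text": "Vegan restaurant menu and directions",
     "city": "Irvine", "topic": "dining"},
]
searcher = {"city": "Irvine", "history": ["dining"]}
ranked = sorted(docs, key=lambda d: relevance_score(d, "vegan restaurant", searcher),
                reverse=True)
```

Under pure keyword matching both pages score on “vegan”; with context signals, the local restaurant wins.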

And that… my friends… is why there’s no room left for bad SEO cheats.

I am a reformed spamdexer. I did everything that could be done to an HTML website to scam the search engines. My conversion began in 1999 when I won a contract with the California Courts. I was brought in as a consultant to solve a problem. The official California Courts website was absolutely buried by a mishmash of attorney offices, nonprofits, bail bonds agencies, and even porn sites. By the time I got on board, they had already developed an impressive quantity of content for a self-help section that covered a wide range of court-related topics – every detail about the courts and their proceedings. My recommendation was to convert that content into individual PDF documents. We optimized each doc to the max with good titles, descriptions, and tags. Then we embedded the docs into link pages that were also fully optimized. Within six months of posting these new pages, California Courts completely dominated the first two results pages for a wide swath of legal search terms in California. We effectively bumped off the “spamdexers.”

And we had lots of links.

Besides good content for the search engines to nibble on, we had a mountain of links to just about every law school and lawyer in the state. This was well before social media, so we didn’t have that resource. But to be honest, we didn’t need anything else. What caught me off guard was how quickly things changed. That’s when I turned the corner and went all “white hat.”

Since then, I’ve expanded my understanding of search engine ranking. First and foremost, I have long since accepted the fact that what I’m calling the “Google Endgame” is really the endgame for all search engines. They run a business. They offer a free service that draws in eyeballs, but their core business is advertising and website promotion. What’s not to get?

Solution: The Content Network

Let’s say you’ve got the content thing down, you’re engaged on social media, and you’ve optimized your site to the hilt. Let me guess – traffic is still trickling in AND your site rank is still in backwoods territory, right? That’s my problem and this is how I’m trying to solve it.

What if you could toss a little automation into the mix and generate high-quality relevant links? What if you could use some of the semantic algorithm magic (similar to Google’s) to match high-quality, topic relevant sites with yours? Sound too good to be true?

First, some background. Several years ago, SEO pros were selling various forms of backlink-generation software. Nearly all of those linking schemes were banned because of persistent abuses that ran counter to Google’s “endgame.” One important factor made them easy to detect: they produced large quantities of links that were unanchored, irrelevant, and massively static. The guys at FreeRelevantLinks.com (FRL) have created an automated solution that avoids the obvious temptations for abuse and generates relevant links from the on-page content of thousands of client sites. The age of the Content Network is born.

According to the guys at FRL, their “content network” has built-in “Google” safeties that prevent it from violating spam rules. For instance:

  1. Generated links are limited to the number of relevant keywords available;
  2. All links pass content relevance tests to ensure that they don’t negatively influence other sites in the network;
  3. About half of the generated links are reciprocal, the other half are not (100% reciprocal links could be a Google red flag); and
  4. The software regularly refreshes links, which encourages the search engine spiders to go back and reindex but also ensures that the list of links is never static (also a possible Google red flag).
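The four safeguards above can be sketched in a few lines of code. This is my own reading of the rules as FRL describes them – the function, field names, and threshold are assumptions, not FRL’s actual implementation:

```python
# Hypothetical sketch of FRL's four stated safeguards.
# All names and the relevance threshold are my own assumptions.
import random

def build_link_list(candidates, keywords, relevance_threshold=0.5):
    # Rule 2: drop candidates that fail the content-relevance test.
    relevant = [c for c in candidates if c["relevance"] >= relevance_threshold]
    # Rule 1: cap the list at the number of relevant keywords available.
    links = relevant[: len(keywords)]
    # Rule 3: make roughly half the links reciprocal, not 100%.
    for i, link in enumerate(links):
        link["reciprocal"] = (i % 2 == 0)
    # Rule 4: shuffle on every refresh so the list is never static.
    random.shuffle(links)
    return links

keywords = ["vegan", "restaurant", "menu"]
candidates = [
    {"url": "site1", "relevance": 0.9},
    {"url": "site2", "relevance": 0.2},   # fails the relevance test
    {"url": "site3", "relevance": 0.7},
    {"url": "site4", "relevance": 0.8},
    {"url": "site5", "relevance": 0.6},
]
links = build_link_list(candidates, keywords)
```

The point of each rule is the same: avoid the unanchored, irrelevant, static link patterns that made the old schemes so easy for Google to flag.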

This system IS automated (which is frowned upon by SEO purists), but I think this kind of automation is fine as long as it doesn’t paint a great big blacklist target on your website. Some purists frown on just about any automation – but who doesn’t use Hootsuite or Commun.it? Are those cheats? I consider this solution a productivity tool: it speeds up the process of acquiring links so that I can spend more time developing high-quality content and engaging with my audience (per the Google Endgame).

I have been testing FRL for about a year, and I can attest that it has a positive impact on the ranking of promoted keywords. From what I’ve seen, FRL’s architecture meets Google’s (and Bing’s and Yahoo’s) criteria for relevant, high-quality links. Of course, participation in a program like this is no guarantee of good ranking – FRL offers no guarantees. But I think it beats the alternatives.

Want to give it a try? FRL gave me an account to continue my testing. Click this link and register for five keywords for free. No credit card needed, and you can keep using it for as long as you want. All I ask is that you let me know what happens.

Thanks!

About: Ray Wyman, Jr is a content creator, communications professional, and author with more than 30 years of experience. Visit LinkedIn or Raywyman.com for more information.


