Post-PRISM SEO | Google Have Stopped Tracking Keywords

Just a few days ago (September 23rd, 2013), Google quietly rolled out an update to their Analytics platform: they stopped reporting on which keywords were used to find your website via organic search. But if Google have stopped tracking keywords, what does this mean for the average business out there?

More specifically (but without getting too technical), Google have changed the way their search engine works so that every search is now made over an encrypted or 'secure' connection (HTTPS rather than HTTP). As a result, the keyword a visitor searched for is no longer passed on to the website they click through to, so Analytics simply cannot report it. Google aren't throwing the data away after the fact; it never reaches your website in the first place. It's a subtle difference, but an important one if we want to understand their motives.
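To see what changed mechanically, here is a minimal sketch of how an analytics package traditionally recovered the keyword from the HTTP referrer, and why the new secure referrer yields nothing. The example referrer URLs are illustrative, not taken from any real visit log.

```python
from urllib.parse import urlparse, parse_qs

# Before the change: a referrer from a Google result carried the query.
old_referrer = "http://www.google.com/search?q=reliable+builder"
# After the change: the secure referrer carries no query string at all.
new_referrer = "https://www.google.com/"

def extract_keyword(referrer):
    """Recover the 'q' parameter from a Google referrer, the way
    analytics packages traditionally identified the search keyword."""
    params = parse_qs(urlparse(referrer).query)
    return params.get("q", ["(not provided)"])[0]

print(extract_keyword(old_referrer))  # reliable builder
print(extract_keyword(new_referrer))  # (not provided)
```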

This isn't exactly a sudden change either: Google have been ramping up the level of keyword encryption for at least two years. As a result, most website owners have seen considerable growth in a segment labelled '(not provided)' in their keyword reports. After the recent update, however, as much as 98% (many are claiming 100%) of organic keyword data from Google is now encrypted. Of course, you can still see keywords from other search engines (e.g. Yahoo, Bing and Ask), but they hold such a small share of the search market that they are insignificant by comparison.
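If you want to quantify the effect on your own reports, a quick back-of-the-envelope calculation does the job. The keyword figures below are invented purely for illustration.

```python
# Hypothetical figures from an Analytics organic-keywords report.
organic_keywords = {
    "(not provided)": 4900,
    "reliable builder": 55,
    "cheap building contractor": 45,
}

total = sum(organic_keywords.values())
hidden = organic_keywords.get("(not provided)", 0)
print(f"{hidden / total:.0%} of organic keyword data is now hidden")  # 98%
```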

The motivation

Many who are involved in SEO are asking: how can Google justify taking away such a valuable business reporting tool – especially one which is built into the very fabric of most marketing strategies? Well, upon closer inspection, Google may actually be trying to do us all a favour. Admittedly, that depends on how cynical you're feeling.

Here’s all you need to know about why Google have stopped tracking keywords, how that will affect your business and, of course, the best way to move forward:

Viewing life through the PRISM

The official line from Google is that they made this seemingly aggressive move because of a clandestine U.S. intelligence operation called PRISM. It sounds like something from a Jason Bourne movie, I know… but it’s true.

PRISM is a mass electronic surveillance and data mining program, run by the National Security Agency (NSA) in the U.S. since 2007. It operates pursuant to the Foreign Intelligence Surveillance Act of 1978 (FISA), which regulates how the U.S. collects, stores and analyses data pertaining to "Foreign Powers" and "Agents of Foreign Powers". The latter term covers anybody who could be even remotely "suspected" of terrorism, and even those deemed "at risk of becoming" a terrorist… and unfortunately, in the modern world, those vague definitions cover pretty much everybody. Yep, you too.

So PRISM has global reach and is essentially a spying tool used to secretly collect data about people's web activity. Scratch that – everyone's web activity. It came under fire earlier this year when, following a six-year stint in the murky shadows, it was hurled into the public eye by a former NSA contractor called Edward Snowden. He realised that the sheer extent of PRISM's operations – gulping up huge amounts of personal data about non-U.S. citizens in a 'proactive' or 'preemptive' manner – was tantamount to dangerous and criminal activity on an international scale.

So what were Google thinking?

Well, when this all came to light, Google Inc. was accused of openly, instantly and directly sharing data with the NSA and the PRISM program, as though they just let PRISM take whatever it wanted, whenever it wanted. Google adamantly denied this, insisting that they only ever give PRISM indirect access via specially regulated requests, and only hand over data in rare cases where there is a moral impetus to do so. To drive that point home, they improved encryption between their internal data centres and actively campaigned for permission to disclose the frequency and type of so-called 'spying requests' from the program (it is currently illegal to release this information publicly).

Now, in a final move to protect their users' information from the PRISM program, they have made keyword search tracking impossible – or at least, that is the motive we're expected to believe. The cynic in me, however, says perhaps not. While they do deserve a pat on the back for taking on the NSA, you have to look at how Google's shareholders might benefit from this. In business, there's usually a fiscal motive!

It is undeniable that the removal of organic keyword data from Analytics makes it much harder to conduct meaningful SEO without a good agency or trained professional in tow. I'll explain that in a moment, but first it is important to note that for most non-tech-savvy small business owners, the only viable alternative to organic SEO is (you guessed it) Pay-Per-Click advertising… which, by the way, does still allow keyword tracking! Google's shareholders will no doubt be happy about that: many businesses will simply turn to paid advertising through Google rather than re-learn organic SEO. Pretty convenient, eh?

What Does This Mean For Businesses?

1. Goodbye long-tail optimisation reporting

'Short-tail' search terms usually comprise only one or two words and represent a top-level category, for example "Builder" or "Building Contractor". 'Long-tail' search terms, on the other hand, combine a short-tail term with one or more qualifying words, for example "Reliable Builder" or "Cheap Building Contractor". The qualifying descriptor(s) can relate to colour, price, size, quality… anything that helps differentiate what one web page offers versus other pages in the same short-tail category.

The precise distinctions are somewhat arbitrary, but there is good reason to understand the principle. Each long-tail term is searched far less often than a short-tail term, but there are many more of them and they are far less competitive. They also indicate a more specific need in the searcher, so optimising for 'the long tail' pre-qualifies your website traffic, improving traffic quality and therefore conversion rates.
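As a purely illustrative example (the category list is hypothetical), here is that definition expressed as a tiny classifier: a term counts as long tail if it contains a core category term plus at least one qualifying word.

```python
# Hypothetical core category (short-tail) terms for a building firm.
CATEGORY_TERMS = {"builder", "building contractor"}

def classify(term):
    """Classify a search term using the definition above."""
    t = term.lower()
    for category in CATEGORY_TERMS:
        if category in t and t != category:
            return "long tail"   # category term plus qualifier(s)
    return "short tail" if t in CATEGORY_TERMS else "uncategorised"

for term in ["Builder", "Reliable Builder", "Cheap Building Contractor"]:
    print(f"{term}: {classify(term)}")
```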

By definition, a long-tail SEO campaign is aimed at 'mopping up' a huge number of low-volume, low-competition long-tail terms. But reporting and tracking rankings for all of those terms is logistically impractical. Until now, the solution has been to report rankings for core category terms (i.e. short-tail terms) and then cross-reference those with the list of top keywords actually used to find the website – the latter including long-tail searches, which reveal the kinds of qualifying terms the site is well optimised for.

Obviously this is no longer possible – at least, not in the way it always has been, i.e. handed to us on a silver platter. Some keyword data can still be found in Webmaster Tools, although it is focused more on appearances in search results (impressions) than on actual clicks. Click data and keyword lists can be seen, but both are much less reliable than the old Analytics reports and require the ability to read between the lines.
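For instance, you could mine a Webmaster Tools search-query export for clues along these lines. This is a hedged sketch: the file name and column headings ('Query', 'Impressions', 'Clicks') are assumptions based on the CSV export format of the time, so adjust them to match your own download.

```python
import csv

def parse_count(value):
    # Exports sometimes format numbers with thousands separators, e.g. "1,234".
    return int(value.replace(",", ""))

with open("search_queries.csv", newline="") as f:
    for row in csv.DictReader(f):
        impressions = parse_count(row["Impressions"])
        clicks = parse_count(row["Clicks"])
        ctr = clicks / impressions if impressions else 0.0
        # Many impressions but few clicks suggests the page ranks for the
        # query, yet its title or snippet fails to win the click.
        if impressions > 100 and ctr < 0.01:
            print(f"Low CTR ({ctr:.1%}) over {impressions} impressions: {row['Query']}")
```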

2. Hello trusted SEO partners

We now have to trust other metrics to gauge optimisation.

Firstly, Google Analytics still shows generic conversion rate information. Granted, this can be affected by a huge number of factors besides optimisation, but it is at least a marker in the sand for traffic quality. Secondly, traffic volumes are still reported. Again, this is not a good SEO metric in its own right: often, as optimisation improves, you will receive fewer visits overall but higher-quality traffic than before.

Combine those two metrics with short-tail search ranking positions and it becomes possible to extrapolate the level of SEO progress; false positives and negatives can be weeded out. The difficulty lies in knowing how to read the trends, and that is precisely why a good SEO partner you can trust is more important than ever.
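Here is a minimal sketch of that triangulation, using invented month-on-month figures. The rule it encodes mirrors the point above: falling traffic alongside a rising conversion rate and improving rankings usually signals better pre-qualified traffic, not an SEO failure.

```python
# Invented month-on-month figures: visits, conversions, and the average
# ranking position across tracked short-tail terms (lower is better).
previous = {"visits": 1200, "conversions": 24, "avg_rank": 14.2}
current = {"visits": 1050, "conversions": 32, "avg_rank": 9.8}

rate_prev = previous["conversions"] / previous["visits"]
rate_curr = current["conversions"] / current["visits"]

if (current["visits"] < previous["visits"]
        and rate_curr > rate_prev
        and current["avg_rank"] < previous["avg_rank"]):
    print("Traffic down but quality up: likely a long-tail optimisation win")
else:
    print("Trends are mixed: dig deeper before drawing conclusions")
```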

3. No cutting corners in 2014

Google's top priority has always been improving the quality of their organic search results. They created Analytics so that website owners could understand their optimisation and make changes to their sites, in turn helping Google understand each site's value. This particular change to Analytics, however, makes it harder for website owners to take that responsibility, at least in the same way and with the same mentality. You might think, then, that this could result in lower-quality search results over time.

However, Google have matured a great deal in recent years and have become very good at picking up indications of value from the activity surrounding a website. This recent change therefore just makes 'manipulation' of search results harder, placing more importance on the generation of 'natural' ranking signals instead, such as social media activity, content releases and high-quality back-link profiles.

How different will the world of online marketing be without keyword tracking? Only time will tell. But my gut says the best SEO professionals out there will simply keep doing what they have always done: creating valuable and engaging user experiences for their clients' customers. Clients are simply going to need a little more faith in their SEO partners, and in Google.

Want to know more about SEO since Google have stopped tracking keywords? Talk to us for advice and suggestions.

UPDATE: Google have created a new search algorithm called 'Hummingbird'. Follow the link to find out what this means for you and how it ties in with the recent encryption of keyword data discussed in this article.