Google’s Rhythm
If Google were a bird, it would be a hummingbird. At least, Hummingbird is the name Google gave its newest algorithm update, chosen for the speed and precision the species is known for. Does this mean goodbye to the Panda and Penguin of Google-past?
From a webmaster's perspective, a main aim is usually to rank highly enough in search results to bring people to your nook of the internet. Whether you're peddling a product, a blog or a who's who of your favourite TV programme, there's a reason why you're publishing on the internet.
From a search engine's perspective, the aim is to provide relevant, quality results to the user who chooses to use it and not a competitor. It does this by implementing a string of processes that work to build results pages (SERPs) sorted in the most relevant way possible. This is called the algorithm and, as mentioned previously, Google has called its current toolbox the Google Hummingbird.
Life with the Hummingbird
‘Hummingbird’ has been driving Google’s results since its release toward the end of 2013. Search Engine Land have a great description for better understanding Hummingbird’s role in the Google universe.
Google notes that the last time its algorithm was updated as significantly was with the 2010 Caffeine release. Hummingbird aims to better understand the context behind search queries, and this has massive implications for users and, of course, webmasters.
Hummingbird introduced to the algorithm a better understanding of search context. Rather than picking out select keywords from a user’s search term, Hummingbird works by recognising the keyword’s meaning in that specific phrase.
Take, for example, the search term “What’s the closest place for breakfast near me?”
Hummingbird focuses on the meaning of the words in the context of the sentence. It picks out ‘breakfast’, ‘closest’ and ‘near me’ to understand that you’re asking for eateries that serve breakfast near your current location. In the past, the results might have returned listings based on just the main keyword ‘breakfast’.
The shift towards better understanding semantic choices and their associated context contributes to a vastly improved user experience. As mobile becomes the more dominant form of search (nearly 60% of searches are now mobile), it’s expected that search queries will differ from those typed on a desktop. Hummingbird is the response to these changes.
If Hummingbird seeks to better understand the language we use, what place does Google’s previous algorithm tools now have?
Panda and Penguin
Back to the brilliant car analogy penned by Danny Sullivan over on Search Engine Land: he suggests thinking of these as parts of an engine. ‘Hummingbird’ is now the overall engine, whilst Google Panda and Penguin act as individual components that make it work. Cogs in the machine.
Google Panda was released back in 2011. It aims to lower the ranking of bad quality websites. Panda judges the content of a site to determine its quality. Sites with content of little relevance, repeated articles and stuffed with ads were prime targets for the update.
To SEO professionals, Google was cracking down on the ‘Black Hat SEO’ techniques some had been getting away with. No longer was it deemed acceptable to write irrelevant drivel, use content farms, or over-garnish articles with keywords anywhere and everywhere. Though these tactics were effective previously, Google now issues penalties to websites engaging in them.
Before this particular update, it wasn’t uncommon for sites to steal content from genuine, high-quality sites and duplicate it for better rankings. That is perhaps why the Panda update affected a number of sites that paid little attention to content submissions.
The trick to taming the panda is to do what you say you do. If you tell Google you have information about hockey, make sure you do! Then make sure the information is your own, is detailed and well written. Give the user what they are looking for and Google will give you to them. That is the point.
SEO professionals can now tell Google even more about their website. Thanks to RankBrain, part of the Hummingbird algorithm, Google can now think for itself. At least, sort of.
Latent semantic indexing (LSI) is a technique that allows Google to understand the intent behind a search. When we discuss things with our friends, they understand us largely because of the context and topic of our conversation. For a search engine, this contextual information is difficult to grasp. By including LSI keywords within a site, however, webmasters help Google recognise the relationship between similar topics and their related keywords, and show the most relevant results for our search queries.
Google does this by finding links between the various keywords on a site. Done correctly, Google will be able to show you in results for search queries you may not have thought to target!
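As a toy illustration of the idea (not Google's actual implementation, which works on vastly larger corpora and uses matrix techniques such as singular value decomposition), here is a sketch that uses simple co-occurrence counts over a made-up mini-corpus to surface related keywords:

```python
from collections import Counter
from itertools import combinations

# Hypothetical mini-corpus: each "document" stands in for a page's text.
docs = [
    "ice hockey stick puck rink",
    "hockey puck goal rink team",
    "recipe breakfast eggs bacon",
    "breakfast cafe eggs coffee",
]

# Count how often each pair of words appears in the same document.
cooccur = Counter()
for doc in docs:
    words = set(doc.split())
    for a, b in combinations(sorted(words), 2):
        cooccur[(a, b)] += 1

def related(word, top=3):
    """Return the words most often seen alongside `word`."""
    scores = Counter()
    for (a, b), n in cooccur.items():
        if a == word:
            scores[b] += n
        elif b == word:
            scores[a] += n
    return [w for w, _ in scores.most_common(top)]

print(related("hockey"))     # 'puck' and 'rink' score highest
print(related("breakfast"))  # 'eggs' scores highest
```

In this sketch, a page about hockey naturally co-occurs with ‘puck’ and ‘rink’, so those become the semantically related keywords – the same intuition, in miniature, that lets the engine connect a topic to terms the page never explicitly targeted.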
Google Penguin focuses more on the quality of a site’s backlinks. The basic idea is that it aims to deter and penalise websites that try to cheat Google by purchasing backlinks. Released in 2012, the update targeted sites flagged for low-quality backlinks, perhaps purchased from link networks.
Penguin aims to crack down on sites implementing spamdexing techniques to achieve higher rankings. These include cloaking, which involves a site presenting one thing to the search engine and another to the user.
Though Google implements these algorithmic changes to improve the results given to a searcher, some found a way to exploit them to their advantage. This is referred to as negative SEO.
Most negative SEO techniques involve directing deliberately bad links toward a competitor’s site, in the hope, of course, that Google will penalise it accordingly.
As a remedy, Google gave webmasters the power to tell it which inbound links to ignore. By using the “disavow” tool, sites can recover from penalties. This all relates to PageRank, Google’s measure of a page’s quality based on the links pointing to it.
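Google documents the disavow file as a plain text file with one entry per line: either a full URL, or a `domain:` prefix to disavow every link from a whole domain, with lines beginning `#` treated as comments. A sketch, using made-up domains:

```
# Hypothetical disavow file (the domains below are invented for illustration).

# Ignore every link from an entire spammy domain:
domain:spammy-link-network.example

# Ignore a single offending page:
http://bad-directory.example/links/page1.html
```

Uploading a file like this via Search Console asks Google to disregard those links when assessing the site, which is how a site hit by negative SEO can work its way back.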
Artificial Intelligence
Google’s implementation of RankBrain, and its reliance on AI as a tool to better understand its users, is a huge change for SEO.
RankBrain converts keywords into vectors, finding relevant results in the connections between certain words and phrases. Digital marketers may find it increasingly difficult to please the algorithm because of how much knowledge components such as RankBrain are able to draw from.
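The vector idea can be sketched in a few lines. Below, the hand-written three-dimensional “word vectors” are pure invention for illustration (real embeddings have hundreds of dimensions and are learned from text); the point is that once words are vectors, relatedness becomes a simple geometric comparison:

```python
from math import sqrt

# Toy, hand-made "word vectors" (assumption: purely illustrative values).
vectors = {
    "breakfast": [0.9, 0.1, 0.0],
    "brunch":    [0.8, 0.2, 0.1],
    "hockey":    [0.0, 0.1, 0.9],
}

def cosine(u, v):
    """Cosine similarity: close to 1.0 means similar, near 0.0 unrelated."""
    dot = sum(a * b for a, b in zip(u, v))
    norm = sqrt(sum(a * a for a in u)) * sqrt(sum(b * b for b in v))
    return dot / norm

print(cosine(vectors["breakfast"], vectors["brunch"]))  # high (~0.98)
print(cosine(vectors["breakfast"], vectors["hockey"]))  # low  (~0.01)
```

Because ‘breakfast’ and ‘brunch’ point in nearly the same direction, a query about one can sensibly surface results about the other – even if the exact keyword never appears on the page.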
The rise of the smartphone
It seems difficult to imagine life without an internet-accessible phone. It wasn’t too long ago, however, that a mobile phone was used for simply that. A phone, but mobile. Now we think nothing of sending e-mail, taking photos, or perusing our favourite shopping websites, all whilst out and about. Indeed, long gone are the days of the monophonic ringtone and, of course, a quick game of cult-classic Snake. Oh, the simple things.
Webmasters have had to adapt quickly to the increasing dominance of mobile as a primary choice for internet use. This means creating an equally satisfying user experience on different platforms. What works on desktop may not work on mobile.
I previously mentioned Google’s announcement that they are shifting toward a mobile-first index here. This is arguably one of the most important changes in recent years, further underpinning the overwhelming dominance of the mobile platform.
Mobile browsing changed the game entirely. Users demand instant content that responds well to their smaller devices. Notable changes include the optimisation of desktop sites for mobile users (if they weren’t responsive already), and Google backing the open-source initiative AMP (Accelerated Mobile Pages) to show preference in results to sites coded to load almost instantly.
Getting to the point…
The last five years have changed SEO incredibly. Though it’s true that sites face stricter guidelines within which they can attempt to achieve better organic rankings, the goalposts have not been moved. The key to SEO in 2017 lies in creating a strong foundation for your website – rich with high-quality (and relevant!) content, accurate HTML data and appropriate mobile optimisation techniques. That earns far more respect from search engines than trying to cheat the system and then amend the errors.