OMG! Google’s New Algorithm BERT: Everything You’re Dying to Know for SEO
Jokes. You’re prolly not dying to know. But maybe you’re just a bit curious, which is a good thing. So let’s take a look at this BERT topic.
NB: This blog will be updated as more info rolls out. Think of this as a brief intro…
BERT is here!
Yes, the algorithm that Google calls the most important update in five years has landed. And this update affects as much as 10% of all search queries. OMG. OMG. OMG!!! But seriously, although a new algo may not rock your world, BERT is a bangin’ big deal. But fortunately, it isn’t so big when it comes to SEO homework. Hmm… That’s lucky, isn’t it?
Bottom line: you can’t ‘optimise’ for BERT. (Despite what some blog posts and YouTube videos will tell you.)
It’s about context, y’all
BERT is about NLP (Natural Language Processing). And ever since 2015, when Google released RankBrain, their first artificial intelligence method for understanding queries, Google has been hammering on about how clever they’re getting with this stuff. Essentially, it’s about strings, not things. Putting words in context.
Human speak. For the peoples.
RankBrain is not dead, FYI. It still lives on; BERT is simply an extension for understanding strings. As Search Engine Land says, “It’s additive to Google’s ranking system. (My italics) RankBrain can and will still be used for some queries. But when Google thinks a query can be better understood with the help of BERT, Google will use that. In fact, a single query can use multiple methods, including BERT, for understanding a query.”
BERT makes things bi-directional, not just mono-directional. Gosh, that sounds kind of sexy, doesn’t it? Google is like totally loosening up. (FYI technically speaking it’s more like ‘any directional’ rather than bidirectional. Google now searches any way to find the right context. Google, you’re so... liberal...)
“Strings, not things”
BERT is an acronym that stands for Bidirectional Encoder Representations from Transformers. (The ‘Transformers’ bit, by the way, is a type of neural network, not robots in disguise.) You see, up till now, machines could only do so much. We’ve all experienced examples where we punch something into Google and they Just Don’t Get It.
And these days, as someone I can’t recall wrote, ‘people speak keyboard-ese’, which I think is the cutest expression, no? We desperately hit the keyboard with a series of words to best get our intent across.
And sometimes this totally f*cks with Google’s brain.
Google gives an example of someone searching for “do estheticians stand a lot at work”. Previously, Google would have grabbed “stand” and matched it with “stand-alone”, which would give quite a different meaning. But now, this shiny, new, improved algorithm allows them to look at the entire string and understand that ‘stand’ is related to the physical demands of the job – and therefore, give a better result.
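If you like to see ideas in code, here’s a toy Python sketch of what ‘using the whole string for context’ means. To be crystal clear: this is not Google’s actual code, and the word senses and cue words are completely made up for illustration. It just shows how looking at the words on both sides of ‘stand’ (rather than the word alone) can pick the right meaning.

```python
# Toy illustration (NOT Google's BERT): pick the sense of a word
# using cue words from the WHOLE query, both left and right of it.

# Hypothetical, hand-made sense inventory for one word.
SENSES = {
    "stand": {
        "be on one's feet": {"work", "lot", "day", "hours", "job"},
        "stand-alone": {"alone", "independent", "separate", "software"},
    }
}

def disambiguate(query: str, target: str) -> str:
    """Return the sense of `target` whose cue words overlap most
    with the rest of the query (the full surrounding context)."""
    context = set(query.lower().split())
    context.discard(target)  # look at everything EXCEPT the word itself
    best_sense, best_overlap = None, -1
    for sense, cues in SENSES[target].items():
        overlap = len(context & cues)
        if overlap > best_overlap:
            best_sense, best_overlap = sense, overlap
    return best_sense

print(disambiguate("do estheticians stand a lot at work", "stand"))
# → be on one's feet
```

The match-a-keyword approach (just seeing ‘stand’ and reaching for ‘stand-alone’) ignores that surrounding context entirely – which is exactly the behaviour BERT improves on, albeit with a vastly cleverer model than this little dictionary trick.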
Here’s another example from Google. See how the word “to” has significant meaning in the phrase?
Try it yourself, and see what you can come up with. You can’t do a retrospective jobbie, but have a play and see if you can observe the increased intelligence at work. And let me know of any corkers you find.
From the Googles
“These improvements are oriented around improving language understanding, particularly for more natural language/conversational queries, as BERT is able to help Search better understand the nuance and context of words in Searches and better match those queries with helpful results.
Particularly for longer, more conversational queries, or searches where prepositions like “for” and “to” matter a lot to the meaning, Search will be able to understand the context of the words in your query. You can search in a way that feels natural for you.”
BERT analyses search queries, not webpages
Let’s remember: BERT analyses search queries, not webpages. So this is why you can’t optimise for BERT. But, if you have a page with thin content or poorly worded content that waffles on and leaves people wondering what you’re bangin’ on about, then it’s probably time you cleaned up your act. But you knew that anyway, didn’t you?
Wanna see the best vid-splanation of BERT?
Check out how Eric Enge and the charming, self-effacing Jessica Peck from Perficient Digital succinctly explain - in human speak - the BERT update. Not only is this the best explanation around, but the two of them are just so cute! Just watch…