Have you not heard of BERT yet? Well, you should have. It’s the biggest Google update in years. It’s so big that we have dedicated this week’s newsletter to it. It’s time to dig into BERT. If you’re used to Google updates sounding more like cute animals and less like Sesame Street characters, you’re not alone. BERT is an unusual name because it’s an acronym for a new technology developed by Google’s AI team. The acronym is exceedingly nerdy: Bidirectional Encoder Representations from Transformers.
Understanding Natural Language Processing
Natural Language Processing (NLP) is the science of turning unstructured language and speech into structured data a computer can properly understand. NLP is a subfield of computer science and artificial intelligence concerned with interactions between computers and human (natural) languages, and it is used to apply machine learning algorithms to text and speech. Google BERT is a huge step forward in Google’s use of this technique to understand search queries.
For example, we can use NLP to create systems for speech recognition, document summarisation, machine translation, spam detection, named entity recognition, question answering, autocomplete, predictive typing and so on. Most of us have smartphones with speech recognition; these use NLP to understand what is said. Many people also use laptops whose operating systems have built-in speech recognition.
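To make “structured data” concrete, here is a purely illustrative sketch (a toy, nothing like the systems Google actually runs) of the very first NLP step: turning a free-text query into tokens and counts a program can work with.

```python
# Toy illustration only: converting unstructured text into simple
# structured data (a token list and word counts).
import re
from collections import Counter

def tokenize(text):
    # Lowercase the text and pull out runs of letters/apostrophes
    return re.findall(r"[a-z']+", text.lower())

query = "How to park on a hill with no curb"
tokens = tokenize(query)
counts = Counter(tokens)

print(tokens)  # ['how', 'to', 'park', 'on', 'a', 'hill', 'with', 'no', 'curb']
print(counts["to"])  # 1
```

Real systems go far beyond this, of course, but every pipeline starts by imposing some structure on raw language.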
BERT steps in when Google has a search term with no exact matches but lots of content that roughly matches. How Google refines that content down and finds what is actually relevant is where BERT and its advanced Natural Language Processing ability come in.
So what’s new?
Natural Language Processing isn’t a new science and Google has been employing it in search for some time. However, the way that BERT works is different from existing NLP models and provides a new approach to understanding whole sentences.
Other techniques only compare each word to a limited number of other words within the sentence. BERT uses a model that processes words as part of a whole sentence. Each word is looked at in the context of the sentence in which it appears. This provides a better overall understanding than looking at the words one by one.
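The difference can be sketched in a few lines of toy code. This is not BERT itself (BERT is a large neural network), just an illustration of the gap between a limited context window and whole-sentence context; the example query is made up.

```python
# Toy contrast: a fixed window sees only a few neighbouring words,
# while a whole-sentence approach can condition each word on every
# other word in the sentence, before and after it.
def window_context(tokens, i, size=1):
    # Context limited to `size` neighbours on each side of position i
    return tokens[max(0, i - size):i] + tokens[i + 1:i + 1 + size]

def sentence_context(tokens, i):
    # Every other word in the sentence, both before and after position i
    return tokens[:i] + tokens[i + 1:]

tokens = "can you get medicine for someone pharmacy".split()
i = tokens.index("for")

print(window_context(tokens, i))    # ['medicine', 'someone']
print(sentence_context(tokens, i))  # ['can', 'you', 'get', 'medicine', 'someone', 'pharmacy']
```

With only a narrow window, the word “for” looks interchangeable with almost any preposition; with the whole sentence available, it is clear the query is about fetching medicine on someone else’s behalf.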
What effect is it having?
Volatility has risen in the SERPs, but not as much as expected, especially given Google’s statement that BERT would affect 10% of searches. This may be down to BERT not being fully deployed yet, but also to how rank tracking tools work and measure volatility. As BERT is focussed on rarer search terms, which are less likely to be tracked, the changes may be underreported.
Winners and Losers
Those who benefit are likely to have large amounts of well-written content and guides. This content is more likely to answer queries that, before BERT, Google was unable to match with content. BERT should, therefore, provide a nice traffic bump to in-depth content, matching searches that previously may not have returned any relevant results.
The last series of Google algorithm updates has landed like a flurry of boxing jabs, with BERT the most recent surprise punch from the tech giant.
Quite simply, Google has found a way to understand user intent. For example, if your search includes small connecting words like “for” and “and”, BERT is able to understand the context of the entire string of words typed into the search bar. Prior to this algorithm update, only strong keywords were picked up, so your search results may have included information not pertinent to your search.
It’s a big deal for users. And, it’s a big deal for marketers because we need to have a deeper insight into who our users are and exactly what they are looking for. Traffic is going to go up on some website pages, and others will plummet. To keep your ship afloat, so to speak, content is really going to be the captain of that ship.
What that means for digital marketers is this: high-quality content on your web pages and through your blog. Keyword stuffing is even more a thing of the past. Now it’s all about long-tail keywords of five words or more and content written in natural language, with natural speech patterns.