You’ll most likely be aware that Google’s search operation relies on algorithms – up to 200 of them, according to some speculation. In order to keep its search service relevant – and to foil those looking for SEO loopholes – Google regularly updates its systems and introduces new algorithms. In October 2018, the latest of those, BERT, was introduced. BERT, or Bidirectional Encoder Representations from Transformers to give it its full name, is a natural language processing (NLP) framework.
Great to meet you BERT – but what is it for?
BERT was designed to help Google to unscramble sets of text which include lots of words, sentences and phrases with multiple meanings. As humans, we find it quite easy to make sense of long and complex groups of words, but this can be a little trickier for a computer. The BERT algorithm is all about helping the computer to understand the context of these word sets. It’s not enough for Google to get the meaning of the word ‘like’ – it also has to understand that the word has more than one meaning depending on context. For example, the sentence ‘I like this car’ has a very different meaning to the sentence ‘This car is like a tank’.
It’s only words
Google’s latest algorithm works by using Natural Language Disambiguation. It uses what, you ask? Natural Language Disambiguation works with co-occurrence, which provides context and identifies connections between words. BERT itself is made up of three parts, which are as follows:
Bi-directional
For the first time, BERT uses bi-directional language modeling. Previously, Google’s algorithms could only view a word’s context window from left to right or from right to left. Clever BERT can see not only the words immediately to the left and right, but the whole sentence on either side of a word.
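To make that concrete, here’s a minimal sketch – using the open-source Hugging Face transformers library and the public bert-base-uncased checkpoint, not anything Google has confirmed it runs in Search – of BERT filling in a masked word. Notice that the clues (‘is like a tank’) sit to the right of the gap, which is exactly what a left-to-right-only model would miss:

```python
# A minimal sketch, assuming the open-source Hugging Face "transformers"
# library and the public bert-base-uncased model (not Google's internal setup).
from transformers import pipeline

fill_mask = pipeline("fill-mask", model="bert-base-uncased")

# The useful context ("is like a tank") comes AFTER the mask, so BERT must
# read the sentence in both directions to make a sensible guess.
for prediction in fill_mask("This [MASK] is like a tank."):
    print(prediction["token_str"], round(prediction["score"], 3))
```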
Encoder Representations
BERT works with an ‘in and out’ mechanism – the text that goes in is encoded into numerical representations that capture its context, and those representations can then be read back out, or decoded, by whichever task needs them, for extra flexibility.
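A rough sketch of what those ‘encoder representations’ look like in practice, again assuming the public bert-base-uncased model via Hugging Face transformers rather than Google’s own pipeline: each word goes in as text and comes out as a context-aware vector of numbers.

```python
# A rough sketch, assuming the public bert-base-uncased checkpoint from
# Hugging Face transformers (Google's production pipeline is not public).
from transformers import AutoTokenizer, AutoModel
import torch

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModel.from_pretrained("bert-base-uncased")

inputs = tokenizer("I like this car", return_tensors="pt")
with torch.no_grad():
    outputs = model(**inputs)

# One 768-dimensional vector per token. The vector produced for "like" here
# differs from the one it would get in "This car is like a tank", because
# the encoding depends on the surrounding context.
print(outputs.last_hidden_state.shape)  # e.g. torch.Size([1, 6, 768])
```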
Transformers
BERT uses innovative transformers, which weigh the relationship between every word in a set of text – working out, for example, which noun a pronoun such as ‘it’ refers to – and move back and forth in order to put the whole sentence or phrase into context.
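The mechanism behind this is attention: every word gets a score for how strongly it relates to every other word. The snippet below is a hedged illustration only – it uses the public bert-base-uncased model through Hugging Face transformers, and the choice of layer is arbitrary – but it shows how you can peek at which words the pronoun ‘it’ attends to:

```python
# A hedged illustration of attention, assuming the public bert-base-uncased
# model via Hugging Face transformers; the layer inspected below is arbitrary.
from transformers import AutoTokenizer, AutoModel
import torch

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModel.from_pretrained("bert-base-uncased", output_attentions=True)

sentence = "The trophy would not fit in the suitcase because it was too big."
inputs = tokenizer(sentence, return_tensors="pt")
with torch.no_grad():
    attentions = model(**inputs).attentions  # one tensor per layer

tokens = tokenizer.convert_ids_to_tokens(inputs["input_ids"][0])
it_position = tokens.index("it")

# Average attention (over heads) from "it" to every other token, last layer.
weights = attentions[-1][0].mean(dim=0)[it_position]
for token, weight in zip(tokens, weights.tolist()):
    print(f"{token:>10s}  {weight:.3f}")
```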
BERT’s resume
BERT is able to help with a number of natural language tasks, including the following (there’s a small sketch of one of them after this list):
- Named entity recognition
- Answering of questions
- Automatic summarisation
- Word sense disambiguation
- Textual entailment and next sentence prediction
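As promised, here’s a small sketch of one item from that list – question answering. It leans on the Hugging Face transformers library and a publicly available SQuAD-fine-tuned checkpoint, which is a stand-in for illustration, not anything Google has said it uses:

```python
# A small sketch of extractive question answering, assuming the Hugging Face
# transformers library and a public SQuAD-fine-tuned checkpoint (a stand-in,
# not Google's own model).
from transformers import pipeline

qa = pipeline(
    "question-answering",
    model="distilbert-base-cased-distilled-squad",
)

context = (
    "BERT, or Bidirectional Encoder Representations from Transformers, "
    "is a natural language processing framework open-sourced by Google."
)
result = qa(question="What does BERT stand for?", context=context)
print(result["answer"], round(result["score"], 3))
```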
So, what will BERT do?
Understanding human language
BERT will make a huge difference in the way that Google’s search interprets queries as it has a much fuller understanding of the nuances of human language.
Scaling conversational search
As well as making sense of the written word, BERT is also a whizz at understanding voice searches – something that has been massively problematic until now. This is very much in line with the increasing number of people choosing to search the internet this way.
Ranking adjustment
BERT’s super-sophisticated talents will mean that Google is able to provide much more accurate ranking of pages and sites. Although this may be good news for some, it may be a concern for others!
Although the jury is still out on BERT according to some algorithm experts, there’s little doubt that it’s a huge step forward for Google. The internet has for a long time struggled with the nuances of language and, in this respect, BERT is a game changer in lots of ways.
AUTHOR BIO
Zachary Hadlee is a Technology Journalist from London, currently based in Malaga. For 2 years now, he’s been writing stories about how our internet works – and how it is changing. From artificial intelligence to UX, things are happening today at a pace that can seem bewildering.