What Is Google BERT & How Does It Work?

The Beginning

BERT began as a research paper published by Google's researchers and has since made its way into Search as a software update. The update started rolling out on October 25th, 2019. It helps the search engine evaluate queries better than before.

It has been described as the most significant change in Search since Google announced RankBrain, roughly four years earlier. In practice, it means you will see more relevant search results.

What is it?

Technically, BERT is an acronym for Bidirectional Encoder Representations from Transformers. It has been described as a revolution in content search and ranking. At its core, it is a language-processing model.

Essentially, it is Google's attempt to create closer correspondence between the searches people type and the content that appears as results: it lets the search engine understand context better. As an example, compare these two sentences: "He saw me." and "He cut the wood with a saw." The word 'saw' appears in both, but in different senses. Words like these are called homonyms.

There are also polysemous words, single words with several related meanings, such as "flurry" (a flurry of snow, a flurry of activity). Synonyms are different words that mean the same thing. All of this makes human language confusing for a search engine.

To understand BERT properly, we need to break it down into the elements that make the whole mechanism function.

Neural Networks and Algorithms:

To recognize patterns, neural networks are deployed. These are built on algorithms. An algorithm is a step-by-step procedure for solving a specific type of problem.

Natural Language Processing – NLP:

It is a field at the intersection of linguistics and computer science that deals with making computers capable of understanding the complexities of human language. It enables machines to discern how humans communicate with each other.

Research in this area has produced services we use every day, such as chatbots and word suggestions. BERT is a step forward in NLP.

The Actual Mechanism:

To understand how BERT works, we first need to grasp the term Bidirectional Learning. It means learning from a complete set of words at once, rather than reading them in order from left to right, or combining a left-to-right pass with a separate right-to-left pass. It is a context-based technique that lets the software interpret each word based on all the words around it, not just the words immediately next to it.

For instance, in "He cut the wood with a saw," a model that reads in only one direction may reach 'saw' without weighing the cues around it. A bidirectional approach considers both the preceding and succeeding words, so 'cut' and 'wood' make the intended sense of 'saw' clear.
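The idea can be sketched with a toy word-sense disambiguator. This is purely an illustration of the concept, not Google's implementation: the cue-word lists and sense names below are invented for the example. It scores each sense of an ambiguous word by counting cue words, using either all surrounding words (bidirectional) or only the words to the left (unidirectional).

```python
# Toy word-sense disambiguation: compare bidirectional context
# (all surrounding words) with unidirectional context (left side only).

# Hypothetical cue lists for the two senses of "saw".
SENSES = {
    "saw": {
        "tool": {"cut", "wood", "blade", "sharp"},
        "past_of_see": {"he", "me", "yesterday", "looked"},
    }
}

def disambiguate(tokens, index, mode="bidirectional"):
    """Pick the likeliest sense of tokens[index] by cue-word overlap."""
    if mode == "bidirectional":
        window = tokens[:index] + tokens[index + 1:]
    else:  # unidirectional: left context only
        window = tokens[:index]
    context = {w.lower().strip(".,") for w in window}
    scores = {sense: len(context & cues)
              for sense, cues in SENSES[tokens[index]].items()}
    return max(scores, key=scores.get)

tokens = "He used the saw to cut the wood".split()
print(disambiguate(tokens, 3, "bidirectional"))   # tool
print(disambiguate(tokens, 3, "unidirectional"))  # past_of_see
```

With only the left-hand words "He used the" to go on, the toy model picks the wrong sense; once the right-hand cues "cut" and "wood" are included, it resolves 'saw' correctly. Real bidirectional models like BERT learn these cues from data rather than from hand-written lists.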

The Extent of Usage:

Not much, to begin with. Google's Vice President of Search, Pandu Nayak, said it will impact about one in ten searches in the English language; it has not yet been rolled out for other languages.

He further said, “Particularly for longer, more conversational queries, or searches where prepositions like ‘for’ and ‘to’ matter a lot to the meaning, Search will be able to understand the context of the words in your query.”

SEO and Snippets:

A snippet is the short description Google shows from a website on the results page for your query. BERT is likely to affect these snippets too: as the contextual understanding of terms and phrases changes, so will the results chosen for them.

For instance, searching 'best camera smartphone' might previously have returned different results, because the software could sometimes ignore the word 'best.' Every single word matters for contextual understanding, and that is the problem this update addresses.
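A toy retrieval sketch (my own illustration, not Google's ranking code, with invented documents and an invented stop-word list) shows why dropping a "minor" word like 'best' can change results: with the word dropped, two different pages become indistinguishable.

```python
# Toy retrieval: score documents by query-term overlap, with and
# without dropping "minor" words from the query.

DOCS = {
    "camera-reviews": "the best camera smartphone picks of the year",
    "cheap-phones": "cheap camera smartphone deals",
}

IGNORED = {"best", "the", "a", "to", "for"}  # hypothetical stop-word list

def score(query, doc_text, drop_minor_words):
    q = set(query.lower().split())
    if drop_minor_words:
        q -= IGNORED
    return len(q & set(doc_text.lower().split()))

def top_docs(query, drop_minor_words):
    """Return all documents tied for the highest overlap score."""
    scores = {d: score(query, text, drop_minor_words)
              for d, text in DOCS.items()}
    best = max(scores.values())
    return sorted(d for d, s in scores.items() if s == best)

print(top_docs("best camera smartphone", drop_minor_words=False))
# ['camera-reviews']
print(top_docs("best camera smartphone", drop_minor_words=True))
# ['camera-reviews', 'cheap-phones']
```

Keeping 'best' in the query singles out the reviews page; dropping it leaves both pages tied, so the searcher may be shown a page that does not match their intent.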

Companies might see an improvement or a decline in their rankings. Queries that Google previously misread now have a better chance of being understood, so pages that genuinely match a query's intent are more likely to surface, while pages that only appeared relevant may slip.

Old vs. new:

The previous AI tool that Google developed was RankBrain, and some aspects of BERT may feel similar to it. But there are significant differences between them.

RankBrain is known for adjusting the final search results. It evaluates the current query, compares it to previous ones, and modifies the results accordingly. It is also interpretative: it reads into queries and sometimes shows results that do not even contain the words people typed.

For instance, if you search 'height of landmark in Canada,' it will show you the height of the CN Tower, even though the tower was not named in the query.

BERT, on the other hand, operates differently. Traditional NLP tools look only at what comes after a word, or only at what comes before it; they are unidirectional. BERT differs in that it looks both ways: it takes the content before and after a word to work out its meaning.

But that does not mean BERT will replace RankBrain. Google may mix its algorithms in different ways: BERT and RankBrain might run together, on their own, or alongside other algorithms.

Altering other Google Products:

There have been no categorical statements on this, but BERT might improve the user experience of Google Assistant. Reports suggest Google Assistant already answers queries more accurately than Siri and Alexa.

This update can only reinforce that lead and move us closer to a future where searches are made primarily by voice, not text.

BERT sits at the heart of grasping search intent. With this update, Google has again signaled that users come first and that content should be created according to what consumers actually want. It also matters for companies: they can now be more creative in making content that targets people, not machines.


About the Author: This article was written by Henry Wilson, an SEO & Content Architect at The Smart Media Solutions, which has been ranking websites since 2005.