Google has announced that it is starting to use an artificial intelligence system developed in its research labs, known as BERT (which stands for “Bidirectional Encoder Representations from Transformers”), to help answer conversational English-language queries, initially from US users. The new technology is intended to improve the results Google delivers when you type in a search query. Google describes the development as “representing the biggest leap forward in the past five years, and one of the biggest leaps forward in the history of Search.”


How will BERT impact Search and SEO?

BERT is designed to help Search better understand the context of words used in searches and better match those queries with more helpful results. 

The use of BERT will particularly help provide better search results for longer, more conversational queries, or searches where prepositions like “for” and “to” matter a lot to the meaning. The aim is to let you search in a way that feels more natural to you.

Search algorithm expert Dawn Anderson (@dawnieando on Twitter) responded that the new Google update won’t help websites that are poorly written. 

Anderson observed:

“It’s knocking human understanding out of the water in loads of natural language understanding tasks. BERT is like a WordPress plugin which is a starting point and then they customise it and improve it.

The word “rose” means several things but it’s exactly the same word. The context must accompany the word otherwise the word means nothing.”

“Bass means different things. There are different meanings for the single words. The context around the word provides more meaning.”
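Anderson’s point about polysemy is exactly what contextual models like BERT are built to capture. As a rough illustration (an assumed setup, not Google’s production system), the following Python sketch uses the open-source Hugging Face transformers library and the public bert-base-uncased checkpoint to show that the same word “bass” receives a different vector depending on the sentence around it:

```python
# A rough sketch (not Google's production setup) of how a BERT-style model
# gives the same word different vectors depending on context. Assumes the
# Hugging Face "transformers" library and the public "bert-base-uncased"
# checkpoint, and that "bass" is a single token in its vocabulary.
import torch
from transformers import AutoTokenizer, AutoModel

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModel.from_pretrained("bert-base-uncased")

def embedding_for(sentence: str, word: str) -> torch.Tensor:
    """Return the contextual vector the model assigns to `word` in `sentence`."""
    inputs = tokenizer(sentence, return_tensors="pt")
    with torch.no_grad():
        hidden = model(**inputs).last_hidden_state[0]  # (num_tokens, 768)
    tokens = tokenizer.convert_ids_to_tokens(inputs["input_ids"][0])
    return hidden[tokens.index(word)]

fish = embedding_for("he caught a bass in the lake", "bass")
music = embedding_for("she plays bass in a band", "bass")

# Identical spelling, but the surrounding words push the vectors apart.
similarity = torch.cosine_similarity(fish, music, dim=0)
print(f"cosine similarity: {similarity.item():.2f}")  # noticeably below 1.0
```

The two vectors come out noticeably different: the model’s representation of “bass” shifts with the context, which is precisely the behaviour a static word-by-word lookup cannot provide.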

Two examples of the BERT update in action

Google's older search technology would treat queries as a “bag of words,” search vice president Pandu Nayak said on Thursday. In other words, it threw out much of the information about the sequence of words and considered only the words it thought were important.
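To make that limitation concrete, here is a toy Python sketch (illustrative only; Google’s actual code is not public) of a bag-of-words treatment with a hypothetical list of “unimportant” words. Once order and stopwords are thrown away, a query about a hill with no curb becomes indistinguishable from one about a hill with a curb, exactly the failure mode described in the second example below:

```python
# A toy illustration (not Google's actual code) of the "bag of words"
# approach Nayak describes: word order is discarded, and words judged
# unimportant, such as "no", are dropped entirely.
from collections import Counter

STOPWORDS = {"a", "on", "with", "no"}  # a hypothetical "unimportant" list

def bag_of_words(query: str) -> Counter:
    """Reduce a query to counts of its 'important' words."""
    return Counter(w for w in query.lower().split() if w not in STOPWORDS)

with_curb = bag_of_words("parking on a hill with a curb")
no_curb = bag_of_words("parking on a hill with no curb")

print(with_curb == no_curb)  # True: the crucial word "no" has vanished
```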

For example, results for “can you get medicine for someone pharmacy” would previously have served a link to a 2017 MedlinePlus article about getting a prescription filled, missing the point that the searcher wanted to know how to pick up a prescription for someone else. Using BERT, Google’s search engine now shows a 2002 article from the Department of Health and Human Services about how to have a friend or family member pick up the medicine on your behalf.

As Jeff Dean, Google's senior vice president of AI, explained, BERT essentially teaches itself about language by playing a game: Google engineers trained the model by feeding it paragraphs in which 10% to 15% of the words had been randomly deleted, and having it guess the missing words.
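The “fill in the blank” game Dean describes is known publicly as masked language modelling, and you can play it yourself with the open-source release of BERT. A minimal sketch, assuming the Hugging Face transformers library and the public bert-base-uncased checkpoint (not Google’s internal training setup):

```python
# Masked language modelling demo: BERT guesses a deleted word from the
# context on both sides of it. Uses the public bert-base-uncased
# checkpoint via Hugging Face transformers (an assumption for
# illustration; Google's internal training code is not public).
from transformers import pipeline

unmasker = pipeline("fill-mask", model="bert-base-uncased")

for guess in unmasker("You can pick up a prescription at the [MASK]."):
    print(f"{guess['token_str']!r}  score={guess['score']:.2f}")
# Plausible completions such as 'pharmacy' typically rank near the top.
```

Because the blank can sit anywhere in the sentence, the model must use the words on both sides of it, which is the “bidirectional” part of BERT’s name.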

Another example where the BERT update makes a significant leap forward in understanding is “parking on a hill with no curb”. This is the kind of phrase in which Google would typically have figured the word “curb” was important but not “no”, which meant you might get a result about parking on a hill that actually had a curb. BERT should be more adept at understanding the key word “no,” and give a result that reflects that.

How does the BERT update affect me?

On-page SEO becomes more important in terms of using precise words: since BERT analyses search queries rather than web pages, content that answers a query directly and exactly will be matched ahead of vaguer pages. In other words, sloppy content will be penalised in this update. See this as an opportunity to bring in more website visitors with content that is more focused and well organised.