BERT – new Google algorithm update
12/17/2019

Google presents its latest algorithm update, BERT (Bidirectional Encoder Representations from Transformers), as the biggest change to its algorithm in five years. In the announcement, Google said the update would affect up to 10% of search results. A lot of unverified information about the new update is circulating in the media. So what exactly is BERT, how does it work, and why does it matter for our work as SEOs?

What is BERT?

The latest Google algorithm update, BERT, helps Google better understand natural language, especially in conversational search. BERT is a natural language model pre-trained without supervision on plain text. After fine-tuning, it achieves top results on the most common NLP tasks, essentially acting as a booster for natural language processing and understanding. BERT is bidirectional: it looks at the words both before and after an entity, using context learned during pre-training on Wikipedia, to reach a better understanding of the language.

BERT (Bidirectional Encoder Representations from Transformers) is a deep learning algorithm for natural language processing. It helps a machine understand what the words in a sentence mean, with all the nuances of context. BERT has accelerated natural language understanding more than anything before it, and Google’s decision to open-source BERT is likely to change natural language processing forever. The machine learning and NLP communities are very excited about BERT because it removes a huge amount of the effort normally needed to do natural language research: the model comes pre-trained on an enormous amount of text, including all of English Wikipedia, which contains roughly 2.5 billion words.
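To give a rough feel for that pre-train / fine-tune workflow, here is a minimal sketch using the open-source Hugging Face transformers library (not Google’s internal setup); the checkpoint name, example text and labels are purely illustrative:

```python
# pip install transformers torch
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

# Load a publicly released pre-trained BERT checkpoint and attach a small
# classification head; only the head starts from random weights.
tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModelForSequenceClassification.from_pretrained(
    "bert-base-uncased", num_labels=2  # e.g. relevant / not relevant (hypothetical task)
)

# One fine-tuning step on a toy example, just to show the mechanics.
batch = tokenizer(["how to park a car on a hill"], return_tensors="pt")
labels = torch.tensor([1])
loss = model(**batch, labels=labels).loss
loss.backward()  # in real fine-tuning an optimizer would now update the weights
```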

What can BERT do?

There are things we understand easily but that machines, including search engines, cannot. Below are the problems the new algorithm is already dealing with.

Problem with words

The problem with words is that they are everywhere, and there is more and more content. Words are problematic because many of them are ambiguous, polysemous and synonymous. BERT was designed to help resolve ambiguous sentences and phrases made up of words with many possible meanings.

Ambiguity and polysemy

In many languages, almost every other word has several meanings. In voice search it gets even worse, because the same word can be heard differently depending on accent. This is not a major challenge for us, because we have common sense and context: we understand a word from the other words surrounding it and from the situation or conversation. Search engines and machines do not.
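To make the point concrete, here is a small sketch using the Hugging Face transformers library and the public bert-base-uncased checkpoint (an assumption for illustration, not anything Google ships in search): BERT gives the polysemous word “bank” two noticeably different vectors in two different sentences:

```python
import torch
from transformers import AutoTokenizer, AutoModel

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModel.from_pretrained("bert-base-uncased")

def word_vector(sentence: str, word: str) -> torch.Tensor:
    """Contextual embedding of the first occurrence of a single-token word."""
    encoding = tokenizer(sentence, return_tensors="pt")
    with torch.no_grad():
        hidden = model(**encoding).last_hidden_state[0]   # (seq_len, 768)
    word_id = tokenizer.convert_tokens_to_ids(word)
    position = (encoding["input_ids"][0] == word_id).nonzero()[0, 0]
    return hidden[position]

river = word_vector("He sat on the bank of the river.", "bank")
money = word_vector("She deposited the money at the bank.", "bank")
print(torch.cosine_similarity(river, money, dim=0))  # clearly below 1.0
```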

Context of the word

A word has no definite meaning until it is used in a specific context. The meaning of a word can literally change as the sentence develops, because of the many parts of speech it can play in a given context. The longer the sentence, the harder it is to keep track of every part of speech within it.

How BERT works

Previous language models built context-free word embeddings: each word got a single vector regardless of the sentence around it. BERT, by contrast, provides context. To better understand how BERT works, let’s look at what the abbreviation stands for.

B – Bidirectional

Earlier language models were unidirectional: they could move their context window in only one direction, reading the words either from left to right or from right to left, but never both at once. BERT is different: it uses bidirectional language modeling, taking the context on both sides of a word into account at the same time.
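A quick sketch of why this matters, again using the Hugging Face transformers library with the public bert-base-uncased checkpoint (illustrative only): the words to the right of the gap are what disambiguate it, which a strictly left-to-right model could never use:

```python
from transformers import pipeline

fill = pipeline("fill-mask", model="bert-base-uncased")

# The deciding context ("to withdraw some cash") comes AFTER the blank,
# so only a bidirectional model can exploit it when filling the gap.
for prediction in fill("She went to the [MASK] to withdraw some cash."):
    print(prediction["token_str"], round(prediction["score"], 3))
```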

ER – Encoder Representation

What is encoded is then decoded: it is an input-and-output mechanism.

T – Transformers

BERT uses ‘transformers’ and ‘masked language modeling’. One of the main problems with natural language understanding in the past was that models could not work out which context a word was referring to. A transformer’s attention mechanism focuses on pronouns and on all the related word meanings that, taken together, reveal who is being spoken to or what is being said in a given context. Masked language modeling hides the target word from the model: with the mask in place, BERT has to guess what the missing word is. This is also part of the training process.
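Here is a small sketch of that masking mechanism, using the Hugging Face transformers library and the public bert-base-uncased checkpoint (the example sentence is made up): the target word is replaced by a mask token and the model predicts what should be there:

```python
import torch
from transformers import AutoTokenizer, AutoModelForMaskedLM

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModelForMaskedLM.from_pretrained("bert-base-uncased")

# The target word is hidden behind the [MASK] token, so the model never sees it.
text = f"The update helps Google {tokenizer.mask_token} natural language."
encoding = tokenizer(text, return_tensors="pt")

with torch.no_grad():
    logits = model(**encoding).logits            # (1, seq_len, vocab_size)

# Find the mask position and list the model's five best guesses for the hidden word.
mask_position = (encoding["input_ids"] == tokenizer.mask_token_id).nonzero()[0, 1]
top_ids = logits[0, mask_position].topk(5).indices.tolist()
print(tokenizer.convert_ids_to_tokens(top_ids))
```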

How will BERT affect the search?

BERT will help Google to better understand human language

BERT’s grasp of the nuances of human language will have a huge impact on how Google interprets queries, because people increasingly search with longer, more conversational queries.

BERT will help scale conversational search

BERT will also have a huge impact on voice search.

Changes in international SEO

BERT’s abilities carry over from one language to others, because many patterns learned in one language translate into other languages. A large part of what the model learns can be transferred to different languages, even if it does not fully understand each of those languages itself. So Google may become better at understanding contextual nuance and ambiguous queries in more languages.
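As a rough illustration of that kind of transfer, here is a short sketch using the publicly released bert-base-multilingual-cased checkpoint from the Hugging Face transformers library (a related research model, not the system Google runs in search): one and the same model fills gaps in English and French sentences:

```python
from transformers import pipeline

# One multilingual checkpoint, many languages; the sentences are illustrative.
fill = pipeline("fill-mask", model="bert-base-multilingual-cased")

print(fill("Paris is the capital of [MASK].")[0]["token_str"])
print(fill("Paris est la capitale de la [MASK].")[0]["token_str"])
```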

Should we optimize content for BERT?

Probably not. Google BERT is a framework for better understanding. It does not evaluate content as such; it simply understands what is already there better. As Google starts to understand more, some over-optimized pages may suddenly be affected by something other than Panda, simply because BERT has worked out that the page is not actually a good answer for a given query. That does not mean you need to optimize text specifically for BERT; it is probably better to just write naturally.

Firecrux Crew

