Google BERT is one of the biggest pieces of search engine news for 2020. This year we should already notice […]
Bidirectional Encoder Representations from Transformers, or BERT for short, is one of Google's new tools. The multinational announced this novelty at the end of last year, and it operates mainly within the search algorithm.
Google BERT will help the search engine understand the searches we do every day by using an algorithm that works more like human thinking.
Its main goals include reducing users' search time, presenting more relevant content, and excluding content that can be considered "fake news".
At an initial stage, the Google BERT algorithm will be tested only in English. The goal is then to extend it to other languages, including Portuguese. According to Pandu Nayak (Google Fellow and Vice President, Search), who is responsible for the algorithm, applying BERT models to both ranking and featured snippets in Search will make it easier and faster to find useful information. When it comes to ranking results, BERT will help Search better understand 1 in 10 searches in the US in English, and it will then be applied to more languages and countries over time. See article.
Content creation is a very important factor in developing a good strategy to keep your website and its content well positioned on Google. And that has always been the case!
As for the new Google BERT: if you were already focused on producing user-oriented content, whether answering questions or offering new solutions, you won't have much to worry about. The new algorithm cares about real people (humans), not robots. In other words, the system will recognize content that is genuinely useful to someone searching, not content that merely ticks every mandatory SEO metric.
Even so, Google warns that digital marketing and SEO experts should be prepared to monitor their websites, and their landing pages in particular, more closely. If you do not keep track of this change, your conversion rates could drop dramatically.
In this example, the search was "2019 brazil traveller to the US need a visa". One of the great improvements that comes with Google BERT relates to the meaning of words, for example in searches with prepositions like "for" and "to".
Once the real meaning is discovered, the understanding is totally different, and it is this precision that Google BERT aims to deliver right in the search box. In this case, we want to know whether a Brazilian travelling to the US needs a visa or not.
Previously, Google suggested results for users going from the US to Brazil, not the other way around. Google BERT picks up on these nuances: it understands that, in this query, "to" expresses direction, which changes the meaning of the whole search compared with a word like "for".
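To make the directionality point concrete, here is a toy sketch. It is emphatically not BERT: it is a hand-written rule (the function name and heuristic are ours, purely for illustration) that makes explicit the origin/destination information the word "to" carries in the example query. BERT's advantage is that it learns this kind of relationship from bidirectional context instead of relying on rules like this one.

```python
# Toy illustration of why "to" carries direction in a travel query.
# This is NOT BERT; it is a hand-written heuristic for a query shaped like
# "<year> <origin> traveler to <destination> ...". BERT learns the same
# distinction from the surrounding words rather than from fixed rules.

def parse_travel_direction(query: str):
    """Return (origin, destination) for a query of the form
    '<...> <origin> traveler to <destination> <...>', else None."""
    words = query.lower().split()
    if "to" not in words:
        return None
    i = words.index("to")
    # The word right after "to" is the destination side of the trip.
    destination = words[i + 1] if i + 1 < len(words) else None
    # Toy heuristic: the token just before "traveler" names the origin.
    before_to = words[:i]
    origin = before_to[-2] if len(before_to) >= 2 else None
    return origin, destination

# Swapping the two sides of "to" flips the meaning entirely:
print(parse_travel_direction("2019 brazil traveler to usa need a visa"))  # ('brazil', 'usa')
print(parse_travel_direction("2019 usa traveler to brazil need a visa"))  # ('usa', 'brazil')
```

The two calls show exactly the nuance described above: the same words in a different order around "to" describe the opposite trip, which is why a model that ignores prepositions returns the wrong results.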
At MD3 we keep up with tech news. Search engines are constantly evolving and deserve extra attention, both in content creation and in website implementation, so that whatever you manage stays at the top of the search results. We have the best professionals for this change; contact us directly in the chat or here.