Their release predicted that the update would affect 10% of queries, so roughly one in ten searches now has the intent behind it understood more accurately. The update is built on an impressive neural-network technique for natural language processing called Bidirectional Encoder Representations from Transformers, which is where BERT gets its name.
In short, this algorithm has learned how we use connecting words to change the meaning of a search. Take their example: previously, the query ‘do estheticians stand a lot at work’ would have surfaced an article comparing the jobs of a medical and a spa esthetician, simply because the word ‘stand’ appeared in both the search query and the article. In the query, however, the word carries a different meaning. BERT is clever enough to understand that difference in intent and now shows a far more relevant result about the physical demands of an esthetician’s job.
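To see why plain keyword matching trips over queries like this, here is a toy sketch in plain Python. The example documents and the scoring function are made up for illustration; real search ranking, and BERT in particular, uses contextual embeddings rather than word overlap:

```python
def keyword_overlap(query, doc):
    """Score a document naively by how many distinct query words it shares."""
    query_words = set(query.lower().split())
    doc_words = set(doc.lower().split())
    return len(query_words & doc_words)

query = "do estheticians stand a lot at work"

# Hypothetical snippets standing in for the two articles in Google's example.
docs = {
    "comparison": "medical esthetician and spa esthetician jobs stand apart at a glance",
    "physical demands": "the physical demands of work as an esthetician",
}

scores = {name: keyword_overlap(query, doc) for name, doc in docs.items()}
# Word overlap ranks the comparison article higher, even though the
# searcher is asking about being on their feet, not about job titles.
```

Because the comparison snippet happens to share the words ‘stand’, ‘at’, and ‘a’ with the query, the naive scorer prefers it, which is exactly the mismatch the update addresses: BERT reads ‘stand’ in the context of the whole sentence instead of as an isolated keyword.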