Google is rolling out BERT, one of the biggest changes to Search in recent years and the largest since RankBrain.
BERT stands for Bidirectional Encoder Representations from Transformers: a family of models that process each word in relation to all the other words in a sentence, rather than one by one in order.
Because of this, BERT models can interpret the intended meaning of a word by looking at the words that come both before and after it, which leads to a better understanding of search queries.
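To make that concrete, here is a minimal sketch (my own illustration, not part of Google's announcement) using the open-source Hugging Face transformers library: the same word "bank" gets a different contextual embedding depending on the words around it.

```python
# Sketch: BERT's bidirectional context gives the same word different
# representations in different sentences. Assumes `torch` and
# `transformers` are installed (pip install torch transformers).
import torch
from transformers import AutoModel, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModel.from_pretrained("bert-base-uncased")
model.eval()

def embedding_for(sentence: str, word: str) -> torch.Tensor:
    """Return BERT's contextual embedding for `word` within `sentence`."""
    inputs = tokenizer(sentence, return_tensors="pt")
    with torch.no_grad():
        hidden = model(**inputs).last_hidden_state[0]  # (seq_len, 768)
    # Locate the token for `word` (single-token words only, for simplicity).
    idx = inputs.input_ids[0].tolist().index(tokenizer.convert_tokens_to_ids(word))
    return hidden[idx]

river = embedding_for("He sat on the bank of the river.", "bank")
money = embedding_for("He deposited cash at the bank.", "bank")
money2 = embedding_for("She opened an account at the bank.", "bank")

# The two financial senses of "bank" end up closer to each other
# than either is to the river sense.
print(torch.cosine_similarity(money, money2, dim=0).item())
print(torch.cosine_similarity(money, river, dim=0).item())
```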
Read more about BERT and how it impacts Search on the official Google blog:
https://blog.google/products/search/search-language-understanding-bert