Core Concepts Wiki Entry

BERT

Bidirectional Encoder Representations from Transformers, a language model used in Google Search.

BERT, deployed in Google Search in 2019, helps the ranking systems understand the context and intent of conversational queries by analysing words bidirectionally rather than only left to right. BERT particularly improved the interpretation of prepositions, negations and ambiguous phrasings. The technology forms a foundation for subsequent language models, including MUM and Gemini.
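
As an illustration of what "bidirectional" means in practice, the short Python sketch below queries a BERT masked-language-model head through the open-source Hugging Face transformers library and the bert-base-uncased checkpoint (both are assumptions for illustration only; Google Search's production BERT deployment is separate and not publicly available). The model fills in a hidden word using the words on both sides of the gap, not just the words that come before it.

```python
# Minimal sketch of BERT's bidirectional masked-word prediction.
# Assumes the open-source Hugging Face `transformers` library and the
# public `bert-base-uncased` checkpoint; this is not Google Search's
# internal system.
from transformers import pipeline

# The fill-mask pipeline asks BERT to predict the token hidden behind
# [MASK], using the context on BOTH sides of the gap.
fill_mask = pipeline("fill-mask", model="bert-base-uncased")

sentence = "She walked [MASK] the pharmacy to pick up her prescription."

# Print the top candidate words and their scores.
for result in fill_mask(sentence):
    print(f"{result['token_str']:>10}  score={result['score']:.3f}")
```

Running the sketch prints the model's top candidates for the masked word; changing the words to the right of [MASK] changes those candidates, which a strictly left-to-right model could not take into account.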

Want to apply this in practice?

The Complete Wix SEO Course covers this topic and 67 others in step-by-step lessons designed for real Wix sites.

Explore the course