BERT (language model)

BERT, or Bidirectional Encoder Representations from Transformers, is a [[Transformer (machine learning model)|transformer-based]] machine learning technique for [[Natural language processing|natural language processing]] (NLP). It is designed to pre-train deep bidirectional representations from unlabeled text by jointly conditioning on both left and right context in all layers. As such, the pre-trained BERT model can be fine-tuned with just one additional output layer to create state-of-the-art models for a wide range of tasks, such as question answering and language inference, without substantial task-specific architecture modifications.
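The "jointly conditioning on both left and right context" described above can be illustrated by contrasting attention masks. The sketch below is a minimal illustration, not BERT's actual implementation: it compares a fully bidirectional mask (every token may attend to every position, as in BERT's encoder) with a causal, left-to-right mask (as in GPT-style decoders). The token sequence and function names are hypothetical, chosen only for demonstration.

```python
def bidirectional_mask(n):
    """Full attention, as in BERT's encoder: token i may attend to every position j."""
    return [[1] * n for _ in range(n)]

def causal_mask(n):
    """Left-to-right attention: token i may attend only to positions j <= i."""
    return [[1 if j <= i else 0 for j in range(n)] for i in range(n)]

# Illustrative sequence with a masked token in the middle.
tokens = ["the", "cat", "[MASK]", "sat", "down"]
n = len(tokens)

bi = bidirectional_mask(n)
lr = causal_mask(n)

# Under BERT's bidirectional mask, the [MASK] token at position 2 sees
# both its left context ("the cat") and its right context ("sat down"):
print(bi[2])  # [1, 1, 1, 1, 1]
# Under a causal mask it would see only the left context:
print(lr[2])  # [1, 1, 1, 0, 0]
```

This difference is what allows BERT's masked-language-model pre-training objective to use context on both sides of a prediction, rather than only the preceding tokens.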


[[Image:Detail-146190.jpg|thumb|center|A computer screen displaying a representation of the BERT model]]


== Background ==