NB BERT-base

NB BERT-base is a general BERT-base model built on the large digital collection at the National Library of Norway. The model has the same architecture as the multilingual BERT Cased model and is trained on a wide variety of Norwegian text, both Bokmål and Nynorsk, spanning the last 200 years.

Version 1.1 of the model is general-purpose and should be fine-tuned for any specific downstream task.

NB BERT-base was produced and released by the AI Lab at the National Library of Norway and is currently among the best-performing models for Norwegian and other Scandinavian languages.
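Below is a minimal sketch of how such fine-tuning could start with the Hugging Face transformers library. The model identifier "NbAiLab/nb-bert-base", the two-label classification head, and the example sentence are illustrative assumptions, not part of this page.

```python
# Sketch: load NB BERT-base as the backbone for a downstream classification task.
# "NbAiLab/nb-bert-base" is an assumed Hugging Face model id; adjust as needed.
from transformers import AutoTokenizer, AutoModelForSequenceClassification

model_name = "NbAiLab/nb-bert-base"

tokenizer = AutoTokenizer.from_pretrained(model_name)
# Attach a fresh (untrained) classification head; num_labels=2 is just an example.
model = AutoModelForSequenceClassification.from_pretrained(model_name, num_labels=2)

# Tokenize a Norwegian (Bokmål) example sentence and run a forward pass.
inputs = tokenizer("Nasjonalbiblioteket har en stor digital samling.", return_tensors="pt")
outputs = model(**inputs)
print(outputs.logits.shape)  # torch.Size([1, 2]); the head is random until fine-tuned
```

From here, the model would typically be trained on labeled task data (for example with the transformers Trainer) before being used for predictions.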

