
BERT fine-tuning


I'm trying to create a model for question answering based on BERT and can't understand what fine-tuning means. Do I understand correctly that it is like adaptation to a specific domain? And if I want to use it with a Wikipedia corpus, do I just need to integrate the unchanged pre-trained model into my network?


Solution

  • Fine-tuning is adapting (refining) the pre-trained BERT model, i.e. continuing to train its weights on labelled examples, with respect to two things:

    1. Domain
    2. Task (e.g. classification, entity extraction, etc.).

    You can use the pre-trained model as-is at first; if its performance is sufficient for your use case, fine-tuning may not be needed.
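To make the distinction concrete, here is a deliberately tiny sketch (hypothetical toy model, not real BERT): the "pretrained encoder" and "task head" are single scalar weights. Fine-tuning means the pretrained weight is updated along with the new head on your task data; feature extraction (using the pretrained model as-is) keeps it frozen and trains only the head. In practice you would do the same thing with a library such as Hugging Face Transformers, where loading a pretrained BERT and training it on your QA data updates all of its weights by default.

```python
# Toy illustration of fine-tuning vs feature extraction.
# "pretrained_encoder" stands in for BERT; "task_head" is the new
# task-specific layer you add on top (e.g. a QA span classifier).

def pretrained_encoder(x, w):
    # Stand-in for the pretrained model: one learned weight.
    return x * w

def task_head(h, v):
    # New task-specific layer added on top of the encoder output.
    return h * v

def train_step(x, target, w, v, lr=0.1, finetune=True):
    """One gradient step on squared error for y = (x * w) * v.

    finetune=False freezes the pretrained weight w (feature
    extraction); finetune=True also updates w (fine-tuning).
    """
    y = task_head(pretrained_encoder(x, w), v)
    err = y - target
    grad_v = 2 * err * x * w   # d(err^2)/dv
    grad_w = 2 * err * x * v   # d(err^2)/dw
    v -= lr * grad_v
    if finetune:
        w -= lr * grad_w
    return w, v

w, v = 1.0, 0.5  # w: "pretrained" weight, v: freshly initialised head
for _ in range(50):
    w, v = train_step(x=1.0, target=2.0, w=w, v=v, finetune=True)

# After training, the model output for x=1.0 is close to the target 2.0,
# and w itself has moved away from its "pretrained" value of 1.0.
print(round(pretrained_encoder(1.0, w) * v, 2))
```

With `finetune=False` the same loop would still fit the target here (the head alone is expressive enough in this toy), but on a real task the frozen encoder limits what the model can represent, which is why fine-tuning usually helps when the target domain or task differs from the pretraining data.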