Smaller, faster, cheaper, lighter: Introducing DistilBERT, a distilled version of BERT
Blog by Victor Sanh : https://medium.com/huggingface/distilbert-8cf3380435b5
#MachineLearning #NLP #Bert #Distillation #Transformers
🏎 Smaller, faster, cheaper, lighter: Introducing DistilBERT, a distilled version of BERT
You can find the code to reproduce the training of DistilBERT along with pre-trained weights for DistilBERT here.
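For readers who just want to try the released weights, here is a minimal sketch using the Hugging Face transformers library; the "distilbert-base-uncased" checkpoint name and the exact API calls are assumptions about the published release, not something spelled out in the post.

```python
# Minimal sketch (assumed setup): load the released DistilBERT weights via
# the Hugging Face transformers library. "distilbert-base-uncased" is an
# assumed checkpoint name for the pre-trained weights mentioned above.
import torch
from transformers import DistilBertModel, DistilBertTokenizer

tokenizer = DistilBertTokenizer.from_pretrained("distilbert-base-uncased")
model = DistilBertModel.from_pretrained("distilbert-base-uncased")

# Encode a sentence and run it through the distilled encoder.
inputs = tokenizer(
    "DistilBERT is smaller, faster, cheaper, and lighter.",
    return_tensors="pt",
)
with torch.no_grad():
    outputs = model(**inputs)

# Contextual embeddings: (batch_size, sequence_length, hidden_size)
print(outputs.last_hidden_state.shape)
```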