#SparkNLP: State of the Art Natural Language Processing
Spark NLP ships with many NLP features, pre-trained models and pipelines #johnsnowlab
NLP Features:
#Tokenization; #Normalizer; #Stemmer; #Lemmatizer; #RegexMatching; #TextMatching; #Chunking; #DateMatcher; #Part-of-speech tagging; #SentenceDetector; #SentimentDetection (ML model); #SpellChecker (ML and DL models); #WordEmbeddings (#BERT and #GloVe); #Namedentityrecognition; #Dependencyparsing (labeled/unlabeled); Easy #TensorFlow integration; #pretrainedpipelines!
Github: https://lnkd.in/fbWquan
Website: https://lnkd.in/fRqsDHX
✴️ @AI_Python_EN
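As a minimal, hedged sketch of what using one of the pretrained pipelines looks like from PySpark (the pipeline name "explain_document_dl" and the output keys are assumptions that vary by Spark NLP release):

```python
# Minimal sketch: running a Spark NLP pretrained pipeline from PySpark.
# Assumes pyspark and spark-nlp are installed; the pipeline name
# "explain_document_dl" is one of the published pipelines and may differ by release.
import sparknlp
from sparknlp.pretrained import PretrainedPipeline

spark = sparknlp.start()  # starts a SparkSession with the Spark NLP jar attached

pipeline = PretrainedPipeline("explain_document_dl", lang="en")

result = pipeline.annotate(
    "Spark NLP ships with tokenization, lemmatization and NER out of the box."
)
print(result["token"])     # tokens
print(result["lemma"])     # lemmas
print(result["entities"])  # recognized named-entity chunks
```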
Microsoft has open-sourced scripts and notebooks to pre-train and fine-tune the BERT natural language model on domain-specific texts
Github: https://github.com/microsoft/AzureML-BERT
#Bert #Microsoft #NLP #dl
✴️ @AI_PYTHON_EN
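The repository wires the actual scripts into Azure ML; purely as an illustrative sketch of the idea of domain-adaptive pre-training (not the repo's code), the same thing can be expressed with Hugging Face transformers. The corpus file name and hyperparameters below are assumptions:

```python
# Illustrative sketch (not the AzureML-BERT scripts): continuing BERT's
# masked-language-model pre-training on domain-specific text with Hugging Face
# transformers. File name and hyperparameters are assumptions.
from transformers import (BertTokenizerFast, BertForMaskedLM,
                          DataCollatorForLanguageModeling, Trainer, TrainingArguments)
from datasets import load_dataset

tokenizer = BertTokenizerFast.from_pretrained("bert-base-uncased")
model = BertForMaskedLM.from_pretrained("bert-base-uncased")

# domain_corpus.txt: one document or paragraph per line (hypothetical file)
dataset = load_dataset("text", data_files={"train": "domain_corpus.txt"})["train"]
dataset = dataset.map(
    lambda ex: tokenizer(ex["text"], truncation=True, max_length=128),
    batched=True, remove_columns=["text"],
)

collator = DataCollatorForLanguageModeling(tokenizer=tokenizer, mlm_probability=0.15)
args = TrainingArguments(output_dir="bert-domain", num_train_epochs=1,
                         per_device_train_batch_size=16)
Trainer(model=model, args=args, train_dataset=dataset, data_collator=collator).train()
```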
Self-supervised QA from Facebook AI
Researchers from Facebook AI published a paper exploring unsupervised extractive question answering: question-answer pairs are generated without annotation via cloze translation and then used to train a supervised question answering model. The approach achieves 56.4 F1 on SQuAD v1.1.
Original paper:
https://research.fb.com/wp-content/uploads/2019/07/Unsupervised-Question-Answering-by-Cloze-Translation.pdf?
Code for experiments:
https://github.com/facebookresearch/UnsupervisedQA
#NLP #BERT #FacebookAI #SelfSupervised
❇️ @AI_Python_EN
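As a toy illustration of the cloze step (not the authors' pipeline): pick a named entity as the answer and mask it out of its sentence, yielding a cloze-style question that can later be rewritten into natural language and used to train a QA model. spaCy is used here only for convenience and is an assumption of the sketch:

```python
# Toy sketch of the cloze step behind unsupervised extractive QA:
# a named entity becomes the "answer", its sentence with the entity masked
# becomes the cloze question, and the paragraph is kept as the context.
# spaCy is an assumption of this sketch; the paper's pipeline differs.
import spacy

nlp = spacy.load("en_core_web_sm")

def make_cloze_examples(paragraph):
    examples = []
    doc = nlp(paragraph)
    for sent in doc.sents:
        for ent in sent.ents:
            cloze = sent.text.replace(ent.text, "[MASK]", 1)
            examples.append({"cloze_question": cloze,
                             "answer": ent.text,
                             "context": paragraph})
    return examples

for ex in make_cloze_examples("Facebook AI released the UnsupervisedQA code in 2019."):
    print(ex)
```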
Simple, Scalable Adaptation for Neural Machine Translation
Fine-tuning pre-trained Neural Machine Translation (NMT) models is the dominant approach for adapting to new languages and domains. However, fine-tuning requires adapting and maintaining a separate model for each target task. Researchers from Google propose a simple yet efficient approach for adaptation in #NMT: injecting tiny task-specific adapter layers into a pre-trained model. These lightweight adapters, with just a small fraction of the original model size, adapt the model to multiple individual tasks simultaneously.
Presumably this can be applied not only to #NMT but to many other #NLP, #NLU and #NLG tasks as well.
Paper: https://arxiv.org/pdf/1909.08478.pdf
#BERT
❇️ @AI_Python_EN
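A hedged PyTorch sketch of the general bottleneck-adapter pattern (layer norm, down-projection, nonlinearity, up-projection, residual connection); the dimensions and placement are illustrative assumptions rather than the paper's exact configuration:

```python
# Sketch of a bottleneck adapter layer: down-project -> nonlinearity -> up-project,
# added residually on top of a frozen pre-trained sublayer's output.
# Dimensions are illustrative assumptions.
import torch
import torch.nn as nn

class Adapter(nn.Module):
    def __init__(self, hidden_dim: int, bottleneck_dim: int = 64):
        super().__init__()
        self.layer_norm = nn.LayerNorm(hidden_dim)
        self.down = nn.Linear(hidden_dim, bottleneck_dim)
        self.up = nn.Linear(bottleneck_dim, hidden_dim)

    def forward(self, hidden_states: torch.Tensor) -> torch.Tensor:
        # the residual connection keeps the frozen model's representation intact
        residual = hidden_states
        x = self.layer_norm(hidden_states)
        x = self.up(torch.relu(self.down(x)))
        return residual + x

# Usage: freeze the pre-trained model and train only a per-task adapter.
hidden = torch.randn(2, 10, 512)   # (batch, seq_len, hidden_dim)
adapter = Adapter(hidden_dim=512)
out = adapter(hidden)              # same shape as the input
print(out.shape)
```

Only the adapter parameters would be trained per task while the base model stays frozen, which is what keeps the per-task footprint small.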