Data Science by ODS.ai 🦜
Pre-training via Paraphrasing
Mike Lewis, Marjan Ghazvininejad, et al. (Facebook AI)

The authors introduce MARGE, a pre-trained seq2seq model learned with an unsupervised multi-lingual multi-document paraphrasing objective.
MARGE provides an alternative to the dominant masked language modeling paradigm: the model self-supervises the reconstruction of a target text by retrieving a set of related texts (in many languages) and conditioning on them to maximize the likelihood of generating the original. The authors show that retrieval and reconstruction can be learned jointly, given only a random initialization.
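To make the objective concrete, here is a minimal PyTorch sketch of the idea (our simplification, not the authors' architecture: the real MARGE biases the decoder's cross-attention by relevance inside a large multilingual transformer, while `MargeSketch`, the GRU encoder/decoder, and all sizes here are illustrative assumptions):

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class MargeSketch(nn.Module):
    def __init__(self, vocab_size=1000, d_model=64):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, d_model)
        self.encoder = nn.GRU(d_model, d_model, batch_first=True)
        self.decoder = nn.GRU(d_model, d_model, batch_first=True)
        self.out = nn.Linear(d_model, vocab_size)

    def doc_embedding(self, tokens):
        # mean-pooled encoder states serve as the retrieval embedding
        h, _ = self.encoder(self.embed(tokens))
        return F.normalize(h.mean(dim=1), dim=-1)

    def forward(self, target, evidence):
        # target: (T,) token ids; evidence: (M, S) ids of M retrieved docs
        q = self.doc_embedding(target.unsqueeze(0))      # (1, d)
        k = self.doc_embedding(evidence)                 # (M, d)
        rel = F.softmax((q @ k.t()).squeeze(0), dim=0)   # relevance f(x, z_i)

        # decode the target from each evidence doc; mixing the per-doc
        # reconstruction losses by relevance is the only training signal
        # the retriever gets
        losses = []
        for m in range(evidence.size(0)):
            _, ctx = self.encoder(self.embed(evidence[m:m + 1]))
            h, _ = self.decoder(self.embed(target[:-1].unsqueeze(0)), ctx)
            losses.append(F.cross_entropy(self.out(h).squeeze(0), target[1:]))
        return (rel * torch.stack(losses)).sum()

model = MargeSketch()
target = torch.randint(0, 1000, (12,))
evidence = torch.randint(0, 1000, (4, 15))
loss = model(target, evidence)   # joint retrieval + reconstruction loss
loss.backward()
```

Because the per-document losses are mixed by the relevance scores, the reconstruction gradient is the only thing training the retrieval embeddings, which is why retrieval can bootstrap from a random initialization.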
The objective noisily captures aspects of paraphrase, translation, multi-document summarization, and information retrieval, enabling strong zero-shot performance on several tasks. For example, with no additional task-specific training, the model achieves BLEU scores of up to 35.8 for document translation.
They further show that fine-tuning gives strong performance on a range of discriminative and generative tasks in many languages, making MARGE the most generally applicable pre-training method to date.
Future work should scale MARGE to more domains and languages and study how to align the pre-training objective more closely with different end tasks.


paper: https://arxiv.org/abs/2006.15020

#nlp #paraphrasing #unsupervised