Best paper award at #emnlp2018
"Phrase-Based & Neural Unsupervised Machine Translation"
Lample et al.: https://arxiv.org/abs/1804.07755
Code:
https://github.com/facebookresearch/UnsupervisedMT
Blog:
https://code.fb.com/ai-research/unsupervised-machine-translation-a-novel-approach-to-provide-fast-accurate-translations-for-more-languages/
#computation #deeplearning #language
https://t.me/ArtificialIntelligenceArticles
Neural Approaches to Conversational AI
Gao et al.: https://arxiv.org/abs/1809.08267
#computation #language #machinelearning
"Language GANs Falling Short"
Caccia et al.: https://arxiv.org/pdf/1811.02549.pdf
#artificialintelligence #deeplearning #generativeadversarialnetworks #language #machinelearning
Transferable Multi-Domain State Generator for Task-Oriented Dialogue Systems
By Chien-Sheng Wu, Andrea Madotto, Ehsan Hosseini-Asl, Caiming Xiong, Richard Socher, Pascale Fung: https://arxiv.org/abs/1905.08743
#Computation #Language #ArtificialIntelligence
"Automated Speech Generation from UN General Assembly Statements: Mapping Risks in AI Generated Texts"
Bullock et al.: https://arxiv.org/abs/1906.01946
#Computation #Language #AIEthics #AIGovernance #ArtificialIntelligence
@ArtificialIntelligenceArticles
Language as an Abstraction for Hierarchical Deep Reinforcement Learning
Jiang et al.: https://arxiv.org/abs/1906.07343
#reinforcementlearning #language #machinelearning
Neural Decipherment via Minimum-Cost Flow: from Ugaritic to Linear B
Luo et al.: https://arxiv.org/abs/1906.06718
#ArtificialIntelligence #Computation #Language
Explain Yourself! Leveraging Language Models for Commonsense Reasoning
Rajani et al.: https://arxiv.org/abs/1906.02361
Github: https://github.com/salesforce/cos-e
Blog: https://blog.einstein.ai/leveraging-language-models-for-commonsense/
#Computation #Language #MachineLearning
"Two neural nets learn to communicate through their own emergent visual language"
Here set in clay tablets.
By Joel Simon: https://github.com/joel-simon/dimensions-of-dialogue/blob/master/emergent_characters.ipynb
#language #neuralnetwork #deeplearning
ParaQG: A System for Generating Questions and Answers from Paragraphs
Kumar et al.: https://arxiv.org/abs/1909.01642
#ArtificialIntelligence #Language #MachineLearning
Deep networks work by learning complex, often hierarchical internal representations of input data. These form a kind of functional language the network uses to describe the data.
Language can emerge from tasks like object recognition: has pointy ears, whiskers, tail => cat.
This relates to Wittgenstein’s "language-game" in Philosophical Investigations, in which a functional language emerges from simple tasks before any vocabulary is defined.
The visual vocabulary of a convolutional neural network seems to emerge from low-level features such as edges and orientations, builds up to textures, patterns and composites, and builds up further still into complete objects: houses, dogs, etc.
Source: NeurIPS 2018 “Unsupervised Deep Learning” tutorial, Part 1, by Alex Graves - https://media.neurips.cc/Conferences/NIPS2018/Slides/Deep_Unsupervised_Learning.pdf
#artificialintelligence #deeplearning #language #machinelearning
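One reason this edges-to-objects hierarchy can form is that each stacked convolutional layer lets a unit "see" a larger patch of the input, so deeper layers can describe larger structures. A minimal sketch (plain Python, with a hypothetical layer stack chosen for illustration) of how the receptive field grows with depth:

```python
def receptive_field(layers):
    """Receptive-field size of one unit after a stack of conv layers.

    Each layer is (kernel_size, stride). Standard recurrence:
    rf grows by (kernel - 1) * jump, and jump multiplies by stride.
    """
    rf, jump = 1, 1
    for kernel, stride in layers:
        rf += (kernel - 1) * jump
        jump *= stride
    return rf

# Five 3x3 stride-1 convolutions: each layer adds 2 pixels of context,
# so the deepest unit sees an 11x11 input patch.
print(receptive_field([(3, 1)] * 5))

# Interleaving stride-2 layers grows the field much faster,
# which is how late layers come to cover whole objects.
print(receptive_field([(3, 2), (3, 1), (3, 2), (3, 1)]))
```

The same recurrence underlies why early layers can only encode local features like edges while later layers can encode composites and objects.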
FlauBERT: Unsupervised Language Model Pre-training for French
Le et al.: https://arxiv.org/abs/1912.05372
#Computation #Language #MachineLearning