Cutting Edge Deep Learning
262 subscribers
193 photos
42 videos
51 files
363 links
📕 Deep learning
📗 Reinforcement learning
📘 Machine learning
📙 Papers - tools - tutorials

🔗 Other Social Media Handles:
https://linktr.ee/cedeeplearning
🔻 More Efficient NLP Model Pre-training with ELECTRA
Recent advances in language pre-training have led to substantial gains in the field of natural language processing, with state-of-the-art models such as BERT, RoBERTa, XLNet, ALBERT, and T5, among many others. These methods, though they differ in design, share the same idea of leveraging a large amount of unlabeled text to build a general model of language understanding before being fine-tuned on specific NLP tasks such as sentiment analysis and question answering.
https://ai.googleblog.com/
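ELECTRA's pre-training task is replaced-token detection: a small generator corrupts some tokens, and the discriminator learns to predict, for every position, whether the token is original or replaced. As a rough, framework-free sketch of that corruption step (toy code, not ELECTRA's actual implementation):

```python
import random

def corrupt(tokens, vocab, replace_prob=0.3, seed=0):
    """Replace a fraction of tokens with other vocabulary items and
    record which positions were replaced (1) vs. kept (0)."""
    rng = random.Random(seed)
    corrupted, labels = [], []
    for tok in tokens:
        if rng.random() < replace_prob:
            corrupted.append(rng.choice([v for v in vocab if v != tok]))
            labels.append(1)  # replaced
        else:
            corrupted.append(tok)
            labels.append(0)  # original
    return corrupted, labels

tokens = ["the", "chef", "cooked", "the", "meal"]
vocab = ["the", "chef", "cooked", "meal", "ate", "dog"]
corrupted, labels = corrupt(tokens, vocab)
# A discriminator would be trained to recover `labels` from `corrupted`,
# i.e. classify every position as original vs. replaced.
```

In the real model the generator is a small masked language model rather than a random sampler, but the discriminator's per-token objective is the same shape.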

#NLP
#deeplearning
#pretraining

@cedeeplearning
#Torch-Struct: Deep Structured Prediction Library

The literature on structured prediction for #NLP describes a rich collection of distributions and algorithms over #sequences, #segmentations, #alignments, and #trees; however, these algorithms are difficult to utilize in deep learning frameworks. We introduce Torch-Struct, a library for structured prediction designed to take advantage of and integrate with vectorized, auto-differentiation-based #frameworks. Torch-Struct includes a broad collection of #probabilistic structures accessed through a simple and flexible distribution-based API that connects to any deep learning model. The library utilizes batched, vectorized operations and exploits auto-differentiation to produce readable, fast, and testable code. Internally, we also include a number of general-purpose optimizations to provide cross-algorithm efficiency. Experiments show significant performance gains over fast baselines, and case studies demonstrate the benefits of the library. Torch-Struct is available at:

Code: https://github.com/harvardnlp/pytorch-struct
Paper: https://arxiv.org/abs/2002.00876v1
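Torch-Struct wraps structures such as the linear-chain CRF in batched, differentiable PyTorch distribution objects. As a library-free illustration of the kind of dynamic program such a distribution encapsulates, here is a plain-Python forward algorithm for the log-partition function of a linear chain (a sketch only, not the library's API):

```python
import math

def log_partition(log_potentials):
    """Forward algorithm for a linear-chain model.

    log_potentials[t][i][j] scores moving from label i at step t to
    label j at step t+1.  The log-partition function sums (in log
    space) over every possible label sequence -- the normalizer a
    linear-chain CRF distribution needs.
    """
    n_labels = len(log_potentials[0])
    alpha = [0.0] * n_labels  # log-scores of prefixes ending in each label
    for step in log_potentials:
        alpha = [
            math.log(sum(math.exp(alpha[i] + step[i][j])
                         for i in range(n_labels)))
            for j in range(n_labels)
        ]
    return math.log(sum(math.exp(a) for a in alpha))

# Two transitions over three positions with two labels and all-zero
# potentials: 2**3 = 8 equally weighted sequences, so log(8).
uniform = [[[0.0, 0.0], [0.0, 0.0]]] * 2
print(round(log_partition(uniform), 4))  # 2.0794
```

The library's contribution is doing exactly this kind of computation batched and vectorized on tensors, with gradients (and hence marginals) obtained by auto-differentiation instead of hand-written backward passes.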

@cedeeplearning
🔻 Does Deep Learning Come from the Devil?

Deep learning has revolutionized #computer_vision and natural language processing (#NLP). Yet the #mathematics explaining its success remains elusive. At the Yandex conference on machine learning prospects and applications, Vladimir Vapnik offered a critical perspective.

🔹 We suggest you tap the link below 🔹

link: https://www.kdnuggets.com/2015/10/deep-learning-vapnik-einstein-devil-yandex-conference.html

📌 Via: @cedeeplearning

#deeplearning
#machinelearning
#neuralnetworks
🔹 Finding a good read among billions of choices

If an algorithm knows what you liked in the past, it can scan millions of possibilities for something similar, and as natural language processing techniques improve, those "you might also like" suggestions are getting speedier and more relevant. With the MIT-IBM Watson AI Lab and his Geometric Data Processing Group at MIT, Solomon recently presented a new technique for cutting through massive amounts of text at the Conference on Neural Information Processing Systems (NeurIPS). Their method combines three popular text-analysis tools (topic modeling, word embeddings, and optimal transport) to deliver better, faster results than competing methods on a popular benchmark for classifying documents.
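Optimal transport, one of the three tools combined here, measures the minimum "work" needed to morph one distribution into another. A minimal one-dimensional illustration on histograms over the same ordered bins (a toy example for intuition, not the paper's algorithm):

```python
def wasserstein_1d(p, q):
    """Earth mover's (optimal-transport) distance between two 1-D
    distributions on the same ordered bins: the area between their
    cumulative distribution functions."""
    cp = cq = dist = 0.0
    for pi, qi in zip(p, q):
        cp += pi
        cq += qi
        dist += abs(cp - cq)
    return dist

# Shifting all mass one bin to the right costs exactly 1 unit of work.
print(wasserstein_1d([1.0, 0.0, 0.0], [0.0, 1.0, 0.0]))  # 1.0
```

In the document-similarity setting the distributions live over word embeddings rather than ordered bins, which makes the transport problem harder, but the quantity being minimized is the same.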
————————————————
link: http://news.mit.edu/2019/finding-good-read-among-billions-of-choices-1220

📌 Via: @cedeeplearning

#deeplearning
#NLP
#neuralnetworks
🔹 How Algorithms Can Predict Our Intentions Faster Than We Can

Artificial Intelligence (AI) and Natural Language Processing (NLP) can gather data from anywhere online where we leave a mark. This includes our social media posts, our email, and even any small comments we leave on blog posts. Every trace we leave online allows NLP to track and predict our future decisions.
This article highlights how NLP can impact our day-to-day lives through a series of case studies.

——————————————————
https://www.entrepreneur.com/article/328776

📌 Via: @cedeeplearning

#NLP
#AI
#machinelearning
#deeplearning
#algorithm
🔻 Unlocking potentials of NLP to fight against COVID-19 crisis

DAMO's existing model has already been deployed widely in Alibaba's ecosystem, powering its customer-service AI chatbot and the search engine on Alibaba's retail platforms, as well as anonymous healthcare data analysis. The model was used in the text analysis of medical records and epidemiological investigation by CDCs in different cities in China for fighting against #COVID-19.
—————————————————
📌 Via: @cedeeplearning
📌 Other social media: https://linktr.ee/cedeeplearning

https://www.analyticsinsight.net/unlocking-potentials-nlp-fight-covid-19-crisis/

#machinelearning
#deeplearning
#neuralnetworks
#NLP
🔹 BENEFITS OF SPARK NLP

1. It's very accurate
2. Reduced training model sizes
3. It's fast
4. It is fully supported by Spark
5. It is scalable
6. Extensive functionality and support
7. A large community
——————————————
📌 Via: @cedeeplearning
📌 Social media: https://linktr.ee/cedeeplearning

link: https://www.analyticsinsight.net/benefits-of-spark-nlp/

#spark
#NLP
#deeplearning
#neuralnetworks
🔻 HOW TO SOLVE 90% OF NLP PROBLEMS: A STEP-BY-STEP GUIDE

🔹 by Emmanuel Ameisen

Whether you are an established company or working to launch a new service, you can always leverage text data to validate, improve, and expand the functionalities of your product. The science of extracting meaning and learning from text data is an active topic of research called Natural Language Processing (#NLP).
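A common first step in any such pipeline is turning raw text into fixed-length vectors a classifier can consume. A minimal bag-of-words sketch (toy code, assuming simple whitespace tokenization):

```python
def bag_of_words(docs):
    """Map each document to a vector of word counts over a shared
    vocabulary -- the simplest text embedding used in classic NLP
    pipelines before training a classifier on top."""
    vocab = sorted({w for doc in docs for w in doc.lower().split()})
    index = {w: i for i, w in enumerate(vocab)}
    vectors = []
    for doc in docs:
        v = [0] * len(vocab)
        for w in doc.lower().split():
            v[index[w]] += 1
        vectors.append(v)
    return vocab, vectors

vocab, vecs = bag_of_words(["the cat sat", "the cat ate the cat"])
print(vocab)  # ['ate', 'cat', 'sat', 'the']
print(vecs)   # [[0, 1, 1, 1], [1, 2, 0, 2]]
```

Real pipelines add normalization, n-grams, or TF-IDF weighting, but the shape of the representation is the same: one row per document, one column per vocabulary item.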
————————————
📌 Via: @cedeeplearning

https://www.topbots.com/solve-ai-nlp-problems-guide/

#deeplearning
#neuralnetworks
#machinelearning
#text_data
#datascience
NLP.pdf
701.7 KB
🔹 A Primer on Neural Network Models
for Natural Language Processing

This tutorial covers input encoding for natural language tasks, feed-forward networks, convolutional networks, recurrent networks and recursive networks, as well as the computation graph abstraction for automatic gradient computation.
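The computation-graph abstraction the primer ends with can be illustrated in a few lines of scalar reverse-mode autodiff (a toy sketch, not the tutorial's code):

```python
class Var:
    """Minimal computation-graph node with reverse-mode autodiff:
    each node remembers its parents and the local gradient of its
    value with respect to each parent."""
    def __init__(self, value, parents=()):
        self.value = value
        self.parents = parents  # (parent_var, local_gradient) pairs
        self.grad = 0.0

    def __add__(self, other):
        return Var(self.value + other.value, ((self, 1.0), (other, 1.0)))

    def __mul__(self, other):
        return Var(self.value * other.value,
                   ((self, other.value), (other, self.value)))

    def backward(self, seed=1.0):
        """Accumulate gradients by propagating the chain rule back
        through the recorded graph."""
        self.grad += seed
        for parent, local in self.parents:
            parent.backward(seed * local)

x, y = Var(2.0), Var(3.0)
z = x * y + x          # z = xy + x, so dz/dx = y + 1, dz/dy = x
z.backward()
print(x.grad, y.grad)  # 4.0 2.0
```

Frameworks generalize exactly this idea to tensors, with a topological traversal instead of recursion, which is what makes "define the forward computation, get gradients for free" possible.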
—————————
📌 Via: @cedeeplearning

#deeplearning #CNN #tutorial
#neuralnetworks #RNN #paper #nlp
⭕️ Top 6 Open Source Pre-trained Models for Text Classification you should use

1. XLNet
2. ERNIE
3. Text-to-Text Transfer Transformer (T5)
4. Binary-Partitioning Transformer (BPT)
5. Neural Attentive Bag-of-Entities Model for Text Classification (NABoE)
6. Rethinking Complex Neural Network Architectures for Document Classification
————————
📌 Via: @cedeeplearning


https://www.analyticsvidhya.com/blog/2020/03/6-pretrained-models-text-classification/

#classification #machinelearning
#datascience #model #training
#deeplearning #dataset #neuralnetworks
#NLP #math #AI
🔖 The Best NLP with Deep Learning Course is Free

Stanford's Natural Language Processing with Deep Learning is one of the most respected courses on the topic that you will find anywhere, and the course materials are freely available online.
———————
📌 Via: @cedeeplearning

https://www.kdnuggets.com/2020/05/best-nlp-deep-learning-course-free.html

#deeplearning #NLP
#neuralnetworks
#machinelearning
#free #AI #math