For Those Who Have a Passion For:
1. Artificial Intelligence
2. Machine Learning
3. Deep Learning
4. Data Science
5. Computer Vision
6. Image Processing
7. Cognitive Neuroscience
8. Research Papers and Related Courses
https://t.me/DeepLearningML
In statistical modeling, apparent violations of distributional assumptions (e.g., normality, Poisson) may result from heterogeneity we haven't accounted for.
In plain English, these violations may result from outliers, for example. Outliers can tell us a lot about our data (e.g., why most people are this way and not that way). They may also be hinting that something is amiss with our data.
Violations of distributional assumptions may also tell us we haven't thought about the problem enough! For example, we may be trying to force a one-size-fits-all model on our data. But different types of consumers may have different motivations, so a regression model that aims at the middle may miss most of them. People also transition in and out of different "states" because of marriage, promotions, childbirth, etc.
One reason predictive analytics often goes wrong is that analysts work in factory-like (or sweatshop-like) environments, forced to crank out as many models as possible, as quickly as possible.
Well-trained analysts given more time to think about the data and business problem can extract more value from the same data. So the problem may lie more with the execution than the idea itself.
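A minimal NumPy sketch of the point above: two hypothetical consumer segments respond to the same variable in opposite directions, so a pooled "one-size-fits-all" regression aims at the middle and misses both. All data here are simulated purely for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated heterogeneity: two latent consumer segments with opposite slopes.
n = 200
x = rng.uniform(0, 10, n)
segment = rng.integers(0, 2, n)                       # unobserved segment label
y = np.where(segment == 0, 2 * x, -2 * x) + rng.normal(0, 0.5, n)

# A one-size-fits-all regression "aims at the middle":
X = np.column_stack([np.ones(n), x])
pooled_slope = np.linalg.lstsq(X, y, rcond=None)[0][1]

# Fitting each segment separately recovers the true, opposite effects:
slopes = [np.linalg.lstsq(X[segment == s], y[segment == s], rcond=None)[0][1]
          for s in (0, 1)]

print(pooled_slope, slopes)  # pooled slope near 0; segment slopes near +2 and -2
```

In practice the segment label is latent, which is why mixture or latent-class models exist; the sketch just shows why the pooled fit can be misleading.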
✴️ @AI_Python_EN
#Detection ...? or #Classification ...? It's all there in #PyTorch.
A PyTorch library with state-of-the-art architectures, pre-trained models, and real-time updated results.
#deeplearning #ai #cnn
https://lnkd.in/eTdvfEp
✴️ @AI_Python_EN
GitHub: implus/PytorchInsight (a PyTorch lib with state-of-the-art architectures, pretrained models and real-time updated results)
Demystifying #AI
Wonderful infographic by Swami Chandrasekaran
#artificialintelligence #machinelearning #deeplearning
✴️ @AI_Python_EN
Amazing!! Deep Learning-based NLP techniques are going to revolutionize the way we write software. Here's Deep TabNine, a GPT-2 model trained on around 2 million files from GitHub. Details at
https://tabnine.com/blog/deep
#nlp
✴️ @AI_Python_EN
WikiMatrix: large-scale bitext extraction from Wikipedia:
1,620 language pairs in 85 languages; 135M parallel sentences.
Systematic evaluation on TED.
Paper:
https://arxiv.org/abs/1907.05791
Data:
https://github.com/facebookresearch/LASER/tree/master/tasks/WikiMatrix
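If the released files follow the tab-separated layout described in the paper (margin score, then source sentence, then target sentence per line; this layout, the `1.04` default, and the function name are assumptions here, so check the release README before relying on them), a quick quality filter might look like:

```python
import csv
import io

def filter_bitext(tsv_text, threshold=1.04):
    """Keep sentence pairs whose mining margin score meets the threshold.

    Assumed layout per line: margin_score <TAB> source <TAB> target.
    The threshold of 1.04 is the value suggested in the WikiMatrix paper.
    """
    pairs = []
    for row in csv.reader(io.StringIO(tsv_text), delimiter="\t"):
        score, src, tgt = float(row[0]), row[1], row[2]
        if score >= threshold:
            pairs.append((src, tgt))
    return pairs
```

Mined bitext quality varies a lot with the margin threshold, so filtering like this before training is usually worthwhile.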
✴️ @AI_Python_EN
If you're doing anything with NLP, this is a great place to start! A PyTorch library of state-of-the-art pretrained Transformer language models featuring BERT, GPT-2, XLNet, and more.
"pytorch-transformers", the library for Natural Language Processing!
https://github.com/huggingface/pytorch-transformers
✴️ @AI_Python_EN
It's out! I really think the network portrait divergence is a clever way to compare two networks; props to Jim Bagrow and Erik Bollt for introducing it.
Paper:
https://appliednetsci.springeropen.com/articles/10.1007/s41109-019-0156-x
Code:
https://github.com/bagrow/network-portrait-divergence
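For intuition, here is a simplified toy version of the idea, not the authors' reference implementation (use their repo above for real work): build each graph's "portrait" B, where B[k][l] counts the nodes that have exactly l nodes at shortest-path distance k, then compare the flattened, normalized portraits with Jensen-Shannon divergence. The actual paper weights the per-distance rows differently; this sketch simply flattens.

```python
import networkx as nx
import numpy as np
from scipy.spatial.distance import jensenshannon

def portrait(G):
    """B[k, l] = number of nodes with exactly l nodes at shortest-path distance k."""
    n = G.number_of_nodes()
    lengths = dict(nx.all_pairs_shortest_path_length(G))
    max_d = max(d for dists in lengths.values() for d in dists.values())
    B = np.zeros((max_d + 1, n + 1))
    for node, dists in lengths.items():
        counts = np.bincount(list(dists.values()), minlength=max_d + 1)
        for k, l in enumerate(counts):
            B[k, l] += 1
    return B

def portrait_divergence(G, H):
    """Toy divergence: Jensen-Shannon divergence between flattened portraits."""
    Bg, Bh = portrait(G), portrait(H)
    rows = max(Bg.shape[0], Bh.shape[0])
    cols = max(Bg.shape[1], Bh.shape[1])

    def pad_and_flatten(B):
        out = np.zeros((rows, cols))      # zero-pad so both portraits align
        out[:B.shape[0], :B.shape[1]] = B
        return out.ravel()

    p, q = pad_and_flatten(Bg), pad_and_flatten(Bh)
    p, q = p / p.sum(), q / q.sum()
    return jensenshannon(p, q, base=2) ** 2  # squared JS distance = JS divergence
```

Identical graphs give zero divergence, and structurally different graphs (say, a path versus a star) give a positive value, which is the basic behavior the full method refines.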
✴️ @AI_Python_EN
Remember the #BachDoodle? We're excited to release a paper on the behind-the-scenes design, the #ML, scaling it up, and a dataset of 21.6M melodies from around the world!
📜 http://arxiv.org/abs/1907.06637
✴️ @AI_Python_EN
This #AI magically removes
moving #objects from #videos
https://buff.ly/2XJtPf3
#ArtificialIntelligence #MachineLearning #DeepLearning
Fast Online Object Tracking and Segmentation: A Unifying Approach
✴️ @AI_Python_EN
Deep Learning for NLP: An Overview of Recent Trends
1) Word Embeddings
2) Character Embeddings
3) Convolutional Neural Network (CNN)
4) Recurrent Neural Network (RNN)
5) Attention Mechanism
6) Recursive Neural Network
and more trends, covered briefly!
https://medium.com/dair-ai/deep-learning-for-nlp-an-overview-of-recent-trends-d0d8f40a776d
Also a good starting point in Keras for beginners, using BiLSTM, CNN, ...
link: https://github.com/BrambleXu/nlp-beginner-guide-keras
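Of the trends listed above, the attention mechanism is the easiest to show concretely. Here is a minimal NumPy sketch of scaled dot-product attention, the variant popularized by Transformers (names and shapes are illustrative, not from any of the linked resources):

```python
import numpy as np

def softmax(x, axis=-1):
    """Numerically stable softmax along the given axis."""
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def scaled_dot_product_attention(Q, K, V):
    """weights = softmax(Q K^T / sqrt(d_k)); output = weights @ V.

    Q: (num_queries, d_k), K: (num_keys, d_k), V: (num_keys, d_v).
    Returns the attended output and the attention weights.
    """
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)   # similarity of each query to each key
    weights = softmax(scores)         # each query's weights over keys sum to 1
    return weights @ V, weights
```

Each output row is a weighted average of the value vectors, with weights determined by query-key similarity; that is the whole trick behind "attention".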
✴️ @AI_Python_EN
A gentle overview of deep learning and machine learning
Deep learning is a subfield of machine learning that uses deep neural networks (with many layers) and specific novel algorithms for preprocessing data and regularizing the model: word embeddings, dropout, and data augmentation. Deep learning takes inspiration from neuroscience, since neural networks are a model of the neuronal network within the brain. Unlike the biological brain, where any neuron can connect to any other under some physical constraints, artificial neural networks (ANNs) have a finite number of layers and connections and a fixed direction of data propagation. For a long time, ANNs were largely ignored by the research and business communities because of their computational cost.
Between 2006 and 2012, the research group led by Geoffrey Hinton at the University of Toronto finally ported ANN algorithms to parallel architectures. The main breakthroughs were the increased numbers of layers, neurons, and model parameters (even more than 10 million), allowing massive amounts of data to be pushed through the system to train it.
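One of the regularizers mentioned above, dropout, is simple enough to sketch in a few lines of NumPy. This is the inverted-dropout variant, written as an illustration rather than any particular library's implementation:

```python
import numpy as np

def dropout(x, p=0.5, training=True, rng=None):
    """Inverted dropout: zero each unit with probability p during training,
    and scale the survivors by 1/(1-p) so the expected activation is unchanged.
    At inference time (training=False) the input passes through untouched."""
    if not training or p == 0:
        return x
    rng = np.random.default_rng() if rng is None else rng
    mask = rng.random(x.shape) >= p   # True = unit survives
    return x * mask / (1 - p)
```

Because the survivors are rescaled, no correction is needed at inference time, which is why modern frameworks implement dropout this way.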
https://lnkd.in/dq87iFy
#neuralnetwork
#deeplearning
#machinelearning
✴️ @AI_Python_EN
Second Fashion-Gen challenge at #ICCV2019:
Fashion-Gen is a dataset of ~300K images paired with item descriptions. The task is image generation conditioned on the given text descriptions.
Deadline: Oct 15
Challenge:
https://fashion-gen.com
Paper:
https://arxiv.org/pdf/1806.08317
✴️ @AI_Python_EN
This series of free Python tutorials will help you get closer to your data science dream 💯🔥🔝
https://data-flair.training/blogs/python-tutorials-home/
✴️ @AI_Python_EN
List of the institutions with the most accepted papers at NeurIPS.
GitHub code for this graph:
https://lnkd.in/edwhZMf
Medium Link:
https://medium.com/@dcharrezt/neurips-2019-stats-c91346d31c8f
✴️ @AI_Python_EN
GitHub: dcharrezt/NeurIPS-2019-Stats (general stats about NeurIPS 2019)