AI, Python, Cognitive Neuroscience
This is our new paper at KDD, proposing a new dataset of US traffic records with several attributes. We also performed some data analysis to show its potential for further use in AI and data mining. Hope to see you at KDD '19!

https://www.youtube.com/watch?v=FhWO_uTf2Ho&t=2s

✴️ @AI_Python_EN
For those who have a passion for:

1. Artificial Intelligence
2. Machine Learning
3. Deep Learning
4. Data Science
5. Computer Vision
6. Image Processing
7. Cognitive Neuroscience
8. Research Papers and Related Courses

https://t.me/DeepLearningML
In statistical modeling, apparent violations of distributional assumptions (e.g., normality, Poisson) may result from heterogeneity we haven't accounted for.

In plain English, these violations may result from outliers, for example. Outliers can tell us a lot about our data (e.g., why most people are this way and not that way). They may also hint that something is amiss with our data.

Violations of distributional assumptions may also tell us we haven't thought about the problem enough! For example, we may be trying to force a one-size-fits-all model on our data. But different types of consumers may have different motivations, so a regression model that aims at the middle may miss most of them. People also transition in and out of different "states" because of marriage, promotions, childbirth, etc.
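As a minimal sketch of this idea (synthetic data, so the numbers are purely illustrative): pooling two latent segments makes a one-size-fits-all normal model fail a normality test, while a two-component mixture recovers the segment means.

import numpy as np
from scipy import stats
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(0)
# Two latent "states", e.g. low and high spenders
segment_a = rng.normal(loc=20, scale=3, size=500)
segment_b = rng.normal(loc=45, scale=5, size=500)
pooled = np.concatenate([segment_a, segment_b])

# A single normal model is rejected on the pooled data...
print("Shapiro-Wilk p-value:", stats.shapiro(pooled).pvalue)

# ...but a two-component mixture recovers the hidden segments
gm = GaussianMixture(n_components=2, random_state=0).fit(pooled.reshape(-1, 1))
print("Component means:", gm.means_.ravel())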

One reason predictive analytics often goes wrong is that analysts work in factory-like (or sweatshop-like) environments, forced to crank out as many models as possible as quickly as possible.

Well-trained analysts given more time to think about the data and business problem can extract more value from the same data. So the problem may lie more with the execution than the idea itself.

✴️ @AI_Python_EN
5 Commonly Used NLP Libraries in Python

✴️ @AI_Python_EN
Amazing!! Deep Learning-based NLP techniques are going to revolutionize the way we write software. Here's Deep TabNine, a GPT-2 model trained on around 2 million files from GitHub. Details at
https://tabnine.com/blog/deep
#nlp

✴️ @AI_Python_EN
WikiMatrix: large-scale bitext extraction from Wikipedia:
1,620 language pairs in 85 languages, 135M parallel sentences,
systematic evaluation on TED.
Paper:
https://arxiv.org/abs/1907.05791
Data:
https://github.com/facebookresearch/LASER/tree/master/tasks/WikiMatrix
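As a rough illustration, here is a hedged sketch of filtering one of the released bitext files, assuming the WikiMatrix TSV layout of (margin score, source sentence, target sentence); the column order and the 1.04 threshold are assumptions taken from the paper's description, so check the release README.

import gzip

def read_bitext(path, threshold=1.04):  # threshold value is illustrative
    # Yield (source, target) pairs whose margin score clears the threshold;
    # raising it trades recall for precision.
    with gzip.open(path, "rt", encoding="utf-8") as f:
        for line in f:
            score, src, tgt = line.rstrip("\n").split("\t")
            if float(score) >= threshold:
                yield src, tgt

# e.g. pairs = list(read_bitext("WikiMatrix.en-fr.tsv.gz"))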

✴️ @AI_Python_EN
"pytorch-transformers", the library for Natural Language Processing: a PyTorch library of state-of-the-art pretrained Transformer language models featuring BERT, GPT-2, XLNet, and more. If you're doing anything with NLP, this is a great place to start!
https://github.com/huggingface/pytorch-transformers
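For a flavor of the API, here is a minimal sketch of pulling contextual embeddings from pretrained BERT with pytorch-transformers (the model name and example sentence are just illustrative):

import torch
from pytorch_transformers import BertModel, BertTokenizer

tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
model = BertModel.from_pretrained("bert-base-uncased")
model.eval()  # inference mode: disables dropout

# Tokenize a sentence and run it through BERT
input_ids = torch.tensor([tokenizer.encode("Hello, NLP world!")])
with torch.no_grad():
    last_hidden_state = model(input_ids)[0]  # (batch, seq_len, hidden_size)
print(last_hidden_state.shape)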

✴️ @AI_Python_EN
It's out! I really think the network portrait divergence is a clever way to compare two networks; props to Jim Bagrow & Erik Bollt for introducing it.

Paper:
https://appliednetsci.springeropen.com/articles/10.1007/s41109-019-0156-x

Code:
https://github.com/bagrow/network-portrait-divergence

✴️ @AI_Python_EN
Remember the #BachDoodle? We're excited to release a paper on the behind-the-scenes design and #ML, scaling it up, and a dataset of 21.6M melodies from around the world!
📜 http://arxiv.org/abs/1907.06637

✴️ @AI_Python_EN
This #AI magically removes moving #objects from #videos:
Fast Online Object Tracking and Segmentation: A Unifying Approach
https://buff.ly/2XJtPf3
#ArtificialIntelligence #MachineLearning #DeepLearning
✴️ @AI_Python_EN
Deep Learning for NLP: An Overview of Recent Trends
1) Word Embeddings
2) Character Embeddings
3) Convolutional Neural Network (CNN)
4) Recurrent Neural Network (RNN)
5) Attention Mechanism
6) Recursive Neural Network
and more recent trends!
https://medium.com/dair-ai/deep-learning-for-nlp-an-overview-of-recent-trends-d0d8f40a776d

Also a good starting point in Keras for beginners, using BiLSTM, CNN, ...
link —> https://github.com/BrambleXu/nlp-beginner-guide-keras
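To give a feel for that beginner guide, here is a minimal Keras sketch of a BiLSTM text classifier (the vocabulary size, dimensions, and binary task are hypothetical placeholders):

from tensorflow.keras import layers, models

vocab_size, embed_dim = 10000, 128  # hypothetical vocabulary and embedding size

model = models.Sequential()
model.add(layers.Embedding(vocab_size, embed_dim))  # word embeddings
model.add(layers.Bidirectional(layers.LSTM(64)))    # BiLSTM encoder
model.add(layers.Dense(1, activation="sigmoid"))    # binary classification head
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])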
✴️ @AI_Python_EN
A gentle overview of Deep Learning and Machine Learning

Deep Learning is a subarea of Machine Learning that makes use of deep neural networks (with many layers) and specific novel algorithms for pre-processing data and regularizing the model: word embeddings, dropout, data augmentation. Deep Learning takes inspiration from neuroscience, since neural networks are a model of the neuronal networks within the brain. Unlike the biological brain, where any neuron can connect to any other under some physical constraints, Artificial Neural Networks (ANNs) have a finite number of layers and connections, and a fixed direction of data propagation. For a long time, ANNs were largely ignored by the research and business communities; the problem was their computational cost.
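As one concrete example of those regularizers, here is a minimal numpy sketch of (inverted) dropout; the function and shapes are purely illustrative:

import numpy as np

def dropout(activations, rate=0.5, training=True, rng=np.random.default_rng(0)):
    # At train time, zero a random subset of units and rescale the rest so
    # the expected activation matches test time (inverted dropout).
    if not training or rate == 0.0:
        return activations
    keep = rng.random(activations.shape) >= rate  # Bernoulli keep-mask
    return activations * keep / (1.0 - rate)

x = np.ones((2, 4))
print(dropout(x))  # roughly half the entries zeroed, the rest scaled to 2.0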

Between 2006 and 2012, the research group led by Geoffrey Hinton at the University of Toronto finally ported ANN algorithms onto parallel architectures. The main breakthroughs were the increased numbers of layers, neurons, and model parameters in general (even more than 10 million), allowing massive amounts of data to be pushed through the system during training.

https://lnkd.in/dq87iFy

#neuralnetwork
#deeplearning
#machinelearning

✴️ @AI_Python_EN
Second Fashion-Gen challenge at #ICCV2019:

Fashion-Gen is a dataset of ~300K images paired with item descriptions. The task is image generation conditioned on the given text descriptions.
Deadline: Oct 15
Challenge:
https://fashion-gen.com
Paper:
https://arxiv.org/pdf/1806.08317

✴️ @AI_Python_EN