It's out! I really think the network portrait divergence is a clever way to compare two networks; props to Jim Bagrow & Bollt for introducing it.
Paper:
https://appliednetsci.springeropen.com/articles/10.1007/s41109-019-0156-x
Code:
https://github.com/bagrow/network-portrait-divergence
✴️ @AI_Python_EN
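For intuition, here is a toy pure-Python sketch of the idea: a network's portrait B counts, for each shortest-path distance l, how many nodes see exactly k other nodes at that distance, and two portraits are compared with a Jensen–Shannon divergence. This simplifies the paper's exact weighting scheme; use Bagrow's repo above for the reference implementation.

```python
from collections import deque, defaultdict
import math

def portrait(adj):
    """B[l][k] = number of nodes with exactly k nodes at shortest-path
    distance l, computed by BFS from every node (adj: dict node -> neighbors)."""
    B = defaultdict(lambda: defaultdict(int))
    for src in adj:
        dist = {src: 0}
        q = deque([src])
        while q:                      # plain BFS
            u = q.popleft()
            for v in adj[u]:
                if v not in dist:
                    dist[v] = dist[u] + 1
                    q.append(v)
        counts = defaultdict(int)     # how many nodes at each distance from src
        for node, d in dist.items():
            if node != src:
                counts[d] += 1
        for l, k in counts.items():
            B[l][k] += 1
    return B

def js_divergence(p, q):
    """Jensen-Shannon divergence (base 2, so bounded by 1) of two prob. dicts."""
    m = {k: 0.5 * (p.get(k, 0.0) + q.get(k, 0.0)) for k in set(p) | set(q)}
    def kl(a):
        return sum(pa * math.log2(pa / m[k]) for k, pa in a.items() if pa > 0)
    return 0.5 * kl(p) + 0.5 * kl(q)

def portrait_divergence(adj_a, adj_b):
    """Compare two graphs via their flattened (l, k) portrait distributions.
    (The paper weights entries by k; this sketch uses raw counts.)"""
    def flatten(B):
        total = sum(c for row in B.values() for c in row.values())
        return {(l, k): c / total for l, row in B.items() for k, c in row.items()}
    return js_divergence(flatten(portrait(adj_a)), flatten(portrait(adj_b)))
```

Identical graphs give divergence 0; structurally different graphs give a value in (0, 1].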
Remember the #BachDoodle? We're excited to release a paper on the behind-the-scenes design, the #ML, scaling it up, and a dataset of 21.6M melodies from around the world!
http://arxiv.org/abs/1907.06637
✴️ @AI_Python_EN
This #AI magically removes
moving #objects from #videos
https://buff.ly/2XJtPf3
#ArtificialIntelligence #MachineLearning #DeepLearning
Fast Online Object Tracking and Segmentation: A Unifying Approach
✴️ @AI_Python_EN
For Those With a Passion For:
1. Artificial Intelligence
2. Machine Learning
3. Deep Learning
4. Data Science
5. Computer vision
6. Image Processing
7. Cognitive Neuroscience
8. Research Papers and Related Courses
https://t.me/DeepLearningML
Deep Learning for NLP: An Overview of Recent Trends
1) Word Embeddings
2) Character Embeddings
3) Convolutional Neural Network (CNN)
4) Recurrent Neural Network (RNN)
5) Attention Mechanism
6) Recursive Neural Network
and more trends, briefly covered!
https://medium.com/dair-ai/deep-learning-for-nlp-an-overview-of-recent-trends-d0d8f40a776d
also a good starting point in Keras for beginners, using BiLSTM, CNN, ...
link → https://github.com/BrambleXu/nlp-beginner-guide-keras
✴️ @AI_Python_EN
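Since the overview covers the attention mechanism, here is a minimal pure-Python sketch of scaled dot-product attention for a single query vector. This is a simplified illustration of the idea, not any library's API.

```python
import math

def softmax(xs):
    """Numerically stable softmax over a list of scores."""
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    s = sum(exps)
    return [e / s for e in exps]

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def attention(query, keys, values):
    """Scaled dot-product attention for one query:
    weights_i = softmax(q . k_i / sqrt(d)); output = sum_i weights_i * v_i."""
    d = len(query)
    scores = [dot(query, k) / math.sqrt(d) for k in keys]
    weights = softmax(scores)
    out = [sum(w * v[j] for w, v in zip(weights, values))
           for j in range(len(values[0]))]
    return out, weights
```

The query attends most strongly to the key it is most aligned with, and the output is the correspondingly weighted mix of the values.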
Medium
Deep Learning for NLP: An Overview of Recent Trends
In a timely new paper, Young and colleagues discuss some of the recent trends in deep learning based natural language processing (NLP)…
A gentle overview of Deep Learning and Machine Learning
Deep Learning is a subarea of Machine Learning that makes use of deep neural networks (with many layers) and specific novel algorithms for pre-processing data and regularizing the model: word embeddings, dropout, data augmentation. Deep Learning takes inspiration from neuroscience, since neural networks are a model of the neuronal network within the brain. Unlike the biological brain, where any neuron can connect to any other under some physical constraints, artificial neural networks (ANNs) have a finite number of layers and connections and a fixed direction of data propagation. For a long time, ANNs were largely ignored by the research and business communities; the problem was their computational cost.
Between 2006 and 2012, the research group led by Geoffrey Hinton at the University of Toronto finally ported ANN algorithms to parallel architectures. The main breakthrough is the greatly increased number of layers, neurons, and model parameters (even more than 10 million), which allows massive amounts of data to be pushed through the system to train it.
https://lnkd.in/dq87iFy
#neuralnetwork
#deeplearning
#machinelearning
✴️ @AI_Python_EN
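Of the regularization tricks named above, dropout is the easiest to sketch. Here is a minimal pure-Python illustration of the standard "inverted dropout" formulation (an illustration only, not a framework implementation):

```python
import random

def dropout(activations, p, training=True, rng=random):
    """Inverted dropout: during training, zero each activation with
    probability p and scale survivors by 1/(1-p), so the expected value
    of each unit is unchanged. At inference time, pass values through."""
    if not training or p == 0.0:
        return list(activations)
    keep = 1.0 - p
    return [a / keep if rng.random() < keep else 0.0 for a in activations]
```

With p = 0.5, each surviving activation is doubled, so the mean over many units stays close to the original value, which is exactly why no rescaling is needed at inference.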
Second Fashion-Gen challenge at #ICCV2019:
Fashion-Gen is a dataset of ~300K images paired with item descriptions. The task is image generation conditioned on the given text descriptions.
Deadline: Oct 15
Challenge:
https://fashion-gen.com
Paper:
https://arxiv.org/pdf/1806.08317
✴️ @AI_Python_EN
This series of FREE PYTHON TUTORIALS will help you get closer to your Data Science dream!
https://data-flair.training/blogs/python-tutorials-home/
✴️ @AI_Python_EN
DataFlair
Python Tutorials for Beginners β Learn Python Programming - DataFlair
Python Tutorial for Beginners - Learn Python with 370+ Python tutorials, real-time practicals, live projects, quizzes and free courses.
List of institutions with the most accepted papers at NeurIPS.
GitHub code for this graph:
https://lnkd.in/edwhZMf
Medium Link:
https://medium.com/@dcharrezt/neurips-2019-stats-c91346d31c8f
✴️ @AI_Python_EN
GitHub
dcharrezt/NeurIPS-2019-Stats
General stats about NeurIPS 2019. Contribute to dcharrezt/NeurIPS-2019-Stats development by creating an account on GitHub.
Introducing Neural Structured Learning in TensorFlow
https://medium.com/tensorflow/introducing-neural-structured-learning-in-tensorflow-5a802efd7afd
Neural Structured Learning: Training with Structured Signals
Article: https://www.tensorflow.org/neural_structured_learning
Code: https://github.com/tensorflow/neural-structured-learning
✴️ @AI_Python_EN
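The core idea of "training with structured signals" can be sketched as a graph-regularized loss: push embeddings of neighboring examples together on top of the usual supervised loss. This is a conceptual sketch only, not the TensorFlow NSL API; `alpha` and the squared-distance penalty are illustrative assumptions.

```python
def graph_regularized_loss(sup_losses, embeddings, edges, alpha=0.1):
    """total = sum of per-example supervised losses
             + alpha * sum over graph edges (i, j) of ||emb_i - emb_j||^2.
    sup_losses: list of floats; embeddings: dict id -> vector; edges: (i, j) pairs."""
    sup = sum(sup_losses)
    reg = sum(sum((embeddings[i][d] - embeddings[j][d]) ** 2
                  for d in range(len(embeddings[i])))
              for i, j in edges)
    return sup + alpha * reg
```

When neighbors already agree the penalty vanishes and the loss reduces to the supervised term; disagreeing neighbors add a cost proportional to how far apart their embeddings are.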
Medium
Introducing Neural Structured Learning in TensorFlow
Posted by Da-Cheng Juan (Senior Software Engineer) and Sujith Ravi (Senior Staff Research Scientist)
5 Reasons to Learn Probability for Machine Learning
https://machinelearningmastery.com/why-learn-probability-for-machine-learning/
MachineLearningMastery.com
5 Reasons to Learn Probability for Machine Learning - MachineLearningMastery.com
Probability is a field of mathematics that quantifies uncertainty. It is undeniably a pillar of the field of machine learning, and many recommend it as a prerequisite subject to study prior to getting started. This is misleading advice, as probability makes…
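A quick flavor of why probability matters in ML: Bayes' rule, the machinery behind updating a model's belief when evidence arrives. The diagnostic-test numbers below are made up purely for illustration.

```python
def bayes_posterior(prior, sensitivity, specificity):
    """P(disease | positive test) via Bayes' rule:
    posterior = P(+|D) P(D) / [P(+|D) P(D) + P(+|~D) P(~D)],
    where P(+|D) = sensitivity and P(+|~D) = 1 - specificity."""
    p_pos_given_d = sensitivity
    p_pos_given_not_d = 1.0 - specificity
    num = p_pos_given_d * prior
    den = num + p_pos_given_not_d * (1.0 - prior)
    return num / den
```

With a 1% prior, 90% sensitivity, and 95% specificity, the posterior is only about 15%: most positives come from the large healthy group, a base-rate effect that a single accuracy-style number hides.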
How Humans Judge Machines? A Brain Bar presentation on my forthcoming book on this topic, scheduled for spring 2020.
https://www.youtube.com/watch?v=NQQ2CEqBe1Y
YouTube
Can You Judge Artificial Intelligence? | Cesar A. Hidalgo at Brain Bar
SUBSCRIBE to our channel for more brainy bits: https://goo.gl/mLdVrF AIs are diagnosing cancer, driving cars, acting as police agents, and it seems that judg...
Video Interpolation and Prediction with Unsupervised Landmarks
Shih et al.: https://lnkd.in/d2A3juW
#ArtificialIntelligence #DeepLearning
#MachineLearning
✴️ @AI_Python_EN
Basics of Python Programming
——————————————
a. Lists, Tuples, Dictionaries, Conditionals, Loops, etc.
https://lnkd.in/gWRbc3J
b. Data Structures & Algorithms:
https://lnkd.in/gYKnJWN
c. NumPy Arrays:
https://lnkd.in/geeFePh
d. Regex:
https://lnkd.in/gzUahNV
Practice Coding Challenges
—————————————
a. HackerRank:
https://lnkd.in/gEufBUu
b. Codecademy:
https://lnkd.in/gGQ7cuv
c. LeetCode:
https://leetcode.com/
Data Manipulation
—————————
a. Pandas:
https://lnkd.in/gxSgfuQ
b. Pandas Cheatsheet:
https://lnkd.in/gfAdcpw
c. SQLAlchemy:
https://lnkd.in/gjvbm7h
Data Visualization
————————
a. Matplotlib:
https://lnkd.in/g_3fx_6
b. Seaborn:
https://lnkd.in/gih7hqz
c. Plotly:
https://lnkd.in/gBYBMXc
d. Python Graph Gallery:
https://lnkd.in/gdGe-ef
Machine Learning / Deep Learning
————————————————
a. Scikit-Learn Tutorial:
https://lnkd.in/gT5nNwS
b. Deep Learning Tutorial:
https://lnkd.in/gHKWM5m
c. Kaggle Kernels:
https://lnkd.in/e_VcNpk
d. Kaggle Competitions:
https://lnkd.in/epb9c8N
Programiz
Learn Python Programming
Python is a powerful general-purpose programming language. Our Python tutorial will guide you to learn Python one step at a time with the help of examples.
Has Area Under the ROC Curve (AUC-ROC) become the Data Science & AI/ML community's P-value?
Just returned from day 1 of the Intelligent Health AI conference, and while there were some great speakers & talks, one thing stood out. Of the multiple talks reporting machine learning model performance, all except one reported AUC-ROC as the only metric, even for unbalanced datasets. It appears that the AUC-ROC metric is being misused much as the P-value has been misused & misinterpreted.
There is more to model evaluation than a single number. In addition to AUC-ROC, we have the Precision-Recall (PR) curve, Sensitivity (Recall), Specificity, F1-score, Positive/Negative Predictive Values, Matthews Correlation Coefficient, Calibration, and many other metrics. The graphic below presents a good summary of the various model performance / evaluation metrics (see articles & book for more details):
Regression Metrics:
https://lnkd.in/eRWvRVc
Classification Metrics:
https://lnkd.in/dpYnvGh
Evaluating Machine Learning Models (open-access book):
https://lnkd.in/dHcfZdP
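Most of the metrics listed above fall straight out of a single confusion matrix. A quick pure-Python sketch on a made-up imbalanced example (the counts are illustrative assumptions, not data from any talk):

```python
import math

def classification_metrics(tp, fp, fn, tn):
    """Standard confusion-matrix metrics: precision (positive predictive
    value), recall (sensitivity), specificity, F1, and Matthews correlation."""
    precision = tp / (tp + fp)
    recall = tp / (tp + fn)
    specificity = tn / (tn + fp)
    f1 = 2 * precision * recall / (precision + recall)
    mcc_den = math.sqrt((tp + fp) * (tp + fn) * (tn + fp) * (tn + fn))
    mcc = (tp * tn - fp * fn) / mcc_den
    return {"precision": precision, "recall": recall,
            "specificity": specificity, "f1": f1, "mcc": mcc}
```

On a set with 990 negatives and 10 positives, a model with tp=5, fp=5, fn=5, tn=985 has 99% accuracy, yet precision, recall, and F1 are all 0.5 and MCC is about 0.49: exactly the kind of story a single headline number hides.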
Medium
Choosing the Right Metric for Evaluating Machine Learning Models — Part 1
First part of the series, focusing on Regression Metrics
Meta-Learning with Implicit Gradients
Aravind Rajeswaran, Chelsea Finn, Sham Kakade, Sergey Levine: https://lnkd.in/g9H6mZ2
#MachineLearning #ArtificialIntelligence #Optimization #Control #MetaLearning
arXiv.org
Meta-Learning with Implicit Gradients
A core capability of intelligent systems is the ability to quickly learn new
tasks by drawing on prior experience. Gradient (or optimization) based
meta-learning has recently emerged as an...