Data-Efficient Image Recognition with Contrastive Predictive Coding
Hénaff et al.: https://lnkd.in/eMDhrU8
#ArtificialIntelligence #ComputerVision #MachineLearning
✴️ @AI_Python_EN
Full Stack Deep Learning Bootcamp
(Most of) Lectures of Day 1:
https://fullstackdeeplearning.com/march2019
#ArtificialIntelligence #DeepLearning #MachineLearning
✴️ @AI_Python_EN
Probabilistic Graphical Models
Spring 2019 • Carnegie Mellon University
Lisa Lee
https://sailinglab.github.io/pgm-spring-2019/lectures/
✴️ @AI_Python_EN
Deep Learning lecture
The full deck of (600+) slides, by Gilles Louppe:
https://glouppe.github.io/info8010-deep-learning/pdf/lec-all.pdf
#ArtificialIntelligence #DeepLearning #MachineLearning
✴️ @AI_Python_EN
SOD is an embedded, modern, cross-platform computer vision and machine learning library that exposes a set of APIs for deep learning and advanced media analysis and processing, including real-time multi-class object detection and model training on IoT devices and embedded systems with limited computational resources.
SOD - An Embedded #ComputerVision & #MachineLearning Library
https://sod.pixlab.io/
api https://sod.pixlab.io/api.html
samples https://sod.pixlab.io/samples.html
guide https://sod.pixlab.io/intro.html
github https://github.com/symisc/sod
✴️ @AI_Python_EN
Moving Camera, Moving People: A #DeepLearning Approach to Depth Prediction
http://ai.googleblog.com/2019/05/moving-camera-moving-people-deep.html
✴️ @AI_Python_EN
"Storytelling with Data: A Data Visualization Guide for Business Professionals" by Cole Nussbaumer Knaflic
I've just come across this first (2015) edition, and there now may be a second edition out. Here's the link to the PDF:
https://lnkd.in/fJSN7ci
✴️ @AI_Python_EN
Estimators, Loss Functions, Optimizers — Core of ML Algorithms
https://towardsdatascience.com/estimators-loss-functions-optimizers-core-of-ml-algorithms-d603f6b0161a
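A minimal sketch of the three concepts in the headline, in plain Python (all names and data here are illustrative, not from the article): the estimator is a linear model, the loss is mean squared error, and the optimizer is plain gradient descent.

```python
def predict(w, b, x):
    """Estimator: a linear model y_hat = w*x + b."""
    return w * x + b

def mse(w, b, xs, ys):
    """Loss function: mean squared error over the dataset."""
    return sum((predict(w, b, x) - y) ** 2 for x, y in zip(xs, ys)) / len(xs)

def fit(xs, ys, lr=0.1, steps=5000):
    """Optimizer: gradient descent on the MSE loss."""
    w, b, n = 0.0, 0.0, len(xs)
    for _ in range(steps):
        grad_w = sum(2 * (predict(w, b, x) - y) * x for x, y in zip(xs, ys)) / n
        grad_b = sum(2 * (predict(w, b, x) - y) for x, y in zip(xs, ys)) / n
        w -= lr * grad_w
        b -= lr * grad_b
    return w, b

# Synthetic data from y = 2x + 1; the optimizer should recover w ≈ 2, b ≈ 1.
xs = [i / 10 for i in range(11)]
ys = [2 * x + 1 for x in xs]
w, b = fit(xs, ys)
```

Swapping any one of the three pieces (a different estimator, loss, or optimizer) changes the algorithm, which is the article's framing.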
✴️ @AI_Python_EN
Bringing human-like reasoning to driverless car navigation
Autonomous control system “learns” to use simple maps and image data to navigate new, complex routes.
#COMPUTERVISION
http://news.mit.edu/2019/human-reasoning-ai-driverless-car-navigation-0523
✴️ @AI_Python_EN
torchvision 0.3.0: segmentation and detection models, new datasets, C++/CUDA operators. Blog post with tutorial link and release notes: https://pytorch.org/blog/torchvision03/ Install commands have changed; use the selector on https://pytorch.org
NEW VIDEO: Learn how to write better, more efficient #pandas code 🐼 📺 https://www.youtube.com/watch?v=dPwLlJkSHLo Download the dataset to follow along with the exercises: 👩‍💻 https://github.com/justmarkham/pycon-2019-tutorial Become more fluent at using pandas to answer your own #DataScience questions! #Python
Revisiting Graph Neural Networks: All We Have is Low-Pass Filters. (arXiv:1905.09550v1 [stat.ML]) http://bit.ly/2JyBy8
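The paper's core observation can be sketched numerically (toy 4-node path graph; my own illustration, not the authors' code): the GCN propagation matrix S = D^{-1/2}(A + I)D^{-1/2} acts as a low-pass filter, so applying it smooths node signals.

```python
import numpy as np

# 4-node path graph: 0 - 1 - 2 - 3
A = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)
A_hat = A + np.eye(4)                  # add self-loops
d = A_hat.sum(axis=1)                  # augmented degrees
S = A_hat / np.sqrt(np.outer(d, d))    # D^{-1/2} (A + I) D^{-1/2}

x = np.array([1.0, -1.0, 1.0, -1.0])   # maximally "high-frequency" signal
smoothed = S @ (S @ x)                 # two propagation steps
# The alternating (high-frequency) component shrinks sharply:
# var(smoothed) is far below var(x).
```

That shrinkage of high-frequency components is exactly the low-pass behavior the title refers to.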
✴️ @AI_Python_EN
BERT Rediscovers the Classical NLP Pipeline by I. Tenney, D. Das & E. Pavlick is 4 pages of great insights: https://arxiv.org/abs/1905.05950 Such a constant source of fascinating papers from Ellie Pavlick & her collaborators! Here's BERT correcting its prediction along the model depth 🤯
Awesome research from the Google #RL team: learning dynamics from video.
https://planetrl.github.io
✴️ @AI_Python_EN
The other day, someone asked me how many kinds of regression there were.
OLS and binary logistic regression are very popular. There are also numerous methods for count data and ordinal data, as well as multinomial models.
There are many kinds of time-to-event models, and nonparametric forms of regression.
We also have polynomial, spline and quantile regression. Some consider ridge, lasso and elastic net as separate types of regression.
IV regression, SUR, Tobit, Heckit, and hurdle regression are other widely used kinds.
Neural Nets, arguably, are a form of regression. There are also many ways to estimate the same regression model with likelihood and Bayesian approaches.
In addition, there are methods designed for time-series analysis and longitudinal modeling, as well as hierarchical, mixed and mixture models.
I've only scratched the surface, and new methods are continually being developed.
I should note that I've used all the methods mentioned above. Each has practical value for hard hat statisticians like me and is not merely of academic interest.
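To make one of these distinctions concrete: OLS and ridge differ only in the estimator's penalty. For a one-variable, no-intercept model both have closed forms, and the sketch below (toy data, my own illustration) shows how the ridge penalty shrinks the coefficient.

```python
# OLS vs ridge for y ≈ w*x (no intercept), via their closed-form
# solutions. Ridge adds an L2 penalty lam*w^2, which shrinks the
# estimated slope toward zero. Toy data; purely illustrative.

def ols_slope(xs, ys):
    return sum(x * y for x, y in zip(xs, ys)) / sum(x * x for x in xs)

def ridge_slope(xs, ys, lam):
    return sum(x * y for x, y in zip(xs, ys)) / (sum(x * x for x in xs) + lam)

xs = [1.0, 2.0, 3.0, 4.0]
ys = [2.1, 3.9, 6.2, 7.8]          # roughly y = 2x
w_ols = ols_slope(xs, ys)          # close to 2
w_ridge = ridge_slope(xs, ys, lam=5.0)  # strictly smaller in magnitude
```

The same pattern (shared model, different penalty or likelihood) is what makes lasso, elastic net, and many of the other variants above feel like "separate types" while sharing one core.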
✴️ @AI_Python_EN
Understanding Neural Networks via Feature Visualization: A survey
Nguyen et al.: https://lnkd.in/eRZMuTS
#neuralnetworks #generatornetwork #generativemodels
✴️ @AI_Python_EN
Very interesting paper on machine learning algorithms. It compares polynomial regression with neural networks on several well-known datasets (including MNIST). The results are worth a look.
Other datasets tested: (1) census data of engineers' salaries in Silicon Valley; (2) million song data; (3) concrete strength data; (4) letter recognition data; (5) New York City taxi data; (6) forest cover type data; (7) Harvard/MIT MOOC course completion data; (8) amateur athletic competitions; (9) NCI cancer genomics; (10) MNIST image classification; and (11) the United States 2016 Presidential Election.
I haven't reproduced the paper myself, but I am very tempted to.
Link here: https://lnkd.in/fd-VNtk
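Not a reproduction of the paper, just the flavor of its polynomial-regression half (toy data, my own illustration): on data that is genuinely polynomial, least-squares polynomial regression recovers the generating coefficients exactly, which is the baseline the paper pits against neural networks.

```python
import numpy as np

# Noiseless quadratic data: y = 3x^2 - 2x + 1 on 20 points in [-1, 1].
x = np.linspace(-1.0, 1.0, 20)
y = 3 * x**2 - 2 * x + 1

# Degree-2 least-squares polynomial fit; coefficients come back
# highest degree first, so we expect approximately [3, -2, 1].
coef = np.polyfit(x, y, deg=2)
```

On real, noisy datasets the comparison is of course far less clean, which is what makes the paper's head-to-head results interesting.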
#machinelearning #petroleumengineering #artificialintelligence #data #algorithms #neuralnetworks #predictiveanalytics
✴️ @AI_Python_EN
DeepAMD: Detect Early Age-Related Macular Degeneration.
#BigData #Analytics #DataScience #AI #MachineLearning #DeepLearning #IoT #IIoT #PyTorch #Python #CloudComputing #DataScientist #Linux
https://link.springer.com/chapter/10.1007%2F978-3-030-20873-8_40
✴️ @AI_Python_EN
A collection of research papers on decision trees, classification trees, and regression trees with implementations:
https://github.com/benedekrozemberczki/awesome-decision-tree-papers
#BigData #MachineLearning #AI #DataScience #Algorithms #NLProc #Coding #DataScientists
✴️ @AI_Python_EN
François Chollet
This is how you implement a network in Chainer. Chainer, the original eager-first #deeplearning framework, has had this API since launch, in mid-2015. When PyTorch got started, it followed the Chainer template (in fact, the prototype of PyTorch was literally a fork of Chainer).
Nearly every day, I am getting ignorant messages saying, "PyTorch is an original innovation that TensorFlow/Keras copied". This is incorrect. Subclassing is a fairly obvious way to do things in Python, and Chainer had this API first. Many others followed.
I had been looking at adding a Model subclassing API to Keras as early as late 2015 (before the Functional API even existed, and over a year before being aware of PyTorch), inspired by Chainer. Our first discussions about adding an eager execution mode also predate PyTorch.
By the time #PyTorch came out, I had been looking at its API (which is exactly the Chainer API) for 1.5 years (since the release of Chainer). It wasn't exactly a shock. There was nothing we didn't already know.
To be clear, it's a good thing that API patterns and technical innovations cross-pollinate among deep learning frameworks. The #Keras API itself has had a pretty big influence on libraries that came after. It's completely fine, and it all benefits end users.
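The subclassing pattern the thread is about (layers defined as attributes in `__init__`, computation composed imperatively in `forward`) does not depend on any one framework. A framework-free caricature, with all class and layer names made up for illustration:

```python
class Module:
    """Base class: calling the module runs its forward() method."""
    def __call__(self, x):
        return self.forward(x)

class Linear(Module):
    """A 1-D affine layer, y = w*x + b (scalar weights for brevity)."""
    def __init__(self, w, b):
        self.w, self.b = w, b
    def forward(self, x):
        return self.w * x + self.b

class TinyNet(Module):
    """Layers declared in __init__, composed imperatively in forward()."""
    def __init__(self):
        self.l1 = Linear(2.0, 1.0)
        self.l2 = Linear(0.5, 0.0)
    def forward(self, x):
        h = max(0.0, self.l1(x))   # ReLU nonlinearity
        return self.l2(h)

net = TinyNet()
out = net(3.0)   # l1: 2*3 + 1 = 7 → ReLU → l2: 0.5*7 = 3.5
```

Chainer, PyTorch, and Keras's Model-subclassing API all add autograd, parameter tracking, and serialization on top of this same shape, which is the cross-pollination the post describes.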
✴️ @AI_Python_EN