Moving Camera, Moving People: A #DeepLearning Approach to Depth Prediction
http://ai.googleblog.com/2019/05/moving-camera-moving-people-deep.html
✴️ @AI_Python_EN
"Storytelling with Data: A Data Visualization Guide for Business Professionals" by Cole Nussbaumer Knaflic
I've just come across this first (2015) edition; a second edition may now be out. Here's the link to the PDF:
https://lnkd.in/fJSN7ci
Estimators, Loss Functions, Optimizers —Core of ML Algorithms
https://towardsdatascience.com/estimators-loss-functions-optimizers-core-of-ml-algorithms-d603f6b0161a
Bringing human-like reasoning to driverless car navigation
Autonomous control system “learns” to use simple maps and image data to navigate new, complex routes.
#COMPUTERVISION
http://news.mit.edu/2019/human-reasoning-ai-driverless-car-navigation-0523
torchvision 0.3.0: segmentation and detection models, new datasets, C++/CUDA operators. Blog post with link to tutorial and release notes: https://pytorch.org/blog/torchvision03/ Install commands have changed; use the selector on https://pytorch.org
NEW VIDEO: Learn how to write better, more efficient #pandas code 🐼 📺 https://www.youtube.com/watch?v=dPwLlJkSHLo Download the dataset to follow along with the exercises: 👩💻 https://github.com/justmarkham/pycon-2019-tutorial Become more fluent at using pandas to answer your own #DataScience questions! #Python
Revisiting Graph Neural Networks: All We Have is Low-Pass Filters. (arXiv:1905.09550v1 [stat.ML]) http://bit.ly/2JyBy8
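The paper's thesis (GCN-style feature propagation acts as a low-pass filter on the graph) can be illustrated in a few lines of NumPy. This is only a toy sketch using the standard symmetric-normalized propagation matrix; the graph and signal here are made up for illustration, not taken from the paper.

```python
import numpy as np

# Toy undirected graph: a 4-node path 0-1-2-3
A = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)

A_hat = A + np.eye(4)                          # add self-loops
d_inv_sqrt = np.diag(1.0 / np.sqrt(A_hat.sum(axis=1)))
P = d_inv_sqrt @ A_hat @ d_inv_sqrt            # GCN propagation matrix

x = np.array([1.0, -1.0, 1.0, -1.0])           # maximally oscillating (high-frequency) signal
x_smooth = P @ x                               # one propagation step damps the oscillation
```

Applying `P` repeatedly shrinks the high-frequency component further, which is exactly the low-pass behavior the paper analyzes.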
BERT Rediscovers the Classical NLP Pipeline by I. Tenney, D. Das & E. Pavlick is 4 pages of great insights: https://arxiv.org/abs/1905.05950 Such a constant source of fascinating papers from Ellie Pavlick & her collaborators! Here's BERT correcting its predictions along the model depth 🤯
Awesome research from the Google #RL team: learning dynamics from video.
https://planetrl.github.io
The other day, someone asked me how many kinds of regression there were.
OLS and binary logistic are very popular. There are also numerous methods for count and ordinal data, as well as multinomial models.
There are many kinds of time-to-event models, and nonparametric forms of regression.
We also have polynomial, spline and quantile regression. Some consider ridge, lasso and elastic net as separate types of regression.
IV regression, SUR, Tobit, Heckit, and hurdle regression are other widely used kinds.
Neural Nets, arguably, are a form of regression. There are also many ways to estimate the same regression model with likelihood and Bayesian approaches.
In addition, there are methods designed for time-series analysis and longitudinal modeling, as well as hierarchical, mixed and mixture models.
I've only scratched the surface, and new methods are continually being developed.
I should note that I've used all the methods mentioned above. Each has practical value for hard-hat statisticians like me and is not merely of academic interest.
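As a small worked sketch of two entries in this list, here are OLS and its ridge-penalized cousin fitted in closed form with NumPy. The synthetic data and coefficients are purely illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)
n, p = 200, 3
X = rng.normal(size=(n, p))
true_beta = np.array([2.0, -1.0, 0.5])
y = X @ true_beta + rng.normal(scale=0.1, size=n)

# OLS via the normal equations: beta = (X'X)^{-1} X'y
beta_ols = np.linalg.solve(X.T @ X, X.T @ y)

# Ridge adds an L2 penalty: beta = (X'X + lambda*I)^{-1} X'y
lam = 1.0
beta_ridge = np.linalg.solve(X.T @ X + lam * np.eye(p), X.T @ y)
```

The ridge solution is shrunk toward zero relative to OLS, which is the whole point of the penalty.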
Understanding Neural Networks via Feature Visualization: A survey
Nguyen et al.: https://lnkd.in/eRZMuTS
#neuralnetworks #generatornetwork #generativemodels
Very interesting paper on machine learning algorithms. This paper compares polynomial regression with neural networks applied to several well-known datasets (including MNIST). The results are worth a look.
Other datasets tested: (1) census data of engineers' salaries in Silicon Valley; (2) million song data; (3) concrete strength data; (4) letter recognition data; (5) New York City taxi data; (6) forest cover type data; (7) Harvard/MIT MOOC course completion data; (8) amateur athletic competitions; (9) NCI cancer genomics; (10) MNIST image classification; and (11) the United States 2016 Presidential Election.
I haven't reproduced the paper myself, but I am very tempted to.
Link here: https://lnkd.in/fd-VNtk
#machinelearning #petroleumengineering #artificialintelligence #data #algorithms #neuralnetworks #predictiveanalytics
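As a minimal illustration of the polynomial-regression baseline behind such comparisons, here's a degree-2 fit via `np.polyfit` on synthetic data. The data and degree are made up here; the paper's actual experimental setup is in the link above.

```python
import numpy as np

rng = np.random.default_rng(42)
x = np.linspace(-1, 1, 100)
# Ground truth: 1 + 2x - 3x^2, plus a little noise
y = 1.0 + 2.0 * x - 3.0 * x**2 + rng.normal(scale=0.05, size=x.size)

# Least-squares polynomial fit (coefficients, highest degree first)
coeffs = np.polyfit(x, y, deg=2)
y_hat = np.polyval(coeffs, x)
rmse = np.sqrt(np.mean((y - y_hat) ** 2))
```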
DeepAMD: Detect Early Age-Related Macular Degeneration.
#BigData #Analytics #DataScience #AI #MachineLearning #DeepLearning #IoT #IIoT #PyTorch #Python #CloudComputing #DataScientist #Linux
https://link.springer.com/chapter/10.1007%2F978-3-030-20873-8_40
A collection of research papers on decision trees, classification trees, and regression trees with implementations:
https://github.com/benedekrozemberczki/awesome-decision-tree-papers
#BigData #MachineLearning #AI #DataScience #Algorithms #NLProc #Coding #DataScientists
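As a toy illustration of the core primitive these papers build on, here is a single-split regression stump: find the threshold that minimizes squared error of a two-leaf, piecewise-constant prediction. The data is invented for the example.

```python
import numpy as np

def fit_stump(x, y):
    """Return the threshold on x minimizing total squared error
    of a two-leaf (piecewise-constant) regression tree."""
    best_sse, best_t = np.inf, None
    for t in np.unique(x)[:-1]:          # candidate split points
        left, right = y[x <= t], y[x > t]
        sse = ((left - left.mean()) ** 2).sum() + ((right - right.mean()) ** 2).sum()
        if sse < best_sse:
            best_sse, best_t = sse, t
    return best_t

x = np.array([1.0, 2.0, 3.0, 10.0, 11.0, 12.0])
y = np.array([0.0, 0.1, 0.0, 1.0, 1.1, 0.9])
t = fit_stump(x, y)  # best split separates the two clusters
```

Real CART-style trees apply this greedy split search recursively to each leaf; ensembles (random forests, gradient boosting) then combine many such trees.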
François Chollet
This is how you implement a network in Chainer. Chainer, the original eager-first #deeplearning framework, has had this API since launch, in mid-2015. When PyTorch got started, it followed the Chainer template (in fact, the prototype of PyTorch was literally a fork of Chainer).
Nearly every day, I am getting ignorant messages saying, "PyTorch is an original innovation that TensorFlow/Keras copied". This is incorrect. Subclassing is a fairly obvious way to do things in Python, and Chainer had this API first. Many others followed.
I had been looking at adding a Model subclassing API to Keras as soon as late 2015 (before the Functional API even existed, and over a year before being aware of PyTorch), inspired by Chainer. Our first discussions about adding an eager execution mode also predate PyTorch.
By the time #PyTorch came out, I had been looking at its API (which is exactly the Chainer API) for 1.5 years (since the release of Chainer). It wasn't exactly a shock. There was nothing we didn't already know.
To be clear, it's a good thing that API patterns and technical innovations are cross-pollinating among deep learning frameworks. The #Keras API itself has had a pretty big influence on libraries that came after. It's completely fine, and it all benefits end users.
Gradient Flow: a new notebook that explains automatic differentiation using eager execution in #TensorFlow. I go over computational graphs, vector-valued functions, gradients, etc.
https://colab.research.google.com/github/zaidalyafeai/Notebooks/blob/master/GradientFlow.ipynb
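The notebook uses TensorFlow's gradient tape (reverse mode); the basic idea of differentiating through ordinary code can be sketched framework-free with forward-mode dual numbers. This toy is not how TensorFlow works internally, but it shows how derivatives flow through each operation.

```python
class Dual:
    """Minimal forward-mode autodiff value: carries f(x) and f'(x)."""
    def __init__(self, val, dot=0.0):
        self.val, self.dot = val, dot

    def __add__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        return Dual(self.val + other.val, self.dot + other.dot)
    __radd__ = __add__

    def __mul__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        # Product rule: (fg)' = f'g + fg'
        return Dual(self.val * other.val,
                    self.dot * other.val + self.val * other.dot)
    __rmul__ = __mul__

def grad(f, x):
    """Derivative of scalar f at x, by seeding dx/dx = 1."""
    return f(Dual(x, 1.0)).dot

# d/dx (3x^2 + 2x) at x = 2 is 6x + 2 = 14
g = grad(lambda x: 3 * x * x + 2 * x, 2.0)
```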
Wow, so this actually happened. We are the highest-rated course in Artificial Intelligence and Computer Science (out of more than 1,000 courses) on Class Central, with 170,000 students and counting. Such an honor. #wearehelsinkiuni #ElementsofAI #AIChallenge #AIutmaningen
http://www.elementsofai.com
A free online introduction to artificial intelligence for non-experts
Learn more about MinnaLearn's and the University of Helsinki's AI course - no programming or complicated math required.
One of the BEST #MachineLearning glossaries, by Google.
It will definitely come in handy - https://lnkd.in/gNiE9JT
In my line of work, I often need to model multiple dependent (outcome) variables simultaneously.
These may be latent variables (each with several indicator, i.e. observed, variables), observed variables, or a combination of the two.
Structural Equation Modeling (SEM) is one method able to handle these situations. Observed variables do not have to be continuous and, in fact, can be a mix of data types, including count variables and binary indicators.
Here are some excellent #books on SEM for those who'd like to learn more about this method:
- Principles and Practice of Structural Equation Modeling (Kline)
- Linear Causal Modeling with Structural Equations (Mulaik)
- Structural Equations with Latent Variables (Bollen)
- Handbook of Structural Equation Modeling (Hoyle)
It's not an easy method for most of us to master, however, and new developments are happening at a rapid pace. Structural Equation Modeling: A Multidisciplinary Journal (Routledge) is an excellent resource for experienced SEM modelers.
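For orientation alongside the books above, the standard LISREL notation splits an SEM into a measurement model (how observed indicators load on latent variables) and a structural model (relations among the latent variables themselves):

```latex
% Measurement model: indicators x, y load on latent exogenous \xi and endogenous \eta
x = \Lambda_x \xi + \delta, \qquad y = \Lambda_y \eta + \varepsilon
% Structural model: regressions among the latent variables
\eta = B \eta + \Gamma \xi + \zeta
```

Here $\Lambda_x, \Lambda_y$ are loading matrices, $B$ and $\Gamma$ hold the structural coefficients, and $\delta, \varepsilon, \zeta$ are error terms. This is a compact reminder of the general form, not a substitute for the references above.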