Awesome research from Google #RL team. Learning dynamics from video
https://planetrl.github.io
✴️ @AI_Python_EN
The other day, someone asked me how many kinds of regression there were.
OLS and binary logistic regression are very popular. There are also numerous methods for count data and ordinal data, as well as multinomial models.
There are many kinds of time-to-event models, and nonparametric forms of regression.
We also have polynomial, spline and quantile regression. Some consider ridge, lasso and elastic net as separate types of regression.
IV regression, SUR, Tobit, Heckit, and hurdle regression are some other kinds which are widely-used.
Neural Nets, arguably, are a form of regression. There are also many ways to estimate the same regression model with likelihood and Bayesian approaches.
In addition, there are methods designed for time-series analysis and longitudinal modeling, as well as hierarchical, mixed and mixture models.
I've only scratched the surface, and new methods are continually being developed.
I should note that I've used all the methods mentioned above. Each has practical value for hard-hat statisticians like me and is not merely of academic interest.
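As a minimal, framework-free illustration of the most common of these, here is simple OLS fitted with the closed-form slope/intercept formulas (the data is made up purely for demonstration):

```python
# Simple ordinary least squares (OLS) for y = a + b*x,
# using the closed-form formulas: slope = cov(x, y) / var(x).
def ols_fit(xs, ys):
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    cov_xy = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    var_x = sum((x - mean_x) ** 2 for x in xs)
    b = cov_xy / var_x          # slope
    a = mean_y - b * mean_x     # intercept
    return a, b

# Toy data lying exactly on y = 1 + 2x
a, b = ols_fit([0, 1, 2, 3], [1, 3, 5, 7])
```

Every other method on the list generalizes this basic setup: different link functions, different loss functions, or different assumptions about the error structure.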
✴️ @AI_Python_EN
Understanding Neural Networks via Feature Visualization: A survey
Nguyen et al.: https://lnkd.in/eRZMuTS
#neuralnetworks #generatornetwork #generativemodels
✴️ @AI_Python_EN
Very interesting paper on machine learning algorithms. It compares polynomial regression with neural networks on several well-known datasets (including MNIST). The results are worth a look.
Other datasets tested: (1) census data on engineers' salaries in Silicon Valley; (2) million song data; (3) concrete strength data; (4) letter recognition data; (5) New York City taxi data; (6) forest cover type data; (7) Harvard/MIT MOOC course completion data; (8) amateur athletic competitions; (9) NCI cancer genomics; (10) MNIST image classification; and (11) the 2016 United States presidential election.
I haven't reproduced the paper myself, but I am very tempted to.
Link here: https://lnkd.in/fd-VNtk
#machinelearning #petroleumengineering #artificialintelligence #data #algorithms #neuralnetworks #predictiveanalytics
✴️ @AI_Python_EN
DeepAMD: Detect Early Age-Related Macular Degeneration.
#BigData #Analytics #DataScience #AI #MachineLearning #DeepLearning #IoT #IIoT #PyTorch #Python #CloudComputing #DataScientist #Linux
https://link.springer.com/chapter/10.1007%2F978-3-030-20873-8_40
✴️ @AI_Python_EN
A collection of research papers on decision trees, classification trees, and regression trees with implementations:
https://github.com/benedekrozemberczki/awesome-decision-tree-papers
#BigData #MachineLearning #AI #DataScience #Algorithms #NLProc #Coding #DataScientists
✴️ @AI_Python_EN
François Chollet
This is how you implement a network in Chainer. Chainer, the original eager-first #deeplearning framework, has had this API since launch, in mid-2015. When PyTorch got started, it followed the Chainer template (in fact, the prototype of PyTorch was literally a fork of Chainer).
Nearly every day, I am getting ignorant messages saying, "PyTorch is an original innovation that TensorFlow/Keras copied". This is incorrect. Subclassing is a fairly obvious way to do things in Python, and Chainer had this API first. Many others followed.
I had been looking at adding a Model subclassing API to Keras as early as late 2015 (before the Functional API even existed, and over a year before being aware of PyTorch), inspired by Chainer. Our first discussions about adding an eager execution mode also predate PyTorch.
By the time #PyTorch came out, I had been looking at its API (which is exactly the Chainer API) for 1.5 years (since the release of Chainer). It wasn't exactly a shock. There was nothing we didn't already know.
To be clear, it's a good thing that API patterns and technical innovations are cross-pollinating among deep learning frameworks. The #Keras API itself has had a pretty big influence over libraries that came after. It's completely fine, and it all benefits end users.
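The "model subclassing" pattern the post is about can be sketched in plain Python, without any framework. This mimics the general style of Chainer/PyTorch/Keras subclassing rather than reproducing any one library's real API:

```python
# Framework-agnostic sketch of the define-by-run subclassing pattern:
# a model is an ordinary Python class, and the "graph" is just the
# control flow of its forward() method.
class Layer:
    def __call__(self, x):
        return self.forward(x)

class Dense(Layer):
    """A toy 1-D affine layer: y = weight * x + bias."""
    def __init__(self, weight, bias):
        self.weight, self.bias = weight, bias

    def forward(self, x):
        return self.weight * x + self.bias

class MyModel(Layer):
    def __init__(self):
        self.l1 = Dense(2.0, 1.0)
        self.l2 = Dense(3.0, 0.0)

    def forward(self, x):
        # Layers are composed imperatively, eagerly, one call at a time.
        return self.l2(self.l1(x))

model = MyModel()
y = model(1.0)  # l1: 2*1 + 1 = 3, then l2: 3*3 = 9.0
```

The appeal is exactly what the post says: subclassing is the obvious way to do this in Python, which is why multiple frameworks converged on it.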
✴️ @AI_Python_EN
Gradient Flow: a new notebook that explains automatic differentiation using eager execution in #TensorFlow. I go over computational graphs, vector-valued functions, gradients, etc.
https://colab.research.google.com/github/zaidalyafeai/Notebooks/blob/master/GradientFlow.ipynb
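For a flavor of what the notebook covers, the core idea of automatic differentiation can be sketched without TensorFlow at all. Here is a toy forward-mode version using dual numbers (an illustration of the concept, not the notebook's actual code):

```python
# Minimal forward-mode automatic differentiation with dual numbers.
# A Dual carries (value, derivative) and propagates both through arithmetic.
class Dual:
    def __init__(self, val, dot=0.0):
        self.val, self.dot = val, dot

    def __add__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        return Dual(self.val + other.val, self.dot + other.dot)
    __radd__ = __add__

    def __mul__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        # Product rule: (uv)' = u'v + uv'
        return Dual(self.val * other.val,
                    self.dot * other.val + self.val * other.dot)
    __rmul__ = __mul__

def derivative(f, x):
    # Seed the input with derivative 1, then read df/dx off the result.
    return f(Dual(x, 1.0)).dot

# d/dx (x*x + 3*x) at x = 2  ->  2*2 + 3 = 7
grad = derivative(lambda x: x * x + 3 * x, 2.0)
```

TensorFlow's eager `GradientTape` works in the other direction (reverse mode, recording operations and replaying them backwards), but the principle of propagating derivatives alongside values is the same.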
✴️ @AI_Python_EN
Wow, so this actually happened. We are the highest-rated course in Artificial Intelligence and Computer Science (out of more than 1,000 courses) on Class Central, with 170,000 students and counting. Such an honor. #wearehelsinkiuni #ElementsofAI #AIChallenge #AIutmaningen
http://www.elementsofai.com
One of the best #MachineLearning glossaries, by Google.
It will definitely come in handy - https://lnkd.in/gNiE9JT
✴️ @AI_Python_EN
In my line of work, I often need to model multiple dependent (outcome) variables simultaneously.
These may be latent variables (each with several observed indicator variables), observed variables, or a combination of the two.
Structural Equation Modeling (SEM) is one method able to handle these situations. Observed variables do not have to be continuous and, in fact, can be a mix of data types, including count variables and binary indicators.
Here are some excellent #books on SEM for those who'd like to learn more about this method:
- Principles and Practice of Structural Equation Modeling (Kline)
- Linear Causal Modeling with Structural Equations (Mulaik)
- Structural Equations with Latent Variables (Bollen)
- Handbook of Structural Equation Modeling (Hoyle)
It's not an easy method for most of us to master, however, and new developments are happening at a rapid pace. Structural Equation Modeling: A Multidisciplinary Journal (Routledge) is an excellent resource for experienced SEM modelers.
✴️ @AI_Python_EN
**** Advanced NLP with spaCy ****
Credits - Ines Montani
Link - https://course.spacy.io/
#nlp #spacy #naturallanguageprocessing #machinelearning
#datascience
✴️ @AI_Python_EN
Every day we are bombarded with Artificial Intelligence news. Each new product is advertised as packed with AI. Every business wants to be perceived as an AI wonder house.
To cut through this AI information storm, I've prepared a simplified (and definitely not complete) chart, presented below. What is currently called "Artificial Intelligence" is a type of Machine Learning utilizing multi-layer (i.e. deep) #ArtificialNeuralNetworks. Processing the larger number of "hidden" layers in these networks involves matrix operations, which are performed more efficiently on Graphics Processing Units (GPUs) than on standard CPUs.
And since sometimes we have no f… idea what is going on inside these artificial neural networks, we decided to call it #ArtificialIntelligence ;)
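The matrix operations mentioned above are easy to see in a single hidden layer's forward pass. A pure-Python sketch (GPUs simply run many of these multiply-accumulates in parallel):

```python
# One hidden layer's forward pass is a matrix-vector product plus bias:
# y = W @ x + b. Each output element is a row of multiply-accumulates --
# exactly the kind of work GPUs parallelize across thousands of cores.
def dense_forward(W, x, b):
    return [sum(w_ij * x_j for w_ij, x_j in zip(row, x)) + b_i
            for row, b_i in zip(W, b)]

W = [[1.0, 2.0],
     [0.0, 1.0]]
y = dense_forward(W, [3.0, 4.0], [1.0, -1.0])  # -> [12.0, 3.0]
```

A deep network just chains many such layers (with a nonlinearity between them), which is why the layer count drives the demand for fast matrix hardware.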
✴️ @AI_Python_EN
How to Perform Object Detection With YOLOv3 in Keras
https://machinelearningmastery.com/how-to-perform-object-detection-with-yolov3-in-keras/
#ObjectDetection
✴️ @AI_Python_EN
Forwarded from DLeX: AI Python (Farzad)
Unsupervised Learning with Graph Neural Networks
video: http://www.ipam.ucla.edu/programs/workshops/workshop-iv-deep-geometric-learning-of-big-data-and-applications/?tab=schedule
guide: http://helper.ipam.ucla.edu/publications/glws4/glws4_15546.pdf
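As a toy illustration of the propagation step at the heart of most graph neural networks (not the talk's actual method), here is one round of neighborhood averaging over a small graph:

```python
# One round of GNN-style message passing: each node's new feature is the
# average of its own feature and its neighbors' features.
def propagate(features, adjacency):
    new = {}
    for node, neighbors in adjacency.items():
        vals = [features[node]] + [features[n] for n in neighbors]
        new[node] = sum(vals) / len(vals)
    return new

# A tiny path graph: a -- b -- c
adjacency = {"a": ["b"], "b": ["a", "c"], "c": ["b"]}
features = {"a": 0.0, "b": 3.0, "c": 6.0}
features = propagate(features, adjacency)  # "b" -> (3 + 0 + 6) / 3 = 3.0
```

Real GNNs add learned weight matrices and nonlinearities around this averaging, and stack several rounds so information flows across multiple hops.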
#ArtificialIntelligence #UnsupervisedLearning
✴️ @AI_Python_EN
Geoffrey Hinton Leads Google Brain Representation Similarity Index Research Aiming to Understand Neural Networks
https://medium.com/syncedreview/geoffrey-hinton-leads-google-brain-representation-similarity-index-research-aiming-to-understand-b5d14bf77f49
✴️ @AI_Python_EN
For anyone interested in non-parametric multivariate data analysis, my PhD thesis is now publicly available on arXiv.
https://arxiv.org/abs/1905.10716v1
✴️ @AI_Python_EN
#Statistics such as correlation, mean and standard deviation (variance) create strong visual images and meaning. Two different #datasets with the same correlation would sort of look the same. Right?
Not so much.
Each of these very different-looking graphs is plotting a dataset with the same correlation, mean, and SD. This is why plotting data is so important, though oddly (in my experience) so rarely done.
https://bit.ly/2oZ29MP
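The classic example is Anscombe's quartet; the first two of its four datasets already make the point. These are the well-known published values, checked here with only the standard library:

```python
import math

def pearson(xs, ys):
    """Pearson correlation coefficient, computed from scratch."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# First two datasets of Anscombe's quartet: very different shapes when
# plotted (one roughly linear with noise, one a clean parabola), yet
# nearly identical summary statistics.
x = [10, 8, 13, 9, 11, 14, 6, 4, 12, 7, 5]
y1 = [8.04, 6.95, 7.58, 8.81, 8.33, 9.96, 7.24, 4.26, 10.84, 4.82, 5.68]
y2 = [9.14, 8.14, 8.74, 8.77, 9.26, 8.10, 6.13, 3.10, 9.13, 7.26, 4.74]

r1, r2 = pearson(x, y1), pearson(x, y2)  # both ~0.816
```

The summary statistics agree to two or three decimal places, yet a single scatter plot of each dataset reveals completely different structure.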
✴️ @AI_Python_EN
Misconception 4: Explainable #machinelearning is just models of models.
I do like surrogate models. They have several important uses. However, I also REALLY like tools that explain models directly, including:
- ALE: https://lnkd.in/e3mz23V
- ICE: https://lnkd.in/eaQxk_Q
- Friedman's H-stat: https://lnkd.in/emwNcdy
- Partial dependence: https://lnkd.in/ejnkFYN, Section 10.13.2
- Shapley explanations: https://lnkd.in/ewsMxbU
(What did I miss? Any others?)
Moreover, surrogate models & direct explanatory techniques work very well together! See pic below.
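As a toy sketch of one of these techniques, partial dependence: hold the feature of interest fixed at a grid value and average the model's prediction over the observed values of the other features. The model and data below are made up purely to show the mechanic:

```python
# Partial dependence of a model on feature x1: fix x1, then average
# predictions over the dataset's observed values of the other feature(s).
def model(x1, x2):
    # Toy black-box model for illustration only.
    return x1 ** 2 + 2 * x2

def partial_dependence(model, x1_value, x2_observed):
    return sum(model(x1_value, x2) for x2 in x2_observed) / len(x2_observed)

x2_observed = [0.0, 1.0, 2.0]
pd_at_3 = partial_dependence(model, 3.0, x2_observed)  # 9 + 2*mean(x2) = 11.0
```

Sweeping `x1_value` over a grid and plotting the result gives the familiar partial dependence curve; ICE plots are the same computation without the averaging, one curve per observation.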
Misconception 3: https://lnkd.in/eM3hVyW
Read more/contribute: https://lnkd.in/e8_hciE
#ai #datascience #deeplearning #aiforall #artificialintelligence #datascience #ml #python
✴️ @AI_Python_EN