There are now many methods we can use when our dependent variable is not continuous. SVMs, XGBoost, and Random Forests are some popular ones.
There are also "traditional" methods, such as Logistic Regression. These usually scale well and, when used properly, are competitive in terms of predictive accuracy.
They are probabilistic models, which gives them additional flexibility. They also are often easier to interpret, critical when the goal is explanation, not just prediction.
They can be more work, however, and are probably easier to misuse than newer methods such as Random Forests. Here are some excellent books on these methods that may be of interest:
- Categorical Data Analysis (Agresti)
- Analyzing Categorical Data (Simonoff)
- Regression Models for Categorical Dependent Variables (Long and Freese)
- Generalized Linear Models and Extensions (Hardin and Hilbe)
- Regression Modeling Strategies (Harrell)
- Applied Logistic Regression (Hosmer and Lemeshow)
- Logistic Regression Models (Hilbe)
- Analysis of Ordinal Categorical Data (Agresti)
- Applied Ordinal Logistic Regression (Liu)
- Modeling Count Data (Hilbe)
- Negative Binomial Regression (Hilbe)
- Handbook of Survival Analysis (Klein et al.)
- Survival Analysis: A Self-Learning Text (Kleinbaum and Klein)
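To make the probabilistic-model idea concrete, here is a toy sketch of a logistic regression fit — made-up data, with plain gradient ascent standing in for the IRLS that a GLM package would actually use:

```python
import math

# Hypothetical toy data: x = hours studied, y = passed (1) or not (0).
data = [(0.5, 0), (1.0, 0), (1.5, 0), (2.0, 0), (2.5, 1),
        (3.0, 0), (3.5, 1), (4.0, 1), (4.5, 1), (5.0, 1)]

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

# Maximise the Bernoulli log-likelihood of logit(p) = b0 + b1*x
# by gradient ascent (a GLM routine would use IRLS instead).
b0 = b1 = 0.0
lr = 0.01
for _ in range(20000):
    g0 = sum(y - sigmoid(b0 + b1 * x) for x, y in data)
    g1 = sum((y - sigmoid(b0 + b1 * x)) * x for x, y in data)
    b0 += lr * g0
    b1 += lr * g1

p_at_4 = sigmoid(b0 + b1 * 4.0)  # predicted P(y = 1) at x = 4
```

In practice you would reach for statsmodels' `Logit`/`GLM` or scikit-learn's `LogisticRegression` rather than hand-rolling the optimiser; the point is only that the model outputs a probability through an explicit, interpretable linear predictor.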
#statistics #book #Machinelearning
✴️ @AI_Python
New #ArtificialIntelligence Sees Like a Human, Bringing Us Closer to Skynet
Read the research: https://lnkd.in/dU9W3D4
✴️ @AI_Python
"Can #neuralnetworks be made to reason?" Conversation with Ian Goodfellow
Full version: https://www.youtube.com/watch?v=Z6rxFNMGdn0
✴️ @AI_Python
We are open-sourcing Pythia, a #deeplearning platform to support multitasking for vision and language tasks. With Pythia, researchers can more easily build, reproduce, and benchmark AI models.
https://code.fb.com/ai-research/pythia/
✴️ @AI_Python_EN
Course material for the STAT 479: #DeepLearning (SS 2019) course at the University of Wisconsin-Madison
🌎 Learn more
✴️ @AI_Python_EN
#MachineLearning in Agriculture: Applications and Techniques
🌎 Machine Learning in Agriculture
✴️ @AI_Python_EN
Detection Free Human Instance Segmentation using Pose2Seg and PyTorch
https://towardsdatascience.com/detection-free-human-instance-segmentation-using-pose2seg-and-pytorch-72f48dc4d23e
✴️ @AI_Python_EN
10 Free Python Programming Courses For Beginners
https://hackernoon.com/10-free-python-programming-courses-for-beginners-to-learn-online-38312f3b9912
✴️ @AI_Python_EN
Korbit is now launching the world’s first deep learning course taught by an interactive deep learning tutor. The online course is a four-week-long introduction to #machinelearning and deep learning, featuring lectures from Mila professors Yoshua Bengio, Laurent Charlin, Audrey Durand and Aaron Courville, and includes over 100 interactive exercises (question-answering exercises, drag-and-drop exercises and mathematical problems).
The #deeplearning tutor Korbi guides students through the course with a problem-solving approach and offers them different exercises, hints and visual diagrams based on their individual level of understanding and unique learning profile. The course is free and available for everyone at: www.korbit.ai/machinelearning.
✴️ @AI_Python_EN
🔸Inside TensorFlow: Summaries and TensorBoard
🌐 https://www.youtube.com/watch?v=OI4cskHUslQ
✴️ @AI_Python_EN
🌐 https://www.youtube.com/watch?v=OI4cskHUslQ
✴️ @AI_Python_EN
Take an inside look into the TensorFlow team’s own internal training sessions--technical deep dives into TensorFlow by the very people who are building it!
This week we take a look into TensorBoard with Nick Felt, an Engineer on the TensorFlow team. Learn…
Another great paper from the Samsung AI lab! Egor Zakharov et al. ("Few-Shot Adversarial Learning of Realistic Neural Talking Head Models") animate heads using only a few shots of the target person (or even a single shot). Keypoints, adaptive instance norms and #GANs; no 3D face modelling at all.
📝 https://arxiv.org/abs/1905.08233
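As a rough sketch of the adaptive-instance-norm ingredient mentioned above (my own minimal NumPy version, not the paper's code): each channel of a content feature map is whitened and then rescaled to per-channel style statistics.

```python
import numpy as np

def adain(content, style_mean, style_std, eps=1e-5):
    """Adaptive instance norm over a (channels, H, W) feature map:
    whiten each channel, then rescale to the style statistics."""
    mu = content.mean(axis=(1, 2), keepdims=True)
    sigma = content.std(axis=(1, 2), keepdims=True)
    return style_std * (content - mu) / (sigma + eps) + style_mean
```

In the paper, as I read it, the style statistics come from an embedder network run on the few available frames of the target person; here they are just arrays.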
✴️ @AI_Python_EN
A few weeks ago, a friend of mine asked me, "Which papers can I read to catch up with the latest trends in modern #NLP?" 🏃‍♂️👨‍🎓 I compiled a list of papers and resources for him 📚 and thought it would be great to share it!
🌎 More
✴️ @AI_Python_EN
I do not currently use Python but, since it's quite popular among data scientists, I've looked into it and have read several #books on #Python:
- A Primer on Scientific Programming with Python (Langtangen)
- Python for Data Analysis (McKinney)
- Python Data Science Essentials (Boschetti and Massaron)
- Machine Learning in Python (Bowles)
- Hands-On Predictive Analytics with Python (Fuentes)
- Data Science for Marketing Analytics (Blanchard et al.)
- Bayesian Analysis with Python (Martin)
- Web Scraping with Python (Lawson)
I have found all of them helpful in different ways, including offering different perspectives on data science. Several are well-known, but I cannot critique them as an experienced Python user, so this is just FYI.
✴️ @AI_Python_EN
When you read "our AI-based solution" in yet another white paper or About Us page.
✴️ @AI_Python_EN
An advantage probabilistic models such as logistic regression have over hard classifiers is that we are not bound to positive/negative dichotomies.
A customer with a purchase probability of .51 may be very different from one with a probability of .99.
Moreover, our model may perform very well in some prediction ranges and fall down badly in others. Having this information can help us diagnose the causes, improve our model and learn about our customers.
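A tiny illustration of why the probability itself matters (the dollar figures here are invented):

```python
# Two customers a 0.5-threshold classifier treats identically:
p_a, p_b = 0.51, 0.99

def hard_label(p, threshold=0.5):
    return int(p >= threshold)

assert hard_label(p_a) == hard_label(p_b) == 1  # same predicted class...

# ...but a very different expected value of, say, a $100 offer
# that costs $10 to make (hypothetical economics):
def expected_profit(p, revenue=100.0, cost=10.0):
    return p * revenue - cost
```

The hard labels agree; the expected profits (about $41 versus $89) do not, and that difference is exactly what the dichotomy throws away.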
What about non-linearities and moderated effects (interactions)? With a little extra work, these can be identified and incorporated into the model.
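For instance (coefficients invented purely for illustration), an interaction term lets the effect of one predictor depend on another:

```python
import math

# Hypothetical linear predictor: the effect of price depends on
# loyalty through the b3 * price * loyalty interaction term.
b0, b1, b2, b3 = -1.0, -0.8, 0.5, 0.3

def p_buy(price, loyalty):
    z = b0 + b1 * price + b2 * loyalty + b3 * price * loyalty
    return 1.0 / (1.0 + math.exp(-z))

# The price slope on the logit scale is b1 + b3 * loyalty: with
# these made-up numbers a price increase hurts at loyalty = 0
# but actually helps at loyalty = 5.
```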
What about customer heterogeneity (for example)? Any model can be boosted or bagged, and mixture modeling is also (for me) an attractive option.
Methods such as logistic regression also have advantages over some other methods when there are more than two classes (groups) and when the sizes of the classes are very different (class imbalance).
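One standard trick for class imbalance is to weight each observation's log-likelihood contribution inversely to its class frequency — the idea behind scikit-learn's class_weight="balanced". A sketch with made-up counts:

```python
import math

n_pos, n_neg = 100, 900          # a 1:9 imbalanced sample
n = n_pos + n_neg
w_pos = n / (2 * n_pos)          # 5.0
w_neg = n / (2 * n_neg)          # 0.555...

def weighted_loglik(y, p):
    # One observation's contribution to the weighted Bernoulli
    # log-likelihood; minority-class errors now cost ~9x more.
    return w_pos * y * math.log(p) + w_neg * (1 - y) * math.log(1 - p)
```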
✴️ @AI_Python_EN