What happens when a dataset has too many #variables? Here are a few possible situations you might come across:
• You find that most of the variables are correlated.
• You lose patience and decide to run a model on the whole data, which returns poor accuracy
• You become indecisive about what to do
• You start thinking of some strategic method to find a few important variables
But dealing with such situations isn’t as difficult as it sounds. Statistical techniques such as factor analysis and principal component analysis help overcome them. Here's a detailed guide on Principal Component Analysis, a method to extract important variables from a large set of variables available in a #dataset. Read the full article here: https://lnkd.in/fbKgbrh
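To get a quick feel for it, here is a minimal sketch (assuming scikit-learn and a numeric feature matrix; the toy data below just stands in for your #dataset) of reducing many correlated variables to a few principal components:

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

# Toy data: 200 samples, 10 correlated numeric features
rng = np.random.default_rng(0)
latent = rng.normal(size=(200, 2))
X = latent @ rng.normal(size=(2, 10)) + 0.1 * rng.normal(size=(200, 10))

# Standardize first, since PCA is sensitive to feature scales
X_scaled = StandardScaler().fit_transform(X)

# Keep enough components to explain ~95% of the variance
pca = PCA(n_components=0.95)
X_reduced = pca.fit_transform(X_scaled)

print(X_reduced.shape)                # far fewer columns than the original
print(pca.explained_variance_ratio_)  # variance captured by each component
```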
✴️ @AI_Python_EN
How to Build a Simple Artificial Neural Network (#ANN)
#ArtificialNeuralNetwork
🌎 Artificial Neural Network
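A minimal sketch of the idea, assuming nothing beyond NumPy: a tiny one-hidden-layer network trained on XOR with plain gradient descent.

```python
import numpy as np

# XOR toy problem
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

rng = np.random.default_rng(42)
W1, b1 = rng.normal(scale=0.5, size=(2, 8)), np.zeros(8)   # input -> hidden
W2, b2 = rng.normal(scale=0.5, size=(8, 1)), np.zeros(1)   # hidden -> output

sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))

for step in range(10000):
    # forward pass
    h = sigmoid(X @ W1 + b1)
    out = sigmoid(h @ W2 + b2)

    # backward pass (mean squared error, chain rule through the sigmoids)
    d_out = (out - y) * out * (1 - out)
    d_h = (d_out @ W2.T) * h * (1 - h)

    # gradient descent update
    W2 -= h.T @ d_out;  b2 -= d_out.sum(axis=0)
    W1 -= X.T @ d_h;    b1 -= d_h.sum(axis=0)

print(out.round(3))  # should move toward [0, 1, 1, 0]
```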
✴️ @AI_Python_EN
If you're bootstrapping yourself into deep learning research, here’s what I would do:
1. FastAI (3m)
2. Personal projects/reproduce papers/consulting (3 - 12m)
3. Flashcard the Deep Learning Book (4-6m)
4. Flashcard ~100 papers in a niche (2m)
5. Publish your first paper (6m)
✴️ @AI_Python_EN
Now is the time to build an AI startup! The ecosystem has matured, and the four pillars of AI are now freely accessible:
1. Data - Kaggle, Google Dataset Search
2. Algorithms - Arxiv, GitHub
3. Compute - Google Colab, Kaggle Kernels
4. Education - School of AI, KhanAcademy, Fast.AI
CS234: Reinforcement Learning | Winter 2019
By Emma Brunskill: https://lnkd.in/eyNjZBR
#DeepLearning #MachineLearning #ReinforcementLearning
✴️ @AI_Python_EN
Amazing project success: #DeepLearning for #Radiologists
This CNN model for breast cancer screening exam classification was trained and evaluated on over 200,000 exams (over 1,000,000 images).
Accuracy? ~90% in predicting whether there is cancer in the breast when tested on the screening population.
It uses a two-stage training procedure, which allows a very high-capacity patch-level network to learn from pixel-level labels alongside a network learning from macroscopic breast-level labels.
Paper on #ArXiv https://lnkd.in/ggj5Z6W
Code: https://lnkd.in/gScbpUs
Explanation: https://lnkd.in/gfa9gzM
#ai #deeplearning #radiology #model #breast #mammography
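As a rough illustration of the two-stage idea (this is not the authors' code, which is linked above; layer sizes and inputs are made up): a patch-level classifier is trained on patch/pixel labels first, then frozen, and a breast-level network learns from exam-level labels using the image plus the patch-score heatmap.

```python
import torch
import torch.nn as nn

# Stage 1 (hypothetical sizes): high-capacity patch-level classifier,
# trained first on patch/pixel-level labels, then frozen.
patch_net = nn.Sequential(
    nn.Conv2d(1, 32, kernel_size=3, padding=1), nn.ReLU(),
    nn.Conv2d(32, 64, kernel_size=3, padding=1), nn.ReLU(),
    nn.AdaptiveAvgPool2d(1), nn.Flatten(),
    nn.Linear(64, 1),                  # patch score
)
for p in patch_net.parameters():       # ... after training, freeze for stage 2
    p.requires_grad = False

# Stage 2: breast-level network that sees the image plus a heatmap of
# patch-level scores and learns only from breast-level labels.
class BreastLevelNet(nn.Module):
    def __init__(self):
        super().__init__()
        self.backbone = nn.Sequential(
            nn.Conv2d(2, 16, kernel_size=3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
            nn.Linear(16, 1),          # breast-level score
        )

    def forward(self, image, patch_score_map):
        x = torch.cat([image, patch_score_map], dim=1)  # stack as channels
        return self.backbone(x)

model = BreastLevelNet()
image = torch.randn(4, 1, 256, 256)        # toy mammogram batch
score_map = torch.randn(4, 1, 256, 256)    # toy patch-score heatmap
print(model(image, score_map).shape)       # [4, 1]
```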
✴️ @AI_Python_EN
An overview of using #R for validated work:
1.) Base R #Validation for #FDA: https://lnkd.in/ep8TRM8
2.) #RStudio IDE Validation: https://lnkd.in/e34FCXn
3.) Evaluating Package Stability
4.) Evaluating Package Dependencies: https://lnkd.in/eniCXgG
5.) Organizing Packages with an Internal Repository: https://lnkd.in/etSGuk4
#rstats
✴️ @AI_Python_EN
We released a new large-scale corpus of English speech for TTS: "LibriTTS: A Corpus Derived from LibriSpeech for Text-to-Speech"
Dataset: http://www.openslr.org/60/
Paper: http://arxiv.org/abs/1904.02882
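If you want to poke at the data from Python, one convenient route (an assumption here, not part of the release itself) is torchaudio's LIBRITTS dataset wrapper:

```python
import torchaudio

# Downloads and extracts the 'dev-clean' subset from openslr.org on first use
dataset = torchaudio.datasets.LIBRITTS(root="./data", url="dev-clean", download=True)

(waveform, sample_rate, original_text,
 normalized_text, speaker_id, chapter_id, utterance_id) = dataset[0]
print(sample_rate, waveform.shape)   # 24 kHz audio
print(normalized_text)
```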
✴️ @AI_Python_EN
We released our new interactive annotation approach, which outperforms Polygon-RNN++ and is 10x faster.
Paper: https://arxiv.org/pdf/1903.06874.pdf
Video: https://www.youtube.com/watch?v=ycD2BtO-QzU
Code: https://github.com/fidler-lab/curve-gcn
✴️ @AI_Python_EN
A preprint for our #naacl2019 paper "Combining Sentiment Lexica with a Multi-View Variational #Autoencoder" is now online! We combine lexica with different polarity scales using a novel multi-view VAE.
https://arxiv.org/abs/1904.02839
✴️ @AI_Python_EN
PaintBot: A Reinforcement Learning Approach for Natural Media Painting
Jia et al.: https://lnkd.in/ez5Vqav
#ComputerVision #PatternRecognition #ReinforcementLearning #Painting
✴️ @AI_Python_EN
Introduction to the math of backprop
By Deb Panigrahi: https://lnkd.in/ddtyj_U
#ArtificialIntelligence #BackPropagation #DeepLearning #NeuralNetworks
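To make the chain rule concrete, here is a minimal hand-worked sketch (plain NumPy, a tiny one-weight-per-layer "network"; all numbers are made up) showing the gradient flowing backward through each layer, checked against a finite-difference approximation:

```python
import numpy as np

# Tiny network: y_hat = w2 * sigmoid(w1 * x), loss = 0.5 * (y_hat - y)^2
sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))

x, y = 1.5, 0.3
w1, w2 = 0.8, -0.4

# forward pass
a = w1 * x
h = sigmoid(a)
y_hat = w2 * h
loss = 0.5 * (y_hat - y) ** 2

# backward pass: apply the chain rule layer by layer
d_yhat = y_hat - y            # dL/dy_hat
d_w2 = d_yhat * h             # dL/dw2
d_h = d_yhat * w2             # dL/dh
d_a = d_h * h * (1 - h)       # dL/da  (sigmoid derivative)
d_w1 = d_a * x                # dL/dw1

# sanity check against a finite-difference approximation
eps = 1e-6
loss_eps = 0.5 * (w2 * sigmoid((w1 + eps) * x) - y) ** 2
print(d_w1, (loss_eps - loss) / eps)   # the two numbers should agree closely
```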
✴️ @AI_Python_EN
Music Transformer
Huang et al.: https://lnkd.in/dzHEH4E
#DeepLearning #Transformer #MachineLearning #SpeechProcessing #Music
✴️ @AI_Python_EN
How to run #PyTorch 1.0 and fast.ai 1.0 on an Nvidia Jetson Nano board ($99), an ARM Cortex-A57 board with 4 GB of RAM: https://forums.fast.ai/t/share-your-work-here/27676/1274
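Once the wheels from that thread are installed, a quick sanity check that the GPU build is actually in use could look like this (standard PyTorch/fastai calls, nothing Nano-specific):

```python
import torch
import fastai

print(torch.__version__, fastai.__version__)
print(torch.cuda.is_available())        # should be True if the GPU build installed correctly
print(torch.cuda.get_device_name(0))

# tiny smoke test on the GPU
x = torch.randn(1024, 1024, device="cuda")
print((x @ x).sum().item())
```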
✴️ @AI_Python_EN
Four troubling trends in Machine Learning scholarship:
1. failure to distinguish between explanation and speculation;
2. failure to identify the sources of empirical gains, e.g., emphasizing unnecessary modifications to neural architectures when gains actually stem from hyper-parameter tuning;
3. mathiness: the use of mathematics that obfuscates or impresses rather than clarifies, e.g., by confusing technical and non-technical concepts; and
4. misuse of language, e.g., by choosing terms of art with colloquial connotations or by overloading established technical terms.
https://arxiv.org/abs/1807.03341
✴️ @AI_Python_EN
Six easy ways to run your Jupyter Notebook in the cloud
By Data School: https://lnkd.in/exbAJ-S
✴️ @AI_Python_EN
Understanding Neural ODEs
Blog by Jonty Sinai: https://lnkd.in/e2SEzmZ
#artificialintelligence #machinelearning #neuralnetworks
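A minimal sketch of the core idea (plain PyTorch with a fixed-step Euler integrator rather than an adaptive solver; the layer sizes are made up): the "layer" is a learned vector field f(h, t), and the forward pass integrates it over time.

```python
import torch
import torch.nn as nn

class ODEFunc(nn.Module):
    """Learned vector field dh/dt = f(h, t)."""
    def __init__(self, dim=2, hidden=32):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(dim + 1, hidden), nn.Tanh(), nn.Linear(hidden, dim))

    def forward(self, h, t):
        t_col = t.expand(h.shape[0], 1)          # broadcast time to the batch
        return self.net(torch.cat([h, t_col], dim=1))

def euler_integrate(func, h0, t0=0.0, t1=1.0, steps=20):
    """Forward pass of a neural ODE block with a fixed-step Euler solver."""
    h, dt = h0, (t1 - t0) / steps
    for i in range(steps):
        t = torch.tensor([[t0 + i * dt]])
        h = h + dt * func(h, t)                  # h_{k+1} = h_k + dt * f(h_k, t_k)
    return h

func = ODEFunc()
h0 = torch.randn(8, 2)                           # batch of initial states
h1 = euler_integrate(func, h0)
print(h1.shape)                                  # gradients flow through the unrolled solver
```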
✴️ @AI_Python_EN