Top Resources: Deep Learning Reading List
#deeplearning #Data
🌎 Link Review
🌎 Most Cited Deep Learning Papers
🌎 Understanding CNNs Part 3
🌎 Understanding CNNs Part 2
🌎 Understanding CNNs Part 1
🌎 23 Deep Learning Papers To Get You Started
🌎 Deep learning Reading List
✴️ @AI_Python_EN
🗣 @AI_Python_arXiv
❇️ @AI_Python
Kaggle kernels that researchers have published:
1. Time Series Analysis - Artificial Neural Networks - https://lnkd.in/f8diQkX
2. Titanic - Data Preprocessing and Visualization - https://lnkd.in/fwrvHr5
3. Everything you can do with Seaborn - https://lnkd.in/fpgQCr8
4. Insights of Kaggle ML and DS Survey - https://lnkd.in/fPyiGyU
5. Time Series Analysis - ARIMA model - https://lnkd.in/fn24ihz
6. Time Series Analysis - LSTM - https://lnkd.in/fuY6DXm
7. Introduction to Regression - Complete Analysis - https://lnkd.in/fM3xsZ2
8. Time Series - Preprocessing to Modelling - https://lnkd.in/fJcar4u
The Kaggle community is one of the best communities for Data Science.
#machinelearning #artificialintelligence #datascience #deeplearning #data
✴️ @AI_Python_EN
🗣 @AI_Python_arXiv
❇️ @AI_Python
How would a rockstar 🎸 improve their machine learning models?
How do we improve machine learning models?
To get better at playing the guitar, you play the guitar more. You try different songs, different chords. Practice, practice, practice.
All the practice adds up to more experience, more examples of different notes.
And to try something totally different, you might merge two songs together. Or even take a song written originally for the piano but play it on your guitar.
After a while, you're ready to play a show. But the show won't sound any good if all the speakers are set to different settings. Steve the sound guy takes care of this.
How does this relate to #machinelearning?
1. More practice = more data
More examples of playing different notes = more data. Machine learning models love more data.
2. Combining different songs = feature engineering
If the #data you have isn't in the form you want, transforming it into a different shape may give you a better way of looking at it.
3. Tuning the speakers = hyperparameter tuning
There's a reason tuning the speakers is the last step in playing a rock show. Working speakers don't mean anything without all the practice (collecting data) and songwriting (feature engineering). If you've done 1 and 2 right, this is the easy part.
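To make step 3 concrete, here's a minimal hyperparameter-tuning sketch using scikit-learn's GridSearchCV. The model, parameter grid, and dataset are my own illustrative choices, not something from the post:

```python
# Hyperparameter tuning sketch: try several "speaker settings" and keep the best.
# Model, grid, and data below are illustrative assumptions.
from sklearn.datasets import load_iris
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import GridSearchCV

X, y = load_iris(return_X_y=True)

# Each combination in the grid is one candidate "speaker setting".
param_grid = {"n_estimators": [50, 100], "max_depth": [3, 5, None]}
search = GridSearchCV(RandomForestClassifier(random_state=0), param_grid, cv=5)
search.fit(X, y)

print(search.best_params_, search.best_score_)
```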
🗣 @AI_Python_Arxiv
✴️ @AI_Python_EN
How do you detect outliers in #data?
Do you use a blanket rule of flagging anything outside 3 standard deviations?
Or do you use a more robust method?
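For reference, here's a quick sketch of both rules on a toy array (the data and thresholds are common illustrative defaults, not from the post):

```python
# Two common outlier rules: the 3-standard-deviation z-score rule vs. the
# more robust IQR (Tukey fence) rule. Data and thresholds are illustrative.
import numpy as np

x = np.array([9.8, 10.1, 9.9, 10.2, 10.0, 9.7, 10.3, 25.0])  # 25.0 is suspect

# Blanket rule: flag anything more than 3 standard deviations from the mean.
z = (x - x.mean()) / x.std()
print(x[np.abs(z) > 3])   # empty here: the outlier inflates the std and hides itself

# Robust rule: flag anything beyond 1.5 * IQR outside the quartiles.
q1, q3 = np.percentile(x, [25, 75])
fence = 1.5 * (q3 - q1)
print(x[(x < q1 - fence) | (x > q3 + fence)])  # flags 25.0
```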
If you have a resource you learned from or one you created, I'd love to reference it in my article on exploratory data analysis.
If you want to read it, there's a link in the comments. #EDA is one of the areas I've learned the most about over the past year.
I remember things best if I write about them. So that's what I did.
PS There are more pretty pictures like this one in there too 🎨
#datascience
🗣 @AI_Python_Arxiv
✴️ @AI_Python_EN
Been working through the Google Cloud Certified Professional Data Engineer track on Linux Academy the past few days.
Why?
Because it's one thing to build a #datascience or #machinelearning pipeline in a Jupyter Notebook but it's another thing to have something deployed in production.
Cloud services like #GoogleCloud provide a framework for ingesting, storing, analysing and visualising #data.
My exam is booked for a couple of weeks from now.
The quizzes they have at the end of each module are incredibly helpful.
When I pass the exam, I'll do up a post with some of my favourite resources.
In the meantime, you can check out The Data Dossier book (pictured) here: https://lnkd.in/gmZMcGk
And if you're interested in the full Google Cloud Professional Data Engineer course, it's here: https://lnkd.in/gfBwXRF
✴️ @AI_Python_EN
❇️ @AI_Python
🗣 @AI_Python_arXiv
Most data scientists use Python to solve data science problems, writing code in either script or notebook format.
People used to write in scripts, which means running the entire program again and again, a time-consuming process.
Now, many of us use Jupyter notebooks. They save time because instead of executing the entire program, we can run individual chunks of code.
But to be truly productive with them, we need to know the shortcuts; otherwise we can still waste a lot of time. A few examples follow the link below.
Here is the resource to learn shortcuts.
28 Jupyter Notebook tips, tricks, and shortcuts: https://lnkd.in/f6VczRV
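As a taste, here are a few built-in keyboard shortcuts and IPython magics that work in any notebook cell (my own picks, not necessarily the ones from the linked article):

```python
# Built-in IPython magics, runnable in any Jupyter notebook cell.
%timeit sum(range(1000))   # micro-benchmark a single statement
%who                       # list variables defined in the current session
%lsmagic                   # list all available magic commands

# A few command-mode keyboard shortcuts:
#   A / B         insert a cell above / below
#   M / Y         convert the cell to Markdown / code
#   Shift+Enter   run the cell and move to the next one
```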
#datascience #python #machinelearning #artificialintelligence #data #deeplearning #jupyternotebook
✴️ @AI_Python_EN
❇️ @AI_Python
🗣 @AI_Python_arXiv
The Best #FREE Books for Learning #DataScience
Link => bit.ly/AIFreeBooks
#ai #analytics #artificialintelligence #bi #bigdata #data #machinelearning
✴️ @AI_Python_EN
❇️ @AI_Python
🗣 @AI_Python_arXiv
Introducing TensorFlow Datasets
By TensorFlow: https://lnkd.in/d2yEjSr
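For a quick taste, loading a dataset is a one-liner (a minimal sketch; the dataset choice is my own example, not from the announcement):

```python
# Minimal TensorFlow Datasets sketch. Requires:
#   pip install tensorflow tensorflow-datasets
import tensorflow_datasets as tfds

# Download MNIST (example choice) and get a ready-made tf.data.Dataset.
ds = tfds.load("mnist", split="train", as_supervised=True)

for image, label in ds.take(1):
    print(image.shape, label.numpy())  # (28, 28, 1) and an integer label
```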
#MachineLearning #Data #Dataset #TensorFlow
✴️ @AI_Python_EN
❇️ @AI_Python
🗣 @AI_Python_arXiv
"AI Needs Better Data, Not Just More Data"
> https://lnkd.in/gR5E7Re
#AI #ArtificialIntelligence #MI #MachineIntelligence
#ML #MachineLearning #DataScience #Analytics
#Data #BigData #IoT #4IR #DataPedigree
#Veracity #Trust #DataQuality #BetterData
✴️ @AI_Python_EN
One of my favorite tricks is adding a constant to each of the independent variables in a regression so as to shift the intercept. Of course, just shifting the data will not change R-squared, slopes, F-statistics, p-values, etc., so why do it?
Because just about any software package capable of doing regression, even Excel, can give you standard errors and confidence intervals for the intercept, but it is much harder to get most packages to give you standard errors and confidence intervals around the predicted value of the dependent variable for OTHER combinations of the independent variables. Shifting the intercept is an easy way to get confidence intervals for arbitrary combinations of the independent variables.
This sort of thing becomes especially important at a time when the statistics community is loudly calling for a move away from p-values. Instead, it is recommended that researchers give confidence intervals in clinically meaningful terms.
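To make the trick concrete, here's a minimal sketch in Python with statsmodels (the library choice, toy data, and target x-value are my own illustrative assumptions; the same idea works in R or Excel):

```python
# Intercept-shifting trick: re-center x at the value of interest, so the
# intercept (and its confidence interval) becomes the predicted mean there.
# Library choice and toy data are illustrative assumptions.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
x = rng.uniform(0, 10, 100)
y = 2.0 + 0.5 * x + rng.normal(0, 1.0, 100)

x0 = 7.0  # the "other" value of the independent variable we care about

X = sm.add_constant(x - x0)   # shift so that x = x0 becomes the new zero
fit = sm.OLS(y, X).fit()

print(fit.params[0])      # predicted mean of y at x = x0
print(fit.conf_int()[0])  # 95% CI for that prediction, read off the intercept
```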
#data #researchers #statistics #r #excel #regression
✴️ @AI_Python_EN