As the author states: "work in process and even in an early dirty phase"
But still very cool 🙂
Book: Predictive Models: Visual Exploration, Explanation and Debugging. With examples in R and Python. By Przemyslaw Biecek
#book #datascience #machinelearning #statistics #programming_language
🌎 Book
✴️ @AI_Python_EN
Transfer Learning: classification of 4 different types of Arctic dog using the Fast.AI library
#machinelearning #DeepLearning #TransferLearning
🌎 Transfer-Learning
✴️ @AI_Python_EN
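The post above doesn't include code, so here is a minimal sketch of the same transfer-learning setup using the fastai vision API; the arctic_dogs/ folder and its per-breed subfolders are assumptions for illustration, not the author's data.
```python
# Minimal transfer-learning sketch with fastai (illustrative, not the author's notebook).
from fastai.vision.all import *

path = Path("arctic_dogs")  # assumed layout: husky/, malamute/, samoyed/, laika/
dls = ImageDataLoaders.from_folder(
    path, valid_pct=0.2, seed=42, item_tfms=Resize(224)
)

# Start from ImageNet weights, then fine-tune on the dog images.
learn = vision_learner(dls, resnet34, metrics=accuracy)
learn.fine_tune(3)

learn.show_results()  # inspect a few predictions
```
With per-class subfolders like this, fine_tune first trains the new head with the pretrained backbone frozen, then unfreezes and trains the whole network.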
Unified Language Model Pre-training for Natural Language Understanding and Generation
Dong et al.: https://lnkd.in/ez6xBKR
#ArtificialIntelligence #DeepLearning #MachineLearning
✴️ @AI_Python_EN
A Very Short History of Artificial Neural Networks
🌎 History of Artificial Neural Networks
#ArtificialNeuralNetwork #neuralnetworks #ReinforcementLearning #ANN #NN
✴️ @AI_Python_EN
spy.zip
3.6 KB
Python Remote Access Trojan
The author does not hold any responsibility for misuse of this tool; remember, it is for educational purposes only.
#python #trojan #virus
✴️ @AI_Python_EN
Python for Data Analysis.pdf
1.1 MB
Python for Data Analysis by Boston University
#datascience #dataanalysis #dataanalytics #python #analytics
✴️ @AI_Python_EN
“Generative models in Tensorflow 2”
GitHub, by Tim Sainburg: https://lnkd.in/eAHD5Ew
#deeplearning #generativeadversarialnetworks #tensorflow #technology
✴️ @AI_Python_EN
Nice new feature in TensorFlow: weight pruning to shrink your models and get faster inference out of the box. The API is built on top of Keras, so it is very easy to apply to any Keras-trained model. Check out the article for more information. #deeplearning #machinelearning
Article: https://lnkd.in/dzF8xRJ
✴️ @AI_Python_EN
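To make the idea concrete, a minimal sketch of applying the Keras pruning API via the tensorflow_model_optimization package; the toy model and schedule values are illustrative assumptions, not taken from the article.
```python
# Illustrative weight-pruning sketch with tensorflow_model_optimization.
import tensorflow as tf
import tensorflow_model_optimization as tfmot

base_model = tf.keras.Sequential([
    tf.keras.Input(shape=(784,)),
    tf.keras.layers.Dense(128, activation="relu"),
    tf.keras.layers.Dense(10, activation="softmax"),
])

# Gradually zero out weights during training until 50% sparsity is reached.
pruning_schedule = tfmot.sparsity.keras.PolynomialDecay(
    initial_sparsity=0.0, final_sparsity=0.5, begin_step=0, end_step=1000
)
pruned_model = tfmot.sparsity.keras.prune_low_magnitude(
    base_model, pruning_schedule=pruning_schedule
)

pruned_model.compile(optimizer="adam",
                     loss="sparse_categorical_crossentropy",
                     metrics=["accuracy"])

# UpdatePruningStep keeps the pruning schedule in sync with training steps.
callbacks = [tfmot.sparsity.keras.UpdatePruningStep()]
# pruned_model.fit(x_train, y_train, epochs=2, callbacks=callbacks)

# Remove the pruning wrappers before export so only the sparse weights remain.
final_model = tfmot.sparsity.keras.strip_pruning(pruned_model)
```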
What is pip?
A Guide for New Pythonistas.
Credits - Isaac Rodriguez
Link - https://lnkd.in/fPf8MWZ
#python #pythonprogramming
✴️ @AI_Python_EN
120 Machine Learning business ideas from the latest McKinsey report
#machinelearning #artificialintelligence #datascience #ml #ai #deeplearning
✴️ @AI_Python_EN
Andriy Burkov
I often receive questions from people in my network about what they should learn and master to become a data scientist. While I personally think that the term "data scientist" is very unfortunate and lacks a clear definition, this is what a good modern #dataanalyst has to master:
#DataScience
– Data structures (local and distributed)
– Data indexing
– Data privacy and anonymization
– Data lifecycle management
– Data transformation (deduplication, handling outliers, and missing values, dimensionality reduction)
– Data analysis (experiment design, classification, regression, unsupervised methods)
– #Machinelearning methods (feature engineering, regularization, hyperparameter tuning, ensemble methods, and #neuralnetworks)
– Computer and database programming, numerical optimization
– Distributed data processing
– Real-time and high-frequency data processing
– Linux (my personal bias)
A modern data analyst also has to be a good popularizer of complex ideas. Having a Ph.D. is not a requirement, but a very big plus: it contributes to the popularizing skill and teaches the scientific approach to problem-solving.
✴️ @AI_Python_EN
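As a small illustration of the "data transformation" item in the list above, a minimal pandas sketch of deduplication, missing-value imputation, and outlier clipping; the DataFrame and column names are made up for the example.
```python
# Hypothetical toy data illustrating basic data transformation steps.
import numpy as np
import pandas as pd

df = pd.DataFrame({
    "user_id": [1, 1, 2, 3, 3],
    "age":     [34, 34, np.nan, 29, 29],
    "income":  [52_000, 52_000, 61_000, np.nan, np.nan],
})

df = df.drop_duplicates()                                   # deduplication
df["age"] = df["age"].fillna(df["age"].median())            # impute missing ages
df["income"] = df["income"].fillna(df["income"].median())   # impute missing incomes

# Simple outlier handling: clip extreme incomes to the 1st/99th percentiles.
low, high = df["income"].quantile([0.01, 0.99])
df["income"] = df["income"].clip(low, high)
```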
San Francisco became the first major U.S. city to ban the use of facial recognition technology by police and other municipal agencies
https://www.nytimes.com/2019/05/14/us/facial-recognition-ban-san-francisco.html?smtyp=cur&smid=tw-nytimes
✴️ @AI_Python_EN
A #Keras usage pattern that allows for maximum flexibility when defining arbitrary losses and metrics (that don't match the usual signature) is the "endpoint layer" pattern. It works like this: https://colab.research.google.com/drive/1zzLcJ2A2qofIvv94YJ3axRknlA6cBSIw
In short, you use add_loss/add_metric inside an "endpoint layer" that also has access to the model targets. The layer then returns the inference-time predictions. You compile without an external "loss" argument, and you fit with a dictionary of data that contains the targets.
Of course, logistic regression is a basic case that doesn't actually need this advanced pattern. But endpoint layers will work every time, even when you have losses & metrics that don't match the usual fn(y_true, y_pred, sample_weight) signature that compile requires.
✴️ @AI_Python_EN
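For reference, a minimal sketch of the endpoint-layer pattern applied to logistic regression, in the spirit of the linked notebook but not a copy of it; the layer and input names are illustrative, and it assumes TF 2.x Keras where add_loss/add_metric are available inside layers.
```python
# Illustrative endpoint-layer sketch: loss and metric are registered inside the layer.
import numpy as np
import tensorflow as tf
from tensorflow import keras

class LogisticEndpoint(keras.layers.Layer):
    """Computes loss and metrics internally via add_loss/add_metric."""

    def __init__(self, name="logistic_endpoint"):
        super().__init__(name=name)
        self.loss_fn = keras.losses.BinaryCrossentropy(from_logits=True)
        self.acc_fn = keras.metrics.BinaryAccuracy()

    def call(self, targets, logits):
        probs = tf.nn.sigmoid(logits)
        # Register the training loss and a metric here instead of in compile().
        self.add_loss(self.loss_fn(targets, logits))
        self.add_metric(self.acc_fn(targets, probs), name="accuracy")
        # Return inference-time predictions.
        return probs

inputs = keras.Input(shape=(3,), name="inputs")
targets = keras.Input(shape=(1,), name="targets")
logits = keras.layers.Dense(1)(inputs)
predictions = LogisticEndpoint()(targets, logits)

model = keras.Model(inputs=[inputs, targets], outputs=predictions)
model.compile(optimizer="adam")  # no external "loss" argument

# Fit with a dict of data that also contains the targets.
data = {"inputs": np.random.random((64, 3)),
        "targets": np.random.randint(0, 2, (64, 1)).astype("float32")}
model.fit(data, epochs=1)
```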
Build a chat widget with Python and JavaScript
http://bit.ly/2JnD8d0
#python #javascript #development
http://bit.ly/2JI78jc
✴️ @AI_Python_EN
Natural Language Processing with Deep Learning in Python ☞ http://bit.ly/2HlcwXV #DeepLearning #TensorFlow