Top 10 Data Visualization Tools for Every Data Scientist
At present, data scientist is one of the most sought-after professions. That's one of the main reasons we decided to cover the latest data visualization tools that every data scientist can use to make their work more effective.
1. Tableau
2. D3
3. QlikView
4. Microsoft Power BI
5. Datawrapper
6. ECharts
7. Plotly
8. Sisense
9. FusionCharts
10. Highcharts
----------
Via: @cedeeplearning
Other social media: https://linktr.ee/cedeeplearning
link: https://www.kdnuggets.com/2020/05/top-10-data-visualization-tools-every-data-scientist.html
#datascience #visualization #datatools
#machinelearning #tableau #powerbi
Deep Learning: The Free eBook
"Deep Learning" is the quintessential book for understanding deep learning theory, and you can still read it freely online.
The book's table of contents:
Introduction
Part I: Applied Math and Machine Learning Basics
Linear Algebra
Probability and Information Theory
Numerical Computation
Machine Learning Basics
Part II: Modern Practical Deep Networks
Deep Feedforward Networks
Regularization for Deep Learning
Optimization for Training Deep Models
Convolutional Networks
Sequence Modeling: Recurrent and Recursive Nets
Practical Methodology
Applications
Part III: Deep Learning Research
Linear Factor Models
Autoencoders
Representation Learning
Structured Probabilistic Models for Deep Learning
Monte Carlo Methods
Confronting the Partition Function
Approximate Inference
Deep Generative Models
----------
Via: @cedeeplearning
https://www.kdnuggets.com/2020/05/deep-learning-free-ebook.html
#deeplearning #AI
#neuralnetworks #ebook #machinelearning
Basics of Neural Network Programming
by Andrew Ng
Source: Coursera
Neural Networks and Deep Learning
Lecture 3: Logistic Regression
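For anyone who wants to follow along in code, here is a minimal NumPy sketch of the logistic regression forward pass and binary cross-entropy loss covered in this lecture (an illustration written for this post, not code from the course):

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def predict(w, b, X):
    """Predicted probabilities for inputs X of shape (m, n_features)."""
    return sigmoid(X @ w + b)

def loss(y_hat, y):
    """Binary cross-entropy averaged over the batch."""
    eps = 1e-12  # avoid log(0)
    return -np.mean(y * np.log(y_hat + eps) + (1 - y) * np.log(1 - y_hat + eps))

# Tiny example: 4 samples, 2 features, untrained parameters
X = np.array([[0.5, 1.2], [1.0, -0.3], [-0.7, 0.8], [0.1, 0.1]])
y = np.array([1, 0, 1, 0])
w, b = np.zeros(2), 0.0
print(loss(predict(w, b, X), y))  # ~0.693 (= ln 2) before any training
```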
----------
Via: @cedeeplearning
Other social media: https://linktr.ee/cedeeplearning
#DeepLearning #NeuralNetworks
#machinelearning #AI #coursera
#free #python #supervised_learning
Cutting Edge Deep Learning pinned «Hi guys! From today we'll be uploading the "Introduction to Deep Learning" course by prof. Andrew Ng (Stanford lecturer and cofounder of Coursera, deeplearning.ai, etc.). Make sure to send this awesome course to your friends. If you have any suggestion or…»
LSTM for time series prediction
By Roman Orac
Learn how to develop an LSTM neural network with #PyTorch on trading data to predict future prices by mimicking the actual values of the time series.
In this blog post, I am going to train a Long Short-Term Memory (LSTM) neural network with PyTorch on Bitcoin trading data and use it to predict the price of unseen trading data. I had quite some difficulty finding intermediate tutorials with a repeatable example of training an #LSTM for time series prediction, so I've put together a #Jupyter notebook to help you get started.
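As a taste of what the notebook covers, here is a minimal PyTorch sketch of an LSTM regressor trained on sliding windows of a univariate series (the synthetic data, window size, and layer sizes below are illustrative assumptions, not the post's exact code):

```python
import torch
import torch.nn as nn

class LSTMRegressor(nn.Module):
    def __init__(self, n_features=1, hidden_size=32):
        super().__init__()
        self.lstm = nn.LSTM(n_features, hidden_size, batch_first=True)
        self.head = nn.Linear(hidden_size, 1)

    def forward(self, x):             # x: (batch, seq_len, n_features)
        out, _ = self.lstm(x)
        return self.head(out[:, -1])  # predict the next value from the last time step

# Synthetic "price" series and sliding windows of length 20
series = torch.sin(torch.linspace(0, 20, 500)).unsqueeze(-1)
window = 20
X = torch.stack([series[i:i + window] for i in range(len(series) - window)])
y = torch.stack([series[i + window] for i in range(len(series) - window)])

model = LSTMRegressor()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()
for epoch in range(5):                # a few full-batch epochs are enough for the toy series
    opt.zero_grad()
    loss = loss_fn(model(X), y)
    loss.backward()
    opt.step()
    print(epoch, loss.item())
```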
----------
Via: @cedeeplearning
https://www.kdnuggets.com/2020/04/lstm-time-series-prediction.html
#deeplearning #AI #machinelearning
#neuralnetworks #timeseries
Deep Learning: powering machines with human intelligence
by Ashish Sukhadeve
Deep learning uses neural networks with many intermediate layers of artificial "neurons" between the input and the output, inspired by the human brain. The technology excels at using these layers to model extremely complicated relationships in order to classify and predict things.
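To make "many intermediate layers" concrete, here is a minimal PyTorch sketch of a small feed-forward classifier with a few hidden layers between input and output (layer sizes are arbitrary and purely illustrative):

```python
import torch
import torch.nn as nn

# A small stack of "intermediate layers" between a 16-dim input and 3 output classes
model = nn.Sequential(
    nn.Linear(16, 64), nn.ReLU(),   # hidden layer 1
    nn.Linear(64, 64), nn.ReLU(),   # hidden layer 2
    nn.Linear(64, 3),               # output layer: class scores
)

x = torch.randn(8, 16)              # batch of 8 examples
probs = torch.softmax(model(x), dim=1)
print(probs.shape)                  # torch.Size([8, 3])
```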
Deep learning is making a significant contribution to the business world, and the economy is already beginning to feel the impact. The deep learning market is expected to reach $18.2 billion by 2023, up from $3.2 billion in 2018, growing at a CAGR of 41.7%. The confluence of three factors (the rise of big data, the emergence of powerful graphics processing units (GPUs), and the increasing adoption of cloud computing) is fuelling the rapid growth of deep learning.
----------
Via: @cedeeplearning
Other social media: https://linktr.ee/cedeeplearning
link: https://www.analyticsinsight.net/deep-learning-powering-machines-with-human-intelligence/
#deeplearning #machinelearning
#neuralnetworks
#business #market
Are facial datasets sufficient to perform sentiment analysis using emotional AI?
by Smriti Srivastava
According to Harvard Business Review, the days when #AI technology will be able to "recognize, process, and simulate" human emotions are not far away. Emotional AI has already carved out a unique place in the affective computing market, and its market size is predicted to grow to about US$41 billion by 2022.
The technology of #sentiment_analysis, or emotion analysis, offers valuable insights into rapidly growing customer service issues, making it easier to identify and act on the root cause of issues, or even to mitigate them before they reach critical mass.
----------
Via: @cedeeplearning
https://www.analyticsinsight.net/facial-datasets-sufficient-perform-sentiment-analysis-using-emotional-ai/
#emotional_ai #deeplearning
#machinelearning #sentiment
#facial_recognition
Speeding Up Deep Learning Inference Using TensorRT
This version starts from a #PyTorch model instead of the #ONNX model, upgrades the sample application to use #TensorRT 7, and replaces the ResNet-50 #classification model with UNet, a segmentation model.
NVIDIA TensorRT is an SDK for deep learning inference. TensorRT provides APIs and parsers to import trained models from all major deep learning frameworks. It then generates optimized runtime engines deployable in the data center as well as in automotive and embedded environments.
A simple TensorRT workflow (a minimal export sketch follows the list):
1. Convert the pretrained image segmentation PyTorch model into ONNX.
2. Import the ONNX model into TensorRT.
3. Apply optimizations and generate an engine.
4. Perform inference on the GPU.
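As an illustration of step 1 only (not the exact code from the NVIDIA post), here is a minimal PyTorch-to-ONNX export sketch; the tiny stand-in network and file names are assumptions, and the resulting .onnx file is what TensorRT's ONNX parser (or the trtexec tool) would then consume:

```python
import torch
import torch.nn as nn

# Tiny stand-in network (the NVIDIA post uses a pretrained UNet segmentation model instead)
model = nn.Sequential(
    nn.Conv2d(3, 8, 3, padding=1), nn.ReLU(),
    nn.Conv2d(8, 1, 3, padding=1),            # 1-channel "mask" output
)
model.eval()

dummy = torch.randn(1, 3, 224, 224)           # one RGB image, NCHW layout
torch.onnx.export(
    model, dummy, "segmentation.onnx",
    opset_version=11,
    input_names=["input"], output_names=["output"],
)
# The exported file can then be imported into TensorRT's ONNX parser or, for a quick test,
# built into an engine with the trtexec CLI (assumption; flags vary by TensorRT version):
#   trtexec --onnx=segmentation.onnx --saveEngine=segmentation.engine
```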
Do not miss this article!
----------
Via: @cedeeplearning
Other social media: https://linktr.ee/cedeeplearning
link: https://devblogs.nvidia.com/speeding-up-deep-learning-inference-using-tensorrt/
#NVIDIA #deeplearning #neuralnetworks #python
#machinelearning #AI
NLP.pdf
701.7 KB
A Primer on Neural Network Models for Natural Language Processing
This tutorial covers input encoding for natural language tasks, feed-forward networks, convolutional networks, recurrent networks and recursive networks, as well as the computation graph abstraction for automatic gradient computation.
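As a small companion to the primer's first two topics (input encoding plus a feed-forward network), here is a hedged PyTorch sketch of a bag-of-embeddings sentence classifier; the vocabulary size, dimensions, and labels are illustrative assumptions, not taken from the paper:

```python
import torch
import torch.nn as nn

VOCAB, EMB, CLASSES = 1000, 50, 2      # toy sizes, purely illustrative

class BagOfEmbeddings(nn.Module):
    def __init__(self):
        super().__init__()
        self.emb = nn.EmbeddingBag(VOCAB, EMB, mode="mean")   # encode token ids as an averaged embedding
        self.ff = nn.Sequential(nn.Linear(EMB, 64), nn.Tanh(), nn.Linear(64, CLASSES))

    def forward(self, token_ids, offsets):
        return self.ff(self.emb(token_ids, offsets))

model = BagOfEmbeddings()
tokens = torch.tensor([5, 42, 7, 13, 999])   # two "sentences" packed into one flat tensor
offsets = torch.tensor([0, 3])               # sentence 1 = tokens[0:3], sentence 2 = tokens[3:]
logits = model(tokens, offsets)
loss = nn.functional.cross_entropy(logits, torch.tensor([0, 1]))
loss.backward()                              # autograd builds the computation graph for us
print(logits.shape, loss.item())
```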
----------
Via: @cedeeplearning
#deeplearning #CNN #tutorial
#neuralnetworks #RNN #paper #nlp
How cognitive technologies are redefining the future of manufacturing
by Kanti S
As factories and equipment get smarter, armed with new technologies like IoT, AI, and cognitive automation, Industry 4.0 has finally arrived. Industry 4.0, a term coined in Germany for the computerization of manufacturing, has now grown into a worldwide phenomenon.
The Future is Bright for Cognitive Manufacturing
As the pace of technological advancement keeps increasing, cognitive manufacturing is a trend of today, redefining commonplace activities that technology could not have comprehended 10 or 15 years ago.
----------
Via: @cedeeplearning
https://www.analyticsinsight.net/how-cognitive-technologies-are-redefining-the-future-of-manufacturing/
#deeplearning
#cognitive #AI
#manufacturing
Basics of Neural Network Programming
by prof. Andrew Ng
Source: Coursera
Lecture 4: Binary Classification
Neural Networks and Deep Learning
----------
Via: @cedeeplearning
Other social media: https://linktr.ee/cedeeplearning
#DeepLearning #NeuralNetworks
#machinelearning #AI #coursera
#free #python #supervised_learning
#classification #binary
FineGym: A Hierarchical Video Dataset for Fine-grained Action Understanding
----------
@cedeeplearning
[Received highest review score at CVPR 2020] FineGym: A Hierarchical Video Dataset for Fine-grained Action Understanding
Page: https://sdolivia.github.io/FineGym/
Arxiv: https://arxiv.org/abs/2004.06704v1?utm_content=bufferd6581&utm_medium=social&utm_source=twitter.com&utm_campaign=buffer
(the author will be livestreaming at AISC: https://buff.ly/3feLIrk) -> Time: Friday 15-May-2020 04:00
The Chinese University of Hong Kong: http://www.cuhk.edu.hk/english/index.html
----------
Via: @cedeeplearning
Learning to Smell: Using Deep Learning to Predict the Olfactory Properties of Molecules
Smell is a sense shared by an incredible range of living organisms, and plays a critical role in how they analyze and react to the world. For humans, our sense of smell is tied to our ability to enjoy food and can also trigger vivid memories.
In "Machine Learning for Scent: Learning Generalizable Perceptual Representations of Small Molecules", we leverage graph neural networks (GNNs), a kind of deep neural network designed to operate on graphs as input, to directly predict the odor descriptors for individual molecules, without using any handcrafted rules.
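To give a feel for what "operating on graphs as input" means, here is a minimal PyTorch sketch of mean-aggregation message passing over a toy molecule graph, followed by pooling into a single molecule vector; this is a generic illustration, not Google's actual architecture or data:

```python
import torch
import torch.nn as nn

class MessagePassingLayer(nn.Module):
    """One round of mean-aggregation message passing: h_i' = ReLU(W [h_i ; mean_j h_j])."""
    def __init__(self, dim):
        super().__init__()
        self.update = nn.Linear(2 * dim, dim)

    def forward(self, h, adj):
        deg = adj.sum(dim=1, keepdim=True).clamp(min=1)
        neigh = (adj @ h) / deg                        # average the neighbors' features
        return torch.relu(self.update(torch.cat([h, neigh], dim=1)))

# Toy "molecule": 4 atoms with 8-dim features and a symmetric adjacency matrix
h = torch.randn(4, 8)
adj = torch.tensor([[0, 1, 0, 0],
                    [1, 0, 1, 1],
                    [0, 1, 0, 0],
                    [0, 1, 0, 0]], dtype=torch.float32)

layer = MessagePassingLayer(8)
h = layer(layer(h, adj), adj)                          # two rounds of message passing
graph_embedding = h.mean(dim=0)                        # pool node states into one molecule vector
odor_logits = nn.Linear(8, 5)(graph_embedding)         # 5 hypothetical odor descriptors (untrained head)
print(odor_logits.shape)                               # torch.Size([5])
```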
----------
Via: @cedeeplearning
https://ai.googleblog.com/2019/10/learning-to-smell-using-deep-learning.html
#deeplearning #neuralnetworks
#GNN #graph #network #machinelearning
Quantum Supremacy Using a Programmable #Superconducting #Processor
Physicists have been talking about the power of quantum computing for over 30 years, but the questions have always been: will it ever do something useful, and is it worth investing in? For such large-scale endeavors it is good engineering practice to formulate decisive short-term goals that demonstrate whether the designs are going in the right direction.
Today we published the results of this quantum supremacy experiment in the Nature article "Quantum Supremacy Using a Programmable Superconducting Processor". We developed a new 54-qubit processor, named "Sycamore", composed of fast, high-fidelity quantum logic gates, in order to perform the benchmark testing.
----------
Via: @cedeeplearning
https://ai.googleblog.com/2019/10/quantum-supremacy-using-programmable.html
#quantumcomputing #deeplearning
#machinelearning #neuralnetworks
#AI #sycamore #hardware
Machine Learning Resume Sample: How to Build a Strong ML Resume
- Tips for writing a machine learning resume
- The must-have skills for an AI resume
- Common skills that employers look for on an ML resume
- How to master programming languages
- Creating your machine learning resume
----------
Via: @cedeeplearning
https://www.mygreatlearning.com/blog/5-must-haves-machine-learning-resume/#tips
#resume #machinelearning
#datascience #skill #AI
#python #programming
Basics of Neural Network Programming
by prof. Andrew Ng
Source: Coursera
Lecture 5: Derivatives
Neural Networks and Deep Learning
----------
Via: @cedeeplearning
Other social media: https://linktr.ee/cedeeplearning
#DeepLearning #NeuralNetworks
#machinelearning #AI #coursera
#free #python #supervised_learning
#classification #binary
Build Your Own Object Detection Model Using the #TensorFlow API
The World of Object Detection
One of my favorite computer vision and deep learning concepts is object detection. The ability to build a model that can go through images and tell me what objects are present is a priceless feeling!
A nice article to read; a minimal sketch of running a pretrained detector follows below.
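The article builds a detector with the TensorFlow Object Detection API; as a lighter-weight illustration of the same idea, here is a hedged sketch that runs a pretrained detector from TensorFlow Hub (the model handle and output keys are assumptions based on common TF2 detection SavedModels, so verify them against the Hub page before relying on this):

```python
import tensorflow as tf
import tensorflow_hub as hub

# Pretrained SSD MobileNet V2 detector from TensorFlow Hub
# (model handle is an assumption; any TF2 detection SavedModel with this signature works)
detector = hub.load("https://tfhub.dev/tensorflow/ssd_mobilenet_v2/2")

# These detectors expect a batch of uint8 images with shape [1, height, width, 3]
image = tf.zeros([1, 320, 320, 3], dtype=tf.uint8)   # placeholder; load a real photo here
outputs = detector(image)

boxes = outputs["detection_boxes"][0]      # [N, 4] normalized (ymin, xmin, ymax, xmax)
scores = outputs["detection_scores"][0]    # [N] confidence per detection
classes = outputs["detection_classes"][0]  # [N] COCO class ids
keep = scores > 0.5                        # keep only confident detections
print(tf.boolean_mask(boxes, keep), tf.boolean_mask(classes, keep))
```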
----------
Via: @cedeeplearning
https://www.analyticsvidhya.com/blog/2020/04/build-your-own-object-detection-model-using-tensorflow-api/
#object_detection
#imagedetection
#deeplearning #computervision
#AI #machinelearning
#neuralnetworks
13 "Must-Read" Papers from AI Experts
1. Learning to Reinforcement Learn (2016) - Jane X Wang et al
2. Gradient-based Hyperparameter Optimization through Reversible Learning (2015) - Dougal Maclaurin
3. Long Short-Term Memory (1997) - Sepp Hochreiter and Jürgen Schmidhuber
4. Efficient Incremental Learning for Mobile Object Detection (2019) - Dawei Li et al
5. Emergent Tool Use From Multi-Agent Autocurricula (2019) - Bowen Baker et al
6. Open-endedness: The last grand challenge you've never heard of (2017) - Kenneth Stanley et al
7. Attention Is All You Need (2017) - Ashish Vaswani et al
8. Modeling yield response to crop management using convolutional neural networks (2020) - Andre Barbosa et al.
9. A comparison of deep learning performance against health-care professionals in detecting diseases from medical imaging: a systematic review and meta-analysis (2019) - Xiaoxuan Liu et al
10. The Next Decade in AI: Four Steps Towards Robust Artificial Intelligence (2020) - Gary Marcus
11. On the Measure of Intelligence (2019) - François Chollet
12. Tackling climate change with Machine Learning (2019) - David Rolnick, Priya L Donti, Yoshua Bengio et al.
13. The Netflix Recommender System: Algorithms, Business Value, and Innovation (2015) - Carlos Gomez-Uribe & Neil Hunt.
----------
Via: @cedeeplearning
https://blog.re-work.co/ai-papers-suggested-by-experts/
#paper #resource #free #AI
#machinelearning #datascience
Basics of Neural Network Programming
by prof. Andrew Ng
Source: Coursera
Lecture 6: Gradient Descent
Neural Networks and Deep Learning
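A minimal sketch of the gradient descent update rule covered in this lecture, applied to a one-dimensional quadratic cost so the path to the minimum is easy to follow (illustrative only, not the course's code):

```python
def cost(w):        # a simple convex cost: J(w) = (w - 3)^2
    return (w - 3.0) ** 2

def grad(w):        # dJ/dw = 2(w - 3)
    return 2.0 * (w - 3.0)

w = 0.0             # initial parameter
alpha = 0.1         # learning rate
for step in range(25):
    w = w - alpha * grad(w)     # the gradient descent update rule
print(w, cost(w))   # w approaches 3, the minimizer of the cost
```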
----------
Via: @cedeeplearning
Other social media: https://linktr.ee/cedeeplearning
#DeepLearning #NeuralNetworks
#machinelearning #AI #coursera
#free #python #supervised_learning
#classification #gradient