#Datascience needs to move beyond #research to actually make a real impact in the #AI economy.
Agree?
#DeepLearning #artificialintelligence #machinelearning
@AI_Python_EN
#DeepLearning is fun when you have loads of GPUs!
Here's a 256 GB, 8-GPU cluster we will soon be testing as well.
#gpu #nvidia #research
#machinelearning
@AI_Python_EN
Machine Learning (ML) & Artificial Intelligence (AI): From Black Box to White Box Models in 4 Steps - Resources for Explainable AI & ML Model Interpretability.
STEP 1 - ARTICLES
- (short) KDnuggets article: https://lnkd.in/eRyTXcQ
- (long) O'Reilly article: https://lnkd.in/ehMHYsr
STEP 2 - BOOKS
- Interpretable Machine Learning: A Guide for Making Black Box Models Explainable (free e-book): https://lnkd.in/eUWfa5y
- An Introduction to Machine Learning Interpretability: An Applied Perspective on Fairness, Accountability, Transparency, and Explainable AI (free e-book): https://lnkd.in/dJm595N
STEP 3 - COLLABORATE
- Join Explainable AI (XAI) Group: https://lnkd.in/dQjmhZQ
STEP 4 - PRACTICE
- Hands-On Practice: Open-Source Tools & Tutorials for ML Interpretability (Python/R): https://lnkd.in/d5bXgV7 (a minimal sketch follows after this list)
- Python Jupyter Notebooks: https://lnkd.in/dETegUH
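A minimal hands-on sketch of the Step 4 idea, using scikit-learn's permutation importance (my own illustration; the linked tutorials may use other tools such as LIME or SHAP):

# Train a "black box" model, then ask which features it actually relies on.
from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import RandomForestClassifier
from sklearn.inspection import permutation_importance
from sklearn.model_selection import train_test_split

X, y = load_breast_cancer(return_X_y=True, as_frame=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

model = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_train, y_train)

# Permutation importance: how much does shuffling each feature hurt held-out accuracy?
result = permutation_importance(model, X_test, y_test, n_repeats=10, random_state=0)
top = sorted(zip(X.columns, result.importances_mean), key=lambda t: -t[1])[:5]
for name, score in top:
    print(f"{name}: {score:.3f}")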
#machinelearning #datascience #analytics #bigdata #statistics #artificialintelligence #ai #datamining #deeplearning #neuralnetworks #interpretability #science #research #technology #business #healthcare
@AI_Python_EN
Today, #LIDAR is used in all autonomous cars except Tesla's.
Lidar sensors are big, bulky, expensive, and ugly to look at. Not only that, they do a poor job in snow, sleet, hail, smoke, and smog. If you can't see the road ahead, neither can LIDAR!
That last part is one of the reasons Elon Musk refuses to incorporate lidar sensors into the self-driving hardware package for Tesla cars.
Apple and Cornell University have solved the problem of depth precision, and this paves the way for faster adoption of safer yet cheaper self-driving cars!
Read more here: https://lnkd.in/dZgS6id
Research paper: https://lnkd.in/djRhzq3
#research #selfdriving #deeplearning
@AI_Python_EN
What type of presenter are you?
Are you a "diva", a "penguin" or "Mr. Toscanini"?
Presenting your #MachineLearning #AI #research or project is an art you must master to succeed.
In our internal lectures and classes, we do our best to teach our team members to develop a great storyline and present like a star.
#presentationskills #AI #softskills
@AI_Python_EN
"Training Neural Nets on Larger Batches: Practical Tips for 1-GPU, Multi-GPU & Distributed setups"
By Thomas Wolf: https://lnkd.in/etyMzjQ
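One of the practical tips covered there is gradient accumulation, i.e. simulating a large batch on a single GPU by summing gradients over several small batches before each optimizer step. A minimal PyTorch sketch of that idea (my illustration, not code from the article):

import torch
import torch.nn as nn

model = nn.Linear(10, 2)
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)
loss_fn = nn.CrossEntropyLoss()
accumulation_steps = 4  # effective batch size = 4 x the per-step batch size

optimizer.zero_grad()
for step in range(100):
    x = torch.randn(8, 10)                 # stand-in mini-batch
    y = torch.randint(0, 2, (8,))
    loss = loss_fn(model(x), y) / accumulation_steps  # scale so accumulated gradients average out
    loss.backward()                        # gradients accumulate in the .grad buffers
    if (step + 1) % accumulation_steps == 0:
        optimizer.step()
        optimizer.zero_grad()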
#ArtificialIntelligence #DeepLearning #MachineLearning #NeuralNetworks #Research
@AI_Python_EN
RetinaFace: Single-stage Dense Face Localisation in the Wild.
Though tremendous strides have been made in uncontrolled face detection, accurate and efficient face localisation in the wild remains an open challenge.
This paper presents a robust single-stage face detector, named RetinaFace, which performs pixel-wise face localisation on various scales of faces by taking advantage of joint extra-supervised and self-supervised multi-task learning.
Paper: https://lnkd.in/dF48muv
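Very loosely, the joint multi-task training amounts to optimizing a weighted sum of a face-classification loss, a box-regression loss, and a landmark-regression loss per anchor. The sketch below is a conceptual illustration only (placeholder weights, no anchor matching, and none of the paper's self-supervised mesh branch), not the authors' implementation:

import torch
import torch.nn.functional as F

def multitask_loss(cls_logits, box_pred, lmk_pred,
                   cls_target, box_target, lmk_target,
                   w_box=0.25, w_lmk=0.1):         # weights are placeholders
    # cls_logits: (N, 2); box_pred: (N, 4); lmk_pred: (N, 10) for five (x, y) landmarks
    cls_loss = F.cross_entropy(cls_logits, cls_target)
    pos = cls_target == 1                          # regression losses only on positive anchors
    zero = box_pred.sum() * 0                      # keeps the graph valid if there are no positives
    box_loss = F.smooth_l1_loss(box_pred[pos], box_target[pos]) if pos.any() else zero
    lmk_loss = F.smooth_l1_loss(lmk_pred[pos], lmk_target[pos]) if pos.any() else zero
    return cls_loss + w_box * box_loss + w_lmk * lmk_loss

# Toy usage with random tensors
N = 16
loss = multitask_loss(torch.randn(N, 2), torch.randn(N, 4), torch.randn(N, 10),
                      torch.randint(0, 2, (N,)), torch.randn(N, 4), torch.randn(N, 10))
print(loss.item())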
#deeplearning #facerecognition #research
@AI_Python_EN
Fundamentals of Clinical Data Science (Open-Access Book) - for healthcare & IT professionals: https://lnkd.in/eacNnjz
For more interesting & helpful content on healthcare & data science, follow me and Brainformatika on LinkedIn.
Table of Contents
Part I. Data Collection
- Data Sources
- Data at Scale
- Standards in Healthcare Data
- Research Data Stewardship for Healthcare Professionals
- The EU's General Data Protection Regulation (GDPR) in a Research Context
Part II. From Data to Model
- Preparing Data for Predictive Modelling
- Extracting Features from Time Series
- Prediction Modeling Methodology
- Diving Deeper into Models
- Reporting Standards & Critical Appraisal of Prediction Models
Part III. From Model to Application
- Clinical Decision Support Systems
- Mobile Apps
- Optimizing Care Processes with Operational Excellence & Process Mining
- Value-Based Health Care Supported by Data Science
#healthcare #datascience #digitalhealth #analytics #machinelearning #bigdata #populationhealth #ai #medicine #informatics #artificialintelligence #research #precisionmedicine #publichealth #science #health #innovation #technology #informationtechnology
@AI_Python_EN
The Best and Most Current of Modern Natural Language Processing
Blog by Victor Sanh: https://lnkd.in/emch8gG
#NaturalLanguageProcessing #MachineLearning #NLP #DeepLearning #Research
@AI_Python_EN
Have you heard of the "R-Transformer", a Recurrent Neural Network Enhanced Transformer?
Recurrent Neural Networks have long been the dominant choice for sequence modeling. However, they suffer from two serious issues: they struggle to capture very long-term dependencies, and their sequential computation cannot be parallelized.
Therefore, many non-recurrent sequence models built on convolution and attention operations have been proposed recently.
Here the authors propose the R-Transformer, which enjoys the advantages of both RNNs and the multi-head attention mechanism while avoiding their respective drawbacks.
The proposed model can effectively capture both local structures and global long-term dependencies in sequences without using position embeddings. The authors evaluated the R-Transformer through extensive experiments on data from a wide range of domains, and the empirical results show that it outperforms state-of-the-art methods by a large margin on most tasks.
Github code: https://lnkd.in/dpFckix
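A rough PyTorch sketch of the core idea as described in the abstract: a local RNN over short sliding windows captures local structure, and multi-head self-attention then captures global dependencies. This is my illustration with placeholder hyperparameters, not the authors' implementation (see their repository for that):

import torch
import torch.nn as nn

class LocalRNNAttentionBlock(nn.Module):
    def __init__(self, d_model=64, window=5, n_heads=4):
        super().__init__()
        self.window = window
        self.rnn = nn.GRU(d_model, d_model, batch_first=True)
        self.attn = nn.MultiheadAttention(d_model, n_heads, batch_first=True)
        self.norm1 = nn.LayerNorm(d_model)
        self.norm2 = nn.LayerNorm(d_model)

    def forward(self, x):                            # x: (batch, seq_len, d_model)
        B, T, D = x.shape
        # Pad on the left so every position has a full local window.
        padded = torch.cat([x.new_zeros(B, self.window - 1, D), x], dim=1)
        windows = padded.unfold(1, self.window, 1)        # (B, T, D, window)
        windows = windows.permute(0, 1, 3, 2).reshape(B * T, self.window, D)
        local = self.rnn(windows)[1][-1].view(B, T, D)    # last hidden state of each local window
        h = self.norm1(x + local)                         # local features with a residual connection
        attn_out, _ = self.attn(h, h, h)                  # global dependencies via self-attention
        return self.norm2(h + attn_out)

x = torch.randn(2, 20, 64)
print(LocalRNNAttentionBlock()(x).shape)              # torch.Size([2, 20, 64])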
#research #algorithms #machinelearning #deeplearning #rnn
@AI_Python_EN