Here are 25 awesome #deeplearning datasets handpicked by our team! We have divided them into 3 categories: Image Processing, Natural Language Processing (#NLP) and Audio/Speech Processing.
https://bit.ly/2DrzUAM
✴️ @AI_Python_EN
Deep Classifiers Ignore Almost Everything They See (and how we may be able to fix it)
Blog by Jorn Jacobsen: https://lnkd.in/eNZt5mn
#MachineLearning #ArtificialIntelligence #ComputerVision #DeepLearning
✴️ @AI_Python_EN
Monte Carlo Neural Fictitious Self-Play: Achieve Approximate Nash equilibrium of Imperfect-Information Games
Zhang et al.: https://lnkd.in/eJQYNkD
#artificialintelligence #deeplearning #reinforcementlearning #selfplay
✴️ @AI_Python_EN
Visualizing memorization in RNNs
Blog by Andreas Madsen: https://lnkd.in/dAnHYcx
#artificialintelligence #deeplearning #machinelearning
✴️ @AI_Python_EN
This book is written for readers with no maths background, especially IT employees and students in class 9 and above;
it is also useful for non-technical business analysts and business leaders, as APIs for face and speech recognition and chatbots are discussed.
This book is for beginners in deep learning, TensorFlow, Keras, speech recognition, face recognition and chatbots; a minimal Keras sketch follows the chapter list below.
Chapter 0: Prerequisites of Deep Learning (NumPy, Pandas and Scikit-Learn)
Chapter 1: Basics of TensorFlow
Chapter 2: Understanding and Working with Keras
Chapter 3: Multilayer Perceptron
Chapter 4: Regression to MLP in TensorFlow
Chapter 5: Regression to MLP in Keras
Chapter 6: CNN with Visuals
Chapter 7: CNN with TensorFlow
Chapter 8: CNN with Keras
Chapter 9: RNN and LSTM in Visuals
Chapter 10: Speech to Text and Vice Versa
Chapter 11: Developing Chatbots
Chapter 12: Face Recognition
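For a flavour of the kind of material the Keras chapters build up to, here is a minimal regression-with-an-MLP sketch in the spirit of Chapter 5. It is an illustration of mine, not code from the book, and it assumes TensorFlow 2.x with the bundled Keras API.
```python
# Minimal MLP regression sketch (illustrative, not taken from the book).
# Assumes TensorFlow 2.x with the bundled Keras API.
import numpy as np
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Dense

# Toy regression data: y = 3x + noise
X = np.random.rand(1000, 1)
y = 3 * X[:, 0] + 0.1 * np.random.randn(1000)

model = Sequential([
    Dense(32, activation="relu", input_shape=(1,)),
    Dense(16, activation="relu"),
    Dense(1)  # single linear output for regression
])
model.compile(optimizer="adam", loss="mse")
model.fit(X, y, epochs=10, batch_size=32, verbose=0)

print(model.predict(np.array([[0.5]])))  # should be close to 1.5
```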
Github: https://lnkd.in/fH-SjSV
Book: https://lnkd.in/fwf2aiv
Please recommend it to school-going students as well.
We need feedback on how we can make the book more lucid.
#deeplearning #democratization #ai
✴️ @AI_Python_EN
How comfortable are you working on #UnsupervisedLearning problems? Here are 5 comprehensive tutorials to help you learn this critical topic (a short clustering sketch follows the list):
1. An Introduction to #Clustering and its Different Methods - https://bit.ly/2Fwykil
2. Exploring Unsupervised #DeepLearning #Algorithms for #ComputerVision - https://bit.ly/2HPWV39
3. Introduction to Unsupervised Deep Learning (with #Python codes) - https://bit.ly/2HDkMUA
4. Essentials of #MachineLearning Algorithms (with Python and R Codes) - https://bit.ly/2TQjJWW
5. An Alternative to Deep Learning? Guide to Hierarchical Temporal Memory (HTM) for Unsupervised Learning - https://bit.ly/2JzcyOR
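As a taste of tutorial 1's topic, here is a minimal k-means clustering sketch with scikit-learn; it is my own toy example, not code taken from the tutorials above.
```python
# Minimal k-means clustering sketch (illustrative; not from the tutorials above).
import numpy as np
from sklearn.cluster import KMeans

# Toy 2-D data: three blobs around different centres
rng = np.random.default_rng(0)
X = np.vstack([
    rng.normal(loc=(0, 0), scale=0.5, size=(50, 2)),
    rng.normal(loc=(5, 5), scale=0.5, size=(50, 2)),
    rng.normal(loc=(0, 5), scale=0.5, size=(50, 2)),
])

kmeans = KMeans(n_clusters=3, n_init=10, random_state=0)
labels = kmeans.fit_predict(X)          # cluster index for every point
print(kmeans.cluster_centers_)          # learned centres, one per cluster
```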
✴️ @AI_Python_EN
New work by Yao Qin, Nicholas Carlini, Ian Goodfellow, and Gary Cottrell on generating imperceptible, robust, and targeted adversarial examples for speech recognition systems! Paper: https://arxiv.org/abs/1903.10346 Audio samples: http://cseweb.ucsd.edu/~yaq007/imperceptible-robust-adv.html
✴️ @AI_Python_EN
Knowledge Graphs: The Third Era of Computing
Blog by Dan McCreary: https://lnkd.in/epZU-Yi
#MachineLearning #KnowledgeGraphs #AI #ProceduralProgramming
✴️ @AI_Python_EN
Francesco Cardinale
I'm happy to announce that we just open-sourced a major update for our image super-resolution project: using an adversarial network and convolutional feature maps for the loss, we got some interesting results in terms of realism and noise cancellation.
Pre-trained weights and GAN training code are available on GitHub!
If you want to read up on the process, check out the blog post.
Also, we released a pip package, 'ISR' (admittedly not the most creative name :D), with nice documentation and Colab notebooks so you can play around and experiment yourself on a free GPU (#mindblown). Thanks to Dat Tran for the big help. A short usage sketch follows the links below.
💻Blog: https://lnkd.in/dUnvXQZ
📝Documentation: https://lnkd.in/dAuu2Dk
🔤Github: https://lnkd.in/dmtV2ht
📕Colab (prediction): https://lnkd.in/dThVb_p
📘Colab (training): https://lnkd.in/diPTgWj
https://lnkd.in/dVBaKv4
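For the curious, prediction with the pip package looks roughly like the sketch below. The class name and weight identifier are taken from my reading of the ISR documentation and may differ between versions, so treat them as assumptions and check the docs linked above.
```python
# Rough ISR usage sketch; the import path and weights string are assumptions,
# check the documentation linked above for the current API.
import numpy as np
from PIL import Image
from ISR.models import RDN   # assumed import path from the ISR package

lr_img = np.array(Image.open("low_res.png"))   # hypothetical input file

rdn = RDN(weights="psnr-small")                # assumed pre-trained weight name
sr_img = rdn.predict(lr_img)                   # upscaled image as a numpy array

Image.fromarray(sr_img).save("super_res.png")
```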
#opensource #deeplearning #gans #machinelearning #keras
✴️ @AI_Python_EN
Abstract Art with ML: Compositional Pattern Producing Networks
Randomly initialised neural networks are able to produce visually appealing images. By Jan Hünermann: https://lnkd.in/dKQEpE6
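As a flavour of the idea, here is a tiny CPPN-style sketch: a randomly initialised MLP maps pixel coordinates (and their radius) to an RGB colour, which already yields smooth abstract patterns. This is a generic illustration of mine, not Jan Hünermann's implementation.
```python
# Tiny CPPN-style sketch: a random MLP maps (x, y, r) -> RGB per pixel.
# Generic illustration only, not the implementation from the linked post.
import numpy as np

H, W, HIDDEN = 256, 256, 32
rng = np.random.default_rng(42)

# Pixel coordinate grid in [-1, 1], plus distance from the centre.
ys, xs = np.mgrid[0:H, 0:W]
x = (xs / (W - 1)) * 2 - 1
y = (ys / (H - 1)) * 2 - 1
r = np.sqrt(x ** 2 + y ** 2)
inputs = np.stack([x, y, r], axis=-1).reshape(-1, 3)   # (H*W, 3)

# Randomly initialised 3-layer MLP with tanh activations.
W1 = rng.normal(size=(3, HIDDEN));      b1 = rng.normal(size=HIDDEN)
W2 = rng.normal(size=(HIDDEN, HIDDEN)); b2 = rng.normal(size=HIDDEN)
W3 = rng.normal(size=(HIDDEN, 3));      b3 = rng.normal(size=3)

h = np.tanh(inputs @ W1 + b1)
h = np.tanh(h @ W2 + b2)
rgb = (np.tanh(h @ W3 + b3) + 1) / 2                   # squash to [0, 1]

image = (rgb.reshape(H, W, 3) * 255).astype(np.uint8)  # ready to save or display
```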
#art #artificialintelligence #machinelearning
✴️ @AI_Python_EN
**** CS294-158 Deep Unsupervised Learning Spring 2019 ****
1) Week 1
https://lnkd.in/f8RbXpT
2) Week 2
https://lnkd.in/fig-KwB
3) Week 3
https://lnkd.in/fQkiQVh
4) Week 4
https://lnkd.in/f-euBDk
5) Week 5
https://lnkd.in/fz-Aqq2
6) Week 6
https://lnkd.in/fZpSG-Y
✴️ @AI_Python_EN
Teaching deep generative modelling techniques for NLP applications at Yandex NLP week: https://academy.yandex.ru/events/data_analysis/NLP_week/ All material is available from https://github.com/philschulz/VITutorial/tree/yandex2019/modules
The US Military Is Creating the Future of Employee Monitoring
"A new AI-enabled pilot project aims to sense “micro changes” in the behavior of people with top-secret clearances. If it works, it could be the future of corporate HR."
In Defense One: https://lnkd.in/e6iuJU6
#ArtificialIntelligence #DeepLearning #MachineLearning
✴️ @AI_Python_EN
"A new AI-enabled pilot project aims to sense “micro changes” in the behavior of people with top-secret clearances. If it works, it could be the future of corporate HR."
In Defense One: https://lnkd.in/e6iuJU6
#ArtificialIntelligence #DeepLearning #MachineLearning
✴️ @AI_Python_EN
Articles on Natural Language Processing (NLP)
by Vincent Granville
https://lnkd.in/fnBGQfM #technology #naturallanguageprocessing #artificialintelligence #businessintelligence #deeplearning #machinelearning #python #datascience
✴️ @AI_Python_EN
Word2vec was great ... in 2013. There's still a lot of value in the idea of embeddings, although for many NLP tasks you should really upgrade to deeper representations (check out transformer networks from 2017 and what followed them).
In any case, this is a modern analysis that should make clear what is possible with embeddings created by word2vec and similar techniques.
Do pay attention to the genealogy of techniques that followed word2vec, as the original idea is powerful.
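To ground what "value in the idea of embeddings" means in practice, here is a tiny sketch of the core operation: representing words as dense vectors and comparing them with cosine similarity. The vectors below are made up for illustration; in practice they would come from word2vec or a similar model.
```python
# Tiny embedding-similarity sketch; the vectors are made up for illustration,
# real ones would come from word2vec or a similar model.
import numpy as np

embeddings = {
    "king":  np.array([0.80, 0.65, 0.10]),
    "queen": np.array([0.78, 0.70, 0.12]),
    "apple": np.array([0.05, 0.10, 0.90]),
}

def cosine(u, v):
    return float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v)))

print(cosine(embeddings["king"], embeddings["queen"]))  # high: related words
print(cosine(embeddings["king"], embeddings["apple"]))  # low: unrelated words
```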
https://lnkd.in/f5KBf76
#neuralnetworks #artificialintelligence #machinelearning #ai
✴️ @AI_Python_EN
#AI hype can certainly be irritating, and many AI researchers have expressed concerns that it may lead to yet another AI winter.
At the opposite extreme are apocalyptic visions of a future in which "the machines" enslave or even eliminate humanity. Eschatology in its latest incarnation.
In keeping with ancient statistical traditions, I personally am wishy-washy about AI and technology in general. AI represents both an opportunity and a threat, but I don't know what the future holds in store for me or anyone else.
I'm not counseling indifference, though. As with any issue we regard as important, I think we should stay calm and do our homework.
✴️ @AI_Python_EN
Understanding the need to use the mean, median and mode.
Understanding the need to use the interquartile range rather than the full range.
Understanding when to use a line chart instead of a bar chart.
Understanding a model's parameters and the statistics behind it to build a better model.
All these questions and many more can be answered if we have a strong grasp of basic and intermediate statistics; a tiny worked example follows below.
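As a tiny worked example of the first two points, the snippet below shows how a single outlier drags the mean and the full range around while the median and interquartile range barely move:
```python
# Why the median and IQR are more robust than the mean and full range.
import numpy as np

clean = np.array([10, 12, 13, 14, 15, 16, 18])
with_outlier = np.append(clean, 500)   # one extreme value

for name, data in [("clean", clean), ("with outlier", with_outlier)]:
    q1, q3 = np.percentile(data, [25, 75])
    print(name,
          "mean=", round(data.mean(), 1),
          "median=", np.median(data),
          "range=", data.max() - data.min(),
          "IQR=", q3 - q1)
# The mean jumps from ~14 to ~75 and the range from 8 to 490,
# while the median and IQR stay almost unchanged.
```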
Even though I have a decent understanding of statistical concepts and always read up on the statistics behind the models I'm using, I picked up the Head First Statistics book this week.
After a long time, I felt I had spent a good two hours of the day learning something new outside my daily business analytics work.
I'll summarize what I learn in a blog post or article after completing the book, and I'd suggest it to anyone who wants to learn statistics for data science.
✴️ @AI_Python_EN
Pranav Dar, in his Medium post, has classified the pretrained models into three categories based on their application; here they are (a minimal loading sketch follows the list):
Multi-Purpose #NLP Models: ULMFiT, Transformer, Google’s BERT, Transformer-XL, OpenAI’s GPT-2.
Word Embeddings: ELMo, Flair.
Other Pretrained Models: StanfordNLP.
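As one concrete way to get started with a model like BERT, the sketch below loads a pretrained checkpoint via the Hugging Face transformers library. This is my own illustration of the general workflow, not code from Pranav Dar's post, and the package and model names are assumptions.
```python
# Loading a pretrained BERT with Hugging Face transformers (my own illustration,
# not from the linked post; package and model names are assumptions).
import torch
from transformers import AutoTokenizer, AutoModel

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModel.from_pretrained("bert-base-uncased")

inputs = tokenizer("Pretrained models save a lot of training time.",
                   return_tensors="pt")
with torch.no_grad():
    outputs = model(**inputs)

print(outputs.last_hidden_state.shape)  # (batch, tokens, hidden size)
```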
🌎 Link Review
✴️ @AI_Python_EN
Yoshua Bengio, Geoffrey Hinton and Yann LeCun, the fathers of #DeepLearning, receive the 2018 #ACMTuringAward for conceptual and engineering breakthroughs that have made deep neural networks a critical component of computing today. http://bit.ly/2HVJtdV
✴️ @AI_Python_EN
Tracking Progress in Natural Language Processing
By Sebastian Ruder: https://lnkd.in/e6tkGHH
#deeplearning #machinelearning #naturallanguageprocessing
✴️ @AI_Python_EN