ArtificialIntelligenceArticles
For those who have a passion for:
1. #ArtificialIntelligence
2. Machine Learning
3. Deep Learning
4. #DataScience
5. #Neuroscience
6. #ResearchPapers
7. Related Courses and Ebooks
Clustering Time Series Data through Autoencoder-based Deep Learning Models. http://arxiv.org/abs/2004.07296
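A minimal sketch of the general recipe behind such models (a generic illustration, assuming a plain fully-connected autoencoder and k-means on the learned codes, not the paper's exact architecture):

```python
# Generic autoencoder-based time series clustering: compress each series
# to a low-dimensional code, then cluster the codes with k-means.
import numpy as np
import tensorflow as tf
from sklearn.cluster import KMeans

series = np.random.randn(500, 128).astype("float32")  # 500 series, length 128

encoder = tf.keras.Sequential([
    tf.keras.layers.Dense(64, activation="relu"),
    tf.keras.layers.Dense(8),                      # 8-dim code per series
])
decoder = tf.keras.Sequential([
    tf.keras.layers.Dense(64, activation="relu"),
    tf.keras.layers.Dense(128),                    # reconstruct the series
])
autoencoder = tf.keras.Sequential([encoder, decoder])
autoencoder.compile(optimizer="adam", loss="mse")
autoencoder.fit(series, series, epochs=20, batch_size=32, verbose=0)

codes = encoder.predict(series)                    # learned representations
labels = KMeans(n_clusters=4, n_init=10).fit_predict(codes)
```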
Papers with Code: A searchable site that links machine learning papers on ArXiv with code on GitHub. They also tag any framework libraries used, along with other info like GitHub stars. I think such a feature would be a nice addition to ArXiv-Sanity. https://paperswithcode.com
Your 100% up-to-date guide to transfer learning & fine-tuning with Keras: https://colab.research.google.com/drive/17vHSAj7no7RMdJ18MJomTf8twqw1suYC

Batch normalization involves many gotchas you need to be aware of.
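The classic one bites during fine-tuning. A minimal Keras sketch (assuming a MobileNetV2 backbone; this mirrors the pattern in the guide above): BatchNorm layers carry moving statistics that are updated whenever they run in training mode, so an unfrozen pretrained backbone whose BatchNorm layers are not pinned to inference mode will overwrite those statistics with the new dataset's and can wreck accuracy.

```python
import tensorflow as tf

base = tf.keras.applications.MobileNetV2(
    input_shape=(160, 160, 3), include_top=False, weights="imagenet")
base.trainable = True  # unfreeze the weights for fine-tuning

inputs = tf.keras.Input(shape=(160, 160, 3))
# training=False pins BatchNorm to inference mode: its moving mean and
# variance stay frozen even though the surrounding weights are trainable.
x = base(inputs, training=False)
x = tf.keras.layers.GlobalAveragePooling2D()(x)
outputs = tf.keras.layers.Dense(1)(x)
model = tf.keras.Model(inputs, outputs)

model.compile(optimizer=tf.keras.optimizers.Adam(1e-5),  # low LR for fine-tuning
              loss=tf.keras.losses.BinaryCrossentropy(from_logits=True))
```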
Graph Neural Networks Meet Neural-Symbolic Computing: A Survey and Perspective
Luis Lamb et al.: https://arxiv.org/abs/2003.00330
#aidebate #thinkingfastandslow #AAAI2020debate #neurosymbolic #neurosymboliccomputing
The ACM just made its entire archive available for free until June 30.
7 magazines
30+ textbooks
50+ journals
Proceedings from 170+ annual conferences, symposia & workshops
Start your journey here
dl.acm.org
We are excited to host the ICLR 2020 workshop “Tackling Climate Change with Machine Learning” from April 26-30. This workshop will feature talks, panels, and posters on work at the intersection of climate change and machine learning; tutorials on climate change and ML; and plenty of opportunities for digital networking with other participants.

All talks and panels will be live-streamed for free via the workshop website. Registration (via the main ICLR conference) is required to participate actively in Q&As, poster sessions, and the conference messaging app.

More details:

The main workshop on April 26 will feature a full-day program of invited and contributed presentations, as well as panel discussions and breakout sessions. Keynote speakers include:

Stefano Ermon (Stanford University)
Ciira wa Maina (Dedan Kimathi University of Technology)
Georgina Campbell (ClimaCell)
Dan Morris (Microsoft AI for Earth)

From April 27-30, our program will feature deep dives into specific sectors of relevance to climate change, via panels, fireside chats, small-group discussions, and tutorials.

April 27: Energy Day
April 28: Agriculture, Forestry, and Other Land Use (AFOLU) Day
April 29: Climate Science and Adaptation Day
April 30: Cross-cutting Methods Day
Schedule and details: https://www.climatechange.ai/ICLR2020_workshop
Registration: Via the main ICLR conference, at https://iclr.cc
Contact: climatechangeai.iclr2020@gmail.com

Organizers:

Priya Donti (CMU), David Rolnick (UPenn), Lynn Kaack (ETH Zürich), Sasha Luccioni (Mila), Kris Sankaran (Mila), Sharon Zhou (Stanford), Moustapha Cisse (Google), Carla Gomes (Cornell), Andrew Ng (Stanford), Yoshua Bengio (Mila)

Researchers at Johns Hopkins and OpenAI derived new equations that relate language model performance to training parameters, including parameter count, training corpus size, batch size, and training time: https://hubs.ly/H0prWMd0
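This appears to be the "Scaling Laws for Neural Language Models" work (Kaplan et al., 2020). Its headline result is that test loss falls as a power law in parameter count N, dataset size D, and compute C; roughly (exponents quoted approximately from the paper):

L(N) = (N_c / N)^{\alpha_N}, \quad L(D) = (D_c / D)^{\alpha_D}, \quad L(C_{\min}) = (C_c / C_{\min})^{\alpha_C}

with \alpha_N \approx 0.076, \alpha_D \approx 0.095, \alpha_C \approx 0.050.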
Metric-Learning-Assisted Domain Adaptation. http://arxiv.org/abs/2004.10963
SQIL: Imitation Learning via Reinforcement Learning with Sparse Rewards

Reddy et al.:
https://arxiv.org/abs/1905.11108
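The mechanism itself is compact. A rough self-contained sketch (the function name and tuple layout are illustrative, not the authors' code): demonstrations enter the replay buffer with reward +1, the agent's own experience gets reward 0, and a standard off-policy algorithm (soft Q-learning in the paper) trains on balanced batches of the two.

```python
import random

def sqil_batch(demo_transitions, agent_transitions, batch_size):
    """Build one SQIL training batch from (state, action, next_state) tuples:
    demonstration transitions are relabeled with reward +1, the agent's own
    transitions with reward 0, and the two sources are mixed half-and-half."""
    demos = [(s, a, 1.0, s2) for (s, a, s2) in demo_transitions]
    own = [(s, a, 0.0, s2) for (s, a, s2) in agent_transitions]
    half = batch_size // 2
    return random.sample(demos, half) + random.sample(own, half)
```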
SLIDES: lecture on "Graph Convolutional Networks" for the NYU Deep Learning course

Xavier Bresson

https://drive.google.com/file/d/1oq-nZE2bEiQjqBlmk5_N_rFC8LQY0jQr/view
H/T Cecile G. Tamura
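For reference, the propagation rule at the heart of such lectures is H' = ReLU(D^{-1/2} (A + I) D^{-1/2} H W), in the Kipf & Welling formulation. A minimal NumPy sketch (a generic illustration, not taken from the slides):

```python
import numpy as np

def gcn_layer(A, H, W):
    """One graph convolution: normalize the adjacency, aggregate neighbor
    features, apply a linear transform, then a ReLU."""
    A_hat = A + np.eye(A.shape[0])              # add self-loops
    d_inv_sqrt = np.diag(A_hat.sum(axis=1) ** -0.5)
    A_norm = d_inv_sqrt @ A_hat @ d_inv_sqrt    # symmetric normalization
    return np.maximum(A_norm @ H @ W, 0.0)

A = np.array([[0, 1, 0], [1, 0, 1], [0, 1, 0]], dtype=float)  # path graph
H = np.random.randn(3, 4)                       # node features
W = np.random.randn(4, 2)                       # layer weights
print(gcn_layer(A, H, W).shape)                 # (3, 2)
```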

"Attention the mechanism by which a person (or algorithm) focuses on a single element or a few elements at a time.@ArtificialIntelligenceArticles

It’s central both to machine learning model architectures like Google’s Transformer and to the bottleneck theory of consciousness in neuroscience, which suggests that people have limited attention resources, so the brain distills information down to only its salient bits.

Models with attention have already achieved state-of-the-art results in domains like natural language processing, and they could form the foundation of enterprise AI that assists employees in a range of cognitively demanding tasks.
Bengio described the cognitive systems proposed by Israeli-American psychologist and economist Daniel Kahneman in his seminal book Thinking, Fast and Slow.
The first type is unconscious — it’s intuitive and fast, non-linguistic and habitual, and it deals only with implicit types of knowledge.

The second is conscious — it’s linguistic and algorithmic, and it incorporates reasoning and planning, as well as explicit forms of knowledge.

An interesting property of the conscious system is that it allows the manipulation of semantic concepts that can be recombined in novel situations, which Bengio noted is a desirable property in AI and machine learning algorithms."
https://venturebeat.com/2020/04/28/yoshua-bengio-attention-is-a-core-ingredient-of-consciousness-ai/
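On the machine learning side, a minimal sketch of scaled dot-product attention, the operation at the core of the Transformer (a generic NumPy illustration): each query softly focuses on the keys most similar to it and returns the corresponding mix of values.

```python
import numpy as np

def attention(Q, K, V):
    """Scaled dot-product attention: softmax(Q K^T / sqrt(d)) V."""
    scores = Q @ K.T / np.sqrt(K.shape[-1])          # query-key similarity
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)   # softmax over the keys
    return weights @ V                               # weighted mix of values

Q = np.random.randn(2, 8)    # 2 queries
K = np.random.randn(5, 8)    # 5 keys
V = np.random.randn(5, 16)   # 5 values
print(attention(Q, K, V).shape)  # (2, 16)
```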
Bayesian Deep Learning and a Probabilistic Perspective of Generalization
Andrew Gordon Wilson, Pavel Izmailov : https://arxiv.org/abs/2002.08791
#Artificialintelligence #Bayesian #DeepLearning
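The paper's central quantity is the Bayesian model average over network weights rather than a single point estimate, with deep ensembles argued to be an effective practical approximation. Sketching the standard Monte Carlo form:

p(y \mid x, \mathcal{D}) = \int p(y \mid x, w)\, p(w \mid \mathcal{D})\, dw \approx \frac{1}{M} \sum_{m=1}^{M} p(y \mid x, w_m), \quad w_m \sim p(w \mid \mathcal{D})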