AI, Python, Cognitive Neuroscience
Capturing Context in Emotion AI: Innovations in Multimodal Video Sentiment Analysis
#ComputerVision #MachineLearning #ArtificialIntelligence

http://bit.ly/2ZdU6yc

✴️ @AI_Python_EN
Contrastive Multiview Coding. Paper + code: https://github.com/HobbitLong/CMC/ Different views of the world capture different information, but the important factors are shared; the goal is to learn representations that capture this shared information.
Extends and simplifies Contrastive Predictive Coding (https://arxiv.org/abs/1807.03748). Main findings: 1. More views -> better representations. 2. Contrastive learning outperforms predictive learning. 3. On ImageNet, an unsupervised ResNet-101 outperforms a supervised AlexNet.
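A minimal sketch of the contrastive objective behind this line of work, assuming two view-specific encoders that produce a batch of paired embeddings; this is an illustrative InfoNCE-style loss, not the paper's exact implementation.

```python
import torch
import torch.nn.functional as F

def multiview_contrastive_loss(z1, z2, temperature=0.07):
    """InfoNCE-style loss: embeddings of two views of the same sample
    (matching rows of z1 and z2) are pulled together, all other pairs
    are pushed apart. z1, z2: [batch, dim] outputs of two encoders."""
    z1 = F.normalize(z1, dim=1)
    z2 = F.normalize(z2, dim=1)
    logits = z1 @ z2.t() / temperature                      # [batch, batch] similarities
    targets = torch.arange(z1.size(0), device=z1.device)    # positives on the diagonal
    # symmetric loss: view 1 -> view 2 and view 2 -> view 1
    return 0.5 * (F.cross_entropy(logits, targets) +
                  F.cross_entropy(logits.t(), targets))
```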

✴️ @AI_Python_EN
Interested in Continual (Lifelong) Learning? Come to the Workshop on Multi-Task and Lifelong Reinforcement Learning tomorrow (Saturday) at ICML for posters and an oral on how to rehearse older tasks efficiently!

✴️ @AI_Python_EN
Erin LeDell: Happy to share our #ICML2019 #AutoML Workshop paper, "An Open Source AutoML Benchmark". We present a new #opensource AutoML benchmarking system and include results on: H2O AutoML, auto-sklearn, TPOT, Auto-WEKA
📰 Paper: https://www.automl.org/wp-content/uploads/2019/06/automlws2019_Paper45.pdf
👩‍💻 Code: https://github.com/openml/automlbenchmark/
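For context, here is a toy run of one of the benchmarked systems (TPOT). The actual benchmark is driven by the configurations in the linked repository, so this snippet is only an illustrative usage sketch on a small dataset.

```python
# Toy example of one benchmarked AutoML system (TPOT) on a small dataset.
from sklearn.datasets import load_digits
from sklearn.model_selection import train_test_split
from tpot import TPOTClassifier

X, y = load_digits(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=42)

automl = TPOTClassifier(generations=5, population_size=20,
                        random_state=42, verbosity=2)
automl.fit(X_train, y_train)            # evolutionary pipeline search
print(automl.score(X_test, y_test))     # held-out accuracy
automl.export('best_pipeline.py')       # export the winning sklearn pipeline
```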

✴️ @AI_Python_EN
Vithursan Thangarasa
Excited to be presenting my work on "Differentiable Hebbian Plasticity for Continual Learning"
(https://openreview.net/forum?id=r1x-E5Ss34 )
at the #ICML2019 Adaptive and Multi-task Learning workshop. Blog post to my
paper: https://vithursant.com/dhp-softmax/ .
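A minimal sketch of the differentiable-plasticity idea the paper builds on: a layer whose effective weights combine slow, gradient-trained parameters with a fast Hebbian trace updated from co-activations. The layer and hyperparameter names here are illustrative, not taken from the paper.

```python
import torch
import torch.nn as nn

class PlasticLinear(nn.Module):
    """Linear layer with a fast Hebbian component on top of slow weights.
    Effective weight = w + alpha * hebb, where hebb is a running Hebbian
    trace of input/output co-activations (updated without backprop)."""
    def __init__(self, in_dim, out_dim, eta=0.1):
        super().__init__()
        self.w = nn.Parameter(0.01 * torch.randn(in_dim, out_dim))      # slow weights
        self.alpha = nn.Parameter(0.01 * torch.randn(in_dim, out_dim))  # learned plasticity gains
        self.eta = eta                                                   # Hebbian learning rate
        self.register_buffer('hebb', torch.zeros(in_dim, out_dim))      # fast trace

    def forward(self, x):
        y = x @ (self.w + self.alpha * self.hebb)
        # Hebbian update of the fast trace (batch-averaged outer product)
        with torch.no_grad():
            self.hebb = (1 - self.eta) * self.hebb + \
                        self.eta * torch.einsum('bi,bj->ij', x, y) / x.size(0)
        return y
```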

✴️ @AI_Python_EN
Adrian Weller
We’re hiring for safe and ethical AI at the Turing Institute. Deadline 25th June. Also opportunities for more senior and junior folks. If you’re at ICML and interested, please contact me.
https://cezanneondemand.intervieweb.it/turing/jobs/safe_and_ethical_ai_research_fellows_6037/en/

✴️ @AI_Python_EN
ESPnet, the end-to-end speech processing toolkit, has released v0.4.0. This is the largest release ever! Many features are added: pretrained models, Transformer (for both the PyTorch and Chainer backends), YAML configs, 6 new ASR/TTS corpora, etc. Check it out:
https://github.com/espnet/espnet/releases/tag/v.0.4.0

✴️ @AI_Python_EN
The SOSNet descriptor will be presented as an oral by Yurun Tian at CVPR 2019 next week!
https://medium.com/scape-technologies/mapping-the-world-part-4-sosnet-to-the-rescue-5383671713e7
Read about how adding second-order distance information to the training of a triplet network improves the results. #CVPR2019
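A rough sketch of the second-order similarity idea, assuming a batch of matching anchor/positive descriptor pairs: matching descriptors should also have similar distances to the other descriptors in the batch. The regularizer and the in-batch triplet mining below are illustrative, not SOSNet's exact formulation.

```python
import torch
import torch.nn.functional as F

def second_order_similarity_reg(anchors, positives):
    """Penalize mismatch between the pairwise-distance structure of the
    anchors and that of their matching positives. Inputs: [batch, dim]."""
    d_a = torch.cdist(anchors, anchors)       # distances among anchors
    d_p = torch.cdist(positives, positives)   # distances among positives
    return ((d_a - d_p) ** 2).sum(dim=1).sqrt().mean()

def triplet_plus_sos_loss(anchors, positives, margin=1.0, sos_weight=1.0):
    # hardest in-batch negative: closest non-matching positive for each anchor
    dists = torch.cdist(anchors, positives)
    pos_d = dists.diag()
    neg_d = (dists + 1e6 * torch.eye(len(anchors), device=anchors.device)).min(dim=1).values
    triplet = F.relu(margin + pos_d - neg_d).mean()
    return triplet + sos_weight * second_order_similarity_reg(anchors, positives)
```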

✴️ @AI_Python_EN
Structured prediction requires substantial training data. A new paper introduces the first few-shot scene graph model, with predicates as functions within a graph convolution framework, resulting in the first semantically and spatially interpretable model.
https://arxiv.org/pdf/1906.04876.pdf
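A hypothetical sketch of what "predicates as functions within a graph convolution" can look like: each predicate is a small network that turns a (subject, object) feature pair into a message, and nodes aggregate incoming messages. This only illustrates the idea; see the paper for the actual model.

```python
import torch
import torch.nn as nn

class PredicateGraphConv(nn.Module):
    """Each predicate is its own small MLP; object nodes sum the messages
    produced by all incoming edges, then update their state."""
    def __init__(self, node_dim, n_predicates):
        super().__init__()
        self.predicates = nn.ModuleList(
            nn.Sequential(nn.Linear(2 * node_dim, node_dim), nn.ReLU())
            for _ in range(n_predicates)
        )
        self.update = nn.GRUCell(node_dim, node_dim)

    def forward(self, nodes, edges):
        # nodes: [n_nodes, node_dim]; edges: list of (subj_idx, pred_idx, obj_idx)
        messages = torch.zeros_like(nodes)
        for s, p, o in edges:
            pair = torch.cat([nodes[s], nodes[o]], dim=-1)
            messages[o] = messages[o] + self.predicates[p](pair)
        return self.update(messages, nodes)
```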

✴️ @AI_Python_EN
Is it a good idea to train RL policies from raw pixels? Could visual priors about the world help RL? We just released the code of our Mid-Level Vision paper addressing these questions. Spoiler: using raw pixels doesn’t generalize! Play with the results at http://perceptual.actor
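The core recipe is to feed the policy frozen mid-level visual features instead of raw pixels. A minimal sketch, using a pretrained ResNet-18 from torchvision as a stand-in for the paper's mid-level vision encoders; the action-space size and policy head are illustrative.

```python
import torch
import torch.nn as nn
from torchvision import models

n_actions = 4  # illustrative action-space size

# Frozen feature extractor standing in for the mid-level vision encoders.
encoder = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
encoder.fc = nn.Identity()            # drop the classifier head -> 512-d features
encoder.eval()
for p in encoder.parameters():
    p.requires_grad = False

# Small trainable policy head that never sees raw pixels, only frozen features.
policy_head = nn.Sequential(
    nn.Linear(512, 256), nn.ReLU(),
    nn.Linear(256, n_actions),
)

def act(obs):
    """obs: [batch, 3, H, W] image observations -> action logits."""
    with torch.no_grad():
        feats = encoder(obs)
    return policy_head(feats)
```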

✴️ @AI_Python_EN
Interesting NBER paper on the history of industry investment in basic research. "The Changing Structure of American Innovation: Some Cautionary Remarks for Economic Growth", Ashish Arora, Sharon Belenzon, Andrea Patacconi, Jungkyu Suh

https://www.nber.org/papers/w25893

✴️ @AI_Python_EN
The first wide-coverage Minimalist Grammar parser! Read all about it: https://stanojevic.github.io/papers/2019_ACL_MG_Wide_Coverage.pdf
The work will be presented at #ACL2019

✴️ @AI_Python_EN
According to one urban legend, statistics depends on the normal distribution and, since most data aren't normally distributed, statistics is invalid.

This myth is easily busted. First, many kinds of data, including natural phenomena, are in fact approximately normally distributed. Second, the normal (Gaussian) distribution is but one of more than three dozen distributions used in statistics. These two books concisely review the ones statisticians use most often:

- Handbook of Statistical Distributions (Krishnamoorthy)
- Statistical Distributions (Forbes et al.)
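
As a quick aside to the point about distributions above, here is a small scipy.stats illustration: one synthetic dataset that is roughly normal, one that is heavily skewed, and a reminder that the normal is only one of many named distributions the library exposes. The variable names and parameters are illustrative.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
heights = rng.normal(170, 8, 5000)        # roughly normal (e.g. adult heights)
incomes = rng.lognormal(10, 0.8, 5000)    # heavily skewed, far from normal

for name, x in [("heights", heights), ("incomes", incomes)]:
    stat, p = stats.normaltest(x)          # D'Agostino-Pearson normality test
    print(f"{name}: normality p-value = {p:.3g}")

# A few of the dozens of named distributions available in scipy.stats:
print([d for d in ("norm", "gamma", "beta", "weibull_min", "poisson", "t")
       if hasattr(stats, d)])
```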

As Bayesian statistics becomes mainstream, a good understanding of probability may be more important than ever. I've been cracking the books and have found these three quite helpful:

- Introduction to Probability (Bertsekas and Tsitsiklis)
- Introduction to Probability Models (Ross)
- Essentials of Probability Theory for Statisticians (Proschan and Shaw)

These three provide a somewhat philosophical take on probability:

- Probability, Statistics and Truth (von Mises)
- Probability Theory: The Logic of Science (Jaynes)
- Uncertainty: The Soul of Modeling, Probability & Statistics (Briggs)

Lastly, I'd recommend The Improbability Principle by David Hand to anyone.

✴️ @AI_Python_EN
There is no "machine learning for dummies." That is a fact we have to accept!
Machine learning is an advanced topic that requires knowledge of math, optimization algorithms, and programming constraints.

People love to hear stories about AI and how powerful machine learning is. However, they give up as soon as they see the first math equation.

If you want it, work for it! Don't just dream about it!

#AI

✴️ @AI_Python_EN
Join Top Experts in Machine Learning, Deep Learning, NLP, AI Engineering for up to four days in San Francisco, and accelerate your career in 2019. October 29 - November 1. 60% OFF Ends Soon: https://hubs.ly/H0jf4Cg0
✴️ @AI_Python_EN
Google Open Sources TensorNetwork, A Library For Faster ML And Physics Tasks
https://bit.ly/2F9vFus
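A minimal usage sketch based on the library's basic Node/edge API (wrap tensors in nodes, connect edges, contract); check the repository for the current interface.

```python
# Contract two connected tensors (here a simple dot product of ones).
import numpy as np
import tensornetwork as tn

a = tn.Node(np.ones(10))
b = tn.Node(np.ones(10))
edge = a[0] ^ b[0]           # connect the two dangling edges
result = tn.contract(edge)   # contract the shared edge
print(result.tensor)         # -> 10.0
```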

✴️ @AI_Python_EN