AI, Python, Cognitive Neuroscience
Quantile Regression Deep Reinforcement Learning

Researchers: Oliver Richter, Roger Wattenhofer
Paper: https://lnkd.in/fnwiYXi
#artificialintelligence #ai #ml #machinelearning #bigdata #deeplearning #technology #datascience

✴️ @AI_Python_EN
Anticipatory Thinking: A Metacognitive Capability
Researchers: Adam Amos-Binks, Dustin Dannenhauer
Paper: http://ow.ly/wEyC50uR9q1

#artificialintelligence #ai #ml #machinelearning #bigdata #deeplearning #technology #datascience

✴️ @AI_Python_EN
Progressively growing the action space creates a great curriculum for learning agents

paper: https://arxiv.org/abs/1906.12266

code: https://github.com/TorchCraft/TorchCraftAI/tree/gas-micro

✴️ @AI_Python_EN
By your walking style, artificial intelligence can tell whether you are happy, sad, angry, or neutral. Great work on emotion recognition, based on deep features learned via an LSTM on labeled emotion datasets, plus psychological characterization for affective features.
https://arxiv.org/pdf/1906.11884.pdf

✴️ @AI_Python_EN
How do neural networks see depth in single images?
https://arxiv.org/pdf/1905.07005.pdf
Deep Set Prediction Networks by Yan Zhang et al.
https://arxiv.org/abs/1906.06565
This paper has (deservedly) received a lot of attention in the short time since it was published.

Encoding sets in neural networks is done using permutation-equivariant operations (applying the same function to every item) and permutation-invariant operations (sum, average, max). (I think sum works best, because sum is a gradient distributor and helps optimization.)
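As a concrete illustration of these two kinds of operations, here is a minimal Deep Sets-style encoder sketch in PyTorch (my own code, not from any of the papers; layer sizes and names are arbitrary):

```python
import torch
import torch.nn as nn

class SetEncoder(nn.Module):
    """Minimal Deep Sets-style encoder: per-item MLP + sum pooling."""
    def __init__(self, item_dim, latent_dim, hidden=64):
        super().__init__()
        # Permutation-equivariant part: the same MLP applied to every item.
        self.phi = nn.Sequential(
            nn.Linear(item_dim, hidden), nn.ReLU(),
            nn.Linear(hidden, hidden), nn.ReLU(),
        )
        # MLP applied after pooling, on the permutation-invariant summary.
        self.rho = nn.Linear(hidden, latent_dim)

    def forward(self, x):
        # x: (batch, n_items, item_dim)
        h = self.phi(x)          # equivariant: one embedding per item
        pooled = h.sum(dim=1)    # invariant: sum over the set dimension
        return self.rho(pooled)  # z: (batch, latent_dim)
```

Swapping `h.sum(dim=1)` for `h.mean(dim=1)` or `h.max(dim=1).values` gives the other pooling variants.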

"Deep Sets" paper showed that theoretically this can learn any function on sets (even maximum, and the second maximum, I know it is unintuitive, but it can. theoretically!) But there are some limitations which you can read about it here (https://www.inference.vc/deepsets-modeling-permutation-invariance/).

Similar ideas were also used, and shown to be very effective, in relation networks (https://arxiv.org/abs/1706.01427) and GQNs from Eslami et al. But this paper tries to tackle "decoding", i.e., generating sets.
To the best of my knowledge, this is one of the very few papers that actually generate "real" sets. The trick is to use a set "encoder" for decoding! Assume you have a target latent vector z for the set that you want to "decode".
Start with an initial set estimate. Encode the set estimate into the latent space. Measure the error to the target z. Backpropagate the error to the set estimate. Use gradient descent to update your set estimate. Repeat 10 times to arrive at a set that encodes close to your target z.
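A sketch of that inner loop (my paraphrase of the idea, not the authors' code; `encoder` is a set encoder like the one above, and the function and argument names are mine):

```python
import torch

def decode_set(encoder, z_target, init_set, steps=10, lr=0.1):
    """Decode a set by gradient descent on the set estimate itself."""
    y = init_set.clone()
    if not y.requires_grad:
        y.requires_grad_(True)
    for _ in range(steps):
        z_est = encoder(y)                     # encode the current estimate
        err = ((z_est - z_target) ** 2).sum()  # distance to the target z
        # create_graph=True so the outer (training) loss can backpropagate
        # through all of the inner steps.
        (grad,) = torch.autograd.grad(err, y, create_graph=True)
        y = y - lr * grad                      # gradient step on the set itself
    return y
```

Because the outer loss differentiates through these inner steps, both the encoder and the initial set estimate receive gradients during training.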

This means that each "decoding" step is a mini-optimization loop; during training, you have your main optimization loop as well. The initial set estimate is fixed across inputs and learnable during training. The paper studies two set-generation settings: an auto-encoding setting and a supervised setting.
In the auto-encoding setting, you use the set encoder to encode your set into the target z and use the same encoder for "decoding". The "set loss" is a soft-L1 loss, computed after finding the best assignment between the estimated set and the input set with the Hungarian algorithm.
In the supervised setting, the input (e.g., an image) is first mapped into the latent space (e.g., using a ResNet), creating the target z. An encoder maps the target set into the same latent space, and that encoder is used during "decoding" as well.
In the supervised setting, the loss has two parts: the "set loss", as in the auto-encoding setting, and a "representation loss", which is the L2 distance between the encoder's estimate of z and the target z.
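A sketch of these losses under stated assumptions (PyTorch's smooth-L1 standing in for the paper's soft-L1, SciPy's `linear_sum_assignment` for the Hungarian step; function names are mine, not the authors' implementation):

```python
import torch
import torch.nn.functional as F
from scipy.optimize import linear_sum_assignment

def set_loss(pred, target):
    """Soft-L1-style loss under the best one-to-one (Hungarian) assignment."""
    # pred: (n, d) predicted set; target: (m, d) ground-truth set
    n, m = pred.size(0), target.size(0)
    p = pred.unsqueeze(1).expand(n, m, -1)
    t = target.unsqueeze(0).expand(n, m, -1)
    # Pairwise cost between every predicted item and every target item.
    cost = F.smooth_l1_loss(p, t, reduction="none").sum(-1)  # (n, m)
    row, col = linear_sum_assignment(cost.detach().cpu().numpy())
    row, col = torch.as_tensor(row), torch.as_tensor(col)
    return cost[row, col].mean()  # loss only on optimally matched pairs

def supervised_loss(pred_set, target_set, z_enc, z_target):
    # Set loss + representation loss (squared L2 between the encoder's
    # estimate of z and the target z, standing in for the L2 distance).
    return set_loss(pred_set, target_set) + ((z_enc - z_target) ** 2).sum()
```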

Some limitations of this approach:
- The Hungarian assignment between the estimated set and the target set, which costs O(n^3), has to be calculated each time (actually 10 times per iteration; refer to "practical tricks" in the paper), limiting the scalability of the approach.
- The set estimate is real-valued.
- The set estimate is not really variable-size; you have to use some masking tricks to make it seem variable-size.

But overall I liked the paper.

The thing I like about Yan is that he has already published his code. He also reports the mean and standard deviation over multiple runs in his experiments. This is very important IMHO, and very hard to find these days!

✴️ @AI_Python_EN
Want to play with PyTorch, but prefer OCaml to Python? Here are OCaml bindings to the PyTorch C++ API, with examples for image recognition, GANs, RL, etc. By Laurent Mazare. https://github.com/LaurentMazare/ocaml-torch
"Data in the Life: Authorship Attribution in Lennon-McCartney Songs" was just published in the first issue of the Harvard Data Science Review (HDSR), the inaugural publication of the Harvard Data Science Initiative, published by the MIT Press. Combining features of a premier research journal, a leading educational publication, and a popular magazine, HDSR leverages digital technologies and data visualizations to facilitate author-reader interactions globally. Besides our article, the first issue features articles on topics ranging from machine learning models for predicting drug approvals to artificial intelligence. Read it now:
https://bit.ly/2Kuze2q
#datascience #bigdata #machinelearning #statistics #AI

✴️ @AI_Python_EN
Speaking as an applied statistician: there are general methods that apply to any field, such as descriptive and inferential statistics, sampling, and core modeling methods such as linear regression and PCA.

Once we get beyond these basics, applied statistics begins to diverge quite a bit. For the work I do, methods developed in biostatistics and epidemiology, econometrics and psychometrics are most useful.

Predictive analytics has been part of applied statistics from the very beginning; in fact, the qualifier "predictive" was not used until the past decade or so. Statisticians simply called predictive analytics "statistics."

Computer scientists have made important contributions to this corner of applied statistics, and at the academic and practitioner level there is increasing collaboration between the two disciplines.

Though some types of predictive analytics are increasingly automated, statistics is not going to vanish any time soon. It's too complex and innovation is increasingly rapid. Human experts will be needed for the foreseeable future. IMO.

A greatly underappreciated aspect of statistics is that it is also a way of thinking, not merely math and programming. It also opens doors to many other fields, including art, music, literature, and philosophy.

✴️ @AI_Python_EN
Module 3: Core Machine Learning (May-October Semester)
July 6th by FAST-NU AI/ML Training Center

Module 3 (Core Machine Learning) of our ongoing cohort (May-October semester) of the AI-ML training program. It covers basic to intermediate machine learning, lays a solid foundation for building or transitioning into a career in ML and data science, and provides thorough grounding for the next module, Deep Learning.

https://www.facebook.com/events/2195319697439547/

#deeplearning #machinelearning #opencv #AI #ML #Python

✴️ @AI_Python_EN
MIT/IBM's new AI lets you "paint" with a neural network. Try out the demo here:
http://bit.ly/GANPdemo

✴️ @AI_Python_EN
Some important things to consider in multivariate analysis include:

- The purpose of the modeling
- Background information and relevant theory
- Whether the data are cross-sectional or longitudinal
- Whether there are different levels, e.g., households within cities or regions
- Which variables to include
- The measurement level of each variable (e.g., continuous, ordinal, nominal)
- Whether continuous latent variables (aka factors) will be included
- How the variables interrelate, e.g., hypothesized causal relationships
- Random intercepts and slopes
- Whether exogenous variables will be included, e.g., age affecting factors
- Whether categorical latent variables (classes) will be included
- Whether multiple categorical latent variables will be needed
- If latent classes are included, which parts of the model can vary by class
- Estimation, e.g., MLE or Bayes, with numerous options within each

This may sound like gobbledygook to non-statisticians, but all of it can seriously impact decision-making!
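To make a couple of these choices concrete (levels of nesting, random intercepts and slopes, likelihood-based estimation), here is a hypothetical sketch using statsmodels; the data file and column names are invented:

```python
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical data: households nested within cities (two levels).
df = pd.read_csv("households.csv")  # assumed columns: income, age, city

# Linear mixed model: fixed effect of age, plus a random intercept
# and a random slope for age that vary by city.
model = smf.mixedlm("income ~ age", df, groups=df["city"], re_formula="~age")
result = model.fit()  # (restricted) maximum likelihood estimation
print(result.summary())
```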

✴️ @AI_Python_EN
PyTorchPipe (PTP)

A component-oriented framework for rapid prototyping and training of computational pipelines combining vision and language:
https://lnkd.in/ehJbseR

#PyTorch #NeuralNetworks #DeepLearning

✴️ @AI_Python_EN
One of the hardest problems in #AI is common sense reasoning. This paper by Nazneen Rajani, Bryan McCann, Caiming Xiong, and me makes huge progress on this. A powerful, simple, and unsupervised method:
https://arxiv.org/abs/1906.02361
Github: https://github.com/salesforce/cos-e
Blog: https://blog.einstein.ai/leveraging-language-models-for-commonsense/

✴️ @AI_Python_EN
Prof. Chris Manning, Director of StanfordAILab and founder of Stanfordnlp, shared inspiring thoughts on research trends and challenges in #computervision and #NLP at #CVPR2019. View the full interview:

http://bit.ly/2KR21hO

✴️ @AI_Python_EN
https://lnkd.in/e2awdVx
Not to be confused with https://lnkd.in/eydGDPu, mmdetection supports all the SOTA detection algorithms.
#pytorch #gpu

✴️ @AI_Python_EN
Pandas Crash Course

GitHub: https://lnkd.in/gmWAqhz
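For a taste of the basics a crash course like this covers, a generic sketch (my own example, not taken from that repo):

```python
import pandas as pd

# Build a small DataFrame from scratch.
df = pd.DataFrame({
    "city": ["Lahore", "Karachi", "Lahore", "Islamabad"],
    "sales": [100, 250, 175, 90],
})

print(df.head())                          # inspect the first rows
print(df[df["sales"] > 100])              # boolean filtering
print(df.groupby("city")["sales"].sum())  # split-apply-combine aggregation
```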

✴️ @AI_Python_EN
Facebook today announced the open source release of Deep Learning Recommendation Model (DLRM), a state-of-the-art AI model for serving up personalized results in production environments.

DLRM can be found on GitHub, and implementations of the model are available for Facebook’s PyTorch, Facebook’s distributed learning framework Caffe2, and Glow C++.
Link: https://lnkd.in/dEDtai3

Recommendation engines decide a lot of what people see today, whether it’s content on social media sites like Facebook, ecommerce sites like Amazon, or even the first options you see on an Xbox.

Last month, Amazon made Personalize, its AI-powered shopping recommendation service, available on AWS.
See the Facebook Research blog for more details:
https://lnkd.in/du2N9Pd

✴️ @AI_Python_EN
Artificial Intelligence: the global landscape of ethics guidelines

Researchers: Anna Jobin, Marcello Ienca, Effy Vayena
Paper: http://ow.ly/mDA430p2R0q

#artificialintelligence #ai #ml #machinelearning #bigdata #deeplearning #technology #datascience

✴️ @AI_Python_EN