AI, Python, Cognitive Neuroscience
Have you heard of the "R-Transformer", a Recurrent Neural Network Enhanced Transformer?

Recurrent Neural Networks have long been the dominant choice for sequence modeling. However, they suffer from two serious issues: they are weak at capturing very long-term dependencies, and their sequential computation cannot be parallelized.

Therefore, many non-recurrent sequence models that are built on convolution and attention operations have been proposed recently.

Here the authors propose the R-Transformer, which enjoys the advantages of both RNNs and the multi-head attention mechanism while avoiding their respective drawbacks.

The proposed model can effectively capture both local structures and global long-term dependencies in sequences without any use of position embeddings. The authors evaluated R-Transformer through extensive experiments on data from a wide range of domains, and the empirical results show that it outperforms state-of-the-art methods by a large margin on most tasks.
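For intuition, here is a minimal PyTorch-style sketch of what such a layer could look like: a small RNN run over short local windows (standing in for position embeddings), followed by multi-head self-attention and a feed-forward sub-layer. The window size, dimensions, and exact layer composition here are illustrative assumptions, not the authors' released implementation (see the GitHub link below).

```python
# Illustrative sketch of an R-Transformer-style layer (not the authors' code).
# Assumes PyTorch; window size and dimensions are hypothetical choices.
import torch
import torch.nn as nn


class LocalRNN(nn.Module):
    """Runs a small GRU over a sliding window so each position summarizes
    only its local left context, playing the role of position embeddings."""

    def __init__(self, d_model: int, window: int = 5):
        super().__init__()
        self.window = window
        self.rnn = nn.GRU(d_model, d_model, batch_first=True)

    def forward(self, x):                        # x: (batch, seq_len, d_model)
        b, t, d = x.shape
        # Left-pad so every position has a full window of history.
        pad = x.new_zeros(b, self.window - 1, d)
        padded = torch.cat([pad, x], dim=1)      # (b, t + window - 1, d)
        # Gather overlapping windows: (b, t, window, d)
        windows = padded.unfold(1, self.window, 1).permute(0, 1, 3, 2)
        windows = windows.reshape(b * t, self.window, d)
        _, h = self.rnn(windows)                 # h: (1, b*t, d)
        return h.squeeze(0).view(b, t, d)        # last hidden state per window


class RTransformerBlock(nn.Module):
    """LocalRNN -> multi-head self-attention -> feed-forward, each wrapped
    with a residual connection and layer norm, as in a Transformer layer."""

    def __init__(self, d_model: int = 128, n_heads: int = 4, window: int = 5):
        super().__init__()
        self.local_rnn = LocalRNN(d_model, window)
        self.attn = nn.MultiheadAttention(d_model, n_heads, batch_first=True)
        self.ff = nn.Sequential(nn.Linear(d_model, 4 * d_model),
                                nn.ReLU(),
                                nn.Linear(4 * d_model, d_model))
        self.norm1 = nn.LayerNorm(d_model)
        self.norm2 = nn.LayerNorm(d_model)
        self.norm3 = nn.LayerNorm(d_model)

    def forward(self, x):
        x = self.norm1(x + self.local_rnn(x))    # local structure via RNN
        attn_out, _ = self.attn(x, x, x)
        x = self.norm2(x + attn_out)             # global dependencies via attention
        return self.norm3(x + self.ff(x))


if __name__ == "__main__":
    block = RTransformerBlock()
    out = block(torch.randn(2, 16, 128))         # (batch, seq_len, d_model)
    print(out.shape)                             # torch.Size([2, 16, 128])
```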

Github code: https://lnkd.in/dpFckix

#research #algorithms #machinelearning #deeplearning #rnn

✴️ @AI_Python_EN
An abnormal respiratory pattern classifier may contribute to large-scale screening of people infected with COVID-19 in an accurate and unobtrusive manner.

abs: https://arxiv.org/abs/2002.05534v1

#rnn #machinelearning #ArtificialIntelligence #DeepLearning

❇️ @AI_Python_EN