Teaching deep generative modelling techniques for NLP applications at Yandex NLP week https://academy.yandex.ru/events/data_analysis/NLP_week/ … All material available from https://github.com/philschulz/VITutorial/tree/yandex2019/modules
The US Military Is Creating the Future of Employee Monitoring
"A new AI-enabled pilot project aims to sense “micro changes” in the behavior of people with top-secret clearances. If it works, it could be the future of corporate HR."
In Defense One: https://lnkd.in/e6iuJU6
#ArtificialIntelligence #DeepLearning #MachineLearning
✴️ @AI_Python_EN
"A new AI-enabled pilot project aims to sense “micro changes” in the behavior of people with top-secret clearances. If it works, it could be the future of corporate HR."
In Defense One: https://lnkd.in/e6iuJU6
#ArtificialIntelligence #DeepLearning #MachineLearning
✴️ @AI_Python_EN
Articles on Natural Language Processing (NLP)
by Vincent Granville
https://lnkd.in/fnBGQfM #technology #naturallanguageprocessing #artificialintelligence #businessintelligence #deeplearning #machinelearning #python #datascience
✴️ @AI_Python_EN
Word2vec was great ... in 2013. There's still a lot of value in the idea of embeddings (although, for many NLP tasks, you should really upgrade to deeper representations; check out transformer networks from 2017 and what followed them).
In any case, this is a modern analysis that should make clear what embeddings created with word2vec and similar techniques can and cannot do.
Do pay attention to the genealogy of techniques that followed word2vec, as the original idea is powerful.
https://lnkd.in/f5KBf76
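For readers who haven't tried the technique, here is a minimal word2vec sketch. It is not from the linked analysis; it assumes the gensim 4.x API and uses a toy corpus just to show the workflow:
```python
# A minimal word2vec sketch (gensim 4.x API assumed), trained on a toy corpus.
from gensim.models import Word2Vec

corpus = [
    ["the", "cat", "sat", "on", "the", "mat"],
    ["the", "dog", "sat", "on", "the", "rug"],
    ["cats", "and", "dogs", "are", "pets"],
]

model = Word2Vec(sentences=corpus, vector_size=50, window=2, min_count=1, epochs=50)

# Each word is now a dense vector; similarity falls out of the vector geometry.
print(model.wv["cat"][:5])                  # first few dimensions of the "cat" vector
print(model.wv.most_similar("cat", topn=3)) # nearest neighbours in embedding space
```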
#neuralnetworks #artificialintelligence #machinelearning #ai
✴️ @AI_Python_EN
#AI hype can certainly be irritating, and many AI researchers have expressed concerns that it may lead to yet another AI winter.
At the opposite extreme are apocalyptic visions of a future in which "the machines" enslave or even eliminate humanity. Eschatology in its latest incarnation.
In keeping with ancient statistical traditions, I personally am wishy-washy about AI and technology in general. AI represents both an opportunity and a threat, but I don't know what the future holds in store for me or anyone else.
I'm not counseling indifference, though. As with any issue we regard as important, I think we should stay calm and do our homework.
✴️ @AI_Python_EN
Understanding when to use the mean, the median, and the mode.
Understanding when to use the interquartile range rather than the plain range.
Understanding when a line chart is better than a bar chart.
Understanding a model's parameters and the statistics behind it in order to build a better model.
All these questions, and many more, can be answered if we have a firm grasp of basic-to-intermediate statistics (see the short sketch below).
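As a quick illustration of the first two points, here is a small sketch on hypothetical data (numpy assumed):
```python
# A quick illustration (hypothetical data): on skewed data the median and IQR
# are far less sensitive to outliers than the mean and the full range.
import numpy as np

data = np.array([20, 22, 23, 25, 26, 27, 28, 30, 200])  # one extreme outlier

mean, median = data.mean(), np.median(data)
q1, q3 = np.percentile(data, [25, 75])
iqr = q3 - q1
full_range = data.max() - data.min()

print(f"mean={mean:.1f}  median={median:.1f}")   # mean is pulled up by the outlier
print(f"range={full_range}  IQR={iqr}")          # IQR ignores the extreme value
```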
Even though I have a decent understanding of statistical concepts and always read up on the statistics behind the models I use, I picked up the Head First Statistics book this week.
For the first time in a long while, I felt I had spent a good two hours of the day learning something new outside my daily business analytics work.
I'll summarize what I learn in a blog post or article after finishing the book, and I'd suggest it to anyone who wants to learn statistics for data science.
✴️ @AI_Python_EN
In his Medium post, Pranav Dar classifies the pretrained models into three categories based on their application. Here they are (with a quick loading sketch after the list):
Multi-Purpose #NLP Models: ULMFiT, Transformer, Google’s BERT, Transformer-XL, OpenAI’s GPT-2.
Word Embeddings: ELMo, Flair.
Other Pretrained Models: StanfordNLP.
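As a hedged example of how easy these models are to use, here is a loading sketch that assumes the Hugging Face transformers library (not mentioned in the post) and the standard bert-base-uncased checkpoint:
```python
# A hedged sketch (Hugging Face `transformers` assumed): load a pretrained
# multi-purpose model and encode a sentence with it.
import torch
from transformers import AutoTokenizer, AutoModel

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModel.from_pretrained("bert-base-uncased")

inputs = tokenizer("Pretrained models make NLP much easier.", return_tensors="pt")
with torch.no_grad():
    outputs = model(**inputs)

print(outputs.last_hidden_state.shape)  # (batch, tokens, hidden_size)
```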
🌎 Link Review
✴️ @AI_Python_EN
Yoshua Bengio, Geoffrey Hinton and Yann LeCun, the fathers of #DeepLearning, receive the 2018 #ACMTuringAward for conceptual and engineering breakthroughs that have made deep neural networks a critical component of computing today. http://bit.ly/2HVJtdV
✴️ @AI_Python_EN
Tracking Progress in Natural Language Processing
By Sebastian Ruder: https://lnkd.in/e6tkGHH
#deeplearning #machinelearning #naturallanguageprocessing
✴️ @AI_Python_EN
One of my favorite tricks is adding a constant to each of the independent variables in a regression so as to shift the intercept. Of course just shifting the data will not change R-squared, slopes, F-scores, P-values, etc., so why do it?
Because just about any software package capable of doing regression, even Excel, can give you standard errors and confidence intervals for the intercept, but it is much harder to get most packages to give you standard errors and confidence intervals around the predicted value of the dependent variable for OTHER combinations of the independent variables. Shifting the intercept is an easy way to get confidence intervals for arbitrary combinations of the independent variables.
This sort of thing becomes especially important at a time when the statistics community is loudly calling for a move away from p-values. Instead, it is recommended that researchers give confidence intervals in clinically meaningful terms.
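A minimal sketch of the trick on hypothetical data (statsmodels assumed; any regression package works the same way):
```python
# Intercept-shift trick: shift each predictor so the point of interest maps to zero;
# the intercept then equals the predicted mean response there, with SE and CI for free.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
x1 = rng.normal(size=100)
x2 = rng.normal(size=100)
y = 2 + 1.5 * x1 - 0.8 * x2 + rng.normal(scale=0.5, size=100)

# The combination of predictor values we want a confidence interval at.
x1_star, x2_star = 1.0, -0.5

X_shifted = sm.add_constant(np.column_stack([x1 - x1_star, x2 - x2_star]))
fit = sm.OLS(y, X_shifted).fit()

print(fit.params[0])      # predicted mean response at (x1_star, x2_star)
print(fit.conf_int()[0])  # 95% confidence interval for that predicted mean
```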
#data #researchers #statistics #r #excel #regression
✴️ @AI_Python_EN
SciBERT: Pretrained Contextualized Embeddings for Scientific Text
Beltagy et al.: https://lnkd.in/eAT3mSK
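For context, a hedged loading sketch; the checkpoint name is an assumption about the Hugging Face hub, not something taken from the paper:
```python
# A hedged sketch: loading SciBERT via Hugging Face transformers.
# The checkpoint name `allenai/scibert_scivocab_uncased` is assumed, not from the post.
from transformers import AutoTokenizer, AutoModel

tokenizer = AutoTokenizer.from_pretrained("allenai/scibert_scivocab_uncased")
model = AutoModel.from_pretrained("allenai/scibert_scivocab_uncased")

tokens = tokenizer("The cell line was cultured in DMEM.", return_tensors="pt")
embeddings = model(**tokens).last_hidden_state  # contextual embeddings for scientific text
```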
#ArtificialIntelligence #DeepLearning #MachineLearning
✴️ @AI_Python_EN
All Data Science ***Cheat Sheets*** in one place.
Github link - https://lnkd.in/fGeGXQs
#datascience #machinelearning #excel #deeplearning #python #R
✴️ @AI_Python_EN
GAN Lab: Play with Generative Adversarial Networks (GANs) in your browser!
https://lnkd.in/dfiFvrc
Research paper: https://lnkd.in/eeYFK4J
#AI #ArtificialIntelligence #GenerativeDesign #GenerativeAdversarialNetworks
✴️ @AI_Python_EN
Pretrained ULMFiT language models for 10 Indian languages! https://github.com/goru001/inltk
#nlp
✴️ @AI_Python_EN
Self-Supervised Learning via Conditional Motion Propagation #CVPR2019. It learns kinematically sound representations! State-of-the-art results on the PASCAL VOC 2012 segmentation task. Paper: https://arxiv.org/abs/1903.11412 Project page: http://mmlab.ie.cuhk.edu.hk/projects/CMP/
✴️ @AI_Python_EN
TensorFlow is dead, long live TensorFlow!
#TensorFlow just went full #Keras! (!!!!!) Here's why that's an earthquake for #AI and #DataScience...
🌎 TensorFlow
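For what it's worth, a minimal tf.keras sketch (TensorFlow 2.x assumed, hypothetical data) showing the Keras-first workflow the post refers to:
```python
# A minimal tf.keras sketch: in TensorFlow 2.x, Keras is the built-in high-level API.
import numpy as np
import tensorflow as tf

model = tf.keras.Sequential([
    tf.keras.layers.Dense(32, activation="relu", input_shape=(10,)),
    tf.keras.layers.Dense(1),
])
model.compile(optimizer="adam", loss="mse")

# Hypothetical data, just to show the training call.
x = np.random.rand(100, 10).astype("float32")
y = np.random.rand(100, 1).astype("float32")
model.fit(x, y, epochs=2, verbose=0)
```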
✴️ @AI_Python_EN
How This Researcher Is Using #DeepLearning To Shut Down Trolls And Fake Reviews. #BigData #Analytics #DataScience #AI #MachineLearning #NLProc #IoT #IIoT #PyTorch #Python #RStats #JavaScript #ReactJS #GoLang #Serverless #DataScientist #Linux
🌎 https://bit.ly/2U2J5BX
✴️ @AI_Python_EN
NODE - Neural Ordinary Differential Equations
This was recently presented as a new approach at NeurIPS.
The idea?
Instead of specifying a discrete sequence of hidden layers, the authors parameterize the derivative of the hidden state with a neural network; the output is then computed by a black-box differential equation solver.
They also propose CNF, or Continuous Normalizing Flows.
Continuous normalizing flows are a generative model that can be trained by maximum likelihood without partitioning or ordering the data dimensions. The paper shows how to scalably backpropagate through any ODE solver without access to its internal operations, which allows end-to-end training of ODEs within larger models.
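A minimal sketch of the core idea (not the authors' code; assumes PyTorch and the torchdiffeq package):
```python
# Neural ODE sketch: the "layer" is an ODE whose derivative is a small neural
# network, integrated by a differentiable black-box ODE solver.
import torch
import torch.nn as nn
from torchdiffeq import odeint

class ODEFunc(nn.Module):
    """Parameterizes dh/dt = f(h, t) with a small MLP."""
    def __init__(self, dim=2):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(dim, 50), nn.Tanh(), nn.Linear(50, dim))

    def forward(self, t, h):
        return self.net(h)

func = ODEFunc()
h0 = torch.randn(16, 2)                # batch of initial hidden states
t = torch.linspace(0.0, 1.0, steps=2)  # integrate from t=0 to t=1
h1 = odeint(func, h0, t)[-1]           # final state plays the role of the layer output
loss = h1.pow(2).mean()
loss.backward()                        # gradients flow through the solver
```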
Paper: https://lnkd.in/ddMJQAS
#Github: Examples of implementations coming soon to our repository
#neuralnetwork #deeplearning #machinelearning
✴️ @AI_Python_EN