AI, Python, Cognitive Neuroscience
Modern machine learning is driven by building good environments/datasets. We’ve just open-sourced a tool we created for rendering high-quality synthetic robotics data:
OpenAI: We're releasing ORRB (OpenAI Remote Rendering Backend), a Unity3D-based system that enables rapid and customizable rendering of robotics environments.
Paper: https://arxiv.org/abs/1906.11633
Code: https://github.com/openai/orrb

✴️ @AI_Python_EN
Change Detection in Graph Streams by Learning Graph Embeddings on Constant-Curvature Manifolds
arxiv.org/abs/1805.06299

#NLP #MachineLearning #DeepLearning

✴️ @AI_Python_EN
Skills from video data + motion reconstruction
code: https://github.com/akanazawa/motion_reconstruction
Training code for the CVPR'19 human motion paper:
code: https://github.com/akanazawa/human_dynamics

✴️ @AI_Python_EN
0.pdf
12.3 MB
All you need to know about Classification and Regression (Machine Learning in 270 pages)

In classification problems, we try to predict one of a discrete set of values. The labels (y) generally come in categorical form and represent a finite number of classes. Common classifiers include:

+ Decision Trees
+ Logistic Regression
+ Naive Bayes
+ K-Nearest Neighbors
+ Linear SVC (Support Vector Classifier)

In regression problems, we try to predict a continuous-valued output. For example: given the size of a house, predict its price (a real value). Common regression algorithms include (a minimal sketch contrasting the two settings follows this list):

+ Linear Regression
+ Regression Trees (e.g. Random Forest)
+ Support Vector Regression (SVR)
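The split between the two settings is easy to see in code. Below is a minimal sketch using scikit-learn's bundled toy datasets (iris for classification, diabetes for regression); the dataset and model choices are my own illustration, not taken from the attached PDF.

```python
# Minimal sketch: the same fit/score workflow, with a discrete vs. a continuous target.
# Dataset and model choices are illustrative, not taken from the attached PDF.
from sklearn.datasets import load_diabetes, load_iris
from sklearn.linear_model import LinearRegression, LogisticRegression
from sklearn.model_selection import train_test_split

# Classification: predict a discrete class label (iris species).
X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
clf = LogisticRegression(max_iter=1000).fit(X_train, y_train)
print("classification accuracy:", clf.score(X_test, y_test))

# Regression: predict a continuous value (a disease-progression score).
X, y = load_diabetes(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
reg = LinearRegression().fit(X_train, y_train)
print("regression R^2:", reg.score(X_test, y_test))
```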

✴️ @AI_Python_EN
0.pdf
752.6 KB
Does text recognition excite you?
Or do you use Google Translate and think, 'Damn, this is magic!'?

That is how I used to feel, but after some research and a #positive approach to learning, I ended up getting my NLP basics clear. Sharing the same with you.

Natural Language Processing is no rocket science: it is a sub-field of computer science and AI that enables computers to understand and process human language.

What is a word?
- a sequence of meaningful characters (words can be silly as well, DUH ._.)

And how does it work? (basic steps)
- a sequence of steps (just a quick taste here; the rest is well explained in the PDF)
- Tokenization
- Token Normalisation
- Stemming
- Lemmatization

Attaching a beautiful research paper from arXiv; give it a read and start your fun journey.
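To make those steps concrete, here is a minimal sketch; NLTK is my own choice of library here, the attached paper does not prescribe a particular tool.

```python
# Minimal sketch of the basic NLP preprocessing steps above, using NLTK (an assumed choice).
import nltk
from nltk.stem import PorterStemmer, WordNetLemmatizer

# One-time resource downloads (newer NLTK releases may also need "punkt_tab").
nltk.download("punkt", quiet=True)
nltk.download("wordnet", quiet=True)

text = "The striped bats are hanging on their feet."

# 1) Tokenization: split raw text into word tokens.
tokens = nltk.word_tokenize(text)

# 2) Token normalisation: lowercase and keep alphabetic tokens only.
normalised = [t.lower() for t in tokens if t.isalpha()]

# 3) Stemming: crudely chop each token down to a root form.
stemmer = PorterStemmer()
print([stemmer.stem(t) for t in normalised])          # e.g. 'hanging' -> 'hang'

# 4) Lemmatization: map each token to a dictionary lemma.
lemmatizer = WordNetLemmatizer()
print([lemmatizer.lemmatize(t) for t in normalised])  # e.g. 'feet' -> 'foot'
```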
#datascience #nlp
🗣 @AI_Python_arXiv
✴️ @AI_Python_EN
Learn:

- Practical Deep Learning http://course.fast.ai/
- Deep Learning Foundations https://lnkd.in/dhJJYhw
- Computational Linear Algebra https://lnkd.in/e3zAvzF
- Intro Machine Learning http://course.fast.ai/ml

#artificialintelligence #deeplearning #machinelearning

✴️ @AI_Python_EN
🗣 @AI_Python_arXiv
Is this really a paradox, as the authors claim? Given the small sample sizes, once the two-year data are in, why not simply ignore the individual yearly baseball performance figures?
http://qualitysafety.bmj.com/content/23/9/701
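For context, here is a tiny simulation of the regression-to-the-mean effect the question is about; the numbers are made up purely to illustrate why pooled two-year figures are more trustworthy than single-year ones, and are not taken from the linked paper.

```python
# Toy simulation: noisy yearly performance figures regress toward the mean,
# while pooled two-year averages track "true" skill better. Numbers are invented.
import numpy as np

rng = np.random.default_rng(0)
n = 1000
true_skill = rng.normal(0.27, 0.02, n)        # hypothetical true batting averages
year1 = true_skill + rng.normal(0, 0.03, n)   # noisy year-1 estimate (small sample)
year2 = true_skill + rng.normal(0, 0.03, n)   # independent noisy year-2 estimate

top = year1 > np.quantile(year1, 0.9)         # "stars" selected on year 1 alone
print("year-1 mean of top decile:", year1[top].mean())
print("year-2 mean of same players:", year2[top].mean())  # falls back toward 0.27

pooled = (year1 + year2) / 2                  # averaging two years halves the noise variance
print("corr(year1, true skill): ", np.corrcoef(year1, true_skill)[0, 1])
print("corr(pooled, true skill):", np.corrcoef(pooled, true_skill)[0, 1])
```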

✴️ @AI_Python_EN
🗣 @AI_Python_arXiv
FREE COURSE: Intro to TensorFlow for Deep Learning

This course is a practical approach to deep learning for software developers.

https://www.udacity.com/course/intro-to-tensorflow-for-deep-learning--ud187

✴️ @AI_Python_EN
🗣 @AI_Python_arXiv
Interesting paper! TensorFlow 2.0 and PyTorch 1.1 have already pushed Python to the limits of what it can do. As Julia and Swift mature their support for #deeplearning, we may need to switch.
https://buff.ly/320IH76

✴️ @AI_Python_EN
🗣 @AI_Python_arXiv
A Review of “Compound Probabilistic Context-Free Grammars for Grammar Induction”

By Ryan Cotterell

https://lnkd.in/fVVvwud
paper https://lnkd.in/fr-U2vK

#MachineLearning
#NaturalLanguageProcessing #NLP

✴️ @AI_Python_EN
New Google Brain Optimizer Reduces BERT Pre-Training Time From Days to Minutes
http://bit.ly/30tZfDN
#AI #MachineLearning #DeepLearning #DataScience

✴️ @AI_Python_EN
A mathematical theory of semantic development in deep neural networks

https://lnkd.in/ejt9fe6

#MachineLearning #ArtificialIntelligence #Neurons #Cognition

✴️ @AI_Python_EN
This is the reference implementation of Diff2Vec from "Fast Sequence Based Embedding With Diffusion Graphs" (CompleNet 2018). Diff2Vec is a node embedding algorithm that scales to networks with millions of nodes. It can be used for node classification, node-level regression, latent-space community detection, and link prediction. Enjoy!

https://lnkd.in/dXiy5-U
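As a quick illustration of the node-classification use case, here is a minimal downstream sketch. It assumes you have already run Diff2Vec and exported one embedding row per node, plus a node-label file; the file names and CSV format are hypothetical placeholders, not the repository's actual output format.

```python
# Downstream node classification on top of precomputed Diff2Vec embeddings.
# File names and formats below are hypothetical placeholders.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

emb = np.loadtxt("diff2vec_embedding.csv", delimiter=",")  # (num_nodes, dim), row i = node i
labels = np.loadtxt("node_labels.csv", dtype=int)          # (num_nodes,) one class id per node

X_train, X_test, y_train, y_test = train_test_split(emb, labels, random_state=0)
clf = LogisticRegression(max_iter=1000).fit(X_train, y_train)
print("node classification accuracy:", clf.score(X_test, y_test))
```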

#technology #machinelearning #datamining #datascience #deeplearning #neuralnetworks #pytorch #tensorflow #diffusion #Algorithms

✴️ @AI_Python_EN