AI, Python, Cognitive Neuroscience
3.88K subscribers
1.09K photos
47 videos
78 files
893 links
In a paper published two days ago in Nature, a group of scientists designed a recurrent neural network that decodes cortical signals into speech. This problem is considered much harder than decoding muscle movements from brain signals, because the neural signals responsible for spoken words are far more difficult to decode.
Nature (paywall): https://lnkd.in/fM8EsuE
direct link to pdf: https://lnkd.in/ftrEbe5
#ai #neuralnetwork #science #rnn #neuroscience

✴️ @AI_Python_EN
THE_LOTTERY_TICKET_HYPOTHESIS_:FINDING.pdf
3.8 MB
Interesting paper with a simple and straightforward explanation of NN pruning, based on the following hypothesis:

"Dense, randomly-initialized, feed-forward networks contain subnetworks that - when trained in isolation - reach test accuracy comparable to the original network in a similar number of iterations."

#machinelearning #deeplearning #neuralnetwork #NN

✴️ @AI_Python_EN
Andriy Burkov
I often receive questions from people in my network about what they should learn and master to become a data scientist. While I personally think the term "data scientist" is very unfortunate and lacks a clear definition, this is what a good modern #dataanalyst has to master:
#DataScience

– Data structures (local and distributed)
– Data indexing
– Data privacy and anonymization
– Data lifecycle management
– Data transformation (deduplication, handling outliers and missing values, dimensionality reduction)
– Data analysis (experiment design, classification, regression, unsupervised methods)
– #Machinelearning methods (feature engineering, regularization, hyperparameter tuning, ensemble methods, and #neuralnetwork s)
– Computer and database programming, numerical optimization
– Distributed data processing
– Real-time and high-frequency data processing
– Linux (my personal bias)

A modern data analyst also has to be a good popularizer of complex ideas. A Ph.D. is not a requirement, but it is a very big plus: it builds the skill of popularization and teaches the scientific approach to problem-solving.

✴️ @AI_Python_EN
Which doodles are human-drawn and which are AI-generated? Berkeley researchers Forrest Huang et al. created a #neuralnetwork that generates sketches from text descriptions:
http://bit.ly/2LZSHJN


✴️ @AI_Python_EN
A gentle overview of Deep Learning and Machine Learning

Deep Learning is a subarea of Machine Learning that makes use of deep neural networks (with many layers) and specific novel algorithms for data pre-processing and model regularization: word embeddings, dropout, data augmentation. Deep Learning takes inspiration from neuroscience, since neural networks are a model of the neuronal network within the brain. Unlike the biological brain, where any neuron can connect to any other under some physical constraints, Artificial Neural Networks (ANNs) have a finite number of layers and connections, and a fixed direction of data propagation. For a long time, ANNs were largely ignored by the research and business communities; the problem was their computational cost.
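Of the regularization techniques mentioned above, dropout is the simplest to show concretely: at train time each activation is zeroed with some probability, and the survivors are rescaled so the expected output is unchanged ("inverted" dropout). A minimal numpy sketch (the function name and shapes are illustrative, not from any particular library):

```python
import numpy as np

def dropout(x, rate, rng, train=True):
    """Inverted dropout: zero each activation with probability `rate`
    at train time, and rescale the survivors by 1/(1-rate) so the
    expected value of the output matches the input."""
    if not train or rate == 0.0:
        return x
    keep = rng.random(x.shape) >= rate
    return x * keep / (1.0 - rate)

rng = np.random.default_rng(0)
x = np.ones((1000,))
y = dropout(x, rate=0.5, rng=rng)
print(y.mean())  # close to 1.0: roughly half the entries are 0, the rest are 2.0
```

At test time (`train=False`) the input passes through unchanged, which is why the 1/(1-rate) rescaling is done during training rather than at inference.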

Between 2006 and 2012, the research group led by Geoffrey Hinton at the University of Toronto finally ported ANN algorithms to parallel architectures. The main breakthrough was the increased number of layers, neurons, and model parameters in general (even more than 10 million), making it possible to push massive amounts of data through the system to train it.

https://lnkd.in/dq87iFy

#neuralnetwork
#deeplearning
#machinelearning

✴️ @AI_Python_EN
The vanishing/exploding gradients problem is a common issue, especially when training large networks, so visualizing gradients is a must when training neural networks. Here is the gradient flow of a small network trained on the MNIST dataset. A detailed article explaining many aspects of deep learning is on the way.
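The effect is easy to reproduce without a framework: backpropagate through a deep stack of sigmoid layers and record the gradient norm at each layer. A minimal numpy sketch (random weights, no training; the layer sizes and scales are arbitrary choices for the demonstration):

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Forward pass through a deep stack of small sigmoid layers
n_layers, width = 20, 32
weights = [rng.normal(scale=0.5, size=(width, width)) for _ in range(n_layers)]

x = rng.normal(size=(width,))
activations = [x]
for w in weights:
    activations.append(sigmoid(w @ activations[-1]))

# Backward pass: chain rule through each sigmoid and linear map,
# recording the gradient norm layer by layer (output -> input).
grad = np.ones(width)  # dL/da at the output, taken as all ones
grad_norms = []
for w, a in zip(reversed(weights), reversed(activations[1:])):
    grad = w.T @ (grad * a * (1.0 - a))
    grad_norms.append(np.linalg.norm(grad))

# The norm shrinks as we move toward the input: the vanishing-gradient effect.
print(f"near output: {grad_norms[0]:.4g}, near input: {grad_norms[-1]:.4g}")
```

Plotting `grad_norms` per layer (and per training step) is exactly the kind of gradient-flow visualization described above; layers whose norms collapse toward zero, or blow up, stand out immediately.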

#machinelearning #deeplearning #artificialintelligence #computervision #neuralnetwork

❇️ @AI_Python_EN