AI, Python, Cognitive Neuroscience
Reza Zadeh:

Creating super slow motion videos by predicting missing frames using a neural network, instead of simple interpolation. With code.

Code: https://github.com/avinashpaliwal/Super-SloMo

Project: https://people.cs.umass.edu/~hzjiang/projects/superslomo/
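The contrast with "simple interpolation" can be made concrete. Here is a minimal sketch of the baseline the post refers to (the toy frames and function are my own illustration, not code from the repository): naive interpolation cross-fades neighboring frames pixel-wise, which ghosts anything that moves, whereas Super SloMo predicts optical flow to the intermediate time and warps the frames instead.

```python
import numpy as np

def blend_frames(frame0, frame1, t):
    """Linear cross-fade at time t in (0, 1); ghosts moving objects."""
    return (1.0 - t) * frame0 + t * frame1

# Toy 4x4 grayscale frames (illustration only)
f0 = np.zeros((4, 4))
f1 = np.ones((4, 4))

mid = blend_frames(f0, f1, 0.5)   # the "missing" middle frame
print(mid[0, 0])                  # 0.5 - every pixel is just an average
```

A learned approach replaces this blend with flow-based warping, so an object that moved between the frames appears at its true intermediate position rather than as two half-transparent copies.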

#AI #DeepLearning #MachineLearning #DataScience #neuralnetwork


✴️ @AI_Python_EN
🗣 @AI_Python_arXiv
❇️ @AI_Python
Deep learning for beating traffic is a sound idea, given that on average we waste almost a week stuck in traffic each year.

MIT offers a great course on deep learning, taught through the applied theme of building a self-driving car. It is designed for those who are new to machine learning, but it can also benefit advanced researchers looking for a practical overview of #deeplearning methods and their application.

The program also has some cool programming challenges to test you on the concepts, such as #DeepTraffic, in which you have to create a #neuralnetwork that drives a vehicle (or multiple vehicles) as fast as possible through dense traffic.

Link to Course: https://lnkd.in/fGbjB3y
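The kind of policy the challenge asks for can be sketched generically. This is a hedged illustration, not the challenge's actual in-browser API: a small untrained Q-network maps a grid of observed speeds around the car to one score per driving action, and the agent picks the highest-scoring action. (The action names, grid shape, and weights here are all assumptions.)

```python
import numpy as np

# Assumed action set for a lane-driving agent (illustration only)
ACTIONS = ["accelerate", "decelerate", "stay", "left", "right"]

rng = np.random.default_rng(0)
# Untrained linear Q-network: one row of weights per action,
# input is a flattened 3-lanes x 7-cells grid of nearby speeds.
W = rng.normal(scale=0.1, size=(len(ACTIONS), 3 * 7))

def choose_action(state_grid):
    """Greedy action from Q-values for a 3x7 grid of observed speeds."""
    q = W @ state_grid.ravel()        # one Q-value per action
    return ACTIONS[int(np.argmax(q))]

state = rng.uniform(0, 80, size=(3, 7))   # toy observation (speeds in mph)
print(choose_action(state))
```

In the actual challenge these weights would be trained (e.g. with Q-learning) against the simulated traffic; here they are random, so the point is only the state-to-action mapping.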

🗣 @AI_Python_Arxiv
✴️ @AI_Python_EN
Do Neural Networks Need To Think Like Humans?

#neuralnetwork


https://www.youtube.com/watch?v=YFL-MI5xzgg

✴️ @AI_Python_EN
'How neural networks learn' - Part III: The learning dynamics behind generalization and overfitting:


https://www.youtube.com/watch?v=pFWiauHOFpY

#neuralnetwork

✴️ @AI_Python_EN
NODE - Neural Ordinary Differential Equations

This was recently presented as a new approach at NeurIPS.

The idea?
Instead of specifying a discrete sequence of hidden layers, they parameterized the derivative of the hidden state using a neural network. The output of the network is computed using a black-box differential equation solver.
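The idea can be sketched in a few lines. This is a minimal toy illustration under my own assumptions (an untrained 2-dimensional hidden state and a random tanh MLP), not the authors' code: the layer stack is replaced by a parameterized derivative dh/dt = f(h, t; theta), and the "forward pass" is one call to a black-box ODE solver.

```python
import numpy as np
from scipy.integrate import solve_ivp

rng = np.random.default_rng(0)
D = 2                                          # hidden-state dimension (assumed)
W1 = rng.normal(scale=0.5, size=(8, D + 1))    # toy MLP weights, untrained
W2 = rng.normal(scale=0.5, size=(D, 8))

def f(t, h):
    """Neural network parameterizing the hidden-state dynamics dh/dt."""
    x = np.concatenate([h, [t]])               # condition on time t as well
    return W2 @ np.tanh(W1 @ x)

h0 = np.array([1.0, 0.0])                      # "input layer" = initial state
sol = solve_ivp(f, t_span=(0.0, 1.0), y0=h0, rtol=1e-6, atol=1e-8)
h1 = sol.y[:, -1]                              # "output layer" = state at t = 1
print(h1.shape)                                # (2,)
```

Training then backpropagates through the solver (the paper uses the adjoint method rather than differentiating the solver's internals), which is what makes the solver truly black-box.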

They also propose CNF - continuous normalizing flows: a generative model that can be trained by maximum likelihood without partitioning or ordering the data dimensions. For training, they show how to scalably backpropagate through any ODE solver without access to its internal operations. This allows end-to-end training of ODEs within larger models.

Paper: https://lnkd.in/ddMJQAS
#Github: Example implementations coming soon to our repository
#neuralnetwork #deeplearning #machinelearning

✴️ @AI_Python_EN