Neural Networks | Нейронные сети
11.6K subscribers
742 photos
162 videos
170 files
9.4K links
All about machine learning

For all questions: @notxxx1

№ 4959169263
Deep Learning 2019 - Image classification

🎥 Lesson 1: Deep Learning 2019 - Image classification
👁 1 view · 6012 sec.
Note: please view this using the video player at http://course.fast.ai, instead of viewing on YouTube directly, to ensure you have the latest information. If you have questions, see if your question already has an answer by searching http://forums.fast.ai, and then post there if required.

The key outcome of lesson 1 is that we'll have trained an image classifier which can recognize pet breeds at state-of-the-art accuracy. The key to this success is the use of *transfer learning*, which will be a key platfo
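The course builds this classifier with the fastai library; below is a rough, generic sketch of the same transfer-learning idea using plain torchvision instead. The 37-class head matches the Oxford-IIIT Pet breeds; everything else is illustrative, not the lesson's actual code.

```python
# Not the fast.ai course code: a minimal transfer-learning sketch with
# torchvision, assuming a 37-class pet-breed dataset (as in Oxford-IIIT Pet).
import torch
import torch.nn as nn
from torchvision import models

NUM_BREEDS = 37  # assumption: Oxford-IIIT Pet has 37 breeds

# Start from an ImageNet-pretrained backbone and freeze its weights
# (newer torchvision versions use the `weights=` argument instead).
backbone = models.resnet34(pretrained=True)
for param in backbone.parameters():
    param.requires_grad = False

# Replace the final classification layer with one sized for our breeds;
# only this new head is trained at first.
backbone.fc = nn.Linear(backbone.fc.in_features, NUM_BREEDS)

optimizer = torch.optim.Adam(backbone.fc.parameters(), lr=1e-3)
criterion = nn.CrossEntropyLoss()

def train_step(images, labels):
    """One training step on a (hypothetical) batch of images and labels."""
    optimizer.zero_grad()
    loss = criterion(backbone(images), labels)
    loss.backward()
    optimizer.step()
    return loss.item()
```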
​Fast Simulation with Generative Adversarial Networks

🔗 Fast Simulation with Generative Adversarial Networks
In this video, Dr. Sofia Vallecorsa from openlab at CERN presents: Fast Simulation with Generative Adversarial Networks. "This talk presents an approach base...
​Understand the Impact of Learning Rate on Model Performance With Deep Learning Neural Networks

🔗 Understand the Impact of Learning Rate on Model Performance With Deep Learning Neural Networks
Deep learning neural networks are trained using the stochastic gradient descent optimization algorithm. The learning rate is a hyperparameter that controls how much to change the model in response to the estimated error each time the model weights are updated. Choosing the learning rate is challenging as a value too small may result in a …
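As a toy illustration of the point above (not taken from the article), here is plain gradient descent on a one-dimensional quadratic, showing how a too-small rate crawls, a moderate rate converges, and a too-large rate diverges.

```python
# Toy example: gradient descent on f(w) = (w - 3)^2 with different
# learning rates. The minimum is at w = 3.
def gradient(w):
    return 2.0 * (w - 3.0)

def run_sgd(lr, steps=20, w=0.0):
    for _ in range(steps):
        w = w - lr * gradient(w)   # the update the learning rate controls
    return w

for lr in (0.01, 0.1, 1.1):
    print(f"lr={lr}: w after 20 steps = {run_sgd(lr):.4f}")
# lr=0.01 creeps slowly toward w=3, lr=0.1 converges quickly,
# and lr=1.1 overshoots and diverges.
```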
​How to do Bayesian hyper-parameter tuning on a blackbox model
Optimization of arbitrary functions on Cloud ML Engine

🔗 How to do Bayesian hyper-parameter tuning on a blackbox model
Optimization of arbitrary functions on Cloud ML Engine
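The article uses Cloud ML Engine's built-in Bayesian optimizer; as a rough stand-in, the sketch below does the same kind of blackbox tuning locally with scikit-optimize's `gp_minimize`. The objective and search bounds are invented for illustration.

```python
# Stand-in for Cloud ML Engine's Bayesian tuner (not the article's code):
# scikit-optimize fits a Gaussian-process surrogate to past trials and
# proposes the next hyperparameters to evaluate.
from skopt import gp_minimize
from skopt.space import Real

def blackbox(params):
    """Hypothetical objective: train a model with these hyperparameters
    and return the validation loss. Faked here with a simple bowl."""
    lr, dropout = params
    return (lr - 0.01) ** 2 * 1e4 + (dropout - 0.3) ** 2

result = gp_minimize(
    blackbox,
    dimensions=[Real(1e-4, 1e-1, prior="log-uniform", name="lr"),
                Real(0.0, 0.5, name="dropout")],
    n_calls=25,          # total number of (expensive) evaluations
    random_state=0,
)
print("best hyperparameters:", result.x, "best loss:", result.fun)
```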
​Deep Face Recognition: A Survey
https://arxiv.org/abs/1804.06655

🔗 Deep Face Recognition: A Survey
Deep learning applies multiple processing layers to learn representations of data with multiple levels of feature extraction. This emerging technique has reshaped the research landscape of face recognition since 2014, launched by the breakthroughs of Deepface and DeepID methods. Since then, deep face recognition (FR) technique, which leverages the hierarchical architecture to learn discriminative face representation, has dramatically improved the state-of-the-art performance and fostered numerous successful real-world applications. In this paper, we provide a comprehensive survey of the recent developments on deep FR, covering the broad topics on algorithms, data, and scenes. First, we summarize different network architectures and loss functions proposed in the rapid evolution of the deep FR methods. Second, the related face processing methods are categorized into two classes: `one-to-many augmentation' and `many-to-one normalization'. Then, we summarize and compare the commonly used databases for bot
🎥 TensorFlow Docker vs. Compile from Source: Which Performs Better using NVIDIA RTX 2080 Ti
👁 1 view · 862 sec.
In this blog post, we examine and compare both methods of deploying the TensorFlow deep learning framework. We ran standard performance benchmarks on popular neural network models using synthetic data, one by one, and compared the results side by side.

Benchmark scripts:
https://github.com/tensorflow/benchmarks/tree/master/scripts/tf_cnn_benchmarks
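The benchmark script is normally launched from the command line; the snippet below is one illustrative way to drive it from Python, and the flag values are examples rather than the exact settings used in the video (flag names can also differ between versions of the tensorflow/benchmarks repo).

```python
# Illustrative launch of tf_cnn_benchmarks.py via subprocess; paths and
# flags are assumptions, not the video's exact configuration.
import subprocess

subprocess.run([
    "python", "tf_cnn_benchmarks.py",
    "--num_gpus=1",
    "--batch_size=64",
    "--model=resnet50",
    "--use_fp16=True",   # mixed precision is where the RTX 2080 Ti shines
], check=True, cwd="benchmarks/scripts/tf_cnn_benchmarks")  # assumes repo cloned as ./benchmarks
```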
​Zero-shot transfer across 93 languages: Open-sourcing enhanced LASER library



To accelerate the transfer of natural language processing (NLP) applications to many more languages, we have significantly expanded and enhanced our LASER (Language-Agnostic SEntence Representations) toolkit. We are now open-sourcing our work, making LASER the first successful exploration of massively multilingual sentence representations to be shared publicly with the NLP community. The toolkit now works with more than 90 languages, written in 28 different alphabets. LASER achieves these results by embedding all languages jointly in a single shared space (rather than having a separate model for each). We are now making the multilingual encoder and PyTorch code freely available, along with a multilingual test set for more than 100 languages.

🔗 LASER natural language processing toolkit - Facebook Code
Our natural language processing toolkit, LASER, performs zero-shot cross-lingual transfer with more than 90 languages and is now open source.
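The practical payoff of a single shared space is that a sentence and its translation land near each other, which is what enables zero-shot cross-lingual transfer. Below is a small sketch of that idea using the third-party `laserembeddings` wrapper (an assumption here, not the official Facebook release) to compare an English sentence with its Russian translation.

```python
# Sketch using the third-party `laserembeddings` package to illustrate the
# shared multilingual space; install it and download its models first
# (this is not the official LASER repo).
import numpy as np
from laserembeddings import Laser

laser = Laser()

# A sentence and its translation should land close together, because
# LASER embeds all languages into one joint vector space.
en_vec = laser.embed_sentences(["The cat sits on the mat."], lang="en")[0]
ru_vec = laser.embed_sentences(["Кот сидит на коврике."], lang="ru")[0]

cos = np.dot(en_vec, ru_vec) / (np.linalg.norm(en_vec) * np.linalg.norm(ru_vec))
print("cross-lingual cosine similarity:", cos)
```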
🎥 2019, Installing TensorFlow, Keras, & Python 3.7 in Windows
👁 1 view · 1254 sec.
Updated for 2019! This video walks you through a complete Python 3.7 and TensorFlow install. You will be shown the difference between Anaconda and Miniconda, and how to create a 3.6 environment inside of Anaconda for TensorFlow. Also discusses some of the ramifications coming with TensorFlow 2.0.

You can find the instructions here (from the video): https://github.com/jeffheaton/t81_558_deep_learning/blob/master/t81_558_class01_intro_python.ipynb

Please subscribe and comment!

Follow me:

YouTube: http
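Once the environment from the video is set up, a quick sanity check (not from the video itself) is to import TensorFlow and confirm the versions and GPU visibility; this targets the 1.x API the 2019 video uses.

```python
# Post-install sanity check, written against the TensorFlow 1.x API that
# the 2019 video targets (is_gpu_available is deprecated in 2.x).
import tensorflow as tf

print("TensorFlow version:", tf.__version__)
print("Keras API version:", tf.keras.__version__)
print("GPU available:", tf.test.is_gpu_available())
```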
🎥 L1/4 Linear Algebra
👁 2 views · 1162 sec.
Dive into Deep Learning
UC Berkeley, STAT 157

Slides are at
http://courses.d2l.ai
The book is at
http://www.d2l.ai

Linear Algebra
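For reference, a few of the basic operations such a lecture covers, written with NumPy purely for illustration (the d2l course uses its own framework code, not this snippet).

```python
# Basic linear-algebra operations illustrated with NumPy.
import numpy as np

x = np.array([1.0, 2.0, 3.0])
y = np.array([4.0, 5.0, 6.0])
A = np.arange(6.0).reshape(2, 3)

print(np.dot(x, y))        # dot product: 1*4 + 2*5 + 3*6 = 32
print(A @ x)               # matrix-vector product, shape (2,)
print(np.linalg.norm(x))   # Euclidean (L2) norm of x
print(A.T)                 # transpose, shape (3, 2)
```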
2019-01-26 Ilya Siganov. CycleGAN, or turning people into anime.

🎥 2019-01-26 Ilya Siganov. CycleGAN, or turning people into anime.
👁 2 views · 2940 sec.
“Generate cats from rough sketches? No? Maybe repaint all the horses as zebras? Still not enough? Hmm, how about turning winter into summer? OR MAYBE TURNING PEOPLE INTO ANIME???

They say today's neural networks can do that and more! But how did this become possible at all? Well, the story goes that one day the folks from a Berkeley research lab went on the show "Pimp My Neural Net", where they were advised to embed one GAN model inside another GAN model. And after that…

In this lecture, I will explain how CycleGAN works and what
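The core trick behind CycleGAN is training two generators, one per direction, with a cycle-consistency loss so that translating to the other domain and back reproduces the input. Below is a minimal PyTorch sketch of just that loss, with toy MLP generators rather than the paper's architecture, and with the adversarial losses omitted.

```python
# Minimal sketch of CycleGAN's cycle-consistency idea (not the talk's code).
import torch
import torch.nn as nn

def tiny_generator(dim=64):
    """Toy stand-in for a CycleGAN generator."""
    return nn.Sequential(nn.Linear(dim, dim), nn.ReLU(), nn.Linear(dim, dim))

G = tiny_generator()   # maps domain X -> Y (e.g., photo -> anime)
F = tiny_generator()   # maps domain Y -> X (anime -> photo)
l1 = nn.L1Loss()

def cycle_consistency_loss(x, y):
    # Translating to the other domain and back should reproduce the input:
    # F(G(x)) ~ x and G(F(y)) ~ y. This is what lets CycleGAN learn from
    # unpaired data (the adversarial losses are omitted here).
    return l1(F(G(x)), x) + l1(G(F(y)), y)

x = torch.randn(8, 64)   # a batch from domain X
y = torch.randn(8, 64)   # a batch from domain Y
print(cycle_consistency_loss(x, y).item())
```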