Neural Networks | Нейронные сети
Everything about machine learning

For all inquiries: @notxxx1

🔥 Lectures on Big Data

1 - BigData. Introduction to machine learning
2 - BigData. Python
3 - BigData. What is BigData?
4 - BigData. OLAP. What and why
5 - BigData. IoT and BigData
6 - BigData. Challenges of classification
7 - BigData. Formal Concept Analysis
8 - BigData. Regression
9 - BigData. Storage and analysis of big data
10 - BigData. Deep learning

🎥 1 - BigData. Introduction to machine learning
👁 603 views, 1960 sec.
Lecture 1 - Introduction to machine learning.
The lecture explains what is meant by the term "machine learning" and what kinds of tasks are solved...


🎥 2 - BigData. Python
👁 651 views, 8499 sec.
Lecture 2 - Python as a data analysis language.
The lecture gives a short overview of languages and tools for data analysis and covers the basic syntax of the...


🎥 3 - BigData. What is BigData?
👁 235 views, 3792 sec.
Lecture 3 - What is BigData?
The lecture explains what it actually is: the goals, the problems, and the practical value of the results of BD analysis, illustrated by the example...


🎥 4 - BigData. OLAP. What and why
👁 184 views, 5766 sec.
Lecture 4 - OLAP. What and why. Lightning talk.
The lecture describes OLAP: what it is, what it is for, and how it differs from OLTP. A short excursion into data analysis...


🎥 5 - BigData. IoT and BigData
👁 97 views, 4183 sec.
Lecture 5 - IoT and BigData
The lecture covers IoT and BigData: where they intersect, where they are applied, the main problems and the methods for solving them. Lambd...


🎥 6 - BigData. Challenges of classification
👁 82 views, 3923 sec.
Lecture 6 - Challenges of classification
The Internet is growing at a tremendous rate. The amount of information presented is beyond human comprehen...


🎥 7 - BigData. Formal Concept Analysis
👁 71 views, 6046 sec.
Lecture 7 - Formal Concept Analysis
This lecture explains where formal concept analysis came from, what it is used for, and what...


🎥 8 - BigData. Regression
👁 89 views, 4118 sec.
Lecture 8 - Regression
The lecture presents the regression task using the classic example of predicting house prices in Silicon Valley. It also considers...
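
To make the example concrete, here is a minimal regression sketch with scikit-learn; the synthetic area/price data, the single feature, and the numbers are illustrative assumptions, not material from the lecture.

# Minimal linear-regression sketch: predict house price from living area (synthetic data).
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
area = rng.uniform(50, 250, size=(200, 1))               # living area in m^2 (synthetic)
price = 3000 * area[:, 0] + rng.normal(0, 20000, 200)    # noisy linear price (synthetic)

X_train, X_test, y_train, y_test = train_test_split(area, price, random_state=0)
model = LinearRegression().fit(X_train, y_train)
print("R^2 on held-out data:", model.score(X_test, y_test))
print("Predicted price for 120 m^2:", model.predict([[120.0]])[0])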


🎥 9 - BigData. Storage and analysis of big data
👁 113 views, 8210 sec.
Lecture 9 - Storage and analysis of big data
The lecture answers questions such as: what is big data, where does it come from, how to store it, ...


🎥 10 - BigData. Deep learning
👁 104 views, 5703 sec.
Published: Feb 19, 2016
Lecture 10 - Deep learning: neural networks and their applications.
The lecture covers the history of the emergence and development of...
Deep Learning 2019 - Image classification

🎥 Lesson 1: Deep Learning 2019 - Image classification
👁 1 view, 6012 sec.
Note: please view this using the video player at http://course.fast.ai, instead of viewing on YouTube directly, to ensure you have the latest information. If you have questions, see if your question already has an answer by searching http://forums.fast.ai, and then post there if required.

The key outcome of lesson 1 is that we'll have trained an image classifier which can recognize pet breeds at state of the art accuracy. The key to this success is the use of *transfer learning*, which will be a key platfo
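
For reference, a condensed sketch of that transfer-learning workflow in the fastai v1 API used by the 2019 course (a pretrained ResNet-34 fine-tuned on the Oxford-IIIT Pets dataset); the exact calls and the label regex are best-effort assumptions and may differ from the actual lesson notebook.

from fastai.vision import *
from fastai.metrics import error_rate

# Download the Oxford-IIIT Pets dataset; breed labels are encoded in the file names.
path = untar_data(URLs.PETS)
fnames = get_image_files(path/'images')

data = ImageDataBunch.from_name_re(
    path/'images', fnames, r'/([^/]+)_\d+.jpg$',
    ds_tfms=get_transforms(), size=224, bs=64).normalize(imagenet_stats)

# Transfer learning: start from an ImageNet-pretrained ResNet-34 and fine-tune on pets.
learn = cnn_learner(data, models.resnet34, metrics=error_rate)
learn.fit_one_cycle(4)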
Fast Simulation with Generative Adversarial Networks

🔗 Fast Simulation with Generative Adversarial Networks
In this video, Dr. Sofia Vallecorsa from openlab at CERN presents: Fast Simulation with Generative Adversarial Networks. "This talk presents an approach base...
Understand the Impact of Learning Rate on Model Performance With Deep Learning Neural Networks

🔗 Understand the Impact of Learning Rate on Model Performance With Deep Learning Neural Networks
Deep learning neural networks are trained using the stochastic gradient descent optimization algorithm. The learning rate is a hyperparameter that controls how much to change the model in response to the estimated error each time the model weights are updated. Choosing the learning rate is challenging as a value too small may result in a …
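
A small illustration of that point, assuming TensorFlow 2.x / tf.keras: train the same toy network with several SGD learning rates and compare the final loss. The data and model are placeholders, not from the article.

import numpy as np
from tensorflow import keras

# Toy binary-classification data (placeholder for a real dataset).
X = np.random.rand(1000, 20)
y = (X.sum(axis=1) > 10).astype("float32")

for lr in (1.0, 0.1, 0.01, 0.001):
    model = keras.Sequential([
        keras.layers.Dense(32, activation="relu", input_shape=(20,)),
        keras.layers.Dense(1, activation="sigmoid"),
    ])
    model.compile(optimizer=keras.optimizers.SGD(learning_rate=lr),
                  loss="binary_crossentropy", metrics=["accuracy"])
    hist = model.fit(X, y, epochs=10, batch_size=32, verbose=0)
    # Too large a rate tends to diverge or oscillate; too small converges very slowly.
    print(f"lr={lr}: final loss={hist.history['loss'][-1]:.4f}")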
How to do Bayesian hyper-parameter tuning on a blackbox model
Optimization of arbitrary functions on Cloud ML Engine

🔗 How to do Bayesian hyper-parameter tuning on a blackbox model
Optimization of arbitrary functions on Cloud ML Engine
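
The post itself targets Cloud ML Engine's built-in hyper-parameter tuning service; as a local stand-in for the same idea, here is a sketch of Bayesian optimization of a black-box objective with scikit-optimize. The objective function and search space below are illustrative assumptions.

from skopt import gp_minimize
from skopt.space import Real

def objective(params):
    # Stand-in for an expensive black-box run (e.g. training a model and
    # returning its validation loss); we only observe inputs and outputs.
    learning_rate, dropout = params
    return (learning_rate - 0.01) ** 2 + (dropout - 0.3) ** 2

space = [
    Real(1e-4, 1e-1, prior="log-uniform", name="learning_rate"),
    Real(0.0, 0.5, name="dropout"),
]

# A Gaussian-process surrogate plus an acquisition function choose where to evaluate next.
result = gp_minimize(objective, space, n_calls=25, random_state=0)
print("best params:", result.x, "best value:", result.fun)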
Deep Face Recognition: A Survey
https://arxiv.org/abs/1804.06655

🔗 Deep Face Recognition: A Survey
Deep learning applies multiple processing layers to learn representations of data with multiple levels of feature extraction. This emerging technique has reshaped the research landscape of face recognition since 2014, launched by the breakthroughs of Deepface and DeepID methods. Since then, deep face recognition (FR) technique, which leverages the hierarchical architecture to learn discriminative face representation, has dramatically improved the state-of-the-art performance and fostered numerous successful real-world applications. In this paper, we provide a comprehensive survey of the recent developments on deep FR, covering the broad topics on algorithms, data, and scenes. First, we summarize different network architectures and loss functions proposed in the rapid evolution of the deep FR methods. Second, the related face processing methods are categorized into two classes: 'one-to-many augmentation' and 'many-to-one normalization'. Then, we summarize and compare the commonly used databases for both...
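
As a tiny illustration of the face-verification setting the survey discusses, the sketch below compares two face embeddings by cosine similarity; the random 512-d vectors stand in for the output of a trained deep FR network, and the threshold is hypothetical.

import numpy as np

def cosine_similarity(a, b):
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

# Placeholders for embeddings produced by a trained face-recognition network.
emb_a = np.random.randn(512)
emb_b = np.random.randn(512)

THRESHOLD = 0.5  # in practice, tuned on a labeled verification set
print("same person:", cosine_similarity(emb_a, emb_b) > THRESHOLD)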
🎥 TensorFlow Docker vs. Compile from Source: Which Performs Better using NVIDIA RTX 2080 Ti
👁 1 view, 862 sec.
In this blog post, we examine and compare both methods of deploying the TensorFlow deep learning framework. We ran standard performance benchmarks on popular neural network models using synthetic data, one by one, and compared the results side by side.

Benchmark scripts:
https://github.com/tensorflow/benchmarks/tree/master/scripts/tf_cnn_benchmarks
Zero-shot transfer across 93 languages: Open-sourcing enhanced LASER library



To accelerate the transfer of natural language processing (NLP) applications to many more languages, we have significantly expanded and enhanced our LASER (Language-Agnostic SEntence Representations) toolkit. We are now open-sourcing our work, making LASER the first successful exploration of massively multilingual sentence representations to be shared publicly with the NLP community. The toolkit now works with more than 90 languages, written in 28 different alphabets. LASER achieves these results by embedding all languages jointly in a single shared space (rather than having a separate model for each). We are now making the multilingual encoder and PyTorch code freely available, along with a multilingual test set for more than 100 languages.

🔗 LASER natural language processing toolkit - Facebook Code
Our natural language processing toolkit, LASER, performs zero-shot cross-lingual transfer with more than 90 languages and is now open source.
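
A minimal usage sketch, assuming the community laserembeddings package from PyPI (a wrapper around the official facebookresearch/LASER PyTorch code) with its pretrained models already downloaded:

from laserembeddings import Laser

laser = Laser()  # assumes models were fetched first, e.g. via: python -m laserembeddings download-models

sentences = [
    "Machine learning is useful.",   # English
    "Машинное обучение полезно.",    # Russian
]
# lang is used only for tokenization; all sentences are embedded into the same 1024-d space.
embeddings = laser.embed_sentences(sentences, lang=["en", "ru"])
print(embeddings.shape)  # (2, 1024)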
🎥 2019, Installing TensorFlow, Keras, & Python 3.7 in Windows
👁 1 view, 1254 sec.
Updated for 2019! This video walks you through a complete Python 3.7 and TensorFlow install. You will be shown the difference between Anaconda and Miniconda, and how to create a 3.6 environment inside of Anaconda for TensorFlow. Also discusses some of the ramifications coming with TensorFlow 2.0.

You can find the instructions here (from the video): https://github.com/jeffheaton/t81_558_deep_learning/blob/master/t81_558_class01_intro_python.ipynb

Please subscribe and comment!

Follow me:

YouTube: http
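
After following the video, a quick sanity check to run inside the new environment; this assumes standalone Keras installed next to TensorFlow 1.x, where tf.test.is_gpu_available() is still the usual GPU check.

import sys
import tensorflow as tf
import keras

print("Python:", sys.version)
print("TensorFlow:", tf.__version__)
print("Keras:", keras.__version__)
print("GPU available:", tf.test.is_gpu_available())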
🎥 L1/4 Linear Algebra
👁 2 views, 1162 sec.
Dive into Deep Learning
UC Berkeley, STAT 157

Slides are at
http://courses.d2l.ai
The book is at
http://www.d2l.ai

Linear Algebra
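
A few of the basic operations the lecture covers, written in NumPy for quick experimentation (the d2l.ai book itself uses MXNet ndarrays, but the operations are essentially the same):

import numpy as np

x = np.array([1.0, 2.0, 3.0])        # a vector
A = np.arange(12.0).reshape(3, 4)    # a 3x4 matrix

print(np.dot(x, x))                  # dot product (squared L2 norm of x)
print(np.linalg.norm(x))             # L2 norm
print(A.T)                           # transpose, 4x3
print(A.T @ A)                       # matrix-matrix product, 4x4
print(A.sum(axis=0))                 # column sums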