https://habr.com/ru/company/hsespb/blog/437402/
How I Taught a Robot to Run Using YouTube Videos
#machinelearning #neuralnets #deeplearning #машинноеобучение
Our Telegram channel: https://t.me/ai_machinelearning_big_data
Habr
We continue our series on joint research projects by our students and JetBrains Research. In this article we discuss the deep reinforcement learning algorithms used for...
Understanding Machine Learning on Point Clouds through PointNet++
https://towardsdatascience.com/understanding-machine-learning-on-point-clouds-through-pointnet-f8f3f2d53cc3
🔗 Understanding Machine Learning on Point Clouds through PointNet++
Point clouds are a convenient way of representing spatial data and other unordered data. But what are they, and how are they used in ML?
Medium
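The unordered nature of point clouds is exactly what PointNet-style models are built around. A rough NumPy sketch of the core trick (my own toy illustration, not PointNet++ itself): apply a shared per-point transformation, then aggregate with a symmetric function such as max pooling, so the result does not depend on point order.

```python
import numpy as np

# Illustrative sketch (not PointNet++ itself): the core trick in PointNet-style
# models is a symmetric aggregation (here, max pooling) over per-point features,
# which makes the output invariant to the ordering of points in the cloud.

rng = np.random.default_rng(0)

def point_features(points, W, b):
    """Apply a shared per-point 'MLP' (one linear layer + ReLU) to every point."""
    return np.maximum(points @ W + b, 0.0)          # shape (N, F)

def cloud_embedding(points, W, b):
    """Aggregate per-point features with a symmetric function (max pool)."""
    return point_features(points, W, b).max(axis=0)  # shape (F,)

# A toy cloud of 5 points in 3-D, and random weights for the shared layer.
cloud = rng.normal(size=(5, 3))
W = rng.normal(size=(3, 8))
b = rng.normal(size=8)

emb = cloud_embedding(cloud, W, b)
shuffled = cloud[rng.permutation(len(cloud))]
emb_shuffled = cloud_embedding(shuffled, W, b)

# Reordering the points leaves the embedding unchanged.
print(np.allclose(emb, emb_shuffled))  # True
```

The real models stack several such layers and (in PointNet++) apply them hierarchically to local neighborhoods, but the permutation-invariant pooling is the same.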
Deep Learning 2019 - Image classification
🎥 Lesson 1: Deep Learning 2019 - Image classification
👁 1 view ⏳ 6012 sec.
Note: please view this using the video player at http://course.fast.ai, instead of viewing on YouTube directly, to ensure you have the latest information. If you have questions, see if your question already has an answer by searching http://forums.fast.ai, and then post there if required.
The key outcome of lesson 1 is that we'll have trained an image classifier which can recognize pet breeds at state of the art accuracy. The key to this success is the use of *transfer learning*, which will be a key platfo
Vk
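The *transfer learning* mentioned above boils down to reusing a frozen pretrained feature extractor and training only a small new head. A toy NumPy sketch of that idea (illustrative only; the lesson itself uses fastai, and the random "backbone" below is a stand-in for a real pretrained network):

```python
import numpy as np

# Transfer-learning sketch under assumed toy data: the "backbone" stays frozen
# and only a small logistic-regression head is trained on its features.

rng = np.random.default_rng(42)

# Stand-in for a frozen pretrained feature extractor: a fixed random projection.
W_backbone = rng.normal(size=(20, 8))
def backbone(x):
    return np.maximum(x @ W_backbone, 0.0)  # frozen: never updated

# Toy binary task whose labels are linear in the backbone's feature space.
X = rng.normal(size=(200, 20))
true_w = rng.normal(size=8)
y = (backbone(X) @ true_w > 0).astype(float)

# Train only the head (logistic regression on the frozen features).
w, b, lr = np.zeros(8), 0.0, 0.1
feats = backbone(X)
for _ in range(300):
    p = 1.0 / (1.0 + np.exp(-(feats @ w + b)))
    w -= lr * feats.T @ (p - y) / len(y)
    b -= lr * np.mean(p - y)

acc = np.mean((1.0 / (1.0 + np.exp(-(feats @ w + b))) > 0.5) == y)
print(f"head-only training accuracy: {acc:.2f}")
```

Because only the tiny head is trained, this needs far less data and compute than training the whole network, which is why the lesson can reach strong pet-breed accuracy so quickly.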
Lesson 4: Deep Learning 2019 - NLP; Tabular data; Collaborative filtering; Embeddings
🔗 Lesson 4: Deep Learning 2019 - NLP; Tabular data; Collaborative filtering; Embeddings
In lesson 4 we'll dive in to *natural language processing* (NLP), using the IMDb movie review dataset. In this task, our goal is to predict whether a movie r...
YouTube
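The collaborative-filtering-with-embeddings part of the lesson can be sketched in a few lines (a toy illustration with made-up ratings, not the course's code): each user and each item gets a small learned vector, and a rating is predicted as their dot product.

```python
import numpy as np

# Matrix-factorization sketch of collaborative filtering: learn user and item
# embeddings by stochastic gradient descent on observed (user, item, rating)
# triples. The ratings below are invented for illustration.

rng = np.random.default_rng(0)
n_users, n_items, dim = 6, 5, 3

ratings = [(0, 0, 5), (0, 1, 4), (1, 0, 4), (1, 2, 1),
           (2, 2, 5), (2, 3, 4), (3, 1, 2), (4, 4, 5), (5, 3, 1)]

U = 0.1 * rng.normal(size=(n_users, dim))   # user embeddings
V = 0.1 * rng.normal(size=(n_items, dim))   # item embeddings

lr = 0.05
for _ in range(2000):
    for u, i, r in ratings:
        err = U[u] @ V[i] - r               # prediction error for this rating
        gu, gv = err * V[i], err * U[u]
        U[u] -= lr * gu
        V[i] -= lr * gv

mse = np.mean([(U[u] @ V[i] - r) ** 2 for u, i, r in ratings])
print(f"training MSE: {mse:.4f}")
```

A production system would add per-user and per-item bias terms and regularization; the dot-product-of-embeddings core is the same.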
Fast Simulation with Generative Adversarial Networks
🔗 Fast Simulation with Generative Adversarial Networks
In this video, Dr. Sofia Vallecorsa from openlab at CERN presents: Fast Simulation with Generative Adversarial Networks. "This talk presents an approach base...
YouTube
In Browser AI - neural networks for everyone - Kamila Stepniowska, Piotr Migdał
🔗 In Browser AI - neural networks for everyone - Kamila Stepniowska, Piotr Migdał
PyData Warsaw 2018. Let's talk about In Browser AI - an open-source educational project that brings together Python & JavaScript. Bring deep learning demos of you...
YouTube
Lesson 6: Deep Learning 2019 - Regularization; Convolutions; Data ethics
🔗 Lesson 6: Deep Learning 2019 - Regularization; Convolutions; Data ethics
Today we discuss some powerful techniques for improving training and avoiding over-fitting: - *Dropout*: remove activations at random during training in orde...
YouTube
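The dropout technique described above fits in a few lines of NumPy (this is the standard "inverted dropout" formulation, my own illustration rather than the course's code):

```python
import numpy as np

# Inverted dropout: during training each activation is zeroed with probability
# p, and the survivors are scaled by 1/(1-p) so the expected activation is
# unchanged; at test time the layer is simply the identity.

rng = np.random.default_rng(1)

def dropout(x, p, train=True):
    if not train:
        return x
    mask = rng.random(x.shape) >= p      # keep each unit with probability 1-p
    return x * mask / (1.0 - p)          # inverted scaling

acts = np.ones(10000)
dropped = dropout(acts, p=0.5)

print(f"fraction zeroed: {np.mean(dropped == 0):.2f}")   # ≈ 0.50
print(f"mean activation: {dropped.mean():.2f}")          # ≈ 1.00
```

The random zeroing prevents units from co-adapting, which is why it acts as a regularizer.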
Lesson 7: Deep Learning 2019 - Resnets from scratch; U-net; Generative (adversarial) networks
🔗 Lesson 7: Deep Learning 2019 - Resnets from scratch; U-net; Generative (adversarial) networks
In the final lesson of Practical Deep Learning for Coders we'll study one of the most important techniques in modern architectures: the *skip connection*. Th...
YouTube
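The *skip connection* highlighted in the lesson can be sketched minimally (an illustration, not the course's code): a residual block computes a transformation f(x) and adds the input back, so the block only has to learn a correction to the identity.

```python
import numpy as np

# Minimal residual block: out = x + f(x), where f is a small two-layer
# transformation. With all weights at zero the block is exactly the identity,
# which is why very deep stacks of such blocks remain easy to optimize.

rng = np.random.default_rng(7)

W1 = 0.1 * rng.normal(size=(8, 8))
W2 = 0.1 * rng.normal(size=(8, 8))

def residual_block(x):
    h = np.maximum(x @ W1, 0.0)   # the learned transformation f(x)
    return x + h @ W2             # skip connection: add the input back

x = rng.normal(size=(4, 8))
out = residual_block(x)
print(out.shape)  # (4, 8)
```

This is the building block of ResNets and, reused across resolutions, of the U-net architecture the same lesson covers.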
Understand the Impact of Learning Rate on Model Performance With Deep Learning Neural Networks
🔗 Understand the Impact of Learning Rate on Model Performance With Deep Learning Neural Networks
Deep learning neural networks are trained using the stochastic gradient descent optimization algorithm. The learning rate is a hyperparameter that controls how much to change the model in response to the estimated error each time the model weights are updated. Choosing the learning rate is challenging as a value too small may result in a …
MachineLearningMastery.com
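The trade-off the article describes is easy to see on the simplest possible objective. A tiny gradient-descent illustration on f(w) = w² (whose minimum is w = 0): too small a rate crawls, a moderate rate converges quickly, and too large a rate diverges.

```python
# Plain gradient descent on f(w) = w**2 with different learning rates.
def descend(lr, steps=50, w0=10.0):
    w = w0
    for _ in range(steps):
        w -= lr * 2 * w          # gradient of w**2 is 2w
    return w

for lr in (0.001, 0.1, 1.1):
    # 0.001: barely moves; 0.1: converges; 1.1: overshoots and blows up.
    print(f"lr={lr}: w after 50 steps = {descend(lr):.3g}")
```

Real loss surfaces are not quadratic, but the same qualitative behavior is what learning-rate schedules and warm-up try to manage.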
How to do Bayesian hyper-parameter tuning on a blackbox model
Optimization of arbitrary functions on Cloud ML Engine
🔗 How to do Bayesian hyper-parameter tuning on a blackbox model
Optimization of arbitrary functions on Cloud ML Engine
Towards Data Science
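The core idea behind Bayesian hyper-parameter tuning is a surrogate model: evaluate the expensive black box at a few points, fit a cheap model to those observations, and query the surrogate's optimum instead of searching blindly. A stripped-down sketch (illustrative only; Cloud ML Engine's service uses a Gaussian-process surrogate with acquisition functions, not this toy, and `blackbox` below is an invented stand-in for a training run):

```python
import numpy as np

# Surrogate-based optimization sketch: fit a cheap quadratic model to a few
# expensive evaluations, then propose the surrogate's minimizer next.

def blackbox(lr):
    """Pretend this is an expensive training run returning validation loss."""
    return (np.log10(lr) + 2.0) ** 2 + 0.1   # best loss at lr = 1e-2

# A coarse initial design over learning rates.
tried = np.array([1e-4, 1e-3, 1e-1])
losses = np.array([blackbox(lr) for lr in tried])

# Cheap surrogate: a quadratic fit in log10(lr) space.
a, b, _ = np.polyfit(np.log10(tried), losses, deg=2)
next_log_lr = -b / (2 * a)                   # minimizer of the quadratic
print(f"surrogate suggests lr ≈ {10 ** next_log_lr:.4f}")
```

A real Bayesian optimizer also models its uncertainty and trades off exploring uncertain regions against exploiting the current best guess; the fit-then-propose loop is the same.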
Deep Face Recognition: A Survey
https://arxiv.org/abs/1804.06655
🔗 Deep Face Recognition: A Survey
Deep learning applies multiple processing layers to learn representations of data with multiple levels of feature extraction. This emerging technique has reshaped the research landscape of face recognition since 2014, launched by the breakthroughs of Deepface and DeepID methods. Since then, deep face recognition (FR) technique, which leverages the hierarchical architecture to learn discriminative face representation, has dramatically improved the state-of-the-art performance and fostered numerous successful real-world applications. In this paper, we provide a comprehensive survey of the recent developments on deep FR, covering the broad topics on algorithms, data, and scenes. First, we summarize different network architectures and loss functions proposed in the rapid evolution of the deep FR methods. Second, the related face processing methods are categorized into two classes: `one-to-many augmentation' and `many-to-one normalization'. Then, we summarize and compare the commonly used databases for bot
arXiv.org
🎥 TensorFlow Docker vs. Compile from Source: Which Performs Better using NVIDIA RTX 2080 Ti
👁 1 view ⏳ 862 sec.
In this blog post, we examine and compare both methods of deploying the TensorFlow deep learning framework. We ran standard performance benchmarks on popular neural network models using synthetic data, one by one, and compared the results side by side.
Benchmark scripts:
https://github.com/tensorflow/benchmarks/tree/master/scripts/tf_cnn_benchmarks
Vk
Zero-shot transfer across 93 languages: Open-sourcing enhanced LASER library
To accelerate the transfer of natural language processing (NLP) applications to many more languages, we have significantly expanded and enhanced our LASER (Language-Agnostic SEntence Representations) toolkit. We are now open-sourcing our work, making LASER the first successful exploration of massively multilingual sentence representations to be shared publicly with the NLP community. The toolkit now works with more than 90 languages, written in 28 different alphabets. LASER achieves these results by embedding all languages jointly in a single shared space (rather than having a separate model for each). We are now making the multilingual encoder and PyTorch code freely available, along with a multilingual test set for more than 100 languages.
🔗 LASER natural language processing toolkit - Facebook Code
Our natural language processing toolkit, LASER, performs zero-shot cross-lingual transfer with more than 90 languages and is now open source.
Engineering at Meta
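What a "single shared embedding space" buys you is that cross-lingual tasks reduce to vector geometry. A toy sketch with mock vectors (these are invented numbers, not real LASER embeddings): sentences that mean the same thing land near each other, so translation retrieval is just nearest-neighbor search by cosine similarity.

```python
import numpy as np

# Hypothetical sentence embeddings in a shared multilingual space; each vector
# is made up for illustration. In LASER these would come from the encoder.
en = {"the cat sleeps": np.array([0.9, 0.1, 0.0]),
      "it is raining":  np.array([0.0, 0.8, 0.6])}
fr = {"le chat dort":   np.array([0.88, 0.12, 0.02]),
      "il pleut":       np.array([0.05, 0.82, 0.55])}

def cosine(a, b):
    return a @ b / (np.linalg.norm(a) * np.linalg.norm(b))

# For each English sentence, retrieve the closest French one.
for text, vec in en.items():
    best = max(fr, key=lambda s: cosine(vec, fr[s]))
    print(f"{text!r} -> {best!r}")
```

Because the space is shared across all 90+ languages, the same nearest-neighbor trick supports zero-shot transfer: a classifier trained on English embeddings can be applied to embeddings of any other supported language.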
🎥 2019, Installing TensorFlow, Keras, & Python 3.7 in Windows
👁 1 view ⏳ 1254 sec.
Updated for 2019! This video walks you through a complete Python 3.7 and TensorFlow install. You will be shown the difference between Anaconda and Miniconda, and how to create a 3.6 environment inside of Anaconda for TensorFlow. Also discusses some of the ramifications coming with TensorFlow 2.0.
You can find the instructions here (from the video): https://github.com/jeffheaton/t81_558_deep_learning/blob/master/t81_558_class01_intro_python.ipynb
Please subscribe and comment!
Follow me:
YouTube: http
Vk
Learn Machine learning with Python | Random Forest | Part 4 | Eduonix
🔗 Learn Machine learning with Python | Random Forest | Part 4 | Eduonix
In the last video, we went in detail about interpreting models and how to visualize them and a revision on decision trees. Here in this video, you will learn...
YouTube
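The two ingredients behind a random forest can be sketched from scratch (an illustration, not the video's code or scikit-learn): each member is trained on a bootstrap resample of the data, and the ensemble predicts by majority vote. For brevity each "tree" here is a one-split decision stump on invented toy data.

```python
import numpy as np

rng = np.random.default_rng(0)

def fit_stump(X, y):
    """Exhaustively pick the (feature, threshold, polarity) with fewest errors."""
    best, best_err = None, 2.0
    for f in range(X.shape[1]):
        for t in np.unique(X[:, f]):
            for pol in (0, 1):
                pred = ((X[:, f] > t) == pol).astype(int)
                err = np.mean(pred != y)
                if err < best_err:
                    best, best_err = (f, t, pol), err
    return best

def stump_predict(stump, X):
    f, t, pol = stump
    return ((X[:, f] > t) == pol).astype(int)

# Toy data: the label depends only on feature 0; feature 1 is pure noise.
X = rng.normal(size=(100, 2))
y = (X[:, 0] > 0).astype(int)

# Bagging: each stump is trained on a bootstrap resample of the data.
forest = []
for _ in range(25):
    idx = rng.integers(0, len(X), size=len(X))
    forest.append(fit_stump(X[idx], y[idx]))

# The forest predicts by majority vote over its members.
votes = np.mean([stump_predict(s, X) for s in forest], axis=0)
acc = np.mean((votes > 0.5) == y)
print(f"forest training accuracy: {acc:.2f}")
```

A real random forest grows full decision trees and additionally samples a random subset of features at each split; scikit-learn's `RandomForestClassifier` handles all of that.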
🎥 L1/4 Linear Algebra
👁 2 views ⏳ 1162 sec.
Dive into Deep Learning
UC Berkeley, STAT 157
Slides are at
http://courses.d2l.ai
The book is at
http://www.d2l.ai
Linear Algebra
Vk
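A few of the basic operations such a lecture typically covers, in NumPy (an illustrative warm-up of my own, not the course's notebook):

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [3.0, 4.0]])
x = np.array([1.0, -1.0])

print(A @ x)                 # matrix-vector product -> [-1. -1.]
print(A.T)                   # transpose
print(np.linalg.norm(x))     # Euclidean norm -> 1.414...
print(np.trace(A))           # sum of the diagonal -> 5.0
```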
Google Researchers Have a New Alternative to Traditional Neural Networks
Say hello to the capsule network.
https://www.technologyreview.com/the-download/609297/google-researchers-have-a-new-alternative-to-traditional-neural-networks/
🔗 Google Researchers Have a New Alternative to Traditional Neural Networks
Say hello to the capsule network.
MIT Technology Review
Say hello to the capsule network. AI has enjoyed huge growth in the past few years, and much of that success is owed to deep neural networks, which provide the smarts behind impressive tricks like image recognition. But there is growing concern that some…
2019-01-26 Ilya Siganov. CycleGAN, or Turning People into Anime.
🎥 2019-01-26 Ilya Siganov. CycleGAN, or Turning People into Anime.
👁 2 views ⏳ 2940 sec.
“Generating cats from crooked sketches? No? Maybe repainting all the horses into zebras? Not enough? Hmm, what about turning winter into summer? OR MAYBE TURNING PEOPLE INTO ANIME???
They say today's neural networks can do that and more! But how did this become possible at all? The thing is, one day the folks from the Berkeley research lab turned to the show «Pimp My Neural Net», where they were advised to embed one GAN model inside another GAN model. And after that it all…..
In this lecture I will tell you how CycleGAN works and what
Vk
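The trick that makes CycleGAN's unpaired translation work is the cycle-consistency loss. A toy sketch (illustrative; the real generators are convolutional networks, here they are linear maps): G maps domain X to Y and F maps back, and the loss penalizes F(G(x)) drifting away from x (and G(F(y)) from y), forcing the two GANs to learn mutually inverse translations.

```python
import numpy as np

# Cycle-consistency loss with toy linear "generators". When F really inverts
# G, the loss is ~0; a mismatched backward map is penalized.

rng = np.random.default_rng(3)

A = rng.normal(size=(4, 4))
G = lambda x: x @ A                       # toy generator X -> Y
F = lambda y: y @ np.linalg.inv(A)        # toy generator Y -> X (exact inverse)

x = rng.normal(size=(8, 4))
y = rng.normal(size=(8, 4))

# L_cyc = E||F(G(x)) - x||_1 + E||G(F(y)) - y||_1
cycle_loss = np.mean(np.abs(F(G(x)) - x)) + np.mean(np.abs(G(F(y)) - y))
print(f"cycle-consistency loss: {cycle_loss:.2e}")   # ~0: the maps invert each other

# A backward map that does NOT invert G breaks the cycle and scores badly.
F_bad = lambda y: y @ np.linalg.inv(A + np.eye(4))
bad_loss = np.mean(np.abs(F_bad(G(x)) - x)) + np.mean(np.abs(G(F_bad(y)) - y))
print(f"with a mismatched inverse: {bad_loss:.2f}")
```

In the actual model this term is added to the two adversarial losses, which is what lets it learn horse↔zebra or photo↔anime mappings without any paired examples.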