Neural Networks | Нейронные сети
All about machine learning

For all inquiries: @notxxx1

Facebook has released #PyText, a new framework on top of #PyTorch.

This framework is built to make it easier for developers to build #NLP models.

Link: https://code.fb.com/ai-research/pytext-open-source-nlp-framework/

🔗 Open-sourcing PyText for faster NLP development
We are open-sourcing PyText, a framework for natural language processing. PyText is built on PyTorch and it makes it faster and easier to build deep learning models for NLP.
#ai #pytorch #deeplearning
PyTorch CNN Weights - Learnable Parameters in Neural Networks

https://www.youtube.com/watch?v=stWU37L91Yc

🎥 PyTorch CNN Weights - Learnable Parameters in Neural Networks
In this post, we'll be exploring the inner workings of PyTorch, introducing more OOP concepts, convolutional and linear layer weight tensors, matrix multiplication for deep learning, and more!

Check out the corresponding blog and other resources for this video at:
http://deeplizard.com/learn/video/stWU37L91Yc

❤️🦎 Special thanks to the following polymaths of the deeplizard hivemind:
George Tohme

Support collective intelligence, and join the deeplizard hivemind:
http://deeplizard.com/hivemind

Code:
https:
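As a quick illustration (not the code from the video), here is a minimal sketch of a small PyTorch CNN and how to inspect its learnable parameters; the architecture and layer sizes are assumptions for a 28x28 single-channel input:

```python
import torch
import torch.nn as nn

class Network(nn.Module):
    """Small CNN for 28x28 single-channel images (sizes are illustrative assumptions)."""
    def __init__(self):
        super().__init__()
        self.conv1 = nn.Conv2d(in_channels=1, out_channels=6, kernel_size=5)
        self.conv2 = nn.Conv2d(in_channels=6, out_channels=12, kernel_size=5)
        self.fc1 = nn.Linear(in_features=12 * 4 * 4, out_features=120)
        self.out = nn.Linear(in_features=120, out_features=10)

    def forward(self, x):
        x = torch.max_pool2d(torch.relu(self.conv1(x)), kernel_size=2)
        x = torch.max_pool2d(torch.relu(self.conv2(x)), kernel_size=2)
        x = torch.relu(self.fc1(x.flatten(start_dim=1)))
        return self.out(x)

net = Network()

# Conv weights are rank-4 tensors (out_channels, in_channels, kH, kW);
# linear weights are rank-2 tensors (out_features, in_features).
print(net.conv1.weight.shape)   # torch.Size([6, 1, 5, 5])
print(net.fc1.weight.shape)     # torch.Size([120, 192])

# All learnable parameters are exposed via named_parameters().
for name, param in net.named_parameters():
    print(name, tuple(param.shape))
```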
​Implementation of character based convolutional neural network

A #PyTorch implementation of character-based ConvNets for text classification, published by Yann LeCun and co-authors in 2015, has been open-sourced. Many training features and hacks are implemented.

Link: https://github.com/ahmedbesbes/character-based-cnn

🔗 ahmedbesbes/character-based-cnn
Implementation of character based convolutional neural network - ahmedbesbes/character-based-cnn
#PyTorch #MachineLearning
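A minimal sketch of what such a character-level CNN can look like in PyTorch; the alphabet size, channel width, and text length below are illustrative assumptions, not the repo's exact configuration:

```python
import torch
import torch.nn as nn

class CharCNN(nn.Module):
    """Minimal character-level ConvNet for text classification (hyperparameters are assumptions)."""
    def __init__(self, n_chars=70, n_classes=4, channels=256):
        super().__init__()
        self.conv = nn.Sequential(
            nn.Conv1d(n_chars, channels, kernel_size=7), nn.ReLU(),
            nn.MaxPool1d(3),
            nn.Conv1d(channels, channels, kernel_size=7), nn.ReLU(),
            nn.MaxPool1d(3),
            nn.Conv1d(channels, channels, kernel_size=3), nn.ReLU(),
            nn.AdaptiveMaxPool1d(1),           # global max pooling over character positions
        )
        self.fc = nn.Linear(channels, n_classes)

    def forward(self, x):
        # x: one-hot encoded characters, shape (batch, n_chars, max_len)
        return self.fc(self.conv(x).squeeze(-1))

model = CharCNN()
logits = model(torch.zeros(8, 70, 1014))   # dummy batch of 8 texts, 1014 characters each
print(logits.shape)                         # torch.Size([8, 4])
```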
PyTorch Machine Learning Tutorial - Machine Learning with Python and PyTorch

Our Telegram channel: https://tele.click/ai_machinelearning_big_data

https://www.youtube.com/watch?v=TB-G1KqRb5o

🎥 PyTorch Machine Learning Tutorial - Machine Learning with Python and PyTorch
Master Deep Learning and Computer Vision with PyTorch - Full Course on sale for $10! (normally $200): https://www.udemy.com/pytorch-for-deep-learning-and-computer-vision/learn/v4/?couponCode=RAYPYTORCHYOUTUBE09

PyTorch has rapidly become one of the most transformative frameworks in the field of Deep Learning. Since its release, PyTorch has completely changed the landscape in the field of deep learning due to its flexibility, and how easy it is to use when building Deep Learning models.

Rayan Slim's channel
🎥 Deep Learning with PyTorch Workshop - Mar 20 2019
Event link: https://www.meetup.com/dsnet-blr/events/260057993/

Code links:
1. PyTorch Basics: https://jvn.io/aakashns/e5cfe043873f4f3c9287507016747ae5

2. Linear Regression:
https://jvn.io/aakashns/e556978bda9343f3b30b3a9fd2a25012

3. Logistic Regression:
https://jvn.io/aakashns/a1b40b04f5174a18bd05b17e3dffb0f0

For questions and discussions, join our Slack Group at http://dsindia.org, and then go to the #pytorch-workshop channel.
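In the spirit of the linear regression notebook (not the workshop's exact code), a minimal PyTorch linear regression on toy data:

```python
import torch
import torch.nn as nn

# Toy data: y = 3x + 2 plus noise (made up for illustration).
x = torch.linspace(-1, 1, 100).unsqueeze(1)
y = 3 * x + 2 + 0.1 * torch.randn_like(x)

model = nn.Linear(1, 1)                        # single weight + bias
opt = torch.optim.SGD(model.parameters(), lr=0.1)
loss_fn = nn.MSELoss()

for epoch in range(200):
    opt.zero_grad()
    loss = loss_fn(model(x), y)                # forward pass + loss
    loss.backward()                            # gradients via autograd
    opt.step()                                 # gradient descent update

print(model.weight.item(), model.bias.item())  # should approach 3 and 2
```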
🎥 Deep Q learning is Easy in PyTorch (Tutorial)
Deep Q Learning w/ Pytorch: https://youtu.be/RfNxXlO6BiA
Where to find data for Deep Learning: https://youtu.be/9oW3WfKk6d4

#DeepQLearning #PyTorch #ReinforcementLearning

In this tutorial you will code up the simplest possible deep Q-network in PyTorch. We'll also correct some minor errors from previous videos, which were rather subtle.

You'll see just how easy it is to implement a deep Q-network in PyTorch and beat the lunar lander environment. The agent goes from crashing on the lunar surface to landing.
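A minimal sketch of the two core pieces, a Q-network and epsilon-greedy action selection, sized for LunarLander's 8-dimensional observations and 4 discrete actions; this is an illustrative sketch, not the tutorial's exact code:

```python
import random
import torch
import torch.nn as nn

class DeepQNetwork(nn.Module):
    """Simple MLP Q-network: maps a state to one Q-value per action."""
    def __init__(self, obs_dim=8, n_actions=4, hidden=256):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(obs_dim, hidden), nn.ReLU(),
            nn.Linear(hidden, hidden), nn.ReLU(),
            nn.Linear(hidden, n_actions),
        )

    def forward(self, state):
        return self.net(state)

def choose_action(q_net, state, epsilon, n_actions=4):
    # Epsilon-greedy: explore with probability epsilon, otherwise
    # pick the action with the highest predicted Q-value.
    if random.random() < epsilon:
        return random.randrange(n_actions)
    with torch.no_grad():
        return q_net(torch.as_tensor(state, dtype=torch.float32)).argmax().item()
```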
Applied Deep Learning with #PyTorch - Full Course

https://www.youtube.com/watch?v=CNuI8OWsppg

🎥 Applied Deep Learning with PyTorch - Full Course
In this course you will learn the key concepts behind deep learning and how to apply the concepts to a real-life project using PyTorch and Python.

You'll learn the following:
⌨️ RNNs and LSTMs
⌨️ Sequence Modeling
⌨️ PyTorch
⌨️ Building a Chatbot in PyTorch

⭐️Requirements ⭐️
⌨️ Some Basic High School Mathematics
⌨️ Some Basic Programming Knowledge
⌨️ Some basic Knowledge about Neural Networks

⭐️Contents ⭐️
⌨️ (0:00:08) Recurrent Neural Networks - RNNs and LSTMs
⌨️ (0:35:54) Sequence-To-Sequence Models
⌨️
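As a taste of the RNN/LSTM material (not the course's code), a minimal LSTM sequence classifier in PyTorch; all sizes are illustrative assumptions:

```python
import torch
import torch.nn as nn

class LSTMClassifier(nn.Module):
    """Tiny LSTM sequence model; vocabulary and layer sizes are assumptions."""
    def __init__(self, vocab_size=10_000, embed_dim=128, hidden=256, n_classes=2):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim)
        self.lstm = nn.LSTM(embed_dim, hidden, batch_first=True)
        self.fc = nn.Linear(hidden, n_classes)

    def forward(self, tokens):
        # tokens: (batch, seq_len) integer token ids
        emb = self.embed(tokens)
        _, (h_n, _) = self.lstm(emb)      # h_n: (1, batch, hidden)
        return self.fc(h_n[-1])           # classify from the final hidden state

model = LSTMClassifier()
print(model(torch.randint(0, 10_000, (4, 20))).shape)  # torch.Size([4, 2])
```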
​BoTorch: Programmable Bayesian Optimization in PyTorch
Balandat et al.: https://arxiv.org/abs/1910.06403
Code: https://github.com/pytorch/botorch
#MachineLearning #Bayesian #PyTorch

🔗 BoTorch: Programmable Bayesian Optimization in PyTorch
Bayesian optimization provides sample-efficient global optimization for a broad range of applications, including automatic machine learning, molecular chemistry, and experimental design. We introduce BoTorch, a modern programming framework for Bayesian optimization. Enabled by Monte-Carlo (MC) acquisition functions and auto-differentiation, BoTorch's modular design facilitates flexible specification and optimization of probabilistic models written in PyTorch, radically simplifying implementation of novel acquisition functions. Our MC approach is made practical by a distinctive algorithmic foundation that leverages fast predictive distributions and hardware acceleration. In experiments, we demonstrate the improved sample efficiency of BoTorch relative to other popular libraries. BoTorch is open source and available at https://github.com/pytorch/botorch.
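A basic BoTorch loop looks roughly like this: fit a GP surrogate, build an acquisition function, and optimize it. This is a sketch based on the library's documented API (recent releases; older versions used fit_gpytorch_model), with a made-up toy objective:

```python
import torch
from botorch.models import SingleTaskGP
from botorch.fit import fit_gpytorch_mll
from botorch.acquisition import ExpectedImprovement
from botorch.optim import optimize_acqf
from gpytorch.mlls import ExactMarginalLogLikelihood

# Toy 1D objective on [0, 1] (made up for illustration).
train_X = torch.rand(10, 1, dtype=torch.double)
train_Y = -((train_X - 0.6) ** 2) + 0.05 * torch.randn_like(train_X)

# Fit a GP surrogate to the observations.
gp = SingleTaskGP(train_X, train_Y)
mll = ExactMarginalLogLikelihood(gp.likelihood, gp)
fit_gpytorch_mll(mll)

# Maximize Expected Improvement to pick the next point to evaluate.
acqf = ExpectedImprovement(gp, best_f=train_Y.max())
candidate, acq_value = optimize_acqf(
    acqf,
    bounds=torch.tensor([[0.0], [1.0]], dtype=torch.double),
    q=1, num_restarts=5, raw_samples=32,
)
print(candidate)
```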
​Reinforcement Learning Course from OpenAI

Reinforcement Learning becoming significant part of the data scientist toolbox.
OpenAI created and published one of the best courses in #RL. Algorithms implementation written in #Tensorflow.
But if you are more comfortable with #PyTorch, we have found #PyTorch implementation of this algs

OpenAI Course: https://spinningup.openai.com/en/latest/
Tensorflow Code: https://github.com/openai/spinningup
PyTorch Code: https://github.com/kashif/firedup

🔗 Welcome to Spinning Up in Deep RL! — Spinning Up documentation
We just released our #NeurIPS2019 Multimodal Model-Agnostic Meta-Learning (MMAML) code for learning few-shot image classification, which extends MAML to multimodal task distributions (e.g. learning from multiple datasets). The code contains #PyTorch implementations of our model and two baselines (MAML and Multi-MAML), as well as the scripts to evaluate these models on five popular few-shot learning datasets: Omniglot, Mini-ImageNet, FC100 (CIFAR100), CUB-200-2011, and FGVC-Aircraft.

Code: https://github.com/shaohua0116/MMAML-Classification

Paper: https://arxiv.org/abs/1910.13616

#NeurIPS #MachineLearning #ML #code

🔗 shaohua0116/MMAML-Classification
An official PyTorch implementation of “Multimodal Model-Agnostic Meta-Learning via Task-Aware Modulation” (NeurIPS 2019) by Risto Vuorio*, Shao-Hua Sun*, Hexiang Hu, and Joseph J. Lim - shaohua0116...
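For context, a toy sketch of the plain MAML inner/outer loop (MAML is one of the baselines in the repo), not the MMAML modulation code; the model, task distribution, and learning rates are illustrative assumptions:

```python
import torch
import torch.nn.functional as F

def forward(params, x):
    w, b = params
    return x @ w + b

def inner_adapt(params, support_x, support_y, inner_lr=0.4, steps=1):
    # Gradient steps on the support set; create_graph=True lets the
    # outer loop differentiate through the adaptation (second-order MAML).
    for _ in range(steps):
        loss = F.mse_loss(forward(params, support_x), support_y)
        grads = torch.autograd.grad(loss, params, create_graph=True)
        params = [p - inner_lr * g for p, g in zip(params, grads)]
    return params

w = torch.zeros(1, 1, requires_grad=True)
b = torch.zeros(1, requires_grad=True)
meta_opt = torch.optim.Adam([w, b], lr=1e-2)

for step in range(500):
    # Toy task distribution: regress y = a * x for a random slope a.
    a = torch.randn(1)
    x_s, x_q = torch.randn(10, 1), torch.randn(10, 1)
    y_s, y_q = a * x_s, a * x_q

    adapted = inner_adapt([w, b], x_s, y_s)
    meta_loss = F.mse_loss(forward(adapted, x_q), y_q)  # query-set loss after adaptation

    meta_opt.zero_grad()
    meta_loss.backward()
    meta_opt.step()
```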
PyTorch: An Imperative Style, High-Performance Deep Learning Library

Paszke et al.: https://arxiv.org/abs/1912.01703
#ArtificialIntelligence #deepLearning #PyTorch

🔗 PyTorch: An Imperative Style, High-Performance Deep Learning Library
Deep learning frameworks have often focused on either usability or speed, but not both. PyTorch is a machine learning library that shows that these two goals are in fact compatible: it provides an imperative and Pythonic programming style that supports code as a model, makes debugging easy and is consistent with other popular scientific computing libraries, while remaining efficient and supporting hardware accelerators such as GPUs. In this paper, we detail the principles that drove the implementation of PyTorch and how they are reflected in its architecture. We emphasize that every aspect of PyTorch is a regular Python program under the full control of its user. We also explain how the careful and pragmatic implementation of the key components of its runtime enables them to work together to achieve compelling performance. We demonstrate the efficiency of individual subsystems, as well as the overall speed of PyTorch on several common benchmarks.
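A trivial example of the imperative, define-by-run style the paper describes: the model is ordinary Python code, and autograd records operations as they run (a generic illustration, not taken from the paper):

```python
import torch

# Define-by-run: the autograd graph is built as the operations execute,
# so ordinary Python control flow works inside the model.
x = torch.randn(3, requires_grad=True)
y = x.sin().sum() if x.mean() > 0 else (x ** 2).sum()
y.backward()
print(x.grad)

# The same code runs on a GPU if one is available.
if torch.cuda.is_available():
    x = torch.randn(3, device="cuda", requires_grad=True)
```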
Data Science / Machine Learning / AI / Big Data (VK)

PyTorch Geometric Temporal: Spatiotemporal Signal Processing with Neural Machine Learning Models
Rozemberczki et al.: https://arxiv.org/abs/2104.07788
#MachineLearning #ArtificialIntelligence #PyTorch
Forwarded from Machinelearning
📌 PyTorch: new tools for saving memory when training models.

PyTorch has introduced improved Activation Checkpointing (AC) techniques aimed at reducing memory consumption during training.

The traditional eager-mode approach stores intermediate activations for the backward pass, which often consumes a significant amount of memory. AC skips storing these tensors and recomputes them when needed, trading extra compute time for memory savings.

The new technique is Selective Activation Checkpoint (SAC). Unlike plain AC, which applies to the entire selected region, SAC gives granular control over which operations are recomputed and which are saved. This is achieved with a policy_fn that decides whether the result of a particular operation should be stored. SAC is useful for avoiding recomputation of expensive operations such as matrix multiplications.

For torch.compile, a Memory Budget API is now available. It automatically applies SAC with an optimal policy based on a user-specified memory budget (from 0 to 1). A budget of 0 corresponds to plain AC, while 1 corresponds to the default torch.compile behavior.

🔜 Read the detailed article on the PyTorch blog


@ai_machinelearning_big_data

#AI #ML #Pytorch
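The standard AC primitive this builds on is torch.utils.checkpoint; a minimal sketch of plain AC is below. The SAC policy_fn and the torch.compile memory-budget knob described above are layered on top of it; see the linked blog post for their exact API (not shown here).

```python
import torch
import torch.nn as nn
from torch.utils.checkpoint import checkpoint

# Plain activation checkpointing: activations inside `block` are not kept
# for the backward pass; they are recomputed during backward instead,
# trading compute time for memory.
block = nn.Sequential(nn.Linear(1024, 4096), nn.ReLU(), nn.Linear(4096, 1024))
head = nn.Linear(1024, 10)

x = torch.randn(32, 1024, requires_grad=True)
h = checkpoint(block, x, use_reentrant=False)  # recompute instead of storing
loss = head(h).sum()
loss.backward()
```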