Neural Networks | Нейронные сети
Everything about machine learning

For all questions - @notxxx1

Deep Learning with Python: Develop Deep Learning Models on Theano and TensorFlow Using Keras
#book #keras #DL

📝 5_6133943928459624650.pdf - 💾5 709 397
​Deep Learning From Scratch - Seth Weidman | ODSC East 2019

🔗 Deep Learning From Scratch - Seth Weidman | ODSC East 2019
There are many good tutorials on neural networks out there. Some dive deep into the code and show how to implement things, while others explain what is going on via diagrams or math, but very few bring all the concepts needed to understand neural networks together, showing diagrams, code, and math side by side. In this video, you will get a clear, step-by-step explanation of neural networks, implemented from scratch in NumPy, with diagrams that explain how they work presented alongside the math.
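The from-scratch approach the talk describes can be sketched in a few dozen lines. Below is a minimal two-layer network trained on XOR in NumPy, with the forward and backward passes written out by hand; the layer sizes, learning rate, and iteration count are illustrative choices, not taken from the talk.

```python
import numpy as np

# Tiny two-layer network (tanh hidden layer, sigmoid output) trained
# on XOR with hand-written backpropagation.
rng = np.random.default_rng(0)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

W1 = rng.normal(size=(2, 8)); b1 = np.zeros(8)
W2 = rng.normal(size=(8, 1)); b2 = np.zeros(1)
lr = 0.5

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

for _ in range(5000):
    # Forward pass.
    h = np.tanh(X @ W1 + b1)
    p = sigmoid(h @ W2 + b2)
    # Backward pass: gradient of binary cross-entropy w.r.t. the logits.
    dp = (p - y) / len(X)
    dW2 = h.T @ dp; db2 = dp.sum(0)
    dh = dp @ W2.T * (1 - h**2)   # tanh derivative
    dW1 = X.T @ dh; db1 = dh.sum(0)
    # Gradient descent step.
    W1 -= lr * dW1; b1 -= lr * db1
    W2 -= lr * dW2; b2 -= lr * db2

preds = (p > 0.5).astype(int)
```

After training, `preds` reproduces the XOR truth table.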
​Sparse Networks from Scratch: Faster Training without Losing Performance

🔗 Sparse Networks from Scratch: Faster Training without Losing Performance
We demonstrate the possibility of what we call sparse learning: accelerated training of deep neural networks that maintain sparse weights throughout training while achieving performance levels competitive with dense networks. We accomplish this by developing sparse momentum, an algorithm which uses exponentially smoothed gradients (momentum) to identify layers and weights which reduce the error efficiently. Sparse momentum redistributes pruned weights across layers according to the mean momentum magnitude of each layer. Within a layer, sparse momentum grows weights according to the momentum magnitude of zero-valued weights. We demonstrate state-of-the-art sparse performance on MNIST, CIFAR-10, and ImageNet, decreasing the mean error by a relative 8%, 15%, and 6% compared to other sparse algorithms. Furthermore, we show that our algorithm can reliably find the equivalent of winning lottery tickets from random initialization: Our algorithm finds sparse configurations with 20% or fewer weights which perform as w
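The prune-and-regrow step described above can be sketched in a few lines of NumPy. This is a toy simplification, not the paper's implementation: it drops the smallest-magnitude active weights, then grows new weights at the zero-valued positions whose smoothed-gradient (momentum) magnitude is largest.

```python
import numpy as np

rng = np.random.default_rng(1)
W = rng.normal(size=100) * (rng.random(100) < 0.2)  # sparse weight vector
M = rng.normal(size=100)                            # momentum estimates

def prune_and_regrow(W, M, k):
    """One sparse-momentum-style redistribution step (toy sketch)."""
    W = W.copy()
    active = np.flatnonzero(W != 0)
    # Prune the k active weights with the smallest magnitude.
    pruned = active[np.argsort(np.abs(W[active]))[:k]]
    W[pruned] = 0.0
    # Regrow k weights where zero-valued positions have the largest |momentum|.
    zeros = np.flatnonzero(W == 0)
    grown = zeros[np.argsort(-np.abs(M[zeros]))[:k]]
    W[grown] = 1e-3 * np.sign(M[grown])             # small re-initialization
    return W

W2 = prune_and_regrow(W, M, k=5)
```

The step conserves the total number of nonzero weights, which is how sparse learning keeps a fixed parameter budget throughout training.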
🎥 Getting Started with TensorFlow 2.0 | SciPy 2019 Tutorial | Josh Gordon
👁 1 view · 7799 sec.
A hands-on introduction to training neural networks with TensorFlow 2.0. In this four hour tutorial, we will briefly introduce TensorFlow, then dive in to writing code. We will complete four exercises (classifying images, classifying text, training a GAN, etc). This tutorial is targeted at folks new to TensorFlow, and/or Deep Learning. Our goal is to help you get started efficiently and effectively, so you can continue learning on your own.

Tutorial information may be found at https://www.scipy2019.scipy.
🎥 Use Pre-Built Containers to Build Custom Deep Learning Models Quickly - AWS Online Tech Talks
👁 1 view · 2066 sec.
AWS Deep Learning Containers (AWS DL Containers) are Docker images pre-installed with TensorFlow and Apache MXNet deep learning frameworks to make it easy to deploy custom machine learning environments quickly by letting you skip the complicated process of building and optimizing your environments from scratch. In this tech talk, learn about how to deploy AWS DL Containers on Amazon SageMaker, Amazon Elastic Container Service for Kubernetes (Amazon EKS), self-managed Kubernetes, Amazon Elastic Container Ser
​Trending deep learning Github repositories

Our Telegram channel - tglink.me/ai_machinelearning_big_data
Here's a list of the top 100 trending deep learning GitHub repositories, sorted by the number of stars gained on a given day.

https://github.com/mbadry1/Trending-Deep-Learning

🔗 mbadry1/Trending-Deep-Learning
Top 100 trending deep learning repositories sorted by the number of stars gained on a specific day. - mbadry1/Trending-Deep-Learning
​Evaluation of Retinal Image Quality Assessment Networks in Different Color-spaces

Authors: Huazhu Fu, Boyang Wang, Jianbing Shen, Shanshan Cui, Yanwu Xu, Jiang Liu, Ling Shao

Abstract: …by its large-scale size, multi-level grading, and multi-modality. Then, we analyze the influences on RIQA of different color-spaces, and propose a simple yet efficient deep network, named Multiple Color-space Fusion Network (MCF-Net)
https://arxiv.org/abs/1907.05345

🔗 Evaluation of Retinal Image Quality Assessment Networks in Different Color-spaces
Retinal image quality assessment (RIQA) is essential for controlling the quality of retinal imaging and guaranteeing the reliability of diagnoses by ophthalmologists or automated analysis systems. Existing RIQA methods focus on the RGB color-space and are developed based on small datasets with binary quality labels (i.e., `Accept' and `Reject'). In this paper, we first re-annotate an Eye-Quality (EyeQ) dataset with 28,792 retinal images from the EyePACS dataset, based on a three-level quality grading system (i.e., `Good', `Usable' and `Reject') for evaluating RIQA methods. Our RIQA dataset is characterized by its large-scale size, multi-level grading, and multi-modality. Then, we analyze the influences on RIQA of different color-spaces, and propose a simple yet efficient deep network, named Multiple Color-space Fusion Network (MCF-Net), which integrates the different color-space representations at both a feature-level and prediction-level to predict image quality grades. Experiments on our
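The prediction-level half of the fusion idea can be illustrated with a toy sketch: each color-space branch emits class scores for the three quality grades, and the fused decision averages the per-branch softmax probabilities. The branch scores below are invented placeholders, not outputs of MCF-Net.

```python
import numpy as np

def softmax(z):
    e = np.exp(z - z.max())
    return e / e.sum()

# Hypothetical raw class scores from three color-space branches,
# for the grades (Good, Usable, Reject).
branch_scores = {
    "RGB": np.array([2.0, 0.5, -1.0]),
    "HSV": np.array([1.5, 1.0, -0.5]),
    "LAB": np.array([2.5, 0.0, -1.5]),
}

# Prediction-level fusion: average the per-branch probabilities.
fused = np.mean([softmax(s) for s in branch_scores.values()], axis=0)
grade = ["Good", "Usable", "Reject"][int(np.argmax(fused))]
```

Feature-level fusion would instead concatenate intermediate representations before the classifier; the paper's network combines both.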
🎥 Reusable Execution in Production Using Papermill (Google Cloud AI Huddle)
👁 1 view · 3611 sec.
In this episode of Google Cloud AI Huddle, Matthew Seal, Senior Software Engineer at Netflix, goes over the pros and cons of using Jupyter Notebook and Papermill to make notebook execution reusable in production.

Deep Learning VMs → https://goo.gle/2ZNmmrG

Google AI Huddle is an open, collaborative and developer-first AI forum driven by Google AI expertise. It’s a monthly in-person engagement where Googlers engage with developers to speak on ML topics, deliver workshops / tutorials, and hands-on labs. AI Huddl
​Playing Go using convolutional neural networks

http://arxiv.org/abs/1907.04658

🔗 Playing Go without Game Tree Search Using Convolutional Neural Networks
The game of Go has a long history in East Asian countries, but the field of Computer Go has yet to catch up to humans until the past couple of years. While the rules of Go are simple, the strategy and combinatorics of the game are immensely complex. Even within the past couple of years, new programs that rely on neural networks to evaluate board positions still explore many orders of magnitude more board positions per second than a professional can. We attempt to mimic human intuition in the game by creating a convolutional neural policy network which, without any sort of tree search, should play the game at or above the level of most humans. We introduce three structures and training methods that aim to create a strong Go player: non-rectangular convolutions, which will better learn the shapes on the board, supervised learning, training on a data set of 53,000 professional games, and reinforcement learning, training on games played between different versions of the network. Our network has already surpassed
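The non-rectangular convolution idea (sliding 1x3 and 3x1 kernels instead of a square one, so that row- and column-shaped stone patterns are detected separately) can be sketched in plain NumPy; the kernel values here are illustrative, not trained weights.

```python
import numpy as np

def correlate_valid(board, kernel):
    """Valid-mode 2D cross-correlation, written out explicitly."""
    kh, kw = kernel.shape
    H, W = board.shape
    out = np.zeros((H - kh + 1, W - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(board[i:i + kh, j:j + kw] * kernel)
    return out

board = np.zeros((5, 5))
board[2, 1:4] = 1.0               # a horizontal run of three stones
row_kernel = np.ones((1, 3))      # 1x3 detector for horizontal shapes
col_kernel = np.ones((3, 1))      # 3x1 detector for vertical shapes
row_resp = correlate_valid(board, row_kernel)
col_resp = correlate_valid(board, col_kernel)
```

The horizontal detector responds strongly (value 3) where the run of stones sits, while the vertical detector never sees more than one stone at a time; a policy network can learn many such shape-specific filters.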
Learning pandas [2019] Michael Heydt, Artem Gruzdev

The pandas library is a popular package for data analysis and processing in Python. It offers efficient, fast, high-performance data structures that make this work substantially easier.

This book introduces the extensive toolset the pandas library provides, starting with an overview of loading data from remote sources, performing numerical and statistical analysis, indexing, and aggregation, and finishing with data visualization and the analysis of financial information.

The second edition adds new appendices on data preprocessing and hyperparameter tuning, and on working with dates, strings, and warnings. Random forests, CatBoost gradient boosting, and logistic regression are covered in detail.

The book is intended for all Python developers interested in data processing.
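As a small taste of the workflow the book covers (loading data into a DataFrame, indexing, and aggregation), here is a minimal pandas sketch; the column names and values are invented for illustration.

```python
import pandas as pd

# A tiny table of yearly closing prices (made-up values).
df = pd.DataFrame({
    "ticker": ["AAPL", "AAPL", "MSFT", "MSFT"],
    "year":   [2018, 2019, 2018, 2019],
    "close":  [39.4, 73.4, 101.6, 157.7],
})

# Aggregation: mean closing price per ticker.
mean_close = df.groupby("ticker")["close"].mean()

# Indexing: select one ticker's rows via the index.
aapl = df.set_index("ticker").loc["AAPL"]
```

`groupby` plus an aggregation function and label-based indexing with `set_index`/`loc` are two of the core idioms the book builds on.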

📝 2019 Изучаем pandas. Майкл Хейдт, Артем Груздев.pdf - 💾21 816 454
​Detecting hidden and composite orders in layered models via machine learning

https://arxiv.org/abs/1907.05417

🔗 Detecting hidden and composite orders in layered models via machine learning
We use machine learning to study layered spin models where composite order parameters may emerge as a consequence of the interlayer coupling. We focus on the layered Ising and Ashkin-Teller models, determining their phase diagram via the application of a machine learning algorithm to the Monte Carlo data. Remarkably, our technique is able to correctly characterize all the system phases even in the case of hidden order parameters, i.e., order parameters whose expression in terms of the microscopic configurations would require additional preprocessing of the data fed to the algorithm. Within the approach we introduce, owing to the construction of convolutional neural networks, naturally suited to layered, image-like data with an arbitrary number of layers, no preprocessing of the Monte Carlo data is needed, even with regard to its spatial structure. The physical meaning of our results is discussed and compared with analytical data, where available. Yet, the method can be used without any \emph{a
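As a rough illustration of the kind of layered, image-like Monte Carlo data such a network consumes, here is a toy single-spin-flip Metropolis sampler for a two-layer Ising model with interlayer coupling; the couplings, temperature, and lattice size are arbitrary choices, not the paper's.

```python
import numpy as np

rng = np.random.default_rng(2)
L, J, K, beta = 8, 1.0, 0.5, 1.0
spins = rng.choice([-1, 1], size=(2, L, L))   # two coupled L x L layers

def local_field(s, l, i, j):
    # In-plane nearest neighbours (periodic boundaries)
    # plus the spin at the same site in the other layer.
    nn = (s[l, (i + 1) % L, j] + s[l, (i - 1) % L, j]
          + s[l, i, (j + 1) % L] + s[l, i, (j - 1) % L])
    return J * nn + K * s[1 - l, i, j]

# Single-spin-flip Metropolis updates.
for _ in range(2000):
    l, i, j = rng.integers(2), rng.integers(L), rng.integers(L)
    dE = 2.0 * spins[l, i, j] * local_field(spins, l, i, j)
    if dE <= 0 or rng.random() < np.exp(-beta * dE):
        spins[l, i, j] *= -1
```

Each sampled configuration is a `(2, L, L)` array, i.e., a two-channel image, which is exactly the shape a convolutional network ingests without any hand-built order-parameter preprocessing.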