Cutting Edge Deep Learning
📕 Deep learning
📗 Reinforcement learning
📘 Machine learning
📙 Papers - tools - tutorials

🔗 Other Social Media Handles:
https://linktr.ee/cedeeplearning
Meet the world's first glass-free foldable smartphone. Learn more at https://www.royole.com/flexpai
----------
@machinelearning_tuts
Another sneak preview of TensorFlow 2.0. This is what the new architecture will look like:
TensorFlow 2.0 will focus on simplicity and ease of use, featuring updates like:

Easy model building with Keras and eager execution.
Robust model deployment in production on any platform.
Powerful experimentation for research.
Simplifying the API by cleaning up deprecated APIs and reducing duplication.

1. tf.data will replace the queue runners
2. Easy model building with tf.keras and estimators
3. Run and debug with eager execution
4. Distributed training on either CPU, GPU or TPU
5. Export models to SavedModel and deploy them via TF Serving, TF Lite, TF.js, etc.

I really can't wait to try all of these new features out.
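
To make the list above concrete, here is a minimal sketch of that workflow in TensorFlow 2.x: tf.data for the input pipeline, tf.keras for model building, eager execution by default, and a SavedModel export at the end. The dataset, layer sizes, and hyperparameters are illustrative assumptions, not anything from the article.

```python
import numpy as np
import tensorflow as tf

# tf.data replaces the old queue runners for input pipelines.
x = np.random.rand(1000, 32).astype("float32")
y = np.random.randint(0, 10, size=(1000,))
dataset = tf.data.Dataset.from_tensor_slices((x, y)).shuffle(1000).batch(32)

# Easy model building with tf.keras; eager execution is on by default,
# so every op runs immediately and can be debugged like normal Python.
model = tf.keras.Sequential([
    tf.keras.layers.Dense(64, activation="relu", input_shape=(32,)),
    tf.keras.layers.Dense(10, activation="softmax"),
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
model.fit(dataset, epochs=2)

# Export in the SavedModel format for TF Serving, TF Lite, or TF.js.
model.save("exported_model")
```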

#deeplearning #machinelearning

Article: https://lnkd.in/drz7FyV
----------
@machinelearning_tuts
❇️ Machine learning glossary

#DataScience #MachineLearning
----------
@machinelearning_tuts
Constrained clustering, #python implementation
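
The post doesn't say which algorithm the linked implementation uses, so as a hedged illustration here is a minimal COP-KMeans-style sketch: ordinary k-means, except each assignment is skipped if it would violate a must-link or cannot-link constraint.

```python
import numpy as np

def violates(i, cluster, labels, must_link, cannot_link):
    """True if assigning point i to `cluster` breaks a constraint."""
    for a, b in must_link:
        if i in (a, b):
            other = b if i == a else a
            if labels[other] not in (-1, cluster):
                return True
    for a, b in cannot_link:
        if i in (a, b):
            other = b if i == a else a
            if labels[other] == cluster:
                return True
    return False

def cop_kmeans(X, k, must_link=(), cannot_link=(), n_iter=20, seed=0):
    rng = np.random.default_rng(seed)
    centers = X[rng.choice(len(X), size=k, replace=False)].copy()
    labels = np.full(len(X), -1)
    for _ in range(n_iter):
        labels[:] = -1
        for i, point in enumerate(X):
            # Try centroids from nearest to farthest, skipping violations.
            for c in np.argsort(((centers - point) ** 2).sum(axis=1)):
                if not violates(i, c, labels, must_link, cannot_link):
                    labels[i] = c
                    break
            else:
                raise ValueError(f"no feasible cluster for point {i}")
        for c in range(k):  # standard k-means centroid update
            if (labels == c).any():
                centers[c] = X[labels == c].mean(axis=0)
    return labels

# Toy usage: two blobs, plus one must-link constraint across the blobs.
X = np.vstack([np.random.rand(20, 2), np.random.rand(20, 2) + 3])
print(cop_kmeans(X, k=2, must_link=[(0, 25)]))
```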
----------
@machinelearning_tuts
Fortifying the future of cryptography

Vinod Vaikuntanathan aims to improve encryption in a world with growing applications and evolving adversaries.


January 16, 2019

@machinelearning_tuts
----------
Link : http://news.mit.edu//2019/faculty-vinod-vaikuntanathan-0116
A Unified Framework of Deep Neural Networks by Capsules

--Abstract

With the growth of deep learning, how to describe deep neural networks unifiedly is becoming an important issue. We first formalize neural networks mathematically with their directed graph representations, and prove a generation theorem about the induced networks of connected directed acyclic graphs. Then, we set up a unified framework for deep learning with capsule networks. This capsule framework could simplify the description of existing deep neural networks, and provide a theoretical basis of graphic designing and programming techniques for deep learning models, thus would be of great significance to the advancement of deep learning.


2018-05-09T14:23:17Z

@machinelearning_tuts
----------
Link : http://arxiv.org/abs/1805.03551v2
Deep Learning for Sentiment Analysis : A Survey

--Abstract

Deep learning has emerged as a powerful machine learning technique that learns multiple layers of representations or features of the data and produces state-of-the-art prediction results. Along with the success of deep learning in many other application domains, deep learning is also popularly used in sentiment analysis in recent years. This paper first gives an overview of deep learning and then provides a comprehensive survey of its current applications in sentiment analysis.
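
The survey covers many architectures; as a hedged, minimal illustration of the basic setup (not any specific model from the paper), here is a tiny Keras sentiment classifier on stand-in data: an embedding layer feeding a pooled dense classifier.

```python
import numpy as np
import tensorflow as tf

vocab_size, seq_len = 1000, 20

# Stand-in corpus: random token ids with random binary sentiment labels.
x = np.random.randint(1, vocab_size, size=(500, seq_len))
y = np.random.randint(0, 2, size=(500,))

model = tf.keras.Sequential([
    tf.keras.layers.Embedding(vocab_size, 16),  # learned word representations
    tf.keras.layers.GlobalAveragePooling1D(),   # pool token embeddings
    tf.keras.layers.Dense(16, activation="relu"),
    tf.keras.layers.Dense(1, activation="sigmoid"),  # P(positive)
])
model.compile(optimizer="adam", loss="binary_crossentropy",
              metrics=["accuracy"])
model.fit(x, y, epochs=2, batch_size=32)
```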


2018-01-24T07:32:29Z

@machinelearning_tuts
----------
Link : http://arxiv.org/abs/1801.07883v2
Integrating Learning and Reasoning with Deep Logic Models

--Abstract

Deep learning is very effective at jointly learning feature representations and classification models, especially when dealing with high dimensional input patterns. Probabilistic logic reasoning, on the other hand, is capable to take consistent and robust decisions in complex environments. The integration of deep learning and logic reasoning is still an open-research problem and it is considered to be the key for the development of real intelligent agents. This paper presents Deep Logic Models, which are deep graphical models integrating deep learning and logic reasoning both for learning and inference. Deep Logic Models create an end-to-end differentiable architecture, where deep learners are embedded into a network implementing a continuous relaxation of the logic knowledge. The learning process allows to jointly learn the weights of the deep learners and the meta-parameters controlling the high-level reasoning. The experimental results show that the proposed methodology overtakes the limitations of the other approaches that have been proposed to bridge deep learning and reasoning.
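
A hedged sketch of the core idea as described in the abstract, not the paper's actual code: a logic rule is relaxed into a differentiable penalty (here via the Łukasiewicz implication), so gradients can flow from the rule back into the deep learners that produced the predicate probabilities.

```python
import torch

def lukasiewicz_implies(a, b):
    # Continuous relaxation of "a -> b": equals 1 when satisfied, <1 otherwise.
    return torch.clamp(1.0 - a + b, max=1.0)

# Stand-in outputs of two deep learners for a batch of 8 examples,
# e.g. P(smokes(x)) and P(cancer(x)); the predicates are illustrative.
logits_a = torch.randn(8, requires_grad=True)
logits_b = torch.randn(8, requires_grad=True)
p_a, p_b = torch.sigmoid(logits_a), torch.sigmoid(logits_b)

# Penalize violation of the rule "a(x) -> b(x)"; fully differentiable.
logic_loss = (1.0 - lukasiewicz_implies(p_a, p_b)).mean()
logic_loss.backward()
print(logits_a.grad)  # gradients reach the learner that produced p_a
```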


2019-01-14T09:06:28Z

@machinelearning_tuts
----------
Link : http://arxiv.org/abs/1901.04195v1
Why walk when you can flop?
In one example, a simulated robot was supposed to evolve to travel as quickly as possible. But rather than evolve legs, it simply assembled itself into a tall tower, then fell over. Some of these robots even learned to turn their falling motion into a somersault, adding extra distance.

Blog by Janelle Shane: https://lnkd.in/dQnCVa9

Original paper: https://lnkd.in/dt63hJR

#algorithm #artificialintelligence #machinelearning #reinforcementlearning #technology

----------
@machinelearning_tuts
DeepFlash is a nice application of auto-encoders: the authors trained a neural network to turn a flash selfie into a studio portrait. It's an interesting paper that addresses a real need, I seriously mean it! They've also tested their results against other approaches like pix2pix, style transfer, etc. At first glance I had the feeling that pix2pix performed better than their suggested approach, but their evaluation metrics (SSIM and PSNR) proved me wrong.
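
For anyone unfamiliar with the two metrics mentioned, here is a minimal sketch of computing SSIM and PSNR with scikit-image; the random images are stand-ins for a restored output and its studio-portrait ground truth.

```python
import numpy as np
from skimage.metrics import peak_signal_noise_ratio, structural_similarity

rng = np.random.default_rng(0)
reference = rng.random((128, 128))  # stand-in ground-truth image in [0, 1]
restored = np.clip(reference + 0.05 * rng.standard_normal((128, 128)), 0, 1)

# Higher is better for both; data_range is the span of valid pixel values.
print("PSNR:", peak_signal_noise_ratio(reference, restored, data_range=1.0))
print("SSIM:", structural_similarity(reference, restored, data_range=1.0))
```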
#deeplearning #machinelearning

Paper link: https://lnkd.in/eHM5rRx

----------
@machinelearning_tuts
Machine Learning Guide: 20 Free ODSC Resources to Learn Machine Learning: https://lnkd.in/ejqejpA

#BigData #DataScience #DataScientists #AI #DeepLearning


----------
@machinelearning_tuts
How do you go from self-play to the real world? Transfer learning.

NeurIPS 2017 Meta Learning Symposium: https://lnkd.in/e7MdpPc

A new research problem has therefore emerged: How can the complexity, i.e. the design, components, and hyperparameters, be configured automatically so that these systems perform as well as possible? This is the problem of metalearning. Several approaches have emerged, including those based on Bayesian optimization, gradient descent, reinforcement learning, and evolutionary computation.
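
As a baseline for the automatic-configuration problem described above, here is a minimal random-search sketch; the search space and the stand-in evaluation function are illustrative assumptions. The approaches named above (Bayesian optimization, reinforcement learning, evolution) replace the blind sampling loop with a strategy that models which configurations look promising.

```python
import random

# Hypothetical search space over design choices and hyperparameters.
search_space = {
    "learning_rate": lambda: 10 ** random.uniform(-4, -1),
    "num_layers":    lambda: random.randint(1, 6),
    "hidden_units":  lambda: random.choice([32, 64, 128, 256]),
}

def evaluate(config):
    # Stand-in for training a model with `config` and returning
    # validation performance; replace with a real training run.
    return random.random()

best_config, best_score = None, float("-inf")
for _ in range(50):
    config = {name: sample() for name, sample in search_space.items()}
    score = evaluate(config)
    if score > best_score:
        best_config, best_score = config, score

print(best_config, best_score)
```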

#artificialintelligence #deeplearning #metalearning #reinforcementlearning
----------
@machinelearning_tuts
Delira was developed as a deep learning framework for medical images such as CT or MRI. Currently, it works on arbitrary data (based on NumPy).

Based on PyTorch, batchgenerators, and trixi, it provides a framework for:

Dataset loading
Dataset sampling
Augmentation (multi-threaded) including 3D images with any number of channels
A generic trainer class that implements the training process
Already-implemented models used in medical image processing, plus exemplary implementations of widely used general models (like ResNet)
Web-based monitoring using Visdom
Model save and load functions
Delira supports classification and regression problems as well as generative adversarial networks and segmentation tasks.
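
Delira's own trainer API isn't reproduced here; as a hedged illustration, this is the kind of plain PyTorch training loop that such a generic trainer class wraps (model, data, and hyperparameters are stand-ins).

```python
import torch
import torch.nn as nn

model = nn.Sequential(nn.Linear(16, 32), nn.ReLU(), nn.Linear(32, 2))
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()

# Stand-in dataset: 256 random samples with binary labels.
inputs = torch.randn(256, 16)
targets = torch.randint(0, 2, (256,))
loader = torch.utils.data.DataLoader(
    torch.utils.data.TensorDataset(inputs, targets), batch_size=32)

for epoch in range(3):
    for x, y in loader:
        optimizer.zero_grad()
        loss = loss_fn(model(x), y)
        loss.backward()
        optimizer.step()
    print(f"epoch {epoch}: loss {loss.item():.4f}")
```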

#resources #deep_learning

Getting Started: https://lnkd.in/efeU8vv

----------
@machinelearning_tuts
When deep learning meets security

--Abstract

Deep learning is an emerging research field that has proven its effectiveness towards deploying more efficient intelligent systems. Security, on the other hand, is one of the most essential issues in modern communication systems. Recently many papers have shown that using deep learning models can achieve promising results when applied to the security domain. In this work, we provide an overview for the recent studies that apply deep learning techniques to the field of security.


2018-07-12T17:44:42Z

@machinelearning_tuts
----------
Link : http://arxiv.org/abs/1807.04739v1
Are Efficient Deep Representations Learnable?

--Abstract

Many theories of deep learning have shown that a deep network can require dramatically fewer resources to represent a given function compared to a shallow network. But a question remains: can these efficient representations be learned using current deep learning techniques? In this work, we test whether standard deep learning methods can in fact find the efficient representations posited by several theories of deep representation. Specifically, we train deep neural networks to learn two simple functions with known efficient solutions: the parity function and the fast Fourier transform. We find that using gradient-based optimization, a deep network does not learn the parity function, unless initialized very close to a hand-coded exact solution. We also find that a deep linear neural network does not learn the fast Fourier transform, even in the best-case scenario of infinite training data, unless the weights are initialized very close to the exact hand-coded solution. Our results suggest that not every element of the class of compositional functions can be learned efficiently by a deep network, and further restrictions are necessary to understand what functions are both efficiently representable and learnable.
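
A hedged sketch of the parity experiment described in the abstract (width, depth, optimizer, and the train/test split are my assumptions, not the authors' exact setup): train a small MLP on most 8-bit strings and check whether it generalizes parity to the held-out ones.

```python
import itertools
import torch
import torch.nn as nn

n_bits = 8
X = torch.tensor(list(itertools.product([0.0, 1.0], repeat=n_bits)))
y = (X.sum(dim=1) % 2).long()   # parity label for each of the 256 strings

perm = torch.randperm(len(X))   # hold out some strings for testing
train, test = perm[:200], perm[200:]

model = nn.Sequential(
    nn.Linear(n_bits, 64), nn.ReLU(),
    nn.Linear(64, 64), nn.ReLU(),
    nn.Linear(64, 2),
)
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()

for step in range(3000):
    optimizer.zero_grad()
    loss = loss_fn(model(X[train]), y[train])
    loss.backward()
    optimizer.step()

# Per the paper, held-out accuracy tends to sit near chance (~0.5)
# even when the training set is fit well.
test_acc = (model(X[test]).argmax(dim=1) == y[test]).float().mean()
print(f"held-out accuracy: {test_acc:.3f}")
```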


2018-07-17T13:08:21Z

@machinelearning_tuts
----------
Link : http://arxiv.org/abs/1807.06399v1
Deep Learning for Genomics: A Concise Overview

--Abstract

Advancements in genomic research such as high-throughput sequencing techniques have driven modern genomic studies into "big data" disciplines. This data explosion is constantly challenging conventional methods used in genomics. In parallel with the urgent demand for robust algorithms, deep learning has succeeded in a variety of fields such as vision, speech, and text processing. Yet genomics entails unique challenges to deep learning since we are expecting from deep learning a superhuman intelligence that explores beyond our knowledge to interpret the genome. A powerful deep learning model should rely on insightful utilization of task-specific knowledge. In this paper, we briefly discuss the strengths of different deep learning models from a genomic perspective so as to fit each particular task with a proper deep architecture, and remark on practical considerations of developing modern deep learning architectures for genomics. We also provide a concise review of deep learning applications in various aspects of genomic research, as well as pointing out potential opportunities and obstacles for future genomics applications.


2018-02-02T12:50:25Z

@machinelearning_tuts
----------
Link : http://arxiv.org/abs/1802.00810v2