Cutting Edge Deep Learning
📕 Deep learning
📗 Reinforcement learning
📘 Machine learning
📙 Papers - tools - tutorials

🔗 Other Social Media Handles:
https://linktr.ee/cedeeplearning
The adoption of ML by enterprises has reached new heights, as highlighted in a recent machine learning report. Adoption has been happening at breakneck speed as companies attempt to leverage the technology to get ahead of the competition. Factors driving this development include machine learning capabilities such as risk management, performance analysis, reporting, and automation. Below are statistics on ML adoption.
✔️The increase in ML adoption is expected to drive the cloud computing market’s growth. (teks.co.in)
✔️1/3 of IT leaders are planning to use ML for business analytics. (statista.com)
✔️25% of IT leaders plan to use ML for security purposes (statista.com)
✔️16% of IT leaders want to use ML in sales and marketing. (statista.com)
✔️Capsule networks are expected to replace conventional neural networks. (teks.co.in)

https://financesonline.com
#machinelearning
#adoption

@cedeeplearning
🔻61% of marketers say AI is the most critical aspect of their data strategy. (memsql.com)
🔻87% of companies that use AI plan to apply it to sales forecasting and email marketing. (statista.com)
🔻2000 – The estimated number of Amazon Go stores in the US by 2021. (teks.co.in)
🔻49% of consumers are willing to purchase more frequently when AI is present. (twitter.com)
🔻$1 billion – The amount Netflix saved through the use of machine learning algorithms. (technisider.com)
🔻15 minutes – Amazon’s ship time after it started using machine learning. (aiindex.org)

https://financesonline.com/

#machinelearning
#marketofML

@cedeeplearning
🔻Fast and Easy Infinitely Wide Networks with Neural Tangents

One of the key theoretical insights that has allowed us to make progress in recent years has been that increasing the width of DNNs results in more regular behavior, and makes them easier to understand. A number of recent results have shown that DNNs that are allowed to become infinitely wide converge to another, simpler, class of models called Gaussian processes. In this limit, complicated phenomena (like Bayesian inference or gradient descent dynamics of a convolutional neural network) boil down to simple linear algebra equations. Insights from these infinitely wide networks frequently carry over to their finite counterparts.

Left: a schematic showing how deep neural networks induce simple input/output maps as they become infinitely wide. Right: as the width of a neural network increases, the distribution of outputs over different random initializations of the network converges to a Gaussian.
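As a rough illustration (not code from the post itself), here is a minimal sketch of how the open-source Neural Tangents library exposes an architecture's infinite-width kernel alongside the finite network; the architecture, toy data, and settings below are assumptions chosen only for illustration:

```python
import jax.numpy as jnp
from jax import random
import neural_tangents as nt
from neural_tangents import stax

# Define a fully connected architecture; stax returns functions for the
# finite network (init_fn, apply_fn) and its infinite-width kernel (kernel_fn).
init_fn, apply_fn, kernel_fn = stax.serial(
    stax.Dense(512), stax.Relu(),
    stax.Dense(512), stax.Relu(),
    stax.Dense(1)
)

# Toy 1-D regression data (illustrative only).
key = random.PRNGKey(0)
x_train = random.normal(key, (20, 1))
y_train = jnp.sin(x_train)
x_test = jnp.linspace(-3., 3., 50).reshape(-1, 1)

# In the infinite-width limit, Bayesian inference and gradient-descent
# training reduce to closed-form linear algebra on the kernel.
predict_fn = nt.predict.gradient_descent_mse_ensemble(kernel_fn, x_train, y_train)
mean, cov = predict_fn(x_test=x_test, get='nngp', compute_cov=True)
print(mean.shape, cov.shape)
```

Here `kernel_fn` plus a closed-form solver stands in for training the finite network, which is the "simple linear algebra" the post refers to.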
#deeplearning
#CNN

@cedeeplearning
🔻More Efficient NLP Model Pre-training with ELECTRA
Recent advances in language pre-training have led to substantial gains in the field of natural language processing, with state-of-the-art models such as BERT, RoBERTa, XLNet, ALBERT, and T5, among many others. These methods, though they differ in design, share the same idea of leveraging a large amount of unlabeled text to build a general model of language understanding before being fine-tuned on specific NLP tasks such as sentiment analysis and question answering.
https://ai.googleblog.com/
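As a hedged sketch of ELECTRA's replaced-token-detection objective (not code from the post), here is a small example using the Hugging Face transformers library; the checkpoint name and example sentence are assumptions for illustration:

```python
import torch
from transformers import ElectraTokenizerFast, ElectraForPreTraining

# ELECTRA's discriminator is trained to spot which tokens in a sentence were
# replaced by a small generator network, rather than predicting masked tokens.
model_name = "google/electra-small-discriminator"  # assumed public checkpoint
tokenizer = ElectraTokenizerFast.from_pretrained(model_name)
discriminator = ElectraForPreTraining.from_pretrained(model_name)

# A sentence with one implausible ("replaced") word, for illustration.
sentence = "The chef cooked the car for dinner."
inputs = tokenizer(sentence, return_tensors="pt")

with torch.no_grad():
    logits = discriminator(**inputs).logits  # one score per token

# Higher scores mean the discriminator thinks the token was replaced.
tokens = tokenizer.convert_ids_to_tokens(inputs["input_ids"][0])
for token, score in zip(tokens, logits[0].tolist()):
    print(f"{token:>12s}  {score:+.2f}")
```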

#NLP
#deeplearning
#pretraining

@cedeeplearning
#Torch-Struct: Deep Structured Prediction Library

The literature on structured prediction for #NLP describes a rich collection of distributions and algorithms over #sequences, #segmentations, #alignments, and #trees; however, these algorithms are difficult to utilize in deep learning frameworks. We introduce Torch-Struct, a library for structured prediction designed to take advantage of and integrate with vectorized, auto-differentiation-based #frameworks. Torch-Struct includes a broad collection of #probabilistic structures accessed through a simple and flexible distribution-based API that connects to any deep learning model. The library utilizes batched, vectorized operations and exploits auto-differentiation to produce readable, fast, and testable code. Internally, we also include a number of general-purpose optimizations to provide cross-algorithm efficiency. Experiments show significant performance gains over fast baselines, and case studies demonstrate the benefits of the library. Torch-Struct is available at:

Code: https://github.com/harvardnlp/pytorch-struct
Paper: https://arxiv.org/abs/2002.00876v1
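As a hedged sketch of the distribution-based API described above, here is a minimal linear-chain CRF example; the shapes and random potentials are made up purely for illustration:

```python
import torch
from torch_struct import LinearChainCRF

# A linear-chain CRF over tag sequences: log-potentials have shape
# (batch, N - 1, C, C) for sequences of length N with C tags.
batch, N, C = 2, 6, 4
log_potentials = torch.randn(batch, N - 1, C, C)

# Structures are exposed as torch.distributions-style objects.
dist = LinearChainCRF(log_potentials)

marginals = dist.marginals   # edge marginals computed via auto-differentiation
best = dist.argmax           # MAP sequence as a one-hot edge tensor
logZ = dist.partition        # log partition function per batch element

print(marginals.shape, best.shape, logZ.shape)
```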

@cedeeplearning
🔹How to Classify Photos of Dogs and Cats (with 97% accuracy)
Develop a Deep Convolutional Neural Network Step-by-Step to Classify Photographs of Dogs and Cats
by: https://machinelearningmastery.com/how-to-develop-a-convolutional-neural-network-to-classify-photos-of-dogs-and-cats/
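As a rough sketch of the kind of model the tutorial builds (the directory layout, image size, and hyperparameters here are assumptions, not the tutorial's exact settings):

```python
import tensorflow as tf
from tensorflow.keras import layers, models

IMG_SIZE = (200, 200)  # assumed input resolution

# Assumes images are organised as data/train/cats/... and data/train/dogs/...
train_ds = tf.keras.utils.image_dataset_from_directory(
    "data/train", image_size=IMG_SIZE, batch_size=64, label_mode="binary")

model = models.Sequential([
    layers.Rescaling(1.0 / 255, input_shape=IMG_SIZE + (3,)),
    layers.Conv2D(32, 3, activation="relu", padding="same"),
    layers.MaxPooling2D(),
    layers.Conv2D(64, 3, activation="relu", padding="same"),
    layers.MaxPooling2D(),
    layers.Conv2D(128, 3, activation="relu", padding="same"),
    layers.MaxPooling2D(),
    layers.Flatten(),
    layers.Dense(128, activation="relu"),
    layers.Dense(1, activation="sigmoid"),  # binary output: cat vs dog
])

model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
model.fit(train_ds, epochs=10)
```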

#Deeplearning
#neuralnetwork

@cedeeplearning
What does it mean to be a data scientist? After all, many different skills fall under the umbrella of data science, and professionals' job roles were found to be closely related to their proficiency in these different skills.
link: https://www.business2community.com/big-data/investigating-data-scientists-their-skills-and-team-makeup-01335085

via: @cedeeplearning
🔹Design your Neural Networks
What’s a good learning rate? How many hidden layers should your network have? Is dropout actually useful? Why are your gradients vanishing?
link: https://towardsdatascience.com/designing-your-neural-networks-a5e4617027ed
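As a hedged illustration of the knobs the article discusses (hidden layers, dropout, learning rate), here is a minimal Keras example; the values are arbitrary starting points, not recommendations from the article:

```python
import tensorflow as tf
from tensorflow.keras import layers

# Each design question maps to a concrete setting in the model definition.
model = tf.keras.Sequential([
    layers.Input(shape=(20,)),
    layers.Dense(64, activation="relu"),   # how many hidden layers / units?
    layers.Dropout(0.3),                   # is dropout actually useful here?
    layers.Dense(64, activation="relu"),
    layers.Dense(1, activation="sigmoid"),
])

# What's a good learning rate? Set it explicitly so it can be tuned.
optimizer = tf.keras.optimizers.Adam(learning_rate=1e-3)
model.compile(optimizer=optimizer, loss="binary_crossentropy", metrics=["accuracy"])
model.summary()
```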

#neuralnetwork
#machinelearning

via: @cedeeplearning
Tinker With a Neural Network Right Here in Your Browser

This playground was created by Daniel Smilkov and Shan Carter. It is a continuation of many people’s previous work — most notably Andrej Karpathy’s #convnet.js demo and Chris Olah’s articles about neural networks.

Via: @cedeeplearning

https://playground.tensorflow.org/
#visualization #neural_networks
The advent of AI in #Architecture is still in its early days but already offers promising results. More than a mere opportunity, this potential represents a major step forward that is poised to reshape the architectural discipline.

Via: @cedeeplearning

Check this interesting post out:
https://towardsdatascience.com/ai-architecture-f9d78c6958e0
Here are 10 #courses to help with your spring learning season. Courses range from introductory #machinelearning to #deeplearning to natural language processing and beyond.

This collection comes courtesy of Columbia University, Krakow Technical University, MIT, UC Berkeley, University of Washington, University of Wisconsin–Madison, and Yandex Data School.

1️⃣ Machine Learning
🏛 (University of Washington)
This course is designed to provide a thorough grounding in the fundamental methodologies and algorithms of machine learning.

2️⃣ Machine Learning
🏛 (University of Wisconsin-Madison)
This course will cover the key concepts of machine learning, including classification, regression analysis, clustering, and dimensionality reduction.

3️⃣ Algorithms (in journalism)
🏛 (Columbia University)
This is a course on algorithmic data analysis in journalism, and also the journalistic analysis of algorithms used in society. The major topics are text processing, visualization of high-dimensional data, regression, machine learning, algorithmic bias and accountability, Monte Carlo simulation, and election prediction.

4️⃣ Practical Deep Learning
🏛 (Yandex Data School)

5️⃣ Big Data in 30 Hours
🏛 (Krakow Technical University)
The goal of this technical, hands-on class is to introduce practical Data Engineering and Data Science to technical personnel (corporate, academic, or students) over 15 lectures of 2 hours each.

6️⃣ Deep Reinforcement Learning Bootcamp
🏛 (UC Berkeley & others)
Reinforcement learning considers the problem of learning to act and is poised to power next-generation AI systems, which will need to go beyond the input-output pattern recognition that has sufficed for speech, vision, and machine translation and instead generate intelligent behavior.

7️⃣ Introduction to Artificial intelligence
🏛 (University of Washington)


8️⃣ Brains, Minds and Machines Summer Course
🏛 (MIT)
This course explores the problem of intelligence—its nature, how it is produced by the brain and how it could be replicated in machines—using an approach that integrates cognitive science, which studies the mind; neuroscience, which studies the brain; and computer science and artificial intelligence, which study the computations needed to develop intelligent machines.

9️⃣ Design and Analysis of Algorithms
🏛 (MIT)
This is an intermediate algorithms course with an emphasis on teaching techniques for the design and analysis of efficient algorithms, emphasizing methods of application.

🔟 Natural Language Processing
🏛 (University of Washington)
——————————————

Via: @cedeeplearning
Credit goes to: https://goo.gl/Riybxs
also check our other social media handles:
https://linktr.ee/cedeeplearning

#MachineLearning #DataScience #Course #DeepLearning #BigData #AI
🔹Laying the Groundwork for the Production of Your Machine Learning Models

link: https://www.rocketsource.co/blog/machine-learning-models/

#machinelearning
#model
#hierarchy

via: @cedeeplearning