Cutting Edge Deep Learning
📕 Deep learning
📗 Reinforcement learning
📘 Machine learning
📙 Papers - tools - tutorials

🔗 Other Social Media Handles:
https://linktr.ee/cedeeplearning
🔹Deep Learning #Algorithms Identify Structures in Living Cells

For cell biologists, fluorescence microscopy is an invaluable tool. Fusing dyes to antibodies or inserting genes coding for fluorescent proteins into the #DNA of living cells can help scientists pick out the location of #organelles, #cytoskeletal elements, and other subcellular #structures from otherwise #impenetrable microscopy images. But this technique has its #drawbacks.
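
Below is a minimal PyTorch sketch of the general idea behind such label-free approaches: train a small convolutional network to predict a fluorescence channel from a transmitted-light image. The layer sizes, the random stand-in tensors, and the training loop are illustrative assumptions, not the architecture used in the work described above.

# Minimal sketch: predict a fluorescence channel from a transmitted-light image.
# Assumes paired (brightfield, fluorescence) tensors; sizes and layers are illustrative.
import torch
import torch.nn as nn

class LabelFreeNet(nn.Module):
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(1, 16, 3, padding=1), nn.ReLU(),
            nn.Conv2d(16, 16, 3, padding=1), nn.ReLU(),
            nn.Conv2d(16, 1, 3, padding=1),   # predicted fluorescence intensity
        )

    def forward(self, x):
        return self.net(x)

model = LabelFreeNet()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()

brightfield = torch.rand(8, 1, 64, 64)    # stand-in for real microscopy patches
fluorescence = torch.rand(8, 1, 64, 64)   # stand-in for the dye/label channel

for step in range(10):
    optimizer.zero_grad()
    loss = loss_fn(model(brightfield), fluorescence)
    loss.backward()
    optimizer.step()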

📌Via: @cedeeplearning

link: https://www.the-scientist.com/notebook/deep-learning-algorithms-identify-structures-in-living-cells-65778

#deeplearning
#neuralnetworks
#machinelearning
🔹Artificial Intelligence vs. Neural Networks

The term "artificial intelligence" dates back to the mid-1950s, when mathematician John McCarthy, widely recognized as the father of AI, used it to describe machines that do things people might call intelligent. He and Marvin Minsky, whose work was just as influential in the AI field, organized the Dartmouth Summer Research Project on Artificial Intelligence in 1956.

📌Via: @cedeeplearning

link: https://www.the-scientist.com/magazine-issue/artificial-intelligence-versus-neural-networks-65802

#neuralnetworks
#deeplearning
#machinelearning
#AI
🔹AI Networks Generate Super-Resolution from Basic Microscopy

A new study uses deep learning to improve the resolution of biological images, but elicits skepticism about its ability to enhance snapshots of sample types that it has never seen before.

📌Via: @cedeeplearning

link: https://www.the-scientist.com/news-opinion/ai-networks-generate-super-resolution-from-basic-microscopy-65219

#deeplearning
#neuralnetworks
#machinelearning
🔹Neural networks facilitate optimization in the search for new materials

Sorting through millions of possibilities, a search for battery materials delivered results in five weeks instead of 50 years. When searching through theoretical lists of possible new materials for particular applications, such as batteries or other energy-related devices, there are often millions of potential materials that could be considered, and multiple criteria that need to be met and optimized at once.
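
As a rough illustration of this kind of surrogate-based screening (an assumed setup, not the MIT group's code), the sketch below fits a small network on a labeled subset of candidate descriptors and then ranks a large unlabeled pool by the predicted property, so only the most promising candidates go on to expensive evaluation.

# Toy surrogate-screening sketch (illustrative, not the MIT implementation):
# fit a small regressor on a labeled subset of candidate descriptors,
# then rank the unlabeled pool by predicted property to pick what to evaluate next.
import torch
import torch.nn as nn

n_features = 32
surrogate = nn.Sequential(nn.Linear(n_features, 64), nn.ReLU(), nn.Linear(64, 1))
optimizer = torch.optim.Adam(surrogate.parameters(), lr=1e-3)

labeled_x = torch.rand(500, n_features)            # descriptors with known property values
labeled_y = torch.rand(500, 1)
candidate_pool = torch.rand(100_000, n_features)   # the real search covers millions

for _ in range(200):
    optimizer.zero_grad()
    loss = nn.functional.mse_loss(surrogate(labeled_x), labeled_y)
    loss.backward()
    optimizer.step()

with torch.no_grad():
    scores = surrogate(candidate_pool).squeeze(1)
top_candidates = torch.topk(scores, k=100).indices   # send these for expensive evaluation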

📌Via: @cedeeplearning

link: http://news.mit.edu/2020/neural-networks-optimize-materials-search-0326

#MIT
#deeplearning
#neuralnetworks
#imagedetection
🔹Deep learning for mechanical property evaluation

New technique allows for more precise measurements of #deformation characteristics using nanoindentation tools.
A #standard method for testing some of the #mechanical properties of #materials is to poke them with a sharp point. This "indentation technique" can provide detailed measurements of how the material responds to the point's force, as a function of its #penetration depth.
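
A schematic sketch of how a learned model might map an indentation response to a mechanical property is shown below; the fixed-length sampling of the load-depth curve, the target property, and the tiny 1-D convolutional regressor are all illustrative assumptions rather than the method from the paper.

# Schematic sketch (assumed setup, not the paper's method): regress a mechanical
# property, e.g. yield strength, from a load curve sampled at fixed penetration depths.
import torch
import torch.nn as nn

curve_points = 100                                   # load values sampled along the depth axis
model = nn.Sequential(
    nn.Conv1d(1, 8, kernel_size=5, padding=2), nn.ReLU(),
    nn.AdaptiveAvgPool1d(1), nn.Flatten(),
    nn.Linear(8, 1),                                 # predicted property value
)
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)

curves = torch.rand(256, 1, curve_points)            # stand-in indentation curves
targets = torch.rand(256, 1)                         # stand-in measured property values

for _ in range(100):
    optimizer.zero_grad()
    loss = nn.functional.mse_loss(model(curves), targets)
    loss.backward()
    optimizer.step()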

📌Via: @cedeeplearning

link: http://news.mit.edu/2020/deep-learning-mechanical-property-metallic-0316

#neuralnetworks
#deeplearning
#machinelearning
🔹Understanding Generative Adversarial Networks (GANs)

Yann LeCun described it as "the most interesting idea in the last 10 years in #Machine_Learning". Of course, such a compliment coming from such a prominent researcher in the #deep_learning area is always a great advertisement for the subject we are talking about! And, indeed, #Generative Adversarial #Networks (#GANs for short) have had a huge success since they were introduced in 2014 by Ian J. #Goodfellow and co-authors in the article Generative Adversarial Nets.
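
For readers who want the mechanics in code, here is a minimal, self-contained GAN training loop on toy data; the networks and hyperparameters are arbitrary choices for illustration, but the adversarial structure (discriminator vs. generator) follows the original formulation.

# Minimal GAN sketch on toy 2-D data (illustrative; hyperparameters are arbitrary):
# the generator maps noise to samples, the discriminator scores real vs. fake,
# and the two are trained adversarially as in Goodfellow et al. (2014).
import torch
import torch.nn as nn

G = nn.Sequential(nn.Linear(8, 32), nn.ReLU(), nn.Linear(32, 2))   # noise -> fake sample
D = nn.Sequential(nn.Linear(2, 32), nn.ReLU(), nn.Linear(32, 1))   # sample -> real/fake logit
opt_g = torch.optim.Adam(G.parameters(), lr=1e-3)
opt_d = torch.optim.Adam(D.parameters(), lr=1e-3)
bce = nn.BCEWithLogitsLoss()

for step in range(1000):
    real = torch.randn(64, 2) * 0.5 + 2.0        # toy "real" distribution
    noise = torch.randn(64, 8)

    # Discriminator step: push real samples toward 1, generated samples toward 0
    opt_d.zero_grad()
    d_loss = bce(D(real), torch.ones(64, 1)) + bce(D(G(noise).detach()), torch.zeros(64, 1))
    d_loss.backward()
    opt_d.step()

    # Generator step: try to fool the discriminator into predicting 1
    opt_g.zero_grad()
    g_loss = bce(D(G(noise)), torch.ones(64, 1))
    g_loss.backward()
    opt_g.step()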

📌Via: @cedeeplearning

link: https://towardsdatascience.com/understanding-generative-adversarial-networks-gans-cd6e4651a29
🔹Structured learning and GANs in TF, another viral face-swapper, optimizer benchmarks, and more...

This week in #deep_learning we bring you a GAN library for TensorFlow 2.0, another viral #face-swapping app, an #AI Mahjong player from Microsoft, and surprising results showing random architecture search beating neural architecture search. You may also enjoy an interview with Yann LeCun on the AI Podcast, a primer on #MLIR from Google, a few-shot face-#swapping #GAN, benchmarks for recent optimizers, a structured learning #framework for #TensorFlow, and more!

📌Via: @cedeeplearning

link: https://www.deeplearningweekly.com/issues/deep-learning-weekly-issue-124.html
🔻When not to use deep learning

Despite #DL's many successes, there are at least 4 situations where it is more of a hindrance, including low-budget problems, or when explaining #models and #features to a general audience is required.
So when not to use #deep_learning?

1. #Low-budget or #low-commitment problems

2. Interpreting and communicating model parameters/feature importance to a general audience (see the sketch after this list)

3. Establishing causal mechanisms

4. Learning from "#unstructured" features
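
To make point 2 concrete, the toy sketch below (assumed feature names, synthetic data) shows why a simple linear model can be easier to communicate: its coefficients map one-to-one onto features, unlike the weights of a deep network.

# Illustration of point 2: a linear model's coefficients map directly onto
# features, which is far easier to explain to a general audience than the
# weights of a deep network. (Toy data; feature names are hypothetical.)
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 3))                      # hypothetical features: age, income, tenure
y = (X[:, 0] + 0.5 * X[:, 2] + rng.normal(scale=0.5, size=200) > 0).astype(int)

clf = LogisticRegression().fit(X, y)
for name, coef in zip(["age", "income", "tenure"], clf.coef_[0]):
    print(f"{name}: {coef:+.2f}")   # sign and size give a direct, communicable story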

📌Via: @cedeeplearning

link: https://www.kdnuggets.com/2017/07/when-not-use-deep-learning.html/2
🔻Free Mathematics Courses for Data Science & Machine Learning

It's no secret that #mathematics is the foundation of data science. Here is a selection of courses to help you build the math skills needed to excel in #data_science, #machine_learning, and beyond. (🔹click on the link below🔹)

📌Via: @cedeeplearning

link: https://www.kdnuggets.com/2020/02/free-mathematics-courses-data-science-machine-learning.html
🔻20 AI, Data Science, Machine Learning Terms You Need to Know in 2020

2020 is well underway, and we bring you 20 AI, #data_science, and #machine_learning #terms everyone should be familiar with as the year marches on.

📌Via: @cedeeplearning

🔻Part 1: https://www.kdnuggets.com/2020/02/ai-data-science-machine-learning-key-terms-2020.html

🔻Part 2: https://www.kdnuggets.com/2020/03/ai-data-science-machine-learning-key-terms-part2.html

#deeplearning
#terminology
🔹A Neural Weather Model for Eight-Hour Precipitation Forecasting

Predicting weather from minutes to weeks ahead with high #accuracy is a fundamental scientific challenge that can have a wide-ranging impact on many aspects of society. Current forecasts employed by many meteorological agencies are based on physical models of the atmosphere that, despite improving substantially over the preceding decades, are inherently constrained by their computational requirements and are sensitive to approximations of the physical laws that govern them. An alternative approach to weather prediction that is able to overcome some of these constraints uses deep neural networks (#DNNs): instead of encoding explicit physical laws, DNNs discover #patterns in the #data and learn complex transformations from inputs to the desired outputs using parallel computation on powerful specialized hardware such as #GPUs and #TPUs.
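
A minimal sketch of this data-driven approach is below: a small CNN maps a stack of recent precipitation frames directly to a future frame. It only illustrates the input-to-output learning described above, not the model in the post; shapes and training details are assumptions.

# Minimal nowcasting-style sketch (illustrative, not the model in the post):
# a small CNN maps a stack of recent radar/precipitation frames to a future
# frame, learning the input-to-output transformation directly from data.
import torch
import torch.nn as nn

model = nn.Sequential(
    nn.Conv2d(4, 32, 3, padding=1), nn.ReLU(),   # 4 past frames as input channels
    nn.Conv2d(32, 32, 3, padding=1), nn.ReLU(),
    nn.Conv2d(32, 1, 3, padding=1),              # predicted precipitation map
)
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)

past_frames = torch.rand(16, 4, 64, 64)    # stand-in radar history
future_frame = torch.rand(16, 1, 64, 64)   # stand-in target several hours ahead

for _ in range(50):
    optimizer.zero_grad()
    loss = nn.functional.mse_loss(model(past_frames), future_frame)
    loss.backward()
    optimizer.step()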

📌Via: @cedeeplearning

link: https://ai.googleblog.com/

#deeplearning
#neuralnetworks
#machinelearning
🔹Learning to See Transparent Objects

Optical 3D range sensors, like #RGB-D cameras and #LIDAR, have found widespread use in robotics to generate rich and accurate 3D maps of the environment, from #self-driving cars to autonomous manipulators. However, despite the ubiquity of these complex #robotic systems, transparent objects (like a glass container) can confound even a suite of expensive sensors that are commonly used. This is because optical 3D sensors are driven by algorithms that assume all surfaces are Lambertian, i.e., they reflect light evenly in all directions, resulting in a uniform surface brightness from all viewing angles. However, transparent objects violate this assumption, since their surfaces both refract and reflect light. Hence, most of the depth data from transparent objects are invalid or contain unpredictable noise.
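
The snippet below illustrates the practical symptom described above on synthetic numbers: depth readings over transparent surfaces often come back as zeros or NaNs, so a first step in any pipeline is to mask them out before a learned model attempts to fill them in.

# Small sketch of the problem described above: depth pixels on transparent
# surfaces often come back as 0 or NaN from an RGB-D sensor, so they must be
# masked out before further processing. (Synthetic values for illustration.)
import numpy as np

depth = np.array([[0.82, 0.80, 0.0],        # 0.0: sensor returned no depth
                  [0.79, np.nan, 0.0],      # NaN: unpredictable/invalid reading
                  [0.78, 0.77, 0.76]])

valid = np.isfinite(depth) & (depth > 0)    # mask of trustworthy measurements
print(valid.sum(), "of", depth.size, "pixels usable")
# A learned model would then try to reconstruct the invalid region from the
# RGB image; that completion step is beyond this sketch.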

📌Via: @cedeeplearning

link: https://ai.googleblog.com/search?updated-max=2020-02-24T13:01:00-08:00&max-results=10&start=8&by-date=false

#deeplearning
#neuralnetworks
🔹What Are the Educational Requirements for Careers in Artificial Intelligence?

To take your first steps down the artificial intelligence career path, hiring managers will likely require that you hold at least a bachelor's degree in mathematics and basic computer technology. However, for the most part, bachelor's degrees will only get you into entry-level positions. If you're thinking of going to school to become an AI specialist, then you'll have to sign up for courses that typically cover the following:
1. Bayesian networking (including neural nets)
2. Computer science (gain coding experience with popular programming languages)
3. Cognitive science theory
4. Engineering
5. Physics
6. Robotics
7. Various levels of math (algebra, calculus, logic and algorithms, probability, and statistics)

If you're already a software engineer, you can quickly become an artificial intelligence developer with a few AI-focused courses, taken at a brick-and-mortar school or an offline or online bootcamp.

📌Via: @cedeeplearning
🔻Popular Deep Learning #Courses of 2019🔻

With #deep_learning and #AI at the forefront of the latest applications and demands for new business directions, additional #education is paramount for current machine learning engineers and #data_scientists. These courses are well regarded among practitioners and will help you show tangible proof of your new skills.

📌Via: @cedeeplearning

https://www.kdnuggets.com/2019/12/deep-learning-courses.html
🔹How to Start Learning Deep Learning

Want to get started learning #deep_learning? Sure you do! Check out this great overview, advice, and list of resources.

Due to the recent achievements of artificial #neural_networks across many different tasks (such as face #recognition, object detection and Go), deep learning has become extremely popular. This post aims to be a starting point for those interested in learning more about it.

🔻If you already have a basic understanding of linear algebra, #calculus, #probability and #programming: I recommend starting with Stanford's CS231n.

🔻If you don't have the relevant math background: There is an incredible amount of free material online that can be used to learn the required math knowledge. Gilbert Strang's course on #linear_algebra is a great introduction to the field. For the other subjects, edX has courses from MIT on both calculus and probability.

📌Via: @cedeeplearning

link: https://www.kdnuggets.com/2016/07/start-learning-deep-learning.html
🔹What is Nvidia Deep Learning AI?

Nvidia Deep Learning AI is a suite of products dedicated to deep learning and machine intelligence. It lets industries and governments power their decisions with smart, predictive analytics and deliver better services to customers and constituents. By pulling insights from big data, users can turn that data into solutions for current and forecasted problems, arming themselves with knowledge before a challenge arises. With Nvidia Deep Learning AI, organizations can achieve a high rate of success and even protect themselves from fraud and other finance-related risks.

📌Via: @cedeeplearning

link: https://reviews.financesonline.com/p/nvidia-deep-learning-ai/

#deeplearning
#neuralnetworks
#AI