Do Adversarially Robust ImageNet Models Transfer Better?
TLDR - Yes.
The authors check whether adversarially trained networks perform better on transfer learning tasks despite having worse accuracy on the dataset they were trained on (ImageNet, of course). It turns out they do.
They test this in two settings: a frozen pre-trained feature extractor with only a linear classifier trained on top, which outperforms its standard counterpart, and a fully fine-tuned (unfrozen) network, which also outperforms the standard model on transfer learning tasks.
For pre-training they use the adversarial robustness prior, which refers to a model's invariance to small (often imperceptible) perturbations of its inputs.
They also show that this approach gives the networks better feature representations.
They ran many experiments (14 pages of plots) and an ablation study. A minimal fixed-feature transfer sketch is given after the links below.
paper: https://arxiv.org/abs/2007.08489
code: https://github.com/Microsoft/robust-models-transfer
#transfer_learning #SOTA #adversarial
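The fixed-feature setting is easy to reproduce. Below is a minimal PyTorch sketch, assuming a standard torchvision ResNet-50 as a stand-in for an adversarially trained checkpoint from the repo above; the 10-class target dataset and the training step are placeholders, not the authors' code.

```python
import torch
import torch.nn as nn
from torchvision import models

# Backbone: swap in an adversarially robust checkpoint here
# (robust-models-transfer provides ready-made ResNet weights).
backbone = models.resnet50(pretrained=True)
backbone.fc = nn.Identity()          # expose the 2048-d penultimate features
for p in backbone.parameters():
    p.requires_grad = False          # fixed-feature transfer: freeze everything
backbone.eval()

num_classes = 10                     # hypothetical downstream dataset
head = nn.Linear(2048, num_classes)  # the only trainable part
optimizer = torch.optim.SGD(head.parameters(), lr=0.01, momentum=0.9)
criterion = nn.CrossEntropyLoss()

def train_step(images, labels):
    with torch.no_grad():
        feats = backbone(images)     # frozen features
    loss = criterion(head(feats), labels)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()
```

The full fine-tuning setting differs only in leaving requires_grad enabled on the backbone and passing all parameters to the optimizer.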
How GPT-3 works: a visual thread
A short thread with cool animations on how GPT-3 works, by Jay Alammar.
collected twitter thread: https://threader.app/thread/1285498971960598529
#nlp #transformers #gpt3 #jayalammar
#GPT3 attracted lots of attention. Let's try a new format for discussing it in the comments, provided by Peerboard.
To access the comments, just click the link below, authorize with Telegram, and follow the discussion.
(Rich Sutton, author of http://www.incompleteideas.net/IncIdeas/BitterLesson.html, is on the right)
Applying GPT-3 to generate neural network code
Matt Shumer used GPT-3 to generate code for a machine learning model, just by describing the dataset and required output.
#GPT3 #inception #codegeneration #NLU #NLP
Astrologers proclaimed the week of #codegeneration. The number of articles on the subject has doubled.
Deep learning to translate between programming languages
#FacebookAI released TransCoder, an entirely self-supervised neural transcompiler system that is claimed to make code migration easier and more efficient.
ArXiV: https://arxiv.org/pdf/2006.03511.pdf
Github: https://github.com/facebookresearch/TransCoder/
#NLU #codegeneration #NLP
PyTorch v1.6
[0] native mixed-precision support from NVIDIA (~2x perf improvement; see the sketch after the link)
[1] distributed perf improvements
[2] new profiling tool for memory consumption
[3] Microsoft commits to developing and maintaining Windows PyTorch
…
github: https://github.com/pytorch/pytorch/releases/tag/v1.6.0
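Here is a minimal training-loop sketch of the new native AMP API (torch.cuda.amp) from this release; the model, data, and loop are placeholders, and a CUDA device is assumed.

```python
import torch
import torch.nn as nn

model = nn.Linear(512, 10).cuda()                # placeholder model
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)
criterion = nn.CrossEntropyLoss()
scaler = torch.cuda.amp.GradScaler()             # scales the loss to avoid fp16 underflow

for _ in range(10):                              # placeholder training loop
    inputs = torch.randn(32, 512, device="cuda")
    targets = torch.randint(0, 10, (32,), device="cuda")

    optimizer.zero_grad()
    with torch.cuda.amp.autocast():              # runs ops in fp16/fp32 as appropriate
        loss = criterion(model(inputs), targets)
    scaler.scale(loss).backward()                # backward on the scaled loss
    scaler.step(optimizer)                       # unscales grads, then optimizer step
    scaler.update()                              # adjusts the scale for the next step
```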
Funnel Activation for Visual Recognition
The authors propose a new activation function for image recognition tasks, called Funnel activation (FReLU), which extends ReLU and PReLU to a 2D activation by adding a spatial condition at negligible overhead.
Extensive experiments on COCO, ImageNet, and Cityscapes show significant improvements and robustness.
Paper: https://arxiv.org/abs/2007.11824
Code: https://github.com/megvii-model/FunnelAct
#deeplearning #activationfunction #computervision #pytorch
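For reference, a minimal PyTorch sketch of the activation as described in the paper, y = max(x, T(x)) with T a depthwise 3x3 convolution plus batch norm; this is my own illustration, not the official implementation from the repo above.

```python
import torch
import torch.nn as nn

class FReLU(nn.Module):
    """Funnel activation: y = max(x, T(x)), where T is a per-channel
    (depthwise) kxk convolution followed by batch norm."""
    def __init__(self, channels: int, kernel_size: int = 3):
        super().__init__()
        self.spatial = nn.Conv2d(channels, channels, kernel_size,
                                 padding=kernel_size // 2,
                                 groups=channels, bias=False)
        self.bn = nn.BatchNorm2d(channels)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return torch.max(x, self.bn(self.spatial(x)))

# Drop-in replacement for ReLU in a conv block:
x = torch.randn(2, 64, 32, 32)
act = FReLU(64)
y = act(x)   # same shape as x
```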
#Facebook released a GitHub repo with the code for #TransCoder: https://github.com/facebookresearch/TransCoder/
Our friends from @loss_function_porn released their app and climbed the App Store top chart!
Let's help them keep that position by downloading the app and giving them 5 stars.
Forwarded from Karim Iskakov's channel (Vladimir Ivashkin)
BREAKING NEWS! (sound on)
Our iOS app Avatarify is #1 in the Russian App Store, and today we are releasing it worldwide.
Vivify any photo with your face in real time: a celebrity, your boss, or even a pet. Record a video and share it to amaze your friends.
The neural network runs entirely on-device in zero-shot mode. Check it out!
App Store
avatarify.ai
@loss_function_porn
Stanford updated its Stanza tool with #NER for biomedical and clinical terms
Stanza has been extended with its first domain-specific models for biomedical and clinical English, ranging from approaching to significantly improving state-of-the-art results on syntactic and NER tasks.
That means neural networks can now handle difficult texts full of domain-specific terms, enabling better search, improved knowledge extraction, meta-analysis, and even research over medical arXiv publications. A minimal usage sketch is given after the links.
Demo: http://stanza.run/bio
ArXiV: https://arxiv.org/abs/2007.14640
#NLProc #NLU #Stanford #biolearning #medicallearning
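A minimal usage sketch, assuming the biomedical package names from the Stanza documentation (e.g. the clinical "mimic" package with the "i2b2" NER model; check the docs for the exact names). The example sentence is made up.

```python
import stanza

# Download and build a clinical English pipeline with a biomedical NER model.
stanza.download("en", package="mimic", processors={"ner": "i2b2"})
nlp = stanza.Pipeline("en", package="mimic", processors={"ner": "i2b2"})

doc = nlp("The patient was prescribed 40 mg of atorvastatin for hyperlipidemia.")
for ent in doc.entities:
    print(ent.text, ent.type)   # e.g. problems, treatments, dosages
```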
Hope that someday the DL industry will evolve enough to develop tools for recognizing Russian doctors' handwriting.
English to regex
Generate a regex just by describing it in English and providing an example (apparently powered by GPT-3).
web page: https://losslesshq.com
#regex #gpt3
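To illustrate the kind of input/output such a tool produces, here is a hypothetical description/regex pair (not taken from the site), checked with Python's re module.

```python
import re

description = "match a US ZIP code, optionally with a 4-digit extension"
example = "90210-1234"
generated = r"^\d{5}(-\d{4})?$"   # what a GPT-3-backed generator might return

assert re.fullmatch(generated, example)
assert re.fullmatch(generated, "90210")
assert not re.fullmatch(generated, "9021")
```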
Last day to apply for Skoltech's free Summer School of Machine Learning
Benefits of the school:
+ top speakers from leading data science centers
+ new knowledge and advanced trends in statistical methods of machine learning
+ free participation
How to apply:
Today is the LAST DAY to apply on the school's website.
Link: https://smiles.skoltech.ru/school
#openedu #course #free #ml
Important information about the International Summer Online School of Machine Learning (SMILES):
We are often asked: what is a poster, and why should you upload one if participation is free?
Let's go through this: submitting a poster about your project or research is a long-standing tradition at summer schools. The content should be informative, yet concise enough for the reader to understand your idea in 2 minutes or less.
What's the point?
Reason #1. The event will bring together top speakers, scientists, and entrepreneurs, so this is a good opportunity to get an expert opinion on your work, find research partners, and meet potential investors and employers.
Reason #2. If you submit a poster, you will get access to the full range of events within SMILES: fireside chats, speed dating, social events, some lectures, etc.
Here are some examples of posters:
- https://bit.ly/2OSjfvs
- https://bit.ly/30H0XT7
If you still have questions, feel free to ask us in the comments. But if you don't, apply to SMILES and upload your poster right now: https://smiles.skoltech.ru/school
Update: lectures will be available without registration.
Update 2: poster examples:
- https://bit.ly/2OSjfvs
- https://bit.ly/30H0XT7
Image "Cloaking" for Personal Privacy
A new research project from the University of Chicago CS group claims to provide a new protection mechanism against facial recognition.
Project link: https://sandlab.cs.uchicago.edu/fawkes/
Github: https://github.com/Shawn-Shan/fawkes
#Privacy #DL #CV #facerecognition