PyTorch implementation of the Leap Meta-Learner
GitHub: https://github.com/amzn/metalearn-leap
Paper by Flennerhag et al.: https://arxiv.org/abs/1812.01054
#MachineLearning #ArtificialIntelligence #TransferLearning
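For context, a minimal sketch of the kind of meta-learning the repo implements: learn a shared initialization by training on each task from that initialization and then pulling the initialization along the resulting trajectories. The increment used below (average of adapted parameters minus the initialization) is the simpler Reptile-style update, not Leap's exact gradient-path-length meta-gradient; `adapt` and `meta_step` are illustrative names, not functions from the repo.

```python
# Simplified sketch of transferring knowledge across learning processes by
# meta-learning a shared initialization. NOTE: the update below is the
# Reptile-style "move the init toward adapted parameters" rule, NOT Leap's
# exact gradient-path-length meta-gradient from the paper.
import copy
import torch
import torch.nn as nn

def adapt(init_model: nn.Module, loss_fn, batches, inner_lr=0.01, steps=5):
    """Train a copy of the shared initialization on one task for a few steps."""
    task_model = copy.deepcopy(init_model)
    opt = torch.optim.SGD(task_model.parameters(), lr=inner_lr)
    for _ in range(steps):
        for x, y in batches:
            opt.zero_grad()
            loss_fn(task_model(x), y).backward()
            opt.step()
    return task_model

def meta_step(init_model: nn.Module, task_models, meta_lr=0.1):
    """Pull the shared initialization toward the task-adapted parameters."""
    with torch.no_grad():
        for p_init, *p_tasks in zip(init_model.parameters(),
                                    *(m.parameters() for m in task_models)):
            avg = torch.stack([p.detach() for p in p_tasks]).mean(dim=0)
            p_init.add_(meta_lr * (avg - p_init))
```

Each meta-iteration would sample a batch of tasks, call `adapt` on each, and then `meta_step` once; the actual repository computes Leap's pull-forward meta-gradient instead.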
Best paper award at #CVPR2018:
"Taskonomy: Disentangling Task Transfer Learning"
Abstract: Do visual tasks have a relationship, or are they unrelated? For instance, could having surface normals simplify estimating the depth of an image? Intuition answers these questions positively, implying existence of a structure among visual tasks. Knowing this structure has notable value; it is the concept underlying transfer learning and provides a principled way for identifying redundancies across tasks, e.g., to seamlessly reuse supervision among related tasks or solve many tasks in one system without piling up the complexity. We propose a fully computational approach for modeling the structure of the space of visual tasks (...).
Paper: https://arxiv.org/pdf/1804.08328.pdf
Data: http://taskonomy.stanford.edu
#award #artificialintelligence #deeplearning #transferlearning
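To make the idea concrete, here is a heavily simplified sketch of the affinity step: given measured transfer performances between pairs of tasks, rank the best source tasks for each target. The paper's actual pipeline additionally normalizes these scores with the Analytic Hierarchy Process and extracts the final taxonomy with a binary integer program, none of which is reproduced here; the type names below are illustrative.

```python
# Heavily simplified sketch of the task-affinity idea: given measured transfer
# scores (how well frozen features from a source task support a target task),
# rank the best sources per target. The paper's full pipeline also applies
# AHP normalization and a binary integer program; both are omitted here.
from typing import Dict, List, Tuple

Affinity = Dict[Tuple[str, str], float]  # (source_task, target_task) -> score

def best_sources(affinity: Affinity, k: int = 2) -> Dict[str, List[str]]:
    """For each target task, return the k sources whose features transfer best."""
    ranked: Dict[str, List[str]] = {}
    for target in {t for _, t in affinity}:
        candidates = [(s, score) for (s, t), score in affinity.items() if t == target]
        candidates.sort(key=lambda pair: pair[1], reverse=True)
        ranked[target] = [s for s, _ in candidates[:k]]
    return ranked
```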
"Taskonomy: Disentangling Task Transfer Learning"
Abstract : Do visual tasks have a relationship, or are they unrelated? For instance, could having surface normals simplify estimating the depth of an image? Intuition answers these questions positively, implying existence of a structure among visual tasks. Knowing this structure has notable values; it is the concept underlying transfer learning and provides a principled way for identifying redundancies across tasks, e.g., to seamlessly reuse supervision among related tasks or solve many tasks in one system without piling up the complexity. We proposes a fully computational approach for modeling the structure of space of visual tasks (...).
Paper: https://arxiv.org/pdf/1804.08328.pdf
Data: http://taskonomy.stanford.edu
#award #artificialintelligence #deeplearning #transferlearning
Transfusion: Understanding Transfer Learning for Medical Imaging
arXiv: https://arxiv.org/abs/1902.07208
#biolearning #dl #transferlearning
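For reference, a minimal sketch of the baseline the paper investigates: fine-tuning an ImageNet-pretrained network on a medical classification task versus training the same architecture from scratch. ResNet-50 and the two-class head are illustrative assumptions, not details taken from the paper.

```python
# Sketch of the standard transfer-learning setup the paper studies: an
# ImageNet-pretrained backbone with a new task head, compared against the
# same architecture trained from random initialization. ResNet-50 and the
# two-class head are illustrative assumptions.
import torch
import torch.nn as nn
import torchvision

num_classes = 2  # e.g. healthy vs. pathological; assumption for illustration

# Transfer learning: start from ImageNet weights, replace the classifier head.
# (Newer torchvision versions use the `weights=` argument instead of `pretrained`.)
model = torchvision.models.resnet50(pretrained=True)
model.fc = nn.Linear(model.fc.in_features, num_classes)

# Baseline: identical architecture, randomly initialized.
scratch = torchvision.models.resnet50(pretrained=False)
scratch.fc = nn.Linear(scratch.fc.in_features, num_classes)

# Both would then be trained on the medical dataset with the same recipe, e.g.:
optimizer = torch.optim.SGD(model.parameters(), lr=1e-3, momentum=0.9)
```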
The State of Transfer Learning in NLP
By Sebastian Ruder: http://ruder.io/state-of-transfer-learning-in-nlp/
#TransferLearning #NaturalLanguageProcessing #NLP
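As a concrete companion to the post, a minimal sketch of the pretrain-then-fine-tune workflow it surveys, using the Hugging Face `transformers` library; the checkpoint, the binary classification task, and the hyperparameters are illustrative assumptions, not recommendations from the post.

```python
# Sketch of the pretrain-then-fine-tune workflow: load a pretrained language
# model, attach a fresh task head, and update both on the downstream task.
# Checkpoint, task, and hyperparameters are illustrative assumptions.
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

checkpoint = "bert-base-uncased"  # any pretrained LM checkpoint
tokenizer = AutoTokenizer.from_pretrained(checkpoint)
model = AutoModelForSequenceClassification.from_pretrained(
    checkpoint, num_labels=2  # new task-specific classification head
)

# One fine-tuning step on a toy batch: the pretrained encoder and the fresh
# head are updated jointly on the downstream objective.
batch = tokenizer(["great movie", "terrible movie"],
                  padding=True, return_tensors="pt")
labels = torch.tensor([1, 0])
optimizer = torch.optim.AdamW(model.parameters(), lr=2e-5)

optimizer.zero_grad()
outputs = model(**batch, labels=labels)
outputs.loss.backward()
optimizer.step()
```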