Depth Hints are complementary depth suggestions which improve monocular depth estimation algorithms trained from stereo pairs (ICCV 2019)
code:
https://github.com/nianticlabs/depth-hints
paper:
https://arxiv.org/abs/1909.09051
dataset:
https://lmb.informatik.uni-freiburg.de/resources/datasets/SceneFlowDatasets.en.html
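The core idea can be sketched in plain Python with made-up per-pixel loss values (my paraphrase of the paper's mechanism, not the authors' code): the stereo-derived "hint" depth only supervises pixels where warping with the hint gives a lower photometric reprojection loss than warping with the network's own prediction.

```python
def depth_hint_mask(loss_pred, loss_hint):
    """Per-pixel mask: True where the depth hint should supervise the network,
    i.e. where the hint's reprojection loss beats the prediction's."""
    return [lh < lp for lp, lh in zip(loss_pred, loss_hint)]

# toy per-pixel reprojection losses (hypothetical values, one entry per pixel)
loss_with_prediction = [0.10, 0.30, 0.05, 0.40]
loss_with_hint       = [0.20, 0.10, 0.04, 0.50]

print(depth_hint_mask(loss_with_prediction, loss_with_hint))
# [False, True, True, False]
```

The hint is treated as a suggestion rather than ground truth: where the network already explains the stereo pair better, the hint is simply ignored.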
September 22, 2019
Light regression analysis of some Microsoft employees' salary distribution
How basic knowledge of regression and a couple of graphs can make information look much better and clearer.
Link: https://onezero.medium.com/leak-of-microsoft-salaries-shows-fight-for-higher-compensation-3010c589b41e
#regression #simple #salary #infographic
Leak of Microsoft Salaries Shows Fight for Higher Compensation
The numbers range from $40,000 to $320,000 and reveal key details about how pay works at big tech companies
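For reference, the simple regression behind such an analysis fits a line to the data; a minimal ordinary-least-squares sketch on made-up numbers (not the article's data):

```python
# Fit salary ≈ a + b * years_of_experience by ordinary least squares.
# Hypothetical data points, $k salaries -- for illustration only.
xs = [1, 2, 4, 6, 10]          # years of experience
ys = [95, 110, 135, 160, 210]  # salary in $k (made up)

n = len(xs)
mx, my = sum(xs) / n, sum(ys) / n
# slope = covariance(x, y) / variance(x); intercept passes through the means
b = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / sum((x - mx) ** 2 for x in xs)
a = my - b * mx

print(round(a, 1), round(b, 2))  # intercept ≈ 83.7, slope ≈ 12.68 ($k per year)
```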
September 22, 2019
100,000 FACES GENERATED BY AI FREE FOR ANY USE
https://generated.photos/
https://drive.google.com/drive/folders/1wSy4TVjSvtXeRQ6Zr8W98YbSuZXrZrgY
September 23, 2019
FSGAN: Subject Agnostic Face Swapping and Reenactment
New paper on #DeepFakes creation
YouTube demo:
https://www.youtube.com/watch?v=duo-tHbSdMk
Link:
https://nirkin.com/fsgan/
ArXiV:
https://arxiv.org/pdf/1908.05932.pdf
#FaceSwap #DL #Video #CV
September 23, 2019
Torchdata is a PyTorch-oriented library focused on data processing and input pipelines in general.
https://github.com/szymonmaszke/torchdata
GitHub - szymonmaszke/torchdatasets: PyTorch dataset extended with map, cache etc. (tensorflow.data-like)
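The map/cache idea can be sketched with a small map-style dataset in plain Python (a conceptual illustration of the tf.data-like chaining, not torchdatasets' actual API):

```python
class MapCacheDataset:
    """Tiny sketch of a dataset with chainable map() and cache(),
    mimicking the idea behind torchdatasets (hypothetical, simplified)."""

    def __init__(self, items):
        self._items = list(items)
        self._transforms = []   # transforms applied lazily, in order
        self._cache = {}        # index -> transformed item
        self._use_cache = False

    def map(self, fn):
        self._transforms.append(fn)
        return self             # return self so calls chain

    def cache(self):
        self._use_cache = True  # memoize transformed items on first access
        return self

    def __len__(self):
        return len(self._items)

    def __getitem__(self, idx):
        if self._use_cache and idx in self._cache:
            return self._cache[idx]
        item = self._items[idx]
        for fn in self._transforms:
            item = fn(item)
        if self._use_cache:
            self._cache[idx] = item
        return item


ds = MapCacheDataset(range(5)).map(lambda x: x * 2).cache()
print(ds[3])  # 6 -- computed once, then served from the cache
```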
September 24, 2019
2_5203986206391534542.pdf
1.5 MB
Sarbazi, M., Sadeghzadeh, M., & Mir Abedini, S. J. (2019). Improving resource allocation in software-defined networks using clustering. Cluster Computing.
doi:10.1007/s10586-019-02985-3
❇️ @AI_Python_EN
September 24, 2019
AI, Python, Cognitive Neuroscience
If you just published a paper, let us inform the other members.
@ai_python_en
September 24, 2019
September 25, 2019
Liquid Warping GAN: A Unified Framework for Human Motion Imitation, Appearance Transfer and Novel View Synthesis
pdf: https://arxiv.org/pdf/1909.12224.pdf
abs: https://arxiv.org/abs/1909.12224
project page: https://svip-lab.github.io/project/impersonator.html
github: https://github.com/svip-lab/imper
We tackle the human motion imitation, appearance transfer, and novel view synthesis within a unified framework, which means that the model once being trained can be used to handle all these tasks....
September 28, 2019
#AI for Mammography and Digital Breast Tomosynthesis: Current Concepts and Future Perspectives. Krzysztof et al. explain in the newest Radiology article below.
http://bit.ly/2kULbDz
Artificial Intelligence for Mammography and Digital Breast Tomosynthesis: Current Concepts and Future Perspectives | Radiology
Although computer-aided diagnosis (CAD) is widely used in mammography, conventional CAD programs that use prompts to indicate potential cancers on the mammograms have not led to an improvement in d...
September 28, 2019
PyTorch implementations of deep reinforcement learning algorithms and environments
GitHub, by Petros Christodoulou: https://lnkd.in/eRZCQ-d
#pytorch #reinforcementlearning #deeplearning
September 28, 2019
Google researchers just released #ALBERT, which has beaten all previous models across various benchmarks.
Also, did you know that many NLP models now achieve performance that outpaces the average human?
——————————————————
ALBERT uses parameter-reduction techniques to lower memory consumption and increase the training speed of BERT.
1. They topped GLUE (https://lnkd.in/dkWNRVk) — 92.2%
2. They topped the SQuAD leaderboard (https://lnkd.in/d_Xrba8) — 89.4%
3. On RACE, they came third with their ensemble model (https://lnkd.in/d2yWbtC) — 89.4%
——————————————————
Paper at openreview: https://lnkd.in/dzRvWYS
#deeplearning #machinelearning #NLU #NLG #artificialintelligence #ai
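One of ALBERT's parameter-reduction techniques is factorized embedding parameterization: the vocabulary embedding is kept small (size E) and projected up to the hidden size H, instead of storing one huge V x H table. A back-of-the-envelope comparison with illustrative sizes (V = 30000 vocabulary, H = 4096, E = 128 are ALBERT-xxlarge-style numbers; treat them as assumptions):

```python
# Parameter count of the embedding layer, BERT-style vs ALBERT-style.
V, H, E = 30_000, 4_096, 128

bert_style = V * H            # one big V x H embedding table
albert_style = V * E + E * H  # small V x E table, then an E x H up-projection

print(bert_style)    # 122880000
print(albert_style)  # 4364288  -- ~28x fewer embedding parameters
```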
September 29, 2019
Artificial Design: Modeling Artificial Super Intelligence with Extended General Relativity and Universal Darwinism via Geometrization for Universal Design Automation
https://openreview.net/forum?id=SyxQ_TEFwS
September 29, 2019
Bias and Generalization in Deep Generative Models
Blog by Zhao et al.: https://lnkd.in/eRAhsuS
#DeepLearning #GenerativeModels #MachineLearning
September 30, 2019
Self-Paced Learning:
- supervised method from 2010 #NIPS
- idea: start learning with the easiest samples first and only then learn the difficult ones
- distinct from curriculum learning, where samples are pre-classified as easy/hard: here we decide the order on our own
- one measure of easiness: how likely the sample is under the latent model (outliers will be the hardest)
- a better measure (!): how good the initial predictions for the sample are (samples far away from the decision boundary are the easiest)
- for #classification, samples are only easy in the context of other samples!
- the set of easy samples is iteratively enlarged
- results: outperforms CCCP in #DNA motif finding, handwritten digit recognition and other problems
- link: https://papers.nips.cc/paper/3923-self-paced-learning-for-latent-variable-models
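The iteratively-enlarged easy set can be sketched on a toy 1-D regression problem (an illustration of the idea only; the paper optimizes a latent-variable objective with an alternating CCCP-style scheme, and the data here are made up):

```python
# Self-paced learning sketch: fit y ≈ w*x by gradient descent, but in each
# round train only on the k samples with the lowest loss under the current
# model, enlarging k every round so hard samples are learned last.
data = [(1.0, 2.0), (2.0, 4.1), (3.0, 5.9), (4.0, 8.2), (1.5, 9.0)]  # last pair is an outlier
outlier = data[-1]

w = 0.0
history = []  # (easy-set size k, fitted w, was the outlier selected?)
for k in range(1, len(data) + 1):
    # "easy" = the k samples the current model already explains best
    easy = sorted(data, key=lambda p: (w * p[0] - p[1]) ** 2)[:k]
    for _ in range(200):  # gradient descent on the easy subset only
        grad = sum(2 * (w * x - y) * x for x, y in easy) / len(easy)
        w -= 0.05 * grad
    history.append((k, round(w, 2), outlier in easy))

for row in history:
    print(row)  # the outlier enters the easy set only in the final round
```

Until the final round the fit stays close to the clean trend (w ≈ 2); the outlier, being hardest to explain, is deferred to the very end.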
September 30, 2019
GitHub has just launched a new NLP/information retrieval challenge: the CodeSearchNet challenge. The goal of code search is to retrieve relevant code given a natural language query. Along with this, they released a huge dataset with:
- 6M functions across 6 programming languages (Go, Java, Python, etc.)
- associated documentation (docstrings, JavaDoc, etc.) for 2M of those 6M functions
- some metadata (line number and more)
They also included some baseline models (e.g. a BERT-like self-attention model) to help people get started with the challenge. Check it out! #deeplearning #machinelearning
📝 Article: https://lnkd.in/dezzhs9
🔤 Code: https://lnkd.in/dXhRqpE
✴️ @AI_PYTHON_EN
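As a toy illustration of the task (not one of the released baselines), code snippets can be ranked against a natural language query by bag-of-words cosine similarity over their docstrings:

```python
import math
from collections import Counter

def cosine(a, b):
    """Cosine similarity between two texts as bags of lowercase words."""
    va, vb = Counter(a.lower().split()), Counter(b.lower().split())
    dot = sum(va[t] * vb[t] for t in va)
    norm = math.sqrt(sum(v * v for v in va.values())) * math.sqrt(sum(v * v for v in vb.values()))
    return dot / norm if norm else 0.0

# tiny hypothetical corpus: code snippet -> its docstring
corpus = {
    "def add(a, b): ...":    "return the sum of two numbers",
    "def reverse(s): ...":   "reverse a string",
    "def read_json(p): ...": "load a json file from disk",
}

query = "sum two numbers"
best = max(corpus, key=lambda code: cosine(query, corpus[code]))
print(best)  # the add() snippet ranks first
```

Real submissions replace the bag-of-words vectors with learned embeddings of query and code, but the ranking-by-similarity structure is the same.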
September 30, 2019
October 1, 2019
Microsoft open-sourced scripts and notebooks to pre-train and fine-tune the BERT natural language model on domain-specific texts
Github: https://github.com/microsoft/AzureML-BERT
#Bert #Microsoft #NLP #dl
✴️ @AI_PYTHON_EN
October 2, 2019
Deep Reinforcement Learning
CS 285 at UC Berkeley. Lectures will be streamed and recorded.
lectures: https://www.youtube.com/playlist?list=PLkFD6_40KJIwhWJpGazJ9VSj9CFMkb79A
http://rail.eecs.berkeley.edu/deeprlcourse/
✴️ @AI_PYTHON_EN
October 2, 2019
Transformers: State-of-the-art Natural Language Processing for TensorFlow 2.0 and PyTorch
https://huggingface.co/transformers
✴️ @AI_PYTHON_EN
October 2, 2019