Your Classifier is Secretly an Energy-Based Model and You Should Treat it Like One


Classifiers are secretly energy-based models! Every softmax giving p(c|x) has an unused degree of freedom, which can be used to compute the input density p(x). This makes classifiers into #generative models without changing the architecture.

With a few math tricks, the authors connect a joint Energy-Based Model (EBM) to a usual #classifier with #softmax. It turns out that an EBM has been hiding inside this usual classifier all along.

Our key observation in this work is that one can slightly re-interpret the logits obtained from fθ to define p(x, y) and p(x) as well. Without changing fθ, one can re-use the logits to define an energy-based model of the joint distribution of data point x and labels y. The normalizing constant cancels out, yielding the standard Softmax parameterization. Thus, we have found a generative model hidden within every standard discriminative model!
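
Concretely, here is a short sketch of the re-interpretation in the paper's notation, where fθ(x)[y] is the logit for class y:

pθ(x, y) = exp(fθ(x)[y]) / Z(θ)
pθ(x) = Σ_y exp(fθ(x)[y]) / Z(θ)
Eθ(x) = -log Σ_y exp(fθ(x)[y])

Dividing pθ(x, y) by pθ(x) cancels the intractable Z(θ) and recovers the usual Softmax pθ(y|x), so classification is untouched, while Eθ(x) now gives an unnormalized log-density over inputs.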

In their experiments, the authors show that this approach improves adversarial robustness and calibration (the model's confidence tracks its accuracy much more closely across classes), while classification accuracy stays competitive. And of course, it can generate pictures. An interesting result from a math trick!
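
To make the trick concrete, here is a minimal, hypothetical PyTorch sketch: the energy is just the negative logsumexp of the logits, and images are generated by running SGLD (Stochastic Gradient Langevin Dynamics) on the input, as in the paper. The toy network and the SGLD hyperparameters below are illustrative placeholders, not the authors' setup (they use a Wide-ResNet and a carefully tuned sampler).

```python
import torch
import torch.nn as nn

# Toy stand-in for f_theta: any classifier mapping inputs to K logits works.
class Classifier(nn.Module):
    def __init__(self, num_classes: int = 10):
        super().__init__()
        self.net = nn.Sequential(
            nn.Flatten(),
            nn.Linear(3 * 32 * 32, 256),
            nn.ReLU(),
            nn.Linear(256, num_classes),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.net(x)  # logits f_theta(x)

def energy(model: nn.Module, x: torch.Tensor) -> torch.Tensor:
    # E_theta(x) = -logsumexp_y f_theta(x)[y]; low energy <=> high p(x).
    return -torch.logsumexp(model(x), dim=1)

def sgld_sample(model, shape, steps=20, step_size=1.0, noise_std=0.01):
    # SGLD on the *input*: follow -dE/dx plus Gaussian noise.
    # Step count and sizes here are illustrative, not the paper's values.
    x = torch.rand(shape) * 2 - 1  # init uniformly in [-1, 1]
    for _ in range(steps):
        x = x.detach().requires_grad_(True)
        grad, = torch.autograd.grad(energy(model, x).sum(), x)
        x = x - 0.5 * step_size * grad + noise_std * torch.randn_like(x)
    return x.detach()

model = Classifier()
imgs = sgld_sample(model, (4, 3, 32, 32))  # "generated" images
print(energy(model, imgs))                 # their (negated) unnormalized log-densities
```

In the paper, the same sampler also runs during training to estimate the gradient of log p(x); here it only illustrates sampling from an untrained classifier's density.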


paper: http://arxiv.org/abs/1912.03263
tweet: http://twitter.com/DavidDuvenaud/status/1204143678865866752
github: https://github.com/wgrathwohl/JEM