Am Neumarkt 😱
287 subscribers
89 photos
3 videos
17 files
522 links
Machine learning and other gibberish
Archives: https://datumorphism.leima.is/amneumarkt/
https://github.com/porn-vault/porn-vault
Manage your ever-growing porn collection. Using Vue & GraphQL
If you live in Germany, here is a tip that might be useful: the VAT goes back up to 19% next year.
TachibanaYoshino/AnimeGAN: A TensorFlow implementation of AnimeGAN for fast photo animation! This is the open-source code for the paper 「AnimeGAN: a novel lightweight GAN for photo animation」, which uses the GAN framework to transform real-world photos into anime images.
https://github.com/TachibanaYoshino/AnimeGAN
A new search engine by a former Salesforce chief scientist who helped develop its Einstein AI platform.

The new search engine is called "you".

https://you.com/?refCode=5ac0f0ea
#ML #paper

https://arxiv.org/abs/2012.00152
Every Model Learned by Gradient Descent Is Approximately a Kernel Machine
Deep learning's successes are often attributed to its ability to automatically discover new representations of the data, rather than relying on handcrafted features like other learning methods.
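
The headline result, paraphrased from memory (check the paper for the exact statement): a model f_w trained by gradient descent ends up approximately a kernel machine over the training points,

$$ f_w(x) \approx \sum_i a_i\, K_p(x, x_i) + b, \qquad K_p(x, x') = \int_{c(t)} \nabla_w f_w(x) \cdot \nabla_w f_w(x')\, dt, $$

where the "path kernel" K_p measures how similarly two inputs move the model along the path c(t) that the weights take during training.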
https://events.ccc.de/2020/09/04/rc3-remote-chaos-experience/

CCC is hosting its 2020 event fully online. Everyone can join with a pay-as-you-wish ticket. Join if you like programming, hacking, social events, or learning something crazy and new. 👍👍👍
Interesting idea. The milk carton as an ad platform 😱
#ML

https://arxiv.org/abs/2012.04863

Skillearn: Machine Learning Inspired by Humans' Learning Skills

Interesting idea. I didn't know interleaving was already being used in ML.
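
For context, interleaving is the study trick of mixing practice across topics instead of drilling one block at a time. A toy Python sketch of the difference in training-example order (my illustration; the two task lists and train_step are made up, not from the paper):

```python
# Two hypothetical task datasets (placeholders for real labeled examples).
task_a = [("task_a", i) for i in range(100)]
task_b = [("task_b", i) for i in range(100)]

# Blocked practice: exhaust task A, then move on to task B.
blocked = task_a + task_b

# Interleaved practice: alternate between the tasks throughout training.
interleaved = [ex for pair in zip(task_a, task_b) for ex in pair]

def train_step(example):
    """Stand-in for one optimizer update on a single example."""

for example in interleaved:  # swap in `blocked` to compare schedules
    train_step(example)
```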
#science
The ergodicity problem in economics | Nature Physics
https://www.nature.com/articles/s41567-019-0732-0

I read another paper on the hot-hand/gambler's fallacy a while ago, and its authors took a similar view. Here is the article:
Surprised by the Hot Hand Fallacy? A Truth in the Law of Small Numbers, by Miller and Sanjurjo
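
The running example in Peters' paper is worth simulating yourself: a coin toss that multiplies your wealth by 1.5 on heads and 0.6 on tails. The expectation value grows 5% per round, yet almost every individual player goes broke, because the time-average growth factor is √(1.5·0.6) ≈ 0.95. A quick numpy sketch (the gamble's parameters are from the paper; everything else is mine):

```python
import numpy as np

rng = np.random.default_rng(42)

# Peters' coin-toss gamble: heads multiplies wealth by 1.5, tails by 0.6.
# Per-round expected multiplier: 0.5*1.5 + 0.5*0.6 = 1.05 (looks great),
# but the time-average growth factor is sqrt(1.5*0.6) ≈ 0.95 (ruinous).
n_players, n_rounds = 100_000, 50
factors = rng.choice([1.5, 0.6], size=(n_players, n_rounds))
wealth = factors.prod(axis=1)  # each player's final wealth, starting from 1

# The sample mean is noisy because it is dominated by rare lucky streaks --
# which is itself part of the paper's point.
print("ensemble mean:", wealth.mean())      # theory: 1.05**50 ≈ 11.5
print("median player:", np.median(wealth))  # theory: 0.9**25 ≈ 0.07
```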
#machinelearning
https://arxiv.org/abs/2007.04504
Learning Differential Equations that are Easy to Solve

Jacob Kelly, Jesse Bettencourt, Matthew James Johnson, David Duvenaud

Differential equations parameterized by neural networks become expensive to solve numerically as training progresses. We propose a remedy that encourages learned dynamics to be easier to solve. Specifically, we introduce a differentiable surrogate for the time cost of standard numerical solvers, using higher-order derivatives of solution trajectories. These derivatives are efficient to compute with Taylor-mode automatic differentiation. Optimizing this additional objective trades model performance against the time cost of solving the learned dynamics. We demonstrate our approach by training substantially faster, while nearly as accurate, models in supervised classification, density estimation, and time-series modelling tasks.
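
The idea is compact enough to sketch: for dx/dt = f(x, t), solver cost tracks the size of higher time derivatives of the trajectory, and the second one costs a single forward-mode call, d²x/dt² = (∂f/∂x)·f + ∂f/∂t. A minimal JAX sketch of such a penalty (the toy dynamics and names are mine, not the authors' code; the paper reaches higher orders efficiently with Taylor-mode AD, jax.experimental.jet):

```python
import jax
import jax.numpy as jnp

def f(x, t):
    # Toy dynamics standing in for a neural network: dx/dt = f(x, t).
    return jnp.tanh(x) * jnp.sin(t)

def second_derivative_penalty(x, t):
    # Total time derivative of f along the trajectory:
    # d^2x/dt^2 = (∂f/∂x) · dx/dt + ∂f/∂t, with dx/dt = f(x, t).
    dxdt = f(x, t)
    _, d2xdt2 = jax.jvp(f, (x, t), (dxdt, jnp.ones_like(t)))
    return jnp.sum(d2xdt2 ** 2)  # penalizing this encourages "easy" dynamics

x0, t0 = jnp.ones(3), jnp.array(0.5)
print(second_derivative_penalty(x0, t0))
# In training, one would add lambda * penalty (averaged over points along
# the solve) to the task loss, trading accuracy for cheaper ODE solves.
```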