Henok | Neural Nets
1.61K subscribers
Big W, single author, simple and nice paper. Look at that citation number!
πŸ”₯12❀3πŸ‘1
So I randomly discovered an Ethiopian cat's Instagram account with over 100 followers (same as mine lol) that casually follows a bunch of other cats and dogs, with an entire childhood documented online. Honestly, I didn't think I'd ever be jealous of a cat's social lifeπŸ˜‚. Now I'm thinking of turning our cat into the next big influencer.

Btw this is the level of creativity LLMs won't getπŸ˜‚
😁25🀣4πŸ‘1
We've got another AI-research-oriented channel, by Biruk. He's a Master's student at Paris-Saclay and will share cool AI research papers, lectures, codebases, experiments, and opportunities from around the world. Join his channel here.

@ethio_sota
πŸ”₯16❀4πŸ‘1
For anyone interested in computational neuroscience.

Apply to the Simons Computational Neuroscience Imbizo, which will be held in Cape Town, South Africa. They cover all your costs to attend.

https://imbizo.africa/
πŸ”₯7❀6πŸ‘2
πŸ˜‚πŸ˜‚

A. Amharic prompt
B. English prompt
🀣41😁2
Every new model is a combination of two things: training algorithms and data.

Open-source algorithms are up to date; the things that work are usually fairly simple.

But the data is complex, massive, gatekept, ever-changing. Here open models are way behind, and probably hopeless.

Source
πŸ”₯7❀3πŸ‘3πŸ’―2πŸ€”1
🀝10πŸ’―6
Hellooo, what do you think about this? I'll make the code open source if people want to play around with it, maybe for anything interesting: AI integration, Bible verses, or whatever. For now it just displays some random proverbs.

PS: it's for Mac.
πŸ”₯24❀‍πŸ”₯5πŸ‘Œ3
For people who love tokens, two great papers that came out recently.

Dynamic Chunking for End-to-End Hierarchical Sequence Modeling. I saw this yesterday: they created a hierarchical network (H-Net) that learns dynamic, content-aware chunking directly from the data, and it replaces tokenization, finally :)
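The core idea, "put chunk boundaries where the content changes", can be sketched in a few lines. This toy version is not the paper's architecture: it uses a random stand-in embedding table and a fixed cosine-similarity threshold, whereas the real H-Net learns the boundary predictor end to end.

```python
import numpy as np

rng = np.random.default_rng(0)

def dynamic_chunks(byte_seq, embed, threshold=0.5):
    """Toy content-aware chunker: place a boundary wherever the cosine
    similarity between adjacent byte embeddings drops below threshold.
    (H-Net learns this boundary decision end to end instead.)"""
    vecs = embed[byte_seq]                          # (T, d) byte embeddings
    a, b = vecs[:-1], vecs[1:]
    sim = (a * b).sum(1) / (np.linalg.norm(a, axis=1) * np.linalg.norm(b, axis=1))
    boundaries = np.where(sim < threshold)[0] + 1   # split before dissimilar bytes
    return np.split(np.asarray(byte_seq), boundaries)

# random stand-in for a learned byte-embedding table (256 byte values, dim 8)
embed = rng.normal(size=(256, 8))
chunks = dynamic_chunks(list(b"hello world"), embed)
print([bytes(c.tolist()) for c in chunks])
```

Note that repeated bytes (like the "ll" in "hello") always land in the same chunk here, since identical embeddings have cosine similarity 1; everything else depends on the embedding table, which in the real model is trained rather than random.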


Tokenization is NP-Complete
This one is mostly a theoretical paper: they prove that finding an optimal tokenization (either by direct vocabulary selection or by merge operations) is NP-complete, which shows the inherent computational difficulty of the task.
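To get a feel for the vocabulary-selection flavor of the problem, here is a brute-force toy: pick the size-k vocabulary that minimizes the tokenized length of a text. The exhaustive search over vocabularies is exponential in general, which is the kind of blow-up the NP-completeness result formalizes. The greedy longest-match tokenizer and the tiny candidate set below are my simplifications, not the paper's exact setup.

```python
from itertools import combinations

def tokenize_len(text, vocab):
    """Number of tokens under greedy longest-match tokenization.
    Single characters are always available as fallback tokens."""
    i, n_tokens = 0, 0
    while i < len(text):
        match = 1
        for tok in vocab:
            if len(tok) > match and text.startswith(tok, i):
                match = len(tok)
        i += match
        n_tokens += 1
    return n_tokens

def best_vocab(text, candidates, k):
    """Exhaustively try all size-k vocabularies -- exponentially many
    in general, hence the brute force."""
    return min(combinations(candidates, k),
               key=lambda v: tokenize_len(text, v))

text = "ababab"
cands = ["ab", "ba", "aba", "bab"]
vocab = best_vocab(text, cands, 2)
print(vocab, tokenize_len(text, vocab))   # ('aba', 'bab') 2
```

On this tiny example the "obvious" choice ("ab", "ba") gives 3 tokens, while the winning vocabulary ("aba", "bab") covers the whole string in 2; finding such winners at scale is exactly what the paper shows is hard.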
πŸ”₯7
In the past, most ML research was driven by academia, and there was a circle of people at UoToronto, UdM, Stanford, Berkeley, etc. Almost everything that came out, and even most pioneers, were from there. Academia isn't dead now, but it has lost its place.

Now the circle is OpenAI, Anthropic, Meta, DeepMind, xAI, Microsoft Research, etc. Most of the people from that academic circle are running these labs now, but with less and less freedom to explore ideas over time.

Here comes the downfall of the second circle. In academia, people would spend 5 years to make something cool, and they did; most of these things were initiated in university labs rather than in industry. But now everyone is chasing benchmarks, and a 1% gain over the SOTA is almost enough to lead the race. Of course industry has made many great things, but just as scaling is converging, so will the industry labs, unless they have teams doing foundational research that might take years, e.g. FAIR (Meta).
πŸ”₯9❀2πŸ€”1
Forwarded from Beka (Beka)
Rwanda and those cheerful ladies are becoming more and more attractive
😁8
Is this really true?
πŸ‘€4😁2
Well, one way to find out
😁19πŸ”₯12⚑1
How many of you like or work on computer vision? Are there any good Ethiopia-based researchers working on CV doing cool things? Just wondering.

I'm currently in Kigali at ACVSS; it'd be very good to join next year if you like computer vision.
πŸ”₯16❀4
Forwarded from Beka (Beka)
Better Auth was live in Times Square, NYC last night. Crazy how far and how quickly things go ❀️
πŸ”₯12πŸ‘1
AlphaFold isn't that novel a piece of work. I just learned yesterday that there were people who almost showed this 3 years prior and published a paper at a top conference, NeurIPS. AlphaFold did almost exactly the same thing: used deep networks for prediction and scaled it up. However, they didn't even cite that paper, whyyyy sus

I learned that from Daniel Cremers yesterday; they were the ones who deserved the Nobel Prize.
🀯7
Machine learning is just statistics. On steroids. Lots and lots of steroids.


X
πŸ”₯10❀1
Machine learning is life. Karaoke is also life. Therefore machine learning is karaoke

X
😁19❀1