The future of programming is going to be super high level, with or without AI.
Anthropic has the best interpretability team.
Check out this video: Scaling Interpretability by Anthropic.
I'm getting back into sparse autoencoders after a while.
Scaling interpretability
Science and engineering are inseparable. Our researchers reflect on the close relationship between scientific and engineering progress, and discuss the technical challenges they encountered in scaling our interpretability research to much larger AI models…
This is interesting: how is TensorFlow beating JAX on a plain 10,000 × 10,000 element-wise matrix multiplication?
Anyway, if you only need per-element operations, use multiply() (very fast), and TF will be faster than JAX.
If you need an actual matrix computation, i.e. a dot product, use matmul() (which takes much longer); JAX is the best for that.
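The distinction above is easy to see in plain NumPy (the thread compares TF and JAX, but the operation-count difference is the same in any framework): an element-wise product does one multiply per entry, O(n²) total, while a true matrix product does n multiply-adds per entry, O(n³) total. A minimal sketch, using NumPy instead of TF/JAX so it runs anywhere:

```python
import time
import numpy as np

n = 2000  # smaller than the thread's 10,000 x 10,000, to keep this quick
a = np.random.rand(n, n)
b = np.random.rand(n, n)

# Element-wise product: one multiply per entry, O(n^2) work.
t0 = time.perf_counter()
elementwise = a * b  # the NumPy equivalent of tf.multiply / jnp.multiply
t_elem = time.perf_counter() - t0

# True matrix product: n multiply-adds per entry, O(n^3) work.
t0 = time.perf_counter()
product = a @ b  # the NumPy equivalent of tf.matmul / jnp.matmul
t_mat = time.perf_counter() - t0

print(f"element-wise: {t_elem:.4f}s, matmul: {t_mat:.4f}s")
```

So a benchmark labeled "matrix multiplication" that actually calls multiply() is measuring a far cheaper operation, which can easily flip which framework looks faster.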
What's the point of having 54 Presidents/Prime Ministers in Africa? Wouldn't 1 or 2 be enough to run the continent the way it is now?
All the countries in the world with little progress in AI/ML are spending their time on AI policy discussions and on ethical AI!
Well, they are doing the right thing. What if they build AGI and don't have the policies right?
The curse of optimization: When it's accurate, it's slow. When it's fast, it's wrong.
This is cool, but I wonder how many external API requests it makes and whether it's feasible as a business. Overall, though, it's cool.
Forwarded from Dagmawi Babi
This is so impressive wtfruits!!!
Lucy β Multilingual AI Voice Assistant & Chatbot for Ethiopians
β’ https://www.linkedin.com/posts/zemenu_lucy-next-generation-ai-ugcPost-7298736880336920576-HD8B
I was impressed by the Amharic TTS, and also when bro wrote Amharic in English ("endezih") and it understood. Not to mention it scrapes Telegram channels and understands major Ethiopian languages.
How can a startup say they work on AI, blockchain, nanotech, quantum computing, etc., all while being Africa-based? They're not disrupting the market, they're defying physics.
If United win today, I don't see any reason why AGI won't happen soon.
Spent the last 2 weeks exploring JAX (I have some prior experience) for no particular reason. For most of the things I tried, NumPy outperformed it, especially on CPU-bound and low-GPU tasks, but I'll try training/inference on big models like Gemma to see the difference.
If you hated PyTorch for not being functional, then go for JAX.
I really want to know if anyone has worked with JAX extensively, and hear how your experience was.
Here is a very good notebook to learn more by some friends: Notebook
Oh, and DeepMind uses JAX.
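On the "functional" point: JAX's transformations (jit, grad, vmap) assume pure functions that return new values instead of mutating arguments in place, which is the opposite of PyTorch's in-place `optimizer.step()` style. A framework-free sketch of that style, written in plain NumPy so it runs without JAX installed (the toy SGD step is illustrative, not from any library):

```python
import numpy as np

def sgd_step(params, grads, lr=0.1):
    """Pure update: returns NEW params instead of mutating in place.

    This is the shape of function JAX transformations expect --
    same inputs always give the same outputs, no side effects.
    """
    return {k: params[k] - lr * grads[k] for k in params}

params = {"w": np.array([1.0, 2.0]), "b": np.array([0.5])}
grads = {"w": np.array([0.2, 0.4]), "b": np.array([0.1])}

new_params = sgd_step(params, grads)

# The original params are untouched: nothing was mutated.
print(params["w"])      # [1. 2.]
print(new_params["w"])  # [0.98 1.96]
```

In real JAX you would write the same pure function over jnp arrays and wrap it in jax.jit; the purity is what makes the tracing and compilation safe.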
indaba-pracs-2024/practicals/Intro_to_ML_using_JAX/Introduction_to_ML_using_JAX.ipynb at main · deep-learning-indaba/indaba-pracs…
Notebooks for the Practicals at the Deep Learning Indaba 2024. - deep-learning-indaba/indaba-pracs-2024
Forwarded from Beka (Beka)
Better Auth is 500 stars away from 10k stars. Could you please give us a star if you haven't? ;)
https://github.com/better-auth/better-auth
GitHub - better-auth/better-auth: The most comprehensive authentication framework for TypeScript
Llama 3.2 400M Amharic
This is a smaller version of Meta's Llama-3.2-1B decoder-only transformer model, pretrained from scratch for 23 hours on a single A100 40GB GPU using 274 million tokens of Amharic text.
https://huggingface.co/rasyosef/Llama-3.2-400M-Amharic
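A quick back-of-the-envelope check on the training budget described above. The parameter and token counts come from the post; the ~20 tokens-per-parameter rule of thumb is an outside assumption (the Chinchilla scaling heuristic), not something the post claims:

```python
# Rough training-budget arithmetic for the model described above.
# Figures from the post: ~400M parameters, 274M training tokens.
params = 400e6
tokens = 274e6

ratio = tokens / params
print(f"{ratio:.3f} tokens per parameter")  # 0.685 tokens per parameter

# For comparison, the Chinchilla rule of thumb (an outside assumption,
# not from the post) suggests roughly 20 tokens per parameter for
# compute-optimal pretraining.
chinchilla_tokens = 20 * params
print(f"Chinchilla-optimal would be ~{chinchilla_tokens / 1e9:.0f}B tokens")
```

So at well under one token per parameter, the model is heavily data-constrained, which is unsurprising for a low-resource language like Amharic.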
Forwarded from Beka (Beka)
Hey guys, good news :)
Better Auth has been accepted into Y Combinator's Spring 2025 batch (X25)!
@kinfishfarms and I will be part of YC's first spring batch. Super excited, and thanks to everyone here for being part of my journey so far :)) but a lot more to come!
Congrats @beka_cru on this! Let's not make fun of him, at least for today.
OpenAI is the company that's going to take us to the next chapter!!!