Do you think AI can do what you are doing now in the next 5 years?
Final Results
Yes: 52%
No: 17%
Definitely No: 11%
Probably: 13%
I don't really know: 7%
#EthiopianAcademics
I have met people from different countries around the world doing a BSc/MSc/PhD who say things like "my professor/advisor helped me get an opportunity, helped me do this cool thing, mentored me", and so on. But the Ethiopian counterparts (not all, but almost all) are simply a disgrace in this respect.
In the spirit of Microsoft's service outages, I'll show you around their office today. The food is the best thing.
Vision + Language
I wanted to share a very good practical on vision + language. I worked with Yuki Asano to develop this notebook; if you have any interest in the area, it could help a lot.
Basically, these vision+language models use a joint image and text embedding model, which means they map both text and images into the same embedding space (a minimal sketch follows below).
Notebook:
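For concreteness, here is a minimal sketch of what such a joint embedding looks like in code. Assumptions: the openai/clip-vit-base-patch32 checkpoint (via Hugging Face transformers) stands in for whatever model the notebook actually uses, and a dummy image stands in for a real photo.

```python
# Minimal sketch of a joint image-text embedding model (CLIP-style).
# The checkpoint and the dummy image below are assumptions, not taken
# from the notebook itself.
import torch
from PIL import Image
from transformers import CLIPModel, CLIPProcessor

model = CLIPModel.from_pretrained("openai/clip-vit-base-patch32")
processor = CLIPProcessor.from_pretrained("openai/clip-vit-base-patch32")

image = Image.new("RGB", (224, 224), color="red")  # dummy image
texts = ["a photo of a cat", "a solid red square"]

inputs = processor(text=texts, images=image, return_tensors="pt", padding=True)
with torch.no_grad():
    outputs = model(**inputs)

# Both modalities land in the same embedding space, so after L2-normalisation
# a simple dot product gives an image-text similarity score.
image_emb = outputs.image_embeds / outputs.image_embeds.norm(dim=-1, keepdim=True)
text_emb = outputs.text_embeds / outputs.text_embeds.norm(dim=-1, keepdim=True)
print(image_emb @ text_emb.T)  # higher value = better image-text match
```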
Analysis for Llama 3.1.
1. 15.6T tokens, Tools & Multilingual
2. Llama arch + new RoPE (see the sketch after this list)
3. fp16 & static fp8 quant for the 405B
4. Dedicated pad token
5. <|python_tag|><|eom_id|> for tools?
6. RoBERTa used to classify good-quality data
7. Six-stage long-context expansion over 800B tokens
Data mixture (rough token budget below)
- 50% general knowledge
- 25% maths & reasoning
- 17% code data and tasks
- 8% multilingual data
Source
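On point 2, the "new RoPE" is a long-context frequency rescaling. Below is a rough sketch of the idea; the constants (scale factor 8, the transition band, an 8K original context, RoPE base 500000) come from Meta's public reference code and should be treated as assumptions rather than facts from this post.

```python
# Sketch of long-context RoPE frequency rescaling (constants are assumptions).
import math

import torch


def scale_rope_freqs(
    freqs: torch.Tensor,
    scale_factor: float = 8.0,      # assumed long-context scale
    low_freq_factor: float = 1.0,   # assumed transition-band edge
    high_freq_factor: float = 4.0,  # assumed transition-band edge
    old_context_len: int = 8192,    # assumed original context length
) -> torch.Tensor:
    """Keep high-frequency components as-is, stretch low-frequency ones by
    scale_factor, and smoothly interpolate in between."""
    low_freq_wavelen = old_context_len / low_freq_factor
    high_freq_wavelen = old_context_len / high_freq_factor

    wavelen = 2 * math.pi / freqs
    smooth = (old_context_len / wavelen - low_freq_factor) / (
        high_freq_factor - low_freq_factor
    )
    interpolated = (1 - smooth) * freqs / scale_factor + smooth * freqs

    scaled = torch.where(wavelen > low_freq_wavelen, freqs / scale_factor, freqs)
    return torch.where(
        (wavelen <= low_freq_wavelen) & (wavelen >= high_freq_wavelen),
        interpolated,
        scaled,
    )


# Standard RoPE inverse frequencies for head_dim=128 and base=500000 (assumed).
head_dim, base = 128, 500_000.0
inv_freq = 1.0 / (base ** (torch.arange(0, head_dim, 2).float() / head_dim))
print(scale_rope_freqs(inv_freq)[:4])
```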
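And a quick back-of-the-envelope on the data mixture, assuming the quoted percentages apply to the full 15.6T-token corpus (the post doesn't say which training stage they describe):

```python
# Rough token budget per source, from the 15.6T total and the mixture
# percentages quoted above.
TOTAL_TOKENS = 15.6e12

mixture = {
    "general knowledge": 0.50,
    "maths & reasoning": 0.25,
    "code data and tasks": 0.17,
    "multilingual data": 0.08,
}

for source, share in mixture.items():
    print(f"{source:>20}: {share * TOTAL_TOKENS / 1e12:.2f}T tokens")
# -> 7.80T general knowledge, 3.90T maths & reasoning,
#    2.65T code, 1.25T multilingual
```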
Forwarded from Techα’α΅ (Hilina)
It's been an incredibly exciting week in the world of AI:
- OpenAI launched a new search tool called SearchGPT
- Meta updated its Llama language model to version 3.1
- Mistral AI released a new and improved Mistral Large 2 model
- DeepMind's AI achieved a silver medal at the International Math Olympiad
- Elon Musk announced plans to develop Grok 2 and 3
Btw, it's depressing that almost all of them are practically inaccessible: either still in beta or needing high-end GPUs.
I expect
1. Bank rates will rapidly converge to the current parallel-market rate, maybe a bit lower.
2. The market rate will not go down, but it will rise more slowly; over the next 6 months, the % change in ETB/USD will be less than over the last 6 months.
3. Longer term, the ETB will strengthen.
Source: Nemo Semret
Ranking Programming Languages by Energy Efficiency
Compiled languages "tend to be" the most energy-efficient and fastest-running.
...the five slowest languages were all interpreted: Lua, Python, Perl, Ruby and TypeScript. And the five languages which consumed the most energy were also interpreted ones.
Paper
Who do you think are some of the smartest or coolest or awesome people in AI/ML?
I'll start: Alec Radford (the real guy behind almost every OpenAI project and so many amazing things).