et/acc
279 subscribers
1.17K photos
54 videos
15 files
359 links
Ethiopian Acceleration.
Growth unlocks choices.



Discussion at: https://t.me/+Prr0V8-VR0c4NTA0
The Big Vision

This technology could redefine what it means to be human—not by replacing our humanity, but by deepening it. Imagine a world where:


Grief is softened by reliving loved ones’ happiest memories.

Education is a sensory journey through history’s greatest minds.

Global empathy is built by sharing lived experiences across borders.


But doomers don't want to listen.
🔥2
We Africans are in a deep sleep.

Why can’t we even come together to create our own AI by combining resources, being less selfish, and working hard?

We have so many young tech folks in their 20s who could put time into open source, yet we’re always late to the party.

This is passive decel.
If you aren't actively accelerating, it's implicit decel.
"Code is cheap. Money now chases utility wrapped in taste, function sculpted with beautiful form, and technology framed in artistry."

Read --> Link
👍3
🏒If you had 10x the agency you have right now, what would you do this week?
A million seconds is 11.5 days
A billion seconds is 31.7 years
A trillion seconds is 317 centuries
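The jump in scale is easy to verify; a quick sketch in Python (the constants are ordinary calendar arithmetic, nothing from the post itself):

```python
SECONDS_PER_DAY = 60 * 60 * 24               # 86,400 seconds in a day
SECONDS_PER_YEAR = SECONDS_PER_DAY * 365.25  # average year, including leap days

million_in_days = 1e6 / SECONDS_PER_DAY                # ≈ 11.6 days
billion_in_years = 1e9 / SECONDS_PER_YEAR              # ≈ 31.7 years
trillion_in_centuries = 1e12 / (SECONDS_PER_YEAR * 100)  # ≈ 317 centuries

print(million_in_days, billion_in_years, trillion_in_centuries)
```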
🔥2
✍️ building applications with Foundation Models

📖 AI Engineering
😂 Spewing DeepShit


But on a serious note: if you fine-tune AI models, it's your time to print BIRR.

Just fine-tune DeepSeek R1 for companies! It's the first open-source model that runs consistently well on local servers for local languages.
😁2
DeepSeek R1 is now live on Azure AI Foundry and GitHub

This will never not be funny, given the Microsoft + OpenAI relationship.

It's kind of available for free; remember the CEO tweeting about Jevons paradox...
The only tutorial you need.

👌Also, the only comment you need.
👍2
from transformers import AutoTokenizer, AutoModelForCausalLM, Trainer, TrainingArguments

# Load model and tokenizer
model_name = "deepseek-ai/DeepSeek-R1"  # in practice, start from a smaller distilled variant
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(model_name)

# Add Amharic tokens (if missing from the vocabulary)
new_tokens = ["ሀ", "ሁ", "ሂ", ...]  # Amharic characters
tokenizer.add_tokens(new_tokens)
model.resize_token_embeddings(len(tokenizer))

# Training arguments
training_args = TrainingArguments(
    output_dir="./results",
    learning_rate=2e-5,
    per_device_train_batch_size=8,
    num_train_epochs=3,
    evaluation_strategy="epoch",
)

# Initialize Trainer (train_dataset / val_dataset are assumed pre-tokenized datasets)
trainer = Trainer(
    model=model,
    args=training_args,
    train_dataset=train_dataset,
    eval_dataset=val_dataset,
)

# Start training
trainer.train()
🔥1
2017

▓▓▓▓▓▓▓░░░░░░░░░ ፴፱% (39%)
There’s only one sure purpose in life,
and that’s the pleasure of feeling alive.
Everything else is bullshit.



Bullshit eats life. Bullshit shits death.



Politics is bullshit.
Jobs are bullshit.
Hate is bullshit.
God is bullshit
when jealously raging, but
catch Her dancing, that’s
life for the living!
No bullshit! Join in!



Twitter is bullshit.
Facebook is bullshit.
Instagram/instabullshit.
Medium has its moments,
but it’s mostly bullshit, too.



TV is bullshit. The internet is bullshit.
This could all be a simulation, a dead
bullshit copy of something once alive.



Fake News is bullshit.
Real News is bullshit.



Walk outside. Hug a tree.
If, with your skin pressed
to the cool, gray bark,
you find yourself looking
at your phone, that’s bullshit.



Your to-do list is bullshit.
Your morning routine is bullshit,
crushed or otherwise.
(Crushed bullshit. Think about it).



So, what isn’t bullshit?



Little kids know this.
New lovers know this.
Flowers know this.
There’s the sun! Turn! Turn!



Play isn’t bullshit.
Love isn’t bullshit.
You can trust joy and glee
to be near-bullshit-free.



Every dead thing consumes your aliveness,
your only sure purpose in life.
So, resist! You’re alive! Now, feel it! Feel it!
Everything else is bullshit.
👎54
The sunset today, Addis.
3