Rob Stack
167 subscribers
142 photos
25 videos
3 files
91 links
Rob Stack is where I document my progress in software engineering👨‍💻 and share helpful tips and tech knowledge for fellow learners. And more of me.
Forwarded from Natyiu (Naty)
I heard people say, “What if it doesn’t work?” but I think the regret of failure is way better than the regret of “I could’ve done it.” Failing at least means you tried. It means you had the courage to move. But sitting there… wondering what would’ve happened? That feeling stays longer. So yeah, maybe it won’t work. Maybe you’ll fail.

But I’d rather fail than live with “what if.”

@natyiu0
3
March smells like getting back on track

@R0bstack
🔥5
Rob Stack
Week 7 @R0bstack
Week 8
~February was tough for me.
~Living without my PC is really difficult; I lost my routine. One good thing I did was start reading.
~It's also hard to keep up with A2SV classes without my PC.

@R0bstack
Covid in High school, War in Uni

Are we the issue? 🤔

@R0bstack
Forwarded from Tech Nerd (Tech Nerd)
If you’re in university or college, make it your mission to find someone who can be your cofounder.

@selfmadecoder
Happy Women's Day 💪🏼😇

@R0bstack
😁123🔥2
🛑Breaking news: My PC survived. 💻

@R0bstack
🔥6👏5
Rob Stack
🛑Breaking news: My PC survived. 💻 @R0bstack
After a long wait, my PC is finally back.

@R0bstack
🔥7
The goal is to be better than yesterday, not to be better than others.

@R0bstack
5🤝3
😁4
Forwarded from Zaya
Recursion in DSA always felt a little poetic to me.
A function whispering the same question to itself...again and again
until the problem slowly softens
and the answer quietly appears.....
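The poem above is a nice description of what recursion actually does: the same question, asked on a smaller input each time, until a base case answers it. A minimal sketch of that idea (the `digit_sum` function is just an illustrative example, not from the original post):

```python
def digit_sum(n: int) -> int:
    # Base case: the question stops whispering.
    if n < 10:
        return n
    # Recursive case: the same question, on a smaller problem.
    return n % 10 + digit_sum(n // 10)

print(digit_sum(4721))  # 4 + 7 + 2 + 1 = 14
```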

#random
@zaya_journal
4
Forwarded from kid cyber
⚠️ Scam Warning

If you see messages like “Premium giveaway!” with a random link, do not click it. 🚫

These are often scam links used to steal your information or hack accounts. Always verify the source before clicking any link.

Be safe.
3
Do you guys know what GPT stands for?
GPT stands for Generative Pre-trained Transformer.

But what is a Transformer? It’s a deep learning architecture that uses attention mechanisms to understand relationships between words in a sequence, and it is cleaner, faster, and far more scalable than the recurrent architectures (RNNs and LSTMs) that came before it. GPT specifically runs on a decoder-only setup.
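To make "attention" concrete, here is a toy sketch of scaled dot-product attention, the core operation inside a Transformer: each token scores every other token for relevance, turns the scores into weights with a softmax, and mixes the value vectors accordingly. The shapes and random inputs are made up for illustration; real GPT models stack many of these with learned projections.

```python
import numpy as np

def attention(Q, K, V):
    # Scores: how relevant is each token to each other token,
    # scaled by sqrt(d) to keep the softmax well-behaved.
    scores = Q @ K.T / np.sqrt(K.shape[-1])
    # Softmax over each row -> attention weights that sum to 1.
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    # Output: weighted mix of the value vectors.
    return weights @ V, weights

# Toy input: 3 tokens, 4-dimensional embeddings (self-attention: Q = K = V).
rng = np.random.default_rng(0)
x = rng.normal(size=(3, 4))
out, w = attention(x, x, x)
print(out.shape)       # (3, 4): one mixed vector per token
print(w.sum(axis=-1))  # each row of weights sums to 1
```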

If you wanna go deep and explore how LLMs are made, there’s a book called How to Build LLMs from Scratch. But building one from scratch right now isn’t really a wise move.

@R0bstack
2