Rob Stack
Week 8
~February was tough for me.
~Living without my PC is really difficult, I lost my routine. One good thing I did was start reading.
~It's also hard to keep up with A2SV classes without my PC.
@R0bstack
Forwarded from Tech Nerd
If you’re in university or college, make it your mission to find someone who can be your cofounder.
@selfmadecoder
Week 9
~Finished reading a book.
~Catching up with A2SV tasks.
~Learned about the K-Means and DBSCAN clustering algorithms in ML.
@R0bstack
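A minimal sketch of what K-Means does under the hood, in plain Python (the toy blob data and names here are mine, not from the A2SV class):

```python
import random

def kmeans(points, k, iters=20, seed=0):
    """Plain K-Means: repeatedly assign each point to its nearest
    centroid, then move each centroid to the mean of its cluster."""
    rng = random.Random(seed)
    centroids = rng.sample(points, k)
    clusters = []
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for p in points:
            nearest = min(
                range(k),
                key=lambda i: (p[0] - centroids[i][0]) ** 2
                              + (p[1] - centroids[i][1]) ** 2,
            )
            clusters[nearest].append(p)
        centroids = [
            (sum(p[0] for p in c) / len(c), sum(p[1] for p in c) / len(c))
            if c else centroids[i]
            for i, c in enumerate(clusters)
        ]
    return centroids, clusters

# two well-separated blobs -> K-Means should recover them
pts = [(0, 0), (0, 1), (1, 0), (10, 10), (10, 11), (11, 10)]
cents, clus = kmeans(pts, k=2)
```

DBSCAN differs mainly in that you don't pick k up front: it grows clusters outward from dense neighborhoods (controlled by an eps radius and a min-samples count) and labels isolated points as noise.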
Forwarded from Zaya
Recursion in DSA always felt a little poetic to me.
A function whispering the same question to itself... again and again
until the problem slowly softens
and the answer quietly appears...
#random
@zaya_journal
Lumina
If you ever struggle with understanding recursion like me, check this site out. The animated visualizations really make it much easier to understand. (From today's A2SV lecture, btw.)
@ahlumina
Just enter the recursive function, and it will visualize it as a tree, making it easier to understand.
@R0bstack
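You can get a rough terminal version of that same idea: print each recursive call indented by its depth, and the output reads like a sideways call tree (a toy sketch, not the site's actual approach):

```python
def fib(n, depth=0):
    """Fibonacci with an indented trace: each line's indentation
    shows how deep the call is, so the output forms a call tree."""
    print("  " * depth + f"fib({n})")
    if n < 2:
        return n
    return fib(n - 1, depth + 1) + fib(n - 2, depth + 1)

fib(3)
# prints:
# fib(3)
#   fib(2)
#     fib(1)
#     fib(0)
#   fib(1)
```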
Do you guys know what GPT stands for?
GPT stands for Generative Pre-trained Transformer.
But what is a Transformer? It's a deep learning architecture that uses attention mechanisms to understand the relationships between words in a sequence, and it's cleaner, faster, and far more scalable than the architectures that came before it. GPT specifically runs on a decoder-only setup.
If you wanna go deep and explore how LLMs are made, there's a book called Build a Large Language Model from Scratch. But building one from scratch right now isn't really a wise move.
@R0bstack
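For a concrete feel of what "attention" means here, this is a bare-bones scaled dot-product attention in plain Python with made-up toy vectors (real Transformers also add learned projections, multiple heads, and, in GPT's decoder-only case, a mask over future positions):

```python
import math

def softmax(xs):
    m = max(xs)  # subtract the max for numerical stability
    es = [math.exp(x - m) for x in xs]
    s = sum(es)
    return [e / s for e in es]

def attention(Q, K, V):
    """Scaled dot-product attention: each query scores every key,
    softmax turns the scores into weights, and the output is the
    weighted sum of the values."""
    d = len(K[0])
    out = []
    for q in Q:
        scores = [sum(qi * ki for qi, ki in zip(q, k)) / math.sqrt(d)
                  for k in K]
        w = softmax(scores)
        out.append([sum(wi * v[j] for wi, v in zip(w, V))
                    for j in range(len(V[0]))])
    return out

# one query attending over two key/value pairs;
# the query matches the first key, so the output leans toward V[0]
Q = [[1.0, 0.0]]
K = [[1.0, 0.0], [0.0, 1.0]]
V = [[1.0, 2.0], [3.0, 4.0]]
out = attention(Q, K, V)
```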
Rob Stack
Build a Large Language Model from Scratch.pdf
3.9 MB
https://github.com/Robel-w/fake-news-detection
A neural network model that classifies news headlines as either Fake or Real.
@R0bstack
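The repo's actual model is surely more involved; as a toy illustration of the pipeline (headline → bag-of-words features → a single-neuron classifier), here's a self-contained sketch with invented headlines and labels:

```python
import math

def featurize(headline, vocab):
    """Bag-of-words: one count per vocabulary word."""
    words = headline.lower().split()
    return [words.count(w) for w in vocab]

def train(texts, labels, vocab, epochs=200, lr=0.5):
    """A single sigmoid neuron (logistic regression) trained with
    gradient descent -- the simplest 'neural network' classifier."""
    w = [0.0] * len(vocab)
    b = 0.0
    for _ in range(epochs):
        for x, y in zip([featurize(t, vocab) for t in texts], labels):
            z = b + sum(wi * xi for wi, xi in zip(w, x))
            p = 1 / (1 + math.exp(-z))
            g = p - y  # gradient of the log-loss w.r.t. z
            w = [wi - lr * g * xi for wi, xi in zip(w, x)]
            b -= lr * g
    return w, b

def predict(headline, vocab, w, b):
    x = featurize(headline, vocab)
    z = b + sum(wi * xi for wi, xi in zip(w, x))
    return "Fake" if 1 / (1 + math.exp(-z)) > 0.5 else "Real"

vocab = ["shocking", "miracle", "report", "official"]
texts = ["shocking miracle cure", "miracle shocking secret",
         "official report released", "report from official source"]
labels = [1, 1, 0, 0]  # 1 = Fake, 0 = Real
w, b = train(texts, labels, vocab)
```

A real version would use a much larger vocabulary (or learned embeddings) and a deeper network, but the training loop has the same shape.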