AkhenOsiris
Secular shifts are powerful. No need to be early to capitalize on massive gains. Cloud computing, social media, streaming content, etc etc.
And now AI. When ChatGPT debuted (Nov. 2022), NVDA quickly 3x'd in 8 months. It then consolidated and did nothing for the next 6 months. That's well over a year to analyze, research, absorb data. Watch how the narrative unfolds.
Since then, NVDA has 3.5x'd (with a few major drawdowns along the way). Second-derivative growth rates may have finally peaked, but capex, tokens, API calls, etc. are still growing. Models are being trained on ever-larger clusters with the latest chips (e.g. Blackwell).
The party will need further advancement, uptake, monetization to keep going, but hard to say with any conviction yet whether it is over or not.
Ahmad
who here would like to see a build video guide for multiple RTX PRO 6000s?
already got the hardware ordered for a couple of 3090s and 5090s build guides btw
yes, there'll be GPU giveaways ;)
first video guide before Thanksgiving
anyway, Buy a GPU keeps on winning
@TheAhmadOsman Probably nothing 👀🔥 https://t.co/sCr29jVV7J - Mike Bradley
Clark Square Capital
Couldn't decide, so we are doing two idea threads: a Japan only, and a special sit only. Lesss go! https://t.co/5NiyqPdEnZ
Ok, guys. It's been about a month since the last idea thread. What's a good prompt for the next one? I will pick the best one and use that. - Clark Square Capital
Dimitry Nakhla | Babylon Capital®
RT @TheShortBear: $MELI
BREAKING: Argentina President Milei's party has won Argentina's midterm election. https://t.co/G2zlzr666e - The Kobeissi Letter
Ahmad
transformers are like onions. onions have layers, onions have layers...
you get it? they both have layers https://t.co/sf17lTd4am
Ilya Sutskever just posted this https://t.co/1ze43nM7hQ - NIK
Ahmad
transformers are like onions. transformers have layers, onions have layers...
you get it? they both have layers https://t.co/XYOcfXbwNF
Ilya Sutskever just posted this https://t.co/1ze43nM7hQ - NIK
Ahmad
this guy knows his stuff
give him a follow
@TheAhmadOsman 5090s for training (RL) small models like Qwen3 8B - TFLOPs matter more
3090s for running larger models (e.g. GLM 4.5 Air) for my own use, where I can tolerate slower speed but more VRAM is needed - mconcat