Offshore
Photo
Illiquid
Will have an update on Wasion later today. $3393. In the meantime, here is my favourite Blasion https://t.co/6Ga5QDtudz
tweet
Offshore
Photo
God of Prompt
RT @godofprompt: I collected every NotebookLM prompt that went viral on Reddit, X, and research communities.
These turned a "cool AI toy" into a research weapon that does 10 hours of work in 20 seconds.
16 copy-paste prompts. Zero fluff.
Steal them all 👇 https://t.co/xRiTcsUnHi
tweet
Offshore
Photo
God of Prompt
RT @godofprompt: 🚨 DeepMind discovered that neural networks can train for thousands of epochs without learning anything.
Then suddenly, in a single epoch, they generalize perfectly.
This phenomenon is called "Grokking".
It went from a weird training glitch to a core theory of how models actually learn.
Here's what changed (and why this matters now):
tweet
Offshore
Video
Fiscal.ai
Live now on @fiscal_ai Terminal. Beautiful charting and features to help you grow your following here on @X.
Live now, new chart exporting interface.
More customizability to make it yours and utilize it to grow here on X by attributing your handle. https://t.co/DRIGL2kxrS - Braden Dennis
tweet