Offshore
Photo
Illiquid
$SNDK $MU - looks like I can recycle my "if you missed storage and memory you should buy this" posts. https://t.co/NSE97z0JjU
tweet
Quiver Quantitative
BREAKING: We just caught another crazy trade.

Someone on Polymarket just bet $40,000 that Trump will acquire Greenland.

They will win $300K if they are correct.

Another potential insider or a degenerate gambler? https://t.co/zLfxmaK3Qz
God of Prompt
the best content creators aren't writers.

they're librarians.

they know where to find information.
they know how to organize it.
they know how to resurface it at the right time.

creation is curation with context.
God of Prompt
RT @godofprompt: R.I.P. few-shot prompting.

Meta AI researchers discovered a technique that makes LLMs 94% more accurate without any examples.

It's called "Chain-of-Verification" (CoVe) and it completely destroys everything we thought we knew about prompting.

Here's the breakthrough (and why this changes everything): 👇
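For context, the Chain-of-Verification recipe is usually described as a four-step loop: draft an answer, plan verification questions, answer them independently, then revise. A minimal sketch of that loop, assuming a generic `llm()` completion call (stubbed out below) and illustrative prompts that are not the paper's exact wording:

```python
# Hedged sketch of Chain-of-Verification (CoVe). `llm` is a placeholder
# for a real model call (e.g. an API client); the prompts are assumptions.

def llm(prompt: str) -> str:
    # Stub: a real implementation would call a language model here.
    return f"[model response to: {prompt[:40]}...]"

def chain_of_verification(question: str) -> str:
    # 1. Draft an initial answer.
    draft = llm(f"Answer the question: {question}")
    # 2. Plan verification questions that probe the draft for errors.
    checks = llm(f"List fact-check questions for this answer:\n{draft}")
    # 3. Answer each verification question independently of the draft.
    answers = llm(f"Answer these questions one by one:\n{checks}")
    # 4. Revise the draft in light of the verification answers.
    return llm(
        f"Question: {question}\nDraft: {draft}\n"
        f"Verification Q&A: {answers}\nWrite a corrected final answer."
    )
```

The point of step 3 is that the verification questions are answered without the draft in context, so the model cannot simply repeat its own hallucinations back to itself.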
Wasteland Capital
$MU has now passed the 300%+ return mark. Meanwhile $AEO just hit new 4-year highs.

$LYFT outperforming +55-60% vs $UBER and $DASH (both of those are negative).

Average return now +139%.

Compounding is beautiful. https://t.co/ZerFN5dOf0

Apart from my earnings reviews, I've only tweeted five new cases on here over the last year.

They ended up the #1 Mag7 YTD $GOOG, the #1 Semi YTD $MU & #1 China LargeCap YTD $BABA. Plus 2 🚀 smallcaps, $AEO & $LYFT.

Average return +111% currently.

Less is more, as they say. https://t.co/LGr5bglq7t
- Wasteland Capital
Illiquid
Wrong kind of storage to be checking out today. https://t.co/mTqMenqxLE
Brady Long
RT @0xgaut: Mom: "how's the job search going?"

you: "Claude, build me a prediction market arbitrage bot" https://t.co/eFnRqOu05Y
God of Prompt
RT @godofprompt: 🚨 DeepMind discovered that neural networks can train for thousands of epochs without learning anything.

Then suddenly, in a single epoch, they generalize perfectly.

This phenomenon is called "Grokking".

It went from a weird training glitch to a core theory of how models actually learn.

Here's what changed (and why this matters now):
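For reference, grokking is usually studied on tiny algorithmic tasks like modular addition: the network memorizes a training split of (a + b) mod P pairs quickly, then test accuracy sits near chance for many epochs before suddenly jumping. A minimal sketch of that standard dataset setup; the model, training loop, and the heavy weight decay that drives the effect are left out and would be assumptions:

```python
# Sketch of the classic grokking task setup (modular addition mod a prime).
# Only the data split is built here; hyperparameters are illustrative.
import random

P = 97  # prime modulus for the toy task

def make_modular_addition_split(train_frac=0.5, seed=0):
    """Return (train, test) lists of ((a, b), (a + b) % P) examples."""
    examples = [((a, b), (a + b) % P) for a in range(P) for b in range(P)]
    random.Random(seed).shuffle(examples)  # deterministic split
    cut = int(train_frac * len(examples))
    return examples[:cut], examples[cut:]

train, test = make_modular_addition_split()
# A small network memorizes `train` quickly; accuracy on `test` stays near
# chance (~1/P) for a long time before the sudden jump the term refers to.
```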
memenodes
RT @Bqmbulu: Loving a girl more than she loves you: https://t.co/gmD2gP0Wxk
memenodes
Porn addiction is so crazy like how you addicted to other nig*as getting pussy?

STOP WATCHING PORN!!

STOP WATCHING PORN!!!

STOP WATCHING PORN!!

YES YOU!! 👀👀.. STOP IT!!

STOP WATCHING PORN!!
- m
memenodes
RT @Mona_Trades: You don't suck at trading

You suck at waiting
memenodes
locking in and seeing no results

Apart from breakup, what else can make a man be like this?
https://t.co/fEOVkIpgF8
- naiive