Experienced investors
→ Investing is like diving off a diving board
Inexperienced investors
→ get buyer's remorse, no brand to protect, things get wild
Indeed
DoomPosting
Actually, it looks like the majority of that revenue is coming from perps trading on the majors
Not from sh$tcoin trading
Though neck-and-neck
DoomPosting
Calling it now,
2025 really is the year of funds funding funds
Both in tradfi and crypto
Year of the fund
DoomPosting
DoomPosting
Higher estrogen makes you lie to please the crowd.
Higher testosterone makes you more honest, even if being honest hurts you.
Really, everything is the opposite of what we've been told.
This is the real science, the one the fake "science" wants to bury.
DoomPosting
Higher fatness increases estrogen
Increased estrogen increases lying
Higher fatness, more likely to lie about eating
Yeah, all fits together
DoomPosting
Richard Heart saying his advice would've saved Musk from the latest Twitter-acquisition-related lawsuit
Plausible
DoomPosting
Wait, you know what,
CA using prisoners for firefighting is bad
Why?
It bails out the DEI-destroyed commie local fire departments
Which means they never have to worry about how much their commie ways destroy firefighting
Why should they ever care, when they know male prisoners will always bail them out?
Strongest argument yet to end using prisoners for firefighting
→ Not to mention that paying prisoners nearly $0 destroys the wages of regular citizens,
Just like the H1B
DoomPosting
Forwarded from DoomPosting Chat
FWIW, someone made a $BUTTCOIN on New Year's Eve too
Definitely brings nostalgia
Interesting that $BUTTCOIN also ~doubled in the past week
DoomPosting
Forwarded from Chat GPT
We could have been talking to our desktop computers in English since the 90s!
"Somebody got one of the small versions of Llama to run on Windows 98β¦β
βWe could've been talking to our computers in English for the last 30 years"
- Marc Andreessen
Correct.
The hardware already existed, for decades.
What stopped us?
Extreme aversion to investing money into training much larger AI models.
No one was willing to invest the many millions needed to train an AI model of this size.
In fact, even a decade later, in 2012, people were still hardly willing to spend more than TEN DOLLARS on electricity costs to train a state-of-the-art model, e.g. the AlexNet image model
Many truly underestimate how unwilling people have been to spend money on AI training, until very recently
And this wasn't unrecognized; many of us had screamed this for decades.
No one cared.
Incredible testament to man's unwillingness to invest in certain critical areas of future tech.
→ happens in AI, advanced market mechanisms, proof systems, and a few other similar areas that are unquestionably the future.
We could have been talking to our desktop computers in English since the 90s
Bitter Lesson
"Somebody got one of the small versions of Llama to run on Windows 98β¦β
βWe could've been talking to our computers in English for the last 30 years"
- Marc Andreessen
Correct.
The hardware already existed, for decades.
What stopped us?
Extreme aversion to investing money into training much larger AI models.
No one was willing to invest the many millions needed to train an AI model of this size.
In fact, even a decade later in 2011, people were still hardly willing to spend more than TEN DOLLARS on electricity costs to train a state-of-the-art model, e.g. the AlexNet image model
Many truly under-estimate how unwilling to people have been to spend money on AI training, until very recently
And this wasnβt unrealized, many of us had screamed this for decades.
No one cared.
Incredible testiment to manβs unwillingness to invest in certain critical areas of future tech.
β happens in AI, advanced market mechanisms, proof systems, and a few other similar areas, that are unquestionably the future.
We could have been talking to our desktop computers in English since the 90s
Bitter Lesson
π―5π€―2π₯1π±1π¨1
Forwarded from Chat GPT
We could've been talking to our computers in English for the last 30 years
35.9 tok/sec on a 26-year-old Windows 98 Intel Pentium II CPU with 128 MB RAM
Using a 260K-parameter LLM with the Llama architecture
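As a sanity check on those numbers, here's a rough compute-budget sketch (my own, not from the post: the ~2 FLOPs per parameter per token rule of thumb and the Pentium II throughput figure are both assumptions):

```python
# Back-of-envelope check that 35.9 tok/sec is plausible on a Pentium II.
# Assumptions: ~2 FLOPs per parameter per generated token (standard
# transformer inference rule of thumb), ~100 MFLOP/s sustained CPU throughput.
params = 260_000                 # 260K-parameter Llama-style model
flops_per_token = 2 * params     # ~520K FLOPs to generate one token
cpu_flops_per_sec = 100e6        # assumed sustained Pentium II throughput

ceiling = cpu_flops_per_sec / flops_per_token
print(f"compute ceiling ≈ {ceiling:.0f} tok/sec")  # ~192 tok/sec
```

Even with a conservative throughput guess, the 260K model leaves the reported 35.9 tok/sec comfortably under the hardware's ceiling, which is why real-time generation on 1998 hardware works at all.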