Wait, you know what,
CA using prisoners for firefighting is bad
Why?
It bails out the DEI-destroyed commie local fire departments
Which means they never have to worry about how much their commie ways destroy firefighting
Why should they ever care, when they know male prisoners will always bail them out?
Strongest argument yet to end using prisoners for firefighting.
↳ Not to mention that paying prisoners nearly $0 destroys the wages of regular citizens,
Just like the H1B
DoomPosting
Forwarded from DoomPosting Chat
FWIW, someone made a $BUTTCOIN on New Year's Eve too
Definitely brings nostalgia
Interesting that $BUTTCOIN also ~doubled in the past week
Forwarded from Chat GPT
We could have been talking to our desktop computers in English since the 90s!
"Somebody got one of the small versions of Llama to run on Windows 98…"
"We could've been talking to our computers in English for the last 30 years"
- Marc Andreessen
Correct.
The hardware already existed, for decades.
What stopped us?
Extreme aversion to investing money into training much larger AI models.
No one was willing to invest the many millions needed to train an AI model of this size.
In fact, even over a decade later, in 2012, people were still hardly willing to spend more than TEN DOLLARS on electricity to train a state-of-the-art model, e.g. the AlexNet image model.
Many truly underestimate how unwilling people have been to spend money on AI training, until very recently.
And this wasn't unrecognized; many of us had screamed this for decades.
No one cared.
Incredible testament to man's unwillingness to invest in certain critical areas of future tech.
↳ Happens in AI, advanced market mechanisms, proof systems, and a few other similar areas that are unquestionably the future.
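A rough back-of-the-envelope check of that electricity figure. The hardware setup (two GTX 580s, roughly six days of training) matches AlexNet's published description, but the wattage and the $/kWh rate here are my own assumptions, not from the source:

```python
# Back-of-the-envelope electricity cost for training AlexNet.
# Assumptions (not from the source): two GTX 580 GPUs at ~244 W TDP each
# running flat out, ~6 days of training, ~$0.12/kWh US average rate.
gpus = 2
watts_per_gpu = 244          # GTX 580 TDP
hours = 6 * 24               # ~6 days of training
rate_usd_per_kwh = 0.12

energy_kwh = gpus * watts_per_gpu * hours / 1000
cost_usd = energy_kwh * rate_usd_per_kwh
print(f"{energy_kwh:.1f} kWh -> ${cost_usd:.2f}")  # -> 70.3 kWh -> $8.43
```

So under these assumptions the whole training run lands just under ten dollars of electricity, consistent with the claim above.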
Bitter Lesson
Forwarded from Chat GPT
We could've been talking to our computers in English for the last 30 years
35.9 tok/sec on a 26-year-old Windows 98 Intel Pentium II CPU with 128 MB RAM
Using a 260K-parameter LLM with the Llama architecture
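A quick sanity check that 35.9 tok/sec is plausible on hardware that old. The ~2 FLOPs/parameter/token rule of thumb and the Pentium II throughput estimate are my assumptions, not from the source:

```python
# Rough arithmetic: is 35.9 tok/sec plausible on a Pentium II?
# Assumptions (not from the source): a forward pass costs ~2 FLOPs per
# parameter per token, and a ~300 MHz Pentium II can sustain on the
# order of tens of MFLOP/s on cache-friendly matrix code.
params = 260_000             # 260K-parameter Llama-style model
flops_per_token = 2 * params
tok_per_sec = 35.9

required_mflops = flops_per_token * tok_per_sec / 1e6
print(f"~{required_mflops:.1f} MFLOP/s needed")  # -> ~18.7 MFLOP/s needed
```

Under these assumptions the model only needs ~19 MFLOP/s, which is well within reach of a late-90s CPU, supporting the "hardware already existed" point.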
Forwarded from Chat GPT
We could've been talking to our computers in English for the last 30 years
Somebody got one of the small versions of Llama to run on Windows 98…
We could've been talking to our computers in English for the last 30 years
- Marc Andreessen
Something really disturbing that's started to appear involving the AI-generated comments on Twitter lately..
Humans, tons of them, now starting to respond to the AI comments as if they're real…
DoomPosting
Telegram's $TON blockchain is moving aggressively back into the US, betting that President-elect Trump's victory will bring a friendlier regulatory environment