honey, wake up
GPTea just dropped
the Guardian
UK to invest £900m in supercomputer in bid to build own "BritGPT"
Treasury announces plans for exascale computer so as not to risk losing out to China
Large language models (LLMs) can improve themselves without human intervention. The authors conducted ablation studies and showed that fine-tuning on reasoning is critical for self-improvement.
the paper
you can now install & run your own local ChatGPT-like model
GitHub
GitHub - antimatter15/alpaca.cpp: Locally run an Instruction-Tuned Chat-Style LLM
Locally run an Instruction-Tuned Chat-Style LLM. Contribute to antimatter15/alpaca.cpp development by creating an account on GitHub.
I have not heard the word IBM even once in this AI chaos
waiting for an AI to rewrite all the software rot
https://fxtwitter.com/tsoding/status/1636036276687192068
FixTweet
Tsoding (@tsoding)
How Fast is Your Computer?
