Forwarded from Robi makes stuff (Robi)
The ones that didn't make it
https://www.pinterest.com/pin/774124931145523/
https://www.pinterest.com/pin/1125968742974608/
https://www.pinterest.com/pin/38210296834862473/
https://www.pinterest.com/pin/1026398571332038751/
https://www.pinterest.com/pin/6825836927944723/
https://www.pinterest.com/pin/1548181186047857/
https://www.pinterest.com/pin/5066618330561553/
https://www.pinterest.com/pin/260857003411466884/
🗣️Yann LeCun: “LLMs aren’t a bubble, the AGI hype is.”
Meta’s chief scientist Yann LeCun pushed back against claims that large language models are an investment bubble. In his view, the money flowing into AI isn’t misplaced, but the expectations around AGI are.
🖱 LeCun argues that LLMs already have lasting, practical value and will remain useful across industries for years.
🖱 The real “bubble,” he says, is the belief that scaling current models alone will reach human-level intelligence.
🖱 True progress, in his view, requires scientific breakthroughs, not just more data, parameters, or compute power.
🖱 “We’re missing something important,” LeCun warned, suggesting that the current deep learning paradigm needs fundamental innovation.
LeCun’s message lands as a reality check for the AI boom: LLMs may be profitable, but they are not magical.