Jim Fan
OpenAI just dropped their “AGI roadmap” 👀
I read through it. Key takeaways:
Short term:
- OpenAI will become increasingly cautious with the deployment of their models. This could mean that users as well as use cases may be more closely monitored and https://t.co/VxLIZiyR9z… https://t.co/xOpx6k06Pg
DataChazGPT 🤯 (not a bot)
.@Meta launched their #GPT3 competitor #LLaMA today! 👀
Learn about it here:
🔗 https://t.co/bFVvCb5Q08 https://t.co/S9NVVf46Ce
Ethan Mollick
RT @emollick: 🚨It's the AI homework showdown!
I asked Bing & ChatGPT to create assignments and rubrics for an essay on team performance. Then I gave the assignment to the other chatbot, and returned the assignment to the original to assign grades.
Bing gave ChatGPT 70%
ChatGPT gave Bing an A https://t.co/aiyy3ktxhZ
Robert Scoble
RT @karpathy: Didn't tweet nanoGPT yet (quietly getting it to good shape) but it's trending on HN so here it is :) :
https://t.co/qouvC6xuXq
Aspires to be simplest, fastest repo for training/finetuning medium-sized GPTs. So far confirmed it reproduced GPT-2 (124M). 2 simple files of ~300 lines https://t.co/dcjowL4jf3
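The "GPT-2 (124M)" in the tweet refers to GPT-2 small's parameter count. As a sanity check, here is a sketch that recomputes that figure from the standard GPT-2 small configuration (config values are from the public GPT-2 release, not from the tweet; the output head is assumed weight-tied to the token embedding):

```python
# Recompute GPT-2 small's ~124M parameter count from its standard config.
n_layer, n_embd = 12, 768
vocab_size, block_size = 50257, 1024

tok_emb = vocab_size * n_embd            # token embedding (tied with output head)
pos_emb = block_size * n_embd            # learned position embedding

per_layer = (
    2 * n_embd                           # LayerNorm 1 (scale + bias)
    + n_embd * 3 * n_embd + 3 * n_embd   # attention qkv projection (+ bias)
    + n_embd * n_embd + n_embd           # attention output projection (+ bias)
    + 2 * n_embd                         # LayerNorm 2
    + n_embd * 4 * n_embd + 4 * n_embd   # MLP up-projection (+ bias)
    + 4 * n_embd * n_embd + n_embd       # MLP down-projection (+ bias)
)

total = tok_emb + pos_emb + n_layer * per_layer + 2 * n_embd  # + final LayerNorm
print(f"{total:,} parameters (~{total / 1e6:.0f}M)")  # → 124,439,808 (~124M)
```

With weight tying, the embedding matrix does double duty as the output head, so it is counted once; that is how the total lands at roughly 124M rather than ~163M.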
Robert Scoble
RT @appenz: 1/5 Meta launched their GPT-3 competitor LLaMA today. Here is a quick analysis of how it stacks up, how open it is and how it changes the industry landscape.
https://t.co/q2DPcrZtB1
Greg Brockman
Our planning for AGI, including our thoughts on how to navigate the risks and distribute the benefits & governance (reflected deeply in our corporate structure):
https://t.co/Wj9ThqGP7n