Samanyou Garg
RT @osakasaul: @TheMacSweaty All over chatsonic. It just wrote https://t.co/nnGCW6X4Kd and provided all the data from which I created the table in the article. Just amazing! https://t.co/ohuQwddXSU
tweet
Riley Goodside
I’m not really an LLM *researcher*, I’m more of a… https://t.co/zOFkb5nmi7
tweet
Grades
One thing I wish I'd done when I started building niche websites:

Document all the questions I had.

I must've typed hundreds of problem-solving questions into Google over the last 3 years.

It would be great to tweet about my learnings now.

Lesson: document the process.
tweet
Zohaib Ahmed
Expect to see a lot of alternatives to diffusion modeling in 2023.

“Compared to pixel-space diffusion models, such as Imagen and DALL-E 2, Muse is significantly more efficient due to the use of discrete tokens and requiring fewer sampling iterations”
https://t.co/GHeFnqrFOg https://t.co/NY0xnt9lU8
tweet
pharmapsychotic
RT @arankomatsuzaki: Muse: Text-To-Image Generation via Masked Generative Transformers

Presents Muse, a text-to-image Transformer model that achieves SotA image generation perf while being far more efficient than diffusion or AR models.

proj: https://t.co/JljH4pGZX1
abs: https://t.co/52bc9HuT7o https://t.co/r5n8KJNgFU
tweet
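Both Muse items above turn on the same mechanism: instead of denoising pixels over hundreds of diffusion steps, Muse predicts discrete VQ image tokens in parallel and refines them over a handful of masked-decoding iterations. Below is a minimal sketch of that MaskGIT-style loop; the toy model, grid size, and schedule are illustrative placeholders, not Muse's actual architecture or hyperparameters.

```python
import math
import torch

VOCAB, MASK_ID, SEQ_LEN = 1024, 1024, 256  # 16x16 grid of discrete VQ tokens

def toy_transformer(tokens):
    # Placeholder network: random logits over the codebook. A real model
    # would be a transformer conditioned on text-encoder embeddings.
    return torch.randn(tokens.shape[0], tokens.shape[1], VOCAB)

@torch.no_grad()
def parallel_decode(steps=12, batch=1):
    # Start fully masked; each step fills every masked slot, commits the
    # most confident predictions, and re-masks the rest on a shrinking
    # cosine schedule. `steps` is ~10-25 here, vs. hundreds of denoising
    # steps for a pixel-space diffusion model, which is the efficiency claim.
    tokens = torch.full((batch, SEQ_LEN), MASK_ID, dtype=torch.long)
    for step in range(steps):
        probs = toy_transformer(tokens).softmax(-1)
        conf, pred = probs.max(-1)                   # best token + confidence per slot
        masked = tokens == MASK_ID
        tokens = torch.where(masked, pred, tokens)   # fill every masked slot
        conf = torch.where(masked, conf, torch.ones_like(conf))  # protect committed slots
        # How many slots stay masked after this step (cosine schedule -> 0).
        n_remask = int(SEQ_LEN * math.cos((step + 1) / steps * math.pi / 2))
        if n_remask > 0:
            remask = conf.argsort(dim=-1)[:, :n_remask]  # lowest-confidence slots
            tokens.scatter_(1, remask, MASK_ID)
    return tokens  # a VQ decoder would map these ids back to pixels

print(parallel_decode().shape)  # torch.Size([1, 256]); no MASK_ID left
```

The efficiency argument falls out of the loop structure: roughly a dozen parallel passes over a short grid of discrete tokens, versus hundreds of sequential denoising passes over a full-resolution pixel or latent array.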
Linus (●ᴗ●)
🧠 Quick update on @SensiveXyz

We closed out our 3rd full year with our journaling and mood-tracking app Sensive

So humbled by everyone sending us feedback and telling us how Sensive is changing their lives

Free App (iOS only) https://t.co/f2UVWGi4tZ

🧵 https://t.co/8Gi8xa6Dhh
tweet
Jay Hack
Can we compress large language models without sacrificing perf?

"SparseGPT: Massive Language Models can be Accurately Pruned in One Shot"

Eliminates the need to store or use 50% of the weights of a 175B-param model, with no significant sacrifice in perf

https://t.co/YVISG0l06b

Here's how 👇 https://t.co/XQyTEVpG5d
tweet
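To make "pruned in one shot" concrete, here is a deliberately simplified sketch that zeroes half the entries of a weight matrix by magnitude. Note this is not SparseGPT's actual method: the paper's contribution is a layer-wise, Hessian-aware procedure that both selects the pruned weights and updates the remaining ones to compensate, which is what keeps accuracy intact at 175B scale where plain magnitude pruning would not.

```python
import torch

def prune_half_by_magnitude(weight: torch.Tensor) -> torch.Tensor:
    # One-shot 50% unstructured pruning: zero every entry whose magnitude
    # falls in the bottom half. A baseline for illustration only; SparseGPT
    # instead solves a per-layer reconstruction problem using second-order
    # (Hessian) information.
    k = weight.numel() // 2
    threshold = weight.abs().flatten().kthvalue(k).values
    return weight * (weight.abs() > threshold)

w = torch.randn(1024, 1024)
w_sparse = prune_half_by_magnitude(w)
print(f"sparsity: {(w_sparse == 0).float().mean():.2f}")  # ~0.50
```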
François Chollet
Tweets are generative prompts for your thoughts.
tweet
François Chollet
Making something a lot easier to do isn't incremental improvement, it's zero-to-one enablement: a large group of people who previously couldn't do it now can.
tweet
Vova “words are a motherfucker” Zakharov
The word “variable” looks so weird when written in its unabbreviated form.
tweet