Continuous Learning_Startup & Investment
We journey together through the captivating realms of entrepreneurship, investment, life, and technology. This is my chronicle of exploration, where I capture and share the lessons that shape our world. Join us and let's never stop learning!
๋ฏธ๊ตญ์—์„œ ์ง€๋‚ด๋ฉด์„œ ๋“ค์—ˆ๋˜ ์—ฌ๋Ÿฌ๊ฐ€์ง€ ์ƒ๊ฐ๋“ค์„ ์ ์–ด๋ดค์Šต๋‹ˆ๋‹ค. ๋‘์„œ์—†๋Š” ๊ธฐ๋ก๋“ค์ด์ง€๋งŒ ๋‚˜์ค‘์— ๋Œ์ด์ผœ ๋ดค์„ ๋•Œ์— ์ด ๋•Œ์˜ ์ €๋ฅผ ๊ธฐ์–ตํ•  ์ˆ˜ ์žˆ๋Š” ์ˆ˜๋‹จ์ผ ๊ฒƒ ๊ฐ™์•„ ๊ธฐ๋กํ–ˆ๊ณ  ํ˜น์‹œ ๋ˆ„๊ตฐ๊ฐ€์—๊ฒŒ ๋„์›€์ด ๋  ์ˆ˜ ์žˆ์„๊นŒ ํ•ด์„œ ๊ณต๊ฐœํ•ฉ๋‹ˆ๋‹ค.

์™œ ๋ฏธ๊ตญ์— ์™”๋Š”์ง€ ์™€์„œ ์–ด๋–ค ๊ฒƒ์„ ๋ณด๊ณ  ๋А๋ผ๊ณ  ์žˆ๋Š”์ง€, AI์— ๋Œ€ํ•ด์„œ๋Š” ์–ด๋–ค ์ƒ๊ฐ์„ ๊ฐ€์ง€๊ณ  ์žˆ๋Š”์ง€, ๋ฏธ๊ตญ์—์„œ ํ•ด๋ณผ๋งŒํ•œ ์‚ฌ์—…์˜ ๊ธฐํšŒ์— ๋Œ€ํ•ด์„œ, ๋ฏธ๊ตญ์—์„œ ๋‹ค์‹œ ์‹ ์šฉ์„ ์Œ“์•„๊ฐ€๋Š” ์ƒํ™ฉ, ์ข‹์€ ์ธ์ƒ์„ ์‚ด๊ธฐ ์œ„ํ•ด ๋…ธ๋ ฅํ•˜๋Š” ๊ฒƒ๋“ค, ์ฐฐ๋ฆฌ ๋ฉ๊ฑฐ๊ฐ€ ๋‚จ๊ธด Legacy, ์žฌ๋ฐŒ๊ฒŒ ์ฝ์—ˆ๋˜ ๊ธ€์— ๋Œ€ํ•ด์„œ ๋ฉ”๋ชจํ•ด๋ดค์Šต๋‹ˆ๋‹ค.

https://www.notion.so/matthewcontinuouslearning/23-Oct-Nov-9fdc28312ac54dd0b975c495ee4bfff9?pvs=4
๐Ÿ”ฅ3
๋‹จ๋ฐฑ์งˆ ํ”„๋กœ๊ทธ๋ž˜๋ฐ ํ”Œ๋žซํผ ๊ตฌ์ถ•ํ•˜๋Š” Cradle, $24M ๊ทœ๋ชจ์˜ Series A ์ž๊ธˆ ์กฐ๋‹ฌ
- ๋‹จ๋ฐฑ์งˆ์˜ ์•„๋ฏธ๋…ธ์‚ฐ ์„œ์—ด์ด โ€˜์™ธ๊ณ„์˜ ํ”„๋กœ๊ทธ๋ž˜๋ฐ ์–ธ์–ด์™€ ๋น„์Šทํ•˜๋‹คโ€™๋Š” ํ†ต์ฐฐ์„ ๋ฐ”ํƒ•์œผ๋กœ AI ๋ชจ๋ธ์— ๋ถ„์ž ๊ตฌ์กฐ ํ•™์Šต
Stability AI, ํšŒ์‚ฌ ๋งค๊ฐ ๊ฒ€ํ†  ์ค‘(๋ธ”๋ฃธ๋ฒ„๊ทธ)
- ๊ธฐ์—…์˜ ๋ถˆ์•ˆ์ •ํ•œ ์žฌ๋ฌด ์ƒํƒœ์— ๋Œ€ํ•œ ํˆฌ์ž์ž๋“ค์˜ ์••๋ฐ•๊ณผ CEO ์‚ฌ์ž„ ์ด‰๊ตฌ ๋“ฑ์œผ๋กœ, ํšŒ์‚ฌ ๋งค๊ฐ์„ ๊ฒ€ํ†  ์ค‘
https://news.hada.io/topic?id=12187&utm_source=slack&utm_medium=bot&utm_campaign=T05AXQMJY68

์ „์„ธ๊ณ„ ๋…ผ๋ฌธ์„ AI๊ฐ€ ์ž๋™์œผ๋กœ ํ•œ๊ธ€ ๋ฒˆ์—ญ, ์š”์•ฝ, ๋ถ„์„, QnA๋ฅผ ํ•œ๋ฐฉ์—!!!
*ArXiv๋Š” ๊ธ€๋กœ๋ฒŒ ๋…ผ๋ฌธ ๊ฒ€์ƒ‰ ํฌํ„ธ ์„œ๋น„์Šค์ž…๋‹ˆ๋‹ค.

ArXiv GPT ์œ ํŠœ๋ธŒ ์†Œ๊ฐœ ๋™์˜์ƒ: https://youtu.be/-kpz0mjAd3Y

ArXiv์˜ ์ „์„ธ๊ณ„ ๋ชจ๋“  ๋…ผ๋ฌธ์„ AI๊ฐ€ ํ•œ๊ธ€๋กœ ์ „๋ฌธ ๋ฒˆ์—ญ, ์š”์•ฝ, ๋ถ„์„, QnA๋ฅผ ์ง€์›ํ•ฉ๋‹ˆ๋‹ค.
ArXiv๋Š” ๊ธ€๋กœ๋ฒŒ ๋…ผ๋ฌธ ๊ฒ€์ƒ‰ ํฌํ„ธ ์„œ๋น„์Šค์ž…๋‹ˆ๋‹ค.
MS, Copilot์˜ ์ถœ์‹œ ์˜ˆ์ • ๊ธฐ๋Šฅ ๊ณต๊ฐœ
- ํ–ฅํ›„ ๋ช‡ ์ฃผ ๋‚ด์— GPT-4 turbo ๊ธฐ๋Šฅ ํƒ‘์žฌ
- Bing ์ด๋ฏธ์ง€ ๊ฒ€์ƒ‰, GPT-4 Vision, ์›น ๊ฒ€์ƒ‰ ๋ฐ์ดํ„ฐ๋ฅผ ๊ฒฐํ•ฉํ•œ ๋ฉ€ํ‹ฐ๋ชจ๋‹ฌ ๊ฒ€์ƒ‰ ๊ธฐ๋Šฅ๊ณผ ์ฝ”๋“œ ์ธํ„ฐํ”„๋ฆฌํ„ฐ ๊ธฐ๋Šฅ ํƒ‘์žฌ ์˜ˆ์ •
- ์‚ฌ์šฉ์ž์˜ ์งˆ๋ฌธ์— ๋Œ€ํ•œ ์˜๋„๋ฅผ ํŒŒ์•…ํ•˜์—ฌ, ์˜๋„์— ๋งž๊ฒŒ ํ™•์žฅ๋œ ๊ฒ€์ƒ‰์„ ์ง„ํ–‰ํ•˜๋Š” โ€˜Deep Searchโ€™ ๊ธฐ๋Šฅ ํƒ‘์žฌ ์˜ˆ์ •
(๊ธ€๋กœ๋ฒŒ ํ†ฑ ๊ณผํ•™์ž + ํˆฌ์ž์ž + $50mil. ์ž๊ธˆ ๋ฐ›์€ ํ›„ ์„ค๋ฆฝ 2๋…„ ๋งŒ์— ๋ฌธ๋‹ซ๋Š” ๋ฐ”์ด์˜คํ…)

============
ํ•˜๋ฒ„๋“œ/MIT์˜ ๋ธŒ๋กœ๋“œ์—ฐ๊ตฌ์†Œ, David Liu ๊ต์ˆ˜๊ฐ€ ์ฐฝ์—…ํ•œ ์„ค๋ฆฝ 2๋…„๋œ Resonance Medicine๊ฐ€ ๋ฌธ์„ ๋‹ซ๋Š”๋‹ค.

์ด ํšŒ์‚ฌ๋Š” Atlas Venture, ARCH Venture Partners, F-Prime Capital, GV, Newpath Partners๊ณผ ๊ฐ™์ด ์Ÿ์Ÿํ•œ ๋ฐ”์ด์˜ค ์ „๋ฌธ VC๋“ค๋กœ๋ถ€ํ„ฐ $50mil. ์ด์ƒ์˜ ํˆฌ์ž์œ ์น˜๋ฅผ ๋ฐ›์•˜๊ณ , ์•„์ง ์Šคํ‹ธ์Šค ๋ชจ๋“œ์— ์žˆ์—ˆ๋‹ค.

David Liu ๊ต์ˆ˜๋Š” ์œ ์ „์ž ํŽธ์ง‘ ๋ถ„์•ผ์˜ ๊ธ€๋กœ๋ฒŒ ํ†ฑ ๊ณผํ•™์ž์ด๋ฉฐ. ์ด ๋ถ„์•ผ ํ†ฑ ๋ฐ”์ด์˜คํ… Editas Medicine, Beam Therapeutics, Prime Medicine ์˜ ์ฐฝ์—…์ž์ด๊ธฐ๋„ ํ•˜๋‹ค.

๊ทธ๋Ÿฐ๋ฐ ์ง€๋‚œ ๊ธˆ์š”์ผ์— ์ž„์ง์›๋“ค์—๊ฒŒ ํšŒ์‚ฌ๋ฅผ ๋ฌธ๋‹ซ๊ธฐ๋กœ ํ–ˆ๋‹ค๊ณ  ํ†ต๋ณดํ–ˆ๋‹ค๊ณ  ํ•œ๋‹ค.

===========
Resonance์€ ํŠน์ • ๋‹จ๋ฐฑ์งˆ์„ ํƒ€๊ฒŸ์œผ๋กœ ํ•œ ๋‹จ๋ฐฑ์งˆ ๋ถ„ํ•ด์ œ(protease)๋ฅผ ์—”์ง€๋‹ˆ์–ด๋งํ•˜์—ฌ ์น˜๋ฃŒ์ œ๋กœ ๊ฐœ๋ฐœํ•˜๋ ค๊ณ  ํ–ˆ์—ˆ๋‹ค.

์œ ์ „์ž ํŽธ์ง‘ ๋ถ„์•ผ๋Š” ์•„๋‹ˆ์ง€๋งŒ Liu ๊ต์ˆ˜๋Š” ํšจ์†Œ ์—”์ง€๋‹ˆ์–ด๋ง ์ „๋ฌธ ์ƒํ™”ํ•™์ž์ด๋ฏ€๋กœ ๊ด€๋ จ์ด ์žˆ๋‹ค.

ํšŒ์‚ฌ๋ฅผ ์™œ ์ ‘๋Š”์ง€ ๋ฐํžˆ์ง€๋Š” ์•Š์•˜์œผ๋‚˜ ์™ธ๋ถ€ ํŽ€๋”ฉ ํ™˜๊ฒฝ ๋•Œ๋ฌธ์€ ์•„๋‹ˆ๋ผ๊ณ  ํ•˜๋‹ˆ, ์•„๋งˆ ๊ธฐ์ˆ ์ ์ธ ์ด์Šˆ์ผ ๊ฒƒ์œผ๋กœ ์ถ”์ •๋œ๋‹ค. ํŽ€๋”ฉ ํ™˜๊ฒฝ์ด ์•„๋ฌด๋ฆฌ ์–ด๋ ค์›Œ๋„ David Liu ๊ต์ˆ˜์˜ ์œ ๋ช…์„ธ์™€ ํˆฌ์ž์ž ๋ฆฌ์ŠคํŠธ๋ฅผ ๋ณผ ๋•Œ, ๊ธฐ์ˆ ๋งŒ ์ œ๋Œ€๋กœ ์ฆ๋ช…๋˜๋ฉด ํฌ๊ฒŒ ํˆฌ์ž์œ ์น˜๋ฅผ ํ•  ์ˆ˜ ์žˆ์—ˆ์„ ๊ฒƒ์ด๋‹ค.

์•„๋ฌด๋ฆฌ ๊ธ€๋กœ๋ฒŒ ํ†ฑ ๊ณผํ•™์ž, ํˆฌ์ž์ž, ์ž๊ธˆ์ด ์žˆ์–ด๋„ ๊ธฐ์ˆ ์ด ์•ˆ๋œ๋‹ค๊ณ  ํŒ๋‹จํ•˜๋ฉด ๋นจ๋ฆฌ ์ •๋ฆฌํ•˜๋Š” ๊ฒŒ ์ด์ต์ด๋ผ๊ณ  ์ƒ๊ฐํ•œ ๋“ฏ ํ•˜๋‹ค.

==========
(written by 심수민, Partners Investment)

ํ…”๋ ˆ๊ทธ๋žจ ์ฑ„๋„: https://t.me/global_biohealthcare_news

https://endpts.com/scoop-resonance-medicine-a-protease-biotech-from-david-lius-lab-shuts-down/
"Mamba: Linear-Time Sequence Modeling with Selective State Spaces" is a research paper by Albert Gu and Tri Dao that introduces a new neural network architecture called Mamba. The architecture is designed to address the computational inefficiency of Transformer models, particularly when dealing with long sequences of data.

What is Mamba?
Mamba is a sequence model that integrates a selection mechanism into state space models (SSMs), leading to improved results in general sequence modeling, including language. It is a simplified end-to-end neural network architecture that does not rely on attention or multilayer perceptron (MLP) blocks.

The key innovation in Mamba is the introduction of selective SSMs, which allow the model to selectively propagate or forget information along the sequence length dimension depending on the current token. This selection mechanism enables context-dependent reasoning, which is a significant improvement over previous models that struggled with content-based reasoning.

Mamba is designed to be efficient, with fast inference and linear scaling in sequence length. It achieves up to 5 times higher throughput than Transformers. It also performs well on real data up to million-length sequences.
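The selective-scan recurrence described above — input-dependent B, C, and step size gating what the state keeps or forgets — can be sketched in a few lines of NumPy. This is a toy sketch under assumed shapes and parameterizations (the linear projections `W_B`, `W_C`, `W_dt` and the softplus step size are illustrative), not the paper's hardware-aware fused implementation.

```python
import numpy as np

def selective_ssm_scan(x, A, W_B, W_C, W_dt):
    """Toy selective SSM scan (illustrative shapes, not the paper's kernel).

    x    : (L, D) input sequence
    A    : (D, N) state matrix (entries should be negative for stability)
    W_B  : (N, D) projection making B input-dependent
    W_C  : (N, D) projection making C input-dependent
    W_dt : (D, D) projection for the per-channel step size
    Returns y : (L, D) outputs.
    """
    L, D = x.shape
    N = A.shape[1]
    h = np.zeros((D, N))  # hidden state: one N-dim state per channel
    y = np.zeros((L, D))
    for t in range(L):
        # Selection: B, C, and the step size dt are functions of the token x[t].
        dt = np.log1p(np.exp(x[t] @ W_dt))   # softplus -> positive step, (D,)
        B = W_B @ x[t]                       # (N,)
        C = W_C @ x[t]                       # (N,)
        # Zero-order-hold discretization of A with the selected step size.
        A_bar = np.exp(dt[:, None] * A)      # (D, N); in (0, 1) when A < 0
        # Propagate or forget: small dt keeps the state, large dt overwrites it.
        h = A_bar * h + (dt[:, None] * B[None, :]) * x[t][:, None]
        y[t] = h @ C                         # read out the state
    return y
```

Because B, C, and dt vary with the token, the recurrence is no longer a time-invariant convolution — which is why Mamba relies on a parallel scan rather than an FFT to stay fast. This loop shows only the math, in O(L·D·N) time.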

Performance and Applications
Mamba has demonstrated state-of-the-art performance across several modalities such as language, audio, and genomics. In language modeling, the Mamba-3B model outperforms Transformers of the same size and matches Transformers twice its size, both in pretraining and downstream evaluation. This makes Mamba a promising foundational model for different domains, especially those that require processing of long context sequences.

Limitations
While the paper and the Mamba model have shown promising results, it's important to note that these are initial findings. As with any new model or approach, replication and further testing by the broader scientific community are necessary to fully understand its strengths and limitations. The model's performance on a wider range of tasks and its adaptability to different settings and requirements need to be explored.

Conclusion and Future Work
Mamba represents a significant step forward in sequence modeling, offering a compelling alternative to the widely-used Transformer architecture. Its ability to handle long sequences efficiently and its strong performance across various modalities make it a promising tool for a range of applications.

As for future work, further exploration and testing of Mamba's capabilities are needed. This includes applying the model to a wider range of tasks, exploring its adaptability to different settings, and refining its architecture and algorithms based on feedback and results from these tests. The authors have made the Mamba codebase available for use and further development, which will facilitate these future explorations.
๐Ÿ‘2
์˜ค๋Š˜ ๋ฏธ๋ผํด ๋ ˆํ„ฐ(์ด๋•์ฃผ ๊ธฐ์ž๋‹˜) ๊ธ€์—์„œ ์ธ์ƒ์ ์ธ ๋ถ€๋ถ„

์‹ค๋ฆฌ์ฝ˜๋ฐธ๋ฆฌ ํˆฌ์ž ๊ตฌ๋ฃจ ๋น„๋…ธ๋“œ ์ฝ”์Šฌ๋ผ

์ฒซ์งธ, ๋ฐœํ‘œ๋ฅผ ์‹œ์ž‘ํ•œ์ง€ 60์ดˆ ์•ˆ์— ๋‚ด๊ฐ€ ํ•˜๊ณ ์‹ถ์€ ์–˜๊ธฐ๊ฐ€ ๋ฌด์—‡์ธ์ง€๋ฅผ ํˆฌ์ž์ž๋“ค์ด ์•Œ์•„์•ผํ•ด์š”. (์„œ๋ก ์ด ๊ธธ์ง€ ๋ง๊ณ  ๋ฐ”๋กœ ๋ณธ๋ก ์œผ๋กœ ๋“ค์–ด๊ฐ€์•ผํ•œ๋‹ค๋Š” ๊ฒƒ์ด์ฃ )

๋‘๋ฒˆ์งธ, โ€œ๋‚ด๊ฐ€ ์ด๊ฒƒ์„ ์ž…์ฆํ•˜๋ฉด ์ด๋Ÿฐ ๋ฉ‹์ง„ ์ผ์ด ๋ฒŒ์–ด์ง„๋‹คโ€๋Š” ๊ฑธ ๋ณด์—ฌ์ค˜์•ผํ•ด์š”. (ํˆฌ์ž์ž๋“ค์—๊ฒŒ ๊ฐ€๋Šฅ์„ฑ๊ณผ ์Šคํ† ๋ฆฌ๋ฅผ ๋ณด์—ฌ์ค˜์•ผํ•ด์š”)

์„ธ๋ฒˆ์งธ, ๊ณต๋žตํ•˜๋Š” ์‹œ์žฅ์ด ๋งค์šฐ ํฌ๊ฑฐ๋‚˜, ์•„๋‹ˆ๋ฉด ์ƒˆ๋กญ๊ณ  ํฅ๋ฏธ๋กœ์šด ์‹œ์žฅ์ด๋ผ๋Š” ๊ฑธ ๋ณด์—ฌ์ค˜์•ผํ•ด์š”. (ํฐ ์‹œ์žฅ ํ˜น์€ ๋ฏธ์ง€์˜ ์‹œ์žฅ์„ ๋ณด์—ฌ์ค˜์„œ ํˆฌ์ž์ž๋“ค์˜ ํƒ์š•์„ ์ž๊ทนํ•˜๋Š”๊ฑฐ์ฃ )

๋„ค๋ฒˆ์งธ, PMF(Product Market Fit) ์„ ์ž…์ฆํ–ˆ๊ฑฐ๋‚˜ ์ž…์ฆํ•  ์ˆ˜ ์žˆ๋‹ค๋Š” ๊ฑธ ๋ณด์—ฌ์ค˜์•ผํ•ด์š”.
(์Šคํƒ€ํŠธ์—…์ด๋ž€ ๊ฒฐ๊ตญ PMF ๋ฅผ ์ฐพ๋Š” ๊ฒƒ!)

๋‹ค์„ฏ๋ฒˆ์งธ, ๊ธฐ์ˆ ์ ์ธ ๋ฌธ์ œ๋Š” ๊ด€๋ฆฌํ•  ์ˆ˜ ์žˆ๊ณ , ์‹œ์žฅ๋„ ๋ช…ํ™•ํ•˜๋‹ค๋Š” ๊ฑธ ๋ณด์—ฌ์ค˜์•ผํ•ด์š”.

์ข‹์€ ํŒ€์„ ๊ฐ–๊ณ  ์žˆ๋‹ค๋Š” ๊ฒƒ์€ ๋งˆ์ง€๋ง‰์— ์–˜๊ธฐํ•ด์ฃผ๋ฉด ๋œ๋‹ค๊ณ  ํ•ฉ๋‹ˆ๋‹ค.

์ฝ”์Šฌ๋ผ๋Š” ํ”ผ์นญ ๋ฑ์„ ๋งŒ๋“ค ๋•Œ ๊ฐ€์žฅ ์ฒ˜์Œ ํ•ด์•ผํ•  ์ผ์„ โ€˜์šฐ๋ฆฌ ํšŒ์‚ฌ์— ํˆฌ์žํ•ด์•ผํ•  ์ด์œ โ€™์™€ โ€˜ํˆฌ์žํ•˜์ง€ ๋ง์•„์•ผํ•  ์ด์œ โ€™๋ฅผ ๋ธŒ๋ ˆ์ธ์Šคํ† ๋ฐํ•˜๋Š” ๊ฒƒ์ด๋ผ๊ณ  ํ–ˆ์–ด์š”.

์ด๋‚  ๊ทธ์˜ ๊ฐ•์—ฐ ๋‚ด์šฉ ์ค‘์— ๊ฐ€์žฅ ์ธ์ƒ ๊นŠ์—ˆ๋˜ ๋ถ€๋ถ„. ๋ฐ”๋กœ ์„ฑ๊ณตํ•˜๋Š” ์ฐฝ์—…์ž๊ฐ€ ๊ณตํ†ต์ ์œผ๋กœ ๊ฐ€์ง„ ๋Šฅ๋ ฅ์€ ๋ฐ”๋กœ '๋น ๋ฅธ ํ•™์Šต๋Šฅ๋ ฅ'์ด๋ผ๋Š” ๊ฒƒ์ด์—ˆ์–ด์š”. ์ฐฝ์—…์ž๋“ค์€ ๋งŽ์€ ์‚ฌ๋žŒ์„ ๋งŒ๋‚˜๊ณ , ๋งŽ์€ ์ •๋ณด๋ฅผ ์ˆ˜์ง‘ํ•ด์„œ, ์ตœ๋Œ€ํ•œ ๋งŽ์€ ์ธํ’‹์„ ์Šค์Šค๋กœ์—๊ฒŒ ํˆฌ์ž…ํ•œ๋‹ค๊ณ  ํ•ด์š”. ๊ทธ๋ฆฌ๊ณ  ์ˆ˜๋งŽ์€ ์ธํ’‹์„ ๋ฐ”ํƒ•์œผ๋กœ ์ค‘์š”ํ•œ ์˜์‚ฌ๊ฒฐ์ •์„ ๋‚ด๋ฆฝ๋‹ˆ๋‹ค. ๋ˆ„๊ตฌ์˜ ์˜ํ–ฅ๋„ ๋ฐ›์ง€ ์•Š๊ณ  ์Šค์Šค๋กœ์˜ ์ƒ๊ฐ์œผ๋กœ ๊ฒฐ์ •์„ ๋‚ด๋ฆฝ๋‹ˆ๋‹ค. ์ด๋Ÿฐ ๊ฒฐ์ •์„ ์œ„ํ•ด์„œ ์ค‘์š”ํ•œ ๊ฒƒ์€? ๋ฐ”๋กœ ๋งŽ์€ ์ธํ’‹!
๐Ÿ‘1
์ฐฐ๋ฆฌ ๋ฉ๊ฑฐ์˜ ์ถ”์ฒœ๋„์„œ ๋ชฉ๋ก

1. ๋กœ๋ฒ„ํŠธ ์น˜์•Œ๋””๋‹ˆ, ์„ค๋“์˜ ์‹ฌ๋ฆฌํ•™
2. ๋ง์ฝค ๊ธ€๋ž˜๋“œ์›ฐ, ์•„์›ƒ๋ผ์ด์–ด
3. ๋ฆฌ์ฒ˜๋“œ ๋„ํ‚จ์Šค, ์ด๊ธฐ์  ์œ ์ „์ž
4. ์ œ๋ ˆ๋“œ ๋‹ค์ด์•„๋ชฌ๋“œ, ์ด๊ท ์‡ 
5. Peter Bevelin, Seeking Wisdom
6. ๋ฒค์ž๋ฏผ ํ”„๋žญํด๋ฆฐ, ํ”„๋žญํด๋ฆฐ ์ž์„œ์ „
7. ๋ก  ์ฒ˜๋…ธ, ๋ถ€์˜ ์ œ๊ตญ ๋กํŽ ๋Ÿฌ
8. ์•ค๋“œ๋ฅ˜ ๊ทธ๋กœ๋ธŒ, ํŽธ์ง‘๊ด‘๋งŒ์ด ์‚ด์•„๋‚จ๋Š”๋‹ค.
9. ์œŒ๋ฆฌ์—„ ์†๋‹ค์ดํฌ, ํ˜„๊ธˆ์˜ ์žฌ๋ฐœ๊ฒฌ
10. ์œŒ๋ฆฌ์—„์œ ๋ฆฌ&๋กœ์ € ํ”ผ์…”, YES๋ฅผ ์ด๋Œ์–ด๋‚ด๋Š” ํ˜‘์ƒ๋ฒ•
11. ์กด ๋ณด๊ธ€, ๋ชจ๋“  ์ฃผ์‹์„ ์†Œ์œ ํ•˜๋ผ
12. ์Šคํ‹ฐ๋ธ ๋ ˆ๋น„, In The Plex 0๊ณผ 1๋กœ ์„ธ์ƒ์„ ๋ฐ”๊พธ๋Š” ๊ตฌ๊ธ€, ๊ทธ ๋ชจ๋“  ์ด์•ผ๊ธฐ
13. ๋งคํŠธ ๋ฆฌ๋“ค๋ฆฌ, ์ƒ๋ช… ์„ค๊ณ„๋„, ๊ฒŒ๋†ˆ
14. James Wallace, Hard Drive
15. ๋งฅ์Šค ๋ฒ ์ด์ €๋งŒ, ํŒ๋‹จ๊ณผ ๊ฒฐ์ •
16. ๋ฐ์ด๋น„๋“œ ๋žœ์ฆˆ, ๊ตญ๊ฐ€์˜ ๋ถ€์™€ ๋นˆ๊ณค
17. Garrett hardin, Living Wihtin Limits
18. ๋กœ๋ฒ„ํŠธ ์น˜์•Œ๋””๋‹ˆ, ์„ค๋“์˜ ์‹ฌ๋ฆฌํ•™3 (Yes!)
19. ์กด ๊ทธ๋ฆฌ๋นˆ, ๋น™ํ•˜๊ธฐ
20. Herbert A.Simon, Models of My Life
21. Lawrence M.Krauss, A Universe of Degrees
22. ๋กœ๋ฒ„ํŠธ ํ•ด๊ทธ์ŠคํŠธ๋กฌ, ์›Œ๋ Œ ๋ฒ„ํŽซ ํฌํŠธํด๋ฆฌ์˜ค
23. Gino Segre, A matter of Degrees
24. ๋กœ๋ฒ„ํŠธ ๋ผ์ดํŠธ, 3์ธ์˜ ๊ณผํ•™์ž์™€ ๊ทธ๋“ค์˜ ์‹ 
25. ์กด ๊ทธ๋ฆฌ๋นˆ, ๋”ฅ ์‹ฌํ”Œ๋ฆฌ์‹œํ‹ฐ
26. Connie Bruck, Master of the Game
27. Arthur Herman, How the scots Invented the Modern World
28. Frank Partnoy, Fiasco:ํŒŒ์ƒ๊ธˆ์œต์ƒํ’ˆ ์„ธ์ผ์ฆˆ๋งจ์˜ ๊ณ ๋ฐฑ
29. Carl Van Doren, Benjamin Franklin
30. Gregory Zuckerman, The Greatest Trade Ever
31. ์žฌ๋Ÿฌ๋“œ ๋‹ค์ด์•„๋ชฌ๋“œ, ์ œ3์˜ ์นจํŒฌ์ง€
32. Joseph Frazier Wall, Andrew Carnegie
33. Rober Caro, The Years of Lyndon Johnson (4 books)
34. Istvan Hargittai, The Martians of Science
35. ๋กœ๋ฒ„ํŠธ ํ•ด๊ทธ์ŠคํŠธ๋กฌ, ์›Œ๋ Œ๋ฒ„ํ• ํฌํŠธํด๋ฆฌ์˜ค
36. Roger Done, Getting It Done
37. Les Schwab, Pride in Performance
38. ์›”ํ„ฐ ์•„์ด์ž‘์Šจ, ์•„์ธ์Šˆํƒ€์ธ ์‚ถ๊ณผ ์šฐ์ฃผ
39. ์›”ํ„ฐ ์•„์ด์ž‘์Šจ, ๋ฒค์ž๋ฏผ ํ”„๋žญํด๋ฆฐ
40. Lawrence M. Krauss, A Universe from Nothing
41. Joe Nocera, A Piece of Action
42. ๋‚ธ์‹œ ํฌ๋ธŒ์Šค, ํŒจ๋Ÿฌ๋ฐ์ด์™€ ๋งฅ์Šค์›ฐ
43. ๋ฆฌ์ฒ˜๋“œ ๋„ํ‚จ์Šค, ๋ˆˆ๋จผ์‹œ๊ณ„๊ณต
๋ฒŒ์จ ๋ช‡ ๋…„ ์ „ ์ผ์ด๋‹ค. NCSL ์—ฐ๊ตฌ์‹ค ํ›„๋ฐฐ์˜ ๊ฒฐํ˜ผ์‹์„ ๊ณ„๊ธฐ๋กœ ์—ฐ๊ตฌ์‹ค ์‚ฌ๋žŒ๋“ค์ด ์˜ˆ์‹์žฅ์— ๋ชจ์˜€๋‹ค. ํ›„๋ฐฐ์˜ ๊ฒฐํ˜ผ์‹์ด ๋๋‚œ ํ›„ ์ง€๋„๊ต์ˆ˜๋‹˜๊ณผ ๋‚˜๋ˆด๋˜ ์ด์•ผ๊ธฐ ์ค‘์— ๊ผฌ๋ฆฌ๊ฐ€ ๊ธธ์–ด์ ธ ์ง€๊ธˆ๊นŒ์ง€๋„ ๋จธ๋ฆฟ์† ํ•œ ํŽธ์— ๋ถ™์žกํžŒ ์ฃผ์ œ๊ฐ€ ์žˆ๋‹ค.

๊ฒฐ๊ตญ ์ง€๊ธˆ ์ผ์„ ๊ณ„์†ํ–ˆ์„ ๋•Œ์˜ ๋ชฉํ‘œ์™€ ๋น„์ „์ด ๋ญ”๊ฐ€? ๋ผ๋Š” ์งˆ๋ฌธ์„ ๋ฐ›์•˜๋‹ค. ์•ž์œผ๋กœ ์˜ฌ ๊ฒƒ์œผ๋กœ ์ƒ๊ฐํ•˜๋Š” ์‹œ๋Œ€์— โ€œmachine-driven scienceโ€ ์‹œ๋Œ€๋ผ๋Š” ์ด๋ฆ„์„ ๋ถ™์ด๊ณ  ์„ค๋ช…ํ•ด ๋ณด์•˜๋‹ค.

๊ทธ ๋•Œ๋Š” ๋ฏธ๋ž˜์— ๋Œ€ํ•œ ์ƒ๊ฐ์ด์—ˆ์ง€๋งŒ, 2023๋…„์—” ๊ทธ ๋ฏธ๋ž˜๊ฐ€ ์ด์ œ ๋ˆˆ์•ž๊นŒ์ง€ ์™”๋‹ค.

์‹ค์šฉ์ธ๊ณต์ง€๋Šฅํ•™ํšŒ์—์„œ ๊ฐ์‚ฌํ•˜๊ฒŒ๋„ ํ‰์†Œ์— ์ด์•ผ๊ธฐํ•  ๊ธฐํšŒ๊ฐ€ ๋ณ„๋กœ ์—†๋Š” ์ฃผ์ œ๋ฅผ ๊บผ๋‚ผ ์ˆ˜ ์žˆ๋Š” ์‹œ๊ฐ„์„ ๋งŒ๋“ค์–ด์ฃผ์…จ๋‹ค. ๋‚ด์ผ ์˜ค์ „ ํ‚ค๋…ธํŠธ๋ฅผ ์ค€๋น„ํ•˜๋Š” ์ค‘ ์˜ค๋ž˜๋œ ๊ธฐ์–ต๊ณผ ์ƒ๊ฐ์ด ๋ชฐ๋ ค์™€ ์ž ์‹œ ์ ์–ด๋‘”๋‹ค.

*

๊ธฐ์ˆ  ์ฃผ๋„์˜ ๊ณผํ•™ ๋ฐœ์ „ ์‹œ๋Œ€
The era of technology-driven science

์ด ๊ฐ•์—ฐ์—์„œ๋Š” ๊ณผํ•™์˜ ๋ฐœ์ „์ด ๋” ์ด์ƒ ์ธ๊ฐ„์˜ ์†์œผ๋กœ๋งŒ ์ด๋ฃจ์–ด์ง€์ง€ ์•Š๋Š” ์‹œ๋Œ€์˜ ๋„๋ž˜์— ๋Œ€ํ•ด ์ด์•ผ๊ธฐํ•ฉ๋‹ˆ๋‹ค. ์ธ๋ฅ˜๋Š” 19์„ธ๊ธฐ, 20์„ธ๊ธฐ๋ฅผ ์ง€๋‚˜๋ฉฐ ๊ณผํ•™์˜ ๋ฐœ์ „๊ณผ ํ•จ๊ป˜ ๋†€๋ผ์šด ์—…์ ์„ ๋‹ฌ์„ฑํ–ˆ์Šต๋‹ˆ๋‹ค. 20์„ธ๊ธฐ์—๋Š” ์„ํƒ„์—์„œ ์ „๊ธฐ๋กœ์˜ ์—๋„ˆ์ง€ ์ „ํ™˜, ์ฆ๊ธฐ๊ธฐ๊ด€์—์„œ ๋กœ์ผ“์œผ๋กœ ์ด์–ด์ง„ ์šด์†ก ์ˆ˜๋‹จ์˜ ๋ณ€ํ™” ๋“ฑ ๋™์—ญํ•™ ๋ถ„์•ผ์˜ ํ˜์‹ ์ด ์žˆ์—ˆ์Šต๋‹ˆ๋‹ค. ๋™์‹œ์— ์ธ๋ฅ˜๋Š” ๋ˆˆ์— ๋ณด์ด์ง€ ์•Š๋Š” ๋ฏธ๋””์›€์„ ๊ตฌ์ถ•ํ–ˆ์Šต๋‹ˆ๋‹ค. ์ „์‹ , ์ปดํ“จํ„ฐ, ๋„คํŠธ์›Œํฌ๋กœ ์ด์–ด์ง€๋Š” ์ •๋ณด ์ฒด๊ณ„์˜ ์ง„ํ™”๋Š” 20์„ธ๊ธฐ ๋ฐ 21์„ธ๊ธฐ ์ดˆ๋ฅผ ํ†ตํ‹€์–ด ์ผ์–ด๋‚œ ๊ฑฐ๋Œ€ํ•œ ๋ณ€ํ™”๋ฅผ ์ด๋Œ์—ˆ์Šต๋‹ˆ๋‹ค. ์ด์ œ ์šฐ๋ฆฌ๋Š” ์ •๋ณด ์ฒ˜๋ฆฌ ๋ถ„์•ผ์˜ ํ˜์‹ ์ด ๋”ฅ ๋Ÿฌ๋‹์„ ํ†ตํ•ด ์ •๋ณด ์ƒ์„ฑ ๋ถ„์•ผ๋กœ ์ด์–ด์ง€๋Š” ์‹œ์ ์— ์™€ ์žˆ์Šต๋‹ˆ๋‹ค. ์ธ๊ฐ„์€ ์ƒ์ƒํ•˜๊ณ  ์ƒ์ƒ์„ ํ†ตํ•ด ํ•™์Šตํ•˜๋Š” ๋Šฅ๋ ฅ์ด ์žˆ์Šต๋‹ˆ๋‹ค. ๋”ฅ๋Ÿฌ๋‹์˜ ์ง„ํ™”์™€ ํ•ฉ์„ฑ ๋ฐ์ดํ„ฐ ๋ฐ ์–ธ์–ด ๋ชจ๋ธ์˜ ๊ฑฐ๋Œ€ํ™”๊ฐ€ ์–ด๋–ป๊ฒŒ ์ƒํ˜ธ ์—ฐ๊ฒฐ๋˜์–ด ์ƒˆ๋กœ์šด ๊ณผํ•™์  ๋ฐฉ๋ฒ•๋ก ์˜ ์‹œ๋Œ€๋ฅผ ๋งŒ๋“ค์–ด๋‚ด๋Š”์ง€ ์‚ดํŽด๋ด…์‹œ๋‹ค.

#AAiCON
๊ตฌ๊ธ€, ์ œ๋ฏธ๋‹ˆ 'Gemini 1.0' ์ •์‹๋ฐœํ‘œ (์ž๋ฃŒ: Google Blog)

- ์ˆœ๋‹ค๋ฅด ํ”ผ์ฐจ์ด CEO, ์—ฌ๋Ÿฌ ์ฃผ์š” ๋ฒค์น˜๋งˆํฌ์—์„œ ์ตœ์ฒจ๋‹จ ์„ฑ๋Šฅ ๋‹ฌ์„ฑํ•œ Gemini์— ๋Œ€ํ•ด ์ž์‹ . ์˜ฌํ•ด ์ดˆ Google Brain + DeepMind๊ฐ€ ํ•ฉ์ž‘ํ•ด์„œ ๋งŒ๋“  ์ฒซ LLM

- ์•ŒํŒŒ๊ณ ์˜ ์ฃผ์—ญ ๋ฐ๋ฏธ์Šค ํ—ˆ์‚ฌ๋น„์Šค๊ฐ€ ์ฃผ๋„. ํ…์ŠคํŠธ, ๋ฉ€ํ‹ฐ๋ชจ๋‹ฌ ๋ฒค์น˜๋งˆํฌ ์—ฌ๋Ÿฌ ๋ถ€๋ฌธ์—์„œ GPT-4๋ฅผ ๋Šฅ๊ฐ€ (์‹ค์ œ ์„ฑ๋Šฅ์€ ์กฐ๊ธˆ ๋” ๋ด์•ผ ํ•  ๊ฒƒ์œผ๋กœ ์˜ˆ์ƒ)

์ œ๋ฏธ๋‹ˆ ๋ชจ๋ธ์€ ์ด 3๊ฐ€์ง€
- Gemini Ultra : ์ตœ๊ณ  ์„ฑ๋Šฅ์„ ์œ„ํ•œ ๊ฐ€์žฅ ํฌ๊ณ , ๋›ฐ์–ด๋‚œ ๋ชจ๋ธ
- Gemini Pro : ๊ด‘๋ฒ”์œ„ํ•œ ์ž‘์—…์— ํ™•์žฅ ๊ฐ€๋Šฅํ•œ ๋ชจ๋ธ
- Gemini Nano : ์˜จ๋””๋ฐ”์ด์Šค AI ์ž‘์—…์„ ์œ„ํ•œ ๊ฐ€์žฅ ํšจ์œจ์  ๋ชจ๋ธ

- ์˜ค๋Š˜๋ถ€ํ„ฐ Bard๋Š” Gemini Pro ๋ชจ๋ธ์„ ์‚ฌ์šฉํ•  ๊ฒƒ, ๋˜ํ•œ ๊ตฌ๊ธ€ ํ”ฝ์…€ 8 ํ”„๋กœ ์ œํ’ˆ์—์„œ Gemini Nano ์‹คํ–‰ํ•˜๋„๋ก ํ•  ๊ฒƒ. ํ–ฅํ›„ ๊ฒ€์ƒ‰/๊ด‘๊ณ /ํฌ๋กฌ/๋“€์—ฃAI ๋“ฑ ๋” ๋งŽ์€ ์ œํ’ˆ์— ๋ฐฐํฌํ•  ์˜ˆ์ • PaLM์„ ๋Œ€์ฒดํ•˜๋Š” ๊ฒƒ

- 12์›” 13์ผ๋ถ€ํ„ฐ ๊ฐœ๋ฐœ์ž/๊ธฐ์—…๊ณ ๊ฐ์€ Google Vertex AI (=์ž์‹ ๋งŒ์˜ AI ๊ตฌ์ถ•ํ•˜๋Š” ์„œ๋น„์Šค)๋ฅผ ํ†ตํ•ด Gemini Pro ์‚ฌ์šฉ ๊ฐ€๋Šฅ

- ๊ฐ€์žฅ ๋›ฐ์–ด๋‚˜๊ณ  ํฐ ๋ฒ„์ „์ธ Gemini Ultra๋Š” ๋‚ด๋…„ ์ดˆ ์ถœ์‹œํ•˜๊ธฐ ์ „์— ์„ ํƒ๋œ ๊ณ ๊ฐ๊ณผ ํŒŒํŠธ๋„ˆ์—๊ฒŒ ์ดˆ๊ธฐ ์‹คํ—˜์šฉ์œผ๋กœ ๋ฐฐํฌํ•  ๊ฒƒ. ๋˜ํ•œ ๋‚ด๋…„ ์ดˆ Ultra ํƒ‘์žฌํ•œ 'Bard Advanced'๋„ ์ถœ์‹œํ•  ์˜ˆ์ •
โค1
Gemini is finally out. Its technical report, while 60 pages long, is light on details. I did a quick read-through, and here's the summary.

1. Gemini was written in Jax and trained using TPUs. The architecture, while not explained in detail, seems similar to that of DeepMind's Flamingo, with separate text and vision encoders.

2. Gemini Pro's performance is similar to GPT-3.5 and Gemini Ultra is reported to be better than GPT-4. Nano-1 (1.8B params) and Nano-2 (3.25B params) are designed to run on-device.

3. 32K context length.

4. Very good at understanding vision and speech.

5. Coding ability: the big jump in HumanEval compared to GPT-4 (74.4% vs. 67%), if true, is awesome. However, the Natural2Code benchmark (no leakage on the Internet) shows a much smaller gap (74.9% vs. 73.9%).

6. On MMLU: using CoT to show that Gemini is better than GPT-4 seems forced. In the 5-shot setting, GPT-4 is better (86.4% vs. 83.7%).

7. No information at all on the training data, other than they ensured "all data enrichment workers are paid at least a local living wage."