Continuous Learning_Startup & Investment
We journey together through the captivating realms of entrepreneurship, investment, life, and technology. This is my chronicle of exploration, where I capture and share the lessons that shape our world. Join us and let's never stop learning!
Gemini is finally out. Its technical report, while 60 pages long, is light on detail. I did a quick read-through; here's a summary.

1. Gemini was written in JAX and trained on TPUs. The architecture, while not explained in detail, seems similar to that of DeepMind's Flamingo, with separate text and vision encoders.

2. Gemini Pro's performance is similar to GPT-3.5 and Gemini Ultra is reported to be better than GPT-4. Nano-1 (1.8B params) and Nano-2 (3.25B params) are designed to run on-device.

3. 32K context length.

4. Very good at understanding vision and speech.

5. Coding ability: the big jump in HumanEval compared to GPT-4 (74.4% vs. 67%), if true, is awesome. However, the Natural2Code benchmark (no leakage on the Internet) shows a much smaller gap (74.9% vs. 73.9%).

6. On MMLU: using CoT (chain-of-thought) prompting to show that Gemini is better than GPT-4 seems forced. In the 5-shot setting, GPT-4 is better (86.4% vs. 83.7%).

7. No information at all on the training data, other than they ensured "all data enrichment workers are paid at least a local living wage."
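The separate-encoder layout mentioned in point 1 can be sketched minimally in JAX. This is a toy illustration under my own assumptions (linear projections as stand-in encoders, concatenation as the fusion step, made-up dimensions); the report gives no such details:

```python
import jax
import jax.numpy as jnp

key = jax.random.PRNGKey(0)
k1, k2, k3 = jax.random.split(key, 3)

D = 64          # shared model width (toy value, an assumption)
TEXT_DIM = 32   # per-token text feature size (toy value)
VIS_DIM = 48    # per-patch vision feature size (toy value)

# Separate "encoders": here just linear projections into the shared width.
W_text = jax.random.normal(k1, (TEXT_DIM, D)) / jnp.sqrt(TEXT_DIM)
W_vis = jax.random.normal(k2, (VIS_DIM, D)) / jnp.sqrt(VIS_DIM)

def encode_text(tokens):    # tokens: (seq_len, TEXT_DIM)
    return tokens @ W_text

def encode_vision(patches): # patches: (num_patches, VIS_DIM)
    return patches @ W_vis

def fuse(text_feats, vis_feats):
    # Concatenate the two modality streams into one sequence
    # before a (not shown) decoder stack would attend over it.
    return jnp.concatenate([vis_feats, text_feats], axis=0)

tokens = jax.random.normal(k3, (10, TEXT_DIM))
patches = jnp.ones((4, VIS_DIM))
seq = fuse(encode_text(tokens), encode_vision(patches))
print(seq.shape)  # (14, 64)
```

The point of the sketch is only the data flow: each modality is embedded by its own encoder into a common width, then merged into a single sequence.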
https://youtu.be/v5tRc_5-8G4

์ฒ˜์Œ์—” ํ˜ผ์ž์„œ ์„œ์นญํ•˜๋Š” ๊ฒƒ๋“ค์„ ๋ฌผ์–ด๋ณด๋Š” ๋Œ€์ƒ์—์„œ, ํ† ๋ก ํ•˜๋Š” ๋Œ€์ƒ์—์„œ ์–ด๋А์ •๋„ ๊ฒฐ์ •๊นŒ์ง€ ์œ„์ž„ํ•  ์ˆ˜ ์žˆ๋Š” ๋Œ€์ƒ๊นŒ์ง€ ์šฐ๋ฆฌ๋Š” ์ปดํ“จํ„ฐ์™€ ์ผํ•˜๋Š” ๋ฐฉ์‹์ด ๋ณ€ํ•˜๋Š” ์‹œ์ ์— ์žˆ๋‹ค.
Influence

Reciprocity: This principle is based on the idea that people feel obligated to return a favor when something is given to them. For example, if a company offers a free sample of their product, customers may feel inclined to make a purchase in return.

Commitment and Consistency: People tend to honor their commitments, especially if they are consistent with their values and beliefs. For instance, if a person publicly commits to a goal, they are more likely to follow through with it to maintain consistency.

Social Proof: People often look to the actions and behaviors of others to determine their own. This is why testimonials and reviews are powerful marketing tools. If a product is highly rated by others, potential customers are more likely to purchase it.

Authority: People tend to follow the lead of credible, knowledgeable experts. This is why companies often use celebrities or experts in their field to endorse their products.

Liking: People are more likely to be persuaded by people they like. This can be influenced by physical attractiveness, similarity, compliments, and cooperative efforts. For example, salespeople often try to build rapport with their customers to increase their likability.

Scarcity: People value items more when they are scarce. Marketers often use this principle by creating a sense of urgency around a product or service, such as a limited-time offer.
ํ•œํ™” ์žฅ๊ฐ‘์ฐจ '๋ ˆ๋“œ๋ฐฑ' 129๋Œ€ ํ˜ธ์ฃผ ์ˆ˜์ถœโ€ฆ3์กฐ์›๋Œ€ ์ˆ˜์ฃผ ์พŒ๊ฑฐ
https://naver.me/xec1ncoy
The $41 million funding raise is across the seed and Series A financing rounds. Lightspeed led the Series A round and co-led the seed with Peak XV Partners. Peak XV and Khosla Ventures also participated in the Series A funding.

The Bengaluru-headquartered startup is building large language models with support for Indian languages, Sarvam AIโ€™s founder Vivek Raghavan told TechCrunch. The startup is also creating a platform that will allow businesses to build with LLMs โ€” โ€œeverything from writing an app, deploying it to popular channels, observing logs, and custom evaluation,โ€ he said.

Sarvam AI, which currently has about 18 employees, is also focusing on building LLMs that use voice as the default interface in India. This strategy, combined with its emphasis on supporting local languages, aims to cater specifically to the Indian market's requirements.

โ€œThis requires us to change the architecture of existing open models and to train them in custom ways to teach the new language. The advantage is that the resultant models are more efficient (in terms of tokens consumed) for understanding and generating Indian language than any of the existing LLMs,โ€ said Raghavan.
AMD, ์—”๋น„๋””์•„์˜ GPU โ€˜H100โ€™ ๋›ฐ์–ด๋„˜๋Š” โ€˜MI300โ€™ ์นฉ ๊ณต๊ฐœ
- ๋ฆฌ์‚ฌ ์ˆ˜ AMD CEO๋Š” โ€œAI ํ›ˆ๋ จ ๋Šฅ๋ ฅ์€ H100๊ณผ ๋™์ผํ•˜๋ฉฐ, ์ถ”๋ก ์—์„œ๋Š” ํ›จ์”ฌ ๋›ฐ์–ด๋‚˜๋‹คโ€๋ผ๊ณ  ๋ฐœ์–ธ
- ๋ฉ”ํƒ€, MS, OpenAI ๋“ฑ์ด ๊ตฌ๋งค ๊ณ„์•ฝ์„ ํ•˜๋Š” ๋“ฑ ์‹œ์žฅ ํ™•์žฅ ์ค‘์ด์ง€๋งŒ, ์—”๋น„๋””์•„์˜ ๋…์ ์„ ๋ง‰์„ ์ˆ˜ ์žˆ์„์ง€๋Š” ํšŒ์˜์ 

์ƒ์„ฑ AI ํŒŒ์šด๋ฐ์ด์…˜ ๋ชจ๋ธ ๊ตฌ์ถ•ํ•˜๋Š” ์Šคํƒ€ํŠธ์—… Liquid AI, ์•ฝ $37M ๊ทœ๋ชจ์˜ ํˆฌ์ž ์œ ์น˜
- ์ƒˆ๋กœ์šด ๋ชจ๋ธ ์•„ํ‚คํ…์ฒ˜์ธ ์•ก์ฒด ์‹ ๊ฒฝ๋ง(LNN, Liquid Neural Network)์„ ๊ธฐ๋ฐ˜์œผ๋กœ ๋ชจ๋ธ ๊ตฌ์ถ•
- ๋‰ด๋Ÿฐ์˜ ๋™์ž‘ ๋ฐฉ์ •์‹์„ ๊ณ ์ •ํ•˜์ง€ ์•Š๊ณ  ์•ก์ฒด์ฒ˜๋Ÿผ ๋ณ€๊ฒฝํ•  ์ˆ˜ ์žˆ์–ด, ๋ณ€๋™์„ฑ์ด ํฌ๊ฑฐ๋‚˜ ์—ฐ์†์ ์ธ ๋ฐ์ดํ„ฐ ์ฒ˜๋ฆฌ์— ํŠนํ™”
<์‹คํŒจ์—์„œ ๋ฐฐ์šธ ์ˆ˜ ์žˆ๋Š” ์‚ฌ๋žŒ์ด โ€˜์ฐโ€™์ž…๋‹ˆ๋‹ค!>
1. ์šฐ๋ฆฌ๋Š” ์‹ ๋ฌธ๊ณผ ์žก์ง€๋ฅผ ํ†ตํ•ด ๋น„์ฆˆ๋‹ˆ์Šค ์„ธ๊ณ„๋ฅผ ๋ฐฐ์šด๋‹ค. ๋˜, ๋Œ€๋ถ€๋ถ„์˜ ์‚ฌ๋žŒ๋“ค์€ ์†๋‹˜์ด ๋งŽ์€ ์‹๋‹น์„ ์ฐพ์•„๊ฐ€๊ณ , ๊ฐ€์žฅ ์ธ๊ธฐ ์žˆ๋Š” ์˜ํ™”๋ฅผ ๋ณด๋ ค๊ณ  ํ•œ๋‹ค.
2. ์ด๋ ‡๋“ฏ, ์šฐ๋ฆฌ๋Š” ์„ฑ๊ณตํ•œ ์‚ฌ๋ก€๋ฅผ ์ฃผ๋กœ ๊ฒฝํ—˜ํ•˜๊ฒŒ ๋˜๋Š”๋ฐ, ๋ฌธ์ œ๋Š” ์—ฌ๊ธฐ์„œ ์•ผ๊ธฐ๋œ๋‹ค. ์‹ ๋ฌธ๊ณผ ์žก์ง€๋Š” 10์–ต ๋‹ฌ๋Ÿฌ์— ๋งค๊ฐ๋œ ์Šคํƒ€ํŠธ์—…์—๋Š” ๋งŽ์€ ์ง€๋ฉด์„ ํ• ์• ํ•˜์ง€๋งŒ, ๋น„์Šทํ•˜๊ฒŒ ์‹œ์ž‘ํ–ˆ์œผ๋‚˜ ํŒŒ์‚ฐ์˜ ๋‚˜๋ฝ์œผ๋กœ ๋–จ์–ด์ง„ ์ˆ˜๋ฐฑ ๊ฐœ์˜ ์Šคํƒ€ํŠธ์—…์— ๋Œ€ํ•ด์„  ๊ฑฐ์˜ ๋‹ค๋ฃจ์ง€ ์•Š๋Š”๋‹ค.
3. ๋˜ ๋ถ๋น„๋Š” ํ”ผ์ž์ง‘์„ ์ฐพ์•„๊ฐ€๋Š” ๊ธธ์— ํ…… ๋นˆ ์‹๋‹น ์˜†์„ ์ง€๋‚˜๊ฐ€๋ฉด์„œ๋„ ๊ทธ ์‹๋‹น์—๋Š” ๋ˆˆ๊ธธ์กฐ์ฐจ ์ฃผ์ง€ ์•Š๋Š”๋‹ค.
4. ์ด๋ ‡๋“ฏ, ์šฐ๋ฆฌ๋Š” ์„ฑ๊ณตํ•œ ๊ฒƒ์— ์ฃผ๋ชฉํ•˜๋„๋ก ๊ธธ๋“ค์–ด์ง„ ํƒ“์—, ์‹คํŒจํ•œ ๊ฒƒ์€ ๋ฌด์‹œํ•ด๋ฒ„๋ฆฐ๋‹ค. ์„ฑ๊ณตํ•œ ๊ฒƒ์„ ์ฃผ๋กœ ์ ‘์ด‰ํ•˜๊ณ , ๊ทธ๋Ÿฐ ์™œ๊ณก๋œ ๊ฒฝํ—˜์œผ๋กœ๋Š” ์„ฑ๊ณต์— ํŽธํ–ฅ๋œ ์ถ”์ •์„ ํ•  ์ˆ˜๋ฐ–์— ์—†๋‹ค. ๊ทธ๋ ‡๊ฒŒ ์šฐ๋ฆฌ์˜ ์˜ˆ์ธก์€ ํ‹€๋ฆฌ๋Š” ๊ฒฝ์šฐ๊ฐ€ ๋งŽ๋‹ค.
5. ๋ฐ˜๋ฉด์—, (์ง„์งœ) ์„ฑ๊ณตํ•œ ์‚ฌ๋žŒ๋“ค์€ ์‹คํŒจ์— ๋Œ€ํ•œ ์ •๋ณด๋ฅผ ๊ตฌํ•˜๋Š” ๋ฐ ๋งŽ์€ ์‹œ๊ฐ„์„ ํ• ์• ํ•œ๋‹ค. ๊ทธ๋“ค์€ ์‹ ๋ฌธ์˜ ๋น„์ฆˆ๋‹ˆ์Šค๋ž€์—์„œ ํŒŒ์‚ฐํ•œ ๊ธฐ์—…์— ๋Œ€ํ•œ ๊ธฐ์‚ฌ๋ฅผ ์ฐพ์•„ ์ฝ๋Š”๋‹ค. ๋˜, ์Šน์ง„์— ์‹คํŒจํ•œ ๋™๋ฃŒ๋“ค๊ณผ ์ ์‹ฌ ์‹์‚ฌ๋ฅผ ํ•จ๊ป˜ ํ•˜๋ฉฐ, ๊ทธ๋“ค์—๊ฒŒ ๋ฌด์—‡์ด ์ž˜๋ชป๋˜์—ˆ๋‹ค๊ณ  ์ƒ๊ฐํ•˜๋Š”์ง€ ๋ฌป๋Š”๋‹ค.
6. ๋˜ํ•œ, ์—ฐ๋ก€ ํ‰๊ฐ€์—์„œ ์นญ์ฐฌ๋งŒ์ด ์•„๋‹ˆ๋ผ, ๋น„ํŒ๊นŒ์ง€ ์š”๊ตฌํ•œ๋‹ค. ๊ทธ๋ฆฌ๊ณ  ์ž์‹ ์˜ ์†Œ๋ง๋งŒํผ ๋งŽ์€ ๋ˆ์„ ์ €์ถ•ํ•˜์ง€ ๋ชปํ•œ ์ด์œ ๋ฅผ ์•Œ์•„๋‚ด๊ธฐ ์œ„ํ•ด ์‹ ์šฉ์นด๋“œ ๋ช…์„ธ์„œ๋ฅผ ๋ฉด๋ฐ€ํžˆ ์‚ดํ•€๋‹ค.
7. ํ‡ด๊ทผ๊ธธ์—๋Š” ์—…๋ฌด ์ค‘ ์ €์ง€๋ฅธ ์‚ฌ์†Œํ•œ ์‹ค์ˆ˜๋“ค์„ ์žŠ์œผ๋ ค ์• ์“ฐ์ง€ ์•Š๊ณ  ์˜คํžˆ๋ ค ๊ทธ๋Ÿฐ ์‹ค์ˆ˜๋“ค์„ ํ•˜๋‚˜์”ฉ ์งš์–ด ๊ฐ€๋ฉฐ ๋ฐ˜์„ฑ์˜ ์‹œ๊ฐ„์„ ๊ฐ–๋Š”๋‹ค. ๋˜, ์–ด๋–ค ๊ฒฐ์ •์ด ๊ดœ์ฐฎ์€ ํšจ๊ณผ๋ฅผ ๊ฑฐ๋‘์ง€ ๋ชปํ–ˆ๋‹ค๋ฉด, ๊ทธ ์ด์œ ๊ฐ€ ๋ฌด์—‡์ธ์ง€ ํƒ๊ตฌํ•˜๊ณ , ํšŒ์˜ ์‹œ๊ฐ„์— ๋” ๊ฐ„๊ฒฐํ•˜๊ฒŒ ๋ฐœ์–ธํ•  ์ˆ˜๋Š” ์—†์—ˆ์„๊นŒ๋ฅผ ๋ฐ˜์„ฑํ•˜๋Š” ์‹œ๊ฐ„๋„ ๊ฐ€์ง„๋‹ค.
8. ์šฐ๋ฆฌ๋Š” ๋‚™๊ด€์ ์œผ๋กœ ์ƒ๊ฐํ•˜๋ฉฐ ์ž์‹ ์˜ ์ž˜๋ชป์ด๋‚˜ ํƒ€์ธ์˜ ์ž‘์€ ์‹ค์ˆ˜๋ฅผ ์žŠ์œผ๋ ค๋Š” ์„ฑํ–ฅ์„ ๋ค๋‹ค. ํ•˜์ง€๋งŒ ์˜ฌ๋ฐ”๋ฅธ ์˜ˆ์ธก์„ ํ•ด๋‚ด๋ ค๋ฉด ํ˜„์‹ค์— ๊ธฐ๋ฐ˜์„ ๋‘๊ณ  ์ถ”์ •ํ•ด์•ผ ํ•˜๋ฉฐ, ์ถ”์ •์€ ๊ฒฝํ—˜์— ํฌ๊ฒŒ ์˜ํ–ฅ์„ ๋ฐ›๋Š”๋‹ค.
9. (์ฆ‰) ์ข‹์€ ๊ฒƒ์—๋งŒ ๊ด€์‹ฌ์„ ๊ธฐ์šธ์ธ๋‹ค๋ฉด, ์ž์‹ ์˜ ์˜ˆ์ธก๋ ฅ์„ ์Šค์Šค๋กœ ๋–จ์–ด๋œจ๋ฆฌ๋Š” ์…ˆ์ด๋‹ค.
10. โ€˜์ ์ •ํ•œ ํŒ๋‹จ๋ ฅ ํ–ฅ์ƒ์„ ์œ„ํ•œ ํ”„๋กœ์ ํŠธโ€™์— ์ฐธ์—ฌํ•œ ๋ฒ„ํด๋ฆฌ ๋Œ€ํ•™๊ต์˜ ๋ˆ ๋ฌด์–ด ๊ต์ˆ˜๋Š” ๋‹ค์Œ๊ณผ ๊ฐ™์€ ์˜์™ธ์˜ ๋ง์„ ํ•œ ์ ์ด ์žˆ๋‹ค. โ€œ์ตœ๊ณ ์˜ ๊ธฐ์—…๊ฐ€๋Š” ์„ฑ๊ณตํ•œ ์‚ฌ๋žŒ๋“ค์˜ ์ด์•ผ๊ธฐ๋งŒ ๋“ค์„ ๋•Œ ๋ฐœ์ƒํ•˜๋Š” ์œ„ํ—˜์„ ์ž˜ ์•Œ๊ณ  ์žˆ์Šต๋‹ˆ๋‹ค. ๊ทธ๋ž˜์„œ ์ตœ๊ณ ์˜ ๊ธฐ์—…๊ฐ€๋“ค์€ ๋Œ€๋ถ€๋ถ„์˜ ์‚ฌ๋žŒ๋“ค์ด ์˜์‹์ ์œผ๋กœ ํ”ผํ•˜๋ ค๊ณ  ํ•˜๋Š” ์‚ฌ๋žŒ๋“ค, ๊ทธ๋Ÿฌ๋‹ˆ๊นŒ ์‹คํŒจ๋ฅผ ๊ฒฝํ—˜ํ–ˆ๊ฑฐ๋‚˜ ๊ทธ ๊ณผ์ •์—์„œ ๋ถˆ๋งŒ์„ ๋Š˜์–ด๋†“๋Š” ์‚ฌ๋žŒ๋“ค๊ณผ ์–ด์šธ๋ฆฌ๋ฉฐ ์‹œ๊ฐ„์„ ๋ณด๋‚ด๋Š” ํŽธ์ž…๋‹ˆ๋‹คโ€
11. (๋‹ค์‹œ ๋งํ•ด) ๋” ๋‚˜์€ ๊ฒฐ์ •์„ ๋‚ด๋ฆฌ๋Š” ๋ฐฉ๋ฒ•์„ ํ„ฐ๋“ํ•˜๋Š” ์ค‘์š”ํ•œ ๋น„๊ฒฐ ์ค‘ ํ•˜๋‚˜๋Š” ๊ฒฝํ—˜์„ ๊ณจ๊ณ ๋ฃจ ์Œ“๋Š” ๊ฒƒ์ด๋‹ค. (ํŠนํžˆ) ์˜ฌ๋ฐ”๋ฅธ ์„ ํƒ์„ ํ•˜๋ ค๋ฉด ๋ฏธ๋ž˜๋ฅผ ์ž˜ ์˜ˆ์ธกํ•ด์•ผ ํ•˜๋ฉฐ, ๋ฏธ๋ž˜๋ฅผ ์ž˜ ์˜ˆ์ธกํ•˜๋ ค๋ฉด ์„ฑ๊ณตํ•œ ์‚ฌ๋ก€๋งŒ์ด ์•„๋‹ˆ๋ผ, ์‹คํŒจํ•œ ์‚ฌ๋ก€๊นŒ์ง€ ์ตœ๋Œ€ํ•œ ๋งŽ์ด ๋“ฃ๊ณ  ๋ด์•ผ ํ•œ๋‹ค.
12. ํ•ญ์ƒ ๋งŒ์›์ธ ์˜ํ™”๊ด€๋งŒ ๋‹ค๋‹ˆ๋Š” ๊ฒƒ์ด ์•„๋‹ˆ๋ผ, ํ…… ๋นˆ ์˜ํ™”๊ด€์—๋„ ์•‰์•„ ๋ณด์•„์•ผ ์–ด๋–ค ์˜ํ™”๊ฐ€ ์ˆ˜์ต์„ ์˜ฌ๋ฆฌ๋Š”์ง€ ๋” ์ž˜ ์•Œ ์ˆ˜ ์žˆ๊ณ , ํ›Œ๋ฅญํ•œ ๋น„์ฆˆ๋‹ˆ์Šค ๊ฐ๊ฐ์„ ํ‚ค์šฐ๋ ค๋ฉด ์„ฑ๊ณตํ•œ ๋™๋ฃŒํ•˜๊ณ ๋งŒ ์ง€๋‚ด์ง€ ๋ง๊ณ , ์‹คํŒจ๋ฅผ ๊ฒฝํ—˜ํ•œ ๋™๋ฃŒ๋“ค๊ณผ๋„ ํ•จ๊ป˜ ์‹œ๊ฐ„์„ ๋ณด๋‚ด์•ผ ํ•œ๋‹ค.
13. ๋ฌผ๋ก  ์„ฑ๊ณต์— ๋ˆˆ๊ธธ์ด ๋” ์‰ฝ๊ฒŒ ๊ฐ€๊ธฐ ๋งˆ๋ จ์ด๊ธฐ ๋•Œ๋ฌธ์— ์ด๋Š” ์–ด๋ ค์šด ์ฃผ๋ฌธ์ด๊ธฐ๋„ ํ•˜๋‹ค. ์‹คํŒจํ•œ ์นœ๊ตฌ์—๊ฒŒ ์‹ค๋ก€๋˜๋Š” ์งˆ๋ฌธ์„ ํ”ผํ•˜๊ณ  ์‹ถ์€ ๊ฒŒ ์ธ์ง€์ƒ์ •์ด๊ธฐ๋„ ํ•˜๊ณ .
14. ํ•˜์ง€๋งŒ ๊ธฐ์ค€์œจ์„ ์ ์ ˆํ•˜๊ฒŒ ์ธก์ •ํ•˜๋ ค๋ฉด ์„ฑ๊ณตํ•œ ์‚ฌ๋žŒ๋งŒ์ด ์•„๋‹ˆ๋ผ, ์‹คํŒจํ•œ ์‚ฌ๋žŒ์—๊ฒŒ๋„ ๋ฐฐ์›Œ์•ผ ํ•œ๋‹ค.
15. ๊ทธ๋ฆฌ๊ณ  ์ด๋Ÿฌํ•œ ๋…ธ๋ ฅ์„ ํ†ตํ•ด ์–ป์€ ํ†ต์ฐฐ๋ ฅ๊ณผ ์ง€ํ˜œ๋ฅผ ์ตœ๋Œ€ํ•œ ํ™œ์šฉํ•ด์•ผ ์กฐ๊ธˆ์ด๋ผ๋„ ๋ฏธ๋ž˜๋ฅผ ๋” ์ž˜ ์˜ˆ์ธกํ•  ์ˆ˜ ์žˆ๊ณ , ์ƒ๋Œ€์ ์œผ๋กœ ๋” ๋งŽ์€ ๊ฐ€๋Šฅ์„ฑ์„ ์ƒ๊ฐํ•  ์ˆ˜ ์žˆ๊ฒŒ ๋œ๋‹ค.
- ์ฐฐ์Šค ๋‘ํžˆ๊ทธ, <1๋“ฑ์˜ ์Šต๊ด€> ์ค‘
๐Ÿ‘3
https://youtu.be/4ef0juAMqoE
[์—์–ด๋น„์•ค๋น„์˜ ๋ณ€ํ™”]
1) ์˜์‚ฌ ๊ฒฐ์ • ๊ณผ์ •์„ ์œ„์ž„ํ–ˆ๋”๋‹ˆ, ์†๋„๊ฐ€ ๋” ๋А๋ ค์กŒ์Šต๋‹ˆ๋‹ค.
2) ์ˆ˜์ฒœ๊ฐœ์˜ ABํ…Œ์ŠคํŠธ ๋Œ€์‹ , 1๋…„์— ๋‘๋ฒˆ ํฌ๊ฒŒ ๋Ÿฐ์นญํ•˜๋Š” ๊ฒƒ์— ์ง‘์ค‘ํ•ฉ๋‹ˆ๋‹ค.

Lenny์™€ ์ง„ํ–‰ํ•œ ์—์–ด๋น„์•ค๋น„ ์ฐฝ์—…์ž์˜ ์ธํ„ฐ๋ทฐ์—์„œ ์ข‹์€ ๋‚ด์šฉ์ด ๋งŽ์•„ ์š”์•ฝํ•ด ๋ณด์•˜์Šต๋‹ˆ๋‹ค.

[์—์–ด๋น„์•ค๋น„๊ฐ€ ๊ฒช์—ˆ๋˜ ๋ฌธ์ œ]
- ์กฐ์ง์ด ์ปค์ง€๋ฉด์„œ, ํŒ€๋“ค์€ ๊ฐ ํŒ€์—๊ฒŒ ์˜ค๋„ˆ์‹ญ์„ ๋‹ฌ๋ผ๊ณ  ์š”์ฒญํ–ˆ๋‹ค.
- ๊ฐ ํŒ€์—๊ฒŒ ์˜์‚ฌ๊ฒฐ์ •๊ถŒ์„ ์œ„์ž„ํ–ˆ๋”๋‹ˆ, ์˜คํžˆ๋ ค ํ”„๋กœ๋•ํŠธ ๋Ÿฐ์นญ ์†๋„๊ฐ€ ๋А๋ ค์กŒ๋‹ค. ๋ฐฉํ–ฅ์„ฑ์„ ๊ฒฐ์ •ํ•˜๋Š” ๋ฐ ํ˜ผ๋ž€์ด ์žˆ์—ˆ๊ธฐ ๋•Œ๋ฌธ์ด๋‹ค.
- ์ด ์˜ํ–ฅ์œผ๋กœ 2015๋…„๋ถ€ํ„ฐ 2020๋…„๊นŒ์ง€ ์—์–ด๋น„์•ค๋น„ ํ”„๋กœ๋•ํŠธ๊ฐ€ ํฌ๊ฒŒ ๋ฐ”๋€Œ์ง€ ๋ชปํ–ˆ๋‹ค.

[๋ฌธ์ œ๋ฅผ ๊ฒช์œผ๋ฉด์„œ ๊ฐ–๊ฒŒ๋œ ์ƒ๊ฐ]
- ํŒ€์›๋“ค์€ ๋ช…ํ™•ํ•œ ๋ฐฉํ–ฅ์„ฑ์„ ์›ํ•œ๋‹ค. CEO๊ฐ€ ๋ฆฌ๋”์‹ญ์„ ๊ฐ€์ง€๊ณ  ํ”„๋กœ๋•ํŠธ๋ฅผ ์ด๋Œ์–ด์•ผ ํ•œ๋‹ค. ๋ชจ๋‘๊ฐ€ ๊ฐ™์€ ๋ฐฉํ–ฅ์„ ๋ฐ”๋ผ๋ณผ ์ˆ˜ ์žˆ๋„๋ก ๋งŒ๋“œ๋Š” ๊ฒƒ์ด ๊ฐ€์žฅ ์ค‘์š”ํ•˜๋‹ค.
- ๋งค๋‹ˆ์ €์˜ ๊ฐ€์žฅ ์ค‘์š”ํ•œ ์—ญํ• ์€ ์ž์‹ ์˜ ๋ถ„์•ผ์—์„œ ์ค‘์š”ํ•œ ๊ฒฐ์ •์„ ๋‚ด๋ฆฌ๋Š” ๊ฒƒ์ด๋‹ค. ์‚ฌ๋žŒ๋“ค์„ ๊ด€๋ฆฌํ•˜๋Š” ๊ฑด ๋‘๋ฒˆ์งธ๋กœ ์ค‘์š”ํ•œ ์—ญํ• ์ด๋‹ค. ์‹ค๋ฌด๋ฅผ ๋ชปํ•˜๋ฉด์„œ ์กฐ์ง ๊ตฌ์„ฑ์›์„ ์ œ๋Œ€๋กœ ๊ด€๋ฆฌํ•˜๋Š” ๊ฑด ๋ถˆ๊ฐ€๋Šฅํ•˜๋‹ค.
- ๋ฐ์ดํ„ฐ๋ฅผ ๋ณด๋Š” ๊ฒƒ์ด ์ค‘์š”ํ•˜์ง€๋งŒ, ์ดํ•ดํ•˜์ง€ ๋ชปํ•˜๋Š” ๋ฐ์ดํ„ฐ๋Š” ์‹œ์ฒด๋‹ค. ๊ฒฐ๊ณผ๋ฅผ ์ดํ•ดํ•˜์ง€ ๋ชปํ•˜๋Š” AB ํ…Œ์ŠคํŠธ๋Š” ์ž˜๋ชป๋œ ๋ฐฉํ–ฅ์œผ๋กœ ์ด๋Œ ์ˆ˜ ์žˆ๋‹ค.
- ์—”์ง€๋‹ˆ์–ด์™€ ๋งˆ์ผ€ํ„ฐ์˜ ๊ด€๊ณ„๊ฐ€ ๊ฐ€๊นŒ์›Œ์•ผ ํ•œ๋‹ค. ์—”์ง€๋‹ˆ์–ด๋Š” ์‰ํ”„์ด๊ณ , ๋งˆ์ผ€ํ„ฐ๋Š” ์›จ์ดํ„ฐ๋‹ค. ๋‘˜์ด ์นœํ•˜์ง€ ์•Š์œผ๋ฉด ์†๋‹˜์—๊ฒŒ ์ตœ๊ณ ์˜ ๊ฒฝํ—˜์„ ์ค„ ์ˆ˜ ์—†๋‹ค.

[๋ณ€ํ™”]
- ๊ด€๋ฆฌ์ž ์ง๊ธ‰์„ ํฌ๊ฒŒ ์ค„์˜€๋‹ค (ํ˜„์žฌ ์ง์›์€ 7,000๋ช…, ๊ธฐ์—…๊ฐ€์น˜๋Š” 100์กฐ)
- ์ง„ํ–‰ํ•˜๋Š” ํ”„๋กœ์ ํŠธ๋ฅผ ํฌ๊ฒŒ ์ค„์˜€๋‹ค.
- ์ˆ˜์ฒœ๊ฐœ์˜ AB ํ…Œ์ŠคํŠธ ๋Œ€์‹ , 1๋…„์— ๋‘๋ฒˆ ํฌ๊ฒŒ ๋Ÿฐ์นญ ํ•˜๋Š” ๊ฒƒ์— ์ง‘์ค‘ํ•œ๋‹ค.
- CEO๋Š” ๋Ÿฐ์นญ์˜ ๋ชจ๋“  ๋””ํ…Œ์ผ์— ์ฐธ์—ฌํ•œ๋‹ค.

From ์ „๊ฒฝ์„๋‹˜ Unsexy Business
โค2
์ง€๋‚œ์ฃผ ๋‰ด์š•์—์„œ ๋ฐ์ด๋น„๋“œ ๋ฃจ๋ฒค์Šคํƒ€์ธ ์นผ๋ผ์ผ ์ฐฝ์—…์ž๋ฅผ ๋งŒ๋‚ฌ๋‹ค. ๋ฐ์ด๋น„๋“œ๋Š” ๋งŽ์€ ์ฐฝ์—…๊ฐ€์™€ ํˆฌ์ž์ž์™€์˜ ๋Œ€๋‹ด ์‹œ๋ฆฌ์ฆˆ๋กœ๋„ ์ž˜ ์•Œ๋ ค์ ธ ์žˆ๋Š”๋ฐ, ์„ธ๊ณ„๊ฒฝ์ œํฌ๋Ÿผ์—์„œ ์ข‹์€ ๊ธฐํšŒ๋ฅผ ๋งŒ๋“ค์–ด์ฃผ์–ด์„œ ์ง์ ‘ ๋ต ์ˆ˜ ์žˆ์—ˆ๋‹ค. ์™ธ๋ถ€์— ๊ณต์œ ํ•˜๊ธฐ ์–ด๋ ค์šด ๋‚ด์šฉ์€ ์ œ์™ธํ•˜๊ณ , ์„ฑ๊ณต์ ์ธ ์ฐฝ์—…๊ฐ€์— ๋Œ€ํ•œ ๊ด€์ ๊ณผ ์นผ๋ผ์ผ ์ฐฝ์—… ์ดˆ๊ธฐ์— ๋Œ€ํ•œ ์ด์•ผ๊ธฐ๋งŒ ๋”ฐ๋กœ ๋ฉ”๋ชจํ•ด ๋‘”๋‹ค.
"์„ฑ๊ณต์ ์ธ ์ฐฝ์—…๊ฐ€๋Š” ๊ฒฐ๊ตญ ์ถ”์ง„๋ ฅ ์žˆ๋Š” CEO์ด๋ฉด์„œ๋„, ๋ฌด์—‡์„ ๋ชจ๋ฅด๋Š”์ง€ ์ž˜ ๋ชจ๋ฅด๋Š” ์‚ฌ๋žŒ์ด๋ผ๋Š” ๊ณตํ†ต์ ์ด ์žˆ๋‹ค๊ณ  ์ƒ๊ฐํ•ฉ๋‹ˆ๋‹ค. ์œ„๋Œ€ํ•œ ๊ธฐ์—…์„ ๋งŒ๋“œ๋Š” ์ฐฝ์—…์ž ์ค‘์— 20๋Œ€ ํ˜น์€ 30๋Œ€๊ฐ€ ๋งŽ์€ ์ด์œ ๋Š”, ๊ทธ๋“ค์ด ๋ญ˜ ๋ชจ๋ฅด๊ธฐ ๋•Œ๋ฌธ์— ํ•  ์ˆ˜ ์—†๋‹ค๊ณ  ์ƒ๊ฐํ•˜์ง€ ์•Š๊ธฐ ๋•Œ๋ฌธ์ž…๋‹ˆ๋‹ค."
"์‚ฌ๋ชจํŽ€๋“œ ๋Œ€๋ถ€๋ถ„์ด ๋‰ด์š•์—์„œ ์‹œ์ž‘ํ•œ ๊ฒƒ๊ณผ ๋‹ค๋ฅด๊ฒŒ ์นผ๋ผ์ผ์„ ์›Œ์‹ฑํ„ด DC์—์„œ ์ฐฝ์—…ํ•œ ์ด์œ ๋Š”, ํˆฌ์ž์€ํ–‰ ๊ฒฝ๋ ฅ์ด ์—†์—ˆ๊ธฐ ๋•Œ๋ฌธ์— ๋‰ด์š•์—์„œ ์ฐฝ์—…์„ ์‹œ๋„ํ•  ์‹ ์šฉ๋„ ์—†์—ˆ์ง€๋งŒ, โ€œ๋„์‹œ์—์„œ ์ซ“๊ฒจ๋‚  ๋•Œ, ์•ž์žฅ์„œ์„œ ํ–‰์ง„ํ•˜๋Š” ๊ฒƒ์ฒ˜๋Ÿผ ํ–‰๋™โ€ํ•˜๊ธฐ ์œ„ํ•ด์„œ์˜€์Šต๋‹ˆ๋‹ค. ์ œ๊ฐ€ ์ฒ˜ํ•œ ์ƒํ™ฉ(์ง€๋ฏธ ์นดํ„ฐ ๋Œ€ํ†ต๋ น์ด ์—ฐ์ž„์— ์‹คํŒจํ•˜๋ฉด์„œ, ๋ฐฑ์•…๊ด€์— ๋ชธ๋‹ด๊ณ  ์žˆ๋˜ ๊ฒฝ๋ ฅ์ด ๋๋‚œ ๊ฒƒ)์„ ์ด์šฉํ–ˆ๋˜ ๊ฑฐ์ฃ . ๋‰ด์š•์— ์žˆ๋Š” ์‚ฌ๋ชจํŽ€๋“œ์™€ ๋‹ค๋ฅด๊ฒŒ ์—ฐ๋ฐฉ์ •๋ถ€๊ฐ€ ์–ด๋–ป๊ฒŒ ๋Œ์•„๊ฐ€๋Š”์ง€ ์ž˜ ์•Œ๊ณ  ์žˆ๋Š” ๊ฒƒ์ด ๋‚ด ์ž์‚ฐ์ด์—ˆ์Šต๋‹ˆ๋‹ค."
"๋‚˜์•„๊ฐ€ ํšŒ์‚ฌ๋ฅผ ์„ฑ์žฅ์‹œํ‚ค๋Š” ๊ณผ์ •์—์„œ ๊ฐ€์žฅ ์ค‘์š”ํ•œ ์„ธ ๊ฐ€์ง€ ์ž์งˆ์„ ๊ผฝ์ž๋ฉด 1) ์–ด๋””๋กœ ๊ฐ€๋Š”์ง€ ๋ช…ํ™•ํžˆ ์•„๋Š” ๊ฒƒ, 2) ๊ทธ๊ฒƒ์„ ์‚ฌ๋žŒ๋“ค์—๊ฒŒ ํšจ๊ณผ์ ์œผ๋กœ ์„ค๋“ํ•  ์ˆ˜ ์žˆ๋Š” ๊ฒƒ, ๊ทธ๋ฆฌ๊ณ  ๊ฐ€์žฅ ์ค‘์š”ํ•œ ๊ฒƒ์œผ๋กœ 3) ์†”์„ ์ˆ˜๋ฒ”์ž…๋‹ˆ๋‹ค."