Continuous Learning_Startup & Investment
We journey together through the captivating realms of entrepreneurship, investment, life, and technology. This is my chronicle of exploration, where I capture and share the lessons that shape our world. Join us and let's never stop learning!
Continuous Learning_Startup & Investment
State of GPT talk by Andrej Karpathy: https://www.youtube.com/watch?v=bZQun8Y4L2A&t=373s Would highly recommend watching the above! A 45-minute lecture going over the state of generative LLMs: how they are trained, what they can and can't do, advanced techniques…
Here's an http://assembly.ai transcript and chapter summaries:
👂🏼 🤖 📃
https://www.assemblyai.com/playground/transcript/64kyzev80o-6ed4-4902-a066-7df25c363193

Andrej Karpathy is a founding member of OpenAI. He will talk about how we train GPT assistants. In the second part he will take a look at how we can use these assistants effectively in your applications.

TRAINING NEURAL NETWORKS ON THE INTERNET

We have four major stages: pretraining, supervised fine-tuning, reward modeling, reinforcement learning. In each stage we have a data set that powers that stage, and then we have an algorithm that, for our purposes, will be an objective for training a neural network.

GPT 3.1: BASE MODELS AND AGENTS

The GPT-4 model that you might be interacting with over the API is not a base model, it's an assistant model. You can even trick base models into being assistants. Instead we have a different path to make actual GPT assistants, not just base-model document completers.

NEUROANATOMY 2.8

In the reward modeling step, what we're going to do is shift our data collection to be of the form of comparisons. Now, because we have a reward model, we can score the quality of any arbitrary completion for any given prompt. And then at the end, you could deploy an RLHF model.

COGNITIVE PROCESSES AND GPT

How do we best apply a GPT assistant model to your problems? Think about the rich internal monologue and tool use, and how much computational work actually goes on in your brain to generate this one final sentence. From GPT's perspective, it is just a sequence of tokens.

TREE OF THOUGHT AND PROMPT ENGINEERING

A lot of people are really playing around with kind of prompt engineering to bring back some of these abilities that we sort of have in our brain for LLMs. I think this is kind of an equivalent of AlphaGo but for text. I would not advise people to use it in practical applications.

WHAT ARE THE QUIRKS OF LLMS?

The next thing that I find kind of interesting is that LLMs don't want to succeed, they want to imitate. And so at test time, you actually have to ask for a good performance. Next up, I think a lot of people are really interested in basically retrieval augmented generation.
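
To make the retrieval-augmented generation idea concrete, here is a minimal sketch (not from the talk): embed your documents, retrieve the closest chunks for a question, and stuff them into the prompt. `embed()` and `complete()` are hypothetical stand-ins for whatever embedding model and LLM endpoint you actually use.

```python
# Minimal retrieval-augmented generation loop (illustrative only).
import numpy as np

def embed(text: str) -> np.ndarray:
    """Placeholder: return a unit-length embedding vector for `text`."""
    rng = np.random.default_rng(abs(hash(text)) % (2 ** 32))  # deterministic stub
    v = rng.normal(size=384)
    return v / np.linalg.norm(v)

def complete(prompt: str) -> str:
    """Placeholder: call your LLM of choice with `prompt`."""
    return "<model completion>"

documents = [
    "Reward modeling collects human comparisons between completions.",
    "Base models complete documents; assistant models answer questions.",
]
doc_vectors = np.stack([embed(d) for d in documents])

def answer(question: str, k: int = 1) -> str:
    q = embed(question)
    scores = doc_vectors @ q                 # cosine similarity (unit vectors)
    top = np.argsort(scores)[::-1][:k]       # indices of the k best chunks
    context = "\n".join(documents[i] for i in top)
    prompt = f"Use the context to answer.\nContext:\n{context}\n\nQuestion: {question}\nAnswer:"
    return complete(prompt)

print(answer("What is a base model?"))
```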

CONSTRAINT PROMPTING IN LLMS

Next, I wanted to briefly talk about constraint prompting. This is basically techniques for forcing a certain template in the outputs of LLMs. And I think this kind of constraint sampling is also extremely interesting.
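
A minimal sketch of the constraint-prompting idea, assuming a hypothetical `call_llm()` function: ask the model to fill a fixed JSON template and reject completions that don't parse. Tools like Microsoft's guidance library push the same idea further by constraining tokens during decoding.

```python
# Force a JSON template on the output and retry until it validates.
import json

TEMPLATE = '{"name": "<string>", "founded": <integer>}'

def extract_company(text: str, call_llm, max_retries: int = 3) -> dict:
    prompt = (
        "Extract the company from the text below.\n"
        f"Respond with JSON exactly matching this template: {TEMPLATE}\n\n"
        f"Text: {text}\nJSON:"
    )
    for _ in range(max_retries):
        raw = call_llm(prompt)
        try:
            data = json.loads(raw)
            if isinstance(data, dict) and {"name", "founded"} <= data.keys():
                return data                       # output matched the template
        except json.JSONDecodeError:
            pass                                   # malformed -> ask again
    raise ValueError("model never produced valid JSON")
```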

FINE-TUNING A LANGUAGE MODEL

You can get really far with prompt engineering, but it's also possible to think about fine-tuning your models. Fine-tuning is a lot more technically involved. It requires human data contractors for data sets and/or synthetic data pipelines. Break up your task into two major parts.
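
Purely as an illustration of the data side, a supervised fine-tuning set is often just prompt/response pairs written to JSONL; the exact schema below is an assumption and depends on your training framework or fine-tuning API.

```python
# Sketch of a tiny SFT dataset in JSONL form (schema is illustrative).
import json

examples = [
    {"prompt": "Summarize: The meeting moved to Tuesday.",
     "response": "The meeting was rescheduled to Tuesday."},
    {"prompt": "Translate to French: Good morning.",
     "response": "Bonjour."},
]

with open("sft_dataset.jsonl", "w", encoding="utf-8") as f:
    for ex in examples:
        f.write(json.dumps(ex, ensure_ascii=False) + "\n")
```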

LIMITS TO FULLY AUTONOMOUS LLMS

There is a large number of limitations to LLMs today, so I would definitely keep them in mind for all your applications. My recommendation right now is to use LLMs in low-stakes applications and always combine them with human oversight. Think copilots instead of completely autonomous agents.
🧑🏼‍✈️ 🚧💻
In this post, I try to answer specific questions about the internals of Copilot, while also describing some interesting observations I made as I combed through the code. I will provide pointers to the relevant code for almost everything I talk about, so that interested folks can take a look at the code themselves.

https://thakkarparth007.github.io/copilot-explorer/posts/copilot-internals
<Reduce stimulation, increase thinking>

In some ways, I think modern people these days work in an almost ADHD-like state, because it is so easy to keep exposing yourself to constant, high-intensity stimulation. In that kind of environment it is hard to focus calmly on any one thing and think deeply.

Let me first introduce two cases.

Case 1)

A Mr. K I know worked at a large company, and he said hundreds of work requests came in from across the company every day. So he was leaving the office at 11 p.m. every night.

Then, after hearing me talk about agile, he decided to run an experiment: leaving on time. He proposed it to his team lead: from today I will leave at 18:00 sharp; if you ever feel my output is slipping even slightly, say so and I'll revert immediately. From that day on he left at 18:00. Once home, he played with his six-year-old for two hours, from 7 to 9 p.m. Until then, Dad had effectively not existed for that child: on weekdays he came home at 11 p.m., left in the morning before the child was up, and spent weekends collapsed in bed. Now that child had a "Dad."

But there was one problem: all the work he couldn't finish because he left on time. For security reasons he couldn't access company computers or files from home. His workaround was to sit at his desk from 11 p.m. to 1 a.m. with scrap paper spread out, working out a strategy for how to handle today's work and tomorrow's work more wisely. He did that every day.

When he came in the next morning, more than half of the requests had often resolved themselves (the requesting department, tired of waiting, had handled it on its own), and the remaining half he could finish quickly using the smarter approach he had worked out the night before.

Of course, he says he ended up sleeping less than when he left at 11 p.m., because back then he would collapse into bed the moment he got home. But the energy he feels in his body, he says, is far better.

Case 2)

Back in my army days I was assigned to my unit and given a senior soldier to train me, but I could hardly ever catch sight of him. A few days in, I learned his discharge date was one week away. His post was administrative clerk for the battalion maintenance section, a job so extensive and complicated that it normally takes about a year of handover before you can do it properly. But he was leaving in a week, and even that week was slipping by haphazardly; occasionally he came down to the maintenance section, told me to ask if I had questions, and lay around. The real problem was that nobody, officer or enlisted, understood his duties precisely.

In the end my senior was discharged with me having learned almost nothing, and there was not a single work manual. There was nothing at all to refer to.

After agonizing over it, the choice I made was to think and act from first principles. Whenever a problem came up, I reasoned out a logically sensible course of action from the basic principles I believed in (for example, what action actually benefits the Army?). You could say I did the job while designing all the rules and regulations myself. That way nothing got in my way. Whatever it was, if I thought it through deeply and acted on it, everything worked out.

Surprisingly, this approach worked well. In the end I built the whole system myself and even received several commendations for it. When the corps came down for an inspection, I gathered the civilian staff and officers and gave an informal lecture.

----
Sometimes it helps to limit external stimulation and information and concentrate on thinking. As a bonus, your thinking muscles and skills grow too.

So, for example, I recommend things like the following:
* When a bug appears, don't immediately paste it into a search box; spend at least five or ten minutes sketching the problem on blank paper and reasoning about the cause.
* When you want to get into a field you know nothing about, rather than searching the internet, buy three well-regarded books with different styles and read them side by side, comparing them (I call this bounded exploration -- without it, it's easy to waste time nibbling at things without ever properly reading any one of them).
* When you have a complex problem to solve, don't look up any additional information; open a blank sheet and spend 30 minutes designing a solution using only logic, your own thinking, and your past experience.

https://www.facebook.com/100000557305988/posts/pfbid02joCFDgeyR58vuv2MyZqQWJ1cf7FwrYZHS6FLq9ox8Bqu2RE9cV3HdgzWdHJvopjkl/?mibextid=jf9HGS
Continuous Learning_Startup & Investment
Could one Large Language Model handle all programming languages? Or should we tailor a model for each? What's your take? #LLM #ProgrammingLanguages https://www.linkedin.com/posts/mateizaharia_introducing-english-as-the-new-programming-activity-7080242815120637952…
Data science is getting so much easier 🚀

Thanks to ChatGPT, data science is becoming astonishingly easy. 🤖 Previously, the knowledge needed to produce the clustering chart below was as follows.

## Google Colab learning time 📚:

1. Learning the basic usage took about a week.

2. Learning more complex tasks, such as loading external data or handling large datasets, took an additional one to two weeks.

## Data science background knowledge 🎓:

1. Clustering: one to two weeks of study for a basic understanding.

2. Clustering evaluation metrics: about a week to understand each metric at a basic level.

3. Data analysis and processing: this topic is broad, so acquiring basic preprocessing and analysis techniques took at least one to two months.

## API knowledge 💻:

1. Firebase Firestore: learning the basics of Firestore took one to two weeks.

## Coding skills 🖥️:

1. Python: about one to two months to learn basic Python syntax.

2. NumPy: about one to two weeks to learn basic NumPy features.

3. Matplotlib: about a week to learn how to draw basic charts.

Adding up the learning time for each item above gives roughly: data science fundamentals, about 2-4 months; API knowledge (Firebase Firestore), about 1-2 weeks; coding skills (Python, NumPy, Matplotlib), about 2-3 months. So the total comes to roughly 4-7 months. 📈

----

# With ChatGPT, it turns into the following. 🔄

Since the AI handles the coding and the experiment design, that portion of the learning time can be dropped. What remains is a light background in data science and an understanding of Google Colab. 🤔

1. Data science background: with an AI assistant's explanations and guidance, this can shrink to about a month; in some cases you can skim the basic concepts in two weeks.

2. Google Colab: with an AI assistant's help, learning time can drop to about a week. (Honestly, even an hour might be enough.)

In that case, the total learning time is estimated at about one to two months. If you already know how to code and use APIs, it can shrink even further. ⌛

----

In the end, for a beginner, a six-month course becomes a one-month course. 🎉 In my case, knowing the data science background but not how to use the Python libraries, it shrank from about three weeks to two hours. 😲 And beyond this, even four years isn't enough to learn data science as a whole.

Ultimately, the work one senior data scientist can do becomes equivalent to that of more than ten junior scientists and junior data engineers.

In Silicon Valley, junior data scientists are already losing their jobs at a rapid pace. 😱

University curricula will probably have to change as well. If anything, students could learn deeper theory in the same amount of time, and programs should probably be designed around research methodology rather than hands-on coding. We are likely headed toward data scientists being leveled up in knowledge rather than in practical skills.

---

The prompt used for the scatter plot below:

1. Get the latest 1000 samples from user_tribes collection
2. tribeId is cluster id, x and y are the coordinates.
3. Measure the homogeneity and completeness using colab.
4. Visualize the results.

I never even mentioned KMeans, but it went ahead and used it on its own.
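
A rough sketch of what the generated notebook likely does, with the Firestore fetch replaced by synthetic data (the `user_tribes` field names follow the prompt above; everything else here is an assumption, not the actual ChatGPT output):

```python
# Re-cluster x/y points, score against the existing tribeId labels, and plot.
import matplotlib.pyplot as plt
from sklearn.cluster import KMeans
from sklearn.datasets import make_blobs
from sklearn.metrics import homogeneity_score, completeness_score

# Stand-in for "the latest 1000 samples": x/y coordinates plus a tribeId label.
xy, tribe_id = make_blobs(n_samples=1000, centers=4, random_state=42)

# Cluster the points and compare against the existing tribeId assignment.
pred = KMeans(n_clusters=4, n_init=10, random_state=42).fit_predict(xy)
print("homogeneity :", homogeneity_score(tribe_id, pred))
print("completeness:", completeness_score(tribe_id, pred))

# Scatter plot colored by predicted cluster, as in the post.
plt.scatter(xy[:, 0], xy[:, 1], c=pred, s=10, cmap="tab10")
plt.xlabel("x")
plt.ylabel("y")
plt.title("user_tribes clusters")
plt.show()
```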

https://www.facebook.com/634740022/posts/pfbid0cuABUXxgECdMwZfQaZ9u88HqXaLoLKzdJxBGLSsfHMfUovKRdQnuybjUYc9sJycsl/?mibextid=jf9HGS
A tip for better small talk: Conversation Threading

In the context of socializing, conversation threading means deliberately dropping keywords about yourself into the conversation as extra information, so that the other person (or someone else in the group) can pick them up and the conversation naturally continues in a "hmm, interesting" direction.

For example, when asked "Where are you from?", most people would close it out with a one-liner like "Suncheon, South Jeolla." Someone good at CT would instead say, "Suncheon in South Jeolla. It's near the night sea of Yeosu, it's famous for Suncheonman Bay Wetland, and it's a pilgrimage site for people into ecology field study." Then anyone in the group can pick up a thread: the song about Yeosu's night sea, a trip they once took to Yeosu, or a question about the wetland or ecological learning.

Many of the leaders who influenced me in the past (especially in the English-speaking world) used this skill naturally, and I worked hard to learn it by watching them. I still find it genuinely hard to stay on the right side of the line between TMI and "hmm, interesting." Still, if you make the effort to drop keywords, the people around you pick them up, conversations that might otherwise stall keep going, and rapport becomes much easier to build.

https://loopward.com/improve-conversation-skills-using-conversational-threads-and-sharing-experiences/

https://www.facebook.com/1150372185/posts/pfbid02ke1dLH2EPwSGkNSSGVL7NutMUkGN5ADNT2Zzeh3cQE8BK1rmHNoiGwz75kVT22v8l/?mibextid=jf9HGS
A few thoughts for women leaders--

There was a talk session with women leaders. They asked me to speak to women leaders from a man's perspective; here are a few of the points I made.

0. It is clear that building a career as a woman means playing on a steeply tilted field. Fortunately, things seem to be improving little by little as society changes.

1. How about drawing on your strengths as a woman instead of imitating male leaders?
- In the past, women with a masculine style, tougher than the men, were seen as the right fit for leadership.
- In an uncertain, diverse era, when empathetic, horizontal, inclusive leadership matters, women's strengths are increasingly what leadership needs.
- So make full use of your strengths as a woman.

2. Express yourself with confidence.
- Being gentle and empathetic is not the same as lacking confidence. You can be gentle and still be firm with people who treat you carelessly, and you can be full of confidence in everything you do.
- Don't be overly modest or overly accommodating; be assertive and confident. Usually, men with less ability than you are far more brimming with confidence.

3. Take on bigger responsibilities, leadership, and projects.
- Rather than being bound by your R&R and hesitating, boldly take on projects and responsibilities where you can grow and contribute.

4. Don't turn the blame on yourself, and recover quickly from hurt or explosive emotions.
- Don't blame yourself or hold yourself responsible. It's not your fault.
- No emotion is bad in itself, but rather than holding on to fear, disappointment, sadness, or anger for a long time or venting them too strongly, build stress-relief habits (exercise, meditation, walking) and recover quickly.

5. Shake off perfectionism.
- Be strategically incompetent.
- You don't have to be good at everything.
- Do you really need to live life as if it were homework or an exam? Trying to score 100 at work, at home, with relatives, with in-laws, and in society all at once is simply exhausting.

6. Look at tormenting bosses and difficult people with compassion.
- True sociopaths are rarer than you think; up close, most are just ordinary middle-aged men and women.
- They may simply be struggling to survive, so try to see them with compassionate eyes.

7. Walk your own path rather than being somebody's something.
- Don't give up your career too early; more often than not, people regret it later.
- Live your own career and your own life rather than being somebody's something.

P.S. As one Facebook friend pointed out, I spoke from an individual's perspective rather than a structural one. In truth, what's needed is probably less about how individuals should try harder and more a change in society's structural awareness and systems.

https://www.facebook.com/100006237757461/posts/pfbid02Nuwoy8XghsAmtxmR8vfCRi6naXNwMYVS5V4qybY4pgjbturAb1RHV26YSLYQBoXHl/?mibextid=jf9HGS
This covers the hot topics in AI; every one of them deserves careful thought, and personally I agree with all of it.

And above all, the conclusion is exactly right... That's my favorite newsletter for you.

Listen hard to other people's opinions, but in the end you have to make the judgment yourself.

https://luttig.substack.com/p/hallucinations-in-ai
Forwarded from Buff
Notes from meetings with 20-30 companies, by 농구천재
Source: https://blog.naver.com/tosoha1/223143233920

1. Companies in the small- and mid-cap cosmetics industry that grow mainly through exports seem to be showing continued growth.

2. Many medical-device and beauty-related companies were also growing steadily on the back of exports.

3. Petrochemical equipment makers, where I had worried that sluggish oil prices might weaken orders, were likewise continuing to see decent business conditions.

4. The shipbuilding industry, with order books full for the next two to three years, was talking about future orders at favorable ship prices.

5. Power-equipment companies' conditions felt even better than last year; they now seemed to be cautiously talking about a long cycle.

6. The auto sector, where peak-out worries are ever-present, is still holding up with good results.

7. Defense-related companies, while lacking last year's strong momentum, were also continuing relatively solid business.

8. Even companies in the semiconductor industry, which had looked all but dead, seemed fairly confident of passing the bottom within the first or second quarter.
Lessons gleaned from my wise bro

- Trust is built through a tapestry of gentle acts and kept promises.

Doug Leone of Sequoia Capital

Trust is forged through a blend of sincerity and excellence. In the absence of either, securing trust becomes challenging. Integrity is essential for fostering a healthy work environment; without it, you risk alienating your team. On the other hand, a lack of excellence hampers trust-building. It sets the tone and vision for a company. Once the company's direction is determined, it's crucial to attract the talent required to turn that vision into reality.

Jeff Bezos:

The founder of Amazon shared his 4-step method for building trust and reputation:
1. Do hard things: Earn trust by doing hard things well over and over again.
2. If you say you're going to do something, do it: Keep your promises and commitments.
3. Take controversial stances: Be willing to take risks and make tough decisions.
4. Have clarity: Be clear in your communication and decision-making.
Continuous Learning_Startup & Investment
https://www.longblack.co/note/738?ticket=NT17324b07422a33b8e9e23607de338f93262c450b
Attention is the power to filter out "the unnecessary noise and chatter around us" and "the trivial, scattered thoughts that keep rising to the surface of consciousness" (p. 12).

When attention runs short, we fall into confusion. We zone out and fail to notice what is happening around us. These days we face this crisis constantly, because of the information and tempting content pouring in through our smartphones. They steal our attention and scatter us.

After all, we only get one life. The feeling of not being fully immersed in your own life, of standing a step removed from it, makes people unhappy. It becomes a source of frustration and discontent, anxiety and emptiness.

"Without the ability to bring back, on your own, an attention that wanders here and there, no one can be the master of themselves."

"Time travel happens quite naturally. Under stress, our attention races back to the past through some memory, and within that memory it gets trapped in a loop of rumination." (p. 23)

The most powerful, yet most vulnerable: attention
On top of that, we live in a VUCA era: we have to endure life circumstances that are volatile, uncertain, complex, and ambiguous. In a word, an environment full of stress. To live well in times like these, we need attention.

"In an uncertain era in which life's stresses cannot be avoided, attention is more urgently needed than ever, so that we can reach our goals, become the people we want to be, and lead others and ourselves in the way we want." (p. 29)

Attention is not something you control

"Attention biases brain activity. It gives the information we select a relative advantage. Whatever we attend to, the neural activity related to it becomes more vigorous. Attention literally changes brain function at the cellular level. Attention is a true superpower."


To keep your attention from being stolen, you need to reduce stress and watch your mood. It helps not to be taken in by lines like "half price, today only." Of course that isn't easy; noticing the things that wreck your attention doesn't just happen on its own.

"When we try to sustain concentration for a long time, we begin to feel attention's resistance, and eventually our focus breaks down one way or another." (p. 153)

Fortunately, we can rewire the brain through training, through practices like meditation. "Meditation is the act of practicing, in a set sequence, to cultivate particular mental qualities" (p. 103). For thousands of years humanity has done this kind of strength training for the mind in philosophy, religion, reading, and other domains.

Professor Amishi Jha calls this mindfulness and stresses that it is the best means of training attention. Mindfulness is a kind of mental armor: it "protects the precious resource of attention even under heavy stress or pressure, and keeps that resource ready to use at any moment" (p. 24).

"Attention highlights what matters while weakening what interferes with concentration, leading us to think deeply, solve problems, make plans, set priorities, and innovate." (p. 42)


"To get more done, do one thing at a time. Don't multitask; task switching slows you down. To make the best plans for the future, you cannot just run simulations over multiple scenarios. Observe the present, stay in the present moment, and gather better ideas." (p. 296)

It brings to mind the philosopher Erich Fromm: "We are always busy, yet never concentrated. We listen to the radio and read the newspaper while eating breakfast, and in the middle of it all we talk with our wife and children. We do five things at once and do none of them properly." Fromm held this up as the archetype of a person who does not know how to love their own life.

The brain's default state is distraction; only with desperate effort can we concentrate. Attention is a special state of our mind.

"We spend so much time in our heads planning and imagining what will happen next that we completely miss our present life." (p. 217)

We are addicted to endless thinking. Instead of observing and considering the present as it is, we live forever strategizing and planning. Then it's easy to miss what can only be known by pausing right here to look back on life and weigh its meaning.

For this the author prescribes "twelve minutes a day": turn your mind back and tend to it for at least twelve minutes every day, and the quality of your attention can rise. Your actions and your goals also stop drifting apart, so you can steer your life in a better direction.

Slowly savoring a beautiful sunset, tracing a happy memory in detail, freely imagining a bright future: all of it is possible thanks to attention. Life comes only once, and attention cannot be stored; once the moment passes, it is gone. A life let slip by in distraction, without attention, does not come back. It is like not being able to step into the same flowing river twice.