The Long Investor
$TSLA hit $144 in Robinhood overnight trading and then it got halted.
Interesting
$HOOD Robinhood halts 24 hr trading. https://t.co/oVkbJf1T67 - Crossroads
tweet
The Long Investor
In the space of 2 weeks, the number of companies in the S&P 500 with an RSI below 30 went from 6 to 64.
A 10x jump.
The market is looking weak.
$SPY https://t.co/DMfeR0n7d5
There are only 6 companies in the S&P 500 with RSIs below 30, and 3 of them should be of considerable interest to you:
$UNH
$NKE
$LULU - The Long Investor
tweet
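For reference, the RSI in the breadth stat above is the standard 14-period Wilder Relative Strength Index. A minimal sketch of how such an oversold count can be computed, assuming daily closing prices for the S&P 500 constituents are already loaded into a pandas DataFrame (one column per ticker; the data source and ticker list are left unspecified here):

```python
import pandas as pd

def rsi(close: pd.Series, period: int = 14) -> pd.Series:
    """Wilder's RSI from a series of daily closing prices."""
    delta = close.diff()
    gain = delta.clip(lower=0)
    loss = -delta.clip(upper=0)
    # Wilder's smoothing is an exponential average with alpha = 1/period
    avg_gain = gain.ewm(alpha=1 / period, min_periods=period, adjust=False).mean()
    avg_loss = loss.ewm(alpha=1 / period, min_periods=period, adjust=False).mean()
    rs = avg_gain / avg_loss
    return 100 - 100 / (1 + rs)

def oversold_count(prices: pd.DataFrame, threshold: float = 30.0) -> int:
    """Count tickers whose latest RSI reading is below the threshold."""
    latest_rsi = prices.apply(rsi).iloc[-1]
    return int((latest_rsi < threshold).sum())

# Hypothetical usage: prices is a DataFrame of daily closes indexed by date,
# with one column per S&P 500 ticker.
# oversold_count(prices)  # -> 6 two weeks ago, 64 in the snapshot above
```

A simple-moving-average RSI variant would give slightly different counts than Wilder's smoothing, but the breadth signal being described is the same.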
Hidden Value Gems
Most analysts are trained to forecast next-year earnings and a relevant multiple, but few focus on how long a company will stay in business. During the second session of my first Investment Masterclass, we discussed the longevity effect in investing, among other points that are often missed in financial analysis.
There is still space left for the second cohort in June.
You can join the waitlist on the Hidden Value Gems website.
tweet
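As an illustrative aside on that longevity point: the numbers below are hypothetical (a flat annual cash flow of 100 and a 10% discount rate), but they sketch how much of a valuation depends on how many years the cash flows actually persist rather than on next year's earnings alone.

```python
def present_value(cash_flow: float, rate: float, years: int) -> float:
    """Present value of a flat annual cash-flow stream lasting `years` years."""
    return sum(cash_flow / (1 + rate) ** t for t in range(1, years + 1))

cf, r = 100.0, 0.10  # hypothetical cash flow and discount rate
for horizon in (5, 10, 20, 40):
    print(f"{horizon:>2} years of cash flows -> PV ~ {present_value(cf, r, horizon):,.0f}")

# 5 years  -> ~379
# 10 years -> ~614
# 20 years -> ~851
# 40 years -> ~978
# Even with zero growth, a business that survives 40 years is worth about 60%
# more than one that lasts 10; with a lower discount rate or any growth, the
# gap from longevity widens further.
```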
AkhenOsiris
$META $NVDA
Zuckerberg on buying H100s back in 2022, initially for Reels:
"The corpus of content candidates that we could potentially show you expanded from on the order of thousands to on the order of hundreds of millions. It needed a completely different infrastructure. We started working on doing that and we were constrained on the infrastructure in catching up to what TikTok was doing as quickly as we wanted to. I basically looked at that and I was like “hey, we have to make sure that we're never in this situation again. So let's order enough GPUs to do what we need to do on Reels and ranking content and feed. But let's also double that.” Again, our normal principle is that there's going to be something on the horizon that we can't see yet.
We thought it was going to be something that had to do with training large models. At the time I thought it was probably going to be something that had to do with content. It’s just the pattern matching of running the company, there's always another thing. At that time I was so deep into trying to get the recommendations working for Reels and other content. That’s just such a big unlock for Instagram and Facebook now, being able to show people content that's interesting to them from people that they're not even following.
But that ended up being a very good decision in retrospect. And it came from being behind. It wasn't like “oh, I was so far ahead.” Actually, most of the times where we make some decision that ends up seeming good is because we messed something up before and just didn't want to repeat the mistake."
Source: youtu.be/bc6uFV9CJGg
tweet
AkhenOsiris
$META
Zuckerberg on AI:
"Our bet is that it's going to basically change all of the products. I think that there's going to be a kind of Meta AI general assistant product. I think that that will shift from something that feels more like a chatbot, where you ask a question and it formulates an answer, to things where you're giving it more complicated tasks and then it goes away and does them. That's going to take a lot of inference and it's going to take a lot of compute in other ways too."
tweet
AkhenOsiris
$META
Zuckerberg on 350,000 GPUs, Inference:
"That's the whole fleet. We built two, I think 22,000 or 24,000 clusters that are the single clusters that we have for training the big models, obviously across a lot of the stuff that we do. A lot of our stuff goes towards training Reels models and Facebook News Feed and Instagram Feed. Inference is a huge thing for us because we serve a ton of people. Our ratio of inference compute required to training is probably much higher than most other companies that are doing this stuff just because of the sheer volume of the community that we're serving."
tweet
AkhenOsiris
$META $NVDA
Zuckerberg on AI, GPUs, scarcity, runway:
"I think it's likely enough that we'll keep going. I think it’s worth investing the $10Bs or $100B+ in building the infrastructure and assuming that if it keeps going you're going to get some really amazing things that are going to make amazing products. I don't think anyone in the industry can really tell you that it will continue scaling at that rate for sure. In general in history, you hit bottlenecks at certain points. Now there's so much energy on this that maybe those bottlenecks get knocked over pretty quickly. I think that’s an interesting question.
Well, there are going to be different bottlenecks. Over the last few years, I think there was this issue of GPU production. Even companies that had the money to pay for the GPUs couldn't necessarily get as many as they wanted because there were all these supply constraints. Now I think that's sort of getting less. So you're seeing a bunch of companies thinking now about investing a lot of money in building out these things. I think that that will go on for some period of time. There is a capital question. At what point does it stop being worth it to put the capital in?
I actually think before we hit that, you're going to run into energy constraints. I don't think anyone's built a gigawatt single training cluster yet. You run into these things that just end up being slower in the world. Getting energy permitted is a very heavily regulated government function. You're going from software, which is somewhat regulated and I'd argue it’s more regulated than a lot of people in the tech community feel. Obviously it’s different if you're starting a small company, maybe you feel that less. We interact with different governments and regulators and we have lots of rules that we need to follow and make sure we do a good job with around the world. But I think that there's no doubt about energy.
If you're talking about building large new power plants or large build-outs and then building transmission lines that cross other private or public land, that’s just a heavily regulated thing. You're talking about many years of lead time. If we wanted to stand up some massive facility, powering that is a very long-term project. I think people do it but I don't think this is something that can be quite as magical as just getting to a level of AI, getting a bunch of capital and putting it in, and then all of a sudden the models are just going to… You do hit different bottlenecks along the way."
tweet
AkhenOsiris
$META $NVDA
By when will the Llama models be trained on your own custom silicon?
Zuckerberg:
"Soon, not Llama-4. The approach that we took is we first built custom silicon that could handle inference for our ranking and recommendation type stuff, so Reels, News Feed ads, etc. That was consuming a lot of GPUs. When we were able to move that to our own silicon, we're now able to use the more expensive NVIDIA GPUs only for training. At some point we will hopefully have silicon ourselves that we can be using for at first training some of the simpler things, then eventually training these really large models. In the meantime, I'd say the program is going quite well and we're just rolling it out methodically and we have a long-term roadmap for it."
tweet
Antonio Linares
Bull market or not, AI is here to exponentially increase human productivity. For select companies, this will lead to exponentially stronger moats.
Here are 10 businesses that are uniquely positioned to benefit from the rise of AI:
1. $MSFT: The world uses $MSFT’s tools to work. This gives the company access to data that it can then use to train AIs and automate work across pretty much any industry worldwide. In this manner, $MSFT’s AI copilots have a great chance of becoming indispensable and ubiquitous tools.
2. $GOOG: Knows everything about everyone. Although the company’s recent AI launches have flopped and it seems to have a fair bit of internal cultural dysfunction, if $GOOG gets its affairs in order, it has the raw ingredients to create the most powerful AI models on Earth.
3. $NVDA: Leading the field with an unmatched array of GPUs and a robust software ecosystem, $NVDA holds the top position globally as an AI compute provider. Despite potential market share gains by $AMD, $NVDA is poised for continued success.
4. $AMD: Utilizing its innovative chiplet design, $AMD is poised to challenge $NVDA's stronghold in the AI GPU market. With its current valuation reflecting its underdog status, $AMD represents a high-potential investment opportunity.
5. $CRWD: Positioned to dominate the XDR market, $CRWD's unique unified data model is its strategic edge. This structure facilitates rapid deployment of new cybersecurity solutions through efficient AI model training, enhancing its indispensability and competitive stance as AI technology advances.
6. $PLTR: As the premier provider of digital twins and a major facilitator of AI integration in the West, $PLTR is rapidly transforming into a key platform, fostering network effects that competitors will find challenging to emulate.
7. $AMZN: Beyond its well-known status as an e-commerce giant, $AMZN is advancing towards creating some of the most lucrative AI assistants in the market. These developments are projected to generate substantial revenues with exceptionally high margins.
8. $SPOT: In an internet age that has largely overlooked voice in favor of video, images, and text, $SPOT leads in understanding audience preferences for spoken content. This insight is driving $SPOT to develop AI solutions that benefit both content creators and consumers, likely enhancing its free cash flow over time.
9. $RBLX: Commonly perceived as just a children's gaming platform, $RBLX is rapidly evolving into a formidable social media contender. Leveraging AI, it is enhancing content creation speeds, which is expected to significantly impact its profitability.
10. $PATH: Often seen merely as a tool for screen monitoring and data scraping, $PATH is strategically advancing towards semantic automation. This evolution positions $PATH as a critical player in automation, with AI improving its data access and indispensability to customers.
tweet
Antonio Linares
2 Hour Deep Diver continues to give students the tools they need to do their own research and become Sovereign Investors.
The best investment you can make is in your education. https://t.co/rsL99TFEL5
tweet
Giuliano
We just published the second non-interview.
Today's guest: my father 👀 https://t.co/pNIHA6OTwc
tweet