Offshore
The Transcript
RT @TheTranscript_: Thursday's earnings deck includes Amazon:
Before Open: $COP $BMY $CMI $EL $B $CAH $ENR $CI $PTON $OWL $SHEL $ROK $LIN
After Close: $AMZN $IREN $RDDT $MSTR $RBLX $FTNT $ARW $BE $CLSK $DLR $MCHP $DOCS $TEAM https://t.co/r5p6hddA50
Dimitry Nakhla | Babylon Capital®
$ICE Q4 2025 Report 🗓️
✅ REV: $2.50B (+8% YoY)
✅ EPS: $2.82 (+13% YoY)
💵 FY FCF $4.19B +16% YoY
💰 20th consecutive year of record revenues https://t.co/KHdjZeuUUA
Benjamin Hernandez😎
Missing one big move hurts—missing two changes traders.
If you watched $ELPW hit +85% yesterday, one premarket setup today is giving the same ignition signs.
Get the setup:✅ https://t.co/71FIJIdBXe
Don’t let this become another regret.
$OPEN $ASTS $BYND
$ELPW Speculation Pick
Grab $ELPW ~$1.84
$ELPW is the "underdog" bet in the battery space. Recent reverse split has cleaned up the chart.
One-line why: High-conviction play on CEO Xiaodan Liu’s survival strategy and global Nasdaq presence. https://t.co/MF7Tyd785w - Benjamin Hernandez😎
Wasteland Capital
Tech has now entered a liquidation spiral where people (e.g. hedge funds) are forced to sell good, cheap, accelerating assets (e.g. much of semis) to cover their losses in expensive, decelerating, high P/E sh*t (e.g. SaaS / AI victims).
Not sure how long this will take to play out.
Brady Long
RT @thisguyknowsai: Every day I open up X and have the same thought within 5 mins.
“Bro you don’t need AI. You just need to chill out.”
DAIR.AI
RT @dair_ai: We are just scratching the surface of agentic RAG systems.
Current RAG systems don't let the model think about retrieval.
Retrieval is still mostly treated as a static step.
So the way it currently works is that RAG retrieves passages in one shot, concatenates them into context, and hopes the model figures it out.
More sophisticated methods predefine workflows that the model must follow step-by-step.
But neither approach lets the model decide how to search.
This new research introduces A-RAG, an agentic RAG framework that exposes hierarchical retrieval interfaces directly to the model, turning it into an active participant in the retrieval process.
Instead of one-shot retrieval, A-RAG gives the agent three tools at different granularities: keyword_search for exact lexical matching, semantic_search for dense passage retrieval, and chunk_read for accessing full document content.
The agent decides autonomously which tool to use, when to drill deeper, and when it has gathered enough evidence to answer.
Information in a corpus is naturally organized at multiple granularities, from fine-grained keywords to sentence-level semantics to full chunks.
Giving the model access to all these levels lets it spontaneously develop diverse retrieval strategies tailored to each task.
Results with GPT-5-mini are impressive. A-RAG achieves 94.5% on HotpotQA, 89.7% on 2Wiki, and 74.1% on MuSiQue, outperforming GraphRAG, HippoRAG2, LinearRAG, and every other baseline across all benchmarks.
Even A-RAG Naive, equipped with only a single embedding tool, beats most existing methods, demonstrating the raw power of the agentic paradigm itself.
Context efficiency is where it gets interesting. A-RAG Full retrieves only 2,737 tokens on HotpotQA compared to Naive RAG's 5,358 tokens, while achieving 13 points higher accuracy. The hierarchical design lets the model avoid loading irrelevant content, reading only what matters.
The framework also scales with test-time compute. Increasing max steps from 5 to 20 improves GPT-5-mini by ~8%. Scaling reasoning effort from minimal to high yields ~25% gains for both GPT-5-mini and GPT-5.
The future of RAG isn't better retrieval algorithms. It's better retrieval interfaces that let models use their reasoning capabilities to decide what to search, how to search, and when to stop.
Paper: https://t.co/FbZsV87npT
Learn to build effective AI Agents in our academy: https://t.co/LRnpZN7L4c
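The three-tool interface the thread describes can be sketched in a few lines. This is a toy illustration, not the paper's implementation: the corpus, the token-overlap scorer standing in for dense retrieval, and the scripted stopping rule are all assumptions; a real A-RAG system would back semantic_search with an embedding model and let the LLM itself choose which tool to call at each step.

```python
# Minimal sketch of A-RAG's hierarchical retrieval interface:
# keyword_search (exact lexical match), semantic_search (dense retrieval
# stand-in), chunk_read (full chunk access), plus a toy agent loop.

CORPUS = {
    "doc1": "The Eiffel Tower is in Paris. It was completed in 1889.",
    "doc2": "The Colosseum is in Rome. Construction finished around 80 AD.",
    "doc3": "Paris is the capital of France and hosts the Louvre museum.",
}

def keyword_search(term):
    """Exact lexical matching: ids of chunks containing the term."""
    return [cid for cid, text in CORPUS.items() if term.lower() in text.lower()]

def semantic_search(query, top_k=1):
    """Stand-in for dense retrieval: rank chunks by token overlap with the query."""
    q = set(query.lower().split())
    scored = sorted(
        CORPUS,
        key=lambda cid: len(q & set(CORPUS[cid].lower().split())),
        reverse=True,
    )
    return scored[:top_k]

def chunk_read(chunk_id):
    """Read the full content of one chunk."""
    return CORPUS[chunk_id]

def answer(question):
    """Toy agent loop: search, drill into candidates, stop once evidence
    (here, a four-digit year) is found -- in A-RAG the LLM makes these calls."""
    for cid in semantic_search(question, top_k=2):
        text = chunk_read(cid)  # drill deeper into a candidate chunk
        for token in text.replace(".", "").split():
            if token.isdigit() and len(token) == 4:
                return token
    return None

print(answer("When was the Eiffel Tower completed?"))  # prints 1889
```

The point of the design is that the agent, not the pipeline, decides the granularity: cheap keyword or semantic lookups narrow the candidates, and chunk_read is only paid for chunks that look promising, which is how the thread's context-efficiency numbers become possible.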
Michael Fritzell (Asian Century Stocks)
RT @AlecStapp: Narrative violation:
The world is becoming less unequal. https://t.co/1t2FbxX2fd
The world is more equal than you think. https://t.co/Gc264SIPtx - Steven Pinker
Michael Fritzell (Asian Century Stocks)
RT @konichivalue: This is the story of the greatest investor in Japanese history:
https://t.co/iX9LoaQjtW
Jukan
This is an article written by a senior reporter at The Information. Personally, I don’t like The Information, but I thought this was quite worth reading, so I’m sharing it.
Big Tech companies are racing to invest in OpenAI.
Nvidia is considering $30 billion, Amazon at least $20 billion, and Microsoft $10 billion.
SoftBank is also saying they will put in $30 billion.
Ken Brown, a senior reporter at The Information, says this massive valuation of OpenAI doesn’t make sense and begins to reason why.
Before going into the reasoning, the reporter explains why OpenAI’s cost of raising funds is rising rapidly.
OpenAI has been facing growing skepticism from the market regarding its cash burn and future profitability.
OpenAI has effectively been funding its data center construction by leveraging the balance sheets of partners including Oracle, CoreWeave, and Vantage Data Centers. However, that strategy is now reaching its limits.
Investors are sending signals that there are credit limits for companies with high exposure to OpenAI. Specifically, they are raising the bond yields of those companies and driving down their stock prices.
This contrasts with how investors view the tech giants. Despite large increases in capital expenditure (CapEx) and waves of borrowing, investors were generally enthusiastic about Big Tech's AI bets. If anything, the cautious ones were the large corporations themselves: Meta, Alphabet, Amazon, and Microsoft covered most of their AI construction costs with cash on hand and kept their borrowing low.
At some point, however, the tech giants began investing in OpenAI. They provide the cash needed to reassure the lenders of OpenAI's suppliers. At the same time, they do not record this funding as capital expenditure, and, at least so far, they are not financing it through debt.
The reporter explains that there is another meaning to these investments. Big Tech is now doing a variation of the circular trading that Nvidia did all last year.
They are creating circular financing deals where they send funds to their own customers.
Circular investment has different meanings for each company. For Nvidia, investing in companies that purchase its chips is a way to block competition and secure growth, and for Microsoft and Amazon, it means securing more cloud business from OpenAI.
Whatever the motivation behind the massive OpenAI investments by tech giants, the reporter points out that the impact is the same. Cash-rich companies are providing financial breathing room for OpenAI, allowing it to hold out until revenue and profits become sustainable or at least until it gets close enough for the market to open the funding tap.
This is exactly what worries shareholders: how long that will take, and whether the tech giants will keep writing checks until then.
The reporter cites the 14% drop in Microsoft's stock price as an example: investors have grown more concerned about the company's reliance on OpenAI as a customer and whether it is getting a return on its AI spending.
I would like to summarize this report like this:
Big Tech is not valuing OpenAI at $730 billion because they want to. They have realized that if they don't give OpenAI $100 billion, $1 trillion will vanish from their own market capitalization.
Currently, most major IT company stock prices are inflated by an "AI premium." If OpenAI collapses due to a lack of funds for electricity and chip purchases, the logic supporting the entire AI industry could collapse.