Michael Fritzell (Asian Century Stocks)
RT @blondesnmoney: One of the sharpest small cap guys, @GuastyWinds , is finally back on Twitter. Made a bunch from his excellent callouts in 2023 and 2024, really excited to see what he has next and highly recommend a follow.
Jukan
I don’t have a girlfriend, but I still wanted to try some chocolate, so I looked around online.
And… it turns out Louis Vuitton makes chocolate too? It’s offline-exclusive, though.
The price is a whopping $360. I have no idea what kind of ridiculous money flex that is. https://t.co/I1YviRZ9Do
Michael Fritzell (Asian Century Stocks)
RT @david_katunaric: @MoS_Investing To a man with an AI everything looks like disruption
anon
Yeah, but Ichiyoshi January 16, 2025 report: "AIMECHATEC: Initiating strong buy on anticipated long-term growth in semiconductor bonders and debonders".
Ishibashi-san did long reports on both AIMECHATEC and Shikoku Kasei (CCL sector report) before they became baggers.
This tweet had 95 impressions and four likes. Ofc, the stock is up 124% since then. - Illiquid (@illyquid)
DAIR.AI
RT @dair_ai: // Improving Efficiency of Evolutionary AI Agents //
Evolutionary AI agents are powerful but can be wasteful.
These systems, inspired by AlphaEvolve and OpenEvolve, iteratively generate, mutate, and refine candidate solutions using LLMs. However, every refinement step invokes the same large model regardless of task difficulty.
Most mutations don't need a 32B model.
This new research introduces AdaptEvolve, a framework that dynamically selects which model handles each evolutionary step based on intrinsic generation confidence.
Instead of routing everything through the largest available model, a lightweight decision tree router estimates whether the small model's output is sufficient or needs escalation.
The confidence signal comes from four entropy-based metrics computed on the small model's token probabilities: Mean Confidence for global assurance, Lowest Group Confidence for localized reasoning collapses, Tail Confidence for solution stability, and Bottom-K% Confidence for distinguishing noise from systematic hallucination.
A shallow decision tree, bootstrapped from just 50 warm-up examples, uses these signals to make real-time routing decisions.
What makes this practical?
The router adapts online. An Adaptive Hoeffding Tree continuously updates its decision boundaries as the evolutionary population drifts toward harder edge cases.
On LiveCodeBench, AdaptEvolve retains 97.9% of the 32B upper-bound accuracy (73.6% vs 75.2%) while cutting compute cost by 34.4%. On MBPP, the router identifies that 85% of queries are solvable by the 4B model alone, reducing cost by 41.5% while maintaining 97.1% of peak accuracy. Across benchmarks, the method reduces total inference compute by 37.9% while retaining 97.5% of the upper-bound performance.
Evolutionary agents don't need maximum capability at every step. Confidence-driven routing turns the cost-capability trade-off from a fixed choice into a dynamic, per-step decision.
Paper: https://t.co/YSNCKZuTeN
Learn to build effective AI Agents in our academy: https://t.co/LRnpZN7L4c
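The four confidence signals above can be sketched from a small model's per-token probabilities. The metric names come from the thread; the exact formulas, window sizes, percentile cutoffs, and the threshold router below are assumptions standing in for the paper's decision tree.

```python
import numpy as np

def confidence_signals(token_probs, group_size=8, tail_frac=0.2, bottom_pct=10):
    """Compute the four confidence metrics from the probabilities the
    small model assigned to each token it generated."""
    p = np.asarray(token_probs, dtype=float)
    # Mean Confidence: global assurance over the whole generation.
    mean_conf = p.mean()
    # Lowest Group Confidence: worst sliding-window average, catching
    # localized reasoning collapses that a global mean would smooth over.
    windows = np.lib.stride_tricks.sliding_window_view(p, min(group_size, len(p)))
    lowest_group = windows.mean(axis=1).min()
    # Tail Confidence: average over the final tokens, where an unstable
    # solution tends to degrade.
    n_tail = max(1, int(len(p) * tail_frac))
    tail_conf = p[-n_tail:].mean()
    # Bottom-K% Confidence: mean of the weakest tokens, separating a few
    # noisy tokens from systematic low confidence.
    n_bottom = max(1, int(len(p) * bottom_pct / 100))
    bottom_k = np.sort(p)[:n_bottom].mean()
    return np.array([mean_conf, lowest_group, tail_conf, bottom_k])

def route(signals, threshold=0.7):
    """Toy stand-in for the trained router: escalate to the large model
    whenever any signal falls below a fixed threshold."""
    return "large" if signals.min() < threshold else "small"
```

In the paper these signals feed a shallow decision tree bootstrapped from ~50 warm-up examples (and updated online via an Adaptive Hoeffding Tree); the fixed threshold here is only to make the routing decision concrete.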
anon
RT @zephyr_z9: Chinese AI CAPEX (include SEA buildout as well) will surprise us
Some accounts are consistently high-alpha like david orr or zephyr. you just consume it and keep rising higher with them. some, like mine, are very lumpy, high volatility, with some great winners and other dumpster stocks. Follow the track record. Trust the track record. - anon (@anonymous3nibrv)
anon
Yakult announces 2.56% s/o buyback with 8.08% of issued share cancellation. Another cash hoarding company that realises: we don't need this much cash. 2 yrs ago i felt many corp gov changes were 'reactive' - recently i feel they are more 'proactive'. https://t.co/8hihJTh5wt
anon
If you haven't read this Substack article on MARUWA $5344.T, an 'AI optical infrastructure' supplier with 60% share of heat dissipation substrates crucial to optical transceivers, it's still not too late. Market expected to grow 60% CAGR over 5 yrs, and Maruwa ebit margins are 36%. https://t.co/UnHwTykb1x
anon
PIA 4337 ticketing platform showing op lev inflection, q3 ebit up 150% y/y vs. 24% rev. trades for 4x ev/ebit. ticketing platform alone gross margin > 40%, and likely ebit margin 20%+, diluted by event production. This sub-5x ev/ebit with 40%+ gpm 10% rev cagr is my sweet spot.
anon
The way to play this in japan: Ferrotec $6890.T - vacuum seals and quartz/silicon/ceramics parts co w/ China subsidiary "CCMC" worth 2x 6890's market cap. Ferrotec itself is still trading at 6.7x ebitda with 11% ebit margins. likely 2026 winner https://t.co/d7SuCQESmH
Chinese AI CAPEX (include SEA buildout as well) will surprise us - Zephyr
ムー (@Laabmooo):
Ferrotec subsidiary CCMC
Market cap on the first day of OTC listing: ¥516.0bn (RMB 24.345bn)
Judging by the valuation of the FTSVA and CCMC stakes alone, this is…
The Transcript
RT @TheTranscript_: $ABNB CEO: Airbnb’s defense against disintermediation is focusing on what AI can’t replicate
"A chatbot can give you a list of homes, but it can't give you the unique ones you find on Airbnb..." https://t.co/5lwQ6BVXcD