Offshore
Dimitry Nakhla | Babylon Capital®
๐‚๐ก๐ซ๐ข๐ฌ ๐‡๐จ๐ก๐ง ๐จ๐ง ๐€๐ˆ, ๐๐ข๐ฌ๐ซ๐ฎ๐ฉ๐ญ๐ข๐จ๐ง, ๐š๐ง๐ ๐ฐ๐ก๐ฒ ๐ซ๐ž๐š๐ฅ ๐ฆ๐จ๐š๐ญ๐ฌ ๐ฆ๐š๐ฒ ๐ฆ๐š๐ญ๐ญ๐ž๐ซ ๐ฆ๐จ๐ซ๐ž ๐ญ๐ก๐š๐ง ๐ž๐ฏ๐ž๐ซ:

“It’s going to increase disruption in ways we can’t even predict… but AI will increase productivity and lower the cost base of all companies.

And so if you have a company with these barriers to entry, it’s going to be worth more.”
___

๐“๐ฐ๐จ ๐ข๐ฆ๐ฉ๐จ๐ซ๐ญ๐š๐ง๐ญ ๐ข๐๐ž๐š๐ฌ ๐ž๐ฆ๐›๐ž๐๐๐ž๐ ๐ก๐ž๐ซ๐ž:

๐Ÿ. ๐ƒ๐ข๐ฌ๐ซ๐ฎ๐ฉ๐ญ๐ข๐จ๐ง ๐ซ๐ข๐ฌ๐ค ๐ข๐ฌ ๐ซ๐ข๐ฌ๐ข๐ง๐ 

AI lowers barriers to doing things, which means competitive pressure increases across many industries. Business models built on labor-intensive, easily replicable work are especially vulnerable.

๐Ÿ. ๐Œ๐จ๐š๐ญ๐ฌ + ๐€๐ˆ ๐œ๐š๐ง ๐›๐ž ๐š ๐ฉ๐จ๐ฐ๐ž๐ซ๐Ÿ๐ฎ๐ฅ ๐œ๐จ๐ฆ๐›๐จ

If a company already has durable barriers to entry, AI becomes a margin and productivity lever rather than an existential threat.
___

A particularly attractive hunting ground:

๐˜ฝ๐™ช๐™จ๐™ž๐™ฃ๐™š๐™จ๐™จ๐™š๐™จ ๐™ฌ๐™ž๐™ฉ๐™ ๐™ข๐™ช๐™ก๐™ฉ๐™ž๐™ฅ๐™ก๐™š ๐™—๐™–๐™ง๐™ง๐™ž๐™š๐™ง๐™จ ๐™ฉ๐™ค ๐™š๐™ฃ๐™ฉ๐™ง๐™ฎ ๐™–๐™ฃ๐™™ ๐™ก๐™–๐™ง๐™œ๐™š ๐™๐™ช๐™ข๐™–๐™ฃ-๐™˜๐™–๐™ฅ๐™ž๐™ฉ๐™–๐™ก ๐™˜๐™ค๐™จ๐™ฉ ๐™—๐™–๐™จ๐™š๐™จ.

๐˜ผ๐™„ ๐™˜๐™–๐™ฃ ๐™จ๐™ฉ๐™ง๐™ช๐™˜๐™ฉ๐™ช๐™ง๐™–๐™ก๐™ก๐™ฎ ๐™ก๐™ค๐™ฌ๐™š๐™ง ๐™ฉ๐™๐™š๐™ž๐™ง ๐™˜๐™ค๐™จ๐™ฉ ๐™จ๐™ฉ๐™ง๐™ช๐™˜๐™ฉ๐™ช๐™ง๐™š ๐™ฌ๐™๐™ž๐™ก๐™š ๐™ฉ๐™๐™š ๐™ข๐™ค๐™–๐™ฉ ๐™ฅ๐™ง๐™ค๐™ฉ๐™š๐™˜๐™ฉ๐™จ ๐™ฅ๐™ง๐™ž๐™˜๐™ž๐™ฃ๐™œ ๐™ฅ๐™ค๐™ฌ๐™š๐™ง.
___

Video: In Good Company | Norges Bank Investment Management (05/14/2025)
DAIR.AI
Multi-agent memory has a homogenization problem.

This work finds that role-aware latent memory that is learnable, compact, and framework-agnostic consistently outperforms handcrafted memory architectures while being substantially more efficient.

When multiple agents share the same memory pool, they end up with identical recollections regardless of their distinct roles. A coding agent, a planning agent, and a review agent all retrieve the same memory entries, ignoring functional differences that should shape what each agent remembers.

The second bottleneck is information overload. Multi-agent systems (MAS) inherently involve long interaction contexts, and storing fine-grained memory entries at multiple granularities amplifies this burden, overwhelming agents and obscuring critical decision signals.

This new research introduces LatentMem, a learnable multi-agent memory framework that customizes agent-specific memories in a token-efficient manner.

Instead of storing and retrieving text-based memory entries, LatentMem compresses raw interaction trajectories into compact latent representations conditioned on each agent's role profile. A lightweight memory composer synthesizes fixed-length latent memories that are injected directly into the agent's reasoning process.

To train the memory composer, they introduce Latent Memory Policy Optimization (LMPO), which propagates task-level optimization signals through latent memories to encourage compact, high-utility representations. This exploits the differentiability of latent memory to enable gradient backpropagation through the entire memory pipeline.
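The paper's actual composer and LMPO training loop aren't reproduced in the post, but the core role-conditioning idea can be illustrated with a toy sketch: score trajectory steps against an agent's role profile and pool the most relevant ones into a fixed-length latent memory. Everything here (the hash-based `embed`, the slot count, the scoring rule) is a hypothetical stand-in for the learned components, not the paper's implementation.

```python
import hashlib

def embed(token: str, dim: int = 8) -> list[float]:
    """Toy deterministic embedding: hash text into a small vector.
    Stands in for a real learned encoder; purely illustrative."""
    h = hashlib.sha256(token.encode()).digest()
    return [b / 255.0 for b in h[:dim]]

def compose_latent_memory(trajectory, role, latent_len=2, dim=8):
    """Sketch of a role-conditioned memory composer: compress a raw
    interaction trajectory into a fixed number of latent slots,
    keeping the steps most (toy-)relevant to the agent's role."""
    role_vec = embed(role, dim)
    scored = []
    for step in trajectory:
        v = embed(step, dim)
        score = sum(a * b for a, b in zip(role_vec, v))  # dot product
        scored.append((score, v))
    # Keep the top-scoring steps so different roles retain different memories.
    scored.sort(key=lambda sv: sv[0], reverse=True)
    kept = [v for _, v in scored[:latent_len]]
    # Pad with zero vectors so every agent gets the same fixed shape.
    while len(kept) < latent_len:
        kept.append([0.0] * dim)
    return kept

trajectory = ["plan: outline modules", "code: write parser", "review: check tests"]
mem = compose_latent_memory(trajectory, role="coding agent")
print(len(mem), len(mem[0]))  # 2 8
```

The fixed output shape is the point: each agent injects the same small token budget into its context regardless of trajectory length, which is where the claimed ~50% token savings would come from.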

Across six benchmarks and four MAS frameworks with Qwen3-4B, LatentMem achieves up to 16.20% improvement on TriviaQA and 19.36% on PopQA over vanilla settings. On code generation with KodCode, it delivers an 8.40-9.55% gain depending on the framework. It consistently outperforms eight existing memory architectures, including MetaGPT, Voyager, JoyAgent, and G-Memory.

The efficiency gains matter too: 50% fewer tokens and inference time reduced to roughly two-thirds compared to mainstream memory designs. On out-of-domain tasks, LatentMem still generalizes well, with 7.10% improvement on PDDL and 7.90% on unseen MAS frameworks like CAMEL.

Paper: https://t.co/VfmG0DYIf8

Learn to build effective AI agents in our academy: https://t.co/PE5l0X8fFq
Jukan
Oh… guys, I’ve been talking about Micron’s HBM4 since last September…
Moon Dev
yeah I’d say opus 4.6 can cook https://t.co/Lvs4RWs2yt
DAIR.AI
RT @omarsar0: Another banger by the Anthropic Engineering team.

They mass-parallelized 16 Claude instances to build a full C compiler from scratch.

100,000 lines of Rust. Compiles the Linux kernel. No active human supervision.

The wildest part isn't even the compiler itself. It's that they built a system where agents autonomously pick up tasks, lock files to avoid conflicts, and git sync with each other like a remote dev team.
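Anthropic hasn't published the harness code, but the lock-to-claim pattern described here is a classic one. A minimal sketch of one way it can work, using atomic lock-file creation so that exactly one agent wins each task (all names here, like `try_claim_task`, are hypothetical):

```python
import os
import tempfile

def try_claim_task(lock_dir: str, task_id: str, agent: str) -> bool:
    """Attempt to claim a task by atomically creating a lock file.
    O_CREAT | O_EXCL fails if another agent already holds the lock."""
    path = os.path.join(lock_dir, f"{task_id}.lock")
    try:
        fd = os.open(path, os.O_CREAT | os.O_EXCL | os.O_WRONLY)
    except FileExistsError:
        return False  # another agent got there first
    with os.fdopen(fd, "w") as f:
        f.write(agent)  # record the owner for debugging
    return True

def release_task(lock_dir: str, task_id: str) -> None:
    """Free the task so another agent may pick it up."""
    os.remove(os.path.join(lock_dir, f"{task_id}.lock"))

lock_dir = tempfile.mkdtemp()
assert try_claim_task(lock_dir, "compile_lexer", "agent-1")      # first claim wins
assert not try_claim_task(lock_dir, "compile_lexer", "agent-2")  # second is refused
release_task(lock_dir, "compile_lexer")
assert try_claim_task(lock_dir, "compile_lexer", "agent-2")      # free again
```

With a shared filesystem (or a git repo acting as one), this is enough for a swarm to divide work without a central scheduler; each agent loops over open tasks and works on whichever locks it wins.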

Looks inspired by Ralph Loop.

2 billion input tokens, 140 million output tokens, 2 weeks, and $20k in total cost.

If you're still writing code one file at a time in a single session, you are massively underestimating where this is headed.

Agent swarms that coordinate on real codebases aren't a thing of the future anymore. They're a right now thing.

2026 is shaping up to be the year of agent harnesses. And the cool part is that you can go and build your agent team with Claude Code now.
Bourbon Capital
Howard Marks: What was the most important event in the financial and investment world in the last 50 years?

Howard Marks: Most people would say Lehman Brothers in 2008, the tech bubble, Black Monday… but I believe it was the decline in interest rates.

Declining interest rates are extremely beneficial for asset ownership…

Oaktree Capital Management (Howard Marks) 13F as of Sep 2025 https://t.co/WPSyQWpsRV
- Bourbon Insider Research
Fiscal.ai
The Hyperscalers now have more than $1 trillion in total cloud commitments.

Google Cloud: $243B (+161%)
AWS: $244B (+38%)
Microsoft Azure: $631B (+108%)

$GOOGL $AMZN $MSFT https://t.co/QLEZkSFvE7
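A quick sanity check that the three quoted figures do exceed $1 trillion combined (figures as stated in the tweet; the growth percentages aren't verifiable from the sum alone):

```python
# Cloud commitment figures as quoted above, in $ billions.
commitments = {"Google Cloud": 243, "AWS": 244, "Microsoft Azure": 631}
total = sum(commitments.values())
print(total)  # 1118 -> about $1.12 trillion combined
```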
Javier Blas
RT @badralbusaidi: Very serious talks mediating between Iran and the US in Muscat today.
It was useful to clarify both Iranian and American thinking and identify areas for possible progress. We aim to reconvene in due course, with the results to be considered carefully in Tehran and Washington. https://t.co/OWctzf2CXA
Moon Dev
$54,000,000 BTC long just entered 6 minutes ago

liquidation point at $67,348

will he get smoked or make $100m? https://t.co/mLSOMd7ian