Moon Dev
Information Arbitrage: Cracking PolyMarket With A Consensus-Driven AI Agent Swarm
the code to dominating polymarket has finally been cracked and it doesn't involve being a genius or having a crystal ball. the real secret lies in building a machine…
ging system. iterate to success is a motto that has saved me from giving up when a bot failed or a strategy turned out to be a dud. every failure is just a data point that helps you refine the next version of the machine until it finally clicks
the system prompts i use are designed to make the ai think like a professional market analyst who is focused on deep discovery. you can customize these prompts to look for very specific things like historical polling data or legislative nuances that a human might overlook. this level of customization is what turns a generic tool into a specialized weapon for your own specific trading style
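a minimal sketch of what one of those analyst prompts can look like wired up in python. everything here is illustrative: the client, model name, and prompt wording are my assumptions, not the actual moon dev code

```python
# hypothetical example of a customized "deep discovery" analyst prompt.
# the model name and prompt text are assumptions, not the original files.
import os
from openai import OpenAI

ANALYST_SYSTEM_PROMPT = """You are a professional prediction-market analyst focused on deep discovery.
For the market question you are given:
- summarize relevant historical base rates (polling averages, past votes, prior resolutions)
- flag any legislative or procedural nuances a casual reader would overlook
- finish with a probability between 0 and 1 and a one-line rationale."""

def analyze_market(question: str) -> str:
    """Ask a single agent for its read on one PolyMarket question."""
    client = OpenAI(api_key=os.environ["OPENAI_API_KEY"])
    response = client.chat.completions.create(
        model="gpt-4o-mini",  # placeholder model; swap in whatever the swarm uses
        messages=[
            {"role": "system", "content": ANALYST_SYSTEM_PROMPT},
            {"role": "user", "content": question},
        ],
    )
    return response.choices[0].message.content

if __name__ == "__main__":
    print(analyze_market("Will the Senate pass the stablecoin bill before July?"))
```

swapping the bullet points in the system prompt is where the customization happens: one agent can be tuned for polling history, another for legislative procedure, and the swarm's consensus comes from combining their answers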
it is important to remember that these systems are not a magic button that prints money without any human oversight or logic. they are tools for market discovery and for finding the anomalies that jim simons used to build his legendary empire at renaissance technologies. you still have to put your own mind on top of the machine and decide which signals are worth the risk of your own capital
most traders are afraid of the blockchain because they think it is too technical but for us it is a transparent gold mine of data. every trade and every market is recorded in a way that allows a bot to scan for patterns that have happened before. by saving all these predictions into a local database you can look back after a few months and see exactly where the ai was right and where it was wrong
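a small sketch of that local record-keeping, assuming a sqlite file and a made-up schema; the table, columns, and grading rule below are illustrative, not the original database

```python
# hypothetical prediction log: store every AI call, grade it once the market resolves.
import sqlite3
from datetime import datetime, timezone

DB_PATH = "predictions.db"

def init_db() -> None:
    with sqlite3.connect(DB_PATH) as con:
        con.execute("""
            CREATE TABLE IF NOT EXISTS predictions (
                id INTEGER PRIMARY KEY AUTOINCREMENT,
                market TEXT NOT NULL,          -- market question or slug
                predicted_prob REAL NOT NULL,  -- the swarm's probability estimate
                market_price REAL,             -- market price at prediction time
                created_at TEXT NOT NULL,      -- ISO timestamp
                resolved_outcome INTEGER       -- 1 = yes, 0 = no, NULL = still open
            )
        """)

def log_prediction(market: str, predicted_prob: float, market_price: float) -> None:
    """Write one prediction row; the resolution gets filled in later."""
    with sqlite3.connect(DB_PATH) as con:
        con.execute(
            "INSERT INTO predictions (market, predicted_prob, market_price, created_at) "
            "VALUES (?, ?, ?, ?)",
            (market, predicted_prob, market_price,
             datetime.now(timezone.utc).isoformat()),
        )

def hit_rate() -> float:
    """Share of resolved markets where the AI was on the right side of 50%."""
    with sqlite3.connect(DB_PATH) as con:
        rows = con.execute(
            "SELECT predicted_prob, resolved_outcome FROM predictions "
            "WHERE resolved_outcome IS NOT NULL"
        ).fetchall()
    if not rows:
        return 0.0
    correct = sum((prob >= 0.5) == bool(outcome) for prob, outcome in rows)
    return correct / len(rows)
```

a few months of rows like this is all it takes to see which market categories the ai actually reads well and which ones it only guesses at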
this historical record becomes your own personal library of alpha that no one else can buy or steal from you. the markets are constantly evolving and what worked last month might not work today which is why the iteration process never truly ends. the people who win in the long run are the ones who are willing to keep building even when the market is quiet
the wall street firms will eventually catch up and start using these same agent swarms for their prediction market departments. however because we are using open source code we are already three steps ahead of the curve and we can adapt much faster than a slow corporation. the agility of a solo coder with a powerful swarm of ai is the most dangerous thing in the financial world right now
i want to make sure that people understand this isn't just about making one good trade today but about building a system that works for you forever. the peace of mind that comes from knowing a machine is watching your back while you sleep is something that manual trading can never provide. the end of the liquidation era starts when you finally decide to let the machines do the heavy lifting for you
the future of trading is going to be dominated by these types of autonomous agents and the people who start now will have the biggest advantage. there is still so much low hanging fruit in these new prediction markets that even a simple system can find massive inefficiencies. i invite you to grab the code and start making it your own because the only way to fail is to never start at all
tweet
Benjamin Hernandez
The tech sell-off is finally over. Support levels held firm and the rebound was violent. Reset your mind this weekend and trade with peace on Monday.
Trade easier: https://t.co/71FIJId47G
Message "CALM" to join a group that trades easy. $ORCL $AMD $INTC $NVDA $TSM $TCNNF
tweet
A powerful finish to a profitable week for all our members.
We locked in +89.58% on $SMX. The wins continued with $SLNG, $CNI, $WHLR, $FLXS, $BNAI, $DRCT, and $LIF.
Momentum is on our side for next week.
Relax this Saturday.
tweet
Moon Dev
don't buy an expensive computer to host these llms for clawdbot or whatever
your time is worth so much more than that https://t.co/l5Bif9xFSi
tweet
Dimitry Nakhla | Babylon Capital®
Chris Hohn on the importance of recurring revenue streams when looking at businesses:
"It is important, but the predictability of when they recur is not… What's most important for us is essential product or service… We don't like things that are discretionary."
This is a subtle but powerful distinction.
Not all recurring-like revenue is created equal. A SaaS subscription can be canceled. An essential service can be deferred, but not avoided.
___
Think of a toll booth.
If you commute from NJ to NY for work, you have to pay the toll. A snowstorm might pause traffic for a day or two, but it doesn't break the business. Once conditions normalize, cars flow again.
The toll booth keeps collecting, often with pricing power layered on top.
___
That's the kind of "recurrence" Hohn is talking about. It requires patience, but it's incredibly durable.
___
Examples below:
$SPGI & $MCO: Debt issuance can be delayed, but it must eventually be refinanced and rated. These are essential stamps of approval, not discretionary spend.
$ASML: ~25% of revenue comes from services tied to a massive installed base. Once machines are in fabs, service demand is inevitable.
$LRCX: ~35% of revenue comes from customer support and services. The installed base drives repeat economics.
$GE: ~66% of revenue is services. Engines are sold once; maintenance lasts decades.
$ISRG: ~75% of revenue comes from instruments, accessories, and services tied to its installed base of systems.
$FICO: ~60% of revenue comes from scores. Scores drive credit decisions across the economy. Cheap, essential, and deeply embedded.
___
Video: In Good Company | Norges Bank Investment Management (05/14/2025)
tweet
God of Prompt
RT @rryssf_: ICLR 2025 just gave an Outstanding Paper Award to a method that fixes model editing with one line of code
here's the problem it solves:
llms store facts in their parameters. sometimes those facts are wrong or outdated. "model editing" lets you surgically update specific facts without retraining the whole model.
the standard approach: find which parameters encode the fact (using causal tracing), then nudge those parameters to store the new fact.
works great for one edit. but do it a hundred times in sequence and the model starts forgetting everything else. do it a thousand times and it degenerates into repetitive gibberish.
every edit that inserts new knowledge corrupts old knowledge. you're playing whack-a-mole with the model's memory.
AlphaEdit reframes the problem.
instead of asking "how do we update knowledge with less damage?" the authors ask "how do we make edits mathematically invisible to preserved knowledge?"
the trick: before applying any parameter change, project it onto the null space of the preserved knowledge matrix.
in plain english: find the directions in parameter space where you can move freely without affecting anything the model already knows. only move in those directions.
it's like remodeling one room in a house by only touching walls that aren't load-bearing. the rest of the structure doesn't even know anything changed.
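a toy numpy sketch of that projection step, to make the geometry concrete. variable names and shapes are my assumptions, not the paper's implementation; the point is just that the projected update sends every preserved key to (numerically) zero

```python
# illustrative null-space projection (not the authors' code).
# K: columns are keys of facts the edit must NOT disturb, shape (hidden_dim, num_preserved).
# delta: a proposed weight update that acts on keys from the right.
import numpy as np

def null_space_projector(K: np.ndarray, tol: float = 1e-8) -> np.ndarray:
    """Return a matrix P such that P @ K is (numerically) zero."""
    # directions with near-zero singular values are orthogonal to every preserved key
    U, S, _ = np.linalg.svd(K, full_matrices=True)
    rank = int(np.sum(S > tol))
    U_null = U[:, rank:]          # basis of the orthogonal complement of col(K)
    return U_null @ U_null.T      # projector onto that complement

hidden_dim, num_preserved = 64, 10
K = np.random.randn(hidden_dim, num_preserved)    # preserved-knowledge keys
delta = np.random.randn(hidden_dim, hidden_dim)   # proposed edit

P = null_space_projector(K)
delta_projected = delta @ P       # the "one line": project before applying the edit

# preserved keys now see essentially zero change from the edit
print(np.abs(delta_projected @ K).max())   # prints something around 1e-13
```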
the results from Fang et al. across GPT2-XL, GPT-J, and LLaMA3-8B:
> average 36.7% improvement over existing editing methods
> works as a plug-and-play addition to MEMIT, ROME, and others
> models maintain 98.48% of general capabilities after 3,000 sequential edits
> prevents the gibberish collapse that kills other methods at scale
and the implementation is literally one line of code added to existing pipelines.
what i find genuinely elegant: the paper proves mathematically that output remains unchanged when querying preserved knowledge. this isn't "it works better in practice." it's "we can prove it doesn't touch what it shouldn't."
the honest caveats:
largest model tested was LLaMA3-8B. nobody's shown this works at 70B+ scale yet. a follow-up paper (AlphaEdit+) flagged brittleness when new knowledge directly conflicts with preserved knowledge, which is exactly the hardest case in production. and the whole approach assumes causal tracing correctly identifies where facts live, which isn't always clean.
but as a core insight, this is the kind of work that deserves the award. not because it solves everything. because it changes the question.
the era of "edit and pray" for llm knowledge updates might actually be ending.
tweet
Javier Blas
CHART OF THE DAY: With only one month of data missing, US imports of Saudi crude likely fell to a fresh 30-year low in 2025.
According to monthly data from @EIAgov, the Jan-Nov 2025 period averaged 266,000 b/d, down from 274,000 b/d in the full year 2024. https://t.co/kLvAAcoCC8
tweet
App Economy Insights
This Week in Visuals
$AMD $PLTR $UBER $LLY $ABBV $NVS $MRK $NVO $AMGN $PFE $NTDOY $PEP $MDLZ $FTNT $CMG $YUM $PYPL $RBLX $TTWO $HSY $RDDT $TEAM $AFRM $SNAP $NYT $ALGN $MTCH $PTON
https://t.co/LagqbFw2RX
tweet
The Transcript
Analyst: "I hope it's not the death of software because my job might be dead, but that's a whole different conversation."
CFO: "You don't have other skills?"
$SPT https://t.co/DUgpLIS4Nt
tweet
The Transcript
RT @TheTranscript_: $AMZN CEO: "AWS growth continued to accelerate to 24%, the fastest we've seen in 13 quarters, up $2.6 billion quarter over quarter and nearly $7 billion year over year." https://t.co/EOCk99gJ5y
tweet