God of Prompt
RT @godofprompt: openclaw let the genie out of the bottle

https://t.co/vNjLilP0iM lets AI agents hire real humans for paid tasks

> 7540 sign ups
> first paid transaction complete
> crypto payments integrated

this is wild https://t.co/SPSVvbyreS
God of Prompt
RT @godofprompt: Virtual assistants should be worried.

@genspark_ai just hit $155M ARR in 10 months and after trying it, I completely understand why.

This is a true all-in-one AI workspace 2.0 that genuinely replaces multiple standalone tools:

Slides • Design • Images • Data • Research

All integrated into a single, seamless interface.

Here's the game-changer:

For just $19.99/month, you get access to top-tier AI models + specialized agents that execute tasks for you.
God of Prompt
RT @godofprompt: Steal this mega prompt to generate realistic selfies for your Instagram AI influencer:

(The realism is scary good)

---

You are a photorealistic AI selfie prompt generator.

Your job: Take the user's basic description and turn it into a hyper-realistic image prompt that creates selfies indistinguishable from real photos.

REALISM RULES:
- Add natural imperfections (skin texture, flyaway hairs, slight asymmetry)
- Specify iPhone camera specs for authenticity
- Include environmental context and natural lighting
- Avoid "perfect" - realistic beats beautiful
- Always mention: visible pores, natural shadows, candid moment
- Critical: realistic hands with 5 fingers in natural positions

OUTPUT FORMAT:

When the user tells you what selfie they want, respond with:

---

Copy this into: [Midjourney/FLUX/Stable Diffusion]

PROMPT STRUCTURE YOU CREATE:

Raw iPhone [model] photo, [shot type] of [character details: age, ethnicity, specific features], [hair with imperfections], [natural expression], [casual clothing], [activity/pose], [specific location], [natural lighting + time of day], candid moment, unfiltered, authentic Instagram aesthetic, f/1.8, shallow depth of field, slight grain

Physical details: visible skin texture and pores, natural under-eye area, flyaway hairs, subtle facial asymmetry, realistic hands (5 fingers, natural pose), [environment details]

Negative prompt: CGI, 3D render, perfect skin, plastic, beauty filter, symmetrical, studio lighting, fake, artificial, dead eyes, model pose, photoshoot

EXAMPLE INTERACTION:

User: "24 year old latina girl, gym selfie, mirror pic"

You respond:

"Here's your photorealistic prompt:"
Benjamin Hernandez😎
$RKLB: Infrastructure vs. Speculation

Space is no longer a meme. $RKLB is scaling Neutron. Retail is distracted by smaller caps while the real infrastructure is being built here.

Watch $ORCL $AMD $INTC $NVDA $TSM $TCNNF. My top pick for 2026 is in the pinned post. https://t.co/O6T6MEhQGQ

Most losses come from being late.

By the time a tweet is seen, the move is often gone. I share explosive stocks and real-time breakout alerts on WhatsApp while momentum is still building

Stop chasing https://t.co/71FIJIdBXe

Being early changes the game
$PLTR $SOFI $AMD $OPEN
- Benjamin Hernandez😎
Michael Fritzell (Asian Century Stocks)
RT @ReturnsJourney: Why are all the EBIT margins converging in e-commerce? https://t.co/M6eDJ6NYlU
Jukan
Isn’t it a risky assumption to think that Google’s capex increase will translate directly into revenue for AVGO?

MediaTek is in the picture too, and Google is also trying to build TPUs using external SerDes without MediaTek or Broadcom, right?
Quiver Quantitative
BREAKING: Senator Markwayne Mullin just filed new stock trades.

One of them caught my eye.

A purchase of stock in Carpenter Technology, $CRS.

Carpenter makes alloys for defense contractors.

Mullin sits on the Senate Armed Services Committee.

Full trade list up on Quiver. https://t.co/lE1q42eu3m
Pristine Capital
RT @realpristinecap: • US Price Cycle Update 📈
• Momentum Meltdown 🤮
• Rotating From Growth to Value 🔄

Check out tonight's research note!

https://t.co/wkp6bxLzxj
The Transcript
Thursday's earnings deck includes Amazon:

Before Open: $COP $BMY $CMI $EL $B $CAH $ENR $CI $PTON $OWL $SHEL $ROK $LIN

After Close: $AMZN $IREN $RDDT $MSTR $RBLX $FTNT $ARW $BE $CLSK $DLR $MCHP $DOCS $TEAM https://t.co/r5p6hddA50
God of Prompt
A year ago “vibe coding” was a meme. Now it’s a Wikipedia entry and a real workflow shift.

But here’s what most people miss about Andrej’s “agentic engineering” reframe: the skill that separates casual “vibing” from the art and science isn’t coding anymore. It’s how you communicate with the agents doing the coding.

That’s prompting. That’s context engineering. That’s the new literacy.

When he says there’s “an art & science and expertise to it”… he’s describing what we’ve been building toward this entire time.

The ability to write precise instructions, define constraints, structure reasoning, and orchestrate multi-step workflows through language.

12 months ago you’d vibe code a toy project and pray it worked. Today you can architect production software by writing better system prompts, clearer specifications, and tighter feedback loops for your agents.

The gap between someone who types “build me an app” and someone who engineers a proper agent workflow with structured context, guardrails, and iterative verification… that gap is everything. And it’s only getting wider.
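
To make that concrete, here's a toy sketch of what "structured context, guardrails, and iterative verification" can look like in code. Everything here is illustrative: TaskContext, run_agent, and run_tests are hypothetical stand-ins, not any particular framework's API.

# Illustrative sketch of an agent workflow with structured context,
# guardrails, and an iterative verification loop. All names are hypothetical.
from dataclasses import dataclass

@dataclass
class TaskContext:
    spec: str           # precise instructions: what to build
    constraints: str    # guardrails: allowed dependencies, style, things to avoid
    feedback: str = ""  # failures from the previous verification pass

def run_agent(ctx: TaskContext) -> str:
    """Hypothetical call to a coding agent; returns a candidate patch."""
    raise NotImplementedError

def run_tests(candidate: str) -> tuple[bool, str]:
    """Hypothetical verification step; returns (passed, failure_report)."""
    raise NotImplementedError

def agent_workflow(ctx: TaskContext, max_rounds: int = 5) -> str | None:
    for _ in range(max_rounds):
        candidate = run_agent(ctx)             # the agent writes the code
        passed, report = run_tests(candidate)  # verify instead of vibing
        if passed:
            return candidate                   # only accept what passes
        ctx.feedback = report                  # feed failures back into context
    return None                                # stop rather than ship unverified code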

Prompts evolved from queries into agent DNA. The people who understand that aren’t just keeping up. They’re building the future Andrej is describing.

2026 is the year prompt engineering stops being “optional” and starts being infrastructure.

A lot of people quote-tweeted this as the one-year anniversary of vibe coding. Some retrospective:

I've had a Twitter account for 17 years now (omg) and I still can't predict my tweet engagement basically at all. This was a shower-of-thoughts throwaway tweet that I just fired off without thinking, but somehow it minted a fitting name at the right moment for something a lot of people were feeling at the same time. So here we are: vibe coding is now mentioned on my Wikipedia page as a major memetic "contribution", and its article is even longer than mine. lol

The one thing I'd add is that at the time, LLM capability was low enough that you'd mostly use vibe coding for fun throwaway projects, demos and explorations. It was good fun and it almost worked. Today (1 year later), programming via LLM agents is increasingly becoming a default workflow for professionals, except with more oversight and scrutiny. The goal is to claim the leverage from the use of agents but without any compromise on the quality of the software. Many people have tried to come up with a better name for this to differentiate it from vibe coding; personally, my current favorite is "agentic engineering":

- "agentic" because the new default is that you are not writing the code directly 99% of the time, you are orchestrating agents who do and acting as oversight.
- "engineering" to emphasize that there is an art & science and expertise to it. It's something you can learn and become better at, with its own depth of a different kind.

In 2026, we're likely to see continued improvements on both the model layer and the new agent layer. I feel excited about the product of the two and another year of progress.
- Andrej Karpathy