Inside NVIDIA GPUs: Anatomy of high performance matmul kernels - Aleksa Gordić
https://aleksagordic.com/blog/matmul
Aleksagordic
From GPU architecture and PTX/SASS to warp-tiling and deep asynchronous tensor core pipelines.
[2512.18552] Toward Training Superintelligent Software Agents through Self-Play SWE-RL
https://arxiv.org/abs/2512.18552
arXiv.org
While current software agents powered by large language models (LLMs) and agentic reinforcement learning (RL) can boost programmer productivity, their training data (e.g., GitHub issues and pull...
💡 Remember Box
https://github.com/bellard/mquickjs
#todo port to zig and check speed
syntax translation is low leverage; the better play is WASM-compiled mquickjs as a sub-100KB sandboxed JS runtime
I've never felt this much behind as a programmer. The profession is being dramatically refactored as the bits contributed by the programmer are increasingly sparse and between. I have a sense that I could be 10X more powerful if I just properly string together what has become available over the last ~year and a failure to claim the boost feels decidedly like skill issue. There's a new programmable layer of abstraction to master (in addition to the usual layers below) involving agents, subagents, their prompts, contexts, memory, modes, permissions, tools, plugins, skills, hooks, MCP, LSP, slash commands, workflows, IDE integrations, and a need to build an all-encompassing mental model for strengths and pitfalls of fundamentally stochastic, fallible, unintelligible and changing entities suddenly intermingled with what used to be good old fashioned engineering. Clearly some powerful alien tool was handed around except it comes with no manual and everyone has to figure out how to hold it and operate it, while the resulting magnitude 9 earthquake is rocking the profession. Roll up your sleeves to not fall behind.
https://x.com/i/status/2004607146781278521
X (formerly Twitter)
Andrej Karpathy (@karpathy) on X
STARKs, Part I: Proofs with Polynomials
https://vitalik.eth.limo/general/2017/11/09/starks_part_1.html
A Guide to Claude Code 2.0 and getting better at using coding agents | sankalp's blog
https://sankalp.bearblog.dev/my-experience-with-claude-code-20-and-how-to-get-better-at-using-coding-agents/
sankalp's blog
A deep dive into Claude Code 2.0 features, Opus 4.5 workflows, and context engineering. Learn sub-agents, MCP servers, hooks, skills, and practical tips to boost your AI-assisted coding productivity.
Stimulant medications affect arousal and reward, not attention networks: Cell
https://www.cell.com/cell/fulltext/S0092-8674(25)01373-X
Cell
Stimulant medications (e.g., methylphenidate) were thought to improve attention by
acting on the brain's attention networks. Functional connectivity data now reveal
that stimulants are associated with changes in arousal and reward, but not attention
systems…