Wright’s Law Is Eating Everything, The Intelligence Cost Collapse, and The Industrialization Wave
the shape repeats.
Good morning
Today's edition is a longer, more thorough analysis, but one worth reading.
In today’s edition:
Wright’s Law Is Eating Everything
The Intelligence Cost Collapse
Benchmarks Are Saturating (And That’s the Point)
Agents Are the Demand Multiplier
$300 Billion in Capex Says This Isn’t a Bubble
The Binding Constraint Is Watts, Not Weights
The Nuclear Renaissance Is Real
Solar and Batteries: The Other Exponential
The Industrialization Wave
Onwards!
Wright’s Law Is Eating Everything
Contrary Research just dropped their 2026 Tech Trends Report, and it’s 160+ slides of data on where AI, energy, and physical infrastructure are headed. I went through the whole thing. The most useful framing is the shape that keeps repeating across charts: exponential cost deflation creating demand that looked impossible two years ago.
The cost of intelligence is following the same curve that solar panels and lithium-ion batteries followed over the past fifteen years. If you understood Wright’s law in energy, you already know what’s coming in AI.
The Intelligence Cost Collapse
The chart that should anchor every AI investment thesis right now:
GPT-4-class intelligence went from ~$60 per million tokens at launch to roughly $0.07 today. Three orders of magnitude in under two years. No one’s cost model from 2023 survived contact with that curve.
A new commodity is entering the economy on a deflationary trajectory steeper than Moore’s law achieved. When something gets 1000x cheaper, the use cases that “didn’t make economic sense” become viable. Customer support, code review, legal document triage, medical note summarization: these weren’t waiting on better models, they were waiting on cheaper inference.
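As a back-of-the-envelope check on that curve, here is the arithmetic using the $60 and $0.07 figures cited above (the 24-month window is an assumption for illustration):

```python
import math

# Reported GPT-4-class prices per million tokens (figures from the text above)
launch_price = 60.00   # ~$60/M tokens at launch
today_price = 0.07     # roughly $0.07/M tokens today

total_drop = launch_price / today_price        # ~857x cheaper
orders_of_magnitude = math.log10(total_drop)   # ~2.9, i.e. roughly three

# Implied steady monthly decline, assuming a ~24-month window (an assumption)
months = 24
monthly_factor = (today_price / launch_price) ** (1 / months)
monthly_decline = 1 - monthly_factor           # ~24-25% price drop per month

print(f"{total_drop:.0f}x cheaper, ~{orders_of_magnitude:.1f} orders of magnitude")
print(f"implied ~{monthly_decline:.0%} price decline per month")
```

A sustained ~24% monthly price decline is the kind of number that breaks any cost model built on annual planning cycles, which is the point of the section above.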
The standard objection is that cheaper inference means lower margins for model providers. That misses the mechanism. Cheaper inference expands total compute consumed. OpenAI, Anthropic, and Google are all reporting that inference volume is growing faster than price is falling. The revenue pool is expanding, not shrinking.
Benchmarks Are Saturating (And That’s the Point)
MMLU scores went from ~70% to 90%+ across two model generations. HumanEval is nearly maxed out. The benchmark community is scrambling to build harder tests, and the new ones are saturating within months.
Read one way, progress is slowing. Read another: general capability is becoming table stakes, and the differentiation is moving to reliability, latency, and tool use. When every frontier model can pass the bar exam, the bar exam stops being informative. What matters is whether the model can reliably file a motion at 2am without hallucinating a case citation.
This is the same transition that happened in cloud computing around 2015. Raw compute became a commodity, and the competition shifted to managed services, developer experience, and ecosystem. The parallel for AI: the model layer commoditizes, the application and agent layer captures value.
Agents Are the Demand Multiplier
SWE-bench scores tell the story, though I'm generally skeptical of benchmark numbers. Autonomous agents went from solving ~3% of real-world software engineering tasks in early 2024 to over 50% by early 2026. That's an S-curve inflection.
The practical impact is already visible in production. Contrary cites several data points on AI coding adoption:
Google says over 25% of new code is now AI-generated. GitHub Copilot completions are accepted at rates above 30% across millions of developers. Cursor, the AI-native editor, hit $100M ARR faster than almost any dev tool in history.
The headcount implications are real but misread. Companies aren’t firing engineers. They’re holding team sizes flat while shipping 2-3x more. The junior developer role is mutating: less “write boilerplate from scratch” and more “review, prompt, and integrate AI-generated code.” If you’re a founder planning your 2026 engineering budget around 2024 productivity assumptions, you’re overhiring.
On the quality question: AI-generated code shows roughly equivalent defect rates to human-written code when properly reviewed.
$300 Billion in Capex Says This Isn’t a Bubble
The software story is interesting. The atoms story is where the capital is moving now.
Microsoft, Google, Amazon, and Meta are collectively deploying over $300 billion in capital expenditure in 2025-2026, predominantly on AI infrastructure. That number is larger than the GDP of most countries, and larger than what the entire US venture capital market deploys in a year.
These companies have the balance sheets and the unit economics to justify the spend. Microsoft alone is generating enough from Azure AI services to underwrite its capex program, and Google’s TPU infrastructure gives it cost advantages that compound with scale.
The less-covered story is the shift to custom silicon. Google’s TPU v5 and v6, Amazon’s Trainium2, Microsoft’s Maia: all designed to reduce dependence on NVIDIA and optimize for specific inference workloads. NVIDIA still dominates training, but the inference market (where most of the revenue growth lives) is fragmenting.
For investors: NVIDIA’s revenue trajectory is extraordinary, but the margin of safety is narrower than it looks. Custom silicon is a 2-3 year threat to inference market share. The smart money is watching the ratio of NVIDIA training revenue to inference revenue. When that flips, the competitive dynamics change.
The Binding Constraint Is Watts, Not Weights
All of this compute needs power.
US electricity demand was flat from 2005 to 2023. For nearly two decades, efficiency gains offset new demand. That's over. Data centers alone are projected to consume 8-12% of US electricity by 2030, up from roughly 4% today. Total US electricity demand is projected to grow 15-20% in the next five years, the fastest growth rate since the 1970s.
The energy constraint is already showing up in the real estate market. Northern Virginia, the largest data center hub in the world, is hitting grid capacity limits. New projects are being delayed or relocated because the power isn’t available. Contrary reports that some hyperscalers are now buying land based primarily on grid access rather than proximity to users.
Water is the second physical constraint nobody talks about. A single large data center can consume 1-5 million gallons of water per day for cooling. In regions already under water stress (Arizona, parts of Texas), this is becoming a permitting and political risk.
The Nuclear Renaissance Is Real
Constellation Energy is restarting Three Mile Island Unit 1 to supply power directly to Microsoft’s data center operations. The nuclear plant that became synonymous with American fear of atomic energy is coming back online to run AI inference.
Microsoft, Google, and Amazon have all signed or are negotiating power purchase agreements with nuclear operators. The demand signal from hyperscalers has done more to revive the US nuclear industry in two years than decades of policy advocacy.
Small modular reactors (SMRs) are the longer-term bet. Companies like NuScale, Oklo, and Kairos Power are targeting data center operators as anchor customers. The unit economics work if you assume 20-year PPAs at current electricity prices, which the hyperscalers are willing to sign.
For investors, nuclear is no longer a policy bet or a political statement. The customers exist, the willingness to pay exists, and the regulatory environment is shifting (the NRC is streamlining licensing for SMRs). The question is whether these companies can build on time and on budget. Nuclear's historical track record on construction is poor. But the incentive structure is different this time: the buyers are the most creditworthy companies on earth.
Solar and Batteries: The Other Exponential
Solar follows a learning curve of roughly 20% cost reduction per doubling of cumulative installed capacity. That relationship has held for over four decades. Solar is now the cheapest source of new electricity generation in most of the world, and it accounts for over 50% of planned new US grid capacity in 2025.
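Wright's law is simple enough to write down. A minimal sketch, using solar's ~20% learning rate from the text above (the capacity numbers in the example are illustrative, not from the report):

```python
import math

def wrights_law(cost0, cumulative0, cumulative, learning_rate=0.20):
    """Projected unit cost after cumulative capacity grows from cumulative0.

    Each doubling of cumulative installed capacity cuts cost by
    `learning_rate` (solar's historical ~20%, per the text above).
    """
    doublings = math.log2(cumulative / cumulative0)
    return cost0 * (1 - learning_rate) ** doublings

# Illustrative: if cumulative capacity quadruples (two doublings),
# cost falls to 0.8^2 = 64% of its starting value.
print(round(wrights_law(cost0=1.00, cumulative0=1.0, cumulative=4.0), 4))  # 0.64
```

Note that the driver is cumulative deployment, not time. That's why the demand side matters: every gigawatt AI buys pushes the whole industry further down the curve.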
Batteries show the same pattern. Lithium-ion cell costs have fallen from over $1,000/kWh in 2010 to under $100/kWh today. The US is projected to add 67 GW of utility-scale battery storage in the next five years, enough to change how the grid handles peak demand and intermittency.
This is where the loop closes. AI’s demand for energy is driving investment in generation and storage. That investment accelerates cost declines in solar and batteries via Wright’s law. Cheaper energy makes more AI compute economically viable. More compute drives more energy demand. The flywheel is self-reinforcing.
The Industrialization Wave
The report covers autonomous vehicles, and the data is further along than most people realize.
Waymo is doing over 150,000 paid rides per week across its operating cities. In San Francisco, it has majority share of the robotaxi market. It’s expanding to 15+ metropolitan areas. The safety data shows Waymo vehicles are involved in fewer crashes per mile than human drivers.
This is a commercial operation scaling on a predictable curve. The autonomous vehicle “winter” narrative that dominated 2020-2023 was premature.
What This Means If You’re Building or Investing
The common theme across all of these sections: cost deflation in technology is accelerating, and the winners will be companies that treat this deflation as an input to their business model rather than a threat to it.
Specific implications:
If you’re building an AI application: assume inference costs will drop another 10x in the next 18 months. Design your product for an abundance of intelligence, not scarcity. Features that seem too expensive today (real-time audio processing, multi-agent workflows, continuous monitoring) will be viable by the time you ship.
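One way to pressure-test a roadmap against that 10x assumption is to price a token-hungry feature at today's rates and at projected ship-date rates. All numbers below are hypothetical:

```python
# Hypothetical feature: an always-on monitoring agent that burns tokens all day.
tokens_per_user_per_month = 50_000_000  # assumption
price_per_million_today = 0.50          # assumption, $ per million tokens
drop_factor = 10                        # the 10x-in-18-months assumption above

price_per_million_at_ship = price_per_million_today / drop_factor

cost_today = tokens_per_user_per_month / 1_000_000 * price_per_million_today
cost_at_ship = tokens_per_user_per_month / 1_000_000 * price_per_million_at_ship

print(f"per-user inference cost today:   ${cost_today:.2f}/mo")    # $25.00/mo
print(f"per-user inference cost at ship: ${cost_at_ship:.2f}/mo")  # $2.50/mo
```

A feature that looks underwater at $25 per user per month is comfortably viable at $2.50, which is the whole "design for abundance" argument.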
If you’re investing in AI: the model layer is commoditizing fast. The durable value is in distribution (who owns the customer relationship), data moats (proprietary training data and feedback loops), and physical infrastructure (energy, chips, data centers). The application layer will produce enormous companies, but picking winners requires evaluating distribution advantages, not model benchmarks.
If you’re in energy or infrastructure: AI demand is the most significant demand shock to the US power grid in a generation. Every energy technology that can deliver reliable baseload power (nuclear, geothermal, natural gas with CCS) or cheap variable power (solar + storage) has a structural tailwind that will persist for a decade.
If you’re an operator planning headcount: productivity per engineer is increasing 2-3x with AI coding tools. The right move isn’t mass layoffs. It’s raising the bar for what a team of 10 can ship and using the productivity gain to either reduce burn or accelerate roadmap.
The Contrary report is one of the better data-driven snapshots of where we are. The shape that keeps appearing across every section is the Wright's law learning curve. Intelligence, energy, storage, and transportation are all on exponential cost decline curves, and those curves are starting to interact in ways that compound.
Interesting Analysis and Trends
AI, Agents & Infrastructure
Something Big Is Happening LINK
Data Is Your Only Moat LINK
The Post-Model World: Why The System Is The New Moat LINK
Emergent Behavior LINK
AI Is Getting Scary Good at Making Predictions LINK
Research Note: A Simpler AI Timelines Model Predicts 99% AI R&D Automation in ~2032 LINK
Grading AI 2027’s 2025 Predictions LINK
What To Watch in 2026 To Evaluate The AI Bubble LINK
AI Coding, Productivity & Tools
Don’t Waste Your Backpressure LINK
Microsoft Is Using Claude Code Internally While Selling You Copilot LINK
The Human in the Loop LINK
Claude Code: My Workflow Guide LINK
Code Is Cheap LINK
How AI Impacts Skill Formation LINK
How to Build AI Product Sense LINK
Startups, Growth & Product
The Hidden Danger of Shipping Fast LINK
The SaaSacre of 2026 LINK
AI Games Are Coming LINK
Three Dominant Models of Today’s AI Market LINK
Venture, Markets & Ecosystems
The Incompetent Confidence Complex LINK
The State of Fintech in 2026 LINK
The Insurance Stack Is Compressing LINK
Redefining the Mining Value Chain LINK
Meditations
Buckminster Fuller:
You never change things by fighting the existing reality. To change something, build a new model that makes the existing model obsolete.
----
Thank you for your time,
Bartek








