🤖🦾 Behind the Scenes at OpenAI
A Newsletter for Entrepreneurs, Investors, and Computing Geeks
Happy Monday! This week’s deep dive looks behind the scenes at OpenAI. In our spotlights section, we cover Meta’s massive data center ambitions and Mira Murati’s $12B AI debut. We also cover major headlines across AI, semiconductors, quantum, neuromorphic, and cloud, alongside curated readings on compute architectures, photonic innovation, and next-gen materials. As always, there’s a full roundup of funding news. And in our bonus section, we highlight a particularly dense week of scientific breakthroughs.
The Future of Computing Conference is coming to Paris on November 6. After sold-out editions in London and Berlin, we’re bringing together 200 selected founders, researchers, investors, and engineers for a focused, one-day event on computing, semiconductors, AI, quantum, and more.
Organized in partnership with iXcampus, CDL-Paris, HEC Paris, and Elaia, the conference offers in-depth discussions, curated demos, and the chance to connect with others building the future of computing in Europe. Sign up here.
Deep Dive: Behind the Scenes at OpenAI
Source: “Reflections on OpenAI” (Calvin French-Owen) (20 mins)
In one of the most insightful blog posts on OpenAI to date, engineer and Segment co-founder Calvin French-Owen reflects on his year inside the company.
Below are selected takeaways from his post, particularly relevant for anyone building teams (and therefore culture), infrastructure, large-scale software systems, or coding tools. The full post goes much deeper and is well worth your time.
Culture
Hypergrowth: OpenAI scaled from around 1,000 to 3,000 people within a year. The speed of this expansion led to internal breakdowns in reporting structures, hiring processes, and communication flows.
Bottom-Up Structure: Despite its size, OpenAI operates in a decentralized way. Especially in research, small teams often pursue ideas independently, without needing formal approval. Progress tends to come from iteration rather than top-down planning.
Meritocratic Ethos: Promotions are based on execution and contribution rather than visibility or politics. Many leaders are recognized for shipping high-impact work, even if they are not strong presenters or traditional managers.
Secrecy and Silos: Intense external scrutiny means much of the internal work is highly compartmentalized. Sensitive projects are siloed in restricted Slack channels, and access to financial and roadmap information is limited.
Fluid Teams: Team composition changes quickly. Engineers and researchers are often reassigned informally across efforts depending on immediate needs. Decisions are made quickly and new priorities can be pursued without waiting for formal planning cycles.
Not a Monolith: Different teams at OpenAI operate with different goals and mental models. Some view the company as a research lab, others as a consumer tech company, and others as an enterprise platform. These perspectives coexist and shape internal dynamics.
Code and Infrastructure
Tech Stack: Most services are built using FastAPI and Pydantic. There is no enforced style guide across the organization, which allows for speed and flexibility but results in inconsistencies across teams (see the sketch of this pattern after this list).
Monorepo: OpenAI operates on a large Python monorepo, with additional code in Rust and Go. Code quality ranges from production-grade infrastructure to lightweight experimental notebooks.
Code Wins: Development decisions are typically made by the teams that build the systems. This leads to fast execution and high ownership, but also results in duplicate tooling, such as multiple queueing or agent libraries.
Azure-Based Infra: The entire platform runs on Azure. Only a few core services, like AKS, CosmosDB, and BlobStore, are widely relied upon by engineers. Many teams are cautious about the rest of the Azure ecosystem.
Build In-House Philosophy: Due to the absence of equivalents to services like DynamoDB or BigQuery, many infrastructure components are developed internally. There is a strong preference for building rather than buying.
Performance-Driven Scaling: Model training begins with small-scale experiments and scales up when results are promising. GPU planning is driven by latency targets rather than theoretical compute capacity. Each new model generation introduces different usage patterns, which require continuous benchmarking and system tuning (see the back-of-envelope sketch below).
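To make the stack concrete, here is a minimal sketch of the FastAPI-plus-Pydantic pattern the post describes. This is purely illustrative, not OpenAI code; the `CompletionRequest` model and `/complete` route are hypothetical names we made up for the example.

```python
# Minimal FastAPI service with Pydantic request validation.
# Illustrative only: the model fields and route are hypothetical,
# not taken from OpenAI's codebase.
from fastapi import FastAPI
from pydantic import BaseModel, Field

app = FastAPI()

class CompletionRequest(BaseModel):
    prompt: str = Field(..., min_length=1)
    max_tokens: int = Field(256, ge=1, le=4096)

class CompletionResponse(BaseModel):
    text: str

@app.post("/complete", response_model=CompletionResponse)
def complete(req: CompletionRequest) -> CompletionResponse:
    # A real handler would call a model backend here;
    # this one just echoes a truncated prompt.
    return CompletionResponse(text=req.prompt[: req.max_tokens])
```

Pydantic validates the request body at the framework boundary (rejecting, say, a negative `max_tokens` with a 422), which is much of the appeal of this pairing: the schema lives next to the handler, with no style guide required.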
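And here is a back-of-envelope illustration of what latency-driven GPU planning means, as opposed to planning from theoretical compute capacity. Every number below is a made-up placeholder to show the arithmetic, not an OpenAI figure.

```python
# Back-of-envelope GPU sizing from a latency target.
# All numbers are invented placeholders, not OpenAI figures.
peak_requests_per_s = 500   # assumed peak traffic
tokens_per_request = 300    # assumed average output length
target_latency_s = 2.0      # per-request latency budget
gpu_tokens_per_s = 150      # assumed decode throughput of one GPU

# Total tokens the fleet must produce each second at peak:
required_tokens_per_s = peak_requests_per_s * tokens_per_request

# Within the latency budget, one GPU can serve at most this
# many concurrent requests of the assumed length:
max_concurrent_per_gpu = (gpu_tokens_per_s * target_latency_s) / tokens_per_request

gpus_needed = required_tokens_per_s / gpu_tokens_per_s
print(f"GPUs needed for throughput: {gpus_needed:.0f}")
print(f"Concurrent requests per GPU within budget: {max_concurrent_per_gpu:.1f}")
```

The point of the exercise: the latency budget, not raw FLOPs, is what caps how many requests each GPU can serve, and usage patterns (request rate, output length) shift with every model generation, forcing the plan to be redone.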
Spotlights
Meta’s Data Center Ambitions
Meta is going all-in on infrastructure to power frontier AI. The company is building out two massive projects (Hyperion and Prometheus) that signal a serious push to compete with OpenAI, Google, and Anthropic not just on models, but on compute scale.
💥 Mark Zuckerberg says Meta is building a 5GW AI data center (TechCrunch)
Hyperion, a new facility in Louisiana, will scale to 5 GW of compute, enough to train frontier AI models at massive scale. It marks a major step in Meta’s effort to pair infrastructure scale with top-tier talent.
Prometheus, a 1 GW supercluster in Ohio, is expected to come online in 2026. It will be one of the largest AI compute clusters ever built by a tech company, designed to support training and inference for frontier-scale models.
Mira Murati’s $12B AI Debut
“Thinking Machines Lab, the AI startup founded by OpenAI’s former chief technology officer Mira Murati, officially closed a $2 billion seed round led by Andreessen Horowitz on Monday, a company spokesperson told TechCrunch.
The deal, which includes participation from Nvidia, Accel, ServiceNow, Cisco, AMD, and Jane Street, values the startup at $12 billion, the spokesperson said.”
Headlines
This week’s headlines cover major advances in AI capabilities, shifts in semiconductor strategy and production, and new milestones in quantum and neuromorphic computing.
🤖 AI
Meta’s New Superintelligence Lab Is Discussing Major A.I. Strategy Changes (The New York Times)
🦾 Semiconductors
Industry briefings set for 3D heterogeneous integration (3DHI) military microelectronics manufacturing (Military Aerospace)
⚛️ Quantum Computing
Europe’s Chip Moment: Terra Quantum Develops New Transistor Aimed at AI Market (The Quantum Insider)
Cornell and IBM Demonstrate Error-Resistant Quantum Computing Advance (Quantum Zeitgeist)
Study Suggests Today’s Quantum Computers Could Aid Molecular Simulation (The Quantum Insider)
Quantum Neural Networks Show Performance Peaks With Carefully Scaled Bosonic Modes (Quantum Zeitgeist)
Rigetti Computing Nears “Perfect Quantum Computing” with 99.5% Entanglement (Quantum Zeitgeist)
🧠 Neuromorphic Computing
☁️ Cloud
If you’re looking for the usual photonic / optical computing updates, don’t worry. You’ll find them in the bonus section on recent breakthroughs!
Readings
This week’s reading list covers advances in semiconductor design, quantum and neuromorphic computing architectures, and the scaling challenges of next-gen data centers.
🦾 Semiconductors
2D Transistors Could Come Sooner Than Expected (IEEE Spectrum) (3 mins)
Heterogeneous Mobile Processing and Computing Market to Expand (OpenPR) (10 mins)
From Follower to Leader: How Mature Technology ASICs Can Give You the Edge (All About Circuits) (9 mins)
Feature Engineering at Scale: Optimizing ML Models in Semiconductor Manufacturing with NVIDIA CUDA‑X Data Science (NVIDIA Developer Blog) (9 mins)
⚛️ Quantum Computing
Hybrid Computing Unlocks New Frontiers (ScienceBlog) (6 mins)
D‑Wave’s Quantum Leap Draws Investor Hype And Scrutiny (Finimize) (5 mins)
Rotonium: Shaping the Future of Photonic Quantum Edge Computing (Future of Computing) (10 mins). Yes, this is our own interview, and it’s definitely worth a read :-)
⚡️ Photonic / Optical Computing
Why Photonics Will Reshape AI (DevX) (7 mins)
🧠 Neuromorphic Computing
Real-Time Motor Control for Robotics with Neuromorphic Chips (EDN) (6 mins)
Hybrid AI Models Blend Deep Learning with Neuromorphic Ideas (EE Times) (10 mins)
Emerging Low‑Dimensional Perovskite Resistive Switching Memristors: From Fundamentals to Devices (CW39) (8 mins)
Overcoming Epigenetic Therapy Resistance in Cancer Through High‑Throughput Neuromorphic Modeling (Nature Communications) (22 mins)
💥 Data Centers
From Circuits to Scale: Intel’s Path to Exascale (Intel Newsroom) (7 mins)
Funding News
This week’s funding news covers a wave of early-stage quantum rounds and one AI seed round that completely resets the scale.
| Amount | Name | Round | Category |
|---|---|---|---|
| Undisclosed | | | Quantum |
| €1.5M | | | Quantum |
| $2.5M | | | Quantum |
| $4.9M | | | Quantum |
| $21M | | | Semiconductors |
| $35M | | | Cloud |
| $51M | | | Cloud |
| €62M | | | Quantum |
| €70M | | | Semiconductors |
| €80M | | | Quantum |
| $2B | Thinking Machines Lab | Seed | AI |
Bonus: A Week of Breakthroughs
An unusually dense week for scientific progress, spanning quantum chips, optics, materials science, and timekeeping. Here’s a roundup of the most interesting new publications and press releases:
Unlocking Smarter Quantum Computing: Photonics Breakthrough in Error Correction Codes (Embedded.com)
NIST Ion Clock Sets New Record for Most Accurate Clock in the World (NIST)
One Tiny Structure Just Broke a Fundamental Rule of Optics (SciTechDaily)
Love these insights? Forward this newsletter to a friend or two. They can subscribe here.