- xAI plans to let Cursor train its Composer 2.5 coding model using tens of thousands of xAI’s GPUs.
- The deal effectively turns xAI into a cloud provider—one with embarrassingly low GPU utilization at just 11%.
- Cursor, valued at $29 billion and in talks for $50 billion, is xAI’s first external compute customer.
xAI is about to become a landlord. Elon Musk’s AI company plans to rent out tens of thousands of its GPUs to Cursor so the $29 billion coding startup can train Composer 2.5—its next-generation AI coding model, according to Business Insider, citing people familiar with the matter. It’s the first time xAI has offered its compute infrastructure to an outside company.
The arrangement is pragmatic. xAI operates Colossus, one of the largest AI data centers in the world, with roughly 200,000 Nvidia GPUs and plans to expand to 1 million. But running that many chips efficiently is another problem entirely. In a memo to staff last week, xAI president Michael Nicolls called the company's Model FLOPs Utilization (MFU), an industry measure of how effectively GPUs are used during training, "embarrassingly low" at about 11%. Most large-scale AI training runs between 35% and 45%, according to Lambda AI.
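For readers unfamiliar with the metric: MFU is typically computed as the FLOPs a training run actually spends on the model divided by the hardware's theoretical peak. A minimal sketch of that calculation (the model size, throughput, and per-GPU peak below are illustrative assumptions, not xAI's actual numbers):

```python
def mfu(n_params: float, tokens_per_sec: float, n_gpus: int,
        peak_flops_per_gpu: float) -> float:
    """Model FLOPs Utilization: useful model FLOPs / theoretical peak FLOPs.

    Uses the common approximation of ~6 FLOPs per parameter per token
    for a combined forward + backward pass.
    """
    model_flops_per_sec = 6 * n_params * tokens_per_sec
    return model_flops_per_sec / (n_gpus * peak_flops_per_gpu)

# Illustrative scenario: a 70B-parameter model on 1,000 GPUs,
# each with a ~989 TFLOPs dense BF16 peak (H100-class hardware).
utilization = mfu(
    n_params=70e9,
    tokens_per_sec=259_000,   # assumed aggregate training throughput
    n_gpus=1_000,
    peak_flops_per_gpu=989e12,
)
print(f"MFU: {utilization:.0%}")  # ~11%, matching the figure in the memo
```

At these assumed numbers the cluster does useful model work at only about 11% of its rated capacity; the rest is lost to communication stalls, idle time, and other overhead, which is what makes renting out spare capacity attractive.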
Cursor, meanwhile, is in a sprint. The company is in talks to raise $2 billion or more at a $50 billion valuation, Bloomberg reported last month. It released Composer 2 in March—a coding model built on top of Chinese startup Moonshot AI’s Kimi and fine-tuned with proprietary developer usage data. Composer 2.5 is the next iteration, and it needs serious compute to train at the scale Cursor envisions.
Why xAI Is Becoming a Cloud Provider
The GPU rental business isn’t new—Amazon Web Services, Microsoft Azure, and Google Cloud have made billions doing it. CoreWeave and Lambda carved out niches supplying AI-specific compute. But xAI entering the space is different: it’s a model company deciding that idle infrastructure is worse than sharing it.
The move also reflects a deeper personnel connection. In March, xAI hired Cursor’s former product engineering leads, Andrew Milich and Jason Ginsburg. Both now oversee xAI’s product team and report directly to Musk and Nicolls. That’s not a typical vendor-client relationship—it’s a pipeline.
xAI’s infrastructure team has been turbulent. The company lost its infrastructure lead, Heinrich Küttler, last week. SpaceX’s Daniel Dueri stepped in to lead the compute infrastructure team, while Jake Palmer took over physical infrastructure. Selling spare GPU cycles to a partner with familiar faces in leadership is one way to justify the cost of operating a 200,000-GPU facility while the org chart stabilizes.
The AI Coding Arms Race Just Added a Cloud Layer
Cursor isn’t the only company building AI coding tools. Anthropic and OpenAI are both pushing aggressively into coding assistants. GitHub Copilot remains the incumbent. The difference now is that the infrastructure powering these tools is becoming its own revenue stream—not just a cost center.
Musk said during an all-hands meeting last December that xAI would beat competitors because it had access to more power to train its models. Letting Cursor tap that same power turns the bet into a business: xAI keeps developing its own models while generating revenue from compute that was previously burning electricity at 11% efficiency.
xAI’s Colossus facility in Memphis is backed by Tesla batteries and, in Phase 2, is expected to consume 300 megawatts—enough to power roughly 300,000 homes.
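The homes comparison is back-of-the-envelope arithmetic. Assuming an average household draws roughly 1 kW of continuous power (about 8,760 kWh per year, in line with US residential averages; the exact figure is an assumption, not from the article):

```python
facility_mw = 300          # Colossus Phase 2 expected draw, per the article
avg_home_kw = 1.0          # assumed average continuous household draw

homes = facility_mw * 1_000 / avg_home_kw
print(f"{homes:,.0f} homes")  # 300,000 homes
```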