Orbital Computing Is Finally Getting Real, and Earth's Data Center Ban Might Speed It Up

For years, the pitch was irresistible: data centers in space. It sounded like science fiction. SpaceX and Blue Origin bought in. Startups raised hundreds of millions on the concept. But there was always a catch. For all the hype about orbital infrastructure, very few GPUs were actually in orbit doing useful work.

That’s starting to change, and it’s not happening the way most people expected.

The Infrastructure Play Nobody Saw Coming

This past January, Kepler Communications, a Canadian satellite company, launched what is currently the largest compute cluster in orbit. The specs sound modest compared to terrestrial data centers: about 40 Nvidia Orin edge processors spread across 10 satellites, all connected via laser links. But modest is the point.

According to TechCrunch reporting, Kepler now has 18 customers and just announced a partnership with Sophia Space, a startup that will test its custom operating system on Kepler’s constellation. This isn’t flashy. It’s the kind of infrastructure play that actually works.

The difference matters. While SpaceX and Blue Origin are chasing the dream of full-scale data centers with serious compute power, Kepler is solving an immediate problem: processing data where it’s collected, not sending everything back to Earth.

Edge Processing Changes the Game

The appeal of edge processing in orbit is straightforward but powerful. Military synthetic aperture radar systems, satellite imagery analysis, drone telemetry from ships at sea—all of these generate enormous amounts of data. Sending that data back to Earth takes time and bandwidth. Processing it in orbit, where it’s collected, is faster and more efficient.
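As a rough illustration of that trade-off, here is a back-of-envelope comparison. Every figure below is an invented assumption for the sake of the arithmetic, not a real mission number:

```python
# Back-of-envelope: why processing in orbit can beat downlinking raw data.
# All figures below are illustrative assumptions, not published specs.

RAW_SCENE_GB = 8.0      # assumed raw SAR scene size
PROCESSED_MB = 40.0     # assumed size of detections/metadata after processing
DOWNLINK_MBPS = 300.0   # assumed ground-station downlink rate (megabits/s)
ONBOARD_PROC_S = 120.0  # assumed on-orbit processing time per scene

def downlink_seconds(size_mb: float, rate_mbps: float) -> float:
    """Time to downlink size_mb megabytes at rate_mbps megabits per second."""
    return size_mb * 8 / rate_mbps

# Option A: ship the whole raw scene to the ground.
raw_s = downlink_seconds(RAW_SCENE_GB * 1024, DOWNLINK_MBPS)

# Option B: process on the satellite, downlink only the results.
edge_s = ONBOARD_PROC_S + downlink_seconds(PROCESSED_MB, DOWNLINK_MBPS)

print(f"Downlink raw scene:          {raw_s:7.1f} s")
print(f"Process in orbit + results:  {edge_s:7.1f} s")
```

Under these assumed numbers, on-orbit processing roughly halves the time to get an answer on the ground, and the gap widens as scenes get larger or downlink windows get shorter.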

This is where the partnership between Kepler and Sophia becomes interesting. Sophia specializes in passively cooled space computers, an approach that tackles one of orbital computing's thorniest problems: heat. Active cooling systems in space are heavy and expensive. Passive cooling changes the equation.

When Sophia uploads its operating system to one of Kepler's satellites and attempts to distribute it across six GPUs on two separate spacecraft, it will be doing something that's routine on Earth but has never been attempted in orbit. It sounds technical. It is. But it's also a crucial de-risking step for Sophia ahead of its planned first satellite launch in late 2027.
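In software terms, "distributing work across six GPUs on two spacecraft" can be sketched as fanning tasks out over a small worker pool. The topology, names, and round-robin scheme below are entirely hypothetical; nothing here reflects Kepler's or Sophia's actual design:

```python
# Hypothetical sketch: round-robin task assignment across GPUs on two
# spacecraft. Names and topology are invented for illustration only.
from itertools import cycle

# Two spacecraft, three GPUs each: six workers total (assumed layout).
WORKERS = [f"sat{s}/gpu{g}" for s in (1, 2) for g in range(3)]

def assign(tasks: list[str]) -> dict[str, list[str]]:
    """Round-robin tasks across the worker pool; returns {worker: [tasks]}."""
    plan = {w: [] for w in WORKERS}
    for task, worker in zip(tasks, cycle(WORKERS)):
        plan[worker].append(task)
    return plan

plan = assign([f"tile-{i:03d}" for i in range(10)])
for worker, tiles in plan.items():
    print(worker, tiles)
```

The hard part in orbit isn't the scheduling logic, which is trivial; it's that the laser links between spacecraft become the cluster interconnect, with latency, pointing, and availability constraints no terrestrial scheduler has to worry about.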

For Kepler, the partnership proves something harder to quantify: that third-party software can actually work on their infrastructure. As the sector matures, Kepler expects to become a networking and processing layer for other satellite operators, not just a carrier service.

When Earth’s Rules Start Driving Space Economics

Here’s where things get genuinely strange. Sophia CEO Rob DeMillo mentioned last week that Wisconsin adopted a ban on new data center construction. Congress has similar ideas brewing. This is normally the kind of regulatory move that would be dismissed as NIMBYism or Luddite thinking.

In this case, it might actually accelerate space-based alternatives.

“There’s no more data centers in this country,” DeMillo mused to TechCrunch. “It’s gonna get weird from here.”

He has a point, even if it sounds absurd. If jurisdictions keep tightening restrictions on terrestrial data centers, the financial case for orbital infrastructure gets stronger almost by accident. It’s not that space suddenly becomes cheap. It’s that Earth becomes restricted.

That’s a peculiar dynamic. The technology industry has spent decades assuming it would build data centers wherever economics dictated. If that option starts disappearing, the industry might not wait for the perfect orbital solution. It might take what works well enough.

The Real Timeline

Most experts still don’t expect large-scale space data centers like those envisioned by the big names until the 2030s. That timeline assumes things go smoothly. It assumes passive cooling works at scale. It assumes space launch costs keep dropping.

What we’re seeing now is the groundwork. Kepler’s network. Sophia’s thermal engineering. Military adoption of space-to-air laser links. These are the unglamorous building blocks that actual infrastructure requires.

Kepler CEO Mina Mitry is clear about the vision: distributed GPUs running inference work 100% of the time, not occasional training workloads that consume power while sitting idle. It’s the opposite of the “superpower GPU” approach other companies are pursuing.

“If this thing consumes kilowatts of power and you’re only running at 10% of the time, then that’s not super helpful,” Mitry told TechCrunch.
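Rough arithmetic makes Mitry's point concrete. A satellite bus must be sized (solar arrays, batteries, radiators) for the GPU's full draw whether or not it is busy, so idle time directly inflates the energy cost per unit of work. All numbers below are illustrative assumptions, not real hardware figures:

```python
# Illustrative utilization arithmetic; all numbers are invented assumptions.
# The platform supplies full power continuously, so idle time is pure waste.

def joules_per_task(power_w: float, duty_cycle: float,
                    tasks_per_busy_hour: float) -> float:
    """Platform energy over one hour divided by tasks actually completed."""
    tasks_done = tasks_per_busy_hour * duty_cycle
    return power_w * 3600.0 / tasks_done

# A multi-kilowatt GPU busy 10% of the time vs. a small edge GPU kept busy.
burst = joules_per_task(power_w=2000.0, duty_cycle=0.10,
                        tasks_per_busy_hour=10_000)
steady = joules_per_task(power_w=60.0, duty_cycle=1.00,
                         tasks_per_busy_hour=1_000)

print(f"Burst cluster: {burst:6.0f} J/task")
print(f"Steady edge:   {steady:6.0f} J/task")
```

Under these assumptions the big, mostly idle GPU costs over thirty times more energy per task, which is exactly why Kepler is betting on small processors running inference continuously.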

That’s a useful reality check. Orbital compute isn’t going to work like terrestrial data centers. It can’t. The economics are different, the thermal constraints are different, the launch costs are real. Companies chasing that model are likely building expensive monuments to poor assumptions.

The ones thinking about this as edge infrastructure, as distributed processing, as a networking layer between ground and space? They’re the ones who might actually build something that lasts beyond the initial investor enthusiasm.

Regulation was supposed to constrain tech development. Sometimes it just redirects it toward the places regulators never imagined.

Written by

Adam Makins

I’m a published content creator, brand copywriter, photographer, and social media manager. I help brands connect with their customers by developing engaging content that entertains, educates, and offers value to their audience.