In one of my favorite films of all time, The Matrix, people become the power source that keeps the machines alive.
Elon Musk must have watched that film recently, because he just pitched a similar idea. Except he wants idle machines to power the future of intelligence, not the other way around.
On Tesla’s recent third-quarter earnings call, Musk floated this wild idea:
Actually, one of the things I thought, if we’ve got all these cars that maybe are bored, while they’re kind of, if they’re bored, we could actually have a giant distributed inference fleet and say, if they’re not actively driving, let’s just have a giant distributed inference fleet.
Translation: every idle Tesla could soon act as a node in a massive AI network. Tens of millions of parked cars, thinking together.
But how would Elon’s mobile supercomputer work?
That’s where things get really interesting…
A Fleet That Thinks
Estimates vary, but as of 2024, there were around 5 million Teslas on the road worldwide.
Elon Musk has much bigger plans, predicting the fleet could eventually total 100 million vehicles.
Here’s what he said during Tesla’s recent earnings call:
At some point, if you’ve got tens of millions of cars in the fleet, or maybe at some point 100 million cars in the fleet, and let’s say they had at that point, I don’t know, a kilowatt of inference capability, of high-performance inference capability, that’s 100 gigawatts of inference distributed with power and cooling taken, with cooling and power conversion taken care of. That seems like a pretty significant asset.
In other words, 100 million Teslas, each capable of about one kilowatt of high-performance inference.
That works out to roughly 100 gigawatts of compute power.
To put that in perspective, 100 gigawatts is close to the combined output of 100 nuclear reactors, or enough electricity to power 75 million U.S. homes.
A single hyperscale data center from Amazon Web Services or Google Cloud can draw 50 to 100 megawatts of power. You’d need around 1,000 of those to match Musk’s theoretical 100-gigawatt network.
And all that potential computing power would already be built, paid for and sitting in driveways.
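Musk’s math is easy to sanity-check. Here’s a quick back-of-envelope calculation in Python using the figures above (the 100 MW data-center draw is the top of the 50-to-100 MW range mentioned earlier):

```python
# Back-of-envelope check of Musk's fleet-compute numbers.
# All figures come from the article itself.

FLEET_SIZE = 100_000_000       # hypothetical future fleet, per Musk
PER_CAR_INFERENCE_KW = 1       # ~1 kW of inference capability per car

# Total fleet capacity: kilowatts -> gigawatts
fleet_gw = FLEET_SIZE * PER_CAR_INFERENCE_KW / 1_000_000
print(f"Fleet inference capacity: {fleet_gw:.0f} GW")  # 100 GW

# A single hyperscale data center draws roughly 50-100 MW.
DATACENTER_MW = 100
equivalent_datacenters = fleet_gw * 1_000 / DATACENTER_MW
print(f"Equivalent 100 MW data centers: {equivalent_datacenters:,.0f}")  # 1,000
```

The numbers check out: 100 million cars at one kilowatt each is 100 gigawatts, or about a thousand hyperscale data centers’ worth of draw.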
Image: Tesla
Tesla’s full-self-driving computer, known as Hardware 4, is designed to approach the kind of performance seen in high-end data center chips.
And a next-generation system called AI5 is in development that could deliver several times more processing power, giving every Tesla the kind of onboard compute once reserved for data centers.
What’s more, every car already contains a high-performance processor and power system capable of running complex AI tasks. Each already has a built-in thermal-management system that keeps chips cool and batteries balanced. And every car is connected to Tesla’s cloud through the same over-the-air update network that pushes new software and maps.
The difference is that, unlike a server rack, these systems spend most of their time doing nothing. Because the average car sits parked 95% of the day.
So Musk’s pitch is simple: Let’s put those idle processors to work.
If you could borrow a little bit of energy and compute from every parked Tesla, you could form a global computing grid that would make today’s cloud networks look far too centralized and inefficient by comparison.
Need to run an image-recognition model, simulate an autonomous-driving scenario or process video data?
Tesla could parcel out those jobs across millions of cars overnight.
This would give Tesla a potential moat that no other automaker, or cloud company, could easily match.
After all, GM and Ford don’t have proprietary chips like the AI5 in their cars. And Amazon doesn’t have 5 million connected vehicles plugged into its cloud.
It would also help shift AI from centralized supercomputers to distributed inference. That’s the same kind of edge computing model that powers smartphones, drones and industrial robots today.
Because in this scenario, the network wouldn’t need to exist in one central place.
It would live wherever a Tesla is parked.
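To make the dispatch idea concrete, here’s a minimal sketch in Python of how a central scheduler might parcel jobs out only to idle, healthy, connected cars. This is purely illustrative: every class, field and threshold here is hypothetical, and Tesla hasn’t described an actual implementation.

```python
from collections import deque
from dataclasses import dataclass

@dataclass
class Car:
    """Hypothetical fleet node; all fields are illustrative assumptions."""
    vin: str
    parked: bool
    battery_pct: float
    online: bool

    def eligible(self) -> bool:
        # Only borrow compute from parked, connected cars with battery headroom.
        return self.parked and self.online and self.battery_pct > 50.0

def dispatch(jobs: deque, fleet: list[Car]) -> dict[str, list[str]]:
    """Round-robin inference jobs across eligible cars; returns vin -> jobs."""
    workers = [car for car in fleet if car.eligible()]
    assignments = {car.vin: [] for car in workers}
    i = 0
    while jobs and workers:
        car = workers[i % len(workers)]
        assignments[car.vin].append(jobs.popleft())
        i += 1
    return assignments

fleet = [
    Car("VIN001", parked=True,  battery_pct=80.0, online=True),
    Car("VIN002", parked=False, battery_pct=90.0, online=True),  # driving: skipped
    Car("VIN003", parked=True,  battery_pct=30.0, online=True),  # low battery: skipped
    Car("VIN004", parked=True,  battery_pct=70.0, online=True),
]
result = dispatch(deque(["job-a", "job-b", "job-c"]), fleet)
print(result)  # {'VIN001': ['job-a', 'job-c'], 'VIN004': ['job-b']}
```

The eligibility check is the interesting design choice: it’s what would keep the network from draining batteries or touching cars in motion, which is exactly the kind of safeguard the hurdles below would demand.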
Here’s My Take
If Musk can actually execute on this wild idea, Tesla’s fleet could rival the largest AI compute clusters on Earth.
But there are hurdles to clear before it could become reality.
Running inference jobs on car batteries could shorten their lifespan if they aren’t managed carefully.
Some owners might refuse to allow their car to be used for Tesla’s compute work, even if they’re compensated. And data-privacy laws in Europe and California would require consent and transparency.
But Tesla already has experience orchestrating massive distributed systems. Every time it updates Autopilot or trains new vision models, it collects and processes video data from millions of cars worldwide.
The difference here is that Musk would want the Tesla fleet not just to train AI, but to run it.
In this future, Tesla’s cars would stop being just vehicles and start acting as mobile computing assets. Owners might opt in through software, allowing their vehicles to rent out compute cycles while parked, earning credits or cash in return.
For Tesla, it would be an entirely new revenue stream layered on top of the existing fleet. And like Musk’s robotaxi business, it would scale automatically.
Because every new car sold would expand the network’s computing power.
It’s a radical idea. And it would represent a radical shift for the company. If Tesla can pull it off, Musk could end up running the world’s most powerful, most distributed AI network…
Without ever building a data center.
Regards,

Ian King
Chief Strategist, Banyan Hill Publishing
Editor’s Note: We’d love to hear from you!
If you want to share your thoughts or suggestions about the Daily Disruptor, or if there are any specific topics you’d like us to cover, just send an email to [email protected].
Don’t worry, we won’t reveal your full name in the event we publish a response. So feel free to comment away!













