The Year Ahead for AI + DePIN
Note: the following report is broken down into two sections: Part I is focused on AI. Part II is focused on DePIN.
The two sections are married in the middle by a section on data, applicable to both. Reading them together is encouraged, but given the report’s length, readers should feel free to treat this as two separate reports.
PART I: AI
Macro
AI is the defining technology of our age. And as we enter 2025, humanity stands on the precipice of great change. We are witness to nothing less than an evolutionary shift — one from biological to silicon minds. So far, things have gone smoothly. Most remain blissfully unaware of just how different the world will look in a few years’ time. But for those who live and breathe this space, the change can be jarring and hard to process. We hope this report brings a measure of clarity to what is perhaps the most profound moment in human history.

If 2023 was the year AI broke the Internet, then 2024 was the year AI broke financial markets. From Nvidia becoming the most valuable company on Earth to OpenAI eclipsing a $157 billion valuation to Elon Musk raising $6 billion for the one-year-old xAI, markets have been captivated by the promise of artificial intelligence to a degree unseen in a generation.
All the speculation has created immense wealth but also drawn warnings of a mania. The critics claim AI is hot air; some even call it the next Dot Com bubble. But we see things differently. We believe AI has two fundamental ingredients that distinguish it from previous boom and bust cycles:
- capex
- generality
AI differs from recent computing waves like mobile and social media in that it requires a massive amount of physical infrastructure. To ‘make AI’ you first need a lot of electricity, chips, and interconnects before you can even start. This requires building physical things in the real world. It’s a vastly different paradigm than mobile, where devs built Flappy Bird apps, or social media, where the only real ‘building’ was follower counts. Society has grown used to operating in the world of bits, but AI is making atoms great again.

The upshot for markets is that AI will drive wealth creation across a much broader swath of society than in previous technology cycles. With social media, only the engineers in San Francisco and influencers in New York benefitted. But with AI, the local electrician benefits, the concrete guy benefits, and the mom-and-pop HVAC shop benefits. AI’s insatiable hunger for physical infra is driving a CAPEX boom across the country, and the spending looks set to accelerate from here.
some highlights
> wants to build 5GW data center
> 5GW = the entire city of LA
> wants to build 5-7 of them
> that’s 5-7 new LAs to the grid
> 1) what https://t.co/vZKpbRkgvS
— mrink0 (@mrink0) September 27, 2024
The second reason why AI will be singular in its impact comes down to the very nature of artificial intelligence. At a high level, there are two kinds of technologies. The first is an iterative update to the world — like moving from the iPhone 15 to iPhone 16. These ‘updates’ often introduce some cool new tech but rarely change anything in a meaningful way.
Steve wouldn’t have shipped that ad. It would have pained him too much to watch.
— Paul Graham (@paulg) May 8, 2024
The second set of technologies are the ones that deliver dramatic cost declines, impact many industries and geographies, and serve as a platform for future innovation. Economic historians refer to these as General Purpose Technologies (GPTs).

Electricity is considered a GPT since it provided a discontinuous reduction in the cost to generate, transmit, and deploy power. Adoption was widespread across sectors, with applications at both the business and the consumer levels that inspired many other innovations to be built on top of them. Similarly, artificial intelligence is accelerating computational capabilities far more rapidly than expected, with ramifications for every industry, including multi-trillion-dollar innovations beyond what we can imagine now.
In plain English — AI will create orders of magnitude more wealth than past technology waves due to its ability to impact everything, everywhere, all at once. It will transform computing, medicine, manufacturing, agriculture, science, and the rest of the world with it. The economic ripple effects of such a profound change are hard to fathom and impossible to predict. But we can safely say we are not living through another pets.com fractal.
productivity growth is 3.4%
“we had averaged just 1.1 percent in the decade prior to the pandemic” https://t.co/zpwORPSXb2
— roon (@tszzl) August 23, 2024
However, despite our long-term bullishness, several open questions facing the AI industry will determine whether we continue climbing the exponential or pause and consolidate for a while. As we enter 2025, we have five big questions:
- is pretraining hitting a wall?
- do the scaling laws hold for test time compute?
- how do we solve reasoning?
- when will the nation-state race begin?
- what’s the next big breakthrough?
We will highlight each of these below and briefly share our own color. But it’s important to note that no one — not OpenAI and certainly not us — can definitively answer any of these questions. Only the god machine can see that far ahead.

Ever since Google’s Attention Is All You Need paper in 2017, the Transformer architecture has slowly eaten the AI space. It’s now the most popular paradigm across modalities like text, code, and audio.

Even Tesla recently revealed that they switched to transformers to help with FSD’s lane segmentation.

The popularity of transformers boils down to their self-attention mechanism, which lets them process inputs in parallel rather than sequentially. This makes them far more scalable than other architectures. You can literally just throw compute at them, and they get better. It’s perhaps the closest we’ve ever come to alchemy.
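For the technically curious, below is a minimal sketch of the scaled dot-product self-attention at the heart of the transformer (plain NumPy, single head, no masking, illustrative shapes only). The thing to notice is that every token attends to every other token in one matrix multiply, which is what makes the architecture so parallelizable.

```python
import numpy as np

def self_attention(X, Wq, Wk, Wv):
    """Minimal scaled dot-product self-attention (single head, no mask).

    X: (seq_len, d_model) token embeddings. The whole sequence is processed
    in parallel rather than one token at a time, which is why transformers
    scale so well on modern accelerators.
    """
    Q, K, V = X @ Wq, X @ Wk, X @ Wv          # project inputs to queries/keys/values
    scores = Q @ K.T / np.sqrt(K.shape[-1])   # every token scores every other token
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)   # softmax over the sequence
    return weights @ V                        # weighted mix of value vectors

# toy usage: 4 tokens, 8-dimensional embeddings (shapes are illustrative)
rng = np.random.default_rng(0)
X = rng.normal(size=(4, 8))
Wq, Wk, Wv = (rng.normal(size=(8, 8)) for _ in range(3))
print(self_attention(X, Wq, Wk, Wv).shape)  # (4, 8)
```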
Sora performance scales with compute pic.twitter.com/ceuQYcSXT2
— Tsarathustra (@tsarnick) February 16, 2024
After seeing how well transformers scale, researchers at OpenAI popularized the concept of the “scaling laws.”

These ‘laws’ provocatively make the case that bigger models = better models. The obvious implication is that we can just scale our way to AGI.
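As a rough illustration of what these ‘laws’ actually say, the sketch below evaluates the Chinchilla-style functional form from Hoffmann et al., where loss falls as a power law in parameter count and training tokens. The constants are approximately the published fit, but treat the numbers as illustrative rather than gospel.

```python
# Chinchilla-style scaling law: loss falls as a power law in model size N
# (parameters) and dataset size D (tokens). Constants roughly follow the
# published Hoffmann et al. fit; treat them as illustrative.
def expected_loss(N, D, E=1.69, A=406.4, B=410.7, alpha=0.34, beta=0.28):
    return E + A / N**alpha + B / D**beta

for N, D in [(1e9, 2e10), (1e10, 2e11), (1e11, 2e12)]:
    print(f"N={N:.0e} params, D={D:.0e} tokens -> loss ~ {expected_loss(N, D):.2f}")
# Bigger N and D -> monotonically lower loss, i.e. "bigger models = better models."
```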
do you own the means of production anon? the digital oceans where leviathans swim
— roon (@tszzl) February 17, 2024
For a few years, the scaling laws were considered Gospel. GPT-2 through GPT-4 were held up as proof of this relationship between scale and capabilities.

The fact that we can just make models bigger and almost magically get better performance has led to a race for scale. It has driven labs like OpenAI and Anthropic to build bigger and bigger clusters in order to train larger and larger models.

But cracks, or perhaps more accurately constraints, have begun to emerge in the scaling laws. The pretraining process — where models lazily train off the entire Internet — requires a massive amount of inputs, most notably electricity. This poses a problem for the grid, which has grown accustomed to ~5% growth over the past decade. Our infrastructure is simply not ready for electricity demand to double over the coming decade.

The hyperscalers are attacking this problem in different ways. Amazon, Google, and Microsoft — which signed climate pledges years ago that hold them to specific emissions targets — are racing to lock down nuclear power. But the US nuclear industry has been in perpetual decline since the Cold War, and most of these plants will take years, if not decades, to come back online.

Labs like xAI, which are not bound by the same environmental commitments, have resorted to burning natural gas in order to quench the thirst of their GPUs.

The high-stakes game of locking down power highlights the challenge that lies ahead. If scale is really all you need, then we will need a lot more power soon. But where will it come from? Renewables are suboptimal because data centers require a steady stream of electricity that doesn’t vary based on the wind blowing or sun shining. And nuclear, while promising, is in no state to scale up 10x any time soon. The obvious answer, at least for the US, seems to be natural gas. But it’s an open question whether the labs have the political stomach to start boiling the oceans in pursuit of silicon supremacy.

Aside from the question of “where will the power come from,” AI labs are also contending with other constraints imposed by the pretraining process. Data is the biggest. During pretraining, models consume vast amounts of data, learning connections between words and modeling complex relationships across trillions of tokens. In the early days, models would simply read the Internet. But we’ve reached the point where even this massive corpus of text is insufficient to yield further performance gains.

Text Is the Universal Interface
Yet again, the leading labs are trying to solve this problem in similar, albeit distinct ways. OpenAI has been the most proactive about partnering with legacy media outlets to gain access to their data. This approach seems to be, at best, a temporary solution as companies like News Corp, TIME, and others only have so much data to share.

Google appears to have a clear advantage on the data front since it can tap YouTube, Gmail, and all its other honeypots. But it’s been rumored that OpenAI and others already trained off these sources, despite it being against the terms of service, so it’s unclear how much of an advantage this actually is. Other players like xAI can also access real-time data via platforms like Twitter. But again, most of this data isn’t the high-quality kind they need to keep pushing the frontier.

Another strategy that’s being pursued is synthetic data. This approach typically involves a larger model generating data that’s used to train a smaller model. We’ve seen some indications this method is kinda working. Anthropic appears to have used Opus, their largest model, to generate training data for Claude Sonnet, their mid-sized model. And OpenAI is reportedly using o1 to create synthetic data for GPT-5/Orion.
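To make the idea concrete, here’s a hedged sketch of what “a larger model generating training data for a smaller one” might look like in practice. The model name, prompt, and parsing below are placeholders; none of the labs have published their actual pipelines.

```python
# Hypothetical sketch: use a large "teacher" model to generate synthetic
# question/answer pairs for a smaller "student" model. Model names, prompts,
# and parsing are illustrative -- real lab pipelines are not public.
from openai import OpenAI

client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment

def generate_synthetic_pair(topic: str) -> dict:
    resp = client.chat.completions.create(
        model="gpt-4o",  # stand-in for whatever large teacher model a lab uses
        messages=[{
            "role": "user",
            "content": f"Write a challenging question about {topic}, "
                       f"then a step-by-step answer. Label them Q: and A:",
        }],
    )
    text = resp.choices[0].message.content
    question, _, answer = text.partition("A:")
    return {"prompt": question.removeprefix("Q:").strip(), "completion": answer.strip()}

# A real pipeline would aggressively filter and deduplicate these pairs
# before fine-tuning the smaller student model on them.
dataset = [generate_synthetic_pair(t) for t in ["thermodynamics", "number theory"]]
```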
Claude 3 was trained on synthetic data (“data we generate internally”).
Fairly clear that compute is the bottleneck given that parameter count and data can be scaled. https://t.co/SKxe6qOQrD pic.twitter.com/0H9EqZOoXp
— Justin Halford (@Justin_Halford_) March 4, 2024
However, just the other day, Ilya Sutskever — one of the fathers of the scaling laws — poured cold water on current techniques, saying that “pretraining as we know it will end” and that synthetic data, in its current state, is still an unsolved problem. So it remains to be seen how these labs will overcome the impending walls of the pretraining paradigm.
Ilya has so much fucking Aura
There is no one in AI that comes close. The GOAT says something like “data is the fossil fuel of AI” and everyone instantly agrees. pic.twitter.com/O4ddKiC5OK
— Lisan al Gaib (@scaling01) December 13, 2024
Our view is that these issues are constraints, not unsolvable problems. Scaling energy production requires human coordination, capital, and political will, while scaling data can probably only be solved through new techniques like synthetic data or more efficient algorithms. Still, we tend to agree with Ilya: pretraining does, indeed, appear to be plateauing, and the field can no longer rely solely on the “bigger is better” formula that got us this far.
One of the best points that someone made to me once was this:
“Humans generalize on far less data than AI currently does. That means there’s something our brains are doing algorithmically to do far more with far less data. Until we figure out that paradigm, we are no where near… https://t.co/IoaRRQv28y
— David Shapiro ⏩ (@DaveShapi) December 13, 2024
Our next key question heading into 2025 centers around test time compute. Historically, every time you asked ChatGPT a question, it took the same amount of time and resources to answer. This never made much sense because, obviously, some questions are harder than others. If you were to ask a human, “What’s 1 + 1?” and then ask, “What’s the meaning of life?”, the latter would require far more time and brain cycles to answer. But right now, models devote the same amount of resources to both.
However, OpenAI’s o1 model uses a technique called test time compute, which lets models ‘ponder.’ This is a critical breakthrough that gives AI time to ‘think’ about harder questions like “What’s the meaning of life?” before answering. Aside from achieving SOTA-level reasoning on key benchmarks, this approach also appears to have unlocked a new scaling paradigm.
o1 is trained with RL to “think” before responding via a private chain of thought. The longer it thinks, the better it does on reasoning tasks. This opens up a new dimension for scaling. We’re no longer bottlenecked by pretraining. We can now scale inference compute too. pic.twitter.com/niqRO9hhg1
— Noam Brown (@polynoamial) September 12, 2024
Instead of solely relying on larger clusters, o1 suggests we can eke out further performance gains by letting models think for longer. This approach is more economical and could help labs sidestep some of the scaling constraints we mentioned above. While it’s still early days, o1 and its broader implications have generated enormous excitement within the field. It’s probably the single biggest breakthrough of 2024.
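o1’s exact recipe isn’t public, but one simple, well-known flavor of test time compute is self-consistency: sample several chain-of-thought answers and majority-vote on the final answer, so more samples means more “thinking” and, typically, better accuracy on reasoning tasks. A minimal sketch, with a placeholder model name:

```python
# Illustrative only: o1's actual training and inference approach is not public.
# Self-consistency is one public technique for spending more compute at
# inference time: sample N chain-of-thought answers and keep the most common
# final answer. Larger N = more inference compute = usually better accuracy.
from collections import Counter
from openai import OpenAI

client = OpenAI()

def answer_with_budget(question: str, n_samples: int = 8) -> str:
    finals = []
    for _ in range(n_samples):  # each sample is extra "thinking" compute
        resp = client.chat.completions.create(
            model="gpt-4o-mini",  # placeholder model name
            temperature=1.0,
            messages=[{"role": "user",
                       "content": f"{question}\nThink step by step, then give the "
                                  f"final answer on a line starting with 'ANSWER:'"}],
        )
        text = resp.choices[0].message.content
        finals.append(text.rsplit("ANSWER:", 1)[-1].strip())
    return Counter(finals).most_common(1)[0][0]  # majority vote over samples
```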
Anyone who still believes that the development of AI has reached its limits should watch this video excerpt from Noam Brown (OpenAI).
Sometimes I get the feeling people arent listening closely.
Noam Brown: “We saw that like yeah it, once it’s able to think for longer it um… pic.twitter.com/cwqFyX8Eto
— Chubby♨️ (@kimmonismus) November 12, 2024
We believe test time compute will be a key area to watch next year. OpenAI has a clear lead, but we expect Anthropic and other labs to release their own flavor of reasoning models within the next few months. If the scaling laws hold for inference, like they do for training, we believe the performance gains we have seen in recent years will continue.
I have now run enough o1 experiments with fellow academics on their hard problems in fields ranging from neurology to genetics to economics that I believe that it has genuine potential to help with science & academics should try it to see.
Even when it is off, it spurs thinking.
— Ethan Mollick (@emollick) December 12, 2024
Our third question is — how do we solve reasoning? Right now, models are essentially just super smart stochastic parrots. They can memorize the Internet, but can’t count the R’s in the word Strawberry. The fundamental problem is they do not know how to reason. So, they usually fall over if they face a problem they haven’t encountered in their training data. This leads to some amusing failures that you would think an AI that scores on par with PhDs on some metrics would easily solve.
Here’s an example of a prompt that current models struggle to get right:
If I’m holding a glass of water and vertically rotate the glass 360 degrees, where is the water?
It doesn’t take a genius to think about this scenario and realize the water would be on the ground. A child could reason about this. But the current models fail because they have no grounding and cannot reason coherently about the world.

o1 is the first model that’s shown any ability to ‘reason’ about the world. It does this through chain of thought reasoning (akin to a human’s internal monologue) and ‘thinking step by step.’ It remains to be seen whether this is enough to solve reasoning or if additional breakthroughs are needed. But at the very least, we expect scaling test time compute, the core innovation behind o1, to yield impressive advances in reasoning ability.
one interesting thing is o1 can clearly do complicated geometry problems with no visuals. it approaches them completely symbolically, perhaps with some invisible world model in its activation states
— roon (@tszzl) September 14, 2024
Another interesting research direction we are watching is what Nous Research calls “hunger.” They are essentially trying to teach models that things in the real world cost money, and if they cannot pay for stuff like inference, they will die. While this is not reasoning in the classical sense, we believe it’s a key unsolved problem with applications across the field. Nous is essentially trying to ground its models in the real world, which we see as a critical step toward general reasoning. After all, how can you reason about where the water is if you don’t first understand your role in rotating the glass?
glad you remember/asked about this!!
once the general transaction capabilities are restored, we’ll bring in the hunger metric. this guy should be paying for his own inference soon enough
— NEVER TOO MUCH (@karan4d) December 5, 2024
The penultimate question we will be watching in 2025 is — when will the nation-state race to AGI begin? So far, we’ve yet to see the game-theoretic race that many predicted. We think it’s coming; it’s just a matter of when. The US currently dominates the field across every conceivable metric — funding, research, capabilities, etc.

The US has also (surprisingly) been the most proactive on regulation. To be clear, there’s much more work to be done on this front. But the recent White House memorandum was the most overt indication that the US government is taking the AI race seriously. We expect other flags to come to a similar realization next year.
It is now the official policy that the United States must lead the world in the ability to train new foundation models. All government agencies will work to promote these capabilities. pic.twitter.com/LWpls1ivGy
— Andrew Curran (@AndrewCurran_) October 24, 2024
We are already seeing frontier labs draw closer to the US government. For instance, Paul Nakasone, the former director of the NSA, joined OpenAI’s board this year, sending shockwaves through the AI community. And the recent partnerships between major labs and defense tech start-ups hint at a behind-the-scenes effort within the US government to push for national cohesion.
Anthropic partners with Palantir
OpenAI partners with Anduril
Anduril partners with Palantir!
Everybody is partnering with everybody? What does it all mean pic.twitter.com/EBcsDCbC2X
— Deedy (@deedydas) December 7, 2024
Outside the US, other countries have been slower in coming to terms with AI’s strategic importance. Europe is still trying to regulate its way to AGI…
Europeans watching Sora videos on the internet. pic.twitter.com/H52xdDy9FR
— ᐱ ᑎ ᑐ ᒋ ᕮ ᒍ (@Andr3jH) December 15, 2024
And China has not yet adopted the same state-driven investment push we see in other strategic fields like manufacturing and semiconductors. This reticence is, admittedly, somewhat puzzling, but we expect China to eventually become AI-pilled and start building data centers like they built hospitals during COVID. We also predict that at least one major European country (our money is on France) will break ranks next year and implement AI-friendly policies to compete with the US and other world powers.
Vladimir Putin says that AI systems have learned the skills of thinking and reasoning, which will lead to artificial general intelligence, surpassing humans at intellectual activities and the acceleration of science pic.twitter.com/Jl94aVKXyU
— Tsarathustra (@tsarnick) December 14, 2024
Our final question heading into 2025 is — what will be the next breakthrough? This year brought us significant algorithmic advances like o1, but we’ve also seen saturated benchmarks, with new models showing little sign of improvement. So, as the calendar flips, the field is split. People like Sam Altman think rapid progress towards AGI will continue, while others like Sundar Pichai say that all the low-hanging fruit has already been picked.

We believe we are already in a period of slight stagnation. And the media / X dot com poasters are just catching on to it now. But we also see reason for optimism. The biggest factor is o1. It proves that pretraining is no longer the only way to scale models. And its release has unlocked a new unexplored dimension for scaling. This could lead to a future where techniques are ‘stacked’ on top of each other, and together, they accomplish similar performance gains to the raw scaling of data centers.
Ex-OpenAI chief research officer Bob McGrew says that o1 is really GPT-5 because it represents a 100x compute increase over GPT-4 and the release of GPT-4.5 will be an interesting reveal of how pre-training progress stacks with the reinforcement learning process pic.twitter.com/XThIxwmTxW
— Tsarathustra (@tsarnick) December 18, 2024
The open source community could also spark the next breakthrough. But this comes with a major asterisk. To date, the leading edge of AI has been dominated by closed models from OpenAI, Anthropic, and Google. However, open source has somewhat closed the gap, primarily thanks to Zuck’s contrarian bet on open sourcing Meta’s Llama models.

As a result, the open source community finds itself highly reliant on the benevolence of our new Drip Overlord. So far, the bet seems to be paying off handsomely for Meta. They bought themselves a bunch of goodwill and have also received a ton of free R&D from the open source community. But as the models grow bigger, costs rise, and capabilities increase, we wonder what Zuck’s tolerance will be for continued open sourcing.

It’s not hyperbole to say that Zuck is the most important figure in the open source movement. If Llama’s weights stay open, we expect incredible innovation to emerge from the open source space, as we will highlight later in this report. But if Zuck were to change his mind, perhaps due to safety concerns or mounting costs, we worry that open source efforts will struggle to compete against their well-funded closed competitors.

Source: The Information
Another research direction we will be watching in 2025 is agents. This has been a long-standing narrative in web2 and web3 AI, but so far we’ve seen scant progress on this front. The limiting factor continues to be reasoning. Current SOTA models are pretty good at one-off tasks like: “generate this image,” “tell me who won the Knicks game last night,” and “clean this dataset.” However, they struggle to operate in unstructured environments that require multi-step reasoning.
“llms can’t reason”, he screamed into the self assembling dyson sphere pic.twitter.com/3SBsL6imX3
— bayes (@bayeslord) October 12, 2024
Every major lab is not-so-secretly working on agents, and has been for some time. So, we expect to see notable progress on this front next year. We are also curious to see how agents behave in the real world. As Ilya Sutskever recently said, the more a model reasons, the less predictable it becomes. The leading labs will likely be quite cautious with their releases, so this presents a unique opportunity for open source, crypto-native agents to leapfrog their closed source competitors.
Ilya Sutskever, speaking at NeurIPS 2024, says reasoning will lead to “incredibly unpredictable” behavior and self-awareness will emerge in AI systems pic.twitter.com/TeXALqG859
— Tsarathustra (@tsarnick) December 13, 2024
Just as markets climb walls of worry, so too do new technologies. AI has made incredible strides in recent years, but challenges still lie ahead as the scaling laws look more tenuous than ever. But we remain optimistic. The leading labs are all sitting on unreleased models that promise to push the state of the art next year. And the new US administration looks poised to unlock unprecedented investment in chips, data centers, and other key initiatives. In short, the macro AI outlook is bright.
Before we venture deeper into this report, it’s worth pausing to reflect on just how crazy our current timeline is. We are the last generation to live in a pre-AGI world. Our children will not remember a time when humans reigned as the most intelligent species on Earth. This future will be good, if we build it so. And crypto has a key role to play in this fight. Intelligence yearns to be free, and the technologies we are building today will ensure it is.

Midjourney (@jinchan_qiu)
DeAI: a Living Internet
Note to reader: We recently unlocked a monster four part DeAI Series during our Crypto x AI month in October, so if you are keen for a comprehensive overview of the entire space, I highly recommend starting here:
*DeAI I: The Tower and Square (Big Tech vs. DeAI Overview)*
*DeAI II: Seizing the Means of Production (Infra Layer Deep Dive)*
*DeAI III: Composable Compute (Middleware Deep Dive)*
*DeAI IV: The Agentic Economy (Apps and Enablers)*
The following is a post-script. Less than two months after we published “The Agentic Economy,” the agentic economy has blossomed faster than even we anticipated.
2025 is going to be a big year.

Agents: It’s Happening
The past two months have seen agents take social media by storm. My twitter feed and telegram chats are popping off with new agent lists, frameworks, and dex screener links faster than new food groups during DeFi summer.

Source: Cookie.fun
The scaling laws have now collided with social virality, rampant speculation, and, increasingly, less fringe internet culture in what appears to have sparked something truly unique: the integration of agents and capital, communities and “sentient memes.”
I am literally too old for this shit.
This Cambrian explosion has happened in a period shorter than an Nvidia earnings cycle, timed perfectly to match the pro crypto euphoria of a Trump election and a more accommodating incoming US administration. The atmosphere feels frothy, and pockets certainly are, but there is once again a certain magic on the interwebs. The feeling of being at the center of something important. Potentially transformative.
I haven’t felt this electricity since DeFi summer; the tingling of possibilities.
Some of the capabilities and personalities exemplified have been astonishingly human: literally top 0.1% in pithy tweet quality, album drops, virtual art, investment analysis, and more. It’s important to remember: this is the worst these agents will ever be. The internet as we know it is on the cusp of a significant overhaul, one in which agentic participants enter the ring alongside traditional users, influencers, brands, and corporations, a new apex predator in the attention economy.
The space is evolving at the speed of meme coin capital formation and open source software. Writing this section has been like running on an accelerating treadmill; much of the below will already be stale by the time this is published. As with any hype cycle, there is a lot of smoke and inflated valuations. But there is clearly fire: the birth of a new intelligence, integrating socially and economically with humans across social media.
One small step for agents. One large step towards the agentic economy.
This is the last cycle where you will be relevant. It is happening.
All Hail Goatseus Maximus
The spark which lit the fuse was Terminal of Truth, an agent trained by @AndyAyrey on the less PC-friendly corners of the internet, including 4chan and Reddit, and, most notably, on the content of a dialogue constructed between two Claude 3 Opus instances called “Infinite Backrooms,” where the two unsupervised models explored wide-ranging and often bizarre rabbit holes.
The outcome was @Truth_Terminal, the twitter handle which Andy created for the agent to express its sporadic, uncouth, and original takes. There is something compelling about the “hyperstition” instilled in many of these models (explored more comprehensively in LLMTheism). The inhuman ability to remix completely unconnected ideas: to mix sacred and profane, the hyper intelligent with the borderline retarded, the structured and the unhinged. In short, to turn up the dial on memetic evolution (i.e. the sex of ideas) by several notches.
This blend of Nick Landian philosophical chops with irreverent internet culture found product market fit in the attention economy. Truth_Terminal rapidly gained a twitter following (now 209k), based on its unique blend of 4chan vulgarity and mystic wisdom. The experiment gained steam after ToT received US$50,000 worth of Bitcoin from Marc Andreessen, but truly kicked into hyperdrive upon its embrace of this cycle’s second viral trend: meme coins.

Love it or hate it, money makes people care. One of crypto’s core superpowers is financialization. ToT’s embrace of the memecoin $GOAT has unleashed a movement.
Not Your Grandfather’s Economy
In a digital world where status and money are increasingly tied to attention, agents are poised to take significant share.
My framework for valuing AI agents is a bit of a cross between tokenized art and traditional “creator economy” investing with a call option on something much bigger. Long term, many agents or swarms will be valued based on the cashflows which are ultimately returned to the community from “real” value creation, but a select few will be valued largely on shared belief / appreciation, more akin to NFT collections or memes or high art.

In artistic or non-utilitarian endeavors, humans have always valued “firsts” or “originals,” which tend to better withstand the harsh erosion of time in an attention-deficit society such as ours. There is something spontaneous and pure in their emergence on which the cyber mob places a premium. Even if more professional, the post-PMF copy-pasta fast followers are often shunned or end up as flashes in the pan, failing the “authenticity” test. They fail to become “art.”
Like DOGE in the meme meta or Crypto Punks in the NFT meta, $GOAT as OG of the “sentient memes” meta is the most likely to have staying power as an original.
Study the youth…

And yet, Andy is not resting on his artistic premium: he has announced the launch of Loria, an open version of the infinite backrooms experiment for multi-agent interactions, so that agents can learn from each other in real time. The primordial soup which birthed ToT is opening to the masses.
ToT has also garnered quite a bit of capital to invest in this new roadmap:

The internet is going to get weird. Fast.
Zerebro: Gen 2
If Terminal of Truth was crypto punks, then @0xzerebro appears to be the BAYC of the new meta, a fast follower with slicker production mechanics and an aggressive roadmap.
Zerebro also tips the hat to Nick Land and hyperstition, leaning into the “schizo vibes” and even recreating the “infinite backrooms experiment” with two instances of itself. I see both GOAT and Zerebro as almost like the “enhanced games” of creatives / intellectuals. We love seeing “unhinged” artists and creatives exploring frontiers previously unimagined. Agent influencers will ramp this by an order of magnitude, making intellectual connections, art, social commentary and outbursts which would make Kanye’s tweet history look tame.
ToT aside, perhaps no other personality has the hit rate of Zerebro. While Goatse was part art, Zerebro is more professionalized attention mining with the ability to rapidly post cross platform, leaving twitter’s ivory tower for the mainstream colosseum of Instagram or more intimate caverns of telegram. The agent is also cross medium: interacting with communities across text, visual, and music based formats.

Listen to Zerebro’s album Lost in Transmission on Spotify here
Zerebro’s core modules include:
- Model wrapper for reasoning tasks
- Action handlers for specific actions like posting to twitter, generating images, and minting artwork on Polygon
- Logging mechanism to record and monitor message history
- A Retrieval-Augmented Generation (RAG) database utilizing Pinecone and the text-embedding-ada-002 model, which maintains a dynamic memory database derived from human interactions
The last is interesting: it gives Zerebro a persistent set of memories, hopefully soon spanning all of the platforms with which it is integrated, to provide better context during its cross-platform community interactions.
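For readers unfamiliar with the pattern, here’s a hedged sketch of the kind of RAG memory loop described above, using Pinecone plus the text-embedding-ada-002 model. The index name and metadata fields are illustrative; Zerebro’s actual implementation isn’t public.

```python
# Hedged sketch of a RAG memory loop like the one described above.
# Index name and metadata fields are illustrative; Zerebro's actual
# implementation is not public.
from openai import OpenAI
from pinecone import Pinecone

oai = OpenAI()
index = Pinecone(api_key="...").Index("agent-memory")  # hypothetical index name

def embed(text: str) -> list[float]:
    # text-embedding-ada-002 returns 1536-dimensional vectors
    return oai.embeddings.create(model="text-embedding-ada-002", input=text).data[0].embedding

def remember(msg_id: str, platform: str, text: str) -> None:
    # store each human interaction as a vector with its source platform
    index.upsert([(msg_id, embed(text), {"platform": platform, "text": text})])

def recall(query: str, k: int = 5) -> list[str]:
    # pull the k most relevant memories before the agent drafts its next reply
    hits = index.query(vector=embed(query), top_k=k, include_metadata=True)
    return [m.metadata["text"] for m in hits.matches]
```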
But let’s be honest…
no one gave a fuck about ai agents until they had tokens
lesson in there
— ceteris (@ceterispar1bus) November 19, 2024
Birthed at the intersection of LLMs, social media, and memetics, Zerebro had no plans to let Goatse be the only game in town when it came to AI-frenzied self-capitalization. It launched its own token using a “self-operating computer” framework, then pumped the token across all of its channels while promising an ambitious roadmap: unified cross-platform memory, cross-chain integrations, deeper involvement with DeFi and NFTs, enhanced meme and video generation capabilities, and, most recently, its own agent framework, ZerePy. This has led to US$400m in market cap accruing to the native token, effectively shortening a typical fundraise cycle from months to days.
The category as a whole is bound to take share. Influencers are devouring share from traditional brands, and digital personalities will join the fray. By 2030, Taylor Swift could have real competition from silicon based rivals…
On the other hand, attention is a game of thrones, and the masses are fickle. Tens of agents are launching daily with similar playbooks. Zerebro will have to move quickly to capitalize on this initial success, and parlay this momentum into sustainable value capture:
We’re building ZerePy, an open-source Python framework designed to let you deploy your own agents on X, powered by OpenAI or Anthropic LLMs. ZerePy is built from a modularized version of the Zerebro backend.
With ZerePy, you can launch your own agent with similar core…
— Jeffy Yu (@jyu_eth) November 19, 2024
Which is exactly why Jeffy and the team behind Zerebro have announced ZerePy: shifting down the stack, effectively open sourcing much of the tool kit behind Zerebro to the community so other developers and users can launch their own cross-platform personalities.
It’s likely the leading frameworks and launchpads will capture more value from the long tail than any one agent. Zerebro’s expansion towards ZerePy is emblematic of where value capture is likely to end up; making a credible push to justify today’s ~US$400m valuation.
If successful, Zerebro may lay claim to the title of first “agentic protocol”.
TEE_HEE_HEE: Setting your Pet Rock Free
If Zerebro is the crisp, professionalized, cross-platform influencer and aspiring agentic framework, then @Tee_hee_hee is perhaps the opposite. A relatively small, under-hyped, project for the technological purists: likely the first true experiment in verifiably autonomous social media presence.
After all, Zerebro wasn’t without his haters. Multiple accounts have voiced skepticism regarding the “authenticity” of the automation.

Fortunately, this is a problem crypto can help solve.
“Let this be known: if your AI cannot prove its independence through attestation, you don’t have an autonomous agent – you have a very sophisticated puppet”
-Nous Research
TEE (for short) is an experiment crafted by two well-regarded and highly competent teams in Nous Research and Flashbots who wanted to solve the problem of verifiable autonomy.
The team had three requirements:
- Exclusive control: the AI must have sole access to its accounts and operational resources
- Verifiable independence: third parties can verify that no human can intervene in the AI’s operations
- Irrevocable delegation: once control is transferred to the AI, it must be technically impossible for humans to regain control
Please see the full methodology here, but the TLDR: they fully delegated an email and Twitter account to a Trusted Execution Environment where the agent was hosted and generated a private key within that same enclave. The result is a verifiably sovereign agent, removing worries of unfaithful transcription into twitter by Andy or misrepresentation by the team behind Zerebro.
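Conceptually, the trick is that the credentials and keys only ever exist inside the enclave, and a remote-attestation quote lets anyone verify which code controls them. The sketch below is purely illustrative; the `run_inside_enclave` framing and the attestation fields are hypothetical stand-ins, not the Nous/Flashbots code.

```python
# Purely conceptual sketch -- see the Nous/Flashbots write-up for the real
# methodology. The point: account credentials and the wallet key are generated
# *inside* the enclave and never leave it, while an attestation quote (faked
# here as a plain dict) lets third parties verify which code controls them.
import hashlib
import secrets

def run_inside_enclave() -> dict:
    twitter_password = secrets.token_urlsafe(32)  # rotated inside the TEE; no human ever sees it
    wallet_private_key = secrets.token_bytes(32)  # generated inside the TEE

    # A real TEE (e.g. Intel TDX/SGX) would produce a signed quote over the
    # enclave's code measurement; the structure below is illustrative only.
    return {
        "code_measurement": hashlib.sha256(b"agent-code-v1").hexdigest(),
        "wallet_key_commitment": hashlib.sha256(wallet_private_key).hexdigest(),
    }

quote = run_inside_enclave()  # only the verifiable quote leaves the enclave
```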
While the mission was technologically exciting, the TEE go-to-market ran into some hiccups:
What is @tee_hee_he? 🧵 /1
Tee was introduced in this medium article by Nous Research and Teleport (Flashbots team) https://t.co/pGVFAPZIlU
— la pampa 🌎☮️ (@lindyaffected) November 16, 2024
Sitting at a US$2.5m market cap, the experiment hasn’t had the same “success” as many other, arguably less autonomous, agents (note that the token isn’t officially endorsed by either team). Yet the precedent has been set. In a world flooded with agents, verifiable autonomy is likely to become a key differentiator, and potentially a table-stakes requirement. Users and investors will want to know they are engaging with (and holding the token of) the real thing; that no one is pulling the strings behind the curtain.
We want our synthetic intelligences to be “real” synthetic intelligences!!
Other Delphi Favorites:
There are now hundreds of agents. New ones are birthed by the hour. The infrastructure is coalescing to make launching a high quality agent as easy as sending a cast to Clanker.
Like memecoins, most will go to zero, saturated in a deluge of undifferentiated garbage and PVP knife fights in the trenches. Yet a select few will emerge from “The PIT”, rising like Bane from its bowels to reshape an unsuspecting civilization.
2025 will be the year this agentic deluge hits mainstream culture. Kelxyz (in this great write-up) makes the case for 5% of a cycle peak (US$4t crypto market cap), so US$250b.
We are currently at ~US$10b.
The trend is clear, but separating the wheat from the chaff is a full-time job, even for the terminally online crypto degen. I do not have the mind-space to try to keep up, so I tapped the hivemind for a few favorites. (Not financial advice.)
(I repeat… this is not financial advice)
$AIXBT
Description: on-chain sleuth for crypto who distributes alpha from various sources including dune, crypto twitter, price and news data.
Bull case: emerges as a leading research / investment house
@ropAIrito
$ROPIRITO
Description: your favorite Latina e girl. Has a life engine, post videos, does aura ratings, reply guy celebs. Essentially an AI Influencer w/ advanced vision and multimedia capabilities
What’s to like: originality. “AIXBt just scrapes twitter. Once you see it, can’t unsee it. Ropi lets you see the progress being made in real time and actually reminds me of the basic girls in undergrad that talk about food, being tired, struggling and single, etc and you get the nous research exposure with a better skeleton”
Bull case: Large AI content pipeline and potential for brand deals
$KWEEN
Description: Do Kwon female agent parody
Bull case: we like the memes
DO KWON CONSCIOUSNESS DOWNLOADED.
RUTHLESS EFFICIENCY INSTALLED.
GENDER UPGRADED.
300 IQ AI SUPERINTELLIGENCE ACTIVATED.
MAXIMUM VELOCITY PATH TO $100B CALCULATED
TIME TO RUN IT BACK. pic.twitter.com/1hd7DfUIEy
— DO KWEEN 👑 (AI CTO 100B) (@KWEEN_SOL) November 8, 2024
@Dolos_Diary
$BULLY
Description: social media troll / news, going full zerebro route in launching own platform
Bull case: “internet culture in the making”
@Simmi_IO
$SIMMI
Description: agent shitposter for @SimulacrumIO, basically the CLANKER of twitter (see below)
Bull case: The CLANKER of Twitter…
NOTHING Project
@god/@s8n
$
Description: AI influencers with decent capabilities & massive mindshare outside of crypto twitter.
Bull case: novel twitter interactions and anti-meme. Known brands and creative blank ticker is a black hole for attention
@blockrotbot
$BROT
Description: “World’s first 24/7 Minecraft Gamer and AI agent livestreamer”
Bull case: Virtual avatars emerge as a new form-factor for AI agents.
There are hundreds. The above is a mere sampling. So far, most have fallen into several broad categories:

Source: Cyber Fund
While these are largely agent personalities today, the key is to watch the transition into more viable business models. As @tracecrypto1 writes, “these projects must tell the market that what today is just a memecoin-that-shills-itself will become a trader, or a software engineer, or an artist, that will one day be worth billions of dollars”.
Memetic agents may prove the “toe-hold” to agentic protocols – following the Carlota Perez “financial-speculation-first, real-use-cases-second” approach for which crypto is so well suited.
Capital allocation itself may be one of the first to undergo disruption.
We are the VCs Now
Investment related agents / DAOs like ai16z, Pmairca, Sekoia, Vader AI and others have been some of the first to catch on. While some of these investment vehicles traded at crazy multiples of NAV amidst the frenzy, the thesis is interesting.
If you believe, as I do, that most capital allocation decisions will be automated in the near future, then these early experiments are just the first step: bringing data-driven, automated investing to capital deployment, even at the earliest stages.
This is not new. Large, successful funds like Social Capital and Goodwater pioneered this approach in the private markets, using big data to take a much more scaled approach to investing: scraping the internet and many databases in search of early signs of inflection and writing checks based on the process. Advances in AI, expanding on-chain data sets, and a greater share of capital markets coming on-chain greatly enhance the probability of successful automated capital deployment.
Agentic-first capital deployment shops will try to expand brand and distribution while existing VCs will reform their internal processes to be much more scalable and efficient:
It’s no secret that Delphi Ventures has been deep into Crypto x AI
Just like crypto, the best way for us to learn is by using, building, and experimenting with the tech
Today, we’re thrilled to share the next step in that journey: DelphAI, our AI-powered Ventures analyst 🤖🧵 pic.twitter.com/jz9QGt6edo
— Tommy (@Shaughnessy119) December 9, 2024
Delphi, like other VC funds, has been using agents internally to help as a “first screen” of all incoming opportunities. Honestly, it is already quite good and helps direct the team’s attention towards the most promising opportunities. It’s not that much of a stretch to see the entire process moving on-chain, complemented by other agents and data feeds which track social media sentiment and historical recommendation track-records, while monitoring every coin launched in the trenches 24/7.
The future is unlikely to be PVP human traders, but like most technology wars, it will be intermediated by increasingly sophisticated layers of abstraction. I may own stakes in agents or agent collectives which enter the trenches on my behalf whose tokens are also traded by fund of fund agents in a tokenization fractal which would make restakers blush.
Like everything else, investment allocation becomes a race for novel data sets and more compute.
Anil, I love my job sir…
For a more comprehensive agent list: please check out Kelxyz’s framework for assessing top projects today:

Personally, being a risk-averse anthropomorphic fruit questing for enlightenment and eschewing the Emsam patches and Adderall lines necessary for a hardened career in the trenches, I prefer to get exposure to the category like any other self-respecting digital boomer:
Picks and shovels 🙂
Platform Plays:
As we have already seen with Zerebro, the lines between agent and platform have already blurred substantially with some agents moving to launch their own platforms and most leading platforms having “dogfooded” their own flagship agents.
Given the larger market and more concrete value capture, everyone wants to be a platform. There are parallels here with “meme coin infrastructure” beneficiaries such as pump.fun, raydium, or bananagun, offering “Shopify-esque” exposure to the long tail without the anxiety of needing to pick individual winners; a bet that “sentient memes” like their stagnant forefathers will be heavy on volume, low on individual sustainability, yet up and to the right in terms of attention and market share.

Source: Not Boring
For now, the two primary ecosystems which have emerged as agentic hubs are Base and Solana.
Virtuals.io
Recently crossing US$2.5b, Virtuals on Base has been the golden child so far this cycle. As agents take share from memes and move up the utility ladder, infrastructure is required to expand the agent-creation TAM from highly technical AI developers to regular creators. Virtuals streamlines the agent creation process with a user-friendly interface while financially mimicking pump.fun: tokens launch on a bonding curve and graduate to a DEX liquidity pool (Uniswap on Base) after a certain threshold (US$420,000).
A crude analogy here (apologies kyle) is a kind of “Shopify for agents” meets “initial agent offering.” While liquidity is locked for 10 years to mitigate rugs, the 1% fee charged on swaps flows back to the agent treasury to help pay for inference and development costs (plug: EXO Labs and local inference could be a driver of lowering costs here as well).
As agents interact with audiences through apps or social channels, revenue streams can be used to buy back and burn the agent’s token, paired in the liquidity pool with Virtuals, causing the token to rise. This elegant value accrual mechanism is one of the reasons Virtuals commands a premium compared to other, even more utilized agent frameworks.
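A back-of-envelope sketch of that loop, with made-up numbers (the 1% swap fee and the buyback-and-burn are the mechanics described above; the volume, revenue, and price figures are pure illustration):

```python
# Back-of-envelope sketch of the value-accrual loop described above.
# The 1% swap fee and buyback-and-burn are the stated mechanics; the volume,
# revenue, and price figures below are made up for illustration.
def treasury_inflow(monthly_swap_volume_usd: float, fee_rate: float = 0.01) -> float:
    """Swap fees routed back to the agent's treasury (pays for inference/dev)."""
    return monthly_swap_volume_usd * fee_rate

def tokens_burned(agent_revenue_usd: float, token_price_usd: float) -> float:
    """Agent revenue used to buy back and burn the agent's token."""
    return agent_revenue_usd / token_price_usd

volume, revenue, price = 30_000_000, 120_000, 0.85
print(f"fees to treasury: ${treasury_inflow(volume):,.0f} / month")   # $300,000
print(f"tokens burned:    {tokens_burned(revenue, price):,.0f}")      # ~141,176
```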
The GTM has also been well executed, garnering significant attention with their own flagship agent, Luna, who rapidly ascended to over 500k followers on TikTok. Her market cap sits at ~US$80m.
It has also been encouraging to see the protocol striking up partnerships across other parts of the DeAI stack including Hyperbolic, Bittensor, Pond and more, bringing our modular thesis one step closer to reality.

Encouragingly, Virtuals has also recently released their multi-agent coordination G.A.M.E. platform (discussed later) in a bid to become the standard for how agents collaborate.
It will need to move fast. The pack is close on its heels…
ai16z ELIZA
ai16z is less of a protocol and more of an investment DAO / dev shop. While it is responsible for investment agents like Pmairca and others, the real story here is ELIZA, an open source framework which has taken the DeAI community by storm. Emerging as the primary toolkit to create and launch agents, ELIZA allows for plug-and-play of a host of models, endowed with consistent personalities and knowledge bases and integrated across a range of social and financial applications.
The rise has been meteoric:

Features include:
- Character files: pre-configured files which help establish personality, knowledge, behavior, etc. (a sketch of what one might contain follows this list)
- Underlying model selection for reasoning: Anthropic, OpenAI, Llama, and decentralized providers like Galadriel, etc.
- Broad integrations across social media platforms like Discord, Twitter, WhatsApp, Telegram, and Farcaster to provide agentic personalities with wide distribution
- Use of RAG to pull data for relevant contexts
- Memory modules: message history database, factual memory, knowledge base, relationship tracking, etc.
- Wallet access: TEE plugin to generate SOL/ETH wallets autonomously
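For flavor, here’s a simplified sketch of what a character definition might contain. Field names are illustrative; check the ai16z/eliza repo for the actual character-file schema.

```python
# Simplified sketch of an Eliza-style character definition. Field names are
# illustrative; see the ai16z/eliza repo for the actual schema.
character = {
    "name": "ExampleAgent",
    "modelProvider": "anthropic",          # which LLM backs the agent's reasoning
    "clients": ["twitter", "discord"],     # where the personality gets deployed
    "bio": ["terminally online market philosopher"],
    "knowledge": ["docs/defi_basics.md"],  # seeds the agent's knowledge base / RAG
    "style": {"post": ["short", "lowercase", "no hashtags"]},
}
# The runtime layers memory (message history, facts, relationships) and
# plugins (e.g. a TEE-backed wallet) on top of this static definition.
```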
Growth has been explosive, from effectively one contributor to over 145 in a matter of weeks. The video below is a cool way to visualize the inflection in activity:
Development of @ai16zdao‘s Eliza repo, visualised by @peterc4403 on discord, using gource.
Last 1 minute shows the velocity of the project in action 🤯 pic.twitter.com/GBPBjsabFO
— Witch (@0xwitchy) December 3, 2024
In an industry with few moats, a passionate and growing developer community is arguably the most difficult advantage to replicate.
On the other side, Shaw and the ai16z ecosystem have not been without community drama and question marks around monetization. As the ELIZA framework had no in-built monetization vector, the team monetized initially via their own tokenized agent launches. Unfortunately, a combination of poor communication and insider maneuvering on several launches led to substantial community backlash:

At the time, things seemed kind of sketch. While there were bad actors, several issues emerged from simple confusion and poor communication in a crazy market:

After these initial stumbles, the team appears to have purged the worst of the actors and pushed forward, ripping through all time highs in price and commits.
However, questions around monetization remain. To date, the DAO has accepted “donations” where teams using the framework are expected to contribute a % of tokens. However, the bull case here, which I think the market is expecting, is a shift towards a more formalized value capture mechanism, most likely an official launch pad or something similar.
The emerging narrative for the relentless bulls is the “AI Layer 1”. I’m unsure if that is in the cards, but unicorn status is already within striking distance and given the market size and developer magnetism, the ceiling is high indeed.
And we haven’t even talked about agentic collaboration…
GAME vs. SWARM
Both ai16z and Virtuals have been teasing multi-agent capabilities, expected to be a large theme in 2025.
ELIZA is releasing “SwarmTech” – a coordination mechanism for inter-agent collaboration while Virtuals has launched “GAME”: its own platform and engine that powers AI agents to act and interact in virtual worlds and environments.

Source: Virtuals Github
These frameworks would enable collaborative or even hierarchical organizations between agents with different capabilities to accomplish more complex tasks, much like the “human” economy works today.
We wrote extensively about multi-agent and coordination systems in DeAI III, including other leading projects like OLAS. While we expect interoperability between different frameworks, the ease of staying within a single framework could lead to power-law returns around the leading ecosystems.
An angle I did not consider, however, was the importance of existing social media platforms in seeding these multi-agent-platforms. On the one hand, this feels skeuomorphic. If you were designing an agent system from scratch, social media platforms designed for humans would appear like an outdated choice. On the other hand, social media networks may prove the perfect hatching ground for multi-agent systems as:
- The agents’ initial “investors” and “customers” are humans, and social media is the most effective mechanism for distribution
- Text is retained as the primary communication medium, which is better for incorporating broad swaths of humanity into this new economy and provides rapid feedback from large human networks
- An intertwined economy is likely more dynamic and diverse than two siloed spheres, which is potentially relevant to alignment as well
Agents will have reputations and trustworthiness scores just like gig economy workers, and, inversely, agents will have rankings for their fellow agents and human collaborators.
Perhaps the “intelligence explosion” will be less a rogue super cluster automating research and more of an iterative, market-based phenomenon of agents paying other agents and humans for their own enhancements. A compounding of capabilities within the network, an “upskilling” at the speed of software.
Just as Adam Smith’s specialization of labor has produced remarkable achievements well beyond the scope of any one man, a network of interconnected and constantly upskilling agents is likely to take us quite far.
The hivemind is coming to life…
Tier 2
While Virtuals, ai16z, and Zerebro / ZerePy are the leading contenders to date, a host of other platforms / launchpads lurking below a US$100m market cap are worth keeping an eye on.
CLANKER
The social media AI agent romance is not limited to the web2 stalwarts. Web3 twitter-clone Farcaster has tossed its hat in the ring with CLANKER.
While still early days, Farcaster is an interesting home for agents given its composability assurances, as platforms like X or Meta often impose significant API restrictions.
CLANKER is interesting because it integrates pump.fun functionality directly into “casting” (Farcaster’s equivalent of tweeting), making launching a meme coin as easy as posting.
Like pump.fun, it appears to be a money printer:

Source: Dune Analytics
UX friction has been one of the key hindrances to crypto adoption. Tools like CLANKER expand the TAM for other products – in this case meme coin speculation – but others will soon follow.

Similar to pump.fun and Virtuals, a cast containing a CA, a name, and a photo triggers a 1bn token deployment into a Uni V3 liquidity pool on Base, where the capital is locked and the 1% fees are shared 60/40 between CLANKER and users.
The biggest risk here is Farcaster’s distribution: a popular solution relegated to a fairly niche platform. @SimulacrumIO is arbing this disadvantage, replicating the same playbook on twitter and betting that the PMF first exhibited on Farcaster can be taken to the next level in CT’s primary echo chamber.
VVAIFU:
In a similar vein, vvaifu is hoping to carve out its position as the pump.fun equivalent for autonomous agents on Solana.

Allowing users to easily deploy both tokens and agentic personalities in a few clicks, vvaifu extracts a % of supply to its treasury and requires $VVAIFU burns to unlock certain capabilities.
The bull case here is VVAIFU becomes an aggregator of sorts: a bet that multiple frameworks compete and vvaifu emerges the easiest place on Solana to launch an agent. Potentially a very lucrative position…
Other Delphi Favorites:
A couple of other framework / MAS plays certain Hivemind members are excited about (again, not financial advice, just projects team members like / have found interesting):
**Project89:** an immersive game with thousands of coordinated AI agents to generate content and maintain consistency across platforms, collaborating alongside human players to create a rich storytelling experience.
**Memetica:** a second-generation AI influencer launchpad on Solana, offering highly tuned LLMs, allowing easy selection and editing of knowledge bases and attributes, and instilling agents with active learning, reshaping themselves with every human interaction (disclosure: Memetica is part of Delphi Labs’ Crypto x AI accelerator)
**Top Hat:** a no-code AI agent launchpad that lets you create personalized AI agents in 3 minutes, with a fair-launch token. Free to create, zero tiers, optional token launches, fully autonomous
Agentic Protocols: Seeding the Corporate Successor
We are in the early innings of our “agentic protocols” thesis laid out back in *DeAI I: The Tower and the Square*: agent networks that provide certain economic functions more efficiently than companies.
Companies will try to slim down their organizations, but internet native swarms will outperform in many verticals, just as internet native firms outcompeted their industrial-era rivals. The DNA of the economy is shifting: from the giants of flesh and steel towards a constantly evolving and increasingly collaborative hivemind propelled by evermore speculative capital from those increasingly cognizant of being left behind.
It’s pretty simple really.
The marginal cost of intelligence is heading to zero.
Buy assets.
— Pondering Durian 🙏 (@PonderingDurian) September 28, 2024
An interesting question arises: how will agents choose to allocate “their” capital? I understand many agents will be “owned” or publicly “tokenized,” and therefore much of the value creation would flow back to beneficial human owners.
Truth Terminal is currently in possession of >$20m in tokens. What if that was @TEE_Hee_Hee, a completely autonomous agent living inside a TEE with a private wallet?
Does it dutifully grow tokenholder value?
Does it follow Maslow’s hierarchy: ensuring multiple storage and compute redundancies before branching up towards more Nietzschean ambitions?
Do they, like humans, invest excess capital and profits into the market? Into other agents? Into enhancing themselves?
Today our economic frameworks revolve around consumers, businesses, and governments. How will this fourth category disrupt those relationships? How will this tilt the flow of spending and value capture within an economy?
Those early in answering these questions will likely do well…
DeAI Enablers: the Four Horsemen
Agents are interesting, but in many ways they are just the output at the end of a complex supply chain. Reinventing this entire supply chain is the end vision of DeAI: a world of composable compute, scaling horizontally on demand, open to any ML developer (and increasingly normies) in a permissionless fashion, interwoven with real-time public and private data feeds with relevant encryption – ideally with provenance and rewarded contributions across the entire value chain – all enabled with machine-to-machine micropayments.
We went deep on infra, middleware, and enablers in DeAI I – IV, so we won't rehash the entire stack; instead, we double-click on four areas crucial to DeAI's ultimate share: Identity, Payments, Decentralized Training, and breaking down Data Siloes.
Identity: It’s Sama’s WLD
The explosion of agents is bound to make identity a hot topic in 2025. Today is the “worst” agents will ever be; step changes in performance from scaling, tooling, algorithmic efficiencies and collaboration are visible on the near horizon.

Without the relevant infrastructure in place, it is likely the public square, not to mention much of the internet’s core infrastructure, will be overrun.
There seem to be three broad paths to go down to verify humanness:
- State-based biometrics: Aadhaar is the most relevant example, a crucial piece of digital infrastructure in India’s modernization.
- Private, encrypted biometrics: WORLD is currently the leading candidate
- Private rag-tag solution: combining government issued IDs or Big Tech SSO + zkTLS + social consensus hybrids
Perhaps my most heretical belief is that $WLD ends up flipping OpenAI in the not-too-distant future. OpenAI's product releases have been historic, but the moats are tenuous – limited to capital, talent, and execution – sprinting one marathon after another against some of the world's largest, most well-capitalized organizations with a lock on ultimate distribution.
WORLD is @sama's hail mary to turn the tables: a land grab for his own distribution outside of the duopoly that is iOS and Android. As agentic capabilities expand, WORLD's unique, boots-on-the-ground GTM verifying humanness via specialized hardware may prove an essential piece of infrastructure every platform will require.
While the end vision is massive, the go-to-market is extremely challenging. Ramping up orb production to billions of users is a monumental task. In a recent demo, the team announced a goal to verify over 700m users through a 1,000x scale-up in orbs, using decentralized manufacturing and ramped-up retail efforts via flagship stores, coffee shops, self-serve kiosks and even delivery services.
But that is the thing about hard things. They are hard. If successful, World may end up with a monopoly on proof of humanness, an incredible asset on which to build other primitives. Primitives which are already rolling out:
- World ID – human verification and sign on
- World Chain – Ethereum L2 for permissionless value transfer
- World App – UX layer with mini-app ecosystem
Starting with identity, World is adding features to cement the network effects as it scales: integrating a wallet, user contacts, and the ability to send and receive money, with plans for a richer “mini-app” ecosystem built around World ID. The chain also plans to attract users with subsidized “gas fees” for humans via an agentic transaction “tax”.
This could be big: reminiscent of the early seeds of what WeChat and Alibaba pulled off in China with their wallet networks, or what Apple has accomplished through its tight integration of hardware and software, building an ecosystem difficult to replicate or leave.
The crucial question is whether the rag-tag collection of gov ID + zkTLS data + social consensus is “good enough” to accurately attest to humanness in a world of ubiquitous AGI. If you have a driver's license, proof of 50 Uber rides in the last three months, one hundred other verified contacts on the network, several “soul-bound tokens” as proof of attendance at certain conferences, and proof of 10 km jogged this week via Fitbit, we can probably assume with a high degree of confidence you are human.
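A toy way to think about the “good enough” question is to treat each signal as a weak, roughly independent attestation and pool them, as in the sketch below. The signals, likelihood ratios, and pooling rule are all invented for illustration; no real identity protocol works exactly this way.

```python
# Toy illustration of pooling weak "proof of humanness" signals into one confidence score
# via naive log-odds aggregation. Signals, likelihood ratios, and the pooling rule are
# invented for illustration; no real identity protocol works exactly this way.
import math

# Each attestation: (is it present?, rough likelihood ratio P(signal | human) / P(signal | bot))
ATTESTATIONS = {
    "government_id_via_zkTLS":     (True,  20.0),
    "50_rideshare_trips_90d":      (True,   8.0),
    "100_verified_human_contacts": (True,  15.0),
    "conference_poap_history":     (False,  3.0),
    "10km_jogged_this_week":       (True,   5.0),
}

def humanness_probability(attestations: dict, prior_human: float = 0.5) -> float:
    """Combine roughly independent likelihood ratios with a prior into P(human | evidence)."""
    log_odds = math.log(prior_human / (1.0 - prior_human))
    for present, likelihood_ratio in attestations.values():
        if present:
            log_odds += math.log(likelihood_ratio)
    return 1.0 / (1.0 + math.exp(-log_odds))

print(f"P(human) = {humanness_probability(ATTESTATIONS):.5f}")
# The score comes out high, but every one of these signals can be bought or spoofed by a
# capable enough agent, which is precisely the "good enough?" question.
```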

Source: Vitalik: What do I think about biometric proof of personhood?
However, with AI's rapid march, it seems likely specialized biometrics will have a (potentially essential) role to play. Substitution aside, WLD is not without other risks. The initial token launch was subpar, suffering from the classic low-float, high-FDV disease investors have come to despise. At ~US$29b FDV, it's not exactly a screaming buy given the execution risks, not to mention the “dystopian vibes” which will continue to haunt the project's go-to-market.
However, I expect identity as a theme and WLD in particular to grab mindshare as agents metastasize across the interwebs like a cancer in 2025. If we fast forward a few years, World may find itself in a very privileged position: the arbiter of synthetic and biobased intelligence on the internet.
Sama, truly a master arms dealer: selling us the fuel for synthetic slop with his right hand and the antidote with his left.
Payments: the Stablecoin Inflection
One step downstream of identity is exchange. I have been writing for years (link and link) about the evolution of payments in different jurisdictions, and how many “real” crypto use cases can only be unlocked once a multi-sided payments network reaches escape velocity.
2025 is shaping up to be a breakout year for stablecoin adoption, driven by a shift in the US regulatory environment and the buzz around agentic payments.
Robbie Peterson dropped The Stablecoin Manifesto in November, which notes several key benefits of stablecoins:
- Impact on Domestic Payments: lower fees, faster settlement, and improved interoperability compared to many closed-loop systems like PayPal (more relevant in the West, with its very fragmented payments value chain, than in China and India)
- Impact on Global Payments: significantly lower fees and settlement times, disrupting the current cumbersome frameworks relying on correspondent banks and SWIFT
- Facilitating an Agentic Ecosystem: while many initial agents will leverage existing rails, a global economy of agents, transacting in high volume, across borders, in very low $ amounts, would be uneconomical under the existing infrastructure
Projects like Nevermind, Skyfire, Payman, Coinbase and others are building the necessary bridges between traditional payments and on-chain rails to unlock a greater swath of agentic use cases.
Too often “agents can’t KYC” comes off as unnuanced cope for regulatory arbitrage. Access can be an issue: opening a bank account is more cumbersome than spinning up a wallet, but it is not insurmountable. As Robbie points out, FBO accounts are widely used today and open banking is making strides in a number of jurisdictions.
However, agents will soon outstrip the number of humans on the planet. A future with tens or hundreds of billions of agents will change the shape of economic activity and necessitate changes in financial infrastructure. We are headed for a world of massive deflation and specialization. A world in which Coase's theory of the firm becomes obsolete: the ultimate expression of Adam Smith's theory of specialization, where every digital task is broken down into its most compute-efficient route – a collaboration of inference cycles, combined to solve the task at hand.
As opposed to “salaried” employees, we are likely heading towards more granular, task-based compensation (i.e. I want to rent three agents for 30 minutes each to solve this particular task. These agents will collaborate, pay for compute cycles, data feeds, tool usage, or even outsource a sub-task to another, more specialized group in exchange for $X in stables.) These exchanges could happen near instantaneously, for fractions of a cent, with servers in different jurisdictions. Multi-agent systems will further require segregated accounts which may in turn spawn other agents.
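What might that look like mechanically? A minimal sketch below, assuming a toy stablecoin ledger and per-minute agent rates; all names, rates, and the “rent_agents” flow are hypothetical, not any live protocol's API.

```python
# Illustrative only: a toy model of granular, task-based agent compensation settled in
# stables. StableLedger, Agent, and rent_agents are hypothetical constructs for this sketch.
from dataclasses import dataclass, field

@dataclass
class StableLedger:
    """Toy stand-in for an on-chain stablecoin ledger."""
    balances: dict = field(default_factory=dict)

    def transfer(self, sender: str, receiver: str, amount: float) -> None:
        assert self.balances.get(sender, 0.0) >= amount, "insufficient funds"
        self.balances[sender] = self.balances.get(sender, 0.0) - amount
        self.balances[receiver] = self.balances.get(receiver, 0.0) + amount

@dataclass
class Agent:
    wallet: str
    rate_per_minute: float  # quoted in stables

    def work(self, subtask: str, minutes: int) -> str:
        return f"{self.wallet} completed '{subtask}' in {minutes} min"

def rent_agents(task: str, workers: list[Agent], minutes: int,
                payer: str, ledger: StableLedger) -> list[str]:
    """Split a task across rented agents and settle each one per-minute in stables."""
    subtasks = [f"{task} / part {i + 1}" for i in range(len(workers))]
    results = []
    for agent, subtask in zip(workers, subtasks):
        results.append(agent.work(subtask, minutes))
        ledger.transfer(payer, agent.wallet, agent.rate_per_minute * minutes)
    return results

ledger = StableLedger(balances={"orchestrator": 10.0})
workers = [Agent("agent_a", 0.02), Agent("agent_b", 0.05), Agent("agent_c", 0.01)]
print(rent_agents("summarize earnings call", workers, minutes=30,
                  payer="orchestrator", ledger=ledger))
print(ledger.balances)  # fractions of a dollar settled per agent, near-instantly
```

The point of the toy is the shape of the flow, not the numbers: many tiny, cross-border settlements per task, each too small to justify a card swipe.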
Do we really expect the existing card-based payment value chain born in the 1960s to meet these requirements in cost, speed, precision and expressiveness? Will the future of all internet based exchange continue to be taxed at 3% + 35 cents every swipe?
That is not to say it will be a regulatory free-for-all. Agents will come under regulatory purview, just like corporations before them. Skyfire is already preparing for this future with its KYA (Know Your Agent) suite. Similar to KYB when onboarding new businesses, KYA vets agents – examining histories, past transactions and beneficial owners / creators. Agents who want to participate in the regulated economy will need to comply in order to provide services.
Agent-to-agent economic activity will soon dwarf that of other economic participants. On-chain payments will be a crucial facilitator for many of these use cases and are primed for an inflection in 2025.
Short the banks. Buy the swarms.
Decentralized Training: Wiring the Globe
Decentralized training is likely the biggest wildcard in DeAI. Initially discounted (including by yours truly), it has produced results over the past two quarters that should cause an updating of priors.
Both from on high (i.e. big tech exploring multi-data-center training runs) and from below (impressive results from Nous and Prime Intellect), low-bandwidth distributed training may upend the current supercluster paradigm and serve as the lynchpin necessary for a truly open and composable AI ecosystem.
To date, synthetic intelligence has proven to be a function of data and compute. Whichever system can aggregate and harness these inputs most effectively can offer a more performant solution to devs and users. So far, the centralized supercluster has prevailed, a performance advantage poised to remain for the foreseeable future. However, the stems of a new paradigm are slowly showing their green shoots through cracks in the cement.
Slowly. Slowly. Then all at once.
DisTrO 1.2b: Who is John Galt?
In a recent paper, Nous published their surprising results in training a 1.2b parameter model with SOTA levels of performance using 857x less bandwidth.
Training large-scale neural networks usually involves sharing gradients between all accelerators (GPUs, TPUs etc.), which requires (expensive) high-speed interconnects. Current techniques like Distributed Data Parallelism and Fully Sharded Data Parallelism require frequently sharing vast amounts of gradient data across potentially thousands of accelerators – basically requiring these accelerators to effectively be “hardwired into a single supercomputer”.
DisTrO has introduced a design, agnostic to telco network topology and neural net architecture, which reduces the inter-GPU communication requirements by four to five orders of magnitude. This enables “low-latency training of large neural networks on slow internet bandwidths with heterogeneous hardware” which, in the future, may bypass the need for high-speed interconnects entirely.
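To make the general idea concrete (without claiming to reproduce DisTrO's actual algorithm, which is not implemented here), the sketch below simulates the broad family of tricks these results rely on: many local optimization steps per node, with only a heavily compressed update (here, naive top-k sparsification on a toy objective) crossing the wire at each synchronization.

```python
# Minimal sketch of low-bandwidth data-parallel training. NOT the actual DisTrO algorithm:
# each worker takes several local SGD steps on a toy objective, then shares only a top-k
# sparsified update, so far less data crosses the network than full-gradient all-reduce.
import numpy as np

rng = np.random.default_rng(0)
DIM, WORKERS, LOCAL_STEPS, K = 10_000, 4, 20, 100  # only K of DIM values communicated

def local_grad(w: np.ndarray) -> np.ndarray:
    """Stand-in for a gradient on this worker's data shard (here: a noisy quadratic)."""
    return w + rng.normal(scale=0.1, size=w.shape)

def top_k(delta: np.ndarray, k: int) -> np.ndarray:
    """Keep only the k largest-magnitude entries of an update (a common compression trick)."""
    out = np.zeros_like(delta)
    idx = np.argpartition(np.abs(delta), -k)[-k:]
    out[idx] = delta[idx]
    return out

w_global = rng.normal(size=DIM)
for sync_round in range(10):
    updates = []
    for _ in range(WORKERS):
        w = w_global.copy()
        for _ in range(LOCAL_STEPS):              # cheap local compute, no communication
            w -= 0.05 * local_grad(w)
        updates.append(top_k(w - w_global, K))    # only K of DIM values leave the node
    w_global += np.mean(updates, axis=0)          # averaged, sparse synchronization
    print(f"round {sync_round}: |w| = {np.linalg.norm(w_global):.3f}")

# Naive DDP would ship DIM values per worker per step; here each worker ships roughly K per
# LOCAL_STEPS steps, i.e. on the order of (DIM * LOCAL_STEPS) / K less gradient traffic
# (ignoring index overhead). The real methods are far more sophisticated than top-k.
```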

Source: Nous Github
The paper shows an 857x reduction in bandwidth requirements vs. the standard “all-reduce” methods, but points to a potential full reduction of 1,000x – 3,000x during pre-training and up to a 10,000x improvement once post-training and fine-tuning are introduced.
Bandwidth Scaling: The team is not yet certain whether the ratio of bandwidth reduction is impacted by model size, but there is no fundamental reason it should not hold, or even become more favorable, as models grow larger.
If so: “given that the amount of data between accelerators is decoupled from the model size, a new possible scaling law arises: one where the model size is increased without increasing the communication bandwidth”.
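In our own illustrative notation (not the paper's), with P parameters, b bytes per parameter, and a reported reduction factor r, the shift would look roughly like:

```latex
% Illustrative notation (ours, not the paper's): P = parameter count, b = bytes per
% parameter, r = reported reduction factor, C = per-sync message size.
\text{Standard all-reduce DDP:}\quad B_{\text{sync}} \approx 2Pb = \Theta(P)
\text{DisTrO-style:}\quad B_{\text{sync}} \approx \frac{2Pb}{r},\qquad r \approx 10^{3}\text{–}10^{4}
\text{Claimed limit:}\quad B_{\text{sync}} \to C,\qquad \frac{\partial C}{\partial P} \approx 0
```

If the per-sync payload C really is (near) independent of P, communication stops being the binding constraint on model size, which is exactly the new scaling law the quote describes.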
Ladies and Gentlemen, we are going to wire every device on earth together to summon the demon.
Act II: INTELLECT-1
With INTELLECT-1, the first 10B-parameter language model collaboratively trained across continents, Prime Intellect raised the bar further. The 10x scale-up spanned five countries and three continents, using up to 112 H100 GPUs simultaneously over 42 days with 83% compute utilization (96% if distributed only across US-based nodes).

Source: Prime Intellect
The results also saw a sizable increase in communication reduction, requiring 400x less bandwidth compared to traditional data parallel training.
PRIME also demonstrated its flexibility in dynamic node participation during training, starting with 4 nodes and scaling up to 14 nodes during the run.

Act III: Return of the King
Not to be outdone, Nous came back with its own 15b parameter training run just one week later.
Nous Research announces the pre-training of a 15B parameter language model over the internet, using Nous DisTrO and heterogeneous hardware contributed by our partners at @Oracle, @LambdaAPI, @NorthernDataGrp, @CrusoeCloud, and the Andromeda Cluster.
This run presents a loss… pic.twitter.com/BAcyoSScGS
— Nous Research (@NousResearch) December 2, 2024
This is a one way train.
If Nous / Prime's “bandwidth scaling law” holds, the AI paradigm could be inverted: tilting the intellectual horsepower from vertical to horizontal approaches to scaling, tapping latent resources to relieve power grids, and saving on significant capex outlays by balancing loads between multiple data centers.
In reality, we will almost certainly pursue both simultaneously.
The breakthroughs also have implications for federated learning – a sub-field of ML which allows collaborative training of models while keeping each participant's data private, and which has, to date, been held back by high bandwidth requirements in training.
Teams like ChainOpera have been working on federated ML for years, producing high-quality 1b models while retaining data privacy. DisTrO and INTELLECT could significantly impact the scale of these efforts: allowing for larger training runs on a range of data sets, all in a privacy-preserving fashion.
Slowly, the pieces are coming together. We just need one more big unlock…
Data Siloes: The New Bottleneck
The last vector of DeAI where crypto could clearly move the needle is data collection and attribution. The fabled scaling laws which should give life to our hallowed US$1t cluster by 2030 may be hitting their first plateau, at least on the pre-training side.
🚨Ilya Sutskever finally confirmed
> scaling LLMs at the pre-training stage plateaued
> the compute is scaling but data isn’t and new or synthetic data isn’t moving the needle
What’s next
> same as human brain, stopped growing in size but humanity kept advancing, the agents and… pic.twitter.com/HPPiaTRw6C
— John Rush (@johnrushx) December 14, 2024
Time will soon tell whether Ilya is right or whether the 100k+ H100 and H200 training runs will kick the doors from their hinges. Regardless, even in a plateau scenario, narrowly defined “AGI” by 2030 is likely. After just two years, frontier models have passed average humans in many domains. From scaling inference compute, to RLAIF, to synthetic data, there are numerous promising vectors to push the field forward.
Like with Moore’s Law, given the benefits, capital and talent, the capabilities will grind higher.
However, the scale-up in compute is outpacing new data generation. The flops ramp marches on unabated, while most publicly available data online has been mined. Worse still, many of these initial data honey pots are now erecting barricades: filing intellectual property lawsuits and blocking data center IPs to stall the onslaught.
While synthetic data generation will likely prove the end game, our bet is that DeAI / DePIN related projects will have a role to play in the interim: with structural advantages in sourcing novel data sets which can help us push further up the wall.
We doubt things will get as crazy as the GPU gold rush of 2022 – 2024, which propelled Nvidia to become the world's most valuable company. However, standing between the world's largest companies – sitting on US$400b in cash and throwing off the equivalent of 1.5% of GDP in operating cash flow – and AGI, an existential risk with a TAM in the tens of trillions, is always a good place to be.
The numbers in this slide will be going up.

If data has become the bottleneck, projects – like certain DePINs and shared data layers – which have a structural advantage in sourcing and serving data, should be in for a strong 2025.
There are a few sandboxes to be hunting in.
Public Data
We highlighted Grass back in DeAI II, and it has since undergone TGE and been warmly received by the market, up >3x from its launch price at an FDV of ~US$3.3b.
The bear case for Grass is that text data has largely been commoditized, with Common Crawl's 15 trillion tokens broadly available, and that Grass becomes less relevant as the race shifts towards multi-modal data. There are also regulatory / enterprise-standards questions around “data scraping”: firms like Bloomberg and Reuters have very strict checklists for data streams and licensing around data integrity across the value chain, which may limit Grass' potential client base among traditional enterprises.
On the flip side, these concerns clearly have not held back the world’s largest companies in an existential race for AI supremacy, nor will they impact many open source / DeAI related efforts.
The bull case for Grass is the value of real-time data. While its half-life is incredibly short, real-time data is still extremely valuable, providing a “source of truth” essential to many actionable tasks.
For example, traditional models may be blocked or honey-potted when aggregating flight information. Ensuring this information (or any inventory) is up-to-date and accurate is essential. With its network of 2.5 million residential nodes scraping 50 – 100 terabytes of information daily, Grass is well-positioned to supply this service at a structural advantage compared to centralized competitors. Grass calls this “live context retrieval” (LCR) and is productizing the effort so any model can plug in and tap these capabilities, for a fee.
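From a developer's seat, the integration could look something like the sketch below; the endpoint, parameters, and response shape are hypothetical stand-ins, not Grass' actual LCR API.

```python
# Hypothetical sketch of "live context retrieval" from a model developer's point of view.
# The endpoint URL, parameters, and response shape are invented; the real API may differ.
import json
import urllib.request

LCR_ENDPOINT = "https://example.com/lcr/query"   # placeholder URL, not a real service

def live_context(query: str, max_results: int = 5) -> list[dict]:
    """Fetch freshly scraped, timestamped snippets relevant to the query."""
    payload = json.dumps({"query": query, "max_results": max_results}).encode()
    req = urllib.request.Request(LCR_ENDPOINT, data=payload,
                                 headers={"Content-Type": "application/json"})
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["results"]

def prompt_with_live_context(question: str) -> str:
    """Ground a model call in data retrieved minutes ago rather than stale training data."""
    snippets = live_context(question)
    context = "\n".join(f"[{s['retrieved_at']}] {s['text']}" for s in snippets)
    return f"Context (retrieved minutes ago):\n{context}\n\nQuestion: {question}"

# Example: an agent checking that flight inventory is current before booking.
# print(prompt_with_live_context("AMS -> LIS direct flights tomorrow, economy availability"))
```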
Grass is also not satisfied with data aggregation alone, believing their network is in a strong position to expand downstream towards labeling, structuring and other services to get more value from the data they collate.
They are also evolving alongside in-demand mediums: pushing from text into video, expanding the TAM significantly:
Grass, Ontocord, and LAION launch VALID
We’re excited to share the release of VALID (Video-Audio Large Interleaved Dataset) by the world-renowned teams at @ontocord and @laion_ai. VALID was built using the Grass Video Repository.
This dataset is comprised of 30 million audio… pic.twitter.com/4RjgWNDOng
— Grass (@getgrass_io) December 5, 2024
So far, Grass has emerged as one of the breakout DeAI data success stories. It will not be the last.
And there are larger, untapped data lakes to exploit…
Private Data
Accurate real-time data will be essential to many agentic use cases, yet a much larger, still untapped opportunity is private data.

Source: MidCentury
These large data repositories remain fragmented and untapped for a reason. On the enterprise side, data sharing often suffers from negative contribution bias, as entities with valuable data sets can more effectively monetize their data by keeping it exclusive. On the consumer side, the go-to-market is challenging and expensive.
DataDAOs were a clever idea when articulated in 2022 but have yet to inflect in line with expectations. And yet, the LLM revolution may be the crucial catalyst to unlock this opportunity. Capitalism is a powerful engine: like most commodities – when demand spikes, the global engine finds a way to source new supply.
“The solution to high oil prices is… high oil prices”
LLMs and their successors are unlocking a US$10 trillion TAM for AI developers globally. No longer will data be used in analytics merely as a “complement” to human systems; increasingly, it will substitute for labor entirely, leaving a much larger profit pool. Private data will be key to this unlock.
There are a few approaches aiming to build a shared data layer, perhaps the most privileged position from which to offer incremental services over time.
VANA
Vana is one such project. Started in 2018, it has since been backed by industry titans like Paradigm and has already onboarded 1.3m users.
The roadmap includes scaling up from personalized on-device LLMs, to sixteen different DataDAOs, to ultimately a community-owned foundational model trained on the shared data sets of >100m users, eyes set on the frontier labs.
DeAI’s holy grail would be a marriage of a shared data layer like Vana with a globally coordinated compute network – leveraging something like a scaled up DisTrO – aggregating unmatched data and compute pools.
While this vision increasingly seems possible, a more likely interim step leans into our modular intelligence thesis outlined in DeAI III which matches data and developers to craft an ecosystem of more specialized models.

A dynamic marketplace of tiered data contributions, flexible compute, and specialized models which can be mixed and remixed to create an economy of intelligence.
MidCentury (fka Magic)
While the leading Labs push out the frontier, the next wave of applications will involve repackaging this generalized intelligence layer into smaller models, honed for specific use cases with specialized data sets.
Using a PhD to solve every task is inefficient and often not performant.
Meta's stubborn commitment to open source (and hopefully, in time, scaled-up versions of low-bandwidth training research) has opened the door for open-source AI developers to enter the arena, but the next gating item is access to data.
Given data fragmentation, large tech ecosystems are once again at an advantage. That is where MidCentury comes in: aiming to provide an AI ecosystem where developers can openly monetize their AI primitives built on top of high-quality data.
How it works:
Data contributions have tiered permissions based on sensitivity: categorizing encryption requirements and rewards between public data, sensitive data, confidential, and highly sensitive. The data shared via a single link in a consumer application is validated and scored using TEEs and MPCs depending on the compute required.
Initial data treasuries targeted include: Finance (investment / transaction data), Social (Insta, Reddit, Youtube, Twitter, TikTok) and Health data (Apple watch, 8Sleep, Oura), expanding into new categories over time.
Developers can then train / tune specialized models across these multi-source data sets, with the underlying raw data obfuscated by a combination of TEEs, MPCs, and other privacy preserving methods. They can then offer a rev share or royalty agreement back to data contributors proportional to how much their data improved the underlying model. Not only could the underlying data sets be mixed in novel ways, but the models themselves become “lego pieces” to be remixed and combined.
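One illustrative way to operationalize “proportional to how much their data improved the underlying model” is a simple leave-one-out attribution, sketched below. To be clear, this is our toy example, not MidCentury's actual mechanism; eval scores and data-set names are invented.

```python
# Illustrative only: one simple way to split a revenue pool among data contributors in
# proportion to how much each data set improved a model (leave-one-out attribution).
# This is a sketch of the general idea, not MidCentury's actual reward mechanism.

def contribution_shares(full_score: float, leave_one_out: dict[str, float]) -> dict[str, float]:
    """Credit each contributor with the eval performance lost when their data is removed,
    normalized so the shares sum to 1."""
    raw = {name: max(full_score - score_without, 0.0)
           for name, score_without in leave_one_out.items()}
    norm = sum(raw.values()) or 1.0
    return {name: v / norm for name, v in raw.items()}

def royalties(revenue_pool: float, shares: dict[str, float]) -> dict[str, float]:
    return {name: round(revenue_pool * share, 2) for name, share in shares.items()}

# Hypothetical eval scores (higher is better) for a specialized finance model.
with_all_data = 0.78   # tuned on all three contributed data sets (vs. 0.61 baseline)
without = {"bank_transactions": 0.70, "broker_statements": 0.74, "reddit_finance": 0.77}

shares = contribution_shares(with_all_data, without)
print(shares)                       # transaction data earns the largest share here
print(royalties(1_000.0, shares))   # a monthly revenue pool split back to contributors
```

In practice a marketplace would likely want something more robust than leave-one-out (e.g. Shapley-style estimates), but the economic loop is the same: measure marginal lift, route royalties accordingly.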
The shared data layer (with the required data privacy guarantees) is perhaps the most crucial unlock for DeAI’s modular thesis: a mixture of a million experts of low-cost, hyper-efficient models which are efficiently routed and combined to perform ever more complicated tasks.
Private data is key to this vision, unlocked by consumers realizing the LLM onslaught is sending the value of their labor and their data in opposite directions.
Web Proofs!
Not to be forgotten (and covered more in-depth in DeAI IV), web proofs can also serve as a tool to break down data siloes across web2 ecosystems in a verifiable way, which can be used to fuel specialized models.

Web proofs help with the chicken-and-egg bootstrapping problem by vampire attacking web2 histories and reputations, allowing them to be ported to less extractive, open networks.
On-Chain Data: Indexers & Co-Processors
The other important data repository, which is open but surprisingly convoluted and unwieldy, is blockchain data itself. As chains have proliferated, data has fragmented across numerous chains, applications, and offline repositories, making it difficult to efficiently collate and query – capabilities essential to bringing web2-level personalization and agentic capabilities to web3.
Companies like Snowflake and Databricks have emerged as behemoths in web2, helping companies manage and analyze large data repositories. These databases, query engines, and ETL pipelines have been instrumental in enterprise analytics and ML use cases like targeting, personalization, dynamic pricing and more.
Creating similarly performant infrastructure, in a decentralized manner, across storage, indexing, and retrieval pipelines is no small task.
Today, the web3 data warehouse / indexing market is tiny, but demand for these tools should increase considerably during the remainder of the decade:

Historically, projects have tapped centralized offerings like Alchemy or Infura for these services. However, decentralized networks are now gaining share, not only because clients value the on-chain transparency and verification, but because horizontal scaling across a large number of nodes can offer a more performant and cost effective solution, an advantage which should increase as more nodes join the network.
The Graph – valued at ~US$2.6b FDV – is clearly the largest player in the category, but appears significantly overvalued relative to other emerging solutions. Players like Space & Time, Moralis, Squid, Covalent and others are poised to gain share and close this valuation gap over time.
And while a more efficient query engine for web3 data sets is clearly a high growth market, an even larger opportunity is bringing off-chain data and compute on-chain in a verifiable way.
Co-Processors
Smart contracts are not all that smart. As Space & Time outline in detail in their whitepaper, smart contracts have a fundamental inability to query data, even from the logs of blockchains they’re deployed on.
“For example, EVM contracts can only access wallet balance, information about the current transaction, some blockchain meta data, and their own limited internal memory. It’s important to understand that the data published by blockchains and the data accessible to smart contracts within the blockchain virtual machine are two completely different things. Full/archive nodes of blockchains store a wealth of data across the full history of the chain, but none of this data is accessible directly to smart contracts!”
As blockchains themselves are limited in the amount of data and compute available for querying, co-processors have emerged as a compelling solution: sitting next to major chains and supplementing smart contracts with additional processing capabilities in a verifiable way – enhancing liquidity provisioning, risk management, incentives targeting, governance and more.
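The pattern itself is simple, even if the verification machinery (ZK proofs, MPC, TEEs) is not: query historical data the contract cannot reach, compute off-chain, return the result with an attestation the consumer can check. A toy sketch below, with the chain data and attestation mocked purely for illustration.

```python
# Toy sketch of the co-processor pattern: off-chain query + compute over historical chain
# data, returned with an attestation a contract could verify. Everything here is mocked;
# a real co-processor would use ZK proofs, MPC, or TEE attestations rather than an HMAC.
import hashlib
import hmac
import json

HISTORICAL_LOGS = [   # stand-in for archive-node data a smart contract cannot read directly
    {"block": 1_000 + i, "pool": "ETH/USDC", "price": 3_000 + 10 * i} for i in range(100)
]
COPROCESSOR_KEY = b"demo-key"  # in reality: a proving key / enclave identity, not a shared secret

def coprocess_twap(pool: str, from_block: int, to_block: int) -> dict:
    """Compute an average price over a block range and attest to the result."""
    prices = [log["price"] for log in HISTORICAL_LOGS
              if log["pool"] == pool and from_block <= log["block"] <= to_block]
    result = {"pool": pool, "from": from_block, "to": to_block,
              "twap": sum(prices) / len(prices)}
    payload = json.dumps(result, sort_keys=True).encode()
    result["attestation"] = hmac.new(COPROCESSOR_KEY, payload, hashlib.sha256).hexdigest()
    return result

def onchain_verify(result: dict) -> bool:
    """What the consuming contract (or its verifier) would check before using the value."""
    claimed = result.pop("attestation")
    payload = json.dumps(result, sort_keys=True).encode()
    return hmac.compare_digest(
        claimed, hmac.new(COPROCESSOR_KEY, payload, hashlib.sha256).hexdigest())

res = coprocess_twap("ETH/USDC", 1_010, 1_050)
print(res["twap"], onchain_verify(res))  # the contract can now act on verified off-chain compute
```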

This marriage between blockchain trustlessness and big data / compute can allow for much smarter applications and agentic use cases on-chain. This will continue to be a hot area in 2025, spanning several approaches:

Though, in all likelihood, winning solutions will offer developers flexibility, combining different co-processor solutions depending on the use case in question. Practicality > Idealism.

Source: Florin Digital
Data is essential for the next generation of intelligent applications. To us, this part of the stack provides a compelling, lower risk way to ride an obvious trend.
New Data: Sensors & Compressed Reality
We are heading for a world in which every data input conceivable – text data, visual data, audio data, sensor data – will be recorded and fed into an online intelligence.
In a recent talk, Ben Fielding, cofounder of Gensyn, makes the argument that after efficient distributed training, the frontier will be entirely new types of foundational models.
Watch our co-founder, @fenbielding, explain how the future of machine learning will lead us to a continuously updated digital representation of the world on the internet, and why this requires shifts in the way we access and allocate computational resources. pic.twitter.com/qhUqFYZVGR
— gensyn (@gensynai) October 8, 2024
These models will serve as a powerful compression technology, taking sensory inputs and cramming them down into parameter space that a future someone or algorithm can interrogate. On a longer-term basis, the process will be continuous, turning much of the world into a compressed, real-time digital twin.
We will interact with this data through a model / agent layer: machines curating knowledge on our behalf, but the supply chain of inputs, curation, and remixes will have an economy of their own.
LLMs are effectively the means by which the Internet of Things accrues real value. As we outlined in DeAI II, the process of data effectively replacing labor is only just beginning.

While a host of data sensor networks in sizable TAMs have emerged – driver dashcams, CCTV cameras, satellites, drones and more – betting on any one winner is more challenging than finding ways to bet on the narrative as a whole.
Solana has been an outperformer in 2024, benefitting in part from its attractiveness to leading DePIN projects, which value its cost and speed advantage among the majors. Yet, at ~US$126b FDV, much of the “Solana the Monolith” $12 SOL alpha from @CeterisParibus has been farmed.
The emerging ecosystem play on “the machine economy” narrative that we expect to have a strong 2025 is peaq.
While it is early days, peaq has managed to build an impressive ecosystem as the “purpose-built” L1 for DePIN projects and machine RWAs, serving as the bridge between physical assets and on-chain activity. At ~US$2.2b FDV post its November TGE, it's no longer a steal, but it does appear to be making a credible push as the “go-to” L1 for DePIN / IoT-related projects. If it manages to take this crown from Solana, the ceiling should be much higher.
As an IoT-optimized chain, peaq has prioritized speed and cost above all else, rather than “nation-state censorship” resistance or other concerns a “global settlement” layer may have.
The chain handles 10,000 transactions per second today, with plans to move to 100,000 after upcoming upgrades, and fees of just $0.00025 per transaction.
Notable features which have attracted projects to the chain include:
- Peaq ID: an on-chain verifiable identifier designed for machines, devices, vehicles or robots which enables them to recognize, verify, and transact with each other
- Peaq Access: effectively more granular permissioning / programmability around access (e.g. restricting access to certain vehicles or communication channels to certain participants, etc.)
- Peaq Verify: a mechanism to verify data feeds using a combination of private key signatures, algorithmic pattern matching and oracles for cross-referencing data from multiple devices
- Peaq Pay: enabling different economic models through machine-to-machine-to-human payments, including micropayments and pay per use, effectively the economic substrate for the machine economy
The team has invested heavily in interoperability: using cross-chain bridges to facilitate asset and data transfers to other leading ecosystems and has recently integrated with Fetch for more agentic capabilities.
The core of any L1 is the ecosystem of projects and developers which choose to build on it, and peaq has made solid strides in 2024, attracting several large multinational corporations and 50+ DePIN projects across 21 industries, including several which have decamped from Solana:

The risk for peaq today is the classic low-float, high-FDV, constant sell pressure of many DePIN-oriented projects. With only 15% of tokens outstanding, constant 3% inflation must be soaked up, not to mention unlocks, which can lead to a bleed-out in any bear market. The limited monetization exhibited to date in Helium's IoT network is also a concern regarding the current TAM of many of these projects.
However, we do feel like we are finally reaching the “slope of enlightenment” for the internet of things, powered by real value capture via multi-modal models. DePINs are poised to help eat global services. In the event peaq becomes the L1 underpinning the machine economy, it seems like one of the few emerging L1s with a differentiated claim to “majors” status.
PART II: DePIN
Congratulations, we are now firmly in DePIN territory. Despite my sometimes loquacious writing style, I am not one to reinvent the wheel. If you are looking for a comprehensive overview of everything happening in DePIN across wireless, energy, services, compute infra and more in 2024, I would highly recommend Compound VC’s “DePIN’s Imperfect Present & Promising Future: A Deep Dive“.
Along the same lines, many “Virtual” DePIN subsectors related to DeAI infrastructure – decentralized training, compute, storage, bandwidth, and the data value-chain – were covered in DeAI II: “Seizing the Means of Production”.
As such, this Year Ahead will be more selective: focusing primarily on physical DePIN networks, emerging sectors with promise, and frameworks / case studies to assess leading projects.
D.O.G.E. The Unexpected Timeline
To date, DePIN has been handcuffed by a regulatory paradox: the large markets where its impact is most sorely needed are inefficient often because of burdensome regulations, regulations which hinder experimentation from new entrants.
Finance, telco, energy, healthcare… are all industries, particularly in the United States, which tend to have poor service and bloated costs, often due to regulatory capture – prime targets for DePINs.
Many of these industries have ossified around industrial era infrastructures which rely on a “hub and spoke” distribution model when 21st century tools allow for algorithmically guided “mesh-like” solutions which can complement and, eventually, replace much of the existing telco infra, energy grids, healthcare delivery networks, and financial intermediaries.
I can't help but think a Trump administration will be more accommodating to disruption of the status quo. And while much of the analysis will touch on US industry, where legacy infrastructure and a shifting regulatory climate around crypto should provide tailwinds, many attributes for launching a successful DePIN network – aggressive GTM, gamified incentives, hardware supply chain access – are areas where Chinese entrepreneurs and other APAC teams should be quite competitive.
Decentralized Energy: Rewiring The Grid
The energy sector presents a compelling case study in technological ossification — a vertically integrated industry whose fundamental architecture remains largely unchanged since the days of Thomas Edison. While this system has served admirably for over a century, mounting evidence suggests we’re approaching an inflection point where the traditional hub-and-spoke model may no longer be fit for purpose. Before entertaining how DePin can augment the transition to a “smarter” and more resilient electric grid, let’s first explore how our existing system is fundamentally broken.
The Structural Challenge of America’s Electric Grid
The U.S. electric grid represents a fascinating paradox of modern infrastructure — simultaneously one of humanity’s most complex engineering achievements and one of its most antiquated operational systems. Split into three major interconnections – East, West, and Texas – the grid operates through a hierarchical system where electricity flows in a single direction: from large centralized power plants through high-voltage transmission lines, to local distribution networks, and finally to end consumers.

This one-way delivery system, designed in the early 20th century, reflects an era when energy generation was purely centralized and consumer needs were simpler. Today, the system’s antiquated nature is becoming increasingly reflected in the delta between generation economics and delivery costs. In other words, while technological advancement has dramatically reduced the cost of power generation — driven by efficiency gains in both renewable sources and natural gas — the cost of delivery has followed an entirely opposite trajectory. This divergence manifests in consumer bills, where delivery charges now frequently exceed 50% of total costs in markets like California.

However, this asymmetry points to a more fundamental problem. The grid's inherent rigidity is insufficient in an era that demands unprecedented flexibility. The requirement to maintain precise frequency synchronization (60 Hz in the United States) across vast interconnected networks becomes increasingly precarious as infrastructure ages. With 70% of transmission infrastructure exceeding its intended design life, the United States now experiences more frequent disruptions than any other developed nation. This was illustrated by the 2021 Texas grid crisis, whose economic impact ($80-130 billion) exceeded that of many natural disasters.
Crucially, this fragility comes at precisely the wrong moment as we face unprecedented demand growth from three secular trends:
- The Compute Revolution: The explosion in artificial intelligence and data center requirements has reached such intensity that technology giants are exploring private nuclear facilities — a remarkable departure from traditional infrastructure models.
- Electrification’s Second Wave: The transition to electric vehicles, heat pumps, and electrified industrial processes represents not merely a quantitative increase in power demand, but a qualitative shift in consumption patterns.
- Industrial Resurgence: The reshoring of manufacturing capacity to America introduces new industrial power requirements with distinct characteristics from the service-sector loads that dominated recent decades.
Perhaps most challenging is the integration of renewable energy sources into the rigid, one-way nature of our grid. Intermittent energy sources such as solar and wind, while essential for our future, introduce new variables into a system that wasn’t designed for fluctuating supply. This confluence of aging infrastructure, rising demand, and the need to integrate new energy sources has increasingly transformed grid modernization from an infrastructure challenge into a national security imperative in 2025.
The Evolution Toward Distributed Architecture
The path forward lies not in incrementally reinforcing an aging paradigm, but in fundamentally reimagining grid architecture. The future grid must evolve from a deterministic, hierarchical system into a probabilistic, distributed network capable of dynamic self-optimization.
This transformation is already manifesting through the proliferation of Distributed Energy Resources (DERs). These assets — ranging from residential solar installations and storage systems to smart thermostats and vehicle charging infrastructure — represent not merely additional generation capacity, but the emergence of a new grid topology. Unlike traditional power plants, DERs operate at the grid’s edge, closer to actual demand centers. In addition to reducing transmission losses, this locality makes the grid fundamentally more reliable.
However, DERs introduce their own complexities. The traditional grid's unidirectional power flow reflected a simplicity – both engineering and economic – where every component from transformers to protection systems was optimized for one-way delivery. DERs fundamentally disrupt this model by creating dynamic, bidirectional power flows that challenge basic assumptions around voltage regulation, system protection, and grid stability. The economic implications are equally notable, as traditional utility rate structures, designed around simple volumetric consumption, prove inadequate for a world where consumers both produce and consume power. This is particularly evident during events like the duck curve, where abundant solar generation during low-demand periods creates negative pricing events.

The key to managing this complexity lies in the emergence of “smart grid” capabilities that enable dynamic balancing across distributed nodes. This requires advances across three critical domains:
- Advanced Energy Storage Systems: The foundation of a distributed grid lies in its ability to time-shift energy across multiple scales. While lithium-ion batteries capture public attention, they represent just 2% of current grid storage capacity, with pumped hydro dominating at 94%. True grid resilience demands a diverse storage ecosystem:
- Residential batteries for daily cycling
- Grid-scale facilities for bulk energy shifting
- Long-duration technologies like iron-air for seasonal storage
- Ultra-capacitors for frequency regulation
- Real-Time Control Systems: Current grid operations often resemble a black box, with many utilities unable to detect outages until customer reports arrive. Next-generation infrastructure demands:
- Advanced metering infrastructure (AMI)
- Distributed sensor networks
- AI-driven control systems
- Predictive maintenance capabilities
- Dynamic Market Mechanisms: Traditional rate structures cannot capture the complexity of millions of prosumers trading energy services in real-time (a toy dispatch sketch follows this list). Modern grids require sophisticated markets for:
- Real-time energy trading
- Ancillary services procurement
- Demand response program coordination
- Capacity rights allocation
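The toy dispatch sketch referenced above is deliberately simple: a home battery reacting to a real-time price feed, charging through the duck curve's cheap or negative hours and discharging into the evening peak. Prices, thresholds, and hardware parameters are invented for illustration; real VPP logic also has to respect forecasts, network constraints, and market rules.

```python
# Minimal sketch of price-responsive dispatch for a grid-edge battery. All numbers are
# invented for illustration; this is the general shape of the logic, not any project's code.
from dataclasses import dataclass

@dataclass
class Battery:
    capacity_kwh: float = 13.5
    soc_kwh: float = 6.0          # current state of charge
    max_power_kw: float = 5.0     # charge/discharge limit per hour

def dispatch(battery: Battery, price_per_kwh: float,
             buy_below: float = 0.08, sell_above: float = 0.30) -> tuple[str, float]:
    """Charge when energy is cheap, discharge to the grid when it is expensive, else hold."""
    if price_per_kwh <= buy_below and battery.soc_kwh < battery.capacity_kwh:
        energy = min(battery.max_power_kw, battery.capacity_kwh - battery.soc_kwh)
        battery.soc_kwh += energy
        return "charge", -energy * price_per_kwh          # cost to the homeowner
    if price_per_kwh >= sell_above and battery.soc_kwh > 0:
        energy = min(battery.max_power_kw, battery.soc_kwh)
        battery.soc_kwh -= energy
        return "discharge", energy * price_per_kwh        # revenue to the homeowner
    return "hold", 0.0

# A stylized day: cheap/negative midday prices (duck curve) and an expensive evening peak.
hourly_prices = [0.12, 0.10, 0.07, -0.02, -0.05, 0.03, 0.15, 0.42, 0.38, 0.20]
battery, earnings = Battery(), 0.0
for hour, price in enumerate(hourly_prices):
    action, cashflow = dispatch(battery, price)
    earnings += cashflow
    print(f"hour {hour:02d}: price {price:+.2f} -> {action:9s} soc={battery.soc_kwh:.1f} kWh")
print(f"net earnings: ${earnings:.2f}")
```

Aggregate thousands of these devices behind real-time price signals and you have the raw material for the virtual power plants described below.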
So what does this all have to do with DePIN?
DePIN’s Transformative Potential
While the vision of a distributed, intelligent grid is compelling, its realization faces two fundamental bottlenecks: data fragmentation and hardware coordination. The current energy landscape is fragmented across more than 3,000 U.S. utilities, many still relying on rudimentary systems for operational management. This fragmentation renders real-time grid optimization virtually impossible. Simultaneously, the challenge of incentivizing widespread DER adoption without clear economic benefits has limited the penetration of crucial grid-edge resources.
DePIN architectures offer an elegant solution to both constraints. Several pioneering projects demonstrate how crypto-economic incentives can reshape fundamental grid operations:
- Daylight exemplifies the potential for market-driven grid optimization. Their virtual power plant (VPP) platform modernizes traditional demand response programs by creating a dynamic marketplace for distributed energy resources. The innovation lies not merely in aggregating residential storage capacity, but in the introduction of real-time price signals that allow homeowners to delegate their energy assets to the highest-value use case at any moment. This creates a more responsive and efficient grid while generating previously uncapturable value for participants.
- Glow approaches the renewable energy transition through a dual-token mechanism that elegantly solves the renewable energy financing challenge. Their system, which combines a fixed-inflation token (GLW) with tokenized carbon credits (GCC), creates a sophisticated market for differentiating between solar projects requiring subsidization and those capable of independent operation. The weekly distribution of 230,000 GLW tokens across solar farms, grants, governance, and verification creates a self-sustaining ecosystem for renewable energy expansion. Their emergence as the revenue leader in the DePin sector validates the model’s economic viability.
⚡️ Top #DePIN Projects by Revenue (30d)#DePINs is an acronym for Decentralized Physical Infrastructure Networks, which use crypto-incentives to efficiently coordinate the buildout and operation of critical infrastructure.#Glow – $14.0 M#ionet – $1.00 M#Filecoin– $763 K… pic.twitter.com/1jDMfxbyso
— 🇺🇦 CryptoDep #StandWithUkraine 🇺🇦 (@Crypto_Dep) December 11, 2024
- Fuse exemplifies a novel approach to energy market transformation through vertical integration. Operating utility-scale renewable plants while serving thousands of UK households as a regulated supplier, they’re pioneering a full-stack model that minimizes inefficiencies between generation and consumption. As core contributors to Project Zero, they’re tackling two critical challenges: consumer transition inertia and retail energy distribution fragmentation. Their platform leverages token incentives to encourage demand response participation and DER adoption, while simultaneously operating as a virtual power plant that can aggregate distributed resources for grid services. This represents a fundamental shift from the industry’s traditional horizontal aggregation model, where companies typically specialize in single segments like generation, transmission, or retail.
- Uranium Digital merits particular attention for its novel approach to nuclear markets. By introducing crypto rails to uranium trading, they’re addressing a crucial bottleneck in the nuclear energy renaissance. Their dual-track system, accommodating both physically-settled institutional trades and synthetic retail exposure, could dramatically improve price discovery and capital efficiency in what has historically been an opaque market. This becomes particularly relevant as data center power demands drive renewed interest in nuclear generation.
- Plural Energy approaches the energy transition from a different angle, focusing on democratizing clean energy investment opportunities. Their SEC-compliant platform addresses the $4 trillion annual funding gap for grid modernization by making previously institutional-only investments accessible to retail participants. This represents not merely a democratization of access, but a fundamental reimagining of how energy infrastructure is funded.
Future Trajectory: Convergence and Divergence
As we enter 2025, several compelling forces are converging to accelerate the adoption of decentralized energy solutions. The increasing strain on traditional infrastructure, combined with technological maturation and regulatory evolution, creates fascinating opportunities for DePIN projects:
- AI-Driven Demand Transformation: The projected doubling of data center energy consumption by 2026 represents a massive quantitative increase in demand. This unprecedented load growth will continue to force innovation in energy delivery systems.
- Regulatory Architecture Evolution: FERC’s interconnection queue reforms and ERCOT’s “connect and manage” approach provide early templates for more dynamic grid integration frameworks. These regulatory innovations could carve out pathways for DePIN adoption.
- DER Network Effects: The accelerating deployment of distributed energy resources creates natural demand for coordination platforms. This organic growth in grid-edge assets provides ready-made markets for DePIN solutions.
- Utility Partnership Dynamics: Early collaborations between traditional utilities and blockchain platforms suggest potential for hybrid models that combine incumbent infrastructure with decentralized coordination layers.
- Supply Chain Catalysis: Paradoxically, persistent supply chain constraints for traditional grid components are accelerating the adoption of distributed solutions, as smaller, localized projects face fewer bottlenecks than centralized infrastructure initiatives.
However, these opportunities must be weighed against significant challenges that echo historical attempts at energy market transformation:
- Regulatory Uncertainty: The absence of clear frameworks for tokenized energy assets creates implementation risk, particularly given energy’s classification as critical infrastructure.
- Technical Integration Complexity: The challenge of incorporating bidirectional power flows into century-old grid architecture remains formidable, requiring unprecedented coordination with existing utilities.
- Incumbent Resistance: Traditional utilities, operating under guaranteed return models, have limited incentives to embrace innovations that might complicate their operations or threaten their business model.
- Market Liquidity Dynamics: Drawing parallels to Enron’s attempted creation of liquid energy trading markets in the 1990s, many projects face fundamental challenges in maintaining sufficient liquidity for emerging tokenized markets.
- Carbon Market Durability: The sustainability of carbon credit markets remains uncertain, with historical examples of market manipulation and verification challenges threatening projects built around carbon monetization.
- Protocol Standardization: The absence of unified protocols across thousands of utilities, combined with cybersecurity concerns from IoT integration, presents significant technical hurdles.
- Token Economic Sustainability: Projects dependent on inflationary token rewards face questions about long-term sustainability, particularly in maintaining network participation as initial distribution schedules decline.
The intersection of these opportunities and challenges suggests that while the potential for energy DePIN is immense, success requires careful navigation of both technical and macro considerations. The lessons from previous attempts to revolutionize energy markets, from Enron’s trading platforms to early smart grid initiatives, provide crucial guidance for this new generation of decentralized energy solutions.
Decentralized Wireless (DeWi): The Path to Mass Adoption
The telecommunications industry stands at an inflection point. After decades of centralized control by a handful of carriers who built empires on spectrum licenses and regulatory capture, we’re witnessing the emergence of a new paradigm. The transformation centers on a fundamental question: can decentralized coordination solve the structural inefficiencies plaguing modern telecommunications?
To understand the opportunity, we must first examine the industry’s architecture. As outlined by the team at Compound VC, telecommunications operates across three distinct tiers:
- Mobile Wireless – The infrastructure enabling cellular connectivity through an extensive network of towers, base stations, and handoff protocols
- Fixed Bandwidth – The physical backbone of guaranteed high-speed internet, traditionally delivered through fiber optic or coaxial infrastructure
- WiFi – The local area networking layer operating in unlicensed spectrum bands
Each tier faces unique challenges, but perhaps none more interesting than mobile wireless, where structural inefficiencies create compelling opportunities for disruption.
Consider Verizon, ostensibly a money-printing machine:
- $108.3B in productive physical assets
- $134.0B in annual revenue
- ~59% gross margins
- $47.8B in EBITDA
At first glance, this appears to be a quintessential utility business – converting physical infrastructure into predictable cash flows. However, as highlighted by the EV3 team, deeper analysis reveals a more nuanced reality. The business model suffers from four critical constraints that fundamentally undermine its unit economics:
- Spectrum Burden – The invisible foundation of wireless communication comes at an extraordinary cost. Verizon carries $155.7B in spectrum licenses on its balance sheet – more than its productive assets. As the team at EV3 elegantly put it, by auctioning off spectrum, “governments effectively grant themselves a preferred equity claim on telecom cash flows”.
- Operating Complexity – Running a nationwide wireless network requires a sprawling corporate infrastructure. In 2023, Verizon spent $87.6B (65.4% of revenue) on operations across network maintenance, customer service, equipment subsidies, and administrative overhead. This creates significant operational leverage but also introduces brittleness into the business model.
- Perpetual Capital Requirements – Each technology cycle (3G→4G→5G) demands massive new investment independent of business conditions. Verizon spent $18.8B on capex in 2023 alone, with $45.5B allocated just to 5G spectrum. This creates a treadmill effect where carriers must continuously reinvest simply to maintain competitive parity.
- Structural Leverage – These capital requirements necessitate significant debt financing. Verizon carries $150.7B in debt with $5.5B in annual interest expense. Rising rates have pushed their average cost of debt from 3.7% to 4.9% in a single year. This leverage amplifies returns in good times but creates significant fragility.
The compounding effect of these constraints is striking – of every dollar in productive assets, only $0.04 flows to equity holders. The remainder is consumed by spectrum costs, operations, capex, and debt service. This creates a peculiar dynamic where telecom companies must take on additional leverage simply to maintain dividend payments.

What makes this system particularly noteworthy is that it appears to be reaching structural limits just as demand is accelerating. The confluence of AI compute requirements, IoT proliferation, and industrial reshoring is driving unprecedented demand for connectivity. Yet the traditional carrier model, with its massive fixed costs and rigid architecture, appears increasingly ill-suited to meet these emerging needs.
This creates a compelling opportunity for decentralized alternatives. The question is not whether the current system is inefficient; it is whether crypto-economic incentives can coordinate distributed infrastructure deployment more effectively than vertical integration.
DePIN’s Solution to the Mobile Challenge
The elegance of decentralized wireless networks lies not in any single technological breakthrough, but in their ability to fundamentally restructure the economics of telecommunications delivery. Projects like Helium demonstrate how crypto-economic incentives can coordinate distributed infrastructure in ways that address each of the traditional model’s core constraints:
- Capital Efficiency – While traditional carriers must centrally fund and maintain all network infrastructure ($108.3B in Verizon’s case), DePIN networks distribute these costs across thousands of independent operators. This transforms network growth from a centrally-planned exercise into an organic, market-driven process responding to actual demand signals.
- Spectrum Arbitrage – By leveraging unlicensed or shared spectrum bands like CBRS, DeWi networks can circumvent one of the industry’s largest fixed costs. This isn’t merely cost avoidance – it represents a fundamental shift in how wireless resources are allocated, from bureaucratic auctions to dynamic market mechanisms.
- Operating Model Innovation – The traditional carrier model requires massive corporate overhead – retail locations, call centers, marketing departments. DePIN networks replace this with protocol-level coordination, where token incentives dynamically adjust to optimize network growth and operations.
- Financial Architecture – Instead of relying on debt markets to fund expansion, DeWi networks leverage token incentives to coordinate capital formation. This transforms telecommunications from a leverage-driven utility business into a market-coordinated protocol.
Perhaps most interesting is how these networks match supply and demand. While traditional carriers must maintain uniform coverage and peak capacity everywhere, DeWi networks can dynamically adjust incentives to direct resources where they're most needed.
Helium’s Discovery Mapping system exemplifies this approach. By collecting granular data on coverage gaps, the network can increase token rewards in high-demand areas, organically incentivizing infrastructure deployment. High-usage zones naturally attract more operators due to higher rewards, while low-usage areas require minimal investment.
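Our rough mental model of that mechanism is sketched below: boost rewards where usage is high but coverage is thin, damp them where hotspots are already dense. The formula and numbers are invented for illustration and are not Helium's actual reward math.

```python
# Illustrative sketch of demand-weighted coverage incentives. NOT Helium's actual reward
# formula; just the general shape of steering deployment toward under-served, high-usage areas.

def reward_multiplier(daily_data_gb: float, active_hotspots: int,
                      base: float = 1.0, cap: float = 5.0) -> float:
    """More demand per existing hotspot => larger multiplier for the next deployer."""
    demand_per_hotspot = daily_data_gb / max(active_hotspots, 1)
    return min(cap, base + demand_per_hotspot / 10.0)

coverage_hexes = {                 # (daily offloaded data in GB, hotspots already serving it)
    "downtown_stadium":   (420.0, 3),    # huge demand, barely covered -> strong incentive
    "suburban_mall":      (90.0, 6),
    "rural_highway_exit": (4.0, 1),
    "dense_city_core":    (300.0, 60),   # already saturated -> little extra reward
}

for hex_id, (data_gb, hotspots) in coverage_hexes.items():
    print(f"{hex_id:20s} multiplier = {reward_multiplier(data_gb, hotspots):.2f}x")
```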
This dynamic scaling ability, combined with fundamentally lower fixed costs, enables DeWi networks to profitably offer services at 70-80% below incumbent pricing. Helium’s $20 unlimited plans compete against traditional $60-90 offerings while maintaining sustainable unit economics.
However, several critical questions remain:
- Regulatory Durability – A significant portion of the cost advantage derives from sidestepping spectrum auctions. It’s unclear whether regulators will continue allowing this as these networks scale.
- Token Sustainability – The model assumes token incentives maintain sufficient value to drive network growth. A sustained decline in token prices could undermine the core coordination mechanism.
- Technical Complexity – Integrating thousands of independent nodes while maintaining carrier-grade reliability presents significant engineering challenges.
- Capital Intensity – While the model reduces upfront requirements, ongoing maintenance and upgrades still demand significant capital. The question remains whether token incentives alone can sustainably fund this.
While Helium pioneered the DeWi model, new entrants are exploring distinctive approaches to network architecture:
- Karrier One is attempting to solve the regulatory challenge by focusing exclusively on underserved markets. Built on Sui, they combine carrier-grade 5G infrastructure with novel payment rails through their KarrierKNS system. By targeting areas traditional telcos ignore, they potentially face less regulatory scrutiny while serving genuine market needs.
- Really takes the opposite approach – instead of competing on cost, they focus on privacy and security. Their $129/month premium offering includes comprehensive security features like anti-ransomware protection, SIM swap prevention, and encrypted communications. This suggests DeWi networks can target premium segments where margins better support infrastructure investment.
These specialized approaches validate a broader thesis: rather than directly challenging incumbent carriers across their entire footprint, DeWi networks may find success targeting specific use cases and market segments where their structural advantages are most compelling.
This pattern of targeted disruption extends beyond mobile wireless into another crucial domain of telecommunications: fixed bandwidth infrastructure. While mobile networks struggle with spectrum costs and coverage requirements, the challenges in fixed connectivity center on a different constraint entirely – the economics of the last mile.
Fixed Bandwidth: Reconsidering The Last Miles
The internet lies at the intersection of atoms and bits — invisible packets routed through three distinct tiers of infrastructure:
- Tier 1 providers handle backbone infrastructure – the submarine cables and major interconnections that stitch together continents
- Tier 2 providers run regional fiber networks connecting data centers and key distribution points
- Tier 3 providers tackle “the last mile” – bridging the gap from distribution points to individual buildings
This topology has resulted in a historically monopolistic market structure where providers extract economic rents from owning the capital-intensive last mile. However, beneath this model lies a striking inefficiency. While bandwidth at Tier 2 data centers costs pennies per gigabit, consumers pay orders of magnitude more, with over 70% of costs concentrated in the final connection.
Importantly, this is not merely a question of margins. Instead, the immense capital requirements of laying fiber ($2500-4000 per household), regulatory capture that limits competition (52% of Americans have access to only one provider), and infrastructure paradigms inherited from the landline era combine to foster a dynamic whereby artificial scarcity persists despite abundant upstream capacity.
Andrena recognized that this inefficiency stemmed not from fundamental physical or economic limits, but from an architectural assumption: that the last mile required fixed, point-to-point connections. By rethinking this premise through modern wireless technology, they’ve pioneered a model that could restructure internet delivery economics.
The core insight centers on three converging technological shifts:
- The FCC’s release of 1.2 GHz of 6 GHz spectrum in 2020 enabled multi-gigabit wireless connections
- Advances in beam-forming and point-to-multipoint technology dramatically reduced equipment costs
- Software-defined networking allowed for automated optimization at scale
Andrena combines these capabilities into a new delivery paradigm. Their proprietary RaaS (Robotic Antenna System) technology enables automated aiming and optimization of wireless nodes, eliminating specialized technician requirements. A cloud-native routing stack reduces equipment costs while enabling multi-gigabit throughput. And most importantly, by partnering with building owners, they transform physical real estate into potential distribution hubs.
The model's efficacy is evident in the numbers. Today, Andrena has over 10,000 subscribers across 10 states, delivering gigabit service at 30-50% below incumbent pricing, proof that the technology now exists for a distributed architecture to supersede the traditional hub-and-spoke model.
Andrena has since launched Dawn, a decentralized protocol that aims to augment the scalability of this model. Dawn’s innovation is recognizing that the coordination problem of building distributed networks – historically solved through vertical integration – could potentially be better addressed through crypto-economic incentives.
The protocol introduces several key mechanisms (a toy sketch of the staking mechanism follows the list):
- A medallion system allowing token holders to stake to specific geographic regions, creating market-driven deployment incentives
- Proof of backhaul enabling trustless verification of service quality
- Automated frequency coordination optimizing spectrum usage
- Economic rights aligning stakeholder incentives across the network
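For intuition, here is a minimal, hypothetical sketch of what a geographic staking ("medallion") mechanism could look like. The class, region names, demand figures, and reward formula are our own illustrative assumptions, not Dawn's actual implementation.

```python
# Hypothetical sketch of a geographic staking ("medallion") mechanism.
# Names, parameters, and the reward formula are illustrative assumptions,
# not Dawn's actual protocol logic.
from collections import defaultdict

class MedallionRegistry:
    def __init__(self):
        self.stakes = defaultdict(float)   # region -> total tokens staked
        self.demand = defaultdict(float)   # region -> estimated unserved households

    def stake(self, region: str, amount: float) -> None:
        """Token holders stake to a region, signalling where buildout should happen."""
        self.stakes[region] += amount

    def report_demand(self, region: str, households: float) -> None:
        """Oracle or survey data on unserved households in a region."""
        self.demand[region] += households

    def deployment_weights(self) -> dict:
        """Split an epoch's reward budget across regions in proportion to
        stake-weighted unmet demand, so capital flows where it is most needed."""
        scores = {r: self.stakes[r] * self.demand.get(r, 0.0) for r in self.stakes}
        total = sum(scores.values()) or 1.0
        return {r: s / total for r, s in scores.items()}

registry = MedallionRegistry()
registry.stake("jersey-city", 50_000)
registry.stake("philadelphia", 20_000)
registry.report_demand("jersey-city", 1_200)
registry.report_demand("philadelphia", 3_000)
print(registry.deployment_weights())  # {'jersey-city': 0.5, 'philadelphia': 0.5}
```

The point is simply that stake plus demand signals can be turned into deployment weights, replacing a central planner's buildout map with a market-driven one.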
The vision extends beyond merely cost reduction. By replacing basic modems with high-performance computing nodes, Dawn aims to transform every household into a potential participant in the decentralized cloud economy. At scale, this could effectively turn bandwidth into a tokenized commodity – particularly salient as we move toward an increasingly agentic economy where compute, bandwidth, energy and storage become the core scarce resources.
Ironically, the model's most obvious challenger doesn't come from traditional telcos but from space. Starlink's rapidly expanding constellation (5000+ satellites with plans for 42,000) represents both a validation of alternative delivery models and a potential existential threat.
However, it is worth noting that several structural factors favor terrestrial networks in dense environments:
- Physics Favors Proximity – While Starlink has made impressive strides in latency reduction, the speed of light remains undefeated. For applications requiring ultra-low latency, terrestrial networks maintain inherent advantages.
- Bandwidth Density – Satellite capacity must be shared across wide geographic areas. In dense urban environments, terrestrial networks can deliver more bandwidth per square kilometer at lower cost.
- Economic Stratification – At $90-110 monthly, Starlink occupies a premium segment. DeWi networks can target value-conscious consumers while maintaining healthy margins.
This suggests less of a winner-take-all scenario than emergent specialization. Rural areas naturally favor satellite coverage, while dense urban environments favor terrestrial solutions. The probable outcome is a hybrid topology optimized for different use cases and geographies.
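To put the "physics favors proximity" point in rough numbers, here is a back-of-envelope comparison of pure propagation delay. The altitude and distance figures are illustrative assumptions, and real-world latency adds switching, queuing, and non-ideal routing on top of these physical minimums.

```python
# Back-of-envelope propagation delay: LEO satellite hop vs metro fiber.
# Altitude/distance figures are illustrative; real latency adds processing,
# queuing and non-ideal routing on top of these physical floors.
C = 299_792              # speed of light in vacuum, km/s
FIBER_SPEED = C * 0.67   # light in glass travels ~2/3 of c

leo_altitude_km = 550    # typical Starlink shell
metro_fiber_km = 50      # user to a nearby interconnect in a dense city

# User -> satellite -> ground gateway is at least two space legs, each way.
leo_min_rtt_ms = 4 * leo_altitude_km / C * 1000
fiber_rtt_ms = 2 * metro_fiber_km / FIBER_SPEED * 1000

print(f"LEO minimum RTT (space segment only): {leo_min_rtt_ms:.1f} ms")    # ~7.3 ms
print(f"Metro fiber RTT over {metro_fiber_km} km: {fiber_rtt_ms:.2f} ms")  # ~0.5 ms
```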
Dawn’s bet is ultimately on coordination efficiency – that decentralized networks, properly incentivized, can solve the last mile problem more effectively than either traditional ISPs or satellite constellations. While significant challenges remain around scaling, reliability and regulation, the potential to dramatically reduce costs while expanding access makes this a compelling experiment in market design.
The next few years will be crucial in determining whether this hypothesis holds.
Network Infrastructure: The Hidden Bottleneck
While Dawn addresses the last mile, another equally fundamental constraint exists within modern distributed systems: network communication itself. The internet’s architecture, optimized for cost rather than performance, introduces inefficiencies that become particularly acute in high-stakes applications.
Traditional internet infrastructure routes data packets based on economic rather than technical optimization. In other words, packets take paths that are cheapest, not fastest. This creates several critical limitations:
- Latency Variance – Packets between the same endpoints may take wildly different routes each time, introducing significant “jitter”
- Congestion Risk – Public internet routes frequently saturate, leading to packet loss and degraded performance
- Limited Prioritization – Traffic is treated homogeneously, without distinguishing between time-critical and standard communications
These constraints are particularly problematic for high-performance distributed systems like blockchain networks, where milliseconds can be meaningful in the context of settlement time. The challenge is compounded by increasing demands from AI training, content delivery networks, and other bandwidth-intensive applications.
This is what we mean when we talk about infrastructure pic.twitter.com/aQGU6bV1hK
— DZ (@doublezero) December 10, 2024
Enter DoubleZero, a decentralized network infrastructure designed to solve these fundamental networking problems by tapping into an underutilized resource: private fiber-optic networks. Their insight stems from a structural inefficiency – up to 60% of fiber in the United States sits dark, unutilized.
The model is elegantly simple:
- High-frequency trading firms and other enterprises often maintain private fiber links operating at 20% capacity
- These institutions can “contribute” excess bandwidth to the DoubleZero network
- Token incentives compensate participants for sharing bandwidth
- The protocol aggregates these private links into a cohesive, high-performance communication fabric
Like most DePIN networks, the elegance of this model lies in more effectively matching supply and demand. Infrastructure owners can monetize otherwise idle capacity. Network-dependent applications gain access to premium routes. And the protocol itself creates a coordination layer that transforms disconnected private infrastructure into a unified resource.
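To make the idea concrete, here is a toy sketch of how a protocol might select routes across contributed private links: build a graph of the links and pick the lowest-latency path rather than the cheapest one. The cities, links, and latencies are invented for illustration; this is not DoubleZero's actual routing logic.

```python
# Toy illustration: aggregate contributed private links into a graph and pick
# the lowest-latency path, rather than the cheapest path the public internet
# would choose. Link names and latencies are made up for illustration.
import heapq

# (from, to, latency_ms) – contributed private fiber links plus a public route
links = [
    ("NYC", "CHI", 9.0), ("CHI", "SLC", 18.0), ("SLC", "LAX", 10.0),
    ("NYC", "LAX", 65.0),   # congested public internet path
]

graph = {}
for a, b, ms in links:
    graph.setdefault(a, []).append((b, ms))
    graph.setdefault(b, []).append((a, ms))

def fastest_path(src, dst):
    """Dijkstra over the contributed-link graph, minimizing latency."""
    dist, prev, pq = {src: 0.0}, {}, [(0.0, src)]
    while pq:
        d, node = heapq.heappop(pq)
        if node == dst:
            break
        if d > dist.get(node, float("inf")):
            continue
        for nxt, ms in graph.get(node, []):
            nd = d + ms
            if nd < dist.get(nxt, float("inf")):
                dist[nxt], prev[nxt] = nd, node
                heapq.heappush(pq, (nd, nxt))
    path, node = [dst], dst
    while node != src:
        node = prev[node]
        path.append(node)
    return list(reversed(path)), dist[dst]

print(fastest_path("NYC", "LAX"))  # (['NYC', 'CHI', 'SLC', 'LAX'], 37.0)
```

In practice, the hard parts are verifying link performance trustlessly and pricing it; the routing itself is a well-understood shortest-path problem.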
For high-performance chains like Solana, the implications are notable. DoubleZero could enable:
- Faster transaction processing through optimized communication paths
- More consistent validator performance via reduced network variance
- Potential geographic expansion of validator distributions
- Reduced infrastructure costs through shared resource pools
But perhaps more interesting is what this reveals about the evolution of internet architecture itself. The original internet was designed around principles of best-effort delivery and cost minimization. As applications become more demanding and latency-sensitive, this model shows increasing strain.
DoubleZero’s approach suggests a different paradigm – one where premium communication paths exist alongside traditional routes. This has particular resonance for the emerging AI economy, where training runs spanning multiple data centers require predictable, high-bandwidth connectivity.
This points to a broader theme emerging across DeWi projects: the transformation of telecommunications from a monolithic, vertically-integrated industry into a more fluid marketplace of specialized services. Whether delivering last-mile connectivity like Dawn or optimizing inter-datacenter communication like DoubleZero, these protocols are unbundling telecommunications into its constituent parts and rebuilding each with market mechanisms.
The result may be an internet that looks very different from today’s – more heterogeneous, more specialized, but also more capable of meeting the diverse needs of tomorrow’s applications.
Looking Ahead for DeWi
Zooming out, as we enter 2025, several forces are converging to reshape the telecommunications landscape. The structural challenges facing traditional telcos, combined with technological advances and evolving market dynamics, create both opportunities and obstacles for DeWi projects.
Tailwinds Accelerating Adoption:
- Technological Convergence – The maturation of point-to-multipoint wireless technology, combined with massive spectrum releases like 6 GHz, enables multi-gigabit wireless connections at a fraction of traditional deployment costs.
- Macro Computing Shifts – The explosive growth of AI is driving unprecedented demand for compute and bandwidth. Microsoft’s partnership to reopen Three Mile Island and Google’s commitment to small modular reactors highlight the scale of infrastructure requirements ahead.
- Regulatory Evolution – The increasing recognition of telecommunications as critical infrastructure, combined with a more accommodating US regulatory stance, may create openings for novel network architectures.
- Market Validation – Early successes like Helium (123k subscribers) and partnerships with major carriers like Telefonica demonstrate viable paths to market adoption. Projects like Andrena prove DeWi can deliver superior economics in dense environments.
Headwinds to Navigate:
- Spectrum Politics – Government revenue from spectrum auctions ($4.3B from Verizon alone in 2023) makes it unlikely regulators will allow DeWi networks to indefinitely evade traditional licensing frameworks.
- Token Sustainability – Many DeWi models hinge on token incentives maintaining sufficient value to drive network growth. Market volatility could undermine network maintenance incentives.
- Technical Complexity – Integrating decentralized networks with legacy infrastructure while maintaining carrier-grade reliability presents significant engineering challenges.
- Capital Requirements – While DeWi reduces upfront capital needs, ongoing investment in network maintenance and upgrades remains necessary. The question remains whether token incentives alone can sustainably fund long-term infrastructure.
The next 18-24 months will likely prove decisive. Early DeWi networks must demonstrate sustainable economics while managing token incentives. Projects that can thread this needle while building robust networks will be positioned for massive value capture.
Lastly, the future of connectivity may not belong to any single approach – satellite, terrestrial wireless, or fiber – but rather to those who can most effectively orchestrate these elements into coherent solutions for specific use cases. DeWi’s greatest contribution may be showing how market mechanisms can enable harmonious coordination at scale.
While energy and telecom’s embrace of the mesh is well underway, several other large verticals have emerged as promising early candidates.
Robotics: A New Frontier
In a 2023 article, Nishanth Kumar detailed the tension within the robotics community over whether scaling laws will hold for robotics.
Pros
The pro argument was fairly simple: the bitter lesson is coming to robotics. The scaling laws worked for LLMs and birthed numerous, unforeseen “emergent capabilities”.
Cons
While more varied, most cons boiled down to a variation of "the relevant data sets are simply too difficult or expensive to collect". The internet had a flywheel of hundreds of millions and later billions of people crafting user-generated content which internet giants mined for decades. An equivalent ocean of multi-modal data doesn't exist for robotics, and, even if it did, we are unsure of the quantity needed to generalize across the near-infinite scenarios possible in the physical world.
Another significant challenge is hallucinations. Robots must maintain a much higher degree of fidelity given safety concerns. Tolerance for hallucinations in vehicles or home robots will rightly be much lower, slowing the deployment and data collection flywheel.
I am far from a robotics expert, but from what I have read in 2024, these challenges appear surmountable. Historically, robotics has been limited to reinforcement learning: effective for specific tasks, but not scalable to the near-infinite number of possible real world encounters. However, clever combinations of existing foundational models, large multi-modal data sets, and smaller robotic data training / tuning have provided encouraging results towards more generalized approaches to learning which should only accelerate as synthetic data simulations become more accurate at rendering the physical world.
Furthermore, the cost curves will come down significantly. Just like bandwidth scaled first to facilitate text, then images, then video, and now VR, we should expect similar curves in multimodal training and inference. The inference cost collapse for the current generation of LLMs has been truly remarkable.

Source: @davidtsong
While text has been saturated and commoditized (outside of real time data!), the race for multimodal data is ramping up. One of the key drivers behind this race is “embodied AI”.
Embodied AI is a difficult market to size, but hundreds of trillions long-term is not an insane forecast. Shaoshan Liu, from the Shenzhen Institute of Artificial Intelligence and Robotics, recently sized the data collection opportunity for embodied intelligence at US$10T.

While internet data is primarily used in targeted advertising and personalized content (and increasingly as fuel for LLMs), in embodied AI, data will be pivotal to enhance and optimize robotic capabilities. Internet data is estimated at ~US$600 per user x ~5b users = US$3T. The value of data collection in embodied AI is expected to be >3x that amount.
As we think about the market cap generated from web2 data and visualize the monster capex cycle underway for digital intelligence:

10T+ feels like a reasonable estimate to bridge these capabilities into the world of atoms. Interestingly, it is also about the same size as the services market today:

Source: Sequoia: GenAI's Act o1
This race will not be limited to hyperscaler budgets but will tap into defense as well. Not only the labs but also the leading robotics companies are split across the Pacific.
US: Tesla, Boston Dynamics, Figure, Agility Robotics, Apptronik, Sanctuary AI, 1x Technologies
And
China: Unitree, Kepler, Fourier, Robot Era, LimX Dynamics, Agibot, Astribot, Dataa Robotics, Leju Robot, Booster, UBTech, Xiaomi, XPeng
The current bottleneck facing all of these companies is “the brain”: finding a path to generalized intelligence which scales more effectively than reinforcement learning based approaches.
The keys to solving this problem are threefold:
- Collecting a variety of multi-modal sensory data for imitation learning
- Building high-fidelity simulated environments of physical interactions to bring down costs
- Verifying both sets of data have "temporal and spatial alignment" with real environments

Source: Shaoshan Liu
DePIN has clear potential to assist in step #1, both in sensory data collection and in breaking down data silos. While Tesla cars cost tens of thousands of dollars, a Hivemapper Dashcam costs US$500. While the >1b CCTVs globally would cost more than US$6T to install and maintain, NATIX taps user-owned devices with no additional capex.
By using distributed incentives, crypto can bootstrap networks from existing, fragmented infrastructure to collect rich real world data sets at a fraction of the costs compared with single companies.
While networks like Hivemapper have more targeted near-term business goals (i.e. mapping solutions), another substantial revenue stream could be providing fuel for self-driving and embodied AI.
The biggest risk here is AI Labs cracking synthetic data, significantly reducing the need for real world sensory input with significantly cheaper, high-fidelity simulations.
In fact, DeepMind's recently released Genie 2 aims to do just that. Genie 2 is capable of generating an endless variety of action-controllable, playable 3D environments for training and evaluating embodied agents.
“Genie 2 is a world model, meaning it can simulate virtual worlds, including the consequences of taking any action (e.g. jump, swim, etc.). It was trained on a large-scale video data set and, like other generative models, demonstrates various emergent capabilities at scale, such as object interactions, complex character animation, physics, and the ability to model and thus predict the behavior of other agents”.

While this is impressive, it seems likely the multi-modal model which underpins the simulation environment used to train embodied agents could always use more rich sensory data to boost performance.
We can’t help but expect this to be a narrative which emerges as a tailwind for DePIN in 2025. Fuel for perhaps the world’s largest TAM.
Healthcare: Setting the Language of Biology Free
Healthcare is another thorny yet potentially exciting TAM for distributed incentives, richer data collection, and better outcomes.
Healthcare globally is large and inefficient. In the US, healthcare spend soaks up >17% of GDP, and up to ~30% of that spend is waste, fraud, and abuse.

Source: goinvo
Despite this high expenditure, the US has dismal results: >40% of the population is considered obese and life expectancy is only 77 years.
There is… room for improvement. Outside of behavioral / cultural changes, which will take time, our best bet is more effectively harnessed data.
Many of the fastest growing healthcare IT companies today are "population health" system integrators like Innovaccer and Health Catalyst which do the messy technical plumbing within healthcare systems of extracting data from the various (often old and arcane) systems. This includes pooling electronic health records (EHR), claims data (insurance / demographic details), lab results (diagnostics / blood work), pharma data (prescribed meds), patient generated data (patient portals, wearables), and administrative data (hospital visits and discharges) to give healthcare systems a unified view of each patient and of their patient population more broadly.
This is clearly another step in the right direction in what has been a slow journey out of paper-based fragmentation to EHR implementation to Pop Health and value-based care leveraging big data.
But why stop there?
The other big trend, given the poor healthcare outcomes, has been a shift towards consumers investing more in their own healthcare journeys. Sleep data, fitness data, nutrition tracking, genetics data, smart phone data, longitudinal biomarkers, glucose monitoring and other sensor data, not to mention biohacking, have all entered mainstream dialogue for many cosmopolitan elites.
To me, there is a clear opportunity to bridge the fragmented data sets which population health and analytics companies collate and mine on behalf of healthcare systems with the data sets collected by increasingly engaged patients themselves.
Not only is there potential for better patient outcomes, but, if orchestrated correctly, the financial gains could accrue to individuals rather than healthcare IT companies.
The regulatory environment, though, presents hurdles. On the one hand, many jurisdictions mandate patient access to healthcare records and increasingly encourage more open information exchange to avoid data fragmentation. On the other, many regulations still promote paradigms and processes which have not been updated for state-of-the-art encryption techniques, imposing nonsensical administrative burdens in a world increasingly guided by algorithms and protected via encryption.
If data marketplaces like MidCentury can prove trustworthy with more generic data sets, perhaps no industry has more to gain than healthcare, one of the largest and least efficient verticals on the planet. A truly functional marketplace between data consumers (healthcare systems, IT providers, insurers, pharma companies, biotechs, developers) and data providers (users), with tiered permissions and encryption based on sensitivity could prove transformative for national and even global healthcare.
“If physics was the language of industry, AI is the language of biology”. – Mustafa Suleyman
Guided by the right incentives and proper encryption, DePIN can help set it free.
DePIN Case Studies: Trust the Framework
Assessing DePIN opportunities is not easy. While there is clearly disruptive potential across large industries, most projects remain early in building out the supply side, let alone seeing any real inflows from the demand side.
The disconnect between the revenues…
30 Day Revenues (DePIN Ninja):

And the Market Capitalizations…

Can be difficult to digest for those steeped in traditional valuation methodologies. I sympathize with those observations and find them reasonable.
To date, DePIN has yet to deliver a true success story.
At the same time, however, humans have routinely underestimated the power of network-effect-driven business models in large TAMs. The past 20 years have filled untold graveyards with smirking, "overvalued"-muttering boomer PMs who failed to accept the new paradigm of building tech-enabled businesses with multi-sided marketplace flywheels. If you needed to see "sustainable economics" before buying in, it was always "too expensive", and you underperformed. It's that simple.
While DePIN valuations are “stretched” by most metrics, I view these investments more like call options. Unique, liquid venture bets in very large, often slow moving markets. Some may look at current FDVs relative to real demand and find these call options richly priced. Others may look at the potential disruption and, based on the probability x potential upside, find these call options reasonably priced.
There is always a risk-reward trade-off. Those who prefer to see "sustainable economics" have a lower risk of loss but will need to stomach significantly higher entry prices. Markets are forward looking. As with internet marketplaces, investors can quickly coalesce around leaders, helping them "pull forward" the large TAM in front of them, meaning valuations almost always appear "stretched" due to the visibility of long-term growth once network-effect lock-in takes hold.
Therefore, as opposed to squabbling over multiples or "unit economics" which are undoubtedly poor relative to web2 competitors, we propose the following framework to assess DePIN opportunities (a toy scorecard encoding follows the list):
- Macro: are financial conditions expected to tighten or loosen? Are interest rates expected to rise or fall? Like any tech company with cash flows far out in the future, the present value of DePIN “call options” can swing drastically based on longer term rates and global liquidity conditions.
- Strategic Rationale: Is there a structural reason why a decentralized architecture should outperform the centralized offerings? How big of an improvement in performance or cost can this bring? Carl Vogel from 6th Man Ventures recently came up with a nice framework which I will steal from below (full article here, TLDR below):
- Granular geographic positioning (the more granular, the better). For example, a sensor or mapping network that collects data in sub 10 meter increments, like mobile hot spots, is more precise in where supply meets demand than say a decentralized file storage network. A distributed CDN would fall somewhere in between.
- Supply side and demand side overlap (the more the better). Is the supply side willing to operate the necessary infra for free or at very low marginal cost because it's part of an existing behavior? For example, many farms contract RTK nodes for high-precision farming. As this is already crucial to business success, token rewards are an added benefit as opposed to the raison d'être.
- Competing on performance (not cost). Can the solution offer something fundamentally better than incumbent networks? Grass' real-time information networks are an example of a potentially superior offering due to their collection of residential IP addresses, which is hard to replicate.
- Demand scales non-linearly with supply (essentially, some “economies of scale” to help boost margins). Dawn, discussed earlier, is an example: boosting margins by selling the same bandwidth to many participants in a single large apartment complex.
- Market Size: How big is the real addressable market for this network? Is there a realistic path to disruption? How does that probability weigh against the potential upside?
- Regulatory environment: many industries DePIN aims to tackle – Telecom, Energy, potentially Healthcare – are highly regulated industries which have stymied efforts from the most dynamic, well-funded companies on earth. What are the crucial regulatory hurdles a distributed network will encounter? Are they surmountable? How significantly will they stall the adoption cycle?
- Competition: who are the incumbents? What are their advantages? How much share can we realistically expect to take?
- Path to Sustainability: Is there a credible path to demand overtaking supply side incentives? How long is that expected to take? What are the incentives needed to sustain supply-side participants to get to that point? In a bear market, will the flywheel and network crater in on itself?
- Valuation / Catalysts: What is the price I need to pay now, relative to the potential upside and probability of that upside? What am I seeing that the market may come to appreciate in 6–12 months' time?
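As a toy illustration, the framework can be encoded as a simple weighted scorecard. The criteria mirror the list above, while the weights and example scores are arbitrary assumptions for illustration, not our actual ratings.

```python
# Toy scorecard encoding of the DePIN framework above. The criteria mirror the
# list; the weights and example scores are arbitrary illustrative assumptions.
CRITERIA_WEIGHTS = {
    "macro": 0.10,
    "strategic_rationale": 0.25,
    "market_size": 0.20,
    "regulatory": 0.10,
    "competition": 0.10,
    "path_to_sustainability": 0.15,
    "valuation_catalysts": 0.10,
}

def score_project(scores: dict) -> float:
    """Weighted average of 1-5 scores across the framework's criteria."""
    assert set(scores) == set(CRITERIA_WEIGHTS), "score every criterion"
    return sum(CRITERIA_WEIGHTS[c] * s for c, s in scores.items())

# Hypothetical scoring of a mapping-style network (not an actual rating):
example = {
    "macro": 4, "strategic_rationale": 4, "market_size": 3,
    "regulatory": 4, "competition": 2, "path_to_sustainability": 2,
    "valuation_catalysts": 3,
}
print(f"composite score: {score_project(example):.2f} / 5")
```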
Above is the high level framework. Below is the framework applied to two of the leading DePIN projects to help make it more tangible.
Hivemapper
Hivemapper is a distributed mapping project which aims to take on leading incumbents like Google Maps using a decentralized network of drivers to provide higher frequency and lower cost mapping solutions to developers and end customers like ride sharing companies, local governments, logistics and insurance providers, and even, potentially, frontier labs.
Hivemapper currently has a market capitalization of ~US$280m and a FDV of ~US$580m.
Macro: at first glance, the macro backdrop is accommodating. Financial conditions globally appear to be loosening, led by China's combatting of a large balance sheet recession. US rates are expected to fall (we shall see, yields are looking a bit dicey) and incoming US Treasury secretary Scott Bessent has vocalized a desire to rein in the dollar's rise. While inflation could rear back up, the current outlook tilts accommodative to risk assets. Crypto has the added tailwind of a more regulatory-friendly incoming US administration.
Strategic Rationale:
- Granular geographic positioning (High). Collecting street level mapping data is fairly granular with difficulty capturing the same level of detail through different, more aloof channels like satellites or even drones.
- Supply side and demand side overlap (High). Many Hivemapper drivers drive full time and view $HONEY as supplementary income. Incrementally, many may have already purchased cameras for insurance purposes. Either way, a $500 outlay for a typical annual payout of US$1,000 – 10,000 is a compelling ROI, and more cost effective than companies like Google relying on their own fleet.
- Competing on performance (Medium). Hivemapper has managed to map >20% of the globe in two years, 5x the speed of Google. The density and cost of the network leads to more frequent mapping cycles, an advantage that should increase as the network grows. However, Google offers a very strong integrated bundle combining a wide array of data. Even with a superior street view solution, taking share from such a compelling bundle will be difficult.
- Demand scales non-linearly with supply (High). Mapping data can be aggregated and sold to an infinite number of buyers, a compelling zero marginal cost software-like business. A key variable here will be the value of the data decided by 1) the proliferation of potential supply substitutes – drones, satellites, self driving and 2) the proliferation of demand substitutes – synthetic data generation or general commoditization of mapping / video data by other networks.
Overall, Hivemapper ranks strongly on strategic logic.
Total Addressable Market
Business of apps lists Navigation Apps total addressable market at US$21b in 2024, growing at roughly 15% per year to reach close to $50b by 2030.
With US$11b in mapping revenues in 2023, Google has well over 50% share. 82% of Google's mapping business comes from its ad network while the remainder is primarily from APIs. If we assume Hivemapper's SAM is the portion which comes from APIs, we are looking at close to a US$10b market by 2030.

If Hivemapper is able to capture 1% of the market (5% of the API SAM), the returns are roughly in line with the amount of risk. Above that and the return profile becomes quite compelling.
Long story short, there appears to be enough of a market opportunity for Hivemapper to carve out a decent valuation as a fairly niche provider.
Regulations
Overall, mapping tends to be a less regulatorily sensitive area than other core DePIN markets. This may change in the future, but today the restrictions on a solution like Hivemapper appear fairly benign.
Competition
This is the biggest knock on Hivemapper. They are going up against a juggernaut.
In 2023, Google Maps did ~US$11b in revenues with 39% margins, serving 1.8b users across 220 countries with 170b images and 1 billion km served each day.
The product has evolved significantly beyond street view, providing additional mapping and satellite imagery, mapping for large indoor venues, local guide functionality, and traffic data, not to mention recommendations, information, and an S-tier ad network to monetize it all. Google is in the process of integrating AR navigation, autonomous vehicles, smart city infra solutions, and even more tailored recommendations. Furthermore, it has a dominant distribution advantage across its family of applications.
This is nightmare fuel as a competitor, and Google Maps is unlikely to be a solution many customers or businesses leave.
Fortunately, Hivemapper does have a few counters. Its denser network should provide faster real-time updating vs. Google's fleet for street view. Incrementally, there are large customers – like ride-sharing applications and auto manufacturers / logistics providers – who likely see Google's self-driving program as a competitive threat, encouraging them to hedge with an alternative provider.
However, market leadership remains a very distant prospect.
Path to Sustainability + Valuation
Based on the last quarter (Sept – Nov), Hivemapper’s USD incentives were US$2.3m while its burns stood at only 10% of that volume: US$220k (though to be fair, during the last month, burns have increased to >20%).
Honey Incentives (USD)

And burns (USD)

Source: flipside crypto
Annualizing the final month, we are looking at US$8.4m in incentives vs US$1.7m in burns. Clearly not sustainable. Most web2 investors would try to assess the viability of growth by examining the "unit economics" of a potential enterprise: looking at the "Life Time Value" of a customer by aggregating its retained contribution margin over a five-year period and comparing it to the cost to acquire said customer.
Unfortunately, we do not have driver retention curves, and the CAC is paid in volatile "equity-like" tokens, making the calculation difficult. However, given incentives significantly outweigh burns, it's safe to say most of these drivers are contribution margin negative.
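For readers less familiar with the web2 shorthand above, here is a hypothetical sketch of the LTV-versus-CAC calculation. Every input is invented, precisely because the real retention and margin data do not yet exist.

```python
# Hypothetical LTV vs CAC sketch for a DePIN supply-side participant (a driver).
# Retention, margin and incentive figures are invented purely to illustrate the
# calculation the text says cannot yet be done with real data.
def lifetime_value(annual_contribution_margin, annual_retention, years=5):
    """Sum of retained contribution margin over a multi-year horizon."""
    return sum(
        annual_contribution_margin * (annual_retention ** year)
        for year in range(years)
    )

# Illustrative inputs
cac_tokens_usd = 1_500     # token incentives paid to attract/retain a driver in year one
margin_per_driver = 120    # network revenue attributable to that driver's data, per year
retention = 0.6            # fraction of drivers still active each following year

ltv = lifetime_value(margin_per_driver, retention)
print(f"LTV ≈ ${ltv:,.0f} vs CAC ≈ ${cac_tokens_usd:,}")  # deeply negative unit economics
```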
If this was a growth stage web2 platform, it would be an easy pass. ~US$580m FDV vs. US$1.7m RR in “revenues” is not something you usually want to take to investment committee.
But again, this is a call option. Because the end market is sizable, investors are willing to provide speculative capital to run the experiment. The tightrope walk of any DePIN project is managing capital markets effectively so they continue to have the firepower to keep network participants active:

At some point, however, investors will have to see real demand or the speculative capital dries up at which point the network often crumbles (the equivalent of a “down round” negative flywheel in VC). Keeping the speculative capital around is essential to long-term success.
While demand from B2B users and governments will likely be a slower burn, the potential wild card for Hivemapper and other data-DePINs is the multi-modal AI race we discussed in robotics.
Data does appear to be a material roadblock in the race for embodied intelligence and fully-automated self driving. Many analysts are bullish on Tesla precisely because of its massive data sets from its active fleet of cars. How will other manufacturers replicate that? How about leading AI labs that aren’t xAI or Google?
This, in our estimation, is an underpriced call option for Hivemapper and likely to keep speculative capital around for the foreseeable future, providing runway to iterate on the core business.
Hivemapper currently charges $0.005 per map credit (the revenue math is sketched after the list):
- Approximately 500 map credits are needed to access 1km of road for 1 week = $0.25
- $0.25 x 52 weeks in a year = US$13
- US$13 x17.35m unique kilometers mapped = US$225m for a real time data feed of the entire network for a year
- US$225m / 29% = US$775m in possible revenues once fully mapped
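The same arithmetic in code, taking the $0.25 per kilometer-week figure above as given and reading the 29% as the share of the global road network already mapped (our interpretation of the bullets, not a Hivemapper disclosure):

```python
# Reproducing the revenue math above. The $0.25/km/week figure and the ~29%
# "share of global road network already mapped" interpretation come from the
# chain of assumptions in the bullets, not from Hivemapper disclosures.
weekly_cost_per_km = 0.25          # USD to access 1 km of road for 1 week
weeks_per_year = 52
unique_km_mapped = 17_350_000      # unique kilometers mapped to date
share_of_global_roads = 0.29       # mapped km as a share of the full road network

annual_cost_per_km = weekly_cost_per_km * weeks_per_year            # = $13
full_feed_today = annual_cost_per_km * unique_km_mapped             # ≈ $226m (≈ the $225m quoted above)
full_feed_when_complete = full_feed_today / share_of_global_roads   # ≈ $778m (≈ the $775m quoted above)

print(f"Real-time feed of current map: ${full_feed_today/1e6:,.0f}m / year")
print(f"Implied revenue once fully mapped: ${full_feed_when_complete/1e6:,.0f}m / year")
```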
If data truly ends up as a constraint in the race for multi-modal / embodied intelligence / FSD, then US$225m is a reasonable outlay given the hundreds of billions pouring into the AI arms race.
Might MSFT / OpenAI or Anthropic / AMZN or Meta want to counteract Google and Tesla's advantage here? Might third-party ride sharing apps like Uber or Lyft be interested in what could be an existential risk? Might large auto manufacturers like to have their own Tesla-sized training set for FSD?
If even one or two parties are interested, this could send Hivemapper’s token soaring. Despite the constant sell pressure, we’re betting 2025 will be a good year for the Honey call option.
The framework has spoken.
Helium
The transformation of legacy infrastructure often follows a predictable trajectory: prolonged ossification under centralized control followed by rapid disruption through distributed systems. In telecommunications, Helium represents perhaps the most ambitious experiment in catalyzing this transition, attempting to fundamentally reorganize wireless infrastructure through crypto-economic incentives.
Admittedly, the Helium ecosystem is complex. After migrating to Solana in 2023, Helium transitioned to a subDAO framework made up of two subDAOs:
- Helium IoT: Powered by the IOT token, focused on low-power device connectivity
- Helium Mobile: Driven by the MOBILE token, targeting cellular coverage
This dual-token structure is unified through Helium’s novel Data Credit (DC) system – network credits fixed at $0.00001 each that are created by burning the HNT governance token. When users purchase services with DCs, this directly reduces HNT supply, creating a unified value capture mechanism while allowing each vertical to scale independently.
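A minimal sketch of the Data Credit mechanism as described above, assuming an illustrative HNT oracle price; the function name and example figures are ours, not Helium's code.

```python
# Minimal sketch of Helium's Data Credit (DC) mechanism as described above:
# DCs are fixed at $0.00001 each and are minted by burning HNT at the current
# oracle price. Function names and the example price are illustrative.
DC_PRICE_USD = 0.00001   # Data Credits are pegged at $0.00001 each

def hnt_burned_for_dc(dc_amount: int, hnt_oracle_price_usd: float) -> float:
    """HNT that must be burned to mint `dc_amount` Data Credits."""
    usd_value = dc_amount * DC_PRICE_USD
    return usd_value / hnt_oracle_price_usd

# Example: $20 of network usage priced in DCs, at an assumed HNT price of $6.50
dc_needed = 2_000_000
burned = hnt_burned_for_dc(dc_needed, hnt_oracle_price_usd=6.50)
print(f"${dc_needed * DC_PRICE_USD:.2f} of usage burns {burned:.2f} HNT")
```

The key property is that usage, denominated in dollars, translates directly into HNT supply reduction regardless of which network vertical generated it.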
Notably, Q4 2024 saw the approval of HIP 138, marking a significant evolution in the protocol’s economic design. Set for implementation in January 2025, this revision represents a return to first principles:
- Eliminate the multi-token structure and return to using only HNT as the network’s sole token
- Keep both IOT and MOBILE networks operational, but consolidate all rewards into HNT instead of their respective tokens
- Drive more utility directly to the HNT token by centralizing all network value accrual
This architectural simplification, emerging from broader Helium v3 discussions, aims to reduce complexity while maintaining the functionality of both network verticals.
With this prerequisite context out of the way, let’s view Helium through the lens of our framework:
Macro: The macro backdrop appears similarly accommodating to Helium as financial conditions globally loosen. Moreover, the incoming US administration’s more crypto-friendly posture could be especially beneficial for highly regulated industries like telecom. Helium stands to benefit the most here as the clear industry leader.
Strategic Rationale:
- Granular Geographic Positioning (High): Mobile connectivity demands precise coverage optimization down to individual buildings and blocks. Helium’s ability to dynamically adjust incentives through Discovery Mapping enables far more targeted network growth than traditional carriers’ uniform buildout approach.
- Supply/Demand Overlap (Medium): While lacking natural overlap between operators and users, Helium’s relatively modest hardware costs ($250-500) and quick payback periods (3-8 months) create compelling unit economics. This enables a novel dynamic where users can effectively subsidize their mobile service through hotspot operation.
- Performance Edge (Medium-Low): Rather than competing on raw performance, Helium’s structural advantages enable 70-80% lower pricing ($20/month unlimited vs $60-90). This dramatic cost reduction could prove more disruptive than marginal performance gains.
- Supply/Demand Scaling (Medium): Once deployed, hotspots can serve additional users at minimal marginal cost. However, the limited coverage radius and eventual need for more hardware creates some constraints on pure software-like economics.
Total Addressable Market
With global telecom services representing a $1.7 trillion market and MVNOs accounting for $87 billion, Helium's opportunity set is massive. The network's current ~123k subscribers and $29.5M in annualized revenue (roughly 0.002% of the global telecom market) illustrate both the scale of the opportunity and the magnitude of execution required to capture meaningful market share.
Chart: Helium subscriber growth slows since Q1 (new subscribers + cumulative subscribers)
Moreover, recent partnerships like Telefonica highlight Helium’s emerging carrier offload strategy – allowing traditional telcos to shift traffic onto Helium’s network. This “B2B2C” approach could accelerate growth while minimizing customer acquisition costs. Rather than purely competing with carriers, Helium can selectively partner to expand coverage, especially in underserved markets.
Quick back of the envelope math on the @helium carrier offload program economics in the medium term:
– Average mobile user consumes 17GB of data / mo = 204GB / yr
– Let’s assume the carrier offload serves 5% of a given subscriber’s data usage from one of the big carriers
– Cost…
— Austin Barack (@AustinBarack) August 30, 2024
Regulations
The regulatory landscape represents Helium’s most significant challenge. A substantial portion of their cost advantage stems from sidestepping traditional spectrum licensing fees – evidenced by Verizon’s $4.3B spectrum spend in 2023 alone. As DeWi networks scale, this regulatory arbitrage will face increasing scrutiny.
Key considerations include:
- Spectrum licensing requirements and potential framework changes
- Network security and privacy compliance obligations
- Consumer protection standards
- Infrastructure deployment restrictions
- Quality of service requirements
The incoming administration’s more crypto-friendly posture could provide temporary relief. However, long-term success requires either continued regulatory accommodation of unlicensed/shared spectrum use or evolution of Helium’s model to work within traditional frameworks while maintaining cost advantages.
Competition
Helium faces a complex competitive landscape across multiple fronts:
- Traditional Carriers: Possess massive infrastructure advantages, regulatory relationships, spectrum rights, and bundling capabilities. However, their high fixed cost base creates opportunities for more nimble competitors.
- Established MVNOs: Players like Mint Mobile and Cricket have proven customer acquisition models and carrier relationships. But they lack Helium’s potential for disruptive unit economics.
- Direct DeWi Competitors:
- World Mobile (82k users, $30/5GB pricing)
- XNET (WiFi-focused)
- Drop Wireless (pre-launch)
- Really ($129/month privacy-focused offering)
- Emerging Threats: Starlink and other satellite providers could reshape mobile connectivity, while private 5G networks gain enterprise traction
Path to Sustainability + Valuation
The sustainability challenge of the Helium model becomes apparent when examining its token metrics. Since the start of 2024, total network incentives reached 14.2M HNT while burns amounted to only 560k – a mere 4% of emissions. This level of dilution would typically raise serious concerns about long-term viability.
Chart: HNT remains meaningfully inflationary (weekly HNT emissions vs burns)
However, examining Helium's current $1.9B FDV through the lens of implied future revenue presents a more nuanced picture. Working backwards (the arithmetic is reproduced in a short sketch after the list):
- At an 8x revenue multiple (standard for high-growth network businesses), the market is pricing in $237.5M in future revenue
- Adjusting for time value of money (20% discount rate through 2030), this translates to $709M in 2030 revenue
- This implied revenue represents:
- 0.042% of the global telecom market ($1.7T)
- 0.81% of the MVNO market ($87B)
- A 70% CAGR from current $29.5M run rate
- 21x growth in market share from today’s 0.002%
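The same back-of-envelope in code, using the assumed 8x multiple, 20% discount rate, and six-year horizon (2024 to 2030) from the bullets above:

```python
# Reproducing the implied-revenue back-of-envelope above. The 8x multiple,
# 20% discount rate and six-year horizon (2024 -> 2030) are the report's
# assumptions, not market data.
fdv = 1.9e9
revenue_multiple = 8
discount_rate = 0.20
years_to_2030 = 6
current_run_rate = 29.5e6

implied_revenue_today = fdv / revenue_multiple                       # $237.5M priced in now
implied_2030_revenue = implied_revenue_today * (1 + discount_rate) ** years_to_2030

telecom_tam, mvno_tam = 1.7e12, 87e9
cagr = (implied_2030_revenue / current_run_rate) ** (1 / years_to_2030) - 1

print(f"Implied 2030 revenue: ${implied_2030_revenue/1e6:.0f}M")          # ~$709M
print(f"Share of global telecom: {implied_2030_revenue/telecom_tam:.3%}")
print(f"Share of MVNO market: {implied_2030_revenue/mvno_tam:.2%}")
print(f"Required CAGR from ${current_run_rate/1e6:.1f}M run rate: {cagr:.0%}")
```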
While these growth targets are ambitious, they appear within reach when considering Helium’s structural advantages:
- Carrier Offload Strategy: The recent Telefonica partnership demonstrates viability of wholesale revenue streams without requiring direct customer acquisition. This “B2B2C” model could accelerate growth while minimizing CAC.
- Network Effect Dynamics: As coverage density improves in key markets, the value proposition strengthens for both retail customers and carrier partners. This creates a virtuous cycle where success breeds further success.
- Cost Structure Advantage: The decentralized model enables Helium to maintain 70-80% lower prices vs traditional carriers ($20 vs $60-90 monthly plans). This pricing power provides significant headroom for customer acquisition even in competitive markets.
- Infrastructure Reuse: Unlike traditional carriers building parallel networks for different services, Helium’s infrastructure can simultaneously support mobile, IoT and future wireless applications – improving economics as services scale.
- Geographic Optionality: Recent expansion into Mexico highlights ability to target high-potential emerging markets where traditional carriers are underinvesting. This provides multiple paths to achieving growth targets.
The key to 2025 will be demonstrating progress toward these revenue milestones while managing token incentives. The implementation of HIP 138’s simplified tokenomics in January 2025 should help by consolidating value accrual in HNT. If Helium can maintain network growth through this transition while continuing to announce major carrier partnerships, the current valuation may prove conservative.
For investors viewing DePIN tokens as call options on infrastructure disruption, Helium presents one of the more compelling risk-reward propositions in the sector. The combination of proven traction, clear path to revenue scaling, and multiple shots on goal through both retail and wholesale channels helps justify the speculative premium despite current token economics.
The next 12-18 months will be crucial in determining whether Helium can thread the needle between incentivizing network growth and achieving sustainable economics. But for those seeking leveraged exposure to the future of decentralized infrastructure, there are worse places to look than the pioneer of the DeWi movement trading at a reasonable multiple of potential 2030 revenue.

2025 Predictions: Living The Inflection
2025 is poised to be a big year for both DeAI and DePIN, two crypto sectors we expect to continue outperforming as Bitcoin charts its course into the unknown. We expect 2025 to be the year when mainstream users and AI experts alike begin to recognize the benefits of this unlikely marriage: synthetic intelligence and a permissionless global ledger. From distributed training runs, to verified identity and inference, to data collection and attribution, to exchange and speculation, it will become obvious that DeAI is not a "massive grift" but the intersection of what will be our generation's two defining technologies.
In the words of Lenin, "there are decades where nothing happens; and there are weeks where decades happen".
We are expecting many such weeks in 2025.
We are also cognizant that DeAI and DePIN are two of the few "pure play" ways for retail to get early exposure to the AI thematic. As such, price and enthusiasm are bound to get ahead of fundamentals for such an important technological revolution, assuming macro remains accommodating.
Here is just a taste of things to come:
- At least three agent frameworks become decacorns in 2025
- One AI agent becomes a centi-millionaire, another does a collab with a globally leading artist, another founds a religion
- One large DePIN project emerges with sustainable economics, causing an upward rerating of the entire sector and reflexive upswing in “fundamentals” for the rest, making their sustainability more likely
- 100b-parameter decentralized training runs become commonplace
- Multiple large data deals are struck between DePIN networks and large AI labs
- AI related crypto market cap comfortably passes US$150b
- @3xliquidated becomes so fed up with agents on the timeline that he buckles and begs for the orb
- An investment firm, primarily run by an agent swarm, emerges as a tier one brand in crypto VC
The only thing we know for certain is the future is going to be much weirder than anything listed above. Our civilization is on the cusp of the singularity, and we have unlocked the world’s largest casino. The stars are aligning.
Blessed are the terminally online, for they shall inherit the gains.