Compute belongs to everyone, decentralize it | Opinion

CryptoNewsNet · 2025/09/13 21:00
By: crypto.news

If artificial intelligence is the new electricity, a handful of private utilities already control the switch, and they can dim the light for everyone else any time they want.

Summary
  • Compute is the new chokepoint — today’s biggest models and breakthroughs depend on a few centralized servers, turning AI into a rigged tournament instead of an open race.
  • Treat compute as infrastructure — like electricity or broadband, it should be provisioned as a utility with transparent pricing, open scheduling, and fair set-asides.
  • Distribution beats concentration — spreading compute near renewables and regional hubs reduces grid strain, lowers costs, and makes capture harder.
  • Access fuels acceleration — when more people can experiment freely, iteration speeds multiply, unlocking breakthroughs and diffusing power across the ecosystem.

The largest models, the most daring experiments, and even the pace of discovery itself now hinge on access to a few tightly held servers and accelerators. This is not a free market at work; it is a gate deciding who gets to build tomorrow (and who has to wait).


Centralized compute does more than raise prices; it rigs the tournament. When training slots are allocated through exclusive deals and preferential pipelines, the outcome is predetermined long before the starting gun. Just look at Meta’s $10 billion cloud deal with Google.

Ambitious labs and students are told to economize their curiosity, entire research paths are pruned to fit a quota, and the narrative of ‘inevitable winners’ becomes a self-fulfilling prophecy. This is how innovation slows: not in headlines, but in the quiet suffocation of ideas that never touch silicon.

Build the network, not the bottleneck

Treat compute like the critical infrastructure it is, wire accountability into every rack, and things quickly start to change. Tie incentives to access metrics rather than exclusivity, and publish the data: nothing hides in the shadows, the network builds, and everyone writes AI’s next chapter.

The question isn’t whether to build more capacity; it’s who controls it, on what terms, and how widely the benefits spread. Concentration turns a general-purpose technology into a private toll road. If intelligence is to serve the many, compute must be provisioned like a utility with equal access — no VIP lounges here.

Global electricity use by data centers is projected to more than double to approximately 945 terawatt-hours by 2030, primarily driven by AI. Packing that load into a few concentrated hubs magnifies grid stress and prices.

Imagine that load instead distributed across sites near new renewable energy sources and flexible grids. The result is a system that is cleaner, cheaper, and harder to capture, and that benefits a far broader network.

Public money should buy public access today: open scheduling, hard set-asides for newcomers (students, civic projects, and first-time founders), and transparent, cost-based pricing.

Europe’s AI Continent Action Plan proposes a network of AI Factories and regional antennas designed to widen access and interoperate across borders. Whatever one thinks of Brussels, building for diffusion rather than capture is the right instinct.

Elsewhere, the sums are even larger (and the risk of entrenchment sharper), as seen in U.S. President Donald Trump’s pledge of up to $500 billion for AI infrastructure. On paper it looks net-positive for everyone, but it could foster a plural ecosystem or solidify a cartel, depending on the rules attached.

End scarcity-as-a-service

Let’s call it what it is: scarcity has become the business model of centralized compute, not just a glitch in it. Mega cloud deals are often presented as ‘efficiency’, but they primarily foster dependence, concentrating bargaining power wherever the servers are housed.

When access rides on contracts rather than merit, good ideas fall before they pass a badge check. What’s needed is a real, reserved slice of capacity for newcomers at transparent, cost-based rates, so the doors stay open to everyone on fair terms.

APIs need to be open, schedules interoperable, queue times and acceptance rates published, and any exclusive lockups disclosed, so gatekeeping can’t hide in the fine print.
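To make the set-aside and published-metrics idea concrete, here is a minimal sketch in Python. Everything in it is a hypothetical illustration (the class name, the 25% newcomer share, the metric names); no real scheduler or API is being described.

```python
from collections import deque

# Illustrative sketch: a compute queue with a hard set-aside for newcomers
# and self-published fairness metrics. All names and numbers are assumptions.

class FairComputeQueue:
    def __init__(self, total_slots: int, newcomer_share: float = 0.25):
        self.total_slots = total_slots
        # The hard set-aside: a guaranteed slice of each scheduling batch.
        self.newcomer_slots = int(total_slots * newcomer_share)
        self.general = deque()
        self.newcomers = deque()
        self.submitted = 0
        self.accepted = 0

    def submit(self, job_id: str, is_newcomer: bool) -> None:
        self.submitted += 1
        (self.newcomers if is_newcomer else self.general).append(job_id)

    def schedule(self) -> list[str]:
        """Fill the reserved newcomer slice first, then the general pool."""
        batch = []
        for _ in range(self.newcomer_slots):
            if self.newcomers:
                batch.append(self.newcomers.popleft())
        while len(batch) < self.total_slots and (self.newcomers or self.general):
            source = self.newcomers if self.newcomers else self.general
            batch.append(source.popleft())
        self.accepted += len(batch)
        return batch

    def metrics(self) -> dict:
        """The transparency piece: queue depth and acceptance rate, published."""
        return {
            "pending": len(self.general) + len(self.newcomers),
            "acceptance_rate": self.accepted / max(self.submitted, 1),
        }

q = FairComputeQueue(total_slots=4)
for i in range(6):
    q.submit(f"lab-{i}", is_newcomer=False)
q.submit("student-0", is_newcomer=True)
batch = q.schedule()
print(batch)       # the newcomer lands in the first batch despite arriving last
print(q.metrics())
```

The point of the sketch is that fairness here is structural, not discretionary: the reserved slice is enforced by the scheduler itself, and the metrics that would expose gatekeeping are computed by the same code that does the scheduling.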

Think of it as more than just access to machines or cycles; it is a right to compute. Just as societies have come to recognize the importance of literacy, healthcare, or broadband, compute should be understood as a vital foundation for creativity, science, and progress. To treat it this way means embedding guarantees into the system itself: portability so work and data can move seamlessly across environments, carbon-aware scheduling so the cost of innovation doesn’t come at the expense of the planet, and community or campus-level nodes that plug directly into a shared and resilient fabric.

The framing matters. This isn’t about charity, handouts, or subsidies. It’s about unlocking acceleration: making sure that anyone with an idea can test, push, and build without structural barriers slowing them down.

Because when more people can experiment, when they can try, fail, and try again without having to beg for a slot or wait weeks for permission, iteration speeds increase exponentially. What once took months can collapse into days. The cumulative effect of this freedom is not just faster prototypes, but faster learning curves, faster pivots, and ultimately, faster breakthroughs. And beyond the technical advantage, something subtler and perhaps more powerful happens: politics fade. Build the network, not the bottleneck.

Chris Anderson

Chris Anderson is the CEO of ByteNova. Chris is an expert in marketing strategy and product management, with his own perspective on decentralized AI combined with web3. He is passionate about building new AI products, exploring how Physical AI can enter people’s lives, and the future of companionship AI.


Disclaimer: The content of this article solely reflects the author's opinion and does not represent the platform in any capacity. This article is not intended to serve as a reference for making investment decisions.
