The new infrastructure combines storage and cloud GPUs — without dependence on Big Tech.
The Walrus and io.net teams have announced a joint solution for training and deploying AI models. Developers can now upload, train, and store their models without having to run their own servers.
Io.net provides distributed access to over 10,000 GPUs and CPUs worldwide, while Walrus handles storage through its decentralized pay-per-use protocol. There are no subscription fees: users pay only for the compute they actually run and the volume they actually store.
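The pay-per-use idea can be sketched in a few lines. This is purely illustrative: the function name and the rates below are invented for the example, since the article does not specify actual Walrus or io.net pricing.

```python
# Hypothetical pay-per-use billing model. The rates and the function itself
# are made-up illustrations, NOT actual Walrus/io.net pricing.
def usage_cost(gpu_hours: float, storage_gb_months: float,
               gpu_rate: float = 0.50, storage_rate: float = 0.02) -> float:
    """Bill only for consumed compute and stored volume; no flat subscription."""
    return gpu_hours * gpu_rate + storage_gb_months * storage_rate

# 10 GPU-hours plus 100 GB stored for a month:
print(usage_cost(10, 100))  # 10*0.50 + 100*0.02 = 7.0
```

The point of the model is that an idle workload costs nothing, which is the contrast the article draws with subscription-based centralized clouds.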
Deploying a model in production without Google and AWS.
The BYOM (Bring Your Own Model) format allows developers to run custom models without relying on centralized clouds. Users do not need to rent expensive servers, build their own data centers, or sacrifice privacy.
According to Rebecca Simmonds of the Walrus Foundation, centralized services are not only expensive: they limit flexibility, carry data-leakage risks, and leave the architecture vulnerable to a single provider's outages.
"We offer an alternative that works on the developers' terms, not the platforms'," she stated.
Competition in the decentralized AI market is intensifying.
Walrus and io.net enter a competitive field alongside projects such as Bittensor, Lambda, Gensyn, Vast AI, Spheron, and Akash. Centralized offerings from Google, including Vertex AI, continue to dominate. However, a focus on decentralization, transparency, and modularity may appeal to Web3-oriented developers.
That appeal only grows against the backdrop of mounting pressure on Big Tech. A recent Google Cloud outage affected dozens of services, including Cloudflare: one more reason to seek resilient solutions that do not depend on a single central entity.
Walrus: $140 million and over 4,000 TB of storage.
Walrus's mainnet launched in March, at which time the foundation announced that it had raised $140 million in investment. Today the network comprises 121 nodes run by 103 operators, with a total storage capacity of 4,167 TB, of which about 26% is in use.
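The reported utilization can be turned into absolute figures with quick arithmetic, using only the numbers cited above:

```python
# Sanity check of the reported Walrus storage figures (values from the article).
total_capacity_tb = 4167   # total network capacity, TB
utilization = 0.26         # roughly 26% reported in use

used_tb = total_capacity_tb * utilization
free_tb = total_capacity_tb - used_tb

print(f"In use: ~{used_tb:.0f} TB")  # ~1083 TB
print(f"Free:   ~{free_tb:.0f} TB")  # ~3084 TB
```

In other words, roughly 1,100 TB is occupied and about 3,100 TB of headroom remains for new workloads.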
Walrus's main focus is programmable decentralized data storage; the protocol is built on the Sui network. Integration with io.net could extend its reach beyond the crypto industry to traditional AI developers.
The load on data centers will increase.
McKinsey predicts that by 2030, $6.7 trillion will be needed to build out computing infrastructure. More than $5 trillion of that sum will go to AI; by comparison, all other IT applications will require no more than $1.5 trillion.
Against this backdrop, demand for decentralized, customizable computing solutions will only grow, and Walrus and io.net are among the first to offer such an alternative at real scale.
What's next?
If the platform gains traction, it will be one of the first examples of AI and Web3 converging at the infrastructure level. Developers will be able to deploy models without compromises, users will gain access to custom solutions, and the network will scale on real external demand rather than hype.