Quick take:
Keller is building a peer-to-peer decentralised cloud network, Flux, aimed at ensuring access and control of Web3 platforms remain decentralised.
According to Keller, by leveraging Federated Learning, decentralised AI can be used to protect patient privacy and ensure accurate, high-quality medical outputs.
Keller also believes the biggest challenge to integrating decentralised AI in healthcare systems is data migration.
Artificial intelligence (AI) is one of the biggest tech breakouts of the current decade, thanks to major breakthroughs in generative AI driven by the emergence of applications like ChatGPT, DeepSeek, Meta AI and Google’s Gemini.
However, while such breakouts have been encouraging from a business perspective, they have been met with equally sceptical views relating to user privacy and the security of data. They have also faced criticism from the crypto community for centralised economic models that reward only the big corporations building the apps, not to mention the amount of energy required to run them.
This is where Daniel Keller, the co-founder and CEO of the decentralised compute marketplace InFlux Technologies, spotted a weakness in the entire process of building, training and running AI agents.
“We rebranded to InFlux Technologies in 2018, and the focus shifted to building a peer-to-peer decentralised cloud network: Flux,” says Keller, whose previous crypto project, Zel Technologies, focused on custodianship with the desktop-based Zelcore crypto wallet.
Keller believes his company serves an important role in ensuring infrastructure access and control remain decentralised, and he expects decentralised computing to take over cloud computing, especially for AI workloads.
Keller, who also has a background in healthcare, shared how AI could be used to improve healthcare systems without reinforcing existing biases in clinical data.
“By incorporating fairness constraints (terms added to training objectives that penalise models whenever error rates differ too much between patients of different ethnic or cultural backgrounds) into the model logic, so that no patient is treated unequally,” he said, adding that the main challenge to integrating decentralised AI in healthcare systems is data migration.
Briefly, tell us about your journey in Web3 and what sparked your interest in crypto overall.
Well, I co-founded Zel Technologies a few years back, and we focused on custodianship with the Zelcore crypto wallet for desktop. We rebranded to InFlux Technologies in 2018, and the focus shifted to building a peer-to-peer decentralized cloud network: Flux.
What sparked my interest in crypto initially was two-fold. Firstly, the ridiculousness of paying someone else to custody my money was a huge draw of crypto. I mean, the very idea of walking into a bank and asking someone for permission to access your own money is ludicrous, to say the least.
Secondly, our money isn’t even worth the paper it’s printed on because the more of it they print, the more value it loses. Bitcoin is deflationary, as it has a fixed supply, meaning its value will only increase over time. Learning this, I realized that this is an asset class that I had to be involved with.
What is the most important role that InFlux Technologies plays in Web3, especially in decentralised AI?
Ensuring that infrastructure access and control remain decentralized. When participation is not only open but incentivized, especially at the hardware level, censorship is prevented. A censorship-free compute network enables faster, more innovative, and ultimately more resilient development.
How do you see decentralised compute fitting into the wider cloud computing and AI sector? Does it have a future?
I see decentralized compute taking over cloud computing, especially for AI workloads. So, not only does it have a future, it IS the future. Current centralized cloud computing for AI development consumes excessive computational power, resulting in constant over- or under-provisioning and the generation of massive e-waste.
Decentralized computing enables more focused and local processing, ensuring that resources aren’t wasted and computations are executed closer to the source of the network. Additionally, decentralized compute can massively extend hardware lifespans as unused bandwidth is harnessed from existing GPUs instead of exclusively from brand-new ones.
How can developers ensure AI systems used in healthcare improve patient outcomes without reinforcing existing biases in clinical data?
By incorporating fairness constraints (terms added to training objectives that penalize models whenever error rates differ too much between patients of different ethnic or cultural backgrounds) into the model logic, so that no patient is treated unequally.
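The idea Keller describes can be sketched as a loss function with an added penalty term. The snippet below is a minimal illustration, not Flux's or any production implementation: it adds a term to a standard binary cross-entropy loss that grows when misclassification rates diverge between patient groups. The function name, the `lam` weight, and the 0.5 decision threshold are all illustrative assumptions.

```python
import numpy as np

def fairness_penalised_loss(y_true, y_prob, groups, lam=1.0):
    """Binary cross-entropy plus a penalty on the gap in error
    rates between demographic groups (an equalised-odds-style
    fairness constraint). Illustrative sketch only."""
    eps = 1e-9
    # Standard binary cross-entropy over all patients.
    bce = -np.mean(y_true * np.log(y_prob + eps)
                   + (1 - y_true) * np.log(1 - y_prob + eps))
    # Per-group misclassification rate at a 0.5 threshold.
    preds = (y_prob >= 0.5).astype(int)
    rates = [np.mean(preds[groups == g] != y_true[groups == g])
             for g in np.unique(groups)]
    # Penalise the spread between best- and worst-served groups,
    # so minimising the loss pushes error rates toward parity.
    gap = max(rates) - min(rates)
    return bce + lam * gap
```

Setting `lam=0` recovers the plain loss; larger values trade raw accuracy for parity between groups, which is the tension fairness-constrained training manages.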
How can decentralised AI models be leveraged to protect patient privacy while still maintaining high-quality, collaborative medical insights across institutions? Does InFlux address this challenge in any way?
Federated Learning models can be leveraged to protect patient privacy and ensure accurate, high-quality medical outputs. These models would be hosted across multiple devices used within the same medical institution, all trained on local data, but without sharing the data.
Therefore, instead of the data being centralized in a single location, each medical device trains a local model using its own data, with all models being centrally coordinated but decentralized in their training and data storage.
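The arrangement Keller outlines (local training, central coordination, no raw-data sharing) matches the federated averaging pattern. The sketch below simulates it with a simple linear model and plain gradient descent; the function names, learning rate, and round counts are assumptions for illustration, and real deployments would add secure aggregation and differential privacy on top.

```python
import numpy as np

def local_update(w, X, y, lr=0.1, epochs=50):
    """One device's training pass: gradient descent on a linear
    model using only that device's local data, which never
    leaves the device."""
    for _ in range(epochs):
        grad = 2 * X.T @ (X @ w - y) / len(y)
        w = w - lr * grad
    return w

def federated_average(global_w, device_data, rounds=10):
    """FedAvg-style coordination: each device trains locally,
    and only the resulting weights are sent back and averaged
    (weighted by local dataset size) into the global model."""
    for _ in range(rounds):
        local_ws = [local_update(global_w.copy(), X, y)
                    for X, y in device_data]
        sizes = np.array([len(y) for _, y in device_data],
                         dtype=float)
        global_w = np.average(local_ws, axis=0, weights=sizes)
    return global_w
```

The coordinator only ever sees weight vectors, never patient records, which is precisely the privacy property being claimed for this architecture.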
One of blockchain’s biggest challenges relates to fragmentation and a lack of interoperability between chains. What are the key technical and operational challenges in integrating decentralised AI systems into existing healthcare infrastructure, particularly in terms of interoperability, model validation and real-time clinical decision support?
I would say that a lack of interoperability between chains is actually a huge misconception within distributed ledger technology. Design principles of modularity, composability, and abstraction are increasingly built into blockchain architecture to enable seemingly isolated networks to interoperate in very complex ways.
However, the biggest challenge of integrating decentralized AI into existing healthcare systems—both technically and operationally—regarding model validation and real-time support, is data migration. Legacy healthcare systems rely on heterogeneous data sources—from EHRs to medical imaging—that use different formats and schemas for annotation, making migration to a decentralized AI network complicated, to say the least.
Furthermore, decentralized AI requires in-memory processing where data remains on local nodes. For large, low-latency data migrations, the nodes in the decentralized AI framework must be synchronized for model training, which can be very challenging to accomplish.
The post Daniel Keller: “I see decentralised compute taking over cloud computing, especially for AI workloads.” appeared first on NFTgators.