$ICP The AI x internet revolution coming on #ICP: *Self-writing* sovereign web applications and internet services that owners can update in real-time by talking 💥

The Internet Computer has been designed for this and work has been ongoing for YEARS. Today I want to explore this topic in a bit of depth for the first time.

Please note — this area of work is not to be confused with the AI capabilities of ICP networks (for newcomers: the Internet Computer is unequivocally the ONLY public network in the world that can host and run AI in the mode of smart contracts, which makes it network-resident, decentralized, secure and unstoppable; for example, I recently demonstrated onchain neural networks performing facial recognition, and pending enhancements to the ICP protocol will allow LLMs to run as smart contracts too).

Today I'm talking about a very different challenge that ICP shall help the world meet, where *running* web applications and internet services are created and updated simply by talking. Users will create these for any purpose – something like a secure personal notes manager or a personal website for themselves, or something for others, like social media, games, web3, or enterprise infrastructure involving online communities large and small – simply by issuing instructions in natural language.

Please also note, this is a far bigger and different challenge than using AI to write and review software, which is already happening at scale. This is something well beyond, for reasons I will explain...

To begin, let's step back and look at some general trends, to understand where the internet is going with AI:

Many of you reading this will already be using ChatGPT to explore ideas, get information, and analyze, refine and create content, and soon to search the internet. ChatGPT is an example of a large LLM (i.e. a chatbot) with a huge number of parameters that has been trained on a colossal amount of data. If you're a software engineer, you'll already have been using LLMs to help write and review code, although maybe you prefer Claude. These models triggered the recent wave of hype around AI. But in fact, they are manifestations of a deeper trend that AI has been driving for some time now.

The deeper trend is this: we interact with AI, knowingly or not, and it gives us something we want. Some of the first mass market manifestations were services like TikTok and Instagram Reels. These services are not traditional social media services at all. Under the skin, they are driven by powerful AI engines, without which they wouldn't work. When you use them, you are really interacting with AI.

The AI inside these services classifies the social media content they can serve, so that it knows what's inside videos and other kinds of posts. Then, as it feeds you content to consume, telemetry is collected, enabling the AI to track how you interact with the content – for example, most simply, it can track how much of a video you watch, to determine what kinds of content you like. As the AI gets better at understanding what you find compelling (which can extend to the sequences of videos it shows you, not just individual videos), you receive a better and better experience, which is why these services are so incredibly addictive.

This new media model is just part of an inexorable journey we are being taken on by AI technology, where it gives us what we want – which will be far reaching. Seemingly impossible things are now happening.
For example, researchers recently trained AI on millions of hours of footage of people playing Doom and Minecraft. This has enabled the AI to simulate those games in real time for users. You can play the games, but there's no game server or game client, just an AI streaming video to you in response to traditional inputs such as left, right, run, jump and shoot (to be clear, there's no game server, or game client, or any other infrastructure involved from the original games).

This hints at what's coming: in the future, AI will create virtual reality experiences for us, and watch how we interact with them, tailoring them in real time to make them more enjoyable and compelling. Sci-fi as this might sound, it really will just be a continuation of the trend exemplified by TikTok and Instagram.

But enough of this. What else will AI give us on demand where ICP specifically plays a role?

We in the ICP community are heavily focused on reinventing the platform on which we build, specifically providing a better way to create web applications and internet services (including those with web3 and AI functionality inside), which also happen to be sovereign. In the future, we will just talk to AI to create and update the *running* web applications and internet services we want too.

It's obvious why. For example, what if a businessperson wishes to build a custom personal website, to promote their brand, with blogging functionality, a section where they can embed media pulled from places like YouTube, a library page hosting documents they have created, and a page linking to their social media profiles – and have all that functionality look a certain way? In the future, will they still be expected to hire designers and developers to build such a thing, or otherwise be stuck fiddling with Wix, or just sticking with their vanilla LinkedIn profile page? Of course not, they'll just talk to AI.

What about an avid gamer, who wishes to create their own custom online game to share with friends? Will it still be difficult to express creativity without special technical skills and a lot of time? Nope. And how about a business, NGO or government that just needs some custom CRM functionality? Will they still have to sign up to big-ticket SaaS services like Salesforce, and hire consultants to customize them?

Today, creating things on the internet is complex, time-consuming and costly – and this stops us building what we want. The world is waiting to break free from this. In the future, we will just talk to AI to create and update, which will result in an almost limitless number of new custom web applications and services being created for every purpose imaginable.

Here's how it will work:
1) You'll describe to AI the custom app or service you want.
2) It will return a URL to your web browser, and there it will be, ready to use!
3) You and others will use the app, causing it to accumulate content and data.
4) You'll describe improvements, extensions and fixes.
5) Then you'll just refresh the web page to see them.
6) Loop to 3, and continue iterating to realize value.
Over time, this new paradigm will massively change the way tech works. Imagine what this might mean for budding entrepreneurs around the world, who lack the technical skills or the funding to hire them, but who have ideas for ventures in social media, games, the sharing economy, AI-based services, web3 services, communication suites, and _ (fill in what you want). This will democratize access to the tech economy, and help enable vast additional worldwide talent to participate and become industrious and successful. This is an aim that has been part of the ICP project since inception.

Furthermore, this functionality will be for all humanity, not just entrepreneurs. Imagine a simple group of high school students who want to organize the information they collect from a biology field trip. It works for them too! Imagine some business department that needs custom online functionality, but can't convince the CIO and CFO to allow them to sign up for something like Salesforce and hire some consultants to customize it (which will anyway take a very long time, and cost a huge amount, even if they agree). Solved.

Now think about the current situation in the developing world. Those economies have growing needs for custom online functionality, but can't afford Big Tech SaaS services, and meanwhile don't have the skills to build it out themselves – and if they did, they would also need the cybersecurity expertise to make what they built safe, which they lack. For those economies, this will be transformative.

The new paradigm will solve for all these needs, and in fact go one better. People who create custom applications and internet services will OWN the software that makes them, even though they don't write it themselves, and OWN the data inside them – which contrasts with the popular SaaS services businesses use, which hold customer data hostage, and even consumer services like Google Photos, which make it impossible to get media out. These custom applications and services will be truly sovereign, and the owners won't be captive customers – which has long been an aim of the Internet Computer network.

Counterintuitively, this new paradigm will also be great for software engineers – it will result in millions of new custom applications and services being created, and inevitably, there will be places where human assistance is useful to solve specific issues, and to help make prompts go further. The orders-of-magnitude increase in online infrastructure will create huge numbers of software engineering jobs, all around the world.

If you have stayed with me this far, hopefully by now you can see that this new paradigm is both inevitable, and represents one of the biggest revolutions ever in tech. So... the next question is, how exactly will ICP finally uncork the paradigm for the world?

To understand this next part, we first have to look at the limitations of traditional IT when applied to this paradigm. Creating and updating *running* online applications is much more complicated than getting an LLM to write some software code. For example, using a typical traditional IT framework to build, this is what might be required:
1. Get an AWS account, and add a credit card
2. Get some "compute instances" (effectively servers)
3. Install some cybersecurity to make it safe
4. Install a database server, a web server, a...
5. Orchestrate it all with e.g. Kubernetes
6. Patch all your software for security
7. Design failover, backup and restore
8. Create tables in the database
9. ...
10. Install the code involved
This is quite a list of tasks, some of which are very involved, so the AI will have to do much more than just write code.
Let's assume the AI is given hooks, and has the knowledge, to perform all these steps itself. Will this solve the paradigm? Unfortunately there will still be issues...

At a basic level, the paradigm should provide users with a real-time creation experience, and even installing a database server or a patch can take some time. Of course, this kind of thing can be mitigated by using preinstalled images, but the problems run deeper...

The various steps and requirements involved in traditional IT can go wrong in multitudinous ways. The AI's build sequence will break, like it does for humans, and it will have to make judgments about how to address the failures within an *unconstrained problem space,* which will subtly impact things like security – which is a serious problem, because traditional IT is insecure by default, and even small mistakes can result in disaster. Traditional IT is a crazy complex Rube Goldberg machine, and giving AI sole responsibility in that kind of unconstrained problem space is potentially very dangerous, because it hallucinates, and can pick up bad memes from its training data. Everything the AI does would have to be manually reviewed and, if the application or service is important, audited, by technically competent people – and of course, the whole point of this paradigm is that it doesn't require creators to have technical skills, and that creation becomes a real-time iterative act.

And there are other showstoppers using traditional IT. In the new paradigm, the users/creators will want to update their running web applications and internet services in real-time, simply by telling the AI about the improvements, extensions and fixes they want. The systems created will need to undergo substantial upgrades every few minutes!! Traditional IT has not been designed to make this possible. Anyone involved will know that upgrades are a big deal, and with production systems, typically occur at long intervals. That's because changes often have to be made across multiple components in a synchronized way (e.g. updating database tables, changing web server configurations...) and it's a fiddle.
Moreover, when you change the design of web applications and services running on traditional IT, the upgrade process often involves refactoring/migrating data, a process that is slow, computationally expensive and error prone – again, preventing traditional IT from realizing the crucial real-time nature of the paradigm, which will involve running web applications and internet services being updated almost at the speed of chat.

I could go on, but it should be clear that traditional IT is not really suitable for the paradigm. Because of the difficulties, we will see services like Vercel, and probably eventually services like Google, providing AI that can create apps within custom infrastructure platforms they craft, which ameliorate some of these problems.
But their platforms will remain less than ideal, and moreover, the software created by the AI will be locked to their special platforms, and they will probably also hold the data involved hostage somehow, creating customer lock-in, and the applications and services involved won't be sovereign. (Nonetheless, we predict that certain web3 projects that tend to imitate ICP will, in desperation, use something like Vercel to create dumbed-down versions of this paradigm, then mis-sell their services to the token-buying public as being "onchain," but beyond successfully selling more tokens for them, their schemes will ultimately fail to win over mainstream users/creators worldwide.)

So what is really needed? The good news is that DFINITY has been working on a solution to this paradigm for years already. Let me explain...

From the beginning, going back years, and now representing more than 1,000 person-years of R&D effort, our work has been ruthlessly focused on reinventing compute broadly, using a decentralized network that leverages advanced cryptography, protocol math and computer science.
Our work is completely unique in the web3 field. A key feature of ICP is that you can build a web application entirely from secure and unstoppable network-resident software, which is a vastly more powerful evolution of smart contracts. When you build on the network, you don't need Big Tech and traditional IT.
AI can write this code, and upload it to an ICP network like the Internet Computer to create a web application or other internet service. To be clear, in the radical compute environment ICP creates, AI can create and update just by uploading code...
There is no need to configure a cloud account, a database, a web server, or cybersecurity. Moreover, the code is automatically secure and immune to cyber attack. So there's no need to worry that a hallucination will leave a door open for hackers.
Moreover, there's no need for the AI to design and configure complex failover systems, because the code is unstoppable and always runs.
So those roadblocks are out of the way. But the real power derives from a seminal computer science advance that ICP provides called "orthogonal persistence" (keep reading, I won't go too technical on you!). On ICP, units of software run inside persistent pages of memory, which is basically to say that data "sticks" to the software logic that programmers create when they write in a programming language. This saves them from the complexity and inefficiency of copying data in and out of databases and files, and also removes the need for those things. Everything is just network-resident, highly-abstracted software.
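To make that a little more concrete, here is a minimal sketch of what orthogonal persistence looks like in Motoko – a hypothetical notes canister of my own invention, not production code. The "stable" variable lives in persistent memory, so the notes survive across upgrades without any database, file or serialization code:

```motoko
import Array "mo:base/Array";

// Hypothetical notes canister: a minimal sketch of orthogonal persistence.
actor Notes {
  // State declared "stable" persists across canister upgrades.
  // No database, no file I/O, no serialization code: the data sticks to the logic.
  stable var notes : [Text] = [];

  // Append a note and return the new count.
  public func add(note : Text) : async Nat {
    notes := Array.append<Text>(notes, [note]);
    notes.size()
  };

  // Read all notes back.
  public query func all() : async [Text] {
    notes
  };
}
```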
This persistence model makes it possible for engineers, and soon AI, to describe functionality in a much simpler form, without dependencies, which is ideal for this paradigm. I described a vision for "orthogonal persistence" back in 2018, but only now, 6 years later, will it be fully realized through Motoko, a domain-specific language that hooks directly into the workings of the ICP platform. Get ready for what we are calling EOP, or "enhanced orthogonal persistence," which will finally realize what we have been working towards. (This is gated by pending upgrades, such as the 64-bit change that is also needed to run LLMs on the network, which are about to happen.) Above I talked about the importance of instant and safe upgrades to the new AI paradigm.
Well... EOP makes it possible to "morph" software between upgrades.
Developers (human and AI) will write new versions of software, which realize the changes required. Then, separately, they will describe code that transforms the data from the old version.
(For example, if a Google Photos-style application has been created, then an upgrade might add location data and comments to photos, causing the structure of the "photo" data type to change; a sketch of such a migration follows the list below.)
During upgrades, in the new paradigm, EOP does the following: 1) It adds type safety to the upgrade, ensuring that if the AI has made a mistake that might cause data loss, by hallucination or otherwise, the upgrade will fail, which greatly reduces the risks that would be ever-present in traditional IT architectures.
2) As software is morphed through upgrades, it allows data transformation to occur in a highly efficient manner, such that the paradigm can deliver real-time upgrades at the speed of talk.
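Continuing the hypothetical photos example, here is a hedged sketch of the kind of data transformation such an upgrade involves. The types and the migration function are my own illustration, not the exact EOP migration syntax:

```motoko
// Hypothetical migration module: a sketch of the photo-type change,
// not the exact EOP migration mechanism.
module PhotoMigration {
  // Old shape of the persistent "photo" data type.
  public type PhotoV1 = { id : Nat; image : Blob };

  // New shape after the upgrade: location data and comments are added.
  public type PhotoV2 = {
    id : Nat;
    image : Blob;
    location : ?Text;  // optional, since existing photos have no location yet
    comments : [Text]; // starts empty for existing photos
  };

  // Transformation applied to each existing record during the upgrade.
  // A type-checked upgrade fails rather than silently losing data if the
  // new code expects something this transformation does not produce.
  public func migrate(old : PhotoV1) : PhotoV2 = {
    id = old.id;
    image = old.image;
    location = null;
    comments = [];
  };
}
```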
Which is what is needed. Other benefits from the years focused on reinventing compute are important too. For example, because code and state exist as one within this environment, ICP can snapshot applications and services almost instantly, enabling users to roll back to precisely where they were before if they don't like how their data was transformed (using EOP, this can often also be done just by "upgrading" to earlier versions of the software).
I could go on, but I'll summarize. This new paradigm, which will ultimately profoundly change tech, will be unlocked by combining ever-improving AI with ICP technology.
What's incredible for the ICP community is that the paradigm provides utility to a massive worldwide market, and we will not be constrained by web3 noise. People will use it because it gives them what they want. I can tell you that behind the scenes we are expending great effort on AI itself, and also on the framework that will let it build on ICP, and on making sure that the Internet Computer can scale to handle this – work you might already be aware of. If you think the incredible recent growth of compute on the Internet Computer is impressive, buckle up, because this paradigm may mean we haven't seen anything yet.
As always, we choose to believe in this over narratives: Pure. Utility. From. Advanced. Alien. Tech. Will. Win. In. The. End.
Today we are closer than ever.
Oh, and btw, did I mention that this next generation of web applications and internet services that people create using AI will be internet-native, and sovereign?
They will run on a network hosted by real decentralized hardware (the Internet Computer is one of the few web3 networks that does not really run on Big Tech), leverage trustless multi-chain functionality, and embed real onchain AI.
Can your blockchain host a single phone photo?! I asked ChatGPT to tell me how much it would cost to upload a 2 MB phone photo to #SOL, #AVAX, #SUI, #APT, #NEAR and #ICP. The results speak for themselves.
$ICP is the only real onchain player. https://youtu.be/4jg7LAbyn50
$ICP 🧠📖🔍 Many misunderstand the concept of max supply for a blockchain's core utility token. When a blockchain has a max supply, it is essentially betting that by the time this limit is reached, it will be able to cover all rewards (for nodes, staking, and governance) through fees.
If this fails to occur and the blockchain cannot cover all rewards through transaction fees alone, it faces limited options: 1) shut down the network, 2) increase the token supply, or 3) increase fees.
Well-designed blockchains incorporate burning mechanisms, so when the network is thriving, the total supply decreases, and when the network faces challenges, rewards are covered through token emission.
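As a rough back-of-the-envelope framing (my own, not a formula from any whitepaper): over a given period,

$$ \Delta S \;=\; R_{\text{rewards minted}} \;-\; B_{\text{fees burned}} $$

so when the network is thriving and burns more than it mints ($B > R$), total supply falls; when usage is low ($B < R$), supply grows to keep covering rewards. A hard max supply removes that second option once the cap is hit.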
Invest based on the utility that the blockchain can provide, not the misleading max supply metric. Remember, it's all software, so the arbitrarily set max supply parameter can easily change, and it will if the alternative is network shutdown.
$ICP has the potential to outshine all existing cryptocurrencies. On paper, there is no crypto better than ICP. This is why many crypto influencers refuse to talk about it: if they did, they wouldn't be able to justify discussing the thousands of other garbage cryptos, as the gap is simply too wide.
Millions of users have followed these influencers, who often promote overhyped projects riding short-term pumps. As a result, toxic communities have emerged, filled with people who have staked for years in projects that may eventually go to zero. These projects won't be able to compete with ICP's long-term capabilities. This is why there are so many trolls and haters. It's not just competition, it's fear of obsolescence.
$ICP Internet Computer (ICP) was one of the most anticipated blockchain projects of 2021. Backed by the Dfinity Foundation, ICP promised to revolutionize decentralized internet infrastructure. When the ICP token launched in May 2021, it was listed almost immediately on major exchanges, including FTX. The token’s value initially skyrocketed, hitting highs near $700 in its first days of trading, before crashing by over 90% within a short period. The sharp price collapse led to allegations that Alameda Research and FTX were involved in price manipulation.
1. Market Position and Listing Strategy
Sam Bankman-Fried’s exchange, FTX, played a key role in ICP’s early market environment. When ICP launched, most of the circulating supply was controlled by institutional players and early investors, with limited tokens available for retail trading. Alameda Research, a quantitative trading firm closely affiliated with FTX, reportedly acquired a large tranche of ICP tokens, potentially giving them substantial control over the liquidity and supply.
FTX listed ICP shortly after its launch, which was critical in concentrating liquidity on this one platform. Given the limited circulating supply, FTX had significant influence on ICP’s price formation, especially in the initial phase where price discovery was still ongoing. ICP’s listing on FTX was a gateway for price exposure but also made it vulnerable to manipulation, particularly through derivatives like perpetual swaps and futures that FTX specialized in.
2. Aggressive Selling and Shorting Strategies
Once listed on FTX, Alameda allegedly started aggressively selling large volumes of ICP, creating intense selling pressure. By flooding the market with tokens at a time when liquidity was thin, Alameda may have deliberately suppressed the token’s price.
At the same time, Alameda’s positioning in shorting ICP futures and perpetual contracts could have been another key element of the strategy. In a low-liquidity environment, short positions, combined with aggressive spot market selling, would drive down the token price, profiting from the resulting decline. This type of strategy is commonly used by institutional players to capitalize on an asset’s volatility or exploit inefficiencies in the market.
FTX’s derivative markets allowed leverage trading on ICP futures, which magnified the effect of even slight price movements. Alameda, with access to significant capital and influence over order flows, could take advantage of both the spot and derivative markets, moving the price sharply downward and profiting from their short positions in the process.
3. Control Over Circulating Supply and Market Depth
ICP’s price collapse wasn’t merely a result of normal market conditions. ICP’s circulating supply was extremely limited when trading began. The vast majority of tokens were still locked in vesting schedules for early investors, meaning the liquid supply on exchanges like FTX was constrained. This gave Alameda and other major institutional players outsized control over price action, particularly because they could easily overwhelm order books with sell orders due to the shallow liquidity.
Market depth on FTX was reportedly thin for ICP, amplifying the price impact of large orders. Alameda could use this to its advantage, placing substantial sell orders on the spot market while executing shorts in the futures market, triggering cascading liquidations of over-leveraged positions. Given the leverage available on FTX, this likely triggered forced liquidations of long positions, driving the price down further.
4. Market Panic and Liquidity Crisis
The rapid decline in ICP’s price soon led to market panic. Retail traders, who had bought into the hype around ICP’s launch, were trapped as the token’s value plummeted. Many of these traders were highly leveraged through FTX’s futures contracts, and as prices continued to fall, margin calls and forced liquidations followed, adding fuel to the fire.
By engineering a steep decline, allegedly through aggressive shorting and spot selling, Alameda and other institutional players may have taken advantage of retail traders and those betting on ICP’s long-term success. This created a liquidity crisis, with panic sellers flooding the market, further exacerbating the sell-off.
5. Price Manipulation Allegations and Unethical Conduct
The core of the allegations against Sam Bankman-Fried and Alameda is that they deliberately manipulated the ICP price to profit from short positions while knowingly exploiting the vulnerabilities of the retail market. Alameda’s ability to place large sell orders, while shorting the token simultaneously, allowed them to profit from ICP’s decline both directly and indirectly through FTX’s derivatives market.
Moreover, there are accusations that FTX’s platform mechanics may have been designed to exacerbate these conditions. Some critics argue that FTX’s risk management systems, particularly how it handled margin calls and liquidations, could have been exploited to create a feedback loop of forced liquidations, further driving down the price.
Conclusion
The ICP price manipulation allegations revolve around Sam Bankman-Fried’s control over both FTX and Alameda Research, enabling a coordinated attack on ICP’s price. Through aggressive selling, strategic shorting, and exploiting market depth, Alameda may have engineered a sharp price decline to profit from both spot and derivative markets.
While these allegations remain under investigation, the case highlights the potential for market manipulation in the crypto space, especially when a few large players hold significant power over both liquidity and exchange infrastructure.
More details here: https://cryptoleaks.info/case-no-1