The number-one thesis I've been mulling over lately is that we are transitioning from a compute-constrained world to a data-constrained world.
The typical refrain is that the data accessible via Common Crawl has become commoditized, which is of course true. But an overlooked input to that problem is that the surface area of content accessible to crawlers has also contracted dramatically. Why? Incentives.
In this piece exploring the crossover between crypto and AI, I argue that the single agent with the best product-market fit today, the web crawler, is due for reinvention.
Big thanks to @VirtualElena, whose internet data sleuthing ignited my interest in this space, as well as to the @a16zcrypto editorial team for their help making this digestible. Let me know what you think :)