Back in 2019, when machine learning, statistical modeling, data munging, natural language processing, and the other subjects under the artificial intelligence umbrella term were starting to get attention from the Ethereum community, Vitalik Buterin published the first mention of how optimization surfaces behave around saddle topographies and how, in the medium term, blockchains could integrate with machine learning through such optimization topographies. The piece was mostly inaccurate, but it had already sparked interest from the Ethereum community back then. More importantly, it helped the Ethereum community take a serious look at potential artificial intelligence applications on the blockchain and at how these two seemingly dissimilar technologies could merge.
One of our former partners, at the time working at a research lab in brain-computer interfaces, famously said that artificial intelligence and blockchain in their then-current (2020) forms were mostly incompatible. To be fair, projects like SingularityNET didn't have a concrete product, NumerAI had already launched the first financial model aggregation, and Akash was still heavily a work in progress. This was not meant as a claim of complete incompatibility between AI/ML and blockchains, as we will show later; rather, it meant that models and data were not meant to be posted directly onto the available blockspace.
The blockchain+AI question didn't raise any concerns before the advent of LLMs and high-quality generative models. Why was that the case? Because the models of that era could be replicated on powerful yet accessible hardware by most people and industries that were really interested in the technology, so centralization of models, data, and training was not a real concern. In short, it was more practical to self-host your own models on your own infrastructure without any blockchain overhead.

LLMs and generative models changed the game, however. As soon as LLMs and high-quality generative models appeared and were adopted by a wider population, the censorship pitfall began to emerge for all generative models. Regulators began taking notice of these "advanced models" (or should we speak of advanced data instead?), and talk of regulation began to emerge; two of the most important efforts were the EU AI Act and the SB 1047 AI safety bill, among many others in the works. Together with anti-competitive pushes signed by private individuals and researchers, it suddenly became clear that AI is a technology that could make use of the censorship-resistant properties that decentralized blockchains have.
Near was one of the first blockchains to realize the potential of applying the censorship-resistance technology usually found on blockchains to generative AI models. Internet Computer and others also began realizing the potential of finally connecting these two worlds, formerly and commonly thought dissimilar.
What can this look like in practice?
Unlike DeFi, currently the most common application on blockchains, AI apps are meant to serve end users directly while using blockchain technology in the background, making them ideal candidates for abstracting the blockchain away from the end user; payment rails, ticketing, and prediction markets fit the same pattern. Hyperbolic, Exabits, Akash, and many others make high-end hardware available to the public and to institutions at a lower price than self-hosting the hardware. In this simple example, payment rails provided by the blockchain are integrated with the already well-explored hardware provisioning of third-party entities, a core DePIN/AI use case for a more technical audience. Cosmose and Jutsu, however, take abstraction a step further by offering AI as a service, available to end users directly.
How is Polkadot positioned at the moment?
Currently, Phala and Peaq are the leaders in this technology on Polkadot. The Polkadot-SDK (formerly known as the Substrate-SDK) can integrate the payment rails and support development of whatever technology a given application needs. As with other blockchains, hardware distribution is the cornerstone of censorship resistance for the models and data. What matters just as much, however, is the builder and experimentation culture that is currently strong in Near, ICP, Solana (which also has native DePIN projects), and many other blockchains. That culture is usually accompanied by strong financial incentives that allow projects to integrate and cooperate with other projects and people and, more importantly, give them enough funds to crystallize their ideas into tangible projects.
Even today, no blockchain has emerged victorious in the AI race, so the game is still open for all AI projects and applications seeking censorship resistance for their models. This kind of innovation is still young, and its potential is largely untapped. AI is not just a narrative waiting to be exploited for token appreciation; it is a potential application that can take blockchain technology to the next level.
Published by: Saxemberg on Sept. 24, 2024