Decentralizing AI: Dreamers vs. Pragmatists

Jesus Rodriguez · May 23

The decentralization of artificial intelligence (AI) is one of the most fascinating technology trends of the moment and one that can become the foundation of a sustainable path for the field.

The emergence of trends such as federated learning, blockchain technologies and secure encrypted computation has provided a viable technological path for the creation of decentralized AI applications.
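To make the federated learning piece of that stack more concrete, here is a minimal sketch of federated averaging in plain Python and NumPy: each participant trains a toy linear model on data that never leaves its node, and only the resulting weights are aggregated, weighted by dataset size. The function names and the toy model are illustrative and not tied to any particular framework.

```python
import numpy as np

def local_update(weights, X, y, lr=0.01, epochs=5):
    """One participant trains a toy linear model on its private data."""
    w = weights.copy()
    for _ in range(epochs):
        preds = X @ w
        grad = X.T @ (preds - y) / len(y)   # mean-squared-error gradient
        w -= lr * grad
    return w

def federated_average(global_weights, participants):
    """Aggregate locally trained weights; raw data never leaves each node."""
    sizes = np.array([len(y) for _, y in participants])
    updates = np.array([local_update(global_weights, X, y) for X, y in participants])
    return (updates * (sizes / sizes.sum())[:, None]).sum(axis=0)

# Toy usage: three participants with private datasets of different sizes.
rng = np.random.default_rng(0)
participants = [(rng.normal(size=(n, 3)), rng.normal(size=n)) for n in (50, 80, 120)]
weights = np.zeros(3)
for _ in range(10):                          # a few federated rounds
    weights = federated_average(weights, participants)
```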

However, most of today’s applications of decentralized AI remain highly theoretical exercises or are constrained to very self-contained use cases.

Despite the obvious benefits of the decentralization of machine knowledge, the path to its practical implementation is not trivial and it might very well not happen.

Today, I would like to provide a pragmatic perspective about a possible path towards the adoption of decentralized AI technologies based on the current realities of the AI and blockchain ecosystems.

Finding a viable path towards the decentralization of AI relies on finding a balance between the promised benefits of decentralized intelligence and the economic and technical realities of the AI space.

Last year, I published a long three-part essay (Part I, Part II, Part III) in which I outlined the economic and technical enablers that paved the way for the evolution of decentralized AI technologies.

Since then, my thinking has evolved into trying to find viable avenues to adopt decentralized models in AI applications.

The fact remains that, while decentralized AI models might hold the key to a better and more sustainable AI, the current technical delivery is too limited to be adopted by mainstream organizations.

While the dreamer in me desperately wants to believe that a decentralized AI is a better AI, the pragmatist in me constantly wrestles against the technical and economic realities of the nascent AI space.

From my perspective, finding a viable path towards the adoption of decentralized AI is a three-step process:

i. Understand the economic imperative of decentralizing AI.

ii. Steadily introduce decentralization into some simpler areas of the AI application lifecycle.

iii. Build the right incentives and network effects to foster the growth of decentralized AI networks.

The Dreamer’s Perspective: Decentralized AI is the Only Path to Sustainable AI

During the last few years, AI has evolved around completely centralized models across the different aspects of the lifecycle of AI applications.

Given that the current generation of AI solutions requires large, high quality training datasets, the bulk of the innovation in the market has come from the big corporate labs of companies like Google, Microsoft, Facebook or Uber instead of from the startup ecosystem.

The AI created by those large companies is also conducive to better data, which in turn translates into better intelligence.

That vicious cycle has steadily contributed to widening the gap between large companies, which have the data and the data science talent to execute on AI initiatives, and smaller companies without those resources.

Extrapolating that dynamic to entire economies, the centralization of AI is likely to be one of the factors that increases the gap between first world countries and the rest of the planet.

The industrial revolution of the 18th century is one of the transformational economic movements most often compared to the evolution of AI.

If the industrial revolution moved a few countries from hand production methods to machines, AI is leading the transformation from utility software to intelligent software systems.

One of the side effects of the industrial revolution is that it created a 150-year gap between the countries that industrialized and the ones that didn’t.

It took that long for developing countries to close the gap with the top world economies.

The centralized models of the current generation of AI applications are likely to create an even bigger gap between economies like the US and China and the rest of the world.

Decentralized AI models that facilitate the collective creation and sharing of knowledge are one of the only viable mechanisms to prevent AI from widening the gap between large companies and startups, or between the world’s leading economies and developing nations.

Networks in which autonomous actors are incentivized to publish datasets and to create, train or optimize models are a more sustainable mechanism for fostering the creation of AI that doesn’t just make the rich richer.

The Pragmatist View: The Practical Challenges of Decentralized AI

While the value proposition of decentralized AI makes obvious sense, the practical implementations are plagued with challenges.

From the immaturity of the technology stacks to marked frictions in the delivery model, decentralized AI solutions currently face major roadblocks for mainstream adoption.

Decentralization typically creates disruption when applied to traditionally centralized structures and AI won’t be the exception.

While there are many challenges that can be linked to the limited adoption of decentralized AI technologies, most of them can be grouped into the following categories:

· The Double-Disruption Challenge: AI is still in its infancy as a technology trend, and most organizations are just starting to figure out ways to adopt new deep learning or machine learning stacks.

From limited talent availability to the native complexity of the technologies, most companies face enormous challenges adopting AI as a key pillar of their technology strategy.

Decentralization is an additional layer of complexity that, to most organizations, might not appear critical in these early stages.

· The Computation Challenge: Decentralized ledgers such as blockchains are still too limited to execute expensive computations such as the ones required by deep learning models.

To that extent, decentralized AI networks still require off-chain computation models, which creates an infrastructure challenge for most organizations (see the sketch after this list).

· The Incentive Challenge: Decentralized AI structures need to rely on incentive mechanisms to motivate different parties to participate in a network.

When dealing with something as precious as data and knowledge, the incentive model needs not only to be incredibly robust but also to remain competitive with the ROI of centralized AI approaches.

Additionally, incentive structures typically introduce an attack vector for bad actors to manipulate the behavior of a network.
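To make the last two challenges more tangible, the sketch below shows the pattern most decentralized AI networks gravitate towards: expensive training runs off-chain, only a hash of the result is committed to the ledger, and participants stake tokens that can be slashed when a commitment fails verification. The Ledger class, the token amounts and the slashing rule are hypothetical simplifications, not the API of any real blockchain.

```python
import hashlib
import json

def digest(result):
    """Deterministic hash of an off-chain result, suitable for on-chain commitment."""
    return hashlib.sha256(json.dumps(result, sort_keys=True).encode()).hexdigest()

class Ledger:
    """Hypothetical in-memory stand-in for an on-chain staking contract."""
    def __init__(self):
        self.stakes = {}        # participant -> staked tokens
        self.commitments = {}   # participant -> committed result hash

    def stake(self, who, amount):
        self.stakes[who] = self.stakes.get(who, 0) + amount

    def commit(self, who, result_hash):
        self.commitments[who] = result_hash

    def settle(self, who, verified_result, reward=10):
        """Reward participants whose commitment matches the verified result; slash the rest."""
        if self.commitments.get(who) == digest(verified_result):
            self.stakes[who] += reward
        else:
            self.stakes[who] = 0    # slash the full stake

def train_off_chain(dataset):
    """Placeholder for an expensive training job that never touches the chain."""
    return {"model_version": 1, "accuracy": round(0.8 + 0.001 * len(dataset), 3)}

ledger = Ledger()
ledger.stake("worker-1", amount=100)
result = train_off_chain(dataset=list(range(50)))
ledger.commit("worker-1", digest(result))
ledger.settle("worker-1", verified_result=result)   # honest worker keeps stake plus reward
```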

Addressing the aforementioned challenges is the only way to create an economically and technologically feasible path for the adoption of decentralized AI technologies.

Any strategy needs to strike a gentle balance between the nascent state of AI technologies and the disruptive nature of decentralized models.

A Viable Path Towards the Decentralization of AI

A helpful way to strategize about the adoption of decentralized AI models is not to think of them as a single problem but as a collection of challenges, each related to a different aspect of the lifecycle of AI applications.

From that perspective, instead of thinking of decentralizing AI as a whole, we can frame the problem as decentralizing aspects of AI.

If we organize those aspects along a path from least to most disruptive, we get something like the following:

· Decentralized Data Sharing: Incentivizing data publishing and sharing among participants in a network is the least disruptive of all elements of decentralized AI applications.

It is simpler for an organization to join a network to publish and consume relevant datasets than to establish the infrastructure to run decentralized deep learning models.

· Decentralized Training and Predictions: After a decentralized network for data sharing is established, the next logical step is to decentralize the training of models and the publishing of results.

This structure introduces decentralization into all aspects of an AI model except the computation itself.

· Decentralized AI Models: Finally, we can think of decentralizing the execution of AI models themselves, the dynamic allocation of resources and the consumption of the models.

This will be the ultimate manifestation of decentralized AI.

The previous steps offer a pragmatic strategy towards the adoption of decentralized AI models.

The strategy is not only practical; we already have emerging technologies addressing each of the steps in the cycle.

Decentralized Data Sharing: Ocean Protocol

The Ocean Protocol is one of the fastest-growing decentralized AI stacks.

Conceptually, the main role of the Ocean Protocol architecture is to enable decentralized communications between entities in an AI workflow.

From data or algorithm providers to analytics tools, the Ocean Protocol provides a model based on tokenized incentives and blockchain smart contracts that allows different parties to collaborate on AI workloads through fair and efficient interactions.

Despite its general feature set, the Ocean Protocol excels at the sharing of data between nodes in the network by introducing a tokenized incentive layer.

The Ocean Protocol is one of the few decentralized AI stacks that can be used in conjunction with major deep learning and machine learning frameworks without major disruptions.
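As a rough illustration of the kind of interaction the Ocean Protocol enables, the sketch below models a token-incentivized data marketplace: a provider publishes a dataset listing with a token price, a consumer pays in tokens, and only a pointer to the off-chain data changes hands. The class and method names are hypothetical and are not Ocean's actual SDK.

```python
from dataclasses import dataclass, field

@dataclass
class DatasetListing:
    owner: str
    uri: str            # pointer to the data itself; the data stays off-chain
    price_tokens: int

@dataclass
class DataMarketplace:
    """Hypothetical token-incentivized data-sharing market (not Ocean's real SDK)."""
    balances: dict = field(default_factory=dict)
    listings: dict = field(default_factory=dict)

    def publish(self, listing_id, listing):
        self.listings[listing_id] = listing

    def purchase_access(self, listing_id, buyer):
        listing = self.listings[listing_id]
        if self.balances.get(buyer, 0) < listing.price_tokens:
            raise ValueError("insufficient token balance")
        self.balances[buyer] -= listing.price_tokens      # consumer pays the provider
        self.balances[listing.owner] = self.balances.get(listing.owner, 0) + listing.price_tokens
        return listing.uri                                 # consumer receives a pointer to the data

market = DataMarketplace(balances={"lab-a": 500})
market.publish("satellite-imagery-v1",
               DatasetListing(owner="provider-x", uri="ipfs://<dataset-cid>", price_tokens=50))
data_uri = market.purchase_access("satellite-imagery-v1", buyer="lab-a")
```

The design choice the sketch captures is that the network only coordinates discovery, access and payment; the datasets themselves never live on the ledger.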

Decentralized Training and Predictions: Erasure

Erasure is the protocol powering the famous Numerai hedge fund.

Of all the aspects of decentralized AI, Erasure excels at the publishing and validation of predictions based on available datasets.

The goal of Erasure is to provide a decentralized marketplace in which data scientists can upload predictions based on available data, stake those predictions using crypto tokens and earn rewards based on the performance of the prediction.

While the first use cases are related to financial forecasts from Numerai, Erasure can be used for any type of prediction.

Architecturally, Erasure combines several components that provide a foundation of decentralized interactions between buyers and sellers in a decentralized marketplace.
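A minimal sketch of the staking mechanic at the heart of this model, with an illustrative payout rule rather than Erasure's actual contracts: a data scientist stakes tokens on a prediction, and the stake is either rewarded or partially burned depending on how the prediction performs against the realized value.

```python
def settle_prediction(stake, predicted, actual, tolerance=0.05, reward_rate=0.5, burn_rate=1.0):
    """Pay out or burn a staked prediction based on its realized error.

    The parameter names and rates are illustrative; a real protocol defines
    its own payout curves and dispute mechanisms.
    """
    error = abs(predicted - actual) / max(abs(actual), 1e-9)
    if error <= tolerance:
        return stake + stake * reward_rate                  # stake returned plus a reward
    return stake * max(0.0, 1.0 - burn_rate * error)        # part (or all) of the stake is burned

# A forecaster stakes 100 tokens on a point forecast of 102 against a realized value of 100.
payout = settle_prediction(stake=100, predicted=102, actual=100)
print(payout)   # error of 2% is within tolerance -> 150 tokens
```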

Decentralized AI Models: SingularityNet

SingularityNet is, arguably, the most ambitious company in the decentralized AI space.

Famous for powering the popular Sophia robot, SingularityNet is looking to introduce decentralization across all aspects of the AI lifecycle.

Technically, SingularityNet is a platform that enables the implementation and consumption of AI services in a decentralized model.

Built on the Ethereum blockchain, SingularityNet provides a model in which different participants in the network are incentivized for the implementation or usage of AI services.

From the architecture standpoint, SingularityNet is based on a series of components that abstract the fundamental aspects of the lifecycle of a decentralized AI application.
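To illustrate the consumption side of such a network, the sketch below models a hypothetical service registry in which independent providers register AI services with a per-call token fee and callers pay on each invocation. The names and interfaces are illustrative assumptions, not SingularityNet's real SDK.

```python
from dataclasses import dataclass, field
from typing import Callable

@dataclass
class ServiceRegistry:
    """Hypothetical registry of AI services offered by independent providers."""
    services: dict = field(default_factory=dict)   # name -> (provider, fee, handler)
    balances: dict = field(default_factory=dict)   # participant -> token balance

    def register(self, name, provider, fee_tokens, handler: Callable):
        self.services[name] = (provider, fee_tokens, handler)

    def call(self, name, caller, payload):
        provider, fee, handler = self.services[name]
        if self.balances.get(caller, 0) < fee:
            raise ValueError("insufficient tokens escrowed")
        self.balances[caller] -= fee                                    # caller pays per invocation
        self.balances[provider] = self.balances.get(provider, 0) + fee  # provider earns the fee
        return handler(payload)                                         # the model itself runs off-chain

registry = ServiceRegistry(balances={"app-1": 20})
registry.register("sentiment", provider="node-7", fee_tokens=2,
                  handler=lambda text: "positive" if "good" in text.lower() else "negative")
label = registry.call("sentiment", caller="app-1", payload="The results look good")
```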

The path towards decentralized AI is about finding the right balance between the importance of decentralization, as a way to break the knowledge concentration and influence of the big players, and the need to minimize disruption in a technologically challenging field.

Some of the ideas outlined in this article provide a strategy to steadily adopt decentralized AI stacks in a pragmatic but still ambitious way.
