Pieces of the same Puzzle: AI & Blockchain

The Artificial Intelligence revolution is in full swing. From self-driving cars to drone delivery systems, we are already experiencing a futuristic world. Advancements in AI have reshaped the daily routines we were accustomed to for so long, and the rate of innovation in the field is exponential. Elon Musk recently predicted that fully autonomous Tesla vehicles and taxis, enabled by the company's sophisticated AI, would be on the road next year, and a report by Tractica and TechXLR8 analyzing the AI market forecast enterprise AI application revenue to reach $80.7 billion USD by 2025.

Despite the exponential growth in the field, there are several core issues that need to be solved for mainstream adoption of AI.

In this article, we will explore four main areas where blockchain technology will act as the enabling catalyst for AI-based applications to reach their full potential and increase human trust in machine-generated results.

1. Data — Security, Quality & Volume

Comparable to oxygen for humans, quality data is the most vital ingredient for any AI system. The saying “garbage in, garbage out” applies perfectly to this situation. If the quality of the data is poor, the results will be inaccurate and will ultimately have a negative impact on the overall business value.

Many enterprises have in-house data; however, its quality is often poor. Except in the largest organizations, the breadth of that data is limited, which creates a barrier to completing a full analysis or addressing a business need. To add value to their analysis, these enterprises require access to high-volume, high-quality data. They can ramp up their data-generation efforts by dedicating resources, or they can work with third-party data providers and data-sharing marketplaces. Yet even when a trusted third-party data provider is involved, organizations still cannot guarantee the quality of the data, because the flow of information is mostly undocumented.

Blockchain creates an environment where data is private, immutable, transparent and secure, making it ideal for privately sharing confidential or personal information such as medical and health records. With a strong ecosystem, organizations can ensure the data gathered is of high quality by applying checks and balances over the flow of data. They can also trace where the data originated and the steps taken to prepare it, and validate its authenticity.
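To make the idea of a traceable data trail concrete, here is a minimal sketch of how provenance records could be hash-chained so that any tampering with the history becomes detectable. The record fields, actors and verification logic below are illustrative assumptions, not taken from any particular blockchain platform.

```python
import hashlib
import json
import time

def record_step(prev_hash, actor, action, data_fingerprint):
    """Append one provenance entry, chained to the previous entry by hash."""
    entry = {
        "timestamp": time.time(),
        "actor": actor,                        # who handled the data (illustrative field)
        "action": action,                      # e.g. "collected", "anonymized", "labeled"
        "data_fingerprint": data_fingerprint,  # hash of the dataset at this step
        "prev_hash": prev_hash,                # link to the previous entry
    }
    entry["hash"] = hashlib.sha256(
        json.dumps(entry, sort_keys=True).encode()
    ).hexdigest()
    return entry

def verify_chain(entries):
    """Recompute every hash and check the links; any tampering breaks the chain."""
    for i, entry in enumerate(entries):
        body = {k: v for k, v in entry.items() if k != "hash"}
        expected = hashlib.sha256(json.dumps(body, sort_keys=True).encode()).hexdigest()
        if entry["hash"] != expected:
            return False
        if i > 0 and entry["prev_hash"] != entries[i - 1]["hash"]:
            return False
    return True

# Example: trace a health-records dataset from collection through anonymization.
step1 = record_step("0" * 64, "hospital-a", "collected",
                    hashlib.sha256(b"raw records").hexdigest())
step2 = record_step(step1["hash"], "data-team", "anonymized",
                    hashlib.sha256(b"anonymized records").hexdigest())
print(verify_chain([step1, step2]))  # True
```

In this kind of scheme, a data buyer only needs the chain of entries to confirm that the dataset they received matches the last recorded fingerprint and that no preparation step was silently altered.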

2. Infrastructure Needs

The more data we gather, the better AI algorithms perform. Nevertheless, this is easier said than done. Even if we gather immense volumes of high-quality data, without access to high-performance hardware, generating results can take weeks or months depending on the size of the data. Currently, companies rely on traditional, centralized web service providers such as Amazon and Microsoft to fulfill their hardware needs. There are several issues with this:

i. the services are very expensive

ii. the servers and control are centralized

Blockchain-based solutions such as Cryptyk, with decentralized storage and sharding, provide an added layer of data security. Companies such as Taloflow are applying AI to dramatically reduce and optimize organizations' cloud costs, while some blockchain projects are creating CPU/GPU marketplaces that leverage mining infrastructure to provide on-demand computing power for computationally intensive tasks. With distributed ledger technology (DLT), the cost of computing should fall dramatically, further fueling AI innovation.

3. Model Markets

Algorithms are the backbone of any AI system, and building high-quality machine learning or deep learning models requires expertise and extensive research. An organization that wants to build its own AI systems needs to hire data scientists and algorithm engineers, which is expensive and very time-consuming.

On the other hand, many developers build their own models and upload them to websites like GitHub. Anyone can use these models, provided they know how to put them into production.

A democratized marketplace of machine learning algorithms, in which contributing developers build and share models for various purposes, creates a bigger pool of algorithms that organizations can access. Organizations can reduce costs and extract more value from their data, and financial incentives motivate developers to create better-performing algorithms. Blockchain introduces transparency into the steps taken to build an algorithm and into how many times a model has been used by an organization. Similarly, individuals and organizations can be compensated for sharing the input data used to train the AI models. Blockchain allows encrypted data to be transferred while managing the usage rights of that data, creating the trust and transparency without which the entire ecosystem is threatened. This enables effective micro-licensing from individual developers to create comprehensive solutions, as sketched below.
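As a rough sketch of what such a marketplace's usage tracking might look like, the example below keeps an append-only log of model calls and aggregates micro-licensing fees per developer. The class, field names and fee logic are hypothetical, assumed purely for illustration rather than drawn from any real marketplace.

```python
from collections import defaultdict

class ModelUsageLedger:
    """Append-only record of model usage, used to compute developer payouts."""

    def __init__(self):
        # In a real system these entries would live on a distributed ledger.
        self.entries = []

    def record_use(self, model_id, developer, organization, fee_per_call):
        """Log one inference call against a listed model (illustrative schema)."""
        self.entries.append({
            "model_id": model_id,
            "developer": developer,
            "organization": organization,
            "fee": fee_per_call,
        })

    def usage_count(self, model_id):
        """How many times a given model has been used across all organizations."""
        return sum(1 for e in self.entries if e["model_id"] == model_id)

    def payouts(self):
        """Aggregate the micro-licensing fees owed to each contributing developer."""
        totals = defaultdict(float)
        for e in self.entries:
            totals[e["developer"]] += e["fee"]
        return dict(totals)

# Example: two organizations use the same shared fraud-detection model.
ledger = ModelUsageLedger()
ledger.record_use("fraud-detector-v2", "dev-alice", "acme-bank", fee_per_call=0.01)
ledger.record_use("fraud-detector-v2", "dev-alice", "globex-insurance", fee_per_call=0.01)
print(ledger.usage_count("fraud-detector-v2"))  # 2
print(ledger.payouts())                         # {'dev-alice': 0.02}
```

Because every use is recorded in one shared log, both the marketplace and the contributing developer can independently verify how often a model was used and what compensation is owed.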

4. AI Learning Provenance

While most people are aware that AI can enhance our daily lives and provide valuable insights, they rarely consider the ethical implications of its use. AI produces results based on its training, which in turn depends on the input data. If the data used to train the AI is biased, the AI will treat that bias as absolute fact, in turn producing biased results. In 2016, Microsoft launched the AI Twitter chatbot Tay ('TayTweets'), which quickly turned racist and misogynistic after learning from other Twitter posts. The more racist tweets the AI engine is exposed to, the more likely the tweets it generates are to be racist.

If certain attributes or outcomes are over-represented in the training data, the results will likewise be skewed and biased. The same goes for images: a computer vision model learns only from the images it has seen, those in its database. Racially or gender-skewed data affects the result, an issue for which Amazon's Rekognition facial recognition technology for government use has come under scrutiny, to the point that Amazon may halt sales of the controversial product. As AI becomes more prevalent, recording and analyzing the AI decision tree will become more important for creating transparency around the AI's learning inputs. Trusting an AI result requires trusting the underlying dataset, which in turn requires processes for vetting the data and its methodology. A distributed ledger of the data inputs and of the AI's decision tree makes analysis of poor, biased and skewed performance both possible and tamper-resistant.
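As a simple illustration of the kind of audit a ledger of training inputs would enable, the sketch below reads hypothetical ledger entries describing each training example and flags any attribute value that dominates the data. The entries, attribute names and threshold are assumptions made for the example, not part of any specific system.

```python
from collections import Counter

# Hypothetical ledger entries: one record per training example, with the
# attributes an auditor wants to inspect. In practice these would be read
# from a distributed ledger rather than hard-coded.
ledger_entries = [
    {"example_id": 1, "label": "hired", "gender": "male"},
    {"example_id": 2, "label": "hired", "gender": "male"},
    {"example_id": 3, "label": "hired", "gender": "male"},
    {"example_id": 4, "label": "rejected", "gender": "female"},
]

def representation_report(entries, attribute):
    """Share of training examples carrying each value of the given attribute."""
    counts = Counter(e[attribute] for e in entries)
    total = sum(counts.values())
    return {value: count / total for value, count in counts.items()}

def flag_skew(report, threshold=0.7):
    """Flag attribute values whose share of the data exceeds the threshold."""
    return [value for value, share in report.items() if share > threshold]

report = representation_report(ledger_entries, "gender")
print(report)             # {'male': 0.75, 'female': 0.25}
print(flag_skew(report))  # ['male'] -- the dataset is heavily skewed toward one group
```

Because the ledger entries are tamper-resistant, anyone questioning an AI decision can run this kind of analysis on the exact data the model was trained on, rather than on whatever the operator chooses to disclose after the fact.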

Blockchain technology holds the promise of adding structure and accountability to AI algorithms, and of improving the quality and usefulness of the intelligence they produce.

References:

This is a repost of the article written by Aly Madhavji and Raheel Ahmad, featured in the May 2019 edition of Crypto Investment Times Magazine: https://cryptoinvestmenttimes.com/CIT18/Crypto-Investment-Times-May-2019.pdf

Read the original article at https://medium.com/@alymadhavji/pieces-of-the-same-puzzle-ai-blockchain-b44b0028574d