The Importance of Explainable AI for Decision Making

A major advantage of leveraging AI in the business world is optimizing the human decision-making process. For example, computing power combined with Machine Learning models allows businesses to glean actionable information from massive amounts of data. As such, making sense of Big Data is driving the adoption of AI across multiple industries.

At the same time, it’s critical not to blindly trust the decisions made by any ML model. Being able to ask questions about why a model makes a certain choice also provides meaningful insights. After all, AI is supposed to augment human inquisitiveness, not replace it.

Explainable AI (XAI) is a relatively new concept focused on helping humans interpret the output of sophisticated ML models. It’s an approach aimed at emphasizing the partnership between a person and an AI algorithm. After all, when faced with a complex problem, making the right decision remains the ultimate goal. 

So let’s look more closely at how MindsDB’s explainable AI functionality helps users understand why an ML model reached a certain conclusion.

Building Machine Learning Models That Both Predict and Explain

No matter the discipline – business operations, financial trading, healthcare, etc. – domain experts can benefit from the fast pattern recognition capabilities of a Machine Learning model. But running a model that can actually explain its generated result set is a game changer when it comes to the application of AI. This approach truly becomes a form of human intelligence augmentation.

MindsDB believes any ML-generated prediction used to support the decision-making process must also answer three questions:

  • Why can I trust the prediction? 
  • Why did the model provide this prediction?
  • How can I make these predictions more reliable?

Explaining these answers to the domain experts responsible for decision-making is also critical. MindsDB adopted a black box approach – Soft XAI – which provides a sense of the ML model’s internal “thought” processes by varying the model’s inputs and analyzing its resultant outputs.
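To make the black box idea concrete, here is a minimal permutation-importance sketch in Python using scikit-learn. It is a generic illustration of the "vary the inputs, analyze the outputs" technique, not MindsDB’s actual Soft XAI implementation; the dataset and model are stand-ins.

```python
# Generic black-box XAI sketch: shuffle one input column at a time and
# measure how much the model's score drops. A large drop means the model
# relies heavily on that input. (Illustrative only; not MindsDB code.)
from sklearn.datasets import load_iris
from sklearn.ensemble import RandomForestClassifier
from sklearn.inspection import permutation_importance
from sklearn.model_selection import train_test_split

X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

model = RandomForestClassifier(random_state=0).fit(X_train, y_train)

# n_repeats shuffles each column several times to average out noise.
result = permutation_importance(model, X_test, y_test,
                                n_repeats=10, random_state=0)

for name, score in zip(load_iris().feature_names, result.importances_mean):
    print(f"{name}: {score:.3f}")
```

The same idea scales to any model that exposes a predict function, which is what makes a black box approach attractive: no access to the model’s internals is required.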

MindsDB believes the continued application of Soft XAI evolves into a new concept called Introspective AI, where machines and their models use XAI output data to actually become self-improving. Eventually, we hope Explainable AI becomes a requirement for any AI system used in real-world applications.

The Promise of Explainable AI in a Simple User Interface 

The MindsDB Scout tool provides the means for domain experts to leverage XAI for vetting the quality of any machine learning model used in MindsDB. To use Scout, a user simply connects to a data source, trains a predictive model, and runs their data analysis. The data source can be either local or hosted on the MindsDB server.

Once the data source is uploaded, the user is able to browse the data using a spreadsheet interface. However, using Scout to analyze the data provides meaningful information on its quality. This information includes data type and value distribution, along with the column-level sub-scores that contribute to the overall data quality score. Any column with a significantly low score warrants a closer look from an analyst.
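As an illustration of how column-level sub-scores might be derived, here is a hypothetical sketch in pandas. The `column_quality` helper and its weighting are invented for this example; they are not MindsDB’s actual scoring logic.

```python
# Hypothetical column-quality sketch: combine completeness (share of
# non-null cells) with value diversity into a single 0-1 sub-score.
# (Invented weighting for illustration; not MindsDB's scoring.)
import pandas as pd

def column_quality(series: pd.Series) -> float:
    """Return a 0-1 score combining completeness and value diversity."""
    completeness = 1.0 - series.isna().mean()
    diversity = series.nunique(dropna=True) / max(len(series), 1)
    return round(0.7 * completeness + 0.3 * diversity, 3)

df = pd.DataFrame({
    "age":    [34, 45, None, 29, 41],       # one missing value
    "status": ["a", "a", "a", "a", "a"],    # constant column
})

scores = {col: column_quality(df[col]) for col in df.columns}
print(scores)
```

A constant column scores poorly on diversity even though it is fully populated, which mirrors the idea that a low sub-score flags a column an analyst should inspect more closely.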

At this point, the user trains the ML model by creating a predictor. Simply select a dataset, one or more columns, and a name for the predictor. MindsDB then extracts, cleans, and analyzes the data, which is then used to train the model. Afterward, a black box XAI routine runs, providing a useful analysis of the efficacy of the ML model.

Scout provides these predictor results in an easy-to-read fashion. The General Accuracy value serves as a summary of the model’s overall efficacy. However, the real magic of XAI appears when diving into the details after running a query. The user sees a variety of helpful information, including the importance of each column used by the model, columns not relevant to the prediction, and even relationships and potential correlations between different columns. 
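Relationships like these can be approximated with a plain pairwise-correlation check. Below is a minimal pandas sketch over a small hypothetical housing dataset; Scout computes and displays this kind of analysis for you, without any code.

```python
# Generic sketch of surfacing potential correlations between columns.
# Highly correlated pairs are candidates for redundant inputs; columns
# uncorrelated with the target may not be relevant to the prediction.
import pandas as pd

df = pd.DataFrame({
    "price":  [100, 110, 120, 130, 140],
    "sqft":   [500, 560, 590, 650, 700],
    "lot_id": [5, 3, 8, 1, 9],    # arbitrary identifier, no real signal
})

corr = df.corr()
print(corr.round(2))
```

Here `price` and `sqft` move together while `lot_id` is essentially noise, which is exactly the kind of distinction a domain expert wants surfaced before trusting a prediction.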

Ultimately, the XAI-generated info provides the means for the user to fine-tune their predictor. Eventually they are able to craft one that is useful for answering their complex question of the day. In the end, Scout offers a great way for domain experts to learn about the power of MindsDB without having to write any code.

Using Machine Learning to Augment the Knowledge of a Domain Expert

Explainable AI offers powerful insights into why a Machine Learning model made a certain prediction. MindsDB’s Scout tool provides a window, allowing a domain expert to gain a better understanding of the relationships and information hidden in their data. Importantly, it’s easy to use, with no programming skills or experience required.

With XAI poised to usher in the next wave of artificial intelligence, expect MindsDB to continue to enhance Scout. It’s a powerful tool suitable for ML-powered data analysis at organizations of all sizes. Interested potential users can book a demo directly.
