Local Explanations (LIME)

Welcome to the guide for generating local explanations for predictions in Kranium AI. Once your model has produced predictions, you can gain insight into its decision-making process by generating local explanations based on LIME (Local Interpretable Model-agnostic Explanations). This guide walks you through accessing the Explanation tab and generating local explanations for your predictions.
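Before walking through the UI, it helps to see what a LIME-style local explanation computes. The sketch below is not Kranium AI's implementation (which is not documented here); it is a simplified illustration of the idea: perturb the instance being explained, weight each perturbation by its proximity to the original, and score each feature by its proximity-weighted covariance with the model output. Full LIME fits a weighted linear surrogate model; the covariance shortcut here stands in for that fit.

```python
import math
import random

def local_explanation(predict, instance, num_samples=500, width=1.0, seed=0):
    """Simplified LIME-style sketch: perturb `instance`, weight samples by
    proximity, and score each feature by weighted covariance with the
    model output (a stand-in for LIME's weighted linear surrogate)."""
    rng = random.Random(seed)
    samples, outputs, weights = [], [], []
    for _ in range(num_samples):
        x = [v + rng.gauss(0, 1) for v in instance]      # perturbed neighbor
        d = math.dist(x, instance)                        # distance to original
        samples.append(x)
        outputs.append(predict(x))
        weights.append(math.exp(-(d * d) / (width * width)))  # proximity kernel
    total_w = sum(weights)
    mean_y = sum(w * y for w, y in zip(weights, outputs)) / total_w
    contributions = []
    for j in range(len(instance)):
        mean_xj = sum(w * x[j] for w, x in zip(weights, samples)) / total_w
        cov = sum(w * (x[j] - mean_xj) * (y - mean_y)
                  for w, x, y in zip(weights, samples, outputs)) / total_w
        contributions.append(cov)
    return contributions

# Toy model: feature 0 drives the output, feature 1 is ignored.
model = lambda x: 3.0 * x[0] + 0.0 * x[1]
scores = local_explanation(model, [1.0, 2.0])
```

Running this yields a large positive score for feature 0 and a near-zero score for feature 1, mirroring the per-feature contributions the Explanation tab visualizes.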

1. Accessing the Prediction Results

Steps:

  1. Navigate to the Models Section:
    • From the main dashboard, click on the "Models" tab to access the Models section.
  2. Select Your Trained Model:
    • In the Models listing screen, locate the trained model for which you want to generate explanations and click on its name to open the model's home tab.
  3. Open the Prediction Tab:
    • Within the model home tab, find and click on the "Prediction" tab. This tab contains the prediction functionality and results.

2. Generating Predictions

Steps:

  1. Enter Feature Values:
    • Fill in the feature values in the prediction form on the Prediction tab. These values are required to generate predictions using the trained model.
  2. Submit the Form:
    • Click on the "Predict" button to generate predictions. The results will be displayed in the results section of the Prediction tab.
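Conceptually, the prediction form is just a way of handing one feature vector to the trained model. The sketch below shows that flow outside the UI; the feature names, weights, and threshold are invented for illustration and do not reflect Kranium AI's internals or any published API.

```python
# Hypothetical stand-in for a trained model: a function from a feature
# vector to a class label. The UI's prediction form plays the role of
# the `features` dict; all names and weights here are invented.
weights = {"age": 0.04, "income": 0.00002, "tenure": 0.3}
bias = -2.5

def predict(features):
    score = bias + sum(weights[name] * value for name, value in features.items())
    return 1 if score > 0 else 0  # the label shown in the results section

print(predict({"age": 45, "income": 60000, "tenure": 5}))  # → 1
```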

3. Accessing the Explanation Tab

Steps:

  1. Locate the Explanation Tab:
    • After generating predictions, navigate to the results section on the Prediction tab. Here, you will find an "Explanation" tab.
  2. Open the Explanation Tab:
    • Click on the "Explanation" tab to access the local explanation functionality. This tab provides tools and visualizations to understand the model's decision-making process for the specific prediction.

4. Generating Local Explanations

Steps:

  1. Initiate Local Explanation:
    • In the Explanation tab, you will see options to generate local explanations for the selected prediction. Click on the "Generate Explanation" button to start the process.

5. Viewing and Analyzing Local Explanations

Steps:

  1. Review Explanation Results:
    • The Explanation tab will display the local explanation results, providing insights into the factors that influenced the model's prediction.
  2. Interpret Feature Contributions:
    • Examine the contribution of each feature to the prediction. This can help you understand which features had the most significant impact on the model's decision.
  3. Compare Explanations:
    • If needed, generate and compare explanations for multiple predictions to identify patterns and gain deeper insights into the model's behavior.
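The interpretation step above amounts to ranking features by the magnitude of their contribution and noting the sign. A minimal sketch, assuming hypothetical contribution values (the numbers and feature names below are invented, not Kranium AI output):

```python
# Invented contribution values for illustration; positive pushes toward
# the predicted class, negative pushes against it.
contributions = {"age": 0.42, "income": -0.13, "tenure": 0.05}

# Rank by absolute magnitude, as an explanation view typically does.
ranked = sorted(contributions.items(), key=lambda kv: abs(kv[1]), reverse=True)
for name, c in ranked:
    direction = "pushes toward" if c > 0 else "pushes against"
    print(f"{name:>7}: {c:+.2f} ({direction} the predicted class)")
```

Comparing such rankings across several predictions is how you spot the patterns mentioned in step 3, e.g. a feature that dominates every explanation.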

6. Taking Action Based on Explanations

Steps:

  1. Refine Model or Data:
    • Use the insights gained from the local explanations to refine your model or your feature preprocessing, which can improve predictive performance.
  2. Communicate Insights:
    • Share the explanation results with stakeholders to provide transparency and build trust in the model's predictions.

Generating local explanations for predictions in Kranium AI is a powerful way to gain insights into your model's decision-making process. By following this guide, you can easily generate and analyze local explanations for your predictions, helping you understand the factors influencing your model's outcomes. For any additional support or advanced configurations, refer to our support resources or contact the Kranium AI support team.