Introduction

The model performance dashboard enables you to analyze your model's performance either over time or between different model versions. This analysis provides insights across various parameters, comparing predicted and actual performance.

Basic concepts:

  • Baseline: the reference data against which current performance is compared. You can define the baseline using a 'Tag' or by segmenting the data on a 'date' feature (see the sketch after this list).
  • Frequency: how often the monitoring metrics are calculated.
  • Alerts frequency: how often you are notified about alerts.
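
The sketch below makes these concepts concrete using pandas. The DataFrame, column names ('tag', 'date', 'prediction', 'actual'), cutoff date, and weekly frequency are illustrative assumptions, not names defined by AryaXAI.

```python
import pandas as pd

# Hypothetical monitoring data; all column names are assumptions
# for illustration, not AryaXAI-defined fields.
df = pd.DataFrame({
    "tag": ["training", "training", "production", "production"],
    "date": pd.to_datetime(["2024-01-05", "2024-01-20", "2024-02-03", "2024-02-18"]),
    "prediction": [1, 0, 1, 1],
    "actual": [1, 0, 0, 1],
})

# Baseline defined by tag: all rows tagged 'training'.
baseline_by_tag = df[df["tag"] == "training"]

# Baseline defined by date: all rows before a cutoff.
baseline_by_date = df[df["date"] < "2024-02-01"]

# Frequency: bucket the current data into weekly windows and compute
# a monitoring metric (plain accuracy here) per window.
current = df[df["tag"] == "production"]
weekly_accuracy = current.groupby(pd.Grouper(key="date", freq="W")).apply(
    lambda g: (g["prediction"] == g["actual"]).mean()
)
```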

Create dashboard: Model Performance Dashboard


To set up a model performance dashboard in AryaXAI, define the baseline and current tags, the true and predicted labels, the model type, and the date feature name. Then select a compute option based on the server requirements of the monitoring task.
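
As a mental model of the fields this setup asks for, here is a hypothetical configuration sketch; the keys and values below are illustrative only and do not reflect the AryaXAI API.

```python
# Hypothetical dashboard setup; key names are assumptions for illustration.
dashboard_config = {
    "baseline_tag": "training",       # tag identifying the baseline data
    "current_tag": "production",      # tag identifying the current data
    "true_label": "actual",           # column holding ground-truth labels
    "predicted_label": "prediction",  # column holding model predictions
    "model_type": "classification",   # e.g. classification or regression
    "date_feature": "date",           # column used for date-based segmentation
    "instance_type": "small",         # compute option for the monitoring task
}
```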

The generated model performance report displays metrics such as accuracy, precision, and recall, along with other quality measures.
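
For illustration only, this is the kind of computation behind those metrics, shown here with scikit-learn; the label arrays are made up.

```python
from sklearn.metrics import accuracy_score, precision_score, recall_score

# Hypothetical ground-truth and predicted labels.
y_true = [1, 0, 1, 1, 0, 1]
y_pred = [1, 0, 0, 1, 0, 1]

print("accuracy:", accuracy_score(y_true, y_pred))
print("precision:", precision_score(y_true, y_pred))
print("recall:", recall_score(y_true, y_pred))
```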

Model Performance Monitor

With AryaXAI, you can also proactively identify post-deployment performance issues by using Monitors for model performance tracking.

To create a model performance monitor:

  • Navigate to the 'Monitors' tab in ML Monitoring and select 'Create Monitor'.
  • Assign a name to the monitor and choose 'Model performance' as the monitor type. Select the email list to which alerts should be sent.
  • Choose the model type and performance metric, and set the model performance threshold. The performance metric can be any of: accuracy, F1, AUC-ROC, precision, or recall (see the sketch after this list).
  • From the dropdown, select the baseline true and predicted labels, and use tags to define the baseline data.
  • Use the date feature to further segment your baseline, and set the monitoring frequency.
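
As referenced in the list, here is an illustrative sketch of the threshold logic a monitor applies; the metric choice, threshold value, and labels are assumptions, not AryaXAI internals.

```python
from sklearn.metrics import f1_score

THRESHOLD = 0.85  # hypothetical model performance threshold

# Hypothetical labels from the current monitoring window.
y_true = [1, 0, 1, 1, 0, 1, 0, 0]
y_pred = [1, 0, 0, 1, 0, 1, 1, 0]

# Compute the chosen metric and alert when it falls below the threshold.
score = f1_score(y_true, y_pred)
if score < THRESHOLD:
    print(f"ALERT: F1 dropped to {score:.2f} (threshold {THRESHOLD})")
```
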
TIP: When deploying multiple model versions, consider appending the model predictions directly into the same dataset as new features, rather than creating duplicate copies of the dataset. This approach enables efficient tracking of model performance over time.
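
A small pandas sketch of the tip above; the column names and prediction values are hypothetical.

```python
import pandas as pd

df = pd.DataFrame({"feature_a": [0.2, 0.7, 0.4], "actual": [0, 1, 0]})

# Hypothetical predictions from two deployed model versions, appended
# as new columns rather than stored in duplicate copies of the dataset.
df["pred_v1"] = [0, 1, 1]
df["pred_v2"] = [0, 1, 0]

# Each pred_v* column can then serve as the predicted label for its own
# performance dashboard, enabling version-over-version tracking.
```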

Any new dashboard created for a model performance monitor is listed in the Dashboard Logs, where you can view details such as the baseline and associated tags, the dashboard's name and creation date, the owner, and so on. In the Actions column, you can expand or collapse a dashboard to show or hide its details, configure alerts based on that dashboard's configuration, or delete it from the logs.

NOTE: The detailed dashboard in the Actions column is only visible if the status shows 'Completed'. If the status shows 'Failed', the reason is specified in the 'Error' column.

For any log listed here, you can configure automatic alerts based on that dashboard log; this option is available in the 'Alerts' column.