Observations as Explanations

"Observations" serve as a powerful tool for assessing the correlation between industry knowledge and model performance. It enables subject matter experts to contribute to the explainability process by providing clear and understandable explanations to all stakeholders.

Here, users can explore the rationale behind each prediction. By defining specific conditions or causes as "observations", users can establish a correlation between these factors and the model's predictions, enabling a deeper understanding of cause-and-effect relationships in the model's decision-making process.

Create Observation

To create an observation:


project.create_observation()

Use the help function to see the parameters for creating an observation:


help(project.create_observation)
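The exact parameters of create_observation depend on the SDK version, so use the help call above to confirm them. As a rough sketch, assuming hypothetical parameter names such as observation_name, expression, linked_features, and observation_text, a call might look like this:


# Hypothetical sketch - parameter names and values are assumptions; confirm with help(project.create_observation)
project.create_observation(
    observation_name="High credit utilisation",        # assumed: display name of the observation
    expression="credit_utilisation > 0.8",             # assumed: condition evaluated against case features
    linked_features=["credit_utilisation"],            # assumed: features the observation is linked to
    observation_text="Utilisation above 80% indicates elevated risk"  # assumed: explanation shown to stakeholders
)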

Manage observations

View observations

To view observations executed for a case:


case_info.explainability_observations()
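For example, assuming case_info is a case object already retrieved from the project for a specific case (the retrieval call itself may differ in your setup):


# Assumes `case_info` has been obtained for a specific case beforehand
observations = case_info.explainability_observations()
print(observations)  # inspect which observations were triggered for this case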

You can also update observations over time. The observation name cannot be changed, but other details such as the observation function, text, and linked features can be modified.

To make an observation active or inactive, or to change its parameters:


project.update_observation(observation_id, observation_name, status)
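For instance, a sketch for deactivating an observation is shown below; the status value is an assumption, so check the help call further below for the accepted arguments.


# Hypothetical sketch - the status value is an assumption
project.update_observation(
    observation_id,    # ID of the observation to update
    observation_name,  # the name itself cannot be changed
    "inactive"         # assumed status value, e.g. "active" or "inactive"
)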

To delete an observation:


project.delete_observation(observation_id, observation_name)
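For example, with placeholder values for the ID and name (substitute your own):


# Placeholder values - use the ID and name of the observation you want to remove
project.delete_observation("obs_123", "High credit utilisation")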

Use the help function to see the parameters for updating an observation:


help(project.update_observation)

Observation trail

To view the observation trail (the history of updates to observations):


project.observation_trail()
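A minimal usage sketch, assuming the call returns a printable record of changes:


# Assumption: the trail is returned as a printable/iterable record of observation updates
trail = project.observation_trail()
print(trail)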