Inspecting a model

Inspecting a model is important to determine whether its performance is sufficient for its intended purpose. The Trained model tab in the project shows the Matthews correlation coefficient (MCC) score for each evaluation dataset. This gives a quick overview of the model's performance: an MCC score close to 1 indicates very good performance, whereas a score close to -1 indicates very poor performance.
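For intuition about what the MCC score measures, here is a minimal binary-classification sketch. This is illustrative only, not the platform's implementation, and the `mcc` helper is a hypothetical name:

```python
from math import sqrt

def mcc(y_true, y_pred):
    # Count confusion-matrix cells for the positive class (label 1).
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
    tn = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 0)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)
    # MCC = (TP*TN - FP*FN) / sqrt((TP+FP)(TP+FN)(TN+FP)(TN+FN)),
    # defined as 0 when any marginal count is zero.
    denom = sqrt((tp + fp) * (tp + fn) * (tn + fp) * (tn + fn))
    return (tp * tn - fp * fn) / denom if denom else 0.0

mcc([1, 1, 0, 0], [1, 1, 0, 0])  # perfect predictions → 1.0
mcc([1, 1, 0, 0], [0, 0, 1, 1])  # fully inverted predictions → -1.0
```

Unlike plain accuracy, MCC accounts for all four confusion-matrix cells, so it stays informative even when the evaluation dataset is imbalanced.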

More detailed performance metrics can be inspected by clicking the model, which opens a model page. The Evaluation tab shows the precision, recall, and F1-score for each label, as well as aggregate scores across the entire dataset.
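The per-label metrics shown in the Evaluation tab follow the standard definitions based on each label's true positives, false positives, and false negatives. A minimal sketch, where `per_label_scores` is a hypothetical helper rather than a platform API:

```python
def per_label_scores(y_true, y_pred, labels):
    scores = {}
    for lbl in labels:
        # One-vs-rest counts for this label.
        tp = sum(1 for t, p in zip(y_true, y_pred) if t == lbl and p == lbl)
        fp = sum(1 for t, p in zip(y_true, y_pred) if t != lbl and p == lbl)
        fn = sum(1 for t, p in zip(y_true, y_pred) if t == lbl and p != lbl)
        # Precision: of the items predicted as this label, how many were right.
        precision = tp / (tp + fp) if tp + fp else 0.0
        # Recall: of the items truly this label, how many were found.
        recall = tp / (tp + fn) if tp + fn else 0.0
        # F1: harmonic mean of precision and recall.
        f1 = 2 * precision * recall / (precision + recall) if precision + recall else 0.0
        scores[lbl] = {"precision": precision, "recall": recall, "f1": f1}
    return scores

per_label_scores(["a", "a", "b", "b"], ["a", "b", "b", "b"], ["a", "b"])
```

A high precision with low recall for a label typically means the model predicts that label too rarely; the reverse means it predicts it too liberally.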

If the model's performance is satisfactory, continue to infer a model.

If the evaluation does not show the desired results, try the following:

  • Create an experiment with different parameters.
  • Create an experiment with a different model type.
  • Improve or expand the training data.