Metrics Tab

Overview

The Metrics tab gives you visibility into how your project is progressing. Currently, we provide details on project Throughput and Quality.

Throughput

Throughput metrics give you details about how data flows through your project pipeline, from submission to Scale through acceptance by you.

Quality

Quality metrics give you details about where Quality issues in the project are coming from. We calculate the following standard quality metrics based on your audits.

Detection Recall

Detection Recall measures the frequency of missing annotations. A missing error occurs when your audit identifies an annotation that Scale missed (a false negative).

Formulated mathematically:
Detection Recall = True Positive Annotations / (True Positive Annotations + False Negative Annotations)
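As a minimal sketch (the function name and inputs are illustrative, not part of Scale's API), detection recall can be computed from audit counts like this:

```python
def detection_recall(true_positives: int, false_negatives: int) -> float:
    """Fraction of ground-truth annotations that Scale detected.

    false_negatives counts "missing" errors: annotations your audit
    found that Scale missed. (Illustrative helper, not Scale's API.)
    """
    total = true_positives + false_negatives
    if total == 0:
        return 1.0  # assumption: nothing to find counts as perfect recall
    return true_positives / total

# Example: 90 correct annotations, 10 missing errors found by audit
print(detection_recall(90, 10))  # → 0.9
```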


Detection Precision

Detection Precision measures the frequency of extraneous annotations. An extraneous error occurs when your audit flags one of Scale’s annotations as unnecessary (a false positive).

Formulated mathematically:
Detection Precision = True Positive Annotations / (True Positive Annotations + False Positive Annotations)
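A matching sketch for detection precision, again with an illustrative helper name rather than anything from Scale's API:

```python
def detection_precision(true_positives: int, false_positives: int) -> float:
    """Fraction of Scale's annotations that were genuine.

    false_positives counts "extraneous" errors: annotations your audit
    flagged as unnecessary. (Illustrative helper, not Scale's API.)
    """
    total = true_positives + false_positives
    if total == 0:
        return 1.0  # assumption: no annotations made means none were wrong
    return true_positives / total

# Example: 95 genuine annotations, 5 flagged as extraneous
print(detection_precision(95, 5))  # → 0.95
```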


Annotation Precision

Annotation Precision counts errors present within annotations. An Annotation Precision error occurs when your audit reveals something wrong with one of Scale’s annotations.

There are three types of Annotation Precision Errors: Geometry, Label, and Attribute.

  • A Geometry error occurs when the geometry of the annotation is incorrect. This could be a problem with the annotation’s sizing or its placement along one of its dimensions (offset on the y axis, for example).

  • A Label error occurs when an annotation is mislabeled as one Taxonomy Class when it should have been another.

  • An Attribute error occurs when one or more attributes of an annotation are incorrect.

Formulated mathematically:
Annotation Precision = Annotation Errors / Possible Annotation Errors
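A sketch of this ratio, assuming (as the error types above suggest) that each audited annotation can exhibit each of the three error types at most once; the helper and its bookkeeping are illustrative, not Scale's exact calculation:

```python
# Error types per the documentation: geometry, label, attribute
ERROR_TYPES = ("geometry", "label", "attribute")

def annotation_precision(audited_annotations: list) -> float:
    """Ratio of observed annotation errors to possible annotation errors.

    Each element is the set of error types your audit found on one
    annotation. Possible errors = annotations audited × error types.
    (Illustrative helper, not Scale's exact bookkeeping.)
    """
    possible = len(audited_annotations) * len(ERROR_TYPES)
    if possible == 0:
        return 0.0  # assumption: no audited annotations means no errors
    errors = sum(len(found & set(ERROR_TYPES)) for found in audited_annotations)
    return errors / possible

# Example: 4 audited annotations, one with all three error types
print(annotation_precision([{"geometry", "label", "attribute"},
                            set(), set(), set()]))  # → 0.25
```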
