Improve Calibration Score

Calibration Batches are a critical step in iterating on your project. A good Calibration Score indicates a healthy project in which Taskers are able to understand your instructions and label your data.

Understanding Calibration Score

  • Calibration Score is a function of audit results and Tasker confidence
  • It is recommended that your Calibration Score be above 80% before moving on to a Production Batch

Inputs that go into calculating the Calibration Score include:

  • The number of rejections you made while auditing the returned completed tasks
  • How confident Taskers reported feeling after reading the instructions & completing your task

Here a score of 42% indicates improvements should be made before progressing to a Production Batch
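The exact formula behind the Calibration Score is not published; as a loose illustration of how the two inputs above could combine into a single percentage, the sketch below weights the audit acceptance rate and average self-reported confidence equally. The function name, equal weighting, and rounding are all assumptions for illustration only.

```python
# Hypothetical illustration only: the real Calibration Score formula is internal.
# This sketch combines the two stated inputs -- audit rejections and Tasker
# self-reported confidence -- into one percentage.

def calibration_score(audited, rejected, confidences):
    """audited: number of tasks you audited; rejected: how many you rejected;
    confidences: Tasker self-reports, each in [0, 1]."""
    accept_rate = (audited - rejected) / audited
    avg_confidence = sum(confidences) / len(confidences)
    # Equal 50/50 weighting is an assumption, not the real formula.
    return round(100 * (0.5 * accept_rate + 0.5 * avg_confidence))

print(calibration_score(audited=10, rejected=7, confidences=[0.55, 0.6, 0.5]))
# → 42
```

Under these assumed weights, rejecting 7 of 10 audited tasks alongside middling Tasker confidence lands at 42%, illustrating why heavy rejections pull the score down even when confidence is moderate.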

Implement Tasker feedback

Iterate on instructions
Write instructions that better address the moments of confusion Taskers may be having

For each Calibration Batch, you will receive feedback on each task based on how easy or confusing the Tasker found it to read the instructions and execute the task.

`Feedback` is on the right side of each Task in the `Calibration Batch view`


Each task can be expanded to see complete feedback & all actions that can be taken


For each piece of feedback, you may want to view the Audit Response (to see the Tasker's response and any of your audit versions) and the instructions attached to that task. To edit the Instructions, go to Definition > Instructions, where the feedback and Calibration Score appear next to your instruction workspace.

Feedback from the most recent Calibration Batch is situated next to the workspace


Commonly, the feedback will point to adding more context, explanations, and examples of properly and improperly labeled tasks.

`Show examples` allows you to see all examples and add newly labeled examples


You can include examples as `Well labeled` or `Poorly labeled`


In general, you should aim to write instructions in a way that shows how one should do the task as opposed to just describing it.

Rapid Tip: Show how to do the task instead of just describing it

🐶 Example Image Annotation Project

Dogs & their coat color

Don't do: Describe the task

Label dogs and their coat colors [solid, merle, brindle, harlequin, ticked, spotted, roan, tricolor]

✔️ Do: Show how to do the task

Describe the overall goal of labeling dogs and their coat colors:

"The goal of this project is to properly distinguish between dog coat colors across a variety of breeds. We would like to have each dog in the dataset labeled and also have their coat coloring marked."

Give an overview of how to label dogs:

"Draw a bounding polygon around the dogs, making sure to include all legs, ears, and tails"

→ Include examples of well drawn and poorly drawn polygons.

Give an overview of coat colors:

"Here are multiple examples of all of the coat colors across different breeds..."

→ Include various examples for coat colors.
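To make the target output concrete, a completed annotation for this hypothetical project might look like the sketch below. The field names and validation step are illustrative assumptions, not a real task schema; the coat colors come from the list in the example instructions above.

```python
# Hypothetical annotation record for the dog coat-color project above.
# Field names ("label", "polygon", "coat_color") are illustrative only.

COAT_COLORS = {"solid", "merle", "brindle", "harlequin",
               "ticked", "spotted", "roan", "tricolor"}

annotation = {
    "label": "dog",
    # Polygon vertices (x, y): the bounding polygon should include
    # all legs, ears, and the tail, per the instructions.
    "polygon": [(120, 80), (260, 75), (300, 210), (150, 240)],
    "coat_color": "merle",
}

# A simple check that the labeled coat color is one of the allowed values
assert annotation["coat_color"] in COAT_COLORS
print("valid annotation:", annotation["label"], annotation["coat_color"])
```

Enumerating the allowed values up front, as the example instructions do, makes this kind of validation possible and keeps Taskers from inventing labels outside the taxonomy.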
