Annotating data is a collaborative process; continuous iteration between labelers, reviewers, and administrators is essential for producing high-quality annotated data at scale.
Studio helps facilitate this collaboration and feedback through its commenting capability. This guide explains how the process works and how to use it to its fullest extent.
Commenting for other data types is on our roadmap. If there is a data type for which you believe commenting would be helpful in your workflow, please reach out at firstname.lastname@example.org.
An annotator on your annotation team can add a comment by right-clicking anywhere on the image or video (particularly useful when the comment is relevant to a specific part of the task).
The annotator can easily access the comments that have been made using the toggle on the panel on the right.
Having added their comments, the annotator can either skip the task, or complete it to the best of their ability and submit it.
The administrator can review all submitted comments by navigating to the "Batches" page within the project. A banner appears near the top of the page when any tasks have comments.
As a reminder, annotators may encounter two types of tasks in their queue. The first is a regular task, which is part of the batch the annotator is working on. The second is a benchmark task (also called an evaluation task), which is randomly interspersed in the annotator's queue to test their performance against an 'answer key' the administrator has set.
The annotator can add comments to both types of tasks (annotators cannot distinguish an evaluation task from a regular task). Comments on both types will appear in the Studio Comments Queue.
For comments on an evaluation task, the administrator can review the annotator's response — as well as comments — and compare it against the expected benchmark result.
Then, the administrator can choose either Resolve and Dismiss (keep the benchmark and dismiss the comments) or Retire Benchmark (decide that the comments indicate the task is not fit as an evaluation task and should no longer be served to annotators as one).
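The two administrator actions for benchmark comments can be modeled as a small routing function. This is an illustrative sketch only; the function and field names are hypothetical, not the Studio API:

```python
# Hypothetical model of the administrator's two choices for comments
# on an evaluation (benchmark) task -- names are illustrative, not
# the Studio API.

def handle_benchmark_comment(action, benchmark):
    """Apply the administrator's decision to a commented benchmark task."""
    if action == "resolve_and_dismiss":
        # Keep the benchmark in rotation and dismiss the comments.
        benchmark["comments"] = []
    elif action == "retire_benchmark":
        # The comments indicate the task is unfit as an evaluation
        # task; stop serving it to annotators.
        benchmark["active"] = False
    else:
        raise ValueError(f"Unknown action: {action}")
    return benchmark
```

Either way, the comments are addressed and the task leaves the Studio Comments Queue.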
For comments on a regular task, the administrator also has two options: Send Back to Queue, which returns the task to the annotator who worked on it and left the comment, or Cancel Task, which pulls the task out of the queue so it is not sent to another annotator.
The administrator should address the annotator's question or concern before utilizing Send Back to Queue.
Once the administrator has sent the task back, the annotator receives it as their next task, can see the administrator's response, and can proceed with completing it.
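The regular-task flow above can be sketched the same way. Again, the names here are hypothetical, chosen only to mirror the two options described:

```python
# Illustrative model of the comment-routing rules for regular tasks.
# These function and field names are hypothetical -- a sketch of the
# decision logic, not the Studio API.

def handle_regular_task_comment(action, task):
    """Route a commented regular task based on the administrator's choice."""
    if action == "send_back_to_queue":
        # The task returns, with the administrator's response, to the
        # same annotator who left the comment.
        task["assignee"] = task["commenter"]
        task["status"] = "queued"
    elif action == "cancel_task":
        # The task is pulled from the queue and not reassigned.
        task["assignee"] = None
        task["status"] = "cancelled"
    else:
        raise ValueError(f"Unknown action: {action}")
    return task

task = {"commenter": "annotator_1", "assignee": None, "status": "commented"}
handle_regular_task_comment("send_back_to_queue", task)
# The task is now queued for annotator_1 again.
```

The key design point is that Send Back to Queue targets the original commenter rather than any available annotator, so the administrator's response reaches the person who raised the question.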
You can configure one or more review layers if your project requires reviewers to check the work of the attempters (the first annotators to work on a task). This setting can be accessed via the Settings button on the Batches page.
If you have a review layer, the commenting behavior is as follows:
If the attempter adds a comment to a task and skips it, the task is sent directly to the Studio Comments Queue accessible by the administrator, bypassing the reviewer layer.
If the attempter adds a comment to a task and submits it, the task is sent to the reviewer, who can view the attempter's comment.
In the second case, the reviewer has two options:
If the reviewer approves (or fixes and approves) the task and submits it, the task is marked as complete.
If the reviewer adds another comment and skips the task, the task is sent to the Studio Comments Queue accessible by the administrator. The administrator can then answer the comments (both the reviewer's and the attempter's) and use Send Back to Queue, which returns the task to the reviewer.
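The review-layer routing rules above reduce to a simple table: who commented, and whether they skipped or submitted. A minimal sketch, with illustrative names that are not the Studio API:

```python
# Hypothetical model of where a commented task goes next when a
# review layer is configured -- a sketch of the rules above, not
# the Studio API.

def route_commented_task(role, skipped):
    """Return the next destination for a commented task."""
    if role == "attempter":
        # Skipping bypasses the reviewer layer entirely.
        return "comments_queue" if skipped else "reviewer"
    if role == "reviewer":
        # Skipping sends the task (with both sets of comments) to the
        # administrator; submitting completes the task.
        return "comments_queue" if skipped else "complete"
    raise ValueError(f"Unknown role: {role}")
```

In both roles, skipping escalates the task to the administrator's Comments Queue, while submitting moves it forward in the normal pipeline.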