Project Definition 2.0

The 2.0 version described below is being rolled out gradually between June 1st, 2022 and July 2022. You can find the older Project Definition documentation here.

Project definition is the first step in creating labeling tasks. You'll upload your data, create a taxonomy, and write/proofread instructions.

Selecting Project Type

Create a new project by hitting "+ Create new project" in the dropdown.

You'll be brought to this setup flow which will guide you through setting up your project and launching your first Calibration Batch. If you're not quite ready to set up your own project, you can always opt to explore Rapid's selection of project templates.

Enter a project name to begin

If you opted to explore templates instead, you can pick from a variety of them to see how Rapid might work for a use case similar to yours.

Browse templates to see examples of the task, instructions, and taxonomy

Upload Your Data

The next step is to upload some data relevant to your project. You'll want to add enough data to give your newly minted project a good test for your first Calibration Batch (this could be anywhere from 5-50 items, depending on the complexity and variety in your data).

Upload your images and see them populate into your project.

Rapid supports uploading local files, importing from a previous project, and importing from Azure, AWS, and Google Cloud.

Task Setup

The next step is to define your labeling task. Rapid's Task Setup step allows you to preview one of your uploaded images as you craft the task at hand.

The goal of this step is to outline the labeling taxonomy and provide context and instructions to Rapid's labeling workforce, who will perform the labeling.

You'll first need to select your Task Setup type; depending on what you've uploaded, you'll be shown a limited set of possible types. These types dictate the kinds of labels you can add to your task taxonomy (for example, uploading images lets you create an image annotation, semantic segmentation, or text collection project).

Once you've selected your task type, you can add labels using the "+ Add label" button. Each label's name can be adjusted, and clicking on a label opens a tray where you can add tasker-facing context and specify any attributes for the label.
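For instance, an attribute might ask taskers how occluded each labeled object is. The fragment below is a hypothetical sketch borrowing the annotation_attributes shape from Scale's general annotation API; the attribute name and choices are invented for illustration, and the tray in the visual editor may expose these fields differently. (The // comment is for readability only and isn't valid JSON.)

```
{
  "annotation_attributes": {
    // Hypothetical attribute collected for every object with this label
    "occlusion": {
      "type": "category",
      "description": "How much of the object is hidden by other objects?",
      "choices": ["0%", "25%", "50%", "75%"]
    }
  }
}
```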

Make sure to hit "Update preview" and "Save" periodically to preserve your setup.

Add labels and descriptions for your task

For more information about how to write your project taxonomy for your specific Task Type, please refer to our Documentation.

If you prefer, there is an option to use the JSON editor. Keep in mind that doing so disables some of the visual editor's functionality, so this feature is best used if you already have a taxonomy prepared.

Taxonomy editor allows you to write the JSON and preview the changes on top of your task
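As a minimal sketch, a bounding-box taxonomy for a traffic dataset might look something like the JSON below. This assumes the geometries/objects_to_annotate structure from Scale's image annotation API; the label names and descriptions are invented for illustration, so refer to the taxonomy reference in our Documentation for the exact schema your Task Type expects. (The // comment is for readability only and must be removed in the editor, which expects plain JSON.)

```
{
  "geometries": {
    "box": {
      // Each entry becomes a label that taskers can apply to a box
      "objects_to_annotate": [
        {
          "choice": "Car",
          "description": "Any four-wheeled passenger vehicle, including vans and SUVs."
        },
        {
          "choice": "Pedestrian",
          "description": "A person on foot. Do not include cyclists."
        }
      ]
    }
  }
}
```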

Since your tasks are sent to labelers who may not have the domain knowledge to label your data the way you want, it is essential to fill out the description for each label. Examples are also crucial here: use each label's examples section to include clear examples of what is well labeled and what is not.

The examples section clearly indicates what is and isn't well labeled.

You can test your task with the preview as you go to ensure your setup works as intended.

Testing your taxonomy with the preview ensures everything is set up the way you want it

Writing instructions is a crucial part of setting up your project. Thorough instructions result in high-quality data because everything is laid out clearly for taskers to follow. Preview your instructions to see what the Scale labeling team will see before attempting tasks from your project.

Previewing the instructions showcases all of your labels with their attached descriptions and examples.

Before you move on from the Task Setup step, if you are using Scale's labeling workforce, ensure that your instructions clearly illustrate:

  • The overall goal of the task
  • Key concepts the tasker should understand
  • What the entities being labeled are
  • Sufficient examples that are representative of the overall dataset
  • Any nuances or special cases they might encounter

Note that instruction writing is an iterative process. We've built in a system for our labeling workforce to give you feedback on your instructions.

Prepare the Batch for Launch

After defining your task, you'll prepare for your Calibration Batch. The Calibration Batch is a smaller batch intended to test your task setup with the Scale labeling workforce. You can submit a maximum of 50 tasks, and you'll receive specific feedback on your setup that can be helpful before launching a larger-scale batch.

After your first Calibration Batch, if you feel confident about your project setup, you can opt to launch a Production Batch, where you won't get feedback and can focus on just labeling a larger volume of data.

Select your batch type, and then select the subset of data you want to label

You're now just about ready to launch!

Review the pricing estimator, making sure the line items match up with your expectations for your project. When you're ready, launch your batch and wait for the labels to come in!

Review pricing and launch!
