Instructions should be a living document!
Writing task instructions is an iterative process. We strongly recommend multiple rounds of submitting small batches, reviewing the annotated data, and updating the instructions to clarify edge cases and add examples.
The team completing your tasks and reading your instructions is made up of talented, trained workers around the world.
To help everyone, we ask that you describe your labeling requirements in clear, concise, easy-to-understand language.
Keep in mind that English may not be a Tasker's first language, and that there may be significant differences in cultural norms or environments.
We have two main ways of generating instructions for our Taskers today:
We can render Markdown content to Taskers. To use this option, pass Markdown as a string directly to the instruction parameter on the project or task, or paste it directly into the Scale application when creating a project.
For newline characters in your Markdown string, a plain \n (newline character) is sufficient.
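As a minimal sketch of the step above (the project name and payload shape here are illustrative assumptions, not a definitive API reference), a Markdown instruction string with \n newlines can be built and attached like this:

```python
# Build a Markdown instruction string. Plain "\n" newline characters
# are sufficient -- no special escaping is needed.
instruction_md = (
    "# Labeling Guidelines\n"
    "\n"
    "Label each image as one of:\n"
    "- cat\n"
    "- dog\n"
    "- other\n"
)

# The string is passed as the instruction field of the project or task
# payload. The project name below is a hypothetical placeholder; check
# the API reference for the exact route and fields for your task type.
payload = {
    "project": "animal_classification",
    "instruction": instruction_md,
}
```

The same string works whether you send it through the API or paste it into the Scale application.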
We generally recommend Google Docs as the preferred approach to writing instructions, given the rich formatting and easy collaboration it offers. The instructions below detail how to set up and use a Google Doc on our platform.
To embed your Google Docs instructions in your tasks:
Project or Task Level Instructions?
In most cases, we recommend embedding the instructions at the project level rather than the task level.
Instructions embedded at the project level cascade down to all of the tasks within that project.
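The cascade can be sketched with illustrative payloads (the field names and project name are assumptions for illustration, not a definitive API reference): the instruction is set once on the project, and per-task payloads omit it entirely.

```python
# Project-level instructions: set once, inherited by every task
# created under this project.
project_payload = {
    "project": "receipt_transcription",  # hypothetical project name
    "instruction": "# Guidelines\nTranscribe every line item.",
}

# Tasks created under the project carry no instruction field of their
# own; they inherit the project-level instructions automatically.
task_payloads = [
    {"project": "receipt_transcription", "attachment": url}
    for url in [
        "https://example.com/receipt_1.png",
        "https://example.com/receipt_2.png",
    ]
]
```

This keeps per-task payloads small and means an instruction update in one place reaches all tasks in the project.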
Fictional Examples Below:
We recommend using this template when creating instructions.
After you have created instructions, it's time to submit your tasks to be labeled. Depending on your task's complexity, it might take a few iterations to create a comprehensive set of instructions.
If you feel that the results are not aligned with your requirements, please refine the instructions document. It is helpful to:
Instructions are iterative!
For your first batch, we recommend submitting a small sample of tasks (between 10 and 20) to be annotated. After you receive the results, please use our quality assurance tools to approve or reject tasks.
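The pilot-batch step can be sketched as follows (the helper name and task payloads are illustrative, not part of any official API):

```python
def select_pilot_batch(tasks, size=20):
    """Take a small pilot sample (roughly 10-20 tasks) for the first submission.

    Submitting only this slice keeps the review cycle fast: check the
    annotated results, refine the instructions, then submit the rest.
    """
    return tasks[:size]

# Illustrative usage with placeholder task payloads.
all_tasks = [
    {"attachment": f"https://example.com/img_{i}.png"} for i in range(500)
]
pilot = select_pilot_batch(all_tasks)
```

Once the pilot batch comes back, approve or reject the results, update the instructions, and only then submit the remaining tasks.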