⚗️ Experiments

Experiments structure your annotation project, from testing the ontology to annotating a gold standard.

How Experiments work

On an abstract level, experiments are like milestones in your annotation project. If your project involves multiple annotators and complex ontologies, you want to ensure that annotations are consistent and reliable. Iterating on the ontology and the annotation guidelines helps you with that.

On a more concrete level, experiments are a collection of assignments that connect a document to an annotator. For example, Annotator A processes documents 1 to 50, and Annotator B processes documents 51 to 100. The experiment tracks progress on the assigned tasks and is finished when all documents are processed.
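Conceptually, you can picture this as a simple data structure. The following Python sketch is purely illustrative: the names Assignment and Experiment and their fields are assumptions, not the tool's actual API.

```python
from dataclasses import dataclass, field

@dataclass
class Assignment:
    """One unit of work: a document assigned to an annotator (hypothetical model)."""
    document_id: str
    annotator: str
    done: bool = False

@dataclass
class Experiment:
    """An experiment as a named collection of assignments (hypothetical model)."""
    name: str
    assignments: list[Assignment] = field(default_factory=list)

    @property
    def progress(self) -> float:
        # Fraction of assignments that have been processed so far.
        if not self.assignments:
            return 0.0
        return sum(a.done for a in self.assignments) / len(self.assignments)

    @property
    def finished(self) -> bool:
        # The experiment is finished when all documents are processed.
        return all(a.done for a in self.assignments)
```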

Creating an Experiment

When you create a new experiment, you are guided through the process in three steps.

1. Experiment Setup

In the first step, you clarify what the experiment is about.

Experiments have the following properties:

  • Name: A distinctive name for the experiment.

  • Purpose (optional): To better structure the annotation project, you can give the experiment a purpose.

  • Description (optional): A short description of the experiment's overall goal.

  • Due Date (optional): Give the experiment a deadline. This helps annotators gauge when their annotation tasks should be finished.
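As a rough mental model, the setup form corresponds to a record like the one below. This is a minimal sketch with assumed field names, not the tool's actual data model:

```python
from dataclasses import dataclass
from datetime import date

@dataclass
class ExperimentSetup:
    """Hypothetical record mirroring the setup form; only the name is required."""
    name: str                       # distinctive name for the experiment
    purpose: str | None = None      # optional label to structure the project
    description: str | None = None  # optional text on the overall goal
    due_date: date | None = None    # optional deadline for the annotators
```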


2. Document Selection

In the second step, you choose the documents to be annotated. You can pick any document from the document hub, either individually or by folder.

3. Task Assignment

In the last step, you select the annotators who should process the documents.

First, you select the annotators. Here you can select as many annotators as you want from your organization.

Second, you choose the task assignment strategy. There are two main ways of assigning documents:

  • Manual Assignment: You choose exactly which document should be processed by which annotator.

  • Automatic Assignment: You choose the assignment's parameters: how many documents should be co-assigned (that is, assigned to multiple annotators) and to how many annotators each co-assigned document should go. A sketch below illustrates these parameters.

Generally, it is a good idea to have some co-assignment. This way, you can check whether annotators understand the guidelines and whether the ontology is explicit enough. If your goal is to test the guidelines, you want a high co-assignment ratio, so that all annotators process the same documents. Later in the process, when you are confident in the rules, you can drop the co-assignment to 20% or less.
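To make these parameters concrete, here is a minimal Python sketch of an automatic assignment strategy. The function name, signature, and distribution logic are assumptions for illustration; the tool's built-in strategy may work differently.

```python
import itertools
import random

def auto_assign(documents, annotators, co_ratio=0.2, co_annotators=2, seed=0):
    """Hypothetical automatic-assignment strategy (not the tool's actual algorithm).

    A fraction `co_ratio` of the documents is co-assigned to `co_annotators`
    annotators each; the rest are spread round-robin, one annotator per document.
    Returns a list of (document, annotator) pairs.
    """
    rng = random.Random(seed)
    docs = list(documents)
    rng.shuffle(docs)

    n_co = round(len(docs) * co_ratio)
    assignments = []

    # Co-assigned documents go to several annotators for agreement checks.
    for doc in docs[:n_co]:
        for annotator in rng.sample(annotators, co_annotators):
            assignments.append((doc, annotator))

    # Remaining documents are distributed evenly, one annotator each.
    cycle = itertools.cycle(annotators)
    for doc in docs[n_co:]:
        assignments.append((doc, next(cycle)))

    return assignments

# Example: 100 documents, 2 annotators, 20% co-assignment.
pairs = auto_assign([f"doc-{i}" for i in range(1, 101)],
                    ["Annotator A", "Annotator B"])
```

In this sketch, setting co_ratio=1.0 would co-assign every document, which matches the guideline-testing scenario described above; dropping it to 0.2 reproduces the 20% figure.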
