Experiments

Experiments structure your annotation project, from testing the ontology to annotating a gold standard.


How Experiments work

On an abstract level, experiments are like milestones in your annotation project. If your project involves multiple annotators and complex ontologies, you want to ensure that annotations are consistent and reliable. Iterating over the ontology and the annotation guidelines helps you achieve that.

On a more concrete level, experiments are a collection of assignments, each connecting a document to an annotator. For example, annotator A processes documents 1 to 50, and annotator B processes documents 51 to 100. The experiment tracks progress on the assigned tasks and is finished when all documents are processed.
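As a minimal sketch of that model in Python (the names Assignment, Experiment, and is_finished are hypothetical illustrations, not part of elinor's API), an experiment might look like this:

```python
from dataclasses import dataclass, field

@dataclass
class Assignment:
    """One task: connects a document to an annotator (hypothetical model)."""
    document_id: str
    annotator: str
    done: bool = False

@dataclass
class Experiment:
    """An experiment as a collection of assignments."""
    name: str
    assignments: list[Assignment] = field(default_factory=list)

    def is_finished(self) -> bool:
        # Finished once every assigned document has been processed.
        return all(a.done for a in self.assignments)

# Annotator A gets documents 1-50, annotator B gets documents 51-100.
exp = Experiment(
    name="Pilot run",
    assignments=[Assignment(f"doc-{i}", "A") for i in range(1, 51)]
    + [Assignment(f"doc-{i}", "B") for i in range(51, 101)],
)
print(exp.is_finished())  # False until all 100 assignments are marked done
```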

Creating an Experiment

To create an experiment, go to your project, open the experiment pane, and click the create button.

You are then guided through the process.

1. Experiment Setup

In the first step, you define what the experiment is about.

Experiments have the following properties (see the sketch after this list):

  • Name: A distinctive name for the experiment.

  • Purpose (optional): To better structure the annotation project, you can give the experiment a purpose. Check the description below for more information.

  • Description (optional): A summary of the experiment's overall goal.

  • Due Date (optional): Give the experiment a deadline. This helps annotators gauge when their annotation tasks should be finished.
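As a rough illustration, these fields map onto a simple record. The class and field names below are assumptions made for the sketch, not elinor's actual schema:

```python
from dataclasses import dataclass
from datetime import date
from typing import Optional

@dataclass
class ExperimentSetup:
    """Fields from the setup step; names are illustrative, not elinor's schema."""
    name: str                          # required: a distinctive name
    purpose: Optional[str] = None      # e.g. "Test the annotation rules"
    description: Optional[str] = None  # the experiment's overall goal
    due_date: Optional[date] = None    # deadline shown to annotators

setup = ExperimentSetup(
    name="Guideline test, round 2",
    purpose="Test the annotation rules",
    due_date=date(2025, 3, 1),
)
```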

Purpose of experiments

  • Enrich an ontology: Annotate the documents to check whether your ontology covers all the concepts you need or whether it needs updates.

  • Test the annotation rules: Before you start creating a gold standard dataset, you want to make sure that all annotators have a clear understanding of the annotation guidelines and concepts. Make sure to assign the exact same documents to all annotators.

  • Annotate a gold standard: If your ontology is set and all annotators have a shared understanding of the project, you can assign documents automatically to different annotators to quickly create a gold standard that can be used for analysis or model training.

  • Annotate a subset of documents: Are there only a few documents you want to look at in detail? You can manually pick and assign documents.

2. Document Selection

In the second step, you select the documents you want to have annotated. You can pick individual documents or entire folders from the document hub.

3. Task assignment

In the last step, you select the annotators that should process the documents.

First, you select the annotators. You can choose as many members of your organization as you want.

In the second step, you choose the task assignment strategy. There are two main ways of assigning documents:

  • Manual Assignment: You choose exactly which document should be processed by which annotator.

  • Automatic Assignment: You choose the assignment parameters: how many documents should be co-assigned (that is, assigned to multiple annotators) and to how many annotators each co-assigned document should go.

Generally, it is a good idea to have some co-assignment. This way, you can check whether annotators understand the guidelines and whether the ontology is explicit enough. If your goal is to test the guidelines, you want a high co-assignment ratio, so that all annotators process the same documents. Later in the process, when you are confident in the rules, you can drop the co-assignment to 20% or less.
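To make the two parameters concrete, here is a rough sketch of an automatic assignment strategy in Python. The function auto_assign and its parameters are illustrative assumptions, not elinor's actual algorithm:

```python
import itertools
import random

def auto_assign(documents, annotators, co_ratio=0.2, annotators_per_doc=2):
    """Distribute documents round-robin; co-assign a fraction of them.

    co_ratio:           fraction of documents given to multiple annotators
    annotators_per_doc: how many annotators each co-assigned document goes to
    (Illustrative sketch, not elinor's actual implementation.)
    """
    assignments = []  # (document, annotator) pairs
    co_assigned = set(random.sample(documents, int(len(documents) * co_ratio)))
    cycle = itertools.cycle(annotators)
    for doc in documents:
        if doc in co_assigned:
            # Hand the document to several annotators to measure agreement.
            for annotator in random.sample(annotators, annotators_per_doc):
                assignments.append((doc, annotator))
        else:
            assignments.append((doc, next(cycle)))
    return assignments

docs = [f"doc-{i}" for i in range(1, 101)]
pairs = auto_assign(docs, ["A", "B", "C"], co_ratio=0.2, annotators_per_doc=2)
```

With co_ratio=1.0 every document goes to multiple annotators (useful for testing guidelines); with co_ratio=0.2 only a fifth of the documents overlap.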
