- Updated On 17 May 2023
The Dataloop platform has two types of QA tasks to fit your workflow needs.
- A QA Task created from an annotation task to validate annotations created by assignees.
- A Standalone QA task created to validate annotations uploaded to the platform, for example, created by your model.
See QA task for more information.
The Note Annotation is an easy means of communication between annotators and annotation managers during the QA process.
The QA Process
The QA process is the same for image items and video items.
To understand the QA process, consider the following example:
A task or assignment is created for an annotator.
The annotator completes annotating the task.
The annotation manager sees the completed annotation task and creates a QA task (in this example, the annotation manager assigns the QA task to self and acts as a QA tester, but you can assign it to anyone).
From the Workflows > Tasks page, clicking Create QA Task creates a QA task with the same name as the original task, with the word QA appended.
During the QA task, the QA tester (in this case, the annotation manager):
- Discovers issues with a couple of annotations and opens an issue on them.
- In addition, the QA tester discovers a few objects that are not annotated; since there is no annotation to flag, an issue cannot be opened on them.
- Creates a note over the unannotated objects (notes are in the shape of a ladybug) with instructions for the annotator.
When an issue or note is created, the Completed status is removed from the item, and the assignment becomes active again.
Until these issues are marked as resolved and approved by the QA tester, the item cannot be completed or approved.
On the Task page, the progress bar shows that one item is not completed, and there are issues remaining to be resolved.
From the Task page, the annotation manager can browse the task, browse the issues, and browse pending reviews.
In the meantime, the Task page of the annotator shows that there are open issues in the task. Hovering over the red exclamation point shows the number of open issues.
Double-clicking the exclamation point opens the item with issues.
The annotator corrects the annotations in question and marks the issues in the Annotations tab for review by clicking the hourglass icon.
Now the annotator can click Complete. On the Task page, the task is shown with items for review.
Similarly, the Task page on the QA tester's side shows that the annotator has submitted items for review.
The QA tester clicks Browse Pending Review to view the corrected annotations.
If the issues have been corrected satisfactorily, the QA tester approves them by selecting the annotations marked for review and clicking on the Approve checkmark icon.
At this point, the For Review icons disappear, and the QA tester can click Approve.
Both the annotation task and the QA task are now completed.
When working on a QA assignment, the assignee (QA tester) is expected to review annotations and set the item status: Approved if the item is annotated satisfactorily, or Discarded if the item is unsuitable for the task; alternatively, the QA tester raises issues on annotations that the annotator should fix.
See this example of the QA process. Setting the status triggers the studio to move to the next item.
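The status flow described in the walkthrough above can be sketched as a small state model. Note that this is not the Dataloop SDK; the class, method names, and status strings below are illustrative stand-ins for the behavior the documentation describes:

```python
# Illustrative model of the QA status flow (NOT the Dataloop SDK).
ISSUE, FOR_REVIEW, APPROVED = "issue", "for_review", "approved"

class Item:
    def __init__(self):
        self.annotation_statuses = {}   # annotation id -> status
        self.status = None              # None / "completed" / "approved"

    def open_issue(self, ann_id):
        # Opening an issue removes Completed and reactivates the assignment.
        self.annotation_statuses[ann_id] = ISSUE
        self.status = None

    def mark_for_review(self, ann_id):
        # The annotator flags a corrected annotation for review.
        self.annotation_statuses[ann_id] = FOR_REVIEW

    def approve_annotation(self, ann_id):
        # The QA tester approves a corrected annotation.
        self.annotation_statuses[ann_id] = APPROVED

    def can_complete(self):
        # An item cannot be completed while any issue is still open.
        return ISSUE not in self.annotation_statuses.values()

item = Item()
item.open_issue("box-1")
assert not item.can_complete()      # open issue blocks completion
item.mark_for_review("box-1")       # annotator fixes and submits for review
assert item.can_complete()          # no open issues remain
item.approve_annotation("box-1")    # reviewer approves the fix
```

The key invariant mirrored here is that Complete/Approve is blocked while any annotation still carries an open issue.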
Annotation QA Process
In a QA process, the reviewer can flag an Issue on one or more annotations. The issue is flagged from the annotations list, or in bulk by selecting multiple annotations and clicking the Issue icon.
- Open issue: When an issue is flagged on an annotation, the item is returned to the annotator to be fixed.
- Correcting issues: When entering the Assignments page, annotators will see their assigned Issues for each assignment in the table. Clicking on each open issue will redirect the Annotator back to the Studio, with the relevant item and annotations on display. When an annotation is corrected, the annotator can flag it For review.
- Reviewing fixed annotations: Annotations whose status was changed from Issue to For Review appear in the Assignments and Task page tables under their assignment. Clicking each For Review entry redirects the reviewer back to the Studio, with the relevant item and annotations on display. The reviewer who opened the issue then sees the item again in its QA task, reviews the corrected annotations, and either flags them with an issue again or flags them as Approved.
You can view the total number of open Issues and For review items on a task level.
- Once an item has no annotations with open issues, it can be flagged as Approved.
A reviewer and annotator can leave messages to each other on an item using the note tool.
For example, a reviewer can explain why an issue was flagged (what the problem with an annotation is), or place a note where an annotation should have been created (and, since it was not created, cannot be flagged with an issue).
You can review the QA status of a task in several different ways, depending on the user role and the type of information requested.
The task page shows, for every task, whether there are open issues or items pending review.
Task Assignments Page
Double-click a task to see all its assignments. Each assignment line shows an indication of whether there are open issues or items pending review.
On the Workflows > Issues page, you can see the full list of issues in a project. Issues can be viewed in the following modes:
- My Issues: Issues on annotations created by the current user that are pending correction.
- Project Issues: All issues created by contributors in the project.
While reviewing Project Issues, users can apply additional filters:
- Filter by project: Select between Project Issues and My Issues. The relevant data is displayed in a table. Note that the number of issues shown updates according to the contributor filter.
- Filter by contributor: This filter lets you view contributors with their matching issues. Select from the drop-down list between All Contributors and a specific contributor's email. The relevant data is displayed in a table with the following columns:
- Assignee column: Lists the assignment's contributor/assignee
- Task column: Lists the task with its relevant open issue. Note that this column can be sorted.
- Filter by task: This filter lets you view tasks with their matching issues. Since it is a multi-select filter, choose between All Tasks and specific tasks.
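As a rough sketch, the contributor filter (single select) and the task filter (multi-select) narrow the same issues table. The code below models this in plain Python with hypothetical data; it is not the Dataloop platform or SDK:

```python
# Hypothetical issues table; emails and task names are made up.
issues = [
    {"assignee": "ann1@acme.com", "task": "cars-batch-1", "status": "open"},
    {"assignee": "ann2@acme.com", "task": "cars-batch-1", "status": "open"},
    {"assignee": "ann1@acme.com", "task": "cars-batch-2", "status": "open"},
]

def filter_issues(issues, contributor="all", tasks=("all",)):
    """Mimic the contributor (single-select) and task (multi-select) filters."""
    rows = issues
    if contributor != "all":
        rows = [i for i in rows if i["assignee"] == contributor]
    if "all" not in tasks:
        rows = [i for i in rows if i["task"] in tasks]
    return rows

print(len(filter_issues(issues, contributor="ann1@acme.com")))  # 2
print(len(filter_issues(issues, tasks=("cars-batch-1",))))      # 2
```

This also reflects why the displayed issue count changes as the filters change: each filter simply reduces the visible rows.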
Each role has its own unique viewing option, as follows:
- Annotation manager: This role sees My Issues by default. Once Project Issues is selected, all issues in the project are displayed, subject to the “filter tasks by ORG” project setting.
- Annotator: Annotators can view only the My Issues option.
- Developer and project manager: These roles see Project Issues and All Contributors by default.
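A minimal sketch of the per-role defaults listed above, assuming simplified role and view names (this is illustrative only, not a platform API):

```python
# Default issue view per role, mirroring the list above.
DEFAULT_VIEW = {
    "annotation_manager": "my_issues",   # may switch to Project Issues
    "annotator": "my_issues",            # the only available option
    "developer": "project_issues",
    "project_manager": "project_issues",
}

def allowed_views(role):
    """Annotators are limited to My Issues; other roles see both views."""
    if role == "annotator":
        return ["my_issues"]
    return ["my_issues", "project_issues"]

print(DEFAULT_VIEW["annotator"])  # my_issues
assert "project_issues" not in allowed_views("annotator")
```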
The Note annotation allows a QA reviewer to create a new annotation and assign it to an annotator for review. While standard issues let the reviewer flag existing annotations for correction, the Note tool lets the reviewer flag annotations the annotator missed.
The process for using note annotation as part of the review process is:
- Select the Note tool and create a note annotation where the original annotator may have missed placing one.
- In the dialog box, select the annotator you wish to assign the Note to (since the item can have annotations from several sources).
A. The list includes all annotators with an annotation assignment (not QA) containing the item, and all users that are annotation managers or above.
B. The first name in the list is the annotator who was the last to assign a status to the item, that is, the most likely designated assignee.
- The Note annotation will be created, and it will immediately have an Issue on it. The Completed status will be removed from the item, and it will be available for fixing in the annotator's assignment.
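The ordering of the assignee list (points A and B above) can be sketched as follows. This is plain Python with hypothetical user names, not the Dataloop SDK:

```python
# Illustrative sketch of the Note assignee list ordering (NOT the Dataloop SDK).
def note_assignee_list(annotators, managers, last_status_setter):
    """Candidates are annotators with an annotation assignment containing the
    item, plus annotation managers and above; the user who last set a status
    on the item is listed first as the most likely designated assignee."""
    candidates = list(dict.fromkeys(annotators + managers))  # dedupe, keep order
    if last_status_setter in candidates:
        candidates.remove(last_status_setter)
        candidates.insert(0, last_status_setter)
    return candidates

order = note_assignee_list(
    annotators=["ann1@acme.com", "ann2@acme.com"],
    managers=["manager@acme.com"],
    last_status_setter="ann2@acme.com",
)
print(order[0])  # ann2@acme.com
```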
As part of the review process, annotators, reviewers, and managers can leave comments for each other in the Note tool. Read more about using the Note Annotation tool.