Authoring Information

The Annotation Authoring Information feature provides comprehensive tracking and visibility into the creation and modification history of annotations. It captures metadata about who created or modified an annotation, when changes were made, how the annotation was generated, and where in the workflow it was edited.

Key information tracked includes:

  • Timestamp: Creation and edit dates/times for each annotation

  • User Identity: Email ID or employee ID of the creator/editor

  • Annotation Source: Origin method (prelabel, manual drawing, smart automation like interpolation or dynamic sizing)

  • Workflow Stage: The specific stage/node where changes occurred

  • Change History: All attribute modifications, including positional, dimensional, rotational, and property changes

  • Visual Comparison: Side-by-side visualization of any two historical versions

The system consolidates changes per session (stage + labeler combination) to show the final state rather than every incremental edit, providing a clear audit trail without overwhelming detail.
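
As an illustration only, the tracked metadata can be thought of as one record per annotation version. The sketch below is hypothetical; the field names and types are assumptions for explanation, not the platform's actual schema:

```typescript
// Hypothetical sketch of the metadata tracked per annotation version.
// Field names and types are illustrative assumptions, not the real schema.
type AnnotationSource = "prelabel" | "manual" | "automation";

interface AuthoringRecord {
  annotationId: string;
  createdAt: string;          // ISO 8601 creation timestamp
  createdBy: string;          // email ID or employee ID of the creator
  editedAt: string;           // timestamp of the most recent edit
  editedBy: string;           // email ID or employee ID of the editor
  source: AnnotationSource;   // prelabel, manual, or smart automation
  stage: string;              // workflow node, e.g. "Labeling/OP" or "Review/QC"
  changes: string[];          // attribute changes: position, dimensions, rotation, properties
}
```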


Use Cases

  • Quality Assurance & Review: Trace annotation lineage to identify error patterns, verify standards compliance, and understand whether automated or manual processes produced specific results.

  • Dispute Resolution: Review complete history to understand decision-making context and resolve disagreements with factual evidence.

  • Debugging & Issue Investigation: Trace back through history to identify where problems were introduced and prevent recurrence.


Benefits

  • Good: Reviewers can quickly identify suspicious changes, verify automation accuracy, and ensure annotations meet project standards, resulting in higher-quality datasets.

  • Fast: Reduce time spent investigating quality issues from hours to minutes by quickly pinpointing when and where problems occurred.

  • Cost Efficient:

    • Identify and address quality issues early before they propagate through the pipeline, avoiding expensive late-stage corrections.

    • Analyze prelabel vs. manual annotation performance to make data-driven decisions about automation ROI and where human expertise is essential.

    • Understand which stages and annotators require more support, allowing managers to allocate training and supervision where they'll have the greatest impact.


How to Use Authoring Information

Accessing Authoring Information

1. Click on any cuboid, 3D polyline, or 2D bounding box.

2. Right-click and navigate to Authoring Information. The Authoring Information panel appears.

3. The Authoring Information panel displays the following (sorted by most recent change first):

  • Creation timestamp and creator identity

  • Most recent edit timestamp and editor identity

  • Annotation source (prelabel/manual/smart automation)

  • Workflow stage where changes occurred

  • List of all attribute changes made

Understanding the Information Display

  • User Identity: Shows email address when available; otherwise displays employee ID for attribution.

  • Source Indicators:

    • Prelabel: Annotation originated from automated prelabeling pipeline

    • Manual: Created by human labeler from scratch

    • Automation: Generated or modified by tools like interpolation, dynamic sizing, or propagation

  • Stage Information: Displays the workflow node (e.g., "Labeling/OP", "Review/QC") where the annotation reached its current state in that session.

  • Change Consolidation: Only the final state per session (stage + labeler) is shown, not every intermediate adjustment.
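
The consolidation rule can be pictured as grouping raw edits by their session key (stage + labeler) and keeping only the latest state in each group. The snippet below is a simplified sketch under that assumption; the `Edit` shape and field names are hypothetical:

```typescript
// Simplified sketch: consolidate raw edits into one entry per session
// (stage + labeler), keeping only the most recent state in each session.
interface Edit {
  stage: string;      // workflow node where the edit happened
  labeler: string;    // who made the edit
  timestamp: number;  // epoch milliseconds
  state: unknown;     // annotation attributes after this edit
}

function consolidateBySession(edits: Edit[]): Edit[] {
  const latest = new Map<string, Edit>();
  for (const edit of edits) {
    const key = `${edit.stage}::${edit.labeler}`;
    const current = latest.get(key);
    if (!current || edit.timestamp > current.timestamp) {
      latest.set(key, edit);
    }
  }
  // Sort by most recent change first, matching the panel's ordering.
  return [...latest.values()].sort((a, b) => b.timestamp - a.timestamp);
}
```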

Visual Comparison Feature

1. In the Authoring Information panel, click the toggles beside the two versions you want to compare.

2. The system displays both versions simultaneously, with the version history shown in the right panel.

3. Switch between split-screen, overlay, and difference highlighting modes to compare the two versions visually.
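
Conceptually, the difference highlighting mode compares the attributes of the two selected versions and flags whatever changed. The function below is a hypothetical illustration of that comparison, not the platform's implementation; `AttributeMap` and the example attribute names are assumptions:

```typescript
// Hypothetical sketch: list which attributes differ between two selected versions.
type AttributeMap = Record<string, number | string>;

function diffVersions(a: AttributeMap, b: AttributeMap): string[] {
  const keys = new Set([...Object.keys(a), ...Object.keys(b)]);
  const changed: string[] = [];
  for (const key of keys) {
    if (a[key] !== b[key]) {
      changed.push(key); // e.g. "position.x", "length", "rotation.yaw"
    }
  }
  return changed;
}
```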


Best Practices

For Project Managers

  • Schedule periodic reviews of authoring information to identify bottlenecks, training needs, or quality issues before they escalate.

  • Analyze prelabel vs. manual annotation metrics to optimize automation investment and understand where human expertise adds value.

  • Communicate that authoring information is tracked for quality improvement (not punitive measures) to maintain a positive learning culture.

For Quality Reviewers

  • Start with Source: Check the annotation source first; prelabeled items may need closer scrutiny.

  • Compare Versions Strategically: Use visual comparison between initial creation and current state to spot significant changes quickly.

  • Document Patterns: When recurring issues arise from specific sources, stages, or individuals, document patterns to drive improvements.

  • Validate Automation: Regularly sample smart automation results to ensure tools like interpolation and dynamic sizing maintain accuracy.

General Best Practices

  • Always use visual comparison when investigating annotation discrepancies.

  • Be aware that only final session states are shown; if you need more granular detail, monitor changes in real time rather than reconstructing them from the history afterward.

  • When unusual patterns appear (e.g., multiple rapid edits), investigate and document root causes to prevent future issues.
