Plausible Object Motion

Plausible Object Motion is a sanity check that detects abrupt or physically impossible positional shifts of a cuboid centre between consecutive frames. For each object class, a maximum permitted distance threshold is configured - if the cuboid centre moves more than this distance from one frame to the next, a visual cue is shown and the violation is recorded in a per-frame violation list. This ensures that object trajectories remain smooth and realistic in accordance with the project guidelines.

Use Case

When annotating across large sequences, even experienced annotators can inadvertently misplace a cuboid in a single frame - causing an object to appear to teleport, then snap back. These single-frame errors are difficult to catch visually during fast annotation but produce large discontinuities that corrupt motion-dependent model features such as velocity estimation and trajectory prediction.

Plausible Object Motion addresses this by:

  • Computing the frame-to-frame displacement of every tracked cuboid centre in real time

  • Comparing each displacement against the class-appropriate maximum distance threshold

  • Displaying a visual cue immediately when the threshold is exceeded

  • Maintaining a violation list grouped by track ID so annotators and reviewers can audit all affected frames
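Taken together, the check amounts to a per-track displacement scan. The following is a minimal sketch only; the class names, thresholds, and data layout are illustrative, not the editor's actual internals:

```python
from collections import defaultdict
from math import dist  # Euclidean distance between two points (Python 3.8+)

# Hypothetical per-class limits in metres per frame, matching the
# reference thresholds later in this article.
MAX_DISPLACEMENT = {"vehicle": 5.0, "pedestrian": 1.5, "cyclist": 2.5, "motorcycle": 4.0}

def find_violations(tracks):
    """tracks: {track_id: (object_class, [cuboid centre (x, y, z) per frame])}.
    Returns violations grouped by track ID: {track_id: [frame_index, ...]}."""
    violations = defaultdict(list)
    for track_id, (obj_class, centres) in tracks.items():
        limit = MAX_DISPLACEMENT[obj_class]
        # Compare each frame's centre against the previous frame's centre.
        for frame, (prev, curr) in enumerate(zip(centres, centres[1:]), start=1):
            if dist(prev, curr) > limit:
                violations[track_id].append(frame)
    return dict(violations)
```

A vehicle centre that jumps 17 m between two consecutive frames would be recorded against that track, while a 1 m pedestrian step would pass silently.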

Common scenarios include:

  • A vehicle cuboid accidentally dragged to the wrong position in a single frame, producing an unrealistic jump

  • Keyframe interpolation producing an intermediate frame position that overshoots the realistic path

  • Copy-paste of a cuboid from a distant frame without position adjustment

  • Tracking ID reassignment causing a cuboid to inherit an incompatible previous position


Benefits

For Annotators

  • Instant Cue on Violation - The visual cue appears as soon as the displacement threshold is exceeded, enabling immediate correction before moving to the next frame.

  • Class-Aware Thresholds - Each class has its own limit, so the same displacement that is acceptable for a fast vehicle is correctly flagged as impossible for a stationary pedestrian.

  • Immediate, Non-Blocking Warnings - Alerts appear only when guidelines are violated, so the workflow is never interrupted unnecessarily.

For Project Managers

  • Configurable Per Class - Thresholds can be set to match the exact speed and frame-rate assumptions in the project guidelines, with iMerit-recommended defaults available.

  • Trajectory Integrity - Ensures training data used for motion prediction and velocity estimation is free from single-frame positional anomalies.

  • Reduced Rework Cost - Positional errors caught at annotation time are significantly cheaper to fix than those identified during model training or post-delivery QA.


Reference Thresholds

The following values are iMerit-suggested starting points. All thresholds must be validated against the project's actual frame rate and sensor capture speed before go-live, and updated in the workflow configuration to match customer guidelines.

| Object Class | Max Distance (m) | Approx. Speed Equivalent | Notes |
| --- | --- | --- | --- |
| Vehicle | 5.0 | ~112 mph / 180 km/h | Default - configurable |
| Pedestrian | 1.5 | ~33 mph / 54 km/h | Default - configurable |
| Cyclist | 2.5 | ~56 mph / 90 km/h | Suggested starting value |
| Motorcycle | 4.0 | ~90 mph / 144 km/h | Suggested starting value |

These thresholds assume a typical LiDAR capture rate of approximately 10 Hz (one frame every 100 ms). If your dataset uses a different frame rate, recalculate the per-frame distance limit as: max_speed (m/s) × frame_interval (s).
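The recalculation in the note above can be expressed as a small helper (the function name is illustrative):

```python
def per_frame_limit(max_speed_kph: float, frame_rate_hz: float) -> float:
    """Convert a real-world speed ceiling into a per-frame distance threshold (m)."""
    max_speed_ms = max_speed_kph / 3.6       # km/h -> m/s
    frame_interval_s = 1.0 / frame_rate_hz   # seconds between consecutive frames
    return max_speed_ms * frame_interval_s

# At 10 Hz, 180 km/h corresponds to the 5.0 m vehicle default above;
# at 20 Hz the same speed allows only 2.5 m per frame.
```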


Steps to Use

1

Configure Thresholds in the Project

  • Navigate to your project's recipe settings (category schema).

  • Locate and enable the Sanity Checks section.

  • For each object class, set the maximum permitted frame-to-frame displacement in metres.

  • Consider your dataset's frame rate when setting values - a higher frame rate means smaller per-frame distances for the same real-world speed.

  • Save and publish the recipe. Thresholds apply immediately to all active tasks on reload.
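As an illustration of the settings involved (the actual recipe/category-schema field names in the editor may differ), the per-class configuration could be pictured along these lines:

```python
# Hypothetical shape of the sanity-check settings; field names are
# assumptions for illustration, not the editor's real schema.
sanity_checks = {
    "plausible_object_motion": {
        "enabled": True,
        "max_displacement_m": {   # per-frame limits assuming ~10 Hz capture
            "vehicle": 5.0,
            "pedestrian": 1.5,
            "cyclist": 2.5,
            "motorcycle": 4.0,
        },
    },
}
```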

2

Annotate as Usual

  • Open the task in the 3D Multi-Sensor Fusion Labeling Editor.

  • Annotate and track objects using your standard cuboid guidelines.

  • The system computes the cuboid centre displacement between consecutive frames in real time for every tracked object.

3

Respond to the Visual Cue

  • When a displacement exceeds the class threshold, a visual cue appears on the affected cuboids and on the corresponding frames in the timeline.

  • The cue identifies the frames where the violation was detected.

  • Determine whether the cuboid was misplaced (correct the position) or whether the object genuinely moved that distance (adjust the threshold if warranted).

  • The cue clears automatically once the displacement is within the permitted range.

4

Complete the Task

  • Violations are non-blocking - you may submit the task with active motion warnings.

  • Unresolved violations remain visible to QA reviewers, providing context for improbable object motion.


Best Practices

  • Calibrate to Frame Rate - Always derive thresholds from real-world speed limits divided by the frame rate, not from arbitrary round numbers. Document the calculation in the project guidelines.

  • Set Conservative Initial Values - Begin with tighter thresholds and loosen them based on annotator feedback. It is easier to relax a threshold than to retroactively fix a large set of false negatives.

  • Use the Violation List as a Review Tool - Encourage annotators to work through the violation list systematically at the end of each sequence rather than resolving cues one at a time mid-annotation.

  • Differentiate by Traffic Context - Consider setting tighter thresholds for datasets captured in low-speed environments (car parks, urban stop-start) versus motorway sequences.

  • Pair with Rotation Check - Motion and rotation violations often co-occur on misplaced frames. Reviewing both lists together accelerates correction.

  • Flag Genuine High-Speed Events - If the dataset includes legitimately fast objects (emergency vehicles, racing scenarios), document an exception handling procedure so annotators know what to do.
