Product

Managing 3D Sensor Fusion Data with Scenes in Scale Nucleus

October 20, 2021

Today we’re thrilled to share that Nucleus now supports 3D Sensor Fusion data, including LiDAR point clouds.

Scale Nucleus is the mission control for your data, providing advanced tooling for curating, visualizing and exploring machine learning datasets and debugging ML models. We recently launched an all-new Nucleus for dataset management, complete with privacy features and updated Autotag capabilities, all to help teams build better ML models by building better datasets. Now, Nucleus can help teams manage 3D datasets to develop highly accurate perception models for a range of use cases, including autonomous vehicles, robotics, and augmented/virtual reality (AR and VR).

Scale Nucleus helps you visualize and debug 3D Sensor Fusion data by making it easy to:

  • Construct Scenes that contain a sequence of LiDAR point clouds and their corresponding camera images over time
  • Visualize, query and share sensor fusion data with colleagues
  • Integrate Sensor Fusion data with your 3D model predictions to uncover and identify:
      • Discrepancies between 3D object detection model outputs and ground truth
      • False positives, common class confusions, and cuboid predictions that have low Intersection over Union (IoU) with ground truth
      • Model performance and improvements over time

The Nucleus API makes it easy to import your data, construct scenes, and add model predictions. You can also import data from an existing Scale project directly into Nucleus.
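As a minimal sketch of getting connected (assuming the scale-nucleus Python client; the API key placeholder and dataset name are hypothetical, and exact signatures may vary by client version):

```python
# pip install scale-nucleus
import nucleus

# Authenticate with your Scale API key (placeholder shown here).
client = nucleus.NucleusClient("YOUR_SCALE_API_KEY")

# Create a dataset to hold 3D scenes. In recent client versions,
# scene datasets are flagged at creation time.
dataset = client.create_dataset("embarcadero-lidar-demo", is_scene=True)
```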

A Primer: Nucleus Scenes

Scenes are the building blocks of 3D datasets in Nucleus and are designed with point clouds as the primary unit of data. More formally, a Scene is a sequence of Frames over time, where each Frame contains exactly one point cloud and an optional set of camera images. Here we can see a LiDAR point cloud above two paired images (the full dataset includes not just front and back cameras but a full suite of sensor data), which helps explain why we designed Nucleus Scenes to encapsulate data across both time and sensor modalities:

Structure Diagram of Nucleus 3D Scenes

One useful way to interpret a LiDAR Scene is as a table of sensor readings. A row in the table corresponds to a sequence of readings from a particular sensor, while a column in the table represents a collection of sensor data from the same timestamp (we call this a “frame” of the scene).
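That table maps naturally onto the client's Frame and LidarScene objects. Here is a hedged sketch, assuming the scale-nucleus client; the file URLs and reference IDs are hypothetical placeholders:

```python
from nucleus import DatasetItem, Frame, LidarScene

# Each Frame is one column of the table: exactly one point cloud
# plus any number of named camera images from the same timestamp.
frames = [
    Frame(
        lidar=DatasetItem(
            pointcloud_location=f"s3://my-bucket/lidar/{i}.json",
            reference_id=f"lidar-{i}",
        ),
        camera_front=DatasetItem(
            image_location=f"s3://my-bucket/front/{i}.jpg",
            reference_id=f"front-{i}",
        ),
    )
    for i in range(5)
]

# The Scene is the whole table: rows are sensors, columns are frames.
scene = LidarScene(reference_id="scene-0", frames=frames)

# Scene uploads are processed asynchronously.
job = dataset.append(items=[scene], update=True, asynchronous=True)
job.sleep_until_complete()
```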

Complex Nucleus 3D Scene on SF's Embarcadero, Quarter Angle View

On their own, point clouds can be difficult to interpret due to the lack of motion cues or associated camera data. Nucleus Scenes make combined 3D and 2D data easier to interpret for two reasons: first, any number of 2D camera frames can be attached to each 3D LiDAR frame; and second, Scenes link frames from the same time span so that you can step through them in a linear fashion.

Let’s now view the same Scene in Nucleus and see if we can identify the mystery object from above.

After you upload predicted cuboids from an ML model for your Scene, Nucleus automatically computes performance metrics under the hood. You can sort predicted cuboids by IoU or filter by false positives to quickly narrow in on your model’s failure modes. (These features have been available for 2D image analysis for some time, and today we are announcing their availability for 3D data, too.)
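As a rough sketch of that workflow with the Python client (the model name, reference IDs, and cuboid values below are hypothetical, and signatures may differ across client versions):

```python
from nucleus import CuboidPrediction, Point3D

# Register the model the predictions came from.
model = client.create_model(
    name="lidar-detector-v2", reference_id="lidar-detector-v2"
)

# One predicted cuboid, keyed to a point cloud item by reference_id.
prediction = CuboidPrediction(
    label="car",
    position=Point3D(x=10.2, y=-3.1, z=0.9),   # cuboid center
    dimensions=Point3D(x=4.5, y=1.9, z=1.6),   # extent along each axis
    yaw=0.25,                                  # heading angle in radians
    reference_id="lidar-0",                    # point cloud item it belongs to
    confidence=0.87,
)

# After upload, Nucleus matches predictions against ground truth so you
# can sort by IoU or filter to false positives in the UI.
job = dataset.upload_predictions(model, [prediction], asynchronous=True)
```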

To summarize, by creating and uploading 3D Scenes to Nucleus, you can easily visualize LiDAR point clouds and images side-by-side, quality-check your ground truth annotations, and search your data based on metadata or even ML-produced attributes. Lastly, you can use Nucleus to measure key prediction performance metrics, such as IoU and false positives.

Bird's Eye View of SF Embarcadero as Seen through LiDAR and Multiple Image Sensors in Nucleus Scene

Getting Started with 3D Data in Nucleus

Visualizing, querying, and measuring the quality of sensor fusion datasets is a challenge for even the most sophisticated ML teams. With the newly launched support for multimodal scenes, Nucleus makes it easy to navigate massive datasets and quickly narrow in on the most difficult examples. Unleash the full potential of your datasets by leveraging Nucleus for debugging, search, and quality-related workflows.

The complete code for this tutorial is available in this Colab notebook. If you’d like to follow along step by step and create a personal dataset on Nucleus, you will first need to make a free Scale account and obtain a test API key (which you can find here by navigating to your user icon and clicking on API keys). You can then copy the notebook and add your personal key to upload the data and try out Nucleus for yourself. At the end, you’ll also find a section that shows how to format and upload your own LiDAR data when you’re ready to curate and manage your own 3D data in Nucleus.

