PoLAr-MAE

Particle Trajectory Representation Learning with Masked Point Modeling

Sam Young, Yeon-jae Jwa, Kazuhiro Terao
Stanford University · SLAC National Accelerator Laboratory

PCA projection of the learned representation of the 3D point cloud.

Abstract

Effective self-supervised learning (SSL) techniques have been key to unlocking large datasets for representation learning. While many promising methods have been developed using online corpora and captioned photographs, their application to scientific domains, where data encodes highly specialized knowledge, remains in its early stages.

We present a self-supervised masked modeling framework for 3D particle trajectory analysis in Time Projection Chambers (TPCs). These detectors produce globally sparse (<1% occupancy) but locally dense point clouds, capturing meter-scale particle trajectories at millimeter resolution.

Starting with PointMAE, this work proposes volumetric tokenization to group sparse ionization points into resolution-agnostic patches, as well as an auxiliary energy infilling task to improve trajectory semantics. This approach — which we call Point-based Liquid Argon Masked Autoencoder (PoLAr-MAE) — achieves 99.4% track and 97.7% shower classification F-scores, matching those of supervised baselines without any labeled data. While the model learns rich particle trajectory representations, it struggles with sub-token phenomena like overlapping or short-lived particle trajectories.
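To make the tokenization idea concrete, here is a minimal sketch, assuming each point comes as an (x, y, z, energy) row and each occupied voxel becomes one patch/token. The function name, voxel size, per-patch point budget, and zero-padding scheme are all illustrative assumptions, not the paper's exact implementation.

    import numpy as np

    def volumetric_tokenize(points, voxel_size=16.0, max_points=32):
        """Group sparse ionization points into voxel patches (tokens).

        Sketch only: PoLAr-MAE's actual tokenizer may differ in voxel
        size, patch ordering, and padding. `points` is (N, 4): x, y, z, E.
        """
        coords, energy = points[:, :3], points[:, 3:]

        # Each point is assigned to the voxel containing it; every occupied
        # voxel becomes one token, so the token count tracks occupancy
        # rather than the full (mostly empty) detector grid.
        voxel_ids = np.floor(coords / voxel_size).astype(np.int64)

        patches = {}
        for idx, vid in enumerate(map(tuple, voxel_ids)):
            patches.setdefault(vid, []).append(idx)

        tokens = []
        for vid, idxs in patches.items():
            sel = idxs[:max_points]
            # Center coordinates on the voxel so patches are
            # resolution-agnostic (local shape, not absolute position).
            center = (np.asarray(vid) + 0.5) * voxel_size
            local = np.concatenate([coords[sel] - center, energy[sel]], axis=1)
            # Zero-pad each patch to a fixed size for batching.
            pad = np.zeros((max_points - len(sel), 4))
            tokens.append(np.concatenate([local, pad], axis=0))
        return np.stack(tokens)  # (num_occupied_voxels, max_points, 4)

    # Example: 1,000 random points in a 1 m^3 region (coordinates in mm),
    # with random per-point energies.
    pts = np.concatenate([np.random.rand(1000, 3) * 1000,
                          np.random.rand(1000, 1)], axis=1)
    tokens = volumetric_tokenize(pts)

During pretraining, a large fraction of these patch tokens is masked and the model is trained to reconstruct the hidden points; the auxiliary energy infilling task additionally predicts the deposited energy at those points.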

To support further research, we release PILArNet-M — the largest open LArTPC (liquid argon TPC) dataset, with over 1M events and 5.2B labeled points — to advance SSL in high energy physics (HEP).

Semantic segmentation of the 3D point cloud (truth vs. predicted panels) after fine-tuning the PoLAr-MAE model on 10× less labeled data than the supervised baseline, SPINE.

Paper

BibTeX

@misc{young2025particletrajectoryrepresentationlearning,
      title={Particle Trajectory Representation Learning with Masked Point Modeling}, 
      author={Sam Young and Yeon-jae Jwa and Kazuhiro Terao},
      year={2025},
      eprint={2502.02558},
      archivePrefix={arXiv},
      primaryClass={hep-ex},
      url={https://arxiv.org/abs/2502.02558},
}