RF-URL 2.0: A General Unsupervised Representation Learning Method for RF Sensing

IEEE Trans Pattern Anal Mach Intell. 2025 Jul 10:PP. doi: 10.1109/TPAMI.2025.3587718. Online ahead of print.

Abstract

The major challenge in learning-based RF sensing is acquiring high-quality, large-scale annotated datasets. Unlike visual data, RF signals are inherently non-intuitive and non-interpretable, making their annotation both time-consuming and labor-intensive. To address this challenge, we propose RF-URL 2.0, a novel unsupervised representation learning (URL) framework for RF sensing that enables pre-training on easily collected, large-scale unannotated RF datasets, making downstream tasks easier to solve. Existing URL techniques, such as contrastive learning, are primarily designed for natural images and are prone to learning shortcuts rather than meaningful information when applied to RF signals. RF-URL 2.0 is the first framework to overcome these limitations by constructing positive and negative pairs through well-established RF signal processing algorithms. In addition, it introduces a novel signal-model-driven augmentation technique, which augments signal representations by identifying and perturbing physically meaningful parameters of signal processing models. Moreover, RF-URL 2.0 is carefully designed to account for the heterogeneous characteristics of different RF signal processing representations. We demonstrate the universality of RF-URL 2.0 on three typical RF sensing tasks, namely human gesture recognition, 3D pose estimation, and silhouette generation, using two general RF devices (WiFi and radar). Extensive experiments on the HIBER and Widar 3.0 datasets demonstrate that RF-URL 2.0 takes a significant step toward learning-based solutions for RF sensing. Code will be released at: https://github.com/Intelligent-Perception-Lab.
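To make the two core ideas of the abstract concrete, the sketch below illustrates (in Python/NumPy) one plausible reading of them: forming a contrastive positive pair from a single raw radar frame by computing two different classical signal-processing representations, and applying a signal-model-driven augmentation that perturbs a physically meaningful parameter (here, a Doppler shift). This is a minimal illustration only; the function names, the choice of representations, and the perturbation are assumptions, not the authors' implementation.

    import numpy as np

    # Illustrative sketch (not the paper's code): one raw IQ frame yields
    # two heterogeneous representations (a positive pair for contrastive
    # learning) plus a physics-driven augmented view.

    def range_profile(iq_frame: np.ndarray) -> np.ndarray:
        """Range FFT over fast time: one classical radar representation."""
        return np.abs(np.fft.fft(iq_frame, axis=0))

    def doppler_spectrum(iq_frame: np.ndarray) -> np.ndarray:
        """Doppler FFT over slow time: a second, heterogeneous representation."""
        return np.abs(np.fft.fftshift(np.fft.fft(iq_frame, axis=1), axes=1))

    def doppler_perturb(iq_frame: np.ndarray, delta_hz: float, prf: float) -> np.ndarray:
        """Signal-model-driven augmentation (assumed form): shift the Doppler
        frequency by delta_hz, i.e. perturb a physically meaningful parameter
        of the echo model rather than applying an image-style transform."""
        n_fast, n_slow = iq_frame.shape
        t_slow = np.arange(n_slow) / prf                  # slow-time axis in seconds
        return iq_frame * np.exp(2j * np.pi * delta_hz * t_slow)[None, :]

    # Example with a synthetic frame: view_a/view_b form a positive pair,
    # view_c is an augmented positive of view_b.
    rng = np.random.default_rng(0)
    frame = rng.standard_normal((128, 64)) + 1j * rng.standard_normal((128, 64))
    view_a = range_profile(frame)
    view_b = doppler_spectrum(frame)
    view_c = doppler_spectrum(doppler_perturb(frame, delta_hz=15.0, prf=1000.0))

The key design point the abstract emphasizes is that both the pair construction and the augmentation operate on the signal model itself (e.g., Doppler, range, or channel parameters) rather than on pixel-like views, which is what prevents the shortcut learning observed when image-style contrastive recipes are applied directly to RF data.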