Motivation: Chronic kidney disease (CKD) and acute kidney injury (AKI) are prominent public health concerns affecting more than 15% of the global population. The ongoing development of spatially resolved transcriptomics (SRT) technologies offers a promising approach for discovering the spatial distribution patterns of gene expression within diseased tissues. However, existing computational tools are predominantly designed and calibrated on the ribbon-like structure of the brain cortex, and they face considerable computational obstacles in discerning the highly heterogeneous, mosaic-like tissue architectures of the kidney. Consequently, timely and cost-effective annotation and interpretation of kidney SRT data remain challenging, limiting exploration of the cellular and morphological changes within renal tubules and their interstitial niches.
Results: We present REGNN (Relation Equivariant Graph Neural Networks), a graph deep learning framework designed for SRT data analysis of heterogeneous tissue structures. To increase expressive power in graph modeling of the SRT lattice, REGNN integrates equivariance to handle n-dimensional symmetries of the spatial area and additionally leverages positional encoding to strengthen the relative spatial relations of nodes uniformly distributed in the lattice. Given the limited availability of well-labeled spatial data, the framework implements both graph autoencoder and graph self-supervised learning strategies. On heterogeneous samples from different kidney conditions, REGNN outperforms existing computational tools in identifying tissue architectures from the 10× Visium platform. This framework offers a powerful graph deep learning tool for investigating tissues with highly heterogeneous expression patterns and paves the way to pinpointing the pathological mechanisms that contribute to the progression of complex diseases.
Availability and implementation: REGNN is publicly available at https://github.com/Mraina99/REGNN.
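To make the two ideas named under Results concrete, the following is a minimal, hypothetical sketch (not the authors' implementation; the REGNN repository above is authoritative) of how positional encoding of spot coordinates can be combined with a graph autoencoder over a Visium spot-adjacency graph. It assumes PyTorch and PyTorch Geometric; the loader name and all dimensions are illustrative assumptions.

    # Minimal sketch, assuming PyTorch Geometric; not the REGNN implementation.
    import torch
    import torch.nn as nn
    from torch_geometric.nn import GCNConv, GAE


    def sinusoidal_positional_encoding(coords: torch.Tensor, dim: int = 16) -> torch.Tensor:
        """Encode 2D spot coordinates (n_spots x 2) into a dim-sized embedding."""
        freqs = torch.exp(torch.arange(0, dim // 4).float() * -1.0)   # decaying frequencies
        angles = coords.unsqueeze(-1) * freqs                         # (n_spots, 2, dim // 4)
        pe = torch.cat([torch.sin(angles), torch.cos(angles)], dim=-1)  # (n_spots, 2, dim // 2)
        return pe.reshape(coords.shape[0], -1)                        # (n_spots, dim)


    class SpotEncoder(nn.Module):
        """Two-layer GCN encoder over the spot-adjacency graph."""
        def __init__(self, in_dim: int, hidden_dim: int = 128, latent_dim: int = 32):
            super().__init__()
            self.conv1 = GCNConv(in_dim, hidden_dim)
            self.conv2 = GCNConv(hidden_dim, latent_dim)

        def forward(self, x, edge_index):
            h = torch.relu(self.conv1(x, edge_index))
            return self.conv2(h, edge_index)


    # Usage sketch: `expr` is an (n_spots x n_genes) expression matrix, `coords`
    # the (n_spots x 2) lattice coordinates, and `edge_index` a (2 x n_edges)
    # spot-adjacency graph; all three are assumed SRT preprocessing outputs.
    # expr, coords, edge_index = load_visium_sample(...)   # hypothetical loader
    # pe = sinusoidal_positional_encoding(coords)
    # x = torch.cat([expr, pe], dim=1)                     # expression + positional features
    # model = GAE(SpotEncoder(x.shape[1]))
    # z = model.encode(x, edge_index)                      # spot embeddings for clustering
    # loss = model.recon_loss(z, edge_index)               # graph autoencoder objective

The spot embeddings z would then feed a downstream clustering step to delineate tissue architectures; an equivariant message-passing layer or a self-supervised objective could replace the GCN encoder and reconstruction loss, respectively, along the lines the abstract describes.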