Context-aware hierarchical attention for abstractive dialogue summarization

Sci Rep. 2025 Jul 1;15(1):21625. doi: 10.1038/s41598-025-05923-z.

Abstract

Abstractive dialogue summarization has gained increasing attention due to its ability to generate concise and informative summaries from complex conversational data. Social dialogues frequently exhibit phenomena such as ellipsis and topic shifts, making it essential to account for the rich contextual information embedded at multiple levels. Traditional transformer-based models often fail to fully exploit this multi-level context. To address this limitation, we propose a novel Hierarchical Context-aware Attention (HCAtt) network. Our model incorporates both segment-level and utterance-level contextual information into the transformer framework, enhancing its ability to capture the intricate dependencies in dialogue data. Specifically, we hierarchically integrate these two levels of context into the computation of the query and key transformations, which improves the modeling of contextual relationships across token representations. Experimental results on the SAMSum, DialogSum, and AMI benchmark datasets demonstrate that HCAtt outperforms existing methods, highlighting its effectiveness in handling the complexities of dialogue summarization.

Keywords: Abstractive Summarization; Dialogue Modeling; Hierarchical Context-aware Attention (HCAtt); Multi-level Context.
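To make the attention mechanism described in the abstract concrete, the following is a minimal PyTorch sketch of how utterance- and segment-level context vectors might be folded into the query and key projections of a single attention head. This is an illustration under our own assumptions, not the paper's implementation: the class name, the additive combination of context terms, and the pooling implied by `utt_ctx` and `seg_ctx` are all hypothetical.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class HierarchicalContextAttention(nn.Module):
    """Single-head attention that mixes utterance- and segment-level
    context into the query/key projections. All names and the additive
    design are illustrative; the paper's exact formulation may differ."""

    def __init__(self, d_model: int):
        super().__init__()
        self.q_proj = nn.Linear(d_model, d_model)
        self.k_proj = nn.Linear(d_model, d_model)
        self.v_proj = nn.Linear(d_model, d_model)
        # Separate projections for the two context levels (hypothetical).
        self.q_utt = nn.Linear(d_model, d_model, bias=False)
        self.q_seg = nn.Linear(d_model, d_model, bias=False)
        self.k_utt = nn.Linear(d_model, d_model, bias=False)
        self.k_seg = nn.Linear(d_model, d_model, bias=False)
        self.scale = d_model ** -0.5

    def forward(self, x, utt_ctx, seg_ctx):
        # x:       (batch, seq_len, d_model) token representations
        # utt_ctx: (batch, seq_len, d_model) utterance-level context,
        #          broadcast so each token carries its utterance vector
        # seg_ctx: (batch, seq_len, d_model) segment-level context
        q = self.q_proj(x) + self.q_utt(utt_ctx) + self.q_seg(seg_ctx)
        k = self.k_proj(x) + self.k_utt(utt_ctx) + self.k_seg(seg_ctx)
        v = self.v_proj(x)
        attn = F.softmax(q @ k.transpose(-2, -1) * self.scale, dim=-1)
        return attn @ v


# Toy usage: 2 dialogues, 16 tokens each, model width 64.
x = torch.randn(2, 16, 64)
utt_ctx = torch.randn(2, 16, 64)  # e.g. mean-pooled utterance states
seg_ctx = torch.randn(2, 16, 64)  # e.g. mean-pooled topic-segment states
layer = HierarchicalContextAttention(64)
out = layer(x, utt_ctx, seg_ctx)
print(out.shape)  # torch.Size([2, 16, 64])
```

The additive injection of context into the query and key terms (rather than, say, gating or concatenation) is one plausible reading of "hierarchically integrate these levels during the calculation of query and key transformations"; the published model may combine the levels differently.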