DeltaDPD
Exploiting Dynamic Temporal Sparsity in Recurrent Neural Networks for Energy-Efficient Wideband Digital Predistortion
Yizhou Wu (TU Delft - Electronics)
Yi Zhu (Ampleon)
Kun Qian (TU Delft - RST/Storage of Electrochemical Energy)
Qinyu Chen (Universiteit Leiden)
Anding Zhu (University College Dublin)
John Gajadharsing (Ampleon)
LCN de Vreede (TU Delft - Electronics)
C. Gao (TU Delft - Electronics)
Abstract
Digital predistortion (DPD) is a popular technique to enhance signal quality in wideband radio frequency (RF) power amplifiers (PAs). With increasing bandwidth and data rates, DPD faces significant energy consumption challenges during deployment, which runs counter to its efficiency goals. State-of-the-art DPD models rely on recurrent neural networks (RNNs), whose computational complexity hinders system efficiency. This letter introduces DeltaDPD, which exploits the dynamic temporal sparsity of input signals and neuronal hidden states in RNNs for energy-efficient DPD, reducing arithmetic operations and memory accesses while preserving satisfactory linearization performance. With a TM3.1a 200 MHz-BW 256-QAM OFDM signal applied to a 3.5-GHz GaN Doherty RF PA, DeltaDPD achieves −50.03 dBc in adjacent channel power ratio (ACPR), −37.22 dB in normalized mean square error (NMSE) and −38.52 dB in error vector magnitude (EVM) with 52% temporal sparsity, leading to a 1.8× reduction in estimated inference power.
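To illustrate the dynamic temporal sparsity mechanism described above, the following is a minimal Python/NumPy sketch of a delta-thresholded RNN update: each input and hidden-state element only propagates through the matrix-vector products when its change since the last propagated value exceeds a threshold. The function names, the thresholds theta_x/theta_h, and the plain tanh RNN cell are illustrative assumptions, not the paper's actual DPD model or threshold settings.

```python
import numpy as np

def delta_threshold(x, x_mem, theta):
    """Zero out changes smaller than theta and refresh the memory of the
    last propagated value only where an update actually happens."""
    delta = x - x_mem
    mask = np.abs(delta) >= theta           # entries worth updating this step
    x_mem = np.where(mask, x, x_mem)        # memory tracks propagated values only
    return delta * mask, x_mem, mask

def delta_rnn_step(x, h, state, Wx, Wh, b, theta_x=0.05, theta_h=0.05):
    """One RNN step where the matrix-vector products act on sparse deltas
    of the input and hidden state instead of the full dense vectors."""
    x_mem, h_mem, acc = state
    dx, x_mem, mx = delta_threshold(x, x_mem, theta_x)
    dh, h_mem, mh = delta_threshold(h, h_mem, theta_h)
    # Only weight columns matching non-zero deltas contribute; in hardware,
    # these are the only columns fetched from memory, saving MACs and accesses.
    acc = acc + Wx[:, mx] @ dx[mx] + Wh[:, mh] @ dh[mh]
    h_new = np.tanh(acc + b)
    sparsity = 1.0 - 0.5 * (mx.mean() + mh.mean())  # fraction of skipped updates
    return h_new, (x_mem, h_mem, acc), sparsity
```

In this sketch, the pre-activation accumulator acc is carried across time steps so that skipped entries reuse their previous contribution; the reported sparsity value plays the role of the temporal sparsity figure (52% in the abstract), since every skipped entry avoids both the multiply-accumulate and the corresponding weight fetch.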
Files
File under embargo until 15-12-2025