On The Higher Moment Disparity of Backdoor Attacks

Published in IEEE International Conference on Multimedia and Expo (ICME), 2024

Backdoor attacks are a significant concern in deep learning, especially when models are trained on data from untrusted sources. Many defenses use the latent representations of a backdoored model to separate trigger samples from clean ones, but they rely on some clean data to train a classifier. Recently, researchers have designed adaptive attacks whose trigger samples are latently inseparable from clean ones, making backdoor attacks even harder to prevent. For these reasons, we propose a novel defense, Higher Moment Disparity (HMD), based on higher-moment statistics of latent representations. HMD requires no clean data and uses all intermediate representations, avoiding both concerns. Extensive experiments show that our defense is promising against various attacks.
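The abstract does not spell out the exact statistics HMD uses, but the core idea of higher-moment disparity can be illustrated with a minimal sketch: compute higher moments (here, skewness and kurtosis, as assumed stand-ins) of each sample's intermediate features and flag samples whose moments deviate robustly from the population, without any clean reference set. All function names and thresholds below are hypothetical, not the paper's implementation.

```python
import numpy as np

def sample_moments(features):
    """Per-sample skewness and kurtosis of latent features.

    features: array of shape (n_samples, d), the intermediate
    representations from one layer of the (possibly backdoored) model.
    """
    mu = features.mean(axis=1, keepdims=True)
    sigma = features.std(axis=1, keepdims=True) + 1e-8
    z = (features - mu) / sigma
    skew = (z ** 3).mean(axis=1)   # third standardized moment
    kurt = (z ** 4).mean(axis=1)   # fourth standardized moment
    return np.stack([skew, kurt], axis=1)

def flag_outliers(moments, k=3.5):
    """Flag samples whose moments are far from the population median.

    Uses a robust z-score (median absolute deviation), so no clean
    data is needed -- the majority of training samples serves as the
    reference, consistent with HMD's no-clean-data setting.
    """
    med = np.median(moments, axis=0)
    mad = np.median(np.abs(moments - med), axis=0) + 1e-8
    robust_z = np.abs(moments - med) / (1.4826 * mad)
    return robust_z.max(axis=1) > k
```

In practice this scoring would be repeated over all intermediate layers and the evidence aggregated; the sketch above shows only a single layer for clarity.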

Recommended citation: Ching-Chia Kao, Cheng-Yi Lee, Chun-Shien Lu, Chia-Mu Yu, Chu-Song Chen. (2024). "On The Higher Moment Disparity of Backdoor Attacks." ICME.