Survey of optical-based physical domain adversarial attacks and defense


Bibliographic Details
Main Authors: CHEN Jinyin, ZHAO Xiaoming, ZHENG Haibin, GUO Haifeng
Format: Article
Language: English
Published: POSTS&TELECOM PRESS Co., LTD 2024-04-01
Series: 网络与信息安全学报 (Chinese Journal of Network and Information Security)
Subjects: adversarial attack; deep learning; security threat; optical physical domain adversarial attack
Online Access:http://www.cjnis.com.cn/thesisDetails#10.11959/j.issn.2096-109x.2024026
author CHEN Jinyin
ZHAO Xiaoming
ZHENG Haibin
GUO Haifeng
collection DOAJ
description Adversarial attacks mislead deep learning models into making false predictions by implanting tiny perturbations, imperceptible to the human eye, into the original input. This poses a serious security threat to computer vision systems built on deep learning. Unlike digital-domain adversarial attacks, physical-domain adversarial attacks introduce perturbations before the input is captured by the acquisition device and converted into a binary image within the vision system, and therefore pose a real threat to deployed deep learning-based computer vision systems. Optical physical-domain attack techniques, typified by projected irradiation, are easily overlooked and attract little protection because their perturbations closely resemble effects produced by natural environments in the real world. Given their high invisibility and ease of execution, they can pose a severe, even fatal, threat to real systems. Building on existing research, optical physical-domain adversarial attack techniques against computer vision systems were introduced and discussed; their attack scenarios, tools, goals, and performance were compared and analyzed; and potential future research directions for optical physical-domain adversarial attacks were also discussed.
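The perturbation mechanism the abstract describes can be sketched with a minimal, hypothetical example in the spirit of the fast gradient sign method (FGSM); the toy linear classifier, its weights, and the step size `eps` below are illustrative assumptions and are not taken from the surveyed work:

```python
import numpy as np

# Hypothetical toy linear classifier: class 1 if w . x > 0, else class 0.
w = np.array([1.0, -1.0, 0.5])   # fixed, known model weights
x = np.array([0.3, 0.1, 0.2])    # clean input; w @ x = 0.3 > 0 -> class 1

# FGSM-style attack: step each input component a small amount eps in the
# direction that lowers the class-1 score. For a linear score w . x, the
# gradient of the score with respect to x is just w, so step along -sign(w).
eps = 0.15
x_adv = x - eps * np.sign(w)

print(w @ x)      # 0.3   -> classified as class 1
print(w @ x_adv)  # -0.075 -> flipped to class 0 by a bounded, small change
```

Physical-domain optical attacks realize an analogous perturbation with light (e.g. projected irradiation) before the camera digitizes the scene, rather than by editing pixel values directly.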
format Article
id doaj-art-d771c9c4660c49bca0e37e2e616f18e2
institution Kabale University
issn 2096-109X
language English
publishDate 2024-04-01
publisher POSTS&TELECOM PRESS Co., LTD
record_format Article
series 网络与信息安全学报
title Survey of optical-based physical domain adversarial attacks and defense
topic adversarial attack
deep learning
security threat
optical physical domain adversarial attack
url http://www.cjnis.com.cn/thesisDetails#10.11959/j.issn.2096-109x.2024026