Survey of optical-based physical domain adversarial attacks and defense
Adversarial attacks mislead deep learning models into false predictions by implanting tiny perturbations, imperceptible to the human eye, into the original input. This poses a serious security threat to computer vision systems based on deep learning. Compared to digital-do...
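The abstract describes attacks that add small, human-imperceptible perturbations to an input so a model misclassifies it. As a minimal sketch of the digital-domain case that the survey contrasts with physical-domain attacks, the snippet below applies a basic FGSM-style sign-gradient perturbation; the ResNet-18 model, the epsilon value, and the random input are placeholder assumptions, not anything taken from the paper.

```python
# Minimal FGSM-style sketch (illustrative only; not the paper's method).
import torch
import torch.nn.functional as F
import torchvision.models as models

# Placeholder classifier; any differentiable image model would do.
model = models.resnet18(weights=models.ResNet18_Weights.DEFAULT).eval()

def fgsm_perturb(image, label, epsilon=0.03):
    """Add a small sign-gradient perturbation that is hard to notice visually."""
    image = image.clone().detach().requires_grad_(True)
    loss = F.cross_entropy(model(image), label)
    loss.backward()
    # Step in the direction that increases the loss, bounded by epsilon.
    adv = image + epsilon * image.grad.sign()
    return adv.clamp(0.0, 1.0).detach()

# Example call with a random "image" and an arbitrary class index.
x = torch.rand(1, 3, 224, 224)
y = torch.tensor([0])
x_adv = fgsm_perturb(x, y)
```

Physical-domain (optical) attacks surveyed in the article instead realize such perturbations with light, projections, or printed patterns rather than direct pixel edits.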
| Field | Value |
|---|---|
| Main Authors | CHEN Jinyin, ZHAO Xiaoming, ZHENG Haibin, GUO Haifeng |
| Format | Article |
| Language | English |
| Published | POSTS&TELECOM PRESS Co., LTD, 2024-04-01 |
| Series | 网络与信息安全学报 (Chinese Journal of Network and Information Security) |
| Online Access | http://www.cjnis.com.cn/thesisDetails#10.11959/j.issn.2096-109x.2024026 |
Similar Items
- Survey on adversarial attacks and defenses for object detection
  by: Xinxin WANG, et al.
  Published: (2023-11-01)
- Adversarial attacks and defenses in deep learning
  by: Ximeng LIU, et al.
  Published: (2020-10-01)
- Adversarial attack and defense on graph neural networks: a survey
  by: Jinyin CHEN, et al.
  Published: (2021-06-01)
- Double adversarial attack against license plate recognition system
  by: Xianyi CHEN, et al.
  Published: (2023-06-01)
- An Adversarial Attack via Penalty Method
  by: Jiyuan Sun, et al.
  Published: (2025-01-01)