Wenger et al., 2021 - Google Patents
Backdoor attacks against deep learning systems in the physical world - Wenger et al., 2021
- Document ID
- 2334533234773766548
- Author
- Wenger E
- Passananti J
- Bhagoji A
- Yao Y
- Zheng H
- Zhao B
- Publication year
- 2021
- Publication venue
- Proceedings of the IEEE/CVF conference on computer vision and pattern recognition
External Links
Snippet
Backdoor attacks embed hidden malicious behaviors into deep learning models, which only activate and cause misclassifications on model inputs containing a specific "trigger." Existing works on backdoor attacks and defenses, however, mostly focus on digital attacks that apply …
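The snippet only sketches the trigger mechanism. As a purely illustrative aside, and not the paper's physical-world attack, the following minimal NumPy sketch shows the classic digital data-poisoning setup that the snippet contrasts against: a small pixel-patch trigger is stamped onto a fraction of training images, which are then relabeled to the attacker's target class. All names here (`stamp_trigger`, `poison_dataset`), the white-square trigger, and the 10% poison rate are illustrative assumptions, not values from the paper.

```python
import numpy as np

def stamp_trigger(image, trigger, x=0, y=0):
    """Return a copy of `image` (H, W, C) with `trigger` pasted at offset (y, x)."""
    poisoned = image.copy()
    th, tw = trigger.shape[:2]
    poisoned[y:y + th, x:x + tw] = trigger
    return poisoned

def poison_dataset(images, labels, trigger, target_label, rate=0.1, seed=0):
    """Stamp the trigger onto a `rate` fraction of samples and relabel them to
    `target_label`; a model trained on this data tends to learn the
    trigger -> target_label shortcut while behaving normally on clean inputs."""
    rng = np.random.default_rng(seed)
    images, labels = images.copy(), labels.copy()
    poison_idx = rng.choice(len(images), size=int(rate * len(images)), replace=False)
    for i in poison_idx:
        images[i] = stamp_trigger(images[i], trigger)
        labels[i] = target_label
    return images, labels

# Toy usage: 100 random 32x32 RGB "images", a white 4x4 square as the trigger.
images = np.random.rand(100, 32, 32, 3).astype(np.float32)
labels = np.random.randint(0, 10, size=100)
trigger = np.ones((4, 4, 3), dtype=np.float32)
poisoned_images, poisoned_labels = poison_dataset(images, labels, trigger, target_label=7)
```

The paper itself studies physical-world triggers (objects worn or placed in the scene) rather than digital pixel patches like the one sketched above.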
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING; COUNTING
- G06F—ELECTRICAL DIGITAL DATA PROCESSING
- G06F21/00—Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
- G06F21/30—Authentication, i.e. establishing the identity or authorisation of security principals
- G06F21/31—User authentication
- G06F21/32—User authentication using biometric data, e.g. fingerprints, iris scans or voiceprints
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING; COUNTING
- G06K—RECOGNITION OF DATA; PRESENTATION OF DATA; RECORD CARRIERS; HANDLING RECORD CARRIERS
- G06K9/00—Methods or arrangements for reading or recognising printed or written characters or for recognising patterns, e.g. fingerprints
- G06K9/36—Image preprocessing, i.e. processing the image information without deciding about the identity of the image
- G06K9/46—Extraction of features or characteristics of the image
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING; COUNTING
- G06K—RECOGNITION OF DATA; PRESENTATION OF DATA; RECORD CARRIERS; HANDLING RECORD CARRIERS
- G06K9/00—Methods or arrangements for reading or recognising printed or written characters or for recognising patterns, e.g. fingerprints
- G06K9/62—Methods or arrangements for recognition using electronic means
- G06K9/6267—Classification techniques
- G06K9/6268—Classification techniques relating to the classification paradigm, e.g. parametric or non-parametric approaches
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING; COUNTING
- G06F—ELECTRICAL DIGITAL DATA PROCESSING
- G06F21/00—Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
- G06F21/30—Authentication, i.e. establishing the identity or authorisation of security principals
- G06F21/31—User authentication
- G06F21/316—User authentication by observing the pattern of computer usage, e.g. typical user behaviour
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING; COUNTING
- G06K—RECOGNITION OF DATA; PRESENTATION OF DATA; RECORD CARRIERS; HANDLING RECORD CARRIERS
- G06K9/00—Methods or arrangements for reading or recognising printed or written characters or for recognising patterns, e.g. fingerprints
- G06K9/00221—Acquiring or recognising human faces, facial parts, facial sketches, facial expressions
- G06K9/00268—Feature extraction; Face representation
- G06K9/00281—Local features and components; Facial parts; Occluding parts, e.g. glasses; Geometrical relationships
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING; COUNTING
- G06K—RECOGNITION OF DATA; PRESENTATION OF DATA; RECORD CARRIERS; HANDLING RECORD CARRIERS
- G06K9/00—Methods or arrangements for reading or recognising printed or written characters or for recognising patterns, e.g. fingerprints
- G06K9/62—Methods or arrangements for recognition using electronic means
- G06K9/68—Methods or arrangements for recognition using electronic means using sequential comparisons of the image signals with a plurality of references in which the sequence of the image signals or the references is relevant, e.g. addressable memory
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING; COUNTING
- G06F—ELECTRICAL DIGITAL DATA PROCESSING
- G06F21/00—Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
- G06F21/50—Monitoring users, programs or devices to maintain the integrity of platforms, e.g. of processors, firmware or operating systems
- G06F21/55—Detecting local intrusion or implementing counter-measures
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING; COUNTING
- G06F—ELECTRICAL DIGITAL DATA PROCESSING
- G06F21/00—Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
- G06F21/70—Protecting specific internal or peripheral components, in which the protection of a component leads to protection of the entire computer
- G06F21/82—Protecting input, output or interconnection devices
- G06F21/83—Protecting input, output or interconnection devices input devices, e.g. keyboards, mice or controllers thereof
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING; COUNTING
- G06K—RECOGNITION OF DATA; PRESENTATION OF DATA; RECORD CARRIERS; HANDLING RECORD CARRIERS
- G06K9/00—Methods or arrangements for reading or recognising printed or written characters or for recognising patterns, e.g. fingerprints
- G06K9/62—Methods or arrangements for recognition using electronic means
- G06K9/6217—Design or setup of recognition systems and techniques; Extraction of features in feature space; Clustering techniques; Blind source separation
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING; COUNTING
- G06K—RECOGNITION OF DATA; PRESENTATION OF DATA; RECORD CARRIERS; HANDLING RECORD CARRIERS
- G06K9/00—Methods or arrangements for reading or recognising printed or written characters or for recognising patterns, e.g. fingerprints
- G06K9/00221—Acquiring or recognising human faces, facial parts, facial sketches, facial expressions
- G06K9/00288—Classification, e.g. identification
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING; COUNTING
- G06F—ELECTRICAL DIGITAL DATA PROCESSING
- G06F2221/00—Indexing scheme relating to security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
- G06F2221/21—Indexing scheme relating to G06F21/00 and subgroups addressing additional information or applications relating to security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
- G06F2221/2133—Verifying human interaction, e.g., Captcha
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING; COUNTING
- G06K—RECOGNITION OF DATA; PRESENTATION OF DATA; RECORD CARRIERS; HANDLING RECORD CARRIERS
- G06K9/00—Methods or arrangements for reading or recognising printed or written characters or for recognising patterns, e.g. fingerprints
- G06K9/20—Image acquisition
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING; COUNTING
- G06F—ELECTRICAL DIGITAL DATA PROCESSING
- G06F17/00—Digital computing or data processing equipment or methods, specially adapted for specific functions
Similar Documents
Publication | Title | Publication Date |
---|---|---|
Wenger et al. | Backdoor attacks against deep learning systems in the physical world | |
Shan et al. | Fawkes: Protecting privacy against unauthorized deep learning models | |
Gao et al. | Backdoor attacks and countermeasures on deep learning: A comprehensive review | |
Gao et al. | Design and evaluation of a multi-domain trojan detection method on deep neural networks | |
Sehwag et al. | Analyzing the robustness of open-world machine learning | |
Guo et al. | An overview of backdoor attacks against deep neural networks and possible defences | |
Kong et al. | A survey on adversarial attack in the age of artificial intelligence | |
Shan et al. | Poison forensics: Traceback of data poisoning attacks in neural networks | |
Struppek et al. | Learning to break deep perceptual hashing: The use case neuralhash | |
Ning et al. | Capjack: Capture in-browser crypto-jacking by deep capsule network through behavioral analysis | |
Wang et al. | Backdoor attack through frequency domain | |
Doan et al. | Defending backdoor attacks on vision transformer via patch processing | |
Guo et al. | A master key backdoor for universal impersonation attack against DNN-based face verification | |
Zhang et al. | Cassandra: Detecting trojaned networks from adversarial perturbations | |
Wang et al. | A survey of neural trojan attacks and defenses in deep learning | |
Liang et al. | Poisoned forgery face: Towards backdoor attacks on face forgery detection | |
US20220398325A1 (en) | Method and system for security assessment | |
Wenger et al. | SoK: Anti-facial recognition technology | |
Ma et al. | TransCAB: Transferable clean-annotation backdoor to object detection with natural trigger in real-world | |
Cai et al. | Randomized channel shuffling: Minimal-overhead backdoor attack detection without clean datasets | |
Le Roux et al. | A Comprehensive Survey on Backdoor Attacks and their Defenses in Face Recognition Systems | |
Li et al. | Temporal-distributed backdoor attack against video based action recognition | |
Ma et al. | Watch out! simple horizontal class backdoor can trivially evade defense | |
Xiang et al. | Revealing perceptible backdoors, without the training set, via the maximum achievable misclassification fraction statistic | |
Ma et al. | Horizontal class backdoor to deep learning |