Exploiting Backdoors of Face Synthesis Detection with Natural Triggers
Abstract
Information
Published In

Publisher
Association for Computing Machinery
New York, NY, United States
Publication History
Author Tags
Qualifiers
- Research-article
Funding Sources
- National Natural Science Foundation of China
Contributors