Deepfakes for medical video de-identification: Privacy protection and diagnostic information preservation
B Zhu, H Fang, Y Sui, L Li - Proceedings of the AAAI/ACM Conference on AI, Ethics, and Society, 2020 - dl.acm.org
Data sharing for medical research has been difficult, as open-sourcing clinical data may violate patient privacy. Traditional methods for face de-identification wipe out facial information entirely, making it impossible to analyze facial behavior. Recent advances in whole-body keypoint detection also rely on facial input to estimate body keypoints. Both facial and body keypoints are critical in some medical diagnoses, so keypoint invariance after de-identification is of great importance. Here, we propose a solution using deepfake technology: the face-swapping technique. While face swapping has been criticized for invading privacy and portrait rights, it can conversely protect privacy in medical video: patients' faces can be swapped to a suitable target face and become unrecognizable. However, it remained an open question to what extent this swapping-based de-identification affects the automatic detection of body keypoints. In this study, we apply deepfake technology to Parkinson's disease examination videos to de-identify subjects and show quantitatively that face swapping is a reliable de-identification approach: it keeps the keypoints almost invariant, significantly better than traditional methods. This study proposes a pipeline for video de-identification and keypoint preservation, clearing up some ethical restrictions on medical data sharing. This work could make open-sourcing high-quality medical video datasets more feasible and promote future medical research that benefits our society.
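The core quantitative claim is that keypoints stay "almost invariant" after face swapping. A minimal sketch of how such invariance could be measured is below, comparing keypoints detected on an original frame against those detected on the de-identified frame. The function name, the input format (ordered lists of pixel coordinates), and the specific metrics (mean Euclidean displacement, fraction of keypoints within a pixel threshold) are illustrative assumptions, not the paper's exact evaluation protocol.

```python
import math

def keypoint_invariance(orig, swapped, threshold=5.0):
    """Compare keypoints from an original frame with keypoints detected
    after face-swap de-identification.

    orig, swapped: equal-length lists of (x, y) pixel coordinates in the
    same keypoint order (hypothetical format; a real detector such as a
    whole-body pose estimator would supply these).
    Returns (mean displacement in pixels,
             fraction of keypoints within `threshold` pixels).
    """
    # per-keypoint Euclidean displacement between the two detections
    dists = [math.hypot(ox - sx, oy - sy)
             for (ox, oy), (sx, sy) in zip(orig, swapped)]
    mean_disp = sum(dists) / len(dists)
    within = sum(d <= threshold for d in dists) / len(dists)
    return mean_disp, within

# Toy example: keypoints barely move after swapping.
orig = [(100.0, 200.0), (150.0, 210.0), (120.0, 300.0)]
swapped = [(101.0, 200.0), (150.0, 212.0), (120.0, 300.0)]
mean_disp, within = keypoint_invariance(orig, swapped)
# mean_disp is 1.0 px here; all keypoints fall within the 5 px threshold
```

Averaging this per-frame score over a video, and comparing it against the same score computed after a traditional method such as blurring, would quantify how much better swapping preserves keypoint detectability.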