Evidence suggests that North Korean IT workers are using real-time deepfake technology to infiltrate organizations through remote work positions, posing significant security, legal and compliance risks. Investigators have documented cases in which interviewees presented synthetic video feeds and used identical virtual backgrounds across different candidate profiles. DPRK (Democratic People's Republic of Korea, a.k.a. North Korea) IT workers incrementally advanced their infiltration methodology by adopting real-time deepfake technology. They relied on single images generated by thispersondoesnotexist[.]org, which permits the use of generated faces for personal and commercial purposes, together with freely available deepfake tools.

Several technical shortcomings in real-time deepfake systems create detection opportunities (a heuristic sketch of the first follows the list):

- Temporal consistency issues: Rapid head movements caused noticeable artifacts as the tracking system struggled to maintain accurate landmark positioning.
- Occlusion handling: When the operator's hand passed over their face, the deepfake system failed to properly reconstruct the partially obscured face.
- Lighting adaptation: Sudden changes in lighting conditions revealed inconsistencies in the rendering, particularly around the edges of the face.
- Audio-visual synchronization: Slight delays between lip movements and speech were detectable under careful observation.
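The temporal-consistency weakness lends itself to a simple heuristic check. The sketch below is illustrative and not from the Unit 42 report: it assumes per-frame facial landmarks are already available from some face-tracking library, and the function name `landmark_jitter_score`, the synthetic demo data, and the scoring approach are all assumptions. The idea is that genuine faces move rigidly during rapid head motion, while a struggling face-swap tracker produces landmark jitter that is uncorrelated with the overall head movement.

```python
# Minimal sketch of a temporal-consistency check for a live interview feed.
# ASSUMPTION: per-frame facial landmarks (from any face-tracking library)
# are already available as an array of (frames, points, 2) pixel coordinates.

import numpy as np


def landmark_jitter_score(landmarks: np.ndarray) -> np.ndarray:
    """Return a per-frame jitter score; higher values suggest tracking
    artifacts such as those seen in real-time deepfakes.

    landmarks: array of shape (frames, points, 2)
    """
    # Frame-to-frame displacement of every landmark.
    deltas = np.diff(landmarks, axis=0)                 # (frames-1, points, 2)

    # Global head motion per frame: mean displacement across all landmarks.
    head_motion = deltas.mean(axis=1, keepdims=True)    # (frames-1, 1, 2)

    # Residual motion: how far each landmark deviates from rigid head motion.
    # Genuine faces keep this small even when the head moves quickly.
    residual = np.linalg.norm(deltas - head_motion, axis=2)   # (frames-1, points)

    # Normalize by head speed so fast but coherent movement is not penalized.
    head_speed = np.linalg.norm(head_motion, axis=2) + 1e-6   # (frames-1, 1)
    return (residual / head_speed).mean(axis=1)                # (frames-1,)


if __name__ == "__main__":
    # Synthetic demo: 100 frames, 68 landmarks, with jitter injected in the
    # second half to mimic a face-swap tracker losing landmark positioning.
    rng = np.random.default_rng(0)
    base = rng.uniform(100, 300, size=(1, 68, 2)).repeat(100, axis=0)
    base += np.linspace(0, 50, 100)[:, None, None]      # smooth head drift
    base[50:] += rng.normal(0, 3.0, size=(50, 68, 2))   # artifact-like jitter

    scores = landmark_jitter_score(base)
    print("mean jitter, first half :", scores[:49].mean())
    print("mean jitter, second half:", scores[49:].mean())
```

Any real detector built along these lines would need tuning against genuine interview footage, since natural expressions also produce non-rigid landmark motion; the score is only meant as one signal among several, alongside occlusion, lighting and lip-sync checks.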
Source: unit42.paloaltonetworks.com