
Republic of the Philippines
National Police Commission
PHILIPPINE NATIONAL POLICE
ANTI-CYBERCRIME GROUP
Camp BGen Rafael T Crame, Quezon City

ACG-CYBER SECURITY BULLETIN NR 189: Understanding Deepfake Threat Technology

Reference Number ACG-CSB 052720189

The following information was obtained from various cyber security sources for the notification of all parties concerned, pursuant to the mandate of the Philippine National Police Anti-Cybercrime Group (PNP ACG). It is classified as “Restricted” pursuant to PNP Regulation 200-012 on Document Security, with an Impact Rating of High based on the PNP Information and Communications Technology (ICT) Security Manual s.2010-01, pp. 22 and 129.

SUMMARY

Deepfake is a type of Artificial Intelligence (AI) used to create convincing image, audio, and video hoaxes. Deepfake content is created using two competing AI algorithms: one, called the generator, produces the fake content, while the other, called the discriminator, tries to detect it. The two are trained against each other until the discriminator can no longer reliably tell the fakes from real samples.
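The interplay between the two competing algorithms can be sketched with a deliberately simplified toy program. Real deepfake systems train both sides as neural networks on images; in this illustrative sketch the "generator" is a single number, the "discriminator" is a fixed scoring function, and the update rule is a crude hill-climb standing in for gradient descent. All names and values here are assumptions for illustration, not part of any real deepfake tool.

```python
import random

random.seed(0)
REAL_MEAN = 10.0  # stand-in for the "real data" the generator must imitate

def discriminator(sample):
    # Score how "real" a sample looks: 1.0 at the true data value,
    # falling off linearly with distance from it.
    return max(0.0, 1.0 - abs(sample - REAL_MEAN) / 10.0)

gen_param = 0.0  # the generator starts out producing obvious fakes
for _ in range(200):
    candidate = gen_param + random.uniform(-0.5, 0.5)
    # Keep only the moves the discriminator scores as more "real";
    # over many rounds the generator's output converges on the real data.
    if discriminator(candidate) > discriminator(gen_param):
        gen_param = candidate

print(round(gen_param, 1))  # ends up close to 10.0: the fakes now fool the scorer
```

In a real system the discriminator is trained at the same time, which is what forces the generator to keep improving until its output is indistinguishable from genuine footage.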

It takes a few steps to make a face-swap video. First, thousands of face shots of the two people are run through an AI algorithm called an encoder. The encoder finds and learns similarities between the two faces and reduces them to their shared common features, compressing the images in the process. A second AI algorithm, called a decoder, is then taught to recover the faces from the compressed images. Because the encoder is shared but each person gets their own decoder, feeding one person's compressed face into the other person's decoder produces the swap: the same expression and pose, rendered in the other person's likeness.
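The shared-encoder, per-person-decoder scheme above can be illustrated with a minimal toy sketch. Here a "face" is just a list of four numbers, the encoder compresses it to a two-number latent, and each decoder re-expands the latent using person-specific offsets standing in for learned weights. All names and numbers are illustrative assumptions, not a real deepfake pipeline.

```python
def encoder(face):
    # Compress a 4-value "face" to a 2-value latent by averaging pairs
    # (a stand-in for extracting shared features such as pose and expression).
    return [(face[0] + face[1]) / 2, (face[2] + face[3]) / 2]

def make_decoder(style):
    # Each person gets a decoder that re-expands the latent using
    # person-specific offsets ("style") learned from their own photos.
    def decoder(latent):
        return [latent[0] + style[0], latent[0] - style[0],
                latent[1] + style[1], latent[1] - style[1]]
    return decoder

decoder_a = make_decoder(style=(1.0, 2.0))  # "trained" on person A's photos
decoder_b = make_decoder(style=(3.0, 0.5))  # "trained" on person B's photos

face_a = [5.0, 3.0, 8.0, 4.0]
latent = encoder(face_a)           # shared features: [4.0, 6.0]
reconstructed = decoder_a(latent)  # A's decoder restores A's face exactly
swapped = decoder_b(latent)        # B's decoder renders the SAME latent
                                   # in B's likeness: the face swap
print(reconstructed)  # [5.0, 3.0, 8.0, 4.0]
print(swapped)        # [7.0, 1.0, 6.5, 5.5]
```

The key design point survives the simplification: because both people pass through one shared encoder, their latents live in the same feature space, so either decoder can render either person's expression.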

Deepfakes are not illegal per se, but producers and distributors can easily fall foul of the law. Depending on the content, a deepfake may infringe copyright, breach data protection law, and be defamatory if it exposes the victim to ridicule. There is also the specific criminal offense of sharing sexual and private images without consent, i.e., so-called revenge porn, for which offenders can receive up to two years in jail.

Identity theft is one area of concern with deepfakes. Facial scans are used in biometric identification, such as the iPhone's Face ID lock, and there are fears that a deepfake face could eventually fool a facial scan. Less technical but more damaging is the potential for deepfake slander: an impersonator can put damaging words in someone's mouth, harming their reputation, creating criminal and civil liability, and paving the way for disinformation campaigns.

In this era of fake news and misinformation, where people are easily misled even by a poorly photoshopped picture, it is frightening to imagine how many people a convincing deepfake video could deceive. Despite the looming difficulties, Congress should begin crafting legislation to address deepfakes; otherwise, the technology can eventually be used as a means to less than noble ends. In the first place, what does a deepfake really achieve, and are there any legitimate purposes to it at all?
Misuse or abuse of deepfake technology poses real dangers, including the ability to create fake news, cause political havoc, and target specific demographics such as, but not limited to, women, minorities, and vulnerable persons.

The sad truth is that the more videos and images of you exist online, the easier you are to deepfake, and the technology keeps improving: a deepfake can now be made from just one photo of you. The latest systems do not even need a voice-over by another actor; instead, text can be generated that puts new words into someone's mouth.

RECOMMENDATION

All PNP personnel, as well as the public, are advised to understand the deepfake threat and to observe the following precautions to avoid becoming victims of cybercrime:

• Follow good basic protocols: "trust but verify";
• Ensure employees and family members know how deepfaking works and the challenges it can pose;
• Be media literate and use good-quality news sources;
• Take steps toward compliance that support the protection of sensitive data such as financial, personal, or business information; and
• Keep regular backups, which protect your data against ransomware and give you the ability to restore damaged data.

For additional information, please refer to the following websites:
• https://www.theguardian.com/technology/2020/jan/13/what-are-deepfakes-and-how-can-you-spot-them
• https://news.abs-cbn.com/ancx/culture/spotlight/08/26/19/deep-fakes-bring-fake-news-to-a-whole-other-leveland-you-should-be-concerned
• https://www.mondaq.com/canada/copyright/687716/what-can-the-law-do-about-deepfake39
• https://www.avg.com/en/signal/what-is-deepfake-video-and-how-to-spot-it


POINT OF CONTACT

Please contact PMAJ ANGELICA STARLIGHT L. RIVERA, Chief, Personnel Records Management Section, thru e-mail or at telephone number (632) 7230401 local 3562 for any inquiries related to this CYBER SECURITY BULLETIN.