Issue link: https://maltatoday.uberflip.com/i/1544378
7 maltatoday | SUNDAY • 12 APRIL 2026 FEATURE

deepfake creators deliberately eliminating the telltale artefacts: irregular blinking, misaligned shadows, and inconsistencies in facial movement that detection systems are trained to flag. "A tool may be good for a certain period but the moment there is an evolution on the deepfakes side, another tool will come out," the superintendent says. "We have to keep investing to stay up to date."

A chain that cannot break

When Europol dismantled a cryptocurrency fraud network in December 2025, Malta was involved. Four Maltese victims lost nearly €500,000 to a scam that used deepfake videos of celebrities to lure victims, with a suspect travelling to Malta to collect cash from one of them. Malta's cybercrime unit assisted with digital forensics, examining every seized device and computer, and passed evidence to countries across the operation that might otherwise never have had it.

"It was a very successful case," the superintendent says. "With the tools we had and the work we did, we managed to pass information to other countries that were involved, who perhaps did not have that information, because we managed to extract it from the devices of the person who was here."

The core principle is the chain of custody, which requires all evidence to be traceable from start to finish. A missing link can cause everything to fail in court. Buckingham's dissertation identified this as Malta's key challenge, with local respondents rating deepfakes' impact on presenting evidence as high, especially before judges unfamiliar with the technology. Deepfake evidence, when it reaches a Maltese courtroom, is treated procedurally the same as any other digital evidence.

Where the trail goes cold

Identifying a deepfake and identifying whoever made it are two very different things. Even when investigators confirm a video was generated using a specific AI tool, they are still nowhere near a name.
The tool does not give up the account, and the account does not give up the person, who is usually abroad, they explained. On top of this, mutual legal assistance requests, European Investigation Orders, and companies that will not hand over data without a court order all add months to a process that the technology does not wait for. "Even if you know it was made in ChatGPT, which account made it?" one officer asks. "You always have a long investigation ahead of you."

It is precisely this drag between the speed of technology and the speed of the law that Prime Minister Robert Abela acknowledged in February, announcing that the Justice Ministry would table a bill to regulate the improper use of deepfakes. Whether the cybercrime unit was consulted draws a measured response. "Like any other law, normally when there is a final draft it is shared, and we see whether there is anything that impacts us or not," the superintendent says.

What they want from the bill is specific: a watermark, something embedded invisibly into AI-generated content that identifies the tool that produced it. Buckingham's dissertation recommended exactly this, alongside an EU-wide certification system for forensic tools capable of detecting synthetic media. The EU is already moving in that direction. "If you create a video with ChatGPT, it will write somewhere how it was made, so that whoever sees it knows it is not real," one officer explains.

The part nobody wants to talk about

The conversation eventually arrives at children. Parents who photograph their children at the cinema and post the picture before leaving the car park. Faces lifted from those images, identities repurposed, photographs of minors processed into child sexual abuse material.
Buckingham's dissertation found that Maltese respondents specifically flagged the creation and sharing of CSAM using deepfake technology as one of the most commonly reported crimes they encountered, with cases referred by Europol increasing drastically in recent years. Unlike traditional CSAM, synthetically generated content can evade the hash-based detection systems designed to flag known material.

"Once you put a photo on the internet, you lose control of it forever," the superintendent says. "Especially where children are concerned, as much as possible, do not put photos of your children online."

Too good to be true

The EU AI Act deadline arrives in August 2026. Like every other piece of legislation, the unit will absorb it, adapt, and continue.

Buckingham's dissertation concluded that for Malta specifically, the combination of pending legislation, no procedural guidelines for deepfake evidence in court, and public awareness as the primary national response meant that the gap between technology and the law remained wide.

The officers make a similar argument: they want people to slow down, to look twice at an investment opportunity that sounds too good before clicking, and to call their bank on the number on the card rather than the number that just rang them. No legitimate financial institution, they say, will ever ask for credentials over a video call with a stranger.

"Wherever you see things for free, the product is you," one officer says. "Facebook uses your content. ChatGPT learns from you. Everything that is free costs something. And if something looks too good to be true, it is."

