Now Bangladeshi actresses fall victim to deepfakes: how to detect these videos

Deepfake videos of many female celebrities from across the world have gone viral in recent times. Collage

An intimate video of a woman has gone viral on social media. The video is being claimed to be of popular small screen actress Tanjin Tisha.

After reviewing the video, Dismislab, an online fact-checking and media research platform, has confirmed it to be fake. Tanjin Tisha’s face has been swapped onto the body of a woman in a video taken from a pornography site. It has no connection to Tanjin Tisha whatsoever.

Deepfake videos of many female celebrities from across the world, including Priyanka Chopra, Alia Bhatt, Rashmika Mandanna, Natalie Portman and Emma Watson, have gone viral in recent times. Right when people are concerned about these videos, a deepfake video of actress Tanjin Tisha has surfaced.

Reportedly, this is the first deepfake video of any Bangladeshi celebrity. The fake video was circulated from several Facebook pages in November last year. Then it went ‘viral’ in December.

What is a deepfake: Deepfake technology can finely replicate someone’s face or voice. Using this technology, fake videos can be created by swapping anyone’s face into an offensive video. Such fabricated videos are known as ‘deepfake’ videos. It is quite hard for ordinary people to tell these fake videos apart from real footage.

How dangerous deepfake can be in Bangladesh

Experts say that the rise of artificial intelligence has made deepfake technology much easier to use. The misuse of deepfake technology is increasing alarmingly throughout the world, and Bangladesh is no exception. Deepfake technology can now be accessed for free or at minimal cost.

Cyber criminals can spread offensive videos by swapping anyone’s face into them. Such fake videos are crafted so intricately with deepfake technology that it is quite hard for ordinary people to identify them.

Speaking on this topic, Professor Clare McGlynn of Durham University in the United Kingdom told Prothom Alo on Sunday afternoon, “Incidents of harassment using deepfakes are occurring all over the world. There are no geographical borders to it. Any woman from any corner of the world can fall victim to it. There are plenty of such incidents involving South Asian politicians and celebrities.”

Innumerable people in Bangladesh watch Facebook reels, YouTube shorts and TikTok videos as part of their entertainment. Many also believe the fake videos that go viral on these platforms.

Professor Clare McGlynn from Durham University in the United Kingdom has been working on sexual violence and cyber bullying.
from Instagram
Any woman from any corner of the world can fall victim to it. There are plenty of such incidents involving South Asian politicians and celebrities.
Clare McGlynn, professor, Durham University, United Kingdom

Information technology expert Suman Ahmed told Prothom Alo last Thursday that deepfake technology is highly dangerous for Bangladesh, because many people do not pause to consider whether a video on social media is genuine or fake. That is why deepfakes pose an even greater challenge in Bangladesh.

Since deepfake technology has become even easier to access recently, this IT expert believes the challenge will only grow larger.

He is concerned that in future, deepfakes could be extensively misused not only to victimise celebrities and politicians, but also to humiliate anyone, to settle personal enmity or to scam people.

This type of fake video has become a matter of global concern. Qadaruddin Shishir, fact check editor (Bangladesh) for AFP, believes it to be an even ‘bigger concern’ in the Bangladeshi context.

He told Prothom Alo last Thursday, “The problem is even more acute in Bangladesh in particular, where most people have limited or no technological literacy. People can be easily misled with deepfake content.”

In August last year, deepfake videos of several politicians, including BNP leaders Rumin Farhana and Nipun Roy, came to light. Dismislab then debunked them as fake. In Qadaruddin Shishir’s words, deepfake videos are basically made and spread to humiliate someone socially or politically.

Deepfake technology is highly dangerous for Bangladesh, because many people do not pause to consider whether a video on social media is genuine or fake. That is why deepfakes pose an even greater challenge in Bangladesh.
Suman Ahmed, information technology expert

Suman Ahmed believes that the concept of deepfake videos has spread throughout the world over the last five to six years. Cases of deepfake videos have been rising in Bangladesh for about a year, but no specific statistics are available on when deepfake videos first began being made here.

Earlier, fake videos of several Dhallywood stars, including Mahiya Mahi, Pori Moni and Mehazabien Chowdhury, had gone viral on social media. However, none of those were deepfakes; cyber criminals had made them through manual editing.

Qadaruddin Shishir said, “The level of manipulation in the manually edited videos of the past was limited. Artificial intelligence has erased that line. Now it is possible to make far more credible fake videos in much less time and at much lower cost than before.”

A report by the research organisation Sensity AI, published in 2019, stated that 99 per cent of deepfake videos created worldwide are of women. And as much as 96 per cent of these deepfake videos were made without consent.
Natalie Portman, Priyanka Chopra and Rashmika Mandanna. Several stars including them have fallen victim to deepfakes.
Collage

Why women are the target

From Natalie Portman and Emma Watson of Hollywood, and Priyanka Chopra, Alia Bhatt and Rashmika Mandanna of Bollywood, to Tanjin Tisha of Dhallywood, the majority of deepfake victims across the world are indeed women.

Quite a few documentaries on deepfake pornography, including ‘My Blonde GF’ and ‘Another Body’, have shown the horrific experiences women have to go through.

A report by the research organisation Sensity AI, published in 2019, stated that 99 per cent of deepfake videos created worldwide are of women. And as much as 96 per cent of these deepfake videos were made without consent.

Professor Clare McGlynn has long been working on sexual violence and cyber bullying. She told Prothom Alo, “Deepfake pornography is being used as a weapon to harass women. The pornography industry is basically designed for men. As a result, women have become the target of deepfakes.”

Explaining why women are targeted, Clare McGlynn said, “Society does not have a good record of taking crimes against women seriously, and this is also the case with deepfake porn. Online abuse is too often minimised and trivialised.”

Deepfake videos of female celebrities are ‘sellable’. Some make deepfake sexual content of those celebrities to make money online.
Fact checker Qadaruddin Shishir

Women are seen falling victim to deepfakes in every corner of the world, whether in the United States, in Europe or in Bangladesh.

Dhaka-based fact checker Qadaruddin Shishir said that deepfake videos of female celebrities are ‘sellable’: some make deepfake sexual content of those celebrities to make money online. Apart from that, there could be personal motives as well; he believes many people also make deepfake videos out of revenge.

Tania Haque, professor of women and gender studies at Dhaka University, told Prothom Alo last Thursday, “We still don’t perceive women as human beings; rather, they are considered commodities. Women are seen only from a sexual perspective.”

If deepfake victims submit complaints, there’s scope to take measures under this act.
from CID's Facebook

Demand to take measures

Amid the global concern about the misuse of deepfakes, major platforms like YouTube and Facebook have started taking the issue seriously and have updated their policies. Meanwhile in India, the government has conducted raids to prevent deepfakes.

Actress Runa Khan has demanded punishment for those who make and circulate this type of deepfake video in Bangladesh.

Under the Cyber Security Act, 2023, it is a punishable crime to create and publish objectionable photographs or videos of someone without permission. If deepfake victims file complaints, there are grounds to take measures under this act.

We still don’t perceive women as human beings; rather, they are considered commodities. Women are seen only from a sexual perspective.
Tania Haque, professor, Women and Gender Studies Department, Dhaka University

People can file complaints with the Cyber Police Centre of the Criminal Investigation Department (CID), the Cyber Crime Investigation Department of DMP and the complaint centre of Police Cyber Support for Women.

Rezaul Masud, additional DIG at CID’s Cyber Police Centre, told Prothom Alo last Thursday that they have not yet received any complaint about deepfakes; measures will be taken if any complaint comes. Cyber criminals are identified at CID’s digital forensic lab, and it is capable of tracing the criminals behind deepfakes as well, he added.

Deepfake videos are normally a few seconds long.
Reuters

How to tell deepfake videos apart

1.    In many cases there are anomalies in deepfake videos. Sometimes the person’s body language is abnormal; sometimes the pace of speech is unusual, or the facial expressions do not match the speech. That is why, cunningly, most deepfake videos have no audio at all.

2.    Usually the most attention is paid to replicating the person’s face. As a result, various mismatches remain in the other visible parts of the body. The clothing or the face may also be inconsistent with the body’s movement.

3.    Deepfake videos are normally only a few seconds long, because creating long deepfake videos is expensive.

4.    In this sort of video, there can be incoherence in the shape and proportions of different body parts, such as the lips, eyes, ears, nose or hair, or in the movement of the eyelids. In some cases, the source of a deepfake video can be traced on the internet through a reverse image search.
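The reverse image search mentioned in the last tip generally works by comparing compact ‘perceptual fingerprints’ of images rather than raw pixels, so that a slightly edited copy still matches its source. As a purely illustrative sketch (not any particular search engine’s actual method), here is one common fingerprint, the difference hash (dHash), in Python; the ‘frames’ below are tiny synthetic grayscale grids standing in for real video frames:

```python
# Difference hash (dHash): a compact perceptual fingerprint.
# Two visually similar frames produce hashes that differ in only
# a few bits (a small Hamming distance), so an edited copy can
# still be matched back to its source image.

def dhash(pixels):
    """Hash a grayscale image given as rows of pixel values (0-255).
    Each bit records whether a pixel is brighter than its right neighbour."""
    bits = []
    for row in pixels:
        for left, right in zip(row, row[1:]):
            bits.append(1 if left > right else 0)
    return bits

def hamming(a, b):
    """Count the bits in which two hashes differ."""
    return sum(x != y for x, y in zip(a, b))

# A tiny 4x4 "frame" and a uniformly brightened copy of it.
frame = [
    [ 10,  40,  40, 200],
    [ 20,  20, 180, 180],
    [ 90,  30,  30,  60],
    [250, 240,  15,  15],
]
brighter = [[min(p + 10, 255) for p in row] for row in frame]

# Brightening preserves which pixel is lighter than its neighbour,
# so the fingerprints match exactly.
print(hamming(dhash(frame), dhash(brighter)))  # prints 0
```

Because the hash depends only on relative brightness between neighbouring pixels, common edits such as brightening, recompression or resizing barely change it, which is what lets reverse-image tools find the original clip a deepfake was built from.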


* The report, originally published in the print and online editions of Prothom Alo, has been rewritten in English by Nourin Ahmed Monisha