Tasnim Tabassum Munmun

Fellow, Dismislab
AI video spreads claiming Hindu youth in Bangladesh begging for life


A video circulating on the social media platform X shows a young man speaking in a panicked voice. He claims the situation in Bangladesh has suddenly worsened and that he and others fear for their lives, and he urges viewers to share the video to seek help. However, Dismislab’s fact-check found that the video is not real: it was created using artificial intelligence (AI). Multiple similar videos have spread across social media from an Instagram account operated from India.

In a video shared from an X account called The Jaipur Dialogues, a person is heard saying, “This is Bangladesh. It is nighttime now. They will finish us off like Dipu Chandra. You can see for yourselves what is happening. Please share this video as much as possible so that someone can save us. I do not understand how everything suddenly went wrong.”

The video caption reads in English, “Bangladesh is getting out of hands.” The on-screen text says, “Hindus are being killed every day” and “Like Dipu Chandra, we too.” The post has been viewed more than 170,000 times and reposted more than 1,000 times.

The same video was also shared from another X account called The Alternative Media. That post used hashtags such as “HindusUnderAttack,” “Islamist” and “Yunus.” The video has been viewed more than 31,000 times.

A reverse image search using keyframes from the video located multiple similar videos on an Instagram account and a YouTube channel, both run by the same person, Kuldeep Meena, and both reported to be operating from India.

The video was first posted on Kuldeep Meena’s Instagram account on December 24, drawing 4 million views. At the 6-second and 13-second marks, inconsistencies are visible in the Bangladesh flag and the upper part of the flagpole.

Several more videos (1, 2, 3, 4, 5) about Bangladesh were posted from the same account. The videos claim to show “live ground scenes” from Bangladesh. However, across videos from different days, the surrounding elements appear almost identical, including the Bangladesh flag, fires in buildings or shops, and vehicles. In some videos (1, 2), the names of nearby shops appear distorted. In one video, the person speaks in English, but the lip movement does not match the audio, a clear sign of AI generation.

The Instagram account also contains other edited and AI-generated videos (1, 2). In one video, the person appears to film the Titanic sinking from nearby. The caption of another video mentions “Sora OpenAI.” Sora, OpenAI’s advanced text-to-video model, can generate realistic, extended videos from text prompts by interpreting complex scenes, character movement and environmental changes.

Further verification using content detection tools Deepfake-o-Meter and Hive Moderation found that the video was almost certainly created using AI. India Today also published a detailed fact-check report showing that the video in question was AI-generated.

Therefore, the video claiming to show a Bangladeshi Hindu youth seeking help is not real. It was created using artificial intelligence.

It is relevant to note that in Bhaluka, Mymensingh, a garment worker named Dipu Chandra Das was beaten to death over allegations of insulting Islam; his body was then hung from a tree and set on fire. A subsequent police investigation found that Dipu had committed no such offense and that the killing was driven by personal motives.

Videos created using AI tools are often spread on social media with false claims. Dismislab has previously published multiple fact-check reports on AI-generated videos.