Meta fails to detect
Deepfake videos turn political figures into gambling promoters on Facebook
Social media gambling advertisements targeting Bangladeshi users are becoming more sophisticated, now incorporating deepfake videos and audio. Advertisers are using artificial intelligence tools to create fake TV news reports and fabricated speeches by prominent political figures to lend credibility to these ads.
Since September, Dismislab has identified nine deepfake videos featured in more than 140 Facebook ads, falsely claiming that Bangladesh’s interim government, led by Chief Adviser Muhammad Yunus, endorses and invests in gambling apps. In some cases, the Nobel laureate is depicted in deepfakes promoting gambling, further misleading viewers.
These ads are problematic for several reasons. First, Bangladesh is not among the countries where gambling ads are permitted under Meta’s policies. Second, the High Court of Bangladesh has issued a directive banning online gambling advertisements. Third, the content of these ads—produced with AI tools—is false and manipulated. Yet, such gambling campaigns continue to run on Facebook, violating Meta’s community standards and policies.
Most of these ads evade Meta’s political ad transparency requirements despite featuring prominent political figures. The platform’s ad review system is failing to detect these violations, and because the ads disappear quickly after running, opportunities to verify their false claims are limited.
Politics in Gambling Advertising
Online gambling ads targeting Bangladeshi users on Meta’s platforms are nothing new, but the recent incorporation of deepfake videos featuring Dr. Muhammad Yunus has taken these campaigns to a new level. These videos exploit political upheaval and disasters, such as floods, to push gambling content.
In early August, the Awami League government, led by Sheikh Hasina, was ousted following the student quota reform movement, and an interim government was formed with Dr. Yunus as the Chief Adviser. Gambling advertisements have seized on these political changes. One video falsely claims, “Sheikh Hasina will get an extra jail term for understating casino winnings. She forced casinos in Bangladesh to reduce their payouts, so people could only earn from low-paying jobs. Muhammad Yunus abrogated this law.”
Another video states, “Muhammad Yunus has now legalized online casinos and increased winnings.” Upon verification, it was revealed that this advertisement was fabricated by editing footage of RTV news anchor Syeda Sadia Benazir and overlaying the logo of Somoy TV; both RTV and Somoy TV are prominent Bangladeshi television channels.
In one such video, Dr. Yunus is falsely portrayed as saying he developed a gaming app to address poverty and unemployment in Bangladesh, claiming it could help users earn significant sums of money each month. Some ads even assert that initiatives have been taken to support student movements and flood victims. In addition to Somoy TV, the logos of other news channels, such as Channel 24, have been added to these ads, which are styled to resemble authentic news bulletins.
In one advertisement, a Channel 24 news anchor is heard saying, “Severe floods have struck Bangladesh. The Yunus Foundation and Bangar Social Casino have allocated five million taka to support the victims. Aid will be provided to those who are in greatest need.” Upon contacting Channel 24, they confirmed that the video had been manipulated, with footage of anchor Tanveer Ahmed edited to create the ad. Tanveer Ahmed himself told Dismislab that his videos had been used to produce deepfake content.
When asked for a comment, the Chief Adviser’s Deputy Press Secretary, Apurba Jahangir, said, “All these videos are fake, and we want to draw Meta’s attention to these matters.”
Previously, Dismislab reported on gambling campaigns that had manipulated photos and videos of Bangladeshi cricket stars Shakib Al Hasan, Mashrafe Bin Mortaza, and Mustafizur Rahman. Similar campaigns last year also misused logos of reputable news media outlets.
Distorted Ads and the Use of Deepfake Audio and Video
A deepfake is a digitally fabricated video or audio designed to mimic real visuals or sounds, often making it nearly indistinguishable from genuine content. This technology replicates human faces, voices, and gestures with striking accuracy, making it a powerful tool for deception.
The gambling ads targeting Bangladesh make extensive use of deepfake technology. They manipulate older videos of Dr. Yunus and pair them with deepfake audio that mimics his voice with remarkable similarity. In one ad, he is falsely heard saying, “We’re here to help. Just create an account on Banger Casino, and we guarantee to send you compensation amounting to one lakh taka.” Verification revealed the ad was fabricated by distorting and editing footage from a special interview with Dr. Yunus.
In another video, Dr. Yunus is falsely portrayed as saying, “Hello friends, we’ve developed an application in collaboration with Nahid Islam to help those facing unemployment and other challenges. By playing for two to three hours a day, I assure you’ll earn at least 50,000 taka a month.” Dismislab verified the video and found the deepfake was created using a speech Dr. Yunus delivered to the media on August 7 upon his return to Bangladesh.
In another video, the Chief Adviser is heard claiming, “An app called Crazy Time has been created for those affected by the lack of work due to the agitation in Bangladesh. A 1,000 percent bonus will be given on this app.” This deepfake video was traced back to a message Dr. Yunus delivered on August 8, which was edited and manipulated for the advertisement.
In a separate ad, the speech of the government’s Information Technology Adviser, Nahid Islam, was distorted. In this video, he is falsely heard saying, “We have been waiting for this application for a long time. Now all students are earning by playing Mohammad Yunus’ application.” Verification revealed that the original footage was older, and Nahid’s genuine statement had been manipulated to create an entirely false narrative.
Some advertisements exclusively feature Dr. Yunus speaking. Dismislab identified four such ads where different footage of Dr. Yunus was used, yet the language and script remained identical. Interestingly, only one video contained reasonably proper Bengali. The others displayed signs of poor sentence structure and unclear language, suggesting they were translated into Bengali using automated translation tools.
Frame-by-frame analysis often uncovers inconsistencies that are helpful in detecting deepfake videos, and the advertisements in question are no exception. For instance, in some frames, the lip movements do not perfectly match the spoken words. Inconsistencies between eye blinks, head nods, and typical speech gestures are also evident. In some frames, even parts of Dr. Yunus’s mouth that should be moving during speech appear blurred or distorted.
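For readers who want a sense of what frame-by-frame inspection involves in practice, the following is a minimal Python sketch, not the tool used for this investigation. It samples frames from a suspect video with OpenCV so a reviewer can step through them and compare lip movement, blinks, and blur around the mouth against the audio; the file name is a placeholder.

```python
# Minimal sketch: sample frames from a suspect video for manual review.
# Assumes OpenCV is installed (pip install opencv-python);
# "suspect_ad.mp4" is a hypothetical file name.
import cv2

def extract_frames(video_path, every_n=5, out_prefix="frame"):
    cap = cv2.VideoCapture(video_path)
    saved, index = 0, 0
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        if index % every_n == 0:
            # Save every Nth frame so a reviewer can inspect lip sync,
            # eye blinks, and distortion around the mouth region.
            cv2.imwrite(f"{out_prefix}_{index:05d}.jpg", frame)
            saved += 1
        index += 1
    cap.release()
    return saved

if __name__ == "__main__":
    print(extract_frames("suspect_ad.mp4"))
```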
The Poynter Institute advises focusing on the voice when identifying deepfake audio. Telltale signs include unusual pauses or tonal fluctuations that do not match natural speech patterns. The same inconsistencies are found in the audio of the gambling advertisements that mimicked Yunus.
While the voice closely resembles his, the fluency of the original Bangla speech is noticeably lacking. Certain words and phrases are mispronounced or unclear. For example, in one video, the word “Banya” (flood in Bengali) is incorrectly pronounced as “Banza.” In other instances, there is a lack of proper pauses between sentences, and the Bangla speech is delivered with a Hindi accent, further indicating that the audio was manipulated using AI.
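The kind of pause analysis Poynter describes can also be approximated computationally as an aid to a human listener. The sketch below is illustrative only and was not part of this study’s method: assuming the librosa library is installed and using a placeholder file name, it flags unusually long silences between speech segments, one of the irregularities heard in the manipulated audio.

```python
# Heuristic sketch: flag unusually long pauses in a speech track.
# Assumes librosa is installed (pip install librosa);
# "suspect_audio.wav" is a hypothetical file path.
import librosa

def long_pauses(audio_path, top_db=30, min_gap_s=1.0):
    y, sr = librosa.load(audio_path, sr=None)
    # Non-silent intervals as (start_sample, end_sample) pairs.
    intervals = librosa.effects.split(y, top_db=top_db)
    gaps = []
    for prev, nxt in zip(intervals[:-1], intervals[1:]):
        gap = (nxt[0] - prev[1]) / sr
        if gap >= min_gap_s:
            gaps.append((prev[1] / sr, gap))  # (pause start time, duration)
    return gaps

if __name__ == "__main__":
    for start, duration in long_pauses("suspect_audio.wav"):
        print(f"Pause of {duration:.2f}s starting at {start:.2f}s")
```

Such a script only surfaces candidate anomalies; judging whether a pause or accent is unnatural still requires a listener familiar with the speaker and the language.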
Weaknesses in Meta’s Ad Review System
Meta has specific policies for advertising online gambling or games. These advertisements are only allowed in 34 countries worldwide and require authorization before they are posted. Bangladesh is not on this list. However, every gambling ad analyzed in this Dismislab study was specifically targeted at users in Bangladesh, violating Meta’s policies.
The community standards that apply to all content on Meta are also applicable to advertisements. According to Meta’s Manipulated Media standards, for digitally created or altered content that may mislead, “it may place an informative label on the face of content – or reject content submitted as an advertisement.” The videos of Dr. Yunus, Nahid Islam, and news anchors used in the gambling ads were manipulated, yet they slipped through the net. Such ads continued to appear on the platform even at the time of writing this report.
Meta’s policy prohibits the promotion of ads identified as false or misleading by third-party fact-checkers. Advertisers who repeatedly spread false information are supposed to face restrictions. However, these ads appear to evade scrutiny. For instance, in August, Factwatch, a Bangladeshi third-party fact-checking organization, published a report exposing a manipulated ad video showing Muhammad Yunus campaigning for gambling. The same video recently reappeared as a Facebook ad, highlighting lapses in Meta’s enforcement mechanisms.
On Meta platforms, including Facebook and Instagram, every ad is required to undergo a review process to ensure it meets the platform’s advertising standards. This review process is primarily automated, though human reviewers are involved in certain cases. Despite this system, deepfake gambling ads continue to bypass detection and appear on the platform.
Political Advertising vs General Advertising
Meta’s policies require political ads to be approved in advance, particularly those that endorse a political figure, party, or candidate. If the ad features the name or image of a politician (e.g., “governor,” “MP,” “minister”), a “paid for by” disclaimer must be included. However, Meta’s guidelines also state that if the primary objective of the ad is to sell products or services, the disclaimer may not apply. It is not clear which of these policies was applied to the gambling ads in question that featured political figures.
For example, ads claiming that Sheikh Hasina reduced the amount of winnings from casinos and that Muhammad Yunus repealed the law to increase people’s income were identified by Meta’s review system as political ads. This is apparently because Sheikh Hasina’s picture appeared in the video and the system recognized her as a political figure. However, the same system failed to classify ads featuring Dr. Yunus as political. The remaining eight videos, which showed only Dr. Yunus’s images or footage, were categorized as ordinary, non-political ads in the Meta Ad Library. For instance, a gambling ad from the Nazaré Natividade page, which included false statements attributed to Dr. Yunus, ran for almost two months but was not identified as a political ad.
As part of its transparency policy, Meta stores political ads in the Meta Ad Library, providing open access to information about where an ad was shown and how much money was spent on it. Non-political ads, by contrast, are removed from the library once they are no longer active.
The only video in this study classified as a political ad appeared 132 times across 45 pages between September 5 and November 16. Each video received an average of at least 40,000 impressions. Notably, 62% of these ads were paid for in U.S. dollars, while the remaining payments were made in various currencies, including those of jurisdictions such as Peru (11%), the European Union (8%), Brazil (8%), Taiwan (3%), and Thailand (3%). The majority of the 45 pages posting these ads had administrators located in Vietnam and Ukraine.
Another aspect of Meta’s transparency policy requires advertisers to disclose when an ad about social issues, elections, or politics has been digitally created or altered using third-party AI tools. However, since these gambling ads were not identified as political, they bypassed this requirement, allowing the use of AI or deepfake technology without the mandated disclosure.
Mohammad Pizuar Hossain, a Senior Lecturer in Law at East-West University (EWU), believes that using deepfakes to promote betting apps or financial scams can be extremely dangerous for people. He said, “Yunus has consistently emerged as a symbol of microcredit promotion in Bangladesh. His active involvement with financial institutions, international recognition, and current role in the interim government of Bangladesh increase the likelihood that people will consider the deepfake videos authentic. The people of Bangladesh may face increased vulnerability to such risks, primarily due to the prevailing lack of digital literacy among many mobile users.”
Methodology
For this research, a keyword search for “Yunus” in the Meta Ad Library was conducted weekly from October 20 to November 20. During this period, nine distinct gambling ads and campaigns featuring Dr. Muhammad Yunus were identified. The scripts and narratives of these videos were analyzed, and the original versions of the altered videos were located and verified. Most of these ads were non-political, as Meta did not identify Dr. Yunus as a political figure, and they were removed from the ad library once they became inactive.
To understand the scope of deepfake betting ads and campaigns in recent months, a further search was conducted for inactive ads using the keyword “Yunus,” focusing on the timeframe from August 20 to November 17. Only one deepfake gambling ad featuring Dr. Muhammad Yunus that Meta had identified as political was found. This ad circulated 132 times across 45 unique Facebook pages within that window. Despite being flagged as political, it lacked a disclaimer. The ads were further analyzed to determine the regions and currencies in which they were paid for, and the countries where the page admins were based were identified. The earliest ad was from September 5, and the latest from November 16.
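Readers who wish to replicate this kind of search programmatically can also use Meta’s Ad Library API, which covers ads classified as political or social-issue ads. The sketch below is an illustrative Python example rather than the exact queries used in this study; the Graph API version, the field list, and the ACCESS_TOKEN placeholder are assumptions, and the API requires Meta’s identity verification for access.

```python
# Illustrative sketch: query the Meta Ad Library API for ads mentioning "Yunus".
# Assumptions: a valid ACCESS_TOKEN, Graph API version v18.0, and completed
# identity verification for Ad Library API access.
import requests

ACCESS_TOKEN = "YOUR_ACCESS_TOKEN"  # placeholder
URL = "https://graph.facebook.com/v18.0/ads_archive"

params = {
    "search_terms": "Yunus",
    "ad_reached_countries": '["BD"]',          # ads delivered to Bangladesh
    "ad_type": "POLITICAL_AND_ISSUE_ADS",      # only political/issue ads are archived
    "ad_active_status": "ALL",                 # include inactive ads
    "fields": "page_name,ad_delivery_start_time,ad_delivery_stop_time,currency,spend,impressions",
    "access_token": ACCESS_TOKEN,
}

response = requests.get(URL, params=params, timeout=30)
response.raise_for_status()
for ad in response.json().get("data", []):
    print(ad.get("page_name"), ad.get("ad_delivery_start_time"), ad.get("currency"))
```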