
YouTube fuels fake talk show boom as real voices lose ground
At first glance, the video was extraordinary: Khaled Muhiuddin, one of Bangladesh’s most prominent talk show hosts, appearing to moderate a conversation between the country’s two rival former prime ministers, Sheikh Hasina and Khaleda Zia. Facing the camera directly, the two women joined the virtual talk show from two locations, as Muhiuddin invited Hasina to reflect on the popular movement that had forced her from power in August 2024. But it quickly became clear the video was fake. Hasina’s voice drifted out of sync with her image; Khaleda’s movements slowed unnaturally, as if the footage had been warped or stretched; the host’s gestures repeated, caught in a visible loop. At that point, any viewer would realize that the talk show clip had been doctored and pieced together from different sources to create a conversation that never took place.
Yet, by late April 2025, the video had reached over 200,000 views on Facebook. It wasn’t uploaded by the original creator. Instead, a user shared it with a caption accusing the host of showing undue respect to a “disgraced” former prime minister. The clip bore a logo labeled “Talk Show First News,” which a reverse image search linked to a YouTube channel of the same name. There, the video had gained another 135,000 views. That channel hosted several other clips, all altered in a similar manner.

A search for “Khaled Muhiuddin talk show” on YouTube led to a flood of lookalike videos. Dismislab flagged about 50 channels, and 29 of them had posted at least one doctored talk show clip within a single week, March 25 to 31. Researchers reviewed 288 videos in total. Most relied on the same tactic: slicing real interviews out of context to manufacture moments that never happened. These videos garnered an average of 12,000 views each.
Dismislab found that 20 of the 29 channels had been created after Bangladesh’s political transition in August 2024; the rest had been created between 2021 and 2023. About 90% of the videos carried ads during the investigation, suggesting a clear profit motive. The clips often featured clickbait headlines and misleading thumbnails and distorted the context of what the guests had actually said.
All of these practices violated multiple YouTube community guidelines, including rules against deceptive practices, impersonation, and copyright infringement. Among the videos reviewed, researchers found that at least 58 political figures, journalists, and talk show hosts had their images, footage, or statements manipulated. One commentator said the fake clips damaged his credibility, diverted his audience, and reduced his income from the original content he created.
Experts said fake political talk shows often blend financial and ideological motives, and warned that prolonged political transitions can fuel anti-incumbency sentiments, creating openings for opportunistic groups to exploit.
Stolen and stitched
According to Dismislab’s analysis, all 288 videos had been lifted from other sources, with backgrounds altered, footage cropped or zoomed, and original contexts distorted. Most stitched together clips from YouTube, television shows, Facebook Lives, and stolen audio recordings. Though stitching is a common editing technique, here it was used to create misleading narratives. In many cases, voice-overs from unrelated contexts were layered onto footage of targeted individuals to make the scenes appear authentic. Sometimes, a discussant hadn’t said a word on the claimed topic, yet headlines suggested otherwise, misleading thousands of viewers.
A video posted on March 27, 2025, carried the headline: “Dr. Yunus resigned immediately after visiting China,” with a thumbnail bearing the logo of Jamuna TV, a major Bangladeshi news channel. Jamuna later confirmed to Dismislab that the logo had been used without permission. The video stitched together footage of talk show host Khaled Muhiuddin and Golam Maula Rony, a businessman and politician with 1.23 million YouTube followers. Khaled appeared as a dummy host, while Rony’s footage, lifted from his own YouTube channel, was used to imply a resignation that had never happened. In reality, Rony had discussed the diplomatic implications of Yunus’s China trip.

Another clip showed Hasnat Abdullah, a leader of the National Citizen Party (NCP), speaking about police and administrative issues after the July movement. The headline, however, claimed: “Hasnat reveals sensational information about the army live!”—even though the military was never mentioned.
Masood Kamal, a journalist and political analyst, said he regularly falls victim to such cheapfakes. “People often think these fake channels belong to me, and it’s extremely embarrassing,” he said. Kamal’s real talk shows are frequently stolen and twisted. “In our country, viewers rarely check a video’s source,” he added.
In one case, Kamal’s criticism of NCP leader Sarjis Alam was stolen and republished under the title: “What Masood Kamal said about Sarjis Alam’s second marriage”—even though he had never mentioned Alam’s personal life. “If they had just reshared my real content, it would have been tolerable,” Kamal said. “But twisting my words for money—that’s alarming.”

The manipulated videos circulated widely across platforms, especially Facebook, often amplified in politically motivated ways regardless of the original YouTube channel’s affiliation. Both Awami League (1, 2) and interim government (1, 2) supporters shared the content through Facebook accounts and groups aligned with their interests. In several cases (1, 2), the same video was picked up and boosted by different political actors, depending on how well it could be spun to serve their narrative.
Celebrity faces used to capture attention
While content theft has long been an issue, fake talk show creators are now weaponizing the visual authority of trusted hosts and media personalities, including Khaled Muhiuddin and Masood Kamal, to mislead audiences.
Together, Muhiuddin’s “Thikanay Khaled Muhiuddin” and Kamal’s “Kotha” command more than a million followers, a likely reason why they have become frequent targets. Other popular hosts, like Sharmin Chowdhury and Nobonita Chowdhury, were also regularly misused, despite having smaller but loyal audiences.
The problem extended beyond talk show hosts. Influencers and activists such as Elias Hossain, Pinaki Bhattacharya, Jacob Milton, Nayeem Elli, and Nijhoom Majumder also had their solo Facebook Lives or YouTube sessions doctored to suggest staged debates that never happened. All of them have large social media followings.
In group settings, participants like Advocate Fazlur Rahman, a senior BNP advisor, and ZI Khan Panna, a veteran Supreme Court lawyer, were also manipulated. Channels stripped out real moderators and inserted celebrity hosts like Muhiuddin to lend fake conversations an air of authenticity.

Some videos went even further, inserting footage of public figures to create false confrontations. BNP acting chairman Tarique Rahman and student leader-turned-politician Md Sarjis Alam, for instance, were depicted smiling during scenes where speakers criticized them—creating the illusion of a live, face-to-face exchange.
Speaking to Dismislab, Muhiuddin said he was aware of the manipulated videos and urged viewers to trust only the content posted on his verified YouTube channel. “Viewers should not trust videos of me from other sources,” he said.
Masood Kamal described deeper damage. “This hurts my credibility as a journalist—and there’s no easy way to recover from that,” he said. “It’s not just about credibility; it’s financial, too. When people search for ‘Masood Kamal’, they often find the fake channels first. I’m losing viewers—and with them, revenue.”
Politically targeted, financially motivated
Dismislab analyzed each video’s title, thumbnail, description, and full content to assess its political narrative, categorizing the clips by major parties, the government, and the army. Findings show that 40% of the 288 videos attacked the interim government led by Dr. Muhammad Yunus, while 20% targeted the student-led National Citizen Party (NCP). The two major parties—BNP and AL—were each targeted in about 10% of the videos.
Qadruddin Shishir, the former fact-checking editor at AFP’s Bangladesh bureau, said that some content creators might be exploiting anti-incumbency sentiment against the interim government for profit, producing more videos critical of those in power to maximize earnings. But the possibility of political coordination, he cautioned, cannot be ruled out: “It’s also possible that a political group is deliberately amplifying anti-interim government narratives, not just for financial gain, but for political advantage as well.”
To check whether videos were monetized, Dismislab examined their source code, looking for the “yt_ad” string. While this method reliably shows if ads were active at the time of review, it does not confirm whether the uploader was part of YouTube’s Partner Program or received ad revenue. The analysis found that nearly 90% of the videos carried ads during the investigation.
Every click and view feeds YouTube’s advertising machine. Since YouTube earns the bulk of its revenue through ads, engaging misinformation becomes profitable—not just for creators, but for the platform itself. Under the YouTube Partner Program, creators typically keep about 55% of ad revenue, while YouTube retains 45%. In 2024, YouTube generated roughly $36.2 billion from advertising, with a record $10.5 billion earned in the fourth quarter alone.

Cheapfakes—videos created with basic editing tricks—have flooded social media platforms across the Global South. According to Context, a Thomson Reuters Foundation initiative, these cheapfakes are especially damaging in countries with lower digital literacy and weaker media ecosystems. During Bangladesh’s 2024 election, nearly half of the detected misinformation involved cheapfake videos.
YouTube community guidelines breached
Stolen and stitched videos violate YouTube policies, including its Misleading Metadata or Thumbnails guideline. Under its Spam and Deceptive Practices policy, the platform warns against using titles, thumbnails, or descriptions that mislead viewers about what they will actually see in a video.
These videos also appear to breach YouTube’s impersonation policy, which prohibits using “someone else’s real name, username, image, brand, logo, or other personal information to trick people into believing you are that person.” They may also violate copyright rules by repurposing footage from journalists, talk show hosts, and creators without permission.
Digital media experts argue that platforms like YouTube, despite having policies against manipulated content, continue to let these videos thrive. They call it “engagement capitalism,” where virality wins while quality journalism loses.
Ananta Yusuf, Editor of Star Multimedia at The Daily Star, a leading English newspaper in Bangladesh, explained how these manipulations easily slip through platform detection systems.
“Manipulations happen across multiple layers—thumbnails, visuals, voiceovers,” Yusuf said. “Cropping or zooming slightly on stolen footage can easily trick YouTube’s detection system.”
YouTube’s failure to take action against such content has long been a concern. In an open letter to the platform in 2022, the International Fact-Checking Network argued that YouTube was allowing its platform to be weaponized by unscrupulous actors to manipulate and exploit others, and to organize and fundraise themselves. In 2021, an investigation by Mozilla found that YouTube’s algorithm recommends videos that violate the platform’s own policies.
Methodology
After encountering a few cases—including two fake Khaled Muhiuddin talk show videos cited in recent fact-check reports—Dismislab searched the term “Khaled Muhiuddin talk show” on YouTube, uncovering a surge of similar manipulated videos. Around 50 channels were initially flagged for posting stitched or artificially combined political content. Narrowing the focus to a single week of uploads from March 25 to 31, researchers identified 29 channels that had published at least one manipulated video during that period. In total, 288 videos were reviewed.
Each video was manually assessed for signs of manipulation, including stitched footage, AI-generated elements, misleading headlines, or altered backgrounds.
Researchers also examined each video’s source code to check for the presence of the “yt_ad” string, indicating whether the video was showing YouTube ads. Original source materials were traced to confirm whether footage had been re-edited, taken out of context, or fabricated.
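For readers who want to approximate this check, the sketch below shows one minimal way to do it in Python. The function name, the use of urllib, and the simple substring test are assumptions for illustration, not Dismislab’s actual tooling. It fetches a watch page and reports whether the HTML source contains the “yt_ad” string, with the same caveat that the marker only indicates ads were being served when the page was fetched.

```python
import urllib.request


def watch_page_has_yt_ad(video_url: str) -> bool:
    """Download a YouTube watch page and report whether its HTML source
    contains the "yt_ad" string. The marker is read here only as "ads were
    active when the page was fetched"; it does not show whether the uploader
    is in the Partner Program or receives any revenue."""
    request = urllib.request.Request(
        video_url,
        # A browser-like User-Agent helps retrieve the full page source.
        headers={"User-Agent": "Mozilla/5.0"},
    )
    with urllib.request.urlopen(request, timeout=30) as response:
        html = response.read().decode("utf-8", errors="ignore")
    return "yt_ad" in html


if __name__ == "__main__":
    # Hypothetical placeholder; substitute a real watch-page URL.
    print(watch_page_has_yt_ad("https://www.youtube.com/watch?v=VIDEO_ID"))
```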
Videos were further categorized according to the political entities they targeted—including the Awami League, Bangladesh Nationalist Party (BNP), National Citizen Party (NCP), Bangladesh Jamaat-e-Islami, the interim government, or the army. Videos that did not consistently target a specific group were classified as ambiguous.