The recent earthquakes gave Bangladesh quite a jolt. Many people have still not recovered from the shock. The panic has been further fuelled by various rumours circulating on social media.
Fact-checking by the country’s platforms over the past week shows that most false or misleading information during this period centred on the earthquake.
On 21 November, a 5.7-magnitude earthquake shook the entire country, including the capital Dhaka. The epicentre was at Madhabdi in Narsingdi, near Dhaka.
Several smaller tremors were felt in the following days. Although these subsequent quakes registered below four on the Richter scale, social media posts clearly showed how anxious people remained.
Amid this, numerous fake and misleading photos and videos were shared on Facebook. Seeing images of collapsed buildings, cracked roads and fallen flyovers, many people believed them to be real.
Rumour Scanner, Dismisslab, Fact-Watch and BanglaFact together verified 113 pieces of misleading content between 22 November and last Friday. Of these, 45 were related to the earthquake, and 24 of those involved misleading AI-generated images or videos.
Rumour Scanner alone carried out 94 fact-checks last week. Dismisslab published 11 reports, Fact-Watch 5, and BanglaFact 3.
After the earthquake, the most widespread trend was the sharing of old images and videos from foreign earthquakes, claiming they were from Bangladesh.
Old footage from Myanmar, Nepal, Indonesia and other countries was used to claim that buildings had collapsed or roads had cracked in Bangladesh, even though no buildings collapsed in the country.
In one case, an old photo of an accident in Chattogram and a picture of a culvert damaged by heavy rain last August were widely circulated as images of a road damaged by the 21 November earthquake.
Photos of the Rafin Plaza building in Dhaka’s New Market and the minaret of a mosque in Bashundhara were also shared with misleading narratives.
Media outlets in Pakistan and India, as well as Al Arabiya English TV, used the New Market building photo. The curved architectural design of the building and minaret was presented as structural damage from the earthquake.
Old photos of collapsed buildings from Turkey and several other countries were also circulated as earthquake damage in Bangladesh.
AI-generated videos were used to spread false claims about flyover collapses. A deepfake video went viral claiming that a flyover in Dhaka’s Diabari area had collapsed. The same video was also circulated as an incident in Ashulia.
Several Indian media outlets ran the deepfake video. Fact-checking confirmed that the video was entirely AI-generated and that no such flyover collapse occurred anywhere in Dhaka.
In total, the fact-checking platforms identified 38 AI-generated images and videos related to the earthquake and other issues.
Apart from the earthquake, the platforms also identified 21 fake photo cards over the past week, covering political events, election campaigns, fabricated quotes attributed to international figures and misleading content involving celebrities.
Miraj Ahmed Chowdhury, managing director of the digital rights and information research organisation Digitally Right, told Prothom Alo that natural disasters easily stir people’s emotions and curiosity, and that this emotional vulnerability is exploited through the spread of misinformation and disinformation.
Emphasising the need for caution, Miraj Ahmed Chowdhury said that everyone must be more vigilant and responsible when sharing information on social media, especially regarding natural disasters.