Awami League's coordinated bot network on Facebook exposed

The same profile picture was used for 8 different Facebook accounts. Researchers identified these accounts as fake. Courtesy: Dismislab

Before and after the 12th Jatiya Sangsad election, fake Facebook profiles were used on behalf of the Awami League to post comments on the Facebook pages of various media outlets and opposition parties.

These fake Facebook profiles were operated by means of a bot network. The research unearthed a bot network of 1,369 Facebook profiles.

Dismislab carried out this study on the impact of technology on the information system. The research report was published today, Thursday.

The study reveals that these Facebook profiles posted over 21,000 coordinated comments on behalf of the Awami League across 197 posts. Identical comments were made on various posts, and the same comments were posted from different profiles.

The actual number of bot accounts and comments from these accounts may be even higher, as this study was based on information from only 197 posts.

A bot network is essentially an automated system run by specialised software (bots). Its operations are almost entirely automated: the bots select specific keywords from a list to automatically target posts, and draw on another list to post comments. In this process there is hardly any human monitoring of what is being posted or where.
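As a rough illustration of how such a pipeline could operate (a hypothetical sketch, not code recovered from the network), the logic amounts to matching posts against a keyword list and replying with a randomly chosen pre-written comment. The function post_comment below is a stand-in for whatever automation interface the operators actually used.

```python
# Hypothetical sketch of a keyword-driven comment bot; not code recovered
# from the network described in this article.
import random

KEYWORDS = ["election", "EC", "BNP"]           # illustrative keyword list
CANNED_COMMENTS = [                            # illustrative pre-written comments
    "The election will be transparent.",
    "People will be able to vote freely in the upcoming election.",
]

def pick_targets(posts):
    """Keep any post whose text contains a keyword -- no human review."""
    return [p for p in posts
            if any(k.lower() in p["text"].lower() for k in KEYWORDS)]

def run_bot(posts, post_comment):
    """Reply to every matched post with a randomly chosen canned comment."""
    for post in pick_targets(posts):
        post_comment(post["id"], random.choice(CANNED_COMMENTS))
```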

According to the study, the bot network was activated just before the 12th Jatiya Sangsad (national parliament) election of 7 January. Until June this year, the network repeatedly posted 474 similar political comments. The comments were created before the election, but the same network continued to post them afterwards. Its targets were the Facebook pages of the country's established media outlets and of BNP, the opposition political party at the time.

The study found a similar pattern among these fake profiles in the pages they 'liked' and followed. It was seen that 70 per cent of the 1,122 accounts whose profiles were not locked followed one or both of two particular pages -- Banglar Khabar and Awami League Media Cell.

The Awami League government was toppled on 5 August in the student-people's uprising. Since then, the Awami League Media Cell page can no longer be found.

Finding the bot network

On 21 June, a report titled 'Is there an Invisible Code on Every Page Printed by a Colour Printer?' was published on the online news portal bdnews24.com and on its Facebook page. In both instances, comments immediately appeared below the post, stating, "The election will be transparent. People will be able to vote freely in the upcoming election. BNP is scared to participate because they can't rig the vote this time."

It was six months after the election, yet users were still criticising BNP, expressing their expectations for a successful and fair election, and demanding that the election be held under the government in power, with one such political comment appearing after another.

Delving into the reason behind such comments, Dismislab found the existence of a bot network of 1,369 Facebook accounts. In a matter of just six months, this network posted over 21,000 coordinated comments on behalf of the Awami League on 197 posts on the Facebook pages of various news media outlets and opposition political parties.

For this research, Dismislab collected all the comments from that Facebook post of bdnews24.com. Using Google Search, it found that the same comment had been made on 196 more posts. From the comments made on those posts, a database of 35,000 comments was created, and from this database over 21,000 political comments made by the bot accounts were separated out.
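A simple way to separate such coordinated comments, shown here purely as an illustration and not as Dismislab's actual methodology, is to group identical comment texts and keep those repeated by many distinct accounts across many posts:

```python
# Illustrative only, not Dismislab's methodology: group identical comment
# texts and keep those posted by many distinct accounts on many posts.
from collections import defaultdict

def find_coordinated(comments, min_accounts=5, min_posts=3):
    """comments: iterable of dicts with 'text', 'account' and 'post_id' keys."""
    groups = defaultdict(lambda: {"accounts": set(), "posts": set()})
    for c in comments:
        key = " ".join(c["text"].split())      # normalise whitespace
        groups[key]["accounts"].add(c["account"])
        groups[key]["posts"].add(c["post_id"])
    return {text: g for text, g in groups.items()
            if len(g["accounts"]) >= min_accounts and len(g["posts"]) >= min_posts}

# The thresholds are arbitrary; in the study, single comments were reportedly
# repeated by over a hundred accounts across nearly a hundred posts.
```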

The study revealed that the bot network generally made comments on political posts. When the bots spotted certain political keywords in Facebook posts, they leapt in. But they slipped up at times. For example, they were tripped up by the word EC (Election Commission). The report on colour printers mentioned the Machine Identification Code (MIC), whose Bangla transliteration contains the letters 'EC'. That mention was enough to set the bot network off, and it rolled out a few hundred comments.

How the profiles are fake

There are certain indicators for verifying whether a Facebook profile is genuine or fake. A bot profile maintains extra secrecy and provides no information in the 'about' section.

It either has very few posts or an extremely large number of them. The profile picture is collected from the internet, and the number of friends is usually low.

Dismislab studied the details of the 1,369 profiles, including their privacy settings, number of friends, posting patterns, personal information and profile pictures. The profiles were activated before the election. Either there was no profile picture or the pictures were stolen. They had very few friends or none at all, and most of the accounts followed two specific pages. These characteristics match the common definition of political bots.

A total of 247 of the profiles were locked. The profile pictures of 70 per cent of the remaining 1,122 accounts had been collected from the internet, that is, the pictures belonged to other people. In some instances, the same picture was used for several different profiles.
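Reused profile pictures of this kind can be spotted in a straightforward way, for instance by hashing the downloaded image files and flagging any hash shared by more than one profile. The sketch below is purely illustrative, not Dismislab's method; it assumes one saved image file per profile, and exact byte hashes only catch identical copies (perceptual hashing would also catch resized or re-encoded ones).

```python
# Illustrative only: flag profile pictures reused across accounts by hashing
# the downloaded image files (assumes one saved .jpg per profile).
import hashlib
from collections import defaultdict
from pathlib import Path

def reused_pictures(picture_dir):
    """Return {image hash: [profile picture filenames]} for every reused image."""
    groups = defaultdict(list)
    for path in Path(picture_dir).glob("*.jpg"):
        digest = hashlib.sha256(path.read_bytes()).hexdigest()
        groups[digest].append(path.name)
    return {h: names for h, names in groups.items() if len(names) > 1}
```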

The study showed that 77 per cent of the 1,369 profiles were in the names of women. These names, too, were all similar, with 24 per cent of them ending in Akhter, like Diya Akhter, Riya Akhter, Liza Akhter, Lima Akhter, Lisa Akhter, etc.

Male profiles frequently used the surname 'Ahmed', such as Nayeem Ahmed, Nader Ahmed, Kamal Ahmed, Maheen Ahmed, Samir Ahmed, etc. And 90 per cent of the names, male and female, consisted of two words. Sometimes a single name was broken into two, such as Ri Pa, Mi Na, Li Za, Zu Thi, Lam Ya, Mu Na, Jos Na, etc.
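Naming statistics of this kind can be computed with a few lines of code. The sketch below is only an illustration of the idea, using made-up example names rather than the study's data.

```python
# Illustrative only: compute the share of names ending in a given surname
# and the share made up of exactly two words.
def name_pattern_stats(names, surname="Akhter"):
    total = len(names)
    ends_with = sum(1 for n in names if n.split()[-1] == surname)
    two_word = sum(1 for n in names if len(n.split()) == 2)
    return {f"ending in {surname}": round(100 * ends_with / total, 1),
            "two-word names": round(100 * two_word / total, 1)}

# Example with made-up names:
# name_pattern_stats(["Diya Akhter", "Riya Akhter", "Ri Pa", "Nayeem Ahmed"])
# -> {'ending in Akhter': 50.0, 'two-word names': 100.0}
```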

Accounts activated before the election

The research considered the date when a bot profile first posted any content as the time the account was activated. The 12th national parliament election took place on 7 January, and the fake profiles became active mostly between June and November last year.

Half the fake profiles were activated between September and November 2023. In just one week, from 24 to 30 November, 344 accounts posted for the first time. Earlier, from 2 to 12 September, 240 profiles were activated.

The research identified 21,221 political comments, of which 474 were unique, repeatedly posted across different pages.

The analysis clearly shows interconnection among the bots. For example, a profile named Riya Akhter made 138 comments on different posts over the last six months. Long after the election, on 18 May, one of her comments was, "Bangladesh is an independent sovereign country. An independent Election Commission (EC) has been formed here by law. The upcoming national election will be neutral, I hope." This comment was posted on 96 posts by 109 bot accounts, including Diya Akhter, Raisa, Rafia Akhter, Nahid and Nipa.

The bot network mainly targeted 42 Facebook pages, primarily those of various media outlets and political opponents of the Awami League. Of these, 68 per cent were top-tier, well-known media pages and 31 per cent were BNP or BNP-affiliated pages. The network also targeted the page of the political party Ganashanghati Samity.

Content analysis shows that 86 per cent of the 474 unique bot comments were criticism of BNP and its leaders. For example, "BNP is a terrorist organisation. They should be punished." Or, "BNP is conspiring to plunder Bangladesh by embezzling and smuggling money and creating Hawa Bhaban." The remaining 14 per cent of the comments comprised praise for the government and calls for peaceful or fair elections.

Bot blunders

The election was held on 7 January. However, bot comments posted in February, March, April, May or June continued to use phrases like "the upcoming 12th Jatiya Sangsad election" or "as the time for the 12th national parliamentary election approaches." This shows that the comments were created before the election but were posted randomly even afterwards.

In various other blunders, the bots posted political comments on irrelevant news items. Somewhere or other, these news items contained the letters 'EC' in their Bangla transliteration, in words such as ICU, ICC, Raisi, ICT, dialysis and crisis.

"EC" is an abbreviation for the Election Commission. The bot network jumped on any post containing the term “EC,” whether relevant or not, and posted coordinated political comments there. This suggests that the network used computational tools based on keyword detection to select which posts to comment on.

Organised propaganda

Research has shown that the proliferation of bot networks can have a profound impact on democratic processes and the integrity of information. Such networks are capable of distorting public discourse by amplifying certain viewpoints while suppressing others. This manipulation can create a false sense of consensus, polarize public opinion, and ultimately undermine trust in democratic institutions.

Naeemul Hassan, an Assistant Professor at the Philip Merrill College of Journalism and Information Studies at the University of Maryland, noted, “In the future, these technologies will become even more advanced with the use of artificial intelligence (AI). For example, it will no longer be necessary to use the same profile picture across multiple fake accounts, as AI will be able to create many profile pictures instantly, making it difficult to identify coordinated misinformation campaigns. Therefore, it is essential to raise public awareness about these issues from now on.”
