Who is destroying the reputation of Ukrainian charitable foundations: hybrid warfare and information attacks

6 January 2025 22 minutes Author: Cyber Witcher

You will learn how information attacks on Ukrainian charitable foundations that help the Armed Forces of Ukraine work, who is behind these campaigns, and the methods by which fake news is spread. The article discusses the serious impact of such disinformation on citizens' trust and on the volume of donations, and emphasizes the importance of a critical approach to incoming information. The analysis of the sources and methods of the attacks is unique; in particular, it cites data on a Russian footprint in 36% of negative publications.

Of the 379 recorded cases of attacks on charitable foundations, 36% are of Russian origin

From the beginning of the full-scale invasion until October 2024, 379 negative publications and posts aimed at discrediting five large charitable foundations were detected. All these attacks were classified by the main types of theses, and possible connections between the initiators of the attacks and various political or social structures were also identified.

At first glance, 379 attacks over such a long period may seem like a small number, but some publications may have disappeared after being blocked or removed by the platforms themselves, such as Facebook or X (formerly Twitter), whether by moderators or in response to user complaints. Data collection covered only those sources that were available and indexed at the time of analysis.

Additionally, it should be noted that some social networks, such as Facebook, have their own content indexing system. This means that individual posts may not appear in a general search, but may be available when searching directly on the author’s page. This situation may be related to internal content ranking algorithms.

The distribution of attacks by affiliation with certain structures was made on the basis of the analyzed data. It is important to emphasize that conclusions about possible connections with certain groups are likely assumptions, although in some cases it was possible to find concrete evidence of participation in campaigns to discredit foundations.

Affiliation with a structure means that the authors of the attacks published posts in support of a particular organization or person. Conclusions about probable affiliation are drawn from a sufficient number of sources indicating a connection between the content authors and the specified structures. In some cases, such connections were indirect.

As the results of the analysis show, among the probable initiators of the attacks, in addition to representatives of Russian structures, there are possible supporters of certain public figures and political forces. In particular, these may be probable supporters of the “European Solidarity” and “Servant of the People” parties, individuals associated with Rinat Akhmetov, as well as Oleksiy Arestovych.

A collage from a screenshot of Oleksiy Arestovych's Facebook post. This post appears to be an attack on the foundation's reputation. The main goal of this attack is likely the author's personal PR, built on the conventional narrative of “blackmailing the military”. The post was distributed by media outlets such as Informator and Ukrainian News.

Russian structures mentioned as potential customers of the attacks include both official Russian state bodies (the FSB, GRU, and others) and propagandists or accounts that openly support Russian policy. For example, in 2021 the SBU discovered a network of Telegram channels controlled by the Russian GRU. After that, the Kyivskyi District Court of Kharkiv ruled to seize the intellectual property rights of four channels – “Legitimny”, “Resident”, “Cartel” and “Spletnitsa” – and ordered access to them blocked. However, providers were technically unable to block these resources completely.

If it was not possible to accurately determine the affiliation of the authors of the attacks to certain structures, or when the channel or account did not demonstrate clear support for a specific structure and at the same time did not spread pro-Russian narratives, such attacks were assigned the category “not established”. A probable affiliation to a particular structure was noted only if there was a sufficient number of confirming sources. In some cases, such sources indicated an indirect connection to the relevant structure.

34 million negative comments published by a Russian botnet in just 4 months, carrying out attacks under a false flag

This spring, the German media outlets Süddeutsche Zeitung, NDR and WDR, together with the Estonian online publication Delfi, investigated the activities of the Russian “Agency for Social Design” (ASD) based on leaked data. This organization specializes in creating and managing a network of social media bots that spread pro-Russian narratives on behalf of citizens of the countries targeted by the discrediting campaigns. There are reasons to believe that ASD works directly for the Russian presidential administration. The bots not only promote openly pro-Russian messages but also play on topics that trouble society: mobilization, losses at the front, restrictions on freedoms under martial law, corruption, reductions in military aid, and counteroffensives.

To sow discord among Ukrainians, the bots actively use the opposition of various public figures, in particular President Volodymyr Zelensky, his predecessor Petro Poroshenko, Kyiv Mayor Vitali Klitschko and former Commander-in-Chief of the Armed Forces of Ukraine Valeriy Zaluzhny. From January to April 2024 alone, bots posted about 34 million negative comments on social media under user posts.

Additionally, it is known that the former head of the Wagner PMC, Yevgeny Prigozhin, has financed and controlled the Internet Research Agency since 2014, known for its propaganda activities. In particular, information support for the annexation of Crimea and the start of the war in Donbas was largely carried out through a network of bots overseen by Prigozhin.

In the first weeks after the start of the full-scale invasion, Twitter blocked 100 accounts that were spreading fakes under the hashtag #istandwithputin and other Russian propaganda messages. By August 2022, the number of blocked accounts had increased to 75 thousand. In May 2024, the SBU exposed 46 bot farms that were controlled from Russia, had over 3 million accounts and could influence an audience of up to 12 million people.

Special attention is paid to social networks in attacks, as their popularity among Ukrainians has increased significantly: in 2015, 51% of people received news through social networks, and in 2024 this figure reached 84%. The main platform for the spread of fakes and disinformation is Telegram.

*Other sources — mainly news sites

In 52% of the analyzed cases, attacks occurred on Telegram, as this messenger has become one of the key sources of news for Ukrainians. According to USAID, 72% of Ukrainian citizens receive information through Telegram. At the same time, some experts consider it the main platform for the mass distribution of fakes, as the messenger does not implement international standards for checking the quality and authenticity of content distributed by channels and users.

If we analyze the alleged supporters of the structures and map the attacks onto the platforms on which they were carried out, we can trace the epicenters of concentration. For example, Russian structures mainly attack the foundations on Telegram.

These are mostly information or news resources.

By structural affiliation, we mean that the authors of the attacks published certain content that supports a specific organization or individual. The stated affiliation is based on a sufficient number of sources confirming the corresponding connection. In some cases, such connections were indirect in nature.

Waste, embezzlement, PR in war: what narratives are spread during attacks?

As noted earlier, attacks on charitable foundations occur against the backdrop of public sentiment and often use topics that cause heated discussions in society, especially on the Internet. The analysis showed that in the case of all five charitable foundations, one key pattern is observed – attacks aimed at sowing doubts about the integrity of the foundations and the intended use of funds. The main narratives of these attacks can be reduced to several types:

– attacks on individuals associated with a foundation, in particular its founders, leaders, or people connected with its activities. The goal of such attacks is to undermine trust in these individuals and reduce their authority. Real events are often manipulated and supplemented with fictional or provocative statements. Examples include the story of threats to United24 ambassador Timothy Snyder, or attempts to discredit Taras Chmut, head of the “Come Back Alive” foundation.

– accusations of embezzlement or misuse of funds. Examples include questions to the Serhiy Prytula Foundation about a satellite for which a high-profile fundraiser was announced, or comments addressed to Tymofiy Mylovanov about the opacity of the KSE Foundation's financial operations.

– undermining the reputation of foundations and casting doubt on their activities. This type of attack is systematic and continues throughout the entire period of the foundations' operation, not only during large-scale fundraisers or high-profile events. Media research confirms that such attacks are often used by Russian propaganda to spread fake news and reduce citizens' trust in charitable organizations. In some cases, the reason for the attacks could be competition between foundations or differences in political views. An example is the attack on the Petro Poroshenko Foundation over accusations of PR during the war.

Three main key narratives were identified that often become the basis for discrediting campaigns against foundations.

*several types of attacks could be used in one attack, for example, accusations against Fedorov and Zelensky at the same time and slogans related to PR or money laundering.

As we can see, the main goal of the attacks is to discredit the foundations by any available means so that Ukrainians and businesses stop donating to them and the foundations can no longer send as much aid to the front.

More detailed analytics of attacks on the foundations

“Come Back Alive”

“Come Back Alive” is a charitable foundation that has been supporting the Ukrainian army since 2014. The director of the foundation is Taras Chmut. As of 2024, the foundation runs military and veteran projects and, according to Forbes and its official website, has raised 12 billion hryvnias to support the defense forces of Ukraine.

The results of the analysis show that the “Come Back Alive” foundation is attacked by individuals and accounts that are likely associated with Russian structures or the “European Solidarity” party.

The study of attacks on the foundation analyzed publications on social networks and in the media and identified 68 cases of attacks, of which 34 were carried out by real people or media, 32 by unidentified sources, and 2 publications were probably made by bots.

The majority of attacks (39 cases) were aimed at discrediting Taras Chmut personally, in particular over his public statements about the defense-industrial complex, as well as accusations of ties to the authorities or of incompetence. In 14 cases, attacks targeted the foundation and Chmut simultaneously, with the most common narrative being accusations of embezzlement. The foundation itself was targeted separately in 15 attacks, mainly focused on its alleged financing of the territorial centres of recruitment and social support (TCC).
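The per-target shares quoted for this foundation follow directly from these counts. A quick arithmetic check (labels are illustrative; the counts are taken from the text above):

```python
# Attack counts for this foundation, as given in the text: 68 cases in total.
counts = {
    "Chmut personally": 39,
    "foundation and Chmut together": 14,
    "foundation only": 15,
}
total = sum(counts.values())  # 68 recorded cases

for target, n in counts.items():
    share = 100 * n / total
    print(f"{target}: {n}/{total} = {share:.1f}%")
# "foundation only" works out to 22.1% and "Chmut personally" to 57.4%,
# matching the shares cited in the analysis.
```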

Examples of three attacks on the foundation under the most common narratives and slogans: personal discrediting of Chmut, embezzlement of money, and conflict of interest (claims that the “Come Back Alive” foundation finances the TCC).

Screenshot of an example of an Instagram “stealing” attack
Screenshot with an example of a Twitter attack about “TCC funding”
Screenshot with an example of a Twitter attack to discredit Taras Chmut

Only 22.1% of attacks on social media were directed exclusively at the foundation, while the majority (57.4%) concerned Taras Chmut personally. The intensity of attacks usually increased after his public statements about the army or after important events, such as Chmut’s appointment as a member of the supervisory board of the Defense Procurement Agency.

The “Come Back Alive” foundation notes that once a post begins to gain popularity, a wave of attacks almost always follows: mass complaints about the content, bot attacks in the comments, or even temporary removal of posts by automated platform algorithms. Such activity also increases after interviews with foundation representatives or announcements of new fundraisers. “Come Back Alive” emphasizes that such attacks are predictable and follow a pattern.

In addition, the foundation reports a trend of creating fake personal accounts of foundation employees on Instagram and Facebook. Attackers try to raise funds under the guise of these accounts. Thanks to the monitoring system in place, the team is able to quickly detect and block fake activity.

The foundation believes that such attacks are aimed at undermining trust in the organization and its team, but emphasizes that this does not affect their activities. The active community that supports the foundation often independently detects bots and helps fight disinformation.

Serhiy Prytula Foundation

The Serhiy Prytula Foundation is a charitable organization that supports the army and provides humanitarian aid. It was founded in 2020 by Serhiy Prytula, and its director is Andrey Shuvalov.

The analysis shows that the foundation is attacked by individuals and bots that are likely connected with Russian structures, the “European Solidarity” party, or unidentified structures. These attacks usually do not occur during active fundraising or project implementation but after their completion, which points to attempts to undermine trust in the foundation and its founder rather than to hinder specific initiatives. The most common trigger for attacks is high-profile public events.

An analysis of the distribution channels showed that the main platform was Telegram, which accounted for 64.5% of all attacks. About 13% of attacks were distributed on Facebook, 12.3% in the media and on YouTube, and 8.7% on Twitter and other platforms.

Although bots account for a small share of attack sources, the artificiality of an attack can still be clearly tracked and analyzed. For example, the Facebook user Lyubov Pavlovskaya is probably a bot: comments were left on other posts on behalf of this account, and a reverse image search of her profile photo showed that the same photo is used on numerous “adult” sites.
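A related check can be automated within a single dataset. The sketch below is a minimal, assumption-laden illustration: it supposes suspect avatars have already been downloaded into a local folder and only catches byte-identical copies via content hashing; real reverse-image services (TinEye, Google Images) use perceptual matching that also tolerates resized or re-encoded images.

```python
import hashlib
from collections import defaultdict
from pathlib import Path

def find_reused_avatars(avatar_dir: str) -> dict[str, list[str]]:
    """Group avatar files by content hash; any group with more than one
    file is the same image reused across several accounts - a common
    sign of a bot network."""
    groups: dict[str, list[str]] = defaultdict(list)
    for path in Path(avatar_dir).glob("*"):
        digest = hashlib.sha256(path.read_bytes()).hexdigest()
        groups[digest].append(path.name)
    return {h: names for h, names in groups.items() if len(names) > 1}
```

Running it over a folder of profile pictures returns only the hashes that appear under two or more different account names.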

Collage of screenshots showing an example of a bot and the result of searching for a person by photo from a commenter’s profile

Examples of three attacks on the foundation under the most common narratives and slogans (embezzlement of funds, an attack on the foundation's reputation, and an attack on an associated person).

Screenshot of an example of an attack on Telegram with accusations of “embezzlement of funds”
Screenshot with an example of an attack on the reputation of a fund in Telegram
Screenshot with an example of an attack on Telegram – a post by Shariy (to whom the SBU has served a notice of suspicion of high treason committed under martial law), the purpose of which is to discredit a person associated with the foundation

Most of the attacks are directed at Serhiy Prytula as a person and a volunteer. The frequency of these attacks is not always related to the activities of the foundation – any informational pretext can be used to discredit, and the foundation or Prytula himself is mentioned in a negative context. In comments and publications, the authors of the attacks address Serhiy Prytula personally, indicating that he is responsible for all the actions of the charitable organization.

One of the most high-profile topics of the attacks was the accusation that Prytula purchased three apartments during the full-scale war. In fact, the real estate was purchased back in 2020, at the construction stage, which is confirmed by published documents; formal ownership was registered only in 2023. Nevertheless, this information is used to sow doubts about the foundation's honesty and reduce donations.

Despite the foundation’s regular statements that its operations are financed from separate accounts, and that all donations to the “military” account are directed exclusively to helping the army, publications continue to appear online that promote a fake narrative that “20% of the funds” are allegedly spent on personal needs.

Separate attacks also concerned the purchase of the ICEYE satellite. Fakes were spread that the satellite “does not work” or that the foundation overpaid for it. Despite the fact that the satellite is actively used to collect intelligence for the Armed Forces of Ukraine, such information attacks were aimed at undermining trust in this important project.

The foundation actively moderates comments on social networks, especially during large fundraisers or after important interviews. The team monitors spam and fakes, removes malicious comments, and responds promptly to bot attacks, which requires significant resources and involves cooperation with platform administrations.

A significant number of attacks were found on TikTok (226 negative comments) and YouTube (105 negative comments), where the foundation was under the greatest pressure.

According to the foundation's representatives, the main goals of the attacks are to reduce financial support, undermine trust in volunteers as a key pillar of society, and discredit Ukraine internationally by spreading false claims about the ineffectiveness of public initiatives. For example, during the launch of the “Riv Pomsty 2.0” fundraiser, the team worked until morning to counter spam and disinformation, which diverted resources from the main goal of attracting donations. And during the “Mega Fundraiser Nightmare”, mass complaints led to the removal of 80% of the content, and the Meta platform threatened to block the page, which significantly reduced reach and the effectiveness of the fundraiser.

United24

United24 is the official fundraising platform of Ukraine, created at the initiative of President Volodymyr Zelensky. The platform does not have its own accounts, using the accounts of several ministries of Ukraine, and is aimed at attracting mainly international donors for humanitarian and military projects. Ukrainian and foreign public figures are actively involved in promoting the funds. In addition, a media agency was created to disseminate information about the war in English and other languages.

The analysis showed that the United24 platform is attacked by individuals and bots likely associated with Russian structures and the “European Solidarity” party, as well as by unidentified sources.

About 29.4% (44 cases) of attacks were directed at the platform's organizers or individuals associated with it. Since the platform has no single formal leader, the negativity targeted the initiator of United24, President Volodymyr Zelensky, and the head of one of its directions, Deputy Prime Minister for Digital Transformation Mykhailo Fedorov.

According to the United24 press service, bots become active during large fundraisers or high-profile events, such as endorsements of the platform by famous people with large audiences. As an example, they cite the US elections, when there was a sharp surge in bot activity on the X social network: bots with Russian names massively subscribed to the United24 page and then unsubscribed, probably in order to disrupt the platform's algorithms. These bots were also subscribed to other popular Ukrainian accounts and initiatives (“Come Back Alive”, Igor Lachenkov, and others).

During the preparation of the report, 16 unique narratives were identified across 113 attacks, which were grouped by main topic. Some of the attacks were written in Russian. Of these, 20.4% of publications and posts were made by probable bots, and another 8.8% could not be confidently attributed either to media outlets or to real people.

Examples of three attacks on the platform under the most common narratives and slogans (stealing money, an attack on Fedorov, an attack on Zelensky).

A collage of screenshots with examples of attacks on persons associated with the platform – in blogger Oleshko's post, an attack on Volodymyr Zelensky; in Shariy's post, an attack on Mykhailo Fedorov.
Screenshot with an example of an attack under the narrative of “embezzlement of funds” in a repost on Telegram by Taras Chornovil

30% of attacks on social networks and news were directly related to the platform initiators, while the rest of the attacks were aimed at spreading other narratives. The most common of them was the narrative about the opacity of reporting and embezzlement of funds, which was used in 51% of cases.

According to the press service of the fundraising platform, most bot attacks on Instagram occur after the publication of reporting videos or joint posts with the President of Ukraine. On Facebook, bots write more often in private messages, and this is not always related to any specific events.

An analysis of United24's social media posts showed that the main platform for attacks was Facebook, where a network of bots with the same registration date and similar avatars was observed. On other social networks, either no significant number of negative comments was found (probably due to the work of platform moderators), or the posts were closed for commenting.
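The “same registration date” signal mentioned here is one of the simplest coordination heuristics to automate. A hedged sketch with hypothetical field names and invented sample data (not the actual United24 dataset):

```python
from collections import Counter

def flag_registration_clusters(accounts: list[dict], threshold: int = 3) -> list[str]:
    """Return registration dates shared by `threshold` or more commenting
    accounts - one coarse signal of a coordinated (bot) network."""
    dates = Counter(acc["registered"] for acc in accounts)
    return [date for date, n in dates.items() if n >= threshold]

# Hypothetical sample: four accounts created on the same day, one older.
sample = [
    {"name": "user_a", "registered": "2024-03-01"},
    {"name": "user_b", "registered": "2024-03-01"},
    {"name": "user_c", "registered": "2024-03-01"},
    {"name": "user_d", "registered": "2024-03-01"},
    {"name": "old_fan", "registered": "2019-07-15"},
]
print(flag_registration_clusters(sample))  # → ['2024-03-01']
```

In practice this heuristic is combined with others (avatar similarity, posting cadence), since a single shared date can also occur organically.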

The United24 team confirmed that it is actively working on content moderation, removing negative comments containing Russian propaganda and blocking the accounts of the authors of such posts. United24 notes that the main goal of such attacks is to create a negative impression of the platform, discredit it or influence the algorithms for issuing content on social networks.

Petro Poroshenko Foundation and NGO “Solidarity of Communities”

Two charitable foundations are associated with Petro Poroshenko:

  • Petro Poroshenko Foundation: a foundation that, according to Poroshenko’s own statements, does not accept donations from citizens and operates at the expense of the Poroshenko family. The foundation provides assistance to the military as well as running civilian projects;

  • NGO “Community Affairs” (legal name – NGO “Solidarity Affairs of Communities”): a foundation that collects charitable contributions and spends them only on the purchase of aid. The foundation focuses on assistance to the military; its projects are divided into groups: logistics, everyday needs, equipment, intelligence, destruction, and other projects.

The analysis did not take into account attacks directed at Petro Poroshenko personally that did not mention the platform or the foundation’s projects, since their nature mostly relates to Poroshenko’s political activities as the 5th President of Ukraine and a current member of parliament, not to the foundation’s work.

The results of the analysis show that the Petro Poroshenko Charitable Foundation and the NGO “Solidarity of Communities” are being attacked by individuals and bots that are likely to be associated with Russian structures, the Servant of the People party, as well as unidentified structures.

18 unique narratives were identified for 51 attacks, some of which were written in Russian. Of these attacks, 13 were distributed by other network participants with similar texts.

The largest share of attacks (58.8%, 30 cases) was directed at the Poroshenko Foundation, since the politician’s surname in the foundation’s name directly associates it with him personally. Another 13 attacks (25.5%) targeted both foundations, while the “Community Affairs” foundation accounted for 8 attacks (15.7%).

Examples of three attacks on the foundations under the most common narratives and slogans (misuse of funds, PR on donations, and PR on war).

Screenshot with an example of a media attack of the “PR at war” type
Screenshot with an example of a “donate PR” attack on Telegram
Screenshot with an example of a tweet attack of the “misuse of funds” type

The analysis shows that the main goal of the attacks is to discredit Petro Poroshenko as a politician and public figure, even when the negativity is directed at the foundations associated with his name. This is confirmed by the significant number of comments under publications about assistance to the defense forces on the foundations’ official pages and by similar comments under Poroshenko’s own posts.

25.5% of attacks in social networks and news resources concerned the narrative about the misuse of funds. Another 25.5% of negative publications covered the narratives of “PR in war” and “PR on donations” (11.8% and 13.7%, respectively). The main platform for attacks is Telegram, where 76.5% of all negative posts were recorded.

KSE Foundation

KSE (Kyiv School of Economics) is a private higher education institution of international level, founded in 1996 by the Economic Research and Education Consortium (EERC) and the Eurasia Foundation. The KSE Foundation, a charitable foundation established in 2007 as a subsidiary of a non-profit corporation in the USA, was initially focused on providing scholarships to talented young people. The President of KSE is Tymofiy Mylovanov, and the Director of KSE Foundation is Svitlana Denysenko.

The analysis revealed that the KSE Foundation is not actually the main target of attacks on social networks. Almost all negative publications and comments concern a person associated with the foundation, Tymofiy Mylovanov, who served as Minister of Economic Development, Trade and Agriculture of Ukraine from August 2019 to March 2020. In total, 9 negative articles and comments related to the foundation’s activities were found, 4 of which are comments under Mylovanov’s own Facebook posts about the foundation’s projects.

The foundation itself is almost never the target of direct attacks, as most of the negativity is directed specifically at Tymofiy Mylovanov. Here are a few examples of attacks:

Screenshot with an example of an attack in the media by the type of “attack on an associated person”
Screenshot with an example of an attack on the forum “sexism in grants”
Screenshot with an example of an attack in the comments under a Facebook post of the type “opaque money transfer scheme”

Therefore, we can conclude that the KSE Foundation itself is practically not a target of attacks, since most negative publications are tied to Tymofiy Mylovanov as an associated person. Negative comments mostly appear under the posts of the charitable foundation or of persons associated with it. This partially confirms the hypothesis that the attacks target not so much the foundation’s activities as its associated persons.

Infographics with diagrams were not added to the text due to the insufficient number of cases of attacks, which makes it impossible to conduct a detailed analysis based on the collected data.

Information was taken from open sources by Molfar.
