The Difficult Relationship Between Nonprofits and Facebook

GS INSIGHTS

Over the last decade, voices within the nonprofit community have expressed concern about ties to Facebook, a potentially game-changing fundraising tool, but one that comes with questions and compromises. These worries ramped up after 2016, as the social media giant took heat over foreign election tampering, the presence of extremist content on the platform, and the 2018 disclosure of the Cambridge Analytica leak. COVID-19 and the disinformation that spread online sent concerns to a new high. Then and since, Facebook has been unceasingly broadsided over its ethics and business practices.

Articles have appeared suggesting that nonprofits were either leaving Facebook or were mulling the possibility. Several observers noted that 90% of Facebook users are outside the U.S. anyway, making its utility as a networking and fundraising tool questionable. But that still leaves 179 to 239 million domestic users. Estimates vary due to methodology, as well as because some sources include projected 2022 user declines, but even so, a minimum of 179 million users is a lot of potential engagement to abandon. Have nonprofits actually been leaving Facebook?

With 1.5 million nonprofit organizations in the U.S., a trend of jettisoning Facebook may not be numerically quantifiable just yet, but there have been well-publicized defections, such as the 2020 departure of the Nonprofit Technology Enterprise Network (NTEN). The same year, the reform campaign Stop Hate for Profit launched with the support of hundreds of businesses and more than one hundred nonprofits, including 350.org, the Southern Poverty Law Center, GLAAD, and others. Last year Nonprofit Marketing Guide left Facebook, noting that, “Over the last few years, a small but growing number of members of our community have refused to use Facebook.” Anecdotally, then, it does seem that nonprofits are increasingly wary of the platform.

Concerns about cost-benefit, privacy, and ethics

It's doubtful at this point that many people think of tech companies as possessing lofty ideals. Wading through Facebook's numerous ads is enough of an experience by itself to prove that profit is its overriding motive. Decades ago, Noam Chomsky and others laid bare the machinery behind the television industry by explaining that the viewers were, in fact, the product. The programming was merely the lure that prompted viewers to self-organize in front of various shows based on their individual interests, and advertisers then paid networks for the right to showcase their wares to these highly siloed groups. Digital media has taken that process to its extreme, with algorithms digging out the most granular and intimate information imaginable, but the basic idea is unchanged—the users are the product.

On Facebook, page owners that don't pay for advertising receive less engagement than those that do, which seems straightforward enough. However, even users who have already engaged with and liked certain pages see those pages' posts less often than you'd expect. The desires of advertisers—Facebook's real customers—are prioritized over the preferences of users. That being the case, the costs versus the benefits of using the platform come into question, because even for users who are aware of and like your page, your content is constantly being shuffled beneath that of paying clients.

Around the time Facebook was starting to be hammered over extremist content and disinformation, it began seeking greater engagement with nonprofits. It had already created fundraising tools in late 2015, and announced in 2018 that hundreds of millions of dollars had been raised using those methods. At that point, it announced that it would begin offering these tools for free. (Previous fees were only to cover operational costs, according to Facebook.) The company's slant on its new charitable focus was that it was seeking to associate its platform with good causes. The realist's slant was that Facebook was seeking donor data.

This obviously raises privacy concerns. Everyone who interacts with a nonprofit's Facebook page is producing data that the company mines and sells to third parties, and while the monetary value of that data is unknown to all but a few insiders, we can surmise that information about people who are willing and able to give to charitable causes must be immensely valuable. Facebook, through its free fundraising tools, draws nonprofits into sometimes unwitting partnership with the company. This business model is so entrenched that good-faith efforts to resolve these worrisome issues remain a remote possibility. Data mining is Facebook's business.

And then there are the ethical issues. Much of what is known about Facebook's internal practices was revealed by whistleblower Frances Haugen, who last year anonymously leaked thousands of pages of documents, later unmasked herself on 60 Minutes, and subsequently testified before Congress. She was a member of Facebook's civic integrity team, which was tasked with combating false and misleading content on the platform.

Haugen told Congress that Facebook prioritizes profits over not only user preference but also public safety, and in the process endangers lives. She brought up the example of human traffickers and armed groups in Ethiopia using the platform to advance their aims. Facebook, she explained, used algorithms to steer users toward high-engagement posts, which happen to be those that provoke emotional responses. One of the most intense emotional responses is outrage. Facebook was creating a false reality by pushing healthy emotional reactions to the margins and centering posts that triggered anger, division, and extremism.

Negative impacts were especially serious for younger users, and Facebook insiders knew it. One internal document found that 6% of U.S. teens and 13% of British teens who had suicidal thoughts traced the urge to kill themselves to Instagram, which is owned by Facebook's parent corporation Meta. While Haugen's revelations exposed rampant bad practices, the idea of regulating Facebook and other tech companies angers free speech absolutists, who mostly fail to recognize that speech on Facebook is already being tampered with—manipulated by algorithms rather than surfaced by organic interest.

Is change possible or has that boat sailed?

Thoughtful advocates of reform propose that rather than taking down specific pages, a process rife with bias and the potential for political hijacking, it would be better to change the algorithms that determine why certain pages and posts are displayed. However, Facebook and the other tech giants know that new algorithms mean less low-hanging fruit, profit-wise, and have opted instead for restrictions and takedowns, such as the recent banning of Robert Kennedy, Jr. from Facebook and Instagram. While such acts generate news, it's impossible to say whether the bans have paid societal dividends or merely sown more division.

Frances Haugen believes algorithm change is the way forward, and is now the guiding force behind a fledgling nonprofit called Beyond the Screen, which has the goals of making social media healthier and more accountable, and creating “a more equitable digital economy [and] a new civic architecture for the digital world.” The venture was launched in mid-September. Its online presence is currently just a landing page and funding pitch, but that page describes the group as a “coalition of technologists, designers, and thinkers fighting against online harms.” It will collaborate with Project Liberty through the McCourt Institute.

Disengagement from Facebook hinges on numerous factors, including the type of nonprofit involved and the available alternatives. As popular as the service is, it wasn't the fastest growing social media platform last year. That was TikTok, which pushes a type of content favored by users under age forty. Next fastest were Reddit, LinkedIn, and Instagram. Each of the four can do bits and pieces of what Facebook does, but perhaps can't offer the same encompassing opportunities for engagement. Numerous other platforms have launched but haven't soared, among them Diaspora, Ello, Path, MeWe, Minds, and Vero. For now Facebook remains king.

Whether to use the platform is a difficult calculation. It's not feasible in today's world to keep one's hands completely clean. For example, nonprofits have no choice but to use banking services, which have their own murky ethics, ranging from helping to enforce economic inequality to investment in climate-killing industries. Full disentanglement from harmful companies is probably impossible. Even cellphones, those ubiquitous and indispensable devices, enable modern-day slavery. By 2019, Facebook had raised $3 billion in donations. By the end of 2021, the total had reached $6 billion. A lot of vital services have been made possible.

The ethics of donors is a familiar topic, but the growing discussion around Facebook has helped to bring forward a parallel debate about the ethics of downstream providers, whether social media companies, banks, or the makers of devices. To tolerate a little bad in order to do greater good has become the quintessential digital conundrum, and it's one nonprofits ponder each day. Trying to chart a route through what seems to be an ethical labyrinth may be too time consuming, too costly to finances and operations, and ultimately fruitless. However, shining a light on the dodgy practices of powerful companies will always remain a responsibility.

Action steps you can take today