WhatsApp, acquired by Facebook (now Meta) in 2014, is a messaging service used by more than 2 billion people globally. Its end-to-end encrypted design makes it a critical communication tool in many regions, yet Meta has faced backlash for exploiting metadata and for attempting to integrate WhatsApp more deeply into its advertising ecosystem. In countries such as India, Myanmar, and Brazil, WhatsApp has been a key vector for disinformation and coordinated political propaganda, with devastating real-world consequences. While marketed as private and secure, its ownership by Meta ties it to the same surveillance-driven business model and to the company's global record of human rights abuses.
Meta’s platforms are central to the spread of harmful content. Independent investigations have documented systemic failures to moderate hate speech, incitement to violence, and misinformation, including moderation policies that disproportionately censor Palestinian voices while leaving content that fuels discrimination unchecked. Recent policy changes, such as the weakening of fact-checking standards, have further amplified misinformation and hate speech at scale.
The company’s business model rests on surveillance-driven advertising. By harvesting and monetizing vast amounts of user data, Meta has repeatedly placed profit above privacy, with scandals from Cambridge Analytica to opaque AI systems demonstrating little accountability for misuse or abuse.
The consequences of this model are global. In Myanmar, an Amnesty International investigation found that Meta “substantially contributed” to atrocities against the Rohingya people by allowing its platform to be used for hate speech and incitement to violence. Despite clear evidence, Meta has refused to provide remedy or reparations to affected communities. In Palestine, the company continues to silence journalists, activists, and everyday users documenting state violence, normalizing censorship while enabling occupation-supporting narratives to proliferate.
These failures are not isolated errors but structural choices embedded in Meta’s design and governance. By prioritizing profit and state relationships over human rights, Meta has become complicit in some of the most serious abuses of the digital age. Boycotting Meta disrupts one of the most powerful engines of digital harm and political manipulation.
Signal