
Gary Barker (right), CEO of Equimundo, speaks on the SDG Media Zone panel "The Manosphere: Understanding and Countering Online Misogyny" with, from left to right, Janelle Dumalaon, Panel Moderator and US Correspondent for Deutsche Welle; Jaha Dukureh, UN Women Regional Goodwill Ambassador for Africa; and Ljubica Fuentes, Founder of 'Ciudadanas del Mundo'.
As the digital landscape continues to expand into all areas of daily life, humanitarian experts are warning of growing risks, particularly as artificial intelligence (AI), online anonymity, and weak regulatory systems increase opportunities for abuse and harassment. Women and girls remain disproportionately affected, with nearly half lacking effective legal protection.
Ahead of the annual 16 Days of Activism against Gender-Based Violence campaign—which seeks to use digital platforms to empower women and promote gender equality—UN Women has raised alarm over a worsening crisis of online abuse. According to the organisation, around 1 in 3 women worldwide experience gender-based violence in their lifetime, and between 16 and 58 per cent have faced some form of digital violence.
“What begins online doesn’t stay online,” said UN Women Executive Director Sima Bahous. “Digital abuse spills into real life, spreading fear, silencing voices, and—in the worst cases—leading to physical violence and femicide. Laws must keep pace with technology to protect women both online and offline. Weak legal protections leave millions vulnerable while perpetrators act with impunity. This is unacceptable.”
Online harassment has surged in recent years, much of it playing out on platforms such as Instagram, X (formerly Twitter), and TikTok. The rise of generative AI has worsened the problem, contributing to increases in cyberstalking, non-consensual image sharing, deepfakes, and disinformation targeting women. World Bank figures show that fewer than 40 per cent of countries have adequate legal frameworks against online harassment, leaving 44 per cent of women and girls (around 1.8 billion) unprotected.
Rapid advances in generative AI have made image-based abuse easier, allowing perpetrators to create realistic deepfake images and videos that circulate on social media and explicit websites. These AI-generated images can be copied endlessly and stored on private devices, making them difficult to track or remove. Accountability remains limited because of weak safeguards and inadequate moderation.
UN Women reports a sharp rise in image-based sexual harassment. Schoolgirls increasingly face fake nude images of themselves being posted online, while female leaders are targeted with deepfake campaigns designed to intimidate and discredit them.
“There is massive reinforcement between the explosion of AI technology and the toxic extreme misogyny of the manosphere,” feminist activist Laura Bates told UN Women. “This is fundamentally about misogyny. AI is amplifying a much larger offline truth: men target women for gendered violence and abuse.”
Digital violence takes many forms, from inappropriate messages and coercive behaviour by intimate partners to anonymous threats. Women and girls living in low-income or rural areas face heightened risks, though the impact is felt across all demographics.
“Online abuse can undermine women’s sexual and reproductive rights and has real-life consequences,” said Anna Jeffreys, Media and Crisis Communications Adviser for UNFPA. “It can be used to control partners, restrict decision-making, or create fear and shame that prevents them from seeking help or health services.”
UN Women notes that young women, journalists, politicians, activists, and human rights defenders are frequent targets, with many also facing intersecting discrimination based on race, disability, or sexual orientation.
“When you escape your abusers, you feel safer—but digital violence follows you everywhere,” said Ljubica Fuentes, a human rights lawyer and founder of Ciudadanas del Mundo. “If you’re a feminist or activist, you are not allowed to be wrong, not even to have a past.”
Data collected from around the world highlights the scale of harm. In the Philippines, 83 per cent of online abuse survivors reported emotional harm, 63 per cent experienced sexual assault, and 45 per cent suffered physical violence. In Pakistan, online harassment has been linked to femicide, suicide, physical assault, job loss, and silencing of women. Across the Arab states, 60 per cent of women internet users have faced online violence; in Africa, 46 per cent of women parliamentarians report attacks; and in Latin America and the Caribbean, 80 per cent of women in public life have reduced their online presence out of fear.
UN Women is urging stronger global cooperation to ensure that digital platforms and AI systems follow safety and ethical standards. The organisation is calling for increased funding for women’s rights groups supporting survivors, and for stronger mechanisms to hold perpetrators accountable.
“The priority must be accountability and regulation,” Bates said. “AI tools should meet safety standards before release, platforms must be responsible for the content they host, and the burden of prevention must shift away from victims.”
Tech companies are also urged to employ more women, respond to abuse reports promptly, and remove harmful content. UN Women stresses the need for investment in prevention, including digital literacy training and programmes addressing toxic online cultures.
Jeffreys said UNFPA is supporting governments to strengthen national laws and working with communities, schools, and frontline workers to promote digital safety and ensure survivors can access confidential help.
“Digital platforms can expand access to information, education, and essential health services—especially for young people,” she said. “But these tools must be safe. When digital spaces are secure, they can advance gender equality rather than undermine it.”