Artificial intelligence, people, ethnicities, and cultures

Artificial intelligence? “It’s a matter of power, not just of technology.”

Data are not neutral and tend to reinforce existing hierarchies. Those who pay the price are primarily people living in poverty, minorities, and women. Interview with Ivana Bartoletti, expert in AI governance and digital rights: “A feminist approach helps address these dynamics.”
By Antonella Patete
01 Apr 2026

“When we talk about AI, we are not only talking about technology, but above all about power. Because whoever decides the data, goals, and rules of a system also decides who is seen, evaluated, and rewarded.”

For Ivana Bartoletti, Vice President and Global Head of Privacy and AI Governance at Wipro, data are never neutral, and those who pay the highest price are always minorities, including, of course, women. “Technology incorporates choices: which categories to use, what to measure, which norms to assume. Since data reflect unequal societies, AI tends to reinforce existing hierarchies: access to services, credit, work, online visibility. People with less power, such as women, minorities, and those living in poverty, experience more errors, surveillance, and exclusion, and have fewer tools to contest them,” says Bartoletti, who is internationally recognized for her contribution to the debate on technology, law, and society. She founded the network Women Leading in AI and is the author of several books on the relationship between artificial intelligence, ethics, politics, and finance.

Doctor Bartoletti, in recent weeks Grok, the AI integrated into the X platform, has generated significant discussion due to the spread of sexual deepfakes involving women and minors. Is this phenomenon more widespread than it seems? Are there other similar cases? And what tools have been implemented to combat digital violence against women?
The Grok case made visible an already widespread phenomenon: the production of non-consensual intimate images using AI (deepfakes and “nudify”), which primarily affects women and girls and can involve minors. Grok continued to generate sexualized images without consent, prompting investigations in the United Kingdom, France, and at the EU level. Countermeasures include specific offenses and aggravating circumstances, obligations for rapid removal and traceability, age verification for high-risk functions, watermarking and content provenance, bans on “nudify” models, reporting channels, and victim support.

You also said that artificial intelligence is never neutral. Can you give some examples in which algorithmic biases have produced injustices or exclusion, and how a feminist approach can change this?
Recurring examples include personnel selection software that penalizes female CVs because it was trained on historically male hiring patterns; facial recognition systems that are less accurate on women and non-white people, creating risks of unjustified checks; and medical models that underestimate pain or cardiovascular risk in women because historical clinical data are skewed. A feminist approach changes the process: it analyzes who benefits and who is at risk, makes asymmetries in data visible, enforces impact assessments and independent auditing, involves affected groups in design, and prioritizes accountability and rights, not just performance.

Fortunately, the recent discussion has not focused solely on Grok but also on AfroféminasGPT. What exactly is it, and how can this initiative change the way AI “thinks”?
AfroféminasGPT is a custom GPT built by Afro-feminist activists (linked to the Afroféminas project) to correct the dominant viewpoint of generalist chatbots. It is trained and curated on decolonial and anti-racist sources, and it tends to define racism as a system of power, not merely as individual prejudice. In practice, it shows that AI does not “think” in the abstract: it responds according to the archive of texts and values it incorporates. Initiatives like this push toward models that are more transparent about sources, governance, and objectives, and open the way for community-curated datasets, not only those from large platforms.

Many of the most interesting experiences seem to come from the so-called Global South. Is this really the case?
Yes, many innovations come from the Global South, particularly regarding language, moderation, and civic uses: there, harms from imported automation (linguistic bias, digital exclusion, surveillance) are immediate, which increases the drive for contextual solutions. Moreover, the plurality of languages and dialects forces a rethink of datasets and metrics. Feminist and anti-racist activist networks experiment with their own tools, while mobile and informal adoption favors lightweight, open-source prototypes. This is not romanticism: it is often innovation driven by necessity and deep social expertise. Consider, for example, India—a multicultural country, a huge global talent hub, and a major economic power.

Is there any important topic we haven’t addressed?
Yes, there is one last fundamental point: feminism speaks of power, and artificial intelligence is not only technology—it is power as well. The power to reshape the world, labor markets, and geopolitical balances. The power to decide what we see, who receives resources, services, or opportunities through algorithmic systems. The power to crystallize or transform reality, translating social inequalities into code and software. A feminist approach to AI means intervening in these power dynamics: ensuring greater diversity not only among those writing the code but, above all, in the places where AI decisions are made—from politics to the boards of technology companies. Without this shift in power, talking about ethics remains insufficient.
