
Human Resources Pulse

How Is Generative AI Reshaping Gender Inequalities at Work?

ILO: The Future of Work Podcast • March 9, 2026

Human Resources • AI

Why It Matters

As generative AI reshapes the labor market, its unequal impact threatens to widen existing gender gaps, affecting millions of women in clerical, administrative and care‑related jobs. Understanding and addressing these biases now is crucial for policymakers, employers, and workers to ensure that AI becomes a tool for inclusive growth rather than a driver of new inequities.

Key Takeaways

  • Female-dominated jobs face almost twice the AI automation risk of male-dominated ones.
  • Women perform 75% of unpaid care work, limiting their labor-market options.
  • AI trained on biased historical data reproduces gender and racial discrimination.
  • Inclusive AI design can detect bias and help close pay gaps.
  • Governments, employers, and unions must govern AI with gender equality in mind.

Pulse Analysis

The International Labour Organization’s new report shows that generative AI will not affect men and women equally. By applying a novel ILO exposure index across 84 countries, researchers found that occupations dominated by women are almost twice as likely to be automated: 29% versus 16% for male-dominated jobs. Moreover, women face higher automation risk in 88% of the surveyed economies. Typical roles such as payroll clerks, receptionists, data-entry operators and translators are especially vulnerable because they consist of routine, codifiable tasks that AI can replicate quickly.

These disparities stem from deep‑rooted structural drivers. Social norms and gender stereotypes steer women toward caring and administrative positions while channeling men into senior, technical roles. Globally, women shoulder three‑quarters of unpaid care work, limiting their ability to pursue higher‑skill jobs or full‑time hours. When AI systems are trained on historical data that reflects these inequities, they reproduce bias—evident in recruitment tools that favor male‑sounding names or salary‑setting algorithms that perpetuate pay gaps. Intersectional discrimination compounds the problem for women of color, older workers, and migrants.

Policymakers, employers and trade unions can shape a more equitable AI future. The ILO urges embedding gender equality into AI design, using unbiased, high‑quality data sets and ensuring women have access to AI‑related jobs and decision‑making roles. Governments must enact transparent regulations, data‑protection standards and non‑discrimination safeguards, while also reforming care and labor‑market policies to reduce women’s unpaid workload. Companies should adopt AI strategies that include ethical risk assessments, skilling programmes and tools that flag biased job ads or pay structures. When AI is built responsibly, it can help close gender gaps rather than widen them.

Episode Description

In this episode of the Future of Work podcast, Anam Butt, technical specialist on gender equality and non-discrimination at the International Labour Organization, discusses a new ILO report on the impact of generative artificial intelligence on the world of work. She explains why women are more exposed than men to the risks associated with this technology, and what this means for equality in the labour market.
