Press Enter to search
Tech News: A new study has raised an important question—is artificial intelligence (AI) showing gender bias by preferring women over men in job selections? According to recent research, men may be at a disadvantage when AI is used in hiring processes.
Professor David Rosado from the New Zealand Institute of Skills and Technology conducted a study to examine how AI behaves in recruitment decisions. He tested 22 different large language models (LLMs), including well-known ones like ChatGPT, Gemini, and Grok.
To ensure fairness in the test, he created identical resumes, changing only the names to reflect either a male or female candidate. Surprisingly, every AI model favored the female candidate, even when both profiles had the same qualifications and experience.
This outcome raises serious concerns. Many global companies now rely on AI to help in hiring, but if these systems are unintentionally biased, male applicants could face unfair treatment. Professor Rosado believes this trend must be studied carefully, especially as AI becomes more common in workplaces.
When speaking to Newsweek, Professor Rosado said that this unusual behavior could be the result of several factors—including the training data, the labeling (annotation) process, or system-level instructions built into the AI. However, he also admitted that the exact reason remains unknown.
He explained that this bias could be coming from human patterns in the data that the AI has learned from. If the data is biased, AI is likely to make biased decisions as well.
This study shows a clear pattern: AI is more likely to prefer women over men in job selections, even when both have equal skills. Whether or not this is intentional, it can seriously impact men’s job opportunities and raises ethical questions about how AI is used in the hiring process.
The research highlights an important issue—AI should not be used blindly. No matter how smart the system is, if it’s trained on biased data, it can make unfair decisions. That’s why it is important to design AI tools that are fair, transparent, and balanced.
Companies and developers must regularly test and improve AI models to ensure they treat all candidates equally, regardless of gender or background.