Building AI without women will lead to biased results: Microsoft

IANS
Microsoft. (File Photo: IANS)

New Delhi, March 8 (IANS) As Artificial Intelligence (AI) becomes the talk of the town, building AI-based solutions without the inclusion of women would result in a technology that is inherently biased, a top Microsoft executive said on Friday.
According to the World Economic Forum's "Global Gender Gap Report 2018", only 22 per cent of AI professionals globally are women, while almost a third (32 per cent) believe that gender bias is still a major hurdle in the recruitment process in the industry.
"If AI systems are built only by one representative group such as all male, all Asian or all Caucasian, then they are more likely to create biased results," Mythreyee Ganapathy, Director, Programme Management, Cloud and Enterprise, Microsoft, told IANS.
Data sets used to train AI models need to be assembled by a diverse group of data engineers.
"A simple example is data sets that are used to train speech AI models: those which focus primarily on adult speech samples unintentionally exclude children, and hence the models are unable to recognise children's voices," Ganapathy added.
India is at the 108th spot in the gender gap index, according to the same World Economic Forum report. It also has one of the lowest participation rates of women in the labour market, at 27 per cent.
AI teams should draw on a more diverse set of people, as more than half (52 per cent) of women globally perceive the tech sector to be a "male" industry, the report adds.
To help close the gender gap in the country, the tech giant promotes the study of computer science at traditionally female colleges and other universities.
"We believe that attracting, developing and helping women in STEM fields is vital to ensuring a well-rounded, inclusive society without which we risk having hundreds of thousands of jobs left unfilled and decades of innovation absent of female perspectives," the Microsoft executive noted.
Corporate and academic AI teams have inadvertently made systems biased against women.
For example, tech giant Amazon's machine learning experts scrapped a "sexist" AI recruiting tool in October 2018 after they discovered that the recruiting engine "did not like women".
Members of the team working on the system said it effectively taught itself that male candidates were preferable.
--IANS
ksc/na/bg

(This story was auto-published from a syndicated feed. No part of the story has been edited by The Quint.)

