An algorithm that Amazon was testing as a recruitment tool has been scrapped after it was found to discriminate against women.
The artificial intelligence system was trained on applications submitted to the company over a 10-year period, most of which reportedly came from men.
Members of the team working on it reported the system effectively taught itself that male candidates were preferable.
Amazon has not responded to the claims.
Members of the team who worked on the AI tool, all of whom wished to stay anonymous, explained that the system was intended to review job applications and give candidates a score ranging from one to five stars.
‘They literally wanted it to be an engine where I’m going to give you 100 resumes, it will spit out the top five, and we’ll hire those,’ said an engineer on the team.
A highly gender-sensitive tool
By 2015, it was clear that the system was not rating candidates in a gender-neutral way because it was built on data accumulated from CVs submitted to the firm mostly from males, reports claim.
The system reportedly began to penalise CVs that included the word ‘women’. The program was edited to make it neutral to the term, but it became clear that the system could not be relied upon.
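How a model can teach itself such a bias is easy to illustrate. The sketch below is purely hypothetical (it is not Amazon's system, and the CVs and labels are invented): a naive word-scoring model trained on historical hiring data in which successful applicants were mostly men ends up assigning negative weight to tokens containing ‘women’, simply because those tokens co-occur with rejections in the biased data.

```python
from collections import defaultdict

# Hypothetical training data: (CV tokens, hired?). Historical hires skew
# male, so tokens containing "women" appear mostly in rejected CVs.
training = [
    (["python", "java", "chess club"], 1),
    (["java", "sql", "football"], 1),
    (["python", "sql"], 1),
    (["python", "women's chess club"], 0),
    (["java", "women's coding society"], 0),
]

def word_scores(data):
    """Score each token: +1 for every hire containing it, -1 for every rejection."""
    scores = defaultdict(int)
    for tokens, hired in data:
        for t in tokens:
            scores[t] += 1 if hired else -1
    return scores

scores = word_scores(training)
# "women's chess club" scores negatively purely because of the skewed
# labels -- the model has "learned" the historical bias, not merit.
```

No token here is inherently informative about job performance; the negative score is an artefact of the training labels, which mirrors the behaviour the Amazon engineers reported.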
The project was abandoned, although Reuters reported that recruiters used it for a period, looking at the recommendations the tool generated but never relying on it alone.
According to Amazon, its current global workforce is split 60:40 in favour of males.
About 55% of US human resources managers said that AI would play a role in recruitment within the next five years, according to a survey by software firm CareerBuilder.
It is not the first time doubts have been raised about the reliability of algorithms trained on potentially biased data.