50 - Promising developer personality
As captain of the women's chess club, she stood little chance of being selected.
From 2014 onwards, Amazon developed a computer program to evaluate applicants' CVs. The aim was to automate the search for suitable candidates for vacant positions. The program used artificial intelligence and rated each applicant's suitability on a scale of one to five stars. In 2015, however, the company found that the program did not make gender-neutral selections for software development and other technical positions. The online recruiting tool simply didn't like women.
This was due to the training of the computer model, which was fed with applicants' CVs from the previous ten years. Most of those applications came from men, a reflection of male dominance across the tech industry. The system taught itself that male applicants were to be preferred and downgraded applicants whenever the word "women's" appeared in their résumés, for example in "captain of the women's chess club". After the case became known, Amazon claimed that recruiters had never used the program to evaluate candidates. Insiders, however, said that the AI-based recommendation system had been used, although Amazon's recruiters never relied solely on its rankings.
The example shows that data quality is of crucial importance in machine learning. True to the motto "Garbage In, Garbage Out", an algorithm can only be as good as the data humans feed it for training.
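The mechanism can be sketched in a few lines. The following toy example is purely illustrative, with invented data and a deliberately naive scoring rule, not Amazon's actual system: a score learned from a skewed "hiring history" ends up penalizing any résumé containing a word that correlates with past rejections.

```python
from collections import Counter

# Invented training data: past hiring decisions (1 = hired, 0 = rejected)
# in which resumes mentioning "women's" were historically rejected.
training = [
    ("captain of the chess club", 1),
    ("lead developer open source project", 1),
    ("captain of the women's chess club", 0),
    ("women's coding group organizer", 0),
    ("hackathon winner systems programming", 1),
    ("women's robotics team mentor", 0),
]

hired_words, rejected_words = Counter(), Counter()
for text, label in training:
    (hired_words if label else rejected_words).update(text.split())

def score(resume: str) -> int:
    """Naive word score: +1 per word seen in hired resumes,
    -1 per word seen in rejected ones. Words seen in both cancel out,
    so only the biased marker word tips the balance."""
    return sum((w in hired_words) - (w in rejected_words)
               for w in resume.split())

# The word "women's" appears only in rejected resumes, so an otherwise
# identical resume is downgraded just for containing it.
print(score("captain of the chess club"))          # neutral words only
print(score("captain of the women's chess club"))  # penalized
```

No variable here encodes gender explicitly; the bias enters entirely through the statistics of the training set, which is exactly why cleaning the input data, not the scoring code, is the decisive step.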