A study on bias against women in recruitment algorithms
Surveying the fairness literature in the search for a solution
Abstract
Algorithms have a more prominent presence than ever in the domain of recruitment. Tasks ranging from sourcing candidates to screening resumes are increasingly handled by algorithms rather than by humans. Automating these tasks has led to bias against various unprivileged groups, among them women, prompting a need for solutions that achieve algorithmic fairness for everyone. This survey analyses the state of the literature on fairness and bias against women, with a focus on recruitment algorithms. It finds that a plethora of methods to achieve and measure fairness exist, though many of the technical methods have only been tested in controlled environments rather than in production. Companies are not forthcoming about how they ensure fairness, which complicates the development of the field. The current methods have many limitations; however, because algorithms are already widely used in the recruitment process, solutions are still needed while these critiques are being addressed. It is vital that fairness efforts consider not only technical solutions but also social ones, and remain aware of the limitations of each approach. Solving the issue of bias against women in algorithms sometimes means recognising that the best solution does not involve an algorithm at all, or, where an algorithm is applied, requires a critical engineer who is aware of the limitations and possibilities of the different methods for achieving fairness. Future work lies in addressing major critiques of the current fairness literature and in reducing the societal bias that often underlies algorithmic bias.