Ifeoma Ajunwa / Hiring by Algorithm / 11.7.17

Ifeoma Ajunwa is an assistant professor at Cornell University’s Industrial and Labor Relations School (ILR) and an associate faculty member of Cornell Law School. Ajunwa is interested in how the law and private firms respond to job applicants or employees perceived as “risky.” She examines the legal parameters for assessing such risk and how technology and organizational behavior mediate risk reduction by private firms. She also studies the sociological processes by which such risk is constructed and the discursive ways such risk assessment is deployed to maintain inequality, as well as the ethical issues that arise when firms offset risk onto employees.

Talk: “Hiring by Algorithm”

Abstract: In the past decade, advances in computing processes such as data mining and machine learning have prompted corporations to rely on algorithmic decision making with the presumption that such decisions are efficient and fair. The use of such technologies in the hiring process represents a particularly sensitive legal arena. In this Article, I note the increasing use of automated hiring platforms by large corporations and how such technologies might facilitate unlawful employment discrimination, whether through (inadvertent) disparate impact on protected classes or through the technological capability to substitute facially neutral proxies for protected demographic details. I also parse some of the proposed technological solutions to discrimination in hiring and examine their potential for unintended outcomes. I argue that technologically based solutions should be employed only in support of legislative and litigation-driven redress mechanisms that encourage employers to adopt fair hiring practices. I make the policy argument that audits of automated hiring platforms should be a mandated business practice that serves the ends of equal opportunity in employment. Notably, akin to Professor Ayres and Professor Gerarda Brown’s Fair Employment Mark, employers that subject their automated hiring platforms to external audits could receive a certification that serves to distinguish them in the labor market. Finally, borrowing from tort law, I argue that an employer’s failure to audit its automated hiring platforms for disparate impact should serve as prima facie evidence of discriminatory intent under Title VII.
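As a rough illustration of one statistic such a disparate-impact audit might report, the sketch below computes selection rates by applicant group and flags ratios that fall below the EEOC’s “four-fifths” guideline. The applicant data and group labels are hypothetical, and this is only a minimal sketch of one possible audit check, not the audit procedure proposed in the Article.

```python
# Hypothetical sketch: the "four-fifths" (80%) selection-rate check often used
# as a first screen for disparate impact. Data and group labels are invented.

from collections import defaultdict


def selection_rates(outcomes):
    """outcomes: iterable of (group, hired_bool) pairs -> {group: selection rate}."""
    applicants = defaultdict(int)
    hires = defaultdict(int)
    for group, hired in outcomes:
        applicants[group] += 1
        if hired:
            hires[group] += 1
    return {g: hires[g] / applicants[g] for g in applicants}


def four_fifths_flags(rates, threshold=0.8):
    """Compare each group's rate to the highest rate; flag ratios below threshold."""
    top = max(rates.values())
    return {g: (rate / top, rate / top < threshold) for g, rate in rates.items()}


if __name__ == "__main__":
    # Invented example: group A hired at 40%, group B at 25%.
    sample = [("A", True)] * 40 + [("A", False)] * 60 + \
             [("B", True)] * 25 + [("B", False)] * 75
    rates = selection_rates(sample)
    for group, (ratio, flagged) in four_fifths_flags(rates).items():
        print(f"group {group}: rate={rates[group]:.2f}, ratio={ratio:.2f}, "
              f"below 4/5 guideline: {flagged}")
```

In this invented example, group B’s selection rate is 62.5% of group A’s, below the four-fifths threshold, so the check would flag the platform’s outcomes for closer scrutiny; an actual audit would of course look well beyond this single ratio.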
