New requirements for federal contractors using AI-powered technology provide sound advice for all employers
As Artificial Intelligence (AI) increasingly informs employment decisions, recent guidance for federal contractors suggests employers of all types should consider their current practices. On April 29, 2024, the Office of Federal Contract Compliance Programs (OFCCP) issued a Q&A addressing concerns arising from the intersection of obligations specific to federal government contractors and the proliferation of AI-based hiring decisions. See Artificial Intelligence and Equal Employment Opportunity for Federal Contractors | U.S. Department of Labor (dol.gov). Although the document is limited to government contractors, all employers may find it useful.
Requirements for federal contractors using AI systems for employment decisions
Although the guidance addresses some general technical considerations, such as “What is an algorithm?” and “What are automated systems?”, the Q&A focuses primarily on employment practices, helping contractors avoid the risk that their use of AI systems may perpetuate unlawful bias and/or automate unlawful discrimination.
The OFCCP noted that long-standing requirements for federal government contractors (e.g., the prohibition on discrimination in employment and the required use of affirmative action in recruiting/hiring) continue to apply when a federal contractor uses an automated system, including AI, to make employment decisions. The OFCCP provided the following examples of contractor requirements that may intersect with a contractor’s use of AI:
- Maintain records and ensure confidentiality of records consistent with all OFCCP-enforced regulatory requirements. In the context of AI, contractors must keep records of resume searches, both from searches of external websites and internal resume databases, and the records must include the substantive search criteria used.
- Cooperate with OFCCP by providing the requested information on the contractor’s AI systems.
- Make reasonable accommodations to the known physical or mental limitations of an otherwise qualified applicant or employee with a disability as defined in OFCCP’s regulations, unless the federal contractor can demonstrate that the accommodation would impose an undue hardship on the operation of its business. This obligation extends to the contractor’s use of automated systems, including but not limited to, electronic or online job application systems.
With respect to employment-related AI systems that have an adverse impact based on a protected class, such as sex, race or ethnic background (i.e., a selection rate that is less than 80% of the selection rate for the group with the highest rate; a simple illustration of this comparison follows the list below), federal contractors must validate the system using a strategy that meets applicable OFCCP-enforced nondiscrimination laws and the Uniform Guidelines on Employee Selection Procedures (UGESP). These obligations, which apply whether the contractor builds its own AI software or acquires it from a vendor, require the employer to:
- Understand and clearly articulate the business needs that motivate the use of the AI system.
- Analyze job-relatedness of the selection procedure.
- Obtain results of any assessment of system bias, debiasing efforts and/or any study of system fairness.
- Conduct routine independent assessments for bias and/or inequitable results.
- Explore potentially less discriminatory alternative selection procedures.
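To make the 80% threshold concrete, the short Python sketch below computes each group’s selection rate and compares it to the rate of the most-selected group, flagging any ratio that falls below four-fifths. The group names and counts are hypothetical, and a flag from a screen like this is only a starting point for the validation analysis contemplated by the UGESP, not a legal conclusion.

```python
# A minimal sketch (not an official OFCCP tool) of the "four-fifths" comparison:
# a group's selection rate below 80% of the highest group's rate is generally
# treated as evidence of adverse impact under the UGESP.

def selection_rates(applicants_by_group, hires_by_group):
    """Return the selection rate (hires / applicants) for each group."""
    return {
        group: hires_by_group.get(group, 0) / applicants
        for group, applicants in applicants_by_group.items()
        if applicants > 0
    }

def adverse_impact_ratios(applicants_by_group, hires_by_group):
    """Compare each group's selection rate to the highest-rate group."""
    rates = selection_rates(applicants_by_group, hires_by_group)
    highest = max(rates.values())
    return {
        group: {
            "selection_rate": round(rate, 3),
            "impact_ratio": round(rate / highest, 3),
            "below_four_fifths": rate / highest < 0.8,  # flags potential adverse impact
        }
        for group, rate in rates.items()
    }

# Illustrative (hypothetical) numbers only:
applicants = {"Group A": 200, "Group B": 150}
hires = {"Group A": 60, "Group B": 30}
print(adverse_impact_ratios(applicants, hires))
# Group B's rate (0.20) is about 67% of Group A's (0.30), below the 80% threshold.
```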
The OFCCP pointed out that federal contractors cannot avoid these compliance obligations by merely “contracting around them” with a third party. Because federal contractors are responsible for their use of third-party products and services, such as automated tools for employment decisions, the contractor must be able to provide relevant, requested information and answer questions during an OFCCP audit.
Best practices for federal contractors that apply to all employers
Finally, the OFCCP guidance offers suggested best practices for federal contractors that can benefit any employer using AI systems as part of its employment decision-making process. For example, the guidance suggests that contractors should:
- Provide advance notice and appropriate disclosure to applicants, employees and their representatives when the contractor intends to use AI in the hiring process or other employment decisions, so that individuals understand how they are being evaluated.
- Inform all applicants in clear and accessible terms how to request and obtain a reasonable accommodation in the hiring process, if needed.
- Standardize the AI systems used so that all candidates/applicants go through the same process, and establish, in advance, the procedures the employer or the third party administering the process will follow to receive and promptly respond to reasonable accommodation requests.
- Routinely monitor and analyze whether the use of the AI system is causing a disparate or adverse impact before implementation, during use (at regular intervals) and after use. If such an impact exists, take steps to reduce it or use a different tool. This should include assessing whether the use of historical data in the creation of an AI system may reproduce patterns of systemic discrimination.
- Not rely solely on AI and automated systems to make employment decisions and ensure there is meaningful human oversight of any such decisions supported by AI.
- Retain and safely store documentation of the data used to develop or deploy the AI system with the contractor or ensure that such documentation is easily obtainable from the vendor.
These recommended practices also make clear one additional important point: AI is not “set it and forget it.” Employers need to continue monitoring their AI systems for biases and adverse impacts that may emerge over time.
The use of AI systems for making employment-related decisions is here to stay. Although federal contractors are among the employers currently subject to specific AI requirements, this space will continue to become more heavily regulated for all types of employers. In the US, states and cities have already begun regulating this area, including New York City’s requirement to publish the results of bias audits. Businesses that take incremental steps now to address these concerns, while compliance is still optional, will benefit from a much smoother transition as regulation proliferates in tandem with the use of AI.