Artificial Intelligence vs. No Intelligence: DOL Issues New Field Assistance Bulletin (FAB) No. 2024-1, With Guidance On the Use of Artificial Intelligence in the Workplace

“AI is a tool. The choice about how it gets deployed is ours.”

—Oren Etzioni

On April 29, 2024, the Wage and Hour Division (“WHD”) of the Department of Labor (“DOL”) published Field Assistance Bulletin (FAB) No. 2024-1, Artificial Intelligence and Automated Systems in the Workplace under the Fair Labor Standards Act and Other Federal Labor Standards. See https://www.dol.gov/sites/dolgov/files/WHD/fab/fab2024_1.pdf. The Bulletin provides guidance regarding the application of the Fair Labor Standards Act (“FLSA”) and other federal labor standards with respect to the use of artificial intelligence (“AI”) and other automated technologies in the workplace.

How does AI cross paths with the FLSA, first enacted in 1938? Well, the FLSA requires certain recordkeeping by employers. And according to DOL, AI is sometimes used by employers to track work hours and performance, and even to process medical leave requests, among other functions. DOL says that “[t]he federal labor laws WHD enforces are flexible enough to cover these changing workplace practices, and protections apply regardless of technological innovation.” Indeed, long ago, DOL issued a timekeeping app for employees to use to track hours worked and to document their claims of unpaid working time. Now a newly hip DOL is hopping on the “all things AI” trend to remind employers that when they employ new technology, they still must ensure its responsible use to maintain compliance with federal labor laws.

Here is the gist of the WHD Bulletin:

AI and other automated systems can provide ways to streamline tasks for employers, improve workplace efficiency and safety, and enhance workforce accountability. However, without responsible human oversight, the use of such technologies may pose potential compliance challenges with respect to federal labor standards. As new technological innovations emerge, the federal laws administered and enforced by WHD continue to apply, and employees are entitled to the protections these laws provide, regardless of the tools and systems used in their workplaces.

Id.

I suppose what worries WHD is that employers will blindly rely on AI to keep track of time or leave use or other wage and hour records, and the AI may produce inaccurate results. WHD is intent on reminding employers that AI is no substitute for accuracy in timekeeping, payroll, and benefit administration. The potential for abuse exists, according to DOL, in many aspects of the FLSA and other employment laws, including the following situations:

  •        Tracking working time;

  •        Monitoring break periods;

  •        Tracking waiting time;

  •        Monitoring location of working time for travel and prevailing wage purposes;

  •        Calculating wages due for different work rates and duties;

  •        Keeping track of Family and Medical Leave Act (“FMLA”) leave; and

  •        Assuring breaks and other protections for nursing mothers.

DOL isn’t trying to bar the door to technological advances. It nonetheless concludes that “[w]ithout proper human supervision, however, these technologies can pose potential risks to workers with respect to labor standards and may result in violations of the laws enforced by WHD.” Id.

Of course, the sorry history of government’s misuse of technology and records is more likely to be a source of abuse than employer recordkeeping. There is a long history of government relying on glitchy computers, making errors, and refusing to correct them. My wife had me watch a painful 2024 drama series and documentary on BritBox this week called Mr. Bates vs The Post Office, about how the United Kingdom’s (“UK’s”) Post Office computer made bookkeeping errors, and the UK authorities used those errors to destroy or damage the lives of more than 550 local “Subpostmasters,” claiming that the employees had engaged in theft of public funds. Of course, there is a government contractor angle to the story, as the computer system was procured from and managed by Fujitsu. The persecution was so bad it caused two suicides. That sorry story of official abuse suggests it might be more fruitful for WHD to spend more of its time cleaning up its own highly inaccurate and misused computer database records, especially before it decides to move ahead with debarment actions against small government contractors based on its own flawed investigatory records. That isn’t artificial intelligence, of course; it’s just plain common sense.