Federal Court Approves Collective Action in Workday AI Bias Lawsuit

In a landmark ruling, a federal judge in Northern California has greenlit a collective action lawsuit against Workday over its AI-powered hiring tools. Filed by job seeker Derek Mobley, the complaint alleges that Workday's screening tools systematically discriminated against applicants who are over 40, Black, or disabled by automatically filtering them out of consideration. The court's decision to let the case proceed as a collective action marks a pivotal moment in the legal scrutiny of algorithmic hiring.

Mobley’s Allegations and Collective Action Approval

Derek Mobley, a Black man over 40 who has a disability, claims he applied to more than 100 jobs through platforms powered by Workday's AI tools and was rejected every time. Mobley filed suit in 2023 under Title VII, the ADEA, and the ADA, alleging systemic discrimination. On May 16, 2025, Judge Rita Lin certified the suit as a nationwide collective action under the ADEA, covering applicants aged 40 and over who were denied automated job recommendations through Workday's platform since September 24, 2020 (Holland & Knight; HR Dive).

Workday’s Response and Legal Position

Workday maintains that it only provides software and does not make hiring decisions, a distinction it argues should shield it from liability. The court was not persuaded: it found that Mobley plausibly alleged Workday acts as an agent of its customer employers, declined to dismiss the claims, and allowed the collective action to move forward (HR Dive; Inside Tech Law).

Broader Legal and Regulatory Implications

This case breaks new ground by treating AI vendors as potentially liable under federal civil rights laws when their systems influence hiring decisions at scale. Employers and vendors alike are being urged to audit algorithmic tools, validate their demographic impact, and ensure transparency in order to mitigate legal risk (Seyfarth; Reuters).
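
To make the "validate demographic impact" recommendation concrete, the sketch below shows one common screening heuristic, the EEOC's four-fifths rule, applied to automated advancement rates. It is a minimal illustration only: the input data, column names, and group labels are hypothetical, and nothing here reflects Workday's actual systems or the methodology at issue in the case.

```python
# Minimal adverse-impact audit sketch (hypothetical data, not Workday's method).
# Compares each group's selection rate against the highest-selected group's rate.
import pandas as pd

def adverse_impact_ratio(df: pd.DataFrame, group_col: str, outcome_col: str) -> pd.Series:
    """Selection rate of each group divided by the highest group's rate.

    Under the EEOC's four-fifths guideline, a ratio below 0.8 is often
    treated as preliminary evidence of disparate impact.
    """
    selection_rates = df.groupby(group_col)[outcome_col].mean()
    return selection_rates / selection_rates.max()

# Hypothetical audit data: one row per applicant, 1 = advanced by the screener.
applicants = pd.DataFrame({
    "age_group": ["under_40"] * 200 + ["40_plus"] * 200,
    "advanced":  [1] * 120 + [0] * 80 + [1] * 70 + [0] * 130,
})

ratios = adverse_impact_ratio(applicants, "age_group", "advanced")
print(ratios)
# Here the 40_plus ratio is roughly 0.58, well below 0.8, so this (invented)
# tool's output would warrant closer review before being relied on in hiring.
```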

Next Steps in the Lawsuit Process

The case now moves into discovery, including scrutiny of Workday's training data, feature selection criteria, and internal bias testing. Plaintiffs will need to present statistical evidence of disparate impact, while Workday may move to decertify the collective or dispute key elements of the plaintiffs' case. How this litigation unfolds will heavily influence how AI hiring tools are deployed, audited, and governed within HR systems.
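
For context on what "statistical evidence of disparate impact" can look like in practice, the short example below tests whether a gap in automated advancement rates between age groups is statistically significant. The counts and the choice of Fisher's exact test are illustrative assumptions for this article, not figures or methods from the litigation.

```python
# Hypothetical significance test on aggregate pass-through rates (invented counts).
from scipy.stats import fisher_exact

# 2x2 contingency table: rows are age groups, columns are advanced vs. rejected.
advanced_40_plus, rejected_40_plus = 70, 130
advanced_under_40, rejected_under_40 = 120, 80

table = [
    [advanced_40_plus, rejected_40_plus],
    [advanced_under_40, rejected_under_40],
]

odds_ratio, p_value = fisher_exact(table, alternative="two-sided")
print(f"odds ratio: {odds_ratio:.2f}, p-value: {p_value:.4g}")
# A small p-value indicates the observed gap is unlikely to arise by chance alone;
# in litigation, such tests are typically paired with expert analysis of the
# screening model itself (features, training data, thresholds).
```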

Conclusion

The federal court's decision to allow collective action in the Workday AI bias case serves as a critical warning: as AI becomes an integral part of hiring, the need for accountability grows with it. HR teams, AI vendors, and legal departments must now work together to ensure the ethical, fair, and compliant use of automated hiring systems before future litigation or regulatory mandates force their hand.

Sources