Biden’s Executive Order on AI Bias Leaves Out the Gig Economy—And It Shows
President Biden made clear that he intended to rein in AI’s rapidly changing role in public life when he signed Executive Order 14110. The directive sets an ambitious tone, demanding stronger digital rights, non-discriminatory algorithms, and transparent systems. Yet its safeguards fall oddly silent for people who work through apps rather than offices.
Over the past decade, platforms like Uber, DoorDash, and Instacart have transformed how labor is organized, using code rather than supervisors to assign tasks and assess performance. For many workers, AI is not some future concept; it is already the manager, the scheduler, and occasionally the judge. Yet most of those gig workers remain unprotected under the executive order.
| Category | Detail |
|---|---|
| Executive Order | Safe, Secure, and Trustworthy Development and Use of Artificial Intelligence (EO 14110) |
| Signed By | President Joe Biden |
| Date Enacted | October 30, 2023 |
| Key Themes | AI safety, algorithmic fairness, civil rights, employment safeguards |
| Covered Workers | Employees under FLSA; excludes most gig workers |
| AI Oversight Direction | Department of Labor to guide employers on AI tool use |
| Exclusion Noted | No mention of Uber, DoorDash, or app-based labor regulation |
| Criticism Raised By | Labor unions, workers’ rights groups, civil rights organizations |
| Source Reference | Brookings Institution – AI Executive Order Overview |
By concentrating primarily on employers as defined by the Fair Labor Standards Act, the order leaves out the large and growing class of independent contractors. Though not classified as “employees,” these workers perform essential work in transportation, delivery, freelance media, and caregiving. Their exclusion was not merely a technical oversight; it exposed a policy gap.
The order gives federal agencies specific guidance on monitoring AI in work environments, and in theory that guidance is effective. But the environments it envisions (fixed schedules, physical offices, HR departments) reflect a traditional conception of work. Gig platforms function differently: algorithms drive the experience, frequently determining who works, when they work, and how they are rated or suspended.
A driver in Austin told me, for instance, that after one hectic weekend he began receiving fewer ride requests. Task offers dropped off without warning. A customer complaint, likely handled by an automated system, may have dented his performance rating, yet the information behind the decision was never shown to him. This is exactly the kind of opacity the executive order is designed to address, just not for him.
By building advanced analytics into gig platforms, businesses have gained operational efficiency while sidestepping traditional labor obligations. These systems are highly flexible: they process data in real time, reroute tasks, and adjust prices instantly. For the workers affected, however, they create uncertainty and, far too often, accountability without redress.
When it comes to labor protections, the silence is striking.
A number of researchers, working through public advocacy and strategic alliances with labor groups, have called on the administration to broaden its AI oversight frameworks. According to one study, AI’s negative effects are “most deeply felt and least regulated” in gig work. From automated content moderation tools to delivery apps that rely heavily on surveillance, workers face particular risks without corresponding rights.
It is noteworthy that the executive order promotes bias assessments and fairness audits, but only in the context of conventional employment relationships. That leaves a gap for early-stage startups built on the gig model: exempt from the same ethical requirements, they are free to deploy experimental algorithmic tools.
Use of platform-based services surged during the pandemic. Millions of workers relied on these apps for flexible income, and consumers welcomed the convenience. That shift altered labor markets, but regulatory frameworks were slower to change, and the lag grows more dangerous as AI is built into employment platforms.
AI is expected to transform how labor markets operate in the years to come, with data playing a larger role in worker evaluations, wage calculations, and task assignment. That makes it all the more crucial that federal policy remain inclusive, not only safeguarding salaried positions but also adapting to the changing nature of contemporary labor.
Encouragingly, the executive order builds in mechanisms for adaptation. Agencies such as the Department of Labor may issue future guidance that reflects more expansive definitions of work, and civic organizations can advocate for language that recognizes algorithmic management as a form of authority requiring oversight, not just traditional supervision.
By taking advantage of the current policy momentum, it is feasible to create protections that are genuinely transparent and equitable across all types of employment. The first step is to redefine what it means to be a “worker” in the AI era, not by status but by exposure and function.
In the end, this executive order is a foundation. With careful revision, its policy blueprint can extend to the gig workers who are currently navigating algorithmic control without safeguards. In doing so, lawmakers would be upholding a social compact rather than merely regulating software.
One in which everyone has a seat at the table, whether they tap into an app or clock in at an office.