91 UMKC L. Rev. 403 (2022)
Algorithmic assessments increasingly shape individuals’ success in education and employment. Schools, recruiters, and companies now rely on automated platforms and analytics services to sort through an overwhelming number of options. Similar technologies help students and workers find opportunities to pursue. The “opportunity brokers” who provide these tools not only vet applicants, but also personalize advertisements, curate user-facing recommendations, and identify potential candidates to recruit or promote.
Because algorithms rely on data reflecting historical bias and inequality, artificial intelligence can retrench existing patterns of inequity. Automated systems compound this disadvantage as a few dominant companies draw on the same data and apply similar criteria. Further, the scale and pressures of the platform economy encourage both organizations and opportunity seekers to prioritize options with the highest probability of success. This leads to less diverse outcomes across industries and over time, constructing an imperceptible, but systemic, barrier to opportunity that I call the Silicon Ceiling.
Like the glass ceiling, the Silicon Ceiling undermines traditional regulatory regimes focused on discrete decisions, ex post remediation, fair scores, and procedural guarantees. Algorithmic intermediation occurs out of view as targeted advertising, personalized recommendations, and passive recruiting shape the candidate pool well before formal decision making. As a result, the automated opportunity system precludes, rather than denies, access to opportunity. Most people will not know the details of the decisions, the entities making them, or that they have even been evaluated. It is not just that the “black box” of algorithmic opacity obscures inflection points; the barrier itself is invisible. These paradigmatic shifts call for new conceptualizations of harm and structural reforms to shatter the Silicon Ceiling.