Selected Publications

The University of Kansas Missouri Law Review, forthcoming 2022

Schools increasingly use artificial intelligence in instruction. Personalized learning systems take on a whole host of other educational roles as well, fundamentally reconfiguring education in the process. They not only perform the functions of “robot teachers,” but make pedagogical and policy decisions typically left to teachers and policymakers. Their design, affordances, analytical methods, and visualization dashboards construct a technological, computational, and statistical infrastructure that literally codifies what students learn, how they are assessed, and what standards they must meet. Educators and legislators can no longer afford to overlook the pedagogical and policy implications of their technology choices.

Notre Dame Journal on Emerging Technologies, Jan 2022

The widespread use of online proctoring software during the Covid pandemic prompted a deluge of horror stories, generating significant public backlash. This Article offers a nuanced analysis of the online proctoring technosocial system, which not only includes proctoring software, but also its implementation by educators and schools. Online proctoring software relies on controversial technologies – facial recognition, artificial intelligence, and biometric surveillance in intimate surroundings – without vendors or schools accounting for their biases or acknowledging the lack of evidence supporting automated proctoring accuracy and efficacy. This Article then examines legal and extra-legal levers that can promote—or push—companies and schools to adopt more responsible policies and eschew unreliable and unproven automated tools. None of these, however, will cure the fundamental flaws with proctoring technologies, which should only be deployed after significant reform to ensure fairness and due process and under limited circumstances where their use will promote, rather than undermine, equity by expanding access to education.

The Oxford Handbook of Artificial Intelligence Ethics, March 2020

Schools increasingly use artificial intelligence in instruction. Personalized learning systems take on a whole host of other educational roles as well, fundamentally reconfiguring education in the process. They not only perform the functions of “robot teachers,” but make pedagogical and policy decisions typically left to teachers and policymakers. Their design, affordances, analytical methods, and visualization dashboards construct a technological, computational, and statistical infrastructure that literally codifies what students learn, how they are assessed, and what standards they must meet. Educators and legislators can no longer afford to overlook the pedagogical and policy implications of their technology choices.

With Helen Nissenbaum, Theory and Research in Education, November 2018

Strong regulation of student privacy promotes many important purposes of higher education, including the creation of citizens capable of self-governance, socioeconomic mobility, and intellectual experimentation. However, users who receive instruction directly from Massive Open Online Courses (MOOCs) and other virtual learning environments (VLEs) fall outside this protection and instead face the caveat emptor data practices of commercial websites. If these private, often for-profit online education providers truly seek to support the public-minded missions they claim, they should instead offer online learners heightened privacy protection.

AI PULSE, Spring 2021 (with Edward Parson et al.)

This article considers AI advances, impacts, and governance concerns.

The Cambridge Handbook of Consumer Privacy, May 2018

Education is increasingly driven by big data. New education technology creates virtual learning environments accessible online or via mobile devices. These interactive platforms generate a previously unimaginable array and detail of information about students’ actions both within and outside of classrooms. This information not only can drive instruction, guidance, and school administration, but also better inform education-related decision-making for students, educators, schools, ed tech providers, and policymakers. This chapter describes the benefits of these innovations, the privacy concerns they raise, and the relevant laws in place.

Big Data, June 2018

This article examines how big data-driven education technologies alter the structure of schools’ pedagogical decision-making, and, in doing so, change fundamental aspects of America’s education enterprise. Digital mediation and data-driven decision-making have a particularly significant impact in learning environments because the education process primarily consists of dynamic information exchange. First, virtual learning environments create information technology infrastructures featuring constant data collection, continuous algorithmic assessment, and possibly infinite record retention. This undermines the traditional intellectual privacy and safety of classrooms. Second, these systems displace pedagogical decision-making from educators serving public interests to private, often for-profit, technology providers. They constrain teachers’ academic autonomy, obscure student evaluation, and reduce parents’ and students’ ability to participate or challenge education decision-making. Third, big data-driven tools define what ‘counts’ as education by mapping the concepts, creating the content, determining the metrics, and setting desired learning outcomes of instruction. These shifts cede important decision-making to private entities without public scrutiny or pedagogical examination.

The Handbook of Learning Analytics, May 2017

This chapter describes the regulatory framework governing student data, its neglect of learning analytics and educational data mining, and proactive approaches to privacy. Traditional student privacy law focuses on ensuring that parents or schools approve disclosure of student information. They are designed, however, to apply to paper “education records,” not “student data.” As a result, they no longer provide meaningful oversight. The primary federal student privacy statute does not even impose direct consequences for noncompliance or cover “learner” data collected directly from students. Newer privacy protections are uncoordinated, often prohibiting specific practices to disastrous effect or trying to limit “commercial” use. These also neglect the nuanced ethical issues that exist even when big data serves educational purposes. I propose a proactive approach that goes beyond mere compliance.

University of Miami Law Review, March 2017

Both FERPA and new state reforms rely on education purpose limitations as a compromise that allows schools to outsource data-reliant functions while addressing stakeholder concerns. However, current regulations define “education purposes” as information practices conducted on behalf of schools or pursuant to their authorization. Accordingly, they provide more procedural than substantive constraints. As with student privacy protections based on controlling access to education records, modern technological affordances limit the protection provided by education purpose limitations. Data-driven education tools change the nature of student information, the structure and method of school decision-making, and the creation of academic credentials. Broad education purpose limitations provide limited protection under these circumstances because they (1) treat education and non-education purposes as binary and mutually exclusive; (2) presume data practices serving education purposes align with students’ academic interests; (3) overlook the ethical complications created by “beta” education; (4) neglect the pedagogical effects of computerized instructional tools; and (5) discount the impact of data-driven technology on education itself. Ethical discourse regarding education technology points to productive avenues for more substantive student privacy protection.

Drexel Law Review, August 2016

This paper offers perspective on how the regulatory mechanisms of FERPA and new reforms work within today’s technological, institutional, and economic systems. It makes several contributions, including (1) analyzing the assumptions that underlie FERPA’s FIPPs-based approach to privacy; (2) finding such protections insufficient in light of the technological and information infrastructure created by networked systems, cloud computing, and big data analytics that are often provided by private entities; and (3) enumerating practical, political, pedagogical, and philosophical characteristics of America’s education system that limit the efficacy of privacy protection through personal or institutional self-management.

The Journal of the National Association of State Boards of Education (NASBE), May 2016

Addressing parents’ fears that student data will be abused requires states to shift toward proactive management of education records.

The Future of Privacy Forum, March 2016

This report highlights the ways that newly available technology, data, and analytical techniques can create better educational outcomes. It presents concrete examples from Pre-K through higher education of how education data can be used to benefit students, the education system, and society-at-large. The cases here illustrate how students, educators, researchers, and advocates apply data analysis to encourage student success and retention, facilitate more effective instruction, advising, and administration, and ameliorate inequalities. Data-driven education has the potential to help bridge the achievement, retention, and discipline gaps so that all students can enjoy a high-quality education and achieve career success.

Slate, March 4, 2015

Algorithms Can Be Lousy Fortunetellers . . . But Employers Could Take Them Seriously Anyway.

October 2014

This paper examines the concerns captured in the concept of the proverbial “permanent record,” how closely these fears match and diverge from information flow surrounding state longitudinal student data systems, and the mechanisms in place to address these fears.
