Reception immediately following in 3rd floor atrium
Today, computer systems terminate Medicaid benefits, remove voters from the rolls, exclude travelers from flying on commercial airlines, label (and often mislabel) individuals as deadbeat parents, and flag people as possible terrorists based on their email and telephone records. But when an automated system rules against an individual, that person often has no way of knowing whether a defective algorithm, erroneous facts, or some combination of the two produced the decision. Research showing strong psychological tendencies to defer to automated systems suggests that a hearing officer's check on computer decisions will have limited value.
At the same time, automation impairs participatory rulemaking, the traditional stand-in for individualized due process. Computer programmers routinely alter policy when translating it from human language into computer code. An automated system's opacity compounds this problem by preventing individuals and courts from ascertaining the degree to which the code departs from established rules. Programmers are thus delegated vast and effectively unreviewable discretion in formulating policy.
Danielle will be talking about a concept of technological due process that can vindicate the norms underlying the last century's procedural protections. A carefully structured inquisitorial model of quality control can partially replace aspects of adversarial justice that automation renders ineffectual. Her proposal provides a framework of mechanisms capable of enhancing the accuracy of rules embedded in automated decision-making systems.
Danielle Citron is an Associate Professor of Law at the University of Maryland Law School.