Aggression Detectors: The Unproven, Invasive Surveillance Technology Schools Are Using to Monitor Students

Number 28488
Subject Surveillance
Source ProPublica
Year 2019
Publication Date June 2019
Summary In response to mass shootings, some schools and hospitals have been installing devices equipped with machine learning algorithms that purport to identify stressed and angry voices before violence erupts. Our analysis found this technology unreliable. Our goal was to reverse-engineer the algorithm so we could see for ourselves whether it actually worked as the company advertised. (One salesperson suggested to us that the device could prevent the next school shooting.) We purchased the device and rewired its programming so we could feed it any sound clip of our choosing. We then played gigabytes of sound files for the algorithm and measured its prediction for each. After this preliminary testing, we ran several real-world experiments to probe where the algorithm might be flawed: we recorded the voices of high school students in real-world situations, collected the algorithm's predictions, and analyzed them.
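The summary describes a batch-testing workflow: feed the detector one audio clip at a time, record its score for each, and analyze the results afterward. Neither the device's interface nor ProPublica's actual harness is reproduced here; the sketch below is a minimal, hypothetical stand-in (the score_clip function, the clips/ directory, and the dummy scoring rule are assumptions, not the vendor's API) showing what such a clip-by-clip logging loop could look like.

```python
import csv
import wave
from pathlib import Path


def score_clip(path: Path) -> float:
    """Hypothetical stand-in for the device's aggression score.

    The real model runs on the vendor's hardware and is not public here;
    this dummy returns the clip's normalized duration so the harness is
    runnable end to end.
    """
    with wave.open(str(path), "rb") as clip:
        frames = clip.getnframes()
        rate = clip.getframerate()
    duration = frames / float(rate) if rate else 0.0
    return min(duration / 60.0, 1.0)


def run_batch(clip_dir: str, out_csv: str) -> None:
    """Score every WAV clip in a directory and log one row per prediction."""
    rows = []
    for path in sorted(Path(clip_dir).glob("*.wav")):
        rows.append({"clip": path.name, "score": round(score_clip(path), 4)})
    with open(out_csv, "w", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=["clip", "score"])
        writer.writeheader()
        writer.writerows(rows)


if __name__ == "__main__":
    # Assumed layout: a clips/ folder of WAV files; predictions land in a CSV.
    run_batch("clips", "predictions.csv")
```

In the investigation itself the scores came from the vendor's own model running on the modified device, so a harness like this would mainly serve to organize clips and their predictions for later analysis.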
Category Philip Meyer Contest entry
Keywords ProPublica, technology, surveillance, schools, education, school shootings, aggression, monitors, hospitals, machine learning, algorithms
Related Links https://www.youtube.com/watch?v=WUL_Kk5EiNw&feature=youtu.be https://www.youtube.com/watch?v=lsW6ROCTWIg&feature=youtu.be
Ordering info Want to place an order? Email us or call us at 573-882-3364. (Stories are available only to members of IRE. For membership information, please refer to our membership page.)