Stories

The IRE Resource Center is a major research library containing more than 27,000 investigative stories.

Most of our stories are not available for download, but they can be ordered by contacting the Resource Center directly at 573-882-3364 or rescntr@ire.org, where a researcher can help you pinpoint what you need.

Search results for "algorithm" ...

  • Aggression Detectors: The Unproven, Invasive Surveillance Technology Schools Are Using to Monitor Students

    In response to mass shootings, some schools and hospitals have been installing devices equipped with machine learning algorithms that purport to identify stressed and angry voices before violence erupts. Our analysis found this technology unreliable. Our goal was to reverse-engineer the algorithm, so we could see for ourselves if it actually worked as the company advertised. (One salesperson suggested to us that the device could prevent the next school shooting.) We purchased the device and rewired its programming so we could feed it any sound clip of our choosing. We then played gigabytes of sound files for the algorithm and measured its prediction for each. After this preliminary testing, we ran several real-world experiments to test where the algorithm could be flawed. We recorded the voices of high school students in real-world situations, collected the algorithm's predictions and analyzed them.
  • Artificial Unintelligence

    A guide to understanding the inner workings and outer limits of technology and why we should never assume that computers always get it right.
  • NYT: Privacy, Propaganda and Profit in Silicon Valley

    Internet titans, including Facebook, empowered hucksters and propagandists stoking fear and hate, and misled the public about their behavior.
  • Artificial Intelligence: The Robots Are Now Hiring

    Hiring is undergoing a profound revolution. Since skills have a shorter and shorter shelf life, companies are moving away from assessing candidates based on their resumes and skills, towards making hiring decisions based on people’s personalities.
  • Dangerous Doses

    For one story, “The hunt for dangerous doses,” investigative reporter Sam Roe led a collaboration with data scientists, pharmacologists and cellular researchers at Columbia University Medical Center in an attempt to discover potentially deadly combinations of prescription drugs. Intrigued by the novel data mining algorithms developed by Columbia scientist Nicholas Tatonetti, Roe proposed that the two team up to search for drug combinations that might cause a potentially fatal heart condition. Roe also recruited Dr. Ray Woosley, the leading authority on that condition and a former dean of the University of Arizona medical school, to the team. Over two years, as he orchestrated the project, Roe traveled to New York 12 times to meet with Tatonetti. They brainstormed, analyzed data and talked with Woosley via conference calls. Several of Tatonetti’s graduate students joined the team, as did Columbia cellular researchers whose work provided a critical layer of validation of the results.
  • Inequality Calculator

    The Inequality Calculator is an application based on a massive analysis of income data from citizens of 16 countries that reveals, in an interactive and comparative way, the enormous income gaps between the poor and the multimillionaires of Latin America and the Caribbean. The Inequality Calculator is based on an algorithm that divides a household's monthly income among its members and compares the result with the country's population, ordered from poorest to richest in 10 groups, or deciles, plus the group of multimillionaires. These calculations give the user an estimate of the time he or she would need to work to attain the average monthly income of a multimillionaire, and also allow comparison with the country's other income groups. The timeframe (some users would need to work for several centuries to reach that income) highlights, in an amusing but direct fashion, the insuperable gap that separates the ordinary citizen from the multimillionaire.
  • LAPD underreported serious assaults, skewing crime stats for 8 years

    A Los Angeles Times machine-learning analysis found that the Los Angeles Police Department misclassified an estimated 14,000 serious assaults as minor offenses in an eight-year period, artificially lowering the city's crime levels. Reporters used an algorithm to learn key words in crime report narratives that identified offenses as serious or minor, and then used it to review nearly eight years of data in search of classification errors.
  • Machine Bias

    With our Machine Bias series, we are investigating the algorithms that are increasingly making decisions about our lives, from what news or ads we see online to what sentences are meted out for crimes. Algorithms are often proprietary "black boxes," raising important questions about transparency and due process. By collecting and analyzing the output of these systems, we set out to reverse-engineer and make accountable some of the algorithms that were having the biggest impact on people’s lives. Our investigative methods included linear regression, statistical analysis, and the creation of our own software. Among the series’ findings were evidence of racial bias in risk assessment systems and preferential treatment of Amazon’s own products in its so-called open market.
  • Daily Beast: The Apple ‘Kill List’: What Your iPhone Doesn’t Want You to Type

    "Spell ‘electrodialysis’ wrong in a text, and Apple will correct you. Miss ‘abortion’ by one letter? You’re on your own. A Daily Beast investigation into your iPhone's hidden taboos." We investigated the iPhone's spellcheck algorithm by writing a series of scripts and iOS programs to run spellcheck on hundreds of thousands of words and up to 14 different types of misspellings of those words. We found a list containing politically sensitive words that the iOS software will not accurately correct, even for slight misspellings.
  • Murder Mysteries

    Scripps Howard News Service has conducted the most complete accounting ever made of homicide victims in the United States. Aggressive use of state and local Freedom of Information laws allowed the wire service to assemble a database of 525,742 homicides, including records of 15,322 killings never reported to the FBI. The "Murder Mysteries" project calculated the homicide clearance rate for every police department in the U.S., prompting four departments to promise reforms. Scripps also developed an algorithm that identified 161 suspicious clusters of unsolved homicides involving women of similar age killed through similar means. Authorities in Gary, Ind., and Youngstown, Ohio, launched new investigations into possible serial murder in their communities as a result of this project.
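
The arithmetic behind the Inequality Calculator entry above (per-capita household income, decile placement, and time needed to match a multimillionaire's monthly income) can be sketched as follows. The decile cutoffs and the multimillionaire figure here are invented placeholders, not the project's actual data:

```python
# Minimal sketch of the Inequality Calculator's core arithmetic.
# All cutoffs and income figures below are hypothetical illustrations.
from bisect import bisect_right

# Hypothetical monthly per-capita income upper bounds for deciles 1-9;
# anything above the last cutoff falls in decile 10.
DECILE_CUTOFFS = [120, 180, 240, 310, 400, 520, 680, 900, 1300, 2600]
MULTIMILLIONAIRE_MONTHLY_INCOME = 450_000  # hypothetical average

def per_capita_income(household_monthly_income, household_size):
    """Divide a household's monthly income evenly among its members."""
    return household_monthly_income / household_size

def income_decile(per_capita):
    """Return 1-10: the decile a per-capita income falls into."""
    return min(bisect_right(DECILE_CUTOFFS, per_capita) + 1, 10)

def years_to_multimillionaire(per_capita):
    """Years of work needed to earn one month of a multimillionaire's income."""
    return MULTIMILLIONAIRE_MONTHLY_INCOME / per_capita / 12
```

For example, a four-person household earning 1,000 per month has a per-capita income of 250, which lands in a low decile, and would need 150 years of work to earn a single multimillionaire month under these made-up numbers.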
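
The keyword-learning idea described in the LAPD entry above (learn which narrative words signal a serious versus a minor offense, then score records) can be illustrated with a toy log-odds scorer. This is a sketch of the general technique, not the Times' actual code, and the training data is invented:

```python
# Toy sketch of learning key words that separate "serious" from "minor"
# crime-report narratives, then classifying new narratives with them.
from collections import Counter
import math

def train_word_scores(narratives, labels):
    """Log-odds score per word: positive leans 'serious', negative 'minor'."""
    serious, minor = Counter(), Counter()
    for text, label in zip(narratives, labels):
        (serious if label == "serious" else minor).update(text.lower().split())
    vocab = set(serious) | set(minor)
    # Add-one smoothing so unseen counts don't zero out the ratio.
    return {w: math.log((serious[w] + 1) / (minor[w] + 1)) for w in vocab}

def classify(text, scores):
    """Sum the scores of a narrative's words and pick the heavier label."""
    total = sum(scores.get(w, 0.0) for w in text.lower().split())
    return "serious" if total > 0 else "minor"
```

A real reclassification audit would then flag records where the predicted label disagrees with the department's recorded classification.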
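
The misspelling-generation step in the Daily Beast entry above can be sketched as single-edit variants (deletions, transpositions, substitutions) fed to a spellchecker. The checker is passed in as a stand-in function here; the actual investigation drove iOS's own spellcheck:

```python
# Sketch of generating single-edit misspellings of a word and measuring
# how often a given spellcheck function restores the original.
import string

def single_edit_misspellings(word):
    """All one-edit-away variants: deletions, adjacent swaps, substitutions."""
    deletions = {word[:i] + word[i + 1:] for i in range(len(word))}
    transposes = {word[:i] + word[i + 1] + word[i] + word[i + 2:]
                  for i in range(len(word) - 1)}
    substitutions = {word[:i] + c + word[i + 1:]
                     for i in range(len(word))
                     for c in string.ascii_lowercase if c != word[i]}
    return (deletions | transposes | substitutions) - {word}

def correction_rate(word, spellcheck):
    """Fraction of misspellings the supplied spellcheck maps back to word."""
    variants = single_edit_misspellings(word)
    return sum(spellcheck(v) == word for v in variants) / len(variants)
```

Comparing correction rates across ordinary words and politically sensitive ones is what surfaces a "kill list": words the checker quietly refuses to repair.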
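
The cluster-flagging idea in the "Murder Mysteries" entry above (unsolved homicides of women of similar age, killed through similar means, in the same place) can be sketched as a simple group-and-filter pass. Field names and thresholds here are invented for illustration, not Scripps' actual criteria:

```python
# Toy version of flagging suspicious clusters of unsolved homicides:
# group by city and method, keep groups with several unsolved female
# victims of similar age. Thresholds below are hypothetical.
from collections import defaultdict

def suspicious_clusters(cases, min_size=3, max_age_spread=10):
    """Return (city, method) keys whose unsolved female victims cluster by age."""
    groups = defaultdict(list)
    for case in cases:
        if not case["solved"] and case["victim_sex"] == "F":
            groups[(case["city"], case["method"])].append(case["victim_age"])
    return [key for key, ages in groups.items()
            if len(ages) >= min_size and max(ages) - min(ages) <= max_age_spread]
```

A flagged cluster is only a lead, not proof of a serial killer; in the project, clusters like these prompted local authorities to open new investigations.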