Posts Tagged ‘CAR Conference’
Visualizing data with Tableau, a trainer’s perspective
Journalists at the 2012 CAR Conference attend hands-on Tableau training. Photo courtesy of Tableau

By Daniel Hom, Tableau Public

Many journalists have experienced leafing through stacks of documents, searching for important numbers to back up stories and trying to relay all of that in a way that captivates readers. It’s a world where too often multimedia…
Read More

NICAR 2012 Wrap-Up
Hundreds of attendees and dozens of speakers descended on St. Louis for the 2012 Computer-Assisted Reporting Conference, a weekend packed with data analysis, web development, other sessions, and a panda costume. We had a full team of students attending and blogging about panels throughout the conference. In all, the bloggers covered dozens of sessions,…
Read More

Going beyond the campus for coverage
By Mayra Cruz (@MayraC27)

Campus coverage can be daunting, but looking beyond the campus is a way to get the story, Jennifer Wheeler of The Register-Mail said at “DataU: the databases you need to cover higher ed.” From grants to graduation rates, one of the major databases to mine for information is the Integrated Postsecondary…
Read More

Finding out what public figures don’t want you to know
By Jon McClure (@JonRMcClure)

Sex sells. But it sometimes buys, too. Online. As described in the panel “Hidden databases: Mining the private parts of public officials,” the trick is learning how to uncover the online footprint of public figures and track the nefarious deeds they might do under the cover of online alter-egos. Russ Ptacek of…
Read More

Double-check environmental data
Many investigative reporters are recreational data users, but data alone cannot be trusted. “You can’t take what is in those databases for granted,” said Kate Golden, a reporter and multimedia producer for WisconsinWatch.org. At the panel “Environmental analyses for any newsroom,” she emphasized the importance of speaking with the lead agency to find out what…
Read More

Getting around PIOs with Web Inspector
By Mayra Cruz (@MayraC27)

One way to get around bureaucratic hassles is to get to the data directly by scraping it off the Web. The fight for public records can sometimes be avoided by taking the data directly from websites, Dan Nguyen of ProPublica said. On Saturday, Nguyen led a hands-on class of “Web…

A minimal scraping sketch follows this entry.
Read More
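To make the idea concrete, here is a minimal sketch of one common version of the approach discussed in Nguyen’s session: use the browser’s Web Inspector (its Network panel) to spot the URL a page quietly loads its data from, then request that endpoint directly. The URL and field names below are hypothetical, and the code illustrates the general technique rather than anything taught in the class.

```python
# A sketch only: once the Web Inspector's Network panel reveals the URL a page
# loads its data from, that endpoint can often be fetched directly.
# The URL and field names here are hypothetical.
import csv

import requests

DATA_URL = "https://example.gov/api/inspections.json"  # hypothetical endpoint

response = requests.get(DATA_URL, timeout=30)
response.raise_for_status()
records = response.json()  # assuming the endpoint returns a list of flat dicts

# Save to CSV so the records can be sorted and filtered like any other dataset.
with open("inspections.csv", "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=sorted(records[0].keys()))
    writer.writeheader()
    writer.writerows(records)

print(f"Saved {len(records)} records to inspections.csv")
```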
Hack the Census

By Anna Boiko-Weyrauch (@AnnaBoikoW)

“Hacking the Census” was a collection of lightning talks on tools, tricks and code to hack the Census and American Community Survey, ranging from introductory to advanced. Steve Doig, professor at Arizona State University, said the Census has information about people and households, of course, but there’s also info on business, education, foreign trade, and more.…

A short code example of pulling Census data appears after this entry.
Read More
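For readers who would rather pull Census figures from code than from a desktop tool, here is a minimal, hedged sketch that fetches one American Community Survey table from the Census Bureau’s data API (api.census.gov). The dataset year, variable (B01003, total population) and geography are arbitrary examples, not specifics from the lightning talks.

```python
# A sketch only: fetch total population (ACS 5-year table B01003) for every
# state from the Census Bureau's data API. Year, variable and geography are
# arbitrary examples; adjust them to the story at hand.
import requests

BASE = "https://api.census.gov/data/2021/acs/acs5"
params = {
    "get": "NAME,B01003_001E",  # state name + total population estimate
    "for": "state:*",           # one row per state
}

rows = requests.get(BASE, params=params, timeout=30).json()
header, data = rows[0], rows[1:]  # the first row is the header

# Print the five most populous states as a quick sanity check.
for name, population, fips in sorted(data, key=lambda r: int(r[1]), reverse=True)[:5]:
    print(f"{name}: {int(population):,}")
```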
Excel on steroids: NodeXL and PowerPivot

By Hilary Niles (@nilesmedia)

Excel has two free plug-ins for Windows users that can dramatically help reporters: NodeXL and PowerPivot. (Sorry, Mac devotees, nothing for us.) Tom Torok, CAR editor of The New York Times, and Peter Aldhous, New Scientist’s San Francisco Bureau Chief, demoed the two plug-ins at the 2012 CAR Conference. NodeXL is a network analysis tool compatible…
Read More

Learning to liberate data
By Anna Boiko-Weyrauch (@AnnaBoikoW)

Syntax error. What does this bit of code do? Syntax error. Let’s go back to the source. Syntax error. Maybe try this? After two hours of educated guesses, trial, error and some friendly help, Pam Dempsey, of cu-citizenaccess.org, and I had finally scraped our first bit of text: the word “2011” from a page of…

A bare-bones version of that kind of first scrape follows this entry.
Read More
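For anyone who would like to skip a little of that trial and error, the sketch below shows roughly what such a first scrape can look like: fetch a page and pull one piece of text out of its HTML. The URL and the markup it assumes are invented for illustration; the workshop’s actual exercise and tools may well have been different.

```python
# A sketch only: fetch a page and scrape one piece of text out of its HTML.
# The URL and the CSS class are hypothetical.
import requests
from bs4 import BeautifulSoup

URL = "https://example.org/annual-report"

html = requests.get(URL, timeout=30).text
soup = BeautifulSoup(html, "html.parser")

# Suppose the page marks the year like: <span class="report-year">2011</span>
year = soup.find("span", class_="report-year")
if year is not None:
    print("Scraped:", year.get_text(strip=True))  # e.g. "2011"
else:
    print("Element not found; re-check the page's markup in the Web Inspector.")
```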
From where? Validating data in the real world

By Anna Boiko-Weyrauch (@AnnaBoikoW)

To understand your data, let’s go back to grade-school science class. Remember when you learned about the forest, and all the animals that call it home? The forest is a dynamic ecosystem. Your data is like a chimpanzee; it plays a role in the forest ecosystem. Over time, the changes in the…
Read More