I’m working on fixing Chicago’s open government data. In late September, my friend Mike Emerick and I will be giving a Strata Conference talk that will culminate months of research aimed at making open government data more reliable. We hope our backgrounds in big data and civic involvement will advance that goal.
Do you know any writers doing data-driven journalism in Chicago? Get in touch with me here to get involved in this project.
Here’s the abstract (approved by Strata):
Open government is an incredibly popular topic today. From the appointment of the nation’s first chief data scientist, to cities like New York, San Francisco, and Chicago signing executive orders to open up city data to the public, more government data is available to us than ever before. Given all this, it would be easy to conclude that we’re in an era of unprecedented transparency.
Yet in late February 2015, investigative journalists revealed that the Chicago police department had been operating “black sites” around the city — essentially, places where Americans were detained and then disappeared off the record. None of this information made it onto the city’s open data portals. This egregious, illegal behavior was not uncovered through open data, but through traditional journalistic methods.
This story, and others like it, fundamentally call into question the data integrity of open government initiatives. How can we still use this information to derive insights into our government? What can we do to identify omissions in the data? And how can we improve the integrity of open government data through traditional data analysis?