I am a 23-year-old journalist in Phoenix. Over the past year, my work as one of The Arizona Republic’s data reporters uncovered troubling financial performance at charter schools, the disproportionate impact of voter purges on low-income and minority communities, transparency problems with restaurant inspections and a lack of action on a pressing public health issue in the country’s sixth-largest city.
In that time, I gained proficiency in web scraping, scripted analysis and geospatial analysis. I now use R and Python for most data tasks, producing reproducible analyses for our newsroom. I’m also comfortable doing large-scale analysis in ArcGIS, which drove the findings in the pedestrian deaths investigation.
We have a general news audience, so even our most complex stories must focus on how an issue affects our readers’ lives. In my work, I try to do this by grounding each story in basic questions, whose answers usually reveal how effectively public institutions and businesses serve the community.
Recently, I used a combination of scripted and geospatial analysis for an investigation into pedestrian deaths in Arizona. These crashes have become one of the leading causes of death in our state, and my analysis found that road design likely plays a large role in Arizona’s high fatality rate. The investigation found Phoenix hadn’t redesigned the streets in its deadliest areas, nor had it installed potentially life-saving crossing signals there.
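The core geospatial step was locating the deadliest stretches of road. The published analysis was done in ArcGIS, but a minimal sketch of the same idea in Python, using geopandas with invented file and column names, might look like this:

```python
# Hypothetical sketch: count pedestrian fatalities per street corridor.
# The published analysis used ArcGIS; file and column names here are placeholders.
import geopandas as gpd

# Crash points and corridor polygons, reprojected to a planar CRS
# (UTM zone 12N covers Arizona) so the spatial join is accurate.
crashes = gpd.read_file("ped_fatal_crashes.geojson").to_crs(epsg=26912)
corridors = gpd.read_file("phx_corridors.geojson").to_crs(epsg=26912)

# Attach each fatal crash to the corridor it falls within.
joined = gpd.sjoin(crashes, corridors, how="inner", predicate="within")

# Rank corridors by number of pedestrian deaths.
deadliest = (
    joined.groupby("corridor_name")
    .size()
    .sort_values(ascending=False)
    .head(10)
)
print(deadliest)
```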
I wrote a web scraping script to pull restaurant inspection data from Maricopa County’s website after the county declined to provide us its full database. The analysis of that data found that many restaurants, especially those with poor inspection records, opted out of the county’s grading system. That undercut the purpose of the grading system itself, which was to give patrons an easy way to check restaurant performance.
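A stripped-down sketch of that kind of scraper, assuming the site serves paginated HTML tables, is below; the URL, parameters and field names are placeholders, not the county’s actual site structure:

```python
# Hypothetical sketch of a restaurant-inspections scraper; the URL, query
# parameters and HTML selectors are invented stand-ins for the real site.
import csv
import requests
from bs4 import BeautifulSoup

BASE_URL = "https://example.maricopa.gov/restaurant-inspections"  # placeholder

def scrape_page(page_number):
    """Fetch one results page and return a list of inspection records."""
    resp = requests.get(BASE_URL, params={"page": page_number}, timeout=30)
    resp.raise_for_status()
    soup = BeautifulSoup(resp.text, "html.parser")
    rows = []
    for row in soup.select("table.results tr")[1:]:  # skip the header row
        cells = [c.get_text(strip=True) for c in row.find_all("td")]
        if len(cells) >= 4:
            rows.append({
                "name": cells[0],
                "address": cells[1],
                "inspection_date": cells[2],
                "grade_opt_in": cells[3],
            })
    return rows

if __name__ == "__main__":
    records = []
    for page in range(1, 6):  # small demo range; a real run pages through all results
        records.extend(scrape_page(page))
    with open("inspections.csv", "w", newline="") as f:
        writer = csv.DictWriter(
            f, fieldnames=["name", "address", "inspection_date", "grade_opt_in"]
        )
        writer.writeheader()
        writer.writerows(records)
```

Writing the results to a flat CSV keeps the analysis reproducible: the scrape can be rerun on a schedule and the downstream R or Python analysis pointed at the refreshed file.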
That story helped launch AZ Data Central, a growing collection of public data relevant to our readers’ lives. In addition to building the restaurant inspection scraper, I designed and built the individual pages for each dataset and produced numerous other watchdog stories that put those datasets in context for our readers.