Project description

I am a 23-year-old journalist in Phoenix. Over the past year, my work as one of The Arizona Republic’s data reporters uncovered troubling financial performance at charter schools, the disproportionate impact of voter purges on low-income and minority communities, transparency issues with restaurant inspections and a lack of action on a pressing public health issue in the country’s sixth-largest city.

In that time, I gained proficiency in web scraping, data analysis languages and geospatial analysis. I now regularly use R and Python for most data tasks, creating reproducible analyses for our newsroom. I’m also comfortable doing large-scale analysis in ArcGIS, which drove the findings of the pedestrian deaths story.

We have a general news audience, so our stories (even the most complex) must focus on how an issue affects our readers’ lives. In my work, I try to do this by asking basic questions. The answers usually reveal how well public institutions and businesses serve the community.

Recently, I used a combination of scripted and geospatial analysis for an investigation into pedestrian deaths in Arizona. These crashes have become one of the leading causes of death in our state, and my analysis found that road design likely plays a large role in why Arizona’s fatality rate is so high. The investigation found Phoenix hadn’t redesigned the streets in its deadliest areas, nor had it installed potentially life-saving crossing signals there.
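The published analysis was done in ArcGIS, but the core technique can be sketched in a few lines of Python with geopandas. This is a minimal, hypothetical version: the file names, column names and distance thresholds below are assumptions for illustration, not the actual inputs.

    import geopandas as gpd

    # Hypothetical input layers; the real analysis used ArcGIS layers.
    crashes = gpd.read_file("pedestrian_crashes.geojson")      # crash points
    segments = gpd.read_file("phoenix_road_segments.geojson")  # road centerlines
    signals = gpd.read_file("crossing_signals.geojson")        # signal points

    # Work in a projected CRS so distances are in meters (UTM 12N covers Arizona).
    crs = "EPSG:26912"
    fatal = crashes[crashes["severity"] == "Fatal"].to_crs(crs)
    segments = segments.to_crs(crs)
    signals = signals.to_crs(crs)

    # Snap each fatal crash to the nearest road segment within 50 meters,
    # then rank segments by fatality count to surface the deadliest corridors.
    joined = gpd.sjoin_nearest(fatal, segments, how="inner", max_distance=50)
    deadliest = joined.groupby("segment_id").size().nlargest(25)

    # For each deadly segment, measure the distance to the nearest crossing
    # signal to flag corridors that lack nearby pedestrian infrastructure.
    top = segments[segments["segment_id"].isin(deadliest.index)]
    flagged = gpd.sjoin_nearest(top, signals, distance_col="signal_dist_m")
    print(flagged[["segment_id", "signal_dist_m"]]
          .sort_values("signal_dist_m", ascending=False))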

I wrote a web scraping script to pull restaurant inspection data from Maricopa County’s website after the county declined to provide its full database. My analysis of that data found many restaurants, especially those with poor inspection records, opted out of the county’s grading system. That undercut the premise of the grading system itself, which was to give patrons an easy way to check restaurant performance.
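The county’s actual site structure isn’t reproduced here, so the following is only a rough sketch of the scraper’s approach in Python, using requests and BeautifulSoup. The URL, pagination scheme, CSS selector and field names are placeholders, not the real ones.

    import csv
    import time

    import requests
    from bs4 import BeautifulSoup

    BASE_URL = "https://example.maricopa.gov/inspections"  # placeholder URL

    rows = []
    for page in range(1, 101):  # assumed pagination scheme
        resp = requests.get(BASE_URL, params={"page": page}, timeout=30)
        resp.raise_for_status()
        soup = BeautifulSoup(resp.text, "html.parser")
        # Selector is a placeholder; skip the header row of the results table.
        for tr in soup.select("table.results tr")[1:]:
            cells = [td.get_text(strip=True) for td in tr.find_all("td")]
            if cells:
                rows.append(cells)
        time.sleep(1)  # throttle requests to be polite to a public server

    with open("inspections.csv", "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["restaurant", "address", "date", "grade"])  # assumed fields
        writer.writerows(rows)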

That story helped launch AZ Data Central, a growing collection of public data relevant to our readers’ lives. In addition to building the restaurant inspection web scraper, I designed and created the individual pages for all of the datasets and produced numerous other watchdog stories that placed the data in context for our readers.

What makes this project innovative?

The work demonstrates my ability to cover a range of topics in unique ways. For example, slow internet speeds are a common complaint among our readers, but no public dataset shows measured speeds. To get around this, I created a user survey to collect the data, which let us bring readers into the reporting process while holding internet service providers accountable for the results. In all, nearly 1,000 people have filled out the survey since it debuted in June of last year, and we used the responses to inform our story last August (a sketch of that analysis appears below).

In making AZ Data Central, I had to come up with a display solution for large databases that would work well on any device. Despite having limited prior background in web design, I was able to produce high-quality pages that rival other database sites.

Finally, these stories show my chops as a traditional reporter. In each analysis, I attempt to figure out what the data means and what the underlying story is. Then, I play a central role in finding and interviewing sources and writing the story itself.
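The internet speed survey analysis mentioned above can be sketched in a few lines of pandas. The file name, column names and aggregation choices here are assumptions for illustration; the real survey export and methodology aren’t reproduced.

    import pandas as pd

    # Hypothetical export of the reader survey; column names are assumptions.
    df = pd.read_csv("speed_survey_responses.csv")

    # Drop records with missing or impossible speeds before comparing providers.
    df = df[(df["measured_mbps"] > 0) & (df["advertised_mbps"] > 0)]
    df["pct_of_advertised"] = 100 * df["measured_mbps"] / df["advertised_mbps"]

    # Median measured speed, and share of the advertised speed actually
    # delivered, broken out by internet service provider.
    summary = (
        df.groupby("provider")
          .agg(responses=("measured_mbps", "size"),
               median_mbps=("measured_mbps", "median"),
               median_pct_of_advertised=("pct_of_advertised", "median"))
          .sort_values("median_pct_of_advertised")
    )
    print(summary)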

What was the impact of your project? How did you measure it?

For me, success is uncovering compelling findings that are useful to our readers. My stories have been well received, with many posting engagement times at or above a minute. I also write Twitter threads for these stories, and those have been growing in popularity. I try to pair most investigations with a solutions-focused element. For example, I wrote a sidebar for a project about evictions that offered multiple ideas for fixing Arizona’s eviction system. The central premise of the pedestrian deaths project was that proven countermeasures exist that city officials hadn’t yet adopted. My charter school analysis was one of the early attempts to quantify financial problems at those institutions, and it was built upon in the paper’s award-winning charter school coverage. These stories have played a role in bringing fresh scrutiny to how these companies operate.

Source and methodology

Sources for these stories include data obtained through public records requests, a web scraper and a reader survey. The methodology differed depending on the nature of each story, but typically centered on using data analysis tools to find patterns and associations in the data.

Technologies Used

I collected, cleaned, analyzed and presented this data using R, Python, ArcGIS, Carto and Excel, among other tools.

Project members

Republic staffers Bree Burkitt, Dianna Nanez and John Paul McDonnall helped with various stories in this portfolio.
