Palm Beach County public school leaders touted a survey to claim that their teachers and staff were increasingly optimistic about their schools’ direction.
But it turned out that it wasn’t true. An analysis by The Palm Beach Post’s Mahima Singh and Andrew Marra proved it.
In analyzing the survey for a digital presentation on the paper’s website, the reporters discovered major errors in the district’s results.
Stunned by the obvious discrepancies, Singh and Marra ran the numbers again. They wrote separate code in Python to check their data. They re-did the analysis in Excel. They calculated the data by hand.
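The story doesn't show the reporters' actual code, but the kind of cross-check it describes can be sketched in a few lines of Python. Everything here — the function name, the response categories, the tallies, and the published figure — is hypothetical, invented for illustration:

```python
# Hypothetical cross-check: recompute a school's "overall satisfaction"
# rate from raw response tallies and compare it to a published figure.
def satisfaction_rate(responses):
    """Share of responses that are 'satisfied' or 'very satisfied', as a percent."""
    positive = responses["satisfied"] + responses["very satisfied"]
    total = sum(responses.values())
    return round(100 * positive / total, 1)

# Made-up survey tallies for one school.
tallies = {
    "very satisfied": 40,
    "satisfied": 35,
    "dissatisfied": 15,
    "very dissatisfied": 10,
}

published_rate = 85.0  # hypothetical figure from a district report
recomputed = satisfaction_rate(tallies)
if recomputed != published_rate:
    print(f"Discrepancy: published {published_rate}%, recomputed {recomputed}%")
```

Running the same arithmetic independently in Python, in Excel, and by hand, as the reporters did, guards against the check itself being buggy.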
Then they confronted the school district.
At first, the district ducked.
District officials canceled two interviews. They conducted their own review and confirmed The Post’s findings, identifying two main problems with their own calculations.
District administrators acknowledged their mistakes, disclosing to Singh and Marra that their research staff had made a significant, previously undisclosed change to how they calculated the teacher-satisfaction rate, one that made the latest data look rosier than in previous years. At the same time, administrators said, a computer program built to compile survey results tallied them incorrectly, inflating the teacher satisfaction rates even further.
In the end, the district conceded that the reported spike in school morale was largely an illusion and issued an apology to the school district’s teachers and other employees via email and a recorded video.
What makes this project innovative?
This is the first time The Palm Beach Post has undertaken a project like this.
The Post’s Mahima Singh and Andrew Marra took what seemed to be a happy story about school district morale and found the truth. They forced the school district to own up to its mistakes and correct false impressions, and they gave readers a close, in-depth look at how teachers really feel about their schools.
What was the impact of your project? How did you measure it?
The data the district published on its website was inflated, and the same figures were used in school board presentations to paint a rosy picture of happy teachers.
When The Palm Beach Post first reached out, administrators and their data team canceled two interviews scheduled to discuss the discrepancies while they conducted their own review. That review confirmed The Post’s findings and identified two main problems: district researchers had made a significant change to how they calculated the teacher-satisfaction rate, and a computer program built to compile survey results had tallied them incorrectly, inflating the satisfaction rates. In the end, the district conceded that the reported spike in school morale was largely an illusion and apologized to the school district’s teachers and other employees via email and a recorded video.
Source and methodology
Based on the “Overall Satisfaction” rates, The Post created a relative ranking of the schools and divided them into four quartiles: “very happy,” “pretty happy,” “less happy,” “among the least happy.”
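A minimal sketch of that quartile grouping, using Python's standard library. The school names and rates are made up; only the four labels come from the story, and this is an illustration of the approach, not The Post's actual methodology code:

```python
# Hypothetical sketch: rank schools by "Overall Satisfaction" rate and
# divide them into the four quartile groups named in the story.
import statistics

# Made-up schools and satisfaction rates (percent).
schools = {
    "School A": 92.1, "School B": 88.4, "School C": 81.0, "School D": 79.5,
    "School E": 74.2, "School F": 70.8, "School G": 65.3, "School H": 58.9,
}

labels = ["among the least happy", "less happy", "pretty happy", "very happy"]

# statistics.quantiles with n=4 returns the three quartile cut points.
cuts = statistics.quantiles(schools.values(), n=4)

def quartile_label(rate):
    """Map a satisfaction rate to one of the four quartile labels."""
    for i, cut in enumerate(cuts):
        if rate <= cut:
            return labels[i]
    return labels[-1]

# Relative ranking: schools sorted from most to least satisfied,
# each tagged with its quartile group.
ranking = {name: quartile_label(rate)
           for name, rate in sorted(schools.items(),
                                    key=lambda kv: kv[1], reverse=True)}
```

With eight schools this puts two in each group; with real data, ties at the cut points would need a tie-breaking rule.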
The landing page of the web app lists all the schools sorted by satisfaction rating. The user can click the sort icon to change how the schools are listed, or type a school’s name to jump directly to it. Clicking a particular school opens that school’s page, which shows how the school compared with its previous year’s results, along with the detailed survey results and the specific questions asked.