Bxl’air bot was officially launched on 25 April 2017 to provide live information about the state of air quality in Brussels, relying on public open data. It collects and stores data on air quality indicators (PM10, PM2.5, nitrogen dioxide, ozone and black carbon) and automatically produces natural language text, charts, maps and statistical analytics. This DDJ project is a one-year monitoring effort, aiming to observe exceedances of EU norms and WHO recommendations. As an object of journalism, it provides clear, summarized information about a situation that evolves over time. As a tool for journalists, it has provided raw material to the newsroom for a wider investigation into the causes and consequences of air quality issues in Brussels. The web application also automatically sends real-time alerts by email and via Twitter.
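The alerting logic can be sketched as a simple exceedance check. This is a minimal illustration, not the bot's actual code (which is in PHP); the function and its rules are assumptions, though the 50 µg/m³ threshold shown is the EU daily limit value for PM10.

```javascript
// Illustrative exceedance check; the real bot's rules are assumptions here.
const EU_PM10_DAILY_LIMIT = 50; // µg/m³, 24-hour average (EU limit value)

function checkExceedance(pollutant, dailyAverage, limit) {
  if (dailyAverage > limit) {
    // An alert message like this could be sent by email or posted to Twitter.
    return `Alert: ${pollutant} daily average of ${dailyAverage} µg/m³ exceeds the limit of ${limit} µg/m³.`;
  }
  return null; // no exceedance, no alert
}

console.log(checkExceedance("PM10", 62, EU_PM10_DAILY_LIMIT));
// → "Alert: PM10 daily average of 62 µg/m³ exceeds the limit of 50 µg/m³."
```

A check of this kind, run after each data retrieval, is enough to drive both the email and the Twitter alerts described above.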
What makes this project innovative?
This project is the first news bot launched in Belgium and was developed in the context of a PhD thesis. The news bot runs by itself, doing work that would be time-consuming for journalists: the newsroom has no need to compute figures or statistics, as everything is provided by the tool. For audiences, it is the single point where this information can be found; the authorities do communicate about air quality, but their data are scattered and not presented in a journalistic way, which can leave audiences confused between optimistic official communication and scientific communication. This "objectivation by numbers" helps to follow the air quality situation in Brussels in real time, as well as to see how it has evolved over time.
What was the impact of your project? How did you measure it?
Audiences have retweeted and talked about it, as have citizen organisations campaigning for an improvement of the situation. There were press articles, conferences presenting the project to professionals and a published academic paper, and the project was also cited in an academic paper about the state of air quality in Brussels. The project is not yet finished: next month, a special issue of Alter Echos will be published based on the data collected. It has also helped publicise the work of Alter Echos's small newsroom.
Source and methodology
Data quality was a central concern in this project. The development relies on a methodology developed within a research project, aiming to meet both the technical and the journalistic challenges. A conceptual framework was first developed to assess data quality through a combination of deterministic and empirical quality indicators. While data quality is a multidimensional concept, the goal was to establish how it can fit the needs of journalistic projects. Formal quality indicators are essential when data are collected and/or processed automatically: this is the technical challenge. Empirical indicators are equally essential with regard to professional practices: this is the journalistic challenge. This research has also demonstrated how and why data quality literacy can meet and support journalistic requirements. The framework was then applied in an operational way to prevent errors and anomalies.
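Deterministic quality indicators of the kind the framework formalizes can be sketched as simple automated checks on each incoming reading. The field names, bounds and freshness rule below are illustrative assumptions, not the project's actual indicators.

```javascript
// Illustrative deterministic quality checks (field names and bounds are
// assumptions, not the project's actual framework).
function validateReading(reading) {
  const issues = [];
  if (reading.value == null) {
    issues.push("missing value");
  } else if (typeof reading.value !== "number" || reading.value < 0) {
    issues.push("implausible value"); // concentrations cannot be negative
  }
  // Freshness check: reject stale (>24h old) or future-dated readings.
  const ageHours = (Date.now() - new Date(reading.timestamp).getTime()) / 3.6e6;
  if (!(ageHours >= 0 && ageHours < 24)) {
    issues.push("stale or future timestamp");
  }
  return issues; // empty array = reading passes the deterministic checks
}
```

Running checks like these before storage is one operational way such a framework can prevent errors and anomalies from reaching the published output; the empirical indicators, by contrast, depend on editorial judgement and cannot be automated this way.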
It was developed in PHP and MySQL, applying NLG rules to produce written real-time information. Several JS libraries were also used (Highcharts.js and Leaflet.js for dataviz, Bootstrap.js and Masonry.js for the visual design of the news bot). Cron tasks run several times a day to retrieve and store the data. All of the development is thus based on open source material, customized and modified to fit the needs of the project. The news bot is hosted on a shared web server, which imposed constraints of its own; but even with limited means, it was launched and is still running.
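A rule-based NLG pipeline of this kind typically maps a measured value to a qualitative band and fills a sentence template. The sketch below illustrates the idea in JavaScript for brevity (the actual implementation is in PHP); the bands, labels and wording are assumptions, not the bot's real rules.

```javascript
// Sketch of a template-based NLG rule: map a concentration to a qualitative
// label, then fill a sentence template. Bands and wording are illustrative
// assumptions, not the bot's actual rules.
const BANDS = [
  { max: 20, label: "good" },
  { max: 50, label: "moderate" },
  { max: Infinity, label: "poor" },
];

function describe(pollutant, value) {
  // Pick the first band whose upper bound covers the value.
  const band = BANDS.find((b) => value <= b.max);
  return `Air quality in Brussels is ${band.label}: ${pollutant} is at ${value} µg/m³.`;
}

console.log(describe("PM10", 35));
// → "Air quality in Brussels is moderate: PM10 is at 35 µg/m³."
```

Because the rules are deterministic templates rather than free generation, the same cron task that stores a new reading can regenerate the written summary immediately and consistently.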
This project was developed with Alter Echos, represented by its editor-in-chief, Sandrine Warsztacki.