Special Session 33


Title of session: Communication quality and engaging with users

Chair: Martina Hahn

Room: S3B Sukiennice

Time: 17:00 - 18:30

Date: 27 June

Session 33 - papers & presentations

Presenting Author | Abstract
Martina Hahn
e-mail: Martina.hahn@ec.europa.eu
Title: <<< Branding of official statistics: new evidence and recommendations >>>
Communicating the value of official statistics more effectively has been identified as a strategic need for statistical organisations. It is one of the key goals of the ESS Vision 2020 and is implemented through the DIGICOM project (Digital Communication, User Analytics and Innovative products). In this context, Eurostat has commissioned a study to gain a better understanding of users' perception of official statistics, as well as strategic and operational recommendations on communication. The study will look at general branding aspects, such as brand awareness (do users know the various brands?) and brand positioning (how do users perceive the brand relative to its competitors?). It will also cover users' views on the quality of European statistics (Do users associate European statistics with high quality? Which quality aspects are important to them? Are there quality gaps?). The study will be carried out from December 2017 to June 2018 in eight EU Member States. It will rely mainly on qualitative methods (focus groups and in-depth interviews with representatives of the main user groups), but will also include e-reputation analysis and a quantitative survey. This paper will present the methodology and early results of the study for Eurostat and the ESS as a whole.
Maja Islam
e-mail: maja.islam@ec.europa.eu
Title: <<< Engaging with users to modernise the dissemination of European statistics >>>
The modernisation of the dissemination of European statistics is driven by users’ needs and by the aim of facilitating access to official statistics. Following current trends, more visual and attractive content is used, while text is presented in a structured and concise way that addresses the most common user questions. Engaging with users serves as the foundation and impetus for any changes in this process. In 2017, several user research activities were launched at Eurostat as part of the DIGICOM project, an ESS project aiming to modernise the dissemination and communication of European statistics. The aim of these activities was to learn more about our users and their needs, and to obtain recommendations on how to modernise the dissemination of European statistics. Two methods were used: field studies and usability tests. In the field studies, 40 different users were interviewed over a period of six months. Users were asked about their profile and their use of statistics, and were observed as they interacted with a number of dissemination products on the Eurostat website. The outcomes were high-level recommendations on how to improve the dissemination products tested, and personas of the users of European statistics. The usability tests, conducted with smaller groups of users and focusing on fewer products, resulted in more specific recommendations to improve the usability of the tested products. In practice, this is a circular process: Eurostat proposes new or improved dissemination products to users, who then provide feedback; on this basis, recommendations are made which subsequently result in improvements to the products. Learning from users now will help Eurostat to disseminate better, custom-made products in the future. This presentation will include concrete examples of user feedback and its translation into improved dissemination services.
Sarah Tucker
e-mail: sarah.tucker@ons.gov.uk
Title: <<< What do ONS users want from quality and methods information? >>>
“There’s no substitute for talking to real users”
As National Statistical Institutes (NSIs), we must provide statistical quality and methods information that helps our users make better use of our data. When we decided to review our standard user-orientated quality report, the Quality and Methodology Information (QMI) report, to ensure that it continued to meet user needs, the first question we asked was: “How can we find out what our users really need from our quality and methods information?” The simplest answer, though not necessarily the simplest thing to do, was to ask them. With the help of an ONS user researcher we ran a series of user tests designed to:
- Discover who our main users are
- Help us understand which of the current QMI topics are most important to users
- Help create Quality Information User Profiles
- Gain detailed insight into what our users need from quality and methods information and what they use this information for
- Discover whether additional topics are needed to help users make better decisions about the data
- Test whether our interpretations were correct when redesigning the contents of the QMI

In this paper I will describe the methods we used to meet these goals and discuss the main findings from the user tests. I will share our newly created Quality Information User Profiles, which give insight into what different types of users need from quality and methods information to enable them to use the data with more confidence. I will give some detail about our users and what they told us about their needs. This included a few surprises, which required us to rethink some of our assumptions to ensure that we stayed on the right path. Finally, I will share the new contents list and briefly discuss the next steps.
Giorgia Simeoni
e-mail: simeoni@istat.it
Title: <<< Users’ engagement and national quality reporting at Istat >>>
Istat has recently reorganised the set of activities carried out to manage its relationship with users. The resulting framework is aimed at segmenting users (identifying profiles in order to classify users with different needs), collecting information needs, and assessing overall user satisfaction. A wide set of methods and tools is used to fulfil these objectives, ranging from traditional satisfaction surveys to consultations on specific topics and indirect analysis of users’ requests and feedback. The paper will first present this new comprehensive approach as well as the tools already in use. Secondly, the focus will shift to the results of user consultations in terms of the perceived quality of official statistics and metadata communication. Indeed, since 2014 the yearly Istat user satisfaction survey has also included two sections devoted respectively to disseminated metadata and product quality. Recent developments in the tools for reporting quality to users, designed to meet the user needs identified in previous analyses, will then be presented. In particular, the new detailed national Quality Reports, harmonised with ESS metadata standards and oriented towards expert and demanding national users, are currently being implemented. They fill a gap in reporting for users, which so far has been oriented towards non-expert users through the dissemination of Quality at a glance reports. They have also been developed in response to the recommendations of the last round of Peer Reviews on the ES Code of Practice. So far, the Istat quality reporting tools are tailored for traditional statistics based on survey and/or administrative data; however, the issue of documenting the new experimental statistics is emerging. Finally, a preliminary hypothesis on how to document this kind of statistics will be proposed, based in part on a review of what is already done in other statistical institutes.

