Session 5

Title of session: Quality measurement and reporting II

Chair: Alexandru Gherasim

Room: S3A Barbakan

Time: 09:00 - 10:30

Date: 28 June


Session 5 - papers & presentations


Presenting Author / Abstract
Javier Alcantara Ortega
e-mail: Javier.ALCANTARA@ec.europa.eu
Title: <<< Quality reporting in the ESS - State of play and next steps >>>
Member States have to provide Eurostat with reports on the quality of the data they transmit. Eurostat analyses the reports in order to assess the quality of the data and publishes a summary. The modalities, structure and periodicity of the quality reports depend on the sectoral legislation, but the general framework for quality reporting in the ESS is standardised: quality criteria and a reporting structure are well defined. Four main pillars sustain quality reporting. First, the legislation on quality in statistics, ranging from Article 12 of Regulation (EC) No 223/2009 to the various pieces of sectoral legislation. Secondly, the standards: ESMS (Euro-SDMX Metadata Structure, user-oriented), ESQRS (ESS Standard for Quality Reports Structure, producer-oriented) and SIMS (Single Integrated Metadata Structure, merging the previous two). The third pillar is the technical implementation across all statistical domains, and the fourth is guidelines and advice, with the ESS Handbook for Quality Reports as the flagship publication. The main challenge ahead is harmonising, as far as possible, the standards and guidelines used for quality reporting across domains. The ESS Handbook for Quality Reports should be the first and most important point of reference for all domains, giving plenty of examples of use for different data sources and statistical domains. More centralised work and coordination in each national statistical authority and in Eurostat are also needed to provide support, advice and staff training. Legislation concerning quality reporting must be harmonised across domains to the extent possible. Cooperation and interoperability of metadata systems must be enhanced, across domains and functions. Countries should also be strongly encouraged to publish their national quality reports.
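The stack of standards in this abstract can be pictured concretely. Below is a minimal Python sketch of a producer-oriented quality report keyed by the Article 12 quality criteria; the class and field names are illustrative inventions, not the official SIMS concept codes.

```python
# A minimal sketch of a producer-oriented quality report, loosely inspired by
# the SIMS idea of one integrated structure serving both the user-oriented
# (ESMS) and producer-oriented (ESQRS) views. Field names are illustrative.
from dataclasses import dataclass, field

# The quality criteria of Article 12 of Regulation (EC) No 223/2009.
ARTICLE_12_CRITERIA = [
    "relevance",
    "accuracy",
    "timeliness",
    "punctuality",
    "accessibility and clarity",
    "comparability",
    "coherence",
]

@dataclass
class QualityReport:
    statistical_domain: str
    reporting_country: str
    reference_period: str
    # One free-text assessment per quality criterion.
    assessments: dict[str, str] = field(default_factory=dict)

    def missing_criteria(self) -> list[str]:
        """Criteria not yet covered by the report (a simple completeness check)."""
        return [c for c in ARTICLE_12_CRITERIA if c not in self.assessments]

report = QualityReport("Structural Business Statistics", "DK", "2017")
report.assessments["accuracy"] = "CV of key estimates below 2 % at NACE section level."
print(report.missing_criteria())  # criteria still to be filled in
```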
Karin Blix
e-mail: kwb@dst.dk
Title: <<< Raising awareness and continuously improving quality in Statistics Denmark >>>
The quality management system in Statistics Denmark is built on three pillars – quality assurance of the documentation of statistics (quality reports), quality audits and our process model. The documentation of statistics is based on the Single Integrated Metadata Structure (SIMS). The central quality unit is responsible for quality assurance of the documentation of statistics, which is updated with every new publication of statistical products. The audit process is built around the European Statistics Code of Practice (CoP) and the Generic Statistical Business Process Model (GSBPM) and is conducted by the quality coordinator and a team of internal and external experts. The subject-matter units are presented with principles 4-15 of the CoP and asked to fill out a self-assessment form based on these principles; they are encouraged to consult the Quality Assurance Framework (QAF) during the self-assessment. The idea was to think big and start small. The first statistical products were chosen from each of the departments – economic, social and business statistics. In the second round of auditing, statistical products were chosen to involve as many middle managers as possible and in this way promote the CoP in every corner of the organisation. In the third round of auditing we went for central and more complex statistical products, also involving external experts in the process. The next step is to involve the users more intensely by conducting interviews in focus groups comprising central users of the statistical product. The paper describes the process of continuous improvement with examples and presents additional initiatives taken to raise awareness of quality in statistics.
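As a concrete picture of the self-assessment step, here is a minimal sketch of how a form covering CoP principles 4-15 could be represented and triaged. The rating scale and fields are assumptions for illustration, not Statistics Denmark's actual form.

```python
# A minimal sketch of a CoP-based self-assessment record, assuming one rating
# per principle. The rating scale and field names are hypothetical.
from dataclasses import dataclass

@dataclass
class PrincipleAssessment:
    principle: int           # CoP principle number, 4..15 in the audits described
    rating: int              # assumed scale, e.g. 1 (weak) .. 5 (strong)
    evidence: str            # free-text justification
    improvement_action: str  # agreed follow-up, if any

def weakest_principles(form: list[PrincipleAssessment], n: int = 3):
    """Return the n lowest-rated principles, to prioritise improvement actions."""
    return sorted(form, key=lambda a: a.rating)[:n]

form = [
    PrincipleAssessment(7, 4, "Methods documented in the quality report", ""),
    PrincipleAssessment(12, 3, "No variance estimates for register-based output",
                        "Develop accuracy indicators"),
]
print([a.principle for a in weakest_principles(form)])  # here: [12, 7]
```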
Carsten Schumann
e-mail: carsten.schumann@destatis.de
Title: <<< Coherence - Accuracy - Flexibility >>>
In European Business Statistics, the “number of active enterprises” and the “number of persons employed” are provided from two sources (“Structural Business Statistics” (SBS) and “Business Demography” (BD)) with equal populations and similar definitions. Consequently, users expect the same results. However, deviations occur because most Member States use sample surveys for SBS and administrative data processed in the Business Register (BR) for BD. In the past some deviations have been severe, and Eurostat launched initiatives to remove the gap. Coherence can be improved to a certain degree by calibrating results. However, it is difficult to decide on a benchmark: primary surveys have sophisticated quality assurance for high reliability of each characteristic, while the BR and the underlying administrative data have better coverage and allow the publication of small-area data. Unfortunately, the NACE code – a crucial characteristic for business statistics – is not always reliable from administrative sources, and SBS has to cope with non-response. Furthermore, there is a limit to the number of calibration constraints, and variables and breakdowns that are not calibrated can be biased. Clearly, this results in a delicate trade-off between the credibility of published results, their accuracy, and the flexibility to promptly produce additional results on newly arising user demand. Recent initiatives promoted “from stovepipe to data warehouse” and “reuse of existing data” to enhance flexibility and improve responsiveness while reducing response burden. Pilot projects have proven the vast potential of micro-data linking and are still only scratching the surface. Ex-ante restrictions imposed on results before publication might hamper these possibilities. This presentation aims to trigger a discussion about the goals and limitations of initiatives to improve coherence. Should we bend results to enforce coherence? Is coherence needed on every level of aggregation? How much gap can be explained to users? How will calibration affect accuracy and flexibility?
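The calibration the abstract refers to can be sketched in a few lines. Below is a minimal example of linear (GREG-type) calibration that adjusts design weights so that weighted sample totals exactly reproduce benchmark totals, e.g. counts from the BR. The data and variable names are illustrative, not from the paper.

```python
# Linear calibration: find g-weights w_i = d_i * (1 + x_i' lambda) such that
# sum_i w_i x_i equals the benchmark totals t_x. Adding more constraints makes
# the system larger and potentially ill-conditioned - the "limit to the number
# of calibration constraints" mentioned in the abstract.
import numpy as np

def calibrate(d: np.ndarray, X: np.ndarray, t_x: np.ndarray) -> np.ndarray:
    """Calibrate design weights d to benchmark totals t_x on auxiliaries X."""
    lam = np.linalg.solve(X.T @ (d[:, None] * X), t_x - X.T @ d)
    return d * (1.0 + X @ lam)

rng = np.random.default_rng(0)
n = 200
d = np.full(n, 50.0)                                   # design weights of an SRS
X = np.column_stack([np.ones(n), rng.poisson(10, n)])  # intercept + e.g. employment
t_x = np.array([10_000.0, 101_500.0])                  # benchmarks, e.g. from the BR
w = calibrate(d, X, t_x)
print(X.T @ w)  # reproduces t_x up to floating-point error
```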
Laurie Reedman
e-mail: laurie.reedman@canada.ca
Title: <<< A modest attempt at measuring and communicating about quality >>>
While we were mostly disseminating aggregate statistics derived from a sample survey, we used the sampling error as the backbone of quality reporting. Now we are moving towards disseminating aggregate statistics derived from non-survey methods, and disseminating micro-data products. What should we be communicating about the quality of these products? What do we as data producers need to know about the quality of data as it traverses our processing steps, and what do the users of our data products want or need to know about its quality? This paper explores these questions in the Statistics Canada context. We are looking for a common vocabulary and a standardised way to represent different aspects / attributes / dimensions of quality. In particular, we are exploring ways to measure and report accuracy, and how to reflect the impact on accuracy of processing steps such as data integration, imputation and disclosure avoidance. Finally, we look at how to summarise and communicate the accuracy of a data product in a way that informs use.
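For the classical survey setting the abstract starts from, here is a minimal sketch of the sampling-error-based quality measure: the coefficient of variation of an estimated total under simple random sampling without replacement. The figures are made up for illustration.

```python
# Estimate a population total from a sample and report its sampling error as a
# coefficient of variation (CV), the "backbone" quality measure for aggregates
# from a probability sample. Assumes SRS without replacement.
import numpy as np

def total_with_cv(y: np.ndarray, N: int) -> tuple[float, float]:
    """Estimated population total and its CV under SRSWOR."""
    n = y.size
    t_hat = N * y.mean()
    # Variance of the estimated total, with finite population correction.
    var_t = N**2 * (1 - n / N) * y.var(ddof=1) / n
    return t_hat, np.sqrt(var_t) / t_hat

rng = np.random.default_rng(1)
y = rng.gamma(shape=2.0, scale=500.0, size=400)  # e.g. turnover of sampled firms
t_hat, cv = total_with_cv(y, N=20_000)
print(f"estimated total {t_hat:,.0f}, CV {cv:.1%}")
```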
Heather Bergdahl
e-mail: heather.bergdahl@scb.se
Title: <<< Fulfilling the ES Code of Practice and assessments of quality in official statistics >>>
A new self-assessment of quality in official statistics has been launched for the Swedish System of Official Statistics, the results of which are reported annually to the Swedish government. The objective is to stimulate quality improvements in official statistics and provide a basis for the government to make a “Commitment on Confidence in Statistics”. In 2013 the Official Statistics Act in Sweden was amended to include quality criteria comprising relevance, accuracy, timeliness, punctuality, accessibility and clarity, comparability, and coherence (cf. Regulation (EC) No 223/2009, Article 12). In 2016 Statistics Sweden issued implementing regulations for the System as a whole, specifying an updated quality concept with five main components to be used to describe quality in the development, production and dissemination of official statistics. In 2017, additional regulations were issued in the form of a questionnaire with which statistical authorities annually carry out self-assessments of the quality of their official statistics. The questionnaire is based on the updated quality concept, and the focus is on change in product quality over time. An essential starting point in the assessment is the purpose of the statistics and users' information needs, which are sub-subcomponents in our updated quality concept. Statistical authorities work actively to strive towards compliance with the ES Code of Practice (CoP). In this context, a most relevant question has been raised as to how our new self-assessments assist us in our efforts towards this goal. Performing assessments is fundamental for making continuous improvements. The CoP refers to assessments in connection with several principles besides Principle 12, Accuracy. In this paper, we explore how our approach to assessing quality in official statistics can be a useful and potentially powerful tool for continuous improvement and a means to achieve quality improvements in official statistics as well as greater compliance with the CoP.
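To illustrate the questionnaire's focus on change in product quality over time, here is a minimal sketch that compares two years of self-assessed ratings and flags deteriorations. The component names and rating scale are assumptions, not the actual Swedish instrument.

```python
# Compare this year's self-assessed rating per quality component with last
# year's and flag components that deteriorated. Scale and names are assumed.
def quality_changes(prev: dict[str, int], curr: dict[str, int]) -> dict[str, int]:
    """Per-component change in rating (positive = improvement)."""
    return {c: curr[c] - prev[c] for c in curr if c in prev}

prev = {"relevance": 4, "accuracy": 3, "timeliness": 4}
curr = {"relevance": 4, "accuracy": 2, "timeliness": 5}
deteriorated = [c for c, delta in quality_changes(prev, curr).items() if delta < 0]
print(deteriorated)  # components needing follow-up, here ['accuracy']
```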

 
