Speed Talk Session 8

Title of session: Quality of multi-source statistics

Chair: Włodzimierz Okrasa

Room: S4A Mariacki

Time: 18:45 - 19:30

Date: 27 June

Speed Talk Session 8 - papers & presentations

Presenting Author / Abstract
Paweł Michalik
e-mail: pawel.michalik@nbp.pl
Title: <<< Balance of Payments quality management in Poland >>>
The BoP quality management process starts with the source data that must be collected from reporters. Special focus will be given to the clarity of reporting requirements, close contact with reporters, training provided to reporters, and sanctions for non-compliance. The second stage consists of collecting data from the Central Statistical Office of Poland and administrative data from other institutions. The quality of this cooperation has a direct impact on the quality of the statistical process. Institutional bodies in Poland that facilitate the exchange of knowledge and information will be described. The next stage, i.e. the compilation process, will be presented together with tools and techniques for the detection of outliers and other non-standard observations. The last stage of the process, i.e. the dissemination of statistics with modern tools and techniques dedicated to key user groups, will be described. Special emphasis will be given to the feedback received.
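As an illustration of the kind of outlier screening mentioned above (a minimal sketch, not the actual tooling used at NBP), a compilation step might flag non-standard observations with a robust modified z-score based on the median and the median absolute deviation:

```python
import statistics

def flag_outliers(values, threshold=3.5):
    """Flag observations whose modified z-score exceeds the threshold.
    The median and MAD are used because, unlike the mean and standard
    deviation, they are not themselves distorted by extreme values."""
    med = statistics.median(values)
    mad = statistics.median(abs(v - med) for v in values)
    if mad == 0:
        return [False] * len(values)
    return [abs(0.6745 * (v - med) / mad) > threshold for v in values]

# Hypothetical reported series with one suspicious observation (250.0)
reported = [102.0, 98.5, 101.2, 99.8, 250.0, 100.4]
print(flag_outliers(reported))
# → [False, False, False, False, True, False]
```

Flagged observations would then be referred back to reporters for confirmation or correction rather than being discarded automatically.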
Esa Katajamäki
e-mail: esa.katajamaki@luke.fi
Title: <<< Data validation in livestock statistics >>>
Natural Resources Institute Finland carried out a study on data validation of livestock statistics in 2017. The key goal of this project was to evaluate the validation methods currently in use and to make recommendations for improving them. The statistical process from administrative register to statistics, and the validation methods at its different stages, were developed during this project. Another objective was to evaluate and update revisions related to direct data collection. The livestock statistics validation methods currently in use were documented and their strengths and weaknesses evaluated. In the comparison with other agricultural statistics, the validation methods for administrative register data were compared to the methods used for agricultural structure statistics and horticultural statistics. For administrative materials, the project recommends implementing a unified process pursuant to the model developed in this project. Under the model, register data is first copied at unit level into the raw data table of the statistics. The data is then transferred at unit level to the basic data table. All discovered errors and missing data are corrected in the basic data table. At the next stage, the publishable sum-level data is calculated and transferred to the sum-level database table. A material revision report and different validators and validation rules are connected with each stage. In statistics production carried out using this process, the most important step in data validation is to ensure that the unit-level data in the basic data table is correct. The sum-level revisions performed after this are a final check that the data to be published is accurate. If data errors are discovered after publishing, the required corrections are made to the unit-level basic data.
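The staged raw → basic → sum-level flow described above can be sketched as follows (a simplified illustration under assumed field names like `unit_id` and `heads`, not Luke's production system):

```python
def to_basic(raw_records, corrections):
    """Copy unit-level raw data into the basic data table, applying
    corrections for discovered errors and missing values."""
    basic = []
    for rec in raw_records:
        fixed = dict(rec)
        fixed.update(corrections.get(rec["unit_id"], {}))
        basic.append(fixed)
    return basic

def validate_basic(basic):
    """Unit-level validation rule: head counts must be present and
    non-negative. Returns the ids of units that fail."""
    return [r["unit_id"] for r in basic
            if r["heads"] is None or r["heads"] < 0]

def to_sum_level(basic, key="region"):
    """Aggregate validated unit-level data to the publishable sum level."""
    totals = {}
    for r in basic:
        totals[r[key]] = totals.get(r[key], 0) + r["heads"]
    return totals

raw = [
    {"unit_id": 1, "region": "North", "heads": 120},
    {"unit_id": 2, "region": "North", "heads": None},  # missing in register
    {"unit_id": 3, "region": "South", "heads": 75},
]
# Errors are fixed in the basic table, never in the raw copy
basic = to_basic(raw, corrections={2: {"heads": 95}})
assert validate_basic(basic) == []
print(to_sum_level(basic))
# → {'North': 215, 'South': 75}
```

The key property mirrors the recommendation in the abstract: corrections land in the unit-level basic table, so the sum-level figures are always recomputable from corrected microdata.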
Zsolt Kovari
e-mail: zsolt.kovari@ksh.hu
Title: <<< Impact of estimated and administrative data on quality >>>
The growing interest in the service sector compels the use of model-based methods. These procedures have an impact on quality, and it is an interesting question how these effects can be measured and analysed. General methods are available to us, but applying them always poses new methodological challenges to some degree. The Hungarian Central Statistical Office has service turnover data from 2009 onwards using the same nomenclatures and classification systems. A new group was created in the Hungarian Central Statistical Office to process the VAT database and produce reliable administrative datasets for further use. How can the quality of the results be measured? The paper presents the challenges we faced in processing the VAT data, summarises and compares the tested methods, describes the main achievements, outlines a new methodology for the VAT data, and introduces the main effects on the quality of the estimated data.
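One simple way to quantify the effect of administrative data on the quality of model-based estimates (a generic sketch, not the HCSO's actual metric) is the mean absolute revision between the early estimate and the later VAT-based figure:

```python
def mean_absolute_revision(estimated, administrative):
    """Mean absolute percentage revision between early model-based
    estimates and the later administrative (VAT-based) figures.
    Zero administrative values are skipped to avoid division by zero."""
    pairs = [(e, a) for e, a in zip(estimated, administrative) if a != 0]
    return sum(abs(e - a) / abs(a) for e, a in pairs) / len(pairs)

# Hypothetical turnover figures for three periods
est = [100.0, 210.0, 330.0]   # model-based early estimates
adm = [98.0, 220.0, 300.0]    # later VAT-based administrative values
print(round(mean_absolute_revision(est, adm), 4))
# → 0.0553
```

A small, stable revision over time would indicate that the model-based estimates can be published with confidence before the administrative source becomes available.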
Elsa Dhuli
e-mail: edhuli@instat.gov.al
Title: <<< Implementation of a General Statistical Business Process Model in the administrative resource >>>
High quality in the production of official statistics is a contemporary demand that is growing dramatically in all countries, regardless of whether their inputs are statistical surveys or administrative sources. Increasing the use of administrative sources as a primary or substitute input first requires a first-rate quality assessment. Given that the common approach is to measure the quality of the input and of the statistical production process, this approach takes on other dimensions when the main source is administrative. Generally speaking, the quality of each processing phase up to the realization of the statistical product matters, but the quality of the collection phase of administrative information, and the implementation of statistical models by the data holders, also significantly underpin the quality of the other phases of the process. This paper aims to outline the implementation of a General Statistical Business Process Model in the administrative source system at some of the phases, in order to guarantee quality sufficient for use as a primary source for producing official statistics. Analysing the quality of each phase of the statistical process from the administrative source in light of the experience of developed countries, together with the recommendations given for further improvement, is a positive approach for enlargement countries.
Gemma Agostoni
e-mail: Gemma.Agostoni@ecb.europa.eu
Title: <<< Who’s telling the truth? Statistical techniques for error detection in double-sided reporting of money market transactions >>>
In 2016 the European Central Bank started collecting statistical data on money market transactions based on the Money Market Statistical Reporting (MMSR) Regulation. This granular dataset covers four segments of the euro money markets, namely unsecured, secured, foreign exchange swaps, and euro overnight index swaps. The detailed trade data to be provided comprise, amongst others, the volume, rate, and counterparty or collateral type information, together with the time when the transaction was conducted. On average, 45,000 transactions are received every day from the 52 largest banking institutions in the euro area. Different procedures to ensure high quality have been adopted, including the use of the ISO 20022 XML standard for the exchange of information. Particularly relevant are the methods that aim to reconcile the data reported by the different institutions. Both sides of a transaction (borrowing and lending) are reported by the parties involved. The lack of a unique transaction identifier poses significant challenges to the identification of the two sides of a single transaction. This paper presents different techniques, applied to MMSR, for pairing and matching the two sides of a transaction based on incomplete and partially incorrect information. Errors can be of several kinds: over-reporting and under-reporting of transactions, and misreporting. First, the matching exercise is conducted at macro level by comparing the total volume exchanged between each pair of counterparties. This allows the identification of institutions with possible under- or over-reporting issues. In a second step, absolute, partial, iterative, and fuzzy matching techniques for pairing and matching individual transactions are developed, which allow the automatic identification of out-of-scope or missing transactions, and of misreported values, which would not have been detected otherwise. The inconsistencies detected are then referred back to the specific reporting agents. These techniques have been critical in enhancing the quality of the MMSR, and may find applications in other datasets.
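The micro-level pairing step can be illustrated as follows (a deliberately simplified sketch with hypothetical field names, not the ECB's production algorithm): with no unique transaction identifier, candidate pairs are matched on the counterparty pair, trade date, and amount, while tolerating a small discrepancy in the reported rate (a basic form of fuzzy matching).

```python
def match_sides(lender_side, borrower_side, rate_tol=0.01):
    """Pair the lending-side and borrowing-side reports of the same trade.
    Returns (matched pairs, unmatched lender reports, unmatched borrower
    reports); unmatched reports on either side point to possible over-,
    under-, or misreporting to follow up with the reporting agents."""
    matched, open_borrows, unmatched_lends = [], list(borrower_side), []
    for lend in lender_side:
        hit = next((b for b in open_borrows
                    if b["cpty_pair"] == lend["cpty_pair"]
                    and b["date"] == lend["date"]
                    and b["amount"] == lend["amount"]
                    and abs(b["rate"] - lend["rate"]) <= rate_tol), None)
        if hit is None:
            unmatched_lends.append(lend)
        else:
            matched.append((lend, hit))
            open_borrows.remove(hit)  # each report can match at most once
    return matched, unmatched_lends, open_borrows

# Hypothetical reports: the second lend has no borrowing-side counterpart
lends = [
    {"cpty_pair": ("A", "B"), "date": "2018-06-27", "amount": 5_000_000, "rate": -0.360},
    {"cpty_pair": ("A", "C"), "date": "2018-06-27", "amount": 1_000_000, "rate": -0.400},
]
borrows = [
    {"cpty_pair": ("A", "B"), "date": "2018-06-27", "amount": 5_000_000, "rate": -0.355},
]
m, ul, ub = match_sides(lends, borrows)
print(len(m), len(ul), len(ub))
# → 1 1 0
```

In the paper's terms, the first pair matches despite the 0.005 rate discrepancy (misreporting candidate for follow-up), while the orphan lending report signals possible under-reporting by the borrower's institution.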
Susana Santos
e-mail: smsantos@bportugal.pt
Title: <<< Boomerang effect of quality control on the compilation of Financial Accounts and flow of funds - the experience of Banco de Portugal >>>
Financial Accounts are fundamental for monitoring financial stability by quantifying the impact of the financial decisions of a host of economic agents. In Portugal, the compilation of these statistics is the responsibility of Banco de Portugal. One of the main purposes of the Statistics Department of Banco de Portugal is to ensure that this statistical production meets high quality standards, aiming at fully meeting users' needs, by developing a wide set of quality control procedures. Financial accounts are derived statistics stemming from a vast array of other primary statistics, including balance of payments and monetary and financial statistics. In this context, Banco de Portugal set up a multidisciplinary team with experts from financial accounts and from the different underlying primary statistics. Under this arrangement, all team members are co-responsible for producing the national financial accounts following a bottom-up approach, thus improving both the quality of these statistics and the quality of the primary statistics. This is the result of a systematic, iterative process of data cross-checking and reconciliation, which may also represent an opportunity to validate the soundness of the microdata in a top-down fashion. To better understand economic sectors' interlinkages and to assess how intersectoral financial linkages have changed, the flow of funds is a powerful analytical tool.
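A basic consistency check of the kind used when reconciling flow-of-funds data (a generic sketch, not Banco de Portugal's actual procedures) exploits the fact that in a whom-to-whom presentation every claim is simultaneously one sector's asset and another sector's liability, so net lending summed over all sectors must be zero:

```python
def whom_to_whom_check(claims, tol=1e-9):
    """Consistency check on a whom-to-whom flow-of-funds matrix.
    `claims` maps (creditor, debtor) pairs to amounts. Returns each
    sector's net position and whether the system balances overall."""
    sectors = set()
    for creditor, debtor in claims:
        sectors.update((creditor, debtor))
    net = {s: 0.0 for s in sectors}
    for (creditor, debtor), amount in claims.items():
        net[creditor] += amount   # a claim is an asset of the creditor
        net[debtor] -= amount     # and a liability of the debtor
    return net, abs(sum(net.values())) < tol

# Hypothetical intersectoral claims (amounts in EUR millions)
claims = {
    ("Households", "Banks"): 120.0,
    ("Banks", "Government"): 80.0,
    ("Government", "Households"): 30.0,
}
net, balanced = whom_to_whom_check(claims)
print(net, balanced)
# → {'Households': 90.0, 'Banks': -40.0, 'Government': -50.0} True (dict order may vary)
```

A failed balance after integrating a primary source would be the trigger, in the boomerang spirit of the abstract, to push the discrepancy back to the underlying primary statistics for investigation.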
