13. The quality of data, statistics and records used to measure progress towards achieving the SDGs: a fictional situation analysis
John McDonald
This is a fictional situation report from the fictional nation of Patria that aims to illustrate the issues associated with managing the data, statistics and records used to measure the Sustainable Development Goals (SDGs) in lower- and middle-income countries. This approach makes it possible to present key issues without identifying individual countries, while allowing those countries to assess the extent to which the fictional situation matches their own realities. A close match will suggest greater relevance. It is hoped that this will increase awareness and understanding of the issues and their consequences, leading to concrete action.
Background
Patria is one of 193 countries that signed on to the SDGs initiative led by the United Nations. Within the government, the Ministry of Government Affairs (MGA) is responsible for coordinating the government's commitments under the initiative and for submitting the required statistics to the UN Statistics Division. The statistics are generated by the Patrian National Bureau of Statistics (NBS) and are based on its own in-house surveys, as well as on surveys and other data sources supported by individual ministries; they are used to measure the SDG indicators developed by the UN.
The government’s efforts to implement the Millennium Development Goals (MDGs) revealed significant weaknesses in the data and statistics used to measure their achievement. In a number of cases, it was found that the statistics produced to measure the indicators were flawed, often because the data used to generate them were also flawed. In some cases, it was possible to identify where and why this was so, but in too many others it was not, because supporting documentation was lacking. The records that should have documented the processes tended to be fragmented or missing altogether. It was embarrassing for the government when it was discovered that certain goals had not been achieved or when it became clear that the government’s statistics could not be trusted. As a result, several development partner organisations and private sector investors began to question whether to trust the statistics when deciding on the level of donor support to provide.
In order to avoid the same issues emerging in the SDG initiative, the MGA commissioned a situation analysis to assess the quality, completeness and integrity of the data and statistics used to measure SDG indicators. The government recognised that while issues associated with the quality and integrity of data and statistics were at least partially understood, the role of records was poorly defined. This helped shape the scope of the study, the analysis of issues and the development of suggested strategies. The following key issues were identified:
•the quality and integrity of statistics are based on the quality and integrity of the data input to the statistics
•the quality and integrity of data input to the statistics relies on the quality and integrity of the processes for collecting, processing, analysing and maintaining the data, as well as on the processes for producing and reporting the statistics
•the quality and integrity of the processes can be demonstrated by complete, authentic and accurate records of sufficient quality and integrity to provide evidence of decisions and actions supporting the processes.
The MGA retained an information management expert to undertake the situation analysis. In addition to improving the quality and integrity of the data and statistics used to measure the SDGs, the government expects that the analysis will also improve the data and statistics that support operational and administrative programmes. The result should be more accurate, complete, authentic, relevant and trustworthy data, statistics and records.
Organisation of the report
This report describes the results of the situation analysis. It defines the methodology for the study and the terms used, and it analyses the quality and integrity of the processes followed in collecting and manipulating data and producing statistics used to measure the SDG indicators, with an emphasis on the quality and integrity of the records that document the processes. The implications for achieving the SDGs are highlighted at both ministry and NBS levels.
The report suggests strategies for addressing the issues that have been identified by establishing a framework of policies, standards, systems and people supported by an effective management structure. The last section describes a series of maturity levels to help the government establish milestones for implementing the strategies, and it recommends immediate first steps.
Methodology
The consultant’s activities focused on:
•conducting research to identify current relevant initiatives underway at the international level
•conducting interviews and reviewing documentation to identify and describe the work processes and management frameworks for collecting, analysing and presenting data and statistics needed to measure the SDG indicators1
•conducting interviews and reviewing documentation to identify and describe the characteristics of the records (correspondence, documents, completed forms, data files, logs, and so on) needed to collect data and produce statistics as well as the supporting management frameworks
•analysing and assessing the level of authenticity, completeness, accuracy and integrity of data, statistics and records documenting the supporting processes
•identifying areas where the authenticity, completeness, accuracy and integrity of the data, statistics and records are at risk and why
•preparing an overview of the implications of risks for the government’s ability to deliver on SDG commitments
•proposing strategies to address the issues and reduce or eliminate risks
•defining a roadmap describing the way forward based on a set of progressively more sophisticated maturity levels
•providing a set of immediate next steps that can be taken to strengthen the management of data, statistics and records in Patria.
Definitions
When government officials were interviewed, including those responsible for data, statistics, records management, IT, audit and programme management, it was clear that their understanding of basic concepts such as data, statistics, records and processes differed. For some, the concept of data embraced all recorded information, regardless of physical form, from information recorded on paper or in electronic form, to the highly structured information recorded digitally on computer readable media. For these individuals, records in digital form, including email and reports, contained data that could be manipulated and exploited just as readily as the data recorded in highly structured computer-based data files. Records were just another form of data.
Others made a clear distinction between records and data. According to them, data were highly structured, codified information recorded in computer-readable form for processing and manipulation by computers. Records, whether in paper or electronic form, were information recorded with the primary purpose of documenting actions and decisions and serving as evidence to meet various accountability requirements. Records, for these individuals, were static, never-changing documents, rather than data that could be manipulated. Some held an even narrower view, believing that records were the paper files they themselves used, while data were what was held in the databases used by IT staff.
Given this range of views, it was decided to use definitions that reflected a balance, respected the scope and objectives of the study and were, as far as possible, based on authoritative sources. The following definitions resulted from considerable discussion among representatives from the various disciplines:
•data are, technically, recorded information regardless of physical media; for the purposes of this study they are defined as the set of values of qualitative or quantitative variables (recorded in multiple physical forms) that are generated, manipulated and analysed to support the production of statistics
•statistics are the results of manipulating and analysing data. They are a type of data. For the purposes of the study, they are the instruments used to measure the SDG indicators2
•records refers to recorded information produced or received in the initiation, conduct and/or completion of activities and that document those activities. They are a special form of recorded information or data. When well-managed, they comprise content, context and structure sufficient to provide evidence of the activities.3 Records are not simply correspondence or other documents generated to oversee management of the activity. They include all forms of recorded information, including data and statistics, that can serve to document the activity. This is why a data file can serve as both input to a set of statistics and, at the same time, as part of a series of records documenting the activity. It is all data, but the data take on different forms depending on the context and their purpose
•metadata refers to data that provide context for the data and statistics used to measure the SDG indicators and for the supporting processes. Metadata are also an important attribute of the records that document an SDG activity, such as the conduct of a survey, the analysis of data or the production of statistics. They describe the relationships among the records, providing a documentary trail of the activity, and place the records in the context of their creation, management and use. In short, metadata make it possible for the information in data, statistics and records to be understood, verified and used in context
•process refers to a collection of related, structured steps or tasks needed to achieve a specific service, product or goal.4 For the purposes of this study, it refers to the structured steps or tasks involved in collecting, processing and manipulating data to produce the statistics that are used to measure SDG indicators. These include, for instance, the steps involved in planning and approving the survey, designing and testing the survey methodology, conducting the survey, collecting the data, processing and analysing the data, producing and reporting on the statistics, and performing an evaluation of the entire exercise. Data, statistics and records are generated continuously throughout the process
•records management is the management function responsible for efficient and systematic control of the creation, receipt, maintenance, use and disposition of records.5 It enables ongoing capture and continued accessibility of high-quality, authentic, reliable, accurate, complete, relevant and timely records. This includes data files which, as part of a given documentary trail, must share these characteristics if they are to be trusted
•records management framework refers to the policies, standards and practices, systems and technologies, and governance structures for managing records. Just as policy frameworks govern the management of personnel, finances and security, a records management framework must be based on a government-wide policy. The records management framework is designed such that records in all of their different forms can play multiple roles.
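To make the metadata definition above concrete, the sketch below shows a minimal metadata record for a survey data file, with a check that the required contextual fields are present. The field names and the required set are illustrative assumptions, not an official Patrian or international schema.

```python
# Minimal sketch of a metadata record describing a survey data file.
# All field names here are illustrative assumptions, not a real standard.

REQUIRED_FIELDS = {
    "file_id", "title", "creating_unit", "activity",
    "date_created", "source_records", "format",
}

def validate_metadata(record: dict) -> list:
    """Return a sorted list of required fields missing from the record."""
    return sorted(REQUIRED_FIELDS - record.keys())

survey_file_metadata = {
    "file_id": "HH-2023-RAW-001",
    "title": "Household services survey, raw data file",
    "creating_unit": "Research Division",
    "activity": "Survey supporting SDG indicator 1.4.1",
    "date_created": "2023-06-30",
    "source_records": ["survey design report", "coding sheets", "approval memo"],
    "format": "CSV, UTF-8",
}

missing = validate_metadata(survey_file_metadata)
print(missing)  # an empty list means the required context is present
```

A check of this kind could run whenever a data file is registered, so that files without documented context are flagged before they enter the statistical production chain.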
The roles of records include:
•providing evidence of how a ministry or a person conducted business including decisions, actions, non-decisions and inactions.6 For instance, records could be used to prove that a statistical survey was deliberately manipulated to show a favourable outcome
•enabling organisations to hold themselves accountable in relation to laws and practices. For instance, records documenting the process for extracting statistical data from a government database could be requested under an access to information law; the same records could also support an audit of a government programme responsible for extracting the data
•supporting individual rights and freedoms. For instance, records documenting processes for producing land settlement statistics could be used to locate original survey forms completed by individuals seeking to substantiate their claims
•being the source of qualitative and quantitative data that can be used for multiple purposes beyond those that led to the records’ creation. For instance, the master data file produced as a result of the annual health survey could be merged with census data and data from other sources to perform analyses not possible using the master data file alone.
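The last of these roles, reuse of data beyond its original purpose, can be illustrated with a small sketch. The district codes, field names and figures below are invented; a real merge would depend on documented, matching definitions of the join key in both files.

```python
# Sketch: merging a hypothetical health survey master file with census
# data on a shared district code, enabling analyses neither file supports alone.

health = [
    {"district": "D01", "clinic_visits_per_1000": 420},
    {"district": "D02", "clinic_visits_per_1000": 310},
]
census = [
    {"district": "D01", "population": 52000},
    {"district": "D02", "population": 87000},
]

# Index the census rows by the join key, then combine matching rows.
census_by_district = {row["district"]: row for row in census}

merged = [
    {**row, **census_by_district[row["district"]]}
    for row in health
    if row["district"] in census_by_district
]

for row in merged:
    print(row["district"], row["population"], row["clinic_visits_per_1000"])
```

Note that the merge silently drops districts present in one file but not the other; records documenting the key's definition in each source are what make such discrepancies explainable.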
Analysis
This section describes issues that present high risks for the quality of data, statistics and records, based on an analysis of:
•processes that support collecting and analysing data used to produce statistics for measuring the SDG indicators and for disseminating the statistics themselves
•data and statistics generated by the processes
•records documenting the processes, the data and the statistics.
The government of Patria and the SDGs
Ministries are required to submit final versions of data files and statistics to the NBS as the basis for measuring specific SDG indicators. The Bureau, which serves as a coordinating hub, incorporates the data and statistics in reports that it submits to the MGA. It also collects, verifies, analyses and produces its own data and statistics. In some cases, it merges data from one ministry with data from other ministries, as well as with its own data, to produce statistics covering multiple SDG indicators. It may also undertake additional data processing to ensure that statistics reported to the MGA are presented in a consistent format. In turn, the MGA reviews the reports, confirms their acceptance and produces a summary report that it submits to the UN Statistics Division according to a predefined schedule.
Data collection and analysis at the ministry level
Twelve of 23 government ministries are responsible for collecting data used to measure the SDG indicators. A ministry may be responsible for one or several SDGs, or it may only support one of several indicators associated with a given SDG. The ministries may use one or more of the following methods to collect data and produce statistics.
Survey data
Survey data are collected through longitudinal (repeated observations of the same variables over short or long periods of time) or one-time surveys using questionnaires or interviews. Some are large, such as the census; others are small, such as household surveys. In some cases, ministries have designed entirely new surveys to meet the requirements of a specific SDG.
Surveys are generally designed and administered by research divisions in ministries. Typically, data are collected on survey forms distributed by mail or by contract survey staff to a sample of a target population. In a few cases, data are submitted online. The data are then transferred to coding sheets and input digitally to a raw data file. Several process files may be produced as the data analysis moves through various stages. Where the data contain personal information, anonymised versions may be created.
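Where files contain personal information, one common approach to producing an anonymised version is salted pseudonymisation: direct identifiers are replaced with a keyed hash so that records can still be linked across files without exposing identities. The field names and salt handling below are illustrative only; in practice the salt would itself need to be managed and documented as a record.

```python
import hashlib

# Illustrative salt; a real one must be kept secret and its custody documented.
SALT = b"keep-this-secret-and-documented"

def pseudonymise(respondent_id: str) -> str:
    """Replace a direct identifier with a truncated salted SHA-256 digest."""
    return hashlib.sha256(SALT + respondent_id.encode("utf-8")).hexdigest()[:16]

raw_row = {"respondent_id": "PAT-000123", "household_size": 5, "district": "D01"}
anon_row = {**raw_row, "respondent_id": pseudonymise(raw_row["respondent_id"])}

print(anon_row["respondent_id"] != raw_row["respondent_id"])  # True
```

Because the same identifier always hashes to the same value, longitudinal linkage across survey waves is preserved, which is precisely what plain deletion of the identifier would destroy.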
The resulting statistics are formatted into tables and embedded in various reports for distribution to a wide range of audiences in paper form or, in a few cases, in digital form via a ministry website. In some cases, especially for longitudinal survey results, the data file might be input to a database of data files from previous surveys. A customised report, together with a copy of the master data file produced as a result of the analysis, is forwarded to the NBS, which produces a standardised report containing the statistics and submits it to the MGA to report on progress towards meeting specific SDGs.
Records documenting decisions and actions relating to planning, designing and conducting surveys may be in various forms. These include emails, paper-based correspondence and reports about a given survey; process and master files created as a result of the survey; survey documentation, such as completed coding sheets, survey design reports and descriptions of the methodology; and records describing the business context for planning, designing and carrying out the survey. Together, these records, when well-managed, provide evidence that can substantiate the integrity and trustworthiness of the data and statistics used to measure relevant SDG indicators.
Examples of SDG indicators supported by survey data are:
•1.4.1 Proportion of population living in households with access to basic services
•5.1.1 Legal frameworks are in place, or not in place, to promote, enforce and monitor equality and non-discrimination on the basis of sex
•5.2.2 Proportion of women and girls aged 15 years and older subjected to sexual violence by persons other than an intimate partner in the previous 12 months, by age and place of occurrence
•6.1.1 Proportion of population using safely managed drinking water services
•10.1.1 Growth rates of household expenditure or income per capita among the bottom 40 per cent of the population and the total population
•11.1.1 Proportion of urban population living in slums, informal settlements or inadequate housing
•16.6.2 Proportion of population satisfied with their last experience of public services
•16.7.2 Proportion of population who believe decision-making is inclusive and responsive, by sex, age, disability and population group
Registration and administrative data
This type of data results from administrative activities, such as personnel and finance, or from operational registration activities, such as licensing. Personnel and finance data tend to be generated in relation to ministry-wide standards and procedures or to workflows associated with hiring and retaining staff, processing expenditure and preparing budgets. Work processes established for registration activities, such as licensing, vary, but most are well defined. For instance, in a typical licensing process, applications are received by the responsible ministry and reviewed for completeness and suitability; applicants are notified of whether or not they have been accepted; the information is processed and entered into a database; and licences are issued to successful applicants.
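The licensing process just described can be sketched as an ordered sequence of states, with each transition logged so that the process leaves a documentary trail of its own. The state names below are assumptions for illustration, not Patrian practice.

```python
# Sketch of a licensing workflow as ordered states, logging each transition
# so the process generates records as a by-product. State names are invented.

STATES = [
    "received", "reviewed", "applicant_notified",
    "recorded_in_database", "licence_issued",
]

def advance(application: dict, log: list) -> dict:
    """Move an application to the next state and log the transition."""
    current = STATES.index(application["state"])
    application = {**application, "state": STATES[current + 1]}
    log.append({"application_id": application["id"],
                "new_state": application["state"]})
    return application

log = []
app = {"id": "LIC-2023-0042", "state": "received"}
while app["state"] != "licence_issued":
    app = advance(app, log)

print(len(log))  # 4 recorded transitions
```

The point of the log is that statistics later derived from the licensing database (for example, licences issued per year) can be traced back to the individual transitions that produced each entry.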
Records documenting these processes may be in multiple forms. For instance, records documenting a licensing process might include emails, paper-based correspondence and reports about a given application for a licence; completed application forms; copies of notifications; completed data conversion forms or logs documenting entry of the data into a database; review and analysis documentation (to verify qualification for a licence); copies of approval and notification documents; and reports documenting review and renewal actions. Together these records provide evidence that verifies the integrity and trustworthiness of the data and statistics used to measure relevant SDG indicators.
Examples of SDGs supported by registration and/or administrative data are:
•2.3.1 Volume of production per labour unit by classes of farming/pastoral/forestry enterprise size
•3.6.1 Death rate due to road traffic injuries
•5.5.2 Proportion of women in managerial positions
•8.1.1 Annual growth rate of real GDP per capita
•9.1.2 Passenger and freight volumes, by mode of transport
•12.4.2 Hazardous waste generated per capita and proportion of hazardous waste treated, by type of treatment
•16.1.1 Number of victims of intentional homicide per 100,000 population, by sex and age
•17.1.1 Total government revenue as a proportion of GDP, by source
Scientific data
These include instrument readings measuring natural or physical phenomena, such as weather (for example, temperature, rainfall), geology (for example, soil composition, erosion) and hydrology (for example, water levels, pollutants). Data generated from instruments are stored within the instrument or transmitted to receivers that store the data separately. In the case of weather data, for instance, readings are taken on a regular basis from instruments located around the country. These are transmitted to a satellite, which relays the data to ground stations, where computers automatically convert the readings and merge them into a master database holding not only the readings but also the processed data that underpin weather reports. There is very little human intervention.
To take another example, water acidity measurements are taken annually by staff and volunteers for the natural resources ministry, using instruments that take water quality readings, including acidity levels, in selected areas of the country. These are recorded on coding sheets, submitted to the ministry, converted to digital form, analysed and used to produce a wide range of statistics, including the average marine acidity statistics used to measure SDG indicator 14.3.1. Some of the statistics are also combined with land use data to measure the impact of agricultural land use on levels of water pollution.
Again, records documenting these processes may be in multiple forms. From the water quality example, these might include emails, paper-based correspondence and reports about a given water quality activity; completed water quality measurement logs; completed data input forms; data verification logs; extract files (data files created from the master database); and report files describing statistics resulting from analysis of the data. Documentation on planning, designing and operating the water quality measurement process, the database, the programme administering the process and the database all form part of the documentary trail.
Examples of SDGs supported by scientific data are:
•2.4.1 Proportion of agricultural area under productive and sustainable agriculture
•6.4.2 Level of water stress: freshwater withdrawal as proportion of available freshwater resources
•14.3.1 Average marine acidity (pH) measured at agreed suite of representative sampling stations
•15.1.1 Forest area as a proportion of total land area
Data and records issues at the ministry level7
The findings that follow are based on interviews with selected staff and on-site observations in the 12 ministries responsible for measuring the SDGs. Quotes from some of those interviewed for the study are included to illustrate the practical issues involved in measuring the SDG indicators reliably using official data and statistics.
In general, ministry staff tend not to recognise the need to build a documentary trail to support the processes of collecting and processing data and producing statistics. Often, they are unaware of:
•the kinds of records that need to be in place
•how the records can be related to one another
•where and how they should be organised and stored
•how ongoing accessibility should be managed.
In most of the ministries responsible for conducting longitudinal or one-time surveys, data files are not well described; documentation on data structures, coding and formats is fragmented; and data verification and quality control procedures are weak or non-existent. Little care has been taken to ensure that a documentary trail is in place to provide evidence of how surveys are designed and conducted, how data are collected and processed and how statistics are produced.
Manager in a research division:
The ministry wants us to document our surveys, but I don’t know what this means. The minister was worried about a sensitive data file that had errors and we couldn’t explain where the errors came from. It’s not because of us in the Research Division. We’ve tried to follow some data management standards and survey guides we found online. Now the ministry says we need to document things like why the surveys were done and how they were managed. That information is mostly with other people in emails and memos that I don’t see. Action officers in other divisions have that information on their desktops.
Large operational databases in the participating ministries tend to be well-managed, but data extracted from the databases to measure SDG indicators are often poorly documented. Records documenting data extraction tend to be fragmented or non-existent, and procedures for managing the data after they are extracted and used are generally poorly defined. In some cases, the lack of metadata makes it hard to understand the relationship between the extracted data and the source data in the database. Without records documenting changes made to the structure of the extracted data or to the definition of key fields, it is often difficult to know to what extent statistics are inaccurate or misleading.
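One lightweight remedy for the undocumented extractions described above is to emit an extraction record alongside every extract file: the query, the date, the row count and a checksum of the output. The schema below is an assumption for illustration, not NBS practice.

```python
import hashlib
import json
import sqlite3
from datetime import date

# Build a tiny illustrative source database standing in for an operational one.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE permits (id INTEGER, district TEXT, year INTEGER)")
conn.executemany("INSERT INTO permits VALUES (?, ?, ?)",
                 [(1, "D01", 2023), (2, "D02", 2023), (3, "D01", 2022)])

query = ("SELECT district, COUNT(*) FROM permits "
         "WHERE year = 2023 GROUP BY district")
rows = conn.execute(query).fetchall()

# The extraction record documents what was taken, when, and what it hashed to.
extract_bytes = json.dumps(rows, sort_keys=True).encode("utf-8")
extraction_record = {
    "source": "permits database",
    "query": query,
    "extracted_on": date.today().isoformat(),
    "row_count": len(rows),
    "sha256": hashlib.sha256(extract_bytes).hexdigest(),
}
print(extraction_record["row_count"])  # 2
```

Stored with the extract file, a record like this answers the questions the ministries currently cannot: which query produced the data, when, from which source, and whether the file has since been altered.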
Manager, IT division:
We had a senior managers’ meeting, and someone said we should store all our data with an outside service bureau that has better storage conditions than we do. Other managers agreed and said storing in the ‘cloud’ was the answer to storing the government’s data. But it’s too risky. We don’t know if it’s secure. I think we must keep our data in-house. Anyway, we would still have data quality issues. Storing outside is not the answer.
In a number of ministries, staff managing large databases are being asked to generate statistics to support measuring the SDGs. This is a new task, and many staff do not have the expertise needed to document the processes generating the statistics or to ensure quality control.
Database manager:
Senior management asked me to get statistics from our immigration database. I was told to send them to the NBS because they needed them for the SDG indicators. I can write a program to extract the data in a report, but I was told I needed to produce the statistics according to industry standards. I don’t know what that means. What are the right industry standards? I’ve talked to other IT managers, but no one seems to know anything about industry standards. We need training.
Given the need for rigorous standards for collecting and analysing scientific data, the quality of the documentary trail is somewhat better than for administrative and survey data. However, data reliability is undermined by failures to keep records of changes in the instruments used to make scientific measurements, of changes in sampling methods and of updates to metadata schemas.
Staff member, environmental monitoring division, environment ministry:
We monitor marine acidity at stations along the coast and take manual samples. But we don’t have trained staff to take samples, and equipment has been stolen from some sites. At others it has broken down. How are we supposed to generate good statistics? The ministry still wants us to use the data we have for the SDG indicator 14.3.1 – that’s the one about marine acidity. I’ve told the minister that our data are not good enough to do the analysis, but he wants us to try anyway.
Several of the surveys used to measure the SDG indicators lack sufficient documentation about the metadata schema supporting the surveys. There are inadequate definitions of key terms, which has led to confusion when interpreting some of the statistics generated from the surveys.
Official in the social development ministry:
The labour ministry uses a different definition of ‘employment status’ from us. I think we should include more people, like part-time street traders and part-time farmers, even children. How can we report statistics for employment if we are using different definitions? I’ve searched in our files to find out why we use our definition, but I can’t find any records. Maybe there are no records. I’ve asked the labour ministry where their definition comes from, and sent reminders, but I haven’t had a reply.
Problems in finding, retrieving and understanding data held in older data files for trend analysis purposes are hindering the government’s ability to regularly measure SDG indicators.
Official, agriculture ministry:
The chairman of one of our farmers’ associations asked us for data on crop production. He wanted the information for his members so that they could look at trends. He also asked the NBS, but we can’t find any data files earlier than four years ago. We don’t have any record of where the data files are stored, and all the staff who were involved have left.
Metadata describing the context for many data and statistical files tend to be incomplete. This makes trend analysis very difficult, and it also hampers responses to access to information requests and court challenges.
Government lawyer:
NOPA, that’s the National Oil Producers’ Association, says the government has sent the UN incorrect statistics. This is in connection with SDG 7 on energy. They’ve asked the government to provide the documentation on how the statistics were produced, but we can’t find the records. We’ve asked the records office and the action officers involved but no one can find anything. To be honest, I’m not even sure the methodology was properly documented.
Many organisational units across government are involved in developing statistics that support the SDG indicators. Multiple organisations may be involved in developing any given SDG indicator, from the initial planning for a survey or the extraction of data from a database, to the final submission of the statistics to the UN Statistics Division. In many ministries, it is practically impossible to bring together the complete story of measuring an indicator because each unit takes its own approach to capturing and classifying the records documenting its activities.
Official, labour ministry:
We’ve had a big problem with statistics for the SDGs initiative. We receive data from two other ministries and merge them with our own data to produce the statistics. Now there is an expert looking at how we produce the statistics. We gave him copies of the records we send to the NBS. He says the records are not good enough to document the processes in the other two ministries. The quality of the data can’t be trusted. He’s right. We can’t match their data to the records we keep, so the statistics can’t be trusted.
When one organisational unit passes data to another unit, if the units take different approaches to capturing and managing records documenting the processes they follow, it can be challenging if not impossible to bring together the complete story of how the indicators are measured.
Assistant secretary, social development ministry:
Our Research and Statistics Division (R&S) has complained that its statistics have been altered. R&S sent the statistics to our Communications Division for submission to the NBS and somehow the statistics were changed. The NBS sent the statistics on to the MGA. The ministry was supposed to send the statistics to the UN for SDG indicator 10.1.1, but that hasn’t happened because we don’t know why or how the statistics were altered. I’m trying to sort things out and get to the bottom of this, but no one seems to be able to find any record of why the statistics were changed.
Most ministries have assigned accountability internally for producing statistics to support the SDGs. However, often no one is accountable for classifying records that should document processes for collecting and analysing data and producing statistics or for ensuring that they are complete and accessible through time. Changes in methodology (for instance, in the sample size) and in definitions of key concepts (for instance the target object being measured) tend not to be well-documented. In a few ministries with long involvement in generating statistics there is documentation on survey methodologies (such as coding schemes and analytical techniques) and on conducting surveys (data verification checks, evaluations and audits) somewhere in the ministry, such as in the library. Even in these cases, however, there is seldom a link to the records, such as emails and correspondence, that document the conduct of the survey itself. As a result, the quality and completeness of the documentary trail for individual surveys and for the survey programme varies considerably.
Records management programmes do not exist in most ministries. The one exception is the Ministry of Health, which has a small records management unit with responsibility for managing all of the ministry’s records and ensuring that they are accessible through time. Unfortunately, the unit does not yet support the Health Statistics Division, which is responsible for generating statistics measuring several SDG indicators. The staff are on their own in managing records documenting their surveys.
Records manager, health ministry:
I only have three staff and none of us have professional qualifications. We have some training, but it only covers paper records. We keep asking for professional training or training in electronic records management. We see other people going for training, but not us. The ministry think we are here to manage the paper files and that’s our job, but I can’t deal with the records of the Health Statistics Division without more training. I don’t know anything about data files.
The government of Patria does not have a digital preservation strategy. Most IT staff believe that digital preservation means storing data securely but don’t recognise the importance of managing the metadata that will make it possible to access and understand the data through time. Nor do they realise that there is a need to convert data to new formats that new software can read or to generate and maintain complete and accurate records documenting these changes. Many look to the NBS for direction and guidance, and some have suggested that it should become a centre of expertise or even a storage centre for data files with long-term value. However, the Bureau lacks the necessary resources and expertise, as does the National Archives.
Head of IT in a large ministry:
I think our archiving strategy is sound; we back everything up on tape.
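The IT head's comment illustrates a common confusion: backup protects the bits, but preservation keeps data usable and verifiable through time. A minimal sketch of what preservation adds is a manifest that records a fixity checksum, the current storage format and every migration event for each data file. All names, fields and formats below are invented for illustration; they do not describe Patria's actual systems.

```python
# Hypothetical sketch: a preservation manifest, as distinct from a backup.
# It records a fixity checksum, the storage format and every format
# migration, so the file can still be verified and understood years later.
import hashlib
import json
from datetime import date

def fixity(data: bytes) -> str:
    """Return a SHA-256 checksum used to detect silent corruption."""
    return hashlib.sha256(data).hexdigest()

def make_manifest(name: str, data: bytes, fmt: str) -> dict:
    return {
        "file": name,
        "format": fmt,
        "sha256": fixity(data),
        "migrations": [],  # one entry per format conversion
    }

def record_migration(manifest: dict, new_fmt: str, new_data: bytes,
                     reason: str) -> dict:
    """Document a format migration instead of silently overwriting."""
    manifest["migrations"].append({
        "date": date.today().isoformat(),
        "from": manifest["format"],
        "to": new_fmt,
        "reason": reason,
        "new_sha256": fixity(new_data),
    })
    manifest["format"] = new_fmt
    manifest["sha256"] = fixity(new_data)
    return manifest

survey = b"id,sex,employed\n1,F,1\n2,M,0\n"
m = make_manifest("labour_survey_2023.csv", survey, "CSV")
m = record_migration(m, "Parquet", b"...converted bytes...",
                     "CSV reader being retired; converted with approved tool")
print(json.dumps(m, indent=2))
```

The migration log is the crucial difference from a tape backup: it is the record that explains why the bytes on disk today differ from the bytes originally collected.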
Senior officials in several ministries suggested that some or all activities involved in producing statistics for measuring the SDG indicators should be outsourced. They feel that if the resources and expertise aren’t available in-house, outside service bureaus and contractors should fill the gap. Others argue that the government needs to control its own production processes, assess the quality of its own data and be able to prove that the statistics it provides to the UN Statistics Division can be trusted. In their view, most companies don’t have the necessary expertise in any case.
Interviews in several ministries participating in the SDG indicators process revealed that sometimes the numbers are changed as the result of political pressure before statistics are provided to the NBS. This doesn't seem to happen often, but when it does, it usually isn't recorded. The combination of poor recordkeeping practices and corrupt actions on the part of government officials has significantly undermined the quality and trustworthiness of the statistics used to measure the SDGs.
Data and records issues at the NBS
The NBS maintains a large database that describes the demographic characteristics of the population, including sex, geographic location, education, employment status and income level. Many of the data are collected through the surveys managed by the NBS, with some provided by ministries based on their own surveys. In some cases, the NBS amalgamates data provided by several ministries to generate statistics on cross-cutting topics. In these cases, data submitted by the ministries are converted to formats and structures that can be matched with specific sets of demographic data from the demographic database and matched with other survey data files.
Regardless of their source, data held in the NBS are used to produce statistics that are then incorporated into report files and submitted to the MGA before being transmitted to the UN Statistics Division. The reports are in both digital and hard-copy form. Hard-copy reports are held in filing cabinets managed by the administrative assistant in the office of the director responsible for the demographic database. Digital versions of the report, together with any master data files, are held in the data library ‘forever’ and managed by the head of the IT area. Copies of the data from the ministries are also maintained in the library but disposed of after five years on the assumption that if the files are needed, they can be accessed through the respective ministries.
Statisticians and IT staff in the NBS understand the importance of providing quality data to support the SDGs. However, the lack of resources and of a records management infrastructure makes it difficult to document processes for collecting and processing data and producing statistics. This, in turn, makes it hard to ensure that data and statistics are of high enough quality and integrity to be used effectively.
Records documenting the design of the demographic database and the management of the data (data collection, processing, analysis and reporting) are poorly maintained, and there are no documentation standards.
Staff member, socio-economic statistics division, NBS:
Three ministries send us data for indicator 8.1.1. We convert the metadata to a standard format before we merge it with the census data. If we didn’t convert the metadata to a common standard, it wouldn’t match up. The problem is there are so many differences in the data, like spellings and names, that the statistics we produce are not very reliable. Also, the ministries are always changing their staff and how they do things.
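The harmonisation step this staff member describes can be made concrete with a small sketch: field names and variant spellings from each ministry are mapped to a common standard before merging, and every change is logged so the conversion itself leaves a documentary trail. The field names, mappings and log format below are invented for illustration.

```python
# Hypothetical sketch of harmonising ministry data before a merge.
# Mappings and sample records are invented, not Patria's real schemas.
FIELD_MAP = {          # ministry field name -> standard field name
    "Sex": "sex", "gender": "sex",
    "Region": "district", "district_name": "district",
}
VALUE_MAP = {          # variant spellings -> standard value
    "sex": {"female": "F", "Female": "F", "male": "M", "Male": "M"},
}

def harmonise(record: dict, source: str, log: list) -> dict:
    """Rename fields and standardise values, logging every change so the
    conversion can be verified later."""
    out = {}
    for field, value in record.items():
        std_field = FIELD_MAP.get(field, field)
        std_value = VALUE_MAP.get(std_field, {}).get(value, value)
        if std_field != field or std_value != value:
            log.append(f"{source}: {field}={value!r} -> "
                       f"{std_field}={std_value!r}")
        out[std_field] = std_value
    return out

log: list = []
rec = harmonise({"Sex": "Female", "Region": "North"}, "ministry_A", log)
print(rec)   # {'sex': 'F', 'district': 'North'}
print(log)   # two log entries, one per change
```

Without the log, a mismatch discovered later (a misspelt district, say) cannot be traced back to either the source ministry or the conversion step, which is exactly the problem the staff member reports.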
The NBS assumes that ministries are submitting data files and statistics of appropriate quality and integrity.
Staff member, socio-economic statistics division, NBS:
We sent some incorrect data to the MGA for indicator 2.3.1, but it was not our fault. The ministry said we must check the data before sending them, but it’s the ministries’ responsibility to check their own data. It’s not our job. Even if it was, we don’t have the documentation to verify the data. I am not sure if even the ministries have the documentation.
Sometimes the documentary trail is broken when statistical files are transferred from ministries to the NBS. Each participating ministry uses its own classification standards, which makes it difficult to get a complete picture of any given survey/data collection and analysis activity. The lack of evidence of the quality and integrity of statistics increases the risk that they could be flawed.
Senior official at MGA:
Some months ago, the environment ministry changed its definition of hazardous waste to make it wider. This meant that they had to make changes to their surveys and databases, and the way they produce statistics. Now the environment ministry has found out that the NBS has not been using the new definition in its reports for the UN. We have not been able to find any records about why the NBS is not using the new definition and none of the staff can explain it. We don’t know how this affects the data used to measure SDG indicator 12.4.2 and the related indicators.
Although the professional staff responsible for the demographic database are concerned about preserving the data, they do not feel equipped to tackle this complex issue.
Manager of the household surveys division:
I’ve read all of the literature and I think I know how I would go about developing a digital preservation strategy, but I don’t have the resources. I have too many other problems to address, and, in any event, the data and their supporting documentation are in a mess.
There tend not to be formal retention and disposition schedules.
Staff member, the socio-economic statistics division:
I am worried about our policy for deleting data. The last director general made up rules for how long we keep data in our division. We are supposed to keep anonymised master files and summarised versions forever. But input and process files must be deleted one year after we create the master files. I don’t know why he came up with this idea. I’ve raised it at management meetings and asked if we can look at it again. If we don’t keep the raw data, how can we demonstrate how we measured the SDG indicators, especially over time? It worries me.
Implications of the failure to establish a management framework
The implications of these issues for the government's ability to achieve the SDGs are:
•poorly managed records make it hard to verify the quality and integrity of data generated to measure SDG indicators; this will undermine the government’s efforts to report on progress to the UN and jeopardise its ability to make good use of the findings
•data can be flawed, but without a reliable documentary trail to reveal the flaw, it can go unrecognised. Without records as evidence, the government will find it difficult to demonstrate the data’s integrity or to trace where a flaw occurred
•flawed data from one source could skew the statistics provided to the MGA, even when the quality of the data from all other sources can be proven by the existence of properly managed records. This could lead to flawed statistics being inadvertently provided to the UN Statistics Division by the MGA
•the government could waste resources taking action to implement SDG findings based on data that lack integrity
•the quality of data collected through time may be eroded as more and more flawed data join the database. This could have significant consequences for the quality of the data and statistics used to measure SDG indicators in the future
•the loss of credibility due to flawed data could bring the quality of other data into question, which could be problematic without records to prove the quality of the processes followed
•in addition to the implications for measuring the SDG indicators and implementing the SDGs themselves, the impact of poor recordkeeping is likely to affect the government’s ability to carry out its mandated responsibilities
•individual rights can be compromised if data that individuals provided as part of a data collection activity (such as a survey) cannot be accessed, or if records documenting decisions about the collection and use of the data cannot be found
•national economic interests could be threatened if government policy and direction are based on flawed data and statistics or if the level of quality and integrity cannot be confirmed.
Strategies for sustainable solutions
This section focuses on strategies for developing a comprehensive and sustainable framework for managing data, statistics and records. The issues identified in the previous sections reflect weaknesses in the overall framework for managing data, statistics and records. Just as there are frameworks for managing human and financial resources, this framework should provide an integrated combination of laws and policies, standards and practices, systems and technologies, and people, supported by management and governance structures. A focus on symptoms, without considering the broader causes, will result in fragmented and ineffective strategies offering only short-term solutions. Inevitably, there will be issues needing urgent and immediate attention, but the focus should be on establishing a comprehensive and sustainable framework for managing the completeness, authenticity and trustworthiness of data, statistics and records.
The section is organised according to the components of the framework. The key issues and relevant strategies are described below for each component.
Laws and policies
Issues
•there is no law requiring the government to set up a records management programme. The access to information law provides the right of access to a wide range of government records, but it does not require the government to ensure that its records are authentic, accurate, complete and accessible. The Privacy Act requires that personal information be protected and retention standards applied, but there is no public pressure for this to be enforced
•apart from the Ministry of Health and the MGA, none of the ministries participating in measuring the SDGs, including the NBS, has a records management policy. At the health ministry, the policy focuses on managing paper records and does not yet address records in digital form. The policy for the MGA is limited to managing the paper records of the secretary and the executive committee
•there are some policies in place for managing data and statistics and conducting surveys, but they do not address the role of records in providing evidence to document survey and other data collection and processing activities.
Strategies
It is important that existing laws, such as a national archives act, data protection legislation, statistics act or other relevant legislation, should support the effective management of information needed to measure the SDGs and enable the government to achieve its operational and strategic goals and meet a wide range of accountability requirements, notably:
•ensure that the freedom of information or right to information law enables citizens to have the right of access to the data, statistics and records generated to support measuring the SDGs
•ensure that the Privacy Act gives citizens the right of access to their personal information as recorded in the data, statistics and records generated to support measuring the SDGs
•develop a government-wide policy on managing records as evidence that embraces data and statistics as high-quality sources of information for decision-making and for verifying the integrity of the processes involved
•strengthen policies for managing data and statistics to ensure that responsibility and accountability for documenting relevant processes are clearly defined and that there are provisions for managing data and statistics as part of the documentary trail of surveys and other data collection and analysis activities
•develop policies and guidance to protect personal information in relation to the data, statistics and records generated for measuring the SDGs
•ensure that in all contracts with private sector firms conducting surveys on behalf of the government, the contractor is obliged to document its activities, protect the data and statistics it generates, respect the government’s ownership of the data and statistics, and transfer all data, statistics and supporting records to the government when the contract is completed.
Standards and practices
Issues
•there is no guidance on how to document processes for collecting and processing data and producing statistics. Ministries establish their own practices for creating and managing documentary trails, which often are not complete, accurate and authentic. Variations in how these processes are designed and managed make it difficult to establish standard approaches to documenting them
•the NBS follows standards for managing survey documentation, such as code books and survey methodology documentation, but generally these standards and practices are not in place in the ministries. Even when survey documentation standards are applied, there is no way to link the documentation to the records, which are often in the form of emails and attachments that document decisions and actions about the management of the survey itself. Establishing a complete and comprehensive documentary record of the survey is impossible
•procedures are not in place for converting and sharing data across ministry boundaries. Achieving interoperability when there are multiple recording formats and diverse technologies is virtually impossible. For instance, the NBS must convert data and statistics it receives from ministries in order to provide statistics to the MGA in a standard format. There has been little effort to document these conversion activities, which means that flaws in the data that emerge at this stage are difficult to trace
•retention standards for data and statistics are rarely in place, and even when they have been established, they are not consistent across the data, statistics and records associated with a given process. Final statistical data files may be kept ‘forever’, but records documenting the circumstances of their creation may be destroyed much earlier. Digital preservation presents a huge challenge for any organisation, but it is possible to take preliminary steps, such as researching possible strategies and assessing needs. At present, there is little evidence that this is happening
•especially when several ministries are involved and records documenting a given process are in multiple forms, the difficulties of bringing together the complete story make it nearly impossible to establish a digital preservation plan covering all the records associated with the process.
Strategies
•develop criteria for identifying records that should be in place to document processes for collecting and processing data and producing statistics
•develop procedures to ensure that records documenting data and statistics activities are captured, managed and integrated with procedures for conducting surveys, analysing data, merging data and reporting statistics
•develop metadata standards and guidance for managing individual processes, linking records, data and statistics, and accessing data, statistics and records within and across processes and different media
•establish retention and disposition standards and guidance for all forms of records that document collecting and processing data and producing statistics
•monitor and draw from international work on digital preservation strategies and implementation plans for the long-term accessibility and integrity of data, statistics and records.
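The retention and disposition strategy above can be sketched as a simple schedule applied consistently across every class of material a survey generates, in contrast to the ad hoc one-year rule the NBS staff member worries about. The record classes and retention periods below are invented assumptions for illustration, not recommended values.

```python
# Hypothetical sketch of a retention and disposition schedule applied
# consistently across the records of one survey. Periods are invented;
# the point is that each class of material gets an explicit rule.
from datetime import date, timedelta

RETENTION = {                    # record class -> retention after creation
    "anonymised_master": None,   # None = retain permanently
    "summary_statistics": None,
    "input_file": timedelta(days=5 * 365),   # long enough to re-verify
                                             # the statistics derived
    "process_log": timedelta(days=5 * 365),  # same period as the inputs
}

def disposition_date(record_class: str, created: date):
    """Return the earliest date a record may be destroyed, or None if it
    must be kept permanently."""
    period = RETENTION[record_class]
    return None if period is None else created + period

assert disposition_date("anonymised_master", date(2020, 1, 1)) is None
print(disposition_date("input_file", date(2020, 1, 1)))  # 2024-12-30
```

Aligning the retention of input files and process logs with the statistics they support is what keeps the documentary trail whole: destroying the inputs a year after the master file is created, as in the current NBS rule, breaks it.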
Systems and technologies
Issues
•technologies for managing data and statistics are usually specific to the unit responsible and the kinds of data being managed. For instance, the technology for managing data in a database may be different from technologies for extracting data from the database and processing them as statistics to support SDGs. Once the statistics are passed to the NBS for further processing, other technologies may be used. Documenting the changes that take place from one technology environment to another is a significant challenge
•custom-designed databases are in place for managing survey documentation, but technologies have yet to be developed to manage the records of decisions and actions taken regarding surveys or other data collection activities. Nor are there systems for tracking how data are collected and processed and how statistics are produced. Records generated by these activities are not being identified, classified and managed.
Strategies
•use generally accepted IT project management standards to plan, design, test, implement and maintain systems for managing the authenticity, integrity and continued accessibility of statistics, data files and records across space and through time
•use internationally approved standards to develop functional requirements for managing statistics, data files and records and incorporate them into the requirements for designing IT systems
•develop audit and evaluation tools for assessing the quality and integrity of data, statistics and records supporting the SDGs; integrate them into standards and practices for systems and into management audits and evaluations.
People
Issues
•the NBS has some staff with professional expertise in managing data and statistics, but they do not have the records management expertise needed to manage records documenting the processes that generate data and statistics. Few people in ministries have this expertise. Records management staff in some ministries, such as the health ministry and the MGA, are generally only responsible for paper records. The NBS recently introduced a training programme for ministry staff responsible for collecting and processing data and producing statistics to support the SDGs. This will help, but at present there are no training materials on managing records in relation to data and statistics
•in some ministries there is a wide gulf between those responsible for technical aspects of the data (such as IT), and those responsible for processes that generate the data, statistics and records (such as programme managers); often each assumes that the other is looking after the requirement. This gap has serious consequences for the integrity and quality of data and statistics.
Strategies
•define the work involved in managing data, statistics and records used to support measuring the SDGs
•define competencies associated with the work
•design and implement appropriate training programmes
•design and implement appropriate recruitment programmes
•enhance tools and techniques for measuring performance so that competencies for managing data, statistics and records can be assessed
•establish programmes for allocating staff with the required expertise between ministries to fill competency gaps
•establish partnerships, including with organisations outside of the government, to pool human and financial resources for developing the framework
•work with relevant university programmes to enhance existing courses or develop new ones to address the management of data, statistics and records.
Management and governance
Issues
•although there are accountability frameworks for managing personnel and finance, they have not been introduced for managing records documenting how data are collected and processed and statistics are produced. Accountability has not been assigned for ensuring that a complete and accurate documentary trail is in place. Audit units in ministries measuring the SDGs don’t yet cover this issue in management and systems audits.
Strategies
•establish accountability and assign roles and responsibilities for staff at all levels to ensure the quality and integrity of data and statistics used to measure the SDGs
•establish an authority at a senior level of government with responsibility for ensuring that records are managed to support high-quality data and statistics across government.
Awareness
Issues
•some senior managers are beginning to recognise the importance of preserving data files and statistics, but few understand the crucial role that records play. Records documenting processes by which data files were created and used and documenting the data files themselves, such as coding schemes and storage formats, must be preserved if the data files are to be accessed in the future
•through time, as the demand for historical data to analyse trends grows, this lack of awareness will have greater implications. Few recognise that the issue needs to be addressed now, rather than in the future, when data files generated early in the SDG initiative may already be inaccessible. The initiative, which asks governments to measure indicators over a 15-year period, is bringing the issue into sharp focus
•few citizens are aware of their rights in relation to data collected about them in connection with the SDGs, and few have challenged the way the data are used. Government ministries have not yet felt the pressure to ensure the completeness and accuracy of the data, statistics and records for which they are responsible. However, there is growing citizen concern about these issues and growing awareness of the government’s inability to manage the personal information it holds, especially in digital form.
Strategies
•ensure that senior managers responsible for programmes and processes supporting measuring the SDGs are aware of key concepts, issues, implications and possible strategies
•develop tools and techniques for enhancing awareness, for instance briefings and brochures for relevant staff at all levels
•incorporate these tools and techniques in training and awareness programmes, including orientation programmes for staff, management seminars and workshops.
The Ministry of Public Administration (which manages the civil service) could be an appropriate agency to take the lead in establishing a framework to address the quality and integrity of the data, statistics and records used to measure the SDGs and, at a more general level, to support the requirements of government programmes for authentic, complete, accurate and relevant data, statistics and records for decision-making and accountability.
Implementing the strategies
Capacity levels to guide the way forward
A roadmap, in the form of capacity levels, will enable the government to move incrementally through defined stages to build the capacity needed to manage data, statistics and records in line with available resources. Five capacity levels are described below, the fifth level being an ideal state for a country that wants to ensure that data, statistics and records used to measure the SDG indicators are of a high enough quality to measure and implement the goals. For most organisations, achieving Level 5 or even Level 4 will be challenging.
The levels reflect diminishing degrees of risk, with Level 1 representing the highest risk of loss and inaccuracy and Level 5 the lowest. They also reflect increasing levels of sophistication in terms of the way data, statistics and records can be used to support implementation of the SDGs and, more broadly, the government’s operational and strategic goals. The roadmap for moving forward will support an objective and systematic approach. Examples are included in the capacity level descriptions, drawn from the targets and indicators supporting SDG 5:
•SDG 5: achieve gender equality and empower all women and girls
•SDG target 5.5: ensure women’s full and effective participation and equal opportunities for leadership at all levels of decision-making in political, economic and public life
•SDG indicator 5.5.2: proportion of women in managerial positions.
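Indicator 5.5.2 is a simple proportion, but the examples below turn on whether the inputs behind the figure survive. A minimal sketch, using invented sample records, is to report the numerator and denominator alongside the percentage so the published figure can later be verified against the source records.

```python
# Hypothetical sketch of computing SDG indicator 5.5.2 (proportion of
# women in managerial positions) from harmonised survey records, keeping
# the inputs that produced the figure. Sample data are invented.
def indicator_5_5_2(records: list) -> dict:
    managers = [r for r in records if r["managerial"]]
    women = sum(1 for r in managers if r["sex"] == "F")
    return {
        "indicator": "5.5.2",
        "value": round(100 * women / len(managers), 1),  # percentage
        # keep numerator and denominator so the figure can be verified
        # against the source records later
        "women_managers": women,
        "all_managers": len(managers),
    }

sample = [
    {"sex": "F", "managerial": True},
    {"sex": "M", "managerial": True},
    {"sex": "F", "managerial": True},
    {"sex": "M", "managerial": True},
    {"sex": "F", "managerial": False},
]
print(indicator_5_5_2(sample))  # value: 50.0 (2 of 4 managers are women)
```

The arithmetic is trivial; what distinguishes the capacity levels that follow is whether records exist to show how the underlying survey records were collected, coded and changed from year to year.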
Level 1: poor-quality data, statistics and records undermine SDG implementation
The organisation produces statistics to measure the SDGs, but they are unreliable. Professionals responsible for data, statistics and records lack the knowledge and skills needed to develop a reliable framework of policies, standards, practices and systems.
Example
Annual labour force data are collected through survey forms sent to companies across the country. Data are collected and analysed, and the resulting statistics serve multiple purposes, including measuring the proportion of women in management positions in support of SDG 5. The lack of metadata standards and the absence of records documenting how data were collected and processed and the statistics produced make it impossible to relate statistics from year to year. Data files from previous years are poorly organised and documented, so records of decisions, including changes in survey design, data-processing methods and data formats, are fragmented and scattered in multiple locations. The implications will not be known for some time, but without a reliable evidence base or the expertise to prove the trustworthiness of the data, the annual statistics cannot be relied upon as an accurate measure of SDG indicator 5.5.2.
Level 2: data, statistics and records enable basic SDG measurement
A framework of laws, policies, standards, procedures and people is in place to ensure that data and statistics are gathered and analysed to measure the SDGs. Managers are generally aware of their responsibility for ensuring that data files and statistics, with their supporting documentation, are stored properly. However, the framework is not applied universally, with some managers providing poorly documented data and statistics. There are no standards for documenting surveys and other data-gathering and analysis activities, nor have policies been developed for managing the records that should document processes for collecting and processing data and producing statistics. Records and data management professionals do not have the expertise needed to manage the interrelationships among data, statistics and records, especially those that need to be preserved through time.
Example
Annual labour force data, including data extracted from the labour force database to produce statistics for measuring SDG indicator 5.5.2, are collected and processed based on approved standards and procedures. However, emails, reports, logs and other records documenting the design and conduct of the survey, including changes in survey methodology, cannot be related to records documenting processes for extracting and analysing data and producing statistics. Records management professionals in the Labour Force Statistics Division responsible for measuring SDG 5.5.2 do not have the expertise needed to ensure that records, data and statistics are managed as a whole. The lack of a digital preservation strategy increases the likelihood that trend data needed to measure SDG 5 from now until 2030 will not survive.
Level 3: the quality of data, statistics and records makes it possible to measure SDGs effectively and supports government programme activities
Data, statistics and records generated to measure SDGs are managed through a comprehensive framework of policies, standards and practices, systems and technologies, and qualified people. Records management staff work effectively with data management and other professional staff to ensure that requirements for identifying, describing, classifying, protecting and retaining data, statistics and records are integrated in the design of processes for collecting data and producing and using statistics. Managers know that they are responsible for ensuring that the data, statistics and records generated are authentic, reliable, accessible and understandable and can be retrieved when needed. Professional staff apply clear, consistent standards and practices. However, preservation is not addressed adequately; retention requirements have not been established, metadata standards for data, statistics and records have not been developed, and preservation standards, procedures and technologies are not in place.
Example
All processes for generating statistics to measure SDG 5 are supported by the same framework of policies, standards and practices, systems and technologies, and people. For instance, data, statistics and records generated to measure SDG indicator 5.5.2 (the proportion of women in managerial positions) are well described, organised and managed to provide a comprehensive documentary trail of evidence. The statistics can be trusted because the comprehensive management framework itself can be trusted. Unfortunately, the lack of a digital preservation strategy means that while statistics measuring the participation of women in management positions can be compared for the past two years, the government cannot ensure the integrity of the statistics over the 15-year life of the SDG initiative.
Level 4: well-managed data, statistics and records make it possible to measure SDG implementation effectively and consistently through time; data and statistics are of high enough quality and integrity to support government programme activities at the strategic level
Data, statistics and records generated to measure the SDG indicators can be reliably merged or combined with other data sources to support programme activities, including those supporting the organisation’s strategic goals. Organisation-wide policies and standards are in place to protect records of decisions, and accountability requirements, for instance under access to information legislation, are supported by consistently applied records management policies and standards. Trends can be analysed through time, and comparisons can be made from year to year because changes to formats, coding schemes and data collection and analysis methods are well-documented. Preservation standards ensure that data, statistics and records are stored properly and migrated to take account of changes in technology. The preservation programme ensures continued accessibility and authenticity of data, statistics and records through time.
Example
Gender equality is a government strategic priority. Labour force data used to produce statistics for measuring the proportion of women in management positions (SDG 5) is being merged with statistics from the Ministry of Industry on female participation in various industry sectors to support the strategic goal. This is possible because of the way the data from both sources were formatted and described. The resulting database can be used to measure progress toward gender equality, while at the same time contributing to the statistics needed to measure SDG indicator 5.5.2. The comprehensive framework of policies, standards and assigned accountability ensures the integrity and trustworthiness of the data, statistics and records. A preservation programme dedicated to ensuring the authenticity and completeness of the increasing volumes of data and statistics makes it possible to perform complex analyses through time.
Level 5: processes generating data, statistics and records, and the framework for managing them, are designed to make it possible to exploit data, statistics and records, including those measuring SDGs, in new and innovative ways
Managers of SDG initiatives understand the benefits of sharing and exploiting data, statistics and records for stimulating innovative thinking on implementing the SDGs and achieving the operational goals of individual programme activities and the strategic goals of the organisation. Professional staff have the knowledge and expertise needed to design comprehensive management frameworks covering multiple organisations and technology environments that encourage information in the data, statistics and records to be exploited to the greatest possible extent.
Example
Employment data from several large private enterprises have been merged with the government’s labour force and employment data to create a government-industry database. The complex interjurisdictional processes are well-documented, the data are well-managed, and the statistics produced from the database can be trusted because the management framework can be trusted. Staff have the confidence to look for new and innovative ways to exploit the data, even as their volume and complexity grow. Innovative and advanced technologies are applied, and information is published in new forms to meet the needs of a wide range of individuals and groups and to give citizens access regardless of location. A wide range of statistical products serve multiple purposes, including not only the measurement of SDG 5 but also the management of the government’s commitments in support of the Open Government Partnership’s agenda on gender equality.
First steps
Rather than trying to work on everything at once, it is suggested that the government should start by identifying and defining solutions for a few processes where weak management of data, statistics and records has significant implications for achieving the SDGs. This experience will then inform the development of the framework.
Identify a leader and assemble a team
Given the MGA’s leading role in the SDG initiative, a senior official in the ministry should oversee the initiative. This person should have a background in data management, statistics, information technology or records management, the capacity to bridge these disciplines and the ability to communicate with a variety of stakeholders, including senior management.
A steering committee should be appointed, made up of representatives from government programmes supporting the SDGs as well as programmes where the quality and integrity of data, statistics and records is particularly important. Specialists in managing data, statistics, records and information technology, as well as legal experts and auditors, should also be included. The committee should help select the SDG processes to be covered, identify issues and strategies, and explore how to extend the results to other SDGs.
Some government officials have argued that attempting to build a comprehensive framework is creating a ‘mountain out of a molehill’ and that the focus should be on addressing immediate issues associated with specific SDG initiatives. Others have realised that systemic issues need to be addressed across government as a whole. This tension between the need to address immediate and critical problems and the goal of developing comprehensive and sustainable solutions needs careful management. One way to address the tension is to focus on specific, carefully identified processes in order to gain knowledge and skills that can be extended to other processes or used in developing a comprehensive management framework.
Identify processes as examples
For each of the three process types (survey, registration/administrative and scientific), identify one or two processes that present significant challenges for measuring one or more SDG indicators and for using data, statistics and records for operational and programme delivery. These are likely to be processes where undocumented flaws or inaccuracies in data, statistics and/or records have led to embarrassment, bad decisions about the use of government resources, missed opportunities or increased risk and costs.
Describe the selected processes
The description should cover the stages of generating the data, statistics and records and of managing a given process:
•the stages of a survey process are likely to include planning and approving the survey, designing the survey methodology, designing the data collection tools and techniques (such as survey forms), testing the survey methodology, conducting the survey, analysing the results, reporting the findings and reviewing how the survey was conducted
•in the case of a registration/administration process, the stages are likely to reflect the stages of the systems development life cycle, including planning the system, defining functional requirements, designing the system and database, testing the design, implementing the system and the database, maintaining the system and database, and evaluating the extent to which the system and database follow the stated requirements
•in the case of a scientific process, the stages would include planning the project, assessing data collection methods and technologies, designing the process, testing data collection and measurement tools, procedures, analytical techniques and statistical reporting methods, implementing and maintaining the process, and reviewing/evaluating the project.
It should be possible to identify the data, statistics and records created at each stage. The aim is not to describe every stage and every data element, statistic and record for a given process but to identify the key stages of the process and the associated data, statistics and records that are significant in terms of measuring an SDG and providing a complete and authentic documentary trail of the process.
Finally, the overall framework for managing both the process and the data, statistics and records should be reviewed. Policies and standards are particularly important, as is the governance structure (who is accountable to whom for what). This will provide a template for analysing the quality and integrity of the process itself and the data, statistics and records it generates.
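One lightweight way to capture such a description is a stage-by-stage inventory that pairs each stage with the records it is expected to generate, so gaps in the documentary trail become visible. The sketch below uses the survey stages listed above; the record titles and the check routine are illustrative assumptions, not a prescribed standard:

```python
# Illustrative inventory of a survey process: each stage is paired with the
# records it is expected to generate (record titles are hypothetical).
survey_process = [
    ("planning and approval",  ["approval memorandum", "terms of reference"]),
    ("methodology design",     ["sampling methodology report"]),
    ("data collection design", ["survey form templates", "coding scheme"]),
    ("testing",                ["pilot test results"]),
    ("conducting the survey",  ["raw response data files"]),
    ("analysis",               ["cleaned data set", "analysis scripts"]),
    ("reporting",              ["statistical tables", "final report"]),
    ("review",                 ["lessons-learned report"]),
]

def missing_documentation(process, held_records):
    """Return the stages whose expected records are not all present,
    i.e. the gaps in the documentary trail."""
    gaps = []
    for stage, expected in process:
        absent = [r for r in expected if r not in held_records]
        if absent:
            gaps.append((stage, absent))
    return gaps

# Example: the records actually held cover only some stages.
held = {"approval memorandum", "terms of reference", "raw response data files",
        "statistical tables", "final report"}
gaps = missing_documentation(survey_process, held)
for stage, absent in gaps:
    print(stage, "->", absent)
```

An inventory of this kind makes the review of the management framework concrete: each missing record points to a stage where accountability for documentation was not assigned or not met.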
Identify issues and implications
It should then be possible to analyse the issues, distinguish between symptoms and causes, and identify solutions. For instance, a poorly documented data file used as input to a set of statistics that turned out to be flawed is a symptom. The cause was the failure to establish metadata and documentation standards at the planning and design stages of the process and to assign accountability for implementing them as part of the management framework for the process. In identifying the issues, it is important to distinguish between immediate issues particular to measuring a given SDG indicator and issues related to the broader management framework for the organisation as a whole.
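The cause identified above, missing metadata and documentation standards, can be made concrete with a minimal sketch: a required-fields check applied to a data file’s descriptive metadata before the file is accepted as input to statistical processing. The field names here are assumptions for illustration, not a published schema:

```python
# Minimal sketch: validate that a data file's descriptive metadata meets a
# required-fields standard before the file is used to produce statistics.
# The field names are illustrative, not drawn from any official schema.
REQUIRED_FIELDS = {
    "title", "producing_unit", "collection_method",
    "coding_scheme", "date_range", "responsible_officer",
}

def metadata_gaps(metadata):
    """Return the required fields that are missing or empty."""
    return sorted(f for f in REQUIRED_FIELDS if not metadata.get(f))

# A poorly documented file: no coding scheme and no responsible officer,
# so flaws in the resulting statistics could not be traced back to a source.
incoming = {
    "title": "Labour force survey, Q3 extract",
    "producing_unit": "National Bureau of Statistics",
    "collection_method": "household survey",
    "date_range": "2017-07/2017-09",
}
gaps = metadata_gaps(incoming)
if gaps:
    print("rejected; missing:", gaps)
```

Enforcing such a check at the planning and design stages, with accountability assigned for it, addresses the cause rather than the symptom.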
Finally, issues should be explained in a way that programme managers responsible for generating SDG statistics can understand. A key idea that should be reinforced continually is that where data, statistics and records are flawed, where their accuracy cannot be established and where they are lost or destroyed, the credibility of the manager responsible for the data, statistics and records will be undermined irrevocably. By extension, society’s trust that the government is capable of carrying out its obligations, including achieving the SDGs, will be eroded significantly.
Develop strategies for resolving issues
Understanding where symptoms and causes are located on the roadmap and how they relate to one another will be helpful in developing integrated strategies within the context of the overall management framework. Most of the strategies should focus on the planning and design stages of the survey, system or other data collection and analysis activity, when the steps in the strategy can be integrated more easily and more cost-effectively. For instance, the need to develop and apply enhanced metadata standards and procedures for enabling data, statistics and records to be related to one another should be acknowledged and addressed at the planning stage, with subsequent stages incorporating the testing, implementation and assessment of the standards and procedures themselves. This approach to developing and implementing strategies can be applied to any process, from small one-time surveys to large IT systems supporting continuously updated databases from which data and statistics supporting SDGs are extracted.
Apply the experience to other processes and to the framework for managing data/statistics/records
The approach should result in strategies that can be applied to all processes, not just those measuring the SDGs but any process supporting the government’s programmes and services where data, statistics and records essential to decision-making and the ability to meet accountability requirements are being placed at risk. In parallel and over the longer term, the results will be invaluable in developing a comprehensive, policy-driven, standards-based framework for managing data, statistics and records, regardless of the process or business function.
Ultimately the goal is to build a comprehensive management framework to cover all government programmes and services and to allow the government to demonstrate that the data, statistics and records it generates can be trusted. The outcome should be that the government is able to demonstrate, through the availability of complete, accurate and relevant data, statistics and records, a high level of credibility, both to Patrian people and to international partners, investors, development agencies and other international organisations, including the United Nations.
1. Information for this step was derived from the SDG Indicators and Metadata Repository, United Nations, 2017, https://unstats.un.org/sdgs/metadata/.
2. According to the Inter Agency and Expert Group on Sustainable Development Goal Indicators, ‘official data’ refers to a set of values of qualitative or quantitative variables, which are produced and/or disseminated by an official source such as the National Statistical Office or another governmental agency or department including non-traditional types of data. ‘Official statistics’ means a part of official data, which is produced in compliance with ‘Fundamental Principles of Official Statistics’. See: ‘Guidelines and Best Practices on Data Flows and Global Data Reporting for Sustainable Development Goals’, 9 November 2017, p. 4, https://unstats.un.org/sdgs/files/meetings/iaeg-sdgs-meeting-06/20171108_Draft%20Guidelines%20and%20Best%20Practices%20for%20Global%20SDG%20Data%20Reporting.pdf.
3. Derived from definitions provided by the International Council on Archives, http://www.ica.org/en.
4. Derived from Wikipedia, https://en.wikipedia.org/wiki/Business_process.
5. For a complete definition, see ISO Standard 15489, Records Management, which states that: ‘records management is the field of management responsible for the efficient and systematic control of the creation, receipt, maintenance, use and disposition of records, including processes for capturing and maintaining evidence of and information about business activities and transactions in the form of records’.
6. Derived from State of Queensland – Department of Public Works, Glossary of Archival and Recordkeeping Terms (Queensland: Queensland State Archives, 2010), as described in International Council on Archives Multilingual Terminology.
7. Information for this section was inspired by M. Jerven’s Poor Numbers: How We Are Misled by African Development Statistics and What to Do About It (Ithaca, NY: Cornell University Press, 2013).
8. It is important to note the difference between accountability and responsibility: accountability is always upward to someone; responsibility is for something (to be done).