
On this page you can learn about the main types of data quality processes that can be built into digital reporting.

Digital Reporting and Data Quality

The purpose of the XBRL specifications is to provide a universal digital framework to support any kind of periodic reporting carried out by companies (and other kinds of organisations), no matter their size and no matter the complexity of the subject matter.

Learn more about the basics of digital reporting and the importance of data quality here.

Companies and their service providers need to understand that the way to achieve data quality in digital reporting is to rigorously test their reports at all stages of preparation. It is especially important to carry out these tests before handing a report over to a third party (such as an auditor), and essential to complete them before submitting the report to an exchange, a regulator or a market utility set up to receive digital reports. Many, if not all, of these tests are automated and should not be time consuming.

Types of automated testing

There are several steps that can be taken to ensure that digital reports prepared in XBRL are of high quality and accurately represent all of the disclosures that the company releasing them is making.

Built-in testing

The most basic of these is the application of the baseline tests that are built into XBRL to ensure that reports, extension taxonomies and report packages are correctly constructed and can be properly consumed. Every piece of XBRL software needs to apply these tests. You can be certain that your software is applying all of these tests correctly by checking that it is XBRL Certified Software.
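As an illustration only, the sketch below drives this baseline validation from a preparation script. It assumes the open-source Arelle processor is installed (providing the arelleCmdLine command); the flags and file names shown are assumptions and should be checked against your own tooling.

```python
import subprocess

# Minimal sketch: ask an XBRL processor to apply the baseline specification-level
# checks to a draft report before it is shared with anyone else.
# Assumes the open-source Arelle processor is installed ("arelleCmdLine" on the PATH);
# the file names and taxonomy package location are placeholders.
REPORT = "draft-report.zip"            # report package or Inline XBRL document
PACKAGES = "reporting-taxonomy.zip"    # taxonomy package(s) the report depends on

result = subprocess.run(
    [
        "arelleCmdLine",
        "--file", REPORT,              # load the draft report
        "--packages", PACKAGES,        # resolve the taxonomy from a package
        "--validate",                  # run XBRL / Inline XBRL validation
    ],
    capture_output=True,
    text=True,
)

# Review the validation log; any errors should be resolved before the report
# is passed to auditors or submitted to a regulator.
print(result.stdout or result.stderr)
```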

Filing Manual Rules

Some regulators provide filing manuals that set out rules that need to be applied before a report is submitted. These can be simple or relatively complex. Some relate specifically to issues such as authentication and non-repudiation, or to format-specific issues in a particular filing environment. Others also include taxonomy-specific automated rules (see below). Filing manual rules should generally be built into the software that companies use to create their reports and should normally be run every time changes are made to a draft report.
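The sketch below gives a flavour of how simple filing-manual-style rules can be scripted; the naming convention and package check it applies are hypothetical examples, not rules drawn from any particular regulator's filing manual.

```python
import re
import zipfile

# Illustrative filing-manual-style checks (hypothetical rules, not any specific
# regulator's requirements): a report package naming convention and a check that
# the package contains exactly one Inline XBRL document.
PACKAGE = "123400ABCDEFGHIJKL12-2024-12-31.zip"   # placeholder file name
NAME_PATTERN = re.compile(r"^[A-Z0-9]{20}-\d{4}-\d{2}-\d{2}\.zip$")  # LEI-date.zip

errors = []

if not NAME_PATTERN.match(PACKAGE):
    errors.append("Package name does not follow the LEI-YYYY-MM-DD.zip convention.")

with zipfile.ZipFile(PACKAGE) as zf:
    ixbrl_docs = [n for n in zf.namelist() if n.endswith((".xhtml", ".html"))]
    if len(ixbrl_docs) != 1:
        errors.append(f"Expected exactly one Inline XBRL document, found {len(ixbrl_docs)}.")

for message in errors:
    print("FILING RULE ERROR:", message)
```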

Taxonomy-Specific Automated Rules

The next step in improving data quality is the use of automated data quality checks. These are machine-executable rules that can be run over a report by anyone who has them. Typically these checks look for the kinds of errors that are all too easy to make but that computers are well suited to catching. They can (and should) be run by companies (and their software or service providers) before reports are submitted to the regulator or a regulated market. Equally, they can be (and typically are) run by the regulator or regulated market on receipt. To permit this, these kinds of automated rules are made public.
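A minimal sketch of such a rule, assuming a simple accounting consistency check (Assets = Liabilities + Equity) over hypothetical fact values, is shown below. Real rule sets are published by regulators and taxonomy authors in machine-executable form (for example as XBRL Formula rules).

```python
# Minimal sketch of a taxonomy-specific automated rule: check that reported
# Assets equal Liabilities plus Equity within a small rounding tolerance.
# The concept names and values below are hypothetical placeholders.
facts = {
    "Assets": 1_000_000,
    "Liabilities": 620_000,
    "Equity": 380_000,
}

TOLERANCE = 2  # allow for rounding in the reported figures

def check_balance(facts: dict[str, float]) -> list[str]:
    """Return a list of rule violations (empty if the rule passes)."""
    difference = facts["Assets"] - (facts["Liabilities"] + facts["Equity"])
    if abs(difference) > TOLERANCE:
        return [f"Assets differ from Liabilities + Equity by {difference}."]
    return []

for violation in check_balance(facts):
    print("DATA QUALITY ERROR:", violation)
```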

Some regulators (especially banking and insurance regulators and tax authorities) make extensive use of these kinds of rules. Others are introducing them slowly. The idea is to keep the simple mistakes out of digital reports so that users can focus on analysis.

The use of automated rules is a key way to ensure and enhance the utility of digital reports.

Other testing processes

Review Processes and Supporting Software

The automated steps outlined above, if baked into the processes used by companies in creating their digital reports, will identify a wide range of potential issues and greatly improve the quality of reports overall.

However, companies should be particularly focussed on the processes involved in the selection and use of taxonomy concepts (aka “tags”) in their reports. Automated tests can only help in this area to a certain extent.

Some software tools connect tightly to data sources within a company, which can help ensure that the right tags are selected and consistently applied.

However, most reporting taxonomies are large and complex, mirroring the disclosure choices available in accounting and corporate reporting. It is therefore necessary to have a range of measures in place to assess the suitability of tag selection.

Having multiple experts review tag selection, whether within the company or by engaging external experts, is very helpful.

XBRL review tools exist and can help by orchestrating the testing and review of element selection. Some provide a range of peer analysis and even machine learning features to help companies determine whether they have chosen the right tags in the right places.
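As a rough illustration of the peer-analysis idea, the sketch below flags tags chosen by a company that none of its peers use, as candidates for human review; the tag lists are hypothetical and real review tools apply far richer analysis.

```python
# Rough illustration of peer analysis over tag selection: flag concepts chosen
# by a company that none of its peers use, as candidates for human review.
# The tag lists below are hypothetical placeholders.
company_tags = {"Revenue", "CostOfSales", "OtherOperatingIncome", "UnusualCustomTag"}

peer_tag_usage = [
    {"Revenue", "CostOfSales", "GrossProfit"},
    {"Revenue", "CostOfSales", "OtherOperatingIncome"},
    {"Revenue", "GrossProfit", "OtherOperatingIncome"},
]

tags_used_by_any_peer = set().union(*peer_tag_usage)
unusual = company_tags - tags_used_by_any_peer

for tag in sorted(unusual):
    print(f"Review suggested: '{tag}' is not used by any peer in this comparison set.")
```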

Assurance and Audit

Auditors can (and in many jurisdictions must) conduct an independent review of digital reports. Although they will apply a range of automated tests, such as filing manual rules and taxonomy-specific automated rules, these are routine and should generally have been carried out by companies before the reports are provided to the audit team.

Auditors are well placed to review the judgement applied to markup decisions in company reports, as they understand the business of the company, the relevant accounting standards, and the official taxonomy that represents those standards. They can therefore assess whether the tagging carried out is accurate, appropriate and complete.

IFRS filers can and should apply the XBRL International Data Quality Review tests to improve the quality of their draft reports.