Data validation

The purpose of data validation is to encourage best practices in data collection and reporting. It is a vital element in GRESB’s work to provide high-quality data to Investor Members and to the fund managers and companies completing the Assessments. This page provides an overview of the validation process; the Reference Guides contain more information on the process for each Assessment.

Validation is structured into two categories: automatic validation and manual validation. Automatic validation is integrated into the portal as participants fill out their Assessments, and consists of errors and warnings displayed in the portal to ensure that Assessment submissions are complete and accurate. Manual validation takes place after submission, and consists of document and text review to check that the answers provided in the Assessment are backed up by concrete evidence.

  • Automatic validation consists of errors and warnings displayed to participants as they fill out their Assessments. Some of the rules implemented are:

    Checks on existence:
    – Mandatory evidence uploads are present
    – Mandatory open text boxes are completed
    – Answers are present for all indicators

    Checks on data types:
    – Fields that should contain numbers, percentages, text, etc. only contain those data types

    Checks on logic:
    – Percentages must be between 0 and 100

    Participants are not permitted to submit their Assessments until all validation errors are resolved (a minimal sketch of such checks follows below).
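
To illustrate, the sketch below shows how checks of this kind can be implemented. It is a minimal sketch in Python: the indicator codes and field names (EN1, WT1, MA1, EN1_coverage_pct, EN1_evidence) are hypothetical placeholders, not GRESB’s actual portal fields or rules.

```python
# A minimal sketch of portal-style automatic validation, assuming
# hypothetical indicator codes and field names.

def validate_submission(answers: dict) -> list[str]:
    """Return the validation errors that block submission."""
    errors = []

    # Checks on existence: answers must be present for all indicators.
    for indicator in ("EN1", "WT1", "MA1"):  # hypothetical codes
        if not answers.get(indicator):
            errors.append(f"{indicator}: answer is missing")

    # Checks on existence: mandatory evidence uploads are present.
    if not answers.get("EN1_evidence"):
        errors.append("EN1: mandatory evidence upload is missing")

    # Checks on data types and logic: a percentage field must be
    # numeric and lie between 0 and 100.
    coverage = answers.get("EN1_coverage_pct")
    if coverage is not None:
        try:
            if not 0 <= float(coverage) <= 100:
                errors.append("EN1_coverage_pct: must be between 0 and 100")
        except (TypeError, ValueError):
            errors.append("EN1_coverage_pct: must be a number")

    return errors


# A submission with missing answers and an out-of-range percentage
# is blocked until every error is resolved.
print(validate_submission({"EN1": "1200 kWh", "EN1_coverage_pct": 130}))
```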

  • A comprehensive set of validation logic is also implemented for asset-level reporting. These rules mainly consist of logic checks on the relationships between different data fields in the asset portal; one such relationship is sketched below.
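
As an example of such a relationship check, the sketch below verifies that area fields reported for a single asset are mutually consistent. All field names here are hypothetical placeholders; the actual asset portal rules are documented in the Reference Guides.

```python
# A sketch of asset-level logic checks on the relationships between data
# fields. All field names are hypothetical placeholders.

def check_asset(asset: dict) -> list[str]:
    """Return logic issues found in a single asset's data."""
    issues = []
    total = asset.get("floor_area_total", 0)

    # The floor areas reported per use type cannot exceed the total.
    if sum(asset.get("floor_area_by_use", {}).values()) > total:
        issues.append("floor area by use exceeds total floor area")

    # Energy data cannot cover more floor area than the asset has.
    if asset.get("energy_data_coverage_m2", 0) > total:
        issues.append("energy data coverage exceeds total floor area")

    return issues


print(check_asset({
    "floor_area_total": 10_000,
    "floor_area_by_use": {"office": 8_000, "retail": 3_000},
    "energy_data_coverage_m2": 9_500,
}))  # ['floor area by use exceeds total floor area']
```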

  • Based on statistical modeling, GRESB identifies outliers in reported performance data for the Infrastructure Asset and Real Estate Performance components. This analysis is performed to ensure that all participating entities included in the benchmarking and scoring process are compared based on a fair, quality-controlled dataset.

  • For Real Estate, the outlier model is built to detect outliers at the asset level. Outliers are flagged in the asset portal so that the participant can check their input data and make corrections if necessary; a simplified illustration follows below.
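
GRESB’s statistical model is not described in detail here. The sketch below uses a simple interquartile-range fence on energy intensity purely to illustrate how asset-level outlier flagging can work; the data and fence multiplier are illustrative.

```python
# A simplified stand-in for statistical outlier detection: flag assets whose
# energy intensity (kWh/m2) falls outside an interquartile-range fence.
# GRESB's actual model is more sophisticated; this only illustrates the idea.
import statistics

def flag_outliers(intensities: dict[str, float], k: float = 1.5) -> list[str]:
    """Return IDs of assets whose intensity lies outside the IQR fence."""
    q1, _, q3 = statistics.quantiles(intensities.values(), n=4)
    low, high = q1 - k * (q3 - q1), q3 + k * (q3 - q1)
    return [aid for aid, v in intensities.items() if not low <= v <= high]

assets = {"A1": 150, "A2": 160, "A3": 170, "A4": 180,
          "A5": 190, "A6": 200, "A7": 210, "A8": 2000}
print(flag_outliers(assets))  # ['A8'] is flagged for the participant to review
```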

  • The manual validation process reviews the content of all Assessment submissions for accuracy and consistency. This work is performed after submission by a third-party validation team. During manual validation, the following content is checked:

    • All scored “other” answers, to ensure they are relevant to the indicator and are not duplicates of standard answers.
    • All scored open text boxes, to ensure answers meet the specific indicator requirements.
    • All indicators where evidence uploads are mandatory, to ensure that the evidence supports the claims made by participants.
    • Additional information provided about third parties, e.g. organization names and assurance, audit, certification, and verification standards.

    Information on Assessment-specific validation requirements, both overall and per indicator, can be found in each Assessment’s Reference Guide.

  • Additionally, a second level of manual validation is performed by an external team of validators for a subset of participants that submit the Real Estate Performance Component. For a randomly selected sample of submissions, the evidence provided in the Reporting Characteristics section of the Real Estate Performance Component is reviewed and compared against the information provided in the Asset Portal for consistency. If issues are discovered with this evidence, the validation team and GRESB reach out to the participants for clarification, and participants have the opportunity to further explain or update the information in this section. If the participants cannot adequately resolve any discrepancies, they will not receive a score for the Performance Component and will therefore not be included in the Benchmark. For more information on this review, and on the requirements for reporting boundaries for Real Estate, see the Real Estate Reference Guide.

  • GRESB works to ensure that validation decisions accurately reflect the requirements set out in the Reference Guides, and that decisions are consistent across indicators and submissions. The external validation team uses the requirements described in the Reference Guides as their main source of validation guidance when reviewing submission answers. The validation process also includes a review of selected decisions by a second validator. Additionally, GRESB checks a sample of all validation decisions to ensure that the requirements are being interpreted correctly by the validators.

    To ensure consistency across answers, the external validators review all answers to a given indicator together, and are typically assigned to validate related sets of indicators. This means that individual validators become “experts” on their set of indicators and can make sure that their decisions are consistent across all submissions. On top of this, GRESB runs additional consistency checks using a model that measures the similarity between the answers provided for each indicator and flags similar answers that received inconsistent validation decisions; a sketch of this idea follows below.
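
The sketch below illustrates this kind of consistency check with a simple string-similarity measure. The similarity model GRESB actually uses is not specified here, and the threshold and example answers are purely illustrative.

```python
# A sketch of a cross-answer consistency check: near-identical answers to
# the same indicator should have received the same validation decision.
# difflib's ratio stands in for whatever similarity model is actually used.
from difflib import SequenceMatcher

def flag_inconsistent(decisions: list[tuple[str, bool]], threshold: float = 0.9):
    """Yield pairs of similar answers whose accept/reject decisions differ."""
    for i, (ans_a, ok_a) in enumerate(decisions):
        for ans_b, ok_b in decisions[i + 1:]:
            similarity = SequenceMatcher(None, ans_a.lower(), ans_b.lower()).ratio()
            if similarity >= threshold and ok_a != ok_b:
                yield ans_a, ans_b

decisions = [
    ("Annual energy audit performed by external consultant", True),
    ("Annual energy audit performed by an external consultant", False),
    ("No relevant measures in place", False),
]
print(list(flag_inconsistent(decisions)))  # the first two answers conflict
```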