2016 GRESB Real Estate Assessment: What You Need to Know about Your Results

The 2016 GRESB data has been available for over a month, and judging from the phone calls and emails received by the GRESB Helpdesk, the 750+ entities that submitted their ESG data are busy evaluating their performance and writing internal reports explaining how they did. At the same time, the GRESB team has been playing a supporting role in untangling the GRESB results. With this post, I would like to share some observations on the questions that come up most frequently.

How do I read and communicate my results?

First things first: GRESB analyzes the ESG performance of real estate portfolios, regardless of their size in terms of number of assets, value or floor area, and regardless of whether they are listed or private, regionally diversified or locally focused. While it is true that 25% of the GRESB Score is determined by governance indicators, it is worth clarifying that the GRESB Score should not be interpreted as a stamp of approval for a fund manager or an organization as a whole – or at least, not without looking at an array of other indicators (not least financial track record).
That said, a proper interpretation of results implies a narrative that goes beyond the single number (i.e. the GRESB Score) that puts your portfolio on the map. If you’ve participated in the GRESB Real Estate Assessment before, you’re probably most interested in the historical trend, as well as in how your performance has developed on the E(nvironmental), S(ocial) and G(overnance) aspects. Once that’s clear, take it one step further and look into the story told by your development across the seven Sustainability Aspects (or eight, if you reported on development activities). The Reference Guide will help you with that — pay attention to the intent of each aspect.
In 2016, GRESB also introduced the GRESB Rating as a tool for high-level differentiation of the ESG performance of real estate companies and funds. The rating is calculated relative to the global performance of reporting entities, which means you now know exactly where you stand on a global scale. On an absolute level, entities scoring higher than 50 on both the Implementation & Measurement and the Management & Policy dimensions are still rated a “Green Star” and can communicate their position as such; the only difference compared with last year is that the designation is not accompanied by a Green Star logo.
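To make the absolute threshold concrete, here is a minimal sketch (not official GRESB code) of the check described above: an entity is rated a Green Star when it scores above 50 on both dimensions. The function name, parameter names and example values are illustrative assumptions.

```python
# A minimal, hypothetical sketch of the absolute "Green Star" check:
# an entity qualifies when it scores above 50 on both the
# Management & Policy and Implementation & Measurement dimensions.

def is_green_star(management_policy: float, implementation_measurement: float) -> bool:
    """Return True when both dimension scores exceed the 50-point threshold."""
    return management_policy > 50 and implementation_measurement > 50

# Example with made-up scores for a hypothetical entity
print(is_green_star(62, 71))  # True: both dimensions above 50
print(is_green_star(45, 80))  # False: Management & Policy is at or below 50
```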

We’re disappointed by our results this year. How do we explain this to our investors?

First of all, it is important to analyze the “why” (and maybe take a step back: what does “disappointing” really mean?). It is safe to assume that anyone who scored below their expectations falls into this category, so let’s break the issue down into a few probable scenarios:
1. We scored worse than last year, even though we didn’t do anything different
The very nature of a benchmark means that unvarying efforts do not guarantee unvarying results. This, coupled with changes in the benchmarking sample, may lead to deviations from the expected outcome. In a few cases, participants have told me they were able to estimate their scores exactly; that comes down to a good understanding of the indicator requirements, as well as an honest evaluation of their own input.
Something else worth mentioning here is the quality of reporting. Overall, the industry is getting better at “taking the test,” but the value of accurately representing performance, backed by thorough internal checks and detailed supporting evidence, should not be underestimated.
2. We dropped in scores because we were selected for Validation Plus
In 2016, the GRESB data quality control team performed a thorough analysis of the supporting evidence provided by more than 150 entities. Being selected for Validation Plus is in no way a punishment for any portfolio: the Assessment Portal makes the selection randomly at the moment of submission. Yes, being subject to a detailed data quality check increases the possibility of GRESB finding incomplete information. No, this is not unfair. All submissions should be prepared in such a way that Validation Plus or Site Visits do not pose a threat to the final scores.
3. Our portfolio changed a lot, and there was very little data we could collect for 2015
Changes in portfolio structure may affect the availability of absolute Performance Indicators data. However, these situations alone will not lead to drastic drops in performance. Variations in scores driven by changes in the underlying assets should be explained to investors accordingly.
4. We dropped in scores because our peer group changed since last year
The overall score is formed by a selection of non-benchmarked indicators (where the points obtained depend only on your individual performance and reporting against the Reference Guide requirements) and a selection of benchmarked indicators (where the points obtained depend on your performance relative to peers within your property type); a simplified sketch of how these two types of points add up appears after this list.
Year on year, GRESB aims to create more granular and meaningful peer groups. Property type and regional allocation of assets, as well as the legal status of the entity, are the main criteria for peer group allocation, but additional criteria are applied depending on the sample size. If you would still like to know how you perform relative to the previous year’s peer group, the improvement badge in the Historical Trend will help you out (note: if the size of the peer group falls below four peers, no historical trend is provided for that year).
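For readers who want to see the mechanics, the sketch below illustrates, in a simplified and hypothetical form, how points from non-benchmarked indicators (driven only by your own reporting) and benchmarked indicators (driven by your standing within a peer group) can combine into an overall score. The weights, scaling and function names are assumptions for illustration only and do not reflect the actual GRESB scoring model.

```python
# A simplified, hypothetical illustration of combining non-benchmarked and
# benchmarked indicator points. Weights and scaling are made up for the example.

from typing import Sequence

def non_benchmarked_points(own_fraction_achieved: float, max_points: float) -> float:
    """Points earned purely from the entity's own reporting (0..max_points)."""
    return max(0.0, min(1.0, own_fraction_achieved)) * max_points

def benchmarked_points(own_value: float, peer_values: Sequence[float], max_points: float) -> float:
    """Points scaled by the share of the peer group the entity matches or outperforms."""
    if not peer_values:
        return 0.0
    share_outperformed = sum(v <= own_value for v in peer_values) / len(peer_values)
    return share_outperformed * max_points

# Illustrative example: 80% of non-benchmarked requirements met (worth 40 points),
# plus one benchmarked indicator compared against five peers (worth 60 points).
score = non_benchmarked_points(0.80, 40) + benchmarked_points(72, [55, 60, 68, 75, 90], 60)
print(round(score, 1))  # 68.0 in this made-up example
```

The point of the sketch is that the benchmarked component moves even when your own performance is unchanged, simply because the peer values change, which is exactly why a different peer group can shift the final score.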
These are some of the questions that come up every day. If you need more help, the Results Review is a new GRESB Real Estate service that we offer to participants upon request. Any participating entity can opt for a review, as long as they have a Benchmark Report (which can be obtained either as part of GRESB Membership or purchased separately).
You could view this as a “sister” service to the Response Checks that we offer in spring, in advance of the Assessment submission. The difference is, of course, that the focus here is on outcomes: the validation decisions for each indicator and an interpretation of the overall results. GRESB does not advise participants on the implementation of projects or governance procedures that could ultimately lead to an increase in performance (our Partners provide such services), but it does help them understand their position relative to peers. In fact, this is the kind of support we have always offered, but packaged in a more formal and structured way.
Our team is always available at [email protected] if you need additional support.
This article was written by Roxana Isaiu, Director, ESG & Real Estate at GRESB.