We are living in a different era.
Data is the new oil, and companies are increasingly awash in information piped in from the internet, from email, and from internal accounting, asset management, tenants, vendors, partners and other business stakeholders. That does not even begin to consider the impending deluge of data that sensors embedded in the new generation of smart buildings are starting to generate. Consider that you currently process more data in one day at work (about 34 gigabytes) than your 15th-century ancestor processed in their entire lifetime. Not surprisingly, complaints of information overload and overwork are on the rise.
While the amount of data you are expected to navigate increases exponentially, the cost of storing that data is dropping. In 1980, it cost more than 1 million USD to store one gigabyte of data – today, storing that same gigabyte costs less than 2 cents.
So, data is now significantly more abundant and far cheaper to deliver and store. What does that mean? It means that in this new era there will be winners and losers. Gartner recently reported that 85% of Fortune 500 companies fail to exploit their data to generate a competitive advantage. Interestingly, in the last 15 years, as data availability has surged, 52% of Fortune 500 companies have disappeared, and the average life expectancy of a Fortune 500 company has dropped from 75 years in 1955 to 15 years in 2015. This is a business environment where the nimble survive, and those who fail to navigate the tides of new data do so at their own peril.
In the hyper-competitive world of commerce, retail giant Walmart processes more than 200 billion rows of transactional data representing just a few weeks of operations. The company’s stated goal is to consistently get information to its business partners as fast as it can, so they can take action and cut down turnaround time. Walmart calls this proactive and reactive analytics.
As Walmart has discovered, content used to be king; now context is king. Research shows that people hold off making decisions when they have either too much information (data overload) or too little (fear of making an uninformed decision). In the real estate sector, the value of initiatives like GRESB is that they provide that context – an agreed-upon framework of performance benchmarks that serves as a focal point against which abundant data can be used to make informed decisions. GRESB is one of many opportunities to provide a context that gets the right data to the right person at the right time, so that the actions taken are measurable, repeatable and make a real difference.
Reaching a competitive advantage
The good news in all of this is that as automatic sensor equipment provides an ever easier and cheaper means of gathering and storing performance data, identifying the right actions to solve a particular problem becomes steadily easier. The last hurdle in real estate and other sectors will be to have the various information systems – accounting, tenant management, asset management and building management – begin to speak to one another and coordinate activities seamlessly. While this may sound like science fiction, rapid advances in machine learning, artificial intelligence and computing power make this a real possibility within the next five to ten years.
The buildings we construct, manage and own represent hives of information and form elaborate, connected communities, both within the building envelope and between buildings. Smart buildings will soon connect to form smart cities, and the world as we know it will become very different in a surprisingly short period of time. The companies that win through this time of rapid change will be those that can sift through vast amounts of data to provide meaningful insights within an appropriate context. While leaders like Walmart are blazing a trail, it is never too early to begin thinking about how the data you are currently collecting can be used to create a competitive advantage.