As the year draws to a close, the Lyra Intel team likes to look back at the major themes that shaped the US commercial real estate market. This post will later be accompanied by our 2019 predictions post, in which we look forward and give industry thought leaders key items to watch in the coming year.
Here’s a truth: no matter what your community’s retention rate is, turnover is expensive. Replacing a resident generally costs at least five times more than keeping them, so a strong retention program makes solid financial sense.
From optimizing NOI to understanding business challenges and emerging industry opportunities, data plays an important role in how property managers and owners oversee their portfolios. However, most of the data the multifamily industry collects is delayed and siloed, and therefore not actionable. In such a competitive industry, unactionable data can mean the difference between hitting your revenue goals and missing them entirely.
Part One of Addressing the Significance of Data Quality in CRE focused on the current state of data collection and quality in the commercial real estate market. In Part Two, we'll address the steps property companies must take to improve data integrity and make better decisions.
From the dawn of time until 2003, humanity generated an estimated five exabytes of information. According to Intel, that same amount of information is now created every two days. Businesses have long understood that there is value, somewhere, to be extracted from this burgeoning volume of data, and increasingly they have been able to get at it more efficiently and cost-effectively. Yet for all their enthusiasm for “big data,” most companies are only scratching the surface of the opportunities that await them.
We’ve all heard the old adage “garbage in, garbage out,” a phrase commonly attributed to George Fuechsel, an IBM programmer and instructor. Simply put, it describes the cause-and-effect relationship between bad input and bad output in computing: software can only process what it is given.