Over the last five millennia, little has changed in the way we transact these physical assets. From listing a property to closing the deal, multiple stakeholders, data sources, service providers, regulators and government agencies take their place between sellers and buyers, contributing friction, paper, redundancies, errors, waste, delays and costs.

Ask any real estate professional to describe her day, and you’ll hear about the challenges she faces when it comes to listing exposure, data accessibility and transaction delays. If a broker wants to locate a property in New York City, he can search several different portals and see the same property listed under varying names. If an agent wants to attract an international buyer, he has to take additional marketing steps to obtain global exposure.

To close a deal, paperwork is passed back and forth over email, which often leads to miscommunication and transaction delays. While there has been a range of investments in real estate technology in recent years, most professionals are still struggling with much the same problems they've always had.

Those problems will persist as long as we continue to approach them with solutions that do not take into account the fundamentally interconnected nature of real estate and the data that defines its ownership and its value. By establishing a base layer of universally accessible and usable data, information and records, we can reinvent real estate from the ground up.

Step one is creating an international standard for property data. Six years ago I was expanding a commercial real estate brokerage when I realized that the cost of accessing our own data was eating into our bottom line. Read more from forbes.com…
