James Ginley, Technical Surveying Director at e.surv Chartered Surveyors
In my previous article I highlighted the importance of provenance when using data to make decisions. However, provenance is only part of the problem we face. The real estate appraisal process is essentially a judgment at a point in time. An appraisal, an investigation and a credit decision are all made on the basis of what is believed to be true at a given moment, using the facts available at that moment. That sounds robust enough until you start pulling at the thread of what “when” actually means in a system where information moves slowly, processes run in parallel, and reality has a habit of changing between one checkpoint and the next.
Before we even get into questions about verification or data quality, there is a more fundamental issue: how do you know if the information you rely on is still current? How do you know that circumstances haven’t changed? Building permits, building regulations, remediation works, structural issues – they all operate on different timelines, are recorded in different systems, often with long delays between an event happening and that event becoming visible to the wider market.
That gap between reality and record is widening, not shrinking. Whether it concerns title searches or security registrations, the system struggles to monitor, in real time or even near real time, who holds what security, when, and over what exactly. The faster the transaction, the greater the chance of misalignment.
What lies beneath all of this is a deeper structural problem: real estate data is not linear. Planning permission may be granted on one date, but no one really knows when – or if – the permitted work has been carried out until another event occurs later. A building control sign-off may eventually confirm it, but only after the fact.
Even if the data is accurate, it is accurate at different times. One data set may correctly state that the building permit was granted on a certain date. Another may later confirm completion, remediation or compliance. Both are true, but only within their own temporal frame. The difficulty is connecting these records so that users understand not only what is known, but also when it was last known to be true, and for what purpose.
This leads to the question of origin and relevance. Not all data needs to be verified equally for every use. An electrical safety certificate may be irrelevant in one context and crucial in another. To a landlord it matters greatly whether it is up to date; to an owner-occupier or a lender it may carry far less weight. Without that discipline, systems risk becoming overloaded with information that is technically accurate but operationally distracting.
The pursuit of greater transparency and availability of data in the banking and real estate sectors is understandable and even desirable. But more data does not automatically mean better decisions. Without context, timeliness and clarity about appropriate use, it can just as easily slow down processes, obscure material risks or create a false sense of security.
The real challenge isn’t collecting more real estate data. It’s knowing what is true, when that truth was last verified, and whether it is appropriate for the decision being made now. Until systems can align these timelines more effectively, property will remain what it has always been: an asset defined as much by what we don’t yet know as by what we think we know.

