"PiinPoint has become an integral part of my role as Retail Analyst at Cushman & Wakefield Waterloo Region. The platform allows me to put together professional looking reports and provide clients with the insights they need to make real estate decisions.

I honestly don’t know how I would do my job effectively without PiinPoint."

Jessica McCabe, M.Ed.
Retail Analyst

GeoAI Blog Series: Trust and Transparency; Beating the “Black Box” Syndrome

PiinPoint
February 14, 2024

In recent discussions with industry leaders, and as outlined in our recent thought leadership piece “Evolution of Real Estate Network Planning in the Age of AI”, one of the most impactful barriers to the adoption of AI/ML technologies was the inability of Executives, and Leadership more generally, to “believe” or “trust” the outcomes of complex model systems. This remains a challenge even when a model has been validated as accurate to a high standard. Outcomes such as sales forecasts, profitability estimates, ideal markets to enter, foot traffic estimates, and a host of others are predictions formed using complex neural networks and other AI-type algorithms, and they are difficult to explain. They are regularly described as proverbial “black boxes” whose outputs leave Executives scratching their heads and asking tough questions.

This is not new, actually. Historically, fields of applied statistical analysis such as econometrics, financial modeling, psychometrics, biometrics, and engineering used model structures designed around testable theories of how real-world phenomena work. The focus was on understanding and testing theoretical relationships between input variables and target variables, rather than simply predicting the next possible value of the target variable.

Modern data science and machine learning advances have led us down a path where the predictions themselves, and their accuracy, are valued more than an interpretation of how the model works and which variables drive the outcome. This favors algorithms that can accommodate thousands of variables to predict the target variable (without over-fitting) more accurately, at the cost of understanding which variables matter most, and as a result, structural relationships within the model are difficult to interpret. Recently, much has been written about “trust” in the AI field, particularly as it pertains to Generative AI and advanced AI/ML algorithms built on deep learning, and the changes are coming at light speed. The concern came up in our research interviews as well, from two sources: Real Estate Leadership and classically trained GIS Practitioners.

From Real Estate leadership, the “trust” issue stems not from a technical point of view but from a traditional bias toward exercising their own judgment, relying on relationships with brokers, and the ever-present “I have always done things this way” attitude. The switch to data- and model-driven predictions about which markets are good, or what mature sales could be at a net new location, is a hard one, especially when those predictions run counter to their own experience. Only time will tell, of course, but there is movement here when analytical teams are able to explain “the why” in their models. This barrier has been significantly beaten down among AI High Performers that already have an AI strategy and a Center of Excellence in place.

The absence of trust and transparency in AI/ML-based models emerges as a significant concern on the technical side among GIS practitioners as well. GIS professionals have traditionally been trained to build structured regression (Gravity) models for trade area estimation, with tunable and explicable parameters such as demographics and customer sales data. Professionals in this domain often view AI/ML models as opaque "black boxes", with little clarity on how the models are constructed and which variables drive their outcomes. Accuracy is good, of course, but Executives often want to know what is driving a prediction. Without a clear understanding of the underlying variables or critical predictive characteristics, real estate executives and traditional GIS practitioners lose faith, and that lack of insight breeds skepticism about the models' reliability and usability.

The concern that AI models are often seen as "black boxes" has been a subject of debate and research in the field of artificial intelligence and machine learning. Formal initiatives and general discussions are ongoing in the AI community about how organizations can address the “trust” and “transparency” issues.

Explainable AI (XAI) Movement:

The field of Explainable AI (XAI) focuses on developing models and methods that provide insight into why a particular decision was made. XAI aims to make AI models more transparent and interpretable, reducing the "black box" problem. Techniques like LIME (Local Interpretable Model-agnostic Explanations) and SHAP (SHapley Additive exPlanations) are examples of approaches that can help explain AI model predictions. The core goal of these and other techniques is to identify the most influential input variables behind a predicted outcome. The AI community recognizes the importance of model transparency and interpretability, and researchers are actively working on techniques and tools to make AI models more understandable, including methods for visualizing what a model has learned, such as feature importance, attention mechanisms, and saliency maps.
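To make this concrete, below is a minimal sketch of how a SHAP-based explanation might look in Python. The model, feature names, and data are hypothetical placeholders for illustration, not PiinPoint's production system.

```python
# A minimal sketch of explaining a tree-based model with SHAP. All feature
# names and data are hypothetical placeholders for illustration only.
import numpy as np
import pandas as pd
import shap
from sklearn.ensemble import GradientBoostingRegressor

# Hypothetical site features and a sales target.
rng = np.random.default_rng(0)
n = 500
X = pd.DataFrame({
    "population_1km": rng.uniform(1_000, 50_000, n),
    "median_income": rng.uniform(30_000, 120_000, n),
    "competitor_count": rng.integers(0, 15, n).astype(float),
    "daily_foot_traffic": rng.uniform(100, 10_000, n),
})
y = (0.05 * X["population_1km"] + 0.2 * X["daily_foot_traffic"]
     - 800 * X["competitor_count"] + rng.normal(0, 5_000, n))

model = GradientBoostingRegressor().fit(X, y)

# TreeExplainer yields one additive contribution per feature per prediction.
explainer = shap.TreeExplainer(model)
shap_values = explainer.shap_values(X)

# Rank features by mean absolute contribution across the fitting data.
importance = np.abs(shap_values).mean(axis=0)
for name, score in sorted(zip(X.columns, importance), key=lambda t: -t[1]):
    print(f"{name}: mean |SHAP| = {score:,.0f}")
```

The printed ranking gives analysts and executives a shared starting point for discussing which site characteristics the model actually leans on.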

The goal is to provide a better understanding of how the model works, support discussions with executives, and ultimately build “trust” in the models, so that the conversation can shift to strategy rather than a constant mistrust of the technique itself.

Education and a “Human in the Loop”:

The complexity of AI models often stems from their ability to handle vast and intricate datasets (a benefit), but they can be challenging to interpret in isolation (the downside). Working in collaboration with human experts such as GIS professionals can help validate and contextualize AI model outputs, reducing the opacity of the "black box." Part of the solution to the "black box" issue is therefore educating GIS professionals and users about the capabilities and limitations of AI models: understanding the types of data and tasks where AI is most effective, as well as the potential biases and limitations, is essential. We heard several times in our AI High Performer interviews that building simpler models is a mandate, not an option, making them more interpretable through structured design sessions, even at the expense of model performance. Striking a balance between interpretability and performance depends on the specific application and the level of transparency required. These points further validate the need to bring GIS and DS teams together under the umbrella of an AI strategy, and a data strategy and talent strategy more generally.
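As an illustration of that mandate, here is a minimal sketch of a deliberately simple, interpretable model; the variables are hypothetical, and the point is only that every coefficient can be read and debated directly.

```python
# A minimal sketch of the "simpler, interpretable model" mandate: a plain
# linear model whose coefficients can be read directly, in contrast to a
# thousand-feature black box. Features are hypothetical placeholders.
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(1)
feature_names = ["population_1km", "median_income", "competitor_count"]
X = rng.normal(size=(300, 3))  # standardized hypothetical inputs
y = 1.5 * X[:, 0] + 0.8 * X[:, 1] - 1.2 * X[:, 2] + rng.normal(0, 0.5, 300)

model = LinearRegression().fit(X, y)

# Each coefficient states the direction and size of a variable's effect,
# which is exactly what an executive review can interrogate.
for name, coef in zip(feature_names, model.coef_):
    print(f"{name}: {coef:+.2f}")
```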

The pathway to mitigating the Transparency and Trust issue lies in two areas:

  1. Adopting a methodological approach via Explainable AI (XAI), focusing on developing models and methods that provide insight into why a particular decision was made. XAI aims to make AI models more transparent and interpretable, reducing the "black box" problem.
  2. Fostering consistent collaboration between GIS and Data Science (DS) teams (a Center of Excellence), aligning their objectives toward building AI/ML models for Real Estate. A concerted joint effort “with humans in the loop” enables these models to be articulated and understood, thereby addressing the "black box" syndrome. Establishing standardized protocols and frameworks for AI/ML model development, with transparency, parsimony, and a clear understanding of the model's mechanics, will enhance transparency and trust.

The PiinPoint Way

At PiinPoint, we have embraced XAI methods in our work with clients. In fact, it was our clients' demands over the years for intuitive, logical explanations of our machine learning model systems that allowed us to build lasting trust in the GeoAI capabilities we deliver. There are two core tenets to our approach:

  1. Technical Clarity: We provide contribution metrics for each model input variable along with overall model results. This involves highlighting three related dimensions: the magnitude of impact of each important contributing variable for a given prediction; the dominant directionality of that variable's impact across the fitting data; and, since the magnitude changes across the fitting data due to the nature of machine learning algorithms, the range of the impact magnitude for each variable seen across the fitting data. Together these provide a well-rounded view of the main factors behind a given out-of-sample prediction (see the sketch after this list).
  2. Communication Protocols: As part of all our implementations and deliveries of custom modeling approaches, we communicate regular, timely reviews of the models, the key variables driving them, and their structure in workshops that employ the XAI techniques described above. Client feedback is then incorporated into future rounds of modeling to ensure the final outcome is not a “black box” but a “clear box” understood by all. This transparent collaboration builds trust in the system and, ultimately, lasting relationships.
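The sketch below illustrates those three contribution dimensions (magnitude, dominant direction, and range) computed from a per-sample contribution matrix such as SHAP values; the data and summary logic are illustrative, not PiinPoint's exact reporting code.

```python
# A minimal sketch of summarizing per-variable contributions along the three
# dimensions described above: magnitude, dominant direction, and range.
# The data and summary logic are illustrative, not production reporting code.
import numpy as np

def contribution_summary(contributions, feature_names):
    """Summarize a (n_samples, n_features) matrix of additive contributions
    (e.g., SHAP values) computed across the fitting data."""
    for j, name in enumerate(feature_names):
        col = contributions[:, j]
        magnitude = np.abs(col).mean()        # typical size of the impact
        direction = "positive" if col.mean() >= 0 else "negative"  # dominant sign
        low, high = np.abs(col).min(), np.abs(col).max()  # spread of impact size
        print(f"{name}: mean |impact|={magnitude:.2f}, "
              f"direction={direction}, magnitude range=[{low:.2f}, {high:.2f}]")

# Hypothetical contributions for three site variables over 1,000 predictions.
rng = np.random.default_rng(2)
contributions = np.column_stack([
    rng.normal(2.0, 0.5, 1000),    # demographics: mostly positive
    rng.normal(1.2, 1.5, 1000),    # foot traffic: mixed sign, wide spread
    rng.normal(-1.8, 0.4, 1000),   # competition: mostly negative
])
contribution_summary(contributions, ["demographics", "foot_traffic", "competition"])
```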

Check out PiinPoint. We exist to support our clients' need for transparency and to increase their trust in Real Estate market analysis, forecasting, and network planning processes powered by GeoAI.

Learn more about how PiinPoint can help your business evolve and thrive.