Bias and flawed decision-making can hinder deals.
[Illustration: AI needs solid input data]

Artificial intelligence is already proving to be valuable to early adopters in commercial real estate who are involved in content creation and marketing, property valuation and market analysis, predictive analytics, and risk assessment. But unquestioningly trusting the data can be risky.

Hallucinations and Other Problems

“Hallucinations” are AI-generated responses that sound plausible but are factually incorrect. They’re difficult to detect because AI-generated answers are typically well written and matter-of-fact, yet they may cite nonexistent data sources and, in some cases, even fabricate quotes.

“AI models depend heavily on the quality of input data used for training models. If it’s incomplete, it can lead to biased predictions or responses,” says Ra’eesa Motala, SIOR, president of Chicago-based Evoke Partners. “Do you have all the key factors inputted correctly to ensure your output will yield results with little room for error or miscalculation? Have you taken variables into account? You need to understand for yourself what went into that decision-making to avoid [hindering] an investment or development transaction.”

How to Ask Questions

“The way that AI generates a response is based on how you ask the questions,” says Kim Ford, SIOR, CEO of the Rise Agency Group in Pittsburgh. ChatGPT provides tips on how to ask questions that will result in the most useful answers:

  • Be specific with your request.
  • Provide context and background information.
  • Use explicit constraints and guidelines.
  • Experiment with various phrasings and approaches.
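As a rough illustration of how the first three tips can be combined, the sketch below assembles a structured prompt. The broker scenario, field names, and helper function are hypothetical examples, not part of any AI tool's documentation:

```python
# Sketch: combining a specific request, background context, and explicit
# constraints into one prompt. The scenario below is invented for illustration.

def build_prompt(task, context, constraints):
    """Join a specific task, background context, and explicit
    constraints into a single prompt string."""
    lines = [f"Task: {task}", f"Context: {context}", "Constraints:"]
    lines += [f"- {c}" for c in constraints]
    return "\n".join(lines)

prompt = build_prompt(
    task="Summarize vacancy trends for Class A office space in Chicago's West Loop.",
    context="I am a commercial broker preparing a Q2 market report for investor clients.",
    constraints=[
        "Cite only sources you can name; say 'unknown' rather than guess.",
        "Limit the summary to 150 words.",
        "Flag any figure you are not confident about.",
    ],
)
print(prompt)
```

The fourth tip, experimenting with phrasings, amounts to rerunning the same request with varied wording and comparing the answers for consistency.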

Another challenge, Ford says, is the lack of publicly available information to input into AI, especially on the leasing side, whereas recorded data exists for building sales.

Adapted from SIOR Report, Spring 2024.
