Negotiations, Customer Interactions: How CRE Could Use Voice Sentiment Analysis

An AI arms race could emerge, with one side trying to catch someone out while the other uses related technology to avoid detection.

Artificial intelligence implementers have found a new application — helping investors suss out CEOs. The Financial Times reported at length how AI tools can monitor voice recordings for changes in pitch, volume, and speed of speech to better identify the true emotions of top executives, comparing what they’re saying to what they may be feeling.
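The features named in such reporting (pitch, volume, speed of speech) are all quantities that can be computed from raw audio frames. As a toy illustration of the first two, here is a minimal NumPy-only sketch that estimates volume via frame RMS and pitch via autocorrelation on a synthetic tone; the commercial tools described here rely on far richer models, and the function names below are illustrative, not from any named product.

```python
import numpy as np

def frame_signal(x, frame_len, hop):
    """Slice a 1-D signal into overlapping frames."""
    n = 1 + (len(x) - frame_len) // hop
    return np.stack([x[i * hop : i * hop + frame_len] for i in range(n)])

def rms_volume(frames):
    """Per-frame loudness: root-mean-square amplitude."""
    return np.sqrt(np.mean(frames ** 2, axis=1))

def pitch_autocorr(frame, sr, fmin=75, fmax=400):
    """Crude F0 estimate: pick the strongest autocorrelation lag
    inside the typical human pitch range (fmin..fmax Hz)."""
    corr = np.correlate(frame, frame, mode="full")[len(frame) - 1:]
    lo, hi = int(sr / fmax), int(sr / fmin)
    lag = lo + np.argmax(corr[lo:hi])
    return sr / lag

# Synthetic stand-in for a voice recording: a steady 220 Hz tone.
sr = 16000
t = np.arange(sr) / sr
tone = np.sin(2 * np.pi * 220 * t)

frames = frame_signal(tone, frame_len=1024, hop=512)
vol = rms_volume(frames)            # ~0.707 for a unit sine
f0 = pitch_autocorr(frames[0], sr)  # ~220 Hz
```

A real pipeline would track how these features drift over the course of a call (e.g., pitch rising and volume tightening around a sensitive question); speaking rate could be approximated by counting energy peaks in the RMS curve.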

When Francis deSouza, CEO of gene sequencing company Illumina, was on an earnings call, analysts asked about various sources of friction and criticism. He tried to pass it off as nothing. But David Pope, chief data scientist for Speech Craft Analytics, which does AI-enabled voice analysis, told the Financial Times that the combination of vocal characteristics in his speech "betrays signs of anxiety and tension specifically when addressing this sensitive issue." Two months later, deSouza resigned.

That's fine for big equity investors, but expect similar technologies to help CRE professionals better negotiate, work with tenants, and interact with business partners. At the same time, it would be reckless to assume other technology won't help mask what is going on.

Such tech doesn’t come from only one company. Deloitte claims to have a product, TrueVoice, that analyzes both language and behavior.

In a Wharton white paper, Saurabh Goorha, CEO of AffectPercept, a perception AI advisory and analytics firm, and Raghuram Iyengar, a Wharton professor of marketing, wrote that "advances in psycholinguistic data analytics and affective computing that allow for inferring emotions, attitude, and intent with data-driven modeling of voice" would become a core trend in AI voice analytics.

The investor case is a strong example, as those putting money into a company want to know as much as possible about what management is doing beyond what they claim. In CRE, a natural application would be in negotiations. Recording negotiation sessions would allow emotional dynamics to be incorporated into later analysis. Tenant calls could be recorded and checked for stress and emotional tells that might help predict future behavior.

Any such scenario raises ethical questions at the least. Even if the recording and analysis were legal, the relationship could be undermined should the other party realize what was happening.

And then there are the predictable technical countermeasures. By combining AI manipulation of vocal characteristics with deepfake techniques that let one person impersonate another, a speaker could effectively impersonate themselves, using computing aids to make their voice sound calm and confident. That, in turn, will create a need for technology to detect such real-time manipulation. It's going to be an interesting time.