Deepfakes Will Be a Problem for CRE

Fraudsters always find new ways to exploit whatever they can, and that includes CRE.

The concerns and warnings started last year. They probably didn't come soon enough.

New forms of artificial intelligence — generative AI — have made it relatively easy for fraudsters to imitate almost anyone or anything. That includes many aspects of commercial real estate.

The Federal Trade Commission flagged the problem last year. “Thanks to AI tools that create ‘synthetic media’ or otherwise generate content, a growing percentage of what we’re looking at is not authentic, and it’s getting more difficult to tell the difference,” the agency wrote. “And just as these AI tools are becoming more advanced, they’re also becoming easier to access and use. Some of these tools may have beneficial uses, but scammers can also use them to cause widespread harm.”

The FTC says there is already evidence that criminals and con artists have put the technology to use. “They can use chatbots to generate spear-phishing emails, fake websites, fake posts, fake profiles, and fake consumer reviews, or to help create malware, ransomware, and prompt injection attacks,” the agency wrote. “They can use deepfakes and voice clones to facilitate imposter scams, extortion, and financial fraud. And that’s very much a non-exhaustive list.”

That is the theory. As for what it means in practice, the National Association of Realtors compiled an interesting list of possible problems, including the following:

Some steps you can take include making careful use of deepfake detection software (which is still in its early days and may give false results), verifying everyone’s information, watermarking all documents, and keeping up with developments.
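Much of the verification above is organizational, but there is one simple technical complement worth knowing: recording a cryptographic fingerprint of a document when you first receive it, so you can detect if a copy circulating later has been altered. The sketch below is illustrative only (the function names and sample text are invented for this example, not from any industry tool), using Python's standard `hashlib`:

```python
import hashlib


def fingerprint(data: bytes) -> str:
    """Return a SHA-256 hex digest serving as a tamper-evident fingerprint."""
    return hashlib.sha256(data).hexdigest()


def verify(data: bytes, expected: str) -> bool:
    """Check a document's current contents against a recorded fingerprint."""
    return fingerprint(data) == expected


# Record the fingerprint when the document first arrives...
original = b"Purchase agreement: 123 Main St."
stored = fingerprint(original)

# ...and re-check before relying on any copy of it later.
print(verify(original, stored))                            # True
print(verify(b"Purchase agreement: 125 Main St.", stored)) # False
```

A hash like this cannot tell you whether the original document was genuine in the first place; it only proves that a later copy matches what you initially recorded, which is why it supplements, rather than replaces, verifying information with the people involved.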