Onsite Apartment Staff Sometimes Struggles with AI

Adoption challenges remain a work in progress for apartment companies putting the technology to use.

They were ready to celebrate, or so they thought.

During a pilot program, Morgan Properties announced to its onsite teams, “We’re getting out of the pay stubs business,” after it found an artificial intelligence (AI)-based technology solution to improve a step in its resident verification process.

Speaking at the National Multifamily Housing Council’s OpTech Conference Nov. 2 in Las Vegas, Amy Weissberger, senior vice president of corporate strategy, Morgan Properties, said some staff didn’t trust it.

“When we could show them that it detects fraud, they started to believe, but many are still checking paystubs,” she said. “You have to prove a new technology is better than [the old way], so you keep training them, you keep reinforcing it.”

Who’s at Fault When AI Fails?

When it comes to using data to drive AI and machine learning, fellow panelist Stephanie Fuhrman, vice president of corporate development, Entrata, said companies really have to focus on the margin of error in their data.

Greater acceptance of data driving operations through AI and machine learning is still a work in progress, the panel said. Presently, a key issue is determining who’s at fault when AI tools fail: the provider, the client, or the customer.

IBM, for example, states in its offerings that it is not responsible and that the client remains the owner of the data being used. IBM also provides the caveat that the technology is intended to be used with human intervention.

Weissberger said she relies on her vendors and their products to honor the services they promise. “Do we as a company have some responsibility? Yes,” she said. “We can recognize when it’s not working, and it’s our role to let the providers know.”

Fuhrman said that at this early stage of the technology, many precedents will be needed before that kind of determination about responsibility can be made.

She added that another challenge in AI use is unintentional bias, which can result from who is entering the data, among other circumstances.

Weissberger said a computer can’t be expected to know that a property is short-staffed and that staffing was the reason performance sat at a certain level, or that a community fell behind on apartment turns because it lacked enough maintenance technicians.