BrainBox AI Figured Out How to Eliminate Hallucinations in Its Gen AI

They found that doing less was much more.

On the surface, the product announcement from BrainBox AI sounded like many others: a “world’s first.” In this case, a virtual building assistant called ARIA (Artificial Responsive Intelligent Agent) that is, in the latest fashion, an implementation of a large language model (LLM) generative AI product.

Unlike many of its peers, though, the company wasn’t developing or licensing some “it will do anything and everything” automation. Instead, it understood the potential shortcomings and dangers of the technology and ensured that it served a narrow, subservient role.

“For us it’s super important,” company co-founder and chief technology officer Jean-Simon Venne tells GlobeSt.com. “As soon as we have a hallucination, the building engineer will stop using it.”

Hallucinations occur when LLM software makes something up. The danger arises because the programs pore through training data and build enormous webs of statistical links between words. Sometimes that statistical nature means strings of words come out coherent but unconnected to fact, because the programs don’t think. They just produce a response that seems appropriate.

“When you think about it, it was obvious that we were going to end up where we are with the way we train these models,” Venne says. “You train it on all the information on the Internet. You’re training a model on information that is exact information, that is factual, and a lot of false information. When you ask a question, guess what you’re going to get? You’re going to get both.”

Venne’s team trained the LLM only on the data that comes out of its system — a lot of data collected from thousands of implementations — and then restricted it from generating anything novel. “As soon as you do that in the prompt engineering, you’re killing things like having the head of a dog on a chicken,” he says. “If you don’t want to create new stuff, you have to cut that out at the very beginning.”
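The approach Venne describes, answering only from the system’s own collected data and refusing to invent anything beyond it, can be sketched in miniature. This is a hypothetical illustration, not BrainBox’s actual code; the metric names and values are invented for the example:

```python
# Hypothetical sketch of a "grounded" assistant: it answers only from
# telemetry the system has actually collected, and declines otherwise,
# rather than generating a plausible-sounding guess.

TELEMETRY = {  # stand-in for data gathered from building implementations
    "zone_3_temp_f": 71.5,
    "chiller_2_status": "running",
}

def grounded_answer(metric: str) -> str:
    """Return a value only if it exists in the collected data."""
    if metric in TELEMETRY:
        return f"{metric} = {TELEMETRY[metric]}"
    # The key restriction: no novel generation. Unknown means unknown.
    return f"No data available for '{metric}'."
```

The same restriction in a real LLM deployment would live in the prompt engineering layer: the model is instructed to draw answers solely from supplied building data and to say it doesn’t know otherwise.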

By restricting what the software can do, the company focuses it on data sets so large as to be incomprehensible to a human brain. That is where the value lies.

“It’s extremely frustrating to be a facility manager because you have these 200 things you need to do today,” Venne explains. “There are too many and we don’t have enough facility managers” to do them all.

Someone at Dollar Tree stores, a customer of the company’s, might need to generate a report that would take two hours — checking where the data lies, writing code to process it correctly, formatting, and then generating the result. “We could do that with ARIA in two minutes,” says Venne.

It may not write poetry, but then it doesn’t have to.