In the third installment of a three-part series, Ron Di Carlantonio, CEO and founder of iNAGO, a Tokyo/Toronto-based company creating next-generation conversational digital assistants, unpacks the applicability of deep NLU (natural language understanding) and context-aware, automated dialogue management systems.
He discusses how the solution shows promise at filling gaps in communication in industrial contexts, as well as the current challenges the industry faces in adopting AI-assisted solutions.
Machine Design: How are you figuring out the gaps and what use cases are you working on with partners?
Di Carlantonio said his company is working with manufacturing partners in Japan to implement the NLU assistant technology with the goal of uncovering areas where they can make improvements through automated knowledge retrieval.
“There is definitely a gap between people making the machines and the people using the machines,” said Di Carlantonio. “Right now, it’s in the discovery process…We went to an automotive company and we said, ‘okay, so what do people want to ask?’”
No one truly understands what plant-level users or end-users need, because this information has never been captured. “Basically, the plant knows the problems, but they have no idea what a user (at the plant level) would do with a machine other than what the machine maker has said you could do.”
Nor do plants have the time and resources to gather this information.
“In this discovery process, we’re going to be able to first enable machines to talk,” he said. “Maybe it’s going to be a limited language. Maybe it’s going to be quite simple at the beginning, but it will give people an opportunity to interact with machines to allow the manufacturers to understand what’s going on.”
iNAGO’s intelligent assistant technology will give plants an opportunity to interact with machines, which in turn enables manufacturers to understand what’s going on at the machine level.
“The work we’re doing right now is, how do we take that knowledge or information we know about machines and put it in these AI models that allow people to interact with them in a natural way?” he said. “And then, the second step of that is: What are they asking? What problems are they having?”
MD: What potential does this technology hold for new builds versus retrofitting older equipment?
Bringing the technology to industrial applications is a stepwise process, said Di Carlantonio.
The first step is an independent solution. “It can be on a tablet to support the machinery, the plants, processes, the regulations, etc.,” he said. “It’s an independent thing, not connected to the machine, doesn’t have the context of the machine, perhaps, doesn’t have the context of, let’s say, the systems running, but it’s independent and it can support [the machine].”
The second step is a brought-in solution. “This is working with the older machines,” he said. “It’s a brought-in solution that does integrate with the machine because the machines are spewing data. The [brought-in solution] integrates with the machine such that it understands the context and it can alert you…This little red light, this is what it means. And there’s a little yellow light over there. Take these two things together and here’s what you need to do so it can understand the context to help people.”
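The kind of context-aware alerting Di Carlantonio describes, where two indicator signals together mean something neither means alone, can be illustrated with a minimal sketch. The signal names, diagnoses and messages below are purely hypothetical assumptions for illustration, not iNAGO's actual system or any real machine's fault codes.

```python
# Hypothetical sketch: combining individual machine indicators into a
# context-aware, plain-language alert. All signal names and messages
# are illustrative assumptions.

def contextual_alert(signals: dict) -> str:
    """Map combinations of indicator signals to a plain-language action."""
    red = signals.get("red_light", False)
    yellow = signals.get("yellow_light", False)
    if red and yellow:
        # Taken together, the two lights imply a different action
        # than either light alone.
        return ("Coolant pressure is low AND the filter is near capacity: "
                "stop the machine and replace the filter before resuming.")
    if red:
        return "Coolant pressure is low: check the coolant reservoir."
    if yellow:
        return "Filter is near capacity: schedule a filter change."
    return "All signals nominal."

print(contextual_alert({"red_light": True, "yellow_light": True}))
```

A real brought-in solution would read these signals from the machine's data stream rather than a hand-built dictionary, but the principle, interpreting signals jointly in context, is the same.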
The final solution is embedded into the machine and processes. “It is not only communicating with the people on the floor, but it can communicate better with the other machines on the floor,” Di Carlantonio explained. “It might have a microphone built in, so you can literally talk to it or it might be that you’ve got a special device—a phone—that allows you to talk to the machine.”
MD: What are the limitations?
One of the immediate challenges is figuring out where the information is and how to transform the data into something that can be interacted with in a natural way, said Di Carlantonio.
“That’s the challenge we’ve been trying to solve for six years,” he added.
iNAGO has been working, for example, on ways to convert the machine’s user manual or a specification, in combination with AI, into something that allows users to merely ask questions and get information.
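The idea of turning a user manual into something operators can simply question can be sketched with a toy retriever. This is an illustrative assumption, not iNAGO's method: a production system would use an NLU or LLM pipeline, while this sketch only shows the retrieval step with invented manual passages and simple word-overlap scoring.

```python
# Hypothetical sketch: indexing passages from a machine's user manual so
# an operator can ask a question and get back the most relevant passage.
# The passages and the overlap-scoring approach are illustrative only.
from collections import Counter

MANUAL = [
    "To reset the spindle, hold the reset button for five seconds.",
    "Replace the coolant filter every 500 operating hours.",
    "A flashing red light indicates low coolant pressure.",
]

def tokenize(text: str) -> Counter:
    # Lowercase and strip trailing punctuation from each word.
    return Counter(w.strip(".,?") for w in text.lower().split())

def answer(question: str) -> str:
    """Return the manual passage sharing the most words with the question."""
    q = tokenize(question)
    # Counter intersection (&) keeps the minimum count of shared words.
    return max(MANUAL, key=lambda p: sum((tokenize(p) & q).values()))

print(answer("What does the flashing red light mean?"))
```

Even this crude matcher captures the workflow Di Carlantonio describes: the knowledge already exists in the documentation; the work is making it reachable through natural questions.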
Another challenge, he said, is recording institutional knowledge (the information experts on the floor hold in their heads) and converting it into an accessible data set. “How do we get those people to be able to share in this and be involved and be able to take their knowledge and take it down into the AI?”
The final challenge is the business model, according to Di Carlantonio.
“Who’s going to pay? And of course, with AI or with ChatGPT, for example, Microsoft paid a billion dollars in their first investment to create these things. Well, nobody in manufacturing is going to invest a billion dollars to do this kind of thing. We need to gradually bring it in and we need to figure out the right business model that is a win for everybody.”
Di Carlantonio said he believed the end customer would pay for a service to improve their productivity and efficiency. “Companies like ours, the tech companies at the end, will be at the very bottom, but we will be providing that technology to allow current manufacturers to incorporate this into their technology and then provide it as a service to the end customer.”
For more coverage of emergent technologies in the manufacturing space, be sure to check out the Nov./Dec. issue of Machine Design, out now.