The UK company P2i adds water-repelling nanocoatings to smartphones and other gadgets. Normally, it flies engineers to its clients’ factories to identify and solve quality-control problems.
That’s not an option in a world where flights are grounded, borders closed, and security tightened. So in some plants, P2i now relies on a system that uses artificial intelligence to look for even the slightest defects.
“Over the last four months, since the coronavirus, we’ve had to reevaluate how we are going to service and deploy our machines worldwide,” says Neal Harkrider, chief operating officer at P2i.
To spot problems, P2i is using technology from a company called Instrumental. Cameras dotted around P2i’s nano-coating machines examine smartphones after they’ve been treated, and an algorithm sounds an alert if the process appears to have gone awry.
“That vision system is our primary quality-control methodology now,” Harkrider says. He says the company can adjust its tolerance for error on the fly, “and we can do that remotely, which is fantastic.”
The pandemic has forced many manufacturers to rethink established practices. In some plants, remote sensing and machine learning are standing in for site visits, overnight package deliveries, and manual inspections. Robots may be far from displacing humans in manufacturing work that requires nimble fingers and flexibility. But systems like the one used by P2i show how AI can help machines carve out new niches on the factory floor.
Before Covid-19, Harkrider says, most companies were reluctant to allow outsiders—including their own partners—to connect to their manufacturing equipment, for security reasons. Now, he says, five plants have allowed P2i’s machines, and Instrumental’s inspection technology, to be monitored and controlled remotely.
Bruce Lawler, managing director of the MIT Machine Intelligence for Manufacturing and Operations program, says the pandemic came as manufacturers already were warming to deploying automated inspection technology. “One of the big problems in manufacturing is ‘Where did the problem occur?’” he says. “If you can do more inspection more often, and have a camera on every robot, for every step, then you can say, ‘Well OK, that was here.’”
Manufacturers have long used computer vision to inspect products for defects or other problems, but this traditionally involved hand-coded rules for identifying flaws, making the equipment time-consuming to deploy and change. Using AI, inspection systems can be fed examples of particular flaws or—as with Instrumental’s system—be trained on what a product is supposed to look like and asked to flag abnormalities.
A form of AI known as deep learning has transformed computer vision in recent years and is rapidly spreading through manufacturing. An algorithm fed many thousands of example images can learn to identify dogs or cats in images, or to spot a particular person in security footage. It can also be trained to spot deviations from the norm in images of screws or circuit boards or screens.
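The core idea—learn what “normal” looks like from examples, then flag anything that deviates—can be illustrated with a toy sketch. This is not Instrumental’s system: real inspection tools run deep neural networks on camera images, while here each “image” is just a short list of hypothetical brightness readings, and the threshold is an arbitrary stand-in for a tunable tolerance.

```python
# Toy anomaly detection: average the normal examples into a
# reference profile, then score new items by how far they deviate.

def train(normal_samples):
    """Average the normal examples into a reference profile."""
    n = len(normal_samples)
    length = len(normal_samples[0])
    return [sum(s[i] for s in normal_samples) / n for i in range(length)]

def anomaly_score(profile, sample):
    """Mean absolute deviation from the learned profile."""
    return sum(abs(p - x) for p, x in zip(profile, sample)) / len(profile)

def inspect(profile, sample, threshold=0.1):
    """Flag the sample if it strays too far from normal."""
    return "defect" if anomaly_score(profile, sample) > threshold else "pass"

# Readings taken from known-good parts (hypothetical data)
good_parts = [[0.50, 0.52, 0.49], [0.51, 0.50, 0.50], [0.49, 0.51, 0.51]]
profile = train(good_parts)

print(inspect(profile, [0.50, 0.51, 0.50]))  # close to normal: pass
print(inspect(profile, [0.90, 0.10, 0.80]))  # far from normal: defect
```

The appeal over hand-coded rules is that nothing here enumerates specific flaws; anything sufficiently unlike the good examples gets flagged, and the threshold can be adjusted remotely, as Harkrider describes.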
Instrumental was founded by ex-Apple engineers to use machine learning to automate the monitoring of production lines. “Traditional vision systems are not well suited to discover and solve problems, because they’re ultimately rule-based,” says Anna-Katrina Shedletsky, the company’s CEO. “It’s a really good time to be talking about AI inspection, because there are these new pain points.”
Makers of traditional computer-vision systems, such as Cognex, increasingly tout machine learning in their products. Some startups offer off-the-shelf systems that promise to be easy to deploy and use.
At a Toyota manufacturing plant in Indiana that churns out hundreds of cars a day, quality control is crucial. Put the wrong widget into the wrong dashboard and production may grind to a halt. Workers normally scan a barcode on each part to double-check that it’s correct. But the plant is now preparing to deploy a robotic system that moves a camera around a part when an employee holds it out. The camera peers at the part from different angles and uses artificial intelligence to identify the component before (hopefully) giving the OK to install it.
The inspection robot, sold by Elementary Robotics, a startup based in Los Angeles, doesn’t look particularly futuristic, with a camera that moves horizontally and vertically along H-shaped bars. Place an object in front of the camera and it will inspect it from several perspectives. The robot shows how human workers and autonomous systems may work together on some manufacturing lines.
“Automation is classically a very brittle environment where you design these really complex, kind of kludged-together solutions,” says Carlo Cruz, a senior engineer at Toyota who is overseeing testing of the system. “I think the idea of having a human in the loop becomes fundamentally important in the future.”
Cruz says he would like to deploy the technology in other areas eventually, including inspection and quality control. “We see a lot of potential,” he says.
Elementary Robotics, founded in 2017, has been operating in stealth mode until now; it announced a $12.7 million Series A funding round Tuesday. Over Zoom, the company’s CEO, Arye Barnehama, shows off another version of the inspection system designed to examine ecommerce products for packaging damage and misapplied labels. He also demonstrates a version being used by another customer to examine circuit boards for flaws.
The systems cost “in the low teens” of thousands each, Barnehama says, and they need relatively few examples to be trained. A customer sends a few dozen images of an object to Elementary Robotics, which uses them to train an algorithm. As workers display new objects, the algorithm determines if they are as intended. A worker clicks a button to say whether the algorithm was correct, improving the process for the next round.
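The feedback round described above—predict, let a worker confirm or correct, fold the answer back into the training data—can be sketched in miniature. This is a hypothetical illustration, not Elementary Robotics’ software: a nearest-example lookup stands in for the trained algorithm, and the feature lists stand in for camera images.

```python
# Toy human-in-the-loop inspection: the system predicts a label,
# a worker clicks to confirm or correct it, and the corrected
# example is kept to improve the next round.

class Inspector:
    def __init__(self):
        self.examples = []  # (features, label) pairs

    def add_example(self, features, label):
        self.examples.append((features, label))

    def predict(self, features):
        """Label via the nearest stored example (stand-in for a neural net)."""
        def dist(ex):
            return sum((a - b) ** 2 for a, b in zip(ex[0], features))
        return min(self.examples, key=dist)[1]

    def feedback(self, features, correct_label):
        """Worker's click: store the example under its true label."""
        self.add_example(features, correct_label)

# Seed with a few labeled examples (a customer would send dozens of images)
inspector = Inspector()
inspector.add_example([0.5, 0.5], "correct part")
inspector.add_example([0.9, 0.1], "wrong part")

guess = inspector.predict([0.52, 0.48])
print(guess)  # nearest the "correct part" example

# Worker confirms the guess, enlarging the training set for next time
inspector.feedback([0.52, 0.48], guess)
```

The key design point is that every worker click, right or wrong, becomes another labeled example, which is why the system can start from only a few dozen images and improve in use.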