Engineers are increasingly using artificial intelligence to automate processes and make decisions faster and more effectively than humans can. But while engineers are experts in their own area of specialisation, most are not data scientists, and they don't have time to learn data science or write the complex code that AI models require.
Microsoft Project Bonsai helps engineers create AI-powered automation without doing data science themselves. Instead, they graphically connect software modules that have already been programmed to perform certain AI functions. A complete set of connected functions that can perform a task is called a ‘brain’. A brain is a standalone, portable software module that can run in open-loop mode, advising a human operator on the best decision, or in closed-loop mode, where it replaces the operator, making decisions and carrying them out by itself.
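To make the two modes concrete, here is a minimal Python sketch. It is purely illustrative: trained_brain, open_loop and closed_loop are hypothetical stand-ins for this article, not part of the Bonsai API.

```python
# Illustrative sketch only: `trained_brain` stands in for an exported
# brain; all names here are hypothetical, not Bonsai's actual API.

def trained_brain(sensor_state: dict) -> dict:
    """Stand-in for a trained brain: maps sensor readings to an action."""
    # A real brain would evaluate its learned policy here.
    return {"valve_setpoint": min(1.0, sensor_state["pressure"] / 100.0)}

def open_loop(sensor_state: dict) -> None:
    """Open loop: the brain only advises; a human decides and acts."""
    suggestion = trained_brain(sensor_state)
    print(f"Suggested action for the operator: {suggestion}")

def closed_loop(sensor_state: dict, actuator) -> None:
    """Closed loop: the brain decides and acts with no human in the loop."""
    action = trained_brain(sensor_state)
    actuator(action)

if __name__ == "__main__":
    state = {"pressure": 72.0}
    open_loop(state)                    # advisory mode
    closed_loop(state, actuator=print)  # autonomous mode (print as dummy actuator)
```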
Microsoft is working with Ansys, whose Twin Builder software creates digital twins of the equipment or processes to be automated using AI. Digital twins can generate the large amounts of data needed to train AI brains much more quickly and at a lower cost than using physical machines for data generation.
As automated processes become more complex, the method of training an AI brain is changing too. When the goal was simple image or text recognition, flooding the AI brain with tons of labelled data so it could identify patterns worked fine. This is the basis of machine learning (ML). But when AI is being used to control a complex, multi-step process on an industrial scale, ML is not as effective. The variety of inputs from numerous sensors of different types overwhelms the brain.
Consequently, Microsoft engineers developed the concept of machine teaching (MT), which relies more on the human approach to learning. Just as a mathematics teacher doesn’t start trying to teach young students calculus before they have mastered the concepts of basic arithmetic, engineers can’t expect an AI brain to understand how an electric turbine works before it learns about rotation.
Cyrill Glockner, a principal program manager at Microsoft, said: “Imagine you’re starting with the hardest problem where the chances of finding a solution are almost nil. The AI brain will never find a way to do that. But it can slowly work its way up to it by following a combination of exploitation and exploration, taking advantage of what it has already learned and looking across the data environment to ensure that it finds an optimal solution to the problem.”
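The exploitation-and-exploration balance Glockner describes is a standard idea in reinforcement learning. The toy Python sketch below shows one common realisation of it, an epsilon-greedy rule, on a simple action-selection problem. It is a generic illustration under that assumption, not Project Bonsai code, and every name in it is hypothetical.

```python
import random

def epsilon_greedy(action_values: dict, epsilon: float = 0.1) -> str:
    """With probability epsilon, explore a random action; otherwise
    exploit the action with the highest estimated value so far."""
    if random.random() < epsilon:
        return random.choice(list(action_values))     # explore
    return max(action_values, key=action_values.get)  # exploit

# Running estimates of how well each control action has worked so far.
values = {"open_valve": 0.0, "close_valve": 0.0, "hold": 0.0}
counts = {a: 0 for a in values}

def reward(action: str) -> float:
    """Stand-in for the environment: 'hold' is secretly the best action."""
    means = {"open_valve": 0.2, "close_valve": 0.1, "hold": 0.5}
    return random.gauss(means[action], 0.05)

for step in range(1000):
    a = epsilon_greedy(values, epsilon=0.1)
    r = reward(a)
    counts[a] += 1
    values[a] += (r - values[a]) / counts[a]  # incremental mean update

print(values)  # estimates converge toward the true rewards; 'hold' wins
```

Exploitation alone would lock onto whichever action looked best early on; the occasional random exploration is what lets the learner discover the genuinely optimal choice.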
In practice, human experts first break the process down into smaller tasks. They then give the AI brain a few simple problems so it can begin learning how to use its algorithms to solve these easy challenges. Then they combine small tasks that the brain has already seen into larger ones until it can automatically control large, complex systems.
“We basically reduce the mathematical space that the AI brain has to look at by limiting it to certain parameters and ranges,” said Glockner. “Then we increase the range over time. The brain only has to deal with the new space and it already has some methods that it found in the earlier, smaller range that can be applied to the larger ones as well.”
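In code, this staged approach looks like a curriculum: train on a narrow parameter range first, then reuse the partly trained policy as the starting point for progressively wider ranges. The sketch below is a hypothetical outline of that loop; train_policy and update are placeholder names invented for illustration, not functions from Bonsai or Ansys.

```python
import random

def train_policy(policy, param_range, episodes=100):
    """Placeholder trainer: samples start conditions from the current
    range and (in a real system) would update the policy on each one."""
    low, high = param_range
    for _ in range(episodes):
        start_condition = random.uniform(low, high)
        policy = update(policy, start_condition)  # one learning step
    return policy

def update(policy, condition):
    """Stand-in for one learning step; a real implementation would run
    an episode in the simulator and apply a reinforcement update."""
    return policy

# Curriculum: each lesson widens the range of rotor speeds (rad/s) the
# brain must handle, so skills from easy lessons transfer to harder ones.
curriculum = [(0.0, 10.0), (0.0, 50.0), (0.0, 200.0)]

policy = None  # start untrained
for lesson, speed_range in enumerate(curriculum, start=1):
    policy = train_policy(policy, speed_range)
    print(f"Lesson {lesson}: trained on rotor speeds {speed_range}")
```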
While it is important to start with small tasks and limited data when initially training a brain using MT, a well-trained brain still requires large amounts of data to fully optimise its operation.
Typically, this involves generating huge amounts of data by running a physical process over and over. This data can then be fed into the brain to fine-tune its operation on the complete machine or process it was designed to automate. But generating so much data from physical processes is time-consuming and expensive. Also, if a condition occurs only once in a trillion runs (a ‘corner case’) and is never encountered during training, the brain will not recognise it and will not know how to react when it eventually occurs.
Working with Ansys, Microsoft Project Bonsai overcomes these limitations by running hundreds of virtual models of the machine or application simultaneously and feeding the data generated by these digital twins directly into the brain to optimise it. Using large numbers of virtual models instead of fewer physical ones reduces the time and cost of training a brain. It also enables engineers to introduce corner cases in the virtual environment that would be dangerous or damaging to reproduce on a physical machine, so the brain has seen all possible scenarios before it is put into operation. The physics-based digital twin created with Ansys can be further improved by incorporating knowledge from asset data, for example for model calibration or augmentation, which results in a hybrid digital twin.
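As a rough illustration of the idea, the Python sketch below runs many simulated episodes in parallel and deliberately over-samples a rare corner case so it appears in the training data. simulate_episode is a toy stand-in for a digital twin model; it is not an Ansys or Bonsai API, and all names and numbers are assumptions made for the example.

```python
from concurrent.futures import ProcessPoolExecutor
import random

def simulate_episode(seed: int) -> dict:
    """Toy stand-in for one digital-twin run of the machine."""
    rng = random.Random(seed)
    # Deliberately inject a corner case in 5% of virtual runs, far more
    # often than it would ever occur on the physical machine.
    corner_case = rng.random() < 0.05
    load = rng.uniform(0.0, 1.0) * (10.0 if corner_case else 1.0)
    return {"seed": seed, "corner_case": corner_case, "peak_load": load}

if __name__ == "__main__":
    # Run hundreds of virtual twins in parallel instead of one physical rig.
    with ProcessPoolExecutor() as pool:
        episodes = list(pool.map(simulate_episode, range(400)))
    rare = [e for e in episodes if e["corner_case"]]
    print(f"{len(episodes)} episodes generated, {len(rare)} with corner cases")
    # `episodes` would then be fed to the brain for fine-tuning.
```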
Andy Byers is the director of strategic partnerships at Ansys
This article was originally published in the Winter 2022 issue of Technology Record.