Best practices for machine learning in industrial automation | Avnet Silica

Best practices and use cases for machine learning in industrial automation

Michaël Uyttersprot, Market Segment Manager Artificial Intelligence and Vision
Robot working on an embedded board

Industrial automation will undergo dramatic changes in the near future. Continuous and highly dynamic advances are underway in factory and process automation, with the goal of helping companies manufacture faster, with higher quality and more flexibility – all within a secure production environment. Artificial intelligence (AI), and machine learning in particular, is playing a key role in the industrial automation of tomorrow.

In this article, we aim to present a range of application examples and solution approaches for the use of machine learning in manufacturing, including:

  • Self-learning robots and “cobots”
  • Environmental monitoring in factory automation
  • Operations and process management with AI-based smart glasses
  • Edge computing and intelligent sensors

 

Self-learning industrial robots and cobots

Ever since manufacturers started using robots, the focus has been on repeatable processes in mass production to ensure faster and better output. These traditional industrial robots are either undergoing dramatic improvements through the integration of self-learning mechanisms, or being replaced by self-learning or collaborative robots (known as cobots). The objective is clear: getting them to react to – or directly interact with – their surroundings or certain conditions in their environment to enable more flexible, safer and better production. Combined with computer vision (or smart image processing) and extensive, integrated sensor technologies, machine learning can help companies make a quantum leap towards the goals of autonomous production and incident predictability.

Self-learning industrial robots

The biggest difference between ordinary and self-learning robots is the necessity – or lack of it – to be programmed. An inflexible program can be replaced or complemented by algorithms that enable a high degree of autonomy, even with complex production tasks. Techniques like deep reinforcement learning and unsupervised learning produce robots that use cameras to monitor their surroundings (for example, localising the parts they need for production) while learning more over time. The robots receive instructions about what they should produce, but not how they should produce it. They complete their tasks autonomously rather than automatically, independently correcting any errors they detect in their surroundings and in the production conditions they are monitoring and analysing. This makes lengthy programming processes virtually unnecessary, as programmers no longer need to anticipate every possible incident and build it into the program. A human specifies the task, while the robot autonomously translates the specification into a program that it adjusts whenever required.
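The idea of learning a task from a reward signal rather than from an explicit program can be sketched in miniature. The toy below uses tabular Q-learning, a simplified relative of the deep reinforcement learning mentioned above, to teach an "arm" on a one-dimensional line to reach a part. The states, actions and rewards are illustrative assumptions, not a real robot controller:

```python
import random

# Toy task: an arm must move from cell 0 to the part at cell 4 on a
# 1-D line. Actions: 0 = left, 1 = right. Reward is given only at the
# goal; the robot is told WHAT to reach, never HOW to get there.
N_STATES, GOAL = 5, 4
q = [[0.0, 0.0] for _ in range(N_STATES)]  # Q-table: state x action

def step(state, action):
    nxt = max(0, min(N_STATES - 1, state + (1 if action == 1 else -1)))
    return nxt, (1.0 if nxt == GOAL else 0.0)

alpha, gamma, eps = 0.5, 0.9, 0.2          # learning rate, discount, exploration
random.seed(0)
for _ in range(500):                        # training episodes
    s = 0
    while s != GOAL:
        # Epsilon-greedy: mostly exploit, sometimes explore.
        a = random.randrange(2) if random.random() < eps else int(q[s][1] > q[s][0])
        s2, r = step(s, a)
        # Q-learning update: move Q(s,a) towards r + gamma * max_a' Q(s',a')
        q[s][a] += alpha * (r + gamma * max(q[s2]) - q[s][a])
        s = s2

policy = [int(q[s][1] > q[s][0]) for s in range(N_STATES)]
print(policy)  # the learned policy should head right, towards the part
```

No sequence of moves was ever programmed; the policy emerges purely from trial, error and reward, which is the essence of the self-learning behaviour described above.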

Self-learning robots can be used flexibly and rapidly for a variety of applications and individual tasks with no need to create a “new” and dedicated robot each time. They will play a bigger role in small-batch production and in personalised products – like clothes, shoes and much more. As a result, the number of robots in use over the next decade will grow significantly.

Collaborative industrial robots or “cobots”

These are defined as robots designed to interact with humans in a shared, safe environment, unlike industrial robots, which work autonomously with little human involvement.

With cobots, we are talking about very complex machines equipped with sensors, integrated image processing and protected movable parts (arms). They have no parts at risk of jamming, making it much easier for them to work with people. A cobot’s embedded sensors enable it to instantly detect a human touch and adapt its working speed accordingly. Cobots are typically designed to be human-sized and can be implemented directly in a variety of environments.

A good example of the way cobots are used is on assembly lines, where experience shows that humans and cobots working together are more productive than purely manual or fully automated assembly lines. Further examples of cobot applications include industrial processes such as moving heavy objects, welding, grinding, pressing, sorting and fixing screws.

A cobot assisting a worker
A cobot assisting a worker. Result: a faster, more accurate and more secure process. (Source: Adobe Stock)

While working, cobots can replicate a worker’s dexterity while bringing additional features into processes that can be beneficial to production or to the process itself. These include rapid process adaptation, configuration and error prevention. The fact that they programme themselves by learning facilitates their deployment. The worker takes the robot’s arm, guides it through various steps or process positions, opens and closes its gripper and so on. In this way, the worker teaches the cobot what to do instead of programming it. And machine learning algorithms improve the learning process itself, which results in faster and more effective production, either on the assembly line or elsewhere.
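The teach-by-guiding workflow described above can be sketched as a record-and-replay loop. Everything here – the `Cobot` class, the positions, the gripper flags – is a hypothetical illustration of the principle, not a real cobot API:

```python
# Hypothetical sketch: teaching a cobot by guiding its arm instead of
# programming it. Waypoints and gripper states are recorded during the
# demonstration and replayed afterwards. All names are illustrative.

class Cobot:
    def __init__(self):
        self.recording = []

    def record(self, position, gripper_closed):
        """Called while the worker guides the arm by hand."""
        self.recording.append((position, gripper_closed))

    def replay(self):
        """Re-execute the demonstrated steps autonomously."""
        trace = []
        for position, gripper_closed in self.recording:
            state = "closed" if gripper_closed else "open"
            trace.append(f"move to {position}, gripper {state}")
        return trace

# The worker guides the arm: pick a part at (10, 0), place it at (0, 20).
bot = Cobot()
bot.record((10, 0), True)   # close gripper on the part
bot.record((0, 20), False)  # release it at the target
print(bot.replay())
```

In a real system, machine learning would additionally smooth and generalise the recorded trajectory rather than replaying it verbatim, which is where the "improving the learning process itself" mentioned above comes in.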

Application examples for self-learning robots

  • Materials handling: moving, packing, selecting and sorting materials or products. Robots can minimise or eliminate the need for humans to carry out recurring, heavy or dangerous tasks, working alongside them instead.
  • Assembly line processes: the cobot can carry out tiring or difficult (heavy) tasks or help the worker to complete the process.
  • Tasks that are difficult to automate: the robot or cobot can learn and take over the execution of nonlinear processes that are not easy to replicate. These include polishing, cutting, sanding, deburring or other precision mechanical tasks. Machine learning can also improve the quality of these processes.
  • Sealing: specialised robots with multiple arms previously required extensive programming. Now, they can move into unusual positions and areas thanks to machine learning. They decide for themselves which action to execute with which arm.
  • Welding, painting: these are also areas where self-learning robots are capable of adapting themselves to complicated situations and of optimising processes while working with people. In addition, they are not endangered by hazardous surroundings and can take over unpleasant tasks from their human co-workers.

 

Condition monitoring in factory automation

Industry 4.0 not only regroups automation and data sharing technology for manufacturing, it also takes them to the next level thanks to IoT, cloud computing and artificial intelligence. And Industry 4.0 naturally also covers optimum working and machine conditions, which are increasingly underpinned by intelligent sensors and self-learning algorithms. Human intervention is reduced to a minimum.

From reactive to predictive maintenance

Automated manufacturing requires stable conditions, which in turn depend on the quality of machines, materials and clearly defined processes – and on machine maintenance carried out at fixed intervals. Repairs often represent the worst case, as they involve a production standstill. The order of the day is for companies to monitor machine conditions before the worst case occurs, watching for parameters that indicate an upcoming outage, problem or error.

Intelligent systems with sensors and self-learning condition monitoring enable predictive maintenance planning, reducing the risk of costly downtime. Predictive maintenance is the (r)evolution beyond reactive maintenance (acting once the worst case has occurred) and preventive maintenance (regular servicing intended to stop the worst case from occurring). Real-time monitoring of parameters and the detection of anomalies in the behaviour of the production environment, combined with self-learning algorithms, can prevent major outages. They can also dramatically reduce maintenance costs – and enable companies to schedule maintenance so that any downtime occurs outside of peak production periods.

  • Predictive maintenance: this involves the continuous monitoring and evaluation of production equipment. Self-learning algorithms interpret the data and draw conclusions from this analysis. Using this information, companies plan maintenance cycles to prevent outages or a drop in performance (based on a predefined tolerance range) – and schedule them for times that are as cost-effective as possible. Predictive maintenance differs from traditional (preventive) maintenance in that it uses artificial intelligence in combination with extensive sensor technology. Unlike the robotics described above, the algorithms are based on supervised learning.
  • Detection of anomalies: unsupervised learning algorithms can be used to identify and analyse rare incidents, events and observations deduced from significant data discrepancies. The data types and parameters involved must then corroborate these suspicions, for example by pointing to a structural defect or the factor that induced the problem.
  • Prescriptive maintenance: this may sound somewhat clinical, but is the right expression to describe the next step following on from the first two. With the help of advanced analytics software, the system goes beyond a forecast to deliver a recommendation and associated action to resolve the expected problem in the most efficient way.
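A minimal, unsupervised sketch of the anomaly detection described above: each new reading is compared against a rolling window of recent history, so "normal" is estimated from the data itself rather than programmed in. The window size, threshold and simulated sensor values are illustrative assumptions:

```python
import statistics

def detect_anomalies(readings, window=20, threshold=3.0):
    """Flag readings that deviate strongly from the recent rolling window."""
    flags = []
    for i, x in enumerate(readings):
        history = readings[max(0, i - window):i]
        if len(history) < window:
            flags.append(False)          # not enough context yet
            continue
        mean = statistics.fmean(history)
        std = statistics.pstdev(history) or 1e-9
        # A reading more than `threshold` standard deviations away from
        # the recent mean is treated as an anomaly.
        flags.append(abs(x - mean) / std > threshold)
    return flags

# Simulated vibration sensor: stable around 1.0, one spike at index 30.
data = [1.0 + 0.01 * ((i * 7) % 5) for i in range(50)]
data[30] = 5.0                           # the anomaly to catch
flags = detect_anomalies(data)
print(flags[30], sum(flags))
```

Production systems replace the rolling z-score with learned models (autoencoders, isolation forests and the like), but the principle is the same: the definition of "normal" comes from the data, not from the programmer.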

Use case: bottling

Condition monitoring for single machines is one thing, but how does it work for entire production lines? How can companies prevent the outage of a single system (motor, pump, valve etc.) from halting the entire line? It would be a huge challenge to program actions for every possible incident in advance.

In this use case, a bottling line comprises multiple interconnected systems (filling machines, a conveyor belt etc.). The filling machine is equipped with a machine controller and an AI controller. The machine controller records signals and signal patterns from the sensors in the machine, while the AI controller uses its machine learning algorithms to interpret anomalies in the signals or signal combinations.

If there are signs of a possible malfunction in, say, the motors of the filling machine or on the conveyor belt, the AI can predict the possible outage and a targeted maintenance job can be scheduled for periods of low-level production. Alternatively, individual systems can be isolated (such as a single bottle filler) and tested using a few empty bottles until the problem is resolved. Every incident is traceable – down to the level of a single bottle – using QR codes. As a result, there are no abrupt outages, soiling or empty runs, as everything is managed efficiently through prediction.
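The division of labour in this use case can be sketched as follows. The `ai_controller` function, subsystem names and thresholds are all hypothetical; in practice the anomaly scores would come from learned models rather than being supplied directly:

```python
# Hypothetical sketch of the split described above: a machine controller
# collects per-subsystem signals, while an "AI controller" turns anomaly
# scores into actions. Subsystem names and thresholds are illustrative.

def ai_controller(anomaly_scores, warn=0.5, critical=0.9):
    """Map per-subsystem anomaly scores (0..1) to maintenance actions."""
    actions = {}
    for subsystem, score in anomaly_scores.items():
        if score >= critical:
            # Isolate the subsystem and test it with a few empty bottles.
            actions[subsystem] = "isolate and test"
        elif score >= warn:
            # Predicted outage: schedule work for a low-production window.
            actions[subsystem] = "schedule maintenance"
        else:
            actions[subsystem] = "ok"
    return actions

scores = {"filler_3": 0.95, "conveyor_motor": 0.6, "capper": 0.1}
print(ai_controller(scores))
```

The point of the sketch is the separation of concerns: signal capture stays deterministic in the machine controller, while the interpretation and prediction live in the AI controller.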

Machine Learning Maintenance Evolution diagram
Machine learning for condition monitoring in the manufacturing industry will shake up all maintenance practices. (Source: Avnet Silica)

Application examples for condition monitoring in factory automation

  • Monitoring machine performance: condition analysis based on sensor data (temperature, vibration etc.).
  • Intelligent asset monitoring: automated, real-time monitoring of all types of goods or physical assets.
  • Predictive quality assurance: predictive analytics algorithms can help to detect and resolve potential quality problems or trends in good time.
  • Warehouse management: monitoring and control of the availability of parts or materials to reduce or eliminate the risk of surplus stock (too much operating capital tied up) or insufficient stock (irritated customers, lost revenue).
  • Workplace safety: to ensure a safe working environment, companies must protect staff and their surroundings. This can be done using sensor data with self-learning algorithms to interpret it and put any protection measures into place before hazards can cause injury.

 

Production and process management with AI-enabled glasses

Traditional operations management software, and its various characteristics, supports a range of very different processes: from production management to maintenance to logistics (pre- and post-production). It does this in a highly coordinated – not to mention rigid – manner with processes programmed in a fixed and unchangeable way. In today’s operations process chain, companies aim to make management software solutions more flexible by using them with technologies like augmented reality combined with AI. That allows them to take a range of process-related data into account that was previously not available or too difficult to program – real-time sensor data, incident information and details of people, machines and material.

In this specific application example, the interconnections are managed by self-learning algorithms driven by an AI processor built into a pair of augmented reality glasses. These glasses have it all: a processor with an AI engine, high-resolution cameras, a field-of-sight display, audio I/O, position sensors and a wireless connection. The user benefits from combining their own perception with the computer-generated data in their glasses. Their hands stay free, enabling them to gather additional information from their surroundings. The deep learning algorithms use the computer vision data and that of the user to recognise and classify objects. The technology also includes natural language processing (in several languages and in real time), providing genuinely useful human-machine cooperation that can save time in many areas of manufacturing – and in the wider world.

Benefits of AI-enabled smart glasses

Intelligent AR glasses in the industrial environment deliver several critical benefits, both to support staff in their work and to train them.

  • Speech and computer vision functions make the human-machine connection easy: the glasses recognise objects and can use speech or images to give the worker instructions, recommendations or tips. The guidance is tailored to each specific situation, helping the worker assess it and plan the next steps.
  • The technology also “knows” which tasks the person is currently working on or is about to start work on, where they currently are, which objects they are holding and which obstacles there are in the surrounding area. This enables companies to achieve significant performance and quality improvements across a range of processes.
  • Workflows keep the worker’s hands free for more flexibility, efficiency and safety.
  • Computer-aided situation assessments (with image data display and AI-validated data capture) almost inevitably ensure fewer errors and wrong decisions in critical situations.
  • Experienced staff can help a new or less experienced co-worker without needing to be on site, acting as a virtual, real-time shadow: watching what the co-worker does, correcting them or recommending additional actions. Learning by augmented doing, so to speak.

Worker operating a complex machine with AI glasses
Working on a complex machine with AI glasses. (Source: Microsoft Corp.)

There are many ways these AI glasses can be used. In industrial manufacturing these include:

  • Maintenance and repair: complex machines and devices that have unusual dimensions, parts or shapes need experienced specialists who are not always available everywhere. Intelligent glasses “know” what needs to be done and collect important information on-site that can be used to recommend actions to others, even non-experts.
  • Field service: maintenance technicians in the field access product or machine data (e.g. about power stations, transformers, wind turbines, lighting systems etc.) for faster diagnosis, maintenance or repairs. AI-enabled glasses are often quicker at detecting where repairs are required.
  • Remote monitoring: systems can now be monitored and managed from almost anywhere using a “digital twin” accessible with AI glasses that gives its “physical twin” instructions (e.g. turn a system or subsystem on or off). These examples can also be combined.
  • Logistics: using AI glasses in logistics centres, users can identify the shortest or fastest paths or processes and systematise them.
  • Design and model construction: engineers can build new, optimised machines by getting inside a digital model to examine and test it.
  • Training: virtual training using AI glasses – like flight simulators for pilots – helps new co-workers learn processes and experienced staff to work on improving a process.
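The logistics case above – identifying the shortest path through a warehouse – can be sketched with a classic breadth-first search over a grid map. The grid layout and function are assumptions for illustration; in practice the glasses would build such a map from their position sensors and cameras:

```python
from collections import deque

def shortest_path_length(grid, start, goal):
    """BFS over a warehouse grid (0 = free aisle, 1 = shelf).

    Returns the number of steps on a shortest path, or None if the
    goal cannot be reached.
    """
    rows, cols = len(grid), len(grid[0])
    queue = deque([(start, 0)])
    seen = {start}
    while queue:
        (r, c), dist = queue.popleft()
        if (r, c) == goal:
            return dist
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if 0 <= nr < rows and 0 <= nc < cols and grid[nr][nc] == 0 \
                    and (nr, nc) not in seen:
                seen.add((nr, nc))
                queue.append(((nr, nc), dist + 1))
    return None  # goal unreachable

# Tiny warehouse: a shelf row blocks the direct route, so the shortest
# path detours around it.
warehouse = [
    [0, 0, 0],
    [1, 1, 0],
    [0, 0, 0],
]
print(shortest_path_length(warehouse, (0, 0), (2, 0)))
```

Systematising such routes – the "fastest paths or processes" mentioned above – is essentially this search, repeated over live data and fed back into workers' displays.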

 

Edge computing and intelligent sensors

In today’s Internet of Things (IoT) applications, sensor data is collected and sent to the cloud, where intelligent algorithms analyse it or recommend decisions. The AI runs on very powerful servers. However, this cloud- and server-based structure can cause time lags – and in a real-time situation this can result in delayed decisions and possible problems.

Edge processing, or edge computing, brings the AI-based data analysis nearer to the application by using a powerful processor with an AI engine that quickly analyses sensor data on site, making it easier to make a quick decision. As it is network-independent, the technology is useful for a wide range of autonomous systems, as long as there is sufficient local intelligence to support self-learning algorithms.


At the heart of these applications are intelligent sensors with integrated processors that can run data analysis and self-learning algorithms. They deliver results in real time (milliseconds or microseconds), regardless of any bandwidth or cloud availability problems.

The need for this type of smart sensor will increase dramatically, although the technology will not completely replace cloud computing. Instead, it will ensure a practical division of tasks between real-time decision-making locally and complex data analyses in the cloud.
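That division of tasks might be sketched like this: per-reading decisions are taken locally on the edge node, and only compact aggregates are sent upstream for long-term analysis. Class and method names are illustrative assumptions, not a real edge framework:

```python
# Sketch of the edge/cloud split described above: the edge node makes
# real-time decisions locally and forwards only compact summaries to
# the cloud. All names and the temperature limit are illustrative.

class EdgeNode:
    def __init__(self, limit=80.0):
        self.limit = limit
        self.buffer = []

    def on_reading(self, temperature):
        """Real-time path: decide locally, per reading, with no network hop."""
        self.buffer.append(temperature)
        return "shutdown" if temperature > self.limit else "ok"

    def summary_for_cloud(self):
        """Batch path: only an aggregate leaves the device."""
        n = len(self.buffer)
        return {"count": n, "mean": sum(self.buffer) / n, "max": max(self.buffer)}

node = EdgeNode()
decisions = [node.on_reading(t) for t in [70.0, 72.5, 85.0, 71.0]]
print(decisions)
print(node.summary_for_cloud())
```

The safety-critical decision never waits for the cloud, while the cloud still receives enough aggregated data for the complex, slower analyses it is good at.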

Key benefits of edge computing combined with intelligent sensors

  • Real-time data analysis: the need to transfer data to the cloud for further processing is eliminated, so there are no more delays or network-related problems, which significantly improves real-time decision making. The data analysis is also considerably faster (microseconds) and closer to the application.
  • Security and data protection: edge devices include data security features that restrict the transfer of critical data to external systems. This is also likely to reduce the risk of companies falling victim to hackers.
  • Higher process reliability: all decisions about processes, workflows and equipment condition are taken on site, making machine learning much closer to the application and enabling rapid optimisations.
  • Cost reduction: Many of the costs associated with the cloud and communication are reduced or eliminated entirely.
  • Less server power: although edge computing needs more expensive computing power nearer to the sensors, today’s semiconductor innovations already enable self-learning algorithms to be integrated on low-cost microcontrollers. In future, we may well see sensor chips with AI engines. All this significantly reduces the costs of edge computing and generally those of machine learning too.

Intelligent sensors used in manufacturing
Intelligent sensors used in manufacturing (Source: Adobe Stock)

Application examples

There are many possible applications for smart sensors across all industry segments and business areas. Almost all IoT applications will move in that direction if their nature, environment or scope make them suitable for edge computing. Here are just three of the stand-out use cases:

  • Intelligent motor monitoring: smart condition monitoring can be installed directly on the billions of electric motors in use today, giving companies a quick way to implement a predictive maintenance strategy. That will help them prevent downtime while dramatically reducing maintenance costs throughout the industry.
  • Data logging: based on the data generated by sensors – about temperatures, vibrations or the surrounding conditions – companies can collect and store a lot of information independently of predictive analytics. They can then link this information to historical data as a basis for innovative quality management initiatives, as the sensors can learn on-site, make comparisons and potentially make strategic recommendations.
  • Human activity recognition (HAR) sensors: these can monitor and predict human movement, enhancing human-machine interaction and preventing accidents or hazardous situations. The benefits of the real-time and edge computing aspects here are obvious. This alone could become a huge market.
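As a toy illustration of the HAR idea, short accelerometer windows can be classified by how much they vary. A real HAR system would learn its decision boundaries from data; the fixed variance thresholds and sample windows here are assumptions for illustration:

```python
import statistics

# Illustrative HAR sketch for an edge sensor: classify short windows of
# accelerometer magnitude by their variance. Still bodies barely vary,
# walking varies moderately, running varies a lot. Thresholds assumed.

def classify_window(samples, still=0.01, walking=1.0):
    var = statistics.pvariance(samples)
    if var < still:
        return "still"
    return "walking" if var < walking else "running"

still_w = [1.0, 1.0, 1.01, 1.0]   # near-constant gravity reading
walk_w = [0.8, 1.3, 0.7, 1.4]     # moderate periodic swing
run_w = [0.1, 3.0, 0.2, 3.2]      # large swings
print([classify_window(w) for w in (still_w, walk_w, run_w)])
```

Because each window is only a handful of samples, this kind of classification fits comfortably on a microcontroller next to the sensor, which is exactly the real-time, edge-resident benefit the bullet describes.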

 

Conclusion

Whether it is connected or autonomous, machine learning delivers benefits that go way beyond “faster, better and cheaper” production. But everything must always start with the use case. When projects fail today, it is often due to complexity or because their goals were unclear. With self-learning capabilities, machine intelligence can help companies master projects that are difficult to plan.


About Author

Michaël Uyttersprot, Market Segment Manager Artificial Intelligence and Vision
Michaël Uyttersprot

Michaël Uyttersprot is Market Segment Manager at Avnet Silica, which is continuing to develop and ad...
