How Do Machines Learn?
Let's take a look at how machines learn and also explore predictive maintenance and process control.
Factories have witnessed a sea change over the past three decades. The 80s and 90s saw the rise of industrial automation, and robots came to the forefront. During the past decade, multiple game-changing technologies have been reshaping factories. Machine Learning, the Internet of Things (IoT), Big Data, Virtual Reality (VR), and Artificial Intelligence (AI) are fundamentally altering the way factories work. Their impact is not limited to manufacturing; they are influencing almost every industry. This article explains Machine Learning and its significance in the world of manufacturing.
First, let’s try and explain Machine Learning. Simply put, it refers to algorithms that improve on their own. Normally, when a program is written, it is expected to deliver a prespecified output for a given set of inputs. Machine Learning algorithms, by contrast, identify patterns over time and “learn” to generate outputs for sets of inputs they were never explicitly programmed to handle. In some cases, algorithms learn to respond to entirely new situations; for example, algorithms on the trading floor “learn” to respond to different market situations. Machine Learning can be implemented through techniques such as Decision Tree Learning, Association Rule Learning, and Artificial Neural Networks.
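To make the idea of “learning” concrete, here is a minimal sketch of a perceptron, a simple ancestor of the Artificial Neural Networks mentioned above. Instead of being programmed with a rule, it adjusts its weights from labeled examples until its outputs match the labels. The dataset (the logical OR pattern), learning rate, and epoch count are illustrative assumptions, not anything prescribed by the article.

```python
# A perceptron "learns" a rule from examples instead of being told the rule.
# Dataset and hyperparameters are illustrative assumptions.

def train_perceptron(samples, labels, lr=0.1, epochs=20):
    w = [0.0, 0.0]   # feature weights, nudged on every mistake
    b = 0.0          # bias term
    for _ in range(epochs):
        for x, y in zip(samples, labels):
            pred = 1 if w[0] * x[0] + w[1] * x[1] + b > 0 else 0
            err = y - pred               # 0 when correct, +/-1 when wrong
            w[0] += lr * err * x[0]
            w[1] += lr * err * x[1]
            b += lr * err
    return w, b

def predict(w, b, x):
    return 1 if w[0] * x[0] + w[1] * x[1] + b > 0 else 0

# Learn the logical OR pattern purely from examples
X = [(0, 0), (0, 1), (1, 0), (1, 1)]
y = [0, 1, 1, 1]
w, b = train_perceptron(X, y)
print([predict(w, b, x) for x in X])  # → [0, 1, 1, 1]
```

The same weight-update loop, scaled up to many layers and millions of weights, is the core of the neural networks used in modern factories.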
How do we leverage Machine Learning in factories? There are multiple possibilities; the following sections briefly describe how Machine Learning can be applied on the shop floor.
Predictive Maintenance
Equipment on the shop floor generates tons of data during operation, just as airplanes generate a large volume of data while flying. Until recently, most of these data went virtually unnoticed. With the advent of advanced analytics on Big Data and IoT platforms, these data can now be analyzed to identify the ranges of critical parameters that precede breakdowns. Algorithms can then analyze machine-generated data in real time and trigger an alarm as soon as any critical parameter enters the red zone. Over time, algorithms can even predict types of breakdowns that have never occurred before. This helps prevent machine breakdowns and minimize unplanned downtime.
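The “red zone” idea above can be sketched very simply: learn a normal operating band from historical sensor readings (here, mean ± 3 standard deviations) and raise an alarm when a new reading falls outside it. The vibration values are invented for illustration; a real system would stream them from IoT gateways and use far richer models.

```python
# Learn a normal operating band from history, then flag readings outside it.
# Sensor values are hypothetical; real data would come from IoT platforms.
import statistics

def learn_limits(history, k=3.0):
    mu = statistics.mean(history)
    sigma = statistics.stdev(history)
    return mu - k * sigma, mu + k * sigma

def check(reading, limits):
    low, high = limits
    return "OK" if low <= reading <= high else "ALARM"

# Historical vibration readings (mm/s) from a healthy pump
history = [2.1, 2.0, 2.2, 1.9, 2.1, 2.0, 2.2, 2.1, 2.0, 1.9]
limits = learn_limits(history)

print(check(2.1, limits))   # within the learned band
print(check(4.5, limits))   # far outside the band -> alarm
```

The learning step here is trivial (two statistics), but the pattern — fit limits from data rather than hard-coding them — is what distinguishes this from classic threshold alarms.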
Process Control
Just like equipment, processes generate a large amount of data. These data conceal a lot of information about process control as well as Out of Control (OOC) and Out of Spec (OOS) situations. When an OOC or OOS situation occurs, a series of steps known as an OCAP (Out of Control Action Plan) must be followed. With automation, OOC and OOS conditions can be detected automatically, and to a good extent, the OCAP steps can be automated as well. Machine Learning can help such automation respond to OOC and OOS situations that have not been encountered before. That said, it is not true that Machine Learning enables an automated response to “any” new OOS or OOC scenario; rather, over time, the automation becomes “intelligent” enough to respond to certain new situations.
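As a rough sketch of the automated OOC/OOS detection and OCAP dispatch described above: classify each measurement against spec limits (OOS) and tighter control limits (OOC), then look up the pre-programmed action. The limits and the action table are assumptions made up for this example.

```python
# Classify measurements as OK / OOC / OOS and dispatch pre-programmed
# OCAP actions. All limits and actions are illustrative assumptions.

SPEC = (9.5, 10.5)     # Out of Spec (OOS) limits for the measured value
CONTROL = (9.8, 10.2)  # tighter Out of Control (OOC) limits

OCAP = {  # pre-programmed Out of Control Action Plan steps
    "OOS": "stop line, quarantine lot, notify quality engineer",
    "OOC": "recalibrate station, increase sampling frequency",
    "OK": "no action",
}

def classify(value):
    if not (SPEC[0] <= value <= SPEC[1]):
        return "OOS"
    if not (CONTROL[0] <= value <= CONTROL[1]):
        return "OOC"
    return "OK"

for reading in (10.0, 10.3, 11.0):
    state = classify(reading)
    print(reading, state, "->", OCAP[state])
```

The Machine Learning contribution the article describes would sit on top of this: adjusting the limits from data and extending the action table to situations the original programmers did not enumerate.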
Yield Prediction
It’s important to know the probable defects and their impact on Yield before a product goes into mass production. Predictive models analyze historical data from similar products, as well as data from the same product during prototyping, to predict defects and their quantum, which will, in turn, affect Yield. As predictive models mature, their accuracy improves, and they can forecast defect rates for scenarios that have not been explicitly programmed.
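One simple way such a model can turn per-defect rates into a Yield estimate: if the defect types are assumed to occur independently, the expected Yield is the product of each defect type’s escape probability. The defect names and rates below are hypothetical.

```python
# Estimate Yield from historical per-defect rates of a similar product,
# assuming defect types occur independently. Rates are hypothetical.

def predicted_yield(defect_rates):
    y = 1.0
    for p in defect_rates:
        y *= (1.0 - p)   # probability a unit escapes this defect type
    return y

rates = {"scratch": 0.02, "misalignment": 0.01, "solder_void": 0.005}
print(round(predicted_yield(rates.values()), 4))  # → 0.9653
```

A learned model would replace the fixed rates with predictions driven by process parameters, which is where the forecasting of unprogrammed scenarios comes from.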
Energy Optimization
Energy expense is one of the major overheads in a factory, and companies want to minimize energy-related costs. For a given production volume, equipment cluster, floor layout, etc., programs calculate the optimum level of energy required. A closed-loop mechanism can then keep energy consumption at that optimum level, correcting it whenever it goes beyond prescribed limits. However, today’s shop floor is a dynamic environment where multiple parameters can change; for example, there may be overtime on certain equipment, or there may be human errors. Programs designed for energy optimization “learn” over time to react to such dynamic situations and keep energy consumption at the desired level.
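The closed-loop mechanism mentioned above can be illustrated with a toy proportional controller: measure consumption, compare it to a setpoint, and return a correction proportional to the error. The setpoint and gain are invented numbers; a learning system would tune such parameters from operating data.

```python
# Toy closed-loop sketch: nudge a power setting toward a target
# consumption level. Setpoint and gain are illustrative assumptions.

def regulate(consumption, setpoint=100.0, gain=0.5):
    """Return the adjustment to apply to the power setting."""
    error = setpoint - consumption
    return gain * error

print(regulate(120.0))  # over-consuming -> negative adjustment (-10.0)
print(regulate(90.0))   # under target   -> positive adjustment (5.0)
```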
Material Consumption
Material costs are a major contributor to overall cost, especially in Process Manufacturing environments. Hence, Process Manufacturing plants closely monitor the chemicals consumed at different stations and by each process step. Algorithms have been developed to monitor this consumption and take corrective steps when it starts going beyond limits. They check a pre-programmed list of probable causes and take corrective actions as required. Over time, these algorithms “improve” and “learn” to respond to situations that are not explicitly defined in the program.
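The consumption-monitoring-plus-cause-list behavior can be sketched as a simple audit: compare usage per process step against its limit, and when it is exceeded, surface the pre-programmed probable causes. The steps, limits, and causes below are purely illustrative.

```python
# Watch chemical consumption per process step and surface a
# pre-programmed cause list when usage exceeds its limit.
# Steps, limits, and causes are illustrative assumptions.

LIMITS = {"etch": 5.0, "clean": 3.0}          # liters per lot, assumed
CAUSES = {"etch": ["leaking valve", "over-etch recipe"],
          "clean": ["pump drift", "operator override"]}

def audit(step, used):
    if used <= LIMITS[step]:
        return "within limit"
    return "over limit, check: " + ", ".join(CAUSES[step])

print(audit("etch", 4.2))
print(audit("clean", 3.8))
```

The “learning” the article describes would extend this beyond the hard-coded cause list, e.g. by correlating overconsumption with upstream process data.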
Scrap Reduction
Scrap has the potential to drag Yield down. Real-time analysis provides insights into the defects that lead to scrap, which in turn helps devise mitigation methods for each type of defect. Checks and controls can then be put in place to prevent defects. The underlying programs and logic keep improving and can offer new perspectives on defects, helping to reduce them and thereby improve Yield.
Inventory Management
Manufacturing organizations want to carry minimum inventory to avoid carrying costs and to reduce working capital requirements. Organizations resort to different forecasting and planning methods to minimize on-hand inventory, but gaps remain that result in stock-outs or excess inventory in plants and warehouses. Advanced analytics using Big Data and IoT platforms can crunch large volumes of real-time data and provide recommendations to consistently maintain an optimum (or minimum) level of inventory. As these models mature, they can correctly respond to certain scenarios that have not been experienced before, helping to manage inventory levels in plants effectively.
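A concrete baseline such recommendations build on is the classic reorder-point rule: reorder when on-hand stock falls below expected demand over the supplier lead time plus a safety buffer sized from demand variability. The demand figures, lead time, and service factor below are hypothetical.

```python
# Classic reorder-point rule that a learning system could tune over time.
# Demand history, lead time, and service factor are hypothetical.
import statistics

def reorder_point(daily_demand_history, lead_time_days, service_z=1.65):
    mu = statistics.mean(daily_demand_history)
    sigma = statistics.stdev(daily_demand_history)
    # safety stock covers demand variability over the lead time
    safety_stock = service_z * sigma * lead_time_days ** 0.5
    return mu * lead_time_days + safety_stock

demand = [40, 42, 38, 45, 41, 39, 44]   # units consumed per day
rop = reorder_point(demand, lead_time_days=4)
print(round(rop, 1))  # → 173.6
```

A Machine Learning layer would replace the fixed mean and standard deviation with demand forecasts that react to signals the simple formula ignores.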
Supply Chain Planning
It involves multiple decision-making steps, e.g. which plant should produce a given product and at what volume, whether to make or buy a product, where to maintain manufacturing facilities, where to source raw materials from, etc. Current models rely mostly on historical data; hence, they often fail to predict new scenarios. With Machine Learning, these models gain the capability to analyze large volumes of data in real time and become more “intuitive.” This helps build flexible models that can respond to certain situations not encountered before.
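One of the decision steps listed above, make-versus-buy, reduces at its simplest to comparing total costs at a given volume. The costs here are invented for illustration; real models weigh many more factors (capacity, lead time, risk), which is where data-driven approaches earn their keep.

```python
# Toy make-vs-buy comparison at a given production volume.
# All cost figures are invented for illustration.

def make_or_buy(volume, make_fixed, make_unit, buy_unit):
    make_cost = make_fixed + make_unit * volume   # in-house: fixed + variable
    buy_cost = buy_unit * volume                  # outsourced: per-unit only
    return "make" if make_cost < buy_cost else "buy"

# High volume amortizes the fixed cost; low volume does not
print(make_or_buy(10_000, make_fixed=50_000, make_unit=4.0, buy_unit=12.0))  # → make
print(make_or_buy(1_000,  make_fixed=50_000, make_unit=4.0, buy_unit=12.0))  # → buy
```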
Machine Learning holds the promise of improvement in multiple areas of manufacturing, but it is not “one size fits all.” Organizations must be careful in choosing the right Use Case and in selecting the right technology platform and System Integrator (SI). We also need to keep in mind that it takes a while before tangible benefits appear. A longer-term roadmap with the right choices of Use Case, technology platform, and SI partners holds the key to the success of such initiatives.