Executive summaries from IIE Transactions (September 2010)
Edited by Susan Albin and Joseph C. Hartman
This month's research highlights industrial engineers involved in process control. The first overview covers optimally allocating rapid measurement and flexible tooling devices to control quality. The second highlights multivariate system monitoring and control. These articles and four others appear in the October 2010 issue of IIE Transactions (Volume 42, No. 10).
Modern equipment, analytical models and quality control
Modern manufacturing processes, such as semiconductor or automotive manufacturing, can be characterized as multistage processes. In such processes, quality errors are introduced into the product at each stage and interact with one another, forming a complex dependency between process parameters and product quality. The past decade has seen a tremendous proliferation of models describing these connections, enabling model-based approaches to identifying the root causes of quality problems in modern manufacturing processes, to characterizing and allocating measurements, and to designing processes that maximize one's ability to identify root causes.
Recent advances in rapid measurement and flexible tooling devices, such as optical CMMs in automotive manufacturing or modern lithography tools in microelectronic manufacturing, enable real-time insight into the quality of the product as it progresses through the manufacturing system, so operators can react immediately to quality problems as they occur. Unfortunately, such measurement and controllable tooling devices are expensive and slow down the manufacturing process. Therefore, a manufacturer will most likely have a limited number of such devices, and they must be allocated and utilized intelligently.
In “Joint Allocation of Measurement Points and Controllable Tooling Machines in Multistage Manufacturing Processes,” doctoral student Yibo Jiao and professor Dragan Djurdjanovic from the University of Texas at Austin used explicit models of the flow of quality errors in modern manufacturing processes to devise a novel stochastic quality control method. This method facilitates distributed feed-forward adaptations of controllable tooling in the manufacturing system in response to quality problems detected through the in-process measurements of the product.
The resulting reduction in quality variation is then used to optimize the allocation of measurement and controllable tooling devices across a modern manufacturing process. This optimization enables smart utilization of measurements and controllable tooling through a trade-off between maximally reducing variation in product quality and the cost of the measurement and tooling resources needed to accomplish it. The results are demonstrated using the error flow model of an actual industrial process for automotive cylinder head machining. The authors' most recent research demonstrated that this method can achieve significant quality improvement in multilayer lithography in semiconductor manufacturing.
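The feed-forward idea behind the method can be illustrated with a toy simulation. The linear "error flow" model below, with its stages, matrices and noise levels, is a hypothetical sketch assumed for illustration, not the authors' actual model: quality errors propagate downstream stage by stage, and at a stage equipped with both a measurement device and a controllable tool, the measured error is cancelled before the part moves on.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical 3-stage linear "error flow" model (illustrative only):
# the quality-error state propagates as x_k = A x_{k-1} + B u_k + w_k.
A = np.array([[1.0, 0.2],
              [0.0, 1.0]])   # how stage errors propagate downstream
B = np.eye(2)                # a fully controllable tooling adjustment
NOISE_SD = 0.05              # per-stage process noise

def final_error(controlled_stages, trials=200):
    """Mean final error norm over many parts; at controlled stages the
    measured error is cancelled feed-forward by setting B u = -A x."""
    norms = []
    for _ in range(trials):
        x = np.array([0.5, -0.3])          # incoming part error
        for k in range(3):
            if k in controlled_stages:
                u = -np.linalg.solve(B, A @ x)   # feed-forward correction
            else:
                u = np.zeros(2)
            x = A @ x + B @ u + rng.normal(0.0, NOISE_SD, size=2)
        norms.append(np.linalg.norm(x))
    return float(np.mean(norms))

print(final_error(set()) > final_error({0}))   # → True: control shrinks error
```

The allocation question the paper addresses then amounts to choosing which stages belong in `controlled_stages`, trading the variation reduction each extra device buys against its cost.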
CONTACT: Dragan Djurdjanovic; firstname.lastname@example.org; (512) 232-0815; 1 University Station, C2200, ETC 5.122, Department of Mechanical Engineering, University of Texas at Austin, Austin, TX 78712
A missing link in multivariate system monitoring and control
Statistical process control and engineering process control (closed-loop, feedback adjustment) have different histories, but both are widely used strategies to maintain process output near a specified target value. Rather than adjusting at every measurement, a bounded adjustment strategy postpones a process adjustment until there is evidence that the system has drifted from its target value. Consequently, it is an important link between statistical and engineering process control strategies. The optimal bounded adjustment strategy for the case of a single variable was derived in the 1960s. Over the years, additional work has expanded upon this relationship, but the optimal strategy is still known only for a single variable.
However, modern systems typically record many process measurements, and multivariate methods are used extensively to control and monitor the systems. Consequently, the link between statistical and engineering process control is important in the multivariate case.
In “Optimal Multivariate Bounded Adjustment,” doctoral student Zilong Lian and professor Enrique del Castillo from The Pennsylvania State University and professor George Runger from Arizona State University derived the optimal bounded adjustment strategy for a multivariate system. They used dynamic programming methods and exploited a symmetrical relationship to obtain a closed-form solution for the optimal strategy. A simple calculation emerges that generalizes the univariate (single variable) case. Furthermore, a numerical method is developed to analyze the adjustment strategy for an arbitrary number of measurements with only a one-dimensional integral. Both infinite- and finite-time-horizon systems are analyzed, along with a numerical illustration.
CONTACT: George Runger; email@example.com; (480) 965-3193, Industrial Engineering, Arizona State University, Tempe, AZ 85287-8809
The most recent issue of The Engineering Economist (Volume 55, Number 2) features five articles, from cash flow analysis (average rate of return and probabilistic net present value) to N-person games under uncertainty to analyzing inflation and capacity utilization. Two of the articles are summarized here.
Capacity utilization and inflation
The decision of when to expand capacity, which includes adding new equipment, production lines or plants, is driven by a number of factors. In an economic downturn, such as the one the U.S. is currently facing, companies often cut back on capital spending in order to save cash and survive lower sales. However, other companies that have cash on hand often invest during downturns in order to have the capacity ready when the economy rebounds. This is especially true when significant lead times are involved in bringing the capacity online. Generally, better pricing terms for materials and labor also can be negotiated in slower economic times.
At the other end of the spectrum, many companies make capital investments during good economic times in order to take advantage of strong sales and good cash flow. It is often the case that a good economy will push utilization levels to their limits, requiring investment in order to meet demand. Consequently, traditional economic models often predict that high capacity utilization leads to periods of accelerated inflation. In strong economic times, this can worry the Federal Reserve Board into raising interest rates in order to slow investment, thus curbing inflation.
In “Further Evidence on the Dynamic Link between Capacity Utilization and Inflation,” associate professor Mark Thompson and professor Bradley T. Ewing, both of Texas Tech University, expand on previous research to look at the capacity-inflation relationship. First, they allow the relationship to be asymmetrical. Second, and more interestingly, they expand the analysis period to cover the 1990s through today, a period that has experienced relatively stable inflation and often is characterized by two major movements in manufacturing: lean manufacturing and information engineering. These movements also have been strongly associated with interest in supply chain management.
From their analysis, the authors concluded that the capacity-inflation relationship appears to be relegated to earlier times. More specifically, the relationship does not appear in the current “information engineering” era. The authors hypothesize that “reasons for this are likely due to greater efficiencies in the supply chain, especially due to information technology and the speed in which price information is incorporated into related products, materials, supplies and services along the supply chain.”
Interestingly, the analysis also showed that the steady-state capacity utilization rates have dropped in recent times. This suggests that companies can operate more efficiently than previously thought at lower utilization rates or that companies reserve capacity in order to be able to respond more quickly to customer demands.
CONTACT: Mark Thompson; firstname.lastname@example.org; (806) 742-1535; Texas Tech University, Health Organization Management, Lubbock, TX 79409
Subsidizing investments in security
Homeland security has received significant attention over the years. This has included investments in networks in order to improve security. This is a complicated problem since many entities must work together to achieve a comfortable level of security for the entire system. For example, a transportation or computer network requires that each node be secured in order for the system to be secure. As the nodes in the network may be owned or operated by a number of different players, the level of achievable security is dependent on the investment decisions of each player in the network.
There has been significant research examining security investment decisions in recent years. It has been shown that if investment costs are too high or the predicted returns on investment are too low, then the dominant strategy for a player is not to invest. Obviously, this can be detrimental to the common good, so government entities often provide subsidies or incentives to promote investment. Such action often can spur enough players to invest that investing becomes the norm across the network.
In “Impacts of Subsidized Security on Stability and Total Social Costs of Equilibrium Solutions in an N-Player Game with Errors,” Jun Zhuang, an assistant professor of industrial and systems engineering at the University at Buffalo, examines the situation where some of the players, or agents, on the network invest while others do not. Those that do not are said to make “erroneous” investment decisions, which often happens in practice: a model may suggest investing, but the recommendation is ignored by those who must implement it.
The author investigates this phenomenon and shows that subsidizing or providing free security can reduce the effects of erroneous choices, decreasing the social cost of investment. Thus, it would appear that the government must continue to invest in security to ensure that all relevant agents participate.
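The strategic interdependence described above can be made concrete with a toy "weakest link" payoff calculation. This is an assumed illustrative model, not the paper's actual game: the network is secure only if every player invests, each investor pays cost `c` less any subsidy `s`, and every player suffers loss `L` if the network is insecure.

```python
# Toy symmetric "weakest link" security game (illustrative assumption,
# not the paper's model): the network is secure only if ALL players
# invest; investing costs c - s (s = government subsidy); everyone
# suffers loss L whenever the network is insecure.
def best_response(others_all_invest, c, s, L):
    """Return True if investing is a best response for one player."""
    if others_all_invest:
        # Investing completes the secure network: pay c - s, avoid L.
        return (c - s) < L
    # Someone else is already unsecured, so investing cannot help.
    return False

c, L = 8.0, 10.0
print(best_response(True, c, s=0.0, L=L))    # → True: invest if others do
print(best_response(False, c, s=0.0, L=L))   # → False: free-ride otherwise
# A subsidy enlarges the region where "all invest" is an equilibrium:
print(best_response(True, c=12.0, s=5.0, L=L))  # → True even at higher cost
```

Even in this stripped-down sketch, "no one invests" can persist as an equilibrium alongside "everyone invests," which is why subsidies that make investing attractive for every relevant agent matter for the stability results the paper studies.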
CONTACT: Jun Zhuang; email@example.com; (716) 645-4707; University at Buffalo, Department of Industrial and Systems Engineering, 403 Bell Hall, Buffalo, NY 14260
Susan Albin is a professor at Rutgers University in the department of industrial and systems engineering. She is editor-in-chief of IIE Transactions and a fellow of IIE.
IIE Transactions is IIE's flagship research journal and is published monthly. To subscribe, call (800) 494-0460 or (770) 449-0460.
Joseph C. Hartman is editor of The Engineering Economist. He is a professor and the department chair of industrial and systems engineering at the University of Florida. He has been a member of IIE since 1995 and previously served on the IIE Board of Trustees as senior vice president for publications.
The Engineering Economist is a quarterly refereed journal devoted to issues of capital investment. Topics include economic decision analysis, capital investment analysis, research and development decisions, cost estimating and accounting, and public policy analysis. To subscribe, call (800) 494-0460 or (770) 449-0460.