
Emerging Nexus of Cyber, Modeling, and Estimation in Advanced Manufacturing

Vacuum Arc Remelting to 3D Printing


Joseph J. Beaman Jr.'s manufacturing research interest is in Additive Manufacturing, and he was the first academic researcher in the field. He was one of the founders of DTM Corporation (now merged with 3D Systems), which markets Selective Laser Sintering. He is a Fellow of ASME and is Editor of the Journal of Dynamic Systems, Measurement and Control. Dr. Beaman also serves on the Board of Directors of SME. He was elected to the United States National Academy of Engineering in 2013.

Felipe Lopez was born in Peru. He received a B.S. degree in Mechanical Engineering from Pontificia Universidad Catolica del Peru in 2009 and an M.S. degree from the University of Texas at Austin in 2011. He is currently pursuing a Ph.D. in Mechanical Engineering at the University of Texas at Austin with a concentration in manufacturing and design. His research interests include remelting processes, estimation, and control.

Mechanical Engineering 136(12), S8-S15 (Dec 01, 2014) (8 pages) Paper No: ME-14-DEC6; doi: 10.1115/1.2014-Dec-6

This article describes opportunities for exploiting the cyber, modeling, and estimation technical areas for advanced manufacturing in small lots. In particular, Cyber Enabled Manufacturing Systems (CeMs) for small lot manufacturing that incorporate a model of the process directly into the control algorithm are presented and discussed. The model enables the manufacturing monitoring and control algorithm to accommodate changing conditions without extensive additional experiments. One of the manufacturing processes currently being studied with this methodology is Vacuum Arc Remelting (VAR). Similar to Additive Manufacturing, VAR is a small lot, high-value manufacturing process. There is great opportunity for the control community to have a major impact on advanced manufacturing. This includes increasing the performance of mature manufacturing processes such as VAR and developing the critical control of emerging manufacturing processes like 3D printing. This opportunity is especially timely because of a nexus of multi-physics simulation software, modern estimation methods, and real-time computer architecture and hardware.

There have been tremendous advances in three important technical areas in the last decade: computing capability, physics-based modeling, and estimation methods. Although these advances are known in the research community, they have not been deployed to any great extent in the manufacturing industry. It has become increasingly clear that manufacturing is of fundamental importance to the vitality of the US economy. Small lot or small volume manufacturing, often of high value products, offers a unique opportunity to open up fundamentally new businesses for manufacturers. One of the major challenges in small lot manufacturing is the cost of qualifying and certifying that the product meets its design specifications. This is substantially the function of manufacturing process control. Contemporary process control is statistics based and is most effective for large volume manufacturing. Such process control is often not effective if the conditions or the product changes, such as occurs in small lots.

This article will describe opportunities for exploiting these three technical areas for advanced manufacturing in small lots. Two illustrative applications will be highlighted:

1. Application in a new emerging manufacturing process - 3D Printing or Additive Manufacturing.

2. Application in a mature manufacturing process - Vacuum Arc Remelting for super alloy production.

In particular, Cyber Enabled Manufacturing Systems (CeMs) for small lot manufacturing that incorporate a model of the process directly into the control algorithm are presented and discussed. The model enables the manufacturing monitoring and control algorithm to accommodate changing conditions without extensive additional experiments. Objectives of the CeMs system are rational setting of manufacturing tolerances, real-time prediction of manufacturing defects, real-time control of the process to eliminate defects, and real-time monitoring and control for small lot manufacturing. These goals are achieved through high-fidelity, physics-based models (including models of faults/defects), uncertainty quantification, reduced-order models that run in real time, measurement, real-time prediction, real-time computer architecture, real-time control with inverse solutions, and automation of the CeMs process for generic manufacturing processes. The development of such accurate control algorithms and their application to manufacturing processes can provide a competitive edge.

Manufacturing is of fundamental importance to the vitality of the United States economy and national security. Besides the obvious large quantity of products that are manufactured and the higher-than-average-paying jobs created today, the manufacturing sector is a key element in creating new innovations that become the new businesses of the future.

Manufacturing provides many of the jobs and drives many of the businesses of today. Yet its role in providing jobs and providing the businesses of tomorrow is even more important. The manufacturing sector accounts for about 72% of all private sector R&D spending and employs 60% of U.S. industry’s R&D workforce. As a result, the manufacturing sector develops and produces many of the technologies that advance the competitiveness and growth of the entire economy, including much of the service sector.[1]

Manufacturing also contributes greatly to economic growth. Manufacturing has a larger multiplier on growth than any other sector, as can be seen in the chart in Fig. 1, in which $1.00 of demand in manufacturing results in $1.41 of supporting demand in the economy.

Figure 1 Intermediate demand necessary to produce $1.00 of a sector demand [2]


Control systems have always played an important role in manufacturing, but this article is about a unique and timely opportunity for the control community in advanced manufacturing. Advanced manufacturing is not a well-defined term, but there are generally two categories: (1) new manufacturing processes and (2) the use of technology to improve existing manufacturing processes in time, cost, and yield. The opportunities for the control community in these two categories are somewhat different. New manufacturing processes are often associated with start-up ventures which may not have resources or time to use modern estimation and control technology. However, a new manufacturing process usually does not have established certification procedures, so technology can be more easily incorporated into the process. For existing processes there may be readily available resources for new technology, but the cost of changing established certification procedures of an existing process may be high. This article focuses on areas for both new processes and existing processes that have the potential to overcome these barriers to implementation.

The term 3D Printing is often used to describe a number of new manufacturing processes. One of these processes, Selective Laser Sintering (SLS), was co-developed at the University of Texas by the first author of this article in the mid-1980s. The name that we used to describe the 3D Printing technology is Solid Freeform Fabrication. We defined Solid Freeform Fabrication as the ability to fabricate complex solid objects from a computer model of an object without part-specific tooling or human intervention. Essentially, this is an art-to-part process in which a designer sits at a computer design station and hits hard copy or print to make the 3D part (see Fig. 2).

Figure 2 Solid Freeform Fabrication.


The problem that Solid Freeform Fabrication is trying to solve is how to make one of something quickly without part-specific tooling or human intervention. A conceptual solution to this problem is voxel manufacturing in which a voxel is a volume element of matter. The concept is to deliver two types of voxels: structural voxels and support voxels (see Fig. 3). As implied by their names, structural voxels have strength while support voxels only need to supply temporary support to the part and are eventually removed.

Figure 3 Voxel manufacturing with structural and support voxels


The SLS process is shown as a schematic in Fig. 4: laser beam energy is applied to the top surface of a powder bed in the pattern of a cross-section of a part to be made. The position of the beam can be very precisely controlled with a set of galvanometer-driven mirrors. Wherever the beam hits the powdered material, the powder melts (this is not sintering but melting) and subsequently solidifies into structural material. Scan speed and beam power are set in order to have a regular melt penetration depth. In this embodiment, the powder is delivered to the part cylinder from two piston-driven powder cartridges by a counter-rotating roller that delivers and levels the powder in layers. The object is built in the part-build cylinder vertically in repetitive layers that are bound together by the melt penetration. A controlled piston in the part-build cylinder determines the layer depth. Throughout the entire process the part is fully supported by the un-melted powder.

Figure 4 Schematic of SLS Process


The SLS process and other similar Solid Freeform Fabrication processes were made possible by three enabling technologies that matured in the 1980s.

  1. Solid modeling was commercialized. Solid models can represent three-dimensional parts easily in a computer and on a monitor. A solid model is required for necessary Solid Freeform Fabrication sub-processes such as slicing the solid into layers.

  2. Lasers became available at reasonable prices. Lasers deliver the energy required to solidify the object and can deliver this power in a precise way in order to make geometrically accurate parts.

  3. The personal computer was commercialized. The geometric computations and precision power control in Solid Freeform Fabrication require a level of computing power that became available in the personal computer.

As expected, the process control for these first systems was very crude and, somewhat surprisingly, the process control remains fairly crude today. SLS is a thermal manufacturing process, and for polymer SLS the process chamber is equipped with a part-build surface heater and powder cartridge surface heaters. A single-point infrared temperature sensor pointed at the part-build surface controls the part-build heater, and two single-point infrared sensors pointed at the powder cartridge surfaces control the two powder cartridge heaters. One of the primary control issues in polymer SLS is the temperature distribution across the part-build surface.

Polymers that build the best parts in SLS machines are typically semi-crystalline. The relatively sharp melting point of semi-crystalline polymers yields parts that have sharp edges and thus accurate parts. Differential Scanning Calorimetry (DSC) captures the important thermal properties of the polymer. A representative DSC plot is shown in Fig. 5. In this figure, Tm is the melt temperature and Tc is the recrystallization temperature. The operating temperature of the part-build surface has to be below the melt temperature, to avoid melting the entire part-build surface, and above the recrystallization temperature, to prevent part distortion. The state of the art of process control in commercial machines consists of calibrating multi-zone heaters to achieve a relatively uniform temperature field across the part-build surface and then using the single-point infrared temperature sensor to keep this surface point in the operating temperature window. Of course, this method does not measure the entire part-build surface, whose temperature can vary substantially from that of the controlled surface point. This variation can result from both residual laser heat and convective cooling, which differs from part to part since the essence of the process is to make different parts.

Figure 5 Typical DSC for a Semi-Crystalline Polymer (Courtesy Harvest Technologies)


The current process control in Additive Manufacturing machines is coarse because SLS technology initially grew out of a commercial need for rapid prototyping. This market has a different requirement for process and quality control from traditional manufacturing. As shown in Fig. 6, concept models require relatively low accuracy and low strength. Increasingly inexpensive machines, with little or no process control, are serving this concept model market. At the other end of the spectrum is true manufacturing, which requires relatively high accuracy and strength. This is the emerging market for Additive Manufacturing, which is presently being served by machines that were designed for the rapid prototyping market. Prototypes are primarily used for initial form, fit, and function but do not typically have to last a long time. Also, variation between prototype parts can be tolerated more than in production parts.

Figure 6 Markets in Solid Freeform Fabrication


True Additive Manufacturing has grown slowly over the last decade and has been embraced by the aerospace industry. There are many polymer parts on existing aerospace systems that have been manufactured by Additive Manufacturing. These are for the most part non-structural parts such as ducting. These parts are made without part-specific tooling and, due to the geometric complexity available in Additive Manufacturing, can be almost any shape. This leads to a great reduction in part count and cost. Maybe even more importantly, the cost to make one part is roughly the same as the cost to make any number of parts, whereas most manufacturing processes see a great reduction in price as volume goes up. This feature of Additive Manufacturing can enable a new business model for manufactured parts called the long tail [3]: a strategy that holds that selling a large number of unique items can be more profitable than selling a few popular items in large numbers.

Another aspect of Additive Manufacturing is that the process can run mostly without human intervention. This leads to a business model of regional manufacturing, in which it is economical to manufacture locally rather than in low-wage countries. For this and other reasons, there is a great deal of excitement about the potential of Additive Manufacturing in the business community. But none of this potential is possible if Additive Manufacturing does not have adequate process control. Although the economic production volume of Additive Manufacturing can be greater than 10,000 parts [4], it is for the most part a small-lot process. Contemporary process control is statistics based and is most effective for large volume manufacturing. Such process control is often not effective if the conditions or the product changes, such as occurs in small lots. Although not unique to Additive Manufacturing, small lot manufacturing process control is critical to its success in manufacturing. In the following section, we describe a methodology for achieving this control not only for Additive Manufacturing, but also for other small-lot manufacturing processes.

Statistics based quality control techniques, often used in the manufacturing industry, are not easily extendable to small-lot manufacturing. Extensive testing is required to determine the error bars that quantify admissible deviations from optimal operating conditions, and such conditions are not well defined in small lot manufacturing or when each product has unique specifications.

Although not extensively used in manufacturing practice, model-based techniques, which are used more regularly in the chemical processing industry, can provide distinct advantages for small lots. Due to the predictive nature of models, model-based methods can substantially reduce or possibly eliminate extensive testing. These techniques combine knowledge of the process with in-process measurements, adjusting to the characteristics of each product to ensure it is free of defects.

There are simplifications that are intrinsic to traditional model-based approaches. For example, it is assumed that measurements, communication, and data processing are performed instantaneously, which ignores the internal dynamics of these elements. The reality of a CeMs process control system is different, as computational models, sensors and actuators, and computing units become part of a network that interacts with the physical manufacturing process. The CeMs approach that we are advocating seeks to incorporate all computational components and the physical plant in a unified control system.

CeMs is a derivative of a generic cyber-physical system consisting of embedded and distributed sensors, actuators, and computational units that are networked to effectively gather and process information while being coupled with the control system for immediate response. The advantages of this approach are evident. For example, optimization of computation and communication enables the adoption of methods that were considered too slow for closed-loop operation. The study of CeMs is a multidisciplinary task that involves the study of manufacturing processes, computational models, characterization of defects, distributed sensors, communication, fast processing of measurements, and control algorithms. This novel area is designed to enable collaboration between process engineers, control engineers, and computer scientists.

Incorporation of high-fidelity physics-based models in a process controller would have been difficult a decade ago. Tremendous advances in modeling have occurred in recent years. Commercial modeling packages have resulted in a large reduction in the time required to develop these models. In the chemical process industry, these modeling tools are now used to study the effect of varying parameters on the quality of the obtained products, enabling open-loop optimization. In the case of small lots, the effect of un-modeled disturbances that affect the product quality cannot be neglected. It becomes necessary for computational models to adjust to varying operating conditions, which can be done by coupling them to the plant and updating them in real time. This approach results in increased information about the varying process dynamics in the manufacturing process, and it enables the observation of phenomena that cannot be measured physically. If a computational model is used for sensing it must return information quickly enough for it to be used by the controller, making the acceleration of these models a necessity.

The complexity of high-fidelity models can prevent their application in the design of process estimators and controllers, necessitating reduced-order models to decrease the state dimension. The challenge is to do so in a way that is faithful to the outputs over the input parameter space. In linear models, balanced truncation, singular perturbation approximations, and Hankel norm approximations are used to reduce the order of state-space realizations, but these methods are not easily extendable to nonlinear models [5]. In such scenarios, reduced-order models are typically constructed with snapshots generated from training points that sample the input parameter space [6]. The reduced model can then be rapidly evaluated within a control loop or while sampling probabilistic parameter spaces. One challenge has been to ensure that the reduced-order models remain faithful to the outputs of interest over a wide range of inputs.
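To illustrate the snapshot idea, the following sketch (Python with NumPy; the function name, energy criterion, and toy data are our own illustrative choices, not drawn from the authors' implementation) extracts a reduced basis from snapshot data via the singular value decomposition, a common proper orthogonal decomposition construction:

```python
import numpy as np

def pod_basis(snapshots, energy=0.999):
    """Compute a reduced basis from snapshot data via SVD.

    snapshots: (n_states, n_snapshots) array whose columns are full-order
    solutions sampled over the input parameter space. Returns the leading
    left singular vectors capturing the requested fraction of snapshot
    energy; the reduced state is their projection Phi.T @ x.
    """
    U, s, _ = np.linalg.svd(snapshots, full_matrices=False)
    cum = np.cumsum(s**2) / np.sum(s**2)
    r = int(np.searchsorted(cum, energy)) + 1
    return U[:, :r]

# Toy demo: snapshots that live (up to tiny noise) in a 2-D subspace
rng = np.random.default_rng(0)
modes = rng.standard_normal((100, 2))        # two "true" spatial modes
coeffs = rng.standard_normal((2, 50))        # their amplitudes per snapshot
X = modes @ coeffs + 1e-6 * rng.standard_normal((100, 50))
Phi = pod_basis(X)                           # recovers a 2-column basis
```

The truncation rank is chosen from the singular value spectrum rather than fixed in advance, which is one simple way to trade model size against the fidelity requirement discussed above.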

Anomalies in manufacturing are difficult to detect and correct in a timely manner. They are usually described by parameters that are not measured directly, or have to be inferred from noisy observations, making accurate state and parameter estimation necessary.

Materials processing used in manufacturing involves processes governed by transport equations: fluid mechanics, heat transfer, and mass transfer [7]. These equations are well known but are controlled by state-dependent coefficients that make them highly nonlinear. Nonlinear estimation is still considered challenging, as optimal solutions available for linear Gaussian scenarios are not extendable when nonlinearities appear. Nonlinear estimators available in the literature can be classified in three groups: (1) extensions to the classic Kalman filter, such as the extended and unscented Kalman filters; (2) moving horizon estimators, common in the model predictive control literature; and (3) Sequential Monte Carlo methods.
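A minimal example of the first class, sketched for a scalar system (the models, Jacobians, and noise levels below are illustrative assumptions of ours, not from the article): the extended Kalman filter linearizes the nonlinear state and observation maps at the current estimate, then applies the standard Kalman predict/update equations.

```python
import numpy as np

def ekf_step(x, P, y, f, F_jac, h, H_jac, Q, R):
    """One predict/update cycle of a scalar extended Kalman filter.

    f, h are the (possibly nonlinear) state and observation maps;
    F_jac, H_jac are their derivatives, evaluated at the current estimate.
    """
    # Predict: propagate the mean and the linearized covariance
    x_pred = f(x)
    P_pred = F_jac(x) * P * F_jac(x) + Q
    # Update: standard Kalman gain with the linearized observation
    H = H_jac(x_pred)
    S = H * P_pred * H + R
    K = P_pred * H / S
    x_new = x_pred + K * (y - h(x_pred))
    P_new = (1.0 - K * H) * P_pred
    return x_new, P_new

# Toy demo: constant state observed through a mildly nonlinear sensor
x, P = 0.0, 1.0
for y in [0.9, 1.1, 1.0, 0.95]:
    x, P = ekf_step(x, P, y,
                    f=lambda x: x, F_jac=lambda x: 1.0,
                    h=lambda x: x + 0.1 * x**2,
                    H_jac=lambda x: 1.0 + 0.2 * x,
                    Q=0.01, R=0.04)
# x converges toward the state explaining the measurements; P shrinks
```

The same structure carries over to vector states, with the scalars replaced by Jacobian matrices; its weakness, as noted above, is that the linearization can fail when the nonlinearities are strong.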

Sequential Monte Carlo methods are commonly presented as the most promising alternative because of their flexibility [8]. In these methods, a continuous probability density function is approximated by a discrete probability measure, where the support is defined by the finite number of particles and the shape of the distribution is defined by the weights on the particles. The particles and their weights are updated when a new set of measurements becomes available. Two of the most common Sequential Monte Carlo estimators, the particle filter and the auxiliary particle filter (see Fig. 7), are often used in a wide variety of applications, such as computer vision, finance, robotics, etc. Until recently the biggest impediment for the adoption of Monte Carlo techniques in online estimation applications has been their computational cost.

Figure 7 Particle filter, the most common Sequential Monte Carlo estimator.

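The predict-reweight-resample cycle of the bootstrap particle filter can be sketched in a few lines. The example below (Python/NumPy; the scalar drifting-state model and noise levels are illustrative choices of ours) approximates the posterior with weighted samples exactly as described above:

```python
import numpy as np

def particle_filter_step(particles, weights, y, f, h, q_std, r_std, rng):
    """One predict/update/resample cycle of a bootstrap particle filter.

    particles: (N,) state samples; weights: (N,) normalized weights.
    f, h: state-transition and observation maps (may be nonlinear).
    """
    # Propagate every particle through the stochastic dynamics
    particles = f(particles) + q_std * rng.standard_normal(particles.shape)
    # Reweight by the Gaussian likelihood of the new measurement y
    weights = weights * np.exp(-0.5 * ((y - h(particles)) / r_std) ** 2)
    weights /= weights.sum()
    # Multinomial resampling to combat weight degeneracy
    idx = rng.choice(len(particles), size=len(particles), p=weights)
    return particles[idx], np.full(len(particles), 1.0 / len(particles))

# Toy demo: track a slowly drifting state from noisy observations
rng = np.random.default_rng(1)
N = 2000
particles = rng.standard_normal(N)
weights = np.full(N, 1.0 / N)
true_x = 0.0
for _ in range(30):
    true_x += 0.1
    y = true_x + 0.2 * rng.standard_normal()
    particles, weights = particle_filter_step(
        particles, weights, y, lambda x: x + 0.1, lambda x: x, 0.05, 0.2, rng)
estimate = np.sum(weights * particles)   # posterior mean estimate
```

Note that nothing here assumes linearity or Gaussian posteriors; the price, discussed next, is the computational cost of carrying many particles.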

A computational architecture that can meet the real-time constraints of the manufacturing process and Monte Carlo methods is essential to the success of the CeMs approach. One way to accelerate computations is to implement them in a computational unit with an optimal architecture. For example, Sequential Monte Carlo methods are parallel by nature, and their performance is improved by orders of magnitude when moved from a CPU to a more appropriate unit, such as graphics processing units (GPUs) [9]. Similar approaches can be followed to accelerate the physics-based model, if necessary [10].
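The parallelism is easy to see in code. In the sketch below (Python/NumPy; a GPU implementation such as the one in [9] would map the same expression onto thousands of threads, and GPU array libraries such as CuPy accept NumPy-style code nearly unchanged), the likelihood of every particle is evaluated in one data-parallel expression, with the normalization done in log space for numerical stability:

```python
import numpy as np

def log_weights_vectorized(particles, y, h, r_std):
    """Evaluate the log-likelihood of all particles at once.

    Each particle's weight is independent of every other particle's,
    which is what makes Sequential Monte Carlo embarrassingly parallel:
    this single array expression is the GPU kernel in disguise.
    """
    resid = y - h(particles)
    return -0.5 * (resid / r_std) ** 2

rng = np.random.default_rng(2)
particles = rng.standard_normal(1_000_000)   # a million particles at once
lw = log_weights_vectorized(particles, 0.5, lambda x: x, 0.1)
# Subtract the max before exponentiating so no weight underflows to zero
w = np.exp(lw - lw.max())
w /= w.sum()
```

The resampling step also parallelizes (e.g. via parallel prefix sums over the cumulative weights), which is the harder part of the GPU implementations cited above.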

One of the manufacturing processes currently being studied with this methodology is Vacuum Arc Remelting (VAR). Similar to Additive Manufacturing, VAR is a small lot, high value manufacturing process. VAR is used in the superalloy industry for the production of expensive rotor-grade materials. Production is performed in a discrete manner, and each production run takes an entire day or longer, which drives manufacturers to search for alternative approaches that will ensure a higher yield in order to reduce manufacturing costs.

In contrast to Additive Manufacturing, VAR is a mature manufacturing process, which has been around since the 1950s, and it has a stable economic base of machines. It is used to homogenize and remove volatile components when producing segregation-sensitive superalloys. In industry, it is common to use VAR to improve the quality of ingots obtained with vacuum induction melting (VIM), electroslag remelting (ESR), compacting of metal sponge, or other primary techniques. In this process, a metal electrode is suspended in a water-cooled copper crucible held under vacuum. Direct current is used to melt the electrode; the molten metal falls to the bottom of the crucible and solidifies into an ingot. The CeMs strategy is currently under development for VAR.

Despite the method's many advantages, VAR ingots can be prone to segregation defects caused by un-dissolved material fall-in (white spots), convection instabilities (freckles), and perturbations in the melt pool (tree rings) [11]. Prevention of such defects can be attained by maintaining desired melting and solidification conditions. It is known that the tendency for freckles decreases when melting with shallower liquid pools, as the tendency for buoyancy-driven flows decreases. On the other hand, it is important that the liquid pool is deep enough to dissolve any fall-in material that might become a solidification precursor. Prevention of freckles and white spots can thus be thought of as a balancing act that requires a liquid pool deep enough to prevent white spots while shallow enough to prevent freckles [12].

A major impediment for accurate prevention of such defects is that no measurements are available from the solidification front. However, one can use a finite volume model in parallel with the furnace, updating it to account for time-varying control inputs and process parameters, to gather predictions of the otherwise unobservable solidification front. As shown in Fig. 8 this methodology was implemented in a laboratory-scale experiment showing that the predicted liquid pool depths compared favorably to those measured manually [13].

Figure 8 Favorable comparison between predicted temperature field and marked solidification front [14].


A computational model, which was used for process optimization and post-mortem analysis in the 1990s, is now able to return real-time predictions of solidification conditions. The incorporation of a computational model in a network of sensors enhanced the capabilities of the controller, but it was still based on a linear model accurate only when operating near nominal conditions. Highly-transient behavior is observed in VAR in the beginning of the process (start-up) and in the end (hot-top). Control in these conditions is difficult, and it is common practice to discard these parts of the ingot since segregation defects are common. Superalloy producers are interested in enhancing their control capabilities to prevent defects in these highly-transient regions too, and that requires the application of nonlinear estimation and control techniques.

VAR is a relatively slow process, and therefore a good candidate for testing these time-critical techniques, as it is not as restrictive as faster processes such as Additive Manufacturing. A typical VAR experiment takes an entire day and has a sampling time on the order of seconds (2 seconds for small ingots and 5 seconds for production ones). Research was dedicated to the selection of an appropriate controller that was not only accurate but could also be accelerated with a proper choice of computer architecture.

The stochastic model that describes the VAR process is defined by what in Bayesian estimation is known as a diffuse-prior peaked-likelihood problem, which means that the evolution equation for the state is much more uncertain than the observation equation. This is due to the disturbances in the system, such as the unknown current distribution in the electric arc, inhomogeneity of the metal electrode, varying helium pressure in the cooling system, and end effects at the beginning and end of the melt. Some measurements, on the other hand, are quite precise, such as those for tracking the position and the mass of the moving electrode; but there are also some others that are extremely noisy, such as electrode gap.

Application of Sequential Monte Carlo techniques can be inefficient in peaked-likelihood scenarios, because a large number of the particles proposed based on prior knowledge will end up in regions of the state space with a small likelihood based on the observed measurement and will disappear when resampling. An alternative approach that takes the current measurement into account when exploring the state space was found to be more efficient. The auxiliary particle filter resulted in a decrease in the number of particles required for acceptable accuracy by 87.5%.
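The two-stage structure behind that efficiency gain can be sketched as follows (Python/NumPy, scalar state; the models and noise levels are illustrative choices of ours, not the VAR furnace model): the first stage scores each particle by how well its deterministic prediction explains the current measurement, and only the favored particles are then propagated and re-weighted.

```python
import numpy as np

def apf_step(particles, weights, y, f, h, q_std, r_std, rng):
    """One step of an auxiliary particle filter (scalar-state sketch)."""
    def loglik(x):
        return -0.5 * ((y - h(x)) / r_std) ** 2

    # First stage: look ahead with the deterministic prediction mu_i = f(x_i)
    # and favor particles whose prediction explains the new measurement y.
    mu = f(particles)
    first = np.log(weights) + loglik(mu)
    first = np.exp(first - first.max())
    first /= first.sum()
    idx = rng.choice(len(particles), size=len(particles), p=first)

    # Second stage: propagate only the selected particles with process
    # noise, then correct the weights for the first-stage approximation.
    new = f(particles[idx]) + q_std * rng.standard_normal(len(particles))
    lw = loglik(new) - loglik(mu[idx])
    w = np.exp(lw - lw.max())
    return new, w / w.sum()

# Toy demo: same drifting-state model a bootstrap filter would use,
# but with far fewer particles
rng = np.random.default_rng(3)
N = 500
particles = rng.standard_normal(N)
weights = np.full(N, 1.0 / N)
true_x = 0.0
for _ in range(30):
    true_x += 0.1
    y = true_x + 0.2 * rng.standard_normal()
    particles, weights = apf_step(
        particles, weights, y, lambda x: x + 0.1, lambda x: x, 0.05, 0.2, rng)
estimate = np.sum(weights * particles)
```

Because resampling is guided by the measurement, few particles are proposed into low-likelihood regions, which is exactly the waste the paragraph above describes in the diffuse-prior, peaked-likelihood setting.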

Although the auxiliary particle filter resulted in a huge reduction in computational cost, the required number of particles was still high due to the high dimensionality of the system, and it was necessary to accelerate the estimator to meet the constraints of the control system. As shown in Fig. 9, NVIDIA's GeForce GTX TITAN, powered by GK110 and based on CUDA (Compute Unified Device Architecture), was used for a faster implementation [15]. The implementation of the algorithm on the GPU resulted in estimates obtained in milliseconds even when millions of particles were used, proving fast enough not only for this remelting application but also for other manufacturing processes with faster dynamics.

Figure 9 GeForce GTX TITAN, used in the acceleration of parallel estimation and control algorithms.


A similar approach can be taken to accelerate a constrained optimization problem for process control. A finite-horizon constrained optimization problem, such as model predictive control, can be expressed in the form of a sequential estimation problem [16]. Once in the form of an estimation problem, it too can be accelerated using parallel architectures. An example of nonlinear model predictive control where estimation and control are performed using Sequential Monte Carlo methods is shown in Fig. 10.

Figure 10 Auxiliary particle filter-model predictive control for the VAR process showing accurate control of the process even when the reference liquid pool depth is changed.

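A minimal way to see the estimation view of control is with the same weighted-sample machinery: sample candidate control sequences, score each rollout by exp(-cost) as if the cost were a negative log-likelihood, and take the weighted mean of the first move. The sketch below (Python/NumPy, with a toy integrator model of our own choosing, not the VAR controller of Fig. 10) illustrates the idea:

```python
import numpy as np

def smc_mpc(x0, ref, horizon, n_samples, f, u_std, rng):
    """Sampling-based finite-horizon control, a minimal stand-in for
    the estimation-form model predictive control described above.

    Candidate control sequences are sampled, rolled out through the
    model f, and weighted by exp(-cost), exactly as particles are
    weighted by a likelihood; the first move of the weighted mean
    sequence is applied (receding horizon).
    """
    U = u_std * rng.standard_normal((n_samples, horizon))
    cost = np.zeros(n_samples)
    x = np.full(n_samples, x0, dtype=float)
    for t in range(horizon):
        x = f(x, U[:, t])            # roll every candidate forward
        cost += (x - ref) ** 2       # quadratic tracking cost
    w = np.exp(-(cost - cost.min())) # cost plays the role of -log p(y|x)
    w /= w.sum()
    return float(np.sum(w * U[:, 0]))

# Toy demo: integrator dynamics x+ = x + u, drive x from 0 toward 1;
# the returned first control is pulled toward the reference
rng = np.random.default_rng(4)
u = smc_mpc(0.0, 1.0, horizon=5, n_samples=5000,
            f=lambda x, u: x + u, u_std=0.5, rng=rng)
```

Like the estimator, every rollout is independent of the others, so the same GPU acceleration applies to the controller.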

To date the research in CeMs for VAR has focused on development of models (both high-fidelity for defect prediction and reduced-order for estimation and control), estimation and control algorithms that can be accelerated, and computer architectures. The next steps in the project involve the design of a test bed where a proper study of the network will be included to improve the robustness of the system. These efforts are expected to ensure the repeatability of quality standards necessary to complement current quality control techniques used in the superalloy industry.

CeMs for SLS is currently under development. The SLS process, like the VAR process, is primarily a thermal process, but it involves at least two distinct thermal regions. First, there is a microscopic region of laser-material interaction, which includes rapid melting and solidification. If this microscopic thermal region is poorly controlled, the bonding between layers can be degraded, leading to part failure. This region is much faster than the dynamics of the VAR process and requires even faster methods than those used in VAR. Second, there is a macroscopic region of the entire build chamber, including the part bed. It is well known that the entire time-temperature history of a part can affect its properties. This region changes very slowly in time and is on the order of the VAR process (an SLS run might take 24 hours). For this reason, two separate models that feed into each other are being developed.

One distinct advantage the SLS process has over the VAR process is the ability to change the process. This is possible because it is an emerging process, and the Additive Manufacturing community is just starting to understand how to control it. We are developing our control on a laboratory SLS machine that we have custom built. For example, this machine has an IR camera that can image the entire part-build surface and additional temperature sensors that profile the entire build chamber. This would be difficult to do in an established commercial machine and process. The system thus has thermal measurements available for every single layer. It also has the capability to measure the microscopic melting region with a sensor that looks down the laser mirrors.

Even with all of these sensors, the CeMs method for SLS process control will be essentially the same as for VAR: it will combine multi-physics computational models, modern (nonlinear) estimation methods and measurements, and high-performance computational units.
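The modern nonlinear estimation methods referred to here include sequential Monte Carlo (particle) filters, which appear in the reference list. The sketch below is a standard bootstrap particle filter applied to a toy scalar model; the "pool depth" dynamics, noise levels, and all parameters are stand-in assumptions, not the actual VAR or SLS models, and serve only to show the predict/weight/resample cycle that such an estimator runs in real time.

```python
import numpy as np

# Minimal bootstrap particle filter on a toy nonlinear scalar model.
# The process model, noise levels, and parameters are illustrative assumptions.

rng = np.random.default_rng(0)
N = 1000                              # number of particles

def f(x, u):
    """Hypothetical nonlinear process model: 'pool depth' x driven by input u."""
    return x + 0.1 * (u - 0.5 * x**2)

def simulate(steps=50):
    x_true, u = 1.0, 1.2
    particles = rng.normal(1.0, 0.5, N)
    weights = np.full(N, 1.0 / N)
    x_hat = 1.0
    for _ in range(steps):
        # True system and a noisy measurement (measurement model: y = x + noise).
        x_true = f(x_true, u) + rng.normal(0, 0.02)
        y = x_true + rng.normal(0, 0.1)
        # Predict: propagate every particle through the process model.
        particles = f(particles, u) + rng.normal(0, 0.02, N)
        # Update: reweight by the Gaussian measurement likelihood.
        weights *= np.exp(-0.5 * ((y - particles) / 0.1) ** 2)
        weights /= weights.sum()
        # Resample when the effective sample size degenerates.
        if 1.0 / np.sum(weights**2) < N / 2:
            idx = rng.choice(N, size=N, p=weights)
            particles, weights = particles[idx], np.full(N, 1.0 / N)
        x_hat = np.sum(weights * particles)
    return x_true, x_hat

x_true, x_hat = simulate()
```

Because each particle is propagated independently, this predict/update loop parallelizes naturally onto GPU architectures, which is what makes such estimators feasible at real-time rates.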

There is a great opportunity for the control community to have a major impact on advanced manufacturing. This includes improving the performance of mature manufacturing processes such as VAR and developing the critical control of emerging manufacturing processes like 3D printing. This opportunity is especially timely because of a nexus of multi-physics simulation software, modern estimation methods, and real-time computer architecture and hardware.

The authors are pleased to acknowledge support for the research described here from ONR, NSF, and AFRL.

Executive Office of the President, National Science and Technology Council, A National Strategic Plan for Advanced Manufacturing, Feb. 2012, www.white-house.gov/sites/default/files/microsites/ostp/iam_advancedmanufacturing_strategicplan_2012.pdf
J. Popkin and K. Kobe, Manufacturing Resurgence: A Must for U.S. Prosperity, National Association of Manufacturers and Council of Manufacturing Associations, 2010.
C. Anderson, "The Long Tail," Wired, October 2004.
N. Hopkinson and P. Dickens, "Analysis of Rapid Manufacturing – Using Layered Manufacturing Processes for Production," Proceedings of the Institution of Mechanical Engineers, Part C: Journal of Mechanical Engineering Science, Vol. 217, No. 1, pp. 31-40, 2003.
S. Skogestad and I. Postlethwaite, Multivariable Feedback Control, John Wiley & Sons, 2nd Edition, 2005.
A.T. Patera and G. Rozza, Reduced Basis Approximation and A Posteriori Error Estimation for Parametrized Partial Differential Equations, MIT, 2012.
R. Bird, W. Stewart, and E. Lightfoot, Transport Phenomena, John Wiley & Sons, 2nd Edition, 2006.
A. Doucet, N. de Freitas, and N. Gordon, Sequential Monte Carlo Methods in Practice, Springer, 2001.
M. Chitchian, A. Simonetto, A.S. van Amesfoort, and T. Keviczky, "Distributed computation particle filter on GPU architecture for real-time control applications," IEEE Transactions on Control Systems Technology, Vol. 21, No. 6, pp. 2224-2238, 2013.
C. Cecka, A.J. Lew, and E. Darve, "Assembly of finite element methods on graphics processors," International Journal for Numerical Methods in Engineering, Vol. 85, pp. 640-669, 2011.
K.O. Yu and J.A. Domingue, "Control of solidification structure in VAR and ESR processed alloy 718 ingots," Superalloy 718 – Metallurgy and Applications, TMS, 1989.
T. Watt, E. Taleff, F. Lopez, and J. Beaman, "Solidification mapping of a Nickel alloy 718 laboratory VAR ingot," International Symposium on Liquid Metal Processing & Casting, pp. 261-270, TMS, 2013.
J. Beaman, F. Lopez, and R. Williamson, "Modeling of the vacuum arc remelting process for estimation and control of the liquid pool profile," ASME Journal of Dynamic Systems, Measurement, and Control, Vol. 136, No. 3, pp. 031007-1, 2014.
L.A. Bertram, P.R. Schunk, S.N. Kempka, F. Spadafora, and R. Minisandram, "The macroscale simulation of remelting processes," JOM, pp. 18-21, 1998.
D. Stahl and J. Hauth, "PF-MPC: Particle filter-model predictive control," Systems & Control Letters, Vol. 60, No. 8, pp. 632-643, 2011.
Copyright © 2014 by ASME
