234 results
Search Results
2. Simulation optimization: A comprehensive review on theory and applications.
- Author
-
TEKIN, EYLEM and SABUNCUOGLU, IHSAN
- Subjects
SIMULATION methods & models ,STRUCTURAL optimization ,COMPUTER engineering ,DECISION support systems ,COMPUTER simulation - Abstract
For several decades, simulation has been used as a descriptive tool by the operations research community in the modeling and analysis of a wide variety of complex real systems. With recent developments in simulation optimization and advances in computing technology, it has now become feasible to use simulation as a prescriptive tool in decision support systems. In this paper, we present a comprehensive survey of techniques for simulation optimization, with emphasis on recent developments. We classify the existing techniques according to problem characteristics such as shape of the response surface (global as compared to local optimization), objective functions (single or multiple objectives) and parameter spaces (discrete or continuous parameters). We discuss the major advantages and possible drawbacks of the different techniques. A comprehensive bibliography and future research directions are also provided in the paper. [ABSTRACT FROM AUTHOR]
- Published
- 2004
- Full Text
- View/download PDF
3. Applying simulated annealing to the open shop scheduling problem.
- Author
-
Ching-Fang Liaw
- Subjects
PRODUCTION scheduling ,NEIGHBORHOODS ,SIMULATION methods & models ,ALGORITHMS ,BENCHMARKING (Management) ,MANAGEMENT - Abstract
This paper addresses the problem of scheduling a nonpreemptive open shop with the objective of minimizing makespan. A neighborhood search algorithm based on the simulated annealing technique is proposed. The algorithm is tested on randomly generated problems, benchmark problems in the literature, and new hard problems generated in this paper. Computational results show that the algorithm performs well on all of the test problems. In many cases, an optimum solution is found, and in others the distance from the optimum or lower bound is quite small. Moreover, some of the benchmark problems in the literature are solved to optimality for the first time. [ABSTRACT FROM AUTHOR]
- Published
- 1999
- Full Text
- View/download PDF
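Record 3's neighborhood search based on simulated annealing can be illustrated with a minimal generic sketch. This is not Liaw's open-shop algorithm: the toy objective (inversion count of a permutation, optimal at 0 for the sorted sequence) and the swap neighborhood are stand-ins for a real makespan objective and a scheduling neighborhood.

```python
import math
import random

def simulated_annealing(cost, neighbor, x0, t0=10.0, cooling=0.995, iters=5000, seed=1):
    """Generic SA skeleton: always accept improving moves, accept worsening
    moves with probability exp(-delta / T), and cool T geometrically."""
    rng = random.Random(seed)
    x, fx = x0, cost(x0)
    best, fbest = x, fx
    t = t0
    for _ in range(iters):
        y = neighbor(x, rng)
        fy = cost(y)
        delta = fy - fx
        if delta <= 0 or rng.random() < math.exp(-delta / t):
            x, fx = y, fy
            if fx < fbest:
                best, fbest = x, fx
        t *= cooling
    return best, fbest

# Toy stand-in for a scheduling objective: number of inverted pairs in a
# permutation (0 when sorted); the neighborhood swaps two random positions.
def inversions(p):
    return sum(1 for i in range(len(p)) for j in range(i + 1, len(p)) if p[i] > p[j])

def swap_neighbor(p, rng):
    q = list(p)
    i, j = rng.sample(range(len(q)), 2)
    q[i], q[j] = q[j], q[i]
    return q

start = list(range(10))
random.Random(0).shuffle(start)
best, fbest = simulated_annealing(inversions, swap_neighbor, start)
```

The same skeleton applies to open shop scheduling by substituting a makespan evaluator for `inversions` and a schedule-perturbation move for `swap_neighbor`.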
4. An Approach for Modeling Small-lot Assembly Networks.
- Author
-
Saboo, S. and Wilhelm, W. E.
- Subjects
MATHEMATICAL models ,SIMULATION methods & models ,SYSTEMS engineering ,MATHEMATICAL optimization ,MATHEMATICAL analysis ,CASE studies - Abstract
A model for estimating the transient performance of assembly networks is presented in this paper. Based on the fundamental assumption that operation start and finish times are related by the multivariate normal distribution, the approach relies upon computational procedures for estimating the correlations between certain operation finishing times. Fundamental properties of these correlations are identified and used to develop a procedure for estimating transient performance. Evaluated in a set of hypothetical test cases, the approach gave estimates which compare favorably with those derived from a simulation model both in accuracy and runtime. The approach is demonstrated as a decision aid in a case study involving material flow planning. Test results indicate that the approach offers unique capability to model transient operations in assembly networks. [ABSTRACT FROM AUTHOR]
- Published
- 1986
- Full Text
- View/download PDF
5. Sizing Storage Facilities For Open Pit Coal Mines.
- Author
-
Bradley, Charles E., Taylor, Sam G., and Gray, Wayne I.
- Subjects
COAL mining ,ENERGY industries ,SIMULATION methods & models ,OPERATIONS research ,STORAGE facilities - Abstract
This paper develops a general procedure to size storage facilities for open pit coal mines. More particularly, the paper presents (1) deterministic models for estimating working stock requirements, and (2) simulation models in the SLAM language which can be used to analyze safety stock requirements. The design procedure is illustrated using data similar to that experienced by a mine in Wyoming's Powder River Basin. [ABSTRACT FROM AUTHOR]
- Published
- 1985
- Full Text
- View/download PDF
6. Solving Multiple Response Simulation Models Using Modified Response Surface Methodology Within A Lexicographic Goal Programming Framework.
- Author
-
Rees, Loren P., Clayton, Edward R., and Taylor, Bernard W.
- Subjects
SIMULATION methods & models ,EXPERIMENTAL design ,MATHEMATICAL programming ,OPERATIONS research ,METHODOLOGY - Abstract
This paper describes a new procedure for obtaining satisfactory solutions to multiple-response, multiple-input simulation models. A modified version of response surface methodology is incorporated to obtain input values which meet user-specified goals for the responses. The approach is illustrated with three examples. The desirability of incorporating this approach into an interactive computer mode is also discussed. [ABSTRACT FROM AUTHOR]
- Published
- 1985
- Full Text
- View/download PDF
7. Advances in simulation optimization and its applications.
- Author
-
Lee, Loo Hay, Chew, Ek Peng, Frazier, Peter I., Jia, Qing-Shan, and Chen, Chun-Hung
- Subjects
SIMULATION methods & models ,MATHEMATICAL optimization ,MATHEMATICAL variables ,PARALLEL algorithms ,MATHEMATICAL bounds ,PROBLEM solving - Published
- 2013
- Full Text
- View/download PDF
8. Optimal budget allocation for discrete-event simulation experiments.
- Author
-
Chen, Chun-Hung, Yücesan, Enver, Dai, Liyi, and Chen, Hsiao-Chang
- Subjects
SIMULATION methods & models ,OPERATIONS research ,NUMERICAL analysis ,HEURISTIC algorithms ,MATHEMATICAL analysis - Abstract
Simulation plays a vital role in analyzing discrete-event systems, particularly in comparing alternative system designs with a view to optimizing system performance. Using simulation to analyze complex systems, however, can be both prohibitively expensive and time-consuming. Effective algorithms to allocate intelligently a computing budget for discrete-event simulation experiments are presented in this paper. These algorithms dynamically determine the simulation lengths for all simulation experiments and thus significantly improve simulation efficiency under the constraint of a given computing budget. Numerical illustrations are provided and the algorithms are compared with traditional two-stage ranking-and-selection procedures through numerical experiments. Although the proposed approach is based on heuristics, the numerical results indicate that it is much more efficient than the compared procedures. [ABSTRACT FROM AUTHOR]
- Published
- 2010
- Full Text
- View/download PDF
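Record 8's budget-allocation idea belongs to the Optimal Computing Budget Allocation (OCBA) family associated with Chen et al. The sketch below implements the commonly quoted asymptotic allocation fractions, not the dynamic, sequential algorithm the abstract describes; it assumes distinct sample means and positive sample standard deviations.

```python
import math

def ocba_fractions(means, stds):
    """Asymptotic OCBA budget fractions for selecting the design with the
    smallest mean.  b = apparent best; each non-best design i gets a share
    proportional to (std_i / gap_i)^2, and the best design gets
    std_b * sqrt(sum over i != b of (share_i / std_i)^2)."""
    k = len(means)
    b = min(range(k), key=lambda i: means[i])
    ref = next(i for i in range(k) if i != b)       # reference non-best design
    delta_ref = means[ref] - means[b]
    ratios = [0.0] * k
    for i in range(k):
        if i == b:
            continue
        delta_i = means[i] - means[b]
        ratios[i] = ((stds[i] / delta_i) / (stds[ref] / delta_ref)) ** 2
    ratios[b] = stds[b] * math.sqrt(
        sum((ratios[i] / stds[i]) ** 2 for i in range(k) if i != b))
    total = sum(ratios)
    return [r / total for r in ratios]

# Four hypothetical designs with equal noise: the apparent best and its
# closest rival should receive most of the simulation budget.
fractions = ocba_fractions([1.0, 1.5, 2.0, 2.5], [1.0, 1.0, 1.0, 1.0])
```

In a sequential procedure, these fractions would be recomputed from updated sample statistics after each increment of the budget is spent.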
9. Profile monitoring for a binary response.
- Author
-
Yeh, Arthur B., Huwang, Longcheen, and Li, Yu-Mei
- Subjects
MATHEMATICAL variables ,LOGISTIC regression analysis ,QUALITY control charts ,SIMULATION methods & models ,ANALYSIS of covariance ,ANALYSIS of means - Abstract
Pertaining to industrial applications in which the response variable of interest is binary, this paper studies how the profile functional relationship between the response and predictor variables can be monitored using logistic regression. Several Hotelling T2 charts, originally studied for continuous response variables, are extended to the binary-response case for the purpose of Phase I profile monitoring. The performance of these T2 charts in terms of the signal probability for different out-of-control scenarios is compared based on simulation studies. A real example originating from aircraft construction is given in which these T2 charts are applied and compared using the data. A discussion of potential future research is also given. [ABSTRACT FROM AUTHOR]
- Published
- 2009
- Full Text
- View/download PDF
10. Benefits of cross-training in a skill-based routing contact center with priority queues and impatient customers.
- Author
-
Ahghari, Mahvareh and Balcıoğlu, Barış
- Subjects
TRAINING ,CALL centers ,SERVICE centers ,AUTOMATIC call distribution ,TELEPHONE calls ,EMAIL ,QUEUEING networks ,ROUTING (Computer network management) ,SIMULATION methods & models - Abstract
Customer contact centers that provide different types of services to customers who place phone calls or send e-mail messages are studied. Customers calling are impatient; hence phone requests have a higher priority than e-mail messages. E-mails that are not responded to within a specified time limit can be prioritized. The goal of this paper is to assess the performance improvement achieved by cross-training the agents. The performance of contact centers operated under different strategies is compared. An extensive simulation study is presented that shows that strategies permitting pre-emptive-resume policies provide the best performance for phone calls. The results also demonstrate that limited cross-training with two skills per agent results in considerable performance improvements. However, the unbalanced traffic intensities due to different mean service times for each class necessitate more cross-training, at three skills per agent, to achieve considerable improvement. [Supplemental materials are available for this article. Go to the publisher's online edition of IIE Transactions for the following free supplemental resource: Appendix of additional simulation results] [ABSTRACT FROM AUTHOR]
- Published
- 2009
- Full Text
- View/download PDF
11. AWIP: A simulation-based feedback control algorithm for scalable design of self-regulating production control systems.
- Author
-
Masin, Michael and Prabhu, Vittal
- Subjects
SIMULATION methods & models ,FEEDBACK control systems ,ALGORITHMS ,COMPUTERS in production control ,WORK in process ,CONTROLLERSHIP ,FLEXIBLE manufacturing systems ,PRODUCTION planning ,INDUSTRIAL management - Abstract
A new simulation-based feedback control algorithm, called Adaptive Work In Process (AWIP), for the design of Self-regulating Production Control Systems (SPCSs) such as Kanban, CONWIP, base stock control and their generalizations is presented. The problem of minimizing average Work In Process (WIP) subject to a required throughput is solved. The AWIP algorithm is used as a feedback controller to adjust the WIP at various stages in the production system. The algorithm is synthesized based on the structural properties of SPCSs that are established analytically in this paper. In this approach simulation is used to provide the feedback to the controllers, and leads to an iterative numerical computational algorithm. Computational experiments show that the AWIP algorithm is near-optimal, and computationally efficient, which makes it an attractive approach for designing and controlling large production systems. [Supplementary materials are available for this article. Go to the publisher's online edition of IIE Transactions for the following free supplemental resource: Appendix] [ABSTRACT FROM AUTHOR]
- Published
- 2009
- Full Text
- View/download PDF
12. Modeling hospital discharge policies for patients with pneumonia-related sepsis.
- Author
-
Kreke, Jennifer E., Bailey, Matthew D., Schaefer, Andrew J., Angus, Derek C., and Roberts, Mark S.
- Subjects
SEPSIS ,CAUSES of death ,HOSPITAL admission & discharge ,HOSPITAL administration ,HOSPITAL patients ,MEDICAL care costs ,SIMULATION methods & models ,PARAMETER estimation ,MANAGEMENT - Abstract
Sepsis, the tenth-leading cause of death in the United States, accounts for more than $16.7 billion in annual health care costs. A significant factor in these costs is hospital length of stay. The lack of standardized hospital discharge policies and an inadequate understanding of sepsis progression have resulted in unnecessarily long hospital lengths of stay. In this paper, a general model of when to discharge a patient with pneumonia-related sepsis from the hospital is presented. The model is parameterized using patient-based disease progression data from a large clinical study in order to characterize optimal discharge policies for various problem instances. In the presented experiments, patient health is represented by SOFA scores, which are commonly used to assess sepsis severity. Control-limit policies for specific patient cohorts defined by age and race are demonstrated. [ABSTRACT FROM AUTHOR]
- Published
- 2008
- Full Text
- View/download PDF
13. Analysis of long flow lines with quality and operational failures.
- Author
-
Kim, Jongyoon and Gershwin, Stanley B.
- Subjects
APPROXIMATION theory ,MANUFACTURING processes ,ASSEMBLY line methods ,MACHINERY ,FAILURE analysis ,SIMULATION methods & models - Abstract
This paper presents approximation methods for the performance analysis of long manufacturing lines, i.e. lines with more than two machines and one buffer, that have both quality and operational failures. We describe three different versions of long flow lines that differ in the locations of the inspection stations and in the sets of machines that each inspection station monitors. We explain a transformation method that approximates long manufacturing lines that have quality and operational failures with long lines that only have operational failures. Such lines can be evaluated by decomposition methods. We introduce other approximations to quantify the effects of the separation of inspections from operations. Comparison with simulation shows that the solution methods provide reliable performance estimates. [ABSTRACT FROM AUTHOR]
- Published
- 2008
- Full Text
- View/download PDF
14. Transient behavior of serial production lines with Bernoulli machines.
- Author
-
Meerkov, Semyon M. and Zhang, Liang
- Subjects
PRODUCTION planning ,RELIABILITY in engineering ,MATHEMATICAL models ,BERNOULLI numbers ,SIMULATION methods & models ,EIGENVALUES - Abstract
The steady-state performance of production systems with unreliable machines has been analyzed extensively during the last 50 years. In contrast, the transient behavior of these systems remains practically unexplored. Transient characteristics, however, may have significant manufacturing implications. Indeed, if, for example, transients are sluggish and the steady state is reached only after a relatively long settling time, the production system may lose some of its throughput, thus leading to a lower efficiency. This paper is devoted to analytical and numerical investigation of the transient behavior of serial production lines with machines having the Bernoulli reliability model. The transients of the states (i.e., the probabilities of buffer occupancy) are described by the Second Largest Eigenvalue (SLE) of the transition matrix of the associated Markov chain. The transients of the outputs (i.e., production rate, PR, and work-in-process, WIP) are characterized by both the SLE and Pre-Exponential Factors (PEF). We study SLE and PEF as functions of machine efficiency, buffer capacity and the number of machines in the system. In addition, we analyze the settling times of PR and WIP and show that the former is often much shorter than the latter. Finally, we investigate production losses due to transients and show that they may be significant in serial lines with relatively large buffers and many machines. To avoid these losses, it is suggested that all buffers initially be half full. For two- and three-machine lines these analyses are carried out analytically; longer lines are investigated by simulations. [ABSTRACT FROM AUTHOR]
- Published
- 2008
- Full Text
- View/download PDF
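The Second Largest Eigenvalue characterization of transients in record 14 is easiest to see in a two-state Markov chain, where the eigenvalues of the transition matrix are 1 and trace − 1. The sketch below is a hand-rolled illustration of that general idea, not the multi-machine Bernoulli-line analysis of the paper.

```python
import math

def sle_two_state(p_stay0, p_stay1):
    """For the 2-state transition matrix [[p00, 1-p00], [1-p11, p11]], the
    eigenvalues are 1 and trace - 1, so the SLE is p00 + p11 - 1."""
    return p_stay0 + p_stay1 - 1.0

def settling_steps(sle, eps=0.02):
    """Steps until the transient factor |SLE|**n decays below eps."""
    return math.ceil(math.log(eps) / math.log(abs(sle)))

def iterate(p_stay0, p_stay1, n, dist=(1.0, 0.0)):
    """Push a starting distribution through the chain for n steps."""
    a, b = dist
    for _ in range(n):
        a, b = a * p_stay0 + b * (1 - p_stay1), a * (1 - p_stay0) + b * p_stay1
    return a, b

# Example: p00 = 0.9, p11 = 0.8 gives SLE = 0.7 and stationary distribution
# (2/3, 1/3); after settling_steps(sle) steps, the deviation of the iterated
# distribution from stationarity has shrunk to roughly eps of its start.
sle = sle_two_state(0.9, 0.8)
n = settling_steps(sle)
a, b = iterate(0.9, 0.8, n)
```

A slow chain (SLE near 1) settles slowly, which is exactly the sluggish-transient regime the abstract flags as a source of lost throughput.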
15. A finite-memory algorithm for estimating the variance of the sample mean.
- Author
-
Song, Wheyming Tina
- Subjects
ANALYSIS of variance ,SIMULATION methods & models ,ESTIMATION theory ,STATISTICAL sampling ,STOCHASTIC analysis ,MATHEMATICAL analysis - Abstract
Estimating the variance of the sample mean is a classical problem of steady-state simulation output analysis. Traditional batch means estimators require specification of the simulation run length a priori. To our knowledge, the Dynamic Non-overlapping Batch Means (DNBM) estimator is the only existing variance estimator that requires constant storage space for any sample size. In this paper, we develop the Dynamic Partial-overlapping Batch Means (DPBM) algorithm, which also requires constant storage space. In terms of the mean squared error, the statistical performance of the DPBM estimators is superior to that of the DNBM estimators. [ABSTRACT FROM AUTHOR]
- Published
- 2007
- Full Text
- View/download PDF
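For context on record 15: the classical Non-overlapping Batch Means estimator it builds on fits in a few lines. The sketch below is that baseline (which assumes the full run is stored, hence O(n) memory), not the constant-memory DNBM/DPBM algorithms the paper discusses.

```python
import random

def nbm_variance_of_mean(data, num_batches):
    """Classical Non-overlapping Batch Means estimator of Var(sample mean):
    split the run into num_batches batches, take the sample variance of the
    batch means, and divide by the number of batches."""
    b = num_batches
    m = len(data) // b
    data = data[: b * m]                       # drop any remainder
    batch_means = [sum(data[i * m:(i + 1) * m]) / m for i in range(b)]
    grand = sum(batch_means) / b
    s2 = sum((y - grand) ** 2 for y in batch_means) / (b - 1)
    return s2 / b

# Sanity check on i.i.d. Uniform(0,1) data, where the true variance of the
# mean of n observations is (1/12)/n; the estimate should land nearby.
rng = random.Random(7)
data = [rng.random() for _ in range(1000)]
est = nbm_variance_of_mean(data, 20)
```

Batching matters for correlated (steady-state) output: batch means are much closer to independent than the raw observations, which is what makes the variance-of-means formula usable at all.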
16. Throughput performance of automated storage/retrieval systems under stochastic demand.
- Author
-
BOZER, YAVUZ A. and CHO, MYEONSIG
- Subjects
AUTOMATED storage retrieval systems ,SYSTEM analysis ,SYSTEMS theory ,SIMULATION methods & models ,SYSTEMS engineering - Abstract
In this paper we are concerned with the throughput performance of an Automated Storage/Retrieval (AS/R) system under stochastic demand, i.e., the case where storage and retrieval requests arrive randomly. Although AS/R systems have been the subject of extensive research, their performance under stochastic demand remains relatively unexplored. In fact, with random storage and retrieval requests, the primary tool for AS/R system analysis has been simulation. Assuming a particular dwell point strategy for the storage/retrieval machine, in this paper we derive closed-form analytical results to evaluate the performance of an AS/R system under stochastic demand and determine whether or not it meets throughput requirements. Although the results are derived for a given system, they can also be used in the design or evaluation of new/proposed systems. [ABSTRACT FROM AUTHOR]
- Published
- 2005
- Full Text
- View/download PDF
17. A Simulation-Based Learning Automata Framework for Solving Semi-Markov Decision Problems Under Long-Run Average Reward.
- Author
-
GOSAVI, ABHIJIT, DAS, TAPAS K., and SARKAR, SUDEEP
- Subjects
DYNAMIC programming ,DECISION making ,SIMULATION methods & models ,MARKOV processes ,ALGORITHMS - Abstract
Many problems of sequential decision making under uncertainty, whose underlying probabilistic structure has a Markov chain, can be set up as Markov Decision Problems (MDPs). However, when their underlying transition mechanism cannot be characterized by the Markov chain alone, the problems may be set up as Semi-Markov Decision Problems (SMDPs). The framework of dynamic programming has been used extensively in the literature to solve such problems. An alternative framework that exists in the literature is that of the Learning Automata (LA). This framework can be combined with simulation to develop convergent LA algorithms for solving MDPs under long-run cost (or reward). A very attractive feature of this framework is that it avoids a major stumbling block of dynamic programming; that of having to compute the one-step transition probability matrices of the Markov chain for every possible action of the decision-making process. In this paper, we extend this framework to the more general SMDP. We also present numerical results on a case study from the domain of preventive maintenance in which the decision-making problem is modeled as an SMDP. An algorithm based on LA theory is employed, which may be implemented in a simulator as a solution method. It produces satisfactory results in all the numerical examples studied. [ABSTRACT FROM AUTHOR]
- Published
- 2004
- Full Text
- View/download PDF
18. Simulation-based shop floor control: formal model, generation and control interface.
- Author
-
Young Jun Son, Wysk, Richard A., and Jones, Albert T.
- Subjects
PRODUCTION control ,MANUFACTURING processes ,PROCESS control systems ,FACTORY management ,INDUSTRIAL engineering ,INDUSTRIAL management ,FACILITY management ,SIMULATION methods & models ,AUTOMATION - Abstract
In this paper, a structure and architecture for the rapid realization of a simulation-based real-time shop floor control system for a discrete part manufacturing system is presented. The research focuses on automatic simulation model and execution system generation from a production resource model. An Automatic Execution Model Generator (AEMG) has been designed and implemented for generating a Message-based Part State Graph (MPSG)-based shop level execution model. An Automatic Simulation Model Generator (ASMG) has been designed and implemented for generating an Arena simulation model based on a resource model (MS Access 97) and an MPSG-based shop level execution model. A commercial finite-capacity scheduler, Tempo, has been used to provide schedule information for the Arena simulation model. This research has been implemented and tested for six manufacturing systems, including The Pennsylvania State University CIM laboratory. [ABSTRACT FROM AUTHOR]
- Published
- 2003
- Full Text
- View/download PDF
19. A flexible simulation tool for manufacturing-cell design, I. model structure, operation and case study.
- Author
-
De Los A. Irizarry, Maria, Wilson, James R., and Trevino, Jaime
- Subjects
MANUFACTURING cells ,PRODUCT life cycle ,PRODUCT management ,SIMULATION methods & models ,MANUFACTURING processes ,MARKETING ,PRODUCTION engineering - Abstract
We present a general manufacturing-cell simulation model for evaluating the effects of world-class manufacturing practices on expected cell performance. The modular structure of the simulation provides the flexibility to analyze a wide variety of manufacturing cells. We formulate a comprehensive annualized cost function for evaluation and comparison of alternative cell configurations. A case study involving assembly of printed circuit boards illustrates the potential benefits of using this tool for cell design and analysis. The simulation model is intended for use in a two-phase approach to cell design that is based on simulated experimentation and response surface analysis as detailed in a companion paper. [ABSTRACT FROM AUTHOR]
- Published
- 2001
- Full Text
- View/download PDF
20. A flexible simulation tool for manufacturing-cell design, II: response surface analysis and case study.
- Author
-
De Los A. Irizarry, Maria, Wilson, James R., and Trevino, Jaime
- Subjects
MANUFACTURING cells ,SIMULATION methods & models ,MANUFACTURING processes ,PRODUCTION engineering ,DESIGN ,METHODOLOGY ,MATHEMATICAL analysis - Abstract
We present a two-phase approach to design and analysis of manufacturing cells based on simulated experimentation and response surface methodology using a general manufacturing-cell simulation model. The first phase involves factor-screening simulation experiments to identify design and operational factors that have a significant effect on cell performance as measured by a comprehensive annual cost function. In the second phase of experimentation, we construct simulation (response surface) metamodels to describe the relationship between the significant cell design and operational factors (the controllable input parameters) and the resulting simulation-based estimate of expected annual cell cost (the output response). We use canonical and ridge analyses of the estimated response surface to estimate the levels of the quantitative input factors that minimize the cell's expected annual cost. We apply this methodology to an assembly cell for printed circuit boards. Compared to the current cell operating policy, the simulation metamodel-based estimate of the optimum operating policy is predicted to yield average annual savings of approximately $425,000, which is a 20% reduction in annual cost. In a companion paper, we detail the structure and operation of the manufacturing-cell simulation model. [ABSTRACT FROM AUTHOR]
- Published
- 2001
- Full Text
- View/download PDF
21. Autonomous agents architectures and algorithms in flexible manufacturing systems.
- Author
-
Adacher, Ludovica, Agnetis, Alessandro, and Meloni, Carlo
- Subjects
PROCESS control systems ,MANUFACTURING processes ,SIMULATION methods & models ,CONJOINT analysis ,JOB shops ,MANUFACTURING cells - Abstract
This paper investigates possible implementations of the autonomous agents concept in flexible manufacturing control. The implementation issues and the effectiveness of different control architectures and algorithms are analyzed by means of a simulation model of a flexible job shop. Extensive experimental results are reported, allowing the evaluation of the trade-off between the degree of autonomy and system performance. [ABSTRACT FROM AUTHOR]
- Published
- 2000
- Full Text
- View/download PDF
22. Loading algorithms for flexible manufacturing systems with partially grouped machines.
- Author
-
Dong-Ho Lee and Yeong-Dae Kim
- Subjects
FLEXIBLE manufacturing systems ,RESOURCE allocation ,ALGORITHMS ,HEURISTIC ,CONFIGURATIONS (Geometry) ,SIMULATION methods & models ,MACHINERY - Abstract
The loading problem in a Flexible Manufacturing System (FMS) involves allocating operations and associated cutting tools to machines for a given set of parts. There may be different environments for the loading problem that result from three ways of grouping machines in an FMS, i.e., no grouping, partial grouping, and total grouping. Unlike most previous studies on the loading problem for the configurations of no grouping and total grouping, this paper focuses on the loading problem resulting from partial grouping, in which each machine is tooled differently but each operation can be processed by one or more machines. Two types of heuristic algorithms are suggested for the loading problem with the objective of minimizing the maximum workload of the machines. Performances of the suggested loading algorithms are tested on randomly generated test problems and the results show that the suggested algorithms perform better than existing ones. In addition, it is found from simulation experiments that loading plans from partial grouping give significantly better performance than those from total grouping. [ABSTRACT FROM AUTHOR]
- Published
- 2000
- Full Text
- View/download PDF
23. Controlling shop floor operations in a multi-family, multi-cell manufacturing environment through constant work-in-process.
- Author
-
Golany, B., Dar-el, E. M., and Zeev, N.
- Subjects
MANUFACTURED products ,MATHEMATICAL programming ,HEURISTIC ,PRODUCTION management (Manufacturing) ,SELF-service (Economics) ,SIMULATION methods & models ,PROBLEM solving - Abstract
This paper discusses pertinent issues in applying CONstant Work-In-Process (CONWIP) principles to control shop floor operations in a manufacturing environment characterized by several product families processed along different routes in several production cells. The approach we take is to simultaneously answer two major questions: (1) what is the best WIP level? and (2) how to arrange the backlog list for a given system? The problem is posed as a mathematical programming model and solved via a simulated annealing heuristic. We design an experiment that captures essential elements of the systems under investigation. We then execute an extensive simulation to evaluate the effectiveness of various control schemes in a multi-cell, multi-family production environment. Specifically, we compare two variants of CONWIP control, one where containers are restricted to stay within given cells all the time and the other where containers are allowed to move through the entire system. We demonstrate the superiority of the latter in all the simulated scenarios. [ABSTRACT FROM AUTHOR]
- Published
- 1999
- Full Text
- View/download PDF
24. A methodology for formulating, formalizing, validating, and evaluating a real-time process control advisor.
- Author
-
Schmidt, Douglas C., Haddock, Jorge, Marchandon, Stephane, Runger, George C., Wallace, William A., and Wright, Roger N.
- Subjects
MANUFACTURING processes ,GUIDELINES ,ARTIFICIAL neural networks ,ARTIFICIAL intelligence ,SIMULATION methods & models ,HUMAN-machine systems - Abstract
This research presents guidelines for designing control processes in which improved quality is achieved by improving manufacturing consistency through the use of intelligent process control. Conventional control processes cannot include the theoretical knowledge, experimental knowledge, and expert knowledge available concerning the product. A hybrid intelligent process control (IPC) combining a continuous simulation (CS) and an artificial neural network (ANN) can make this knowledge available to the operator for process control. This paper presents a methodology for combining the CS and ANN to achieve real-time process control. A human-machine interface (HMI) is included in the process to aid operators in communication with the CS/ANN hybrid IPC. The result of the new process is a real-time process control advisor (RTPCa). A case example for the methodology of formulating, formalizing, validating, and evaluating the RTPCa is given. The case studied concerns galvanizing continuous sheet steel at a steel plant. The CS is written in SIMAN, and the ANN in C. The research validates and evaluates the RTPCa using plant data, simulation output, and face validation by plant personnel. The authors conclude that the benefits of the RTPCa over other forms of IPC include better process communication to the operator, robustness to moderate changes in system parameters, the flexibility to retrain the ANN if conditions change dramatically, and the computation speed necessary for real-time process control. This methodology has further applications to other continuous processes where quality is determined by manufacturing consistency of the product, such as the pulp, paper, and film processing industries. [ABSTRACT FROM AUTHOR]
- Published
- 1998
- Full Text
- View/download PDF
25. Deadlock detection and resolution for discrete-event simulation: multiple-unit seizes.
- Author
-
Venkatesh, S., Smith, J., Deuermeyer, B., and Curry, G.
- Subjects
SIMULATION methods & models ,ALGORITHMS ,CRISES ,COMPUTER simulation ,PRODUCTION management (Manufacturing) - Abstract
This paper develops an automatic scheme to detect and resolve deadlocks in discrete-event simulation systems with entities capable of requesting multiple units of a resource. The research extends earlier deadlock work on discrete simulation systems with unit resource requests. The purpose of the deadlock handling scheme is to provide for additional capabilities in discrete simulation systems. This is accomplished by endowing the simulation system with appropriate data structures and algorithms. The algorithms presented are based on a graph model of deadlocks in the simulation system. The proposed algorithms identify different categories of permanent and transient deadlocks in the simulation system. A deadlock resolution scheme is also developed in the case of group-processing for permanent deadlocks. [ABSTRACT FROM AUTHOR]
- Published
- 1998
- Full Text
- View/download PDF
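Record 25's deadlock handling is graph-based, and the usual core of such schemes is a cycle search in a wait-for graph. The sketch below is that textbook building block under a simplifying assumption (an edge is drawn whenever an entity's request cannot currently be satisfied, which is how multiple-unit seizes would manifest); it does not reproduce the paper's permanent/transient categorization or its resolution scheme.

```python
def find_deadlock(wait_for):
    """Return one cycle in a wait-for graph (entity -> set of entities it is
    blocked behind), or None if the graph is acyclic.  A cycle corresponds
    to a deadlock under this simplified model."""
    WHITE, GRAY, BLACK = 0, 1, 2
    color = {}
    stack = []

    def dfs(p):
        color[p] = GRAY
        stack.append(p)
        for q in wait_for.get(p, ()):
            c = color.get(q, WHITE)
            if c == GRAY:                      # back edge: cycle found
                return stack[stack.index(q):]
            if c == WHITE:
                cycle = dfs(q)
                if cycle:
                    return cycle
        stack.pop()
        color[p] = BLACK
        return None

    for p in wait_for:
        if color.get(p, WHITE) == WHITE:
            cycle = dfs(p)
            if cycle:
                return cycle
    return None

# A waits on B, B on C, C on A: those three form the deadlock.  D merely
# waits behind the cycle and is not reported as part of it.
cycle = find_deadlock({"A": {"B"}, "B": {"C"}, "C": {"A"}, "D": {"A"}})
```

In a simulation engine, the detector would run when a seize request blocks, and a resolution step would then pick a victim entity in the cycle to preempt or abort.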
26. A formal structure for discrete event simulation. Part II: Object-oriented software implementation for manufacturing systems.
- Author
-
Karacal, S. Cem and Mize, Joe H.
- Subjects
OBJECT-oriented methods (Computer science) ,SIMULATION methods & models ,COMPUTER software ,COMPUTER simulation ,SMALLTALK (Computer program language) ,PROTOTYPES - Abstract
This paper describes the development of a prototype object-oriented software system for discrete event simulation, and for the embedded decision processes of a system being modeled, based on a previously defined formalism [1] and the Smalltalk programming language. The paper addresses the modular and structured representations of physical and logical entities of a manufacturing system for simulation modeling in the form of reusable software objects. The software takes advantage of the natural link between object-oriented programming and simulation and utilizes inheritance and other features of object-oriented programming to achieve modular yet uniform representation at every level of the model. After giving a brief overview of the object-oriented modeling environment and the relationships between software objects and formalism constructs, a small number of object classes and their operations are summarized. The intelligent entities of the formalism utilize a knowledge-based non-programmed decision mechanism implemented in Smalltalk. [ABSTRACT FROM AUTHOR]
- Published
- 1998
- Full Text
- View/download PDF
27. Computer-assisted simulation analysis.
- Author
-
Yu-Hui Tao and Nelson, Barry L.
- Subjects
COMPUTER software ,SIMULATION methods & models ,SYSTEMS design ,STRUCTURAL frames ,COMPUTER systems ,ELECTRONIC systems - Abstract
We propose a framework for understanding the complex activities of Simulation Experiment Design and Analysis (SEDA), a framework that provides a basis for developing SEDA software. This paper presents the framework and illustrates it with a brief example. The definitions of six SEDA components form the core of the framework. The framework itself includes a dynamic model and a static model. The dynamic model is a generic description of the sequential nature of SEDA. The static model, in contrast, is a very specific description of the SEDA computer system structure for our chosen problem domain: comparison of queueing networks in terms of their expected performance. We argue that the proposed SEDA framework is a good one on the basis of its properties and by comparing it with related frameworks. We close by briefly summarizing our prototype SEDA software, which was used to evaluate our framework. [ABSTRACT FROM AUTHOR]
- Published
- 1997
- Full Text
- View/download PDF
28. ESTIMATING SIMULATION METAMODELS USING COMBINED CORRELATION-BASED VARIANCE REDUCTION TECHNIQUES.
- Author
-
Tew, Jeffrey D. and Wilson, James R.
- Subjects
SIMULATION methods & models ,OPERATIONS research ,RANDOM numbers ,REGRESSION analysis - Abstract
This paper develops a procedure for jointly applying all of the correlation-based variance reduction techniques (namely, the methods of antithetic variates, common random numbers, and control variates) in a simulation experiment that is designed to estimate a linear metamodel (that is, a linear regression model) for a single response variable expressed in terms of an input vector of design variables for the target system. This procedure combines (a) the Schruben-Margolin strategy for metamodel estimation based on joint application of the methods of common random numbers and antithetic variates, and (b) a metamodel estimation scheme based on the method of control variates. Under specified conditions on the dependency structure of the simulation outputs and with respect to a variety of optimality criteria, the combined procedure is shown to be superior to each of the following conventional correlation-based variance reduction techniques: independent random number streams, common random number streams, control variates, and the original Schruben-Margolin strategy. [ABSTRACT FROM AUTHOR]
- Published
- 1994
- Full Text
- View/download PDF
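The control-variates component of the combined procedure above can be illustrated in isolation. The sketch below shows the standard single-control adjustment with an estimated optimal coefficient; the toy response and parameter values are hypothetical, and this is not the Schruben-Margolin combination itself:

```python
import random
import statistics

def control_variate_estimate(ys, cs, c_mean):
    """Variance-reduced estimate of E[Y] using a single control variate C
    whose true mean c_mean is known (e.g., a driving input distribution)."""
    y_bar = statistics.fmean(ys)
    c_bar = statistics.fmean(cs)
    # Estimated optimal coefficient b* = Cov(Y, C) / Var(C)
    cov_yc = sum((y - y_bar) * (c - c_bar) for y, c in zip(ys, cs)) / (len(ys) - 1)
    b_star = cov_yc / statistics.variance(cs)
    return y_bar - b_star * (c_bar - c_mean)

# Toy experiment: a response driven by exponential "service times" C with E[C] = 1
random.seed(7)
cs = [random.expovariate(1.0) for _ in range(10_000)]
ys = [2.0 * c + random.gauss(0.0, 0.1) for c in cs]
print(control_variate_estimate(ys, cs, c_mean=1.0))  # close to the true mean 2.0
```

The adjustment subtracts out the part of the sampling error that is explained by the observed deviation of the control's sample mean from its known mean.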
29. SEPARATING THE ART AND SCIENCE OF SIMULATION OPTIMIZATION: A KNOWLEDGE-BASED ARCHITECTURE PROVIDING FOR MACHINE LEARNING.
- Author
-
Greenwood, Allen G., Rees, Loren Paul, and Crouch, Ingrid W. M.
- Subjects
SIMULATION methods & models ,MATHEMATICAL optimization ,ARCHITECTURE ,TECHNOLOGY ,MODULAR design - Abstract
The purpose of this paper is to develop an architecture for simulation optimization, building on the work we published in this journal in 1985. The need for a dramatically updated architecture is established by examining the simulation optimization process, traditional approaches to the problem, and difficulties inherent with these methodologies. Our framework directly addresses these problems by exploiting concepts and technologies that have become popular in the last ten years, such as expert systems, to capture heuristic opinions and experience, and neural networks, to introduce machine learning. Three main contributions result from this research. First, the simulation optimization process is examined from a completely new perspective--a strategic overview of the process leads to an unbundling of the "art" and "science" elements that are co-mingled in current practice, thereby promoting modularity and making the other two contributions possible. Second, the newly unbundled process is translated to a knowledge-based context facilitating the direct inclusion of human expertise. The third contribution of this paper is the incorporation of machine learning into the framework, thus permitting the optimizer to teach itself from its experiences. [ABSTRACT FROM AUTHOR]
- Published
- 1993
- Full Text
- View/download PDF
30. SIMULATION METAMODELS OF SYSTEM AVAILABILITY AND OPTIMUM SPARE AND REPAIR UNITS.
- Author
-
Madu, Christian N. and Chu-Hua Kuei
- Subjects
QUEUING theory ,TAGUCHI methods ,QUALITY control ,REGRESSION analysis ,SIMULATION methods & models - Abstract
This paper presents a new approach to maximize the steady-state availability of a closed queuing multiechelon repairable system with cold standbys. The new method is based on the use of the Taguchi inner array design in simulation modeling of this problem. Regression models are developed based on the results generated from the balanced simulation design. Two special cases of the exponential and Erlang-2 service distributions are presented. These models were validated and shown to provide good approximations to simulation results. Within the specified domain of the problem's parameters, the total operating cost of this system is minimized to determine the optimum spare and repair units. [ABSTRACT FROM AUTHOR]
- Published
- 1992
- Full Text
- View/download PDF
31. The Analysis of a Production Line with Unreliable Machines and Random Processing Times.
- Author
-
Yushin Hong, Glassey, C. Roger, and Seong, Deokhyun
- Subjects
MANUFACTURING processes ,PRODUCTION management (Manufacturing) ,SIMULATION methods & models ,OPERATIONS research - Abstract
This paper develops an efficient analytical method for the analysis of an n-machine production line with unreliable machines and random processing times. The behavior of the n-machine line is approximated by the behaviors of (n - 1) two-machine lines obtained by decomposition. Simulation and numerical experiments show that the analytical method works well and is very efficient. [ABSTRACT FROM AUTHOR]
- Published
- 1992
- Full Text
- View/download PDF
32. An Approximate Model for Field Service Territory Planning.
- Author
-
Hill, Arthur V., March, Salvatore T., Nachtsheim, Christopher J., and Shanker, Murali S.
- Subjects
SIMULATION methods & models ,OPERATIONS research ,DECISION support systems ,MANAGEMENT information systems ,SERVICE industries - Abstract
Field service managers are often faced with the problem of balancing the number of technicians, territory size, and field service quality. This paper presents an approximate state-dependent queuing model that can help field service managers make these tradeoffs. Simulation experiments over a variety of field service environments demonstrate that this model is quite accurate for predicting mean travel time and mean response time. The approximate queuing model has been embedded in a decision support system and implemented by a Fortune 100 company. Management found the decision support system very useful in making important field service decisions. [ABSTRACT FROM AUTHOR]
- Published
- 1992
- Full Text
- View/download PDF
33. Assembly/Disassembly Systems: An Efficient Decomposition Algorithm for Tree-Structured Networks.
- Author
-
Gershwin, Stanley B.
- Subjects
SYSTEM analysis ,SYSTEMS theory ,MATHEMATICAL models ,SIMULATION methods & models - Abstract
Assembly/Disassembly Networks are networks of queues in which assembly or disassembly (often called join or fork) take place. This paper describes and analyzes a class of these systems in which machines are unreliable, buffers are finite, machines perform operations whenever none of their upstream buffers are empty and none of their downstream buffers are full, and the network structure is a tree. These systems are difficult to evaluate because of their large state spaces and because they cannot be decomposed exactly. An efficient approximate decomposition method for the evaluation of performance measures is presented. This decomposition approach is based on such system characteristics as conservation of flow. Comparison with simulation results indicates that it is accurate. [ABSTRACT FROM AUTHOR]
- Published
- 1991
- Full Text
- View/download PDF
34. Balancing Cycle Time and Workstations.
- Author
-
Deckro, Richard F.
- Subjects
MATHEMATICAL models ,SIMULATION methods & models ,OPERATIONS research ,ASSEMBLY line methods ,PRODUCTION engineering ,INDUSTRIAL management - Abstract
This paper develops a model which simultaneously considers the minimization of cycle time and the number of workstations in an assembly line balancing problem. The model is developed from such known zero-one formulations as the Patterson and Albracht and the Thangavelu and Shetty models. The flexibility provided by the model should prove especially valuable in the planning stages of a line balancing operation. [ABSTRACT FROM AUTHOR]
- Published
- 1989
- Full Text
- View/download PDF
35. Budgeting in Hierarchical Systems Under Uncertainty.
- Author
-
Sinuany-Stern, Zilla and Rosenblatt, Meir J.
- Subjects
BUDGET ,PRESENT value ,INTERNAL rate of return ,CASH flow ,MATHEMATICAL programming ,SIMULATION methods & models - Abstract
This paper deals with budgeting procedures in a multi-division firm. The global objective function of the firm is composed of maximizing the present value and minimizing the risk of the accepted projects. A weight is assigned to each of these objectives. Different budgeting procedures are developed for the cases where these weights are known or unknown and for centralized and decentralized settings. In the cases where these weights are known, mathematical programming methods are suggested. However, in the cases where the values of the weights are unspecified, the notion of Discrete Efficient Frontier (DEF) is developed to represent sets of efficient (not dominated) combinations of projects. A simple measure of performance is developed for evaluating the effectiveness of the various budgeting procedures. This measure is based on the total area beneath the DEF. Finally, an extensive simulation study is carried out to compare the effectiveness of the centralized vs. the decentralized procedures suggested in this study. [ABSTRACT FROM AUTHOR]
- Published
- 1987
- Full Text
- View/download PDF
36. Economic Production Cycles with Imperfect Production Processes.
- Author
-
Meir J. Rosenblatt and Lee, Hau L.
- Subjects
MANUFACTURING processes ,PRODUCTION engineering ,MATHEMATICAL analysis ,MATHEMATICAL models ,MATHEMATICAL optimization ,SIMULATION methods & models - Abstract
In this paper, we study the effects of an imperfect production process on the optimal production cycle time. The system is assumed to deteriorate during the production process and produce some proportion of defective items. The optimal production cycle is derived, and is shown to be shorter than that of the classical Economic Manufacturing Quantity model. The analysis is extended to the case where the defective rate is a function of the set-up cost, for which the set-up cost level and the production cycle time are jointly optimized. Finally, we also consider the case where the deterioration process is dynamic in its nature, i.e., the proportion of defective items is not constant. Linear, exponential, and multi-state deteriorating processes are all studied. Numerical examples are provided to illustrate the derivation of the optimal production cycle time in these situations. [ABSTRACT FROM AUTHOR]
- Published
- 1986
- Full Text
- View/download PDF
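The abstract's central claim, that deterioration shortens the optimal cycle relative to the classical Economic Manufacturing Quantity (EMQ) cycle, can be illustrated numerically. The cost model below (a defect cost growing linearly with cycle length, and all parameter values) is a hypothetical sketch, not the authors' formulation:

```python
import math

# Illustrative parameters (hypothetical, not from the paper):
K = 200.0            # setup cost per production cycle
h = 2.0              # holding cost per unit per unit time
d, p = 100.0, 400.0  # demand rate and production rate
c_d = 5.0            # cost per defective unit
beta = 0.002         # defective proportion assumed to grow linearly with cycle length

def cost_rate(T, defects=True):
    """Average cost per unit time for cycle length T: setup + holding,
    plus (optionally) a defect cost that rises with longer cycles."""
    rate = K / T + 0.5 * h * d * (1 - d / p) * T
    if defects:
        rate += c_d * d * beta * T
    return rate

def argmin_T(defects):
    """Grid search for the cycle length minimizing the cost rate."""
    return min((cost_rate(T / 1000, defects), T / 1000)
               for T in range(1, 20_000))[1]

T_classical = math.sqrt(2 * K / (h * d * (1 - d / p)))  # classical EMQ cycle
T_imperfect = argmin_T(defects=True)
print(round(T_classical, 3), round(T_imperfect, 3))
assert T_imperfect < T_classical  # deterioration shortens the optimal cycle
```

Because the defect term adds to the cost component that grows linearly in T, the minimizing cycle necessarily shifts below the classical EMQ value.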
37. A Multifactor Priority Rule for Jobshop Scheduling Using Computer Search.
- Author
-
Bunnag, Panit and Smith, Spencer B.
- Subjects
JOB hunting ,COMPUTER systems ,ELECTRONIC systems ,AUTOMATION ,COMPUTER simulation ,SIMULATION methods & models - Abstract
Priority rules are widely used in jobshop scheduling to determine the sequence in which jobs are to be processed. The research in this area has been directed at developing generally applicable priority rules. This paper presents a method for determining an effective priority rule specific to the jobshop scheduling problem to be solved. First, a generalized objective function is formulated which is the sum of costs of tardiness, carrying in-process inventory and machine idleness. Second, a multifactor priority rule is developed which is a weighted average of four factors used in simple priority rules. Third, a method is presented for using a computer search technique to determine the best weights to use in the priority rule. Finally, a computer simulation for testing this approach versus using other priority rules is described and the experimental results are reported. [ABSTRACT FROM AUTHOR]
- Published
- 1985
- Full Text
- View/download PDF
38. Control Variates for Multipopulation Simulation Experiments.
- Author
-
Nozari, Ardavan, Arnold, Steven F., and Pegden, C. Dennis
- Subjects
OPERATIONS research ,SIMULATION methods & models ,COST control ,INDUSTRIAL costs ,INDUSTRIAL research - Abstract
In this paper the application of control variates for multipopulation simulation experiments is discussed. Procedures for statistical analysis, when these variates are observed, are presented, and results on the efficiency of employing control variates are established. [ABSTRACT FROM AUTHOR]
- Published
- 1984
- Full Text
- View/download PDF
39. ON THE IMPACT OF FAMILY SCHEDULING PROCEDURES.
- Author
-
Wemmerlöv, Urban and Vakharia, Asso J.
- Subjects
PRODUCTION scheduling ,SIMULATION methods & models ,OPERATIONS research ,SYSTEMS engineering - Abstract
This paper shows that family scheduling procedures can generate substantial reductions in waiting time even if the relative setup time reduction is small. The authors note that in a recent article in IIE Transactions, S. Kekre provided an excellent analysis of setup avoidance and waiting time behavior for a single-machine facility. In particular, the value of applying a setup-reducing look-ahead sequencing rule (part family sequencing) was investigated. Using lot sizes determined to minimize waiting time in queue under random setup avoidance conditions, Kekre concluded, based on both analytic derivations and simulations, that processing all parts available from the queue for which the cell is already set up, thereby saving setup time, yields little reduction in queue times. This conclusion is based on the erroneous assumption that setup time reduction equals waiting time reduction. The authors conclude that setup-avoiding sequencing rules, when lot sizes are calculated to minimize waiting time in queue for identical items, can be highly beneficial with respect to average waiting time performance. Thus, according to them, Kekre's conclusion that sequencing with a look-ahead rule yields little benefit is misleading.
- Published
- 1993
- Full Text
- View/download PDF
40. Monitoring and accurately interpreting service processes with transactions that are classified in multiple categories.
- Author
-
Duran, Rodrigo I. and Albin, Susan L.
- Subjects
CUSTOMER services ,BINOMIAL distribution ,PROBABILITY theory ,CHI-square distribution ,QUALITY control charts ,SIMULATION methods & models - Abstract
Consider a process where transactions, such as customer service transactions, are classified into categories. With just two categories, the fraction in each can be monitored with the familiar p-chart based on the binomial distribution. This paper presents a new method for monitoring the number of transactions among K categories, the p-tree method, which provides an accurate and easy way to help pinpoint the categories where there has been a disturbance. In contrast to existing practice, the proposed method not only signals an out-of-control situation but also helps identify which categories are causing the problem. It is shown that a K-category process can be represented by a probability tree with K - 1 binary stages and hence monitored with K - 1 independent p-charts. Simulation studies show that the p-tree method is a helpful diagnostic tool and that its sensitivity is comparable to existing multinomial-based control charts. [ABSTRACT FROM AUTHOR]
- Published
- 2010
- Full Text
- View/download PDF
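The probability-tree decomposition described in this abstract can be sketched directly: peel the categories off in a fixed order, so that stage j monitors the conditional fraction falling in category j among the transactions not yet classified. The function names and the fixed peeling order below are illustrative assumptions:

```python
def p_tree_fractions(counts):
    """Decompose counts over K categories into K - 1 conditional binomial
    fractions: stage j is the fraction in category j among transactions not
    already placed in categories 1..j-1. Each stage can then be watched
    with an ordinary p-chart."""
    fractions = []
    remaining = sum(counts)
    for c in counts[:-1]:
        fractions.append(c / remaining if remaining else 0.0)
        remaining -= c
    return fractions

def p_chart_limits(p0, n, z=3.0):
    """Standard 3-sigma p-chart limits for a stage with in-control
    fraction p0 and sample size n."""
    sigma = (p0 * (1 - p0) / n) ** 0.5
    return max(0.0, p0 - z * sigma), min(1.0, p0 + z * sigma)

# Example: 4 categories reduce to 3 conditional fractions
sample = [50, 30, 15, 5]
print(p_tree_fractions(sample))  # [0.5, 0.6, 0.75]
```

A disturbance in one category perturbs only the stages at or after that category's position in the tree, which is what lets the method point at the offending categories.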
41. Indirect cycle time quantile estimation using the Cornish-Fisher expansion.
- Author
-
Bekki, Jennifer M., Fowler, John W., Mackulak, Gerald T., and Nelson, Barry L.
- Subjects
SIMULATION methods & models ,MATHEMATICAL analysis ,MANUFACTURING processes ,PRODUCTION engineering ,COMPUTER simulation - Abstract
This paper proposes a technique for estimating steady-state quantiles from discrete-event simulation models, with particular attention paid to cycle time quantiles of manufacturing systems. The technique is based on the Cornish-Fisher expansion, justified through an extensive empirical study, and is supported with mathematical analysis. It is shown that the technique provides precise and accurate estimates for the most commonly estimated quantiles with minimal data storage and low computational requirements. [ABSTRACT FROM AUTHOR]
- Published
- 2010
- Full Text
- View/download PDF
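The appeal of the Cornish-Fisher approach in this abstract is that a quantile can be recovered from just the first four moments, so almost nothing needs to be stored. The fourth-order expansion below is the standard textbook form; the paper's exact variant and moment-estimation scheme may differ:

```python
import random
import statistics
from statistics import NormalDist

def cornish_fisher_quantile(data, p):
    """Approximate the p-quantile from the first four moments of the data
    via the fourth-order Cornish-Fisher expansion (only moments are stored)."""
    n = len(data)
    mu = statistics.fmean(data)
    sd = statistics.pstdev(data)
    skew = sum((x - mu) ** 3 for x in data) / (n * sd ** 3)
    exkurt = sum((x - mu) ** 4 for x in data) / (n * sd ** 4) - 3.0
    z = NormalDist().inv_cdf(p)
    w = (z
         + (z ** 2 - 1) * skew / 6
         + (z ** 3 - 3 * z) * exkurt / 24
         - (2 * z ** 3 - 5 * z) * skew ** 2 / 36)
    return mu + sd * w

# Sanity check on standard normal data: the 0.95 quantile should sit near 1.645
random.seed(3)
data = [random.gauss(0.0, 1.0) for _ in range(20_000)]
print(round(cornish_fisher_quantile(data, 0.95), 3))
```

For skewed cycle time distributions, the skewness and kurtosis terms are what pull the estimate away from the naive normal approximation mu + sd * z.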
42. Confidence interval estimation using quasi-independent sequences.
- Author
-
Chen, E. Jack and Kelton, W. David
- Subjects
SIMULATION methods & models ,OPERATIONS research ,QUANTITATIVE research ,CONFIDENCE intervals ,STOCHASTIC processes - Abstract
A Quasi-Independent (QI) subsequence is a subset of time series observations obtained by systematic sampling. Because the observations appear to be independent, as determined by runs tests, classical statistical techniques can be applied to them directly. This paper discusses implementation of a sequential procedure to determine the simulation run length to obtain a QI subsequence, and the batch size for constructing confidence intervals for an estimator of the steady-state mean of a stochastic process. The proposed QI procedures increase the simulation run length and batch size progressively until a certain number of essentially independent and identically distributed samples are obtained. The only (mild) assumption is that the correlations of the stochastic-process output sequence eventually die off as the lag increases. An experimental performance evaluation demonstrates the validity of the QI procedure. [ABSTRACT FROM AUTHOR]
- Published
- 2010
- Full Text
- View/download PDF
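The QI idea, widening the systematic-sampling lag until a runs test no longer rejects independence, can be sketched as follows. The AR(1)-style output process and the lag schedule are illustrative assumptions, not the paper's sequential procedure:

```python
import math
import random

def runs_test_z(xs):
    """Wald-Wolfowitz runs test (above/below the median) for randomness.
    |z| below roughly 1.96 is consistent with independence at the 5% level."""
    med = sorted(xs)[len(xs) // 2]
    signs = [x > med for x in xs if x != med]
    n1 = sum(signs)
    n2 = len(signs) - n1
    runs = 1 + sum(a != b for a, b in zip(signs, signs[1:]))
    mean = 2 * n1 * n2 / (n1 + n2) + 1
    var = (2 * n1 * n2 * (2 * n1 * n2 - n1 - n2)) / ((n1 + n2) ** 2 * (n1 + n2 - 1))
    return (runs - mean) / math.sqrt(var)

def qi_subsequence(series, lag):
    """Systematic sample: keep every lag-th observation of the output series."""
    return series[::lag]

# Correlated AR(1)-style simulation output; widen the lag until the test passes
random.seed(1)
y, series = 0.0, []
for _ in range(50_000):
    y = 0.9 * y + random.gauss(0.0, 1.0)
    series.append(y)
for lag in (1, 5, 25, 125):
    print(lag, round(runs_test_z(qi_subsequence(series, lag)), 2))
```

At lag 1 the positive autocorrelation produces far too few runs (a large negative z); as the lag grows the retained observations behave like an i.i.d. sample and the statistic settles into the acceptance region.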
43. Variance component analysis based fault diagnosis of multi-layer overlay lithography processes.
- Author
-
Yu, Jie and Qin, S. Joe
- Subjects
ANALYSIS of variance ,LITHOGRAPHY ,ESTIMATION theory ,ASYMPTOTIC distribution ,DEMODULATION ,SEMICONDUCTOR manufacturing ,ELECTRONIC systems ,SIMULATION methods & models ,ELECTRONICS - Abstract
The overlay lithography process is one of the most important steps in semiconductor manufacturing. This work attempts to solve a challenging problem in this technique, namely error source identification and diagnosis for multistage overlay processes. In this paper, a multistage state space model for the misalignment errors of the lithography process is developed and a general mixed linear input-output model is then formulated to incorporate both fixed and random effects. Furthermore, the minimum norm quadratic unbiased estimation strategy is used to estimate the mean and variance components of potential fault sources, and their asymptotic distributions are used to test the hypothesis concerning the statistical significance of each potential fault. Based on the above procedures, the root cause of misalignment errors in a multi-layer overlay process can be detected and diagnosed with physical inference. A number of simulated examples are designed and tested to verify the validity of the presented approach in fault detection and diagnosis of multi-stepper overlay processes. [ABSTRACT FROM AUTHOR]
- Published
- 2009
- Full Text
- View/download PDF
44. A decomposition approximation for three-machine closed-loop production systems with unreliable machines, finite buffers and a fixed population.
- Author
-
Maggio, N., Matta, A., Gershwin, S.B., and Tolio, T.
- Subjects
PRODUCTION scheduling ,BUFFER storage (Computer science) ,MACHINE theory ,APPROXIMATION theory ,FUNCTIONAL analysis ,PROBABILITY theory ,STATISTICAL correlation ,SIMULATION methods & models ,MANUFACTURING processes - Abstract
This paper describes an approximate analytical method for evaluating the average values of throughput and buffer levels of closed three-machine production systems with finite buffers. The method includes a new set of decomposition equations and a new building block model. The machines have deterministic processing times and geometrically distributed probabilities of failure and repair. The numerical results of the method are close to those from simulation. The method performs well because it takes into account the correlation among the numbers of parts in the buffers. Extensions to larger systems are discussed. [Supplementary materials are available for this article. Go to the publisher's online edition of IIE Transactions for the following free supplemental resource: Appendix of additional numerical results] [ABSTRACT FROM AUTHOR]
- Published
- 2009
- Full Text
- View/download PDF
45. A large-scale simulation model of pandemic influenza outbreaks for development of dynamic mitigation strategies.
- Author
-
Das, Tapas K., Savachkin, Alex A., and Zhu, Yiliang
- Subjects
INFLUENZA viruses ,PANDEMICS ,COMMUNICABLE disease treatment ,COMMUNICABLE diseases ,HAZARD mitigation ,MARKOV processes ,SIMULATION methods & models ,PREVENTION of communicable diseases - Abstract
Limited stockpiles of vaccine and antiviral drugs and other resources pose a formidable healthcare delivery challenge for an impending human-to-human transmittable influenza pandemic. The existing preparedness plans by the Centers for Disease Control and Prevention and the Department of Health and Human Services strongly underscore the need for efficient mitigation strategies. Such a strategy entails decisions for early response, vaccination, prophylaxis, hospitalization and quarantine enforcement. This paper presents a large-scale simulation model that mimics stochastic propagation of an influenza pandemic controlled by mitigation strategies. The impact of a pandemic is assessed via measures including total numbers of infected, dead, denied hospital admission and denied vaccine/antiviral drugs, and also through an aggregate cost measure incorporating healthcare cost and lost wages. The model considers numerous demographic and community features, daily human activities, vaccination, prophylaxis, hospitalization, social distancing, and hourly accounting of infection spread. The simulation model can serve as the foundation for developing dynamic mitigation strategies. The simulation model is tested on a hypothetical community of over 1 100 000 people. A designed experiment is conducted to examine the statistical significance of a number of model parameters. The experimental outcomes can be used in developing guidelines for strategic use of limited resources by healthcare decision makers. Finally, a Markov decision process model and its simulation-based reinforcement learning framework for developing mitigation strategies are presented. The simulation-based framework is quite comprehensive and general, and can be particularized to other types of infectious disease outbreaks. [ABSTRACT FROM AUTHOR]
- Published
- 2008
- Full Text
- View/download PDF
46. A five-class variance swapping rule for simulation experiments: A correlated-blocks design.
- Author
-
Song, Wheyming Tina and Chiu, Wenchi
- Subjects
SIMULATION methods & models ,FACTORIAL experiment designs ,FACTOR analysis ,EXPERIMENTAL design ,RANDOM numbers - Abstract
A simulation experiment is frequently performed to estimate a metamodel, which is a functional relationship between the mean response of the simulation model and a set of simulation inputs. Variance Swapping Rules (VSRs), which assign pseudo-random number streams in simulation experiments, are often used to increase the precision of the functional relationship. This paper proposes a five-class VSR, which classifies all variances of the effects estimators into five classes for linear metamodels of 2^k factorial designs. The five-class VSR induces correlations among all blocks for which all design points have a special correlation structure. This five-class correlated-blocks VSR is a generalization of all existing VSRs, which are viewed as one-, two-, or three-class VSRs in terms of the variances of the effects estimators. The five-class rule provides a better VSR than the existing VSRs in that it allows one to make a finer distinction among all effects for which the variances are not allowed to be swapped in three-class VSRs. [ABSTRACT FROM AUTHOR]
- Published
- 2007
- Full Text
- View/download PDF
47. Fully sequential selection procedures with a parabolic boundary.
- Author
-
Batur, Demet and Kim, Seong-Hee
- Subjects
SEQUENTIAL analysis ,RANKING (Statistics) ,SIMULATION methods & models ,PARABOLA ,VARIANCES - Abstract
We present two fully sequential indifference-zone procedures to select the best system from a number of competing simulated systems when "best" is defined in terms of the maximum or minimum expected performance. These two procedures have parabola-shaped continuation regions rather than the triangular continuation regions employed in several papers in the existing literature. The procedures we present accommodate unequal and unknown variances across systems and the use of common random numbers. However, we assume that basic observations are independent and identically normally distributed. We compare the performance of our procedures with those of other fully sequential procedures available in the literature. [ABSTRACT FROM AUTHOR]
- Published
- 2006
- Full Text
- View/download PDF
48. Integrated simulation application design for short-term production scheduling.
- Author
-
Kumar, Sameer and Nottestad, Daniel A.
- Subjects
PRODUCTION scheduling ,SIMULATION methods & models ,MANUFACTURING processes ,COMPUTER software ,MANAGEMENT - Abstract
This paper focuses on the development of an integrated short-term production scheduling simulation model for a complex manufacturing operation. The decision making tool has a front-end user data interface built with Microsoft Access and Visual Basic for Applications software. The model assists production planners to make informed decisions on how best to simultaneously schedule two lanes of daily or weekly production on a unique plastic parts manufacturing line. The advantages and disadvantages (model weaknesses) of using the simulation model, especially as a scheduling tool, are discussed. In this particular application, we find that the advantages far outweigh the disadvantages. [ABSTRACT FROM AUTHOR]
- Published
- 2006
- Full Text
- View/download PDF
49. A review on design, modeling and applications of computer experiments.
- Author
-
Chen, Victoria C. P., Tsui, Kwok-Leung, Barton, Russell R., and Meckesheimer, Martin
- Subjects
STATISTICS ,EXPERIMENTAL design ,SIMULATION methods & models ,ENGINEERING ,STOCHASTIC models - Abstract
In this paper, we provide a review of statistical methods that are useful in conducting computer experiments. Our focus is on the task of metamodeling, which is driven by the goal of optimizing a complex system via a deterministic simulation model. However, we also mention the case of a stochastic simulation, and examples of both cases are discussed. The organization of our review first presents several engineering applications; it then describes approaches for the two primary tasks of metamodeling: (i) selecting an experimental design; and (ii) fitting a statistical model. Seven statistical modeling methods are included. Both classical and newer experimental designs are discussed. Finally, our own computational study tests the various metamodeling options on two two-dimensional response surfaces and one ten-dimensional surface. [ABSTRACT FROM AUTHOR]
- Published
- 2006
- Full Text
- View/download PDF
50. A Survey of Recent Advances in Discrete Input Parameter Discrete-Event Simulation Optimization.
- Author
-
SWISHER, JAMES R., HYDEN, PAUL D., JACOBSON, SHELDON H., and SCHRUBEN, LEE W.
- Subjects
SIMULATION methods & models ,MATHEMATICAL optimization ,OPERATIONS research ,DISCRETE-time systems - Abstract
Discrete-event simulation optimization is a problem of significant interest to practitioners interested in extracting useful information about an actual (or yet to be designed) system that can be modeled using discrete-event simulation. This paper presents a survey of the literature on discrete-event simulation optimization published in recent years (1988 to the present), with a particular focus on discrete input parameter optimization. The discrete input parameter case differentiates between techniques appropriate for small and for large numbers of feasible input parameter values. Examples of applications that illustrate these methods are also discussed. [ABSTRACT FROM AUTHOR]
- Published
- 2004
- Full Text
- View/download PDF