Search Results (5,393 results)
2. Publishing an applied statistics paper: Guidance and advice from editors.
- Author
- Anderson‐Cook, Christine M., Lu, Lu, Gramacy, Robert B., Jones‐Farmer, L. Allison, Montgomery, Douglas C., and Woodall, William H.
- Subjects
- *RESEARCH personnel, *ADVICE, *WRITING processes, *MENTORING, *PUBLISHING, *ELECTRONIC publications, *ONLINE comments
- Abstract
One of the tasks required of most statistics researchers and academic faculty is to publish their innovative ideas in the peer‐reviewed literature. In this paper, we provide guidance about the different stages of the process as experienced authors and offer advice from those who hold the decision about the success or failure of these papers, namely the editors of applied statistics journals. The paper is organized into four sections focusing on the different stages of publishing: (1) Planning what to write about, where to submit and how to organize the paper; (2) The process of writing the paper; (3) Interpreting and responding to peer‐reviews from the journal editors and referees to prepare for resubmission; and (4) General comments about the publication process, including collaboration and mentoring. Each section starts with fundamentals provided by the moderators (C.A.C. and L.L.) on key aspects to consider on each topic and then is followed with discussion from some current and past editors of impactful journals in the field of applied statistics. Our hope is that this collection of insights may help accelerate learning about the process for young researchers and help all researchers to understand some of the important often‐unspoken aspects of the process. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
3. Advances and novel applications in systems reliability and safety engineering (selected papers of the International Conference of SRSE 2022)
- Author
- Peng, Weiwen, Xu, Ancha, and Hu, Jiawen
- Published
- 2024
- Full Text
- View/download PDF
4. A bibliography of the literature on process capability indices (PCIs): 2010–2021, Part I: Books, review/overview papers, and univariate PCI‐related papers.
- Author
- Yum, Bong‐Jin
- Subjects
- *PROCESS capability, *BIBLIOGRAPHY, *ACCEPTANCE sampling, *QUALITY control charts, *INTEGRATED software
- Abstract
This is the author's second bibliography on process capability indices (PCIs) and contains approximately 1080 journal papers and books for the period 2010–2021. The related literature is classified into six major categories, namely, books, review/overview papers, theory‐ and method‐related papers, special applications, software packages, and papers omitted in the author's previous bibliography. Theory‐ and method‐related papers are further classified into univariate, multivariate, and functional PCI‐related papers. Special applications include acceptance sampling, control charts, supplier selection, and tolerance design and other optimizations. The present bibliography consists of two parts. Part I contains books, review/overview papers, and univariate PCI‐related papers, while Part II includes multivariate and functional PCI‐related papers, special applications, software packages, and papers omitted in the author's previous bibliography. [ABSTRACT FROM AUTHOR]
- Published
- 2023
- Full Text
- View/download PDF
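As context for the univariate indices this bibliography catalogues, here is a minimal sketch of the two textbook capability indices, Cp and Cpk; the data and specification limits are hypothetical, and the formulas are the standard definitions rather than a method from any listed paper.

```python
import numpy as np

def cp_cpk(x, lsl, usl):
    """Textbook univariate process capability indices.

    Cp  = (USL - LSL) / (6 * sigma)                   -- potential capability
    Cpk = min(USL - mean, mean - LSL) / (3 * sigma)   -- actual capability
    """
    mu, sigma = np.mean(x), np.std(x, ddof=1)  # sample estimates
    cp = (usl - lsl) / (6 * sigma)
    cpk = min(usl - mu, mu - lsl) / (3 * sigma)
    return cp, cpk

# Hypothetical in-spec process data
rng = np.random.default_rng(1)
x = rng.normal(loc=10.02, scale=0.05, size=200)
print(cp_cpk(x, lsl=9.85, usl=10.15))  # Cp ~ 1.0, Cpk slightly lower
```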
5. A bibliography of the literature on process capability indices (PCIs): 2010–2021, Part II: Multivariate PCI‐ and functional PCI‐related papers, special applications, software packages, and omitted papers.
- Author
- Yum, Bong‐Jin
- Subjects
- *PROCESS capability, *BIBLIOGRAPHY, *INTEGRATED software
- Abstract
This is the second part of the bibliography on process capability indices (PCIs) for the period 2010–2021, and includes multivariate and functional PCI‐related papers, special applications, software packages, and papers omitted in the author's previous bibliography. [ABSTRACT FROM AUTHOR]
- Published
- 2023
- Full Text
- View/download PDF
6. QREI's Best Paper Awards 2020.
- Author
- Tang, Loon Ching
- Subjects
- STATISTICAL process control, ACCELERATED life testing, RADIAL basis functions
- Abstract
The judging panel comprised Prof. Frank Coolen of Durham University, Prof. Enrique Del Castillo of Penn State University, Prof. Marcus Perry of University of Alabama, Prof. Fugee Tsung of HKUST, Prof. Bill Woodall of Virginia Tech, and Prof. Enrico Zio of Politecnico di Milano. Our judges not only gave very insightful comments and assessments but also pointed out that the list of papers is quite diverse, and we are not comparing apples with apples. Professor Woodall, who has been on QREI's editorial board for many years, suggested that instead of selecting only the best paper, it would be fairer (and easier to make the call) to select one in each of the three areas, namely, SPC, DOE, and reliability. [Extracted from the article]
- Published
- 2020
- Full Text
- View/download PDF
7. A review of a smart condition monitoring and control system destined for harsh environments (expanded version of a paper presented at the 2nd International Conference on the Control of Industrial Processes, Newcastle, March 1999).
- Author
- Horler, Greg
- Subjects
- *THERMOCOUPLES, *SAMPLING (Process), *TEMPERATURE, *COMMUNICATION, *THERMOELECTRIC apparatus & appliances
- Abstract
This paper describes the need for wireless piston telemetry together with elements of the design, development and operation of such a system. A specific feature of the system is the multiplicity of operating modes enabled by two-way communication. The system has been demonstrated to work with thermocouples and accelerometers embedded in the piston of a very small engine at speeds of over 2000 rev min−1. The piston-mounted components can be fitted to a piston as small as 80 mm diameter, and size reductions are anticipated with improvements in implementation technologies. Typical results are quoted in the paper. Patents for this system are pending. Copyright © 2000 John Wiley & Sons, Ltd. [ABSTRACT FROM AUTHOR]
- Published
- 2000
- Full Text
- View/download PDF
8. Quality Improvement from the Viewpoint of Statistical Method (based on a presentation given at the second ENBIS Conference, Rimini, September 2002).
- Author
- De Mast, Jeroen
- Subjects
- *QUALITY assurance, *SIX Sigma, *INDUSTRIES, *PROJECT management, *CONSTRUCTION
- Abstract
With the purpose of guiding professionals in conducting improvement projects in industry, several quality improvement strategies have been proposed which strongly rely on statistical methods. Examples are the Six Sigma programme, the Shainin System and Taguchi's methods. This paper seeks to make a rational reconstruction of these types of improvement strategies, which results in a methodological framework. The paper gives a demarcation of the subject of study and proposes a reconstruction research approach. Thereupon, the elements of the methodological framework are listed and briefly discussed. Finally, the effectiveness of the framework is illustrated by showing to what extent it reconstructs Six Sigma's Breakthrough Cookbook. Copyright © 2003 John Wiley & Sons, Ltd. [ABSTRACT FROM AUTHOR]
- Published
- 2003
- Full Text
- View/download PDF
9. Robust Design Methodology: Status in the Swedish Manufacturing Industry (based on a presentation given at the second ENBIS Conference, Rimini, September 2002).
- Author
- Gremyr, Ida, Arvidsson, Martin, and Johansson, Per
- Subjects
- *ROBUST control, *MANUFACTURING processes, *INDUSTRIES, *ROBUST statistics
- Abstract
While robust design methodology is a fairly common subject in the literature on quality and statistics, this paper shows that only 17% of Swedish manufacturing companies apply robust design methodology. The low level of use of robust design methodology is surprising since a majority of the companies think it is important to minimize performance variation. However, although knowledge and use of robust design methodology is poor, methods suitable in robust design methodology are often used. For example, 53% of the companies use design of experiments to some extent. The data presented in this paper were collected in a telephone survey of 105 companies in the Swedish manufacturing industry; the response rate was 83%. The sample was stratified with respect to company size, and it is shown that robust design methodology and related methods are used to a greater extent in large companies than in small and medium-sized ones. Copyright © 2003 John Wiley & Sons, Ltd. [ABSTRACT FROM AUTHOR]
- Published
- 2003
- Full Text
- View/download PDF
10. An examination of the potential role of the Internet in distributed SPC and quality systems (modified version of a paper presented at the 2nd Conference on the Control of Industrial Processes, 1998).
- Author
- Thompson, D. M., Homer, G. R., and Thelwall, M.
- Subjects
- *STATISTICAL process control, *INTERNET, *EXTRANETS (Computer networks), *INDUSTRIES, *MANUFACTURING processes
- Abstract
Statistical process control (SPC) techniques have been used with varying degrees of success in many sectors of manufacturing industry for many years. This paper examines their use in one specific sector of manufacturing industry, the steel cold rolling industry, and suggests how Internet-based technologies could be integrated into the SPC process in order to provide a more responsive, customer-centred system. The authors indicate how this approach is already being implemented in certain service sectors and propose a wider application of the approach. In order to set the context, the paper includes a brief description of how one particular manufacturing sector, the automotive supply chain, has implemented and benefited from the integration of Internet-based techniques into its business and manufacturing processes. Copyright © 2000 John Wiley & Sons, Ltd. [ABSTRACT FROM AUTHOR]
- Published
- 2000
- Full Text
- View/download PDF
11. Criticality analysis revisited (a significantly shorter version of the paper originally submitted; a copy of the full paper is available from the authors on application).
- Author
- Moss, T. R. and Woodhouse, J.
- Subjects
- *CRITICALITY (Nuclear engineering), *FUZZY logic, *SYSTEMS design, *RELIABILITY in engineering, *OPERATIONS research
- Abstract
Criticality analysis is applied in risk and reliability studies to rank decisions on system design and operation. There is a wide variety of methods used to meet the requirements of different organizations. Most methods feature an initial assessment of the consequences of failure and its probability of occurrence; however, other factors may also be applied to provide a more robust analysis applicable to each specific situation. As well as assessing system criticality during the design phase, it is also necessary to continue to evaluate system and equipment criticality during operation so that availability can be maximized. Some alternatives to the well-known MIL-STD-1629A approach are considered in this paper. Copyright © 1999 John Wiley & Sons, Ltd. [ABSTRACT FROM AUTHOR]
- Published
- 1999
- Full Text
- View/download PDF
12. Six Sigma: 20 Key Lessons Learned (slightly modified version of: Hahn GJ. 20 key lessons learned: Experience shows what works and does not work. Six Sigma Forum Magazine 2002; (May). Reprinted with permission).
- Author
- Hahn, G. J.
- Subjects
- *SIX Sigma, *QUALITY control standards, *PROCESS control systems, *BEST practices, *TOTAL quality management
- Abstract
This paper discusses 20 key lessons learned about Six Sigma in the 20 years since its introduction. It was previously published in the May 2002 issue of Six Sigma Forum Magazine. Copyright © 2005 John Wiley & Sons, Ltd. [ABSTRACT FROM AUTHOR]
- Published
- 2005
- Full Text
- View/download PDF
13. Bayesian Analysis and Prediction of Failures in Underground Trains (based on a presentation given at the second ENBIS Conference, Rimini, September 2002).
- Author
- Pievatolo, Antonio, Ruggeri, Fabrizio, and Argiento, Raffaele
- Subjects
- *RAILROAD trains, *FAILURE analysis, *BAYESIAN analysis, *POISSON processes, *MONTE Carlo method, *TRANSPORTATION
- Abstract
Based upon failure data (date and odometer reading) in the early operating time, a public transportation company is interested in checking the actual reliability of the door opening system of its subway trains before their warranty expires. We consider non-homogeneous Poisson processes with a double scale because both time and kilometres are recorded for each failure. Different choices to model the relation between operated time and kilometres run are possible: in this paper the kilometres run are incorporated within the intensity function as a random function of time, modelled as a gamma process. Furthermore a periodic component is introduced to deal with the seasonality in the data. Bayesian inference is then carried out via Monte Carlo simulation, obtaining prediction intervals for the expected number of failures during periods of desired length, using only part of the data. The predictions are then compared with the observed data. Copyright © 2003 John Wiley & Sons, Ltd. [ABSTRACT FROM AUTHOR]
- Published
- 2003
- Full Text
- View/download PDF
14. Assessing the Effect of Time in Factorial Designs: The Practical Application (based on a presentation given at the second ENBIS Conference, Rimini, September 2002).
- Author
- Chatfield, M. J. and Owen, M. R.
- Subjects
- *FACTORIAL experiment designs, *TIME-domain analysis, *CHEMICAL process control, *CHEMICAL processes, *QUANTITATIVE research
- Abstract
The scientist asks a statistician the question: ‘should we incorporate time as a factor in our factorial designs or measure the response at various time-points?’ Advantages and disadvantages of the approaches are discussed. The aspects to be considered include: the relative costs of performing the experiment, the response measurement and the statistical analysis; the quality of the data collected and the information to be gained from the experimentation; implications for possible designs; implications for setting up, analysis and interpretation of designs in standard software available to scientists; and pitfalls to avoid. The field of chemical process investigation is used to illustrate some of the issues involved. In some scientific areas the experimental or resource issues may be overriding and the answer obvious. For the chemist, in some cases the only choice will be the factorial design which includes time as a factor due to experimental reasons but in most cases the appropriate answer depends on careful consideration of the aspects listed above. The paper illustrates the importance of a partnership between scientist and statistician in addressing the practical application of statistical techniques. Copyright © 2003 John Wiley & Sons, Ltd. [ABSTRACT FROM AUTHOR]
- Published
- 2003
- Full Text
- View/download PDF
15. Visually Mining Off-line Data for Quality Improvement (based on a presentation given at the second ENBIS Conference, Rimini, September 2002).
- Author
- Porzio, Giovanni C. and Ragozini, Giancarlo
- Subjects
- *MANUFACTURING processes, *STATISTICAL process control, *DATA mining, *QUALITY control, *INDUSTRIES
- Abstract
Highly automated modern manufacturing processes are yielding large databases with records on hundreds of process variables and product characteristics. This large amount of information calls for new approaches to production process analysis. In this paper, we discuss why a data mining framework can be appropriate for this goal, and we propose a visual data mining strategy to mine large and high-dimensional off-line data sets. The strategy allows users to achieve a deeper process understanding through a set of linked interactive graphical devices, and is illustrated within an industrial process case study. Copyright © 2003 John Wiley & Sons, Ltd. [ABSTRACT FROM AUTHOR]
- Published
- 2003
- Full Text
- View/download PDF
16. The Use of Observational Data to Implement an Optimal Experimental Design (based on a presentation given at the second ENBIS Conference, Rimini, September 2002).
- Author
- Berni, Rossella
- Subjects
- *EXPERIMENTAL design, *SEQUENTIAL analysis, *QUANTITATIVE research, *MULTILEVEL models, *RANDOM variables
- Abstract
This paper is focused on observational data and on the use of large data sets to implement an experimental design without additional runs, for an efficient use of these data. More specifically, the proposed procedure is based on several steps aimed at avoiding problems such as lack of randomization and inefficiency, and at obtaining a final experimental design that can be regarded as optimal from the point of view of sequential optimality. Copyright © 2003 John Wiley & Sons, Ltd. [ABSTRACT FROM AUTHOR]
- Published
- 2003
- Full Text
- View/download PDF
17. Weibull prediction of a future number of failures (presentation of this work at the 1995 Joint Statistical Meetings received an Honorable Mention Outstanding Presentation Award, third among 60 contributed papers, from the Section on Physical and Engineering Sciences of the American Statistical Association).
- Author
- Nelson, Wayne
- Subjects
- *WEIBULL distribution, *FAILURE analysis, *PREDICTION models, *CUMULANTS, *PARAMETER estimation
- Abstract
This paper presents simple new prediction limits for the number of failures that will be observed in a future inspection of a sample of units. The past data consist of the cumulative number of failures in a previous inspection of the same sample of units. Life of such units is modelled with a Weibull distribution with a given shape parameter value. Copyright © 2000 John Wiley & Sons, Ltd. [ABSTRACT FROM AUTHOR]
- Published
- 2000
- Full Text
- View/download PDF
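The record above concerns predicting a future number of failures under a Weibull model with a given shape parameter. As a hedged illustration of the underlying idea (not Nelson's actual prediction limits), the sketch below computes the conditional Weibull probability that a unit surviving a first inspection fails by a second one, plus a naive binomial prediction interval for the future failure count; all parameter values are invented.

```python
import numpy as np
from scipy.stats import binom

def expected_future_failures(n_surv, t0, t1, beta, eta):
    """Expected failures among n_surv units that survived to t0, by time t1,
    for a Weibull(beta, eta) life distribution (conditional probability)."""
    # P(fail by t1 | survived to t0) = 1 - S(t1)/S(t0) under Weibull
    p = 1.0 - np.exp((t0 / eta) ** beta - (t1 / eta) ** beta)
    return n_surv * p, p

# Hypothetical numbers: 95 survivors at 1000 h, predict failures by 2000 h
m, p = expected_future_failures(n_surv=95, t0=1000, t1=2000, beta=1.5, eta=8000)
lo, hi = binom.interval(0.90, 95, p)  # naive 90% binomial prediction band
print(round(m, 1), (lo, hi))
```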
18. Announcing the establishment of QREI Best Paper Award.
- Author
- Tang, Loon Ching
- Subjects
- PUBLISHING, PERIODICAL awards, PERIODICAL articles, PERIODICAL subscriptions, AUTHOR-publisher relations
- Abstract
The article offers information on the editor's plans for establishing the periodical's Best Paper Award. Topics discussed include the awarding of an official Certificate of Award to all authors of the best paper; the evaluation criteria for selecting the best paper, such as insights for decision making and novelty of ideas; and a one-year online subscription to the journal as part of the award.
- Published
- 2019
- Full Text
- View/download PDF
19. Do we need a PEM reliability model? (published by ASME in Advances in Electronic Packaging 1999, the Proceedings of InterPACK, June 1999).
- Author
- Hakim, Edward B.
- Subjects
- *RELIABILITY in engineering, *INTEGRATED circuits, *MICROELECTRONICS, *ACCELERATED life testing, *MOISTURE meters, *CORROSION resistant materials
- Abstract
Reliability prediction models for microcircuits have been a function of steady state temperature. Failure rates generated from accelerated temperature tests were extrapolated to predict system reliability at system use temperatures. This is now known to be completely inaccurate. Attempts are now being made to predict the reliability of plastic-encapsulated microcircuits (PEMs) based on accelerated temperature/humidity testing. Failure rates generated owing to corrosion failure mechanisms at these high stress levels are then extrapolated and used to predict system reliability at use temperature/humidity conditions. This paper discusses the fallacy of this approach. A new concept for assurance of PEM corrosion resistance is proposed. It will be shown that today's best commercial practice suppliers have already addressed the design, materials and processing issues of moulded packaged microcircuits, and corrosion is no longer a mechanism of concern to the user. Copyright © 1999 John Wiley & Sons, Ltd. [ABSTRACT FROM AUTHOR]
- Published
- 1999
- Full Text
- View/download PDF
20. Sensitivity analysis of availability estimates to input data characterization using design of experiments (produced under the auspices of the US Government and therefore not subject to copyright in the US).
- Author
- Durkee, Darren P., Pohl, Edward A., and Mykytka, Edward F.
- Subjects
- *SCIENTIFIC experimentation, *EXPERIMENTAL design, *RELIABILITY in engineering, *ENGINEERING, *QUALITY control
- Abstract
Reliability analysts are often faced with the challenge of characterizing the behaviour of system components based on limited data. The purpose of this study is to provide insight into which availability model input data are most significant and how many data are necessary to achieve desired accuracy requirements. The overall goal is to improve the efficiency and cost-effectiveness of the data collection and data characterization processes. A 2^(5−1) resolution V fractional factorial experiment was conducted to determine which of five input data characterization factors (for a simple series–parallel structure) may significantly affect availability model accuracy. The results from this experiment show that in this instance the factors under study do not have a significant effect on model output accuracy. Additional research is planned to more closely scrutinize the effects of these factors. © 1998 John Wiley & Sons, Ltd. This paper was produced under the auspices of the US Government and is therefore not subject to copyright in the US. [ABSTRACT FROM AUTHOR]
- Published
- 1998
- Full Text
- View/download PDF
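For readers unfamiliar with the 2^(5−1) resolution V design mentioned in the abstract above, the following sketch generates the textbook half fraction with generator E = ABCD (defining relation I = ABCDE); it is a generic construction, not the study's actual design matrix.

```python
import itertools
import numpy as np

def half_fraction_2_5_1():
    """Generate a 2^(5-1) resolution V design via the generator E = ABCD
    (defining relation I = ABCDE): a textbook construction."""
    base = np.array(list(itertools.product([-1, 1], repeat=4)))  # full 2^4 in A..D
    e = base.prod(axis=1, keepdims=True)                         # E = ABCD
    return np.hstack([base, e])

d = half_fraction_2_5_1()
print(d.shape)                       # (16, 5): 16 runs, 5 factors
print((d.prod(axis=1) == 1).all())   # defining relation I = ABCDE holds
```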
21. Influence of database mistakes on journal citation analysis: remarks on the paper by Franceschini and Maisano, QREI (2010).
- Author
- Franceschini, Fiorenzo and Maisano, Domenico
- Subjects
- *DATABASES, *BIBLIOMETRICS, *SCIENCE publishing, *CITATION analysis, *PUBLISHING
- Abstract
This short note contains some remarks on a recent bibliometric survey about some of the major scientific journals in the field of Quality Engineering/Management (Qual. Reliab. Engng. Int. 2010; 26(6):593-604). In particular, thanks to an indication from Professor Woodall, it has recently been noticed that some results in the original work are biased by mistakes in the bibliometric databases (in this case Google Scholar). After a careful examination and correction of the biased data, a synthetic analysis of the typical mistakes of bibliometric databases is presented, focussing attention on the importance of using robust bibliometric indicators. Copyright © 2010 John Wiley & Sons, Ltd. [ABSTRACT FROM AUTHOR]
- Published
- 2011
- Full Text
- View/download PDF
22. Discussion (3): Jones–Johnson Paper.
- Author
- LOEPPKY, JASON L. and WILLIAMS, BRIAN J.
- Subjects
- *GAUSSIAN processes, *COMPUTER simulation, *EXPERIMENTS, *SIMULATION methods & models, *MATHEMATICAL models
- Abstract
This article presents a discussion on a study by Bradley Jones and Rachel T. Johnson on the use of the Gaussian process model in computer simulation research. The authors believe that the uncertainty intervals on unsampled points generated by using a plug-in approach from maximum likelihood estimates may be too short. Their own experiences confirm that computer models are widely used in many areas of science and engineering, and the need to understand the performance of such models is critical.
- Published
- 2009
- Full Text
- View/download PDF
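The discussion above cautions that plug-in uncertainty intervals derived from maximum likelihood estimates may be too short. A minimal sketch of that plug-in practice follows: a generic Gaussian-process (kriging) predictor with arbitrary fixed hyperparameters on an invented one-dimensional "computer model". It is not the discussants' code, and the interval width ignores hyperparameter uncertainty, which is precisely their point.

```python
import numpy as np

def gp_predict(X, y, Xs, ls=0.3, var=1.0, noise=1e-8):
    """GP interpolation with a squared-exponential kernel and *plug-in*
    hyperparameters (ls, var chosen arbitrarily, not estimated)."""
    def k(a, b):
        d2 = (a[:, None] - b[None, :]) ** 2
        return var * np.exp(-0.5 * d2 / ls ** 2)
    K = k(X, X) + noise * np.eye(len(X))   # jitter for numerical stability
    Ks = k(Xs, X)
    alpha = np.linalg.solve(K, y)
    mean = Ks @ alpha
    cov = k(Xs, Xs) - Ks @ np.linalg.solve(K, Ks.T)
    sd = np.sqrt(np.clip(np.diag(cov), 0, None))
    return mean, mean - 2 * sd, mean + 2 * sd  # plug-in ~95% intervals

X = np.array([0.0, 0.25, 0.5, 0.75, 1.0])
y = np.sin(2 * np.pi * X)                  # deterministic toy "simulator"
m, lo, hi = gp_predict(X, y, np.linspace(0, 1, 5))
print(np.round(m, 3))
```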
23. Discussion (2): Jones–Johnson Paper.
- Author
- QIAN, PETER Z. G. and WU, C. F. JEFF
- Subjects
- *GAUSSIAN processes, *COMPUTER simulation, *EXPERIMENTS, *STOCHASTIC processes, *RESEARCH methodology
- Abstract
This article presents a discussion on a study by Bradley Jones and Rachel T. Johnson on the use of the Gaussian process model in computer simulation research. The authors believe that the researchers presented a useful survey of the design and analysis of Gaussian process models for computer experiments. They provide a methodology for extending the range of inquiry of computer experiments to include categorical factors.
- Published
- 2009
- Full Text
- View/download PDF
24. A Bibliography of Process Capability Papers.
- Author
- Spiring, Fred, Leung, Bartholomew, Cheng, Smiley, and Yeung, Anthony
- Subjects
- *BIBLIOGRAPHY, *INFORMATION resources, *ENGINEERING, *INDUSTRIAL arts, *TECHNOLOGY
- Abstract
Presents a bibliography of process capability books, manuals and articles, in relation to the engineering sector. "Process Capability Indices in Theory and Practice," by C. Lovelace and S. Kotz; "Measuring Process Capability," by D. R. Bothe; "Process Capability Indices," by N. L. Johnson and S. Kotz.
- Published
- 2003
- Full Text
- View/download PDF
25. Statistical Efficiency: The Practical Perspective (based on a presentation given at the second ENBIS Conference, Rimini, September 2002).
- Author
- Kenett, Ron S., Coleman, Shirley, and Stewardson, Dave
- Subjects
- *QUALITY control, *STATISTICAL hypothesis testing, *RELIABILITY in engineering, *STRUCTURAL analysis (Engineering), *QUANTITATIVE research
- Abstract
The idea of adding a practical perspective to the mathematical definition of statistical efficiency is based on a suggestion by Churchill Eisenhart who, years ago, in an informal ‘Beer and Statistics’ seminar, gave a new definition of statistical efficiency. Later Bruce Hoadley from Bell Laboratories picked up where Eisenhart left off and added his version, nicknamed ‘Vador’. Blan Godfrey, former CEO of the Juran Institute, more or less used Hoadley's idea during his Youden Address at the Fall Technical Conference of the American Society for Quality Control. We expand on this idea by adding an additional component, the value of the data actually collected, which we believe is critical to the overall idea. The concept of Practical Statistical Efficiency (PSE) derived from these developments is introduced and demonstrated using five case studies. We suggest that PSE be considered before, during and after undertaking any quality improvement projects. Copyright © 2003 John Wiley & Sons, Ltd. [ABSTRACT FROM AUTHOR]
- Published
- 2003
- Full Text
- View/download PDF
26. A Comparison of Shewhart Individuals Control Charts Based on Normal, Non-parametric, and Extreme-value Theory (based on a presentation given at the second ENBIS Conference, Rimini, September 2002).
- Author
- Vermaat, M. B. (Thijs), Ion, Roxana A., Does, Ronald J. M. M., and Klaassen, Chris A. J.
- Subjects
- *QUALITY control charts, *STATISTICAL process control, *STATISTICAL bootstrapping, *KERNEL functions, *EXTREME value theory
- Abstract
Several control charts for individual observations are compared. Traditional ones are the well-known Shewhart individuals control charts based on moving ranges. Alternative ones are non-parametric control charts based on empirical quantiles, on kernel estimators, and on extreme-value theory. Their in-control and out-of-control performance are studied by simulation combined with computation. It turns out that the alternative control charts are not only quite robust against deviations from normality but also perform reasonably well under normality of the observations. The performance of the Empirical Quantile control chart is excellent for all distributions considered, if the Phase I sample is sufficiently large. Copyright © 2003 John Wiley & Sons, Ltd. [ABSTRACT FROM AUTHOR]
- Published
- 2003
- Full Text
- View/download PDF
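To make the comparison above concrete, here is a minimal sketch of two of the chart types compared: the classic moving-range-based individuals chart and an empirical-quantile alternative. The Phase I data are simulated and deliberately skewed; the constants used (2.66 = 3/d2 with d2 = 1.128, and the usual 0.27% two-sided false-alarm rate) are textbook values, not the authors' exact procedures.

```python
import numpy as np

def individuals_limits(phase1):
    """Classic Shewhart individuals (X) chart from moving ranges:
    center +/- 2.66 * mean moving range (2.66 = 3/d2, d2 = 1.128 for n=2)."""
    mr = np.abs(np.diff(phase1))
    center = phase1.mean()
    half_width = 2.66 * mr.mean()
    return center - half_width, center + half_width

def empirical_quantile_limits(phase1, alpha=0.0027):
    """Non-parametric alternative: Phase I empirical quantiles matched to
    the usual 0.27% two-sided false-alarm rate (needs a large Phase I)."""
    return np.quantile(phase1, [alpha / 2, 1 - alpha / 2])

rng = np.random.default_rng(7)
phase1 = rng.gamma(shape=4, scale=1.0, size=5000)  # skewed, non-normal
print(individuals_limits(phase1))         # normal-theory limits
print(empirical_quantile_limits(phase1))  # quantile-based limits differ
```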
27. Simulation Models for Robust Design Using Location Depth Methods (based on a presentation given at the second ENBIS Conference, Rimini, September 2002).
- Author
- Stadlober, Ernst, Kocher, Michael, and Rappitsch, Gerhard
- Subjects
- *ELECTRONIC circuit design, *SEMICONDUCTOR industry, *COST effectiveness, *ROBUST control, *DESIGN, *ELECTRONIC circuits
- Abstract
For a cost-effective production of integrated circuits, one important aspect is the accurate simulation of electronic circuits with regard to process variation. Process variation is described as the range of simulation (SPICE) parameters, but to reduce the costs of simulation they are replaced by easily available e-test parameters. An approximate algorithm for the location depth (multivariate ranking) selects all data points with location depth less than or equal to one as boundary points for the multidimensional data set of e-test parameters. The corresponding SPICE parameter values are simply obtained by linear mapping. To increase the robustness of the simulation the region covered by the set of boundary points is extended by determining the point with the deepest location (multivariate median) and adding to each boundary vector a fixed portion of the vector from the median to the boundary vector. This natural extension also covers moderate process shifts without changing the covariance structure of the data. These methods are integrated into an automated generation flow to be applicable in a production and circuit design environment. The statistical methods are validated by simulation experiments of typical analog/mixed-signal circuit designs. Copyright © 2003 John Wiley & Sons, Ltd. [ABSTRACT FROM AUTHOR]
- Published
- 2003
- Full Text
- View/download PDF
28. An alternative rule for placement of empirical points on Weibull probability paper.
- Author
- Drapella, Antoni and Kosznik, Sylwia
- Subjects
- *PROBABILITY theory, *WEIBULL distribution, *MONTE Carlo method, *ESTIMATION theory, *MATHEMATICAL transformations, *DISTRIBUTION (Probability theory)
- Abstract
This paper proposes an alternative rule for placing empirical points on Weibull probability paper. Following the standard or old rule, one takes the mean or median value of the order statistics first, then performs the Weibull transformation. The new rule is to make the Weibull transformation first, then take the expected value of the transformed order statistics. Although the new rule for placing points requires much more complicated calculations, a simple Monte Carlo experiment shows that the new rule estimator of the shape parameter is unbiased, whereas the old rule estimator turns out to be heavily biased. Copyright © 1999 John Wiley & Sons, Ltd. [ABSTRACT FROM AUTHOR]
- Published
- 1999
- Full Text
- View/download PDF
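As a point of reference for the abstract above, the sketch below implements only the standard "old rule" placement (median ranks via Benard's approximation, then the Weibull transform) with invented failure times. The paper's "new rule", which takes expected values of the transformed order statistics, requires order-statistic moments and is not reproduced here.

```python
import numpy as np

def weibull_paper_points(failures):
    """Standard ('old rule') placement on Weibull probability paper:
    median-rank plotting positions, then the transform y = ln(-ln(1 - F))."""
    t = np.sort(np.asarray(failures, dtype=float))
    n = len(t)
    i = np.arange(1, n + 1)
    f = (i - 0.3) / (n + 0.4)          # Benard's median-rank approximation
    x = np.log(t)                      # abscissa on Weibull paper
    y = np.log(-np.log(1.0 - f))       # ordinate
    slope, intercept = np.polyfit(x, y, 1)
    # On Weibull paper, slope = shape beta and intercept = -beta * ln(eta)
    return slope, np.exp(-intercept / slope)

# Hypothetical failure times (hours)
print(weibull_paper_points([120, 190, 250, 300, 420, 510]))
```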
29. WEIBULL PROBABILITY PAPER USED ON HIGHLY STRESSED BIMODAL COMPONENTS.
- Author
- Drapella, Antoni
- Subjects
- *WEIBULL distribution, *SYSTEM failures, *DISTRIBUTION (Probability theory), *RELIABILITY in engineering, *ENGINEERING, *SYSTEMS engineering
- Abstract
Under high stress conditions the 'freak' and 'strong' subpopulations lie very close to each other on the time scale. It is easy to 'read' such a time-to-failure distribution as unimodal, especially when probability paper is used. This paper puts forward Parzen's estimator of the probability density function as a very useful method of indicating the 'freak' subpopulation. [ABSTRACT FROM AUTHOR]
- Published
- 1985
- Full Text
- View/download PDF
30. An addendum on "Three‐level designs: Evaluation and comparison for screening purposes".
- Author
- Alomair, Mohammed, Georgiou, Stelios, and Stylianou, Stella
- Subjects
- FACTORIAL experiment designs
- Abstract
In our recent paper "Three‐Level Designs: Evaluation and Comparison for Screening Purposes", by the same authors, we investigated and compared the original definitive screening designs and the designs that were constructed from weighing matrices. For this purpose, we used the generalized resolution and minimum aberration criteria as well as the projection estimation capacity criterion from the literature to perform the comparison. After our paper was accepted, it came to our attention that there were four significant papers, published by a number of authors, having related content and these were not mentioned in our paper. With this short addendum, we would like to kindly acknowledge the important contributions from those authors in those newly discovered papers in recent literature and also to include them in the related reference list. [ABSTRACT FROM AUTHOR]
- Published
- 2021
- Full Text
- View/download PDF
31. Discussion (1): Jones–Johnson Paper.
- Author
- MORRIS, MAX D.
- Subjects
- *GAUSSIAN processes, *COMPUTER simulation, *EXPERIMENTS, *DESIGN, *MATHEMATICAL models
- Abstract
This article presents a discussion on a study by Bradley Jones and Rachel T. Johnson on the use of the Gaussian process model in computer simulation research. The author believes that the researchers offer a clear and helpful introduction to ideas and methods used in the design and analysis of computer experiments. He provides some additional points of importance regarding analysis, design and computing.
- Published
- 2009
- Full Text
- View/download PDF
32. Discussion (4): Jones–Johnson Paper.
- Author
- STEINBERG, DAVID M.
- Subjects
- *GAUSSIAN processes, *COMPUTER simulation, *EXPERIMENTS, *SIMULATION methods & models, *RADIOISOTOPES
- Abstract
This article presents a discussion on a study by Bradley Jones and Rachel T. Johnson on the use of the Gaussian process model in computer simulation research. The author believes that sometimes a computer simulation run is not expensive. He references a study that examined migration of radionuclides into ground water from a nuclear waste repository.
- Published
- 2009
- Full Text
- View/download PDF
33. Discussion (5): Jones–Johnson Paper.
- Author
- NOTZ, WILLIAM
- Subjects
- *GAUSSIAN processes, *COMPUTER simulation, *EXPERIMENTS, *SIMULATION methods & models, *MATHEMATICAL models
- Abstract
This article presents a discussion on a study by Bradley Jones and Rachel T. Johnson on the use of the Gaussian process model in computer simulation research. The author believes that the Gaussian process model often performs adequately despite apparent nonstationarity. He explores the topic of model calibration, the relationship between the design and the model used to fit the data, and design augmentation.
- Published
- 2009
- Full Text
- View/download PDF
34. Accuracy analysis of satellite antenna panel expansion based on BP neural network.
- Author
- Qian, Hua‐Ming, Zhang, Hua, Huang, Tudi, Huang, Hong‐Zhong, and Wang, Ke
- Subjects
- ANTENNAS (Electronics), NONLINEAR equations, TELECOMMUNICATION satellites
- Abstract
Large deployable space mechanisms are widely used in the field of aerospace and have received increasingly high attention recently. The satellite antenna expansion system is the classic large deployable space mechanism. However, during the expansion of the satellite antenna deployable mechanism, the expansion accuracy is affected by various existing uncertain factors, which can even result in scrapping the satellite. For example, the hinge locking error has a significant influence on the deployment accuracy of the satellite antenna panel. To address this issue, and considering the advantage of the back‐propagation (BP) neural network for high‐dimensional nonlinear problems, this paper adopts it to analyze the impact of hinge locking error on the expansion accuracy of the antenna panel. The results show that the probability of the actual flatness deviation of the satellite antenna panel falling in the required accuracy area is 99.56%, and the probability of the actual pointing angle deviation falling in the required accuracy area is 99.84%. [ABSTRACT FROM AUTHOR]
- Published
- 2023
- Full Text
- View/download PDF
35. Discussion of 'Is designed data collection still relevant in the Big Data era?'.
- Author
- King, Caleb and Jones, Bradley
- Subjects
- ACQUISITION of data, BIG data, SOCIAL processes, EXPERIMENTAL design
- Abstract
Given the popularity of Big Data (BD), there can be an impression that fields such as design of experiments (DOE) are now irrelevant. We would like to thank the authors for starting the conversation about the possible relationship between these two fields. A key contribution of this paper is in showing how DOE principles, as summarized under the name designed data collection (DDC), can be applied throughout the BD process. This name is quite appropriate, demonstrating that these principles apply not just to designed experiments, but to any form of data collection. This is especially important for situations where designed experiments are either impossible (i.e., assessing how a country's economy may impact certain responses) or unethical (i.e., certain sensitive types of medical studies). It shows that DOE is more than a particular choice of design type, but is rather a methodology for approaching data collection, one that seeks to extract the most relevant information from the data while also taking into account the various nuances and constraints of physical and social processes, which are ever present, even in massive datasets. The paper divides BD efforts into three general phases: Before BD, During BD, and After BD. As such, we have grouped our discussion accordingly, with general comments provided for the suggested contributions of DDC in each phase. We then close with some additional thoughts. [ABSTRACT FROM AUTHOR]
- Published
- 2023
- Full Text
- View/download PDF
36. A probabilistic uncertain linguistic approach for FMEA‐based risk assessment.
- Author
- Tang, Yingwei, Zhou, Dequn, Zhu, Shichao, and Ouyang, Linhan
- Subjects
- *FAILURE mode & effects analysis, *SYSTEM failures, *DISTRIBUTION (Probability theory), *SENSITIVITY analysis, *RISK assessment
- Abstract
Failure Mode and Effect Analysis (FMEA) is acknowledged as a beneficial instrument for identifying and mitigating system failures. However, the traditional FMEA method has its limitations. For instance, crisp numbers fail to adequately represent the intricate information and cognitive nuances of experts. Additionally, the conventional approach overlooks the significance of weights assigned to FMEA experts and risk factors (RFs). Furthermore, the simplistic ranking of failure modes in traditional FMEA does not accurately reflect priorities. In light of these drawbacks, this paper introduces an innovative, fully data‐driven FMEA method, leveraging a probabilistic uncertain linguistic term sets (PULTSs) environment and the Weighted Aggregates Sum Product Assessment (WASPAS) method. In the assessment process, PULTSs serve as linguistic tools that express probability distribution, allowing for a more reasonable and precise description of information. To address the issue of weights for RFs, the regret theory and Modified CRITIC method are employed. Subsequently, the WASPAS method is applied to determine the risk rankings of failure modes. To illustrate the feasibility and rationality of this novel FMEA model, the paper includes an example involving the production of Lithium‐ion batteries. To emphasize the excellence of the proposed FMEA model, sensitivity and comparative analyses are carried out. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
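For orientation, the crisp baseline the paper above improves upon is the traditional risk priority number, RPN = severity × occurrence × detection, whose simplistic ranking the authors criticize. The failure modes and scores below are invented for illustration.

```python
# Traditional FMEA risk priority number (RPN = S x O x D): the crisp
# baseline whose limitations the paper above addresses. Scores invented.
failure_modes = {
    "electrode misalignment": (7, 4, 3),  # (severity, occurrence, detection)
    "electrolyte leakage":    (9, 2, 5),
    "separator defect":       (8, 3, 2),
}
rpn = {mode: s * o * d for mode, (s, o, d) in failure_modes.items()}
for mode, score in sorted(rpn.items(), key=lambda kv: -kv[1]):
    print(f"{mode:25s} RPN = {score}")
```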
37. Dynamic predictive maintenance strategy for multi‐component system based on LSTM and hierarchical clustering.
- Author
- Yaqiong, Lv, Pan, Zheng, Yifan, Li, and Xian, Wang
- Subjects
- *REMAINING useful life, *HIERARCHICAL clustering (Cluster analysis), *INDUSTRIALISM, *COST control, *MAINTENANCE costs
- Abstract
In recent years, there has been growing interest in employing predictive methods to forecast the remaining useful life of industrial equipment. However, the challenge lies in how to take advantage of dynamic predictive information to facilitate maintenance decision‐making. This problem becomes particularly challenging for complex industrial systems consisting of multiple components with economic dependencies. This paper aims at providing an effective maintenance strategy for multi‐component systems based on predictive information, while considering economic dependencies among different system components. To this end, a dynamic predictive maintenance (PdM) strategy that minimizes the mean maintenance cost over a decision period is proposed, where both long‐term and short‐term policies are integrated into the decision‐making framework. Specifically, the long‐term policy is formulated using predictions derived from historical degradation data through a Long Short‐Term Memory (LSTM) model. Concurrently, real‐time monitoring data is employed to forecast imminent degradation in components, serving as a basis for determining the necessity of short‐term adjustments. This paper embeds the consideration of economic dependencies among components within the maintenance strategy design and employs hierarchical clustering to establish an effective and efficient maintenance grouping policy. The experimental results demonstrate that our proposed strategy significantly outperforms conventional approaches, including block‐based and age‐based maintenance, resulting in substantial cost savings. The proposed strategy is also compared with a similar version without grouping, and the results verify the added value of the optimal maintenance grouping policy in cost reduction. Moreover, a comprehensive analysis of the proposed method is provided, including the impact of different inspection costs and inspection intervals on maintenance decision‐making, which can provide insightful guidance to various PdM scenarios in practice. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
38. A review of ranked set sampling and modified methods in designing control charts.
- Author
- Mohammadkhani, Atieh, Amiri, Amirhossein, and Khoo, Michael B. C.
- Subjects
- QUALITY control charts, LITERATURE reviews, SAMPLING methods, EVIDENCE gaps, UNITS of measurement
- Abstract
Statistical process monitoring (SPM) and the design of control charts involve measuring quality characteristics under time and cost constraints. Ranked set sampling (RSS) is a very efficient and inexpensive approach for obtaining a more representative sample when the actual measurement of the sampling units is too expensive or destructive, but it is easy to rank the observations. Since 1997, when the first paper on designing control charts based on the RSS scheme was presented, a significant number of papers have been published in this area. This paper provides a literature review on using RSS schemes in designing control charts. Relevant papers are classified and analyzed to identify the research gaps and provide recommendations and directions for future studies. [ABSTRACT FROM AUTHOR]
- Published
- 2023
- Full Text
- View/download PDF
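A minimal sketch of the balanced RSS scheme reviewed above follows, with judgment ranking idealized as exact and a simulated population; it illustrates only the sampling mechanics, not any specific control-chart design from the reviewed papers.

```python
import numpy as np

def ranked_set_sample(population, set_size, cycles, rng):
    """Balanced ranked set sampling: per cycle, draw set_size sets of
    set_size units, rank each set (here by true value, standing in for
    cheap judgment ranking), and measure the r-th ranked unit of the
    r-th set. Yields set_size * cycles measured units."""
    out = []
    for _ in range(cycles):
        for r in range(set_size):
            s = rng.choice(population, size=set_size, replace=False)
            out.append(np.sort(s)[r])  # measure only the r-th order statistic
    return np.array(out)

rng = np.random.default_rng(3)
pop = rng.lognormal(mean=1.0, sigma=0.6, size=100_000)
rss = ranked_set_sample(pop, set_size=4, cycles=25, rng=rng)
srs = rng.choice(pop, size=100, replace=False)  # same measured-unit cost
# The RSS mean is typically less variable than the SRS mean at equal size
print(rss.mean(), srs.mean(), pop.mean())
```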
39. Reliability‐based fault analysis models with industrial applications: A systematic literature review.
- Author
- Ahmed, Qadeer, Raza, Syed Asif, and Al‐Anazi, Dahham M.
- Subjects
- INDUSTRIAL applications, FAULT diagnosis, ROAD maps, COST control
- Abstract
Effective and early fault detection and diagnosis techniques have tremendously enhanced over the years to ensure continuous operations of contemporary complex systems, control cost, and enhance safety in assets‐intensive industries, including oil and gas, process, and power generation. The objective of this work is to understand the development of different fault detection and diagnosis methods, their applications, and benefits to the industry. This paper presents a contemporary state‐of‐the‐art systematic literature survey focusing on a comprehensive review of the models for fault detection and their industrial applications. This study uses advanced tools from bibliometric analysis to systematically analyze over 500 peer‐reviewed articles on focus areas published since 2010. We first present an exploratory analysis and identify the influential contributions to the field, authors, and countries, among other key indicators. A network analysis is presented to unveil and visualize the clusters of the distinguishable areas using a co‐citation network analysis. Later, a detailed content analysis of the top‐100 most‐cited papers is carried out to understand the progression of fault detection and artificial intelligence–based algorithms in different industrial applications. The findings of this paper allow us to comprehend the development of reliability‐based fault analysis techniques over time, and the use of smart algorithms and their success. This work helps to make a unique contribution toward revealing the future avenues and setting up a prospective research road map for asset‐intensive industry, researchers, and policymakers. [ABSTRACT FROM AUTHOR]
- Published
- 2021
- Full Text
- View/download PDF
40. Analysis and optimization of split‐plot operating window experiments: Methodology and application.
- Author
- Wang, Juan, Byun, Jai‐Hyun, and Ma, Yizhong
- Abstract
The method of Operating Window (OW) is a statistical‐engineering approach to improve the robustness and reliability of products/processes. Like many industrial experiments, operating window experiments are often conducted with a split‐plot structure in order to accommodate the nature of some experimental factors. Existing research on OW has paid little attention to this aspect of the OW experiments. In this paper we focus on the modeling and optimization of OW experiments by incorporating the split‐plot structure. For ease of reading, we use the ubiquitous paper feeder example to illustrate each step of modeling and optimization. First, we employ the generalized linear mixed effects models (GLMM) to model the complex error structure afforded by the split‐plot structure. Then we obtain statistically significant variables for each failure mode in the feeder example: misfeed and multifeed. These analysis results enable us to make inference about the predicted failure probability for each mode. The optimization step is performed by minimizing some performance measures proposed in the literature, especially the one by Joseph and Wu [1]. Performance measures for each control run are calculated and then modeled in terms of the identified control variables, which lead to the identification of optimal settings of these variables. This case study not only reveals the split‐plot structure common in industrial experiments and identifies the key factors for the process, but also provides some additional insights on the process. [ABSTRACT FROM AUTHOR]
- Published
- 2022
- Full Text
- View/download PDF
41. An importance measure of a CNC lathe considering failure correlations.
- Author
- Gu, Dongwei, Zhong, Yuhong, Xu, Zhen, Chen, Bingkun, and Wang, Zhiqiong
- Subjects
- STANDARD deviations, LATHES, REDUNDANCY in engineering, COPULA functions, NUMERICAL control of machine tools
- Abstract
Failure correlation is a physical phenomenon observed, for example, in CNC (computer numerical control) lathe applications. A reliability importance measure that accounts for failure correlation is an effective means of identifying the importance of CNC lathe subsystems. This paper fully considers the failure correlation among the subsystems of the CNC lathe. Based on the Copula function, the joint reliability model of each subsystem of the CNC lathe is established. Three methods (the D‐test, the average cumulative error, and the root mean square error) are used to compare the joint reliability models under different Copula functions, and it is ascertained that the Clayton Copula model describes the failure‐correlation reliability of the CNC lathe most accurately. The reliability dynamic importance measure can judge differences in CNC lathe subsystem importance more accurately. In this paper, the reliability dynamic core importance measure is proposed to analyze the potential of the CNC lathe subsystems for improvement. It determines that the servo subsystem is the crucial part of the CNC lathe for priority improvement. [ABSTRACT FROM AUTHOR]
- Published
- 2022
- Full Text
- View/download PDF
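As background for the abstract above, one common way to couple marginal subsystem reliabilities through a Clayton copula is sketched below; the Weibull marginals and dependence parameter are invented, and this is a generic construction rather than the authors' fitted model.

```python
import numpy as np

def clayton(u, v, theta):
    """Clayton copula C(u, v) = (u^-theta + v^-theta - 1)^(-1/theta), theta > 0."""
    return (u ** -theta + v ** -theta - 1.0) ** (-1.0 / theta)

def joint_reliability(t, theta, beta1, eta1, beta2, eta2):
    """Joint survival of two dependent subsystems: the copula applied to
    marginal Weibull reliabilities (hypothetical marginals and theta)."""
    r1 = np.exp(-(t / eta1) ** beta1)
    r2 = np.exp(-(t / eta2) ** beta2)
    return clayton(r1, r2, theta)

t = np.linspace(100, 5000, 5)
dep = joint_reliability(t, theta=2.0, beta1=1.8, eta1=6000, beta2=1.4, eta2=9000)
ind = joint_reliability(t, theta=0.05, beta1=1.8, eta1=6000, beta2=1.4, eta2=9000)
print(dep)  # positive dependence (theta = 2) raises joint survival...
print(ind)  # ...relative to the near-independence limit (theta -> 0)
```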
42. Implementation of compound optimal design in progressive first‐failure censored data.
- Author
- Dhameliya, Vaibhav N., Maurya, Raj Kamal, and Bhattacharya, Ritwik
- Subjects
- *DISTRIBUTION (Probability theory), *COST functions, *EXPONENTIAL functions, *ACQUISITION of data, *NEIGHBORHOODS
- Abstract
In many research studies, multiple objectives need to be considered simultaneously to ensure an effective and efficient investigation. A compound optimal design provides a viable solution to this problem, allowing for the maximization of overall benefits through the integration of several factors. The paper addresses the application of compound optimal designs in the context of progressive first‐failure censoring, with a particular focus on the Generalized Exponential distribution with two parameters. The paper provides an illustrative example of compound designs by considering the cost function along with trace, variance, and determinant of inverse Fisher information. The best design is determined using a graphical solution technique that is both comprehensible and precise. Using a simple example, we demonstrate the advantage of compound optimal designs over constraint optimal designs. Furthermore, the paper examines real‐world data collection to demonstrate the practical utility of compound optimal designs. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
43. A mixed uncertain structural reliability analysis method considering random and convex set variables with correlation.
- Author
- Ma, Hang, Bi, Junxi, Li, Haibin, Ge, Xinyu, Zhou, Dachuan, Jiao, Jiaming, and Wang, Guofu
- Subjects
- *STRUCTURAL reliability, *CONVEX sets, *RANDOM sets, *WIND turbine blades, *RELIABILITY in engineering
- Abstract
In this paper, a mixed model reliability analysis method is put forward for the problem of assessing the reliability of complex engineering structures containing both random and convex set variables. By integrating the ellipsoidal model with correlation and the interval model, the uncertainty region characterized by the ellipsoidal model with correlation is optimized with full consideration of the limited amount of engineering structure sample data, the risk region represented by the reliability model is redefined, a new mixed reliability assessment criterion is established, and the minimum safe nonprobability reliability index of the structure is built. A key reference for the optimal design of engineering structure reliability is offered by the safety limit diagram of constant reliability indexes, established here for the first time. The proposed mixed reliability model is compared and analyzed with three classical models. The sensitivity of nonprobability reliability indexes, influenced by random variable parameters and correlation coefficients, is analyzed. This verification confirms that the new reliability model not only provides an accurate assessment of engineering structures' reliability but also lowers the computational demands of engineering design. In this paper, aero‐engine blades and wind turbine blades are taken as examples to expound the validity of the reliability model built using this method and its importance to the structural safety analysis of actual engineering. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
44. Optimal maintenance policy considering imperfect switching for a multi‐state warm standby system.
- Author
- Qi, Fa‐Qun, Wang, Yun‐Ke, and Huang, Hong‐Zhong
- Subjects
- *FAILURE mode & effects analysis, *OPERATING costs, *MAINTENANCE
- Abstract
A novel maintenance policy for a two‐component warm standby system with multiple standby states is presented in this paper. Two standby states, that is, cold and warm standby, for components in the system are considered. Components can realize the transition from the cold standby to the warm standby state by periodic switching, intended to shorten recovery time and save system operating costs. Preventive maintenance (PM) and preventive switching (PS) of components are considered. In the PS strategy, the standby component can switch before the operating component fails. In addition, additional standby failure modes based on idle time are studied. The long‐term average cost of the system is derived through the semi‐regenerative process. A numerical example eventually verifies the feasibility of the maintenance and switching strategy proposed in this paper. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
45. Log‐location‐scale increment degradation model: A Bayesian perspective.
- Author
- Yu, I‐Tang and Wang, Kuei‐Mao
- Subjects
- *MARKOV chain Monte Carlo, *BAYESIAN analysis, *STOCHASTIC processes
- Abstract
Degradation modeling serves as a valuable tool for assessing the lifetime information of highly reliable products. One frequently employed approach for describing the degradation phenomenon involves the use of a degradation model that relies on stochastic processes. In a stochastic‐process‐based degradation model, it is assumed that the increments follow a distribution with the additivity property. This property makes further inferences mathematically and statistically tractable. However, it limits the choices of the distributions. This paper aims to use distributions without the additivity property to model the increments and explores distributions from the log‐location‐scale family. Within the framework of Bayesian analysis, Markov Chain Monte Carlo algorithms are developed for executing the necessary computations. Given that the proposed degradation models do not adhere to the additivity property, this paper tackles the challenges involved in predicting the lifetime of both on‐line and off‐line products. Two illustrative examples are subsequently analyzed to demonstrate the procedural steps outlined. The suitability of the proposed model is finally validated through a simulation study. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
46. Failure prediction of mechanical system based on meta‐action.
- Author
- Ran, Yan, Chen, Jingjie, Sagor, Nafis Jawyad, and Zhang, Genbao
- Subjects
- *MECHANICAL failures, *INDUSTRIAL robots
- Abstract
Highly reliable mechanical systems can lead to significant losses in the event of failure, and the lack of comprehensive failure data presents challenges for developing techniques such as critical part identification and failure prediction. In light of this, this paper proposes a meta‐action‐based fault prediction method that effectively addresses the issue of limited fault data. Initially, the mechanical system is decomposed utilizing the "Function‐Motion‐Action" (FMA) methodology to derive individual meta‐action units (MAUs). Subsequently, the limited sample of fault data from the mechanical system is combined with processed expert knowledge to construct the corresponding fault propagation‐directed graph. Furthermore, the key MAUs are determined by applying the Decision–Making Trial and Evaluation Laboratory (DEMATEL) method. Last, the degradation data of the key MAUs is acquired by monitoring them, and a non‐homogeneous discrete grey model (DNGM) integrated with an improved BP neural network is proposed to facilitate the fault prediction of MAUs. Using an industrial robot as a case study, the prediction results demonstrate the superiority of the method proposed in this paper over a single gray model and neural network, thereby providing a reliable prediction approach for anticipating the future trends of data‐deficient mechanical systems. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
47. A novel intelligent fault diagnosis method of rolling bearings based on capsule network with Fast Routing algorithm.
- Author
- Jiang, Guang‐Jun, Li, De‐Zhi, Li, Qi, and Sun, Hong‐Hua
- Subjects
- *CAPSULE neural networks, *ROLLER bearings, *ROUTING algorithms, *FAULT diagnosis, *DIAGNOSIS methods
- Abstract
Fault diagnosis is a novel technology crucial for monitoring the proper functioning and ensuring the stability of mechanical devices and components. Nevertheless, most existing data‐driven methods for rolling bearing fault diagnosis exhibit limited diagnostic capabilities in scenarios characterized by noise interference and inadequate training data. To address this issue, this paper proposes a novel intelligent fault diagnosis method for rolling bearings based on Capsule Network with Fast Routing algorithm (FCN). Firstly, the vibration signal is transformed into a time‐frequency map through continuous wavelet transform (CWT), and the transformed time‐frequency map is input into the network model to enable the network to learn features more fully. Subsequently, this paper introduces FCNs into capsule networks, effectively mitigating the extended training times typically associated with capsule networks and reducing the demands on training equipment. Extensive experiments are conducted utilizing two distinct bearing datasets to assess the method's stability and generalization. The results of these experiments demonstrate the proposed approach's ability to maintain robust fault diagnosis capabilities, even in the presence of noise interference and limited training data. This innovative method lays the foundation for intelligent rolling bearing diagnosis and is readily adaptable to other rotating components. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
48. Maximal entropy prior for the simple step‐stress accelerated test.
- Author
- Moala, Fernando Antonio and Chagas, Karlla Delalibera
- Subjects
- *BAYES' estimation, *ACCELERATED life testing, *MONTE Carlo method, *MARKOV chain Monte Carlo, *MARGINAL distributions, *ENTROPY
- Abstract
The step‐stress procedure is a popular accelerated test used to analyze the lifetime of highly reliable components. This paper considers a simple step‐stress accelerated test assuming a cumulative exposure model with uncensored lifetime data following a Weibull distribution. The maximum likelihood approach is often used to analyze accelerated stress test data. Another approach is to use Bayesian inference, which is useful when there is limited data available. In this paper, the parameters of the model are estimated based on the objective Bayesian viewpoint using non‐informative priors. Our main aim is to propose the maximal data information prior (MDIP) presented by Zellner (1984) as an alternative prior to the conventional independent gamma priors for the unknown parameters, in situations where there is little or no a priori knowledge about the parameters. We also obtain the Bayes estimators based on both classes of priors, assuming three different loss functions: squared error loss function (SELF), linear‐exponential loss function (LINEX), and generalized entropy loss function (GELF). The proposed MDIP prior is compared with the gamma priors via Monte Carlo simulations by examining their biases and mean square errors under the three loss functions, and coverage probability. Additionally, we employ the Markov Chain Monte Carlo (MCMC) algorithm to extract characteristics of marginal posterior distributions, such as the Bayes estimator and credible intervals. Finally, a real lifetime data set is presented to illustrate the proposed methodology. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
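The cumulative exposure model named in the abstract above can be written down compactly for a simple (two-level) step-stress test with a common Weibull shape. The sketch below evaluates that CDF with invented parameters; it is the standard CEM construction, not the authors' Bayesian procedure.

```python
import numpy as np

def cem_cdf(t, tau, beta, eta1, eta2):
    """Cumulative exposure model CDF for a simple step-stress test with
    Weibull(beta, eta_i) life at stress level i and stress change at tau.
    Units surviving tau carry their accumulated exposure into level 2 via
    the equivalent start time s = tau * eta2 / eta1 (common shape beta),
    which makes the CDF continuous at t = tau."""
    t = np.asarray(t, dtype=float)
    s = tau * eta2 / eta1                     # time already 'spent' at stress 2
    low = 1.0 - np.exp(-((t / eta1) ** beta))
    shifted = np.maximum(t - tau + s, 0.0)    # guard the unused branch
    high = 1.0 - np.exp(-((shifted / eta2) ** beta))
    return np.where(t < tau, low, high)

# Hypothetical parameters: higher stress shortens life (eta2 < eta1)
t = np.array([50, 100, 150, 200, 400])
print(cem_cdf(t, tau=100, beta=1.5, eta1=500, eta2=200))
```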
49. Multi‐objective Bayesian modeling and optimization of 3D printing process via experimental data‐driven method.
- Author
- Ding, Chunfeng, Wang, Jianjun, Ma, Yan, Tu, Yiliu, and Ma, Yizhong
- Subjects
- *THREE-dimensional printing, *PRODUCT quality, *PRODUCT improvement, *PRINTING industry, *MANUFACTURING industries
- Abstract
The instability of product quality and low printing efficiency are the main obstacles to the widespread application of 3D printing in the manufacturing industry. Optimizing printing parameters can substantially improve product quality and printing efficiency. However, existing methods for optimizing process parameters primarily rely on computationally expensive numerical simulations or costly physical experiments, which cannot balance model accuracy and experiment cost. To the best of our knowledge, almost no relevant papers have been found that address the issues of product quality and printing efficiency in 3D printing from an experimental data‐driven perspective. In this paper, we propose a method that integrates multiobjective Bayesian optimization (MOBO) with an experimental data‐driven approach, aiming to obtain more accurate optimization results at a lower cost. Distinguishing from previous studies, the proposed method utilizes experimental data instead of predicted values to update the model and find the optimal process parameters based on expected hypervolume improvement. The results of the 3D printing case study show that the proposed method can better model and optimize the highly fluctuating 3D printing process and obtain the optimal process parameters at a much lower cost. In addition, confirmatory experiments verify that the proposed method achieves higher printing efficiency while maintaining product quality. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
50. A time‐varying meshing stiffness model for gears with mixed elastohydrodynamic lubrication based on load‐sharing.
- Author
- Gu, Yingkui, Chen, Ronghua, Qiu, Guangqi, and Huang, Peng
- Subjects
- *ELASTOHYDRODYNAMIC lubrication, *SURFACE morphology, *LUBRICATION & lubricants, *ROUGH surfaces, *GEARING machinery
- Abstract
In mixed elastohydrodynamic lubrication (EHL), the load distribution between the gear meshing surfaces is shared by the oil film and the asperities of the gear's rough surface. Based on the load‐sharing concept, this paper proposes a time‐varying meshing stiffness (TVMS) model for gears with mixed EHL. The initial step involves the utilization of the Greenwood‐Williamson model to calculate the contact stiffness of surface asperities, while the lubricating film is assessed using a curve‐fitting formula to investigate the influence of gear surface morphology on TVMS. Subsequently, the incorporation of gear fillet foundation deformation and friction enables accurate TVMS determination. The proposed method is employed to examine the meshing stiffness of the gear pair under both dry lubrication and EHL conditions. Comparative analysis reveals favorable agreement between the proposed model and experimental results obtained under dry lubrication, thereby highlighting the superior performance of the proposed approach. Moreover, the time‐varying friction coefficient under EHL is computed, and the impacts of gear surface morphology parameters, temperature, speed, and load on lubrication conditions and TVMS are investigated. The findings presented in this paper contribute significantly to advancements in gear design and performance. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF