100 results for "Samy Missoum"
Search Results
2. Uncertainty quantification and global sensitivity analysis of composite wind turbine blades.
- Author
- Mishal Thapa and Samy Missoum
- Published
- 2022
- Full Text
- View/download PDF
3. Correction: Reliability Assessment of Uncertain Linear Systems Subjected to Random Vibrations
- Author
- Luis E. Ballesteros Martínez and Samy Missoum
- Published
- 2023
- Full Text
- View/download PDF
4. Reliability Assessment of Uncertain Linear Systems Subjected to Random Vibrations
- Author
- Luis E. Ballesteros Martínez and Samy Missoum
- Published
- 2023
- Full Text
- View/download PDF
5. Stochastic Crashworthiness Optimization Accounting for Simulation Noise
- Author
- Seyed Saeed Ahmadisoleymani and Samy Missoum
- Subjects
Mechanics of Materials, Computer science, Simulation noise, Mechanical Engineering, Crashworthiness, Structural engineering, Computer Graphics and Computer-Aided Design, Computer Science Applications
- Abstract
Finite element-based crashworthiness optimization is extensively used to improve the safety of motor vehicles. However, the responses of crash simulations are characterized by a high level of numerical noise, which can hamper the blind use of surrogate-based design optimization methods. It is therefore essential to account for the noise-induced uncertainty when performing optimization. For this purpose, a surrogate, referred to as Non-Deterministic Kriging (NDK), can be used. It models the noise as a non-stationary stochastic process, which is added to a traditional deterministic kriging surrogate. Based on the NDK surrogate, this study proposes an optimization algorithm tailored to account for both epistemic uncertainty, due to the lack of data, and irreducible aleatory uncertainty, due to the simulation noise. The variances are included within an extension of the well-known expected improvement infill criterion referred to as Modified Augmented Expected Improvement (MAEI). Because the proposed optimization scheme requires an estimate of the aleatory variance, it is approximated through a regression kriging, which is iteratively refined. The proposed algorithm is tested on a set of analytical functions and applied to the optimization of an Occupant Restraint System (ORS) during a crash.
- Published
- 2021
- Full Text
- View/download PDF
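The noise-aware infill criterion described in the abstract above can be illustrated with a minimal numerical sketch. This is not the authors' NDK/MAEI implementation: an ordinary kriging with a nugget term stands in for the noisy surrogate, and the classical expected improvement is discounted by an aleatory-noise factor in the augmented-EI form of Huang et al. (2006), which the MAEI extends. All function names and parameter values are illustrative.

```python
import math
import numpy as np

def kriging_fit(X, y, length=0.3, tau2=0.01):
    """Gaussian-kernel kriging with a homoscedastic noise (nugget) term tau2."""
    K = np.exp(-(X[:, None] - X[None, :])**2 / (2 * length**2))
    L = np.linalg.cholesky(K + tau2 * np.eye(len(X)))
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y))
    return L, alpha

def kriging_predict(Xs, X, L, alpha, length=0.3):
    ks = np.exp(-(Xs[:, None] - X[None, :])**2 / (2 * length**2))
    mu = ks @ alpha
    v = np.linalg.solve(L, ks.T)
    var = np.maximum(1.0 - np.sum(v**2, axis=0), 1e-12)  # epistemic variance
    return mu, var

def augmented_ei(mu, var, y_best, tau2):
    """Expected improvement (minimization) discounted by the aleatory noise:
    the factor (1 - sigma_eps / sqrt(var + sigma_eps^2)) vanishes where the
    epistemic variance is small relative to the irreducible noise."""
    s = np.sqrt(var)
    z = (y_best - mu) / s
    Phi = np.array([0.5 * (1 + math.erf(zi / math.sqrt(2))) for zi in z])
    phi = np.exp(-z**2 / 2) / math.sqrt(2 * math.pi)
    ei = (y_best - mu) * Phi + s * phi
    return ei * (1.0 - math.sqrt(tau2) / np.sqrt(var + tau2))

# Noisy samples of a 1-D test function; the next infill point maximizes AEI.
rng = np.random.default_rng(0)
X = np.linspace(0.0, 1.0, 8)
y = np.sin(6 * X) + 0.1 * rng.standard_normal(8)
L, alpha = kriging_fit(X, y)
Xs = np.linspace(0.0, 1.0, 101)
mu, var = kriging_predict(Xs, X, L, alpha)
aei = augmented_ei(mu, var, y.min(), tau2=0.01)
x_next = Xs[np.argmax(aei)]
```

In the paper the noise variance is itself non-stationary and approximated by a regression kriging; the constant `tau2` here is the simplest stand-in for that estimate.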
6. A Multi-Fidelity Approach for Reliability Assessment Based on the Probability of Classification Inconsistency
- Author
- Bharath Pidaparthi and Samy Missoum
- Subjects
Computer Graphics and Computer-Aided Design, Industrial and Manufacturing Engineering, Software, Computer Science Applications
- Abstract
Most multi-fidelity schemes for optimization or reliability assessment rely on regression surrogates, such as Gaussian processes. Contrary to these approaches, we propose a classification-based multi-fidelity scheme for reliability assessment. This technique leverages multi-fidelity information to locally construct failure boundaries using support vector machine (SVM) classifiers. SVMs are subsequently used to estimate the probability of failure using Monte Carlo simulations. The use of classification has several advantages: It can handle discontinuous responses and reduce the number of function evaluations in the case of a large number of failure modes. In addition, in the context of multi-fidelity techniques, classification enables the identification of regions where the predictions (e.g., failure or safe) from the various fidelities are identical. At the core of the proposed scheme is an adaptive sampling routine driven by the probability of classification inconsistency between the models. This sampling routine explores sparsely sampled regions of inconsistency between the models of various fidelity to iteratively refine the approximation of the failure domain boundaries. A lookahead scheme, which looks one step into the future without any model evaluations, is used to selectively filter adaptive samples that do not induce substantial changes in the failure domain boundary approximation. The model management strategy is based on a framework that adaptively identifies a neighborhood of no confidence between the models. The proposed scheme is tested on analytical examples of dimensions ranging from 2 to 10, and finally applied to assess the reliability of a miniature shell and tube heat exchanger.
- Published
- 2022
- Full Text
- View/download PDF
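The quantity driving the adaptive sampling above, the probability that two fidelities classify a sample differently, can be sketched with a toy Monte Carlo experiment. Here two analytic limit states stand in for the trained SVM classifiers (the shapes and sampling bounds are hypothetical); failure is `g < 0`.

```python
import random

# Hypothetical low- and high-fidelity limit states: failure when g < 0.
def g_hi(x1, x2):
    return x1**2 + x2**2 - 1.0        # "true" boundary: the unit circle

def g_lo(x1, x2):
    return abs(x1) + abs(x2) - 1.1    # cheap approximation of that boundary

def mc_estimates(n=100_000, seed=1):
    """Monte Carlo estimates of the failure probability (high fidelity) and
    of the probability that the two fidelities disagree on the class of a
    sample -- the inconsistency region targeted by adaptive sampling."""
    rng = random.Random(seed)
    n_fail = n_inc = 0
    for _ in range(n):
        x1, x2 = rng.uniform(-2.0, 2.0), rng.uniform(-2.0, 2.0)
        fail_hi = g_hi(x1, x2) < 0.0
        fail_lo = g_lo(x1, x2) < 0.0
        n_fail += fail_hi
        n_inc += (fail_hi != fail_lo)
    return n_fail / n, n_inc / n

pf, p_inconsistent = mc_estimates()
```

In the actual scheme the inconsistency probability is evaluated on SVM predictions rather than analytic functions, and new samples are placed where disagreement is high and data are sparse.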
7. A Multi-Fidelity Approach for Reliability Assessment Based on the Probability of Model Inconsistency
- Author
- Bharath Pidaparthi and Samy Missoum
- Abstract
Most multi-fidelity schemes rely on regression surrogates, such as Gaussian processes, to combine low- and high-fidelity data. Contrary to these approaches, we propose a classification-based multi-fidelity scheme for reliability assessment. This multi-fidelity technique leverages low- and high-fidelity model evaluations to locally construct the failure boundaries using support vector machine (SVM) classifiers. These SVMs can subsequently be used to estimate the probability of failure using Monte Carlo simulations. At the core of this multi-fidelity scheme is an adaptive sampling routine driven by the probability of misclassification. This sampling routine explores sparsely sampled regions of inconsistency between low- and high-fidelity models to iteratively refine the SVM approximation of the failure boundaries. A lookahead check, which looks one step into the future without any model evaluations, is employed to selectively filter the adaptive samples. A novel model selection framework, which adaptively defines a neighborhood of no confidence around the low-fidelity model, is used in this study to determine whether the adaptive samples should be evaluated with the high- or the low-fidelity model. The proposed multi-fidelity scheme is tested on a few analytical examples of dimensions ranging from 2 to 10, and finally applied to assess the reliability of a miniature shell and tube heat exchanger.
- Published
- 2022
- Full Text
- View/download PDF
8. CFD Based Design Optimization of Multiple Helical Swirl-Inducing Fins for Concentrated Solar Receivers
- Author
- Bharath Pidaparthi, Samy Missoum, and Ben Xu
- Abstract
Concentrated Solar Power (CSP) with Thermal Energy Storage (TES) has the potential to realize grid parity. This can be achieved by operating CSP systems at temperatures above 700 °C to reach high thermal efficiencies (>50%). However, operating CSP systems at elevated temperatures poses several problems, among which the design of solar receivers to handle increased thermal loads is critical. To this end, this work explores and optimizes various swirl-inducing internal fin designs for improving heat transfer in solar receiver tubes. These fin designs, in addition to enhancing the thermal performance of receiver tubes, are also capable of reducing temperature unevenness caused by nonuniform solar loads. This work optimizes geometric parameters such as the height and helical pitch of these fin designs by maximizing the Nusselt number with a constraint on the friction factor. The fin design optimization, however, is computationally intensive, often requiring hundreds of simulation calls to the Computational Fluid Dynamics (CFD) model. To circumvent this problem, this work employs surrogate models to approximate the simulation outputs needed during the optimization.
- Published
- 2022
- Full Text
- View/download PDF
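The surrogate-assisted workflow described above can be sketched end to end with cheap stand-ins. The two analytic functions below are hypothetical substitutes for the CFD outputs (Nusselt number and friction factor); a small design of experiments feeds a quadratic response surface, and a constrained grid search replaces the paper's optimizer.

```python
import numpy as np

# Hypothetical analytic stand-ins for the expensive CFD outputs, as
# functions of fin height h and helical pitch p.
def cfd_nu(h, p): return 40.0 + 30.0*h - 8.0*h**2 - 2.0*(p - 1.5)**2
def cfd_f(h, p):  return 0.02 + 0.05*h**2 + 0.01/p

def fit_quadratic(X, y):
    """Least-squares quadratic response surface in (h, p)."""
    h, p = X[:, 0], X[:, 1]
    A = np.column_stack([np.ones_like(h), h, p, h*p, h**2, p**2])
    coef, *_ = np.linalg.lstsq(A, y, rcond=None)
    return coef

def predict(coef, h, p):
    return coef @ np.array([1.0, h, p, h*p, h**2, p**2])

# Small design of experiments instead of hundreds of CFD calls
H, P = np.meshgrid(np.linspace(0.0, 2.0, 4), np.linspace(1.0, 2.0, 4))
X = np.column_stack([H.ravel(), P.ravel()])
c_nu = fit_quadratic(X, cfd_nu(X[:, 0], X[:, 1]))
c_f  = fit_quadratic(X, cfd_f(X[:, 0], X[:, 1]))

# Maximize surrogate Nu subject to a friction-factor constraint
best, best_nu = None, -np.inf
for h in np.linspace(0.0, 2.0, 81):
    for p in np.linspace(1.0, 2.0, 41):
        if predict(c_f, h, p) <= 0.10 and predict(c_nu, h, p) > best_nu:
            best, best_nu = (h, p), predict(c_nu, h, p)
```

All 16 "expensive" evaluations happen up front; the thousands of optimizer queries hit only the surrogates, which is the cost structure that makes the CFD-based optimization tractable.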
9. Surrogate-based stochastic optimization of horizontal-axis wind turbine composite blades
- Author
- Mishal Thapa and Samy Missoum
- Subjects
Control and Optimization, Control and Systems Engineering, Computer Graphics and Computer-Aided Design, Software, Computer Science Applications
- Published
- 2022
- Full Text
- View/download PDF
10. Nonlinear dynamics of the wolf tone production
- Author
- Etienne Gourc, Christophe Vergez, Pierre-Olivier Mattei, and Samy Missoum
- Subjects
Physics, Acoustics and Ultrasonics, Mechanical Engineering, Wolf tone, Condensed Matter Physics, Resonance, Cello, String (physics), Stability, Nonlinear system, Classical mechanics, Mechanics of Materials, Piecewise, Boundary value problem
- Abstract
Some bowed string instruments, such as the cello or viola, are prone to a parasitic phenomenon called the wolf tone, which gives rise to an undesired warbling sound. It is now accepted that this phenomenon is mainly due to an interaction between a resonance of the body and the motion of the string. A simple model of a bowed string instrument, consisting of a linear string with a mass-spring boundary condition (modeling the body of the instrument) and excited by Coulomb friction, is presented. The eigenproblem analysis shows the presence of a frequency veering phenomenon close to the 1:1 resonance between the string and the body, giving rise to modal hybridization. Due to the piecewise nature of Coulomb friction, the periodic solutions are computed and continued using a mapping procedure. The analysis of classical as well as non-smooth bifurcations allows us to relate warbling oscillations to the loss of stability of periodic solutions. Finally, a link is made between the bifurcations of periodic solutions and the minimum bow force generally used to explain the appearance of the wolf tone.
- Published
- 2022
- Full Text
- View/download PDF
11. Entropy-Based Optimization for Heat Transfer Enhancement in Tubes With Helical Fins
- Author
- Bharath Pidaparthi, Peiwen Li, and Samy Missoum
- Subjects
Entropy (classical thermodynamics), Materials science, Mechanics of Materials, Mechanical Engineering, Heat transfer enhancement, Heat transfer, General Materials Science, Mechanics, Condensed Matter Physics, Multi-objective optimization, Surrogate-based optimization
- Abstract
In this work, a tube with internal helical fins is analyzed and optimized from an entropy generation point of view. Helical fins, in addition to providing heat transfer enhancements, have the potential to level the temperature of the tube under nonuniform circumferential heating. In this work, the geometric parameters of internal helical fins are optimized under two different entropy-based formulations. Specifically, the optimal design solution obtained through the minimization of total entropy is compared with the solutions from the multiobjective optimization of the thermal and viscous entropy contributions when considered as two separate objectives. Because these quantities are associated with heat transfer and pressure drops, respectively, it is shown that, from a design optimization point of view, it is important to treat the two entropies as separate, conflicting objectives.
- Published
- 2021
- Full Text
- View/download PDF
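The conflict between the two entropy contributions can be made concrete with a one-variable toy model (not the paper's CFD-based entropy model): thermal entropy generation falls as fins enhance heat transfer, while viscous entropy rises with the added pressure drop.

```python
# Illustrative entropy-generation model for a finned tube; h is a
# dimensionless fin height in (0, 1]. Coefficients are made up.
def s_thermal(h): return 1.0 / (1.0 + 4.0 * h)   # falls with fin height
def s_viscous(h): return 0.1 + 0.8 * h**2        # rises with fin height

def total_entropy(h): return s_thermal(h) + s_viscous(h)

hs = [i / 100 for i in range(1, 101)]
h_star = min(hs, key=total_entropy)   # single-objective optimum

# Pareto front in (s_thermal, s_viscous): designs not strictly dominated
pareto = [h for h in hs
          if not any(s_thermal(g) < s_thermal(h) and s_viscous(g) < s_viscous(h)
                     for g in hs)]
```

Because one objective is strictly decreasing and the other strictly increasing, every design is Pareto-optimal: the total-entropy minimizer is just one point on a whole trade-off front, which is the paper's argument for separating the two entropies.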
12. A note on the effect of material uncertainty on acoustic source localization error in anisotropic plates
- Author
- Samy Missoum, Novonil Sen, and Tribikram Kundu
- Subjects
Acoustics and Ultrasonics, Latin hypercube sampling, Mathematical analysis, Monte Carlo method, Log-normal distribution, Probability distribution, Acoustic source localization, Elasticity (physics), Orthotropic material, Random variable, Mathematics
- Abstract
The uncertainty in the material properties of an anisotropic plate may influence the acoustic source localization process undertaken for the plate. To study this effect of material uncertainty, the two moduli of elasticity of an orthotropic plate material are considered in this note as independent random variables, and the propagation of this material uncertainty through the wave front shape-based acoustic source localization approach is investigated. Assuming lognormal probability distributions for the two random variables, several design points in lognormal spaces are picked using Latin hypercube sampling. A finite element analysis is performed for each design point to simulate the elastic wave propagation due to an acoustic event, and the wave front shape-based approach is applied to estimate the source location. The times of arrival and source localization errors obtained for each design point are considered as separate response functions at that design point, and regression kriging metamodels are constructed through the responses at the design points. Monte Carlo simulations are carried out using these metamodels to obtain the distribution parameters (i.e., ranges, means, and standard deviations) of the times of arrival and localization errors. A global sensitivity analysis is performed to estimate the effect of each random variable on the localization errors. It is observed that, for lognormally distributed moduli of elasticity with the same coefficients of variation, uncertainty in the modulus of elasticity in the major direction affects the source localization accuracy more than uncertainty in the modulus of elasticity in the minor direction, particularly when the ellipse-based technique is used.
- Published
- 2021
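The sampling step described above, Latin hypercube samples of lognormal moduli, can be sketched with the standard library. The error model below is a hypothetical response built so that the major-direction modulus dominates, and the sensitivity measure is a simplified one-at-a-time variance comparison, not the paper's global sensitivity analysis.

```python
import math
import random
import statistics

def lhs_lognormal(n, mu, sigma, rng):
    """Latin hypercube sample of a lognormal variable: one draw per
    equal-probability stratum of the underlying normal, then shuffled."""
    nd = statistics.NormalDist(mu, sigma)
    u = [(i + rng.random()) / n for i in range(n)]
    rng.shuffle(u)
    return [math.exp(nd.inv_cdf(ui)) for ui in u]

def loc_error(E1, E2):
    """Hypothetical localization-error response; E1 (major direction) is
    deliberately the more influential variable."""
    return 2.0 * abs(math.log(E1 / 10.0)) + 0.5 * abs(math.log(E2 / 1.0))

rng = random.Random(0)
n = 2000
E1 = lhs_lognormal(n, math.log(10.0), 0.1, rng)   # major modulus
E2 = lhs_lognormal(n, math.log(1.0), 0.1, rng)    # minor modulus, same CoV
errs = [loc_error(a, b) for a, b in zip(E1, E2)]

# One-at-a-time variance contributions (each variable varied alone)
v1 = statistics.pvariance([loc_error(a, 1.0) for a in E1])
v2 = statistics.pvariance([loc_error(10.0, b) for b in E2])
```

With the same coefficient of variation on both moduli, the variance induced by `E1` dominates, mirroring the note's qualitative conclusion.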
13. A Multi-Fidelity Approach for the Reliability Assessment of Shell and Tube Heat Exchangers
- Author
- Peiwen Li, Bharath Pidaparthi, and Samy Missoum
- Subjects
Computer science, Heat exchanger, Mechanical engineering, Fidelity, Engineering simulation, Reliability (statistics), Shell and tube heat exchanger
- Abstract
The objective of this paper is to efficiently perform the reliability assessment of shell and tube heat exchangers. Although inexpensive empirical/analytical heat exchanger models exist and could be used for brute-force Monte Carlo simulation (MCS)-based reliability analysis, they typically do not characterize the shell-side flow accurately. Hence, higher-fidelity models are often needed for predicting the shell and tube heat exchanger performance with a desired level of accuracy. These higher-fidelity models are generally associated with higher computational costs, making them impractical for MCS-based reliability analysis. To circumvent this problem, a multi-fidelity technique leveraging the lower- and higher-fidelity models is proposed to locally construct the failure boundaries using support vector machines (SVM). For this purpose, an adaptive sampling scheme, which explores the regions of inconsistency between failure boundaries from the lower- and higher-fidelity models, is developed.
- Published
- 2021
- Full Text
- View/download PDF
14. Stochastic Kriging for Crashworthiness Optimization Accounting for Simulation Noise
- Author
- Samy Missoum and Seyed Saeed Ahmadisoleymani
- Subjects
Mathematical optimization, Optimization algorithm, Simulation noise, Stochastic process, Kriging, Computer science, Crashworthiness
- Abstract
Vehicle crash simulations are notoriously costly and noisy. When performing crashworthiness optimization, it is therefore important to include available information to quantify the noise in the optimization. For this purpose, a stochastic kriging can be used to account for the uncertainty due to the simulation noise. This is done through the addition of a non-stationary stochastic process to the deterministic kriging formulation. This stochastic kriging, which can also be used to include the effect of random non-controllable parameters, can then be used for surrogate-based optimization. In this work, a stochastic kriging-based optimization algorithm is proposed with an infill criterion referred to as the Augmented Expected Improvement, which, unlike its deterministic counterpart the Expected Improvement, accounts for the presence of irreducible aleatory variance due to noise. One of the key novelties of the proposed algorithm stems from the approximation of the aleatory variance and its update during the optimization. The proposed approach is applied to two problems: an analytical function and a crashworthiness problem in which the components of an occupant restraint system of a vehicle are optimized.
- Published
- 2021
- Full Text
- View/download PDF
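For reference, the Augmented Expected Improvement mentioned above is commonly written (following Huang et al., 2006, whose criterion this line of work builds on; the paper's variant additionally updates the noise estimate during the optimization) as:

```latex
\mathrm{AEI}(\mathbf{x})
  = \underbrace{\bigl(y^{**}-\hat{y}(\mathbf{x})\bigr)\,\Phi(z)
      + s(\mathbf{x})\,\phi(z)}_{\text{classical EI}}
    \left(1-\frac{\sigma_{\varepsilon}}
      {\sqrt{s^{2}(\mathbf{x})+\sigma_{\varepsilon}^{2}}}\right),
\qquad
z=\frac{y^{**}-\hat{y}(\mathbf{x})}{s(\mathbf{x})}
```

where \(y^{**}\) is the current effective best solution, \(\hat{y}\) and \(s^2\) are the kriging mean and epistemic variance, and \(\sigma_\varepsilon^2\) is the irreducible aleatory (noise) variance; the trailing factor suppresses infill where further sampling can only resolve noise, not model uncertainty.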
15. Crashworthiness Optimization Based on the Probability of Traumatic Brain Injury Accounting for Simulation Noise and Impact Conditions
- Author
- Samy Missoum and Seyed Saeed Ahmadisoleymani
- Subjects
Optimization algorithm, Traumatic brain injury, Computer science, Simulation noise, Computation, Crashworthiness, Reliability engineering
- Abstract
Finite element-based crashworthiness optimization is nowadays extensively used to improve the safety of vehicles. However, the responses of a crash simulation are notoriously noisy. In addition, the actual or simulated responses during a crash can be highly sensitive to uncertainties. These uncertainties appear in various forms, such as uncontrollable random parameters (e.g., impact conditions). To address these challenges, an optimization algorithm based on a Stochastic Kriging (SK) and an Augmented Expected Improvement (AEI) infill criterion is proposed. An SK enables the approximation of a response while accounting for the noise-induced aleatory variance. In addition, SK has the advantage of reducing the dimensionality of the problem by implicitly accounting for the influence of random parameters and their contribution to the overall aleatory variance. In the proposed algorithm, the aleatory variance is initially estimated through direct sampling and subsequently approximated by a regression kriging. This aleatory variance approximation, which is refined adaptively, is used for the computation of the infill criterion and probabilistic constraints. The algorithm is implemented on a crashworthiness optimization problem that involves sled and dummy models subjected to an acceleration pulse. The sled model includes components of a vehicle occupant restraint system such as an airbag, seatbelt, and steering column. In all problems considered, the objective function is the probability of traumatic brain injury, which is computed through the Brain Injury Criterion (BrIC) and a logistic injury risk model. In some cases, probabilistic constraints corresponding to other types of bodily injuries, such as thoracic injury, are added to the optimization problem. The design variables correspond to the properties of the occupant restraint system (e.g., the loading curve that dictates the airbag vent area versus pressure).
In addition to the inherent simulation noise, uncertainties in the loading conditions are introduced in the form of a random scaling factor of the acceleration pulse.
- Published
- 2021
- Full Text
- View/download PDF
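The injury-probability objective described above chains two simple formulas: BrIC from peak angular head velocities, then a logistic risk model. A minimal sketch follows; the critical angular velocities are the values published by Takhounts et al. (2013), while the logistic coefficients below are illustrative placeholders, not the calibrated values used in the paper.

```python
import math

# BrIC from peak angular velocities (rad/s); critical values per axis
# from Takhounts et al. (2013)
def bric(wx, wy, wz, wc=(66.25, 56.45, 42.87)):
    return math.sqrt((wx / wc[0])**2 + (wy / wc[1])**2 + (wz / wc[2])**2)

# Logistic injury-risk model; beta0 and beta1 are hypothetical here
def p_injury(b, beta0=-4.0, beta1=6.0):
    return 1.0 / (1.0 + math.exp(-(beta0 + beta1 * b)))
```

In the optimization, each noisy crash simulation yields peak angular velocities, `bric(...)` condenses them into one scalar, and `p_injury(...)` maps that scalar to the probability being minimized.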
16. Stochastic Optimization of Nonlinear Energy Sinks for the Mitigation of Limit Cycle Oscillations
- Author
- Samy Missoum and Bharath Pidaparthi
- Subjects
Support vector machine, Airfoil, Physics, Nonlinear system, Control theory, Aerospace Engineering, Particle swarm optimization, Stochastic optimization, Probability density function, Energy
- Abstract
This paper investigates the mitigation of limit cycle oscillations (LCOs) using optimally designed nonlinear energy sinks (NESs). The study is based on an NES attached to a two-degree-of-freedom ai...
- Published
- 2019
- Full Text
- View/download PDF
17. Construction of a risk model through the fusion of experimental data and finite element modeling: Application to car crash-induced TBI
- Author
- Samy Missoum and Seyed Saeed Ahmadisoleymani
- Subjects
Support Vector Machine, Adaptive sampling, Computer science, Acceleration, Finite Element Analysis, Biomedical Engineering, Bioengineering, Crash, Logistic regression, Risk Assessment, Traumatic Brain Injuries, Computer Simulation, Probability, Traffic Accidents, Law of total probability, General Medicine, Sensor fusion, Finite element method, Computer Science Applications, Human-Computer Interaction, Logistic Models, Mechanical Stress, Algorithm
- Abstract
This article introduces a new approach for the construction of a risk model for the prediction of Traumatic Brain Injury (TBI) as a result of a car crash. The probability of TBI is assessed through the fusion of an experiment-based logistic regression risk model and a finite element (FE) simulation-based risk model. The proposed approach uses a multilevel framework which includes FE simulations of vehicle crashes with a dummy and FE simulations of the human brain. The loading conditions derived from the crash simulations are transferred to the brain model, thus allowing the calculation of injury metrics such as the Cumulative Strain Damage Measure (CSDM). The framework is used to propagate uncertainties and obtain probabilities of TBI based on the CSDM injury metric. The risk model from FE simulations is constructed from a support vector machine classifier, adaptive sampling, and Monte Carlo simulations. An approach to compute the total probability of TBI, which combines the FE-based risk assessment with the risk prediction from the experiment-based logistic regression model, is proposed. In contrast to previously published work, the proposed methodology includes the uncertainty of explicit parameters such as impact conditions (e.g., velocity, impact angle) and material properties of the brain model. This risk model can provide, for instance, the probability of TBI for a given assumed crash impact velocity.
- Published
- 2019
- Full Text
- View/download PDF
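The fusion idea above, marginalizing injury risk over impact conditions via the law of total probability while combining two risk models, can be sketched in a few lines. Both risk curves and the fusion weight below are hypothetical stand-ins for the paper's calibrated logistic model and its SVM/Monte Carlo-based FE risk estimate.

```python
import math

# Experiment-based logistic risk model (hypothetical coefficients)
def p_exp(v):
    return 1.0 / (1.0 + math.exp(-(0.3 * v - 6.0)))

# FE/simulation-based risk curve, standing in for a CSDM-based estimate
def p_fe(v):
    return min(1.0, max(0.0, (v - 10.0) / 30.0))

def total_risk(velocities, probs, w=0.5):
    """Law of total probability over a discretized impact-speed
    distribution, with a convex fusion (weight w) of the two models."""
    assert abs(sum(probs) - 1.0) < 1e-9
    return sum(p * (w * p_exp(v) + (1.0 - w) * p_fe(v))
               for v, p in zip(velocities, probs))

r_low  = total_risk([10.0, 20.0, 30.0], [0.5, 0.3, 0.2])  # mostly slow crashes
r_high = total_risk([10.0, 20.0, 30.0], [0.2, 0.3, 0.5])  # mostly fast crashes
```

Shifting probability mass toward higher impact speeds raises the total risk, as expected of a coherent marginalization.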
18. Entropy-Based Optimization of Helical Fins for Heat Transfer Enhancement Inside Tubes
- Author
- Samy Missoum, Bharath Pidaparthi, and Peiwen Li
- Subjects
Pressure drop, Physics, Entropy (classical thermodynamics), Heat transfer enhancement, Heat transfer, Mechanics, Multi-objective optimization
- Abstract
The design optimization of a tube with internal helical fins is considered from an entropy generation point of view. The primary focus of the article is to study the optimization results based on entropy-based formulations. Specifically, this work compares the optimal design solution obtained through the minimization of total entropy with the one obtained through the multiobjective optimization of the heat transfer and frictional entropies when considered as two separate objectives. Because these quantities are associated with heat transfer and pressure drops, respectively, it is shown that, from a design optimization point of view, it is important to treat the two entropies as separate, conflicting objectives.
- Published
- 2020
- Full Text
- View/download PDF
19. Woodwind instrument design optimization based on impedance characteristics with geometric constraints
- Author
- Christophe Vergez, Augustin Ernoult, Michael Jousserand, Philippe Guillemain, and Samy Missoum
- Subjects
Optimization problem, Acoustics and Ultrasonics, Computer science, Acoustics, Resonance, Input impedance, Bore profile optimization, Musical acoustics, Amplitude, Geometric design, Arts and Humanities (miscellaneous), Woodwind instruments, Musical instrument design, Instrument design, Impedance characteristics, Electrical impedance
- Abstract
Computational optimization algorithms coupled with acoustic models of wind instruments provide instrument makers with an opportunity to explore new designs. Specifically, they enable the automatic discovery of geometries exhibiting desired resonance characteristics. In this paper, the design optimization of woodwind instruments with complex geometrical features (e.g., a non-cylindrical bore profile and side holes with various radii and chimney heights) is investigated. Optimal geometric designs are searched for so that their acoustic input impedance has peaks with specific target frequencies and amplitudes. However, woodwind instruments exhibit a complex input impedance whose features, such as resonances, might have a non-smooth evolution with respect to design variables, thus hampering gradient-based optimization. For this reason, this paper introduces new formulations of the impedance characteristics (resonance frequencies and amplitudes) using a regularized unwrapped angle of the reflection function. The approach is applied to an illustrative instrument subjected to geometric constraints similar to the ones encountered by manufacturers (a key-less pentatonic clarinet with two registers). Three optimization problems are considered, demonstrating a strategy to simultaneously adjust several impedance characteristics on all fingerings.
- Published
- 2020
- Full Text
- View/download PDF
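The core objective, moving impedance peaks toward target frequencies by changing the geometry, can be sketched with the crudest possible acoustic model: an idealized closed-open cylinder whose peaks sit at odd multiples of c/4L. This is far simpler than the paper's impedance model (no side holes, no losses, no reflection-function regularization); the target values are made up.

```python
C_AIR = 343.0  # speed of sound in air, m/s (approximate)

def peak_freqs(L, n=3):
    """Impedance-peak frequencies of an idealized closed-open cylinder of
    length L: f_k = (2k - 1) c / (4 L)."""
    return [(2 * k - 1) * C_AIR / (4.0 * L) for k in range(1, n + 1)]

def mismatch(L, targets):
    """Least-squares objective comparing peak frequencies to targets."""
    return sum((f - t)**2
               for f, t in zip(peak_freqs(L, len(targets)), targets))

targets = [147.0, 441.0, 735.0]   # harmonically related target peaks, Hz
grid = [0.40 + 0.0005 * i for i in range(400)]
L_best = min(grid, key=lambda L: mismatch(L, targets))
```

With a realistic bore the objective becomes non-smooth in the design variables, which is exactly why the paper replaces naive peak-picking with a regularized formulation before handing the problem to a gradient-based optimizer.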
20. Optimization under Uncertainty of a Chain of Nonlinear Resonators using a Field Representation
- Author
- Samy Missoum and Seyed Saeed Ahmadisoleymani
- Subjects
Optimization problem, Field (physics), Computer science, Applied Mathematics, Metamaterial, Applied Physics, Topology, Resonator, Nonlinear system, Optimization and Control, Modeling and Simulation, Stochastic optimization, Curse of dimensionality
- Abstract
Chains of resonators in the form of spring-mass systems have long been known to exhibit interesting properties such as band gaps. Such features can be leveraged to manipulate the propagation of waves, such as the filtering of specific frequencies, and, more generally, to mitigate vibrations and impact. Adding nonlinearities to the system can also provide further avenues to manipulate the propagation of waves in the chain and enhance its performance. This work proposes to optimally design such a chain of resonators to mitigate vibrations in a robust manner by accounting for various sources of design uncertainties (e.g., nonlinear stiffness) and aleatory uncertainties (e.g., loading). The stochastic optimization algorithm is tailored to account for discontinuities in the chain response due to the presence of nonlinearities. In addition, a field formulation is used to define the properties of the resonators along the chain and reduce the dimensionality of the optimization problem. It is shown that the combination of the stochastic optimization algorithm and the field representation leads to robust designs that could not be achieved with optimal properties constant over the chain.
- Published
- 2020
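The dimensionality-reduction idea above, describing all resonator properties through a low-dimensional field instead of one design variable per resonator, can be sketched on a linear spring-mass chain (the paper's chain is nonlinear; the sinusoidal field shape below is an arbitrary illustration).

```python
import math
import numpy as np

def chain_K(k):
    """Stiffness matrix of a fixed-free chain of unit masses; spring i
    connects mass i to mass i-1 (mass -1 being the wall)."""
    n = len(k)
    K = np.zeros((n, n))
    for i in range(n):
        K[i, i] += k[i]
    for i in range(n - 1):
        K[i, i] += k[i + 1]
        K[i, i + 1] -= k[i + 1]
        K[i + 1, i] -= k[i + 1]
    return K

def field_stiffness(n, k0, a):
    """Field representation: n stiffnesses controlled by only two
    coefficients (k0, a) instead of n independent design variables."""
    return [k0 * (1.0 + a * math.sin(math.pi * i / (n - 1)))
            for i in range(n)]

# Natural frequencies of a 10-mass chain described by a 2-parameter field
freqs = np.sqrt(np.linalg.eigvalsh(chain_K(field_stiffness(10, 1.0, 0.5))))
```

An optimizer then searches over `(k0, a)` (a 2-D problem) rather than over ten independent stiffnesses, which is the dimensionality reduction the abstract refers to.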
21. Identification of material properties of composite sandwich panels under geometric uncertainty
- Author
- Samy Missoum, Sylvain Lacaze, Marco Amabili, and Farbod Alijani
- Subjects
Random field, Materials science, Numerical analysis, Process (computing), Structural engineering, Sandwich panel, Identification (information), Normal mode, Ceramics and Composites, Material properties, Sandwich-structured composite, Civil and Structural Engineering
- Abstract
This study deals with the influence of manufacturing-induced geometric variability on the identification of material properties of composite sandwich panels. The objective of this article is twofold. First, this work aims to demonstrate the marked influence of geometric uncertainties on a foam core sandwich panel whose skin material properties need to be identified. Several identification cases are studied based on experimentally obtained natural frequencies and mode shapes. The second objective is to propose a numerical method for the identification process in the case where uncertainties can be treated as a random field (e.g., thickness distribution). The identification method is built around a classification-based technique referred to as “fidelity maps”, which has the ability to simultaneously treat several responses to match without any assumption on their correlation. The approach uses a proper orthogonal decomposition for the extraction and the selection of the features of the random field considered as important for the identification. The identification method is demonstrated on a foam core sandwich panel whose thickness distribution is modeled as a random field.
- Published
- 2017
- Full Text
- View/download PDF
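The feature-extraction step mentioned above, a proper orthogonal decomposition (POD) of a random thickness field, can be sketched with an SVD on synthetic data. The realizations below are fabricated (nominal thickness plus two smooth random modes plus point noise) purely to show how a few POD modes capture the field.

```python
import numpy as np

rng = np.random.default_rng(0)
x = np.linspace(0.0, 1.0, 50)

# Synthetic thickness realizations over the panel length
samples = np.array([
    1.0
    + 0.10 * rng.standard_normal() * np.sin(np.pi * x)      # smooth mode 1
    + 0.05 * rng.standard_normal() * np.sin(2 * np.pi * x)  # smooth mode 2
    + 0.002 * rng.standard_normal(50)                       # point noise
    for _ in range(40)])

# POD = SVD of the mean-centered snapshot matrix
fluct = samples - samples.mean(axis=0)
U, s, Vt = np.linalg.svd(fluct, full_matrices=False)
energy = np.cumsum(s**2) / np.sum(s**2)   # cumulative energy fraction
```

The first two modes capture nearly all the variance, so the identification can work with two modal coefficients instead of the full 50-point field, which is the role POD plays in the paper's method.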
22. Optimization under uncertainty of parallel nonlinear energy sinks
- Author
- Christophe Vergez, Samy Missoum, Pierre-Olivier Mattei, and Ethan Boroson
- Subjects
Optimal design, Mathematical optimization, Acoustics and Ultrasonics, Mechanical Engineering, Expected value, Condensed Matter Physics, Vibration, Nonlinear system, Mechanics of Materials, Control theory, Tuned mass damper, Energy, Mathematics
- Abstract
Nonlinear Energy Sinks (NESs) are a promising technique for passively reducing the amplitude of vibrations. Through nonlinear stiffness properties, a NES is able to passively and irreversibly absorb energy. Unlike the traditional Tuned Mass Damper (TMD), NESs do not require a specific tuning and absorb energy over a wider range of frequencies. Nevertheless, they are still only efficient over a limited range of excitations. In order to mitigate this limitation and maximize the efficiency range, this work investigates the optimization of multiple NESs configured in parallel. It is well known that the efficiency of a NES is extremely sensitive to small perturbations in loading conditions or design parameters. In fact, the efficiency of a NES has been shown to be nearly discontinuous in the neighborhood of its activation threshold. For this reason, uncertainties must be taken into account in the design optimization of NESs. In addition, the discontinuities require a specific treatment during the optimization process. In this work, the objective of the optimization is to maximize the expected value of the efficiency of NESs in parallel. The optimization algorithm is able to tackle design variables with uncertainty (e.g., nonlinear stiffness coefficients) as well as aleatory variables such as the initial velocity of the main system. The optimal design of several parallel NES configurations for maximum mean efficiency is investigated. Specifically, NES nonlinear stiffness properties, considered random design variables, are optimized for cases with 1, 2, 3, 4, 5, and 10 NESs in parallel. The distributions of efficiency for the optimal parallel configurations are compared to distributions of efficiencies of non-optimized NESs. It is observed that the optimization enables a sharp increase in the mean value of efficiency while reducing the corresponding variance, thus leading to more robust NES designs.
- Published
- 2017
- Full Text
- View/download PDF
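The mean-efficiency maximization under uncertainty described in the abstract above can be sketched with a toy Monte Carlo loop. All functions and numbers below are hypothetical stand-ins (a made-up near-discontinuous efficiency function, assumed stiffness scatter and velocity range), not the authors' NES simulations:

```python
import numpy as np

rng = np.random.default_rng(0)

def nes_efficiency(k_nl, v0):
    # Toy stand-in for the simulated energy-dissipation ratio: the NES
    # "activates" (high efficiency) only when the input energy exceeds a
    # threshold that shrinks as the nonlinear stiffness k_nl grows.
    activated = 0.5 * v0**2 > 1.0 / k_nl
    return np.where(activated, 0.85, 0.10)

def mean_efficiency(k_nl_nominal, n=100_000):
    # Random design variable: manufacturing scatter on the stiffness
    k_nl = rng.normal(k_nl_nominal, 0.05 * k_nl_nominal, n)
    # Aleatory variable: initial velocity imparted to the main system
    v0 = rng.uniform(0.5, 2.0, n)
    return nes_efficiency(k_nl, v0).mean()

# Robust design: pick the nominal stiffness maximizing the mean efficiency
candidates = np.linspace(0.5, 5.0, 10)
best_k = max(candidates, key=mean_efficiency)
```

Because the efficiency is near-discontinuous, the expected value over both sources of uncertainty is a smoother, better-behaved objective than any single deterministic run.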
23. Stochastic optimization of nonlinear energy sinks
- Author
-
Samy Missoum and Ethan Boroson
- Subjects
Optimal design ,Computer Science::Computer Science and Game Theory ,Mathematical optimization ,Control and Optimization ,Computer Science::Neural and Evolutionary Computation ,02 engineering and technology ,01 natural sciences ,Computer Graphics and Computer-Aided Design ,Computer Science Applications ,Nonlinear system ,020303 mechanical engineering & transports ,0203 mechanical engineering ,Control and Systems Engineering ,Kriging ,0103 physical sciences ,Stochastic optimization ,Probabilistic design ,Cluster analysis ,010301 acoustics ,Random variable ,Computer Science::Databases ,Software ,Curse of dimensionality ,Mathematics - Abstract
Nonlinear energy sinks (NES) are a promising technique to achieve vibration mitigation. Through nonlinear stiffness properties, NES are able to passively and irreversibly absorb energy. Unlike the traditional Tuned Mass Damper (TMD), NES absorb energy from a wide range of frequencies. Many studies have focused on NES behavior and dynamics, but few have addressed the optimal design of NES. Design considerations of NES are of prime importance as it has been shown that NES dynamics exhibit an acute sensitivity to uncertainties. In fact, the sensitivity is so marked that NES efficiency is near-discontinuous and can switch from a high to a low value for a small perturbation in design parameters or loading conditions. This article presents an approach for the probabilistic design of NES which accounts for random design and aleatory variables as well as response discontinuities. In order to maximize the mean efficiency, the algorithm is based on the identification of regions of the design and aleatory space corresponding to markedly different NES efficiencies. This is done through a sequence of approximated sub-problems constructed from clustering, Kriging approximations, a support vector machine, and Monte-Carlo simulations. The refinement of the surrogates is performed locally using a generalized max-min sampling scheme which accounts for the distributions of random variables. The sampling scheme also makes use of the predicted variance of the Kriging surrogates for the selection of aleatory variables values. The proposed algorithm is applied to three example problems of varying dimensionality, all including an aleatory excitation applied to the main system. The stochastic optima are compared to NES optimized deterministically.
- Published
- 2016
- Full Text
- View/download PDF
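The clustering step in the approach above, which separates markedly different NES efficiency regimes before building surrogates, can be illustrated with a minimal 1D two-means sketch. This is an illustrative stand-in on synthetic bimodal data, not the paper's clustering pipeline:

```python
import numpy as np

rng = np.random.default_rng(1)

def two_means_1d(values, n_iter=50):
    # Minimal 1D 2-means clustering: assign each value to the nearest
    # of two centroids, then recompute the centroids, and repeat.
    c_lo, c_hi = values.min(), values.max()
    for _ in range(n_iter):
        in_hi = np.abs(values - c_hi) < np.abs(values - c_lo)
        c_lo, c_hi = values[~in_hi].mean(), values[in_hi].mean()
    return in_hi, (c_lo, c_hi)

# Near-discontinuous efficiencies: non-activated vs. activated NES designs
eff = np.concatenate([rng.normal(0.10, 0.02, 200),
                      rng.normal(0.80, 0.05, 200)])
in_hi, (c_lo, c_hi) = two_means_1d(eff)
```

Once the two regimes are identified, separate surrogates can be fitted within each cluster and a classifier can delimit the boundary between them.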
24. Optimization of a Chain of Nonlinear Resonators for Vibration Mitigation
- Author
-
Seyed Saeed Ahmadisoleymani and Samy Missoum
- Subjects
Physics ,Nonlinear system ,Resonator ,Acoustics ,Vibration mitigation - Published
- 2018
- Full Text
- View/download PDF
25. Optimization of Nonlinear Energy Sinks for the Mitigation of Limit Cycle Oscillations
- Author
-
Samy Missoum and Bharath Pidaparthi
- Subjects
Physics ,Nonlinear system ,020303 mechanical engineering & transports ,0203 mechanical engineering ,Limit cycle oscillation ,0103 physical sciences ,02 engineering and technology ,Mechanics ,010301 acoustics ,01 natural sciences ,Energy (signal processing) - Published
- 2018
- Full Text
- View/download PDF
27. Comment l'optimisation peut-elle être une aide à la facture instrumentale ? [How can optimization aid musical instrument making?]
- Author
-
Augustin Ernoult, Samy Missoum, Michael Jousserand, Philippe Guillemain, Christophe Vergez, Patrick Sanchez, Ernoult, Augustin, Sons, Laboratoire de Mécanique et d'Acoustique [Marseille] (LMA ), Centre National de la Recherche Scientifique (CNRS)-Aix Marseille Université (AMU)-École Centrale de Marseille (ECM)-Centre National de la Recherche Scientifique (CNRS)-Aix Marseille Université (AMU)-École Centrale de Marseille (ECM), University of Arizona, Department of Mechanical and Aerospace Engineering [Arizona State University], Arizona State University [Tempe] (ASU), Buffet Group, and Aix Marseille Université (AMU)-École Centrale de Marseille (ECM)-Centre National de la Recherche Scientifique (CNRS)-Aix Marseille Université (AMU)-École Centrale de Marseille (ECM)-Centre National de la Recherche Scientifique (CNRS)
- Subjects
[SPI.ACOU]Engineering Sciences [physics]/Acoustics [physics.class-ph] ,[SPI]Engineering Sciences [physics] ,[SPI.ACOU] Engineering Sciences [physics]/Acoustics [physics.class-ph] ,[SPI] Engineering Sciences [physics] ,ComputingMilieux_MISCELLANEOUS - Abstract
International audience
- Published
- 2018
27. Risk Prediction of Traumatic Brain Injury From Car Accidents
- Author
-
Seyed Saeed Ahmadisoleymani and Samy Missoum
- Subjects
medicine.medical_specialty ,Physical medicine and rehabilitation ,Traumatic brain injury ,business.industry ,medicine ,Engineering simulation ,medicine.disease ,business - Abstract
The purpose of this study is to build a risk model to predict the probability of Traumatic Brain Injury (TBI). The focus is on the occurrence of one TBI outcome, Diffuse Axonal Injury (DAI), due to car crashes. This goal is achieved by developing a multilevel framework, which includes vehicle crash Finite Element (FE) simulations with a dummy along with FE simulations of the brain using loading conditions derived from the crash simulations. The framework is used to propagate uncertainties and obtain probabilities of DAI based on injury criteria such as the Cumulative Strain Damage Measure (CSDM). The risk model is constructed from a support vector machine classifier, adaptive sampling, and Monte-Carlo simulations. In contrast to previous risk models, it includes the uncertainty of explicit parameters such as impact conditions (e.g., velocity, impact angle) and material properties of the brain model. This risk model can provide, for instance, the probability of DAI for a given assumed velocity.
- Published
- 2017
- Full Text
- View/download PDF
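The idea of propagating uncertain impact conditions and material properties to a probability of DAI can be sketched with a toy Monte Carlo risk model. The surrogate formula, threshold, and distributions below are invented for illustration and have no clinical meaning:

```python
import numpy as np

rng = np.random.default_rng(1)

def csdm(velocity, angle, stiffness):
    # Hypothetical surrogate for the Cumulative Strain Damage Measure
    return 0.02 * velocity * (1 + 0.3 * np.cos(angle)) / stiffness

def probability_of_dai(velocity, n=200_000, threshold=0.55):
    # Aleatory impact angle and uncertain brain material stiffness
    angle = rng.uniform(0, np.pi / 2, n)
    stiffness = rng.lognormal(mean=0.0, sigma=0.1, size=n)
    return np.mean(csdm(velocity, angle, stiffness) > threshold)

# The risk model returns a probability of DAI for an assumed velocity
p_low, p_high = probability_of_dai(20.0), probability_of_dai(40.0)
```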
28. Parameter estimation with correlated outputs using fidelity maps
- Author
-
Sylvain Lacaze and Samy Missoum
- Subjects
Adaptive sampling ,Estimation theory ,Mechanical Engineering ,media_common.quotation_subject ,Bayesian probability ,Aerospace Engineering ,Fidelity ,Estimator ,Ocean Engineering ,Statistical and Nonlinear Physics ,Parameter space ,Condensed Matter Physics ,Support vector machine ,Nuclear Energy and Engineering ,Statistics ,Algorithm ,Classifier (UML) ,Civil and Structural Engineering ,media_common ,Mathematics - Abstract
This paper introduces a new approach for parameter estimation and model update based on the notion of fidelity maps. Fidelity maps refer to the regions of the parameter space within which the discrepancy between computational and experimental data is below a user-defined threshold. It is shown that fidelity maps provide an efficient and rigorous approach to approximate likelihoods in the context of Bayesian update or maximum likelihood estimation. Fidelity maps are constructed explicitly in terms of the parameters and aleatory uncertainties using a Support Vector Machine (SVM) classifier. The approach has the advantage of handling numerous correlated responses, possibly discontinuous, without any assumption on the correlation structure. The construction of accurate fidelity map boundaries at a moderate computational cost is made possible through a dedicated adaptive sampling scheme. A simply supported plate with uncertainties in the boundary conditions is used to demonstrate the methodology. In this example, the construction of the fidelity map is based on several natural frequencies and mode shapes to be matched simultaneously. Various statistical estimators are derived from the map.
- Published
- 2014
- Full Text
- View/download PDF
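The fidelity-map idea above can be sketched numerically: the map is the region of the parameter space where the worst-case discrepancy between model outputs and experimental data stays below a threshold. The toy two-parameter model and target frequencies below are hypothetical, and a brute-force grid stands in for the paper's SVM boundary and adaptive sampling:

```python
import numpy as np

# Hypothetical "experimental" natural frequencies to be matched (Hz)
f_exp = np.array([12.0, 31.0])

# Toy two-parameter model: each stiffness parameter drives one frequency
k1, k2 = np.meshgrid(np.linspace(0.5, 3.0, 200), np.linspace(0.5, 3.0, 200))
f_model = np.stack([10.0 * np.sqrt(k1), 25.0 * np.sqrt(k2)])

# Fidelity map: region of (k1, k2) where the worst-case relative
# discrepancy over all matched responses stays below the threshold
discrepancy = np.abs(f_model - f_exp[:, None, None]) / f_exp[:, None, None]
inside = discrepancy.max(axis=0) < 0.05      # boolean map on the grid

# The map's volume fraction serves as an (unnormalized) likelihood proxy
volume_fraction = inside.mean()
```

Note how several correlated responses are handled at once simply by taking the worst-case discrepancy, with no assumption on their correlation structure.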
29. A Minimal Model of a Single-Reed Instrument Producing Quasi-Periodic Sounds
- Author
-
Samy Missoum, Jean-Baptiste Doc, Christophe Vergez, Sons, Laboratoire de Mécanique et d'Acoustique [Marseille] (LMA ), Aix Marseille Université (AMU)-École Centrale de Marseille (ECM)-Centre National de la Recherche Scientifique (CNRS)-Aix Marseille Université (AMU)-École Centrale de Marseille (ECM)-Centre National de la Recherche Scientifique (CNRS), University of Arizona, and Centre National de la Recherche Scientifique (CNRS)-Aix Marseille Université (AMU)-École Centrale de Marseille (ECM)-Centre National de la Recherche Scientifique (CNRS)-Aix Marseille Université (AMU)-École Centrale de Marseille (ECM)
- Subjects
Minimal model ,Support vector machine ,Engineering ,Acoustics and Ultrasonics ,business.industry ,Model parameters ,Quasi periodic ,business ,Classifier (UML) ,Algorithm ,Music ,[PHYS.MECA.ACOU]Physics [physics]/Mechanics [physics]/Acoustics [physics.class-ph] - Abstract
International audience; Single-reed instruments can produce multiphonic sounds when they generate quasi-periodic oscillations. The aim of this article is to identify a minimal model of a single-reed instrument producing quasi-periodic oscillations. To better understand the influence of model parameters on the production of quasi-periodic regimes, the mapping between parameters and quasi-periodic regimes is explicitly identified using a support vector machine (SVM) classifier. SVMs enable the construction of boundaries between quasi-periodic and periodic regimes that are explicitly defined in terms of the parameters. Results and conclusions obtained from the numerical model are compared to published experiments related to the production of quasi-periodic oscillations with an alto saxophone. This qualitative comparison highlights the influence of key parameters on the production of multiphonic sounds.
- Published
- 2014
- Full Text
- View/download PDF
30. A generalized 'max-min' sample for surrogate update
- Author
-
Samy Missoum and Sylvain Lacaze
- Subjects
Sampling scheme ,Mathematical optimization ,Control and Optimization ,Adaptive sampling ,Optimization problem ,Computer Graphics and Computer-Aided Design ,Computer Science Applications ,Control and Systems Engineering ,Joint probability distribution ,Norm (mathematics) ,Applied mathematics ,Random variable ,Software ,Mathematics - Abstract
This brief note describes the generalization of the “max-min” sample that was originally used in the update of approximated feasible or failure domains. The generalization stems from the use of the random variables' joint distribution in the sampling scheme. In addition, this note proposes a numerical improvement of the max-min optimization problem through the use of the Chebyshev norm.
- Published
- 2013
- Full Text
- View/download PDF
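A rough sketch of the generalized criterion described in this note: among random candidates, pick the one maximizing the product of the joint probability density and the Chebyshev-norm distance to the closest existing sample. The standard-normal joint density and candidate box below are assumptions for illustration:

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(2)

def generalized_max_min(existing, n_candidates=5000):
    # Candidate pool over the sampling box (assumed here to be [-3, 3]^d)
    candidates = rng.uniform(-3, 3, size=(n_candidates, existing.shape[1]))
    # Chebyshev (infinity-norm) distance to the closest existing sample
    d = np.abs(candidates[:, None, :] - existing[None, :, :]).max(axis=2).min(axis=1)
    # Weight the max-min distance by the joint pdf of the random variables
    # (independent standard normals assumed for this sketch)
    pdf = norm.pdf(candidates).prod(axis=1)
    return candidates[np.argmax(pdf * d)]

existing = rng.normal(size=(20, 2))
new_sample = generalized_max_min(existing)
```

The pdf weighting steers new samples toward probable regions instead of spreading them uniformly over the box.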
31. Stochastic Optimization of Nonlinear Energy Sinks Using Resonance-Based Clustering
- Author
-
Ethan Boroson and Samy Missoum
- Subjects
Optimal design ,Work (thermodynamics) ,Engineering ,Nonlinear system ,Control theory ,business.industry ,Tuned mass damper ,Stochastic optimization ,Dissipation ,business ,Cluster analysis ,Computer Science::Databases ,Energy (signal processing) - Abstract
Nonlinear energy sinks (NESs) are promising devices for achieving passive vibration mitigation. Unlike traditional tuned mass dampers (TMDs), NESs, characterized by nonlinear stiffness properties, are not tuned to specific frequencies and absorb energy over a wider range of frequencies. NES efficiency is achieved through time-limited resonances, leading to the capture and dissipation of energy. However, the efficiency with which a NES dissipates energy is highly dependent on design parameters and loading conditions. In fact, it has been shown that a NES can exhibit a near-discontinuous efficiency. Thus, NES optimal design must account for uncertainty. The premise of the stochastic optimization method proposed is the segregation of efficiency regions separated by discontinuities in potentially high dimensional space. Clustering, support vector machine classification, and dedicated adaptive sampling constitute the basic techniques for maximizing the expected value of NES efficiency. Previous works depended solely on the ratio of energy dissipated by the NES for clustering. This work also includes information about the type of m:p resonances present. Three examples of optimization for the maximization of the expected value of efficiency for NESs subjected to transient loading are presented. The optimization accounts for both design variables with uncertainty and aleatory variables to characterize loading. Copyright © 2016 by ASME.
- Published
- 2016
- Full Text
- View/download PDF
32. Reliability Analysis in the Presence of Aleatory and Epistemic Uncertainties, Application to the Prediction of a Launch Vehicle Fallout Zone
- Author
-
Sylvain Lacaze, Loïc Brevault, Samy Missoum, and Mathieu Balesdent
- Subjects
Engineering ,business.industry ,Mechanical Engineering ,0211 other engineering and technologies ,02 engineering and technology ,01 natural sciences ,Computer Graphics and Computer-Aided Design ,Computer Science Applications ,Reliability engineering ,010101 applied mathematics ,Mechanics of Materials ,Launch vehicle ,0101 mathematics ,business ,Reliability (statistics) ,021106 design practice & management - Abstract
The design of complex systems often requires reliability assessments involving a large number of uncertainties and the estimation of low failure probabilities (on the order of 10⁻⁴). Estimating such rare event probabilities with crude Monte Carlo (CMC) is computationally intractable. Specific numerical methods, such as importance sampling or subset simulation, have been developed to reduce the computational cost and the variance of the estimate. However, these methods assume that the uncertainties are defined within the probability formalism. Regarding epistemic uncertainties, the interval formalism is particularly adapted when only their definition domain is known. In this paper, a method is derived to assess the reliability of a system with uncertainties described by both probability and interval frameworks. It allows one to determine the bounds of the failure probability and involves a sequential approach using subset simulation, kriging, and an optimization process. To reduce the simulation cost, a refinement strategy of the surrogate model is proposed taking into account the presence of both aleatory and epistemic uncertainties. The method is compared to existing approaches on an analytical example as well as on a launch vehicle fallout zone estimation problem.
- Published
- 2016
- Full Text
- View/download PDF
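The mixed probability/interval formulation above can be illustrated with a crude sketch: sweep the epistemic parameter over its interval and bound the failure probability by the extremes of the resulting estimates. Plain Monte Carlo on a toy limit state replaces the paper's subset simulation, kriging surrogate, and optimization loop:

```python
import numpy as np

rng = np.random.default_rng(3)

def limit_state(x, theta):
    # Failure when g < 0; theta is the epistemic (interval) parameter
    return 4.0 + theta - x.sum(axis=1)

def failure_probability(theta, n=200_000):
    x = rng.normal(size=(n, 2))          # aleatory standard-normal variables
    return np.mean(limit_state(x, theta) < 0)

# Epistemic parameter only known to lie in [-1, 1]: bound Pf over it
thetas = np.linspace(-1.0, 1.0, 21)
pf = np.array([failure_probability(t) for t in thetas])
pf_lower, pf_upper = pf.min(), pf.max()
```

For rare events (Pf on the order of 10⁻⁴), the crude Monte Carlo inner loop is exactly what becomes intractable, which motivates the subset simulation and surrogate refinement strategy of the paper.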
33. Methods for high-dimensional and computationally intensive models
- Author
-
Sylvain Lacaze, Loïc Brevault, Samy Missoum, Mathieu Balesdent, and Jérôme Morio
- Subjects
Surrogate model ,Computer science ,Kriging ,Morris method ,Context (language use) ,Sobol sequence ,Sensitivity (control systems) ,Data mining ,computer.software_genre ,computer ,Metamodeling ,Event (probability theory) - Abstract
Complex simulation codes such as the ones used in aerospace industry are often computationally expensive and involve a large number of variables. These features significantly hamper the estimation of rare event probabilities. To reduce the computational burden, an analysis of the most important variables of the problem can be performed before applying rare event estimation methods. Another way to reduce this burden is to build a surrogate model of the computationally costly simulation code and to perform the probability estimation on this metamodel. In this chapter, we first review the main techniques used in sensitivity analysis and then describe several surrogate models that are efficient in the probability estimation context.
- Published
- 2016
- Full Text
- View/download PDF
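The sensitivity-analysis step discussed in this chapter can be sketched with a pick-freeze (Saltelli-type) estimator of first-order Sobol indices, used to rank variables before rare-event estimation. The additive toy model is a hypothetical stand-in for an expensive simulation code:

```python
import numpy as np

rng = np.random.default_rng(4)

def model(x):
    # Toy additive model: x0 dominant, x1 moderate, x2 nearly inert
    return x[:, 0] + 0.5 * x[:, 1] ** 2 + 0.1 * x[:, 2]

def first_order_sobol(f, dim, n=100_000):
    # Pick-freeze estimator: S_i = Cov(f(A), f(B with column i from A)) / Var(f(A))
    a = rng.uniform(-1, 1, (n, dim))
    b = rng.uniform(-1, 1, (n, dim))
    ya = f(a)
    s = np.empty(dim)
    for i in range(dim):
        ab = b.copy()
        ab[:, i] = a[:, i]          # "freeze" variable i
        s[i] = np.cov(ya, f(ab))[0, 1] / ya.var()
    return s

s = first_order_sobol(model, 3)
```

Variables with negligible indices can then be fixed, shrinking the dimension of the surrogate used for probability estimation.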
34. Optimization Under Uncertainty of Parallel Nonlinear Energy Sinks
- Author
-
Ethan Boroson and Samy Missoum
- Subjects
Physics ,Mathematical optimization ,Nonlinear system ,0103 physical sciences ,0211 other engineering and technologies ,02 engineering and technology ,010301 acoustics ,01 natural sciences ,Energy (signal processing) ,021106 design practice & management - Published
- 2016
- Full Text
- View/download PDF
35. Parallel construction of explicit boundaries using support vector machines
- Author
-
Anirban Basudhar, Samy Missoum, and Ke Lin
- Subjects
Theoretical computer science ,Speedup ,Computer science ,Reliability (computer networking) ,General Engineering ,Binary number ,Boundary (topology) ,Parallel computing ,Computer Science Applications ,Support vector machine ,Automatic parallelization ,Range (mathematics) ,Computational Theory and Mathematics ,Parallel processing (DSP implementation) ,Software - Abstract
Purpose – The purpose of this paper is to present a study of the parallelization of the construction of explicit constraints or limit‐state functions using support vector machines. These explicit boundaries have proven to be beneficial for design optimization and reliability assessment, especially for problems with large computational times, discontinuities, or binary outputs. In addition to the study of the parallelization, the objective of this article is also to provide an approach to select the number of processors.Design/methodology/approach – This article investigates the parallelization in two ways. First, the efficiency of the parallelization is assessed by comparing, over several runs, the number of iterations needed to create an accurate boundary to the number of iterations associated with a theoretical “linear” speedup. Second, by studying these differences, an “appropriate” range of parallel processors can be inferred.Findings – The parallelization of the construction of explicit boundaries ca...
- Published
- 2012
- Full Text
- View/download PDF
36. Constrained efficient global optimization with support vector machines
- Author
-
Anirban Basudhar, Samy Missoum, Christoph Dribusch, and Sylvain Lacaze
- Subjects
Mathematical optimization ,Control and Optimization ,Optimization problem ,Probabilistic logic ,Constrained optimization ,Boundary (topology) ,Function (mathematics) ,Computer Graphics and Computer-Aided Design ,Computer Science Applications ,Support vector machine ,Vector optimization ,Control and Systems Engineering ,Global optimization ,Software ,Mathematics - Abstract
This paper presents a methodology for constrained efficient global optimization (EGO) using support vector machines (SVMs). While the objective function is approximated using Kriging, as in the original EGO formulation, the boundary of the feasible domain is approximated explicitly as a function of the design variables using an SVM. Because SVM is a classification approach and does not involve response approximations, this approach alleviates issues due to discontinuous or binary responses. More importantly, several constraints, even correlated, can be represented using one unique SVM, thus considerably simplifying constrained problems. In order to account for constraints, this paper introduces an SVM-based "probability of feasibility" using a new Probabilistic SVM model. The proposed optimization scheme consists of two stages. In the first stage, a global search for the optimal solution is performed based on the "expected improvement" of the objective function and the probability of feasibility. In the second stage, the SVM boundary is locally refined using an adaptive sampling scheme. An unconstrained and a constrained formulation of the optimization problem are presented and compared. Several analytical examples are used to test the formulations. In particular, a problem with 99 constraints and an aeroelasticity problem with binary output are presented. Overall, the results indicate that the constrained formulation is more robust and efficient.
- Published
- 2012
- Full Text
- View/download PDF
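The two ingredients of the acquisition used in this constrained EGO scheme can be sketched directly: the standard expected improvement from a kriging surrogate, multiplied by a probability of feasibility derived from a signed distance to the SVM boundary. The toy surrogate mean/std, the sigmoid scaling, and the linear boundary distance below are all assumptions for illustration, not the paper's Probabilistic SVM model:

```python
import numpy as np
from scipy.stats import norm

def expected_improvement(mu, sigma, f_best):
    # Standard EI for minimization, given the kriging mean and std
    sigma = np.maximum(sigma, 1e-12)
    z = (f_best - mu) / sigma
    return (f_best - mu) * norm.cdf(z) + sigma * norm.pdf(z)

def probability_of_feasibility(svm_distance, a=3.0):
    # Sigmoid mapping a signed SVM distance (positive = feasible side)
    # to a probability -- a crude stand-in for a probabilistic SVM
    return 1.0 / (1.0 + np.exp(-a * svm_distance))

# Acquisition over a 1D grid of candidate designs (toy surrogate values)
x = np.linspace(0, 1, 101)
mu = (x - 0.3) ** 2                        # kriging posterior mean
sigma = 0.05 * (1 + np.sin(6 * x) ** 2)    # kriging posterior std
dist = x - 0.5                             # signed distance to the boundary
acq = expected_improvement(mu, sigma, f_best=0.2) * probability_of_feasibility(dist)
x_next = x[np.argmax(acq)]
```

The product pulls the next sample away from the unconstrained optimum of the mean (x = 0.3, infeasible side) toward the constraint boundary.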
37. A multifidelity approach for the construction of explicit decision boundaries: application to aeroelasticity
- Author
-
Samy Missoum, Philip S. Beran, and Christoph Dribusch
- Subjects
Scheme (programming language) ,Engineering ,Mathematical optimization ,Control and Optimization ,business.industry ,media_common.quotation_subject ,Boundary (topology) ,Fidelity ,Control engineering ,Construct (python library) ,Aeroelasticity ,Computer Graphics and Computer-Aided Design ,Computer Science Applications ,Support vector machine ,Control and Systems Engineering ,Decision boundary ,business ,Engineering design process ,computer ,Software ,Mathematics ,Envelope (motion) ,computer.programming_language ,media_common - Abstract
This paper presents a multifidelity approach for the construction of explicit decision boundaries (constraints or limit-state functions) using support vector machines. A lower fidelity model is used to select specific samples to construct the decision boundary corresponding to a higher fidelity model. This selection is based on two schemes. The first scheme selects samples within an envelope constructed from the lower fidelity model. The second technique is based on the detection of regions of inconsistencies between the lower and the higher fidelity decision boundaries. The approach is applied to analytical examples as well as an aeroelasticity problem for the construction of a nonlinear flutter boundary.
- Published
- 2010
- Full Text
- View/download PDF
38. An improved adaptive sampling scheme for the construction of explicit boundaries
- Author
-
Anirban Basudhar and Samy Missoum
- Subjects
Scheme (programming language) ,Mathematical optimization ,Control and Optimization ,Adaptive sampling ,Sample (statistics) ,Computer Graphics and Computer-Aided Design ,Computer Science Applications ,Support vector machine ,Control and Systems Engineering ,Kernel (statistics) ,Convergence (routing) ,Limit state design ,Engineering design process ,computer ,Algorithm ,Software ,Mathematics ,computer.programming_language - Abstract
This article presents an improved adaptive sampling scheme for the construction of explicit decision functions (constraints or limit state functions) using Support Vector Machines (SVMs). The proposed work presents substantial modifications to an earlier version of the scheme (Basudhar and Missoum, Comput Struct 86(19–20):1904–1917, 2008). The improvements consist of a different choice of samples, a more rigorous convergence criterion, and a new technique to select the SVM kernel parameters. Of particular interest is the choice of a new sample chosen to remove the “locking” of the SVM, a phenomenon that was not understood in the previous version of the algorithm. The new scheme is demonstrated on analytical problems of up to seven dimensions.
- Published
- 2010
- Full Text
- View/download PDF
39. Reliability-Based Design Optimization of Nonlinear Aeroelasticity Problems
- Author
-
Samy Missoum, Christoph Dribusch, and Philip S. Beran
- Subjects
Nonlinear system ,Control theory ,Limit cycle ,Aerospace Engineering ,Design process ,Flutter ,Aeroelasticity ,Finite element method ,Mathematics ,Sequential quadratic programming ,Nonlinear programming - Abstract
This paper introduces a methodology for the reliability-based design optimization (RBDO) of nonlinear aeroelastic problems. It is based on the construction of explicit flutter and subcritical limit cycle oscillations (LCO) boundaries in terms of the design variables. The boundaries, generated using a Support Vector Machine (SVM), can then be used to efficiently evaluate probabilities of failure and solve an RBDO problem. Test results are presented demonstrating the construction of flutter boundaries as well as LCO boundaries for problems with structural nonlinearities. The solution of an example of RBDO problem is also provided.
- Published
- 2010
- Full Text
- View/download PDF
40. Simulation and probabilistic failure prediction of grafts for aortic aneurysm
- Author
-
Ron Layman, Jonathan P. Vande Geest, and Samy Missoum
- Subjects
Engineering ,Aorta ,business.industry ,General Engineering ,Probabilistic logic ,medicine.disease ,Finite element method ,Computer Science Applications ,Support vector machine ,Aortic aneurysm ,Aneurysm ,Computational Theory and Mathematics ,medicine.artery ,medicine ,Limit state design ,business ,Algorithm ,Software ,Simulation ,Parametric statistics - Abstract
Purpose – The use of stent-grafts to canalize aortic blood flow for patients with aortic aneurysms is subject to serious failure mechanisms such as a leak between the stent-graft and the aorta (Type I endoleak). The purpose of this paper is to describe a novel computational approach to understand the influence of relevant variables on the occurrence of stent-graft failure and quantify the probability of failure for aneurysm patients. Design/methodology/approach – A parameterized fluid-structure interaction finite element model of aortic aneurysm is built based on a multi-material formulation available in LS-DYNA. Probabilities of failure are assessed using an explicit construction of limit state functions with support vector machines (SVM) and uniform designs of experiments. The probabilistic approach is applied to two aneurysm geometries to provide a map of probabilities of failure for various design parameter values. Findings – Parametric studies conducted in the course of this research successfully identified intuitive failure regions in the parameter space, and failure probabilities were calculated using both a simplified and more complex aneurysmal geometry. Originality/value – This research introduces the use of SVM-based explicit design space decomposition for probabilistic assessment applied to bioengineering problems. This technique allows one to efficiently calculate probabilities of failure. It is particularly suited for problems where outcomes can only be classified as safe or failed (e.g. leak or no leak). Finally, the proposed fluid-structure interaction simulation accounts for the initiation of Type I endoleak between the graft and the aneurysm due to simultaneous fluid and solid forces.
- Published
- 2010
- Full Text
- View/download PDF
41. Three Dimensional Active Contours for the Reconstruction of Abdominal Aortic Aneurysms
- Author
-
Avinash Ayyalasomayajula, Anirban Basudhar, Samy Missoum, Andrew Polk, Lavi Nissim, and Jonathan P. Vande Geest
- Subjects
Aorta ,Computer science ,3D reconstruction ,Models, Cardiovascular ,Biomedical Engineering ,Lumen (anatomy) ,medicine.disease ,Abdominal aortic aneurysm ,Aortic aneurysm ,Imaging, Three-Dimensional ,Aneurysm ,medicine.artery ,cardiovascular system ,medicine ,Humans ,Segmentation ,Aorta, Abdominal ,cardiovascular diseases ,Tomography ,Tomography, X-Ray Computed ,Aortic Aneurysm, Abdominal ,Biomedical engineering - Abstract
An aneurysm is a gradual and progressive ballooning of a blood vessel due to wall degeneration. Rupture of abdominal aortic aneurysm (AAA) constitutes a significant portion of deaths in the US. In this study, we describe a technique to reconstruct AAA geometry from CT images in an inexpensive and streamlined fashion. A 3D reconstruction technique was implemented with a GUI interface in MATLAB using the active contours technique. The lumen and the thrombus of the AAA were segmented individually in two separate protocols and were then joined together into a hybrid surface. This surface was then used to obtain the aortic wall. This method can deal with very poor contrast images where the aortic wall is indistinguishable from the surrounding features. Data obtained from the segmentation of image sets were smoothed in 3D using a Support Vector Machine technique. The segmentation method presented in this paper is inexpensive and has minimal user-dependency in reconstructing AAA geometry (lumen and wall) from patient image sets. The AAA model generated using this segmentation algorithm can be used to study a variety of biomechanical issues remaining in AAA biomechanics including stress estimation, endovascular stent-graft performance, and local drug delivery studies.
- Published
- 2009
- Full Text
- View/download PDF
42. Adaptive explicit decision functions for probabilistic design and optimization using support vector machines
- Author
-
Anirban Basudhar and Samy Missoum
- Subjects
Weighted sum model ,Decision support system ,Mechanical Engineering ,Constrained optimization ,Decision tree ,Computer Science Applications ,Support vector machine ,Modeling and Simulation ,Decision boundary ,General Materials Science ,Probabilistic design ,Algorithm ,Civil and Structural Engineering ,Analytic function ,Mathematics - Abstract
This article presents a methodology to generate explicit decision functions using support vector machines (SVM). A decision function is defined as the boundary between two regions of a design space (e.g., an optimization constraint or a limit-state function in reliability). The SVM-based decision function, which is initially constructed based on a design of experiments, depends on the amount and quality of the training data used. For this reason, an adaptive sampling scheme that updates the decision function is proposed. An accurate approximation of the explicit decision function is obtained with a reduced number of function evaluations. Three problems are presented to demonstrate the efficiency of the update scheme to explicitly reconstruct known analytical decision functions. The chosen functions are the boundaries of disjoint regions of the design space. A convergence criterion and error measure are proposed. The scheme is also applied to the definition of an explicit failure region boundary in the case of the buckling of a geometrically nonlinear arch.
- Published
- 2008
- Full Text
- View/download PDF
43. Limit state function identification using Support Vector Machines for discontinuous responses and disjoint failure domains
- Author
-
Antonio Harrison Sanchez, Anirban Basudhar, and Samy Missoum
- Subjects
Mathematical optimization ,Mechanical Engineering ,Reliability (computer networking) ,Aerospace Engineering ,Ocean Engineering ,Statistical and Nonlinear Physics ,Disjoint sets ,Classification of discontinuities ,Condensed Matter Physics ,Support vector machine ,Discontinuity (linguistics) ,Identification (information) ,Nuclear Energy and Engineering ,Design process ,Limit state design ,Algorithm ,Civil and Structural Engineering ,Mathematics - Abstract
This article presents a method for the explicit construction of limit state functions using Support Vector Machines (SVM). Specifically, the approach aims at handling the difficulties associated with the reliability assessment of problems exhibiting discontinuous responses and disjoint failure domains. The SVM-based explicit construction of limit state functions allows for an easy calculation of a probability of failure and enables the association of a specific system behavior with a region of the design space. The explicit limit state function can then be used within a reliability-based design optimization (RBDO) problem. Two problems are presented to demonstrate the successful application of the developed method for explicit construction of limit state function and reliability-based optimum design.
- Published
- 2008
- Full Text
- View/download PDF
44. Probabilistic optimal design in the presence of random fields
- Author
-
Samy Missoum
- Subjects
Mathematical optimization ,Control and Optimization ,Random field ,Multivariate random variable ,Random function ,Random element ,Computer Graphics and Computer-Aided Design ,Computer Science Applications ,Random variate ,Control and Systems Engineering ,Stochastic simulation ,Sum of normally distributed random variables ,Random compact set ,Algorithm ,Software ,Mathematics - Abstract
This article describes a methodology to incorporate a random field in a probabilistic optimization problem. The approach is based on the extraction of the features of a random field using a reduced number of experimental observations. This is achieved by proper orthogonal decomposition. Using Lagrange interpolation, a modified random field is obtained by changing the contribution of each feature. The contributions are controlled using scalar parameters, which can be considered as random variables. This allows one to perform a random-field-based probabilistic optimization with few random variables. The methodology is demonstrated on a tube impacting a rigid wall for which a random field modifies the planarity of the tube’s wall.
- Published
- 2007
- Full Text
- View/download PDF
45. A convex hull approach for the reliability-based design optimization of nonlinear transient dynamic problems
- Author
-
Samy Missoum, Raphael T. Haftka, and Palaniappan Ramu
- Subjects
Convex hull ,Optimal design ,Mechanical Engineering ,Computational Mechanics ,General Physics and Astronomy ,Computer Science Applications ,Nonlinear programming ,Dynamic programming ,Nonlinear system ,Dynamic problem ,Mechanics of Materials ,Control theory ,Design process ,Transient response ,Algorithm ,Mathematics - Abstract
Nonlinear problems such as transient dynamic problems exhibit structural responses that can be discontinuous due to numerous bifurcations, which hinders gradient-based or response-surface-based optimization. This paper proposes a novel approach that splits the design space into regions where the response is continuous, making traditional optimization viable. A convex hull approach is adopted to isolate the points corresponding to unwanted bifurcations in the design space. The proposed approach is applied to a tube impacting a rigid wall, a representative transient dynamic problem. Since the nonlinear behavior is highly sensitive to small variations in the design, reliability-based design optimization is performed. The proposed method provides the designer with an optimal design exhibiting a prescribed dynamic behavior.
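The hull-based filtering can be sketched as follows (illustrative only; the 2D design space and the labeled "bifurcation" designs are synthetic): designs flagged with the unwanted behavior are enclosed in a convex hull, and candidate designs are checked against it so the optimization stays in the region of continuous response:

```python
import numpy as np
from scipy.spatial import Delaunay

rng = np.random.default_rng(2)

# Hypothetical designs (normalized variables) flagged as exhibiting the
# unwanted bifurcation, e.g. from a transient dynamic simulation campaign.
bad = rng.uniform(0.4, 0.6, size=(30, 2))

# Delaunay triangulation of the points; its union of simplices is the convex hull.
tri = Delaunay(bad)

def in_bad_region(pts):
    # A point lies inside the convex hull iff it falls in some simplex.
    return tri.find_simplex(pts) >= 0

candidates = np.array([[0.5, 0.5], [0.9, 0.1]])
print(in_bad_region(candidates))   # first candidate inside, second outside
```

A candidate flagged as inside the hull is excluded (or penalized) during optimization, keeping gradient-based methods on well-behaved territory.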
- Published
- 2007
- Full Text
- View/download PDF
46. Fusion of Clinical and Stochastic Finite Element Data for Hip Fracture Risk Prediction
- Author
-
Samy Missoum, Zhao Chen, and Peng Jiang
- Subjects
Engineering ,Aging ,Support Vector Machine ,Finite Element Analysis ,Biomedical Engineering ,Biophysics ,computer.software_genre ,Risk Assessment ,Article ,Risk Factors ,Statistics ,medicine ,Humans ,Orthopedics and Sports Medicine ,Computer Simulation ,Femur ,Aged ,Hip fracture ,Stochastic Processes ,Models, Statistical ,Stochastic process ,business.industry ,Hip Fractures ,Rehabilitation ,Middle Aged ,medicine.disease ,Sensor fusion ,Finite element method ,Biomechanical Phenomena ,Support vector machine ,ROC Curve ,Metric (mathematics) ,Female ,Data mining ,business ,Risk assessment ,computer ,Curse of dimensionality - Abstract
Hip fracture affects more than 250,000 people in the US and 1.6 million worldwide per year. With an aging population, the development of reliable fracture risk models is therefore of prime importance. Due to the complexity of the hip fracture phenomenon, the use of clinical data only, as is traditionally done, might not be sufficient to ensure an accurate and robust hip fracture prediction model. In order to increase the predictive ability of the risk model, the authors propose to supplement the clinical data with computational data from finite element models. The fusion of the two types of data is performed using deterministic and stochastic computational data. In the latter case, uncertainties in the loading and material properties of the femur are accounted for and propagated through the finite element model. The predictive capability of a support vector machine (SVM) risk model constructed by combining clinical and finite element data was assessed using a Women's Health Initiative (WHI) dataset. The dataset includes common factors such as age and BMD as well as geometric factors obtained from DXA imaging. The fusion of computational and clinical data systematically leads to an increase in the predictive ability of the SVM risk model as measured by the AUC metric. It is concluded that the largest gains in AUC are obtained by the stochastic approach. This gain decreases as the dimensionality of the problem increases: a 5.3% AUC improvement was achieved for a 9-dimensional problem involving geometric factors and weight, while a 1.3% increase was obtained for a 20-dimensional case including geometric and conventional factors.
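A toy sketch of the fusion idea on synthetic data (the feature construction and SVM settings are assumptions, not the study's pipeline): adding a finite-element-derived feature to weakly informative clinical features raises the held-out AUC:

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(4)
n = 2000

clinical = rng.standard_normal((n, 2))   # stand-ins for e.g. age, BMD (standardized)
fe = rng.standard_normal((n, 1))         # stand-in for a stochastic FE fracture metric

# Synthetic outcome: fracture risk depends on clinical AND computational features.
logit = clinical[:, 0] - clinical[:, 1] - 2.0 * fe[:, 0]
y = rng.random(n) < 1.0 / (1.0 + np.exp(-logit))

Xc, Xf = clinical, np.hstack([clinical, fe])
tr, te = slice(0, 1500), slice(1500, None)

# Same SVM classifier, with and without the FE-derived feature.
auc_c = roc_auc_score(y[te], SVC().fit(Xc[tr], y[tr]).decision_function(Xc[te]))
auc_f = roc_auc_score(y[te], SVC().fit(Xf[tr], y[tr]).decision_function(Xf[te]))
print(f"AUC clinical only: {auc_c:.3f}, clinical + FE: {auc_f:.3f}")
```

The gain mirrors the paper's finding in spirit only; in the actual study the computational feature comes from (stochastic) finite element simulations of the femur, not a random draw.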
- Published
- 2015
47. A sampling-based RBDO algorithm with local refinement and efficient gradient estimation
- Author
-
Lacaze, S., Samy Missoum, Brevault, L., and Balesdent, M.
- Abstract
This article describes a two-stage Reliability-Based Design Optimization (RBDO) algorithm. The first stage consists of solving an approximated RBDO problem using meta-models. In order to use gradient-based techniques, the sensitivities of the failure probabilities are derived with respect to the hyperparameters of the random variables as well as, as a novel contribution, the deterministic variables. The second stage focuses on the local refinement of the meta-models around the first-stage solution using generalized “max-min” samples. The approach is demonstrated on three examples, including a crashworthiness problem with 11 random variables and 10 probabilistic constraints.
- Published
- 2015
- Full Text
- View/download PDF
48. Probability of failure sensitivity with respect to decision variables
- Author
-
Sylvain Lacaze, Mathieu Balesdent, Samy Missoum, Loïc Brevault, University of Arizona, ONERA - The French Aerospace Lab [Palaiseau], and ONERA-Université Paris Saclay (COmUE)
- Subjects
Mathematical optimization ,Control and Optimization ,Monte Carlo method ,Dirac (software) ,[INFO.INFO-DS]Computer Science [cs]/Data Structures and Algorithms [cs.DS] ,Estimator ,Computer Graphics and Computer-Aided Design ,Computer Science Applications ,PROBABILITY ,Distribution (mathematics) ,Indicator function ,Control and Systems Engineering ,SUBSET SAMPLING ,Failure domain ,Subset simulation ,Marginal distribution ,SENSITIVITY ,Software ,Mathematics - Abstract
This note introduces a derivation of the sensitivities of a probability of failure with respect to decision variables. For instance, the gradient of the probability of failure with respect to deterministic design variables might be needed in RBDO. These sensitivities might also be useful for Uncertainty-based Multidisciplinary Design Optimization. The difficulty stems from the dependence of the failure domain on variations of the decision variables. This dependence leads to a derivative of the indicator function in the form of a Dirac distribution in the expression of the sensitivities. Based on an approximation of the Dirac, an estimator of the sensitivities is analytically derived, first for Crude Monte Carlo and then for Subset Simulation. The choice of the Dirac approximation is discussed.
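In a simplified 1D case (chosen here for illustration; the note's estimator is more general), the smoothed-Dirac estimator can be checked against a closed form. With limit state g(x, θ) = θ − x, x ~ N(0, 1), and failure when g &lt; 0, one has Pf = 1 − Φ(θ) and dPf/dθ = −φ(θ):

```python
import numpy as np

rng = np.random.default_rng(3)
N, h = 1_000_000, 0.05           # sample size and Dirac-smoothing bandwidth
x = rng.standard_normal(N)

theta = 1.0                      # deterministic decision variable
g = theta - x                    # limit state: failure when g < 0

pf = (g < 0).mean()              # crude Monte Carlo probability of failure

# d/dtheta I(g<0) = -delta(g) * dg/dtheta; replace delta by a narrow Gaussian kernel.
kernel = np.exp(-0.5 * (g / h)**2) / (h * np.sqrt(2 * np.pi))
dpf_dtheta = -kernel.mean()

# Analytical reference: Pf = 1 - Phi(theta), dPf/dtheta = -phi(theta).
phi = np.exp(-0.5 * theta**2) / np.sqrt(2 * np.pi)
print(f"MC dPf/dtheta = {dpf_dtheta:.4f}, exact = {-phi:.4f}")
```

The bandwidth h trades bias (too wide) against variance (too narrow), which is the choice the note discusses.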
- Published
- 2015
- Full Text
- View/download PDF
49. Study of a new local update scheme for cellular automata in structural design
- Author
-
Samy Missoum, Zafer Gürdal, and Shahriar Setoodeh
- Subjects
Mathematical optimization ,Control and Optimization ,Series (mathematics) ,Topology optimization ,Jacobi method ,Truss ,Computer Graphics and Computer-Aided Design ,Cellular automaton ,Computer Science Applications ,symbols.namesake ,Local analysis ,Control and Systems Engineering ,Convergence (routing) ,symbols ,Engineering design process ,Algorithm ,Software ,Mathematics - Abstract
This paper investigates an improved local update scheme for cellular automata (CA) applied to structural design. Local analysis and design rules are derived for equilibrium and minimum compliance design. The new update scheme consists of repeating the analysis and optimality-based design rules locally. The benefits of this approach are demonstrated through a series of systematic experiments on truss topology design problems of various sizes, using both the Gauss–Seidel and the Jacobi iteration modes. The experiments show the robust convergence of the approach compared to an earlier CA implementation. The approach is also extended to a plate problem.
- Published
- 2004
- Full Text
- View/download PDF
50. Convergence analysis for cellular automata applied to truss design
- Author
-
Douglas J. Slotta, Brian Tatting, Zafer Gürdal, Layne T. Watson, and Samy Missoum
- Subjects
Computer science ,Continuous automaton ,MathematicsofComputing_NUMERICALANALYSIS ,General Engineering ,Stability (learning theory) ,Truss ,Cellular automaton ,Computer Science Applications ,Domain (software engineering) ,Computational Theory and Mathematics ,Scalability ,Convergence (routing) ,Algorithm ,Software ,Block (data storage) - Abstract
Traditional parallel methods for structural design, as well as modern preconditioned iterative linear solvers, do not scale well. This paper discusses the application of massively scalable cellular automata (CA) techniques to structural design, specifically trusses. There are two sets of CA rules, one used to propagate stresses and strains, and one to perform design updates. These rules can be applied serially, periodically, or concurrently, and Jacobi or Gauss‐Seidel style updating can be done. These options are compared with respect to convergence, speed, and stability for an example problem of combined sizing and topology design of truss domain structures. The central theme of the paper is that the cellular automaton paradigm is tantamount to classical block Jacobi or block Gauss‐Seidel iteration, and consequently the performance of a cellular automaton can be rigorously analyzed and predicted.
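The equivalence can be illustrated on a fixed-fixed chain of unit springs (a toy stand-in for a truss; the sweep counts depend on the assumed size n): the local CA update is exactly a Jacobi or Gauss–Seidel sweep, so classical iteration theory, where the Gauss–Seidel spectral radius is the square of the Jacobi one, predicts Gauss–Seidel needing roughly half as many sweeps:

```python
import numpy as np

n = 20                                               # free nodes, fixed ends
A = 2*np.eye(n) - np.eye(n, k=1) - np.eye(n, k=-1)   # unit-stiffness chain
f = np.ones(n)                                       # unit nodal loads
u_exact = np.linalg.solve(A, f)

def sweep(u, mode):
    # One CA pass: each cell recomputes its displacement from its neighbors.
    out = u.copy()
    src = out if mode == "gauss-seidel" else u       # GS reuses fresh left values
    for i in range(n):
        left = src[i - 1] if i > 0 else 0.0          # fixed boundary at both ends
        right = u[i + 1] if i < n - 1 else 0.0
        out[i] = 0.5 * (left + right + f[i])         # local equilibrium rule
    return out

counts = {}
for mode in ("jacobi", "gauss-seidel"):
    u = np.zeros(n)
    for k in range(10_000):
        u = sweep(u, mode)
        if np.max(np.abs(u - u_exact)) < 1e-8:
            break
    counts[mode] = k + 1
print(counts)   # Gauss-Seidel converges in roughly half the sweeps of Jacobi
```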
- Published
- 2002
- Full Text
- View/download PDF