41 results for "Bayes' factor"
Search Results
2. On‐Line Warning System for Pipe Burst Using Bayesian Dynamic Linear Models.
- Author
-
Henriques‐Silva, Renato, Duchesne, Sophie, St‐Gelais, Nicolas Fortin, Saran, Naysan, and Schmidt, Alexandra M.
- Subjects
DYNAMIC models, OUTLIER detection, TIME series analysis, MISSING data (Statistics), WATER distribution
- Abstract
Pipe breaks are a recurrent problem in water distribution networks, and detecting them quickly is crucial to minimize the economic and environmental costs for municipalities. This study presents a burst detection methodology applying Bayesian dynamic linear models (DLMs) to water flow time series combined with an outlier monitoring tool. The model is used to characterize the actual flow and, for each time point, a one-step-ahead forecast distribution is obtained recursively before moving on to the next observation. The outlier detection method consists of comparing the main model with an alternative one in which the mean flow is shifted to a higher value (as bursts tend to increase flow) to evaluate which model best fits the observed data. If the alternative model is favored, a burst alarm is issued. To verify the performance of this approach, the DLM and monitoring tool were applied to 2 yr of flow data from two district meter areas (DMAs) in Halifax (Canada), and a historical break data set was used to assess model accuracy. The model was able to detect up to 75% and 71.2% of the pipe breaks, with false alarm rates of 5.15% and 12% in the first and second DMA, respectively. Finally, the proposed model allows for straightforward interpretation of model parameters and nonlinear relationships between flow and predictors of interest, naturally describes the uncertainty of future predictions, easily accommodates missing values, and can be tuned to maximize break detection or minimize false alarm rates to suit the specific objectives of water infrastructure managers. Key Points: A Bayesian dynamic linear model is developed to detect pipe bursts by modeling flow time series and monitoring outliers. The model naturally accommodates nonlinear associations between flow and predictors (e.g., pressure, temperature). Non-stationarity of the flow time series is naturally accounted for through the dynamic structure of the model parameters. [ABSTRACT FROM AUTHOR]
- Published
- 2023
- Full Text
- View/download PDF
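The monitoring scheme in the abstract above can be sketched with a toy local-level DLM: a scalar Kalman filter produces the one-step-ahead Gaussian forecast, and a Bayes' factor compares that forecast against a mean-shifted alternative. This is only a minimal illustration of the idea; the shift size, variances, threshold, and data below are invented, not the authors' settings.

```python
import math

def kalman_step(m, C, y, W=0.5, V=1.0):
    """One step of a local-level DLM: forecast, then update on observation y."""
    a, R = m, C + W              # prior mean/variance for the next state
    f, Q = a, R + V              # one-step-ahead forecast mean/variance
    K = R / Q                    # Kalman gain
    return a + K * (y - f), R * (1 - K), f, Q

def normal_pdf(y, mean, var):
    return math.exp(-(y - mean) ** 2 / (2 * var)) / math.sqrt(2 * math.pi * var)

def burst_alarms(flows, shift=4.0, threshold=3.0):
    """Flag times where a shifted-mean model beats the main model by a Bayes' factor."""
    m, C, alarms = flows[0], 1.0, []
    for t, y in enumerate(flows[1:], start=1):
        m, C, f, Q = kalman_step(m, C, y)
        bf = normal_pdf(y, f + shift, Q) / normal_pdf(y, f, Q)
        if bf > threshold:
            alarms.append(t)
    return alarms

flows = [10.0, 10.2, 9.9, 10.1, 15.0, 15.2, 10.0]
print(burst_alarms(flows))  # [4, 5]: the jump at t=4 and its aftermath are flagged
```

Once the filter has adapted to the new level (t=6), the alternative model is no longer favored and the alarm stops, mirroring the paper's recursive one-step-ahead design.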
3. BACON: A tool for reverse inference in brain activation and alteration.
- Author
-
Costa, Tommaso, Manuello, Jordi, Ferraro, Mario, Liloia, Donato, Nani, Andrea, Fox, Peter T., Lancaster, Jack, and Cauda, Franco
- Subjects
BACON, FUNCTIONAL magnetic resonance imaging, BRAIN diseases, ANATOMICAL variation
- Abstract
Over the past decades, powerful MRI-based methods have been developed that yield voxel-based maps of both brain activity and anatomical variation related to different conditions. With regard to functional or structural MRI data, forward inferences try to determine which areas are involved given a mental function or a brain disorder. A major drawback of forward inference is its lack of specificity, as it suggests the involvement of brain areas that are not specific to the process or condition under investigation. Therefore, a different approach is needed to determine to what extent a given pattern of cerebral activation or alteration is specifically associated with a mental function or brain pathology. In this study, we present a new tool called BACON (Bayes fACtor mOdeliNg) for performing reverse inference with both functional and structural neuroimaging data. BACON implements the Bayes' factor and uses activation likelihood estimation-derived maps to obtain posterior probability distributions on the evidence of specificity with regard to a particular mental function or brain pathology. [ABSTRACT FROM AUTHOR]
- Published
- 2021
- Full Text
- View/download PDF
4. Finding specificity in structural brain alterations through Bayesian reverse inference.
- Author
-
Cauda, Franco, Nani, Andrea, Liloia, Donato, Manuello, Jordi, Premi, Enrico, Duca, Sergio, Fox, Peter T., and Costa, Tommaso
- Subjects
BRAIN diseases, VOXEL-based morphometry, ALZHEIMER'S disease
- Abstract
In the field of neuroimaging, reverse inferences can lead us to suppose the involvement of cognitive processes from certain patterns of brain activity. However, the same reasoning holds if we substitute "brain activity" with "brain alteration" and "cognitive process" with "brain disorder." The fact that different brain disorders exhibit a high degree of overlap in their patterns of structural alterations makes forward inference-based analyses less suitable for identifying brain areas whose alteration is specific to a certain pathology. In forward inference-based analyses, in fact, it is impossible to distinguish between areas that are altered by the majority of brain disorders and areas that are specifically affected by certain diseases. To address this issue and allow the identification of highly pathology-specific altered areas, we used the Bayes' factor technique, which was employed, as a proof of concept, on voxel-based morphometry data of schizophrenia and Alzheimer's disease. This technique makes it possible to calculate the ratio between the likelihoods of two alternative hypotheses (in our case, that the alteration of the voxel is specific to the brain disorder under scrutiny or that it is not). We then performed temporal simulations of the spread of the alterations associated with different pathologies. The Bayes' factor values calculated on these simulated data revealed that the areas that are more specific to a certain disease are also the ones to be altered early. This study puts forward a new analytical instrument capable of innovating the methodological approach to the investigation of brain pathology. [ABSTRACT FROM AUTHOR]
- Published
- 2020
- Full Text
- View/download PDF
5. Status of Mandibular Third Molar Development as Evidence in Legal Age Threshold Cases.
- Author
-
Konigsberg, Lyle W., Frankenberg, Susan R., and Liversidge, Helen M.
- Subjects
THIRD molars, PROBABILITY theory, TEETH, BAYESIAN analysis, FORENSIC sciences, ESTIMATION theory
- Abstract
The completion of the third molar roots has played an important role in ascertaining whether individuals may be at or over a legal threshold of age, often taken as 18 years. This study demonstrates that root apex completion in the third molar is relatively uninformative regarding the threshold of age 18 years in a sample of 1184 males, where mean age‐of‐attainment of root apex completion for third mandibular molars is about 19.4 years. This paper also considers the legal age threshold problem for cases where the third mandibular molar is not completely formed, and outlines the use of parametric models and Bayes' factors to evaluate dental evidence in statistically appropriate ways. It attempts to resolve confusion over age‐within‐stage versus age‐of‐attainment, likelihood ratios versus other diagnostic tests, and prior odds for a case versus the prior density for an age distribution. [ABSTRACT FROM AUTHOR]
- Published
- 2019
- Full Text
- View/download PDF
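The evaluative logic sketched in the abstract above (posterior odds = Bayes' factor × prior odds, applied to a dental stage and an age threshold) can be illustrated with a toy calculation. All probabilities here are invented for illustration and are not the paper's estimates.

```python
def prob_over_threshold(p_stage_given_over, p_stage_given_under, prior_over):
    """Posterior probability of being at or over the age threshold given a dental stage.

    Bayes' factor = P(stage | age >= 18) / P(stage | age < 18); all inputs illustrative.
    """
    bf = p_stage_given_over / p_stage_given_under
    prior_odds = prior_over / (1 - prior_over)
    post_odds = bf * prior_odds
    return post_odds / (1 + post_odds)

# Root apex complete: common over 18 but also seen under 18, so only mildly informative.
print(round(prob_over_threshold(0.60, 0.20, 0.5), 3))  # 0.75
```

With a Bayes' factor of only 3, even a 50:50 prior moves to just 0.75, echoing the paper's point that apex completion alone is relatively uninformative about the 18-year threshold.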
6. Factor Bayes. De puente a puente [The Bayes factor. From bridge to bridge]
- Author
-
Molina, Manuel
- Subjects
probability, post-test probability, odds, Bayes' factor, General Materials Science
- Abstract
The concepts of odds and probability are described, as well as their combined use together with the Bayes' factor for calculating the probability of being sick or healthy after learning the result of a diagnostic test.
- Published
- 2023
- Full Text
- View/download PDF
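The calculation this abstract describes can be made concrete: post-test odds are the pre-test odds multiplied by the Bayes' factor of the test result, which for a positive result is the positive likelihood ratio. The prevalence, sensitivity, and specificity below are made-up illustrative values.

```python
def post_test_probability(prevalence, sensitivity, specificity):
    """Probability of disease after a positive test: posterior odds = prior odds * LR+."""
    prior_odds = prevalence / (1 - prevalence)
    lr_positive = sensitivity / (1 - specificity)   # Bayes' factor for a positive result
    posterior_odds = prior_odds * lr_positive
    return posterior_odds / (1 + posterior_odds)    # odds back to probability

# Illustrative values: 10% prevalence, 90% sensitivity, 80% specificity.
print(round(post_test_probability(0.10, 0.90, 0.80), 3))  # 0.333
```

Note how a fairly good test raises a 10% pre-test probability only to about one third, which is exactly the kind of intuition-correcting arithmetic the article advocates.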
7. Statistical Interpretation of Evidence: Bayesian Analysis
- Author
-
Franco Taroni, Colin Aitken, and Alex Biedermann
- Subjects
Bayes' rule, Frequentist probability, Bayes' factor, Bayes' theorem, Categorical data, Continuous data, Decision theory, Degree of belief, Evidence evaluation, Fallacy, Interpretation, Likelihood ratio, Posterior probability, Prior probability, Probability theory, Subjective probability, Utility, Bayesian inference, Empirical probability, Epistemology, Dempster–Shafer theory, Inductive probability, Data mining, Probability interpretations, Mathematics
- Abstract
Probability theory provides the general framework within which assignments of probabilities of past, present, and future events are coherently modified in the light of observed events or, more generally, new information. Forensic scientists, as an illustrative example, routinely face tasks of reasoning under uncertainty when they seek to assist members of the judiciary in evaluating or interpreting the meaning of items of scientific evidence. As a consequence of the laws of probability theory and related concepts, Bayes’ theorem is the key rule according to which to conduct such reasoning in order to comply with the requirement of rationality. This quantification, though, does not represent the end of the matter as the forensic scientist may also be confronted with questions of how to make a rational choice amongst alternative courses of action. This article presents the role of Bayes’ theorem, and its extension to decision analysis, in categorical and continuous data analysis in forensic science applications. It emphasizes the importance of propositional hierarchies, the role of background information, the interpretation of probability as personal degrees of belief and the personal quantification of the consequences of decisions. The discussion also includes a sketch of some common pitfalls of intuition associated with probabilistic reasoning in legal contexts.
- Published
- 2023
8. Fact or fiction: reducing the proportion and impact of false positives.
- Author
-
Stahl, D. and Pickles, A.
- Subjects
DIAGNOSTIC errors, PSYCHIATRY, PSYCHOLOGY
- Abstract
False positive findings in science are inevitable, but are they particularly common in psychology and psychiatry? The evidence that we review suggests that while not restricted to our field, the problem is acute. We describe the concept of researcher ‘degrees-of-freedom’ to explain how many false-positive findings arise, and how the various strategies of registration, pre-specification, and reporting standards that are being adopted both reduce and make these visible. We review possible benefits and harms of proposed statistical solutions, from tougher requirements for significance, to Bayesian and machine learning approaches to analysis. Finally we consider the organisation and methods for replication and systematic review in psychology and psychiatry. [ABSTRACT FROM AUTHOR]
- Published
- 2018
- Full Text
- View/download PDF
9. Bayesian model evidence as a practical alternative to deviance information criterion
- Author
-
C. M. Pooley and G. Marion
- Subjects
bayes' factor, bayesian model evidence, marginal likelihood, markov chain monte carlo, thermodynamic integration, deviance information criterion, Science
- Abstract
While model evidence is considered by Bayesian statisticians as a gold standard for model selection (the ratio in model evidence between two models giving the Bayes factor), its calculation is often viewed as too computationally demanding for many applications. By contrast, the widely used deviance information criterion (DIC), a different measure that balances model accuracy against complexity, is commonly considered a much faster alternative. However, recent advances in computational tools for efficient multi-temperature Markov chain Monte Carlo algorithms, such as steppingstone sampling (SS) and thermodynamic integration schemes, enable efficient calculation of the Bayesian model evidence. This paper compares both the capability (i.e. ability to select the true model) and speed (i.e. CPU time to achieve a given accuracy) of DIC with model evidence calculated using SS. Three important model classes are considered: linear regression models, mixed models and compartmental models widely used in epidemiology. While DIC was found to correctly identify the true model when applied to linear regression models, it led to incorrect model choice in the other two cases. On the other hand, model evidence led to correct model choice in all cases considered. Importantly, and perhaps surprisingly, DIC and model evidence were found to run at similar computational speeds, a result reinforced by analytically derived expressions.
- Published
- 2018
- Full Text
- View/download PDF
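The stepping-stone (SS) estimator mentioned in the abstract above can be sketched on a toy conjugate model where the marginal likelihood is known in closed form, so the estimate can be checked. The model (y_i ~ N(theta, 1), theta ~ N(0, 1)), the temperature schedule, sample sizes, and data are all illustrative choices, not those of the paper.

```python
import math
import random

random.seed(0)

def log_lik(theta, y):
    n = len(y)
    return -0.5 * n * math.log(2 * math.pi) - 0.5 * sum((v - theta) ** 2 for v in y)

def stepping_stone_log_evidence(y, K=20, S=2000):
    """Stepping-stone estimate of log p(y) for y_i ~ N(theta,1), theta ~ N(0,1)."""
    n, ybar = len(y), sum(y) / len(y)
    betas = [(k / K) ** 3 for k in range(K + 1)]    # temperatures clustered near 0
    log_z = 0.0
    for bk, bk1 in zip(betas, betas[1:]):
        # Power posterior prior(theta) * L(theta)^bk is conjugate: a normal.
        var = 1.0 / (bk * n + 1.0)
        mean = bk * n * ybar * var
        logs = [(bk1 - bk) * log_lik(random.gauss(mean, math.sqrt(var)), y)
                for _ in range(S)]
        m = max(logs)                               # log-sum-exp for stability
        log_z += m + math.log(sum(math.exp(v - m) for v in logs) / S)
    return log_z

def exact_log_evidence(y):
    """Analytic log marginal likelihood for the same conjugate model."""
    n, s, s2 = len(y), sum(y), sum(v * v for v in y)
    return (-0.5 * n * math.log(2 * math.pi) - 0.5 * math.log(1 + n)
            - 0.5 * (s2 - s * s / (1 + n)))

y = [0.3, -0.1, 0.8, 0.4, 0.2]
print(round(exact_log_evidence(y), 3), round(stepping_stone_log_evidence(y), 3))
```

The two printed numbers should agree closely; the log Bayes factor between two models is then simply the difference of their estimated log evidences.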
10. The Rough Bayesian Model for Distributed Decision Systems
- Author
-
Ślȩzak, Dominik, Kanade, Takeo, editor, Kittler, Josef, editor, Kleinberg, Jon M., editor, Mattern, Friedemann, editor, Mitchell, John C., editor, Nierstrasz, Oscar, editor, Pandu Rangan, C., editor, Steffen, Bernhard, editor, Sudan, Madhu, editor, Terzopoulos, Demetri, editor, Tygar, Doug, editor, Vardi, Moshe Y., editor, Weikum, Gerhard, editor, Carbonell, Jaime G., editor, Siekmann, Jörg, editor, Tsumoto, Shusaku, editor, Słowiński, Roman, editor, Komorowski, Jan, editor, and Grzymała-Busse, Jerzy W., editor
- Published
- 2004
- Full Text
- View/download PDF
11. Optimum calibration of a corrosion rate instrument using information gain criterion within a Bayesian framework.
- Author
-
Faroz, Sharvil Alex, Ghosh, Siddhartha, and Pushkaran, Thushara
- Subjects
STRUCTURAL health monitoring, CALIBRATION, MEASUREMENT errors, NONDESTRUCTIVE testing
- Abstract
Structural health monitoring (SHM) and non-destructive testing (NDT) provide information necessary for assessing the current condition of an ageing structure. However, measurements obtained from SHM and NDT can be erroneous, and it is important that the uncertainties associated with these measurement errors are quantified through instrument calibration. Considering that the process of calibration is expensive and time-consuming, a novel optimum instrument calibration methodology is proposed in this work, which uses an information-theoretic criterion to decide on the optimum number of calibration data points required. The proposed methodology is demonstrated through the calibration of a corrosion rate instrument with measurement uncertainty. The calibrated model is validated using the Bayes' factor as the validation metric. Case studies demonstrate a reduction of 42.5% or more in the required number of calibration data points. The proposed method also suggests an optimum sequence of data collection for calibration. • A novel optimum NDT/SHM instrument calibration method is proposed. • The proposed method uses an information-theoretic criterion in a Bayesian framework. • This is illustrated through the optimum calibration of a corrosion rate instrument. • Validation of the proposed calibration is demonstrated using the Bayes' factor. [ABSTRACT FROM AUTHOR]
- Published
- 2023
- Full Text
- View/download PDF
12. A probabilistic approach to evaluate salivary microbiome in forensic science when the Defense says: 'It is my twin brother'
- Author
-
Gilbert Greub, V. Scherz, Silvia Bozza, and Franco Taroni
- Subjects
Male, Computer science, Machine learning, Pathology and Forensic Medicine, Bayes' theorem, Bayes' factor, Cut-off, Discrimination, Monozygotic twins, Salivary microbiome, Similarity score, Evaluation of evidence, Genetics, Humans, Microbiome, Saliva, Genotyping, Forensic Medicine, Microbiota, Siblings, Probabilistic logic, Forensic science, Settore SECS-S/01 - Statistica, Microbiota composition
- Abstract
Salivary microbiota profiles may represent a valid contribution to forensic investigation when standard DNA genotyping methods fail. Starting from questioned and control materials in the form of saliva, the evidence can be expressed by means of a distance between those materials, taking into account specific aspects of the microbiota composition. The value of the evidence for forensic discrimination purposes is quantified by means of a Bayes' factor, which allows one to overcome the major limitations and pitfalls of intuition connected to the use of cut-off values as a means of decision.
- Published
- 2022
13. The Bayes' factor: the coherent measure for hypothesis confirmation
- Author
-
Taroni, F, Garbolino, P, Bozza, S, and Aitken, C
- Subjects
Bayesian confirmation theory, Bayes' factor, Likelihood ratio, Logical properties, Philosophy of science, Probabilistic reasoning, Scientific evidence evaluation, Law, Statistics, Probability and Uncertainty, Philosophy, Settore SECS-S/01 - Statistica
- Abstract
What have been called ‘Bayesian confirmation measures’ or ‘evidential support measures’ offer a numerical expression for the impact of a piece of evidence on a judicial hypothesis of interest. The Bayes’ factor, sometimes simply called the ‘likelihood ratio’, represents the best measure of the value of the evidence. It satisfies a number of necessary conditions of normative logical adequacy. It is shown that the same cannot be said for alternative expressions put forward in some legal and forensic quarters. A list of desiderata is given that supports the choice of the Bayes’ factor as the best measure for quantifying the value of evidence.
- Published
- 2021
14. Bayesian evaluation of dynamic signatures in operational conditions
- Author
-
Linden, Jacques, Bozza, Silvia, Marquis, Raymond, and Taroni, Franco
- Subjects
Questioned Document Examination, Biometrics, Bayes' factor, Feature selection, Handwritten signature, Multivariate data, Online signature, Settore SECS-S/01 - Statistica, Law, Pathology and Forensic Medicine
- Abstract
The activities of forensic handwriting examiners (FHEs) are focused on comparative analysis of handwritten objects such as signatures. Their role is to provide and evaluate evidence for and against the authenticity of a questioned signature. In recent years, cases involving handwritten signatures captured on electronic devices have become more commonplace. These so-called 'dynamic signatures' (also known as 'digitally captured signatures') differ greatly from paper-based signatures: not only the medium of recording but also the type and volume of data and the features differ from the pattern-based evidence that makes up paper-based signatures. Recent developments in forensic science - including signature examination - have led to the adoption of evaluative probabilistic methodologies in many disciplines [see, e.g., ENFSI 1915 Guidelines]. In the current paper, a probabilistic model to evaluate signature evidence in the form of multivariate data, as proposed and described in Wacom Europe GmbH (2019), is adopted. Topics like data sparsity, joint evaluation of multiple features, and feature selection are investigated. The experimental studies performed showed an accuracy rate above 90% even when a limited number (5) of reference signatures was available. The performance of the multivariate approach is compared with that of a so-called multiplicative approach, in which variables (features) are taken as independent and the Bayes' factor (BF) is obtained as the product of the univariate BFs associated with each selected feature. The simplicity of this latter approach is, however, accompanied by severe issues concerning the reliability of the results. The use of a multivariate approach is therefore highly recommended. Finally, the evidential values corresponding to alternative feature sets are compared. Results suggest that discriminative features are writer-related and necessitate case-specific selection.
- Published
- 2021
15. Small Sample Tests for Shape Parameters of Gamma Distributions.
- Author
-
Bhaumik, Dulal K., Kapur, Kush, Balakrishnan, Narayanaswamy, Keating, Jerome P., and Gibbons, Robert D.
- Subjects
GAMMA distributions, PARAMETER estimation, STATISTICAL models, CHI-square distribution, STOCHASTIC analysis, ERROR analysis in mathematics
- Abstract
The introduction of shape parameters into statistical distributions provided flexible models that produced better fit to experimental data. The Weibull and gamma families are prime examples wherein shape parameters produce more reliable statistical models than standard exponential models in lifetime studies. In the presence of many independent gamma populations, one may test equality (or homogeneity) of shape parameters. In this article, we develop two tests for testing shape parameters of gamma distributions using chi-square distributions, stochastic majorization, and Schur convexity. The first one tests hypotheses on the shape parameter of a single gamma distribution. We numerically examine the performance of this test and find that it controls Type I error rate for small samples. To compare shape parameters of a set of independent gamma populations, we develop a test that is unbiased in the sense of Schur convexity. These tests are motivated by the need to have simple, easy to use tests and accurate procedures in case of small samples. We illustrate the new tests using three real datasets taken from engineering and environmental science. In addition, we investigate the Bayes’ factor in this context and conclude that for small samples, the frequentist approach performs better than the Bayesian approach. [ABSTRACT FROM AUTHOR]
- Published
- 2015
- Full Text
- View/download PDF
16. Investigating the stochastic dispersion of 2D engineered frame structures under symmetry of variability.
- Author
-
Ribeiro, Luiz H.M.S., Dal Poggetto, Vinícius F., Beli, Danilo, Fabro, Adriano T., and Arruda, José R.F.
- Subjects
STRUCTURAL frames, UNIT cell, STOCHASTIC analysis, BRILLOUIN zones, SYMMETRY
- Abstract
Additive manufacturing has enabled the construction of increasingly complex mechanical structures. However, the variability of mechanical properties may be higher than that of conventionally manufactured structures. Typically, the computational cost of the numerical modeling of such structures increases considerably when variability is considered. In deterministic analyses of periodic structures, the dispersion diagrams obtained for the first Brillouin zone (FBZ) can be used to predict attenuation bands for any direction of propagation. This can be further simplified by considering only the contour of the irreducible Brillouin zone (IBZ) if the unit cell presents symmetries. The objective of the current investigation is to present evidence that, similarly to what occurs in deterministic cases, the stochastic results obtained by scanning only the IBZ contour of the proposed two-dimensional unit cell under 4-fold rotational symmetry of statistical variability coincide with the statistical results obtained by scanning the FBZ. This is not a direct result, because each individual unit cell sample is asymmetric. We show that, under symmetry of the variability statistics, the stochastic results computed for supercells and for finite metastructures consisting of a finite number of cells also coincide with the results for the IBZ contour of the unit cell. This result is important, as it dramatically reduces the computational cost of the stochastic analysis of such structures. • Under symmetry of variability, results for the IBZ coincide with those obtained by scanning the FBZ. • Stochastic results of supercells and finite metastructures coincide with those for the IBZ. • The results show a way to reduce the computational cost of the stochastic analysis. [ABSTRACT FROM AUTHOR]
- Published
- 2022
- Full Text
- View/download PDF
17. Bayesian classification criterion for forensic multivariate data.
- Author
-
Bozza, S., Broséus, J., Esseiva, P., and Taroni, F.
- Subjects
CANNABIS (Genus), FORENSIC sciences, PLANT classification, GAS chromatography, BAYESIAN analysis
- Abstract
This study presents a classification criterion for two-class Cannabis seedlings. As the cultivation of drug-type cannabis is forbidden in Switzerland, law enforcement authorities regularly ask laboratories to determine a cannabis plant's chemotype from seized material in order to ascertain whether the plantation is legal. In this study, the classification analysis is based on data obtained from the relative proportions of three major leaf compounds measured by gas chromatography interfaced with mass spectrometry (GC-MS). The aim is to discriminate between drug-type (illegal) and fiber-type (legal) cannabis at an early stage of growth. A Bayesian procedure is proposed: a Bayes factor is computed and classification is performed on the basis of the decision maker's specifications (i.e., prior probability distributions on cannabis type and consequences of classification measured by losses). Classification rates are computed with two statistical models and the results are compared. Sensitivity analysis is then performed to analyze the robustness of the classification criterion. [ABSTRACT FROM AUTHOR]
- Published
- 2014
- Full Text
- View/download PDF
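The decision rule the abstract outlines (a Bayes factor combined with prior probabilities and losses) can be sketched for a generic two-class problem: classify as drug type when the posterior odds exceed the loss-ratio threshold. The class-conditional densities, priors, and losses below are invented for illustration, not the paper's fitted models.

```python
import math

def normal_pdf(x, mean, sd):
    return math.exp(-((x - mean) / sd) ** 2 / 2) / (sd * math.sqrt(2 * math.pi))

def classify(x, prior_drug=0.5, loss_miss=10.0, loss_false=1.0):
    """Two-class Bayes decision: compare posterior odds with the loss-ratio threshold.

    loss_miss  = loss of calling a drug-type plant fiber type;
    loss_false = loss of calling a fiber-type plant drug type.
    """
    # Illustrative class-conditional densities of a measured compound proportion.
    bf = normal_pdf(x, 0.7, 0.1) / normal_pdf(x, 0.2, 0.1)  # drug vs fiber type
    prior_odds = prior_drug / (1 - prior_drug)
    threshold = loss_false / loss_miss    # decision-theoretic cut-off on posterior odds
    return "drug" if bf * prior_odds > threshold else "fiber"

print(classify(0.60), classify(0.15))  # drug fiber
```

Raising `loss_miss` lowers the threshold and makes the rule more willing to classify borderline seedlings as drug type, which is how the decision maker's specifications enter the criterion.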
18. Bayes factor for investigative assessment of selected handwriting features.
- Author
-
Taroni, F., Marquis, R., Schmittbuhl, M., Biedermann, A., Thiéry, A., and Bozza, S.
- Subjects
MULTIVARIATE analysis, DATA analysis, FORENSIC sciences, GRAPHOLOGY, SEX differences (Biology)
- Abstract
This paper extends previous research [1] on the use of multivariate continuous data in comparative handwriting examinations, notably for gender classification. A database has been constructed by analyzing the contour shape of loop characters of type a and d by means of Fourier analysis, which allows characters to be described in a global way by a set of variables (e.g., Fourier descriptors). Sample handwritings were collected from right- and left-handed female and male writers. The results reported in this paper provide further arguments in support of the view that investigative settings in forensic science represent an area of application for which the Bayesian approach offers a logical framework. In particular, the Bayes factor is computed for settings that focus on inference of gender and handedness of the author of an incriminated handwritten text. An emphasis is placed on comparing the efficiency for investigative purposes of characters a and d. [ABSTRACT FROM AUTHOR]
- Published
- 2014
- Full Text
- View/download PDF
19. Bayes Variable Selection in Semiparametric Linear Models.
- Author
-
Kundu, Suprateek and Dunson, David B.
- Subjects
MATHEMATICAL models, MATHEMATICAL functions, LINEAR statistical models, REGRESSION analysis, DIRICHLET forms, BAYESIAN analysis
- Abstract
There is a rich literature on Bayesian variable selection for parametric models. Our focus is on generalizing methods and asymptotic theory established for mixtures of g-priors to semiparametric linear regression models having unknown residual densities. Using a Dirichlet process location mixture for the residual density, we propose a semiparametric g-prior which incorporates an unknown matrix of cluster allocation indicators. For this class of priors, posterior computation can proceed via a straightforward stochastic search variable selection algorithm. In addition, Bayes' factor and variable selection consistency is shown to result under a class of proper priors on g even when the number of candidate predictors p is allowed to increase much faster than sample size n, while making sparsity assumptions on the true model size. [ABSTRACT FROM AUTHOR]
- Published
- 2014
- Full Text
- View/download PDF
20. Uncertainty quantification of elastic material responses: testing, stochastic calibration and Bayesian model selection
- Author
-
Fitt, Danielle, Wyatt, Hayley, Woolley, Thomas E., and Mihai, L. Angela
- Published
- 2019
- Full Text
- View/download PDF
21. Assessment of forensic findings when alternative explanations have different likelihoods—“Blame-the-brother”-syndrome.
- Author
-
Nordgaard, Anders, Hedell, Ronny, and Ansell, Ricky
- Subjects
FORENSIC sciences, PROBLEM solving, BAYESIAN analysis, PROBABILITY theory, QUANTITATIVE research, DNA, STATISTICAL decision making
- Abstract
Abstract: Assessment of forensic findings with likelihood ratios is in many cases straightforward, but there are a number of situations where the alternative explanation of the evidence needs careful consideration, in particular when it comes to reporting the evidentiary strength. The likelihood ratio approach cannot be directly applied to cases where the proposition alternative to the forwarded one is a set of multiple propositions with different likelihoods and different prior probabilities. Here we present a general framework based on the Bayes' factor as the quantitative measure of evidentiary strength, from which it can be deduced whether the direct application of a likelihood ratio is reasonable or not. The framework is applied to DNA evidence in the form of an extension to previously published work. With the help of a scale of conclusions, we provide a solution to the problem of communicating to the court the evidentiary strength of a DNA match when a close relative of the suspect has a non-negligible prior probability of being the source of the DNA. [Copyright Elsevier]
- Published
- 2012
- Full Text
- View/download PDF
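The composite-alternative Bayes' factor described in the abstract above divides the likelihood under the forwarded proposition by the prior-weighted average of the likelihoods under the alternative propositions. A toy numerical sketch (all match probabilities and prior weights are invented for illustration):

```python
def composite_bayes_factor(lik_suspect, alternatives):
    """Bayes' factor against a composite alternative proposition.

    alternatives: list of (likelihood of the evidence under that alternative,
    prior weight of that alternative within the composite); weights sum to 1.
    """
    weighted = sum(lik * w for lik, w in alternatives)
    return lik_suspect / weighted

# P(match | suspect) = 1; a random man matches with profile frequency 1e-9,
# an untested brother matches with probability 5e-3 and gets 1% prior weight.
bf = composite_bayes_factor(1.0, [(1e-9, 0.99), (5e-3, 0.01)])
print(round(bf))  # 20000
```

Even a small prior weight on the brother collapses the naive likelihood ratio of 1e9 to about 2e4, which is exactly the "blame-the-brother" effect the paper quantifies.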
22. The use of the likelihood ratio for evaluative and investigative purposes in comparative forensic handwriting examination
- Author
-
Taroni, F., Marquis, R., Schmittbuhl, M., Biedermann, A., Thiéry, A., and Bozza, S.
- Subjects
WRITING evaluation, CONTOURS (Cartography), INVESTIGATIONS, FORENSIC sciences, FOURIER analysis, COMPARATIVE studies, STATISTICAL hypothesis testing, AUTHORSHIP
- Abstract
Abstract: This paper extends previous research and discussion on the use of multivariate continuous data, which are about to become more prevalent in forensic science. As an illustrative example, attention is drawn here to the area of comparative handwriting examinations. Multivariate continuous data can be obtained in this field by analysing the contour shape of loop characters through Fourier analysis. This methodology, based on existing research in this area, allows one to describe in detail the morphology of character contours through a set of variables. This paper uses data collected from female and male writers to conduct a comparative analysis of likelihood ratio based evidence assessment procedures in both evaluative and investigative proceedings. While the use of likelihood ratios in the former situation is now rather well established (typically, in order to discriminate between propositions of authorship by a given individual versus another, unknown individual), the investigative setting has remained largely beyond consideration in practice. This paper seeks to highlight that investigative settings, too, can represent an area of application in which the likelihood ratio can offer logical support. As an example, the inference of the gender of the writer of an incriminated handwritten text is forwarded, analysed and discussed in this paper. The more general viewpoint according to which likelihood ratio analyses can be helpful for investigative proceedings is supported here through various simulations. These offer a characterisation of the robustness of the proposed likelihood ratio methodology. [Copyright Elsevier]
- Published
- 2012
- Full Text
- View/download PDF
23. Value of information in product recovery decisions: a Bayesian approach.
- Author
-
Parlikad, Ajith Kumar and McFarlane, Duncan
- Subjects
PRODUCT recovery ,RADIO frequency identification systems ,QUALITY ,BAYESIAN analysis ,PRODUCT quality - Abstract
There is widespread recognition of the need for better information sharing and provision to improve the viability of end-of-life (EOL) product recovery operations. The emergence of automated data capture and sharing technologies such as RFID, sensors and networked databases has enhanced the ability to make product information available to recoverers, which will help them make better decisions regarding the choice of recovery option for EOL products. However, these technologies come with a cost attached to them, and hence the question 'what is their value?' is critical. This paper presents a probabilistic approach to model product recovery decisions and extends the concept of the Bayes' factor for quantifying the impact of product information on the effectiveness of these decisions. Further, we provide a quantitative examination of the factors that influence the value of product information; this value depends on three factors: (i) the penalties for Type I and Type II errors of judgement regarding product quality; (ii) the prevalent uncertainty regarding product quality and (iii) the strength of the information to support/contradict the belief. Furthermore, we show that information is not valuable under all circumstances and derive conditions for achieving a positive value of information. [ABSTRACT FROM AUTHOR]
- Published
- 2010
- Full Text
- View/download PDF
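The Bayes'-factor updating that record 23 describes can be sketched numerically: a prior belief about EOL product quality is updated via the odds form of Bayes' theorem, and the value of the information is the expected cost saved by deciding after the update. A minimal illustration; all priors, penalties and BF values below are invented for the example, not taken from the paper.

```python
# Hedged sketch of Bayes'-factor updating for a product recovery decision.
# Numbers (prior, BF, Type I/II penalties) are illustrative assumptions.

def posterior_prob(prior_p, bayes_factor):
    """Update P(good quality): posterior odds = BF * prior odds."""
    prior_odds = prior_p / (1.0 - prior_p)
    post_odds = bayes_factor * prior_odds
    return post_odds / (1.0 + post_odds)

def expected_cost(p_good, cost_type1, cost_type2, remanufacture=True):
    """Type I: remanufacturing a bad product; Type II: scrapping a good one."""
    if remanufacture:
        return (1.0 - p_good) * cost_type1
    return p_good * cost_type2

prior = 0.5                      # no information on product quality
bf = 4.0                         # sensor data favours "good" 4:1
p = posterior_prob(prior, bf)    # 0.8

# Value of information = cost saved by deciding with the updated belief.
cost_without = min(expected_cost(prior, 100, 60, True),
                   expected_cost(prior, 100, 60, False))
cost_with = min(expected_cost(p, 100, 60, True),
                expected_cost(p, 100, 60, False))
print(p, cost_without - cost_with)
```

As the abstract notes, the value can be zero: if the BF is too weak to flip the optimal decision, the two `min` terms coincide and no cost is saved.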
24. Variable Selection for Clustering with Gaussian Mixture Models.
- Author
-
Maugis, Cathy, Celeux, Gilles, and Martin-Magniette, Marie-Laure
- Subjects
- *
CLUSTER analysis (Statistics) , *MATHEMATICAL variables , *MATHEMATICAL models , *BAYESIAN analysis , *ALGORITHMS , *REGRESSION analysis , *SIMULATION methods & models , *GENOMICS - Abstract
This article is concerned with variable selection for cluster analysis. The problem is regarded as a model selection problem in the model-based cluster analysis context. A model generalizing the model of Raftery and Dean (2006, Journal of the American Statistical Association 101, 168–178) is proposed to specify the role of each variable. This model does not need any prior assumptions about the linear link between the selected and discarded variables. Models are compared using the Bayesian information criterion. Variable role is obtained through an algorithm embedding two backward stepwise algorithms for variable selection for clustering and linear regression. The model identifiability is established and the consistency of the resulting criterion is proved under regularity conditions. Numerical experiments on simulated datasets and a genomic application highlight the usefulness of the procedure. [ABSTRACT FROM AUTHOR]
- Published
- 2009
- Full Text
- View/download PDF
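Record 24 treats variable selection as model selection scored by BIC. The idea transfers to any likelihood-based model: fit candidate variable subsets, compute BIC = -2 log L + k log n, keep the smallest. A minimal sketch in a Gaussian regression setting (not the paper's mixture-model code; the data are simulated here):

```python
import numpy as np

# Hedged sketch: variable selection as BIC-scored model selection.
# Illustrative linear-regression setting, not the paper's GMM algorithm.

rng = np.random.default_rng(0)
n = 200
x1 = rng.normal(size=n)                      # relevant variable
x2 = rng.normal(size=n)                      # irrelevant candidate variable
y = 2.0 * x1 + rng.normal(scale=0.5, size=n)

def bic_linear(X, y):
    """BIC of a Gaussian linear model fit by least squares (MLE variance)."""
    n_obs, k = X.shape
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    rss = float(np.sum((y - X @ beta) ** 2))
    loglik = -0.5 * n_obs * (np.log(2 * np.pi) + np.log(rss / n_obs) + 1)
    return -2 * loglik + (k + 1) * np.log(n_obs)   # +1 for the noise variance

X_null = np.ones((n, 1))                          # intercept only
X_x1 = np.column_stack([np.ones(n), x1])
X_x1x2 = np.column_stack([np.ones(n), x1, x2])

for name, X in [("null", X_null), ("x1", X_x1), ("x1+x2", X_x1x2)]:
    print(name, round(bic_linear(X, y), 1))
```

With the log n penalty per parameter, the irrelevant variable is usually (though not always) rejected, while the relevant one reduces BIC decisively; a backward stepwise search, as in the paper, simply iterates this comparison.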
25. A Bayesian Approach to Gravitational Lens Model Selection, SF2A proceeding
- Author
-
balmès, irène, Corasaniti, Pier-Stefano, Laboratoire Univers et Théories (LUTH (UMR_8102)), Institut national des sciences de l'Univers (INSU - CNRS)-Observatoire de Paris, Université Paris sciences et lettres (PSL)-Université Paris sciences et lettres (PSL)-Centre National de la Recherche Scientifique (CNRS)-Université Paris Diderot - Paris 7 (UPD7), and PSL Research University (PSL)-PSL Research University (PSL)-Université Paris Diderot - Paris 7 (UPD7)-Centre National de la Recherche Scientifique (CNRS)
- Subjects
[PHYS]Physics [physics] ,model selection ,strong lensing ,Bayes' factor ,Astrophysics::Cosmology and Extragalactic Astrophysics ,Astrophysics - Cosmology and Nongalactic Astrophysics - Abstract
Over the past decade, advances in the understanding of several astrophysical phenomena have allowed us to infer a concordance cosmological model that successfully accounts for most of the observations of our universe. This has opened the way to studies that aim to better determine the constants of the model and confront its predictions with those of competing scenarios. Here, we use strong gravitational lenses as cosmological probes. Strong lensing, as opposed to weak lensing, produces multiple images of a single source. Extracting cosmologically relevant information requires accurate modeling of the lens mass distribution, the latter being a galaxy or a cluster. For this purpose a variety of models are available, but it is hard to distinguish between them, as the choice is mostly guided by the quality of the fit to the data without accounting for the number of additional parameters introduced. This, however, is a model selection problem rather than one of parameter fitting, and we address it in the Bayesian framework. Using simple test cases, we show that the assumption of more complicated lens models may not be justified given the level of accuracy of the data., Comment: 4 pages, proceeding for the SF2A conference
- Published
- 2020
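The Bayesian model selection in record 25 rests on the marginal likelihood (evidence): integrating the likelihood over each model's parameter priors automatically penalises extra parameters (the "Occam factor"), so a more complicated model wins only when the data demand it. A toy numerical sketch with invented scalar data, not lens models:

```python
import numpy as np

# Hedged sketch: Bayes' factor via marginal likelihoods for two models of a
# Normal mean (unit noise). M1 fixes mu = 0; M2 gives mu a broad N(0, 5^2)
# prior. Data and priors are illustrative assumptions.

y = np.array([0.10, -0.20, 0.05])

def loglik(mu):
    """Log-likelihood of the data for each value in the array mu."""
    return (-0.5 * np.sum((y[:, None] - mu) ** 2, axis=0)
            - 0.5 * len(y) * np.log(2 * np.pi))

# Model 1: no free parameter, evidence = likelihood at mu = 0.
evidence_m1 = float(np.exp(loglik(np.array([0.0]))[0]))

# Model 2: evidence = integral of likelihood * prior over mu (grid sum).
mu_grid = np.linspace(-20, 20, 4001)
dmu = mu_grid[1] - mu_grid[0]
prior = np.exp(-0.5 * (mu_grid / 5.0) ** 2) / (5.0 * np.sqrt(2 * np.pi))
evidence_m2 = float(np.sum(np.exp(loglik(mu_grid)) * prior) * dmu)

bf_21 = evidence_m2 / evidence_m1
print(bf_21)   # < 1: the simpler model is favoured, despite M2 fitting no worse
```

Both models fit these near-zero data essentially equally well; the evidence ratio nevertheless favours the simpler one because M2 spreads its prior mass over parameter values the data rule out, which is exactly the effect the abstract says fit quality alone ignores.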
26. Coherently updating degrees of belief: Radical probabilism, the generalization of Bayes’ Theorem and its consequences on evidence evaluation
- Author
-
Franco Taroni, Paolo Garbolino, and Silvia Bozza
- Subjects
Bayes’ Theorem, Radical Probabilism, Bayesian conditionalization, Probability Kinematics ,Bayes’ factor, evidence evaluation ,evidence evaluation ,Philosophy ,Statistics, Probability and Uncertainty ,Law ,Generalization ,Computer science ,010401 analytical chemistry ,Bayes’ Theorem ,Probabilism ,01 natural sciences ,0104 chemical sciences ,Bayes’ factor ,03 medical and health sciences ,Bayes' theorem ,Probability Kinematics ,0302 clinical medicine ,Bayesian conditionalization ,030216 legal & forensic medicine ,Settore SECS-S/01 - Statistica ,Mathematical economics ,Radical Probabilism - Abstract
The Bayesian perspective is based on conditioning on reported evidence that is considered to be certain. What is called 'Radical Probabilism' replaces such an extreme view by introducing uncertainty on the reported evidence. How can such equivocal evidence be used in further inferences about a main hypothesis? The theoretical groundwork is introduced with the aim of offering the readership an explanation of the generalization of Bayes' Theorem. This extension, which considers uncertainty related to the reporting of evidence, also has an impact on the assessment of the value of evidence through the Bayes' factor. A generalization of such a logical measure of the evidence is also presented and justified.
- Published
- 2020
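The generalisation of Bayesian conditioning that record 26 builds on is Jeffrey's rule (probability kinematics): when a report only makes the evidence E probable to degree q, the new probability of the hypothesis is the q-weighted mixture of the posteriors given E and given not-E. A minimal numeric sketch with invented probabilities:

```python
# Hedged sketch of probability kinematics (Jeffrey's rule). All probability
# values are illustrative; q is the post-report probability of E.

def bayes_posterior(p_h, p_e_h, p_e_not_h):
    """Ordinary conditioning: P(H | E)."""
    num = p_e_h * p_h
    return num / (num + p_e_not_h * (1 - p_h))

def jeffrey_update(p_h, p_e_h, p_e_not_h, q):
    """Jeffrey's rule: P'(H) = P(H|E) * q + P(H|not E) * (1 - q)."""
    p_h_given_e = bayes_posterior(p_h, p_e_h, p_e_not_h)
    p_h_given_note = bayes_posterior(p_h, 1 - p_e_h, 1 - p_e_not_h)
    return p_h_given_e * q + p_h_given_note * (1 - q)

p_h, p_e_h, p_e_nh = 0.5, 0.8, 0.2
print(jeffrey_update(p_h, p_e_h, p_e_nh, q=1.0))   # certain report: plain Bayes
print(jeffrey_update(p_h, p_e_h, p_e_nh, q=0.7))   # equivocal report
```

Setting q = 1 recovers ordinary Bayesian conditionalization, and setting q equal to the prior P(E) leaves the belief unchanged, which is why the rule is a strict generalisation of Bayes' Theorem.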
27. Finding specificity in structural brain alterations through Bayesian reverse inference
- Author
-
Enrico Premi, Franco Cauda, Andrea Nani, Jordi Manuello, Sergio Duca, Peter T. Fox, Donato Liloia, and Tommaso Costa
- Subjects
Brain activity and meditation ,Computer science ,Bayesian probability ,Inference ,Neuroimaging ,computer.software_genre ,Proof of Concept Study ,050105 experimental psychology ,Diagnosis, Differential ,03 medical and health sciences ,Bayes' theorem ,0302 clinical medicine ,Voxel ,Alzheimer Disease ,alteration specificity ,medicine ,Humans ,voxel-based morphometry ,0501 psychology and cognitive sciences ,Radiology, Nuclear Medicine and imaging ,voxel‐based morphometry ,pain ,Gray Matter ,Research Articles ,Radiological and Ultrasound Technology ,05 social sciences ,Default Mode Network ,Bayes' factor ,Cognition ,Bayes Theorem ,brain disorders ,Models, Theoretical ,Alzheimer's disease ,medicine.disease ,schizophrenia ,Neurology ,Schizophrenia ,reverse probability ,Neurology (clinical) ,Anatomy ,Nerve Net ,computer ,Neuroscience ,030217 neurology & neurosurgery ,Research Article - Abstract
In the field of neuroimaging, reverse inferences can lead us to suppose the involvement of cognitive processes from certain patterns of brain activity. However, the same reasoning holds if we substitute "brain activity" with "brain alteration" and "cognitive process" with "brain disorder." The fact that different brain disorders exhibit a high degree of overlap in their patterns of structural alterations makes forward inference-based analyses less suitable for identifying brain areas whose alteration is specific to a certain pathology. In forward inference-based analyses, in fact, it is impossible to distinguish between areas that are altered by the majority of brain disorders and areas that are specifically affected by certain diseases. To address this issue and allow the identification of highly pathology-specific altered areas, we used the Bayes' factor technique, which was employed, as a proof of concept, on voxel-based morphometry data of schizophrenia and Alzheimer's disease. This technique allows one to calculate the ratio between the likelihoods of two alternative hypotheses (in our case, that the alteration of the voxel is specific to the brain disorder under scrutiny, or that the alteration is not specific). We then performed temporal simulations of the alterations' spread associated with different pathologies. The Bayes' factor values calculated on these simulated data revealed that the areas that are more specific to a certain disease are also among the earliest to be altered. This study puts forward a new analytical instrument capable of innovating the methodological approach to the investigation of brain pathology., We created Bayesian reverse inference maps of the two most represented pathologies in BrainMap (schizophrenia and Alzheimer's disease). We provided evidence that Bayesian reverse inference is capable of identifying the cerebral areas exhibiting a higher alteration specificity to a certain pathology.
We performed temporal simulations of the alteration spreads associated with different pathologies, revealing that the areas that are more specific to a certain disease are also among the earliest to be altered.
- Published
- 2020
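The specificity ratio at the heart of record 27 can be shown in a few lines: for each area, compare the likelihood of observing an alteration under the disorder of interest against the likelihood under other disorders. A minimal sketch; the study counts below are invented, not BrainMap data:

```python
# Hedged sketch of the reverse-inference Bayes' factor for alteration
# specificity. Counts are illustrative assumptions.

def alteration_bf(altered_in_disease, total_disease,
                  altered_in_others, total_others):
    """BF = P(area altered | disease) / P(area altered | other disorders)."""
    return ((altered_in_disease / total_disease)
            / (altered_in_others / total_others))

# Area A: altered in 45/50 studies of the disease, 30/200 of other disorders.
print(alteration_bf(45, 50, 30, 200))    # well above 1: highly specific
# Area B: altered by almost every disorder: BF near 1, not specific.
print(alteration_bf(48, 50, 180, 200))
```

This is the distinction forward inference misses: Area B is strongly associated with the disease, yet carries almost no diagnostic specificity because it is altered nearly everywhere.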
28. Sib-parentage testing using molecular markers when parents are unknown.
- Author
-
García, D, Carleos, C, Parra, D, and Cañón, J
- Subjects
- *
ALLELES , *DISTRIBUTION (Probability theory) , *PATERNITY - Abstract
The formulae for computing the so-called Sib Index using codominant alleles for (1) full-sib and (2) half-sib parentage are given. Hypothesis testing is based on the distribution of the conditional likelihood ratio, or Bayes' factor. Thresholds for rejecting the null hypothesis and P-values were obtained as a function of the number of alleles and their frequency distributions. Simulations showed that a relatively low number of marker systems (e.g. 20) is enough to accept the hypothesis of sib parentage with reasonable power for usual significance levels, but that a higher number would be necessary if full-sib against half-sib parentage is the contrast to be carried out. The effect of sampling variation in the allele frequencies on power calculations is also analysed. [ABSTRACT FROM AUTHOR]
- Published
- 2002
- Full Text
- View/download PDF
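The single-marker building block of the Sib Index in record 28 is a likelihood ratio over identity-by-descent (IBD) states: full siblings share 0, 1 or 2 alleles IBD with probabilities (1/4, 1/2, 1/4). A sketch for one case (both individuals homozygous AA); the allele frequency is illustrative, and the paper's full formulae cover all genotype pairs:

```python
# Hedged sketch of a one-marker Sib Index: LR for "full siblings" vs
# "unrelated" when both individuals are typed AA at a codominant marker.

def sib_index_homozygous(p):
    """LR given both genotypes are AA, allele A with population frequency p."""
    p_unrelated = p ** 2                          # P(G2 = AA | unrelated)
    # Condition on IBD state: 0, 1 or 2 alleles shared, weights 1/4, 1/2, 1/4.
    p_fullsib = 0.25 * p ** 2 + 0.5 * p + 0.25
    return p_fullsib / p_unrelated

print(sib_index_homozygous(0.2))   # shared rare alleles give stronger support
```

Multiplying such single-marker LRs across independent markers gives the overall index, which is why the simulations in the abstract need around 20 marker systems for reasonable power.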
29. Bayesian evaluation of dynamic signatures in operational conditions.
- Author
-
Jacques, Linden, Silvia, Bozza, Raymond, Marquis, and Franco, Taroni
- Subjects
- *
PROBABILISTIC databases , *AUTHENTICITY (Philosophy) , *GRAPHOLOGY , *DIGITAL signatures , *BIOMETRIC identification - Abstract
Forensic handwriting examiners' (FHEs') activities are focused on the comparative analysis of handwritten objects such as signatures. Their role is to provide and evaluate evidence for and against the authenticity of a questioned signature. In recent years, cases involving handwritten signatures captured on electronic devices have become more commonplace. These so-called 'dynamic signatures' (also known as 'digitally captured signatures') differ considerably from paper-based signatures. Not only does the medium of recording differ, but the type and volume of data and the features are also different from the pattern-based evidence that makes up paper-based signatures. Recent developments in forensic science - including signature examination - have led to the adoption of evaluative probabilistic methodologies in many disciplines [see, e.g. ENFSI 1915 Guidelines]. In the current paper, a probabilistic model to evaluate signature evidence in the form of multivariate data, as proposed and described in Wacom Europe GmbH (2019), is adopted. Topics such as data sparsity, the joint evaluation of multiple features and feature selection are investigated. The experimental studies performed showed an accuracy rate above 90% even when a limited number (5) of reference signatures was available. The performance of the multivariate approach is compared with that of a so-called multiplicative approach, where variables (features) are taken as independent and the Bayes' factor (BF) is obtained as the product of the univariate BFs associated with each selected feature. The simplicity of this latter approach is, however, accompanied by severe issues concerning the reliability of its results. The use of a multivariate approach is therefore highly recommended. Finally, the evidential values obtained with alternative feature sets are compared. Results suggest that discriminative features are writer-related and necessitate case-specific selection. [ABSTRACT FROM AUTHOR]
- Published
- 2022
- Full Text
- View/download PDF
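The "multiplicative approach" that record 29 compares against (and warns about) is easy to state: compute a univariate BF per signature feature as a ratio of densities under the competing propositions, then multiply, ignoring correlations. A sketch with Normal feature models; all feature names and parameter values are invented:

```python
import math

# Hedged sketch of the multiplicative (independence-assuming) BF the paper
# contrasts with its multivariate model. Parameters are illustrative.

def normal_pdf(x, mu, sd):
    return math.exp(-0.5 * ((x - mu) / sd) ** 2) / (sd * math.sqrt(2 * math.pi))

def univariate_bf(x, mu_genuine, sd_genuine, mu_forged, sd_forged):
    """BF = p(feature | genuine writer) / p(feature | forger)."""
    return (normal_pdf(x, mu_genuine, sd_genuine)
            / normal_pdf(x, mu_forged, sd_forged))

# (observed value, genuine mean, genuine sd, forged mean, forged sd) per
# hypothetical feature, e.g. duration, pen pressure, loop size.
features = [(1.9, 2.0, 0.2, 3.0, 0.5),
            (0.8, 1.0, 0.3, 0.4, 0.3),
            (5.1, 5.0, 0.5, 6.0, 1.0)]

bf_total = 1.0
for args in features:
    bf_total *= univariate_bf(*args)
print(bf_total)   # combined BF; correlations between features are ignored
```

Because dynamic-signature features are strongly correlated, multiplying univariate BFs double-counts shared information, which is the reliability issue the abstract raises in favour of a genuinely multivariate model.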
30. A probabilistic approach to evaluate salivary microbiome in forensic science when the Defense says: 'It is my twin brother'.
- Author
-
Bozza, S., Scherz, V., Greub, G., and Taroni, F.
- Subjects
FORENSIC sciences ,TWINS ,BROTHERS ,DNA fingerprinting - Abstract
Salivary microbiota profiles may represent a valid contribution to forensic investigation when standard DNA genotyping methods fail. Starting from questioned and control materials in the form of saliva, the evidence can be expressed by means of a distance between those materials that takes into account specific aspects of the microbiota composition. The value of the evidence for forensic discrimination purposes is quantified by means of a Bayes' factor, which allows one to overcome the major limitations and pitfalls of intuition connected to the use of cut-off values as a means of decision. • Salivary microbiota composition to support discrimination between related individuals. • Beta-diversity indices to highlight taxonomical differences between pairs of individuals. • A probabilistic approach to evaluate the salivary microbiome. [ABSTRACT FROM AUTHOR]
- Published
- 2022
- Full Text
- View/download PDF
31. A generalised Bayes' factor formula for evidence evaluation under activity level propositions: Variations around a fibres scenario
- Author
-
Franco Taroni, Colin Aitken, and Paolo Garbolino
- Subjects
Activity level ,scientific evidence evaluation ,010401 analytical chemistry ,Bayes' factor ,Bayesian network ,Bayes factor ,Bayes' factor, scientific evidence evaluation, activity level proposition ,01 natural sciences ,0104 chemical sciences ,Pathology and Forensic Medicine ,03 medical and health sciences ,Bayes' theorem ,A fibres ,0302 clinical medicine ,activity level proposition ,Econometrics ,030216 legal & forensic medicine ,Law ,Value (mathematics) ,Mathematics - Abstract
Generalised Bayes' factors and associated Bayesian networks are developed for the transfer of extrinsic evidence at the activity level, developments that extend previous work on activity level evaluation. A strategy for the assessment of extrinsic evidence is developed in stages with progressive increases in complexity. The final development is illustrated with an example involving fibres from clothing. This provides a list of factors involved in the consideration of a transfer case with activity level propositions and their roles in the determination of evidential value.
- Published
- 2021
- Full Text
- View/download PDF
32. Sequential analyses in psychological research using Bayesian statistics
- Author
-
Klintefors, Pierre and Klintefors, Pierre
- Abstract
It is important in psychological research to use well-planned methods that are as time and resource efficient as possible, without jeopardizing the reliability and validity of psychological science. The present paper aims to test how sequential analyses could be implemented in psychological research using Bayesian statistics. With sequential analyses it is possible to stop an experiment or study at the data collection stage for success or futility. To avoid biased estimation and false alarms, a mixture of model testing with the Bayes Factor and Bayesian parameter estimation was used as the stopping rule. After several runs of Monte Carlo simulations, it appears that a Bayes' Factor (BF) boundary of 6, together with a 95% highest density interval (HDI) width below SD*0.60, serves as a suitable stopping rule under the conditions of the simulations. However, generalizability is limited by the simulation settings, and the stopping rules are recommended to be implemented on data from real conducted experiments.
- Published
- 2019
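The sequential stopping rule discussed in record 32 can be sketched as a loop that accumulates a Bayes factor observation by observation and stops when it crosses a boundary. For simplicity the BF below compares two simple hypotheses about a Normal mean, and the paper's second criterion (the posterior HDI width) is omitted; the boundary of 6 follows the abstract, everything else is illustrative:

```python
import math
import random

# Hedged sketch of sequential testing with a BF stopping boundary of 6.
# The simple-vs-simple BF here is a stand-in for the paper's model tests.

def sequential_bf(data_stream, boundary=6.0, mu0=0.0, mu1=0.5, sd=1.0):
    """Collect observations until BF10 crosses boundary or 1/boundary."""
    log_bf = 0.0
    n = 0
    for n, x in enumerate(data_stream, start=1):
        # Log-likelihood-ratio increment for one observation.
        log_bf += (-0.5 * ((x - mu1) / sd) ** 2
                   + 0.5 * ((x - mu0) / sd) ** 2)
        bf = math.exp(log_bf)
        if bf >= boundary or bf <= 1.0 / boundary:
            return n, bf                    # stop early: success or futility
    return n, math.exp(log_bf)              # data exhausted without stopping

random.seed(1)
stream = (random.gauss(0.5, 1.0) for _ in range(1000))  # true effect = mu1
n_stop, bf_stop = sequential_bf(stream)
print(n_stop, bf_stop)
```

With a genuine effect, the log-BF drifts upward and the boundary is typically reached after a few dozen observations, which is the resource saving sequential designs aim for; the mixture with an HDI-width criterion in the paper is what guards the resulting estimates against bias.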
33. A Bayesian Approach to the Overlap Analysis of Epidemiologically Linked Traits
- Author
-
Asimit, Jennifer L, Panoutsopoulou, Kalliope, Wheeler, Eleanor, Berndt, Sonja I, GIANT Consortium, The ArcOGEN Consortium, Cordell, Heather J, Morris, Andrew P, Zeggini, Eleftheria, Barroso, Inês, Asimit, Jennifer [0000-0002-4857-2249], Wheeler, Eleanor [0000-0002-8616-6444], Barroso, Ines [0000-0001-5800-4520], and Apollo - University of Cambridge Repository
- Subjects
obesity ,Models, Genetic ,Bayes Theorem ,threshold calibration ,Polymorphism, Single Nucleotide ,Bayes’ factor ,Body Mass Index ,P-value ,osteoarthritis ,Phenotype ,Quantitative Trait, Heritable ,overlap analysis ,Data Interpretation, Statistical ,Sample Size ,Humans ,Computer Simulation ,Genome-Wide Association Study ,Probability - Abstract
Diseases often co-occur in individuals more often than expected by chance, and this may be explained by shared underlying genetic etiology. A common approach to genetic overlap analyses is to use summary genome-wide association study data to identify single-nucleotide polymorphisms (SNPs) that are associated with multiple traits at a selected P-value threshold. However, P-values do not account for differences in power, whereas Bayes' factors (BFs) do, and may be approximated using summary statistics. We use simulation studies to compare the power of frequentist and Bayesian approaches to overlap analyses, and to decide on appropriate thresholds for comparison between the two methods. It is empirically illustrated that BFs have the advantage over P-values of a decreasing type I error rate as study size increases for single-disease associations. Consequently, the overlap analysis of traits from different-sized studies encounters issues in fair P-value threshold selection, whereas BFs are adjusted automatically. Extensive simulations show that Bayesian overlap analyses tend to have higher power than those that assess association strength with P-values, particularly in low-power scenarios. Calibration tables between BFs and P-values are provided for a range of sample sizes, as well as an approximation approach for sample sizes that are not in the calibration table. Although P-values are sometimes thought more intuitive, these tables assist in removing the opaqueness of Bayesian thresholds and may also be used in the selection of a BF threshold to meet a certain type I error rate. An application of our methods is used to identify variants associated with both obesity and osteoarthritis.
- Published
- 2018
- Full Text
- View/download PDF
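Record 33's point that BFs "may be approximated using summary statistics" can be illustrated with one widely used form, Wakefield's asymptotic Bayes factor, which needs only the effect estimate, its standard error, and a prior variance for the effect size. Treat the prior variance choice below as an analyst's assumption, not the paper's setting:

```python
import math

# Hedged sketch: approximate BF for association from GWAS summary statistics
# (Wakefield-style asymptotic BF with a N(0, W) effect-size prior).

def approx_bf(beta_hat, se, prior_var=0.04):
    """BF for association vs. no association from (beta_hat, se)."""
    v = se ** 2
    z2 = (beta_hat / se) ** 2
    shrink = prior_var / (v + prior_var)
    return math.sqrt(v / (v + prior_var)) * math.exp(0.5 * z2 * shrink)

# Two SNPs with the same z-score (hence the same P-value) from studies of
# different size: the BF reflects the extra information of the larger study.
bf_small = approx_bf(beta_hat=0.40, se=0.10)   # small study
bf_large = approx_bf(beta_hat=0.08, se=0.02)   # large study, same z = 4
print(bf_small, bf_large)
```

Equal P-values map to different BFs depending on study size, which is exactly why the paper argues that a fixed P-value threshold is unfair when overlapping traits measured in different-sized studies, while a BF threshold adjusts automatically.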
34. A probabilistic approach to evaluate salivary microbiome in forensic science when the Defense says: `It is my twin brother'.
- Author
-
Bozza S, Scherz V, Greub G, and Taroni F
- Subjects
- Bayes Theorem, Forensic Medicine, Humans, Male, Saliva, Microbiota genetics, Siblings
- Abstract
Salivary microbiota profiles may represent a valid contribution to forensic investigation when standard DNA genotyping methods fail. Starting from questioned and control materials in the form of saliva, the evidence can be expressed by means of a distance between those materials that takes into account specific aspects of the microbiota composition. The value of the evidence for forensic discrimination purposes is quantified by means of a Bayes' factor, which allows one to overcome the major limitations and pitfalls of intuition connected to the use of cut-off values as a means of decision., (Copyright © 2021 Elsevier B.V. All rights reserved.)
- Published
- 2022
- Full Text
- View/download PDF
35. A Bayesian Approach to the Overlap Analysis of Epidemiologically Linked Traits
- Author
-
Eleftheria Zeggini, Eleanor Wheeler, Sonja I. Berndt, Kalliope Panoutsopoulou, Inês Barroso, Heather J. Cordell, Jennifer L. Asimit, and Andrew P. Morris
- Subjects
obesity ,Epidemiology ,Computer science ,Calibration (statistics) ,Bayesian probability ,threshold calibration ,Polymorphism, Single Nucleotide ,Body Mass Index ,Bayes' theorem ,Quantitative Trait, Heritable ,Frequentist inference ,overlap analysis ,Statistics ,Econometrics ,Humans ,Computer Simulation ,p-value ,P‐value ,Genetics (clinical) ,Research Articles ,Probability ,Models, Genetic ,Bayes factor ,Bayes Theorem ,Bayes’ factor ,osteoarthritis ,Phenotype ,Sample size determination ,Data Interpretation, Statistical ,Sample Size ,Type I and type II errors ,Research Article ,Genome-Wide Association Study - Abstract
Diseases often co-occur in individuals more often than expected by chance, and this may be explained by shared underlying genetic etiology. A common approach to genetic overlap analyses is to use summary genome-wide association study data to identify single-nucleotide polymorphisms (SNPs) that are associated with multiple traits at a selected P-value threshold. However, P-values do not account for differences in power, whereas Bayes' factors (BFs) do, and may be approximated using summary statistics. We use simulation studies to compare the power of frequentist and Bayesian approaches to overlap analyses, and to decide on appropriate thresholds for comparison between the two methods. It is empirically illustrated that BFs have the advantage over P-values of a decreasing type I error rate as study size increases for single-disease associations. Consequently, the overlap analysis of traits from different-sized studies encounters issues in fair P-value threshold selection, whereas BFs are adjusted automatically. Extensive simulations show that Bayesian overlap analyses tend to have higher power than those that assess association strength with P-values, particularly in low-power scenarios. Calibration tables between BFs and P-values are provided for a range of sample sizes, as well as an approximation approach for sample sizes that are not in the calibration table. Although P-values are sometimes thought more intuitive, these tables assist in removing the opaqueness of Bayesian thresholds and may also be used in the selection of a BF threshold to meet a certain type I error rate. An application of our methods is used to identify variants associated with both obesity and osteoarthritis.
- Published
- 2015
36. Bayesian multivariate models for case assessment in dynamic signature cases.
- Author
-
Linden, Jacques, Taroni, Franco, Marquis, Raymond, and Bozza, Silvia
- Subjects
- *
FORENSIC document examination , *SIGNATURES (Writing) , *BAYES' theorem , *BAYESIAN analysis , *FORENSIC sciences - Abstract
Dynamic signatures are recordings of signatures made on digitizing devices such as tablet PCs. These handwritten signatures contain both dynamic and spatial information on every data point collected during the signature movement and can therefore be described in the form of multivariate data. The management of dynamic signatures represents a challenge for the forensic science community through its novelty and the volume of data available. Much as for static signatures, the authenticity of dynamic signatures may be doubted, which leads to a forensic examination of the unknown-source signature. The Bayes' factor, as a measure of evidential support, can be assigned with statistical models to discriminate between competing propositions. In this respect, the limitations of existing probabilistic solutions for dealing with dynamic signature evidence are pointed out and explained in detail. In particular, the necessity to remove the independence assumption between questioned and reference material is emphasized. [ABSTRACT FROM AUTHOR]
- Published
- 2021
- Full Text
- View/download PDF
37. The use of the likelihood ratio for evaluative and investigative purposes in comparative forensic handwriting examination
- Author
-
Silvia Bozza, Franco Taroni, A. Thiéry, Alex Biedermann, Matthieu Schmittbuhl, and Raymond Marquis
- Subjects
Male ,Multivariate statistics ,Handwriting ,Computer science ,Speech recognition ,Inference ,computer.software_genre ,Evidence evaluation ,Pathology and Forensic Medicine ,Handwriting evidence ,Sex Factors ,Robustness (computer science) ,Humans ,Bayes’ factor ,Likelihood ratio ,Sex determination ,Investigation ,Likelihood Functions ,business.industry ,Forensic Sciences ,Bayes factor ,Continuous data ,Female ,Artificial intelligence ,business ,Law ,computer ,Natural language processing - Abstract
This paper extends previous research and discussion on the use of multivariate continuous data, which are about to become more prevalent in forensic science. As an illustrative example, attention is drawn here to the area of comparative handwriting examinations. Multivariate continuous data can be obtained in this field by analysing the contour shape of loop characters through Fourier analysis. This methodology, based on existing research in this area, allows one to describe in detail the morphology of character contours through a set of variables. This paper uses data collected from female and male writers to conduct a comparative analysis of likelihood ratio based evidence assessment procedures in both evaluative and investigative proceedings. While the use of likelihood ratios in the former situation is now rather well established (typically, in order to discriminate between propositions of authorship of a given individual versus another, unknown individual), the investigative setting still receives little consideration in practice. This paper seeks to highlight that investigative settings, too, can represent an area of application for which the likelihood ratio can offer logical support. As an example, the inference of the gender of the writer of an incriminated handwritten text is put forward, analysed and discussed in this paper. The more general viewpoint according to which likelihood ratio analyses can be helpful for investigative proceedings is supported here through various simulations. These offer a characterisation of the robustness of the proposed likelihood ratio methodology.
- Published
- 2012
38. Bayesian model and variable selection in generalized linear models and implementation of the MC3 algorithm
- Author
-
Dedakis, Giannis G., Φουσκάκης, Δημήτρης, Παπανικολάου, Βασίλης, Κοκολάκης, Γιώργος, and Εθνικό Μετσόβιο Πολυτεχνείο. Σχολή Εφαρμοσμένων Μαθηματικών και Φυσικών Επιστημών. Τομέας Μαθηματικών.
- Subjects
Bayes’ theorem ,Μπεϋζιανή στατιστική ,Bayesian model and variable selection ,Παράγοντας Bayes ,MC3 algorithm ,αλγόριθμος MC3 ,Θεώρημα Bayes ,generalized linear models ,γενικευμένα γραμμικά μοντέλα ,Bayesian statistics ,Μπεϋζιανή επιλογή μοντέλων και μεταβλητών ,Bayes’ factor - Abstract
Στην παρούσα διπλωματική εργασία, η ανάπτυξη του προβλήματος επιλογής μοντέλου και μεταβλητών εξετάζεται από τη σκοπιά της Μπεϋζιανής Στατιστικής. Συγκεκριμένα, εξετάζεται η γενική θεωρία επιλογής μοντέλων και μεταβλητών με αναφορά στα δημοφιλή γενικευμένα γραμμικά μοντέλα καθώς επίσης και ο τρόπος με τον οποίο προσεγγίζουμε το παραπάνω πρόβλημα από τη σκοπιά της Μπεϋζιανής θεωρίας. Ο όρος «Μπεϋζιανή» έχει αναφορά στον Thomas Bayes (1702-1761), ο οποίος απέδειξε μια ειδική περίπτωση αυτού που καλείται τώρα το Θεώρημα του Bayes. Ωστόσο, ήταν ο Pierre Simon Laplace (1749-1827) ο οποίος παρουσίασε μια γενική μορφή του Θεωρήματος και το χρησιμοποίησε για την προσέγγιση των προβλημάτων στην ουράνια μηχανική, στην επεξεργασία ιατρικών στοιχείων και στη νομολογία. Στη δεκαετία του 1980 υπήρξε μια δραματική αύξηση στον τομέα της έρευνας και των εφαρμογών των Μπεϋζιανών μεθόδων, γεγονός που ως επί το πλείστον οφείλεται στην ανακάλυψη Markov Chain Monte Carlo τεχνικών, οι οποίες ήραν πολλά από τα υπολογιστικά προβλήματα που παρουσιάζονταν μέχρι τότε κατά την εφαρμογή των μεθόδων αυτών. Η Στατιστική κατά Bayes βασίζεται σε μία απλή ιδέα: η μόνη ικανοποιητική περιγραφή της αβεβαιότητας μας επιτυγχάνεται μέσω της πιθανότητας. Η Μπεϋζιανή προσέγγιση μας δίνει, μέσω του υπολογισμού πιθανοτήτων, ένα ισχυρό εργαλείο να καταλάβουμε, να χειριστούμε και να ελέγξουμε την αβεβαιότητα. Ο βασικός κανόνας στη Μπεϋζιανή συμπερασματολογία είναι ότι όλες οι άγνωστες ποσότητες θεωρούνται τυχαίες μεταβλητές και πρέπει να περιγράφονται δια μέσου πιθανοτήτων. Η Στατιστική συμπερασματολογία χρησιμοποιείται για την εξαγωγή συμπερασμάτων από τα δεδομένα που έχει στη διάθεση του ο ερευνητής για τον πληθυσμό. Βασικό εργαλείο όλων των Μπεϋζιανών μεθόδων είναι οι εκ των προτέρων (prior) κατανομές. 
Οι κατανομές αυτές εκφράζουν τις εκ των προτέρων γνώσεις και πεποιθήσεις του ερευνητή για τις άγνωστες παραμέτρους του μοντέλου και μέσω της Μπεϋζιανής μεθοδολογίας οδηγούν σε εκ των υστέρων (posterior) κατανομές. Στις εκ των υστέρων κατανομές εμπεριέχεται όλη η στατιστική συμπερασματολογία των αγνώστων αυτών παραμέτρων όπως αυτή έχει προκύψει από την Μπεϋζιανή ανάλυση. Η ιδέα της εκ των προτέρων κατανομής αποτελεί και την «καρδιά» της θεωρίας κατά Bayes και θεωρείται το μεγαλύτερο πλεονέκτημα ή το σοβαρότερο μειονέκτημα έναντι της κλασικής Στατιστικής. Η παρούσα διπλωματική διαρθρώνεται σε τέσσερα κεφάλαια ως εξής: Στο πρώτο κεφάλαιο γίνεται μια εισαγωγή στα γενικευμένα γραμμικά μοντέλα και αναφέρονται κάποιες βασικές έννοιες και ιδιότητες των μοντέλων που ανήκουν σε αυτήν την κατηγορία. Στο δεύτερο κεφάλαιο αναπτύσσονται οι βασικές αρχές της Μπεϋζιανής Στατιστικής θεωρίας όπου μεταξύ άλλων δίνεται ο ορισμός της εκ των προτέρων κατανομής, της εκ των υστέρων κατανομής και του Θεωρήματος Bayes. Το τρίτο κεφάλαιο περιγράφει τον τρόπο με τον οποίο η Μπεϋζιανή θεωρία αντιμετωπίζει το πρόβλημα της επιλογής μοντέλων και μεταβλητών στα γενικευμένα γραμμικά μοντέλα και αναφέρονται όλες οι βασικές έννοιες που συνδέονται με το πρόβλημα αυτό. Τέλος, στο τέταρτο κεφάλαιο αναλύεται κατά κύριο λόγο ο αλγόριθμος MC3 (Markov Chain Monte Carlo Model Composition) που αποτελεί μία από τις πολλές Μπεϋζιανές υπολογιστικές μεθόδους για την επιλογή μοντέλων καθώς και τρεις εφαρμογές του αλγορίθμου αυτού σε πραγματικά δεδομένα., In this thesis, the development of model and variable selection problem examined from the perspective of Bayesian Statistics. Specifically, we consider the general theory of model and variable selection with reference to the popular generalized linear models as well as how we approach the above problem from the perspective of Bayesian theory. 
The term "Bayesian" has reference to Thomas Bayes (1702-1761), who proved a special case of what is now called the Bayes’ Theorem. However, it was Pierre Simon Laplace (1749-1827) who presented a general form of the Theorem and used it to approach problems in celestial mechanics, to the processing of medical data and case law. In the 1980's there was a dramatic increase in research and applications of Bayesian methods, which are mostly due to the discovery of Markov Chain Monte Carlo techniques, which removed many of the computational problems that occurred previously in the application of these methods. The Bayesian Statistics is based in a simple idea: the only satisfactory description of our uncertainty is achieved through the probability. The Bayesian approach give us, by calculating probabilities, a powerful tool to understand, manipulate and control the uncertainty. The basic rule in Bayesian inference is that all unknown quantities are random variables and must be described through probabilities. The Statistical inference is used to draw conclusions from data that is available to the researcher for the population. The basic tool of all Bayesian methods is the prior distributions. These distributions are expressing the prior knowledge and beliefs of the researcher for the unknown model parameters and through the Bayesian methodology lead to the posterior distributions. In the posterior distributions is included all the statistical inference for the unknown parameters such as resulting from the Bayesian analysis. The concept of prior distribution is the "heart" of the Bayes’ Theory and considered the biggest advantage or the serious disadvantage against the classical statistics. This thesis is divided into four chapters as follows: The first chapter is an introduction to generalized linear models, referred to some basic concepts and properties of models in this category. 
The second chapter describes the basic principles of Bayesian statistical theory, including the definitions of the prior distribution, the posterior distribution, and Bayes' theorem. The third chapter describes how Bayesian theory approaches the model and variable selection problem in generalized linear models and lists all the basic concepts associated with this problem. Finally, the fourth chapter mainly analyzes the MC3 (Markov Chain Monte Carlo Model Composition) algorithm, one of many Bayesian computational methods for model selection, and presents three applications of this algorithm to real data., Ιωάννης Γ. Δεδάκης
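The prior-to-posterior updating this abstract describes can be sketched with a minimal conjugate example; the Beta-Binomial setting and the numbers below are illustrative assumptions, not taken from the thesis:

```python
# Bayes' theorem in its simplest conjugate form: posterior ∝ likelihood × prior.
# A Beta(a, b) prior on a success probability p, combined with k successes
# in n Bernoulli trials, yields a Beta(a + k, b + n - k) posterior.
def beta_binomial_posterior(a, b, k, n):
    """Parameters of the Beta posterior after observing k successes in n trials."""
    return a + k, b + (n - k)

# Flat prior Beta(1, 1); observe 7 successes in 10 trials.
a_post, b_post = beta_binomial_posterior(1, 1, 7, 10)
posterior_mean = a_post / (a_post + b_post)  # = 8 / 12, i.e. 2/3
```

The posterior concentrates around the observed frequency as data accumulate, while the prior dominates when data are scarce, which is exactly the trade-off the abstract calls the "heart" of Bayesian theory.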
- Published
- 2011
- Full Text
- View/download PDF
39. 'Blame the brother' - Assessment of forensic DNA evidence when alternative explanations have different likelihoods
- Author
-
Nordgaard, Anders, Ansell, Ricky, and Hedell, Ronny
- Abstract
In a crime case where a suspect is assumed to be the donor of a recovered stain, forensic DNA evidence presented in terms of a likelihood ratio is straightforward as long as the set of alternative donors contains no close relative of the suspect, since a close relative has a higher likelihood than an individual unrelated to the suspect. The state of the art today at several laboratories is to report the likelihood ratio but with a reservation stating its lack of validity if the stain originates from a close relative. Buckleton et al.[†] derived a so-called extended likelihood ratio for reporting DNA evidence values when a full sibling is present in the set of potential alternative donors. This approach requires consideration of prior probabilities for each alternative donor being the source of the stain and may therefore be problematic to apply in practice. Here we present an alternative way of using prior probabilities in the extended likelihood ratio when the latter is reported on an ordinal scale of conclusions. Our example shows that for a 12-STR-marker profile, using the extended likelihood ratio approach would not change the reported level compared to the ordinary likelihood ratio approach, unless the close relative has a very high prior probability of being the donor compared to an unrelated individual. [†] Buckleton JS, Triggs CM, Champod C., Science & Justice 46: 69-78.
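The weighting scheme behind an extended likelihood ratio can be sketched as follows; the function and the likelihood values are hypothetical illustrations of the idea, not the authors' actual implementation:

```python
def extended_lr(lik_suspect, lik_sibling, lik_unrelated, prior_sibling):
    """Extended likelihood ratio when the alternative-donor pool mixes a
    full sibling (prior weight prior_sibling) with unrelated individuals."""
    denom = prior_sibling * lik_sibling + (1 - prior_sibling) * lik_unrelated
    return lik_suspect / denom

# Illustrative likelihoods of observing the stain profile under each hypothesis:
# a sibling matches far more easily than an unrelated person.
lr_plain = extended_lr(1.0, 1e-4, 1e-12, 0.0)   # sibling ignored: LR ~ 1e12
lr_mixed = extended_lr(1.0, 1e-4, 1e-12, 0.01)  # sibling given 1% prior weight
```

Even a small prior weight on the sibling dominates the denominator and pulls the reported value down by orders of magnitude, which is why the choice of prior probabilities matters so much in practice.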
- Published
- 2011
40. Likelihood Ratio as Weight of Forensic Evidence: A Closer Look.
- Author
-
Lund SP and Iyer H
- Abstract
The forensic science community has increasingly sought quantitative methods for conveying the weight of evidence. Experts from many forensic laboratories summarize their findings in terms of a likelihood ratio. Several proponents of this approach have argued that Bayesian reasoning proves it to be normative. We find this likelihood ratio paradigm to be unsupported by arguments of Bayesian decision theory, which applies only to personal decision making and not to the transfer of information from an expert to a separate decision maker. We further argue that decision theory does not exempt the presentation of a likelihood ratio from uncertainty characterization, which is required to assess the fitness for purpose of any transferred quantity. We propose the concept of a lattice of assumptions leading to an uncertainty pyramid as a framework for assessing the uncertainty in an evaluation of a likelihood ratio. We demonstrate the use of these concepts with illustrative examples regarding the refractive index of glass and automated comparison scores for fingerprints.
- Published
- 2017
- Full Text
- View/download PDF
41. A Bayesian Approach to the Overlap Analysis of Epidemiologically Linked Traits.
- Author
-
Asimit JL, Panoutsopoulou K, Wheeler E, Berndt SI, Cordell HJ, Morris AP, Zeggini E, and Barroso I
- Subjects
- Bayes Theorem, Body Mass Index, Computer Simulation, Data Interpretation, Statistical, Humans, Models, Genetic, Obesity genetics, Osteoarthritis genetics, Phenotype, Polymorphism, Single Nucleotide genetics, Probability, Sample Size, Genome-Wide Association Study methods, Obesity epidemiology, Osteoarthritis epidemiology, Quantitative Trait, Heritable
- Abstract
Diseases often co-occur in individuals more often than expected by chance, which may be explained by shared underlying genetic etiology. A common approach to genetic overlap analyses is to use summary genome-wide association study data to identify single-nucleotide polymorphisms (SNPs) that are associated with multiple traits at a selected P-value threshold. However, P-values do not account for differences in power, whereas Bayes factors (BFs) do, and they may be approximated using summary statistics. We use simulation studies to compare the power of frequentist and Bayesian approaches to overlap analyses, and to decide on appropriate thresholds for comparison between the two methods. It is empirically illustrated that BFs have the advantage over P-values of a decreasing type I error rate as study size increases for single-disease associations. Consequently, the overlap analysis of traits from different-sized studies encounters issues in fair P-value threshold selection, whereas BFs are adjusted automatically. Extensive simulations show that Bayesian overlap analyses tend to have higher power than those that assess association strength with P-values, particularly in low-power scenarios. Calibration tables between BFs and P-values are provided for a range of sample sizes, as well as an approximation approach for sample sizes that are not in the calibration table. Although P-values are sometimes thought more intuitive, these tables assist in removing the opaqueness of Bayesian thresholds and may also be used in the selection of a BF threshold to meet a certain type I error rate. An application of our methods is used to identify variants associated with both obesity and osteoarthritis., (© 2015 The Authors. Genetic Epidemiology published by Wiley Periodicals, Inc.)
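One widely used way to approximate a Bayes factor from GWAS summary statistics is a Wakefield-style asymptotic Bayes factor; the sketch below uses that approximation under an assumed normal effect prior, and the function name, the default prior standard deviation, and the example values are illustrative assumptions, not necessarily the exact method of this paper:

```python
import math

def approx_bayes_factor(beta_hat, se, prior_sd=0.2):
    """Asymptotic Bayes factor (alternative vs. null) from a summary
    effect estimate beta_hat, its standard error se, and a N(0, prior_sd^2)
    prior on the true effect size."""
    V = se ** 2        # sampling variance of the estimate
    W = prior_sd ** 2  # prior variance of the true effect
    z = beta_hat / se  # usual Wald z-statistic
    shrink = V / (V + W)
    return math.sqrt(shrink) * math.exp((z ** 2) * (1 - shrink) / 2)

# A stronger signal (larger |z|) yields a larger Bayes factor, and the
# standard error enters the formula directly, which is how BFs account
# for differences in study power while a bare P-value does not.
bf_strong = approx_bayes_factor(0.10, 0.02)  # z = 5
bf_null = approx_bayes_factor(0.00, 0.02)    # z = 0, so BF < 1
```

Because `se` shrinks with sample size, the same z-statistic from a larger study produces a different BF, which is the mechanism behind the automatic adjustment the abstract describes.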
- Published
- 2015
- Full Text
- View/download PDF