3,682 results for "Delta method"
Search Results
2. Applying RBF-Delta to Estimate the Aerodynamic Coefficients of an Aircraft in Longitudinal Flight
- Author
-
Maheshwari, Pratham, Kumar, Ajit, Chaari, Fakher, Series Editor, Gherardini, Francesco, Series Editor, Ivanov, Vitalii, Series Editor, Haddar, Mohamed, Series Editor, Cavas-Martínez, Francisco, Editorial Board Member, di Mare, Francesca, Editorial Board Member, Kwon, Young W., Editorial Board Member, Tolio, Tullio A. M., Editorial Board Member, Trojanowska, Justyna, Editorial Board Member, Schmitt, Robert, Editorial Board Member, Xu, Jinyang, Editorial Board Member, Kumar, Ajit, editor, Iyer, Ganesh, editor, Desai, Ulkesh, editor, and Kumar, Arun, editor
- Published
- 2025
- Full Text
- View/download PDF
3. The law of the iterated logarithm for functionals of the Wiener process
- Author
-
Logachov, A. and Yambartsev, A.
- Published
- 2025
- Full Text
- View/download PDF
4. An improved method for biometric analysis of soil test – crop response data sets.
- Author
-
Rohan, Maheswaran and Conyers, Mark
- Subjects
- *
SOIL testing , *PLANT yields , *PHOSPHORUS in soils , *STATISTICAL models , *AGRICULTURE - Abstract
Context: To increase cereal production, primary producers want to know the amount of fertiliser that needs to be applied to achieve high yield. To calculate the critical soil test value (CSTV), especially in Colwell-P, several models were found in the literature. The arcsine-log calibration curve has been commonly used in Australia to estimate the CSTV. However, this method has some mathematical weaknesses, which tend to give underestimated values for CSTV. Aim: In this paper, we describe the mathematical issues and propose a model to overcome these issues. The simplified model proposed allows us to estimate the CSTV and its standard error. Method: We have applied the regression and the delta method to the data used in the arcsine-log calibration curve (ALCC) method. Key results: Based on the given data, a soil test value of 31.5 mg P kg⁻¹ soil is required to achieve 90% relative yield of wheat, which is the middle ground of previously published critical values between the underestimate (21.4 mg kg⁻¹) generated by the ALCC algorithm and the overestimate (40 mg kg⁻¹) generated by the conventional Mitscherlich method. Conclusions: Advantages of this method are: (1) simple to apply to any data sets; and (2) easy to incorporate other covariates into the models. This method should be applied for computing estimates of CSTV and its standard error because it overcomes the contentious issue of the division of the y-axis by the correlation coefficient. Implication: The proposed method should replace the ALCC algorithm, and the current P values used in farming may need to be updated. There are various methods available for the calculation of critical soil test values. We provide a new method that overcomes the shortcomings of previous approaches and provides a result for critical soil test value that is intermediate to the conventional Mitscherlich and the arcsine-log calibration curve (ALCC) methods. We recommend that this new proposed method be applied to a broad range of soil test data sets in order to establish its utility as the preferred method. [ABSTRACT FROM AUTHOR]
- Published
- 2025
- Full Text
- View/download PDF
5. Statistical Inference for Box–Cox based Receiver Operating Characteristic Curves.
- Author
-
Bantis, Leonidas E., Brewer, Benjamin, Nakas, Christos T., and Reiser, Benjamin
- Subjects
- *
RECEIVER operating characteristic curves , *INFERENTIAL statistics , *ACCOUNTING methods , *SENSITIVITY & specificity (Statistics) , *DIAGNOSIS methods - Abstract
Receiver operating characteristic (ROC) curve analysis is widely used in evaluating the effectiveness of a diagnostic test/biomarker or classifier score. A parametric approach for statistical inference on ROC curves based on a Box–Cox transformation to normality has frequently been discussed in the literature. Many investigators have highlighted the difficulty of taking into account the variability of the estimated transformation parameter when carrying out such an analysis. This variability is often ignored and inferences are made by considering the estimated transformation parameter as fixed and known. In this paper, we review the literature discussing the use of the Box–Cox transformation for ROC curves and the methodology for accounting for the estimation of the Box–Cox transformation parameter in the context of ROC analysis, and detail its application to a number of problems. We present a general framework for inference on any functional of interest, including common measures such as the AUC, the Youden index, and the sensitivity at a given specificity (and vice versa). We have further developed a new R package (named 'rocbc') that carries out all discussed approaches and is available on CRAN. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
6. Efficacy assessment in crop protection: a tutorial on the use of Abbott's formula.
- Author
-
Piepho, Hans-Peter, Malik, Waqas Ahmed, Bischoff, Robert, El-Hasan, Abbas, Scheer, Christian, Sedlmeier, Jan Erik, Gerhards, Roland, Petschenka, Georg, and Voegele, Ralf T.
- Subjects
- *
WEED science , *LEAST squares , *ANALYSIS of variance , *BLOCK designs , *PLANT diseases - Abstract
In 1925, the American entomologist Walter Sidney Abbott proposed an equation for assessing efficacy, and it is still widely used today for analysing controlled experiments in crop protection and phytomedicine. Typically, this equation is applied to each experimental unit and the efficacy estimates thus obtained are then used in analysis of variance and least squares regression procedures. However, particularly regarding the common assumptions of homogeneity of variance and normality, this approach is often inaccurate. In this tutorial paper, we therefore revisit Abbott's equation and outline an alternative route to analysis via generalized linear mixed models that can satisfactorily deal with these distributional issues. Nine examples from entomology, weed science and phytopathology, each with a different focus and methodological peculiarity, are used to illustrate the framework. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
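Editorial aside on the entry above: Abbott's 1925 formula expresses efficacy as the proportional reduction relative to the untreated control. A minimal sketch with hypothetical numbers (the generalized linear mixed model route the paper recommends is not reproduced here):

```python
def abbott_efficacy(treated: float, control: float) -> float:
    """Abbott's formula: percent efficacy corrected for what happens in the
    untreated control; `treated` and `control` are observed levels
    (e.g., surviving fractions) in treated and control units."""
    return 100 * (1 - treated / control)

# Hypothetical observations: 24% pest survival under treatment,
# 80% survival in the untreated control.
print(abbott_efficacy(treated=0.24, control=0.80))  # -> 70.0
```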
7. A Bayesian approach for estimating the uncertainty on the contribution of nitrogen fixation and calculation of nutrient balances in grain legumes
- Author
-
Francisco Palmero, Trevor J. Hefley, Josefina Lacasa, Luiz Felipe Almeida, Ricardo J. Haro, Fernando O. Garcia, Fernando Salvagiotti, and Ignacio A. Ciampitti
- Subjects
Delta method ,Bootstrapping ,N balance ,Roots ,Plant culture ,SB1-1110 ,Biology (General) ,QH301-705.5 - Abstract
Abstract Background: The proportion of nitrogen (N) derived from the atmosphere (Ndfa) is a fundamental component of the plant N demand in legume species. To estimate the N benefit of grain legumes for the subsequent crop in the rotation, a simplified N balance is frequently used. This balance is calculated as the difference between fixed N and removed N by grains. The Ndfa needed to achieve a neutral N balance (hereafter θ) is usually estimated through a simple linear regression model between Ndfa and N balance. This quantity is routinely estimated without accounting for the uncertainty in the estimate, which is needed to perform formal statistical inference about θ. In this article, we utilized a global database to describe the development of a novel Bayesian framework to quantify the uncertainty of θ. This study aimed to (i) develop a Bayesian framework to quantify the uncertainty of θ, and (ii) contrast the use of this Bayesian framework with the widely used delta and bootstrapping methods under different data availability scenarios. Results: The delta method, bootstrapping, and Bayesian inference provided nearly equivalent numerical values when the range of values for Ndfa was thoroughly explored during data collection (e.g., 6–91%), and the number of observations was relatively high (e.g., ≥ 100). When the Ndfa range tested was narrow and/or the sample size was small, the delta method and bootstrapping provided confidence intervals containing biologically non-meaningful values (i.e., < 0% or > 100%). However, under a narrow Ndfa range and small sample size, the developed Bayesian inference framework obtained biologically meaningful values in the uncertainty estimation. Conclusion: In this study, we showed that the developed Bayesian framework was preferable under limited data conditions (by using informative priors) and when uncertainty estimation had to be constrained (regularized) to obtain meaningful inference. The presented Bayesian framework lays the foundation not only to conduct formal comparisons or hypothesis testing involving θ, but also to learn about its expected value, variance, and higher moments such as skewness and kurtosis under different agroecological and crop management conditions. This framework can also be transferred to estimate balances for other nutrients and/or field crops to gain knowledge on global crop nutrient balances.
- Published
- 2024
- Full Text
- View/download PDF
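Editorial aside on the delta-method baseline compared in the entry above: if the N balance is regressed linearly on Ndfa, balance = a + b·Ndfa, then the neutral-balance point is θ = −a/b, and the delta method propagates the coefficient covariance through the gradient (−1/b, a/b²). A minimal sketch on synthetic data; the linear form and all numbers are illustrative assumptions, not the paper's database:

```python
import numpy as np

rng = np.random.default_rng(1)
ndfa = rng.uniform(6, 91, 120)                       # % N derived from atmosphere
balance = -40 + 0.55 * ndfa + rng.normal(0, 8, 120)  # synthetic N balance

# OLS fit of balance = a + b * ndfa
X = np.column_stack([np.ones_like(ndfa), ndfa])
(a, b), *_ = np.linalg.lstsq(X, balance, rcond=None)
resid = balance - X @ np.array([a, b])
s2 = resid @ resid / (len(ndfa) - 2)
cov = s2 * np.linalg.inv(X.T @ X)        # Cov of (a_hat, b_hat)

theta = -a / b                           # Ndfa giving a neutral balance
grad = np.array([-1 / b, a / b**2])      # gradient of -a/b w.r.t. (a, b)
se = np.sqrt(grad @ cov @ grad)          # delta-method standard error
print(f"theta = {theta:.1f}%, 95% CI ({theta - 1.96*se:.1f}, {theta + 1.96*se:.1f})")
```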
8. Measure of deviancy from marginal mean equality based on cumulative marginal probabilities in square contingency tables.
- Author
-
Ando, Shuji
- Subjects
- *
CONTINGENCY tables , *DEVIANT behavior , *CONFIDENCE intervals , *CLINICAL trials , *PROBABILITY theory - Abstract
This study proposes a measure that can concurrently evaluate the degree and direction of deviancy from the marginal mean equality (ME) model in square contingency tables with ordered categories. The proposed measure is constructed as a function of the row and column cumulative marginal probabilities. When the ME model does not fit the data, we are interested in measuring the degree of deviancy from the ME model, because the only model having weaker restrictions than the ME model is the saturated model. An existing measure, which represents the degree of deviancy from the ME model, does not depend on the probabilities that observations fall in the main-diagonal cells of the table. For data in which observations are concentrated in the main-diagonal cells, the existing measure may overestimate the degree of deviancy from the ME model. The proposed measure can address this issue. This study derives an estimator and an approximate confidence interval for the proposed measure using the delta method. The proposed measure would be useful for comparing degrees of deviancy from the ME model in two datasets. Its usefulness is evaluated through an application to real clinical trial data. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
9. Confidence bounds for compound Poisson process.
- Author
-
Skarupski, Marek and Wu, Qinhao
- Subjects
POISSON processes ,DISTRIBUTION (Probability theory) ,CENTRAL limit theorem ,CONTINUOUS distributions ,ENGINEERING reliability theory - Abstract
The compound Poisson process (CPP) is a common mathematical model for describing many phenomena in medicine, reliability theory and risk theory. However, in the case of low-frequency phenomena, we are often unable to collect a sufficiently large database to conduct analysis. In this article, we focused on methods for determining confidence intervals for the rate of the CPP when the sample size is small. Based on the properties of process parameter estimators, we proposed a new method for constructing such intervals and compared it with other known approaches. In numerical simulations, we used synthetic data from several continuous and discrete distributions. The case of a CPP in which rewards come from an exponential distribution is discussed separately. Recommendations on how to use each method to obtain a more precise confidence interval are given. All simulations were performed in R version 4.2.1. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
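For orientation on the entry above: the textbook large-sample interval for the rate λ of a compound Poisson process observed over a window of length T treats the number of jumps N(T) as Poisson(λT), so Var(λ̂) = λ/T; the paper's point is that this baseline degrades for small samples. A minimal sketch of the naive baseline only, with hypothetical numbers:

```python
import math

n_events = 14        # observed number of jumps N(T) (hypothetical)
T = 10.0             # length of the observation window
lam = n_events / T   # MLE of the rate

se = math.sqrt(lam / T)  # Var(N(T)/T) = lam*T / T^2 = lam / T
print(f"lambda = {lam:.2f}, naive 95% CI ({lam - 1.96*se:.2f}, {lam + 1.96*se:.2f})")
```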
10. Analysis of overlapping count data.
- Author
-
Ryan, Kenneth J., Brydon, Michaela S., Leatherman, Erin R., and Hamada, Michael S.
- Subjects
- *
MAXIMUM likelihood statistics , *POISSON regression , *REGRESSION analysis , *DATA modeling - Abstract
Counts of a specific characteristic were obtained within regions defined on an object that was manufactured in a proprietary setting. The count regions were altered during production and resulted in misaligned or overlapping count data. A closed-formula maximum likelihood estimator (MLE) of the new region means is derived using all of the available count data and an independent Poisson model. The MLE is shown to be preferable to estimators constructed using generalized linear models for the overlapping data setting. This closed-form estimator extends to over-dispersed overlapping count data as the quasi-MLE and also performs well with correlated overlapping count data. Standard errors for the estimator are approximated and are validated with a simulation study. Additionally, the methods are extended to overlapping multinomial data. Illustrative examples of the methods are provided throughout the paper and are reproducible with the supplemental R code. Proofs of the paper's results are also included in the supplemental material. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
11. Estimation and inference on the partial volume under the receiver operating characteristic surface.
- Author
-
Young, Kate J and Bantis, Leonidas E
- Subjects
- *
RECEIVER operating characteristic curves , *PANCREATIC cancer , *CANCER patients , *CANCER diagnosis , *BIOMARKERS - Abstract
Summary measures of biomarker accuracy that employ the receiver operating characteristic surface have been proposed for biomarkers that classify patients into one of three groups: healthy, benign, or aggressive disease. The volume under the receiver operating characteristic surface summarizes the overall discriminatory ability of a biomarker in such configurations, but includes cutoffs associated with clinically irrelevant true classification rates. Due to the lethal nature of pancreatic cancer, cutoffs associated with a low true classification rate for identifying patients with pancreatic cancer may be undesirable and not appropriate for use in a clinical setting. In this project, we study the properties of a more focused criterion, the partial volume under the receiver operating characteristic surface, that summarizes the diagnostic accuracy of a marker in the three-class setting for regions restricted to only those of clinical interest. We propose methods for estimation and inference on the partial volume under the receiver operating characteristic surface under parametric and non-parametric frameworks and apply these methods to the evaluation of potential biomarkers for the diagnosis of pancreatic cancer. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
12. A Bayesian approach for estimating the uncertainty on the contribution of nitrogen fixation and calculation of nutrient balances in grain legumes.
- Author
-
Palmero, Francisco, Hefley, Trevor J., Lacasa, Josefina, Almeida, Luiz Felipe, Haro, Ricardo J., Garcia, Fernando O., Salvagiotti, Fernando, and Ciampitti, Ignacio A.
- Subjects
CROP management ,NITROGEN fixation ,FIELD crops ,INFERENTIAL statistics ,BAYESIAN field theory - Abstract
Background: The proportion of nitrogen (N) derived from the atmosphere (Ndfa) is a fundamental component of the plant N demand in legume species. To estimate the N benefit of grain legumes for the subsequent crop in the rotation, a simplified N balance is frequently used. This balance is calculated as the difference between fixed N and removed N by grains. The Ndfa needed to achieve a neutral N balance (hereafter θ) is usually estimated through a simple linear regression model between Ndfa and N balance. This quantity is routinely estimated without accounting for the uncertainty in the estimate, which is needed to perform formal statistical inference about θ. In this article, we utilized a global database to describe the development of a novel Bayesian framework to quantify the uncertainty of θ. This study aimed to (i) develop a Bayesian framework to quantify the uncertainty of θ, and (ii) contrast the use of this Bayesian framework with the widely used delta and bootstrapping methods under different data availability scenarios. Results: The delta method, bootstrapping, and Bayesian inference provided nearly equivalent numerical values when the range of values for Ndfa was thoroughly explored during data collection (e.g., 6–91%), and the number of observations was relatively high (e.g., ≥ 100). When the Ndfa range tested was narrow and/or the sample size was small, the delta method and bootstrapping provided confidence intervals containing biologically non-meaningful values (i.e., < 0% or > 100%). However, under a narrow Ndfa range and small sample size, the developed Bayesian inference framework obtained biologically meaningful values in the uncertainty estimation. Conclusion: In this study, we showed that the developed Bayesian framework was preferable under limited data conditions (by using informative priors) and when uncertainty estimation had to be constrained (regularized) to obtain meaningful inference. The presented Bayesian framework lays the foundation not only to conduct formal comparisons or hypothesis testing involving θ, but also to learn about its expected value, variance, and higher moments such as skewness and kurtosis under different agroecological and crop management conditions. This framework can also be transferred to estimate balances for other nutrients and/or field crops to gain knowledge on global crop nutrient balances. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
13. Inference of stress-strength reliability based on adaptive progressive type-II censoring from Chen distribution with application to carbon fiber data
- Author
-
Essam A. Ahmed and Laila A. Al-Essa
- Subjects
chen distribution ,stress-strength reliability ,maximum likelihood estimator ,delta method ,bootstrap ,bayes estimator ,markov chain monte carlo ,adaptive progressive censored ,Mathematics ,QA1-939 - Abstract
In this paper, we used the maximum likelihood estimation (MLE) and the Bayes methods to perform estimation procedures for the reliability of stress-strength R = P(Y < X) based on independent adaptive progressive censored samples that were taken from the Chen distribution. An approximate confidence interval of R was constructed using a variety of classical techniques, such as the normal approximation of the MLE, the normal approximation of the log-transformed MLE, and the percentile bootstrap (Boot-p) procedure. Additionally, the asymptotic distribution theory and delta approach were used to generate the approximate confidence interval. Further, the Bayesian estimation of R was obtained based on the balanced loss function, which came in two versions here, the symmetric balanced squared error (BSE) loss function and the asymmetric balanced linear exponential (BLINEX) loss function. When estimating R using the Bayesian approach, all the unknown parameters of the Chen distribution were assumed to be independently distributed and to have informative gamma priors. Additionally, a mixture of Gibbs sampling algorithm and Metropolis-Hastings algorithm was used to compute the Bayes estimate of R and the associated highest posterior density credible interval. In the end, simulation research was used to assess the general overall performance of the proposed estimators and a real dataset was provided to exemplify the theoretical results.
- Published
- 2024
- Full Text
- View/download PDF
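A small illustration of the interval choices listed in the entry above: given an MLE of R = P(Y < X) and a delta-method standard error, a normal interval can be formed on R̂ directly (it may spill outside (0, 1)) or on a transformed scale that keeps the back-transformed interval in range. The numbers below are placeholders and the transform shown is the logit, one common choice; the paper's Chen-distribution MLE and log-transform details are not reproduced:

```python
import math

r_hat, se, z = 0.82, 0.06, 1.96  # MLE of R and delta-method SE (placeholders)

# (1) plain normal approximation: can exceed 1 for R near the boundary
plain = (r_hat - z * se, r_hat + z * se)

# (2) logit scale: SE(logit R) = se / (R (1 - R)) by the delta method
se_logit = se / (r_hat * (1 - r_hat))
center = math.log(r_hat / (1 - r_hat))
expit = lambda u: 1 / (1 + math.exp(-u))
logit_ci = (expit(center - z * se_logit), expit(center + z * se_logit))
print(plain, logit_ci)
```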
14. Inference of stress-strength reliability based on adaptive progressive type-II censoring from Chen distribution with application to carbon fiber data.
- Author
-
Ahmed, Essam A. and Al-Essa, Laila A.
- Subjects
MARKOV chain Monte Carlo ,GIBBS sampling ,BAYES' estimation ,MAXIMUM likelihood statistics ,ASYMPTOTIC distribution - Abstract
In this paper, we used the maximum likelihood estimation (MLE) and the Bayes methods to perform estimation procedures for the reliability of stress-strength R = P(Y < X) based on independent adaptive progressive censored samples that were taken from the Chen distribution. An approximate confidence interval of R was constructed using a variety of classical techniques, such as the normal approximation of the MLE, the normal approximation of the log-transformed MLE, and the percentile bootstrap (Boot-p) procedure. Additionally, the asymptotic distribution theory and delta approach were used to generate the approximate confidence interval. Further, the Bayesian estimation of R was obtained based on the balanced loss function, which came in two versions here, the symmetric balanced squared error (BSE) loss function and the asymmetric balanced linear exponential (BLINEX) loss function. When estimating R using the Bayesian approach, all the unknown parameters of the Chen distribution were assumed to be independently distributed and to have informative gamma priors. Additionally, a mixture of Gibbs sampling algorithm and Metropolis-Hastings algorithm was used to compute the Bayes estimate of R and the associated highest posterior density credible interval. In the end, simulation research was used to assess the general overall performance of the proposed estimators and a real dataset was provided to exemplify the theoretical results. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
15. Stylometric analysis of French plays of the 17th century.
- Author
-
Savoy, Jacques
- Subjects
- *
SEVENTEENTH century , *ATTRIBUTION of authorship , *FRENCH fiction , *DIGITAL humanities - Abstract
The automatic assignment of a text to one or more predefined categories has multiple applications. In this context, the current study focuses on authorship attribution, in which the true author of a doubtful text must be identified. This analysis focuses on the style of sixty-six French comedies in verse written by seventeen supposed authors during the 17th century. The hypothesis we want to verify assumes that the real author is the name appearing on the cover (called the signature hypothesis). In order to validate the reliability of two attribution procedures, we used two additional corpora: 200 extracts of novels written in French by thirty authors, and 140 Italian novels authored by forty persons. After this verification, we propose an improvement of the Delta method as well as a new analysis grid for this model. Finally, we applied these approaches to our French comedy corpus. The results demonstrate that the signature hypothesis must be discarded. Moreover, these works present similar styles, making any attribution difficult to support with a high degree of certainty. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
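Since the "Delta method" in the entry above is Burrows's stylometric Delta rather than the statistical delta method, a minimal sketch of the classic score may help disambiguate: each text is represented by z-scored frequencies of the most frequent words, and Delta is the mean absolute z-score difference. This is the textbook formulation, not the paper's improved variant:

```python
import numpy as np

def burrows_delta(freqs_corpus: np.ndarray, freqs_disputed: np.ndarray) -> np.ndarray:
    """Classic Burrows's Delta over the most frequent words.

    freqs_corpus: (n_texts, n_words) relative word frequencies, one row per
    candidate text; freqs_disputed: (n_words,) for the disputed text.
    Returns one Delta score per candidate text (lower = stylistically closer).
    """
    mu = freqs_corpus.mean(axis=0)
    sd = freqs_corpus.std(axis=0, ddof=1)
    z_corpus = (freqs_corpus - mu) / sd      # z-scores per word, per text
    z_disputed = (freqs_disputed - mu) / sd
    return np.abs(z_corpus - z_disputed).mean(axis=1)

# Toy example: 3 candidate texts, 5 most-frequent-word frequencies.
rng = np.random.default_rng(0)
corpus = rng.random((3, 5))
print(burrows_delta(corpus, corpus[0] + 0.01))  # row 0 should score lowest
```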
16. On the Asymptotic Normality of the Method of Moments Estimators for the Birnbaum–Saunders Distribution with a New Parametrization
- Author
-
Piyapatr Busababodhin, Tossapol Phoophiwfa, Andrei Volodin, and Sujitta Suraphee
- Subjects
Birnbaum–Saunders distribution ,method of moments estimation ,asymptotic normality ,delta method ,return level ,Mathematics ,QA1-939 - Abstract
This study investigates the asymptotic properties of method-of-moments estimators for the Birnbaum–Saunders distribution under a newly proposed parametrization. Theoretical derivations establish the asymptotic normality of these estimators, supported by explicit expressions for the mean vector and variance–covariance matrix. Simulation studies validate these results across various sample sizes and parameter values. A practical application is demonstrated through modeling cumulative rainfall data from northeastern Thailand, highlighting the distribution’s suitability for extreme weather prediction.
- Published
- 2025
- Full Text
- View/download PDF
17. Reliability characteristics of COVID-19 death rate using generalized progressive hybrid censored data
- Author
-
Irfan, Mohd and Sharma, Anup Kumar
- Published
- 2024
- Full Text
- View/download PDF
18. A crucial note on stress-strength models: Wrong asymptotic variance in some published papers.
- Author
-
Saber, Mohammad Mehdi and Taghipour, Mehrdad
- Subjects
- *
MAXIMUM likelihood statistics - Abstract
The purpose of this note is to point out a problem related to stress-strength models in some published papers. This difficulty has occurred in the asymptotic variance of the maximum likelihood estimator of the stress-strength parameter. The asymptotic variance will be studied in general and then in the mentioned papers. Finally, the correct version of asymptotic variance is computed. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
19. Income growth, forecasting and stock valuation.
- Author
-
Psychoyios, Dimitris
- Subjects
- *
VALUATION of corporations , *FORECASTING , *ENTERPRISE value , *CAPITAL costs , *INCOME accounting - Abstract
We show that using forecasts of a firm's income growth in the context of stock valuation can lead to overpricing of the firm's stock, which is consistent with 'optimism bias' reported among financial analysts. Firms with volatile earnings, high income growth and low systematic risk are prone to larger valuation errors. To address this issue, we develop a new estimator of the firm value that corrects for the bias from forecasting income growth. The new estimator is significantly superior to the traditional value estimator under various performance measures. To support managerial decision-making, we construct closed-form confidence intervals for the firm value and the implied cost of capital that account for uncertainty in income growth. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
20. Sample size determination when the parameter of interest is the coefficient of variation under normality for the data.
- Author
-
Alberto Achcar, Jorge and Barili, Emerson
- Subjects
- *
MARKOV chain Monte Carlo , *ASYMPTOTIC normality , *MAXIMUM likelihood statistics , *GAUSSIAN distribution , *STATISTICAL sampling - Abstract
This study considers classical and Bayesian inference approaches for the coefficient of variation under normality of the data, especially the determination of the sample size needed in the second stage of an experiment. This topic has been explored by many authors over the last decades. The first goal of the study is to present simple formulations for inference on the coefficient of variation under the usual frequentist approach, based on the asymptotic normality of the maximum likelihood estimators of the mean and standard deviation of the normal distribution, with the delta method then yielding the inferences of interest for the coefficient of variation. Simple hypothesis tests and the determination of the sample size are discussed under the frequentist approach. The second goal is to present a sample size determination under a Bayesian approach, assuming a Jeffreys non-informative prior distribution for the parameters of the normal distribution and using standard Markov chain Monte Carlo (MCMC) methods to obtain the posterior summaries of interest. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
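A note on the frequentist half of the entry above: for i.i.d. normal data the delta method gives the standard large-sample approximation Var(ĉv) ≈ cv²(1/2 + cv²)/n for the sample coefficient of variation, which inverts directly into a sample-size rule for a target interval half-width. A minimal sketch under that standard approximation (not the paper's Bayesian/MCMC side):

```python
import math

def cv_se(cv: float, n: int) -> float:
    """Delta-method SE of the sample CV under normality."""
    return math.sqrt(cv**2 * (0.5 + cv**2) / n)

def n_for_halfwidth(cv: float, h: float, z: float = 1.96) -> int:
    """Smallest n with z * SE(cv_hat) <= h."""
    return math.ceil(z**2 * cv**2 * (0.5 + cv**2) / h**2)

print(cv_se(0.3, 50))              # SE of cv_hat when cv = 0.3, n = 50
print(n_for_halfwidth(0.3, 0.05))  # n for a +/- 0.05 interval
```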
21. Estimation for stochastic differential equation mixed models using approximation methods.
- Author
-
Jamba, Nelson T., Jacinto, Gonçalo, Filipe, Patrícia A., and Braumann, Carlos A.
- Subjects
CATTLE weight ,MAXIMUM likelihood statistics ,PARAMETER estimation - Abstract
We used a class of stochastic differential equations (SDE) to model the evolution of cattle weight that, by an appropriate transformation of the weight, resulted in a variant of the Ornstein-Uhlenbeck model. In previous works, we have dealt with estimation, prediction, and optimization issues for this class of models. However, to incorporate individual characteristics of the animals, the average transformed size at maturity parameter α and/or the growth parameter β may vary randomly from animal to animal, which results in SDE mixed models. Obtaining a closed-form expression for the likelihood function to apply the maximum likelihood estimation method is a difficult, sometimes impossible, task. We compared the known Laplace approximation method with the delta method to approximate the integrals involved in the likelihood function. These approaches were adapted to allow the estimation of the parameters even when the requirement of most existing methods, namely having the same age vector of observations for all trajectories, fails, as it did in our real data example. Simulation studies were also performed to assess the performance of these approximation methods. The results show that the approximation methods under study are a very good alternative for the estimation of SDE mixed models. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
22. Hybrid subconvexity bounds for twists of GL(3)L-functions.
- Author
-
Wang, Xin and Zhu, Tengyou
- Subjects
- *
L-functions , *CUSP forms (Mathematics) - Abstract
Let F be a Hecke–Maass cusp form on SL(3, ℤ) and χ a primitive Dirichlet character of prime power conductor 𝔮 = p^k with p prime. In this paper, we will prove the following subconvexity bound: L(1/2 + it, F × χ) ≪_{π,ε} p^{3/4} (𝔮(1 + |t|))^{3/4 − 3/40 + ε}, for any ε > 0 and t ∈ ℝ. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
23. On algebraic twists with composite moduli.
- Author
-
Lin, Yongxiao and Michel, Philippe
- Abstract
We study bounds for algebraic twist sums of automorphic coefficients by trace functions of composite moduli. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
24. Estimation for stochastic differential equation mixed models using approximation methods
- Author
-
Nelson T. Jamba, Gonçalo Jacinto, Patrícia A. Filipe, and Carlos A. Braumann
- Subjects
delta method ,laplace method ,maximum likelihood estimation ,mixed models ,stochastic differential equations ,Mathematics ,QA1-939 - Abstract
We used a class of stochastic differential equations (SDE) to model the evolution of cattle weight that, by an appropriate transformation of the weight, resulted in a variant of the Ornstein-Uhlenbeck model. In previous works, we have dealt with estimation, prediction, and optimization issues for this class of models. However, to incorporate individual characteristics of the animals, the average transformed size at maturity parameter α and/or the growth parameter β may vary randomly from animal to animal, which results in SDE mixed models. Obtaining a closed-form expression for the likelihood function to apply the maximum likelihood estimation method is a difficult, sometimes impossible, task. We compared the known Laplace approximation method with the delta method to approximate the integrals involved in the likelihood function. These approaches were adapted to allow the estimation of the parameters even when the requirement of most existing methods, namely having the same age vector of observations for all trajectories, fails, as it did in our real data example. Simulation studies were also performed to assess the performance of these approximation methods. The results show that the approximation methods under study are a very good alternative for the estimation of SDE mixed models.
- Published
- 2024
- Full Text
- View/download PDF
25. Measuring Event Diffusion Momentum (EDM): Applications in Social Movement Research
- Author
-
Zhang, Tony Huiquan and Cai, Tianji
- Published
- 2023
- Full Text
- View/download PDF
26. GL(2) Weyl bound via a multiplicative character delta method.
- Author
-
Leung, Wing Hong
- Subjects
- *
CUSP forms (Mathematics) , *L-functions - Abstract
We use a trivial delta method with multiplicative characters for congruence detection to prove the Weyl bound for GL(2) in the t-aspect for a holomorphic or Hecke–Maass cusp form of arbitrary level and nebentypus. This parallels the work of Aggarwal [2] in 2018, the difference being that the multiplicative character has a more natural connection to the twisted L-function. This provides another viewpoint to understand and explore the trivial and other delta methods. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
27. Delta method, asymptotic distribution.
- Author
-
Beutner, Eric
- Subjects
- *
ASYMPTOTIC distribution , *STOCHASTIC processes , *DIFFERENTIABLE functions - Abstract
The delta method for deriving asymptotic distributions is presented. Assume interest lies in f(θ₀), where θ₀ is an unknown parameter and f is a known function. The delta method allows one to immediately obtain an approximation of the distribution of the plug-in estimator f(θ̂ₙ) through the asymptotic distribution of aₙ(f(θ̂ₙ) − f(θ₀)), whenever the asymptotic distribution of aₙ(θ̂ₙ − θ₀) is known and f is differentiable at θ₀. This article is categorized under: Data: Types and Structure > Time Series, Stochastic Processes, and Functional Data [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
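The entry above states the result abstractly; in the simplest case, if √n(θ̂ₙ − θ₀) converges to N(0, σ²) and f is differentiable at θ₀, then √n(f(θ̂ₙ) − f(θ₀)) converges to N(0, f′(θ₀)²σ²). A minimal simulation check of this first-order approximation for f = exp and θ̂ₙ a sample mean:

```python
import numpy as np

theta0, sigma, n = 1.0, 2.0, 400
rng = np.random.default_rng(0)

# Draw the sampling distribution of the mean directly: N(theta0, sigma^2/n)
means = rng.normal(theta0, sigma / np.sqrt(n), size=100_000)
f_vals = np.exp(means)

empirical_sd = f_vals.std()
delta_sd = np.exp(theta0) * sigma / np.sqrt(n)  # |f'(theta0)| * sd(theta_hat)
print(empirical_sd, delta_sd)                   # the two should nearly agree
```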
28. Assessment of Various Rainfall Bias Correction Techniques in Peninsular Malaysia
- Author
-
Satianesan, Yashotha, Tan, Wei Lun, Ling, Lloyd, Fournier-Viger, Philippe, Series Editor, Wahi, Nadihah, editor, Mohd Safari, Muhammad Aslam, editor, Hasni, Roslan, editor, Abdul Razak, Fatimah, editor, Gafurjan, Ibragimov, editor, and Fitrianto, Anwar, editor
- Published
- 2023
- Full Text
- View/download PDF
29. Confidence intervals for the cross product ratio under the special case of direct-inverse sampling scheme and its applications.
- Author
-
Nadeem, Hira, Ejaz Ahmed, S., and Volodin, Andrei
- Subjects
- *
CONFIDENCE intervals , *PROBABILITY theory , *CROSSES - Abstract
This article focuses on the estimation of the cross-product ratio ρ = p₁(1 − p₂) / (p₂(1 − p₁)) under the so-called special case of the direct-inverse sampling scheme, where the number of successes in the direct sampling scheme is used in the second sampling scheme, the inverse binomial scheme. Asymptotic confidence intervals are constructed. Our goal is to investigate the cases when the normal approximations for estimators of the cross-product ratio are reliable for the construction of confidence intervals. We use the closeness of the confidence coefficient to the nominal confidence level as our main evaluation criterion, and use the Monte-Carlo method to investigate the key probability characteristics of intervals. We present estimations of the coverage probability and interval width in tables. In the last section, the Cytochrome Psychotropic Genotyping Under Investigation for Decision Support case study is discussed, where the standard and the genetically guided therapies are compared and estimates for the cross-product ratio are presented and interpreted when the participants are enrolled according to the special case of the direct-inverse sampling scheme. [ABSTRACT FROM AUTHOR]
- Published
- 2023
- Full Text
- View/download PDF
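For orientation on the entry above: under two independent binomial samples (a simpler design than the paper's direct-inverse scheme), the delta method yields the classical Woolf variance for the log cross-product ratio, Var(log ρ̂) ≈ 1/x₁ + 1/(n₁ − x₁) + 1/x₂ + 1/(n₂ − x₂). A minimal sketch of that baseline interval with hypothetical counts:

```python
import math

def cpr_ci(x1, n1, x2, n2, z=1.96):
    """Wald CI for rho = p1(1-p2) / (p2(1-p1)) via the log transform."""
    rho = (x1 / n1) * (1 - x2 / n2) / ((x2 / n2) * (1 - x1 / n1))
    se = math.sqrt(1 / x1 + 1 / (n1 - x1) + 1 / x2 + 1 / (n2 - x2))
    return rho, math.exp(math.log(rho) - z * se), math.exp(math.log(rho) + z * se)

print(cpr_ci(30, 100, 18, 100))  # hypothetical success counts in two samples
```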
30. On the Normal Approximations to the Method of Moments Point Estimators of the Parameter and Mean of the Zero-Truncated Poisson Distribution.
- Author
-
Ngamkham, Thuntida and Panta, Chom
- Abstract
In applied statistical research, a common type of dataset used is count data. However, there are cases where zero events are not observed in the dataset. Consequently, the Poisson distribution, a basic discrete probability model, is inappropriate in such situations. Instead, we need to consider the so-called Zero-Truncated Poisson distribution. Unfortunately, deriving the simplest Method of Moments estimators for the parameter and mean of this distribution in closed form is not feasible. Therefore, estimating the Zero-Truncated Poisson parameter and mean becomes a challenging problem. In this article, the authors used the classical delta method to apply an estimation procedure for the zero-truncated Poisson parameter and mean, and investigated their asymptotic normality. Furthermore, we demonstrated the practicality of our approach through an application to a real-life dataset on unrest events in the southern border area of Thailand. [ABSTRACT FROM AUTHOR]
- Published
- 2023
- Full Text
- View/download PDF
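To make the estimation problem in the entry above concrete: the zero-truncated Poisson mean is E[X] = λ/(1 − e^(−λ)), so the moment equation x̄ = g(λ) has no closed-form solution and is solved numerically; since λ̂ = g⁻¹(x̄), the delta method gives SE(λ̂) ≈ SE(x̄)/g′(λ̂). A minimal sketch assuming i.i.d. sampling, with toy data:

```python
import numpy as np
from scipy.optimize import brentq

def ztp_mean(lam):
    """g(lambda) = E[X] for the zero-truncated Poisson."""
    return lam / (1 - np.exp(-lam))

def fit_ztp(x):
    m = np.mean(x)
    lam = brentq(lambda l: ztp_mean(l) - m, 1e-8, 1e3)  # solve g(lam) = xbar
    mu = ztp_mean(lam)
    var_x = mu * (1 + lam - mu)                         # ZTP variance
    g_prime = (1 - np.exp(-lam) - lam * np.exp(-lam)) / (1 - np.exp(-lam))**2
    return lam, np.sqrt(var_x / len(x)) / g_prime       # (lam_hat, delta SE)

x = np.array([1, 1, 2, 1, 3, 2, 1, 4, 2, 1])            # toy truncated counts
print(fit_ztp(x))
```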
31. A History of the Delta Method and Some New Results.
- Author
-
Bera, Anil K. and Koley, Malabika
- Abstract
Use of the delta method in statistics and econometrics is ubiquitous. Its mention can be found in almost all advanced statistics and econometrics textbooks, but mostly without any reference. It appears that nobody knows for certain when the first paper on the topic was published or how the idea was first conceived. A seemingly unrelated method to find the asymptotic variance of a statistic involving one or more nuisance parameters was given by Pierce (Ann. Stat. 10, 475–478, 1982). In the first part of the paper a comprehensive review of the delta method is presented with the objective of unearthing its history. In the second part a comparative analytic study of the delta method with the Pierce method is presented. [ABSTRACT FROM AUTHOR]
- Published
- 2023
- Full Text
- View/download PDF
32. Analysis of academic trajectories of higher education students by means of an absorbing Markov chain.
- Author
-
Batún, José Luis, Cool, Rubén, and Pantí, Henry
- Subjects
MARKOV processes ,MAXIMUM likelihood statistics ,EDUCATION students ,STOCHASTIC processes ,HIGHER education ,ACADEMIC improvement ,MATHEMATICS ,ACADEMIC programs
- Published
- 2023
- Full Text
- View/download PDF
33. Reliability inference for stress-strength model based on inverted exponential Rayleigh distribution under progressive Type-II censored data.
- Author
-
Ma, Jin'ge, Wang, Liang, Tripathi, Yogesh Mani, and Rastogi, Manoj Kumar
- Subjects
- *
RAYLEIGH model , *ASYMPTOTIC distribution , *CENSORING (Statistics) , *ACCELERATED life testing , *RANDOM variables - Abstract
In this paper, a stress-strength model is studied for an inverted exponential Rayleigh distribution (IERD) when the latent failure times are progressively Type-II censored. When both strength and stress random variables follow common IERD scale parameters, the maximum likelihood estimate of stress-strength reliability (SSR) is established and the associated approximate confidence interval is also constructed using the asymptotic distribution theory and delta method. By constructing pivotal quantities, alternative generalized estimates for SSR are also proposed for comparison. Moreover, when there are arbitrary strength and stress parameters, likelihood and generalized pivotal based estimates are also presented. In addition, a testing problem is given for comparing the equality of different strength and stress parameters. Finally, a simulation study and a real data example are provided for illustration. [ABSTRACT FROM AUTHOR]
- Published
- 2023
- Full Text
- View/download PDF
34. DEUE: Delta Ensemble Uncertainty Estimation for a More Robust Estimation of Ejection Fraction
- Author
-
Kazemi Esfeh, Mohammad Mahdi, Gholami, Zahra, Luong, Christina, Tsang, Teresa, Abolmaesumi, Purang, Goos, Gerhard, Founding Editor, Hartmanis, Juris, Founding Editor, Bertino, Elisa, Editorial Board Member, Gao, Wen, Editorial Board Member, Steffen, Bernhard, Editorial Board Member, Yung, Moti, Editorial Board Member, Wang, Linwei, editor, Dou, Qi, editor, Fletcher, P. Thomas, editor, Speidel, Stefanie, editor, and Li, Shuo, editor
- Published
- 2022
- Full Text
- View/download PDF
35. MODERATE DEVIATIONS AND INVARIANCE PRINCIPLES FOR SAMPLE AVERAGE APPROXIMATIONS.
- Author
-
MINGJIE GAO and KA-FAI CEDRIC YIU
- Subjects
- *
LIPSCHITZ continuity , *STOCHASTIC programming , *DEVIATION (Statistics) , *LARGE deviations (Mathematics) - Abstract
We study moderate deviations and convergence rates for the optimal values and optimal solutions of sample average approximations. Firstly, we give an extension of the Delta method in large deviations. Then under Lipschitz continuity on the objective function, we establish a moderate deviation principle for the optimal value by the Delta method. When the objective function is twice continuously differentiable and the optimal solution of true optimization problem is unique, we obtain a moderate deviation principle for the optimal solution and a Cramér-type moderate deviation for the optimal value. Motivated by the Donsker invariance principle, we consider a functional form of stochastic programming problem and establish a Donsker invariance principle, a functional moderate deviation principle, and a Strassen invariance principle for the optimal value. [ABSTRACT FROM AUTHOR]
- Published
- 2023
- Full Text
- View/download PDF
36. The effect of piping stream channels on dissolved oxygen concentration and ecological health.
- Author
-
Ketabchy, Mehdi, Buell, Elyce N., Yazdi, Mohammad Nayeb, Sample, David J., and Behrouz, Mina Shahed
- Subjects
ENVIRONMENTAL health ,ECOSYSTEM health ,BROOK trout ,WATERSHED restoration ,WATER quality ,RIVER channels ,WATERSHEDS - Abstract
Sunlight plays a key role in the nutrient cycle within streams. Streams are often piped to accommodate urban residential or commercial development for buildings, roads, and parking. This results in altered exposure to sunlight, air, and soil, subsequently affecting the growth of aquatic vegetation, reducing reaeration, and thus impairing the water quality and ecological health of streams. While the effects of urbanization on urban streams, including changing flow regimes, stream bank and bed erosion, and degraded water quality, are well understood, the effects of piping streams on dissolved oxygen (DO) concentrations, fish habitats, reaeration, photosynthesis, and respiration rates are not. We addressed this research gap by assessing the effects of stream piping on DO concentrations before and after a 565-m piped section of Stroubles Creek in Blacksburg, VA, for several days during the summer of 2021. Results indicate that the DO level decreased by approximately 18.5% during daylight hours as water flowed through the piped section of the creek. Given the optimum DO level (9.0 mg·L⁻¹) for brook trout (Salvelinus sp.), which are native and present in a portion of Stroubles Creek, the resulting DO deficits were −0.49 and −1.24 mg·L⁻¹ for the inlet and outlet, respectively, indicating a possible adverse impact from piping the stream on trout habitat. Photosynthesis and respiration rates were reduced through the piped section, primarily due to the reduced solar radiation and the resultant reduction in oxygen production from aquatic vegetation; however, the reaeration rate increased. This study can inform watershed restoration efforts, particularly decisions regarding stream daylighting with respect to potential water quality and aquatic habitat benefits. [ABSTRACT FROM AUTHOR]
- Published
- 2023
- Full Text
- View/download PDF
37. Semi-parametric model approach to causal mediation analysis for longitudinal data.
- Author
-
Li, Youjun and Albert, Jeffrey M.
- Subjects
- *
TYPE 2 diabetes , *STRUCTURAL equation modeling , *MEDICAL personnel , *LONGITUDINAL method , *PARAMETRIC modeling - Abstract
There has been a lack of causal mediation analysis implementation on complicated longitudinal data. Most existing work focuses on extensions of parametric models that have been well developed for causal mediation analysis. To better handle more complex data patterns, our approach takes advantage of the flexibility of penalized splines and performs the causal mediation analysis under the structural equation model framework. We also provide the formula for identifying the natural direct and indirect effects based on our semi-parametric models, whose inference is carried out by the delta method and Monte Carlo approximation. Our approach is first evaluated by conducting simulation studies, where the two methods for inference are compared. Finally, we apply the method to data from a longitudinal cohort study to examine the effect of a training programme for healthcare providers on improving their patients' type 2 diabetes condition. [ABSTRACT FROM AUTHOR]
- Published
- 2025
- Full Text
- View/download PDF
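As a simplified parametric analogue of the inference described in the entry above: in a linear structural equation model the natural indirect effect reduces to the product ab of the exposure-to-mediator and mediator-to-outcome coefficients; the delta method then gives the Sobel standard error sqrt(b²sₐ² + a²s_b²), while the Monte Carlo alternative resamples (a, b) from their estimated sampling distributions. A minimal sketch of both with placeholder estimates (the paper's penalized-spline models are not reproduced):

```python
import numpy as np

a, se_a = 0.40, 0.10  # exposure -> mediator coefficient (placeholder)
b, se_b = 0.30, 0.08  # mediator -> outcome coefficient (placeholder)

# Delta method (Sobel): SE(ab) = sqrt(b^2 se_a^2 + a^2 se_b^2)
se_sobel = np.sqrt(b**2 * se_a**2 + a**2 * se_b**2)

# Monte Carlo approximation of the sampling distribution of ab
rng = np.random.default_rng(0)
draws = rng.normal(a, se_a, 100_000) * rng.normal(b, se_b, 100_000)
print(a * b, se_sobel, np.percentile(draws, [2.5, 97.5]))
```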
38. An equivalence test between features lists, based on the Sorensen–Dice index and the joint frequencies of GO term enrichment
- Author
-
Pablo Flores, Miquel Salicrú, Alex Sánchez-Pla, and Jordi Ocaña
- Subjects
Delta method ,Bootstrap ,Simulation ,Type I error ,Irrelevance of dissimilarity ,Gene lists ,Computer applications to medicine. Medical informatics ,R858-859.7 ,Biology (General) ,QH301-705.5 - Abstract
Abstract Background: In integrative bioinformatic analyses, it is of great interest to establish the equivalence between gene or (more generally) feature lists, up to a given level and in terms of their annotations in the Gene Ontology. The aim of this article is to present an equivalence test based on the proportion of GO terms which are declared as enriched in both lists simultaneously. Results: On the basis of these data, the dissimilarity between gene lists is measured by means of the Sorensen–Dice index. We present two flavours of the same test: one based on the asymptotic normality of the test statistic and the other based on the bootstrap method. Conclusions: The accuracy of these tests is studied by means of simulation, and their possible interest is illustrated by using them on two real datasets: a collection of gene lists related to cancer and a collection of gene lists related to kidney rejection after transplantation.
- Published
- 2022
- Full Text
- View/download PDF
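To make the statistic in the entry above concrete: with n₁₁ GO terms enriched in both lists, n₁₀ only in the first and n₀₁ only in the second, the Sorensen–Dice dissimilarity is d = 1 − 2n₁₁/(2n₁₁ + n₁₀ + n₀₁), and equivalence at an irrelevance limit d₀ is declared when the upper confidence limit for d falls below d₀. A minimal bootstrap-flavoured sketch with hypothetical frequencies (the paper's asymptotic-normal flavour and the GO dependence structure are not reproduced):

```python
import numpy as np

def dice_dissim(n11, n10, n01):
    return 1 - 2 * n11 / (2 * n11 + n10 + n01)

n11, n10, n01 = 120, 35, 50   # hypothetical joint enrichment frequencies
n = n11 + n10 + n01

# Multinomial bootstrap over the three joint-frequency cells
rng = np.random.default_rng(0)
boot = rng.multinomial(n, [n11 / n, n10 / n, n01 / n], size=20_000)
d_boot = np.array([dice_dissim(*row) for row in boot])

d0 = 0.35                           # irrelevance (equivalence) limit
upper = np.percentile(d_boot, 95)   # one-sided 95% upper limit
print(dice_dissim(n11, n10, n01), upper, upper < d0)
```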
39. Modeling climate extremes using the four-parameter kappa distribution for r-largest order statistics
- Author
-
Yire Shin and Jeong-Soo Park
- Subjects
Delta method ,Generalized Gumbel distribution ,Heavy rainfall ,Intensity and frequency of climate extremes ,Penalized likelihood estimation ,Profile likelihood ,Meteorology. Climatology ,QC851-999 - Abstract
Accurate estimation of the T-year return levels of climate extremes using statistical distribution is a critical step in the projection of future climate and in engineering design for disaster response. We show how the estimation of such quantities can be improved by fitting the four-parameter kappa distribution for r-largest order statistics (rK4D), which was developed in this study. The rK4D is an extension of the generalized extreme value distribution for r-largest order statistics (rGEVD), similar to the four-parameter kappa distribution (K4D), which is an extension of the generalized extreme value distribution (GEVD). This new distribution (rK4D) can be useful not only for fitting data when three parameters in the GEVD are not sufficient to capture the variability of the extreme observations, but also in reducing the estimation uncertainty by making use of the r-largest extreme observations instead of only the block maxima. We derive a joint probability density function (PDF) of rK4D and the marginal and conditional cumulative distribution functions and PDFs. To estimate the parameters, the maximum likelihood estimation and the maximum penalized likelihood estimation methods were considered. The usefulness and practical effectiveness of the rK4D are illustrated by the Monte Carlo simulation and by an application to the Bangkok extreme rainfall data. A few new distributions for r-largest order statistics are also derived as special cases of the rK4D, such as the r-largest logistic, the r-largest generalized logistic, and the r-largest generalized Gumbel distributions. These distributions for r-largest order statistics would be useful in modeling extreme values for many research areas, including hydrology and climatology.
- Published
- 2023
- Full Text
- View/download PDF
40. Testing for dummy-variable effects in semi-logarithmic regressions.
- Author
-
Blackburn, McKinley L.
- Subjects
DUMMY variables ,NONLINEAR functions - Abstract
Applied economists often use a non-linear function to estimate percentage-change effects of dummy variables in semi-logarithmic models. Delta-method-based inference on these marginal effects is questionable, especially as the dummy variable can be arbitrarily defined to increase the suggestion of evidence of an impact. Inference should instead be based on the untransformed coefficient. [ABSTRACT FROM AUTHOR]
- Published
- 2023
- Full Text
- View/download PDF
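To spell out the transformation the entry above discusses: the percentage effect of a dummy in a semi-log regression is commonly reported as g = 100(e^β − 1), with delta-method standard error 100·e^β·SE(β̂); the note's argument is that hypothesis tests should be run on the untransformed β̂ rather than on g. A minimal sketch showing that the two t-ratios differ:

```python
import math

beta, se_beta = 0.25, 0.10             # dummy coefficient and SE (placeholders)

g = 100 * (math.exp(beta) - 1)         # percentage-change effect
se_g = 100 * math.exp(beta) * se_beta  # delta method: dg/dbeta = 100 * exp(beta)

t_beta = beta / se_beta                # test on the raw coefficient
t_g = g / se_g                         # test on g; generally not equal to t_beta
print(g, se_g, t_beta, t_g)
```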
41. Prediction and confidence intervals of willingness-to-pay for mixed logit models.
- Author
-
Scaccia, Luisa, Marcucci, Edoardo, and Gatta, Valerio
- Subjects
- *
LOGISTIC regression analysis , *CONFIDENCE intervals , *WILLINGNESS to pay , *SAMPLING errors , *UTILITY functions , *FORECASTING - Abstract
Heterogeneity in agents' preferences is generally analysed through mixed logit models, which assume taste parameters are distributed in the population according to a certain mixing distribution. As a result, if the utility function is linear in attributes, the willingness to pay is the ratio of two random parameters and is itself random. This paper proposes a technique built on the Delta method, partly analytical and partly based on simulations, to obtain the sampling distribution of the willingness to pay, accounting for both heterogeneity and sampling error. The paper contributes to the literature by: (i) redressing some imprecisions in Bliemer and Rose (2013) that produce biased results; (ii) proposing a faster estimation process, compared to the Krinsky and Robb (1986, 1990) method that, relying on simulation only, proves computationally more demanding; (iii) comparing the performance of different methods using both synthetic and real data sets. The paper shows, via a Monte Carlo study, that the method we develop and the Krinsky and Robb one produce similar results, while outperforming that proposed by Bliemer and Rose. • Uncertainty in estimating Willingness To Pay under Mixed Logit Model arises from both heterogeneity and sampling error. • Delta method can be used, in combination with normal mixtures, to estimate both sources of uncertainty. • Both confidence and prediction intervals can be computed for Willingness To Pay by means of Delta method. • Delta method, being partly analytical, proves time-saving when compared to Krinsky and Robb method. [ABSTRACT FROM AUTHOR]
- Published
- 2023
- Full Text
- View/download PDF
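For the ratio at the heart of the entry above: with utility linear in attributes, the willingness to pay is (minus) the attribute coefficient over the cost coefficient, and for fixed coefficients the delta method gives Var(a/c) ≈ (a/c)²(Vₐ/a² + V_c/c² − 2Cov(a, c)/(ac)). A minimal sketch of that fixed-coefficient case, which deliberately ignores the random-taste component the paper adds on top (all numbers are placeholders):

```python
import math

a, c = 1.2, -0.4                       # attribute and cost coefficients
var_a, var_c, cov_ac = 0.04, 0.01, -0.005

wtp = -a / c                           # willingness to pay
var_wtp = wtp**2 * (var_a / a**2 + var_c / c**2 - 2 * cov_ac / (a * c))
print(wtp, math.sqrt(var_wtp))         # point estimate and delta-method SE
```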
42. Testing Differential Item Functioning Without Predefined Anchor Items Using Robust Regression.
- Author
-
Wang, Weimeng, Liu, Yang, and Liu, Hongyun
- Subjects
FALSE positive error ,ITEM response theory ,LIKELIHOOD ratio tests ,ERROR rates ,INFERENTIAL statistics ,T-test (Statistics) - Abstract
Differential item functioning (DIF) occurs when the probability of endorsing an item differs across groups for individuals with the same latent trait level. The presence of DIF items may jeopardize the validity of an instrument; therefore, it is crucial to identify DIF items in routine operations of educational assessment. While DIF detection procedures based on item response theory (IRT) have been widely used, a majority of IRT-based DIF tests assume predefined anchor (i.e., DIF-free) items. Not only is this assumption strong, but violations to it may also lead to erroneous inferences, for example, an inflated Type I error rate. We propose a general framework to define the effect sizes of DIF without a priori knowledge of anchor items. In particular, we quantify DIF by item-specific residuals from a regression model fitted to the true item parameters in respective groups. Moreover, the null distribution of the proposed test statistic using a robust estimator can be derived analytically or approximated numerically even when there is a mix of DIF and non-DIF items, which yields asymptotically justified statistical inference. The Type I error rate and the power performance of the proposed procedure are evaluated and compared with the conventional likelihood-ratio DIF tests in a Monte Carlo experiment. Our simulation study has shown promising results in controlling the Type I error rate and the power of detecting DIF items. Even when there is a mix of DIF and non-DIF items, the true and false alarm rates can be well controlled when a robust regression estimator is used. [ABSTRACT FROM AUTHOR]
- Published
- 2022
- Full Text
- View/download PDF
43. Asymptotic Characteristics of the Non-Iterative Estimates of the Linear-by-Linear Association Parameter for Ordinal Log-Linear Models.
- Author
-
Zafar, Sidra, Cheema, Salman A., Beh, Eric J., and Hudson, Irene L.
- Subjects
LOG-linear models ,CONTINGENCY tables ,ORTHOGONAL polynomials - Abstract
Over the past decade, a series of procedures has been introduced to estimate, using a non-iterative method, the linear-by-linear association parameter of an ordinal log-linear model. This paper will examine the two key non-iteratively determined estimates of the parameter for the analysis of the association between the two categorical variables that form a contingency table; these are the log and the Beh-Davy non-iterative estimates, referred to simply as the LogNI and the BDNI estimates, respectively. Such an examination will focus on determining their asymptotic characteristics. To do so, a computational study was undertaken for tables of varying sizes to show that these two estimates are asymptotically unbiased. It is also shown that both estimates are asymptotically normally distributed. On the basis of the standard errors, their relative efficiency was established for the 13 commonly analysed contingency tables that appear throughout the literature. [ABSTRACT FROM AUTHOR]
- Published
- 2022
- Full Text
- View/download PDF
44. Data Sources and Methodology
- Author
-
Duulatov, Eldiiar, Chen, Xi, Issanova, Gulnura, Orozbaev, Rustam, Mukanov, Yerbolat, and Amanambu, Amobichukwu C.
- Published
- 2021
- Full Text
- View/download PDF
45. Range-Preserving Confidence Intervals and Significance Tests for Scalability Coefficients in Mokken Scale Analysis
- Author
-
Koopman, Letty, Zijlstra, Bonne J. H., van der Ark, L. Andries, Wiberg, Marie, editor, Molenaar, Dylan, editor, González, Jorge, editor, Böckenholt, Ulf, editor, and Kim, Jee-Seon, editor
- Published
- 2021
- Full Text
- View/download PDF
46. An Illustration on the Quantile-Based Calculation of the Standard Error of Equating in Kernel Equating
- Author
-
González, Jorge, Wallin, Gabriel, Wiberg, Marie, editor, Molenaar, Dylan, editor, González, Jorge, editor, Böckenholt, Ulf, editor, and Kim, Jee-Seon, editor
- Published
- 2021
- Full Text
- View/download PDF
47. Statistical Inference for Competing Risks Model with Adaptive Progressively Type-II Censored Gompertz Life Data Using Industrial and Medical Applications.
- Author
-
Almuqrin, Muqrin A., Salah, Mukhtar M., and A. Ahmed, Essam
- Subjects
- *
INFERENTIAL statistics , *MARKOV chain Monte Carlo , *COMPETING risks , *DISTRIBUTION (Probability theory) , *SPECTRAL element method , *MAXIMUM likelihood statistics , *ACCELERATED life testing , *CENSORING (Statistics) - Abstract
This study uses the adaptive Type-II progressively censored competing risks model to estimate the unknown parameters and the survival function of the Gompertz distribution. The lifetime for each failure cause is considered independent, and each follows a Gompertz distribution with a different shape parameter. First, the Newton-Raphson method is used to derive the maximum likelihood estimators (MLEs), and the existence and uniqueness of the estimators are also demonstrated. We used the stochastic expectation maximization (SEM) method to construct MLEs for unknown parameters, which simplified and facilitated computation. Based on the asymptotic normality of the MLE and SEM estimates, we create the corresponding confidence intervals for unknown parameters, and the delta approach is utilized to obtain the interval estimation of the reliability function. Additionally, using two bootstrap techniques, approximate interval estimators for all unknowns are created. Furthermore, we computed the Bayes estimates of unknown parameters as well as the survival function using the Markov chain Monte Carlo (MCMC) method in the presence of square error and LINEX loss functions. Finally, we look into two real data sets and conduct a simulation study to evaluate the efficacy of the established approaches. [ABSTRACT FROM AUTHOR]
- Published
- 2022
- Full Text
- View/download PDF
48. The variances of non-parametric estimates of the cross-sectional distribution of durations.
- Author
-
Tian, Maoshan and Dixon, Huw
- Subjects
- *
MONTE Carlo method , *SURVIVAL analysis (Biometry) , *KAPLAN-Meier estimator , *AGE distribution , *PRICES - Abstract
This paper focuses on the link between non-parametric survival analysis and three distributions. The delta method is applied to derive the variances of the non-parametric estimators of three distributions: the distribution of durations (DD), the cross-sectional distribution of ages (CSA), and the cross-sectional distribution of (completed) durations (CSD). The non-parametric estimator of the CSD has been defined and derived by Dixon (2012) and used in the generalized Taylor price model (GTE) by Dixon and Le Bihan (2012). The Monte Carlo method is applied to evaluate the variances of the estimators of the DD and CSD and how their performance varies with sample size and the censoring of data. We apply those estimators to two data sets: the UK CPI micro-price data and waiting-time data from UK hospitals. Both the estimates of the distributions and their variances are calculated. Based on the empirical results, the estimated variances indicate that the DD and CSD estimators are all significant. [ABSTRACT FROM AUTHOR]
- Published
- 2022
- Full Text
- View/download PDF
49. Directional Measure for Analyzing the Degree of Deviance from Generalized Marginal Mean Equality Model in Square Contingency Tables.
- Author
-
Ando, Shuji
- Abstract
When the concerned model does not fit the data, we may be interested in measuring the degree of deviance from the concerned model. This study proposes a measure for simultaneously analyzing the degree and direction of deviance from the generalized marginal mean equality model based on the ordered scores for each category. A previous study proposed a measure for analyzing both the degree and direction of deviance from the marginal mean equality model based only on equally spaced scores. When it is appropriate to assign ordered scores to categories, we are interested in analyzing whether the row marginal mean based on the known ordered scores is equal to the column marginal mean. It is necessary to analyze both the degree and direction of deviance from the generalized marginal mean equality model because there are two kinds of direction. We derive a confidence interval for the proposed measure using the delta method. The proposed measure is also helpful for comparing degrees of deviance from the generalized marginal mean equality model for several datasets. We show the utility of the proposed measure by applying it to real data. [ABSTRACT FROM AUTHOR]
- Published
- 2022
- Full Text
- View/download PDF
50. Strategies for assessing the impact of loss to follow-up on estimates of neurodevelopmental impairment in a very preterm cohort at 2 years of age
- Author
-
Aurélie Piedvache, Stef van Buuren, Henrique Barros, Ana Isabel Ribeiro, Elizabeth Draper, Jennifer Zeitlin, and the EPICE Research group
- Subjects
Loss to follow-up ,Preterm births ,Neurodevelopment ,Multiple imputation ,Inverse probability weighting ,Delta method ,Medicine (General) ,R5-920 - Abstract
Abstract Background: Loss to follow-up is a major challenge for very preterm (VPT) cohorts; attrition is associated with social disadvantage, and parents with impaired children may participate less in research. We investigated the impact of loss to follow-up on the estimated prevalence of neurodevelopmental impairment in a VPT cohort using different methodological approaches. Methods: This study includes births
- Published
- 2021
- Full Text
- View/download PDF