1,516 results for "hazard function"
Search Results
2. The Lambert-G Family: Properties, Inference, and Applications.
- Author
- Al Abbasi, Jamal N., Afify, Ahmed Z., Alnssyan, Badr, and Shama, Mustafa S.
- Subjects
- HAZARD function (Statistics), FAMILIES, PARAMETER estimation
- Abstract
This study proposes a new flexible family of distributions called the Lambert-G family. The Lambert-G family is very flexible and exhibits desirable properties; its three-parameter special sub-models provide all significant monotonic and non-monotonic failure rates. A special sub-model of the family, called the Lambert-Lomax (LL) distribution, is investigated. General expressions for the LL statistical properties are established, and characterizations of the LL distribution based on its hazard function are addressed mathematically. The estimation of the LL parameters is discussed using six estimation methods, whose performance is explored through simulation experiments. The usefulness and flexibility of the LL distribution are demonstrated empirically using two real-life data sets, where the LL model provides a better fit than the exponentiated Lomax, inverse power Lomax, Lomax-Rayleigh, power Lomax, and Lomax distributions. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
3. Survival Analysis and Applications Using SAS and SPSS
- Author
- Chowdhury, Rafiqul, Huda, Shahariar, and Mitra, Amal K., editor
- Published
- 2024
- Full Text
- View/download PDF
4. Topp-Leone Exponentiated Pareto Distribution: Properties and Application to Covid-19 Data
- Author
- Fabio M. Correa, Braimah J. Odunayo, Ibrahim Sule, and Olalekan A. Bello
- Subjects
- Hazard function, Order statistics, Survival function, Probabilities. Mathematical statistics, QA273-280
- Abstract
This paper proposes a new Topp-Leone Exponentiated Pareto (TLEtP) distribution. The new family is derived by extending the Topp-Leone-G family of distributions with additional positive shape parameters. The corresponding density and distribution functions are derived and shown. Mathematical properties derived for the distribution include the quantile function, ordinary and incomplete moments, the moment generating function (mgf), hazard function, survival function, odds function, probability weighted moments, and the distribution of order statistics. The parameters of the distribution are estimated using the maximum likelihood method. The proposed distribution's validity is demonstrated by fitting two real data sets and comparing the results with two existing same-family distributions, the Topp-Leone Pareto type I (TLPI) and Pareto (P) distributions, using the Akaike Information Criterion (AIC) and Bayesian Information Criterion (BIC). The comparison demonstrates that the TLEtP distribution offers a better fit for both data sets than the other two distributions.
- Published
- 2024
- Full Text
- View/download PDF
5. Improved estimation of hazard function when failure information is missing not at random.
- Author
- Chen, Feifei, Zhang, Wangxing, Sun, Zhihua, and Guo, Yuanyuan
- Subjects
- HAZARD function (Statistics), INFERENTIAL statistics, SURVIVAL analysis (Biometry)
- Abstract
The hazard function plays a crucial role in survival analysis, and its estimation has garnered considerable attention when the survival time is subject to right-censoring. Most existing works focus on cases in which the failure information is complete or missing at random (MAR). When the censoring information is missing not at random (MNAR), statistical inference on the hazard function is very challenging. In this study, estimation of the hazard function is addressed under an MNAR mechanism for the failure information. Three estimators are proposed by employing techniques of bias correction and weighting-probability adjustment, and they are shown to be consistent and asymptotically normal. Simulation studies and two real-data analyses are performed to assess the proposed methods. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
6. THE LENGTH-BIASED WEIGHTED WILSON HILFERTY DISTRIBUTION AND ITS APPLICATIONS.
- Author
- Singh, Shivendra Pratap, Kumar, Surinder, and Kabdwal, Naresh Chandra
- Subjects
- LORENZ curve, HAZARD function (Statistics), MAXIMUM likelihood statistics
- Abstract
In this article, we propose a new length-biased weighted form of the Wilson-Hilferty distribution, named the Length-Biased Weighted Wilson-Hilferty distribution. Various statistical properties of the proposed distribution, such as the reliability function, hazard rate function, reverse hazard rate function, moment generating function, quantile function, and coefficient of variation, are considered to understand its nature. Furthermore, we use the method of maximum likelihood to estimate the parameters of the proposed distribution, and we obtain the Shannon entropy, stochastic ordering, and the Lorenz and Bonferroni curves. The performance of the proposed distribution is compared with competing distributions using two real data sets. [ABSTRACT FROM AUTHOR]
- Published
- 2024
7. Topp-Leone Exponentiated Pareto Distribution: Properties and Application to Covid-19 Data.
- Author
- Correa, Fabio M., Odunayo, Braimah J., Sule, Ibrahim, and Bello, Olalekan A.
- Subjects
- PARETO distribution, COVID-19 pandemic, DISTRIBUTION (Probability theory), HAZARD function (Statistics), ORDER statistics
- Abstract
This paper proposes a new Topp-Leone Exponentiated Pareto (TLEtP) distribution. The new family is derived by extending the Topp-Leone-G family of distributions with additional positive shape parameters. The corresponding density and distribution functions are derived and shown. Mathematical properties derived for the distribution include the quantile function, ordinary and incomplete moments, the moment generating function (mgf), hazard function, survival function, odds function, probability weighted moments, and the distribution of order statistics. The parameters of the distribution are estimated using the maximum likelihood method. The proposed distribution's validity is demonstrated by fitting two real data sets and comparing the results with two existing same-family distributions, the Topp-Leone Pareto type I (TLPI) and Pareto (P) distributions, using the Akaike Information Criterion (AIC) and Bayesian Information Criterion (BIC). The comparison demonstrates that the TLEtP distribution offers a better fit for both data sets than the other two distributions. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
8. Asymptotic properties of asymmetric kernel estimators for non-negative and censored data.
- Author
- Ghettab, Sarah and Guessoum, Zohra
- Subjects
- PROBABILITY density function, DISTRIBUTION (Probability theory), ASYMPTOTIC normality, HAZARD function (Statistics), RANDOM variables, SEQUENCE spaces, CENSORING (Statistics)
- Abstract
Let {X_i, i ≥ 1} be a sequence of independent and identically distributed random variables with distribution function F and probability density function f. We propose a new type of kernel estimator for the density and hazard functions that performs well at the boundary when the variable of interest is positive and right-censored. The estimators are constructed using asymmetric kernels with expectation 1. We establish uniform strong consistency rates and study the asymptotic properties and normality of the resulting estimators. A large simulation study is conducted to support the theoretical results, and an application to real data is presented. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
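The entry above concerns asymmetric-kernel estimators for non-negative data. As a minimal sketch of the general idea (not the authors' exact estimator, which additionally handles right-censoring), a Chen-type gamma-kernel density estimate for uncensored positive data can be written as follows; the bandwidth `b` and the shape/scale choice are illustrative assumptions:

```python
import numpy as np
from scipy.stats import gamma

def gamma_kernel_density(x_grid, data, b=0.1):
    """Chen-type gamma-kernel density estimate for non-negative data.
    At each evaluation point x, the sample is weighted by a gamma density
    with shape x/b + 1 and scale b, so all kernel mass stays on [0, inf)
    and the usual boundary bias near zero is avoided."""
    est = np.empty_like(x_grid, dtype=float)
    for i, x in enumerate(x_grid):
        est[i] = gamma.pdf(data, a=x / b + 1.0, scale=b).mean()
    return est

# demo on simulated positive data (for illustration only)
rng = np.random.default_rng(0)
sample = rng.exponential(scale=1.0, size=2000)
grid = np.linspace(0.01, 3.0, 50)
f_hat = gamma_kernel_density(grid, sample)
```

Unlike a symmetric kernel, the gamma kernel never places mass below zero, which is the boundary property the abstract emphasizes.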
9. Hazard function analysis of prognosis after recurrent colorectal cancer.
- Author
- Ise, Ichiro, Kawai, Kazushige, Nakano, Daisuke, Takao, Misato, Natsume, Soichiro, Kato, Hiroki, Nakamori, Sakiko, Dejima, Akira, and Yamaguchi, Tatsuro
- Subjects
- HAZARD function (Statistics), SURVIVAL rate, COLORECTAL cancer, POSTMORTEM changes, PROGNOSIS
- Abstract
Background and objectives: Mean survival time (MST) is used as an indicator of prognosis in patients with a colorectal cancer (CRC) recurrence. The present study aimed to visualize changes in death risk after a CRC recurrence using hazard function analysis (HFA) to provide an alternative prognostic indicator to MST. Methods: The medical records of 725 consecutive patients with a recurrence following R0 radical surgery for CRC were retrospectively reviewed. Results: The five-year post-recurrence survival rate was 37.8%, and the MST was 3.5 years, while the risk of death peaked at 2.9 years post-recurrence. Seven variables were found to predict short-term survival, including the number of metastatic organs ≥ 2, non-surgical treatment for the recurrence, and a short interval before recurrence. In patients with a recurrence in one organ, the MST was 4 years, the peak time of death predicted by HFA was 2.9 years, and the five-year survival rate was 45.8%. In patients who underwent surgical resection of the recurrence, the MST was 8 years, the peak time of death was 3.3 years, and the five-year survival rate was 62%. Conclusions: The present study established a novel method of assessing changes in mortality risk over time using HFA in patients with a CRC recurrence. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
10. Unit Maxwell-Boltzmann Distribution and Its Application to Concentrations Pollutant Data.
- Author
- Biçer, Cenker, Bakouch, Hassan S., Biçer, Hayrinisa Demirci, Alomair, Gadir, Hussain, Tassaddaq, and Almohisen, Amal
- Subjects
- DISTRIBUTION (Probability theory), AIR pollutants, POLLUTANTS, LEAST squares, MOMENTS method (Statistics)
- Abstract
The statistical literature contains numerous probability distribution models for data from real-world phenomena; nevertheless, new probability models are still required to represent data with various spread behaviors, and there is a particular need for new models with bounded support. In this study, a flexible probability model called the unit Maxwell-Boltzmann distribution, which can model data values in the unit interval, is derived by selecting the Maxwell-Boltzmann distribution as a baseline model. The important statistical and mathematical characteristics of the derived distribution are investigated in detail. Furthermore, the inference problem for the distribution is addressed from the perspectives of maximum likelihood, the method of moments, least squares, and maximum product of spacings, and different estimators are obtained for the unknown parameter of the distribution. In applications to four actual air pollutant concentration data sets, the derived distribution outperforms competing models according to different fit tests and information criteria, indicating that it is an effective model for air pollutant concentration data. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
11. Characterizations of the Recently Introduced Discrete Distributions.
- Author
- Hamedani, G. G. and Roshani, Amin
- Subjects
- HAZARD function (Statistics), CONDITIONAL expectations, DISTRIBUTION (Probability theory), RANDOM variables
- Abstract
Certain characterizations of 26 recently introduced discrete distributions are presented in three directions: (i) based on an appropriate function of the random variable; (ii) in terms of the reverse hazard function and (iii) in terms of the hazard function. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
12. A Partial Maximum Likelihood Method to Estimate Cox Model for Competing Risk Data with Application.
- Author
- Hussein, Ibrahim Khalil and Fadam, Entsar Arebe
- Subjects
- MAXIMUM likelihood statistics, HAZARD mitigation, PROPORTIONAL hazards models, COMPETING risks, NEWTON-Raphson method, HAZARD function (Statistics)
- Abstract
Copyright of Journal of Economics & Administrative Sciences is the property of Republic of Iraq Ministry of Higher Education & Scientific Research (MOHESR) and its content may not be copied or emailed to multiple sites or posted to a listserv without the copyright holder's express written permission. However, users may print, download, or email articles for individual use. This abstract may be abridged. No warranty is given about the accuracy of the copy. Users should refer to the original published version of the material for the full abstract. (Copyright applies to all Abstracts.)
- Published
- 2024
- Full Text
- View/download PDF
13. Weibull Statistic and Artificial Neural Network Analysis of the Mechanical Performances of Fibers from the Flower Agave Plant for Eco-Friendly Green Composites
- Author
- Imen Lalaymia, Ahmed Belaadi, Messaouda Boumaaza, Hassan Alshahrani, Mohammad K. A. Khan, and Amar Dib
- Subjects
- Agave americana, tensile behavior, statistical methods, Weibull statistic, hazard function, 美洲龙舌兰 (Agave americana), Science, Textile bleaching, dyeing, printing, etc., TP890-933
- Abstract
The research focused on examining the unique properties of Agave Americana Flower Stem fiber (AAFS), particularly its behavior under quasi-static tensile conditions. A total of 200 AAFS fibers were subjected to tensile tests using a standard gauge length of 40 mm. Tests spanned seven groups with quantities (N) ranging from 30 to 200. The study aimed to characterize the fibers' mechanical traits, such as tensile resistance and modulus of elasticity, and to see how different test quantities influence these properties. A significant observation was the dispersion of the tensile characteristics of AAFS fibers, a common trait of natural fibers. To understand this, rigorous statistical tools were applied, including the Weibull distribution at a 95% confidence interval and one-way ANOVA. A mathematical model based on an artificial neural network (ANN) was produced using the experimental tensile data. The ANN provided correlation coefficients (R2) of 0.9897, 0.9971, 0.9993, and 0.9939 for the training, validation, testing, and complete datasets, respectively, and accurately predicted the experimental data. The proposed model would be of tremendous assistance to engineers and designers in obtaining more durable green composite materials based on natural fibers. These methods illuminated the patterns in the results, enriching our understanding of AAFS fiber mechanics.
- Published
- 2024
- Full Text
- View/download PDF
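As context for the Weibull analysis mentioned in the entry above, fiber tensile strengths are commonly fitted to a two-parameter Weibull distribution by maximum likelihood. The sketch below uses simulated strengths with assumed shape and scale values, not the paper's measured data:

```python
import numpy as np
from scipy.stats import weibull_min

# hypothetical tensile strengths (MPa) standing in for measured fiber data;
# shape 3.0 and scale 450 MPa are illustrative assumptions
strengths = weibull_min.rvs(3.0, scale=450.0, size=200,
                            random_state=np.random.default_rng(1))

# maximum-likelihood fit of the two-parameter Weibull (location fixed at 0)
shape, loc, scale = weibull_min.fit(strengths, floc=0)
```

The fitted shape parameter quantifies the dispersion of strengths (lower shape means more scatter), which is why Weibull statistics are standard for natural fibers.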
14. Suitable Patient Selection and Optimal Timing of Treatment for Persistent Air Leak after Lung Resection.
- Author
- Yamauchi, Yoshikane, Adachi, Hiroyuki, Takahashi, Nobumasa, Morohoshi, Takao, Yamamoto, Taketsugu, Endo, Makoto, Ueno, Tsuyoshi, Woo, Tekkan, Saito, Yuichi, and Sawabata, Noriyoshi
- Subjects
- PATIENT selection, CUMULATIVE distribution function, FORCED expiratory volume, FIBRIN tissue adhesive, BODY mass index, PLEURODESIS
- Abstract
Objectives: The choice of therapeutic intervention for postoperative air leak varies between institutions. We aimed to identify the optimal timing and patient criteria for therapeutic intervention in cases of postoperative air leak after lung resection. Methods: This study utilized data from a prospective multicenter observational study conducted in 2019. Among the 2187 cases in the database, 420 cases with air leaks on postoperative day 1 were identified. The intervention group underwent therapeutic interventions, such as pleurodesis or surgery, while the observation group was monitored without intervention. The two groups were compared using the cumulative distribution and hazard functions. Results: Forty-six patients (11.0%) were included in the intervention group. Multivariate analysis revealed that low body mass index (p = 0.019), partial resection (p = 0.010), intraoperative use of fibrin glue (p = 0.008), severe air leak on postoperative day 1 (p < 0.001), and high forced expiratory volume in 1 s (p = 0.021) were significant predictors of the requirement for intervention. The proportion of patients with persistent air leak in the observation group was 20% on postoperative day 5 and 94% on postoperative day 7. The hazard of air leak cessation peaked from postoperative day 3 to postoperative day 7. Conclusions: This research contributes valuable insights into predicting therapeutic interventions for postoperative air leaks and identifies scenarios in which spontaneous cessation is probable. Validation through prospective studies is warranted to affirm these findings. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
15. A New Two-Parameters Lindley-Frailty Model: Censored and Uncensored Schemes under Different Baseline Models: Applications, Assessments, Censored and Uncensored Validation Testing.
- Author
- Teghri, Samia, Goua, Hafida, Loubna, Hamami, Butt, Nadeem S., Khedr, Abdelrahman M., Yousof, Haitham M., Ibrahim, Mohamed, and Salem, Moustafa
- Subjects
- FAILURE time data analysis, MAXIMUM likelihood statistics, PARETO distribution, GOODNESS-of-fit tests, HAZARD function (Statistics), CENSORSHIP
- Abstract
Classical survival models assume homogeneity among the population of individuals who are susceptible to the event of interest. However, in many practical circumstances, there is a certain amount of unobserved heterogeneity that can be caused by a variety of sources, such as environmental or genetic factors. If the heterogeneity is ignored, many issues could arise, including an overestimation of the hazard rate and inaccurate estimates of the regression coefficients. Frailty models are usually used to model the heterogeneity among individuals. In this paper, we propose a novel univariate frailty model. The frailty variable is assumed to follow the Two Parameter Lindley distribution. The maximum likelihood method is used to estimate the model parameters. The baseline hazard functions are assumed to follow Weibull, Exponential, Gompertz, and Pareto distributions, and a simulation study is performed under this assumption. We examine the characteristics of the distribution and assess its performance compared to other distributions that are frequently applied in frailty modeling by using both Nikulin-Rao-Robson and Bagdonavicius-Nikulin goodness-of-fit tests to determine the adequacy of the model. We analyze a fresh medical dataset collected from an emergency hospital in Algeria to evaluate the effectiveness and applicability of the proposed model. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
16. Estimating Hazard Function through Reliability Function and Empirical Methods.
- Author
- Khudhur, Azhin M., Hama Noory, Shvan A., and Abdulkareem, Bestun M.
- Subjects
- HAZARD function (Statistics), PROBABILITY density function, EMPIRICAL research, RAYLEIGH model, RELIABILITY in engineering
- Abstract
In this research, reliability functions are applied to estimate the hazard function of four used-car components (tires, brakes, lights, and engine), which are inspected by a periodic vehicle inspection (PVI) company established in Erbil city, a specialized company that conducts the annual technical inspection of vehicles to detect failed components that require either repair or replacement with new ones. For this purpose, failure data for a sample of 50,000 cars, inspected annually by the PVI company over 11 years (2010–2020), were obtained from the Erbil traffic directorate. From the available data, the reliability function, hazard function, and probability density function of the failure time of each component are found by the non-parametric method and by a fitted Rayleigh distribution, since the failure rates of the components are linear functions of time; the resulting reliability estimates are also compared using the mean absolute error method. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
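The hazard function in the entry above follows from h(t) = f(t)/R(t); for the Rayleigh distribution this ratio reduces to h(t) = t/σ², a failure rate linear in time, which is why the Rayleigh model fits components whose failure rates grow linearly. A minimal sketch with hypothetical failure times (not the Erbil inspection data):

```python
import numpy as np

def rayleigh_hazard(t, failure_times):
    """Rayleigh hazard h(t) = t / sigma^2, i.e. a failure rate linear in
    time. The scale is fit by maximum likelihood:
    sigma^2 = mean(failure_times**2) / 2."""
    sigma2 = np.mean(np.asarray(failure_times, dtype=float) ** 2) / 2.0
    return np.asarray(t, dtype=float) / sigma2

# hypothetical component failure times in years
fails = [2.0, 3.5, 4.1, 5.0, 6.2, 7.8]
h = rayleigh_hazard([1.0, 2.0, 4.0], fails)
```

Doubling the time doubles the hazard, which is the linearity property the abstract appeals to.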
17. Discussion of "Specifying prior distributions in reliability applications".
- Author
- Ng, Hon Keung Tony
- Subjects
- MONTE Carlo method, WEIBULL distribution, BAYESIAN analysis, HAZARD function (Statistics), CENSORING (Statistics)
- Abstract
Specifying prior distributions in the Bayesian method is a fundamental but complex problem, especially when a conjugate prior does not exist. Tian et al. (Appl Stoch Models Bus Ind; 2023) have captured the spirit of specifying prior distributions in Bayesian analysis for reliability data and presented different approaches coherently. In this discussion, I focus on specifying prior distributions based on knowledge of the aging behavior or hazard function and study the effect of misspecifying informative priors. A Monte Carlo simulation study based on Type-II censored data from the Weibull distribution is used to illustrate the performance of the estimation procedures based on informative and non-informative priors. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
18. Estimating the Parameters of Exponential-Rayleigh Distribution under Type-I Censored Data.
- Author
- Shatti, Rihaam N. and Al-Kinani, Iden H.
- Subjects
- RAYLEIGH model, HOSPITAL size, PROBABILITY density function, MAXIMUM likelihood statistics, HAZARD function (Statistics), CENSORSHIP, CENSORING (Statistics), LOG-rank test
- Abstract
Copyright of Baghdad Science Journal is the property of Republic of Iraq Ministry of Higher Education & Scientific Research (MOHESR) and its content may not be copied or emailed to multiple sites or posted to a listserv without the copyright holder's express written permission. However, users may print, download, or email articles for individual use. This abstract may be abridged. No warranty is given about the accuracy of the copy. Users should refer to the original published version of the material for the full abstract. (Copyright applies to all Abstracts.)
- Published
- 2024
- Full Text
- View/download PDF
19. Determining the optimal time to report mortality after lobectomy for lung cancer: An analysis of the time-varying risk of death
- Author
- Matthew Shiu Hang Wong, Aina Pons, BSc, PGCert, Paulo De Sousa, BSc, PgDip, RGN, Chiara Proli, MD, Simon Jordan, MB BCh, MD, FRCS, Sofina Begum, MB ChB, MSc, FRCS, Silviu Buderi, MB BCh, MSc, FRCSEd, Vladimir Anikin, MD, FRCS, Jonathan Finch, MBBS, FRCS, Nizar Asadi, MD, FRCS, Emma Beddow, MBBS, FRCS, and Eric Lim, MB ChB, MD, MSc, FRCS
- Subjects
- hazard function, lobectomy, lung cancer, mortality, outcomes analysis, Diseases of the circulatory (Cardiovascular) system, RC666-701, Surgery, RD1-811
- Abstract
Objective: Surgical mortality has traditionally been assessed at arbitrary intervals out to 1 year, without an agreed optimum time point. The aim of our study was to investigate the time-varying risk of death after lobectomy to determine the optimum period for evaluating surgical mortality rate after lobectomy for lung cancer. Methods: We performed a retrospective study of patients undergoing lobectomy for lung cancer at our institution from 2015 to 2022. Parametric survival models were assessed and compared with a nonparametric kernel estimate. The hazard function was plotted over time according to the best-fit statistical distribution. The time points at which the instantaneous hazard rate peaked and stabilized in the 1-year period after surgery were then determined. Results: During the study period, 2284 patients underwent lobectomy for lung cancer. Cumulative mortality at 30, 90, and 180 days was 1.3%, 2.9%, and 4.9%, respectively. The log-logistic distribution showed the best fit compared with the other statistical distributions, as indicated by the lowest Akaike information criterion value. The instantaneous hazard rate was greatest during the immediate postoperative period (0.129; 95% confidence interval, 0.087-0.183) and diminished rapidly within the first 30 days after surgery. The instantaneous hazard rate continued to decrease past 90 days and stabilized only at approximately 180 days. Conclusions: In-hospital mortality is the optimal follow-up period that captures the early-phase hazard during the immediate postoperative period after lobectomy. Thirty-day mortality is not synonymous with "early mortality," as the instantaneous hazard rate remains elevated well past the 90-day time point and only stabilizes at approximately 180 days after lobectomy.
- Published
- 2023
- Full Text
- View/download PDF
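The log-logistic distribution chosen in the entry above has a closed-form hazard, h(t) = (β/α)(t/α)^(β−1) / (1 + (t/α)^β), which can capture an early peak in death risk that declines over time. The parameter values below are illustrative assumptions, not the values fitted in the study:

```python
import numpy as np

def loglogistic_hazard(t, alpha, beta):
    """Instantaneous hazard of the log-logistic distribution:
    h(t) = (beta/alpha) * (t/alpha)**(beta - 1) / (1 + (t/alpha)**beta).
    For beta < 1 the hazard is highest immediately after time zero and
    declines monotonically thereafter."""
    z = (t / alpha) ** beta
    return (beta / alpha) * (t / alpha) ** (beta - 1.0) / (1.0 + z)

# illustrative parameters only (beta < 1 gives a monotonically
# decreasing postoperative hazard)
days = np.linspace(1.0, 365.0, 365)
h = loglogistic_hazard(days, alpha=120.0, beta=0.8)
```

Inspecting where this curve flattens is one way to identify the point at which the hazard has "stabilized," mirroring the study's argument for reporting mortality at approximately 180 days.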
20. A Generalized Gompertz Distribution with Hazard Power Parameter and Its Bivariate Extension: Properties and Applications
- Author
- Muhammed, Hiba Zeyada
- Published
- 2024
- Full Text
- View/download PDF
21. Using the hazard function to evaluate hepatocellular carcinoma recurrence risk after curative resection.
- Author
- Li, Wei-Feng, Moi, Sin-Hua, Liu, Yueh-Wei, Yong, Chee-Chien, Wang, Chih-Chi, Yen, Yi-Hao, and Lin, Chih-Yun
- Abstract
Predicting recurrence patterns of hepatocellular carcinoma (HCC) can be helpful in developing surveillance strategies. This study aimed to use the hazard function to investigate recurrence hazard and peak recurrence time transitions in patients with HCC undergoing liver resection (LR). We enrolled 1204 patients with HCC undergoing LR between 2007 and 2018 at our institution. Recurrence hazard, patterns, and peak rates were analyzed. The overall recurrence hazard peaked at 7.2 months (peak hazard rate [pHR]: 0.0197), but varied markedly. In subgroup analyses based on recurrence risk factors, patients with a high radiographic tumor burden score (pHR: 0.0521), alpha-fetoprotein level ≥ 400 ng/ml (pHR: 0.0427), and pT3–4 (pHR: 0.0656) showed a pronounced peak within the first year after LR. Patients with cirrhosis showed a pronounced peak within three years after LR (pHR: 0.0248), whereas those with Barcelona Clinic Liver Cancer stage B (pHR: 0.0609) and poor tumor differentiation (pHR: 0.0451) showed multiple peaks during the 5-year follow-up period. In contrast, patients without these recurrence risk factors had a relatively flat hazard function curve. HCC recurrence hazard, patterns, and peak rates varied substantially depending on different risk factors of HCC recurrence. [ABSTRACT FROM AUTHOR]
- Published
- 2023
- Full Text
- View/download PDF
22. Expected length of stay at residential aged care facilities in Australia: current and future.
- Author
- Zhang, Jinhui, Shi, Yanlin, and Huang, Guogui
- Abstract
This study explores the changing patterns of the length of stay (LOS) at Australian residential aged care facilities during 2008–2018 and likely trends up to 2040. The expected LOS was estimated via the hazard function of exiting from such a facility and its heterogeneity by residents’ sociodemographic characteristics using an improved Cox regression model. Data were sourced from the Australian Institute of Health and Welfare. In-sample modelling results reveal that the estimated LOS differed by age (in general, shorter for older groups), marital status (longer for the widowed) and sex (longer for females). In addition, the estimated LOS increased slowly from 2008–2009 to 2016–2017 but declined steadily thereafter. Out-of-sample predictions suggest that the declining trend of the estimated LOS will continue until 2040 and that the longest LOS (approximately 37 months) will be observed among widowed females aged 50–79 years. Relative uncertainty measures are provided. The results portray the current changing landscape and the future trend of residential aged care use in Australia, which can inform the development of optimised residential aged care policies to support ageing Australians more effectively. [ABSTRACT FROM AUTHOR]
- Published
- 2023
- Full Text
- View/download PDF
23. Explicit Timing Differently Predicts Implicit Timing Performance in Younger and Older Adults.
- Author
- Otsuka, Taku, Yotsumoto, Yuko, and Mioni, Giovanna
- Abstract
Temporal processing can be divided into explicit timing and implicit timing. Explicit timing tasks require participants to attend to the temporal aspects of the task, whereas in implicit timing tasks, temporal information affects performance without explicit instruction to process time. Compared to younger adults, older adults have been shown to exhibit greater variability in explicit timing, while in implicit timing, they have been shown to rely more on temporal predictions formed by the hazard function. However, the relationship between explicit and implicit timing and its age-related changes have yet to be explored. To address this issue, we collected data in which younger and older adults performed a time bisection task (i.e., explicit timing task) and a foreperiod task (i.e., implicit timing task) in a within-subjects design. Based on a Bayesian optimization framework, we hypothesized that individuals with higher variability in explicit timing would show a stronger foreperiod effect, which is an index of the degree of reliance on temporal predictions. Results showed a different relation between explicit and implicit timing in younger and older adults. In older adults, results were consistent with the hypothesis that increased variability in explicit timing was associated with a stronger foreperiod effect. In contrast, bias in the temporal representation but not variability was associated with the foreperiod effect in younger adults. Implications for the age-related difference in the relation between explicit and implicit timing are discussed. [ABSTRACT FROM AUTHOR]
- Published
- 2023
- Full Text
- View/download PDF
24. Doubly bounded exponential model: Some information measures and estimation.
- Author
- P. Singh, Brijesh, Dhar Das, Utpal, Karakaya, Kadir, S. Bakouch, Hassan, and Abba, Badamasi
- Abstract
A three-parameter probability distribution is derived from a composed cumulative distribution function, which is itself a family of bounded-support transformations. The transformed model, called the doubly bounded exponential (DB-Exp) distribution, exhibits a decreasing-shaped density while its hazard rate has an increasing shape. Some statistical properties, such as the moments and various entropy functions, are obtained in closed form. Parameter estimation is carried out by the methods of maximum likelihood, least squares, weighted least squares, Anderson-Darling, and Cramér-von Mises. The performance of these estimators is assessed through a Monte Carlo simulation study, and the identifiability of the DB-Exp model's parameters is also investigated. As shown by an application to rainfall data, the proposed distribution can outperform several well-known bounded distributions in the literature. [ABSTRACT FROM AUTHOR]
- Published
- 2023
- Full Text
- View/download PDF
25. Extending Beyond Bagust and Beale: Fully Parametric Piecewise Exponential Models for Extrapolation of Survival Outcomes in Health Technology Assessment.
- Author
- Cooney, Philip and White, Arthur
- Subjects
- SURVIVAL rate, PARTITION functions, MEDICAL technology, HAZARD function (Statistics), PARAMETRIC modeling
- Abstract
When extrapolating time-to-event data the Bagust and Beale (B&B) approach uses the Kaplan-Meier survival function until a manually chosen time point, after which a constant hazard is assumed. This study demonstrates an objective statistical approach to estimate this time point. We estimate piecewise exponential models (PEMs), whereby the hazard function is partitioned into segments each with constant hazards. The boundaries of these segments are known as change points. Our approach determines the location and number of change points in PEMs from which the hazard in the final segment is used to model long-term survival. We reviewed previous applications of the B&B approach in National Institute for Health and Care Excellence Technology Appraisals (TAs) completed between July 2011 and June 2017. The time points after which constant hazards were assumed were compared between PEMs and the B&B approaches. When further survival data were published following the original TA, we compared these updated estimates to predicted survival from the PEM and other parametric models adjusted for general population mortality. Six of the 59 TAs in this review considered the B&B approach. There was general agreement between the location of time points identified through the PEM and the B&B approaches. In 2 of the identified TAs the best fitting model to the data was a no-change-point model. Of the 3 TAs for which further survival data became available, PEM provided the closest prediction for survival outcomes in 2 TAs. PEMs are useful for survival extrapolation when a long-term constant hazard trend for the disease is clinically plausible. • For clinical and administrative reasons, the early portion of clinical trials can be subject to transient effects that are not representative of the long-term hazards and can potentially bias survival extrapolation. 
• In this article, we describe a survival model that objectively identifies the location after which disease-related hazards become approximately constant and compare the accuracy of extrapolated survival against other parametric models. • This study illustrates that if disease-related hazards can be assumed constant, the extrapolated survival (adjusting for general population mortality) can be a reasonable estimate for use in decision-analytic model-based cost-effectiveness analysis. [ABSTRACT FROM AUTHOR]
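The change-point idea behind the PEM approach described above can be sketched with a one-change-point profile likelihood: the segment hazards have closed-form MLEs (events divided by person-time), so only the change point needs a search. This is a minimal illustration, not the TA analyses themselves; the simulated hazards (0.5 dropping to 0.1 at t = 2) are invented for the example:

```python
import numpy as np

def piecewise_exp_profile_loglik(times, events, tau):
    """Profile log-likelihood of a one-change-point piecewise exponential
    model: constant hazard lam1 on [0, tau), lam2 on [tau, inf)."""
    exp1 = np.minimum(times, tau).sum()          # person-time before tau
    exp2 = np.maximum(times - tau, 0.0).sum()    # person-time after tau
    d1 = int(np.sum(events & (times <= tau)))
    d2 = int(np.sum(events & (times > tau)))
    if min(exp1, exp2) <= 0 or min(d1, d2) == 0:
        return -np.inf
    lam1, lam2 = d1 / exp1, d2 / exp2            # closed-form segment MLEs
    return d1 * np.log(lam1) - lam1 * exp1 + d2 * np.log(lam2) - lam2 * exp2

rng = np.random.default_rng(0)
n = 2000
# true hazard: 0.5 before t = 2, then 0.1 (memorylessness justifies the splice)
t1 = rng.exponential(1 / 0.5, size=n)
times = np.where(t1 < 2.0, t1, 2.0 + rng.exponential(1 / 0.1, size=n))
events = np.ones(n, dtype=bool)                  # no censoring in this sketch

grid = np.linspace(0.5, 5.0, 91)
tau_hat = grid[np.argmax([piecewise_exp_profile_loglik(times, events, g)
                          for g in grid])]       # estimated change point
```

In the full method the number of change points is also selected (e.g. by information criteria); this sketch fixes it at one, and the hazard in the final segment would be the value carried forward for extrapolation.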
- Published
- 2023
- Full Text
- View/download PDF
26. Unit Exponential Probability Distribution: Characterization and Applications in Environmental and Engineering Data Modeling.
- Author
-
Bakouch, Hassan S., Hussain, Tassaddaq, Tošić, Marina, Stojanović, Vladica S., and Qarmalah, Najla
- Subjects
- *
DISTRIBUTION (Probability theory) , *ENGINEERING models , *ENVIRONMENTAL engineering , *DATA modeling , *MAXIMUM likelihood statistics , *PARAMETER estimation - Abstract
Distributions with bounded support are considerably sparser in the literature than those with unbounded support, despite the fact that there are a number of real-world contexts where observations take values from a bounded range (proportions, percentages, and fractions are typical examples). For proportion modeling, a flexible family of two-parameter distribution functions associated with the exponential distribution is proposed here. The mathematical and statistical properties of the novel distribution are examined, including the quantiles, mode, moments, hazard rate function, and its characterization. The parameter estimation procedure using the maximum likelihood method is carried out, and applications to environmental and engineering data are also considered. To this end, various statistical tests are used, along with some other information criterion indicators to determine how well the model fits the data. The proposed model is found to be the most efficient plan in most cases for the datasets considered. [ABSTRACT FROM AUTHOR]
- Published
- 2023
- Full Text
- View/download PDF
27. Estimation of the Parameters of the Modified Weibull Distribution with Bathtub-shaped Failure Rate Function.
- Author
-
Hussein Adam, Adam Abdelrahman and Sazak, Hakan Savaş
- Subjects
- *
WEIBULL distribution , *PARAMETER estimation , *SOFTWARE reliability , *HAZARD function (Statistics) - Abstract
In this study, we propose two estimators called the 3-step modified maximum likelihood (MML) and the combined estimators of the parameters of the modified Weibull distribution which is used in reliability models with bathtub-shaped failure rate function. The simulations show the superiority of both estimators over the graphical estimators. Particularly, the combined estimators are the better of the two. Two real-life data applications also show the superiority of the proposed estimators compared to the graphical estimators. [ABSTRACT FROM AUTHOR]
- Published
- 2023
- Full Text
- View/download PDF
28. Risk Analysis in Practice and Theory
- Author
-
Kitsos, Christos P., Kitsos, Christos P., editor, Oliveira, Teresa A., editor, Pierri, Francesca, editor, and Restaino, Marialuisa, editor
- Published
- 2023
- Full Text
- View/download PDF
29. Survival Analysis
- Author
-
Emmert-Streib, Frank, Moutari, Salissou, Dehmer, Matthias, Emmert-Streib, Frank, Moutari, Salissou, and Dehmer, Matthias
- Published
- 2023
- Full Text
- View/download PDF
30. NONPARAMETRIC FUNCTIONAL HAZARD WITH k NEAREST NEIGHBORS ESTIMATION WHERE THE OBSERVATIONS ARE M.A.R. AND RELATED VIA A FUNCTIONAL SINGLE-INDEX STRUCTURE.
- Author
-
BOUABÇA, ASMA, BOUABSA, WAHIBA, and BELLATRACH, NADJET
- Subjects
- *
HAZARDS , *K-nearest neighbor classification , *HAZARD function (Statistics) - Abstract
This paper analyzes nonparametric local linear estimation of the hazard function via the k-nearest-neighbors (k-NN) method for a scalar response variable B that is not fully observed (missing at random), given the functional variable A. The purpose of this paper is to establish, under some general conditions, the almost complete convergence (with rates) of the constructed estimator. [ABSTRACT FROM AUTHOR]
- Published
- 2023
- Full Text
- View/download PDF
31. Transform orders and stochastic monotonicity of statistical functionals.
- Author
-
Lando, Tommaso, Arab, Idir, and Oliveira, Paulo Eduardo
- Subjects
- *
STOCHASTIC orders , *STATISTICAL bootstrapping , *FUNCTIONALS , *STOCHASTIC dominance , *GOODNESS-of-fit tests , *CONFIDENCE intervals - Abstract
In some inferential statistical methods, such as tests and confidence intervals, it is important to describe the stochastic behavior of statistical functionals, aside from their large sample properties. We study such a behavior in terms of the usual stochastic order. For this purpose, we introduce a generalized family of stochastic orders, which is referred to as transform orders, showing that it provides a flexible framework for deriving stochastic monotonicity results. Given that our general definition makes it possible to obtain some well known ordering relations as particular cases, we can easily apply our method to different families of functionals. These include some prominent inequality measures, such as the generalized entropy, the Gini index, and its generalizations. We also illustrate the applicability of our approach by determining the least favorable distribution, and the behavior of some bootstrap statistics, in some goodness‐of‐fit testing procedures. [ABSTRACT FROM AUTHOR]
- Published
- 2023
- Full Text
- View/download PDF
32. A NEW FINITE MIXTURE OF PROBABILITY MODELS WITH APPLICATION.
- Author
-
SAKTHIVEL, K. M. and G., VIDHYA
- Subjects
- *
PROBABILITY theory , *RAYLEIGH model - Abstract
In this research, we present an approach to model lifetime data by a weighted three-parameter probability distribution utilizing the exponential and gamma distributions. We present some of the essential characteristics, such as the shapes of the pdf and cdf, moments, incomplete moments, survival function, hazard function, mean residual life, stochastic ordering, and order statistics of the proposed distribution. Furthermore, we also present the Bonferroni index and Lorenz curve of the proposed distribution. The maximum likelihood approach is used to estimate the parameters of the distribution. Finally, the proposed probability distribution is compared in terms of goodness of fit with the Lindley, Akash, exponential, two-parameter Lindley, cubic transmuted Rayleigh, and Exponential-Gamma distributions for a real-life data set. [ABSTRACT FROM AUTHOR]
- Published
- 2023
33. Survival Analysis and Hazard of Log Logistic Distribution on Type I Censored Data Parametrically.
- Author
-
Hosana, Ruth, Septia Sari, Ni Wayan Widya, and Kurniawan, Ardi
- Subjects
SURVIVAL analysis (Biometry) ,HAZARD function (Statistics) ,LOGISTIC distribution (Probability) ,STUDENT attitudes ,DATA analysis - Abstract
Survival Analysis is a research method that examines the survival time of individuals or experimental units in relation to events such as death, disease, recovery, or other experiences. This study utilizes a parametric survival analysis model with a two-parameter log-logistic distribution and the Maximum Likelihood Estimation (MLE) method to analyze the survival of students during their study period. The log-logistic distribution is chosen for its ability to capture early or late failure patterns. The objective of this research is to analyze type I censored survival data using the log-logistic distribution applied to secondary data on student study duration. The dataset consists of 98 observations. The calculated values for the β and γ parameters of the two-parameter log-logistic distribution are 2.12831 and 0.0918891, respectively. The probability of students completing their studies by semester 8 (hazard function h(8)) is 0.370102, while the probability of students continuing their studies in semester 9 (survival function s(9)) is 0.320817. [ABSTRACT FROM AUTHOR]
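The abstract's (β, γ) parameterization of the log-logistic is not fully specified, so the sketch below assumes the common shape/scale form S(t) = 1/(1 + (t/scale)^shape); it illustrates the survival and hazard functions generally rather than reproducing the reported h(8) and s(9) values:

```python
import numpy as np

def loglogistic_survival(t, shape, scale):
    """S(t) = 1 / (1 + (t/scale)**shape), an assumed parameterization."""
    return 1.0 / (1.0 + (t / scale) ** shape)

def loglogistic_hazard(t, shape, scale):
    """h(t) = f(t)/S(t) = (shape/t) * z / (1 + z), with z = (t/scale)**shape."""
    z = (t / scale) ** shape
    return (shape / t) * z / (1.0 + z)

# illustrative parameters: for shape > 1 the hazard rises and then falls,
# which is the "early or late failure" flexibility the abstract mentions
t = np.linspace(0.1, 30.0, 300)
h = loglogistic_hazard(t, shape=2.1, scale=8.0)
peak = t[np.argmax(h)]   # hazard mode is at scale * (shape-1)**(1/shape)
```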
- Published
- 2023
34. A New Sine Family of Generalized Distributions: Statistical Inference with Applications.
- Author
-
Benchiha, SidAhmed, Sapkota, Laxmi Prasad, Al Mutairi, Aned, Kumar, Vijay, Khashab, Rana H., Gemeay, Ahmed M., Elgarhy, Mohammed, and Nassr, Said G.
- Subjects
WEIBULL distribution ,PERSONAL names ,TRIGONOMETRIC functions ,LEAST squares ,FAMILIES - Abstract
In this article, we extensively study a family of distributions using the trigonometric function. We add an extra parameter to the sine transformation family and name it the alpha-sine-G family of distributions. Some important functional forms and properties of the family are provided in a general form. A specific sub-model alpha-sine Weibull of this family is also introduced using the Weibull distribution as a parent distribution and studied deeply. The statistical properties of this new distribution are investigated and intended parameters are estimated using the maximum likelihood, maximum product of spacings, least square, weighted least square, and minimum distance methods. For further justification of these estimates, a simulation experiment is carried out. Two real data sets are analyzed to show the suggested model's application. The suggested model performed well compared to some existing models considered in the study. [ABSTRACT FROM AUTHOR]
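The alpha-sine-G form with its extra parameter is not given in the abstract; the sketch below shows only the base sine-G transform F*(x) = sin(πG(x)/2) with a Weibull parent, i.e. the construction that the alpha-sine-G family extends (parameter values are arbitrary):

```python
import numpy as np

def weibull_cdf(x, k, lam):
    """Parent (baseline) Weibull cdf G(x) = 1 - exp(-(x/lam)**k)."""
    return 1.0 - np.exp(-((x / lam) ** k))

def sine_g_cdf(x, k, lam):
    """Base sine-G transform F*(x) = sin(pi * G(x) / 2); the paper's
    alpha-sine-G adds an extra parameter to this map (form not shown here)."""
    return np.sin(0.5 * np.pi * weibull_cdf(x, k, lam))

# since sin(pi*u/2) maps [0,1] onto [0,1] monotonically, F* is a valid cdf
x = np.linspace(0.0, 20.0, 2001)
F = sine_g_cdf(x, k=1.5, lam=3.0)
```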
- Published
- 2023
- Full Text
- View/download PDF
35. Simulating time-to-event data under the Cox proportional hazards model: assessing the performance of the non-parametric Flexible Hazards Method
- Author
-
Jennifer L. Delzeit and Devin C. Koestler
- Subjects
Cox proportional hazards model ,survival data ,simulation ,time-to-event ,methodology ,hazard function ,Applied mathematics. Quantitative methods ,T57-57.97 ,Probabilities. Mathematical statistics ,QA273-280 - Abstract
Numerous methods and approaches have been developed for generating time-to-event data from the Cox Proportional Hazards (CPH) model; however, they often require specification of a parametric distribution for the baseline hazard even though the CPH model itself makes no assumptions on the distribution of the baseline hazards. In line with the semi-parametric nature of the CPH model, a recently proposed method called the Flexible Hazards Method generates time-to-event data from a CPH model using a non-parametric baseline hazard function. While the initial results of this method are promising, it has not yet been comprehensively assessed with increasing covariates or against data generated under parametric baseline hazards. To fill this gap, we conducted a comprehensive study to benchmark the performance of the Flexible Hazards Method for generating data from a CPH model against parametric methods. Our results showed that with a single covariate and large enough assumed maximum time, the bias in the Flexible Hazards Method is 0.02 (with respect to the log hazard ratio) with a 95% confidence interval having coverage of 84.4%. This bias increases to 0.054 when there are 10 covariates under the same settings and the coverage of the 95% confidence interval decreases to 46.7%. In this paper, we explain the plausible reasons for this observed increase in bias and decrease in coverage as the number of covariates increases, both empirically and theoretically, and provide readers and potential users of this method with some suggestions on how to best address these issues. In summary, the Flexible Hazards Method performs well when there are few covariates and the user wishes to simulate data from a non-parametric baseline hazard.
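The parametric comparator the study benchmarks against is easy to sketch: with a Weibull baseline, the CPH survival function inverts in closed form, giving the standard inverse-transform recipe for simulating event times (this illustrates the parametric approach, not the Flexible Hazards Method itself; parameter values are illustrative):

```python
import numpy as np

def simulate_cph_weibull(n, beta, shape, scale, rng):
    """Draw event times from a Cox model with Weibull baseline hazard.
    With cumulative baseline H0(t) = (t/scale)**shape, inverting
    S(t | x) = exp(-H0(t) * exp(x @ beta)) at U ~ Uniform(0,1) gives
    T = scale * (-log(U) / exp(x @ beta))**(1/shape)."""
    x = rng.normal(size=(n, len(beta)))
    u = rng.uniform(size=n)
    t = scale * (-np.log(u) / np.exp(x @ beta)) ** (1.0 / shape)
    return t, x

rng = np.random.default_rng(1)
t, x = simulate_cph_weibull(50_000, beta=np.array([0.7]),
                            shape=1.5, scale=2.0, rng=rng)
# with a positive log hazard ratio, larger covariate values mean shorter times
```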
- Published
- 2023
- Full Text
- View/download PDF
36. A Novel Version of the Exponentiated Weibull Distribution: Copulas, Mathematical Properties and Statistical Modeling.
- Author
-
Refaie, Mohamed K. A., Yaqoob, Asmaa Ayoob, Selim, Mahmoud Ali, and Ali, Emadeldin I. A.
- Subjects
- *
WEIBULL distribution , *STATISTICAL models , *COPULA functions , *KURTOSIS , *RENYI'S entropy , *BATHTUBS - Abstract
In this study, the authors describe a novel exponentiated Weibull distribution. Its pertinent mathematical properties are derived and analyzed. The expected value, variance, skewness, kurtosis, and dispersion index are examined statistically. The new density can take several beneficial shapes, including "bathtub," "right skewed," "bimodal and left skewed," "unimodal and left skewed," and "bimodal and right skewed." The new rate of failure may be described as "bathtub (U-HRF)," "constant," "monotonically increasing," "upside down-increasing (reversed U-increasing)," "J-HRF," "upside down-constant," "increasing-constant," or "upside down (reversed U)." The efficiency of the maximum likelihood method is assessed via graphical analysis, with biases and mean squared errors as the main evaluation measures. Three separate sets of actual data graphically display the adaptability and value of the innovative distribution. [ABSTRACT FROM AUTHOR]
- Published
- 2023
- Full Text
- View/download PDF
37. Effect of Number of Tests on the Mechanical Characteristics of Agave sisalana Yarns for Composites Structures: Statistical Approach.
- Author
-
Gahgah, Mounir, Belaadi, Ahmed, Boumaaza, Messaouda, Alshahrani, Hassan, and Khan, Mohammad K. A.
- Subjects
- *
YARN , *COMPOSITE structures , *PLANT fibers , *NATURAL fibers , *YOUNG'S modulus , *AGAVES - Abstract
A designer of sustainable biocomposite structures and natural ropes needs a high confidence interval (95% CI) for mechanical characteristics data of performance materials, yet the qualities of plant-based fibers are very diverse. A comprehensive study of the elements that enhance the performance of biocomposites or sustainable ropes created from vegetable fibers is necessary. The current study included five groups with varying numbers (N) of tests, of 20, 40, 60, 80, and 100, on the mechanical characteristics at room temperature. The purpose of this study was to determine how changing N affects the mechanical properties of sisal yarn: its strength, Young's modulus, and deformation at rupture. A significance testing program including more than 100 tests was performed. Owing to the heterogeneity of the plant yarn, each group received more than 20 samples at a gauge length (GL) of 100 mm. The tensile strength characteristics of sisal yarns produced a wide range of findings, as is common for natural fibers, necessitating a statistical analysis. The two-parameter Weibull distribution and a prediction model with a 95% confidence level for maximum likelihood (ML) and least squares (LS) were used to investigate and quantify this dispersion. [ABSTRACT FROM AUTHOR]
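The two-parameter Weibull fit with ML and LS described above can be sketched as follows; the data are simulated stand-ins for the yarn strengths, and the median-rank plotting position is one common choice for the LS (probability-plot) estimator:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
# stand-in "strength" data; true shape 4, scale 300 (MPa) are invented
strength = stats.weibull_min.rvs(c=4.0, scale=300.0, size=100, random_state=rng)

# maximum-likelihood fit of the two-parameter Weibull (location fixed at 0)
shape_ml, _, scale_ml = stats.weibull_min.fit(strength, floc=0)

# least-squares fit on the linearized cdf:
# ln(-ln(1 - F)) = c * ln(x) - c * ln(scale)
x = np.sort(strength)
F = (np.arange(1, len(x) + 1) - 0.3) / (len(x) + 0.4)   # median-rank positions
slope, intercept = np.polyfit(np.log(x), np.log(-np.log(1.0 - F)), 1)
shape_ls, scale_ls = slope, np.exp(-intercept / slope)
```

Comparing the ML and LS estimates across groups of different N is one way to see how the sample size affects the stability of the fitted shape (the scatter indicator for fiber strength) and scale.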
- Published
- 2023
- Full Text
- View/download PDF
38. On Modeling Bivariate Lifetime Data in the Presence of Inliers
- Author
-
Bhattacharya, Sumangal, Das, Ishapathik, and Kunnummal, Muralidharan
- Published
- 2024
- Full Text
- View/download PDF
39. Unit Maxwell-Boltzmann Distribution and Its Application to Concentrations Pollutant Data
- Author
-
Cenker Biçer, Hassan S. Bakouch, Hayrinisa Demirci Biçer, Gadir Alomair, Tassaddaq Hussain, and Amal Almohisen
- Subjects
hazard function ,Maxwell-Boltzmann ,characterizations ,estimation ,simulation ,application ,Mathematics ,QA1-939 - Abstract
In the vast statistical literature, there are numerous probability distribution models that can model data from real-world phenomena. New probability models, nevertheless, are still required in order to represent data with various spread behaviors. It is a known fact that there is a great need for new models with limited support. In this study, a flexible probability model called the unit Maxwell-Boltzmann distribution, which can model data values in the unit interval, is derived by selecting the Maxwell-Boltzmann distribution as a baseline model. The important characteristics of the derived distribution in terms of statistics and mathematics are investigated in detail in this study. Furthermore, the inference problem for the mentioned distribution is addressed from the perspectives of maximum likelihood, method of moments, least squares, and maximum product of spacings, and different estimators are obtained for the unknown parameter of the distribution. The derived distribution outperforms competitive models according to different fit tests and information criteria in the applications performed on four actual air pollutant concentration data sets, indicating that it is an effective model for modeling air pollutant concentration data.
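The paper's exact unit transformation is not stated in the abstract; one common construction maps a positive baseline Y onto (0, 1) via X = exp(-Y), and the resulting change-of-variables density can be checked numerically. A sketch under that assumption:

```python
import numpy as np
from scipy import stats
from scipy.integrate import quad

def unit_pdf(x, a):
    """Density on (0, 1) induced by X = exp(-Y) with Y ~ Maxwell(scale=a):
    f_X(x) = f_Y(-ln x) / x by change of variables. The paper's actual
    transformation may differ; this is only one standard construction."""
    return stats.maxwell.pdf(-np.log(x), scale=a) / x

# the induced density should integrate to 1 over the unit interval
total, _ = quad(unit_pdf, 0.0, 1.0, args=(0.5,))
```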
- Published
- 2024
- Full Text
- View/download PDF
40. Suitable Patient Selection and Optimal Timing of Treatment for Persistent Air Leak after Lung Resection
- Author
-
Yoshikane Yamauchi, Hiroyuki Adachi, Nobumasa Takahashi, Takao Morohoshi, Taketsugu Yamamoto, Makoto Endo, Tsuyoshi Ueno, Tekkan Woo, Yuichi Saito, and Noriyoshi Sawabata
- Subjects
postoperative air leak ,risk factor ,cumulative distribution ,hazard function ,Medicine - Abstract
Objectives: The choice of therapeutic intervention for postoperative air leak varies between institutions. We aimed to identify the optimal timing and patient criteria for therapeutic intervention in cases of postoperative air leaks after lung resection. Methods: This study utilized data from a prospective multicenter observational study conducted in 2019. Among the 2187 cases in the database, 420 cases with air leaks on postoperative day 1 were identified. The intervention group underwent therapeutic interventions, such as pleurodesis or surgery, while the observation group was monitored without intervention. Comparisons between the intervention and observation groups were analyzed using the cumulative distribution and hazard functions. Results: Forty-six patients (11.0%) were included in the intervention group. The multivariate analysis revealed that low body mass index (p = 0.019), partial resection (p = 0.010), intraoperative use of fibrin glue (p = 0.008), severe air leak on postoperative day 1 (p < 0.001), and high forced expiratory volume in 1 s (p = 0.021) were significant predictors of the requirement for intervention. The proportion of patients with persistent air leak in the observation group was 20% on postoperative day 5 and 94% on postoperative day 7. The hazard of air leak cessation peaked from postoperative day 3 to postoperative day 7. Conclusions: This research contributes valuable insights into predicting therapeutic interventions for postoperative air leaks and identifies scenarios where spontaneous cessation is probable. Validation through prospective studies is warranted to confirm these findings.
- Published
- 2024
- Full Text
- View/download PDF
41. Weibull-Fréchet distribution: A new lifetime distribution with application to gastric cancer data
- Author
-
Chindranata Marko, Nurrohmah Siti, and Fithriani Ida
- Subjects
hazard function ,maximum likelihood method ,unimodal ,weibull-g ,Information technology ,T58.5-58.64 - Abstract
Lifetime data is a type of data that consists of the waiting time until an event occurs. Examples of such events are deaths, occurrence of a disease, or failure of a machine. The distribution usually used for modeling lifetime data is the Weibull distribution. However, the Weibull distribution has a limitation in its application: it can only model data with a monotonic hazard function. Therefore, a method for generalizing the Weibull distribution is needed so it can model data with a non-monotonic hazard function. One such generalization is the Weibull-Fréchet distribution (WFr), which was introduced by Afify in 2016. The WFr distribution has an advantage over the Weibull distribution due to its capability of modeling data with a unimodal hazard function. The method used in generating the WFr distribution is the Weibull-G (WG) method introduced by Bourguignon in 2014. The WG method combines the Weibull distribution with an arbitrary distribution having cumulative distribution function (cdf) G(x), through a function W[G(x)]. The characteristics of the WFr distribution discussed include the probability density function (pdf), cumulative distribution function, survival function, hazard function, and moments. The hazard function of WFr can be monotonic or unimodal. The maximum likelihood estimation method is used to estimate the parameters of the distribution. Finally, lifetime data of gastric cancer patients are given for illustration purposes. The data are modeled using the WFr distribution, and both the Weibull and Fréchet distributions for comparison. The modeling results show that the WFr distribution is the best of the three for modeling the lifetime data of gastric cancer patients.
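The Weibull-G construction of Bourguignon et al. (2014) takes F(x) = 1 - exp{-a [G(x)/(1 - G(x))]^b}; with a Fréchet parent G this yields a WFr-type cdf. A sketch (parameter names and values here are illustrative, not the paper's):

```python
import numpy as np

def frechet_cdf(x, c, s):
    """Parent Fréchet cdf G(x) = exp(-(s/x)**c) for x > 0."""
    return np.exp(-((s / x) ** c))

def weibull_g_cdf(x, a, b, c, s):
    """Weibull-G construction: F(x) = 1 - exp(-a * (G/(1-G))**b),
    here with a Fréchet parent, giving a WFr-type distribution."""
    G = frechet_cdf(x, c, s)
    return 1.0 - np.exp(-a * (G / (1.0 - G)) ** b)

x = np.linspace(0.05, 30.0, 3000)
F = weibull_g_cdf(x, a=1.0, b=0.8, c=1.5, s=2.0)
# numerical hazard h = f / (1 - F) via finite differences, to see its shape
f = np.gradient(F, x)
h = f / (1.0 - F)
```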
- Published
- 2024
- Full Text
- View/download PDF
42. Alpha power inverse Weibull distribution: A new lifetime distribution with application to gastric cancer data
- Author
-
Rasjid Julio Majesty, Nurrohmah Siti, and Fithriani Ida
- Subjects
alpha power transformation ,hazard function ,inverse weibull distribution ,lifetime data ,maximum likelihood method ,Information technology ,T58.5-58.64 - Abstract
Lifetime data analysis has an essential role in various fields of science. In general, lifetime data have a skewed distribution pattern. The Weibull distribution is one of the most frequently used distributions for modeling lifetime data. However, the Weibull distribution is not suitable for modeling data with non-monotonic hazard functions, such as an upside-down bathtub shape. According to Sharma et al. (2015), the inverse versions of several probability distributions can model data with an upside-down bathtub-shaped hazard, one of which is the inverse Weibull distribution. This paper describes the Alpha Power Inverse Weibull (APIW) distribution as a generalization of the inverse Weibull distribution. This distribution is constructed using the Alpha Power Transformation method: a shape parameter is added to the inverse Weibull distribution to increase flexibility. The characteristics of the APIW distribution discussed include the probability density function, distribution function, survival function, hazard function, and the r-th moment. The probability density function of the APIW distribution is left-skewed and unimodal. In addition, the hazard function of the APIW distribution has an upside-down bathtub shape. The parameters of this distribution are estimated by the maximum likelihood method. Finally, for illustration purposes, data on the time until death of gastric cancer patients are modeled with both the inverse Weibull and APIW distributions. The modeling results show that the Alpha Power Inverse Weibull distribution models these data better than the inverse Weibull distribution.
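The Alpha Power Transformation has the known cdf form F_APT(x) = (α^F(x) - 1)/(α - 1) for α > 0, α ≠ 1, applied here to an inverse Weibull baseline; parameter names and values in this sketch are assumptions for illustration:

```python
import numpy as np

def inv_weibull_cdf(x, a, b):
    """Baseline inverse Weibull cdf F(x) = exp(-(b/x)**a) for x > 0."""
    return np.exp(-((b / x) ** a))

def apt_cdf(x, alpha, a, b):
    """Alpha Power Transformation of a baseline cdf F:
    F_APT(x) = (alpha**F(x) - 1) / (alpha - 1), alpha > 0, alpha != 1."""
    F = inv_weibull_cdf(x, a, b)
    return (alpha ** F - 1.0) / (alpha - 1.0)

# alpha is the extra shape parameter; alpha -> 1 recovers the baseline cdf
x = np.linspace(0.1, 50.0, 5000)
F_apiw = apt_cdf(x, alpha=2.5, a=1.8, b=3.0)
```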
- Published
- 2024
- Full Text
- View/download PDF
43. On an Induced Distribution and its Statistical Properties
- Author
-
Brijesh P. Singh and Utpal Dhar Das
- Subjects
Induced distribution ,Bonferroni and Gini index ,Entropy ,Generating function ,Hazard function ,MLE ,Statistics ,HA1-4737 ,Probabilities. Mathematical statistics ,QA273-280 - Abstract
In this study, an approach to developing a new distribution is proposed; it requires only knowledge of the distribution function. Some important statistical properties of the new distribution, such as moments, cumulants, and the hazard and survival functions, are derived. The Rényi and Shannon entropies are obtained. The ML estimate of the distribution's parameter is also obtained; since it is not in closed form, a numerical technique is used to estimate the parameter. Some real data sets are used to check the suitability of the proposed distribution against some other existing one-parameter lifetime distributions. Various diagnostic tools, such as -2LL, AIC, BIC, and the K-S test, show that the proposed distribution provides a better fit than the other distributions for the considered data sets.
- Published
- 2023
- Full Text
- View/download PDF
44. Double-Exponential-X Family of Distributions: Properties and Applications
- Author
-
Kehinde Adekunle Bashiru, Taiwo Adetola Ojurongbe, Lawal Sola, Nureni .Olawale Adeboye, and Habeeb Abiodun Afolabi
- Subjects
Hazard Function ,stochastic Dominance ,Double Exponential Distribution ,Molecular Simulation ,Science - Abstract
A new family of distributions named the Double-Exponential-X family is proposed, generated from the double exponential distribution. The forms of the probability densities and hazard functions of two distinct subfamilies of the proposed family are examined and reported. General properties of the models, such as moments, survival, order statistics, probability weighted moments, and quantile functions, are investigated. A sub-family of the proposed family, known as the Double-Exponential-Pareto distribution, was used to fit real-life data on the use of antiretroviral drugs. Molecular simulation of the efficacy of antiretroviral drugs was conducted to evaluate the performance of the model. The models were assessed using several diagnostic tests, which revealed that the proposed model was better than earlier models from the same family; the stochastic dominance method was also used to identify the best antiretroviral drug used in the study.
- Published
- 2023
- Full Text
- View/download PDF
45. A CRITICAL ANALYSIS OF PROPOSED POWER LOMAX GEOMETRIC DISTRIBUTION.
- Author
-
Saqib, Muhammad, Memon, Ahmed Zogo, and Chand, Sohail
- Subjects
- *
GEOMETRIC distribution , *CRITICAL analysis , *UNCERTAINTY (Information theory) , *PARAMETER estimation , *PROBABILITY theory - Abstract
The probability model 'Power Lomax Geometric Distribution' proposed in this paper is derived from the Power Lomax distribution through compounding with the Geometric distribution. We present its various theoretical properties, including the hazard function, mean residual life, and entropy. The family of this distribution exhibits heterogeneous characteristics and is therefore classified into three distinguishable subfamilies. Each subfamily is unique in its importance with respect to theoretical and practical applications. The ML method is used for the estimation of the four parameters of this distribution. To assess its adequacy, we consider four real-life data sets, comparing it with other fitted models in the literature. The model is flexible in its applicability to a large number of different situations. [ABSTRACT FROM AUTHOR]
- Published
- 2023
46. Regression Modeling for Competing Risk Analysis with Leukemia at Nanakali Hospital/Erbil.
- Author
-
Majeed, Muhamad Fareed and Saleh, Samira Muhammad
- Subjects
REGRESSION analysis ,COMPETING risks ,LEUKEMIA ,RISK assessment ,AKAIKE information criterion - Abstract
Copyright of Al-Anbar University Journal of Economic & Administration Sciences is the property of Republic of Iraq Ministry of Higher Education & Scientific Research (MOHESR) and its content may not be copied or emailed to multiple sites or posted to a listserv without the copyright holder's express written permission. However, users may print, download, or email articles for individual use. This abstract may be abridged. No warranty is given about the accuracy of the copy. Users should refer to the original published version of the material for the full abstract. (Copyright applies to all Abstracts.)
- Published
- 2023
47. A New Generalized Gamma-Weibull Distribution and Its Applications.
- Author
-
Aleshinloye, Nihimat I., Aderoju, Samuel A., Abiodun, Alfred A., and Taiwo, Bako L.
- Subjects
GAMMA rays ,GAMMA (Electronic computer system) ,MATHEMATICAL models ,HAZARD function (Statistics) ,DATA analysis - Abstract
In this paper, a New Generalized Gamma-Weibull (NGGW) distribution is developed by compounding the Weibull and generalized gamma distributions. Some mathematical properties such as moments, Rényi entropy and order statistics are derived and discussed. The maximum likelihood estimation (MLE) method is used to estimate the model parameters. The proposed model is applied to two real-life datasets to illustrate its performance and flexibility as compared to some other competing distributions. The results obtained show that the new distribution fits each of the data better than the other competing distributions. [ABSTRACT FROM AUTHOR]
- Published
- 2023
- Full Text
- View/download PDF
48. Surveillance Strategy after Curative Resection for Oesophageal Squamous Cell Cancer Using the Hazard Function
- Author
-
Kyohei Kanematsu, Yozo Kudose, Daichi Utsunomiya, Kentaro Kubo, Yusuke Fujii, Daisuke Kurita, Koshiro Ishiyama, Junya Oguma, and Hiroyuki Daiko
- Subjects
Oesophageal squamous cell cancer ,Hazard rates ,Hazard function ,Recurrence ,surveillance ,Neoplasms. Tumors. Oncology. Including cancer and carcinogens ,RC254-282 - Abstract
Abstract Background The optimal surveillance period and frequency after curative resection for oesophageal squamous cell carcinoma (OSCC) remain unclear, and current guidelines are mainly based on traditional Kaplan–Meier analyses of cumulative incidence rather than risk analysis. The aim of this study was to determine a suitable follow-up surveillance program following oesophagectomy for OSCC using the hazard function. Methods A total of 1187 patients who underwent curative resection for OSCC between 2000 and 2014 were retrospectively analyzed. The changes in the estimated hazard rates (HRs) of recurrence over time were analyzed according to tumour-node-metastasis stage. Results Four hundred seventy-eight (40.2%) patients experienced recurrence during the follow-up period (median, 116.5 months). The risk of recurrence peaked at 9.2 months after treatment (HR = 0.0219) and then decreased to half the peak value at 24 months post-surgery. The HRs for Stage I and II patients were low (
- Published
- 2022
- Full Text
- View/download PDF
49. On Maxwell–Lomax distribution: properties and applications
- Author
-
Alfred Adewole Abiodun and Aliyu Ismail Ishaq
- Subjects
Hazard function ,Maxwell–Lomax ,moments ,order statistics ,quantile function ,Science - Abstract
The development of new generalizations based on a certain baseline probability distribution has become one of the current trends in the distribution theory literature. New generators are often required to define wider distributions for modelling real-life data. In this study, we proposed and studied a new generalization of the Maxwell and Lomax distributions using the T-X method. Several structural and statistical properties of the proposed distribution were obtained, such as moments, the quantile function, survival and hazard functions, skewness, kurtosis, and order statistics. The method of maximum likelihood estimation (MLE) was used to estimate the parameters of the proposed distribution. In addition, a simulation study was conducted to evaluate the performance of the MLE method. The proposed distribution was applied to two real-life datasets to illustrate its flexibility. It was found that the proposed distribution offered a better fit than the other competing extensions of the Lomax distribution considered in the study.
- Published
- 2022
- Full Text
- View/download PDF
50. The Beta Exponential Power Series Distribution
- Author
-
Khojastehbakht, Nafiseh, Ghatari, Amirhossein, and Samani, Ehsan Bahrami
- Published
- 2023
- Full Text
- View/download PDF