48,395 results for "STATISTICAL MODEL"
Search Results
2. Prediction of Native Liver Survival in Patients With Biliary Atresia 20 Years After the Kasai Procedure
- Published
- 2025
- Full Text
- View/download PDF
3. Comprehensive evaluation of seasonal forecasts from NMME and statistical models over the Blue Nile Basin and the Grand Ethiopian Renaissance Dam (GERD) watershed
- Author
-
Gebremichael, Mekonnen, Tien, Yu-Chuan, and Nourani, Vahid
- Published
- 2025
- Full Text
- View/download PDF
4. The theoretical freezing model of sandstone considering the statistical arrangement of pore structure
- Author
-
Zhang, Hui, Yang, Yugui, Cai, Chengzheng, Hou, Shanshan, and Li, Chenxiang
- Published
- 2025
- Full Text
- View/download PDF
5. Optimized the performance of conductive mortar with hybrid fiber and steel-slag via RSM and MOPSO
- Author
-
Zha, Wenhua, Lv, Wenfang, Li, Jielian, Xu, Tao, Yang, Ke, Hua, Xinzhu, and Chen, Denghong
- Published
- 2025
- Full Text
- View/download PDF
6. Widespread presence of metallic compounds and organic contaminants across Pacific coral reef fish
- Author
-
Wejieme, Noreen, Vigliola, Laurent, Parravicini, Valeriano, Sellanes, Javier, Wafo, Emmanuel, Zapata-Hernandez, German, Bustamante, Paco, and Letourneur, Yves
- Published
- 2025
- Full Text
- View/download PDF
7. Identification of the flow structure of dense phase in a gas-solid fluidized bed reactor in bubbling fluidization regime with Geldart B + A particles
- Author
-
Chai, Xuesen, Wang, Anyu, Fu, Zhijie, Duan, Chenlong, and Bi, Xiaotao
- Published
- 2025
- Full Text
- View/download PDF
8. Entangled hidden elephant random walk model
- Author
-
Souissi, Abdessatar, Mukhamedov, Farrukh, Soueidi, El Gheteb, Rhaima, Mohamed, and Mukhamedova, Farzona
- Published
- 2024
- Full Text
- View/download PDF
9. Theoretical framework and inference for fitting extreme data through the modified Weibull distribution in a first-failure censored progressive approach
- Author
-
Eliwa, Mohamed S., Al-Essa, Laila A., Abou-Senna, Amr M., El-Morshedy, Mahmoud, and EL-Sagheer, Rashad M.
- Published
- 2024
- Full Text
- View/download PDF
10. A novel non-parametric statistical method in reliability theory: Mathematical characterization and analysis of asymmetric data in the fields of biological sciences and engineering
- Author
-
Al-Essa, Laila A., Etman, Walid B.H., Eliwa, Mohamed S., El-Morshedy, Mahmoud, and EL-Sagheer, Rashad M.
- Published
- 2024
- Full Text
- View/download PDF
11. A statistical framework for a new Kavya-Manoharan Bilal distribution using ranked set sampling and simple random sampling
- Author
-
Shafiq, Anum, Sindhu, Tabassum Naz, Riaz, Muhammad Bilal, Hassan, Marwa K.H., and Abushal, Tahani A.
- Published
- 2024
- Full Text
- View/download PDF
12. Unit upper truncated Weibull distribution with extension to 0 and 1 inflated model – Theory and applications
- Author
-
Okorie, Idika E., Afuecheta, Emmanuel, and Bakouch, Hassan S.
- Published
- 2023
- Full Text
- View/download PDF
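The "unit upper truncated Weibull" in the title above restricts a Weibull law to the unit interval. A minimal sketch of one plausible construction — renormalizing the standard Weibull CDF by its value at 1 — is shown below; the function names and parameterization are illustrative and may differ from the paper's.

```python
from math import exp

def weibull_cdf(x, k, lam):
    """Two-parameter Weibull CDF for x >= 0, shape k, scale lam."""
    return 1.0 - exp(-((x / lam) ** k))

def unit_truncated_cdf(x, k, lam):
    """CDF of a Weibull upper-truncated to (0, 1]: F(x) / F(1).
    One plausible construction; the paper's exact parameterization may differ."""
    if not 0.0 < x <= 1.0:
        raise ValueError("x must lie in (0, 1]")
    return weibull_cdf(x, k, lam) / weibull_cdf(1.0, k, lam)
```

Dividing by F(1) simply rescales the probability mass so the truncated CDF reaches 1 at the upper endpoint.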
13. A statistical analysis of antigenic similarity among influenza A (H3N2) viruses
- Author
-
Adabor, Emmanuel S.
- Published
- 2021
- Full Text
- View/download PDF
14. Exploration of the adsorption capability by doping Pb@ZnFe2O4 nanocomposites (NCs) for decontamination of dye from textile wastewater
- Author
-
Jethave, Ganesh, Fegade, Umesh, Attarde, Sanjay, Ingle, Sopan, Ghaedi, Mehrorang, and Sabzehmeidani, Mohammad Mehdi
- Published
- 2019
- Full Text
- View/download PDF
15. Tidal Flooding Contributes to Eutrophication: Constraining Nonpoint Source Inputs to an Urban Estuary Using a Data-Driven Statistical Model.
- Author
-
Macías-Tapia, Alfonso, Mulholland, Margaret R., Selden, Corday R., Clayton, Sophie, Bernhardt, Peter W., and Allen, Thomas R.
- Abstract
In coastal urban areas, tidal flooding brings water carrying nutrients and particles back from land to estuarine and coastal waters. A statistical model to predict nutrient loads during tidal flooding events can help estimate nutrient loading from previous and future flooding events and adapt nutrient reduction strategies. We measured concentrations of dissolved inorganic nitrogen and phosphorus in floodwater at seven sentinel sites during 15 tidal flooding events from January 2019 to September 2020. The study area was the Lafayette River watershed in Norfolk, VA, USA, which is prone to tidal flooding and is predicted to experience more frequent and intense flooding in the future. We calculated the difference in dissolved inorganic nitrogen (ΔDIN) or phosphorus (ΔDIP) concentrations between floodwater and those measured in the estuary prior to tidal flooding for each sentinel site and flooding event. We calculated the correlations between ΔDIN or ΔDIP and corresponding data on precipitation, wind, flooding intensity, average estuarine nutrient concentrations, population density, income, land elevation, land use, and land coverage. Using the variables with the highest R2 values for the linear regression with either ΔDIN or ΔDIP, we built multi-variable random forest regression models. ΔDIN showed the strongest correlations with floodwater nutrient concentrations, water level, and water temperature. ΔDIP also correlated strongly with floodwater nutrient concentrations and water temperature, as well as with wind speed. Models indicated that inputs per flooding event ranged from −5000 to 7500 kg N for DIN, while those for DIP ranged from 2000 to 23,000 kg P, with net inputs of >5000 kg N and >100,000 kg P, respectively. By removing the floodwater dissolved nutrient concentration variables from the models, we were able to calculate loads from events dating back to 1946. The predicted DIN load per single flooding event ranged from ~0 to 1.5 × 10⁵ kg N and showed a significant linear regression with time. Predicted DIP load estimates per single flooding event ranged from >−1.0 × 10⁵ to <1.5 × 10⁵ kg P, with a significant positive trend over time. The positive trend in these load values over time shows that these inputs have been, and will continue to be, a growing problem for the water quality of local water systems. These results indicate that further action should be taken to control the input of dissolved nutrients during tidal flooding events in urban coastal areas. [ABSTRACT FROM AUTHOR]
- Published
- 2025
- Full Text
- View/download PDF
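The screening step the abstract above describes — correlating ΔDIN (floodwater minus pre-flood estuary concentration) with candidate predictors before building random forest models — can be sketched in plain Python. The data values and variable names below are hypothetical.

```python
from math import sqrt

def pearson_r(xs, ys):
    """Sample Pearson correlation coefficient."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sqrt(sum((x - mx) ** 2 for x in xs))
    sy = sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical paired measurements for one sentinel site (mg N/L, deg C)
floodwater_din = [0.42, 0.55, 0.31, 0.60, 0.48]
estuary_din    = [0.20, 0.25, 0.22, 0.24, 0.21]
water_temp     = [12.0, 18.5, 9.0, 21.0, 15.5]

# Difference between floodwater and pre-flood estuary concentrations
delta_din = [f - e for f, e in zip(floodwater_din, estuary_din)]
r = pearson_r(delta_din, water_temp)
```

Predictors whose r (or R2 = r²) against ΔDIN or ΔDIP is highest would then be carried into the multi-variable model, as the study does.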
16. Strength and Durability of Superplasticizer Concrete Based on Different Component Parameters: An Experimental and Statistical Study.
- Author
-
Tugrul Tunc, Esra
- Subjects
- *
CONCRETE durability , *CONCRETE additives , *CONCRETE mixing , *CIVIL engineering , *ABRASION resistance - Abstract
Concrete, which forms the skeleton of buildings, is the most important building material for ensuring a building's continued durability and its survival of a possible earthquake. Concrete durability is directly related to its constituent materials. This study investigated how the type of concrete aggregate and of chemical admixture changes the strength of concrete. The research question of this study is: what are the place and importance of aggregate and chemical admixture in increasing concrete strength? Recent earthquakes, especially in Turkey, have shown that most of the buildings that collapsed had poor-quality concrete. The aim of this study is to determine concrete mix designs for producing superplasticizer concrete with the desired strength depending on the tested parameters. The effect of the parameters that make up the tested concrete content on concrete strength was investigated both experimentally and statistically. The parameters in the concrete content were the water–cement ratio, aggregate type, Los Angeles abrasion resistance of the aggregates, aggregate–cement ratio, and a new-generation polycarboxylate-supported superplasticizer chemical admixture. Statistical analysis of the experimental findings was carried out with SPSS. Measured and predicted values were in very good agreement, and equations with a coefficient of determination of R2 > 0.96 were derived. The developed statistical method was found to be unique and highly accurate. The aim is thus to provide safe, economical, practical and time-saving pre-mix designs. [ABSTRACT FROM AUTHOR]
- Published
- 2025
- Full Text
- View/download PDF
17. Evaluating the efficiency of environmentally friendly magnetic photocatalyst for the treatment of industrial effluents containing dye pollutants.
- Author
-
Abbasi, Sedigheh
- Subjects
PHOTOCATALYSTS ,ORGANIC dyes ,STATISTICAL models ,INDUSTRIAL wastes ,POLLUTANTS ,NANOCOMPOSITE materials ,SUSTAINABILITY - Abstract
In this research, nanocomposites based on two-dimensional nanostructures such as graphene oxide were used as photocatalysts to remove methyl orange. Among the most important advantages of this type of photocatalyst are its high efficiency in decomposing colored organic pollutants and its ease of separation at the end of the process. For this purpose, a titania semiconductor was synthesized as photocatalytic nanoparticles by a hydrolysis method on the surface of graphene oxide. The dependence of the photocatalytic activity of the nanocomposite, and of the hybrid without graphene oxide, on operating conditions including irradiation time and photocatalyst concentration shows the positive effect of these two factors on the removal efficiency of methyl orange. On increasing the irradiation time from 5 to 40 min and the photocatalyst concentration from 0.05 to 0.2 wt%, the photocatalytic activity increases significantly, while with an increase in the pH of the suspension from 3 to 11, the removal efficiency first decreases and then increases. The statistical findings indicate that the independent single factors and their binary and triple interactions have a significant effect on the response at the 5% significance level. Accordingly, the statistical models proposed for the nanocomposite and the hybrid are able to estimate the dependence of removal efficiency on the significant factors with an accuracy of 99%. The Box–Cox curve also confirms the adequacy of the statistical models without any transformation function. [ABSTRACT FROM AUTHOR]
- Published
- 2025
- Full Text
- View/download PDF
18. An examination of daily CO2 emissions prediction through a comparative analysis of machine learning, deep learning, and statistical models.
- Author
-
Ajala, Adewole Adetoro, Adeoye, Oluwatosin Lawrence, Salami, Olawale Moshood, and Jimoh, Ayoola Yusuf
- Subjects
ARTIFICIAL neural networks ,MACHINE learning ,LONG short-term memory ,STATISTICAL models ,SUPPORT vector machines ,DEEP learning ,RECURRENT neural networks - Abstract
Human-induced global warming, primarily attributed to the rise in atmospheric CO2, poses a substantial risk to the survival of humanity. While most research focuses on predicting annual CO2 emissions, which are crucial for setting long-term emission mitigation targets, the precise prediction of daily CO2 emissions is equally vital for setting short-term targets. This study examines the performance of 14 models in predicting daily CO2 emissions data from 1/1/2022 to 30/9/2023 across the top four polluting regions (China, India, the USA, and the EU27&UK). The 14 models used in the study include four statistical models (ARMA, ARIMA, SARMA, and SARIMA), three machine learning models (support vector machine (SVM), random forest (RF), and gradient boosting (GB)), and seven deep learning models (artificial neural network (ANN), recurrent neural network variations such as gated recurrent unit (GRU), long short-term memory (LSTM), bidirectional LSTM (BILSTM), and three hybrid CNN-RNN combinations). Performance evaluation employs four metrics (R2, MAE, RMSE, and MAPE). The results show that the machine learning (ML) and deep learning (DL) models, with higher R2 (0.714–0.932) and lower RMSE (0.480–0.247) values, respectively, outperformed the statistical models, which had R2 (−0.060–0.719) and RMSE (1.695–0.537) values, in predicting daily CO2 emissions across all four regions. The performance of the ML and DL models was further enhanced by differencing, a technique that improves accuracy by ensuring stationarity and creating additional features and patterns from which the model can learn. Additionally, applying ensemble techniques such as bagging and voting improved the performance of the ML models by approximately 9.6%, whereas hybrid CNN-RNN combinations enhanced the performance of the RNN models. In summary, the performance of the ML and DL models was relatively similar. However, due to the high computational requirements of DL models, the recommended models for daily CO2 emission prediction are ML models using the ensemble techniques of voting and bagging. These models can assist in accurately forecasting daily emissions, aiding authorities in setting targets for CO2 emission reduction. [ABSTRACT FROM AUTHOR]
- Published
- 2025
- Full Text
- View/download PDF
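The differencing step the abstract above credits with improving ML/DL accuracy can be sketched as first-order differencing and its inverse; the emissions series below is hypothetical.

```python
def difference(series):
    """First-order differencing: y[t] = x[t] - x[t-1]."""
    return [b - a for a, b in zip(series, series[1:])]

def undifference(first_value, diffs):
    """Invert differencing by cumulative summation from the first value."""
    out = [first_value]
    for d in diffs:
        out.append(out[-1] + d)
    return out

# Hypothetical daily CO2 emissions (Mt) with an upward trend;
# differencing removes the trend, leaving a (roughly) stationary series
emissions = [30.1, 30.4, 30.9, 31.1, 31.8, 32.0]
diffs = difference(emissions)
restored = undifference(emissions[0], diffs)
```

A model is trained on the differenced series, and its forecasts are cumulatively summed back onto the last observed level to recover predictions in the original units.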
19. A Flexible Discrete Probability Model for Partly Cloudy Days.
- Author
-
HUSSAIN, TASSADDAQ, BAKOUCH, HASSAN S., REHMAN, ZAHID UR, SHAKIL, MOHAMMAD, QINGSONG SHAN, and QIANNING LIU
- Subjects
- *
LAPLACE transformation , *ASYMPTOTIC distribution , *STATISTICAL models , *DATA analysis , *PROBABILITY theory - Abstract
In this article, a discrete-valued probability model is proposed, and its mathematical properties and formulation are studied under the nabla structure, including the discrete Laplace transformation, moments, a recurrence relation between moments, the index of dispersion, and the asymptotic distribution of extremes. Furthermore, application of the model to partly cloudy days is discussed. Moreover, the model's compatibility is checked with the chi-square, Anderson-Darling, Cramér-von Mises, information criterion and Vuong statistics, and the proposed model is found to be the best strategy for the analysis of such data. [ABSTRACT FROM AUTHOR]
- Published
- 2025
- Full Text
- View/download PDF
20. Brittleness evaluation and damage evolution of sandstone under hydromechanical coupling.
- Author
-
Zhang, Kuan, Wang, Wei, Cao, Yajun, Liu, Shifan, and Duan, Xuelei
- Subjects
- *
HYDRAULIC couplings , *FAILURE mode & effects analysis , *THRESHOLD energy , *STRAINS & stresses (Mechanics) , *COUPLINGS (Gearing) , *BRITTLENESS , *ROCK deformation - Abstract
Investigating the brittleness characteristics and damage evolution of deep rock masses under hydromechanical coupling is of great significance. The variations in the mechanical properties and brittleness characteristics of sandstone under different confining pressures and pore pressures were studied. Based on stress-threshold evolution and energy-conversion analysis of the full stress-strain behavior of the rock, new brittleness evaluation indexes were proposed that effectively describe the brittle failure mode of the rock, and their reliability and applicability were verified. Additionally, from the perspective of rock pore micro-elements and the growth of matrix particle defects, strain statistical damage theory was introduced to establish a statistical damage evolution model for rock capable of accounting for the influence of pore pressure, thereby effectively capturing the nonlinear softening and hardening of porous rocks under hydromechanical coupling conditions. The correlation between rock brittleness and the softening and hardening characteristics of the rock was expressed by constructing a new brittleness evaluation index derived from the relationship between rock damage parameters and brittleness characteristics. Finally, based on the proposed nonlinear expression and the statistical damage evolution model, the development trend of sandstone lateral strain is predicted well. The theoretical validation shows good consistency with the experimental data and illustrates the rationality of the model. [ABSTRACT FROM AUTHOR]
- Published
- 2025
- Full Text
- View/download PDF
21. Effective Factors in the Number of Children Among Iranian Women: An Application of Poisson Regression Model.
- Author
-
Ghorbani, Raheb, Gharibi, Maryam, Ansari-Nia, Fayeze, Ghorbani, Narges, Safari, Habib-Allah, Kahouei, Mehdi, and Soltani-Kermanshahi, Mojtaba
- Subjects
FAMILY planning ,CROSS-sectional method ,MARRIAGE ,RESEARCH funding ,CULTURE ,SOCIOECONOMIC factors ,INTERVIEWING ,QUESTIONNAIRES ,PSYCHOLOGY of women ,DESCRIPTIVE statistics ,FAMILIES ,CONTENT mining ,DATA analysis software ,REGRESSION analysis - Abstract
Background: The number of children born into a family has a significant impact on a mother's reproductive system's physiological health and overall well-being. Objectives: This study aimed to explore the effective factors in the number of living children (NLC) among married women in Semnan, Iran, with a particular focus on social, cultural, and economic influences. Methods: This cross-sectional study examined the reproductive histories of 600 married women aged 15 to 49 years from Semnan, Iran. Sample size calculations were based on the principal variable, the expected number of children, using PASS software. The women were selected through a multistage random sampling method from health center lists in 2018 (April-October). The data were collected through interview questionnaires and analyzed using the Quasi-Poisson model. Results: The mean (standard deviation [SD]) age at first marriage for mothers (MAM) and fathers was estimated to be 21.02 (4.80) and 25.10 (4.70) years, respectively. Additionally, the mean (SD) values for the number of expected children (NCEX), living children, and the number of pregnancies were 2.19 (0.96), 1.85 (0.81), and 2.15 (1.03), respectively. Among the variables of interest, only NCEX, the number of pregnancies (with a positive effect), MAM, father's education, multiple births, and the desire to have children (with a negative effect) significantly influenced NLC. Conclusions: This study recommends implementing an educational program to promote an optimal and ideal family size to prevent adverse pregnancy outcomes. The age of women at marriage is a significant factor, as an increase in women's age leads to shorter pregnancy intervals, exposing women to complications, such as premature birth, perinatal death, and intrauterine growth restriction. Policymakers should, therefore, encourage early marriages by fostering a culture and providing supportive facilities. [ABSTRACT FROM AUTHOR]
- Published
- 2025
- Full Text
- View/download PDF
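The count-data modeling in the study above (a quasi-Poisson fit for the number of living children) can be illustrated at its simplest: for an intercept-only Poisson model, the maximum-likelihood rate is the sample mean, and a variance-to-mean ratio far from 1 is what motivates the quasi-Poisson correction. The counts below are hypothetical.

```python
from math import exp, factorial, log

def poisson_pmf(k, lam):
    """P(X = k) for a Poisson(lam) count."""
    return lam ** k * exp(-lam) / factorial(k)

def log_likelihood(counts, lam):
    """Poisson log-likelihood of the observed counts at rate lam."""
    return sum(log(poisson_pmf(k, lam)) for k in counts)

# Hypothetical numbers of living children for ten women
counts = [1, 2, 2, 3, 1, 2, 0, 2, 3, 2]
lam_hat = sum(counts) / len(counts)   # MLE of the rate in an intercept-only model

# Dispersion check: a var/mean ratio well below or above 1
# is what motivates a quasi-Poisson fit
var = sum((k - lam_hat) ** 2 for k in counts) / (len(counts) - 1)
dispersion = var / lam_hat
```

With covariates, the rate becomes λ = exp(Xβ) and the same likelihood is maximized over β; the quasi-Poisson variant keeps that mean structure but scales the standard errors by the estimated dispersion.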
22. Evaluating the efficiency of environmentally friendly magnetic photocatalyst for the treatment of industrial effluents containing dye pollutants
- Author
-
Sedigheh Abbasi
- Subjects
Two-dimensional nanostructures ,Nanocomposite ,Photocatalytic activity ,Statistical model ,Box–Cox ,Water supply for domestic and industrial purposes ,TD201-500 - Abstract
Abstract In this research, nanocomposites based on two-dimensional nanostructures such as graphene oxide were used as photocatalysts to remove methyl orange. One of the most important advantages of this type of photocatalyst is its high efficiency in the decomposition of colored organic pollutants, as well as its ease of separation at the end of the process. For this purpose, titania semiconductor is synthesized as photocatalytic nanoparticles by hydrolysis method on the surface of graphene oxide. The dependence of photocatalytic activity of nanocomposite and hybrid without graphene oxide on operating conditions including irradiation time and photocatalyst concentration shows the positive effect of these two factors on the removal efficiency of methyl orange. By increasing the irradiation time from 5 to 40 min and also changing the photocatalyst concentration from 0.05 to 0.2%wt, the photocatalytic activity increases significantly, while with the increase in the pH of the suspension from 3 to 11, the removal efficiency initially decreases and then increases. The statistical findings indicate that independent single factors, their binary and triple interactions have a significant effect on the response at the 5% confidence level. Therefore, the statistical models proposed by nanocomposite and hybrid are able to estimate the dependence of removal efficiency on significant factors with an accuracy of 99%. Also, the Box–Cox curve confirms the adequacy of the statistical models without any transformation function.
- Published
- 2025
- Full Text
- View/download PDF
23. Weather-based rice yield prediction in Kerala using ANN, SMLR and normal regression
- Author
-
Davis, P Lincy, Ajithkumar, B, Riya, K R, Vysakh, Arjun, and Babu, Kavya
- Published
- 2024
- Full Text
- View/download PDF
24. Systematics of Nuclear Dissipation Around A = 200 Region.
- Author
-
Rai, N. K., Behera, B. R., and Sadhukhan, Jhilam
- Abstract
We have studied the effect of nuclear dissipation on fusion-fission dynamics by performing a statistical model analysis of the available neutron multiplicity (νpre) data for the even-even isotopes of Pb (Z = 82) in the mass range 192 ≤ A ≤ 204, where strong shell effects are expected. A comparative study with the neutron shell closure at neutron number N = 126 has also been carried out. Our statistical model calculation includes finer corrections such as shell effects, collective enhancement in the level density (CELD), and modification of the fission decay widths due to the orientation of the compound nuclear spin. The reduced dissipation strength β is used as a tunable parameter to reproduce the experimental data and thereby understand the behavior of nuclear dissipation. In particular, the influence of various properties of the target-projectile combination, such as the fissility parameter and the N/Z of the compound system, is investigated to extract a systematic trend in the nuclear dissipation strength. The nuclear dissipation increases with increasing N/Z and decreases with increasing fissility for nuclei with the proton magic number Z = 82 and the neutron magic number N = 126. Nuclear dissipation also shows a strong dependence on the excitation energy. The higher values of β in the energy range 50–60 MeV indicate a strong dissipation effect due to the dominance of the shell effect, and clear systematics of the nuclear dissipation have not been observed in this energy range. We have also studied the role of different forms of the level density parameter in the nuclear dissipation; a higher value of the dissipation strength β is obtained when all the effects, such as CELD and the shell correction in the level density, are included. [ABSTRACT FROM AUTHOR]
- Published
- 2025
- Full Text
- View/download PDF
25. Uncovering the impact of outliers on clusters’ evolution in temporal data-sets: an empirical analysis
- Author
-
Muhammad Atif, Muhammad Farooq, Muhammad Shafiq, Tmader Alballa, Somayah Abdualziz Alhabeeb, and Hamide Abd El-Wahed Khalifa
- Subjects
Streaming data ,Clustering ,Outliers ,Change detection ,Statistical model ,Stochastic systems ,Medicine ,Science - Abstract
Abstract This study investigates the impact of outliers on the evolution of clusters in temporal data-sets. Monitoring and tracing cluster transitions of temporal data sets allow us to observe how clusters evolve and change over time. By tracking the movement of data points between clusters, we can gain insights into the underlying patterns, trends, and dynamics of the data. This understanding is essential for making informed decisions and drawing meaningful conclusions from the clustering results. Cluster evolution refers to the changes that occur in the clustering results over time due to the arrival of new data points. The changes in cluster solutions are classified as external and internal transitions. The study employs the survival ratio and history cost function to investigate the effects of outliers on changes experienced by the clusters at successive time points. The results demonstrate that outliers have a significant impact on cluster evolution, and appropriate outlier handling techniques are necessary to obtain reliable clustering results. The findings of this study provide useful insights for practitioners and researchers in the field of stream clustering and can help guide the development of more robust and accurate stream clustering algorithms.
- Published
- 2024
- Full Text
- View/download PDF
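The survival ratio the abstract above uses to trace cluster transitions can be sketched as follows. This is one plausible definition (the largest fraction of a cluster's members that stay together in any cluster at the next time point), not necessarily the paper's exact formula, and the point IDs are hypothetical.

```python
def survival_ratio(cluster_t, clusters_t1):
    """Largest fraction of cluster_t's members that remain together
    in any single cluster at time t+1 (one plausible survival measure;
    a sketch, not necessarily the paper's exact definition)."""
    if not cluster_t:
        return 0.0
    best_overlap = max((len(cluster_t & c) for c in clusters_t1), default=0)
    return best_overlap / len(cluster_t)

# Hypothetical point IDs clustered at two successive time points;
# point 4 (perhaps an outlier) has broken away, point 9 has joined
c_old = {1, 2, 3, 4}
new_partition = [{1, 2, 3, 9}, {4, 7, 8}]
ratio = survival_ratio(c_old, new_partition)
```

A ratio near 1 indicates the cluster survived the arrival of new points largely intact; a sharp drop after injecting outliers is exactly the kind of effect the study quantifies.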
26. Characteristics of the deformation process in the subduction zone of the Kuril-Kamchatka island arc in the aftershock phase, based on a fractional model of deformation activity
- Author
-
Sheremeteva, O.V. and Shevtsov, B.M.
- Subjects
aftershocks ,approximation ,fractional Poisson process ,Mittag-Leffler function ,hereditarity ,non-stationarity ,statistical model ,fractional model ,Science
The article presents the results of calculating the parameter values that determine the properties of the deformation process, based on data from the earthquake catalog of the Kamchatka Branch of the Federal Research Center "Unified Geophysical Survey of the Russian Academy of Sciences" (KB FRC UGS RAS) for the period 01.01.1962–31.12.2002, for the subduction zone of the Kuril-Kamchatka island arc in the aftershock phase, within the fractional model of the deformation process previously presented by the authors. A compound power-law Poisson process in a fractional time representation is considered as the model. Aftershocks associated with a main event of a given energy are identified on the basis of energy, spatial, and temporal criteria. To construct the empirical distribution law of aftershocks of a fixed class as a function of time relative to the main event, the method of superposition of "epochs" is applied to the aftershock sequences of all main events of a given energy in the catalog. The empirical cumulative distribution laws of aftershock waiting times are approximated by the Mittag-Leffler function on the basis of the fractional model of the deformation process developed by the authors. The calculated values of the Mittag-Leffler function parameters showed that the deformation process in the zone under consideration exhibits non-stationarity and hereditarity in the aftershock phase for main events of classes K < 12.5. As the class of the main shock increases, the process can be considered a non-stationary standard Poisson process.
- Published
- 2024
- Full Text
- View/download PDF
27. Modeling heterogeneity of Sudanese hospital stay in neonatal and maternal unit: non-parametric random effect models with Gamma distribution
- Author
-
Amani Almohaimeed and Ishag Adam
- Subjects
non–parametric regression ,Gamma distribution ,Statistical model ,hospitalization length ,Computer applications to medicine. Medical informatics ,R858-859.7 ,Analysis ,QA299.6-433 - Abstract
Abstract Objective Studies looking into patient and institutional variables linked to extended hospital stays have arisen as a result of the increased focus on severe maternal morbidity and mortality. Understanding the length of hospitalization of patients after delivery is important for gaining insight into when hospitals will reach capacity and for predicting corresponding staffing or equipment requirements. In Sudan, the distribution of the length of stay during delivery hospitalizations is heavily skewed, with an average length of stay of 2 to 3 days. This study aimed to investigate the use of a non-parametric random effect model with a Gamma-distributed response for analyzing skewed hospital length-of-stay data in neonatal and maternal units in Sudan. Methods We applied Gamma regression models with unknown random effects, estimated using the non-parametric maximum likelihood (NPML) technique [5]. The NPML reduces the heterogeneity in the distribution of the response and produces robust estimates, since it does not require any assumptions about the distribution. The same applies to the log-Gamma link, which requires no transformation of the data distribution and can handle outliers in the data points. In this study, the models are fitted with and without covariates and compared using AIC and BIC values. Results The findings imply that, in the context of health care database investigations, Gamma regression models with a non-parametric random effect consistently reduce heterogeneity and improve model accuracy. The generalized linear model with covariates and a random effect (k = 4) had the best fit, indicating that Sudanese hospital length-of-stay data could be classified into four groups with varying average stays influenced by maternal, neonatal, and obstetric data. Conclusion Identifying factors contributing to longer stays allows hospitals to implement strategies for improvement. A non-parametric random effect model with a Gamma-distributed response effectively accounts for unobserved heterogeneity and individual-level variability, leading to more accurate inferences and improved patient care. Including random effects can significantly affect variable significance in statistical models, emphasizing the need to consider unobserved heterogeneity when analyzing data containing potential individual-level variability. The findings emphasise the importance of making robust methodological choices in healthcare research in order to inform accurate policy decisions.
- Published
- 2024
- Full Text
- View/download PDF
28. Improvement of Statistical Models by Considering Correlations among Parameters: Local Anesthetic Agent Simulator for Pharmacological Education
- Author
-
Toshiaki Ara and Hiroyuki Kitamura
- Subjects
local anesthetics ,statistical model ,Monte Carlo simulation ,correlation coefficient ,Neurosciences. Biological psychiatry. Neuropsychiatry ,RC321-571 ,Computer applications to medicine. Medical informatics ,R858-859.7 - Abstract
Background: To elucidate the effects of local anesthetic agents (LAs), guinea pigs are used in pharmacological education. Herein, we aimed to develop a simulator for LAs. Previously, we developed a statistical model to simulate the LAs’ effects, and we estimated their parameters (mean [μ] and logarithm of standard deviation [logσ]) based on the results of animal experiments. The results of the Monte Carlo simulation were similar to those from the animal experiments. However, the drug parameter values widely varied among individuals, because this simulation did not consider correlations among parameters. Method: In this study, we set the correlations among these parameters, and we performed simulations using Monte Carlo simulation. Results: Weakly negative correlations were observed between μ and logσ (rμ−logσ). In contrast, weakly positive correlations were observed among μ (rμ) and among logσ (rlogσ). In the Monte Carlo simulation, the variability in duration was significant for small rμ−logσ values, and the correlation for the duration between two drugs was significant for large rμ and rlogσ values. When parameters were generated considering the correlation among the parameters, the correlation of the duration among the drugs became larger. Conclusions: These results suggest that parameter generation considering the correlation among parameters is important to reproduce the results of animal experiments in simulations.
- Published
- 2024
- Full Text
- View/download PDF
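The key point of the simulator study above — drawing simulation parameters with the observed correlations rather than independently — can be sketched for a single pair of standard normal draws with a target correlation ρ, using the usual construction z2 = ρ·z1 + √(1−ρ²)·e. The ρ value below is illustrative (e.g. a weakly negative μ-vs-logσ correlation, as the study reports).

```python
import random
from math import sqrt

def correlated_pairs(n, rho, seed=0):
    """Draw n standard-normal pairs with target correlation rho."""
    rng = random.Random(seed)
    pairs = []
    for _ in range(n):
        z1 = rng.gauss(0, 1)
        e = rng.gauss(0, 1)
        z2 = rho * z1 + sqrt(1 - rho ** 2) * e   # correlated second draw
        pairs.append((z1, z2))
    return pairs

def sample_corr(pairs):
    """Sample Pearson correlation of the drawn pairs."""
    n = len(pairs)
    mx = sum(p[0] for p in pairs) / n
    my = sum(p[1] for p in pairs) / n
    cov = sum((x - mx) * (y - my) for x, y in pairs)
    vx = sum((x - mx) ** 2 for x, _ in pairs)
    vy = sum((y - my) ** 2 for _, y in pairs)
    return cov / sqrt(vx * vy)

# Illustrative weakly negative correlation between two drug parameters
pairs = correlated_pairs(20000, rho=-0.3, seed=42)
r = sample_corr(pairs)
```

For more than two parameters, the same idea generalizes via a Cholesky factor of the full correlation matrix; in the study's terms, drawing (μ, logσ) jointly this way is what makes simulated drug durations co-vary as they do in the animal experiments.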
29. Comprehensive guidelines for appropriate statistical analysis methods in research
- Author
-
Jonghae Kim, Dong Hyuck Kim, and Sang Gyu Kwak
- Subjects
algorithms ,biostatistics ,data analysis ,guideline ,statistical data interpretation ,statistical model ,Anesthesiology ,RD78.3-87.3 - Abstract
Background The selection of statistical analysis methods in research is a critical and nuanced task that requires a scientific and rational approach. Aligning the chosen method with the specifics of the research design and hypothesis is paramount, as it can significantly impact the reliability and quality of the research outcomes. Methods This study explores a comprehensive guideline for systematically choosing appropriate statistical analysis methods, with a particular focus on the statistical hypothesis testing stage and categorization of variables. By providing a detailed examination of these aspects, this study aims to provide researchers with a solid foundation for informed methodological decision making. Moving beyond theoretical considerations, this study delves into the practical realm by examining the null and alternative hypotheses tailored to specific statistical methods of analysis. The dynamic relationship between these hypotheses and statistical methods is thoroughly explored, and a carefully crafted flowchart for selecting the statistical analysis method is proposed. Results Based on the flowchart, we examined whether exemplary research papers appropriately used statistical methods that align with the variables chosen and hypotheses built for the research. This iterative process ensures the adaptability and relevance of this flowchart across diverse research contexts, contributing to both theoretical insights and tangible tools for methodological decision-making. Conclusions This study emphasizes the importance of a scientific and rational approach for the selection of statistical analysis methods. By providing comprehensive guidelines, insights into the null and alternative hypotheses, and a practical flowchart, this study aims to empower researchers and enhance the overall quality and reliability of scientific studies.
- Published
- 2024
- Full Text
- View/download PDF
30. Uncovering the impact of outliers on clusters' evolution in temporal data-sets: an empirical analysis.
- Author
-
Atif, Muhammad, Farooq, Muhammad, Shafiq, Muhammad, Alballa, Tmader, Abdualziz Alhabeeb, Somayah, and Abd El-Wahed Khalifa, Hamide
- Subjects
CLUSTERING algorithms ,COST functions ,STOCHASTIC systems ,STOCHASTIC models ,STATISTICAL models - Abstract
This study investigates the impact of outliers on the evolution of clusters in temporal data sets. Monitoring and tracing cluster transitions in temporal data sets allows us to observe how clusters evolve and change over time. By tracking the movement of data points between clusters, we can gain insights into the underlying patterns, trends, and dynamics of the data. This understanding is essential for making informed decisions and drawing meaningful conclusions from the clustering results. Cluster evolution refers to the changes that occur in the clustering results over time due to the arrival of new data points; these changes in cluster solutions are classified as external and internal transitions. The study employs the survival ratio and history cost function to investigate the effects of outliers on the changes experienced by clusters at successive time points. The results demonstrate that outliers have a significant impact on cluster evolution, and appropriate outlier-handling techniques are necessary to obtain reliable clustering results. The findings of this study provide useful insights for practitioners and researchers in the field of stream clustering and can help guide the development of more robust and accurate stream clustering algorithms. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
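The survival ratio referenced in the abstract quantifies how much of a cluster persists between successive time points. The paper's exact formulation is not reproduced here; the sketch below uses one common variant, the fraction of a cluster's members that stay together in the best-matching cluster at the next time point. The function name and the toy labelings are hypothetical.

```python
from collections import Counter

def survival_ratio(labels_t, labels_t1, cluster_id):
    """Fraction of the members of `cluster_id` at time t that end up
    together in the best-matching cluster at time t+1."""
    members = [i for i, c in enumerate(labels_t) if c == cluster_id]
    if not members:
        return 0.0
    # Count where those members landed at t+1; take the largest overlap.
    landed = Counter(labels_t1[i] for i in members)
    return max(landed.values()) / len(members)

# Cluster 0 keeps 3 of its 4 members together after one point drifts away.
t0 = [0, 0, 0, 0, 1, 1]
t1 = [0, 0, 0, 2, 1, 1]
print(survival_ratio(t0, t1, 0))  # 0.75
```

A low survival ratio at a given time point flags an external transition of the kind the study attributes to outliers.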
31. A Prostate Imaging‐Reporting and Data System version 2.1‐based predictive model for clinically significant prostate cancer diagnosis.
- Author
-
Gelikman, David G., Azar, William S., Yilmaz, Enis C., Lin, Yue, Shumaker, Luke A., Fang, Andrew M., Harmon, Stephanie A., Huang, Erich P., Parikh, Sahil H., Hyman, Jason A., Schuppe, Kyle, Nix, Jeffrey W., Galgano, Samuel J., Merino, Maria J., Choyke, Peter L., Gurram, Sandeep, Wood, Bradford J., Rais‐Bahrami, Soroush, Pinto, Peter A., and Turkbey, Baris
- Subjects
- *
MAGNETIC resonance imaging , *RECEIVER operating characteristic curves , *CANCER diagnosis , *STATISTICAL models , *PROSTATE cancer - Abstract
Objectives: To develop and validate a Prostate Imaging‐Reporting and Data System (PI‐RADS) version 2.1 (v2.1)‐based predictive model for diagnosis of clinically significant prostate cancer (csPCa), integrating clinical and multiparametric magnetic resonance imaging (mpMRI) data, and compare its performance with existing models. Patients and Methods: We retrospectively analysed data from patients who underwent prospective mpMRI assessment using the PI‐RADS v2.1 scoring system and biopsy at our institution between April 2019 and December 2023. A 'Clinical Baseline' model using patient demographics and laboratory results and an 'MRI Added' model additionally incorporating PI‐RADS v2.1 scores and prostate volumes were created and validated on internal and external patients. Both models were compared against two previously published MRI‐based algorithms for csPCa using area under the receiver operating characteristic curve (AUC) and decision curve analysis. Results: A total of 1319 patients across internal and external cohorts were included. Our 'MRI Added' model demonstrated significantly improved discriminative ability (AUCinternal 0.88, AUCexternal 0.79) compared to our 'Clinical Baseline' model (AUCinternal 0.75, AUCexternal 0.68) (P < 0.001). The 'MRI Added' model also showed higher net benefits across various clinical threshold probabilities, and compared to a 'biopsy all' approach, it reduced unnecessary biopsies (defined as biopsies without Gleason Grade Group ≥2 csPCa) by 27% in the internal cohort and 10% in the external cohort at a risk threshold of 25%. However, there was no significant difference in predictive ability and reduction in unnecessary biopsies between our model and comparative ones developed for PI‐RADS v2 and v1. Conclusion: Our PI‐RADS v2.1‐based mpMRI model significantly enhances csPCa prediction, outperforming the traditional clinical model in accuracy and reduction of unnecessary biopsies. It proves promising across diverse patient populations, establishing an updated, integrated approach for detection and management of prostate cancer. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
32. GLOBAL CLIMATE CHANGE AND TEMPERATURE BALANCE: RISKS AND PROSPECTS.
- Author
-
Yusubov, Fakhraddin, Valiyeva, Govhar, and Nasirov, Igor
- Subjects
- *
CLIMATE change , *TEMPERATURE , *SOLAR activity - Abstract
This paper focuses on global climate change and the balance of global temperatures. It has been established that CO2 emissions and solar activity exert the greatest influence on changes in average temperatures. The study explores the relationship between average temperature changes, the rate of solar energy input, CO2 concentration, and cyclical periods. It was found that CO2 emissions can be significantly reduced. The findings indicate that even with reductions and minimization of CO2 emissions, such as on the European continent, the average temperature increase observed over the past 48–50 years will not decrease, and a return to the temperature levels of 100 years ago is not feasible. However, it is possible to project the stabilization of this increase at around 1.5°C. Furthermore, this research highlights the critical link between CO2 emissions, solar activity, and the broader natural and technological risks associated with climate change. These risks include extreme weather events, natural disasters, and impacts on industrial processes, underscoring the importance of innovative methodologies and practices for risk mitigation. [ABSTRACT FROM AUTHOR]
- Published
- 2024
33. Predicting Enteric Methane Emissions from Dairy and Beef Cattle Using Nutrient Composition and Intake Variables.
- Author
-
Wang, Yaodong, Song, Weitao, Wang, Qian, Yang, Fafa, and Yan, Zhengang
- Subjects
- *
NONLINEAR statistical models , *GREENHOUSE gases , *LINEAR statistical models , *BEEF cattle , *DAIRY cattle - Abstract
Simple Summary: Enteric methane (CH4) production in cattle accounts for a significant portion of global greenhouse gas emissions. Measurement of enteric methane emission is complex and expensive, and large-scale measurement is impractical. Therefore, in the absence of measurements, modeling can be used to predict CH4 production and help investigate mitigation options. In this study, we used dietary nutrient composition (g/kg), nutrients (kg/day), energy (MJ/day), and energy and organic matter (OM) digestibility (g/kg) as predictors of CH4 production to develop linear and nonlinear statistical models for predicting enteric methane emission from beef and dairy cattle and to evaluate the few available models. The objective of this study was to develop linear and nonlinear statistical models for predicting enteric methane emissions from beef and dairy cattle (EME, MJ/day). Ration nutrient composition (g/kg), nutrient (kg/day), energy (MJ/day), and energy and organic matter (OM) digestibility (g/kg) were used as predictors of CH4 production. Three databases of beef cattle, dairy cattle, and their combinations were developed using 34 published experiments to model EME predictions. Linear and nonlinear regression models were developed using a mixed-model approach to predict CH4 production (MJ/day) of individual animals based on feed composition. For the beef cattle database, the equation methane (MJ/d) = 1.6063 (±0.757) + 0.4256 (±0.0745) × DMI + 1.2213 (±0.1715) × NDFI − 0.475 (±0.446) × ADFI had the smallest RMSPE (21.99%), with 83.51% of this coming from random error and a regression bias of 16.49%. For the dairy cattle database, the RMSPE was minimized (15.99%) for methane (MJ/d) = 0.3989 (±1.1073) + 0.8685 (±0.1585) × DMI + 0.6675 (±0.4264) × NDFI, of which 85.11% was from random error and the regression deviation was 14.89%. 
When the beef and dairy cattle databases were combined, the RMSPE was minimized (24.4%) for methane (MJ/d) = −0.3496 (±0.723) + 0.5941 (±0.0851) × DMI + 1.388 (±0.2203) × NDFI − 0.027 (±0.4223) × ADFI, of which 85.62% was from random error and the regression bias was 14.38%. Among the nonlinear equations for the three databases, the DMI-based exponential model outperformed the other nonlinear models, but the predictability and goodness of fit of the equations did not improve compared to the linear model. The existing equations overestimate CH4 production with low accuracy and precision. Therefore, the equations developed in this study improve the preparation of methane inventories and thus the estimation of methane production in beef and dairy cattle. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
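The combined beef-and-dairy linear equation quoted in the abstract can be applied directly once the three intake variables are known. The sketch below evaluates it with the abstract's point estimates for the coefficients; the input intakes are illustrative values, not data from the paper.

```python
def methane_mj_per_day(dmi, ndfi, adfi):
    """Combined-database linear equation from the abstract:
    CH4 (MJ/d) = -0.3496 + 0.5941*DMI + 1.388*NDFI - 0.027*ADFI,
    with dry matter, NDF, and ADF intakes in kg/day."""
    return -0.3496 + 0.5941 * dmi + 1.388 * ndfi - 0.027 * adfi

# Illustrative intakes (kg/day) -- not taken from the paper's databases.
print(round(methane_mj_per_day(dmi=18.0, ndfi=6.0, adfi=3.5), 2))  # 18.58
```

Note that NDFI dominates the prediction per kilogram of intake, consistent with fibre intake driving enteric fermentation.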
34. Combining Satellite, Teleconnection, and In Situ Data to Improve Understanding of Multi‐Decadal Coastal Ice Cover Dynamics on Earth's Largest Freshwater Lake.
- Author
-
Venumuddula, Manish, Kirchner, Karl, Chen, Austin, Rood, Richard B., and Gronewold, Andrew D.
- Subjects
- *
LAKES , *SHORELINES , *ATMOSPHERIC models , *STATISTICAL models , *DECISION making - Abstract
To differentiate and understand drivers behind coastal ice cover trends and variability, we advance development of a model combining satellite, in situ, and teleconnection data along the shoreline of Earth's largest freshwater lake (Lake Superior). Previous studies suggest a regime shift in Lake Superior's ice cover starting in 1998. Our study includes seven years of new data and subsequent model analysis that provide new insight into characteristics of the post‐1998 regime. In addition to providing a valuable extension to the historical ice cover record for this domain, we find the regime shift in coastal ice cover starting in 1998 is characterized by pronounced variability, and not simply a shift in pre‐1998 trends. Our findings represent an important stepping stone for future ice and climate modeling not only on Lake Superior but across the entire Great Lakes region and in other global high‐latitude coastal regions as well. Plain Language Summary: Ice cover on the Laurentian Great Lakes has become more variable over the past 25 years. To better understand this variability, we re‐develop a model from a previous study focused on a portion of the shoreline of Lake Superior. The previous study, which culminated in 2015, suggested a change in both interannual variability and long‐term trends in ice cover after 1998. At the time of the previous study, however, the extent to which those changes might continue to propagate into the future was unclear. Here, we extend the historical ice cover record through 2022 while also exploring a broad range of potential explanatory variables in a simulation model. Our analysis indicates that the post‐1998 regime is characterized by more pronounced variability than previous studies indicated, with near‐record‐high years of ice cover followed by years of very little or even no appreciable ice cover. 
These interannual ice cover dynamics were not evident in the historical record prior to 1998, and their persistence from 1998 through 2022 underscores the importance not only of differentiating regime shifts from trends in climate change studies, but also of correctly reflecting those regime shifts in simulation and forecasting models. Key Points: For the past 25 years, ice cover along Lake Superior's shoreline has been characterized by pronounced interannual variability. We reproduce these dynamics in two probabilistic models (Cox survival and beta) combining satellite, in situ, and teleconnection data. Coastal ice conditions along the shoreline of Earth's largest freshwater lake reflect a regime shift that began in the late 1990s. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
35. A new probabilistic model: Its implementations to time duration and injury rates in physical training, sports, and reliability sector.
- Author
-
Lu, Guang, Alamri, Osama Abdulaziz, Alnssyan, Badr, and Alshahrani, Mohammed A.
- Subjects
DISTRIBUTION (Probability theory) ,PHYSICAL education ,PHYSICAL training & conditioning ,STATISTICS ,STATISTICAL models - Abstract
The study of real-life situations, notably in the fields of physical education, sports, and reliability, underscores the importance of probability distributions for comprehensive statistical analysis. Thus, researchers are consistently pursuing new and flexible probability distributions to ensure an optimal fit for real-life phenomena. Recognizing the vital importance of probability-based approaches, this article presents a new probability model known as the modified exponentiated exponential (ME-exponential) distribution. This model is created through the combination of the exponentiated exponential distribution and a familiar probabilistic technique. The investigation of the mathematical properties, particularly the quartile-based characteristics of the ME-exponential distribution, has been undertaken. Furthermore, the derivation of point estimators for the unknown parameters of the new model is outlined. A comprehensive simulation study has also been conducted to assess the performances of these point estimators. In the context of physical education, sports, and reliability, we analyze the practical significance of the ME-exponential distribution. By employing four critical assessment tools, it becomes evident that this distribution provides a superior fit relative to numerous other distributions. Our findings indicate that this newly established distribution adds to the suite of probability distributions that can be employed for the statistical analysis of real-life situations. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
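The abstract highlights quartile-based characteristics of the new ME-exponential model. Its exact CDF is not reproduced in the abstract, so the sketch below works with the base exponentiated exponential distribution it builds on, whose CDF is F(x) = (1 − e^(−λx))^α, giving a closed-form quantile function; parameter values are illustrative.

```python
import math

def exp_exp_quantile(q, alpha, lam):
    """Quantile of the base exponentiated exponential distribution,
    F(x) = (1 - exp(-lam*x))**alpha.  The paper's ME-exponential
    modifies this CDF; that modification is not reproduced here."""
    return -math.log(1 - q ** (1 / alpha)) / lam

# Quartiles and median for illustrative parameters alpha=2, lam=1.
for q in (0.25, 0.5, 0.75):
    print(round(exp_exp_quantile(q, 2.0, 1.0), 4))
```

Quartile-based summaries (e.g. the quartile skewness) follow directly from such a quantile function without needing moments to exist.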
36. Improvement of Statistical Models by Considering Correlations among Parameters: Local Anesthetic Agent Simulator for Pharmacological Education.
- Author
-
Ara, Toshiaki and Kitamura, Hiroyuki
- Subjects
MONTE Carlo method ,ANIMAL experimentation ,LOCAL anesthetics ,GUINEA pigs ,STATISTICAL models - Abstract
Background: To elucidate the effects of local anesthetic agents (LAs), guinea pigs are used in pharmacological education. Herein, we aimed to develop a simulator for LAs. Previously, we developed a statistical model to simulate the LAs' effects, and we estimated their parameters (mean [μ] and logarithm of standard deviation [log σ]) based on the results of animal experiments. The results of the Monte Carlo simulation were similar to those from the animal experiments. However, the drug parameter values varied widely among individuals, because this simulation did not consider correlations among parameters. Method: In this study, we set the correlations among these parameters and performed Monte Carlo simulations. Results: Weakly negative correlations were observed between μ and log σ (r_{μ,log σ}). In contrast, weakly positive correlations were observed among the μ values (r_μ) and among the log σ values (r_{log σ}). In the Monte Carlo simulation, the variability in duration was significant for small r_{μ,log σ} values, and the correlation of the duration between two drugs was significant for large r_μ and r_{log σ} values. When parameters were generated considering the correlations among them, the correlation of the duration among the drugs became larger. Conclusions: These results suggest that parameter generation considering the correlations among parameters is important for reproducing the results of animal experiments in simulations. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
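Generating parameters "considering the correlation among parameters", as the abstract describes, amounts to sampling (μ, log σ) pairs from a joint distribution with the estimated correlation rather than independently. A minimal sketch with a bivariate normal follows; the means, standard deviations, and the correlation value are assumed for illustration, not the paper's estimates.

```python
import numpy as np

rng = np.random.default_rng(0)

# Assumed illustrative values -- not the paper's estimated parameters.
mean = np.array([1.0, -0.5])           # [mu, log(sigma)]
sd = np.array([0.3, 0.2])
r = -0.4                               # weakly negative r_{mu,log sigma}
cov = np.array([[sd[0]**2,       r * sd[0] * sd[1]],
                [r * sd[0] * sd[1], sd[1]**2      ]])

# Draw correlated (mu, log sigma) pairs for the Monte Carlo simulation.
params = rng.multivariate_normal(mean, cov, size=10_000)
mu, log_sigma = params[:, 0], params[:, 1]
print(round(float(np.corrcoef(mu, log_sigma)[0, 1]), 2))  # close to -0.4
```

Sampling independently instead (setting r = 0) is exactly the simplification the earlier simulator made, which the study shows weakens the between-drug correlation of durations.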
37. Topological Characterization of Some New Anti-Viral Drugs for Cancer Treatment.
- Author
-
Zaman, Shahid, Mushtaq, Mahnoor, Danish, Muhammad, Ali, Parvez, and Rasheed, Sadaf
- Abstract
Topological characterization in drug design is used to predict the pharmacological properties of compounds. In drug activity studies, these indices help to assess a compound's biological activity, such as binding affinity to a target protein or enzyme, by quantifying its molecular complexity and functional groups. Topological indices are useful in the development of new medications with properties similar to those of effective anti-cancer drugs. In this paper, the medications used to treat cancer are perfragilin A, carmustine, and melatonin. To represent the features of these medications in QSPR, the idea of edge partitions has been applied, and a curve-fitting model is used to construct the QSPR study. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
38. Tools and Results of the Study of the Relationship between Production Dynamics and the Dynamics of Costs for Technological Innovation in the Russian Economy.
- Author
-
Suvorov, N. V., Beletsky, Yu. V., and Treshchina, S. V.
- Abstract
The article examines methodological and instrumental issues related to the quantitative assessment of the impact of costs of technological innovation on the dynamics of the real sector of the domestic economy in the 1990–2000s. The use of the production functions apparatus for this purpose is justified. The results of identification of sectoral production functions for the real sector of the domestic economy are presented. The relationship between the rate of change in production efficiency and the scale of innovation activity has been studied. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
39. Silicomanganese fume-based alkali-activated mortar: experimental, statistical, and environmental impact studies.
- Author
-
Najamuddin, Syed Khaja, Johari, Megat Azmi Megat, Bahraq, Ashraf A., Yusuf, Moruf Olalekan, Maslehuddin, Mohammed, and Ibrahim, Mohammed
- Subjects
ENVIRONMENTAL impact analysis ,ORTHOCLASE ,PRODUCT life cycle assessment ,X-ray diffraction ,SCANNING electron microscopy ,MORTAR - Abstract
This paper evaluates the flowability and strength properties of alkali-activated mortar produced using silicomanganese fume (SiMnF) as the sole binder, combined with alkaline activators and sand, cured at room temperature (23 ± 1 °C). A total of 18 mixes were prepared by varying binder content (370, 470, and 570 kg/m³), alkaline activator content (33, 43, and 53% of binder by weight), and NaOH concentration (8 M and 12 M). The SiMnF-based alkali-activated pastes were characterized using SEM, XRD, and FTIR techniques to study morphology, mineral composition, and functional groups, respectively. Statistical modeling, including analysis of variance (ANOVA) and response surface method (RSM), was performed to optimize the mixes, and a life cycle assessment was conducted to evaluate the environmental impact of the developed SiMnF-based alkali-activated mortars (SiMnF-AAM). The experimental results showed that an optimal mix design with 470 kg/m³ SiMnF, 43% alkaline activator content, and NaOH concentrations of 8 M and 12 M achieved the best balance of flow and strength. XRD and FTIR analyses confirmed that Nchwaningite was the primary reaction product, with secondary phases including magnetite, manganese ferrite, and potassium feldspar, influenced by alkali concentration. The SiMnF-based mixtures had a significantly lower CO₂ footprint (0.08 kg CO₂/kg) compared to the cement-based mix, with alkali activators being the primary contributors to emissions. The developed SiMnF-AAM mixes, cured at room temperature, exhibited improved workability, mechanical properties, and reduced environmental impact, making them adaptive to real-life applications. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
40. Forecasting and Comparative Application of PV System Electricity Generation for Sprinkler Irrigation Machines Based on Multiple Models.
- Author
-
Li, Bohan, Liu, Kenan, Cai, Yaohui, Sun, Wei, and Feng, Quan
- Subjects
- *
PHOTOVOLTAIC power generation , *MACHINE learning , *SPRINKLER irrigation , *POWER resources , *SPRINKLERS - Abstract
Currently, photovoltaic (PV) resources have been widely applied in the agricultural sector. However, due to the unreasonable configuration of multi-energy collaboration, issues such as unstable power supply and high investment costs still persist. Therefore, this study proposes a solution to reasonably determine the area and capacity of PV panels for irrigation machines, addressing the fluctuations in power generation of solar sprinkler PV systems under different regional and meteorological conditions. The aim is to more accurately predict photovoltaic power generation (PVPG) to optimize the configuration of the solar sprinkler power supply system, ensuring reliability while reducing investment costs. This paper first establishes a PVPG prediction model based on four forecasting models and conducts a comparative analysis to identify the optimal model. Next, annual, seasonal, and solar term scale models are developed and further studied in conjunction with the optimal model, using evaluation metrics to assess and compare the models. Finally, a mathematical model is established based on the optimal combination and solved to optimize the configuration of the power supply system in the irrigation machines. The results indicate that among the four PVPG prediction models, the SARIMAX model performs the best, as the R2 index reached 0.948, which was 19.4% higher than the others, while the MAE index was 10% lower than the others. The solar term scale model exhibited the highest accuracy among the three time scale models, the RMSE index was 4.8% lower than the others, and the MAE index was 1.1% lower than the others. After optimizing the configuration of the power supply system for the irrigation machine using the SARIMAX model based on the solar term scale, it is verified that the model can ensure both power supply reliability and manage energy overflow effectively. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
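The model comparison in the abstract rests on the R², RMSE, and MAE metrics. As a quick reference, these can be computed as below; the toy generation series and forecast values are illustrative only, not the study's data.

```python
import numpy as np

def r2(y, yhat):
    """Coefficient of determination: 1 - SS_res / SS_tot."""
    ss_res = np.sum((y - yhat) ** 2)
    ss_tot = np.sum((y - np.mean(y)) ** 2)
    return float(1 - ss_res / ss_tot)

def rmse(y, yhat):
    """Root mean squared error."""
    return float(np.sqrt(np.mean((y - yhat) ** 2)))

def mae(y, yhat):
    """Mean absolute error."""
    return float(np.mean(np.abs(y - yhat)))

# Toy daily PV generation (kWh) and a forecast -- illustrative numbers.
y    = np.array([4.1, 4.3, 3.8, 4.6, 4.0])
yhat = np.array([4.0, 4.4, 3.9, 4.5, 4.1])
print(round(r2(y, yhat), 3), round(rmse(y, yhat), 3), round(mae(y, yhat), 3))
```

A higher R² with lower RMSE/MAE is the combination the study uses to rank the SARIMAX model above the other three forecasters.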
41. From Data to Diagnosis: Machine Learning Revolutionizes Epidemiological Predictions.
- Author
-
Abdul Rahman, Abdul Aziz, Rajasekaran, Gowri, Ramalingam, Rathipriya, Meero, Abdelrhman, and Seetharaman, Dhamodharavadhani
- Subjects
- *
ARTIFICIAL neural networks , *MACHINE learning , *DEEP learning , *STATISTICAL models , *COMMUNICABLE diseases - Abstract
The outbreak of epidemiological diseases creates a major impact on humanity as well as on the world's economy. The consequence of such infectious diseases affects the survival of mankind. The government has to stand up to the negative influence of these epidemiological diseases and facilitate society with medical resources and economical support. In recent times, COVID-19 has been one of the epidemiological diseases that created lethal effects and a greater slump in the economy. Therefore, the prediction of outbreaks is essential for epidemiological diseases. It may be either frequent or sudden infections in society. The unexpected raise in the application of prediction models in recent years is outstanding. A study on these epidemiological prediction models and their usage from the year 2018 onwards is highlighted in this article. The popularity of various prediction approaches is emphasized and summarized in this article. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
42. Modeling heterogeneity of Sudanese hospital stay in neonatal and maternal unit: non-parametric random effect models with Gamma distribution.
- Author
-
Almohaimeed, Amani and Adam, Ishag
- Subjects
RANDOM effects model ,STATISTICAL significance ,LENGTH of stay in hospitals ,GAMMA distributions ,DATA distribution - Abstract
Objective: Studies looking into patient and institutional variables linked to extended hospital stays have arisen as a result of the increased focus on severe maternal morbidity and mortality. Understanding the length of hospitalization of patients after delivery is important to gain insights into when hospitals will reach capacity and to predict corresponding staffing or equipment requirements. In Sudan, the distribution of the length of stay during delivery hospitalizations is heavily skewed, with an average length of stay of 2 to 3 days. This study aimed to investigate the use of a non-parametric random effect model with a Gamma-distributed response for analyzing skewed hospital length-of-stay data in neonatal and maternal units in Sudan. Methods: We applied Gamma regression models with unknown random effects, estimated using the non-parametric maximum likelihood (NPML) technique [5]. The NPML approach reduces the heterogeneity in the distribution of the response and produces robust estimates, since it does not require any assumptions about the random-effect distribution. The same applies to the log-Gamma link, which does not require any transformation of the data distribution and can handle outliers in the data. In this study, the models are fitted with and without covariates and compared using AIC and BIC values. Results: The findings imply that in the context of health care database investigations, Gamma regression models with non-parametric random effects consistently reduce heterogeneity and improve model accuracy. The generalized linear model with covariates and random effect (k = 4) had the best fit, indicating that Sudanese hospital length-of-stay data could be classified into four groups with varying average stays influenced by maternal, neonatal, and obstetric data. Conclusion: Identifying factors contributing to longer stays allows hospitals to implement strategies for improvement. 
Non-parametric random effect model with Gamma distributed response effectively accounts for unobserved heterogeneity and individual-level variability, leading to more accurate inferences and improved patient care. Including random effects can significantly affect variable significance in statistical models, emphasizing the need to consider unobserved heterogeneity when analyzing data containing potential individual-level variability. The findings emphasise the importance of making robust methodological choices in healthcare research in order to inform accurate policy decisions. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
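NPML estimation approximates the unknown random-effect distribution by k discrete mass points, so the fitted model is effectively a finite mixture of Gamma components. The sketch below simulates length-of-stay data from such a k = 2 mixture to show how a skewed marginal distribution with a 2-3 day average arises; the weights, group means, and shape parameter are assumed for illustration, not the paper's k = 4 estimates.

```python
import numpy as np

rng = np.random.default_rng(1)

# NPML represents the random effect by k mass points; here k=2
# illustrative groups with different mean stays (days) -- assumed values.
weights = np.array([0.8, 0.2])        # mass-point probabilities
means = np.array([2.0, 6.0])          # group mean length of stay (days)
shape = 4.0                           # Gamma shape parameter

# Draw a latent group per patient, then a Gamma stay with that group mean.
group = rng.choice(2, size=50_000, p=weights)
los = rng.gamma(shape, means[group] / shape)   # scale = mean / shape

# Marginal mean is the weighted mixture mean: 0.8*2 + 0.2*6 = 2.8 days.
print(round(float(los.mean()), 1))
```

The long right tail comes from the minority high-mean group, which is exactly the unobserved heterogeneity the random effect is meant to absorb.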
43. ESTIMATE OF SEASONAL ELECTRICITY CONSUMPTION AND POWER DEMAND IN THE AGRICULTURAL SECTOR OF THE STATE OF SÃO PAULO.
- Author
-
Nogueira Christovão, Monclar, Mollo Neto, Mario, and de Luca Oliveira Christovão, Ana Flávia
- Subjects
ENERGY consumption forecasting ,ELECTRIC power consumption ,AGRICULTURAL industries ,LEAST squares ,ENERGY industries - Abstract
Copyright of Environmental & Social Management Journal / Revista de Gestão Social e Ambiental is the property of Environmental & Social Management Journal and its content may not be copied or emailed to multiple sites or posted to a listserv without the copyright holder's express written permission. However, users may print, download, or email articles for individual use. This abstract may be abridged. No warranty is given about the accuracy of the copy. Users should refer to the original published version of the material for the full abstract. (Copyright applies to all Abstracts.)
- Published
- 2024
- Full Text
- View/download PDF
44. Environmental impact of hydraulic fracturing on groundwater by isotope composition and hydrochemistry.
- Author
-
Li, Zhao and Luo, Zujiang
- Subjects
PRINCIPAL components analysis ,WATER quality monitoring ,HYDRAULIC fracturing ,ENVIRONMENTAL impact analysis ,STABLE isotopes - Abstract
Hydraulic fracturing is widely applied to unconventional energy sources to improve production capacity, but concerns exist about its potential negative impacts on the environment. Previous research evaluated the impact of hydraulic fracturing on the environment qualitatively through water quality monitoring, and some numerical models have been developed to study it quantitatively, but there is uncertainty in the acquisition of mechanical parameters of deep rock. This study presents a statistical model for studying the environmental impact of hydraulic fracturing on groundwater in the region. The contribution ratio of CBM co-produced water is calculated by End Member Mixing Analysis to estimate the degree of containment. In order to obtain the end members purely, Vertex Component Analysis and Principal Components Analysis are used to extract them. The model is applied to the Qinshui Basin, China. It was shown that hydraulic fracturing does not damage the aquiclude in the region, but shallow groundwater was polluted by CBM co-produced water in the Shizhuang Block. The mixing path of CBM co-produced water is from the surface to the aquifer. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
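In the simplest two-end-member case, the contribution ratio from End Member Mixing Analysis follows from a linear mass balance on a conservative tracer: the sample concentration is a weighted mix of the two end-member concentrations. The concentrations below are illustrative values, not the Qinshui Basin data, and real EMMA uses several tracers with the end members extracted as in the paper.

```python
# Two-end-member mixing: c_sample = f*c_cbm + (1 - f)*c_ground,
# solved for f, the contribution ratio of CBM co-produced water.
# Concentrations are illustrative (e.g. Cl- in mg/L), not the paper's data.
c_cbm, c_ground = 850.0, 120.0
c_sample = 266.0

f_cbm = (c_sample - c_ground) / (c_cbm - c_ground)
print(round(f_cbm, 2))  # 0.2 -> 20% CBM co-produced water in the sample
```

A contribution ratio near zero in deep-aquifer samples would indicate an intact aquiclude, while elevated ratios in shallow wells point to surface-downward mixing, as reported for the Shizhuang Block.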
45. Sequence-to-Sequence Models and Their Evaluation for Spoken Language Normalization of Slovenian.
- Author
-
Sepesy Maučec, Mirjam, Verdonik, Darinka, and Donaj, Gregor
- Subjects
STATISTICAL hypothesis testing ,TRANSFORMER models ,SPEECH ,ORAL communication ,STATISTICAL models - Abstract
Sequence-to-sequence models have been applied to many challenging problems, including those in text and speech technologies. Normalization is one of them. It refers to transforming non-standard language forms into their standard counterparts. Non-standard language forms come from different written and spoken sources. This paper deals with one such source, namely speech from the less-resourced highly inflected Slovenian language. The paper explores speech corpora recently collected in public and private environments. We analyze the efficiencies of three sequence-to-sequence models for automatic normalization from literal transcriptions to standard forms. Experiments were performed using words, subwords, and characters as basic units for normalization. In the article, we demonstrate that the superiority of the approach is linked to the choice of the basic modeling unit. Statistical models prefer words, while neural network-based models prefer characters. The experimental results show that the best results are obtained with neural architectures based on characters. Long short-term memory and transformer architectures gave comparable results. We also present a novel analysis tool, which we use for in-depth error analysis of results obtained by character-based models. This analysis showed that systems with similar overall results can differ in the performance for different types of errors. Errors obtained with the transformer architecture are easier to correct in the post-editing process. This is an important insight, as creating speech corpora is a time-consuming and costly process. The analysis tool also incorporates two statistical significance tests: approximate randomization and bootstrap resampling. Both statistical tests confirm the improved results of neural network-based models compared to statistical ones. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
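One of the two significance tests the analysis tool incorporates, bootstrap resampling, can be sketched compactly: resample the paired per-sentence errors of two normalization systems and count how often the observed advantage disappears. The error counts below are illustrative, not the Slovenian corpus results, and the exact resampling scheme in the tool may differ.

```python
import numpy as np

rng = np.random.default_rng(2)

def bootstrap_pvalue(errors_a, errors_b, n_boot=10_000):
    """Paired bootstrap: fraction of resamples in which system A's
    mean-error advantage over system B vanishes (diff <= 0)."""
    errors_a = np.asarray(errors_a, float)
    errors_b = np.asarray(errors_b, float)
    n = len(errors_a)
    count = 0
    for _ in range(n_boot):
        idx = rng.integers(0, n, n)          # resample sentence indices
        if errors_b[idx].mean() - errors_a[idx].mean() <= 0:
            count += 1
    return count / n_boot

# Per-sentence error counts for two systems -- illustrative numbers only.
a = [1, 0, 2, 1, 0, 1, 0, 2, 1, 0]   # e.g. character-based neural model
b = [2, 1, 2, 2, 1, 1, 1, 3, 2, 1]   # e.g. word-based statistical model
print(bootstrap_pvalue(a, b) < 0.05)  # True: the improvement is significant
```

Approximate randomization, the other test mentioned, differs only in how the resamples are formed (swapping paired outputs rather than drawing with replacement).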
46. Perturbation Approach to Polynomial Root Estimation and Expected Maximum Modulus of Zeros with Uniform Perturbations.
- Author
-
Nafisah, Ibrahim A., Sheikh, Sajad A., Alshahrani, Mohammed A., Almazah, Mohammed M. A., Alnssyan, Badr, and Dar, Javid Gani
- Subjects
- *
PERTURBATION theory , *ESTIMATION theory , *STABILITY theory , *NUMERICAL analysis , *SYSTEMS design - Abstract
This paper presents a significant extension of perturbation theory techniques for estimating the roots of polynomials. Building upon foundational results and recent work by Pakdemirli and Yurtsever, as well as taking inspiration from the concept of probabilistic bounds introduced by Sheikh et al., we develop and prove several novel theorems that address a wide range of polynomial structures. These include polynomials with multiple large coefficients, coefficients of different orders, alternating coefficient orders, large linear and constant terms, and exponentially decreasing coefficients. Among the key contributions is a theorem that establishes an upper bound on the expected maximum modulus of the zeros of polynomials with uniformly distributed perturbations in their coefficients. The theorem considers the case where all but the leading coefficient receive a uniformly and independently distributed perturbation in the interval [−1, 1]. Our approach provides a comprehensive framework for estimating the order of magnitude of polynomial roots based on the structure and magnitude of their coefficients without the need for explicit root-finding algorithms. The results offer valuable insights into the relationship between coefficient patterns and root behavior, extending the applicability of perturbation-based root estimation to a broader class of polynomials. This work has potential applications in various fields, including random polynomials, control systems design, signal processing, and numerical analysis, where quick and reliable estimation of polynomial roots is crucial. Our findings contribute to the theoretical understanding of polynomial properties and provide practical tools for engineers and scientists dealing with polynomial equations in diverse contexts. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
47. AN EXTENDED SYMMETRICAL AND ASYMMETRICAL GENERATOR: PROPERTIES, INFERENCE, ACTUARIAL MEASURES, AND APPLICATIONS.
- Author
-
El-Sheikh, Amany M., Alqawba, Mohammed, El-Sherpieny, El-Sayed A., and Afify, Ahmed Z.
- Subjects
- *
PORTFOLIO management (Investments) , *VALUE at risk , *STATISTICAL models , *COMPUTER simulation , *DATA analysis - Abstract
In this paper, we introduce the Burr-X Marshall–Olkin-F (BXMO-F) family, which proves to be highly effective for real-life data analysis. We establish several of its mathematical properties and demonstrate that the BXMO-F family can accommodate a variety of hazard rates and density functions. We derive six risk measures for the BXMO-Lomax distribution, which are crucial for portfolio optimization. The parameters of the BXMO-Lomax distribution are estimated using eight different estimation approaches, and these approaches are thoroughly evaluated through detailed numerical simulations. Finally, we explore the applicability of the BXMO-Lomax distribution by analyzing two real-life data sets from engineering and medicine. Our analysis shows that the BXMO-Lomax distribution offers a superior fit compared to several well-known extensions of the Lomax distribution. [ABSTRACT FROM AUTHOR]
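The paper's eight estimation approaches for the BXMO-Lomax parameters are not reproduced here. As a minimal, hedged illustration of one ingredient, the baseline Lomax distribution with known scale admits a closed-form maximum-likelihood estimate of its shape; the sketch below (all names and parameter values are illustrative, not the paper's method) verifies it by inverse-CDF simulation:

```python
import numpy as np

rng = np.random.default_rng(42)

# Lomax(alpha, lam): F(x) = 1 - (1 + x/lam)^(-alpha), x >= 0
alpha_true, lam = 2.0, 1.0
u = rng.uniform(size=20000)
x = lam * ((1.0 - u) ** (-1.0 / alpha_true) - 1.0)  # inverse-CDF sampling

# With the scale lam known, the MLE of the shape is closed-form:
#   alpha_hat = n / sum_i log(1 + x_i / lam)
alpha_hat = x.size / np.log1p(x / lam).sum()
```

Estimating both shape and scale, or the extra parameters a generated family adds, requires numerical optimization rather than this closed form.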
- Published
- 2024
48. m5c-iDeep: 5-Methylcytosine sites identification through deep learning.
- Author
-
Malebary, Sharaf J., Alromema, Nashwan, Suleman, Muhammad Taseer, and Saleem, Maham
- Subjects
- *
NUCLEOTIDE sequence , *METHYLCYTOSINE , *INDEPENDENT sets , *METHYL groups , *SEQUENCE analysis , *DEEP learning - Abstract
• The current study focused on the prediction of 5-methylcytosine (m5C) sites in RNA sequences. • A novel feature-generation method is proposed that incorporates statistical moments for feature reduction. • The LSTM model achieved the highest accuracy in both the independent set test and k-fold cross-validation. • A webserver, m5c-iDeep, is freely available online to support research on m5C site detection. 5-Methylcytosine (m5c) is a modified cytosine base formed by the addition of a methyl group at carbon position 5. This modification is one of the most common PTMs and occurs in almost all types of RNA. Conventional laboratory methods do not provide quick, reliable identification of m5c sites. However, the availability of sequence data has made it feasible to develop computationally intelligent models that optimize the identification process for accuracy and robustness. The present research focused on the development of in-silico methods built on deep learning models. The encoded data were fed into deep learning models, including a gated recurrent unit (GRU), long short-term memory (LSTM), and bi-directional LSTM (Bi-LSTM). The models were then subjected to a rigorous evaluation process that included both independent set testing and 10-fold cross-validation. The results revealed that the LSTM-based model, m5c-iDeep, outperformed existing m5c predictors, achieving 99.9 % accuracy. To facilitate researchers, m5c-iDeep is also deployed on a web-based server accessible at https://taseersuleman-m5c-ideep-m5c-ideep.streamlit.app/. [ABSTRACT FROM AUTHOR]
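The abstract mentions feeding "encoded data" to GRU/LSTM/Bi-LSTM models but does not spell out the encoding; the paper's own statistical-moment features are not reproduced here. A common baseline for such sequence models is one-hot encoding of the RNA bases, sketched below with illustrative names:

```python
import numpy as np

BASES = "ACGU"  # RNA alphabet

def one_hot(seq):
    # Map an RNA sequence to a (len, 4) one-hot matrix, a standard
    # input representation for GRU/LSTM/Bi-LSTM sequence classifiers.
    idx = np.array([BASES.index(b) for b in seq])
    out = np.zeros((len(seq), 4), dtype=np.float32)
    out[np.arange(len(seq)), idx] = 1.0
    return out

x = one_hot("ACGUC")  # shape (5, 4), one 1.0 per row
```

Stacking these matrices over a batch of fixed-length sequences yields the 3-D tensor that recurrent layers in common deep learning frameworks expect.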
- Published
- 2024
- Full Text
- View/download PDF
49. Training and Education Strategies of Mental Health Nursing Personnel in the Era of Artificial Intelligence.
- Author
-
Meiqin Zheng
- Abstract
With the rapid development of artificial intelligence (AI) technology and its widespread use across industries, especially in education, the role of AI in specific fields has come under in-depth exploration. Mental health nursing education, as a technology-sensitive and rapidly evolving field, is a particularly important setting for research on using AI technology to enhance educational effectiveness and efficiency. Based on an in-depth analysis of existing educational models, this study explores how AI technology can transform mental health nursing education, and comprehensively analyses the practical potential of AI technology to enhance the quality of mental health nursing education through questionnaires, experimental design, and the construction and application of statistical models. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
50. A Statistical Approach for Distribution System State Estimation.
- Author
-
Tekdemir, Ibrahim Gursu
- Subjects
ENERGY consumption ,SOLAR energy ,CONSUMPTION (Economics) ,TEST systems ,STATISTICAL models - Abstract
Power system state estimation is a useful technique that enables a system to be monitored when sufficient measurements are not available. Although it has been practiced for a long time, distribution system state estimation (DSSE) remains challenging today and is studied from various perspectives, because distribution systems are large, complex, and hard to monitor entirely with adequate measuring devices. In this study, a novel approach to DSSE is proposed, and it is demonstrated that conventional state estimation results can be improved by using proper statistical models of energy consumption behavior. For that purpose, a feeder in the Civanlar test system is analyzed by adapting real energy consumption data to a virtual consumption region with 10,465 residents created in this study. Estimated bus voltage amplitude values are improved as a result of analyses carried out for 16 scenarios in total, covering four seasons and two time periods. The scenarios are grouped into two cases, a base system and a system with solar energy generation, each containing eight scenarios. The results are significant in showing that DSSE results can be improved by using a statistical approach. [ABSTRACT FROM AUTHOR]
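The abstract does not detail the estimator itself, but the conventional state estimation being improved here is typically weighted least squares on a linear(ized) measurement model z = Hx + e. A minimal linear sketch follows; the matrices, states, and noise levels are illustrative, not taken from the paper:

```python
import numpy as np

# Toy measurement model z = H x + noise, with weights 1/sigma^2.
H = np.array([[1.0,  0.0],
              [0.0,  1.0],
              [1.0, -1.0]])        # two voltage-like states, one difference measurement
sigma = np.array([0.01, 0.01, 0.02])
W = np.diag(1.0 / sigma**2)

x_true = np.array([1.02, 0.98])
rng = np.random.default_rng(1)
z = H @ x_true + rng.normal(0.0, sigma)

# Weighted least-squares estimate: solve (H^T W H) x_hat = H^T W z
x_hat = np.linalg.solve(H.T @ W @ H, H.T @ W @ z)
```

In the statistical approach the abstract describes, consumption models would inform the pseudo-measurements and their weights, which is where the reported accuracy improvement in estimated bus voltages would enter.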
- Published
- 2024
- Full Text
- View/download PDF