15,165 results for "consistency"
Search Results
2. Balancing Complementarity and Consistency via Delayed Activation in Incomplete Multi-view Clustering
- Author
-
Li, Bo, Xu, Zhiwei, Yun, Jing, Wang, Jiatai, Goos, Gerhard, Series Editor, Hartmanis, Juris, Founding Editor, Bertino, Elisa, Editorial Board Member, Gao, Wen, Editorial Board Member, Steffen, Bernhard, Editorial Board Member, Yung, Moti, Editorial Board Member, Lin, Zhouchen, editor, Cheng, Ming-Ming, editor, He, Ran, editor, Ubul, Kurban, editor, Silamu, Wushouer, editor, Zha, Hongbin, editor, Zhou, Jie, editor, and Liu, Cheng-Lin, editor
- Published
- 2025
- Full Text
- View/download PDF
3. Self-Supervised Contrastive Learning for Consistent Few-Shot Image Representations
- Author
-
Karimijafarbigloo, Sanaz, Azad, Reza, Merhof, Dorit, Goos, Gerhard, Series Editor, Hartmanis, Juris, Founding Editor, Bertino, Elisa, Editorial Board Member, Gao, Wen, Editorial Board Member, Steffen, Bernhard, Editorial Board Member, Yung, Moti, Editorial Board Member, Rekik, Islem, editor, Adeli, Ehsan, editor, Park, Sang Hyun, editor, and Cintas, Celia, editor
- Published
- 2025
- Full Text
- View/download PDF
4. Branding consistency across product portfolios in the wine industry
- Author
-
Jeffery, Tayla, Hirche, Martin, Faulkner, Margaret, Page, Bill, Trinh, Giang, Bruwer, Johan, and Lockshin, Larry
- Published
- 2024
- Full Text
- View/download PDF
5. Quantifying convergence and consistency.
- Author
-
Matiasz, Nicholas J., Wood, Justin, and Silva, Alcino J.
- Abstract
The reproducibility crisis highlights several unresolved issues in science, including the need to develop measures that gauge both the consistency and convergence of data sets. While existing meta‐analytic methods quantify the consistency of evidence, they do not quantify its convergence: the extent to which different types of empirical methods have provided evidence to support a hypothesis. To address this gap in meta‐analysis, we and colleagues developed a summary metric—the cumulative evidence index (CEI)—which uses Bayesian statistics to quantify the degree of both consistency and convergence of evidence regarding causal hypotheses between two phenomena. Here, we outline the CEI's underlying model, which quantifies the extent to which studies of four types—positive intervention, negative intervention, positive non‐intervention and negative non‐intervention—lend credence to any of three types of causal relations: excitatory, inhibitory or no‐connection. Along with p‐values and other measures, the CEI can provide a more holistic perspective on a set of evidence by quantitatively expressing epistemic principles that scientists regularly employ qualitatively. The CEI can thus address the reproducibility crisis by formally demonstrating how convergent evidence across multiple study types can yield progress toward scientific consensus, even when an individual type of study fails to yield reproducible results. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
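The CEI described in entry 5 combines four study types and three candidate causal relations via Bayesian updating. The published formula is not given in the abstract, so the sketch below is only a minimal illustration of the general idea; the likelihood table, the `studies` counts, and the uniform prior are all assumptions of ours, not the authors' specification.

```python
import numpy as np

# Candidate causal relations between phenomena A and B.
relations = ["excitatory", "inhibitory", "no-connection"]

# Assumed probability that each study type reports a supportive result
# under each relation (rows: relation; cols: study type). Hypothetical values.
study_types = ["pos-intervention", "neg-intervention",
               "pos-non-intervention", "neg-non-intervention"]
lik = np.array([
    [0.80, 0.05, 0.70, 0.10],   # excitatory
    [0.05, 0.80, 0.10, 0.70],   # inhibitory
    [0.20, 0.20, 0.20, 0.20],   # no-connection
])

# Observed counts of supportive results per study type (fabricated example).
studies = np.array([6, 1, 4, 0])

# Uniform prior over the three relations, with independent per-study
# likelihood contributions accumulated in log space.
log_post = np.log(np.ones(3) / 3) + studies @ np.log(lik.T)
post = np.exp(log_post - log_post.max())
post /= post.sum()

for rel, p in zip(relations, post):
    print(f"P({rel} | evidence) = {p:.3f}")
# A CEI-like index could report the posterior mass on the leading relation,
# which grows as evidence converges across study types.
```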
6. Human-Browser Interaction: Investigating Whether the Current Browser Application's Design Actually Make Sense for Its Users?
- Author
-
Carroll, Fiona
- Abstract
As a society, we can get so engrossed in using the World Wide Web (WWW), also known as the web. We look for, read about and interact with different people, text, images, videos, etc., yet we often take for granted how we are doing this. We all must use a web browser to get around the web. Yet, do we ever stop to give much thought to this gateway to the web? The web browser can be fittingly described as a transparent technology, in the sense that we don't really stop to think about the browser and its features; we just unconsciously absorb and use it. This paper reports on a study that explores just how effectively we understand and use our web browser applications. The study focused on people's perceived awareness of desktop web browser functionality: what features they are using or aware of and, most importantly, what understanding they take from these. The findings show that the majority (69%) of the 528 participants studied do not fully understand what the padlock feature on their web browsers represents. In fact, the findings highlight that many of the participants feel that the padlock represents a safe website, which it clearly does not. This paper succinctly draws attention to the fact that the current desktop web browser application design is not fit for purpose. In summary, the research pushes for more effective web browser application designs; it provides design recommendations aimed at achieving web browser consistency and creating designs that promote safety, trust, and confidence. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
7. Kernel estimators for mean residual lifetime in length-biased sampling.
- Author
-
Zamini, R., Ajami, M., and Ghafouri, S.
- Subjects
- *BANDWIDTHS
- Abstract
In this article, we propose three nonparametric kernel estimators for the mean residual life function when the data are selected proportionally to their length. We evaluate the mean squared error of the three estimators and establish the consistency of all three. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
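Entry 7 concerns kernel estimation of the mean residual life (MRL) function m(t) = E[X − t | X > t] under length-biased sampling, where an observation x is drawn with probability proportional to x. The authors' three estimators are not spelled out in the abstract; the sketch below shows one simple construction we assume for illustration: a 1/x-weighted, kernel-smoothed distribution estimate (the standard correction for length bias) followed by numerical integration of the survival function.

```python
import numpy as np
from scipy.stats import norm

def mrl_length_biased(data, t_grid, h=0.5):
    """Kernel MRL estimate from length-biased data via 1/x weighting."""
    w = 1.0 / data                      # undo length bias: g(x) is prop. to x f(x)
    w = w / w.sum()
    # Smoothed CDF: each point contributes a Gaussian-kernel CDF step.
    cdf = lambda t: np.sum(w * norm.cdf((t - data) / h))
    surv = np.array([1.0 - cdf(t) for t in t_grid])
    dt = t_grid[1] - t_grid[0]
    # m(t) = integral of S(u) over (t, inf), divided by S(t); truncated at grid end.
    tail = np.cumsum(surv[::-1])[::-1] * dt
    return np.where(surv > 1e-8, tail / surv, np.nan)

rng = np.random.default_rng(0)
x = rng.gamma(2.0, 1.0, 2000)
keep = rng.uniform(0, x.max(), x.size) < x   # length-biased thinning
sample = x[keep]
t = np.linspace(0.1, 4.0, 40)
print(mrl_length_biased(sample, t)[:5])
```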
8. Asymptotic inference for a sign-double autoregressive (SDAR) model of order one.
- Author
-
Iglesias, Emma M.
- Abstract
We propose an extension of the double autoregressive (DAR) model: the sign-double autoregressive (SDAR) model, in the spirit of the GJR-GARCH model (also named the sign-ARCH model). Our model shares the important property of DAR models that a unit root does not imply non-stationarity, and it allows for asymmetry, like other alternatives in the literature such as the GJR-GARCH or the asymmetric linear DAR and dual-asymmetry linear DAR models. We establish consistency and asymptotic normality of the quasi-maximum likelihood estimator in the context of the SDAR model. Furthermore, it is shown by simulations that the asymptotic properties also apply in finite samples. Finally, an empirical application shows the usefulness of our model, especially in periods of oil supply/demand disruptions, where spikes of volatility are very likely to be predominant. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
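Entry 8's SDAR model is not written out in the abstract. A common way to add GJR-style sign asymmetry to a first-order DAR model is the specification below, which we assume purely for illustration: y_t = φ y_{t−1} + η_t √(ω + α y²_{t−1} + γ y²_{t−1} 1{y_{t−1} < 0}), with η_t iid N(0, 1). The sketch simulates the process and sets up the Gaussian QML objective whose maximizer the paper proves consistent and asymptotically normal.

```python
import numpy as np
from scipy.optimize import minimize

def simulate_sdar(n, phi, omega, alpha, gamma, seed=0):
    rng = np.random.default_rng(seed)
    y = np.zeros(n)
    for t in range(1, n):
        h = omega + (alpha + gamma * (y[t-1] < 0)) * y[t-1] ** 2
        y[t] = phi * y[t-1] + rng.standard_normal() * np.sqrt(h)
    return y

def neg_qml(params, y):
    """Gaussian quasi-log-likelihood (negated) for the assumed SDAR form."""
    phi, omega, alpha, gamma = params
    if omega <= 0 or alpha < 0 or alpha + gamma < 0:
        return np.inf
    yl = y[:-1]
    h = omega + (alpha + gamma * (yl < 0)) * yl ** 2
    e = y[1:] - phi * yl
    return 0.5 * np.sum(np.log(h) + e ** 2 / h)

y = simulate_sdar(5000, phi=0.3, omega=1.0, alpha=0.2, gamma=0.3)
fit = minimize(neg_qml, x0=[0.0, 0.5, 0.1, 0.1], args=(y,), method="Nelder-Mead")
print(fit.x)  # QMLE should land near (0.3, 1.0, 0.2, 0.3)
```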
9. Modified intrinsic Bayes factor for multivariate regression models.
- Author
-
Taheri, Marzieh and Kheradmandnia, Manouchehr
- Abstract
The multivariate regression model (MRM) is one of the most widely used models for multivariate statistical inference. In this paper, a closed-form and fully objective Bayes factor for comparing two MRMs is introduced. We also introduce a Bayes factor-based method for comparing several MRMs. It is well known that the arithmetic intrinsic Bayes factor (AIBF) does not satisfy the pairwise and multiple coherency conditions, so the AIBF cannot be used in multiple hypothesis testing and model selection. Even in the case of only two competing hypotheses or models, lack of pairwise coherency leads to the embarrassing position of having two Bayes factors for comparing two models instead of just one. In this paper, we make a simple modification to the original definition and introduce the modified intrinsic Bayes factor (ModIBF), which satisfies the pairwise and multiple coherency conditions. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
10. Boki: Towards Data Consistency and Fault Tolerance with Shared Logs in Stateful Serverless Computing.
- Author
-
Jia, Zhipeng and Witchel, Emmett
- Abstract
Boki is a new serverless runtime that exports a shared log API to serverless functions. Boki shared logs enable stateful serverless applications to manage their state with durability, consistency, and fault tolerance, and they achieve high throughput and low latency. The key enabler is the metalog, a novel mechanism that allows Boki to address ordering, consistency and fault tolerance independently. The metalog orders shared log records with high throughput, and it provides read consistency while allowing service providers to optimize the write and read paths of the shared log in different ways. To demonstrate the value of shared logs for stateful serverless applications, we build Boki support libraries that implement fault-tolerant workflows, durable object storage, and message queues. Our evaluation shows that shared logs can speed up important serverless workloads by up to 4.2×. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
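Entry 10's key idea is the metalog: a single totally ordered log of metadata records that fixes the global order of records appended to many log shards, so ordering can be handled separately from storage. The toy below is our own conceptual illustration of that separation in a single process; it is not Boki's actual API or implementation.

```python
import itertools

class MetaLog:
    """Toy total-order service: assigns global sequence numbers."""
    def __init__(self):
        self._seq = itertools.count()
        self.entries = []                 # (seqnum, shard_id, pos)

    def order(self, shard_id, pos):
        n = next(self._seq)
        self.entries.append((n, shard_id, pos))
        return n

class SharedLog:
    """Records live in per-shard storage; order comes only from the metalog."""
    def __init__(self, metalog, n_shards=2):
        self.metalog, self.shards = metalog, [[] for _ in range(n_shards)]

    def append(self, shard_id, record):
        self.shards[shard_id].append(record)
        return self.metalog.order(shard_id, len(self.shards[shard_id]) - 1)

    def read_prefix(self, upto_seq):
        """Consistent read: all records ordered at or below upto_seq."""
        return [self.shards[s][p] for n, s, p in sorted(self.metalog.entries)
                if n <= upto_seq]

log = SharedLog(MetaLog())
log.append(0, "txn-begin"); log.append(1, "write x=1"); log.append(0, "txn-commit")
print(log.read_prefix(2))   # every reader sees the same prefix order
```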
11. Visualizing the assumptions of network meta‐analysis.
- Author
-
Tu, Yu‐Kang, Lai, Pei‐Chun, Huang, Yen‐Ta, and Hodges, James
- Subjects
- *SEPTIC shock, *STEROID drugs, *HOMOGENEITY
- Abstract
Network meta‐analysis (NMA) incorporates all available evidence into a general statistical framework for comparing multiple treatments. Standard NMAs make three major assumptions, namely homogeneity, similarity, and consistency, and violating these assumptions threatens an NMA's validity. In this article, we suggest a graphical approach to assessing these assumptions and distinguishing between qualitative and quantitative versions of these assumptions. In our plot, the absolute effect of each treatment arm is plotted against the level of effect modifiers, and the three assumptions of NMA can then be visually evaluated. We use four hypothetical scenarios to show how violating these assumptions can lead to different consequences and difficulties in interpreting an NMA. We present an example of an NMA evaluating steroid use to treat septic shock patients to demonstrate how to use our graphical approach to assess an NMA's assumptions and how this approach can help with interpreting the results. We also show that all three assumptions of NMA can be summarized as an exchangeability assumption. Finally, we discuss how reporting of NMAs can be improved to increase transparency of the analysis and interpretability of the results. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
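Entry 11's graphical check plots each treatment arm's absolute effect against the level of an effect modifier, so departures from homogeneity, similarity, and consistency show up as non-parallel or non-overlapping lines. A minimal matplotlib version of such a plot, with entirely made-up arm-level data, might look like this:

```python
import matplotlib.pyplot as plt

# Hypothetical arm-level summaries: effect-modifier level vs absolute effect.
arms = {
    "Treatment A": ([30, 40, 55, 65], [0.42, 0.45, 0.43, 0.46]),
    "Treatment B": ([32, 44, 58],     [0.35, 0.33, 0.36]),
    "Placebo":     ([28, 41, 52, 63], [0.22, 0.24, 0.21, 0.23]),
}

fig, ax = plt.subplots()
for name, (modifier, effect) in arms.items():
    ax.plot(modifier, effect, marker="o", label=name)
ax.set_xlabel("Effect modifier (e.g., mean age of trial arm)")
ax.set_ylabel("Absolute effect (e.g., event rate)")
ax.legend()
ax.set_title("Arm-level effects vs. modifier: roughly parallel lines over\n"
             "overlapping ranges support the three NMA assumptions")
plt.show()
```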
12. Who is in focus? A scoping review of themes and consistency in inclusive education for all.
- Author
-
Navrátilová, Jana, Svojanovský, Petr, Obrovská, Jana, Kratochvílová, Jana, Lojdová, Kateřina, and Plch, Lukáš
- Subjects
- *INCLUSIVE education, *RESEARCH personnel, *RESEARCH methodology, *EDUCATION research, *DISABILITIES
- Abstract
This review addresses the current knowledge on inclusive education, suggesting that many studies focus on predetermined groups of disadvantaged learners. This review aims 1) to map the research on inclusive education for all (IEFA) and 2) to assess the consistency of studies in their approach to inclusive education (IE), research methodology and research results/discussions regarding the concept of IEFA. The analysis of 26 studies (out of 2,780 original datasets) revealed two main results. First, the studies represent three overarching themes of inclusion – values, practices and school development – and illustrate that beliefs about inclusive education and beliefs about practices are not always aligned. Second, the review highlights the challenges researchers face in describing a student population that includes representatives with diverse needs. While some researchers aim to describe students in detail, they often end up focusing only on those with common characteristics, usually disability. Consequently, the sample sections do not reflect the actual diversity of students addressed by IEFA. Only 46% of the studies are consistent in their approach, meaning they conceptualise the broader concept of inclusion in their theoretical approach, focus on it in their methodology and interpret it in the results/discussions. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
13. Some novel exact and non-standard finite difference schemes for a class of diffusion–advection–reaction equation.
- Author
-
Kayenat, Sheerin and Kumar Verma, Amit
- Subjects
- *FINITE differences, *CLASS differences, *OPTIMISM, *EQUATIONS
- Abstract
In this paper, a generalized diffusion–advection–reaction equation (GDARE) is considered, and by using the solitary wave solution two new classes of exact finite difference (EFD) schemes are proposed under a certain functional relationship between the temporal and spatial step sizes. Suitable denominator functions are identified from these EFD schemes and a new non-standard finite difference (NSFD) scheme is developed. This NSFD scheme preserves the positivity and boundedness properties of the solution of the GDARE. The $L_2$, $L_\infty$ errors and CPU times are reported for the proposed NSFD and EFD schemes to show the proficiency and accuracy of the schemes. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
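For readers unfamiliar with NSFD constructions like those in entry 13, the sketch below shows a Mickens-type scheme for a Fisher-type reaction–diffusion equation u_t = D u_xx + r u(1 − u), a simple member of the diffusion–advection–reaction family. The denominator function φ(Δt) = (e^{rΔt} − 1)/r and the nonlocal treatment of the nonlinear term are standard NSFD devices that keep the update positive and bounded; this is our illustrative scheme, not the paper's.

```python
import numpy as np

def nsfd_fisher(u0, D, r, dx, dt, steps):
    """Positivity-preserving Mickens-type NSFD for u_t = D u_xx + r u(1-u)."""
    phi = (np.exp(r * dt) - 1.0) / r        # denominator function for time
    mu = D * phi / dx ** 2
    # Conditions keeping the update in [0, 1] for 0 <= u0 <= 1.
    assert mu <= 0.5 and 1.0 - 2.0 * mu + r * phi >= 0.0, "shrink dt or grow dx"
    u = u0.copy()
    for _ in range(steps):
        lap = np.roll(u, 1) + np.roll(u, -1)      # periodic neighbors
        # (u^{n+1}-u^n)/phi = D*(u_{j-1}-2u_j+u_{j+1})/dx^2 + r u^n - r u^{n+1} u^n
        u = (u * (1.0 - 2.0 * mu + r * phi) + mu * lap) / (1.0 + r * phi * u)
    return u

x = np.linspace(0, 1, 101)
u0 = 0.5 + 0.4 * np.sin(2 * np.pi * x)
u = nsfd_fisher(u0, D=0.01, r=1.0, dx=x[1] - x[0], dt=0.002, steps=500)
print(u.min(), u.max())   # stays positive and bounded by 1
```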
14. Nonparametric estimation of P(X<Y) from noisy data samples with non-standard error distributions.
- Author
-
Phuong, Cao Xuan and Thuy, Le Thi Hong
- Subjects
- *RANDOM variables, *LEBESGUE measure, *NONPARAMETRIC estimation, *MEASUREMENT errors, *SAMPLING errors
- Abstract
Let X, Y be continuous random variables with unknown distributions. The aim of this paper is to study the problem of estimating the probability θ := P(X < Y) based on independent random samples from the distributions of X′, Y′, ζ and η, where X′ = X + ζ, Y′ = Y + η and X, Y, ζ, η are mutually independent random variables. In this context, ζ, η are referred to as measurement errors. We apply the ridge-parameter regularization method to derive a nonparametric estimator for θ depending on two parameters. Our estimator is shown to be consistent with respect to mean squared error if the characteristic functions of ζ, η vanish only on sets of Lebesgue measure zero. Under some further assumptions on the densities of X, Y, ζ and η, we obtain upper and lower bounds on the convergence rate of the estimator. A numerical example is also given to illustrate the efficiency of our method. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
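Entry 14 estimates θ = P(X < Y) from error-contaminated samples by ridge-regularized Fourier deconvolution. The paper's exact estimator is not given in the abstract; the sketch below is our own simplified variant under the assumption that both errors ζ, η are N(0, s²) with known s. It deconvolves the density of D = Y − X from the cross-differences Y′_j − X′_i and integrates over the positive half-line.

```python
import numpy as np

rng = np.random.default_rng(1)
n, s = 1000, 0.4
X = rng.normal(0.0, 1.0, n)
Y = rng.normal(0.5, 1.0, n)
Xp, Yp = X + rng.normal(0, s, n), Y + rng.normal(0, s, n)   # observed samples

# Empirical cf of D' = Y' - X' factorizes into the two marginal empirical cfs.
t = np.linspace(-20, 20, 2001)
cf_Dp = (np.exp(1j * np.outer(t, Yp)).mean(axis=1)
         * np.exp(-1j * np.outer(t, Xp)).mean(axis=1))
cf_err = np.exp(-s**2 * t**2)           # cf of eta - zeta, both N(0, s^2)

rho = 1e-3                              # ridge parameter: tames 1/cf_err blow-up
cf_D = cf_Dp * cf_err / (cf_err**2 + rho)

# Invert on a grid and integrate f_D over (0, inf) to get theta = P(D > 0).
x = np.linspace(-6, 6, 1201)
dt_ = t[1] - t[0]
f_D = (np.exp(-1j * np.outer(x, t)) @ cf_D).real * dt_ / (2 * np.pi)
f_D = np.clip(f_D, 0, None)
theta_hat = f_D[x > 0].sum() / f_D.sum()   # renormalized Riemann sum
print(theta_hat)   # true theta = P(N(0.5, 2) > 0) ~ 0.638
```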
15. Error estimates for completely discrete FEM in energy‐type and weaker norms.
- Author
-
Angermann, Lutz, Knabner, Peter, and Rupp, Andreas
- Subjects
- *BOUNDARY value problems, *GALERKIN methods, *LINEAR equations, *CONFORMITY
- Abstract
The paper presents error estimates within a unified abstract framework for the analysis of FEM for boundary value problems with linear diffusion‐convection‐reaction equations and boundary conditions of mixed type. Since neither conformity nor consistency properties are assumed, the method is called completely discrete. We investigate two different stabilized discretizations and obtain stability and optimal error estimates in energy‐type norms and, by generalizing the Aubin‐Nitsche technique, optimal error estimates in weaker norms. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
16. Why advertisers should embrace event typicality and maximize leveraging of major events.
- Author
-
Carrillat, François A., Mazodier, Marc, and Eckert, Christine
- Abstract
The current study details how marketing campaigns featuring event-typical ads adapted to sporting events (e.g., a car ad that displays its brand logo on an Olympic podium) affect brand attitudes and incentive-aligned brand choice in more positive ways than proven advertising strategies such as product category consistency. Presenting four field and lab experiments across a total of 3 events and 32 ads, we show that these effects are driven by the combination of 3 mechanisms: event-typical ads' capacity to trigger a sufficient feeling of knowing what the ad is about, provoke curiosity, and transfer attributes from the event to the brand, even with very short ad exposures. Advertisers, brand managers, or event organizers can thus exploit the creative potential around sporting events by using event-typical ads. Furthermore, when these stakeholders know the most typical elements of an event, they can either adapt their marketing activities or register them to avoid ambush marketing (i.e., advertisers willing to associate their brand with the event in the absence of any legitimate link with it). [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
17. Tensor low-rank representation combined with consistency and diversity exploration.
- Author
-
Kan, Yaozu, Lu, Gui-Fu, Ji, Guangyan, and Du, Yangfan
- Abstract
In recent years, many tensor data processing methods have been proposed. Tensor low-rank representation (TLRR) is a recently proposed tensor-based clustering method that has shown good clustering performance in some applications. However, TLRR does not make full use of the consistency and diversity information hidden in different similarity matrices. Therefore, we propose the TLRR combined with consistency and diversity exploration (TLRR-CD) method. First, the tensor Frobenius norm and tensor product (t-product), which is defined as the multiplication of two tensors, are used to obtain the low-rank representation tensor, which can be seen as being composed of many similarity matrices. Second, the low-rank representation tensor is further decomposed into a consistent tensor, which contains the common structural information contained in the different similarity matrices, and a diversity tensor, which contains the locally specific structural information of different similarity matrices. Finally, the Hilbert–Schmidt Independence Criterion (HSIC), which is used to measure the relevance of local specific structural information, and spectral clustering are unified into the final objective function to improve clustering performance. In addition, the optimization process of TLRR-CD is also given. The experimental results show the good performance of TLRR-CD. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
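Entry 17 builds on the tensor t-product, which multiplies two third-order tensors slice-by-slice in the Fourier domain along the third mode. That definition is standard (Kilmer and Martin), and a compact implementation looks like this:

```python
import numpy as np

def t_product(A, B):
    """t-product of tensors A (n1 x n2 x n3) and B (n2 x n4 x n3)."""
    assert A.shape[1] == B.shape[0] and A.shape[2] == B.shape[2]
    Af = np.fft.fft(A, axis=2)          # DFT along the third mode
    Bf = np.fft.fft(B, axis=2)
    Cf = np.einsum("ijk,jlk->ilk", Af, Bf)   # frontal-slice matrix products
    return np.real(np.fft.ifft(Cf, axis=2))

A = np.random.rand(4, 3, 5)
B = np.random.rand(3, 2, 5)
C = t_product(A, B)
print(C.shape)   # (4, 2, 5)
```

In TLRR-style methods, the low-rank representation tensor produced with such products can be viewed as a stack of similarity matrices, which TLRR-CD then splits into a consistent part and a diverse part.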
18. A Dynamical Study of Modeling the Transmission of Typhoid Fever through Delayed Strategies.
- Author
-
Tashfeen, Muhammad, Dayan, Fazal, Ur Rehman, Muhammad Aziz, Abdeljawad, Thabet, and Mukheimer, Aiman
- Subjects
- TYPHOID fever, FINITE difference method, FINITE differences, RUNGE-Kutta formulas, EULER method
- Abstract
This study analyzes the transmission of typhoid fever caused by Salmonella typhi using a mathematical model that highlights the significance of delay in its effectiveness. Time delays can affect the nature of patterns and slow down the emergence of patterns in the infected population density. The analyzed model is expanded with equilibrium analysis, the reproduction number, and stability analysis. This study aims to establish and explore a non-standard finite difference (NSFD) scheme for the typhoid fever transmission model with a time delay. In addition, the forward Euler method and the fourth-order Runge-Kutta method (RK-4) are also applied in the present research. Some significant properties, such as convergence, positivity, boundedness, and consistency, are explored, and the proposed scheme preserves all the mentioned properties. Theoretical validation is conducted on how NSFD outperforms the other methods in emulating key aspects of the continuous model, such as positivity of the solution, stability, and equilibrium with respect to the delay. The analysis also exposes some limitations of conventional finite difference methods, such as forward Euler and RK-4, in simulating such critical behaviors; this becomes more apparent when larger step sizes are used. This indicates that NSFD identifies the essential characteristics of the continuous model with higher accuracy than the traditional approaches. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
19. Assessing Consistency of Respondent-driven Sampling Estimators by Using Repeated Surveys among People Who Inject Drugs (PWID) in New Jersey.
- Author
-
Wang, Peng, Wogayehu, Afework, Bolden, Barbara, Ibrahim, Abdel R., and Raymond, Henry F.
- Subjects
- HIV infection epidemiology, CROSS-sectional method, REPEATED measures design, INTRAVENOUS drug abuse, HUMAN sexuality, DESCRIPTIVE statistics, PSYCHOLOGY of drug abusers, SEX customs, LONGITUDINAL method
- Abstract
Respondent-driven sampling (RDS) is widely used to sample populations at higher risk of HIV infection for whom no sampling frames exist. However, few studies have assessed the consistency of RDS estimators in real-world settings. We conducted an assessment of the consistency of RDS estimators using data from the National HIV Behavioral Surveillance – People Who Inject Drugs surveys in Newark, New Jersey from 2005 to 2018. Population parameter estimates based on RDS-I, RDS-II, Gile's SS, and HCG were compared longitudinally and cross-sectionally. Population homophily statistics and differential recruitment statistics were estimated and compared. Convergence plots were used for RDS diagnostics. Sensitivity analyses were conducted on population size estimates and seed biases. By comparing time-insensitive population parameters and population homophily statistics estimated by the four RDS estimators, the study found that RDS-II and Gile's SS could provide longitudinally and cross-sectionally consistent estimates and population homophily statistics on gender and sexual orientation. Cross-sectional comparison of time-sensitive population parameter estimates also supported the consistency of RDS-II and Gile's SS. However, RDS-I and HCG did not perform well in those comparisons. In conclusion, RDS estimators may not address all inconsistencies, but RDS-II and Gile's SS are recommended for weighting RDS samples given the consistency observed in them. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
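Of the estimators compared in entry 19, RDS-II (Volz–Heckathorn) is the simplest to state: it weights each respondent inversely to their reported network degree. A sketch, with fabricated data:

```python
import numpy as np

def rds_ii(y, degree):
    """RDS-II (Volz-Heckathorn) estimate of a population proportion.

    y      : 0/1 trait indicator per respondent
    degree : reported personal network size per respondent
    """
    w = 1.0 / np.asarray(degree, dtype=float)
    return np.sum(w * y) / np.sum(w)

# Fabricated respondents: trait indicator and network degree.
y      = np.array([1, 0, 1, 1, 0, 0, 1, 0])
degree = np.array([5, 20, 8, 4, 30, 25, 6, 15])

print(rds_ii(y, degree))   # down-weights high-degree (over-sampled) respondents
```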
20. A Global C-Staggered Composite Model for Shallow Water Equations with Latitude–Longitude Grid and Reductions in the Polar Regions.
- Author
-
Lima, Genilson S.
- Subjects
- FINITE differences, SHALLOW-water equations, CONSERVATION of mass, FLUIDS, EQUATIONS
- Abstract
To develop a numerical method for global geophysical fluids, we usually need to choose a spherical grid and numerical approximations to represent the partial differential equations. Some alternatives include the use of finite differences or finite volumes with latitude–longitude or reduced grids. Each of these choices has some advantages and also some limitations. This paper presents a comparison between two methods and describes a composite model using them side by side. The first is a well-known method for latitude–longitude grids and was used from 75°S to 75°N. The second is a recently developed scheme for reduced grids and was used only in the polar regions. The similarity between the two methods allows the use of small adaptations in their approximations to obtain consistency and mass conservation also in the transition between the two regions. The composite model combines advantages of the other two schemes and has a smaller computational cost. Numerical tests indicated order 2 convergence, prevention of grid-imprinting errors, and avoidance of nonlinear instability. This model has numerical properties that may lead to efficient implementations with massive parallel computation. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
21. The Reliability of Specific Physical Fitness Assessments in Elite Female Chinese Wrestlers.
- Author
-
Bai, Yinchuan, Xu, Naidan, Li, Xiangchen, and Shen, Yupeng
- Subjects
- ELITE athletes, INTRACLASS correlation, COMBAT sports, PHYSICAL fitness, TEST reliability
- Abstract
The aim of this study was to evaluate the reliability of eight specific fitness tests for elite female Chinese wrestlers. Twenty-eight elite female wrestlers participated in the study (age: 26.9 ± 2.81 years). The reliability of the tests was analyzed using the intraclass correlation coefficient (ICC) and the 95% confidence interval (CI), the coefficient of variation (CV), and other metrics. The 30-s Sit-Up (SU30) and 6-m Rope Climb (RC6m) tests showed excellent reliability (ICC > 0.9). The 30-s Dummy Throw (DT30) had good to excellent reliability, while the 30-s Bridge-Return (B-R30) showed moderate to good reliability. The 30-s Burpee (BUR30), 15-s Leg Attack (LA15), 15-s Leg Defense (LD15), and Dummy Suplex and Gut Wrench (DS&GW) tests ranged from poor to good reliability. SU30, DT30, LA15, and RC6m tests displayed low variability (CV < 5%), while others exhibited moderate variability. SU30, B-R30, DT30, and RC6m tests are reliable for assessing wrestling fitness. However, BUR30 and LA15 tests showed high variability and should be used carefully. LD15 and DS&GW tests are not recommended for assessing fitness in elite female wrestlers. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
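The reliability metrics in entry 21 are computable from a subjects × trials score matrix. The sketch below computes a two-way random-effects single-measure ICC(2,1) plus CV%, SEM and MDC95 from fabricated test-retest data; the study's exact ICC model is not stated in the abstract, so the choice of ICC(2,1) is our assumption.

```python
import numpy as np

def icc_2_1(scores):
    """ICC(2,1) from an n_subjects x k_trials matrix (two-way random effects)."""
    n, k = scores.shape
    grand = scores.mean()
    ms_r = k * ((scores.mean(axis=1) - grand) ** 2).sum() / (n - 1)   # subjects
    ms_c = n * ((scores.mean(axis=0) - grand) ** 2).sum() / (k - 1)   # trials
    sse = ((scores - scores.mean(axis=1, keepdims=True)
            - scores.mean(axis=0, keepdims=True) + grand) ** 2).sum()
    ms_e = sse / ((n - 1) * (k - 1))
    return (ms_r - ms_e) / (ms_r + (k - 1) * ms_e + k * (ms_c - ms_e) / n)

rng = np.random.default_rng(3)
true = rng.normal(20, 3, 28)                           # 28 athletes, e.g. SU30 reps
scores = true[:, None] + rng.normal(0, 1.0, (28, 2))   # two trials with noise

icc = icc_2_1(scores)
sd = scores.std(ddof=1)
sem = sd * np.sqrt(1 - icc)                   # standard error of measurement
mdc95 = 1.96 * np.sqrt(2) * sem               # minimal detectable change
cv = 100 * (scores.std(axis=1, ddof=1) / scores.mean(axis=1)).mean()
print(f"ICC={icc:.2f}  SEM={sem:.2f}  MDC95={mdc95:.2f}  CV%={cv:.1f}")
```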
22. Local inconsistency detection using the Kullback–Leibler divergence measure.
- Author
-
Spineli, Loukia M.
- Subjects
- *LEGAL evidence, *STANDARD deviations, *DENSITY
- Abstract
Background: The standard approach to local inconsistency assessment typically relies on testing the conflict between the direct and indirect evidence in selected treatment comparisons. However, statistical tests for inconsistency have low power and are subject to misinterpreting a p-value above the significance threshold as evidence of consistency. Methods: We propose a simple framework to interpret local inconsistency based on the average Kullback–Leibler divergence (KLD) from approximating the direct with the corresponding indirect estimate and vice versa. Our framework uses directly the mean and standard error (or posterior mean and standard deviation) of the direct and indirect estimates obtained from a local inconsistency method to calculate the average KLD measure for selected comparisons. The average KLD values are compared with a semi-objective threshold to judge the inconsistency as acceptably low or material. We exemplify our novel interpretation approach using three networks with multiple treatments and multi-arm studies. Results: Almost all selected comparisons in the networks were not associated with statistically significant inconsistency at a significance level of 5%. The proposed interpretation framework indicated 14%, 66%, and 75% of the selected comparisons with an acceptably low inconsistency in the corresponding networks. Overall, information loss was more notable when approximating the posterior density of the indirect estimates with that of the direct estimates, attributed to indirect estimates being more imprecise. Conclusions: Using the concept of information loss between two distributions alongside a semi-objectively defined threshold helped distinguish target comparisons with acceptably low inconsistency from those with material inconsistency when statistical tests for inconsistency were inconclusive. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
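When the direct and indirect estimates in entry 22 are summarized as normal distributions, the Kullback–Leibler divergence has a closed form, and the paper's average KLD is the mean of the two approximation directions. A sketch with made-up estimates and a placeholder threshold:

```python
import numpy as np

def kld_normal(m1, s1, m2, s2):
    """KL( N(m1, s1^2) || N(m2, s2^2) ), closed form."""
    return np.log(s2 / s1) + (s1**2 + (m1 - m2)**2) / (2 * s2**2) - 0.5

def average_kld(direct, indirect):
    """Symmetrized information loss between direct and indirect estimates."""
    (md, sd), (mi, si) = direct, indirect
    return 0.5 * (kld_normal(md, sd, mi, si) + kld_normal(mi, si, md, sd))

# Hypothetical log-odds-ratio estimates (mean, standard error) for one comparison.
direct, indirect = (-0.30, 0.15), (-0.05, 0.40)

akld = average_kld(direct, indirect)
THRESHOLD = 0.64   # placeholder; the paper derives its own semi-objective cut-off
print(f"average KLD = {akld:.3f} ->",
      "acceptably low" if akld < THRESHOLD else "material inconsistency")
```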
23. Testing Mortars for 3D Printing: Correlation with Rheological Behavior.
- Author
-
Bao, Ta Minh Phuong, Yeakleang, Muy, Abdelouhab, Sandra, and Courard, Luc
- Subjects
- *RHEOLOGY, *ARCHITECTURAL design, *YIELD stress, *ARCHITECTURAL designs, *SHEARING force
- Abstract
Three-dimensionally printed concrete is a transformative technology that addresses housing shortages due to population growth and enables innovative architectural designs. The objective of this study is to investigate the connection between conventional tests and the rheological properties of 3D-printed concrete. A more precise assessment of material quality based on traditional evaluation techniques is proposed. Standard tests are conducted to evaluate the consistency of 3D-printed concrete materials. Complementarily, a rheometer is employed to accurately measure key rheological properties, thereby establishing a link with empirical testing methodologies. The correlation between the flow table test and rheological coefficients, such as yield stress and viscosity, has been identified as the most effective among basic experiments for evaluating material behavior. This approach allows for a preliminary assessment of printability without the need for additional complex equipment. The study has successfully established a relationship between flow table tests and rheological parameters. However, further research involving a broader range of materials and print-test experiments is essential to enhance the correlation between other conventional testing methods and rheometer results. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
24. An Evaluation of Sentinel-3 SYN VGT Products in Comparison to the SPOT/VEGETATION and PROBA-V Archives.
- Author
-
Toté, Carolien, Swinnen, Else, and Henocq, Claire
- Subjects
- *DATA libraries, *TIME series analysis, *PRODUCT quality, *PRODUCT design, *REFLECTANCE
- Abstract
Sentinel-3 synergy (SYN) VEGETATION (VGT) products were designed to provide continuity to the SPOT/VEGETATION (SPOT VGT) base products archive. Since the PROBA-V mission acted as a gap filler between SPOT VGT and Sentinel-3, and, in principle, a continuous series of data products from the combined data archives of SPOT VGT (1998–2014), PROBA-V (2013–2020) and Sentinel-3 SYN VGT (from 2018 onwards) is available to users, the consistency of Sentinel-3 SYN VGT with both the latest SPOT VGT (VGT-C3) and PROBA-V (PV-C2) archives is highly relevant. In recent years, important changes have been implemented in the SYN VGT processing baseline. The archive of SYN VGT products is therefore intrinsically inconsistent, leading to different consistency levels with SPOT VGT and PROBA-V throughout the years. A spatio-temporal intercomparison of the combined time series of VGT-C3, PV-C2 and Sentinel-3 SYN VGT 10-day NDVI composite products with an external reference from LSA-SAF, and an intercomparison of Sentinel-3 SYN V10 products with climatologies of VGT-C3 and PV-C2, respectively, for three distinct periods with different levels of product quality, have shown that the subsequent processing baseline updates have indeed resulted in better-quality products. It is therefore essential to reprocess the entire Sentinel-3 SYN VGT archive; a uniform data record of standard SPOT VGT, PROBA-V and Sentinel-3 SYN VGT products, spanning over 25 years, would provide valuable input for a wide range of applications. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
25. On a nonparametric estimation of ℙ(X<Y<Z) in the presence of measurement errors.
- Author
-
Trang, Bui Thuy and Phuong, Cao Xuan
- Subjects
- *CONTINUOUS distributions, *NONPARAMETRIC estimation, *RANDOM variables, *SEPARATION of variables, *INDEPENDENT variables
- Abstract
Let θ = P(X < Y < Z), where X, Y and Z are independent random variables, with known constants μX, μZ ∈ ℝ and σ > 0. Suppose we observe a random sample Y1′, …, Yn′ from the distribution of Y′ = Y + ε. Here ε is a random error with a known continuous distribution. Our aim is to estimate θ based on the sample as well as on complete knowledge of the distributions of X, Z and ε. We propose a nonparametric estimator of θ by applying the Fourier deconvolution method. Our estimator is shown to be consistent with respect to the mean squared error and strongly consistent. Under some regularity assumptions on the densities of Y and ε, some error estimates of the proposed estimator are derived. A simulation study is also conducted to illustrate the effectiveness of our method. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
26. Consistency-oriented clustering ensemble via data reconstruction.
- Author
-
Zhang, Hengshan, Wang, Yun, Chen, Yanping, and Sun, Jiaze
- Subjects
- PRIOR learning, DEFINITIONS
- Abstract
The study highlights that using different distance measures on the same dataset leads to varying clustering results, making the choice of distance measure a challenge when prior knowledge is lacking. To address this issue, a consistency-oriented clustering ensemble via data reconstruction is developed. This approach eliminates the need to select a specific distance measure and achieves higher consistency between the clustering ensemble and the base clusterings while maintaining superior clustering performance. First, the base clusterings are generated via clustering with different distance measures, and a consistency definition is introduced in the proposed method. Then the ensemble process updates the weights of the base clusterings to ensure they reach consistency. At the same time, a data reconstruction process is integrated into the ensemble process to guarantee a high convergence rate and efficient clustering. Finally, the clustering ensemble result is achieved with a higher consistency measure and improved clustering performance by balancing both factors. In the experiments, the effectiveness of the proposed method is verified and guidance on parameter settings is given through the various experimental outcomes. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
27. Over-specification of small, borderline cardinalities and color in referential communication: the role of visual context, modifier position, and consistency.
- Author
-
Zevakhina, Natalia A., Dongarova, Kseniya N., Shubina, Daria, and Popova, Daria P.
- Subjects
- VISUAL communication, NUMERALS, ADJECTIVES (Grammar), PERCENTILES, COLOR
- Abstract
This paper reports on two flash-mode experiments that test redundant descriptions of small (2-4) cardinalities, borderline (5-8) cardinalities, and color in referential communication. It provides further support for the idea that small cardinalities are more salient (due to subitizing), less sensitive to visual context, and therefore give rise to higher over-specification rates than color. Because of greater salience, Russian speakers more often use prenominal positions for numerals than for color adjectives. The paper also investigates borderline cardinalities and argues for the order factor that affects their salience, since ordered items can be perceived in small subitized parts. The ordered mode of presentation of the borderline cardinalities leads to higher over-specification rates and to higher percentages of prenominal positions than the unordered one. The paper provides further evidence for the consistency of small, borderline cardinalities, and color in people's choices to minimally specify or over-specify given objects in referential communication. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
28. Correlation between cone resistance with standard penetration value for predicting consistency of cohesive soil in Eastern India.
- Author
-
Nandi, Saptarshi, Basu, Dipanjan, Bandyopadhyay, Kaushik, and Shiuly, Amit
- Subjects
- CONE penetration tests, SOIL classification, SUBSOILS, GEOTECHNICAL engineering, SILT
- Abstract
In the geotechnical engineering field, the rapid pace of urbanization triggers the need for direct measurement of sub-soil parameters through in situ testing with instant results. At this juncture, the cone penetration test (CPT) is selected for this study. Here, an attempt is made to develop a correlation between cone penetration resistance (qc) and standard penetration blow count (SPT N) in order to predict a reference range of qc for cohesive (silty clay/clayey silt) sub-soil of different SPT-based consistencies. In this context, 25 CPTs were conducted adjacent to conventional boreholes accompanied by SPT tests at eight important locations in West Bengal (WB) and Odisha (OR), India, focusing on infrastructure development. Primarily, the sub-soil is characterized by bulk unit weight (γ) along with the soil behavior type index (Ic) estimated from the CPT and compared with the sub-soil profile identified from conventional boreholes. Further, a comparison of qc with SPT N is made to establish a correlation, which is also tallied against earlier correlations established for different regions. This study quantitatively establishes a quadratic correlation (R² = 0.84) between qc and SPT N, which is found to be in good agreement with the previous correlations. Overall, the key finding of this study, i.e., the predicted range of qc, provides a reliable method for assessing the consistency of cohesive sub-soil from qc. However, this correlation is limited to soft to very stiff silty clay/clayey silt sub-soil formations. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
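The quadratic qc–SPT N correlation in entry 28 is an ordinary degree-2 least-squares fit; the data points below are fabricated stand-ins, since the abstract reports only R² = 0.84.

```python
import numpy as np

# Fabricated paired measurements: SPT blow count N and cone resistance qc (MPa).
N  = np.array([2, 4, 5, 8, 10, 13, 15, 18, 22, 25, 28, 30])
qc = np.array([0.4, 0.7, 0.9, 1.5, 1.9, 2.6, 3.1, 3.9, 5.0, 5.9, 7.0, 7.8])

coeffs = np.polyfit(N, qc, deg=2)          # quadratic correlation qc = f(N)
pred = np.polyval(coeffs, N)
r2 = 1 - np.sum((qc - pred) ** 2) / np.sum((qc - qc.mean()) ** 2)

print("qc = {:.4f} N^2 + {:.4f} N + {:.4f},  R^2 = {:.2f}".format(*coeffs, r2))
# A consistency chart would then map SPT-based classes (soft ... very stiff)
# to the predicted qc range via this curve.
```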
29. Clinical application of a novel flexible patch-type ECG device.
- Author
-
陆艺蓓, 金冬霞, 宋振春, 马昊原, 李彦, 郝天旭, and 李曦铭
- Abstract
Objective: To explore the diagnostic value of a wearable flexible patch ECG device in arrhythmia, the alarm situation during clinical application, and patient satisfaction and safety. Methods: A total of 1,443 subjects wore a flexible patch ECG and a conventional dynamic ECG (control) for 24 h to test the validity and consistency of arrhythmia diagnosis; remote ECG alarms and related adverse events during wearing were recorded. Results: There were 987 cases of arrhythmia detected by the flexible patch ECG and 992 cases by the conventional dynamic ECG; the overall coincidence rate of arrhythmia diagnosis was 98.7%. The mean heart rate was (75.4 ± 11.4) beats/min by flexible patch ECG and (71.5 ± 12.1) beats/min by conventional dynamic ECG; the intraclass correlation coefficient (ICC) between the two devices was 0.892 (95% CI: 0.537–0.956), indicating good repeatability. The correct alarm rate of the flexible patch ECG was 100%. The incidences of skin pruritus (0.28% vs. 1.32%), skin allergy, redness and swelling (0.14% vs. 0.69%) and electrode strip shedding (0 vs. 0.28%) while wearing the flexible patch ECG were lower than those of the conventional Holter ECG (P < 0.05). Conclusion: The flexible patch ECG has few adverse reactions, high comfort, good safety and clinical applicability. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
30. Increasing Movement Amplitude in Speeded Hitting Enhances Contact Velocity Without Affecting Directional Accuracy or Movement Variability.
- Author
-
Okazaki, Victor Hugo Alves and Teixeira, Luis Augusto
- Subjects
- *LINEAR velocity, *ANGULAR velocity, *BATTING (Baseball), *ACCELERATION (Mechanics), *LIFTING & carrying (Human mechanics)
- Abstract
Performance of sport-related ballistic motor skills, like ball hitting in golf and baseball, requires wide movements to produce very fast yet spatially accurate strikes. In this study, we assessed the effect of movement amplitude on directional accuracy in a ballistic hitting task. Participants performed the task of moving a manual handle over a flat surface to hit a moveable disc at high speed, aiming to propel it towards a frontal target. Five movement amplitudes were compared, ranging from 11.5 cm to 27.5 cm in steps of 4 cm. Kinematic analysis evaluated motions of the handle, disc, and arm joints. Results showed that greater movement amplitudes led to longer acceleration phases, with delayed peak velocities at the handle, shoulder and elbow, leading to higher contact and peak linear velocities of the handle, and higher angular velocities at the shoulder and elbow. Manipulation of movement amplitude produced no evidence of effects on either disc directional accuracy or variability. Results also revealed no evidence of differences in the variability of contact velocity, peak velocity and time of peak velocity across movement amplitudes in the shoulder, elbow, and wrist. Our results indicate that greater movement amplitudes in hitting a spatial target lead to increased contact velocity while not affecting directional accuracy or movement variability. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
31. Do schools that employ an inspector get better inspection grades?
- Author
-
Bokhove, Christian, Jerrim, John, and Sims, Sam
- Subjects
- *PRIMARY schools, *SECONDARY schools, *SCHOOL inspections (Educational quality), *PORTFOLIO management (Investments)
- Abstract
In England, a substantial proportion of school inspections are conducted by current school leaders. This may lead to concerns that it gives their schools (about 2% of schools) an advantage in the inspection process when it is their turn to be inspected. Yet scant evidence exists on this issue. This paper thus presents the first evidence on this matter, using data obtained via a freedom of information request and linking this with other publicly available information about England's schools. We find that schools where a member of staff also works for Ofsted receive better inspection outcomes than schools without an inspector on their payroll. Our findings nevertheless suggest that other schools may benefit from having access to the training material and professional development opportunities Ofsted provides to its inspectors. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
32. INVESTIGATING THE EFFECTS OF MICRO MONTMORILLONITE ON CEMENT MORTAR PHYSICAL PROPERTIES.
- Author
-
Moyet, Aous Abdulhussain, Owaid, Khalid M., and Raouf, Raouf Mahmood
- Subjects
- *FLEXURAL strength, *COMPRESSIVE strength, *MONTMORILLONITE, *CEMENT, *CONCRETE, *PORTLAND cement
- Abstract
The effects of micro-sized montmorillonite clay (MMT) on Portland cement concrete were examined in this study, focusing on regular and calcined MMT as partial cement replacements. Key properties assessed include consistency, setting time, flexural strength, and compressive strength at 7 and 28 days. Regular MMT initially acts as an inert filler, potentially reducing compressive strength, while calcined MMT improves compatibility and strength. Both types extend the setting time, with regular MMT causing more delay. The study highlights that the regular MMT increases water demand and setting time, whereas calcined MMT enhances strength and mitigates these issues. The results emphasize the importance of carefully choosing MMT type and dosage. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
33. Cost allocation and airport problems.
- Author
-
Thomson, William
- Subjects
- *COST allocation, *NUCLEOLUS, *OPEN-ended questions, *AIRPORTS, *COST
- Abstract
We consider the problem of dividing the cost of a facility when agents can be ordered in terms of the needs they have for it, and accommodating an agent with a certain need allows accommodating all agents with lower needs at no extra cost. This problem is known as the "airport problem", the facility being the runway. We review the literature devoted to its study, and formulate a number of open questions. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
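For entry 33, the best-known rule for the airport problem is the sequential equal contributions rule, which coincides with the Shapley value of the associated cost game: each runway segment's incremental cost is split equally among all agents who need that segment or a longer one. A minimal implementation:

```python
def sequential_equal_contributions(costs):
    """Shapley-value allocation for an airport problem.

    costs: per-agent cost of the facility each agent needs, in any order.
    Returns the payment of each agent, in the same order as the input.
    """
    order = sorted(range(len(costs)), key=lambda i: costs[i])
    pay = [0.0] * len(costs)
    prev, share = 0.0, 0.0
    for rank, i in enumerate(order):
        # The increment (c_i - c_prev) is shared by everyone needing >= c_i.
        share += (costs[i] - prev) / (len(costs) - rank)
        pay[i] = share
        prev = costs[i]
    return pay

# Three planes needing runways of cost 10, 30, 60:
print(sequential_equal_contributions([10, 30, 60]))
# [3.33, 13.33, 43.33]: segment 0-10 split 3 ways, 10-30 split 2 ways, rest alone.
```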
34. Redis-based full-text search extensions for relational databases.
- Author
-
Liao, Xuehua, Peng, Lilan, Yang, Ting, Li, Tianrui, and Zhu, Zhousen
- Abstract
In order to overcome the inefficiency and resource consumption of full-text search in relational databases, a lightweight full-text search model with an auxiliary cache is developed. Specifically, we utilize MySQL as the data storage layer and Redis as the index cache layer. We first design a full-index cache mechanism based on Redis inverted-index construction methods to exploit efficient in-memory processing alongside the relational database. In addition, an increment-index synchronization mechanism is implemented to accommodate dynamic updates of the relational database. For hot data, an index update optimization mechanism is provided to guarantee fast response and accuracy of full-text search. The proposed Redis-based auxiliary cache method has also been put into practical industrial applications and achieved promising results. Finally, we evaluate our method in terms of index space occupation, time consumption and the accuracy of retrieval results. The experimental results show that the proposed model outperforms the MySQL full-text method by 2–3 times and surpasses ElasticSearch by 12 times in space resource consumption. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
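Entry 34's core structure is a Redis-resident inverted index over MySQL rows. The paper's actual schema and synchronization code are not reproduced in the abstract, so the snippet below is a minimal sketch of the idea using redis-py sets: one set per token mapping to document IDs, with conjunctive queries answered by set intersection in memory.

```python
import re
import redis

r = redis.Redis(decode_responses=True)   # assumes a local Redis server

def index_document(doc_id, text):
    """Tokenize a row's text column and add doc_id to each token's posting set."""
    for token in set(re.findall(r"\w+", text.lower())):
        r.sadd(f"idx:{token}", doc_id)

def search(query):
    """AND-query: intersect the posting sets of all query tokens."""
    keys = [f"idx:{t}" for t in re.findall(r"\w+", query.lower())]
    return r.sinter(keys) if keys else set()

index_document(1, "Redis-based full-text search for relational databases")
index_document(2, "Full-text search inside MySQL without extra services")
print(search("full-text search"))   # {'1', '2'}
print(search("redis search"))       # {'1'}
```

The paper's increment-index synchronization and hot-data update optimizations would sit on top of this basic layout.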
35. Multi-Robot Cooperative Localization: Time-Varying Unobservable Subspace Analysis and Consistent Algorithm Design.
- Author
-
Hao, Ning, He, Fenghua, and Yao, Yu
- Abstract
This paper investigates the problem of cooperative localization (CL) for a multi-robot system (MRS) under dynamic measurement topology, which involves a group of robots collectively estimating their poses with respect to a common reference frame using ego-motion measurements and robot-to-robot relative measurements. The authors provide a theoretical analysis of the time-varying unobservable subspace and propose a consistent cooperative localization algorithm. First, the authors introduce the relative measurement graph (RMG) to represent the relative pose measurements obtained by the MRS at each instant. Then, the authors derive the local observability matrix over a time interval. An equivalent relationship is established between the local observability matrix and the spectral matrices of the RMG. Moreover, the authors present a method for constructing the unobservable subspace based on the RMG under different topology conditions. Based on this analysis, the authors design a consistent cooperative localization algorithm that satisfies the constraints of the time-varying unobservable subspace. An analytical optimal solution is derived for the constrained optimization problem. Monte Carlo numerical simulations are conducted to demonstrate the consistency and accuracy of the proposed method. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
36. Multi-stage distributionally robust convex stochastic optimization with Bayesian-type ambiguity sets.
- Author
-
Ma, Wentao and Chen, Zhiping
- Subjects
- CONFIDENCE regions (Mathematics), INVENTORY control, ROBUST control, ALGORITHMS, SCHEDULING, AMBIGUITY
- Abstract
Existing methods for constructing ambiguity sets in distributionally robust optimization often suffer from over-conservativeness and inefficient utilization of available data. To address these limitations and to practically solve multi-stage distributionally robust optimization (MDRO), we propose a data-driven Bayesian-type approach that constructs the ambiguity set of possible distributions from a Bayesian perspective. We demonstrate that our Bayesian-type MDRO problem can be reformulated as a risk-averse multi-stage stochastic programming problem and subsequently investigate its theoretical properties such as consistency, finite sample guarantees, and statistical robustness. Moreover, the reformulation enables us to employ cutting plane algorithms in dynamic settings to solve the Bayesian-type MDRO problem. To illustrate the practicality and advantages of the proposed model and algorithm, we apply it to a distributionally robust inventory control problem and a distributionally robust hydrothermal scheduling problem, and compare it with usual formulations and solution methods to highlight the superior performance of our approach. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
37. A Bayesian approach to data-driven multi-stage stochastic optimization.
- Author
-
Chen, Zhiping and Ma, Wentao
- Subjects
- STOCHASTIC programming, ALGORITHMS, INVENTORIES, TREES
- Abstract
To make full use of available data and prior distribution information, we introduce a data-driven Bayesian-type approach to solving multi-stage convex stochastic optimization, which can easily cope with uncertainty about the data process's distributions and their inter-stage dependence. To unravel the properties of the proposed multi-stage Bayesian expectation optimization (BEO) problem, we establish the consistency of optimal value functions and solutions. Two kinds of algorithms are designed for the numerical solution of single-stage and multi-stage BEO problems, respectively. A queuing system and a multi-stage inventory problem are adopted to numerically demonstrate the advantages and practicality of the new framework and the corresponding solution methods, compared with the usual formulations and solution methods for stochastic optimization problems. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
38. A Mallows-type model averaging estimator for ridge regression with randomly right censored data.
- Author
-
Zeng, Jie, Hu, Guozhi, and Cheng, Weihu
- Abstract
Instead of picking a single ridge parameter in ridge regression, this paper considers a frequentist model averaging approach to appropriately combine a set of ridge estimators with different ridge parameters when the response is randomly right censored. Within this context, we propose a weighted least squares ridge estimator for the unknown regression parameter. A new Mallows-type weight choice criterion is then developed to allocate model weights, where the unknown distribution function of the censoring random variable is replaced by the Kaplan–Meier estimator and the covariance matrix of the random errors is substituted by its averaging estimator. Under some mild conditions, we show that when the fitting model is misspecified, the resulting model averaging estimator achieves optimality in terms of minimizing the loss function, whereas when the fitting model is correctly specified, the model averaging estimator of the regression parameter is root-n consistent. Additionally, for the weight vector obtained by minimizing the new criterion, we establish its rate of convergence to the infeasible optimal weight vector. Simulation results show that our method is better than some existing methods. A real dataset is analyzed for illustration as well. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
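Entry 38's criterion extends Hansen-style Mallows model averaging to censored responses via Kaplan–Meier weighting. Handling censoring is beyond a short sketch, so the code below shows only the uncensored backbone we assume the method builds on: ridge fits over a grid of penalties, combined by weights that minimize a Mallows-type criterion over the simplex.

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(7)
n, p = 120, 10
X = rng.normal(size=(n, p))
beta = np.linspace(1.0, 0.0, p)
y = X @ beta + rng.normal(scale=1.0, size=n)

lams = np.array([0.1, 1.0, 10.0, 100.0])
hats, dofs = [], []
for lam in lams:
    H = X @ np.linalg.solve(X.T @ X + lam * np.eye(p), X.T)  # ridge hat matrix
    hats.append(H @ y)
    dofs.append(np.trace(H))                                 # effective dof
hats, dofs = np.column_stack(hats), np.array(dofs)

sigma2 = np.sum((y - hats[:, 0]) ** 2) / (n - dofs[0])   # from least-penalized fit

def mallows(w):   # C(w) = ||y - sum_m w_m yhat_m||^2 + 2 sigma^2 sum_m w_m k_m
    return np.sum((y - hats @ w) ** 2) + 2 * sigma2 * dofs @ w

cons = ({"type": "eq", "fun": lambda w: w.sum() - 1},)
res = minimize(mallows, np.full(len(lams), 1 / len(lams)),
               bounds=[(0, 1)] * len(lams), constraints=cons, method="SLSQP")
print(dict(zip(lams, np.round(res.x, 3))))   # data-driven weights across ridges
```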
39. Adaptive Learning of the Latent Space of Wasserstein Generative Adversarial Networks.
- Author
-
Qiu, Yixuan, Gao, Qingyi, and Wang, Xiao
- Abstract
Generative models based on latent variables, such as generative adversarial networks (GANs) and variational auto-encoders (VAEs), have gained a lot of interest due to their impressive performance in many fields. However, many data types, such as natural images, do not populate the ambient Euclidean space but instead reside on a lower-dimensional manifold. Thus an inappropriate choice of the latent dimension fails to uncover the structure of the data, possibly resulting in mismatched latent representations and poor generative quality. To address these problems, we propose a novel framework called the latent Wasserstein GAN (LWGAN) that fuses the Wasserstein auto-encoder and the Wasserstein GAN so that the intrinsic dimension of the data manifold can be adaptively learned by a modified informative latent distribution. We prove that there exist an encoder network and a generator network such that the intrinsic dimension of the learned encoding distribution is equal to the dimension of the data manifold. We theoretically establish that our estimated intrinsic dimension is a consistent estimate of the true dimension of the data manifold. Meanwhile, we provide an upper bound on the generalization error of LWGAN, implying that we force the synthetic data distribution to be similar to the real data distribution from a population perspective. Comprehensive empirical experiments verify our framework and show that LWGAN is able to identify the correct intrinsic dimension under several scenarios, and simultaneously generate high-quality synthetic data by sampling from the learned latent distribution. Supplementary materials for this article are available online, including a standardized description of the materials available for reproducing the work. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
40. Test-Retest and Inter-Rater Reliability for Selected Outcomes from a Wearable 3D Inertial Sensor Over Different Stable and Unstable Postural Conditions: A Validation Study.
- Author
-
D’Emanuele, Samuel, Nardello, Francesca, Garau, Fabrizio, Campaci, Diego, Schena, Federico, and Tarperi, Cantor
- Subjects
- *MEASUREMENT errors, *STATISTICAL reliability, *INTRACLASS correlation, *UNITS of measurement, *WEARABLE technology
- Abstract
The agreement between a wearable inertial sensor (GYKO, G) and the force platform (P) was assessed by evaluating test-retest and inter-rater reliability. Thirty-eight subjects were enrolled; the selected indices of balance were investigated over foot positions and (un)stable conditions. The intraclass correlation coefficient (ICC), standard error of measurement (SEM%) and minimal detectable change (MDC95%) were computed. For G, ICC bounds range from poor to excellent (0.040 to 0.921), with a mean of 0.687 (moderate reliability). For P, ICC ranges from poor to excellent (0.070 to 0.920), with a mean of 0.683 (moderate reliability). For G, SEM% ranges from 11% to 47% and MDC95% from 30% to 132%. For P, SEM% ranges from 7% to 41% and MDC95% from 21% to 114%. Finally, the inter-rater reliability ICC comparing the devices ranges from poor to excellent (−0.162 to 0.911), with a mean of 0.338 (poor reliability). GYKO appears to be a convenient tool with high consistency among multiple measurements, but only for specific clinical/research purposes. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
41. On the consistency of circuit lower bounds for non-deterministic time.
- Author
-
Atserias, Albert, Buss, Sam, and Müller, Moritz
- Subjects
- *BOUNDED arithmetics, *CIRCUIT complexity, *ARITHMETIC, *LOGICAL prediction, *POLYNOMIALS
- Abstract
We prove the first unconditional consistency result for superpolynomial circuit lower bounds with a relatively strong theory of bounded arithmetic. Namely, we show that the theory V^0_2 is consistent with the conjecture that NEXP ⊈ P/poly, i.e. some problem that is solvable in non-deterministic exponential time does not have polynomial size circuits. We suggest this is the best currently available evidence for the truth of the conjecture. The same techniques establish the same results with NEXP replaced by the class of problems decidable in non-deterministic barely superpolynomial time, such as NTIME(n^{O(log log log n)}). Additionally, we establish a magnification result on the hardness of proving circuit lower bounds. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
42. A machine learning approach towards assessing consistency and reproducibility: an application to graft survival across three kidney transplantation eras.
- Author
-
Achilonu, Okechinyere, Obaido, George, Ogbuokiri, Blessing, Aruleba, Kehinde, Musenge, Eustasius, and Fabian, June
- Subjects
- KIDNEY transplantation, GRAFT survival, ACADEMIC medical centers, PREDICTION models, RECEIVER operating characteristic curves, TREATMENT effectiveness, RETROSPECTIVE studies, DESCRIPTIVE statistics, GRAFT rejection, MEDICAL records, ACQUISITION of data, MACHINE learning, ALGORITHMS
- Abstract
Background: In South Africa, between 1966 and 2014, there were three kidney transplant eras defined by evolving access to certain immunosuppressive therapies defined as Pre-CYA (before availability of cyclosporine), CYA (when cyclosporine became available), and New-Gen (availability of tacrolimus and mycophenolic acid). As such, factors influencing kidney graft failure may vary across these eras. Therefore, evaluating the consistency and reproducibility of models developed to study these variations using machine learning (ML) algorithms could enhance our understanding of post-transplant graft survival dynamics across these three eras. Methods: This study explored the effectiveness of nine ML algorithms in predicting 10-year graft survival across the three eras. We developed and internally validated these algorithms using data spanning the specified eras. The predictive performance of these algorithms was assessed using the area under the curve (AUC) of the receiver operating characteristics curve (ROC), supported by other evaluation metrics. We employed local interpretable model-agnostic explanations to provide detailed interpretations of individual model predictions and used permutation importance to assess global feature importance across each era. Results: Overall, the proportion of graft failure decreased from 41.5% in the Pre-CYA era to 15.1% in the New-Gen era. Our best-performing model across the three eras demonstrated high predictive accuracy. Notably, the ensemble models, particularly the Extra Trees model, emerged as standout performers, consistently achieving high AUC scores of 0.95, 0.95, and 0.97 across the eras. This indicates that the models achieved high consistency and reproducibility in predicting graft survival outcomes. Among the features evaluated, recipient age and donor age were the only features consistently influencing graft failure throughout these eras, while features such as glomerular filtration rate and recipient ethnicity showed high importance in specific eras, resulting in relatively poor historical transportability of the best model. Conclusions: Our study emphasises the significance of analysing post-kidney transplant outcomes and identifying era-specific factors mitigating graft failure. The proposed framework can serve as a foundation for future research and assist physicians in identifying patients at risk of graft failure. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
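The evaluation pipeline summarized in entry 42 (tree ensembles scored by the AUC of the ROC curve, with permutation importance for global feature rankings) maps onto standard scikit-learn components. A minimal sketch under that assumption follows; the synthetic data, feature count, and hyperparameters are illustrative stand-ins rather than details from the study.

```python
# Minimal sketch: Extra Trees + ROC AUC + permutation importance (illustrative data).
import numpy as np
from sklearn.ensemble import ExtraTreesClassifier
from sklearn.inspection import permutation_importance
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 5))                   # stand-ins for recipient age, donor age, ...
y = (X[:, 0] + X[:, 1] + rng.normal(size=500) > 0).astype(int)  # toy graft-failure label

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
model = ExtraTreesClassifier(n_estimators=500, random_state=0).fit(X_tr, y_tr)

# AUC of the ROC curve: the study's headline performance metric
auc = roc_auc_score(y_te, model.predict_proba(X_te)[:, 1])

# Permutation importance: global feature ranking, computed per era in the study
imp = permutation_importance(model, X_te, y_te, n_repeats=20, random_state=0)
print(f"AUC = {auc:.2f}; importances = {np.round(imp.importances_mean, 3)}")
```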
43. Consistency and stability of individualized cortical functional networks parcellation at 3.0 T and 5.0 T MRI.
- Author
-
Minhua Yu, Bo Rao, Yayun Cao, Lei Gao, Huan Li, Xiaopeng Song, and Haibo Xu
- Subjects
FUNCTIONAL connectivity, FUNCTIONAL magnetic resonance imaging, MAGNETIC resonance imaging, FUNCTIONAL analysis, MAGNETIC fields - Abstract
Background: Individualized cortical functional network parcellation has been reported to be highly reproducible at 3.0 T. However, given the complexity of cortical networks and the greatly increased sensitivity provided by ultra-high-field 5.0 T MRI, the consistency of parcellations across field strengths is unclear. Purpose: To explore the consistency and stability of individualized cortical functional network parcellation at 3.0 T and 5.0 T MRI based on spatial and functional connectivity analysis. Materials and methods: Thirty healthy young participants were enrolled. Each subject underwent resting-state fMRI at both 3.0 T and 5.0 T, in random order, within 48 h. Individualized cortical functional networks were parcellated for each subject using a previously proposed iterative algorithm. The Dice coefficient was used to evaluate the spatial consistency of the parcellated networks between 3.0 T and 5.0 T. Functional connectivity (FC) consistency was evaluated using the Euclidean distance and graph-theory metrics. Results: A functional cortical atlas consisting of 18 networks was individually parcellated at 3.0 T and 5.0 T. The spatial consistency of these networks between 3.0 T and 5.0 T for the same subject was significantly higher than the inter-individual consistency. The FC between the 18 networks acquired at 3.0 T and 5.0 T was highly consistent for the same subject. Positive cross-subject correlations in graph-theory metrics were found between 3.0 T and 5.0 T. Conclusion: Individualized cortical functional networks at 3.0 T and 5.0 T showed consistent and stable parcellation results, both spatially and functionally. The 5.0 T MRI provides finer functional sub-network characteristics than 3.0 T. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
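The two consistency measures named in entry 43, the Dice coefficient for spatial overlap and the Euclidean distance between functional connectivity (FC) profiles, are short computations. In the minimal sketch below, the random masks and 18x18 matrices are toy stand-ins for the 3.0 T and 5.0 T results.

```python
import numpy as np

def dice(mask_a, mask_b):
    """Spatial overlap of two binary masks: 2|A ∩ B| / (|A| + |B|)."""
    inter = np.logical_and(mask_a, mask_b).sum()
    return 2.0 * inter / (mask_a.sum() + mask_b.sum())

rng = np.random.default_rng(1)
mask_3t = rng.random(10_000) > 0.5            # toy network mask at 3.0 T
mask_5t = rng.random(10_000) > 0.5            # toy network mask at 5.0 T
print("Dice:", round(dice(mask_3t, mask_5t), 3))

fc_3t = rng.normal(size=(18, 18))             # toy 18-network FC matrix at 3.0 T
fc_5t = rng.normal(size=(18, 18))
iu = np.triu_indices(18, k=1)                 # vectorize the upper triangle
print("FC distance:", round(np.linalg.norm(fc_3t[iu] - fc_5t[iu]), 3))
```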
44. Solving a fractional diffusion PDE using some standard and nonstandard finite difference methods with conformable and Caputo operators.
- Author
-
Appadu, Appanah R., Kelil, Abey S., Nyingong, Ndifon Wikocho, Qureshi, Sania, and Sene, Ndolane
- Subjects
FINITE difference method, TRANSPORT theory, OPTIMISM - Abstract
Introduction: Fractional diffusion equations offer an effective means of describing transport phenomena that exhibit anomalous diffusion patterns, which often elude traditional diffusion models. Methods: We construct four finite difference methods in which fractional derivatives are approximated using either conformable or Caputo operators. Results: Stability of the proposed schemes is analyzed using von Neumann stability analysis, and conditions are established to preserve positivity. Consistency analysis is performed for all methods, and numerical results with the fractional parameter (α) set to 0.75, 0.90, 0.95, and 1.0 are presented. Discussion: The rate of convergence in time for the four methods is computed. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
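To make entry 44 concrete, here is a minimal sketch of one of the simplest schemes in that family: an explicit finite difference method for a time-fractional diffusion equation posed with the conformable operator. It relies only on the identity T_alpha u = t^(1-alpha) du/dt, so the usual diffusion stencil picks up a factor of t^(alpha-1); the grid sizes and alpha = 0.9 are illustrative, and the paper's four schemes differ in their details.

```python
# Explicit scheme for T_alpha u = D * u_xx using the conformable identity
# T_alpha u = t**(1 - alpha) * du/dt, i.e. du/dt = D * t**(alpha - 1) * u_xx.
import numpy as np

D, alpha = 1.0, 0.9
nx, nt = 51, 2000
dx, dt = 1.0 / (nx - 1), 1e-5

x = np.linspace(0.0, 1.0, nx)
u = np.sin(np.pi * x)                         # initial condition; u = 0 at both ends

for n in range(1, nt + 1):
    t = n * dt                                # start at t = dt to avoid the t = 0 singularity
    lap = (u[2:] - 2.0 * u[1:-1] + u[:-2]) / dx**2
    u[1:-1] += dt * D * t ** (alpha - 1.0) * lap   # conformable factor t**(alpha - 1)

print(f"u(x=0.5, t={nt * dt}) ≈ {u[nx // 2]:.4f}")
```

For this explicit scheme, the von Neumann condition is dt * D * t^(alpha-1) / dx^2 <= 1/2; the worst case is the first step t = dt, where the values above give roughly 0.08.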
45. Consistency Analysis of Collaborative Process Data Change Based on a Rule-Driven Method.
- Author
-
Wang, Qianqian and Shao, Chifeng
- Subjects
BUSINESS process management, PETRI nets, INFORMATION processing, DATA analysis, LOGIC - Abstract
In business process management, business process change analysis is key to ensuring the flexibility and adaptability of a system. Existing methods mostly analyze changes to a single business process from the control-flow perspective, ignoring the influence of data changes on collaborative processes with information interaction. To compensate for this deficiency, this paper proposes a rule-driven consistency analysis method for data changes in collaborative processes. First, it analyzes the influence of data changes on other elements of collaborative processes (such as activities, data, roles, and guards) and defines data influence. Second, the optimal alignment technique is used to explore how data changes interfere with the expected behavior of deviating activities, and decision rules are integrated into the Petri net model to accurately evaluate and screen out the effective expected behavior that conforms to business logic and established rules. Finally, the initial optimal alignment is repaired according to the screened effective expected behavior, and the consistency of the business processes is recalculated. Experimental results show that the introduced rule-constraint mechanism can effectively avoid misjudging abnormal behavior. Compared with the traditional method, the average accuracy, recall, and F1-score for effective expected behavior improve by 4%, 4.7%, and 4.3%, respectively. In addition, the repaired optimal alignment significantly enhances the system's ability to respond quickly and self-adjust to data changes, providing strong support for the intelligent and automated transformation of business process management. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
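Entry 45 builds on alignment-based conformance checking. As a point of reference, here is a minimal, hypothetical sketch of the bookkeeping involved: the cost of an alignment (synchronous moves are free; log-only or model-only moves, written ">>", cost 1), the standard cost-based fitness, and a stand-in decision rule of the kind the paper embeds in a Petri net. All activity names and numbers are invented for illustration and are not from the paper.

```python
SKIP = ">>"  # alignment notation for a missing log or model move

def alignment_cost(moves):
    """moves: list of (log_move, model_move) pairs; deviating moves cost 1 each."""
    return sum(1 for log_mv, model_mv in moves if SKIP in (log_mv, model_mv))

def fitness(moves, worst_case_cost):
    """Standard cost-based fitness: 1 - cost / worst-case cost."""
    return 1.0 - alignment_cost(moves) / worst_case_cost

def rule_ok(activity, data):
    # toy decision rule: a refund above 1000 requires prior approval
    return not (activity == "refund" and data["amount"] > 1000 and not data["approved"])

align = [("register", "register"), ("refund", SKIP),
         (SKIP, "approve"), ("refund", "refund")]
data = {"amount": 1500, "approved": True}

print("cost:", alignment_cost(align))                 # 2 deviating moves
print("fitness:", fitness(align, worst_case_cost=8))  # 1 - 2/8 = 0.75
print("rule-consistent:", all(rule_ok(a, data) for a, _ in align if a != SKIP))
```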
46. Application value of multi-sequence MRI in the diagnosis of hepatocellular carcinoma and the evaluation of microvascular invasion.
- Author
-
陈泳松, 王德芬, 陈海洋, 赵 方, 陈廷昊, and 陈泽胜
- Subjects
MAGNETIC resonance imaging, MAGNETICS, HEPATOCELLULAR carcinoma, DIAGNOSIS, LAVA - Abstract
Objective: To explore the application value of multi-sequence magnetic resonance imaging (MRI) in the diagnosis of hepatocellular carcinoma (HCC) and the evaluation of microvascular invasion (MVI). Methods: The clinical data of 100 patients with suspected HCC treated at the hospital between October 2019 and October 2022 were retrospectively collected. All patients completed multi-sequence MRI and pathological examination. Taking pathological examination as the gold standard, the agreement between multi-sequence MRI and pathology in diagnosing HCC and evaluating MVI was analyzed using the Kappa test. Results: Among the 100 patients, pathological examination identified 51 cases (51.00%) of HCC and 49 cases (49.00%) of benign liver lesions. Of the 51 HCC patients, 17 (33.33%) had MVI. Against the pathological gold standard, the sensitivity, specificity, and Kappa values of T2WI in the diagnosis of HCC and MVI were (82.35%, 79.59%, 0.620) and (64.71%, 85.29%, 0.507), respectively. For T1WI they were (76.47%, 81.63%, 0.580) and (58.82%, 88.24%, 0.492), respectively; for LAVA, (84.31%, 83.67%, 0.580) and (64.71%, 91.18%, 0.585), respectively; and for multi-sequence MRI, (96.08%, 79.59%, 0.759) and (94.12%, 82.35%, 0.712), respectively. The sensitivity of multi-sequence MRI was higher than that of any single sequence (P<0.05). The proportions of unsmooth tumor edge and peritumoral enhancement were higher in MVI-positive than in MVI-negative patients (P<0.05). Conclusion: Multi-sequence MRI is conducive to the clinical diagnosis of HCC and the evaluation of MVI. Unsmooth tumor edge and peritumoral enhancement have predictive significance for HCC and MVI. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
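The agreement statistics in entry 46 (sensitivity, specificity, and Cohen's kappa against a pathological gold standard) all derive from a 2x2 table. The sketch below computes them; the counts are chosen so that, with the reported 51/49 split, they reproduce the multi-sequence HCC row (sensitivity 96.08%, specificity 79.59%, kappa 0.759).

```python
def diagnostics(tp, fp, fn, tn):
    """Sensitivity, specificity, and Cohen's kappa from a 2x2 table."""
    n = tp + fp + fn + tn
    sens = tp / (tp + fn)                       # true positives / all diseased
    spec = tn / (tn + fp)                       # true negatives / all non-diseased
    po = (tp + tn) / n                          # observed agreement
    pe = ((tp + fp) * (tp + fn) + (fn + tn) * (fp + tn)) / n**2  # chance agreement
    return sens, spec, (po - pe) / (1 - pe)

# 51 HCC and 49 benign cases; counts implied by the multi-sequence results
sens, spec, kappa = diagnostics(tp=49, fp=10, fn=2, tn=39)
print(f"sensitivity={sens:.2%}  specificity={spec:.2%}  kappa={kappa:.3f}")
```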
47. PCABM: Pairwise Covariates-Adjusted Block Model for Community Detection.
- Author
-
Huang, Sihan, Sun, Jiajin, and Feng, Yang
- Subjects
MAXIMUM likelihood statistics, STOCHASTIC models, FEATURE selection, GENERALIZATION, ALGORITHMS - Abstract
One of the most fundamental problems in network analysis is community detection. The stochastic block model (SBM) is a widely used model, and various estimation methods have been developed, with their community detection consistency results established. However, the SBM is restricted by the strong assumption that all nodes in the same community are stochastically equivalent, which may not be suitable for practical applications. We introduce a pairwise covariates-adjusted stochastic block model (PCABM), a generalization of the SBM that incorporates pairwise covariate information. We study the maximum likelihood estimators of the coefficients for the covariates as well as the community assignments, and show that they are consistent under suitable sparsity conditions. Spectral clustering with adjustment (SCWA) is introduced to solve PCABM efficiently. Under certain conditions, we derive the error bound of community detection for SCWA and show that it is community detection consistent. In addition, we investigate model selection in terms of the number of communities and feature selection for the pairwise covariates, and propose two corresponding algorithms. PCABM compares favorably with the SBM and the degree-corrected stochastic block model (DCBM) under a wide range of simulated and real networks when covariate information is accessible. Supplementary materials for this article are available online. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
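One plausible reading of SCWA in entry 47 is spectral clustering applied after each edge is discounted by its estimated pairwise covariate effect exp(z_ij' gamma). The sketch below treats gamma_hat as given (the paper estimates it by maximum likelihood) and is an assumption-laden illustration rather than the paper's algorithm.

```python
import numpy as np
from scipy.sparse.linalg import eigsh
from sklearn.cluster import KMeans

def scwa_sketch(A, Z, gamma_hat, k):
    """A: n x n adjacency; Z: n x n x p pairwise covariates; k: number of communities."""
    A_adj = A / np.exp(Z @ gamma_hat)           # divide out exp(z_ij' gamma_hat)
    sym = (A_adj + A_adj.T) / 2.0               # symmetrize before the eigendecomposition
    _, vecs = eigsh(sym, k=k, which="LA")       # k leading eigenvectors
    return KMeans(n_clusters=k, n_init=10).fit_predict(vecs)

rng = np.random.default_rng(2)
n, p, k = 60, 2, 2
Z = rng.normal(size=(n, n, p)); Z = (Z + Z.transpose(1, 0, 2)) / 2   # symmetric covariates
A = np.triu((rng.random((n, n)) < 0.1).astype(float), 1); A += A.T   # toy undirected graph
print(np.bincount(scwa_sketch(A, Z, gamma_hat=np.array([0.5, -0.3]), k=k)))
```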
48. DDAC-SpAM: A Distributed Algorithm for Fitting High-dimensional Sparse Additive Models with Feature Division and Decorrelation.
- Author
-
He, Yifan, Wu, Ruiyang, Zhou, Yong, and Feng, Yang
- Subjects
DISTRIBUTED algorithms, STATISTICAL learning, INFERENTIAL statistics, DATA analysis, ADDITIVES - Abstract
Distributed statistical learning has become a popular technique for large-scale data analysis. Most existing work in this area focuses on dividing the observations; we instead propose a new algorithm, DDAC-SpAM, which divides the features under a high-dimensional sparse additive model. Our approach involves three steps: divide, decorrelate, and conquer. The decorrelation operation enables each local estimator to recover the sparsity pattern of each additive component without imposing strict constraints on the correlation structure among variables. The effectiveness and efficiency of the proposed algorithm are demonstrated through theoretical analysis and empirical results on both synthetic and real data. The theoretical results include both consistent sparsity pattern recovery and statistical inference for each additive functional component. Our approach provides a practical solution for fitting sparse additive models, with promising applications in a wide range of domains. Supplementary materials for this article are available online. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
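Here is a minimal sketch of the divide, decorrelate, and conquer pattern in entry 48, under two explicit assumptions: the decorrelation step is approximated by a Puffer-type preconditioning (left-multiplying by F = U D^-1 U^T from the SVD of X, which makes the columns of X exactly orthogonal when n >= p), and each local fit is a lasso on B-spline expansions. The paper's exact operators, tuning, and inference procedures are not reproduced.

```python
import numpy as np
from sklearn.linear_model import Lasso
from sklearn.preprocessing import SplineTransformer

rng = np.random.default_rng(3)
n, p, n_blocks = 400, 12, 3
X = rng.normal(size=(n, p))
y = np.sin(X[:, 0]) + X[:, 3] ** 2 + rng.normal(scale=0.3, size=n)  # truth: features 0, 3

# decorrelate: Puffer-type preconditioning F = U D^-1 U^T, so F X = U V^T
U, d, Vt = np.linalg.svd(X, full_matrices=False)
X_dec = U @ Vt                                  # columns now orthonormal
y_dec = U @ ((U.T @ y) / d)                     # F y

selected = []
for block in np.array_split(np.arange(p), n_blocks):   # divide features across "workers"
    spline = SplineTransformer(degree=3, n_knots=6)
    basis = spline.fit_transform(X_dec[:, block])      # additive B-spline basis
    fit = Lasso(alpha=1e-4, max_iter=10_000).fit(basis, y_dec)  # illustrative penalty
    per_feat = basis.shape[1] // len(block)            # basis columns per feature
    selected += [j for i, j in enumerate(block)
                 if np.any(fit.coef_[i * per_feat:(i + 1) * per_feat] != 0)]

print("selected additive components:", selected)       # conquer: union of local selections
```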
49. Dataset factors associated with age‐related changes in brain structure and function in neurodevelopmental conditions.
- Author
-
Vandewouw, Marlee M., Ye, Yifan, Crosbie, Jennifer, Schachar, Russell J., Iaboni, Alana, Georgiades, Stelios, Nicolson, Robert, Kelley, Elizabeth, Ayub, Muhammad, Jones, Jessica, Arnold, Paul D., Taylor, Margot J., Lerch, Jason P., Anagnostou, Evdokia, and Kushki, Azadeh
- Subjects
LARGE-scale brain networks, BRAIN anatomy, FUNCTIONAL connectivity, RACE, DEMOGRAPHIC characteristics - Abstract
With brain structure and function undergoing complex changes throughout childhood and adolescence, age is a critical consideration in neuroimaging studies, particularly those of individuals with neurodevelopmental conditions. However, despite the increasing use of large, consortium-based datasets to examine brain structure and function in neurotypical and neurodivergent populations, it is unclear whether age-related changes are consistent between datasets and, if not, whether the inconsistencies relate to differences in sample characteristics such as demographics and phenotypic features. To address this, we built models of age-related changes in brain structure (regional cortical thickness and regional surface area; N = 1218) and function (resting-state functional connectivity strength; N = 1254) in two neurodiverse datasets: the Province of Ontario Neurodevelopmental Network and the Healthy Brain Network. We examined whether deviations from these models differed between the datasets, and explored whether those deviations were associated with demographic and clinical variables. We found significant differences between the two datasets in measures of cortical surface area and functional connectivity strength throughout the brain. For regional measures of cortical surface area, the patterns of differences were associated with race/ethnicity, while for functional connectivity strength, positive associations were observed with head motion. Our findings highlight that patterns of age-related changes in the brain may be influenced by demographic and phenotypic characteristics, and future studies should therefore consider these when examining or controlling for age effects. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
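The core move in entry 49, fitting a normative age model and then analyzing each participant's deviation from it, can be sketched in a few lines. A linear age trend and a two-sample test stand in for the paper's (likely more flexible) models and association analyses; all data below are simulated.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(4)
age = rng.uniform(6, 18, size=300)                              # ages in years
thickness = 3.4 - 0.02 * age + rng.normal(scale=0.1, size=300)  # toy cortical thickness
dataset = rng.integers(0, 2, size=300)                          # 0 / 1: two toy cohorts

# normative model: brain measure as a function of age
slope, intercept, *_ = stats.linregress(age, thickness)
resid = thickness - (intercept + slope * age)
z = (resid - resid.mean()) / resid.std(ddof=1)                  # z-scored deviations

# do the cohorts deviate from the shared age model differently?
t, pval = stats.ttest_ind(z[dataset == 0], z[dataset == 1])
print(f"t = {t:.2f}, p = {pval:.3f}")
```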
50. An investigation into dynamic behaviour of reconstituted and undisturbed fine-grained soil during triaxial and simple shear.
- Author
-
Önalp, Akın, Özocak, Aşkın, Bol, Ertan, Sert, Sedat, Arslan, Eylem, and Ural, Nazile
- Subjects
DYNAMIC testing, CONFORMANCE testing, EARTHQUAKES, CLAY, SOILS - Abstract
This study aims to evaluate the factors controlling the sensitivity of fine-grained soils to seismic stresses and to revise the criteria previously proposed by the authors for diagnosing liquefaction. To this end, dynamic tests were performed on artificial mixes as well as natural soils from a wide area of an earthquake-devastated city (Adapazari), using two types of dynamic testing. The findings suggest that the gray area between susceptible and non-susceptible soils proposed by several investigators in the past can now be dispensed with. Although the physical properties of a fine-grained soil supply sufficient information for diagnosis, the dynamic simple shear test is found to be a convenient and rapid way to confirm the judgement. However, dynamic testing alone may not be the last word in determining liquefaction, and physical properties should also be considered. Anomalies observed in the test results are also discussed. The conclusions differ significantly from criteria previously proposed in the literature. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF