31 results
Search Results
2. The Challenges of Algorithm Management: The Spanish Perspective.
- Author
-
Prado, Daniel Perez del
- Subjects
ALGORITHMS ,LABOR laws ,DISRUPTIVE innovations ,ARTIFICIAL intelligence ,DIGITAL technology - Abstract
This paper focuses on how Spain's labour and employment law is dealing with technological disruption and, particularly, with algorithm management, looking for a harmonious equilibrium between traditional structures and profound changes. It pays special attention to the different actors affected and the most recent normative changes. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
3. Confidence of a k-Nearest Neighbors Python Algorithm for the 3D Visualization of Sedimentary Porous Media.
- Author
-
Bullejos, Manuel, Cabezas, David, Martín-Martín, Manuel, and Alcalá, Francisco Javier
- Subjects
PYTHON programming language ,K-nearest neighbor classification ,POROUS materials ,CONFIDENCE ,ECONOMIC decision making ,ALGORITHMS - Abstract
In a previous paper, the authors implemented a machine learning k-nearest neighbors (KNN) algorithm and Python libraries to create two 3D interactive models of the stratigraphic architecture of the Quaternary onshore Llobregat River Delta (NE Spain) for groundwater exploration purposes. The main limitation of this previous paper was its lack of routines for evaluating the confidence of the 3D models. Building from the previous paper, this paper refines the programming code and introduces an additional algorithm to evaluate the confidence of the KNN predictions. A variant of the Similarity Ratio method was used to quantify the KNN prediction confidence. This variant used weights that were inversely proportional to the distance between each grain-size class and the inferred point to work out a value that played the role of similarity. While the KNN algorithm and Python libraries demonstrated their efficacy for obtaining 3D models of the stratigraphic arrangement of sedimentary porous media, the KNN prediction confidence verified the certainty of the 3D models. In the Llobregat River Delta, the KNN prediction confidence at each prospecting depth was a function of the available data density at that depth. As expected, the KNN prediction confidence decreased according to the decreasing data density at lower depths. The obtained average-weighted confidence was in the 0.44−0.53 range for gravel bodies at prospecting depths in the 12.7−72.4 m b.s.l. range and was in the 0.42−0.55 range for coarse sand bodies at prospecting depths in the 4.6−83.9 m b.s.l. range. In a couple of cases, spurious average-weighted confidences of 0.29 in one gravel body and 0.30 in one coarse sand body were obtained. These figures were interpreted as the result of the quite different weights of neighbors from different grain-size classes at short distances. 
The KNN algorithm confidence has proven its suitability for identifying these anomalous results in the supposedly well-curated grain-size database used in this study. The introduced KNN algorithm confidence quantifies the reliability of the 3D interactive models, which is a necessary stage for making decisions in economic and environmental geology. In the Llobregat River Delta, this quantification clearly improves groundwater exploration predictability. [ABSTRACT FROM AUTHOR]
- Published
- 2023
- Full Text
- View/download PDF
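The inverse-distance weighting described in the abstract above (neighbour weights inversely proportional to distance, summed per grain-size class) can be sketched in a few lines. This is a minimal illustration of the idea, not the authors' published code; the function name and toy borehole data are assumptions:

```python
def knn_confidence(distances, classes, predicted_class, eps=1e-9):
    """Confidence of a KNN prediction: the share of inverse-distance
    weight carried by neighbours of the predicted class (a sketch of
    the Similarity Ratio variant described in the abstract)."""
    weights = [1.0 / (d + eps) for d in distances]  # closer neighbours weigh more
    same = sum(w for w, c in zip(weights, classes) if c == predicted_class)
    return same / sum(weights)

# Five hypothetical borehole neighbours: three gravel (closer), two coarse sand
dists = [1.0, 2.0, 2.0, 4.0, 5.0]
labels = ["gravel", "gravel", "gravel", "coarse sand", "coarse sand"]
confidence = knn_confidence(dists, labels, "gravel")  # ≈ 0.82
```

Low values, like the spurious 0.29–0.30 confidences the abstract reports, would flag points whose neighbourhoods mix grain-size classes at short distances.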
4. A K-Nearest Neighbors Algorithm in Python for Visualizing the 3D Stratigraphic Architecture of the Llobregat River Delta in NE Spain.
- Author
-
Bullejos, Manuel, Cabezas, David, Martín-Martín, Manuel, and Alcalá, Francisco Javier
- Subjects
K-nearest neighbor classification ,SUPERVISED learning ,PYTHON programming language ,ALGORITHMS ,MACHINE learning ,SEDIMENTARY structures ,PLIOCENE Epoch - Abstract
The k-nearest neighbors (KNN) algorithm is a non-parametric supervised machine learning classifier, which uses proximity and similarity to make classifications or predictions about the grouping of an individual data point. This ability makes the KNN algorithm ideal for classifying datasets of geological variables and parameters prior to 3D visualization. This paper introduces a machine learning KNN algorithm and Python libraries for visualizing the 3D stratigraphic architecture of sedimentary porous media in the Quaternary onshore Llobregat River Delta (LRD) in northeastern Spain. A first HTML model showed a consecutive 5 m-equispaced set of horizontal sections of the granulometry classes created with the KNN algorithm from 0 to 120 m below sea level in the onshore LRD. A second HTML model showed the 3D mapping of the main Quaternary gravel and coarse sand sedimentary bodies (lithosomes) and the basement (Pliocene and older rocks) top surface created with Python libraries. These results reproduce well the complex sedimentary structure of the LRD reported in recent scientific publications and prove the suitability of the KNN algorithm and Python libraries for visualizing the 3D stratigraphic structure of sedimentary porous media, which is a crucial stage in making decisions in different environmental and economic geology disciplines. [ABSTRACT FROM AUTHOR]
- Published
- 2022
- Full Text
- View/download PDF
5. KNN and adaptive comfort applied in decision making for HVAC systems.
- Author
-
Aparicio-Ruiz, Pablo, Barbadilla-Martín, Elena, Guadix, José, and Cortés, Pablo
- Subjects
THERMAL comfort ,DECISION making ,SUPPORT vector machines ,ALGORITHMS ,AIR conditioning ,HEATING & ventilation industry - Abstract
Deciding on a suitable set-point temperature for a heating, ventilating and air conditioning system is an energy and environmental challenge in our society. In the present paper, a general framework to define such a temperature based on a dynamic adaptive comfort algorithm is proposed. Because the thermal comfort of the occupants of a building has different ranges of acceptability, this method is applied to learn the comfort temperature with respect to the running mean temperature and therefore to decide the suitable range of indoor temperature. It is demonstrated that this solution makes it possible to dynamically build an adaptive comfort algorithm, an algorithm based on human thermal adaptability, without applying the traditional theory. The proposed methodology, based on the K-Nearest-Neighbour algorithm, was tested and compared with data from an experimental thermal comfort field study carried out in a mixed mode building in the south-western area of Spain and with the Support Vector Machine method. The results show that the K-Nearest-Neighbour algorithm represents the pattern of the thermal comfort data better than the traditional solution and that it is a suitable method to learn the thermal comfort area of a building and to define the set-point temperature for a heating, ventilating and air-conditioning system. [ABSTRACT FROM AUTHOR]
- Published
- 2021
- Full Text
- View/download PDF
6. Comparison of Optimisation Algorithms for Centralised Anaerobic Co-Digestion in a Real River Basin Case Study in Catalonia.
- Author
-
Palma-Heredia D, Verdaguer M, Puig V, Poch M, and Cugueró-Escofet MÀ
- Subjects
- Anaerobiosis, Digestion, Spain, Algorithms, Rivers
- Abstract
Anaerobic digestion (AnD) is a process that allows the conversion of organic waste into a source of energy such as biogas, introducing sustainability and circular economy into waste treatment. AnD is an intricate process because of the multiple parameters involved, and its complexity increases when the wastes come from different types of generators. In this case, a key point for achieving good performance is the use of optimisation methods. Currently, many tools have been developed to optimise a single AnD plant. However, the study of a network of AnD plants and multiple waste generators, all in different locations, remains unexplored. This novel approach requires the use of optimisation methodologies with the capacity to deal with a highly complex combinatorial problem. This paper proposes and compares the use of three evolutionary algorithms: ant colony optimisation (ACO), genetic algorithm (GA) and particle swarm optimisation (PSO), which are especially suited to this type of application. The algorithms successfully solve the problem, using an objective function that includes terms related to quality and logistics. Their application to a real case study in Catalonia (Spain) shows their usefulness (ACO and GA to achieve maximum biogas production and PSO for safer operating conditions) for AnD facilities.
- Published
- 2022
- Full Text
- View/download PDF
7. A Self-Assembly Portable Mobile Mapping System for Archeological Reconstruction Based on VSLAM-Photogrammetric Algorithm.
- Author
-
Ortiz-Coder P and Sánchez-Ríos A
- Subjects
- Cloud Computing, Equipment Design, Imaging, Three-Dimensional instrumentation, Photogrammetry instrumentation, Software, Spain, Workflow, Algorithms, Archaeology methods, Imaging, Three-Dimensional methods, Photogrammetry methods
- Abstract
Three-dimensional (3D) models are widely used in clinical applications, geosciences, cultural heritage preservation, and engineering; this, together with new emerging needs such as building information modeling (BIM), has driven the development of new data capture techniques and devices with a low cost and a reduced learning curve that allow non-specialized users to employ them. This paper presents a simple, self-assembly device for 3D point cloud data capture with an estimated base price under €2500; furthermore, a workflow for the calculations is described that includes a Visual SLAM-photogrammetric threaded algorithm implemented in C++. Another purpose of this work is to validate the proposed system in BIM working environments. To achieve this, in outdoor tests, several 3D point clouds were obtained and the coordinates of 40 points were measured with this device, with data capture distances ranging between 5 and 20 m. Subsequently, those coordinates were compared to the coordinates of the same targets measured by a total station. The Euclidean average distance errors and root mean square errors (RMSEs) ranged between 12-46 mm and 8-33 mm respectively, depending on the data capture distance (5-20 m). Furthermore, the proposed system was compared with a commonly used photogrammetric methodology based on Agisoft Metashape software. The results obtained demonstrate that the proposed system satisfies (in each case) the tolerances of 'level 1' (51 mm) and 'level 2' (13 mm) for point cloud acquisition in urban design and historic documentation, according to the BIM Guide for 3D Imaging (U.S. General Services).
- Published
- 2019
- Full Text
- View/download PDF
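The accuracy figures quoted in the abstract above (Euclidean distance errors and RMSEs against total-station checkpoints) follow from a standard computation. A minimal sketch, with hypothetical checkpoint coordinates in metres standing in for the paper's 40 measured targets:

```python
import math

def rmse(errors):
    """Root mean square of a list of scalar errors."""
    return math.sqrt(sum(e * e for e in errors) / len(errors))

# Hypothetical 3D checkpoints: device-measured vs. total-station reference
measured  = [(0.010, 0.002, 0.000), (1.005, 2.001, 0.003), (4.998, 5.004, 0.001)]
reference = [(0.000, 0.000, 0.000), (1.000, 2.000, 0.000), (5.000, 5.000, 0.000)]

errors = [math.dist(m, r) for m, r in zip(measured, reference)]  # Euclidean errors
mean_error = sum(errors) / len(errors)
error_rmse = rmse(errors)
```

In the paper these statistics are computed per data-capture distance band (5-20 m), which is how the 12-46 mm and 8-33 mm ranges arise.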
8. [Optimization of the prediction of financial problems in Spanish private health companies using genetic algorithms].
- Author
-
González-Martín JM, Sánchez-Medina AJ, and Alonso JB
- Subjects
- Artificial Intelligence, Forecasting, Humans, Spain, Algorithms, Bankruptcy, Health Care Sector economics, Private Sector economics
- Abstract
Objective: This paper presents a methodology to optimize, using Altman's Z-Score for private companies, the prediction of private companies in the Spanish health sector entering bankruptcy. Method: The proposed method consists of applying genetic algorithms (GA) to find the coefficients of the formula of the chain of ratios proposed by Altman, in the version of the score for private companies, which optimize the prediction for Spanish private health companies, maximizing sensitivity and specificity and thereby reducing type I and type II errors. For this purpose, a sample of 5,903 companies from the Spanish private health sector, obtained from the database of the Iberian Balance Analysis System (SABI) between 2007 and 2015, was used. Results: The results show that the predictive model obtained with the GA presents greater accuracy, sensitivity and specificity than that proposed by Altman for private companies, with both test data and all sample data. Conclusions: The most important finding of this study is a methodology that can identify optimized coefficients for the Altman Z-Score, allowing a more accurate prediction of bankruptcy in Spanish private healthcare companies. (Copyright © 2018 SESPAS. Published by Elsevier España, S.L.U. All rights reserved.)
- Published
- 2019
- Full Text
- View/download PDF
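The approach in the abstract above (a GA searching for Z-Score coefficients that maximise sensitivity plus specificity) can be sketched as follows. The population size, mutation scale, bankruptcy threshold and the synthetic firms in the demo are illustrative assumptions; the paper's actual operators and Altman cut-off may differ:

```python
import random

def z_score(coeffs, ratios):
    # Altman-style Z: a linear combination of financial ratios
    return sum(c * r for c, r in zip(coeffs, ratios))

def fitness(coeffs, firms, threshold=1.23):
    """firms: list of (ratios, went_bankrupt). Z below the threshold
    predicts bankruptcy; fitness = sensitivity + specificity."""
    tp = fn = tn = fp = 0
    for ratios, bankrupt in firms:
        predicted = z_score(coeffs, ratios) < threshold
        if bankrupt and predicted:
            tp += 1
        elif bankrupt:
            fn += 1
        elif predicted:
            fp += 1
        else:
            tn += 1
    sens = tp / (tp + fn) if tp + fn else 0.0
    spec = tn / (tn + fp) if tn + fp else 0.0
    return sens + spec

def evolve(firms, n_coeffs=5, pop=30, gens=40, seed=0):
    rng = random.Random(seed)
    population = [[rng.uniform(0.0, 5.0) for _ in range(n_coeffs)] for _ in range(pop)]
    for _ in range(gens):
        population.sort(key=lambda c: fitness(c, firms), reverse=True)
        parents = population[: pop // 2]            # elitist selection
        children = []
        while len(parents) + len(children) < pop:
            a, b = rng.sample(parents, 2)
            cut = rng.randrange(1, n_coeffs)        # one-point crossover
            child = a[:cut] + b[cut:]
            child[rng.randrange(n_coeffs)] += rng.gauss(0.0, 0.3)  # point mutation
            children.append(child)
        population = parents + children
    return max(population, key=lambda c: fitness(c, firms))
```

Maximising sensitivity plus specificity (rather than raw accuracy) keeps the fitness function honest on imbalanced samples, where bankrupt firms are a small minority.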
9. Estimation of COVID-19 epidemic curves using genetic programming algorithm.
- Author
-
Anđelić, Nikola, Šegota, Sandi Baressi, Lorencin, Ivan, Mrzljak, Vedran, and Car, Zlatan
- Subjects
HIGH performance computing ,COVID-19 ,CONVALESCENCE ,MACHINE learning ,INFECTIOUS disease transmission ,RESEARCH funding ,STATISTICAL models ,ALGORITHMS - Abstract
This paper investigates the implementation of the Genetic Programming (GP) algorithm on a publicly available COVID-19 data set, in order to obtain mathematical models which could be used for the estimation of confirmed, deceased, and recovered cases and the estimation of the epidemiology curve for specific countries with a high number of cases, such as China, Italy, Spain, and the USA, as well as on the global scale. The conducted investigation shows that the best mathematical models produced for estimating confirmed and deceased cases achieved R2 scores of 0.999, while the models developed for the estimation of recovered cases achieved an R2 score of 0.998. The equations generated for confirmed, deceased, and recovered cases were combined in order to estimate the epidemiology curve of specific countries and on the global scale. The estimated epidemiology curve for each country obtained from these equations is almost identical to the real data contained within the data set. [ABSTRACT FROM AUTHOR]
- Published
- 2021
- Full Text
- View/download PDF
10. Bidders Recommender for Public Procurement Auctions Using Machine Learning: Data Analysis, Algorithm, and Case Study with Tenders from Spain.
- Author
-
García Rodríguez, Manuel J., Rodríguez Montequín, Vicente, Ortega Fernández, Francisco, and Villanueva Balsera, Joaquín M.
- Subjects
GOVERNMENT purchasing ,MACHINE learning ,ALGORITHMS ,RECOMMENDER systems ,RANDOM forest algorithms ,DATA analysis - Abstract
Recommending the identity of bidders in public procurement auctions (tenders) has a significant impact in many areas of public procurement, but it has not yet been studied in depth. A bidders recommender would be a very beneficial tool because a supplier (company) can search for appropriate tenders and, vice versa, a public procurement agency can automatically discover unknown companies which are suitable for its tender. This paper develops a pioneering algorithm to recommend potential bidders using a machine learning method, particularly a random forest classifier. The bidders recommender is described theoretically, so it can be implemented or adapted to any particular situation. It has been successfully validated with a case study: an actual Spanish tender dataset (free public information) which has 102,087 tenders from 2014 to 2020 and a company dataset (nonfree public information) which has 1,353,213 Spanish companies. Quantitative, graphical, and statistical descriptions of both datasets are presented. The results of the case study were satisfactory: the winning bidding company is within the recommended companies group in 24% to 38% of the tenders, depending on the test conditions and scenarios. [ABSTRACT FROM AUTHOR]
- Published
- 2020
- Full Text
- View/download PDF
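The headline result in the abstract above (the winner appears among the recommended companies in 24%-38% of tenders) is a top-k hit rate over the classifier's ranking. A minimal sketch of that evaluation, with toy data standing in for companies ranked by random-forest score; names and the value of k are illustrative:

```python
def hit_rate(ranked_recommendations, winners, k=10):
    """Share of tenders whose winning bidder appears among the top-k
    companies ranked by the recommender (e.g. by predicted probability)."""
    hits = sum(winner in ranked[:k]
               for ranked, winner in zip(ranked_recommendations, winners))
    return hits / len(winners)

# Toy example: three tenders, companies pre-ranked by model score
recs = [["A", "B", "C"], ["B", "D", "A"], ["C", "A", "E"]]
winners = ["B", "E", "A"]
rate = hit_rate(recs, winners, k=2)  # winner is in the top-2 for tenders 1 and 3
```

Varying k (the size of the recommended group) is one natural way the abstract's "different test conditions and scenarios" would move the rate between 24% and 38%.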
11. Breast Density Analysis Using an Automatic Density Segmentation Algorithm.
- Author
-
Oliver, Arnau, Tortajada, Meritxell, Lladó, Xavier, Freixenet, Jordi, Ganau, Sergi, Tortajada, Lidia, Vilagran, Mariona, Sentís, Melcior, and Martí, Robert
- Subjects
BREAST ,ALGORITHMS ,MAMMOGRAMS ,BREAST tumors ,DIAGNOSTIC imaging ,LONGITUDINAL method ,COMPUTERS in medicine ,PROBABILITY theory ,REGRESSION analysis ,RESEARCH funding ,T-test (Statistics) ,EVALUATION research ,DESCRIPTIVE statistics ,ANATOMY - Abstract
Breast density is a strong risk factor for breast cancer. In this paper, we present an automated approach for breast density segmentation in mammographic images based on a supervised pixel-based classification and using textural and morphological features. The objective of the paper is not only to show the feasibility of an automatic algorithm for breast density segmentation but also to prove its potential application to the study of breast density evolution in longitudinal studies. The database used here contains three complete screening examinations, acquired 2 years apart, of 130 different patients. The approach was validated by comparing manual expert annotations with automatically obtained estimations. Transversal analysis of the breast density analysis of craniocaudal (CC) and mediolateral oblique (MLO) views of both breasts acquired in the same study showed a correlation coefficient of ρ = 0.96 between the mammographic density percentage for left and right breasts, whereas a comparison of both mammographic views showed a correlation of ρ = 0.95. A longitudinal study of breast density confirmed the trend that dense tissue percentage decreases over time, although we noticed that the decrease in the ratio depends on the initial amount of breast density. [ABSTRACT FROM AUTHOR]
- Published
- 2015
- Full Text
- View/download PDF
12. ALGORITHMIC (IN)VISIBILITY TACTICS AMONG IMMIGRANT TIKTOKERS.
- Author
-
JARAMILLO-DENT, DANIELA
- Subjects
SCIENTIFIC literature ,IMMIGRANTS ,SOCIAL media ,DIGITAL video - Abstract
It is well established in the scientific literature that immigrants are excluded from their own stories, which are often instrumentalized to fulfill specific communicative, othering intentions. In this sense, migrant agency and voice are, in many cases, absent from narratives related to their life experiences and subject to various symbolic, digital, and material borders. Moreover, although social media has been recognized as a prime space for self-representation across different segments of society, immigrants are often excluded from these spaces due to the risks that sharing certain information publicly represents to them. In this article I draw from a 16-month digital ethnography and an inductive, multimodal content analysis of videos created by 53 Latin American immigrant tiktokers in the United States and Spain. This enables the conceptualization of their algorithmic (in)visibility practices, which refer to the set of strategies deployed by immigrant content creators on social media (and possibly other marginalized and vulnerable populations) to negotiate the conspicuousness of their controversial content with the aim of avoiding its deletion from the platform. The findings unveil three exemplary algorithmic (in)visibility practices that include content reuse and re-upload, vernacular visibility, and partial deplatforming. I find that these strategies shift between collective and individual approaches to achieve selective visibility and concealed conspicuousness within algorithmic moderation systems. [ABSTRACT FROM AUTHOR]
- Published
- 2022
13. Multiple time scales in modeling the incidence of infections acquired in intensive care units.
- Author
-
Wolkewitz, Martin, Cooper, Ben S., Palomar-Martinez, Mercedes, Alvarez-Lerma, Francisco, Olaechea-Astigarraga, Pedro, Barnett, Adrian G., and Schumacher, Martin
- Subjects
INTENSIVE care units ,INFECTION risk factors ,NOSOCOMIAL infections ,CRITICAL care medicine ,HOSPITAL admission & discharge ,DISEASE prevalence ,METHICILLIN-resistant staphylococcus aureus ,ALGORITHMS ,COMPARATIVE studies ,CROSS infection ,LENGTH of stay in hospitals ,MATHEMATICAL models ,RESEARCH methodology ,MEDICAL cooperation ,RESEARCH ,RESEARCH funding ,RISK assessment ,STAPHYLOCOCCAL diseases ,TIME ,THEORY ,EVALUATION research ,DISEASE incidence ,PROPORTIONAL hazards models ,PHYSIOLOGY - Abstract
Background: When patients are admitted to an intensive care unit (ICU), their risk of acquiring an infection depends strongly on the length of stay at risk in the ICU. In addition, the risk of infection is likely to vary over calendar time as a result of fluctuations in the prevalence of the pathogen on the ward. Hence the risk of infection is expected to depend on two time scales (time in ICU and calendar time) as well as on competing events (discharge or death) and the patients' spatial location. The purpose of this paper is to develop and apply appropriate statistical models for the risk of ICU-acquired infection accounting for multiple time scales, competing risks and the spatial clustering of the data. Methods: A multi-center database from a Spanish surveillance network was used to study the occurrence of infection due to Methicillin-resistant Staphylococcus aureus (MRSA). The analysis included 84,843 patient admissions between January 2006 and December 2011 from 81 ICUs. Stratified Cox models were used to study multiple time scales while accounting for spatial clustering of the data (patients within ICUs) and for death or discharge as competing events for MRSA infection. Results: Both time scales, time in ICU and calendar time, are highly associated with the MRSA hazard rate and cumulative risk. When using only one basic time scale, the interpretation and magnitude of several patient-individual risk factors differed. Risk factors concerning the severity of illness were more pronounced when using only calendar time. These differences disappeared when using both time scales simultaneously. Conclusions: The time-dependent dynamics of infections is complex and should be studied with models allowing for multiple time scales. For patient-individual risk factors we recommend stratified Cox regression models for competing events with ICU time as the basic time scale and calendar time as a covariate. The inclusion of calendar time and stratification by ICU make it possible to indirectly account for ICU-level effects such as local outbreaks or prevention interventions. [ABSTRACT FROM AUTHOR]
- Published
- 2016
- Full Text
- View/download PDF
14. [The "Trending" section on YouTube in Spain during the first weeks of the Covid-19 pandemic: visibility of the cultural industries versus youtubers].
- Author
-
Patricio Pérez-Rufi, José and Castro-Higueras, Antonio
- Subjects
CULTURAL industries ,SOCIAL responsibility ,COVID-19 ,ACCESS to information ,PRODUCE trade ,USER-generated content - Abstract
Copyright of Estudios sobre el Mensaje Periodistico is the property of Universidad Complutense de Madrid and its content may not be copied or emailed to multiple sites or posted to a listserv without the copyright holder's express written permission. However, users may print, download, or email articles for individual use. This abstract may be abridged. No warranty is given about the accuracy of the copy. Users should refer to the original published version of the material for the full abstract. (Copyright applies to all Abstracts.)
- Published
- 2021
- Full Text
- View/download PDF
15. Day- and night-time aerosol optical depth implementation in CÆLIS.
- Author
-
González, Ramiro, Toledano, Carlos, Román, Roberto, Fuertes, David, Berjón, Alberto, Mateos, David, Guirado-Fuentes, Carmen, Velasco-Merino, Cristian, Carlos Antuña-Sanchez, Juan, Calle, Abel, E. Cachorro, Victoria, and M. de Frutos, Ángel
- Subjects
- *OPTICAL depth (Astrophysics) , *AEROSOLS , *OBSERVATIONS of the Moon , *ALGORITHMS , *QUALITY control - Abstract
The University of Valladolid (UVa, Spain) has managed a calibration center of the AErosol RObotic NETwork (AERONET) since 2006. The CÆLIS software tool, developed by UVa, was created to manage the data generated by the AERONET photometers, for calibration, quality control and data processing purposes. This paper exploits the potential of this tool in order to obtain products like the aerosol optical depth (AOD) and Angstrom exponent (AE), which are of high interest for atmospheric and climate studies, as well as to enhance the quality control of the instruments and data managed by CÆLIS. The AOD and cloud screening algorithms implemented in CÆLIS, both based on AERONET version 3, are described in detail. The obtained products are compared with the AERONET database. In general, the differences in daytime AOD between CÆLIS and AERONET are far below the expected uncertainty of the instrument, with mean differences ranging between −1.3×10⁻⁴ at 870 nm and 6.2×10⁻⁴ at 380 nm. The standard deviations of the differences range from 2.8×10⁻⁴ at 675 nm to 8.1×10⁻⁴ at 340 nm. The AOD and AE at night-time calculated by CÆLIS from Moon observations are also presented, showing good continuity between day and night-time for different locations, aerosol loads and moon phase angles. Regarding cloud screening, around 99.9 % of the observations classified as cloud-free by CÆLIS are also assumed cloud-free by AERONET; this percentage is similar for the cases considered cloud-contaminated by both databases. The obtained results point out the capability of CÆLIS as a processing system. The AOD algorithm provides the opportunity to use this tool with other instrument types and to retrieve other aerosol products in the future. [ABSTRACT FROM AUTHOR]
- Published
- 2020
- Full Text
- View/download PDF
16. Hourly-resolution analysis of electricity decarbonization in Spain (2017–2030).
- Author
-
Victoria, Marta and Gallego-Castillo, Cristobal
- Subjects
- *ELECTRIC power production , *CARBONIZATION , *ELECTRIC power consumption , *RENEWABLE energy sources , *CARBON dioxide mitigation , *ALGORITHMS - Abstract
Highlights: • Hourly-resolved model to investigate highly-renewable electricity generation in Spain. • Correlation analysis of time series to create 900 combinations used in simulations. • Transition paths evaluated based on security of supply, CO2 emissions, and renewable share. • Short-term phase-out of nuclear and coal power plants proven to be feasible. Abstract: Two alternative paths to achieve highly-renewable electricity generation in peninsular Spain are investigated in this paper. Every transition path comprises a description of the installed and decommissioned generation and storage capacities, from 2017 to 2030, as well as a hypothesis on the evolution of the electricity demand. The electricity mix for every hour within the transition path is determined through a dispatch algorithm that prioritizes electricity from renewable energy sources. The simulation is run for 900 different combinations of time series representing the hourly capacity factors of different technologies, as well as the electricity demand. This robust approach allows the evaluation of the transition paths based on the statistical distribution of several defined assessment criteria, such as security of supply, CO2 emissions or renewable share in electricity generation. The feasibility of a Spanish power system with high renewable penetration is investigated not only in a future reference year but throughout the transition path. In particular, a progressive and simultaneous phase-out of nuclear and coal power plants in the short term is proven to be feasible. Furthermore, the sensitivity of the results is analyzed, including scenarios with a delayed nuclear phase-out, lower hydroelectricity generation due to more frequent and severe droughts caused by climate change, and a higher annual increment for the electricity demand. [ABSTRACT FROM AUTHOR]
- Published
- 2019
- Full Text
- View/download PDF
17. An Automatic Algorithm to Date the Reference Cycle of the Spanish Economy.
- Author
-
Camacho, Maximo, Gadea, María Dolores, and Gómez-Loscos, Ana
- Subjects
GAUSSIAN distribution ,BUSINESS cycles ,ECONOMIC indicators ,MARKOV processes ,ALGORITHMS ,RECESSIONS - Abstract
This paper provides an accurate chronology of the Spanish reference business cycle by adapting a multiple change-point model. In that approach, each combination of peaks and troughs dated in a set of economic indicators is assumed to be a realization of a mixture of bivariate Gaussian distributions, whose number of components is estimated from the data. The means of each of these components refer to the dates of the reference turning points. The transitions across the components of the mixture are governed by a Markov chain that is restricted to force left-to-right transition dynamics. In the empirical application, seven recessions in the period from February 1970 to February 2020 are identified, which are in high concordance with the timing of the turning point dates established by the Spanish Business Cycle Dating Committee (SBCDC). [ABSTRACT FROM AUTHOR]
- Published
- 2021
- Full Text
- View/download PDF
18. Statistical Analysis and Machine Learning Prediction of Fog-Caused Low-Visibility Events at A-8 Motor-Road in Spain.
- Author
-
Cornejo-Bueno, Sara, Casillas-Pérez, David, Cornejo-Bueno, Laura, Chidean, Mihaela I., Caamaño, Antonio J., Cerro-Prada, Elena, Casanova-Mateo, Carlos, and Salcedo-Sanz, Sancho
- Subjects
STATISTICS ,MACHINE learning ,PARETO distribution ,MAXIMUM likelihood statistics ,ALGORITHMS ,PEARSON correlation (Statistics) - Abstract
This work presents a full statistical analysis and accurate prediction of low-visibility events due to fog at the A-8 motor-road in Mondoñedo (Galicia, Spain). The present analysis covers two years of study, considering visibility time series and exogenous variables collected in the zone most affected by extreme low-visibility events. This paper thus has a two-fold objective: first, we carry out a statistical analysis for estimating the probability distributions that best fit the fog event duration, using the Maximum Likelihood method and an alternative method known as the L-moments method. This statistical study allows association of the low-visibility depth with the event duration, showing a clear relationship, which can be modeled with distributions for extremes such as the Generalized Extreme Value and Generalized Pareto distributions. Second, we apply a neural network approach, trained by means of the ELM (Extreme Learning Machine) algorithm, to predict the occurrence of low-visibility events due to fog from atmospheric predictive variables. This study provides a full characterization of fog events at this motor-road, where orographic fog is predominant, causing significant traffic problems throughout the year. We also show how the ELM approach is able to obtain highly accurate low-visibility event predictions, with a Pearson correlation coefficient of 0.8, within a half-hour time horizon, enough to initialize protocols aimed at reducing the impact of these extreme events on the traffic of the A-8 motor-road. [ABSTRACT FROM AUTHOR]
- Published
- 2021
- Full Text
- View/download PDF
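The abstract above fits extreme-value distributions to fog-event durations by maximum likelihood and by the L-moments method. A sketch of the L-moments route for the Generalized Pareto case, assuming zero location and Hosking's shape convention (the paper's exact parameterisation may differ):

```python
def sample_l_moments(data):
    """First two sample L-moments (unbiased probability-weighted-moment
    estimators on the sorted sample)."""
    x = sorted(data)
    n = len(x)
    b0 = sum(x) / n
    b1 = sum(i * xi for i, xi in enumerate(x)) / (n * (n - 1))
    return b0, 2.0 * b1 - b0          # l1, l2

def fit_gpd_lmoments(data):
    """Generalized Pareto fit by the method of L-moments (location 0,
    Hosking's shape convention): k = l1/l2 - 2, alpha = l1 * (1 + k)."""
    l1, l2 = sample_l_moments(data)
    k = l1 / l2 - 2.0
    alpha = l1 * (1.0 + k)
    return k, alpha
```

As a sanity check, exponentially distributed durations are the k = 0 boundary case of the GPD, so the fit should recover a shape near zero and a scale near the exponential's mean.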
19. A Method of Estimating Time-to-Recovery for a Disease Caused by a Contagious Pathogen Such as SARS-CoV-2 Using a Time Series of Aggregated Case Reports.
- Author
-
Koutsouris, Dimitrios-Dionysios, Pitoglou, Stavros, Anastasiou, Athanasios, and Koumpouros, Yiannis
- Subjects
DISEASE progression ,COMPUTER software ,COVID-19 ,CONFIDENCE intervals ,TIME ,CONVALESCENCE ,WORLD health ,EPIDEMICS ,TIME series analysis ,DESCRIPTIVE statistics ,SENSITIVITY & specificity (Statistics) ,PREDICTION models ,COVID-19 pandemic ,ALGORITHMS - Abstract
During the outbreak of a disease caused by a pathogen with unknown characteristics, the uncertainty of its progression parameters can be reduced by devising methods that, based on rational assumptions, exploit available information to provide actionable insights. In this study, performed a few (~6) weeks into the outbreak of COVID-19 (caused by SARS-CoV-2), one of the most important disease parameters, the average time-to-recovery, was calculated using data publicly available on the internet (daily reported cases of confirmed infections, deaths, and recoveries), which were fed into an algorithm that matches confirmed cases with deaths and recoveries. Unmatched cases were adjusted based on the matched-cases calculation. The mean time-to-recovery, calculated from all globally reported cases, was found to be 18.01 days (SD 3.31 days) for the matched cases and 18.29 days (SD 2.73 days) when taking into consideration the adjusted unmatched cases as well. The proposed method used limited data and provided experimental results in the same region as clinical studies published several months later. This indicates that the proposed method, combined with expert knowledge and informed calculated assumptions, could provide a meaningful calculated average time-to-recovery figure, which can be used as an evidence-based estimation to support containment and mitigation policy decisions, even at the very early stages of an outbreak. [ABSTRACT FROM AUTHOR]
- Published
- 2023
- Full Text
- View/download PDF
20. A Novel Information Theoretical Criterion for Climate Network Construction.
- Author
-
Cornejo-Bueno, Sara, Chidean, Mihaela I., Caamaño, Antonio J., Prieto-Godino, Luis, and Salcedo-Sanz, Sancho
- Subjects
WIND speed ,CLIMATOLOGY ,CONSTRUCTION ,ALGORITHMS ,FORECASTING ,WIND power plants - Abstract
This paper presents a novel methodology for Climate Network (CN) construction based on the Kullback-Leibler divergence (KLD) among Membership Probability (MP) distributions, obtained from the Second Order Data-Coupled Clustering (SODCC) algorithm. The proposed method is able to obtain CNs with emergent behaviour adapted to the variables being analyzed, and with a low number of spurious or missing links. We evaluate the proposed method in a problem of CN construction to assess differences in wind speed prediction at different wind farms in Spain. The considered problem presents strong local and mesoscale relationships, but low synoptic-scale relationships, which have a direct influence on the CN obtained. We carry out a comparison of the proposed approach with a classical correlation-based CN construction method. We show that the proposed approach based on the SODCC algorithm and the KLD constructs CNs with an emergent behaviour consistent with the physics underlying the wind speed prediction data, unlike the correlation-based method, which produces spurious and missing links. Furthermore, it is shown that the climate network construction method facilitates the evaluation of symmetry properties in the resulting complex networks. [ABSTRACT FROM AUTHOR]
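The general criterion can be sketched as follows: compute a symmetrized Kullback-Leibler divergence between each pair of membership-probability distributions and link the nodes whose divergence falls below a threshold. The threshold and the toy distributions here are assumptions for illustration, not SODCC outputs from the paper:

```python
import math

def kld(p, q):
    """Kullback-Leibler divergence D(p || q) for discrete distributions."""
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

def symmetric_kld(p, q):
    """Symmetrized divergence, so the link criterion does not depend on order."""
    return kld(p, q) + kld(q, p)

def build_network(mp, threshold):
    """Link node pairs whose symmetrized KLD is below `threshold`.

    mp: dict node -> membership-probability distribution (sums to 1).
    Returns the edge list of the resulting network.
    """
    nodes = sorted(mp)
    return [(a, b) for i, a in enumerate(nodes) for b in nodes[i + 1:]
            if symmetric_kld(mp[a], mp[b]) < threshold]

mp = {"A": [0.7, 0.2, 0.1], "B": [0.6, 0.3, 0.1], "C": [0.1, 0.1, 0.8]}
print(build_network(mp, threshold=0.1))  # only A and B are similar enough
```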
- Published
- 2020
- Full Text
- View/download PDF
21. A study of differential microRNA expression profile in migraine: the microMIG exploratory study.
- Author
-
Gallardo, V. J., Gómez-Galván, J. B., Asskour, L., Torres-Ferrús, M., Alpuente, A., Caronna, E., and Pozo-Rosich, P.
- Subjects
RESEARCH ,MONONUCLEAR leukocytes ,MIGRAINE ,RESEARCH methodology ,MICRORNA ,INTERVIEWING ,CASE-control method ,RANDOM forest algorithms ,GENETIC markers ,GENE expression profiling ,QUESTIONNAIRES ,FACTOR analysis ,RESEARCH funding ,CLUSTER analysis (Statistics) ,HEADACHE ,WOMEN'S health ,LONGITUDINAL method ,ALGORITHMS ,EPIGENOMICS - Abstract
Background: Several studies have described potential microRNA (miRNA) biomarkers associated with migraine, but the studies are scarcely reproducible, primarily due to the heterogeneous variability of participants. Increasing evidence shows that disease-related intrinsic factors, together with lifestyle (environmental factors), influence epigenetic mechanisms and, in turn, diseases. Hence, the main objective of this exploratory study was to find differentially expressed miRNAs (DE miRNA) in peripheral blood mononuclear cells (PBMC) of patients with migraine compared to healthy controls in a well-controlled homogeneous cohort of non-menopausal women. Methods: Patients diagnosed with migraine according to the International Classification of Headache Disorders (ICHD-3) and healthy controls without a familial history of headache disorders were recruited. All participants completed a very thorough questionnaire and structured interview in order to control for environmental factors. RNA was extracted from PBMC and a microarray system (GeneChip miRNA 4.1 Array chip, Affymetrix) was used to determine the miRNA profiles between study groups. Principal components analysis and hierarchical clustering analysis were performed to study sample distribution, and random forest (RF) algorithms were computed for the classification task. To evaluate the stability of the results and the prediction error rate, a bootstrap (.632+ rule) was run throughout the whole procedure. Finally, a functional enrichment analysis of selected targets was computed through protein–protein interaction networks. Results: After RF classification, three DE miRNA distinguished the study groups in a very homogeneous female cohort, controlled for factors such as demographics (age and BMI), life habits (physical activity, caffeine and alcohol consumption), comorbidities and clinical features associated with the disease: miR-342-3p, miR-532-3p and miR-758-5p.
Sixty-eight target genes were predicted, linked mainly to enriched ion channels and signaling pathways, neurotransmitter and hormone homeostasis, infectious diseases and circadian entrainment. Conclusions: A novel 3-miRNA signature (miR-342-3p, miR-532-3p and miR-758-5p) has been found differentially expressed between controls and patients with migraine. Enrichment analysis showed that these pathways are closely associated with known migraine pathophysiology, which could lead to the first reliable epigenetic biomarker set. Further studies should be performed to validate these findings in a larger and more heterogeneous sample. [ABSTRACT FROM AUTHOR]
- Published
- 2023
- Full Text
- View/download PDF
22. An Innovative JavaScript-Based Framework for Teaching Backtracking Algorithms Interactively.
- Author
-
Nasralla, Moustafa M.
- Subjects
JAVASCRIPT programming language ,ALGORITHMS ,CONCEPT learning ,ENGINEERING education ,EIGENFUNCTIONS ,DIGITAL learning - Abstract
Algorithm fundamentals are useful to learn at different levels of engineering education. One of the most difficult concepts to teach and understand is backtracking algorithms with proper bounding functions. This article proposes a framework to implement interactive online tools showing examples of backtracking algorithms in which students can graphically observe the execution step by step. This approach is illustrated with the n-queens problem with students from Prince Sultan University, Saudi Arabia, and Complutense University of Madrid, Spain. The results show a 6.67% learning gain on a backtracking exercise in the experimental group over the control group, in which the algorithms were automatically validated with the DOMjudge software (an automated system used to run programming contests). The proposed framework was evaluated as easy to use, with a score of 74.5% on the validated System Usability Scale (SUS); easy to learn, with a score of 6.22 out of 7 on the validated Usefulness, Satisfaction, and Ease-of-Use (USE) scale; and with a general satisfaction of 5.97 out of 7 on the validated USE scale. [ABSTRACT FROM AUTHOR]
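The n-queens problem used in the study is the canonical illustration of backtracking with a bounding function. A minimal sketch (in Python rather than the framework's JavaScript): the bounding function rejects any partial placement in which the new queen attacks an earlier one, pruning the search tree before a full board is built.

```python
def n_queens(n):
    """Count solutions to the n-queens problem by backtracking."""

    def safe(cols, col):
        # Bounding function: prune if the new queen shares a column
        # or diagonal with any queen already placed.
        row = len(cols)
        return all(c != col and abs(c - col) != row - r
                   for r, c in enumerate(cols))

    def place(cols):
        # cols[r] is the column of the queen on row r.
        if len(cols) == n:
            return 1
        return sum(place(cols + [col]) for col in range(n) if safe(cols, col))

    return place([])

print(n_queens(8))  # → 92, the classic 8x8 solution count
```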
- Published
- 2022
- Full Text
- View/download PDF
23. Estimation of RVoG Scene Parameters by Means of PolInSAR With TanDEM-X Data: Effect of the Double-Bounce Contribution.
- Author
-
Romero-Puig, Noelia, Lopez-Sanchez, Juan M., and Ballester-Berman, J. David
- Subjects
CROPS ,ALGORITHMS ,PADDY fields ,BISTATIC radar ,GROUND vegetation cover - Abstract
This article evaluates the effect of the double-bounce (DB) decorrelation term that appears in single-pass bistatic acquisitions, as in the TanDEM-X system, on the inversion of scene parameters by means of polarimetric SAR interferometry (PolInSAR). The retrieval of all scene parameters involved in the Random Volume over Ground (RVoG) model (i.e., ground topography, vegetation height, extinction, and ground-to-volume ratios) is affected by this term when the radar response from the ground is dominated by the DB. The estimation error in all these parameters is analyzed by means of simulations over a wide range of system configurations and scene variables for both agricultural crop and forest scenarios. Simulations demonstrate that the inclusion of the DB term, which complicates the inversion algorithm, is necessary for angles of incidence shallower than 30° to achieve an estimation error below 10% in vegetation height and to avoid a significant underestimation of the ground-to-volume ratios. At steep incidences, this decorrelation term does not affect the estimation of vegetation height and ground-to-volume ratios. Regarding the extinction, this parameter is intrinsically not well estimated, since most retrieved values are close to the initial guesses employed for the optimization algorithm, regardless of whether the DB decorrelation term is used. Finally, these findings are compared with the experimental results from TanDEM-X data acquired over rice fields in Spain for the available system parameters (baseline and incidence angle) of the acquired data set. [ABSTRACT FROM AUTHOR]
- Published
- 2020
- Full Text
- View/download PDF
24. A genome-wide analysis of copy number variation in Murciano-Granadina goats.
- Author
-
Guan, Dailu, Martínez, Amparo, Castelló, Anna, Landi, Vincenzo, Luigi-Sierra, María Gracia, Fernández-Álvarez, Javier, Cabrera, Betlem, Delgado, Juan Vicente, Such, Xavier, Jordana, Jordi, and Amills, Marcel
- Subjects
GOAT breeds ,GOATS ,ATP-binding cassette transporters ,GENE targeting ,DNA copy number variations ,ALGORITHMS ,SECRETION ,GENETIC transduction - Abstract
Background: In this work, our aim was to generate a map of the copy number variations (CNV) segregating in a population of Murciano-Granadina goats, the most important dairy breed in Spain, and to ascertain the main biological functions of the genes that map to copy number variable regions. Results: Using a dataset that comprised 1036 Murciano-Granadina goats genotyped with the Goat SNP50 BeadChip, we were able to detect 4617 and 7750 autosomal CNV with the PennCNV and QuantiSNP software, respectively. By applying the EnsembleCNV algorithm, these CNV were assembled into 1461 CNV regions (CNVR), of which 486 (33.3% of the total CNVR count) were consistently called by PennCNV and QuantiSNP and used in subsequent analyses. In this set of 486 CNVR, we identified 78 gain, 353 loss and 55 gain/loss events. The total length of all the CNVR (95.69 Mb) represented 3.9% of the goat autosomal genome (2466.19 Mb), whereas their size ranged from 2.0 kb to 11.1 Mb, with an average size of 196.89 kb. Functional annotation of the genes that overlapped with the CNVR revealed an enrichment of pathways related with olfactory transduction (fold-enrichment = 2.33, q-value = 1.61 × 10⁻¹⁰), ABC transporters (fold-enrichment = 5.27, q-value = 4.27 × 10⁻⁴) and bile secretion (fold-enrichment = 3.90, q-value = 5.70 × 10⁻³). Conclusions: A previous study reported that the average number of CNVR per goat breed was ~20 (978 CNVR/50 breeds), which is much smaller than the number we found here (486 CNVR). We attribute this difference to the fact that the previous study included multiple caprine breeds that were represented by small to moderate numbers of individuals. Given the low frequencies of CNV (in our study, the average frequency of CNV is 1.44%), such a design would probably underestimate the levels of the diversity of CNV at the within-breed level. We also observed that functions related with sensory perception, metabolism and embryo development are overrepresented in the set of genes that overlapped with CNV, and that these loci often belong to large multigene families with tens, hundreds or thousands of paralogous members, a feature that could favor the occurrence of duplications or deletions by non-allelic homologous recombination. [ABSTRACT FROM AUTHOR]
- Published
- 2020
- Full Text
- View/download PDF
25. Players’ selection for basketball teams, through Performance Index Rating, using multiobjective evolutionary algorithms.
- Author
-
Pérez-Toledano, Miguel Ángel, Rodriguez, Francisco J., García-Rubio, Javier, and Ibañez, Sergio José
- Subjects
EVOLUTIONARY algorithms ,BASKETBALL teams ,SPORTS competitions ,SPORTS administration ,BIOLOGICAL evolution ,DIFFERENTIAL evolution - Abstract
In any sport, the selection of players for a team is fundamental to its subsequent performance. Many factors condition the selection process, from the characteristics of the sport discipline to financial limitations, including a long list of restrictions associated with the environment of the competitions in which the team takes part. All of this makes the process of selecting a roster of players very complex, as it is affected by multiple variables and in many cases marked by a great deal of subjectivity. The purpose of this article was to objectively select the players for a basketball team using an evolutionary algorithm, the Non-dominated Sorting Genetic Algorithm II (NSGA-II), which uses stochastic search methods based on the imitation of natural biological evolution. The sample was composed of the players from the teams competing in the top Spanish basketball league, the Association of Basketball Clubs (ACB). To assess the quality of the solutions obtained, the results were compared with the ACB teams playing in the same competition as the players used in the study. The results make it possible to obtain different solutions for composing teams that make financial resources profitable while taking into account the restrictions of the competition and of each club's sports management. [ABSTRACT FROM AUTHOR]
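The core step of NSGA-II is ranking candidate solutions by Pareto dominance across several objectives. A minimal sketch of that step, with hypothetical rosters scored on two objectives (a performance index rating to maximize and a salary cost to minimize; the full algorithm adds crowding distance, selection, crossover and mutation):

```python
def dominates(a, b):
    """a dominates b if a is no worse in every objective and strictly
    better in at least one. Objectives: (performance max, cost min)."""
    perf_a, cost_a = a
    perf_b, cost_b = b
    return (perf_a >= perf_b and cost_a <= cost_b) and \
           (perf_a > perf_b or cost_a < cost_b)

def pareto_front(rosters):
    """Return the non-dominated rosters (NSGA-II's first front)."""
    return [r for r in rosters
            if not any(dominates(other, r) for other in rosters if other is not r)]

# Hypothetical rosters as (performance index rating, salary cost in M EUR).
rosters = [(120, 10.0), (110, 6.0), (100, 9.0), (130, 14.0)]
print(pareto_front(rosters))  # (100, 9.0) is dominated by (110, 6.0)
```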
- Published
- 2019
- Full Text
- View/download PDF
26. Corneal Stability following Hyperopic LASIK with Advanced Laser Ablation Profiles Analyzed by a Light Propagation Study.
- Author
-
Gharaibeh, Almutez M., Villanueva, Asier, Mas, David, Espinosa, Julian, and Alió, Jorge L.
- Subjects
CORNEA physiology ,ALGORITHMS ,CORNEAL topography ,HYPEROPIA ,SCIENTIFIC observation ,POSTOPERATIVE period ,REGRESSION analysis ,SURGEONS ,VISUAL acuity ,LASIK ,STATISTICAL reliability ,RETROSPECTIVE studies - Abstract
Purpose. To assess anterior corneal surface stability 12 months following hyperopic LASIK correction with a light propagation algorithm. Setting. Vissum Instituto Oftalmológico de Alicante, Universidad Miguel Hernández, Alicante, Spain. Methods. This retrospective consecutive observational study includes 37 eyes of 37 patients treated with a 6th-generation excimer laser platform (Schwind Amaris). Hyperopic LASIK was performed in all of them by the same surgeon (JLA), and all completed a 12-month follow-up. Corneal topography was analyzed with a light propagation algorithm to assess the stability of the corneal outcomes over one year of follow-up. Results. Between three and twelve months postoperatively, an objective corneal power (OCP) regression of 0.39 D and 0.41 D was found for the 6 mm and 9 mm central corneal zones, respectively. Subjective outcomes at the end of the follow-up period were as follows: 65% of eyes had a spherical equivalent within ±0.50 D; 70% of eyes had an uncorrected distance visual acuity of 20/20 or better; 86% of eyes had the same or better corrected distance visual acuity. In terms of stability, 0.14 D of regression was found. No statistically significant differences were found for any of the study parameters evaluated at different postoperative moments over the 12-month period. Conclusions. Light propagation analysis confirms corneal surface stability following modern hyperopic LASIK with 6th-generation excimer laser technology over a 12-month period. [ABSTRACT FROM AUTHOR]
- Published
- 2018
- Full Text
- View/download PDF
27. Predicting the onset of hazardous alcohol drinking in primary care: development and validation of a simple risk algorithm.
- Author
-
Bellón, Juan Ángel, de Dios Luna, Juan, King, Michael, Nazareth, Irwin, Motrico, Emma, GildeGómez-Barragán, María Josefa, Torres-González, Francisco, Montón-Franco, Carmen, Sánchez-Celaya, Marta, Díaz-Barreiros, Miguel Ángel, Vicens, Catalina, and Moreno-Peral, Patricia
- Subjects
ALCOHOL drinking ,PRIMARY care ,CLINICAL prediction rules ,ALGORITHMS ,CHILD sexual abuse ,SMOKING ,PREVENTION of alcoholism ,PSYCHOLOGY of alcoholism ,ALCOHOLISM ,COMPARATIVE studies ,LONGITUDINAL method ,RESEARCH methodology ,MEDICAL cooperation ,PRIMARY health care ,PROGNOSIS ,QUESTIONNAIRES ,RESEARCH ,RISK assessment ,EVALUATION research ,BEHAVIOR disorders - Abstract
Background: Little is known about the risk of progressing to hazardous alcohol use in abstinent or low-risk drinkers. Aim: To develop and validate a simple brief risk algorithm for the onset of hazardous alcohol drinking (HAD) over 12 months for use in primary care. Design and Setting: Prospective cohort study in 32 health centres from six Spanish provinces, with evaluations at baseline, 6 months, and 12 months. Method: Forty-one risk factors were measured, and multilevel logistic regression and inverse probability weighting were used to build the risk algorithm. The outcome was the new occurrence of HAD during the study, as measured by the AUDIT. Results: From the lists of 174 GPs, 3954 adult abstinent or low-risk drinkers were recruited. The 'predictAL-10' risk algorithm included just nine variables (10 questions): province, sex, age, cigarette consumption, perception of financial strain, having ever received treatment for an alcohol problem, childhood sexual abuse, AUDIT-C, and the AUDIT-C*age interaction. The c-index was 0.886 (95% CI = 0.854 to 0.918). The optimal cutoff had a sensitivity of 0.83 and specificity of 0.80. Excluding childhood sexual abuse from the model (the 'predictAL-9'), the c-index was 0.880 (95% CI = 0.847 to 0.913), sensitivity 0.79, and specificity 0.81. There was no statistically significant difference between the c-indexes of predictAL-10 and predictAL-9. Conclusion: The predictAL-10/9 is a simple and internally valid risk algorithm to predict the onset of hazardous alcohol drinking over 12 months in primary care attendees; it is a brief tool that is potentially useful for primary prevention of hazardous alcohol drinking. [ABSTRACT FROM AUTHOR]
- Published
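The c-index reported for predictAL-10 (0.886) measures discrimination: the probability that a randomly chosen case receives a higher risk score than a randomly chosen non-case. A minimal sketch with hypothetical scores and outcomes (ties between scores count as one half):

```python
from itertools import product

def c_index(scores, outcomes):
    """Concordance (c) index: P(score of a case > score of a non-case),
    with tied scores counting 0.5. Equivalent to the ROC AUC."""
    cases = [s for s, y in zip(scores, outcomes) if y == 1]
    noncases = [s for s, y in zip(scores, outcomes) if y == 0]
    pairs = [(c > n) + 0.5 * (c == n) for c, n in product(cases, noncases)]
    return sum(pairs) / len(pairs)

# Hypothetical risk scores; outcomes: 1 = developed HAD, 0 = did not.
scores = [0.9, 0.8, 0.4, 0.4, 0.1]
outcomes = [1, 1, 1, 0, 0]
print(c_index(scores, outcomes))  # → 0.9166666666666666 (5.5 of 6 pairs)
```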
- 2017
- Full Text
- View/download PDF
28. Effect of country-of-origin contextual factors and length of stay on immigrants' substance use in Spain.
- Author
-
Sordo, L., Indave, B. I., Vallejo, F., Belza, M. J., Sanz-Barbero, B., Rosales-Statkus, M., Fernández-Balbuena, S., and Barrio, G.
- Subjects
SUBSTANCE abuse ,ALGORITHMS ,CONFIDENCE intervals ,EMIGRATION & immigration ,LENGTH of stay in hospitals ,POISSON distribution ,RESEARCH funding ,STATISTICAL sampling ,DATA analysis software ,DESCRIPTIVE statistics - Abstract
Background: Factors explaining disparities in the risk of substance use between immigrants and natives, and between immigrant subgroups, are poorly understood. We aimed to describe such disparities and identify some explanatory factors in Spain. Methods: Participants were residents aged 15–64 years from the 2005–07 nationally representative surveys. Outcomes were the prevalences of alcohol, tobacco, sedative-hypnotic, cannabis and other illegal substance use. Immigrants were classed as recent if they had <5 years of Spanish stay and long term if ≥10 years. Country-of-origin income per capita and population-level substance use were taken from international databases. Adjusted prevalence ratios (aPRs) and percent change from Poisson regression with robust variance were used to estimate risk disparities and the effects of immigration variables. Results: Most immigrants had lower substance use than natives, although it generally increased with increasing Spanish stay, especially for illegal substances. This lower risk could be partially explained by country-of-origin contextual factors, such as a lower level of income or substance use, and religious or cultural factors such as Islam. By origin, recent-immigrant aPRs and convergence-divergence risk patterns were, respectively, as follows: lower aPRs with upward convergence (often incomplete) toward natives' risk in immigrants from the Muslim area, Eastern Europe and Latin America excluding the South Cone; lower/similar aPRs with upward overtaking or divergent patterns in South-Cone Americans; and similar/higher aPRs with stable or upward divergent patterns in non-Eastern Europeans. Conclusion: Spain is a host context that seems to facilitate increased substance use among immigrants, even those from countries with prevalences close to Spain's. However, country-of-origin context is important in explaining disparities in substance use among immigrants. [ABSTRACT FROM AUTHOR]
- Published
- 2015
- Full Text
- View/download PDF
29. A New Methodology to Study Street Accessibility: A Case Study of Avila (Spain).
- Author
-
Curado, Manuel, Rodriguez, Rocio, Jimenez, Manuel, Tortosa, Leandro, and Vicent, Jose F.
- Subjects
ALGORITHMS ,MUNICIPAL services ,ECONOMIC models ,ECONOMIC impact ,FACTOR structure ,STREETS ,LOCAL transit access - Abstract
Taking into account that accessibility is one of the most strategic and determining factors in economic models, and that accessibility and tourism affect each other, the study and improvement of one of them fosters the development of the other. Using network analysis, this study presents an algorithm for labeling the difficulty of the streets of a city using different accessibility parameters. We combine network structure and accessibility factors to explore the association between innovative behavior within the street network and the commercial activity in a city. Finally, we present a case study of the city of Avila, locating the most inaccessible areas of the city using centrality measures and analyzing the effects, in terms of accessibility, on the commerce and services of the city. [ABSTRACT FROM AUTHOR]
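Centrality measures of the kind used to locate central and inaccessible areas can be illustrated with closeness centrality on a toy street graph; the intersection names and edges below are hypothetical, not data from the Avila case study:

```python
from collections import deque

def closeness(graph, node):
    """Closeness centrality: reachable-node count divided by the sum of
    unweighted shortest-path distances from `node` (computed via BFS)."""
    dist = {node: 0}
    queue = deque([node])
    while queue:
        u = queue.popleft()
        for v in graph[u]:
            if v not in dist:
                dist[v] = dist[u] + 1
                queue.append(v)
    others = [d for n, d in dist.items() if n != node]
    return len(others) / sum(others)

# Hypothetical street network: intersections A..E, edges = street segments.
streets = {"A": ["B"], "B": ["A", "C"], "C": ["B", "D", "E"],
           "D": ["C"], "E": ["C"]}
print(max(streets, key=lambda n: closeness(streets, n)))  # → C (most central)
```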
- Published
- 2021
- Full Text
- View/download PDF
30. Contribution of Driving Efficiency to Vehicle-to-Building.
- Author
-
Borge-Diez, David, Ortega-Cabezas, Pedro Miguel, Colmenar-Santos, Antonio, and Blanes-Peiró, Jorge Juan
- Subjects
APPLICATION program interfaces ,ELECTRIC power consumption ,SOCIAL groups ,BRAIN-computer interfaces ,ALGORITHMS - Abstract
Energy consumption in the transport sector and in buildings is of great concern. This research aims to quantify how eco-routing, eco-driving and eco-charging can increase the amount of energy available for vehicle-to-building. To do this, the working population was broken down into social groups (freelancers, local workers and commuters) who reside in two cities in different climate zones (Alcalá de Henares, Spain and Jaén, Spain), since each group uses electric vehicles differently. An algorithm based on the Here® application program interface and neural networks was implemented to acquire data on the stochastic usage of EVs by each social group. Finally, the increase in the amount of energy available for vehicle-to-building was assessed thanks to the algorithm. The results per day were as follows. Owing to the proposed algorithm, a reduction ranging from 0.6 kWh to 2.2 kWh was obtained, depending on the social group. The proposed algorithm facilitated an increase in energy available for vehicle-to-building ranging from 13.2 kWh to 33.6 kWh, depending on the social group. The results show that current charging policies are not compatible with all social groups and do not consider the renewable energy contribution to the total electricity demand. [ABSTRACT FROM AUTHOR]
- Published
- 2021
- Full Text
- View/download PDF
31. Clinical validation of automatic local activation time annotation during focal premature ventricular complex ablation procedures.
- Author
-
Acosta, Juan, Soto-Iglesias, David, Fernández-Armenta, Juan, Frutos-López, Manuel, Jáuregui, Beatriz, Arana-Rueda, Eduardo, Fernández, Marcos, Penela, Diego, Alcaine, Alejandro, Cano, Lucas, Pedrote, Alonso, and Berruezo, Antonio
- Subjects
ARRHYTHMIA diagnosis ,ACTION potentials ,ALGORITHMS ,ARRHYTHMIA ,CATHETER ablation ,COMPARATIVE studies ,HEART beat ,HEART function tests ,RESEARCH methodology ,MEDICAL cooperation ,RESEARCH ,RESEARCH evaluation ,SIGNAL processing ,TIME ,EVALUATION research ,TREATMENT effectiveness ,PREDICTIVE tests - Abstract
Aims: Current navigation systems incorporate algorithms for automatic identification of local activation time (LAT). However, data about their utility and accuracy in premature ventricular complex (PVC) ablation procedures are scarce. This study analyses the accuracy of an algorithmic method based on automatic annotation of the maximal negative slope of the unipolar electrogram within the window demarcated by the bipolar electrogram compared with conventional manual annotation during PVC ablation procedures. Methods and results: Forty patients with successful ablation of focal PVC in three centres were included. Electroanatomical activation maps obtained with the automatic system (WF-map) were compared with manual annotation maps (M-map). Correlation and concordance of LAT obtained with both methods were assessed at 3536 points. The distance between the earliest activation site (EAS) and the effective radiofrequency application point (e-RFp) was determined in the M-map and WF-map. The distance between the WF-EAS and M-EAS was assessed. Successful ablation sites included the left ventricular outflow tract (LVOT; 55%), right ventricular outflow tract (40%), and tricuspid annulus (5%). Good correlation was observed between the two annotation approaches (r = 0.655; P < 0.0001). Bland-Altman analysis revealed a systematic delayed detection of LAT by WF-map (bias 33.8 ± 30.9 ms), being higher in the LVOT than in the right ventricle (42.6 ± 29.2 vs. 27.2 ± 30.5 ms, respectively; P < 0.0001). No difference in the EAS-eRFp distance was observed between M-map and WF-map (1.8 ± 2.8 vs. 1.8 ± 3.4 mm, respectively; P = 0.986). The median (interquartile range) distance between the WF-EAS and M-EAS was 2.2 (0–6) mm. Conclusion: Good correlation was found between M-map and WF-map. Local activation time detection was systematically delayed in WF-map, especially in the LVOT. Accurate identification of the e-RFp was achieved with both annotation approaches. [ABSTRACT FROM AUTHOR]
- Published
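The annotation criterion described above (maximal negative slope of the unipolar electrogram within the bipolar-defined window) can be sketched as follows; the electrogram samples, window, and sampling interval are hypothetical, for illustration only:

```python
def local_activation_time(unipolar, window, dt=1.0):
    """Annotate LAT as the time of maximal negative slope (minimum dV/dt)
    of the unipolar electrogram within the bipolar-defined window.

    unipolar: sampled voltages; window: (start, end) sample indices;
    dt: sampling interval in ms. Returns the annotated time in ms.
    """
    start, end = window
    # Forward-difference slope at each sample inside the window.
    slopes = [(unipolar[i + 1] - unipolar[i]) / dt for i in range(start, end)]
    # Index of the most negative slope, offset back to absolute samples.
    return (start + min(range(len(slopes)), key=slopes.__getitem__)) * dt

# Hypothetical electrogram: steepest downstroke between samples 4 and 5.
egm = [0.0, 0.1, 0.3, 0.2, 0.5, -0.6, -0.4, -0.1, 0.0]
print(local_activation_time(egm, window=(2, 7)))  # → 4.0 (ms)
```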
- 2018
- Full Text
- View/download PDF
Discovery Service for Jio Institute Digital Library