128 results
Search Results
2. The Challenges of Algorithm Management: The Spanish Perspective.
- Author
-
Prado, Daniel Perez del
- Subjects
ALGORITHMS ,LABOR laws ,DISRUPTIVE innovations ,ARTIFICIAL intelligence ,DIGITAL technology - Abstract
This paper focuses on how Spain's labour and employment law is dealing with technological disruption and, particularly, with algorithm management, looking for a harmonious equilibrium between traditional structures and profound changes. It pays special attention to the different actors affected and the most recent normative changes. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
3. Confidence of a k-Nearest Neighbors Python Algorithm for the 3D Visualization of Sedimentary Porous Media.
- Author
-
Bullejos, Manuel, Cabezas, David, Martín-Martín, Manuel, and Alcalá, Francisco Javier
- Subjects
PYTHON programming language ,K-nearest neighbor classification ,POROUS materials ,CONFIDENCE ,ECONOMIC decision making ,ALGORITHMS - Abstract
In a previous paper, the authors implemented a machine learning k-nearest neighbors (KNN) algorithm and Python libraries to create two 3D interactive models of the stratigraphic architecture of the Quaternary onshore Llobregat River Delta (NE Spain) for groundwater exploration purposes. The main limitation of that paper was its lack of routines for evaluating the confidence of the 3D models. Building on the previous work, this paper refines the programming code and introduces an additional algorithm to evaluate the confidence of the KNN predictions. A variant of the Similarity Ratio method was used to quantify the KNN prediction confidence. This variant used weights inversely proportional to the distance between each grain-size class and the inferred point to compute a value that plays the role of similarity. While the KNN algorithm and Python libraries demonstrated their efficacy for obtaining 3D models of the stratigraphic arrangement of sedimentary porous media, the KNN prediction confidence verified the certainty of the 3D models. In the Llobregat River Delta, the KNN prediction confidence at each prospecting depth was a function of the available data density at that depth. As expected, the KNN prediction confidence decreased with the decreasing data density at greater depths. The average-weighted confidence was in the 0.44–0.53 range for gravel bodies at prospecting depths in the 12.7–72.4 m b.s.l. range, and in the 0.42–0.55 range for coarse sand bodies at prospecting depths in the 4.6–83.9 m b.s.l. range. In a couple of cases, spurious average-weighted confidences of 0.29 in one gravel body and 0.30 in one coarse sand body were obtained. These figures were interpreted as the result of the quite different weights of neighbors from different grain-size classes at short distances. The KNN algorithm confidence proved suitable for identifying these anomalous results in the supposedly carefully cleaned grain-size database used in this study. The introduced KNN confidence quantifies the reliability of the 3D interactive models, which is a necessary stage for decision making in economic and environmental geology. In the Llobregat River Delta, this quantification clearly improves groundwater exploration predictability. [ABSTRACT FROM AUTHOR] (See the illustrative sketch after this record.)
- Published
- 2023
- Full Text
- View/download PDF
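The following minimal Python sketch (not the authors' code; the weighting scheme is assumed from the description above) illustrates how inverse-distance weights over the k nearest neighbours can yield both a KNN prediction and a similarity-ratio-style confidence in [0, 1].

```python
# Minimal sketch of a distance-weighted KNN confidence score:
# each of the k nearest neighbours contributes a weight inversely
# proportional to its distance, and the confidence of the predicted
# class is that class's share of the total weight.
import numpy as np
from sklearn.neighbors import NearestNeighbors

def knn_confidence(X_train, y_train, X_query, k=5, eps=1e-9):
    """Return (predicted_class, confidence) for each query point."""
    nn = NearestNeighbors(n_neighbors=k).fit(X_train)
    dist, idx = nn.kneighbors(X_query)        # shapes: (n_query, k)
    w = 1.0 / (dist + eps)                    # inverse-distance weights
    preds, confs = [], []
    for weights, neighbours in zip(w, idx):
        classes = y_train[neighbours]
        totals = {}                           # summed weight per class
        for c, wi in zip(classes, weights):
            totals[c] = totals.get(c, 0.0) + wi
        best = max(totals, key=totals.get)
        preds.append(best)
        confs.append(totals[best] / weights.sum())  # similarity-ratio share
    return np.array(preds), np.array(confs)

# Example with synthetic grain-size classes (0 = gravel, 1 = coarse sand):
rng = np.random.default_rng(0)
X = rng.random((200, 3))                      # x, y, depth coordinates
y = (X[:, 2] > 0.5).astype(int)
print(knn_confidence(X, y, rng.random((3, 3))))
```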
4. A K-Nearest Neighbors Algorithm in Python for Visualizing the 3D Stratigraphic Architecture of the Llobregat River Delta in NE Spain.
- Author
-
Bullejos, Manuel, Cabezas, David, Martín-Martín, Manuel, and Alcalá, Francisco Javier
- Subjects
K-nearest neighbor classification ,SUPERVISED learning ,PYTHON programming language ,ALGORITHMS ,MACHINE learning ,SEDIMENTARY structures ,PLIOCENE Epoch - Abstract
The k-nearest neighbors (KNN) algorithm is a non-parametric supervised machine learning classifier that uses proximity and similarity to make classifications or predictions about the grouping of an individual data point. This ability makes the KNN algorithm ideal for classifying datasets of geological variables and parameters prior to 3D visualization. This paper introduces a machine learning KNN algorithm and Python libraries for visualizing the 3D stratigraphic architecture of sedimentary porous media in the Quaternary onshore Llobregat River Delta (LRD) in northeastern Spain. A first HTML model showed a consecutive 5 m-equispaced set of horizontal sections of the granulometry classes created with the KNN algorithm from 0 to 120 m below sea level in the onshore LRD. A second HTML model showed the 3D mapping of the main Quaternary gravel and coarse sand sedimentary bodies (lithosomes) and the top surface of the basement (Pliocene and older rocks) created with Python libraries. These results reproduce well the complex sedimentary structure of the LRD reported in recent scientific publications and prove the suitability of the KNN algorithm and Python libraries for visualizing the 3D stratigraphic structure of sedimentary porous media, which is a crucial stage in decision making in different environmental and economic geology disciplines. [ABSTRACT FROM AUTHOR] (See the illustrative sketch after this record.)
- Published
- 2022
- Full Text
- View/download PDF
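As a companion illustration, the sketch below (an assumed workflow, not the published code) shows how a KNN classifier trained on scattered borehole samples can label every node of a regular 3D grid, from which 5 m-equispaced horizontal sections can be rendered.

```python
# Illustrative sketch: a KNN classifier assigns a granulometry class to
# every node of a regular 3D grid from scattered (hypothetical) borehole
# samples; the grid can then be rendered as sections or isosurfaces.
import numpy as np
from sklearn.neighbors import KNeighborsClassifier

rng = np.random.default_rng(1)
# Hypothetical borehole data: (x, y, depth) -> granulometry class 0..3
X_samples = rng.random((500, 3)) * [10_000, 10_000, 120]
y_samples = rng.integers(0, 4, size=500)

knn = KNeighborsClassifier(n_neighbors=7, weights="distance")
knn.fit(X_samples, y_samples)

# Regular grid from 0 to 120 m depth in 5 m steps
xs = np.linspace(0, 10_000, 50)
ys = np.linspace(0, 10_000, 50)
zs = np.arange(0, 125, 5)
grid = np.array([(x, y, z) for z in zs for y in ys for x in xs])
classes = knn.predict(grid).reshape(len(zs), len(ys), len(xs))
print(classes.shape)   # one horizontal section per 5 m depth step
```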
5. Variable neighborhood search to solve the generalized orienteering problem.
- Author
-
Urrutia‐Zambrana, Adolfo, Tirado, Gregorio, and Mateos, Alfonso
- Subjects
NEIGHBORHOODS ,ALGORITHMS ,METAHEURISTIC algorithms - Abstract
This paper presents a variable neighborhood search (VNS) algorithm to solve the extension of the orienteering problem known as the generalized orienteering problem (GOP). Our algorithm aims to use a reduced number of neighborhoods without compromising the quality of the results. This reduced number of neighborhoods, together with the precalculation of scores associated with points of interest, allows us, in most cases, to outperform all previous metaheuristics proposed for this problem. This is the first time a VNS has been applied to the GOP, and it provides promising computational results. In particular, in the case studies considered in the paper, we were able to find 35 new best solutions, all of them in shorter computational times. Furthermore, the information on other best-known solutions reported in the literature has also been improved, with corrections to some previously published errors in scores and distances. In addition, the benchmark has been extended with new case studies based on real data from three of the most popular tourist cities in Spain. [ABSTRACT FROM AUTHOR] (A generic VNS skeleton follows this record.)
- Published
- 2021
- Full Text
- View/download PDF
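For readers unfamiliar with the metaheuristic, here is a generic VNS skeleton in Python; it sketches the shake / local-search / neighborhood-change loop itself, not the authors' GOP-specific neighborhoods or scoring.

```python
# Generic variable neighborhood search skeleton (illustrative only).
import random

def local_search(sol, score):
    # Placeholder first-improvement search: try random pairwise swaps.
    for _ in range(50):
        i, j = random.sample(range(len(sol)), 2)
        cand = list(sol)
        cand[i], cand[j] = cand[j], cand[i]
        if score(cand) > score(sol):
            sol = cand
    return sol

def vns(initial, neighborhoods, score, max_iter=1000):
    """neighborhoods: list of functions, each mapping a solution to a
    random neighbour in neighborhood N_k (the 'shaking' step)."""
    best, best_score, it = initial, score(initial), 0
    while it < max_iter:
        k = 0
        while k < len(neighborhoods) and it < max_iter:
            candidate = neighborhoods[k](best)          # shake in N_k
            candidate = local_search(candidate, score)  # improve locally
            if score(candidate) > best_score:
                best, best_score = candidate, score(candidate)
                k = 0                                   # restart from N_1
            else:
                k += 1                                  # widen neighborhood
            it += 1
    return best, best_score

# Toy usage: maximise the score of a permutation of points of interest.
random.seed(0)
pois = list(range(10))
score = lambda s: sum(i * v for i, v in enumerate(s))
shake = lambda s: random.sample(s, len(s))
print(vns(pois, [shake, shake], score, max_iter=200))
```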
6. KNN and adaptive comfort applied in decision making for HVAC systems.
- Author
-
Aparicio-Ruiz, Pablo, Barbadilla-Martín, Elena, Guadix, José, and Cortés, Pablo
- Subjects
THERMAL comfort ,DECISION making ,SUPPORT vector machines ,ALGORITHMS ,AIR conditioning ,HEATING & ventilation industry - Abstract
Selecting a suitable set-point temperature for a heating, ventilating and air conditioning system is an energy and environmental challenge for our society. In the present paper, a general framework for defining such a temperature based on a dynamic adaptive comfort algorithm is proposed. Because the thermal comfort of a building's occupants has different ranges of acceptability, the method learns the comfort temperature with respect to the running mean temperature and thereby decides the suitable range of indoor temperature. It is demonstrated that this solution makes it possible to dynamically build an adaptive comfort algorithm, i.e. an algorithm based on human thermal adaptability, without applying the traditional theory. The proposed methodology, based on the K-Nearest-Neighbour algorithm, was tested and compared with data from an experimental thermal comfort field study carried out in a mixed-mode building in south-western Spain and with the Support Vector Machine method. The results show that the K-Nearest-Neighbour algorithm represents the pattern of thermal comfort data better than the traditional solution, and that it is a suitable method for learning the thermal comfort area of a building and defining the set-point temperature for a heating, ventilating and air-conditioning system. [ABSTRACT FROM AUTHOR]
- Published
- 2021
- Full Text
- View/download PDF
7. Comparison of Optimisation Algorithms for Centralised Anaerobic Co-Digestion in a Real River Basin Case Study in Catalonia.
- Author
-
Palma-Heredia D, Verdaguer M, Puig V, Poch M, and Cugueró-Escofet MÀ
- Subjects
- Anaerobiosis, Digestion, Spain, Algorithms, Rivers
- Abstract
Anaerobic digestion (AnD) is a process that converts organic waste into a source of energy such as biogas, introducing sustainability and circular economy into waste treatment. AnD is an intricate process because of the multiple parameters involved, and its complexity increases when the wastes come from different types of generators. In this context, optimisation methods are key to achieving good performance. Many tools have been developed to optimise a single AnD plant; however, the study of a network of AnD plants and multiple waste generators, all in different locations, remains unexplored. This novel approach requires optimisation methodologies able to deal with a highly complex combinatorial problem. This paper proposes and compares three evolutionary algorithms: ant colony optimisation (ACO), a genetic algorithm (GA) and particle swarm optimisation (PSO), which are especially suited to this type of application. The algorithms successfully solve the problem, using an objective function that includes terms related to quality and logistics. Their application to a real case study in Catalonia (Spain) shows their usefulness (ACO and GA for maximum biogas production and PSO for safer operating conditions) for AnD facilities. (A minimal GA sketch follows this record.)
- Published
- 2022
- Full Text
- View/download PDF
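The sketch below is a minimal genetic algorithm for a toy version of this combinatorial problem (all data and the objective weights are invented): each chromosome assigns waste generators to AnD plants, and fitness trades a quality term against transport distance.

```python
# Minimal GA sketch for assigning waste generators to AnD plants
# (illustrative only; not the paper's model or objective function).
import random

N_GENERATORS, N_PLANTS = 20, 3
random.seed(42)
dist = [[random.uniform(1, 50) for _ in range(N_PLANTS)]
        for _ in range(N_GENERATORS)]          # km, hypothetical
quality = [[random.uniform(0, 1) for _ in range(N_PLANTS)]
           for _ in range(N_GENERATORS)]       # generator-plant compatibility

def fitness(assign):
    biogas = sum(quality[g][assign[g]] for g in range(N_GENERATORS))
    transport = sum(dist[g][assign[g]] for g in range(N_GENERATORS))
    return biogas - 0.01 * transport           # weighted quality vs logistics

def evolve(pop_size=50, generations=100, p_mut=0.1):
    pop = [[random.randrange(N_PLANTS) for _ in range(N_GENERATORS)]
           for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        elite = pop[: pop_size // 2]           # selection
        children = []
        while len(children) < pop_size - len(elite):
            a, b = random.sample(elite, 2)
            cut = random.randrange(N_GENERATORS)        # one-point crossover
            child = a[:cut] + b[cut:]
            if random.random() < p_mut:                 # mutation
                child[random.randrange(N_GENERATORS)] = random.randrange(N_PLANTS)
            children.append(child)
        pop = elite + children
    return max(pop, key=fitness)

best = evolve()
print(best, round(fitness(best), 3))
```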
8. A Self-Assembly Portable Mobile Mapping System for Archeological Reconstruction Based on VSLAM-Photogrammetric Algorithm.
- Author
-
Ortiz-Coder P and Sánchez-Ríos A
- Subjects
- Cloud Computing, Equipment Design, Imaging, Three-Dimensional instrumentation, Photogrammetry instrumentation, Software, Spain, Workflow, Algorithms, Archaeology methods, Imaging, Three-Dimensional methods, Photogrammetry methods
- Abstract
Three-dimensional (3D) models are widely used in clinical applications, geosciences, cultural heritage preservation, and engineering; this, together with new emerging needs such as building information modeling (BIM), drives the development of data capture techniques and devices with a low cost and a reduced learning curve that allow non-specialized users to employ them. This paper presents a simple, self-assembly device for 3D point cloud data capture with an estimated base price under €2500; furthermore, a workflow for the calculations is described, including a Visual SLAM-photogrammetric threaded algorithm implemented in C++. Another purpose of this work is to validate the proposed system in BIM working environments. To achieve this, several 3D point clouds were obtained in outdoor tests and the coordinates of 40 targets were measured with the device, with data capture distances ranging between 5 and 20 m. These were then compared to the coordinates of the same targets measured by a total station. The Euclidean average distance errors and root mean square errors (RMSEs) ranged between 12-46 mm and 8-33 mm respectively, depending on the data capture distance (5-20 m). Furthermore, the proposed system was compared with a commonly used photogrammetric methodology based on Agisoft Metashape software. The results demonstrate that the proposed system satisfies (in each case) the tolerances of 'level 1' (51 mm) and 'level 2' (13 mm) for point cloud acquisition in urban design and historic documentation, according to the BIM Guide for 3D Imaging (U.S. General Services Administration).
- Published
- 2019
- Full Text
- View/download PDF
9. Estimation of COVID-19 epidemic curves using genetic programming algorithm.
- Author
-
Anđelić, Nikola, Šegota, Sandi Baressi, Lorencin, Ivan, Mrzljak, Vedran, and Car, Zlatan
- Subjects
HIGH performance computing ,COVID-19 ,CONVALESCENCE ,MACHINE learning ,INFECTIOUS disease transmission ,RESEARCH funding ,STATISTICAL models ,ALGORITHMS - Abstract
This paper investigates the implementation of a Genetic Programming (GP) algorithm on a publicly available COVID-19 data set, in order to obtain mathematical models for estimating confirmed, deceased, and recovered cases and the epidemiology curve for specific countries with a high number of cases, such as China, Italy, Spain, and the USA, as well as on the global scale. The investigation shows that the best mathematical models produced for estimating confirmed and deceased cases achieved R² scores of 0.999, while the models developed for estimating recovered cases achieved an R² score of 0.998. The equations generated for confirmed, deceased, and recovered cases were combined to estimate the epidemiology curve for specific countries and on the global scale. The estimated epidemiology curve for each country obtained from these equations is almost identical to the real data contained in the data set. [ABSTRACT FROM AUTHOR] (See the illustrative sketch after this record.)
- Published
- 2021
- Full Text
- View/download PDF
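One way to reproduce the idea with the open-source gplearn library is sketched below; the paper does not state its GP implementation, and the case counts here are a hypothetical logistic stand-in for the real data set.

```python
# Hedged sketch: symbolic regression with gplearn evolves a closed-form
# expression y = f(day) for cumulative cases (toy data, assumed setup).
import numpy as np
from gplearn.genetic import SymbolicRegressor

days = np.arange(1, 121).reshape(-1, 1)          # day index as sole input
# Hypothetical logistic-like cumulative case counts standing in for data:
cases = 1e5 / (1.0 + np.exp(-0.1 * (days.ravel() - 60)))

gp = SymbolicRegressor(population_size=2000, generations=15,
                       function_set=("add", "sub", "mul", "div"),
                       parsimony_coefficient=0.001, random_state=0)
gp.fit(days, cases)
print(gp._program)                                # the evolved expression

# Goodness of fit on the training data:
r2 = 1 - np.sum((gp.predict(days) - cases) ** 2) / \
         np.sum((cases - cases.mean()) ** 2)
print(f"R^2 = {r2:.3f}")
```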
10. Bidders Recommender for Public Procurement Auctions Using Machine Learning: Data Analysis, Algorithm, and Case Study with Tenders from Spain.
- Author
-
García Rodríguez, Manuel J., Rodríguez Montequín, Vicente, Ortega Fernández, Francisco, and Villanueva Balsera, Joaquín M.
- Subjects
GOVERNMENT purchasing ,MACHINE learning ,ALGORITHMS ,RECOMMENDER systems ,RANDOM forest algorithms ,DATA analysis - Abstract
Recommending the identity of bidders in public procurement auctions (tenders) has a significant impact on many areas of public procurement, but it has not yet been studied in depth. A bidders recommender would be a very beneficial tool because a supplier (company) can search for appropriate tenders and, vice versa, a public procurement agency can automatically discover unknown companies that are suitable for its tender. This paper develops a pioneering algorithm to recommend potential bidders using a machine learning method, particularly a random forest classifier. The bidders recommender is described theoretically, so it can be implemented or adapted to any particular situation. It has been successfully validated with a case study: an actual Spanish tender dataset (free public information) with 102,087 tenders from 2014 to 2020 and a company dataset (non-free public information) with 1,353,213 Spanish companies. Quantitative, graphical, and statistical descriptions of both datasets are presented. The results of the case study were satisfactory: the winning bidding company was within the recommended group of companies in 24% to 38% of the tenders, depending on test conditions and scenarios. [ABSTRACT FROM AUTHOR] (See the illustrative sketch after this record.)
- Published
- 2020
- Full Text
- View/download PDF
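A minimal sketch of the recommender idea follows; the features and the bid/no-bid target are invented for illustration and do not reflect the published feature set.

```python
# Sketch: a random forest scores (tender, company) pairs, and the top-N
# companies by predicted probability form the recommendation group.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(7)
# Hypothetical pair features: tender budget, CPV-code match, company size,
# geographic distance, past participation count.
X = rng.random((5000, 5))
y = (X[:, 1] + 0.3 * X[:, 4] + 0.2 * rng.random(5000) > 0.9).astype(int)

rf = RandomForestClassifier(n_estimators=300, random_state=0).fit(X, y)

# Recommend: rank candidate companies for one tender by bid probability.
candidates = rng.random((1000, 5))
proba = rf.predict_proba(candidates)[:, 1]
top10 = np.argsort(proba)[::-1][:10]
print("recommended company indices:", top10)
```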
11. The Empirically Corrected EP-TOMS Total Ozone Data Against Brewer Measurements at El Arenosillo (Southwestern Spain).
- Author
-
Antón, Manuel, Vilaplana, José Manuel, Kroon, Mark, Serrano, Antonio, Parias, Marta, Cancillo, María Luisa, and de la Morena, Benito A.
- Subjects
OZONE ,SPECTROMETERS ,SPECTRORADIOMETER ,SATELLITE geodesy - Abstract
This paper focuses on the validation of the empirically corrected total ozone column (TOC) data provided by the Earth Probe Total Ozone Mapping Spectrometer (EP-TOMS) using ground-based measurements recorded by a well-calibrated Brewer spectroradiometer located at El Arenosillo (Spain). In addition, satellite TOC observations derived from the Ozone Monitoring Instrument (OMI) with the TOMS algorithm are also used. The agreement between EP-TOMS TOC data and Brewer measurements is excellent (R² ~ 0.92), even for the period 2000-2005 when higher EP-TOMS instrument degradation occurred. Despite their low magnitude, the EP-TOMS-Brewer relative differences depend on the solar zenith angle (SZA), showing a clear seasonal cycle with an amplitude between ±2% and ±4%. Conversely, OMI-Brewer relative differences show a constant negative value of around -1% with no significant dependence on SZA. No significant dependence of the ground-based to satellite-based differences on the EP-TOMS scene or the OMI cross-track position is observed for either satellite retrieval algorithm. Finally, the TOC estimates from the two satellite instruments have also been compared, showing good agreement (R² ~ 0.88). Overall, we conclude that the empirical correction of the EP-TOMS data record provides a reprocessed data set of high quality. However, EP-TOMS data after the year 2000 should not be used in calculations of global-ozone trends due to remaining errors in the data set and because it is no longer an independent data set. [ABSTRACT FROM AUTHOR]
- Published
- 2010
- Full Text
- View/download PDF
12. Control Algorithm for Coordinated Reactive Power Compensation in a Wind Park.
- Author
-
Díaz-Dorado, E., Carrillo, C., and Cidrás, J.
- Subjects
WIND power plants ,WIND power ,ALGORITHMS ,POWER resources ,WIND turbines ,INDUCTION generators ,CAPACITOR banks ,DYNAMIC programming ,SIMULATION methods & models ,REACTIVE power - Abstract
The penetration level of wind energy is continuously growing, and it is especially relevant in European countries such as Denmark, Germany, and Spain. For this reason, grid codes in different countries have recently been revised, or are now under revision, in order to integrate this energy into the network while taking into account security of supply. This paper is concerned with reactive compensation, one aspect usually included in these codes. A great number of installed wind parks are formed by fixed-speed wind turbines equipped with induction generators. The typical scheme for reactive compensation in this kind of wind park is based on capacitor banks locally controlled in each machine; this configuration makes it very difficult to follow the requirements of the new grid codes. To overcome this problem, a configuration with a central controller that coordinates the actuation of all the capacitor steps in the wind park is proposed in this paper. A central controller algorithm based on dynamic programming is presented and evaluated by means of simulation. The proposed scheme has been installed at the Sotavento Experimental Wind Park (Spain) and is currently being tested. [ABSTRACT FROM AUTHOR]
- Published
- 2008
- Full Text
- View/download PDF
13. A mobile decision support system for red eye diseases diagnosis: experience with medical students.
- Author
-
López, Marta, López, Miguel, Torre Díez, Isabel, Jimeno, José, and López-Coronado, Miguel
- Subjects
EYE diseases ,ALGORITHMS ,DECISION support systems ,MEDICAL databases ,INFORMATION storage & retrieval systems ,MEDICAL students ,RESEARCH funding ,SURVEYS ,MOBILE apps ,DIAGNOSIS - Abstract
Good primary health care is the basis of a better healthcare system, and a timely, well-informed decision by the primary care physician can have a huge impact. Decision Support Systems (DSS) arose to ease the diagnostic task, offering counsel rather than merely refreshing medical knowledge, in a profession where one keeps learning every day. Implementing these systems for diseases that are a frequent cause of doctor visits, such as ophthalmologic pathologies, which directly affect quality of life, is therefore all the more important. This paper develops OphthalDSS, a completely new mobile DSS for the diagnosis of red eye diseases. OphthalDSS serves both as a study guide for medical students and as a clinical decision support system for primary care professionals. Another important goal of this paper is to report the user experience after OphthalDSS was used by medical students at the University of Valladolid. To this end, a decision algorithm was developed and implemented in an Android mobile application, and the Quality of Experience (QoE) was evaluated by the students through a short questionnaire. The app implementing the OphthalDSS algorithm is capable of diagnosing more than 30 diseases of the eye's anterior segment. A total of 67 medical students evaluated the QoE. The students found the disease information presented very valuable and the appearance adequate; the app was always available, and they always found what they were looking for. On the other hand, the students did not feel that their quality of life improved by using the app and thought they could do the same without it. OphthalDSS is easy to use, capable of diagnosing more than 30 ocular diseases, and serves as both a DSS and an educational tool. [ABSTRACT FROM AUTHOR]
- Published
- 2016
- Full Text
- View/download PDF
14. Breast Density Analysis Using an Automatic Density Segmentation Algorithm.
- Author
-
Oliver, Arnau, Tortajada, Meritxell, Lladó, Xavier, Freixenet, Jordi, Ganau, Sergi, Tortajada, Lidia, Vilagran, Mariona, Sentís, Melcior, and Martí, Robert
- Subjects
BREAST ,ALGORITHMS ,MAMMOGRAMS ,BREAST tumors ,DIAGNOSTIC imaging ,LONGITUDINAL method ,COMPUTERS in medicine ,PROBABILITY theory ,REGRESSION analysis ,RESEARCH funding ,T-test (Statistics) ,EVALUATION research ,DESCRIPTIVE statistics ,ANATOMY - Abstract
Breast density is a strong risk factor for breast cancer. In this paper, we present an automated approach for breast density segmentation in mammographic images based on supervised pixel-based classification using textural and morphological features. The objective of the paper is not only to show the feasibility of an automatic algorithm for breast density segmentation but also to prove its potential application to the study of breast density evolution in longitudinal studies. The database used contains three complete screening examinations, acquired 2 years apart, of 130 different patients. The approach was validated by comparing manual expert annotations with automatically obtained estimations. Transversal analysis of breast density in craniocaudal (CC) and mediolateral oblique (MLO) views of both breasts acquired in the same study showed a correlation coefficient of ρ = 0.96 between the mammographic density percentages of the left and right breasts, whereas a comparison of the two mammographic views showed a correlation of ρ = 0.95. A longitudinal study of breast density confirmed the trend that the dense tissue percentage decreases over time, although the rate of decrease depends on the initial amount of breast density. [ABSTRACT FROM AUTHOR]
- Published
- 2015
- Full Text
- View/download PDF
15. ALGORITHMIC (IN)VISIBILITY TACTICS AMONG IMMIGRANT TIKTOKERS.
- Author
-
JARAMILLO-DENT, DANIELA
- Subjects
SCIENTIFIC literature ,IMMIGRANTS ,SOCIAL media ,DIGITAL video - Abstract
It is well established in the scientific literature that immigrants are excluded from their own stories, which are often instrumentalized to fulfill specific communicative, othering intentions. In this sense, migrant agency and voice are in many cases absent from narratives related to their life experiences and subject to various symbolic, digital, and material borders. Moreover, although social media has been recognized as a prime space for self-representation across different segments of society, immigrants are often excluded from these spaces due to the risks that sharing certain information publicly represents to them. In this article I draw from a 16-month digital ethnography and an inductive, multimodal content analysis of videos created by 53 Latin American immigrant tiktokers in the United States and Spain. This enables the conceptualization of their algorithmic (in)visibility practices, which refer to the set of strategies deployed by immigrant content creators on social media (and possibly other marginalized and vulnerable populations) to negotiate the conspicuousness of their controversial content with the aim of avoiding its deletion from the platform. The findings unveil three exemplary algorithmic (in)visibility practices: content reuse and re-upload, vernacular visibility, and partial deplatforming. I find that these strategies shift between collective and individual approaches to achieve selective visibility and concealed conspicuousness within algorithmic moderation systems. [ABSTRACT FROM AUTHOR]
- Published
- 2022
16. Application of the GoRoSo Feedforward Algorithm to Compute the Gate Trajectories for a Quick Canal Closing in the Case of an Emergency.
- Author
-
Soler, Joan, Gómez, Manuel, Rodellar, José, and Gamazo, Pablo
- Subjects
CANALS ,RIVERS ,OPEN-channel flow ,QUADRATIC programming ,FEEDFORWARD control systems - Abstract
The canal delivery system in the Left Hemidelta area of the Ebro River in Spain consists of a tree-shaped network of open canals. The overall system can be quickly isolated in the case of an emergency by closing the upstream pool. Transients in which the initial state is hydraulically far from the final state are difficult to handle and, to protect the canal lining, cannot be carried out in a single gate movement; they therefore have to be as smooth as possible. GoRoSo is a feedforward control algorithm for irrigation canals based on sequential quadratic programming. With this tool, it is possible to calculate gate trajectories that smoothly carry the canal from the initial state to the final state while keeping the water depth constant at checkpoints. The paper shows the efficient implementation of GoRoSo in both the closing and opening operations of the canal delivery system. [ABSTRACT FROM AUTHOR]
- Published
- 2013
- Full Text
- View/download PDF
17. MEAN SHIFT: A NON-PARAMETRIC ALGORITHM FOR THE SEGMENTATION OF ANOMALIES IN GEOPHYSICAL IMAGES OBTAINED FROM MAGNETIC PROSPECTION DATA.
- Author
-
SALGUERO, F., PRAT, F., MORENO, F., and ROMERO, S.
- Subjects
GEOPHYSICS in archaeology ,ARCHAEOLOGY methodology ,ALGORITHMS ,IMAGE processing ,MAGNETOMETRY in archaeology - Abstract
This paper studies the applicability of the Mean Shift algorithm as support for interpreting geophysical images produced, on this occasion, from magnetic prospection data. The data come from a magnetic survey carried out in Gilena (Seville province, Spain) by the La Rábida Archaeophysics Group. The algorithm's applicability is illustrated by comparison with, on the one hand, some reduction-to-pole algorithms and, on the other, the well-known k-means algorithm. Finally, the paper shows the results obtained by applying the Mean Shift algorithm as an alternative method for 'unsupervised clustering' of the anomalies that appear in images obtained from geophysical data, where a priori knowledge of the number of classes is difficult or impossible to obtain. [ABSTRACT FROM AUTHOR] (See the illustrative sketch after this record.)
- Published
- 2011
- Full Text
- View/download PDF
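The sketch below shows Mean Shift, via scikit-learn, clustering synthetic magnetic-anomaly pixels without fixing the number of classes in advance; the feature construction (position plus up-weighted intensity) is an assumption, not the paper's exact setup.

```python
# Illustrative sketch: Mean Shift clustering of magnetic-anomaly pixels
# (intensity plus position), with the number of classes found by the
# algorithm rather than specified a priori.
import numpy as np
from sklearn.cluster import MeanShift, estimate_bandwidth

rng = np.random.default_rng(3)
img = rng.normal(0, 1, (60, 60))                 # background field (nT)
img[20:30, 20:30] += 6.0                         # buried-structure anomaly

# Feature vector per pixel: (row, col, intensity), intensity up-weighted.
rows, cols = np.indices(img.shape)
X = np.column_stack([rows.ravel(), cols.ravel(), 10 * img.ravel()])

bw = estimate_bandwidth(X, quantile=0.2, n_samples=500, random_state=0)
labels = MeanShift(bandwidth=bw, bin_seeding=True).fit_predict(X)
print("classes found:", len(set(labels)))        # not specified in advance
```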
18. KL-optimal experimental design for discriminating between two growth models applied to a beef farm.
- Author
-
Campos-Barreiro S and López-Fidalgo J
- Subjects
- Animals, Computer Simulation, Red Meat, Reproducibility of Results, Research Design, Sensitivity and Specificity, Spain, Aging physiology, Agriculture methods, Algorithms, Body Weight physiology, Cattle growth & development, Models, Biological
- Abstract
The body mass growth of organisms is usually represented by ontogenetic growth models, which describe the dependence of body mass on time. The paper is concerned with the problem of finding an optimal experimental design for discriminating between two competing mass growth models applied to a beef farm. T-optimality was first introduced for discrimination between models, but in this paper KL-optimality, based on the Kullback-Leibler distance, is used to deal with correlated observations since, in this case, observations on a particular animal are not independent. (The criterion is sketched after this record.)
- Published
- 2016
- Full Text
- View/download PDF
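For reference, a sketch of the KL-optimality criterion as it is usually stated in the design literature (notation assumed; not quoted from the paper). The "true" model f_1 is fixed, the rival model f_2 carries parameters, and a design is KL-optimal if it maximises the worst-case KL distance between the two models over the design region.

```latex
% Sketch of the KL-optimality criterion (assumed notation):
\[
  I_{2,1}(\xi) \;=\; \min_{\theta_2 \in \Theta_2}
  \int_{\mathcal{X}} \mathrm{KL}\bigl(f_1(y \mid x),\, f_2(y \mid x, \theta_2)\bigr)\, \xi(\mathrm{d}x),
  \qquad
  \mathrm{KL}(f, g) \;=\; \int f(y)\, \log \frac{f(y)}{g(y)} \,\mathrm{d}y .
\]
% A design $\xi^*$ is KL-optimal if it maximises $I_{2,1}(\xi)$; with
% correlated observations the KL distance is taken between the joint
% distributions of the correlated responses.
```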
19. Multiple time scales in modeling the incidence of infections acquired in intensive care units.
- Author
-
Wolkewitz, Martin, Cooper, Ben S., Palomar-Martinez, Mercedes, Alvarez-Lerma, Francisco, Olaechea-Astigarraga, Pedro, Barnett, Adrian G., and Schumacher, Martin
- Subjects
INTENSIVE care units ,INFECTION risk factors ,NOSOCOMIAL infections ,CRITICAL care medicine ,HOSPITAL admission & discharge ,DISEASE prevalence ,METHICILLIN-resistant staphylococcus aureus ,ALGORITHMS ,COMPARATIVE studies ,CROSS infection ,LENGTH of stay in hospitals ,MATHEMATICAL models ,RESEARCH methodology ,MEDICAL cooperation ,RESEARCH ,RESEARCH funding ,RISK assessment ,STAPHYLOCOCCAL diseases ,TIME ,THEORY ,EVALUATION research ,DISEASE incidence ,PROPORTIONAL hazards models ,PHYSIOLOGY - Abstract
Background: When patients are admitted to an intensive care unit (ICU), their risk of acquiring an infection depends strongly on their length of stay at risk in the ICU. In addition, the risk of infection is likely to vary over calendar time as a result of fluctuations in the prevalence of the pathogen on the ward. Hence the risk of infection is expected to depend on two time scales (time in ICU and calendar time) as well as on competing events (discharge or death) and their spatial location. The purpose of this paper is to develop and apply appropriate statistical models for the risk of ICU-acquired infection accounting for multiple time scales, competing risks and the spatial clustering of the data. Methods: A multi-center database from a Spanish surveillance network was used to study the occurrence of infection due to Methicillin-resistant Staphylococcus aureus (MRSA). The analysis included 84,843 patient admissions between January 2006 and December 2011 from 81 ICUs. Stratified Cox models were used to study multiple time scales while accounting for the spatial clustering of the data (patients within ICUs) and for death or discharge as competing events for MRSA infection. Results: Both time scales, time in ICU and calendar time, are highly associated with the MRSA hazard rate and cumulative risk. When using only one basic time scale, the interpretation and magnitude of several patient-individual risk factors differed. Risk factors concerning the severity of illness were more pronounced when using only calendar time. These differences disappeared when using both time scales simultaneously. Conclusions: The time-dependent dynamics of infections is complex and should be studied with models allowing for multiple time scales. For patient-individual risk factors we recommend stratified Cox regression models for competing events with ICU time as the basic time scale and calendar time as a covariate. The inclusion of calendar time and stratification by ICU make it possible to indirectly account for ICU-level effects such as local outbreaks or prevention interventions. [ABSTRACT FROM AUTHOR] (A model sketch follows this record.)
- Published
- 2016
- Full Text
- View/download PDF
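A sketch of the recommended model using the lifelines library follows; the column names and data are hypothetical stand-ins, since the study used its own multi-center data set.

```python
# Sketch: stratified Cox model with ICU time as the basic time scale,
# calendar time as a covariate, and stratification by ICU (one baseline
# hazard per unit); discharge/death are treated as censoring in this
# cause-specific fit for MRSA infection.
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(0)
n = 300
df = pd.DataFrame({
    "icu_days":       rng.exponential(10, n).round(1) + 0.5,  # time in ICU
    "mrsa_infection": rng.integers(0, 2, n),                  # event indicator
    "calendar_month": rng.integers(1, 13, n),                 # admission month
    "severity_score": rng.integers(1, 6, n),
    "icu_id":         rng.choice(["A", "B", "C"], n),
})

cph = CoxPHFitter()
cph.fit(df, duration_col="icu_days", event_col="mrsa_infection",
        strata=["icu_id"])                    # absorbs ICU-level effects
cph.print_summary()
```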
20. The Use of Algorithms within Administrative Procedures: National Experiences compared through the Lens of European Law.
- Author
-
Pressi, Matteo
- Subjects
ADMINISTRATIVE procedure ,ALGORITHMS ,EUROPEAN law - Abstract
This paper aims to analyze, from a comparative perspective, the main elements of the rules on the use of algorithms within administrative procedures developed by the national lawmakers of France, Spain and Italy. Furthermore, the article verifies, on the basis of the principle of good administration, the existence of a minimum core of guarantees afforded to the citizen who is the recipient of an automated decision. [ABSTRACT FROM AUTHOR]
- Published
- 2021
- Full Text
- View/download PDF
21. Circulation types and winter precipitation in Spain.
- Author
-
Casado, M. J. and Pastor, M. A.
- Subjects
WINTER ,ATMOSPHERIC circulation ,METEOROLOGICAL precipitation ,ALGORITHMS - Abstract
This paper concentrates on the evaluation of circulation type classifications (CTCs) from the European Action COST733 catalogue version 2.0 in terms of their ability to capture winter precipitation, expressed as a percentage, over Spanish Iberia and the Balearic Islands. The explained variation, the pseudo-F statistic and the Brier skill score are used to quantify the explanatory power of the circulation classifications. As secondary aims, the impact of using different numbers of circulation types, additional types of variables and 4-day sequences in the generation of classifications is analysed. Although no optimal method has been found, the results suggest that CTCs based on optimization algorithms generally perform better than those based on other algorithms (i.e. leader algorithms). Distinct variations in skill exist not only among classifications from different groups of basic methods but also between classifications from the same method group, with the optimum random centroid method standing out. Results are very dependent on the metric: for the explained variation and the Brier skill score, the larger the number of circulation types, the better the performance, contrary to the behaviour of the pseudo-F statistic. The inclusion of 500 hPa vorticity in the generation of classifications improves results, while a general deterioration is observed when considering 4-day sequences. These results are only valid for the selected season and cannot be transferred to other locations and seasons. [ABSTRACT FROM AUTHOR] (A Brier-skill-score sketch follows this record.)
- Published
- 2016
- Full Text
- View/download PDF
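For reference, the Brier skill score used above can be computed as follows (a generic sketch with toy data, not the COST733 evaluation code). BSS = 1 - BS/BS_ref, where BS is the Brier score of the forecast and BS_ref that of a reference such as climatology; values above 0 indicate skill over the reference.

```python
# Generic Brier skill score sketch for probabilistic event forecasts.
import numpy as np

def brier_score(p_forecast, outcomes):
    return np.mean((p_forecast - outcomes) ** 2)

def brier_skill_score(p_forecast, outcomes):
    climatology = np.full_like(p_forecast, outcomes.mean())  # reference
    return 1.0 - brier_score(p_forecast, outcomes) / \
                 brier_score(climatology, outcomes)

# Toy example: did daily winter precipitation exceed a threshold?
outcomes = np.array([1, 0, 0, 1, 1, 0, 0, 0, 1, 0], dtype=float)
p_forecast = np.array([0.8, 0.2, 0.1, 0.7, 0.6, 0.3, 0.2, 0.1, 0.9, 0.4])
print(round(brier_skill_score(p_forecast, outcomes), 3))
```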
22. Improving hospital decision making with interpretable associations over datacubes.
- Author
-
Molina C, Prados-Suarez B, Prados de Reyes M, and Peña Yañez C
- Subjects
- Spain, Algorithms, Data Mining standards, Decision Support Systems, Clinical standards, Electronic Health Records standards, Patient-Centered Care standards, Quality Improvement standards
- Abstract
In this paper we propose a new Classification based on Association Rules (CAR) algorithm that improves the interpretability of the results, works on real data from electronic health records (EHRs), and allows the study of the patient as a whole. It enables tasks such as discovering relationships between diseases, or offering several alternative, reasoned diagnoses in cases where a patient has several diseases that, analysed separately, could lead to a mistaken diagnosis. We aim to achieve several goals: to discover hidden relationships; to improve the interpretability and reduce the complexity of the result; to obtain more reliable diagnoses (alternative reasoned diagnoses and higher robustness to noisy rules); and to improve the quality of the classifier by avoiding the usual over-fitting problem. To this end, we define and exploit hierarchies over datacube dimensions, and change the way the association rules are obtained and the way they are evaluated during classification. To prove the utility of our proposal we have applied it to an example of cancer discrimination.
- Published
- 2014
23. Day- and night-time aerosol optical depth implementation in CÆLIS.
- Author
-
González, Ramiro, Toledano, Carlos, Román, Roberto, Fuertes, David, Berjón, Alberto, Mateos, David, Guirado-Fuentes, Carmen, Velasco-Merino, Cristian, Antuña-Sanchez, Juan Carlos, Calle, Abel, Cachorro, Victoria E., and de Frutos, Ángel M.
- Subjects
OPTICAL depth (Astrophysics) ,AEROSOLS ,OBSERVATIONS of the Moon ,ALGORITHMS ,QUALITY control - Abstract
The University of Valladolid (UVa, Spain) has managed a calibration center of the AErosol RObotic NETwork (AERONET) since 2006. The CÆLIS software tool, developed by UVa, was created to manage the data generated by the AERONET photometers for calibration, quality control and data processing purposes. This paper exploits the potential of this tool to obtain products such as the aerosol optical depth (AOD) and Angstrom exponent (AE), which are of high interest for atmospheric and climate studies, as well as to enhance the quality control of the instruments and data managed by CÆLIS. The AOD and cloud screening algorithms implemented in CÆLIS, both based on AERONET version 3, are described in detail. The obtained products are compared with the AERONET database. In general, the differences in daytime AOD between CÆLIS and AERONET are far below the expected uncertainty of the instrument, with mean differences ranging between −1.3×10⁻⁴ at 870 nm and 6.2×10⁻⁴ at 380 nm. The standard deviations of the differences range from 2.8×10⁻⁴ at 675 nm to 8.1×10⁻⁴ at 340 nm. The AOD and AE at night-time calculated by CÆLIS from Moon observations are also presented, showing good continuity between day and night-time for different locations, aerosol loads and moon phase angles. Regarding cloud screening, around 99.9% of the observations classified as cloud-free by CÆLIS are also assumed cloud-free by AERONET; this percentage is similar for the cases considered cloud-contaminated by both databases. The obtained results point out the capability of CÆLIS as a processing system. The AOD algorithm provides the opportunity to use this tool with other instrument types and to retrieve other aerosol products in the future. [ABSTRACT FROM AUTHOR]
- Published
- 2020
- Full Text
- View/download PDF
24. Lowering the 'floor' of the SF-6D scoring algorithm using a lottery equivalent method.
- Author
-
Abellán Perpiñán JM, Sánchez Martínez FI, Martínez Pérez JE, and Méndez I
- Subjects
- Adult, Female, Humans, Male, Middle Aged, Spain, Algorithms, Health Status, Quality of Life, Surveys and Questionnaires
- Abstract
This paper presents a new scoring algorithm for the SF-6D, one of the most popular preference-based health status measures. Previous SF-6D value sets have a minimum (a floor) which is substantially higher than the lowest value generated by the EQ-5D model. Our algorithm expands the range of SF-6D utility scores in such a way that the floor is significantly lowered. We obtain the wider range through the use of a lottery equivalent method, through which preferences from a representative sample of the Spanish general population are elicited. (Copyright © 2011 John Wiley & Sons, Ltd.)
- Published
- 2012
- Full Text
- View/download PDF
25. Adaptive thresholding algorithm based on SAR images and wind data to segment oil spills along the northwest coast of the Iberian Peninsula.
- Author
-
Mera D, Cotos JM, Varela-Pet J, and Garcia-Pineda O
- Subjects
- Environmental Monitoring instrumentation, Models, Chemical, Petroleum Pollution statistics & numerical data, Remote Sensing Technology, Spacecraft, Spain, Wind, Algorithms, Environmental Monitoring methods, Petroleum Pollution analysis, Radar, Water Pollutants, Chemical analysis
- Abstract
Satellite Synthetic Aperture Radar (SAR) has been established as a useful tool for detecting hydrocarbon spillage on the ocean's surface. Several surveillance applications have been developed based on this technology. Environmental variables such as wind speed should be taken into account for better SAR image segmentation. This paper presents an adaptive thresholding algorithm for detecting oil spills based on SAR data and a wind field estimation, as well as its implementation as part of a functional prototype. The algorithm was adapted to an important shipping route off the Galician coast (northwest Iberian Peninsula) and was developed on the basis of confirmed oil spills. Image testing revealed 99.93% pixel labelling accuracy. By taking advantage of multi-core processor architecture, the prototype was optimized to achieve a nearly 30% improvement in processing time. (Copyright © 2012 Elsevier Ltd. All rights reserved.) (A conceptual sketch follows this record.)
- Published
- 2012
- Full Text
- View/download PDF
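A conceptual sketch of wind-adaptive thresholding follows; the linear threshold model is hypothetical and only illustrates the idea of adapting the darkness threshold to wind speed, not the published algorithm.

```python
# Conceptual sketch: dark SAR pixels are flagged as candidate oil, with
# the darkness threshold adapted to the co-located wind speed
# (hypothetical linear model for illustration only).
import numpy as np

def segment_oil(sar_db, wind_ms, base_offset_db=-4.0, k_db=0.3):
    """sar_db: backscatter image (dB); wind_ms: co-located wind speed (m/s).
    A pixel is an oil candidate if it is darker than the mean backscatter
    by an offset that shrinks as wind speed increases."""
    threshold = sar_db.mean() + base_offset_db + k_db * (wind_ms - 5.0)
    return sar_db < threshold

rng = np.random.default_rng(5)
sar = rng.normal(-12, 2, (100, 100))       # open-water backscatter, dB
sar[40:60, 40:60] -= 8                     # dampened slick region
mask = segment_oil(sar, wind_ms=7.0)
print("flagged pixels:", int(mask.sum()))
```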
26. Deep learning ensembles for accurate fog-related low-visibility events forecasting.
- Author
-
Peláez-Rodríguez, C., Pérez-Aracil, J., de Lopez-Diz, A., Casanova-Mateo, C., Fister, D., Jiménez-Fernández, S., and Salcedo-Sanz, S.
- Subjects
DEEP learning ,MACHINE learning ,FORECASTING ,ELITISM ,ALGORITHMS - Abstract
In this paper we propose and discuss different Deep Learning-based ensemble algorithms for the prediction of fog-related low-visibility events. Specifically, seven different Deep Learning (DL) architectures are considered, from which multiple individual learners are generated. Hyperparameters of the models, including parameters concerning data preprocessing, model architecture and the training procedure, are randomly selected for each model within a pre-defined discrete range. Also, every model is trained on slightly different, randomly sampled data, ensuring that every model introduces variety into the ensemble. Then, three different information fusion techniques are employed to build the ensemble models. The influence of the filtering process and the elitism level (the percentage of the individual models entering the ensemble) is also assessed. The performance of the proposed methodology has been tested on two real problems of low-visibility event prediction due to orographic and radiation fog in the north of Spain. Comparison with different Machine Learning methods, alternative DL algorithms and meteorological methods shows the good performance of the proposed deep learning ensembles on this problem. [ABSTRACT FROM AUTHOR] (A fusion sketch follows this record.)
- Published
- 2023
- Full Text
- View/download PDF
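A minimal sketch of the fusion step follows; the three fusion rules and the elitism filter shown are generic assumptions, since the paper's exact fusion techniques are not reproduced here.

```python
# Sketch: individual learners' probabilistic outputs are combined by
# simple averaging, skill-weighted averaging, and majority vote, after
# an elitism filter keeps only the best fraction of the models.
import numpy as np

def fuse(member_probs, val_scores, elitism=0.5):
    """member_probs: (n_models, n_samples) predicted P(low-visibility).
    val_scores: validation skill per model, used to filter and weight."""
    order = np.argsort(val_scores)[::-1]
    keep = order[: max(1, int(len(order) * elitism))]   # elitism filter
    probs = member_probs[keep]
    weights = val_scores[keep] / val_scores[keep].sum()
    return {
        "mean":     probs.mean(axis=0),
        "weighted": (weights[:, None] * probs).sum(axis=0),
        "vote":     ((probs > 0.5).mean(axis=0) > 0.5).astype(float),
    }

rng = np.random.default_rng(2)
member_probs = rng.random((7, 5))            # 7 learners, 5 forecasts
val_scores = rng.random(7)
print(fuse(member_probs, val_scores, elitism=0.4))
```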
27. A function for quality evaluation of retinal vessel segmentations.
- Author
-
Gegúndez-Arias ME, Aquino A, Bravo JM, and Marín D
- Subjects
- Humans, Image Enhancement methods, Image Enhancement standards, Image Interpretation, Computer-Assisted standards, Imaging, Three-Dimensional standards, Observer Variation, Pattern Recognition, Automated standards, Quality Assurance, Health Care methods, Reproducibility of Results, Retinoscopy standards, Sensitivity and Specificity, Spain, Algorithms, Image Interpretation, Computer-Assisted methods, Imaging, Three-Dimensional methods, Pattern Recognition, Automated methods, Retinal Vessels anatomy & histology, Retinoscopy methods
- Abstract
Retinal blood vessel assessment plays an important role in the diagnosis of ophthalmic pathologies. The use of digital images for this purpose enables the application of a computerized approach and has fostered the development of multiple methods for automated vascular tree segmentation. Metrics based on contingency tables for binary classification have been widely used for evaluating the performance of these algorithms. Metrics of this family are based on measuring a success or failure rate in the detected pixels, obtained by means of pixel-to-pixel comparison between the automated segmentation and a manually labeled reference image. Therefore, vessel pixels are not considered as part of a vascular structure with specific features. This paper contributes a function for the evaluation of global quality in retinal vessel segmentations. This function is based on the characterization of vascular structures as connected segments with measurable area and length. Thus, its design is meant to be sensitive to anatomical vascularity features. Comparison of results between the proposed function and other general quality evaluation functions shows that this proposal has a high degree of agreement with human quality perception. Therefore, it can be used to enhance quality evaluation in retinal vessel segmentations, supplementing the existing functions. On the other hand, from a general point of view, the applied concept of measuring descriptive properties may be used to design specialized functions aimed at segmentation quality evaluation in other complex structures.
- Published
- 2012
- Full Text
- View/download PDF
28. Measuring lexical similarity methods for textual mapping in nursing diagnoses in Spanish and SNOMED-CT.
- Author
-
Cruanes J, Romá-Ferri MT, and Lloret E
- Subjects
- Semantics, Spain, United States, Algorithms, Artificial Intelligence, Natural Language Processing, Nursing Diagnosis, Pattern Recognition, Automated methods, Systematized Nomenclature of Medicine, Translating
- Abstract
One of the current problems in the health domain is the reuse and sharing of clinical information between different professionals, as records are written in natural language using specific terminologies. To overcome this issue it is necessary to use a common terminology, like SNOMED-CT, allowing information reuse that offers health professionals the quickest access to quality information. In order to use this terminology, all the other terminologies have to be mapped to it. One solution for performing that mapping is a lexical similarity approach. In this paper we analyze the appropriateness of 15 lexical similarity methods for mapping a set of NANDA-I labels to a set of SNOMED-CT descriptions in Spanish. Our aim is to establish how to choose the best algorithm in this domain from the recall and precision points of view. After running six different tests, we established that the three best algorithms were those that maximize recall, because they always return the best solution. (A similarity sketch follows this record.)
- Published
- 2012
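Two of the simpler lexical-similarity families such a comparison might include are sketched below (the paper's 15 methods are not reproduced); the example label and candidate descriptions are hypothetical.

```python
# Sketch of two simple lexical-similarity measures: a normalized
# edit-distance ratio (difflib) and token-level Jaccard overlap.
from difflib import SequenceMatcher

def ratio_similarity(a: str, b: str) -> float:
    return SequenceMatcher(None, a.lower(), b.lower()).ratio()

def jaccard_similarity(a: str, b: str) -> float:
    ta, tb = set(a.lower().split()), set(b.lower().split())
    return len(ta & tb) / len(ta | tb) if ta | tb else 0.0

def best_mapping(label: str, candidates: list[str]) -> str:
    """Map one source label to the highest-scoring target description."""
    return max(candidates, key=lambda c: ratio_similarity(label, c))

# Hypothetical Spanish nursing label vs. candidate descriptions:
label = "deterioro de la movilidad física"
candidates = ["movilidad física deteriorada", "dolor agudo",
              "riesgo de caídas"]
print(best_mapping(label, candidates))
print(jaccard_similarity(label, candidates[0]))
```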
29. Diet modelling: How it can inform the development of dietary recommendations and public health policy.
- Author
-
Buttriss, J. L., Briend, A., Darmon, N., Ferguson, E. L., Maillot, M., and Lluch, A.
- Subjects
NUTRITION ,PUBLIC health ,MATHEMATICAL models ,ALGORITHMS ,CONFERENCES & conventions ,INGESTION ,HEALTH policy ,NUTRITION policy ,NUTRITIONAL requirements ,OBESITY ,POPULATION density ,THEORY ,FOOD security ,SOCIETIES - Abstract
With the global population expected to increase to 9 billion by 2050, concerns about food security in relation to climate change, and increasing prosperity in many parts of the world creating a desire for a less monotonous diet, efficient use of resources such as food becomes ever more important. While the prevalence of obesity is a cause for concern in many parts of the world, many people still go to bed hungry, and in many communities obesity co-exists with poor diet quality. The result is a series of complex and challenging nutrition problems, such as access to nutritionally adequate and affordable diets and the development of dietary recommendations. Diet modelling is a useful tool to help identify solutions to such complex questions, and this paper summarises a session on this topic at the International Congress of Nutrition that took place in September 2013. [ABSTRACT FROM AUTHOR]
- Published
- 2014
- Full Text
- View/download PDF
30. Analyzing the commercial activities of a street network by ranking their nodes: a case study in Murcia, Spain.
- Author
-
Agryzkov, Taras, Oliver, Jose L., Tortosa, Leandro, and Vicent, Jose F.
- Subjects
GEOGRAPHIC information systems ,ALGORITHMS ,INFORMATION & communication technologies ,CARTOGRAPHIC materials ,INFORMATION storage & retrieval systems ,CARTOGRAPHY - Abstract
Urban researchers and planners are often interested in understanding how economic activities are distributed in urban regions, what forces influence their spatial pattern, and how urban structure and functions are mutually dependent. In this paper, we show how an algorithm for ranking the nodes in a network can be used to understand and visualize certain commercial activities of a city. The first part of the method consists of collecting real information about different types of commercial activities at each location in the urban network of the city of Murcia, Spain. Four clearly differentiated commercial activities are studied (restaurants and bars, shops, banks, and supermarkets or department stores), although others could equally be studied. The information collected is then quantified by means of a data matrix, which is used as the basis for the implementation of a PageRank algorithm that produces a ranking of all the nodes in the network according to their significance within it. Finally, we visualize the resulting classification using a colour scale that helps us represent the business network. [ABSTRACT FROM PUBLISHER] (See the illustrative sketch after this record.)
- Published
- 2014
- Full Text
- View/download PDF
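The ranking step can be sketched with networkx as follows; this is an assumed stand-in for the authors' own PageRank variant, with premises counts fed in as the personalization vector so the random walk is biased toward commercially active intersections.

```python
# Sketch: activity-biased PageRank over a toy street network.
import networkx as nx

G = nx.Graph()                                   # toy street network
G.add_edges_from([(1, 2), (2, 3), (3, 4), (4, 1), (2, 4), (4, 5)])

# Hypothetical data matrix: premises counted at each node
activity = {1: 0, 2: 5, 3: 1, 4: 8, 5: 2}        # e.g. restaurants and bars
total = sum(activity.values())
personalization = {n: c / total for n, c in activity.items()}

rank = nx.pagerank(G, alpha=0.85, personalization=personalization)
for node, score in sorted(rank.items(), key=lambda kv: -kv[1]):
    print(node, round(score, 3))                 # colour scale in practice
```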
31. Predictive optimal control of sewer networks using CORAL tool: application to Riera Blanca catchment in Barcelona.
- Author
-
Puig V, Cembrano G, Romera J, Quevedo J, Aznar B, Ramón G, and Cabot J
- Subjects
- Calibration, Models, Theoretical, Spain, Water Purification, Algorithms, Drainage, Sanitary, Software
- Abstract
This paper deals with the global control of the Riera Blanca catchment in the Barcelona sewer network using a predictive optimal control approach. The catchment has been modelled using a conceptual modelling approach based on decomposing the catchment into subcatchments and representing them as virtual tanks. This conceptual modelling approach allows real-time model calibration and control of the sewer network. The global control problem of the Riera Blanca catchment is solved using an optimal/predictive control algorithm. To implement the predictive optimal control of the Riera Blanca catchment, a software tool named CORAL is used. The on-line control is simulated by interfacing CORAL with a high-fidelity simulator of sewer networks (MOUSE). CORAL interchanges limnimeter readings and gate commands with MOUSE as if it were connected to the real SCADA system. Finally, the global control results obtained using the predictive optimal control are presented and compared against the results obtained using the current local control system. The results obtained using the global control are very satisfactory compared to those obtained using the local control.
- Published
- 2009
- Full Text
- View/download PDF
32. An algebraic approach to detect logical inconsistencies in medical appropriateness criteria.
- Author
-
García-Remesal M, Maojo V, Laita L, Roanes-Lozano E, and Crespo J
- Subjects
- Humans, Reproducibility of Results, Sensitivity and Specificity, Spain, User-Computer Interface, Algorithms, Decision Support Systems, Clinical, Decision Support Techniques, Diagnosis, Computer-Assisted methods, Expert Systems, Logistic Models, Medical Records Systems, Computerized
- Abstract
In this paper, we present a computerized approach to detecting inconsistencies in medical knowledge bases. The method has been applied to a set of medical appropriateness criteria developed for the review of coronary artery disease management. One of the main problems associated with medical appropriateness criteria is detecting logical inconsistencies in the criteria set, a process often carried out manually by health services specialists. In our approach, appropriateness criteria are automatically translated into rules containing propositional variables, using three-valued Łukasiewicz logic augmented with modal operators to manage uncertainty. The method assigns a polynomial to each of the rules, integrity constraints, and facts from the rule-based set. This rule set is then checked for inconsistencies. The problem of determining whether a formula is a tautological consequence of a set of formulae is reduced by our method to an ideal membership problem in computer algebra. Finally, the set of medical appropriateness criteria is represented in a flowchart format that can be disseminated and remotely accessed over the Internet, and can be prospectively used for patient care and management. The method reported in this paper can be applied to other knowledge bases represented by means of IF-THEN rules.
- Published
- 2007
- Full Text
- View/download PDF
33. Simulation of germanium detector calibration using the Monte Carlo method: comparison between point and surface source models.
- Author
-
Ródenas J, Burgos MC, Zarza I, and Gallardo S
- Subjects
- Calibration, Computer Simulation, Monte Carlo Method, Radiation Dosage, Radiation Monitoring methods, Radiation Monitoring standards, Radiation Protection methods, Radiation Protection standards, Reproducibility of Results, Sensitivity and Specificity, Spain, Algorithms, Computer-Aided Design, Equipment Failure Analysis methods, Germanium radiation effects, Models, Statistical, Radiation Monitoring instrumentation, Radiation Protection instrumentation
- Abstract
Simulation of detector calibration using the Monte Carlo method is very convenient. The computational calibration procedure using the MCNP code was validated by comparing results of the simulation with laboratory measurements. The standard source used for this validation was a disc-shaped filter on which fission and activation products were deposited. Some discrepancies between the MCNP results and laboratory measurements were attributed to the point source model adopted. In this paper, the standard source has been simulated using both point and surface source models. Results from both models are compared with each other as well as with experimental measurements. Two variables, namely the collimator diameter and the detector-source distance, have been considered in the comparison analysis. The disc model is, as expected, the better model. However, the point source model is good for large collimator diameters and also when the distance from detector to source increases, although for smaller collimators and shorter distances a surface source model is necessary.
- Published
- 2005
- Full Text
- View/download PDF
34. Wind speed reconstruction from synoptic pressure patterns using an evolutionary algorithm
- Author
-
Carro-Calvo, L., Salcedo-Sanz, S., Prieto, L., Kirchner-Bossi, N., Portilla-Figueras, A., and Jiménez-Fernández, S.
- Subjects
WIND speed ,ALGORITHMS ,PRESSURE ,DATA analysis ,MATHEMATICAL models ,TOWERS ,PROBLEM solving - Abstract
This paper presents an evolutionary algorithm for wind speed reconstruction from synoptic pressure patterns. The algorithm operates in a search space formed by grids of pressure measurements, and must classify the different situations into classes in such a way that a measure of wind speed at a given point is minimized among patterns assigned to the same class. Each class is then assigned a mean wind speed and direction, so that wind speed reconstruction is possible for a new grid of synoptic pressures. We present the problem model and the specific description of the evolutionary algorithm proposed to solve it. We also show the good performance of the proposed method in the reconstruction of the average wind speed at six wind towers in Spain. The proposed method is applicable to wind speed reconstruction or to the reconstruction of missing data in wind series, especially when there is no other variable or related measure available. [Copyright Elsevier]
- Published
- 2012
- Full Text
- View/download PDF
35. Hourly-resolution analysis of electricity decarbonization in Spain (2017–2030).
- Author
-
Victoria, Marta and Gallego-Castillo, Cristobal
- Subjects
ELECTRIC power production ,CARBONIZATION ,ELECTRIC power consumption ,RENEWABLE energy sources ,CARBON dioxide mitigation ,ALGORITHMS - Abstract
Highlights: Hourly-resolved model to investigate highly-renewable electricity generation in Spain. Correlation analysis of time series to create 900 combinations used in simulations. Transition paths evaluated based on security of supply, CO2 emissions, and renewable share. Short-term phase-out of nuclear and coal power plants proven to be feasible.
Two alternative paths to achieve highly-renewable electricity generation in peninsular Spain are investigated in this paper. Every transition path comprises a description of the installed and decommissioned generation and storage capacities from 2017 to 2030, as well as a hypothesis on the evolution of the electricity demand. The electricity mix for every hour within the transition path is determined through a dispatch algorithm that prioritizes electricity from renewable energy sources. The simulation is run for 900 different combinations of time series representing the hourly capacity factors of the different technologies, as well as the electricity demand. This robust approach allows the evaluation of the transition paths based on the statistical distribution of several defined assessment criteria, such as security of supply, CO2 emissions and the renewable share in electricity generation. The feasibility of a Spanish power system with high renewable penetration is investigated not only in a future reference year but throughout the transition path. In particular, a progressive and simultaneous phase-out of nuclear and coal power plants in the short term is proven to be feasible. Furthermore, the sensitivity of the results is analyzed, including scenarios with a delayed nuclear phase-out, lower hydroelectricity generation due to more frequent and severe droughts caused by climate change, and a higher annual increment in electricity demand. [ABSTRACT FROM AUTHOR] (A toy dispatch sketch follows this record.)
- Published
- 2019
- Full Text
- View/download PDF
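A minimal sketch of the renewables-first dispatch idea described in entry 35, with invented capacities, a stylized demand curve, and a fixed merit order; the paper's actual model, data, and transition-path logic are not reproduced here.

```python
# Illustrative sketch for entry 35 (not the paper's model): hourly dispatch
# that serves demand from renewables first, then from dispatchable plants in
# a fixed merit order. All capacities and profiles are invented.
import numpy as np

rng = np.random.default_rng(1)
hours = 24
demand = 28 + 6 * np.sin(np.linspace(0, 2 * np.pi, hours))    # GW, stylized day
renewables = np.clip(rng.normal(15, 6, hours), 0, None)       # GW available

merit_order = [("hydro", 4.0), ("gas", 20.0), ("coal", 10.0)]  # (tech, capacity GW)
used_renewables = np.minimum(renewables, demand)               # renewables first
for h in range(hours):
    residual = demand[h] - used_renewables[h]
    for tech, cap in merit_order:
        residual -= min(residual, cap)                         # cheapest-first fill
    if residual > 1e-9:
        print(f"hour {h:02d}: unmet demand {residual:.1f} GW") # supply-security flag

print(f"Renewable share (stylized): {used_renewables.sum() / demand.sum():.1%}")
```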
36. Data-driven fuzzy habitat suitability models for brown trout in Spanish Mediterranean rivers.
- Author
-
Mouton, A.M., Alcaraz-Hernández, J.D., De Baets, B., Goethals, P.L.M., and Martínez-Capel, F.
- Subjects
FUZZY systems ,MATHEMATICAL models ,HABITATS ,BROWN trout ,DISTRIBUTION (Probability theory) ,ALGORITHMS ,STATISTICAL decision making ,RIVERS - Abstract
Abstract: In recent years, fuzzy models have been acknowledged as a suitable approach for species distribution modelling due to their transparency and their ability to incorporate the ecological gradient theory. Specifically, the overlapping class boundaries of a fuzzy model are similar to the transitions between different environmental conditions. However, the need for ecological expert knowledge is an important constraint when applying fuzzy species distribution models. Moreover, the consistency of the ecological preferences of some fish species across different rivers has been widely contested. Recent research has shown that data-driven fuzzy models may solve this ‘knowledge acquisition bottleneck’, and this paper is a further contribution to that work. The aim was to analyse the habitat preferences of brown trout (Salmo trutta fario L.) based on a data-driven fuzzy modelling technique and to compare the resulting fuzzy models with a commonly applied modelling technique, Random Forests. A heuristic nearest-ascent hill-climbing algorithm for fuzzy rule optimisation and Random Forests were applied to analyse the ecological preferences of brown trout in 93 mesohabitats. No significant differences in model performance were observed between the optimal fuzzy model and the Random Forests model, and both approaches selected river width, the cover index, and flow velocity as the most important variables describing brown trout habitat suitability. Further, the fuzzy model combined ecological relevance with reasonable interpretability, whereas the transparency of the Random Forests model was limited. This paper shows that fuzzy models may be a valid approach for species distribution modelling and that their performance is comparable to that of state-of-the-art modelling techniques like Random Forests. Fuzzy models could therefore be a valuable decision support tool for river managers and enhance communication between stakeholders. [Copyright Elsevier]
- Published
- 2011
- Full Text
- View/download PDF
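The nearest-ascent hill climbing used in entry 36 for fuzzy rule optimisation can be illustrated on a toy rule base. Everything below (rule firing degrees, presence/absence labels, the 0.5 voting threshold) is an invented stand-in for the paper's setup.

```python
# Illustrative sketch for entry 36: nearest-ascent hill climbing over the
# consequents (0/1 = unsuitable/suitable) of a small fuzzy rule base.
import numpy as np

rng = np.random.default_rng(2)
n_rules, n_sites = 8, 93
# Degree to which each site fires each fuzzy rule (rows sum to 1 here).
firing = rng.dirichlet(np.ones(n_rules), size=n_sites)
observed = rng.integers(0, 2, size=n_sites)            # presence/absence

def accuracy(consequents):
    predicted = (firing @ consequents) > 0.5           # weighted fuzzy vote
    return (predicted == observed).mean()

rules = rng.integers(0, 2, size=n_rules)
improved = True
while improved:                                        # flip one consequent at a time
    improved = False
    for i in range(n_rules):
        candidate = rules.copy()
        candidate[i] ^= 1
        if accuracy(candidate) > accuracy(rules):
            rules, improved = candidate, True
print("Optimized consequents:", rules, "accuracy:", round(accuracy(rules), 3))
```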
37. Dialect Classification via Text-Independent Training and Testing for Arabic, Spanish, and Chinese.
- Author
-
Lei, Yun and Hansen, John H. L.
- Subjects
AUTOMATION ,SPEECH research ,FIELD theory (Physics) ,SYSTEM identification ,FINITE Gaussian mixture models (Statistics) ,MAXIMUM likelihood statistics ,ALGORITHMS - Abstract
Automatic dialect classification has emerged as an important area in the speech research field. Effective dialect classification is useful in developing robust speech systems, such as speech recognition and speaker identification. In this paper, two novel algorithms are proposed to improve dialect classification for text-independent spontaneous speech in the Arabic and Spanish languages, along with probe results for Chinese. The problem considers the case where dialect labels, but no transcripts, are available for the training and test data, and speakers are speaking spontaneously; this is defined as text-independent dialect classification. The Gaussian mixture model (GMM) is used as the baseline system. The major motivation is to suppress confused/distractive regions of the dialect language space and emphasize discriminative/sensitive information of the available dialects. In the training phase, a symmetric version of the Kullback–Leibler divergence is used to find the most discriminative GMM mixtures (KLD-GMM), suppressing the confused acoustic GMM regions. For testing, the more discriminative frames are detected and selected based on where they fall in the GMM mixture feature space, a step termed frame selection decoding (FSD-GMM). Both the KLD-GMM and FSD-GMM techniques are shown to improve classification performance on three-way dialect tasks. The two algorithms and their combination are evaluated on dialects of Arabic and Spanish corpora, and measurable improvement is achieved in both cases over a generalized maximum-likelihood estimation GMM baseline (MLE-GMM). [ABSTRACT FROM PUBLISHER]
- Published
- 2011
- Full Text
- View/download PDF
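The KLD-GMM selection step of entry 37 relies on the symmetric Kullback–Leibler divergence between Gaussian mixture components. A sketch under simplifying assumptions (diagonal covariances, pre-paired components across two dialect models, random toy parameters):

```python
# Illustrative sketch for entry 37: rank paired GMM mixtures by symmetric
# Kullback-Leibler divergence and keep the most dialect-discriminative ones.
import numpy as np

def kl_diag_gauss(m1, v1, m2, v2):
    # KL(N1||N2) for diagonal-covariance Gaussians, summed over dimensions.
    return 0.5 * np.sum(np.log(v2 / v1) + (v1 + (m1 - m2) ** 2) / v2 - 1.0)

rng = np.random.default_rng(3)
n_mix, dim = 32, 12
mu_a, mu_b = rng.normal(size=(n_mix, dim)), rng.normal(size=(n_mix, dim))
var_a, var_b = rng.uniform(0.5, 2.0, (n_mix, dim)), rng.uniform(0.5, 2.0, (n_mix, dim))

sym_kld = np.array([kl_diag_gauss(mu_a[k], var_a[k], mu_b[k], var_b[k])
                    + kl_diag_gauss(mu_b[k], var_b[k], mu_a[k], var_a[k])
                    for k in range(n_mix)])
keep = np.argsort(sym_kld)[-8:]        # 8 most discriminative mixture indices
print("Most discriminative mixtures:", sorted(keep.tolist()))
```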
38. Survey of Visual and Force/Tactile Control of Robots for Physical Interaction in Spain.
- Author
-
Garcia, Gabriel J., Corrales, Juan A., Pomares, Jorge, and Torres, Fernando
- Subjects
ROBOTICS ,REMOTE sensing ,DETECTORS ,TACTILE sensors ,FORCE & energy ,TORQUE ,ALGORITHMS ,ARCHITECTURE - Abstract
Sensors provide robotic systems with the information required to perceive the changes that happen in unstructured environments and to modify their actions accordingly. The robotic controllers which process and analyze this sensory information are usually based on three types of sensors (visual, force/torque, and tactile), which identify the most widespread robotic control strategies: visual servoing control, force control, and tactile control. This paper presents a detailed review of the sensor architectures, algorithmic techniques, and applications developed by Spanish researchers to implement these mono-sensor controllers, as well as multi-sensor controllers that combine several of these sensors. [ABSTRACT FROM AUTHOR]
- Published
- 2009
- Full Text
- View/download PDF
39. Photogrammetric Methodology for the Production of Geomorphologic Maps: Application to the Veleta Rock Glacier (Sierra Nevada, Granada, Spain).
- Author
-
de Matías, Javier, de Sanjosé, José Juan, López-Nicolás, Gonzalo, Sagüés, Carlos, and Guerrero, José Jesús
- Subjects
PHOTOGRAMMETRY ,GEOMORPHOLOGICAL mapping ,CARTOGRAPHY ,GLACIERS ,GEODETIC observations ,ALGORITHMS - Abstract
In this paper we present a stereo feature-based method using SIFT (Scale-invariant feature transform) descriptors. We use automatic feature extractors, matching algorithms between images, and robust estimation techniques to produce a DTM (Digital Terrain Model) from convergent shots of a rock glacier. The geomorphologic structure observed in this study is the Veleta rock glacier (Sierra Nevada, Granada, Spain). This rock glacier is of high scientific interest because it is the southernmost active rock glacier in Europe, and it has been analyzed every year since 2001. The research on the Veleta rock glacier is devoted to the study of its displacement and its cartography through geodetic and photogrammetric techniques. [ABSTRACT FROM AUTHOR]
- Published
- 2009
- Full Text
- View/download PDF
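The SIFT-matching-plus-robust-estimation pipeline of entry 39 maps naturally onto OpenCV. A sketch with synthetic images standing in for the convergent rock-glacier shots; the ratio-test threshold and RANSAC parameters are conventional defaults, not the authors' values.

```python
# Illustrative sketch for entry 39: SIFT feature matching between two views,
# with RANSAC-based robust estimation to reject outlier correspondences.
import cv2
import numpy as np

rng = np.random.default_rng(4)
img1 = cv2.GaussianBlur((rng.random((480, 640)) * 255).astype(np.uint8), (9, 9), 0)
img2 = np.roll(img1, shift=(12, 25), axis=(0, 1))      # stand-in for a second view

sift = cv2.SIFT_create()
kp1, des1 = sift.detectAndCompute(img1, None)
kp2, des2 = sift.detectAndCompute(img2, None)

matcher = cv2.BFMatcher(cv2.NORM_L2)
good = [m for m, n in matcher.knnMatch(des1, des2, k=2) if m.distance < 0.75 * n.distance]

if len(good) >= 8:                                     # 8-point minimum for RANSAC
    pts1 = np.float32([kp1[m.queryIdx].pt for m in good])
    pts2 = np.float32([kp2[m.trainIdx].pt for m in good])
    F, inliers = cv2.findFundamentalMat(pts1, pts2, cv2.FM_RANSAC, 1.0, 0.99)
    print(f"{len(good)} tentative matches, {int(inliers.sum())} RANSAC inliers")
else:
    print("Too few tentative matches for robust estimation")
```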
40. Application of the Radon transform to detect small targets in sea clutter.
- Author
-
Carretero-Moya, J., Gismero-Menoyo, J., Asensio-López, A., and Blanco-del-Campo, Á.
- Subjects
HEURISTIC ,RADAR ,RADON transforms ,MATRICES (Mathematics) ,ALGORITHMS ,DOPPLER effect ,MONTE Carlo method - Abstract
The authors present a novel, heuristic approach for the detection of low radar cross-section targets in high-resolution sea clutter. The proposed technique is based on the application of the Radon transform to range–time matrices formed by column-wise storage of consecutive range profiles. The objective of this paper is two-fold: to analyse the effect of the transform on real high-resolution sea clutter and to describe a detection scheme based on the insight obtained. The proposed technique emulates the behaviour of traditional moving target detection algorithms without the need for reliable Doppler information. It also constitutes a powerful non-coherent strategy for integrating the target's energy along its specific path in the range–time plot. The performance of the detection technique has been tested against real high-resolution sea clutter data, acquired on the south coast of Spain with an in-house developed continuous-wave linear-frequency-modulated millimetre-wave radar system. Monte Carlo simulations show a significant improvement over conventional cell-averaging constant false alarm rate schemes. [ABSTRACT FROM AUTHOR]
- Published
- 2009
- Full Text
- View/download PDF
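Entry 40's core idea, that a moving target traces a line in the range–time matrix which the Radon transform turns into a peak, can be sketched with scikit-image. The clutter model and target path below are invented.

```python
# Illustrative sketch for entry 40: a slowly moving target traces a slanted
# line in the range-time matrix; the Radon transform integrates energy along
# candidate lines, so the target shows up as a peak in Radon space.
import numpy as np
from skimage.transform import radon

rng = np.random.default_rng(5)
rt = rng.rayleigh(1.0, size=(128, 128))            # sea-clutter-like background
for t in range(128):
    rt[min(20 + t // 4, 127), t] += 6.0            # target migrating through range

theta = np.linspace(0.0, 180.0, 180, endpoint=False)
sinogram = radon(rt, theta=theta, circle=False)    # integrate along all line angles
peak_angle = theta[np.unravel_index(sinogram.argmax(), sinogram.shape)[1]]
print(f"Strongest integrated line at angle {peak_angle:.1f} deg")
```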
41. Validation of a temperature emissivity separation hybrid method from airborne hyperspectral scanner data and ground measurements in the SEN2FLEX field campaign.
- Author
-
Peres, L. F., Sobrino, J. A., Libonati, R., Jiménez‐Muñoz, J. C., Dacamara, C. C., and Romaguera, M.
- Subjects
REMOTE sensing ,TEMPERATURE measurements ,SURFACE of the earth ,ALGORITHMS ,RADIOMETERS - Abstract
This paper presents an assessment of the performance of a hybrid method that allows the simultaneous retrieval of land-surface temperature (LST) and emissivity (LSE) from remotely sensed data. The proposed method is based on a synergistic usage of the split-window (SW) algorithm and the two-temperature method (TTM), and combines the advantages of both procedures while mitigating their drawbacks. The method was implemented for thermal channels 76 (10.56 µm) and 78 (11.72 µm) of the Airborne Hyperspectral Scanner (AHS), which was flown over the Barrax test site (Albacete, Spain) in the second week of July 2005, within the framework of the Sentinel-2 and Fluorescence Experiment (SEN2FLEX) field campaign. A set of radiometric measurements was performed in the thermal infrared region, coincident with the aircraft overpasses, for different surface types, e.g. bare soil, water, corn, wheat, and grass. The hybrid method was tested and compared with a standard SW algorithm, and the results show that the hybrid method provides better estimates of LST, with a bias (RMSE) of the order of 0.8 K (1.9 K), i.e. about one third (one half) of the corresponding values of 2.7 K (3.4 K) obtained with the SW algorithm. These figures provide a sound indication that the developed hybrid method is particularly useful for surface and atmospheric conditions where SW algorithms cannot be accurately applied. [ABSTRACT FROM AUTHOR]
- Published
- 2008
- Full Text
- View/download PDF
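Entry 41 builds on split-window LST retrieval. The sketch below shows only the generic generalized split-window form; the coefficients are placeholders, not the calibrated values for AHS channels 76 and 78.

```python
# Illustrative sketch for entry 41: a generic split-window estimate of
# land-surface temperature from two thermal channels. The coefficients
# below are placeholders, not the calibrated AHS values from the paper.
import numpy as np

def split_window_lst(t1, t2, emis_mean, emis_diff, water_vapour):
    """Generalized split-window form: brightness temperatures t1 (~10.6 um)
    and t2 (~11.7 um) in kelvin, mean emissivity, channel emissivity
    difference, and column water vapour (g/cm^2)."""
    c = [1.0, 2.0, 0.3, 50.0, -5.0, -150.0, 20.0]   # placeholder coefficients
    dt = t1 - t2
    return (t1 + c[0] * dt + c[1] * dt**2 + c[2]
            + (c[3] + c[4] * water_vapour) * (1.0 - emis_mean)
            + (c[5] + c[6] * water_vapour) * emis_diff)

t10_6 = np.array([295.2, 301.7])   # channel-76-like brightness temperatures (K)
t11_7 = np.array([293.8, 299.9])   # channel-78-like brightness temperatures (K)
lst = split_window_lst(t10_6, t11_7, emis_mean=0.97, emis_diff=0.005, water_vapour=2.0)
print("Split-window LST (K):", np.round(lst, 1))
```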
42. Humanoid robot RH-1 for collaborative tasks: a control architecture for human-robot cooperation.
- Author
-
Monje, Concepción A., Pierro, Paolo, and Balaguer, Carlos
- Subjects
ROBOTICS ,ROBOT kinematics ,ALGORITHMS ,VIRTUAL reality ,AUTONOMOUS robots ,UNIVERSITIES & colleges - Abstract
The full-scale humanoid robot RH-1 has been developed entirely at the University Carlos III of Madrid. In this paper we present an advanced control system that allows the robot to perform tasks in cooperation with humans. The collaborative tasks are carried out in a semi-autonomous way and are intended to be put into operation in real working environments where humans and robots share the same space. Before presenting the control strategy, the kinematic model and a simplified dynamic model of the robot are presented. All the models and algorithms are verified by several simulations and experimental results. [ABSTRACT FROM AUTHOR]
- Published
- 2008
- Full Text
- View/download PDF
43. Underwater SLAM in man-made structured environments.
- Author
-
Ribas, David, Ridao, Pere, Tardós, Juan Domingo, and Neira, José
- Subjects
SUBMERSIBLES ,NAVIGATION ,DAMS ,HARBORS ,SONAR ,ALGORITHMS ,KALMAN filtering ,MAPS - Abstract
This paper describes a navigation system for autonomous underwater vehicles (AUVs) in partially structured environments, such as dams, harbors, marinas, and marine platforms. A mechanically scanned imaging sonar is used to obtain information about the location of the vertical planar structures present in such environments. A robust voting algorithm has been developed to extract line features, together with their uncertainty, from the continuous sonar data flow. The obtained information is incorporated into a feature-based simultaneous localization and mapping (SLAM) algorithm running an extended Kalman filter. Simultaneously, the AUV's position estimate is provided to the feature extraction algorithm to correct the distortions that the vehicle motion produces in the acoustic images. Moreover, a procedure to build and maintain a sequence of local maps and to subsequently recover the full global map has been adapted for the presented application. Experiments carried out in a marina located on the Costa Brava (Spain) with the Ictineu AUV show the viability of the proposed approach. © 2008 Wiley Periodicals, Inc. [ABSTRACT FROM AUTHOR]
- Published
- 2008
- Full Text
- View/download PDF
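The EKF at the heart of the SLAM algorithm in entry 43 alternates prediction and measurement updates. A one-cycle sketch for a planar vehicle state with an invented scalar measurement; the paper's line features, local maps, and distortion correction are omitted.

```python
# Illustrative sketch for entry 43: one predict/update cycle of an EKF over
# an AUV's planar state [x, y, heading], fusing a single scalar measurement.
import numpy as np

x = np.array([0.0, 0.0, 0.1])                     # state estimate
P = np.diag([0.5, 0.5, 0.05])                     # state covariance
Q = np.diag([0.05, 0.05, 0.01])                   # process noise
v, dt = 1.0, 1.0

# Predict: constant-velocity motion along the current heading.
x_pred = x + np.array([v * dt * np.cos(x[2]), v * dt * np.sin(x[2]), 0.0])
F = np.array([[1, 0, -v * dt * np.sin(x[2])],
              [0, 1,  v * dt * np.cos(x[2])],
              [0, 0, 1]])
P_pred = F @ P @ F.T + Q

# Update: a heading-like observation (stand-in for a sonar line feature).
z, R = 0.12, np.array([[0.02]])                   # measurement and its noise
H = np.array([[0.0, 0.0, 1.0]])                   # we observe the heading term
y = z - H @ x_pred                                # innovation
S = H @ P_pred @ H.T + R
K = P_pred @ H.T @ np.linalg.inv(S)
x = x_pred + (K @ y).ravel()
P = (np.eye(3) - K @ H) @ P_pred
print("Updated state:", np.round(x, 3))
```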
44. An Automatic Algorithm to Date the Reference Cycle of the Spanish Economy.
- Author
-
Camacho, Maximo, Gadea, María Dolores, and Gómez-Loscos, Ana
- Subjects
GAUSSIAN distribution ,BUSINESS cycles ,ECONOMIC indicators ,MARKOV processes ,ALGORITHMS ,RECESSIONS - Abstract
This paper provides an accurate chronology of the Spanish reference business cycle by adapting a multiple change-point model. In this approach, each combination of peaks and troughs dated in a set of economic indicators is assumed to be a realization of a mixture of bivariate Gaussian distributions, whose number of components is estimated from the data. The means of each of these components refer to the dates of the reference turning points. The transitions across the components of the mixture are governed by a Markov chain that is restricted to enforce left-to-right transition dynamics. In the empirical application, seven recessions are identified in the period from February 1970 to February 2020, in high concordance with the timing of the turning point dates established by the Spanish Business Cycle Dating Committee (SBCDC). [ABSTRACT FROM AUTHOR]
- Published
- 2021
- Full Text
- View/download PDF
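Entry 44 models pooled turning-point dates as a mixture of bivariate Gaussians whose component means act as reference turning points. A sketch with scikit-learn on synthetic (peak, trough) dates; the paper's left-to-right Markov restriction is not reproduced.

```python
# Illustrative sketch for entry 44: fit a mixture of bivariate Gaussians to
# (peak, trough) dates pooled across indicators; component means play the
# role of reference turning points.
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(6)
true_turns = np.array([[1974.8, 1975.6], [1992.2, 1993.4], [2008.5, 2009.9]])
# Each indicator dates every recession with some idiosyncratic noise (years).
points = np.vstack([t + rng.normal(0, 0.25, size=(40, 2)) for t in true_turns])

gm = GaussianMixture(n_components=3, random_state=0).fit(points)
for peak, trough in sorted(gm.means_.tolist()):
    print(f"Reference peak {peak:.1f}, trough {trough:.1f}")
```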
45. Artificial immune-based supervised classifier for land-cover classification.
- Author
-
Pal, Mahesh
- Subjects
ALGORITHMS ,CLASSIFICATION ,DECISION trees ,IMMUNE system ,IMMUNOLOGY ,ARTIFICIAL neural networks ,STATISTICS - Abstract
This paper explores the potential of an artificial immune-based supervised classification algorithm for land-cover classification. The classifier is inspired by the human immune system and possesses properties such as nonlinear classification, self/non-self identification, and negative selection. Landsat ETM+ data of an area lying in Eastern England near the town of Littleport are used to study the performance of the artificial immune-based classifier. A univariate decision tree and a maximum likelihood classifier were used for comparison in terms of classification accuracy and computational cost. Results suggest that the artificial immune-based classifier compares well with the maximum likelihood and decision-tree classifiers in terms of classification accuracy. Its computational cost is higher than that of the decision tree but lower than that of the maximum likelihood classifier. Another data set, from an area in Spain, is also used to compare the performance of the immune-based supervised classifier with the maximum likelihood and decision-tree classification algorithms. Results suggest an improved performance, in terms of classification accuracy, with the immune-based classifier for this data set, too. The design of an artificial immune-based supervised classifier requires several user-defined parameters to be set, so this work is extended to study the effect of varying the values of six parameters on classification accuracy. Finally, a comparison with a backpropagation neural network suggests that the neural network classifier provides higher classification accuracies with both data sets, but the differences are not statistically significant. [ABSTRACT FROM AUTHOR]
- Published
- 2008
- Full Text
- View/download PDF
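Negative selection, one of the immune-inspired properties entry 45 builds on, is easy to sketch: random detectors are censored if they match "self" training data, and the survivors flag "non-self" inputs. The feature dimensions, radius, and counts below are invented.

```python
# Illustrative sketch for entry 45: negative selection. Random detectors are
# kept only if they do not match "self" (training pixels of one class);
# kept detectors then flag "non-self" pixels.
import numpy as np

rng = np.random.default_rng(7)
self_samples = rng.normal(0.3, 0.05, size=(200, 2))   # "self" class features
radius = 0.15

detectors = []
while len(detectors) < 300:
    d = rng.random(2)
    if np.linalg.norm(self_samples - d, axis=1).min() > radius:  # censor self-matchers
        detectors.append(d)
detectors = np.array(detectors)

pixel = np.array([0.75, 0.72])                        # a clearly non-self pixel
firing = np.linalg.norm(detectors - pixel, axis=1) < radius
print(f"{firing.sum()} detectors fire -> pixel classified as non-self: {firing.any()}")
```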
46. Actuator fault-tolerance evaluation of linear constrained model predictive control using zonotope-based set computations.
- Author
-
Ocampo-Martinez, C., Guerra, P., Puig, V., and Quevedo, J.
- Subjects
ACTUATORS ,FAULT tolerance (Engineering) ,PREDICTIVE control systems ,AUTOMATIC control systems ,SIGNALS & signaling ,ALGORITHMS ,SEWERAGE - Abstract
This paper presents a computational procedure to evaluate the fault tolerance of a linear-constrained model predictive control (LCMPC) scheme for a given actuator fault configuration (AFC). Faults in actuators cause changes in the constraints related to control signals (inputs), which in turn modify the set of MPC feasible solutions. This fact may result in an empty set of admissible solutions for a given control objective. Therefore, the admissibility of the control law facing actuator faults can be determined by knowing the set of feasible solutions. One of the aims of this paper is to provide methods to compute this set and to evaluate the admissibility of the control law for a given AFC, once the control objective and the admissibility criteria have been established. In particular, the admissible solution set for the predictive control problem, including the effect of faults (either through reconfiguration or accommodation), is determined using an algorithm that is implemented using set computations based on zonotopes. Finally, the proposed method is tested on a real application consisting of a part of the Barcelona sewer network. [ABSTRACT FROM AUTHOR]
- Published
- 2007
- Full Text
- View/download PDF
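The zonotope set computations mentioned in entry 46 reduce to two closed operations, linear maps and Minkowski sums. A sketch with a toy dynamics matrix and an invented faulty-input set; the paper's admissibility criteria are not implemented here.

```python
# Illustrative sketch for entry 46: the two zonotope operations that set-based
# MPC admissibility tests rely on. A zonotope is a center c plus a generator
# matrix G; linear maps and Minkowski sums stay closed in this family.
import numpy as np

def linear_map(A, c, G):
    return A @ c, A @ G                       # A*Z = (A*c, A*G)

def minkowski_sum(c1, G1, c2, G2):
    return c1 + c2, np.hstack([G1, G2])       # Z1 + Z2 stacks generators

A = np.array([[1.0, 0.1], [0.0, 0.9]])        # a toy closed-loop dynamics matrix
c_state, G_state = np.zeros(2), np.diag([0.2, 0.2])        # current state set
c_fault, G_fault = np.zeros(2), np.array([[0.05], [0.0]])  # input set under a fault

# One-step reachable set: A*X + W, still a zonotope.
c_next, G_next = minkowski_sum(*linear_map(A, c_state, G_state), c_fault, G_fault)
# Interval hull (box bounds) of the reachable set, handy for constraint checks.
half_widths = np.abs(G_next).sum(axis=1)
print("Reachable box:\n", np.stack([c_next - half_widths, c_next + half_widths]))
```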
47. Supply Estimation Using Coevolutionary Genetic Algorithms in the Spanish Electrical Market.
- Author
-
De La Cal MarÍn, Enrique A. and Sánchez Ramos, Luciano
- Subjects
ALGORITHMS ,CONFIGURATIONS (Geometry) ,ELECTRICITY ,ELECTRIC utilities ,ELECTRIC generators ,ECONOMIC competition ,SUPPLY & demand ,ECONOMIC models ,ECONOMIC statistics - Abstract
The price of electrical energy in Spain has not been regulated by the government since 1998; instead, it is determined by the supply offered by the generators in a competitive market, the so-called "electrical pool". This paper presents a genetic method for analyzing data from this new market. The eventual objective is to determine the individual supply curves of the competing agents. Adopting the point of view of game theory, different genetic algorithm configurations using coevolutionary and non-coevolutionary strategies, combined with scalar and multi-objective fitness, are compared. The results obtained are a first step toward inducing, from data, the optimal individual strategies in the Spanish electrical market in terms of perfect oligopolistic behavior. [ABSTRACT FROM AUTHOR]
- Published
- 2004
- Full Text
- View/download PDF
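A compact sketch of the coevolutionary idea in entry 47: two bid populations evolve, each scored against the other's current best in a toy uniform-price market. The market rules, capacities, and GA settings are invented and far simpler than the paper's configurations.

```python
# Illustrative sketch for entry 47: two generator agents coevolve bid prices;
# each individual is scored against the other population's current best in a
# toy uniform-price market with inelastic demand.
import numpy as np

rng = np.random.default_rng(8)
demand, cap = 100.0, 70.0                        # MW demanded, capacity per agent

def profit(my_bid, rival_bid):
    # Both agents are needed (demand > cap); the dearer (marginal) bid sets
    # the uniform price, and the cheaper bidder is dispatched at full capacity.
    mine = cap if my_bid <= rival_bid else demand - cap
    return max(my_bid, rival_bid) * mine

pops = [rng.uniform(10, 50, 20), rng.uniform(10, 50, 20)]   # bid populations
for gen in range(100):
    for i in (0, 1):
        rival_best = max(pops[1 - i], key=lambda b: profit(b, np.median(pops[i])))
        scores = np.array([profit(b, rival_best) for b in pops[i]])
        parents = pops[i][np.argsort(scores)[-10:]]          # truncation selection
        children = rng.choice(parents, 20) + rng.normal(0, 1.0, 20)  # mutation
        pops[i] = np.clip(children, 10, 50)
print("Coevolved median bids:", round(float(np.median(pops[0])), 1),
      round(float(np.median(pops[1])), 1))
```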
48. Statistical Analysis and Machine Learning Prediction of Fog-Caused Low-Visibility Events at A-8 Motor-Road in Spain.
- Author
-
Cornejo-Bueno, Sara, Casillas-Pérez, David, Cornejo-Bueno, Laura, Chidean, Mihaela I., Caamaño, Antonio J., Cerro-Prada, Elena, Casanova-Mateo, Carlos, and Salcedo-Sanz, Sancho
- Subjects
STATISTICS ,MACHINE learning ,PARETO distribution ,MAXIMUM likelihood statistics ,ALGORITHMS ,PEARSON correlation (Statistics) - Abstract
This work presents a full statistical analysis and accurate prediction of low-visibility events due to fog at the A-8 motor-road in Mondoñedo (Galicia, Spain). The analysis covers two years of study, considering visibility time series and exogenous variables collected in the zone most affected by extreme low-visibility events. The paper thus has a two-fold objective: first, we carry out a statistical analysis to estimate the probability distributions that best fit the fog event duration, using the Maximum Likelihood method and an alternative method known as the L-moments method. This statistical study allows the low-visibility depth to be associated with the event duration, showing a clear relationship that can be modeled with distributions for extremes such as the Generalized Extreme Value and Generalized Pareto distributions. Second, we apply a neural network approach, trained by means of the ELM (Extreme Learning Machine) algorithm, to predict the occurrence of low-visibility events due to fog from atmospheric predictive variables. This study provides a full characterization of fog events at this motor-road, where orographic fog is predominant and causes important traffic problems throughout the year. We also show that the ELM approach is able to obtain highly accurate low-visibility event predictions, with a Pearson correlation coefficient of 0.8 within a half-hour time horizon, enough to initialize protocols aimed at reducing the impact of these extreme events on the traffic of the A-8 motor-road. [ABSTRACT FROM AUTHOR]
- Published
- 2021
- Full Text
- View/download PDF
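The ELM training used in entry 48 fixes random hidden weights and solves only the output layer by least squares. A sketch on synthetic predictors standing in for the A-8 atmospheric variables:

```python
# Illustrative sketch for entry 48: an Extreme Learning Machine in a few
# lines; hidden weights are random and fixed, and only the output layer is
# solved by least squares. Data here are synthetic, not the A-8 records.
import numpy as np

rng = np.random.default_rng(9)
X = rng.normal(size=(500, 6))                     # atmospheric predictors
y = (X[:, 0] - 0.5 * X[:, 3] + rng.normal(0, 0.3, 500) > 0).astype(float)  # fog event

n_hidden = 100
W = rng.normal(size=(6, n_hidden))                # random input weights (fixed)
b = rng.normal(size=n_hidden)                     # random biases (fixed)
H = np.tanh(X @ W + b)                            # hidden-layer activations
beta, *_ = np.linalg.lstsq(H, y, rcond=None)      # output weights, one LS solve

pred = (np.tanh(X @ W + b) @ beta > 0.5).astype(float)
print(f"Training accuracy of the ELM sketch: {(pred == y).mean():.2f}")
```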
49. A Method of Estimating Time-to-Recovery for a Disease Caused by a Contagious Pathogen Such as SARS-CoV-2 Using a Time Series of Aggregated Case Reports.
- Author
-
Koutsouris, Dimitrios-Dionysios, Pitoglou, Stavros, Anastasiou, Athanasios, and Koumpouros, Yiannis
- Subjects
DISEASE progression ,COMPUTER software ,COVID-19 ,CONFIDENCE intervals ,TIME ,CONVALESCENCE ,WORLD health ,EPIDEMICS ,TIME series analysis ,DESCRIPTIVE statistics ,SENSITIVITY & specificity (Statistics) ,PREDICTION models ,COVID-19 pandemic ,ALGORITHMS - Abstract
During the outbreak of a disease caused by a pathogen with unknown characteristics, the uncertainty of its progression parameters can be reduced by devising methods that, based on rational assumptions, exploit available information to provide actionable insights. In this study, which was performed a few (~6) weeks into the outbreak of COVID-19 (caused by SARS-CoV-2), one of the most important disease parameters, the average time-to-recovery, was calculated using data publicly available on the internet (daily reported cases of confirmed infections, deaths, and recoveries) and fed into an algorithm that matches confirmed cases with deaths and recoveries. Unmatched cases were adjusted based on the matched-case calculation. The mean time-to-recovery, calculated from all globally reported cases, was found to be 18.01 days (SD 3.31 days) for the matched cases and 18.29 days (SD 2.73 days) when the adjusted unmatched cases were also taken into consideration. The proposed method used limited data and provided experimental results in the same range as clinical studies published several months later. This indicates that the proposed method, combined with expert knowledge and informed, calculated assumptions, could provide a meaningful average time-to-recovery figure, which can be used as an evidence-based estimation to support containment and mitigation policy decisions, even at the very early stages of an outbreak. [ABSTRACT FROM AUTHOR]
- Published
- 2023
- Full Text
- View/download PDF
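The matching idea of entry 49 can be approximated by first-in-first-out pairing of confirmed cases with later recoveries. A sketch on invented daily counts; the paper's adjustment of unmatched cases is omitted.

```python
# Illustrative sketch for entry 49: first-in-first-out matching of daily
# confirmed cases with daily recoveries to estimate time-to-recovery.
from collections import deque

daily_confirmed = [5, 8, 12, 9, 7, 4, 3, 2, 1, 0, 0, 0, 0, 0, 0]
daily_recovered = [0, 0, 0, 0, 2, 4, 7, 9, 8, 7, 6, 4, 2, 1, 1]

queue, durations = deque(), []
for day, (c, r) in enumerate(zip(daily_confirmed, daily_recovered)):
    queue.extend([day] * c)                    # each confirmed case joins the queue
    for _ in range(min(r, len(queue))):        # match oldest open cases first
        durations.append(day - queue.popleft())

mean_ttr = sum(durations) / len(durations)
print(f"Matched {len(durations)} cases; mean time-to-recovery {mean_ttr:.1f} days")
```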
50. A study of differential microRNA expression profile in migraine: the microMIG exploratory study.
- Author
-
Gallardo, V. J., Gómez-Galván, J. B., Asskour, L., Torres-Ferrús, M., Alpuente, A., Caronna, E., and Pozo-Rosich, P.
- Subjects
RESEARCH ,MONONUCLEAR leukocytes ,MIGRAINE ,RESEARCH methodology ,MICRORNA ,INTERVIEWING ,CASE-control method ,RANDOM forest algorithms ,GENETIC markers ,GENE expression profiling ,QUESTIONNAIRES ,FACTOR analysis ,RESEARCH funding ,CLUSTER analysis (Statistics) ,HEADACHE ,WOMEN'S health ,LONGITUDINAL method ,ALGORITHMS ,EPIGENOMICS - Abstract
Background: Several studies have described potential microRNA (miRNA) biomarkers associated with migraine, but the studies are scarcely reproducible, primarily due to the heterogeneous variability of participants. Increasing evidence shows that disease-related intrinsic factors, together with lifestyle (environmental factors), influence epigenetic mechanisms and, in turn, diseases. Hence, the main objective of this exploratory study was to find differentially expressed miRNAs (DE miRNA) in peripheral blood mononuclear cells (PBMC) of patients with migraine compared to healthy controls in a well-controlled, homogeneous cohort of non-menopausal women. Methods: Patients diagnosed with migraine according to the International Classification of Headache Disorders (ICHD-3) and healthy controls without a familial history of headache disorders were recruited. All participants completed a very thorough questionnaire and structured interview in order to control for environmental factors. RNA was extracted from PBMC, and a microarray system (GeneChip miRNA 4.1 Array chip, Affymetrix) was used to determine the miRNA profiles between study groups. Principal components analysis and hierarchical clustering analysis were performed to study the sample distribution, and random forest (RF) algorithms were computed for the classification task. To evaluate the stability of the results and the prediction error rate, a bootstrap (.632+ rule) was run throughout the whole procedure. Finally, a functional enrichment analysis of the selected targets was computed through protein–protein interaction networks. Results: After RF classification, three DE miRNA distinguished the study groups in a very homogeneous female cohort, controlled for factors such as demographics (age and BMI), life habits (physical activity, caffeine and alcohol consumption), comorbidities, and clinical features associated with the disease: miR-342-3p, miR-532-3p and miR-758-5p. Sixty-eight target genes were predicted, linked mainly to enriched ion channels and signaling pathways, neurotransmitter and hormone homeostasis, infectious diseases, and circadian entrainment. Conclusions: A novel 3-miRNA signature (miR-342-3p, miR-532-3p and miR-758-5p) has been found differentially expressed between controls and patients with migraine. Enrichment analysis showed that these pathways are closely associated with known migraine pathophysiology, which could lead to the first reliable epigenetic biomarker set. Further studies should be performed to validate these findings in a larger and more heterogeneous sample. [ABSTRACT FROM AUTHOR]
- Published
- 2023
- Full Text
- View/download PDF
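The random-forest classification step of entry 50 can be sketched on synthetic expression profiles, with feature importances standing in for candidate DE miRNAs; the bootstrap .632+ validation and the enrichment analysis are omitted.

```python
# Illustrative sketch for entry 50: random-forest classification of synthetic
# "expression" profiles (cases vs. controls), with feature importances
# standing in for candidate differentially expressed miRNAs.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(10)
n_per_group, n_mirnas = 40, 200
controls = rng.normal(0.0, 1.0, size=(n_per_group, n_mirnas))
patients = rng.normal(0.0, 1.0, size=(n_per_group, n_mirnas))
patients[:, [3, 52, 117]] += 1.5                  # three truly shifted miRNAs

X = np.vstack([controls, patients])
y = np.array([0] * n_per_group + [1] * n_per_group)

rf = RandomForestClassifier(n_estimators=500, random_state=0).fit(X, y)
top3 = np.argsort(rf.feature_importances_)[-3:]
print("Top candidate miRNA indices:", sorted(top3.tolist()))
```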