68 results
Search Results
2. The Challenges of Algorithm Management: The Spanish Perspective.
- Author
-
Prado, Daniel Perez del
- Subjects
ALGORITHMS, LABOR laws, DISRUPTIVE innovations, ARTIFICIAL intelligence, DIGITAL technology
- Abstract
This paper focuses on how Spain's labour and employment law is dealing with technological disruption and, particularly, with algorithm management, looking for a harmonious equilibrium between traditional structures and profound changes. It pays special attention to the different actors affected and the most recent normative changes. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
3. Confidence of a k-Nearest Neighbors Python Algorithm for the 3D Visualization of Sedimentary Porous Media.
- Author
-
Bullejos, Manuel, Cabezas, David, Martín-Martín, Manuel, and Alcalá, Francisco Javier
- Subjects
PYTHON programming language, K-nearest neighbor classification, POROUS materials, CONFIDENCE, ECONOMIC decision making, ALGORITHMS
- Abstract
In a previous paper, the authors implemented a machine learning k-nearest neighbors (KNN) algorithm and Python libraries to create two 3D interactive models of the stratigraphic architecture of the Quaternary onshore Llobregat River Delta (NE Spain) for groundwater exploration purposes. The main limitation of this previous paper was its lack of routines for evaluating the confidence of the 3D models. Building from the previous paper, this paper refines the programming code and introduces an additional algorithm to evaluate the confidence of the KNN predictions. A variant of the Similarity Ratio method was used to quantify the KNN prediction confidence. This variant used weights that were inversely proportional to the distance between each grain-size class and the inferred point to work out a value that played the role of similarity. While the KNN algorithm and Python libraries demonstrated their efficacy for obtaining 3D models of the stratigraphic arrangement of sedimentary porous media, the KNN prediction confidence verified the certainty of the 3D models. In the Llobregat River Delta, the KNN prediction confidence at each prospecting depth was a function of the available data density at that depth. As expected, the KNN prediction confidence decreased according to the decreasing data density at lower depths. The obtained average-weighted confidence was in the 0.44−0.53 range for gravel bodies at prospecting depths in the 12.7−72.4 m b.s.l. range and was in the 0.42−0.55 range for coarse sand bodies at prospecting depths in the 4.6−83.9 m b.s.l. range. In a couple of cases, spurious average-weighted confidences of 0.29 in one gravel body and 0.30 in one coarse sand body were obtained. These figures were interpreted as the result of the quite different weights of neighbors from different grain-size classes at short distances. 
The KNN algorithm confidence has proven its suitability for identifying these anomalous results in the supposedly well-depurated grain-size database used in this study. The introduced KNN algorithm confidence quantifies the reliability of the 3D interactive models, which is a necessary stage to make decisions in economic and environmental geology. In the Llobregat River Delta, this quantification clearly improves groundwater exploration predictability. [ABSTRACT FROM AUTHOR]
- Published
- 2023
- Full Text
- View/download PDF
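As an annotation to the entry above: the inverse-distance-weighted confidence it describes can be sketched in a few lines of Python. The function name and the exact weighting scheme are illustrative assumptions; the paper's Similarity Ratio variant may differ in detail.

```python
def knn_confidence(neighbors):
    """neighbors: list of (distance, grain_size_class) pairs for the k
    nearest data points to the inferred point.

    Weights each neighbor inversely to its distance, predicts the class
    with the largest total weight, and reports that class's share of the
    total weight as a confidence in [0, 1]."""
    weights = {}
    for dist, cls in neighbors:
        weights[cls] = weights.get(cls, 0.0) + 1.0 / max(dist, 1e-9)
    total = sum(weights.values())
    predicted = max(weights, key=weights.get)
    return predicted, weights[predicted] / total

# Toy example: two gravel neighbors (one close, one far) vs. one sand neighbor.
pred, conf = knn_confidence([(1.0, "gravel"), (2.0, "sand"), (4.0, "gravel")])
```

The confidence is simply the winning class's share of the total inverse-distance weight, so nearby neighbors of competing classes pull it down, which is consistent with the low spurious values the abstract attributes to "quite different weights of neighbors from different grain-size classes at short distances".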
4. A K-Nearest Neighbors Algorithm in Python for Visualizing the 3D Stratigraphic Architecture of the Llobregat River Delta in NE Spain.
- Author
-
Bullejos, Manuel, Cabezas, David, Martín-Martín, Manuel, and Alcalá, Francisco Javier
- Subjects
K-nearest neighbor classification, SUPERVISED learning, PYTHON programming language, ALGORITHMS, MACHINE learning, SEDIMENTARY structures, PLIOCENE Epoch
- Abstract
The k-nearest neighbors (KNN) algorithm is a non-parametric supervised machine learning classifier, which uses proximity and similarity to make classifications or predictions about the grouping of an individual data point. This ability makes the KNN algorithm ideal for classifying datasets of geological variables and parameters prior to 3D visualization. This paper introduces a machine learning KNN algorithm and Python libraries for visualizing the 3D stratigraphic architecture of sedimentary porous media in the Quaternary onshore Llobregat River Delta (LRD) in northeastern Spain. A first HTML model showed a consecutive 5 m-equispaced set of horizontal sections of the granulometry classes created with the KNN algorithm from 0 to 120 m below sea level in the onshore LRD. A second HTML model showed the 3D mapping of the main Quaternary gravel and coarse sand sedimentary bodies (lithosomes) and the basement (Pliocene and older rocks) top surface created with Python libraries. These results reproduce well the complex sedimentary structure of the LRD reported in recent scientific publications and prove the suitability of the KNN algorithm and Python libraries for visualizing the 3D stratigraphic structure of sedimentary porous media, which is a crucial stage in making decisions in different environmental and economic geology disciplines. [ABSTRACT FROM AUTHOR]
- Published
- 2022
- Full Text
- View/download PDF
5. KNN and adaptive comfort applied in decision making for HVAC systems.
- Author
-
Aparicio-Ruiz, Pablo, Barbadilla-Martín, Elena, Guadix, José, and Cortés, Pablo
- Subjects
THERMAL comfort, DECISION making, SUPPORT vector machines, ALGORITHMS, AIR conditioning, HEATING & ventilation industry
- Abstract
Deciding on a suitable set-point temperature for a heating, ventilating and air conditioning system is an energy and environmental challenge in our society. In the present paper, a general framework to define such a temperature based on a dynamic adaptive comfort algorithm is proposed. Because the thermal comfort of the occupants of a building has different ranges of acceptability, this method is applied to learn the comfort temperature with respect to the running mean temperature and therefore to decide the suitable range of indoor temperature. It is demonstrated that this solution makes it possible to dynamically build an adaptive comfort algorithm, an algorithm based on human thermal adaptability, without applying the traditional theory. The proposed methodology, based on the K-Nearest-Neighbour algorithm, was tested and compared with data from an experimental thermal comfort field study carried out in a mixed-mode building in south-western Spain and with the Support Vector Machine method. The results show that the K-Nearest-Neighbour algorithm represents the pattern of thermal comfort data better than the traditional solution and that it is a suitable method to learn the thermal comfort area of a building and to define the set-point temperature for a heating, ventilating and air-conditioning system. [ABSTRACT FROM AUTHOR]
- Published
- 2021
- Full Text
- View/download PDF
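As an annotation to the entry above: learning a comfort temperature from the running mean temperature with KNN can be sketched as a nearest-neighbour average over field observations. The function name, the value of k, and the toy data are illustrative assumptions, not the paper's actual setup.

```python
def knn_setpoint(history, t_rm, k=3):
    """history: list of (running_mean_temp, observed_comfort_temp) pairs
    from a thermal comfort field study.

    Returns the KNN estimate of the comfort temperature for a given
    running mean temperature t_rm: the average comfort temperature of the
    k observations closest to t_rm. A set-point range can then be derived
    from it (e.g. a band of +/- 1 degC around the estimate)."""
    nearest = sorted(history, key=lambda p: abs(p[0] - t_rm))[:k]
    return sum(comfort for _, comfort in nearest) / k

# Hypothetical field observations: (running mean temp, comfort temp) in degC.
history = [(14.0, 21.5), (16.0, 22.0), (18.0, 22.8), (22.0, 24.1), (26.0, 25.3)]
setpoint = knn_setpoint(history, 17.0)
```

Unlike the traditional adaptive comfort equations, this makes no assumption about the functional form linking the two temperatures; the pattern is taken directly from the building's own data, which is the point the abstract makes in favour of KNN.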
6. Comparison of Optimisation Algorithms for Centralised Anaerobic Co-Digestion in a Real River Basin Case Study in Catalonia.
- Author
-
Palma-Heredia D, Verdaguer M, Puig V, Poch M, and Cugueró-Escofet MÀ
- Subjects
- Anaerobiosis, Digestion, Spain, Algorithms, Rivers
- Abstract
Anaerobic digestion (AnD) is a process that allows the conversion of organic waste into a source of energy such as biogas, introducing sustainability and the circular economy into waste treatment. AnD is an intricate process because of the multiple parameters involved, and its complexity increases when the wastes come from different types of generators. In this case, a key point for achieving good performance is the use of optimisation methods. Currently, many tools have been developed to optimise a single AnD plant. However, the study of a network of AnD plants and multiple waste generators, all in different locations, remains unexplored. This novel approach requires optimisation methodologies with the capacity to deal with a highly complex combinatorial problem. This paper proposes and compares the use of three evolutionary algorithms: ant colony optimisation (ACO), genetic algorithm (GA) and particle swarm optimisation (PSO), which are especially suited to this type of application. The algorithms successfully solve the problem, using an objective function that includes terms related to quality and logistics. Their application to a real case study in Catalonia (Spain) shows their usefulness (ACO and GA to achieve maximum biogas production and PSO for safer operation conditions) for AnD facilities.
- Published
- 2022
- Full Text
- View/download PDF
7. THE USE OF PREDICTIVE ALGORITHMS IN CRIMINAL LAW: ON THE JUDGMENT OF THE DISTRICT COURT OF THE HAGUE (NETHERLANDS) ON SyRI, 5 FEBRUARY 2020.
- Author
-
Sánchez Vilanova, María
- Subjects
DUE process of law, CRIMINAL procedure, CRIMINAL law, RIGHT of privacy, DISTRICT courts, FRAUD
- Abstract
Copyright of Teoría & Derecho. Revista de Pensamiento Jurídico is the property of Editorial Tirant Lo Blanch SL and its content may not be copied or emailed to multiple sites or posted to a listserv without the copyright holder's express written permission. However, users may print, download, or email articles for individual use. This abstract may be abridged. No warranty is given about the accuracy of the copy. Users should refer to the original published version of the material for the full abstract. (Copyright applies to all Abstracts.)
- Published
- 2022
- Full Text
- View/download PDF
8. IN METHODUM FLUXIONUM.
- Author
-
AUSEJO, Elena
- Subjects
EIGHTEENTH century, CONTENT analysis, MANUSCRIPTS, TRANSCRIPTION (Linguistics), ALGORITHMS, USER-generated content
- Abstract
Copyright of Cuadernos Dieciochistas is the property of Ediciones Universidad de Salamanca and its content may not be copied or emailed to multiple sites or posted to a listserv without the copyright holder's express written permission. However, users may print, download, or email articles for individual use. This abstract may be abridged. No warranty is given about the accuracy of the copy. Users should refer to the original published version of the material for the full abstract. (Copyright applies to all Abstracts.)
- Published
- 2022
- Full Text
- View/download PDF
9. Estimation of COVID-19 epidemic curves using genetic programming algorithm.
- Author
-
Anđelić, Nikola, Šegota, Sandi Baressi, Lorencin, Ivan, Mrzljak, Vedran, and Car, Zlatan
- Subjects
HIGH performance computing, COVID-19, CONVALESCENCE, MACHINE learning, INFECTIOUS disease transmission, RESEARCH funding, STATISTICAL models, ALGORITHMS
- Abstract
This paper investigates the possibility of implementing a Genetic Programming (GP) algorithm on a publicly available COVID-19 data set, in order to obtain mathematical models that could be used to estimate confirmed, deceased, and recovered cases and the epidemiology curve for specific countries with a high number of cases, such as China, Italy, Spain, and the USA, as well as on the global scale. The conducted investigation shows that the best mathematical models produced for estimating confirmed and deceased cases achieved R² scores of 0.999, while the models developed for estimating recovered cases achieved an R² score of 0.998. The equations generated for confirmed, deceased, and recovered cases were combined in order to estimate the epidemiology curve of specific countries and on the global scale. The estimated epidemiology curve for each country obtained from these equations is almost identical to the real data contained within the data set. [ABSTRACT FROM AUTHOR]
- Published
- 2021
- Full Text
- View/download PDF
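As an annotation to the entry above: the abstract says the three fitted models were combined into the epidemiology curve. A common reading of that combination (an assumption here, not stated in the abstract) is the active-case curve, active = confirmed - deceased - recovered:

```python
def active_cases(confirmed, deceased, recovered):
    """Combine three time series (e.g. the outputs of the GP-derived
    equations for confirmed, deceased, and recovered cases, evaluated at
    the same dates) into the active-case epidemiology curve."""
    return [c - d - r for c, d, r in zip(confirmed, deceased, recovered)]
```

Any model of the three component series, GP-derived or otherwise, can be combined this way; the fit quality of the resulting curve is bounded by the fit of the components.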
10. Bidders Recommender for Public Procurement Auctions Using Machine Learning: Data Analysis, Algorithm, and Case Study with Tenders from Spain.
- Author
-
García Rodríguez, Manuel J., Rodríguez Montequín, Vicente, Ortega Fernández, Francisco, and Villanueva Balsera, Joaquín M.
- Subjects
GOVERNMENT purchasing, MACHINE learning, ALGORITHMS, RECOMMENDER systems, RANDOM forest algorithms, DATA analysis
- Abstract
Recommending the identity of bidders in public procurement auctions (tenders) has a significant impact in many areas of public procurement, but it has not yet been studied in depth. A bidders recommender would be a very beneficial tool because a supplier (company) can search for appropriate tenders and, vice versa, a public procurement agency can automatically discover previously unknown companies which are suitable for its tender. This paper develops a pioneering algorithm to recommend potential bidders using a machine learning method, particularly a random forest classifier. The bidders recommender is described theoretically, so it can be implemented or adapted to any particular situation. It has been successfully validated with a case study: an actual Spanish tender dataset (free public information) which has 102,087 tenders from 2014 to 2020 and a company dataset (non-free public information) which has 1,353,213 Spanish companies. Quantitative, graphical, and statistical descriptions of both datasets are presented. The results of the case study were satisfactory: the winning bidding company is within the recommended group of companies in 24% to 38% of the tenders, depending on the test conditions and scenarios. [ABSTRACT FROM AUTHOR]
- Published
- 2020
- Full Text
- View/download PDF
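As an annotation to the entry above: the validation metric it reports (the winner falling inside the recommended group) can be sketched independently of the classifier. The scores below stand in for random forest class probabilities, and all names and numbers are hypothetical.

```python
def recommend(scores, n=10):
    """Rank companies for one tender by classifier score (e.g. random
    forest class probabilities) and return the top-n recommended bidders."""
    return [company for company, _ in sorted(scores, key=lambda x: -x[1])[:n]]

def hit_rate(per_tender_scores, winners, n=10):
    """Fraction of tenders whose actual winning company appears in the
    recommended group, i.e. the kind of figure the abstract reports
    (24% to 38% depending on conditions)."""
    hits = sum(winners[t] in recommend(s, n) for t, s in per_tender_scores.items())
    return hits / len(per_tender_scores)

# Hypothetical scores for two tenders and two candidate companies.
scores = {"t1": [("acme", 0.9), ("beta", 0.1)],
          "t2": [("acme", 0.3), ("beta", 0.7)]}
rate = hit_rate(scores, {"t1": "acme", "t2": "acme"}, n=1)
```

The size n of the recommended group is the natural knob here: a larger group raises the hit rate at the cost of less specific recommendations, which is presumably why the paper reports a range across test conditions.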
11. A Self-Assembly Portable Mobile Mapping System for Archeological Reconstruction Based on VSLAM-Photogrammetric Algorithm.
- Author
-
Ortiz-Coder P and Sánchez-Ríos A
- Subjects
- Cloud Computing, Equipment Design, Imaging, Three-Dimensional instrumentation, Photogrammetry instrumentation, Software, Spain, Workflow, Algorithms, Archaeology methods, Imaging, Three-Dimensional methods, Photogrammetry methods
- Abstract
Three-dimensional (3D) models are widely used in clinical applications, geosciences, cultural heritage preservation, and engineering; this, together with new emerging needs such as building information modeling (BIM), drives the development of new data capture techniques and devices with a low cost and a reduced learning curve that allow non-specialized users to employ them. This paper presents a simple, self-assembly device for 3D point cloud data capture with an estimated base price under €2500; furthermore, a workflow for the calculations is described that includes a Visual SLAM-photogrammetric threaded algorithm implemented in C++. Another purpose of this work is to validate the proposed system in BIM working environments. To achieve this, in outdoor tests, several 3D point clouds were obtained and the coordinates of 40 points were measured with this device, with data capture distances ranging between 5 and 20 m. Subsequently, those were compared to the coordinates of the same targets measured by a total station. The Euclidean average distance errors and root mean square errors (RMSEs) ranged between 12-46 mm and 8-33 mm respectively, depending on the data capture distance (5-20 m). Furthermore, the proposed system was compared with a commonly used photogrammetric methodology based on Agisoft Metashape software. The results obtained demonstrate that the proposed system satisfies (in each case) the tolerances of 'level 1' (51 mm) and 'level 2' (13 mm) for point cloud acquisition in urban design and historic documentation, according to the BIM Guide for 3D Imaging (U.S. General Services).
- Published
- 2019
- Full Text
- View/download PDF
12. [Optimization of the prediction of financial problems in Spanish private health companies using genetic algorithms].
- Author
-
González-Martín JM, Sánchez-Medina AJ, and Alonso JB
- Subjects
- Artificial Intelligence, Forecasting, Humans, Spain, Algorithms, Bankruptcy, Health Care Sector economics, Private Sector economics
- Abstract
Objective: This paper presents a methodology to optimize, using Altman's Z-Score for private companies, the prediction of private companies of the Spanish health sector entering bankruptcy. Method: The proposed method applies genetic algorithms (GA) to find the coefficients of the ratio-based formula proposed by Altman in the version of the score for private companies that optimize the prediction for Spanish private health companies, maximizing sensitivity and specificity and thereby reducing type I and type II errors. For this purpose, a sample of 5,903 companies from the Spanish private health sector, obtained from the database of the Iberian Balance Analysis System (SABI) between 2007 and 2015, was used. Results: The results show that the predictive model obtained with the GA presents greater accuracy, sensitivity and specificity than that proposed by Altman for private companies, with both the test data and the full sample. Conclusions: The most important finding of this study was to establish a methodology that can identify optimized coefficients for the Altman Z-Score, allowing a more accurate prediction of bankruptcy in Spanish private healthcare companies. (Copyright © 2018 SESPAS. Published by Elsevier España, S.L.U. All rights reserved.)
- Published
- 2019
- Full Text
- View/download PDF
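As an annotation to the entry above: evolving the coefficients of a Z-Score-style linear combination to maximize sensitivity plus specificity can be sketched with a tiny elitist GA. The fixed cutoff of 1.1 (loosely echoing Altman's private-firm model), the toy data, and all names and hyperparameters are illustrative assumptions, not the paper's actual setup.

```python
import random

def fitness(coeffs, samples):
    """Sensitivity + specificity of the rule 'predict bankruptcy when the
    weighted sum of financial ratios falls below a fixed cutoff'.
    samples: list of (ratios, bankrupt) pairs."""
    tp = fn = tn = fp = 0
    for ratios, bankrupt in samples:
        pred = sum(c * r for c, r in zip(coeffs, ratios)) < 1.1
        if bankrupt:
            tp, fn = tp + pred, fn + (not pred)
        else:
            tn, fp = tn + (not pred), fp + pred
    return tp / max(tp + fn, 1) + tn / max(tn + fp, 1)

def evolve(samples, n_coeffs=4, pop=20, gens=30, seed=1):
    """Minimal elitist GA: keep the best half of the population each
    generation and refill with Gaussian mutations of elite members."""
    rng = random.Random(seed)
    population = [[rng.uniform(0.0, 7.0) for _ in range(n_coeffs)]
                  for _ in range(pop)]
    for _ in range(gens):
        population.sort(key=lambda c: -fitness(c, samples))
        elite = population[:pop // 2]
        population = elite + [[g + rng.gauss(0.0, 0.3) for g in rng.choice(elite)]
                              for _ in range(pop - len(elite))]
    return max(population, key=lambda c: fitness(c, samples))

# Toy data: bankrupt firms have uniformly low ratios, healthy firms high ones.
toy = [([0.05] * 4, True)] * 5 + [([1.0] * 4, False)] * 5
best = evolve(toy)
```

Because the objective is the sum of sensitivity and specificity rather than a differentiable loss, a population-based search like this is a natural fit, which matches the paper's motivation for using GAs over direct optimization.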
13. The Empirically Corrected EP-TOMS Total Ozone Data Against Brewer Measurements at El Arenosillo (Southwestern Spain).
- Author
-
Antón, Manuel, Vilaplana, José Manuel, Kroon, Mark, Serrano, Antonio, Parias, Marta, Cancillo, María Luisa, and de la Morena, Benito A.
- Subjects
OZONE, SPECTROMETERS, SPECTRORADIOMETER, SATELLITE geodesy
- Abstract
This paper focuses on the validation of the empirically corrected total ozone column (TOC) data provided by the Earth Probe Total Ozone Mapping Spectrometer (EP-TOMS) using ground-based measurements recorded by a well-calibrated Brewer spectroradiometer located at El Arenosillo (Spain). In addition, satellite TOC observations derived from the Ozone Monitoring Instrument (OMI) with the TOMS algorithm are also used in this paper. The agreement between EP-TOMS TOC data and Brewer measurements is excellent (R² ~ 0.92), even for the period 2000-2005 when higher EP-TOMS instrument degradation occurred. Despite its low magnitude, the EP-TOMS-Brewer relative differences depend on the solar zenith angle (SZA), showing a clear seasonal cycle with an amplitude between ±2% and ±4%. Conversely, OMI-Brewer relative differences show a constant negative value around -1% with no significant dependence on SZA. No significant dependence of the ground-based to satellite-based differences on the EP-TOMS scene or the OMI cross-track position is observed for either satellite retrieval algorithm. Finally, the TOCs estimated by the two satellite instruments have also been compared, showing good agreement (R² ~ 0.88). Overall, we conclude that the empirical correction of the EP-TOMS data record provides a reprocessed data set of high quality. However, EP-TOMS data after the year 2000 should not be used in calculations of global ozone trends due to remaining errors in the data set and because it is no longer an independent data set. [ABSTRACT FROM AUTHOR]
- Published
- 2010
- Full Text
- View/download PDF
14. Control Algorithm for Coordinated Reactive Power Compensation in a Wind Park.
- Author
-
Díaz-Dorado, E., Carrillo, C., and Cidrás, J.
- Subjects
WIND power plants, WIND power, ALGORITHMS, POWER resources, WIND turbines, INDUCTION generators, CAPACITOR banks, DYNAMIC programming, SIMULATION methods & models, REACTIVE power
- Abstract
The penetration level of wind energy is continuously growing, and it is especially relevant in European countries such as Denmark, Germany, and Spain. For this reason, grid codes in different countries have recently been revised, or are now under revision, in order to integrate this energy into the network while taking into account the security of supply. This paper is concerned with reactive compensation, one aspect usually included in these codes. On the other hand, a great number of installed wind parks are formed by fixed-speed wind turbines equipped with induction generators. The typical scheme for reactive compensation in this kind of wind park is based on capacitor banks locally controlled at each machine. This configuration makes it very difficult to follow the requirements of the new grid codes. To overcome this problem, a configuration with a central controller that coordinates the actuation of all the capacitor steps in the wind park is proposed in this paper. A central controller algorithm based on dynamic programming is presented and evaluated by means of simulation. At this time, the proposed scheme has been installed at the Sotavento Experimental Wind Park (Spain) and is currently being tested. [ABSTRACT FROM AUTHOR]
- Published
- 2008
- Full Text
- View/download PDF
15. Breast Density Analysis Using an Automatic Density Segmentation Algorithm.
- Author
-
Oliver, Arnau, Tortajada, Meritxell, Lladó, Xavier, Freixenet, Jordi, Ganau, Sergi, Tortajada, Lidia, Vilagran, Mariona, Sentís, Melcior, and Martí, Robert
- Subjects
BREAST, ALGORITHMS, MAMMOGRAMS, BREAST tumors, DIAGNOSTIC imaging, LONGITUDINAL method, COMPUTERS in medicine, PROBABILITY theory, REGRESSION analysis, RESEARCH funding, T-test (Statistics), EVALUATION research, DESCRIPTIVE statistics, ANATOMY
- Abstract
Breast density is a strong risk factor for breast cancer. In this paper, we present an automated approach for breast density segmentation in mammographic images based on a supervised pixel-based classification and using textural and morphological features. The objective of the paper is not only to show the feasibility of an automatic algorithm for breast density segmentation but also to prove its potential application to the study of breast density evolution in longitudinal studies. The database used here contains three complete screening examinations, acquired 2 years apart, of 130 different patients. The approach was validated by comparing manual expert annotations with automatically obtained estimations. Transversal analysis of the craniocaudal (CC) and mediolateral oblique (MLO) views of both breasts acquired in the same study showed a correlation coefficient of ρ = 0.96 between the mammographic density percentages for the left and right breasts, whereas a comparison of the two mammographic views showed a correlation of ρ = 0.95. A longitudinal study of breast density confirmed the trend that dense tissue percentage decreases over time, although we noticed that the decrease in the ratio depends on the initial amount of breast density. [ABSTRACT FROM AUTHOR]
- Published
- 2015
- Full Text
- View/download PDF
16. Application of the GoRoSo Feedforward Algorithm to Compute the Gate Trajectories for a Quick Canal Closing in the Case of an Emergency.
- Author
-
Soler, Joan, Gómez, Manuel, Rodellar, José, and Gamazo, Pablo
- Subjects
CANALS, RIVERS, OPEN-channel flow, QUADRATIC programming, FEEDFORWARD control systems
- Abstract
The canal delivery system in the Left Hemidelta area of the Ebro River in Spain consists of a tree-shaped net of open canals. The overall system can be quickly isolated in the case of an emergency by closing the upstream pool. Transients in which the initial state is hydraulically far from the final state are difficult to handle and, in order to protect the canal lining, cannot be made in only one gate movement; therefore, they have to be as smooth as possible. GoRoSo is a feedforward control algorithm for irrigation canals based on sequential quadratic programming. With this tool, it is possible to calculate the gate trajectories that smoothly carry the canal from the initial state to the final state by keeping the water depth constant at checkpoints. The paper shows the efficient implementation of GoRoSo in both the closure and opening operations of the canal delivery system. [ABSTRACT FROM AUTHOR]
- Published
- 2013
- Full Text
- View/download PDF
17. ALGORITHMIC (IN)VISIBILITY TACTICS AMONG IMMIGRANT TIKTOKERS.
- Author
-
JARAMILLO-DENT, DANIELA
- Subjects
SCIENTIFIC literature, IMMIGRANTS, SOCIAL media, DIGITAL video
- Abstract
It is well established in the scientific literature that immigrants are excluded from their own stories, which are often instrumentalized to fulfill specific communicative, othering intentions. In this sense, migrant agency and voice are, in many cases, absent from narratives related to their life experiences and subject to various symbolic, digital, and material borders. Moreover, although social media has been recognized as a prime space for self-representation across different segments of society, immigrants are often excluded from these spaces due to the risks that sharing certain information publicly represents to them. In this article I draw from a 16-month digital ethnography and inductive, multimodal content analysis of videos created by 53 Latin American immigrant tiktokers in the United States and Spain. This enables the conceptualization of their algorithmic (in)visibility practices, which refer to the set of strategies deployed by immigrant content creators on social media (and possibly other marginalized and vulnerable populations) to negotiate the conspicuousness of their controversial content with the aim of avoiding its deletion from the platform. The findings unveil three exemplary algorithmic (in)visibility practices: content reuse and re-upload, vernacular visibility, and partial deplatforming. I find that these strategies shift between collective and individual approaches to achieve selective visibility and concealed conspicuousness within algorithmic moderation systems. [ABSTRACT FROM AUTHOR]
- Published
- 2022
18. International External Validation of Risk Prediction Model of 90-Day Mortality after Gastrectomy for Cancer Using Machine Learning.
- Author
-
Dal Cero, Mariagiulia, Gibert, Joan, Grande, Luis, Gimeno, Marta, Osorio, Javier, Bencivenga, Maria, Fumagalli Romario, Uberto, Rosati, Riccardo, Morgagni, Paolo, Gisbertz, Suzanne, Polkowski, Wojciech P., Lara Santos, Lucio, Kołodziejczyk, Piotr, Kielan, Wojciech, Reddavid, Rossella, van Sandick, Johanna W., Baiocchi, Gian Luca, Gockel, Ines, Davies, Andrew, and Wijnhoven, Bas P. L.
- Subjects
MORTALITY risk factors, GASTRECTOMY, RISK assessment, RANDOM forest algorithms, PREDICTION models, STOMACH tumors, RECEIVER operating characteristic curves, SURGERY, PATIENTS, FISHER exact test, LOGISTIC regression analysis, HEMOGLOBINS, CANCER patients, HOSPITALS, DESCRIPTIVE statistics, AGE distribution, RESEARCH methodology, RESEARCH, COMBINED modality therapy, MACHINE learning, DATA analysis software, CONFIDENCE intervals, SERUM albumin, ALGORITHMS
- Abstract
Simple Summary: A 90-day mortality predictive model for curative gastric cancer resection based on the Spanish EURECCA Esophagogastric Cancer database was externally validated using the GASTRODATA registry. The externally validated model showed modestly worse performance than the original model, nevertheless maintaining its discriminating ability in clinical practice. Background: Radical gastrectomy remains the main treatment for gastric cancer, despite its high mortality. A clinical predictive model of 90-day mortality (90DM) risk after gastric cancer surgery based on the Spanish EURECCA registry database was developed using a machine learning algorithm. We performed an external validation of this model based on data from an international multicenter cohort of patients. Methods: A cohort of patients from the European GASTRODATA database was selected. Demographic, clinical, and treatment variables in the original and validation cohorts were compared. The performance of the model was evaluated using the area under the curve (AUC) for a random forest model. Results: The validation cohort included 2546 patients from 24 European hospitals. The advanced clinical T- and N-category, neoadjuvant therapy, open procedures, total gastrectomy rates, and mean volume of the centers were significantly higher in the validation cohort. The 90DM rate was also higher in the validation cohort (5.6%) vs. the original cohort (3.7%). The AUC in the validation model was 0.716. Conclusion: The externally validated model for predicting the 90DM risk in gastric cancer patients undergoing gastrectomy with curative intent continues to be as useful as the original model in clinical practice. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
19. The "Trending" section on YouTube in Spain during the first weeks of the Covid-19 pandemic: the visibility of the cultural industries versus youtubers.
- Author
-
Patricio Pérez-Rufi, José and Castro-Higueras, Antonio
- Subjects
CULTURAL industries, SOCIAL responsibility, COVID-19, ACCESS to information, PRODUCE trade, USER-generated content
- Abstract
Copyright of Estudios sobre el Mensaje Periodistico is the property of Universidad Complutense de Madrid and its content may not be copied or emailed to multiple sites or posted to a listserv without the copyright holder's express written permission. However, users may print, download, or email articles for individual use. This abstract may be abridged. No warranty is given about the accuracy of the copy. Users should refer to the original published version of the material for the full abstract. (Copyright applies to all Abstracts.)
- Published
- 2021
- Full Text
- View/download PDF
20. Multiple time scales in modeling the incidence of infections acquired in intensive care units.
- Author
-
Wolkewitz, Martin, Cooper, Ben S., Palomar-Martinez, Mercedes, Alvarez-Lerma, Francisco, Olaechea-Astigarraga, Pedro, Barnett, Adrian G., and Schumacher, Martin
- Subjects
INTENSIVE care units, INFECTION risk factors, NOSOCOMIAL infections, CRITICAL care medicine, HOSPITAL admission & discharge, DISEASE prevalence, METHICILLIN-resistant staphylococcus aureus, ALGORITHMS, COMPARATIVE studies, CROSS infection, LENGTH of stay in hospitals, MATHEMATICAL models, RESEARCH methodology, MEDICAL cooperation, RESEARCH, RESEARCH funding, RISK assessment, STAPHYLOCOCCAL diseases, TIME, THEORY, EVALUATION research, DISEASE incidence, PROPORTIONAL hazards models, PHYSIOLOGY
- Abstract
Background: When patients are admitted to an intensive care unit (ICU), their risk of getting an infection depends strongly on the length of stay at risk in the ICU. In addition, the risk of infection is likely to vary over calendar time as a result of fluctuations in the prevalence of the pathogen on the ward. Hence the risk of infection is expected to depend on two time scales (time in ICU and calendar time) as well as on competing events (discharge or death) and their spatial location. The purpose of this paper is to develop and apply appropriate statistical models for the risk of ICU-acquired infection accounting for multiple time scales, competing risks and the spatial clustering of the data. Methods: A multi-center database from a Spanish surveillance network was used to study the occurrence of infection due to Methicillin-resistant Staphylococcus aureus (MRSA). The analysis included 84,843 patient admissions between January 2006 and December 2011 from 81 ICUs. Stratified Cox models were used to study multiple time scales while accounting for spatial clustering of the data (patients within ICUs) and for death or discharge as competing events for MRSA infection. Results: Both time scales, time in ICU and calendar time, are highly associated with the MRSA hazard rate and cumulative risk. When using only one basic time scale, the interpretation and magnitude of several patient-individual risk factors differed. Risk factors concerning the severity of illness were more pronounced when using only calendar time. These differences disappeared when using both time scales simultaneously. Conclusions: The time-dependent dynamics of infections is complex and should be studied with models allowing for multiple time scales. For patient-individual risk factors we recommend stratified Cox regression models for competing events with ICU time as the basic time scale and calendar time as a covariate.
The inclusion of calendar time and stratification by ICU make it possible to account indirectly for ICU-level effects such as local outbreaks or prevention interventions. [ABSTRACT FROM AUTHOR]
- Published
- 2016
- Full Text
- View/download PDF
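The two time scales recommended in the abstract above (time in ICU as the basic Cox time scale, calendar time as a covariate) can be illustrated with a minimal sketch; the dates, the study origin and the helper name are hypothetical, not taken from the paper:

```python
from datetime import date

# Hypothetical admission record; illustrates deriving the two time scales
# the paper combines: time in ICU (basic Cox time scale) and calendar time
# (covariate measured from an assumed study origin).
STUDY_START = date(2006, 1, 1)  # assumed origin for calendar time

def time_scales(admit, event):
    days_in_icu = (event - admit).days          # time at risk in the ICU
    calendar_days = (admit - STUDY_START).days  # calendar-time covariate
    return days_in_icu, calendar_days

print(time_scales(date(2006, 3, 10), date(2006, 3, 24)))  # (14, 68)
```

In a stratified Cox fit, `days_in_icu` would serve as the survival time and `calendar_days` would enter as a covariate, with ICU as the stratum.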
21. Day- and night-time aerosol optical depth implementation in CÆLIS.
- Author
-
González, Ramiro, Toledano, Carlos, Román, Roberto, Fuertes, David, Berjón, Alberto, Mateos, David, Guirado-Fuentes, Carmen, Velasco-Merino, Cristian, Carlos Antuña-Sanchez, Juan, Calle, Abel, E. Cachorro, Victoria, and M. de Frutos, Ángel
- Subjects
- *
OPTICAL depth (Astrophysics) , *AEROSOLS , *OBSERVATIONS of the Moon , *ALGORITHMS , *QUALITY control - Abstract
The University of Valladolid (UVa, Spain) has managed a calibration center of the AErosol RObotic NETwork (AERONET) since 2006. The CÆLIS software tool, developed by UVa, was created to manage the data generated by the AERONET photometers for calibration, quality control and data processing purposes. This paper exploits the potential of this tool to obtain products such as the aerosol optical depth (AOD) and Angstrom exponent (AE), which are of high interest for atmospheric and climate studies, as well as to enhance the quality control of the instruments and data managed by CÆLIS. The AOD and cloud-screening algorithms implemented in CÆLIS, both based on AERONET version 3, are described in detail. The obtained products are compared with the AERONET database. In general, the differences in daytime AOD between CÆLIS and AERONET are far below the expected uncertainty of the instrument, with mean differences ranging from −1.3×10⁻⁴ at 870 nm to 6.2×10⁻⁴ at 380 nm. The standard deviations of the differences range from 2.8×10⁻⁴ at 675 nm to 8.1×10⁻⁴ at 340 nm. The AOD and AE at night-time calculated by CÆLIS from Moon observations are also presented, showing good continuity between day and night-time for different locations, aerosol loads and moon phase angles. Regarding cloud screening, around 99.9 % of the observations classified as cloud-free by CÆLIS are also assumed cloud-free by AERONET; this percentage is similar for the cases considered cloud-contaminated by both databases. The obtained results point out the capability of CÆLIS as a processing system. The AOD algorithm provides the opportunity to use this tool with other instrument types and to retrieve other aerosol products in the future. [ABSTRACT FROM AUTHOR]
- Published
- 2020
- Full Text
- View/download PDF
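The Angstrom exponent (AE) mentioned in this record is conventionally derived from AOD at two wavelengths via AE = −ln(τ₁/τ₂)/ln(λ₁/λ₂); a minimal sketch of that standard relation (the example values are illustrative, not CÆLIS output):

```python
import math

def angstrom_exponent(aod1, lam1, aod2, lam2):
    """Angstrom exponent from AOD measured at two wavelengths (nm)."""
    return -math.log(aod1 / aod2) / math.log(lam1 / lam2)

# e.g. AOD 0.20 at 440 nm and 0.10 at 870 nm (made-up values)
ae = angstrom_exponent(0.20, 440.0, 0.10, 870.0)  # ≈ 1.02
```

A spectrally flat AOD gives AE = 0; values near 1 or above indicate fine-mode-dominated aerosol.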
22. A function for quality evaluation of retinal vessel segmentations.
- Author
-
Gegúndez-Arias ME, Aquino A, Bravo JM, and Marín D
- Subjects
- Humans, Image Enhancement methods, Image Enhancement standards, Image Interpretation, Computer-Assisted standards, Imaging, Three-Dimensional standards, Observer Variation, Pattern Recognition, Automated standards, Quality Assurance, Health Care methods, Reproducibility of Results, Retinoscopy standards, Sensitivity and Specificity, Spain, Algorithms, Image Interpretation, Computer-Assisted methods, Imaging, Three-Dimensional methods, Pattern Recognition, Automated methods, Retinal Vessels anatomy & histology, Retinoscopy methods
- Abstract
Retinal blood vessel assessment plays an important role in the diagnosis of ophthalmic pathologies. The use of digital images for this purpose enables the application of a computerized approach and has fostered the development of multiple methods for automated vascular tree segmentation. Metrics based on contingency tables for binary classification have been widely used for evaluating the performance of these algorithms. Metrics from this family are based on the measurement of a success or failure rate in the detected pixels, obtained by means of pixel-to-pixel comparison between the automated segmentation and a manually-labeled reference image. Therefore, vessel pixels are not considered as part of a vascular structure with specific features. This paper contributes a function for the evaluation of global quality in retinal vessel segmentations. This function is based on the characterization of vascular structures as connected segments with measurable area and length. Thus, its design is meant to be sensitive to anatomical vascularity features. Comparison between the proposed function and other general quality evaluation functions shows that the proposal agrees closely with human quality perception. Therefore, it can be used to enhance quality evaluation in retinal vessel segmentations, supplementing the existing functions. On the other hand, from a general point of view, the applied concept of measuring descriptive properties may be used to design specialized functions aimed at segmentation quality evaluation in other complex structures.
- Published
- 2012
- Full Text
- View/download PDF
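The contingency-table metrics that this abstract contrasts with its structural approach can be sketched in a few lines (toy pixel vectors, not retinal data):

```python
def contingency_metrics(pred, truth):
    """Pixel-wise contingency-table metrics for a binary segmentation.
    pred and truth are flat sequences of 0/1 pixel labels."""
    tp = sum(p == 1 and t == 1 for p, t in zip(pred, truth))
    tn = sum(p == 0 and t == 0 for p, t in zip(pred, truth))
    fp = sum(p == 1 and t == 0 for p, t in zip(pred, truth))
    fn = sum(p == 0 and t == 1 for p, t in zip(pred, truth))
    return {"sensitivity": tp / (tp + fn), "specificity": tn / (tn + fp)}

m = contingency_metrics([1, 1, 0, 0, 1, 0], [1, 0, 0, 0, 1, 1])
```

As the abstract notes, such pixel-level rates ignore whether the detected pixels form connected vascular segments, which is precisely the gap the proposed function addresses.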
23. Wind speed reconstruction from synoptic pressure patterns using an evolutionary algorithm
- Author
-
Carro-Calvo, L., Salcedo-Sanz, S., Prieto, L., Kirchner-Bossi, N., Portilla-Figueras, A., and Jiménez-Fernández, S.
- Subjects
- *
WIND speed , *ALGORITHMS , *PRESSURE , *DATA analysis , *MATHEMATICAL models , *TOWERS , *PROBLEM solving - Abstract
Abstract: This paper presents an evolutionary algorithm for wind speed reconstruction from synoptic pressure patterns. The algorithm operates in a search space formed by grids of pressure measures and must classify the different situations into classes, in such a way that a measure of wind speed at a given point is minimized among patterns assigned to the same class. Each class is then assigned a mean wind speed and direction, so that wind speed reconstruction is possible for a new grid of synoptic pressures. In this paper we present the problem model and the specific description of the evolutionary algorithm proposed to solve the problem. We also show the good performance of the proposed method in the reconstruction of the average wind speed at six wind towers in Spain. The proposed method is applicable to wind speed reconstruction or to the reconstruction of missing data in wind series, especially when there is no other variable or related measure available. [Copyright Elsevier]
- Published
- 2012
- Full Text
- View/download PDF
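The reconstruction step described in this abstract — assign a new pressure grid to a class, then return that class's mean wind speed — can be sketched with a nearest-centroid rule. This is an assumed simplification (the paper's classes come from an evolutionary algorithm), and all values below are toy data:

```python
# Minimal sketch of the reconstruction stage: each class of synoptic
# pressure patterns has a representative centroid and a mean wind speed;
# a new grid is assigned to the nearest centroid and inherits its mean.
def reconstruct_wind(grid, centroids, class_mean_wind):
    def dist2(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    best = min(range(len(centroids)), key=lambda k: dist2(grid, centroids[k]))
    return class_mean_wind[best]

centroids = [(1010.0, 1015.0), (995.0, 990.0)]  # toy 2-point pressure grids (hPa)
class_mean_wind = [3.2, 9.8]                    # mean wind speed per class (m/s)
print(reconstruct_wind((996.0, 992.0), centroids, class_mean_wind))  # 9.8
```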
24. Hourly-resolution analysis of electricity decarbonization in Spain (2017–2030).
- Author
-
Victoria, Marta and Gallego-Castillo, Cristobal
- Subjects
- *
ELECTRIC power production , *CARBONIZATION , *ELECTRIC power consumption , *RENEWABLE energy sources , *CARBON dioxide mitigation , *ALGORITHMS - Abstract
Highlights: • Hourly-resolved model to investigate highly-renewable electricity generation in Spain. • Correlation analysis of time series to create 900 combinations used in simulations. • Transition paths evaluated based on security of supply, CO2 emissions, and renewable share. • Short-term phase-out of nuclear and coal power plants proven to be feasible. Abstract: Two alternative paths to achieve highly-renewable electricity generation in peninsular Spain are investigated in this paper. Every transition path comprises a description of the installed and decommissioned generation and storage capacities, from 2017 to 2030, as well as a hypothesis on the evolution of the electricity demand. The electricity mix for every hour within the transition path is determined through a dispatch algorithm that prioritizes electricity from renewable energy sources. The simulation is run for 900 different combinations of time series representing the hourly capacity factors of the different technologies, as well as the electricity demand. This robust approach allows the evaluation of the transition paths based on the statistical distribution of several defined assessment criteria, such as security of supply, CO2 emissions or renewable share in electricity generation. The feasibility of a Spanish power system with high renewable penetration is investigated not only in a future reference year but throughout the transition path. In particular, a progressive and simultaneous phase-out of nuclear and coal power plants in the short term is proven to be feasible. Furthermore, the sensitivity of the results is analyzed, including scenarios with a delayed nuclear phase-out, lower hydroelectricity generation due to more frequent and severe droughts caused by climate change, and a higher annual increment for the electricity demand. [ABSTRACT FROM AUTHOR]
- Published
- 2019
- Full Text
- View/download PDF
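The dispatch algorithm that prioritizes renewable sources can be illustrated by a greedy merit-order rule for a single hour; the priority order, source names and numbers below are assumptions for illustration, not the paper's model:

```python
def dispatch(demand, available):
    """Greedy one-hour dispatch: sources are served in a fixed priority
    order, renewables first (a merit-order sketch, illustrative only).
    demand and available capacities are in, e.g., GW."""
    order = ["wind", "solar", "hydro", "nuclear", "coal", "gas"]
    mix, remaining = {}, demand
    for src in order:
        used = min(available.get(src, 0.0), remaining)
        if used > 0:
            mix[src] = used
        remaining -= used
    mix["unserved"] = remaining  # security-of-supply check
    return mix

mix = dispatch(30.0, {"wind": 12.0, "solar": 6.0, "nuclear": 7.0, "gas": 20.0})
```

Running this hour by hour over capacity-factor time series, and flagging hours with `unserved > 0`, mirrors the kind of security-of-supply assessment the abstract describes.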
25. Data-driven fuzzy habitat suitability models for brown trout in Spanish Mediterranean rivers
- Author
-
Mouton, A.M., Alcaraz-Hernández, J.D., De Baets, B., Goethals, P.L.M., and Martínez-Capel, F.
- Subjects
- *
FUZZY systems , *MATHEMATICAL models , *HABITATS , *BROWN trout , *DISTRIBUTION (Probability theory) , *ALGORITHMS , *STATISTICAL decision making , *RIVERS - Abstract
Abstract: In recent years, fuzzy models have been acknowledged as a suitable approach for species distribution modelling due to their transparency and their ability to incorporate the ecological gradient theory. Specifically, the overlapping class boundaries of a fuzzy model are similar to the transitions between different environmental conditions. However, the need for ecological expert knowledge is an important constraint when applying fuzzy species distribution models. Moreover, the consistency of the ecological preferences of some fish species across different rivers has been widely contested. Recent research has shown that data-driven fuzzy models may solve this ‘knowledge acquisition bottleneck’ and this paper is a further contribution. The aim was to analyse the brown trout (Salmo trutta fario L.) habitat preferences based on a data-driven fuzzy modelling technique and to compare the resulting fuzzy models with a commonly applied modelling technique, Random Forests. A heuristic nearest ascent hill-climbing algorithm for fuzzy rule optimisation and Random Forests were applied to analyse the ecological preferences of brown trout in 93 mesohabitats. No significant differences in model performance were observed between the optimal fuzzy model and the Random Forests model and both approaches selected river width, the cover index and flow velocity as the most important variables describing brown trout habitat suitability. Further, the fuzzy model combined ecological relevance with reasonable interpretability, whereas the transparency of the Random Forests model was limited. This paper shows that fuzzy models may be a valid approach for species distribution modelling and that their performance is comparable to that of state-of-the-art modelling techniques like Random Forests. Fuzzy models could therefore be a valuable decision support tool for river managers and enhance communication between stakeholders. [Copyright Elsevier]
- Published
- 2011
- Full Text
- View/download PDF
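The overlapping class boundaries that this abstract credits with mimicking ecological gradients can be seen in a trapezoidal membership sketch; the variable and all breakpoints are illustrative choices, not the paper's calibrated model:

```python
def trapezoid(x, a, b, c, d):
    """Trapezoidal fuzzy membership: 0 below a, rises on a..b,
    1 on b..c, falls on c..d, 0 above d."""
    if x <= a or x >= d:
        return 0.0
    if b <= x <= c:
        return 1.0
    if x < b:
        return (x - a) / (b - a)
    return (d - x) / (d - c)

# Overlapping classes for, e.g., flow velocity (m/s); boundaries are made up.
low  = lambda v: trapezoid(v, -0.1, 0.0, 0.3, 0.6)
high = lambda v: trapezoid(v, 0.3, 0.6, 2.0, 2.5)
print(low(0.45), high(0.45))  # a velocity of 0.45 m/s is partly "low", partly "high"
```

A value in the overlap region belongs to both classes with partial degree, which is exactly the smooth environmental transition the abstract describes.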
26. Survey of Visual and Force/Tactile Control of Robots for Physical Interaction in Spain.
- Author
-
Garcia, Gabriel J., Corrales, Juan A., Pomares, Jorge, and Torres, Fernando
- Subjects
ROBOTICS ,REMOTE sensing ,DETECTORS ,TACTILE sensors ,FORCE & energy ,TORQUE ,ALGORITHMS ,ARCHITECTURE - Abstract
Sensors provide robotic systems with the information required to perceive the changes that happen in unstructured environments and modify their actions accordingly. The robotic controllers which process and analyze this sensory information are usually based on three types of sensors (visual, force/torque and tactile), which identify the most widespread robotic control strategies: visual servoing control, force control and tactile control. This paper presents a detailed review of the sensor architectures, algorithmic techniques and applications developed by Spanish researchers to implement these mono-sensor controllers and multi-sensor controllers which combine several sensors. [ABSTRACT FROM AUTHOR]
- Published
- 2009
- Full Text
- View/download PDF
27. Photogrammetric Methodology for the Production of Geomorphologic Maps: Application to the Veleta Rock Glacier (Sierra Nevada, Granada, Spain).
- Author
-
de Matías, Javier, de Sanjosé, José Juan, López-Nicolás, Gonzalo, Sagüés, Carlos, and Guerrero, José Jesús
- Subjects
PHOTOGRAMMETRY ,GEOMORPHOLOGICAL mapping ,CARTOGRAPHY ,GLACIERS ,GEODETIC observations ,ALGORITHMS - Abstract
In this paper we present a stereo feature-based method using SIFT (Scale-invariant feature transform) descriptors. We use automatic feature extractors, matching algorithms between images and techniques of robust estimation to produce a DTM (Digital Terrain Model) using convergent shots of a rock glacier. The geomorphologic structure observed in this study is the Veleta rock glacier (Sierra Nevada, Granada, Spain). This rock glacier is of high scientific interest because it is the southernmost active rock glacier in Europe and it has been analyzed every year since 2001. The research on the Veleta rock glacier is devoted to the study of its displacement and cartography through geodetic and photogrammetric techniques. [ABSTRACT FROM AUTHOR]
- Published
- 2009
- Full Text
- View/download PDF
28. Humanoid robot RH-1 for collaborative tasks: a control architecture for human-robot cooperation.
- Author
-
Monje, Concepción A., Pierro, Paolo, and Balaguer, Carlos
- Subjects
ROBOTICS ,ROBOT kinematics ,ALGORITHMS ,VIRTUAL reality ,AUTONOMOUS robots ,UNIVERSITIES & colleges - Abstract
The full-scale humanoid robot RH-1 has been developed entirely at the University Carlos III of Madrid. In this paper we present an advanced control system for this robot so that it can perform tasks in cooperation with humans. The collaborative tasks are carried out in a semi-autonomous way and are intended to be put into operation in real working environments where humans and robots share the same space. Before presenting the control strategy, the kinematic model and a simplified dynamic model of the robot are presented. All the models and algorithms are verified by several simulations and experimental results. [ABSTRACT FROM AUTHOR]
- Published
- 2008
- Full Text
- View/download PDF
29. Supply Estimation Using Coevolutionary Genetic Algorithms in the Spanish Electrical Market.
- Author
-
De La Cal MarÍn, Enrique A. and Sánchez Ramos, Luciano
- Subjects
ALGORITHMS ,CONFIGURATIONS (Geometry) ,ELECTRICITY ,ELECTRIC utilities ,ELECTRIC generators ,ECONOMIC competition ,SUPPLY & demand ,ECONOMIC models ,ECONOMIC statistics - Abstract
The price of electrical energy in Spain has not been regulated by the government since 1998; instead, it has been determined by the supply from the generators in a competitive market, the so-called "electrical pool". A genetic method for analyzing data from this new market is presented in this paper. The eventual objective is to determine the individual supply curves of the competitive agents. Adopting the point of view of game theory, different genetic algorithm configurations using coevolutionary and non-coevolutionary strategies combined with scalar and multi-objective fitness are compared. The results obtained are a first step toward inducing the optimal individual strategies in the Spanish electrical market from data, in terms of perfect oligopolistic behavior. [ABSTRACT FROM AUTHOR]
- Published
- 2004
- Full Text
- View/download PDF
30. Exploiting and Expanding the Hyper-Flexibilization of Employment: The Uber Model in Spain [Aprovechando y expandiendo la hiperflexibilización del empleo. El modelo Uber en España].
- Author
-
RIESGO GÓMEZ, VÍCTOR
- Subjects
EMPLOYMENT ,LEGAL judgments ,WORK environment ,LABOR market ,DATA analysis ,ALGORITHMS ,FIELD research ,FREEDOM of the press - Abstract
Copyright of EMPIRIA: Revista de Metodología de Ciencias Sociales is the property of Editorial UNED and its content may not be copied or emailed to multiple sites or posted to a listserv without the copyright holder's express written permission. However, users may print, download, or email articles for individual use. This abstract may be abridged. No warranty is given about the accuracy of the copy. Users should refer to the original published version of the material for the full abstract. (Copyright applies to all Abstracts.)
- Published
- 2023
- Full Text
- View/download PDF
31. An Automatic Algorithm to Date the Reference Cycle of the Spanish Economy.
- Author
-
Camacho, Maximo, Gadea, María Dolores, and Gómez-Loscos, Ana
- Subjects
GAUSSIAN distribution ,BUSINESS cycles ,ECONOMIC indicators ,MARKOV processes ,ALGORITHMS ,RECESSIONS - Abstract
This paper provides an accurate chronology of the Spanish reference business cycle by adapting a multiple change-point model. In that approach, each combination of peaks and troughs dated in a set of economic indicators is assumed to be a realization of a mixture of bivariate Gaussian distributions, whose number of components is estimated from the data. The means of each of these components refer to the dates of the reference turning points. The transitions across the components of the mixture are governed by a Markov chain restricted to left-to-right transition dynamics. In the empirical application, seven recessions are identified in the period from February 1970 to February 2020, in high concordance with the timing of the turning point dates established by the Spanish Business Cycle Dating Committee (SBCDC). [ABSTRACT FROM AUTHOR]
- Published
- 2021
- Full Text
- View/download PDF
32. Statistical Analysis and Machine Learning Prediction of Fog-Caused Low-Visibility Events at A-8 Motor-Road in Spain.
- Author
-
Cornejo-Bueno, Sara, Casillas-Pérez, David, Cornejo-Bueno, Laura, Chidean, Mihaela I., Caamaño, Antonio J., Cerro-Prada, Elena, Casanova-Mateo, Carlos, and Salcedo-Sanz, Sancho
- Subjects
STATISTICS ,MACHINE learning ,PARETO distribution ,MAXIMUM likelihood statistics ,ALGORITHMS ,PEARSON correlation (Statistics) - Abstract
This work presents a full statistical analysis and accurate prediction of low-visibility events due to fog at the A-8 motor-road in Mondoñedo (Galicia, Spain). The analysis covers two years of study, considering visibility time series and exogenous variables collected in the zone most affected by extreme low-visibility events. This paper thus has a two-fold objective: first, we carry out a statistical analysis to estimate the probability distributions that best fit the fog event duration, using the Maximum Likelihood method and an alternative method known as the L-moments method. This statistical study allows association of the low-visibility depth with the event duration, showing a clear relationship which can be modeled with distributions for extremes such as the Generalized Extreme Value and Generalized Pareto distributions. Second, we apply a neural network approach, trained by means of the ELM (Extreme Learning Machine) algorithm, to predict the occurrence of low-visibility events due to fog from atmospheric predictive variables. This study provides a full characterization of fog events at this motor-road, where orographic fog is predominant, causing significant traffic problems throughout the year. We also show that the ELM approach is able to obtain highly accurate low-visibility event predictions, with a Pearson correlation coefficient of 0.8, within a half-hour time horizon, enough to initialize protocols aimed at reducing the impact of these extreme events on the traffic of the A-8 motor-road. [ABSTRACT FROM AUTHOR]
- Published
- 2021
- Full Text
- View/download PDF
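The L-moments method named in this abstract fits distributions from linear combinations of order statistics; the first two sample L-moments (L-location and L-scale) can be sketched as follows (toy sample, not the visibility data):

```python
def l_moments(sample):
    """First two sample L-moments (l1 = L-location, l2 = L-scale),
    computed from probability-weighted moments of the sorted sample."""
    x = sorted(sample)
    n = len(x)
    b0 = sum(x) / n
    # b1 = (1/n) * sum over ranks i=1..n of ((i-1)/(n-1)) * x_(i)
    b1 = sum(i * xi for i, xi in enumerate(x)) / (n * (n - 1))
    return b0, 2 * b1 - b0

l1, l2 = l_moments([1.0, 2.0, 3.0, 4.0])  # l1 = 2.5, l2 = 5/6
```

Matching such sample L-moments to their theoretical counterparts gives parameter estimates that are more robust to outliers than ordinary moments, which is why the method is attractive for extreme-event durations.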
33. A Method of Estimating Time-to-Recovery for a Disease Caused by a Contagious Pathogen Such as SARS-CoV-2 Using a Time Series of Aggregated Case Reports.
- Author
-
Koutsouris, Dimitrios-Dionysios, Pitoglou, Stavros, Anastasiou, Athanasios, and Koumpouros, Yiannis
- Subjects
DISEASE progression ,COMPUTER software ,COVID-19 ,CONFIDENCE intervals ,TIME ,CONVALESCENCE ,WORLD health ,EPIDEMICS ,TIME series analysis ,DESCRIPTIVE statistics ,SENSITIVITY & specificity (Statistics) ,PREDICTION models ,COVID-19 pandemic ,ALGORITHMS - Abstract
During the outbreak of a disease caused by a pathogen with unknown characteristics, the uncertainty of its progression parameters can be reduced by devising methods that, based on rational assumptions, exploit available information to provide actionable insights. In this study, performed roughly six weeks into the outbreak of COVID-19 (caused by SARS-CoV-2), one of the most important disease parameters, the average time-to-recovery, was calculated using data publicly available on the internet (daily reported cases of confirmed infections, deaths, and recoveries) and fed into an algorithm that matches confirmed cases with deaths and recoveries. Unmatched cases were adjusted based on the matched-cases calculation. The mean time-to-recovery, calculated from all globally reported cases, was found to be 18.01 days (SD 3.31 days) for the matched cases and 18.29 days (SD 2.73 days) when the adjusted unmatched cases were taken into consideration as well. The proposed method used limited data and provided experimental results in the same region as clinical studies published several months later. This indicates that the proposed method, combined with expert knowledge and informed calculated assumptions, could provide a meaningful calculated average time-to-recovery figure, which can be used as an evidence-based estimation to support containment and mitigation policy decisions, even at the very early stages of an outbreak. [ABSTRACT FROM AUTHOR]
- Published
- 2023
- Full Text
- View/download PDF
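The matching of confirmed cases with later recoveries can be sketched as a FIFO queue over the daily time series. This is an assumed simplification of the paper's matching algorithm (it ignores deaths and the unmatched-case adjustment), with toy counts:

```python
from collections import deque

def mean_time_to_recovery(daily_cases, daily_recoveries):
    """FIFO matching of confirmed cases to later recoveries (an assumed
    simplification of the paper's algorithm). daily_cases[d] and
    daily_recoveries[d] are counts reported on day d."""
    queue, durations = deque(), []
    for day, (c, r) in enumerate(zip(daily_cases, daily_recoveries)):
        queue.extend([day] * c)          # enqueue each new confirmed case
        for _ in range(r):               # match recoveries to oldest cases
            if queue:
                durations.append(day - queue.popleft())
    return sum(durations) / len(durations) if durations else None

print(mean_time_to_recovery([2, 1, 0, 0], [0, 0, 2, 1]))  # 2.0
```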
34. A study of differential microRNA expression profile in migraine: the microMIG exploratory study.
- Author
-
Gallardo, V. J., Gómez-Galván, J. B., Asskour, L., Torres-Ferrús, M., Alpuente, A., Caronna, E., and Pozo-Rosich, P.
- Subjects
RESEARCH ,MONONUCLEAR leukocytes ,MIGRAINE ,RESEARCH methodology ,MICRORNA ,INTERVIEWING ,CASE-control method ,RANDOM forest algorithms ,GENETIC markers ,GENE expression profiling ,QUESTIONNAIRES ,FACTOR analysis ,RESEARCH funding ,CLUSTER analysis (Statistics) ,HEADACHE ,WOMEN'S health ,LONGITUDINAL method ,ALGORITHMS ,EPIGENOMICS - Abstract
Background: Several studies have described potential microRNA (miRNA) biomarkers associated with migraine, but studies are scarcely reproducible, primarily due to the heterogeneous variability of participants. Increasing evidence shows that disease-related intrinsic factors, together with lifestyle (environmental factors), influence epigenetic mechanisms and, in turn, diseases. Hence, the main objective of this exploratory study was to find differentially expressed miRNAs (DE miRNA) in peripheral blood mononuclear cells (PBMC) of patients with migraine compared to healthy controls in a well-controlled homogeneous cohort of non-menopausal women. Methods: Patients diagnosed with migraine according to the International Classification of Headache Disorders (ICHD-3) and healthy controls without familial history of headache disorders were recruited. All participants completed a very thorough questionnaire and structured interview in order to control for environmental factors. RNA was extracted from PBMC and a microarray system (GeneChip miRNA 4.1 Array chip, Affymetrix) was used to determine the miRNA profiles between study groups. Principal components analysis and hierarchical clustering analysis were performed to study the sample distribution, and random forest (RF) algorithms were computed for the classification task. To evaluate the stability of the results and the prediction error rate, a bootstrap (.632+ rule) was run through the whole procedure. Finally, a functional enrichment analysis of selected targets was computed through protein–protein interaction networks. Results: After RF classification, three DE miRNA distinguished the study groups in a very homogeneous female cohort, controlled for factors such as demographics (age and BMI), life habits (physical activity, caffeine and alcohol consumption), comorbidities and clinical features associated with the disease: miR-342-3p, miR-532-3p and miR-758-5p.
Sixty-eight target genes were predicted, linked mainly to enriched ion channels and signaling pathways, neurotransmitter and hormone homeostasis, infectious diseases and circadian entrainment. Conclusions: A novel 3-miRNA signature (miR-342-3p, miR-532-3p and miR-758-5p) has been found differentially expressed between controls and patients with migraine. Enrichment analysis showed that these pathways are closely associated with known migraine pathophysiology, which could lead to the first reliable epigenetic biomarker set. Further studies should be performed to validate these findings in a larger and more heterogeneous sample. [ABSTRACT FROM AUTHOR]
- Published
- 2023
- Full Text
- View/download PDF
35. A Novel Information Theoretical Criterion for Climate Network Construction.
- Author
-
Cornejo-Bueno, Sara, Chidean, Mihaela I., Caamaño, Antonio J., Prieto-Godino, Luis, and Salcedo-Sanz, Sancho
- Subjects
WIND speed ,CLIMATOLOGY ,CONSTRUCTION ,ALGORITHMS ,FORECASTING ,WIND power plants - Abstract
This paper presents a novel methodology for Climate Network (CN) construction based on the Kullback-Leibler divergence (KLD) among Membership Probability (MP) distributions obtained from the Second Order Data-Coupled Clustering (SODCC) algorithm. The proposed method is able to obtain CNs with emergent behaviour adapted to the variables being analyzed and with a low number of spurious or missing links. We evaluate the proposed method in a problem of CN construction to assess differences in wind speed prediction at different wind farms in Spain. The considered problem presents strong local and mesoscale relationships but weak synoptic-scale relationships, which have a direct influence on the CN obtained. We carry out a comparison of the proposed approach with a classical correlation-based CN construction method. We show that the proposed approach based on the SODCC algorithm and the KLD constructs CNs with an emergent behaviour consistent with the physics underlying the wind speed prediction data, unlike the correlation-based method, which produces spurious and missing links. Furthermore, it is shown that the climate network construction method facilitates the evaluation of symmetry properties in the resulting complex networks. [ABSTRACT FROM AUTHOR]
- Published
- 2020
- Full Text
- View/download PDF
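The Kullback-Leibler divergence used here to compare Membership Probability distributions has a simple discrete form; a sketch (the symmetrisation and the example distributions are illustrative choices, not necessarily the paper's exact construction):

```python
import math

def kld(p, q):
    """Discrete Kullback-Leibler divergence D(p || q); p and q sum to 1."""
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

def sym_kld(p, q):
    """Symmetrised KLD, a common choice when the divergence is used as a
    (distance-like) link weight between network nodes."""
    return kld(p, q) + kld(q, p)

d = sym_kld([0.7, 0.3], [0.5, 0.5])  # ≈ 0.17
```

In a CN construction scheme, two sites would be linked when the divergence between their MP distributions falls below a threshold.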
36. An Innovative JavaScript-Based Framework for Teaching Backtracking Algorithms Interactively.
- Author
-
Nasralla, Moustafa M.
- Subjects
JAVASCRIPT programming language ,ALGORITHMS ,CONCEPT learning ,ENGINEERING education ,EIGENFUNCTIONS ,DIGITAL learning - Abstract
Algorithm fundamentals are useful to learn at different levels of engineering education. One of the most difficult concepts to teach and understand is backtracking algorithms with proper bounding functions. This article proposes a framework to implement interactive online tools showing examples of backtracking algorithms in which students can graphically observe the execution step by step. This approach is illustrated with the n-queens problem with students from Prince Sultan University, Saudi Arabia, and Complutense University of Madrid, Spain. The results show a 6.67% learning increase on a backtracking exercise in the experimental group over the control group, in which the algorithms were automatically validated with the DOMjudge software (an automated system used to run programming contests). The proposed framework was evaluated as easy to use, with a score of 74.5% on the validated System Usability Scale (SUS); easy to learn, with a score of 6.22 out of 7 on the validated Usefulness, Satisfaction, and Ease-of-Use (USE) scale; and with a general satisfaction of 5.97 out of 7 on the validated USE scale. [ABSTRACT FROM AUTHOR]
- Published
- 2022
- Full Text
- View/download PDF
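The n-queens problem used in this article's experiment is the canonical backtracking example; a minimal counting solver with the kind of bounding function the tool visualises (a Python sketch, not the framework's JavaScript code):

```python
def n_queens(n):
    """Count solutions to the n-queens problem by backtracking with
    column/diagonal bounding to prune infeasible branches early."""
    solutions = 0
    cols, diag1, diag2 = set(), set(), set()

    def place(row):
        nonlocal solutions
        if row == n:
            solutions += 1
            return
        for col in range(n):
            if col in cols or row + col in diag1 or row - col in diag2:
                continue  # bounding function prunes this branch
            cols.add(col); diag1.add(row + col); diag2.add(row - col)
            place(row + 1)
            cols.discard(col); diag1.discard(row + col); diag2.discard(row - col)

    place(0)
    return solutions

print(n_queens(6))  # 4
```

Without the bounding sets, the search would enumerate all n^n placements; with them, each partial placement is rejected as soon as a queen conflicts, which is the pruning behaviour the interactive tool lets students watch step by step.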
37. Multi-objective optimization minimizing cost and life cycle emissions of stand-alone PV–wind–diesel systems with batteries storage
- Author
-
Dufo-López, Rodolfo, Bernal-Agustín, José L., Yusta-Loyo, José M., Domínguez-Navarro, José A., Ramírez-Rosado, Ignacio J., Lujano, Juan, and Aso, Ismael
- Subjects
- *
ENERGY storage , *ALGORITHMS , *CARBON dioxide , *PHOTOVOLTAIC power generation , *WIND turbines , *DIESEL fuels , *MATHEMATICAL optimization - Abstract
Abstract: This paper describes an application of the Strength Pareto Evolutionary Algorithm to the multi-objective optimization of a stand-alone PV–wind–diesel system with battery storage. The objectives to be minimized are the levelized cost of energy (LCOE) and the equivalent carbon dioxide (CO2) life cycle emissions (LCE). Each solution on the Pareto front is a possible configuration of the PV–wind–diesel-battery system which can supply the load, but each one has a different LCOE and LCE. Some solutions have low LCOE values but high LCE values, and vice versa. Results show that the photovoltaic (PV) generator is the most important source of electrical energy for stand-alone systems in Spain and Southern Europe, not only environmentally but also economically. In some cases, PV is almost the only source of energy, as some solutions on the best Pareto front do not include wind turbines, and diesel generators run only a few hours during the year. [Copyright Elsevier]
- Published
- 2011
- Full Text
- View/download PDF
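The Pareto front of (LCOE, LCE) pairs described in this abstract is the set of non-dominated solutions under minimisation of both objectives; a sketch with made-up cost/emission pairs:

```python
def pareto_front(points):
    """Non-dominated set for joint minimisation of (LCOE, LCE) pairs."""
    def dominates(a, b):
        # a dominates b: no worse in every objective, strictly better in one
        return all(x <= y for x, y in zip(a, b)) and a != b
    return [p for p in points if not any(dominates(q, p) for q in points)]

# Toy (LCOE in EUR/kWh, LCE in kg CO2) candidates; values are illustrative.
front = pareto_front([(0.20, 900), (0.25, 600), (0.30, 650), (0.35, 400)])
```

Here (0.30, 650) is dominated by (0.25, 600), so it drops out; the survivors are exactly the cost/emissions trade-offs a designer would choose among.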
38. Validation and reconstruction of flow meter data in the Barcelona water distribution network
- Author
-
Quevedo, J., Puig, V., Cembrano, G., Blanch, J., Aguilar, J., Saporta, D., Benito, G., Hedo, M., and Molina, A.
- Subjects
- *
WATER distribution , *FLOW meters , *FUZZY logic , *STATISTICAL correlation , *ALGORITHMS , *TIME series analysis - Abstract
Abstract: This paper presents a signal analysis methodology to validate (detect) and reconstruct the missing and false data of a large set of flow meters in the telecontrol system of a water distribution network. The proposed methodology is based on forecasting models at two time scales: a daily model based on an ARIMA time series, and a 10-min model based on distributing the daily flow using a 10-min demand pattern. The demand patterns have been determined using two methods: correlation analysis and an unsupervised fuzzy logic classification known as the LAMDA algorithm. Finally, the proposed methodology has been applied to the Barcelona water distribution network, providing very good results. [Copyright Elsevier]
- Published
- 2010
- Full Text
- View/download PDF
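The second stage of the methodology — spreading the forecast daily flow over 10-min intervals with a demand pattern — reduces to a weighted split; a sketch with a toy 4-interval pattern (a real day has 144 ten-minute intervals):

```python
def distribute_daily_flow(daily_total, pattern):
    """Spread a forecast daily flow total over sub-daily intervals using
    a demand pattern (one weight per interval; normalised internally)."""
    s = sum(pattern)
    return [daily_total * w / s for w in pattern]

# Toy 4-interval pattern standing in for the 144 intervals of a real day.
flows = distribute_daily_flow(1000.0, [1, 3, 4, 2])  # [100.0, 300.0, 400.0, 200.0]
```

Reconstructed 10-min values from this split can then replace readings that the validation step has flagged as missing or false.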
39. Application of a new hybrid neuro-evolutionary system for day-ahead price forecasting of electricity markets.
- Author
-
Amjady, Nima and Keynia, Farshid
- Subjects
ELECTRIC rates ,FORECASTING ,EVOLUTIONARY computation ,ALGORITHMS ,ARTIFICIAL neural networks ,DATA flow computing ,ELECTRIC industries - Abstract
Abstract: In this paper, a new forecast strategy is proposed for day-ahead prediction of electricity prices, which are valuable for both producers and consumers in the new competitive electric power markets. However, electricity price has a nonlinear, volatile and time-dependent behavior with many outliers. Our forecast strategy is composed of a preprocessor and a Hybrid Neuro-Evolutionary System (HNES). The preprocessor selects the input features of the HNES according to the MRMR (Maximum Relevance Minimum Redundancy) principle. The HNES is composed of three Neural Networks (NN) and Evolutionary Algorithms (EA) in a cascaded structure with a new data flow among its building blocks. The effectiveness of the whole proposed method is demonstrated by means of real data from the PJM and Spanish electricity markets. The proposed price forecast strategy is also compared with some of the most recent techniques in the area. [Copyright Elsevier]
- Published
- 2010
- Full Text
- View/download PDF
40. Strategic noise map of a major road carried out with two environmental prediction software packages.
- Author
-
Arana, M., Martin, R. San, Martin, M. L. San, and Aramendía, E.
- Subjects
NOISE ,ENVIRONMENTAL monitoring ,VEGETATION monitoring ,QUANTITATIVE research ,INTEGRATED software ,ALGORITHMS ,COMPUTER programming ,COMPUTER software - Abstract
The main objective of this study is to analyze the differences found in the results of noise mapping using two of the most popular software packages for the prediction of environmental noise. The location selected to conduct the comparative study is an area encompassed by the ring road that surrounds the city of Pamplona, computed on a grid of approximately 6 × 10^5 points. In fact, as the Environmental Noise Directive points out, it is a major road designated by a Member State (Spain). Configuration of the calculation parameters (discretization of the sources, ground absorption, reflection order, etc.) was made as equivalent as the programs allow. In spite of that, a great number of differences appear in the findings. Although in 95.5% of the points the difference in the noise level calculated by the two programs was less than 3 dB, this general statistic concealed some great differences. These are due to the various algorithms that the programs implement to evaluate noise levels. Most differences pertain to highly screened or remote receivers. In the former, the visibility algorithm is the main cause of such differences. In the latter, differences are mainly brought about by the two software systems' different implementations of propagation under homogeneous and favorable atmospheric conditions. [ABSTRACT FROM AUTHOR]
- Published
- 2010
- Full Text
- View/download PDF
41. A decision support system for the automatic management of keep-clear signs based on support vector machines and geographic information systems.
- Author
-
Lafuente-Arroyo, S., Salcedo-Sanz, S., Maldonado-Bascón, S., Portilla-Figueras, J.A., and López-Sastre, R.J.
- Subjects
- *
DECISION support systems , *SUPPORT vector machines , *GEOGRAPHIC information systems , *DATA analysis , *MATHEMATICAL analysis , *DATABASES , *ALGORITHMS - Abstract
Abstract: This paper presents a decision support system for automatic keep-clear sign management. The system consists of several modules. First, an acquisition module obtains images using a vehicle equipped with two recording cameras. A recognition module, based on Support Vector Machines (SVMs), analyzes each image and decides whether it contains a keep-clear sign. The images with keep-clear signs are included in a Geographical Information System (GIS) database. Finally, in the management module, the data in the GIS are compared with the council database in order to decide on actions such as the repair or repositioning of signs, the detection of possible fraud, etc. We present the first tests of the system in a Spanish city (Meco, Madrid), where it is being evaluated for application in the near future. [Copyright Elsevier]
- Published
- 2010
- Full Text
- View/download PDF
42. Estimation of RVoG Scene Parameters by Means of PolInSAR With TanDEM-X Data: Effect of the Double-Bounce Contribution.
- Author
-
Romero-Puig, Noelia, Lopez-Sanchez, Juan M., and Ballester-Berman, J. David
- Subjects
CROPS ,ALGORITHMS ,PADDY fields ,BISTATIC radar ,GROUND vegetation cover - Abstract
This article evaluates the effect of the double-bounce (DB) decorrelation term that appears in single-pass bistatic acquisitions, as in the TanDEM-X system, on the inversion of scene parameters by means of polarimetric SAR interferometry (PolInSAR). The retrieval of all scene parameters involved in the Random Volume over Ground (RVoG) model (i.e., ground topography, vegetation height, extinction, and ground-to-volume ratios) is affected by this term when the radar response from the ground is dominated by the DB. The estimation error in all these parameters is analyzed by means of simulations over a wide range of system configurations and scene variables for both agricultural crop and forest scenarios. Simulations demonstrate that the inclusion of the DB term, which complicates the inversion algorithm, is necessary for angles of incidence shallower than 30° to achieve an estimation error below 10% in vegetation height and to avoid a significant underestimation of the ground-to-volume ratios. At steep incidences, this decorrelation term does not affect the estimation of vegetation height and ground-to-volume ratios. Regarding the extinction, this parameter is intrinsically not well estimated, since most retrieved values are close to the initial guesses employed for the optimization algorithm, regardless of whether the DB decorrelation term is used. Finally, these findings are compared with experimental results from TanDEM-X data acquired over rice fields in Spain for the available system parameters (baseline and incidence angle) of the acquired data set. [ABSTRACT FROM AUTHOR]
- Published
- 2020
- Full Text
- View/download PDF
43. A genome-wide analysis of copy number variation in Murciano-Granadina goats.
- Author
-
Guan, Dailu, Martínez, Amparo, Castelló, Anna, Landi, Vincenzo, Luigi-Sierra, María Gracia, Fernández-Álvarez, Javier, Cabrera, Betlem, Delgado, Juan Vicente, Such, Xavier, Jordana, Jordi, and Amills, Marcel
- Subjects
GOAT breeds ,GOATS ,ATP-binding cassette transporters ,GENE targeting ,DNA copy number variations ,ALGORITHMS ,SECRETION ,GENETIC transduction - Abstract
Background: In this work, our aim was to generate a map of the copy number variations (CNV) segregating in a population of Murciano-Granadina goats, the most important dairy breed in Spain, and to ascertain the main biological functions of the genes that map to copy number variable regions. Results: Using a dataset that comprised 1036 Murciano-Granadina goats genotyped with the Goat SNP50 BeadChip, we were able to detect 4617 and 7750 autosomal CNV with the PennCNV and QuantiSNP software, respectively. By applying the EnsembleCNV algorithm, these CNV were assembled into 1461 CNV regions (CNVR), of which 486 (33.3% of the total CNVR count) were consistently called by PennCNV and QuantiSNP and used in subsequent analyses. In this set of 486 CNVR, we identified 78 gain, 353 loss and 55 gain/loss events. The total length of all the CNVR (95.69 Mb) represented 3.9% of the goat autosomal genome (2466.19 Mb), whereas their size ranged from 2.0 kb to 11.1 Mb, with an average size of 196.89 kb. Functional annotation of the genes that overlapped with the CNVR revealed an enrichment of pathways related with olfactory transduction (fold-enrichment = 2.33, q-value = 1.61 × 10^−10), ABC transporters (fold-enrichment = 5.27, q-value = 4.27 × 10^−04) and bile secretion (fold-enrichment = 3.90, q-value = 5.70 × 10^−03). Conclusions: A previous study reported that the average number of CNVR per goat breed was ~20 (978 CNVR/50 breeds), which is much smaller than the number we found here (486 CNVR). We attribute this difference to the fact that the previous study included multiple caprine breeds that were represented by small to moderate numbers of individuals. Given the low frequencies of CNV (in our study, the average frequency of CNV is 1.44%), such a design would probably underestimate the levels of CNV diversity at the within-breed level. We also observed that functions related with sensory perception, metabolism and embryo development are overrepresented in the set of genes that overlapped with CNV, and that these loci often belong to large multigene families with tens, hundreds or thousands of paralogous members, a feature that could favor the occurrence of duplications or deletions by non-allelic homologous recombination. [ABSTRACT FROM AUTHOR]
- Published
- 2020
- Full Text
- View/download PDF
44. Free journal-ranking tool enters citation market.
- Author
-
Butler, Declan
- Subjects
- *
DATABASES , *INTERNET , *STATISTICS , *BIBLIOGRAPHICAL citations , *ALGORITHMS , *RESEARCH , *DATA mining - Abstract
The article reports on the launch of an Internet database, the SCImago Journal & Country Rank database, which allows users to generate on-the-fly citation statistics of published research papers for free. The open-access database calculates papers' impact factors using an algorithm. It is collaborating with the Amsterdam-based science publisher Elsevier. SCImago is a data-mining and visualization group in Spain. The group ranks journals and countries using citation metrics such as the popular Hirsch index. It also includes the SCImago Journal Rank (SJR).
- Published
- 2008
- Full Text
- View/download PDF
45. Players’ selection for basketball teams, through Performance Index Rating, using multiobjective evolutionary algorithms.
- Author
-
Pérez-Toledano, Miguel Ángel, Rodriguez, Francisco J., García-Rubio, Javier, and Ibañez, Sergio José
- Subjects
EVOLUTIONARY algorithms ,BASKETBALL teams ,SPORTS competitions ,SPORTS administration ,BIOLOGICAL evolution ,DIFFERENTIAL evolution - Abstract
In any sport, the selection of players for a team is fundamental to its subsequent performance. Many factors condition the selection process, from the characteristics of the sport discipline to financial limitations, including a long list of restrictions associated with the environment of the competitions in which the team takes part. All of this makes the process of selecting a roster of players very complex, as it is affected by multiple variables and in many cases marked by a great deal of subjectivity. The purpose of this article was to objectively select the players for a basketball team using an evolutionary algorithm, the Non-dominated Sorting Genetic Algorithm II (NSGA-II), which uses stochastic search methods based on the imitation of natural biological evolution. The sample was composed of the players from the teams competing in the top Spanish basketball league, the Association of Basketball Clubs (ACB). To assess the quality of the solutions obtained, the results were compared with the ACB teams playing in the same competition as the players used in the study. The results make it possible to obtain different solutions for composing teams that make profitable use of financial resources while taking into account the restrictions of the competition and of each club's sport management. [ABSTRACT FROM AUTHOR]
- Published
- 2019
- Full Text
- View/download PDF
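The multiobjective selection described above rests on non-dominated sorting. A minimal sketch of the Pareto-front step (only the first NSGA-II front, not the full algorithm of the paper) over hypothetical objective pairs of the form (total salary, negated team PIR), both to be minimized:

```python
def dominates(a, b):
    """a dominates b if it is no worse in every objective and strictly
    better in at least one (all objectives minimized)."""
    return (all(x <= y for x, y in zip(a, b))
            and any(x < y for x, y in zip(a, b)))

def pareto_front(solutions):
    """Return the non-dominated solutions: the first NSGA-II front."""
    return [s for i, s in enumerate(solutions)
            if not any(dominates(o, s)
                       for j, o in enumerate(solutions) if j != i)]
```

A roster that is both cheaper and higher-scoring than another eliminates it from the front; rosters that trade cost against performance survive as alternative solutions, which matches the abstract's "different solutions for composing teams".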
46. Corneal Stability following Hyperopic LASIK with Advanced Laser Ablation Profiles Analyzed by a Light Propagation Study.
- Author
-
Gharaibeh, Almutez M., Villanueva, Asier, Mas, David, Espinosa, Julian, and Alió, Jorge L.
- Subjects
CORNEA physiology ,ALGORITHMS ,CORNEAL topography ,HYPEROPIA ,SCIENTIFIC observation ,POSTOPERATIVE period ,REGRESSION analysis ,SURGEONS ,VISUAL acuity ,LASIK ,STATISTICAL reliability ,RETROSPECTIVE studies - Abstract
Purpose. To assess anterior corneal surface stability 12 months following hyperopic LASIK correction with a light propagation algorithm. Setting. Vissum Instituto Oftalmológico de Alicante, Universidad Miguel Hernández, Alicante, Spain. Methods. This retrospective consecutive observational study includes 37 eyes of 37 patients treated with a 6th-generation excimer laser platform (Schwind Amaris). Hyperopic LASIK was performed in all of them by the same surgeon (JLA), and all eyes completed a 12-month follow-up. Corneal topography was analyzed with a light propagation algorithm to assess the stability of the corneal outcomes over one year of follow-up. Results. Between three and twelve months postoperatively, an objective corneal power (OCP) regression of 0.39D and 0.41D was found for the 6mm and 9mm central corneal zones, respectively. Subjective outcomes at the end of the follow-up period were as follows: 65% of eyes had a spherical equivalent within ±0.50 D; 70% of eyes had an uncorrected distance visual acuity of 20/20 or better; 86% of eyes had the same or better corrected distance visual acuity. In terms of stability, 0.14D of regression was found. No statistically significant differences were found for any of the study parameters evaluated at different postoperative moments over the 12-month period. Conclusions. Light propagation analysis confirms corneal surface stability following modern hyperopic LASIK with a 6th-generation excimer laser technology over a 12-month period. [ABSTRACT FROM AUTHOR]
- Published
- 2018
- Full Text
- View/download PDF
47. Predicting the onset of hazardous alcohol drinking in primary care: development and validation of a simple risk algorithm.
- Author
-
Bellón, Juan Ángel, de Dios Luna, Juan, King, Michael, Nazareth, Irwin, Motrico, Emma, GildeGómez-Barragán, María Josefa, Torres-González, Francisco, Montón-Franco, Carmen, Sánchez-Celaya, Marta, Díaz-Barreiros, Miguel Ángel, Vicens, Catalina, and Moreno-Peral, Patricia
- Subjects
ALCOHOL drinking ,PRIMARY care ,CLINICAL prediction rules ,ALGORITHMS ,CHILD sexual abuse ,SMOKING ,PREVENTION of alcoholism ,PSYCHOLOGY of alcoholism ,ALCOHOLISM ,COMPARATIVE studies ,LONGITUDINAL method ,RESEARCH methodology ,MEDICAL cooperation ,PRIMARY health care ,PROGNOSIS ,QUESTIONNAIRES ,RESEARCH ,RISK assessment ,EVALUATION research ,BEHAVIOR disorders - Abstract
Background: Little is known about the risk of progressing to hazardous alcohol use in abstinent or low-risk drinkers. Aim: To develop and validate a simple brief risk algorithm for the onset of hazardous alcohol drinking (HAD) over 12 months for use in primary care. Design and Setting: Prospective cohort study in 32 health centres from six Spanish provinces, with evaluations at baseline, 6 months, and 12 months. Method: Forty-one risk factors were measured, and multilevel logistic regression and inverse probability weighting were used to build the risk algorithm. The outcome was the new occurrence of HAD during the study, as measured by the AUDIT. Results: From the lists of 174 GPs, 3954 adult abstinent or low-risk drinkers were recruited. The 'predictAL-10' risk algorithm included just nine variables (10 questions): province, sex, age, cigarette consumption, perception of financial strain, having ever received treatment for an alcohol problem, childhood sexual abuse, AUDIT-C, and the interaction AUDIT-C*Age. The c-index was 0.886 (95% CI = 0.854 to 0.918). The optimal cutoff had a sensitivity of 0.83 and a specificity of 0.80. Excluding childhood sexual abuse from the model (the 'predictAL-9'), the c-index was 0.880 (95% CI = 0.847 to 0.913), with sensitivity 0.79 and specificity 0.81. There was no statistically significant difference between the c-indexes of predictAL-10 and predictAL-9. Conclusion: The predictAL-10/9 is a simple and internally valid risk algorithm to predict the onset of hazardous alcohol drinking over 12 months in primary care attendees; it is a brief tool that is potentially useful for the primary prevention of hazardous alcohol drinking. [ABSTRACT FROM AUTHOR]
- Published
- 2017
- Full Text
- View/download PDF
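A risk algorithm built by logistic regression, as in the record above, reduces at prediction time to a weighted sum passed through the logistic function. The sketch below shows only that scoring step; the intercept and coefficients are hypothetical placeholders, since the published predictAL-10 weights are not reproduced in the abstract.

```python
import math

def risk_probability(intercept, coefs, features):
    """Logistic risk score: sigmoid of the linear predictor.

    `coefs` and `features` are parallel sequences; in a real model
    the features would encode items such as age, AUDIT-C score, or
    smoking status, with fitted coefficients.
    """
    lp = intercept + sum(c * x for c, x in zip(coefs, features))
    return 1.0 / (1.0 + math.exp(-lp))
```

Screening then compares this probability against a cutoff chosen for the desired sensitivity/specificity trade-off, as the abstract does with its optimal cutoff.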
48. Effect of country-of-origin contextual factors and length of stay on immigrants' substance use in Spain.
- Author
-
Sordo, L., Indave, B. I., Vallejo, F., Belza, M. J., Sanz-Barbero, B., Rosales-Statkus, M., Fernández-Balbuena, S., and Barrio, G.
- Subjects
SUBSTANCE abuse ,ALGORITHMS ,CONFIDENCE intervals ,EMIGRATION & immigration ,LENGTH of stay in hospitals ,POISSON distribution ,RESEARCH funding ,STATISTICAL sampling ,DATA analysis software ,DESCRIPTIVE statistics - Abstract
Background: Factors explaining disparities in risk of substance use between immigrants and natives and between immigrant subgroups are poorly understood. We aimed to describe such disparities and identify some explanatory factors in Spain. Methods: Participants were residents aged 15-64 years from 2005-07 nationally representative surveys. Outcomes were prevalences of alcohol, tobacco, sedative-hypnotic, cannabis and other illegal substance use. Immigrants were classed as recent if they had <5 years of Spanish stay and long term if ≥10 years. Country-of-origin income per capita and population level of substance use were taken from international databases. Adjusted prevalence ratios (aPRs) and percent change from Poisson regression with robust variance were used to estimate risk disparities and effects of immigration variables. Results: Most immigrants had lower substance use than natives, although it generally increased with increasing Spanish stay, especially for illegal substances. This lower risk could be partially explained by country-of-origin contextual factors, such as a lower level of income or substance use, and religious or cultural factors such as Islam. By origin, recent immigrant aPRs and convergence-divergence risk patterns were, respectively, as follows: lower aPRs with upward convergence (often incomplete) toward natives' risk in immigrants from the Muslim area, Eastern Europe and Latin America excluding the South Cone; lower/similar aPRs with upward overtaking or divergent patterns in South-Cone Americans; and similar/higher aPRs with stable or upward divergent patterns in non-Eastern Europeans. Conclusion: Spain is a host context that seems to facilitate increased substance use among immigrants, even those from countries with prevalences close to Spain's. However, country-of-origin context is important in explaining disparities in substance use among immigrants. [ABSTRACT FROM AUTHOR]
- Published
- 2015
- Full Text
- View/download PDF
49. Local implementation of a syndromic influenza surveillance system using emergency department data in Santander, Spain.
- Author
-
Schrell, S., Ziemann, A., Garcia-Castrillo Riesgo, L., Rosenkötter, N., Llorca, J., Popa, D., and Krafft, T.
- Subjects
EARLY medical intervention ,PUBLIC health surveillance ,ALGORITHMS ,STATISTICAL correlation ,HOSPITAL emergency services ,RESEARCH methodology ,POISSON distribution ,REACTION time ,RESEARCH funding ,TIME series analysis ,SYSTEMS development ,PREDICTIVE validity ,RETROSPECTIVE studies ,RECEIVER operating characteristic curves ,SEASONAL influenza ,MEDICAL coding - Abstract
Background We assessed the local implementation of syndromic surveillance (SyS) as part of the European project 'System for Information on, Detection and Analysis of Risks and Threats to Health' in Santander, Spain. Methods We applied a cumulative sum algorithm to emergency department (ED) chief complaints for influenza-like illness in the seasons 2010-11 and 2011-12. We fine-tuned the algorithm using a receiver operating characteristic analysis to identify the optimal trade-off between sensitivity and specificity, and defined alert criteria. We assessed the timeliness of the SyS system in detecting the onset of the influenza season. Results The ED data correlated with the sentinel data. With the best algorithm settings we achieved 70/63% sensitivity and 89/95% specificity for 2010-11/2011-12. At least 2 consecutive days of signals defined an alert. In 2010-11 the SyS system alerted 1 week before the sentinel system, and in 2011-12 in the same week. The ED data are available on a daily basis, providing an advantage in timeliness compared with the weekly sentinel data. Conclusions ED-based SyS in Santander complements sentinel influenza surveillance by providing timely information. Local fine-tuning and the definition of alert criteria are recommended to enhance validity. [ABSTRACT FROM AUTHOR]
- Published
- 2013
- Full Text
- View/download PDF
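The detection scheme in the record above — a cumulative sum (CUSUM) on daily ED counts, with an alert defined as at least 2 consecutive days of signals — can be sketched as follows. The reference value k, threshold h, and baseline standardization are illustrative assumptions, not the tuned parameters of the Santander system.

```python
import statistics

def cusum_alerts(counts, baseline, k=0.5, h=4.0, min_days=2):
    """One-sided CUSUM on standardized daily counts.

    `baseline` holds historical non-epidemic daily counts used for
    standardization; an alert requires `min_days` consecutive days
    with the cumulative sum above the threshold `h`.
    """
    mean = statistics.mean(baseline)
    sd = statistics.stdev(baseline) or 1.0  # guard a flat baseline
    s, streak, alerts = 0.0, 0, []
    for c in counts:
        z = (c - mean) / sd
        s = max(0.0, s + z - k)             # CUSUM recursion
        streak = streak + 1 if s > h else 0
        alerts.append(streak >= min_days)
    return alerts
```

Because the statistic accumulates excess over the baseline rather than reacting to single days, a sustained rise in influenza-like-illness complaints trips the alert while an isolated busy day does not, and the 2-day rule further suppresses one-off signals.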
50. Abnormal quality detection and isolation in water distribution networks using simulation models.
- Author
-
Nejjari, F., Pérez, R., Puig, V., Quevedo, J., Sarrate, R., Cugueró, M. A., Sanz, G., and Mirats, J. M.
- Subjects
WATER distribution ,ALGORITHMS ,CHLORINE ,RESIDUAL charges - Abstract
The article discusses the identification of abnormal quality for water distribution networks through a fault isolation algorithm in Barcelona, Spain. It notes that chlorine measurements and sensitivity analysis are the basis for a localization method on distribution. It mentions that a fault sensitivity matrix can be correlated with residual charges by the algorithm.
- Published
- 2012
- Full Text
- View/download PDF