25 results
Search Results
2. A multi-resolution ensemble model of three decision-tree-based algorithms to predict daily NO2 concentration in France 2005-2022.
- Author
-
Barbalat G, Hough I, Dorman M, Lepeule J, and Kloog I
- Subjects
- France, Environmental Monitoring methods, Air Pollution analysis, Nitrogen Dioxide analysis, Decision Trees, Algorithms, Air Pollutants analysis
- Abstract
Understanding and managing the health effects of Nitrogen Dioxide (NO2) requires high-resolution spatiotemporal exposure maps. Here, we developed a multi-stage, multi-resolution ensemble model that predicts daily NO2 concentration across continental France from 2005 to 2022. Innovations of this work include the computation of daily predictions at a 200 m resolution in large urban areas and the use of a spatio-temporal blocking procedure to avoid data leakage and ensure fair performance estimation. Predictions were obtained after three cascading stages of modeling: (1) predicting NO2 total column density from the Ozone Monitoring Instrument satellite; (2) predicting daily NO2 concentrations at a 1 km spatial resolution using a large set of potential predictors, such as the predictions obtained from stage 1, land-cover data, and road traffic data; and (3) predicting the residuals of the stage 2 models at a 200 m resolution in large urban areas. The latter two stages used a generalized additive model to ensemble the predictions of three decision-tree algorithms (random forest, extreme gradient boosting, and categorical boosting). Cross-validated performance of our ensemble models was overall very good, with a ten-fold cross-validated R2 of 0.83 for the 1 km model and 0.69 for the 200 m model. All three base learners contributed to the ensemble predictions to varying degrees depending on time and space. In sum, our multi-stage approach was able to predict daily NO2 concentrations with a relatively low error. Ensembling the predictions maximizes the chance of obtaining accurate values when one base learner fails in a specific area or at a particular time, by relying on the other learners. To the best of our knowledge, this is the first study to predict NO2 concentrations in France with such a high spatiotemporal resolution, large spatial extent, and long temporal coverage.
Exposure estimates are available to investigate NO2 health effects in epidemiological studies.
Competing Interests: The authors declare that they have no known competing financial interests or personal relationships that could have appeared to influence the work reported in this paper.
(Copyright © 2024 The Authors. Published by Elsevier Inc. All rights reserved.)
- Published
- 2024
- Full Text
- View/download PDF
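The ensembling step described in the abstract above can be sketched in a few lines. The block below is a minimal stand-in, not the paper's model: it blends three synthetic base-learner outputs with least-squares weights in place of the generalized additive model, and all data and learners are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic "daily NO2" target and three imperfect base-learner predictions
# (stand-ins for the random forest, gradient boosting and categorical
# boosting outputs of the paper).
truth = rng.uniform(10, 60, size=200)
preds = np.column_stack([
    truth + rng.normal(0, 4, 200),        # learner 1: noisy
    truth * 1.1 + rng.normal(0, 2, 200),  # learner 2: biased
    truth + rng.normal(0, 8, 200),        # learner 3: very noisy
])

# Ensemble stage: fit combination weights by least squares.
# (The paper uses a generalized additive model whose weights vary in space
# and time; a global linear blend is the simplest stand-in.)
X = np.column_stack([preds, np.ones(len(truth))])  # intercept term
w, *_ = np.linalg.lstsq(X, truth, rcond=None)
ensemble = X @ w

def rmse(a, b):
    return float(np.sqrt(np.mean((a - b) ** 2)))

print(rmse(ensemble, truth), [rmse(preds[:, j], truth) for j in range(3)])
```

On this synthetic data the blend cannot do worse than any single learner, which is the intuition the abstract gives for why ensembling guards against a learner failing in a specific area or period.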
3. AUTOMATIC ALGORITHM FOR GEOREFERENCING HISTORICAL-TO-NOWADAYS AERIAL IMAGES ACQUIRED IN NATURAL ENVIRONMENTS.
- Author
-
Craciun, D. and Le Bris, A.
- Subjects
ALGORITHMS ,IMAGE registration ,FEATURE extraction ,ENVIRONMENTAL monitoring - Abstract
Automatic georeferencing of historical-to-nowadays aerial images is a key ingredient for territory evolution analysis and environmental monitoring. Existing georeferencing methods based on feature extraction and matching have reported successful results for multi-epoch aerial images acquired in structured and man-made environments. While improving the state of the art of the multi-epoch georeferencing problem, such frameworks present several limitations when applied to unstructured scenes, such as natural feature-less environments characterized by homogeneous or texture-less areas. This is mainly due to the lack of structured areas, which often results in sparse and ambiguous feature matches, introducing inconsistencies during the pose estimation process. This paper addresses the automatic georeferencing problem for historical aerial images acquired in unstructured natural environments. It introduces a feature-less algorithm designed to perform historical-to-nowadays image matching for pose estimation in a fully automatic fashion. The proposed algorithm operates in two stages: (i) 2D patch extraction and matching and (ii) 3D patch-based local alignment. The final output is a set of 3D patch matches and the 3D rigid transformation relating each pair of homologous patches. The obtained 3D point matches are designed to be injected into traditional multi-view pose optimisation engines. Experimental results on real datasets acquired over the Fabas area in France demonstrate the effectiveness of the proposed method. Our findings illustrate that the proposed georeferencing technique provides accurate results in the presence of large periods of time separating historical from nowadays aerial images (up to a 48-year time span). [ABSTRACT FROM AUTHOR]
- Published
- 2022
- Full Text
- View/download PDF
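The "3D rigid transformation relating each pair of homologous patches" mentioned above is classically estimated with the Kabsch/SVD solution. The sketch below shows that standard building block on invented point sets; the paper's own estimator may differ in its details.

```python
import numpy as np

def rigid_align(P, Q):
    """Best-fit rotation R and translation t mapping point set P onto Q
    (Kabsch/SVD solution): minimizes sum ||R p + t - q||^2."""
    cP, cQ = P.mean(axis=0), Q.mean(axis=0)
    H = (P - cP).T @ (Q - cQ)              # 3x3 cross-covariance
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))  # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = cQ - R @ cP
    return R, t

rng = np.random.default_rng(0)
P = rng.normal(size=(10, 3))                     # "historical" patch points
theta = 0.3
R_true = np.array([[np.cos(theta), -np.sin(theta), 0],
                   [np.sin(theta),  np.cos(theta), 0],
                   [0, 0, 1]])
Q = P @ R_true.T + np.array([5.0, -2.0, 0.1])    # "nowadays" counterparts
R, t = rigid_align(P, Q)
print(np.allclose(R, R_true))  # recovers the simulated transform
```

In the paper's pipeline such per-patch transforms (and the resulting 3D matches) would then feed a multi-view pose optimisation engine.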
4. Anti-sine-cosine atom search optimization (ASCASO): a novel approach for parameter estimation of PV models.
- Author
-
Zhou W, Wang P, Zhao X, and Chen H
- Subjects
- Electricity, France, Mutation, Algorithms, Communication
- Abstract
Nowadays, solar power generation has gradually become part of electric energy sharing. How to effectively enhance the energy conversion efficiency of solar cells and components has gradually emerged as a focal point of research. This paper presents a boosted atom search optimization (ASO) with a new anti-sine-cosine mechanism (ASCASO) to realize the parameter estimation of photovoltaic (PV) models. The anti-sine-cosine mechanism is inspired by the update principle of the sine-cosine algorithm (SCA) and the mutation strategy of linear population size reduction adaptive differential evolution (LSHADE). Its working principle is to use two mutation formulas containing arcsine and arccosine functions to further update the positions of atoms. The introduction of the anti-sine-cosine mechanism achieves random handover within the population and promotes information exchange between neighbors. For evaluation, ASCASO is applied to estimate the parameters of three PV models of the R.T.C. France cell, one Photowatt-PWP201 PV module model, and two commercial PV panels, STM6-40/36 and STM6-120/36, with monocrystalline cells. ASCASO is compared with nine reported algorithms to assess its performance. The parameter estimation results for the different PV models demonstrate that ASCASO performs more accurately and reliably than the other reported methods. Thus, ASCASO can be considered a highly effective approach for accurately estimating the parameters of PV models., (© 2023. The Author(s), under exclusive licence to Springer-Verlag GmbH Germany, part of Springer Nature.)
- Published
- 2023
- Full Text
- View/download PDF
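The PV models named above (R.T.C. France cell, Photowatt-PWP201 module) are fitted through the single-diode equation, which is implicit in the current. The sketch below evaluates that standard model by damped fixed-point iteration; the parameter values are only typical magnitudes for a small silicon cell, not the paper's estimates.

```python
import math

def diode_current(V, Iph, I0, n, Rs, Rsh, T=306.15, iters=60):
    """Single-diode model: solve
        I = Iph - I0*(exp((V + I*Rs)/(n*Vt)) - 1) - (V + I*Rs)/Rsh
    for I by damped fixed-point iteration (Vt is the thermal voltage)."""
    k, q = 1.380649e-23, 1.602176634e-19   # Boltzmann, elementary charge
    Vt = k * T / q
    I = Iph                                 # photocurrent as starting guess
    for _ in range(iters):
        I_new = (Iph
                 - I0 * (math.exp((V + I * Rs) / (n * Vt)) - 1)
                 - (V + I * Rs) / Rsh)
        I = 0.5 * I + 0.5 * I_new           # damping aids convergence
    return I

# Illustrative parameter values (typical magnitudes, not fitted estimates).
I = diode_current(V=0.5, Iph=0.76, I0=3e-7, n=1.5, Rs=0.036, Rsh=54.0)
print(round(I, 4))
```

Algorithms such as ASCASO search over (Iph, I0, n, Rs, Rsh) so that currents computed this way match measured I-V points.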
5. Imitation Learning-Based Energy Management Algorithm: Lille Catholic University Smart Grid Demonstrator Case Study.
- Author
-
Swibki, Taheni, Ben Salem, Ines, Kraiem, Youssef, Abbes, Dhaker, and El Amraoui, Lilia
- Subjects
ENERGY management ,MACHINE learning ,REINFORCEMENT learning ,ALGORITHMS ,ENERGY industries ,LINEAR programming - Abstract
This paper proposes a novel energy management approach (imitation-Q-learning) based on imitation learning (IL) and reinforcement learning (RL). The proposed approach trains a decision-making agent, based on a modified Q-learning algorithm, to mimic expert demonstrations in order to solve a microgrid (MG) energy management problem. Those demonstrations are derived from solving a set of linear programming (LP) problems. Consequently, the imitation-Q-learning algorithm learns by interacting with the MG simulator and imitating the LP demonstrations so as to make real-time decisions that minimize the MG energy costs without prior knowledge of the uncertainties related to photovoltaic (PV) production, load consumption, and electricity prices. A real-scale MG at the Lille Catholic University in France was used as a case study to conduct experiments. The proposed approach was compared to the expert (the LP algorithm) and to the conventional Q-learning algorithm in different test scenarios. It was approximately 80 times faster than conventional Q-learning and achieved the same performance as LP. To test the robustness of the proposed approach, a PV inverter crash and load shedding were also simulated. Preliminary results show the effectiveness of the proposed method. [ABSTRACT FROM AUTHOR]
- Published
- 2023
- Full Text
- View/download PDF
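The imitation-then-refinement idea above can be illustrated on a toy problem. In this sketch the LP expert is replaced by a hard-coded demonstration policy, the microgrid by a four-hour price cycle, and all numbers are invented; only the two-phase structure (replay expert demonstrations to bootstrap Q, then epsilon-greedy Q-learning) mirrors the paper's scheme.

```python
import random
random.seed(0)

# Toy microgrid: state = hour (0..3); actions: 0 = buy from grid, 1 = battery.
# price[s] is the grid price; using the battery costs 1 regardless of hour.
price = [1, 3, 4, 2]
expert_action = [0 if price[s] <= 1 else 1 for s in range(4)]  # LP-like rule

def step_cost(s, a):
    return price[s] if a == 0 else 1

Q = {(s, a): 0.0 for s in range(4) for a in (0, 1)}
alpha, gamma = 0.5, 0.9

for episode in range(500):
    for s in range(4):
        if episode < 100:
            a = expert_action[s]            # phase 1: imitate the expert
        elif random.random() < 0.1:
            a = random.choice((0, 1))       # phase 2: explore ...
        else:
            a = min((0, 1), key=lambda x: Q[(s, x)])  # ... or act greedily
        nxt = (s + 1) % 4
        # Cost-minimizing Q-learning update (min over next-state actions).
        target = step_cost(s, a) + gamma * min(Q[(nxt, 0)], Q[(nxt, 1)])
        Q[(s, a)] += alpha * (target - Q[(s, a)])

greedy = [min((0, 1), key=lambda a: Q[(s, a)]) for s in range(4)]
print(greedy)
```

After training, the greedy policy buys from the grid only when it is no dearer than the battery, matching the expert's rule on this toy instance.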
6. DATABOOK: a standardised framework for dynamic documentation of algorithm design during Data Science projects.
- Author
-
Nesvijevskaia, Anna
- Subjects
SCIENCE projects ,DATA science ,ALGORITHMS ,SCIENTIFIC computing ,FLEXIBLE structures - Abstract
This paper proposes a standard documentary framework, called Databook, for Data Science projects. The proposal is the result of five years of action research on multiple projects in several sectors of activity in France, and of confronting standard theoretical Data Science processes, such as CRISP-DM, with the reality of the field. The minimalist and flexible structure of the Databook prototype, described and illustrated in this paper, has proven operational on more than a hundred projects and has been recognised by various stakeholders as an excellent facilitator of Human Data Mediation, especially for multi-skilled projects. Beyond its proven benefits for project efficiency, this framework, conceived as a boundary object, can be applied more broadly to data project portfolio management and to data value, governance and quality. By going beyond the computational aspect of the models, the Databook is an answer to the issues of interpretability and auditability of algorithms. [ABSTRACT FROM AUTHOR]
- Published
- 2021
- Full Text
- View/download PDF
7. Sub-exponential Time Parameterized Algorithms for Graph Layout Problems on Digraphs with Bounded Independence Number.
- Author
-
Misra, Pranabendu, Saurabh, Saket, Sharma, Roohani, and Zehavi, Meirav
- Subjects
GRAPH algorithms ,CUTTING stock problem ,REINFORCEMENT learning ,DIRECTED graphs ,ALGORITHMS - Abstract
Fradkin and Seymour (J Comb Theory Ser B 110:19–46, 2015) defined the class of digraphs of bounded independence number as a generalization of the class of tournaments. They argued that the class of digraphs of bounded independence number is structured enough to be exploited algorithmically. In this paper, we further strengthen this belief by showing that several cut problems that admit sub-exponential time parameterized algorithms (a trait uncommon to parameterized algorithms) on tournaments, including Directed Feedback Arc Set, Directed Cutwidth and Optimal Linear Arrangement, also admit such algorithms on digraphs of bounded independence number. Towards this, we rely on the generic approach of Fomin and Pilipczuk (in: Proceedings of the Algorithms—ESA 2013—21st Annual European Symposium, Sophia Antipolis, France, September 2–4, 2013, pp. 505–516, 2013), where to get the desired algorithms, it is enough to bound the number of k-cuts in digraphs of bounded independence number by a sub-exponential FPT function (Fomin and Pilipczuk bounded the number of k-cuts in transitive tournaments). Specifically, our main technical contribution is a combinatorial result that proves that the yes-instances of the problems (defined above) have a sub-exponential number of k-cuts. We prove this bound by using a combination of chromatic coding, inductive reasoning and exploiting the structural properties of these digraphs. [ABSTRACT FROM AUTHOR]
- Published
- 2023
- Full Text
- View/download PDF
8. Seamless Multicast : an SDN-based architecture for continuous audiovisual transport.
- Author
-
Colombo, Constant, Lepage, Francis, Kopp, René, and Gnaedinger, Eric
- Subjects
MULTICASTING (Computer networks) ,SOFTWARE-defined networking ,KNOWLEDGE management ,ALGORITHMS ,DIGITAL television ,STREAMING video & television - Abstract
For audiovisual network operators, end-user satisfaction is a major issue. This is the case for TDF, which operates a nationwide network in France whose main purpose is to carry Digital Terrestrial Television streams. Such audiovisual content is forwarded through multicast real-time streams, which require continuity of service. Therefore, the main goal of this work is to define a new architecture that prevents any impact during network healing time. The proposed architecture uses a pair of redundant multicast trees and ensures their seamless resiliency. This architecture, called "Seamless Multicast", takes advantage of the ability of network-end equipment to receive and combine two identical streams, complete or not. The main contribution of this paper is the development and evaluation of an algorithm for the computation of a pair of multicast trees and the associated hitless deployment scheme. Implementation requires a Software-Defined Networking architecture, in which performance knowledge and bandwidth management are centralized in a controller. A proof-of-concept controller has been used to validate the architecture's global behaviour in a virtualized environment across multiple scenarios. [ABSTRACT FROM AUTHOR]
- Published
- 2021
- Full Text
- View/download PDF
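The "receive and combine two identical streams, complete or not" capability above is essentially hitless merging keyed by packet sequence number. The sketch below shows that receive-side idea on invented packet tuples; it is an illustration of the principle, not TDF's implementation.

```python
def merge_streams(primary, backup):
    """Hitless merge of two redundant packet streams.
    Packets are (seq, payload) tuples carrying identical content on both
    trees, so either copy of a given sequence number is acceptable."""
    out = {}
    for seq, payload in list(primary) + list(backup):
        out.setdefault(seq, payload)   # first arrival wins, duplicate dropped
    return [out[s] for s in sorted(out)]

# Primary loses packets 2-3 (e.g. during network healing); backup fills them.
primary = [(0, "p0"), (1, "p1"), (4, "p4")]
backup  = [(0, "p0"), (2, "p2"), (3, "p3"), (4, "p4")]
print(merge_streams(primary, backup))
```

As long as the two multicast trees never fail simultaneously, the merged output stays complete, which is why the paper's tree-pair computation aims at disjointness.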
9. Graph-based ahead monitoring of vulnerabilities in large dynamic transportation networks.
- Author
-
Furno A, Faouzi NE, Sharma R, and Zimeo E
- Subjects
- Databases as Topic, France, Spatio-Temporal Analysis, Algorithms, Transportation
- Abstract
Betweenness Centrality (BC) has proven to be a fundamental metric in many domains for identifying the components (nodes) of a system, modelled as a graph, that are most traversed by information flows and are thus critical to the proper functioning of the system itself. In the transportation domain, the metric has mainly been adopted to discover topological bottlenecks of the physical infrastructure composed of roads or railways. The adoption of this metric to study the evolution of transportation networks while also taking into account dynamic traffic conditions is in its infancy, mainly due to the high computation time needed to compute BC in large dynamic graphs. This paper explores the adoption of dynamic BC, i.e., BC computed on large-scale dynamic graphs modeling road networks and the related vehicular traffic, and proposes a fast algorithm for ahead monitoring of transportation networks by computing approximated BC values under time constraints. The experimental analysis proves that, with a bounded and tolerable approximation, the algorithm computes BC on very large dynamically weighted graphs in a significantly shorter time compared with exact computation. Moreover, since the proposed algorithm can be tuned for an ideal trade-off between performance and accuracy, our solution paves the way to quasi real-time monitoring of highly dynamic networks, providing anticipated information about possibly congested or vulnerable areas. Such knowledge can be exploited by travel assistance services or intelligent traffic control systems to perform informed re-routing and therefore enhance network resilience in smart cities., Competing Interests: The authors have declared that no competing interests exist.
- Published
- 2021
- Full Text
- View/download PDF
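Exact BC is classically computed with Brandes' accumulation, and a common way to trade accuracy for speed, in the spirit of the approximation discussed above, is to run it from a sampled subset of sources. The sketch below shows that generic idea on an invented unweighted toy graph; the paper's algorithm for dynamic weighted graphs is more elaborate.

```python
from collections import deque

def brandes_bc(adj, sources=None):
    """Brandes-style betweenness centrality on an unweighted graph.
    If `sources` is a subset of nodes, the result is a sampled estimate
    rescaled by n/len(sources) -- the speed/accuracy knob."""
    nodes = list(adj)
    if sources is None:
        sources = nodes
    bc = {v: 0.0 for v in nodes}
    for s in sources:
        # BFS builds the shortest-path DAG from s.
        dist = {s: 0}
        sigma = {v: 0 for v in nodes}; sigma[s] = 1   # path counts
        preds = {v: [] for v in nodes}
        order, q = [], deque([s])
        while q:
            v = q.popleft(); order.append(v)
            for w in adj[v]:
                if w not in dist:
                    dist[w] = dist[v] + 1; q.append(w)
                if dist[w] == dist[v] + 1:
                    sigma[w] += sigma[v]; preds[w].append(v)
        # Back-propagate dependencies along the DAG.
        delta = {v: 0.0 for v in nodes}
        for w in reversed(order):
            for v in preds[w]:
                delta[v] += sigma[v] / sigma[w] * (1 + delta[w])
            if w != s:
                bc[w] += delta[w]
    scale = len(nodes) / len(sources)
    return {v: bc[v] * scale for v in nodes}

# Tiny road-like network: two triangles bridged by the edge 2-3.
adj = {0: [1, 2], 1: [0, 2], 2: [0, 1, 3], 3: [2, 4, 5], 4: [3, 5], 5: [3, 4]}
exact = brandes_bc(adj)
approx = brandes_bc(adj, sources=[0, 3])   # cheaper sampled estimate
print(exact)
```

On this graph the bridge endpoints (nodes 2 and 3) score highest, which is exactly the "topological bottleneck" reading of BC used in the abstract.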
10. Mid-term redeployment of capacity of rescue centers considering the optimism of the decision maker.
- Author
-
Guillaume, Romain, Ben-Ammar, Oussama, and Thierry, Caroline
- Subjects
EMERGENCY vehicles ,OPTIMISM ,ALGORITHMS - Abstract
In this paper, we consider the problem of allocating Emergency Vehicles (EVs) to Rescue Centers (RCs). The objective is to improve the operational response of the Fire and Rescue Services (SDIS) in France. More precisely, we focus on mid-term management and on the process of redeploying EVs across RCs according to the evolution of requirements. At this level of decision, we do not have information on individual accidents; we only use the available information on the simultaneity of requirements. Based on the Hurwicz criterion, we develop a mathematical model and an iterative algorithm to solve it. The proposed approach takes into account both the uncertainty on the occurrence sequence of accidents and the decision maker's (DM's) attitude towards optimism. An illustration from the DM's point of view is presented. It shows that this uncertainty significantly impacts the deployment of EVs. Results show that the proposed approach has an efficient resolution time for real-size problems. [ABSTRACT FROM AUTHOR]
- Published
- 2021
- Full Text
- View/download PDF
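The Hurwicz criterion invoked above weighs the worst and best outcomes of each option by an optimism coefficient. The toy sketch below evaluates it on invented response-time costs of two hypothetical redeployment plans, just to show how the coefficient changes the chosen plan.

```python
def hurwicz(outcomes, alpha):
    """Hurwicz criterion for costs: alpha weights the worst case,
    (1 - alpha) the best case. alpha = 1 is pure pessimism, alpha = 0
    pure optimism."""
    return alpha * max(outcomes) + (1 - alpha) * min(outcomes)

# Invented response-time costs of two EV redeployment plans under three
# accident-simultaneity scenarios (illustrative only).
plans = {"keep": [10, 30, 50], "redeploy": [18, 22, 34]}

for a in (0.2, 0.8):
    best = min(plans, key=lambda p: hurwicz(plans[p], a))
    print(a, best)
```

An optimistic DM (alpha = 0.2) keeps the current allocation for its best-case cost of 10, while a pessimistic DM (alpha = 0.8) prefers the redeployment whose worst case is bounded at 34, mirroring the abstract's point that the DM's attitude changes the deployment.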
11. Robust-Extended Kalman Filter and Long Short-Term Memory Combination to Enhance the Quality of Single Point Positioning.
- Author
-
Tan, Truong-Ngoc, Khenchaf, Ali, Comblet, Fabrice, Franck, Pierre, Champeyroux, Jean-Marc, and Reichert, Olivier
- Subjects
ROBUST statistics ,MULTISENSOR data fusion ,LEAST squares ,ALGORITHMS ,STATISTICS ,KALMAN filtering ,DATA analysis - Abstract
In recent years, multi-constellation and multi-frequency signals have improved positioning precision in GNSS applications and significantly expanded the range of applications to new areas and services. However, the use of multiple signals presents advantages as well as disadvantages, since they may contain poor-quality signals that negatively impact the position precision. The objective of this study is to improve Single Point Positioning (SPP) accuracy using multi-GNSS data fusion. We propose the use of a robust Extended Kalman Filter (referred to as robust-EKF hereafter) to eliminate outliers. The robust-EKF used in the present work combines the Extended Kalman Filter with Iterative ReWeighted Least Squares (IRWLS) and Receiver Autonomous Integrity Monitoring (RAIM). The weight matrix in IRWLS is defined by the MM-estimation method, a robust statistics approach with a high breakdown point for more efficient statistical data analysis. The RAIM algorithm is used to check the accuracy of the user's protection zone. We apply the robust-EKF method to the robust combination of GPS, Galileo and GLONASS data from the ABMF base station, a GNSS reception station managed by Météo-France in Guadeloupe, which significantly improves the position accuracy by about 84% compared to the non-robust data combination. Although robust-EKF demonstrates improvement in the position accuracy, its outputs might contain errors that are difficult to estimate. Therefore, an algorithm that can predetermine the error produced by robust-EKF is needed. For this purpose, the long short-term memory (LSTM) method is proposed as an adapted deep-learning-based approach. In this paper, LSTM is considered as a de-noising filter, and the new method is proposed as a hybrid combination of robust-EKF and LSTM, denoted rEKF-LSTM.
The position precision greatly improves, by about 95%, compared to the non-robust combination of data from the ABMF base station. To assess the rEKF-LSTM method, data from other base stations are tested. The position precision is enhanced by about 87%, 77% and 93% using rEKF-LSTM compared to the non-robust combination of data from three other base stations in France, AJAC, GRAC and LMMF, respectively. [ABSTRACT FROM AUTHOR]
- Published
- 2020
- Full Text
- View/download PDF
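The IRWLS ingredient described above can be sketched compactly. The block below uses Huber weights with a MAD scale estimate on an invented regression with one gross outlier; the paper itself derives its weights from MM-estimation, so this is a simplified stand-in showing why reweighting defeats outliers.

```python
import numpy as np

def irwls(A, y, iters=20, k=1.345):
    """Iteratively ReWeighted Least Squares with Huber weights.
    Starts from ordinary least squares, then repeatedly downweights
    observations with large scaled residuals and re-solves."""
    x = np.linalg.lstsq(A, y, rcond=None)[0]
    for _ in range(iters):
        r = y - A @ x
        s = np.median(np.abs(r)) / 0.6745 + 1e-12   # robust scale (MAD)
        u = np.abs(r) / s
        w = np.where(u <= k, 1.0, k / u)            # Huber weight function
        W = np.diag(w)
        x = np.linalg.solve(A.T @ W @ A, A.T @ W @ y)
    return x

rng = np.random.default_rng(1)
A = np.column_stack([np.ones(30), np.arange(30.0)])
y = A @ np.array([2.0, 0.5]) + rng.normal(0, 0.1, 30)
y[3] += 50.0                                        # one gross outlier
x_ls = np.linalg.lstsq(A, y, rcond=None)[0]
x_rob = irwls(A, y)
print(x_ls, x_rob)  # robust fit stays near the true (2.0, 0.5)
```

In the paper this reweighting is applied inside the EKF measurement update, so a faulty GNSS pseudorange gets a small weight instead of corrupting the position solution.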
12. OPTIMAL DATES FOR DECIDUOUS TREE SPECIES MAPPING USING FULL-YEAR SENTINEL-2 TIME SERIES IN SOUTH WEST FRANCE.
- Author
-
Karasiak, N., Fauvel, M., Dejoux, J.-F., Monteil, C., and Sheeren, D.
- Subjects
DECIDUOUS plants ,TIME series analysis ,SUPPORT vector machines ,ALGORITHMS ,K-means clustering - Abstract
The free-to-use Sentinel-2 (S2) sensors, with a 5-day revisit time at high spatial resolution in 10 spectral bands, are a revolution in the remote sensing domain. Including 6 spectral bands in the near infrared, 3 of which are dedicated to the red edge (where vegetation reflectance increases significantly), these European satellites are very promising for mapping tree species distribution at a national scale. Here, we study the contribution of three one-year S2 Satellite Image Time Series (SITS) for mapping deciduous species distribution in the southwest of France. The annual cycle of vegetation (called phenology) can contribute to the identification of tree species: on some specific dates, species can have different phenological behaviours (senescence, flowering, ...). To train and validate the maps, we used the Support Vector Machine algorithm with a spatial cross-validation method. To train the algorithm with the same number of samples per species, we undersampled each class to the smallest class using a K-means clustering method. Moreover, a Sequential Feature Selection (SFS) has been implemented to detect the optimal dates per species. Our results are promising, with high accuracy for Red oak and Willow (average scores over the three one-year series of F1 = 0.99 and F1 = 0.94, respectively) based on the optimal dates. However, it appears that the performances when using each full SITS are far below those of the optimal-dates models (average ΔF1 = 0.32). Except for Willow and Red oak, we did not find that the optimal dates were the same for each year. A perspective is to find an algorithm robust to temporal or spectral noise and to smooth the time series. [ABSTRACT FROM AUTHOR]
- Published
- 2020
- Full Text
- View/download PDF
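The K-means undersampling step described above (shrinking every class to the size of the smallest one while keeping representative samples) can be sketched as follows. The clustering details are not given in the abstract, so this is one plausible reading: cluster the majority class into k groups and keep the sample nearest each centroid.

```python
import numpy as np

def kmeans_undersample(X, k, iters=10, seed=0):
    """Reduce class X (n_samples x n_features) to at most k representative
    samples: run a small K-means with k clusters, then keep the sample
    nearest each centroid."""
    rng = np.random.default_rng(seed)
    centroids = X[rng.choice(len(X), k, replace=False)]
    for _ in range(iters):
        d = np.linalg.norm(X[:, None] - centroids[None], axis=2)
        labels = d.argmin(axis=1)
        for j in range(k):
            if (labels == j).any():
                centroids[j] = X[labels == j].mean(axis=0)
    d = np.linalg.norm(X[:, None] - centroids[None], axis=2)
    return X[np.unique(d.argmin(axis=0))]   # nearest real sample per centroid

rng = np.random.default_rng(0)
majority = rng.normal(0, 1, size=(200, 4))   # e.g. 200 pixels, 4 bands
minority_size = 25                           # size of the smallest species
balanced = kmeans_undersample(majority, minority_size)
print(balanced.shape)
```

The balanced per-class samples would then feed the SVM, trained and evaluated with spatial cross-validation as in the paper.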
13. Using the snake optimization metaheuristic algorithms to extract the photovoltaic cells parameters.
- Author
-
Belabbes, Fatima, Cotfas, Daniel T., Cotfas, Petru A., and Medles, Mourad
- Subjects
- PHOTOVOLTAIC cells ,OPTIMIZATION algorithms ,METAHEURISTIC algorithms ,STANDARD deviations ,SNAKES ,AMORPHOUS silicon - Abstract
• The snake optimization algorithm (SOA) and the improved snake optimization algorithm (ISOA) are proposed to extract the parameters of photovoltaic cells. • The performance of SOA and ISOA is compared with that of the best algorithms from the specialized literature. • ISOA proved its performance against the algorithms considered for the three photovoltaic cells analyzed. Accurate extraction of the parameters of photovoltaic cells and panels is very important in order to precisely forecast their generated energy and rapidly identify the maximum power point; it matters both to manufacturers, for quality control during the production process, and to researchers, for improving quality and efficiency and for studying photovoltaic cell degradation. This paper is the first to propose the use of the Snake optimization metaheuristic algorithms to extract the parameters of three photovoltaic cells: commercial monocrystalline silicon, amorphous silicon, and RTC France. The Snake algorithms are used with the one-diode and two-diode models to extract five and seven parameters, respectively. The root mean square error statistical test has been performed to assess the algorithms' performance. The comparison with algorithms published in the specialized literature reveals that the Snake algorithm outperforms several others, but is not the best. An improved version is therefore developed, and the results show that it gives better results than, or at least the same as, the best ones. The improvement in root mean square error when the improved snake algorithm is used is 16% for the monocrystalline silicon photovoltaic cell with the two-diode model, 5.3% for the amorphous silicon photovoltaic cell with the two-diode model, and slightly better (from 0.002 to 0.11%) for the RTC France photovoltaic cell when the best algorithms are considered.
The Snake algorithms give very good results, especially the improved one, even for a small population size and a limited number of iterations. For the improved snake algorithm, the number of iterations and the population size are, respectively: 100 and 80 (one-diode model) and 200 and 40 (two-diode model) for the monocrystalline cell; 100 and 120 (one-diode model) and 120 and 120 (two-diode model) for the amorphous silicon cell; and 150 and 100 (one-diode model) and 150 and 80 (two-diode model) for the RTC cell. This reduces the computational time for the extraction of the photovoltaic cell parameters: the computational time of the improved snake algorithm is half that needed by one of the best algorithms, the hybrid successive discretization algorithm. Therefore, the snake optimization algorithm, and especially its improved version, is one of the best algorithms for photovoltaic cell parameter extraction. [ABSTRACT FROM AUTHOR]
- Published
- 2023
- Full Text
- View/download PDF
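The RMSE objective that the Snake algorithms minimize can be illustrated end to end on a deliberately simplified model. Below, an ideal single-diode cell (series and shunt resistances omitted so the current is explicit) is fitted to synthetic I-V data by plain random search; the search loop is only a stand-in for the snake optimizer, whose exploration/exploitation rules are beyond this sketch, and all parameter values are invented.

```python
import math
import random
random.seed(0)

Vt = 0.0257  # thermal voltage at ~25 C, in volts

def model(V, Iph, I0, n):
    """Ideal single-diode cell (no Rs/Rsh): I = Iph - I0*(exp(V/(n*Vt)) - 1)."""
    return Iph - I0 * (math.exp(V / (n * Vt)) - 1)

true = (0.76, 3e-7, 1.5)                      # hidden "ground truth"
data = [(v / 10, model(v / 10, *true)) for v in range(8)]  # V = 0.0 .. 0.7

def rmse(p):
    """The objective every metaheuristic in these papers minimizes."""
    return math.sqrt(sum((i - model(v, *p)) ** 2 for v, i in data) / len(data))

best = (0.5, 1e-6, 1.2)
best_err = rmse(best)
for _ in range(20000):                        # random search stand-in
    cand = (random.uniform(0, 1),             # Iph in amperes
            10 ** random.uniform(-8, -5),     # I0, log-uniform
            random.uniform(1, 2))             # ideality factor n
    err = rmse(cand)
    if err < best_err:
        best, best_err = cand, err
print(best_err)
```

Real extractors (SOA, ISOA, ASCASO, IAVOA, ...) replace the blind sampling with guided position updates, which is what lets them reach RMSE values around 1e-3 on the full five- and seven-parameter models.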
14. The Power Implications of the Shift to Customer Reviews: A field perspective on jobbing platforms operating in France.
- Author
-
Barbe, Anne-Sophie, Gond, Jean-Pascal, and Hussler, Caroline
- Subjects
CONSUMERS' reviews ,DIGITAL technology ,SMALL capitalization stocks ,POWER (Social sciences) ,DISTRIBUTORS (Commerce) - Abstract
Customer reviews are a new type of third-party evaluation that has transformed how power operates over evaluated producers, and in so doing has attracted scholarly attention. However, this literature rarely addresses the power relationships operating between the platforms that collect and aggregate these reviews. We address this blind spot by relying on Pierre Bourdieu's theory of fields, which we use to highlight that the reactions of producers to traditional third-party evaluations depend on, and reproduce, the domination of an evaluation intermediary over its competitors. Our qualitative study of the field of jobbing platforms in France reveals that producers react to customer reviews only on platforms accumulating relatively better stocks of reviews, in a self-reinforcing manner. Managers of platforms with smaller stocks of reviews resist by sheltering jobbers from reviews. Our study re-introduces field-level power dynamics between platforms to research exploring the forms of power that operate on producers subjected to reviews. It adds to studies of evaluation intermediaries by specifying that the accumulation of reviews underpins the power relationships between intermediaries in the customer review era, and by identifying sheltering as a new form of resistance. Finally, it updates Bourdieu's theory for the digital age by explaining that individuals' accumulation of capital online relates to inter-organisational power dynamics. [ABSTRACT FROM AUTHOR]
- Published
- 2023
- Full Text
- View/download PDF
15. A Method of Estimating Time-to-Recovery for a Disease Caused by a Contagious Pathogen Such as SARS-CoV-2 Using a Time Series of Aggregated Case Reports.
- Author
-
Koutsouris, Dimitrios-Dionysios, Pitoglou, Stavros, Anastasiou, Athanasios, and Koumpouros, Yiannis
- Subjects
DISEASE progression ,COMPUTER software ,COVID-19 ,CONFIDENCE intervals ,TIME ,CONVALESCENCE ,WORLD health ,EPIDEMICS ,TIME series analysis ,DESCRIPTIVE statistics ,SENSITIVITY & specificity (Statistics) ,PREDICTION models ,COVID-19 pandemic ,ALGORITHMS - Abstract
During the outbreak of a disease caused by a pathogen with unknown characteristics, the uncertainty of its progression parameters can be reduced by devising methods that, based on rational assumptions, exploit the available information to provide actionable insights. In this study, performed a few (~6) weeks into the outbreak of COVID-19 (caused by SARS-CoV-2), one of the most important disease parameters, the average time-to-recovery, was calculated using data publicly available on the internet (daily reported counts of confirmed infections, deaths, and recoveries), fed into an algorithm that matches confirmed cases with deaths and recoveries. Unmatched cases were adjusted based on the matched-case calculation. The mean time-to-recovery, calculated from all globally reported cases, was found to be 18.01 days (SD 3.31 days) for the matched cases and 18.29 days (SD 2.73 days) when the adjusted unmatched cases were also taken into consideration. The proposed method used limited data and provided results in the same range as clinical studies published several months later. This indicates that the proposed method, combined with expert knowledge and informed, calculated assumptions, can provide a meaningful average time-to-recovery figure, usable as an evidence-based estimate to support containment and mitigation policy decisions even at the very early stages of an outbreak. [ABSTRACT FROM AUTHOR]
- Published
- 2023
- Full Text
- View/download PDF
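The matching of confirmed cases to later recoveries described above can be sketched with a FIFO pairing: each reported recovery is attributed to the oldest still-open case, and the average day gap is the time-to-recovery estimate. This is a simplified reading of the paper's idea, not its exact algorithm (which also handles deaths and adjusts unmatched cases).

```python
from collections import deque
from statistics import mean

def mean_time_to_recovery(daily_cases, daily_recoveries):
    """FIFO matching of recoveries to earlier confirmed cases.
    Takes two aligned daily-count series; returns the mean gap in days
    between a case's confirmation and its matched recovery."""
    open_cases = deque()   # confirmation day of each not-yet-recovered case
    gaps = []
    for day, (c, r) in enumerate(zip(daily_cases, daily_recoveries)):
        open_cases.extend([day] * c)        # new confirmations open cases
        for _ in range(r):                  # recoveries close oldest cases
            if open_cases:
                gaps.append(day - open_cases.popleft())
    return mean(gaps) if gaps else None

# Synthetic series where every case recovers exactly 3 days later.
cases      = [5, 2, 0, 0, 0, 0, 0]
recoveries = [0, 0, 0, 5, 2, 0, 0]
print(mean_time_to_recovery(cases, recoveries))  # 3.0
```

Run on aggregate national or global series, such a matcher yields the kind of early estimate (here ~18 days for COVID-19) that the abstract reports.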
16. A twofold hunting trip African vultures algorithm for the optimal extraction of photovoltaic generator model parameters.
- Author
-
Belmadani, Hamza, Kheldoun, Aissa, Bradai, Rafik, Mekhilef, Saad, and Siddique, Marif Daula
- Subjects
STANDARD deviations ,VULTURES ,PHOTOVOLTAIC power systems ,SEARCH engines ,ALGORITHMS ,MATHEMATICAL optimization - Abstract
The development of reliable simulators that finely imitate the behavior of PV devices is vitally important for the design and optimization of efficient and stable photovoltaic systems. In this work, an improved variant of the African Vultures Optimization Algorithm, named IAVOA, is designed to serve as a powerful tool for extracting the unknown parameters of photovoltaic models. The introduced scheme incorporates a twofold strategy that allows a portion of the search agents to conduct a global search while the remaining portion performs a local search. The embedded mechanism is based on two equations added to the standard version, by which the exploration and exploitation capabilities of the algorithm are significantly fostered. To verify the performance of the IAVOA, a comparative study based on the Root Mean Square Error (RMSE) was conducted on six distinct benchmark PV models, and the obtained results were, in most cases, remarkably superior to those achieved by its competitors. The algorithm was able to produce values for the ideality factors that, to the best of our knowledge, had not previously been found by any existing work. In turn, the accuracies of the Double Diode and Triple Diode models were notably improved, with RMSE scores of 6.9096 × 10⁻⁴ and 7.4011 × 10⁻⁴ respectively for the RTC France cell, and 1.4251 × 10⁻² for the STP6-120/36 module, outperforming the existing techniques. In light of that, it can reliably be presumed that the IAVOA is indeed a promising algorithm for the electrical characterization of PV devices. [ABSTRACT FROM AUTHOR]
- Published
- 2022
- Full Text
- View/download PDF
17. Use of Artificial Intelligence algorithms for hodoscope measurement interpretations.
- Author
-
Mirotta, S., Querre, P., Baccou, J., Gerbaud, A., and Gerbaud, T.
- Subjects
- *
ARTIFICIAL intelligence , *NUCLEAR industry , *MACHINE learning , *ALGORITHMS , *RESEARCH reactors - Abstract
Artificial Intelligence (AI) algorithms have shown their capability to complement human analysis in the understanding of complex phenomena. Their advantages essentially rely on the flexibility of their construction, which allows capturing complex relationships between the inputs and outputs of a problem, and on their ability to adapt to the available information via a learning phase. As a result, AI is widely used in many scientific fields, especially those related to the nuclear industry. This work deals with an application of AI, based on several classical machine learning approaches, to the study of Reactivity Initiated Accidents (RIA) in the CABRI experimental pulse reactor located at the Cadarache research center in southern France. We focus on the interpretation of the fuel rod behaviour during the power pulse using the online fuel motion monitoring system called the hodoscope. The objective of this paper is to investigate how AI algorithms can be used for the automatic detection of generic fuel delocalization from the signals recorded by the hodoscope. [ABSTRACT FROM AUTHOR]
- Published
- 2021
- Full Text
- View/download PDF
18. A Square-Root, Dual-Resolution 3DEnVar for the AROME Model: Formulation and Evaluation on a Summertime Convective Period.
- Author
-
Michel, Yann and Brousseau, Pierre
- Subjects
SURFACE pressure ,NUMERICAL weather forecasting ,THUNDERSTORMS ,ALGORITHMS - Abstract
A three-dimensional ensemble-variational (3DEnVar) data assimilation algorithm has been developed for the high-resolution AROME NWP system. Building on previous work on 3DEnVar for AROME, we describe a formulation of the 3DEnVar that is based on the traditional square-root preconditioning. The localization may be performed either in gridpoint or spectral space, and allows for cross-covariances between surface pressure and the three-dimensional variables. The scheme has capacity for dual resolution, with the ensemble running at a lower 3.2-km spatial resolution than the deterministic AROME running at 1.3 km. This formulation is compatible with the variational bias correction scheme used in AROME. Hybrid covariances are implemented with climatological covariances at 1.3-km resolution being combined with ensemble perturbations that are interpolated to high resolution on the fly in the computation of the gradient. Hybrid 3DEnVar has an increased computational cost compared to 3DVar, which is partly mitigated by the use of dual resolution and the adoption of a flexible convergence criterion in the minimization. To get the full benefit from the ensemble scheme, it is recommended 1) to increase ensemble size from 25 to 50 members and 2) to decrease the localization length scale for the benefit of high-density radar observations. With those changes, the 3DEnVar outperforms the operational AROME-France 3DVar by a significant margin on the first 12 h of forecast range, as evidenced by a 3-month summer experiment. Finally, a case study reports on the improved prediction of heavy rainfall that frequently occurs in the Mediterranean region. [ABSTRACT FROM AUTHOR]
- Published
- 2021
- Full Text
- View/download PDF
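The hybrid covariances described above blend a climatological matrix with a localized ensemble estimate. The toy sketch below shows that combination for a tiny state vector; the weights, localization matrix and (Ne − 1)-normalized sample covariance are generic assumptions for illustration, not the AROME implementation.

```python
def ensemble_covariance(perturbations):
    # Sample covariance from mean-removed ensemble perturbations
    # (list of members, each a list over state dimensions), with the
    # usual (Ne - 1) normalization.
    ne = len(perturbations)
    n = len(perturbations[0])
    cov = [[0.0] * n for _ in range(n)]
    for member in perturbations:
        for i in range(n):
            for j in range(n):
                cov[i][j] += member[i] * member[j] / (ne - 1)
    return cov

def hybrid_covariance(b_clim, perturbations, loc, beta_c=0.5, beta_e=0.5):
    # B_hybrid = beta_c * B_clim + beta_e * (L o Pe), where "o" is the
    # element-wise (Schur) product with a localization matrix L that
    # damps spurious long-range ensemble correlations.
    pe = ensemble_covariance(perturbations)
    n = len(b_clim)
    return [[beta_c * b_clim[i][j] + beta_e * loc[i][j] * pe[i][j]
             for j in range(n)] for i in range(n)]
```

In an operational EnVar the Schur product is never formed explicitly; it enters through the control-variable transform, but the weighted-blend structure is the same.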
19. Diagnostic performance of thromboelastometry in trauma-induced coagulopathy: a comparison between two level I trauma centres using two different devices.
- Author
-
Bouzat, Pierre, Guerin, Romain, Boussat, Bastien, Nicolas, Jérôme, Lambert, Aline, Greze, Jules, Maegele, Marc, and David, Jean-Stéphane
- Subjects
BIOLOGICAL assay equipment ,INJURY complications ,REPORTING of diseases ,PROTHROMBIN time ,ACQUISITION of data methodology ,ACADEMIC medical centers ,TRAUMA centers ,THROMBELASTOGRAPHY ,RETROSPECTIVE studies ,PATIENTS ,COMPARATIVE studies ,BLOOD coagulation disorders ,MEDICAL records ,EMERGENCY medical services ,FIBRINOGEN ,RECEIVER operating characteristic curves ,BLOOD coagulation factors ,ALGORITHMS - Abstract
Purpose: The implementation of a ROTEM®-based algorithm requires reliable thresholds to mirror a prothrombin time (PT) ratio > 1.2 and/or a fibrinogen concentration < 1.5 g l⁻¹. Our goal was to compare the diagnostic performance of two devices (ROTEM® Sigma and Delta, IL Werfen, Munich, Germany) in two level-I trauma centres for the diagnosis of post-traumatic coagulopathy. Methods: We conducted a retrospective analysis of two registries across two periods of time: from September 2014 to December 2015 in the Lyon-Sud university trauma centre, and from April 2016 to January 2018 in the Grenoble Alps Trauma Centre. The accuracy of EXTEM and FIBTEM assays in detecting patients with coagulation disorders was tested for each device using receiver operating characteristic (ROC) analyses. Results: Within the study period, 74 trauma patients in the Grenoble cohort and 75 trauma patients in the Lyon cohort had concomitant ROTEM® and standard coagulation testing on admission. No statistically significant difference was found between the two ROC curves for FIBTEM amplitude at 5 min (A5), FIBTEM maximum clot firmness, EXTEM clotting time (CT) and EXTEM A5 for ROTEM® Sigma and Delta to diagnose post-traumatic coagulation disorders. The best threshold for FIBTEM A5 to predict low fibrinogen concentration was 7 mm for each device. EXTEM CT thresholds to diagnose a PT ratio > 1.2 were 78 s and 74 s for ROTEM® Sigma and Delta, respectively. Conclusions: These results suggest that ROTEM®-based algorithms may be transposed from one trauma centre to another independently of the setting and the ROTEM® device in use. [ABSTRACT FROM AUTHOR]
- Published
- 2021
- Full Text
- View/download PDF
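Thresholds like the EXTEM CT cut-offs above come out of ROC analysis. One common way to pick a cut-off, assumed here for illustration (the study does not specify its criterion), is to maximize Youden's J over candidate thresholds:

```python
def youden_threshold(scores, labels):
    # Pick the cut-off maximizing Youden's J = sensitivity + specificity - 1.
    # scores: assay values (e.g. EXTEM CT in seconds); labels: 1 = disorder.
    # A higher score is treated as "test positive".
    best_t, best_j = None, -1.0
    for t in sorted(set(scores)):
        tp = sum(1 for s, y in zip(scores, labels) if s >= t and y == 1)
        fn = sum(1 for s, y in zip(scores, labels) if s < t and y == 1)
        fp = sum(1 for s, y in zip(scores, labels) if s >= t and y == 0)
        tn = sum(1 for s, y in zip(scores, labels) if s < t and y == 0)
        sens = tp / (tp + fn) if tp + fn else 0.0
        spec = tn / (tn + fp) if tn + fp else 0.0
        j = sens + spec - 1.0
        if j > best_j:
            best_t, best_j = t, j
    return best_t, best_j
```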
20. Effect of under triage on early mortality after major pediatric trauma: a registry-based propensity score matching analysis.
- Author
-
Ageron, François-Xavier, Porteaud, Jordan, Evain, Jean-Noël, Millet, Anne, Greze, Jules, Vallot, Cécile, Levrat, Albrice, Mortamet, Guillaume, and Bouzat, Pierre
- Subjects
INJURY complications ,ALGORITHMS ,CHILD mortality ,CONFIDENCE intervals ,LONGITUDINAL method ,MAPS ,MEDICAL protocols ,PEDIATRICS ,MEDICAL triage ,DESCRIPTIVE statistics ,CHILDREN - Abstract
Background: Little is known about the effect of under triage on early mortality in trauma in a pediatric population. Our objective is to describe the effect of under triage on 24-h mortality after major pediatric trauma in a regional trauma system. Methods: This cohort study was conducted from January 2009 to December 2017. Data were obtained from the registry of the Northern French Alps Trauma System. The network's guidelines triage pediatric trauma patients according to an algorithm shared with adult patients. Under triage was defined as the number of pediatric trauma patients requiring specialized trauma care who were transported to a non-level I pediatric trauma center, divided by the total number of injured patients with critical resource use. The effect of under triage on 24-h mortality was assessed with inverse probability treatment weighting (IPTW) and a propensity score (Ps) matching analysis. Results: A total of 1143 pediatric patients were included (mean [SD] age, 10 [5] years), mainly after blunt trauma (1130 [99%]). Of the children, 402 (35%) had an ISS higher than 15 and 547 (48%) required specialized trauma care. Nineteen (1.7%) patients died within 24 h. The under triage rate was 33% based on the need for specialized trauma care. Under triage of children requiring specialized trauma care increased the risk of death in both the IPTW (risk difference 6.0 [95% CI 1.3–10.7]) and Ps matching analyses (risk difference 3.1 [95% CI 0.8–5.4]). Conclusions: In a regional inclusive trauma system, under triage increased the risk of early death after major pediatric trauma. [ABSTRACT FROM AUTHOR]
- Published
- 2021
- Full Text
- View/download PDF
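The IPTW estimator mentioned above weights each patient by the inverse probability of the exposure actually received, so the weighted groups mimic a population in which exposure is independent of the measured covariates. A minimal sketch with hypothetical propensity scores (not the study's fitted model):

```python
def iptw_risk_difference(records, eps=1e-6):
    # records: (exposed, outcome, propensity) triples, where propensity is
    # the estimated probability of exposure (here: of being under-triaged).
    # Weights: 1/p for the exposed, 1/(1-p) for the unexposed.
    w_exp = w_unexp = r_exp = r_unexp = 0.0
    for exposed, died, p in records:
        if exposed:
            w = 1.0 / max(p, eps)
            w_exp += w
            r_exp += w * died
        else:
            w = 1.0 / max(1.0 - p, eps)
            w_unexp += w
            r_unexp += w * died
    # Weighted risk in the exposed minus weighted risk in the unexposed.
    return r_exp / w_exp - r_unexp / w_unexp
```

With all propensities at 0.5 the estimator reduces to the crude risk difference, which is a useful sanity check.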
21. Reduction of recruitment costs in preclinical AD trials: validation of automatic pre-screening algorithm for brain amyloidosis.
- Author
-
Ansart, Manon, Epelbaum, Stéphane, Gagliardi, Geoffroy, Colliot, Olivier, Dormont, Didier, Dubois, Bruno, Hampel, Harald, Durrleman, Stanley, and Alzheimer’s Disease Neuroimaging Initiative* and the INSIGHT-preAD study
- Subjects
COST control ,CLASSIFICATION algorithms ,CLINICAL trials ,DISEASE progression ,RESEARCH ,ALZHEIMER'S disease ,AMYLOIDOSIS ,PATIENT selection ,RESEARCH methodology ,MEDICAL screening ,MEDICAL cooperation ,EVALUATION research ,COMPARATIVE studies ,STATISTICAL models ,ALGORITHMS ,LONGITUDINAL method - Abstract
We propose a method for recruiting asymptomatic amyloid-positive individuals in clinical trials, using a two-step process. During a pre-screening phase, we first select a subset of individuals who are more likely to be amyloid positive, based on the automatic analysis of data acquired during routine clinical practice, before performing a confirmatory PET scan on these selected individuals only. This method leads to an increased number of recruitments and a reduced number of PET scans, resulting in a decrease in overall recruitment costs. We validate our method on three different cohorts and consider five different classification algorithms for the pre-screening phase. We show that the best results are obtained using solely cognitive, genetic and socio-demographic features, as the slight performance increase when using MRI or longitudinal data is outweighed by the cost increase they induce. We show that the proposed method generalizes well when tested on an independent cohort, and that the characteristics of the selected set of individuals are identical to those of a population selected in a standard way. The proposed approach shows how machine learning can be used effectively in practice to optimize recruitment costs in clinical trials. [ABSTRACT FROM AUTHOR]
- Published
- 2020
- Full Text
- View/download PDF
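The cost argument behind the two-step recruitment can be made concrete with a back-of-the-envelope cost-per-recruit comparison. All counts, rates and prices below are hypothetical, chosen only to show the mechanism:

```python
def screening_costs(n_candidates, select_rate, ppv, base_rate,
                    c_prescreen, c_pet):
    # Two-step recruitment: cheap pre-screen on everyone, confirmatory
    # PET scan only for the selected subset; PPV is the amyloid-positive
    # fraction among the selected.
    n_selected = n_candidates * select_rate
    n_recruited_two_step = n_selected * ppv
    cost_two_step = n_candidates * c_prescreen + n_selected * c_pet
    # One-step baseline: PET-scan every candidate; base_rate is the
    # amyloid-positive fraction in the whole candidate pool.
    n_recruited_one_step = n_candidates * base_rate
    cost_one_step = n_candidates * c_pet
    return (cost_two_step / n_recruited_two_step,
            cost_one_step / n_recruited_one_step)

# e.g. 1000 candidates, 40% selected with PPV 0.6, against a 30% base
# rate, pre-screening at 10 per person and PET at 1000 per scan
two_step, one_step = screening_costs(1000, 0.4, 0.6, 0.3, 10.0, 1000.0)
```

As long as the pre-screen enriches the selected pool (PPV above the base rate) and costs much less than a PET scan, the two-step cost per recruit is lower.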
22. Design of shared unit-dose drug distribution network using multi-level particle swarm optimization.
- Author
-
Chen, Linjie, Monteiro, Thibaud, Wang, Tao, and Marcon, Eric
- Subjects
HOSPITAL drug distribution systems ,PARTICLE swarm optimization ,LOCATION problems (Programming) ,PATIENT safety ,OPERATING costs ,ALGORITHMS ,AUTOMATION ,DRUG delivery systems ,HOSPITAL pharmacies ,MATHEMATICAL models ,THEORY - Abstract
Unit-dose drug distribution systems provide optimal choices in terms of medication security and efficiency for organizing the drug-use process in large hospitals. As small hospitals have to share such automatic systems for economic reasons, the structure of their logistic organization becomes a very sensitive issue. In the research reported here, we develop a generalized multi-level optimization method, multi-level particle swarm optimization (MLPSO), to design a shared unit-dose drug distribution network. Structurally, the problem studied can be considered a type of capacitated location-routing problem (CLRP) with new constraints related to specific production planning. This kind of problem implies that a multi-level optimization should be performed in order to minimize logistic operating costs. Our results show that the proposed algorithm yields a more suitable modeling framework, as well as computational time savings and better optimization performance, than those reported in the literature on this subject. [ABSTRACT FROM AUTHOR]
- Published
- 2019
- Full Text
- View/download PDF
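MLPSO builds on standard particle swarm optimization. A minimal single-level PSO sketch (not the authors' multi-level variant, and with generic inertia/attraction coefficients) illustrates the velocity and position update it generalizes:

```python
import random

def pso(objective, bounds, n_particles=20, iters=100, w=0.7, c1=1.5, c2=1.5):
    # Minimal particle swarm optimizer: each velocity blends inertia, the
    # pull toward the particle's own best position, and the pull toward
    # the swarm-wide best position.
    dim = len(bounds)
    pos = [[random.uniform(lo, hi) for lo, hi in bounds]
           for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]
    pbest_val = [objective(p) for p in pos]
    g = min(range(n_particles), key=lambda k: pbest_val[k])
    gbest, gbest_val = pbest[g][:], pbest_val[g]
    for _ in range(iters):
        for k in range(n_particles):
            for d in range(dim):
                r1, r2 = random.random(), random.random()
                vel[k][d] = (w * vel[k][d]
                             + c1 * r1 * (pbest[k][d] - pos[k][d])
                             + c2 * r2 * (gbest[d] - pos[k][d]))
                lo, hi = bounds[d]
                pos[k][d] = min(max(pos[k][d] + vel[k][d], lo), hi)
            val = objective(pos[k])
            if val < pbest_val[k]:
                pbest[k], pbest_val[k] = pos[k][:], val
                if val < gbest_val:
                    gbest, gbest_val = pos[k][:], val
    return gbest, gbest_val
```

In a multi-level scheme such as MLPSO, an outer swarm would search facility-location decisions while inner swarms optimize routing and planning for each candidate configuration.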
23. Modeling the risk of water pollution by pesticides from imbalanced data.
- Author
-
Trajanov, Aneta, Kuzmanovski, Vladimir, Real, Benoit, Perreau, Jonathan Marks, Džeroski, Sašo, and Debeljak, Marko
- Subjects
WATER pollution ,PESTICIDE pollution ,PARAMETERIZATION ,DATA mining ,RISK assessment of water pollution ,PESTICIDE use regulations ,EMPIRICAL research ,ALGORITHMS - Abstract
The pollution of ground and surface waters with pesticides is a serious ecological issue that requires adequate treatment. Most of the existing water pollution models are mechanistic mathematical models. While they have made a significant contribution to understanding the transfer processes, they face the problem of validation because of their complexity, the user subjectivity in their parameterization, and the lack of empirical data for validation. In addition, the data describing water pollution with pesticides are, in most cases, very imbalanced. This is due to strict regulations for pesticide applications, which lead to only a few pollution events. In this study, we propose the use of data mining to build models for assessing the risk of water pollution by pesticides in field-drained outflow water. Unlike the mechanistic models, the models generated by data mining are based on easily obtainable empirical data, while the parameterization of the models is not influenced by the subjectivity of ecological modelers. We used empirical data from field trials at the La Jaillière experimental site in France and applied the random forests algorithm to build predictive models that predict “risky” and “not-risky” pesticide application events. To address the problems of the imbalanced classes in the data, cost-sensitive learning and different measures of predictive performance were used. Despite the high imbalance between risky and not-risky application events, we managed to build predictive models that make reliable predictions. The proposed modeling approach can be easily applied to other ecological modeling problems where we encounter empirical data with highly imbalanced classes. [ABSTRACT FROM AUTHOR]
- Published
- 2018
- Full Text
- View/download PDF
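The cost-sensitive learning and imbalance-aware performance measures used above can be sketched generically; the cost values and the metric choices below are illustrations, not the study's exact configuration. With few "risky" events, accuracy is misleading, and a misclassification-cost matrix shifts the decision threshold toward flagging rare events:

```python
def cost_sensitive_label(p_risky, cost_fn, cost_fp):
    # Predict "risky" (1) when the expected cost of missing a risky event
    # exceeds the expected cost of a false alarm: p*C_fn > (1-p)*C_fp.
    return 1 if p_risky * cost_fn > (1.0 - p_risky) * cost_fp else 0

def imbalance_metrics(y_true, y_pred):
    # Precision, recall and F1 on the rare positive ("risky") class;
    # unlike accuracy, these do not reward always predicting "not-risky".
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    f1 = (2 * precision * recall / (precision + recall)
          if precision + recall else 0.0)
    return precision, recall, f1
```

With a false-negative cost ten times the false-alarm cost, an event with only a 10% predicted risk is already labeled "risky".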
24. Agricultural practices in grasslands detected by spatial remote sensing.
- Author
-
Dusseux, Pauline, Vertès, Françoise, Corpetti, Thomas, Corgne, Samuel, and Hubert-Moy, Laurence
- Subjects
GRASSLAND management ,REMOTE sensing ,ALGORITHMS ,REMOTE-sensing images - Abstract
The major decrease in grassland surfaces, associated with changes in their management, that has been observed in many regions of the earth during the last half century has major impacts on environmental and socio-economic systems. This study focuses on the identification of grassland management practices in an intensive agricultural watershed located in Brittany, France, by analyzing the intra-annual dynamics of the surface condition of vegetation using remotely sensed and field data. We studied the relationship between one vegetation index (NDVI) and two biophysical variables (LAI and fCOVER) derived from a series of three SPOT images on the one hand, and measurements collected during field campaigns carried out on 120 grasslands on the other. The results show that LAI is the best predictor for monitoring grassland mowing and grazing. Indeed, because of its ability to characterize vegetation status, LAI estimated from remote sensing data is a relevant variable for identifying these practices. LAI values derived from the SPOT images were then classified with the K-Nearest Neighbor (KNN) supervised algorithm. The results point out that the distribution of grassland management practices such as grazing and mowing can be mapped very accurately (Kappa index = 0.82) at field scale over large agricultural areas using a series of satellite images. [ABSTRACT FROM AUTHOR]
- Published
- 2014
- Full Text
- View/download PDF
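The KNN classification and Kappa agreement used above can be sketched in a few lines; the toy LAI values and class labels below are invented for illustration:

```python
from collections import Counter

def knn_predict(train, query, k=3):
    # train: (feature_vector, label) pairs; classify the query by majority
    # vote among its k nearest training samples (squared Euclidean distance).
    nearest = sorted(train, key=lambda fl: sum((a - b) ** 2
                                               for a, b in zip(fl[0], query)))
    votes = Counter(label for _, label in nearest[:k])
    return votes.most_common(1)[0][0]

def cohen_kappa(y_true, y_pred):
    # Agreement corrected for chance: kappa = (po - pe) / (1 - pe),
    # where po is observed agreement and pe is the agreement expected
    # from the marginal label frequencies.
    n = len(y_true)
    po = sum(1 for t, p in zip(y_true, y_pred) if t == p) / n
    labels = set(y_true) | set(y_pred)
    pe = sum((y_true.count(c) / n) * (y_pred.count(c) / n) for c in labels)
    return (po - pe) / (1.0 - pe)
```

A Kappa of 0.82, as reported in the abstract, indicates agreement far above what the class frequencies alone would produce.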
25. Evaluation of Atmospheric Correction Algorithms for Sentinel-2-MSI and Sentinel-3-OLCI in Highly Turbid Estuarine Waters.
- Author
-
Renosh, Pannimpullath Remanan, Doxaran, David, Keukelaere, Liesbeth De, and Gossn, Juan Ignacio
- Subjects
PARTICULATE matter ,ALGORITHMS ,OCEAN color ,DATA recorders & recording ,NEAR infrared radiation - Abstract
The present study assesses the performance of state-of-the-art atmospheric correction (AC) algorithms applied to Sentinel-2-MultiSpectral Instrument (S2-MSI) and Sentinel-3-Ocean and Land Color Instrument (S3-OLCI) data recorded over moderately to highly turbid estuarine waters, considering the Gironde Estuary (SW France) as a test site. Three spectral bands of water-leaving reflectance (ρw) are considered: green (560 nm), red (655 or 665 nm) and near infrared (NIR) (865 nm), required to retrieve the suspended particulate matter (SPM) concentrations in clear to highly turbid waters (SPM ranging from 1 to 2000 mg/L). A previous study satisfactorily validated the Acolite short wave infrared (SWIR) AC algorithm for Landsat-8-Operational Land Imager (L8-OLI) in turbid estuarine waters. The latest version of Acolite Dark Spectrum Fitting (DSF) is tested here and shows very good agreement with Acolite SWIR for OLI data. L8-OLI satellite data corrected for atmospheric effects using Acolite DSF are then used as a reference to assess the validity of atmospheric corrections applied to other satellite data recorded over the same test site with a minimum time difference. Acolite DSF and iCOR (image correction for atmospheric effects) are identified as the best performing AC algorithms among the tested AC algorithms (Acolite DSF, iCOR, Polymer and C2RCC (case 2 regional coast color)) for S2-MSI. Then, the validity of six different AC algorithms (OLCI Baseline Atmospheric Correction (BAC), iCOR, Polymer, Baseline residual (BLR), C2RCC-V1 and C2RCC-V2) applied to OLCI satellite data is assessed based on comparisons with OLI and/or MSI Acolite DSF products recorded on a same day with a minimum time lag. Results show that all the AC algorithms tend to underestimate ρw in green, red and NIR bands, except iCOR in green and red bands.
iCOR shows the smallest differences in the green (slope = 1.0 ± 0.15, BIAS = 1.9 ± 4.5% and mean absolute percentage error (MAPE) = 12 ± 5%) and red (slope = 1.0 ± 0.17, BIAS = −9.8 ± 9% and MAPE = 28 ± 20%) bands with Acolite DSF products from OLI and MSI data. For the NIR band, BAC shows the smallest differences (slope = 0.7 ± 0.13, BIAS = −33 ± 17% and MAPE = 55 ± 20%) with Acolite DSF products from OLI and MSI data. These results based on comparisons between almost simultaneous satellite products are supported by match-ups between satellite-derived and field-measured SPM concentrations provided by automated turbidity stations. Further validation of satellite products based on rigorous match-ups with in-situ ρw measurements is still required in highly turbid waters. [ABSTRACT FROM AUTHOR]
- Published
- 2020
- Full Text
- View/download PDF
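The slope/BIAS/MAPE statistics used above to compare AC products can be computed as below. The exact definitions (a regression forced through the origin, and relative percentage errors with respect to the reference product) are assumptions based on common practice in ocean-colour validation, not necessarily the authors' formulas:

```python
def comparison_stats(reference, estimate):
    # slope: least-squares regression forced through the origin,
    #        slope = sum(r*e) / sum(r*r);
    # BIAS:  mean relative difference, in percent;
    # MAPE:  mean absolute relative difference, in percent.
    slope = (sum(r * e for r, e in zip(reference, estimate))
             / sum(r * r for r in reference))
    n = len(reference)
    bias = 100.0 * sum((e - r) / r for r, e in zip(reference, estimate)) / n
    mape = 100.0 * sum(abs(e - r) / r for r, e in zip(reference, estimate)) / n
    return slope, bias, mape
```

A slope near 1 with small BIAS and MAPE, as reported for iCOR in the green band, indicates the candidate product tracks the reference reflectances closely.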