660 results for "DATA ACQUISITION"
Search Results
2. A scalable data acquisition system for the efficient processing of DNS network traffic
- Author
- Ochab, Marcin; Mrukowicz, Marcin; Sarzyński, Jaromir; and Rzasa, Wojciech
- Published
- 2024
- Full Text
- View/download PDF
3. Vehicle type recognition: a case study of MobileNetV2 for an image classification task
- Author
- Kobiela, Dariusz; Groth, Jan; Hajdasz, Michał; and Erezman, Mateusz
- Published
- 2024
4. Data Acquisition Framework for spatio-temporal analysis of path-based welding applications
- Author
- Safronov, Georgij; Theisinger, Heiko; Sahlbach, Vasco; Braun, Christoph; Molzer, Andreas; Thies, Anabelle; Schuba, Christian; Shirazi, Majid; Reindl, Thomas; Hänel, Albrecht; Engelhardt, Philipp; Ihlenfeldt, Steffen; and Mayr, Peter
- Published
- 2024
5. Framework for the Classification of Real-time Locating System (RTLS) Use Cases in Matrix Production Systems
- Author
- Berkhan, Patricia; Kärcher, Susann; and Bauernhansl, Thomas
- Published
- 2024
6. Development of a Multi-layered Quality Assurance Framework for Manual Assembly Processes in the Aviation Industry
- Author
- Bartsch, Devis; Borck, Christian; Behm, Martin; and Böhnke, Jacob
- Published
- 2024
7. OR-LIM: Observability-aware robust LiDAR-inertial-mapping under high dynamic sensor motion.
- Author
- Cong, Yangzi; Chen, Chi; Yang, Bisheng; Zhong, Ruofei; Sun, Shangzhe; Xu, Yuhang; Yan, Zhengfei; Zou, Xianghong; and Tu, Zhigang
- Subjects
- MOTION detectors, OPTICAL radar, LIDAR, REMOTE sensing, DATA acquisition systems
- Abstract
Light Detection And Ranging (LiDAR) technology has provided an impactful way to capture 3D data. However, consistent mapping in sensing-degenerated and perceptually limited scenes (e.g. multi-story buildings) or under high dynamic sensor motion (e.g. a rotating platform) remains a significant challenge. In this paper, we present OR-LIM, a novel observability-aware LiDAR-inertial-mapping system. Essentially, it combines a robust real-time LiDAR-inertial-odometry (LIO) module with an efficient surfel-map-smoothing (SMS) module that seamlessly optimizes the sensor poses and the scene geometry at the same time. To improve robustness, planar surfels are hierarchically generated and grown from point cloud maps to provide reliable correspondences for fixed-lag optimization. Moreover, the normals of the surfels are analyzed to evaluate the observability of each frame. To maintain global consistency, a factor graph integrates information from IMU propagation, the LIO, and the SMS. The system is extensively tested on datasets collected by a low-cost multi-beam LiDAR (MBL) mounted on a rotating platform. Experiments with various sensor-motion settings, conducted in complex multi-story buildings and large-scale outdoor scenes, demonstrate the superior performance of our system over multiple state-of-the-art methods. Point accuracy improves by 3.39–13.6% (8.71% on average) outdoors and by 1.89–15.88% (9.09% on average) indoors, with reference to the collected Terrestrial Laser Scanning (TLS) map. [ABSTRACT FROM AUTHOR]
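The surfel generation and normal-based observability check described in the abstract can be sketched in a few lines. The function names and the specific planarity and observability measures below are illustrative assumptions, not the paper's exact formulation:

```python
import numpy as np

def fit_surfel(points):
    """Fit a planar surfel to an (N, 3) point array via PCA.

    Returns the centroid, the unit normal (eigenvector of the smallest
    covariance eigenvalue), and a planarity score in [0, 1].
    """
    centroid = points.mean(axis=0)
    cov = np.cov((points - centroid).T)
    eigvals, eigvecs = np.linalg.eigh(cov)  # eigenvalues ascending
    normal = eigvecs[:, 0]
    planarity = (eigvals[1] - eigvals[0]) / max(eigvals[2], 1e-12)
    return centroid, normal, planarity

def observability_score(normals):
    """Score how well a set of surfel normals constrains translation.

    Uses the smallest eigenvalue of sum(n n^T): near zero means some
    translation direction is unconstrained (a degenerate scene).
    """
    H = sum(np.outer(n, n) for n in normals)
    return float(np.linalg.eigvalsh(H)[0])
```

In a degenerate scene, e.g. a long corridor where all visible surfels share one normal, the smallest eigenvalue collapses to zero, flagging an unobservable direction.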
- Published
- 2024
8. Intelligent Control System for Wood Drying: Scalable Architecture, Predictive Analytics, and Future Enhancements.
- Author
- Martins, Pedro; Cláudio, Ricardo; Soares, Francisco; Leitão, Jorge; Váz, Paulo; Silva, José; and Abbasi, Maryam
- Subjects
- INTELLIGENT control systems, LUMBER drying, WEATHER forecasting, MASTER'S degree, RASPBERRY Pi, LINEAR network coding
- Abstract
This article explores the research and development undertaken as part of a Master's degree in Computer Engineering, with a primary focus on enhancing control mechanisms for natural wood drying. While this method is known for its cost-effectiveness in terms of labor and energy, it suffers from slower and unstable drying cycles. The project's objective is to implement an intelligent control system that significantly improves the monitoring and recording of humidity levels in each wooden stack. Additionally, the system incorporates the capability to predict humidity based on data sourced from a weather forecasting API. The proposed solution entails a three-layer system: data collection, relay, and analysis. In the data collection layer, low-power devices based on a Raspberry Pi measure humidity levels in individual wood stacks. These devices then transmit the data via Bluetooth Low Energy to the subsequent layer. The data relay layer incorporates an Android application designed to aggregate, normalize, and transmit collected data. Furthermore, it provides users with visualization tools for comprehensive data understanding. The data storage and analysis layer, developed with Django, serves as the back-end, offering management functionalities for stacks, sensors, and overall data, as well as analysis capabilities. This layer can generate humidity forecasts based on real-time weather information. The implementation of this intelligent control system enables accurate insights into humidity levels, triggering alerts for any anomalies during the drying process. This reduces the necessity for constant on-site supervision, optimizes work efficiency, lowers costs, and eliminates repetitive tasks. [ABSTRACT FROM AUTHOR]
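As a rough sketch of the relay layer's aggregate-and-normalize step described above, the following assumes a linear sensor response and illustrative field names (the real deployment would use the sensor's calibration curve and its own schema):

```python
from dataclasses import dataclass

@dataclass
class StackReading:
    """One humidity sample from a wood stack (field names are illustrative)."""
    stack_id: str
    raw_adc: int      # raw sensor count, assumed 0..1023
    battery_mv: int   # supply voltage in millivolts

def normalize(reading, adc_max=1023, span=(0.0, 100.0)):
    """Map a raw ADC count to a relative-humidity percentage.

    A linear mapping is assumed here purely for illustration.
    """
    lo, hi = span
    pct = lo + (hi - lo) * reading.raw_adc / adc_max
    return {"stack": reading.stack_id,
            "humidity_pct": round(pct, 1),
            "low_battery": reading.battery_mv < 3300}
```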
- Published
- 2024
9. Acquiring Automation and Control Data in The Manufacturing Industry: A Systematic Review.
- Author
- Sama, Andi; Warnars, Harco Leslie Hendric Spits; Prabowo, Harjanto; Meyliana; and Hidayanto, Achmad Nizar
- Subjects
- DIGITAL technology, DIGITAL transformation, INFORMATION technology, PROGRAMMABLE controllers, MANUFACTURING industries, AUTOMATION, MANUFACTURING execution systems
- Abstract
Industry 4.0 has driven the need for Information Technology (IT) and Operational Technology (OT) convergence to modernize OT by leveraging IT. The challenge for manufacturing operations is to utilize years of scattered data and convert them into valuable information integrated into the company's digital transformation strategy. This paper aims to provide a systematic literature review of current evidence in digital transformation for acquiring automation and control technology data in manufacturing operations, such as Programmable Logic Controller (PLC), Supervisory Control and Data Acquisition (SCADA), Distributed Control System (DCS), and Manufacturing Execution System (MES) data, for analytics purposes, and to identify the current trends and best practices in this area. The findings cover current information on influential researchers, published journals, research trends, industries, industry types, types of analytics applications, methods, and frameworks for acquiring automation and control technology data from legacy OT infrastructure. [ABSTRACT FROM AUTHOR]
- Published
- 2023
10. Fuzzy decoupled-states multi-model identification of gas turbine operating variables through the use of their operating data.
- Author
- Aissat, Sidali; Hafaifa, Ahmed; Iratni, Abdelhamid; Hadroug, Nadji; and Chen, XiaoQi
- Subjects
- INDUSTRIALISM, GAS dynamics, LINEAR systems, ADAPTIVE fuzzy control, PHENOMENOLOGICAL theory (Physics), DYNAMICAL systems, GAS turbines
- Abstract
In practice, the degradation of rotating machines such as gas turbines arises from construction quality and from the various physical phenomena affecting these machines during online operation, which can lead to their total malfunction. To maintain stable operation, it is essential to describe their real dynamic behavior correctly, using reliable and robust models that can be exploited for monitoring and diagnostics. To achieve performance objectives in terms of security, reliability, availability, and operating safety, this work develops a fuzzy multi-model identification approach with states decoupled from the operating variables, applied to the monitoring of a TITAN 130 turbine. This fuzzy multi-model structure with decoupled states is of interest for the monitoring of industrial systems because it adapts to changes in the dynamic behavior of the system and represents the nonlinear behavior of the real system in a linear multi-model form without loss of information. Through the various implementations and the obtained results, this work clearly shows how the gas turbine dynamics were reproduced with the proposed fuzzy multi-models, allowing better performance when exploiting them in the synthesis of a fault-diagnosis strategy for this rotating machine. • Identification of TITAN 130 turbine variables is achieved with reliable estimation. • Investigative tests on multi-model identification have been realized and achieved. • Fuzzy decoupled multi-model identification uses linear optimization. • The nonlinear turbine behavior is approximated with linear relationships. • The fuzzy multi-models are built from the turbine operating data. [ABSTRACT FROM AUTHOR]
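The core fuzzy multi-model idea, blending local linear models through membership functions of the operating point, can be illustrated as follows. The Gaussian memberships and the (a, b) sub-model parameters below are illustrative, not the identified TITAN 130 values:

```python
import numpy as np

def gaussian_weight(x, center, width):
    """Fuzzy membership of operating point x in one local regime."""
    return np.exp(-0.5 * ((x - center) / width) ** 2)

def fuzzy_multimodel(x, submodels):
    """Blend local linear models y_i = a_i * x + b_i with normalized
    fuzzy memberships, yielding a smooth nonlinear global model.

    submodels: list of (center, width, a, b) tuples.
    """
    w = np.array([gaussian_weight(x, c, s) for c, s, _, _ in submodels])
    w = w / w.sum()                                   # normalize memberships
    y = np.array([a * x + b for _, _, a, b in submodels])
    return float(np.dot(w, y))                        # weighted blend
```

Near the center of a regime the blended output reduces to that regime's local linear model, which is why the structure can represent nonlinear behavior while each sub-model stays linear.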
- Published
- 2023
11. Digital twin for monitoring threshing performance of combine harvesters.
- Author
- Guo, Dafang; Du, Yuefeng; Wang, Linze; Zhang, Weiran; Sun, Tiantian; and Wu, Zhikang
- Subjects
- DIGITAL twins, PATTERN recognition systems, COMMODITY futures, COMBINES (Agricultural machinery), SENSOR networks
- Abstract
• Specialized sensors and a sensor network were developed to collect on-site data. • A digital twin (DT) was developed through the integration of theory and data models. • The DT has unveiled additional insights into threshing from on-site data. • Methods were proposed to monitor and enhance threshing performance based on the DT. The threshing performance of combine harvesters is crucial for minimizing grain losses. However, threshing is a highly complex process, and direct monitoring is infeasible due to high rotational speeds and concealed locations. This study introduces a method to enhance threshing performance monitoring using a digital twin (DT). A sensor network was developed to acquire on-site data. The DT was constructed by integrating a surrogate model of a discrete element model, which describes system responses, with a neural network predicting the future grain breakage rate (GBR). Test results indicate that the DT improved threshing monitoring capabilities and facilitated online optimization of settings. Compared to manual and feedback control modes, the GBR was reduced by 2.08% and 1.00%, and the working speed increased by 1.12 km/h and 1.47 km/h, respectively. The practical value of this study lies in reducing grain loss and enhancing harvest efficiency, providing advanced technological support for mechanical grain harvesting. [ABSTRACT FROM AUTHOR]
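The online optimization of settings via a surrogate can be sketched as below. The quadratic surrogate and the parameter names (drum speed, concave clearance) are stand-ins for the paper's trained discrete-element surrogate and its actual machine settings:

```python
def surrogate_gbr(drum_speed, concave_clearance):
    """Stand-in for the trained surrogate: predicts grain breakage
    rate (%) from threshing settings. This toy quadratic simply has a
    known minimum at (850, 25) for demonstration purposes.
    """
    return (0.002 * (drum_speed - 850) ** 2
            + 0.5 * (concave_clearance - 25) ** 2
            + 1.0)

def optimize_settings(speeds, clearances):
    """Online optimization loop: pick the settings the surrogate
    predicts will minimize breakage."""
    gbr, s, c = min((surrogate_gbr(s, c), s, c)
                    for s in speeds for c in clearances)
    return {"drum_speed": s, "clearance": c, "predicted_gbr": gbr}
```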
- Published
- 2025
12. Enhancing IoT data acquisition efficiency via FPGA-based implementation with OpenCL framework.
- Author
- Firmansyah, Iman; Setiadi, Bambang; Subekti, Agus; Nugraha, Heri; Kurniawan, Edi; and Yamaguchi, Yoshiki
- Subjects
- DATA acquisition systems, FIELD programmable gate arrays, REAL-time computing, GATE array circuits, ACQUISITION of data, DEBUGGING
- Abstract
• A typical FPGA-based data acquisition system requires both an HDL program and a debugging process to connect the FPGA and hardware interfaces. • Using an FPGA with the OpenCL framework for data acquisition reduces hardware connectivity effort while enhancing productivity. • The OpenCL component in the board support package (BSP) is developed to allow direct data transmission from the ADC to the OpenCL kernel within the FPGA. • The experiments demonstrated a streaming input signal from an AD7606 ADC to a Cyclone V FPGA, which is useful for IoT applications. The increasing demand for real-time data processing in Internet of Things (IoT) applications necessitates the development of efficient and flexible data acquisition systems capable of receiving and processing data from various sensor types. In conjunction with OpenCL, field-programmable gate arrays (FPGAs) have recently emerged as powerful platforms for accelerating data-intensive tasks. This study explored the implementation of an FPGA for data acquisition using OpenCL, aiming to design and implement an efficient data acquisition system tailored for IoT applications. Utilizing OpenCL for FPGA-based data acquisition offers several advantages that contribute to system efficiency, particularly in the hardware interfaces between the FPGA and the external devices used in IoT applications. OpenCL abstracts the complexity of the FPGA's interface to external DDR memory, used for storing temporary data, and of the communication interface to the host CPU, used for transferring the collected data and enabling remote access, letting developers focus on algorithm design and functionality. To enable data reading from an external analog-to-digital converter (ADC) chip for IoT applications, we developed a component module that utilizes the Avalon streaming interface and can stream the data to the OpenCL kernel. An experiment was conducted to demonstrate the performance of our proposed design. According to the findings of the experiments, a data acquisition implementation based on an FPGA and OpenCL can simultaneously read analog signals via a multichannel ADC. The proposed design provides a foundation for designing efficient data acquisition solutions, addressing the increasing needs of FPGA-based data acquisition in various IoT environments. [ABSTRACT FROM AUTHOR]
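On the host side, samples streamed from a multichannel ADC such as the AD7606 ultimately need to be de-interleaved into per-channel series. A minimal sketch, assuming big-endian 16-bit frames with one sample per channel; the actual framing and byte order depend on the BSP component:

```python
import struct

def demux_frames(raw, n_channels=8):
    """Split an interleaved stream of 16-bit big-endian samples
    (one frame = one sample per channel, as an 8-channel ADC like
    the AD7606 would emit) into per-channel lists.
    """
    frame_bytes = 2 * n_channels
    n_frames = len(raw) // frame_bytes
    channels = [[] for _ in range(n_channels)]
    for f in range(n_frames):
        off = f * frame_bytes
        frame = struct.unpack(f">{n_channels}h", raw[off:off + frame_bytes])
        for ch, value in enumerate(frame):
            channels[ch].append(value)
    return channels
```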
- Published
- 2024
13. An adaptive spherical indentation test integrating targeted testing scenarios, data acquisition, and model selection for uniaxial mechanical property predictions.
- Author
- Zhang, Tairui; Ma, Xin; Yang, Bin; Pei, Xianjun; Ge, Zhiqiang; and Jiang, Wenchun
- Subjects
- DIGITAL image correlation, ACQUISITION of data, SURFACE temperature, STRAINS & stresses (Mechanics), HIGH temperatures
- Abstract
To improve the applicability of spherical indentation tests (SITs) in complex targeted testing scenarios, such as piping with contaminated surfaces and high temperatures, this study proposes a targeted data acquisition scheme and corresponding stress–strain prediction schemes based on the differences between targeted testing scenarios. The data acquisition scheme is classified depending on whether digital image correlation (DIC) is applicable for plastic zone radius measurements and whether the unloading information is of reliable accuracy. Then, based on the characteristics of the data acquisition scheme, a model selection, among the incremental indentation energy model (IIEM), simplified indentation energy model (SIIEM), indentation energy model (IEM), and numerical model (NM), is introduced to achieve the most preferable uniaxial mechanical property predictions. The reliability of the adaptive SITs proposed in this study is verified through experiments (covering 20 °C, 400 °C, 565 °C, and 650 °C) on P91 steels in two service states, one as received and another after 300,000 h of service exposure. • A targeted data acquisition scheme and subsequent stress–strain prediction schemes are proposed for different testing scenarios. • Four models are introduced to achieve the most preferable uniaxial mechanical property predictions. • Plastic zone radius measurements obtained by DIC are used to improve prediction accuracy. • Verifications are conducted on both as-received and service-exposed (used for 300,000 h) P91. [ABSTRACT FROM AUTHOR]
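The scenario-driven model selection can be expressed as a simple decision function. The mapping below is one plausible reading of the scheme described in the abstract, not a reproduction of the paper's exact decision table:

```python
def select_model(dic_available: bool, unloading_reliable: bool) -> str:
    """Choose among the four prediction models named in the abstract,
    based on what the targeted testing scenario allows.

    The branch logic here is illustrative only.
    """
    if dic_available and unloading_reliable:
        return "IEM"    # full information available
    if dic_available:
        return "IIEM"   # plastic-zone radius measured, unloading unreliable
    if unloading_reliable:
        return "SIIEM"  # no DIC, but unloading data usable
    return "NM"         # fall back to the numerical model
```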
- Published
- 2024
14. Research on fault simulation and fault diagnosis of electric gate valves in nuclear power plants.
- Author
- Huang, Xue-Ying; Liu, Yong-Kuo; Xia, Hong; and Shan, Long-Fei
- Subjects
- NUCLEAR power plant shutdowns, ELECTRON tubes, ELECTRIC fault location, ELECTRIC power plants, NUCLEAR reactor shutdowns, NUCLEAR power plants, VALVES
- Abstract
• Designed and constructed a fault data acquisition system for nuclear power plant electric gate valves. • Based on a study of the fault mechanisms of electric gate valves, analyzed where faults occur, completed the fault settings, and provided theoretical guidance for other scholars designing similar experimental platforms. • Analyzed the changes in acceleration and acoustic emission (AE) signals when different types of faults occur in electric gate valves, selected corresponding sensors, and completed the sensor arrangement and signal acquisition. • To improve the accuracy of fault classification and the precision of fault severity assessment, improved the original algorithms and developed a fault diagnosis system based on the SAE algorithm and a fault severity assessment system based on the Bi-LSTM algorithm. Electric gate valve failure is a common fault type in nuclear power plants, and valve failures account for a significant proportion of the factors leading to reactor shutdowns. Because valve failure data are difficult to obtain in operating plants, this paper adopts an experimental research method to acquire such data and support the development of subsequent fault diagnosis algorithms. It designs and constructs an electric gate valve failure simulation test bench, obtains experimental data under various valve states, and develops a fault diagnosis system for electric gate valves in nuclear power plants on this basis. The experimental results show that the data generated in this experiment serve well for classifying valve failure types and evaluating failure severity. Moreover, the developed fault diagnosis system exhibits high diagnostic accuracy and low error in evaluating the degree of failure. [ABSTRACT FROM AUTHOR]
- Published
- 2024
15. Developing an open-source flood forecasting system adapted to data-scarce regions: A digital twin coupled with hydrologic-hydrodynamic simulations.
- Author
- Rápalo, Luis M. C.; Gomes Jr., Marcus N.; and Mendiondo, Eduardo M.
- Subjects
- FLOOD warning systems, FLOOD forecasting, WEATHER forecasting, DIGITAL twins, RAINFALL, FLOOD damage
- Abstract
• A novel open-source system for flood forecasting in data-scarce regions is presented. • The system provides medium-lead-time flood forecasts with a graphical interface. • The tool couples a Digital Twin with automatic satellite data acquisition. • PERSIANN PDIR-Now and Global Forecast System (GFS) data feed the system. • PDIR-Now represents two consecutive hurricanes (ETA and IOTA) at the basin scale in Honduras. Economic and human losses from flooding have had a significant global impact. Undeveloped nations often require extended periods to recover from flood-related damage, exacerbating the climate poverty trap, specifically in flood-prone regions. To address this issue, early warning systems (EWS) provide lead time for preparedness and measures to reduce vulnerability. However, EWS are mainly empirical at large scales and often do not incorporate hydrodynamic behavior in flood forecasting, at least in developing regions with a lack of information. This study presents an open-source system integrating a hydrodynamic model with satellite rainfall data (PERSIANN PDIR-Now) and weather prediction data (GFS). It functions as a near real-time Digital Twin (DT) and Early Warning System for high-resolution flood forecasting. Simulated data can be compared with gauge stations in real time through the model monitoring interface. A proof of concept assessed the model's capabilities in two case studies. First, the system simulated two consecutive extreme events (hurricanes ETA and IOTA) over the Sula Valley, Honduras, showing fidelity in streamflow responses. Second, the system worked as a DT and EWS to monitor current and future hydrological states for two periods in 2022 and 2023. Results indicate that satellite data coupled with a DT can provide up-to-date system conditions for flood forecasts in regions lacking data on extreme rainfall events. This tool offers insights to enhance civil protection and societal engagement through warning dissemination against extreme events, building resilience to cope with the increasing magnitude and frequency of disasters in data-scarce regions. [ABSTRACT FROM AUTHOR]
- Published
- 2024
16. Geology from aeromagnetic data.
- Author
- Betts, Peter G.; Moore, David; Aitken, Alan; Blaikie, Teagan; Jessell, Mark; Ailleres, Laurent; Armit, Robin; McLean, Mark; Munukutla, Radhakrishna; and Chukwu, Chibuzo
- Subjects
- GEOMAGNETIC variations, STRUCTURAL geology, ROCK properties, METAMORPHIC rocks, IGNEOUS rocks
- Abstract
This review aims to bridge the knowledge gap between geological and geophysical communities by elucidating the interpretation of aeromagnetic data. Aeromagnetic surveys measure the Earth's magnetic field variations and provide critical insights into subsurface geology, including basins, stratigraphy, igneous rocks and structural geology. The magnetic properties of rocks make these datasets valuable for identifying anomalies associated with various rock types and their magnetic responses. However, interpreting aeromagnetic data is complex due to the diverse geological processes that influence the formation and distribution of magnetic minerals, which must then be correlated with geological phenomena and features. Despite improved data accessibility and processing, many geoscientists still find interpreting aeromagnetic data challenging, resulting in a shortage of skilled expertise for research and industry applications. Accurate interpretation necessitates a thorough understanding of data collection and processing, recognising both the insights and limitations of the methods used and understanding how data resolution impacts the scale of interpretable geological features. This review is intended to assist those grappling with these challenges and to aid the geophysical community in interpreting complex geological features. Data treatment is explained with a focus on the reasons for specific processing methods rather than their mathematical foundations. Emphasis is placed on rock properties and their influence on aeromagnetic data expressions. The aeromagnetic expressions of common geological elements, including sedimentary, igneous, and metamorphic rocks, and their structures, such as stratigraphy and structural geometries related to folding and faulting, are explored. The discussion covers how these responses arise and how to identify them. 
Our explanations aim to bolster confidence in data interpretation for geologists new to aeromagnetic data and geophysicists who may not regularly interpret geological information from such data. Finally, we present strategies and pitfalls for interpreting aeromagnetic data, discuss automated interpretation methods, and offer practical guidance to improve interpretation skills and outcomes. [ABSTRACT FROM AUTHOR]
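As one concrete example of the kind of data treatment this review explains, upward continuation attenuates short-wavelength anomalies from shallow sources via the wavenumber-domain operator exp(−|k|h). A minimal grid implementation, assuming a square grid with uniform spacing:

```python
import numpy as np

def upward_continue(grid, dx, height):
    """Upward-continue a gridded magnetic anomaly by `height` (same
    units as dx) using the operator exp(-|k| h) in the wavenumber
    domain. Smooths short-wavelength (shallow-source) content while
    leaving the regional (long-wavelength) field largely intact.
    """
    ny, nx = grid.shape
    kx = 2 * np.pi * np.fft.fftfreq(nx, d=dx)   # wavenumbers along x
    ky = 2 * np.pi * np.fft.fftfreq(ny, d=dx)   # wavenumbers along y
    k = np.sqrt(kx[None, :] ** 2 + ky[:, None] ** 2)
    spec = np.fft.fft2(grid)
    return np.real(np.fft.ifft2(spec * np.exp(-k * height)))
```

Edge effects from the periodic FFT assumption are ignored here; production workflows pad and taper the grid first.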
- Published
- 2024
17. Design and development of a PXI based data acquisition & control system for floating cesiated tungsten dust driven negative ion source.
- Author
- Kausik, S.S.; Das, Nipan; Saikia, B.K.; Sarma, N.B.; Kalita, D.; Yadav, R.; Gahlaut, A.; and Bandyopadhyay, M.
- Subjects
- DATA acquisition systems, HYDROGEN plasmas, PLASMA physics, HYDROGEN ions, ANIONS
- Abstract
• A reliable and rugged instrumentation and control system. • A PXI-based event-driven and continuous data acquisition and control system has been developed. • The developed system has been tested and some experiments conducted; experimental results, along with test results of the system components, are presented in the paper. A production mechanism for negative hydrogen ions using cesium-coated tungsten dust particles in hydrogen plasma has been established at the Centre of Plasma Physics – Institute for Plasma Research (CPP-IPR). A new experimental setup has been developed for the production, extraction, and acceleration of such H− ions. The extraction and acceleration of H− ions in this system require a high-voltage supply and a floating-configuration plasma source. Due to the high complexity and safety concerns associated with the experimental system, a reliable and robust instrumentation and control system has been developed and is presented in this work. To monitor and control various experimental parameters, a PXI-based event-driven interlock and a requirement-based continuous data acquisition and control system have been designed, developed, and commissioned, incorporating fiber optic links. The software for the control sequences, including monitoring and acquisition, has been developed and implemented on a Real-Time Controller using LabVIEW 2020. The system has been tested, and some experiments have been conducted. Experimental results, along with the test results of the system components, are presented in the paper. [ABSTRACT FROM AUTHOR]
- Published
- 2024
18. Data quality-oriented scan planning for steel structure scenes using a probabilistic genetic algorithm.
- Author
- Li, Fangxin; Yi, Chang-Yong; Li, Qiongfang; Chi, Hung-Lin; and Kim, Min-Koo
- Subjects
- GENETIC algorithms, FEATURE extraction, DATA quality, ACQUISITION of data, POINT cloud
- Abstract
Scan planning is often challenging, particularly in steel structure scenes, because of their complex shapes and occlusions. Meeting the data quality requirements of the scan-to-BIM model is another issue for accurate point cloud data acquisition. To address these issues, this study proposes a solution that determines an optimal number of scans and the corresponding scan positions and parameters. The three primary steps are: 1) extraction of feature points using a slicing cutting method and range images; 2) evaluation of data quality using a visibility check and data density evaluation; and 3) determination of the optimal scan configuration using a probabilistic genetic algorithm. To validate the proposed solution, a series of lab-scale experiments involving five case studies with different scenarios was conducted; the results show a similarity of 88.4% between simulation and actual experiments, demonstrating the feasibility of the proposed method for steel structure scenes with complex shapes and occlusions. • A scan planning solution that satisfies the data quality requirements for scan-to-BIM is developed. • Feature point extraction is performed using a slicing cutting method and range images. • Data quality is evaluated using a visibility check and data density evaluation. • Five case studies with different structure scenes are conducted. • Validation tests show more than 88% similarity between simulation and experiments. [ABSTRACT FROM AUTHOR]
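Step 3, a genetic algorithm over candidate scan positions, can be sketched as follows. The fitness here is reduced to a simple range-based coverage check, whereas the paper also scores visibility (occlusion) and point density; all names and parameters are illustrative:

```python
import random

def coverage(scan_positions, targets, max_range=15.0):
    """Fraction of target points seen by at least one scan (visibility
    reduced to a plain range check for illustration)."""
    seen = sum(1 for t in targets
               if any(((s[0] - t[0]) ** 2 + (s[1] - t[1]) ** 2) ** 0.5 <= max_range
                      for s in scan_positions))
    return seen / len(targets)

def ga_scan_plan(candidates, targets, n_scans=3, pop=30, gens=40, seed=1):
    """Tiny genetic algorithm: individuals are subsets of candidate
    scan positions; fitness is coverage; elitist truncation selection
    with union-based crossover and random-replacement mutation."""
    rng = random.Random(seed)
    population = [rng.sample(candidates, n_scans) for _ in range(pop)]
    for _ in range(gens):
        population.sort(key=lambda ind: -coverage(ind, targets))
        parents = population[:pop // 2]                  # keep the fittest half
        children = []
        while len(children) < pop - len(parents):
            a, b = rng.sample(parents, 2)
            child = list(dict.fromkeys(a + b))[:n_scans]  # crossover: union, deduped
            if rng.random() < 0.3:                        # mutation
                child[rng.randrange(n_scans)] = rng.choice(candidates)
            children.append(child)
        population = parents + children
    return max(population, key=lambda ind: coverage(ind, targets))
```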
- Published
- 2024
19. The data acquisition system of the LZ dark matter detector: FADR.
- Author
- Aalbers, J.; Akerib, D.S.; Al Musalhi, A.K.; Alder, F.; Amarasinghe, C.S.; Ames, A.; Anderson, T.J.; Angelides, N.; Araújo, H.M.; Armstrong, J.E.; Arthurs, M.; Baker, A.; Balashov, S.; Bang, J.; Barillier, E.E.; Bargemann, J.W.; Beattie, K.; Benson, T.; Bhatti, A.; and Biekert, A.
- Subjects
- FIELD programmable gate arrays, DATA acquisition systems, WAVE analysis, DARK matter, ACQUISITION of data, COMPUTER firmware
- Abstract
The Data Acquisition System (DAQ) for the LUX-ZEPLIN (LZ) dark matter detector is described. The signals from 745 PMTs, distributed across three subsystems, are sampled with 100-MHz 32-channel digitizers (DDC-32s). A basic waveform analysis is carried out on the on-board Field Programmable Gate Arrays (FPGAs) to extract information about the observed scintillation and electroluminescence signals. This information is used to determine if the digitized waveforms should be preserved for offline analysis. The system is designed around the Kintex-7 FPGA. In addition to digitizing the PMT signals and providing basic event selection in real time, the flexibility provided by the use of FPGAs allows us to monitor the performance of the detector and the DAQ in parallel to normal data acquisition. The hardware and software/firmware of this FPGA-based Architecture for Data acquisition and Realtime monitoring (FADR) are discussed and performance measurements are described. [ABSTRACT FROM AUTHOR]
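The basic idea of on-board waveform selection, deciding in real time whether a digitized trace contains a signal worth preserving for offline analysis, can be illustrated with a simple threshold-and-width rule. The actual FADR firmware criteria are more elaborate than this sketch:

```python
def keep_waveform(samples, baseline, threshold, min_width=3):
    """Decide whether a digitized PMT trace should be preserved:
    require an excursion above baseline + threshold that lasts at
    least `min_width` consecutive samples, rejecting single-sample
    noise spikes.
    """
    run = 0
    for s in samples:
        run = run + 1 if s - baseline > threshold else 0
        if run >= min_width:
            return True
    return False
```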
- Published
- 2024
20. The Mu2e Digitizer ReAdout Controller (DiRAC): Characterization and radiation hardness.
- Author
- Atanov, N.; Baranov, V.; Bloise, C.; Borrel, L.; Ceravolo, S.; Cervelli, F.; Colao, F.; Cordelli, M.; Corradi, G.; Davydov, Yu.I.; Di Falco, S.; Diociaiuti, E.; Donati, S.; Echenard, B.; Fedeli, P.; Ferrari, C.; Gioiosa, A.; Giovannella, S.; Giusti, V.; and Glagolev, V.
- Subjects
- COSMIC rays, NEUTRON beams, CESIUM iodide, MAGNETIC fields, SILICON crystals
- Abstract
The Mu2e experiment at Fermilab will search for the neutrino-less coherent conversion of a muon into an electron in the field of a nucleus. The Mu2e detectors comprise a straw tracker, an electromagnetic calorimeter, and a cosmic-ray veto. The calorimeter employs 1348 cesium iodide crystals read out by silicon photomultipliers with fast front-end and digitization electronics. The digitization board is named DiRAC (Digitizer ReAdout Controller), and 140 cards are needed for the readout of the full calorimeter. The DiRACs are hosted in crates located on the external surface of the calorimeter disks, inside the detector solenoid cryostat, and must withstand very high radiation and magnetic field, so a full qualification was necessary. Several prototype versions were validated for operation in high vacuum (10⁻⁴ Torr) and under a 1 T magnetic field. An extensive radiation hardness qualification campaign, carried out with photons, 14 MeV neutron beams, and 200 MeV protons, certified the DiRAC design to sustain doses up to 12 krad, neutron fluences up to ∼10¹¹ 1-MeV-equivalent neutrons/cm², and very low occurrences of single-event effects. The qualification campaigns and quality assurance procedures are reviewed. [ABSTRACT FROM AUTHOR]
- Published
- 2024
21. Non-stationarity Removal Techniques in MEG Data: A Review.
- Author
- Philip, Beril Susan; Prasad, Girijesh; and Hemanth, D. Jude
- Subjects
- SKELETAL muscle, SIGNAL processing, COMPUTER interfaces, SIGNAL-to-noise ratio, MAGNETOENCEPHALOGRAPHY
- Abstract
Brain–Computer Interface (BCI) enables communication solely through mental activity. For patients who have lost all voluntary muscle control, it serves as a potential communication and rehabilitation channel. In a number of recent investigations, BCI based on electroencephalography (EEG) has demonstrated an increased ability of nonresponsive people to interact with others. Despite the method's effectiveness, patient training takes very long. Magnetoencephalography (MEG), however, may shorten the training period, improving BCI reliability in the process. One of the major technological difficulties confronting MEG data collection and interpretation is that the strength of externally recorded neuromagnetic fields is considerably lower than that of interfering signals. There has not been much substantial progress toward an effective MEG-BCI system because no sufficiently large MEG dataset exists. Despite their huge potential, MEG-based BCI systems still need substantial work on signal processing algorithms that are both reliable and efficient. Unexpected head movements and changing orientation between sessions may cause non-stationarity in the recorded MEG data, changing the most effective channel selection between sessions and participants. Beyond head movement, non-stationarity can also be caused by factors such as user weariness, mood changes, or external noise interfering with the MEG system. This paper discusses the many types of data acquisition artifacts and reviews techniques to minimize them and increase the signal-to-noise ratio. [ABSTRACT FROM AUTHOR]
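As a toy illustration of one class of non-stationarity removal, slow drift (e.g. from gradual head movement) can be suppressed with a moving-average baseline subtraction. The dedicated MEG signal-processing methods surveyed in the review are far more sophisticated; this is only a crude high-pass stand-in:

```python
import numpy as np

def remove_drift(signal, window=50):
    """Subtract a moving-average baseline to suppress slow
    non-stationary drift, leaving faster oscillatory content intact.
    """
    kernel = np.ones(window) / window
    baseline = np.convolve(signal, kernel, mode="same")
    return signal - baseline

def snr_db(signal, noise):
    """Signal-to-noise ratio in decibels."""
    return 10 * np.log10(np.mean(signal ** 2) / np.mean(noise ** 2))
```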
- Published
- 2022
- Full Text
- View/download PDF
22. Systematic review of the data acquisition and monitoring systems of photovoltaic panels and arrays.
- Author
-
Kalay, Muhammet Şamil, Kılıç, Beyhan, and Sağlam, Şafak
- Subjects
- *
DATA acquisition systems , *PHOTOVOLTAIC power systems , *SOLAR radiation , *SOLAR energy , *TELECOMMUNICATION - Abstract
Solar energy has increased its share of global electrical energy production. The increasing reliability of solar energy has positively affected the sustainability of photovoltaic (PV) power plants. A failure in any module in the plant can reduce or interrupt the production of electrical energy, causing significant losses in both efficiency and asset value. Therefore, responding to a fault as quickly as possible in a PV power plant is critical. The ability of the PV plant operator to react to potential faults is directly related to the rapid detection of faulty modules. In this paper, different PV monitoring systems in the literature are investigated extensively from the point of view of the devices and the techniques used to measure PV systems' current, voltage, solar radiation, and module temperature. In particular, the communication methods and data acquisition cards used in monitoring are examined. Remote monitoring technologies quickly detect the location of a malfunction in a large-scale power plant. In this context, traditional wired communication methods, today's communication technologies, and the low-cost IoT (Internet of Things) technologies used to monitor the performance of large and small-scale PV power plants are compared in detail. With the advancement of Internet of Things technologies such as Zigbee and LoRa, research on remote wireless monitoring of photovoltaic modules has accelerated in recent years. These technologies are projected to be widely deployed in the near future for the maintenance and fault detection of numerous photovoltaic installations. [ABSTRACT FROM AUTHOR]
- Published
- 2022
- Full Text
- View/download PDF
23. Performance comparison of various electronics systems for fast-timing measurements using the KHALA LaBr3(Ce) detector array.
- Author
-
Lee, J., Kim, Y.H., Hong, B., Moon, B., Jang, Y., Ahn, S., Bae, S., Hahn, K.I., Park, J., Das, B., Górska, M., Heggen, H., Kurz, N., and Wiebusch, M.
- Subjects
- *
DETECTORS , *MEASUREMENT , *SCINTILLATION counters - Abstract
The IDATEN collaboration has been formed jointly by KHALA in Korea and FATIMA in Europe to perform fast-timing measurements with the largest LaBr3(Ce) detector array at RIBF of RIKEN, Japan. For precise timing detection in a large detector system, a data-acquisition system capable of handling complex correlations and large data throughput is required. The results of benchmark tests for three different systems using standard radioactive sources are compared in terms of energy resolution, timing resolution, and prompt response curves. The characteristics of the three systems are examined to choose the DAQ systems for KHALA and IDATEN. [ABSTRACT FROM AUTHOR]
- Published
- 2023
- Full Text
- View/download PDF
24. A data-driven method to construct prediction model of solar stills.
- Author
-
Sun, Senshan, Du, Juxin, Peng, Guilong, and Yang, Nuo
- Subjects
- *
SOLAR stills , *OPTIMIZATION algorithms , *ACQUISITION of data , *DATA reduction , *PROCESS optimization - Abstract
The interdisciplinary field between solar desalination and machine learning is the subject of this cutting-edge study. Generally, studies treat data acquisition and model construction as independent processes, leading to problems such as insufficient dataset size or wasted resources. This study proposes a data-driven method that integrates data acquisition with model construction. By using a Bayesian optimization algorithm, the method accelerates the convergence of model accuracy. A comparison of 100 pairs of simulations shows that models built with the data-driven method are more accurate than those from traditional expert-driven methods in 70 % of the compared results. Additionally, to reach a model with a mean absolute percentage error of 5 %, the proposed data-driven method requires 220 additional data points on average, compared with 258 for the traditional expert-driven method, a 14.7 % reduction. This work offers new approaches for, and broad application of, the interdisciplinary field between solar desalination and machine learning. • A new data-driven method is proposed which is superior to the expert-driven method. • The data-driven method integrates data acquisition and model construction in real time. • The data-driven method is more effective in 70 % of the comparisons. • A 14.7 % reduction in required data size can be achieved. [ABSTRACT FROM AUTHOR]
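The paper's loop (acquire a sample, refit the model, stop once the 5 % MAPE target is met) can be sketched as follows. The toy simulator, the polynomial surrogate, and the space-filling acquisition rule are purely illustrative stand-ins for the paper's solar-still simulations and Bayesian-optimization acquisition step:

```python
import numpy as np

def simulator(x):
    # Hypothetical stand-in for an expensive solar-still simulation.
    return 2.0 + 1.5 * x + 0.8 * np.sin(2 * x)

def mape(y_true, y_pred):
    return float(np.mean(np.abs((y_true - y_pred) / y_true)) * 100)

grid = np.linspace(0.0, 3.0, 121)       # candidate acquisition points
truth = simulator(grid)                 # reference for accuracy checks
X = [0.0, 3.0]                          # tiny seed dataset

for _ in range(40):                     # safety cap on acquisitions
    model = np.poly1d(np.polyfit(X, simulator(np.array(X)),
                                 deg=min(len(X) - 1, 6)))
    if mape(truth, model(grid)) < 5.0:  # stop at the paper's 5 % MAPE target
        break
    # Acquire where the model is least constrained: the candidate
    # farthest from every point sampled so far (a simple stand-in
    # for a Bayesian-optimization acquisition function).
    dist = np.min(np.abs(grid[:, None] - np.array(X)[None, :]), axis=1)
    X.append(float(grid[np.argmax(dist)]))

print(len(X))  # number of simulator runs needed
```

The point of the sketch is the interleaving: acquisition stops as soon as the model is accurate enough, which is where the paper's reported data savings come from.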
- Published
- 2024
- Full Text
- View/download PDF
25. Development of a PXIe-based data acquisition and control system for hydrogen pellet injection system.
- Author
-
Banaudha, M., Mishra, J., Panchal, P., Mukherjee, S., Nayak, P., Gupta, V., Agravat, H., and Gangradey, R.
- Subjects
- *
DATA acquisition systems , *GRAPHICAL user interfaces , *PLASMA flow , *CLOUD storage , *PERSONAL computers - Abstract
• A DAC system has been developed using NI-PXIe hardware and a LabVIEW application for the pellet injection system. • Functions like pellet freezing, launching and data acquisition are performed from a single controller and GUI. • The GUI allows switching between slow and fast data acquisition modes with a single click. • Pellet freezing and launching experiments were successfully conducted using it. • The system will be integrated with the tokamak for pellet-plasma interaction studies. A data acquisition and control (DAC) system has been developed for the solid hydrogen pellet injection system. This injector is a gas-gun-type injector, in which solid hydrogen pellet ice is formed using a closed-cycle cryocooler, and high-pressure helium gas or a pneumatic punch is used to dislodge the pellet. The DAC system is based on National Instruments' embedded controller NI PXIe-8133, along with associated I/O DAC cards and a LabVIEW application. A graphical user interface (GUI) developed using LabVIEW software allows users to remotely control the pellet formation and injection process. The data is stored locally on a desktop computer or in cloud storage for further analysis. Images of the injected pellet were obtained using a Phantom V-1210 high-speed camera at 100,000 frames per second, which reveals the size and speed of the pellet. The developed DAC system provides the flexibility needed to operate the injector remotely during the plasma discharge in a tokamak environment. The injector system has been successfully tested in test-bench operation, and it will be integrated with the ADITYA-U tokamak. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
26. Design and perspectives of the CMS Level-1 trigger Data Scouting system.
- Author
-
Ardino, Rocco, Deldicque, C., Dobson, M., Gigi, D., Giorgetti, S., James, T., Lazzari Miotto, G., Meschi, E., Migliorini, M., Petrucciani, G., Rabady, D., Racz, A., Sakulin, H., and Zejdl, P.
- Subjects
- *
ONLINE data processing , *HETEROGENEOUS computing , *MUONS , *DATA reduction , *ACQUISITION of data - Abstract
The CMS detector will undergo a significant upgrade to cope with the HL-LHC instantaneous luminosity and average number of proton–proton collisions per bunch crossing (BX). The Phase-2 CMS detector will be equipped with a new Level-1 (L1) trigger system that will have access to an unprecedented level of information. Advanced reconstruction algorithms will be deployed directly on the L1 FPGA-based processors, producing reconstructed physics primitives of quasi-offline quality. The latter will be collected and processed by the Level-1 trigger Data Scouting (L1DS) system at the full bunch crossing rate. Besides providing vast amounts of data for L1 and detector monitoring, the L1DS will perform quasi-online analysis in a heterogeneous computing farm. The study of signatures too common to fit within the L1 acceptance budget, or orthogonal to the standard physics trigger selection strategies, is expected to benefit greatly from this approach. An L1DS prototype system has been set up to operate in the current LHC Run-3, with the main goals of demonstrating the basic principle and shaping the development of the Phase-2 system. The Run-3 L1DS receives trigger primitives from the Global Muon and Calorimeter Trigger, the Global Trigger decision bits, and the muon segments from the Barrel Muon Track Finder. FPGA boards acquire and aggregate the synchronous trigger data streams and perform basic data reduction, before sending the trigger primitives to a set of computing nodes through 100 Gbps Ethernet connections running a simplified firmware version of the TCP/IP protocol. Intel TBB-based DAQ software receives the TCP/IP streams and applies further processing before the data is ingested into a cluster of servers running the CMS reconstruction framework. The output of the computing farm is data sets in the standard CMS data analysis format.
This contribution presents the Run-3 L1DS demonstrator architecture and recent physics results extracted from the collected data. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
27. Status update scheduling in remote sensing under variable activation and propagation delays.
- Author
-
Badia, Leonardo, Zancanaro, Alberto, Cisotto, Giulia, and Munari, Andrea
- Subjects
MACHINE-to-machine communications ,PROCESS capability ,REMOTE sensing ,SENSOR networks ,TELECOMMUNICATION systems - Abstract
Sensor data exchanges in IoT applications can experience variable delay due to changes in the communication environment and the sharing of processing capabilities. This variability can impact the performance and effectiveness of the systems being controlled, and is especially reflected in Age of Information (AoI), a performance metric that quantifies the freshness of updates in remote sensing. In this work, we discuss the quantitative impact of activation and propagation delays, both taken as random variables, on AoI. In our analysis we consider offline scheduling over a finite horizon, derive a closed-form solution to evaluate the average AoI, and validate our results through numerical simulation. We also analyze which type of delay has more influence on the system, as well as the probability that the system fails to deliver all the scheduled updates due to excessive delays of either kind. [ABSTRACT FROM AUTHOR]
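The quantity being analyzed, average AoI under a fixed offline schedule with random activation and propagation delays, can be estimated numerically with a small Monte-Carlo sketch. The uniform delay distributions, the schedule, and the assumption of fresh information at time zero are illustrative choices, not the paper's model:

```python
import random

def average_aoi(schedule, horizon, act_max=0.5, prop_max=0.5,
                dt=0.01, seed=1):
    """One-run estimate of mean Age of Information: each scheduled
    update becomes available only after a random activation delay plus
    a random propagation delay (both uniform here for illustration)."""
    rng = random.Random(seed)
    updates = []
    for t in schedule:
        delay = rng.uniform(0, act_max) + rng.uniform(0, prop_max)
        updates.append((t + delay, t))       # (received, generated)
    updates.sort()
    age_sum, steps = 0.0, int(horizon / dt)
    for k in range(steps):
        now = k * dt
        gen = 0.0                            # assume fresh info at t = 0
        for recv, g in updates:
            if recv <= now:
                gen = max(gen, g)            # freshest received update
        age_sum += now - gen
    return age_sum * dt / horizon

print(average_aoi(schedule=[1, 2, 3, 4], horizon=5))
```

With both delays set to zero this reduces to the classic sawtooth, whose mean for unit-spaced updates is 0.5, which makes a convenient sanity check on the discretization.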
- Published
- 2024
- Full Text
- View/download PDF
28. 3D network-on-chip data acquisition system mapping based on reinforcement learning and improved attention mechanism.
- Author
-
Xu, Chuanpei, Shi, Xiuli, Yang, Hong, and Wang, Yang
- Subjects
- *
DATA acquisition systems , *REINFORCEMENT learning , *ACQUISITION of data , *TELECOMMUNICATION systems , *ALGORITHMS - Abstract
The three-dimensional Network-on-Chip (NoC) data acquisition system is designed to create a time-interleaved data acquisition system using NoC technology. In the design of NoC application systems, optimizing the mapping algorithm can effectively reduce network communication latency. To address the challenge of mapping a large number of functional IP nodes in a 3D NoC data acquisition system, a mapping algorithm based on reinforcement learning and an improved attention mechanism (RA-Map) is proposed. The RA-Map algorithm employs node function encoding and node position encoding to express the properties of an IP node during task-graph preprocessing. A local attention mechanism is used in the mapping network encoder, and the fusion of dynamic key node information is proposed for the decoder. The mapping result evaluation network enables unsupervised training of the mapping network. These targeted refinements improve the quality of the mapping. Experimental results show that the RA-Map algorithm can effectively model the IP core mapping. Compared with the DPSO and SA algorithms, the average communication cost of the RA-Map algorithm is reduced by 6.5 % and 8.5 %, respectively. [ABSTRACT FROM AUTHOR]
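The communication-cost objective such mapping algorithms minimize is commonly taken as traffic volume times hop distance in the mesh. A tiny brute-force example makes that concrete; the 2x2x1 mesh, task graph, and traffic volumes below are hypothetical, and exhaustive search is only a reference baseline that learned policies like RA-Map aim to approach without enumerating permutations:

```python
from itertools import permutations

def comm_cost(mapping, traffic, coords):
    """Total communication cost of one IP-to-node mapping:
    traffic volume times Manhattan hop count in the 3-D mesh."""
    cost = 0
    for (a, b), vol in traffic.items():
        xa, ya, za = coords[mapping[a]]
        xb, yb, zb = coords[mapping[b]]
        cost += vol * (abs(xa - xb) + abs(ya - yb) + abs(za - zb))
    return cost

# Toy 2x2x1 mesh and a 4-task communication graph (all values invented).
coords = {0: (0, 0, 0), 1: (1, 0, 0), 2: (0, 1, 0), 3: (1, 1, 0)}
traffic = {(0, 1): 8, (1, 2): 4, (2, 3): 2}

best = min(permutations(range(4)),
           key=lambda m: comm_cost(m, traffic, coords))
print(best, comm_cost(best, traffic, coords))
```

Here the optimum places the communicating tasks along a path of adjacent nodes so every edge costs one hop, which is exactly the kind of placement a good mapping algorithm should discover at scale.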
- Published
- 2024
- Full Text
- View/download PDF
29. Environmentally resilient data collecting system optimized for measuring low density neutron flux.
- Author
-
Jędrzejczak, K., Kasztelan, M., Orzechowski, J., Szabelski, J., and Nieckarz, Z.
- Subjects
- *
DATA acquisition systems , *NEUTRON counters , *ACTINIC flux , *ACQUISITION of data , *REMOTE control , *NEUTRON flux - Abstract
In this work we present in detail the data collecting system (DCS), designed and optimized for use with a helium neutron counter even under very challenging environmental conditions. The DCS, in addition to its simplicity, allows the flexible configuration of multiple DCS units into a single measurement system and can also interface with various types of detectors and different data acquisition systems as the experimental requirements dictate. The main advantages of the presented DCS are its autonomy, remote control capability, and modularity. The pulse shape analysis procedure employed within the DCS enhances measurement sensitivity, which is crucial for detecting extremely low neutron fluxes, such as those in underground laboratories. [ABSTRACT FROM AUTHOR]
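The pulse-shape analysis step can be illustrated with a generic tail-to-total charge ratio, a common discriminant between slow detector pulses and fast noise spikes in proportional counters. The waveforms, the tail start, and the 0.2 threshold below are invented for illustration and are not the DCS's actual procedure:

```python
def tail_total_ratio(pulse, tail_start):
    """Pulse-shape figure of merit: fraction of the pulse integral
    contained in the tail. Genuine counter pulses typically carry more
    charge in the tail than fast electronic spikes (illustrative)."""
    total = sum(pulse)
    tail = sum(pulse[tail_start:])
    return tail / total

# Hypothetical digitized pulses (arbitrary units).
neutron_like = [1, 5, 9, 7, 5, 4, 3, 2, 2, 1]   # slow decay
spike_like   = [1, 9, 2, 1, 0, 0, 0, 0, 0, 0]   # fast spike

is_neutron = lambda p: tail_total_ratio(p, 4) > 0.2
```

Cutting on such a ratio is one way pulse-shape analysis rejects noise and thereby improves sensitivity at very low count rates.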
- Published
- 2024
- Full Text
- View/download PDF
30. Cutting tool monitoring while drilling using internal CNC data.
- Author
-
Demko, Michal, Vrabeľ, Marek, Maňková, Ildikó, and Ižol, Peter
- Abstract
This article verifies the procedures used for obtaining internal data from a machining centre and evaluates the state of the cutting tool during the machining process. Tool condition monitoring reduces production costs and the time required to service cutting tools, and increases the efficiency of the manufacturing process. This paper summarizes the commercially available solutions for accessing machine tool data that are suitable for creating a monitoring system for specific machining applications. The proposed model was tested under real cutting conditions, and recommendations for future research are provided. [ABSTRACT FROM AUTHOR]
- Published
- 2022
- Full Text
- View/download PDF
31. Research on Escalator Data Acquisition and Transmission Based on Big Data Platform.
- Author
-
Chen, Jianhao, Zhang, Zhuangzhuang, Jiang, Xiyang, Huang, Jianpeng, and Tong, Yifei
- Subjects
BIG data ,ESCALATORS ,ACQUISITION of data ,FAULT trees (Reliability engineering) ,MERGERS & acquisitions ,DATABASES ,DATA transmission systems - Abstract
In order to collect escalator operation data for fault diagnosis, fault tree analysis is first applied to obtain the occurrence probabilities of the intermediate and bottom events of escalator faults. The importance degree of each event is then calculated, and the four most important components are selected on that basis. To collect the operation data of these components, vibration sensors are used to obtain their acceleration data, which is stored locally via Ethernet. To handle the large volume and rapid growth of the data, and to give the transmission platform high throughput and distributed scalability, the Flume, Kafka and Flink components are used to build a big data platform for transmitting and pre-processing the operation data. The results show that the platform can complete the data acquisition of the main components of the escalator. [ABSTRACT FROM AUTHOR]
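The fault-tree step (top-event probability from bottom-event probabilities, then an importance ranking to pick which components to instrument) can be sketched with a toy tree. The events, probabilities, and the use of Birnbaum importance as the ranking measure are illustrative assumptions, not the paper's actual tree:

```python
def or_gate(ps):   # probability that at least one input event occurs
    q = 1.0
    for p in ps:
        q *= (1.0 - p)
    return 1.0 - q

def and_gate(ps):  # probability that all input events occur
    q = 1.0
    for p in ps:
        q *= p
    return q

def birnbaum(top_fn, probs, event):
    """Birnbaum importance: top-event probability with the basic event
    forced to occur, minus with it forced not to occur."""
    return top_fn({**probs, event: 1.0}) - top_fn({**probs, event: 0.0})

# Hypothetical mini fault tree: the escalator stops if the drive chain
# breaks, OR both the motor and its backup controller fail.
def top(p):
    return or_gate([p["chain"], and_gate([p["motor"], p["backup"]])])

probs = {"chain": 0.02, "motor": 0.05, "backup": 0.10}
ranking = sorted(probs, key=lambda e: birnbaum(top, probs, e), reverse=True)
print(ranking, top(probs))
```

The ranking identifies which bottom events move the top-event probability the most, which is the basis for choosing where to place the vibration sensors.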
- Published
- 2022
- Full Text
- View/download PDF
32. Additive manufacturing of a passive, sensor-monitored 16MnCr5 steel gear incorporating a wireless signal transmission system.
- Author
-
Binder, M., Stapff, V., Heinig, A., Schmitt, M., Seidel, C., and Reinhart, G.
- Abstract
The combination of functions in a single component, efficient use of limited space, comprehensive acquisition of process and sensor data, and the development of passive, wireless monitoring systems are all the focus of current research. Additive manufacturing is playing an increasingly important role in these developments, because it acts as an enabler for free-form structures, allows access to the interior of the component through its layer-by-layer build-up, and facilitates the processing of a wide variety of materials. Within this work, the aspects mentioned are jointly addressed by the development of a monitored additively manufactured gear (material: 16MnCr5) that communicates wirelessly. This paper shows how an RFID antenna can be optimized by simulation, incorporated into the design of a metallic gear, then built up in a single manufacturing process and linked to an in-situ inserted sensor board. The approach is validated by fabrication of a sensor-monitored prototype and verified with respect to its functionality and internal structure. [ABSTRACT FROM AUTHOR]
- Published
- 2022
- Full Text
- View/download PDF
33. Comparison of Data Sources for Robot Gear Condition Monitoring.
- Author
-
Nentwich, Corbinian and Daub, Rüdiger
- Abstract
Industrial robot gear condition monitoring has the potential to increase the productivity of highly automated production lines. In order to implement an effective condition monitoring system, data must be collected which correlates with the robot gear's state of health. The sensor choice and the characteristics of these sensors are crucial to the success of a condition monitoring system. Hence, we compare current and vibration sensor data from different accelerated robot gear wear tests in different frequency ranges to determine a suitable sensor setup. In the presented experiments, both data sources detect faults at a similar point in time and the variation of the frequency ranges has different effects on the data quality. [ABSTRACT FROM AUTHOR]
- Published
- 2022
- Full Text
- View/download PDF
34. Status update scheduling in remote sensing under variable activation and propagation delays
- Author
-
Badia, Leonardo, Zancanaro, Alberto, Cisotto, Giulia, and Munari, Andrea
- Abstract
Sensor data exchanges in IoT applications can experience variable delay due to changes in the communication environment and the sharing of processing capabilities. This variability can impact the performance and effectiveness of the systems being controlled, and is especially reflected in Age of Information (AoI), a performance metric that quantifies the freshness of updates in remote sensing. In this work, we discuss the quantitative impact of activation and propagation delays, both taken as random variables, on AoI. In our analysis we consider offline scheduling over a finite horizon, derive a closed-form solution to evaluate the average AoI, and validate our results through numerical simulation. We also analyze which type of delay has more influence on the system, as well as the probability that the system fails to deliver all the scheduled updates due to excessive delays of either kind.
- Published
- 2024
35. Development of autonomous monitoring and performance evaluation system of grid-tied photovoltaic station.
- Author
-
Dabou, Rachid, Bouraiou, Ahmed, Ziane, Abderrezzaq, Necaibia, Ammar, Sahouane, Nordine, Blal, Mohamed, Khelifi, Seyfallah, Rouabhia, Abdelkrim, and Slimani, Abdeldjalil
- Subjects
- *
GRAPHICAL user interfaces , *POWER resources , *ACQUISITION of data , *BUILDING-integrated photovoltaic systems - Abstract
This work improves the existing monitoring systems (MS) of two grid-connected PV stations (GCPVS) at URERMS Adrar in order to eliminate their limitations. The improvement consists of developing an MS that serves two PV stations with different configurations. This MS contains new LabVIEW-based monitoring software for visualizing real-time measured data and evaluating GCPVS performance. In addition, it illustrates 2D and 3D real-time relationships between PV system parameters, which helps in understanding the dynamic behavior of PV system components. The developed monitoring software also synchronizes the various data acquisition units (DAU) of the GCPVS, allowing simultaneous data access. To enable reliable performance analysis and a comparative study of different GCPVS based on accurate measurements, sensor calibration is performed together with the DAU. The autonomy of the MS is ensured by integrating a purpose-built PV-UPS. A graphical user interface is provided for evaluating PV-UPS performance. [Display omitted] • Improvement of the existing monitoring of GCPVS to resolve its limitations. • New LabVIEW-based software has been developed to monitor the PV station. • Real-time display of measured data and performance evaluation indices of GCPVS. • Real-time illustration of 2D and 3D relationships of PV system parameters. • A PV-UPS was developed to ensure an autonomous power supply for the new monitoring system. [ABSTRACT FROM AUTHOR]
- Published
- 2021
- Full Text
- View/download PDF
36. Event driven high speed data acquisition with IEEE 1588 synchronization for long pulse operations of Indian Test Facility for ITER DNB.
- Author
-
Tyagi, Himanshu, Yadav, R K, Bhuyan, M, Bandyopadhyay, M, Singh, MJ, and Chakraborty, Arun
- Subjects
- *
TRAFFIC safety , *DATA acquisition systems , *ACQUISITION of data , *TESTING laboratories , *SYNCHRONIZATION , *PHOTOMULTIPLIERS - Abstract
• Indian Test Facility (INTF) to characterize the DNB for ITER delivery. • INTF to operate for long pulse durations of up to 3600 s with event-driven DAQ and IEEE 1588 time synchronization. • The work presents a solution for acquiring high-bandwidth signals alongside a conventional PXIe-based acquisition system. • Benchmarking and assessment of the ITER-relevant IEEE 1588 time synchronization protocol. The Indian Test Facility (INTF) is a negative hydrogen ion based 100 kV, 60 A, 5 Hz modulated NBI system with a duty cycle of 3 s ON time to 20 s OFF time. The prime objective of the facility is to characterize the ITER Diagnostic Neutral Beam (DNB) to full specifications prior to shipment and installation at ITER. To fully characterize the ITER DNB, a mature Data Acquisition and Control System (DACS) is needed, responsible for supervisory control of the facility and for acquiring and monitoring signals at run time. Apart from the conventional DACS, high-speed data acquisition is needed for diagnostics and other signals with sampling-rate requirements of 100 MS/s. For such systems it is imperative that data be acquired only for a certain duration upon the occurrence of an event. Such acquisition systems are classified as event-driven data acquisition systems and are widely used in the fusion engineering community. The development and implementation of high-speed event-driven data acquisition systems pose challenges in terms of data buffering, data streaming and synchronization. The precise timing of the event occurrence with respect to the main pulse must also be known, for which a timing and synchronization system is needed. For this purpose, the IEEE 1588 protocol offers a network-based method of synchronizing multiple Ethernet nodes with sub-microsecond accuracy. The present work describes the development of event-driven high-speed data acquisition in synchronization with an IEEE 1588 based timing network.
The work proposes time stamping of events from the synchronized network, integrated with the event acquisition platform. The developed system has been tested for pulse durations longer than 3600 s. The performance of the event data acquisition system has been thoroughly characterized and improved by applying various software optimizations in LabVIEW. Although intended for INTF, the present work can be applied to any such facility in fusion or other areas. The paper provides results for the IEEE 1588 based timing network in various configurations and for the event data acquisition system developed on a high-speed FPGA-controlled digitizer board, with prototype results. [ABSTRACT FROM AUTHOR]
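The event-driven acquisition principle (a pre-trigger ring buffer plus a fixed post-trigger window, stamped with a synchronized time) can be sketched as follows. Here the timestamp is simply the sample index, standing in for the time an IEEE 1588-synchronized clock would supply, and the buffer sizes and threshold are hypothetical:

```python
from collections import deque

def capture_events(stream, threshold, pre=3, post=3):
    """Event-driven capture sketch: keep a short pre-trigger ring
    buffer; when a sample crosses the threshold, record the trigger
    timestamp plus the pre- and post-trigger samples."""
    ring = deque(maxlen=pre)
    events, hold = [], 0
    for t, x in enumerate(stream):
        if hold:                       # still collecting post-trigger samples
            events[-1][1].append(x)
            hold -= 1
        elif x >= threshold:           # event detected: stamp and capture
            events.append((t, list(ring) + [x]))
            hold = post
        ring.append(x)
    return events

print(capture_events([0, 1, 0, 0, 9, 2, 1, 0, 0, 0], threshold=5))
```

Only the windows around triggers are kept, which is what makes event-driven acquisition tractable at 100 MS/s-class rates over hour-long pulses; the synchronized timestamp is what lets each window be placed relative to the main pulse.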
- Published
- 2024
- Full Text
- View/download PDF
37. Tomato quality assessment and enhancement through Fuzzy Logic: A ripe perspective on precision agriculture.
- Author
-
Cano-Lara, M. and Rostro-Gonzalez, H.
- Subjects
- *
FUZZY logic , *TOMATOES , *ANALYSIS of colors , *MEMBERSHIP functions (Fuzzy logic) , *IMAGE processing , *IMAGING systems , *CHERRIES , *PRECISION farming - Abstract
This study presents a meticulous exploration of tomato quality assessment and enhancement employing Fuzzy Logic, focusing on beefsteak, Roma, and cherry varieties. The research takes a multivariate approach encompassing shape, size, texture, and color analysis. Using a controlled imaging system, 165 tomatoes were captured at various ripening stages to ensure uniformity in lighting, camera position, and background color. Image processing and computer vision techniques, including segmentation and feature extraction, were applied to comprehensively evaluate the tomatoes. Quality assessment considered three stages of skin coloration, defect quantity, and skin texture calculated using the statistical measure of entropy. Decision-making and quality evaluation were carried out by a Mamdani-type Fuzzy Inference System, leveraging numerical values of external characteristics such as dimension, calyx shape, ripeness stage, and defect regions. The fuzzy logic-based system employed 22 fuzzy rules and three different membership functions (trapezoidal, Gaussian, and triangular), enabling a nuanced mapping of transitions between classes within the input variables. Furthermore, the system underwent rigorous training and validation using 80% and 20% of the captured images, respectively. Through the system, we were able to identify and segment areas of tomatoes in poor condition. This approach extends beyond conventional methods of estimating fruit ripeness, which typically rely on color assessment or destructive firmness evaluations. [ABSTRACT FROM AUTHOR]
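A Mamdani inference cycle (fuzzify, fire rules with min/max, aggregate, defuzzify by centroid) can be sketched with two toy rules. The membership functions, rules, inputs, and 0-10 quality scale below are invented and far simpler than the paper's 22-rule system:

```python
def tri(x, a, b, c):
    """Triangular membership function with feet a, c and peak b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def quality(ripeness, defects):
    """Tiny Mamdani-style inference: min for AND, max for OR and
    aggregation, centroid defuzzification on a coarse 0..10 grid."""
    # Rule 1: ripe AND few defects -> high quality
    r1 = min(tri(ripeness, 0.5, 1.0, 1.5), tri(defects, -5, 0, 5))
    # Rule 2: unripe OR many defects -> low quality
    r2 = max(tri(ripeness, -0.5, 0.0, 0.7), tri(defects, 3, 10, 17))
    num = den = 0.0
    for q in range(11):                      # output grid 0..10
        mu = max(min(r1, tri(q, 5, 8, 11)),  # "high" clipped at r1
                 min(r2, tri(q, -1, 2, 5)))  # "low" clipped at r2
        num += q * mu
        den += mu
    return num / den if den else 5.0
```

A fully ripe, defect-free input lands near the "high" set's centroid and an unripe, heavily defective one near the "low" set's, showing how the rule base maps graded inputs to a graded quality score.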
- Published
- 2024
- Full Text
- View/download PDF
38. Sequentially optimized data acquisition for a geothermal reservoir.
- Author
-
Corso, Anthony, Chiotoroiu, Maria, Clemens, Torsten, Zechner, Markus, and Kochenderfer, Mykel J.
- Subjects
- *
GROUND source heat pump systems , *PARTIALLY observable Markov decision processes , *ACQUISITION of data , *NET present value , *GEOTHERMAL resources - Abstract
Given the high energy demands for heating and cooling, and the currently limited use of renewable sources, there is a pressing need for efficient and economically viable geothermal development. However, the success of new geothermal reservoir projects hinges on numerous uncertain geological and economic factors. Prior to development, the project uncertainty can be reduced by performing costly data acquisition campaigns. The central question in these campaigns is: which data should be acquired in which order to determine the viability of the proposed project? Traditional methods based on value of information and other heuristics are insufficient for assessing large-scale multi-well geothermal projects. Our approach reformulates geothermal assessment as a partially observable Markov decision process (POMDP), employing algorithmic decision-making techniques to devise an approximately optimal data acquisition strategy. Applied to a real-world low-enthalpy geothermal project in Austria, our methodology increased the expected net present value of the project by nearly 30% over baseline methods (including human experts) by minimizing the risk of developing a non-profitable project. This advancement not only promises a more efficient path towards large-scale geothermal energy production but also sets a precedent for applying sophisticated decision-making frameworks in renewable energy projects. [ABSTRACT FROM AUTHOR]
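The building block behind such campaigns, the value of information of a single noisy test, can be sketched as follows. The payoffs, test accuracy, and binary good/bad site model are hypothetical; the paper's POMDP formulation generalizes this one-step calculation to optimally ordered sequences of tests:

```python
def expected_value(prior_good, payoff_good=100.0, payoff_bad=-60.0):
    """Best of develop vs. walk away, given the belief the site is good."""
    develop = prior_good * payoff_good + (1 - prior_good) * payoff_bad
    return max(develop, 0.0)

def value_of_test(prior_good, accuracy, cost):
    """One-step value of information for a noisy go/no-go test:
    expected value after Bayesian-updating on the test result, minus
    the value of acting on the prior alone, minus the test cost."""
    p_pos = accuracy * prior_good + (1 - accuracy) * (1 - prior_good)
    post_pos = accuracy * prior_good / p_pos
    post_neg = (1 - accuracy) * prior_good / (1 - p_pos)
    with_test = (p_pos * expected_value(post_pos)
                 + (1 - p_pos) * expected_value(post_neg))
    return with_test - expected_value(prior_good) - cost

print(value_of_test(prior_good=0.5, accuracy=0.8, cost=5.0))
```

A test is worth buying only when this quantity is positive; the paper's contribution is deciding which of many such tests to buy, and in what order, when each purchase changes the value of the rest.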
- Published
- 2024
- Full Text
- View/download PDF
39. Challenges in data acquisition for suicide attempt recurrence research.
- Author
-
Tamune, Hidetaka
- Published
- 2024
- Full Text
- View/download PDF
40. Insight into chemical basis of traditional Chinese medicine based on the state-of-the-art techniques of liquid chromatography−mass spectrometry.
- Author
-
Yu, Yang, Yao, Changliang, and Guo, De-an
- Subjects
LIQUID chromatography-mass spectrometry ,CHINESE medicine ,ACQUISITION of data ,MASS spectrometry ,NEUTRAL density filters - Abstract
Traditional Chinese medicine (TCM) has been an indispensable source of drugs for curing various human diseases. However, the inherent chemical diversity and complexity of TCM have restricted the safety and efficacy of its use. Over the past few decades, the combination of liquid chromatography with mass spectrometry has contributed greatly to TCM qualitative analysis, and novel approaches have been continuously introduced to improve analytical performance, including both data acquisition methods that generate large and informative datasets, and data post-processing tools that extract structure-related MS information. Furthermore, fast-developing computer techniques and big data analytics have markedly enriched the data processing toolbox, bringing benefits of high efficiency and accuracy. To provide an up-to-date review of the latest techniques in TCM qualitative analysis, multiple data-independent acquisition methods and data-dependent acquisition methods (precursor ion list, dynamic exclusion, mass tag, precursor ion scan, neutral loss scan, and multiple reaction monitoring) and post-processing techniques (mass defect filtering, diagnostic ion filtering, neutral loss filtering, mass spectral trees similarity filter, molecular networking, statistical analysis, database matching, etc.) are summarized and categorized. Applications of each technique and integrated analytical strategies are highlighted, and discussion and future perspectives are provided as well. This review summarizes the mechanisms and applications of various data acquisition and data post-processing techniques of liquid chromatography–mass spectrometry (LC–MS) in the qualitative analysis of traditional Chinese medicine (TCM). [Display omitted] [ABSTRACT FROM AUTHOR]
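Among the post-processing tools listed, mass defect filtering is simple enough to sketch directly: keep ions whose decimal mass falls within a window around that of a template compound, since structural analogues share similar mass defects. The peak list, template, and window below are hypothetical:

```python
def mass_defect(mz):
    """Decimal part of an m/z value (its 'mass defect')."""
    return mz - int(mz)

def mdf(peaks, template_mz, window=0.05):
    """Mass defect filtering: keep ions whose mass defect lies within
    a window around a template compound's defect, a common way to pull
    structurally related constituents out of a crowded TCM spectrum."""
    target = mass_defect(template_mz)
    return [mz for mz in peaks if abs(mass_defect(mz) - target) <= window]

# Hypothetical peak list (m/z values); the last two have defects far
# from the template and are rejected by the filter.
peaks = [285.0763, 301.0712, 447.0933, 512.9041, 198.5512]
flavonoid_like = mdf(peaks, 285.0763)
print(flavonoid_like)
```

Real implementations work on neutral masses and often use sloped or relative windows, but the filtering principle is the same.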
- Published
- 2021
- Full Text
- View/download PDF
41. Neurologic Factors in Patients with Vascular Mild Cognitive Impairment Based on fMRI.
- Author
-
Zhuang, Yingying, Shi, Yuntao, Zhang, Jiandong, Kong, Dan, Guo, Lili, Bo, Genji, and Feng, Yun
- Subjects
- *
MILD cognitive impairment , *FUNCTIONAL magnetic resonance imaging , *CLINICAL neuropsychology , *MONTREAL Cognitive Assessment , *MAGNETIC resonance imaging , *PARIETAL lobe - Abstract
This study focused on the application of functional magnetic resonance imaging (fMRI) and neuropsychology in the diagnosis of vascular mild cognitive impairment (MCI) and the exploration of its relevant factors. The study enrolled 28 patients with vascular MCI in an observation group and 30 healthy individuals in a control group. All patients underwent magnetic resonance imaging, and an automatic segmentation algorithm based on graph theory was adopted to process the images. Age, sex, disease course, Montreal Cognitive Assessment score, regional homogeneity, and amplitude of low-frequency fluctuation levels were recorded. There were no significant differences in age, sex, or disease course between the observation group and the control group (P > 0.05). The level of regional homogeneity in the left posterior cerebellum in the observation group was significantly higher than that in the control group (P < 0.05). The regional homogeneity level of the bilateral cingulate cortex was negatively correlated with the Montreal Cognitive Assessment score (P < 0.05). The amplitude of low-frequency fluctuation of the bilateral inferior parietal lobe, parietal lobe, and prefrontal lobe in the observation group was significantly lower than that in the control group, and the amplitude of low-frequency fluctuation of the bilateral anterior cingulate gyrus, superior medial frontal gyrus, orbital frontal gyrus, right middle frontal gyrus, and right supplementary motor area was higher than that in the control group (P < 0.05). Heart disease, such as myocardial infarction and atrial fibrillation, is a high risk factor for vascular MCI. Functional magnetic resonance imaging combined with an automatic segmentation algorithm can noninvasively observe changes in a patient's brain tissue, which can be used in the recognition of vascular MCI. [ABSTRACT FROM AUTHOR]
- Published
- 2021
- Full Text
- View/download PDF
42. Approach for Efficient Acquisition of Energy Data and Identification of Energy-related Process Parameters in Lithium-Ion Battery Cell Production.
- Author
-
Maier, Maria, Vernim, Susanne, and Reinhart, Gunther
- Abstract
Energy demand is one of the main cost drivers in lithium-ion battery (LIB) production. To cut costs and reach an energy-efficient production, transparency concerning the energy demand and its correlations with process parameters must be created. This paper presents a systematic approach for the identification and analysis of energy data points and their interrelated process parameters along the LIB production process chain. Additionally, an overview of measurement strategies and possible technical solutions for energy data collection is provided. The developed approach reduces the effort of collecting energy data and assists the industry in achieving a cost-efficient battery production. [ABSTRACT FROM AUTHOR]
- Published
- 2021
- Full Text
- View/download PDF
43. Ontology-based Traceability System for Interoperable Data Acquisition in Battery Cell Manufacturing.
- Author
-
Wessel, Jacob, Turetskyy, Artem, Wojahn, Olaf, Abraham, Tim, and Herrmann, Christoph
- Abstract
In order to support the transformation of the energy and transportation sectors, the costs and environmental impacts of battery cells need to be reduced. Data acquisition plays a major role in generating transparency within the complex system of battery manufacturing and enables its improvement. This paper presents a methodology for the development of an ontology-based traceability system for data acquired along the battery cell manufacturing chain. This system provides interrelations between data, data sources, and corresponding entities, enabling interoperable data acquisition. A data basis generated with this ontology-based traceability system supports and eases data analytics applications in battery cell manufacturing. [ABSTRACT FROM AUTHOR]
- Published
- 2021
- Full Text
- View/download PDF
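The interrelations described in record 43 can be illustrated with a minimal subject-predicate-object triple store, the basic data model behind ontologies. This is a conceptual sketch only; the entity and relation names (e.g. "producedBy", "describes") are illustrative and not taken from the paper.

```python
# Minimal triple store sketch for manufacturing traceability.
# Triples interrelate data records, data sources and process entities.
class TripleStore:
    def __init__(self):
        self.triples = set()

    def add(self, subject, predicate, obj):
        self.triples.add((subject, predicate, obj))

    def query(self, subject=None, predicate=None, obj=None):
        # Wildcard match: None matches any value in that position.
        return [t for t in self.triples
                if (subject is None or t[0] == subject)
                and (predicate is None or t[1] == predicate)
                and (obj is None or t[2] == obj)]

store = TripleStore()
store.add("cell_0042", "producedBy", "coating_step")
store.add("coating_thickness_log", "measuredAt", "coating_step")
store.add("coating_thickness_log", "describes", "cell_0042")

# Trace all data records that describe a given cell.
records = store.query(predicate="describes", obj="cell_0042")
```

In a real system the same queries would run against an RDF store, but the traceability principle (following typed links from a product back to its acquired data) is the same.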
44. Data Acquisition and Preparation – Enabling Data Analytics Projects within Production.
- Author
-
Schock, Christoph, Dumler, Jonas, and Doepper, Frank
- Abstract
The increasing amount of available data in production systems is associated with great potential for process optimization. Due to the lack of a data analytics methodology and low data quality within production, these potentials often remain unused. Therefore, in this paper we present a model for data acquisition and data preparation, including feature engineering, for characteristic sensor signals of production machines. The model allows the extraction of relevant process information from the signal, which can be used for monitoring, KPI tracking, trend analysis, and anomaly detection. The approach is evaluated on an industrial turning process. [ABSTRACT FROM AUTHOR]
- Published
- 2021
- Full Text
- View/download PDF
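Feature engineering on sensor signals, as in record 44, typically condenses a raw signal into a handful of descriptive statistics usable for monitoring and anomaly detection. The following sketch assumes a generic feature set (mean, RMS, peak-to-peak); the paper's actual features for the turning process are not specified here.

```python
import math

def extract_features(signal):
    """Illustrative feature set for a machine sensor signal:
    mean, RMS and peak-to-peak amplitude."""
    n = len(signal)
    mean = sum(signal) / n
    rms = math.sqrt(sum(x * x for x in signal) / n)
    peak_to_peak = max(signal) - min(signal)
    return {"mean": mean, "rms": rms, "peak_to_peak": peak_to_peak}

# One period of a unit square-ish test signal.
feats = extract_features([0.0, 1.0, 0.0, -1.0])
```

Tracking such features over consecutive machining cycles is what enables trend analysis and KPI tracking without storing the full raw signal.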
45. Dynamic Data Acquisition and Preprocessing for Automatic Behavioral Modeling of Cyber-physical Systems.
- Author
-
Sai, Brandon K., Mayer, Yannick T., and Bauernhansl, Thomas
- Abstract
To identify the potential for optimization regarding the Overall Equipment Effectiveness (OEE) of production systems, a broad spectrum of behavior models is available. However, several tasks, including data extraction, data preprocessing, and modeling, are still performed manually. This paper contributes to the automation of the process of behavioral modeling. In this work, dynamic data acquisition and preprocessing are introduced. Both the fundamental method of ELT (extract, load, transform) and its technical implementation are presented. The beneficial reduction of data volume and model complexity has been validated in a real industrial use case. [ABSTRACT FROM AUTHOR]
- Published
- 2021
- Full Text
- View/download PDF
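The ELT pattern named in record 45 loads raw machine data unchanged and applies reduction afterwards, on demand, rather than transforming before loading as in classic ETL. A minimal sketch, with a plain list standing in for the raw staging store and downsampling as an assumed example transform:

```python
raw_store = []  # stand-in for a raw data lake / staging table

def extract_and_load(samples):
    # E and L of ELT: load raw samples without modification.
    raw_store.extend(samples)

def transform_downsample(factor):
    # T of ELT: reduce data volume on demand by keeping
    # every `factor`-th sample.
    return raw_store[::factor]

extract_and_load([10, 11, 12, 13, 14, 15, 16, 17])
reduced = transform_downsample(4)
```

Keeping the raw data intact is what lets later modeling steps re-derive different reductions without re-acquiring from the machine.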
46. Towards Data Acquisition for Predictive Maintenance of Industrial Robots.
- Author
-
Nentwich, Corbinian and Reinhart, Gunther
- Abstract
Predictive maintenance of industrial robots offers the potential to increase productivity and cut costs in highly automated production systems. The success of such maintenance strategies is highly dependent on the data acquisition strategy used to monitor the robot's health state. In this publication, we first describe a methodology for deriving a suitable data acquisition strategy. Second, we apply this methodology to shape a data acquisition strategy for articulated robots. This strategy defines the robot components for which data is acquired, the robot trajectories used for the data acquisition, and the frequency at which measurements are taken. To conclude, we discuss the methodology's limitations. [ABSTRACT FROM AUTHOR]
- Published
- 2021
- Full Text
- View/download PDF
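A data acquisition strategy of the kind record 46 describes can be captured as a small configuration object holding its three elements: monitored components, measurement trajectories, and measurement frequency. All concrete values below are hypothetical examples, not the paper's strategy.

```python
from dataclasses import dataclass, field

@dataclass
class AcquisitionStrategy:
    components: list = field(default_factory=list)    # parts to monitor
    trajectories: list = field(default_factory=list)  # dedicated test motions
    interval_hours: float = 24.0                      # measurement frequency

    def due(self, hours_since_last):
        # True when the next measurement run should be triggered.
        return hours_since_last >= self.interval_hours

strategy = AcquisitionStrategy(
    components=["axis1_gearbox", "axis2_gearbox"],
    trajectories=["full_range_sweep"],
    interval_hours=24.0,
)
```

Encoding the strategy as data rather than code makes it easy to derive variants per robot type, which is the point of having a methodology for shaping it.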
47. Design of electric automatic control water treatment system based on LabVIEW.
- Author
-
Yujing Yang and Dan Wu
- Subjects
WATER purification ,AUTOMATIC control systems ,PROGRAMMABLE controllers ,BIOCHEMICAL oxygen demand ,GOAL (Psychology) - Abstract
Establishing an effective electrically automated water treatment system that optimizes operation and reduces cost investment is of great significance. The design of an electric automatic control water treatment system based on LabVIEW is proposed. A Siemens industrial computer is used as the upper computer, and the Siemens S7-300 series is used in a programmable logic controller (PLC) system. Based on the LabVIEW development platform, an OPC server is set up with Siemens SIMATIC NET, which connects with the PLC to realize real-time communication. SQL Server is used as the back-end database to realize basic monitoring functions such as data collection, equipment control, real-time curve display, history records, and fault alarms. The experimental results show that the system can predict the biological oxygen demand concentration online and compare it with the measured value. It can adjust the control strategy in time, reduce faults, save dosage, and reach the goal of economical operation. [ABSTRACT FROM AUTHOR]
- Published
- 2021
- Full Text
- View/download PDF
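The compare-and-adjust loop in record 47 (predicted vs. measured biological oxygen demand, BOD) can be sketched in Python as a stand-in for the LabVIEW implementation. The proportional correction and the tolerance value are assumptions for illustration, not the paper's control law.

```python
def adjust_dosing(predicted_bod, measured_bod, dose, tolerance=0.1):
    """Return an updated dose: increase it when BOD is under-predicted,
    decrease it when over-predicted; keep it within the relative tolerance."""
    error = (measured_bod - predicted_bod) / predicted_bod
    if abs(error) <= tolerance:
        return dose                  # prediction within tolerance: keep dose
    return dose * (1.0 + error)      # simple proportional correction

# Measured BOD 25% above prediction -> dose raised proportionally.
new_dose = adjust_dosing(predicted_bod=20.0, measured_bod=25.0, dose=4.0)
```

In the reported system this decision would run continuously against values read from the PLC over OPC and logged to SQL Server.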
48. A cost-effective process chain for thermoplastic microneedle manufacture combining laser micro-machining and micro-injection moulding.
- Author
-
Gülçür, Mert, Romano, Jean-Michel, Penchev, Pavel, Gough, Tim, Brown, Elaine, Dimov, Stefan, and Whiteside, Ben
- Subjects
MICROINJECTIONS ,MICROMACHINING ,COMPUTER vision ,LASER machining ,LASERS ,OPTICAL imaging sensors - Abstract
• A microneedle mould insert has been produced using laser micro-machining in less than 45 min.
• State-of-the-art micro-injection moulding has been used for replication of the microneedle cavities.
• An automated, bespoke measurement apparatus for the quality assessment of microneedles has been developed.
• Process data have been interrogated for in-line quality assurance of microneedle arrays.
High-throughput manufacturing of transdermal microneedle arrays poses a significant challenge due to the high precision and number of features that need to be produced and the requirement of multi-step processing methods for achieving challenging micro-features. To address this challenge, we report a flexible and cost-effective process chain for transdermal microneedle array manufacture that includes mould production using laser machining and replication of thermoplastic microneedles via micro-injection moulding (micromoulding). The process chain also incorporates an in-line manufacturing data monitoring capability where the variability in the quality of microneedle arrays can be determined in a production run using captured data. Optical imaging and machine vision technologies are also implemented to create a quality inspection system that allows rapid evaluation of key quality indicators. The work presents the capability of laser machining as a cost-effective method for making microneedle moulds and micro-injection moulding of thermoplastic microneedle arrays as a highly suitable manufacturing technique for large-scale production with low marginal cost. [ABSTRACT FROM AUTHOR]
- Published
- 2021
- Full Text
- View/download PDF
49. Towards a scalable implementation of digital twins - A generic method to acquire shopfloor data.
- Author
-
Dittmann, Sören, Zhang, Pengxiang, Glodde, Arne, and Dietrich, Franz
- Abstract
New strategies in factory management focus on shopfloor data as a means to achieve more flexible mass production. In reality, the value-adding usage of smart manufacturing approaches depends on the effort it takes to provide sensor and actuator data for further analysis. The concept of digital twins, which are based on a physical, a virtual, and a communication component, promises a scalable solution to read and standardize shopfloor data. So far, no generic method exists that describes the implementation of the data acquisition as part of the communication component of digital twins. This contribution therefore focuses on the necessary steps to consider when realizing a scalable data acquisition and examines how existing standards and open-source solutions can support the implementation. The first part of the paper derives the necessary steps for acquiring shopfloor data. It starts by defining the architecture and aim of digital twins. Afterwards, a method to implement the generic data acquisition of digital twins is introduced. The approach is inspired by the Plug-and-Produce paradigm of the control engineering field and is adapted to the concepts of data acquisition and management. The second part examines the technical implementation of the proposed method. Approaches identified in a literature review are structured within the generic method. This describes the realization of the individual steps but also systematically differentiates existing approaches to building digital twins. With the aim of creating scalable architectures, a special focus is set on available open-source solutions and standards when presenting the implementation part of the generic method. [ABSTRACT FROM AUTHOR]
- Published
- 2020
- Full Text
- View/download PDF
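The Plug-and-Produce idea that record 49 adapts can be illustrated as follows: each shopfloor device registers an adapter exposing a common read interface, so the digital twin's communication component can acquire and standardize data without device-specific code. The device names and payloads below are illustrative assumptions.

```python
adapters = {}

def register_adapter(device_id, read_fn):
    # "Plug": a new device contributes a reader with a uniform signature.
    adapters[device_id] = read_fn

def acquire_all():
    # "Produce": a standardized snapshot, device id -> latest reading.
    return {device_id: read_fn() for device_id, read_fn in adapters.items()}

register_adapter("press_01", lambda: {"force_kN": 120.0})
register_adapter("robot_02", lambda: {"joint1_deg": 35.5})
snapshot = acquire_all()
```

Scalability comes from the uniform interface: adding a device means registering one adapter, while everything downstream of `acquire_all` stays unchanged.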
50. Implementation and potentials of a machine vision system in a series production using deep learning and low-cost hardware.
- Author
-
Würschinger, Hubert, Mühlbauer, Matthias, Winter, Michael, Engelbrecht, Michael, and Hanenkamp, Nico
- Abstract
For manufacturing processes there is a need to ensure efficient production and to fulfill increasing quality requirements. To handle these challenges, machine vision systems can be used for process monitoring and quality control. In this paper, the implementation and thereby the potentials of such a system in a series production, using transfer learning with low-cost hardware, are introduced. The necessary steps, from the hardware implementation and data acquisition through preprocessing and optimization to the application, are depicted. Finally, we show that the proposed solution can fulfill the defined requirements and can compete with a professional machine vision system. [ABSTRACT FROM AUTHOR]
- Published
- 2020
- Full Text
- View/download PDF
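The transfer-learning setup of record 50 keeps a pretrained feature extractor frozen and fits only a small classifier head on the plant's own images. The sketch below is purely conceptual: the "extractor" is a stand-in function (mean brightness and contrast span), and the "head" is a trivial threshold, not the paper's network.

```python
def frozen_feature_extractor(image):
    # Stand-in for a frozen pretrained CNN backbone: reduce a 2D image
    # (list of rows of pixel values) to two summary features.
    flat = [p for row in image for p in row]
    return (sum(flat) / len(flat), max(flat) - min(flat))

def train_head(samples):
    # Fit a trivial threshold "head" on the extracted mean-brightness
    # feature: the midpoint between the two class averages.
    ok = [frozen_feature_extractor(img)[0] for img, lbl in samples if lbl == "OK"]
    nok = [frozen_feature_extractor(img)[0] for img, lbl in samples if lbl == "NOK"]
    threshold = (sum(ok) / len(ok) + sum(nok) / len(nok)) / 2
    return lambda img: "OK" if frozen_feature_extractor(img)[0] > threshold else "NOK"

classify = train_head([
    ([[200, 210], [205, 215]], "OK"),   # bright sample part
    ([[40, 50], [45, 55]], "NOK"),      # dark sample part
])
result = classify([[190, 195], [200, 205]])
```

Only `train_head` touches the plant's labeled images; that asymmetry (cheap head, expensive frozen backbone) is what makes the approach viable on low-cost hardware.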