Search Results
2,557 results for "large-scale"
2. Exploring the relation between spatial abilities and STEM expertise.
- Author
- Tomai, Eleni, Kokla, Margarita, Charcharos, Christos, and Kavouras, Marinos
- Subjects
- *STEM education, *INDIVIDUAL differences, *SELF-evaluation, *SPATIAL ability, *SURVEYS
- Abstract
Small-scale spatial abilities, which involve the mental representation and transformation of two- and three-dimensional images and the manipulation of objects at table-top scale, have been studied extensively and are considered predictive of both interest and success in STEM disciplines. However, research investigating the relation of large-scale spatial abilities to STEM disciplines is sparse. The paper describes the design and implementation of a study for assessing individual differences (if any) in spatial abilities in both figural and environmental spaces between STEM experts (with over 10 years of experience) and non-experts (individuals without any studies in STEM fields). Participants' performance in 16 small-scale tasks, 10 large-scale tasks, and one self-assessment questionnaire at environmental scale was evaluated to assess their corresponding abilities. Results indicate differences between experts and non-experts, most pronounced for small-scale abilities, where experts outperform non-experts. At large scale, some significant differences are identified, which also favor experts. Correlations among the variables tested provide evidence that different abilities are prominent between experts and non-experts. [ABSTRACT FROM AUTHOR]
- Published
- 2024
3. Experimental investigations on normal mode nodes as support positions of a resonant testing facility for bending fatigue tests.
- Author
- Schramm, Clara, Birkner, Dennis, Schneider, Sebastian, and Marx, Steffen
- Subjects
- *RESONANT vibration, *STEEL pipe, *VIBRATION tests, *BEND testing, *VIBRATION isolation, *FATIGUE testing machines
- Abstract
Large‐scale fatigue testing is very important to research on scale effects, which occur in large cyclically loaded structures such as wind turbine towers. However, such experimental testing has a very high energy consumption. As an efficient alternative, this paper presents a new resonant testing facility for large‐scale specimens under cyclic bending loads. The facility works as a 4‐point bending test in which the specimen is supported at the nodes of its first normal bending mode, where theoretically no reaction forces occur. Two counter‐rotating imbalance motors with excitation frequencies near resonance generate a harmonic force acting on the specimen. Experimental trial fatigue tests on a steel pipe specimen were carried out in order to validate the new testing setup. A great decrease in the support forces was achieved by placing the supports at the normal mode nodes. Additionally, the behavior of the support forces under varying positions and excitation frequencies was investigated. In summary, the resonant testing method combined with supports at the normal mode nodes offers an efficient and energy‐saving setup for large‐scale fatigue tests. Highlights: A resonant testing facility for large‐scale bending fatigue tests was developed and tested. Supports were placed at the nodes of the specimen's first normal bending mode. The influence of the support positions on the support forces was investigated. A significant reduction in support forces was achieved. [ABSTRACT FROM AUTHOR]
- Published
- 2024
4. Autoencoder evolutionary algorithm for large-scale multi-objective optimization problem.
- Author
- Hu, Ziyu, Xiao, Zhixing, Sun, Hao, and Yang, He
- Abstract
Multi-objective optimization problems characterized by a substantial number of decision variables, also called large-scale multi-objective optimization problems (LSMOPs), are becoming increasingly prevalent. Traditional evolutionary algorithms may deteriorate drastically when tackling a large number of decision variables. For LSMOPs, the dimensionality of the decision variables needs to be reduced and the algorithm needs to be designed along divide-and-conquer lines. The autoencoder evolutionary algorithm (AEEA) is proposed, based on autoencoder dimensionality reduction, the grouping of decision variables, and divide-and-conquer strategies. The proposed algorithm is compared with other classical algorithms. Experimental results show that AEEA achieves excellent convergence and diversity, and still performs well with higher-dimensional decision variables. Finally, it is verified that the autoencoder improves the running time of the proposed algorithm. [ABSTRACT FROM AUTHOR]
- Published
- 2024
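The core idea behind autoencoder-based reduction for LSMOPs (compress the decision variables into a low-dimensional latent space, search there, and decode candidates back) can be sketched generically. The code below is not the authors' AEEA; it is a minimal NumPy illustration under two simplifying assumptions: a linear autoencoder, and a toy population whose useful solutions lie near a 5-dimensional subspace of a 100-dimensional decision space.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy setting (hypothetical): 100 decision variables, but candidate
# solutions lie near a 5-dimensional linear subspace.
D, d, n = 100, 5, 256
basis = rng.normal(size=(d, D))
population = rng.normal(size=(n, d)) @ basis   # n candidate solutions

# Linear autoencoder: encode D -> d, decode d -> D, trained by gradient
# descent on the mean squared reconstruction error.
W_enc = 0.01 * rng.normal(size=(D, d))
W_dec = 0.01 * rng.normal(size=(d, D))
lr = 0.05

def recon_error(X):
    return float(np.mean((X @ W_enc @ W_dec - X) ** 2))

err_before = recon_error(population)
for _ in range(2000):
    Z = population @ W_enc                      # latent (reduced) representation
    G = 2.0 * (Z @ W_dec - population) / population.size
    grad_dec = Z.T @ G                          # gradient wrt decoder weights
    grad_enc = population.T @ (G @ W_dec.T)     # gradient wrt encoder weights
    W_dec -= lr * grad_dec
    W_enc -= lr * grad_enc
err_after = recon_error(population)

# A search algorithm can now operate on d-dimensional latent vectors
# (population @ W_enc) and decode candidates back via Z @ W_dec.
```

In the actual algorithm the autoencoder is nonlinear and is combined with variable grouping; this sketch only shows why the reduction makes the search space tractable.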
5. Using eDNA Sampling to Identify Correlates of Species Occupancy Across Broad Spatial Scales.
- Author
- McColl‐Gausden, Emily F., Griffiths, Josh, Weeks, Andrew R., and Tingley, Reid
- Subjects
- *PLATYPUS, *SPECIES distribution, *FARMS, *WATER sampling, *LAND use
- Abstract
ABSTRACT
Aim: Species presence–absence data can be time‐consuming and logistically difficult to obtain across large spatial extents. Yet these data are important for ensuring changes in species distributions are accurately monitored and are vital for ensuring appropriate conservation actions are undertaken. Here, we demonstrate how environmental DNA (eDNA) sampling can be used to systematically collect species occupancy data rapidly and efficiently across vast spatial domains to improve understanding of factors influencing species distributions.
Location: South‐eastern Australia.
Methods: We use a widely distributed, but near‐threatened species, the platypus (Ornithorhynchus anatinus), as a test case and undertake an environmentally stratified systematic survey to assess the presence–absence of platypus eDNA at 504 sites across 584,292 km² of south‐eastern Australia, representing ~37% of the species' extensive distribution. Site occupancy‐detection models were used to analyse how landscape‐ and site‐level factors affect platypus occupancy, enabling us to incorporate uncertainty at the different levels inherent in eDNA sampling (site, water sample replicate, and qPCR replicate).
Results: Platypus eDNA was detected at 272 sites (~54%), with platypuses more likely to occupy sites in catchments with increased runoff and fewer zero‐flow days, and sites with access to banks suitable for burrowing. Platypuses were less likely to occupy sites in catchments with a high proportion of shrubs and grasslands, or agricultural land use.
Main Conclusions: These data provide an important large‐scale validation of the landscape‐ and site‐level factors influencing platypus occupancy that can be used to inform future conservation efforts. Our case study shows that systematically designed, stratified eDNA surveys provide an efficient means to understand how environmental characteristics affect species occupancy across broad environmental gradients. The methods employed here can be applied to aquatic and semi‐aquatic species globally, providing unprecedented opportunities to understand biodiversity status and change and provide insights for current and future conservation actions. [ABSTRACT FROM AUTHOR]
- Published
- 2024
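The three nested levels of uncertainty the abstract mentions (site, water-sample replicate, qPCR replicate) are conventionally captured by a multi-scale occupancy-detection model. The following is a generic sketch of that standard formulation, not the paper's exact specification; symbols ψ, θ, p and the covariate vector are the usual textbook notation:

```latex
\begin{aligned}
z_i &\sim \mathrm{Bernoulli}(\psi_i) &&\text{site $i$ occupied}\\
a_{ij} \mid z_i &\sim \mathrm{Bernoulli}(z_i\,\theta_{ij}) &&\text{eDNA captured in water sample $j$}\\
y_{ijk} \mid a_{ij} &\sim \mathrm{Bernoulli}(a_{ij}\,p_{ijk}) &&\text{detection in qPCR replicate $k$}\\
\operatorname{logit}(\psi_i) &= \mathbf{x}_i^{\top}\boldsymbol{\beta} &&\text{landscape- and site-level covariates}
\end{aligned}
```

Covariates such as catchment runoff, zero-flow days, and bank suitability would enter through $\mathbf{x}_i$, so occupancy effects are estimated while imperfect capture and detection are modeled explicitly.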
6. Large-Scale Green Method for Synthesizing Ultralong Uniform Tellurium Nanowires for Semiconductor Devices.
- Author
- Lyu, Zhiyi, Park, Mose, Tang, Yanjin, Choi, Hoon, Song, Seung Hyun, and Lee, Hoo-Jeong
- Abstract
This study presents a large-scale green approach for synthesizing ultralong tellurium nanowires with diameters around 13 nm using a solution-based method. By adjusting key synthesis parameters such as the surfactant concentration, temperature, and reaction duration, we achieved high-quality, ultralong Te NWs. These nanowires exhibit properties suitable for semiconductor applications, particularly when employed as channel materials in thin-film transistors, displaying a pronounced gate effect with a high on/off switching ratio of up to 10⁴ and a mobility of 0.9 cm² V⁻¹ s⁻¹. This study underscores the potential of solvent-based methods in synthesizing large-scale ultralong Te NWs as a critical resource for future sustainable nanoelectronic devices. [ABSTRACT FROM AUTHOR]
- Published
- 2024
7. Recrystallization Strategy for Efficient Preparation of Metal Halide Single Crystals with High‐Quality.
- Author
- Yun, Xiangyan, Zhou, Bo, Hu, Hanlin, Zhong, Haizhe, Xu, Denghui, Li, Henan, and Shi, Yumeng
- Subjects
- *CRYSTAL growth, *SUSTAINABLE chemistry, *SINGLE crystals, *METAL halides, *ELECTRONIC art, *PEROVSKITE
- Abstract
Large, high‐quality perovskite single crystals are highly desirable for investigating fundamental materials properties and realizing state‐of‐the‐art electronic/optoelectronic device performance. Herein, a novel single‐crystal growth method is reported, based on recrystallization of perovskites in oversaturated solutions. Perovskite single crystals, including both organic–inorganic hybrid metal halides and their all‐inorganic counterparts, can be obtained in large amounts by this method. All of the synthesized perovskite single crystals exhibit large crystal sizes (centimeter level) and exceptional light emission properties. Meanwhile, the single‐crystal growth can be well controlled and the solvent can be reused for repeated cycles of single‐crystal growth, which sheds light on the preparation of perovskite materials in line with green chemistry. In addition, thermodynamic growth principles for the single‐crystal growth are proposed, providing a universal approach for metal halide single‐crystal synthesis. [ABSTRACT FROM AUTHOR]
- Published
- 2024
8. Evaluation of Ecological Environment Quality Using an Improved Remote Sensing Ecological Index Model.
- Author
- Liu, Yanan, Xiang, Wanlin, Hu, Pingbo, Gao, Peng, and Zhang, Ai
- Subjects
- *REMOTE-sensing images, *PRINCIPAL components analysis, *REMOTE sensing, *ARTIFICIAL satellites, *AIR quality
- Abstract
The Remote Sensing Ecological Index (RSEI) model is widely used for large-scale, rapid Ecological Environment Quality (EEQ) assessment. However, both the RSEI and its improved models explain the EEQ with only two-dimensional (2D) factors, resulting in inaccurate evaluation results. Incorporating more comprehensive, three-dimensional (3D) ecological information poses challenges for maintaining stability in large-scale monitoring when traditional weighting methods such as Principal Component Analysis (PCA) are used. This study introduces an Improved Remote Sensing Ecological Index (IRSEI) model that integrates 2D (normalized difference vegetation factor, normalized difference built-up and soil factor, heat factor, wetness, difference factor for air quality) and 3D (comprehensive vegetation factor) ecological factors for enhanced EEQ monitoring. The model employs a combined subjective–objective weighting approach, utilizing principal components and hierarchical analysis under minimum entropy theory. A comparative analysis of IRSEI and RSEI in Miyun, a representative study area, reveals a strong correlation and consistent monitoring trends. By incorporating air quality and 3D ecological factors, IRSEI provides a more accurate and detailed EEQ assessment, better aligning with ground-truth observations from Google Earth satellite imagery. [ABSTRACT FROM AUTHOR]
- Published
- 2024
9. Eco-friendly and large-scale fabrication of NiMoN/Ni(OH)2/NF as highly efficient and stable alkaline hydrogen evolution reaction electrode.
- Author
- Xue, Mengyao, Bao, Yuankang, Xu, Xun, Liao, Luliang, Li, Ping, Zhang, Hao, Li, Deliang, Wei, Binbin, and Duo, Shuwang
- Subjects
- *TRANSITION metal nitrides, *HYDROGEN evolution reactions, *TRANSITION metal catalysts, *MAGNETRON sputtering, *REACTIVE sputtering
- Abstract
Bimetallic nickel-molybdenum nitride (NiMoN) catalysts have attracted great attention due to their remarkable similarity to group VIII noble metals in terms of electronic structure, good electron conductivity, and durability for the hydrogen evolution reaction (HER). However, existing approaches for fabricating NiMoN catalysts usually entail multiple steps and hazardous ammoniated ingredients. As a result, these methods are difficult to scale up for large-area fabrication towards industrial water splitting. Herein, a hierarchical NiMoN@Ni(OH)2@NF composite electrode with a sheet morphology was synthesized through a rational combination of seawater etching and magnetron sputtering. This method is easily scalable for producing large-area electrodes with minimal pollution. The incorporation of Ni(OH)2@NF nanoarray carriers enhances the exposure of additional active sites on the NiMoN layer, thereby augmenting the electrochemically active specific surface area of the electrode. The as-synthesized NiMoN@Ni(OH)2@NF electrode demonstrates excellent HER activity, with an overpotential of 57 mV at 10 mA cm⁻², and exhibits remarkable long-term catalytic stability for over 100 h at a current density of 100 mA cm⁻². This study presents an eco-friendly and straightforward method for preparing large-scale transition metal nitride catalysts with high catalytic activity. Large-area NiMoN@Ni(OH)2@NF composite electrodes were prepared using a combination of environmentally friendly and cost-effective seawater etching along with reactive magnetron sputtering. • The NiMoN@Ni(OH)2@NF composite electrodes were fabricated utilizing a simple and eco-friendly method. • NiMoN@Ni(OH)2@NF demonstrates excellent electrocatalytic activity and stability for HER. • The technology demonstrates excellent potential for large-scale fabrication. [ABSTRACT FROM AUTHOR]
- Published
- 2024
10. Spherical Design‐Driven Scalable Solar‐Powered Water Treatment with Salt Self‐Cleaning and Light Self‐Adaptivity.
- Author
- Liao, Yiqi, Wang, Chuang, Dong, Yanjuan, Miao, Zhouyu, Yu, Hou‐Yong, Chen, Guozhuo, Yao, Juming, Zhou, Yongfeng, and Liu, Yannan
- Subjects
- *SOLAR energy conversion, *CLEAN energy, *SALINE waters, *WATER purification, *EVAPORATORS
- Abstract
Interfacial solar evaporation, harnessing sunlight to induce water molecule evaporation, holds great promise for sustainable solar energy conversion. However, challenges such as reduced efficiency and instability due to salt accumulation, inadequate water transport, and the high cost of advanced nanostructured solar evaporators collectively hinder the sustainable and large‐scale practical use of this technology. Herein, an eco‐friendly, floatable 3D solar seawater evaporator is developed by innovatively incorporating a lightweight foam ball enclosed in a porous cellulose hydrogel. The 3D evaporator achieves a high water evaporation rate of ≈2.01 kg m⁻² h⁻¹ under 1 Sun, owing to its super high photothermal efficiency of 117.9% and efficient internal water transport channels. Even at a 0° simulated solar angle, the 3D evaporator maintains 85.8% of the evaporation rate at a 90° simulated solar angle. Moreover, the salt self‐cleaning capability is realized by the autonomous rotation caused by salt deposition. Particularly, the 3D evaporator can be fabricated over a large area and maintain seawater evaporation performance and structural integrity for 28 days. This study provides novel economically feasible and sustainable large‐scale solutions for interfacial solar‐powered seawater treatment. [ABSTRACT FROM AUTHOR]
- Published
- 2024
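For context on the efficiency figure quoted above: solar-to-vapor (photothermal) efficiency is conventionally defined as the evaporative energy flux divided by the incident solar flux. A generic sketch of that definition (not taken from the paper) is:

```latex
\eta \;=\; \frac{\dot m \, h_{v}}{q_{\mathrm{in}}},
\qquad
\dot m \approx 2.01~\mathrm{kg\,m^{-2}\,h^{-1}}
\approx 5.6\times 10^{-4}~\mathrm{kg\,m^{-2}\,s^{-1}},
\qquad
q_{\mathrm{in}} \approx 1~\mathrm{kW\,m^{-2}} \;(\text{1 Sun})
```

With a vaporization enthalpy $h_v \approx 2.26~\mathrm{MJ\,kg^{-1}}$, the reported rate corresponds to roughly $1.26~\mathrm{kW\,m^{-2}}$ of evaporative flux, above the 1 Sun input. An efficiency above 100% therefore implies the evaporator also harvests energy from the environment; the exact 117.9% value depends on the enthalpy used and on corrections such as dark evaporation.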
11. Exponential stability of large-scale stochastic reaction-diffusion equations.
- Author
- Wang, Yuan and Ren, Yong
- Subjects
- *STOCHASTIC systems, *REACTION-diffusion equations, *EXPONENTIAL stability
- Abstract
In this paper, we consider a class of large-scale stochastic reaction-diffusion systems. To prove the exponential stability of the system, we introduce the corresponding isolated subsystems. We show that the exponential stability of the isolated systems implies the exponential stability of the large-scale stochastic reaction-diffusion system under some conditions. Furthermore, we discuss a special case where the large-scale stochastic reaction-diffusion system is described in a hierarchical form. In this case, we prove that the original system is exponentially stable if and only if the corresponding subsystems are exponentially stable. [ABSTRACT FROM AUTHOR]
- Published
- 2024
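One common way to formalize the "hierarchical form" mentioned in the abstract is lower-triangular coupling between subsystems; the sketch below is illustrative and the paper's precise assumptions may differ. Subsystem $i$ depends only on components $u_1,\dots,u_i$:

```latex
du_i \;=\; \big[\,D_i \,\Delta u_i \;+\; f_i(u_1,\dots,u_i)\,\big]\,dt
\;+\; g_i(u_1,\dots,u_i)\,dW_i(t),
\qquad i = 1,\dots,N
```

The isolated subsystems are obtained by suppressing the coupling, i.e. replacing $f_i(u_1,\dots,u_i)$ and $g_i(u_1,\dots,u_i)$ with $f_i(0,\dots,0,u_i)$ and $g_i(0,\dots,0,u_i)$. In such a triangular structure, stability established for $u_1$ feeds into $u_2$, and so on up the chain, which is the intuition behind the "if and only if" equivalence stated for the hierarchical case.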
12. Blocked out: reflections on the potential of intensive modes of teaching to enhance post-COVID-19 graduate employability in large-scale educational settings
- Author
- Laura Dixon and Valerie Makin
- Subjects
- Block teaching, Employability, Intensive delivery, Large-scale, Post-COVID-19, Social Sciences
- Abstract
Purpose – This paper explores the potential that block teaching offers to enhance employability in the context of large-scale classes. It suggests that block teaching, with its condensed structure, necessitates curriculum innovation, fosters participatory learning and peer-to-peer networking, and has been shown to increase student focus and enhance engagement and attainment, especially amongst diverse learners. As these are the same challenges that large-scale teaching faces, it is proposed that intensive modes of delivery could be scaled up in a way that may help to mitigate such problems as cohorts in business schools continue to increase in size. Design/methodology/approach – The paper is based on secondary research and provides an overview of literature that looks at block teaching, followed by that which explores the challenges of large-scale teaching contexts. It compares and contrasts the gaps in both to suggest a way that they could be combined. Findings – The paper provides key insights into changes in the contemporary landscape of teaching within UK business schools, which have seen increasingly large cohorts, and draws out the key strengths of intensive modes of delivery, which include helping students to manage their time effectively, encouraging curriculum innovation and the creation of participatory learning opportunities, as well as fostering closer personal relationships between students and staff. Outlining some of the well-documented issues that can arise when teaching larger cohorts, the paper suggests that scaling up blocked delivery may offer a new way to help overcome them. Research limitations/implications – Because of the chosen research approach, the research results may lack generalisability. Therefore, researchers are encouraged to test the proposed propositions in large-scale teaching scenarios.
Practical implications – This paper includes implications for the development of innovative modes of teaching in the context of large cohorts, an experience that is increasingly common amongst British business schools and beyond. Originality/value – This paper brings together two bodies of literature for the first time, that on intensive modes of teaching and that on large-scale teaching contexts, to show how the former may help to overcome some of the key issues arising in the latter.
- Published
- 2024
13. Managing EEG studies: How to prepare and what to do once data collection has begun
- Author
- Boudewyn, Megan A, Erickson, Molly A, Winsler, Kurt, Ragland, John Daniel, Yonelinas, Andrew, Frank, Michael, Silverstein, Steven M, Gold, Jim, MacDonald, Angus W, Carter, Cameron S, Barch, Deanna M, and Luck, Steven J
- Subjects
- Biological Sciences, Biomedical and Clinical Sciences, Psychology, Neurosciences, Clinical Research, Electroencephalography, Humans, Data Collection, Software, Research Design, EEG methods, guidelines, large-scale, multisite, protocol, recommendations, Medical and Health Sciences, Psychology and Cognitive Sciences, Experimental Psychology, Biological sciences, Biomedical and clinical sciences
- Abstract
In this paper, we provide guidance for the organization and implementation of EEG studies. This work was inspired by our experience conducting a large-scale, multi-site study, but many elements could be applied to any EEG project. Section 1 focuses on study activities that take place before data collection begins. Topics covered include: establishing and training study teams, considerations for task design and piloting, setting up equipment and software, development of formal protocol documents, and planning communication strategy with all study team members. Section 2 focuses on what to do once data collection has already begun. Topics covered include: (1) how to effectively monitor and maintain EEG data quality, (2) how to ensure consistent implementation of experimental protocols, and (3) how to develop rigorous preprocessing procedures that are feasible for use in a large-scale study. Links to resources are also provided, including sample protocols, sample equipment and software tracking forms, sample code, and tutorial videos (to access resources, please visit: https://osf.io/wdrj3/).
- Published
- 2023
14. Large-scale photonic inverse design: computational challenges and breakthroughs
- Author
- Kang Chanik, Park Chaejin, Lee Myunghoo, Kang Joonho, Jang Min Seok, and Chung Haejun
- Subjects
- large-scale, inverse design, computational challenges, Physics, QC1-999
- Abstract
Recent advancements in inverse design approaches, exemplified by their large-scale optimization of all geometrical degrees of freedom, have provided a significant paradigm shift in photonic design. However, these innovative strategies still require full-wave Maxwell solutions to compute the gradients concerning the desired figure of merit, imposing prohibitive computational demands on conventional computing platforms. This review analyzes the computational challenges associated with the design of large-scale photonic structures. It delves into the adequacy of various electromagnetic solvers for large-scale designs, from conventional to neural network-based solvers, and discusses their suitability and limitations. Furthermore, this review evaluates the research on optimization techniques, analyzes their advantages and disadvantages in large-scale applications, and sheds light on cutting-edge studies that combine neural networks with inverse design for large-scale applications. Through this comprehensive examination, this review aims to provide insights into navigating the landscape of large-scale design and advocates for strategic advancements in optimization methods, solver selection, and the integration of neural networks to overcome computational barriers, thereby guiding future advancements in large-scale photonic design.
- Published
- 2024
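The gradient cost referred to above typically comes from the adjoint method, the standard workhorse of photonic inverse design: each optimization iteration requires one forward and one adjoint full-wave solve, independent of the number of design parameters. A real-valued, discretized sketch (generic notation, not from the review itself):

```latex
A(p)\,e = b \quad\text{(forward Maxwell solve)},
\qquad
A(p)^{\top}\lambda = \Big(\frac{\partial F}{\partial e}\Big)^{\top} \quad\text{(adjoint solve)},
\qquad
\frac{\partial F}{\partial p_i} = -\,\lambda^{\top}\,\frac{\partial A}{\partial p_i}\,e
```

Here $A(p)$ is the discretized Maxwell operator, $e$ the field, and $F$ the figure of merit; for complex fields a factor of $2\,\mathrm{Re}(\cdot)$ appears. Because each iteration demands two full-wave solves over the entire design domain, the per-solve cost dominates at large scale, which is exactly the bottleneck the surveyed neural-network solvers aim to relieve.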
15. Dimensions of scale: Connected Learning Initiative (CLIx)—a case study of educational technology initiative in India.
- Author
- Balli, Omkar and Singla, Ekta
- Subjects
- *TEACHER development, *DIFFUSION of innovations, *EDUCATIONAL resources, *STUDENT engagement, *EDUCATIONAL quality
- Abstract
Developing countries around the world are scaling up education interventions. New educational technologies offer opportunities to develop new models to deliver quality education at scale. However, the literature suggests that defining scale is complex, especially in heterogeneous contexts. This paper provides a conceptualization of scale as a dynamic process with three key dimensions: 'quantity', 'diffusion,' and 'quality' through the case of a multi-state, multi-stakeholder program called Connected Learning Initiative in India. It also describes the implementation processes that involve teacher professional development, student engagement, technological developments, and efforts to improve classroom processes in a multifactor environment of stakeholder needs, context, and variance in resources. In conclusion, robust and flexible design approaches for scale are discussed, with implications for further research. [ABSTRACT FROM AUTHOR]
- Published
- 2024
16. Beaconet: A Reference‐Free Method for Integrating Multiple Batches of Single‐Cell Transcriptomic Data in Original Molecular Space.
- Author
- Xu, Han, Ye, Yusen, Duan, Ran, Gao, Yong, Hu, Yuxuan, and Gao, Lin
- Subjects
- *TRANSCRIPTOMES, *BIOLOGICAL variation, *DATA integration
- Abstract
Integrating multiple single‐cell datasets is essential for the comprehensive understanding of cell heterogeneity. Batch effect is the undesired systematic variations among technologies or experimental laboratories that distort biological signals and hinder the integration of single‐cell datasets. However, existing methods typically rely on a selected dataset as a reference, leading to inconsistent integration performance using different references, or embed cells into uninterpretable low‐dimensional feature space. To overcome these limitations, a reference‐free method, Beaconet, for integrating multiple single‐cell transcriptomic datasets in original molecular space by aligning the global distribution of each batch using an adversarial correction network is presented. Through extensive comparisons with 13 state‐of‐the‐art methods, it is demonstrated that Beaconet can effectively remove batch effect while preserving biological variations and is superior to existing unsupervised methods using all possible references in overall performance. Furthermore, Beaconet performs integration in the original molecular feature space, enabling the characterization of cell types and downstream differential expression analysis directly using integrated data with gene‐expression features. Additionally, when applying to large‐scale atlas data integration, Beaconet shows notable advantages in both time‐ and space‐efficiencies. In summary, Beaconet serves as an effective and efficient batch effect removal tool that can facilitate the integration of single‐cell datasets in a reference‐free and molecular feature‐preserved mode. [ABSTRACT FROM AUTHOR]
- Published
- 2024
17. Optimization of phytase production by Penicillium oxalicum in solid-state fermentation for potential as a feed additive.
- Author
- Priya, Singh, Bijender, Sharma, Jai Gopal, and Giri, Bhoopander
- Subjects
- *SOLID-state fermentation, *FERMENTATION of feeds, *PHYTASES, *FEED additives, *RESPONSE surfaces (Statistics), *WHEAT bran
- Abstract
The study statistically optimizes phytase production by Penicillium oxalicum PBG30 in solid-state fermentation using wheat bran as substrate. The variables pH, incubation days, MgSO4, and Tween-80 were the significant parameters identified through the Plackett-Burman design (PBD) that most influenced phytase production. Further, the central composite design (CCD) method of response surface methodology (RSM) defined the optimum values for these factors, i.e., pH 7.0, 5 days of incubation, 0.75% MgSO4, and 3.5% Tween-80, which led to a maximum phytase production of 475.42 U/g DMR. Phytase production was also sustained in flasks and trays of different sizes, with phytase levels ranging from 394.95 to 475.42 U/g DMR. Phytase production was enhanced 5.6-fold compared with unoptimized conditions. The in-vitro dephytinization of feed showed an amelioration in nutritive value by releasing inorganic phosphate and other nutrients in a time-dependent manner. The highest amounts of inorganic phosphate (33.986 mg/g feed), reducing sugar (134.4 mg/g feed), and soluble protein (115.52 mg/g feed) were achieved at 37 °C with 200 U of phytase in 0.5 g feed for 48 h. This study reports the economical and large-scale production of phytase with applicability in enhancing feed nutrition. [ABSTRACT FROM AUTHOR]
- Published
- 2024
18. Dynamics of extreme wind events in the marine and terrestrial sectors of coastal Antarctica.
- Author
- Caton Harrison, Thomas, King, John C., Bracegirdle, Thomas J., and Lu, Hua
- Subjects
- *KATABATIC winds, *WIND speed, *OCEAN circulation, *ADVECTION, *COMPUTER simulation
- Abstract
Antarctic coastal surface winds affect ice‐sheet stability, sea ice, and local ecosystems. The strongest coastal winds are especially important due to the nonlinear relationship between wind speed and wind stress. We investigate the dynamics of extreme coastal winds using a simplified momentum budget calculated across the period 2010–2020 from the ERA5 reanalysis. The pressure‐gradient forcing term in the budget is decomposed into a large‐scale component and one associated with the temperature deficit layer. The role of budget terms across the coastal sector is compared for weak and strong winds. We then calculate composites of the top 100 easterly wind events across six east Antarctic coastal sectors, identifying terms responsible for the evolution of coastal extremes. A simple balance of terms exists offshore, dominated by large‐scale forcing, contrasting with the complex balance in the onshore sector where katabatic forcing is large. Large‐scale forcing explains 57% of offshore coastal wind‐speed variance overall, improving to 81% when budget terms associated with the temperature deficit layer and horizontal advection are included, with significant regional variation. The residual term plays an increasingly active role as wind speed increases. Extremes in all coastal sectors are associated with a synoptic‐scale transient dipole of pressure anomalies driving warm‐air advection. Although katabatic forcing is a very large term in magnitude, it is found to play a passive role, declining as wind speeds increase during extreme conditions. In some regions, an anomalous southerly component develops during extremes, which we attribute to an ageostrophic barrier wind. This research underscores the major role for large‐scale forcing in Antarctica's coastal winds, but also reveals a significant regional locally driven component. 
The results have implications for improving numerical model simulations of coastal easterlies and for studying their impacts on ocean circulation, sea ice, and ice‐shelf basal melt. [ABSTRACT FROM AUTHOR]
- Published
- 2024
19. The primate cortical LFP exhibits multiple spectral and temporal gradients and widespread task dependence during visual short-term memory.
- Author
- Hoffman, Steven J., Dotson, Nicholas M., Lima, Vinicius, and Gray, Charles M.
- Subjects
- *PYRAMIDAL neurons, *CEREBRAL cortex, *SHORT-term memory, *POWER spectra, *BRAIN mapping, *VISUAL memory
- Abstract
Although cognitive functions are hypothesized to be mediated by synchronous neuronal interactions in multiple frequency bands among widely distributed cortical areas, we still lack a basic understanding of the distribution and task dependence of oscillatory activity across the cortical map. Here, we ask how the spectral and temporal properties of the local field potential (LFP) vary across the primate cerebral cortex, and how they are modulated during visual short-term memory. We measured the LFP from 55 cortical areas in two macaque monkeys while they performed a visual delayed match to sample task. Analysis of peak frequencies in the LFP power spectra reveals multiple discrete frequency bands between 3 and 80 Hz that differ between the two monkeys. The LFP power in each band, as well as the sample entropy, a measure of signal complexity, display distinct spatial gradients across the cortex, some of which correlate with reported spine counts in cortical pyramidal neurons. Cortical areas can be robustly decoded using a small number of spectral and temporal parameters, and significant task-dependent increases and decreases in spectral power occur in all cortical areas. These findings reveal pronounced, widespread, and spatially organized gradients in the spectral and temporal activity of cortical areas. Task-dependent changes in cortical activity are globally distributed, even for a simple cognitive task. NEW & NOTEWORTHY: We recorded extracellular electrophysiological signals from roughly the breadth and depth of a cortical hemisphere in nonhuman primates (NHPs) performing a visual memory task. Analyses of the band-limited local field potential (LFP) power displayed widespread, frequency-dependent cortical gradients in spectral power. Using a machine learning classifier, these features allowed robust cortical area decoding. Task dependence of LFP power was also found to be widespread, indicating large-scale gradients of both LFP activity and task-related activity. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
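The sample entropy measure cited in this abstract as a signal-complexity index can be sketched in a few lines. This is a generic SampEn(m, r) implementation (template length m, tolerance r as a fraction of the signal's standard deviation), not the authors' analysis code:

```python
import numpy as np

def sample_entropy(x, m=2, r_factor=0.2):
    """Sample entropy SampEn(m, r) of a 1-D signal.

    Counts pairs of template vectors of length m and m+1 that match
    within tolerance r (Chebyshev distance), excluding self-matches.
    Lower values indicate a more regular, predictable signal.
    """
    x = np.asarray(x, dtype=float)
    r = r_factor * x.std()
    n = len(x)

    def count_matches(length):
        # All overlapping templates of the given length.
        templates = np.array([x[i:i + length] for i in range(n - length)])
        count = 0
        for i in range(len(templates)):
            # Chebyshev distance to every later template (no self-matches).
            d = np.max(np.abs(templates[i + 1:] - templates[i]), axis=1)
            count += np.sum(d <= r)
        return count

    b = count_matches(m)      # matches at template length m
    a = count_matches(m + 1)  # matches at template length m + 1
    return -np.log(a / b)
```

A regular signal (e.g., a sinusoid) yields a lower sample entropy than white noise of the same length, which is what makes it useful as a per-area complexity gradient.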
20. A Hybrid Parallel Processing Strategy for Large-Scale DEA Computation.
- Author
-
Chang, Shengqing, Ding, Jingjing, Feng, Chenpeng, and Wang, Ruifeng
- Subjects
PARALLEL processing ,DATA envelopment analysis ,TIME complexity ,MESSAGE passing (Computer science) ,PARALLEL algorithms - Abstract
Using data envelopment analysis (DEA) with large-scale data poses a major challenge to applications due to its computing-intensive nature. So far, various strategies have been proposed in academia to accelerate DEA computation, including DEA algorithms such as hierarchical decomposition (HD), DEA enhancements such as restricted basis entry (RBE), and LP accelerators such as hot starts. However, few studies have integrated these strategies and combined them with a parallel processing framework to solve large-scale DEA problems. In this paper, a hybrid parallel DEA algorithm (named the PRHH algorithm) is proposed, combining the RBE algorithm, hot starts, and the HD algorithm based on the Message Passing Interface (MPI). Furthermore, the attributes of the PRHH algorithm are analyzed and formalized as a computing-time function, to shed light on its time complexity. Finally, the performance of the algorithm is investigated in various simulation scenarios with datasets of different characteristics and compared with existing methods. The results show that the proposed algorithm reduces computing time in general, and boosts performance dramatically in scenarios with low density in particular. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
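The core of any DEA computation, with or without the acceleration strategies above, is one linear program per decision-making unit (DMU). A minimal input-oriented CCR envelopment model via `scipy.optimize.linprog` is sketched below; this is the plain LP formulation, not the PRHH algorithm itself:

```python
import numpy as np
from scipy.optimize import linprog

def ccr_efficiency(X, Y, k):
    """Input-oriented CCR efficiency of DMU k.

    X: (m inputs x n DMUs), Y: (s outputs x n DMUs).
    Solves: min theta  s.t.  X @ lam <= theta * X[:, k],
                             Y @ lam >= Y[:, k],  lam >= 0.
    """
    m, n = X.shape
    s = Y.shape[0]
    # Decision vector z = [theta, lam_1..lam_n]; minimize theta.
    c = np.r_[1.0, np.zeros(n)]
    # Input rows:  X @ lam - theta * x_k <= 0
    A_in = np.hstack([-X[:, [k]], X])
    # Output rows: -Y @ lam <= -y_k
    A_out = np.hstack([np.zeros((s, 1)), -Y])
    A_ub = np.vstack([A_in, A_out])
    b_ub = np.r_[np.zeros(m), -Y[:, k]]
    bounds = [(None, None)] + [(0, None)] * n
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=bounds, method="highs")
    return res.x[0]
```

The accelerations in the paper operate around this LP: RBE restricts the lambda columns to previously identified efficient DMUs, hot starts reuse the previous basis, and HD/MPI partitions the DMU set across processes.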
21. Scalable computer interactive education system based on large-scale multimedia data analysis.
- Author
-
Zhao, Jie, Liu, Taotang, and Li, Shuping
- Subjects
INTERACTIVE computer systems ,COMPUTER performance ,DATA analysis ,COMPUTER engineering ,ONLINE education - Abstract
Massive teaching resources cause serious efficiency problems for online teaching, and traditional online teaching models can even be inferior to traditional classroom teaching in terms of teaching effect. Based on this, this paper analyzes massive educational resources and builds a scalable computer interactive education system based on large-scale multimedia data analysis. Moreover, this paper defines the system roles according to the actual teaching situation and constructs the functional modules of the system architecture. In addition, this paper uses computer simulation to analyze and improve the interaction technology, making it the core technology of the computer interactive education system, and obtains an extensible interactive education system based on the characteristics of online teaching; this also helps to monitor and assess the performance of the interactive educational system. Furthermore, this paper designs an experiment to evaluate the performance of the computer interactive education system, mainly from two aspects: interaction evaluation and teaching evaluation. The experimental results show that this system can effectively improve the quality of teaching. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
22. High-throughput Oxford Nanopore sequencing-based approach for the multilocus sequence typing analysis of large-scale avian Escherichia coli study in Mississippi
- Author
-
Linan Jia, Mark A. Arick, Chuan-Yu Hsu, Daniel G. Peterson, Jeffrey D. Evans, Kelsy Robinson, Anuraj T. Sukumaran, Reshma Ramachandran, Pratima Adhikari, and Li Zhang
- Subjects
avian Escherichia coli ,Oxford Nanopore ,large-scale ,high-throughput ,field study ,Animal culture ,SF1-1100 - Abstract
ABSTRACT: Avian pathogenic Escherichia coli (APEC) cause avian colibacillosis, and accurately distinguishing infectious isolates is critical for controlling its transmission. Multilocus sequence typing (MLST) is an accurate and efficient strain identification method for epidemiological surveillance. This research aimed to develop a fast and high-throughput workflow that simultaneously sequences the Achtman typing scheme's 7 housekeeping genes of multiple E. coli isolates using the Oxford Nanopore Technologies (ONT) platform for large-scale APEC studies. E. coli strains were isolated from poultry farms, the housekeeping genes were amplified, and amplicons were sequenced on an R9.4 MinION flow cell using the Nanopore GridION sequencer (ONT, Oxford, UK) following the initial workflow (ONT-MLST). Moreover, the workflow was revised by introducing large-scale DNA extraction and multiplex PCR into the ONT-MLST workflow and applied to 242 new isolates, 18 isolates from the previous workflow, and 5 ATCC reference strains using a Flongle flow cell on the Nanopore MinION Mk1C sequencer (ONT, Oxford, UK). Finally, the sequence type (ST) results of the 308 isolates collected from infected chickens and poultry farm environments were reported and analyzed. Data indicated that E. coli belonging to ST159, ST8578, and ST355 have the potential to infect multiple organs in broilers. In addition, zoonotic STs, ST69, ST10, ST38, and ST131, were detected from poultry farms. With the advantages of the high throughput of ONT, this study provides a rapid workflow for large-scale E. coli typing and identifies frequently isolated sequence types related to APEC infection in poultry.
- Published
- 2024
- Full Text
- View/download PDF
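Once the seven Achtman housekeeping genes have been sequenced and allele numbers called, assigning a sequence type is a profile-table lookup. A minimal sketch follows; the two profile rows shown are quoted from memory of the public Achtman scheme and should be verified against EnteroBase before use:

```python
# Gene order of the Achtman 7-gene E. coli MLST scheme.
ACHTMAN_GENES = ["adk", "fumC", "gyrB", "icd", "mdh", "purA", "recA"]

# Illustrative rows only; the full profile table is distributed by EnteroBase.
PROFILES = {
    (10, 11, 4, 8, 8, 8, 2): 10,      # ST10 (zoonotic ST found in the study)
    (53, 40, 47, 13, 36, 28, 29): 131,  # ST131
}

def sequence_type(allele_calls):
    """Look up the ST for a dict of {gene: allele number}.

    Returns None for a novel allele combination, which in practice
    would be submitted to the scheme curators for a new ST number.
    """
    key = tuple(allele_calls[g] for g in ACHTMAN_GENES)
    return PROFILES.get(key)
```

The high-throughput part of the workflow is upstream of this step: multiplexed amplicon sequencing and per-isolate allele calling from the ONT reads.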
23. Design and Implementation of Industrial Internet Emulation Platform Based on Virtualization Technology
- Author
-
Wang, Shuai, Li, Wei, Bao, Wan, Zhao, Yang, Lu, Mingxin, Angrisani, Leopoldo, Series Editor, Arteaga, Marco, Series Editor, Chakraborty, Samarjit, Series Editor, Chen, Shanben, Series Editor, Chen, Tan Kay, Series Editor, Dillmann, Rüdiger, Series Editor, Duan, Haibin, Series Editor, Ferrari, Gianluigi, Series Editor, Ferre, Manuel, Series Editor, Hirche, Sandra, Series Editor, Jabbari, Faryar, Series Editor, Jia, Limin, Series Editor, Kacprzyk, Janusz, Series Editor, Khamis, Alaa, Series Editor, Kroeger, Torsten, Series Editor, Li, Yong, Series Editor, Liang, Qilian, Series Editor, Martín, Ferran, Series Editor, Ming, Tan Cher, Series Editor, Minker, Wolfgang, Series Editor, Misra, Pradeep, Series Editor, Mukhopadhyay, Subhas, Series Editor, Ning, Cun-Zheng, Series Editor, Nishida, Toyoaki, Series Editor, Oneto, Luca, Series Editor, Panigrahi, Bijaya Ketan, Series Editor, Pascucci, Federica, Series Editor, Qin, Yong, Series Editor, Seng, Gan Woon, Series Editor, Speidel, Joachim, Series Editor, Veiga, Germano, Series Editor, Wu, Haitao, Series Editor, Zamboni, Walter, Series Editor, Tan, Kay Chen, Series Editor, and Ma, Maode, editor
- Published
- 2024
- Full Text
- View/download PDF
24. Large Scale Gaussian Processes with Matheron’s Update Rule and Karhunen-Loève Expansion
- Author
-
Maatouk, Hassan, Rullière, Didier, Bay, Xavier, Hinrichs, Aicke, editor, Kritzer, Peter, editor, and Pillichshammer, Friedrich, editor
- Published
- 2024
- Full Text
- View/download PDF
25. Photocatalytic and Photoelectrochemical Technologies for Hydrogen Production: Commercialization Aspect
- Author
-
Priya, Rajendran Lakshmi, Hariprasad, Boopathi Shagunthala, Dhayanithi, Chettipalayam Arunasalam, Paunkumar, Ponnusamy, Bhuvaneswari, Chellapandi, Babu, Sundaram Ganesh, and Sathishkumar, Panneerselvam, editor
- Published
- 2024
- Full Text
- View/download PDF
26. Nexus Between Psychological Safety and Non-Technical Debt in Large-Scale Agile Enterprise Resource Planning Systems Development
- Author
-
Ahmad, Muhammad Ovais, Gustavsson, Tomas, van der Aalst, Wil, Series Editor, Ram, Sudha, Series Editor, Rosemann, Michael, Series Editor, Szyperski, Clemens, Series Editor, Guizzardi, Giancarlo, Series Editor, Jarzębowicz, Aleksander, editor, Luković, Ivan, editor, Przybyłek, Adam, editor, Staroń, Mirosław, editor, Ahmad, Muhammad Ovais, editor, and Ochodek, Mirosław, editor
- Published
- 2024
- Full Text
- View/download PDF
27. Large-Scale Swarm Control in Cluttered Environments
- Author
-
Elsayed, Saber, Mabrok, Mohamed, Goos, Gerhard, Founding Editor, Hartmanis, Juris, Founding Editor, Bertino, Elisa, Editorial Board Member, Gao, Wen, Editorial Board Member, Steffen, Bernhard, Editorial Board Member, Yung, Moti, Editorial Board Member, Ali, Abdulaziz Al, editor, Cabibihan, John-John, editor, Meskin, Nader, editor, Rossi, Silvia, editor, Jiang, Wanyue, editor, He, Hongsheng, editor, and Ge, Shuzhi Sam, editor
- Published
- 2024
- Full Text
- View/download PDF
28. The Testing Hopscotch Model – Six Complementary Profiles Replacing the Perfect All-Round Tester
- Author
-
Mårtensson, Torvald, Sandahl, Kristian, Goos, Gerhard, Founding Editor, Hartmanis, Juris, Founding Editor, Bertino, Elisa, Editorial Board Member, Gao, Wen, Editorial Board Member, Steffen, Bernhard, Editorial Board Member, Yung, Moti, Editorial Board Member, Kadgien, Regine, editor, Jedlitschka, Andreas, editor, Janes, Andrea, editor, Lenarduzzi, Valentina, editor, and Li, Xiaozhou, editor
- Published
- 2024
- Full Text
- View/download PDF
29. Analysis of the Causes of Large-Scale Power Outage Accidents Abroad and Suggestions on Measures to Improve the Safety of China’s Power Grid
- Author
-
Bian, Haifeng, Zhang, Jun, Xie, Guanglong, Zhang, Chen, Appolloni, Andrea, Series Editor, Caracciolo, Francesco, Series Editor, Ding, Zhuoqi, Series Editor, Gogas, Periklis, Series Editor, Huang, Gordon, Series Editor, Nartea, Gilbert, Series Editor, Ngo, Thanh, Series Editor, Striełkowski, Wadim, Series Editor, Zailani, Suhaiza Hanim Binti Dato Mohamad, editor, Yagapparaj, Kosga, editor, and Zakuan, Norhayati, editor
- Published
- 2024
- Full Text
- View/download PDF
30. Statistical methods for survival analysis in large-scale electronic health records research
- Author
-
Schmidt, James C. F.
- Subjects
Statistical Methods ,Survival analysis ,Large-scale ,Electronic Health Records research ,thesis ,Health sciences - Abstract
The relative survival framework is a popular method for the estimation of a subject's survival, corrected for the effect of non-disease-related causes of death. A comparison is made between the observed all-cause survival and the expected survival, derived from published population mortality rates known as life tables, often stratified by age, sex, and calendar year. Under certain assumptions, relative survival provides an estimate of net survival: survival in a hypothetical world where subjects can only die due to their disease. In order to interpret relative survival as net survival, other-cause mortality rates for subjects with the disease of interest must be the same as the expected mortality rates. When interest lies in the relative survival of diseases with multiple shared risk factors, for example lung cancer, the use of standard life tables is unsuitable, requiring additional stratification by these risk factors. The primary aim of this research is to use a control population taken from large-scale linked electronic health records to adjust published life tables by comorbidity, and to investigate the impact of these and standard life tables on relative survival estimates. To achieve these research aims, bespoke software is developed to aid the management of large-scale health data, and investigations into mortality rates in the control population are undertaken, showing biased results when follow-up requirements form part of patient selection. Comorbidity-adjusted life tables are estimated using time-constant and time-updated exposures, and applied in a relative survival analysis in colorectal cancer, comparing groups defined by cardiovascular comorbidity status.
This research extends concepts and methods previously developed to form novel approaches to the adjustment of background mortality data, taking into account the induced bias in the control population, and showing the importance of the use of correctly stratified life tables, with key implications for future studies investigating differential mortality rates.
- Published
- 2023
- Full Text
- View/download PDF
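The relative survival framework described above reduces to a ratio: observed all-cause survival divided by the expected survival implied by life-table mortality rates. A minimal sketch with annual hazards (illustrative function names, not the thesis software):

```python
import math

def expected_survival(hazards):
    """Cumulative expected survival from annual background mortality
    rates (hazards) taken from a population life table."""
    s, out = 1.0, []
    for h in hazards:
        s *= math.exp(-h)  # survival through one more year
        out.append(s)
    return out

def relative_survival(observed, hazards):
    """Observed all-cause survival divided by expected survival,
    evaluated at the same yearly time points."""
    return [o / e for o, e in zip(observed, expected_survival(hazards))]
```

Interpreting this ratio as net survival requires that other-cause mortality in the diseased cohort matches the life-table rates, which is exactly why the thesis stratifies life tables by comorbidity.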
31. Principles and key technologies for the large-scale ecological utilization of coal gangue
- Author
-
Zhenqi HU, Yanling ZHAO, and Zhen MAO
- Subjects
coal gangue ,large-scale ,ecological utilization ,pollution control ,Geology ,QE1-996.5 ,Mining engineering. Metallurgy ,TN1-997 - Abstract
Coal is China's main energy source and the ballast stone of its energy security. As an inevitable by-product of coal mining and coal washing, coal gangue has an annual output of more than 700 million tons and is in urgent need of large-scale, ecological utilization to stop it becoming a stumbling block to enterprise development. Based on an analysis of the mechanisms of ecological damage in large-scale utilization of coal gangue, the principle of large-scale ecological utilization of coal gangue was put forward, and concrete, environmentally safe solutions for large-scale ecological utilization are discussed from a technical perspective. Two key technologies of large-scale ecological utilization were put forward, namely in-situ contamination control and ecological restoration of acid coal gangue mountains, and ground filling of coal gangue. The results demonstrated that: ① The key to large-scale ecological utilization of coal gangue is the prevention and control of environmental contamination. Large-scale ecological utilization of coal gangue can be realized through the evaluation of the availability and economy of gangue combined with environmental risk management. ② The ecological utilization of accumulated gangue mountains was realized through vegetation restoration based on in-situ contamination control, and an ecological utilization technology was developed that integrates pollution-source diagnosis, fire prevention, pollution barriers, and vegetation restoration. 
Based on a mechanistic analysis of the pollution caused by the acid and heat production of gangue oxidation, thermal infrared imaging coupled with surveying and mapping technology was used to locate deep burning points (oxidation points) in gangue mountains; an oxidation inhibitor coupled with fungicide and reducing bacteria was invented, covered with an inert material and compacted to exclude oxygen and prevent pollution; a fire-fighting technology combining shotcrete fire control and grouting is adopted in spontaneous combustion areas; and a fire-resistant vegetation restoration technology based on native shrubs and grasses was put forward, realizing in-situ contamination control and ecological restoration of acid coal gangue mountains. ③ The ecological utilization of coal gangue for ground filling can be realized through feasibility analysis of ground-filling utilization, the ecological utilization technology itself, and long-term monitoring with maintenance management. The keys to ecological utilization are contamination risk analysis for gangue material screening, necessity and feasibility analysis of ground-filling site selection, and environmental protection measures throughout the filling process, including an anti-seepage barrier at the bottom of the site before filling, layered filling and soil-profile reconstruction technologies for fire prevention and acid control during filling, and erosion control and vegetation restoration after filling. Large-scale ecological utilization of coal gangue not only solves the ecological and environmental problems caused by solid waste storage in mining areas, but also creates a new mode of ecological restoration in mining areas through the scientific, safe, and reasonable utilization of new and legacy gangue.
- Published
- 2024
- Full Text
- View/download PDF
32. High-Efficiency Dynamic Scanning Strategy for Powder Bed Fusion by Controlling Temperature Field of the Heat-Affected Zone
- Author
-
Xiaokang Huang, Xiaoyong Tian, Qi Zhong, Shunwen He, Cunbao Huo, Yi Cao, Zhiqiang Tong, and Dichen Li
- Subjects
Powder bed fusion ,Efficiency ,Large-scale ,Spot size ,Heat-affected zone (HAZ) ,Ocean engineering ,TC1501-1800 ,Mechanical engineering and machinery ,TJ1-1570 - Abstract
Abstract Improvement of fabrication efficiency and part performance was the main challenge for the large-scale powder bed fusion (PBF) process. In this study, a dynamic monitoring and feedback system of powder bed temperature field using an infrared thermal imager has been established and integrated into a four-laser PBF equipment with a working area of 2000 mm × 2000 mm. The heat-affected zone (HAZ) temperature field has been controlled by adjusting the scanning speed dynamically. Simultaneously, the relationship among spot size, HAZ temperature, and part performance has been established. The fluctuation of the HAZ temperature in four-laser scanning areas was decreased from 30.85 ℃ to 17.41 ℃. Thus, the consistency of the sintering performance of the produced large component has been improved. Based on the controllable temperature field, a dynamically adjusting strategy for laser spot size was proposed, by which the fabrication efficiency was improved up to 65.38%. The current research results were of great significance to the further industrial applications of large-scale PBF equipment.
- Published
- 2024
- Full Text
- View/download PDF
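The dynamic speed adjustment described here is a feedback loop: the infrared imager measures the heat-affected zone (HAZ) temperature, and the scan speed is nudged toward the setpoint. A toy proportional update illustrates the idea; gain, units, and limits are assumptions for illustration, not the equipment's control law:

```python
def next_speed(v, t_meas, t_target, gain=0.5, v_min=100.0, v_max=5000.0):
    """One proportional control step for the laser scan speed.

    Hotter-than-target HAZ -> scan faster (less energy per unit length);
    cooler -> scan slower. v in mm/s, temperatures in deg C.
    The result is clamped to the machine's admissible speed range.
    """
    v_new = v * (1.0 + gain * (t_meas - t_target) / t_target)
    return min(max(v_new, v_min), v_max)
```

Run each time a new thermal frame arrives; reducing the HAZ temperature fluctuation (30.85 °C to 17.41 °C in the study) is what keeps sintering performance consistent across the four-laser field.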
33. CDSKNNXMBD: a novel clustering framework for large-scale single-cell data based on a stable graph structure
- Author
-
Jun Ren, Xuejing Lyu, Jintao Guo, Xiaodong Shi, Ying Zhou, and Qiyuan Li
- Subjects
scRNA-seq ,Clustering ,Large-scale ,Imbalance ratio ,Medicine - Abstract
Abstract Background Accurate and efficient cell grouping is essential for analyzing single-cell transcriptome sequencing (scRNA-seq) data. However, the existing clustering techniques often struggle to provide timely and accurate cell type groupings when dealing with datasets with large-scale or imbalanced cell types. Therefore, there is a need for improved methods that can handle the increasing size of scRNA-seq datasets while maintaining high accuracy and efficiency. Methods We propose CDSKNNXMBD (Community Detection based on a Stable K-Nearest Neighbor Graph Structure), a novel single-cell clustering framework integrating a partition clustering algorithm and a community detection algorithm, which achieves accurate and fast cell type grouping by finding a stable graph structure. Results We evaluated the effectiveness of our approach by analyzing 15 tissues from the human fetal atlas. Compared to existing methods, CDSKNN effectively counteracts the high imbalance in single-cell data, enabling effective clustering. Furthermore, we conducted comparisons across multiple single-cell datasets from different studies and sequencing techniques. CDSKNN is highly applicable and robust, and capable of handling the complexities of diverse types of data. Most importantly, CDSKNN exhibits higher operational efficiency on datasets at the million-cell scale, requiring an average of only 6.33 min for clustering 1.46 million single cells, saving 33.3% to 99% of running time compared to existing methods. Conclusions CDSKNN is a flexible, resilient, and promising clustering tool that is particularly suitable for clustering imbalanced data and demonstrates high efficiency on large-scale scRNA-seq datasets.
- Published
- 2024
- Full Text
- View/download PDF
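The two-stage idea, build a k-nearest-neighbor graph and then run community detection on it, can be sketched with a toy label-propagation step standing in for the paper's stable-graph community detection. This is illustrative only; the actual CDSKNN framework also includes a partition-clustering stage and stability selection:

```python
import numpy as np

def knn_graph(X, k):
    """Adjacency lists (sets) of the symmetrized k-nearest-neighbor graph."""
    d = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=2)
    np.fill_diagonal(d, np.inf)          # exclude self-edges
    nn = np.argsort(d, axis=1)[:, :k]
    neighbors = [set(row) for row in nn]
    for i, row in enumerate(nn):         # keep edge if either endpoint lists it
        for j in row:
            neighbors[j].add(i)
    return neighbors

def label_propagation(neighbors, n_iter=50, seed=0):
    """Each node repeatedly adopts the majority label among its
    neighbors until labels stabilize; surviving labels = communities."""
    rng = np.random.default_rng(seed)
    n = len(neighbors)
    labels = np.arange(n)
    order = np.arange(n)
    for _ in range(n_iter):
        rng.shuffle(order)
        changed = False
        for i in order:
            if not neighbors[i]:
                continue
            votes = np.bincount([labels[j] for j in neighbors[i]])
            best = votes.argmax()
            if labels[i] != best:
                labels[i] = best
                changed = True
        if not changed:
            break
    return labels
```

On real scRNA-seq data the graph would be built on PCA embeddings, and the choice of k is where the "stable graph structure" search comes in.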
34. A PDMS coating with excellent durability for large-scale deicing
- Author
-
Tao Zhu, Yuan Yuan, Linbo Song, Xingde Wei, Huiying Xiang, Xu Dai, Xujiang Hua, and Ruijin Liao
- Subjects
PDMS ,Coating ,Large-scale ,Deicing ,Durability ,Mining engineering. Metallurgy ,TN1-997 - Abstract
The icing of wind turbine blades leads to a decrease in output power, seriously jeopardizing the economic benefits and operational reliability of wind farms. Conventional deicing techniques require expensive equipment and consume a large amount of energy. Low-interfacial toughness coatings without energy dissipation are believed to be a highly potential passive deicing technology. However, the durability in service is facing challenges. Herein, low-interfacial toughness PDMS coatings were prepared by physical blending. Through the optimization of added plasticizers and SiO2, PDMS coatings with excellent large-scale deicing performance were obtained. The constant deicing force and ice adhesion strength were reduced to 14.69 N/cm and 12.63 kPa, respectively. Moreover, a systematic durability assessment of the PDMS coating was carried out to address the actual operating conditions of wind turbines. Fortunately, the results showed that the PDMS coating could withstand 200 icing/deicing cycles while maintaining constant deicing force and ice adhesion strength of less than 50 N/cm and 30 kPa, respectively. After long-term thermal aging (21 days), UV irradiation (42 days) and salt spray corrosion (20 days), the PDMS coating still retained superior icephobicity and outstanding large-scale deicing performance. This work contributes to the research and development of low-interfacial toughness materials for large-scale deicing applications on wind turbine blades.
- Published
- 2024
- Full Text
- View/download PDF
35. Insight into best practices: a review of long-term monitoring of the rocky intertidal zone of the Northeast Pacific Coast
- Author
-
Kaplanis, Nikolas J
- Subjects
Biological Sciences ,Ecology ,long-term monitoring ,rocky intertidal zone ,sampling design ,Northeast Pacific Coast ,ecology ,large-scale ,Oceanography ,Geology - Abstract
On the shores of the Northeast Pacific Coast, research programs have monitored the rocky intertidal zone for multiple decades across thousands of kilometers, ranking among the longest-term and largest-scale ecological monitoring programs in the world. These programs have produced powerful datasets using simple field methods, and many are now capitalizing on modern field-sampling technology and computing power to collect and analyze biological information at increasing scale and resolution. Considering its depth, breadth, and cutting-edge nature, this research field provides an excellent case study for examining the design and implementation of long-term, large-scale ecological monitoring. I curated literature and interviewed 25 practitioners to describe, in detail, the methods employed in 37 community-level surveys by 18 long-term monitoring programs on the Northeast Pacific Coast, from Baja California, México, to Alaska, United States of America. I then characterized trade-offs between survey design components, identified key strengths and limitations, and provided recommendations for best practices. In doing so, I identified data gaps and research priorities for sustaining and improving this important work. This analysis is timely, especially considering the threat that climate change and other anthropogenic stressors present to the persistence of rocky intertidal communities. More generally, this review provides insight that can benefit long-term monitoring within other ecosystems.
- Published
- 2023
36. Large‐scale assessment of genetic structure to assess risk of populations of a large herbivore to disease.
- Author
-
Walter, W. David, Fameli, Alberto, Russo‐Petrick, Kelly, Edson, Jessie E., Rosenberry, Christopher S., Schuler, Krysten L., and Tonkovich, Michael J.
- Subjects
- *
CHRONIC wasting disease , *WHITE-tailed deer , *PRINCIPAL components analysis , *HERBIVORES , *PHYSIOGRAPHIC provinces , *GENETIC variation - Abstract
Chronic wasting disease (CWD) can spread among cervids by direct and indirect transmission, the former being more likely in emerging areas. Identifying subpopulations allows the delineation of focal areas to target for intervention. We aimed to assess the population structure of white‐tailed deer (Odocoileus virginianus) in the northeastern United States at a regional scale to inform managers regarding gene flow throughout the region. We genotyped 10 microsatellites in 5701 wild deer samples from Maryland, New York, Ohio, Pennsylvania, and Virginia. We evaluated the distribution of genetic variability through spatial principal component analysis and inferred genetic structure using non‐spatial and spatial Bayesian clustering algorithms (BCAs). We simulated populations representing each inferred wild cluster, wild deer in each state and each physiographic province, total wild population, and a captive population. We conducted genetic assignment tests using these potential sources, calculating the probability of samples being correctly assigned to their origin. Non‐spatial BCA identified two clusters across the region, while spatial BCA suggested a maximum of nine clusters. Assignment tests correctly placed deer into captive or wild origin in most cases (94%), as previously reported, but performance varied when assigning wild deer to more specific origins. Assignments to clusters inferred via non‐spatial BCA performed well, but efficiency was greatly reduced when assigning samples to clusters inferred via spatial BCA. Differences between spatial BCA clusters are not strong enough to make assignment tests a reliable method for inferring the geographic origin of deer using 10 microsatellites. However, the genetic distinction between clusters may indicate natural and anthropogenic barriers of interest for management. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
37. Towards large-scale programmable silicon photonic chip for signal processing.
- Author
-
Xie, Yiwei, Wu, Jiachen, Hong, Shihan, Wang, Cong, Liu, Shujun, Li, Huan, Ju, Xinyan, Ke, Xiyuan, Liu, Dajian, and Dai, Daoxin
- Subjects
SIGNAL processing ,OPTICAL computing ,MICROWAVE photonics ,OPTICAL dispersion ,OPTICAL control ,OPTICAL switching ,OPTICAL communications ,MICROWAVE filters - Abstract
Optical signal processing has been playing a crucial part as a powerful engine for various information systems in practical applications. In particular, achieving large-scale programmable chips for signal processing is highly desirable for high flexibility, low cost and powerful processing. Silicon photonics, which has been developed successfully in the past decade, provides a promising option due to its unique advantages. Here, recent progress on large-scale programmable silicon photonic chips for signal processing in microwave photonics, optical communications, optical computing, quantum photonics as well as dispersion controlling is reviewed. Particularly, we give a discussion about the realization of high-performance building blocks, including ultra-low-loss silicon photonic waveguides, 2 × 2 Mach–Zehnder switches and microring resonator switches. The methods for configuring large-scale programmable silicon photonic chips are also discussed. Representative examples are summarized for the applications of beam steering, optical switching, optical computing, quantum photonic processing as well as optical dispersion controlling. Finally, we give an outlook for the challenges of further developing large-scale programmable silicon photonic chips. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
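The 2 × 2 Mach–Zehnder switch mentioned as a building block routes light between "bar" and "cross" ports via one tunable phase shift. Its textbook transfer matrix can be checked numerically; this is generic interferometer math, not code from the review:

```python
import numpy as np

# Ideal 50:50 directional coupler transfer matrix.
C = (1 / np.sqrt(2)) * np.array([[1, 1j], [1j, 1]])

def mzi(phi):
    """Transfer matrix of a 2x2 Mach-Zehnder switch:
    coupler -> phase shift phi on one arm -> coupler."""
    P = np.diag([np.exp(1j * phi), 1.0])
    return C @ P @ C

def cross_fraction(phi):
    """Optical power routed to the cross port for input on port 0."""
    return abs(mzi(phi)[1, 0]) ** 2
```

At phi = 0 the switch is fully in the cross state and at phi = pi fully in the bar state; a programmable mesh sets many such phases to realize switching, computing, or dispersion-control functions.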
38. Mapping long‐term terrace change in the Chinese Loess Plateau: A new approach by decision tree model with digital elevation model (DEM) and land use.
- Author
-
Shi, Yu and Wei, Wei
- Subjects
SUSTAINABLE agriculture ,DIGITAL elevation models ,DECISION trees ,LAND use ,LAND management ,TERRACING ,SOIL conservation - Abstract
Human‐created terraces are distributed extensively in the Chinese Loess Plateau, where they play key roles in soil conservation, agricultural production and sustainable development. However, large‐scale and long‐term terrace mapping remains a big challenge due to the complexity of topography and land cover and the deficiency of high‐quality historical spatial data. Facing this task, our study aims to develop a new approach for capturing 30 years (from 1990 to 2020) of terrace patterns at macroscales (the whole Loess Plateau, with an area of 6.4 × 10⁵ km²). The decision tree model (DTM) was integrated with a digital elevation model (DEM) and land use data to detect terrace change, and terraced samples were extracted from existing findings for spatial validation. Our study confirmed that this new approach can work successfully in identifying cultivated and grassy terraces, as evidenced by receiver operating characteristic (ROC) curves and area under the curve (AUC) values. More notably, a decreasing trend was detected in cultivated terraces with continued uneven distribution from 1990 to 2020, while the areas of grassy terraces increased markedly with more‐concentrated larger patches. This finding indicates that huge areas of terrace abandonment may have already occurred in this region. More attention thus should be paid to the rising risks of cropland utilization and food security. Since this is the first time reliable long‐term terrace maps of the Loess Plateau have been obtained, our efforts can help to better take stock of terrace resources for wiser land use management and agricultural policy adjustments, finally benefiting socio‐ecological sustainability. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
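A decision tree model in this setting is essentially a set of threshold rules over DEM-derived terrain attributes combined with land-use classes. A toy sketch follows; the slope thresholds, land-use codes, and the use of a single slope attribute are all illustrative assumptions, not the paper's calibrated tree:

```python
import numpy as np

def slope_deg(dem, cellsize=30.0):
    """Slope (degrees) of each cell of a DEM grid via finite differences."""
    dzdy, dzdx = np.gradient(dem, cellsize)
    return np.degrees(np.arctan(np.hypot(dzdx, dzdy)))

CROPLAND, GRASSLAND = 1, 2  # hypothetical land-use codes

def classify_terraces(dem, landuse, lo=2.0, hi=15.0):
    """Toy decision tree: a terrace is a gently sloped cell (lo..hi degrees)
    under cropland or grassland.

    Returns 0 = non-terrace, 1 = cultivated terrace, 2 = grassy terrace.
    """
    s = slope_deg(dem)
    gentle = (s >= lo) & (s <= hi)
    out = np.zeros(dem.shape, dtype=int)
    out[gentle & (landuse == CROPLAND)] = 1
    out[gentle & (landuse == GRASSLAND)] = 2
    return out
```

Combining the per-epoch maps (1990 vs. 2020) then reveals change classes such as cultivated-to-grassy transitions, the signature of terrace abandonment discussed above.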
39. The Effect of the Solution Flow and Electrical Field on the Homogeneity of Large-Scale Electrodeposited ZnO Nanorods.
- Author
-
Zhao, Yanmin, Li, Kexue, Hu, Ying, Hou, Xiaobing, Lin, Fengyuan, Tang, Jilong, Tang, Xin, Xing, Xida, Zhao, Xiao, Zhu, Haibin, Wang, Xiaohua, and Wei, Zhipeng
- Subjects
- *
NANORODS , *ZINC oxide , *ANTIREFLECTIVE coatings , *HOMOGENEITY , *TIN oxides - Abstract
In this paper, we demonstrate the significant impact of the solution flow and electrical field on the homogeneity of large-scale ZnO nanorod electrodeposition from an aqueous solution containing zinc nitrate and ammonium nitrate, primarily based on X-ray fluorescence results. The homogeneity can be enhanced by adjusting the counter electrode size and solution flow rate. We have successfully produced relatively uniform nanorod arrays on an 8 × 10 cm² i-ZnO-coated fluorine-doped tin oxide (FTO) substrate using a compact counter electrode and a vertical stirring setup. The as-grown nanorods exhibit similar surface morphologies and dominant, intense, almost uniform near-band-edge emissions in different regions of the sample. Additionally, the surface reflectance is significantly reduced after depositing the ZnO nanorods, achieving a moth-eye effect through subwavelength structuring. This effect of the nanorod array structure indicates that it can improve the light reception or emission efficiency of various optoelectronic devices and products. Large-scale preparation of ZnO nanorods is practical and has extremely broad application value. Based on these results, it is feasible to prepare large-scale ZnO nanorods suitable for antireflective coatings and commercial applications by optimizing the electrodeposition conditions. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
40. High-Efficiency Dynamic Scanning Strategy for Powder Bed Fusion by Controlling Temperature Field of the Heat-Affected Zone.
- Author
-
Huang, Xiaokang, Tian, Xiaoyong, Zhong, Qi, He, Shunwen, Huo, Cunbao, Cao, Yi, Tong, Zhiqiang, and Li, Dichen
- Abstract
Improvement of fabrication efficiency and part performance was the main challenge for the large-scale powder bed fusion (PBF) process. In this study, a dynamic monitoring and feedback system of powder bed temperature field using an infrared thermal imager has been established and integrated into a four-laser PBF equipment with a working area of 2000 mm × 2000 mm. The heat-affected zone (HAZ) temperature field has been controlled by adjusting the scanning speed dynamically. Simultaneously, the relationship among spot size, HAZ temperature, and part performance has been established. The fluctuation of the HAZ temperature in four-laser scanning areas was decreased from 30.85 ℃ to 17.41 ℃. Thus, the consistency of the sintering performance of the produced large component has been improved. Based on the controllable temperature field, a dynamically adjusting strategy for laser spot size was proposed, by which the fabrication efficiency was improved up to 65.38%. The current research results were of great significance to the further industrial applications of large-scale PBF equipment. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
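The abstract above describes closed-loop control of the heat-affected zone (HAZ) temperature by dynamically adjusting the scanning speed. As a toy illustration of that idea only (not the authors' controller; the function name, gain, and speed limits are all hypothetical), a proportional feedback update might look like:

```python
def adjust_scan_speed(speed, t_measured, t_setpoint, gain=0.5,
                      v_min=100.0, v_max=2000.0):
    """Proportional feedback sketch: if the HAZ runs hot, scan faster so
    less energy is deposited per unit length; clamp to machine limits.
    All parameter values are illustrative, not from the paper."""
    new_speed = speed + gain * (t_measured - t_setpoint)
    return max(v_min, min(v_max, new_speed))

# A HAZ reading 20 degrees above the setpoint nudges the speed upward.
print(adjust_scan_speed(1000.0, 220.0, 200.0))  # 1010.0
```

A real PBF controller would also account for actuator lag and the imager's frame rate, but the feedback structure is the same.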
41. CDSKNNXMBD: a novel clustering framework for large-scale single-cell data based on a stable graph structure.
- Author
-
Ren, Jun, Lyu, Xuejing, Guo, Jintao, Shi, Xiaodong, Zhou, Ying, and Li, Qiyuan
- Subjects
- *
K-nearest neighbor classification , *PARALLEL algorithms , *MULTIPLE comparisons (Statistics) - Abstract
Background: Accurate and efficient cell grouping is essential for analyzing single-cell transcriptome sequencing (scRNA-seq) data. However, existing clustering techniques often struggle to provide timely and accurate cell type groupings when dealing with large-scale datasets or imbalanced cell types. Therefore, there is a need for improved methods that can handle the increasing size of scRNA-seq datasets while maintaining high accuracy and efficiency. Methods: We propose CDSKNNXMBD (Community Detection based on a Stable K-Nearest Neighbor Graph Structure), a novel single-cell clustering framework integrating a partition clustering algorithm and a community detection algorithm, which achieves accurate and fast cell type grouping by finding a stable graph structure. Results: We evaluated the effectiveness of our approach by analyzing 15 tissues from the human fetal atlas. Compared to existing methods, CDSKNN effectively counteracts the high imbalance in single-cell data, enabling effective clustering. Furthermore, we conducted comparisons across multiple single-cell datasets from different studies and sequencing techniques. CDSKNN is highly applicable and robust, capable of handling the complexities of diverse data types. Most importantly, CDSKNN exhibits higher operational efficiency on datasets at the million-cell scale, requiring an average of only 6.33 min to cluster 1.46 million single cells and saving 33.3% to 99% of the running time of existing methods. Conclusions: CDSKNN is a flexible, resilient, and promising clustering tool that is particularly suitable for clustering imbalanced data and demonstrates high efficiency on large-scale scRNA-seq datasets. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
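The CDSKNNXMBD abstract builds clusters from a stable k-nearest-neighbor graph followed by community detection. A minimal sketch of that general pipeline (a mutual-kNN graph plus connected components standing in for the paper's partition and community-detection algorithms; all names here are illustrative):

```python
import math
from collections import deque

def mutual_knn_graph(points, k):
    """Adjacency keeping only mutual kNN edges, which tends to be stabler."""
    n = len(points)
    nbrs = []
    for i in range(n):
        order = sorted(range(n), key=lambda j: math.dist(points[i], points[j]))
        nbrs.append(set(order[1:k + 1]))  # skip the point itself
    return {i: {j for j in nbrs[i] if i in nbrs[j]} for i in range(n)}

def cluster(points, k=2):
    """Label each point by its connected component in the mutual-kNN graph."""
    graph = mutual_knn_graph(points, k)
    labels, next_label = {}, 0
    for start in graph:
        if start in labels:
            continue
        labels[start] = next_label
        queue = deque([start])
        while queue:
            u = queue.popleft()
            for v in graph[u]:
                if v not in labels:
                    labels[v] = next_label
                    queue.append(v)
        next_label += 1
    return [labels[i] for i in range(len(points))]

# Two well-separated blobs come out as two clusters.
pts = [(0, 0), (0, 1), (1, 0), (10, 10), (10, 11), (11, 10)]
print(cluster(pts, k=2))
```

Real scRNA-seq pipelines use approximate neighbor search and modularity-based community detection for scale; the graph-then-communities structure is what this sketch shows.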
42. Scale‐up fabrication and advanced properties of recycled polyethylene terephthalate aerogels from plastic waste.
- Author
-
Goh, Xue Yang, Deng, Xinying, Teo, Wern Sze, Ong, Ren Hong, Nguyen, Luon Tan, Bai, Tianliang, and Duong, Hai M.
- Subjects
POLYETHYLENE terephthalate ,PLASTIC scrap ,MECHANICAL drawing ,AEROGELS ,BLENDED yarn ,POLYVINYL alcohol - Abstract
Traditional fabrication methods for aerogels are time-consuming, toxic, and difficult to implement, making aerogel production expensive and severely limiting widespread adoption. Nonwoven technology is introduced to prepare fibers for polymer-based aerogels; it enables a continuous flow of fine fibers and eliminates the bottleneck of the fiber preparation phase of the fabrication process. Using recycled polyethylene terephthalate (rPET) fibers and polyvinyl alcohol, two types of rPET aerogels are successfully fabricated, lab-scale and large-scale, to investigate the effectiveness of the nonwoven process line for the fiber preparation step. Fibers prepared manually (lab-scale aerogels) and with the aid of a fiber preparation production line (large-scale aerogels) are characterized and compared. Both lab-scale and large-scale aerogels exhibited the required specifications of low density (12.6–45.9 and 13.2–43.7 mg/cm3, respectively) and high porosity (99.1%–96.7% and 99.0%–96.8%, respectively). Their thermal conductivity (23.4–34.0 and 23.2–31.9 mW/m⋅K, respectively) and compressive modulus (4.74–21.91 and 4.53–22.29 kPa, respectively) were also similar. The advantages of scaled fiber preparation for aerogel manufacturing include higher throughput (the line can produce up to 60 kg/h), improved consistency of defibrillation, homogeneous fiber blending, and accurate replication of laboratory-made aerogel properties. This demonstrates the viability of using nonwoven technology to scale to continuous production and bring down the production cost. Highlights: Scale-up production of aerogels using nonwoven technology. Improved aerogel preparation through homogeneous fiber blending. Preparation rate of up to 60 kg/h. High-porosity aerogels up to 99%. Good thermal insulation of 23.2–31.9 mW/m⋅K. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
43. A two-part alternating iteration power flow method based on dynamic equivalent admittance
- Author
-
Tong Jiang, Hongfei Hou, Zhuocheng Feng, and Chang Chen
- Subjects
Ill-conditioned ,Large-scale ,Power flow calculation ,Equivalent admittance ,Alternating iteration ,Production of electric energy or power. Powerplants. Central stations ,TK1001-1841 - Abstract
Power flow is an extensively used tool in power system operation and planning. In this paper, we propose an efficient hybrid method for solving power flow problems in ill-conditioned power systems. The method is implemented in a two-part alternating iteration framework. In Part I, PV bus voltages are held constant and PQ bus voltages are updated by the Z-bus Gauss method. In Part II, PQ bus loads are converted to equivalent dynamic admittances, and PV bus voltage angles are updated by Newton's method. The PQ bus voltage magnitudes and PV bus voltage angles are passed between Part I and Part II. The proposed method is validated on well- and ill-conditioned systems and compared with several well-known power flow methods. Results show that the proposed method is robust and efficient in addressing issues related to large-scale ill-conditioned power systems, and that it is not significantly affected by the initial guess.
- Published
- 2024
- Full Text
- View/download PDF
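The abstract's two-part scheme alternates a Gauss-type fixed-point update with a Newton update, passing values between the parts until both converge. The alternating skeleton can be sketched generically (the toy scalar updates below are stand-ins, not the Z-bus Gauss or Newton steps of the paper):

```python
import math

def alternating_solve(update_a, update_b, x0, y0, tol=1e-10, max_iter=500):
    """Alternate two block updates until neither moves, mirroring the
    paper's Part I / Part II hand-off of voltage quantities."""
    x, y = x0, y0
    for _ in range(max_iter):
        x_new = update_a(y)      # Part I analogue: update block x given y
        y_new = update_b(x_new)  # Part II analogue: update block y given x
        if abs(x_new - x) < tol and abs(y_new - y) < tol:
            return x_new, y_new
        x, y = x_new, y_new
    raise RuntimeError("alternating iteration did not converge")

# Toy contraction: x = cos(y), y = sin(x) / 2 has a unique fixed point.
x, y = alternating_solve(math.cos, lambda x: math.sin(x) / 2, 1.0, 0.0)
print(round(x, 6), round(y, 6))
```

Convergence here follows because the composed map is a contraction; for power flow, the ill-conditioning the paper targets is precisely what makes the choice of updates in each part matter.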
44. Beaconet: A Reference‐Free Method for Integrating Multiple Batches of Single‐Cell Transcriptomic Data in Original Molecular Space
- Author
-
Han Xu, Yusen Ye, Ran Duan, Yong Gao, Yuxuan Hu, and Lin Gao
- Subjects
batch effects ,large‐scale ,molecular feature space ,reference‐free ,single‐cell datasets ,Science - Abstract
Integrating multiple single‐cell datasets is essential for a comprehensive understanding of cell heterogeneity. Batch effect is the undesired systematic variation among technologies or experimental laboratories that distorts biological signals and hinders the integration of single‐cell datasets. However, existing methods typically rely on a selected dataset as a reference, leading to inconsistent integration performance with different references, or embed cells into an uninterpretable low‐dimensional feature space. To overcome these limitations, a reference‐free method, Beaconet, is presented for integrating multiple single‐cell transcriptomic datasets in the original molecular space by aligning the global distribution of each batch using an adversarial correction network. Through extensive comparisons with 13 state‐of‐the‐art methods, it is demonstrated that Beaconet can effectively remove batch effect while preserving biological variations, and that it is superior in overall performance to existing unsupervised methods using all possible references. Furthermore, Beaconet performs integration in the original molecular feature space, enabling the characterization of cell types and downstream differential expression analysis directly on integrated data with gene‐expression features. Additionally, when applied to large‐scale atlas data integration, Beaconet shows notable advantages in both time and space efficiency. In summary, Beaconet serves as an effective and efficient batch effect removal tool that can facilitate the integration of single‐cell datasets in a reference‐free and molecular‐feature‐preserved mode.
- Published
- 2024
- Full Text
- View/download PDF
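Beaconet aligns batch distributions adversarially while staying in the original gene-expression space. As a much simpler baseline that illustrates what "correcting in molecular space" means (per-gene mean alignment only — emphatically not Beaconet's adversarial method; all names are illustrative):

```python
import numpy as np

def align_batch_means(batches):
    """Shift each batch (cells x genes) so its per-gene mean matches the
    pooled mean, leaving the data in the original gene-expression space.
    This removes only additive mean shifts: a crude stand-in for a real
    batch-correction method."""
    pooled_mean = np.vstack(batches).mean(axis=0)
    return [b - b.mean(axis=0) + pooled_mean for b in batches]

rng = np.random.default_rng(0)
batch1 = rng.normal(0.0, 1.0, size=(50, 4))
batch2 = rng.normal(0.0, 1.0, size=(60, 4)) + 3.0  # additive batch effect
a1, a2 = align_batch_means([batch1, batch2])
print(np.allclose(a1.mean(axis=0), a2.mean(axis=0)))  # True
```

Because the output keeps one value per gene per cell, downstream differential-expression analysis can run directly on it, which is the property the abstract emphasizes.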
45. An overview of application-oriented multifunctional large-scale stationary battery and hydrogen hybrid energy storage system
- Author
-
Yuchen Yang, Zhen Wu, Jing Yao, Tianlei Guo, Fusheng Yang, Zaoxiao Zhang, Jianwei Ren, Liangliang Jiang, and Bo Li
- Subjects
Hybrid energy storage system ,Battery ,Hydrogen ,Stationary ,Large-scale ,Multifunctional ,Technology ,Science (General) ,Q1-390 - Abstract
The imperative to address traditional energy crises and environmental concerns has accelerated the need for energy structure transformation. However, the variable nature of renewable energy poses challenges in meeting complex practical energy requirements. To address this issue, the construction of a multifunctional large-scale stationary energy storage system is considered an effective solution. This paper critically examines battery and hydrogen hybrid energy storage systems. Both technologies face limitations that keep them from fully meeting future energy storage needs, such as large storage capacity in limited space, frequent storage with rapid response, and continuous storage without loss. Batteries, with their rapid response (90%), excel in frequent short-duration energy storage. However, limitations such as the self-discharge rate (>1%) and capacity loss (∼20%) restrict their use for long-duration energy storage. Hydrogen, as a potential energy carrier, is suitable for large-scale, long-duration energy storage due to its high energy density, steady state, and low loss. Nevertheless, it is less efficient for frequent energy storage due to its low storage efficiency (∼50%). Ongoing research suggests that a battery and hydrogen hybrid energy storage system could combine the strengths of both technologies to meet the growing demand for large-scale, long-duration energy storage. To assess their application potential, this paper provides a detailed analysis of the research status of both energy storage technologies using proposed key performance indices. Additionally, application-oriented future directions and challenges of the battery and hydrogen hybrid energy storage system are outlined from multiple perspectives, offering guidance for the development of advanced energy storage systems.
- Published
- 2024
- Full Text
- View/download PDF
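The trade-off the review describes — batteries are efficient but self-discharge, hydrogen is lossy to convert but stores almost without standing loss — can be made concrete with back-of-the-envelope arithmetic. The exact rates below are illustrative assumptions in the spirit of the abstract's ∼90%, >1%, and ∼50% figures, not values from the paper:

```python
def delivered_energy(e_in, round_trip_eff, self_discharge_per_day, days):
    """Energy recoverable after storing e_in for a given number of days."""
    return e_in * round_trip_eff * (1.0 - self_discharge_per_day) ** days

# Assumed: battery ~90% round trip with ~1%/day self-discharge;
# hydrogen ~50% round trip with negligible standing loss.
battery = lambda d: delivered_energy(100.0, 0.90, 0.01, d)
hydrogen = lambda d: delivered_energy(100.0, 0.50, 0.0, d)

print(round(battery(1), 1), round(hydrogen(1), 1))    # 89.1 50.0
print(round(battery(90), 1), round(hydrogen(90), 1))  # battery now lags
```

Under these assumed rates the battery wins for short storage and hydrogen for seasonal storage, which is exactly the complementarity the hybrid system exploits.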
46. A Heuristic Cutting Plane Algorithm For Budget Allocation of Large-scale Domestic Airport Network Protection
- Author
-
Yan Xihong and Hao Shiyu
- Subjects
airport security ,large-scale ,heuristic cutting plane algorithm ,budget allocation ,Electronic computers. Computer science ,QA75.5-76.95 - Abstract
It is well known that airport security is an important component of homeland security, since airports are highly vulnerable to terrorist attacks. To improve the overall security of the domestic airport network, previous work has studied the budget allocation of domestic airport network protection, establishing a minimax optimization model and designing an exact cutting plane algorithm to solve it. However, the exact algorithm cannot solve large-scale problems in an acceptable time. Hence, this paper designs a heuristic cutting plane algorithm for solving the budget allocation of large-scale domestic airport network protection. Finally, numerical experiments are carried out to demonstrate the feasibility and effectiveness of the new algorithm.
- Published
- 2024
- Full Text
- View/download PDF
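The underlying minimax budget-allocation problem — spend a fixed budget so the worst-case airport risk is as low as possible — can be illustrated with a far simpler greedy heuristic than the paper's cutting-plane algorithm. The risk model risk_i/(1+budget_i) and every name below are invented purely for illustration:

```python
def greedy_minimax_allocation(risks, budget, step=1.0):
    """Repeatedly give the next budget increment to the currently
    riskiest airport; a toy stand-in for the paper's cutting planes."""
    alloc = [0.0] * len(risks)
    residual = lambda i: risks[i] / (1.0 + alloc[i])
    remaining = budget
    while remaining > 1e-9:
        spend = min(step, remaining)
        worst = max(range(len(risks)), key=residual)
        alloc[worst] += spend
        remaining -= spend
    return alloc

# One airport dominates the risk, so it absorbs the whole small budget.
print(greedy_minimax_allocation([10.0, 1.0], budget=5.0))  # [5.0, 0.0]
```

A cutting-plane method instead solves a relaxed master problem and adds violated constraints iteratively; the greedy loop above only conveys the minimax objective the constraints encode.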
47. Chain-Splitting-Solving-Splicing Approach to Large-Scale OFISP-Modeled Satellite Range Scheduling Problem
- Author
-
De Meng, Zhen-Bao Liu, Yu-Hang Gao, Zu-Ren Feng, Wen-Hua Guo, and Zhi-Gang Ren
- Subjects
Large-scale ,fixed interval scheduling ,combinatorial optimization ,dynamic programming ,Electrical engineering. Electronics. Nuclear engineering ,TK1-9971 - Abstract
The NP-hard satellite range scheduling problem (SRSP), modeled as an operational fixed interval scheduling problem (OFISP), grows rapidly in dimension in large cases. This motivates the chain-splitting-solving-splicing (CSSS) approach to the OFISP-modeled SRSP under the curse of dimensionality. To boost performance on large-scale SRSP instances, we build efficiency into the algorithmic design from the inside out. The proposed method splits the original problem into small subproblems with a chain-splitting (CS) procedure and feeds them to a route-reduction-based dynamic programming (R-DP) solver, introducing a novel scheduling element, the critical resource (CR), to expedite computation of the optimal subsolution to each subproblem. Finally, standard DP (S-DP) splices the subsolutions into a complete optimal solution. The CR-based subproblem solving is encased in the chain splitting and splicing framework, and mathematical analysis shows that the CS procedure greatly curbs the exponential growth in computation time. We demonstrate the efficiency of the proposed method on a large-scale real-world SRSP instance, where the CSSS approach outperforms several state-of-the-art algorithms, obtaining optimal solutions within reasonable time for cases of up to 3000 jobs. To the best of our knowledge, other research into the large-scale SRSP is still absent, so our results may serve as a benchmark for future comparisons.
- Published
- 2024
- Full Text
- View/download PDF
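At the heart of the OFISP model is fixed-interval scheduling, whose single-resource weighted variant has a textbook dynamic program. A self-contained sketch of that core DP (the paper's multi-resource, chain-splitting machinery is not reproduced here):

```python
import bisect

def max_weight_schedule(jobs):
    """Classic weighted interval scheduling DP.
    jobs: (start, end, weight) tuples; returns the maximum total weight
    of a set of pairwise non-overlapping jobs on one resource."""
    jobs = sorted(jobs, key=lambda j: j[1])          # order by finish time
    ends = [j[1] for j in jobs]
    dp = [0.0] * (len(jobs) + 1)
    for i, (start, end, weight) in enumerate(jobs, 1):
        # p = number of earlier jobs finishing no later than this start
        p = bisect.bisect_right(ends, start, 0, i - 1)
        dp[i] = max(dp[i - 1], dp[p] + weight)       # skip job vs take it
    return dp[-1]

# Jobs 1 and 3 are compatible (3 <= 4) and together beat job 2 alone.
print(max_weight_schedule([(0, 3, 5.0), (2, 5, 6.0), (4, 7, 5.0)]))  # 10.0
```

The paper's chain-splitting step effectively bounds how many such subproblems interact, which is what keeps the DP tractable at 3000 jobs.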
48. Large-Scale Green Method for Synthesizing Ultralong Uniform Tellurium Nanowires for Semiconductor Devices
- Author
-
Zhiyi Lyu, Mose Park, Yanjin Tang, Hoon Choi, Seung Hyun Song, and Hoo-Jeong Lee
- Subjects
tellurium nanowires ,green synthesis ,semiconductor applications ,thin-film transistors ,large-scale ,Chemistry ,QD1-999 - Abstract
This study presents a large-scale green approach for synthesizing ultralong tellurium nanowires with diameters of around 13 nm using a solution-based method. By adjusting key synthesis parameters such as surfactant concentration, temperature, and reaction duration, we achieved high-quality, ultralong Te NWs. These nanowires exhibit properties suitable for semiconductor applications, particularly as channel materials in thin-film transistors, displaying a pronounced gate effect with a high on/off ratio of up to 10⁴ and a mobility of 0.9 cm² V⁻¹ s⁻¹. This study underscores the potential of solution-based methods for synthesizing large-scale ultralong Te NWs as a critical resource for future sustainable nanoelectronic devices.
- Published
- 2024
- Full Text
- View/download PDF
49. Evaluation of Ecological Environment Quality Using an Improved Remote Sensing Ecological Index Model
- Author
-
Yanan Liu, Wanlin Xiang, Pingbo Hu, Peng Gao, and Ai Zhang
- Subjects
ecological environments ,remote sensing ecological index ,large-scale ,3D ecological factors ,subjective and objective weights determination ,Science - Abstract
The Remote Sensing Ecological Index (RSEI) model is widely used for large-scale, rapid Ecological Environment Quality (EEQ) assessment. However, both the RSEI and its improved variants explain the EEQ with only two-dimensional (2D) factors, which limits the accuracy of the evaluation. Incorporating more comprehensive, three-dimensional (3D) ecological information poses challenges for maintaining stability in large-scale monitoring when traditional weighting methods such as Principal Component Analysis (PCA) are used. This study introduces an Improved Remote Sensing Ecological Index (IRSEI) model that integrates 2D ecological factors (normalized difference vegetation factor, normalized difference built-up and soil factor, heat factor, wetness, and a difference factor for air quality) and a 3D ecological factor (comprehensive vegetation factor) for enhanced EEQ monitoring. The model employs a combined subjective–objective weighting approach, utilizing principal components and hierarchical analysis under minimum entropy theory. A comparative analysis of IRSEI and RSEI in Miyun, a representative study area, reveals a strong correlation and consistent monitoring trends. By incorporating air quality and 3D ecological factors, IRSEI provides a more accurate and detailed EEQ assessment, better aligning with ground-truth observations from Google Earth satellite imagery.
- Published
- 2024
- Full Text
- View/download PDF
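The abstract's "minimum entropy" objective weighting belongs to the entropy-weight family, in which indicators that vary more across samples receive larger weights. A minimal sketch of the standard entropy-weight method (not the paper's exact combined subjective–objective scheme; names are illustrative):

```python
import math

def entropy_weights(matrix):
    """matrix[i][j]: positive value of indicator j for sample i.
    An indicator whose values are all equal carries no information
    (normalized entropy 1) and gets weight ~0; dispersed ones get more."""
    n, m = len(matrix), len(matrix[0])
    raw = []
    for j in range(m):
        col = [row[j] for row in matrix]
        total = sum(col)
        probs = [v / total for v in col]
        entropy = -sum(p * math.log(p) for p in probs if p > 0) / math.log(n)
        raw.append(1.0 - entropy)
    norm = sum(raw)
    return [w / norm for w in raw]

# Indicator 0 is constant, indicator 1 varies: the weight shifts to 1.
print(entropy_weights([[1.0, 1.0], [1.0, 9.0]]))
```

The paper combines such objective weights with subjective (hierarchical-analysis) weights, which guards against an indicator that is noisy but ecologically unimportant dominating the index.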
50. Simultaneous machine selection and buffer allocation in large unbalanced series-parallel production lines.
- Author
-
Xi, Shaohui, Smith, James MacGregor, Chen, Qingxin, Mao, Ning, Zhang, Huiyu, and Yu, Ailin
- Subjects
NP-hard problems ,MACHINERY - Abstract
Simultaneous optimisation of machines and buffers in a large series-parallel production line is an NP-hard problem. The optimisation model formulated in this study minimises the total investment cost subject to the desired throughput rate and cycle time by optimising the machine types, the number of parallel machines, and the buffer capacities. To solve this design problem, a decomposition-coordination method is proposed to efficiently and accurately generate allocation solutions for large production lines. The proposed method comprises two iterative processes: the decomposition process decouples the original line into several small lines and optimises them separately, while the coordination process ensures that the optimisation problems of the decomposed lines remain similar to the corresponding parts of the original. The performance of the approach is demonstrated through numerical experiments, with comparisons against the simulated annealing algorithm and the non-dominated sorting genetic algorithm-II. Finally, the numerical results and a multi-factorial experimental analysis illustrate the influence of target system parameters on the resource configurations. [ABSTRACT FROM AUTHOR]
- Published
- 2022
- Full Text
- View/download PDF
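The design problem in the abstract — buy as little capacity as possible while still meeting a throughput target — can be illustrated with a toy greedy search. The throughput model r·b/(b+1) is a made-up stand-in for a real line-performance evaluator, and all names are illustrative:

```python
def min_cost_buffers(rates, target_throughput, slot_cost=1.0, b_max=50):
    """Add buffer slots one at a time at the current bottleneck station
    until a toy throughput model meets the target; returns (buffers, cost)."""
    def station_rate(k, bufs):
        # toy model: small buffers starve a station, large ones approach r
        return rates[k] * bufs[k] / (bufs[k] + 1)

    bufs = [1] * len(rates)
    while min(station_rate(k, bufs) for k in range(len(rates))) < target_throughput:
        bottleneck = min(range(len(rates)), key=lambda k: station_rate(k, bufs))
        if bufs[bottleneck] >= b_max:
            raise ValueError("target throughput unreachable under this model")
        bufs[bottleneck] += 1
    return bufs, sum(bufs) * slot_cost

# Two identical stations: each needs b/(b+1) >= 0.8, i.e. 4 slots apiece.
print(min_cost_buffers([1.0, 1.0], target_throughput=0.8))  # ([4, 4], 8.0)
```

The paper's decomposition-coordination method replaces both the crude throughput model and the greedy loop, but the cost-versus-throughput tension it resolves is the one shown here.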