184,189 results
Search Results
202. Management of ST‐segment‐elevation myocardial infarction during the coronavirus disease 2019 (COVID‐19) outbreak: Iranian National Committee's position paper on primary percutaneous coronary intervention
- Author
-
Sadeghipour, Parham, Talasaz, Azita H., Eslami, Vahid, Geraiely, Babak, Vojdanparast, Mohammad, Sedaghat, Mojtaba, Moosavi, Abouzar Fakhr, Alipour‐Parsa, Saeed, Aminian, Bahram, Firouzi, Ata, Ghaffari, Samad, Ghasemi, Massoud, Saleh, Davood Kazemi, Khosravi, Alireza, Kojuri, Javad, Noohi, Feridoun, Pourhosseini, Hamid, Salarifar, Mojtaba, Salehi, Mohamad Reza, Sezavar, Hashem, Shabestari, Mahmoud, Soleimani, Abbas, Tabarsi, Payam, Parsa, Amir Farhang Zand, and Abdi, Seifollah
- Subjects
Infection Control ,Percutaneous Coronary Intervention ,acute myocardial infarction/STEMI ,Radiology Nuclear Medicine and imaging ,COVID-19 ,Humans ,ST Elevation Myocardial Infarction ,General Medicine ,Iran ,Core Curriculum ,Cardiology and Cardiovascular Medicine ,Algorithms ,thrombolytic therapy - Abstract
The World Health Organization has designated coronavirus disease 2019 (COVID‐19) a pandemic. During the past several weeks, a considerable burden has been imposed on the Iranian healthcare system. The present document reviews the latest evidence and expert opinion on the management of ST‐segment‐elevation myocardial infarction during the COVID‐19 outbreak and outlines a practical algorithm for it.
- Published
- 2020
- Full Text
- View/download PDF
203. 'Another fine mess': how the papers covered the exams fiasco U-turn; Now that the algorithm has finally been dropped, the blame game begins, as well as the search for university places; Gavin Williamson seeks to blame Ofqual for exams debacle
- Subjects
Algorithms ,Journalistic ethics ,Algorithm ,News, opinion and commentary - Abstract
Byline: Graham Russell Climbdowns, U-turns and calls for Gavin Williamson to resign resounded across Tuesday's front pages following the education secretary's apology for the exams algorithm fiasco that has marred [...]
- Published
- 2020
204. Desert island papers-A life in variance parameter and quantitative genetic parameter estimation reviewed using 16 papers
- Author
-
Robin Thompson
- Subjects
0301 basic medicine ,Mixed model ,Restricted maximum likelihood ,Scientific career ,03 medical and health sciences ,Food Animals ,Statistics ,Computer software ,Animals ,Humans ,Inbreeding ,Mathematics ,Estimation ,Likelihood Functions ,Sheep ,Models, Genetic ,Estimation theory ,0402 animal and dairy science ,04 agricultural and veterinary sciences ,General Medicine ,Variance (accounting) ,History, 20th Century ,040201 dairy & animal science ,030104 developmental biology ,Genetics, Population ,Linear Models ,Animal Science and Zoology ,Periodicals as Topic ,Algorithms ,Software - Abstract
I review my scientific career in terms of eight areas and 16 papers. The first two areas are associated with childhood. The other six are associated with residual maximum likelihood (REML), canonical transformation, inbreeding in selected populations, average information residual maximum likelihood (AIREML), the computer program ASReml and sampling-based estimation.
- Published
- 2018
205. Canadian Association of Radiologists White Paper on Ethical and Legal Issues Related to Artificial Intelligence in Radiology.
- Author
-
Jaremko, Jacob L., Azar, Marleine, Bromwich, Rebecca, Lum, Andrea, Alicia Cheong, Li Hsia, Gibert, Martin, Laviolette, François, Gray, Bruce, Reinhold, Caroline, Cicero, Mark, Chong, Jaron, Shaw, James, Rybicki, Frank J., Hurrell, Casey, Lee, Emil, and Tang, An
- Subjects
- *
ARTIFICIAL intelligence laws , *ACQUISITION of property , *ALGORITHMS , *ARTIFICIAL intelligence , *AUTONOMY (Psychology) , *CONCEPTUAL structures , *MEDICAL ethics , *MEDICAL practice , *MEDICAL specialties & specialists , *PRIVACY , *RADIOLOGISTS , *DATA security - Abstract
Artificial intelligence (AI) software that analyzes medical images is becoming increasingly prevalent. Unlike earlier generations of AI software, which relied on expert knowledge to identify imaging features, machine learning approaches automatically learn to recognize these features. However, the promise of accurate personalized medicine can only be fulfilled with access to large quantities of medical data from patients. This data could be used for purposes such as predicting disease, diagnosis, treatment optimization, and prognostication. Radiology is positioned to lead development and implementation of AI algorithms and to manage the associated ethical and legal challenges. This white paper from the Canadian Association of Radiologists provides a framework for study of the legal and ethical issues related to AI in medical imaging, related to patient data (privacy, confidentiality, ownership, and sharing); algorithms (levels of autonomy, liability, and jurisprudence); practice (best practices and current legal framework); and finally, opportunities in AI from the perspective of a universal health care system. [ABSTRACT FROM AUTHOR]
- Published
- 2019
- Full Text
- View/download PDF
206. A Parameterization Approach for the Dielectric Response Model of Oil Paper Insulation Using FDS Measurements.
- Author
-
Yang, Feng, Du, Lin, Yang, Lijun, Wei, Chao, Wang, Youyuan, Ran, Liman, and He, Peng
- Subjects
- *
DIELECTRICS , *HIGH voltages , *ALGORITHMS , *ELECTRIC capacity , *ELECTRIC potential - Abstract
To facilitate better interpretation of dielectric response measurements, and thereby provide numerical evidence for condition assessment of oil-paper-insulated equipment in high-voltage alternating current (HVAC) transmission systems, a novel approach is presented to estimate the parameters of the extended Debye model (EDM) using wideband frequency domain spectroscopy (FDS). A syncretic algorithm that integrates a genetic algorithm (GA) and the Levenberg-Marquardt (L-M) algorithm is introduced to parameterize the EDM using FDS measurements of a real-life 126 kV oil-impregnated paper (OIP) bushing under different controlled temperatures. To address the uncertainty of the EDM structure due to the variable number of branches, Akaike's information criterion (AIC) is employed to determine the model order. For verification, a comparative analysis of FDS reconstruction and of FDS transformation to polarization-depolarization current (PDC)/return voltage measurement (RVM) results is presented. The comparison demonstrates good agreement between the measured and reconstructed spectroscopies of complex capacitance and tan δ over the full tested frequency band (10⁻⁴ Hz to 10³ Hz), with goodness of fit over 0.99. Deviations between the tested and modelled PDC/RVM from FDS are then discussed. Compared with previous studies that parameterize the model using time domain dielectric responses, the proposed method solves the problematic matching between the EDM and FDS, especially over a wide frequency band, and therefore provides a basis for quantitative insulation condition assessment of OIP-insulated apparatus in energy systems. [ABSTRACT FROM AUTHOR]
- Published
- 2018
- Full Text
- View/download PDF
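The two-stage parameterization strategy described in the abstract above (a global search seeding a local Levenberg-Marquardt refinement of an extended Debye model against FDS data) can be sketched on a toy two-branch model. The parameter values are invented, and the grid-plus-linear-least-squares global stage is an illustrative stand-in for the paper's genetic algorithm, not the authors' implementation:

```python
import numpy as np
from scipy.optimize import least_squares

# Extended Debye model of complex capacitance with two relaxation branches:
#   C(w) = C_inf + sum_i dC_i / (1 + 1j * w * tau_i)
def edm(params, w):
    c_inf, d1, t1, d2, t2 = params
    return c_inf + d1 / (1 + 1j * w * t1) + d2 / (1 + 1j * w * t2)

def residuals(params, w, meas):
    r = edm(params, w) - meas
    return np.concatenate([r.real, r.imag])

# Synthetic "FDS measurement" from a known model (values are illustrative).
w = np.logspace(-3, 3, 200)
true = np.array([1.0, 0.5, 10.0, 0.2, 0.01])
meas = edm(true, w)

# Stage 1 (global): grid over log-spaced time constants. The amplitudes
# enter the model linearly, so each (t1, t2) candidate is solved exactly by
# linear least squares. This stands in for the paper's genetic algorithm.
taus = np.logspace(-4, 2, 25)
y = np.concatenate([meas.real, meas.imag])
best_err, seed = np.inf, None
for t1 in taus:
    for t2 in taus:
        cols = np.stack([np.ones_like(w) + 0j,
                         1 / (1 + 1j * w * t1),
                         1 / (1 + 1j * w * t2)], axis=1)
        A = np.concatenate([cols.real, cols.imag])
        amps, *_ = np.linalg.lstsq(A, y, rcond=None)
        err = np.sum((A @ amps - y) ** 2)
        if err < best_err:
            best_err, seed = err, [amps[0], amps[1], t1, amps[2], t2]

# Stage 2 (local): Levenberg-Marquardt refinement of all five parameters.
fit = least_squares(residuals, seed, args=(w, meas), method="lm").x
```

With a good global seed, the L-M stage converges quickly; a model-order selection step (the abstract's AIC criterion) would repeat this fit for different branch counts and compare the penalized likelihoods.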
207. A natural language programming solution for executable papers.
- Author
-
Veres, Sandor M and Adolfsson, J. Patrik
- Subjects
COMPUTER software execution ,ELECTRONIC data processing ,COMPUTER programming ,SCHOLARLY electronic publishing ,SOURCE code ,INTERNET publishing ,ALGORITHMS ,SCIENCE publishing - Abstract
Abstract: The paper describes a system for executable papers that enables publishers to reuse content and to generate further advances in science and engineering. The executable algorithmic descriptions within a paper are presented in natural language sentences and basic code, thereby making long-term compatibility absolute. Authors are required to use publicly available numerical libraries on the Internet or references to publications with executable papers. As the system is used by authors, it automatically creates a web of algorithmic knowledge on the Internet. The novelty of new algorithms in publications can be evaluated by automated tools available to authors, reviewers, and readers of published scientific papers. [Copyright Elsevier]
- Published
- 2011
- Full Text
- View/download PDF
208. Stephen Ramsay. Reading Machines: Toward an Algorithmic Criticism. Champaign, IL: University of Illinois Press. 2011. ISBN 978-0-252-03641-5 (Cloth), 978-0-252-07820-0 (Paper).
- Author
-
Matt Schneider
- Subjects
Digital Humanities ,DH ,humanities computing ,Stephen Ramsay ,algorithmic criticism ,algorithms ,History of scholarship and learning. The humanities ,AZ20-999 ,Electronic computers. Computer science ,QA75.5-76.95 - Abstract
The Digital Humanities have attracted much attention as of late, including an increased concentration of DH-specific panels at the MLA conference, a focus in the MLA's publication "Profession" (2011b), and a series of blog posts by Stanley Fish (2012a, 2012b). While this coverage has done much to expose scholars to the variety of the work being done in the digital humanities, the general sense is still that the digital humanities have more in common with the social sciences and computer sciences than with the humanities, a sense that is apparent in popular articles like Fish's blog posts (2012a, 2012b) and Kathryn Schulz's New York Times article "What is Distant Reading" (2011), and can even be felt to some degree when Johanna Drucker separates the digital humanities from speculative computing (2009, 4-5). Data mining, database construction, and the development of visualisation tools require deep technological engagement, and appear entirely alien when compared to standard humanistic methods of enquiry.
- Published
- 2012
- Full Text
- View/download PDF
209. Enumeration optimization of open pit production scheduling based on mobile capacity search domain.
- Author
-
Xu X, Gu X, Wang Q, Zhao Y, Kong W, Zhu Z, and Wang F
- Subjects
- Algorithms, Mining
- Abstract
The optimization of open pit mine production scheduling is not only a multistage decision-making problem but also involves dynamic space-time interaction among multiple factors, which makes it difficult to optimize production capacity, mining sequence, mining life, and other factors simultaneously. In addition, production capacity expands in a disorderly way, the calculation scale is large, and the optimization time is long. Therefore, this article designs a mobile capacity search domain method to improve computing efficiency without omitting the optimal production capacity. At the same time, taking the maximum net present value as the objective function, an enumeration method is used to optimize the possible paths in different capacity domains and to calculate the infrastructure investment and facility idle cost required to meet the maximum production capacity on each possible path, thereby controlling the disorderly expansion and sharp fluctuation of production capacity. The research shows that the proposed open pit mine production scheduling optimization algorithm can not only optimize the three elements of production capacity, mining sequence, and mining life simultaneously but also improve computing efficiency by a factor of 200. Furthermore, the production capacity fluctuation is less than 1.4%, the mining life of the mine is extended by 13 years, and the overall economic benefit is increased by 18%. (© 2023. The Author(s).)
- Published
- 2023
- Full Text
- View/download PDF
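The core idea in the abstract above (enumerating capacity paths restricted to a moving search domain and ranking them by net present value) can be sketched with toy numbers. The capacities, cash-flow model, reserve, and discount rate below are invented for illustration; only the window-constrained enumeration mirrors the described method:

```python
from itertools import product

# Candidate annual production capacities (Mt) and a "mobile search domain":
# each period's capacity may move at most one step from the previous one,
# which prunes the enumeration without discarding near-optimal paths.
CAPS = [1.0, 1.5, 2.0, 2.5]
PERIODS = 4
RATE = 0.08       # discount rate for net present value (illustrative)
RESERVE = 6.0     # total ore available (Mt), illustrative

def npv(path):
    # Toy cash flow: revenue proportional to ore mined minus a fixed cost,
    # truncated once the reserve is exhausted.
    value, mined = 0.0, 0.0
    for t, cap in enumerate(path):
        ore = min(cap, RESERVE - mined)
        if ore <= 0:
            break
        mined += ore
        value += (3.0 * ore - 1.0) / (1 + RATE) ** t
    return value

def feasible(path):
    # The mobile search domain: successive capacities stay within one
    # step of each other in the candidate list.
    idx = [CAPS.index(c) for c in path]
    return all(abs(a - b) <= 1 for a, b in zip(idx, idx[1:]))

best = max((p for p in product(CAPS, repeat=PERIODS) if feasible(p)), key=npv)
```

A real instance would also charge infrastructure investment when capacity expands and idle cost when it contracts, which is what penalizes the "disorderly expansion" the abstract mentions.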
210. Exploring Evolutionary Technical Trends From Academic Research Papers.
- Author
-
TENG-KAI FAN and CHIA-HUI CHANG
- Subjects
RESEARCH ,TEXT mining ,MACHINE learning ,ALGORITHMS ,INFORMATION retrieval ,INFORMATION science - Abstract
Technical terms are vital for understanding the techniques used in academic research papers, and in this paper we use focused technical terms to explore technical trends in the research literature. The major purpose of this work is to understand the relationship between techniques and research topics in order to better explore technical trends. We define this new text mining problem and apply machine learning algorithms to solve it by (1) recognizing focused technical terms in research papers; (2) classifying these terms into predefined technology categories; and (3) analyzing the evolution of technical trends. The dataset consists of 656 papers collected from well-known ACM conferences. The experimental results indicate that our proposed methods can effectively uncover interesting evolutionary technical trends in various research topics. [ABSTRACT FROM AUTHOR]
- Published
- 2010
211. Smart Random Walk Distributed Secured Edge Algorithm Using Multi-Regression for Green Network.
- Author
-
Saba, Tanzila, Haseeb, Khalid, Rehman, Amjad, Damaševičius, Robertas, and Bahaj, Saeed Ali
- Subjects
RANDOM walks ,ALGORITHMS ,ARTIFICIAL intelligence ,INTERNET of things ,ELECTRONIC paper ,INTERNET traffic - Abstract
Smart communication has significantly advanced with the integration of the Internet of Things (IoT). Many devices and online services are utilized in the network system to cope with data gathering and forwarding. Recently, many traffic-aware solutions have explored autonomous systems to attain intelligent routing and flow of internet traffic with the support of artificial intelligence. However, inefficient usage of the nodes' batteries and long-range communication degrade the connectivity time between the deployed sensors and the end devices. Moreover, trustworthy route identification is another significant research challenge in formulating a smart system. Therefore, this paper presents a smart Random walk Distributed Secured Edge algorithm (RDSE) using a multi-regression model for IoT networks, which aims to enhance the stability of the chosen IoT network with the support of an optimal system. In addition, by using secured computing, the proposed architecture increases the trustworthiness of smart devices with the least node complexity. The proposed algorithm differs from other works in the following ways. First, it uses a random walk to form the initial routes with certain probabilities and later, by exploring a multi-variant function, attains long-lasting communication with a high degree of network stability. This improves the optimization criteria for the nodes' communication and efficiently utilizes energy in combination with mobile edges. Second, the trust factors successfully identify the normal nodes even when the system is compromised. The proposed algorithm therefore reduces data risks and offers a more reliable and private system. In addition, simulation-based testing reveals the significant performance of the proposed algorithm in comparison with existing work. [ABSTRACT FROM AUTHOR]
- Published
- 2022
- Full Text
- View/download PDF
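The first idea in the abstract above (forming initial routes by a random walk that picks next hops "with certain probabilities") can be sketched minimally. The topology, the energy scores used as weights, and the node names are all invented for illustration and are not the RDSE algorithm itself:

```python
import random

# Hypothetical sensor topology and per-node residual energy. The walk
# favors energy-rich neighbors, echoing the abstract's concern with
# battery usage; a real scheme would fold in trust and link metrics.
NEIGHBORS = {"a": ["b", "c"], "b": ["d"], "c": ["d"], "d": []}
ENERGY = {"a": 1.0, "b": 0.9, "c": 0.3, "d": 0.8}

def random_walk_route(start, sink, rng):
    route = [start]
    while route[-1] != sink:
        options = NEIGHBORS[route[-1]]
        if not options:
            break  # dead end before reaching the sink
        weights = [ENERGY[n] for n in options]
        route.append(rng.choices(options, weights=weights)[0])
    return route

rng = random.Random(0)
route = random_walk_route("a", "d", rng)
```

Repeating such walks and keeping the routes that survive a stability criterion is one plausible reading of how the "initial routes" feed the later multi-regression stage.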
212. Active learning for ordinal classification based on expected cost minimization.
- Author
-
He D
- Subjects
- Algorithms
- Abstract
To date, a large number of active learning algorithms have been proposed, but active learning methods for ordinal classification are under-researched. In ordinal classification, there is a total ordering among the data classes, so it is natural that the cost of misclassifying an instance as an adjacent class should be lower than that of misclassifying it as a more distant class. However, existing active learning algorithms typically do not consider this ordering information in query selection, and most of them therefore do not perform satisfactorily in ordinal classification. This study proposes an active learning method for ordinal classification that considers the ordering information among classes. We design an expected cost minimization criterion that embeds the ordering information, and we combine it with an uncertainty sampling criterion to make the queried instance more informative. Furthermore, we introduce a candidate subset selection method based on the k-means algorithm to reduce the computational overhead caused by the calculation of expected cost. Extensive experiments on nine public ordinal classification datasets demonstrate that the proposed method outperforms several baseline methods. (© 2022. The Author(s).)
- Published
- 2022
- Full Text
- View/download PDF
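The expected-cost idea in the abstract above can be sketched concretely. The absolute-difference cost matrix and the "query the instance whose best achievable expected cost is largest" rule are one natural reading of the criterion, not the paper's exact formulation:

```python
import numpy as np

# Ordinal misclassification cost: predicting class j when the truth is i
# costs |i - j|, so confusing adjacent classes is cheaper than distant ones.
def expected_cost(probs):
    """probs: (n_samples, n_classes) predicted class probabilities.
    Returns each sample's minimum achievable expected misclassification cost."""
    n_classes = probs.shape[1]
    cost = np.abs(np.subtract.outer(np.arange(n_classes), np.arange(n_classes)))
    # Expected cost of predicting j is sum_i p(i) * |i - j|; the classifier
    # would pick the j minimizing it, so that minimum measures ambiguity.
    return (probs @ cost).min(axis=1)

def select_query(probs):
    # Query the instance the ordinal classifier is least able to place
    # safely, i.e. with the largest minimum expected cost.
    return int(np.argmax(expected_cost(probs)))
```

For example, an instance split evenly between the two extreme classes carries a higher expected cost than one split between adjacent classes, so it is queried first; the paper additionally blends this with uncertainty sampling and restricts candidates via k-means.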
213. Quantum variational algorithms are swamped with traps.
- Author
-
Anschuetz ER and Kiani BT
- Subjects
- Algorithms, Neural Networks, Computer
- Abstract
One of the most important properties of classical neural networks is how surprisingly trainable they are, even though their training algorithms typically rely on optimizing complicated, nonconvex loss functions. Previous results have shown that, unlike classical neural networks, variational quantum models are often not trainable. The most studied phenomenon is the onset of barren plateaus in the training landscape of these quantum models, typically when the models are very deep. This focus on barren plateaus has made the phenomenon almost synonymous with the trainability of quantum models. Here, we show that barren plateaus are only part of the story. We prove that a wide class of variational quantum models (which are shallow and exhibit no barren plateaus) have only a superpolynomially small fraction of local minima within any constant energy of the global minimum, rendering these models untrainable if no good initial guess of the optimal parameters is known. We also study the trainability of variational quantum algorithms within a statistical query framework and show that noisy optimization of a wide variety of quantum models is impossible with a sub-exponential number of queries. Finally, we numerically confirm our results on a variety of problem instances. Though we exclude a wide variety of quantum algorithms here, we give reason for optimism for certain classes of variational algorithms and discuss potential ways forward in showing the practical utility of such algorithms. (© 2022. The Author(s).)
- Published
- 2022
- Full Text
- View/download PDF
214. Millimeter-Wave Radar-Based Identity Recognition Algorithm Built on Multimodal Fusion.
- Author
-
Guo, Jian, Wei, Jingpeng, Xiang, Yashan, and Han, Chong
- Subjects
FEATURE extraction ,HEART rate monitors ,ALGORITHMS ,SIGNAL-to-noise ratio - Abstract
Millimeter-wave radar-based identification technology has a wide range of applications in persistent identity verification, covering areas such as security production, healthcare, and personalized smart consumption systems. It has received extensive attention from the academic community due to its advantages of being non-invasive, environmentally insensitive, and privacy-preserving. Existing identification algorithms mainly rely on a single signal, such as breathing or heartbeat, and their reliability and accuracy are limited by the high similarity of breathing patterns and the low signal-to-noise ratio of heartbeat signals. To address these issues, this paper proposes a multimodal-fusion algorithm for identity recognition, which extracts and fuses features derived from phase signals, respiratory signals, and heartbeat signals. The spatial features of the different modal signals are first extracted by a residual network (ResNet), after which these features are fused with a spatial-channel attention fusion module. On this basis, temporal features are further extracted with a time-series-based self-attention mechanism. Finally, feature vectors of the user's vital-sign modalities are obtained to perform identity recognition. This method makes full use of the correlation and complementarity between different modal signals to improve the accuracy and reliability of identification. Simulation experiments show that the identity recognition algorithm proposed in this paper achieves an accuracy of 94.26% on a 20-subject self-test dataset, much higher than the roughly 85% of traditional algorithms. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
215. The Use of Manipulatives, Calculators, and Computers in Selected Kansan Third, Fourth, and Fifth Grade Classrooms.
- Author
-
Morrow, Jean
- Abstract
This paper discusses the results of questionnaires sent to principals and teachers on the availability and use of manipulatives, computers, and calculators in the classroom. The study was done in order to develop a better understanding of teachers' inservice needs. Results of the study include: (1) most teachers have at least some manipulatives available to them; (2) on average, students use manipulatives about once a week; (3) the most common use of computers is for drill and practice; and (4) the most common use of calculators is for checking paper-and-pencil work. An introduction is followed by a literature review, and by brief descriptions of the methodology, instruments, sample, and procedure. A discussion of the results precedes the general conclusions. Tables listing percentages of availability of manipulatives and student usage time with manipulatives, computers, and calculators are appended. (KR)
- Published
- 1990
216. Didactic Strategies for the Understanding of the Kalman Filter in Industrial Instrumentation Systems
- Author
-
Flórez C., Oscar D., Camargo L., Julián R., and Hurtado, Orlando García
- Abstract
This paper presents an application of the Kalman filter to signal processing in instrumentation systems when environmental conditions generate a large amount of interference in the acquisition of signals from measurement systems. These unwanted interferences consume significant instrumentation system resources while carrying no useful information. A simulation is presented using the Matlab tool, which greatly facilitates the information processing so that the corresponding actions can be taken according to the information obtained, taking advantage of the resources offered by current embedded systems, and the required measurements are obtained with sufficient accuracy.
- Published
- 2022
217. Specialized Content Knowledge of Pre-Service Teachers on the Infinite Limit of a Sequence
- Author
-
Arnal-Palacián, Mónica and Claros-Mellado, Javier
- Abstract
This paper analyses how pre-service teachers approach the notion of the infinite limit of a sequence from two perspectives: Specialized Content Knowledge and Advanced Mathematical Thinking. The aim of this study is to identify the difficulties associated with this notion and to classify them. In order to achieve this, an exploratory qualitative approach was applied using a sample of 12 future teachers. Among the results, we can affirm that pre-service teachers mainly use algorithmic procedures to solve tasks in which this type of limit is implicit, although they would consider a resolution that specifically involves the notion with an intuitive approach if they had to explain it to their students.
- Published
- 2022
218. A Complicated Relationship: Examining the Relationship between Flexible Strategy Use and Accuracy
- Author
-
Garcia Coppersmith, Jeannette and Star, Jon R.
- Abstract
This study explores student flexibility in mathematics by examining the relationship between accuracy and strategy use for solving arithmetic and algebra problems. Core to procedural flexibility is the ability to select and accurately execute the most appropriate strategy for a given problem. Yet the relationship between strategy selection and accurate execution is nuanced and poorly understood. In this paper, this relationship was examined in the context of an assessment where students were asked to complete the same problem twice using different approaches. In particular, we explored (a) the extent to which students were more accurate when selecting standard or better-than-standard strategies, (b) whether this accuracy-strategy use relationship differed depending on whether the student solved a problem for the first time or the second time, and (c) the extent to which students were more accurate when solving algebraic versus arithmetic problems. Our results indicate significant associations between accuracy and all of these aspects--we found differences in accuracy based on strategy, problem type, and a significant interaction effect between strategy and assessment part. These findings have important implications both for researchers investigating procedural flexibility as well as secondary mathematics educators who seek to promote this capacity among their students.
- Published
- 2022
219. Analyzing Ranking Strategies to Characterize Competition in the Co-Operative Education Job Market
- Author
-
Chopra, Shivangi and Golab, Lukasz
- Abstract
Co-operative education is a form of work-integrated learning that includes academic study and paid work experience. This provides new learning opportunities for students and a talent pipeline for employers, but also requires participation in a competitive job market. This paper studies competition through a unique dataset from a large North American co-operative program, in which students and employers rank each other after a round of interviews, then a matching algorithm assigns students to jobs based on the ranks, and finally, they evaluate each other at the end of the work term. The results suggest that less experienced students and small employers are more strongly affected by competition and consider more options in their rankings, whereas senior students and large employers often only identify their top choice. Additionally, competition appears to affect satisfaction since students and employers give higher work term evaluations when matched with their top choice.
- Published
- 2022
220. Roll assortment optimization in a paper mill: An integer programming approach
- Author
-
Chauhan, S.S., Martel, Alain, and D'Amour, Sophie
- Subjects
Algorithm ,Algorithms - Abstract
To link to full-text access for this article, visit this link: http://dx.doi.org/10.1016/j.cor.2006.03.026 Byline: S.S. Chauhan, Alain Martel, Sophie D'Amour Abstract: Fine paper mills produce a variety of paper grades to satisfy demand for a large number of sheeted products. Huge reels of different paper grades are produced on a cyclical basis on paper machines. These reels are then cut into rolls of smaller size, which are either sold as such or sheeted into finished products in converting plants. A huge number of roll sizes would be required to cut all finished products without trim loss, and they cannot all be inventoried; instead, an assortment of rolls is inventoried, with the implication that the sheeting operations may yield trim loss. The selection of the assortment of roll sizes to stock and the assignment of these roll sizes to finished products have a significant impact on performance. This paper presents a model to decide the parent roll assortment and its assignment to finished products based on the products' demand processes, desired service levels, trim loss, and inventory holding costs. Risk pooling economies achieved by assigning several finished products to a given roll size are a fundamental aspect of the problem. The overall model is a binary non-linear program. Two solution methods are developed: a branch-and-price algorithm based on column generation with a fast pricing heuristic, and a marginal cost heuristic. The two methods are tested on real data as well as on randomly generated problem instances. The proposed approach was implemented by a large pulp and paper company. Author Affiliation: FOR@C Research Consortium, Network Organization Technology Research Center (CENTOR), Université Laval, Sainte-Foy, Que., Canada G1K 7P4
- Published
- 2008
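The core trade-off in the abstract above (stocking only a few parent roll sizes and accepting the trim loss from cutting each finished width out of its best-fitting stocked roll) can be sketched with toy numbers. The widths, demands, and exhaustive search below are illustrative stand-ins for the paper's branch-and-price and marginal-cost methods:

```python
from itertools import combinations

# Finished-product widths (cm) with demand, and candidate parent roll
# widths the mill could stock. All numbers are invented for illustration.
PRODUCTS = {64: 120, 70: 80, 85: 60, 90: 40}   # width -> demand
CANDIDATE_ROLLS = [130, 140, 170, 180, 190]
ASSORTMENT_SIZE = 2                            # rolls we can afford to stock

def trim_loss(assortment):
    """Total trim when each product width is cut from the best-fitting
    stocked roll, fitting as many pieces as possible across the roll."""
    total = 0.0
    for width, demand in PRODUCTS.items():
        per_piece = min(
            (roll - (roll // width) * width) / (roll // width)
            for roll in assortment if roll >= width
        )
        total += per_piece * demand
    return total

# Exhaustive search over small assortments; the paper's instances are far
# too large for this, hence its column generation and heuristics.
best = min(combinations(CANDIDATE_ROLLS, ASSORTMENT_SIZE), key=trim_loss)
```

Even this toy version shows the risk-pooling flavor: the winning pair serves several product widths each, rather than dedicating one roll size per product.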
221. Representation of Learning in the Post-Digital: Students' Dropout Predictive Models with Artificial Intelligence Algorithms
- Author
-
Zanellati, Andrea, Macauda, Anita, Panciroli, Chiara, and Gabbrielli, Maurizio
- Abstract
Within the scientific debate on the post-digital and education, we present a position paper describing a research project aimed at designing a predictive model for students' low achievement in mathematics in Italy. The model is based on the INVALSI dataset, an Italian large-scale assessment test, and uses decision trees as the classification algorithm. In designing this tool, we aim to move beyond the use of economic, social, and cultural context indices as the main predictors of a learning gap. Instead, we include a suitable representation of students' learning in the model by exploiting the data collected through the INVALSI tests. We take a knowledge-based approach to this issue and, specifically, try to understand what knowledge is introduced into the model through the representation of learning. In this sense, our proposal yields an encoding of students' learning that is transferable to different student cohorts, and the encoding methods may be applied to other large-scale assessment tests. We thus aim to contribute to the debate on knowledge representation in AI tools for education.
- Published
- 2023
- Full Text
- View/download PDF
222. Photodegradation of Substituted Stilbene Compounds: What Colors Aging Paper Yellow?
- Author
-
Leif A. Eriksson and Bo Durbeej
- Subjects
Models, Molecular ,Paper ,Time Factors ,Absorption spectroscopy ,Photochemistry ,Propanols ,Color ,Lignin ,Absorption ,chemistry.chemical_compound ,Stilbenes ,Physical and Theoretical Chemistry ,Photodegradation ,HOMO/LUMO ,Cinnamyl alcohol ,Chemistry ,Quinones ,Chromophore ,Hybrid functional ,Biodegradation, Environmental ,Models, Chemical ,Thermodynamics ,Spectrophotometry, Ultraviolet ,Density functional theory ,Algorithms - Abstract
Photodegradation of lignin is one of the major postprocessing problems in paper production, as it causes yellowing of the paper and reduces paper quality. In this study, we have explored the photochemical properties of substituted stilbene derivatives believed to be key chromophores in the photodegradation of lignin derived from cinnamyl alcohol. In particular, the present work focuses on the computation of UV/vis electronic absorption spectra for different methoxylated stilbenes and their proposed photodegradation products. All calculations were performed using the time-dependent formalism of density functional theory (TD-DFT) and the B3LYP hybrid functional. It is concluded that the methodology employed is capable of reproducing not only the overall spectra but also subtle features owing to the effects of different substitution patterns. For the strongly absorbing first excited singlet state (HOMO→LUMO excitation) of the methoxylated stilbenes, the calculated transition energies are, albeit somewhat fortuitously, in excellent agreement with experimental data. The light-induced yellowing indirectly caused by the presence of stilbenes can be rationalized in terms of the absorption spectra of the resulting photodegraded o-quinones, which show distinct transitions in the 420-500 nm region of the visible spectrum that are absent prior to degradation.
- Published
- 2005
223. Some Problems in the Paper “Model Checking Using Partial Kripke Structure with 3-Valued Temporal Logic”.
- Author
-
Lei, Chen and Yun-fu, Shen
- Subjects
KRIPKE semantics ,MATHEMATICAL models ,COMPUTER logic ,ALGORITHMS ,OPERATOR theory ,SYSTEMS design - Abstract
Abstract: A great deal of work has been done in the field of multi-valued model checking, both at home and abroad, but some of this research is flawed. Errors are found in the semantic definitions of temporal operators and the corresponding model checking algorithms in the paper “Model Checking Using Partial Kripke Structure with 3-Valued Temporal Logic”, and counterexamples are given to demonstrate them. On the basis of this discussion, sound semantic definitions of temporal operators and correct model checking algorithms for some temporal operators are given. [Copyright Elsevier]
- Published
- 2011
- Full Text
- View/download PDF
224. The contribution of cause-effect link to representing the core of scientific paper—The role of Semantic Link Network.
- Author
-
Cao, Mengyun, Sun, Xiaoping, and Zhuge, Hai
- Subjects
- *
COMPLEXITY (Philosophy) , *CAUSATION (Philosophy) , *SEMANTICS , *RESEARCH , *PHILOSOPHY - Abstract
The Semantic Link Network is a general semantic model for modeling the structure and the evolution of complex systems. Various semantic links play different roles in rendering the semantics of complex system. One of the basic semantic links represents cause-effect relation, which plays an important role in representation and understanding. This paper verifies the role of the Semantic Link Network in representing the core of text by investigating the contribution of cause-effect link to representing the core of scientific papers. Research carries out with the following steps: (1) Two propositions on the contribution of cause-effect link in rendering the core of paper are proposed and verified through a statistical survey, which shows that the sentences on cause-effect links cover about 65% of key words within each paper on average. (2) An algorithm based on syntactic patterns is designed for automatically extracting cause-effect link from scientific papers, which recalls about 70% of manually annotated cause-effect links on average, indicating that the result adapts to the scale of data sets. (3) The effects of cause-effect link on four schemes of incorporating cause-effect link into the existing instances of the Semantic Link Network for enhancing the summarization of scientific papers are investigated. The experiments show that the quality of the summaries is significantly improved, which verifies the role of semantic links. The significance of this research lies in two aspects: (1) it verifies that the Semantic Link Network connects the important concepts to render the core of text; and, (2) it provides an evidence for realizing content services such as summarization, recommendation and question answering based on the Semantic Link Network, and it can inspire relevant research on content computing. [ABSTRACT FROM AUTHOR]
- Published
- 2018
- Full Text
- View/download PDF
225. A collaborative approach for research paper recommender system.
- Author
-
Haruna, Khalid, Akmar Ismail, Maizatul, Damiasih, Damiasih, Sutopo, Joko, and Herawan, Tutut
- Subjects
- *
CITATION analysis , *SCIENCE & state , *SOCIAL network analysis , *SOCIAL networks , *COMPUTER networks - Abstract
Research paper recommenders emerged over the last decade to ease finding publications relating to researchers’ area of interest. The challenge was not just to provide researchers with very rich publications at any time, any place and in any form but to also offer the right publication to the right researcher in the right way. Several approaches exist in handling paper recommender systems. However, these approaches assumed the availability of the whole contents of the recommending papers to be freely accessible, which is not always true due to factors such as copyright restrictions. This paper presents a collaborative approach for research paper recommender system. By leveraging the advantages of collaborative filtering approach, we utilize the publicly available contextual metadata to infer the hidden associations that exist between research papers in order to personalize recommendations. The novelty of our proposed approach is that it provides personalized recommendations regardless of the research field and regardless of the user’s expertise. Using a publicly available dataset, our proposed approach has recorded a significant improvement over other baseline methods in measuring both the overall performance and the ability to return relevant and useful publications at the top of the recommendation list. [ABSTRACT FROM AUTHOR]
- Published
- 2017
- Full Text
- View/download PDF
226. Using Distance Transform Based Algorithms for Extracting Measures of the Fiber Network in Volume Images of Paper.
- Author
-
Svensson, Stina and Aronsson, Mattias
- Subjects
FIBERS ,ALGORITHMS ,FOUNDATIONS of arithmetic ,CORDAGE ,PAPER - Abstract
Presents a study which showed how curve and surface representations of fiber network, fiber wall and fiber lumen can be computed using distance transform based algorithms. Motive for analyzing volume images of paper; Uses of the representations; Levels of detail in which the analysis of paper and paper fibers can be divided.
- Published
- 2003
- Full Text
- View/download PDF
227. Special Issue on papers from the 2019 Workshop on Models and Algorithms for Planning and Scheduling Problems.
- Author
-
Khuller, Samir
- Subjects
SCHEDULING ,ALGORITHMS ,ONLINE algorithms - Abstract
The paper "Well-behaved Online Load Balancing Against Strategic Jobs" by Li, Li and Wu considers a truthful online load-balancing problem with the objective of the makespan minimization on related machines. The 2019 workshop on models and algorithms for planning and scheduling problems was held in Renesse (The Netherlands). [Extracted from the article]
- Published
- 2023
- Full Text
- View/download PDF
228. Logic via Computer Programming.
- Author
-
Wieschenberg, Agnes A.
- Abstract
This paper proposed the question "How do we teach logical thinking and sophisticated mathematics to unsophisticated college students?" One answer among many is through the writing of computer programs. The writing of computer algorithms is mathematical problem solving and logic in disguise and it may attract students who would otherwise stop taking mathematics courses after their required sequence is finished. In college classrooms in the United States, there is often an over-involvement with the calculation aspect of mathematics, especially in today's technical environment. The emphasis should fall on the teachers' developing of logic in students. Just like mathematical algorithms, computer algorithms however simple, employ logical steps which will result in the desired conclusion. Mathematics teachers should take advantage of the inumerable opportunities, even in a beginner's computer programming course, to play with algorithms that may aid students in the development of logical ways to approach mathematical problems. (MA)
- Published
- 1999
229. Digital Mining Algorithm of English Translation Course Information Based on Digital Twin Technology.
- Author
-
Juan Yang
- Subjects
MINES & mineral resources ,ELECTRONIC paper ,PATTERNS (Mathematics) ,ALGORITHMS ,TRANSLATING & interpreting - Abstract
Cross-language communication puts forward higher requirements for information mining in English translation course. Aiming at the problem that the frequent patterns in the current digital mining algorithms produce a large number of patterns and rules, with a long execution time, this paper proposes a digital mining algorithm for English translation course information based on digital twin technology. According to the results of word segmentation and tagging, the feature words of English translation text are extracted, and the cross-language mapping of text is established by using digital twin technology. The estimated probability of text translation is maximized by corresponding relationship. The text information is transformed into text vector, the semantic similarity of text is calculated, and the degree of translation matching is judged. Based on this data dimension, the frequent sequence is constructed by transforming suffix sequence into prefix sequence, and the digital mining algorithm is designed. The results of example analysis show that the execution time of digital mining algorithm based on digital twin technology is significantly shorter than that based on Apriori and Map Reduce, and the mining accuracy rate reached more than 80%, which has good performance in processing massive data. [ABSTRACT FROM AUTHOR]
- Published
- 2021
- Full Text
- View/download PDF
230. Methodology paper: a novel phantom setup for commissioning of scanned ion beam delivery and TPS
- Author
-
Benjamin Ackermann, K. Henkner, Malte Ellerbrock, S. Ecker, M. Winter, P. Heeg, and Oliver Jäkel
- Subjects
lcsh:Medical physics. Medical radiology. Nuclear medicine ,Ion beam ,lcsh:R895-920 ,Dose profile ,lcsh:RC254-282 ,Imaging phantom ,Phantoms ,03 medical and health sciences ,610 Medical sciences Medicine ,0302 clinical medicine ,Optics ,Acceptance testing ,Image Processing, Computer-Assisted ,Medicine ,Humans ,Radiology, Nuclear Medicine and imaging ,business.industry ,Phantoms, Imaging ,Radiotherapy Planning, Computer-Assisted ,Methodology ,Radiotherapy Dosage ,lcsh:Neoplasms. Tumors. Oncology. Including cancer and carcinogens ,Data point ,TPS commissioning ,Oncology ,Scanning ion beam ,030220 oncology & carcinogenesis ,Data quality ,Calibration ,Benchmark (computing) ,Radiotherapy, Intensity-Modulated ,business ,Tomography, X-Ray Computed ,Quality assurance ,Monte Carlo Method ,Algorithms - Abstract
Background Commissioning of treatment planning systems (TPS) and beam delivery for scanned light ion beams is an important quality assurance task. This requires measurement of large sets of high quality dosimetric data in anthropomorphic phantoms to benchmark the TPS and dose delivery under realistic conditions. Method A novel measurement setup is described, which allows for an efficient collection of a large set of accurate dose data in complex phantom geometries. This setup allows dose measurements based on a set of 24 small volume ionization chambers calibrated in dose to water and mounted in a holder, which can be freely positioned in a water phantom with various phantoms mounted in front of the water tank. The phantoms can be scanned in a CT and a CT-based treatment planning can be performed for a direct benchmark of the dose calculation algorithm in various situations. Results The system has been used for acceptance testing in scanned light ion beam therapy at Heidelberg Ion Beam Therapy Center for scanned proton and carbon ion beams. It demonstrated to be useful to collect large amounts of high quality data for comparison with the TPS calculation using various phantom geometries. Conclusion The setup is an efficient tool for commissioning and verification of treatment planning systems. It is especially suited for dynamic beam delivery, as many data points can be obtained during a single plan delivery, but can be adapted also for other dynamic therapies, like rotational IMRT.
- Published
- 2019
231. A SMART RESOURCE UTILIZATION ALGORITHM FOR HIGH SPEED 5G COMMUNICATION NETWORKS BASED ON CLOUD SERVERS.
- Author
-
Ali, Syed Ibad, Jadhav, Jagannath, Arunkumar, R., and Kanagavalli, N.
- Subjects
TELECOMMUNICATION systems ,5G networks ,ELECTRONIC paper ,ALGORITHMS ,COMPUTER systems - Abstract
The 5G technology will become a truly integrated technology. It's becoming the most famous technology because of its optimal system computing complex and transforming a group of individual network components. Adequate resources must be provided for the various devices that are typically on the same network. The various functions in the data system also require resource allocation to suit its needs. This will ensure that the various devices on that network perform different types of work effectively. It also represents the various device functionalites to ensure the estimated resources onetime manner. Only then will it be convenient to carry out a variety of jobs depending on their speed and selection. And its various data requirements vary according to the functionality of the various devices featured in this series segment dynamic configuration. In this paper a smart resource utilization scheme was proposed. Its main purpose is to better manage the off-the-shelf resources available here. And provide it where it is needed and streamline data delivery to users on the network. This ensures that all data goes to the users in the correct manner. The proposed method getting 49% energy consumption, 90% resource utilization, 92% resource reservation and 91% Quality of services. [ABSTRACT FROM AUTHOR]
- Published
- 2022
- Full Text
- View/download PDF
232. Adaptive Testing without IRT.
- Author
-
Yan, Duanli, Lewis, Charles, and Stocking, Martha
- Abstract
It is unrealistic to suppose that standard item response theory (IRT) models will be appropriate for all new and currently considered computer-based tests. In addition to developing new models, researchers will need to give some attention to the possibility of constructing and analyzing new tests without the aid of strong models. Computerized adaptive testing currently relies heavily on IRT. Alternative, empirically based, nonparametric adaptive testing algorithms exist, but their properties are little known. This paper introduces an adaptive testing algorithm that balances maximum differentiation among test takers with stable estimation at each stage of testing, and compares this algorithm with a traditional one using IRT and maximum information. The adaptive testing algorithm introduced is based on the classification and regression tree approach described in L. Breiman, J. Friedman, R. Olshen, and C. Stone (1984) and J. Chambers and T. Hastie (1992). Simulation results from the regression tree approach were compared with simulation results from three parameter logistic model IRT. Simulation results show that the nonparametric tree-based approach to adaptive testing may be superior to conventional IRT-based adaptive testing in cases where the IRT assumptions are not satisfied. It clearly outperformed one-dimensional IRT when the pool was strongly two-dimensional. A technical appendix describes the algorithm. (Contains three figures and six references.) (SLD)
- Published
- 1998
233. Application of Motion Capture Based on Digital Filtering Algorithm in Sports Dance Teaching.
- Author
-
Rao, Fan
- Subjects
MOTION capture (Human mechanics) ,INTELLIGENT sensors ,ELECTRONIC paper ,ALGORITHMS ,MOTION detectors ,SYSTEMS design - Abstract
In order to improve the teaching effect of sports dance, this paper analyzes the traditional dance teaching motion capture, uses sensor motion perception algorithms to capture sports dance motion perception, and designs an intelligent sensor system that can be used for sports dance motion capture. Moreover, this paper combines the digital filter algorithm to design the hardware system structure of the sports dance motion capture system and builds a motion capture system for sports dance teaching based on the digital filter algorithm according to actual needs. In addition, this paper combines the simulation test to evaluate the performance of the system designed in this paper. The research results show that the motion capture system for sports dance teaching based on the digital filtering algorithm proposed in this paper can play an important role in sports dance teaching and effectively improve the efficiency of sports dance teaching. [ABSTRACT FROM AUTHOR]
- Published
- 2022
- Full Text
- View/download PDF
234. Building Crack Detection Based on Digital Image Processing Technology and Multiscale Feature Analysis Automatic Detection Algorithm.
- Author
-
Liu, Chenguang
- Subjects
DIGITAL image processing ,SMART structures ,ENGINEERING personnel ,ELECTRONIC paper ,ALGORITHMS ,CRACKING of concrete - Abstract
At present, the monitoring of concrete cracks is still mainly carried out by engineering personnel using simple mechanical monitoring instruments. The human inspection will undoubtedly be interfered by the individual's psychological, physical, and external conditions, and there may also be unobjective emotions, so it is impossible to ensure that the quality of the detection is up to standard and accurate. This paper combines digital image processing technology and multiscale feature analysis automatic detection algorithm to construct an intelligent building structure crack detection system. Moreover, this paper proposes an enrichment scheme for the unknown partially entangled states of building microparticles and utilizes the entanglement exchange process based on the Raman interaction of two building microparticles. The experimental results show that the automatic detection method of building cracks based on digital image processing technology and multiscale feature analysis has a good effect. [ABSTRACT FROM AUTHOR]
- Published
- 2022
- Full Text
- View/download PDF
235. A Review on Federated Learning and Machine Learning Approaches: Categorization, Application Areas, and Blockchain Technology.
- Author
-
Ogundokun, Roseline Oluwaseun, Misra, Sanjay, Maskeliunas, Rytis, and Damasevicius, Robertas
- Subjects
BLOCKCHAINS ,ARTIFICIAL intelligence ,MACHINE learning ,CONFERENCE papers ,ALGORITHMS ,SCIENCE publishing - Abstract
Federated learning (FL) is a scheme in which several consumers work collectively to unravel machine learning (ML) problems, with a dominant collector synchronizing the procedure. This decision correspondingly enables the training data to be distributed, guaranteeing that the individual device's data are secluded. The paper systematically reviewed the available literature using the Preferred Reporting Items for Systematic Review and Meta-analysis (PRISMA) guiding principle. The study presents a systematic review of appliable ML approaches for FL, reviews the categorization of FL, discusses the FL application areas, presents the relationship between FL and Blockchain Technology (BT), and discusses some existing literature that has used FL and ML approaches. The study also examined applicable machine learning models for federated learning. The inclusion measures were (i) published between 2017 and 2021, (ii) written in English, (iii) published in a peer-reviewed scientific journal, and (iv) Preprint published papers. Unpublished studies, thesis and dissertation studies, (ii) conference papers, (iii) not in English, and (iv) did not use artificial intelligence models and blockchain technology were all removed from the review. In total, 84 eligible papers were finally examined in this study. Finally, in recent years, the amount of research on ML using FL has increased. Accuracy equivalent to standard feature-based techniques has been attained, and ensembles of many algorithms may yield even better results. We discovered that the best results were obtained from the hybrid design of an ML ensemble employing expert features. However, some additional difficulties and issues need to be overcome, such as efficiency, complexity, and smaller datasets. In addition, novel FL applications should be investigated from the standpoint of the datasets and methodologies. [ABSTRACT FROM AUTHOR]
- Published
- 2022
- Full Text
- View/download PDF
236. Research on the 3D Virtual Product Network Display Algorithm Based on Digital Drive.
- Author
-
Zhang, Qian, Guo, Xiaoying, Liang, Hui, and Sun, Maojun
- Subjects
VIRTUAL networks ,HOUGH transforms ,ALGORITHMS ,ELECTRONIC paper ,IMAGE analysis ,THREE-dimensional display systems - Abstract
In order to improve the effect of 3D virtual product network display, this paper combines digital drive technology to analyze the virtual simulation algorithm and proposes a digital drive-based Hough transform clustering virtual image processing algorithm. Through the knowledge of clustering and generalized Hough transform, generalized Hough transform is applied to clustering. Moreover, this paper uses cluster analysis to determine the image characteristics of three-dimensional virtual products. In addition, this article combines the methods proposed in this article to construct a three-dimensional virtual product network display system. The research shows that the digital-driven 3D virtual product network display algorithm model proposed in this paper has a good 3D virtual display effect. [ABSTRACT FROM AUTHOR]
- Published
- 2022
- Full Text
- View/download PDF
237. Construction of Personalized Learning Platform Based on Collaborative Filtering Algorithm.
- Author
-
Zhang, Qian
- Subjects
ARTIFICIAL intelligence ,DATABASE design ,ALGORITHMS ,RECOMMENDER systems ,ELECTRONIC paper - Abstract
On the network service platform for vocational education, there are currently over 10,000 online courses. Learners face a challenge in selecting interesting courses from the vast resources available. Learners' urgent need for personalized learning is becoming more apparent as educational informatization progresses. Personalized recommendation (PR) technology can aid personalized learning and increase learners' learning efficiency significantly. This paper constructs a smart classroom model based on AI (artificial intelligence) by studying the connotation and characteristics of smart classroom in light of the current research status and trend of smart classroom at home and abroad. The merits of the recommendation system are determined by the recommendation algorithm used by PR system. This paper primarily focuses on developing a personalized learning platform based on the CF (collaborative filtering) algorithm, as well as conducting system requirements analysis, database design, functional module design, implementation, and testing on this foundation. Experiments are carried out to see if the optimized PR algorithm in the network learning platform is effective. [ABSTRACT FROM AUTHOR]
- Published
- 2022
- Full Text
- View/download PDF
238. Epidemiological Algorithm for Early Detection of COVID-19 Cases in a Mexican Oncologic Center.
- Author
-
González-Escamilla, Moisés, Pérez-Ibave, Diana Cristina, Burciaga-Flores, Carlos Horacio, Ortiz-Murillo, Vanessa Natali, Ramírez-Correa, Genaro A., Rodríguez-Niño, Patricia, Piñeiro-Retif, Rafael, Rodríguez-Gutiérrez, Hazyadee Frecia, Alcorta-Nuñez, Fernando, González-Guerrero, Juan Francisco, Vidal-Gutiérrez, Oscar, and Garza-Rodríguez, María Lourdes
- Subjects
COVID-19 pandemic ,COVID-19 ,LATENT infection ,ALGORITHMS ,ELECTRONIC paper - Abstract
An early detection tool for latent COVID-19 infections in oncology staff and patients is essential to prevent outbreaks in a cancer center. (1) Background: In this study, we developed and implemented two early detection tools for the radiotherapy area to identify COVID-19 cases opportunely. (2) Methods: Staff and patients answered a questionnaire (electronic and paper surveys, respectively) with clinical and epidemiological information. The data were collected through two online survey tools: Real-Time Tracking (R-Track) and Summary of Factors (S-Facts). Cut-off values were established according to the algorithm models. SARS-CoV-2 qRT-PCR tests confirmed the positive algorithms individuals. (3) Results: Oncology staff members (n = 142) were tested, and 14% (n = 20) were positives for the R-Track algorithm; 75% (n = 15) were qRT-PCR positive. The S-Facts Algorithm identified 7.75% (n = 11) positive oncology staff members, and 81.82% (n = 9) were qRT-PCR positive. Oncology patients (n = 369) were evaluated, and 1.36% (n = 5) were positive for the Algorithm used. The five patients (100%) were confirmed by qRT-PCR. (4) Conclusions: The proposed early detection tools have proved to be a low-cost and efficient tool in a country where qRT-PCR tests and vaccines are insufficient for the population. [ABSTRACT FROM AUTHOR]
- Published
- 2022
- Full Text
- View/download PDF
239. A Comparison of Testlet-Based Test Designs for Computerized Adaptive Testing.
- Author
-
Schnipke, Deborah L. and Reese, Lynda M.
- Abstract
Two-stage and multistage test designs provide a way of roughly adapting item difficulty to test-taker ability. All test takers take a parallel stage-one test, and, based on their scores, they are routed to tests of different difficulty levels in subsequent stages. These designs provide some of the benefits of standard computerized adaptive testing (CAT), such as increased precision of ability estimates over a paper-and-pencil test design. The item selection and scoring algorithms in two-stage and multistage designs may also be easier for test takers and test score users to understand--an important feature for gaining public acceptance of new test designs. This study incorporates testlets (bundles of items) into two-stage and multistage designs, and compares the precision of the ability estimates derived from these designs with those derived from a standard CAT design and from paper-and-pencil test designs. For the group that was used to establish the cutoffs for the two-stage and multistage testlet designs, 50,000 simulated test takers were created randomly. The group of simulated test takers used to simulate all test designs totaled 25,000. Results indicate that all testlet-based designs resulted in improved precision over the same-length paper-and-pencil test, and almost as much precision as the paper-and-pencil test of double length. Given the many other (nonpsychometric) advantages of these designs, they may be viable options for computer-administered tests. (Contains six figures and nine references.) (Author/SLD)
- Published
- 1997
240. Using Structured Representations To Solve Fraction Problems: A Discussion of Seven Students' Strategies.
- Author
-
Brinker, Laura
- Abstract
This paper describes how a target group of seven students in a combined fourth and fifth grade mathematics class used structured representations to solve fraction problems situated within various realistic contexts. Emphasis is given to the ways in which students' thinking about rational number concepts influences and is influenced by the students' use of two structured representations. The first structured representation was a set of fraction strips the students used as a manipulative. The other structured representation was a ratio table, a pictorial model used flexibly by most of the students. Findings indicate that most of the target students did not connect their symbolic procedures to the underlying concepts, particularly when they tried to write formal addition sentences using the fraction strips. The findings also suggest that the ratio table afforded more opportunities for the students to apply their informal knowledge of fractions as quantities. Contains 28 references. (DDR)
- Published
- 1997
241. The Augmented Synthetic Control Method
- Author
-
Ben-Michael, Eli, Feller, Avi, and Rothstein, Jesse
- Abstract
The synthetic control method (SCM) is a popular approach for estimating the impact of a treatment on a single unit in panel data settings. The "synthetic control" is a weighted average of control units that balances the treated unit's pre-treatment outcomes and other covariates as closely as possible. A critical feature of the original proposal is to use SCM only when the fit on pre-treatment outcomes is excellent. We propose Augmented SCM as an extension of SCM to settings where such pre-treatment fit is infeasible. Analogous to bias correction for inexact matching, Augmented SCM uses an outcome model to estimate the bias due to imperfect pretreatment fit and then de-biases the original SCM estimate. Our main proposal, which uses ridge regression as the outcome model, directly controls pre-treatment fit while minimizing extrapolation from the convex hull. This estimator can also be expressed as a solution to a modified synthetic controls problem that allows negative weights on some donor units. We bound the estimation error of this approach under different data generating processes, including a linear factor model, and show how regularization helps to avoid over-fitting to noise. We demonstrate gains from Augmented SCM with extensive simulation studies and apply this framework to estimate the impact of the 2012 Kansas tax cuts on economic growth. We implement the proposed method in the new augsynth R package. [This paper was published in "Journal of the American Statistical Association" v116 n536 2021.]
- Published
- 2021
- Full Text
- View/download PDF
242. The contribution of cause-effect link to representing the core of scientific paper—The role of Semantic Link Network
- Author
-
Xiaoping Sun, Mengyun Cao, and Hai Zhuge
- Subjects
Semantic link ,Computer and Information Sciences ,Lexical semantics ,Computer science ,Science ,lcsh:Medicine ,Social Sciences ,02 engineering and technology ,Semantic data model ,Research and Analysis Methods ,Systems Science ,Automation ,Sociology ,020204 information systems ,0202 electrical engineering, electronic engineering, information engineering ,Question answering ,Psychology ,Syntax ,lcsh:Science ,Data Curation ,Language ,Grammar ,Multidisciplinary ,Information retrieval ,Applied Mathematics ,Simulation and Modeling ,Research ,lcsh:R ,Publications ,Cognitive Psychology ,Biology and Life Sciences ,Linguistics ,Complex Systems ,Reasoning ,Automatic summarization ,Semantics ,Lexical Semantics ,Social Networks ,Physical Sciences ,Cognitive Science ,lcsh:Q ,020201 artificial intelligence & image processing ,Mathematics ,Algorithms ,Network Analysis ,Research Article ,Neuroscience - Abstract
The Semantic Link Network is a general semantic model for modeling the structure and the evolution of complex systems. Various semantic links play different roles in rendering the semantics of complex system. One of the basic semantic links represents cause-effect relation, which plays an important role in representation and understanding. This paper verifies the role of the Semantic Link Network in representing the core of text by investigating the contribution of cause-effect link to representing the core of scientific papers. Research carries out with the following steps: (1) Two propositions on the contribution of cause-effect link in rendering the core of paper are proposed and verified through a statistical survey, which shows that the sentences on cause-effect links cover about 65% of key words within each paper on average. (2) An algorithm based on syntactic patterns is designed for automatically extracting cause-effect link from scientific papers, which recalls about 70% of manually annotated cause-effect links on average, indicating that the result adapts to the scale of data sets. (3) The effects of cause-effect link on four schemes of incorporating cause-effect link into the existing instances of the Semantic Link Network for enhancing the summarization of scientific papers are investigated. The experiments show that the quality of the summaries is significantly improved, which verifies the role of semantic links. The significance of this research lies in two aspects: (1) it verifies that the Semantic Link Network connects the important concepts to render the core of text; and, (2) it provides an evidence for realizing content services such as summarization, recommendation and question answering based on the Semantic Link Network, and it can inspire relevant research on content computing.
- Published
- 2018
243. Quantifying the impact of scholarly papers based on higher-order weighted citations
- Author
-
Feng Xia, Ivan Lee, Jie Hou, Xiangjie Kong, Amr Tolba, Fuli Zhang, Xiaomei Bai, Bai, Xiaomei, Zhang, Fuli, Hou, Jie, Lee, Ivan, Kong, Xiangjie, Tolba, Amr, and Xia, Feng
- Subjects
FOS: Computer and information sciences ,Computer science ,Population Dynamics ,lcsh:Medicine ,law.invention ,Geographical Locations ,Order (exchange) ,law ,Citation analysis ,Digital Libraries (cs.DL) ,lcsh:Science ,GeneralLiterature_REFERENCE(e.g.,dictionaries,encyclopedias,glossaries) ,Multidisciplinary ,Latitude ,Geography ,Applied Mathematics ,Simulation and Modeling ,05 social sciences ,Computer Science - Digital Libraries ,Research Assessment ,Europe ,Longitude ,Physical Sciences ,Citation Analysis ,quantifying the impact ,050904 information & library sciences ,Algorithms ,Research Article ,Cartography ,InformationSystems_INFORMATIONSTORAGEANDRETRIEVAL ,Bibliometrics ,050905 science studies ,Research and Analysis Methods ,weighted citations ,PageRank ,Humans ,Pagerank algorithm ,Information retrieval ,Manuscripts as Topic ,Population Biology ,lcsh:R ,author affiliations ,Biology and Life Sciences ,Models, Theoretical ,Geographic Distribution ,People and Places ,North America ,Earth Sciences ,lcsh:Q ,0509 other social sciences ,Citation ,Mathematics - Abstract
Quantifying the impact of a scholarly paper is of great significance, yet the effect of the geographical distance of cited papers has not been explored. In this paper, we examine 30,596 papers published in Physical Review C, and identify the relationship between citations and the geographical distances between author affiliations. Subsequently, a relative citation weight is applied to assess the impact of a scholarly paper. A higher-order weighted quantum PageRank algorithm is also developed to address the behavior of multiple-step citation flow. Capturing the citation dynamics with higher-order dependencies reveals the actual impact of papers, including necessary self-citations that are sometimes excluded in prior studies. Quantum PageRank is utilized in this paper to help differentiate nodes whose PageRank values are identical. Refereed/Peer-reviewed
- Published
- 2018
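The weighted-citation idea in the abstract above can be sketched as a PageRank iteration over a citation graph with per-edge weights. This is a classical (not quantum) power iteration, and the distance-derived weights are abstracted into the input; both simplifications are assumptions for illustration:

```python
def weighted_pagerank(weights, d=0.85, tol=1e-12):
    """weights[j] maps paper j to {i: w}, where w is the weight of j's
    citation to paper i (e.g. derived from the geographic distance
    between author affiliations). Returns PageRank-style scores."""
    nodes = set(weights)
    for out in weights.values():
        nodes.update(out)
    n = len(nodes)
    r = {v: 1.0 / n for v in nodes}
    while True:
        r_new = {v: (1 - d) / n for v in nodes}
        for j, out in weights.items():
            total = sum(out.values())
            if total == 0:
                continue  # dangling paper: cites nothing
            for i, w in out.items():
                r_new[i] += d * r[j] * (w / total)
        if sum(abs(r_new[v] - r[v]) for v in nodes) < tol:
            return r_new
        r = r_new

scores = weighted_pagerank({"B": {"A": 1.0}, "C": {"A": 2.0}, "A": {"B": 1.0}})
# "A" receives weighted citations from both B and C, so it ranks highest
```

The paper's higher-order extension additionally conditions each citation step on the preceding steps of the citation flow, which this memoryless sketch does not capture.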
244. Technical Paper: Solving the No‐intermediate Storage Flowshop Scheduling Problem
- Author
-
Kang, Byung‐Suh and Markland, Robert E.
- Published
- 1989
- Full Text
- View/download PDF
245. A Broad Base National Enrollment Model.
- Author
-
Neblock, Carl S.
- Abstract
The use of the cohort-survival method for projecting student enrollments is widely known in educational finance literature; however, the limited information provided by the model impedes planners in making future operational decisions. The cohort-survival method employs historical rates of usage to predict future patterns of usage and produces a grade-by-grade forecast for each student cohort. This paper presents a model that develops multipliers for several student-support programs to create a broad base of information. The model accommodates the number of regular public school students by grade level as well as an estimate of certain student-support programs--compensatory education, bilingual education, and 11 categories of special education. The model incorporates student-support-program requirements and adds the numbers of retained members to the next cohort, thus presenting a more accurate view of the enrollment picture. Two figures are included. (Contains nine references.) (LMI)
- Published
- 1996
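The cohort-survival projection with program multipliers described in the abstract above can be sketched in a few lines. The function names, grade ranges, and multiplier values are hypothetical; the paper's actual multipliers are fitted from district data:

```python
def project_enrollment(current, survival, entering, multipliers):
    """One-year cohort-survival projection.
    current[g]  : enrollment in grade g this year
    survival[g] : historical ratio of grade-g students who appear in
                  grade g+1 the following year
    entering    : projected new entrants to the lowest grade
    multipliers : illustrative support-program rates (e.g. bilingual,
                  compensatory education) applied to projected totals,
    as the broad-base model suggests."""
    grades = sorted(current)
    nxt = {grades[0]: entering}
    for g in grades[:-1]:
        nxt[g + 1] = round(current[g] * survival[g])
    programs = {name: {g: round(nxt[g] * rate) for g in nxt}
                for name, rate in multipliers.items()}
    return nxt, programs

nxt, programs = project_enrollment(
    {1: 100, 2: 90, 3: 80}, {1: 0.95, 2: 0.9}, 105, {"bilingual": 0.1})
```

Adding retained students back into the next cohort, as the model proposes, would be one further additive term per grade.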
246. Teaching Computer Science: A Problem Solving Approach that Works.
- Author
-
Allan, V. H. and Kolesar, M. V.
- Abstract
The typical introductory programming course is not an appropriate first computer science course for many students. Initial experiences with programming are often frustrating, resulting in a low rate of successful completion, and focus on syntax rather than providing a representative picture of computer science as a discipline. The paper discusses the design of a preparatory course (CS0) to be taken before the introductory computer science course (CS1) at Utah State University. In the course, students gain mathematical and problem-solving skills while becoming familiar with the computer as a tool and learning how applications can save them time and work. Students are taught basic computer science concepts: data types, internal representation, operators, propositional logic, algorithms, control structures, programs, sub-programs, and recursion. Instruction is through lectures, readings, pencil-and-paper exercises, spreadsheet labs, reading and modifying code, and programming in the language ML, which identifies data type as the data are entered. The paper compares the performance of students in CS1 who took the preparatory course with those who took a course in BASIC prior to CS1. Fifty-three percent of the CS0 students who took CS1 achieved A grades in CS1, while only 26% of students who took BASIC before CS1 achieved an A grade. Results suggest that the preparatory course, with its skill-based approach to computer science, is a better preparation for the traditional first programming experience than a prior programming class. (Contains 13 references.) (SWC)
- Published
- 1996
247. Reply to "Describing center of pressure movement in stabilometry by ellipse area approximation" from Agnieszka Gołąb concerning the paper "A Review of Center of Pressure (COP) Variables to Quantify Standing Balance in Elderly People: Algorithms and Open Access Code"
- Author
-
Quijoux, Flavien and Nicolaï, Alice
- Subjects
- *
OLDER people , *EQUILIBRIUM testing , *ALGORITHMS , *WAVELETS (Mathematics) - Abstract
Letter to the Editor replying to "Describing center of pressure movement in stabilometry by ellipse area approximation" from Agnieszka Gołąb. Our choice was to present the formula of the prediction ellipse area in the article, as it indeed does not depend as strongly on the sample size as the confidence ellipse area does. [Extracted from the article]
- Published
- 2022
- Full Text
- View/download PDF
248. Self-harm content to be banned by law; White Paper proposes tough measures against tech firms
- Subjects
High technology industry -- Laws, regulations and rules ,Social media ,Death ,Algorithms ,Suicide ,Technology ,Government regulation ,General interest - Abstract
Byline: Charles Hymas, Steven Swinford and Mike Wright SOCIAL media firms will be banned from promoting selfharm or suicide content to children under a legallyenforced code of conduct to prevent [...]
- Published
- 2019
249. Special Issue "Scheduling: Algorithms and Applications".
- Author
-
Werner, Frank
- Subjects
METAHEURISTIC algorithms ,FLOW shop scheduling ,OPTIMIZATION algorithms ,ALGORITHMS ,ASSEMBLY line balancing ,JOB applications - Abstract
This special issue of Algorithms is dedicated to recent developments of scheduling algorithms and new applications. The paper [[10]] considers an assignment problem and some modifications which can be converted to routing, distribution, or scheduling problems. For this problem, a hybrid metaheuristic algorithm is presented which combines a genetic algorithm with a so-called spotted hyena optimization algorithm. References: 1. Werner F., Burtseva L., Sotskov Y. Special Issue on Algorithms for Scheduling Problems. [Extracted from the article]
- Published
- 2023
- Full Text
- View/download PDF
250. Designing plant scale process integration for water management in an Indian paper mill
- Author
-
Mukesh C. Bansal, Taesung Kim, B. Chakradhar, Vivek Kumar, and Sudheer Kumar Shukla
- Subjects
Pollution ,Paper ,Engineering ,Environmental Engineering ,media_common.quotation_subject ,India ,Fresh Water ,Management, Monitoring, Policy and Law ,Wastewater ,Waste Disposal, Fluid ,Process integration ,Mill ,Industry ,Waste Management and Disposal ,Effluent ,media_common ,Biological Oxygen Demand Analysis ,business.industry ,Scale (chemistry) ,Chemical oxygen demand ,Environmental engineering ,Paper mill ,General Medicine ,Total dissolved solids ,Chlorine ,business ,Algorithms - Abstract
In the present study, plant-scale process integration was applied to an Indian paper mill using the water cascade analysis (WCA) technique. Three limiting constraints, chemical oxygen demand (COD), total dissolved solids (TDS), and adsorbable organic halides (AOX), were considered for the study. A nearest neighbor algorithm was used to distribute the freshwater and recycled water among the plant operations. It was found that the limiting critical constraint depends upon the types of processes and streams involved in the integration. The limiting critical constraint can differ for different sections of the same industry, and can differ in different schemes of integration. After process integration, a 55.6% reduction in effluent flow, a 36% reduction in COD, and a 73% reduction in AOX were observed. After process integration, a 35.21% reduction in pollution costs can be achieved and, assuming the average production of the mill to be 225 tons per day, a savings of Indian rupees (INR) 1.73 per kg of paper produced can be achieved by employing process integration. The water cess was calculated as INR 3024.77 per day without integration for the sections that were considered for integration, while after integration, a 41.53% savings in the form of water cess was calculated.
- Published
- 2011
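The nearest-neighbour allocation step described in the abstract above mixes, for each water demand, the two available sources whose contaminant concentrations bracket the demand's limiting concentration. A minimal sketch of the two-source mass balance, assuming both neighbouring sources have sufficient flow (the paper's full scheme also tracks flow availability across TDS and AOX constraints):

```python
def nearest_neighbor_allocation(demand_flow, demand_conc, sources):
    """Split demand_flow (e.g. t/h) between the cleaner and dirtier
    sources nearest in contaminant concentration (e.g. COD in mg/L).
    sources: list of (flow, conc) pairs. Returns (flow_from_cleaner,
    flow_from_dirtier)."""
    cleaner = max((s for s in sources if s[1] <= demand_conc), key=lambda s: s[1])
    dirtier = min((s for s in sources if s[1] > demand_conc), key=lambda s: s[1])
    # Two-source mass balance:
    #   f_c + f_d               = demand_flow
    #   f_c*c_c + f_d*c_d       = demand_flow * demand_conc
    f_d = demand_flow * (demand_conc - cleaner[1]) / (dirtier[1] - cleaner[1])
    f_c = demand_flow - f_d
    return f_c, f_d

# 50 t/h needed at <= 100 mg/L, from freshwater (0 mg/L) and a 200 mg/L stream
f_c, f_d = nearest_neighbor_allocation(50.0, 100.0, [(100.0, 0.0), (80.0, 200.0)])
```

Blending the dirtier recycled stream up to the limiting concentration is what minimizes freshwater intake, which drives the effluent and pollution-cost reductions the abstract reports.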