69 results on '"*SCIENTIFIC computing"'
Search Results
2. Problems of Connectionism.
- Author
- Vassallo, Marta, Sattin, Davide, Parati, Eugenio, and Picozzi, Mario
- Subjects
- *PHILOSOPHY of science, *COGNITIVE science, *PHILOSOPHY of mind, *COMPUTER science, *SCIENTIFIC computing, *ARTIFICIAL intelligence, *COGNITION
- Abstract
The relationship between philosophy and science has always been complementary. Today, while science moves increasingly fast and philosophy struggles to keep pace with it, this relationship cannot be ignored, especially in data-rich disciplines such as philosophy of mind, cognitive science, and neuroscience. The methodological procedures used to analyze these data are based on principles and assumptions that require a profound dialogue between philosophy and science. Following these ideas, this work aims to raise the problems that classical connectionist theory can cause and to examine them within a cognitive framework, considering philosophy and the cognitive sciences as well as the disciplines near to them, such as AI, computer science, and linguistics. For this reason, we embarked on an analysis of both the computational and theoretical problems that connectionism currently faces. The second aim of this work is to advocate for collaboration between neuroscience and philosophy of mind, because deeper multidisciplinarity seems necessary in order to solve connectionism's problems. We believe that the problems we detected can be solved by a thorough investigation at both the theoretical and empirical levels; they do not represent an impasse but rather a starting point from which connectionism should learn and be updated while keeping its original and profoundly convincing core. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
3. Computer, Computer Science, and Computational Thinking: Relationship between the Three Concepts.
- Author
- Chakraborty, Pinaki
- Subjects
- *COMPUTER science, *SCIENTIFIC computing, *COMPUTERS, *ACADEMIC departments, *COMPUTER scientists
- Abstract
Digital computers were invented in the 1940s. They are sophisticated and versatile machines whose functioning is grounded in elaborate theory. Advances in theory and the availability of computers helped computer science develop as an academic discipline, and university departments devoted to it began to appear in the 1960s. Computer science covers all phenomena related to computers and consists primarily of man-made laws governing the building, programming, and use of computers. Computational thinking is a way of thinking influenced by computers and computer science. There are two schools of thought on computational thinking: the first sees it as the use of computers to explore the world, while the other sees it as the application of concepts from computer science to solve real-world problems. Scholars typically agree that computational thinking has four essential components, viz., abstraction, decomposition, algorithm design, and generalization. Computational thinking is often feted by computer scientists as a useful skill that can be used by anybody anywhere. However, ways of successfully using computational thinking in domains other than computer science must be found before it can be declared a universal skill. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
4. Usable Security: A Systematic Literature Review.
- Author
- Di Nocera, Francesco, Tempestini, Giorgia, and Orsini, Matteo
- Subjects
- *SECURITY systems, *BEHAVIORAL assessment, *COMPUTER science, *SCIENTIFIC computing
- Abstract
Usable security involves designing security measures that accommodate users' needs and behaviors. Balancing usability and security poses a challenge: the more secure a system is, the less usable it tends to be, and conversely, more usable systems tend to be less secure. Numerous studies, spanning psychology and computer science/engineering, have addressed this balance and contribute diverse perspectives, necessitating a systematic review to understand the strategies and findings in this area. This systematic literature review examined articles on usable security from 2005 to 2022. A total of 55 research studies were selected after evaluation. The studies were broadly categorized into four main clusters, each addressing a different aspect: (1) usability of authentication methods, (2) helping security developers improve usability, (3) design strategies for influencing user security behavior, and (4) formal models for usable security evaluation. Based on this review, we report that the field's current state reveals a certain immaturity, with studies tending toward system comparisons rather than establishing robust design guidelines based on a thorough analysis of user behavior. A common theoretical and methodological background is one of the main areas for improvement in this area of research. Moreover, the absence of usable security requirements in almost all development contexts greatly discourages the implementation of good practices from the earliest stages of development. [ABSTRACT FROM AUTHOR]
- Published
- 2023
- Full Text
- View/download PDF
5. A Novel Approach of Residue Neutrosophic Technique for Threshold Based Image Segmentation.
- Author
- D., Vinoth and Devarasan, Ezhilmaran
- Subjects
- *IMAGE segmentation, *COMPUTER vision, *IMAGE analysis, *RESEARCH personnel, *COMPUTER science, *SCIENTIFIC computing
- Abstract
The Residue Neutrosophic Set (RNS) is a new idea that operates on images at the level of additional pixel information; the approach analyzes images based on this additional pixel content. In recent decades, computer vision research has revolutionized image analysis, and image segmentation is among the most investigated topics in computer vision. Neutrosophic theory is a sophisticated mathematical framework for solving a myriad of challenges. The objective here is to devise a neutrosophic technique for image thresholding. In this article, a residue methodology is applied that uses the residual values of neutrosophic membership intensities. This article explores a novel idea for image thresholding termed RNS, with three variants: minimum, average, and maximum. Existing thresholding techniques in the neutrosophic setting are considered in this proposal, which adopts novel methodologies to provide an integrated segmentation pipeline. Furthermore, the proposed technique achieves a better average accuracy score. [ABSTRACT FROM AUTHOR]
- Published
- 2023
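The article's RNS method itself is not reproduced here, but the kind of threshold-based segmentation it builds on can be sketched with a generic isodata-style global threshold. The function name and the toy pixel data below are illustrative assumptions, not taken from the paper:

```python
def mean_threshold(pixels, tol=0.5):
    """Iterative (isodata-style) global threshold: repeatedly split at the
    midpoint of the two class means until the threshold stabilises.
    A generic baseline, not the residue neutrosophic (RNS) technique."""
    t = sum(pixels) / len(pixels)
    while True:
        lo = [p for p in pixels if p <= t]
        hi = [p for p in pixels if p > t]
        if not lo or not hi:
            return t
        t_new = 0.5 * (sum(lo) / len(lo) + sum(hi) / len(hi))
        if abs(t_new - t) < tol:
            return t_new
        t = t_new

pixels = [10, 12, 11, 13, 200, 210, 205, 198]  # toy bimodal "image"
t = mean_threshold(pixels)
mask = [1 if p > t else 0 for p in pixels]      # binary segmentation
```

On this toy data the threshold lands between the two intensity clusters, so the mask cleanly separates background from foreground.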
6. Gender and culture bias in letters of recommendation for computer science and data science masters programs.
- Author
- Zhao, Yijun, Qi, Zhengxin, Grossi, John, and Weiss, Gary M.
- Subjects
- *SEX discrimination, *COMPUTER science, *LETTERS of recommendation (Education), *NATURAL language processing, *SCIENTIFIC computing, *WATSON (Computer)
- Abstract
Letters of Recommendation (LORs) are widely utilized for admission to both undergraduate and graduate programs, and are becoming even more important with the decreasing role that standardized tests play in the admissions process. However, LORs are highly subjective and thus can inject recommender bias into the process, leading to an inequitable evaluation of the candidates' competitiveness and competence. Our study utilizes natural language processing methods and manually determined ratings to investigate gender and cultural differences and biases in LORs written for STEM Master's program applicants. We generate features to measure important characteristics of the LORs and then compare these characteristics across groups based on recommender gender, applicant gender, and applicant country of origin. One set of features, which measure the underlying sentiment, tone, and emotions associated with each LOR, is automatically generated using IBM Watson's Natural Language Understanding (NLU) service. The second set of features is measured manually by our research team and quantifies the relevance, specificity, and positivity of each LOR. We identify and discuss features that exhibit statistically significant differences across gender and culture study groups. Our analysis is based on approximately 4000 applications for the MS in Data Science and MS in Computer Science programs at Fordham University. To our knowledge, no similar study has been performed on these graduate programs. [ABSTRACT FROM AUTHOR]
- Published
- 2023
- Full Text
- View/download PDF
7. Can we identify the similarity of courses in computer science?
- Author
- KARADAĞ, Tugay, PARIM, Coşkun, and BÜYÜKLÜ, Ali Hakan
- Subjects
- *COMPUTER science, *SCIENTIFIC computing, *ARTIFICIAL intelligence, *DEEP learning, *BIG data, *DATA mining
- Abstract
Especially on the Internet, popular topics in the computer sciences, namely artificial intelligence, big data, business analytics, data mining, data science, deep learning, and machine learning, have been compared or classified using confusing Venn diagrams without scientific support. In this study, the relationships among these topics are visualized with Venn diagrams that have a scientific basis. The study therefore aims to determine the interactions among the seven popular topics. Five books for each topic (35 books in total) were included in the analysis. To illustrate the interactions among the topics, Latent Dirichlet Allocation (LDA), a topic modeling method, was applied, and pairwise correlation was used to determine the relationships among the chosen topics. The LDA analysis produced the expected results in differentiating the topics, and the pairwise correlation results revealed that all the topics are related to each other and that it is challenging to differentiate between them. [ABSTRACT FROM AUTHOR]
- Published
- 2023
- Full Text
- View/download PDF
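The pairwise-correlation step in studies like this can be illustrated with a minimal sketch: build a term-frequency profile per topic from its texts and correlate the profiles. The mini-corpora and topic names below are hypothetical stand-ins for the 35 books; this is not the authors' pipeline, and the LDA step is omitted:

```python
import math
from collections import Counter

def term_profile(texts, vocab):
    """Aggregate term frequencies for one topic's texts over a fixed vocabulary."""
    counts = Counter(w for t in texts for w in t.lower().split())
    return [counts[w] for w in vocab]

def pearson(x, y):
    """Pearson correlation between two equal-length frequency vectors."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# Hypothetical mini-corpora standing in for the books of each topic.
ai_texts = ["neural networks learn models from data", "agents search and plan"]
ml_texts = ["models learn from data", "training neural networks needs data"]
db_texts = ["tables store rows", "queries join tables and rows"]

vocab = sorted({w for t in ai_texts + ml_texts + db_texts for w in t.lower().split()})
r_ai_ml = pearson(term_profile(ai_texts, vocab), term_profile(ml_texts, vocab))
r_ai_db = pearson(term_profile(ai_texts, vocab), term_profile(db_texts, vocab))
```

Topics with heavily overlapping vocabularies (here AI and ML) correlate more strongly than unrelated ones, which is the effect the study reports at scale for the seven computing topics.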
8. Computer and Information Science, Internships, Biological Infrastructure Top NSF Q&A.
- Subjects
- *INFORMATION science, *COMPUTER science, *INTERNSHIP programs, *SCIENTIFIC computing, *LIFE sciences
- Abstract
The article discusses a Q&A session with program officers and awardees from the National Science Foundation (NSF) on topics related to computer and information science, internships, and biological infrastructure. The program officers provide information on the Communications and Information Foundations program, which supports research on information acquisition and processing in communication systems. They also discuss the types of research proposals funded by the program, which often involve innovative approaches and mathematical analysis. Additionally, the article highlights internship programs for graduate students at NSF, including the Mathematical Sciences Graduate Internship (MSGI) Program and Non-Academic Research Internships for Graduate Students (INTERN). The MSGI program offers internships at federal national laboratories, while the INTERN program provides supplemental funding for internships in non-academic settings. The article also features an awardee, Leah Johnson, who discusses her project on establishing a global open-access data platform for studying disease vectors. She emphasizes the importance of clear communication and engaging with program officers when applying for NSF funding. [Extracted from the article]
- Published
- 2024
9. Mitigating Bias in Algorithmic Systems—A Fish-eye View.
- Author
- ORPHANOU, KALIA, OTTERBACHER, JAHNA, KLEANTHOUS, STYLIANI, BATSUREN, KHUYAGBAATAR, GIUNCHIGLIA, FAUSTO, BOGINA, VERONIKA, TAL, AVITAL SHULNER, HARTMAN, ALAN, and KUFLIK, TSVI
- Subjects
- *COMMUNITIES, *COMPUTER science, *SCIENTIFIC computing, *FAIRNESS, *EYE
- Abstract
Mitigating bias in algorithmic systems is a critical issue drawing attention across communities within the information and computer sciences. Given the complexity of the problem and the involvement of multiple stakeholders—including developers, end users, and third-parties—there is a need to understand the landscape of the sources of bias, and the solutions being proposed to address them, from a broad, cross-domain perspective. This survey provides a “fish-eye view,” examining approaches across four areas of research. The literature describes three steps toward a comprehensive treatment—bias detection, fairness management, and explainability management—and underscores the need to work from within the system as well as from the perspective of stakeholders in the broader context. [ABSTRACT FROM AUTHOR]
- Published
- 2023
- Full Text
- View/download PDF
10. Extended Stability and Control Strategies for Impulsive and Fractional Neural Networks: A Review of the Recent Results.
- Author
- Stamov, Gani and Stamova, Ivanka
- Subjects
- *COMPUTER engineering, *HOPFIELD networks, *CONVOLUTIONAL neural networks, *COMPUTER science, *COMPUTER engineers, *SCIENTIFIC computing
- Abstract
In recent years, cellular neural networks (CNNs) have become a popular apparatus for simulations in neuroscience, biology, medicine, computer sciences and engineering. In order to create more adequate models, researchers have considered memory effects, reaction–diffusion structures, impulsive perturbations, uncertain terms and fractional-order dynamics. The design, cellular aspects, functioning and behavioral aspects of such CNN models depend on efficient stability and control strategies. In many practical cases, the classical stability approaches are useless. Recently, in a series of papers, we have proposed several extended stability and control concepts that are more appropriate from the applied point of view. This paper is an overview of our main results and focuses on extended stability and control notions including practical stability, stability with respect to sets and manifolds and Lipschitz stability. We outline the recent progress in the stability and control methods and provide diverse mechanisms that can be used by the researchers in the field. The proposed stability techniques are presented through several types of impulsive and fractional-order CNN models. Examples are elaborated to demonstrate the feasibility of different technologies. [ABSTRACT FROM AUTHOR]
- Published
- 2023
- Full Text
- View/download PDF
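As background for the review above, impulsive fractional-order cellular neural network models of the kind surveyed typically take a form like the following. This is a generic Hopfield-type formulation written here for orientation, not a specific model quoted from the paper:

```latex
\begin{aligned}
{}^{C}\!D^{\alpha} x_i(t) &= -a_i x_i(t) + \sum_{j=1}^{n} b_{ij}\, f_j\big(x_j(t)\big) + I_i,
&& t \neq t_k, \\
\Delta x_i(t_k) &= x_i(t_k^{+}) - x_i(t_k^{-}) = P_{ik}\big(x_i(t_k^{-})\big),
&& k = 1, 2, \ldots,
\end{aligned}
```

where \({}^{C}\!D^{\alpha}\) is the Caputo fractional derivative of order \(0 < \alpha \le 1\) (capturing memory effects), \(f_j\) are the neuron activations, \(I_i\) are external inputs, and the jump maps \(P_{ik}\) model the impulsive perturbations at the moments \(t_k\). The extended stability notions discussed in the abstract (practical stability, stability with respect to sets and manifolds, Lipschitz stability) are formulated for systems of this type.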
11. INTEGRATION OF INTERDISCIPLINARY RELATIONS IN TRAINING IS THE BASIS OF THE PROCESS.
- Author
- HAJIYEVA, RENA, ALIYEV, AYDIN, AHMADOVA, ESMIRA, HAJIYEV, RAMZI, GAHRAMANLI, KHUMARA, and Alesker, İsmailov Alemdar
- Subjects
- *SCIENTIFIC computing, *COMPUTER science education, *TEACHING methods, *UNIVERSAL language, *COMPUTER science, *BIOMATHEMATICS
- Abstract
For the sake of the development of modern sciences, there has been a tendency in education to make use of integration processes between disciplines. Interdisciplinary integration and coordination is one of the characteristic features of a learning process that is interconnected and fully integrated. Creating interdisciplinary integration, focusing on strengthening student knowledge and skills, and teaching Computer Science in relation to other disciplines, such as mathematics, physics, and biology, is an area of interest. This method of teaching requires instructors to be fluent not only in their own discipline but also in others. The method of interdisciplinary relationships can be widely used in teaching Computer Science in relation to the Physics course, and exploiting this opportunity is one of the actual problems addressed here. To fulfill this task, the curricula of both disciplines were researched and analyzed. Selected topics from the Physics course are first reviewed with students to reinforce their knowledge, and then, on the basis of this knowledge, Computer Science topics are taught, which helps students consciously master the new material and increases their cognitive activity. In this study, the physics problems were solved using the programming language C++, a universal language with a wide range of capabilities. The article justifies the relevance of the issue, gives background on existing work, and presents the goals and tasks of this research as well as its methodological basics, scientific novelty, and theoretical and practical importance. Experiments were conducted, and the key points emerging from them are presented in the conclusion section. In conclusion, one of the ways to improve the teaching of Computer Science is the use of interdisciplinary relationships in the educational process.
These interdisciplinary relationships play an important role in students' acquisition of scientific, theoretical, and practical knowledge and skills, and they can be a prerequisite for a comprehensive approach to education. [ABSTRACT FROM AUTHOR]
- Published
- 2023
- Full Text
- View/download PDF
12. Application-based principles of Islamic geometric patterns; state-of-the-art, and future trends in computer science/technologies: a review.
- Author
- Ranjazmay Azari, Mohammadreza, Bemanian, Mohammadreza, Mahdavinejad, Mohammadjavad, Körner, Axel, and Knippers, Jan
- Subjects
- *COMPUTER science, *SCIENTIFIC computing, *EVIDENCE gaps, *CULTURAL identity, *SUPPLY & demand, *AESTHETICS
- Abstract
Currently, there is a tendency to use Islamic Geometric Patterns (IGPs) as important identity-bearing cultural elements of building design in the Middle East. Despite high demand, a lack of information about the potential of IGP principles has led to merely formal inspiration in the design of existing buildings. Many research studies have been carried out on the principles of IGPs. However, comprehensive studies relating to new possibilities, such as structure-based, sustainability-based, and aesthetics-based purposes, developed by computer science and related technologies, are relatively rare. This article reviews the state-of-the-art knowledge of IGPs, provides a survey of the main principles, presents the status quo, and identifies gaps in recent research directions. Finally, future prospects are discussed by focusing on different aspects of the principles in accordance with the evidence collected during the review process. [ABSTRACT FROM AUTHOR]
- Published
- 2023
- Full Text
- View/download PDF
13. A rubric for human-like agents and NeuroAI.
- Author
- Momennejad, Ida
- Subjects
- *ARTIFICIAL intelligence, *MACHINE learning, *COMPUTER science, *HUMAN behavior, *SCIENTIFIC computing
- Abstract
Researchers across the cognitive, neuro- and computer sciences increasingly reference 'human-like' artificial intelligence and 'neuroAI'. However, the scope and use of the terms are often inconsistent. Contributed research ranges widely from mimicking behaviour, to testing machine learning methods as neurally plausible hypotheses at the cellular or functional levels, to solving engineering problems. However, it cannot be assumed nor expected that progress on one of these three goals will automatically translate to progress in the others. Here, a simple rubric is proposed to clarify the scope of individual contributions, grounded in their commitments to human-like behaviour, neural plausibility or benchmark/engineering/computer science goals. This is clarified using examples of weak and strong neuroAI and human-like agents, and by discussing the generative, corroborative and corrective ways in which the three dimensions interact with one another. The author maintains that future progress in artificial intelligence will need strong interactions across the disciplines, with iterative feedback loops and meticulous validity tests, leading to both known and yet-unknown advances that may span decades to come. This article is part of a discussion meeting issue 'New approaches to 3D vision'. [ABSTRACT FROM AUTHOR]
- Published
- 2023
- Full Text
- View/download PDF
14. Application with free Arduino software and hardware as a facilitating axis for learning STEM competencies.
- Author
- Guerrero Salazar, Leonardo
- Subjects
- *LEARNING, *DIGITAL technology, *TECHNOLOGICAL innovations, *COMPUTER science, *SCIENTIFIC computing, *CRITICAL thinking
- Abstract
This article, which is the result of research, presents the design of an application as a proposal to implement free Arduino software and hardware as a facilitating axis for learning skills in natural sciences, technology, engineering, and mathematics (STEM). The objective of this research was to carry out a didactic and technological innovation in the teaching and learning process of the subject of technology and computer science, in order to develop in students the STEM competencies necessary to perform in a digital world. The qualitative methodology consisted of delivering didactic planning in an application based on project-based pedagogy and integrative didactics. In this way, it was possible to identify the development of research skills, critical thinking, problem-solving, creativity, communication, and collaboration, all related to the STEM approach. In addition, the educational community where the research experience took place showed great interest in using these resources in the classroom and in the possibility of applying them in productive projects of the institution's technical agricultural, livestock, and business management tracks. The application can be used by technology and computer science teachers in the eleventh grade and can be adapted to other educational levels. [ABSTRACT FROM AUTHOR]
- Published
- 2023
- Full Text
- View/download PDF
15. Fractional metric dimension of generalized prism graph.
- Author
- Goshi, Nosheen, Zafar, Sohail, and Rashid, Tabasam
- Subjects
- *ARTIFICIAL intelligence, *PRISMS, *METRIC geometry, *GRAPH connectivity, *COMPUTER science, *SCIENTIFIC computing
- Abstract
The fractional metric dimension of a connected graph G was introduced by Arumugam et al. in [Discrete Math. 312, (2012), 1584-1590] as a natural extension of the metric dimension, which has many applications in different areas of the computer sciences, for example optimization, intelligent systems, networking, and robot navigation. In this paper, the fractional metric dimension of the generalized prism graph Pm × Cn is computed using a combinatorial criterion devised by Liu et al. in [Mathematics, 7(1), (2019), 100]. [ABSTRACT FROM AUTHOR]
- Published
- 2022
- Full Text
- View/download PDF
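For context, the fractional metric dimension referenced above is the optimum of the linear-programming relaxation of the resolving-set problem. This is the standard definition from the Arumugam et al. line of work, stated here for orientation rather than quoted from the paper:

```latex
R\{u,w\} \;=\; \{\, v \in V(G) : d(v,u) \neq d(v,w) \,\},
\qquad u, w \in V(G),\; u \neq w,
```

```latex
\dim_f(G) \;=\; \min \sum_{v \in V(G)} g(v)
\quad \text{subject to} \quad
\sum_{v \in R\{u,w\}} g(v) \ge 1 \;\; \text{for all } u \neq w,
\qquad 0 \le g(v) \le 1 .
```

Restricting \(g\) to \(\{0,1\}\) values recovers the ordinary metric dimension, which is why \(\dim_f(G) \le \dim(G)\) and why the fractional version is called its natural extension.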
16. Back to the Future: The Rise of Human Enhancement and Potential Applications for Space Missions.
- Author
- Cahill, Ben and Braddock, Martin
- Subjects
- *COMPUTER science, *DATA science, *SCIENTIFIC computing, *GENOME editing, *HUMAN genes
- Abstract
Rapid advances in biology, electronics, computer and data science have turned invention into products, changing the lives and lifestyles of millions of people around the world. This mini-review will describe some remarkable progress made over the last 10 years which serves both healthy individuals and patients alike. With a forward looking lens towards long term space missions and the potential colonisation of the Moon and Mars, we discuss three technologies under development. We conclude with a distant looking perspective on the prospect of gene mediated human enhancement and highlight the importance of aligning benefit for people on Earth with goals for future space missions and the need to establish regulatory and ethical guidelines. [ABSTRACT FROM AUTHOR]
- Published
- 2022
- Full Text
- View/download PDF
17. A Global Survey of Scientific Consensus and Controversy on Instruments of Climate Policy.
- Author
- Drews, Stefan, Savin, Ivan, and van den Bergh, Jeroen
- Subjects
- *GOVERNMENT policy on climate change, *CARBON pricing, *RESEARCH personnel, *COMPUTER science, *SCIENTIFIC computing, *CLIMATE change skepticism
- Abstract
There is continuing debate about which climate-policy instruments are most appropriate to reduce emissions. Undertaking a global survey among scientists who have published on climate policy, we provide a systematic overview of (dis)agreements about six main types of policy instruments. The survey includes various fields across the social and natural sciences. The results show that, on average, all instruments are considered important, with direct regulation receiving the highest rating and adoption subsidies and cap-and-trade the lowest. The latter is surprising given the theoretical advantages and real-world success of the EU-ETS. Next, clustering scientific fields based on how important they consider the instruments, we determine five distinct groups, with (a) ecological economists and (b) mathematics/computer science being most dissimilar from the other discipline clusters. We explain disagreement by assessing the relative importance assigned to the policy criteria of effectiveness, efficiency, equity, and socio-political feasibility, as well as researchers' attitudes and backgrounds. Paying special attention to carbon pricing, motivated by its contested key role, we identify three respondent clusters, namely 'enthusiasts', 'undecided', and 'skeptics'. Examining various policy arguments, we find that agreeing that carbon pricing effectively limits energy/carbon rebound and has the potential to be harmonized globally has the strongest association with giving importance to this policy.
• We survey researchers from diverse fields to examine views on climate policies.
• Direct regulation is on average rated as most important.
• Environmental and ecological economists hold contrasting views on cap-and-trade.
• Support for carbon pricing relates to expectations of curbing rebound and global harmonization.
• Many other factors are assessed, such as policy criteria, climate worry and ideology.
[ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
18. Evaluation of the technology and computer science area plan in the schools of Villavicencio.
- Author
- Álvarez Cardona, Numar and Casallas, Nubia Estella Cruz
- Subjects
- *COMPUTER engineering, *SCIENTIFIC computing, *COMPUTER science, *SCHOOL autonomy, *BEGINNING teachers
- Abstract
The evaluation of the plan for the area of technology and computer science in the schools of Villavicencio allows the reader to learn about the methodologies, plans, and programs that are developed in the area. The results enable the reader to form a clear vision of the way in which the teaching process is developed and, on that basis, to identify the actions that could eventually be taken in an educational institution that intends to promote the use of computer science and technology. This research, with a qualitative approach, began with interviews of teachers and culminated with a review of the subject plan; these data were categorized and submitted to a triangulation of results in order to evaluate the area plans. It was found in some cases that there is a lack of knowledge of the concepts of pedagogical planning; that each institution has a specific model that deviates from the Ministry's guidelines, invoking the protection of school autonomy; and that area planning often follows the teacher's interests. All these generalities made it possible to demonstrate that there is no common path, but rather many individual endeavors that try, with much effort, to make the development of the area more efficient. The research leads to the conclusion that the area of technology and computer science could be developed in a transversal and interdisciplinary way with the other areas of the academic curriculum so that it can become the support required by basic secondary education in the city of Villavicencio. [ABSTRACT FROM AUTHOR]
- Published
- 2022
- Full Text
- View/download PDF
19. IEEE Computer Society Awards.
- Subjects
- *COMPUTER engineering, *COMPUTERS, *COMPUTER science, *COMPUTER engineers, *SCIENTIFIC computing
- Abstract
The IEEE Computer Society awards program honors technology leaders who have had a great impact on the advancement of innovation while serving the computer profession and the Society. We're pleased to announce the 2021 award recipients (as of September 2021), who represent the brightest luminaries and pioneers within the fields of computer science and computer engineering. [ABSTRACT FROM AUTHOR]
- Published
- 2021
- Full Text
- View/download PDF
20. Revising Computer Science Networking Hands-On Courses in the Context of the Future Internet.
- Author
- Fraire, Juan A. and Duran, Juan E.
- Subjects
- *COMPUTER networks, *SCIENTIFIC computing, *NETWORK PC (Computer), *BACHELOR of science degree, *BLENDED learning
- Abstract
Contribution: In a context where hands-on courses are biased toward specific technologies, a novel creativity-provoking instructional approach for undergraduate networking courses is successfully applied following action research principles and active and creative learning techniques. Background: Extensive engineering-oriented networking courses have been proposed with a strong focus on specific protocol solutions. At the same time, the number and complexity of techniques are notably increasing with the advent of the Future Internet. As a result, the curriculum loses focus on the fundamentals of networking algorithms. Intended Outcomes: We address algorithmic learning in networking for the computer sciences, where students are expected to 1) create; 2) develop; 3) analyze; and 4) compare algorithms and processes regardless of protocol-specific technologies. At least 70% of the students are expected to meet this goal while enhancing their engagement and motivation in a time-constrained course schedule. Application Design: To achieve 1) and 2), we instrument an active experimental strategy, while objectives 3) and 4) are tackled with creative learning techniques, both applied in an action research framework. The approach is supported by state-of-the-art networking application interfaces and simulators. Furthermore, a blended and game-based learning component favors engagement via comparison and competition of students' project metrics. Findings: The experiment was carried out by professors of the Computer Science Bachelor's degree taught at FAMAF. Results show that the applied methodology met the intended outcomes and improved by 7% over a two-year cycle. Furthermore, the approach was very well received based on students' feedback. [ABSTRACT FROM AUTHOR]
- Published
- 2021
- Full Text
- View/download PDF
21. AstroGeoVis v1.0: Astronomical Visualizations and Scientific Computing for Earth Science Education.
- Author
- Kostadinov, Tihomir S.
- Subjects
- *EARTH science education, *EARTH system science, *SCIENTIFIC computing, *SUSTAINABLE design, *SCIENTIFIC visualization, *PHYSICAL geography, *COMPUTER science
- Abstract
Modern climate science, Earth system science, physical geography, oceanography, meteorology, and related disciplines have increasingly turned into highly quantitative, computational fields dealing with the processing, analysis, and visualization of large numerical data sets. Students of these and many other disciplines thus need to acquire robust scientific computing and data analysis skills, which have universal applicability. In addition, the increasing economic importance and environmental significance of solar power and of sustainable practices such as passive building design have recently increased the importance of understanding the apparent motions of the Sun on the celestial sphere for a wider array of students and professionals. In this paper, I introduce and describe AstroGeoVis v1.0: open-source software that calculates solar coordinates and related parameters and produces astronomical visualizations relevant to the Earth and climate sciences. The software is written in MATLAB®; while its primary intended purpose is pedagogical, research use is envisioned as well. Both the visualizations and the code are intended to be used in the classroom in a variety of courses, at a variety of levels (from high school students to undergraduates), including Earth and climate sciences, geography, physics, astronomy, mathematics, statistics, and computer science. I provide examples of classroom use and assignment ideas, as well as examples of ways I have used these resources in my college-level teaching. [ABSTRACT FROM AUTHOR]
- Published
- 2021
- Full Text
- View/download PDF
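The solar-coordinate calculations that AstroGeoVis performs center on quantities such as the solar declination. As a minimal, hedged illustration (the package itself is MATLAB; this Python sketch uses Cooper's well-known approximation, not the package's actual code):

```python
import math

def solar_declination_deg(day_of_year: int) -> float:
    """Approximate solar declination in degrees (Cooper's formula).

    day_of_year: 1 (Jan 1) through 365 (Dec 31).
    """
    return 23.45 * math.sin(2 * math.pi * (284 + day_of_year) / 365)

# Near the June solstice (day ~172) the declination approaches +23.45 degrees;
# near the equinoxes (day ~80, ~266) it passes through zero.
```

Approximations of this kind feed directly into classroom exercises on day length, solar elevation, and passive building design.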
22. IN MEMORIAM KARL F. STOCK (13.01.1937–10.06.2022).
- Author
-
Krießmann, Ulrike and Reismann, Bernhard A.
- Subjects
- *
SCIENTIFIC computing , *LIBRARY administration , *BIBLIOGRAPHY , *MATHEMATICAL physics , *COMPUTER science , *LIBRARY directors , *ACADEMIC libraries - Abstract
The article "IN MEMORIAM KARL F. STOCK (13.01.1937–10.06.2022)" describes the life and career of Karl F. Stock, a significant Austrian librarian. Stock was born on January 13, 1937, in Graz and began studying mathematics and physics at the University of Graz after his father's death in 1943. However, due to financial reasons, he discontinued his studies and instead worked in the university library. Stock completed specialized training in data processing for library science and became the director of the library at the Graz University of Technology in 1974. He was a pioneer in the introduction of automated library management and published numerous works on bibliography, computer science, and library organization. Stock was also an artist and created graphic works for decades. He will be remembered as a significant librarian, bibliography expert, and ex libris artist. [Extracted from the article]
- Published
- 2022
- Full Text
- View/download PDF
23. RandNLA: Randomized Numerical Linear Algebra.
- Author
-
DRINEAS, PETROS and MAHONEY, MICHAEL W.
- Subjects
- *
LINEAR algebra , *RANDOM variables , *RANDOM numbers , *COMPUTER science , *SCIENTIFIC computing , *DATA analysis , *MACHINE learning , *LINEAR equations - Abstract
The article presents research on the alleged benefits of randomization in matrix algorithms for numerical linear algebra computing (RandNLA). The applications of RandNLA algorithms in scientific computing, data analysis, and algorithm development are examined and the use of RandNLA to solve least square regression problems and systems of Laplacian-based linear equations as well as in machine learning is described.
- Published
- 2016
- Full Text
- View/download PDF
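The least-squares application mentioned in the RandNLA abstract can be illustrated with the classic sketch-and-solve idea: compress a tall problem with a random sketching matrix and solve the much smaller sketched problem. A minimal sketch (illustrative only, not the authors' implementation):

```python
import numpy as np

rng = np.random.default_rng(0)
n, d, m = 2000, 20, 200  # tall least-squares problem; sketch size d << m << n

A = rng.standard_normal((n, d))
b = A @ rng.standard_normal(d) + 0.01 * rng.standard_normal(n)

# Gaussian sketching matrix: S @ A and S @ b have only m rows.
S = rng.standard_normal((m, n)) / np.sqrt(m)
x_sketch, *_ = np.linalg.lstsq(S @ A, S @ b, rcond=None)
x_exact, *_ = np.linalg.lstsq(A, b, rcond=None)

# With m sufficiently larger than d, the sketched solution's residual is
# close to optimal with high probability.
ratio = np.linalg.norm(A @ x_sketch - b) / np.linalg.norm(A @ x_exact - b)
```

In practice, structured sketches (subsampled randomized Hadamard transforms, sparse embeddings) replace the dense Gaussian S so that forming the sketched problem is cheaper than solving the original one.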
24. An Appraisal of Research delineate of Jordan during 2015 to 2019: A Reflection from Scopus Database.
- Author
-
Al-Jaradat, Omar Mohammad
- Subjects
- *
CITATION indexes , *APPLIED sciences , *PUBLIC universities & colleges , *SCIENTIFIC computing , *DATABASES , *COMPUTER science - Abstract
The study assesses the research performance of Jordanian institutes over the last five years as recorded in the Scopus database. Jordan produced a total of 14722 research articles published in national and international journals during 2015 to 2019, with a continuous gradual increase in both publications and citations. The public and older universities have performed better than the private and newly established universities. Although the major share of articles has been published in national journals, the international journals lead in citations. A total of 4767 journals have been used for publication, of which 227 have been identified as core collection journals. The non-local journals with high impact factor and CiteScore have received the most citations, although relatively few articles have been published in them by Jordanian researchers. Medicine, engineering, computer science, and mathematics are the top contributing subjects, whereas finance, economics, neuroscience, veterinary science, and decision sciences are weaker subjects. The keyword analysis also reflects the strong hold of medicine and health research in the country. The Lancet has been identified as the top cited journal, followed by the New England Journal of Medicine, The Lancet Neurology, IEEE Access, and JAMA Oncology; the least cited journals were the International Journal of Recent Technology and Engineering, Journal of Engineering and Applied Sciences, Jordan Journal of Physics, and International Journal of Scientific and Technology Research. Khader YS with 195 articles, Alzoubi KH with 143 articles, and Khabour OF with 106 articles, all from the Jordan University of Science and Technology, are the top authors in Jordan. A total of 2208 articles were written by single authors and 12514 by multiple authors, with a highly skewed distribution between articles and the number of authors: 27979 authors contributed only a single publication, while 195 articles were published by a single author. The United States, followed by Saudi Arabia and the United Kingdom, have been identified as Jordan's top collaborating countries, while countries such as Tajikistan, the Solomon Islands, and Cambodia are weakly collaborated with. It is found that the highly cited articles are published in The Lancet, and articles with more author collaboration have received more citations. The top-ranked articles in terms of citations have been published by highly collaborative authors. Jordanian researchers mostly cite newer research, which may be related to the stronger areas of knowledge such as medicine, computer science, and engineering, as per Reference Publication Year Spectroscopy (RPYS). [ABSTRACT FROM AUTHOR]
- Published
- 2020
25. How People Are Influenced by Deceptive Tactics in Everyday Charts and Graphs.
- Author
-
Lauer, Claire and O'Brien, Shaun
- Subjects
- *
SCIENTIFIC computing , *CHARTS, diagrams, etc. , *DATA modeling , *HUMAN-computer interaction , *COMPUTER science , *FAKE news , *LITERATURE reviews , *LABELS - Abstract
Background: Visualizations are used to communicate data about important political, social, environmental, and health topics to a wide range of audiences; however, perceptions of graphs as objective conduits of factual data make them an easy means for spreading misinformation. Research questions: 1. Are people deceived by common deceptive tactics or exaggerated titles used in data visualizations about non-controversial topics? 2. Does a person's previous data visualization coursework mitigate the extent to which they are deceived by deceptive tactics used in data visualizations? 3. What parts of data visualizations (title, shape, data labels) do people use to answer questions about the information being presented in data visualizations? Literature review: Although scholarship from psychology, human-computer interaction, and computer science has examined how data visualizations are processed by readers, scholars have not adequately researched how susceptible people are to a range of deceptive tactics used in data visualizations, especially when paired with textual content. Methodology: Participants (n = 329) were randomly assigned to view one of four treatments for four different graph types (bar, line, pie, and bubble) and then asked to answer a question about each graph. Participants were asked to rank the ease with which they read each graph and comment on what they used to respond to the question about each graph. Results/Discussion: Results show that deceptive tactics caused participants to misinterpret information in the deceptive versus control visualizations across all graph types. Neither graph titles nor previous coursework impacted responses for any of the graphs. Qualitative responses illuminate people's perceptions of graph readability and what information they use to read different types of graphs. 
Conclusions: Recommendations are made to improve data visualization instruction, including critically examining software defaults and the ease with which people give agency over to software when preparing data visualizations. Avenues of future research are discussed. [ABSTRACT FROM AUTHOR]
- Published
- 2020
- Full Text
- View/download PDF
26. Reed-Muller Codes Polarize.
- Author
-
Abbe, Emmanuel and Ye, Min
- Subjects
- *
REED-Muller codes , *CODING theory , *SYMMETRIC matrices , *BOOLEAN functions , *SCIENTIFIC computing - Abstract
Reed-Muller (RM) codes were introduced in 1954 and have long been conjectured to achieve Shannon's capacity on symmetric channels. The activity on this conjecture has recently been revived with the emergence of polar codes. RM codes and polar codes are generated by the same matrix $\begin{aligned} G_{m}= \left[{\begin{smallmatrix}1 & 0 \\ 1 & 1 \\ \end{smallmatrix}}\right]^{\otimes m} \end{aligned}$ but use different subsets of rows. RM codes simply select the rows having the largest weights. Polar codes instead select the rows having the largest conditional mutual information, proceeding from top to bottom in $G_{m}$; while this is a more elaborate and channel-dependent rule, the top-to-bottom ordering allows Arıkan to show that the conditional mutual information polarizes, and this directly gives a capacity-achieving code on any symmetric channel. RM codes are yet to be proved to have such a property, despite the recent success for the erasure channel. In this article, we connect RM codes to polarization theory. We show that proceeding in the RM code ordering, i.e., not from top to bottom but from the lightest to the heaviest rows in $G_{m}$, the conditional mutual information again polarizes. Here "polarization" means that almost all the conditional mutual information becomes either very close to 0 or very close to 1. Polarization itself is a necessary condition for RM codes to achieve capacity on symmetric channels, while polarization together with a strong order on the conditional mutual information gives a sufficient condition, where strong order means that rows with larger weight always correspond to larger conditional mutual information. Although we are not able to prove the strong order, we establish a partial order on the conditional mutual information, which is a subset of the strong order.
While the main results of this article, polarization together with the partial order, provide some advances on the capacity-achieving conjecture for RM codes, we emphasize that our results do not allow us to prove the conjecture. [ABSTRACT FROM AUTHOR]
- Published
- 2020
- Full Text
- View/download PDF
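The shared generator matrix described in the abstract above is easy to reproduce. A minimal sketch (with NumPy integers standing in for arithmetic over $\mathbb{F}_2$):

```python
import numpy as np

KERNEL = np.array([[1, 0], [1, 1]], dtype=int)

def kernel_power(m: int) -> np.ndarray:
    """G_m = [[1,0],[1,1]]^{tensor m}: the matrix both code families draw rows from."""
    G = np.array([[1]], dtype=int)
    for _ in range(m):
        G = np.kron(G, KERNEL)
    return G

G3 = kernel_power(3)
row_weights = G3.sum(axis=1)
# Row i has weight 2**popcount(i). RM codes keep the heaviest rows;
# polar codes instead rank rows by conditional mutual information.
```

The row-weight pattern makes the "lightest to heaviest" ordering used in the polarization argument concrete: for $m = 3$ the weights are 1, 2, 2, 4, 2, 4, 4, 8.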
27. New Infinite Families of Perfect Quaternion Sequences and Williamson Sequences.
- Author
-
Bright, Curtis, Kotsireas, Ilias, and Ganesh, Vijay
- Subjects
- *
QUATERNIONS , *ORTHOGONAL arrays , *SCIENTIFIC computing , *FAMILIES - Abstract
We present new constructions for perfect and odd perfect sequences over the quaternion group $Q_{8}$. In particular, we show for the first time that perfect and odd perfect quaternion sequences exist in all lengths $2^{t}$ for $t\geq 0$. In doing so we disprove the quaternionic form of Mow's conjecture that the longest perfect $Q_{8}$-sequence that can be constructed from an orthogonal array construction is of length 64. Furthermore, we use a connection to combinatorial design theory to prove the existence of a new infinite class of Williamson sequences, showing that Williamson sequences of length $2^{t}n$ exist for all $t\geq 0$ when Williamson sequences of odd length $n$ exist. Our constructions explain the abundance of Williamson sequences in lengths that are multiples of a large power of two. [ABSTRACT FROM AUTHOR]
- Published
- 2020
- Full Text
- View/download PDF
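A sequence is called perfect when all of its out-of-phase periodic autocorrelations vanish. As a hedged illustration of that definition over the complex numbers rather than the quaternion group $Q_{8}$ (the function names are ours; this is not the authors' construction):

```python
import numpy as np

def periodic_autocorrelation(s: np.ndarray, shift: int) -> complex:
    """Periodic autocorrelation at a given shift: sum over t of s[t] * conj(s[t+shift])."""
    return np.sum(s * np.conj(np.roll(s, -shift)))

def is_perfect(s: np.ndarray) -> bool:
    """True iff every out-of-phase periodic autocorrelation is (numerically) zero."""
    return all(abs(periodic_autocorrelation(s, k)) < 1e-9 for k in range(1, len(s)))

# The classical length-4 binary sequence (1, 1, 1, -1) is perfect...
binary = np.array([1, 1, 1, -1], dtype=complex)

# ...as is an odd-length Zadoff-Chu sequence with root coprime to the length.
N, u = 7, 1
zc = np.exp(-1j * np.pi * u * np.arange(N) * (np.arange(N) + 1) / N)
```

Quaternion-valued sequences use the same test, with the quaternion conjugate in place of the complex conjugate.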
28. Mixed-Cell-Height Legalization Considering Technology and Region Constraints.
- Author
-
Zhu, Ziran, Chen, Jianli, Zhu, Wenxing, and Chang, Yao-Wen
- Subjects
- *
LEGALIZATION , *CELL motility , *ALGORITHMS , *TECHNOLOGY , *SCIENTIFIC computing - Abstract
Mixed-cell-height circuits have become popular in advanced technologies for better power, area, routability, and performance tradeoffs. With technology and region constraints imposed by modern circuit designs, the mixed-cell-height legalization problem has become even more challenging. Additionally, an ideal legalization method should minimize both the average and maximum cell movements to preserve the quality of a given placement as much as possible. In this article, we present an effective and efficient mixed-cell-height legalization algorithm to consider technology and region constraints while minimizing the average and maximum cell movements. We first present a fence region handling technique to unify the fence regions and the default region. To obtain a desired cell assignment, we then propose a movement-aware cell reassignment method by iteratively reassigning cells in locally dense areas to their desired rows. After cell reassignment, a technology-aware legalization is presented to remove cell overlaps while satisfying the technology constraints. Finally, we propose a technology-aware refinement to further reduce the average and maximum cell movements without increasing technology constraint violations. Compared with the champion of the 2017 CAD Contest at ICCAD and the state-of-the-art work, experimental results based on the 2017 CAD Contest at ICCAD benchmarks show that our algorithm achieves the best average and maximum cell movements and significantly fewer technology constraint violations, in a comparable runtime. The experimental results based on the modified 2015 ISPD Contest benchmarks also demonstrate the effectiveness of our algorithm in minimizing the average and maximum cell movements, compared with state-of-the-art mixed-cell-height legalizers. [ABSTRACT FROM AUTHOR]
- Published
- 2020
- Full Text
- View/download PDF
29. A Provably Secure Two-Factor Authentication Scheme for USB Storage Devices.
- Author
-
Ayub, Muhammad Faizan, Shamshad, Salman, Mahmood, Khalid, Islam, SK Hafizul, Parizi, Reza M., and Choo, Kim-Kwang Raymond
- Subjects
- *
MULTI-factor authentication , *USB technology , *PERSONALLY identifiable information , *STORAGE , *SCIENTIFIC computing - Abstract
Universal Serial Bus (USB) is widely used, for example to facilitate hot-swapping and plug-and-play. However, USB ports can be exploited by an adversary to extract private or personal data from the connected devices. Hence, a number of organizations and workplaces have prohibited their employees from using USB devices, and there have been efforts to design secure USB storage device schemes to more effectively resist different known security attacks. However, designing such schemes is challenging. For example, in this article we revisit Wei et al.'s scheme and demonstrate that it is vulnerable to attacks such as password guessing and user impersonation. We also explain that the scheme does not verify the correctness of the user's input in the login phase, which is another design flaw. Then, we present an improved scheme and prove it secure in the random oracle model. [ABSTRACT FROM AUTHOR]
- Published
- 2020
- Full Text
- View/download PDF
30. A General Centrality Framework-Based on Node Navigability.
- Author
-
De Meo, Pasquale, Levene, Mark, Messina, Fabrizio, and Provetti, Alessandro
- Subjects
- *
CENTRALITY , *SCIENTIFIC computing - Abstract
Centrality metrics are a popular tool in Network Science to identify important nodes within a graph. We introduce the Potential Gain as a centrality measure that unifies many walk-based centrality metrics in graphs and captures the notion of node navigability, interpreted as the property of being reachable from anywhere else (in the graph) through short walks. Two instances of the Potential Gain (called the Geometric and the Exponential Potential Gain) are presented and we describe scalable algorithms for computing them on large graphs. We also give a proof of the relationship between the new measures and established centralities. The geometric potential gain of a node can thus be characterized as the product of its Degree centrality and its Katz centrality score. At the same time, the exponential potential gain of a node is proved to be the product of its Degree centrality and its Communicability index. These formal results connect potential gain to both the "popularity" and "similarity" properties that are captured by the above centralities. [ABSTRACT FROM AUTHOR]
- Published
- 2020
- Full Text
- View/download PDF
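The characterization quoted above, geometric potential gain as the product of Degree and Katz centrality, is straightforward to compute. A minimal sketch under one common Katz convention (illustrative, not the authors' code; the function name and the choice of attenuation factor are ours):

```python
import numpy as np

def geometric_potential_gain(A: np.ndarray, alpha: float = 0.1) -> np.ndarray:
    """Degree centrality times Katz centrality, per the characterization above.

    Katz convention assumed here: x = (I - alpha * A)^{-1} * 1, with alpha
    below 1 / (spectral radius of A) so the underlying series converges.
    """
    n = A.shape[0]
    degree = A.sum(axis=1)
    katz = np.linalg.solve(np.eye(n) - alpha * A, np.ones(n))
    return degree * katz

# Path graph on three nodes: the middle node is the most navigable.
A = np.array([[0, 1, 0],
              [1, 0, 1],
              [0, 1, 0]], dtype=float)
pg = geometric_potential_gain(A)
```

On large graphs the dense solve would be replaced by the iterative, scalable algorithms the paper describes.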
31. Ghost Imputation: Accurately Reconstructing Missing Data of the Off Period.
- Author
-
Rawassizadeh, Reza, Keshavarz, Hamidreza, and Pazzani, Michael
- Subjects
- *
MISSING data (Statistics) , *ACQUISITION of data , *MULTIPLE imputation (Statistics) , *COMPUTATIONAL complexity , *ALGORITHMS , *SCIENTIFIC computing , *DATA - Abstract
Noise and missing data are intrinsic characteristics of real-world data, leading to uncertainty that negatively affects the quality of knowledge extracted from the data. The burden imposed by missing data is often severe in sensors that collect data from the physical world, where large gaps of missing data may occur when the system is temporarily off or disconnected. How can we reconstruct missing data for these periods? We introduce an accurate and efficient algorithm for missing data reconstruction (imputation) that is specifically designed to recover off-period segments of missing data. This algorithm, Ghost, searches the sequential dataset to find data segments that have a prior and posterior segment matching those of the missing data. If there is a similar segment that also satisfies the constraint, such as location or time of day, then it is substituted for the missing data. A baseline approach results in quadratic computational complexity, therefore we introduce a caching approach that reduces the search space and improves the computational complexity to linear in the common case. Experimental evaluations on five real-world datasets show that our algorithm significantly outperforms four state-of-the-art algorithms with an average of 18 percent higher F-score. [ABSTRACT FROM AUTHOR]
- Published
- 2020
- Full Text
- View/download PDF
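The search described in the Ghost abstract, find a candidate segment whose surrounding context matches the gap's and substitute it, can be sketched as follows. This is the quadratic baseline only; the squared-error matching cost and the function names are illustrative assumptions, and the paper's additional constraints (location, time of day) and caching optimization are not reproduced.

```python
import numpy as np

def ghost_impute(series, gap_start, gap_len, ctx=3):
    """Fill series[gap_start:gap_start+gap_len] (NaNs) with the segment elsewhere
    in the series whose ctx-point prior and posterior context best matches
    the gap's. Assumes ctx valid points exist on both sides of the gap."""
    prior = series[gap_start - ctx:gap_start]
    post = series[gap_start + gap_len:gap_start + gap_len + ctx]
    best, best_cost = None, np.inf
    for s in range(ctx, len(series) - gap_len - ctx + 1):
        if s == gap_start:
            continue  # the gap itself is not a candidate
        window = series[s - ctx:s + gap_len + ctx]
        if np.isnan(window).any():
            continue  # candidate or its context overlaps missing data
        cost = (np.sum((series[s - ctx:s] - prior) ** 2)
                + np.sum((series[s + gap_len:s + gap_len + ctx] - post) ** 2))
        if cost < best_cost:
            best, best_cost = series[s:s + gap_len].copy(), cost
    out = series.copy()
    if best is not None:
        out[gap_start:gap_start + gap_len] = best
    return out

# Periodic toy signal with a two-point gap knocked out.
x = np.tile(np.arange(8.0), 4)
x[10:12] = np.nan
filled = ghost_impute(x, 10, 2)
```

On the periodic toy signal the best candidate repeats the same phase of the cycle, so the gap is reconstructed exactly; the paper's caching scheme avoids scanning every candidate position.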
32. Sublinear-Time Algorithms for Compressive Phase Retrieval.
- Author
-
Li, Yi and Nakos, Vasileios
- Subjects
- *
SCIENTIFIC computing , *ALGORITHMS , *IMAGE reconstruction algorithms , *APPROXIMATION algorithms - Abstract
In the problem of compressed phase retrieval, the goal is to reconstruct a sparse or approximately $k$ -sparse vector $x \in \mathbb {C} ^{n}$ given access to $y= |\Phi x|$ , where $|v|$ denotes the vector obtained from taking the absolute value of $v\in \mathbb {C} ^{n}$ coordinate-wise. In this paper we present sublinear-time algorithms for a few for-each variants of the compressive phase retrieval problem which are akin to the variants considered for the classical compressive sensing problem in theoretical computer science. Our algorithms use pure combinatorial techniques and near-optimal number of measurements. [ABSTRACT FROM AUTHOR]
- Published
- 2020
- Full Text
- View/download PDF
33. A New Family of APN Quadrinomials.
- Author
-
Budaghyan, Lilya, Helleseth, Tor, and Kaleyski, Nikolay
- Subjects
- *
SCIENTIFIC computing , *FAMILIES , *BOOLEAN functions - Abstract
The binomial $B(x) = x^{3} + \beta x^{36}$ (where $\beta $ is primitive in $\mathbb {F}_{2^{2}}$) over $\mathbb {F}_{2^{10}}$ is the first known example of an Almost Perfect Nonlinear (APN) function that is not CCZ-equivalent to a power function, and has remained unclassified into any infinite family of APN functions since its discovery in 2006. We generalize this binomial to an infinite family of APN quadrinomials of the form $x^{3} + a (x^{2^{i}+1})^{2^{k}} + b x^{3 \cdot 2^{m}} + c (x^{2^{i+m}+2^{m}})^{2^{k}}$ from which $B(x)$ can be obtained by setting $a = \beta $ , $b = c = 0$ , $i = 3$ , $k = 2$. We show that for any dimension $n = 2m$ with $m$ odd and $3 \nmid m$ , setting $(a,b,c) = (\beta, \beta ^{2}, 1)$ and $i = m-2$ or $i = (m-2)^{-1} \mod n$ yields an APN function, and verify that for $n = 10$ the quadrinomials obtained in this way for $i = m-2$ and $i = (m-2)^{-1} \mod n$ are CCZ-inequivalent to each other, to $B(x)$ , and to any other known APN function over $\mathbb {F}_{2^{10}}$. [ABSTRACT FROM AUTHOR]
- Published
- 2020
- Full Text
- View/download PDF
34. Network-Coding Solutions for Minimal Combination Networks and Their Sub-Networks.
- Author
-
Cai, Han, Chrisnata, Johan, Etzion, Tuvi, Schwartz, Moshe, and Wachter-Zeh, Antonia
- Subjects
- *
LINEAR network coding , *MINIMAL surfaces , *GRAPH coloring , *FINITE fields , *SCIENTIFIC computing , *HOMOMORPHISMS - Abstract
Minimal multicast networks are fascinating and efficient combinatorial objects, where the removal of a single link makes it impossible for all receivers to obtain all messages. We study the structure of such networks, and prove some constraints on their possible solutions. We then focus on the combination network, which is one of the simplest and most insightful networks in network-coding theory. Of particular interest are minimal combination networks. We study the gap in alphabet size between vector-linear and scalar-linear network-coding solutions for such minimal combination networks and some of their sub-networks. For minimal multicast networks with two source messages we find the maximum possible gap. We define and study sub-networks of the combination network, which we call Kneser networks, and prove that they attain the upper bound on the gap with equality. We also prove that the study of this gap may be limited to the study of sub-networks of minimal combination networks, by using graph homomorphisms connected with the $q$ -analog of Kneser graphs. Additionally, we prove a gap for minimal multicast networks with three or more source messages by studying Kneser networks. Finally, an upper bound on the gap for full minimal combination networks shows nearly no gap, or none in some cases. This is obtained using an MDS-like bound for subspaces over a finite field. [ABSTRACT FROM AUTHOR]
- Published
- 2020
- Full Text
- View/download PDF
35. SEAL: User Experience-Aware Two-Level Swap for Mobile Devices.
- Author
-
Li, Changlong, Shi, Liang, Liang, Yu, and Xue, Chun Jason
- Subjects
- *
MOBILE apps , *SCIENTIFIC computing , *USER experience - Abstract
App caching is important for mobile devices, which enables fast switching and state restoration of apps by caching all the pages in memory. Memory swapping can improve app caching capability by evicting pages to the secondary storage. However, enabling memory swapping could induce jitters in interactions, which significantly degrades the user experience. As a result, storage-based swapping is disabled by default in most mobile devices. This article proposes a novel swap framework, SEAL, a user experience-aware two-level swapping, which maximizes the benefits of memory swapping and minimizes the negative impact on user experience in interactions. Inspired by a study on the access characteristics of a set of popular apps on mobile devices, the framework adopts compressed memory as the first swap level (SL1) and secondary storage as the second swap level (SL2). To optimize user experience comprehensively, three schemes are proposed. First, a novel page identification scheme is proposed to guide the page placement between these two levels. Second, a hidden page loading (HPL) scheme is proposed to load pages from SL2 to SL1 for optimized user experience during app execution. Finally, an app-granularity swapping scheme is proposed to swap data in the unit of apps. Experiments on real devices show that app caching capability is improved by $2.43\times $ on average when enabling SEAL while minimizing the negative impact on user experience. [ABSTRACT FROM AUTHOR]
- Published
- 2020
- Full Text
- View/download PDF
36. Neighbor Discovery Based on Cross-Technology Communication for Mobile Applications.
- Author
-
Gao, Demin, Li, Zhijun, Liu, Yunhuai, and He, Tian
- Subjects
- *
MOBILE apps , *ZIGBEE , *NEIGHBORS , *SCIENTIFIC computing - Abstract
Neighbor discovery is essential for mobile devices to find each other before communications. Traditional discovery protocols are constrained by the in-technology communication paradigm. Cross-Technology Communications enable ZigBee nodes to be coordinated by a WiFi node without any hardware changes or gateway equipment, which sheds light on more efficient neighbor discovery schemes. In this work, we introduce a new direction for neighbor discovery based on the Cross-Technology Communication technique and propose a physical-layer technique called NewBee. NewBee takes advantage of coordination from a WiFi node to assist ZigBee nodes in neighbor discovery. A simple yet effective Countdown Mechanism is employed so that ZigBee nodes are woken up in a coordinated manner according to the beacons in the Countdown Mechanism. We give a rigorous analysis of NewBee, and the worst-case discovery latency is significantly reduced from the traditional $O(n^{2})$ to $O(n)$, where $n$ denotes the reciprocal of a node's duty-cycle. The experimental results show that NewBee needs only 11%, 10%, 2.5%, 2.4%, and 4.4% of the discovery time of the state-of-the-art BlindDate, SearchLight, Disco, Quorum, and Birthday protocols, respectively, with a 5% node duty-cycle. [ABSTRACT FROM AUTHOR]
- Published
- 2020
- Full Text
- View/download PDF
37. Scope-Aware Useful Cache Block Calculation for Cache-Related Pre-Emption Delay Analysis With Set-Associative Data Caches.
- Author
-
Zhang, Wei, Guan, Nan, Ju, Lei, Tang, Yue, Liu, Weichen, and Jia, Zhiping
- Subjects
- *
CACHE memory , *DATA analysis , *SYSTEM analysis , *TASK analysis , *PROBLEM solving , *SCIENTIFIC computing - Abstract
Timing analysis of real-time systems must consider cache-related pre-emption delay (CRPD) costs when pre-emptive scheduling is used. While most previous work on CRPD analysis only considers instruction caches, the CRPD incurred on data caches is actually more significant. The state-of-the-art CRPD analysis methods are based on useful cache block (UCB) calculation. Unfortunately, as shown in this article, directly extending the existing UCB calculation techniques from instruction caches to data caches will lead to both unsoundness and significant imprecision. To solve these problems, we develop a new UCB calculation technique for data caches, which redefines the analysis unit (to address the unsoundness in the existing method) and precisely captures the dynamic cache access behavior by taking the temporal scopes of memory blocks into consideration. The experimental results show that our new technique yields substantially tighter CRPD estimations compared with the state-of-the-art. [ABSTRACT FROM AUTHOR]
- Published
- 2020
- Full Text
- View/download PDF
38. Exploring the Macrostructure of Research Articles in Economics.
- Author
-
Jin, Guangsa, Li, Chenle, and Sun, Ya
- Subjects
- *
ECONOMIC research , *COGNITIVE load , *ECONOMETRIC models , *SCIENTIFIC computing , *LEARNING goals - Abstract
Background: The cognitive load involved in research article (RA) reading can be overwhelming for L2 novice readers. RA section headings can be used as signals to help novices focus on essential information related to their learning goals to reduce extraneous cognitive processing. There is a need to examine RA macrostructures to inform RA reading instruction. Literature review: RAs do not always follow the Introduction-Methods-Results-Discussion (IMRD) model. Previous research has examined the macrostructure of articles in disciplines such as computer science, applied linguistics, and pure mathematics, but few have investigated the macrostructure of economics RAs. Research questions: 1. Are there any sections frequently used in economics articles apart from the conventional sections? 2. If yes, what are the views of expert economics RA readers on the communicative functions and propositional content of the newly identified sections of economics RAs? Research methods: Eighty RAs were collected from five economics journals using stratified random sampling. Following Yang and Allison's macrostructure analysis method, we conducted an analysis of the overall structure of the RAs based on section headings and the function and content of each section. Results: Compared with the IMRD model, we found six new section types: Background, Theoretical Model, Econometric Model, Robustness, Mechanisms, and Application. Interviews were conducted to explore expert RA readers’ genre knowledge on the newly identified sections. Conclusion: The findings can be useful for RA reading and writing instruction and future research on part-genres of economics articles. [ABSTRACT FROM AUTHOR]
- Published
- 2020
- Full Text
- View/download PDF
39. ALGORITHMICITY OF EVOLUTIONARY ALGORITHMS.
- Author
-
Leciejewski, Sławomir and Szynkiewicz, Mariusz
- Subjects
- *
SCIENTIFIC computing , *COLLOQUIAL language , *ALGORITHMS , *SCIENTIFIC language , *EVOLUTIONARY algorithms , *EVOLUTIONARY computation , *COMPUTER science - Abstract
In the first part of our article, we will address the penetration of scientific terms into colloquial language, focusing on the sense in which the concept of an algorithm currently functions outside its original scope. The examples given will refer mostly to disciplines not directly related to computer science and to colloquial language. In the next part, we will also discuss the modifications made to the meaning of the term algorithm and how this concept is now understood in computer science. Finally, we will discuss the problem of the algorithmicity of evolutionary algorithms, i.e., we will try to answer the question of whether, and to what extent, they are still algorithms in the classical sense. [ABSTRACT FROM AUTHOR]
- Published
- 2020
- Full Text
- View/download PDF
40. ANALOGICITY IN COMPUTER SCIENCE. METHODOLOGICAL ANALYSIS.
- Author
-
Stacewicz, Paweł
- Subjects
- *
SCIENTIFIC computing , *TURING machines , *GREAT powers (International relations) - Abstract
Analogicity in computer science is understood in two, not mutually exclusive ways: 1) with regard to the continuity feature (of data or computations), 2) with regard to the analogousness feature (i.e. similarity between certain natural processes and computations). Continuous computations are the subject of three methodological questions considered in the paper: 1a) to what extent do their theoretical models go beyond the model of the universal Turing machine (defining digital computations), 1b) is their computational power greater than that of the universal Turing machine, 1c) under what conditions are continuous computations realizable in practice? The analogue-analogical computations lead to two other issues: 2a) in what sense and to what extent their accuracy depends on the adequacy of certain theories of empirical sciences, 2b) are there analogue-analogical computations in nature that are also continuous? The above issues are an important element of the philosophical discussion on the limitations of contemporary computer science. [ABSTRACT FROM AUTHOR]
- Published
- 2020
- Full Text
- View/download PDF
41. Open Source Research Software.
- Author
-
Hasselbring, Wilhelm, Carr, Leslie, Hettrick, Simon, Packer, Heather, and Tiropanis, Thanassis
- Subjects
- *
COMPUTER software , *OPEN source software , *COMPUTER science , *SCIENTIFIC computing , *SOFTWARE engineering , *ARTIFICIAL intelligence - Abstract
Reports on the need to make research software open source. It should be both archived for reproducibility and actively maintained for reusability. In computational and computer science, research software is a central asset for development activities. For good scientific practice, the resulting research software should be open source. Established open source software licenses provide sufficient options for granting permissions, such that it should be the rare exception to keep research software closed. Proper engineering is required for obtaining reusable and sustainable research software. This way, software engineering methods may improve research in other disciplines. However, research in software engineering and computer science itself will also benefit when programs are reused. To study the state of the art in this field, we analyzed research software publishing practices in computer and computational science and observed significant differences: computational science emphasizes reproducibility, while computer science emphasizes reuse. [ABSTRACT FROM AUTHOR]
- Published
- 2020
- Full Text
- View/download PDF
42. CRITERIA-BASED ASSESSMENT AS A MEANS OF DEVELOPING STUDENTS' FUNCTIONAL LITERACY IN COMPUTER SCIENCE.
- Author
-
AVDARSOL, Sailaugul, RAKHIMZHANOVA, Lyazzat B., BOSTANOV, Bektas G., SAGIMBAEVA, Ainur Ye., and KHAKIMOVA, Tiyshtik
- Subjects
- *
COMPUTER literacy , *SCIENTIFIC literacy , *FORMATIVE tests , *SCIENTIFIC computing , *ACADEMIC achievement , *COMPUTER science - Abstract
For a long time, the primary approach to assessment was the normative approach, in which students' individual achievements were compared with a particular norm (the results of most students). Recently, domestic pedagogical research has been developing a criteria-based approach to assessing academic achievement, in which students' achievements are compared with the amount of knowledge that needs to be acquired at a particular stage of training. This study aimed to determine the role of criteria-based assessment in the formation of students' functional literacy in computer science, to build a criteria-based assessment model for the development of functional literacy, and to demonstrate the effectiveness of methods for forming students' functional literacy in computer science. The leading research methods were the criteria-based assessment methodology developed by the authors and the method of formative assessment. Some elements of the methodology of formative evaluation were considered. For the further development of methods for forming students' functional literacy in computer science, a criteria-based assessment model was built. The introduction of criteria-based assessment will allow a switch to formative evaluation aimed at developing student competence. Evaluation based on criteria that students understand stimulates them and makes the learning process meaningful. Based on practical experiments and the proposed criteria-based assessment, the effectiveness of methods for the formation of students' functional literacy in computer science has been demonstrated. [ABSTRACT FROM AUTHOR]
- Published
- 2020
43. Redesign and validation of a computer programming course using Inductive Teaching Method.
- Author
-
Khan, Iftikhar Ahmed, Iftikhar, Mehreen, Hussain, Syed Sajid, Rehman, Attiqa, Gul, Nosheen, Jadoon, Waqas, and Nazir, Babar
- Subjects
- *
COMPUTER programming , *TEACHING methods , *CLASSROOM activities , *MATHEMATICAL ability , *SCIENTIFIC computing , *COMPUTER science - Abstract
The Inductive Teaching Method (ITM) promotes effective learning in technological education (Felder & Silverman, 1988). Students prefer the ITM, as it makes the subject more easily understandable (Goltermann, 2011). The ITM motivates students to actively participate in class activities and could therefore be considered a better approach to teaching computer programming. There has been little research on implementing the ITM in computer science courses despite its potential to improve effective learning. In this research, an existing computer programming lab course taught using a traditional Deductive Teaching Method (DTM) was redesigned and taught using the ITM instead. Furthermore, a comprehensive plan was devised to deliver the course content in computer labs. The course was evaluated in an experiment with 81 undergraduate students. The students in the Experimental Group (EG) (N = 45) were taught using the redesigned ITM course, whereas the students in the Control Group (CG) (N = 36) were taught using the DTM course. The performance of both groups was compared in terms of the marks obtained. A pre-test comparing pre-course mathematical and analytical abilities showed that the CG was better in analytical reasoning, with no significant difference in mathematical abilities. Three post-tests evaluating the groups' theoretical and practical competence in programming showed that the EG improved performance with large, medium, and small effect sizes compared with the CG. The results of this research could help computer programming educators implement inductive strategies to improve the learning of computer programming. [ABSTRACT FROM AUTHOR]
- Published
- 2020
- Full Text
- View/download PDF
44. Design, Malfunction, Validity: Three More Tasks for the Philosophy of Computing.
- Author
-
Primiero, Giuseppe
- Subjects
- *
SCIENTIFIC computing , *PHILOSOPHY of science , *TASKS , *PHILOSOPHY , *COMPUTER science - Abstract
We present a review of Raymond Turner's Book Computational Artifacts – Towards a Philosophy of Computer Science (2018), focusing on three main topics: Design, Malfunction, and Validity. [ABSTRACT FROM AUTHOR]
- Published
- 2020
- Full Text
- View/download PDF
45. Computer Science as Immaterial Formal Logic.
- Author
-
Bringsjord, Selmer
- Subjects
- *
SCIENTIFIC computing , *PHILOSOPHY of science , *LOGIC - Abstract
I critically review Raymond Turner's Computational Artifacts – Towards a Philosophy of Computer Science by placing beside his position a rather different one, according to which computer science is a branch of, and is therefore subsumed by, immaterial formal logic. [ABSTRACT FROM AUTHOR]
- Published
- 2020
- Full Text
- View/download PDF
46. SOME REMARKS CONCERNING VIRTUALITY.
- Author
-
LATAWIEC, ANNA
- Subjects
- *
VIRTUAL reality , *COMPUTER science , *WORLDVIEW , *SCIENTIFIC computing - Abstract
The development of the computer sciences has transformed our way of thinking and our perception of the world. To express this new view of the world, a new language has been created, using such notions as "virtuality", "virtual world", and "virtual reality". These words have already found their way into our colloquial speech and our thinking. However, they are used in various contexts and carry different meanings. The paper offers some remarks on the problem of the meaning of these notions and draws some consequences from their interpretation. [ABSTRACT FROM AUTHOR]
- Published
- 2020
- Full Text
- View/download PDF
47. The Effect of Internships on Computer Science Engineering Capstone Projects.
- Author
-
Jaime, Arturo, Olarte, Juan J., Garcia-Izquierdo, Francisco J., and Dominguez, Cesar
- Subjects
- *
SCIENTIFIC computing , *COMPUTER engineering , *INTERNSHIP programs , *TECHNOLOGICAL complexity , *PROJECT management , *STUDENT projects - Abstract
Contribution: Internships designed to provide training and an initial period of contact with industry, prior to a computer science engineering capstone project, have a very positive impact on both industry and academic capstone projects. Background: Internships and capstone projects are widely used to integrate work-related learning in computer engineering curricula. Both activities offer numerous benefits for students, industry, and academia. Although their effects have been extensively studied separately, the interaction between them remains unexplored. Research Questions: What is the effect of internships on the development of a subsequent capstone project? Methodology: The hypothesis was that the completion of an internship will have positive effects on several aspects of the capstone projects: 1) improved student competencies; 2) improved capstone project outcomes; and 3) decreased supervision effort. Further, these positive effects were expected to be greater in industry-based projects than in academic projects. The hypothesis was tested through a quantitative study of data collected from 274 computer science engineering capstone projects. A period of time with internships was compared with another period without internships, and differentiating between academic and industry projects. Findings: Internships prior to capstone projects improve student skills in autonomy, technology, methodology, and project management; increase the complexity and technological novelty of the resulting projects; and reduce advisor involvement in practical (technology, execution) and keep-the-project-alive issues, and increase advisor involvement in monitoring student work (meetings, reports, and initial arrangements). This effect was observed in both industrial and academic projects. [ABSTRACT FROM AUTHOR]
- Published
- 2020
- Full Text
- View/download PDF
48. The 1950s: Capacity expansion, HDPE/PP, polycarbonate, computers and rocket science.
- Author
-
NICHOLS, LEE
- Subjects
- *
COMPUTER science , *CHROMIUM catalysts , *MATERIALS science , *SCIENTIFIC computing , *CHEMICAL industry , *POLYCARBONATES , *POLYOLEFINS - Abstract
The article offers information on the development of the hydrocarbon processing industry in the 1950s. The 1950s marked an evolution in the use of oil by various nations. Across the world, nations were investing in new refining capacity to satisfy demand for refined fuels. The 1950s were also a time of new technological discoveries for the refining and petrochemical industries, including petrochemical processes to produce higher-octane fuels and new derivatives of polyethylene.
- Published
- 2022
49. LogDet Metric-Based Domain Adaptation.
- Author
-
Liu, Youfa, Du, Bo, Tu, Weiping, Gong, Mingming, Guo, Yuhong, and Tao, Dacheng
- Subjects
- *
SCIENTIFIC computing , *SIMPLICITY , *STATISTICS - Abstract
Domain adaptation has proven successful in dealing with the case where training and test samples are drawn from two different distributions. Recently, second-order statistics alignment has gained significant attention in the field of domain adaptation due to its simplicity and effectiveness. However, researchers have encountered major difficulties with optimization, as it is difficult to find an explicit expression for the gradient. Moreover, the transformation employed does not perform dimensionality reduction. Accordingly, in this article, we prove that there exists a scaled LogDet metric that is more effective for second-order statistics alignment than the Frobenius norm, and hence we adopt it for second-order statistics alignment. First, we introduce two homologous transformations, which help to reduce dimensionality and to extract transferable knowledge from the relevant domain. Second, we provide an explicit gradient expression, which is an important ingredient for optimization. We further extend the LogDet model from the single-source domain setting to the multisource domain setting by applying the weighted Karcher mean to the LogDet metric. Experiments on both synthetic and realistic domain adaptation tasks demonstrate that the proposed approaches are effective compared with state-of-the-art ones. [ABSTRACT FROM AUTHOR]
- Published
- 2020
- Full Text
- View/download PDF
50. Accurate Tensor Completion via Adaptive Low-Rank Representation.
- Author
-
Zhang, Lei, Wei, Wei, Shi, Qinfeng, Shen, Chunhua, van den Hengel, Anton, and Zhang, Yanning
- Subjects
- *
ORTHOTROPIC plates , *SCIENTIFIC computing - Abstract
Low-rank representation-based approaches, which assume low-rank tensors and exploit their low-rank structure with appropriate prior models, have underpinned much of the recent progress in tensor completion. However, in most cases real tensor data only approximately comply with the low-rank requirement, i.e., the tensor consists of low-rank (e.g., the principal part) as well as non-low-rank (e.g., details) structures, which limits the completion accuracy of these approaches. To address this problem, we propose an adaptive low-rank representation model for tensor completion that represents the low-rank and non-low-rank structures of a latent tensor separately in a Bayesian framework. Specifically, we reformulate the CANDECOMP/PARAFAC (CP) tensor rank and develop a sparsity-induced prior for the low-rank structure that can be used to determine the tensor rank automatically. The non-low-rank structure is then modeled using a mixture-of-Gaussians prior, which is shown to be sufficiently flexible and powerful to inform the completion process for a variety of real tensor data. With these two priors, we develop a Bayesian minimum mean-squared error estimation framework for inference. The developed framework can capture the important distinctions between low-rank and non-low-rank structures, thereby enabling a more accurate model and, ultimately, more accurate completion. For various applications, the proposed model yields more accurate completion results than state-of-the-art methods. [ABSTRACT FROM AUTHOR]
- Published
- 2020
- Full Text
- View/download PDF