442 results on '"e-science"'
Search Results
2. Use of e-Science Environment on CFD Education
- Author
-
Jongbae Moon, Jae Wan Ahn, Yoonhee Kim, Jin-Ho Kim, Kum Won Cho, Soon-Heum Ko, Chongam Kim, and Byoungsoo Kim
- Subjects
Engineering ,business.industry ,Process (engineering) ,Interface (computing) ,Grid ,computer.software_genre ,Data science ,Workflow ,Human–computer interaction ,Virtual machine ,Middleware (distributed applications) ,e-Science ,Information technology management ,business ,computer - Abstract
‘e-Science’ refers to global collaborations of people and shared resources to solve new and challenging problems in science and engineering (Hey & Trefethen, 2003), built on the IT infrastructure typically referred to as the Grid (Foster & Kesselman, 1999). As the term suggests, e-Science initially meant a virtual environment in which new and challenging research could be carried out on the latest infrastructures. That virtual environment usually takes the form of a web portal or a stand-alone application containing a collection of computer-science components: the high-end applications include large-scale computations of complex multi-physics mechanisms, coupled computational and experimental work for design-to-development processes, and/or data-intensive research. The infrastructure consists of computational and experimental facilities, valuable datasets, knowledge, and so on. Researchers themselves can be regarded as a core component of an e-Science environment, since their discussions and collaborations are promoted by, managed by, and integrated into the environment. Meanwhile, the meaning of ‘e-Science’ has been becoming broader. Although e-Science was first intended to enrich high-end research activities, it soon proved to be effective for academic activities as well, as a cyber-education system. (On the other hand, its use in interdisciplinary collaborative research has not taken off as much as expected, because research domains differ in their preferences for internal workflows, I/O and interfaces.) Thus, the term ‘e-Science’ is now used more broadly to mean ‘all scientific activities on high-performance computing and experimental facilities with the aid of user-friendly interfaces and system middleware’. As a virtual academic system for aerospace engineering, ‘e-AIRS’ (e-Science Aerospace Integrated Research System) has been designed and developed since 2005. After three years of development, the e-AIRS educational system is finally open, allowing non-experts to intuitively conduct the full process of a computational and experimental fluid dynamics study. Also, the
- Published
- 2021
3. Understanding e-Science—What Is It About?
- Author
-
Claudia Koschtial
- Subjects
020203 distributed computing ,Computer science ,business.industry ,Emerging technologies ,Context (language use) ,02 engineering and technology ,computer.software_genre ,Data science ,Task (project management) ,Grid computing ,020204 information systems ,e-Science ,0202 electrical engineering, electronic engineering, information engineering ,Selection (linguistics) ,The Internet ,business ,computer ,Coherence (linguistics) - Abstract
Our daily life has experienced significant changes in the Internet age. The emergence of e-science is regarded as a similarly dramatic change for science. Wikis, blogs, virtual social networks, grid computing and open access are just a brief selection of the related new technologies. In order to understand these changes, it is necessary to define the aspects of e-science precisely. At present, no generally used term or common definition of e-science exists, which limits the understanding of the true potential of the concept. Based on a well-known approach to science in terms of three dimensions—human, task and technology—the author provides a framework for understanding the concept, enabling a distinctive view of its development. The concept of e-science emerged in step with the technological development of Web 2.0 and infrastructure and has reached maturity. This is affecting the task and human dimensions since, in this context, the letter “e” means more than just electronic.
- Published
- 2021
4. Mining_RNA: WEB-Based System Using e-Science for Transcriptomic Data Mining
- Author
-
Leite Thiago Alefy Almeida e Cicília Raquel Maia Sousa, Vânia Marilande Ceccatto, Cynthia Moreira Maia, Carlos Renan Moreira, Adriano Gomes da Silva, Raquel Martins de Freitas, Pedro Victor Morais Batista, Stela Mirla da Silva Felipe, Christina Pacheco, Marcos Vinícius Pereira Diógenes, Pedro Fernandes Ribeiro Neto, Exlley Clemente dos Santos, and Thalia Katiane Sampaio Gurgel
- Subjects
Gene expression omnibus ,Web system ,Computer science ,business.industry ,Interface (computing) ,e-Science ,Biological database ,Web application ,Data mining ,DNA microarray ,computer.software_genre ,business ,computer - Abstract
High-throughput gene expression studies have yielded a great number of large datasets, which are freely available in biological databases. Re-analyzing these studies individually or in clusters can produce new results relevant to the scientific community. The purpose of this work is to develop a web system based on the e-Science paradigm. The system should read massive amounts of data from the Gene Expression Omnibus (GEO) database, then pre-process, mine, and display it in a user-friendly interface (a small parsing and pre-processing sketch follows this entry). The intention is to mitigate the difficulty of interpreting data from transcriptomic studies made using the DNA microarray technique. The preliminary results obtained from the initial stages of development are also presented, as well as the proposed architecture for the system.
- Published
- 2021
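The entry above describes reading expression data from GEO, pre-processing it, and mining it. As a rough illustration only, and not the authors' Mining_RNA system, the following Python sketch parses an already-downloaded GEO series-matrix file with pandas and applies two common pre-processing steps; the accession file name, the log2 transform and the "top variable probes" cut-off are assumptions.

```python
# Illustrative sketch: parse a GEO series-matrix file and pre-process it.
# Assumes a file such as GSE1563_series_matrix.txt has already been
# downloaded from GEO; the accession and thresholds are placeholders.
import numpy as np
import pandas as pd

def load_series_matrix(path):
    # GEO series-matrix metadata lines start with '!', so they can be
    # skipped as comments; the remaining block is a tab-separated table
    # with probe IDs in the first column and one column per sample (GSM).
    return pd.read_csv(path, sep="\t", comment="!", index_col=0)

def preprocess(expr, top_n=1000):
    # Log-transform (common for microarray intensities) ...
    expr = np.log2(expr + 1.0)
    # ... z-score each probe across samples ...
    expr = expr.sub(expr.mean(axis=1), axis=0).div(expr.std(axis=1) + 1e-9, axis=0)
    # ... and keep the most variable probes as candidates for mining.
    variability = expr.var(axis=1)
    return expr.loc[variability.sort_values(ascending=False).index[:top_n]]

if __name__ == "__main__":
    expression = load_series_matrix("GSE1563_series_matrix.txt")  # placeholder file
    mined_input = preprocess(expression)
    print(mined_input.shape)
```

The output of such a step would then feed whatever mining method the system applies, for example clustering or differential-expression screening.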
5. Visions of a Future Research Workplace Arising from Recent Foresight Exercises
- Author
-
Andrzej M. J. Skulimowski
- Subjects
Process (engineering) ,business.industry ,Computer science ,02 engineering and technology ,computer.software_genre ,Data science ,Automation ,Expert system ,Data flow diagram ,Futures studies ,020401 chemical engineering ,Software deployment ,Information and Communications Technology ,020204 information systems ,e-Science ,0202 electrical engineering, electronic engineering, information engineering ,0204 chemical engineering ,business ,computer - Abstract
The results of recent foresight projects reveal the impact of future ICT tools on the practice of scientific research. This paper presents several aspects of the process of building scenarios and trends for selected advanced ICT technologies. We point out the implications of emerging global expert systems (GESs) and AI-based learning platforms (AILPs). GESs will be capable of using and processing global knowledge from all available sources, such as databases, repositories, video streams, and interactions with other researchers and knowledge processing units. In many scientific disciplines, the high volume, density and increasing level of interconnection of data have already exhausted the capacities of any individual researcher. Three trends may dominate the development of scientific methodology. Collective research is one possible coping strategy: group intellectual capacity makes it possible to tackle complex problems. Recent data flow forecasts indicate that even in the few areas that still resist ICT domination, research based on data gathered in non-ICT-supported collections will soon reach its performance limits due to the ever-growing amount of knowledge to be acquired, verified, exchanged and communicated between researchers. Growing automation of research is the second option: automated expert systems will be capable of selecting and processing knowledge to the level of a professionally edited scientific paper, with only minor human involvement. The third trend is the intensive development and deployment of brain–computer interfaces (BCIs) to quickly access and process data. Specifically, GESs and AILPs can be used together with BCIs. The above approaches may eventually merge, forming a few AI-related technological scenarios, as discussed in the paper's conclusion.
- Published
- 2021
6. e-Science
- Author
-
Thomas Köhler, Carsten Felden, and Claudia Koschtial
- Subjects
Management information systems ,Grid computing ,Virtual organization ,business.industry ,e-Science ,Science communication ,Information technology ,Context (language use) ,business ,computer.software_genre ,Data science ,computer ,Digitization - Abstract
This open access book shows the breadth and various facets of e-Science, while also illustrating their shared core. Changes in scientific work are driven by the shift to grid-based worlds, the use of information and communication systems, and the existential infrastructure, which includes global collaboration. In this context, the book addresses emerging issues such as open access, collaboration and virtual communities and highlights the diverse range of developments associated with e-Science. As such, it will be of interest to researchers and scholars in the fields of information technology and knowledge management.
- Published
- 2021
7. Workflow Provenance in the Lifecycle of Scientific Machine Learning
- Author
-
Marta Mattoso, Marco A. S. Netto, Renato Cerqueira, Renan Souza, Patrick Valduriez, Daniel Civitarese, Elton F. de S. Soares, Marcio Ferreira Moreno, Emilio Vital Brazil, Rafael Brandão, Vítor Lourenço, Leonardo Guerreiro Azevedo, Raphael Melo Thiago, IBM Research [Rio de Janeiro], Scientific Data Management (ZENITH), Inria Sophia Antipolis - Méditerranée (CRISAM), Institut National de Recherche en Informatique et en Automatique (Inria)-Institut National de Recherche en Informatique et en Automatique (Inria)-Laboratoire d'Informatique de Robotique et de Microélectronique de Montpellier (LIRMM), Centre National de la Recherche Scientifique (CNRS)-Université de Montpellier (UM)-Centre National de la Recherche Scientifique (CNRS)-Université de Montpellier (UM), Universidade Federal do Rio de Janeiro (UFRJ), Laboratoire d'Informatique de Robotique et de Microélectronique de Montpellier (LIRMM), Centre National de la Recherche Scientifique (CNRS)-Université de Montpellier (UM)-Centre National de la Recherche Scientifique (CNRS)-Université de Montpellier (UM)-Inria Sophia Antipolis - Méditerranée (CRISAM), and Institut National de Recherche en Informatique et en Automatique (Inria)-Institut National de Recherche en Informatique et en Automatique (Inria)
- Subjects
FOS: Computer and information sciences ,Computer Science - Machine Learning ,J.2 ,Computer science ,H.2 ,65Y05, 68P15 ,02 engineering and technology ,computer.software_genre ,Machine Learning (cs.LG) ,0202 electrical engineering, electronic engineering, information engineering ,Scientific Workflow ,Databases (cs.DB) ,Reproducibility ,Computer Science Applications ,Computer Science - Distributed, Parallel, and Cluster Computing ,Computational Theory and Mathematics ,Provenance ,Scalability ,020201 artificial intelligence & image processing ,I.2 ,Scientific Machine Learning ,Computer Science - Artificial Intelligence ,C.4 ,Computer Networks and Communications ,Machine learning ,External Data Representation ,Semantics ,Theoretical Computer Science ,Domain (software engineering) ,Computer Science - Databases ,Artificial Intelligence ,Lineage ,020204 information systems ,Design Principles ,Leverage (statistics) ,Taxonomy ,[INFO.INFO-DB]Computer Science [cs]/Databases [cs.DB] ,business.industry ,Data Science ,e-Science ,Explainability ,Workflow ,Artificial Intelligence (cs.AI) ,Systems architecture ,Artificial intelligence ,Distributed, Parallel, and Cluster Computing (cs.DC) ,business ,computer ,Data lake ,Software ,Machine Learning Lifecycle - Abstract
Machine Learning (ML) has already fundamentally changed several businesses. More recently, it has also been profoundly impacting the computational science and engineering domains, such as geoscience, climate science, and health science. In these domains, users need to perform comprehensive data analyses that combine scientific data and ML models to meet critical requirements, such as reproducibility, model explainability, and experiment data understanding. However, scientific ML is multidisciplinary, heterogeneous, and affected by the physical constraints of the domain, making such analyses even more challenging. In this work, we leverage workflow provenance techniques to build a holistic view to support the lifecycle of scientific ML. We contribute with (i) a characterization of the lifecycle and a taxonomy for data analyses; (ii) design principles to build this view, with a W3C PROV compliant data representation and a reference system architecture (a minimal provenance-capture sketch follows this entry); and (iii) lessons learned from an evaluation in an Oil & Gas case using an HPC cluster with 393 nodes and 946 GPUs. The experiments show that the principles enable queries that integrate domain semantics with ML models while keeping overhead low.
- Published
- 2020
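Since the entry above centres on W3C PROV-compliant provenance for the ML lifecycle, here is a minimal, generic sketch, not the authors' system, that uses the Python `prov` package to record that a trained model was generated by a training activity from a dataset; the namespace and all identifiers are illustrative.

```python
# Minimal W3C PROV sketch for an ML training step, using the `prov` package
# (pip install prov). All identifiers below are illustrative placeholders.
from prov.model import ProvDocument

doc = ProvDocument()
doc.add_namespace("ex", "http://example.org/ml#")

dataset = doc.entity("ex:seismic_dataset_v1")          # input data
model = doc.entity("ex:trained_model_v1")              # output artifact
training = doc.activity("ex:training_run_42")          # the training activity
scientist = doc.agent("ex:data_scientist")             # who ran it

doc.used(training, dataset)                            # activity consumed the data
doc.wasGeneratedBy(model, training)                    # model produced by the run
doc.wasAssociatedWith(training, scientist)             # run attributed to the agent
doc.wasDerivedFrom(model, dataset)                     # lineage from data to model

print(doc.serialize(indent=2))                         # PROV-JSON serialization
```

Queries over a store of such documents are what allow lineage questions ("which dataset produced this model?") to be answered after the fact.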
8. Nonlinear Time Series Analysis of Complex Systems Using an e-Science Web Framework
- Author
-
Reinaldo R. Rosa, Bruno B. F. Leonor, Asiel Bomfin, and Walter Abrahão dos Santos
- Subjects
Control and Optimization ,Computer science ,business.industry ,Distributed computing ,Big data ,Computational Mechanics ,Complex system ,Statistical and Nonlinear Physics ,Cloud computing ,NoSQL ,computer.software_genre ,Variety (cybernetics) ,Workflow ,e-Science ,Discrete Mathematics and Combinatorics ,Time series ,business ,computer - Abstract
The analysis of time series in the era of Big Data has become a major challenge for computational framework research. Furthermore, in areas of space science that deal with a large variety of data, practical consistency between workload, workflow and cloud computing is crucial. Here, such consistency is provided by an innovative e-Science framework project named Sentinel, which is based on a NoSQL database (MongoDB) and a containerization platform (Docker). This web framework supports researchers doing time series analysis in a cloud environment, where they can easily access, parameterize, initialize and monitor their applications. As a case study in the Brazilian Space Weather Program, we consider the intensive analysis of time series from a complex information system for solar activity monitoring and forecasting. As a prototype for implementing the framework, the DFA (detrended fluctuation analysis) technique was used as a nonlinear spectrum analyzer applied to solar irradiance measurements from 1978 to 2012 (a minimal DFA sketch follows this entry). Moreover, new applications can easily be added and managed by researchers on the portal to complement their data analysis purposes.
- Published
- 2018
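The prototype in the entry above applies detrended fluctuation analysis (DFA) to solar irradiance series. As a self-contained illustration of the technique itself, not of the Sentinel framework, the sketch below estimates the DFA scaling exponent of a synthetic series with NumPy; the scale list and polynomial order are typical choices, not values from the paper.

```python
# Detrended fluctuation analysis (DFA) in plain NumPy: estimate the scaling
# exponent alpha from the slope of log F(n) versus log n. Synthetic data only.
import numpy as np

def dfa_exponent(x, scales=(16, 32, 64, 128, 256), order=1):
    x = np.asarray(x, dtype=float)
    profile = np.cumsum(x - x.mean())           # integrated, mean-subtracted series
    fluctuations = []
    for n in scales:
        n_windows = len(profile) // n
        segments = profile[: n_windows * n].reshape(n_windows, n)
        t = np.arange(n)
        rms = []
        for seg in segments:
            coeffs = np.polyfit(t, seg, order)  # local polynomial trend
            trend = np.polyval(coeffs, t)
            rms.append(np.sqrt(np.mean((seg - trend) ** 2)))
        fluctuations.append(np.mean(rms))
    # Slope of the log-log relation F(n) ~ n^alpha
    alpha, _ = np.polyfit(np.log(scales), np.log(fluctuations), 1)
    return alpha

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    white_noise = rng.standard_normal(4096)
    print("alpha (white noise, expected ~0.5):", round(dfa_exponent(white_noise), 2))
```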
9. Applications of provenance in performance prediction and data storage optimisation
- Author
-
Simon Woodman, Paul Watson, and Hugo Hiden
- Subjects
0301 basic medicine ,Graph database ,Exploit ,Database ,Computer Networks and Communications ,business.industry ,Computer science ,Cloud computing ,02 engineering and technology ,computer.software_genre ,Data science ,03 medical and health sciences ,030104 developmental biology ,Workflow ,Hardware and Architecture ,Computer data storage ,e-Science ,0202 electrical engineering, electronic engineering, information engineering ,Performance prediction ,020201 artificial intelligence & image processing ,business ,computer ,Software - Abstract
Accurate and comprehensive storage of provenance information is a basic requirement for modern scientific computing. A significant effort in recent years has developed robust theories and standards for the representation of these traces across a variety of execution platforms. Whilst these are necessary to enable repeatability, they do not exploit the captured information to its full potential. This data is increasingly being captured from applications hosted on cloud computing platforms, which offer large-scale computing resources without significant up-front costs. Medical applications, which generate large datasets, are also suited to cloud computing, as the practicalities of storing and processing such data locally are becoming increasingly challenging. This paper shows how provenance can be captured from medical applications, stored using a graph database and then used to answer audit questions and enable repeatability (see the sketch following this entry). This static provenance will then be combined with performance data to predict future workloads, inform decision makers and reduce latency. Finally, cost models based on real-world cloud computing costs will be used to determine optimum strategies for data retention over potentially extended periods of time.
- Published
- 2017
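The entry above stores provenance in a graph database and combines it with performance data to predict workloads. A hedged sketch of that idea follows, using the official `neo4j` Python driver; the `:Run` node label, its properties, the connection details and the simple mean-based prediction are all assumptions made for illustration, not the paper's schema.

```python
# Illustrative sketch: record workflow-run provenance in Neo4j and use the
# stored runtimes to predict the duration of the next run of a service.
# The node label, properties, and credentials below are assumptions.
from neo4j import GraphDatabase

driver = GraphDatabase.driver("bolt://localhost:7687", auth=("neo4j", "password"))

def record_run(tx, service, input_size_mb, seconds):
    tx.run(
        "CREATE (:Run {service: $service, input_mb: $mb, seconds: $secs})",
        service=service, mb=input_size_mb, secs=seconds,
    )

def predict_runtime(tx, service):
    # Naive prediction: average of previously observed runtimes.
    result = tx.run(
        "MATCH (r:Run {service: $service}) RETURN avg(r.seconds) AS mean_secs",
        service=service,
    )
    return result.single()["mean_secs"]

with driver.session() as session:
    session.execute_write(record_run, "image-segmentation", 512, 431.0)
    estimate = session.execute_read(predict_runtime, "image-segmentation")
    print(f"Predicted runtime: {estimate:.0f} s")

driver.close()
```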
10. Usage Patterns of Wideband Display Environments In e-Science Research, Development and Training
- Author
-
Jared H. McLean, Lance Long, Luc Renambot, Arthur Nishimoto, John H. R. Burns, Jason Leigh, Francis Ray Cristobal, Jason H. Haga, Alberto Gonzalez, Roberto Pelayo, Dylan Kobayashi, Troy Wooton, Nurit Kirshenbaum, Maxine D. Brown, Andrew Burks, Andrew Johnson, Mahdi Belcaid, and Krishna Bharadwaj
- Subjects
Multimedia ,business.industry ,Computer science ,computer.software_genre ,Human-centered computing ,Software ,Cyberinfrastructure ,Computer-supported cooperative work ,Scalability ,e-Science ,Leverage (statistics) ,Graphics ,business ,computer - Abstract
SAGE (the Scalable Adaptive Graphics Environment) and its successor SAGE2 (the Scalable Amplified Group Environment) are operating systems for managing content across wideband display environments. This paper documents the prevalent usage patterns of SAGE-enabled display walls in support of the e-Science enterprise, based on nearly 15 years of observations of the SAGE community. These patterns will help guide e-Science users and cyberinfrastructure developers on how best to leverage large tiled display walls, and the types of software services that could be provided in the future.
- Published
- 2019
11. Co-citation analysis of literature in e-science and e-infrastructures
- Author
-
Peter Matthew, Jianhua Hou, Navonil Mustafee, Simon J. E. Taylor, and Nik Bessis
- Subjects
Computer Networks and Communications ,business.industry ,Computer science ,e-Infrastructure ,e-Science ,E infrastructure ,Cloud computing ,computer.software_genre ,Data science ,grid computing ,Co-citation ,Computer Science Applications ,Theoretical Computer Science ,Computational Theory and Mathematics ,Grid computing ,Co-citation analysis ,desktop grid computing ,business ,computer ,Software - Published
- 2019
12. Scientific workflows
- Author
-
Anna-Lena Lamprecht and Kenneth J. Turner
- Subjects
0301 basic medicine ,Computer science ,computer.internet_protocol ,business.industry ,020207 software engineering ,Sample (statistics) ,02 engineering and technology ,Service-oriented architecture ,Workflow model ,03 medical and health sciences ,030104 developmental biology ,Workflow ,e-Science ,0202 electrical engineering, electronic engineering, information engineering ,Special section ,Software engineering ,business ,computer ,Software ,Information Systems - Abstract
This article commences the special section on scientific workflows. A brief history is given of workflow approaches in general. More detail is given of scientific workflows, including sample applications. Challenges and research issues are then identified. Finally, an overview is given of the articles appearing in this special section.
- Published
- 2016
13. SECURITY PROVISION ISSUES OF E-SCIENCE
- Author
-
Tahmasib Fataliyev
- Subjects
Cloud computing security ,e-Science ,Network security policy ,Business ,Computer security ,computer.software_genre ,computer - Published
- 2016
14. A Search Space Exploration Framework for e-Science Applications
- Author
-
Marco A. S. Netto, Eric B. Gauch, Bruno E. C. Milanesi, Renato L. F. Cunha, and Bruno Silva
- Subjects
Set (abstract data type) ,Computer science ,Process (engineering) ,Distributed computing ,Component (UML) ,e-Science ,Plug-in ,computer.software_genre ,Supercomputer ,computer ,Space exploration ,Parametric statistics - Abstract
High Performance Computing (HPC) has always been a fundamental component of scientific experiments. Model calibrations and simulations often require several executions of scientific applications with different input parameters. This process is common practice in research, even though it is a tedious and error-prone task. In this paper we propose the Copper framework, which employs a black-box strategy and contains a set of plugins to accelerate user experiments that explore the search spaces of HPC parametric applications (a generic sweep sketch follows this entry). Copper has been used to conduct scientific experiments in different areas including agriculture, oil and gas, flood simulation, and bioinformatics.
- Published
- 2018
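The entry above describes black-box exploration of input-parameter spaces for HPC applications. The following generic sketch, which is not Copper's actual API, sweeps a small Cartesian parameter grid and launches one run per combination; the solver executable name and its flags are placeholders.

```python
# Generic black-box parameter sweep: run one job per point of a small
# Cartesian parameter grid. The solver name and its flags are placeholders.
import itertools
import subprocess

parameter_grid = {
    "viscosity": [0.01, 0.05, 0.1],
    "mesh_size": [64, 128],
    "time_step": [0.001, 0.0005],
}

def run_point(point):
    # Build a command line for one parameter combination; in a real HPC
    # setting this would typically be wrapped in a batch-scheduler submission.
    cmd = ["./flood_solver"] + [f"--{k}={v}" for k, v in point.items()]
    completed = subprocess.run(cmd, capture_output=True, text=True)
    return point, completed.returncode

if __name__ == "__main__":
    keys = list(parameter_grid)
    for values in itertools.product(*(parameter_grid[k] for k in keys)):
        point = dict(zip(keys, values))
        _, rc = run_point(point)
        print(point, "->", "ok" if rc == 0 else f"failed ({rc})")
```

A framework such as the one described would add plugins on top of this loop, for example pruning uninteresting regions of the search space instead of enumerating every combination.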
15. E-Science Indonesia: Innovation, Integration, and Utilization: Explanatory Study members in Central Java: Muria Kudus University, Islamic University of Nahdlatul Ulama Jepara, and Institute of Health Science PKU Muhammadiyah Solo
- Author
-
Albertoes Pramoekti Narendra
- Subjects
Java ,Health science ,e-Science ,Library science ,Islam ,Sociology ,computer ,computer.programming_language - Published
- 2018
16. Semantic Web-Based Framework for Scientific Workflows in E-Science
- Author
-
Singanamalla Vijayakumar, Nagaraju Dasari, Bharath Bhushan, and Rajasekhar Reddy
- Subjects
World Wide Web ,Semantic grid ,Workflow ,Database ,Computer science ,Semantic computing ,e-Science ,Semantic Web Stack ,computer.software_genre ,computer ,Semantic Web - Abstract
In the coming generation, computer science will play a prominent role in scientific research. Developments in computing will benefit the scientific community by enabling data sharing, service computing, framework building and much more. E-Science is an actively expanding field, driven by the growth of data and tools. The proposed work discusses the use of semantic web applications for identifying the components needed in the development of scientific workflows. The main objective is to develop a framework that assists the scientific community in testing and deploying scientific experiments with the help of ontologies, service repositories, web services and scientific workflows (a small RDF/SPARQL sketch follows this entry). The framework aims to support scientific results and the management of applications related to a specific domain. The overall goal of this research is to automate the use of semantic web services, generate workflows, manage search services and manage ontologies by considering web service composition.
- Published
- 2018
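To make the entry above concrete, here is a small, hypothetical sketch of the kind of semantic component it describes: a few RDF triples describing services and a SPARQL query that discovers candidate services for a workflow step, using the Python `rdflib` package. The `ex:` vocabulary and the service names are invented for illustration.

```python
# Hypothetical sketch: describe services as RDF and discover them with SPARQL
# (pip install rdflib). The ex: vocabulary is invented for illustration.
from rdflib import Graph, Literal, Namespace, RDF

EX = Namespace("http://example.org/escience#")
g = Graph()
g.bind("ex", EX)

# Two registered services and the workflow tasks they support.
g.add((EX.BlastService, RDF.type, EX.Service))
g.add((EX.BlastService, EX.supportsTask, Literal("sequence-alignment")))
g.add((EX.AlignViz, RDF.type, EX.Service))
g.add((EX.AlignViz, EX.supportsTask, Literal("visualisation")))

query = """
PREFIX ex: <http://example.org/escience#>
SELECT ?service WHERE {
  ?service a ex:Service ;
           ex:supportsTask "sequence-alignment" .
}
"""

for row in g.query(query):
    print("Candidate service for the alignment step:", row.service)
```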
17. A self-organized volunteer Cloud for e-Science
- Author
-
Mohamed Jemni, Walid Saad, Christophe Cérin, and Heithem Abbes
- Subjects
Cloud computing security ,business.industry ,Computer science ,Data management ,Distributed computing ,Services computing ,020206 networking & telecommunications ,Cloud computing ,02 engineering and technology ,Grid ,computer.software_genre ,Theoretical Computer Science ,System administrator ,Grid computing ,Hardware and Architecture ,Cloud testing ,e-Science ,0202 electrical engineering, electronic engineering, information engineering ,020201 artificial intelligence & image processing ,business ,computer ,Software ,Information Systems - Abstract
Nowadays, the adoption of cloud computing platforms and service computing technologies is almost natural for the different e-Science communities. Cost benefits for data-intensive applications, ease of access, and a rich and varied offering of services are examples of the positive returns reported by users. However, beyond this favorable welcome for the technology, some research problems remain and are still challenging. In this paper, we focus on the problems of automatically deploying IaaS for computing and for data management, using the SlapOS Cloud. The core of the system is a distributed protocol for orchestrating data and compute nodes. Using this interaction scheme, users are able to deploy, without any system administrator intervention, a PaaS inside the IaaS, basically a Desktop Grid middleware. The aim of this paper is to demonstrate that the Desktop Grid and Cloud paradigms may merge and may be widely used by non-experts in the different areas of e-Science. We propose a fully self-organized volunteer Cloud for researchers where they can carry out e-Science experiments and process large amounts of data in a coherent way.
- Published
- 2015
18. IMP Science Gateway: from the Portal to the Hub of Virtual Experimental Labs in e-Science and Multiscale Courses in e-Learning
- Author
-
Elena Zasimchuk, L. V. Bekenov, Yuri Gordienko, Sergii Stirenko, Olexander Gatsenko, and Olexandra Baskova
- Subjects
Service (systems architecture) ,Computer Networks and Communications ,Computer science ,Grid ,computer.software_genre ,Computer Science Applications ,Theoretical Computer Science ,World Wide Web ,Workflow ,Computational Theory and Mathematics ,Grid computing ,e-Science ,Component-based software engineering ,Web service ,computer ,Software - Abstract
‘Science gateway’ (SG) ideology means a user-friendly, intuitive interface between scientists (or scientific communities) and different software components and distributed computing infrastructures (DCIs), where researchers can focus on their scientific goals and less on the peculiarities of the software/DCI. The G. V. Kurdyumov Institute for Metal Physics ‘IMP Science Gateway Portal’ (http://scigate.imp.kiev.ua) is presented for complex workflow management and integration of distributed computing resources (such as clusters, service grids, desktop grids, and clouds). It is built on the Web Service – Parallel Grid Run-time and Application Development Environment (WS-PGRADE) and gUSE (grid and cloud User Support Environment) technologies, where WS-PGRADE is designed for scientific workflow operation and gUSE for smooth integration of available resources for parallel and distributed computing in various heterogeneous DCIs. Some use cases (scientific workflows) are considered for molecular dynamics simulations of the complex behavior of various nanostructures. The modular approach allows scientists to use SG portals as research hubs for various virtual experimental labs in the context of practical applications in materials science, physics, and nanotechnology. In addition, workflows and their components are proposed as Lego-style construction units for learning modules of various scales in duration, complexity, target audience, and so on. These workflows can also be used in e-Learning infrastructures as constituent elements of learning hubs for the management of learning content, tools, resources, and users in regular, vocational, lifelong, and informal learning.
- Published
- 2015
19. ReputationNet: Reputation-Based Service Recommendation for e-Science
- Author
-
Shiping Chen, Carole Goble, Surya Nepal, Wei Tan, Jia Zhang, Jinhui Yao, and David De Roure
- Subjects
Service (business) ,Service system ,Information Systems and Management ,Computer Networks and Communications ,Service delivery framework ,business.industry ,Computer science ,computer.internet_protocol ,Service design ,Services computing ,Service-oriented architecture ,Computer Science Applications ,World Wide Web ,Workflow ,Hardware and Architecture ,e-Science ,business ,computer - Abstract
In the paradigm of service-oriented science, scientific computing applications and data are all wrapped as web-accessible services. Scientific workflows further integrate these services to answer complex research questions. However, our earlier study conducted on myExperiment revealed that although the sharing of service-based capabilities opens a gateway to resource reuse, in practice the degree of reuse is very low. This finding motivated us to propose ServiceMap, which provides navigation through the network of services to facilitate the design and development of scientific workflows. This paper proposes ReputationNet as an enhancement of ServiceMap that incorporates the often-ignored reputation aspects of services/workflows and their publishers, in order to offer better service and workflow recommendations. We have developed a novel model to reflect the reputation of e-Science services/workflows, and heuristic algorithms to provide service recommendations based on reputation. Experiments on myExperiment show a strong positive correlation (Pearson correlation coefficient 0.82) between the computed reputation scores and the actual performance (i.e., usage frequency) of the services/workflows (see the correlation sketch following this entry), which demonstrates the effectiveness of our approach.
- Published
- 2015
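The evaluation in the entry above reports a Pearson correlation of 0.82 between reputation scores and usage frequency. The short sketch below shows how such a check can be computed with SciPy; the toy reputation scores and frequencies are made up, not myExperiment data.

```python
# Sketch of the correlation check reported above: Pearson correlation between
# computed reputation scores and observed usage frequencies (toy numbers only).
from scipy.stats import pearsonr

reputation_scores = [0.91, 0.72, 0.65, 0.40, 0.88, 0.55, 0.30, 0.79]
usage_frequency   = [120,  64,   58,   21,   97,   40,   15,   70]

r, p_value = pearsonr(reputation_scores, usage_frequency)
print(f"Pearson r = {r:.2f} (p = {p_value:.3g})")
# A strong positive r (e.g. around 0.8) would indicate that higher-reputation
# services/workflows are indeed the more frequently used ones.
```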
20. Challenges of big data in science researches
- Author
-
Gang Chen
- Subjects
Multidisciplinary ,business.industry ,Emerging technologies ,Data management ,Big data ,Cloud computing ,computer.software_genre ,Data science ,Grid computing ,e-Science ,Data mining ,business ,Worldwide LHC Computing Grid ,Cloud storage ,computer - Abstract
New scientific studies are moving into the era of big science, with large facilities and global collaborations. Large amounts of data are being produced by large research facilities, generating challenges regarding data collection, processing, storage, and distribution. This study discusses some of the data challenges facing large scientific research projects and the practices used to meet them. High-energy physics is one such research area, generating tens of petabytes of data per year. The data of high-energy physics experiments must be properly collected, securely stored, and distributed to and processed in laboratories around the world. The high-energy physics community has developed the Worldwide LHC Computing Grid (WLCG) system, which successfully supports data processing and analysis for the Large Hadron Collider experiments. The WLCG also successfully supports many other scientific disciplines. New technologies based on software-defined networks and cloud storage have been developed to provide new services for data management. The final section of the paper discusses the application of cloud computing in scientific computing.
- Published
- 2015
21. A semi-automated workflow for biodiversity data retrieval, cleaning, and quality control
- Author
-
Robert Haines, Saverio Vicario, Carole Goble, Matthias Obst, Cherian Mathew, Yde de Jong, Anton Güntsch, Alan Williams, and Experimental Plant Systematics (IBED, FNWI)
- Subjects
0106 biological sciences ,Service (systems architecture) ,Computer science ,Software Description ,Biodiversity informatics ,computer.software_genre ,010603 evolutionary biology ,01 natural sciences ,Workflow engine ,Workflow technology ,03 medical and health sciences ,Data retrieval ,Journal Article ,lcsh:QH301-705.5 ,Ecology, Evolution, Behavior and Systematics ,data cleaning ,030304 developmental biology ,Data Management ,0303 health sciences ,Ecology ,Database ,e-Science ,Data science ,Data Analysis & Modelling ,Workflow ,web services ,lcsh:Biology (General) ,service oriented architecture ,workflows ,Web service ,biodiversity informatics ,computer ,Workflow management system - Abstract
The compilation and cleaning of data needed for analyses and prediction of species distributions is a time-consuming process requiring a solid understanding of data formats and of the service APIs provided by biodiversity informatics infrastructures. We designed and implemented a Taverna-based Data Refinement Workflow which integrates taxonomic data retrieval, data cleaning, and data selection into a consistent, standards-based, and effective system hiding the complexity of the underlying service infrastructures. The workflow can be freely used both locally and through a web portal which does not require additional software installations by users.
- Published
- 2014
22. Managing Workflows on top of a Cloud Computing Orchestrator for using heterogeneous environments on e-Science
- Author
-
Rodrigo Jardim, Miguel Caballer, Alberto M. R. Dávila, Nelson Kotowski, Abel Carrión, and Ignacio Blanquer
- Subjects
Computer Networks and Communications ,Computer science ,Best practice ,Distributed computing ,Cloud computing ,02 engineering and technology ,computer.software_genre ,Workflow ,Multi-platform ,0202 electrical engineering, electronic engineering, information engineering ,CIENCIAS DE LA COMPUTACION E INTELIGENCIA ARTIFICIAL ,Database ,business.industry ,Comparative genomics ,Cloud orchestrator ,e-Science ,Workflow management systems ,020206 networking & telecommunications ,Workload ,Cloud Computing ,Elasticity (cloud computing) ,Orchestration ,020201 artificial intelligence & image processing ,business ,computer ,Software ,Workflow management system - Abstract
Scientific workflows (SWFs) are widely used to model processes in e-Science. SWFs are executed by means of workflow management systems (WMSs), which orchestrate the workload on top of computing infrastructures. The advent of cloud computing infrastructures has opened the door to using on-demand infrastructures to complement or even replace local infrastructures. However, new issues have arisen, such as the integration of hybrid resources and the compromise between infrastructure reutilisation and elasticity. In this article, we present an ad hoc solution for managing workflows that exploits the capabilities of cloud orchestrators to deploy resources on demand according to the workload and to combine heterogeneous cloud providers (such as on-premise clouds and public clouds) and traditional infrastructures (clusters) to minimise costs and response time (a toy placement sketch follows this entry). The work does not propose yet another WMS but demonstrates the benefits of integrating cloud orchestration when running complex workflows. The article reports several configuration experiments with a realistic comparative genomics workflow called Orthosearch, which migrate memory-intensive workload to public infrastructures while keeping other blocks of the experiment running locally. The article computes running time and cost, suggesting best practices. The authors acknowledge the support of the EUBrazilCC project, funded by the European Commission (STREP 614048) and the Brazilian MCT/CNPq N. 13/2012, for the use of its infrastructure, and thank the Spanish 'Ministerio de Economia y Competitividad' for the project 'Clusters Virtuales Elasticos y Migrables sobre Infraestructuras Cloud Hibridas' (reference TIN2013-44390-R).
- Published
- 2017
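The entry above is about deciding where to place workflow blocks so that cost and response time stay low. As a highly simplified, self-contained illustration of that trade-off, not the article's actual scheduler, the sketch below picks the cheapest resource for a block that still meets a deadline; all runtimes and hourly prices are invented.

```python
# Toy placement decision for one workflow block: choose the cheapest resource
# that still meets the deadline. All runtimes and hourly prices are invented.
from dataclasses import dataclass

@dataclass
class Resource:
    name: str
    est_runtime_h: float   # estimated runtime of the block on this resource
    price_per_h: float     # 0.0 models an already-paid on-premise cluster

def place_block(resources, deadline_h):
    feasible = [r for r in resources if r.est_runtime_h <= deadline_h]
    if not feasible:
        return None
    # Cheapest feasible option; ties broken by the shorter runtime.
    return min(feasible, key=lambda r: (r.est_runtime_h * r.price_per_h, r.est_runtime_h))

if __name__ == "__main__":
    options = [
        Resource("local-cluster", est_runtime_h=10.0, price_per_h=0.0),
        Resource("public-cloud-small", est_runtime_h=6.0, price_per_h=0.8),
        Resource("public-cloud-large-mem", est_runtime_h=3.0, price_per_h=2.5),
    ]
    choice = place_block(options, deadline_h=8.0)
    print("Run the memory-intensive block on:", choice.name if choice else "no feasible resource")
```

A real orchestrator would refine the runtime estimates from monitoring data and also account for data-transfer time between the local and public parts of the workflow.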
23. AstroTaverna—Building workflows with Virtual Observatory services
- Author
-
J. D. Santander-Vela, Julián Garrido, Lourdes Verdes-Montenegro, José Enrique Ruiz, S. Sánchez-Expósito, European Commission, Universidad de Granada, Ministerio de Ciencia e Innovación (España), and Junta de Andalucía
- Subjects
Astroinformatics ,Computer science ,Astronomy and Astrophysics ,Virtual observatory ,Methods: miscellaneous ,computer.software_genre ,Computer Science Applications ,Astronomical databases: miscellaneous ,World Wide Web ,Workflow ,Documentation ,Space and Planetary Science ,E-Science ,Scientific workflows ,e-Science ,miscellaneous [Astronomical databases] ,miscellaneous [Methods] ,Plug-in ,Virtual observatories ,Web service ,computer ,Workflow management system - Abstract
Special issue on The Virtual Observatory: I, edited by Robert Hanisch. Despite the long tradition of publishing digital datasets in astronomy, and the existence of a rich network of services providing astronomical datasets in standardized interoperable formats through the Virtual Observatory (VO), there has been little use of scientific workflow technologies in this field. In this paper we present AstroTaverna, a plugin that we have developed for the Taverna Workbench scientific workflow management system. It integrates existing VO web services as first-class building blocks in Taverna workflows (a generic VO-query sketch follows this entry), allowing the digital capture of otherwise lost procedural steps performed manually in, e.g., GUI tools, and providing reproducibility and re-use. It improves the readability of digital VO recipes with a comprehensive view of the entire automated execution process, complementing the scarce narratives produced in classic documentation practices and transforming them into living tutorials for an efficient use of the VO infrastructure. The plugin also adds astronomical data manipulation and transformation tools based on the STIL Tool Set and the integration of the Aladin VO software, as well as interactive connectivity with SAMP-compliant astronomy tools. AstroTaverna has been developed in the framework of the Wf4Ever Project (270129), funded under EU FP7 Digital Libraries and Digital Preservation (ICT-2009.4.1), which leverages workflow technology in order to preserve scientific methodology, facilitating the reuse and exchange of digital knowledge and enabling the inspection of the reproducibility of scientific investigation results. Part of the improvements is being undertaken in the framework of the CANUBE Project (CEI2013-P-14), an Open Science project granted by the Second Call for Proposals of the Bio-TIC Campus of International Excellence of the University of Granada, in Spain. This work has also been supported by Grant AYA2011-30491-C02-01, co-financed by MICINN and FEDER funds, and by the Junta de Andalucía (Spain) Grants P08-FQM-4205 and TIC-114.
- Published
- 2014
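The entry above integrates Virtual Observatory web services into workflows. Outside Taverna, the same kind of VO access can be scripted; the sketch below uses the `pyvo` package to send an ADQL query to a TAP service. The service URL and the table and column names are placeholders (assumptions), to be replaced with a real VO endpoint and schema.

```python
# Hedged sketch of programmatic Virtual Observatory access with pyvo
# (pip install pyvo). The TAP endpoint and ADQL table below are placeholders;
# substitute a real VO TAP service and schema before running.
import pyvo

TAP_URL = "https://example.org/tap"   # placeholder VO TAP endpoint

def query_sources(max_rows=10):
    service = pyvo.dal.TAPService(TAP_URL)
    adql = f"""
        SELECT TOP {max_rows} ra, dec
        FROM some_catalogue.sources
        WHERE dec > 0
    """
    return service.search(adql).to_table()   # astropy Table of results

if __name__ == "__main__":
    table = query_sources()
    print(table)
```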
24. Uses of online geoprocessing technology in analyses and case studies: a systematic analysis of literature
- Author
-
Barbara Hofer
- Subjects
Geographic information system ,Computer science ,computer.internet_protocol ,business.industry ,Geoprocessing ,Service-oriented architecture ,computer.software_genre ,Data science ,Field (computer science) ,Computer Science Applications ,Set (abstract data type) ,Workflow ,e-Science ,General Earth and Planetary Sciences ,Web service ,business ,computer ,Software - Abstract
Interpreting spatial data to derive information is a core task in the field of Geographic Information Science and Technology. A logical step following the collection of data in online repositories is to provide geoprocessing technology for analysing the data online. Online geoprocessing technology can be employed for providing a specified set of tools in a theme-specific platform, documenting a model or workflow and making it widely available, automating recurring tasks, or offering simple tools to a large user group. This systematic analysis of the literature evaluates the extent to which available online geoprocessing tools are being used to answer questions in specific application contexts. An initial set of articles is derived from a keyword-based search in the database Scopus. This set of articles is manually filtered to identify applications of online geoprocessing tools. The analysis of application-related articles shows that virtually all applications require further development of tools. Experts outside the spat...
- Published
- 2014
25. Provision of an integrated data analysis platform for computational neuroscience experiments
- Author
-
Andrew Branson, Jetendr Shamdasani, Khawar Hasham, Saad Liaquat Kiani, Kamran Munir, and Richard McClatchey
- Subjects
Computational neuroscience ,General Computer Science ,Process (engineering) ,Computer science ,Search engine indexing ,Neuroinformatics ,computer.software_genre ,Data science ,Workflow ,e-Science ,Information system ,computer ,Information Systems ,Data integration - Abstract
Purpose – The purpose of this paper is to provide an integrated analysis base to facilitate computational neuroscience experiments, following a user-led approach to provide access to the integrated neuroscience data and to enable the analyses demanded by the biomedical research community. Design/methodology/approach – The design and development of the N4U analysis base and related information services addresses the existing research and practical challenges by offering an integrated medical data analysis environment with the necessary building blocks for neuroscientists to optimally exploit neuroscience workflows, large image data sets and algorithms to conduct analyses. Findings – The provision of an integrated e-science environment of computational neuroimaging can enhance the prospects, speed and utility of the data analysis process for neurodegenerative diseases. Originality/value – The N4U analysis base enables conducting biomedical data analyses by indexing and interlinking the neuroimaging and clinical study data sets stored on the grid infrastructure, algorithms and scientific workflow definitions along with their associated provenance information.
- Published
- 2014
26. Ten Years of SkyServer I: Tracking Web and SQL e-Science Usage
- Author
-
Aniruddha R. Thakar, M. Jordan Raddick, Alexander S. Szalay, and Rafael Santos
- Subjects
SQL ,General Computer Science ,business.industry ,Relational database ,Computer science ,InformationSystems_INFORMATIONSTORAGEANDRETRIEVAL ,General Engineering ,InformationSystems_DATABASEMANAGEMENT ,Terabyte ,Database design ,Data modeling ,World Wide Web ,e-Science ,The Internet ,business ,computer ,computer.programming_language - Abstract
SkyServer is the primary catalog data portal of the Sloan Digital Sky Survey, making multiple terabytes of astronomy data available to the world. Here, the process of collecting and analyzing the complete record of more than 10 years of Web hits and SQL queries to SkyServer is described (an example query sketch follows this entry).
- Published
- 2014
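Since the entry above analyses SQL usage on SkyServer, here is a small sketch of issuing such a query programmatically with `astroquery`'s SDSS module; the specific columns and magnitude cut are only an example, and the exact schema depends on the SDSS data release.

```python
# Sketch: send a SkyServer-style SQL query to the SDSS catalogue with astroquery
# (pip install astroquery). The columns and cuts are only an example.
from astroquery.sdss import SDSS

sql = """
SELECT TOP 10
    p.objID, p.ra, p.dec, p.r
FROM PhotoObj AS p
WHERE p.r BETWEEN 15 AND 17
"""

results = SDSS.query_sql(sql)   # returns an astropy Table (or None if empty)
if results is not None:
    print(results)
```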
27. E-science infrastructures for molecular modeling and parametrization
- Author
-
Ning Shen, Ye Fan, and Sudhakar Pamidighantam
- Subjects
Service (systems architecture) ,General Computer Science ,business.industry ,Computer science ,Data management ,Application software ,computer.software_genre ,Theoretical Computer Science ,Cyberinfrastructure ,Workflow ,Modeling and Simulation ,Middleware (distributed applications) ,e-Science ,Operating system ,Web service ,business ,Software engineering ,computer - Abstract
E-science infrastructures are becoming the essential tools for computational scientific research. In this paper, we describe two e-science infrastructures: Science and Engineering Applications Grid (SEAGrid) and molecular modeling and parametrization (ParamChem). The SEAGrid is a virtual organization with a diverse set of hardware and software resources and provides services to access such resources in a routine and transparent manner. These essential services include allocations of computational resources, client-side application interfaces, computational job and data management tools, and consulting activities. ParamChem is another e-science project dedicated for molecular force-field parametrization based on both ab-initio and molecular mechanics calculations on high performance computers (HPCs) driven by scientific workflow middleware services. Both the projects share a similar three-tier computational infrastructure that consists of a front-end client, a middleware web services layer, and a remote HPC computational layer. The client is a Java Swing desktop application with components for pre- and post-data processing, communications with middleware server and local data management. The middleware service is based on Axis2 web service and MySQL relational database, which provides functionalities for user authentication and session control, HPC resource information collections, discovery and matching, job information logging and notification. It can also be integrated with scientific workflow to manage computations on HPC resources. The grid credentials for accessing HPCs are delegated through MyProxy infrastructure. Currently SEAGrid has integrated several popular application software suites such as Gaussian for quantum chemistry, NAMD for molecular dynamics and engineering software such as Abacus for mechanical engineering. ParamChem has integrated CGenFF (CHARMM General Force-Field) for molecular force-field parametrization of drug-like molecules. Long-term storage of user data is handled by tertiary data archival mechanisms. SEAGrid science gateway serves more than 500 users while more than 1000 users use ParamChem services such as atom typing and initial force-field parameter guess at present.
- Published
- 2014
28. Ten Years of SkyServer II: How Astronomers and the Public Have Embraced e-Science
- Author
-
Alexander S. Szalay, Aniruddha R. Thakar, M. Jordan Raddick, and Rafael Santos
- Subjects
World Wide Web ,SQL ,General Computer Science ,Web mining ,business.industry ,Research community ,e-Science ,General Engineering ,The Internet ,Sociology ,business ,computer ,computer.programming_language - Abstract
A comprehensive analysis of 10 years of Web and SQL traffic on SkyServer--the online portal to the multiterabyte Sloan Digital Sky Survey archive--shows the impressive reach of the SDSS to the research community and the public, and provides insight into how methods of e-science are being taken up by the scientific community.
- Published
- 2014
29. A data-centric neuroscience gateway: design, implementation, and experiences
- Author
-
Matthan W.A. Caan, Mohammad Mahdi Jaghoori, Shayan Shahand, Antoine H. C. van Kampen, Jordi Huguet, Ammar Benabdelkader, Mostapha al Mourabit, and Silvia D. Olabarriaga
- Subjects
Computer Networks and Communications ,Computer science ,Gateway (computer program) ,computer.software_genre ,Grid ,Database-centric architecture ,Computer Science Applications ,Theoretical Computer Science ,World Wide Web ,Computational Theory and Mathematics ,Grid computing ,Server ,Default gateway ,e-Science ,Systems architecture ,computer ,Software - Abstract
Science gateways provide UIs and high-level services to access and manage applications and data collections on distributed resources. They enable users to perform data analysis on distributed computing infrastructures without getting involved in the technical details. The e-BioInfra Gateway is a science gateway for biomedical data analysis on a national grid infrastructure, which has been successfully adopted for neuroscience research. This paper describes the motivation, requirements, and design of a new generation of the e-BioInfra Gateway, which is based on the grid and cloud user support environment (also known as the WS-PGRADE/gUSE framework) and supports heterogeneous infrastructures. The new gateway has been designed to provide additional data and metadata management facilities to access and manage (biomedical) data servers, and to provide data-centric user interaction. We have implemented and deployed the new gateway for the computational neuroscience research community of the Academic Medical Center of the University of Amsterdam. This paper presents the system architecture of the new gateway, highlights the improvements that have been achieved, discusses the choices that we have made, and reflects on them based on initial user feedback.
- Published
- 2014
30. Digital Libraries Applications: CBIR, Education, Social Networks, eScience/Simulation, and GIS
- Author
-
Jonathan P. Leidig and Edward A. Fox
- Subjects
Information Systems and Management ,Geospatial analysis ,Computer Networks and Communications ,Computer science ,InformationSystems_INFORMATIONSTORAGEANDRETRIEVAL ,Network science ,Library and Information Sciences ,Digital library ,computer.software_genre ,World Wide Web ,e-Science ,Geocoding ,computer ,Image retrieval ,Information Systems - Abstract
* Content-Based Image Retrieval
* Education
* Social Networks in Digital Libraries
* eScience and Simulation Digital Libraries
* Geospatial Information
* Bibliography
- Published
- 2014
31. Experiences of the Brazilian national high-performance computing network on the rapid prototyping of science gateways
- Author
-
Bruno F. Bastos, Vivian Medeiros, Antônio Tadeu A. Gomes, and Vinícius Macedo Moreira
- Subjects
Rapid prototyping ,Computer Networks and Communications ,Computer science ,business.industry ,Network on ,computer.software_genre ,Supercomputer ,Computer Science Applications ,Theoretical Computer Science ,Computational Theory and Mathematics ,e-Science ,Operating system ,Software engineering ,business ,computer ,Software - Published
- 2014
32. Addressing structural and dynamic features of scientific social networks through the lens of Actor-Network Theory
- Author
-
Maria Cecília Calani Baranauskas and Alysson Bolognesi Prado
- Subjects
Social network ,Management science ,Actor–network theory ,business.industry ,Computer science ,Communication ,Social software ,Representation (systemics) ,Social web ,computer.software_genre ,Data science ,Computer Science Applications ,Human-Computer Interaction ,e-Science ,Media Technology ,Software design ,business ,computer ,Information Systems ,TRACE (psycholinguistics) - Abstract
Knowledge on the social web presupposes gathering information about its current and potential users and documenting their relationships, interests and needs. A recent branch of sociology, Actor-Network Theory (ANT), states that relations among human and nonhuman actors are equally important for understanding social phenomena. Since scientists are potential users of substantial computational support, their communities provide relevant cases for domain characterization and software design. This paper investigates the possibilities of using ANT to characterize a real instance of those social networks (a minimal graph-analysis sketch follows this entry). The active role of nonhuman actors allows us to trace the relations based on material clues left behind by the actors, and also to bring forth features to be explored by the social software. The results of a structural study offer a graphical representation that allows quantitative and qualitative analysis of the social network, while the temporal evolution case study suggests that cyclic associations are more likely to persist. These outcomes may inform a better design of Web 2.0 systems for those communities.
- Published
- 2013
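The study in the entry above represents humans and nonhumans as actors in a single network and analyses it quantitatively. A minimal, hypothetical sketch of such an analysis with `networkx` follows; the actors and ties are invented, not data from the paper.

```python
# Hypothetical actor-network sketch: humans and nonhumans in one graph,
# analysed with networkx (pip install networkx). The ties are invented.
import networkx as nx

G = nx.Graph()
# Human actors
G.add_nodes_from(["alice", "bob", "carla"], kind="human")
# Nonhuman actors (papers, datasets, instruments) on equal footing, as ANT suggests
G.add_nodes_from(["paper_A", "dataset_X", "telescope_T"], kind="nonhuman")

G.add_edges_from([
    ("alice", "paper_A"), ("bob", "paper_A"),        # co-authorship trace
    ("bob", "dataset_X"), ("carla", "dataset_X"),    # shared data use
    ("carla", "telescope_T"), ("alice", "dataset_X"),
])

centrality = nx.degree_centrality(G)
for actor, score in sorted(centrality.items(), key=lambda kv: -kv[1]):
    print(f"{actor:12s} kind={G.nodes[actor]['kind']:8s} centrality={score:.2f}")
```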
33. Grid-based recording and replay architecture in hybrid remote experiment using distributed streaming network
- Author
-
Jang Ho Lee
- Subjects
Java ,Computer science ,Distributed computing ,GridFTP ,Grid ,computer.software_genre ,Theoretical Computer Science ,Grid computing ,Hardware and Architecture ,Server ,e-Science ,Scalability ,File transfer ,computer ,Software ,Remote laboratory ,Information Systems ,Open Grid Services Architecture ,computer.programming_language - Abstract
We present a grid-based system that enables researchers in civil engineering to conduct hybrid remote experiments as well as to record and replay them. The system has been designed for the Real Time Hybrid Testing Facility in the Korea Construction Engineering Development (KOCED) project and has been implemented based on the Open Grid Services Architecture (OGSA) and the Open System for Earthquake Engineering Simulation (OpenSees) for modeling and computational simulation of structural systems. Users can access the system with a Java client for data visualization and video monitoring through an Internet portal using a web browser. The system runs on a network of distributed NaradaBrokering streaming servers and exploits RFT and GridFTP for reliable file transfer. The main contribution of this paper is a scalable grid-based recording and replay architecture for remote experiments using a distributed streaming network that can support a growing number of clients simultaneously. We also evaluate the performance of the proposed extensible architecture of distributed servers with mathematical modeling as well as simulations, to show how it scales as the number of concurrent clients grows.
- Published
- 2013
34. Geospatial Cyberinfrastructure and Geoprocessing Web—A Review of Commonalities and Differences of E-Science Approaches
- Author
-
Barbara Hofer
- Subjects
geoprocessing web ,Geospatial analysis ,Geographic information system ,Computer science ,business.industry ,Geography, Planning and Development ,geospatial cyberinfrastructure ,e-Science ,lcsh:G1-922 ,Geoprocessing ,computer.software_genre ,Data science ,World Wide Web ,Workflow ,Cyberinfrastructure ,Earth and Planetary Sciences (miscellaneous) ,Computers in Earth Sciences ,Web service ,business ,computer ,Implementation ,lcsh:Geography (General) - Abstract
Online geoprocessing gains momentum through increased online data repositories, web service infrastructures, online modeling capabilities and the required online computational resources. Advantages of online geoprocessing include reuse of data and services, extended collaboration possibilities among scientists, and efficiency thanks to distributed computing facilities. In the field of Geographic Information Science (GIScience), two recent approaches exist that have the goal of supporting science in online environments: the geospatial cyberinfrastructure and the geoprocessing web. Due to its historical development, the geospatial cyberinfrastructure has strengths related to the technologies required for data storage and processing. The geoprocessing web focuses on providing components for model development and sharing. These components shall allow expert users to develop, execute and document geoprocessing workflows in online environments. Despite this difference in the emphasis of the two approaches, the objectives, concepts and technologies they use overlap. This paper provides a review of the definitions and representative implementations of the two approaches. The provided overview clarifies which aspects of e-Science are highlighted in approaches differentiated in the geographic information domain. The discussion of the two approaches leads to the conclusion that synergies in research on e-Science environments shall be extended. Full-fledged e-Science environments will require the integration of approaches with different strengths.
- Published
- 2013
35. GARUDA: Pan-Indian distributed e-infrastructure for compute-data intensive collaborative science
- Author
-
Subrata Chattopadhyay, B. B. Prahlada Rao, N. Sarat Chandra Babu, N. Mangala, and R. Sridharan
- Subjects
Scientific instrument ,Standardization ,Computer science ,Interoperability ,General Medicine ,computer.software_genre ,Grid ,World Wide Web ,Engineering management ,Grid computing ,Middleware (distributed applications) ,e-Science ,Architecture ,computer - Abstract
GARUDA is a nation-wide grid of computational nodes, mass storage and scientific instruments that aims to provide the technological advancements required to enable compute- and data-intensive collaborative applications for the twenty-first century. From a proof of concept, GARUDA has evolved into an operational grid aggregating nearly 70 TF of compute and 15 TB of storage, connected via the high-speed National Knowledge Network, and it hosts a stack of middleware and tools serving hundreds of users from diverse communities such as life science, earth science, computer-aided engineering and materials science. The evolution and confluence of research and technologies have led to the maturity of the GARUDA grid: several hundred CPUs and large data stores have been added, grid middleware has been standardized, interoperability between grids has been investigated, and varied application communities have participated, all of which have had a significant impact on GARUDA. The GARUDA partner institutes are using this e-infrastructure to grid-enable applications of societal and national importance. In this paper the authors present how a nation-wide operational grid was built and evolved, along with its deliverables, architecture and applications.
- Published
- 2013
36. an e-Science project in Astrodynamics and Celestial Mechanics fields
- Author
-
Juan Félix San-Juan and Rosario López
- Subjects
Interpretation (logic) ,Programming language ,Computer science ,business.industry ,General Physics and Astronomy ,Order (ring theory) ,computer.software_genre ,Celestial mechanics ,Software ,Hardware and Architecture ,Human–computer interaction ,e-Science ,Web service ,User interface ,Representation (mathematics) ,business ,computer - Abstract
Astrodynamics Web Tools, AstrodyToolsWeb (http://tastrody.unirioja.es), is an ongoing collaborative web-tools computing infrastructure project which has been specially designed to support scientific computation. AstrodyToolsWeb provides project collaborators with all the technical and human facilities needed to wrap, manage, and use specialized non-commercial software tools in the Astrodynamics and Celestial Mechanics fields, with the aim of optimizing the use of resources, both human and material. The project is also open to collaboration from the whole scientific community, in order to create a library of useful tools and their corresponding theoretical backgrounds. AstrodyToolsWeb offers a user-friendly web interface for choosing applications, entering data, and selecting appropriate constraints in an intuitive and easy way. The application is then executed, in real time whenever possible, and the critical information about program behavior (errors and logs) and output, including the post-processing and interpretation of its results (graphical representation of data, statistical analysis or other manipulation), is shown via the same web interface or can be downloaded to the user’s computer.
- Published
- 2013
37. Scientific geodata infrastructures: challenges, approaches and directions
- Author
-
Christin Henzen, Johannes Brauner, Stephan Mäs, Lars Bernard, and Matthias S. Müller
- Subjects
Spatial data infrastructure ,Geospatial analysis ,Computer science ,Best practice ,Corporate governance ,computer.software_genre ,Data science ,Computer Science Applications ,Environmental studies ,e-Science ,General Earth and Planetary Sciences ,Dissemination ,computer ,Software ,Digital Earth - Abstract
Based on various experiences in developing Geodata Infrastructures (GDIs) for scientific applications, this article proposes the concept of a Scientific GDI that can be used by scientists in the environmental and earth sciences to share and disseminate their research results and the related analysis methods. A Scientific GDI is understood as an approach to tackle the science case in Digital Earth and to further enhance e-science for environmental research. Creating a Scientific GDI that supports the research community in efficiently exchanging data and methods across the various scientific disciplines underpinning environmental studies poses numerous challenges for today's GDI developments. The paper summarizes requirements and recommendations for the publication of scientific geospatial data and for the functionalities to be provided in a Scientific GDI. Best practices and open issues for the governance and policies of a Scientific GDI are discussed, and the paper concludes by deriving a research agenda for the next decade.
- Published
- 2013
38. An Integrated e-Science Analysis Base for Computational Neuroscience Experiments and Analysis
- Author
-
Andrew Branson, Richard McClatchey, Saad Liaquat Kiani, Khawar Hasham, Jetendr Shamdasani, and Kamran Munir
- Subjects
020205 medical informatics ,Computer science ,Emerging technologies ,Data management ,Data analysis ,02 engineering and technology ,computer.software_genre ,03 medical and health sciences ,0302 clinical medicine ,0202 electrical engineering, electronic engineering, information engineering ,Information system ,General Materials Science ,business.industry ,Scientific workflow ,Information service ,Grid ,Data science ,Workflow ,Computational neuroscience ,e-Science ,Data integration ,business ,computer ,Neuroscience ,030217 neurology & neurosurgery ,E-science - Abstract
Recent developments in data management and imaging technologies have significantly affected diagnostic and extrapolative research in the understanding of neurodegenerative diseases. However, the impact of these new technologies is largely dependent on the speed and reliability with which the medical data can be visualised, analysed and interpreted. The EU's neuGRID for Users (N4U) is a follow-on project to neuGRID, which aims to provide an integrated environment to carry out computational neuroscience experiments. This paper reports on the design and development of the N4U Analysis Base and related Information Services, which addresses existing research and practical challenges by offering an integrated medical data analysis environment with the necessary building blocks for neuroscientists to optimally exploit neuroscience workflows, large image datasets and algorithms in order to conduct analyses. The N4U Analysis Base enables such analyses by indexing and interlinking the neuroimaging and clinical study datasets stored on the N4U Grid infrastructure, algorithms and scientific workflow definitions along with their associated provenance information.
- Published
- 2013
- Full Text
- View/download PDF
39. GLORIA - the GLObal Robotic telescopes Intelligent Array for e-science
- Author
-
Lech Mankiewicz
- Subjects
Physics ,Multimedia ,Space and Planetary Science ,e-Science ,General Engineering ,Free access ,ComputerApplications_COMPUTERSINOTHERSYSTEMS ,Astronomy and Astrophysics ,Virtual observatory ,computer.software_genre ,computer - Abstract
GLORIA stands for “GLObal Robotic-telescopes Intelligent Array”. GLORIA will be the first free and open-access network of robotic telescopes in the world. It will be a Web 2.0 environment where users can do research in astronomy by observing with robotic telescopes and/or by analyzing data that other users have acquired with GLORIA, or from other free-access databases such as the European Virtual Observatory.
- Published
- 2013
40. Resource Provisioning for e-Science Environments
- Author
-
Andrea Bosin
- Subjects
Flexibility (engineering) ,Computer Networks and Communications ,Computer science ,computer.internet_protocol ,business.industry ,Distributed computing ,Provisioning ,Cloud computing ,Service-oriented architecture ,computer.software_genre ,Grid ,Business Process Execution Language ,Workflow ,Resource (project management) ,Utility computing ,e-Science ,Web service ,business ,computer - Abstract
Recent works have proposed a number of models and tools to address the growing needs and expectations in the field of e-Science. At the same time, the availability and models of use of the networked computing resources needed by e-Science are rapidly changing, with many disparate paradigms coexisting: high-performance computing, grid and, more recently, cloud, which raises very promising expectations thanks to its high flexibility. In this paper we suggest a model to promote the convergence and integration of different computing paradigms and infrastructures for the dynamic, on-demand provisioning of the resources needed by e-Science environments, leveraging the Service-Oriented Architecture model. In addition, its design aims at supporting a flexible, modular, workflow-based collaborative environment for e-Science. A working implementation used to validate the proposed approach is described, together with some performance tests.
- Published
- 2013
41. The Evolution of myExperiment
- Author
-
Don Cruickshank, Marco Roos, Paolo Missier, David Newman, Nandkumar Kollara, Ed Zaluska, Paul R. Fisher, Marcus Ramsden, Sergejs Aleksejevs, Carole Goble, David De Roure, Danius T. Michaelides, Jun Zhao, Sean Bechhofer, Jiten Bhagat, and Katy Wolstencroft
- Subjects
Web 2.0 ,Computer science ,business.industry ,Linked data ,computer.file_format ,computer.software_genre ,E-research ,World Wide Web ,Workflow ,e-Science ,The Internet ,Web service ,RDF ,business ,computer - Abstract
The myExperiment social website for sharing scientific workflows, designed according to Web 2.0 principles, has grown to be the largest public repository of its kind. It is distinctive for its focus on sharing methods, its researcher-centric design and its facility to aggregate content into sharable 'research objects'. This evolution of myExperiment has occurred hand in hand with its users. myExperiment now supports Linked Data as a step toward our vision of the future research environment, which we categorise here as '3rd generation' e-Research.
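Since the abstract mentions myExperiment's Linked Data support, a minimal sketch of consuming such data with Python's rdflib is given below. The workflow URI and the availability of an RDF document at that address are assumptions for illustration only, not a description of the project's actual API.

    from rdflib import Graph

    # Hypothetical myExperiment workflow URI; an RDF description is assumed
    # to be retrievable from this address.
    WORKFLOW_URI = "https://www.myexperiment.org/workflows/16.rdf"

    graph = Graph()
    graph.parse(WORKFLOW_URI)      # load the Linked Data description
    for subj, pred, obj in graph:  # iterate over all triples in the graph
        print(pred, "->", obj)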
- Published
- 2016
42. Multiscale modeling: Physiome project standards, tools, and databases
- Author
-
Denis Noble, W.W. Li, Andrew D. McCulloch, and Peter Hunter
- Subjects
Markup language ,General Computer Science ,Database ,computer.internet_protocol ,Computer science ,CellML ,computer.software_genre ,Data science ,Multiscale modeling ,Physiome ,e-Science ,Leverage (statistics) ,computer ,XML - Abstract
The Physiome Project's markup languages and associated tools leverage the CellML and FieldML model databases published in peer-reviewed journals. As these tools mature, researchers can check models for conformance to the underlying physical laws and use them to develop complex physiological models from separately validated components.
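CellML, mentioned in the abstract, is an XML-based format, so a model's components and variables can be inspected with standard tooling. The fragment below is a hand-written illustration (not a validated Physiome model) and assumes the CellML 1.1 namespace; a minimal sketch only.

    import xml.etree.ElementTree as ET

    # Minimal, hand-written CellML 1.1 fragment for illustration only.
    CELLML = """<model name="example" xmlns="http://www.cellml.org/cellml/1.1#">
      <component name="membrane">
        <variable name="V" units="millivolt" initial_value="-85"/>
        <variable name="Cm" units="microF_per_cm2" initial_value="1"/>
      </component>
    </model>"""

    NS = {"cellml": "http://www.cellml.org/cellml/1.1#"}
    root = ET.fromstring(CELLML)
    for component in root.findall("cellml:component", NS):
        for variable in component.findall("cellml:variable", NS):
            print(component.get("name"), variable.get("name"),
                  variable.get("units"), variable.get("initial_value"))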
- Published
- 2016
43. Towards a Grid infrastructure to support integrative approaches to biological research
- Author
-
Sharon Lloyd, D.F Mac Randal, Andrew Simpson, David J. Gavaghan, and David Boyd
- Subjects
Underpinning ,General Mathematics ,Systems biology ,General Physics and Astronomy ,computer.software_genre ,Models, Biological ,Computer Simulation ,Architecture ,Biology ,Mathematical Computing ,Internet ,Management science ,business.industry ,Systems Biology ,General Engineering ,Computational Biology ,Grid ,United States ,Systems Integration ,Engineering management ,Grid computing ,Research Design ,e-Science ,System integration ,business ,computer ,Software - Abstract
This paper discusses the scientific rationale behind the e-Science project Integrative Biology, which is developing mathematical modelling tools, HPC-enabled simulations and an underpinning Grid infrastructure to provide an integrative approach to the modelling of complex biological systems. The project is focusing on two key applications to validate the approach: the modelling of heart disease and cancer, which together are responsible for over 60% of deaths in the United Kingdom. This paper provides an overview of the project, describes the initial prototype architecture and discusses the long-term scientific aims.
- Published
- 2016
44. A collaborative approach to support e-science activities
- Author
-
Regina Braga, José Maria N. David, Victor Ströele, Tadeu Moreira de Classe, Fernanda Campos, and Marco Antônio Pereira Araújo
- Subjects
Knowledge management ,business.industry ,Computer science ,Information processing ,020206 networking & telecommunications ,02 engineering and technology ,Ontology (information science) ,computer.software_genre ,Social Semantic Web ,World Wide Web ,Semantic computing ,Scientific method ,e-Science ,0202 electrical engineering, electronic engineering, information engineering ,Semantic analytics ,020201 artificial intelligence & image processing ,Semantic Web Stack ,Web service ,business ,Semantic Web ,computer ,Data Web - Abstract
In recent years, scientific research has undergone substantial changes. In particular, there is greater collaboration between research groups, which leads to an increase in the use of information processing techniques and, consequently, to the need to share results and observations among the participants in a research effort. The main goal of this work is to propose an architecture to support the distributed processing of scientific experiments, as an implementation of so-called collaborative laboratories.
- Published
- 2016
45. A fully immersive virtual model to explore archaeological sites
- Author
-
Marcio Cabral, Eduardo Zilles Borba, Marcelo Knörich Zuffo, Roseli de Deus Lopes, and Regis Kopper
- Subjects
Computer science ,Point cloud ,020207 software engineering ,02 engineering and technology ,Virtual reality ,computer.software_genre ,Archaeology ,Virtual machine ,Joystick ,Computer graphics (images) ,e-Science ,0202 electrical engineering, electronic engineering, information engineering ,Immersion (virtual reality) ,computer ,ComputingMethodologies_COMPUTERGRAPHICS - Abstract
In this work we present the methodological approach applied to develop a fully immersive and interactive virtual environment that simulates an archaeological site located in Sao Paulo (Brazil). To create a realistic 3D space that would be relevant for research through cyber-archaeology exploration, laser scanners and photometry were used to collect 3D point-cloud data from the physical site. The digital data acquired with these instruments produced very dense point clouds, requiring many gigabytes of computer storage and careful design work to compact all the information into a user-friendly interactive virtual model that remains realistic for archaeologists. To provide an immersive feeling when exploring the virtual environment, the user can navigate through the scene using control devices (keyboard, mouse and joysticks) and a head-mounted display (Oculus Rift) to visualize the aesthetic and spatial elements of the archaeological site (forms, scales, proportions, perspective, textures, illumination, shadows) as if she/he were really in that place. In summary, through a sophisticated digital simulation environment, which borrows the playfulness of a first-person electronic game, we create a sense of telepresence for the user, while offering archaeologists a non-destructive way to explore the landscape and its objects.
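As a small illustration of the point-cloud compaction problem described above, the sketch below downsamples a dense scan with the Open3D library so that it becomes manageable for a real-time virtual model. The file names and voxel size are assumptions; the project's actual pipeline is not described at this level of detail.

    import open3d as o3d

    # Hypothetical input scan; voxel downsampling trades point density for
    # a model small enough to render interactively in VR.
    dense = o3d.io.read_point_cloud("site_scan.ply")
    compact = dense.voxel_down_sample(voxel_size=0.05)  # 5 cm voxels (assumed tolerance)
    print(len(dense.points), "->", len(compact.points), "points")
    o3d.io.write_point_cloud("site_scan_compact.ply", compact)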
- Published
- 2016
46. The role of online labs in the European e-Science Infrastructure
- Author
-
Marco Zappatore, Antonella Longo, Mario A. Bochicchio, and Donato Tarantino
- Subjects
European community ,Multimedia ,Real-time communication ,Computer science ,05 social sciences ,050301 education ,02 engineering and technology ,computer.software_genre ,WebRTC ,Engineering management ,020204 information systems ,e-Science ,ComputingMilieux_COMPUTERSANDEDUCATION ,0202 electrical engineering, electronic engineering, information engineering ,Dimension (data warehouse) ,0503 education ,computer ,Protocol (object-oriented programming) ,Scientific disciplines - Abstract
Scientific disciplines nowadays benefit from e-learning technologies and online laboratories, which make it possible to engage students more effectively and to achieve better learning outcomes. However, even the most recent online laboratory solutions lack some valuable capabilities, such as a cooperative dimension, which promises to boost students' involvement and their interactions with peers and teachers. Following the most recent guidelines from the scientific community as well as from the European Community's e-Science Infrastructure initiative, we propose in this paper a collaborative, hierarchical, multi-videoconferencing approach based on WebRTC that offers an optical microscope located at the biology faculty of our university as online laboratory equipment. The proposed platform has been tested with students from a local high school in their curricular biology activities, highlighting the feasibility of the proposed approach as well as its pedagogical effectiveness. The first implemented prototype and the proposed didactic protocol are also presented.
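For readers unfamiliar with WebRTC, the session setup this kind of platform relies on starts with an SDP offer exchanged between peers. A minimal sketch using the Python aiortc library is shown below; the signalling channel and the microscope video source are omitted, and nothing here reflects the authors' actual implementation.

    import asyncio
    from aiortc import RTCPeerConnection

    async def make_offer():
        # Create a receive-only video peer connection and generate an SDP offer;
        # in a real deployment the offer would be sent to the remote peer over
        # a separate signalling channel.
        pc = RTCPeerConnection()
        pc.addTransceiver("video", direction="recvonly")
        offer = await pc.createOffer()
        await pc.setLocalDescription(offer)
        print(pc.localDescription.sdp[:300])  # truncated for display
        await pc.close()

    asyncio.run(make_offer())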
- Published
- 2016
47. Distributed Computing Infrastructure as a Tool for e-Science
- Author
-
Maciej Twardy, Łukasz Dutka, Robert Pająk, Tomasz Szepieniec, Jacek Kitowski, Kazimierz Wiatr, Renata Slota, and Mariusz Sterzel
- Subjects
Computer science ,business.industry ,Distributed computing ,020206 networking & telecommunications ,02 engineering and technology ,Service provider ,computer.software_genre ,Autonomic computing ,Distributed design patterns ,Software ,Grid computing ,Utility computing ,Distributed algorithm ,End-user computing ,e-Science ,0202 electrical engineering, electronic engineering, information engineering ,Data-intensive computing ,020201 artificial intelligence & image processing ,business ,computer - Abstract
For several years now, scientists in Poland have been able to use the resources of the distributed computing infrastructure PLGrid. It is a flexible, large-scale e-infrastructure which offers homogeneous, easy-to-use access to organizationally distributed, heterogeneous hardware and software resources. It is built in accordance with good organizational and engineering practices, taking advantage of international experience in this field. Since scientists need assistance and close collaboration with service providers, the e-infrastructure is built around users' requirements and needs coming from different scientific disciplines, and is equipped with specific environments, solutions and services suitable for the various disciplines. All these tools help to lower the barriers that hinder researchers from using the infrastructure.
- Published
- 2016
48. Assessment of SDN technology for an easy-to-use VPN service
- Author
-
B.M.M. Gijsen, Piotr Zuraniewski, Daniel Filipe Cabaça Romão, Marijke Kaat, and Ronald van der Pol
- Subjects
Service (systems architecture) ,OpenFlow ,business.product_category ,Computer Networks and Communications ,Computer science ,ISEC - Information Security ,02 engineering and technology ,computer.software_genre ,VPN ,SDN ,Virtual private networks ,Server ,Portals ,0202 electrical engineering, electronic engineering, information engineering ,eScience ,Protocol (object-oriented programming) ,Interconnection ,Network architecture ,TS - Technical Sciences ,Mininet ,business.industry ,Testbed ,e-Science ,020206 networking & telecommunications ,OpenDaylight ,Hardware and Architecture ,ICT ,Operating system ,020201 artificial intelligence & image processing ,Network switch ,business ,computer ,Software ,Computer network - Abstract
This paper describes how state-of-the-art SDN technology can be used to create and validate a user-configurable, on-demand VPN service. In the Community Connection (CoCo) project, an architecture for the VPN service was designed and a prototype was developed based on the OpenFlow protocol and the OpenDaylight controller. The CoCo prototype enables the automatic setup and tear-down of CoCo instances (VPNs) by end-users via an easy-to-use web portal, without needing the help of network administrators to manually configure the network switches. Users from the research community, amongst others, expressed their interest in using such an easy-to-use VPN service for the on-demand interconnection of their eScience resources (servers, VMs, laptops, storage, scientific instruments, etc.) that may only be reachable for their closed group. The developed CoCo prototype was validated in an SDN testbed and via Mininet simulation; using the calibrated Mininet simulation, the impact of larger-scale deployments of the CoCo prototype was analysed. Highlights: we describe the architecture of an OpenFlow-based multi-domain on-demand L3VPN; we give implementation details of the developed demonstrator; an easy-to-use web portal allows end-users to set up and manage multi-domain VPNs; the Community Connect (CoCo) service can enable integrated resource management solutions; we performed functional and non-functional tests in a physical and virtual testbed.
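To give a flavour of the kind of validation mentioned in the abstract, the sketch below emulates a small topology with Mininet's Python API and attaches it to an external SDN controller. The controller address and topology are assumptions for illustration; they are not the CoCo testbed configuration.

    from mininet.net import Mininet
    from mininet.node import RemoteController
    from mininet.topo import SingleSwitchTopo

    # Three hosts on one OpenFlow switch, managed by an external controller
    # (e.g. OpenDaylight) assumed to listen on the local host.
    net = Mininet(
        topo=SingleSwitchTopo(k=3),
        controller=lambda name: RemoteController(name, ip="127.0.0.1", port=6633),
    )
    net.start()
    net.pingAll()  # basic reachability check across the emulated network
    net.stop()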
- Published
- 2016
49. An SOA-Based Model for the Integrated Provisioning of Cloud and Grid Resources
- Author
-
Andrea Bosin
- Subjects
Article Subject ,business.industry ,Computer science ,computer.internet_protocol ,Distributed computing ,020206 networking & telecommunications ,Cloud computing ,Provisioning ,02 engineering and technology ,General Medicine ,Service-oriented architecture ,Grid ,Workflow ,Utility computing ,Scalability ,e-Science ,0202 electrical engineering, electronic engineering, information engineering ,020201 artificial intelligence & image processing ,business ,Software engineering ,computer - Abstract
In recent years, the availability and models of use of the networked computing resources within reach of e-Science have been rapidly changing, with many disparate paradigms coexisting: high-performance computing, grid, and recently cloud. Unfortunately, none of these paradigms is recognized as the ultimate solution, and a convergence of them all should be pursued. At the same time, recent works have proposed a number of models and tools to address the growing needs and expectations in the field of e-Science. In particular, they have shown the advantages and the feasibility of modeling e-Science environments and infrastructures according to the service-oriented architecture. In this paper, we suggest a model to promote the convergence and the integration of the different computing paradigms and infrastructures for the dynamic on-demand provisioning of resources from multiple providers as a cohesive aggregate, leveraging the service-oriented architecture. In addition, we propose a design aimed at supporting a flexible, modular, workflow-based computing model for e-Science. The model is supplemented by a working prototype implementation together with a case study in the applicative domain of bioinformatics, which is used to validate the presented approach and to carry out some performance and scalability measurements.
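Purely as a sketch of the service-oriented idea described above, the snippet below shows what a client request to a resource-provisioning service could look like over HTTP. The endpoint, payload fields and workflow name are all hypothetical; the paper's actual interfaces are not specified here.

    import requests

    # Entirely hypothetical broker endpoint and request schema: a single
    # service-oriented call asks for resources from several providers at once.
    BROKER_URL = "https://broker.example.org/provision"
    request_body = {
        "workflow": "sequence-alignment",  # assumed bioinformatics workflow name
        "resources": [
            {"type": "cloud-vm", "count": 4},
            {"type": "grid-job-slot", "count": 16},
        ],
    }
    response = requests.post(BROKER_URL, json=request_body, timeout=30)
    print(response.status_code, response.json())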
- Published
- 2012
50. Interventionist grid development projects: a research framework based on three frames
- Author
-
Avgousta Kyriakidou-Zacharoudiou and Will Venters
- Subjects
Engineering ,Process management ,Knowledge management ,business.industry ,media_common.quotation_subject ,Cloud computing ,Library and Information Sciences ,Grid ,computer.software_genre ,Health informatics ,Computer Science Applications ,Conceptual framework ,Grid computing ,e-Science ,Information system ,Bureaucracy ,business ,computer ,Information Systems ,media_common - Abstract
Purpose – This paper seeks to consider the collaborative efforts of developing a grid computing infrastructure within problem‐focused, distributed and multi‐disciplinary projects – which the authors term interventionist grid development projects – involving commercial, academic and public collaborators. Such projects present distinctive challenges which have been neglected by existing e-science research and information systems (IS) literature. The paper aims to define a research framework for understanding and evaluating the social, political and collaborative challenges of such projects. Design/methodology/approach – The paper develops a research framework which extends Orlikowski and Gash's concept of technological frames to consider two additional frames specific to such grid projects: bureaucratic frames and collaborator frames. These are used to analyse a case study of a grid development project within healthcare which aimed to deploy a European data‐grid of medical images to facilitate collaboration and communication between clinicians across the European Union. Findings – Grids are shaped to a significant degree by the collaborative practices involved in their construction, and for projects involving commercial and public partners such collaboration is inhibited by the differing interpretive frames adopted by the different relevant groups. Research limitations/implications – The paper is limited by the nature of the grid development project studied, and the subsequent availability of research subjects. Practical implications – The paper provides those involved in such projects, or in policy around such grid developments, with a practical framework by which to evaluate collaborations and their impact on the emergent grid. Further, the paper presents lessons for future such interventionist grid projects. Originality/value – This is a new area for research but one which is becoming increasingly important as data‐intensive computing begins to emerge as foundational to many collaborative sciences and enterprises. The work builds on significant literature in e-science and IS, drawing it into this new domain. The research framework developed here, drawn from the IS literature, begins a new stream of systems development research with a distinct focus on bureaucracy, collaboration and technology within such interventionist grid development projects.
- Published
- 2012