575 results for "Centralized database"
Search Results
2. Sales and production management system and reporting using centralized database system
- Author
-
Pigera, A.I.H., Senarathna, P.P., Dodanduwa, D.L.H.S.D., Amarakoon, G.A.M.T.S.B., De Silva, D.I., and Vidhanaarchchi, Samitha
- Published
- 2022
- Full Text
- View/download PDF
3. Feasibility study for the implementation of number portability in Nicaragua
- Author
-
Guillermo de Jesús Valdivia-Medina
- Subjects
Number Portability, Centralized Database, Administrator, All-Call Query, TELCOR, Regulation, Technology - Abstract
The document “Feasibility Study for the Implementation of Number Portability in Nicaragua” proposes mechanisms to implement operator Number Portability, taking into account the ITU-T Q- and E-series recommendations. The existing information is analyzed after data collection through interviews with operator officials and surveys of telephone service users. After reviewing the advantages and conditions of the different techniques for implementing number portability, we recommend the All Call Query technique, a simple and effective method. Operator Number Portability requires intelligent network capabilities to recognize the ported number, its status and its conditions, and operators must adapt their networks to this model. The documents that provide legal and regulatory support for the implementation of Number Portability are: Administrative Agreement No. 036-2003, the Numbering Resource Regulations and the National Numbering Plan, all issued by the TELCOR Regulatory Entity; the Free Trade Agreement between the Dominican Republic, Central America and the United States (DR-CAFTA); and the Regional Technical Telecommunications Commission of Central America (COMTELCA). The elaboration of regulatory documents such as the Number Portability Regulation is proposed, and guidance is given for the creation of the Centralized Database Administrator, with its subordinates in the different nodes that make up the network, together with a proposed General Implementation Plan.
- Published
- 2023
- Full Text
- View/download PDF
4. Centralized Database Access: Transformer Framework and LLM/Chatbot Integration-Based Hybrid Model
- Author
-
Diana Bratić, Marko Šapina, Denis Jurečić, and Jana Žiljak Gršić
- Subjects
centralized database, educational materials, transformer framework, NLP, API implementation, LLM/chatbot, Technology, Applied mathematics, Quantitative methods - Abstract
This paper addresses the challenges associated with the centralized storage of educational materials in the context of a fragmented and disparate database. In response to the increasing demands of modern education, efficient and accessible retrieval of materials for educators and students is essential. This paper presents a hybrid model based on the transformer framework and utilizing an API for an existing large language model (LLM)/chatbot. This integration ensures precise responses drawn from a comprehensive educational materials database. The model architecture uses mathematically defined algorithms for precise functions that enable deep text processing through advanced word embedding methods. This approach improves accuracy in natural language processing and ensures both high efficiency and adaptability. Therefore, this paper not only provides a technical solution to a prevalent problem but also highlights the potential for the continued development and integration of emerging technologies in education. The aim is to create a more efficient, transparent, and accessible educational environment. The importance of this research lies in its ability to streamline material access, benefiting the global scientific community and contributing to the continuous advancement of educational technology.
- Published
- 2024
- Full Text
- View/download PDF
5. Secure decentralized electronic health records sharing system based on blockchains
- Author
-
Farag Sallabi, Juhar Ahmed Abdella, Mohamed Adel Serhani, and Khaled Shuaib
- Subjects
File system, Blockchain, Computer science, Networking & telecommunications, Denial-of-service attack, Centralized database, Scalability, Single point of failure, Byzantine fault tolerance, Database transaction, Computer network - Abstract
Blockchain technology has great potential for improving the efficiency, security and privacy of Electronic Health Record (EHR) sharing systems. However, existing solutions relying on a centralized database are susceptible to traditional security problems, such as Denial of Service (DoS) attacks and a single point of failure, just like traditional database systems. In addition, past solutions exposed users to privacy linking attacks and did not tackle performance and scalability challenges. In this paper, we propose a permissioned Blockchain-based healthcare data sharing system that integrates Blockchain technology, a decentralized file system and threshold signatures to address the aforementioned problems. The proposed system is based on the Istanbul Byzantine Fault Tolerant (IBFT) consensus algorithm and the InterPlanetary File System (IPFS). We implemented the proposed system on an enterprise Ethereum Blockchain known as Hyperledger Besu, and evaluated and compared its performance on metrics such as transaction latency, throughput and failure rate. Experiments were conducted with varying network sizes and numbers of transactions. The experimental results indicate that the proposed system performs better than existing Blockchain-based systems. Moreover, the decentralized file system provides better security than existing traditional centralized database systems while providing the same level of performance.
- Published
- 2022
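The design point in the entry above — patient records kept off-chain in a decentralized file system, with only content hashes anchored on the Blockchain — can be illustrated with a minimal standard-library sketch. The `HashLedger` class and record fields below are hypothetical, invented for illustration; the actual system uses Hyperledger Besu, IBFT and IPFS, none of which appear here.

```python
import hashlib
import json

def content_address(record: dict) -> str:
    """Hash a canonical JSON encoding of a record (IPFS-style content addressing)."""
    blob = json.dumps(record, sort_keys=True).encode()
    return hashlib.sha256(blob).hexdigest()

class HashLedger:
    """Toy append-only ledger: stores only record hashes, never the records themselves."""
    def __init__(self):
        self.entries = []  # list of (patient_id, record_hash)

    def register(self, patient_id: str, record: dict) -> str:
        h = content_address(record)
        self.entries.append((patient_id, h))
        return h

    def verify(self, patient_id: str, record: dict) -> bool:
        """Check that an off-chain record matches a hash previously registered."""
        return (patient_id, content_address(record)) in self.entries

ledger = HashLedger()
ehr = {"patient": "p-001", "note": "annual checkup", "bp": "120/80"}
ledger.register("p-001", ehr)

assert ledger.verify("p-001", ehr)                          # untampered record validates
assert not ledger.verify("p-001", {**ehr, "bp": "150/95"})  # tampering is detected
```

Because the ledger holds only hashes, the sensitive record can live in any off-chain store while the ledger still detects tampering.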
6. Parallel Method for Mining High Utility Itemsets from Vertically Partitioned Distributed Databases
- Author
-
Vo, Bay, Nguyen, Huy, Ho, Tu Bao, and Le, Bac
- Published
- 2009
- Full Text
- View/download PDF
7. Practical implications of using non‐relational databases to store large genomic data files and novel phenotypes
- Author
-
Elaine Parros Machado de Sousa, Lucas Tassoni Andrietta, Ricardo Vieira Ventura, André Moreira Souza, and Rodrigo de Andrade Santos Weigert
- Subjects
FASTQ format, Information retrieval, Genotype, Phenotype, Computer science, Relational database, Genomic data, Information storage and retrieval, Unstructured data, Genomics, Data conversion, Schema, Centralized database, Database management systems, Animal science and zoology, Practical implications - Abstract
The objective of our study was to provide practical directions on the storage of genomic information and novel phenotypes (treated here as unstructured data) using a non-relational database. The MongoDB technology was assessed for this purpose, enabling frequent data transactions involving numerous individuals under genetic evaluation. Our study investigated different genomic (Illumina Final Report, PLINK, 0125, FASTQ, and VCF formats) and phenotypic (including media files) information, using both real and simulated datasets. Advantages of our centralized database concept include sublinear query running time as the number of samples/markers increases exponentially, in addition to comprehensive management of distinct data formats while searching for specific genomic regions. A comparison of our generic non-relational solution with an existing relational approach (developed for tabular data types, using 2 bits to store each genotype) showed that the relational schema achieved a shorter import time for 50M SNPs (PLINK format). Our experimental results also reinforce that data conversion is a costly step required to load genomic data into both relational and non-relational database systems, and must therefore be treated carefully in large applications.
- Published
- 2021
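The 2-bit genotype storage mentioned in the comparison above can be sketched in a few lines. The packing layout and the use of code 3 for missing calls are assumptions for illustration, not the compared system's actual encoding; the point is simply that four SNP calls fit in one byte instead of four or more.

```python
def pack_genotypes(calls):
    """Pack SNP genotype calls (0, 1, 2; 3 = missing) into 2 bits each."""
    buf = bytearray((len(calls) + 3) // 4)
    for i, g in enumerate(calls):
        if not 0 <= g <= 3:
            raise ValueError("genotype code must fit in 2 bits")
        buf[i // 4] |= g << (2 * (i % 4))  # four calls per byte, low bits first
    return bytes(buf)

def unpack_genotypes(data, n):
    """Recover n genotype calls from a 2-bit packed buffer."""
    return [(data[i // 4] >> (2 * (i % 4))) & 0b11 for i in range(n)]

calls = [0, 1, 2, 3, 2, 2, 0, 1, 1]
packed = pack_genotypes(calls)
assert unpack_genotypes(packed, len(calls)) == calls
assert len(packed) == 3   # 9 calls at 2 bits each fit in 3 bytes
```

At scale, this kind of bit packing is what makes the relational schema's import time competitive for tens of millions of SNPs.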
8. ARCHIVE MANAGEMENT SYSTEM AT THE SECRETARIAT GENERAL OF THE MINISTRY OF PUBLIC WORKS AND PUBLIC HOUSING OF WEST SUMATRA
- Author
-
Ari Argonanto and Susiyanti Meilina
- Subjects
Centralized database, Schedule (workplace), Public housing, Political science, Management system, Library science, Distribution (economics), Administration (government), Qualitative research - Abstract
One of the efforts to support archive management in the General Bureau at the Secretariat General of the Ministry of Public Works and Public Housing of West Sumatra is to create an archival application system that can be operated over a computer network, with the aim of accelerating the archive search service. The objectives of this research are: 1) to identify the obstacles faced in the archive management system of the General Bureau at the Secretariat General of the Ministry of Public Works and Public Housing of West Sumatra; and 2) to identify the efforts made to overcome those obstacles. This is qualitative research, that is, research that seeks to understand the phenomena experienced by research subjects (such as behavior, perception and action) holistically, by means of description in words and language, in a specific natural context and using various natural methods (Moleong, 2012: 87). Informants are people used to provide information about the situation and conditions of the research setting; they have extensive experience in that setting and can voluntarily provide views on the values, attitudes and culture that form the background of the local research (Moleong, 2012: 90). The data comprise primary and secondary data collected through interviews, observation and library research, and the analysis technique used is qualitative analysis.
The results of this study indicate that the recording and distribution of archives in the General Bureau of the Secretariat General of the Ministry of Public Works and Public Housing of West Sumatra Province uses several mechanisms, namely the Electronic Service Manuscript Administration (TNDE) system and the outgoing mail agenda book. The conclusion is that the dynamic archive management system in the General Bureau, in the aspects of creating and receiving, storing, maintaining and disposing of archives, follows the existing archival provisions. The dynamic archives most frequently created and received are correspondence records (letters). The archive storage system used in the General Bureau is the TNDE system, under which letter data is stored in a centralized database management system, making the data easy to maintain; however, searching cannot yet be done automatically through the system's access feature. For archive maintenance, fumigation is carried out twice a year. For archive disposal, there is a rotating dynamic archive assessment team; disposal follows the Archive Retention Schedule (JRA), and archives that have passed the JRA are destroyed by burning. It is suggested that, for a good archive management system at the Secretariat General of the Ministry of Public Works and Public Housing of West Sumatra, the system should be equipped with supporting facilities and infrastructure.
- Published
- 2021
9. Sales and Production Management System and Reporting using Centralized Database System
- Author
-
Pigera A.I.H., Senarathna P.P., D.L.H.S.D. Dodanduwa, G.A.M.T.S.B. Amarakoon, D.I. De Silva, and Samitha Vidhanaarchchi
- Abstract
The Sales and Production management system is one of the most extensively used systems in industry. Since a company comprises numerous departments, building a shared sales and production management system enables timely and effective control of stocks and order management, and efficient use of resources. Handling this administration with manual accounting methods requires a great deal of paperwork and workforce. This study proposes a system to address these issues.
- Published
- 2022
10. Is the French SIRE equine information system a good basis for surveillance and epidemiological research? Quality assessment using two surveys
- Author
-
David Garon, Aurelie Merlin, Carole Sala, Mathilde Dhollande, Xavier Dornier, Mathilde Saussac, J. Tapprest, and Halifa Farchati
- Subjects
Epidemiology, Databases (factual), Population, Keeper, Owner, Surveys and questionnaires, Environmental health, Information system, Horses, Equine, Ownership, Sire, Data quality, Traceability, Vital statistics, Centralized database, Epidemiological monitoring, Europe, France - Abstract
Accurate demographic knowledge of the equine population is needed to assess and model equine health events. France is one of the few European countries with an operational centralized database (SIRE) recording individual data on all declared equines living in France and on their owners and keepers. Our study aimed to assess the quality of the SIRE database with respect to the updating of information by equine owners and keepers, with a view to its improvement and its use in surveillance and research. Two online surveys were conducted with the participation of 6244 registered keepers and 13,869 owners. Results showed some inconsistencies between SIRE records and survey responses. The inconsistency rates for equines whose castration and death were not registered in the database were 28.7% and 5.9% respectively. Concerning owners, 11% of respondents did not own the reference equine selected by the survey, and 33% had changed address without updating it in the SIRE. Concerning premises hosting equines, the keeper survey's inconsistency rate was 7.3%, of which 57 respondents had closed and 32 had opened premises without reporting it. By comparison, the owner survey's inconsistency rate was 40.7%, including respondents who owned and hosted an equine without reporting these equine premises, and owners who did not keep any equines on their premises. In conclusion, the SIRE database proved to be a valuable and reliable source for epidemiological research as long as some bias is taken into account. By contrast, its use in surveillance is currently limited due to some shortcomings in updating and/or reporting by owners and keepers.
- Published
- 2021
11. PastureBase Ireland: A grassland decision support system and national database.
- Author
-
Hanrahan, Liam, Geoghegan, Anne, O'Donovan, Michael, Griffith, Vincent, Ruelle, Elodie, Wallace, Michael, and Shalloo, Laurence
- Subjects
Grassland management, Decision support systems, Pasture plants, Acquisition of data, Statistical correlation - Abstract
PastureBase Ireland (PBI) is a web-based grassland management application with a dual function: grassland decision support and a centralized national database collating commercial farm grassland data. This database facilitates the collection and storage of vast quantities of grassland data from grassland farmers, spanning the ruminant grassland enterprises of dairy, beef and sheep. To help farmers determine appropriate actions around grassland management, we developed this data-informed decision support tool to function at the paddock level. Individual farmers enter data through regular pasture cover estimations across the farm, allowing the performance of individual paddocks to be evaluated within and across years. To evaluate the PBI system, we compared actual pasture cut experimental data (Etesia cuts) to PBI calculated outputs in three comparisons: individual DM yields at defoliation (Comparison 1), cumulative annual DM yields including silage data (Comparison 2), and cumulative annual DM yields excluding silage data (Comparison 3). We found acceptable accuracy between PBI outputs and pasture cut data, analyzed using relative prediction error and concordance correlation coefficients, for total annual DM yield (Comparison 2), with a relative prediction error of 15.4% and a concordance correlation coefficient of 0.85. We demonstrated an application of the PBI system through analysis of commercial farm data across two years (2014–2015) for 75 commercial farms actively using the system. The analysis showed a significant increase in DM yield from 2014 to 2015, and indicated greater variation in pasture growth across paddocks within farms than across farms.
- Published
- 2017
- Full Text
- View/download PDF
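The two evaluation metrics named in the entry above, relative prediction error and the concordance correlation coefficient, can be computed with a short standard-library sketch. The yield values below are invented for illustration, not PBI data.

```python
from math import sqrt

def relative_prediction_error(actual, predicted):
    """Root mean square error of prediction, as a percentage of the mean actual value."""
    n = len(actual)
    rmse = sqrt(sum((a - p) ** 2 for a, p in zip(actual, predicted)) / n)
    return 100 * rmse / (sum(actual) / n)

def concordance_ccc(actual, predicted):
    """Lin's concordance correlation coefficient (population variances)."""
    n = len(actual)
    mx, my = sum(actual) / n, sum(predicted) / n
    vx = sum((a - mx) ** 2 for a in actual) / n
    vy = sum((p - my) ** 2 for p in predicted) / n
    cov = sum((a - mx) * (p - my) for a, p in zip(actual, predicted)) / n
    return 2 * cov / (vx + vy + (mx - my) ** 2)

# Illustrative paddock DM yields (t DM/ha); made-up numbers, not PBI measurements.
actual = [10.2, 12.1, 9.8, 14.0, 11.5]
predicted = [10.0, 12.5, 9.5, 13.6, 11.9]
assert abs(concordance_ccc(actual, actual) - 1.0) < 1e-12  # perfect agreement gives CCC = 1
assert 0.0 < concordance_ccc(actual, predicted) <= 1.0
assert relative_prediction_error(actual, predicted) < 15.4
```

Unlike a plain correlation, the CCC penalizes both a mean shift and a variance mismatch between predictions and measurements, which is why it is paired with relative prediction error here.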
12. History of the development and implementation of a national system for managing product samples in Health Surveillance
- Author
-
Adalberto Lamim da Silva, Célia Maria Carvalho Pereira Araujo Romão, Rosane Gomes Alves Lopes, Antonio Eugenio Castro Cardoso de Almeida, and Nélio Cezar de Aquino
- Subjects
Knowledge management, Public health, Information systems (Sistemas de Informação), Health surveillance (Vigilância Sanitária), Product samples (Amostras de Produtos), Public health laboratory services (Serviços Laboratoriais de Saúde Pública), Scientific evidence, Identification (information), Centralized database, Management system, Information system - Abstract
Introduction: Information is an essential tool for decision-making and contributes to the “information-decision-action” process. Objective: To report the experience of the National Institute for Quality Control in Health (INCQS, by its acronym in Portuguese) in the development and implementation of a national information system for the management of samples of products of interest to health, from receipt in the laboratory to the issuance of analytical reports, and to present its possible contribution to the National Health Surveillance System (SNVS). Method: This is a document analysis study on the development and implementation of Harpya (Sample Management System), its evolution and advances from 1986 to 2020. Users' access to the system, the existence of a centralized database, the implementation in the National Network of Health Surveillance Laboratories (RNLVISA, by its acronym in Portuguese) and the creation and use of nationally standardized catalogs are discussed. Results: The data and information generated by RNLVISA on product quality can generate scientific evidence for risk identification, which is fundamental for managers' decision-making. In addition, they contribute to the evaluation of laboratories and to the monitoring of market products, supporting health surveillance actions. Conclusions: Noteworthy is the possibility of accessing data at the national level by different technical areas of the National Health Surveillance Agency, supporting the coordination of the SNVS; Harpya's information can also contribute to other information systems in resolving public health problems in the country.
- Published
- 2020
13. IMPLEMENTATION OF A CENTRALIZED DATABASE FOR BILLING ELECTRICITY ARREARS AT CV. CAHAYA ABADI
- Author
-
Siti Sauda and Eka Septiawati
- Subjects
Officer, Centralized database, Database, Scope (project management), Arrears, Electricity, Electric power - Abstract
Every company relies on technology that is expected to help with every job, for example a centralized database in which data can be accessed simultaneously. CV. Cahaya Abadi is a construction company in the electricity sector that handles billing of customers' electricity arrears in villages in Sekayu District. Electricity bill arrears at PT. Muba Electric Power remain a very large problem: total arrears reach Rp. 30,911,833,490 across 10 districts comprising 20,677 customers. Mismatches in the usage figures recorded monthly by the meter-reading officers inflate bills and make customers reluctant to pay. To overcome these problems, a database was created that centrally connects the various divisions within CV. Cahaya Abadi, so that every job achieves the desired results with structured performance. The database development method used in this study is the Database Life Cycle (DBLC).
- Published
- 2020
14. FRAGMENT: A Web Application for Database Fragmentation, Allocation and Replication over a Cloud Environment
- Author
-
Asdrúbal López-Chau, María Antonieta Abud-Figueroa, Lisbeth Rodríguez-Mazahua, Felipe Castro-Medina, and Giner Alor-Hernández
- Subjects
Distributed computing environment, Database, Distributed database, Computer science, Relational database, Fragmentation (computing), Cloud computing, Metadata, Centralized database, Web application - Abstract
Fragmentation, allocation and replication are techniques widely used in relational databases to improve the performance of operations and reduce their cost in distributed environments. This article presents an analysis of different methods for database fragmentation, allocation and replication, and a Web application called FRAGMENT that adopts the technique selected in the analysis stage because it combines fragmentation and replication, applies to a cloud environment, is easy to implement, focuses on improving the performance of operations executed on the database, documents everything necessary for its implementation, and is based on a cost model. FRAGMENT analyzes the operations performed on any table of a database, proposes fragmentation schemes based on the most expensive attributes, and allocates and replicates a scheme chosen by the user in a distributed cloud environment. This work also addresses a common problem of fragmentation methods, overlapping fragments, and provides an algorithm to handle it; the algorithm yields the predicates that define each fragment in a distributed environment. To validate the implemented technique, a second web application is presented, dedicated to simulating operations on sites and producing a log file for the main application. Experiments with the TPC-E benchmark demonstrated lower response times for queries executed against the distributed database generated by FRAGMENT compared with a centralized database.
- Published
- 2020
15. Client-Server Implementation in a Student Grade Processing Information System Using Object-Oriented Programming
- Author
-
Fitrilina Fitrilina and Ullya Mega Wahyuni
- Subjects
Centralized database, Object-oriented programming, Computer science, Systems development life cycle, Waterfall model, Information system, Information technology, Software engineering - Abstract
The use of information technology in educational institutions has become necessary for work to be efficient and flexible. The process of reporting student data and assessment results at SMA Negeri 1 Dharmasraya is still manual and paper-based, and there is no dedicated database storage. To make reporting of student learning outcomes and resource sharing easier, an information system was built on a client-server network using object-oriented programming with a centralized database, so that the stored data can be organized. The system was built using the waterfall model of the System Development Life Cycle, which proceeds through analysis, design, coding, testing, and maintenance. The result of this study is a multiuser system that uses a client-server computer network to process student grades. Keywords: Client-Server, Information Systems, OOP
- Published
- 2020
16. A framework for plant growth analysis and simulation using data analysis techniques
- Author
-
Snehit Sagi, Malathy Chidambaranathan, and Gayathri Mani
- Subjects
Agricultural engineering, Gross domestic product, Centralized database, Agriculture, Soil pH, Data analysis, Environmental science, Gradient descent, Cluster analysis - Abstract
Purpose: In India, agriculture is considered the major source of income for a large sector of the population. The country's GDP (Gross Domestic Product) can increase only with a focus on agriculture and its growth in the global economy, and there have been several attempts to improve the agricultural sector over the decades. Design/methodology/approach: This work describes the design of a device that continuously monitors plant growth and sends the data to a centralized database, where the data is dynamically analyzed against base references using machine learning algorithms such as regression, gradient descent and clustering. Findings: This paper analyzes plant growth across the country and focuses on improving it based on factors such as temperature, air moisture, radiant energy, carbon dioxide levels, and soil pH and temperature, through the design of a device. Originality/value: The work is expected to provide a solution by analyzing plant growth percentages in different regions over time. Based on the inferences, an optimum environment can be suggested for each plant species. Various sensors, such as temperature and humidity sensors, light sensors and pH electrodes, can be used to collect data from the plant environment.
- Published
- 2020
17. PILOT PROJECT FOR ELECTRONIC REIMBURSEMENT SYSTEM FOR PHYSICIANS IN INDONESIA
- Author
-
Taiyin Wu
- Subjects
Centralized database, SQL, Intranet, Decision support system, Information retrieval, Computer science, Interface (computing), Information system, Domain knowledge, Column (database) - Abstract
A healthcare department in a remote community of Indonesia aimed to reduce paperwork and improve its electronic systems. As part of a pilot project, one process was moved from manual to electronic format: an electronic form for the fee reimbursement claims made by physicians. The system is intranet-based and consists of two separate portals, one for physicians and one for billing clerks. The interface is user-friendly, with pre-defined codes in several of its fields and sub-fields. The electronic form is also linked to a centralized database from which a physician can copy existing patient records. A decision support algorithm is used to accommodate variation in individual needs, and a machine learning algorithm is used to improve system performance. For data retrieval, a database query interface was designed: the relationships between database columns are displayed to the user in tabulated form, and when a user selects a particular column, a filtered display mechanism shows only the columns that satisfy the portion of the query already constructed. SQL queries are used to obtain data from the tabulated database. A rule-based knowledge inference model is used to reason about terminology and the required domain knowledge; the inference is algorithmic and performs all necessary tasks under the appropriate billing circumstances. A survey of 35 physicians was conducted to judge their perception of the system. The results indicate that most participants found the system suitable and better than the paper-based system in several dimensions, such as user-friendliness, time saving, error reduction, and accuracy.
- Published
- 2020
18. The Digital Cicognara Library: transforming a 19th century resource for the digital age
- Author
-
Roger Lawson, Charlotte Oertel, and Holly Hatheway
- Subjects
History ,Point (typography) ,business.industry ,Subject (documents) ,Metadata ,World Wide Web ,Centralized database ,Resource (project management) ,Transformative learning ,General partnership ,General Earth and Planetary Sciences ,Web application ,business ,General Environmental Science - Abstract
The Digital Cicognara Library is an international initiative to recreate in digital form the private book collection of Count Leopoldo Cicognara (1767–1834). His collection of five thousand early imprints comprises foundational literature of art and archaeology, and includes a diverse range of publications in all areas of the visual arts. Our partnership's 21st-century effort advances Cicognara's Enlightenment-era ideals by making digital copies of his library available through an open access web application, where they will be fully searchable from a centralized database as well as relevant subject research interfaces. The aggregated images and text offer a potentially transformative opportunity for the discipline of art history and allied disciplines. By offering a new interface for Cicognara's collection, the endeavour allows open access availability to nearly all of the key historical volumes, the illustrations within, and the searchable metadata. The Digital Cicognara Library offers a corpus that will allow scholars to ask and answer new questions in disciplines beyond art history and archaeology, and will offer scholars of early printed books a new access point to study both the individual volumes and their relationship to each other in an accessible digital collection.
- Published
- 2020
19. Integration of School Management Systems Using a Centralized Database (ISMSCD)
- Author
-
Romeo E. Balcita and Thelma D. Palaoag
- Subjects
Database ,Computer science ,05 social sciences ,050301 education ,030206 dentistry ,computer.software_genre ,Computer Science Applications ,Education ,03 medical and health sciences ,Centralized database ,0302 clinical medicine ,Management system ,0503 education ,computer - Abstract
Schools have different departments and offices with interrelated functionalities that must cooperate in order to function well. Offices are scattered across the campus, which hampers the transfer of information. Building an integrated School Management System on a centralized database would improve the quality of school services. Technologies such as barcodes, the internet, video cameras, and sensors, together with a better framework, need to be integrated to cope with the changing needs of society and to provide quality school services. Thus, this study aimed to build such a system, implement it, and identify its level of acceptability. Features of the system include enrollment, assessment, report generation, and a decision support module. The system is based on a standardized school management framework derived from different existing school management systems. The study uses qualitative as well as descriptive research methods, with the Agile AWE model employed in project development. The system is implemented in an institution that uses a manual school management process. Managers, staff, teachers, students, and guardians provided the needed system requirements and also served as respondents in assessing the level of acceptability. Feedback is continuously gathered and helps identify possible improvements. This system is beneficial to institutions currently using manual processes in their departments. It is a fully functional and configurable system that suits their needs and provides quality service.
- Published
- 2020
20. A Food Traceability System Based on Blockchain and Radio Frequency Identification Technologies
- Author
-
Pan Feng and Miaolei Deng
- Subjects
010304 chemical physics ,Traceability ,Computer science ,business.industry ,Supply chain ,010402 general chemistry ,Food safety ,01 natural sciences ,0104 chemical sciences ,Environmental data ,Centralized database ,Identification (information) ,Risk analysis (engineering) ,0103 physical sciences ,Radio-frequency identification ,Food quality ,business - Abstract
The frequent occurrence of food safety accidents at the global level has triggered consumer sensitivity. Establishing a food traceability system can effectively guarantee food safety and increase consumer confidence and satisfaction. Existing food traceability systems often ignore environmental factors that affect food quality at all stages of the supply chain, and the authenticity of the information obtained through traceability is difficult to guarantee. In this study, a food supply chain traceability model was established based on blockchain and radio frequency identification (RFID) technologies. The model focused on the traceability of environmental data during the various stages of the food supply chain and combined a centralized database and blockchain for data storage. The lot identification data of the various supply chain stages were stored in a centralized database, while the environmental data were stored in a blockchain. Thereby, the authenticity and accuracy of the traceability data were ensured. The blockchain part of the model has been simulated in the Ethereum test environment, and the experiment has achieved traceability of temperature data.
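The split storage model above can be sketched as follows: lot identification data goes into a centralized store, while environmental readings go into an append-only, hash-linked log that stands in for the blockchain. This is an illustrative toy, not the paper's Ethereum implementation, and all names are invented.

```python
import hashlib
import json

lot_db = {}      # centralized database stand-in (lot identification data)
env_chain = []   # blockchain stand-in (environmental readings)

def register_lot(lot_id, stage, rfid_tag):
    """Store lot identification data for one supply chain stage."""
    lot_db[lot_id] = {"stage": stage, "rfid": rfid_tag}

def append_env_reading(lot_id, temperature_c):
    """Append an environmental reading, hash-linked to the previous one."""
    prev_hash = env_chain[-1]["hash"] if env_chain else "0" * 64
    record = {"lot": lot_id, "temp": temperature_c, "prev": prev_hash}
    record["hash"] = hashlib.sha256(
        json.dumps(record, sort_keys=True).encode()).hexdigest()
    env_chain.append(record)

def chain_is_intact():
    """Verify that no stored reading has been tampered with."""
    prev = "0" * 64
    for rec in env_chain:
        body = {k: rec[k] for k in ("lot", "temp", "prev")}
        expected = hashlib.sha256(
            json.dumps(body, sort_keys=True).encode()).hexdigest()
        if rec["prev"] != prev or rec["hash"] != expected:
            return False
        prev = rec["hash"]
    return True
```

Changing any stored temperature breaks the hash link, which is the property the paper relies on to guarantee the authenticity of the environmental traceability data.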
- Published
- 2020
21. Solution Approaches for Big Data Security Problems: Hybrid Blockchain and Decentralized Database
- Author
-
Chmel, Sebastian
- Subjects
Graph Universe Node (GUN) ,Ethereum ,Big Data Security ,Blockchain ,Challenges & Solutions ,Centralized Database ,Decentralized Databases - Abstract
The variety of Big Data applications available in today's digital world is growing rapidly, and so are the related security challenges we face, especially when it comes to user-related data, where data protection should be of the utmost importance. Unfortunately, this is not always the case. That is because the Big Data industry is dominated by centralized companies and institutions that offer their services to the public for free in exchange for personal data and focus only on how they can use that data to create new business value using AI and analytics.
This thesis addresses the existing security issues of Big Data and proposes solutions using blockchain in combination with a relational database and a decentralized database. To this end, the contribution of this work is two practically implemented chat application prototypes: DeChat.eth, using the Ethereum blockchain and a NoSQL database, and DeChat.gun, utilizing the Graph Universe Node (GUN) framework, to demonstrate the security aspects of the proposed solutions. By analyzing the differences between the two, we can infer their applicability in the Big Data domain and also highlight the key aspects for solving Big Data security challenges. The results of this work have shown that both the hybrid blockchain approach and the decentralized database approach can indeed be used to address Big Data security challenges, but in fundamentally different ways. One of the main aspects is the authentication of users: the hybrid blockchain approach uses the blockchain wallet with a public key infrastructure (PKI), while the decentralized database solution uses a traditional username/password combination for authentication, deriving a key from the password that is used to encrypt/decrypt the user's key pair stored in the decentralized database, thus validating the user's identity. Another result is that the decentralized database solution offers true data ownership, while in the hybrid blockchain approach only the user's identity ownership is guaranteed and all other data is stored in the centralized database, providing only partial access control. These and other aspects discussed in this thesis show that the two proposed solutions carry potential not only for the Big Data industry, but also for other use cases. Unfortunately, there are still drawbacks, such as the lack of complete control over the data, which can be exploited by malicious users, or the shortage of detailed documentation.
Therefore, further research and work are needed to enhance both solutions and make them universally applicable.
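The password-derived key mechanism described for the decentralized database login can be sketched as follows. This is an illustrative toy, not the thesis's GUN implementation: the XOR stream stands in for a real authenticated cipher, and all names are invented.

```python
import hashlib
import secrets

# Sketch: derive a key from the password, use it to encrypt/decrypt
# the user's key pair stored in the decentralized database.

def derive_key(password, salt, length=32):
    """Derive an encryption key from the password with PBKDF2."""
    return hashlib.pbkdf2_hmac("sha256", password.encode(), salt,
                               100_000, length)

def xor_bytes(data, key):
    """Toy stream cipher: XOR data with a repeated key (demo only)."""
    stream = (key * (len(data) // len(key) + 1))[:len(data)]
    return bytes(a ^ b for a, b in zip(data, stream))

def store_keypair(password, private_key):
    """Encrypt the key pair under the password-derived key."""
    salt = secrets.token_bytes(16)
    key = derive_key(password, salt, len(private_key))
    return {"salt": salt, "blob": xor_bytes(private_key, key)}

def load_keypair(password, record):
    """Decrypt the stored key pair; a wrong password yields garbage."""
    key = derive_key(password, record["salt"], len(record["blob"]))
    return xor_bytes(record["blob"], key)
```

Recovering the correct key pair only with the correct password is what validates the user's identity in this design, with no central authentication server involved.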
- Published
- 2022
22. Smart Transducers in Distributed and Model-Driven Control Applications: Empowering Seamless Internet of Things Integration
- Author
-
Hans-Peter Bernhard, Alois Zoitl, and Andreas Springer
- Subjects
Data stream ,TheoryofComputation_COMPUTATIONBYABSTRACTDEVICES ,Computer science ,Modeling language ,business.industry ,020208 electrical & electronic engineering ,02 engineering and technology ,Smart transducer ,Process automation system ,Automation ,Industrial and Manufacturing Engineering ,Software development process ,Centralized database ,TheoryofComputation_MATHEMATICALLOGICANDFORMALLANGUAGES ,Transducer ,0202 electrical engineering, electronic engineering, information engineering ,ComputerSystemsOrganization_SPECIAL-PURPOSEANDAPPLICATION-BASEDSYSTEMS ,Electrical and Electronic Engineering ,business ,Computer hardware - Abstract
Smart transducers based on ISO/IEC/IEEE 21450 provide standardized access to sensors and actuators. Hence, no centralized database is needed for the calibration and interpretation of the data collected by transducers. The smart transducer stores and maintains its individual transducer electronic data sheet (TEDS). Whenever transducers are replaced in an automation system, the correct description, calibration, and data stream interpretation come along with the newly connected transducer. The domain-specific modeling language defined by IEC 61499 improves the software development process for automation systems. However, it lacks a clear transducer interaction model.
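The idea that a transducer carries its own description and calibration, so no centralized database lookup is needed when it is swapped, can be sketched as follows. The field names are illustrative and do not reflect the binary TEDS layout defined by ISO/IEC/IEEE 21450.

```python
from dataclasses import dataclass

@dataclass
class Teds:
    """Toy transducer electronic data sheet: identity plus calibration."""
    manufacturer: str
    model: str
    gain: float    # engineering units per raw count
    offset: float

@dataclass
class SmartTransducer:
    teds: Teds

    def read(self, raw_count: int) -> float:
        # Interpret the raw sample using the on-board calibration,
        # which travels with the device itself.
        return raw_count * self.teds.gain + self.teds.offset

# Replacing a transducer automatically brings the new calibration:
old = SmartTransducer(Teds("A", "T-100", 0.5, -10.0))
new = SmartTransducer(Teds("B", "T-200", 0.25, 0.0))
print(old.read(100), new.read(100))  # 40.0 25.0
```

The same raw count is correctly interpreted per device because the calibration data is read from the connected transducer, not from an external database.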
- Published
- 2019
23. An overview of the National COVID-19 Chest Imaging Database: data quality and cohort analysis
- Author
-
Daniel Schofield, François Lemarchand, Rosalind Berka, Ashwin Chopra, Mark D. Halling-Brown, Jeremy C Wyatt, Tara Ganepola, Nccid Collaborative, Samie Dorgham, Alberto Favaro, Ottavia Bertolli, Emily Jefferson, Gergely Imreh, Oscar Bennett, Joseph Jacob, and Dominic Cushnan
- Subjects
education.field_of_study ,Chest imaging ,Data collection ,Coronavirus disease 2019 (COVID-19) ,Database ,business.industry ,Population ,Health Informatics ,computer.software_genre ,Representativeness heuristic ,Computer Science Applications ,Centralized database ,DICOM ,Data quality ,Medical imaging ,Medicine ,Snapshot (computer storage) ,business ,education ,computer ,Cohort study - Abstract
Background The National COVID-19 Chest Imaging Database (NCCID) is a centralized database containing mainly chest X-rays and computed tomography scans from patients across the UK. The objective of the initiative is to support a better understanding of the coronavirus SARS-CoV-2 disease (COVID-19) and the development of machine learning technologies that will improve care for patients hospitalized with a severe COVID-19 infection. This article introduces the training dataset, including a snapshot analysis covering the completeness of clinical data, and availability of image data for the various use-cases (diagnosis, prognosis, longitudinal risk). An additional cohort analysis measures how well the NCCID represents the wider COVID-19–affected UK population in terms of geographic, demographic, and temporal coverage. Findings The NCCID offers high-quality DICOM images acquired across a variety of imaging machinery; multiple time points including historical images are available for a subset of patients. This volume and variety make the database well suited to development of diagnostic/prognostic models for COVID-associated respiratory conditions. Historical images and clinical data may aid long-term risk stratification, particularly as availability of comorbidity data increases through linkage to other resources. The cohort analysis revealed good alignment to general UK COVID-19 statistics for some categories, e.g., sex, whilst identifying areas for improvements to data collection methods, particularly geographic coverage. Conclusion The NCCID is a growing resource that provides researchers with a large, high-quality database that can be leveraged both to support the response to the COVID-19 pandemic and as a test bed for building clinically viable medical imaging models.
- Published
- 2021
24. Authorized Shared Electronic Medical Record System with Proxy Re-Encryption and Blockchain Technology
- Author
-
Yong-Yuan Deng, Chin-Ling Chen, Weizhe Chen, Shunzhi Zhu, Jiaxin Wu, and Jianmin Li
- Subjects
Information privacy ,Technology ,IoT ,Computer science ,privacy protection ,Data security ,Access control ,TP1-1185 ,Computer security ,computer.software_genre ,Biochemistry ,Article ,EMR sharing ,Analytical Chemistry ,Blockchain ,proxy re-encryption ,consortium blockchain ,BCoT ,Electronic Health Records ,Electrical and Electronic Engineering ,Instrumentation ,Computer Security ,business.industry ,Chemical technology ,Cloud Computing ,Atomic and Molecular Physics, and Optics ,Proxy re-encryption ,Centralized database ,Data access ,The Internet ,business ,computer ,Cloud storage - Abstract
With the popularity of the internet and 5G networks, hospital network construction has also developed rapidly. Operations management in the healthcare system is becoming paperless, for example via a shared electronic medical record (EMR) system. A shared electronic medical record system plays an important role in reducing diagnosis costs and improving diagnostic accuracy. Traditional electronic medical record systems typically use centralized database storage; if a problem occurs in the data storage, it can cause data privacy disclosure and security risks. Blockchain is tamper-resistant and its data is traceable, so it can ensure the security and correctness of data. Proxy re-encryption technology can ensure the safe sharing and transmission of relatively sensitive data. Based on the above, we propose an electronic medical record system based on a consortium blockchain and proxy re-encryption to solve the problem of secure EMR sharing. Electronic equipment in this process is connected to the blockchain network, and the security of data access is ensured through the automatic execution of blockchain chaincode; an attribute-based access control method ensures fine-grained access to the data and improves system security. Compared with existing electronic medical records based on cloud storage, the system not only realizes the sharing of electronic medical records but also has advantages in privacy protection, access control, data security, etc.
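The core property of proxy re-encryption, that a proxy can transform a ciphertext from the owner's key to a recipient's key without learning the plaintext, can be illustrated with a one-time-pad analogue. This toy is NOT a real proxy re-encryption scheme (real schemes use public-key constructions); it only demonstrates the information flow among owner, proxy, and recipient.

```python
import secrets

def xor(a: bytes, b: bytes) -> bytes:
    """Combine two equal-length byte strings (one-time-pad style)."""
    return bytes(x ^ y for x, y in zip(a, b))

data_key = secrets.token_bytes(32)       # key that encrypts the EMR itself
owner_key = secrets.token_bytes(32)
recipient_key = secrets.token_bytes(32)

# The data key is stored wrapped under the owner's key.
wrapped_for_owner = xor(data_key, owner_key)

# The owner hands the proxy a re-encryption key; note it reveals
# neither owner_key nor data_key on its own.
re_encryption_key = xor(owner_key, recipient_key)

# Proxy step: re-wrap for the recipient WITHOUT ever seeing data_key.
wrapped_for_recipient = xor(wrapped_for_owner, re_encryption_key)

# The recipient can now unwrap with their own key.
assert xor(wrapped_for_recipient, recipient_key) == data_key
```

This is the sharing pattern the paper builds on: sensitive EMR content stays encrypted end to end, while an untrusted intermediary performs the key transformation.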
- Published
- 2021
25. Rationale for the Integration of BIM and Blockchain for the Construction Supply Chain Data Delivery: A Systematic Literature Review and Validation through Focus Group
- Author
-
Rodrigo N. Calheiros, Ali Mohammed Alashwal, Srinath Perera, and Amer A. Hijazi
- Subjects
Centralized database ,Process management ,Systematic review ,Blockchain ,Computer science ,Information model ,Strategy and Management ,Supply chain ,Industrial relations ,Building and Construction ,Data delivery ,Focus group ,Civil and Structural Engineering - Abstract
Building information modeling (BIM) plays a critical role in the integration of different parties by acting as a centralized database defining their responsibilities and contractual obligations...
- Published
- 2021
26. IoT-Based Tracing and Communication Platform for Disease Control
- Author
-
Pei Cheng Ooi, Sze Qi Chew, Yi Zhen Quak, Zhi Yuan Chan, and Yi Xin Loke
- Subjects
Upload ,Microcontroller ,Identification (information) ,Centralized database ,Computer science ,Arduino ,Real-time computing ,Timestamp ,Tracing ,Android (operating system) - Abstract
Coronavirus disease 2019 (COVID-19) is highly contagious and has swept the globe. Countries worldwide are in urgent need of efficient technological solutions to control the transmission of COVID-19. The objective of this project is to develop an artificial intelligence-driven contact-tracing and communication platform as an integrated solution to break the transmission chain of the disease. Three elements are included in this platform: a behavioral recognition system, a mobile application, and a smart wristband. The mobile application, developed with the Android Studio SDK, has multiple functions: a Quick Response (QR) code scanner for location tracking, close contact identification, COVID-19 case updates, a district color alert system, and exposure notification. The behavioral recognition system, developed on a Raspberry Pi 4, adopts Faster Region Based Convolutional Neural Network Version 2 (RCNN_v2) and Single Shot Multibox Detection MobileNet Version 2 (SSD MobileNet_v2) as machine learning algorithms to carry out close-proximity detection, people counting, and face mask detection. The smart wristband, built with an Arduino MKR GSM1400 microcontroller and various sensors and developed through the Arduino Integrated Development Environment (IDE), keeps track of the location and vital signs of quarantined people and is designed with an emergency button that allows them to get help immediately if they are not feeling well. The data obtained from the three elements is uploaded to a centralized database, Firestore, together with an accurate timestamp and location. This system, integrating various preventive and control measures, can help mitigate and manage the COVID-19 pandemic effectively and efficiently.
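The upload path shared by the three elements can be sketched as below: each record is tagged with its source, location, and an accurate timestamp before being written to the centralized store. The in-memory list stands in for Firestore, and all field names are invented for the example.

```python
import time

db = []  # stand-in for the centralized database (e.g., Firestore)

def upload(source, payload, location, ts=None):
    """Write one record from any of the three platform elements,
    tagged with timestamp and location."""
    record = {
        "source": source,        # "app" | "camera" | "wristband"
        "payload": payload,
        "location": location,
        "timestamp": ts if ts is not None else time.time(),
    }
    db.append(record)
    return record

upload("wristband", {"heart_rate": 72}, "quarantine-zone-3")
upload("app", {"qr": "venue-17"}, "mall-entrance")
```

Keeping a uniform record shape across the app, camera system, and wristband is what lets the central database correlate events by time and place for contact tracing.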
- Published
- 2021
27. COVID-19 Early Symptom Prediction Using Blockchain and Machine Learning
- Author
-
Ruppa K. Thulasiram and Sarada Kiranmayee Tadepalli
- Subjects
education.field_of_study ,Blockchain ,Data collection ,Process (engineering) ,Computer science ,business.industry ,Population ,Retraining ,Machine learning ,computer.software_genre ,Centralized database ,Order (business) ,Artificial intelligence ,education ,business ,computer ,Data transmission - Abstract
The COVID-19 outbreak has resulted in unprecedented and difficult times for the world's population. Social distancing and self-isolation have become very important to reduce the spread. This prompted the creation of numerous applications around the world that use proprietary models for symptom tracking and contact tracing to mitigate the spread. In most of these applications, the collected data is stored in a centralized database without verification, and hence the data is not reliable. In this study, a decentralized application for COVID-19 symptom tracking using blockchain is proposed in order to enable reliable data collection for training Machine Learning (ML) models. The blockchain integration in this application will help in collecting COVID-19 symptom data from patients with trust. In addition, the data is first verified by an entity of the decentralized network (e.g., a COVID-19 testing lab). Then, with the consent of the patient, this data is provided to the centralized system for retraining the ML model. In short, the main advantage of this architecture is that the data from the users is collected and checked by a laboratory first and then provided to the ML model. This process helps in identifying incorrect ML predictions and in further training the ML model with reliable data for accurate prediction. Moreover, the trust of the users is earned as the data transfer happens with their consent and, besides, all transactions are recorded on the blockchain, which is possible with the help of Distributed Ledger Technology (DLT).
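The verification gate at the heart of this architecture can be sketched in a few lines: only records endorsed by a testing lab, and consented to by the patient, reach the centralized training set. This is an illustrative sketch of the described workflow, not the paper's implementation; names are invented.

```python
training_set = []  # reliable records used to retrain the ML model

def submit(record, lab_verified, patient_consents):
    """Admit a symptoms record to the training set only if a lab has
    verified it and the patient consented to sharing it."""
    if lab_verified and patient_consents:
        training_set.append(record)
        return True
    return False  # unverified or non-consented data is rejected
```

In the paper this gate is enforced by the blockchain network itself (the lab is a verifying entity and every transfer is recorded on the ledger); the function above only captures the admission rule.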
- Published
- 2021
28. An Efficient and Effective Blockchain-based Data Aggregation for Voting System
- Author
-
D. Saranya, R. ShankarRam, and M. Ramalingam
- Subjects
Secure Hash Algorithm ,Blockchain ,Computer science ,business.industry ,media_common.quotation_subject ,Track (rail transport) ,Computer security ,computer.software_genre ,Automation ,Data aggregator ,Centralized database ,Trustworthiness ,Voting ,business ,computer ,media_common - Abstract
Blockchain is opening up new avenues for the development of new sorts of digital services. In this article, we employ the transparent blockchain method to propose a system for collecting data from many sources and databases for use in local and national elections. The blockchain-based system will be safe, trustworthy, and private. It will help determine the overall count of votes for the participating candidates, and it functions on the same trust that people place in their governments. Blockchain technology is what handles the actual vote. We use a secure hash algorithm to resolve this problem and bring a solution through the use of this booming technology. A centralized database in the blockchain system keeps track of the secure electronic interactions of users in a peer-to-peer network.
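One way a secure hash algorithm supports vote aggregation is sketched below: voter identities are stored only as SHA-256 digests, so duplicate ballots can be rejected while the tally reveals no raw voter IDs. This is an illustrative sketch under that assumption, not the article's system.

```python
import hashlib

seen_voters = set()  # SHA-256 digests of voter IDs already counted
tally = {}           # aggregated vote counts per candidate

def cast_vote(voter_id, candidate):
    """Count a ballot once per voter; store only the ID's digest."""
    digest = hashlib.sha256(voter_id.encode()).hexdigest()
    if digest in seen_voters:
        return False  # duplicate vote rejected
    seen_voters.add(digest)
    tally[candidate] = tally.get(candidate, 0) + 1
    return True
```

A real deployment would also need salting or keyed hashing so digests cannot be reversed by brute force over the known voter roll; the sketch omits this for brevity.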
- Published
- 2021
29. Blockchain-based Parametric Health Insurance
- Author
-
Purshottam Purswani
- Subjects
Centralized database ,Service (systems architecture) ,Actuarial science ,Event (computing) ,media_common.quotation_subject ,Asset (economics) ,Business ,Indemnity ,Duration (project management) ,Payment ,media_common ,Parametric statistics - Abstract
In traditional indemnity insurance, payment is made to the insured based on the actual loss or damage to an insured physical asset. The settlement is therefore typically a very long process, as multiple parties are involved. In comparison, parametric insurance insures a policyholder against the occurrence of a specific event by paying a set amount based on the magnitude of the event. With digitization, the insurance industry has been disrupted by innovative insurance schemes, one of them being parametric insurance. In parametric insurance, a fixed amount is paid out for an event once a certain threshold is crossed; the amount is determined from the modeled forecast of the loss. Parametric insurance will help streamline the claim process, which is currently challenged by long settlement durations and disputes. However, for parametric insurance to work, data providers are needed to capture events, along with a trust factor. In the new data-driven age, technologies like IoT, APIs, and distributed ledgers are the enablers for capturing real-time events, with the distributed ledger adding the trust factor. Though parametric insurance can be implemented with a traditional centralized database approach, that approach has limitations on trust, as the data can be manipulated. This paper explores how we can realize trusted, digital technology-enabled parametric health insurance. It dives into the technical solution and the challenges of designing and deploying a parametric insurance service. A proof of concept was carried out for health-based parametric insurance, and the resulting blueprint, which can be applied to other use cases, is shared in the sections ahead.
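The parametric trigger itself is simple enough to state in code: a fixed payout is released as soon as a reported measurement crosses the agreed threshold, with no loss-adjustment step. The threshold, index, and amounts below are invented for illustration.

```python
def parametric_payout(measurement, threshold, payout_amount):
    """Return the payout owed for one oracle-reported measurement:
    the full fixed amount if the threshold is crossed, else nothing."""
    return payout_amount if measurement >= threshold else 0

# e.g. a health policy paying a lump sum once a regional infection
# index crosses 100:
print(parametric_payout(95, 100, 5000))   # 0
print(parametric_payout(120, 100, 5000))  # 5000
```

Because the payout is a pure function of the reported measurement, it can be encoded in a smart contract; the hard problem the paper focuses on is trusting the data providers that supply the measurement.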
- Published
- 2021
30. Collaborative AI in Smart Healthcare System
- Author
-
Nusrat Jahan, Nasima Begum, Israt Binte Rashid, Obaydullah Al Numan, and A S M Touhidul Hasan
- Subjects
Information privacy ,Centralized database ,Scope (project management) ,business.industry ,Computer science ,Medical record ,Interoperability ,Health care ,Internet privacy ,Differential privacy ,Confidentiality ,business - Abstract
As technology advances, organizations increasingly aim to bring revolutionary changes to all industries. The healthcare industry is experiencing a twofold burden of illness, low benefit coverage, and a deficiency of basic financial risk-protection mechanisms. To maintain patient health records, many healthcare organizations still depend on local centralized database systems. The main problem is that when patients' confidential data is stored in a hospital's database, its integrity and privacy become the main concern. In some cases, patients' updated records are not stored in the database on time. Patients are also reluctant to disclose their medical records because of privacy issues. Therefore, the re-diagnosis process is time-consuming and complicated, which leads to high healthcare costs. To alleviate this issue, we propose a collaborative smart healthcare system with a decentralized information storage mechanism, guaranteeing the protection and interoperability of patients' medical records through collaborative use. In the proposed system, hospitals share patient data in a decentralized database, maintaining data privacy using a differential privacy mechanism. Analysis of the experimental results and of data accuracy using an AI model shows the practicality of the proposed system.
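The differential privacy step can be sketched with the standard Laplace mechanism: before a hospital shares a count, noise calibrated to the query's sensitivity and a privacy budget epsilon is added. This is a generic sketch of the mechanism class the paper names, not its specific implementation; the parameters are illustrative.

```python
import math
import random

def dp_count(true_count, epsilon, sensitivity=1, rng=random):
    """Release a count with Laplace noise of scale sensitivity/epsilon
    (inverse-CDF sampling of a Laplace(0, scale) variate)."""
    scale = sensitivity / epsilon
    u = rng.random() - 0.5
    noise = -scale * math.copysign(1.0, u) * math.log(1.0 - 2.0 * abs(u))
    return true_count + noise

# e.g. a hospital reporting how many patients matched a query:
noisy = dp_count(true_count=100, epsilon=1.0)
```

Smaller epsilon means stronger privacy but noisier shared statistics; the noise is unbiased, so aggregates across many hospitals remain close to the truth.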
- Published
- 2021
31. DIUcerts DApp: A Blockchain-based Solution for Verification of Educational Certificates
- Author
-
Hosnain Ahammad, Mahfujur Rahman, Shumrose Zaman Shetu, Shahriar Karim Shawon, and Syed Akhter Hossain
- Subjects
Centralized database ,Upgrade ,Blockchain ,ComputingMilieux_THECOMPUTINGPROFESSION ,Computer science ,Process (engineering) ,Server ,Certification ,Computer security ,computer.software_genre ,Certificate ,computer ,Maintenance engineering - Abstract
Educational certificate verification is the process of checking and verifying the legitimacy of graduates' certificates. It is a costly and lengthy procedure: university authorities invest millions of dollars each year in maintaining the process, and employers take plenty of time to verify the authenticity of an applicant's certificate. The current certification system provides traditional certificates to candidates, which can be tampered with or lost at any time. Moreover, certificates counterfeited by scammers or issued by illegitimate institutions make the process hazardous. People frequently lie about their degrees and qualifications using counterfeit certificates, and a fake certificate generated by skillful scammers is tough to distinguish from an original. Therefore, there is a crucial need to upgrade the certification and verification process. This paper introduces DIUcerts, a blockchain-based decentralized platform that offers an easy way to issue, check, and verify educational certificates. Additionally, in DIUcerts, data doesn't have to be stored in one place, as each certificate's information is kept in an individual file; all issuance and verification are done through the Ethereum platform. With this infrastructure, the cost of maintaining a blockchain-based certificate verification system can be greatly reduced compared with building a similar application on a centralized database. As a result, DIUcerts can provide better security, cost savings, and a time-saving platform for educational certificate verification.
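The verification idea behind such platforms can be sketched as follows: the issuer records a digest of each certificate, and a verifier recomputes the digest of the document presented by the applicant and checks it against the registry. The dict below stands in for the on-chain registry, and the IDs and contents are invented; this is not the DIUcerts codebase.

```python
import hashlib

registry = {}  # stand-in for the on-chain certificate registry

def issue(cert_id, certificate_bytes):
    """Record the SHA-256 digest of a newly issued certificate."""
    registry[cert_id] = hashlib.sha256(certificate_bytes).hexdigest()

def verify(cert_id, certificate_bytes):
    """Recompute the digest of the presented document and compare."""
    return registry.get(cert_id) == hashlib.sha256(
        certificate_bytes).hexdigest()

issue("DIU-2021-0042", b"B.Sc. in CSE, Jane Doe, CGPA 3.85")
print(verify("DIU-2021-0042", b"B.Sc. in CSE, Jane Doe, CGPA 3.85"))  # True
print(verify("DIU-2021-0042", b"B.Sc. in CSE, Jane Doe, CGPA 4.00"))  # False
```

Any alteration to the document, even a single digit of the grade, changes the digest and fails verification, which is why tampered or counterfeit certificates are caught immediately.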
- Published
- 2021
32. Editorial Perspectives: the need for a comprehensive, centralized database of interbasin water transfers in the United States
- Author
-
Landon Marston, David A. Dzombak, and Kerim E. Dickson
- Subjects
Centralized database ,Environmental Engineering ,Environmental science ,Environmental planning ,Water Science and Technology - Abstract
Kerim Dickson, Landon Marston and David Dzombak provide an ‘Editorial Perspective’ on the importance of understanding interbasin water transfers in managing water resources.
- Published
- 2020
33. Securing IoTs in distributed blockchain: Analysis, requirements and open issues
- Author
-
Sana Moin, Ejaz Ahmed, Kalsoom Safdar, Zanab Safdar, Ahmad Karim, and Muhammad Imran
- Subjects
Cryptocurrency ,Blockchain ,Computer Networks and Communications ,business.industry ,Computer science ,020206 networking & telecommunications ,Cloud computing ,02 engineering and technology ,Virtualization ,computer.software_genre ,Computer security ,Centralized database ,Hardware and Architecture ,Home automation ,0202 electrical engineering, electronic engineering, information engineering ,020201 artificial intelligence & image processing ,Smart environment ,business ,Adaptation (computer science) ,computer ,Software - Abstract
IoT consists of integrated, interconnected things or objects in our surroundings, with an essence of virtualization. The interconnectivity of the business world, health environments, smart home devices, and everyday gadgets takes place through networks based on cloud infrastructure that are not restricted by jurisdictional, geographic, or national boundaries. However, lightweight IoT devices come with limited storage and processing capacity. Due to this limitation, the need for separate data storage arises so that data can be utilized in the future. These third-party storage services are provided at the cost of users' privacy. Furthermore, the storage relies on a centralized database, which is more open to attack because of the chance of a single-point security breach. Present IoT data is also not trustworthy in the external environment, as there is no protection against data manipulation when data is shared with other parties. To overcome the above-mentioned limitations of IoT, the emerging secure decentralized storage technology, blockchain, has begun to make a significant impact on IoT by enhancing security and incorporating the large number of devices in today's ecosystems. In this paper, we perform a comprehensive literature review to show how well blockchain has transformed smart environments connected with IoT sensors and the underlying issues for its adoption. Further, a well-organized taxonomy is presented, highlighting the strengths, weaknesses, opportunities, and threats (SWOT) of the blockchain-based IoT environment. In addition, we present the varieties of blockchain applications, such as works based on Bitcoin (an early cryptocurrency used in blockchain) and Ethereum (which establishes smart contracts), and pinpoint their necessities and security challenges. Moreover, we highlight the essential implementation requirements of blockchain in IoT.
This paper also provides a state-of-the-art framework for IoT that adopts the security features and decentralized storage requirements of blockchain. In the end, we present insightful challenges that need to be addressed to obtain efficient, secure, and effective communication and to provide private and secure services for users as per their requirements.
- Published
- 2019
34. A Mobile-based Monitoring System for Micro Small Medium Enterprises (MSMEs) with Offline Data Synchronization
- Author
-
Jocelyn B. Barbosa
- Subjects
Centralized database ,Multidisciplinary ,User experience design ,business.industry ,Computer science ,Mobile computing ,System context diagram ,Data synchronization ,Android (operating system) ,Information repository ,Telecommunications ,business ,Agile software development - Abstract
Objectives: The use of technology has become ubiquitous in different areas of business and plays a significant role in gathering huge amounts of data. In the Philippines, for example, a government agency, the Department of Trade and Industry (DTI), is entrusted with a project to assist and monitor the status of Micro Small Medium Enterprises (MSMEs), some of which are located in rural or remote places. There are a number of approaches able to address the registration and monitoring of clients' or business establishments' records through a mobile-based application. However, there may be times when poor or no network connectivity is experienced over LTE or Wi-Fi. Hence, one of the major challenges in mobile computing is how to provide clients with the same user experience even when there are connectivity issues. In this study, we developed a mobile-based application that may significantly improve the process whereby clients or users can access data stored in a stationary database or some other data repository at any time and any place. Methods: We introduce an innovative approach for efficient tracking and monitoring of data, leveraging the strengths of data synchronization and replication. A mobile-based platform using a centralized database was designed and developed to facilitate quick registration of clients and easy portfolio monitoring of business establishments. Application: The mobile-based system is an application that runs on the Android platform for MSMEs' efficient registration and monitoring, whereby users in remote areas are provided with the same user experience despite connectivity issues (i.e. from "poor" to "no network" connectivity). We deployed the system in the Department of Trade and Industry (DTI) provincial office, which covers thousands of MSMEs. Findings: Experiments reveal that our approach provides effective results, which gain high acceptability among users. 
Keywords: Agile Approach, Agile Software Development Method, Context Diagram, Micro Small Medium Enterprises (MSME), Mobile-based Application, Monitoring System, Offline Data Synchronization
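The offline data synchronization described above can be illustrated with a minimal offline-first sketch (not the authors' implementation; the `OfflineQueue` class, its schema, and the `upload` callback are assumptions for illustration): registrations are committed to a local store regardless of connectivity, then flushed to the centralized database once the network returns.

```python
import sqlite3
import time

class OfflineQueue:
    """Minimal offline-first store: writes land in a local SQLite queue
    and are flushed to the central database once connectivity returns."""

    def __init__(self, path=":memory:"):
        self.db = sqlite3.connect(path)
        self.db.execute(
            "CREATE TABLE IF NOT EXISTS pending ("
            "id INTEGER PRIMARY KEY, payload TEXT, created REAL)")

    def register(self, payload):
        # Always succeed locally, even with no network.
        self.db.execute(
            "INSERT INTO pending (payload, created) VALUES (?, ?)",
            (payload, time.time()))
        self.db.commit()

    def sync(self, upload):
        """Push queued rows through `upload` (a callable standing in for
        the central-DB API); delete each row only after it is accepted."""
        sent = 0
        for row_id, payload in self.db.execute(
                "SELECT id, payload FROM pending").fetchall():
            if upload(payload):          # returns False on network failure
                self.db.execute("DELETE FROM pending WHERE id = ?", (row_id,))
                sent += 1
        self.db.commit()
        return sent
```

Deleting a queued row only after the upload callback accepts it is what makes the scheme safe against mid-sync connectivity loss: un-flushed rows simply remain pending for the next attempt.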
- Published
- 2019
35. CLAIM MANAGEMENT FRAMEWORK UNDER FIDIC 2017: CONTRACTOR CLAIM SUBMISSION
- Author
-
Ahmed EL-Ghrory, Norain Ismail, and Md. Nor Hayati Bin Tahir
- Subjects
Process management ,Event (computing) ,Computer science ,General Arts and Humanities ,media_common.quotation_subject ,Rank (computer programming) ,General Social Sciences ,Document management system ,computer.software_genre ,Transparency (behavior) ,Information sensitivity ,Centralized database ,Documentation ,Originality ,computer ,media_common - Abstract
Purpose of the study: The objective of this research is to develop a framework for managing claim documents. Contractual and management issues are considered in this framework to enhance the Claims Management System (CMS). The framework includes the mechanism of claim submission based on the clauses and procedures of the Fédération Internationale des Ingénieurs-Conseils (FIDIC, 2017). Methodology: A qualitative methodology was selected for this paper, as the topic requires collecting sensitive information from experienced professionals. The proposed Claims Management (CM) framework is developed on the basis of a study conducted to rank the features required for a CMS. This study was conducted among 43 experts in the CM field working in category A contractor firms. The framework was then verified by seven experts who participated in the first study. Main Findings: Eleven features required for a CMS that can enhance contractor claim submission were identified. These factors have different levels of importance; the top three are Tracking Claim Status (99.5%), Supporting All Types of Documents (96.3%), and having a Centralized Database (93.0%). Based on these features, the proposed framework was developed to improve contractor claim submission. Applications of this study: Applying the proposed framework reduces the human effort of retrieving claim-related documents through its systematic recording, transparency, reminder feature, contractual guidance, user-friendliness, and other system features. Moreover, it provides contractual support pursuant to FIDIC 2017 clauses. Novelty/Originality of this study: The framework will improve contractor claim submission, and the contractor will be satisfied with the claim resolution and engineer determination. At the same time, the framework will save about 50% of the time claims analysts usually spend collecting, screening, and identifying information related to a claim event in the project's documentation.
- Published
- 2019
36. EVALUATION OF THE SPATIAL DISTRIBUTION OF EVACUATION CENTERS IN METRO MANILA, PHILIPPINES
- Author
-
E. P. Cajucom, G. Y. Chao Jr., G. A. Constantino, J. A. Ejares, S. J. G. Quillope, H. M. Solomon, and C. L. Ringor
- Subjects
lcsh:Applied optics. Photonics ,010504 meteorology & atmospheric sciences ,Emergency management ,business.industry ,lcsh:T ,lcsh:TA1501-1820 ,Megalopolis ,010502 geochemistry & geophysics ,01 natural sciences ,lcsh:Technology ,Transport engineering ,Centralized database ,Geography ,lcsh:TA1-2040 ,Distance analysis ,Hazard zone ,business ,lcsh:Engineering (General). Civil engineering (General) ,0105 earth and related environmental sciences - Abstract
In a densely populated and hazard-prone megalopolis like Metro Manila, the ability to execute a rapid evacuation protocol is crucial to saving lives and minimizing damage during disastrous events. However, there is no centralized database on the location of evacuation centers (ECs) in Metro Manila, and the available lists are not up to date. This study geotagged the current list of ECs in Metro Manila obtained from different government agencies to evaluate their spatial distribution using a Geographic Information System (GIS). This is important since the immediate evacuation of residents depends on the proximity and safe location of the ECs. A total of 870 ECs were geotagged and validated using the street view of Google Earth™. EC-to-population ratios were calculated for each of the 16 cities and one municipality of Metro Manila. Values range from ~3,000 to 81,000 persons per EC. Distance analysis using Thiessen polygons shows that the ECs are not evenly distributed, with proximity areas ranging from 0.0009 to 9.5 km2. Of the total number of mapped ECs, 392 (45%) are situated in flood-prone areas, while 108 (12%) are within the 1-km buffer hazard zone of an active faultline. Re-evaluation of the locations and the number of ECs per city or municipality is highly recommended to facilitate prompt evacuation when disasters strike.
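The Thiessen-polygon proximity analysis amounts to assigning every location to its closest EC. A small sketch of that assignment and of the EC-to-population ratio (illustrative only; the coordinates and helper names are assumptions, and real work would use projected GIS coordinates rather than raw planar distances):

```python
import math

def nearest_center(point, centers):
    """Assign a location to its closest evacuation center -- the discrete
    analogue of Thiessen-polygon proximity analysis (centers is a dict
    mapping EC name to an (x, y) coordinate pair)."""
    return min(centers, key=lambda name: math.dist(point, centers[name]))

def ec_to_population_ratio(population, n_centers):
    """Persons served per evacuation center in a city or municipality."""
    return population / n_centers
```

For example, a city of 81,000 residents with 27 ECs yields a ratio of 3,000 persons per EC, the low end of the range reported above.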
- Published
- 2019
37. Ultralightweight Mutual Authentication RFID Protocol for Blockchain Enabled Supply Chains
- Author
-
Ren Ohmura, Junya Nakamura, Ming Tze Ong, Ravivarma Vikneswaren Sridharan, Jing Huey Khor, and Michail Sidorov
- Subjects
radio frequency identification ,Security analysis ,Supply chain management ,Blockchain ,General Computer Science ,business.industry ,Computer science ,Supply chain ,General Engineering ,020206 networking & telecommunications ,02 engineering and technology ,Mutual authentication ,Internet security ,Centralized database ,0202 electrical engineering, electronic engineering, information engineering ,Data Protection Act 1998 ,Radio-frequency identification ,020201 artificial intelligence & image processing ,General Materials Science ,lcsh:Electrical engineering. Electronics. Nuclear engineering ,business ,distributed ledger technology ,lcsh:TK1-9971 ,Computer network - Abstract
Previous research has mostly focused on enhancing the security of radio frequency identification (RFID) protocols for various RFID applications that rely on a centralized database. However, blockchain technology is quickly emerging as a novel distributed and decentralized alternative that provides higher data protection, reliability, immutability, and transparency, and lower management costs, compared with a conventional centralized database. These properties make it extremely suitable for integration into a supply chain management system. In order to successfully fuse RFID and blockchain technologies, a secure method of communication is required between the RFID-tagged goods and the blockchain nodes. Therefore, this paper proposes a robust ultralightweight mutual authentication RFID protocol that works together with a decentralized database to create a secure blockchain-enabled supply chain management system. A detailed security analysis is performed to prove that the proposed protocol is secure against key disclosure, replay, man-in-the-middle, de-synchronization, and tracking attacks. In addition, a formal analysis is conducted using Gong, Needham, and Yahalom (GNY) logic and the Automated Validation of Internet Security Protocols and Applications (AVISPA) tool to verify the security of the proposed protocol. The protocol is proven to be efficient with respect to storage, computational, and communication costs. As a further step to ensure the robustness of the protocol, the probability of collision of data written to the blockchain is analyzed.
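Ultralightweight protocols of this family restrict themselves to bitwise primitives such as XOR and rotation so they fit passive tags. The sketch below is a generic illustration of that style of mutual authentication, not the protocol proposed in the paper; the message formulas are invented for illustration and offer no real security.

```python
import secrets

BITS = 96                 # tag word size; an assumption for illustration
MASK = (1 << BITS) - 1

def rot(x, y):
    """Left-rotate x by (y mod BITS) bits -- rotation, XOR, and modular
    addition are the typical primitives ultralightweight protocols allow."""
    s = y % BITS
    return ((x << s) | (x >> (BITS - s))) & MASK

def tag_response(tag_id, key, nonce):
    # A = Rot(ID xor n, K): shows the tag knows both ID and K.
    return rot((tag_id ^ nonce) & MASK, key)

def reader_response(key, nonce):
    # B = Rot(K xor n, n): shows the reader knows K (the mutual step).
    return rot((key ^ nonce) & MASK, nonce)

def mutual_authenticate(tag_id, tag_key, reader_key, nonce=None):
    """One round: the reader challenges with a nonce, checks the tag's
    reply, then proves its own knowledge of the key back to the tag."""
    if nonce is None:
        nonce = secrets.randbits(BITS)
    a = tag_response(tag_id, tag_key, nonce)
    reader_ok = a == rot((tag_id ^ nonce) & MASK, reader_key)
    b = reader_response(reader_key, nonce)
    tag_ok = b == rot((tag_key ^ nonce) & MASK, nonce)
    return reader_ok and tag_ok
```

Authentication succeeds only when both sides hold the same shared key; a tag (or rogue reader) with a mismatched key fails the corresponding check.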
- Published
- 2019
38. User behavior analysis-based smart energy management for webpage ranking: Learning automata-based solution
- Author
-
Aaisha Makkar and Neeraj Kumar
- Subjects
Information retrieval ,General Computer Science ,Learning automata ,Computer science ,Energy management ,020206 networking & telecommunications ,02 engineering and technology ,Energy consumption ,Hyperlink ,law.invention ,Centralized database ,PageRank ,law ,Web page ,0202 electrical engineering, electronic engineering, information engineering ,020201 artificial intelligence & image processing ,Learning to rank ,Electrical and Electronic Engineering - Abstract
Search engines are widely used for surfing the Internet. Different search engines vary in their accuracy and in the time needed to fetch information from distributed or centralized database repositories across the globe. It has been found in the literature that webpage ranking helps save the user's surfing time, which in turn saves considerable energy during computation and transmission across the network. Most earlier solutions reported in the literature use the hyperlink structure of the web graph, which consumes a lot of energy during computation. This can also lead to the link-leakage problem as spam pages occur more often. Nowadays, hyperlink structure alone is inadequate for predicting webpage importance, keeping in view the energy consumption of various smart devices; user browsing behavior depicts a page's real importance. It is essential to demote spam pages to increase search engine accuracy and speed. Hence, user behavior analysis along with demotion of spam pages can improve Search Engine Result Pages (SERPs), which in turn saves energy. In the proposed approach, the webpage importance score is computed by analyzing user surfing behavior attributes, dwell time, and click count. After computing the webpage importance score, the ranks are revised by implementing it in a Learning Automata (LA) environment. A learning automaton is a stochastic system that learns from the environment and responds with either a reward or a penalty. With every response from the environment, the probability of visiting the webpage is updated; probability computation is done using Normal and Gamma distribution functions. In the proposal, only dangling pages are considered in the experiments: inactive webpages are punished and degraded from the system. We validated the proposed approach with the Microsoft Learning to Rank dataset. 
The experiments performed show that 3403 out of 12211 dangling pages were degraded using the proposed scheme, achieving its objective of saving web energy and computational cost. The scheme converges in 100 iterations, executed in 21.88 ms. Moreover, the user behavior analysis helped improve the PageRank scores of the webpages.
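The reward/penalty update described above can be sketched with a standard linear reward-penalty (L_RP) automaton (a generic scheme, not the paper's; the learning rates `a` and `b` and the uniform redistribution of penalty mass are assumptions, and the Normal/Gamma probability computation is not reproduced here):

```python
def lrp_update(probs, chosen, reward, a=0.1, b=0.1):
    """One linear reward-penalty (L_RP) step over the visit probabilities
    of n webpages. On reward the chosen page's probability grows; on
    penalty it shrinks and the freed mass spreads over the other pages."""
    n = len(probs)
    new = list(probs)
    if reward:
        for i in range(n):
            if i == chosen:
                new[i] = probs[i] + a * (1 - probs[i])
            else:
                new[i] = (1 - a) * probs[i]
    else:
        for i in range(n):
            if i == chosen:
                new[i] = (1 - b) * probs[i]
            else:
                new[i] = b / (n - 1) + (1 - b) * probs[i]
    return new
```

Repeated penalties drive a dangling page's visit probability toward zero, which is how an inactive page ends up degraded from the ranking.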
- Published
- 2018
39. Fog and Cloud Computing-based IoT in Healthcare Monitoring System for Healthy Ageing
- Author
-
Maria Gheorghe-Moisii, Eugenia Tîrziu, Nicolae-Dragos Nicolau, and Eleonora Tudora
- Subjects
Data processing ,Computer science ,business.industry ,Cloud computing ,Computer security ,computer.software_genre ,Data flow diagram ,Centralized database ,Smart environment ,State (computer science) ,Latency (engineering) ,business ,computer ,Wireless sensor network - Abstract
The Internet of Things (IoT), through portable devices and sensor networks, has the potential to transform the way healthcare monitoring systems collect and analyze patients' medical data. Nowadays, IoT-based healthcare monitoring systems for the elderly transfer large amounts of data to a centralized database and from that database to cloud data centers, which decreases the performance of such systems. Fog computing provides innovative solutions by bringing resources closer to the user and by offering low-latency, efficient data processing compared with the cloud. In this paper, we present the "Non-invasive monitoring and health assessment of the elderly in a smart environment (RO-SmartAgeing)" project, which supports elderly-centered healthcare services in order to ensure illness prevention, personalized assistance, and healthcare monitoring. In line with the current development state of the project, we propose a functional framework for the architecture of the RO-SmartAgeing system based on IoT, fog, and cloud computing, present the hardware components necessary for the system's proper operation and the data flow chart of the RO-SmartAgeing system, and give an example of numerical validation and filtering of data from IoT devices.
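The role of the fog tier, validating and reducing sensor data before it reaches the centralized database and the cloud, can be sketched as follows (the plausibility thresholds and window size are illustrative assumptions, not values from the RO-SmartAgeing project):

```python
def fog_filter(readings, low=30.0, high=45.0):
    """Drop physically implausible body-temperature readings at the fog
    node so only validated data is forwarded toward the cloud tier."""
    return [r for r in readings if low <= r <= high]

def fog_aggregate(readings, window=5):
    """Summarize each window of samples locally; the cloud then receives
    one compact record per window instead of every raw sample."""
    out = []
    for i in range(0, len(readings), window):
        w = readings[i:i + window]
        if w:
            out.append({"n": len(w), "min": min(w),
                        "max": max(w), "mean": sum(w) / len(w)})
    return out
```

Filtering and aggregating at the fog node is what delivers the low latency and reduced upstream traffic the abstract attributes to fog computing.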
- Published
- 2021
40. Crowd-Sourced Centralized Thermal Imaging for Isolation and Quarantine
- Author
-
Prabuddha Sinha, Sujata Pal, and Sudershan Kumar
- Subjects
Government ,Centralized database ,Data collection ,Computer science ,Process (engineering) ,SAFER ,Isolation (database systems) ,Computer security ,computer.software_genre ,Phase (combat) ,computer ,Data warehouse - Abstract
Without an effective vaccine yet for COVID-19, the only way to stop the spread of the virus is quarantine and isolation: identifying the people who contracted the virus and isolating them. High temperature is one of the major indicators of the virus and can be used to screen suspected patients. Thermal infrared sensors are emerging as a safer alternative to the mercury thermometer, which carries a higher risk of virus transmission. The process of thermal screening is not being utilized to its full potential, as it is conducted only at specific points such as hospitals, airports, and government offices. The data collected is random and too sparse to support any specific inference about the spread of the virus. Therefore, we propose a centralized, distributed, and connected system in which the temperature of people in specific areas is monitored constantly using thermal cameras in various parts of the city, without putting healthcare workers at risk. This data is also updated simultaneously in online real-time databases that can be viewed by government agencies for any inferences. The temperature data is open, and educational institutions can also use it to draw innovative inferences. The proposed system comprises three primary phases: the data collection phase, consisting of sensor detection; the centralized or integration phase, consisting of the data warehouse or centralized database system; and the user phase, which is optional and in which officials decide whether to send alert messages.
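The three phases can be sketched as a small pipeline (the names and the fever threshold are illustrative assumptions, not values from the paper): cameras produce readings (phase 1), a centralized store integrates them (phase 2), and officials query the alerts (phase 3).

```python
from dataclasses import dataclass, field

FEVER_C = 37.8  # screening threshold in Celsius (assumed, not from the paper)

@dataclass
class Reading:
    camera_id: str   # which thermal camera produced the sample (phase 1)
    temp_c: float

@dataclass
class CentralStore:
    """Phase 2: the centralized database that integrates all cameras."""
    rows: list = field(default_factory=list)

    def ingest(self, reading):
        # Phase 1 feeds phase 2: every camera sample lands here.
        self.rows.append(reading)

    def alerts(self):
        # Phase 3: the view officials consult before sending messages.
        return [r for r in self.rows if r.temp_c >= FEVER_C]

store = CentralStore()
for r in [Reading("market-1", 36.6), Reading("station-2", 38.4)]:
    store.ingest(r)
```

Because every camera writes to the same store, inferences about spread can be drawn city-wide rather than from the sparse, point-by-point screening the abstract criticizes.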
- Published
- 2021
41. Status of Bioinformatics Education in South Asia: Past and Present
- Author
-
Muhammad Hamid, Saadia Malik, Furqan Awan, Muhammad Muddassir Ali, Muhammad Saleem, Natash Ali Mian, Nadia Tabassum, Muhammad Ahmed Ihsan, and Khalid Mehmood
- Subjects
0301 basic medicine ,South asia ,Universities ,Computer science ,media_common.quotation_subject ,0206 medical engineering ,MEDLINE ,Developing country ,02 engineering and technology ,Review Article ,Bioinformatics ,General Biochemistry, Genetics and Molecular Biology ,Field (computer science) ,03 medical and health sciences ,Asia, Western ,Institution ,Humans ,Developing Countries ,media_common ,General Immunology and Microbiology ,business.industry ,Computational Biology ,General Medicine ,Centralized database ,030104 developmental biology ,Agriculture ,Medicine ,Sri lanka ,business ,020602 bioinformatics - Abstract
Bioinformatics education has been a hot topic in South Asia, and interest in it has peaked since the start of the 21st century. The governments of South Asian countries have made systematic efforts in bioinformatics, developing infrastructure to give the scientific community maximum facilities and to gain maximum output in this field. This article reviews bioinformatics education in South Asia, the measures taken to implement it, and its importance, along with proper ways of remedying the flaws in bioinformatics education. It also addresses the problems faced in South Asia and proposes some recommendations regarding bioinformatics education. The information regarding bioinformatics education and institutes was collected from existing research papers, databases, and surveys, and then confirmed by visiting each institution's website, while the problems and solutions presented in the article are mostly in line with the objectives of South Asian bioinformatics conferences and institutions. Among South Asian countries, India and Pakistan have developed bioinformatics infrastructure and education rapidly compared with other countries, whereas Bangladesh, Sri Lanka, and Nepal are still progressing in this field. To advance these different sectors, the bioinformatics industry has to be revolutionized; this will contribute to strengthening the pharmaceutical, agricultural, and molecular sectors in South Asia. University infrastructure needs to be brought on a par with current international standards, which will produce well-trained professionals with skills in multiple fields such as biotechnology, mathematics, statistics, and computer science, and this contribution will grow as the standard of education in the South Asian countries increases. 
A framework for developing a centralized database is suggested, following the literature review, to collect and store information on the current status of South Asian bioinformatics education. It would be named the South Asian Bioinformatics Education Database (SABE). SABE would provide comprehensive information regarding bioinformatics in South Asian countries, queryable by country name, expert, or university name, to explore the top-ranked outputs relevant to queries.
- Published
- 2021
42. ATAV: a comprehensive platform for population-scale genomic analyses
- Author
-
Hongzhu Cui, David Goldstein, Nitin Bhardwaj, Joseph A. Hostyk, Gundula Povysil, and Zhong Ren
- Subjects
Computer science ,Interface (Java) ,Gene discovery ,Population ,lcsh:Computer applications to medicine. Medical informatics ,Biochemistry ,03 medical and health sciences ,0302 clinical medicine ,Structural Biology ,Databases, Genetic ,Exome Sequencing ,Code (cryptography) ,Humans ,Diagnostic ,education ,lcsh:QH301-705.5 ,Molecular Biology ,030304 developmental biology ,0303 health sciences ,education.field_of_study ,Information retrieval ,Applied Mathematics ,Reproducibility of Results ,Coverage data ,Genome analysis ,Genomics ,Computer Science Applications ,Identification (information) ,Centralized database ,Association testing ,Genetics, Population ,Phenotype ,lcsh:Biology (General) ,Scalability ,lcsh:R858-859.7 ,User interface ,Web platform ,030217 neurology & neurosurgery ,Software - Abstract
Background A common approach for sequencing studies is to do joint-calling and store variants of all samples in a single file. If new samples are continually added or controls are re-used for several studies, the cost and time required to perform joint-calling for each analysis can become prohibitive. Results We present ATAV, an analysis platform for large-scale whole-exome and whole-genome sequencing projects. ATAV stores variant and per site coverage data for all samples in a centralized database, which is efficiently queried by ATAV to support diagnostic analyses for trios and singletons, as well as rare-variant collapsing analyses for finding disease associations in complex diseases. Runtime logs ensure full reproducibility and the modularized ATAV framework makes it extensible to continuous development. Besides helping with the identification of disease-causing variants for a range of diseases, ATAV has also enabled the discovery of disease-genes by rare-variant collapsing on datasets containing more than 20,000 samples. Analyses to date have been performed on data of more than 110,000 individuals demonstrating the scalability of the framework. To allow users to easily access variant-level data directly from the database, we provide a web-based interface, the ATAV data browser (http://atavdb.org/). Through this browser, summary-level data for more than 40,000 samples can be queried by the general public representing a mix of cases and controls of diverse ancestries. Users have access to phenotype categories of variant carriers, as well as predicted ancestry, gender, and quality metrics. In contrast to many other platforms, the data browser is able to show data of newly-added samples in real-time and therefore evolves rapidly as more and more samples are sequenced. Conclusions Through ATAV, users have public access to one of the largest variant databases for patients sequenced at a tertiary care center and can look up any genes or variants of interest. 
Additionally, since the entire code is freely available on GitHub, ATAV can easily be deployed by other groups that wish to build their own platform, database, and user interface.
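The idea of a rare-variant collapsing query against a centralized variant database can be sketched in SQL (this is a toy schema invented for illustration, not ATAV's actual database design): count the distinct case and control carriers of qualifying (rare) variants per gene.

```python
import sqlite3

# Toy stand-in for a centralized variant store: one table of variants
# with allele frequencies, one table of per-sample carrier records.
db = sqlite3.connect(":memory:")
db.executescript("""
CREATE TABLE variant (gene TEXT, pos INTEGER, af REAL);
CREATE TABLE carrier (gene TEXT, pos INTEGER, sample TEXT, is_case INTEGER);
""")
db.executemany("INSERT INTO variant VALUES (?,?,?)", [
    ("GENE1", 101, 0.0001), ("GENE1", 205, 0.2), ("GENE2", 77, 0.00005)])
db.executemany("INSERT INTO carrier VALUES (?,?,?,?)", [
    ("GENE1", 101, "s1", 1), ("GENE1", 205, "s2", 0),
    ("GENE2", 77, "s3", 1), ("GENE2", 77, "s4", 0)])

def collapse(max_af=0.001):
    """Count distinct case/control carriers of rare ('qualifying')
    variants per gene -- the core tally of a rare-variant collapsing test."""
    sql = """
    SELECT c.gene,
           COUNT(DISTINCT CASE WHEN c.is_case = 1 THEN c.sample END),
           COUNT(DISTINCT CASE WHEN c.is_case = 0 THEN c.sample END)
    FROM carrier c JOIN variant v ON v.gene = c.gene AND v.pos = c.pos
    WHERE v.af <= ?
    GROUP BY c.gene ORDER BY c.gene
    """
    return db.execute(sql, (max_af,)).fetchall()
```

The common GENE1 variant (allele frequency 0.2) is excluded by the qualifying filter, so its carrier never enters the per-gene counts; association testing would then compare the case and control tallies.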
- Published
- 2021
43. Geographic Visualization of Mortality in the United States as Related to Healthcare Access by County
- Author
-
Eric Wiener, Landan Peters, Prithvi Chippada, Jason Widrich, Shelley Nation, and Eldon Jenkins
- Subjects
index of medical underservice (imu) ,Bivariate analysis ,Healthcare Technology ,030204 cardiovascular system & hematology ,03 medical and health sciences ,0302 clinical medicine ,Environmental health ,Health care ,map ,area deprivation index (adi) ,Medicine ,Social determinants of health ,Choropleth map ,Socioeconomic status ,business.industry ,chloropleth ,Mortality rate ,General Engineering ,mortality ,Centralized database ,Epidemiology/Public Health ,social determinants of health (sdoh) ,Geovisualization ,Public Health ,business ,030217 neurology & neurosurgery ,bivariate - Abstract
This investigation analyzed the impact of place-based inequities on mortality rates in 2014. The team combined mortality data with metrics on healthcare accessibility, socioeconomic deprivation, and other variables available from publicly available data sets. The investigation team created a centralized database for visualizations that combined mortality data by diagnosis, socioeconomic data, health resource data, and an index of area deprivation. Choropleth maps, scatterplots, and regression analyses were used to identify the major areas of mortality and how well different measures of the social determinants of health (SDOH) correlate with mortality data. A bivariate color scheme that visually captures both outcomes and SDOH in a choropleth map was shown to be a compact and novel way to display complex epidemiologic data.
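A bivariate choropleth class assignment can be sketched as follows (the cut points and hex colors are placeholders, not the study's actual classification): each county's pair of values maps to one of nine color classes, so both variables show in a single map fill.

```python
def tercile(value, cutpoints):
    """Return 0, 1, or 2 depending on where value falls relative to the
    two cut points (e.g. terciles of the observed distribution)."""
    lo, hi = cutpoints
    return 0 if value < lo else (1 if value < hi else 2)

# 3x3 bivariate palette: rows = mortality class, cols = deprivation class.
# Hex values are placeholder colors, not the scheme used in the study.
PALETTE = [
    ["#e8e8e8", "#b5c0da", "#6c83b5"],
    ["#b8d6be", "#90b2b3", "#567994"],
    ["#73ae80", "#5a9178", "#2a5a5b"],
]

def bivariate_color(mortality, deprivation,
                    mort_cuts=(8.0, 11.0), depr_cuts=(40.0, 70.0)):
    """Map a county's (mortality rate, deprivation index) pair to one of
    nine classes; the cut points here are illustrative assumptions."""
    return PALETTE[tercile(mortality, mort_cuts)][tercile(deprivation, depr_cuts)]
```

Counties low on both variables fall in the top-left class and counties high on both in the bottom-right, which is what lets a single choropleth reveal where poor outcomes and deprivation coincide.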
- Published
- 2021
44. Transaction synchronization and privacy aspect in blockchain decentralized applications
- Author
-
Ongkasuwan, Patarawan and Ongkasuwan, Patarawan
- Abstract
The ideas and techniques of cryptography and decentralized storage have seen tremendous growth in many industries, as they have been adopted to improve activities in organizations. This technology, called blockchain, provides an effective transparency solution. Blockchain has generally been used for digital currency, or cryptocurrency, since its inception. One of the best-known blockchain protocols is Ethereum, which invented the smart contract to give the blockchain the ability to execute conditions rather than simply acting as storage. Applications that adopt this technology are called 'Dapps' or 'decentralized applications'. However, there are ongoing arguments about synchronization in such systems. System synchronization is currently extremely important for applications, because the waiting time for a transaction to be verified can cause dissatisfaction with the user experience. Several studies have revealed that privacy leakage occurs, even though the blockchain provides a degree of security, as a result of the traditional transaction model, which requires approval through intermediary institutions. For instance, a bank needs to process transactions via many institutional parties before receiving the final confirmation, which requires the user to wait a considerable amount of time. This thesis describes the challenge of transaction synchronization between the user and the smart contract, as well as the matter of a privacy strategy for the system and compliance. To approach these two challenges, the first task separates different events and evaluates the results compared with an alternative solution. This is done by testing the smart contract to find the best gas price result, which varies over time. In the Ethereum protocol, the gas price is one of the best ways to decrease the transaction time to meet user expectations; the gas price is affected by the code structure and the network. 
In the smart contract, testing is run on two cases and addresses platform problems that affect the user experience.
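The fee mechanics behind the gas-price tuning discussed above can be sketched as follows (this reflects the legacy first-price gas auction in use when the thesis was written, before EIP-1559 changed Ethereum's fee model; the percentile bidding strategy is an invented illustration, not the thesis's method):

```python
WEI_PER_GWEI = 10**9
WEI_PER_ETH = 10**18

def tx_fee_eth(gas_used, gas_price_gwei):
    """Fee the sender pays: gas consumed by the contract call times the
    gas price offered, converted from wei to ether."""
    return gas_used * gas_price_gwei * WEI_PER_GWEI / WEI_PER_ETH

def pick_gas_price(pending_prices_gwei, percentile=0.6):
    """Naive strategy: bid above a chosen share of currently pending
    transactions so miners include ours sooner (shorter wait time)."""
    ranked = sorted(pending_prices_gwei)
    idx = min(int(len(ranked) * percentile), len(ranked) - 1)
    return ranked[idx]
```

The trade-off the thesis measures falls out directly: a higher bid shortens the wait for confirmation but raises the fee, and the code structure matters because it determines `gas_used`.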
- Published
- 2020
45. The Orthopaedic Resident Selection Process: Proposed Reforms and Lessons From Other Specialties
- Author
-
Daniel A. London and Ryley K. Zastrow
- Subjects
Resident selection ,Medical education ,Web of science ,Coronavirus disease 2019 (COVID-19) ,business.industry ,Process (engineering) ,Scopus ,MEDLINE ,COVID-19 ,Internship and Residency ,Context (language use) ,Centralized database ,Orthopedics ,Medicine ,Humans ,Orthopedics and Sports Medicine ,Surgery ,School Admission Criteria ,business ,Pandemics - Abstract
INTRODUCTION: Proposals for substantive reforms to the orthopaedic resident selection process are growing, given increasing applicant competitiveness, burgeoning inefficiencies and inequities of the current system, and impending transition of Step 1 to pass/fail. The COVID-19 pandemic has further catalyzed the need for reforms, offering unprecedented opportunities to pilot novel changes. However, a comprehensive collation of all proposed and implemented orthopaedic reforms is currently lacking. Thus, we aimed to characterize proposed orthopaedic-specific resident selection reforms in the context of reforms implemented by other specialties. METHODS: EMBASE, MEDLINE, Scopus, and Web of Science databases were searched for references proposing reforms to the orthopaedic resident selection process published from 2005 to 2020. An inductive approach to qualitative content analysis was used to categorize reforms. RESULTS: Twenty-six articles proposing 13 unique reforms to the orthopaedic resident selection process were identified. The most commonly proposed reforms included noncognitive assessments (n = 8), application caps (n = 7), standardized letters of recommendation (n = 5), program-specific supplemental applications (n = 5), creation of a centralized database of standardized program information (n = 4), use of a standardized applicant composite score (n = 4), and a moratorium on postinterview communication (n = 4). Importantly, nearly all of these reforms have also been proposed or implemented by other specialties. DISCUSSION: Numerous reforms to the orthopaedic resident selection process have been suggested over the past 15 years, several of which have been implemented on a program-specific basis, including noncognitive assessments, supplemental applications, and standardized letters of recommendation. Careful examination of applicant and program experiences and Match outcomes after these reforms is imperative to inform future directions.
- Published
- 2021
46. Applying Blockchain Technology to Secure Object Detection Data
- Author
-
Zeyad A.T. Ahmed, Pravin L. Yannawar, Ali Mansour Al-madani, and Ahmed Abdullah A. Shareef
- Subjects
Centralized database ,Blockchain ,Distributed database ,business.industry ,Computer science ,Hash function ,Trusted third party ,business ,Encryption ,Object detection ,Computer network ,Data modeling - Abstract
Blockchain, deep learning, and computer vision are currently among the most fundamental technologies in the world, and researchers have been paying close attention to their use. Centralized database systems suffer from low security because data must pass through a trusted third party and may be intercepted in transit. Blockchain, by contrast, provides a distributed database that makes the network secure, flexible, and capable of supporting real-time services. This article proposes a blockchain-based object detection model. The system secures files using the InterPlanetary File System (IPFS), which receives the files, stores them in the decentralized application, and sends the resulting hash to Ethereum for storage. A YOLOv3 object detection model then receives the encrypted data via its hash value and detects objects in images drawn from a large dataset of different object classes.
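The storage pattern this abstract describes (bulky files off-chain in a content-addressed store, only the hash on-chain) can be sketched as follows. This is a minimal illustration only: the IPFS and Ethereum interactions are simulated with in-memory stand-ins, and all class names are invented for the example.

```python
import hashlib

class FakeIPFS:
    """Stand-in for IPFS: a content-addressed store keyed by SHA-256.
    (Real IPFS uses multihash CIDs; this only illustrates the idea.)"""
    def __init__(self):
        self.store = {}

    def add(self, data: bytes) -> str:
        cid = hashlib.sha256(data).hexdigest()
        self.store[cid] = data
        return cid

    def get(self, cid: str) -> bytes:
        return self.store[cid]

class FakeChain:
    """Stand-in for an Ethereum contract that records only the hashes."""
    def __init__(self):
        self.hashes = []

    def record(self, cid: str):
        self.hashes.append(cid)

ipfs = FakeIPFS()
chain = FakeChain()

image = b"raw image bytes destined for the detection model"
cid = ipfs.add(image)   # the bulky file stays off-chain
chain.record(cid)       # only the small hash goes on-chain

# A consumer (e.g. the detection model) retrieves the file by hash
# and can verify integrity by re-hashing.
fetched = ipfs.get(chain.hashes[-1])
assert hashlib.sha256(fetched).hexdigest() == cid
```

The on-chain record stays small and tamper-evident, while the heavy image data lives in the content-addressed store.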
- Published
- 2021
47. BlockChain Based Inventory Management by QR Code Using Open CV
- Author
-
Shaik Reehana, G. Vidhya Lakshmi, Bodapati Nagaeswari, and Subbarao Gogulamudi
- Subjects
Information transfer ,Blockchain ,Supply chain management ,Traceability ,Database ,Computer science ,business.industry ,Supply chain ,020208 electrical & electronic engineering ,02 engineering and technology ,computer.software_genre ,Automation ,Centralized database ,0202 electrical engineering, electronic engineering, information engineering ,Code (cryptography) ,business ,computer - Abstract
Inventory management is the part of the supply chain where inventory and stock quantities are tracked into and out of the stockroom. Proper handling of inventory leads to successful supply chain management in any organization. QR codes speed up this inventory management: faster information transfer reduces the number of errors in inventory records and yields accurate results for informed decisions during frequent reviews. However, inventory management based on Quick Response (QR) codes alone still relies on a centralized database. Blockchain enables manufacturers to connect every party, from distribution centers and retail partners to suppliers and production sites, with an enduring record of every single exchange that occurs. These stored records are available to everyone within the P2P network, providing decentralization. The transparency and permanence offered by blockchain are often useful for manufacturers in managing product provenance and traceability. Smart contracts, one of the features of blockchain, have built-in automation, which makes a lot of sense for transaction management. In this paper, we combine the features of QR codes and blockchain for transparent, distributed, and reliable inventory management.
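The core combination the abstract describes, QR-scanned stock movements recorded on a hash-linked ledger, can be sketched minimally as below. This is an illustrative toy, not the paper's implementation: the QR scan is represented by the already-decoded transaction data, and there is no peer-to-peer consensus.

```python
import hashlib
import json

def block_hash(block: dict) -> str:
    payload = json.dumps(block, sort_keys=True).encode()
    return hashlib.sha256(payload).hexdigest()

class InventoryLedger:
    """Minimal hash-linked ledger of stock movements (illustrative only)."""
    def __init__(self):
        genesis = {"index": 0, "prev": "0" * 64, "tx": None}
        self.chain = [genesis]

    def add_movement(self, item: str, qty: int, direction: str):
        # In a real system, scanning the QR code would supply this data.
        tx = {"item": item, "qty": qty, "direction": direction}
        block = {"index": len(self.chain),
                 "prev": block_hash(self.chain[-1]),
                 "tx": tx}
        self.chain.append(block)

    def verify(self) -> bool:
        # Each block must link to the hash of its predecessor;
        # any tampering with a past record breaks the link.
        return all(self.chain[i]["prev"] == block_hash(self.chain[i - 1])
                   for i in range(1, len(self.chain)))

ledger = InventoryLedger()
ledger.add_movement("SKU-1042", 50, "in")
ledger.add_movement("SKU-1042", 8, "out")
print(ledger.verify())  # True: every block links to its predecessor
```

Because each block embeds the previous block's hash, retroactively editing any stock record invalidates every later link, which is the traceability property the abstract attributes to blockchain.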
- Published
- 2021
48. Search Engine Database Systems Tools for Building Literature
- Author
-
Zayar Aung, Hein Zaw Htet, and Alexey R. Fedrorov
- Subjects
SQL ,Database ,business.industry ,Computer science ,ASP.NET ,computer.software_genre ,Centralized database ,Search engine ,Software ,Information system ,The Internet ,Gradient boosting ,business ,computer ,computer.programming_language - Abstract
This article discusses retrieving research information from search engines. A search engine is a centralized database system that allows users on the WWW to run query-based searches for information. The work analyzes the design, software, and implementation of the relevant technical tools, with special attention to organizations that search for information via the Internet. In this article, we develop a database procedure for storing information and implement it in Microsoft SQL Server using ASP.NET; on this basis, a book-search information system was implemented. Today there are many search engines on the Internet, each with its own features and functions. The article concludes that the search engine is an effective tool for building a literature-search database.
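The query-based book search the abstract describes can be illustrated with a few lines of SQL. The paper uses Microsoft SQL Server with ASP.NET; the sketch below uses SQLite purely so the example is self-contained, and the schema and sample rows are invented.

```python
import sqlite3

# Illustrative schema only; the actual system stores book records
# in SQL Server, but the query pattern is the same.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE books (id INTEGER PRIMARY KEY, title TEXT, author TEXT)")
conn.executemany("INSERT INTO books (title, author) VALUES (?, ?)", [
    ("Database System Concepts", "Silberschatz"),
    ("Search Engines: Information Retrieval in Practice", "Croft"),
    ("Deep Learning", "Goodfellow"),
])

def search(term: str):
    # Parameterized LIKE query: the core of a simple query-based search,
    # with the placeholder preventing SQL injection.
    cur = conn.execute("SELECT title, author FROM books WHERE title LIKE ?",
                       (f"%{term}%",))
    return cur.fetchall()

print(search("Search"))
# [('Search Engines: Information Retrieval in Practice', 'Croft')]
```

A production system would typically replace the `LIKE` scan with a full-text index, but the request/query/result cycle is the same one the article builds on.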
- Published
- 2021
49. Blockchain-based Automated Formal Model for Smart Market System
- Author
-
Aniqa Rehman, Nazir Ahmad Zafar, Saba Latif, and Muhammad Fayez Afzaal
- Subjects
Blockchain ,Correctness ,business.industry ,Computer science ,media_common.quotation_subject ,Data security ,System requirements specification ,Toolbox ,Smart market ,Centralized database ,Quality (business) ,Software engineering ,business ,media_common - Abstract
Blockchain technology is having a great impact on stakeholders across different departments, as it resolves data security and centralized-database issues. In this work, a blockchain-based smart market system is presented to ensure product quality and increase the tax ratio. In the smart market system, we define customers, distributors, stores, and distributor departments as operations in VDM-SL, in which their details are managed and stored on the blockchain. Furthermore, product sale, purchase, and quality details are saved as transactions in the blockchain. All details are viewed by the relevant parties, such as the quality assurance authority, distribution departments, distributors, and customers, through a mobile application; these views are also described as operations in the system specification. Due to lack of space, we do not describe the customer and distributor departments in detail. The Unified Modeling Language (UML) is used to describe the blockchain-based smart market diagrammatically. We automate the model using Non-deterministic Finite Automata (NFA) to realize the behavior of the system. The Vienna Development Method Specification Language (VDM-SL) is used to formally describe and analyze the system's properties and invariants, and the VDM-SL Toolbox is used to check system correctness and efficiency before the implementation phase.
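The idea of automating a system model with an NFA can be shown with a tiny transition-table simulation. The states, events, and transitions below are invented for illustration; the paper's actual automaton is derived from its VDM-SL specification.

```python
# Hypothetical transition table for a product's lifecycle in a market
# system. A quality check is non-deterministic: it may approve or reject.
TRANSITIONS = {
    ("listed", "purchase"): {"sold"},
    ("sold", "quality_check"): {"approved", "rejected"},
    ("approved", "deliver"): {"delivered"},
}
ACCEPTING = {"delivered"}

def accepts(events) -> bool:
    """Simulate the NFA by tracking the set of reachable states."""
    current = {"listed"}
    for event in events:
        nxt = set()
        for state in current:
            nxt |= TRANSITIONS.get((state, event), set())
        current = nxt
    return bool(current & ACCEPTING)

print(accepts(["purchase", "quality_check", "deliver"]))  # True
print(accepts(["purchase", "deliver"]))                   # False
```

Running event sequences through the automaton checks whether the modeled system can reach a valid end state, which is the kind of behavioral validation the authors perform before implementation.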
- Published
- 2021
50. Medical IoT—Automatic Medical Dispensing Machine
- Author
-
V. Nishanthan, S. Muthuramlingam, C. V. Nisha Angeline, E. Rahul Ganesh, and S. Siva Pratheep
- Subjects
Smart system ,Matching (statistics) ,Computer science ,media_common.quotation_subject ,Computer security ,computer.software_genre ,Field (computer science) ,Centralized database ,Management system ,Code (cryptography) ,Quality (business) ,Database transaction ,computer ,media_common - Abstract
The Internet of Things (IoT) plays a vital role in the development of various high-performance smart systems, and much research is being done to improve the quality of human life in various ways. One such area is hospital management. Following the recent COVID-19 situation, which has led to the concepts of social distancing and contactless transactions, we propose a centralized hospital management system using IoT. The proposed system is a mobile app that the doctor can use to access the patient's history from the centralized database. The doctor then makes an E-prescription based on the diagnosis, which is generated as a QR code in the patient-side app. The patient shows the QR code to the automatic medical dispensing machine (AMDM), which dispenses the prescribed medicines after matching the scanned code against the prescription record. This helps avoid 70% of the medical errors attributable to manual prescriptions and achieves social distancing and contactless transactions.
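The match step at the dispensing machine can be sketched as a hash comparison between the scanned payload and the record stored when the prescription was issued. This is a minimal assumption-laden sketch: real QR encoding and decoding would use a dedicated library, so the QR code is represented here only by its decoded string, and all names are illustrative.

```python
import hashlib
import json

def make_prescription(patient_id: str, medicines: list) -> tuple:
    """Doctor side: build the E-prescription payload and its hash.
    In practice, the payload would be rendered as a QR code image."""
    payload = json.dumps({"patient": patient_id, "medicines": medicines},
                         sort_keys=True)
    return payload, hashlib.sha256(payload.encode()).hexdigest()

def dispense(scanned_payload: str, expected_hash: str):
    """AMDM side: dispense only if the scanned QR matches the stored record."""
    if hashlib.sha256(scanned_payload.encode()).hexdigest() != expected_hash:
        raise ValueError("QR code does not match prescription record")
    return json.loads(scanned_payload)["medicines"]

payload, h = make_prescription("P-3391", ["paracetamol 500mg", "vitamin C"])
print(dispense(payload, h))  # ['paracetamol 500mg', 'vitamin C']
```

Comparing hashes rather than raw payloads means the machine can validate the prescription against a central record without exposing the record itself at the terminal.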
- Published
- 2021