41 results for "OLTP"
Search Results
2. MULTIDIMENSIONAL DATABASES IN INFORMATION SYSTEMS OF UNIVERSITIES
- Author
-
A. Mukasheva, D. Yedilkhan, and M. Aldiyar
- Subjects
information system, database, OLAP, OLTP, three-dimensional, one-dimensional, data analysis, Information technology, T58.5-58.64 - Abstract
The article describes the multidimensional database, an effective data-storage method that supports high-quality analysis in a short time. It discusses the capabilities of multidimensional databases, in particular multidimensional OLAP (On-Line Analytical Processing) cubes, for analyzing large amounts of data. It provides an overview of the features of a multidimensional database and of the steps needed to understand the structure and capabilities of an OLAP cube. To build a knowledge base, it describes how to create and populate a multidimensional database from data collected from various sources and how to prepare a report using OLAP analysis. Different data-processing technologies for information systems, such as OLTP and OLAP, were considered, and the algorithm of the data-storage process for analytical purposes was studied. A model of a multidimensional database in the form of a three-dimensional cube was presented, together with examples of analysis and ways of obtaining information from the data cube. The use of a multidimensional database in higher education institutions is considered as a simple and effective storage method, and illustrations of the structure of a higher educational institution show the bulk of information that the institution's databases must handle.
- Published
- 2022
- Full Text
- View/download PDF
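The three-dimensional cube model described in the abstract above can be sketched in a few lines of code. The dimensions (faculty, year) and the measure (enrolled students) are hypothetical names chosen only for illustration; they are not taken from the article.

```python
from collections import defaultdict

# Hypothetical fact records for a university information system:
# each row carries two dimensions (faculty, year) and one measure (enrolled).
facts = [
    ("Engineering", 2020, 1200),
    ("Engineering", 2021, 1350),
    ("Economics",   2020,  800),
    ("Economics",   2021,  950),
]

def roll_up(facts, axis):
    """Aggregate the measure along one dimension of the cube.

    axis=0 groups by faculty, axis=1 groups by year."""
    totals = defaultdict(int)
    for faculty, year, enrolled in facts:
        key = faculty if axis == 0 else year
        totals[key] += enrolled
    return dict(totals)

by_faculty = roll_up(facts, axis=0)  # slice summed over years
by_year    = roll_up(facts, axis=1)  # slice summed over faculties
```

Each call collapses one dimension of the cube, which is the basic roll-up operation an OLAP engine performs when answering "totals per faculty" or "totals per year" from the same fact table.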
3. Operationalizing Analytics with NewSQL
- Author
-
Chereja, Ionela, Hahn, Sarah Myriam Lydia, Matei, Oliviu, Avram, Anca, and Silhavy, Radek, editor
- Published
- 2021
- Full Text
- View/download PDF
4. A pragmatic approach to recover access time of Apriori algorithm by applying intersection on CSS for redefining FIS through matrix implementation in textual data.
- Author
-
Verma, Neeraj Kumar and Singh, Vaishali
- Subjects
APRIORI algorithm, ASSOCIATION rule mining, ONLINE data processing, PRAGMATICS, OLAP technology, DATA mining - Abstract
Nowadays, OLAP (On-Line Analytical Processing) is a widely accepted domain among data analytics researchers, and data mining concepts serve it well. Many data mining methodologies have been defined for data analytics; association rule mining is widely used for data categorization, and the Apriori algorithm is a popular method for deriving n-element frequent itemsets from large OLTP (online transaction processing) transactional datasets using association mining rules (AMR). In this paper, the researchers ran the original Apriori algorithm on a transactional dataset of 35,039 transactions, divided into three datasets DS-1 to DS-3 with 20,039, 12,000, and 5,000 variable-length transactions and minimum supports of 30%, 60%, and 80%, respectively. They compared the results of the Apriori algorithm with their proposed algorithm (an enhanced version of Apriori) on the same parameters and report rates of improvement of 11%, 30%, and 27% on DS-1 to DS-3, respectively, at 30% minimum support. The proposed algorithm performed substantially better than the Apriori algorithm on every parameter included in the evaluation. [ABSTRACT FROM AUTHOR]
- Published
- 2022
- Full Text
- View/download PDF
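As background for the abstract above, the standard Apriori procedure it builds on can be sketched minimally (this is the textbook algorithm, not the authors' enhanced variant; the transactions and the 50% support threshold are toy values chosen for the example):

```python
from itertools import chain

# Toy transaction set; minimum support of 50% (2 of 4 transactions).
transactions = [
    {"bread", "milk"},
    {"bread", "butter", "milk"},
    {"butter", "milk"},
    {"bread", "butter"},
]
min_support = 0.5

def apriori(transactions, min_support):
    """Return all frequent itemsets, level by level (classic Apriori)."""
    n = len(transactions)
    min_count = min_support * n
    items = set(chain.from_iterable(transactions))
    # Level 1: single items that meet the minimum support.
    frequent = [{frozenset([i]) for i in items
                 if sum(i in t for t in transactions) >= min_count}]
    k = 2
    while frequent[-1]:
        # Candidate k-itemsets: unions of frequent (k-1)-itemsets.
        candidates = {a | b for a in frequent[-1] for b in frequent[-1]
                      if len(a | b) == k}
        # Keep only candidates contained in enough transactions.
        frequent.append({c for c in candidates
                         if sum(c <= t for t in transactions) >= min_count})
        k += 1
    return [s for level in frequent for s in level]

result = apriori(transactions, min_support)
```

The pruning step (counting each candidate against the transactions before growing it) is exactly the access cost the paper's enhanced variant targets.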
6. DEVELOPMENT OF ONLINE EGG GRADING INFORMATION MANAGEMENT SYSTEM WITH DATA WAREHOUSE TECHNIQUE.
- Author
-
Yoon, S., Shin, T. S., Lawrence, K., and Jones, D. R.
- Published
- 2020
- Full Text
- View/download PDF
7. Cloud computing in the operations of banking institutions (Хмарні обчислення в діяльності банківських установ)
- Author
-
Баглай Р.О.
- Subjects
bank IT architecture, cloud technologies, databases, cloud testing services, open-source software, OLTP, OLAP, Information technology, T58.5-58.64 - Abstract
The possibility of introducing cloud technologies to support the operations of banking institutions and the functioning of their business processes is analyzed. The problems and advantages of cloud technologies at different levels of a bank's architectural landscape are considered, taking into account the specifics of the legal and regulatory environment of a financial institution. The results of the study can be validated through the implementation of projects driven by the challenges and trends of the banking sphere and by market and regulatory changes.
- Published
- 2017
- Full Text
- View/download PDF
8. Approaches and problems of data exchange between information systems (Подходы и проблемы обмена данными между информационными системами)
- Author
-
M. B. Gabbassov and T. D. Kuanov
- Subjects
data, metadata, data exchange, OLTP, OLAP, namespace, Mechanical engineering and machinery, TJ1-1570, Electronic computers. Computer science, QA75.5-76.95 - Abstract
Data exchange between two or more information systems has always been a relevant and difficult task. Over the last three decades it has evolved from manual exchange to fully automatic data exchange between systems. This paper examines various modern technologies for organizing data exchange between information systems, including between OLTP and OLAP systems. It describes ETL, MDM, and namespace-based technologies, including the "TOFI Synchronizer" (Синхронизатор ТОФИ) developed at the Factor Company for Systems Research. The formal notion of "data" used in the TOFI technology, the TOFI namespace, and the functional capabilities of the TOFI Synchronizer for exchanging data and metadata are set out in detail. Functionally, the synchronizer matches the metadata of different information systems on the basis of coding systems; generates metadata automatically (on a schedule) or manually; exchanges metadata with other systems; prepares data for transfer to another system; and receives and transmits datasets from and to other information systems. Data exchange based on the TOFI namespace can be carried out automatically through metadata synchronization.
- Published
- 2017
9. Scaling Up Mixed Workloads: A Battle of Data Freshness, Flexibility, and Scheduling
- Author
-
Psaroudakis, Iraklis, Wolf, Florian, May, Norman, Neumann, Thomas, Böhm, Alexander, Ailamaki, Anastasia, Sattler, Kai-Uwe, Nambiar, Raghunath, editor, and Poess, Meikel, editor
- Published
- 2015
- Full Text
- View/download PDF
10. Information management of the geoinformation systems
- Author
-
Trifković Milan
- Subjects
data ,information ,information management ,transactional and informational systems ,geoinformation system ,OLTP ,ETL ,OLAP ,Architecture ,NA1-9428 ,Urban groups. The city. Urban sociology ,HT101-395 - Abstract
The importance of information-management procedures and practices based on a defined and accepted framework within an organization is discussed. It is commonly presumed that transforming data into information produces the knowledge needed to support sound decision making. However, these efforts and investments often do not result in increased efficiency, owing to the absence of well-defined and well-designed transactional and informational systems. As a consequence, many organizations are 'rich' in data but still 'poor' in information. The differences between transactional (or operational) and informational systems are highlighted as well.
- Published
- 2010
11. Substantial data in cloud computing
- Author
-
Singhal, Shweta and Sharma, Ankita
- Published
- 2013
12. Comparative Study of Row and Column Oriented Database.
- Author
-
Bhagat, Vandana and Gopal, Arpita
- Abstract
For a long time, relational row-oriented databases have most often been used for data warehouse implementation, because they are efficient for databases containing large numbers of short online transactions (INSERT, UPDATE, and DELETE) in OLTP (Online Transaction Processing). OLTP is designed for very fast query processing and for maintaining data integrity in multi-access environments, with effectiveness measured in transactions per second; it holds detailed, current data used to control and run fundamental business tasks. But the record-based structure cannot satisfy users' current needs: as data volumes grow day by day, users demand more sophisticated analytical capabilities. For large amounts of data, RDBMS data warehouse systems are difficult to design and maintain and inefficient in their use of disk space and I/O, and designers must compromise between optimizing query performance and maximizing query flexibility. Columnar databases use less disk space and make more efficient I/O demands than record-based data warehouses, but they force their own compromise between optimizing for new-record insertion and optimizing for data selection and retrieval. This paper discusses the advantages and disadvantages of row- and column-oriented database concepts, drawing on the studies of different researchers to reach its conclusions. [ABSTRACT FROM PUBLISHER]
- Published
- 2012
- Full Text
- View/download PDF
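The row-versus-column trade-off discussed in the abstract above can be illustrated with plain Python lists; the table and its fields (id, name, amount) are invented for the example.

```python
# Row layout: one tuple per record. Inserting or updating a whole
# record touches one contiguous tuple, which suits OLTP workloads.
rows = [
    (1, "alice", 120.0),
    (2, "bob",    80.0),
    (3, "carol", 200.0),
]

# Column layout: one array per attribute. A scan of a single column
# touches only the data it needs, which is why columnar stores save
# disk space and I/O for analytical queries.
columns = {
    "id":     [1, 2, 3],
    "name":   ["alice", "bob", "carol"],
    "amount": [120.0, 80.0, 200.0],
}

# Aggregating one attribute: the row store walks every record and
# skips past the fields it does not need...
total_row_store = sum(r[2] for r in rows)
# ...while the column store reads exactly one contiguous array.
total_col_store = sum(columns["amount"])
```

Both totals are equal, but on disk the columnar scan would read roughly one third of the bytes for this three-column table, which is the I/O saving the paper weighs against slower record insertion.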
13. İş Zekâsı: Kavramsal Çerçeve, Bileşenler ve İşleyiş.
- Author
-
PAZARÇEVİREN, Selim Yüksel, ZOR, Ümmügülsüm, and GÜRBÜZ, Filiz
- Subjects
ORGANIZATIONAL structure, BUSINESS intelligence, COMPETITIVE advantage in business, BUSINESS ethics
- Published
- 2015
14. HyPer Beyond Software: Exploiting Modern Hardware for Main-Memory Database Systems.
- Author
-
Funke, Florian, Kemper, Alfons, Mühlbauer, Tobias, Neumann, Thomas, and Leis, Viktor
- Abstract
In this paper, we survey the use of advanced hardware features for optimizing main-memory database systems in the context of our HyPer project. We exploit the virtual memory management for snapshotting the transactional data in order to separate OLAP queries from parallel OLTP transactions. The access behavior of database objects from simultaneous OLTP transactions is monitored using the virtual memory management component in order to compact the database into hot and cold partitions. Utilizing many-core NUMA-organized database servers is facilitated by the morsel-driven adaptive parallelization and partitioning that guarantees data locality w.r.t. the processing core. The most recent Hardware Transactional Memory support of, e.g., Intel's Haswell processor, can be used as the basis for a lock-free concurrency control scheme for OLTP transactions. Finally, we show how heterogeneous processors of 'wimpy' devices such as tablets can be utilized for high-performance and energy-efficient query processing. [ABSTRACT FROM AUTHOR]
- Published
- 2014
- Full Text
- View/download PDF
15. MOVING FROM TRADITIONAL DATA WAREHOUSE TO ENTERPRISE DATA MANAGEMENT: A CASE STUDY.
- Author
-
Pandey, Amit and Mishra, Sushma
- Subjects
DATA analysis ,DESCRIPTIVE statistics ,CLOUD storage ,DATA warehousing ,MANAGEMENT information systems - Abstract
In the era of big data, organizations rely on huge quantities of data from diverse sources and need to integrate this data speedily to gain any strategic advantage from it. Data warehouses are becoming increasingly popular because enterprises need to gather all of their data in a single place for in-depth analysis, and to segregate such analytical work from online transaction processing systems. In this qualitative case study, the researchers examined the limitations and issues of the current data warehousing architecture of a financial institution. They discuss the benefits of migrating from a traditional data warehouse to enterprise data management and present the new architecture. Contributions are noted and conclusions drawn. [ABSTRACT FROM AUTHOR]
- Published
- 2014
16. Adaptive HTAP through Elastic Resource Scheduling
- Author
-
Angelos-Christos G. Anadiotis, Aunn Raza, Periklis Chrysogelos, and Anastasia Ailamaki (École Polytechnique Fédérale de Lausanne (EPFL); École polytechnique (X); LIX, CNRS; Inria Saclay - Île de France)
- Subjects
OLTP, OLAP, HTAP, DBMS, Online transaction processing, Online analytical processing, Databases (cs.DB), Systems and Control (eess.SY), Resource management, Data exchange, Throughput, Distributed computing, Computer science - Abstract
Modern Hybrid Transactional/Analytical Processing (HTAP) systems use an integrated data processing engine that performs analytics on fresh data, which are ingested from a transactional engine. HTAP systems typically consider data freshness at design time, and are optimized for a fixed range of freshness requirements, addressed at a performance cost for either OLTP or OLAP. The data freshness and the performance requirements of both engines, however, may vary with the workload. We approach HTAP as a scheduling problem, addressed at runtime through elastic resource management. We model an HTAP system as a set of three individual engines: an OLTP, an OLAP and a Resource and Data Exchange (RDE) engine. We devise a scheduling algorithm which traverses the HTAP design spectrum through elastic resource management, to meet the data freshness requirements of the workload. We propose an in-memory system design which is non-intrusive to the current state-of-art OLTP and OLAP engines, and we use it to evaluate the performance of our approach. Our evaluation shows that the performance benefit of our system for OLAP queries increases over time, reaching up to 50% compared to static schedules for 100 query sequences, while maintaining a small, and controlled, drop in the OLTP throughput. (Technical report accompanying the paper in the SIGMOD 2020 proceedings.)
- Published
- 2020
- Full Text
- View/download PDF
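The elastic-scheduling idea in the abstract above, shifting resources between the OLTP and OLAP engines to meet the workload's data-freshness requirements, can be caricatured in a few lines. The function name, the 3/4 versus 1/4 split, and the lag-based trigger are all invented for illustration; this is not the paper's algorithm.

```python
def schedule_cores(total_cores, freshness_slo_s, ingest_lag_s):
    """Toy elastic split of CPU cores between an OLTP and an OLAP engine.

    When ingested data lags behind the freshness SLO, shift cores toward
    the transactional side so fresh data reaches the analytical engine
    sooner; otherwise give most cores to analytics.
    """
    if ingest_lag_s > freshness_slo_s:
        oltp_cores = max(1, total_cores * 3 // 4)  # prioritize ingestion
    else:
        oltp_cores = max(1, total_cores // 4)      # prioritize analytics
    olap_cores = total_cores - oltp_cores
    return oltp_cores, olap_cores

# Data is 2s stale against a 1s SLO: favor OLTP to catch up.
print(schedule_cores(16, freshness_slo_s=1.0, ingest_lag_s=2.0))
# Data is fresh enough: favor OLAP.
print(schedule_cores(16, freshness_slo_s=1.0, ingest_lag_s=0.5))
```

A real scheduler, as the paper describes, would make this decision continuously at runtime and also account for the RDE engine that moves data between the two sides.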
17. Optimizing Data Warehouse.
- Author
-
MOÇKA, Blerta and LEKA, Daniel
- Subjects
DATA warehousing ,BUSINESS enterprises ,ONLINE data processing ,TOTAL quality management ,OLAP technology - Abstract
Data warehouses play an increasingly important role in large companies: a warehouse collects and processes OLTP data from all functions of the company and keeps it centralized. Product-quality specialists can then use the data in the warehouse to analyze product quality; after a successful analysis they can find the rules and patterns that lie behind the data, helping analysts and quality management make better decisions. In most cases, any organization serious about measuring its performance will have a data warehouse. In this article we answer the questions of why companies need a data warehouse and what the benefits of building one are. [ABSTRACT FROM AUTHOR]
- Published
- 2013
18. BUID: A Virtual Agent to Become Robust Integrated Core Financial System.
- Author
-
Bhedi, Vaibhav R., Lanjewar, Ujwal A., and Deshpande, Shrinivas P.
- Subjects
IDENTIFICATION ,BANK security ,DATA processing in the banking industry ,BANK accounts ,SOFTWARE architecture - Abstract
Today's core banking system is a comprehensive, integrated, yet modular business solution that effectively addresses the strategic and day-to-day challenges faced by banks. It is highly parameterized, providing much-needed flexibility to innovate and adapt to a dynamic environment, and it offers a comprehensive, unified customer-data repository with capabilities to educate and empower customers. With a core banking solution, banks can effectively meet the challenges of managing change, competition, compliance, and customer demands. However, a core banking system does not know how many accounts a customer has opened across assorted banks. This paper proposes the concept of a BUID (Bank Unique Identification) code to expose customer details and transactions to the income tax department, the government, and the overall financial system. The BUID can easily be unified into the current core financial system, could serve as a better alternative to the PAN card, and is well positioned to augment the present core financial system. [ABSTRACT FROM AUTHOR]
- Published
- 2012
19. Advanced Applications of Data Warehousing Using 3-tier Architecture.
- Author
-
Sharma, Praveen
- Subjects
ASSOCIATIONS, institutions, etc., PLANNING, DECISION making, DATA mining, WORLD Wide Web, LIBRARIES - Abstract
Organisations, be they in industry or business or even educational institutes, need to improve their information inventory systems in order to survive in a competitive environment. They must increase their efficiency and effectiveness in maintaining their cycle of activities, in planning, in decision-making processes, and in meeting analytical needs. There are several ways to achieve this goal; one of them is data mining, which can make predictions from existing data in a database in order to forecast future demand. With data mining, organisations can also determine which activities are most important and which trends are prevailing. An information system based on both World Wide Web technology and a 3-tier architecture is proposed herein to meet these requirements. This paper attempts to provide an initial concept of a data mining model likely to be used in various departments, including the libraries, of teaching institutes. The initial concepts covered by the paper are the appropriate data warehouse schema, the best-suited data mining tasks and techniques, and applications. [ABSTRACT FROM AUTHOR]
- Published
- 2009
- Full Text
- View/download PDF
21. GPU-accelerated data management under the test of time
- Author
-
Aunn Raza, Chrysogelos, Periklis, Sioulas, Panagiotis, Indjic, Vladimir, Anadiotis, Angelos Christos, and Ailamaki, Anastasia
- Subjects
OLAP ,OLTP ,GPU ,HTAP ,DBMS - Abstract
GPUs are becoming increasingly popular in large-scale data center installations due to their strong, embarrassingly parallel processing capabilities. Data management systems are riding the wave by using GPUs to accelerate query execution, mainly for analytical workloads. However, this acceleration comes at the price of a slow interconnect, which imposes strong restrictions on bandwidth and latency when bringing data from main memory to the GPU for processing. The related research in data management systems mostly relies on late materialization and data sharing to mitigate the overheads introduced by slow interconnects, even in the standard CPU processing case. Finally, workload trends are moving beyond analytical to fresh data processing, typically referred to as Hybrid Transactional and Analytical Processing (HTAP). We therefore experience an evolution along three different axes: interconnect technology, GPU architecture, and workload characteristics. In this paper, we break the evolution of the technological landscape into steps and study the applicability and performance of late materialization and data sharing in each one of them. We demonstrate that the standard PCIe interconnect substantially limits the performance of state-of-the-art GPUs, and we propose a hybrid materialization approach which combines eager with lazy data transfers. Further, we show that the wide gap between GPU and PCIe throughput can be bridged through efficient data sharing techniques. Finally, we provide an H2TAP system design which removes software-level interference, and we show that interference on the memory bus is minimal, allowing data transfer optimizations as in OLAP workloads.
- Published
- 2020
- Full Text
- View/download PDF
22. Implementation of a Management Dashboard for the Purchasing Procedure of the Government of the City of Buenos Aires (Implementación Tablero de Control - Procedimiento Compra del Gobierno de la Ciudad de Buenos Aires)
- Author
-
Mormando, Julio Maximiliano, Amadeo, Ana Paola, and Marcote, Cristian Gabriel
- Subjects
Data Warehouse, OLAP, Computer science, Star schema, Metadata, Metrics, Hierarchies, ETL, Data Mart, Snowflake schema, OLTP, Dimension table, Dashboard, Fact table, KPI - Abstract
This thesis presents the professional work carried out in the Systems offices of the Ministry of Finance of the City of Buenos Aires. It describes the tasks performed in building the management dashboard used by the Government, detailing each of the steps involved in its development, from gathering the requirements through to the solution, based on the Ralph Kimball methodology. The final result is a management dashboard that supports the State's managerial decision making. (Thesis under the Programa de Apoyo al Egreso de Profesionales en Actividad (PAEPA), Facultad de Informática.)
- Published
- 2020
23. Máster en Diseño y Gestión de Proyectos Tecnológicos (Master's in Technological Project Design and Management)
- Author
-
Arreaga-Santacruz, Sally Astrid
- Subjects
senior management, DataMart, OLAP, Máster en Diseño y Gestión de Proyectos Tecnológicos, OLTP, data warehouse, decision making, business intelligence - Abstract
Business Intelligence (BI) plays an indispensable role in quality-driven decision making and in meeting the corporate-intelligence requirements of organizations. BI instruments such as the DataMart are important because they manage the information generated in an entity, and all the more so in the administrative area of the Center for Continuing Education (CEC) at the Technical University of Machala, because they allow senior management to consult, interpret, and validate the data processed within the institution and thus obtain functional knowledge supporting well-founded, forward-looking decisions. Another useful tool derived from BI is the Data Warehouse (DW), which integrates several data sources (mainly structured transactional databases, structured formats, or unstructured sources) to make information analysis more efficient. Consistent with this premise, the aim of this Master's thesis (TFM) is to design a DW system with a DataMart and BI to support senior management in decision making using structured or unstructured historical data. The business-intelligence methodology used was the Kimball methodology, alongside the PMBOK project-management methodology, which details the relevant guidelines for administering and developing a technological project; its main concerns include the management of scope, time, cost, quality, and risk, all necessary to satisfy both methodologies, to complete the DataMart, and to meet the user's requirements. The most important conclusion of the TFM is that a DataMart, as a business-intelligence strategy, can improve decision making: the multidimensional database, together with its star-schema connections, forms a dimensional cube of information that offers direct data queries and lets stakeholders see the best paths to solving a problem. In the case of the CEC, the difficulties administrators faced with student registration and admission, financial reporting, and academic achievement will be mitigated, making a positive impact on students who want a continuing education backed by sound administration of the information processed by the Technical University of Machala.
- Published
- 2020
24. Cloud-based information architecture of a bank
- Subjects
OLAP, OLTP, architecture of a bank, information technology, database, dissertation, cloud technology, 336.71 [004] - Abstract
Dissertation for the degree of Candidate of Technical Sciences (PhD), specialty 05.13.06 – information technologies (122 – computer science). – National Technical University "Kharkiv Polytechnic Institute", Kharkiv, 2019. The object of the research is the processes of automated management of data flows in the bank's information architecture based on cloud technologies. The subject of the research is models, methods and information technologies for optimizing the processing of banking information on the basis of cloud infrastructure. The dissertation is devoted to solving the topical scientific and applied problem of increasing the efficiency of processing the information of the bank's end-of-day procedure by modernizing the bank's information architecture through the introduction of cloud technologies. The dissertation analyzes the feasibility of implementing cloud technologies to support the activities of banking institutions and the functioning of their business processes. The problems and advantages of cloud technologies at different levels of the bank's architectural landscape are considered, taking into account the specifics of the regulatory requirements governing a financial institution. The introduction substantiates the relevance of the dissertation topic, indicates the connection of the work with scientific programs, formulates the purpose and objectives of the study, defines the object, subject and methods of research, shows the scientific novelty and practical significance of the obtained results, and provides information on their practical use, validation and coverage in publications.
The first chapter analyzes the main approaches to banking information management and promising areas for applying cloud technologies in banking information systems. In particular, the object of study was decomposed into the components "information architecture", "cloud technologies", "cloud computing" and "banking information system (IS)" for subsequent application of the methods of analysis and synthesis. The problems and benefits of using cloud technologies in Ukrainian banking institutions remain under-researched. Banks, which are not professional IT companies, are forced to invest in and maintain a significant amount of IT infrastructure and staff to run their own business processes. In this situation, cloud technologies help reduce costs and increase the efficiency of banking information systems.
The second chapter explores information technologies for minimizing the security threats that cloud technologies pose to automated banking systems by applying single sign-on mechanisms to ensure strong user authentication. The mechanisms for implementing such authentication and their practical application for securing and improving the efficiency of the bank's business processes are investigated. Proposals are made on the criteria for choosing a provider of identity and access management as a service, single sign-on mechanisms and federated access scenarios to ensure strong authentication of users of banking ISs. The author improved the method of assessing the bank's information security threats when introducing cloud technologies, based on a qualitative analysis of risk probability and the volume of losses using the MITRE international standard for classifying cyber attacks, which made it possible to optimize the mechanisms protecting the bank's information architecture against potential cyber attacks.
The third chapter develops cloud-based IT solutions for the banking system that allow large computational loads to be moved to the cloud environment while ensuring compliance with the General Data Protection Regulation (GDPR) and national regulators. Anonymization of customer data is described as a solution that avoids the risks associated with the confidentiality of customer data and the need for customer consent to placing personal data in a cloud environment. The information technology for replicating banking IS data, based on mechanisms for depersonalizing customer data, was improved. This made it possible to strengthen the protection of data confidentiality and to fulfil the NBU requirements for localizing personalized banking client data on servers physically located in Ukraine. The developed IT solution architecture combines real-time data processing with batch data loads. Unlike the traditional way of using data, the data are not only migrated to a database (DB) deployed on cloud infrastructure but are also replicated back to the on-premises infrastructure. The security requirements governed by the standards of data confidentiality, integrity and availability are fully met by the corresponding cloud technologies. The author built a mathematical model of the bank's end-of-day closing process and solved the problem of optimizing the time and cost of information processing for banking ISs deployed in a cloud environment, which made it possible to determine the optimal configuration of cloud services in the bank's information architecture based on AWS services.
- Published
- 2019
25. Diseño, desarrollo e implementación de una solución en el marco Business Intelligence para la Conselleria d'Educació del Govern de les Illes Balears
- Author
-
López Campos, Javier
- Subjects
QlikView ,OLAP ,Grado en Ingeniería Informática ,Business intelligence ,Data warehouse ,OLTP ,Strategic information ,Dashboard ,Computer languages and systems ,KPI - Abstract
This work reflects the importance of treating stored data properly. More and more data are stored over time, and they can yield very useful information if interpreted correctly. The work surveys ways of making data not just numbers or text but something more: key information that brings competitiveness and efficiency to any enterprise. Besides exploring the keys to turning data into useful information, it introduces QlikView, one of the most important programs in the Business Intelligence world, and uses it to build, step by step, three dashboards for the public administration of the Balearic Government.
- Published
- 2019
26. Cloud-based information architecture of a bank
- Subjects
OLAP ,dissertation abstract ,architecture of a bank ,information technology ,database ,OLTP ,336.71 [004] ,cloud technology - Abstract
Dissertation for the degree of Candidate of Technical Sciences, specialty 05.13.06 – information technologies. – Kyiv National University of Trade and Economics, Kyiv, 2019. The dissertation analyzes the feasibility of implementing cloud technologies to support the activities of banking institutions and the functioning of their business processes. The problems and advantages of cloud technologies at different levels of the bank's architectural landscape are investigated, taking into account the specifics of the regulatory framework governing a financial institution. The purpose of the dissertation is to increase the efficiency of information processing within the end-of-day procedure of the core banking system by modernizing the bank's information architecture on the basis of cloud technologies. Modern approaches to managing the IT security of banking institutions in order to minimize threats, including those generated by cloud technologies, are considered, and a modern approach to building systems with IT security mechanisms is proposed. An analysis of information technology security threats arising from the implementation of cloud computing has been conducted to ensure the smooth and efficient operation of banking institutions, and measures have been proposed to minimize these threats. The results of the study were validated through projects driven by the challenges and trends of the banking sector and by market and regulatory changes. The results of the dissertation were introduced into the project portfolio management activities of JSC "Raiffeisen Bank Aval" (Kyiv), concerning the modernization of the architecture of banking information systems and the development of software based on cloud technologies, and into the activities of "IT Innovations Ukraine" LLC (Kyiv), concerning the classification of information security threats based on qualitative risk assessment and the efficient use of server resources through cloud computing.
- Published
- 2019
27. Zvjezdana shema - temeljni model za skladište podataka
- Author
-
Kružić, Katarina
- Subjects
business intelligence ,data warehouse ,OLAP ,OLTP ,star schema ,SCD - Abstract
The star schema is one of the basic schemas for connecting a fact table with dimension tables. It is a dimensional design for relational databases and the simplest schema for data warehousing, and it is widely associated with the development of data warehouses and dimensional data storage. The schema takes its name from its resemblance to the shape of a star. In a star schema the central table is the fact table, which contains the keys of all the tables to which it is connected. The tables directly connected to the fact table are called dimension tables, and each dimension table is defined by its primary key. These tables contain additional data and descriptions that give meaning to the data in the fact table. Related dimensions are grouped as columns in the dimension tables, and the facts are stored as columns in the fact table. The thesis is organized as follows: it begins with the star schema itself; the central part covers SCDs, queries and dimensional design, lists the advantages and disadvantages, and demonstrates the star schema through a practical example.
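The fact-table/dimension-table structure described in this abstract can be sketched concretely. Below is a minimal, hypothetical star schema (table and column names are invented for illustration, not taken from the thesis) built with Python's built-in sqlite3:

```python
import sqlite3

# In-memory database holding a minimal, hypothetical star schema: one central
# fact table whose foreign keys point at two dimension tables.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE dim_product (product_id INTEGER PRIMARY KEY, name TEXT, category TEXT);
CREATE TABLE dim_date    (date_id INTEGER PRIMARY KEY, day TEXT, year INTEGER);
CREATE TABLE fact_sales  (
    product_id INTEGER REFERENCES dim_product(product_id),
    date_id    INTEGER REFERENCES dim_date(date_id),
    units      INTEGER,
    revenue    REAL
);
""")
conn.executemany("INSERT INTO dim_product VALUES (?,?,?)",
                 [(1, "Tote bag", "Bags"), (2, "Film roll", "Plastics")])
conn.executemany("INSERT INTO dim_date VALUES (?,?,?)",
                 [(1, "2019-01-05", 2019), (2, "2019-02-07", 2019)])
conn.executemany("INSERT INTO fact_sales VALUES (?,?,?,?)",
                 [(1, 1, 10, 50.0), (1, 2, 5, 25.0), (2, 1, 3, 9.0)])

# A typical star-schema query joins the fact table to a dimension and
# aggregates a measure, grouped by a descriptive attribute of the dimension.
rows = conn.execute("""
    SELECT p.category, SUM(f.revenue)
    FROM fact_sales f JOIN dim_product p ON f.product_id = p.product_id
    GROUP BY p.category
""").fetchall()
```

Descriptive attributes stay in the dimension tables and additive measures stay in the fact table, which is what keeps every query the same uniform join-then-aggregate pattern.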
- Published
- 2019
28. Business intelligence numa consultora de seguros : um projeto de reformulação e implementação
- Author
-
Gonçalves, Leandro Magalhães and Neto, Miguel de Castro Simões Ferreira
- Subjects
Data Mart ,ETL ,Data Warehouse ,Business Intelligence ,OLAP ,Mercado segurador ,OLTP ,Dashboards - Abstract
Project Work presented as the partial requirement for obtaining a Master's degree in Information Management, specialization in Knowledge Management and Business Intelligence. Nowadays companies need to process an enormous range of data, to which they have access through the most varied channels, in order to make decisions. Obtaining these data is crucial: if properly treated and integrated, they allow companies to accumulate "intelligence" to compete in increasingly demanding markets. Business Intelligence (BI) is a method that aims to help companies make intelligent and more effective decisions through access to data and information collected from various information systems. The insurance market has been under enormous pressure, imposed not only by the recent financial crisis that affected Portugal but also by the increasingly evident need to know, in a more detailed and assertive way, who its customers really are, all while mitigating risks and avoiding the acquisition of an unproductive or unprofitable client portfolio. The project presented in this report describes the development and implementation of a Business Intelligence solution at a consultancy in the insurance sector, based on the most recent techniques currently on the market. The main objective is to renew the organization's analytical and informational structure; other goals also motivated the project, such as improving the management of information in the organization, optimizing the company's processes in a more functional way, increasing the organization's self-knowledge to make it more competitive against its rivals, reducing costs to increase revenue, preparing new analytical and informational analyses, improving risk forecasting, and deepening knowledge of the current client portfolio so that this knowledge can be extrapolated to acquire new clients.
Through the Microsoft BI solution, it will be possible to implement a complete and effective reporting platform, winning new clients and providing a good future opportunity for the continued evolution of this BI platform, extending it to other branches of the company, as well as developing and implementing new analysis concepts.
- Published
- 2018
29. Conceção e modelação de um esquema híbrido de base de dados: transacional e analítico
- Author
-
Costa, André Pinheiro, Oliveira e Sá, Jorge, and Universidade do Minho
- Subjects
OLAP ,OLTP ,Engineering and Technology::Other Engineering and Technologies ,HTAP ,Conceptual data schema ,Conceptual data modelling - Abstract
Database systems in an organizational context are segmented into two categories: operational databases and analytical databases. As a consequence of constant technological evolution, several efforts have been undertaken to re-evaluate the current paradigm. Thus, the design of hybrid databases capable of supporting both transactional and analytical processing is seen as possible and feasible. Even so, there is a clear gap regarding approaches that demonstrate how to model conceptual schemas for this type of database. This poster presents the design and modelling of a logical schema for a hybrid database.
- Published
- 2017
30. HTAPBench
- Author
-
João Paulo, José Pereira, Fábio Coelho, Ricardo Vilaça, Rui Oliveira, and Universidade do Minho
- Subjects
OLAP ,Computer science ,Transaction processing ,Online analytical processing ,Benchmarking ,Disjoint sets ,Analytics ,OLTP ,Metric (mathematics) ,HTAP ,Benchmark (computing) ,Online transaction processing ,Data mining - Abstract
The increasing demand for real-time analytics requires the fusion of transactional (OLTP) and analytical (OLAP) systems, eschewing ETL processes and introducing a plethora of proposals for so-called Hybrid Transactional and Analytical Processing (HTAP) systems. Unfortunately, current benchmarking approaches are not able to comprehensively produce a unified metric from the assessment of an HTAP system: the two engine types are evaluated separately, leading to the use of disjoint sets of benchmarks such as TPC-C or TPC-H. In this paper we propose a new benchmark, HTAPBench, providing a unified metric for HTAP systems geared toward the execution of constantly increasing OLAP requests limited by an admissible impact on OLTP performance. To achieve this, a load balancer within HTAPBench regulates the coexistence of OLTP and OLAP workloads, proposing a method for the generation of both new data and requests, so that OLAP requests over freshly modified data are comparable across runs. We demonstrate the merit of our approach by validating it with different types of systems (OLTP, OLAP and HTAP), showing that the benchmark is able to highlight the differences between them, while producing queries with comparable complexity across experiments with negligible variability. The authors would like to thank Marco Vieira for all the constructive reviews and comments on the final stage of this work, and also Martin Arlitt and the anonymous reviewers for their helpful comments.
This work is financed by: (1) the ERDF – European Regional Development Fund through the Operational Programme for Competitiveness and Internationalization - COMPETE 2020 Programme within project POCI-01-0145-FEDER-006961, and by National Funds through the Portuguese funding agency, FCT - Fundação para a Ciência e a Tecnologia as part of project UID/EEA/50014/2013; (2) the European Union's Horizon 2020 - The EU Framework Programme for Research and Innovation 2014-2020, under grant agreement No. 653884.
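The core HTAP property this benchmark targets, analytical reads over freshly modified transactional data in a single engine with no intervening ETL step, can be illustrated with a toy sketch. This is not HTAPBench itself; the table name and workload below are invented, and SQLite stands in for an HTAP engine:

```python
import sqlite3

# Toy illustration of the HTAP idea: OLTP-style writes and OLAP-style
# aggregate reads run against the same store, so each analytical query
# sees all data committed so far, with no ETL step in between.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (order_id INTEGER PRIMARY KEY, amount REAL)")

def oltp_insert(amount):
    """Short transactional write, as an OLTP client would issue."""
    with conn:  # the connection context manager commits the transaction
        conn.execute("INSERT INTO orders (amount) VALUES (?)", (amount,))

def olap_total():
    """Scan-style analytical read over the same, current data."""
    return conn.execute("SELECT COALESCE(SUM(amount), 0) FROM orders").fetchone()[0]

totals = []
for amount in [10.0, 20.0, 30.0]:
    oltp_insert(amount)          # transactional side
    totals.append(olap_total())  # analytical side sees the write immediately
```

A real HTAP benchmark additionally has to regulate how much the analytical stream is allowed to slow the transactional one, which is the role of the load balancer described in the abstract.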
- Published
- 2017
- Full Text
- View/download PDF
31. Finding the Needle in the Big Data Systems Haystack.
- Author
-
Kraska, Tim
- Subjects
BIG data ,INFORMATION storage & retrieval systems ,SOCIAL networks ,DATABASES ,MACHINE learning ,ONLINE data processing ,HYPERTEXT systems - Abstract
With the increasing importance of big data, many new systems have been developed to "solve" the big data challenge. At the same time, famous database researchers argue that there is nothing new about these systems and that they're actually a step backward. This article sheds some light on this discussion. [ABSTRACT FROM PUBLISHER]
- Published
- 2013
- Full Text
- View/download PDF
32. Herramienta de inteligencia de negocio para la gestión de transporte intermodal
- Author
-
Nicuesa Carreras, Samuel, Astrain Escola, José Javier, Escuela Técnica Superior de Ingenieros Industriales y de Telecomunicación, and Telekomunikazio eta Industria Ingeniarien Goi Mailako Eskola Teknikoa
- Subjects
Big Data ,Business Intelligence ,OLAP ,Data ,Knowledge ,OLTP ,Information ,ETL process ,OLAP cube ,Data warehouse ,Dashboard (balanced scorecard) - Abstract
Design and development of a Business Intelligence (BI) tool that manages intermodal transport information in an automated and dynamic way, so that companies and institutions in the sector can make decisions that improve infrastructure and increase productivity, enabling the evolution of means of transport. The data used in this work are Open Data extracted from the Instituto Nacional de Estadística (INE) [1] and Datos del Gobierno de España [2]. The project contains a main dashboard with images of the different topics to be analyzed; each image links to another dashboard showing graphs with information about the chosen topic. Analyzing all this information will allow means of transport to evolve, becoming more efficient and more secure. Bachelor's degree in Computer Science Engineering, Universidad Pública de Navarra.
- Published
- 2017
33. Apply On-Line Analytical Processing (OLAP)With Data Mining For Clinical Decision Support
- Author
-
Walid Qassim Qwaider
- Subjects
Decision support system ,OLAP ,Performance management ,Computer science ,Online analytical processing ,Clinical decision support system ,Data science ,diabetic approach ,OLTP ,Health care ,Medicine ,Data mining - Abstract
Medicine has a new direction whose mission is to prevent, diagnose and treat diseases using OLAP with data mining. Clinical data on patient populations and a wide range of healthcare performance-management data are analyzed but, unfortunately, are not converted into useful information for effective decision making. OLAP and data mining techniques are built into an easy-to-use decision-support platform for healthcare, which supports the decision-making process of caregivers and clinical managers. This paper presents a model for a clinical decision support system that combines the strengths of both OLAP and data mining. It provides a knowledge-rich environment that cannot be achieved by using OLAP or data mining alone.
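The OLAP-plus-data-mining combination the paper describes can be sketched in miniature. The records, dimensions and threshold rule below are invented stand-ins for illustration, not the paper's actual system:

```python
from collections import defaultdict
from statistics import mean

# Hypothetical patient records (all values invented). The sketch combines an
# OLAP-style roll-up (aggregate a measure along two dimensions) with a
# deliberately simple "mining" step (flag cells exceeding a threshold),
# standing in for a real learned model.
records = [
    {"ward": "A", "age_band": "60+", "glucose": 9.1},
    {"ward": "A", "age_band": "60+", "glucose": 8.7},
    {"ward": "A", "age_band": "<60", "glucose": 5.2},
    {"ward": "B", "age_band": "<60", "glucose": 5.9},
]

# OLAP step: roll up the measure (glucose) along the (ward, age band) dimensions.
cube = defaultdict(list)
for r in records:
    cube[(r["ward"], r["age_band"])].append(r["glucose"])
means = {cell: mean(vals) for cell, vals in cube.items()}

# Mining step: flag groups whose mean exceeds an (invented) clinical threshold.
flagged = {cell for cell, m in means.items() if m > 7.0}
```

The point of the combination is that the aggregate view narrows attention to interesting cells, while the mining rule (here trivial) decides which cells warrant action.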
- Published
- 2012
- Full Text
- View/download PDF
34. Business intelligence indicators : Types, models and implementation
- Author
-
Omar Boussaid, Michel Schneider, Sandro Bimonte, Institut national de recherche en sciences et technologies pour l'environnement et l'agriculture (IRSTEA), Université de Clermond-Ferrand 2, Equipe de Recherche en Ingénierie des Connaissances (ERIC), Université Lumière - Lyon 2 (UL2), Technologies et systèmes d'information pour les agrosystèmes (UR TSCF), Laboratoire d'Informatique, de Modélisation et d'Optimisation des Systèmes (LIMOS), Ecole Nationale Supérieure des Mines de St Etienne (ENSM ST-ETIENNE)-Université Clermont Auvergne [2017-2020] (UCA [2017-2020])-Centre National de la Recherche Scientifique (CNRS), Irstea Publications, Migration, and Ecole Nationale Supérieure des Mines de St Etienne-Université Clermont Auvergne [2017-2020] (UCA [2017-2020])-Centre National de la Recherche Scientifique (CNRS)
- Subjects
[SDE]Environmental Sciences ,Knowledge management ,Computer science ,OLTP ,BUSINESS INTELLIGENCE ,OLAP ,[INFO.INFO-DB]Computer Science [cs]/Databases [cs.DB] ,Online analytical processing ,UML profile ,Conceptual framework ,Hardware and Architecture ,Business intelligence ,Chaining ,Online transaction processing ,Software engineering ,Software ,Conceptual level - Abstract
[Departement_IRSTEA]Ecotechnologies [TR1_IRSTEA]MOTIVE; International audience; Nowadays, more and more data are available for decisional analysis and decision-making based on different indicators. Although different decision-making technologies have been developed, we note the lack of a conceptual framework for the definition and implementation of these indicators. In this paper, we propose a first classification of these indicators. Furthermore, motivated by the need for formalism for the definition of these indicators at a conceptual level, we present the Business Intelligence Indicators (BI2) UML profile to represent indicators for OLAP, OLTP and streaming technologies. We also present their implementation in existing industrial tools. In addition, we show how these indicators can coexist in the same environment to exchange data through a chaining model and its implementation.
- Published
- 2016
35. Ocena efektywności architektury In-Memory w SQL Server 2014
- Author
-
Grala, Łukasz and Królikowski, Zbyszko
- Subjects
columnar databases ,In-Memory ,OLTP ,OLAP - Abstract
The development of computer hardware in recent years has initiated research aimed at devising new database architectures. One such new solution is the architecture called "In-Memory" or "Main-Memory". It represents a radical change of approach to how data are stored and processed in database systems based on the OLTP (On-Line Transaction Processing) model. The aim of this paper is to analyze selected aspects of the In-Memory architecture and to evaluate its efficiency in comparison with conventional solutions based on disk storage., Studia Informatica, Vol 36, No 1 (2015)
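The row-versus-column storage contrast that underlies the columnar/In-Memory discussion above can be illustrated with a toy model (the data are invented). In the columnar layout an aggregate over one attribute only touches that attribute's array, which is one reason column stores favor analytical scans:

```python
# The same toy table held row-wise and column-wise (data invented for
# illustration). A sum over 'price' reads whole records in the row layout
# but only one contiguous list in the columnar layout.
rows = [
    {"id": 1, "price": 10.0, "qty": 2},
    {"id": 2, "price": 15.0, "qty": 1},
    {"id": 3, "price": 20.0, "qty": 4},
]

# Column store: one list per attribute, derived from the same data.
columns = {key: [r[key] for r in rows] for key in rows[0]}

row_total = sum(r["price"] for r in rows)   # row layout: touches every record
col_total = sum(columns["price"])           # column layout: touches one list
```

Real in-memory engines add latch-free data structures and compiled query plans on top of this layout change, but the locality argument is the same.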
- Published
- 2015
- Full Text
- View/download PDF
36. BI2 : un profil UML pour les indicateurs décisionnels
- Author
-
sandro Bimonte, Irstea Publications, Migration, Technologies et systèmes d'information pour les agrosystèmes (UR TSCF), and Institut national de recherche en sciences et technologies pour l'environnement et l'agriculture (IRSTEA)
- Subjects
[SDE]Environmental Sciences ,OLAP ,OLTP ,STREAM - Abstract
National audience; Today, more and more data are available for decision-support analysis and rely on decision-support indicators. Although various decision-support technologies have been developed, we note the lack of a conceptual framework for defining and implementing these indicators. In this paper, we present a first classification of these indicators. Moreover, motivated by the need for a formalism to define these indicators at a conceptual level, we present the BI2 UML profile, which makes it possible to represent OLAP, OLTP and stream indicators. We also present their implementation in existing industrial tools.
- Published
- 2015
37. Application and Survey of Business Intelligence (BI) Tools within the Context of Military Decision Making
- Author
-
Tounsi, Mohamed Ilyes, Kamel, Madig, Kendall, Walter, and Systems Technology
- Subjects
Business Intelligence ,OLAP ,OLTP ,data warehouse ,PolyAnalyst ,OBIEE ,Rapid-I ,Oracle ,decision making - Abstract
Business intelligence (BI) is a general category of applications and technologies for collecting, storing, analyzing, and providing access to data to help users make better and faster decisions. BI applications include the activities of decision support systems, query and reporting, online analytical processing (OLAP), statistical analysis, forecasting, and data mining. The purpose of this research is to explore and survey several tools that fall under the BI umbrella and investigate their applicability within the context of military decision making. This survey will help military decision makers select the right BI tool for the right decision problem using the right technology. This would result in reduced IT costs by eliminating redundancy and consolidating computing resources, accelerated decision making, and improved accuracy, consistency, and relevance of decisions by providing a single version of truth. http://archive.org/details/applicationndsur109457419 Captain, Tunisian Air Force
- Published
- 2012
38. BUSINESS INTELLIGENCE SYSTEM IN MARIBORSKA LIVARNA MARIBOR
- Author
-
Korent, Simona and Perko, Igor
- Subjects
Data Warehouse ,MLM ,OLAP ,Information ,Business Intelligence System ,Decision-making process ,Mariborska livarna Maribor ,udc:659.2:004 ,OLTP ,Cognos ,ERP - Abstract
The importance of information in the business world continues to grow. Decision making is possible only on the basis of accurate, high-quality information. Companies introduce business intelligence systems to fill the gap in obtaining and presenting data. The thesis describes the business intelligence system at the company MLM. By defining the basic concepts, I wanted to present the full scope of a business intelligence system; I focused on OLAP technology in particular and present the general findings gathered during the introduction and use of the system.
- Published
- 2010
39. Design of an activity-based costing (ABC) system
- Author
-
Plaza Sánchez, Gloria Jahaira, Guzmán Muñoz, Nicolás Alexis, and Dalton Noboa, Jaime Lozada
- Subjects
OLAP ,OLTP ,ABC ,GeneralLiterature_MISCELLANEOUS - Abstract
The present work describes the design of an activity-based costing (ABC) system, together with the development of a software application based on this system, within an institution devoted to the manufacture and commercialization of bags and plastics. The first chapter covers the theory of ABC costing, including the terms and definitions used throughout this work. The second chapter presents the company's mission and vision, describes the activities that generate indirect costs, details the costing system currently used in the company, and lists the main product lines it offers. The third chapter develops the ABC costing system within the company and presents the results obtained with it, including comparisons and analysis of the current system against the ABC system. The fourth chapter presents everything concerning the development of the software application, important definitions of what the application contains, and illustrative graphics of its development. The fifth chapter establishes the conclusions and recommendations that emerged during the development of each chapter, since it is important to emphasize some of the points we found.
- Published
- 2009
40. Development of an architecture and software tools for data marts for an oil and gas industry enterprise
- Subjects
Business Intelligence ,OLAP ,OLTP ,decision support systems ,data warehouses ,data ,information systems ,data marts ,multidimensional models - Abstract
The paper analyzes the problems of building industry-specific information and analytical systems based on modern OLAP technologies. A data warehouse architecture for a large oil and gas production enterprise is considered, using the design of a thematic data mart as an example; this includes the analysis of the specific subject domain, the design of multidimensional data structures, and the formulation and solution of analytical tasks.
- Published
- 2009
41. Scaling up Mixed Workloads: a Battle of Data Freshness, Flexibility, and Scheduling
- Author
-
Florian Wolf, Norman May, Iraklis Psaroudakis, Anastasia Ailamaki, Kai-Uwe Sattler, Alexander Böhm, and Thomas Neumann
- Subjects
workload management ,data freshness ,OLAP ,Database ,SAP HANA ,Computer science ,business.industry ,Online analytical processing ,Workload ,computer.software_genre ,Data warehouse ,Scheduling (computing) ,flexibility ,Transactional leadership ,Analytics ,OLTP ,Online transaction processing ,scheduling ,business ,computer ,CH-benCHmark ,HyPer - Abstract
The common “one size does not fit all” paradigm isolates transactional and analytical workloads into separate, specialized database systems. Operational data is periodically replicated to a data warehouse for analytics. Competitiveness of enterprises today, however, depends on real-time reporting on operational data, necessitating an integration of transactional and analytical processing in a single database system. The mixed workload should be able to query and modify common data in a shared schema. The database needs to provide performance guarantees for transactional workloads, and, at the same time, efficiently evaluate complex analytical queries. In this paper, we share our analysis of the performance of two main-memory databases that support mixed workloads, SAP HANA and HyPer, while evaluating the mixed workload CH-benCHmark. By examining their similarities and differences, we identify the factors that affect performance while scaling the number of concurrent transactional and analytical clients. The three main factors are (a) data freshness, i.e., how recent is the data processed by analytical queries, (b) flexibility, i.e., restricting transactional features in order to increase optimization choices and enhance performance, and (c) scheduling, i.e., how the mixed workload utilizes resources. Specifically for scheduling, we show that the absence of workload management under cases of high concurrency leads to analytical workloads overwhelming the system and severely hurting the performance of transactional workloads.