3,067 results
Search Results
2. A recursive linear programming analysis of the future of the pulp and paper industry in the United States: Changes in supplies and demands, and the effects of recycling
- Author
-
Peter J. Ince, Joseph Buongiorno, and Dali Zhang
- Subjects
Paperboard, Linear programming, Pulp (paper), Competitive industry, General Decision Sciences, Management Science and Operations Research, Pulp and paper industry, Supply and demand, Paper recycling, Market forces, Economics, Economic model - Abstract
The impacts of increased paper recycling on the U.S. pulp and paper sector are investigated using the North American Pulp And Paper (NAPAP) model. This dynamic spatial equilibrium model forecasts the amounts of pulp, paper, and paperboard exchanged in a multi-region market, and the corresponding prices. The core of the model is a recursive price-endogenous linear programming system that simulates the behavior of a competitive industry. The model has been used to forecast key variables describing the sector from 1986 to 2012; of the factors examined, demand for paper would have the greatest impact on the amount of wood used, while the minimum recycled-content policies currently envisaged would have no more effect than unregulated market forces alone.
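The core of such spatial equilibrium models is a market-clearing optimization over multiple regions. As a hedged illustration only (a toy cost-minimising transport LP, not the NAPAP formulation, which endogenises prices), the multi-region exchange idea can be sketched with SciPy:

```python
from scipy.optimize import linprog

# Toy two-supply, two-demand spatial market: ship pulp between regions at
# minimum transport cost. Variables x11, x12, x21, x22 are tons shipped
# from supply region i to demand region j. All data are illustrative.
cost = [2, 3, 4, 1]                   # unit shipping costs
A_ub = [[1, 1, 0, 0],                 # supply region 1 capacity
        [0, 0, 1, 1]]                 # supply region 2 capacity
b_ub = [30, 40]
A_eq = [[1, 0, 1, 0],                 # demand region 1 must be met exactly
        [0, 1, 0, 1]]                 # demand region 2 must be met exactly
b_eq = [25, 35]

res = linprog(cost, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq,
              bounds=[(0, None)] * 4, method="highs")
print(res.fun)  # minimum total shipping cost
```

Serving demand region 1 from supply region 1 and demand region 2 from supply region 2 is cheapest here, giving a total cost of 85.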
- Published
- 1996
- Full Text
- View/download PDF
3. A coupling cutting stock-lot sizing problem in the paper industry
- Author
-
Kelly Cristina Poldi, Marcos Nereu Arenales, Sônia Cristina Poltroniere, and Franklina Maria Bragion de Toledo
- Subjects
Mathematical optimization, Computer science, Heuristic, Scheduling (production processes), General Decision Sciences, Dynamic priority scheduling, Management Science and Operations Research, Sizing, Scheduling (computing), Cutting stock problem, Lagrangian relaxation - Abstract
An important production programming problem in the paper industry couples multiple-machine scheduling with cutting stock. The scheduling side determines the quantities of large rolls of paper of different types to produce; these rolls are then cut to meet the demand for finished items. A schedule that minimizes setup and production costs may yield rolls that increase waste in the cutting process; conversely, the number of rolls that minimizes cutting waste may lead to high setup costs. In this paper, a coupled model and heuristic solution methods are proposed, and computational experiments are presented.
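The cutting-stock side of such problems is classically attacked by column generation, where each new cutting pattern comes from an unbounded knapsack over the roll width (the Gilmore-Gomory pricing step). A minimal sketch of that subproblem, with illustrative data and not the authors' coupled heuristic:

```python
def best_pattern(roll_width, widths, duals):
    """Gilmore-Gomory pricing step: find the cutting pattern (item counts)
    whose total dual value is maximal, via an unbounded-knapsack DP."""
    # value[w] = best dual value achievable within width budget w
    value = [0.0] * (roll_width + 1)
    choice = [None] * (roll_width + 1)
    for w in range(1, roll_width + 1):
        for i, wi in enumerate(widths):
            if wi <= w and value[w - wi] + duals[i] > value[w]:
                value[w] = value[w - wi] + duals[i]
                choice[w] = i
    # Recover the pattern from the choice table.
    pattern = [0] * len(widths)
    w = roll_width
    while w > 0 and choice[w] is not None:
        pattern[choice[w]] += 1
        w -= widths[choice[w]]
    return value[roll_width], pattern

# Roll of width 10; three item widths with dual prices from a master LP.
val, pat = best_pattern(10, [3, 5, 7], [1.0, 2.0, 3.0])
```

Here the best pattern cuts one item of width 3 and one of width 7, for a total dual value of 4.0.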
- Published
- 2007
- Full Text
- View/download PDF
4. A bibliography of ATM related papers
- Author
-
Guy Pujolle, Raif O. Onvural, and Harry G. Perros
- Subjects
Network congestion, Operations research, Computer science, Bibliography, General Decision Sciences, Management Science and Operations Research, Routing (electronic design automation), Computer network - Abstract
In this paper, we give a list of ATM related papers. This is not a complete list. Rather, it represents the bulk of the references in this ever-growing area. For easier use of this reference list, we have classified the papers into the following six categories: general, source characterization, switching, congestion control, routing, and transport.
- Published
- 1994
- Full Text
- View/download PDF
5. Evaluating the optimal timing and capacity of investments in flexible combined heat and power generation for energy-intensive industries
- Author
-
Dimitrios Zormpas and Giorgia Oggioni
- Subjects
History, Combined heat and power, Optimal investment timing and capacity, Pulp and paper industry, Real options, Energy markets, Polymers and Plastics, General Decision Sciences, Business and International Management, Management Science and Operations Research, Industrial and Manufacturing Engineering - Abstract
Substantial R&D efforts are currently directed towards the development of combined heat and power (CHP) systems that connect automatically and seamlessly to the power grid. In this paper we develop a real options model to assess the impact that the operational flexibility characterizing such systems has on the optimal timing and capacity of investments in CHP plants. We take the viewpoint of a manufacturer in an energy-intensive industry who contemplates investing in CHP, and we discuss and compare investments in two types of CHP systems: a standard one that is operationally rigid and a technologically advanced one that is operationally flexible. The interaction between temporal and operational flexibility under uncertainty and irreversibility is central to our analysis. We show that operational flexibility guarantees earlier investment but has an ambiguous effect on capacity: when operational flexibility is very valuable, the potential investor opts for a plant with larger productive capacity; otherwise, a smaller CHP unit is chosen. A numerical exercise calibrated with data from the Italian pulp and paper and electricity industries complements our theoretical analysis.
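The timing side of such models typically reduces to a threshold rule of the McDonald-Siegel type. As a textbook illustration only (not the authors' model): if the project value $X$ follows a geometric Brownian motion $dX = \alpha X\,dt + \sigma X\,dZ$ with discount rate $r > \alpha$ and sunk investment cost $I$, it is optimal to invest when $X$ first reaches

```latex
X^* = \frac{\beta_1}{\beta_1 - 1}\, I,
\qquad
\beta_1 = \frac{1}{2} - \frac{\alpha}{\sigma^2}
        + \sqrt{\left(\frac{\alpha}{\sigma^2} - \frac{1}{2}\right)^2
                + \frac{2r}{\sigma^2}} > 1 .
```

Since $\beta_1/(\beta_1 - 1) > 1$, uncertainty and irreversibility push the trigger strictly above the break-even value $I$, which is the temporal-flexibility effect the abstract refers to.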
- Published
- 2023
- Full Text
- View/download PDF
6. Computational relevance of the Bayesian paradigm: Discussion of Professor Singpurwalla's paper
- Author
-
E. J. Wegman
- Subjects
Computer science, Applied probability, General Decision Sciences, Management Science and Operations Research, Machine learning, Frequentist inference, Statistical inference, Fiducial inference, Relevance (information retrieval), Artificial intelligence, Data mining, Bayesian paradigm - Abstract
Professor Singpurwalla's call to pay more attention to the statistical inference issues is, I believe, a timely one. I must strongly endorse the notion that applied probabilists will greatly benefit from increased attention to this area. Applied probability will move from a descriptive tool to an inferential tool.
- Published
- 1987
- Full Text
- View/download PDF
7. Outliers detection in assessment tests’ quality evaluation through the blended use of functional data analysis and item response theory
- Author
-
Fabrizio Maturo, Francesca Fortuna, and Tonio Di Battista
- Subjects
General Decision Sciences, Management Science and Operations Research - Abstract
The quality of assessment tests plays a fundamental role in decision-making problems in fields such as education, psychology, and behavioural medicine. The first phase in the questionnaire validation process is outlier recognition. Outliers can be identified at different levels, such as subject responses, individuals, and items. This paper focuses on item outliers and proposes a blended use of functional data analysis and item response theory for identifying outliers in assessment tests. The basic idea is that item characteristic curves derived from test responses can be treated as functions, so that functional tools can be exploited to discover anomalies in item behaviour. To this end, the research suggests a multi-step strategy to catch magnitude and shape outliers, employing a suitable transformation of the item characteristic curves and their first derivatives. A simulation study emphasises the effectiveness of the proposed technique and exhibits promising results in discovering outliers that classical functional methods do not detect. Moreover, the applicability of the method is shown on a real dataset. The final aim is to offer a methodology for improving questionnaire quality.
- Published
- 2022
- Full Text
- View/download PDF
8. Auditing Shaked and Shanthikumar’s ‘excess wealth’
- Author
-
Anna S. Gordon and Nozer D. Singpurwalla
- Subjects
Public economics, Economic inequality, Theory of computation, Economics, General Decision Sciences, Entropy (information theory), Diagnostic test, Audit, Management Science and Operations Research, Lorenz curve, Mathematical economics - Abstract
A notion called “excess wealth” was introduced by Shaked and Shanthikumar around 1998 (Probab. Eng. Inf. Sci. 12:1–23, 1998). Subsequent to this, much has been written on it, mostly by Shaked and his colleagues; see Sordo (Insur. Math. Econ. 45(3):466–469, 2009) for a recent review. These works have appeared in the literatures of reliability theory and stochastic orderings. Since the term excess wealth connotes a measure of income inequality, much like its dual, poverty, it should have had an impact in economics and the econometric literature. This, it appears, is not the case, at least not to the extent that it should be. The purpose of this paper is to investigate the above disconnect by looking at the notion of excess wealth more carefully, keeping in mind the angle of economics and income. Our conclusion is that an alternative definition of excess wealth better encapsulates what one means by a colloquial use of the term. Our motivation for being attracted to this topic arises from two angles. The first is that the stochastics of diagnostic and threat detection tests, in which we have an interest, has a strong bearing on indices of concentration like the Lorenz curve, the Gini index, and entropy; thus the notion of excess wealth, which conveys a sense of income concentration, should also be relevant to diagnostics. The second motivation is to honor Moshe Shaked, a prolific researcher and a friend of the first author, by developing a paper based on an idea that is co-attributed to him.
- Published
- 2012
- Full Text
- View/download PDF
9. [Untitled]
- Author
-
Andy Philpott and Graeme Everett
- Subjects
Operations research, Supply chain, General Decision Sciences, Management Science and Operations Research, Decision problem, Pulp and paper industry, Product (business), Paper machine, Theory of computation, Value (economics), Economics, Mill, Integer (computer science) - Abstract
We describe the formulation and development of a supply-chain optimisation model for Fletcher Challenge Paper Australasia (FCPA). This model, known as Paper Industry Value Optimisation Tool (PIVOT), is a large mixed integer program that finds an optimal allocation of supplier to mill, product to paper machine, and paper machine to customer, while at the same time modelling many of the supply chain details and nuances which are peculiar to FCPA. PIVOT has assisted FCPA in solving a number of strategic and tactical decision problems, and provided significant economic benefits for the company.
- Published
- 2001
- Full Text
- View/download PDF
10. [Untitled]
- Author
-
Ronny Aboudi and Paulo Bárcia
- Subjects
Mathematical optimization, Partition problem, General Decision Sciences, Management Science and Operations Research, Paper machine, Cutting stock problem, Knapsack problem, Theory of computation, Subset sum problem, Mathematics - Abstract
This paper investigates the problem of obtaining the best arrangement for cutting stock patterns in the presence of defective material in the paper, which is a common problem in the industry. This problem is modeled as a multiple subset sum problem, and its structure is exploited in order to obtain a practical solution procedure. The solution methodology employs the subset sum problem as a subprocedure. Computational results are reported.
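The subset-sum subprocedure mentioned above can be sketched in miniature: pack item widths into a defect-free segment of the roll so that as little width as possible is wasted. This toy single-segment version (illustrative data, not the authors' multiple-subset-sum procedure) uses a reachable-sums DP:

```python
def fullest_subset(widths, capacity):
    """Subset-sum DP: the largest total width <= capacity achievable by
    cutting each candidate item at most once from a defect-free segment."""
    reachable = {0}
    for w in widths:
        # Extend every reachable total by this item, staying within capacity.
        reachable |= {s + w for s in reachable if s + w <= capacity}
    return max(reachable)

# An 11-unit defect-free segment between two flaws; candidate item widths:
best = fullest_subset([4, 5, 6], 11)
```

Here widths 5 and 6 fill the segment exactly, so the best achievable total is 11 with zero waste.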
- Published
- 1998
- Full Text
- View/download PDF
11. Integrating fire risk in stand management scheduling. An application to Maritime pine stands in Portugal
- Author
-
José G. Borges, Jordi Garcia-Gonzalo, and Timo Pukkala
- Subjects
Schedule, Thinning, Population, General Decision Sciences, Forestry, Understory, Management Science and Operations Research, Basal area, Pulp and paper industry, Environmental science, Pinus pinaster, Profitability index, Expected loss - Abstract
Maritime pine (Pinus pinaster Ait) is a very important forest species in Portugal. Nevertheless, both revenues and timber flows from the pine forests are substantially impacted by forest fires. We present a methodology for integrating fire risk in Maritime pine stand-level management optimization in Portugal. The objective is to determine the optimal prescription for a stand where fire risk is related to its structure and fuel load. The study optimizes the thinning treatments and the rotation length, as well as the fuel treatment schedule, i.e., reduction of understory cover during the rotation. Two components of wildfire risk—occurrence and damage—are considered. Fire damage was treated as an endogenous factor depending on the stand management schedule while fire occurrence was considered exogenous. A preliminary model that relates the expected loss to stand basal area, mean tree diameter and fire severity was used for this purpose. The article demonstrates how a deterministic stand-level growth and yield model may be combined with wildfire occurrence and damage models to optimize stand management. The Hooke-Jeeves direct search method was used to find the optimal prescription. In addition, population-based direct search methods (e.g. differential evolution and particle swarm optimization) were used for further testing purposes. Results are presented for Maritime pine stand management in Leiria National Forest in Portugal. They confirm that fuel treatments improve profitability and reduce the expected damage.
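The Hooke-Jeeves method used above is a derivative-free pattern search: exploratory moves along each coordinate, followed by a pattern move in any successful direction, with the step size shrinking when no move improves. A minimal generic sketch (textbook version on a toy objective, not the authors' calibrated stand-management setup):

```python
def hooke_jeeves(f, x0, step=1.0, shrink=0.5, tol=1e-6, max_iter=10_000):
    """Minimise f by exploratory moves along each axis plus pattern moves."""
    def explore(base, s):
        x = list(base)
        for i in range(len(x)):
            for d in (+s, -s):
                trial = list(x)
                trial[i] += d
                if f(trial) < f(x):
                    x = trial
                    break
        return x

    x, s = list(x0), step
    for _ in range(max_iter):
        if s < tol:
            break
        new = explore(x, s)
        if f(new) < f(x):
            # Pattern move: jump further along the successful direction.
            pattern = [2 * n - o for n, o in zip(new, x)]
            cand = explore(pattern, s)
            x = cand if f(cand) < f(new) else new
        else:
            s *= shrink  # no improvement: refine the mesh
    return x

opt = hooke_jeeves(lambda v: (v[0] - 1) ** 2 + (v[1] + 2) ** 2, [0.0, 0.0])
```

On this convex test function the search converges to the minimiser (1, -2); in the paper's setting the decision variables would instead be thinning times, rotation length, and fuel-treatment timing.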
- Published
- 2011
- Full Text
- View/download PDF
12. Initial basis for a national planning model
- Author
-
James A. Calloway
- Subjects
Tax revenue, Public economics, Shadow price, Economics, General Decision Sciences, Resource use, National policy, Price level, Management Science and Operations Research, National planning - Abstract
Analysis of pending national policy takes on greater importance each day to supply policy makers with much needed data on the overall economic effect such policies will have on total employment, income, output, tax revenues, and the general price level throughout the economy. This short paper examines the potential for integration of existing modeling technology into a meaningful national planning model to provide a sound set of economic indices leading to enhanced market efficiency and resource use. Initial efforts at integrated modeling and analysis are described, together with indications of their levels of success in replicating the existing economic environment. Emphasis is placed on determination of shadow prices in a growing economy and their incorporation into the systematic development of a national planning model.
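Shadow prices of the kind emphasised above are the dual values of a planning model's resource constraints: the marginal change in the objective per unit of additional resource. A toy product-mix LP (illustrative data, not the author's national model) shows how they are read off SciPy's HiGHS solution:

```python
import numpy as np
from scipy.optimize import linprog

# Toy planning LP: maximise 3x + 5y subject to three resource limits.
# linprog minimises, so the objective is negated.
res = linprog(c=[-3, -5],
              A_ub=[[1, 0],    # resource 1: x <= 4
                    [0, 2],    # resource 2: 2y <= 12
                    [3, 2]],   # resource 3: 3x + 2y <= 18
              b_ub=[4, 12, 18],
              bounds=[(0, None), (0, None)],
              method="highs")

# Shadow prices (dual values) of the three resource constraints:
shadow = res.ineqlin.marginals
```

At the optimum (x, y) = (2, 6) the first resource is slack, so its shadow price is zero, while the binding resources carry shadow prices of magnitude 1.5 and 1: relaxing either would improve the objective at that rate.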
- Published
- 1984
- Full Text
- View/download PDF
13. Multiobjective centralized DEA approach to Tokyo 2020 Olympic Games
- Author
-
Sebastián Lozano and Gabriel Villa
- Subjects
General Decision Sciences ,Management Science and Operations Research - Abstract
There exist two types of Data Envelopment Analysis (DEA) approaches to the Olympic Games: conventional and fixed-sum outputs (FSO). The approach proposed in this paper belongs to the latter category, as it takes into account the total number of medals of each type awarded. Imposing these constraints requires a centralized DEA perspective that projects all the countries simultaneously. In this paper, a multiobjective FSO approach is proposed, and the Weighted Tchebychef solution method is employed. This approach aims to set all output targets as close as possible to their ideal values. In order to choose between the alternative optima, a secondary goal has been considered that minimizes the sum of absolute changes in the number of medals, which also renders the computed targets as close to the observed values as possible. These targets represent the output levels that could be expected if all countries performed at their best level. For certain countries, the targets are higher than the actual number of medals won, while for other countries these targets may be lower. The proposed approach has been applied to the results of the Tokyo 2020 Olympic Games and compared with both FSO and non-FSO DEA methods.
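The Weighted Tchebychef idea is to pick, among feasible output vectors, the one minimising the largest weighted deviation from the ideal point. A minimal numeric sketch with hypothetical medal-target vectors (illustrative data only, not the Tokyo 2020 model, where the candidates come from the centralized DEA constraints):

```python
import numpy as np

def tchebychef_best(candidates, ideal, weights):
    """Index of the candidate minimising max_i w_i * |ideal_i - z_i|."""
    candidates = np.asarray(candidates, dtype=float)
    dev = weights * np.abs(ideal - candidates)  # broadcast over rows
    return int(np.argmin(dev.max(axis=1)))

# Three hypothetical target vectors (gold, silver) versus the ideal point:
ideal = np.array([10.0, 8.0])
weights = np.array([1.0, 1.0])
cands = [[6.0, 8.0], [9.0, 5.0], [8.0, 7.0]]
best = tchebychef_best(cands, ideal, weights)
```

The third candidate's worst weighted deviation (2) beats the others' (4 and 3), so it is selected; a secondary goal would then break any ties among alternative optima.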
- Published
- 2022
- Full Text
- View/download PDF
14. Key performance indicator based dynamic decision-making framework for sustainable Industry 4.0 implementation risks evaluation: reference to the Indian manufacturing industries
- Author
-
Rimalini Gadekar, Bijan Sarkar, and Ashish Gadekar
- Subjects
General Decision Sciences ,Management Science and Operations Research - Abstract
Global corporate giants are keen to adopt Industry 4.0 (I4.0) owing to its continuous, impactful, and evident benefits. However, implementing I4.0 remains a significant challenge for many organizations, mainly due to the absence of a systematic and comprehensive framework. It is well established that risk assessment is key to the flawless execution of any project. This paper aims to develop a KPI-based sustainable integrated model to assess and evaluate the risks associated with I4.0 implementation. The model was developed through fifteen expert interventions and an extensive systematic literature review, and it uses sixteen KPIs to evaluate six risks impacting an organization's decision to adopt I4.0. Initially, the fuzzy Decision-Making Trial and Evaluation Laboratory method is used to map the causal relationships among the KPIs. The additive ratio assessment method with interval triangular fuzzy numbers is then used to rank the risks. The study revealed that information technology infrastructure and prediction capabilities are the most crucial prominence and receiver KPIs, while technological and social risks are highly significant in the I4.0 implementation decision-making process. The developed model supports the viewpoints of manufacturers, policymakers, and researchers toward I4.0 implementation in the present and post-COVID-19 pandemic phases in manufacturing companies. The comprehensive yet simple model contributes to the wider body of new knowledge and extant literature. The integrated model rests on the most prominent risks and a wide range of KPIs, analyzed by two aptly fitted fuzzy MCDM techniques that accommodate the uncertainty and vagueness in the decision-making process. Hence, this study is pioneering in its prioritization of I4.0 risks, aiming to accelerate I4.0 adoption.
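The crisp core of the DEMATEL step is the total-relation matrix T = N(I - N)^-1, computed from the normalised direct-influence matrix N; prominence (R + C) and relation (R - C) then separate cause from effect KPIs. A defuzzified toy sketch (the paper uses the fuzzy variant; the numbers here are illustrative):

```python
import numpy as np

# Direct-influence matrix among two hypothetical KPIs (defuzzified scores).
D = np.array([[0.0, 3.0],
              [2.0, 0.0]])
N = D / D.sum(axis=1).max()            # normalise by the largest row sum
T = N @ np.linalg.inv(np.eye(2) - N)   # total-relation matrix

R = T.sum(axis=1)                      # influence dispatched by each KPI
C = T.sum(axis=0)                      # influence received by each KPI
prominence, relation = R + C, R - C
```

Here the first KPI has positive relation (a cause/dispatcher) and the second negative (a receiver), which is exactly the cause-effect mapping the abstract describes.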
- Published
- 2022
- Full Text
- View/download PDF
15. The origins and development of statistical approaches in non-parametric frontier models: a survey of the first two decades of scholarly literature (1998–2020)
- Author
-
Amir Moradi-Motlagh and Ali Emrouznejad
- Subjects
General Decision Sciences ,Management Science and Operations Research - Abstract
This paper surveys the increasing use of statistical approaches in non-parametric efficiency studies. Data Envelopment Analysis (DEA) and Free Disposable Hull (FDH) are recognized as standard non-parametric methods developed in the field of operations research. Kneip et al. (Econom Theory, 14:783–793, 1998) and Park et al. (Econom Theory, 16:855–877, 2000) develop statistical properties of the variable returns-to-scale (VRS) version of DEA estimators and of FDH estimators, respectively. Simar and Wilson (Manag Sci 44:49–61, 1998) show that conventional bootstrap methods cannot provide valid inference in the context of DEA or FDH estimators and introduce a smoothed bootstrap for use with these efficiency estimators. By doing so, they address the main drawback of non-parametric models as being deterministic and without a statistical interpretation. Since then, many articles have applied this innovative approach to examine efficiency and productivity in various fields while providing confidence interval estimates to gauge uncertainty. Despite this increasing research attention and significant theoretical and methodological developments in its first two decades, a specific and comprehensive bibliometric analysis of the bootstrap DEA/FDH literature and subsequent statistical approaches is still missing. This paper thus aims to provide an extensive overview of the key articles and their impact in the field. Specifically, in addition to summary statistics such as citations, the most influential academic journals, and authorship network analysis, we review the methodological developments as well as the pertinent software applications.
- Published
- 2022
- Full Text
- View/download PDF
16. Performance evaluation of problematic samples: a robust nonparametric approach for wastewater treatment plants
- Author
-
Alda A. Henriques, Milton Fontes, Ana S. Camanho, Giovanna D’Inverno, Pedro Amorim, Jaime Gabriel Silva, and Faculdade de Engenharia
- Subjects
INDICATORS ,Technology ,Science & Technology ,DETECTING OUTLIERS ,ENERGY EFFICIENCY ,Operations Research & Management Science ,DATA ENVELOPMENT ANALYSIS ,INFLUENTIAL OBSERVATIONS ,General Decision Sciences ,FRONTIER MODELS ,Management Science and Operations Research ,Problematic samples ,Robust and conditional approach ,DEA ,BOOTSTRAP ,Wastewater treatment plants ,ORDER-M EFFICIENCY ,BENCHMARKING - Abstract
This paper explores robust unconditional and conditional nonparametric approaches to support performance evaluation in problematic samples. Real-world assessments often face critical problems regarding available data, as samples may be relatively small, with high variability in the magnitude of the observed indicators and contextual conditions. This paper explores the possibility of mitigating the impact of potential outlier observations and variability in small samples using a robust nonparametric approach. This approach has the advantage of avoiding unnecessary loss of relevant information, retaining all the decision-making units of the original sample. We devote particular attention to identifying peers and targets in the robust nonparametric approach to guide improvements for underperforming units. The results are compared with a traditional deterministic approach to highlight the proposed method's benefits for problematic samples. This framework's applicability in internal benchmarking studies is illustrated with a case study within the wastewater treatment industry in Portugal.
- Published
- 2022
- Full Text
- View/download PDF
17. Bank efficiency and failure prediction: a nonparametric and dynamic model based on data envelopment analysis
- Author
-
Zhiyong Li, Chen Feng, and Ying Tang
- Subjects
General Decision Sciences ,Management Science and Operations Research - Abstract
For decades, the prediction of bank failure has been a popular topic in credit risk and banking studies. Statistical and machine learning methods have worked well in predicting the probability of bankruptcy at different time horizons prior to failure. In recent years, bank efficiency has attracted much interest from academic circles, where low productivity or efficiency has been regarded as a potential reason for failure: low efficiency is generally believed to imply low-quality management, which may lead to poor performance in competitive financial markets. Previous papers linking efficiency measures calculated by Data Envelopment Analysis (DEA) to bank failure prediction have been limited to cross-sectional analyses, so a dynamic analysis with updated samples is recommended for bankruptcy prediction. This paper proposes a nonparametric method, Malmquist DEA with a Worst Practice Frontier, to dynamically assess the bankruptcy risk of banks over multiple periods. A total sample of 4426 US banks over a period of 15 years (2002–2016), covering the subprime financial crisis, is used to empirically test the model. A static model is used as the benchmark, and further extensions are introduced for comparisons of predictive performance. Results of the comparisons and robustness tests show that Malmquist DEA is a useful tool not only for estimating productivity growth but also for giving early warnings of the potential collapse of banks. The extended DEA models with various reference sets and orientations also show strong predictive power.
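For reference, the Malmquist productivity index underlying such a dynamic assessment is, in its standard textbook form (the worst-practice variant replaces the best-practice frontier with the frontier of the poorest performers):

```latex
M_{t,\,t+1}
  = \left[
      \frac{D^{t}\!\left(x^{t+1}, y^{t+1}\right)}{D^{t}\!\left(x^{t}, y^{t}\right)}
      \cdot
      \frac{D^{t+1}\!\left(x^{t+1}, y^{t+1}\right)}{D^{t+1}\!\left(x^{t}, y^{t}\right)}
    \right]^{1/2},
```

where $D^{t}(\cdot)$ is the distance function of input-output pair $(x, y)$ relative to the period-$t$ frontier, so the index is the geometric mean of productivity change measured against the two adjacent frontiers.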
- Published
- 2022
- Full Text
- View/download PDF
18. Carbon emissions and sustainability in Covid-19’s waves: evidence from a two-state dynamic Markov-switching regression (MSR) model
- Author
-
Konstantinos N. Konstantakis, Panayotis G. Michaelides, Panos Xidonas, and Stavroula Yfanti
- Subjects
General Decision Sciences ,Management Science and Operations Research - Abstract
Throughout the world, carbon emissions have decreased in an unprecedented way as a result of the Covid-19 pandemic. The purpose of this paper is to investigate whether a rebound effect in carbon emissions is anticipated, based on information extracted about the beliefs of investors. A suitable Markov switching model is used to adapt the safe haven financial methodology to an environmental sustainability perspective. Analytically, the situation is modeled by estimating a two-state dynamic Markov-Switching Regression (MSR) with a state-dependent intercept term that captures the dynamics of the series across unobserved regimes. In light of the results of the research and the robustness checks, investors are anticipating a rebound effect in the total quantity of carbon emissions.
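In its generic form (the paper's exact covariates are not reproduced here), a two-state MSR with a state-dependent intercept can be written as

```latex
y_t = \mu_{S_t} + \varepsilon_t,
\qquad \varepsilon_t \sim \mathcal{N}\!\left(0, \sigma^2\right),
\qquad S_t \in \{1, 2\},
\qquad
p_{ij} = \Pr\!\left(S_t = j \mid S_{t-1} = i\right),
\quad \sum_{j} p_{ij} = 1,
```

so the intercept $\mu_{S_t}$ switches between two regime-specific levels according to a first-order Markov chain, and the filtered regime probabilities carry the information about anticipated shifts.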
- Published
- 2023
- Full Text
- View/download PDF
19. A novel deep neural network model based Xception and genetic algorithm for detection of COVID-19 from X-ray images
- Author
-
Burak Gülmez
- Subjects
General Decision Sciences ,Management Science and Operations Research - Abstract
The coronavirus first appeared in China in 2019, and the World Health Organization (WHO) named the illness COVID-19, declaring it a worldwide pandemic in March 2020. The numbers of cases, infections, and fatalities varied considerably worldwide. Because the main characteristic of COVID-19 is its rapid spread, doctors and specialists generally use PCR tests to detect the virus. As an alternative to PCR, X-ray images can help diagnose the illness using artificial intelligence (AI), which is commonly employed in medicine. Convolutional neural networks (CNN) and deep learning models make it simple to extract information from images, and many choices arise when creating a deep CNN, including network depth, layer count, layer type, and parameters. In this paper, a novel Xception-based neural network is discovered using a genetic algorithm (GA), which finds better alternative networks and parameters over its iterations. The best network discovered by the GA is tested on a COVID-19 X-ray image dataset, and the results are compared with other networks and with results reported in the literature; the proposed network is more successful. The accuracy results are 0.996, 0.989, and 0.924 for the two-class, three-class, and four-class datasets, respectively.
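A GA of the kind used to search architecture encodings can be reduced to a bit-string toy problem. This minimal elitist GA (selection, one-point crossover, mutation) is an illustration of the search loop only, not the paper's Xception search space:

```python
import random

def evolve(fitness, length=12, pop_size=30, gens=40, p_mut=0.05, seed=0):
    """Elitist genetic algorithm over fixed-length bit strings."""
    rng = random.Random(seed)
    pop = [[rng.randint(0, 1) for _ in range(length)] for _ in range(pop_size)]
    best = max(pop, key=fitness)
    for _ in range(gens):
        nxt = [best[:]]                          # elitism: keep the best
        while len(nxt) < pop_size:
            # Tournament selection of two parents.
            p1 = max(rng.sample(pop, 3), key=fitness)
            p2 = max(rng.sample(pop, 3), key=fitness)
            cut = rng.randrange(1, length)       # one-point crossover
            child = p1[:cut] + p2[cut:]
            # Bit-flip mutation with probability p_mut per gene.
            child = [b ^ (rng.random() < p_mut) for b in child]
            nxt.append(child)
        pop = nxt
        best = max(pop, key=fitness)
    return best

best = evolve(sum)  # "onemax" toy fitness: count of 1 bits
```

In the paper's setting the genes would instead encode depth, layer types, and hyperparameters, and the fitness would be validation accuracy; elitism guarantees the best fitness never decreases across generations.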
- Published
- 2022
- Full Text
- View/download PDF
20. Dynamic interplay of environmental sustainability and corporate reputation: a combined parametric and nonparametric approach
- Author
-
Laee Choi, He-Boong Kwon, and Jooh Lee
- Subjects
Nonparametric statistics, General Decision Sciences, Management Science and Operations Research, Microeconomics, Theory of computation, Sustainability, Economics, Parametric statistics - Abstract
For decades, research on the relationship between environmental sustainability and firm performance focused on determining whether the effect of the former on the latter was positive or negative. That potentially controversial approach drove mainstream research to ask two focal questions: Does it pay to be green? And does it cost to be green? Recent papers have explored a curvilinear effect (i.e., U-shaped and inverted U-shaped) by tackling another question: When does it pay to be green? These efforts, however, have either oversimplified the complex linkage between environmental sustainability and firm performance into a linear relationship or confined its complexity within a preset frame. This paper overcomes these shortfalls by exploring the holistic effect of environmental sustainability on firm performance, contingent upon corporate reputation. In conjunction with direct effect analysis, this paper uniquely explores the synergistic effect of environmental sustainability and corporate reputation on performance. In addition to detecting the nonlinear synergy pattern, we found that the synergy effect is asymmetric: it is complementary for low-reputation firms but substitutive for high-reputation firms. In addition to its theoretical contributions and pragmatic implications, this paper uniquely presents PROCESS-Neural network as a promising analytic paradigm.
- Published
- 2021
- Full Text
- View/download PDF
21. Cross-platform hotel evaluation by aggregating multi-website consumer reviews with probabilistic linguistic term set and Choquet integral
- Author
-
Yinrunjie Zhang, Decui Liang, and Zeshui Xu
- Subjects
General Decision Sciences ,Management Science and Operations Research - Abstract
In order to adequately utilize and integrate both ratings and comments from multiple websites, this paper proposes a new hotel evaluation model with probabilistic linguistic information processing. Taking into consideration consumers' possible psychological activities when leaving reviews, this paper adapts the Weber-Fechner Law with the linguistic scale function and develops a novel unbalanced linguistic scale function. This paper also develops a method for adjusting linguistic-term formations across different websites to make full use of the available information. Then, we learn the decision criteria and the corresponding weights for hotel evaluation by analyzing rating rules and consumer comments. Moreover, considering the interrelationships among criteria, this paper extends the Choquet integral to the probabilistic linguistic term set (PLTS) environment and designs some novel fusion operators. Furthermore, since different websites mostly focus on heterogeneous hotel criteria, this paper puts forward a weighted-averaging linear-assignment-based ranking method with the aid of the PLTS Choquet integral. Finally, a case study of hotel evaluation is given to illustrate the validity and applicability of the proposed approach.
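The classical discrete Choquet integral that the paper extends aggregates criterion scores with respect to a capacity (fuzzy measure), letting criteria interact rather than contributing independently. A minimal crisp sketch with a toy two-criterion capacity (illustrative only; the paper's PLTS operators are more general):

```python
def choquet(scores, capacity):
    """Discrete Choquet integral of criterion scores w.r.t. a capacity.

    `capacity` maps frozensets of criteria to [0, 1], monotone, with
    capacity(empty set) == 0 and capacity(all criteria) == 1.
    """
    items = sorted(scores, key=scores.get)       # criteria, ascending score
    total, prev = 0.0, 0.0
    for i, crit in enumerate(items):
        coalition = frozenset(items[i:])         # criteria scoring >= this one
        total += (scores[crit] - prev) * capacity[coalition]
        prev = scores[crit]
    return total

# Hypothetical capacity over two hotel criteria (names are illustrative):
cap = {frozenset(): 0.0,
       frozenset({"service"}): 0.3,
       frozenset({"location"}): 0.6,
       frozenset({"service", "location"}): 1.0}
value = choquet({"service": 0.2, "location": 0.5}, cap)
```

Here the integral is 0.2 * 1.0 + (0.5 - 0.2) * 0.6 = 0.38; because cap({service}) + cap({location}) < 1, the two criteria are treated as complementary rather than additive.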
- Published
- 2022
22. Linking data-driven innovation to firm performance: a theoretical framework and case analysis
- Author
-
David T. W. Wong and Eric W. T. Ngai
- Subjects
General Decision Sciences ,Management Science and Operations Research - Abstract
This paper examines the impact of data-driven innovation (DDI) on firm performance, based on an exploratory case study of a manufacturing firm in China's textile and apparel industry. It explores the influence of various contextual variables on the firm's DDI and suggests ways to enhance DDI and thereby firm performance. Extending the literature on DDI, the paper proposes and validates a theoretical framework that incorporates the influence of various contextual factors on firms' DDI. The findings show that (1) individual context is associated with DDI; (2) organizational context is associated with DDI; and (3) DDI is associated with firm performance. This paper extends our understanding of how firm performance can be improved through DDI and shows that DDI should match a firm's contextual environment. The online version contains supplementary material available at 10.1007/s10479-022-05038-y.
- Published
- 2022
- Full Text
- View/download PDF
23. Cross-influence of information and risk effects on the IPO market: exploring risk disclosure with a machine learning approach
- Author
-
Huosong Xia, Juan Weng, Sabri Boubaker, Zuopeng Zhang, and Sajjad M. Jasimuddin
- Subjects
General Decision Sciences ,Management Science and Operations Research - Abstract
The paper examines whether the structure of the risk factor disclosure in an IPO prospectus helps explain the cross-section of first-day returns in a sample of Chinese initial public offerings. This paper analyzes the semantics and content of risk disclosure using an unsupervised machine learning algorithm, and explores, from both long-term and short-term perspectives, the respective roles of the information effect and the risk effect of risk disclosure. The results show that risk disclosure has a stronger risk effect at the semantic novelty level and a more substantial information effect at the risk content level. A novel aspect of the paper lies in the use of text analysis (semantic novelty and content richness) to characterize the structure of the risk factor disclosure. The study shows that initial IPO returns correlate negatively with semantic novelty and content richness. We also show how the risk effect and the information effect of risk disclosure interact for stocks within the same board. When enterprise information transparency is low, the impacts of semantic novelty and content richness on the IPO market are each enhanced.
- Published
- 2022
- Full Text
- View/download PDF
24. Investigating a citrus fruit supply chain network considering CO2 emissions using meta-heuristic algorithms
- Author
-
Fariba Goodarzian, Vikas Kumar, and Peiman Ghasemi
- Subjects
Meta-heuristic algorithms ,Mathematical model ,Citrus fruit agri-food supply chain network ,Business Management ,General Decision Sciences ,Management Science and Operations Research ,CO2 emissions ,Innovation, Operations Management and Supply ,Sustainability & Climate Change - Abstract
Given the increasing carbon dioxide emitted by vehicles and the shortage of water resources, decision-makers have sought to combine environmental and economic effects in the Agri-Food Supply Chain Network (AFSCN) in developing countries. This paper focuses on the citrus fruit supply chain network. The novelty of this study is a mathematical model for a three-echelon AFSCN that simultaneously considers CO2 emissions, water-use coefficients, and time windows. A bi-objective mixed-integer non-linear program is formulated for the production–distribution–inventory–allocation problem, seeking to minimise total cost and CO2 emissions simultaneously. To solve the multi-objective model, the Augmented Epsilon-constraint method, a well-known approach that produces a set of Pareto solutions for multi-objective problems, is utilised for small- and medium-sized problems; its high computational time makes it unable to solve large-scale problems. Multi-Objective Ant Colony Optimisation, the fast Pareto genetic algorithm, the non-dominated sorting genetic algorithm II, and multi-objective simulated annealing are therefore used to solve the model. Then, a hybrid meta-heuristic algorithm called Hybrid multi-objective Ant Colony Optimisation with multi-objective Simulated Annealing (HACO-SA) is developed. In the HACO-SA algorithm, an initial temperature and a temperature reduction rate are supplied as input data to the SA component to ensure a faster convergence rate and to balance exploitation and exploration. The computational results show the superiority of the Augmented Epsilon-constraint method on small-sized problems, while HACO-SA outperforms the suggested original algorithms on medium- and large-sized problems.
- Published
- 2022
- Full Text
- View/download PDF
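The epsilon-constraint idea used for the small instances in the entry above can be sketched on a toy discrete bi-objective problem (minimise cost f1 and emissions f2): optimise f1 while bounding f2, then sweep the bound to trace Pareto-optimal plans. The candidate plans below are invented for illustration.

```python
solutions = {              # candidate plans: (cost, co2)
    'p1': (100, 9), 'p2': (120, 7), 'p3': (150, 4),
    'p4': (180, 3), 'p5': (130, 8),
}

def eps_constraint(eps):
    """Cheapest plan whose emissions do not exceed the bound eps."""
    feasible = {k: v for k, v in solutions.items() if v[1] <= eps}
    return min(feasible, key=lambda k: feasible[k][0]) if feasible else None

pareto = []
for eps in range(3, 10):   # sweep the emissions bound
    p = eps_constraint(eps)
    if p and p not in pareto:
        pareto.append(p)
print(pareto)
```

Note that `p5` (cost 130, emissions 8) never appears: it is dominated by `p2` (cheaper and cleaner), so the sweep correctly skips it.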
25. The daily swab test collection problem
- Author
-
Roberto Aringhieri, Sara Bigharaz, Alessandro Druetto, Davide Duma, Andrea Grosso, and Alberto Guastalla
- Subjects
Hybrid algorithms ,Covid-19 ,Digital contact tracing ,OR in health services ,Team orienteering problem ,General Decision Sciences ,Management Science and Operations Research - Abstract
Digital Contact Tracing (DCT) has proved to be an effective tool to counteract SARS-CoV-2 (Covid-19). Despite the widespread effort to adopt DCT, less attention has been paid to organising the health logistics system that should support the tracing activities. The DCT in fact poses a challenge to the logistics of the local health system in terms of the number of daily tests to be collected and evaluated, especially when the spread of the virus is soaring. In this paper we introduce a new optimisation problem called the Daily Swab Test Collection (DSTC) problem: the daily problem of collecting swab tests at home in such a way as to guarantee timely testing to people notified by the app of contact with a positive case. The problem is formulated as a variant of the team orienteering problem (TOP). The contributions of this paper are: (i) the new optimisation problem DSTC, which complements and improves the DCT approach proposed by Ferretti et al. (Science https://doi.org/10.1126/science.abb6936, 2020); (ii) the DSTC formulation as a variant of the TOP, and a literature review highlighting that this variant has useful applications in healthcare management; (iii) new realistic benchmark instances for the DSTC based on the city of Turin; (iv) two new efficient and effective hybrid algorithms capable of dealing with realistic instances; (v) managerial insights from our approach, with special regard to the fairness of the solutions. The main finding is that it is possible to optimise the underlying logistics system in such a way as to guarantee timely testing to people identified by the DCT.
- Published
- 2022
26. Forecasting oil commodity spot price in a data-rich environment
- Author
-
Sabri Boubaker, Zhenya Liu, and Yifan Zhang
- Subjects
General Decision Sciences ,Management Science and Operations Research - Abstract
Statistical properties that vary with time represent a challenge for time series forecasting. This paper proposes a change point-adaptive-RNN (CP-ADARNN) framework to predict crude oil prices with high-dimensional monthly variables. We first detect the structural breaks in predictors using the change point technique, and subsequently train a prediction model based on ADARNN. Using 310 economic series as exogenous factors from 1993 to 2021 to predict the monthly return on the WTI crude oil real price, CP-ADARNN outperforms competing benchmarks by 12.5% in terms of the root mean square error and achieves a correlation of 0.706 between predicted and actual returns. Furthermore, the superiority of CP-ADARNN is robust for Brent oil price as well as during the COVID-19 pandemic. The findings of this paper provide new insights for investors and researchers in the oil market.
- Published
- 2022
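The structural-break detection step in the entry above can be illustrated with a minimal CUSUM-style scan for a single mean shift in one predictor series. This is an invented stand-in for the paper's change point technique, kept to one dimension and one break.

```python
def detect_mean_shift(series):
    """Return the index maximising the size-weighted between-segment contrast."""
    n = len(series)
    total = sum(series)
    best_k, best_score = None, 0.0
    for k in range(1, n):                  # candidate break after position k-1
        left = sum(series[:k]) / k
        right = (total - sum(series[:k])) / (n - k)
        # weight the squared mean difference by the two segment sizes
        score = (k * (n - k) / n) * (left - right) ** 2
        if score > best_score:
            best_k, best_score = k, score
    return best_k

# A series whose mean jumps from ~0 to ~5 starting at index 5:
data = [0.1, -0.2, 0.0, 0.3, -0.1, 5.2, 4.8, 5.1, 4.9, 5.0]
print(detect_mean_shift(data))
```

Once breaks are located, a forecaster can be trained per segment (or, as in CP-ADARNN, the segments can inform the adaptive RNN's training), rather than fitting one model across regimes with different statistical properties.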
27. Bayesian sequential update for monitoring and control of high-dimensional processes
- Author
-
Mehmet Turkoz and Sangahn Kim
- Subjects
Hyperparameter ,Computational complexity theory ,Computer science ,Process (engineering) ,Bayesian probability ,Posterior probability ,General Decision Sciences ,Process variable ,Management Science and Operations Research ,computer.software_genre ,Prior probability ,Bayesian hierarchical modeling ,Data mining ,computer - Abstract
Simultaneous monitoring of multi-dimensional processes becomes much more challenging as the dimension increases, especially when only a few or a moderate number of process variables are responsible for the process change, and when the size of the change is particularly small. In this paper, we develop an efficient statistical process monitoring methodology for high-dimensional processes based on the Bayesian approach. The key idea of this paper is to sequentially update a posterior distribution of the process parameter of interest through the Bayesian rule. In particular, a sparsity-promoting prior distribution of the parameter is applied under sparsity and is sequentially updated in online processing. A Bayesian hierarchical model with a data-driven way of determining the hyperparameters makes the monitoring scheme effective in detecting process shifts and computationally efficient in high-dimensional processes. Comparison with recently proposed methods for monitoring high-dimensional processes demonstrates the superiority of the proposed method in detecting small shifts. In addition, graphical presentations tracking the process parameter provide information for deciding whether a process needs to be adjusted before an alarm is triggered.
- Published
- 2021
- Full Text
- View/download PDF
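The sequential-update recursion at the heart of the entry above can be shown in its simplest conjugate form: a Normal-Normal model in which the posterior for a process mean is refreshed online with each observation. The paper's scheme uses a sparsity-promoting hierarchical prior in high dimensions; this toy keeps one dimension and invented numbers to show the recursion itself.

```python
def update(prior_mean, prior_var, obs, obs_var):
    """One Bayesian update of N(prior_mean, prior_var) with a noisy observation."""
    precision = 1.0 / prior_var + 1.0 / obs_var   # precisions add
    post_var = 1.0 / precision
    post_mean = post_var * (prior_mean / prior_var + obs / obs_var)
    return post_mean, post_var

mean, var = 0.0, 10.0            # vague prior on the process mean
for x in [1.9, 2.1, 2.0, 2.2]:   # observation stream, true mean ~2
    mean, var = update(mean, var, x, obs_var=1.0)

print(round(mean, 2), round(var, 2))
```

Each observation tightens the posterior (the variance shrinks monotonically), which is what makes the scheme sensitive to small sustained shifts: a drift in the data pulls the concentrated posterior away from its in-control value.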
28. Contract design under asymmetric demand information for sustainable supply chain practices
- Author
-
Xin Yun, Hao Liu, Kin Keung Lai, and Yi Li
- Subjects
Sustainable development ,Competition (economics) ,Information asymmetry ,Supply chain ,Sustainability ,General Decision Sciences ,Tariff ,Business ,Management Science and Operations Research ,Private information retrieval ,Industrial organization ,Supply and demand - Abstract
Emerging smart technology spawns many start-up firms, which enter the market and bring more instability and uncertainty to supply chains. This paper develops a game-theoretic model that captures the most crucial factors at the market entry stage, such as market uncertainty, competition, contract design, distribution channel, and sustainability. We consider a supply chain consisting of one supplier and two retailers (an incumbent and an entrant). We assume that the incumbent enters the market first and possesses private information about market demand; the entrant encroaches upon the market after observing its large growth potential. In this paper, we study how the supplier optimally designs contracts for the retailers under asymmetric information to make the supply chain more efficient and sustainable. We then investigate the preference between a franchise contract (FC) and a two-part tariff contract (TTC) from the perspectives of the supplier, the incumbent retailer, and the whole supply chain, respectively. It turns out that the supplier prefers the TTC, while the incumbent is better off under the FC when demand variation is relatively high. In addition, we find that the TTC benefits the whole supply chain in most cases; in other words, the TTC can promote coordination and realize the sustainable development of the supply chain. We further analyze the impact of the entrant's entry on the supplier's contract design for the incumbent: the supplier tends to charge a lower wholesale price and a smaller franchise fee after entry, a contract strategy conducive to maintaining horizontal competition and promoting supply chain sustainability. We also analyze the supplier's distribution channel decision and find that she is more willing to block the entrant from entering the market in most cases, because the incumbent with private demand information makes the supply chain more efficient. Finally, we extend our results by altering some assumptions in the basic model.
- Published
- 2021
- Full Text
- View/download PDF
29. A literature review on police patrolling problems
- Author
-
Sukanya Samanta, Goutam Sen, and Soumya K. Ghosh
- Subjects
Operations research ,Jurisdiction ,Computer science ,Crime prevention ,Smart city ,Patrolling ,General Decision Sciences ,Resource allocation ,Management Science and Operations Research ,Set (psychology) ,Domain (software engineering) ,Variety (cybernetics) - Abstract
Police patrol is an effective crime prevention tool and boosts public confidence in urban security. Many interesting decision-making problems arise in route design, resource allocation and jurisdiction planning, and many cities across the world have adopted structured, intelligent methods of police patrol in the presence of a variety of operational and resource constraints. In this paper, we present a comprehensive review of the state of the art in this domain, especially from the operations research (OR) practice point of view. This is a first-of-its-kind review on police patrol, presenting a classification scheme based on the type of problem, objective and modelling approach; under this scheme, one can readily locate any paper's specific contribution. The applicability of OR in this domain is set to grow significantly as governments formulate policies related to smart city planning and urban security. This study also reveals many practical challenges in police patrolling for future research.
- Published
- 2021
- Full Text
- View/download PDF
30. A bi-objective hierarchical program scheduling problem and its solution based on NSGA-III
- Author
-
Jingwen Zhang, Liangwei Chen, Wuliang Peng, and Jiali lin
- Subjects
Distributed Computing Environment ,021103 operations research ,Job shop scheduling ,Operations research ,Computer science ,0211 other engineering and technologies ,Pareto principle ,General Decision Sciences ,02 engineering and technology ,Management Science and Operations Research ,Enterprise project management ,Scheduling (computing) ,Shared resource ,Theory of computation ,Global optimization - Abstract
In enterprise project management systems, a program at the tactical level coordinates and manages multiple projects at the operational level. There are close relationships between the projects in a program, typically manifested as shared resources and precedence relationships. Most research efforts have concentrated on resource sharing between projects, while precedence relationships between projects have yet to be comprehensively investigated. In this paper, a bi-objective hierarchical resource-constrained program scheduling problem is proposed, in which both resource sharing and precedence relationships between projects are considered in a distributed environment. The problem contains two different sub-problems, at the operational level and the tactical level, and both are modeled in the same way as bi-objective multi-mode scheduling problems. Shared resources are allocated from the tactical level to the operational level, and once allocated to a project they can be re-allocated to other projects only when the current project is finished. Subsequently, a two-phase algorithm based on NSGA-III is developed. The algorithm runs at the operational level and the tactical level in turn: according to the Pareto fronts of projects submitted from the operational level, the bi-objective program planning at the tactical level is conducted under the constraints of precedence relationships and shared resources. The results of computational simulations demonstrate the satisfactory performance of the improved algorithm. By coordinating the local optimization of projects and the global optimization of the program in a hierarchical framework, the method proposed in this paper provides decision-makers at the various levels of a program with an effective integrated scheduling method.
- Published
- 2021
- Full Text
- View/download PDF
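The NSGA-III machinery in the entry above rests on non-dominated sorting: splitting candidate schedules into successive Pareto fronts by bi-objective dominance. A minimal sketch of that sorting step (both objectives minimised, schedule data invented), not the full algorithm:

```python
def dominates(a, b):
    """a dominates b: no worse in every objective, strictly better in one."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def nondominated_sort(points):
    fronts, remaining = [], list(points)
    while remaining:
        front = [p for p in remaining
                 if not any(dominates(q, p) for q in remaining)]
        fronts.append(front)
        remaining = [p for p in remaining if p not in front]
    return fronts

# (makespan, cost) pairs for five candidate schedules:
pts = [(3, 9), (5, 5), (9, 2), (6, 6), (8, 8)]
print(nondominated_sort(pts)[0])   # first Pareto front
```

In the paper's two-phase scheme, it is fronts like the first one here that each project submits upward to the tactical level, where program-level planning then selects among them.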
31. Queueing models for cognitive wireless networks with sensing time of secondary users
- Author
-
Kohei Akutsu, Sabine Wittevrongel, Tuan Phung-Duc, Osama Salameh, and Ken'ichi Kawanishi
- Subjects
Queueing theory ,021103 operations research ,Computer science ,business.industry ,0211 other engineering and technologies ,Stability (learning theory) ,General Decision Sciences ,02 engineering and technology ,Management Science and Operations Research ,Idle ,Cognitive radio ,Transmission (telecommunications) ,State (computer science) ,Interrupt ,business ,Communication channel ,Computer network - Abstract
This paper considers queueing models for cognitive radio networks that account for the sensing time of secondary users (SUs). In cognitive radio networks, secondary users are allowed to opportunistically use idle channels originally allocated to primary users (PUs). To this end, SUs must sense the state of the channels before transmission. After sensing, if an idle channel is available, the SU can start transmission immediately; otherwise, the SU must carry out another round of channel sensing. In this paper, we study two retrial queueing models with an unlimited number of sensing SUs, where PUs have preemptive priority over SUs. The two models differ in whether an arriving PU can interrupt an SU transmission even when idle channels are still available. We show that both models have the same stability condition and that the model without interruptions when idle channels are available has a slightly lower number of sensing SUs than the model with interruptions.
- Published
- 2021
- Full Text
- View/download PDF
32. A new cooperative depot sharing approach for inventory routing problem
- Author
-
Mehmet Onur Olgun and Erdal Aydemir
- Subjects
Cost allocation ,021103 operations research ,Operations research ,Computer science ,0211 other engineering and technologies ,General Decision Sciences ,02 engineering and technology ,Plan (drawing) ,Management Science and Operations Research ,Cooperative game theory ,Shapley value ,Set (abstract data type) ,Order (business) ,Theory of computation ,Routing (electronic design automation) - Abstract
This paper addresses a cooperative-game-theoretic bi-objective inventory routing problem in which replenishment plans are treated as coalition structures. In the distribution system considered, a set of customers order a single product from wholesalers to satisfy their own demands. Carrying products from a depot to customers incurs transportation and inventory costs, and some customers have insufficient depot capacity for their order levels. In this paper, therefore, customers cooperate for cost-saving benefits, using each other's depots at an additional cost. A cooperative inventory game is established, and an application is carried out in a furniture roving-parts company with costs allocated by the Shapley value method. The main objective of the study is to construct a new replenishment plan that meets the customers' demand while sharing depots and inventory routing costs. An improvement of 1.67% is obtained in vehicle utilization, and the proposed distribution method satisfies all customer demand within each customer's own period: when a customer cannot accommodate its demand in its own warehouse, the three coalition members with the most free capacity take the excess orders into their warehouses. As a result, an effective new replenishment plan is obtained under which bearing the additional costs is economical.
- Published
- 2021
- Full Text
- View/download PDF
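The Shapley value allocation used in the entry above can be computed by averaging each player's marginal cost contribution over all join orders. A self-contained sketch on a three-customer cost game; the coalition costs below are invented, chosen so that sharing depots is cheaper than operating alone.

```python
from itertools import permutations

def shapley(players, cost):
    """Average marginal cost contribution of each player over all join orders."""
    phi = {p: 0.0 for p in players}
    orders = list(permutations(players))
    for order in orders:
        coalition = []
        for p in order:
            before = cost(frozenset(coalition))
            coalition.append(p)
            phi[p] += cost(frozenset(coalition)) - before
    return {p: v / len(orders) for p, v in phi.items()}

# Hypothetical stand-alone and coalition routing costs for customers A, B, C:
costs = {frozenset(): 0, frozenset('A'): 10, frozenset('B'): 12,
         frozenset('C'): 14, frozenset('AB'): 18, frozenset('AC'): 20,
         frozenset('BC'): 22, frozenset('ABC'): 27}
alloc = shapley('ABC', lambda s: costs[s])
print(alloc)
```

The allocations sum to the grand-coalition cost of 27 (efficiency), and each customer pays less than its stand-alone cost, which is what makes the coalition stable enough for every customer to accept the shared-depot plan.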
33. A hybrid decision support model using Grey Relational Analysis and the Additive-Veto Model for solving multicriteria decision-making problems: an approach to supplier selection
- Author
-
Helder Tenório Cavalcanti, Thalles Vitelli Garcez, and Adiel Teixeira de Almeida
- Subjects
Decision support system ,021103 operations research ,Operations research ,Computer science ,Process (engineering) ,Veto ,0211 other engineering and technologies ,General Decision Sciences ,02 engineering and technology ,Management Science and Operations Research ,Decision problem ,Multiple-criteria decision analysis ,Grey relational analysis ,Ranking ,Complete information - Abstract
This paper puts forward a new hybrid Grey Additive-Veto Model (GAVM) for selecting suppliers, which optimizes the choice according to the decision-maker's (DM) preferences, using a Multi-Criteria Decision Making (MCDM) approach. The proposed model incorporates the uncertainty of real-world decision problems through grey numbers. Given the DM's compensatory rationality, the method reflects tradeoffs among different criteria. However, a fully compensatory model may not represent the DM faithfully and realistically; to address this, a performance veto condition is proposed. Additionally, the DM may consider suppliers with high uncertainty undesirable or even unacceptable; for this, a width-performance veto condition is also proposed. From a practical standpoint, an enriched analysis of the ranking is needed because, during the ranking process, there are situations in which the DM is undecided (mistrustful, hesitant) about the statement that one supplier is ranked better than another. This hesitation is mainly caused by the uncertainty arising from the grey numbers and by the attitudes, functions and characteristics of the vetoes the DM has defined. This paper shows that GAVM is suited to dealing with multicriteria decisions involving uncertain information and veto conditions, and its potential applicability is illustrated by a numerical supplier selection example. The example also shows how the proposed enriched ranking analysis can direct the DM's efforts toward more assertive and coherent decision-making.
- Published
- 2021
- Full Text
- View/download PDF
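The additive-with-veto idea behind GAVM can be stripped down to crisp numbers: a supplier scores by a weighted sum, but is vetoed outright if any criterion falls below its veto threshold. The weights, thresholds, and suppliers below are invented, and grey-number widths are omitted, so this shows only the compensation-plus-veto mechanism.

```python
weights = [0.5, 0.3, 0.2]
veto_min = [4, 3, 2]               # per-criterion performance veto thresholds

def score(perf):
    """Weighted additive score, or None if any criterion triggers the veto."""
    if any(p < v for p, v in zip(perf, veto_min)):
        return None                # vetoed: excluded from the ranking
    return sum(w * p for w, p in zip(weights, perf))

suppliers = {'S1': [8, 7, 6], 'S2': [9, 2, 9], 'S3': [6, 6, 5]}
ranked = sorted((s for s in suppliers if score(suppliers[s]) is not None),
                key=lambda s: -score(suppliers[s]))
print(ranked)
```

S2 illustrates why the veto is non-compensatory: its strong scores on criteria 1 and 3 cannot buy back its failure on criterion 2, so it is excluded rather than merely penalized.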
34. Risk management for crude oil futures: an optimal stopping-timing approach
- Author
-
Yaosong Zhan, Zhenya Liu, and Sabri Boubaker
- Subjects
021103 operations research ,business.industry ,0211 other engineering and technologies ,General Decision Sciences ,02 engineering and technology ,Management Science and Operations Research ,Audit risk ,Crude oil ,Econometrics ,Economics ,Drawdown (economics) ,Optimal stopping time ,Optimal stopping ,business ,Futures contract ,Risk management - Abstract
Timing the sale of crude oil futures to control risk is a question worth studying, given the swift falls in their prices. This paper proposes an optimal stopping model to find the optimal selling time at the beginning of a downtrend. The model relies on the drawdown of crude oil futures prices and a boundary to identify the occurrence of a downtrend in real time. Numerical simulation and empirical analyses verify the effectiveness of the proposed optimal stopping model, which in 2007, in particular, could effectively avoid losses. The conclusions of the paper provide a new perspective for investors seeking to control risk.
- Published
- 2021
- Full Text
- View/download PDF
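The drawdown-with-boundary idea in the entry above reduces, in its simplest form, to tracking the running peak and flagging a sell when the relative drawdown from that peak first exceeds a boundary. The real model derives the boundary from an optimal stopping problem; here it is a constant, and the price series is invented.

```python
def first_sell_time(prices, boundary=0.10):
    """Index at which relative drawdown from the running peak first exceeds boundary."""
    peak = prices[0]
    for t, p in enumerate(prices):
        peak = max(peak, p)
        if (peak - p) / peak > boundary:   # drawdown beyond the boundary
            return t
    return None                            # no sell signal in this window

series = [100, 104, 108, 107, 105, 96, 90]
print(first_sell_time(series))
```

The small dips at indices 3-4 stay inside the 10% boundary and are ignored; only the sustained fall at index 5 (drawdown ≈ 11% from the peak of 108) triggers the signal, before the further decline to 90.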
35. Retailer's optimal strategy for a perishable product with increasing demand under various payment schemes
- Author
-
Yan Shi, Zhiwen Tao, Sunil Tiwari, and Zhiyong Zhang
- Subjects
Upstream (petroleum industry) ,021103 operations research ,Total cost ,media_common.quotation_subject ,0211 other engineering and technologies ,General Decision Sciences ,ComputerApplications_COMPUTERSINOTHERSYSTEMS ,02 engineering and technology ,Management Science and Operations Research ,Payment ,Product (business) ,Microeconomics ,Downstream (manufacturing) ,Demand curve ,Order (business) ,Cash ,Business ,media_common - Abstract
This paper studies the optimal replenishment strategy of a retailer under partial two-level trade credit. The paper considers the following scenario: (1) the product under consideration is a deteriorating item; (2) demand is an increasing function of time; (3) the retailer pays the supplier via an Advance-Cash-Credit (ACC) payment scheme and grants customers a certain credit period; and (4) the supplier offers the retailer a price discount to facilitate sales. The goal of the paper is to determine the retailer's order cycle that minimizes total cost per unit time. First, we prove the existence and uniqueness of the optimal solution. Second, we validate the theoretical results and examine the performance of upstream ACC payment combined with downstream credit payment through numerical analysis of the key parameters; the numerical results show that this combination is cheaper for the retailer than the traditional method (upstream cash with downstream cash), encouraging the retailer to order larger quantities less frequently. Third, we compare the retailer's ordering behavior and total cost per unit time under five two-level payment types: upstream cash with downstream cash, and upstream advance, cash, credit, or ACC with downstream credit, where the supplier grants a price discount when the retailer pays in advance. We find that upstream advance with downstream credit yields the lowest cost to the retailer and leads to the largest order quantities, whereas upstream credit with downstream credit (with the upstream credit period shorter than the downstream one) yields the highest cost. These results can help retailers select payment schemes and optimize their operational decisions.
- Published
- 2021
- Full Text
- View/download PDF
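The decision variable in the entry above is an order cycle T minimising total cost per unit time. A bare-bones stand-in, using a classic EOQ-style cost with constant demand and ignoring deterioration, time-varying demand, and credit terms (all numbers invented), found by grid search:

```python
A, h, d = 200.0, 0.5, 40.0   # setup cost, holding cost per unit per time, demand rate

def cost_rate(T):
    """Setup cost plus average holding cost, per unit time, for cycle length T."""
    return A / T + h * d * T / 2.0

# Grid-search T over (0, 10] in steps of 0.01:
T_best = min((t / 100.0 for t in range(1, 1001)), key=cost_rate)
print(round(T_best, 2))
```

The grid answer sits at the closed-form optimum sqrt(2A/(h d)) = sqrt(20) ≈ 4.47; in the paper's model the cost function additionally carries deterioration and payment-scheme terms, which is why existence and uniqueness of the minimiser have to be proved rather than read off a formula.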
36. Evaluation method of path selection for smart supply chain innovation
- Author
-
Jingkun Wang, Siyu Wang, and Weihua Liu
- Subjects
021103 operations research ,Operations research ,Computer science ,Supply chain ,0211 other engineering and technologies ,General Decision Sciences ,TOPSIS ,02 engineering and technology ,Management Science and Operations Research ,Fuzzy logic ,Weighting ,Decision matrix ,Decision-matrix method ,Path (graph theory) ,Operational efficiency - Abstract
Smart supply chain innovation (SSCI) has become a key way for enterprises to enhance their competitiveness, so choosing a reasonable innovation path is very important for supply chain enterprises. Through a literature review, this paper summarizes four main evaluation indicators for SSCI path selection: technical indicators, organizational environment indicators, operational efficiency indicators, and risk prevention and control indicators. According to the characteristics of the evaluation indices, an improved Fuzzy Entropy-Technique for Order Preference by Similarity to Ideal Solution (TOPSIS) method is proposed, which handles original data containing both interval-valued and fixed-value entries. First, the standardized decision matrix is constructed and the optimal index weights are established by the Fuzzy Entropy method. Second, the weighted decision matrix is computed by sub-item weighting of the hierarchical indicators. Finally, the improved TOPSIS method determines the relative closeness of each scheme, and the innovation path index of the smart supply chain (SSC) is further calculated from the weighted decision matrix. An actual company case is used to evaluate three different SSCI paths. This paper thus provides a reference for SSCI path selection in both theory and practice.
- Published
- 2021
- Full Text
- View/download PDF
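The relative-closeness computation at the core of TOPSIS in the entry above can be sketched compactly for fixed-value data with preset weights; the paper's version additionally handles interval values and derives the weights by fuzzy entropy. The three candidate paths and their scores below are invented.

```python
import math

def topsis(matrix, weights):
    """Relative closeness of each alternative to the ideal solution (benefits only)."""
    cols = list(zip(*matrix))
    # vector-normalise each criterion column, then apply the weights
    norms = [math.sqrt(sum(v * v for v in c)) for c in cols]
    weighted = [[w * v / n for v, n, w in zip(row, norms, weights)]
                for row in matrix]
    wcols = list(zip(*weighted))
    ideal = [max(c) for c in wcols]   # positive ideal solution
    anti = [min(c) for c in wcols]    # negative ideal solution
    scores = []
    for row in weighted:
        d_pos = math.dist(row, ideal)
        d_neg = math.dist(row, anti)
        scores.append(d_neg / (d_pos + d_neg))
    return scores

# Three innovation paths scored on technology, organisation, efficiency:
paths = [[7, 6, 8], [9, 5, 6], [6, 8, 7]]
scores = topsis(paths, weights=[0.4, 0.3, 0.3])
print(max(range(3), key=scores.__getitem__))   # index of the best path
```

Each score lies in (0, 1), with larger values meaning the path sits closer to the ideal than to the anti-ideal, which is how the paper's path index ranks the three SSCI options.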
37. A Fuzzy ISM approach for modeling electronic traceability in agri-food supply chain in India
- Author
-
Ayushi Srivastava and Kavya Dashora
- Subjects
021103 operations research ,Process management ,Supply chain management ,Traceability ,Supply chain ,0211 other engineering and technologies ,General Decision Sciences ,02 engineering and technology ,Management Science and Operations Research ,Appropriate technology ,Competitive advantage ,Fuzzy logic ,Product (business) ,Transparency (graphic) ,Business - Abstract
The purpose of this paper is to explore the enablers of electronic traceability in the agri-food supply chain in India. In several agri-food supply chains, the absence of any form of traceability, or reliance on paper-based traceability, hampers trade in the affected food products. Electronic traceability (e-traceability) will assist agri-food firms in improving their performance, minimizing food fraud, ensuring efficient product recalls, and contributing to overall agri-food supply chain management. Drawing on a literature review and expert opinion, the enablers of e-traceability are modelled and analyzed using Fuzzy ISM and Fuzzy MICMAC; the combination of these two techniques helps identify the essential drivers of e-traceability implementation in agri-food supply chains. The proposed approach found that the electronic form of traceability is better than paper-based traceability in agri-food supply chains. The significant drivers of e-traceability implementation, particularly in the agri-food supply chain, are appropriate technology, competitive advantage, coordination and transparency, and management support. The identified enablers can guide managers and decision-makers in adopting e-traceability in their existing agri-food supply chains.
- Published
- 2021
- Full Text
- View/download PDF
38. Dealing with complex transaction costs in portfolio management
- Author
-
Antonio Violi, Massimiliano Ferrara, Patrizia Beraldi, Bruno Antonio Pansera, Claudio Ciancio, and Operations Analytics
- Subjects
Transaction cost ,Structure (mathematical logic) ,021103 operations research ,Operations research ,Computer science ,Portfolio management ,0211 other engineering and technologies ,General Decision Sciences ,02 engineering and technology ,Management Science and Operations Research ,Range (mathematics) ,SDG 17 - Partnerships for the Goals ,Branch and bound method ,Portfolio ,Complex transaction costs ,Project portfolio management ,Integer (computer science) - Abstract
This paper deals with modelling complex transaction cost structures within portfolio management models in an efficient and effective way. We consider a general structure of transaction costs, in which the applied commissions depend on the range of the traded monetary amount, and we use this general structure within a portfolio optimization problem with rebalancing decisions in response to new market conditions. Transaction costs reduce the fund's capital and should be properly accounted for to avoid substantial costs that affect portfolio performance. In this paper we present a mixed-integer model equipped with a specialized Branch and Bound method that exploits the specific formulation of the trading operations. Computational experiments, carried out on transaction cost structures offered by real-life traders, show the effectiveness of the proposed model and the computational efficiency of the solution approach.
- Published
- 2021
- Full Text
- View/download PDF
39. Award scheme in random trial contests
- Author
-
Gongbing Bi and Xu Tian
- Subjects
Scheme (programming language) ,021103 operations research ,ComputingMilieux_THECOMPUTINGPROFESSION ,Computer science ,Risk aversion ,ComputingMilieux_PERSONALCOMPUTING ,0211 other engineering and technologies ,General Decision Sciences ,Probability density function ,02 engineering and technology ,Management Science and Operations Research ,CONTEST ,GeneralLiterature_MISCELLANEOUS ,Shock (economics) ,Theory of computation ,Product (category theory) ,computer ,Mathematical economics ,computer.programming_language - Abstract
Innovation contests have become an important tool in companies' product research and development. In the innovation contest literature, most papers assume either the homogeneous innovation contest model or the all-pay auction model. In this paper, we consider the random trial contest model and study the optimal award scheme. We show that, in this contest model, contestants' risk types play important roles in the award scheme, and the results are independent of the probability density function of the random shock; these results generalize existing work in the literature. In addition, the risk-aversion coefficient decides the allocation manner in a multiple-winner scheme, i.e., whether a concave or a convex allocation manner is optimal.
- Published
- 2021
- Full Text
- View/download PDF
40. Optimal selection of third-party logistics providers using integer programming: a case study of a furniture company storage and distribution
- Author
-
Diane Ahrens, Mohammed Alnahhal, and Mosab I. Tabash
- Subjects
Truck ,Exponential distribution ,Supply chain management ,Operations research ,Computer science ,General Decision Sciences ,Distribution (economics) ,Management Science and Operations Research ,Order (business) ,Theory of computation ,Integer programming ,Selection (genetic algorithm) - Abstract
This paper investigates the selection of third-party logistics providers (3PLs) based on the best prices offered by them. The focus is on outbound logistics, where the 3PLs must have their own distribution centres for storage and picking activities. They must also have suitable trucks for distribution to many small-scale customers. The motivation for this paper is a case study from Germany in which a furniture company with hundreds of small customers in ten zones is seeking one or more 3PLs to handle the distribution. A mathematical programming model was built based on integer programming, where demand per order can be expressed using an exponential distribution in each customer zone. The main contribution of this paper is that it finds the best 3PLs based on the different pricing methods of the various providers; this means including the location problem indirectly in the new integer programming model. The model takes into consideration three different methods of pricing based on the offers of five 3PLs. These differing methods make it difficult for decision makers to choose the best solution, especially if specific trends in demand are expected in the future for some customer zones. The results show that future increases in demand, in terms of the number of orders or order size, could affect the optimal solution. The best pricing method with the lowest variability in cost over time is selected.
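A stripped-down sketch of the comparison idea, not the paper's integer program: each hypothetical provider quotes a different pricing scheme (per order, per cubic metre, flat fee plus rate), and the cheapest single provider and cheapest provider per zone are compared over invented zone demand:

```python
# Hypothetical zone demand forecasts and 3PL pricing schemes.
DEMAND = {"zone_A": {"orders": 120, "volume_m3": 300},
          "zone_B": {"orders": 80, "volume_m3": 150}}

PROVIDERS = {
    "3PL_1": lambda d: 12.0 * d["orders"],          # price per order
    "3PL_2": lambda d: 4.5 * d["volume_m3"],        # price per cubic metre
    "3PL_3": lambda d: 400.0 + 8.0 * d["orders"],   # flat fee plus rate
}

def total_cost(provider: str, demand: dict) -> float:
    """Cost of serving all zones with a single provider."""
    return sum(PROVIDERS[provider](d) for d in demand.values())

# Cheapest single provider vs. cheapest provider for each zone separately.
best_single = min(PROVIDERS, key=lambda p: total_cost(p, DEMAND))
best_per_zone = {z: min(PROVIDERS, key=lambda p: PROVIDERS[p](d))
                 for z, d in DEMAND.items()}
```

Scaling the demand figures up or down changes which scheme wins, which mirrors the paper's observation that future demand trends can shift the optimal selection.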
- Published
- 2021
- Full Text
- View/download PDF
41. Dynamic capital allocation rules via BSDEs: an axiomatic approach
- Author
-
Elisa Mastrogiacomo and Emanuela Rosazza Gianin
- Subjects
Risk measures ,Capital allocation ,BSDE ,g-expectation ,Subdifferential ,Gradient ,General Decision Sciences ,Management Science and Operations Research - Abstract
In this paper, we study capital allocation for dynamic risk measures, with an axiomatic approach but also by exploiting the relation between risk measures and BSDEs. Although there is a wide literature on capital allocation rules in a static setting and on dynamic risk measures, only a few recent papers on capital allocation work in a dynamic setting and, moreover, those papers mainly focus on the gradient approach. To fill this gap, we discuss new perspectives on the capital allocation problem going beyond those already existing in the literature. In particular, we introduce and investigate a general axiomatic approach to dynamic capital allocations as well as an approach suitable for risk measures induced by g-expectations under weaker assumptions than Gateaux differentiability.
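For background, the static gradient (Euler) allocation that the paper moves beyond can be sketched as follows; this is the standard textbook rule, not the paper's dynamic BSDE construction:

```latex
% Static gradient/Euler allocation (background the paper generalizes):
% for a Gateaux-differentiable, positively homogeneous risk measure \rho
% and a portfolio X = \sum_{i=1}^n X_i, the capital assigned to unit i is
A_i \;=\; \lim_{h \downarrow 0} \frac{\rho(X + h X_i) - \rho(X)}{h},
\qquad \text{with full allocation } \sum_{i=1}^n A_i = \rho(X),
% where the full-allocation property follows from Euler's theorem for
% positively homogeneous functionals.
```

The paper's contribution is precisely to relax the differentiability required by this rule and to let the allocation evolve dynamically through g-expectations.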
- Published
- 2022
- Full Text
- View/download PDF
42. A simulation-deep reinforcement learning (SiRL) approach for epidemic control optimization
- Author
-
Sabah Bushaj, Xuecheng Yin, Arjeta Beqiri, Donald Andrews, and İ. Esra Büyüktahtakın
- Subjects
General Decision Sciences ,Management Science and Operations Research - Abstract
In this paper, we address the controversies of epidemic control planning by developing a novel Simulation-Deep Reinforcement Learning (SiRL) model. COVID-19 reminded constituents all over the world that government decision-making can change their lives. During the COVID-19 pandemic, governments were concerned with reducing fatalities as the virus spread while at the same time maintaining a flowing economy. In this paper, we address epidemic decision-making regarding the interventions necessary at a given stage of the epidemic, based on the purpose of the decision-maker. Further, we compare different vaccination strategies, such as age-based and random vaccination, to shine a light on who should get priority in the vaccination process. To address these issues, we propose a simulation-deep reinforcement learning (DRL) framework composed of an agent-based simulation model and a governor DRL agent that can enforce interventions in the agent-based simulation environment. Computational results show that our DRL agent can learn effective strategies and suggest optimal actions given a specific epidemic situation, based on a multi-objective reward structure. We compare our DRL agent's decisions to government interventions at different periods of time during the COVID-19 pandemic. Our results suggest that more could have been done to control the epidemic. In addition, if a random vaccination strategy that allows super-spreaders to get vaccinated early had been used, infections would have been reduced by 32% at the expense of 4% more deaths. We also show that a behavioral change of fully quarantining 10% of the risky individuals, combined with a random vaccination strategy, leads to a reduction of the death toll by 14% and 27% compared to the age-based vaccination strategy that was implemented and the New Jersey reported data, respectively. We have also demonstrated the flexibility of our approach by validating and applying the model to the COVID-19 case in the state of Kansas.
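A toy stand-in for the simulation-plus-RL idea (heavily simplified: the paper couples an agent-based COVID-19 simulation with a deep RL governor, whereas here a tabular Q-learning agent controls an invented five-level infection chain with a made-up economic penalty for lockdowns):

```python
import random

random.seed(0)

N_LEVELS, ACTIONS = 5, (0, 1)   # infection severity 0..4; 0 = open, 1 = lockdown
ECON_COST = 1.5                 # hypothetical economic penalty per lockdown step

def step(level: int, action: int):
    """Infection drifts up when open, down under lockdown (stochastic)."""
    drift = -1 if action == 1 else (1 if random.random() < 0.7 else 0)
    nxt = min(N_LEVELS - 1, max(0, level + drift))
    reward = -(nxt + ECON_COST * action)   # multi-objective: health + economy
    return nxt, reward

# Tabular Q-learning with an epsilon-greedy behaviour policy.
Q = [[0.0, 0.0] for _ in range(N_LEVELS)]
alpha, gamma, eps = 0.2, 0.9, 0.1
for _ in range(2000):
    s = random.randrange(N_LEVELS)
    for _ in range(30):
        if random.random() < eps:
            a = random.choice(ACTIONS)
        else:
            a = max(ACTIONS, key=lambda x: Q[s][x])
        s2, r = step(s, a)
        Q[s][a] += alpha * (r + gamma * max(Q[s2]) - Q[s][a])
        s = s2

policy = [max(ACTIONS, key=lambda a: Q[s][a]) for s in range(N_LEVELS)]
```

Even in this toy, the learned policy tends to lock down only at high infection levels, reflecting the multi-objective trade-off the abstract describes.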
- Published
- 2022
43. A novel framework of credit risk feature selection for SMEs during industry 4.0
- Author
-
Yang Lu, Lian Yang, Baofeng Shi, Jiaxiang Li, and Mohammad Zoynul Abedin
- Subjects
General Decision Sciences ,Management Science and Operations Research - Abstract
With the development of Industry 4.0, the credit data of SMEs are characterized by large volume, high velocity, diversity, and low value density. How to select, from such high-dimensional data, the key features that affect credit risk has become the critical point in accurately measuring the credit risk of SMEs and alleviating their financing constraints. To this end, this paper proposes a credit risk feature selection approach that integrates the binary opposite whale optimization algorithm (BOWOA) and the Kolmogorov-Smirnov (KS) statistic. Furthermore, we use seven machine learning classifiers and three discriminant methods to verify the robustness of the proposed model on three real bank datasets of SMEs. The empirical results show that, although no single artificial intelligence credit evaluation method is universal across different SMEs' credit data, the BOWOA-KS model proposed in this paper outperforms other methods when both the number of indicators in the optimal subset and the prediction performance of the classifier are considered simultaneously. By providing a high-dimensional feature selection method and improving the predictive performance of credit risk models, the approach could help SMEs focus on the factors that will improve their creditworthiness and ease their access to loans from financial institutions. It will also help government agencies and policymakers develop policies to help SMEs reduce their credit risks.
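A sketch of the KS half of the BOWOA-KS idea (the whale-optimization search is omitted, and the borrower data below are invented): for each candidate feature, compute the two-sample Kolmogorov-Smirnov statistic between good and bad borrowers, and rank features by how well they separate the two groups:

```python
def ks_statistic(sample_a, sample_b):
    """Max vertical distance between the two empirical CDFs."""
    xs = sorted(set(sample_a) | set(sample_b))

    def ecdf(sample, x):
        return sum(v <= x for v in sample) / len(sample)

    return max(abs(ecdf(sample_a, x) - ecdf(sample_b, x)) for x in xs)

# feature -> (values for good borrowers, values for bad borrowers); invented
features = {
    "debt_ratio": ([0.2, 0.3, 0.25, 0.4], [0.7, 0.8, 0.65, 0.9]),
    "age":        ([30, 45, 50, 28],      [33, 41, 52, 29]),
}
ranked = sorted(features, key=lambda f: ks_statistic(*features[f]),
                reverse=True)
```

A feature whose class-conditional distributions barely overlap (here `debt_ratio`) gets a KS value near 1 and ranks first; in the paper, the metaheuristic then searches over subsets of such features.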
- Published
- 2022
- Full Text
- View/download PDF
44. Supplier selection to support environmental sustainability: the stratified BWM TOPSIS method
- Author
-
Mehdi Rajabi Asadabadi, Hadi Badri Ahmadi, Himanshu Gupta, and James J. H. Liou
- Subjects
General Decision Sciences ,Management Science and Operations Research - Abstract
Organisations need to develop long-term strategies that incorporate innovation for environmental sustainability (IES) to remain competitive in the market. This can be challenging given the high level of uncertainty regarding the future (e.g., following the COVID pandemic). Supplier selection is an important organisational decision and can be designed to support IES. While the literature provides various criteria in the field of IES strategies, it does not identify the criteria that can be utilised to assist organisations in their supplier selection decisions. Moreover, the literature in this field does not consider uncertainty related to the occurrence of possible future events that may influence the importance of these criteria. To address this gap, this paper develops a novel criteria decision framework to assist supplier evaluation in organisations, taking into consideration different events that may occur in the future. The framework combines three decision-making methods: the stratified multi-criteria decision-making method, the best worst method, and the technique for order of preference by similarity to ideal solution. The framework proposed in this paper can also be adopted to enable effective and sustainable decision making under uncertainty in various fields.
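A minimal sketch of the TOPSIS stage of such a framework (the stratification and best-worst-method stages, which would supply the weights and scenarios, are omitted; suppliers, criteria, and weights below are hypothetical):

```python
def topsis(matrix, weights, benefit):
    """Rank alternatives by closeness to the ideal solution.

    matrix: rows = alternatives, columns = criteria
    benefit: True for benefit criteria, False for cost criteria
    """
    ncols = len(weights)
    # 1) vector-normalize each column, 2) apply criterion weights
    norms = [sum(row[j] ** 2 for row in matrix) ** 0.5 for j in range(ncols)]
    V = [[w * row[j] / norms[j] for j, w in enumerate(weights)]
         for row in matrix]
    # 3) ideal / anti-ideal points (direction depends on criterion type)
    best = [max(col) if benefit[j] else min(col)
            for j, col in enumerate(zip(*V))]
    worst = [min(col) if benefit[j] else max(col)
             for j, col in enumerate(zip(*V))]
    # 4) closeness coefficient: nearer to ideal, farther from anti-ideal
    scores = []
    for row in V:
        d_best = sum((v - b) ** 2 for v, b in zip(row, best)) ** 0.5
        d_worst = sum((v - w) ** 2 for v, w in zip(row, worst)) ** 0.5
        scores.append(d_worst / (d_best + d_worst))
    return scores

# rows = suppliers; columns = [green innovation, cost, delivery reliability]
matrix = [[7, 30, 0.95], [9, 45, 0.90], [5, 25, 0.99]]
scores = topsis(matrix, weights=[0.5, 0.3, 0.2], benefit=[True, False, True])
```

In the paper's stratified setting, the weights themselves would vary with the future events considered, producing one such ranking per scenario.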
- Published
- 2022
45. Dyadic analysis for multi-block data in sport surveys analytics
- Author
-
Maria Iannario, Rosaria Romano, and Domenico Vistocco
- Subjects
Athletes’/coaches’ perception ,Complex data structure ,Dyadic analysis ,Quantile regression ,General Decision Sciences ,Management Science and Operations Research - Abstract
Analyzing sports data has become a challenging issue, as it involves non-standard data structures coming from several sources and in different formats, often high-dimensional and complex. This paper deals with a dyadic structure (athletes/coaches) characterized by a large number of manifest and latent variables. Data were collected in a survey administered within a joint project of the University of Naples Federico II and the Italian Swimmer Federation. The survey gathers information about psychosocial aspects influencing swimmers’ performance. The paper introduces a data processing method for dyadic data, presenting an alternative approach to the currently used models, and provides an analysis of the psychological factors affecting actor/partner interdependence by means of quantile regression. The obtained results could be an asset in designing strategies and actions for both coaches and swimmers, establishing an original use of statistical methods for analysing athletes’ psychological behaviour.
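The quantile-regression building block can be illustrated in miniature (this is not the paper's dyadic model; the ratings below are invented, and only an intercept is fitted): the tau-th conditional quantile minimizes the "pinball" loss, so a grid search over a constant recovers the empirical quantile.

```python
def pinball_loss(tau: float, residuals) -> float:
    """Asymmetric loss whose minimizer is the tau-th quantile."""
    return sum((tau * r) if r >= 0 else ((tau - 1) * r) for r in residuals)

scores = [5.1, 6.4, 7.0, 7.2, 8.3, 9.0, 9.5]   # hypothetical swimmer ratings
tau = 0.5                                       # median regression
candidates = [x / 100 for x in range(400, 1001)]  # grid of intercept values
best = min(candidates,
           key=lambda c: pinball_loss(tau, [y - c for y in scores]))
```

With tau = 0.5 the minimizer is the sample median; choosing other tau values (e.g. 0.1 or 0.9) is what lets the paper describe effects across the whole performance distribution rather than only at the mean.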
- Published
- 2022
- Full Text
- View/download PDF
46. Intelligence customs declaration for cross-border e-commerce based on the multi-modal model and the optimal window mechanism
- Author
-
Xiaofeng Li, Jing Ma, and Shan Li
- Subjects
General Decision Sciences ,Management Science and Operations Research - Abstract
This paper studies the intelligent customs declaration of cross-border e-commerce commodities, from algorithm design to implementation. The difficulty of this task lies in recognizing commodity names, materials, and processing processes. Because the process of recognizing these three kinds of commodity information is similar, this paper takes commodity name recognition as the experimental research object. The algorithm is based on pre-clustering and uses an optimal window mechanism to obtain the best word embedding representation. A Vision Transformer model extracts image features instead of a traditional CNN, and the text features are then fused with the image features to generate a multi-modal semantic feature vector. Finally, a deep forest classifier replaces conventional neural network classifiers to complete the commodity name recognition task. The experimental results show that, for more than 600 different commodities in 120,000 data records, the precision is 0.85, the recall is 0.87, and the F
- Published
- 2022
- Full Text
- View/download PDF
47. To stop or not to stop: a time-constrained trip covering location problem on a tree network
- Author
-
M. C. López-de-los-Mozos and Juan A. Mesa
- Subjects
Constraint (information theory) ,Set (abstract data type) ,Mathematical optimization ,Plane (geometry) ,Computer science ,Theory of computation ,Tree network ,General Decision Sciences ,Point (geometry) ,Management Science and Operations Research
Locating new stations/stops in public transportation networks has attracted much interest from the points of view of both theory and applications. In this paper we consider a set of pairs of points in the plane demanding travel between the elements of each pair, and a tree network embedded in the plane representing the transportation system. An alternative mode of transportation competes with the combined plane-network mode, so that the modal choice is made by distance (time) comparisons. The aim of the problem dealt with in this paper is to locate a new station/stop so as to maximize the traffic through the network. Since a stop at the new station increases the travel time of passengers who already use the combined mode, and may persuade them to change mode, a constraint on the increase of the overall time is imposed. An algorithm whose running time is quadratic in the number of pairs is proposed.
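A heavily simplified sketch of the coverage test underlying such problems (the plane geometry, access walks, and the stop-delay constraint of the actual model are omitted; the tree, travel times, and demand pairs below are invented): a pair chooses the network mode when its shortest path in the tree beats the direct alternative-mode time.

```python
from collections import defaultdict
import heapq

EDGES = [("a", "b", 2.0), ("b", "c", 3.0), ("b", "d", 4.0)]  # toy weighted tree

def shortest_path(src: str) -> dict:
    """Dijkstra over the toy tree; returns travel times from src."""
    graph = defaultdict(list)
    for u, v, w in EDGES:
        graph[u].append((v, w))
        graph[v].append((u, w))
    dist, heap = {src: 0.0}, [(0.0, src)]
    while heap:
        d, u = heapq.heappop(heap)
        if d > dist.get(u, float("inf")):
            continue
        for v, w in graph[u]:
            if d + w < dist.get(v, float("inf")):
                dist[v] = d + w
                heapq.heappush(heap, (d + w, v))
    return dist

# (origin, destination, direct alternative-mode time) for each demand pair
PAIRS = [("a", "c", 6.0), ("a", "d", 5.0), ("c", "d", 8.0)]
covered = sum(shortest_path(o)[d] <= alt for o, d, alt in PAIRS)
```

The location problem then asks where a new stop (and the delay it adds to through-passengers) moves this covered-pair count as high as possible without violating the overall-time constraint.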
- Published
- 2021
- Full Text
- View/download PDF
48. Applications of data envelopment analysis in supplier selection between 2000 and 2020: a literature review
- Author
-
Pankaj Dutta, Bharath Jaikumar, and Manpreet Singh Arora
- Subjects
Supply chain management ,Performance management ,Supply chain ,Scopus ,General Decision Sciences ,Management Science and Operations Research ,Supplier evaluation ,Competitive advantage ,Purchasing ,Data envelopment analysis ,Business ,Industrial organization - Abstract
Purchasing occupies a strategic role in a firm's supply chain management and is a driver of competitive advantage. Owing to the high purchase-cost-to-revenue ratio, decisions such as the evaluation, selection, and performance management of suppliers are matters of immense interest to firms. Multi-criteria decision making tools allow purchasing managers to evaluate suppliers holistically. One such tool, data envelopment analysis (DEA), has been used extensively for supplier evaluation and selection. This paper presents a comprehensive review of 161 articles published since 2000 on the application of DEA in supplier selection. These articles were located in the Scopus database. With little existing literature providing a full-fledged review, this work is envisaged to be the first of its kind, aiding DEA practitioners in the purchasing function. The analysis indicates the emergence of the themes of green supply chains and sustainability in recent years, as well as the adoption of hybrid approaches to solving the supplier selection problem using DEA. The paper presents various classifications of DEA methods based on input criteria, sectors of application, and industry-wide case studies, which can be used as a quick reckoner by an academician or a purchasing manager.
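To convey the DEA efficiency idea behind the reviewed models, here is a single-input, single-output illustration (hedged: real DEA formulations with multiple inputs and outputs solve one linear program per decision-making unit; the supplier figures below are invented): each supplier's output/input ratio is scaled by the best observed ratio, so the frontier supplier scores 1.0.

```python
# name: (input = purchase cost, output = quality score); invented figures
suppliers = {
    "S1": (100.0, 80.0),
    "S2": (120.0, 90.0),
    "S3": (90.0, 85.0),
}

# One-input one-output efficiency: ratio relative to the best ratio.
ratios = {s: out / inp for s, (inp, out) in suppliers.items()}
best = max(ratios.values())
efficiency = {s: r / best for s, r in ratios.items()}
```

Suppliers with efficiency 1.0 lie on the frontier; the multi-criteria DEA models surveyed in the paper generalize this by letting each supplier choose its most favourable input/output weights subject to the LP constraints.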
- Published
- 2021
- Full Text
- View/download PDF
49. Spatio-temporal efficiency measurement under undesirable outputs using multi-objective programming: a GAMS representation
- Author
-
Konstantinos Petridis
- Subjects
Mathematical optimization ,Computer science ,General Decision Sciences ,Context (language use) ,Management Science and Operations Research ,Set (abstract data type) ,Software ,Dimension (vector space) ,Face (geometry) ,Time series ,Representation (mathematics)
Time series data in DEA often represent successive versions of the same decision-making unit (DMU). Several DEA techniques have been employed to assess the efficiency of each DMU. One problem conventional DEA models face is that, when dealing with time series data, the reference set is not constructed correctly. This is because conventional DEA models examine the DMUs and extract their efficiency scores based only on the spatial dimension. When dealing with time series data for DMUs in the DEA context, however, the temporal dimension should also be taken into account. This paper builds on the Spatio-Temporal DEA (ST-DEA) model (Petridis et al. in Ann Oper Res 238(1–2):475–496, 2016) and extends it by incorporating undesirable inputs/outputs. A GAMS representation for the solution and explanation of the ST-DEA model is shown through an illustrative example. The scope of the paper is to analyze the concept of the ST-DEA model and demonstrate its applicability via an application implemented in the GAMS optimization software.
- Published
- 2020
- Full Text
- View/download PDF
50. Downtime cost analysis of offloading operations due to influence of partially standing waves in Malaysian waters and development of graphical user interface
- Author
-
A. A. Khalifa, M. S. Patel, Mohd Shahir Liew, Zahiraniza Mustaffa, and Andrew Whyte
- Subjects
Downtime ,Computer science ,Interface (Java) ,General Decision Sciences ,Management Science and Operations Research ,Field (computer science) ,Reliability engineering ,Work (electrical) ,Revenue ,Floating liquefied natural gas ,Graphical user interface - Abstract
A cost-related study of offloading operations is an integral part of monetary risk assessment. The partially standing waves which occur in the gap between vessels are responsible for impacting offloading operations in terms of downtime costs. This paper presents a downtime cost analysis of side-by-side offloading operations in Malaysian waters for regular waves, addressing the influence of partially standing waves through a graphical user interface (GUI). The developed interface is explained and its working procedure is demonstrated in this paper. The downtime is studied for two sea states, beam and heading seas, for which the probability of occurrence was calculated from the location-specific wave scatter distribution. The results of wave kinematics for partially standing waves influencing the offloading operation in side-by-side configuration are presented. The downtime cost analysis will help oil operator companies to analyze the economic risks involved in field developments and anticipate the loss in revenue due to downtime occurrences. Overall, an attempt to integrate the influence of the gap between vessels with offloading operations and the related costs is presented.
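The downtime cost logic can be sketched back-of-the-envelope (hedged: the paper's GUI derives halt probabilities from location-specific wave scatter data and partially standing wave kinematics; the sea states, probabilities, and day rate below are invented placeholders):

```python
# name: (occurrence probability, does this sea state halt offloading?)
SEA_STATES = {
    "beam_calm":  (0.45, False),
    "beam_rough": (0.15, True),
    "head_calm":  (0.30, False),
    "head_rough": (0.10, True),
}
DAYS = 365           # exposed operating days per year
DAY_RATE = 1.2e6     # hypothetical revenue lost per halted day

# Expected downtime days = exposure * probability of a halting sea state.
downtime_days = DAYS * sum(p for p, halted in SEA_STATES.values() if halted)
expected_cost = downtime_days * DAY_RATE
```

Replacing the placeholder probabilities with ones derived from the wave scatter distribution is exactly the step the paper's interface automates for a given field location.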
- Published
- 2020
- Full Text
- View/download PDF