Computing the sequence of k-cardinality assignments
The k-cardinality assignment (k-assignment, for short) problem asks for a minimum- (maximum-) weight matching of cardinality k in a weighted bipartite graph. Here we are interested in computing the sequence of all k-assignments, k = 1, ..., n. By applying the algorithm of Gassner and Klinz (2010) for the parametric assignment problem, one can compute the set of k-assignments for those integers k that correspond to essential terms of the full characteristic maxpolynomial of the corresponding max-plus weight matrix. We show that this maxpolynomial is in full canonical form, which implies that the remaining k-assignments correspond to semi-essential terms. This property enables us to efficiently compute all the remaining k-assignments from the already computed essential k-assignments. The resulting overall time complexity for computing the whole sequence of k-cardinality assignments matches the best known bound for this problem.
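To make the object being sequenced concrete, here is a brute-force sketch that computes the minimum weight of a matching of each cardinality k directly from a weight matrix. The matrix W and the exponential enumeration are purely illustrative of the problem definition; they have nothing to do with the polynomial-time algorithm of the abstract.

```python
from itertools import combinations, permutations

def k_assignment_sequence(W):
    """Brute-force the minimal-weight matching of each cardinality k = 1..n
    in the complete bipartite graph whose edge weights are given by matrix W.
    Exponential time; intended only to illustrate the problem definition."""
    n = len(W)
    best = {}
    for k in range(1, n + 1):
        best[k] = min(
            sum(W[r][c] for r, c in zip(rows, cols))
            for rows in combinations(range(n), k)
            for cols in permutations(range(n), k)
        )
    return best

W = [[4, 1, 3],
     [2, 0, 5],
     [3, 2, 2]]
print(k_assignment_sequence(W))  # {1: 0, 2: 2, 3: 5}
```

Note how the sequence is monotone in k here; the abstract's algorithm recovers the whole sequence without solving each k from scratch.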
Convex-Concave fitting to successively updated data and its application to covid-19 analysis
Let n measurements of a process be provided sequentially, where the process follows a sigmoid shape but the data have lost sigmoidicity due to measuring errors. If we smooth the data by minimizing the sum of squares of the errors subject to one sign change in the second divided differences, then we obtain a sigmoid approximation. It is known that the optimal fit of this calculation is composed of two separate sections, one best convex and one best concave. We propose a method that starts at the beginning of the data and proceeds systematically to construct the two sections of the fit for the current data, step by step as n is increased. Although the minimization calculation at each step may have many local minima, it can be solved efficiently because of properties of the join between the convex and the concave section. We apply this method to data of daily Covid-19 cases and deaths in Greece, the United States of America and the United Kingdom. These data exhibit substantial differences in the final approximations. Thus, we evaluate the performance of the method in terms of its capabilities both as a constructor of a sigmoid-type approximant to the data and as a trend detector. Our results clarify the optimization calculation both in a systematic manner and to a good extent. At the same time, they reveal some features of the method to be considered in scenarios that may involve predictions, and as a tool to support policy-making. The results also expose some limitations of the method that may be useful to future research on convex-concave data fitting.
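The constraint driving this calculation, at most one sign change in the second divided differences, is easy to state in code. The following sketch (with an illustrative logistic data set; this is not the authors' fitting algorithm) computes second divided differences and counts their sign changes, which is exactly the sigmoidicity test the abstract refers to:

```python
import math

def second_divided_differences(x, y):
    """Second divided differences y[x_i, x_{i+1}, x_{i+2}] of the data."""
    d1 = [(y[i + 1] - y[i]) / (x[i + 1] - x[i]) for i in range(len(x) - 1)]
    return [(d1[i + 1] - d1[i]) / (x[i + 2] - x[i]) for i in range(len(x) - 2)]

def sign_changes(seq):
    """Number of sign changes in a sequence, ignoring exact zeros."""
    signs = [s for s in ((v > 0) - (v < 0) for v in seq) if s != 0]
    return sum(1 for a, b in zip(signs, signs[1:]) if a != b)

# A noiseless logistic sigmoid sampled on a grid: exactly one sign change,
# from positive (convex part) to negative (concave part).
x = [0.5 * i + 0.25 for i in range(-10, 10)]
y = [1 / (1 + math.exp(-t)) for t in x]
print(sign_changes(second_divided_differences(x, y)))  # 1
```

Noisy data would typically show many sign changes; the smoothing calculation restores the single-change structure.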
Industrial structure optimization, economic development factors and regional economic risk prevention in post COVID-19 period: empirical analysis based on panel data of Guangdong regional economy
In the post-COVID-19 period, the regional economic development gap in Guangdong has widened and the associated risk has increased. This is not conducive to building a well-off society in Guangdong in an all-round way. Optimizing the industrial structure, changing the mode of economic growth, preventing regional economic risks, and narrowing the regional economic development gap are the focus of Guangdong governments at all levels. Based on a two-sector economic growth model, this paper uses descriptive statistics and Deng's grey correlation method to analyze economic development panel data for the 21 prefecture-level cities of Guangdong Province from 2011 to 2018, analyzes the characteristics of regional economic development in Guangdong, and empirically studies the effect of industrial structure optimization and economic development factors on economic growth in Guangzhou, Zhongshan, Shantou, Maoming and Meizhou at different stages of development. The results show that there are differences in industrial structure and in the level of economic development across regions; that the rationalization and upgrading of the industrial structure affect economic growth and show different features during different stages; and that, at different stages of economic development, foreign trade and scientific and technological innovation are the main factors affecting regional economic growth. This paper holds that northwest Guangdong and the Pearl River Delta should formulate industrial structure adjustment policies to promote the rationalization of the regional industrial structure, expand foreign trade channels, innovate financing mechanisms, and restrain regional economic risks in combination with regional characteristics.
Under the new development pattern in which domestic and international dual circulation mutually promote each other, Guangdong should further expand the level of opening up, effectively play the role of pilot free trade zones, free trade ports, special economic zones, development zones, bonded areas and other frontier highlands, and improve the level and capacity of foreign trade.
A sustainable supply chain network considering lot sizing with quantity discounts under disruption risks: centralized and decentralized models
This study is among the first to propose a framework for the main parties of a sustainable supply chain network that considers lot-sizing with quantity discounts under disruption risk. The proposed problem differs from most studies of supplier selection and order allocation in this area. First, regarding the concept of the triple bottom line, total cost, environmental emissions, and job opportunities are considered to cover the criteria of sustainability. Second, the application of this supply chain network is transformer production. Third, applying an economic order quantity model gives our model a smart inventory plan to control the uncertainties. Most significantly, we present both centralized and decentralized optimization models for the considered problem. The proposed centralized model addresses the pricing and inventory decisions of a supply chain network with a focus on the supplier selection and order allocation parts; it is formulated as a scenario-based stochastic mixed-integer non-linear program. Our second model focuses on the competition of suppliers based on the price of products with regard to sustainability; for this, a Stackelberg game model is developed. Comparing the two models shows that the sum of the costs over both levels is lower than the cost without the bi-level approach, although the computational time for the bi-level approach exceeds that of the centralized model. In other words, the bi-level model achieves a better solution than the centralized model, at the price of more processing time. To address both optimization models, a hybrid bio-inspired metaheuristic combining the imperialist competitive algorithm (ICA) and particle swarm optimization (PSO) is utilized. The proposed algorithm is compared with its individual components.
All employed optimizers are tuned by the Taguchi method and validated against an exact solver on small instances. Numerical results show striking similarities between the results of the algorithms, but the standard deviations of PSO and ICA-PSO show better behavior. Furthermore, while PSO consumes the least time among the metaheuristics, the proposed hybrid metaheuristic ICA-PSO requires more computation time on all small instances. Finally, the provided results confirm the efficiency and performance of the proposed framework and of the proposed hybrid metaheuristic algorithm.
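For readers unfamiliar with the PSO half of the hybrid, here is a bare-bones particle swarm optimizer for minimization. This is a generic textbook sketch with illustrative parameter values, not the authors' tuned ICA-PSO; the test function (the sphere) is likewise only for demonstration.

```python
import random

def pso(f, dim, bounds, swarm=20, iters=200, w=0.7, c1=1.5, c2=1.5, seed=0):
    """Minimal particle swarm optimization: particles track personal and
    global best positions and move under inertia plus two random pulls."""
    rnd = random.Random(seed)
    lo, hi = bounds
    X = [[rnd.uniform(lo, hi) for _ in range(dim)] for _ in range(swarm)]
    V = [[0.0] * dim for _ in range(swarm)]
    P = [x[:] for x in X]                  # personal best positions
    Pf = [f(x) for x in X]                 # personal best values
    g = min(range(swarm), key=lambda i: Pf[i])
    G, Gf = P[g][:], Pf[g]                 # global best
    for _ in range(iters):
        for i in range(swarm):
            for d in range(dim):
                V[i][d] = (w * V[i][d]
                           + c1 * rnd.random() * (P[i][d] - X[i][d])
                           + c2 * rnd.random() * (G[d] - X[i][d]))
                X[i][d] = min(hi, max(lo, X[i][d] + V[i][d]))
            fx = f(X[i])
            if fx < Pf[i]:
                P[i], Pf[i] = X[i][:], fx
                if fx < Gf:
                    G, Gf = X[i][:], fx
    return G, Gf

sphere = lambda x: sum(v * v for v in x)
best, val = pso(sphere, dim=3, bounds=(-5.0, 5.0))
print(val)
```

A hybrid such as ICA-PSO would interleave this velocity update with ICA's assimilation and competition steps.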
Airline capacity distribution under financial budget and resource consideration
Capacity distribution is a challenging issue for an airline under financial budget and resource considerations. It is a large-scale optimization problem covering both long-term planning and short-term operating arrangements. This study investigates the airline capacity distribution problem with financial budget and resource considerations. It contains the subproblems of financial budget arrangement, fleet introduction, and fleet assignment. Among them, the financial budget is arranged over multiple decision periods, fleet introduction is decided at fixed time points, while fleet assignment is decided at all available time points. To tackle this problem, an integer programming model is formulated to describe it. Then, an integrated algorithm combining a modified Variable Neighborhood Search (VNS) and a Branch-and-Bound (B&B) strategy is developed to find solutions. In detail, a greedy heuristic generates an initial solution for fleet introduction, the modified B&B strategy generates the optimal solution for fleet assignment, and the modified VNS updates the current solution with a better one. In addition, budget limit checks are added for the financial budget arrangements. Finally, the hybrid algorithm is tested for efficiency and stability. It is also compared to variants that replace the modified VNS with basic VNS, differential evolution and a genetic algorithm. Computational results show that our approach performs well in terms of objective value, convergence speed and stability.
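The VNS component follows a simple shake-and-restart loop. The sketch below is a generic variable neighborhood search for minimization, not the modified VNS of the abstract; the toy objective (sum of squares over an integer vector) and the single/double coordinate-step neighborhoods are illustrative assumptions.

```python
import random

def vns(f, x0, neighborhoods, iters=300, seed=0):
    """Bare-bones variable neighborhood search: shake in the k-th
    neighborhood; on improvement move there and restart from k = 0,
    otherwise try the next, larger neighborhood."""
    rnd = random.Random(seed)
    x, fx = x0, f(x0)
    for _ in range(iters):
        k = 0
        while k < len(neighborhoods):
            cand = neighborhoods[k](x, rnd)   # shake
            fc = f(cand)
            if fc < fx:
                x, fx = cand, fc              # move and restart neighborhoods
                k = 0
            else:
                k += 1
    return x, fx

def step(n):
    """Neighborhood: perturb n randomly chosen coordinates by +/-1."""
    def move(x, rnd):
        y = list(x)
        for _ in range(n):
            i = rnd.randrange(len(y))
            y[i] += rnd.choice((-1, 1))
        return y
    return move

f = lambda x: sum(v * v for v in x)
best, val = vns(f, [7, -4, 3], [step(1), step(2)])
print(best, val)
```

In the paper's setting the "shake" would instead perturb fleet-introduction decisions, with B&B re-solving fleet assignment for each candidate.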
Diversified-profit maximization in competitive social advertising
Due to the important role that social networks play in advertising and propaganda, the influence maximization (IM) problem, which aims at finding influential users as seeds to trigger a large online cascade of influence spread, has been a hot research topic over the past two decades. As an extension of the classical IM problem, the profit maximization (PM) problem is inspired by product promotion and focuses on the profit generated by consumers. Given that competitive social advertising is common in the real world, a series of studies has proposed versions of the PM problem. However, the competition arising in the information dissemination of imperfect substitutes and the influence of potential consumers' preferences have been mostly ignored. Besides, to the best of our knowledge, no research has paid attention to the fact that some companies may snatch seeds to limit the profits of their opponents. Therefore, we propose a novel competition-based diversified-profit maximization (CDM) problem and its adaptive version, the adaptive competition-based diversified-profit maximization (ACDM) problem. Different from prior works, these problems take seeds' preferences into consideration and use social welfare to reflect them. The two problems aim at selecting seeds and allocating them to marketers such that the sum of the profit generated by consumers for a particular entity after information dissemination and the social welfare of the seed allocation is maximized. To address these problems, we design an algorithm that combines an online allocation method with the concept of the Shapley value. Experimental results on three real-world data sets demonstrate the effectiveness of our proposed algorithm.
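Since the algorithm builds on the Shapley value, here is an exact Shapley computation by enumerating all player orderings. This is the textbook definition on a toy coalition game, not the authors' (necessarily approximate) use of it for seed allocation; the quadratic worth function is an illustrative assumption.

```python
from itertools import permutations
from math import factorial

def shapley_values(players, value):
    """Exact Shapley values: average each player's marginal contribution
    over all orderings. `value` maps a frozenset coalition to its worth.
    Exponential in the number of players; a didactic sketch only."""
    phi = {p: 0.0 for p in players}
    for order in permutations(players):
        coalition = frozenset()
        for p in order:
            with_p = coalition | {p}
            phi[p] += value(with_p) - value(coalition)
            coalition = with_p
    n_fact = factorial(len(players))
    return {p: v / n_fact for p, v in phi.items()}

# Toy superadditive game: a coalition's worth is the square of its size.
v = lambda S: len(S) ** 2
print(shapley_values(["a", "b", "c"], v))  # {'a': 3.0, 'b': 3.0, 'c': 3.0}
```

Symmetric players receive equal shares, as expected; real seed-allocation games break this symmetry through the network structure.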
Modeling and optimizing an agro-supply chain considering different quality grades and storage systems for fresh products: a Benders decomposition solution approach
This paper proposes a mathematical model in the context of agro-supply chain management, considering specific characteristics of agro-products to assist purchase, storage, and transportation decisions. In addition, a new method for determining the required quality score of different types of products is proposed based on their loss factors and purchasing costs. The model aims to minimize the total cost imposed by purchasing fresh products, opening warehouses, holding inventories, operational activities, and transportation. Two sets of examples, including small and medium-sized problems, are implemented in the General Algebraic Modeling System (GAMS) to evaluate the model. Then, a Benders decomposition (BD) algorithm is applied to tackle the complexity of solving large-sized instances. The results of GAMS and BD are compared in terms of objective function values and computational time to demonstrate the efficiency of the BD algorithm. Finally, the model is applied to a real case study involving an apple supply chain to obtain managerial insights.
A complete algebraic solution to the optimal dynamic rationing policy in the stock-rationing queue with two demand classes
In this paper, we study a stock-rationing queue with two demand classes by means of sensitivity-based optimization, and develop a complete algebraic solution to the optimal dynamic rationing policy. We show that the optimal dynamic rationing policy must be of transformational threshold type. Based on this finding, we refine three sufficient conditions under each of which the optimal dynamic rationing policy is of threshold type (i.e., has a critical rationing level). To do this, we use the performance difference equation to characterize the monotonicity and optimality of the long-run average profit of the system, and thus establish some new structural properties of the optimal dynamic rationing policy by observing any given reference policy. Finally, we use numerical experiments to demonstrate our theoretical results. We believe that the methodology and results developed in this paper can shed light on the study of stock-rationing queues and open up a series of potentially promising research directions.
Analysis of modern circulation industry development level using industrial structure mechanism
This study focuses on China's industrial transformation and urban income inequality. Using an empirical research approach, it is shown that between 2011 and 2020, improvements in China's industrial structure had a significant positive influence on lowering income gaps between urban and rural areas. The mechanism study shows that the urban population affects this causation. Rural-to-urban economic gaps have been reduced through modernisation in different parts of the country. The result remains the same even if the urban-rural consumption gap is used as a proxy for income discrepancy. The mechanism for the industrial structure upgrading model (MISUM) is proposed in this article for the modern circulation industry. Key contributions include: (1) environmental regulations in these components have no impact on each other, but industrial structure upgrading shows a substantial location-specific dependence; (2) environmental standards have impacts on industrial structures across provinces; and (3) environmental standards have a long-term qualifying impact on industrial structures. This article focuses on combining environmental regulation with industrial expansion in different regions, and shows that government environmental requirements for industrial structural improvement are effective. The test results show that the MISUM achieves a high accuracy of 94.2%, a carbon emission level of 18%, a soil emission level of 11% and an efficiency ratio of 97.8% compared to other methods.
A combination of TEXTCNN model and Bayesian classifier for microblog sentiment analysis
More and more individuals are paying attention to research on the emotional information found in microblog comments. TEXTCNN is growing rapidly in the short-text space. However, because the TEXTCNN training model itself is not very extensible or interpretable, it is difficult to quantify and evaluate the relative importance of individual features. At the same time, word embeddings cannot resolve polysemy on their own. This research proposes a microblog sentiment analysis method based on TEXTCNN and a Bayesian classifier that addresses these flaws. First, the word embedding vector is obtained with the word2vec tool, and based on this word vector, an ELMo word vector integrating contextual features and different semantic features is generated by the ELMo model. Second, local features of the ELMo word vector are extracted from multiple angles using the convolution and pooling layers of the TEXTCNN model. Finally, the emotion classification training task is completed with a Bayesian classifier. On the Stanford Sentiment Classification Corpus (SST) data set, the model in this paper is compared with the TEXTCNN, LSTM, and LSTM-TEXTCNN models. The Accuracy, Precision, Recall, and F1-score of our model are 0.9813, 0.9821, 0.9804 and 0.9812, respectively, which are superior to the comparison models; the method can thus be effectively used for accurate emotional analysis and identification of events in microblog sentiment analysis.
Enhanced post-quantum key escrow system for supervised data conflict of interest based on consortium blockchain
Consortium blockchains offer privacy for members while allowing supervision peers access to on-chain data under certain circumstances. However, current key escrow schemes rely on traditional asymmetric encryption/decryption algorithms that are vulnerable to quantum attacks. To address this issue, we have designed and implemented an enhanced post-quantum key escrow system for consortium blockchains. Our system integrates NIST post-quantum public-key encryption/KEM algorithms and various post-quantum cryptographic tools to provide a fine-grained, collusion-proof and privacy-preserving solution that is resistant to a single point of dishonesty. We also offer chaincodes, related APIs, and invocation command lines for development. Finally, we perform a detailed security analysis and performance evaluation, including the time consumed by chaincode execution and the on-chain storage space needed, and we also highlight the security and performance of the related post-quantum KEM algorithms on consortium blockchains.
Target set selection in social networks with tiered influence and activation thresholds
Thanks to the mass adoption of the internet and mobile devices, users of social media can seamlessly and spontaneously connect with their friends, followers and followees. Consequently, social media networks have gradually become the major venue for broadcasting and relaying information, and they exert great influence on many aspects of people's daily lives. Locating influential users in social media has thus become crucially important for the success of many viral marketing, cyber security, politics, and safety-related applications. In this study, we address the problem by solving the tiered influence and activation thresholds target set selection problem, which is to find the seed nodes that can influence the most users within a limited time frame. Both the minimum influential seeds and the maximum influence within budget problems are considered. In addition, this study proposes several models exploiting different requirements on seed node selection, such as maximum activation, early activation and dynamic thresholds. These time-indexed integer programming models suffer from computational difficulties due to the large number of binary variables needed to model influence actions at each time epoch. To address this challenge, this paper designs and leverages several efficient algorithms, i.e., graph partition, node selection, a greedy algorithm, a recursive threshold back algorithm and a two-stage approach, especially for large-scale networks. Computational results show that it is beneficial to apply either the breadth-first-search or depth-first-search greedy algorithm for large instances. In addition, algorithms based on node selection methods perform better on long-tailed networks.
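The diffusion process underlying target set selection can be sketched in a few lines. Below is a minimal discrete-time threshold activation model with a time horizon; the unweighted neighbour-count threshold rule and the toy graph are illustrative assumptions, not the paper's exact (tiered) model.

```python
def activate(adj, thresholds, seeds, horizon):
    """Discrete-time threshold activation: a node becomes active once the
    number of its active neighbours reaches its threshold; the spread runs
    for at most `horizon` rounds (the limited time frame)."""
    active = set(seeds)
    for _ in range(horizon):
        newly = {v for v in adj
                 if v not in active
                 and sum(1 for u in adj[v] if u in active) >= thresholds[v]}
        if not newly:
            break
        active |= newly
    return active

# Toy undirected graph given as adjacency lists, with per-node thresholds.
adj = {1: [2, 3], 2: [1, 3, 4], 3: [1, 2, 4], 4: [2, 3, 5], 5: [4]}
thr = {1: 1, 2: 1, 3: 2, 4: 2, 5: 1}
print(sorted(activate(adj, thr, seeds={1}, horizon=4)))  # [1, 2, 3, 4, 5]
```

Note how the horizon matters: with `horizon=2` the same seed activates only nodes 1, 2 and 3, which is why the time-indexed formulations need a binary variable per node per epoch.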
The Steiner cycle and path cover problem on interval graphs
The Steiner path problem is a common generalization of the Steiner tree and the Hamiltonian path problems, in which we have to decide whether a given graph contains a path visiting a fixed set of terminals. In the Steiner cycle problem we look for a cycle visiting all terminals instead of a path. The Steiner path cover problem is an optimization variant of the Steiner path problem generalizing the path cover problem, in which one has to cover all terminals with a minimum number of paths. We study these problems for the special class of interval graphs. We present linear-time algorithms for both the Steiner path cover problem and the Steiner cycle problem on interval graphs given as endpoint-sorted lists. The main contribution is a lemma showing that backward steps to non-Steiner intervals are never necessary. Furthermore, we show how to integrate this modification into the deferred-query technique of Chang et al. to obtain the linear running times.
A Bayesian analysis based on multivariate stochastic volatility model: evidence from green stocks
Green stocks are shares of companies that are environmentally friendly and protective. We test green stock indexes of the Shanghai Stock Exchange and the China Securities Index as safe havens for global investors. A suitable multivariate SV model and a Bayesian method are used to estimate the spillover effects between different assets across local and global markets. We choose a multivariate volatility model because it can efficiently simulate spillover effects using the MCMC machine-learning method. The results show that the Environmental Protection Index (EPI) of the Shanghai Stock Exchange (SSE) and the China Securities Index (CSI) have no significant volatility spillover from the Shanghai Stock index, the S&P index, gold prices, or the oil futures prices of the USA and China. During the COVID-19 pandemic, we find that the green stock index is a suitable safe haven with low volatility spillover. Green stock indexes have a strong one-way spillover to crude oil futures prices. Environmentally friendly investors can use diverse green assets to build a low-risk investment portfolio in the EPI stock market. The DCGCt-MSV model using the MCMC machine-learning method is accurate and outperforms the others in Bayesian parameter estimation.
The sum of root-leaf distance interdiction problem by upgrading edges/nodes on trees
Network interdiction problems that upgrade critical edges/nodes have important applications in reducing the infectivity of COVID-19. A network of confirmed cases can be described as a rooted tree in which each edge has a weight representing infectious intensity. Upgrading edges (nodes) can reduce the infectious intensity of contacts by taking prevention measures such as disinfection (treating the confirmed cases, isolating their close contacts or vaccinating the uninfected people). We take the sum of root-leaf distances of a rooted tree as the whole infectious intensity of the tree. Hence, we consider the sum of root-leaf distance interdiction problem by upgrading edges/nodes on trees (SDIPT-UE/N). The problem (SDIPT-UE) aims to minimize the sum of root-leaf distances by reducing the weights of some critical edges such that the upgrade cost under some measurement is upper-bounded by a given value. Different from (SDIPT-UE), the problem (SDIPT-UN) upgrades a set of critical nodes to reduce the weights of the edges adjacent to those nodes. The relevant minimum cost problem (MCSDIPT-UE/N) aims to minimize the upgrade cost on the premise that the sum of root-leaf distances is upper-bounded by a given value. We consider different norms to measure the upgrade cost. Under weighted Hamming distance, we show that the problems (SDIPT-UE/N) and (MCSDIPT-UE/N) are NP-hard by establishing their equivalence with the 0-1 knapsack problem. Under a weighted norm, we solve the problems (SDIPT-UE) and (MCSDIPT-UE) by transforming them into continuous knapsack problems. We propose two linear-time greedy algorithms to solve the problem (SDIPT-UE) under unit Hamming distance and the problem (SDIPT-UN) with unit cost, respectively. Furthermore, for the minimum cost problem (MCSDIPT-UE) under unit Hamming distance and the problem (MCSDIPT-UN) with unit cost, we provide two algorithms based on binary search.
Finally, we perform some numerical experiments to compare the results obtained by these algorithms.
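The continuous knapsack subroutine that the abstract reduces to has a simple greedy solution, optimal by the classic density-ordering argument. The sketch below solves the generic continuous knapsack (items as value/weight pairs, fractions allowed); it illustrates only the subroutine, not the tree-specific reduction.

```python
def continuous_knapsack(items, budget):
    """Greedy continuous (fractional) knapsack: take items in decreasing
    order of value density, splitting the last item if the budget runs out.
    `items` is a list of (value, weight) pairs with positive weights."""
    total = 0.0
    for value, weight in sorted(items, key=lambda it: it[0] / it[1],
                                reverse=True):
        if budget <= 0:
            break
        take = min(weight, budget)       # amount of this item we can afford
        total += value * take / weight   # proportional value of the fraction
        budget -= take
    return total

items = [(60, 10), (100, 20), (120, 30)]
print(continuous_knapsack(items, budget=50))  # 240.0
```

Sorting dominates, giving O(n log n) overall; with linear-time selection of the break item this improves to O(n).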
Computing a maximum clique in geometric superclasses of disk graphs
In the 1990s, Clark, Colbourn and Johnson wrote a seminal paper in which they proved that maximum clique can be solved in polynomial time in unit disk graphs. Since then, the complexity of maximum clique in intersection graphs of d-dimensional (unit) balls has been investigated. For ball graphs, the problem is NP-hard, as shown by Bonamy et al. (FOCS '18). They also gave an efficient polynomial-time approximation scheme (EPTAS) for disk graphs. However, the complexity of maximum clique in this setting remains unknown. In this paper, we show the existence of a polynomial time algorithm for a geometric superclass of unit disk graphs. Moreover, we give partial results toward obtaining an EPTAS for intersection graphs of convex pseudo-disks.
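To fix the graph class in question: two disks of radius r intersect exactly when their centers are at distance at most 2r. The brute-force sketch below finds a maximum clique in a unit disk graph given by its centers; it is exponential and only illustrates the definition (the polynomial-time algorithms of the literature are far subtler).

```python
from itertools import combinations
from math import dist

def max_clique_unit_disks(centers, r=1.0):
    """Brute-force maximum clique in a unit disk graph: vertices are disk
    centers, with an edge whenever two centers lie within distance 2r.
    Returns the indices of one maximum clique."""
    n = len(centers)
    edge = lambda i, j: dist(centers[i], centers[j]) <= 2 * r
    for k in range(n, 0, -1):                       # try sizes n, n-1, ...
        for S in combinations(range(n), k):
            if all(edge(i, j) for i, j in combinations(S, 2)):
                return list(S)
    return []

pts = [(0, 0), (1, 0), (0, 1), (5, 5), (5.5, 5)]
print(max_clique_unit_disks(pts))  # [0, 1, 2]
```

`math.dist` requires Python 3.8+; for earlier versions, compare squared distances directly.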
An integrated method for hybrid distribution with estimation of demand matching degree
Timely and effective distribution of relief materials is one of the most important aspects of fighting a natural or man-made disaster. Due to the sudden and urgent nature of most disasters, it is hard to predict the demand information exactly. Meanwhile, timely delivery is also a problem. In this paper, taking the COVID-19 epidemic as an example, we propose an integrated method that handles both demand estimation and relief material distribution. We assume the relief supply is directed by the government, so it is possible to arrange for experts to evaluate the situation from multiple aspects and to coordinate supplies from different sources. The first part of the integrated method is a fuzzy decision-making process: the demand degrees for relief materials are estimated by extending COPRAS under an interval 2-tuple linguistic environment. The second part takes the demand degrees as one of its inputs and uses a hybrid distribution model to decide allocation and routing. The key point of hybrid distribution is that each demand point can be visited by different vehicles and each vehicle can visit different demand points. Our method can also be extended to cover both relief materials and medical staff. A real-life case study of Wuhan, China is provided to illustrate the presented method.
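For orientation, here is the classical crisp COPRAS ranking that the paper extends to the interval 2-tuple linguistic setting. The decision matrix, weights, and benefit/cost labels below are illustrative assumptions; this sketch is the plain-number baseline, not the authors' fuzzy extension.

```python
def copras(matrix, weights, benefit):
    """Classical COPRAS: weight-normalize each criterion column, split each
    alternative's score into benefit (S+) and cost (S-) parts, and combine
    them into the relative significance Q_i = S+_i + (sum S-) / (S-_i * sum 1/S-_j).
    Higher Q means a better alternative."""
    col_sums = [sum(col) for col in zip(*matrix)]
    norm = [[w * v / s for v, w, s in zip(row, weights, col_sums)]
            for row in matrix]
    Sp = [sum(v for v, b in zip(row, benefit) if b) for row in norm]
    Sm = [sum(v for v, b in zip(row, benefit) if not b) for row in norm]
    total_Sm = sum(Sm)
    inv_sum = sum(1 / s for s in Sm)
    return [p + total_Sm / (m * inv_sum) for p, m in zip(Sp, Sm)]

# Two alternatives, one benefit criterion and one cost criterion.
Q = copras([[3, 2], [2, 1]], weights=[0.5, 0.5], benefit=[True, False])
print([round(q, 3) for q in Q])  # [0.467, 0.533]
```

Alternative 2 wins here because its lower cost-criterion value outweighs its lower benefit score.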
Performance prediction and optimization for healthcare enterprises in the context of the COVID-19 pandemic: an intelligent DEA-SVM model
The coronavirus disease (COVID-19) pandemic has caused significant changes in the external environment of enterprises, resulting in tremendous negative impacts. The resulting irregular fluctuation of business data poses a critical challenge to traditional approaches. Therefore, to combat the effects of the COVID-19 pandemic, an effective model is required to proactively predict an enterprise's performance and simultaneously generate scientific performance optimization solutions. Consequently, at the intersection of artificial intelligence algorithms, operations research, and management science, an intelligent DEA-SVM model is developed in this study as a theoretical contribution. The capabilities of this model are verified through extensive numerical experiments. On the one hand, the model outperforms traditional algorithms in prediction accuracy. On the other hand, effective performance optimization solutions for low-performance enterprises are obtained from the input-output perspective. Moreover, the application value of this model is reflected in its successful implementation in the healthcare industry. Thus, it is a user-friendly tool for realizing the stable operation of enterprises in the context of the COVID-19 pandemic.
An integrated model for medical expense system optimization during diagnosis process based on artificial intelligence algorithm
In the era of artificial intelligence, the healthcare industry is undergoing tremendous innovation and development based on sophisticated AI algorithms. Focusing on the diagnosis process and a target disease, this study proposes an integrated model to optimize the traditional medical expense system and ultimately help medical staff and patients make more reliable decisions. From the new perspective of total expense estimation and detailed expense analysis, the proposed model consists of two intelligent modules, an SVM-based module and a SOM-based module, as a theoretical contribution. A rigorous comparative analysis with two classic AI techniques, back-propagation neural networks and random forests, demonstrates that the SVM-based module achieves better total expense estimation. Meanwhile, by designing a two-stage clustering process, the SOM-based module effectively generates decision clusters and their corresponding cluster centers, which clarify the complex relationship between detailed expenses and patient information. As a practical contribution, the proposed model was applied to the diagnosis process of coronary heart disease. Real data from a hospital in Shanghai were collected, and the validity and accuracy of the proposed model were verified with rigorous experiments. The proposed model optimizes the traditional medical expense system and intelligently generates reliable decision-making information for both total and detailed expenses. The successful application to the target disease further indicates that this model is a user-friendly tool for medical expense control and therapeutic regimen strategy.
Agent-constrained truthful facility location games
We consider a truthful facility location game in which there is a set of agents with private locations on the real line, and the goal is to place a number of facilities at different locations chosen from the set of those reported by the agents. Given a feasible solution, each agent suffers an individual cost that is either its total distance to all facilities (sum-variant) or its distance to the farthest facility (max-variant). For both variants, we show tight bounds on the approximation ratio of strategyproof mechanisms in terms of the social cost, i.e., the total individual cost of the agents.
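As background, the simplest strategyproof mechanism in this family places a single facility at the median of the reported locations; for one facility with sum-of-distances costs this is the classic optimal and truthful rule, and the median is always a reported location, so it also satisfies the agent-constraint of the abstract. This sketch covers only that baseline case, not the multi-facility setting studied in the paper.

```python
def median_mechanism(reported):
    """Place one facility at the (left) median of the reported locations.
    Strategyproof: no single agent can move the median toward itself by
    misreporting, and the median minimizes the sum of distances."""
    pts = sorted(reported)
    return pts[(len(pts) - 1) // 2]

def social_cost(facility, locations):
    """Sum-variant social cost: total distance of all agents to the facility."""
    return sum(abs(x - facility) for x in locations)

locs = [0.0, 2.0, 3.0, 10.0]
f = median_mechanism(locs)
print(f, social_cost(f, locs))  # 2.0 11.0
```

Misreporting does not help: the agent at 10.0 reporting 100.0 instead leaves the left median, and hence its own cost, unchanged.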
