TECHNICAL PROGRAM

Wednesday, 9:00-10:30

WA-01
Wednesday, 9:00-10:30 - Fo1
Opening Ceremony
Stream: Invited Presentations and Ceremonies
Plenary session
Chair: Marco Lübbecke

1 - Sports Scheduling meets Business Analytics
Michael Trick

Faster computers and algorithms have transformed how sports schedules have been created in practice in a wide range of sports. Techniques such as Combinatorial Benders Decomposition, Large Scale Neighborhood Search, and Branch-and-Price have greatly increased the range of sports leagues that can use operations research methods to create their schedules. With this increase in computational and algorithmic power comes the opportunity to create not just playable schedules but more profitable schedules. Using data mining and other predictive analytics techniques, it is possible to model attendance and other revenue effects of the schedule. Combining these models with advanced schedule creation approaches leads to schedules that can generate more revenue for teams and leagues. These concepts are illustrated with experiences in professional and college sports leagues.

Wednesday, 10:50-12:20

WB-02
Wednesday, 10:50-12:20 - Fo2
Mixed Integer Linear Programming
Stream: Discrete and Combinatorial Optimization, Graphs and Networks
Invited session
Chair: Alexander Martin

1 - Error bounds for mixed integer linear optimization problems
Oliver Stein

We introduce computable a-priori and a-posteriori error bounds for optimality and feasibility of a point generated as the rounding of an optimal point of the LP relaxation of a mixed integer linear optimization problem. Treating the mesh size of integer vectors as a parameter allows us to study the effect of different 'granularities' in the discrete variables on the error bounds. Our analysis is mainly based on the construction of a so-called grid relaxation retract. Relations to proximity results and the integer rounding property are highlighted.
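As a brief illustration of the kind of a-posteriori bound this abstract refers to (a generic sketch in my own notation, not the authors' specific construction): for a minimization problem min { c^T x : Ax >= b, x integer } with LP relaxation value v_LP, any rounded point x_hat that happens to be feasible satisfies

\[ 0 \;\le\; c^{\top}\hat{x} - v_{\mathrm{MILP}} \;\le\; c^{\top}\hat{x} - v_{\mathrm{LP}}, \]

so the computable quantity c^T x_hat - v_LP already bounds the optimality error of the rounded point; the talk's contribution concerns bounds that also cover feasibility and their dependence on the mesh size (granularity) of the integer variables.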

2 - Separation of Generic Cutting Planes in Branch-and-Price Using a Basis
Jonas Timon Witt, Marco Lübbecke

When reformulating a given mixed integer program by the use of classical Dantzig-Wolfe decomposition, a subset of the constraints is partially convexified, which corresponds to implicitly adding all valid inequalities for the associated integer hull. Since these inequalities are not known, a solution of the original linear programming (LP) relaxation which is obtained by transferring an optimal basic solution of the reformulated LP relaxation is in general not basic. In order to obtain an optimal basic solution we would have to explicitly add valid inequalities for the integer hull associated with the partially convexified constraints such that the considered solution becomes basic. Hence, cutting planes which are separated using a basis, like Gomory mixed integer cuts or strong Chvátal-Gomory cuts, are usually not directly applicable when separating such a solution in the original problem. Nevertheless, we can use some crossover method in order to obtain a basic solution which is nearby the considered non-basic solution and separate this auxiliary solution by applying all separators, including those using a basis. The generated cutting planes might not only cut off the auxiliary solution, but also the solution we originally wanted to separate. So far, this problem was only considered extensively by Range, who proposed the previously described approach including a particular crossover method to find such a nearby basic solution. We present a modified crossover method and extend this procedure by considering additional valid inequalities strengthening the original LP relaxation. Furthermore, we provide the first full implementation of a separator of this kind and test it on instances of several problem classes.

3 - A Lagrangian Relaxation Algorithm for the Modularity Maximization Problem
Kotohumi Inaba, Yoichi Izunaga, Yoshitsugu Yamamoto

The modularity proposed by Newman and Girvan is one of the most common measures when the nodes of a graph are grouped into communities consisting of tightly connected nodes. Due to the NP-hardness of the problem, few exact algorithms have been proposed. Aloise et al. formulated the problem as a set partitioning problem, which has to take into account all, exponentially many, nonempty subsets of the node set, which makes it difficult to secure the computational resources when the number of nodes is large. Their algorithm is based on the linear programming relaxation, LP relaxation for short, and uses the column generation technique. Although it provides a tight upper bound on the optimal value, it can suffer from high degeneracy due to the set partitioning constraints. In this study, we propose an algorithm based on the Lagrangian relaxation. We relax the set partitioning constraints and add them to the objective function as a penalty with Lagrangian multipliers, and obtain the Lagrangian relaxation problem with only the binary variable constraints. For a given Lagrangian multiplier vector, an optimal solution of the Lagrangian relaxation problem can be obtained by checking the sign of the coefficients, but it is hard to compute all the coefficients of the variables. Then we propose to use the column generation technique in order to alleviate the computational burden. Namely, we start the algorithm with a small number of variables and gradually add variables as the computation goes on. We also propose some methods to accelerate the convergence.
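To make the "sign check" concrete, here is a generic sketch (my notation, not necessarily the authors' exact formulation) of the Lagrangian relaxation of a set partitioning model: with a binary variable z_S for every nonempty subset S of the node set V, objective coefficient c_S (the modularity contribution of community S) and multipliers lambda_i for the partitioning constraints sum_{S: i in S} z_S = 1,

\[ L(\lambda) \;=\; \sum_{i\in V}\lambda_i \;+\; \max_{z_S\in\{0,1\}} \; \sum_{S\subseteq V}\Big(c_S-\sum_{i\in S}\lambda_i\Big)\, z_S , \]

which decomposes over the subsets: an optimal z sets z_S = 1 exactly when the coefficient c_S - sum_{i in S} lambda_i is positive. Enumerating all these coefficients is impractical, which is where the column generation described above comes in.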

WB-03
Wednesday, 10:50-12:20 - Fo3
Computational Social Choice
Stream: Algorithmic Game Theory
Invited session
Chair: Stephan Westphal

1 - On the Discriminative Power of Tournament Solutions
Hans Georg Seedig, Felix Brandt

Tournament solutions constitute an important class of social choice functions that only depend on the pairwise majority comparisons between alternatives. Recent analytical results have shown that several concepts with appealing axiomatic properties such as the Banks set or the minimal covering set tend to not discriminate at all when the tournaments are chosen from the uniform distribution. This is in sharp contrast to empirical studies which have found that real-world preference profiles often exhibit Condorcet winners, i.e. alternatives that all tournament solutions select as the unique winner. In this work, we aim to fill the gap between these extremes by examining the distribution of the number of alternatives returned by common tournament solutions for empirical data as well as data generated according to stochastic preference models such as impartial culture, impartial anonymous culture, Mallows mixtures, spatial models, and Polya-Eggenberger urn models.

2 - Complexity of Strong Implementation of Social Choice Functions in Dominant Strategies
Sven Krumke, Clemens Thielen

We consider the classical mechanism design problem of strongly implementing social choice functions in a setting where monetary transfers are allowed. In contrast to weak implementation, where only one equilibrium of a mechanism needs to yield the desired outcomes given by the social choice function, strong implementation (also known as full implementation) means that a mechanism is sought in which all equilibria yield the desired outcomes. For strong implementation, one cannot restrict attention to incentive compatible direct revelation mechanisms via the Revelation Principle, so the question whether a given social choice function is strongly implementable cannot be answered as easily as for weak implementation. When considering Bayes Nash equilibria, the Augmented Revelation Principle states that it suffices to consider mechanisms in which the set of types of each agent is a subset of the set of her possible bids. Moreover, given some additional data, such a mechanism can be constructed by an iterative procedure via selective elimination of undesired equilibria in finitely (but possibly exponentially) many steps. For dominant strategies as the equilibrium concept, however, no such results have been known so far. We close this gap by showing a variant of the Augmented Revelation Principle for dominant strategies and a selective elimination procedure for constructing the desired mechanisms in polynomially many steps. Using these results, we then show that strong implementability in dominant strategies can be decided in nondeterministic polynomial time.

3 - A Combinatorial Algorithm for Strong Implementation of Social Choice Functions
Clemens Thielen, Stephan Westphal

We consider algorithmic aspects of the classical mechanism design problem of implementing social choice functions. We show how an adaption of the well-known negative cycle criterion for weak implementability can be used to decide the question of implementability in the strong sense when one restricts to incentive compatible direct revelation mechanisms. We derive an efficient combinatorial algorithm that computes the payments of an incentive compatible direct revelation mechanism that strongly implements a given social choice function in dominant strategies or decides that none exists. Our result complements the results obtained in the companion paper of Krumke and Thielen, where a nondeterministic polynomial time algorithm for the more general problem of deciding strong implementability via indirect mechanisms is given. This more general problem is expected to be NP-complete.

WB-04
Wednesday, 10:50-12:20 - Fo4
Robust knapsack problems
Stream: Robust and Stochastic Optimization
Invited session
Chair: Marc Goerigk

1 - Packing a Knapsack of Unknown Capacity
Yann Disser, Max Klimm, Nicole Megow, Sebastian Stiller

We study the problem of packing a knapsack without knowing its capacity. Whenever we attempt to pack an item that does not fit, the item is discarded; if the item fits, we have to include it in the packing. We show that there is always a policy that packs a value within factor 2 of the optimum packing, irrespective of the actual capacity. If all items have unit density, we achieve a factor equal to the golden ratio 1.618. Both factors are shown to be best possible. In fact, we obtain the above factors using packing policies that are universal in the sense that they fix a particular order of the items and try to pack the items in this order, independent of the observations made while packing. We give efficient algorithms computing these policies. On the other hand, we show that, for any alpha > 1, the problem of deciding whether a given universal policy achieves a factor of alpha is coNP-complete. If alpha is part of the input, the same problem is shown to be coNP-complete for items with unit densities. Finally, we show that it is coNP-hard to decide, for given alpha, whether a set of items admits a universal policy with factor alpha, even if all items have unit densities.

2 - Algorithms for the Recoverable Robust Knapsack Problem
Christina Büsing, Sebastian Goderbauer, Arie Koster, Manuel Kutschka

In this talk we present a recoverable robust knapsack problem, where the uncertainty of the item weights follows the approach of Bertsimas and Sim. In contrast to the classical robust setting, a limited recovery action is allowed, i.e., up to k items may be removed when the actual weights are known. We will introduce several algorithms based on a compact integer linear programming formulation, different robustness cuts and robust extended cover inequalities, and compare their run-time w.r.t. the recovery action and the scenario set.

3 - The Robust Knapsack Problem with Queries
Marc Goerigk, Manoj Gupta, Jonas Ide, Anita Schöbel, Sandeep Sen

In this talk, we consider knapsack problems with uncertain item weights. We are allowed to query an item to find its exact weight, where the number of such queries is bounded by a given parameter. After these queries are made, we need to pack the items robustly, i.e., so that the choice of items is feasible for every remaining possible scenario of item weights.
The central question that we consider is: Which items should be queried in order to gain maximum profit? We introduce the notion of query competitiveness to evaluate the quality of an algorithm for this problem, and obtain lower and upper bounds on this competitiveness for interval-based uncertainty. Similar to the study of online algorithms, we study the competitiveness under different frameworks, namely we analyze the worst-case query competitiveness for deterministic algorithms, and the expected query competitiveness for randomized algorithms.
We also extend this approach to Gamma-restricted uncertainties as introduced by Bertsimas and Sim. Furthermore, we present heuristic algorithms for the problem. In computational experiments considering both the interval-based and the Gamma-restricted uncertainty, we evaluate their empirical performance. While the usage of a Gamma-restricted uncertainty improves the nominal performance of a solution (as expected), we find that the query competitiveness gets worse.
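As a small illustration of the robust packing step described above (a sketch under my own simplifying assumptions and with made-up data, not the authors' algorithm): with interval uncertainty, a packing that must be feasible for every remaining scenario has to budget the upper bound of every unqueried item and the revealed weight of every queried one.

    # Sketch: packing after a given set of queries (illustration only, hypothetical setup).
    # Each item has a weight interval [lo, hi] and a profit; querying an item reveals its true weight.

    def budgeted_weight(i, items, queried, revealed):
        # Weight to budget so the packing stays feasible in every remaining scenario:
        # the revealed weight if i was queried, otherwise its worst-case upper bound.
        return revealed[i] if i in queried else items[i]["hi"]

    def greedy_robust_pack(items, capacity, queried, revealed):
        # Simple profit-per-budgeted-weight greedy; a heuristic, not optimal in general.
        order = sorted(items,
                       key=lambda i: items[i]["profit"] / budgeted_weight(i, items, queried, revealed),
                       reverse=True)
        chosen, used = [], 0.0
        for i in order:
            w = budgeted_weight(i, items, queried, revealed)
            if used + w <= capacity:
                chosen.append(i)
                used += w
        return chosen

    items = {"a": {"lo": 2, "hi": 5, "profit": 8},
             "b": {"lo": 1, "hi": 4, "profit": 5},
             "c": {"lo": 3, "hi": 3, "profit": 4}}
    revealed = {"a": 3}                       # only queried items are revealed
    print(greedy_robust_pack(items, capacity=7, queried={"a"}, revealed=revealed))

The interesting question studied in the talk is the one this sketch takes as given, namely which items to query in the first place.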


WB-05
Wednesday, 10:50-12:20 - Fo5
Polyhedra
Stream: Discrete and Combinatorial Optimization, Graphs and Networks
Invited session
Chair: Markus Leitner

1 - Facets of the axial Three-Index Assignment Polytope
Trivikram Dokka, Frits Spieksma

Given three sets, each with cardinality of n elements, and a cost function defined on triples with one element each from these sets, the axial 3-index assignment problem asks for a set of n disjoint triples such that the total cost is minimised. It is well-known that this problem is strongly NP-hard. In this work we extend the study of the facial structure of the axial 3-index assignment polytope (3AP). In particular, we answer a question asked in Qi and Sun (2000) by giving a new class of facet-defining inequalities with right-hand side 2, called wall inequalities. We obtain our results using a geometric interpretation of a valid inequality of 3AP by relating it to vertices and axes of a three-dimensional cube. We also give a polynomial time separation algorithm for the wall inequalities.

2 - A Configuration Model for the Line Planning Problem
Heide Hoppmann, Ralf Borndörfer, Marika Karbstein

We present a novel extended formulation for the line planning problem that is based on what we call "configurations" of lines and frequencies. Configurations are combinatorial building blocks of primal solutions; they rule out the "capacity numerics" and make the problem purely combinatorial. The configuration model is strong in the sense that it implies several facet-defining inequalities for the standard model: set cover, symmetric band and MIR inequalities, and multicover inequalities. These theoretical findings can be confirmed in computations; however, the enormous number of configurations can blow up the formulation for large instances. We propose a mixed model that enriches the standard model by a judiciously chosen subset of configurations that provide a good compromise between model strength and size. Computational results for large-scale line planning problems are presented.

3 - A polyhedral study of the diameter constrained minimum spanning tree problem
Markus Leitner, Luís Gouveia, Ivana Ljubic

We consider the diameter constrained minimum spanning tree problem (DMSTP) on a graph. Given an edge-weighted undirected graph, the objective is to find a minimum-weight spanning tree such that the number of edges on the path between any two nodes does not exceed a given diameter D. Several integer programming models as well as exact and heuristic solution approaches for the DMSTP have been discussed in the literature. The current state-of-the-art approach has been proposed by Gouveia, Simonetti and Uchoa in 2011, who reformulated the DMSTP as a Steiner tree problem on a layered graph. The authors showed that the layered graph approach outperformed all previous integer programming based approaches both in theory and practice. Surprisingly, not much is known, however, with respect to the polyhedral structure of the DMSTP. In this work, we aim to close this gap by studying formulations in the natural space of variables, i.e., in the space of undirected edge design variables. We introduce new classes of facet-defining inequalities that are based on so-called jump inequalities. Finally, some results from computational experiments are given.

WB-06
Wednesday, 10:50-12:20 - Fo6
Recent Developments in Stochastic Programming
Stream: Robust and Stochastic Optimization
Invited session
Chair: Tobias Wollenberg

1 - A convex approximation for two-stage mixed-integer recourse models
Ward Romeijnders, Rüdiger Schultz, Maarten H. van der Vlerk, Wim Klein Haneveld

We derive a convex approximation for two-stage mixed-integer recourse models and we show that the error of this approximation vanishes as all total variations of the probability density functions of the random variables in the model decrease to zero. To prove this result we use asymptotic periodicity of the mixed-integer value function and error bounds on the expectation of periodic functions.

2 - A representation of a class of stochastic dominance constraints enabling Lipschitzian properties and stability
Matthias Claus, Rüdiger Schultz

Introducing stochastic dominance constraints when handling risk aversion in linear programming under stochastic uncertainty leads to optimization problems with uncountably many chance constraints. Metric regularity of the constraint function is the key to stability of the optimal solution sets subject to perturbations of the underlying probability measure. The talk is on identifying verifiable sufficient conditions for metric regularity via a local linear growth condition.

3 - Two-stage Stochastic Semidefinite Programming for Unit Commitment Under Uncertainty with AC Power Flow Constraints
Tobias Wollenberg, Rüdiger Schultz

This talk addresses unit commitment under uncertainty of load and power infeed from renewables in alternating current (AC) power systems. Besides traditional unit-commitment constraints, the physics of power flow is included. To obtain globally optimal solutions a recent semidefinite programming approach is used, which leads us to risk averse two-stage stochastic mixed integer semidefinite programs whose structure is analyzed, and for which a decomposition algorithm is presented.
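For readers less familiar with the session's terminology, the generic shape of a two-stage stochastic program with recourse (a textbook sketch, not the specific mixed-integer or semidefinite models of the talks above) is

\[ \min_{x \in X} \; c^{\top}x + \mathbb{E}_{\xi}\big[\,Q(x,\xi)\,\big], \qquad Q(x,\xi) \;=\; \min_{y}\ \{\, q(\xi)^{\top} y \;:\; W y \ge h(\xi) - T(\xi)\,x \,\}, \]

where the first-stage decision x is taken before the random data xi is revealed and the recourse decision y afterwards; the talks add integrality of the recourse (talk 1), stochastic dominance constraints (talk 2), and risk-averse, semidefinite second-stage structure (talk 3) to this basic template.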
WB-07
Wednesday, 10:50-12:20 - Fo7
Branch-and-Price/Branch-and-Cut
Stream: Discrete and Combinatorial Optimization, Graphs and Networks
Invited session
Chair: Andreas Bley

1 - A Branch-and-Price Algorithm for Minimizing the Border in Bordered Block-Diagonal Matrices
Michael Bastubbe, Martin Bergner, Alberto Ceselli, Marco Lübbecke

In our talk, we consider the optimization problem of re-arranging the rows and columns of a matrix (especially matrices arising from mixed integer programs) into singly-bordered block-diagonal form for a given number of blocks such that the total number of border rows is minimized.
A singly-bordered block-diagonal form is beneficial not only because it can be exploited for computing an LU- and QR-factorization, but also when solving mixed integer programs, e.g. by applying Dantzig-Wolfe decomposition.
In the literature numerous heuristic approaches have been proposed, but only one exact algorithm.
We present a new IP-formulation that is solved by a branch-and-price algorithm. In this formulation every binary variable corresponds to a subset of rows forming one particular block.
The pricing problem can be solved by several heuristics (e.g. hill climbing and simulated annealing). Moreover, we introduce two exact approaches: the first one is by integer programming and the second is based on Lagrangean relaxation exploiting the combinatorial structure of the pricing problem such that the Lagrangean subproblem can be solved efficiently by computing a minimum s-t cut.


Furthermore, we suggest a branching scheme working on (aggregated) sums of variables whose branching decisions are respected in easily adapted pricing algorithms. Moreover, we introduce a (randomized) primal heuristic that uses fractional solutions to provide good integer solutions even in the root node.
In our computational study we will examine the impact of the above techniques and compare the performance with the existing exact algorithm. Preliminary results suggest that our approach particularly performs well for a higher (>4) number of blocks and thus complements the existing approach.

2 - Methods for time-dependent combined network design and routing optimization
Dimitri Papadimitriou, Bernard Fortz

The Fixed Charge Network Design Problem addresses the problem of simultaneous design and routing, where a fixed cost is paid for opening a link and a linear routing cost is paid for sending traffic flow on a link. The routing decision must be performed such that flows remain bounded by the installed capacities. This problem appears as a particular case of the combined network design and traffic flow routing problem with time-dependent demands developed in this paper. This general problem can be formulated as a multi-period mixed integer optimization problem. A compact formulation based on the aggregation of flows by destination shows that its resolution on realistic instances becomes intractable and unscalable with state-of-the-art solvers due to the weak linear programming bound provided by the formulation. An extended formulation, where flows are decomposed by origin-destination pairs while keeping the requirement of destination-based routing, provides much better linear programming lower bounds. However, as its resolution still suffers from its huge size, solving the linear relaxation becomes intractable as the network size increases. In this paper, we explore different decomposition techniques to overcome this limit. One of them consists in projecting the extended formulation on the space of variables of the base formulation, leading to a Benders decomposition that can be embedded in a branch-and-cut method, yielding a candidate for an efficient centralized solving procedure. Moving to a decentralized procedure requires designing mechanisms to obtain distributed versions of the master problem solved at each node and to exchange information from the subproblems of tractable size by involving only local decisions in the distributed versions of the master problem.

3 - A branch-and-price algorithm for the chromatic scheduling problem
Andreas Bley

We address the chromatic scheduling (aka interval coloring) problem, which arises in the planning of flexgrid fiber optic communication networks, for example. Given a set of fixed paths with integer demands, the task is to assign to each path an interval of length equal to the path's demand such that intervals that correspond to paths sharing an edge are disjoint. This corresponds to the task of 'coloring' the nodes of an associated conflict graph with intervals such that the intervals of neighboring nodes do not overlap. The objective considered in this talk is to minimize the span needed to accommodate all intervals.
We present an exact algorithm based on branch-and-price that works in two stages. In the first stage, we solve an ILP formulation of a multi-coloring relaxation and employ several heuristics to quickly generate good solutions and bounds. If this stage fails to prove optimality, we run a branch-and-price algorithm based on an exact ILP formulation for interval coloring in the second stage. This two stage approach proves to be computationally efficient in our experiments, solving realistic instances with up to 1000 paths within seconds.

WB-08
Wednesday, 10:50-12:20 - Fo8
Cooperative Games
Stream: Algorithmic Game Theory
Invited session
Chair: Igor Kozeletskyi

1 - Designing Profit Shares in Matching and Coalition Formation Games
Martin Hoefer, Lisa Wagner

Matching and coalition formation are fundamental problems in a variety of scenarios where agents join efforts to perform tasks, such as, e.g., in scientific publishing. To allocate credit or profit stemming from a joint project, different communities use different crediting schemes in practice. A natural and widely used approach to profit distribution is equal sharing, where every member receives the same credit for a joint work. This scheme captures a natural egalitarian fairness condition when each member of a coalition is critical for success. Unfortunately, when coalitions are formed by rational agents, equal sharing can lead to high inefficiency of the resulting stable states. In this paper, we study the impact of changing profit sharing schemes in order to obtain good stable states in matching and coalition formation games. We generalize equal sharing to sharing schemes where for each coalition each player is guaranteed to receive at least an alpha-share. This way the coalition formation can stabilize on more efficient outcomes. In particular, we show a direct trade-off between efficiency and equal treatment.
If k denotes the size of the largest possible coalition, we prove an asymptotically tight bound of k²·alpha on the prices of anarchy and stability. This result extends to polynomial-time algorithms to compute good sharing schemes. Further, we show improved results for a novel class of matching problems that covers many well-studied cases, including two-sided matching and instances with integrality gap 1.

2 - Shapley meets Shapley
Bart de Keijzer

This talk is about computing the Shapley value in matching games. Matching games constitute a fundamental class of cooperative games which help understand and model auctions and assignments. In a matching game, the value of a coalition of vertices is the weight of the maximum size matching in the subgraph induced by the coalition. The Shapley value is one of the most important solution concepts in cooperative game theory. After establishing some general insights, we show that the Shapley value of matching games can be computed in polynomial time for some special cases: graphs with maximum degree two, and graphs that have a small modular decomposition into cliques or cocliques (complete k-partite graphs are a notable special case of this). The latter result extends to various other well-known classes of graph-based cooperative games. We continue by showing that computing the Shapley value of unweighted matching games is #P-complete in general. Finally, a fully polynomial-time randomized approximation scheme (FPRAS) is presented. This FPRAS can be considered the best positive result conceivable, in view of the #P-completeness result.
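As background for this abstract (standard definitions only, not results from the talk): for a cooperative game (N, v), the Shapley value of player i is

\[ \phi_i(v) \;=\; \sum_{S \subseteq N\setminus\{i\}} \frac{|S|!\,\big(|N|-|S|-1\big)!}{|N|!}\,\big(v(S\cup\{i\})-v(S)\big), \]

and in a matching game on a graph G = (V, E) the coalition value v(S) is the weight of a maximum matching in the subgraph induced by S. The sum ranges over exponentially many coalitions, which is why polynomial-time computation is only available for the special graph classes mentioned above and why the #P-hardness and FPRAS results matter.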
3 - An NTU-Based Approach for Allocation Problems in Cooperative Planning defined as a Multi-Objective Optimization Problem
Igor Kozeletskyi, Alf Kimms

In this presentation a case of horizontal cooperation with transferable and non-transferable utility is considered. As an example, we consider a cooperative traveling salesman problem, where besides the minimization of total costs every player aims to maximize his own utility from assigned orders. The allocation problem is defined as a generalized NTU game with transferable and non-transferable utility. The characteristic set of this game results from the solution of a multi-objective optimization problem. For the determination of the allocation of costs and utilities we present a game-theoretic approach using the NTU core concept. For the computation of NTU core elements an algorithm was developed. The results of the computational study show the performance of the developed algorithm.

WB-09
Wednesday, 10:50-12:20 - SFo1
Variational Inequalities and Related Topics I
Stream: Continuous and Non-linear Optimization
Invited session
Chair: Sonja Steffensen

1 - A Reformulation of Mathematical Programs with Cardinality Constraints using a Complementarity-type Condition
Alexandra Schwartz, Christian Kanzow, Oleg Burdakov


Mathematical programs with cardinality constraints are constrained optimization problems where only a given number of the variables is allowed to be nonzero. We consider a reformulation of the cardinality constraint using binary variables, more precisely a relaxation of this reformulation, which leads to a mathematical program in continuous variables with complementarity-type constraints. In this talk, we discuss the relation between the local and global solutions of the original and the relaxed problem. Additionally, we analyze the theoretical properties of the relaxed problem, which differ from those known for general mathematical programs with complementarity constraints. Finally, we suggest a regularization method for the solution of the relaxed problem and present some preliminary numerical results.

2 - Optimality conditions for optimization problems with cardinality constraints
Michal Cervinka, Alexandra Schwartz, Christian Kanzow

We consider an NLP formulation of the optimization problems with cardinality constraints which arise, e.g., in sparse portfolio selection. Similarly to mathematical programs with complementarity constraints, we introduce concepts of S- and M-stationarity. We show that both are optimality conditions for cardinality constrained problems under certain problem-tailored constraint qualifications. For a parameterized version of the problem we analyze qualitative stability of solutions to the respective M-stationarity conditions.

3 - Newton-type methods for Differential Variational Inequalities
Sonja Steffensen

We study a semismooth Newton method for differential variational inequalities (DVIs). Such problems comprise the solution of an ODE and a variational inequality (VI) and have various applications, e.g. as differential games in economic sciences. The method we propose is based on a suitable time discretization scheme of the underlying ODE and a reformulation of the resulting finite dimensional problem as a system of nonlinear, nonsmooth equations. We will theoretically analyze the resulting method and finish with some numerical results.
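A standard building block behind such reformulations (a textbook identity, not necessarily the specific scheme of the talk): the variational inequality VI(K, F), i.e. find x in K with F(x)^T (y - x) >= 0 for all y in K, is equivalent to the nonsmooth fixed-point equation

\[ x \;=\; \Pi_K\big(x - F(x)\big) \quad\Longleftrightarrow\quad r(x) := x - \Pi_K\big(x - F(x)\big) = 0, \]

where Pi_K denotes the Euclidean projection onto K. The natural residual r is typically nonsmooth but semismooth for many sets K, which is what makes semismooth Newton methods applicable after time discretization of the ODE part.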

WB-10
Wednesday, 10:50-12:20 - SFo2
Sustainable Transport
Stream: Energy and Environment
Invited session
Chair: Karsten Kieckhäfer

1 - Simulation-based analysis of market introduction strategies for alternative powertrain technologies in long-range passenger cars
Christian Thies, Karsten Kieckhäfer, Thomas Spengler

Alternative powertrain technologies have the potential to reduce local GHG-emissions of passenger cars significantly. Critical for the customer adoption of these technologies are not only competitive prices but also similar or better technical characteristics compared to conventional vehicles with internal combustion engines (ICE). Regarding the operating range, plug-in hybrid electric vehicles (PHEV) and fuel cell electric vehicles (FCEV) can be considered as promising alternatives to ICEs. However, the development of such powertrains involves considerable risk on the manufacturers' side due to high investments and uncertain customer acceptance. To support the manufacturers in deriving successful market introduction strategies for PHEVs and FCEVs, this paper proposes a System Dynamics approach to model the interactions between manufacturers and customers. Subject of the model is a generic car market with two competing manufacturers, each introducing alternatively powered car models according to exogenously defined strategies comprising the times of market introduction, target market shares and target profit margins. The market penetration of new technologies is based on a Bass diffusion model in combination with a multinomial logit model to account for particular vehicle types in the customers' buying decision. Furthermore, the model considers the interdependencies with a complementary filling station infrastructure as well as fuel and energy prices. Experience curves and spillover effects between the manufacturers are also included for electric batteries and fuel cells. The model is applied to an exemplary dataset in order to derive general strategy patterns that are most likely suitable for a successful market introduction of alternative powertrain technologies.

2 - Reducing fleet CO2 emissions through successfully promoting the market diffusion of electric vehicles — a manufacturer's leverage
Katharina Wachter, Karsten Kieckhäfer, Thomas Spengler

Induced by the threat of climate change, political decision makers around the world have set up regulations to limit the CO2 emissions from passenger cars (e.g. Regulation (EC) No 443/2009 in the European Union). As a result, automobile manufacturers are currently adding electric vehicles to their portfolio. Up to now these innovative vehicles do not make up a considerable part of the new car sales. This, however, would be a prerequisite for automobile manufacturers to effectively lower their average fleet CO2 emissions and thus adhere to the regulation. In this contribution we explore how automobile manufacturers can actively support the market success of electric vehicles by making use of the two main drivers of the diffusion of an innovation: marketing and word of mouth. To do so we extend an existing hybrid simulation model of the automobile market. In this model, system dynamics serves to illustrate the aggregated system structures such as the development of the employed powertrain technologies and the corresponding infrastructure availability. To enable a more detailed examination of specific parts of the aggregated system an agent-based simulation model is integrated. With this, the individual automobile purchasing decision of heterogeneous consumers is depicted. We extend this approach to also include interaction between the consumers to account for the effects of word of mouth and marketing. We apply the model to the German automobile market to analyze different structures of consumer interaction and their effect on the vehicles' market share developments. From this we derive recommendations for automobile manufacturers on how to successfully promote alternatively powered vehicles and thereby lower their average fleet CO2 emissions.

3 - System Dynamics Modeling of Pathways to a Sustainable Transportation in Iceland
Hlynur Stefánsson

Due to the growth of vehicles-per-capita and travel demand in Iceland, greenhouse gas (GHG) emissions from the road transport sector have been increasing rapidly during the past decade. To achieve Iceland's long-term goals to reduce the net GHG emissions in the transport sector, a transition to alternative fuel vehicles (AFVs) will be required. To explore the transition process toward a low carbon transport, a system-dynamics model of Iceland's energy systems (UniSyD_IS) is developed. UniSyD_IS is a detailed resource and technology specific model in which equilibrium interactions act across six key markets: electricity, hydrogen, biogas, bioethanol, biodiesel, and vehicle fleets. UniSyD_IS encompasses conventional and alternative fuel supply pathways and the corresponding vehicle powertrains. The whole model structure is divided into four main sectors: 1) fuel supply, 2) fuel prices, 3) infrastructures, and 4) fuel demand. The model provides an endogenous analysis of the road transport sector in which the long-term evolutions of light and heavy-duty vehicles are simulated through a vehicle choice algorithm. In this paper the structure and the algorithm of energy and transport simulation are described and possible transition pathways toward a low-carbon transport in Iceland are explored. The application of the UniSyD_IS model has potential to provide important policy insights as it enables policy analysis at both supply and demand sides and can simulate the impact of different policy instruments on both fuels and vehicles.

WB-11
Wednesday, 10:50-12:20 - SFo3
Data Envelopment Analysis
Stream: Decision Theory and Multi-Criteria Optimization
Invited session
Chair: Matthias Ehrgott

1 - Luenberger Indicator and Directions of Measurement: A Bottoms-up Approach with an Empirical Illustration to German Saving Banks
Mohsen Afsharian, Heinz Ahn


The Luenberger productivity indicator applies directional distance functions which allow one to specify in what direction (i.e. the direction of measurement) the operating units will be evaluated. In the presence of a change in the direction of measurement, the standard components of the existing Luenberger productivity indicator may provide values which are not compatible with reality. In order to eliminate this pitfall, the so-called bottoms-up approach is used to revisit the definition of the indicator and its components. We start with a list of selected sources of productivity change, namely efficiency change, technical change and direction change, then examine the best possible way of measuring each of the sources and combine them to derive a new measure of productivity change. The proposed indicator will be illustrated by means of an empirical application to a panel of 417 German saving banks over the time period 2006-2012. The example explains how the proposed approach is able to properly measure efficiency change, technical change and direction change. The results also provide conclusive evidence about the effect of the change in direction of measurement on the results of the productivity over time in a centralized management scenario.

2 - DEA-scale variable from an economic point of view
Andreas Kleine, Wilhelm Rödder

Data Envelopment Analysis (DEA) is a nonparametric approach to measure the relative efficiency of decision making units (DMUs). One of the most popular DEA approaches is the BCC model by Banker, Charnes and Cooper (1984). It is a well-known result that the sign of the BCC scale variable indicates whether increasing, decreasing or constant returns to scale prevail for a DMU. Moreover, Rödder, Kleine and Dellnitz (2012) prove that the scale variable reveals the efficiency change due to a monocentric scaling of inputs and outputs. In this contribution we interpret the value of the scale variable from an additional economic point of view. It is shown that the optimal value of the scale variable corresponds to a hidden output. The idea is illustrated by a balance sheet and a numerical example.
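For reference, one standard way to write the multiplier form of the BCC model discussed here (textbook notation; conventions for the scale variable u_0 differ slightly across sources) is

\[ \max_{u,v,u_0}\; u^{\top}y_0 + u_0 \quad\text{s.t.}\quad v^{\top}x_0 = 1, \qquad u^{\top}y_j - v^{\top}x_j + u_0 \le 0 \;\;(j=1,\dots,n), \qquad u,v \ge 0,\; u_0 \text{ free}, \]

where (x_j, y_j) are the input and output vectors of DMU j and (x_0, y_0) those of the unit under evaluation. The sign of the optimal u_0 is the returns-to-scale indicator mentioned in the abstract; the talk's contribution is the additional economic interpretation of its value.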
3 - Quality assessment for external radiotherapy planning based on data envelopment analysis
Matthias Ehrgott, Kuan-Min Lin, John Simpson, Andrea Raith, Giuseppe Sasso

We present an application of data envelopment analysis for the assessment of the quality of treatment plans for radiation therapy of prostate cancer. Because commercial radiotherapy treatment planning systems require treatment planners to iteratively adjust the plan parameters in order to find a satisfactory plan, the quality of a plan may not be the best achievable one. We propose a quality assessment method based on Data Envelopment Analysis (DEA) to address this inefficiency. This method compares a plan of interest to a set of past delivered plans and searches for evidence of potential further improvement. With the assistance of DEA, planners will be able to make informed decisions on whether further planning is required and ensure that a plan is only accepted when the plan quality is close to the best attainable one. We demonstrate the potential of the DEA method on a set of 37 clinically acceptable prostate cancer treatment plans.

WB-12
Wednesday, 10:50-12:20 - SFo4
Flow Shop Scheduling
Stream: Project Management and Scheduling
Invited session
Chair: Sigrid Knust

1 - Flow Shops with Synchronous Movement
Stefan Waldherr, Sigrid Knust

In this talk we discuss flow shop problems with synchronous movement, which are a variant of a non-preemptive permutation flow shop. Jobs have to be moved from one machine to the next by an unpaced synchronous transportation system, which implies that the processing is organized in synchronized cycles. This means that in each cycle the current jobs start at the same time on the corresponding machines and after processing have to wait until the last job is finished. Afterwards, all jobs are moved to the next machine simultaneously.
Besides the general situation we also investigate special cases involving machine dominance, which means that the processing times of all jobs on a dominating machine are at least as large as the processing times of all jobs on the other machines. Especially, we discuss flow shops with synchronous movement for a small number of dominating machines (one or two) and different objective functions.

2 - Scheduling Bidirectional Traffic on a Path
Elisabeth Lübbecke, Yann Disser, Max Klimm

We study the fundamental problem of scheduling bidirectional traffic along a path composed of multiple segments. The main feature of the problem is that jobs traveling in the same direction can be scheduled in quick succession on a segment, while jobs in opposing directions cannot cross a segment at the same time. We show that this tradeoff makes the problem significantly harder than the related flow shop problem, by showing that it is NP-hard even for identical jobs. We give polynomial algorithms for a single segment and any constant number of segments, respectively. In contrast, we show the problem to be NP-hard on a single segment and with identical jobs if some pairs of jobs traveling in different directions are allowed to cross the segment concurrently. Finally, we give a PTAS for scheduling bidirectional traffic on a path composed of a constant number of segments.

WB-13
Wednesday, 10:50-12:20 - SFo9
Fuzzy Expert Systems/Fuzzy Expertensysteme
Stream: Artificial Intelligence, Big Data, and Data Mining
Invited session
Chair: Thomas Spengler
Chair: Heinrich Rommelfanger

1 - Methoden der Fuzzy-Datenanalyse für gemischt-skalierte Daten (Methods of Fuzzy Data Analysis for Mixed-Scale Data)
Joachim Vierling

Particularly in an economic context, one frequently encounters situations whose nature can only be described vaguely. Fuzzy set theory, founded by Lotfi Zadeh, provides a basis for modelling such situations in terms of vaguely defined concepts and relationships. While in the early phase it was left to experts to describe vague knowledge in the form of systems based on fuzzy set theory, a large number of methods have since been developed with which fuzzy structures can be extracted directly from available data. A basic property of most of these methods, however, is that they handle only data measured on a cardinal scale. Data on an ordinal or nominal scale, and especially data on mixed scale levels, which appear particularly interesting for analyses in the economic domain, have so far been considered only in rudimentary approaches. This talk describes how data on different scale levels can be made accessible to an analysis with methods of fuzzy data analysis. After motivating the topic and introducing the problem, we show how selected existing methods of fuzzy data analysis can be employed, through suitable modifications and extensions, for the analysis of data on different scale levels. Finally, the suitability of the presented methods for analysing data on different scale levels is demonstrated on exemplarily selected data sets.

2 - Vagueness-enabled Scheduling in project management
Wolfgang Anthony Eiden

In the scientific literature no comprehensive approach can be found for scheduling in project management that takes into account vagueness of non-stochastic origin, which means that no approach exists starting with modeling, proceeding with scheduling, and finishing with evaluating and interpreting the results while taking into account in a systematic way the effects of vagueness of non-stochastic origin. The aim of the PhD thesis of the author, which is currently in progress, is to show that such a comprehensive approach is feasible, and more importantly, the usefulness of such an approach for handling real-life problems will be demonstrated by means of a realistic practical example. In this talk we will present the starting point, the goals, and the methods of this work. To a large extent the methods are based on Fuzzy Theory, whose main aim is to handle vagueness in a mathematically precise way.


3 - Evaluating path-dependency in networks — a fuzzy realoption approach
André Mangelsdorf, Thomas Spengler

So far, path-dependency in interfirm networks has not been covered in depth in the literature, and an evaluation with real options in this field of interest, both in an uncertain and in a fuzzy environment, is lacking. By regarding networks as a sequence of single investments, the option-pricing theory well known in financial theory can be applied to this type of organization. For the determination of the change in value of a network entrance caused by emerging path-dependency, a fuzzy approach will be used for evaluation and thus for giving decision support. For this purpose, a fuzzy version of the Datar-Mathews evaluation model will be applied to interfirm networks, once with and once without considering path-dependent processes. The use of the Datar-Mathews approach first relaxes the stringent assumption of evaluation via an escapist risk-less market interest rate. Instead, according to network-specific characteristics and in relation to the exercise date of the examined options, differentiated interest rates are used for a more realistic evaluation. In addition, the effects of arising path-dependency on the payoff distribution and, hence, the corresponding real-option value of a project, here the network entrance, can be examined by applying a fuzzy methodology. Thereto, we can use fuzzy numbers and intervals on the one hand, and linguistic variables on the other hand. Fuzzy numbers and intervals are employed for arithmetical calculation, linguistic variables for rule-based determination of the respective option value. Here, the corresponding fuzzy factors can refer to all determinants of the option value (e.g. cash flows, scenario probabilities).

WB-15
Wednesday, 10:50-12:20 - SFo11
Forecasting Stream Keynote
Stream: Statistics and Forecasting
Invited session
Chair: Sven F. Crone

1 - Forecasting of Complex Dynamical Systems with Neural Networks
Hans Georg Zimmermann

The forecasting of high dimensional nonlinear systems is obviously a non-trivial challenging task, both regarding identification, analysis and model specification. Artificial neural networks have been proven to be universal approximators, but this still leaves the identification task a hard one. To do it efficiently, we have to violate some of the rules of classical regression theory. Furthermore we should focus on the interpretation of the resulting model to overcome its black box character. Of special interest are complex dynamical systems in the form of state space models realized as recurrent neural networks. After the introduction of small open dynamical systems we will study dynamical systems on manifolds. Here manifold and dynamics have to be identified in parallel. We will move on to large closed dynamical systems with hundreds of state variables and will compare causal versus retro-causal models of the observations. The combination of these models will lead us to an implicit description of dynamical systems on manifolds. Finally we will discuss the quantification of uncertainty in forecasting. In our framework the uncertainty appears as a consequence of principally unidentifiable hidden variables in the description of large systems. The value of the different principles will be shown on real-world applications in Supply Chain Management, Finance, Load Forecasting, Renewable Energy, Process Control and Process Surveillance.

WB-16
Wednesday, 10:50-12:20 - SFo14
Optimization in Regional Energy Systems
Stream: Energy and Environment
Invited session
Chair: Sabrina Ried

1 - A Quantitative Model for Cost-Efficient Regional Energy System Planning
Sören Christian Meyer, Michael H. Breitner

The transformation of the German energy system and its ambitious aims are a challenging field for strategic decision support tools and the underlying models. The Hanover region has for instance committed itself to the goal of reducing greenhouse gas emissions by 95% until 2050. This aim can only be reached if renewable energy sources completely substitute fossil fuels. This objective makes it necessary to redesign the energy sector, including the selection of cost and space effective sources and optimal balancing between energy saving and additional renewable energy production.
Therefore the dilemma between the volatile and rapidly declining prices of energy collection, storage and conversion technologies and the necessary long-term planning in the energy sector has to be solved.
This paper's model solves this dilemma in two steps. At first the model predicts technology prices by the combination of learning rates and global market size forecasts. Subsequently, the resulting learning curves are used for future actual cost calculations based on annual working costs as well as annuities of renewable energy investments. These investments consist of electricity generation from photovoltaic and wind power, the future prices of different electric, plug-in electric and combustion engines, and the price of electricity storage (batteries) and conversion (power-to-gas). Furthermore the model includes viability calculations for building heating and insulation. In all three sectors a given electricity, mobility and warmth demand must be fulfilled by selection of technology alternatives. This paper's model can easily be expanded by additional technologies. In further research based on the model, a scenario tool for economically viable decision making in the energy sector can be programmed.

2 - Day-ahead versus Intraday Valuation of Demand-Side Flexibility for Photovoltaic and Wind Power Systems
Ernesto Garnier, Reinhard Madlener

One potential means to handle the non-controllability and limited predictability of photovoltaic (PV) and wind power production is to employ demand-side flexibility, i.e. demand response (DR) resources. In this paper, we take the perspective of a PV or wind power system operator who leverages DR in order to maximize the economic value of the production supplied in short-term power markets. DR is modeled as contractual flexibility, meaning that the PV or wind operator has the freedom to shift a share of some power supply commitment between neighboring delivery slots. We formulate two alternative DR operation modes: (1) use DR to maximize relative day-ahead market value, by shifting the supply-demand balance in view of day-ahead prices; (2) use DR in intraday operations to minimize costs incurred when balancing forecast errors. An analytical comparison and some (preliminary) testing with German market data suggest that the intraday operation mode (2) yields a higher value in the vast majority of instances. These findings can be attributed to the greater volatility of intraday prices compared to day-ahead prices, and to the fact that the intraday application of DR allows both shifting supply-demand balances as well as netting them. Ultimately, we combine the intraday DR operation with a bidding model for the balancing of PV or wind power forecast errors in continuous-trade intraday markets. The integrated model proposed directs both the allocation of DR resources and the trading of remaining supply-demand imbalances with the goal of value maximization. In the optimization problem, the stochastic and correlated behavior of the two key variables, intraday price and forecast error, are accounted for by means of a multi-dimensional binomial lattice and real options analysis.

3 - Impacts of electricity consumers' unit commitment on low voltage networks
Johannes Schäuble, Patrick Jochem, Wolf Fichtner

Today's electricity consumers tend to become small businesses (also referred to as prosumers) as they invest in their own decentralized electricity generation (e.g. photovoltaic, combined heat and power, etc.) and mobile (e.g. electric vehicles) and/or stationary energy storage as well as in information technology (IT) to connect and organize these new devices (e.g. as virtual storage). Furthermore, the installed IT allows them at least technically to establish local markets. The variety of consumers and their characteristics implies numerous ways of how they optimize their individual unit commitment. This paper aims to analyze the impact of the individual consumers' decisions on future electricity demand and feed-in on the low voltage network level. Therefore, in a first step the different unit commitment problems of the different small businesses have been modeled using mixed-integer linear programming (MILP). In a second step these consumers are modeled as learning agents of a multi-agent system (MAS). The MAS represents a local electricity market in which participants negotiate supply relationships. At each step of the simulation of the MAS the agents will readjust their MILP decision based on newly gained information (e.g. a neighboring agent sells electricity cheaper in the evening). Finally, using different synthetic scenarios with different input parameters (e.g. load characteristics) the behavior of the agents and the resulting impact are studied in detail. Amongst others, the simulation's results show major changes in electricity demand and feed-in for scenarios with high market penetration of storages. Results also show that electricity demand and supply can be balanced in the local electricity market for certain scenarios and a specific market design.
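As a toy illustration of the kind of per-consumer unit commitment MILP mentioned above (my own minimal example with made-up parameters and the open-source PuLP library, not the authors' model), a prosumer schedules a battery against known hourly prices:

    # Minimal prosumer unit-commitment sketch using PuLP (pip install pulp).
    # A battery is charged/discharged to meet a fixed load at minimum purchase cost.
    from pulp import LpProblem, LpMinimize, LpVariable, lpSum, value

    T = range(4)                        # four example periods (assumed data)
    price = [0.30, 0.10, 0.25, 0.35]    # EUR/kWh
    load = [1.0, 1.0, 1.5, 2.0]         # kWh demand per period
    cap, p_max, eta = 3.0, 2.0, 0.95    # battery capacity, power limit, efficiency

    m = LpProblem("prosumer_schedule", LpMinimize)
    buy = {t: LpVariable(f"buy_{t}", lowBound=0) for t in T}
    ch = {t: LpVariable(f"charge_{t}", lowBound=0, upBound=p_max) for t in T}
    dis = {t: LpVariable(f"discharge_{t}", lowBound=0, upBound=p_max) for t in T}
    soc = {t: LpVariable(f"soc_{t}", lowBound=0, upBound=cap) for t in T}
    mode = {t: LpVariable(f"mode_{t}", cat="Binary") for t in T}  # makes it a genuine MILP

    m += lpSum(price[t] * buy[t] for t in T)                      # minimize purchase cost
    for t in T:
        m += buy[t] + dis[t] == load[t] + ch[t]                   # energy balance per period
        prev = soc[t - 1] if t > 0 else 0
        m += soc[t] == prev + eta * ch[t] - (1.0 / eta) * dis[t]  # state of charge dynamics
        m += ch[t] <= p_max * mode[t]                             # no simultaneous
        m += dis[t] <= p_max * (1 - mode[t])                      # charging and discharging

    m.solve()
    print([round(value(buy[t]), 2) for t in T])

In the multi-agent simulation described in the abstract, each agent would re-solve a model of roughly this shape whenever new market information arrives.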


WB-19
Wednesday, 10:50-12:20 - I
Staff Scheduling and Rostering
Stream: Traffic and Transportation
Invited session
Chair: Silke Jütte

1 - An Insight to Aviation: Rostering Ground Personnel in Practice
Manuel Kutschka

Numerous dynamic, interdependent processes exist at an airport. These processes are highly affected by uncertain events such as changing flight schedules, delays, or weather conditions. Naturally a flexible workforce management is needed to support such operation. Airlines, airports, and ground handlers provide the necessary workforce to meet this demand. But legal requirements, union agreements and company policies define the flexibility of workforce planning and utilization in practice. Nevertheless a valid (monthly) roster matching the supply with demand under all these requirements has to be prepared, usually several weeks before the day of operation.
In this talk we discuss the optimization challenges in creating monthly rosters for ground personnel at an airport. We give examples of typical (legal/union/company) constraints, point out the characteristics of different work areas at an airport, and how this affects the rostering. Further we present how rostering is solved by our branch-and-price solution methodology in practice. Using this approach, we report on our real-life experience with optimized rostering in airport ground handling.

2 - Fair Cyclic Roster Planning — A Case Study for a Large European Airport
Torsten Fahle

Airport ground staff scheduling has long been known as one of the most challenging and successful applications of operations research, in particular of the column generation method. In this presentation, we will concentrate on one type of rostering known as cyclic rosters (equivalently, shift patterns or rotating schedules), which represent sequences of consecutive shifts and days-off designed for a group of employees, rotating from one week to the next. Numerous aspects required in practice have to be taken into account, amongst others crew qualification, work locations and the travel time between each location, government regulations and labour agreements, etc. INFORM's branch-and-price solution approach covers all of these aspects and is in use at many airports world-wide. Cyclic rosters are usually considered as 'fair by construction'. Nevertheless, one of our customers missed several fairness aspects in the generated plans. In this case study we will discuss why the customer finds the resulting rosters unfair. We show which new fairness requirements are needed. We present a fast local search post-processing step that transforms a cost optimal shift plan from our branch-and-price solver into a fair cyclic shift plan with the same costs. The transformed plans are highly accepted and are in operational use.
Based in Aachen, INFORM's Aviation Division is a team of raised-in-industry ICT professionals who research, develop and deliver cutting-edge software solutions to improve airport and ground handling logistics operations and provide consulting services. INFORM products are used by over 75 organizations in more than 165 airports worldwide.

3 - Fairness considerations in railway crew scheduling
Silke Jütte, Daniel Müller, Ulrich Thonemann

Railway crew scheduling deals with generating duties for train drivers to cover all train movements of a given timetable. The common objective is to minimize the overall costs associated with a crew schedule, such as workforce costs, hotel costs, etc.
In reality, a cost minimal schedule often shows an uneven distribution of unpopular duties among crew districts. As an example, overnight rests, which are typically unpopular among drivers, might be assigned to one crew district only. This situation is commonly perceived as unfair and corresponding crew schedules are only badly accepted among employees.
Transferring results from stationary contexts, we define and measure unpopularity and (un)fairness in a railway crew scheduling context. We show how to best integrate fairness conditions into a column generation based solution algorithm. Our method has been applied to real-world test instances from a large European freight railway carrier. For our test scenarios, we could significantly improve schedule fairness, while schedule cost and solution runtime were only marginally affected.
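The column generation setting referred to in this session typically builds on a set-covering master problem over duties (a generic sketch, not the specific model of the talk):

\[ \min \sum_{d \in D} c_d\, x_d \quad\text{s.t.}\quad \sum_{d \in D} a_{td}\, x_d \ge 1 \;\;\text{for every train movement } t, \qquad x_d \in \{0,1\}, \]

where a_{td} = 1 if duty d covers movement t and c_d is the duty cost; new duties (columns) with negative reduced cost are generated by a pricing problem. Fairness conditions of the kind discussed above can then be imposed as additional constraints or cost terms on this master problem.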

WB-20
Wednesday, 10:50-12:20 - II
Integrating Lotsizing and Scheduling
Stream: Production and Operations Management
Invited session
Chair: Steffen Kasper

1 - A metaheuristic solution approach for a rich lot-sizing and scheduling model
Michael Schilde, Cornelia Jetzinger, Karl Schneeberger, Karl Doerner

The production of perishable products such as dairy products includes several specific aspects concerning product durability and production procedures. The production of different dairy goods within a plant requires coordination of the product flow across different production levels (e.g., soaking, heating, fermentation, cooling, filling). Usually the lot-sizing problem and the detailed sequencing and scheduling problem are treated separately in the production planning process. By considering these two problems simultaneously in a realistic model formulation we aim to improve the overall performance of the entire production process. For this purpose we extended the Position-Based Model introduced by Lütke-Entrup et al. (2005). The extensions include explicit product transfers via product pipes (i.e., pipes are used to transfer products between aggregates; no two transfers can be performed at the same time), product-dependent durability during the production process (e.g., after fermentation the product has to be chilled within a certain time limit), cleaning and sterilization pipes which prevent simultaneous treatment of specific aggregates, maximum and minimum capacity of aggregates, sequence-dependent setup times, product loss caused by transfers, a product-specific production speed for each aggregate, and cleaning intervals (i.e., the time between two consecutive cleaning procedures is limited). Based on a set of real-world production data, we used our model to determine exact solutions to very small problem settings. As even for small instances the time required for obtaining exact solutions is too long for practical use cases, we developed an innovative metaheuristic solution approach based on the concept of adaptive large neighborhood search for this problem.

2 - Solving a rich position based model for dairy products
Karl Schneeberger, Karl Doerner, Michael Schilde

Usually the lot-sizing problem and the detailed sequencing and scheduling problem are treated separately in the production planning process. By considering these two problems simultaneously in a very realistic model formulation we aim to improve the overall performance of the entire production process. For this purpose we extended the Position-Based Model introduced by Lütke-Entrup et al. (2005). The extensions include explicit product transfers via product pipes (i.e., pipes are used to transfer products between aggregates; no two transfers can be performed at the same time), product-dependent durability during the production process (e.g., after fermentation the product has to be chilled within a certain time limit), cleaning and sterilization pipes which prevent simultaneous treatment of specific aggregates, maximum and minimum capacity of aggregates, sequence-dependent setup times, product loss caused by transfers, a product-specific production speed for each aggregate, and cleaning intervals (i.e., the time between two consecutive cleaning procedures is limited). Based on a set of real-world production data, we used our model to determine exact solutions to very small problem settings. As even for small instances the time required for obtaining exact solutions is too long in general (even finding a first feasible solution for one product on all available aggregates takes many hours), we first developed a fix-and-optimize inspired construction heuristic to obtain a feasible solution for several products. This means that the overall problem is first decomposed and iteratively solved while adding one product per iteration. With this as input we created an innovative matheuristic solution approach based on the concept of fix-and-optimize for this problem.


Position-Based Model introduced by Lütke-Entrup et al. (2005). The Given a generic process or workflow model in YAWL-notation or any extensions include explicit product transfers via product pipes (i.e., other process modelling language like BPMN or WFMC we state that, pipes are used to transfer products between aggregates; no two trans- by using a set of reduction rules as introduced e.g. by Sadiq et al. we fers can be performed at the same time), product-dependent durabil- are able to generate a hierarchically structured tree of sub graphs of ity during the production process (e.g., after fermentation the product the workflow graph-representation. According to the notation used in has to be chilled within a certain time limit), cleaning and steriliza- La Rosa et al. we call these sub graphs facts. The tree structure of tion pipes which prevent simultaneous treatment of specific aggregates, the graph-representation on the one hand and the logical relation be- maximum and minimum capacity of aggregates, sequence-dependent tween the branches and leafs of the tree on the other can be utilized to setup times, product loss caused by transfers, a product specific pro- create a set of constraints and dependencies between the single facts. duction speed for each aggregate, and cleaning intervals (i.e., the time La Rosa et al. showed that the nested branches can be associated to between two consecutive cleaning procedures is limited). Based on (predefined) questions with respect to the configuration of a workflow a set of real-world production data, we used our model to determine management system like for instance an ERP-application. They pre- exact solutions to very small problem settings. As even for small in- sented an algorithm which dynamically sorts the questions and answers stances the time required for obtaining exact solutions is too long in in a maximum efficient configuration path while working through the general (even finding a first feasible solution for one product on all corresponding questionnaire. By combining the different elements as available aggregates takes many hours), we first developed a fix-and- facts, constraints on questions and configuration space we are thus able optimize inspired construction heuristic to obtain a feasible solution to 1. Algorithmically generate the efficient structured, interactive ques- for several products. This means, that the overall problem is first de- tionnaire for the configuration of workflow systems and 2. Algorith- composed and iteratively solved while adding one product per itera- mically check the consistency (dead lock free, free of synchronization tion. With this as input we created an innovative matheuristic solution structural conflict) of the underlying workflow model. The concept approach based on the concept of fix-and-optimize for this problem. was tested in the prototype of the interactive questionnaire for config- uration of the webservice based ERP-Application Posity. 3 - A Proportional Lot-sizing and Scheduling Problem with Setup Classes 3 - A Unified Framework and Toolkit for Decision-Making Steffen Kasper, Stefan Helber under Uncertainty Susara van den Heever We propose an extended model formulation of the Proportional Lot- sizing and Scheduling Prob-lem (PLSP). In this model, setup opera- Business decision makers require optimized plans and schedules to be tions can be carried out period overlapping as well as product order robust. This means plans and schedules must be resilient to change, as dependent. 
We merge setup operations with comparable effort to so well as have the ability to quickly recover from shocks or changes to called setup classes. The implementation of these setup classes leads the system. At the same time, many decision makers who use optimiza- to a significant reduction of binary variables and therefore to a faster tion technology consider data as being certain when creating plans and solution time resp. a better solution quality. We finally pre-sent numer- schedules. However, when the data changes, their plans and schedules ical studies which show the effect of the implemented setup classes in often break down, leading to decision instability and distrust in opti- comparison to literature-based approaches. mization technology. In this talk, we present a unified framework and toolkit for robust decision-making under uncertainty. The goal of this approach is to improve the resilience of plans and schedules, as well as trust in optimization technology. This toolkit guides operations re- search practioners in modeling business problems under uncertainty, including a methodology for soliciting the uncertainty characteriza-  WB-21 tion and information to automatically construct robust and stochastic Wednesday, 10:50-12:20 - III models, based on a given deterministic model. It also guides business decision makers in leveraging various optimization-based approaches, Optimization Software I such as robust and stochastic optimization, to create multiple plans and compare their performance across multiple scenarios and KPIs. We Stream: Software Applications and Modelling Systems demonstrate the use of the toolkit for a case study involving demand Invited session fulfillment. Chair: Susara van den Heever

1 - Cluster Task Flow Execution Analysis Model Vlad Kucher, Zoia Runovska  WB-22 Wednesday, 10:50-12:20 - IV Cluster nodes’ load analysis is widely used to detect potential problems and optimize cluster performance in solving control problems when Transport Network Planning & Operation clusters are combined in Grid systems, which become increasingly popular due to the growing number of cluster systems. The system is studied under general assumptions about its functionality. Descrip- Stream: Logistics and Inventory tion of the system operating in continuous time is given as follows. A Invited session cluster is a set of independent nodes. Tasks enter the system at ran- Chair: Annette Chmielewski dom time moments, the time intervals between the moments are mutu- ally independent and have the same distribution. Number of requested computation nodes to perform a task is an integer random variable. If 1 - Modelling and Solving Realistic Hub Location Prob- there is sufficient number of available nodes, the task is accepted for lems execution. Otherwise, the task is performed on a group of clusters that J. Fabian Meier, Uwe Clausen can exchange tasks. Task performance duration is some continuous random variable. The number of nodes becoming available after a task Real-world transportation often uses hub-based transport networks: is performed is a limited integer random variable. Cluster load analy- Shipments from different sources are bundled, send to a hub, reclas- sis is associated with the study of a random process that describes the sified and send to other hubs or their respective sinks. The economies cluster state at the current time moment. This process can be inter- of scale lower the transport costs but additional costs for building and preted as a semi-Markov process with spontaneous changes, the study maintaining hubs come into play. To balance costs and gains, one has of which is reduced to the investigation of semi-Markov process with to overcome the simplified view on economies of scale often present in two-dimensional states. The results obtained characterize in sufficient literature: Use linear costs and discount them on hub-hub-connections detail the studied system: the distribution of availability of computa- by a fixed factor. tional cluster nodes can be used to predict changes in the number of If we measure transport costs as concave or step function, we get more nodes available over time; stationary distribution can be used to opti- realistic models, but MIP methods cannot solve the resulting problems. mize a number of parameters; non-stationary feature of the considered We present a way to write down hub location problems in a suitable system - task execution time without a queue - made it possible to esti- way for neighbourhood search. Furthermore, we develop a Simulated mate the average number of tasks performed within this time interval. Annealing procedure to attack such general hub location problems. An important and often overlooked problem is the calibration of heuris- 2 - Generation of interactive Questionnaires using tics. We present a method using the F-Race approach of M. Birattari YAWL-based workflow models adapted for our purposes. Raimond Wüst
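The hub location abstract above (Meier and Clausen) replaces exact MIP solving by a neighbourhood search and a Simulated Annealing procedure once transport costs become concave or stepwise. The skeleton below is only a generic illustration of that algorithmic template with a made-up cost function and move; it is not the authors' model, and the cooling parameters are arbitrary assumptions.

```python
# Generic simulated-annealing skeleton of the kind referred to in the hub location
# abstract above; cost function, neighbourhood move and cooling schedule are
# placeholders, not the authors' model.
import math, random

def simulated_annealing(initial, cost, neighbour, t0=100.0, cooling=0.95, iters=2000):
    current, best = initial, initial
    t = t0
    for _ in range(iters):
        cand = neighbour(current)
        delta = cost(cand) - cost(current)
        # always accept improvements, accept deteriorations with Boltzmann probability
        if delta <= 0 or random.random() < math.exp(-delta / t):
            current = cand
            if cost(current) < cost(best):
                best = current
        t *= cooling                      # geometric cooling (assumption)
    return best

# toy usage: open/close four hubs, encoded as a 0/1 tuple with a hypothetical cost
def toy_cost(x):
    return 10 * sum(x) + (1000 if not any(x) else 0)   # hub cost plus penalty if none open

def toy_neighbour(x):
    i = random.randrange(len(x))
    y = list(x)
    y[i] = 1 - y[i]
    return tuple(y)

print(simulated_annealing((1, 1, 1, 1), toy_cost, toy_neighbour))
```

Threshold Accepting, mentioned later in the programme for the picking-point problem, follows the same skeleton but accepts every candidate whose deterioration stays below a shrinking threshold.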


2 - Tender decisions in European transportation net- In the past decades, companies in several industries transformed their works for automotive companies business from selling single products to selling individualized and inte- Maximilian Brock, Annette Chmielewski grated products and services. This shift dramatically impacts the sell- ing approach of the sales force: planning long term relationships with the client become more and more important and the sales force needs Automotive Original Equipment Manufacturers (OEM) produce a to be organized in sales teams rather than ’lone-wolf’ sellers. wide range of cars in their plants in Europe. The plants are supplied with component parts by thousands of suppliers of which a large num- Sales Force Modeling, in particular sales territory design, has been a ber is located in Europe as well. Depending on the part type and de- subject of research for more than 40 years. Sales territory design mod- mand structure different transport types like Full Truck Load or Less els assign customer accounts or geographic units to sales representa- Than Full Truck Load are chosen to provide the plants. In addition, tives and calculate coverage, disruption, and profit impact of alterna- various transport modes like road or rail transports are used to benefit tive assignments. These models are usually integer programming mod- from their individual cost structures. In a strategic network optimiza- els that maximize coverage, minimize disruption, or maximize profit. tion the optimal transport types and modes are identified. Afterwards, However, little attention is given to the changing selling approach and logistics service providers (LSP) are chosen in a tender process to per- the impact on sales territory design. We study a novel sales territory form the necessary transports at minimum costs and highest possible design model for assigning sales teams to customer accounts to sales quality and service level. The transport network is separated in geo- teams for maximizing sales revenue based upon individual sales poten- graphic regions (e.g. Spain, North of France) in order to design lo- tial for individual products and services. We consider constraints such gistically relevant and manageable network parts on which the LSP as workload and geographical distance in our model. We first present a can bid. The aim of the tender process is to gather LSP-quotes on deterministic formulation of the sales territory design model and then the transport regions, modes and types. The quotes are compared and formulate a stochastic program. Using real data, the solution of both the cost-optimal set of quotes and LSPs is chosen under certain re- models are evaluated. strictions. For example, the restrictions cover the transport volume per LSP or the maximum number of LSP admitted. Furthermore, it is pos- 2 - Towards a Customer-oriented Queuing in Service In- sible to form bid packages combining several regions. The objective cident Management function consists of the transport costs arising out of a feasible set of Peter Hottum, Melanie Reuter LSPs serving then network regions and it is often influenced by cri- The provision of services hinges considerably on the contribution of teria as, for instance, the implementation effort. 4flow has developed the provider and the customer and — if present — on their involved a mathematical model and an optimization tool to support the tender networks. 
In this talk we focus on incident management — a ser- process and assess and compare possible scenarios. In this paper the vice domain that is highly relevant for all kinds of industries and is mathematical model and its application on a real-world problem will described from a provider internal perspective in the ITIL documenta- be presented. tion. In previously conducted studies, we have derived result influenc- ing factor classes based on qualitatively and textual analyzed service 3 - Improving an auction based exchange mechanism incident tickets from a worldwide operating IT service provider. We for the collaborative carrier routing problem have proven the customer induced contribution to the service genera- Margaretha Gansterer, Daniel Kaml, Richard Hartl tion and aggregated a customer contribution factor (ccf). By comple- menting these provider-centric service processes with that factor, we The logistics and transportation sector has undergone fundamental are able to use information about the customer’s ability to contribute, changes in the past decades. Intensified competition on global mar- that was not able to process before. In the talk, we address the question: kets along with heightened customer expectations lead to increased How can the customer’s potential to contribute be used to organize the pricing pressure. However, increased efficiency can be achieved if queuing in service incident management in a customer-oriented way? freight carriers collaborate by trading their transportation requests. In We present a mathematical formulation for assigning tickets to servers the highly competitive shipping and transportation industry, compa- and discuss first results of a discrete event simulation. We use this nies need to achieve a maximum level of efficiency in order to stay in simulation to test basic assignment rules based on the ticket complex- business. Our study is based on an auction based exchange mechanism ity and the servers’ level of experience. We also study the impact of which has been presented by Berger and Bierwirth [1]. They provide the ccf in a small example. a framework of methods for maximizing the total profit of the network 3 - Optimization of "Picking Points" for the area-wide while enabling the carriers to reveal as few as possible private informa- tion. Results are compared to those obtained by central planning and supply of service parts to a situation where carriers do not collaborate at all. Investigations Peter Korevaar are based on three levels of competition according to the geographical The area-wide supply of service parts remains a big planning and lo- composition of customer areas. The integrated tour planning method gistical challenge for many companies, particularly when the customer constitutes the traveling salesman problem with precedence constraints requirements for parts availability are high, the demand per part is low (TSPPD) and is solved by exact algorithms. We replace their exact and and the individual parts are expensive. A common approach is the thus costly tour planning process by a heuristic approach. Further- overnight delivery of service parts to so-called picking points. Of- more, we include new carrier strategies and show their effect on the ten the home addresses of technicians are used as picking points (e.g. total profit of the collaborative carrier network. Our heuristic based garage or van kit). 
While this approach has obvious advantages for framework shows promising results while we are able to handle prob- the technicians, it is in general not cost optimal. On the one hand this lems with an increased number of transportation requests. [1] Berger, is due to the fact that the technician residences are usually not opti- S., Bierwirth, C. (2010). Solutions to the request reassignment prob- mally located with respect to the repair locations, on the other hand lem in collaborative carrier networks. Transportation Research Part E because there are typically more technician locations than the number 46: 627-638. of locations necessary for the required maintenance and repair service. Consequently the delivery to and the inventory costs of those picking points tend to be too high. The optimal choice of the number of picking points, their geographical locations, as well as the optimal allocation of customers to the picking points, considering maximum travel times, is an NP-complete optimization problem. In the lecture, an optimization  WB-23 system is presented which solves this problem using the Threshold Ac- Wednesday, 10:50-12:20 - V cepting method (a metaheuristic optimization method similar to Sim- ulated Annealing). Based on real life examples, typical total logistics Service Analytics and Optimization costs savings will be demonstrated and the sensitivity of the optimal solution with respect to the maximum allowable travel time will be Stream: Production and Operations Management discussed. Invited session 4 - Statistical Correction of Cash-Flow Forecasts Chair: Peter Hottum Sebastian Blanc, Thomas Setzer Chair: Hansjörg Fromm Cash flow forecasts play a pivotal role in corporate planning as for in- stance liquidity and foreign-exchange risk management are typically 1 - Deterministic and stochastic sales territory design based on these forecasts. Unfortunately, expert judgment and pre- diction are typically biased, but empirical work from other domains models for sales teams employing comparably small datasets suggest that biases can often be Johannes Kunze von Bischoffshausen, Melanie Reuter, partially removed using statistical correction techniques in order to in- Hansjörg Fromm crease forecast accuracy. Although accurate forecasting of financial


figures is vital for today’s corporations, the quantitative impact of sta- 3 - Designing a Combinatorial Market for Offloading Cel- tistical correction of enterprise financial forecasts in general and cash lular Traffic via Wireless Access Points flow prediction in particular has not yet been empirically analyzed. Sven Seuken Accordingly, we analyze correction of cash flow forecasts generated by experts from various subsidiaries and business divisions of a multi- Every year, mobile network operators (MNOs) around the world spend national company. Employing a unique set of forecasts with different billions of dollars expanding their mobile networks, to cope with the horizons delivered by numerous subsidiaries over a period of over five exponentially increasing demand for 3G and 4G bandwidth. Cellu- years, we find that forecasts accuracy can be increased significantly lar capacity is particularly scarce in inner-city locations during par- using correction techniques based on linear models. We compare tech- ticular times-of-day. At the same time, the majority of wireless ac- niques based on ordinary least squares (used for instance in Theil’s cess points (residential and commercial) are largely idle most of the method) and least absolute deviation regression. The results show that time, i.e., the cheap Internet bandwidth provided by Internet Service forecast accuracy can be improved significantly using statistical cor- Providers (ISPs) remains largely unused. This gives rise to opportu- rection techniques. In addition, results indicate a higher robustness of nities for trade, where some of the peak-time cellular traffic from the the latter technique against outliers. Furthermore, we find that the us- MNOs is offloaded via wireless access points, in exchange for pay- age of geometrically descending weights of recent observations fosters ments from the MNOs to the ISPs. However, determining an optimal accuracy in the majority of cases. allocation and prices is a challenging problem, in particular because MNOs have complex, combinatorial preferences: their need and their value for offloading traffic vary by location and by time-of-day. In this paper, we propose a market design solution for this problem, where an intermediary sets up a smart market platform that automatically es- WB-24 tablishes trades between sellers and buyers. We first describe how the  preferences of the sellers and buyers in this domain can be modeled Wednesday, 10:50-12:20 - AS succinctly. Then we introduce a combinatorial allocation mechanism that computes an optimal allocation, i.e., which MNOs get to offload Combinatorial Auctions how much of their traffic in which of their cell sectors and at what time of the day. Finally, we show how to use core-selecting combina- Stream: Pricing, Revenue Management, and Smart torial auctions in this domain to computes prices for each MNO, while Markets minimizing the incentives for the MNOs to misreport their values. We Invited session conclude by discussing a number of challenges (e.g., thin markets and non-scarcity) that arise in fielding this mechanism in practice. Chair: Martin Bichler
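The cash-flow abstract above (Blanc and Setzer) removes systematic bias from judgmental forecasts with linear correction models estimated on past forecast errors. The sketch below shows only the ordinary least squares variant (in the spirit of Theil's method) on invented numbers; the least absolute deviation alternative and the geometrically decaying observation weights discussed in the talk are omitted.

```python
# Minimal sketch of linear (Theil-style) correction of judgmental forecasts, assuming
# a short history of (forecast, actual) pairs; all data are made up.
import numpy as np

history_forecast = np.array([100., 120.,  90., 150., 110.])   # expert cash-flow forecasts
history_actual   = np.array([ 92., 115.,  85., 138., 101.])   # realized cash flows

# fit actual = a + b * forecast by ordinary least squares
X = np.column_stack([np.ones_like(history_forecast), history_forecast])
(a, b), *_ = np.linalg.lstsq(X, history_actual, rcond=None)

new_forecast = 130.0
corrected = a + b * new_forecast          # bias-corrected prediction for the next period
print(round(float(corrected), 1))
```

A robust variant would replace the least squares fit by least absolute deviation regression, which the abstract reports to be less sensitive to outliers.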

1 - Dynamic programming for combinatorial auctions with items arranged on rows Bart Vangerven, Dries Goossens, Frits Spieksma  WB-25 Wednesday, 10:50-12:20 - AachenMuenchener Halle (Aula) Combinatorial auctions allow bidders to better express their prefer- ences compared to the traditional auction formats, increasing economic OR Success Stories I efficiency when complementarities or substitution effects are present. We investigate the problem setting where we have an auction of sim- Stream: Business Day ilar goods that can be arranged on rows. An application of this is the Invited session selling of tickets for seats in a grandstand or stadium. Another applica- tion is selling pieces of land. Bidders are allowed to submit one bid on Chair: Christoph Hempsch any subset of the goods (seats, land, ...) that is connected. The objec- tive is a traditional one: to compute the subset of bids that maximize 1 - OR at Netherlands Railways: Successes and Chal- auction revenue. Of course every good can only be sold once. We de- scribe a dynamic programming algorithm which, for a fixed number of lenges rows and for a particular class of bids, solves the winner determina- Dennis Huisman tion problem optimally in polynomial time. We also study a number of extensions for which we can modify the algorithm. In 2008, Netherlands Railways (NS) won the Franz Edelman Award for using the Operations Research (OR) methods to introduce a com- 2 - Compact bid languages and core-pricing in large pletely, new timetable in December 2006. This timetable was selected from a set of 10 timetables, all generated with OR models. In addition, multi-item auctions rolling stock and crew schedules were constructed with OR tools. Al- Andor Goetzendorff, Martin Bichler though the general performance of NS has significantly improved since the introduction of the new timetable, NS struggled a lot with winter We introduce an auction design framework for large markets with hun- weather, more specific with snow, during the last couple of years. dreds of items and complex bidder preferences. Such markets typically In December 2009, after severe problems in train operations, NS lead to computationally hard allocation problems. Our new framework started - together with ProRail (the Dutch railway infrastructure man- consists of compact bid languages for sealed-bid auctions and meth- ager) and the Ministry of Infrastructure and the Environment - a Winter ods to compute second-price rules such as the Vickrey-Clarke-Groves program to improve its operations during heavy winter days. The long or bidder-optimal, core-selecting payment rules when the optimality of term goal of the Winter program is to achieve a high performance un- the allocation problem cannot be guaranteed. To demonstrate the ef- der all circumstances, so even during heavy winter conditions. This ficacy of the approach for a specific, complex market, we introduce goal can be achieved by improving the assets such that they do not fail a compact bidding language for TV advertising markets and inves- during heavy winter conditions, and by having a new process for dis- tigate the resulting winner-determination problem and the computa- ruption management. In this process, advanced algorithms to resched- tion of core payments. For realistic instances of the respective winner ule the timetable, rolling stock and crew in realtime, play a major role. 
determination problems, very good solutions with a small integrality The algorithms for real-time rescheduling of crew have already been gap can be found quickly, though closing the integrality gap to find used in practice. Experiments since Summer 2013 have shown that marginally better solutions or prove optimality can take a prohibitively crew members can be rescheduled within a few minutes. In this pre- large amount of time. Our subsequent adaptation of a constraint- sentation, we will discuss the first results, our implementation strategy generation technique for the computation of bidder-optimal core pay- and remaining challenges. ments to this environment is a practically viable paradigm by which core-selecting auction designs can be applied to large markets with potentially hundreds of items. Such auction designs allow bidders to 2 - Solving two practical problems from the worlds of express their preferences with a low number of parameters, while at transport and territory planning the same time providing incentives for truthful bidding. We comple- Nitin Ahuja ment our computational experiments in the context of TV ad markets with additional results for volume discount auctions in procurement in The transport planning problem deals with moving goods from one order to illustrate the applicability of the approach in different types of location to another by using the available modes of transport. What large markets. makes this problem interesting is that intermodal transport of goods is also allowed.
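The row-auction abstract above (Vangerven, Goossens and Spieksma) solves winner determination by dynamic programming over goods arranged on rows, with bids restricted to connected subsets. For a single row the problem collapses to weighted interval scheduling, which the sketch below solves; this one-row special case is shown only to illustrate the structure the full multi-row algorithm exploits, and the bid data are invented.

```python
# Winner determination restricted to a single row: bids are connected seat ranges, so
# picking a revenue-maximal set of non-overlapping bids is weighted interval scheduling.
# Illustrative special case, not the multi-row dynamic program of the abstract above.
def single_row_revenue(bids):
    """bids: list of (first_seat, last_seat, price), seats numbered from 1."""
    bids = sorted(bids, key=lambda b: b[1])            # order by right endpoint
    best = [0.0] * (len(bids) + 1)                     # best[i] = max revenue over first i bids
    for i, (lo, hi, price) in enumerate(bids, start=1):
        j = i - 1                                      # last earlier bid not overlapping [lo, hi]
        while j > 0 and bids[j - 1][1] >= lo:
            j -= 1
        best[i] = max(best[i - 1], best[j] + price)    # either skip bid i or accept it
    return best[-1]

print(single_row_revenue([(1, 3, 50), (2, 5, 80), (4, 6, 40), (7, 9, 30)]))   # 50 + 40 + 30 = 120
```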


The territory planning problem basically involves partitioning a set of customers into a given number of clusters that are "balanced", "compact", and as non-overlapping as possible. We will briefly talk about and define these problems. Then, we will take a short look at the way we solved them.

3 - How OR Improves The Baggage Handling System At Frankfurt Airport - Project Insights
Marco Franz, Frauke Böckmann

The baggage handling system at Frankfurt Airport distributes up to 110,000 bags per day using its over 80 km of rail tracks. During the last years the baggage handling system has been extended and new requirements have been implemented, e.g. robust routing in case of disturbances and balancing constraints for the early baggage storage system based on prognosis data. This talk describes how OR helped us to succeed in this complex project. Technical details will be discussed in an accompanying talk given by Frauke Böckmann on Thursday morning.

4 - Optimizing the delivery depot network of DHL Parcel in Germany
Christoph Hempsch, Matthias Meisen

Deutsche Post DHL has invested a significant amount in its parcel distribution network over the last 3 years. Different methods and models from Operations Research are used to optimize the distribution network by supporting management decisions concerning the hub location or the sorting capacity of hubs. A linear mathematical program which optimizes the number and the location of delivery depots will be presented, and it will be illustrated how OR has influenced the landscape of delivery depots within the parcel distribution network of Deutsche Post DHL in Germany so far.

Wednesday, 13:10-14:40

 WC-02
Wednesday, 13:10-14:40 - Fo2

Online-Optimization

Stream: Discrete and Combinatorial Optimization, Graphs and Networks
Invited session
Chair: Chi-Fen Chang

1 - The Bi-Objective Ski Rental Problem
Morten Tiedemann

In online optimization, an algorithm has to make decisions based on a sequence of incoming bits of information without knowledge of future inputs. The performance of an online algorithm is commonly evaluated by comparing its objective value to the optimal offline solution, also referred to as competitive analysis. Thus far, the notion of online algorithms and competitive analysis is only known for single-objective optimization problems. We transfer the concept to multiple objectives and introduce competitive analysis for multi-objective optimization problems. Due to the shift from a single optimal solution in a single-objective optimization problem to a set of efficient solutions in a multi-objective optimization problem, the transformation of the concept of competitive analysis to multiple objectives is not straightforward. In this talk, the novel definition of multi-objective online optimization is introduced and then applied to the ski rental problem, which is an analogy for the classic rent or buy problem. Imagine you are about to go skiing for the first time in your life and you are faced with the question of whether to buy skis or to rent them. If you knew how often you would go skiing in the future, the optimal decision could be calculated based on the rental and the buying costs. By the definition of a second cost component related to the comfort of buying or renting skis, the bi-objective ski rental problem is introduced. We present an optimal bi-objective online algorithm with respect to multi-objective competitive analysis.
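For reference, the classical single-objective baseline that the bi-objective abstract above generalizes is the break-even strategy: rent until the accumulated rent reaches the purchase price, then buy. A minimal sketch, assuming a rental cost of 1 per day and an integer purchase price B; this rule never pays more than twice the optimal offline cost.

```python
# Classical single-objective ski-rental baseline (break-even rule): rent until the
# accumulated rent would reach the purchase price B, then buy. Assumes rent 1 per day.
def break_even_cost(days_skied, buy_price, rent_per_day=1):
    rented_days = min(days_skied, buy_price // rent_per_day)   # days rented before buying
    bought = days_skied > rented_days
    return rented_days * rent_per_day + (buy_price if bought else 0)

B = 10
for n in (3, 10, 25):
    online  = break_even_cost(n, B)
    offline = min(n, B)                    # optimal with hindsight: always rent, or buy at once
    print(n, online, offline, round(online / offline, 2))      # ratio never exceeds 2
```

The bi-objective version in the talk adds a second, comfort-related cost component on top of this monetary one.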
2 - Interval Scheduling with Online Failure of Jobs
Marco Bender, Clemens Thielen, Stephan Westphal

In the interval scheduling problem we are given a set of jobs (intervals), where every interval is given by a release date and a processing requirement. If an interval is accepted, it must be started at its release date. The task is to accept a maximum number of non-overlapping intervals. We consider the following online variant of this problem. An online algorithm knows the set of intervals. However, up to k of these intervals can fail, i.e., they cannot be accepted, and an online algorithm only learns that an interval fails at the time when it is released. An optimal offline algorithm knows all failing intervals beforehand and can compute an optimal solution for the remaining intervals. We analyze this setting by means of competitive analysis. We present competitive algorithms and provide lower bounds on the competitive ratio that can be achieved by deterministic and randomized online algorithms.

3 - The Online Bin Packing Problem Revisited
Chi-Fen Chang, Ping-Ting Lin, Chung-Shou Liao

Bin packing is one of the oldest classic NP-hard problems in the field of combinatorial optimization. The problem involves assigning a set of n items with positive sizes to bins of equal capacity such that the number of bins required is minimized. In this work, we consider a long-studied generalization of the bin packing problem, in which items are released one by one and each must be packed into a bin as soon as it arrives. The objective of this online bin packing problem is to determine a packing, without knowledge of the next items, that minimizes the total number of bins required. There have been many papers on this online packing problem since an elegant online strategy, called the Harmonic algorithm, was presented by Lee and Lee in 1985 [Journal of the ACM, 32(3):562-572, 1985]. After a long series of refinements of the Harmonic algorithm and its variants, Seiden extended the idea and proposed the Super Harmonic algorithm, whose asymptotic competitive ratio is the best known result to date [Journal of the ACM, 49(5):640-671, 2002]. This study revisits the properties of this online packing problem and investigates approximation algorithms for solving its multidimensional model.
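To make the online setting of the bin packing abstract above concrete, the sketch below implements the simple First-Fit rule, which irrevocably places each arriving item into the first open bin with enough room. It is only a baseline for illustration; the Harmonic and Super Harmonic algorithms discussed in the talk instead partition items into size classes and pack each class separately.

```python
# First-Fit baseline for online bin packing: each arriving item goes into the first open
# bin that still has room, otherwise a new bin is opened. Shown only to make the online
# setting concrete; it is not the Harmonic-type algorithm discussed above.
def first_fit(items, capacity=1.0):
    bins = []                                   # remaining capacity of each open bin
    for size in items:                          # items arrive one by one (online)
        for i, free in enumerate(bins):
            if size <= free + 1e-9:
                bins[i] = free - size
                break
        else:
            bins.append(capacity - size)        # no open bin fits: open a new one
    return len(bins)

print(first_fit([0.5, 0.7, 0.5, 0.2, 0.4, 0.2]))   # packs into 3 bins of capacity 1.0
```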


WC-03 where each of these parameters is small, it is natural to employ pa-  rameterized complexity analysis, that is, to measure the computational Wednesday, 13:10-14:40 - Fo3 complexity of a problem as a function of a multitude of input parame- ters. Computational Social Choice We discuss a few results from the DFG project "Parameterized Al- gorithmics for Voting Systems" using the problems "Lobbying" and Stream: Algorithmic Game Theory "Shift Bribery" as examples and show both (fixed-parameter) tractable Invited session and intractable parameterizations of the problems. Chair: Joerg Rothe In the Lobbying problem, we are given a binary multi-issue election where voters approve or disapprove multiple issues and an agent (the 1 - The Complexity of Manipulation, Bribery, Campaign lobby) may influence up to k voters. The Lobbying problem asks whether the lobby can choose k voters to be influenced so that as a Management, and Margin of Victory in Schulze, result each issue that is liked by the lobby gets a majority of approvals Copeland, Cup, Bucklin, and Fallback Voting and each issue that is disliked by the lobby gets a majority of disap- Joerg Rothe, Piotr Faliszewski, Yannick Reisch, Lena Schend provals. A central theme in computational social choice is to study the extent In the Shift Bribery problem, we are given an election (based on pref- to which voting systems computationally resist manipulative attacks erence orders), a preferred candidate p, and a budget. The goal is to seeking to influence the outcome of elections, such as manipulation ensure that p wins (under some specified voting rule) by shifting p (i.e., strategic voting), control, and bribery. Bucklin and fallback vot- higher in some voters’ preference orders. However, each such shift re- ing are among the voting systems with the broadest resistance (i.e., NP- quest comes at a price (depending on the voter and on the extent of the hardness) to control attacks. However, only little is known about their shift) and we may not exceed the given budget. behavior regarding manipulation and bribery attacks. We comprehen- sively investigate the computational resistance of Bucklin and fallback voting for many of the common manipulation and bribery scenarios; we also complement our discussion by considering several campaign management problems for Bucklin and fallback and by giving a survey  WC-04 of known results for Schulze, cup, and Copeland. Wednesday, 13:10-14:40 - Fo4 Relatedly, the margin of victory is a critical measure for the robust- ness of voting systems in terms of changing election outcomes due to Complex Scheduling errors in the ballots or fraud in using electronic voting machines. Ap- plications include risk-limiting post-election audits so as to restore the Stream: Project Management and Scheduling trust in the correctness of election outcomes. Continuing the work of Invited session Xia, we show that the margin of victory problem is NP-complete for Schulze and cup elections. We also consider the exact variant of this Chair: Stefan Bock problem, which we show to be complete for DP, the second level of the boolean hierarchy over NP, in Schulze, cup, and Copeland elections. 
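The Lobbying problem described above, from the DFG project on parameterized algorithmics for voting systems, can be made concrete with a brute-force check over all choices of k voters. The sketch is exponential in the number of voters and only illustrates the problem definition; it assumes that an influenced voter adopts the lobby's position on every issue, and all data are invented.

```python
# Brute-force check for the Lobbying problem defined above: can the lobby influence k
# voters (who then vote exactly like the lobby) so that every issue ends up with the
# majority the lobby wants? For tiny illustrative instances only.
from itertools import combinations

def lobbying_possible(ballots, lobby, k):
    """ballots: one 0/1 tuple per voter (issues as columns); lobby: desired 0/1 outcome."""
    n, m = len(ballots), len(lobby)
    for chosen in combinations(range(n), k):
        new_ballots = [lobby if v in chosen else ballots[v] for v in range(n)]
        ok = True
        for j in range(m):
            approvals = sum(b[j] for b in new_ballots)
            # strict majority of approvals if the lobby likes issue j, of disapprovals otherwise
            ok = ok and (approvals * 2 > n if lobby[j] == 1 else approvals * 2 < n)
        if ok:
            return True
    return False

voters = [(1, 0), (0, 0), (0, 1)]                      # three voters, two issues
print(lobbying_possible(voters, lobby=(1, 0), k=1))    # True: influencing voter 2 (or 3) suffices
```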
1 - Row-and-Column Generation for the Jobshop Scheduling Problem with min-sum Objective This paper is based on the work presented in the extended abstract that is to appear in the proceedings of the Thirteenth International Confer- Sarah Kirchner, Andreas Gebauer, Marco Lübbecke ence on Autonomous Agents and Multiagent Systems, May 2014, and The Job-Shop Scheduling Problem is widely studied for the makespan on as yet unpublished results by the authors. objective. In recent years there has also been research on exact solution approaches and lower bounds to the job shop problem with min-sum 2 - Path-Disruption Games: Bribery and a Probabilistic criteria (e.g. Lancia et al., 2011; Baptiste et al., 2008). We propose a Model new solution approach for the job shop problem minimizing the sum Anja Rey, Joerg Rothe, Adrian Marple of completion times. In our approach we bring together several ideas of recent research. First we adapt the arc-time indexed formulation Path-disruption games, introduced by Bachrach and Porat, are coali- of Pessoa et al. (2010) to the jobshop problem. In this formulation a tional games played on graphs where one or multiple adversaries each machine schedule is represented by a path through a network. There- seeks to reach a given target vertex from a given source vertex, while fore we have a flow formulation with side constraints. We solve this a coalition of agents seeks to prevent that from happening by block- problem with row and column generation (e.g. Sadykov and Vander- ing every path from the source to the target for each adversary. In- beck, 2011). That means we generate machine schedules for a sin- spired by bribery in voting, we introduce the notion of bribery for path- gle machine in the pricing procedure and add the corresponding arcs disruption games. Here, the adversary breaks into the setting and tries to the arc time indexed master problem formulation. This approach to change the outcome to her advantage by paying a certain amount of has the advantage that machine schedules can be recombined in the money, without exceeding a given budget. Now that the agents collab- masterproblem and we need to solve potentially less pricing problems. orate while, at the same time, they want to win against their adversary The proposed formulation can be further strengthened by adding prece- who can actively interfere with the situation in order to achieve her in- dence cuts (Christofides et al., 1987). First computational tests yield dividual goals in opposition to the agents, the game combines aspects promising results in terms of lower bounds and running times. of both cooperative and noncooperative game theory. We analyze the complexity of the problem of whether the adversaries can bribe some 2 - Complex Job Shop Scheduling with Regular Objec- of the agents such that no coalition will form that prevents the ad- tive versaries from reaching their targets. We show that this problem is Reinhard Bürgy, Heinz Gröflin NP-complete for a single adversary and complete for the second level In the last years, job shop scheduling problems that are more complex of the polynomial hierarchy for the case of multiple adversaries. Fur- - but also of wider applicability - than the classical Job Shop (JS) have thermore, we expand the model of path-disruption game by allowing found increasing interest, a prototypical example of such a Complex uncertainty about the target vertices. The agents do not know for sure Job Shop problem (CJS) being the Blocking Job Shop (BJS). 
While where the adversary is heading to and which paths to block. Rather, several methods have been developed for various CJS, most of them every vertex is a potential target the adversary seeks to reach with a consider as objective makespan minimization and very few address certain given probability. We study the complexity of problems related other objectives related for instance to due dates or flow time, that are to common solution concepts and other properties of such games. equally relevant in practice. 3 - On the Multivariate Complexity of some Bribery Prob- This talk addresses CJS problems with arbitrary regular objective. lems Building on previous work of the authors on CJS problems with makespan objective, the problem is formulated in a disjunctive graph Robert Bredereck and a local search method using a neighborhood based on job-insertion Voting scenarios arise whenever the preferences of different parties is proposed. A key feature is the ability to consistently and efficiently have to be aggregated to form a joint decision, for example in political generate feasible neighbor solutions, typically by moving a critical op- elections, group decisions, web site rankings, or multiagent systems. eration together with other operations whose moves are "implied’. Many voting problems turned out to be NP-hard which can also be a Numerical results are presented for the Job Shop (JS), the Job Shop desired property, for example in case of manipulation or bribery. Since with Setup Times (JSS), and the Blocking Job Shop (BJS) with the fol- voting problems carry many natural parameters such as the number of lowing four objectives: makespan, maximum tardiness, total weighted alternatives or the number of votes and there are real-world scenarios flow time, total weighted tardiness.


The results support the validity of the proposed method. Specifically, the results are competitive with current benchmarks in the JS and JSS, they substantially improve the current benchmarks in the BJS with makespan objective, and they establish first benchmarks in the BJS with the other three objectives.

3 - Lower Bounds based on decompositions for a single machine scheduling problem with sequence-dependent setup times and inventory constraints
Paul Göpfert, Stefan Bock

The policy of low inventories along the automotive supply chain leads, from the viewpoint of an automotive supplier, to a high number of customer demands and material supplies from his suppliers. The huge number of possible car configurations results, at the supplier's stage, in a high product variety for one functional component. In order to fulfill customer demands and to cope with the variety of its own product portfolio, the need for efficient production scheduling arises. We consider the case that the variations of the component are assembled on a single machine with sequence-dependent setup times. Moreover, the components can only be produced if enough materials are available. The objective is high production efficiency, measured by a schedule's total weighted completion time. In order to obtain lower bounds on the objective, the problem is decomposed into a master problem and several subproblems. The decomposition exploits the presence of precedence chains among the jobs and yields a partition of the job set. For every set of jobs out of this partition a subproblem is defined. In this talk we present different possibilities to define the subproblems as well as several classes of valid inequalities for the master problem. Therefore, a set of lower bounds can be obtained by applying a column generation approach to different compatible combinations of master and subproblem formulations. These bounds are tested on a variety of test instances and first computational results are presented.

 WC-05
Wednesday, 13:10-14:40 - Fo5

Combinatorial Algorithms

Stream: Discrete and Combinatorial Optimization, Graphs and Networks
Invited session
Chair: Mohamed MAIZA

2 - Improved Results on Multiple Sink Location Problems in Dynamic Path Networks
Yuya Higashikawa, Mordecai GOLIN, Naoki Katoh

This paper considers the k-sink location problem in dynamic path networks. In our model, a dynamic path network consists of an undirected path with positive edge lengths, uniform edge capacity, and positive vertex supplies. Here, each vertex supply corresponds to a set of evacuees. The problem then requires finding the optimal location of k sinks in a given path so that each evacuee is sent to one of the k sinks. Under the optimal evacuation for a given k-sink location, there exist k-1 vertices such that each one represents the boundary dividing all evacuees between two adjacent sinks into two groups, i.e., all supplies in one group evacuate to the left sink and all supplies in the other group evacuate to the right sink. We call these k-1 vertices a (k-1)-divider. Therefore, the goal is to find a k-sink location and a (k-1)-divider which minimize the maximum evacuation time or the total evacuation time over all supplies; these are denoted as the minimax problem and the minisum problem, respectively. We study the k-sink location problem in dynamic path networks with the continuous model, and prove that the minimax problem can be solved in O(kn) time and the minisum problem can be solved in O(n^2 min{k, 2^((log k log log n)^0.5)}) time, where n is the number of vertices in the given network. Note that these improve the previous results by Y. Higashikawa, M. J. Golin, and N. Katoh, "Multiple Sink Location Problems in Dynamic Path Networks", Proc. AAIM 2014 (to appear).

3 - Solving the Variable Sized Bin Packing Problem with incompatibility and cardinality constraints
Mohamed MAIZA, Sais Lakhdar, Mohammed Said Radjef

In this study we discuss a version of the classical one-dimensional bin-packing problem (BPP), where the objective is to minimize the total cost of heterogeneous bins needed to store a given set of items, each with some space requirement. The bins are grouped by categories, where each category is distinguished by its cost and capacity. In our study we consider conflict constraints where some of the items are pairwise incompatible and consequently cannot be packed together. This variant of the problem occurs in a number of business, industrial and transportation contexts within a supply chain, where one generally seeks the best transfer cost for diverse available products. We propose a pseudo-polynomial time algorithm for the Variable Sized BPP with Conflicts in the case where the size of each item is at least equal to one third of the largest bin capacity; these largest bins have the lowest unit cost. Our algorithm works as follows: after assigning each three compatible items with size equal to one third of the largest bin to the same largest bin, the algorithm solves the problem as a minimum-weight matching problem in the weighted compatibility graph, where nodes are the items and adjacent items are mutually compatible. The weight of a given edge is obtained by computing the minimum cost for packing the corresponding two items either in the same or in separate bins.

1 - Dynamic Programming for the Minimum Tour Duration Problem
Christian Tilk, Stefan Irnich

In this presentation, we consider a variant of the traveling salesman  WC-07 problem with time windows (TSPTW), called minimum tour duration Wednesday, 13:10-14:40 - Fo7 problem (MTDP), where the objective is the minimization of the tour duration. We present a new effective dynamic programming (DP) ap- Logic-Based Benders Decomposition proach to solve the MTDP. It is motivated by the DP-based solution approach of Baldacci et al. (2011), who successfully solve the TSPTW Stream: Discrete and Combinatorial Optimization, with a DP-based algorithm. For dealing with tour duration minimiza- tion, we will follow ideas presented in (Irnich 2008) to define con- Graphs and Networks sistent resource extension functions in order to apply effective domi- Invited session nance and bounding procedures. This is a non-trivial task because in Chair: Florian Dahms the MTDP at least two resources will depend on each other in a non- additive and non-linear way. Using relaxations to obtain lower bounds is common practice in routing problems, e.g., using a state-space re- 1 - Robust Scheduling with Logic-Based Benders De- laxation (Christofides et al. 1981). We present two new relaxation composition for the MTDP with two respectively one resource, which are attrac- Elvin Coban, Aliza R. Heching, John N. Hooker, Alan tive due to their low computational complexity. This and other relax- Scheller-Wolf ations can be combined with the ng-tour relaxation and the ngL-tour relaxation (Baldacci et al. 2011). To improve the lower bounds, we We study project scheduling at a large IT services delivery center in use two methods: First, we adapt a penalty method, first suggested by which there are unpredictable delays. We apply robust optimization Christofides et al. (1981) for solving a TSPTW with the objective of to minimize tardiness while informing the customer of a reasonable makespan minimization. Second, we generate the neighborhoods for worst-case completion time. Due to the impracticality of quantify- the ng-tour and ngL-tour relaxations dynamically, a technique success- ing joint probability distributions for delay times, we follow the recent fully applied for solving different routing problems (Bode and Irnich practice of using empirically determined uncertainty sets, which to our 2012). To our knowledge, we present the first exact algorithm for the knowledge have not been applied to service scheduling. To solve in- MTDP and provide computational results with optimal solutions on stances of a realistic size, we introduce a new solution method based many known benchmark instances for the TSPTW. on logic-based Benders decomposition. We show that when the un- certainty set is polyhedral, convexity properties of the problem allow


us to simplify the decomposition substantially, leading to a model of establish the existence of approximate equilibria for specific classes of tractable size that is suitable for a distributed computing environment. cost functions. For example for concave cost functions the factor is Preliminary computational experience indicates that this approach is at most 3/2, for quadratic cost functions 4/3, and for polynomial cost superior to a conventional mathematical programming model solved functions of maximal degree d it is at at most d + 1. For games with by state-of-the-art software. two players we obtain tight bounds which are as small as for example 1.054 in the case of quadratic cost functions. 2 - Curriculum based timetabling with room stability constraints using Logical Benders Cuts 4 - Congestion Games with Variable Demands Florian Dahms, Marco Lübbecke Max Klimm, Tobias Harks In the university course timetabling problem the goal is to find rooms We initiate the study of congestion games with variable demands in and timeslots for courses according to certain criteria (e.g., avoiding which the players strategically choose both a non- negative demand overlaps in student curricula, make good use of room capacity, etc.). and a subset of resources. The players’ incentives to use higher de- Traditionally this problem has been approached by many heuristic so- mands are stimulated by nondecreas- ing and concave utility functions. lutions but more recently also exact procedures have been proposed The payoff for a player is defined as the difference between the util- that are able to deal with large scale instances. In [G. Lach, M.E. ity of the demand and the associated cost on the used resources. Al- Lübbecke (2008)] an exact approach is shown that assigns timeslots though this class of non-cooperative games captures many elements of and rooms independently, thereby gaining a large benefit in efficiency, real-world applications, it has not been studied in this generality in the while ensuring feasibility for both problems using Halls Marriage The- past. Specifically, we study the fundamental problem of the existence orem. A drawback of this approach is, that it fails to properly deal with of pure Nash equilibria, PNE for short. We call a set of cost functions room stability constraints. We show how these constraints can be mod- C consistent if every congestion game with variable demands and cost eled using hypergraph matchings and describe a two stage procedure, functions in C possesses a PNE. We say that C is universally approx- similar to the aforementioned one, that can solve such instances. To do imately consistent if every such game has the rho-Finite Improvement so we use logical Benders cuts to coordinate the two problem stages. Property for every rho > 0. Our results provide a complete character- The generated cuts are derived from a modification of Halls Marriage ization of consistency and universally approximately consistency re- Theorem that is applicable for hypergraphs. The approach is demon- vealing that only affine and homogeneous exponential functions are strated on various problem instances, including some drawn from an consistent. En route, we obtain novel characterizations of consistency applied timetabling project at RWTH Aachen University. for congestion games with fixed but resource-dependent demands.
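As background to the approximate pure Nash equilibrium results above: in an unweighted congestion game, Rosenthal's potential function guarantees that best-response dynamics terminate in a pure Nash equilibrium. The toy sketch below runs such dynamics on a two-player, two-resource routing example; it illustrates only the exact, unweighted case, not the approximation arguments or the weighted and variable-demand settings of the talks.

```python
# Toy best-response dynamics for an unweighted congestion game; Rosenthal's potential
# guarantees termination in a pure Nash equilibrium in this exact, unweighted setting.
def player_cost(profile, resource_costs, player):
    load = {}
    for strategy in profile:                     # count how many players use each resource
        for r in strategy:
            load[r] = load.get(r, 0) + 1
    return sum(resource_costs[r][load[r] - 1] for r in profile[player])

def best_response_dynamics(strategies, resource_costs, profile):
    improved = True
    while improved:
        improved = False
        for p in range(len(profile)):
            current = player_cost(profile, resource_costs, p)
            for s in strategies[p]:
                trial = list(profile)
                trial[p] = s
                if player_cost(trial, resource_costs, p) < current - 1e-12:
                    profile, improved = trial, True
                    break
    return profile

# two players each route one unit over resource 'a' or 'b'; c_a(x) = x, c_b(x) = 1.5
costs = {'a': [1, 2], 'b': [1.5, 1.5]}               # cost under load 1 and 2
strategies = [[('a',), ('b',)], [('a',), ('b',)]]
print(best_response_dynamics(strategies, costs, [('a',), ('a',)]))   # ends in a Nash equilibrium
```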

 WC-08  WC-09 Wednesday, 13:10-14:40 - Fo8 Wednesday, 13:10-14:40 - SFo1 Congestion Games Nonlinear Optimization I Stream: Algorithmic Game Theory Stream: Continuous and Non-linear Optimization Invited session Invited session Chair: Max Klimm Chair: Michael Herty Chair: Simone Göttlich 1 - Computing Approximate Pure Nash Equilibria Alexander Skopalik 1 - Generalized Nash Equilibrium Problems in Banach Spaces: Theory and Nikaido-Isoda-Based Path- Among other solution concepts, the notion of the pure Nash equilib- Following Methods rium plays a central role in Game Theory. Pure Nash equilibria in a Michael Hintermueller game characterize situations with non-cooperative deterministic play- ers in which no player has any incentive to unilaterally deviate from A class of non-cooperative Nash equilibrium problems is presented, the current situation in order to achieve a higher payoff. Unfortu- in which the feasible set of each player is perturbed by the decisions nately, it is well known that there are games that do not have pure of their competitors via an affine constraint. For every vector of deci- Nash equilibria. Furthermore, even in games where the existence of sions, the affine constraint defines a shared "state" variable. Due to the equilibria is guaranteed, their computation can be a computationally presence of an additional constraint on the state, the problem cannot hard task. Such negative results significantly question the importance be reduced to the classical setting as considered in Nash’s work. The of pure Nash equilibria as solution concepts that characterize the be- existence of an equilibrium for this problem is demonstrated, first or- havior of rational players. Approximate pure Nash equilibria, which der optimality conditions are derived under a constraint qualification, characterize situations where no player can significantly improve her and a numerical method is proposed, which involves the solution of a payoff by unilaterally deviating from her current strategy, could serve sequence of penalized problems. A new type of path-following strat- as alternative solution concepts provided that they exist and can be egy using a value function based in part on the Nikaido-Isoda function computed efficiently. We discuss recent positive algorithmic results for is proposed to accelerate the outer loop. approximate pure Nash equilibria in unweighted and weighted conges- tion games. 2 - An inexact trust-region algorithm for nonlinear pro- 2 - Price of Stability in Congestion Games gramming problems with dense constraint Jaco- Martin Gairing, Giorgos Christodoulou bians Andrea Walther, Lorenz T. Biegler During the last decade, the quantification of the inefficiency of game- theoretic equilibria has been a popular and successful line of research. There is a wide range of applications where the derivative matrices of The two most widely adopted measures for this inefficiency are the the corresponding minimization problems are of rather small size but Price of Anarchy (PoA) and the Price of Stability (PoS). In this talk I dense. Examples for such a setting are Periodic Adsorption Processes will summarise recent results on the PoS in congestion games. (PAPs). Here, the purity of the product or the energy consumption serve as target function. Additionaly, the state of the system is de- 3 - Approximate pure Nash equilibria in weighted con- scribed by general nonlinear equality constraints. 
As a consequence, gestion games when using well-established techniques the run-time needed for the op- timization process may be dominated significantly by the computation Christoph Hansknecht, Alexander Skopalik, Max Klimm of the dense Jacobian and its factorization. We study the existence of approximate pure Nash equilibria in This talk presents an alternative approach, namely an inexact trust- weighted congestion games. We develop techniques to obtain approxi- region SQP algorithm. The proposed method does not require the mate potential functions that prove the existence of alpha-approximate exact evaluation of the constraint Jacobian or an iterative solution of pure Nash equilibria and the convergence of alpha-improvement steps. a linear system with a system matrix that involves the constraint Ja- We show how to obtain upper bounds for approximation factor alpha cobian. Instead, only an approximation of the constraint Jacobian is for a given class of cost functions. We demonstrate our techniques and required. Furthermore, it is assumed that an exact representation of


the nullspace of the constraint Jacobian at the current iterate can be for decision makers over deterministic approach. Moreover, it is con- evaluated in a fixed finite number of steps if necessary. Corresponding cluded by extensive computational experiments that solution qualities accuracy requirements for the presented first-order global convergence in terms of satisfaction ratio, profit and capacity loss varies under dif- result can be verified easily during the optimization process to adjust ferent fuzzy operators the approximation quality of the constraint Jacobian and its nullspace representation. First numerical results for this new approach are dis- 3 - On the representation of the search region in multi- cussed. ple objective optimization 3 - Modeling and optimization of supply chains Renaud Lacour, Kathrin Klamroth, Daniel Vanderpooten Michael Herty, Simone Göttlich Given a set N of feasible points of a multi-objective optimization We discuss recent advances in the modeling, simulation and optimiza- (MOO) problem, the search region corresponds to the part of the ob- tion of supply chains. The interest is in macroscopic descriptions of jective space containing all the points that are not dominated by any phenomena related to production. Several approaches based on con- point of N. We consider an alternative representation of the search re- tinuous formulation of the model and suitable discretization leading gion by a set of tight local upper bounds (in the minimization case) to a variety of problems from nonlinear optimization to mixed-integer that can be derived from the points of N. Determining efficiently such problems. Recent results and relations are discussed in this talk. local upper bounds is a crucial issue when designing methods for gen- erating or approximating the nondominated set. We present several equivalent definitions of local upper bounds and show their usefulness in MOO. The existence of local upper bounds is supported by a first in- cremental approach which eliminates redundancies among local upper  WC-11 bounds. We also study some properties of local upper bounds, espe- Wednesday, 13:10-14:40 - SFo3 cially concerning the issue of redundant local upper bound, that give rise to an incremental approach which avoids such redundancies. Fi- nally, we bound the worst case number of local upper bounds and dis- Multi-objective Optimization cuss computational experiments that compare the practical efficiency of the presented approaches. Stream: Decision Theory and Multi-Criteria Optimiza- tion Invited session Chair: Renaud Lacour  WC-12 1 - Portfolio optimization without sufficient statistics Wednesday, 13:10-14:40 - SFo4 Sigifredo Laengle, Fabián Flores-Bazán, Fernando Flores-Bazán Airport Operations Scheduling I Important efforts contributing to Extreme Value Theory lead to determine (sufficient) statistics that reduce an infinite-dimensional Stream: Project Management and Scheduling multi-criteria problem to an equivalent problem defined in a finite- Invited session dimensional space. Is it possible to address the original problem with- Chair: Erik Demeulemeester out calculating statistics for optimizing? Vector Optimization Theory successfully solves infinite-dimensional problems, it however requires preference cones to have a non-empty interior (i.e., the property of 1 - A two-stage mixed integer programming approach solidness), which is not true in some relevant spaces of random vari- for optimizing the skill mix and training schedules for ables. 
Fortunately, a recent theoretical work (still in press) restores the optimization results without requiring solidness property. We ap- aircraft maintenance ply this work to a portfolio optimization problem in a space of ran- Philippe De Bruecker, Jeroen Belien, Jorne Van den Bergh, dom variables, whose cones do not satisfy solidness. This document Erik Demeulemeester presents a summary of the application. We obtain the solution to the original problem without the need of defining statistics. Finally, we This paper presents a two-stage mixed integer programming approach propose to deepen our analysis, as it is possible to extend this appli- for optimizing the skill mix and training schedule at an aircraft main- cation to a new scalarization procedure, which requires neither linear tenance company. In this study, we only focus on the line mainte- spaces nor transitive preferences. nance which takes place at the gate or parking ramp between the arrival and departure of an aircraft. Since different aircraft have different fea- 2 - A Stochastic Multi-objective Backbone Selection and tures and can show different problems, only adequately skilled workers Capacity Allocation Problem under Different Fuzzy should be assigned to maintain certain flights. Hence, a good person- Operators nel schedule should make sure that all flights can be maintained in time Mahir ATMIS, Hasan Huseyin TURAN with the available workers and their respective skills. While a higher skilled workforce increases flexibility and can lead to The purpose of this study is to propose a mathematical model that has cheaper schedules, the required training can become very expensive. ability of solving a multi-objective optimization problem for a telecom- The first step is therefore to make a trade-off between cheaper ros- munications bandwidth broker who acquires and sells bandwidth un- ters that require higher skilled workers and the training costs to obtain der uncertain market environment. The model seeks for two impor- this higher skilled workforce. The second step is to design an optimal tant goals: maximizing expected profit and minimizing expected loss training schedule to obtain the optimal skill mix with minimal costs. capacity with some constraints such as backbones’ capacity and the This second model determines the exact timing of the training for each Quality of Service (QoS) expectations of end-users’. The presence of worker and takes into account that a worker is unavailable to work dur- vagueness and randomness of information makes applying the fuzzy ing his training periods. set theory and stochastic programming techniques more convenient to deal with the non-deterministic nature of telecommunication network We illustrate our approach with a computational experiment based on setting. The proposed model simultaneously considers the random- real life data of Sabena Technics, a large aircraft maintenance company ness in demand and determines the allocation of end-users’ bandwidth located at Brussels Airport in Belgium. Experiments first demonstrate requests into acquired capacity. In the solution phase of the model, that our models succeed in finding low cost schedules in reasonable max-min, weighted additive and weighted max-min fuzzy operators time. Second, we illustrate the benefits of training by comparing a sce- are used in order to solve resulting probabilistic linear programming nario without training to a scenario with training. The results show model. 
To evaluate the effectiveness of suggested model, well-known how our approach can make a good trade-off between cheaper rosters measure of uncertainty effect, the value of the stochastic solution is that require higher skilled workers and the training costs to obtain this modified and used. Randomly generated test scenarios are tested and higher skilled workforce. results are obtained to provide managerial insights to decision makers. It is observed that proposed model provides more profit, satisfaction ratio and less capacity loss compared to deterministic methodology. 2 - Flexible project scheduling motivated by aircraft It is shown that increasing uncertainties (increasing number of sce- maintenance narios and variance) makes fuzzy stochastic approach more attractive Lukas Berthold


Classical project scheduling is based on the assumption of a fixed exact methods for solving the sub-problems defined in the POPMU- project structure. However, since this assumption is too strict for SIC template highlight an interoperation between metaheuristics and many practical applications, several models have been introduced to mathematical programming techniques, which provide a new type of cope with flexibility in project networks. Against this background Approach for this problem. Computational experiments reveal excel- we present a different approach which has been motivated by aircraft lent results outperforming best approaches known to date. maintenance. Here the extensive maintenance procedure of an aircraft (especially of the aircraft engines) can be regarded as a project. A 2 - Primal Heuristics for Multi-Stage Mixed Integer Pro- special feature of aircraft maintenance is given by the possibility to grams exchange components between different aircrafts. In this way addi- Christian Puchert, Marco Lübbecke tional working time caused by faulty components can be transferred between different aircrafts, making it possible to reduce ground time In various industrial applications, one is encountered with a multi-stage of aircrafts. In a first step towards capturing this situation we define so optimization problem. This is in particular the case when a problem is called "insertion networks": Here the nodes are partitioned in several formulated over a time horizon - as, e.g., lot sizing problems or re- classes only one of which corresponds to a set of "normal" activities. source allocation problems. Each of the other ones, say class i, represents positions on which a cer- When such a problem is formulated as a MIP, its coefficient matrix tain number of activities of type i have to be inserted. The objective is then has a so-called staircase structure, i.e. it breaks down into subse- to find an insertion of the activities which minimizes the makespan of quent, pairwise connected blocks. In the recent past, we have managed the network. At first we study the case without resource constraints. In to find a staircase structure in an arbitrary MIP - even if it is not a priori contrast to critical path method, our problem turns out to be strongly known to be a multi-stage problem. NP-hard even after dropping the resource constraints. However, for a special case we present a polynomial time algorithm based on comput- Besides exact algorithms, heuristic algorithms have been devised for ing maximum Sperner families. In a second step we include resource multi-stage problems. Such heuristics are e.g. the rolling horizon or constraints and discuss how to adapt RCPSP algorithms to our prob- the fix-and-relax heuristic. lem. We apply generic versions of these heuristics on general MIPs from the MIPLIB2010 library as well as on instances of multi-stage prob- 3 - Military aircraft mission planning: a column- lems from the literature. In particular, we compare them against clas- generation approach sical generic MIP heuristics and investigate whether the knowledge of Jorne Van den Bergh, Torbjörn Larsson, Nils-Hassan a structure is of any advantage when searching for feasible solutions. Quttineh, Jeroen Belien On the other hand, we also exploit staircase structures in generic MIP heuristics, in particular diving heuristics. Diving heuristics iteratively We introduce a military aircraft mission planning problem where a branch on a variable and solve the resulting LP. 
In our diving heuris- given fleet of aircraft should attack a number of ground targets. Due to tics, the blocks of the MIP (which, depending on the problem, may be the nature of the attack, two aircraft need to rendezvous at the target, interpreted as time steps) are taken into account when selecting vari- that is, they need to be synchronized in both space and time. At the ables. We give a comparison between block-wise diving heuristics and attack, one aircraft is launching a guided weapon, while the other is classical ones. illuminating the target. Each target is associated with multiple attack and illumination options. For each attack option the expected effect Preliminary results show that our heuristics are successful on multi- on the target is given. Further, there may be precedence constraints stage problems from the literature as well as on generic MIPs. between targets, limiting the order of the attacks. The objective is to maximize the outcome of the entire attack, while also minimizing the mission timespan. The problem is formulated as a mixed integer lin- ear programming (MILP) model and can be characterized as a gener- alized vehicle routing problem with synchronization and precedence WC-15 side constraints. Finding optimal solutions through direct application  of a general MIP solver is only practical for scenarios of moderate Wednesday, 13:10-14:40 - SFo11 sizes. Even for problem instances including only five targets, it takes CPLEX several hours to verify optimality, although it is able to find Forecasting for Business Analytics I feasible and near-optimal solutions much earlier. Therefore, we pro- pose a Dantzig-Wolfe decomposition and solve the resulting problem Stream: Statistics and Forecasting by a column generation approach. A column represents a predefined Invited session sequence of targets and tasks for one aircraft with the corresponding time periods in which the targets are visited. To generate columns Chair: Xi Chen we solve longest path subproblems with capacity and precedence side constraints. We compare the column generation approach with a direct 1 - Forecasting intermittent demand with generalized application of CPLEX on the MILP formulation. state-space model Kei Takahashi, Marina Fujita, Kishiko Maruyama, Toshiko Aizono, Koji Ara We propose a method for forecasting intermittent demand with gen-  WC-13 eralized state-space model using time series data. Forecasting in- Wednesday, 13:10-14:40 - SFo9 termittent demand exactly is important for manufacturers, transport businesses and retailer, because of diversification of consumer prefer- ence and small lot production of many products by the diversification. Matheuristics There are many models for forecasting intermittent demand. Croston’s model is one of the most popular one for intermittent demand fore- Stream: Heuristics, Metaheuristics, and Matheuristics casting, which has many variant models, log-Croston, modified Cros- Invited session ton and other models. However, Croston’s model has inconsistency Chair: Stefan Voss on its assumptions pointed out by Shenstone and Hyndman (2005). Additionally, Croston’s model generally needs round-up approxima- tion on the inter-arrival time for estimating parameters from discrete 1 - A Matheuristic Approach for the Berth Allocation time-series data. We employ non-Gaussian nonlinear state space mod- Problem els to forecast intermittent demand. 
Specifically, we employ mixture Stefan Voss, Eduardo Lalla Ruiz of zero and Poisson distributions, because occurrence of intermittent phenomenon implies low average demand. As well as in DECOMP The Berth Allocation Problem aims at assigning and scheduling in- (Kitagawa, 1986), time series are broken down into steady, seasonal, coming vessels to berthing positions along the quay of a container autoregression and external terms in our model. Therefore, with ordi- terminal. This problem is a well-known optimization problem within nal maximum likelihood estimators, we cannot obtain parameters be- maritime shipping. In order to address it, we propose two POPMU- cause the number of parameters excesses the number of data owing SIC (Partial Optimization Metaheuristic Under Special Intensification to non-stationary assumptions on parameters. Then, we adopt Beyes Conditions) approaches that incorporate an existing mathematical pro- framework, which is similar to DECOMP. However, we employ par- gramming formulation for solving it. POPMUSIC is an efficient meta- ticle filter for filtering method instead of Kalman filter in DECOMP heuristic that may serve as blueprint for matheuristics approaches once owing to non-Gaussianness on the system and observation noises and hybridized with mathematical programming. In this regard, the use of nonlinearity in these models. To show the superiority of ours to other


typical intermittent demand forecasting method, we will conduct com- 1 - Increased Capacity in Pipeline Transport through parison analysis using actual data in the grocery store. Flexibility in Supply Distribution 2 - A queuing model and its application to software re- Robert Schwarz, Benjamin Hiller, Claudia Stangl juvenation Michael Grottke, Jing Zhao, Kishor Trivedi, Javier Alonso, Transmission system operators are faced with the task of routing nat- Yanbin Wang ural gas through a network of pipelines. The friction in the pipelines lead to pressure loss, that must be compensated by compressor ma- "Software aging" relates to the phenomenon that during operations a chines to enable transportation along far distances. In addition to com- software system may show an increasing failure rate that cannot be pressors, active components such as valves and regulators are used to attributed to changes in the user behavior or the software code. Typ- control the flow of gas and sustain feasible operation within technical ically, software aging is due to the accumulation of error states (such and contractual limits. The operators may also request to change the as leaked memory) inside the running system. Periodically removing supply distribution at entries through contractual means. For example, these error states (e.g., via system reboots or application restarts) can the supply at one entry might be reduced, but then another entry has help prevent future aging-related failures of the software system, thus his supply increased to retain a balanced nomination of flow. The sup- increasing its reliability. This proactive technique has been known as pliers then react in the manner of a Stackelberg game and can choose "software rejuvenation". Since the accumulation of internal error states the supply nodes used for rebalancing. This game is modeled as a is often accompanied by progressive software performance degrada- bilevel optimization problem, extending an MINLP formulation of the tion, the measured response time of the software system can be used feasibility problem for stationary gas transport. We study the impact to detect the onset of software aging, and to determine when to trigger of these contractual means on the transport capacity of the networks software rejuvenation. under different scenarios of opponent behavior. Preliminary computa- In this talk, we propose to model the widely-used Apache HTTP server tional results are presented. as a finite-server queue with Poisson arrivals and service times that fol- low a two-stage Erlang distribution. For this queuing model, we first 2 - Discrete Optimization Methods for Pipe Routing derive the steady-state probabilities, which to the best of our knowl- edge have not been available in the existing literature. We then obtain Jakob Schelbert, Lars Schewe closed-form expressions for the response time distribution as well as its moments, using this information to validate our model. Finally, we Routing a pipe through a power plant is a difficult task as one has present a phase-type-distribution approach to calculating the cumula- many possibilities for a given discretization of the design space fol- tive distribution function of the sample average of response times. The lowing the ground structure approach. The problem combines discrete quantiles of this distribution are employed in our distribution-based aspects and nonlinear constraints that model the physics of the pipe. 
rejuvenation algorithm (DBRA), which uses the mean of observed re- To cope with real world technical restrictions we propose an approach sponse times for deciding when to rejuvenation the system. In simu- based on a second-order cone model that relaxes these restrictions and lations, we compare the performance of the DBRA with the one of a a decomposition procedure that is able to handle them explicitly. previously-suggested algorithm. In the past time the wide field of truss design with linear elasticity has 3 - Sometimes two wrongs can make a right - Combin- been discussed from a mostly nonlinear optimization point of view. In ing forecasting and queueing models for call centre our case conventional truss topology optimization methods are not di- rectly applicable due to discrete constraints that force the pipe to form staffing a path or even a Steiner tree. In addition to the self-weight we consider Xi Chen, Robert Fildes, Dave Worthington the placement of hangers that provide support for the pipe. Starting Call centre staffing is important as the workforce accounts for 60-70% from a rough outline of the admissible region and inlet and outlet points of the operating cost of a call centre. The staffing procedure involves we derive a discretized set of potential pipe elements that are modeled two distinct but interrelated research areas: a) forecasting the call ar- as Timoshenko beams. The problem now consists of finding the opti- rival rates and b) modelling the call centre as queueing system to decide mal routing of the pipe, considering also operational costs, while com- on staffing levels, using the forecast arrival rates. plying with technical restrictions. The former can be modeled with the We introduce a geometric discrete time modelling (Geo-DTM) ap- use of an extended graph, while the latter are either relaxed to obtain a proach and use it with an iterative-staffing algorithm (ISA) to deter- binary SOCP model. Explicitly including the technical restrictions on mine staffing levels. Empirical tests show that under perfect knowledge the other hand leads to non-convexities which can be addressed via a of arrival rates, there are many benefits of using the Geo-DTM+ISA decomposition algorithm. We also provide hardness results of the oc- method compared to steady-state staffing methods. curring problems and show how special tailored cutting planes can be used to accelerate the solving process. Numerical results for academic With simulated call arrivals data, we evaluate the effects of forecasting and real world test instances are presented. errors on call centre performance using various forecasting models and the Geo-DTM+ISA for staffing. The results show that even with a good quality dynamic queueing model (Geo-DTM+ISA), better forecasting 3 - Decomposition in Gas Network Planning Under Un- accuracy does not necessarily translate into better service levels. The certainty system performance exhibited depends on a combination of factors. Jonas Schweiger We also study the combined effects on call centre performance in the likely practical case where both forecasting and queueing models Gas transmission networks are complex structures that consist of pas- are suboptimal. Our results show that under a quality driven service sive pipes and active, controllable elements such as valves and com- regime, stationary models perform similarly to Geo-DTM+ISA. How- pressors. 
Today’s gas markets demand more flexibility from the net- ever, under an efficiency driven service regime, the stationary based work operators which in turn have to invest into their network infras- staffing methods perform much worse than Geo-DTM+ISA, although tructure. As these investments are very cost-intensive and long-living, both are affected by forecasting errors. Insights from the empirical network extensions should not only focus on one bottleneck scenario, results are used to provide guidance for call centre workforce manage- but should increase the flexibility to fulfill different demand scenarios. ment. What proves important is the interaction between uncertainty, Thereby we consider several ways of extending the network: by new forecasting accuracy and the call centre planning system. No element pipes between points without a prior direct connection, by building a should be analysed in isolation. pipe next to an existing pipe, or by adding active elements to the net- work. In this presentation, we formulate a model for the network ex- tension problem for multiple demand scenarios and propose two solu- tion strategies. First, we decompose along the scenarios and coordinate the search by a Branch&Bound procedure. We solve MINLP single-  WC-16 scenario subproblems and obtain valid bounds even without solving Wednesday, 13:10-14:40 - SFo14 them to optimality. Second, Dantzig-Wolfe decomposition is used to decouple the scenarios. Optimal Design and Operation of Pipeline Networks 4 - Continuous Reformulation Techniques for Mixed- Integer Nonlinear Optimization of Gas Compressor Stream: Energy and Environment Stations Invited session Martin Schmidt, Daniel Rose, Marc Steinbach, Bernhard Chair: Lars Schewe Willert


When considering cost-optimal operation of gas transport networks, or financial consulting firm together with the individual needs of an in- compressor stations play the most important role. However, modeling vestor. Motivated by these developments, the present work focuses on of these stations lead to complicated mixed-integer nonlinear and non- the evaluation of software-based depot optimization and -monitoring convex optimization or feasibility problems. In this talk, MINLP and for investment advisory services. For this purpose, the following re- GDP models as well as continuous reformulations of an isothermal and search question will be answered and discussed: "What are the of- stationary variant of the problem are discussed. The applicability and fered possibilities of different software-based depot optimization and importance of different model formulations, especially those without —monitoring tools and how reasonable are the optimization results?’ discrete variables is demonstrated by an extensive computational study Based on an exemplary portfolio various optimization results of advi- on real-world instances. sory tools are critically evaluated based on scenarios for different risk types. All scenario results are represented graphically in the form of risk-performance charts. The scenario analysis of the two tested tools show, as with all methods and algorithms, the results strongly depend on the entered parameters. Overall, the tools provide according to the  WC-17 settings valid optimization results. Wednesday, 13:10-14:40 - 001 Risk and Uncertainty Stream: Finance, Banking, Insurance, and Accounting  WC-19 Invited session Wednesday, 13:10-14:40 - I Chair: Rouven Wiegard Rail Transportation 1 - Multiperiod Maximum Loss Is Time Unit Invariant Stream: Traffic and Transportation Thomas Breuer, Raimund Kovacevic Invited session Time unit invariance is introduced as an additional requirement for Chair: Małgorzata Grela multiperiod risk measures: for a constant portfolio under an iid risk factor process, the multiperiod risk should equal the one period risk of the aggregated loss, for an appropriate choice of parameters and inde- 1 - Real-Time Train Dispatching Using Mixed-Integer pendent of the portfolio. Multiperiod maximum loss over a sequence Linear Programming of Kullback-Leibler balls is time unit invariant, whereas multiperiod Frederic Weymann Value at Risk is not. The railway system is vulnerable to external disturbances which inter- 2 - Credit Decisions under Risk and Uncertainty rupt the regular operation and cause delays. These delays can propa- Arndt Claußen, Daniel Roesch gate through a larger part of the railway network if a train dispatcher does not quickly reschedule the operation. Train dispatching is cur- Credit risk modeling is generally based on a stochastic framework, rently performed by hand. Most dispatchers rely on their experience with implicit assumptions about the applied model, stochastic vari- and simple dispatching rules when they deal with disturbances. The ables and parameters. The loss of a portfolio is often described as a usage of optimization methods could result in a significant reduction random variable, and credit risk is expressed by appropriate risk mea- of delays and energy consumption. This paper presents OptDis, a novel sures. However, estimating the model parameters induces estimation method for solving the important dispatching problem of detecting and errors, leading to errors in risk measures. 
These estimation errors are resolving occupation conflicts. These conflicts arise when two trains generally treated as random variables, even though this is equivalent attempt to occupy the same infrastructure at the same time. OptDis to being neutral to uncertainty in the sense of Knight (1921). Thus, assists dispatchers by resolving conflicts using a mixed-integer lin- given the large estimation errors in credit risk, and empirical evidence ear programming approach. The real-world feasibility of the solutions which shows that agents are actually not neutral to uncertainty (Ells- found by OptDis is ensured by a detailed consideration of minimum berg, 1961), we present a framework on how parameter uncertainty can running times and of the interlocking system. The newly developed be considered in any credit risk model. simulation tool LUKS-D is used to evaluate OptDis. This tool per- Following Garlappi (2007), we model non-neutrality to uncertainty us- forms simulations which are much closer to the real-world operation ing an optimization problem subject to specified confidence intervals than previous approaches. An experiment is presented which simulates around the parameter estimates. In an empirical study we use default the region around Bern, a real-world dispatching area from Switzer- data from S&P’s and Moody’s from 1970 to 2012 to assess how the land. inclusion of parameter uncertainty impact real-world applications. We demonstrate that many published alternative approaches that deal 2 - Crane areas and move sequences in rail-road trans- with parameter uncertainty are also covered by our approach. In ad- shipment yards dition, we observe that previous approaches inadvertently model in- Xiyu Li, Alena Otto, Erwin Pesch vestors as affine to uncertainty. Our analysis of historic default data shows that portfolios with particularly high ratings are more affected by parameter uncertainty than portfolios with lower ratings. In conclu- Rail-road transshipment yard is an important entity of intermodal sion, our framework (1) has a solid axiomatic foundation, (2) can be transportation system. In the yard gantry cranes are used to move con- applied to any credit risk model, (3) uses the full information included tainers, e.g. from trains to trucks and vice versa. In practice, we often in available datasets, and (4) can quantify and compare the degree of assign a fixed work area for each crane to avoid crane interferences. uncertainty aversion. We aim to determine the size of the work areas to balance the work- loads for all cranes, so as to minimize the makespan. To evaluate the 3 - Evaluation of a software-based investment advisor workloads accurately it’s necessary to consider the move sequence of containers, so it’s required to solve a sequencing problem with setup for depot optimization and -monitoring times. Also consider arrival and departure time of trucks, we formulate Rouven Wiegard, Michael H. Breitner this sequencing problem as ATSP with time windows and precedence relations. A exact solution procedure is provided to solve the both The structure of the age pyramide is constantly shifting due to the de- problems together, i.e. determining fixed crane areas and obtaining mographic change in Germany. The citizens bear higher responsibil- move sequences for each crane. We also generate random instances, ity for investments and retirement decisions and simultaneously the which simulate real-world transshipment yards, to test our method. 
In complexity of the financial markets and the range of available finan- the end, our experimental results are reported and compared to another cial products are steadily increasing. Limited financial knowledge and crane area solution, which is obtained by an algorithm without consid- simplified decision rules can be wrong and result in negative effects on ering the crane move sequences, to show the acceleration effect of our future consumption possibilities. The question is how an investor can method. reduce the discrepancy between complex financial decisions and lack of financial expertise. Intelligent software solutions that support the processes at the portfolio level can relieve the financial advisor enor- 3 - Single Track Railway Problem mously. In a software-based portfolio optimization the financial ad- Małgorzata Grela, Jacek Blazewicz, Grzegorz Pawlak, Gaurav viser has the key role to bring the resources and restrictions of the bank Singh


This paper considers the optimization problem of trains schedule for typically involves substantial uncertainty, we also investigate on the single track railway. The main motivation for this work is a need of ef- issue of information robustness to find plans that are sufficiently insen- ficient utilization of railway infrastructure available in Australia, thus sitive to the level and quality of forecast information. A scaled-down the mathematical modelling inspired by practical issue is developed. industry example is used to evaluate the presented approach. The model presented in the paper aims to maximize number of trains that could carry out the route in the given time interval, avoiding dead- 3 - Order release within operative hierarchical produc- locks. Firstly a basic model which allows optimization work is de- fined, under assumptions regarding time, route, capacities of the sta- tion planning and control using product specific tions, intervals between stations and velocity of the trains. Therefore clearing functions the minimal time consuming model with finite capacities is developed Frank Herrmann, Frederick Lange, Michael Manitz and presented, referring to particular conditions. Measurements of the actual lead times in production systems show a non-linear relationship between the workload and the output of these systems which can be modeled by queuing models (see e.g. publica- tions by Missbauer and Uzsoy (in 2002 and in 2011)). Competition WC-20 between jobs for bottleneck stations cause a network of queuing mod-  els (discussed e.g. by Haskose in 2002). Due to their complexity, Wednesday, 13:10-14:40 - II several authors (e.g. Asmundsson in 2009) recommend clearing func- tions (CFs) as an alternative. Several usages of CFs are proposed in Robust Production and Distribution recent research. One is the control of the order release within the op- Planning erative hierarchical production planning and control (see e.g. Kacar in 2012 or Missbauer in 2011). For this, Ravindran introduced in 2011 a single product planning model to estimate order releases. The CF Stream: Production and Operations Management is used as limitation of the capacity and determines the output of a Invited session production system depending on its workload. A predetermined alpha Chair: Gerd J. Hahn service level, which denotes the probability of no stockout within an order cycle, has to be fulfilled. The demands are normally distributed and the mean values as well as the standard deviation values vary over 1 - A multi-criteria approach to outsourcing decision- time, which cause (additional) uncertainty. In this contribution we ex- making in stochastic manufacturing systems tend the order release planning model of Ravindran to deal with mul- Nico Vandaele, Catherine Decouttere, Gerd J. Hahn, Torben tiple products as well as with product specific CFs. We introduce a Sens simulation-based method to derive product specific CFs for a specific production system. As test problem we use a scaled down produc- The issue of contract manufacturing has grown in importance due to tion system based on an electrical parts manufacturer in Germany. To the increasing trend of companies that reduce internal value added or analyze the extended order release planning model we perform long even deliberately focus on their core competencies out-side of the oper- term simulation experiments in a rolling and overlapping planning en- ations and manufacturing domain. We study contract manufacturing at vironment. 
The results outperform a basestock policy as an alternative the strategic-tactical level and approach the topic from a multi-criteria procedure. decision-making perspective since this topic involves service, cost, quality, and especially more long-term value-related aspects. Value- related aspects can include, among others, intellectual property con- cerns, reputational risks, in-house expertise retention or legal require- ments etc. To arrive at a well-balanced outsourcing decision with re- spect to the aforementioned multiple dimensions, we apply a scenario-  WC-21 based approach at the strategic level and make use of two types of Wednesday, 13:10-14:40 - III Key Performance Indicators (KPIs): model-based KPIs and KPIs that are derived in an independent assessment from multiple stakehold- Optimization Modeling I ers. A linear programming approach is applied at the strategic level to solve the multi-criteria multi-stakeholder decision problem. The Stream: Software Applications and Modelling Systems model-based KPIs are handled through the tactical model which man- ages the trade-off be-tween service and cost. For this purpose, we Invited session apply an Aggregate Production Planning model to coordinate internal Chair: Sonja Mars operations with external contract manufacturing. A queuing network- based approach is integrated to anticipate the stochastic and dynamic behavior of the internal shop floor environment. This gives us the op- 1 - What you can learn from a Gurobi log file portunity to balance aggregated customer order lead times vs. total Sonja Mars costs of operations when deciding on volume and mix of internal vs. external production. A numerical case example involving different out- Log files for optimization solvers can sometimes be confusing: sourcing scenarios is applied to highlight the benefits of the approach. columns of numbers, some increasing and some decreasing, with few words to explain their meanings. We will show you how to pull useful 2 - Robust Tactical Planning in Automotive Supplier Net- information out of a Gurobi log file. We’ll talk about what is happen- works ing behind the scenes and provide hints on how a Gurobi log file can Gerd J. Hahn, Svenja Hoffmann-Fölkersamb, Sven Woogt help you to improve solver performance. Examining log files often leads to useful parameter changes. Default Suppliers in the automotive industry currently face two major require- parameter values are chosen to be efficient on average. They work well ments from their customers: (i) increasing flexibility to deal with across a broad set of models, but not on all models. We will present fluctuating demand volumes while following a leveled production ap- some examples where analyzing the solver output and changing a small proach according to the lean philosophy, and (ii) prearranged cost sav- number of parameters helped to reduce the solving time significantly. ing targets over product life-time due to a more competitive business Additionally, we give a brief introduction to Gurobi’s Tuning Tool and environment. Automotive suppliers thus have started to organize them- explain the meanings of the different parameter changes it suggests. selves in manufacturing networks to better balance demand with capac- ity supply across locations and to realize cost savings from economies of scale. 
However, OEMs require approval of each production line 2 - Recent Enhancements in GAMS before production starts which is costly, involves several months of Lutz Westermann, Michael Bussieck lead time, and is time-bound if a minimum production quantity is not met. Consequently, planning complexity increases which requires ap- From the beginning in the 1970s at the World Bank till today, GAMS, propriate decision support systems for tactical supply chain planning. the General Algebraic Modeling System, has evolved continuously in We present a corresponding robust decision model for tactical supply response to user requirements, changes in computing environments chain planning of an automotive supplier. The approach is capable and advances in the theory and practice of mathematical programing. of flexibly dealing with demand fluctuations while following a stable We will outline several recent enhancements of GAMS supporting effi- and cost-efficient production plan. For this purpose, we make use of cient and productive development of optimization based decision sup- two-stage stochastic programming and integrate corresponding robust port applications. planning methods. Since the forecast delivery schedule of the OEM
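As an illustration of the log-driven parameter experimentation discussed in the Gurobi talk above, here is a minimal gurobipy sketch. The model file, time limits, and the specific parameter change are assumptions for illustration only; the intent is simply to show the workflow of logging a baseline run, reacting to what the log shows, and invoking the tuning tool.

```python
# Minimal sketch: baseline run with a log file, a manual parameter change,
# and Gurobi's automatic tuning tool. Instance name and settings are illustrative.
import gurobipy as gp

m = gp.read("example.mps")          # hypothetical instance
m.Params.LogFile = "example.log"    # keep the solver log for later inspection
m.Params.TimeLimit = 300

m.optimize()                        # baseline run; study example.log afterwards

# e.g., if the log shows the incumbent improving slowly, emphasize feasibility
m.setParam("MIPFocus", 1)
m.optimize()

# let the tuning tool search for parameter settings and apply the best one found
m.Params.TuneTimeLimit = 600
m.tune()
if m.tuneResultCount > 0:
    m.getTuneResult(0)
    m.optimize()
```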


3 - Modeling with IBM ILOG CPLEX - An IBM-certified We present a new extension of the vehicle routing problem (VRP): course and textbook The VRP with flexible delivery locations (VRPFDL). In the VRPFDL, Hans Schlenker, Stefan Nickel, Claudius Steinhardt, Melanie a job not only corresponds to exactly one location but has to be per- formed at one out of a set of possible locations. One possible applica- Reuter tion for the VRPFDL is the hospital-wide therapist routing problem. In the hospital-wide therapist routing problem, therapists are depicted as For more than two years, the Karlsruhe Institute of Technology and vehicles and patients are depicted as jobs to be treated at a ward or at a the University of Augsburg have been teaching a course on modeling therapy center. In order to solve the VRPFDL we present a mixed in- with ILOG CPLEX Optimization Studio in cooperation with IBM. The teger program. Due to its computational intractability we reformulate course complements existing theoretical bachelor courses on mathe- the problem as a Dantzig-Wolfe formulation which is solved by means matical programming. In the course, students get a detailed introduc- of branch-and-price-and-cut. Based on standard VRP benchmark in- tion into the elements of OPL and learn how to efficiently model, im- stances, we generate VRPFDL instances and evaluate the mixed inte- plement and solve real-world optimization problems. ger program and the Dantzig-Wolfe approach. This course is now being rolled out to other universities and comple- mented by a textbook. The textbook will cover the content of the course, and will also serve as a tutorial on optimization programming with the Optimization Programming Language (OPL). In the talk, we give an overview of the content of the textbook and  WC-23 show how the course can successfully be included into existing study Wednesday, 13:10-14:40 - V programs. Furthermore, we give details on the official certification pro- cess, through which successfull students can get an IBM certificate. Stochastic Flow Lines: Analysis and Optimization Stream: Production and Operations Management  WC-22 Invited session Wednesday, 13:10-14:40 - IV Chair: Svenja Lagershausen Vehicle Routing and Scheduling with 1 - Accelerating sample-based optimization methods of Column Generation buffer allocations in flow lines Sophie Weiss, Raik Stolletz Stream: Logistics and Inventory The Buffer Allocation Problem (BAP) can be modeled as a mixed in- Invited session teger program by sampling the effective processing times. Recently, a Chair: Rainer Kolisch Benders Decomposition approach has been proposed for the BAP. The computation times of this procedure can be further reduced by the gen- 1 - Multiperiod Technician Routing and Scheduling eration of lower bounds based on the optimization of subsystems. We improve the computation time by generating additional lower bounds Emilio Zamorano de Acha, Raik Stolletz for each individual buffer. Additionally, the choice of the next evalu- ated buffer allocation is not under control using a standard solver for This paper addresses a technician routing and scheduling problem mo- the master problem in the existing Benders Decomposition approach. tivated by the case of a forklift maintenance provider. Technicians are We investigate the performance of approaches to systematically gener- proficient on different skills and pair into teams to serve maintenance ate new candidate solutions in the master problem. tasks. 
Tasks are skill constrained and have time windows that can span multiple days. The objective is to determine the daily assignment of 2 - Setting Kanban cards for production systems un- technicians into teams, teams into tasks, and daily team routes such that the operation costs and the customer waiting are minimized. We der non-stationary and stochastic operating environ- propose a mixed integer program and a column generation-based algo- ments rithm for the solution of this problem. Using real-world data from a Justus Arne Schwarz, Raik Stolletz forklift maintenance provider, we present our first numerical results. Traditional Kanban systems are designed to work in a static operating 2 - A New Formulation and Approach for the Black and environment. Recent works suggest to change the number of Kanban cards based on the current inventory level to account for stochasticity White Traveling Salesman Problem in the system. We provide a systematic overview of the existing ap- Ibrahim Muter proaches. Based on that, a new mechanism for the Kanban card setting under stochastic and moreover non-stationary operating environments This study proposes a new formulation and a column generation ap- is introduced. This operating environment occurs for example during proach for the black and white traveling salesman problem. This prob- production ramp-ups. In contrast to existing approaches, the proposed lem is an extension of the traveling salesman problem in which the ver- approach uses information about the future development of the system tex set is divided into black vertices and white vertices. The number of to proactively change the number of Kanban cards. We discusses ob- white vertices visited and the length of the path between two consec- jectives and preliminary results of the new card setting approach and utive black vertices are constrained. The objective of this problem is outline similarities to the buffer allocation problem. to find the shortest Hamiltonian cycle that covers all vertices satisfying the cardinality and the length constraints. We present a new formula- 3 - Inter-departure, Inter-start, and Cycle Time Distribu- tion for the undirected version of this problem, which is amenable to tion of Closed Queueing Networks the Dantzig-Wolfe decomposition. The decomposed problem which is defined on a multigraph becomes the traveling salesman problem with Svenja Lagershausen, Baris Tan an extra constraint set in which the variable set is the feasible paths We present a method to determine the exact inter-departure, inter-start between pairs of black vertices. In this paper, a column generation al- and cycle time of closed queueing networks that can be modeled as gorithm is designed to solve the linear programming relaxation of this Continuous-Time Markov Chains with finite state space. The method problem. The resulting pricing subproblem is an elementary shortest is based on extending the state space to determine the transitions that path problem with resource constraints, and we employ acceleration lead to a departure or an arrival of a part on a station. Once these strategies to solve this subproblem effectively. The linear program- transitions are identified and represented in an indicator matrix, a first ming relaxation bound is strengthened by a cutting plane procedure, passage time analysis is utilized to determine the exact distributions and then column generation is embedded within a branch-and-bound of the inter-departure, inter- start, and cycle time. 
In order to illus- algorithm to compute optimal integer solutions. The proposed algo- trate the methodology, we consider closed-loop production lines with rithm is used to solve randomly generated instances with up to 80 ver- phase-type service time distributions and finite buffers. We discuss tices. the methodology to generate the state space and to obtain the transi- tion rate matrices for the considered distributions automatically. We 3 - Vehicle Routing with Flexible Delivery Locations use the proposed method to analyze the effects of system parameters Alexander Döge, Markus Frey, Daniel Gartner, Rainer on the inter-departure, inter-start time, and cycle time distributions nu- Kolisch merically for various cases. The generality of the methodology allows


an analysis of the inter-departure, inter-start, and cycle time distribu- Therefore, we present here the foundation for the energy agent ap- tions of a wide range of production systems including open queueing proach by means of a generalized energy option model that gives an networks that can be modeled as Continuous-Time Markov Chains in agent the needed understanding and economical assessment options a unified way. regarding the underlying technical system. In this context we present and discuss two exemplary use cases for the option model that illustrate the applicability of this approach for hybrid energy systems. Further, we present and discuss first combinatorial evaluation strategies and ex- periences that were determined in the course of our work. Overall we WC-24 believe that our approach is a good but also necessary base for the de-  velopment of the future IT-enriched and hybrid energy grid. Wednesday, 13:10-14:40 - AS Smart Electricity Markets Stream: Pricing, Revenue Management, and Smart  WC-25 Markets Wednesday, 13:10-14:40 - AachenMuenchener Halle (Aula) Invited session Chair: Wolfgang Ketter OR Success Stories II Stream: Business Day 1 - Electric Vehicles as Virtual Power Plants Micha Kahlen, Wolfgang Ketter, Jan van Dalen Invited session Chair: Günter Stock What happens to electricity supply if a tree hits a major electricity transmission line, or a power plant has to force shut down immedi- 1 - Time, speed and bunker consumption in tactical and ately? The brief answer is nothing — previously idle operating re- serves jump in to take over. However, with a growing share of un- operational planning predictable and uncontrollable renewable generation sources entering Jochem Donkers the electricity market, the need for operating reserves will increase in the future. We explore the possibility of using the storage capacity of Since the beginning of the credit crisis in 2008, many shipping com- large fleets of electric vehicles (EV) as short term (secondary) operat- panies have adopted slow-steaming policies. Prior to the credit crisis ing reserves. We approach the problem from a carsharing fleet owner slow-steaming was a concept virtually unheard of in the industry; pri- perspective by designing and evaluating an automated decision sup- marily due to a shortage of vessel capacity. During our presentation we port system to manage the virtual power plant of EVs. The system will introduce some of the dynamics of slow-steaming and the dilemma decides for the carsharing owner on a real time basis whether to 1) rent that this creates for shipping companies. Also, we will discuss speed an EV at a specific location to consumers, 2) rent out the battery of differentiation that is beyond the scope of the company?s control. this EV as operating reserve to generate electricity (positive reserve), In the second half of the presentation we will explore how Integrated or 3) rent it out as operating reserve to consume (charge) energy (neg- Planning Platform help to secure the benefits of the slow steaming op- ative reserve). In contrast to previous studies we incorporate explicit portunity while adhere to other to other constraints. Then we will re- opportunity costs for non-availability of EVs during rental to operating view how this complex model ? with the help of solvers ? will help reserves (and while recharging afterwards). 
We find that market design to create robust planning scenarios for an example scenario with in- for secondary operating reserves has to change to a more flexible de- creased demand. We will conclude by summarizing the benefits for sign in order for storage to participate in and add value to the market. shipping company. The system is evaluated on the basis of a case study of 500 Car2Go EVs in Stuttgart, Germany over a period of 3 months. With decreasing 2 - Optimization for a Canadian hydro-power company battery depreciation costs and an increasing need for balancing capac- Günter Stock ity we show that carshared operating reserves are commercially bene- ficial for both the carsharing owner and the institution in charge of the TransAlta headquartered in Calgary, Alberta, Canada, has imple- operating reserves. mented KISTERS’ BelVis ResOpt for optimization of their 800 MW Hydroelectric power plants on the Bow and the North Saskatchewan 2 - A Probabilistic Preference Model with Applications River Systems. BelVis ResOpt solves complex optimization problems to Electricity Tariff Choice using Mixed Integer Linear Programming (MILP) with the support of a commercial mathematical solver. The application has been setup to Markus Peters, Wolfgang Ketter automatically provide day-ahead operation schedules to optimize for Information systems that make autonomous decisions based on con- maximum benefit over time. Prior to using BelVis ResOpt, the water sumers’ preferences will play an important role in the smart retail mar- management engineers estimated the optimization results with the use kets of the future. We present a nonparametric Bayesian preference of spreadsheets. This process was not only time consuming but did not model that learns unobtrusively from limited data collected from many guarantee the best possible solution. By switching to BelVis ResOpt concurrent users, and that quantifies the certainty of its own predic- users are now able to focus their attention on higher level tasks like tions as input to autonomous decision-making tasks. We make use of scheduling decisions to maximize profit and minimize risks while the recent advances in probabilistic inference for structured Gaussian pro- tool runs different analysis scenarios and automatically provides daily cess models to improve the scalability of our model, and we evaluate operational solutions. its performance on real-world electricity tariff choice data collected through a commercial crowdsourcing platform. 3 - Intraday Trading in Energy Markets — a continuous Optimization and Forecasting Challenge 3 - Hybrid Energy Option Models for Unified Energy Olaf Syben Agents Due to the energy transition in Germany the energy sector is affected Christian Derksen, Rainer Unland by massive changes that make the planning tasks for the utilities more The ongoing conversion of our energy supply experiences great inter- and more complex. Given the sharp rise in shares of non- predictable est from many different market players that were originally working energy supply from renewable energy sources (wind, solar) an intra- in other industries. As a consequence, a vast amount of proprietary day market was established, allowing the energy suppliers to respond solutions for "smart’ energy applications is flooding the market. This to changes in demand and supply situations at very short notice. 
Espe- turns out to be rather a problem than part of the solution for the system- cially the difficult predictability of renewable generation leads to im- atic development of future energy grids. Additionally, the absence of balances in the energy system, which cannot be covered by balancing necessary unifications and standards block further developments that energy mechanisms only. would enable the creation of novel market-driven and hybrid control Energy suppliers now have the challenge to continuously monitor their solutions for various types of technical systems. To overcome these generation portfolio and calculate forecasts for their own renewable problems, we present our notion and definition of a unified autonomous infeed and the heat demand of their customers. In addition, the short- software entity called energy agent. Based on the energy conservation term forecast of prices in the intraday market seems useful. Any new law and a derivative option and action model therefrom, we claim that forecast situation or any change in the conditions in the market result in our energy agent approach has the capabilities to enable cross domain a new optimization problem to determine the optimal market behavior interactions between different types of energy systems and networks. and to calculate the optimal operation of the power plants.


3 - Intraday Trading in Energy Markets — a continuous Optimization and Forecasting Challenge
Olaf Syben
Due to the energy transition in Germany, the energy sector is affected by massive changes that make the planning tasks for the utilities more and more complex. Given the sharp rise in the share of non-predictable energy supply from renewable energy sources (wind, solar), an intraday market was established, allowing the energy suppliers to respond to changes in demand and supply situations at very short notice. Especially the difficult predictability of renewable generation leads to imbalances in the energy system, which cannot be covered by balancing energy mechanisms alone.
Energy suppliers now face the challenge of continuously monitoring their generation portfolio and calculating forecasts for their own renewable infeed and the heat demand of their customers. In addition, a short-term forecast of prices in the intraday market is useful. Any new forecast situation or any change in the conditions in the market results in a new optimization problem to determine the optimal market behavior and to calculate the optimal operation of the power plants.
For about 20 years, ProCom GmbH has offered planning solutions for the energy sector, helping to optimize energy portfolios and to forecast prices, energy demand and supply. The underlying platform BoFiT supports users from model development for optimization and forecasting up to automatically running processes for continuous trading in the energy markets. The optimization functions are based on mixed integer programming, and the forecast functions use, among other methods, artificial neural networks.

4 - Industrial Gases Production and Supply Chain Optimization
Frank Lüders
Air Liquide's operations management must take into account supply and demand data as well as contractual requirements and market information when planning the day-to-day operations, as well as for long-term strategic planning. Optimization tools have been developed based on complex models of Air Liquide's industrial activity, reaching from medium-term regional production planning down to real-time operation of individual plant equipment. Based on the integration of real-time operations data as well as multiple market data, interactive tools support daily business decision-making such as short-term energy procurement as well as optimized dispatching of production volumes to the plants.

Wednesday, 15:00-15:45

 WD-01
Wednesday, 15:00-15:45 - Fo1
Semiplenary Gritzmann
Stream: Invited Presentations and Ceremonies
Semi-plenary session
Chair: Alexander Martin

1 - On data segmentation
Peter Gritzmann
Detecting hidden structures in (high-dimensional) data is a general and ubiquitous task. Examples include the assessment of insurance risks and the corresponding tariffing, issues of predictive maintenance, supply-chain diversification, medical treatment planning or business demand prediction.
There is a wide range of analytical and statistical methods for data analysis and assessment. For instance, in auto insurance, parameter-based tariffing is employed that, in fact, leads to a box-classification of the parameter space followed by rather involved statistics.
In the talk we present a geometric clustering model that provides a shift towards a more sophisticated and data-structure-based dissection of space that allows for much simpler and more reliable subsequent statistics. We show how the model captures the intuition behind good clusterings and leads to efficient algorithms in practice. Also, we report on results for some real-world tasks of the problems mentioned before. (Joint work with Andreas Brieden)
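The data segmentation talk above argues for geometric, balance-aware clusterings of the parameter space. As a loose illustration of that idea only (this is not the algorithm of Brieden and Gritzmann), the sketch below runs a k-means-style loop in which a greedy, capacity-restricted assignment keeps the clusters balanced; the points, the number of clusters and the capacity cap are all made up.

```python
# Toy balanced geometric clustering sketch (illustrative; not the algorithm of
# the talk). Points are assigned greedily by distance, but no cluster may hold
# more than `cap` points, which keeps the dissection of space roughly balanced.
import random
import math

def balanced_kmeans(points, k, cap, iters=20, seed=0):
    rng = random.Random(seed)
    centers = rng.sample(points, k)
    for _ in range(iters):
        # Sort all (point, center) pairs by distance and assign greedily under capacity.
        pairs = sorted((math.dist(p, c), i, j)
                       for i, p in enumerate(points) for j, c in enumerate(centers))
        assign, load = [None] * len(points), [0] * k
        for d, i, j in pairs:
            if assign[i] is None and load[j] < cap:
                assign[i], load[j] = j, load[j] + 1
        # Recompute each center as the mean of its assigned points.
        for j in range(k):
            members = [points[i] for i in range(len(points)) if assign[i] == j]
            if members:
                centers[j] = tuple(sum(x) / len(members) for x in zip(*members))
    return assign, centers

pts = [(random.random(), random.random()) for _ in range(30)]
labels, centers = balanced_kmeans(pts, k=3, cap=10)
print(labels)
```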

 WD-02
Wednesday, 15:00-15:45 - Fo2
Semiplenary Fransoo
Stream: Invited Presentations and Ceremonies
Semi-plenary session
Chair: Ulrich Thonemann

1 - Optimizing beyond company borders: Horizontal collaboration in supply chain management
Jan C. Fransoo
Increasingly, companies are finding that the end of efficiency improvements in operations within company borders has been reached: vehicle routes have been optimized, inventory has been pooled, and manufacturing economies of scale have been reached. A natural next step is to explore optimization beyond company borders. While vertical collaboration (with suppliers and customers) has received a lot of attention in the literature and in industrial practice in the past 15 years, horizontal collaboration (with competitors or other companies at the same level in the supply chain) is much less studied and practiced. In this talk, I will give an overview of horizontal collaboration across a variety of industries in logistics, manufacturing, and retail. Examples will be given of actual projects, and the requirements on associated models will be discussed. The relationship between important business questions, such as the appropriate governance model of the collaboration, and the modeling support that operations research can offer will be discussed.

 WD-05
Wednesday, 15:00-15:45 - Fo5
Semiplenary Lee
Stream: Invited Presentations and Ceremonies
Semi-plenary session
Chair: Teresa Melo


1 - Health Analytics: Personalized Cancer Treatment Planning
Eva Lee
Almost eight million cancer patients worldwide receive some form of radiation therapy each year. One promising treatment is high-dose-rate brachytherapy, which entails delivering high-dose radiation to the tumor via the temporary implantation of radioactive seeds. This treatment promises to be particularly effective in eradicating tumors while preserving the organs. Yet, major obstacles to successful treatment remain, especially (1) determining the best seed type, spatial configuration of seeds, and seed dwelling time, and (2) improving the probability that the treatment will eliminate all malignant cells. We developed an advanced planning model to simultaneously address both of these issues. To permit taking advantage of the best available information, our model works with inputs from positron emission tomography. We begin with a multiobjective, nonlinear, mixed-integer programming model that is initially intractable. To solve the model, we introduce an original branch-and-cut and local-search approach that couples new polyhedral cuts with matrix reduction and intelligent geometric heuristics. The result has been accurate solutions, which are obtained rapidly. Clinical trials on cervical cancer treatment at Rush University Medical Center have demonstrated superior medical outcomes. These analytical techniques are applicable not only to cervical cancer, but also to other types of cancer, including breast, lung, and prostate cancer.

 WD-25
Wednesday, 15:00-15:45 - AachenMuenchener Halle (Aula)
Company Award
Stream: Awards Sessions
Award Competition session
Chair: Leena Suhl

1 - Simulation and Optimization - Practical Considerations of Competitive Companions
Ulrich Burges
Discrete-event simulation is a well-established OR tool to tackle a great variety of (stochastic) optimization problems in different industries like manufacturing and logistics. But the terms simulation and optimization are very often mixed up or even seen as competitive in these industries. This talk will show some practical applications of discrete-event simulation, starting with simulation as "pure" evaluation function, as a tool for the simulation expert to, e.g., manually optimize dynamic control rules, and as a complementing companion of mathematical optimization procedures. In addition, some of the challenges which an OR consultant and a company working day to day in that field have to face when simulation comes to practice will be described as well.

Wednesday, 16:05-17:35

 WE-02
Wednesday, 16:05-17:35 - Fo2
GOR Masterthesis Award
Stream: Awards Sessions
Award Competition session
Chair: Kathrin Klamroth

1 - Political Districting for Elections to the German Bundestag
Sebastian Goderbauer
Political districts are of significant importance in elections to the Bundestag, the German parliament. Each district elects one representative into parliament, ensuring that each part of the country is represented. These elected representatives make up half of the members of the Bundestag. The allocation of electoral districts is subject to legal requirements and needs regular updates due to an ever-changing population distribution. In the author's master's thesis, the problem of dividing a country into electoral districts is defined as a multi-criteria optimization problem in which a node-weighted graph has to be partitioned into a given number of contiguous, weight-restricted subgraphs. In a comprehensive analysis, the NP-hardness of this problem is proven. In addition, and to get a more profound understanding of the complexity, the underlying partition problems are investigated on different graph classes. An optimization-based heuristic in accordance with the divide-and-conquer principle is introduced and successfully applied to population data of the latest German census. The computed results show that the presented algorithm allocates electoral districts which are not only in accordance with the law, but also match the objectives mentioned in the law more closely than the current districting.

2 - Congestion Games with Multi-Dimensional Demands
Andreas Schütz
Weighted congestion games are a significant and extensively studied class of strategic games, in which players compete for subsets of shared resources in order to minimize their private costs. In this thesis, we introduce congestion games with multi-dimensional demands as a generalization of weighted congestion games. Instead of a one-dimensional demand value, each player is associated with a multi-dimensional demand vector, and the cost function of a resource is a multi-variable function of the aggregated demand vectors of all players sharing the resource. Such a cost structure is natural when the cost of a resource depends not only on one, but on several properties of the players' demands, e.g., total weight, total volume, and total number of items. We study the existence of pure Nash equilibria for this new class of games and give a complete characterization of the existence of pure Nash equilibria in terms of the resource cost functions. Specifically, we identify all sets of cost functions that guarantee the existence of a pure Nash equilibrium for every congestion game with multi-dimensional demands. Furthermore, we investigate the equilibrium existence problem for subclasses with restricted strategy spaces, such as singleton or matroid congestion games with multi-dimensional demands.

3 - Adapting exact and heuristic procedures in solving an NP-hard sequencing problem
Andreas Wiehl
We consider an operational process at shunting yards, where freight cars are disassembled and reassembled via a system of tracks and switches to form outbound trains with no restriction on the order of the freight cars. Given are due dates for each outbound train and priority values for its freight cars. Furthermore, the composition and the processing time of each inbound train is part of the input. An outbound train is defined by a set of freight cars taken from one or many inbound trains. In this context, we try to minimize the weighted tardiness of all outbound trains by determining the optimal humping sequence of the inbound trains. We show that this problem is NP-hard and we present a simple mixed integer problem formulation. Besides two heuristic approaches and an implementation in CPLEX, the main focus of our single-stage shunting problem is on the development of an exact branch & bound procedure. Therefore, we present powerful precedence constraints and priority rules to reduce the solution space. Further, we compare the runtime and accuracy of the proposed algorithms with the results of the CPLEX optimizer in a computational study.
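The Wiehl abstract above minimizes the weighted tardiness of outbound trains over humping sequences of inbound trains. The toy sketch below only illustrates that objective: it enumerates all sequences of a three-train instance with invented processing times, due dates and weights; the branch & bound procedure, precedence constraints and priority rules of the thesis are not reproduced.

```python
# Brute-force sketch of the single-stage shunting objective: choose a humping
# sequence of inbound trains minimizing total weighted tardiness of outbound
# trains. Exhaustive enumeration is only viable for tiny toy instances.
from itertools import permutations

proc = {"in1": 3, "in2": 2, "in3": 4}                  # processing time per inbound train
outbound = {                                           # name: (needed inbound trains, due date, weight)
    "out1": ({"in1", "in2"}, 5, 2.0),
    "out2": ({"in2", "in3"}, 7, 1.0),
}

def weighted_tardiness(sequence):
    t, finish, total = 0, {}, 0.0
    for train in sequence:
        t += proc[train]
        finish[train] = t
    for needed, due, weight in outbound.values():
        completion = max(finish[i] for i in needed)    # ready once all parts are humped
        total += weight * max(0, completion - due)
    return total

best = min(permutations(proc), key=weighted_tardiness)
print(best, weighted_tardiness(best))
```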


 WE-03
Wednesday, 16:05-17:35 - Fo3
Security and Inspection Games
Stream: Algorithmic Game Theory
Invited session
Chair: Jan Trockel

1 - Algorithms in Stochastic Positional Games for Determining Nash Equilibria
Dmitrii Lozovanu, Stefan Wolfgang Pickl
We consider a class of stochastic positional games that extend deterministic positional games and discrete Markov decision processes. A special class of these games is formulated and studied using a certain game-theoretical concept for finite state space Markov decision processes with average and expected total discounted optimization criteria [1,2]. Now, we assume that a finite state space Markov process may be controlled by several players as follows:
The set of states is divided into several disjoint subsets which we regard as the position sets of the corresponding players. Each player has to determine which action should be taken in each state of his position set in order to minimize his own average cost per transition or the expected total discounted cost. The cost of the system's transition from one state to another in a Markov process is given for each player separately. In addition, the set of actions, the transition probability functions and the starting state are known. We assume that in the considered games the players use stationary strategies; we are seeking a Nash equilibrium.
Our main results are concerned with the existence and characterization of Nash equilibria for stochastic positional games and an application of the algorithms for determining the optimal stationary strategies of the players.
References
1. Lozovanu D. The game-theoretical approach to Markov decision problems and determining Nash equilibria for stochastic positional games. Int. J. Mathematical Modelling and Numerical Optimization, 2 (2), 162-164, 2011
2. Lozovanu D., Pickl S., Kropat E. Markov decision processes and determining Nash equilibria for st

2 - Novel Formulations for Stackelberg Security Games
Carlos Casorrán-Amilburu, Bernard Fortz, Martine Labbé, Fernando Ordonez
Stackelberg Games confront contenders with opposed objectives sequentially. The Leader acts first and the Follower reacts to the Leader's strategy. The objective of the game is for the Leader to commit to a reward-maximizing strategy anticipating the Follower's best response. In a Bayesian Stackelberg Game, which is NP-hard, the Leader faces one out of a group of Followers; otherwise the game is called a Single-type-of-Follower Stackelberg Game, which is polynomial (Conitzer and Sandholm, 2006). Moreover, games in which the respective strategies of the Leader and Follower consist in covering and attacking targets are called Stackelberg Security Games.
We present novel tight formulations for the Single-type-of-Follower Stackelberg Game and for the Single-type-of-Attacker Stackelberg Security Game, significantly improving the current formulations present in the literature (Paruchuri et al., 2008), (Kiekintveld et al., 2009). Further, we show that both formulations provide a complete linear description of the convex hull of the sets of feasible solutions of the corresponding problems and show that one formulation is the projection of the other on the appropriate space. The formulations presented for the Bayesian case improve the continuous relaxations of existing formulations. Computational experiments are carried out to compare our formulations with those in the literature.

3 - The effects of trust variations on the results of inspection procedures
Jan Trockel, Günter Fandel
In our paper we expand the considerations of Fandel/Trockel (2011) to an analysis of a dynamic trust behavior of the strategic players. The trust parameters that determine the level of the additional payoffs in the case of trust are now time-dependent with respect to the number of repeated rounds of the inspection game. The basis of the modelling is a logistic function that describes the trust expansion among the strategic players. Unfortunately, there exists the hazard that the inspectee will prey on the inspector's trust if the inspector's trust level increases and exceeds a threshold. The inspector wants to prevent this situation. This is modelled by a stochastic term which expresses the percentage loss of trust of the inspector that may occur, so that a reasonable boundary of a threatened exploitation is not realized. However, if this occurs, the calculated equilibrium in the next round of the repeated game will be the Nash solution without any trust. In the following rounds trust will maybe increase again and develop in a similar way as before.
Based on a simulated structure of the chronology of the players' payoffs, one can estimate the level of mistrust the inspector should never underbid, so that error-free payoff series without trust variations occur that dominate the Nash solution in games without trust, but simultaneously decrease the hazard that the inspector may be exploited by the inspectee.
Reference: Fandel, G., Trockel, J., 2011. Der Einfluss von Vertrauen in einem Inspektionsspiel zwischen Disposition und Controlling, in: Nguyen, T. (Hrsg.): Mensch und Markt — Die ethische Dimension wirtschaftlichen Handelns, Festschrift für Prof. Dr. Dr. h. c. Volker Arnold, Wiesbaden, 451-478.

4 - Strategic Deterrence of Terrorist Attacks
Marcus Wiens, Sascha Meng, Frank Schultmann
Since 9/11, terrorist threats are much more present in the European countries' preventative security and intelligence strategies. However, risk management is still dominated by methods which focus too much on historical frequencies and do not sufficiently account for the terrorists' motives and for the strategic component of the interaction between offender and defender. One alternative is to cluster the wide field of terrorists' motives by type and to set up a corresponding defender-offender game with incomplete information on both sides. The defender is uncertain about the terrorist's type, which we reduce to two exemplified variants, both well established in terrorism research: the fanatic ('irrational') terrorist strives to spread maximum damage and fear, whereas the subversive terrorist uses attacks as a symbolic and specifically addressed communication device. Additionally, the terrorists also face uncertainty with regard to random events which, e.g., determine the success of an attack or the amount of collateral damage.
In this paper we analyze the difference between these two approaches. We model the classical risk management approach by a specific variant of fictitious play and compare it with the outcome of the sequential game of incomplete information. We find that the latter allows for a more purposeful use of security measures, as the defender avoids getting caught in a hare-and-tortoise trap. We specify the conditions under which the incomplete-information model outperforms fictitious play in the sense that it lowers the cost of defense at a given rate of deterrence. We analyze the robustness of our results and discuss the implications and requirements for practical application.
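The Casorrán-Amilburu et al. abstract in this session builds on the fact that the single-follower Stackelberg game can be solved in polynomial time (Conitzer and Sandholm, 2006). The sketch below illustrates that classical multiple-LPs idea, not the novel formulations of the talk: for each pure follower response, a small LP (here via PuLP) finds the best leader mixture that keeps this response optimal for the follower. The 3x2 payoff matrices are invented.

```python
# Multiple-LPs sketch for optimal leader commitment in a single-follower
# Stackelberg game (classical Conitzer/Sandholm idea, not the formulations of
# the talk). Toy payoff matrices: rows = leader actions, columns = follower actions.
import pulp

L = [[5, 1], [2, 4], [3, 3]]   # leader payoffs  L[i][j]
F = [[1, 0], [0, 2], [1, 1]]   # follower payoffs F[i][j]
n, m = len(L), len(L[0])

best_value, best_strategy = None, None
for j in range(m):                       # assume the follower best-responds with column j
    prob = pulp.LpProblem(f"commit_vs_{j}", pulp.LpMaximize)
    x = [pulp.LpVariable(f"x_{i}", lowBound=0) for i in range(n)]
    prob += pulp.lpSum(x) == 1
    # Column j must be a best response of the follower against the mixture x.
    for k in range(m):
        if k != j:
            prob += pulp.lpSum(F[i][j] * x[i] for i in range(n)) >= \
                    pulp.lpSum(F[i][k] * x[i] for i in range(n))
    prob += pulp.lpSum(L[i][j] * x[i] for i in range(n))   # leader objective
    prob.solve(pulp.PULP_CBC_CMD(msg=False))
    if pulp.LpStatus[prob.status] == "Optimal":
        val = pulp.value(prob.objective)
        if best_value is None or val > best_value:
            best_value, best_strategy = val, [v.varValue for v in x]

print(best_value, best_strategy)
```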


 WE-04
Wednesday, 16:05-17:35 - Fo4
Scheduling with Uncertainties
Stream: Robust and Stochastic Optimization
Invited session
Chair: Christina Büsing

1 - Two-Stage Scheduling on Unrelated Machines
Roman Rischke, Lin Chen, Nicole Megow, Leen Stougie
We propose a natural model for two-stage scheduling in which reserving a time unit for processing jobs incurs some cost. This cost depends on the time at which the reservation is made: a priori decisions, based only on distributional information, are much cheaper than on-demand decisions made when the actual scheduling scenario is known. Such a model captures, e.g., the resource provisioning problem that users of cloud computing services face.
We consider both the stochastic and the robust version of this problem. We investigate scheduling on unrelated machines in the two-stage scheduling model with reservation cost. Our main contribution is an (8 + epsilon)-approximation algorithm, for both the stochastic and the robust version with a polynomial number of scenarios. It relies on a generalized time-indexed LP formulation, a rounding strategy that balances first-stage reservation and fractional scheduling cost, and the concept of alpha-points. The key ingredient is a separation of jobs and time slots to be considered in either the first or the second stage only. At the expense of another epsilon, our result holds for any arbitrary scenario distribution given by means of a black box in the two-stage stochastic problem.

2 - Robust single machine scheduling problem with weighted number of late jobs criterion
Pawel Zielinski, Adam Kasperski
We are given a set of jobs which are to be processed on a single machine. For each job three parameters, namely a processing time, a due date, and a weight, are specified. Each schedule is represented as a permutation of the jobs. A job is late in a schedule if its completion time in this schedule exceeds its due date. We seek a schedule minimizing the weighted number of late jobs. This problem is known to be NP-hard. However, special cases with unit weights or unit job processing times are polynomially solvable. In this paper, we assume that all the job parameters may be imprecise. We model this uncertainty by specifying a scenario set containing all vectors of the job parameters (called scenarios) which may occur. We use the min-max criterion to compute a solution, which is the most popular criterion in robust optimization. In this paper, we extend and strengthen the results which have been recently obtained for the problem in the literature. Namely, we show that if the number of processing time scenarios is a part of the input, then the problem is strongly NP-hard even when all job weights are equal to 1 and all jobs have a common deterministic due date. We also show that when the number of due date scenarios is a part of the input, then the problem is not approximable within any constant factor. This assertion remains true even if there are two distinct values of the due dates in the scenarios and all job weights are equal to 1. Finally, we show some positive approximation results for the problem with unit processing times, uncertain due dates and uncertain weights. These approximation results can be extended to a more general problem, in which the maximum is replaced with the OWA operator.

3 - Robust job shop scheduling problem with uncertain operations
Annika Thome, Christina Büsing, Sarah Kirchner
We investigate a special case of the job shop problem where there is uncertainty about whether an operation actually occurs and needs to be scheduled. Specifically, we model a job shop problem with perfect information about all processing times, precedence relations, and resource consumption but with a set of operations that might or might not occur - so-called variable operations. The total number of variable operations that will indeed occur is limited by a parameter K. This type of problem arises in a simplified environment of a hospital patient scheduling problem. Knowing that at most K variable operations will occur, one wants to come up with a stable schedule. We propose a recoverable robust model with two stages: in a first stage we fix the order of operations sharing the same machine. After the occurrence of at most K variable operations, we compute in a second phase the start times of each operation. Since we want to minimize the sum of the worst-case completion times, this sums up to a nonlinear min-max-min optimization problem. In this talk, we present a branch and bound algorithm to solve this recoverable robust scheduling problem. Upper bounds on the nodes and branching rules are based on a budgeted max cost flow problem on acyclic graphs.

 WE-05
Wednesday, 16:05-17:35 - Fo5
Combinatorial & Polyhedral Aspects of Scheduling
Stream: Discrete and Combinatorial Optimization, Graphs and Networks
Invited session
Chair: Marc Uetz

1 - Decomposition Algorithm for the Single Machine Scheduling Polytope
Ruben Hoeksma, Bodo Manthey, Marc Uetz
Given an n-vector p of processing times of jobs, the single machine scheduling polytope C arises as the convex hull of completion times of jobs when these are scheduled without idle time on a single machine. Given a point x in C, Carathéodory's theorem implies that x can be written as a convex combination of at most n vertices of C. We show that this convex combination can be computed from x and p in time O(n^2), which is linear in the naive encoding of the output. We obtain this result using essentially two ingredients. First, we build on the fact that the scheduling polytope is a zonotope. Therefore, all of its faces are centrally symmetric. Second, instead of C, we consider the polytope Q of half times and its barycentric subdivision. We show that the subpolytopes of this barycentric subdivision of Q have a simple, linear description. The final decomposition algorithm is in fact an implementation of an algorithm proposed by Grötschel, Lovász, and Schrijver applied to one of these subpolytopes.

2 - Scheduling with job dependent machine speed
Veerle Timmermans, Tjark Vredeveld
The power consumption rate of computing devices has seen an enormous increase over the last decades. Therefore computer systems must make a trade-off between performance and energy usage. This observation has led to speed scaling, a technique that adapts the speed of the system to balance energy and performance. Fundamentally, when implementing speed scaling, an algorithm must make two decisions at each time: (i) a scheduling policy decides which jobs to serve, and (ii) a speed scaler decides how fast to run.
In this presentation we introduce a preemptive single machine scheduling problem where the machine speed is externally given and depends on the number of jobs that is available for processing. A job is available for processing when it is released but not yet completed. The objective is to minimize the sum of weighted completion times.
When the machine speed is constant over time, it is well known that Smith's rule yields an optimal schedule. Unfortunately, this gives arbitrarily bad results for the problem under consideration. We introduce a greedy algorithm that solves the problem to optimality when all weights are equal. With only small changes we can alter this algorithm to work when weights are arbitrary and we have unit processing times. For arbitrary weights and processing times our algorithm finds an optimal schedule when we restrict ourselves to a certain order of job completions. However, we do not know which is an optimal order of job completions. The WSPT order, which is optimal when the machine speed is fixed, can even give arbitrarily bad results.

3 - On time-indexed formulations for the resource-constrained project scheduling problem
Christian Artigues
The classical time-indexed 0-1 linear programming formulations for the resource-constrained project scheduling problem involve binary variables indicating whether an activity starts precisely at or before a given time period. In the literature, references to less classical "on/off" formulations, that involve binary variables indicating whether an activity is in progress during a time period, can also be found. These formulations were not compared to the classical ones in terms of linear programming (LP) relaxations. In this talk, we show that the previously proposed on/off formulations are weaker than the classical formulations, and we obtain a stronger on/off formulation via non-singular transformations of the classical formulations. We also remark that additional time-indexed formulations, presented as appealing in the literature, are in fact either weaker or equivalent to the classical ones.
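The Timmermans/Vredeveld abstract above uses Smith's rule (WSPT) as the classical constant-speed baseline for minimizing the sum of weighted completion times. For reference, here is a minimal sketch of that baseline only; the greedy algorithm of the talk for job-dependent machine speeds is not reproduced.

```python
# Smith's rule (WSPT) sketch: on a single machine with constant speed, sequencing
# jobs by non-increasing weight/processing-time ratio minimizes the sum of
# weighted completion times. Illustrative baseline with toy data.

def wspt_schedule(jobs):
    """jobs: list of (name, processing_time, weight); returns (order, objective)."""
    order = sorted(jobs, key=lambda j: j[2] / j[1], reverse=True)
    t, total = 0.0, 0.0
    for name, p, w in order:
        t += p                 # completion time of this job
        total += w * t         # weighted completion time contribution
    return [j[0] for j in order], total

jobs = [("a", 3, 1), ("b", 1, 4), ("c", 2, 2)]
print(wspt_schedule(jobs))     # b first (ratio 4), then c (1), then a (1/3)
```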


 WE-06
Wednesday, 16:05-17:35 - Fo6
Robustness Issues
Stream: Robust and Stochastic Optimization
Invited session
Chair: Lars Eufinger

1 - A general solution for robust linear programs with distortion risk constraints
Karl Mosler, Pavel Bazovkin
Linear optimization problems are investigated that have random parameters in their m constraints. In constructing a robust solution x in d-space, we control the risk arising from violations of the constraints. This risk is measured by set-valued risk measures, which extend the usual univariate coherent distortion (= spectral) risk measures to the multivariate case. To obtain a robust solution in d variables, the linear goal function is optimized under the restrictions holding uniformly for all parameters in a d-variate uncertainty set. This set is built from uncertainty sets of the single constraints, each of which is a weighted-mean trimmed region in d-space and can be efficiently calculated. Furthermore, a possible substitution of violations between different constraints is investigated by means of the admissible set of the multivariate risk measure. In the case of no substitution, we give an exact geometric algorithm, which possesses a worst-case polynomial complexity. We extend the algorithm to the general substitutability case, that is, to robust polyhedral optimization. The consistency of the approach is shown for generally distributed parameters. Finally, applications of the model, especially to supervised machine learning, are discussed.

2 - Robust gate assignment in less-than-truckload terminals
Lars Eufinger, Uwe Clausen
Freight forwarding companies in the less-than-truckload (LTL) industry are under strong competitive pressure. Due to this pressure, companies are trying to gain a competitive advantage by systematically optimizing the processes and implementing logistics innovations. We want to investigate LTL terminals, which are the hubs of the LTL transportation networks and operate as distribution centers with collection and distribution function of goods, e.g. cross docking. The task of an LTL terminal is the accurate and in-time handling of shipments between vehicles on short-distance traffic and transport vehicles on long-distance traffic. The performance of an LTL terminal is largely determined by the proper use of the gates. However, many uncertain factors influence the planning. Fluctuations can occur in both the arrival times of vehicles as well as in the processing times. Even failures of resources within the logistical system can affect the quality of the solution. A gate assignment plan should also be stable for late arrivals and other uncertain influences. Thus it is reasonable to use robust (stochastic) optimization to create a gate assignment plan which can handle the occurring uncertainties. We present our optimization model for the assignment of the trucks to the gates, taking into account the processes inside the terminal, e.g. the movements of the goods from gate to gate. In addition to this, we also show which uncertainties need to be considered to get a robust gate assignment plan and how to include these uncertainties in our model using robust (stochastic) optimization.

 WE-07
Wednesday, 16:05-17:35 - Fo7
Column Generation
Stream: Discrete and Combinatorial Optimization, Graphs and Networks
Invited session
Chair: Lars Schewe

1 - A column generation approach to the Parsimonious Loss of Heterozygosity Problem
Luciano Porretta
A Loss of Heterozygosity (LOH) event occurs when, by the laws of Mendelian inheritance, an individual should be heterozygote at a given site but, due to a deletion polymorphism, is not. Deletions play an important role in human disease and their detection could provide fundamental insights for the development of new diagnostics and treatments. In this article we investigate the Parsimonious Loss of Heterozygosity Problem (PLOHP), i.e., the problem of partitioning suspected polymorphisms from a set of individuals into a minimum number of deletion areas.
We focus our attention on the article [1], which provides a state-of-the-art integer programming formulation able to solve instances of the PLOHP containing up to 9000 individuals and 3000 SNPs.
In the article [1], the authors show that the PLOHP can be formulated as a specific version of the partition problem in a particular class of graphs called undirected catch-point interval graphs, and they prove its general NP-hardness.
In order to tackle this problem, we investigate the possibility of using column generation techniques, together with graph decomposition methods and divide-and-conquer techniques, in order to improve the solution time of one of the models provided in [1].
As a first result of our experiments we can say that column generation techniques and preprocessing methods provided a considerable reduction of the computational demand to solve the problem. In particular, no branches are necessary in the column generation process for the analysed datasets.
[1] An Integer Programming Formulation of the Parsimonious Loss of Heterozygosity Problem. Daniele Catanzaro, Martine Labbé, and Bjarni Halldorsson, IEEE/ACM Trans Comput Biol Bioinform, 2012

2 - Dual Inequalities for Stabilized Column Generation Revisited
Stefan Irnich, Timo Gschwind
Column generation (CG) models have several advantages over compact formulations, e.g., they provide better LP bounds, may eliminate symmetry, and can hide non-linearities in their subproblems. However, users also encounter drawbacks in the form of slow convergence, a.k.a. the tailing-off effect, and the oscillation of the dual variables. Among different alternatives for stabilizing the CG process, Ben Amor, Desrosiers, and Valério de Carvalho (2006) suggest the use of dual-optimal inequalities in the context of cutting-stock and bin-packing problems. We will generalize their results, provide new classes of (deep) dual-optimal inequalities, and show the applicability to other problems (vertex coloring, bin-packing and cutting-stock problems with conflicts, temporal knapsack problem). We also suggest the dynamic addition of violated dual inequalities in a cutting-plane fashion, and present computational results proving the usefulness of the methods.

3 - Solving MINLP using hierarchical MILP relaxations
Lars Schewe
We discuss the solution of large MINLPs using hierarchical MILP relaxations. The MILP relaxations are derived using piecewise-linear relaxations of the underlying nonlinear constraints. The main focus of our talk is to discuss how we can adaptively refine the relaxations. We show the applicability of this technique on examples from gas network optimization.

 WE-08
Wednesday, 16:05-17:35 - Fo8
Network Creation
Stream: Algorithmic Game Theory
Invited session
Chair: Pascal Lenzner

1 - Tree Nash Equilibria in the Network Creation Game
Akaki Mamageishvili, Matús Mihalák, Dominik Muller
In the network creation game with n vertices, every vertex (a player) buys a set of adjacent edges, each at a fixed amount C. It has been conjectured that for C greater than or equal to n, every Nash equilibrium is a tree, and this has been confirmed for every C greater than 273*n. We improve upon this bound and show that it is true for every C of at least 65*n. To show this, we provide new and improved results on the local structure of Nash equilibria. Technically, we show that if there is a cycle in a Nash equilibrium, then C has to be smaller than 65*n. Proving this, we only consider relatively simple strategy changes of the players involved in the cycle. We further show that this simple approach cannot be used to show the conjectured upper bound "C is less than n if a cycle may exist in a Nash equilibrium", but conjecture that a slightly worse bound "C less than 1.3*n" can be achieved with this approach. Towards this conjecture, we show that if a Nash equilibrium has a cycle of length at most 10, then indeed C is less than 1.3*n. We further provide experimental evidence suggesting that when the girth of a Nash equilibrium is increasing, the upper bound on C obtained by the simple strategy changes is not increasing. To this end, we investigate the approach for a coalitional variant of Nash equilibrium, where coalitions of two players cannot collectively improve, and show that if C is at least 41*n, then every such Nash equilibrium is a tree.
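The network creation game of the Mamageishvili et al. abstract charges each player C per bought edge plus the sum of its graph distances to all other players. The sketch below merely evaluates these costs on a toy four-player strategy profile (the edge price C and the profile are invented); it does not compute the equilibrium bounds of the talk.

```python
# Cost evaluation sketch for the (sum) network creation game: each player pays
# C per edge it buys plus the sum of its distances to all other players.
# Illustrative only; toy profile, no equilibrium analysis.
from collections import deque

C = 2.0
strategy = {0: {1}, 1: {2}, 2: {3}, 3: {0}}   # a 4-cycle, each player buys one edge

def distances(graph, source):
    dist = {source: 0}
    queue = deque([source])
    while queue:
        u = queue.popleft()
        for v in graph[u]:
            if v not in dist:
                dist[v] = dist[u] + 1
                queue.append(v)
    return dist

def player_cost(strategy, v):
    # Build the undirected graph formed by all bought edges.
    graph = {u: set() for u in strategy}
    for u, targets in strategy.items():
        for w in targets:
            graph[u].add(w)
            graph[w].add(u)
    dist = distances(graph, v)
    if len(dist) < len(strategy):          # disconnected: infinite cost
        return float("inf")
    return C * len(strategy[v]) + sum(dist.values())

print({v: player_cost(strategy, v) for v in strategy})
```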


2 - Network Creation Games with Incomplete Information
Davide Bilò
Network creation games have been extensively studied, both by economists and computer scientists, due to their versatility in modeling individual-based community formation processes, which in turn are the theoretical counterpart of several economic, social, and computational applications on the Internet. However, the generally adopted assumption is that players have common and complete information about the ongoing network, which is quite unrealistic in practice. We consider a more compelling scenario in which players have only limited information about the network they are embedded in. More precisely, we define a suitable equilibrium concept and we explore the game-theoretic and computational implications of assuming that players have a defective view of the network.

3 - Price of Anarchy in the Network Formation Adversary Model
Lasse Kliemann
The adversary model is one of the first network formation models to explicitly consider robustness aspects. It supposes that exactly one link in the built network will be destroyed by an adversary, according to a known probability distribution. The cost of a player is the expected number of other players to which connection will be lost. Despite its simplicity, this model poses a challenge when it comes to the price of anarchy.
In this presentation, I will give an overview of the results obtained so far for the price of anarchy in this model for Nash equilibrium, pairwise Nash equilibrium, and pairwise stability. Some of the graph-theoretic techniques used in the proofs will be sketched.

 WE-09
Wednesday, 16:05-17:35 - SFo1
Applications of linear and nonlinear optimization I
Stream: Continuous and Non-linear Optimization
Invited session
Chair: Michael Herty
Chair: Simone Göttlich

1 - Continuous-Discrete Optimal Control of Markov Switching Models and Stochastic Hybrid Systems with Jumps
Gerhard-Wilhelm Weber, Busra Temocin, Diogo Pinheiro, Nuno Azevedo, Sevtap Kestel, Nadi Serhan Aydin
We contribute to the hybrid, e.g., mixed continuous-discrete dynamics of stochastic differential equations with jumps or Markov-switching models, and to their optimal control. Those systems allow the representation of random regime switches and are of growing importance in economics, finance, science and engineering. We introduce two new approaches to this area of stochastic optimal control: one is based on the finding of closed-form solutions, the other one on a discrete-time numerical approximation scheme. The presentation ends with a conclusion and an outlook to future studies.

2 - Robust counterparts for two-modal complex regulatory networks: RMARS and RCMARS
Ayse Özmen, Erik Kropat, Gerhard-Wilhelm Weber
The modeling and prediction of regulatory networks is of considerable importance in many different areas such as finance, environmental protection, education, systems biology, medicine and life sciences. Modern statistical learning, data mining and estimation theory have provided many regression approaches. In particular, Multivariate Adaptive Regression Splines (MARS) is an important non-parametric regression methodology. However, after the recent financial crisis, it has been realized that the known statistical methods, which suppose that the input data are exactly known and equal to some nominal values in developing models, may give untrustworthy results. This introduces a kind of weakness to the methods since, in real life, both output and input data include uncertainty in the form of noise. Therefore, robustification has started to draw more attention in many fields, especially the financial sector, where regulatory networks appear naturally and the corresponding regression problems usually depend on complex databases that are affected by noise and uncertainty.
In our study, we investigate how the concept of robust optimization can provide a modeling framework for the analysis of complex regulatory systems under data uncertainty. Supported by Robust Optimization, we analyze time-discrete target-environment regulatory systems under polyhedral uncertainty by using Robust MARS (RMARS) and Robust Conic MARS (RCMARS), which is more model-based and employs continuous, well-structured convex optimization that enables the application of Interior Point Methods and their codes, e.g., MOSEK. We demonstrate the performance of RMARS and RCMARS with a numerical study and compare their results.

3 - On biconjugates of infimal functions
Sorin-Mihai Grad, Gert Wanka
We deliver formulae for the biconjugate functions of some infimal functions that hold provided the fulfillment of certain regularity conditions. Moreover, we rediscover or extend different results from the literature.

 WE-11
Wednesday, 16:05-17:35 - SFo3
Decision Making and Game Theory
Stream: Decision Theory and Multi-Criteria Optimization
Invited session
Chair: Pierre von Mouche

1 - The proportional partitional Shapley value
Francesc Carreras, José María Alonso-Meijide, Julian Costa, Ignacio García-Jurado
Cooperative game theory deals with situations where a group of agents (players) wants to share the gains derived from their cooperation. A value is a solution concept that proposes, according to some criteria, an allocation vector for each cooperative game that represents a fair compromise for the players. The best-known value is the Shapley value.
Since the notion of a cooperative game with a coalition structure (a partition of the set of players into unions) was first considered, modifications of the Shapley value, called coalitional values, have been introduced and analyzed in the game-theoretical literature. The two most cited ones are the Aumann-Drèze value and the Owen value. They are based on two different interpretations of the coalition structure that give rise to two different approaches when defining a coalitional value:
1. Aumann and Drèze consider that, once a coalition structure exists, a cooperative situation arises in each union independently of the remaining ones (isolated unions).
2. Instead, Owen considers the coalition structure just as a way to influence the negotiation among the agents (bargaining unions).
Here we adopt approach 1. A new coalitional value, the proportional partitional Shapley value, is proposed under the hypothesis of isolated unions but, contrary to Aumann and Drèze, it takes into account in some manner the outside options for the players. The main difference between this new value and the Aumann-Drèze value is that the allocations within each union are not given by the Shapley value of the restricted game, but proportionally to the Shapley value of the original game. Axiomatic characterizations of the new value, as well as examples illustrating its application and a comparative discussion, are provided.
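The proportional partitional Shapley value above distributes the worth within each union proportionally to the Shapley value of the original game. As background, the sketch below computes the ordinary Shapley value of a small TU game by enumerating player orders; the game's worth function is invented, and the coalitional value itself is not implemented.

```python
# Exact Shapley value of a small TU game by enumerating permutations.
# Illustrative reference implementation with a toy 3-player worth function.
from itertools import permutations
from math import factorial

def shapley(players, v):
    """players: list of hashable ids; v: function frozenset -> worth."""
    n = len(players)
    phi = {p: 0.0 for p in players}
    for order in permutations(players):
        coalition = frozenset()
        for p in order:
            phi[p] += v(coalition | {p}) - v(coalition)   # marginal contribution
            coalition = coalition | {p}
    return {p: phi[p] / factorial(n) for p in players}

def worth(s):
    return {frozenset(): 0, frozenset({1}): 0, frozenset({2}): 0, frozenset({3}): 1,
            frozenset({1, 2}): 3, frozenset({1, 3}): 1, frozenset({2, 3}): 1,
            frozenset({1, 2, 3}): 4}[frozenset(s)]

print(shapley([1, 2, 3], worth))
```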


2 - Closed-Loop Nash Equilibria Strategy under Uncertainty of Advertising Effect in Advertising Competition
Yasuhiro Iida, Kei Takahashi, Takahiro Ohno
In this study, we propose a model of advertising competition under uncertainty of the advertising effect in order to measure the influence that this effect has on the strategies of the firm. Advertising competition is one of the dynamic marketing problems. In markets, a firm needs to take care of its own and competitors' advertising strategies in order to maintain or improve its own market share, sales and profit. Over the past few decades, considerable studies, for example those introduced in Huang et al. (2012), have been conducted on advertising competition. In some of these studies, advertising competition is formulated as a differential game model. However, the advertising effect is assumed to be deterministic. In other words, it is assumed that state variables change by the actions of a firm and the competitors and do not change due to other factors.
Prasad and Sethi (2004) proposed the extended Vidale-Wolfe model under uncertainty of market share. However, they did not consider the uncertainty of the advertising effect. Jorgensen and Zaccour (2004) mentioned the need to consider the uncertainty of the advertising effect. It is clear that the advertising effect is uncertain, for example through the audience rating of a television commercial.
This paper proposes Closed-loop Nash equilibria (Markovian Nash equilibria) in a Lanchester differential game of advertising competition with uncertainty of the advertising effect. In this study, we formulate that the advertising effect changes over time stochastically. In addition, Closed-loop equilibria in stochastic games satisfy Hamilton-Jacobi-Bellman equations.
In the numerical experiments, the advertising cost and profit of our strategy are compared to those of actual data and of the Open-loop Nash equilibria strategy.

3 - On the Setting for the Selten-Szidarovzsky Method
Pierre von Mouche
In this article I reconsider a technique for handling Nash equilibria for additively aggregative games, i.e., games where the payoff function of a player depends only on his own strategy and the sum of all strategies. This technique, developed by Selten and Szidarovzsky, was especially successful in oligopoly theory. The technique consists in transforming the fixed point problem for the best reply correspondence R into an associated fixed point problem for a correspondence B. The value of the technique lies in the fact that the fixed point problem for B in general is simpler than that for R, as the domain of B typically is a subset of the real numbers. This holds in particular under suitable differentiability and concavity assumptions for the payoff functions of the game.
In fact, the technique even applies to correspondences with a special factorisation property, and so its setting is not necessarily a game-theoretic one. A quite general class of correspondences to which it applies has recently been identified. In the present article I show that it is possible to extend this class, which means that the technique can handle more general types of aggregative games.

 WE-12
Wednesday, 16:05-17:35 - SFo4
Resource-Constrained Project Scheduling: Efficient Solution Procedures
Stream: Project Management and Scheduling
Invited session
Chair: ONCU HAZIR
Chair: Utkan Eryilmaz

1 - On the efficient modeling of the multi-mode resource-constrained project scheduling problem with generalized precedence relations in the optimization framework SCIP
Alexander Schnell, Richard Hartl
The aim of resource-constrained project scheduling (RCPS) is to assign starting times to a number of jobs subject to precedence and resource constraints such that a project-related objective is optimized. We present a new exact approach for the multi-mode RCPS problem (MRCPSP) with generalized precedence relations (GPRs) and the objective of makespan minimization. State-of-the-art exact algorithms for the single-mode RCPSP integrate techniques from Constraint Programming (CP) and Boolean Satisfiability (SAT) solving in a Branch-and-Bound search framework. In our talk, we show how these techniques can be generalized to the MRCPSP with GPRs. For the generalization, we implemented two new constraint handlers for the optimization framework SCIP. The latter capture constraint propagation rules for precedence and renewable resource constraints. Moreover, they pass valid clauses deduced from the processed domain reductions to the SCIP-internal SAT mechanism. We introduce three mathematical models in SCIP for the MRCPSP with GPRs which integrate our new constraint handlers. The different formulations are tested on 30-, 50- and 100-job instances from the literature. Our computational experiments show that our exact algorithm, i.e. the solution of our models with SCIP, outperforms the state-of-the-art exact approach from the literature. Finally, we conclude that our approach is easily usable via SCIP and can also be applied to generalizations of the MRCPSP containing more general objective functions and resource constraints.

2 - Review of The Literature About Genetic Algorithms On The Resource-Constrained Project Scheduling Problems
Gülnar Eren, Burcu Karaöz, Şevkinaz Gümüşoğlu
Project management is a complex decision-making process involving the unrelenting pressures of risk, time and cost. A project management problem typically consists of planning and scheduling decisions (Gonçalves, Mendes, 2008). The resource-constrained project scheduling problem (RCPSP) consists of activities that must be scheduled subject to precedence and resource constraints such that the makespan is minimized (Hartmann, Briskorn, 2010). It is an NP-hard (Non-Deterministic Polynomial-Time Hard) problem which contains a number of complicated problems in scheduling like job shop, flow shop and assembly line balancing. As the problem is NP-hard, the performance of exact methods is limited and they can only solve small-sized project networks. Genetic algorithms (GA) adapt to dynamic factors such as changes to the project plan and aim to find near-optimal solutions; they also overcome the poor performance of the exact procedures for large-sized project networks. This study presents a literature review about genetic algorithms for resource-constrained project scheduling problems and aims to show the advantages of using a genetic algorithm (GA) as a heuristic method in RCPSP.

3 - Efficiency Evaluation of Multi-Mode Project Schedules: Comparison of Different Approaches
Utkan Eryilmaz, ONCU HAZIR, Klaus Schmidt
In this study, we evaluate the efficiency of schedules for a well-known multi-mode project scheduling problem, the discrete time/cost trade-off problem (DTCTP). We aim to investigate different approaches to measure the efficiency of solutions using Data Envelopment Analysis (DEA) and compare them in terms of applicability in real-life problems. In a previous study (Eryilmaz et al. 2014), robustness was integrated as a criterion for ranking the schedules. However, each DEA model (input, output orientation, slack-based measure; constant, variable, increasing, and decreasing returns to scale; weight restricted) resulted in different efficiency scores and interpretations. Differently, in this research, we compare the DEA approaches and investigate the domination relations for the set of schedules considering various criteria (cost, time, and robustness). To validate the effectiveness and efficiency of the chosen approach, extensive computational experiments and statistical analysis will be performed.
References
Eryilmaz, Utkan, Hazir, Oncu, and Schmidt, Klaus W., A Multi-Criteria Approach for Ranking Schedules for Multi-Mode Projects, Proceedings of the 14th International Conference on Project Management and Scheduling, TUM School of Management, Munich, Germany, (2014), pp. 80-83.


product is used. However, in several studies it has been shown that 2 - On the Information Content of Decomposed Finan- a more detailed modeling of a customer’s purchase history has addi- cial Return Series: A Wavelet Approach tional predictive value, for example to estimate the probability of next Theo Berger purchase. The major challenge with sequence modeling is the combi- natorial growth of possible sequences and the resulting small groups of We decompose financial return series via wavelets into different time customers for each sequence (Empty Space Phenomenon). We present scales to analyze their information content regarding the volatility of a method to learn a user-defined number of customer segments, where the returns. Moreover, we investigate the information of each scale member count in each segment is significantly increased compared to and discuss the decomposition of daily Value-at-Risk (VaR) forecasts. a grouping based on raw purchase sequences. In our approach, we By an extensive empirical analysis, we analyze financial assets in calm first preprocess each customer’s purchase history by assigning each and turmoil market times and show that daily VaR forecasts are mainly bought product a geometrically decreasing weight depending on the driven by the volatility which is captured by the scales comprising recency of it acquisition. Thereby the focus is more on recent buying the short-run information. Further, we apply Extreme-Value-Theory behavior while still considering older purchase history depending on to each time scale and illustrate that the information which is stored by the discount factor used. The resulting weighted purchase vectors are the short-run scales linked via copulas outperforms classical paramet- then projected into product space and distance-based clustering is used ric VaR approaches which incorporate all information available. for the segmentation. Experimental results based on data provided by a large telecommunications company show high discriminative power 3 - Short-Term Market Models for Option Prices of the derived segments for predicting next purchases. Hans-Jörg von Mettenheim, Michael H. Breitner 2 - Sparse principal component analysis in revenue We use high-frequency option data to build a market model of option management data prices. To this end we train an artificial neural network. The network Claus Gwiggner, Catherine Cleophas is retrained on a rolling window basis. The results can be used for real-time option pricing or for short-term forecasting. Similar customer behaviour should lead to clusters in the data they leave behind. When such clusters overlap, standard algorithms might lead to uninterpretable results. We performed standard and sparse prin- cipal component analysis in order to explain the heterogeneity in rev- enue management data. Our results identify typical customer behavior  WE-19 and some first new insight into dependencies between our variables. Wednesday, 16:05-17:35 - I 3 - Decoding problem gamblers’ signals — a decision model for casino enterprises Routing Sandra Ifrim Stream: Traffic and Transportation The aim of the present study is to offer a validated decision model for Invited session casino enterprises. The model enables those users to perform early detection of problem gamblers and fulfill their ethical duty of social Chair: Ha Hoang cost minimization. 
To this end, the interpretation of casino customers’ nonverbal communication is understood as a signal-processing prob- 1 - Geometric Approaches for Districting Problems lem. Indicators of problem gambling recommended by Delfabbro et al. Alexander Butsch, Jörg Kalcsics, Stefan Nickel (2007) are combined with Viterbi algorithm into an interdisciplinary model that helps decoding signals emitted by casino customers. Model Districting is the problem of grouping small geographic areas, called output consists of a historical path of mental states and cumulated so- basic areas, into larger geographic clusters, called districts, subject to cial costs associated with a particular client. Groups of problem and a number of relevant planning criteria. non-problem gamblers were simulated to investigate the model’s di- In this talk we will focus on practical problems in the context of sales agnostic capability and its cost minimization ability. Each group con- districting. In this application a basic area corresponds to a customer sisted of 26 subjects and was subsequently enlarged to 100 subjects. location and a district corresponds to the area of responsibility for one In approximately 95 percent of the cases, mental states were correctly sales person. Three important planning criteria are balance, compact- decoded for problem gamblers. Statistical analysis using planned con- ness and contiguity. Balance describes the requirement for districts trasts revealed that the model is relatively robust to the suppression to have approximately the same size with respect to the workload or of signals performed by casino clientele facing gambling problems as sales potential. A district is said to be geographically compact if it is well as to misjudgments made by staff regarding the clients’ mental closely and firmly packed together. Compact districts reduce the sales states. Only if the last mentioned source of error occurs in a very pro- persons’ unproductive travel time. Contiguity means that it is possible nounced manner, i.e. judgment is extremely faulty, cumulated social to travel between the basic areas of a district without having to leave costs might be distorted. the district. The main idea of our solution approach is to recursively subdivide the problem geometrically into smaller and smaller subproblems until an elementary level is reached, at which point we can efficiently solve the problem. To subdivide a problem into two subproblems, we first deter-  WE-15 mine two parallel lines L1 and L2, such that the set of all basic areas Wednesday, 16:05-17:35 - SFo11 to the left (right) of L1 (L2) already comprises a balanced subproblem. In order to obtain compact districts, we then assign sequentially each Forecasting Applications for Quantitative basic area between these lines to its closest subproblem. In that way, we are also able to consider network distances, although we use a ge- Trading and Investing ometric approach. Tests on real-world data confirm the efficiency of this approach and the quality of the solutions obtained. Stream: Statistics and Forecasting Invited session 2 - A new approach for the vehicle routing of hazardous Chair: Hans-Jörg von Mettenheim materials Julia Rieck, Carsten Ehrenberg, Jürgen Zimmermann 1 - Company fundamentals as mid-term predictive indi- Each year millions of tons of hazardous materials are transported cators across Europe. 
Peter Lusk, Hans-Jörg von Mettenheim

We investigate the influence of company fundamentals on the mid-term development of a company’s stock price. The contribution to the literature is three-fold: 1.) We filter using several fundamental indicators. This reduces the number of potential portfolio candidates. 2.) We specifically consider a mid-term investment of several weeks as opposed to a daily or yearly investment horizon. 3.) We focus on the importance of including debt in the relevant indicators.

Hazardous materials are substances which, if released or misused, can have significant impacts on human life and the natural environment. Therefore, the transportation and vehicle routing of hazardous materials must be carefully managed. We consider a practical variant of the vehicle routing problem, where a weighted distance criterion is involved that results from the risk of hazardous materials passing on road links. Thereby, the risk of a vehicle load on a road link is computed using the risk factors presented in the "European agreement concerning the international carriage of dangerous goods by road" (ADR). The problem is modeled as a mixed-integer linear program and small-scale problem instances are solved with CPLEX. In


order to solve large-scale instances heuristically, a genetic algorithm where factories and customers have fixed locations. The cost func- is presented that uses a random key representation. Computational ex- tion covers transportation and facility fixed costs as well as inventory periments are conducted on benchmark problems from the literature holding and handling costs, and thus underlines the important trade-off in order to evaluate the performances of the proposed solution proce- between these costs. We allow for direct flows between factories and dures. customers and consider capacitated vehicles. In order to be able to an- alyze large real-life problems, we develop a continuous optimization 3 - Vessel journey planning for oil products distribution formulation (and avoid using integer variables). The latter is shown in Vietnam to decompose when the flows through the DCs are fixed. In this case, Ha Hoang, Gerrit K. Janssens the inventory decisions can be computed from a closed-form equation and the location-allocation decisions follow from solving a linear pro- The research deals with a routing and scheduling problem of special- gram. Based on this, we propose an iterative heuristic which, at each ized vessels carrying oil products. The aim is to develop some models iteration, estimates the DC flows, solves a linear program, and then for enhancing profits to oil products tankers fleet pools and for reduc- improves the DC flow estimations. In order to assess its efficiency, the ing shipping costs of oil products tankers to increase their competi- heuristic is tested on many different configurations. It shows to be able tiveness. A heterogeneous fleet transports the products from several to design large supply chains and to uncover benefits from inventory loading ports to several discharging ports. Time windows are involved decisions. Finally, we illustrate the application of our approach on the on the discharging side due to production and storage plans, and on the inspiring practical case. loading side as a result of negotiations with customers. Demand may be delivered by more than one vessel. Also a vessel may have both a 3 - A stakeholder perspective as a basis for sustainable service related to pick-up of oil products and a delivery of oil products. supply chain design The routing and scheduling problem is formulated as a mixed-integer Catherine Decouttere, Nico Vandaele, Stef Lemmens programming problem. It includes information and constraints about the supply and demand, but also about the vessels, the vessel routes, Sustainability goes far beyond the inclusion of emissions. We present and the ports and their restrictions. a five-step framework where supply chain modelling is embedded in a broader contextual setting to preserve sustainability in all its aspects: stakeholder analysis, key performance setup, model construction and scenario building, scenario ranking and final design choice. This con- tribution focusses on the first and second step. The stakeholder analy- sis provides insight in the number, type and interrelationships between  WE-20 the various stakeholders. We cover internal and external, supply and Wednesday, 16:05-17:35 - II demand type of stakeholders. From this analysis a concise set of key performance indicators is derived. In this process we preserve the in- Supply Chain Design clusion of different types of KPI’s, which we classify as technical, eco- nomical and value based. 
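The hazardous-materials talk above mentions a genetic algorithm that uses a random-key representation. One common way to decode such a random-key chromosome into vehicle routes is sketched below; the exact decoding scheme used in the talk is not specified, so this is an assumption:

```python
import numpy as np

def decode_random_key(keys, n_vehicles):
    """Decode a random-key vector into routes, one per vehicle.

    keys : array of length n_customers with values in [0, 1); the integer part of
           keys * n_vehicles assigns the vehicle, the fractional order sorts the
           customers within each vehicle (one common random-key decoding scheme).
    """
    vehicle = (keys * n_vehicles).astype(int)
    routes = []
    for v in range(n_vehicles):
        members = np.where(vehicle == v)[0]
        # order the customers of vehicle v by their key value
        routes.append(list(members[np.argsort(keys[members])]))
    return routes

rng = np.random.default_rng(0)
keys = rng.random(8)          # one individual of the genetic algorithm, 8 customers
print(decode_random_key(keys, n_vehicles=3))
```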
It is along these types we balance the sus- Stream: Production and Operations Management tainability aims of the supply chain: technological sustainability from the viewpoint of products, services and supply systems; economical Invited session sustainability in term of the financial continuation of the system and Chair: Nico Vandaele the sustainability based on the human values contain, user and cus- tomer values as well as ecological, social and ethical aspects. We will illustrate the approach with real-life industrial evidence. 1 - Integrated supply chain design models for vaccines: A literature review 4 - Facilitating product platform decisions based on to- Stef Lemmens, Catherine Decouttere, Nico Vandaele tal supply chain costs Maud Van den Broeke Companies all over the world are confronted with designing and re- designing the supply chain of their businesses. A typical supply chain Product platforms hold the promise for companies to deliver a large consists of the following levels: suppliers, plants, distribution centers product variety to their customers in a cost efficient way. Our study and customer markets. The location of distribution centers is, for ex- presents a model to support companies in making product platform de- ample, an expensive and a difficult to reverse long-term, strategic de- cisions based on the overall supply chain costs. By quantifying the cision that can be tackled with supply chain network design. However, supply chain costs that correspond to a set of platform choices on the tactical and operational level decisions are often taken into account one hand and end product requirements on the other, the model is able when addressing supply chain network design issues: the inventory to evaluate (1) whether introducing platforms is beneficial, (2) how flows and order policies between consecutive supply chain stages de- many and which platforms should be developed, and (3) which prod- pend on the network design. The importance of integrating these deci- ucts should be derived from which platforms. The costs impacted by sion levels can hardly be overestimated. The aim of this literature re- the strategic platform decisions, and considered in the model, origin view is twofold. We provide an updated overview on integrated supply from the various supply chain activities within the company, namely chain network design and we study the issues for the design of a supply development, ordering, purchasing, inventory management and cus- chain network for a peculiar pharmaceutical product: vaccines. Vac- tomisation. We find that the most cost-efficient product platform de- cines are not ordinary commodities and concern the human race. They cision depends on the trade-off between the costs related to the plat- reduce the worldwide disease burden of many infectious diseases. The forms, such as the investments in the initial platform development, and methods of distribution are country dependent and affected by national the costs to derive product variants from those platforms, such as the vaccination policies. This complicates the demand forecasting of vac- customisation cost. Another trade-off influencing the platform deci- cines. The total lead time of vaccines varies between 9 and 22 months sion is the one between development costs and other costs. The ex- which makes it even more difficult to match supply and demand. 
These istence of these trade-offs makes the evaluation of product platform long lead times are due to the permanent quality control and quality as- decisions a complex problem and confirms the need for an integrated surance. The perishability of vaccines requires cold chain management cost model. The relevance of our platform evaluation model is shown to ensure a safe vaccination for the entire world population. Further- through the application to a real business example at a global technol- more, the high inventory value and the short shelf-life of the vaccines ogy company specialised in the development and production of medi- complicate the inventory management. cal screens. 2 - A location-inventory model and heuristic to design large supply chains Jean-Sébastien Tancrez, Jean-Charles Lange, Pierre Semal

Our research is inspired by the real-life case of a leading European  WE-21 glass manufacturer, with around 500 customers throughout Europe. Wednesday, 16:05-17:35 - III The case concerns its reverse logistics network, and the idea of ac- cumulating folded empty trestles in regional depots, to return them to Distribution & Inventory Management factories in trucks that are better utilized. Consequently, in this prob- lem, inventory management decisions such as the shipment size play Stream: Logistics and Inventory a central role, and have to be integrated with the location-allocation decisions. Inspired by this practical case, we study the location of Invited session intermediary facilities in a three-level network (reverse or forward), Chair: Uwe Clausen


1 - Integrated Optimization of Safety Stock and Trans-  WE-22 portation Capacity Wednesday, 16:05-17:35 - IV Horst Tempelmeier, Oliver Bantel We consider a segment of a supply chain comprising an inventory and Vehicle Routing with Intermediate a transportation system that cooperate in the fulfillment of stochastic Facilities customer orders. The inventory is operated under a discrete time (r,s,q) policy with backorders. The transportation system consists of an in- Stream: Logistics and Inventory house transportation capacity which can be extended by costly external Invited session transportation capacity (such as a third-party logistics provider). Chair: Michael Schneider We show that in a system of this kind stock-outs and the resulting ac- cumulation of inventory backorders introduces volatility in the work- load of the transportation process. Geunes and Zeng(2001) have shown 1 - The Electric Vehicle Routing Problem with Time Win- for a base-stock system, that backordering decreases the variability of dows and Load-Dependent Energy Consumption transportation orders. Our findings show, that in inventory systems Mario Ruthmair, Jakob Puchinger, Luís Gouveia with order cycles longer than one period the opposite is true. In both cases, however, inventory decisions and transportation decision must We study a generalization of the vehicle routing problem with time be taken simultaneously. windows: Instead of vehicles with a conventional combustion engine We present a procedure to compute the probability distribution of the we consider electric vehicles which usually have a strictly limited number of transportation orders and the resulting excess transportation range. We assume, that these vehicles might not be able to complete requirements or rather transportation costs. We show that the increase their tour with a single battery charge and thus would have to visit one of transportation costs resulting from a safety stock reduction may off- or more recharging stations. Given is a graph with a depot, a set of set the change of the inventory costs. This effect may have a significant clients, and a set of recharging stations. Each client has a strictly pos- impact on general optimality statements for multi-echelon inventory itive demand and a time window in which it has to be visited. The systems. vehicles are able to recharge their battery’s state of charge (SOC) by a fixed amount of energy per time unit. A network arc is defined by its 2 - Inventory Optimization at Ford of Europe travel cost and time. Further, a vehicle consumes a particular amount of paul moraal, manuel bojahr energy on each arc which depends on the arc length, its empty weight and linearly on the weight of the currently loaded goods. This amount In the automotive industry, managing inventory is both critically im- can be negative if the vehicle is able to recover energy on a down- portant and highly complex. The cost associated with keeping a suffi- ward slope. The fleet is homogeneous with fixed load and SOC limits. cient number of vehicles in inventory is significant. At the same time, The objective is to find a set of routes with minimal total costs such depending on vehicle segment and geographical market, a sizable share that each route starts and ends at the depot, each client is visited ex- of the customer base expects to be able to purchase a vehicle without actly once within its time window, the total demand of all clients on having to wait for it to be built. 
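The safety-stock talk above studies a discrete-time (r,s,q) policy with backorders and the resulting variability of the transportation workload. The talk computes this distribution analytically; the following toy Monte-Carlo sketch, with invented Poisson demand and parameters, only illustrates the quantity of interest:

```python
import numpy as np

def simulate_rsq(r=2, s=20, q=15, mean_demand=6.0, periods=100_000, seed=1):
    """Monte-Carlo sketch of a discrete-time (r,s,q) policy with backorders.

    Every r periods the inventory position is reviewed; if it lies below s, an
    integer multiple of q is ordered that raises it back above s. Returns the
    empirical distribution of the number of q-batches ordered per review,
    a proxy for the transport workload per replenishment cycle.
    """
    rng = np.random.default_rng(seed)
    position = s + q                      # inventory position (on hand + on order - backorders)
    batch_counts = []
    for t in range(periods):
        position -= rng.poisson(mean_demand)   # demand; unmet demand is backordered
        if t % r == 0:                         # review epoch
            batches = 0
            while position < s:
                position += q
                batches += 1
            batch_counts.append(batches)
    values, counts = np.unique(batch_counts, return_counts=True)
    return dict(zip(values.tolist(), (counts / counts.sum()).round(4).tolist()))

print(simulate_rsq())
```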
Not having the right vehicles in inven- a route must not exceed the load capacity, and the SOC stays within tory can also be costly. its limits. Recharging stations may be visited as often and as long as A complicating factor in inventory management is the fact that the typ- necessary. Schneider et al. (2012) introduced a variant of this prob- ical product complexity (in terms of uniquely different buildable con- lem with strictly positive and load independent energy consumption figurations) is measured in the billions or even many orders of magni- per arc. At a recharging station the SOC of a vehicle is always recov- tude higher. In other words, it is physically impossible to keep every ered to its maximum. We consider the more practical setting defined possible configuration of a given vehicle model in inventory. above and present mixed integer linear programming formulations and corresponding branch-and-cut methods to solve them. In this presentation we will present a few recent topics in inventory management at Ford of Europe, where mixed-integer optimization 2 - A branch-and-price approach for a vehicle routing techniques were employed to improve the overall inventory manage- ment. Particular emphasis will also be on the challenges that one problem with optional mid-tour recharging stops often encounters in practical business applications: production con- Stefan Frank, Henning Preis, Karl Nachtigall straints, organizational and timing constraints, data fusion from differ- ent sources, communication with non-mathematically minded business In this paper we study a variant of the vehicle routing problem with partners, confounding factors in the evaluation of final results, etc. time windows (VRPTW) in which the fleet consists of battery electric vehicles (BEVs). Because of the limited range of BEVs and a low- 3 - Solution algorithms for storage unloading and pre- developed infrastructure of charging stations the energetic feasibility marshalling problems with types of vehicle routes must be focused in models and algorithms. This in- cludes the state of charge as well as possible mid-tour recharging stops. Jana Lehnfeld This contribution contains a branch-and-price approach where the sub- Storage unloading and premarshalling problems occur in container ter- problem consists of an elementary shortest path problem with resource minals, tram and bus depots and steel slab warehousing where items constraints (ESPPRC). The ESPPRC is characterized by the common are stored in stacks and need to be retrieved out of a storage area. Due resource constraints (cost, time, load, customer visits) and is extended to the arrangement in stacks, only the topmost item of each stack is by energy feasible paths. Therefore possible recharging stops along a accessible directly. If an item below has to be retrieved, reshuffling is path are included in a labeling algorithm which is used to solve the sub- necessary. Since reshuffling is very time-consuming, most problems problem. We present several aspects concerning our approach, discuss try to minimize the number of reshuffling moves. results and outline further work. A common problem is that, given a retrieval sequence of items, one 3 - A Rich Electric Fleet Size and Mix Problem has to find a relocation pattern with a minimum number of reshuffling moves which satisfies the sequence. 
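The storage unloading and premarshalling talk above works with typed items stored in stacks, where a request may be served by any item of the requested type. A small greedy sketch of the unloading variant that counts reshuffling moves; it is only a baseline illustration, not one of the optimizing algorithms presented in the talk, and it assumes every requested type is present somewhere:

```python
def unload_by_type(stacks, request_sequence):
    """Greedy unloading of typed items; only the topmost item of a stack is accessible.

    stacks           : list of lists, the top of each stack is the last element
    request_sequence : requested types; any item of the requested type may be retrieved
    Returns the number of reshuffling moves performed.
    """
    reshuffles = 0
    for wanted in request_sequence:
        # prefer a stack whose topmost item already has the requested type
        top_hits = [i for i, s in enumerate(stacks) if s and s[-1] == wanted]
        if top_hits:
            stacks[top_hits[0]].pop()
            continue
        # otherwise dig in the stack where the type is closest to the top
        candidates = [(len(s) - 1 - max(j for j, x in enumerate(s) if x == wanted), i)
                      for i, s in enumerate(stacks) if wanted in s]
        depth, i = min(candidates)
        for _ in range(depth):            # move blocking items to the currently shortest other stack
            blocker = stacks[i].pop()
            target = min((j for j in range(len(stacks)) if j != i), key=lambda j: len(stacks[j]))
            stacks[target].append(blocker)
            reshuffles += 1
        stacks[i].pop()
    return reshuffles

print(unload_by_type([[1, 2, 1], [3, 1], [2]], [1, 2, 3, 1]))
```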
If on the one hand, this problem Gerhard Hiermann, Thibaut Vidal, Jakob Puchinger, Richard is considered as an unloading one, a current target item is retrieved Hartl as soon as it is the topmost item of any stack, i.e. the storage area gets emptier during unloading. If on the other hand, this problem is In recent years, the optimal use of alternative fueled vehicles in trans- considered as a premarshalling one, all items have to be sorted such port applications has received increased attention. This has led to sev- that afterwards no reshuffling is necessary to retrieve the sequence. In eral reformulations of existing problems to cover the newly introduced particular, no item leaves the storage area during premarshalling. features. In some practical applications (e.g. storage of wooden plates), items In previous work we combined two streams of research to model a occur in types. Several items belong to the same type if they share the fleet sizing problem with battery electric vehicles, time windows and same properties. In this case, a retrieval request does not ask for a spe- the possibility of recharging on tour at dedicated recharging stations — cific item but for any item of a specific type which means that one item called the Electric Fleet Size and Mix Vehicle Routing Problem with has to be chosen among several possible items. Time Windows and recharging stations (E-FSMVRPTW). This formu- lation is limited to a fleet of electric vehicles only, neglecting the real In this talk, we will present solution algorithms for unloading and pre- world requirement of considering conventional vehicles as well. Fur- marshalling problems with types. We will discuss and compare them thermore, by enforcing a strict recharging policy (always recharge to to existing algorithms for similar problems. Moreover, we will give an full capacity) past works are restricted to a subset of routing decisions, overview of further problem variants. omitting routes where a recharge to only half of the battery would re- sult in a time-feasible solution.
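For the electric-vehicle routing talks above, feasibility of a route hinges on time windows and on the battery's state of charge, which can be restored at recharging stations. A minimal forward feasibility check in that spirit; the field names, the full-recharge policy and the constant recharge rate are simplifying assumptions (load limits and partial recharging, as discussed in the talks, are omitted):

```python
def route_is_feasible(route, battery_capacity, recharge_rate):
    """Forward check of time-window and state-of-charge (SOC) feasibility for one route.

    route is a list of dicts, one per visited node, with the keys
      'travel_time', 'energy'  -- time and energy consumed on the arc into the node
      'earliest', 'latest'     -- time window (service time assumed zero here)
      'is_station'             -- recharging stations restore the SOC to capacity
    """
    time, soc = 0.0, battery_capacity
    for node in route:
        time += node['travel_time']
        soc -= node['energy']
        if soc < 0:
            return False                        # ran out of energy before reaching the node
        if time > node['latest']:
            return False                        # time window missed
        time = max(time, node['earliest'])      # wait if the vehicle arrives early
        if node['is_station']:
            recharge = battery_capacity - soc   # recharge to full; charging takes time
            time += recharge / recharge_rate
            soc = battery_capacity
    return True

toy = [{'travel_time': 2, 'energy': 3, 'earliest': 0, 'latest': 10, 'is_station': True},
       {'travel_time': 4, 'energy': 5, 'earliest': 5, 'latest': 12, 'is_station': False}]
print(route_is_feasible(toy, battery_capacity=6, recharge_rate=2))
```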


In this work we extend the E-FSMVRPTW formulation by introducing conventional and plug-in hybrid vehicles to the available fleet mix. A new decision set is considered since the engine mode (electricity or conventional fuel) can be switched en route. Furthermore, we enrich the problem by adding other real-world aspects, such as different recharging rates and decidable charging quantities as well as city-center restrictions. These restrictions model so-called ’green zones’ where vehicles using conventional engines are prohibited or penalized. We propose a rich electric fleet size and mix problem model and a layered evaluation and improvement approach using local search, dynamic programming-based labeling and greedy policies. We present first computational results on benchmark instances with a focus on the methodological aspects.

 WE-23 Wednesday, 16:05-17:35 - V
Energy, Heat and Steel Production
Stream: Production and Operations Management
Invited session
Chair: Sara Modarres Razavi

3 - Performance improvement of short-term production planning for district heating systems
Sara Modarres Razavi, Markus Bohlin, Andreas Nilsson, Per-Ola Larsson, Stephane Velut, Jonas Funkquist

District heating networks (DHN) can provide higher efficiencies and better pollution control compared to local heat generation. However, there are still many areas which can be improved and optimized in these systems. A DHN is a complex distributed system of different customer substations and components such as boilers, accumulators, pipes, and in many cases also turbines for electricity production. How to schedule the components with the objective of maximizing the profit of heat and electricity production over a finite time horizon is receiving increased attention [1-3], and is the problem that has been dealt with in this work. This mixed integer linear programming (MILP) problem has been formulated as a unit commitment problem, which involves finding the most profitable unit dispatch regarding production costs and heat and electricity sell prices, while simultaneously meeting the predicted district heating demands and satisfying network operational constraints. The heating demands within the optimization time horizon are predicted based on season and weather forecasts. In this work, the district heating plant in Uppsala, Sweden, owned by Vattenfall AB, has been considered as a reference plant for modeling and optimization. The optimization model is formulated in Python using the Pyomo modeling language and solved by the Gurobi solver. Hourly data for five consecutive days are used as the time horizon. The results demonstrate that with an accurate model of the DHN, it is possible to significantly increase the revenue of the DHN by finding the most economical way to dispatch different production components.

1 - Fuzzy Linear Programming Model for Scheduling Steelmaking and Continuous Casting Production
Eduardo Salazar

A fuzzy linear programming model for scheduling orders in steelmaking and continuous casting production is developed. The general structure of the production system considers an arbitrary number of machines at each stage, producing orders of several steel grades and types (e.g. slabs and billets).
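The district-heating talk above formulates an hourly unit-commitment MILP in Pyomo and solves it with Gurobi. A deliberately tiny Pyomo sketch in that spirit, with invented data and far fewer constraints than the Uppsala model; it assumes Gurobi (or any other MILP solver known to Pyomo) is installed:

```python
from pyomo.environ import (ConcreteModel, Var, Binary, NonNegativeReals,
                           Objective, Constraint, RangeSet, maximize, SolverFactory)

# invented toy data: 3 production units, 4 hourly periods
demand   = {1: 60, 2: 80, 3: 95, 4: 70}     # heat demand per hour
price    = {1: 30, 2: 35, 3: 40, 4: 32}     # sell price per unit of output
cap      = {1: 50, 2: 40, 3: 30}            # maximum output per unit
var_cost = {1: 18, 2: 22, 3: 28}            # variable production cost
fix_cost = {1: 100, 2: 60, 3: 40}           # cost of having a unit committed in an hour

m = ConcreteModel()
m.T = RangeSet(4)                            # hours
m.U = RangeSet(3)                            # units
m.on  = Var(m.U, m.T, domain=Binary)         # commitment decision
m.out = Var(m.U, m.T, domain=NonNegativeReals)

m.profit = Objective(
    expr=sum((price[t] - var_cost[u]) * m.out[u, t] - fix_cost[u] * m.on[u, t]
             for u in m.U for t in m.T),
    sense=maximize)

# meet the predicted demand in every hour, respect capacities of committed units
m.meet_demand = Constraint(m.T, rule=lambda m, t: sum(m.out[u, t] for u in m.U) >= demand[t])
m.capacity    = Constraint(m.U, m.T, rule=lambda m, u, t: m.out[u, t] <= cap[u] * m.on[u, t])

SolverFactory('gurobi').solve(m)             # any MILP solver available to Pyomo works here
print([[round(m.out[u, t].value, 1) for t in m.T] for u in m.U])
```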
In addition to optimization criteria  WE-24 such as makespan the satisfaction of continuity constraints (between Wednesday, 16:05-17:35 - AS batches), transit time constraints (in process time of liquid steel) and due date satisfaction constraints are of main importance in this prob- lem. But the strict satisfaction of these constraints in a given instance Auction Theory may produce bad solutions, i.e. feasible solutions that satisfy strictly all constraints whitin a high makespan (low productivity), or no fea- Stream: Pricing, Revenue Management, and Smart sible solution exists. In practice, aspects such as continuity, maximal Markets transit time and due date satisfaction can be handled in a relative man- ner: it is possible to allow schedules of casting sequences with small Invited session discontinuities and/or small violations of maximal transit times and Chair: Marion Ott due dates. The resulting symmetrical model optimizes an overall con- straints satisfaction grade of the global decision, i.e., the degree of sat- isfaction of the fuzzy continuity, transit time and due date satisfaction 1 - How does the Winner’s Curse affect bidding behavior constraints as the fuzzy makespan aspiration level constraint satisfac- in sealed-bid auctions? tion. Matej Belica, Karl-Martin Ehrhart 2 - Robust Optimization for the Multi-Period Unit Com- The Winner’s Curse is a highly relevant phenomenon for auction prac- mitment Problem titioners and researchers alike. Even if bidders adopt the Bayesian- Michal Melamed, Aharon Ben-Tal, Boaz Golany Nash equilibrium bidding strategy resulting in non-negative expected Unit commitment (UC) is one of the most important problems in elec- profits, they can experience the Winner’s Curse in individual realiza- tric power system operations. It seeks an economic operating policy to tions. This is specifically relevant for auctions in practice, as their a system of generating units over a multi-period finite horizon T. This success and acceptance will not only be judged on averages, but also policy determines the periodical power output of each unit, implying on individual realizations and outcomes. Hence, the frequency and which units to commit (operate) in order to meet the demand subject to the magnitude of the Winner’s Curse experienced in individual real- equipment and physical constraints. We consider the profit UC (PUC) izations, plays an important role. These indicators, however, can sig- problem of a generator operating in a deregulated market wishing to nificantly differ between auction formats even when bidders use the maximize its profit under uncertain demand and renewable energy out- equilibrium strategy. put. Electricity is inefficiently stored, which poses an additional chal- Within an interdependent values framework, we compare different auc- lenge. We employ the Robust Optimization (RO) methodology to solve tion formats with respect to the probability and the magnitude with the PUC model. The RO is a large-scale distribution-free methodology which the Winner’s Curse occurs in the respective symmetric equilib- designed for uncertain problems. It provides a feasible solution for rium. We show that there is no clear ranking between the first- and the any realization bounded within an uncertainty set, and its value is a second-price sealed-bid auction which depends both on the number of guaranteed bound on the objective function value over this set. For a bidders as well as on the distribution of value signals. 
given trajectory of the uncertain parameters, the problem is cast as a mixed integer linear program (MILP) with O(T) constraints. Since these uncertainties affect the objective function only, its robust counterpart (RC) includes the same O(T) mixed integer linear constraints as in the deterministic problem, yet its objective function is bilinear. Although such a problem is generally NP-hard, we utilize the uncertainty set structure to solve it to optimality via enumeration. Essentially, the resulting RC model is a MILP with O(T) constraints, as in the deterministic model. Numerical experiments show that the RC policy is significantly superior to the NOM policy, i.e., the solution to the problem given specific nominal trajectories of the uncertain parameters.

Furthermore, we present a theoretical model for sealed-bid auctions with symmetric interdependent values in which bidders exhibit loss aversion. Herein, we treat the ex-post realized value of the good as the reference point of each bidder. We derive an implicit characterization of the symmetric equilibrium bidding functions. For the special case of common value goods and beta distributed signals, we present explicit equilibrium bidding functions and show how the expected auction revenue differs between first- and second-price auctions depending on the number of bidders, the signal distribution and the probability and magnitude of the occurrence of the Winner’s Curse.


2 - Reference-Dependent Bidding in Dynamic Auctions as cost-effectively as possible while maintaining a high service level. Karl-Martin Ehrhart, Marion Ott Taking the classical inventory optimization problem as a starting point, we will provide an introduction to the additional challenges arising by Loss-averse bidders face different sensations as the price clock pro- way of service times, lead times, cash recycling, co-location of ATMs, ceeds in single-unit ascending or descending auctions. We investigate residual costs and limited transport capacities, amongst others. We will equilibrium bidding behavior of bidders with independent private val- present common pitfalls and successful modelling techniques. If time ues and reference-dependent preferences, applying the Köszegi and permits, we will go beyond the realm of retail banking and have a brief Rabin (2006) model. Bidders’ stochastic reference points are endoge- look at cash management for central banks. nous, and are determined by their strategy and their beliefs about the other bidders. Utility functions reflect that bidders anticipate changes 3 - Planning optimization for multi-site, fast & efficient in their reference point due to updated beliefs, e.g. about the own win- production processes - customer satisfaction and ning probability, during the course of the auction. An optimal bidding strategy can be reduced to a series of optimal binary decisions at each on-time delivery in focus price (approve or quit in the EA and wait or bid in the DA).We solve Alexander Aschauer for personal equilibrium (PE) profiles, which contain for each bidder a A smart and simple solution usable for production optimization must bidding strategy that is optimal given the others’ bidding strategies and nowadays answer a variety of questions for business operations and the reference point induced by the own and others’ strategies. There long-term planning. As part of this presentation and discussion we exists a range of belief-free PE profiles in the EA and a range of sym- give an insight into a customer project . The mission-critical objec- metric PE profiles in the DA under different existence conditions. The tives of this project included the improvement of customer orientation highest expected revenue in a PE profile of the DA is higher than in the by optimizing delivery and response times as well as optimizing the EA, but the highest expected revenue in a PE profile of the EA may ex- use of capital by reducing bound means. We optimized the produc- ceed the lowest expected revenue in the DA. The difference is mainly tion regarding utilization of existing capacity. Futher goals were the driven by the aversion to losing the item in the DA. automation of operational processes - for example the creation of de- 3 - When Bidders Fail to Coordinate: First-Price Sealed- livery plans for major projects or the bundling of contracts - and a simulation of "what if" scenarios for optimal decision support. Bid Package Auctions with Quasi-Linear and Value Bidders The project was realized using X-INTEGRATEs solution "XPO". Us- ing XPO, you can utilize production facilities better, improve customer Per Paulsen, Martin Bichler service and satisfaction and optimize planning horizons as well as pro- Much research in auction design was devoted to combinatorial or pack- duction processes. XPO is ready to use and can flexibly be adapted age auctions, which allow for bids on packages of objects. Arguably, to individual needs. 
Because of its module-based approach, XPO is suitable for medium-sized companies as well as large enterprises and regional manufacturing companies.

4 - Mathematical Optimization in Printed Circuit Board Assembly
Christoph Moll

The nature of the pick-and-place process directly implies a lot of discrete decisions to take. Somehow the machine setup, the sequence of operations on a machine and the distribution of workload within the machines in a line have to be defined. The same holds on the production planning level. The goal of reducing changeover times directly leads to the task of sequencing and clustering products. In this talk, we focus on the interaction of mathematicians and engineers in the development of machines and of mathematicians and production planners on the higher planning levels.

The first-price sealed-bid auction is the most widespread package auction format. However, the characterization of Bayesian Nash equilibrium strategies in the incomplete information game turned out to be hard. Initial attempts focus on a market where two local bidders compete against a global bidder. The resulting free-rider problem makes it hard for the local bidders to coordinate assuming quasi-linear utility functions. Quasi-linearity might not be the right assumption in many real-world markets. We analyze value bidders who receive their budget for certain packages from a principal, but this budget is considered sunk cost. This model describes utility functions as they can be found in ad markets or in spectrum auctions. With value bidders coordination becomes trivial for the local and single-minded bidders. In this paper we analyze a different market with two multi-minded bidders both interested in one or two units of a single good, where the split outcome is efficient. Interestingly, the results are the opposite. We show that in a Bayesian Nash equilibrium value bidders do not coordinate and this result is independent of distributions or risk aversion. In contrast, bidders with quasi-linear utility functions coordinate on the efficient split outcome in equilibrium.
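The printed-circuit-board talk above points to sequencing and clustering products in order to reduce changeover times. A very small nearest-neighbour sequencing heuristic, using hypothetical component setups per product, serves as a placeholder for the exact and planning-level methods discussed:

```python
def sequence_products(setups, changeover):
    """Greedy nearest-neighbour sequencing of products to reduce changeover effort.

    setups     : dict product -> set of required feeders/components
    changeover : function(current_setup, next_setup) -> cost of switching between them
    """
    remaining = set(setups)
    order = [remaining.pop()]              # arbitrary start product
    while remaining:
        nxt = min(remaining, key=lambda p: changeover(setups[order[-1]], setups[p]))
        order.append(nxt)
        remaining.remove(nxt)
    return order

# invented setups; the changeover cost counts components that must be newly mounted
setups = {'A': {1, 2, 3}, 'B': {2, 3, 4}, 'C': {7, 8}, 'D': {3, 4, 5}}
cost = lambda a, b: len(b - a)
print(sequence_products(setups, cost))
```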

 WE-25 Wednesday, 16:05-17:35 - AachenMuenchener Halle (Aula) OR Success Stories III Stream: Business Day Invited session Chair: Bjarni Kristjansson

1 - An OR-Framework-based solution for optimal flight simulator scheduling Ingmar Steinzen The ORCONOMY GmbH has successfully developed an IT-solution for scenario-based scheduling of flight simulator trainings. The goal was to achieve an optimal capacity allocation of cockpit simulator ca- pacities under consideration of customer requests. An advanced user interface in combination with a powerful optimization core ensured the quality and acceptance of the solution. 2 - Optimized Operation of Automated Teller Machines and Vaults Peter Lietz Being one of the largest manufacturers of automated teller machines (ATMs), it comes naturally to Wincor Nixdorf to offer cash manage- ment software and consulting to the end of running a network of ATMs


Wednesday, 17:45-18:45

 WF-25 Wednesday, 17:45-18:45 - AachenMuenchener Halle (Aula) Business Panel Discussion Stream: Business Day Plenary session Chair: John Poppelaars

1 - Analytics - Hype or here to stay? Josef Kallrath, Gertjan de Lange, Ingmar Steinzen, Hans Georg Zimmermann This is a panel discussion with business analytics experts from renowned companies. They will discuss whether analytics is differ- ent from operations research as we know it, its risks and opportunities, and what perspectives it offers for companies and scientists, theory and practice.


calibrating relevant types of simulation. For each aspect, we propose Thursday, 8:15-9:45 a general formulation of calibration as classical optimization problem. We examine the formulations’ complexities, their advantages and dis- advantages and available solving approaches. We particularly consider  TA-02 the quest for an automated calibration approach for agent-based simu- Thursday, 8:15-9:45 - Fo2 lations. Based on the view of calibration as optimization problem, this could be used for a wide field of applications in business analytics. Modeling 4 - Assigning University Students to Schools for Intern- ships in Teacher Education Stream: Discrete and Combinatorial Optimization, Kathrin Klamroth, Simon Görtz, Markus Kaiser, Michael Graphs and Networks Stiglmayr Invited session Before applying as teacher trainees, students in North Rhine- Chair: Kathrin Klamroth Westphalia have to complete at least one internship (6 months) at a school. These internships are centrally organised and can be scheduled during the summer or winter term, respectively. A feasible assignment 1 - A new approach for interference modulation in wire- of students to schools must respect subject specific capacity constraints less networks at the schools and at the associated centers for teacher education. Un- Grit Claßen, Arie Koster, Anke Schmeink der these constraints the assignment is optimized with respect to in- dividual preferences of students and distances between students and The wireless network planning problem as considered in this talk com- schools. prises two tasks: The decision which base stations should be deployed The problem is formulated as a discrete, assignment-like optimization and the assignment of traffic nodes to base stations. A predominant problem that has an interesting structure due to the conjunction of the problem in these types of networks is interference. A traffic node two majors of every student (e.g., Mathematics and Physics, Biology served by one base station also receives interfering signals from other and English,...) with the subject specific capacities at the schools and base stations. A prevalent method to avoid interference is the incorpo- at the centers for teacher education. We discuss relations to multi- ration of constraints in the formulation of the planning problem which commodity network flow models and suggest both exact and heuristic guarantee a minimum so-called signal-to-interference-plus-noise ratio solution algorithms. The methods are illustrated at problem instances (SINR) per traffic node. However, this type of constraint leads to nu- for students from Wuppertal. merical difficulties as the coefficients in the resulting linear inequal- ity vary significantly in magnitude. In this talk, motivated by system models from network engineering, we propose the usage of discrete channel quality indicators to model interference as a building block of an integer linear program. These indicators depend on the SINR TA-03 and define the quality of a link which affects the amount of bandwidth  needed to serve a traffic node. Depending on the base station deci- Thursday, 8:15-9:45 - Fo3 sion and hence, the emitted interference, not every indicator is feasible for each link. We separate the proposed model inequalities, which are Network Design similar to cover inequalities, on the fly to exclude such infeasible base station decision and indicator combinations from the solution space. 
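The teacher-internship talk above assigns students to schools under capacity constraints while respecting distances and preferences. Ignoring the two-subject structure and the preference terms, the distance-minimizing core of such an assignment can be written as a small transportation LP; because the constraint matrix is totally unimodular, the LP optimum is already integral. A sketch with invented toy data (scipy is assumed to be available):

```python
import numpy as np
from scipy.optimize import linprog

# toy instance: 4 students, 2 schools
dist = np.array([[2.0, 5.0],
                 [4.0, 1.0],
                 [3.0, 3.5],
                 [6.0, 2.0]])       # dist[i, j]: distance between student i and school j
capacity = np.array([2, 2])         # places per school

n_students, n_schools = dist.shape
c = dist.ravel()                    # minimise total distance; variable (i, j) sits at index i*n_schools + j

# each student is assigned to exactly one school
A_eq = np.zeros((n_students, n_students * n_schools))
for i in range(n_students):
    A_eq[i, i * n_schools:(i + 1) * n_schools] = 1
b_eq = np.ones(n_students)

# school capacities
A_ub = np.zeros((n_schools, n_students * n_schools))
for j in range(n_schools):
    A_ub[j, j::n_schools] = 1
b_ub = capacity

res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq, bounds=(0, 1))
print(res.x.reshape(n_students, n_schools).round(1))
```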
Stream: Algorithmic Game Theory Invited session 2 - Models for the Double-Row Equidistant Facility Lay- Chair: Matús Mihalák out Problem Anja Fischer, Miguel Anjos, Philipp Hungerländer 1 - Selfish Network Creation - Dynamics and Structure Given a set of machines the Double-Row Facility Layout Problem Pascal Lenzner (DRFLP) asks for an arrangement of the machines along both sides Many important networks, most prominently the Internet, are not de- of a path. The aim is to minimize the sum of the weighted trans- signed and administrated by a central authority. Instead, such networks ports between the machines. In contrast to the Single-Row Facility have evolved over time by (repeated) uncoordinated interaction of self- Layout Problem there may be spaces between neighboring machines ish agents which control and modify parts of the network. The Net- in the same row. We consider here the special case of DRFLP with work Creation Game [Fabrikant et al. PODC’03] and its variants at- all machines having the same size. After a short literature review we tempt to model this scenario. In these games, agents correspond to present two different models, which are based on the idea of introduc- nodes in a network and each agent may create costly links to other ing additional dummy machines in order to model the spaces between nodes. The goal of each agent is to obtain a connected network having the machines in the arrangement. The number of these additional ma- maximum service quality, i.e. small distances to all other agents, at chines is chosen such that at least one of the original optimal solutions low cost. is preserved. Our integer linear programming model uses between- The key questions are: How do the equilibrium networks of these ness variables combined with variables modeling the overlap of the games look like and how can selfish agents actually find them? For machines. A quadratic program in ordering variables is the basis for the latter, recent results on the dynamic properties of the sequential a semidefinite programming model, whose relaxation is solved with a version of these games will be surveyed. For the former, ongoing work spectral bundle method. Computational tests show that for medium- to focussing on structural properties is presented. large-sized instances the SDP approach clearly beats the ILP approach regarding the strength of the lower bounds after one hour of computing 2 - Quality of Service in Network Creation Games time. Andreas Cord-Landwehr, Alexander Mäcker, Friedhelm Meyer auf der Heide 3 - Calibration of Discrete Simulations for Business An- Network creation games (NCG) aim to model the evolution and out- alytics - An Optimization Problem and its Prospects come of networks created by selfish nodes. In these games, nodes can Julia Buwaya decide individually which edges they want to buy in order to minimize their private costs, i.e., the costs of the bought edges plus costs for com- This work examines the calibration of simulations for business ana- municating with other nodes. Each node v can buy a set of edges, each lytics. Complex simulations are increasingly employed to evaluate for a price alpha. Its goal is to minimize its private costs, i.e., the sum alternative planning strategies. With the number of model parame- (SUM-game) or maximum (MAX-game) of the distances from v to all ters and interdependencies grow challenges regarding the validation other nodes in the network plus the costs of the bought edges. Since and calibration of simulation systems. 
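The calibration talk above formulates calibrating a simulation as an optimization problem: find parameters whose simulated output best matches observed data. A minimal black-box sketch of that viewpoint, with random search as a placeholder for more sophisticated solvers and a made-up stand-in for the simulation:

```python
import numpy as np

def calibrate(simulate, observed, bounds, n_trials=500, seed=0):
    """Calibration as optimization: search the parameter box for the vector whose
    simulated output is closest (in mean squared error) to the observed data."""
    rng = np.random.default_rng(seed)
    lo, hi = np.array(bounds).T
    best_theta, best_err = None, np.inf
    for _ in range(n_trials):
        theta = rng.uniform(lo, hi)
        err = np.mean((simulate(theta) - observed) ** 2)
        if err < best_err:
            best_theta, best_err = theta, err
    return best_theta, best_err

# stand-in "simulation" with two behavioural parameters (purely illustrative)
def toy_simulation(theta):
    adoption, churn = theta
    t = np.arange(10)
    return adoption * t - churn * t ** 2

observed = toy_simulation((1.2, 0.05))
print(calibrate(toy_simulation, observed, bounds=[(0.5, 2.0), (0.0, 0.2)]))
```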
A major tasks lies in emerging all decisions are taken individually and only with respect to optimize simulation types such as agent-based simulation. Here heterogeneous their private costs, analyzing the resulting network by comparing it to groups of agents are directly modeled to enable the consideration of an overall good structure constitutes the central aspect in the study of agents’ impact on the planning solution and its success. Challenges NCGs. This task was formalized as analyzing the price of anarchy and are that the system’s behavior emerges from agents’ individual deci- was first discussed by Fabrikant et al. (PODC ’03) for the SUM-game sions and actions, which cannot be fully observed. To survey cali- and by Demaine et al. (PODC ’07) for the MAX-game. These papers bration of simulation in business analytics, we present the results of a inspired to a series of subsequent work. We extend these models by quantitative literature review. We differentiate stochastic, event-driven incorporating quality-of-service aspects: Each edge can not only be and agent-based aspects in simulations and subsume existing calibra- bought at a fixed quality (edge length one) at a fixed price alpha. In- tion approaches. Based on this, we collect specific requirements for stead, we assume that quality levels (i.e., edge lengths) are varying in


a fixed interval. A node now can not only choose which edge to buy, guarantees can potentially be of great value, but only few such guar- but can also choose its quality x, at the price p(x), for a given price antees exist. A very easy but effective approximation technique is to function p. For both games and all price functions, we show that Nash compute the midpoint solution of the original optimization problem, equilibria exist and that the price of stability is either constant or de- which aims at optimizing the average regret, and also the average nom- pends only on the interval size of available edge lengths. Our main inal objective. It is a well-known result that the regret of the midpoint results are bounds for the price of anarchy. In case of the SUM-game, solution is at most 2 times the optimal regret. Besides some academic we show that they are tight if price functions decrease sufficiently fast. instances showing that this bound is tight, most instances reveal a way better approximation ratio. In this talk we consider a new lower bound 3 - New Bound on the Price of Stability for Network De- for the optimal value of the minmax regret problem. Using this lower sign Games bound we state an algorithm that gives an instance dependent perfor- Matús Mihalák, Akaki Mamageishvili mance guarantee of the midpoint solution for combinatorial problems that is at most 2. The computational complexity of the algorithm de- In the network design game with n players, every player chooses a pends on the minmax regret problem under consideration; we show path in an edge-weighted graph to connect her pair of terminals, shar- that the sharpened guarantee can be computed in polynomial time for ing costs of the edges on her path with all other players fairly. We study several classes of combinatorial optimization problems. the price of stability of the game, i.e., the ratio of the social costs of a best Nash equilibrium (with respect to the social cost) and of an opti- 3 - Robust discrete optimization problems with the mal play. It has been previously shown that the price of stability of any WOWA criterion network design game is at most H(n), the n-th harmonic number. This Adam Kasperski, Pawel Zielinski bound is tight for directed graphs. For undirected graphs, the situation is dramatically different, and tight bounds are not known. It has only A finite set of elements and a set of feasible solutions composed of recently been shown that the price of stability is at most (1-1/n4)*H(n), some subsets of the element set are given. In the deterministic case, while the worst-case known example has price of stability around 2.25. each element has a nonnegative cost and we seek a feasible solution We improve the upper bound considerably by showing that the price of whose total cost is minimal. In this paper, we assume that the ele- stability is at most H(n/2) + eps for any value of eps > 0 (starting from ment costs are not precisely known and we model this uncertainty by some suitable n > n(eps)). specifying a scenario set containing a finite number of cost scenarios. In order to choose a solution a popular robust approach is typically used. In the robust approach, we compute a solution which minimize the cost in a worst case, which leads to the min-max (or min-max re- gret) criteria. This approach has, however, some known drawbacks. It  TA-04 assumes that decision makers are very pessimistic or risk averse. 
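The midpoint solution discussed above is easy to state concretely: average the interval costs, solve the resulting nominal problem, and the regret of that solution is at most twice the optimal regret. A small sketch for a shortest-path instance with interval arc costs; the graph and intervals are invented, and the sharpened instance-dependent bound of the talk is not computed here:

```python
import heapq

def dijkstra(graph, source, target):
    """Plain Dijkstra on a dict-of-dicts graph {u: {v: cost}}."""
    dist, heap, parent = {source: 0.0}, [(0.0, source)], {}
    while heap:
        d, u = heapq.heappop(heap)
        if u == target:
            break
        if d > dist.get(u, float('inf')):
            continue
        for v, w in graph[u].items():
            nd = d + w
            if nd < dist.get(v, float('inf')):
                dist[v], parent[v] = nd, u
                heapq.heappush(heap, (nd, v))
    path, node = [target], target
    while node != source:
        node = parent[node]
        path.append(node)
    return path[::-1]

# interval costs (low, high) on each arc; the midpoint scenario averages them
intervals = {'s': {'a': (1, 5), 'b': (2, 2)},
             'a': {'t': (1, 3)},
             'b': {'t': (1, 7)},
             't': {}}
midpoint = {u: {v: (lo + hi) / 2 for v, (lo, hi) in nbrs.items()} for u, nbrs in intervals.items()}
print(dijkstra(midpoint, 's', 't'))   # the midpoint path has regret at most twice the optimum
```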
Fur- Thursday, 8:15-9:45 - Fo4 thermore, they have no additional knowledge about which scenarios are more likely to occur. In 1988 the Ordered Weighted Averaging aggregation operator was introduced by Yager. This operator allows Approximation Algorithms in Robust decision makers to take their attitude towards the risk into account. Optimization It contains the maximum and the arithmetic mean as special cases. The class of discrete optimization problems with the OWA criterion Stream: Robust and Stochastic Optimization has been recently discussed in a number of papers. In this work we propose to use a more general criterion, namely the Weighted OWA Invited session operator (shortly WOWA) introduced by Torra in 1996. In the WOWA Chair: Adam Kasperski operator an additional vector of weighs is specified which can be in- terpreted as scenarios subjective probabilities. The WOWA operator 1 - Approximation Schemes for Robust Makespan contains the OWA and the weighted mean (the expected value) as spe- cial cases. In this paper, we construct an approximation algorithm for Scheduling Problems the considered problem, with some guaranteed worst case ratio. This Adam Kurpisz algorithm is general and requires only that the underlying deterministic problem is polynomially solvable. In this paper, we investigate two robust makespan scheduling prob- lems. In a considered robust approach we are given a scenario set which contains a constant number of distinct cost vectors which de- scribes possible realizations of each element cost. The objective is to find a solution that minimizes the cost under the worst possible scenario, called Min-Max cryterion. The first considered problem is  TA-05 a Min-Max Makespan Scheduling Problem (Min-Max Pm||Cmax) in Thursday, 8:15-9:45 - Fo5 which the task is to partition the jobs into m subsets, each of which is scheduled on one machine. The objective is to minimize the biggest Transportation completion time under worst case scenario (makespan). The problem is equivalent to the know Vector Scheduling Problem. In the latter one, Min-Max Flow Shop Scheduling Problem (Min-Max Fm||Cmax), each Stream: Discrete and Combinatorial Optimization, job consists of m operations each of which is processed sequentially by Graphs and Networks a different machine. The problem consists in finding a permutation of Invited session jobs. The objective is once again the worst scenario makespan. In this paper we prove that a simple merging rule can reduce the number of Chair: Frauke Böckmann jobs in any instance of the both considered problems to a constant de- pending only on eps. Additionaly we prove that the optimal solution of 1 - The Cycle Embedding Problem a reduced instance does not differ from the original one by more than Markus Reuther, Ralf Borndörfer, Marika Karbstein, Julika a 1 + O(eps) factor. As a consequence we provide a Polynomial Time Mehrgardt, Thomas Schlechte Approximation Scheme for both problems. The running time of the al- gorithm is linear in the number of jobs which improves the best known We introduce a hypergraph based combinatorial optimization problem result. The second result of the paper is the Competitive Ratio Approx- - the Cycle Embedding Problem (CEP). The CEP is a subproblem of imation Scheme for the online counterpart of the considered problems. the Rolling Stock Rotation Problem and can be described as follows. 
Such a scheme algorithmically constructs an online algorithm with a Given are two hypergraphs, i.e., on a fine and a coarse layer, a corre- competitive ratio arbitrarily close to the best possible competitive ratio sponding projection from fine to coarse, and a set of cycles covering for a given problem. all nodes of the coarse layer. The goal is to embed these coarse cy- cles, i.e., to find cycles in the fine layer such that their projection are 2 - Minmax Regret: Improved Analysis for the Midpoint the coarse cycles. We develop an integer programming formulation Solution for this combinatorial problem and provide a complete description for André Chassein, Marc Goerigk standard graphs. For hypergraphs we prove that the problem is NP- hard. Finally, we present computational results of CEPs deduced from Minmax regret optimization aims at finding robust solutions that per- problem instances of DB Fernverkehr AG. The layers for the rolling form best in the worst-case, compared to the respective optimum objec- stock rotation planning problem are motivated by aspects of vehicle tive value in each scenario. Even for simple uncertainty sets like boxes, orientations. Neglecting the orientation of vehicles leads to a coarse most polynomially solvable optimization problems have strongly NP- hypergraph layer without considering necessary turn around trips. The hard minmax regret counterparts. Thus, heuristics with performance CEPs coming from our coarse-to-fine approach are usually feasible


and tractable for optimization. Hence, in general, the planning of turn Management (DSM) methodologies often attempt to schedule these around trips can be done subsequently after solving a coarse variant of appliances locally based on a steering signal send by a global con- the rolling stock rotation problem. troller. The local controllers use these steering signals to determine an optimal schedule for their appliances. The local schedule generated 2 - On the Representation of Transportation Plans by by the local controller is thus an important step within the realization Sums of Squares and analysis of DSM methodologies. To this end we study a two-fold Ingo Althoefer steering signal consisting of both a time-varying price and a target pro- file. We model the local minimization objective as a weighted sum of Lagrange’s four-square theorem states that every natural number can the total cost of the consumed energy and the squared deviation from be represented by the sum of at most four square numbers. The the- the target profile. The minimization is done subject to local constraints orem is generalized to transportation problems with two factories (ca- implied by the appliance. We study the structure of the derived op- pacities a1, a2) and two customers (demands b1, b2). timization problems and use them to obtain efficient algorithms that solve the optimization problems to optimality. When all four parameters are natural numbers, there exists a trans- portation plan where the amounts can be represented by altogether 3 - 1. Automatic planning for power system design/2. eleven squares or less. There exist examples where eleven squares are necessary. Environmental friendly vehicle scheduling Marjan van den Akker, Alexandru Dimitriu, Han Hoogeveen, Here is an example where ten squares are needed: a1=7, a2=113, b1=b2=60. Examples which require eleven squares are much more Marcel van Kooten Niekerk, Roger Cremers complicated. Numerical simulations seem to indicate that for (asymp- Because of circumstances, this is a combination of two talks. totically) almost all instances eight squares are enough. This is joint work with Katharina Collatz, Robert Hesse, and Anne 1. Designing electrical power network grids is a challenging and com- Hilbert. plex issue. We investigate two different problems: connecting a new point to an existing electrical grid based on Euclidean distances in a 3 - How OR Improves The Baggage Handling System At non-uniform weighted space and choosing the cost-optimum design for a new electrical network in which we are given information about Frankfurt Airport - The Technical Details the producers, the consumers and the possible connections between Frauke Böckmann, Marco Franz points in the network. The baggage handling system at Frankfurt Airport distributes up to For the first problem we show that Dijkstra’s algorithm combined with 110.000 bags per day using its over 80 km long rail tracks. During a point sampling approach can be used to find an approximate solution. the last years the baggage handling system has been extended and new The second problem is modeled as a maximum network flow problem requirements have been implemented, e.g. robust routing in case of for which connections do not only have a cost for each unit of flow disturbances and balancing constraints for the early baggage storage sent, but also a fixed cost, which has to be paid if the connection is system based on prognosis data. This talk describes how OR helped us used in the network. 
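The demand-side-management talk above minimizes a weighted sum of the energy cost under a time-varying price and the squared deviation from a target profile, subject to the appliance's constraints. The talk derives specialized exact algorithms for this; the sketch below merely feeds a toy instance (all numbers invented) to a generic solver, assuming scipy is available:

```python
import numpy as np
from scipy.optimize import minimize

price  = np.array([0.30, 0.28, 0.15, 0.12, 0.20, 0.25])   # time-varying price signal
target = np.array([0.0,  0.5,  2.0,  2.0,  1.0,  0.5])    # target profile from the global controller
total_energy = 6.0                                          # energy the appliance must consume in total
p_max = 2.5                                                 # power limit per interval
weight = 1.0                                                # trade-off between cost and deviation

def objective(x):
    # weighted sum of total energy cost and squared deviation from the target profile
    return price @ x + weight * np.sum((x - target) ** 2)

res = minimize(objective,
               x0=np.full(len(price), total_energy / len(price)),
               bounds=[(0.0, p_max)] * len(price),
               constraints=[{'type': 'eq', 'fun': lambda x: x.sum() - total_energy}],
               method='SLSQP')
print(res.x.round(2), round(objective(res.x), 3))
```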
We propose two different approaches for solving to succeed in this complex project focussing on the technical details. this problem: a branch-and-bound (BB) algorithm and a cost-function slope (CFS) heuristic. 2. The best way to reduce air pollution by public transport is to use only electric vehicles. However, at this moment this is not a cost-efficient TA-06 solution. We will discuss environment-friendly vehicle scheduling us-  ing a fleet with multiple types of buses, where we focus on reducing Thursday, 8:15-9:45 - Fo6 air pollution on the global level and on ’black spots’, for example city centres which are already heavily polluted. We also look for a balance Optimization Methods for Energy and between operational cost and pollution. We will illustrate this with a Environment real-life case. Stream: Energy and Environment Invited session Chair: Marjan van den Akker  TA-07 Thursday, 8:15-9:45 - Fo7 1 - Practical Use of Different Linearized Power Flows Stephan Lemkens, Arie Koster Routing In this talk we give insight into the mathematical properties of two different linearized power flow models. We compare the well known Stream: Discrete and Combinatorial Optimization, DC approximation with a formulation which also approximates re- Graphs and Networks active flows. We analyze these two linearizations in terms of solv- ability and discuss how good they approximate the nonlinear power Invited session flow equations. The need of linearized power flow equations is based Chair: Peter Recht on the power grid design problem, as using the nonlinear power flow equations would yield a non-convex MINLP. By considering linearized 1 - An Optimization Model for Software-defined Mobile equations we gain an MILP, which is preferred for many reasons. We therefore analyze the two linearizations’ corresponding power grid de- Networks with virtualized Network and Service Func- sign problems and show whether the more complex approximation is tions suitable for practical use. Finally, we consider the fact if the optimal Andreas Baumgartner, Thomas Bauschert design for the linearized power flow models is feasible for the nonlin- ear power flow equations. Future mobile networks will implement concepts of Software Defined Networking (SDN) and Network Function Virtualization (NFV). These 2 - Local device scheduling for demand side manage- concepts facilitate the "slicing’ of a mobile network infrastructure into ment using two types of steering signals several virtual mobile networks. In this contribution we address the Thijs van der Klauw, Johann Hurink, Gerard Smit problem of optimal embedding of virtual network slices into a given mobile network infrastructure with virtualized core network and ser- Within power grids, supply and demand need to be matched at all vice functions that are executed within datacenters of a cloud. Here times. This is traditionally done by using flexibility in supply to match the virtual core network functions comprise both data and control plane demand. However, this methodology is less viable in future power entities. The target is to determine for each network slice the optimum girds due to an increasing share of inflexible generation from renew- number and locations of the virtual functions and the routing of the able sources such as wind and sun. Thus flexibility to match supply and traffic flows traversing these virtual functions (service chaining) so as demand needs to be sought elsewhere. 
It is expected that in the future to minimize the consumption of physical resources while guaranteeing an increasing amount of flexibility is available on the demand side of a certain quality of service level. Physical resources are processing, electricity, mainly in the form of appliances that store energy in some storage, switching and transmission capacity. This problem can be form or manner. Examples of such appliances are electrical vehicles, formulated using an extended virtual network embedding approach al- heat pumps combined with heat buffers and fridges. Demand Side lowing virtual network elements to be split on multiple physical nodes.
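The local objective described in the device-scheduling abstract by van der Klauw, Hurink and Smit, a weighted sum of energy cost and squared deviation from a target profile, can be sketched as follows (all symbols are our own notation):
\[ \min_{x \in X} \ \sum_{t=1}^{T} p_t\,x_t \;+\; \lambda \sum_{t=1}^{T} \bigl(x_t - z_t\bigr)^2 , \]
where x_t is the consumption of the appliance in interval t, p_t the time-varying price, z_t the target profile, \lambda a weighting factor and X the feasible set implied by the appliance.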


However, this approach turns out to be not scalable for realistic prob- by the sum of the total possible improvements of the players in the lem sizes. Therefore we developed a new integer program optimization outcome s. formulation. Our model also provides the basis for algorithmic opti- To achieve this result we introduce a transition graph, which is defined mization approaches. on a pair of outcomes s, s* and captures how to transform s into s*. On 2 - Integer Programming Approaches for the Rapid Tran- this graph we define an ordered path-cycle decomposition. We upper bound the change in the potential for every path and cycle in the de- sit Challenge composition, and lower bound their contribution to the potential. The Wolfgang A. Welz result then follows by summing up over all paths and cycles. The Rapid Transit Challenge is a challenge in which participants have The significance of this result is twofold. On the one hand it provides to traverse an entire subway network in the shortest possible time. Price-of-Anarchy-like results with respect to the potential function. There are two main variations of this challenge: In the first the rider is On the other hand, we show that these approximations can be used required to cover all lines, i.e. traverse every distinct segment between to compute approximate pure Nash equilibria for congestion games two station in any direction, while for the second only every station with non-decreasing cost functions with the method of Caragiannis et complex needs to be visited. To efficiently model those two problems, al. [FOCS 2011]. Our technique significantly improves the approxi- it is crucial to take directions and changing times between lines and di- mation for polynomial cost functions. Moreover, our analysis suggests rections into account. The problem then corresponds to a special type and identifies large and practically relevant classes of cost functions for of Routing Problem on a directed graph, closely related to the Directed which approximate equilibria with small approximation factors can be Rural Postman Problem and the Generalized TSP. In this context we computed in polynomial time. For example, in games where resources will present an IP-based branch-and-bound approach using dynamic have a certain cost offset, e.g., traffic networks, the approximation fac- constraint generation. By exploiting the special structure of the under- tor drastically decreases with the increase of offsets or coefficients in lying transportation network, we were able to solve the Rapid Transit delay functions. Challenge for large real-world subway networks, such as Berlin and New York City, very efficiently. 3 - Short sequences of improvement moves lead to ap- proximate equilibria in constraint satisfaction games 3 - Spontaneous Postmen Problems and edge-disjoint Angelo Fanelli, Ioannis Caragiannis, Nick Gravin cycle packings in graphs Peter Recht We present an algorithm that computes approximate pure Nash equi- libria in a broad class of constraint satisfaction games that generalize A "‘Spontaneous Postman Problem"’ is a routing problem in which a the well-known cut and party affiliation games. Our results improve postman selects subsequent streets of his tour by a nilly-willy strategy. previous ones by Bhalgat et al. (EC 10) in terms of the obtained ap- This spontaneous choice leads to the basic question how to partition a proximation guarantee. 
More importantly, our algorithm identifies a network into different subdistricts such that it can be guaranteed, that polynomially-long sequence of improvement moves from any initial each district is served if the postman is "‘spontaneous”. The structural state to an approximate equilibrium in these games. The existence of problems of the network that arise within this framework are closely such short sequences is an interesting structural property which, to the related to the investigation of local traces and maximum edge-disjoint best of our knowledge, was not known before. Our techniques adapt cycle packings in graphs. A "‘min-max-theorem"’ can be proved if the and ex- tend our previous work for congestion games (FOCS 11) but graph is Eulerian. the current analysis is considerably simpler.
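For orientation, the potential function studied in the congestion-game abstract by Feldotto, Gairing and Skopalik (session TA-08 below) is, in its standard Rosenthal form,
\[ \Phi(s) \;=\; \sum_{e} \sum_{k=1}^{n_e(s)} c_e(k), \]
where n_e(s) is the number of players using resource e in outcome s and c_e is the cost function of resource e; the talk's own notation may differ.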

 TA-08
Thursday, 8:15-9:45 - Fo8

Computing Equilibria

Stream: Algorithmic Game Theory
Invited session
Chair: Angelo Fanelli

1 - The Complexity of the Simplex Method
Rahul Savani, John Fearnley

The simplex method is a well-studied and widely-used pivoting method for solving linear programs. When Dantzig originally formulated the simplex method, he gave a natural pivot rule that pivots into the basis a variable with the most violated reduced cost. In their seminal work, Klee and Minty showed that this pivot rule takes exponential time in the worst case. We prove two main results on the simplex method. Firstly, we show that it is PSPACE-complete to find the solution that is computed by the simplex method using Dantzig’s pivot rule. Secondly, we prove that deciding whether Dantzig’s rule ever chooses a specific variable to enter the basis is PSPACE-complete. We use the known connection between Markov decision processes (MDPs) and linear programming, and an equivalence between Dantzig’s pivot rule and a natural variant of policy iteration for average-reward MDPs. We construct MDPs and show PSPACE-completeness results for single-switch policy iteration, which in turn imply our main results for the simplex method.

2 - Bounding the Potential Function in Congestion Games and Approximate Pure Nash Equilibria
Matthias Feldotto, Martin Gairing, Alexander Skopalik

In this talk we study the potential function in congestion games. We consider both games with non-decreasing cost functions as well as games with non-increasing utility functions. We show that the value of the potential function of any outcome s of a congestion game approximates the optimum potential value by a factor which only depends on the set of cost/utility functions, and an additive term which is bounded

 TA-10
Thursday, 8:15-9:45 - SFo2

Planning of Local Generation and Consumption

Stream: Energy and Environment
Invited session
Chair: Christoph Meyer

1 - Real-Time Optimization of Virtual Power Plants providing Grid Services
Sleman Saliba, Rüdiger Franke, Alexander Frick

In this talk we present a mixed integer linear program that is used to optimally distribute the release calls for grid services on the multiple generator units of a virtual power plant in real time.

A virtual power plant is a union of multiple power generator units, power storage devices and power consumption units that are controlled by a central server. This setting allows the pooling of small renewable power plants, in order to achieve an overall capacity that is sufficient for participation in the electricity market. Comparably fast load ramps make renewable generation units well suited for the provision of grid services. This is why the direct marketing of renewable power is becoming increasingly attractive.

Central servers provide the integration of virtual power plants into electricity markets. They receive overall set points and distribute them in real-time to individual power production units. Moreover, the central servers provide information management for the planning, reporting and accounting of the plant operation. The required control algorithms for virtual power plants need to consider specific plant properties, such as production limits, ramp times and storage capacities for each generation unit. The presented mixed integer linear program considers the plant properties in the form of constraints and objectives and solves the control task in real-time. The mathematical program not only obtains a feasible plant operation, but also operates the virtual power plants at their optimal point.
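The Dantzig rule referred to in the Savani and Fearnley abstract above selects, among the nonbasic variables, one with the most negative reduced cost. A minimal sketch (our own illustration; names and tolerance are assumptions):

import numpy as np

def dantzig_entering_variable(reduced_costs, tol=1e-9):
    # Dantzig's pivot rule: enter the variable with the most negative
    # reduced cost; return None if no reduced cost is negative (optimal).
    j = int(np.argmin(reduced_costs))
    return j if reduced_costs[j] < -tol else None

# Example: with reduced costs [2.0, -3.5, -1.0] the rule picks index 1.
print(dantzig_entering_variable(np.array([2.0, -3.5, -1.0])))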


Finally, we present an installation of the mathematical program at a 1 - Benchmarking the Brazilian Distribution Electricity large-scale virtual power plant providing secondary balancing power Companies using Data Envelopment Analysys - DEA and minute reserve for all four German grid areas. Ana Lopes, Marcelo Azevedo, Edgar Augusto Lanzer 2 - Economically optimized operation of heat pumps in Benchmarking methodologies as Data Envelopment Analysis — DEA, a smart grid environment Stochastic Frontier (SFA) and Corrected Ordinary Least Squares has Björn Felten, Jessica Raasch, Christoph Weber been used by energy regulators around the world as an important part of the regulation process. Some regulators uses benchmarking to reach Heat pumps provide the possibility to use renewable electricity for the total costs (TOTEX) that the companies are allowed to charge in heating purposes and at the same time offer flexibility for the usage the tariff, while others uses the methodology to measure the efficient of intermittent renewable supply through thermal storage. Yet the eco- operational costs(OPEX). In these models TOTEX or OPEX are used nomically optimized operation in a smart grid environment has so far as input while network extension, number of consumers, market, area not been investigated in detail. On the one hand, heat pump owners served, energy distributed disaggregated in high, medium and low volt- may operate it at substantially decreased costs by making use of peri- age, density of the network, maximum demand, among others, are used ods of low market prices for electricity. On the other hand, grid oper- as outputs. In Brazil, the regulator (ANEEL) started using benchmark- ators may benefit from such type of operation through shaving off the ing in 2007. During two cycles of tariff review and in 2012, by an demand peaks or avoiding grid overloads due to the fluctuating sup- anticipation of concessions renewal, the transmission companies were ply of electricity from renewable sources. This contribution analyzes evaluated by DEA. For distribution energy companies the application the optimal control strategy for a heat pump including thermal stor- of a benchmarking methodology started in 2010/2011. Actually the age. It notably investigates the heat costs saving potential of a typical result of the process was an efficiency score taken from the average be- household being heated by a standard air-water heat pump. The ther- tween the results of two methodologies: COLS, using a Cobb-Douglas modynamic system is modelled by a simplified system of 5 differential function, and DEA, using a non decreasing returns to scale model. A equations. The control algorithm for the operation of the heat pump is second stage was added to take into account that the environment can based on the principles of model predictive control and applies a com- affect the costs. In 2014 another revision in this methodology is ongo- bined COP and market price optimization based on a one-day-look- ing. This paper review the methodologies applied in the third(3CRTP) ahead algorithm for heat demand and electricity prices - both being and fourth(4CRTP) cycle of tariff review of the distribution companies implemented in MATLAB. The algorithm is designed as to be easily and proposes an alternative that better suits to the reality of the energy implementable in machine code. The method is assessed using actual brazilian distributors. 
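The DEA benchmarking described by Lopes, Azevedo and Lanzer above evaluates each distribution company against an efficient frontier; in the standard input-oriented envelopment form (textbook notation, not necessarily the exact ANEEL variant), the efficiency score \theta of a unit with inputs x_0 and outputs y_0 solves
\[ \min_{\theta,\lambda}\ \theta \quad \text{s.t.}\quad \sum_{j}\lambda_j x_j \le \theta\,x_0,\qquad \sum_{j}\lambda_j y_j \ge y_0,\qquad \lambda_j \ge 0, \]
with an additional condition on \sum_j \lambda_j encoding the assumed returns to scale.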
meteorological data and market prices and simulating the operation of an optimized, a non-optimized and a partly-optimized system. These 2 - On the relationship between efficiency and use of in- results provide an exemplary insight into the annual cost savings by us- formation and communication technologies in cold ing the smart grid environment. The last part of this contribution then supply chains gives an outlook on the evaluation method of the advantageousness of Carlos Ernani Fries, Ismael Peruzzo Zamoner, Fernanda such operation for the grid operator in an agent based simulation. Christmann 3 - Dispatch of a wind farm with a battery storage This work presents an analysis of the relationship between efficiency Sabrina Ried, Melanie Reuter, Lucas Baier, Patrick Jochem, and the use of information and communication technologies (ICT) by Wolf Fichtner logistic service providers (LSP) that operate in temperature controlled supply chains. The study was performed using secondary data col- Since 2012, the operators of renewable energy systems in Germany can lected from 2007 to 2013 of the Brazilian cold supply chain market. chose between different trading mechanisms. The electricity produced Technical and scale efficiency scores were determined with DEA (Data by renewable energies can be reimbursed according to the Renewable Envelopment Analysis) models and multivariate statistical procedures Energy Law (EEG). Otherwise, the new trading mechanism of direct for the selection of input-output variables of the transformation process marketing ("Direktvermarktung’) can be chosen. Direct marketing of- of inputs into logistic services. The relationship between efficiency fers further possibilities for commercialization e.g. to wind park op- scores of LSP and their use of ICT were established with regression erators, as for example taking part in reserve power markets or selling models for groups of LSP according the use of exclusively own fleet, the produced energy at the electricity stock exchange. The combina- exclusively third fleet, as well as own plus third fleet of trucks in their tion of a wind farm with a battery storage is particularly interesting operations. Results show a weak relationship between use of ICT and for the direct marketing case, as an integrated wind-battery system can efficiency scores. Merely groups of LSP that operate with ICT re- be scheduled in a more balanced way, alleviating natural wind power lated to third fleet of trucks show higher technical and scale efficiency fluctuations. scores. This indicate that LSP that operate with own fleet tend to reveal lower efficiency scores when compared with the former group. The We present a mathematical model that optimizes the contribution mar- lack of strong sensitivity of the use of ICT with the observed efficiency gin for two direct marketing options. We analyze a system that consists of LSP in temperature controlled supply chains seems to indicate that of a wind farm and a battery storage and consider the system to take investments in these technologies have no measurable effect on the ef- part in the electricity stock exchange. We discuss adaptions of the ficiency, and therefore, on productivity. model when additional participation at the tertiary control market is possible. As a reference for determining the profitability of investing 3 - Decision Making Process Model with Price Accept- in a storage we take an average fixed EEG compensation for wind en- ability in Japanese products ergy. 
We construct a test instance for the models based on 2013 prices of the European Power Exchange and the German reserve power mar- Yumi Asahi ket as well as wind data of the transmission system operator 50Hertz. In Japan, the production of vegetables has fallen from the latter half We then compare the optimal solutions to the reference case. We eval- of the 1980’s because of decreasing of agriculture workers. On the uate if the gain of an integrated wind and storage system exceeds the other hand, the amount of imported vegetable has increased, because fixed EEG compensation such that the yearly costs for the storage can the price of imported vegetables is cheap and the stable supply of them be compensated. is possible. However, in recent years, the consumers have become in- creasingly aware of problem related to safety of food like chemical levels in imported vegetables, therefore, needs of domestic vegetable have risen. In this study, the consideration of the consumer who relates to the purchase of a domestic vegetable is clarified for giving a useful TA-11 finding for further promotion of the consumption expansion activity of  a domestic vegetable. There also have been some studies on purchas- Thursday, 8:15-9:45 - SFo3 ing vegetables in Japan. However, few studies focus on purchasing domestic vegetables of housewives. It limits to a lot of live foods con- Applications of Data Analysis and Data sumed at home and the analysis is advanced in the present study though Envelopment Analysis there is various kinds of such as the live food, the frozen vegetables, and dehydrated vegetables. This study analyzed the consumer aware- ness that influence the purchasing Japanese domestic vegetables in by Stream: Decision Theory and Multi-Criteria Optimiza- structural equation modeling analysis (SEM). The analysis data is the tion survey data which was collected in the Tokyo metropolitan area for Invited session international student of the living in Japan. Chair: Yumi Asahi
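A minimal sketch of the contribution-margin maximization behind the wind-battery dispatch model of Ried et al. (our own notation and simplifications; the full model also covers the reserve market option) is
\[ \max \ \sum_{t} \pi_t\,(w_t - c_t + d_t) \quad \text{s.t.}\quad s_{t+1} = s_t + \eta\,c_t - d_t/\eta,\quad 0 \le s_t \le S,\quad 0 \le c_t \le w_t,\quad 0 \le d_t \le P, \]
with spot price \pi_t, wind production w_t, charging and discharging power c_t and d_t, state of charge s_t, efficiency \eta, and storage ratings S and P.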


 TA-12  TA-13 Thursday, 8:15-9:45 - SFo4 Thursday, 8:15-9:45 - SFo9 Scheduling on multiple machines Metaheuristics in Production and Logistics Stream: Project Management and Scheduling Invited session Stream: Heuristics, Metaheuristics, and Matheuristics Chair: Guido Schryen Invited session Chair: Jörn Schönberger 1 - Scheduling identical parallel machines with a fixed number of delivery dates Arne Mensendiek 1 - Parallel Algorithm Portfolio with Market Trading- We consider the scheduling problem of a manufacturer that has to pro- based Time Allocation cess a set of jobs on identical parallel machines where jobs can only Dimitris Souravlias, Konstantinos Parsopoulos, Enrique Alba be delivered at a given number of delivery dates and the total tardiness is to be minimized. Such settings are frequently found in industry, for example when a manufacturer relies on a logistics provider that picks up completed jobs twice a day. The scheduling problem with When different algorithms are used to solve an optimization problem fixed delivery dates where the delivery dates are considered as an ex- in parallel, a fixed time budget is usually allocated to each one of them. ogenously given parameter for the manufacturer’ scheduling decisions The decision is user-defined and it is based on the inherent characteris- can be solved by various optimal and heuristic solution procedures. tics of the problem at hand and the employed algorithms. Typically, Here, we consider a variant of this problem where only the number of all algorithms are allocated equal time budgets, which remain con- delivery dates is fixed. For example, the manufacturer may be entitled stant throughout the optimization process. However, it is frequently to assign the logistics provider two pick-up times per day. Then, the observed that different algorithms perform better than others for dif- machine schedule and the delivery dates can be determined simulta- ferent problems or instances of the same problem. Thus, it is rea- neously which may significantly improve adherence to due dates. We sonable to reward better-performing algorithms with higher time bud- discuss a mathematical programming formulation and a heuristic so- get during the optimization procedure. However, the selection of the lution approach for the resulting parallel machine production and dis- favored algorithms cannot be taken a priori, since it is habitually ob- tribution scheduling problem. The findings can provide valuable input served that different algorithms perform better at different stages of the when it comes to evaluating and selecting distribution strategies that optimization procedure. This paper proposes a portfolio of metaheuris- offer a different extent of flexibility regarding the delivery dates. tic algorithms that operate in parallel and adopt a market trading-based system that orchestrates the allocation of the total available execution 2 - On Complexity of Scheduling Problem with Technol- time among the constituent algorithms. More specifically, the portfo- lio dynamically distributes the total available time budget by favoring ogy Based Machines Grouping the best performing algorithms with a higher fraction of the total avail- Julia Kovalenko, Anton Eremeev able time. The core idea of the time allocation mechanism is inspired A problem of multi-product scheduling on dedicated machines is con- by trading models and it involves a number of algorithms that act as sidered. 
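In the Mensendiek abstract above, tardiness is measured against the next admissible delivery date rather than the completion time itself; with delivery dates e_1 < ... < e_K and our own notation, a job j completing at C_j incurs
\[ T_j \;=\; \max\Bigl(0,\ \min\{e_k : e_k \ge C_j\} - d_j\Bigr), \qquad \text{and the objective is } \min \sum_j T_j . \]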
Each product can be produced by a family of alternative multi- investors, which buy and sell solutions using time as currency. The machine technologies. Multi-machine technologies require more than proposed approach is assessed on a significant problem from the field one machine at the same time. A sequence dependent setup time is of Operations Research, namely the production planning in systems needed between different technologies. The criterion is to minimize with remanufacturing of products. Preliminary experimental evidence the makespan. The problem is motivated by the real-life scheduling renders our approach highly promising. applications in chemical industry. Preemptive and non-preemptive ver- sions of the problem are studied. Mixed integer linear programming models, based on a continuous time representation, are formulated for 2 - Hybrid sampling-based metaheuristics for the Orien- both versions. Using these models, the polynomially solvable cases of teeering Problem with Stochastic Travel and Service the problem, in which the number of technologies is a given constant, Times are found. We analyze computational complexity and approximation complexity of the problem. In particular, it is proved that the prob- Roberto Montemanni, Vassilis Papapanagiotou, Luca Maria lem without setup times in the case, when there is only one technology Gambardella for each product and each machine may be used for processing only two technologies, is NP-hard in the strong sense. Moreover, the prob- lem cannot be approximated within a practically relevant factor of the One of the reasons that Stochastic Combinatorial Optimization Prob- optimum in polynomial time, if not P=NP. This research has been sup- lems (SCOPs) are interesting is because of their modeling power re- ported by the RFFI Grant 12-01-00122. garding certain quantities or events. For example, some problems in- clude travel times which can vary due to unpredictable factors such as 3 - Emergency Response in Natural Disaster Manage- unexpected traffic or weather conditions. In these cases, SCOPs offer ment: Allocation and Scheduling of Rescue Units a better approximation than their deterministic counterparts. However, Guido Schryen, Gerhard Rauchecker modeling the problem to be solved as a SCOP introduces intricacies that do not exist in the deterministic versions. One of the most im- Natural disasters, such as earthquakes, tsunamis and hurricanes, cause portant difficulties present in SCOPs is that often computing the ob- tremendous harm each year. In order to reduce casualties and eco- jective function can be a hard problem or in our case computationally nomic losses during the response phase, rescue units must be allo- expensive. To mitigate this problem, the objective function can be ap- cated and scheduled efficiently. This problem is one of the key is- proximated instead of being computed exactly. To obtain a reasonable sues in emergency response and has been addressed only rarely in approximation, in a way that is both fast and does not affect the quality the literature. We suggest a binary, quadratic decision support model of solutions found by a metaheuristic, we use Monte Carlo sampling. that minimizes the sum of completion times of incidents weighted Metaheuristics using Monte Carlo Sampling in the objective function by their severity. 
The presented problem is a generalization of the have become state-of-the-art approaches for many SCOPs such as the parallel-machine scheduling problem with unrelated machines, non- Probabilistic Traveling Salesman Problem with Deadlines (PTSPD) batch sequence-dependent setup times and a weighted sum of com- and recently the Orienteering Problem with Stochastic Travel and Ser- pletion times — thus, it is NP-hard. Using literature on scheduling vice Times (OPSTS). However, we have observed that using Monte and routing, we propose and computationally compare several heuris- Carlo Sampling in problems with deadlines such as the OPSTS causes tics, including a Monte Carlo-based heuristic, the joint application of large errors in the nodes where the deadline is likely to occur. These 8 construction heuristics and 5 improvement heuristics, and GRASP can mislead metaheuristics in reaching suboptimal solutions. In this metaheuristics. Our results show that problem instances (with up to 40 presentation, we study different methods for computing the objective incidents and 40 rescue units) can be solved in less than a second, with function with the purpose of creating faster objective functions than results being at most 10.9% up to 33.9% higher than optimal values. the exact or the pure Monte Carlo sampling based one, while mini- Compared to current best practice solutions, the overall harm can be mizing the error and keeping the quality of the solutions found by the reduced by up to 81.8%. metaheuristic, the same.
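The sampling idea used by Montemanni, Papapanagiotou and Gambardella, estimating a stochastic objective inside a metaheuristic instead of computing it exactly, can be sketched as follows (illustrative only; the distributions and the tour evaluator are placeholders):

import random

def sampled_objective(tour, sample_travel_time, evaluate, n_samples=1000):
    # Monte Carlo estimate of the expected objective of a tour: draw random
    # travel/service times for every arc and average the resulting values.
    total = 0.0
    for _ in range(n_samples):
        times = {arc: sample_travel_time(arc) for arc in zip(tour, tour[1:])}
        total += evaluate(tour, times)
    return total / n_samples

# Example with exponential travel times and total travel time as objective:
tour = [0, 1, 2, 0]
estimate = sampled_objective(tour,
                             lambda arc: random.expovariate(1.0),
                             lambda t, times: sum(times.values()))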


3 - Strategies for achieving synchronized operation The objective in timetable information is to find "good’ train connec- times of several vehicles in a meta-heuristic search tions. Besides other criteria like, e.g., the number of transfers, the travel time is an important measure for the quality of a connection. procedure However, fast connections tend to use transfers with small transfer Jörn Schönberger times which are highly endangered to break in case of delays. Ro- Meta-heuristic search procedures incorporate several strategies for bust timetable information aims at finding connections which are ro- evolving one or several search trajectories through a search space that bust against delays. I.e., given an uncertainty set U which specifies all represents the set of feasible solutions of a combinatorial optimization possible delay scenarios, the objective of robust timetable information problem. In the context of vehicle routing, a lot of basic operators is to find a connection which has lowest travel time in the worst case have been invented in order to find distance minimal route sets that (taken over the scenarios from set U). One possible approach to robust fulfill given sets of often complicated constraints. Typically, several timetable information would be to consider a connection (consisting types of constraints are distinguished. Constructive constraints must of a sequence of stations and the trains taken between these stations) be fulfilled in order to ensure that the proposed solution is a set of robust, if it can be traveled on as planned in all delay scenarios. This, routes but loading constraints coordinate the assignment of requests however, often leads to connections with very high transfer times, since to vehicles. Scheduling constraints are imposed in order to determine the transfers have to have the capacity to "absorb’ all (potential) previ- feasible operation starting times. While intra-route scheduling con- ously arising delays. Better solutions (with respect to both worst-case straints have implications for the scheduling of operations in the route and non-delayed travel time) can be found if we include the possibil- of one vehicle, inter-route scheduling constraints are imposed in order ity to reroute passengers in case of delays. In this case, a solution to to achieve a coordination of the starting times of operations executed the timetable information problem is a strategy which specifies a con- by different vehicles. If all given inter-route scheduling constraints are nection and indicates how to adapt the chosen connection in case of fulfilled then the vehicles are called synchronized. Recently, synchro- delays. Such a strategy is robust optimal, if it minimizes the worst- nization constraints have entered the arena of vehicle routing research. case travel time (where the worst-case is taken over the scenarios in We report about experiences from the development of a meta-heuristic the uncertainty set U). In this talk, we present solution algorithms for that incorporates different operators as well as hill climbing procedures finite uncertainty sets and discuss how these can be used to find robust with the goal to solve a pickup and delivery problem with synchro- optimal strategies also for infinite uncertainty sets. nization constraints. Typical search concepts reported for the pickup and delivery problem fail to achieve synchronized operations. We start 3 - Modelling Delay Propagation in Railway Networks with a report on these failures. 
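In compact form (our notation), the robust timetable information problem described above by Schmidt, Goerigk, Müller-Hannemann and Schöbel asks for a strategy \sigma that minimizes the worst-case travel time over the uncertainty set U:
\[ \min_{\sigma}\ \max_{u \in U}\ \mathrm{traveltime}(\sigma, u). \]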
Next, we derive countermeasures to Fabian Kirchhoff overcome this shortcoming. Finally, we demonstrate the effectiveness We want to determine delay distribution functions analytically from of the proposed new operators and neighborhoods within comprehen- given source delays. For this purpose, we use an activity on arc net- sive computational experiments. work. Generally, the calculation of propagated delays requires a topo- logical sorting of arrival and departure events and cannot be applied if the network contains cycles. We use an iterative method to approxi- mate the long-run delay distributions in those cycles. The objective of this talk is to investigate the impact of this approach on the limiting  TA-14 distributions. Thursday, 8:15-9:45 - SFo10 In a first step we try to find a topological sorting for the events. If suc- cessful, we use this order to compute the corresponding delay distribu- Robust and stochastic optimization in tion functions. Otherwise, in cyclic structures, i.e. the strong compo- nents of the network, we sort the events using a relaxed version of the transportation topological sorting and approximate the delay distributions functions iteratively. Basically, we just need to apply three mathematical op- Stream: Robust and Stochastic Optimization erations to the distribution functions: convolution, multiplication and Invited session excess-beyond operation. For studying the convergence behaviour we Chair: Marie Schmidt make use of a result of Loynes who examined the long-run waiting times in a simple queuing model. 1 - A Chance-Constrained Approach for Lateness Avoid- In this iteration scheme the delay distribution functions of the arrival and departure events converge under certain conditions. But the ran- ance in Routing Problems with Time Windows and dom variable of the limiting distribution might not be finite. Moreover, Stochastic Travel Times the limiting distribution might not be unique, i.e. independent of the Jan Fabian Ehmke, Ann Campbell, Timothy L. Urban sorting used for the events. In practice, the execution of delivery routes often differs from plans due to a variety of influences such as traffic jams, weather, customer availability, etc. These influences may cost the driver to miss cus- tomer time windows. While the predictability of some of these factors is nearly impossible, it is often possible to derive travel time distri-  TA-15 butions based on historical traffic data. This idea is the foundation Thursday, 8:15-9:45 - SFo11 the vehicle routing problem with time windows and stochastic travel times (SVRPTW). In this work, we propose a fairly straightforward Advances in credit scoring methodology I way to guarantee a given service level at all customers by ensuring a certain probability of arrival before the end of each customer’s time Stream: Statistics and Forecasting window. We particularly consider how arrival time distributions should properly be propagated throughout a route given the presence of time Invited session windows. Our chance-constrained approach carefully considers how Chair: Gero Szepannek to estimate the arrival time distributions at each customer on the route, which enables us to verify if the given service level is maintained at in- 1 - Exact fit of simple finite mixture models dividual customers. Our ideas can be "plugged’ into any algorithm for the SVRPTW and thus be used to solve large problems fairly quickly Dirk Tasche in the form of a feasibility check. 
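The service-level requirement in the chance-constrained approach of Ehmke, Campbell and Urban can be written, in our notation, as
\[ \Pr\bigl(A_i \le l_i\bigr) \;\ge\; \alpha \quad \text{for every customer } i, \]
where A_i is the (random) arrival time at customer i, l_i the end of its time window and \alpha the required service level.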
We exemplarily show how to im- How to forecast next year’s portfolio-wide credit default rate based plement them in a tabu search solution approach. Computational ex- on last year’s default observations and the current score distribution? periments based on Solomon instances demonstrate how the solutions A classical approach to this problem consists of fitting a mixture of change for different levels of customer service across two probability the conditional score distributions observed last year to the current distributions and several parameter settings. Results show that it is score distribution. This is a special (simple) case of a finite mixture possible to achieve a certain level of service at almost no additional model where the mixture components are fixed and only the weights of cost for some types of instances, while others require up to 33% more the components are estimated. The optimum weights provide a fore- vehicles and 22% more working time to make delivery routes virtually cast of next year’s portfolio-wide default rate. We point out that the free of lateness. maximum-likelihood (ML) approach to fitting the mixture distribution not only gives an optimum but even an exact fit if we allow the mixture 2 - Robust optimal strategies in timetable information components to vary but keep their density ratio fix. From this obser- Marie Schmidt, Marc Goerigk, Matthias Müller-Hannemann, vation we can conclude that the standard default rate forecast based on Anita Schöbel last year’s conditional default rates will always be located between last


year’s portfolio-wide default rate and the ML forecast for next year. data and (together with the scenario assumptions for energy demand We also discuss how the mixture model based estimation methods can and DSM) they yield the residual load. Sensitivities examined include be used to forecast total loss. This involves the reinterpretation of an two different RES mixes, delayed grid expansion, increased costs for individual classification problem as a collective quantification prob- CO2-emissions and a reduced utilization of DSM. The scenario with lem. reduced DSM utilization results in the highest storage demand for Ger- many while in the other scenarios energy supply is mainly ensured just 2 - Consensus Information and Consensus Rating - A as well by capacity expansions of combined cycle power plants and Note on Methodological Problems of Rating Aggre- gas turbines by extensions on cross-border interconnectors. gation 2 - Design of distributed energy supply systems through Christoph Lehmann, Daniel Tillich analysis of near-optimal solutions Quantifying credit risk with default probabilities is a standard tech- Philip Voll, Maike Hennen, Andre Bardow nique for financial institutes, investors or rating agencies. To get a A methodology is presented for the design of distributed energy sup- higher precision of default probabilities, one idea is the aggregation of ply systems exploiting the near-optimal solution space. Distributed different available ratings (i.e. default probabilities) to a so called ’con- energy supply systems are integrated systems incorporating a multi- sensus rating’. But does the concept of ’consensus rating’ really make tude of technical units. The design of these systems is intrinsically sense? What is a ’real’ consensus rating? This paper tries to clarify un- complex and challenging. For this reason, the design should best be der which conditions a consensus rating exits. Therefore, the term of addressed by mathematical optimization. Usually, optimization-based ’consensus information’ is used. This leads to a concept that deals with design methods aim at generating the mathematically optimal solution the precision of aggregated rating characteristics. Within this frame- for a given problem. However, the mathematical models employed work the problem of misinformation resp. contradictory information never perfectly represent the real world. Thus, the optimal solution is addressed. It is shown that consensus information is not necessarily also only approximates the optimal real-world solution. For this rea- more informative than individual information. Furthermore, the aggre- son, the optimum alone is of limited use to the design engineer in prac- gation aspects are discussed from a statistical perspective. tice. 3 - The Exact Solution of Multi-Period Portfolio Choice In this paper, a design approach is presented, which supports the en- gineer through the generation of a set of near-optimal solution alter- Problem with Exponential Utility natives. These alternatives can be evaluated in more detail a poste- Nestor Parolya, Taras Bodnar, Wolfgang Schmid riori. The near-optimal solutions are generated systematically by se- quentially solving a series of optimization models, each extended by In this talk we present our recent results on the multi-period portfolio an integer-cut constraint to exclude already known solutions. selection problem with exponential utility function. 
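In Tasche's mixture-model setting above, last year's conditional score densities for defaulters and non-defaulters, f_D and f_N, are kept fixed and only the weight is fitted to this year's score density f, e.g. by maximum likelihood (our notation):
\[ f(s) \;\approx\; w\,f_D(s) + (1-w)\,f_N(s), \]
the fitted weight w then serves as the forecast of next year's portfolio-wide default rate.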
It is assumed that the asset returns depend on predictable variables and that the joint ran- Analyzing a real-world problem at the industrial scale, we reveal a rich dom process of the asset returns and the predictable variables follow near-optimal solution space with structurally very different solutions a vector autoregressive process. We prove that the optimal portfolio that exhibit similar objective function values. Considering the many weights depend on the covariance matrices of the next two periods constraints and uncertainties arising in practice, it is practically impos- and the conditional mean vector of the next period. The case without sible to rank the generated solutions strictly based on a single objective predictable variables and the case of independent asset returns are par- function value. Instead, the near-optimal solutions should be employed tial cases of our solution. Furthermore, we provide an empirical study to support the design process by emphasizing the "must-haves’ and the where the cumulative empirical distribution function of the investor’s differences of the generated solutions, i.e., the rational choices. wealth is calculated using the exact solution. It is compared with the investment strategy obtained under the additional assumption that the 3 - Simulation of the System-Wide Impact of Power-to- asset returns are independently distributed. Gas Energy Storages by Multi-Stage Optimization Christoph Baumann, Julia Schleibach, Albert Moser In order to reduce greenhouse gas emissions, the expansion of renew- able energy sources in the European power system is strongly pro- moted. Especially the intermittent feed-in of wind power and photo-  TA-16 voltaic plants increases significantly and will result in high temporary Thursday, 8:15-9:45 - SFo14 surpluses of electrical energy. Thus, short- and long-term energy stor- ages are required in the future power system. A promising option for Energy Systems Engineering: Methods long-term storage is the transformation of electrical energy into hydro- gen or methane using Power-to-Gas technology (PtG). The produced and Real-Life Applications gas can then be stored in the natural gas infrastructure. This way PtG couples the power with the natural gas system and an evaluation of the Stream: Energy and Environment impact of PtG requires a combined simulation of both systems. In this Invited session paper, a simulation method for the European power and natural gas system based on an optimization approach is introduced. The mathe- Chair: Andre Bardow matical formulation of the problem represents a minimization of total Chair: Albert Moser costs subject to the coverage of demand and reserve requirements as well as the observance of technical constraints. Due to the problem 1 - A European Dispatch and Investment Model for De- complexity, especially because of binary decisions and non-linearities resulting from restrictions of thermal power and PtG plants, a closed- termining Cost Minimal Power Systems with High loop optimization is not practicable. Therefore, the introduced method Shares of Renewable Energy consists of a multi-stage optimization with the use of Lagrangian Re- Angela Scholz, Fabian Sandau laxation and decomposition techniques. 
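The integer-cut constraints used by Voll, Hennen and Bardow to enumerate near-optimal designs are typically of the standard no-good form: for an already known binary design y*, require (our notation)
\[ \sum_{i:\,y^*_i=1} (1-y_i) \;+\; \sum_{i:\,y^*_i=0} y_i \;\ge\; 1, \]
which excludes exactly the solution y* from the feasible set and nothing else.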
After the presentation of the mathematical model and the technical implementation, exemplarily re- In order to achieve the climate protection targets of the European Com- sults for a future scenario of the European power and natural gas sys- mission increased use of renewable energy sources (RES) is vital. As tems are shown in this paper. The following evaluation of the results these exhibit an unsteady availability, future power systems will re- focusses on the robustness of the developed method as well as the com- quire technologies able to shift the available energy either in space putational time. Finally, an outlook on future method enhancements is or time. Such technologies include demand-side-management (DSM), given. transmission systems and electricity storages while in times of low re- newable feed-in flexible power plants such as gas turbines and com- 4 - Applied MILP Modelling and Optimisation of Cooling bined cycle power plants are required. To analyse such aspects in sce- Energy Systems narios for the time horizon 2050 with high shares of RES, we have Stefan Kirschbaum, Michael Zens, Christoph Kausch, Helmut established a combined dispatch and investment model determining required storage and exchange capacities as well as capacities of ther- Lepple, Achim Brenner mal power plants at minimum economic costs. This linear optimiza- About 15% of the German electricity consumption is used to oper- tion problem (LP) is constrained to meet the energy demand in each ate chillers and cooling towers. The mathematical optimisation of the country and every hour of the year. Furthermore, it has to comply mode of operation of these units is often very time consuming, because with the technology specific restrictions. Generally, wind and solar en- the part load behaviour and temperature dependency of both chillers ergy as well as energy from run-of-river are taken into account as an and re-cooling units is non-linear and the interdependencies cannot hourly time series for each country. These time series are generated in easily be overseen. Mathematical models often neglect the tempera- an exogenous step based on detailed geographical and meteorological ture dependency of the unit performance and take only the part load


behaviour at fixed temperatures into consideration. Hence degrees of 3 - Goal congruence and preference similarity between freedom are neglected, that could lead to significant energy savings. principal and agent - setting incentives under risk In this paper we present a methodology to model chillers and cooling with differing time horizons towers in order to perform a mixed integer linear optimisation. The Markus Grottke, Josef Schosser focus of the presented methodology is the application by energy con- sultants. Therefore a practical linearization approach is used, that is We analyse in a parsimonious static model how goal congruence or applicable with data that is available in a non-academic energy system preference similarity can be obtained when principal and agent are analysis. It is shown how chillers, cooling towers and pumps can be risk sensitive and when a setting is prevailing in which the agent has a modelled linearly and how an optimization can be done using this lin- shorter time horizon than the principal while intertemporal dependen- ear model. The models consider the part load behaviour for fixed tem- cies in risky cash flows are to be taken into account. peratures both for chillers and re-coolers. Furthermore the dependency between outside temperature and re-cooler efficiency of dry cooling Our results are as follows. First, we identify preferences that allow for towers and between wet bulb temperature and efficiency of wet cool- preserving the unique properties of the residual income measure when ing towers is taken into consideration. The chillers are modelled using agent and principal are risk sensitive. Second, we are able to show that a linearized dependency between the temperature of the cooling water in addition to the identified preferences constant absolute risk aversion and the electricity consumption. allows for reconciling the agent’s and the principal’s risk attitude. Fi- nally, we find a new relative risk allocation scheme for this setting. The methodology is used to optimize a cooling network consisting of It allows for both, (robust) goal congruence and preference similarity 13 chillers and 9 cooling towers with an overall maximum cooling ca- when cash flows are normally distributed. We also prove that these pacity of 27MW. The optimised control uses 15% less electrical energy results hold for the special case of budget restrictions. than the actual implemented control strategy. JEL: M 41, J 33
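A possible linearized chiller model of the kind described by Kirschbaum et al. above could take the form (structure and coefficients are assumptions for illustration only)
\[ P_{el} \;=\; a_0\,\delta + a_1\,\dot{Q} + a_2\,T_{cw}, \qquad \dot{Q}_{\min}\,\delta \;\le\; \dot{Q} \;\le\; \dot{Q}_{\max}\,\delta, \qquad \delta \in \{0,1\}, \]
with on/off state \delta, delivered cooling load \dot{Q} and cooling-water temperature T_{cw}.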

 TA-17 TA-18 Thursday, 8:15-9:45 - 001  Thursday, 8:15-9:45 - 004 Accounting Supply Chain Design and Collaboration Stream: Finance, Banking, Insurance, and Accounting Invited session Stream: Supply Chain Management Invited session Chair: Markus Grottke Chair: Alf Kimms 1 - Local governments in the wake of new financial man- 1 - A green facility location for a closed-loop supply agement: Evidence from Germany Dennis Hilgers, Hannes Lampe chain network design Babak Farhang Moghaddam, Amir Afshin Fatahi, Fatemeh The stressed financial situation in the public sector and the continu- Movahedi, Parastoo Hassani ous aspiration for austerity in western governments and public bodies is omnipresent. As one core element in the New Public Management There are lots of factors to evaluate the performance of the supply shift, Germany, like many other countries, has experienced significant chains such as customer service,quality, lead time, cost etc. But due to reforms in public sector accounting and reporting in the last decade. the environmental requirements (social responsibilities, Kyoto Proto- We analyse the effect of new accounting and budgeting regimes. We col,government agencies etc.) an increasing attention has to be given to therefore analyse public service provisions’ cost-efficiency of German develop environmental strategies. If the aforementioned environmen- local governments in the state of North Rhine-Westphalia applying a tal applications are considered in the management of supply chains, stochastic frontier approach. This study presents evidence for an effi- then a new paradigm called green supply chain management (GrSCM) ciency boost of municipalities due to the adoption of accrual account- can be achieved.In this paper, the design of a closed loop supply chain ing. Furthermore, we show that adopting accrual accounting leads to network is studied which includes multiple plants, collection centers, an increase in efficiency over time. demand markets and products. The proposed model is able to integrate the forward and reverse network design decisions to avoid the sub- 2 - Favoritism and indirect Reciprocity in Hierarchical optimality leads from separated and sequential designs. To this aim, a Relationships mixed-integer linear programming model is proposed that minimizes the total cost. Besides, a test problem is examined. Also we offer an Peter Bußwolder, Swetlana Dregert, Peter Letmathe exploratory study of modified facility location. Because of leading the environmental disasters, the global climate alteration has been one of This paper addresses the question of the effect of fair / unfair promo- the most important controversial issues in decades. The greenhouse tion on the willingness to cooperate within a group, using an extended gas emissions(co2, methane, nitric oxide, ozone etc.) begin with the version of the classic dictator game. To investigate the research ques- industrial revolution. After this significant event, the global warming tion at hand, a laboratory experiment is conducted. A 2x2 research is getting worst as long as the energy demands are met by the fos- design is used. During the experiment, the two members of each group sil fuels. Thus, model is extended to consider environmental factors will be assigned the role of a superior (promoted group member) and by-constraint method. Our results indicate that it may be desirable to a subordinate (responder). 
We manipulate two factors: Fair and unfair open more or different facilities than optimal from a narrow economic promotion as well as the possibility for the subordinate to punish her perspective to reduce the carbon dioxide emissions. superior. The promotion depends on the result of a real effort task. In the unfair setting, subjects with a higher score in the task get the 2 - Coordinating the Health Care Supply Chain - Saving less favorable position in the dictator game, the role of the responder. In consequence, the unfair promotion should not be attributed to the costs and improving service level through joint Fore- superior. It is analyzed how fairness of the promotion influences the casting and Planning heights of the offer. In addition to the dictator game we enable the re- Eike Nohdurft sponder to react to the offer by punishing the superior. The treatments with fair promotion rules (including punishment or not) are supposed The health care supply chain, especially in developed countries, is fac- to serve as the basis for comparison. The results are compared to the ing 2 major challenges: First, it is a substantial cost driver in an indus- unfair promotion treatments. Because of indirect reciprocity concerns, try under permanent cost-pressure. Second, successful supply chain we expect the superior to make higher offers in comparison to the con- practices from other industries, like retail, have not been broadly ap- trol treatment. On the other hand, subordinates punish more when they plied in health care leading to a gap in operational performance. The believe that they would have "earned’ the position of the superior if un- application of supply chain management methods like collaborative fairness is not "compensated’ by the superior. The results have impli- forecasting and planning could overcome both challenges. A compre- cations for promotion decisions and the subsequent interaction within hensive model quantifying the impact of such an application is miss- hierarchical groups. ing for the health care sector with its specific characteristics, like very high service level requirements and perishable products with limited


lifetime. This paper therefore aims to quantify the impact of the appli- A platform is able to change its direction horizontally and vertically, cation of collaborative forecasting and planning on supply chain per- furthermore, the length of a platform may be extended by a loading formance in health care and to provide guidelines for a successful ap- plain. Also, rotating of the vehicle direction is allowed which enables plication in this multi-stakeholder environment. The study is based on a large variety of loading patterns. In contrast to previous approaches, a model simulating a 3-tier supply chain from pharmaceutical manu- a second objective function which balances the workload allocation be- facturers to hospital patients. The model is fed with real-world patient tween the tours (or drivers, respectively) is considered. Workload of an demand data for multiple pharmaceuticals with different demand pat- auto-carrier is measured as the total driving time together with the total terns. Further studies aim to find mechanisms to balance the benefits unloading time. The bi-objective solution approach searches for a set of collaborative forecasting and planning between the partners of the of non-dominated solutions. It is based on a route-first, load-second health care supply chain. approach. The routing phase is performed by means of a clustering approach and 2-opt based neighborhood search. The loading phase 3 - Benders Decomposition applied to cooperative lot- is dealt with an integer program. Computational results illustrate the sizing trade-off between total transportation time and the desire to construct Andreas Elias, Alf Kimms tours with an equal workload. We consider a special lot-sizing problem in the context of purchasing 3 - A Greedy Algorithm for relocation problem in one- alliances. We focus on a supply chain which consists of several re- way carsharing tailers and one supplier. The retailers are free to cooperate in order to Rabih Zakaria benefit from quantity discounts. In case of a cooperation, transship- ments are possible, that is, movement of a product from one retailer to Carsharing systems offer a new mobility service that allow its users to another. A mixed integer programming problem is introduced to cope use cars when they need one, from a fleet of cars scattered in an urban with the decision problem of who orders when, for whom, how many area. Cars are located at different stations that have a fixed number of products and how many products are in stock or being transshipped. parking spaces. In this study, we are dealing with one-way carsharing Our goal is to minimize the total cost of the system. A Benders De- system where users can pick a car from a station and return it to any composition approach is applied to find a solution to this problem. other station. Available Cars and Free parking space at each station, play a major role in the success of the one-way carsharing systems. Therefore, Carsharing operators recruit employees to relocate cars be- tween the stations to avoid the rejection of users’ demands for picking up cars or returning them into stations. In this paper, we developed a  TA-19 greedy algorithm in order to solve the maximum number of rejected Thursday, 8:15-9:45 - I users’ demands, using the minimum number of employees. We used mobility data collected in an operational system to build the users’ de- mand matrices. 
In a carsharing system which has 20 stations, 150 cars, Routing and Vehicle Logistics through 1,374 trips made during one day and 555 expected rejected de- mands, results show that the algorithm is able to solve 55% of expected Stream: Traffic and Transportation rejected demands using 5 employees and more than 80% of these re- Invited session jected demands using 10 employees. We compared the algorithm with Chair: Rabih Zakaria an exact Mixed Integer Linear Programming model using IBM ILOG CPLEX optimizer and made the proof of performance with stochastic input data and different numbers of employees. 1 - Network Design and Transport Planning in Finished Vehicle Logistics Joachim Kneis According to the International Organization of Motor Vehicle Man- ufacturers (OICA), more than 15 million new vehicles were sold or  TA-20 registered in Europe in 2013. The majority of these vehicles needs Thursday, 8:15-9:45 - II to be transported from plant to dealership,which imposes large logis- tic challenges for the manufacturers. We introduce two transportation Modeling and Analysis of Complex problems that arise in the area of finished vehicle logistics and propose algorithms how to solve them. Manufacturing Systems On a strategic level, a manufacturer needs to negotiate contracts both Stream: Production and Operations Management for transportation and yard services. Any selection of contracts must consider capacity constraints as well as constraints on minimal usage Invited session and mutual exclusion of contracts. The selected contracts will usually Chair: Thomas Ponsignon be used as the default transport network for most of the vehicles for at least the next year. Due to the large amount of transported vehicles, possible savings from optimized networks are huge. 1 - Representing production scheduling problems with answer set programming approaches On an tactical level, a detailed routing for the next weeks needs to be computed based on the default network. However, time tables of ves- Andreas Starzacher, Gerhard Friedrich, Melanie Frühstück, sels and trains as well as unexpected disruptions of the transport net- Vera Mersheeva, Anna Ryabokon, Maria Sander, Erich work can result in bottlenecks that render the regular transport paths Teppan invalid. Among a set of alternative routes, a cost effective alternative that still delivers the vehicles in time has to be computed. Scheduling is one of the most important problems in industrial produc- tion planning. Unfortunately, in general this problem is NP-complete We present similar exact approaches for these problems that were de- which makes it a computationally hard problem to solve. Various veloped at INFORM and are used successfully by large manufacturers. methods such as SAT-Solving, Dynamic Programming, state-based We also emphasize on how changing requirements can be considered search, have been applied on scheduling since the late 1940s. In or- in the development of such algorithms, a key requirement in a modern der to handle the diversity and dynamics of different scheduling do- software development process. mains more general problem representations such as Mixed Integer Programming or Constraint Satisfaction have proven to be successful. 2 - Bi-objective auto-carrier transportation with work- More recent approaches build on the paradigm of Answer Set Pro- load balancing by means of a route-first, load-second gramming (ASP) which is a pure declarative approach based on a sub- heuristic set of first order logic. 
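A heavily simplified sketch of the relocation idea in Zakaria's abstract above (our own illustration, not the author's algorithm; the unit relocation time and the data structures are assumptions):

def greedy_relocation(trips, cars, capacity, n_employees, relocation_time=1):
    # Process trips in time order; whenever a pick-up would be rejected,
    # let an idle employee move one car from the station holding the most
    # cars to the pick-up station. Returns the number of relocations made.
    busy_until = [0] * n_employees
    relocations = 0
    for t, origin, dest in trips:            # trips sorted by start time t
        if cars[origin] == 0:                # pick-up would be rejected
            donor = max(cars, key=cars.get)  # station with the largest surplus
            for e in range(n_employees):
                if busy_until[e] <= t and cars[donor] > 0:
                    cars[donor] -= 1
                    cars[origin] += 1
                    busy_until[e] = t + relocation_time
                    relocations += 1
                    break
        if cars[origin] > 0 and cars[dest] < capacity[dest]:
            cars[origin] -= 1                # the trip is served
            cars[dest] += 1
    return relocations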
Onur Caki, Tobias Buer

The auto-carrier transportation problem is a challenging optimization problem in the automobile industry that is concerned with the distribution of finished vehicles from a storage area to multiple car dealers by means of auto-carriers. We model this problem based on the capacitated vehicle routing problem extended by loading constraints as recently proposed by Dell'Amico, Falavigna, and Iori. An auto-carrier consists of a truck and trailer with multiple movable platforms.

Due to the success of Constraint Programming (CP) techniques, hybrid approaches combining ASP and CP have been developed. Currently, there are two types of such Constraint ASP approaches. First, there are solvers providing an extended input language in order to have special constructs for expressing constraints concurrently to ASP constructs. Second, a more lightweight way of combining Constraint Programming and ASP is to use ASP as a specification language for Constraint Satisfaction Problems (CSPs), such that the solution (answer set) of an ASP program encodes a CSP which is then used by a CP/CSP solver as an input. In the proposed article we


compare such ASP based approaches with regard to their suitability for In this presentation, we will illustrate how the modeling script that is scheduling problems. Our analysis and exemplifications are done on central to your application, is able to extend itself based on data sup- the basis of a production scheduling problem of Infineon Technologies plied by the end user, using the model query and model edit functions Austria AG. The Infineon production scheduling problem incorporates offered by AIMMS. The presentation will discuss a repetitive model- the notions of workflows, priorities, due time, tardiness, change over ing pattern. Such a pattern can be coded once, generating script upon times, planned and unplanned downtimes of devices and furthermore. execution, thereby reducing the application’s maintenance costs. Fur- thermore, it will delve into the use of formulas as data. Formulas, such 2 - Complexity Measurement in the Semiconductor Sup- as "blending rules" and "pricing rules", are the intellectual property of ply Chain the end user, and such formulas are only made available while the end user is running its application. By treating these formulas as data, they Can Sun, Hans Ehm, Thomas Rose, Stefan Heilmayer can be used inside optimization models. Many activities happen in the daily innovation of supply chain and thus complexity is generated. One typical problem from high-tech in- 2 - Alternatives for Programming in Conjunction with an dustry is that when a new alternative solution for a technology emerges Algebraic Modeling Language for Optimization with cost saving but adding complexity, it is not obvious whether the Robert Fourer company should adopt it or not. Therefore it is important to evaluate the complexity and then identify which complexity is value-added and Modeling languages for formulating and analyzing optimization prob- which is not. Current research on qualitative measurement is mainly lems are essentially declarative, in that they are founded on a symbolic for the strategic analysis; some quantitative methods are also available description of a model’s objective function and constraints rather than but most of them still lack practicability and tools. To support deci- a procedural specification of how a problem instance is to be gener- sion making, we are interested in the implementation details, such as, ated and solved. Yet successful optimization modeling languages have formal indicators and structural measurement. This paper focuses on come to offer many of the same facilities as procedural, high-level the quantitative analysis of the complexity. The complex problem can programming languages, in two ways: by extension of their syntax be viewed as a system consisting of elements and various relationships to interpreted scripting languages, and by exposure of their functions based on the PROS (process, role, object, state) approach. Our hy- through application programming interfaces (APIs). How can script- pothesis of complexity measurement analyses the static and dynamic ing and APIs benefit the user of a declarative language, and what do parts separately. The static part can be measured using statistics of the they offer in comparison to modeling exclusively in a general-purpose elements and their unchanged relationships by calculating the weights language? This presentation suggests a variety of answers, through of attributes. 
And the dynamic part tracks the interactions of elements examples that make use of advanced AMPL scripting features and the and their changeable relationships in a holistic system. However, how new AMPL APIs for Java, MATLAB, and other platforms. to transfer this part to a numeric value is still under investigation. This idea was partly verified in an internal workshop for complexity man- 3 - Real-world Optimization Models Formulated and De- agement, where 4 different topics in the semiconductor industry from a specific technical problem to the general data management were dis- ployed as Applications using MPL OptiMax cussed by around 50 audiences. The first step of quantitative mea- Sandip Pindoria surement was tested by process and role analysis. It shows that this methodology has a potential to be applied in a broad area. MPL is a modeling system that allows the model developer to effi- ciently formulate complicated optimization models. We will demon- 3 - Supply Chain Integration and Practical Problems in strate some real-world MPL models that have been formulated as web- Semiconductor Manufacturing based applications using the MPL OptiMax Component Library. A Portfolio Optimization model will be explored and shown how it can Thomas Ponsignon, Christian Schiller be iterated to produce a trade-off curve (Efficient Frontier) between the Return and Risk. A Vehicle Routing application with multiple vehicles This paper deals with integration problems between planning, schedul- and depots. We will also demonstrate a Nurse Rostering application ing and execution processes that arise in semiconductor supply chains with multiple contract rules and objective criterias. on the example of Infineon Technologies AG. Over the last decades, semiconductor manufacturing evolved to complex production net- works with facilities dispersed all over the globe. Each step of the value chain can be processed in multiple parallel sites. The decision where and when to produce which products is taken in the enterprise-wide Master Planning from which weekly production targets are derived for TA-22 all fabrication sites. A high level of aggregation and a mid-term hori-  zon are considered. Later, the execution of the master plan is detailed Thursday, 8:15-9:45 - IV in a daily production schedule within the scope of each facility. In this paper, we investigate inconsistencies that may occur between decisions Applications of combined Simulation & taken at the corporate and local level. Among others, local scheduling Optimization Methods in Logistics approaches may not be able to cope with short-term supply disrup- tions from other sites. Also, different disaggregation procedures may lead to misaligned local production schedules. We first perform an as- Stream: Logistics and Inventory is analysis. Infineon’s planning landscape is compared to a reference Invited session model from the literature, namely IEC 62264 international standard. Chair: J. Fabian Meier We show that Infineon lacks an intermediate process to link up plan- ning with scheduling activities. We suggest improvement approaches for an enhanced vertical architecture that incorporates all relevant plan- 1 - Performance Evaluation of Metaheuristics for the ning, scheduling and execution processes. Finally, we sketch the long- Simulation-based Optimization of Material Flow Mod- term company vision of supply chain integration. els Christoph Laroque

Simulation, especially the discrete, event based approach is widely accepted as a decision support technology for the analysis of man-  TA-21 ufacturing systems. In practice, simulation studies aim either at the Thursday, 8:15-9:45 - III comparison of competitive system designs, the identification the best simulation model’s parameter configuration or both. Combinations of simulation and optimization heuristics support the user in automati- Optimization Modeling II cally finding optimal solutions, but typically result in long compu- tation times. This often prohibits the practical application of these Stream: Software Applications and Modelling Systems techniques. Therefore, this paper evaluates different heuristics for the Invited session simulation-based optimization approach, in order to derive a fast con- Chair: Robert Fourer verging procedure. Results are derived with a scalable material flow model and analytical mathematical functions. The implementation ad- ditionally includes an interactive analysis of simulation runs and an 1 - Letting Your Application do the Modeling early-exit-strategy. Chris Kuip


2 - Investigation of strategic and stochastic Aspects in to vehicles and/or depot positions. The problem usually involves a lot Hub Location Problems using Simulation of constraints and criteria. Constraints can be hard (e.g. a certain track Peiman Dabidian, J. Fabian Meier, Christian Tesch, Uwe needs to be cleared by a certain time) or soft (only use a certain vehicle if you really need to), and they can be both explicit and implicit. The Clausen criteria can be of different priorities and are often contradicting. Fur- Transport service providers aim to improve efficiency. They bundle thermore, as anyone who has used public transport can tell, there is a shipments in consolidation centres called hubs, before reaching other great degree of uncertainty. On top of that complexity, in a real-time hubs or their respective destinations and try to reduce the number of system we have strict time constraints for computation. Any decision connections in transport networks. Thereby, two different kinds of must be taken within a few seconds. costs are recognizable: first the costs for buildings and maintaining PSI Transcom’s depot management software provides all of this in a hubs which are mainly strategic and can be well approximated long highly configurable way. Constraints and criteria can be dynamically before. Further are the stochastic transport costs which depend on the added and modified, thus adapting the system to the special needs of actual transport volume that arises in the future. individual customers without changing existing code. The choice of Most hub location models ignore the dependence of transport costs the preferred solution among a set of feasible alternatives given multi- R on stochastic factors (e.g., the number and size of the shipments) to ple criteria is based on fuzzy logic, implemented in the Qualicision avoid computationally demanding models. A bad solution of a realis- kernel supplied by Fuzzy Logik Systeme. tic model might be worse in practise than a good solution of simpli- In this talk, I will first give a short overview of the general optimiza- fied model. Therefore it is not a priori clear which is the best model tion process in our software. Then I will provide some examples where to receive adequate results in limited computation time. One cannot standard combinatorial optimization problems, such as shortest path, put every model into practise and evaluate it, but there is a good sur- maximum flow and maximum matching, play a key role within this rogate: Event-based computer simulations allow us to consider much process. They occur as subproblems of the global optimization prob- more complicated models compared to those applicable in optimiza- lem and need to be solved in order to evaluate individual constraints tion algorithms. Furthermore, they are reasonably fast to be run hun- and criteria. dreds or thousands of times to give a good estimate of the stochastic nature of the problem. 2 - Optimizing Itineraries in Public Transportation with Following our well-established heuristic algorithm for complicated Walks between Rides hub location problems, we introduce the "stochasticity" as a parameter Bram de Jonge, Ruud Teunter into the model: Having a detailed model of the stochastic behaviour, we consider weaker and stronger integration of this behaviour into the We study the problem of finding an optimal itinerary to travel from a model for the optimization algorithm. 
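A deliberately simplified stand-in for the event-based evaluation sketched above (not the authors' simulation model; the lognormal demand assumption and the cost parameters are invented for this example) shows how a fixed hub assignment can be scored under stochastic shipment volumes:

```python
import numpy as np

def evaluate_hub_design(assignment, unit_cost, fixed_cost, n_scenarios=1000, seed=0):
    """Monte Carlo scoring of a hub assignment under random shipment volumes.

    assignment: dict origin -> hub, unit_cost: dict (origin, hub) -> cost per unit,
    fixed_cost: dict hub -> yearly cost of operating the hub.
    """
    rng = np.random.default_rng(seed)
    origins = list(assignment)
    open_hubs = set(assignment.values())
    strategic = sum(fixed_cost[h] for h in open_hubs)    # deterministic hub costs
    totals = np.empty(n_scenarios)
    for k in range(n_scenarios):
        volumes = rng.lognormal(mean=3.0, sigma=0.5, size=len(origins))  # stochastic demand
        transport = sum(v * unit_cost[(o, assignment[o])] for o, v in zip(origins, volumes))
        totals[k] = strategic + transport
    return totals.mean(), totals.std()   # average cost and its variability across scenarios
```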
The results of the optimization starting location to a destination location using public transport, where algorithm can subsequently be evaluated by a detailed event-based sim- we allow travelers to alternate rides with (short) walks. The main dif- ulation, both for average results and the variance in costs. ference with previous research is that we take all possible walks that a traveler can make into consideration. This large number of possi- 3 - Combining simulation and business analytics for op- ble walks poses a potential computational difficulty. However, in this study we derive theorems for identifying a small subset of walks that timizing safety stocks in supply networks only need to be considered. These results are embedded in a solution Philipp Arnold, Kai Gutenschwager algorithm, which is tested in a real-life setting for bus transportation in Groningen, a medium sized city in the northern part of the Netherlands. The calibration of supply networks considering safety stocks is a prob- An extensive numerical study leads to encouraging results. First, only lem that has drawn considerable research activities in the last decades. one per cent of all possible walks needs to be considered, so that the Besides analytical approaches also simulation has gained broader at- optimal itinerary can be determined very efficiently. Second, allowing tention recently. The main reason for using simulation is that these walks has considerable benefits; reducing the travel time in about 6 per models are usually more detailed and thus more realistic than analyti- cent of all randomly generated examples by more than 10 per cent on cal models. ZF Friedrichshafen AG, e.g. uses SimChain as a simula- average. tion tool for calibrating and optimizing their safety stocks. The main problem for such simulation-based approaches is the number of pa- rameter settings to be tested for each supply chain. A straight-forward approach is therefore to use an analytical model to gain a good start- ing solution. However, using analytical models to parameterize safety stocks within the model hardly ever leads to a clear reduction of the  TA-24 number of parameter settings to be tested by simulation in order to Thursday, 8:15-9:45 - AS obtain satisfactory results. Analyzing the simulation results of various projects and test settings, Revenue Management and Flexible however, show some significant mathematical relations between re- Products sults (in terms of average service levels) and parameter settings, e.g. safety stocks, transport times (including the underlying distributions) Stream: Pricing, Revenue Management, and Smart and also transport schedules. In this paper we present these results and Markets an approach for calibrating safety stocks based on business analytics, which has shown a clear superiority compared to other approaches in Invited session the field of simulation-based optimization with respect to the number Chair: Claudius Steinhardt of parameter settings to be tested. 1 - A long-term view on Flexible Products in Airline Rev- enue Management Sebastian Vock, Catherine Cleophas, Natalia Kliewer  TA-23 During the airline revenue management process demand is forecasted Thursday, 8:15-9:45 - V for whole flight networks. The forecasting methods represent the re- cent development and historical bookings. The whole process incor- Public Transport porates numerous uncertainties and risks. A relative new approach providing the possibility to overcome these uncertainties is the con- cept of flexible products. 
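The search step behind such itineraries can be illustrated with a plain Dijkstra run on a graph whose edges are either rides or walks (an illustration only; the walk-pruning theorems from the abstract are not reproduced, and the graph encoding is an assumption):

```python
import heapq

def quickest_itinerary(graph, source, target):
    """Dijkstra over a graph with edges (neighbour, minutes, mode), mode in {'ride', 'walk'}."""
    dist, prev = {source: 0.0}, {}
    heap = [(0.0, source)]
    while heap:
        d, node = heapq.heappop(heap)
        if node == target:
            break
        if d > dist.get(node, float("inf")):
            continue                                   # stale heap entry
        for nxt, minutes, mode in graph.get(node, []):
            nd = d + minutes
            if nd < dist.get(nxt, float("inf")):
                dist[nxt], prev[nxt] = nd, (node, mode)
                heapq.heappush(heap, (nd, nxt))
    if target not in dist:
        return None, []                                # destination unreachable
    path, node = [], target
    while node != source:                              # rebuild the itinerary backwards
        parent, mode = prev[node]
        path.append((node, mode))
        node = parent
    return dist[target], list(reversed(path))
```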
They give traditional network carriers the Stream: Traffic and Transportation opportunity to induce new customer markets and generate more rev- Invited session enue while improving the utilization of fixed capacities. In the last Chair: Kai-Simon Goetzmann years the amount of strategic customers is increasing and presents new challenges for airlines in the field of revenue management. Beside the pure revenue maximization it seems to be economically reasonable to 1 - Aspects of Combinatorial Optimization in Depot Man- integrate the customer satisfaction into decision making process. An agement for Public Transport increasing flexibility in airline revenue management process means an Kai-Simon Goetzmann extension of subsequent adaptabilities which can have effects on cus- tomer satisfaction in different dimensions. This talk will present op- In depot management for public transport, disposition is the process of portunities for network carriers to integrate Flexible Products in their assigning vehicles to tracks or to parking spaces, as well as journeys portfolio and outline possible impacts on customer satisfaction. We


formulate an analytical approach to represent dependencies between the amount of flexibility and customer value within the revenue man- Thursday, 10:15-11:00 agement process. With practical experiments we show the reliability of this model and calculate first numerical results. In our talk we present these results and outline shortcomings and possible future research di-  TB-01 rections. Thursday, 10:15-11:00 - Fo1 2 - Experimental Analysis of Buyer Behavior in Opaque Semiplenary Ben-Tal Selling Markets Martin Spann, Lucas Stich Stream: Invited Presentations and Ceremonies Opaque selling refers to a price discrimination practice in which a Semi-plenary session seller conceals some attributes of the product from the customers and Chair: Rüdiger Schultz reveals them only after a non-refundable purchase has been made. This selling strategy gives sellers degrees of freedom in assigning customers to a specific product (e.g., a product with excess capacity or distressed 1 - Tractable solutions of some challenging optimization inventory) and can thus enable them to increase revenues under con- problems sideration of capacity constraints. By offering customers a choice in- Aharon Ben-Tal volving uncertainty, the opaque product serves the seller as a mean to induce customers to reveal their idiosyncratic preferences. Customers Optimization problems associated with real applications often suf- with weak product preferences are motivated to choose the discounted fer from difficulties due to a large scale design dimension, or lack opaque product, whereas customers exhibiting strong preferences are of convexity, or the presence of uncertain parameters. We present less likely to opt for the uncertain option. The application and optimal some meaningful cases where these difficulties were successfully ad- design of opaque mechanisms requires a thorough understanding of the dressed. The examples include applications in Signal Processing, Ma- drivers of buyer behavior in such settings. Further, understanding cus- chine Learning, Supply Chains, and Medical Imaging. tomers’ decision-making is a prerequisite for integrating opaque sell- ing successfully into revenue management. In a series of experiments, we thus aim to identify the causality and strength of factors that drive customer behavior in opaque selling markets. In particular, we study how customers’ product choice and willingness-to-pay for the opaque TB-02 product is influenced by factors such as strength of preferences, atti-  tude towards risk, ambiguity and the source of uncertainty as well as Thursday, 10:15-11:00 - Fo2 the choice-elicitation interface to reduce opaqueness. Semiplenary Marschner 3 - On the application of DLP-based approaches for rev- enue management with flexible products Stream: Invited Presentations and Ceremonies Claudius Steinhardt, Jochen Gönsch, Sebastian Koch Semi-plenary session A major benefit of flexible products is that they allow for supply-side Chair: Hans-Jürgen Sebastian substitution even after they have been sold. This helps improve ca- pacity utilization and increase the overall revenue in a stochastic en- 1 - Re-Designing a parcel network for growth vironment. As several authors have shown, flexible products can be Andreas Marschner incorporated into the well-known deterministic linear program (DLP) of revenue management’s capacity control. In this talk, we show that t.b.a. 
flexible products have an additional "value of flexibility’ due to their supply-side substitution possibilities, which can be captured monetar- ily. However, the DLP-based approaches proposed so far fail to capture this value and, thus, steadily undervalue flexible products, resulting in lower overall revenues. To take the full potential of flexible products TB-04 into account, we propose a new approach that systematically increases  the revenues of flexible products when solving the DLP and perform- Thursday, 10:15-11:00 - Fo4 ing capacity control. A mathematical function of variables available during the booking horizon represents this artificial markup and adapts Semiplenary McLay dynamically to the current situation. We determine the function’s pa- rameters using a standard simulation-based optimization method. Nu- Stream: Invited Presentations and Ceremonies merical experiments show that the benefits of the new approach are Semi-plenary session biggest when low value demand arrives early. Revenues are improved by up to 5% in many settings. Chair: Erik Demeulemeester
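To make the DLP of capacity control referred to in the Steinhardt, Gönsch and Koch abstract concrete, here is a toy instance in the usual textbook form (product and capacity data are invented, and the flexible product is crudely modelled as a fixed 50/50 split across two flights; the artificial markup proposed in the talk is not included):

```python
import numpy as np
from scipy.optimize import linprog

# DLP of capacity control: max sum_j fare_j * x_j  s.t.  A x <= capacity, 0 <= x <= demand
fares = np.array([300.0, 180.0, 150.0])     # two specific products and one flexible product
demand = np.array([40.0, 60.0, 50.0])       # forecasted demand per product
A = np.array([[1.0, 0.0, 0.5],              # seats used on flight 1 per accepted request
              [0.0, 1.0, 0.5]])             # seats used on flight 2 (flexible split 50/50)
capacity = np.array([80.0, 80.0])

res = linprog(c=-fares,                     # linprog minimizes, hence the negated fares
              A_ub=A, b_ub=capacity,
              bounds=[(0.0, d) for d in demand])
print("accepted quantities:", res.x, "expected revenue:", -res.fun)
```

A full model would give the flexible product its own allocation variables per fulfilment alternative; the fixed split merely keeps the example to three variables.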

1 - Delivering emergency medical services: research, application, and outreach
Laura McLay

Laura McLay will describe her research projects that apply operations research methodologies to emergency medical services. These projects have resulted in several key insights into optimally using scarce public resources for responding to health emergencies. This talk will include a discussion of issues that affect models for optimally allocating scarce resources for public services (such as emergency medical services) including issues involving performance benchmarks, equity, natural disasters, and modeling human elements in systems. She will also discuss insights obtained from putting the results into practice in a real world setting.


network among consumers, along with content dimensions and their Thursday, 11:25-12:55 effects on declared adoption, about a popular subtopic: Rika Memo, a pellet-fired stove (www.rika.at/en/memo/). First, we look at the evolu- tion of this thread between its start in Oct 2007, and Oct 2012, by time  TC-02 intervals of a year, which summarizes the data in 6 graphs. All discus- Thursday, 11:25-12:55 - Fo2 sion board members who contributed to the topic at any time during our total study period (except for the thread’s initiator) are pictured as nodes in each graph. If members communicated with each other Coloring in the thread before or during a certain year, this is visualized with a link between them in that year’s graph. Cumulative number of posts Stream: Discrete and Combinatorial Optimization, exchanged is reflected in the strength of a link. Next, we construct Graphs and Networks similar series of graphs for each of some relevant content dimensions: Invited session social and informational exchange (Harmsen - van Hout et al. EJOR 2013), where the latter is subdivided in the traditional 4 elements of the Chair: Marjolein Harmsen - van Hout marketing mix, as well as positive and negative exchange. Finally, by coloring the nodes in all graphs we investigate the effects of (dimen- 1 - A Branch-and-Cut algorithm for robust graph color- sions of) communication on adoption of the respective heating system. ing By such semi-dynamic visualization we add to the literature on social network visualization (e.g., Trier ISR 2008) to gain new insights into Birol Yuceoglu, Stan van Hoesel, Guvenc Sahin the dynamic characteristics of online consumer discussion fora as well as the topic of energy consumer behavior and how this can be affected The graph coloring problem corresponds to assigning a minimum num- by low-cost online communication platforms. ber of colors to the vertices of a graph such that no two vertices of an edge get the same color. The problem is used in scheduling, timetabling, and telecommunications networks. However, in real-life problems assignments that are feasible may conflict as a result of un- foreseen circumstances (e.g. delays). When we consider unforeseen circumstances, a feasible assignment may still be undesirable. In the  TC-03 robust graph coloring problem, every pair of vertices that can be as- Thursday, 11:25-12:55 - Fo3 signed to the same color has a cost coefficient, representing undesir- ability of the assignment. The robust graph coloring problem is to create a valid coloring of a graph by minimizing the total cost of the Multi-objective Integer Programming I assignment. The problem is used in scheduling where tasks are subject to delays and cost coefficients represent the probability and the effect Stream: Decision Theory and Multi-Criteria Optimiza- of delays. In our work, we model the robust graph coloring problem tion by modifying the asymmetric representatives formulation, originally used for the graph coloring problem. In the formulation, representa- Invited session tive vertices are used in order to represent vertices belonging to the Chair: Thibaut Barthelemy same color class. The formulation reduces symmetry in the problem. We present various classes of valid inequalities as well as their separa- 1 - New properties to reduce the size of bi-objective tion procedures. We discuss a column generation scheme and present computational results. 
knapsack problems chaabane Djamal 2 - New Experimental Algorithm for List Coloring Prob- lem In this paper, we treat the binary combinatorial optimisation problems by focusing on bi- objective study as was widely done in the mono- Andrew Ju objective case. On the basis of Julien Gorge’s observation made in his In the classical vertex coloring problem one asks if one may color the paper, where he proposed a new approach for reducing a priori the size vertices of a graph G = (V, E) with one of k colors so that no two ad- of the binary uni-dimensional knapsack problem with multiple objec- jacent vertices are similarly colored; the corresponding optimization tives; this reduction allows to fix a priori some components to 0 or 1 in problem seeks to find the minimum value k for a graph that admits a all the efficient solutions, showing that many variables do not satisfy legal k coloring. The list coloring problem is a variant of vertex col- the developped properties to reduce the instances size of the bi-objectif oring where a vertex may be colored only a permissible color from Knapsack problem, even if they are regular. To remedy some of these a prescribed set. Many problems that rely on vertex coloring can be last cases, we propose, however, another way to do so. The approach modelled more appropriately using list coloring. For example, exam is based on the determination of the supported extreme solutions and timetabling is frequently modelled as a vertex coloring problem where the dominance relation of the efficiency of the objectifs objects. A di- graph edges represent subjects that may not be scheduled simultane- dactic example is presented to illustrate all the stages as well as some ously. Other constraints, such as preference for the times an exam numerical experiments. may be scheduled, are often considered to be soft constraints. By sup- plying a list of (in)appropriate hours for each exam one may model 2 - Transforming Constraints into Objectives:A Biob- the problem more accurately as an instance of list coloring. Similarly, jective Solution Method for Solving Bidimensional the frequency assignment problem for cellular telephone networks and Knapsack Problems WLANs may be modelled more accurately by restricting the coloring Britta Schulze, Luis Paquete, Kathrin Klamroth, José Rui of vertices (transmitters or routers) to a specified set. Figueira Clearly the list coloring problem is as hard as vertex coloring, for the latter reduces to the former (in polynomial time) through supplying, We consider constrained combinatorial optimization problems and re- for every vertex, all colors as its permissible list. However, in spite of lax one or several of the constraints. In this way, we formulate as- its importance few published algorithms exist for list coloring. In this sociated multiple objective optimization problems. This allows us to research, we propose a new efficient ILP based algorithm and com- analyze the trade-off between constraint satisfaction on one hand and pare it with the two existing ones we could find in the literature: the original objective value on the other hand. As a concrete example greedy, random algorithm k-GL (Greedy List) proposed by Achlioptas problem, we consider bidimensional knapsack problems (i.e., one ob- and Molloy (1997), and the maximal independent set-based heuristic jective and two knapsack constraints) and their associated biobjective, algorithm LC proposed by Tsouros and Satratzemi (2005). single-constraint knapsack problems. 
A dynamic programming based solution approach is adapted to compute the nondominated set of the 3 - Online discussion among energy consumers: A transformed problem or a subset of it. It is shown that a representation semi-dynamic social network visualization of the nondominated set is obtained at little extra cost as compared to Marjolein Harmsen - van Hout, Reinhard Madlener, Carsten the solution of the original problem. In this context we discuss strate- gies for bound computation and for handling negative cost coefficients, D. Prang which occur through the transformation. Much on behavior by energy consumers can be learned from what they tell each other in online discussion fora. Hence, we perform a longitu- 3 - Beam Search for integer multi-objective optimization dinal case study on the discussion platform www.energiesparhaus.at. Thibaut Barthelemy, Sophie Parragh, Fabien Tricoire, Richard Specifically, we visualize the yearly changes in the communication Hartl
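A small dynamic-programming sketch for the biobjective, single-constraint knapsack obtained from the transformation in the Schulze et al. abstract (illustrative only; the item data are invented and none of the bound-computation or negative-coefficient strategies are shown):

```python
def biobjective_knapsack(items, capacity):
    """Nondominated (profit1, profit2) vectors of a biobjective 0/1 knapsack.

    items: list of (weight, profit1, profit2); capacity: int.
    """
    states = [set() for _ in range(capacity + 1)]    # states[w]: profiles with total weight w
    states[0].add((0, 0))
    for weight, p1, p2 in items:
        for w in range(capacity - weight, -1, -1):   # backwards loop keeps 0/1 semantics
            for q1, q2 in states[w]:
                states[w + weight].add((q1 + p1, q2 + p2))
    # merge all weight levels and filter dominated profiles
    candidates = sorted({p for level in states for p in level}, key=lambda p: (-p[0], -p[1]))
    frontier, best_p2 = [], -1
    for p1, p2 in candidates:
        if p2 > best_p2:                             # kept points are mutually nondominated
            frontier.append((p1, p2))
            best_p2 = p2
    return frontier

print(biobjective_knapsack([(3, 4, 1), (2, 1, 3), (2, 2, 2)], capacity=4))  # [(4, 1), (3, 5)]
```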


Beam search is a tree search procedure where, at each level of the tree, pricing strategies are presented and evaluated. Since these problems at most W nodes are kept. That results in a meta-heuristic whose solv- turn out being NP-hard even if the network structure of transportation ing time is polynomial in both W and the number of variables as long network is very simple, efficient heuristic solution approaches are in- as node selection is decided in polynomial time. troduced. Although popular for solving single-objective problems (mostly scheduling-related ones), beam search has been studied for multi- objective optimization by two teams so far. The approaches have three fundamental components in common: branching scheme, variable se- lection and node pruning. Authors do not analyze influence of the  TC-05 branching scheme on solution quality. Regarding the decision rule for Thursday, 11:25-12:55 - Fo5 selecting the variable to branch on at each tree-node, they notice a strong influence on the quality. Pruning the nodes in polynomial time Network Design is made in various ways. Our work studies those three components and explains their influence. Relying on theoretical grounds, we ad- Stream: Discrete and Combinatorial Optimization, vice their design and show that specificities due to the multiplicity of objectives must be regarded. In particular, we clearly divide the node Graphs and Networks pruning process into two phases, respectively node quality evaluation Invited session and node selection, whose desirable properties are identified and re- Chair: Frank Fischer fined individually through empirical processes. At last, those versatile considerations are applied to the bi-objective 1 - Demand Distribution Effects on the Critical Service Knapsack Problem and the bi-objective Traveling Salesman Problem Systems Design with Profits. Thus, the benefits from our guidelines are shown com- Matej Cebecauer ponent by component. The so-obtained Knapsack solver outperforms the previous dedicated beam search of literature. The TSP results will The spatial design and operation of public service systems, such as allow authors for future comparisons. emergency health-care stations, police or fire departments, require to estimate the size of demand, its spatial distribution, possible locations of service centers and the travelling times. Thus, to design and operate these systems efficiently a lot of data need to be collected and properly utilized. In recent years, the availability of open data is rapidly grow- ing and new possibilities, how to build better and more detailed data  TC-04 models, emerge. Remarkable example is the OpenStreetMap (OSM) Thursday, 11:25-12:55 - Fo4 portal providing detailed geographical information. Here, we use OSM data to extract the road network and to identify customers’ locations. Logistics Scheduling When estimating the demand for services, we consider as customers all inhabitants and therefore, we use available population grids. We Stream: Project Management and Scheduling calculate and compare efficient designs corresponding to two demand profiles, night time demand profile, when the majority of inhabitants Invited session rests at home and the demand profile derived from the 24hours average Chair: Nils Boysen of the population density. We draw conclusions on how affected is the efficient design of service systems by the used population grid. 
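A bare-bones version of such a multi-objective beam search, applied to a biobjective knapsack, may help fix the three components mentioned above (branching, node evaluation, node selection); the weighted-sum evaluation used for pruning is an assumption of this sketch, not the authors' rule:

```python
def beam_search_knapsack(items, capacity, width=4, weights=(0.5, 0.5)):
    """Heuristic biobjective 0/1 knapsack via beam search.

    items: list of (weight, profit1, profit2). Level k branches on item k into
    'skip'/'take'; only the `width` best-scored nodes survive to the next level.
    """
    beam = [(0, 0, 0, [])]                 # (used capacity, profit1, profit2, chosen items)
    for idx, (w, p1, p2) in enumerate(items):
        children = []
        for used, q1, q2, chosen in beam:
            children.append((used, q1, q2, chosen))                            # skip item idx
            if used + w <= capacity:
                children.append((used + w, q1 + p1, q2 + p2, chosen + [idx]))  # take item idx
        # node quality evaluation: weighted sum of both objectives; selection: keep top `width`
        children.sort(key=lambda n: weights[0] * n[1] + weights[1] * n[2], reverse=True)
        beam = children[:width]
    return [(q1, q2, chosen) for _, q1, q2, chosen in beam]

print(beam_search_knapsack([(3, 4, 1), (2, 1, 3), (2, 2, 2)], capacity=4, width=3))
```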
1 - A heuristic decomposition procedure for the twin 2 - Complexity Results for Network Design with Com- robot scheduling problem on a line pression Nils Boysen, Dirk Briskorn, Simon Emde Martin Tieves, Arie Koster This talk treats the twin robot scheduling problem where two moving Recent advances in communication technology allow to compress data robots execute storage and retrieval moves in parallel along a shared streams in communication networks by deploying physical devices pathway. The depots are located at both ends of the line and a dedicated (caches) at routers, yielding a more efficient usage of link capaci- robot is assigned to each of them. While moving goods between their ties. This gives rise to the network design problem with compression respective depots and some storage locations on the line, non-crossing (NDPC), a generalization of the classical network design problem. In constraints among robots need to be considered. This problem setting this paper, we compare both problems, focusing on the computational is, for instance, relevant in container yards of large ports, where two complexity and analyzing the differences induced by the compression identical gantry cranes (robots) store and retrieve containers from sea- aspect. and landside in parallel. We present an efficient decomposition heuris- We show that the subproblem of adding compression, i.e., the com- tic, which solves even large problem instances with hundreds of jobs pressor placement problem (CPP), is already weakly NP-hard, even on close to optimality in a couple of minutes. instances where Network Design alone is easy. We conclude with a 2 - Routing feeder ships along a coastline pseudopolynomial algorithm for tree instances and a restricted poly- Michael Zenker, Simon Emde, Nils Boysen nomial case. With an ongoing containerization of the global trade, optimizing routes 3 - Models for Virtual Network Embedding Problems of container ships received plenty attention within the recent years. with Time Restrictions Most of the models developed in this context are adaptions of the well- Frank Fischer, Andreas Bley known vehicle routing problem, which, however, in some scenarios An important task in modern communication networks is the virtual- seems needlessly complex. For instance in short-sea shipping, most ization of network resources. On top of an existing physical substrate satellite ports visited by feeder ships to exchange containers with a network (SN), a virtual network provider implements virtual networks central hub port lie on a straight line along some coastline or river. (VN) consisting of processing units connected by links. The process- This paper investigates ship routing where all ports are located along ing units are realized in terms of virtual machines on real physical ones, a shoreline and introduces problem versions solvable in polynomial the links are mapped to physical connections. Different virtual ma- time. Other problem settings are shown to be NP-hard, so that heuris- chines may be mapped to the same physical machine and links may be tic solution procedures are developed. A comprehensive computational realized using several routing and communication lines in the SN. study investigates the efficiency of different routing policies. The aim of the Virtual Network Embedding Problem (VNEP) is to map 3 - Zone-based tariff design in public transportation nodes and links of a VN to nodes and paths in the SN. 
The nodes and Benjamin Otto, Nils Boysen links of each VN possess certain demands on resources on the substrate nodes (e.g. computing capacity) and edges (e.g. bandwidth require- In public transportation, tariff planning is an important decision prob- ments). The task is to embed several VNs so that resource constraints lem, which considerably influences the profit of operators and the cus- in the substrate nodes and edges are satisfied. Classical approaches tomer satisfaction of passengers. In this talk we investigate a tariff typically consider a simultaneous embedding of all VNs in an offline design problem, which maximizes the profit while considering the cus- setting or the successive embedding of arriving VNs into the running tomers’ willingness to pay in different tariff systems, such as unit, network in an online setting. In contrast, we regard the problem with distance- or zone-based tariffs. In the latter case, the transportation additional time restrictions for each VN. For each VN one is given a network is subdivided into disjoint zones and the customers’ fare de- interval when the VN should be embedded and a duration how long the pends on the traversed zones. For this tariff system, several zoning and VN lasts. The task is to find an embedding for each VN and together


with this a time slot in which the VN should be embedded, so that the TC-07 time restrictions are satisfied and for each point in time the capacity re-  strictions are fulfilled as well. The aim of this talk is to present models Thursday, 11:25-12:55 - Fo7 for solving the Time Restricted Virtual Network Embedding Problem based on integer programming. We present first computational results Algorithm Engineering to compare the performance of these models on moderately sized in- stances. Stream: Discrete and Combinatorial Optimization, Graphs and Networks Invited session Chair: Lars Beckmann TC-06  1 - A computational comparison of approaches to La- Thursday, 11:25-12:55 - Fo6 grangian duals: the case study of FC-MMCF Stochastic Programming Concepts and Enrico Gorgone, Antonio Frangioni, Bernard Gendron Models The focus of this work is to compare several Lagrangian relaxation approaches for solving the multicommodity capacitated network de- Stream: Robust and Stochastic Optimization sign problem (FC-MMCF). This problem frequently appears in the Invited session real world. In fact, FC-MMCF problem arises in Logists, Telecom- Chair: Rüdiger Schultz munications and Transportation to model a plenty of applications. On the other hand, the numerical results aim at providing a benchmark for large scale MIP problems, pointing out the strengths and weaknesses 1 - Maintenance strategies for modular monotone multi- of the different Lagrangian approaches. state systems In particular, we consider the Flow and the Knapsack relaxation for Michael Krause FC-MMCF problem and we solve the Lagrangian duals by using dif- ferent methods coming from the differentiable optimization like (in- Preventive maintenance of technical systems strives at increasing the cremental, deflected, projected) subgradient-type methods and (disag- reliability or performance of the system. We consider modular mono- gregated, generalized) bundle type methods. tonic multi-state systems consisting of several components (modular- ity), where the maintenance of a component does not impair the perfor- 2 - Mixed-integer optimization of sparse representations mance of the system (monotonicity) and every component may evolve of vectors in frames in (possibly infinitely many) different states. A structure function maps Corinna Krüger, Anita Schöbel, Gerlind Plonka-Hoch the component states to the system performance. We assume that the structure function, the stochastic wearing processes of the single com- Let F be a frame, which is a spanning set, of a real vector space and ponents, the available maintenance budget, and the (opportunity) cost let v be a vector in the same space. Our goal is to represent v as a for the deterioration of the system performance are known. We seek linear combination of elements of F and to use as few elements of F as a component-specific maintenance strategy that allocates the budget in possible. such a way that the total opportunity cost in the planning horizon is minimized. We address this optimization problem using the concept of The problem has applications in image processing, e.g. when a picture (approximative) dynamic programming. is wanted to necessitate as little storage space as possible, and in signal processing, e.g. when a decomposition of a piece of music into the 2 - Distribution shaping and scenario bundling for individual acoustic signals of the contributing instruments is required. stochastic programs with endogenous uncertainty The NP-completeness of the problem is known. 
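The big-M integer linear program for sparse representation that the Krüger, Schöbel and Plonka-Hoch abstract works with can be rendered compactly with the PuLP modelling library (the frame, the target vector and the value of M are invented for this sketch; deriving smaller, frame-specific values of M is precisely the subject of the talk, and a MIP solver such as the CBC binary shipped with PuLP is assumed to be available):

```python
import pulp

# Toy frame F (columns are the frame elements) and target vector v.
F = [[1.0, 0.0, 1.0, 0.5],
     [0.0, 1.0, 1.0, 0.5]]
v = [1.5, 1.0]
n = len(F[0])
M = 10.0                                   # crude big-M bound on the coefficients

prob = pulp.LpProblem("sparse_representation", pulp.LpMinimize)
x = [pulp.LpVariable(f"x{j}", lowBound=-M, upBound=M) for j in range(n)]  # coefficients
z = [pulp.LpVariable(f"z{j}", cat="Binary") for j in range(n)]            # support indicators

prob += pulp.lpSum(z)                                    # minimize the number of used elements
for i in range(len(v)):                                  # representation constraint F x = v
    prob += pulp.lpSum(F[i][j] * x[j] for j in range(n)) == v[i]
for j in range(n):                                       # big-M linking: x_j = 0 unless z_j = 1
    prob += x[j] <= M * z[j]
    prob += x[j] >= -M * z[j]

prob.solve()
print([xj.varValue for xj in x], [zj.varValue for zj in z])
```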
In our talk we use Marco Laumanns, Steven Prestwich, Ban Kawas a formulation of the problem as an integer linear program [see Jokar and Pfetsch 2008], which requires big-M constraints, where M is an Stochastic programs are usually formulated with probability distribu- upper bound on all entries of at least one optimal solution of the orig- tions that are exogenously given. Modeling and solving models of en- inal problem. Since M depends on the corresponding frame F, we use dogenous uncertainty, where decisions can influence the probabilities, properties of F in order to decrease M. has remained a largely unresolved challenge. In this talk we present In particular, we determine the size of M in the integer linear program- a new approach to handle endogenous uncertainty in stochastic pro- ming formulations for the following two frames: firstly the frame con- grams for the case of decision-dependent probabilities, called distri- sisting of all the vectors of the standard basis together with the columns bution shaping. It enables an efficient characterization of decision- of the N-dimensional normed Haar matrix and secondly the standard dependent probability measures based on the observation that neigh- basis vectors together with the transposed rows of the N-dimensional boring probability measures in the space of the influencing binary de- normed Haar matrix. The individual structures of both frames are used cision variables, linearly related according to Bayes’ Rule. Accord- to obtain big-M constraints which are small enough to allow the appli- ingly, we derive a successive polyhedral characterization of probability cation of the integer linear program to vectors of dimension 16 up to measures as a function of decisions and reformulate the corresponding 4096. Finally, optimal solutions to the nonlinear problem for each of nonlinear stochastic programs as mixed-integer programs. We demon- the two frames are obtained. strate the effectiveness of the approach on two example problems. The first example is a pre-disaster planning problem of finding optimal in- 3 - Exploiting performance variability in MIP solvers us- vestments to strengthen links in a transportation network, given that the links are subject to stochastic failure. Using the new approach, a ing machine learning recently considered instance of the Istanbul highway network can be Lars Beckmann solved to optimality within seconds, for which only approximate so- lutions have been known so far. The second example is a stochastic Performance variability is a phenomenon inherent to all state-of-the- project planning problem, where individual activities have a risk of art solver codes in which different representations of a problem in- exceeding their allocated planned duration. This probability can be re- stance result in different solver runtimes. In mixed-integer program- duced by investing additional resources, and our approach allows to ming (MIP) solvers, simply permuting the rows and/or columns of the find an investment plan to that minimized expected project duration. A matrix is sufficient to drastically change solver performance. This is, in part, due to imperfect tie-breaking. Several approaches exist to 3 - Stochastic Dominance in Stochastic Optimization exploit this variability through parallelizing the solving of several rep- resentations and choosing the one that looks the most promising. 
In contrast to these methods, we propose an offline machine learning approach that predicts "good" problem instance representations in order to achieve better runtimes, meaning no parallelism is required. Our approach analyzes the structure of instances and permutations of the variables/constraints in a problem in order to predict whether or not a particular instance representation will provide a low runtime.

Rüdiger Schultz

The talk addresses modeling and algorithmics in stochastic programming with dominance constraints. With an accent on recent developments, mixed-integer models and models involving PDE constraints will be discussed.
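A schematic of the offline learning step proposed in the Beckmann abstract (purely illustrative; the feature extraction, the labels and the choice of a random forest are assumptions of this sketch, not the author's pipeline):

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

def make_features(instance, permutation):
    """Hypothetical feature extraction for one (instance, permutation) pair,
    e.g. statistics of the permuted constraint matrix; placeholder values here."""
    rng = np.random.default_rng(hash((instance, permutation)) % (2**32))
    return rng.random(8)

# Offline phase: gather (features, was_fast) pairs from benchmark runs and fit a classifier.
X = np.array([make_features(i, p) for i in range(50) for p in range(5)])
y = np.random.default_rng(1).integers(0, 2, size=len(X))      # stand-in for measured labels
model = RandomForestClassifier(n_estimators=100, random_state=0).fit(X, y)

# Online phase: score candidate permutations of a new instance and keep the most
# promising one, so no parallel solving of several representations is needed.
candidates = np.array([make_features("new_instance", p) for p in range(5)])
scores = model.predict_proba(candidates)[:, 1]
print("choose permutation", int(np.argmax(scores)))
```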


TC-08 since power is consumed in the moment of generation. The integra-  tion of heat storage might be efficient, as heat generation is decoupled Thursday, 11:25-12:55 - Fo8 from demand. This allows a partially power price oriented plant op- eration, where power is generated especially in times of high market Allocation under Preferences prices. Therefore, the short-term development of the power market has to be anticipated. Consequently, an efficient plant operation depends to Stream: Algorithmic Game Theory a great extent on the accuracy of the anticipated power prices and the Invited session flexibility due to the respective storage capacity. This contribution an- alyzes the effects of short-term uncertainties in the power price on the Chair: Martin Hoefer CHP unit commitment for different heat storage capacities. An exten- sive Monte Carlo Simulation is run in order to determine the financial 1 - An Improved Approximation Algorithm for the Stable consequences of inaccurate power price anticipation. The study shows Marriage Problem with One-Sided Ties that the storage capacity affects the sensitivity of the solution due to Chien-Chung Huang stochastic influences. Since a higher storage capacity increases the flexibility in the unit commitment, the CHP plant operation is able to We consider the problem of computing a large stable matching in a bi- react faster on inaccurately anticipated power prices. Thus, high stor- partite graph G = (A cup B, E) where each vertex u in A cup B ranks its age capacities go along with robust solutions concerning the financial neighbors in an order of preference, perhaps involving ties. A matching consequences due to uncertain power prices. The consideration of only M is said to be stable if there is no edge (a,b) such that a is unmatched long-term uncertainties might result in an underestimation of heat stor- or prefers b to M(a) and similarly, b is unmatched or prefers a to M(b). age capacity. It is recommended to additionally integrate short-term While a stable matching in G can be easily computed in linear time by uncertainties in models for strategic planning of heat storage capacity. the Gale-Shapley algorithm, it is known that computing a maximum size stable matching is APX-hard. 2 - Optimal operation of a CHP plant for the energy bal- In this paper we consider the case when the preference lists of vertices ancing market in A are strict while the preference lists of vertices in B may include Katrin Schulz, Bastian Hechenrieder, Brigitte Werners ties. This case is also APX-hard and the current best approximation The provision of balancing power offers additional revenue poten- ratio known here is 25/17 approx 1.4706 which relies on solving an tial, especially for energy companies with a combined heat and power LP. We improve this ratio to 22/15 approx 1.4667 by a simple linear (CHP) plant and a heat storage. Balancing power is needed to ensure time algorithm. a reliable power supply as nominal frequency has to be maintained We first compute a half-integral stable matching in 0,0.5,1|E| and round despite of unscheduled power plant outages and a volatile feed-in of it to an integral stable matching M. The ratio |opt|/|M| is bounded via renewable energies. In Germany the transmission system operator is a payment scheme that charges other components in opt oplus M to responsible for the provision of sufficient balancing power. On the Ger- cover the costs of length-5 augmenting paths. 
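For reference, the Gale-Shapley procedure mentioned in the abstract in its textbook proposal/rejection form (strict, complete preference lists assumed; the approximation algorithms for ties presented in the talk are of course more involved):

```python
from collections import deque

def gale_shapley(proposer_prefs, acceptor_prefs):
    """Proposers propose in preference order; acceptors keep the best offer seen so far.

    proposer_prefs / acceptor_prefs: dict name -> list of names, best first.
    Returns dict acceptor -> matched proposer (a stable matching).
    """
    rank = {a: {p: i for i, p in enumerate(prefs)} for a, prefs in acceptor_prefs.items()}
    next_choice = {p: 0 for p in proposer_prefs}      # next list position to propose to
    free = deque(proposer_prefs)                      # currently unmatched proposers
    match = {}                                        # acceptor -> proposer
    while free:
        p = free.popleft()
        a = proposer_prefs[p][next_choice[p]]
        next_choice[p] += 1
        if a not in match:
            match[a] = p                              # acceptor was free: accept
        elif rank[a][p] < rank[a][match[a]]:
            free.append(match[a])                     # acceptor trades up, old partner is free
            match[a] = p
        else:
            free.append(p)                            # proposal rejected
    return match

prefs_a = {"a1": ["b1", "b2"], "a2": ["b1", "b2"]}
prefs_b = {"b1": ["a2", "a1"], "b2": ["a1", "a2"]}
print(gale_shapley(prefs_a, prefs_b))                 # {'b1': 'a2', 'b2': 'a1'}
```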
There will be no length- man balancing market the demand for three types of balancing power 3 augmenting paths here. is procured in a request for proposal process. The assignment takes place on the basis of work and achievement costs. In order to par- We also consider the following special case of two-sided ties, where ticipate on the balancing market for minute reserve, municipal energy every tie length is 2. This case is known to be UGC-hard to approx- companies have to submit a bid for each hour of the following day that imate to within 4/3. We show a 10/7 approx 1.4286 approximation comprises a price and the amount of electricity at which power genera- algorithm here that runs in linear time. tion can be increased or decreased. If the bid price is lower or equal to 2 - Uncoordinated Matching Markets with Local Con- the market clearing price, the contract is awarded to the energy com- pany. In this case the energy company has to ensure that the needed straints capacity is available considering the own uncertain heat and power de- Lisa Wagner, Martin Hoefer mand. Therefore, unit commitment of a CHP plant and a heat storage as well as capacity allocation for the balancing energy market are si- We study matching games and stable matchings with different forms multaneously optimized in our approach to support energy companies of local constraints. In our model, each player is a node in a fixed planning their bids. matching network and strives to be matched to another player. Each player has a complete preference list over all other players it can be matched with but depending on the constraints and the current state of the game not all potential matching partners are available at all times. For the constraints we concentrate on the well studied cases of locally stable matching and friendship matching as well as considerate stable  TC-11 matching and socially stable matching, but additionally give results for Thursday, 11:25-12:55 - SFo3 a way broader class of matching games with local constraints in the case of correlated preferences. We focus on convergence of dynamics Vector and Set Optimization I to stable states (regarding the local constraints) but also give insights on the relationships between the different types of constraints. Further we analyze maximum stable matchings and prove that unlike for the Stream: Decision Theory and Multi-Criteria Optimiza- setting without constraints for all our settings computing the size of a tion maximum stable matching is NP-hard and further hard to approximate Invited session within a factor of 1.5-epsilon. Chair: Andreas Löhne Chair: Benjamin Weißing
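As a toy illustration of the simultaneous CHP/heat-storage optimization described in the Schulz, Hechenrieder and Werners abstract, the following LP dispatches heat generation against anticipated power prices (all data are invented; on/off decisions, minimum loads, the balancing-market bids and every source of uncertainty are left out):

```python
import pulp

hours = range(6)
power_price = [30, 35, 60, 80, 45, 25]     # anticipated market prices (EUR/MWh), invented
heat_demand = [20, 25, 30, 30, 25, 20]     # heat demand per hour (MWh_th), invented
max_heat, power_per_heat, fuel_cost = 40, 0.8, 30
storage_cap, storage_start = 30, 10        # heat storage size and initial level (MWh_th)

prob = pulp.LpProblem("chp_with_heat_storage", pulp.LpMaximize)
q = [pulp.LpVariable(f"heat_{t}", 0, max_heat) for t in hours]      # heat generated in hour t
s = [pulp.LpVariable(f"store_{t}", 0, storage_cap) for t in hours]  # storage level after hour t

# Profit: power revenue of the coupled electricity output minus fuel cost of heat generation.
prob += pulp.lpSum((power_price[t] * power_per_heat - fuel_cost) * q[t] for t in hours)

for t in hours:
    previous = storage_start if t == 0 else s[t - 1]
    # heat balance: generation plus storage withdrawal must cover demand, surplus is stored
    prob += s[t] == previous + q[t] - heat_demand[t]

prob.solve()
print([q[t].varValue for t in hours], [s[t].varValue for t in hours])
```

With a larger storage, more generation can be shifted into high-price hours, which is the flexibility effect the abstract investigates.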

1 - Parametric Simplex Algorithm for Linear Vector Opti-  TC-10 mization Problems Thursday, 11:25-12:55 - SFo2 Firdevs Ulus, Birgit Rudloff, Robert J. Vanderbei We propose a parametric simplex algorithm for solving linear vector Combined Heat and Power optimization problems (LVOPs). It is a generalization of the paramet- ric self-dual simplex algorithm, which originally is designed for solv- Stream: Energy and Environment ing single objective linear optimization problems, and capable of solv- Invited session ing two objective LVOPs whenever the ordering cone is the positive orthant. Our algorithm works for any dimension, and it is possible to Chair: Katrin Schulz extent it to any polyhedral ordering cone C. In each iteration, the algo- rithm provides a set of inequalities, which define the current partition 1 - Impact of heat storage capacity on CHP unit commit- of the parameter space and correspond to a vertex of the upper image. ment under power price uncertainties In addition to the usual simplex arguments, one needs to eliminate the Matthias Schacht, Brigitte Werners redundant inequalities from that set. This extra step is similar to the vertex enumeration procedure, which is used in most of the objective Combined heat and power (CHP) plants generate heat and power si- space based LVOP algorithms. Different from those, this algorithm multaneously leading to a higher efficiency than an isolated produc- doesn’t require to solve a scalar linear program in each iteration. tion. CHP unit commitment requires a complex operation planning,


2 - bensolve – A tool for solving Linear Vector Optimization Problems
Benjamin Weißing, Andreas Löhne

1 - A multi-mode RCPSP with fuzzy activity times for robust scheduling of product development projects
Maren Gäde, Matthias Wichmann, Thomas Spengler

Often, in Linear Vector Optimization Problems, one is confronted with a large number of variables, whereas the dimension of the outcome Due to a considerable degree of uncertainty, the generation of base- space is of considerably lower dimension. Taking advantage of this line schedules for the execution of product development projects is a observation, Benson proposed an outcomespace-based outer approx- challenging task. In order to cope with unforeseen disruptions during imation algorithm. Later, Ehrgott, Löhne and Shao presented a dual project execution, research efforts have been made in the area of robust variant of this algorithm, and recently, Hamel, Löhne and Rudloff pro- project scheduling. There, uncertainty is most often treated by model- vided further improvements and extensions to Benson’s algorithm. The ing activity durations as random variables with a known distribution. aim of the talk is to present an implementation of this algorithm along However, due to creative engineering tasks, lack of historical data and with several enhancements that were made. Therefore, in the first part vague specifications of product characteristics, it is not trivial to define we will explain what we consider to be a "Linear Vector Optimization appropriate distribution functions. Thus, some authors propose the use Problem" (LVOP) and state an appropriate solution concept. In the of fuzzy logic to account for imprecision in the data. When combining second part, we will show how such a solution can be computed by fuzzy logic and robust project planning, the challenge lies in evaluating iteratively approximating the so called "upper image" of the LVOP, a the robustness of the resulting schedule. In this context, we examine polyhedron which is defined to be the image of the objective function the suitability of existing robustness measures for the assessment of over the feasible set plus (Minkowski) the ordering cone (an arbitrary fuzzy project plans. To this end, we present a basic model formula- pointed solid polyhedral cone). In order to compute the outer approx- tion of a multi-mode resource-constrained project scheduling problem imation polyhedra in each iteration step, we need to solve one linear (RCPSP) with fuzzy activity times and a pre-specified project deadline. programm. The solution of this LP defines a "cutting plane", which is Using a numerical example to illustrate the solution structure of fuzzy used to refine the outer approximation. Ascertaining the resulting poly- project plans, the applicability of different robustness measures is dis- hedron as intersection of the old approximation and the affine halfspace cussed. Finally, an outline of a robust scheduling methodology for the induced by the cutting plane is called "vertex enumeration". We will presented multi-mode RCPSP with fuzzy activity times is presented. show how the vertex enumeration can be computed efficiently by using adjacence- and incidenceproperties of the polyhedra. Also the reduc- tion of computational expenses for solving the LP’s in every iteration 2 - Resource-constrained project scheduling with over- step by utilizing the common structure of these LP’s (warmstarts) will time be considered. Eventually, we will present computational examples. 
3 - Robustness concepts for uncertain multi-objective optimization problems
Jonas Ide, Anita Schöbel

Robust optimization incorporates uncertainties in the formulation of optimization models and hedges against these uncertainties by minimizing the worst case of all possible outcomes. Different concepts of what is seen as robustness are presented in the literature. Multi-objective optimization, on the other hand, considers multiple objectives and investigates solution techniques for calculating efficient solutions, i.e., solutions whose objective vector is not dominated in the objective space. Since handling uncertainties and multiple objectives is necessary for many real-world applications, combining robust and multi-objective optimization is of high practical and theoretical interest.

In this talk, we present several concepts of robustness for uncertain multi-objective optimization problems, some of which are extensions of classical concepts of robustness for single-objective optimization problems, while others are new concepts developed specifically for the multi-objective setting. Namely, we present the concepts of minmax, highly, flimsily, and lightly robust efficiency as well as the concept of lower set less ordered efficiency. We motivate the different concepts by pointing out which strategy a decision maker follows by choosing each of the respective concepts.

Furthermore, we investigate relationships between the concepts and shortly present algorithms for calculating robust efficient solutions. Most of these algorithms are based on well-known solution techniques for calculating efficient solutions to deterministic multi-objective optimization problems, such as the weighted sum scalarization and the epsilon-constraint method.

Finally, we illustrate the concepts on a practical example.

TC-12
Thursday, 11:25-12:55 - SFo4
Applications and Models in Project Scheduling
Stream: Project Management and Scheduling
Invited session
Chair: Jürgen Zimmermann

1 - A multi-mode RCPSP with fuzzy activity times for robust scheduling of product development projects
Maren Gäde, Matthias Wichmann, Thomas Spengler

Due to a considerable degree of uncertainty, the generation of baseline schedules for the execution of product development projects is a challenging task. In order to cope with unforeseen disruptions during project execution, research efforts have been made in the area of robust project scheduling. There, uncertainty is most often treated by modeling activity durations as random variables with a known distribution. However, due to creative engineering tasks, lack of historical data and vague specifications of product characteristics, it is not trivial to define appropriate distribution functions. Thus, some authors propose the use of fuzzy logic to account for imprecision in the data. When combining fuzzy logic and robust project planning, the challenge lies in evaluating the robustness of the resulting schedule. In this context, we examine the suitability of existing robustness measures for the assessment of fuzzy project plans. To this end, we present a basic model formulation of a multi-mode resource-constrained project scheduling problem (RCPSP) with fuzzy activity times and a pre-specified project deadline. Using a numerical example to illustrate the solution structure of fuzzy project plans, the applicability of different robustness measures is discussed. Finally, an outline of a robust scheduling methodology for the presented multi-mode RCPSP with fuzzy activity times is presented.

2 - Resource-constrained project scheduling with overtime
Andre Schnabel, Carolin Kellenbrink

Jobs scheduled in the classical resource-constrained project scheduling problem (RCPSP) consume renewable resources during their execution. Thereby, it is often assumed that each of these resources has a constant capacity throughout the planning horizon, which must not be exceeded. In practice, the usage of additional capacity can be part of the decision problem. For that reason, we extend the classical RCPSP by a decision on the usage of overtime with associated penalty costs (RCPSP-OC).

In order to solve problem instances of practically relevant size, we develop a heuristic solution method. Studies from the literature show that the most powerful heuristics for the RCPSP contain the serial schedule generation scheme (SSGS) at their core. Accordingly, a heuristic for the RCPSP-OC based on the SSGS seems to be promising. Due to the fact that additional capacities are not considered, however, the SSGS in its basic form is not suitable for solving the RCPSP-OC. Therefore, we present a modified version of the SSGS embedded in a genetic algorithm. Additionally, we evaluate further approaches for solving the RCPSP-OC by choosing different representations in a genetic algorithm.
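To illustrate the serial schedule generation scheme this abstract builds on, a minimal Python sketch with a single renewable resource and a simple overtime penalty (generic data and rules; the authors' modified SSGS and genetic algorithm are not reproduced here):

# Minimal serial schedule generation scheme (SSGS) with overtime, illustrative only.
# Activities are given in a precedence-feasible order; one renewable resource.
def ssgs_with_overtime(durations, demands, preds, capacity, max_overtime, penalty):
    """Schedule each activity at the earliest precedence- and capacity-feasible start.
    Capacity may be exceeded by at most max_overtime units per period; every
    exceeded unit-period is charged with `penalty`."""
    horizon = sum(durations) + max(durations)
    usage = [0] * horizon                      # resource usage per period
    start = {}
    for j, dur in enumerate(durations):
        t = max((start[p] + durations[p] for p in preds[j]), default=0)
        while any(usage[tau] + demands[j] > capacity + max_overtime
                  for tau in range(t, t + dur)):
            t += 1                             # shift right until feasible
        start[j] = t
        for tau in range(t, t + dur):
            usage[tau] += demands[j]
    overtime_cost = penalty * sum(max(0, u - capacity) for u in usage)
    return start, overtime_cost

# Tiny example: four activities, precedences 0->1, 0->2, {1,2}->3
durations = [2, 3, 2, 1]
demands   = [2, 2, 3, 1]
preds     = {0: [], 1: [0], 2: [0], 3: [1, 2]}
print(ssgs_with_overtime(durations, demands, preds, capacity=4, max_overtime=1, penalty=5.0))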
3 - Combined staff and machine shift scheduling in a German potash underground mine
Marco Schulze, Jürgen Zimmermann

We consider a German potash underground mine where crude salt is mined using a room-and-pillar mining method. The excavation is based on conventional drilling and blasting techniques. This kind of underground mining is characterized by different consecutive production steps (operations) such as filling blast holes with explosive substance or loading broken material. Each production step requires one trackless machine (from a set of heterogeneous machines) and a mine worker who has the corresponding skill. The daily workforce scheduling problem forms the bottom level of a hierarchical planning approach. In order to generate reasonable shift schedules, the overlying planning levels provide input data concerning which amount has to be mined per shift/day and which parts of the mine should be excavated with higher priority. Therefore, our problem consists of specifying the assignment and scheduling of the planned operations to the resources, i.e., miners and machines that are available in the respective shift. Due to a variety of practical requirements, even small instances could not be solved to optimality within reasonable computation time. For this reason, we develop a problem-specific construction heuristic that is already embedded into the IT structure of our industrial partner. We exemplify our solution approach by scheduling a real-world shift.


TC-13
Thursday, 11:25-12:55 - SFo9
Metaheuristics for Assignment Problems
Stream: Heuristics, Metaheuristics, and Matheuristics
Invited session
Chair: Taieb Mellouli

1 - An Ant Colony System adaptation to deal with accessibility issues after a disaster
Alfonso Mateos, Antonio Jiménez-Martín, Héctor Muñoz

One of the main problems relief teams face after a natural or man-made disaster is how to plan rural road repair work tasks to take maximum advantage of the limited available financial and human resources. Previous research focused on speeding up repair work or on selecting the location of health centers to minimize transport times for injured citizens. In spite of the good results, this research does not take into account another key factor: survivor accessibility to resources.

In this paper we account for the accessibility issue, that is, we maximize the number of survivors that reach the nearest regional center (cities where economic and social activity is concentrated) in a minimum time by planning which rural roads should be repaired given the available financial and human resources. This is a combinatorial problem since the number of connections between cities and regional centers grows exponentially with the problem size, and exact methods are not suitable for achieving an optimum solution.

In order to solve the problem we propose using an Ant Colony System adaptation, which is based on ants' foraging behavior. Ants stochastically build minimal paths to regional centers and decide if damaged roads are repaired on the basis of pheromone levels, accessibility heuristic information and the available budget.

The proposed algorithm is illustrated by means of an example regarding the 2010 Haiti earthquake, and its performance is compared with another metaheuristic, GRASP.
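For readers unfamiliar with Ant Colony System, a minimal sketch of the textbook state-transition and local pheromone update rules in Python (generic notation; the adaptation described in the abstract will differ in its details):

import random

def acs_next_arc(candidates, tau, eta, beta=2.0, q0=0.9):
    """Textbook ACS transition rule: with probability q0 pick the arc maximizing
    tau * eta**beta (exploitation), otherwise sample proportionally (exploration)."""
    scores = {a: tau[a] * eta[a] ** beta for a in candidates}
    if random.random() < q0:
        return max(scores, key=scores.get)
    total = sum(scores.values())
    r, acc = random.uniform(0.0, total), 0.0
    for a, s in scores.items():
        acc += s
        if acc >= r:
            return a
    return a  # fallback for floating-point rounding

def acs_local_update(tau, arc, rho=0.1, tau0=0.01):
    """Local pheromone update applied after an ant traverses `arc`."""
    tau[arc] = (1.0 - rho) * tau[arc] + rho * tau0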
2 - Heuristics for Multiple Domicile Crew Assignment based on Capacity Calculation and Expert Rules
Taieb Mellouli, Jörg Michels

We consider the crew scheduling problem for multi-domicile airlines with irregular flight schedules. Based on capacity-based crew pairing optimization, we develop heuristics for the crew assignment phase taking into account both airline productivity and crew welfare criteria. For crews stationed at several domiciles with several kinds of full- and part-time contracts and unevenly distributed times of vacation, off-duty requests and assigned office and simulator duties, it is difficult to assign all pairings into the timely unstructured gaps in-between the prescheduled activities and fixed and requested off-day blocks. We propose a three-staged assignment process: The first phase "capacity-based assignment" constructs a first assignment solution based on day-capacities, flight hour goals and lengths of gaps. The optional second phase "inter-domicile balancing" relocates some pairings out of domiciles with capacity deficiencies to correct the assignability of pairings. The third fine-tuning phase "intra-domicile fairness balancing" performs exchange moves between crew members of the same domicile, improving the starting solution by minimizing a variance-like evaluation function of fairness criteria based on equal distribution of flight hour goals, nice destinations, crew preferences and granted OFF requests. We report on some techniques enhancing the results for real-world data both quantitatively and qualitatively: Scheduling small gaps first and using multi-level moves between several crew members solved hard bottleneck situations. The integration of expert and corporate rules, e.g. schedule pairings directly before/after vacation and off-blocks, prefer early-to-late duty blocks, account for standby-usable days, enhanced the quality of assignment results.

TC-14
Thursday, 11:25-12:55 - SFo10
Health Care Operations Management
Stream: Health Care Management
Invited session
Chair: Katja Schimmelpfeng

1 - A hierarchical facility layout planning approach for large and complex hospitals
Stefan Helber, Steffen Kasper, Svenja Lagershausen

The transportation processes for patients, personnel, and material in large and complex maximum-care hospitals with many departments can consume significant resources and thus induce substantial logistics costs. These costs are largely determined by the allocation of the different departments and wards in possibly multiple connected hospital buildings. We develop a hierarchical layout planning approach based on an analysis of organizational and operational data from the Hannover Medical School, a large and complex university hospital in Hannover, Germany. The purpose of this approach is to propose locations for departments and wards for a given system of buildings such that the consumption of resources due to those transportation processes is minimized. We apply the approach to this real-world organizational and operational dataset as well as to a fictitious hospital building and analyze the algorithmic behavior and resulting layout.

2 - Strategic planning of coverage for inpatient primary healthcare
Verena Feld, Grit Walther

Demographic change as well as the introduction of a new compensation system based on diagnosis-related groups lead to a demand shift for hospital inpatient care in Germany. Thus, hospital capacity has to be adjusted across hospital sites and medical specialties to these new conditions. In Germany, strategic planning of inpatient care lies within the responsibility of the federal states, and is often executed with a limited regional scope in a rather hands-on approach by planning committees staffed with representatives of health insurances and hospital operators.

Against this background, a strategic planning approach is developed to increase the effectiveness and efficiency of hospital planning for the four medical specialties of inpatient primary healthcare. For this purpose, we develop an ILP, which simultaneously determines the location and number of hospitals, the medical specialties offered by each hospital, and the capacity per medical specialty (expressed by the number of beds). Furthermore, the model ensures the accessibility of inpatient primary healthcare for the entire population, while minimizing the overall number of hospitals and medical departments within a region.

We apply our model to the case study of North Rhine-Westphalia, where the federal government has recently enacted a new legislative framework indicating a reduction of the overall number of hospital beds by over 12 % (excluding psychiatry and geriatrics).
3 - Operational scheduling of care workers in long-term care facilities
Alexander Lieder, Dennis Moeke, Raik Stolletz, Ger Koole

Nursing homes provide long-term care for elderly people who are too frail or sick to live autonomously anymore. The majority of nursing home residents require assistance with activities of daily living such as bathing, grooming, eating meals and taking medication. Each task requires a specific level of qualification of the respective care worker and has to be performed within a small time window around the point in time requested by the resident. Since care workers cause the largest share of operational costs, there is a lot of pressure on the management of such facilities to appoint as few workers as possible while maintaining a high quality of service. We present a mixed-integer program (MIP) and a dynamic programming (DP) approach that generate optimal task schedules in terms of waiting times of the residents for a given workforce composition. To solve large problem instances, we develop heuristic solution approaches to speed up the DP approach. Using data from practice, we evaluate the computational performance of our solution approaches. Furthermore, we perform a sensitivity analysis to show how waiting times or workforce expenditures can be reduced by increasing the flexibility of the workforce (i.e., allowing workers with high qualification to perform tasks with lower qualification requirements) and by increasing the scale of the schedule (i.e., creating a common schedule for two neighboring departments of a facility).


TC-15
Thursday, 11:25-12:55 - SFo11
Forecasting for Business Analytics II
Stream: Statistics and Forecasting
Invited session
Chair: Sven F. Crone

1 - A Note on the Properties of the Independent Probit Model
Friederike Paetz, Winfried Steiner

Designing new products and forecasting their market success is crucial for companies. Nowadays, several different types of conjoint choice models are established to design new products and predict choice shares for given competitive market situations. Therefore, in-depth knowledge of model characteristics is important to select a context-specific adequate conjoint choice model and to assess choice shares of new products correctly. Generally, the Multinomial Logit (MNL) Model is applied, which is known to suffer from the popular Independence of Irrelevant Alternatives (IIA) property. This property may lead to biased model estimates. While the Multinomial Probit (MNP) Model is known to overcome the IIA property, it is still often stated in the modeling literature that its restricted form, the Independent Probit (IP) Model, still exhibits the IIA property. We confute this common belief empirically and illustrate the true properties of the IP Model. Like the MNL Model, the IP Model assumes independence between the utilities of alternatives. This independence assumption also leads to biased choice share predictions when different pairwise similarities of competitive products are present. However, the IIA property and the independence assumption do not build interchangeable constructs. Considering this is important when choosing an adequate model. Otherwise, implications for the market launch of new products may rest on a misleading basis.
2 - Predictive Value of Geometric Measures for Revisioning Pattern in Corporate Cash Flow Forecasting
Florian Knoell, Thomas Setzer

We introduce concise, multi-dimensional metrics to characterize the revision behavior found in cash flow forecasting processes and use these metrics to predict forecast accuracy, entropy, as well as the direction of a forecast error. Accuracy of cash flow forecasts is important in corporate reporting and planning systems, and corporate financial controllers require techniques to assess and improve the quality of the forecast data. However, vast amounts of cash flow forecasts with forecast horizons of up to 12 months are generated and revised regularly by local financial managers working for different subsidiaries in different regions and business divisions, and corporate financial controllers need decision support tools to analyze, assess, and improve the forecast data. Employing a large, multi-year dataset of real-world cash flow forecasts provided by a large multinational company, we show empirically that novel measures such as the (geometric) revisioning center in combination with the type and strength of a principal revision pattern – assigned based on the similarity of an individual revision pattern and empirical orthogonal patterns – provide predictive value over established quality indicators such as weak planning efficiency, or the determination of biases potentially introduced by 'anchoring and adjustment' or 'running down a forecast'.

3 - Detection and localization of Current Events using Web Data
Jan Stutzki

This paper presents results from our ongoing research project in the foresight area. The goal of our project is to develop web-based tools which automatically detect potential real-world events and associate them with a real location. This knowledge can be used to enable companies to adapt their capacities accordingly. As of now, we analyze the world wide web in more than 60 languages and can use a scalable amount of sources which we assign to one of over 100 national states. To reach this goal we utilize the big search engines, as their core competence is to determine the relevance of a document regarding the search query. The search engines allow us a slicing of the results by language and country. In the next step we download some of the proposed documents for analysis. Because of the amount of information required, we reach the field of Big Data. Therefore an extra effort is made to ensure scalability of the application. As data storage we chose a NoSQL database which scales linearly with the number of nodes and promises fault tolerance. To finally detect events in the data we use data mining methods which allow us to be independent from the language of a document. As input for these methods serve the downloaded documents and a specially prepared index structure containing metadata and various other information which accumulates during the collection of the documents. We show that we can detect current events with a high impact and their corresponding locality and discuss future research.
TC-16
Thursday, 11:25-12:55 - SFo14
Sustainable Supply Chains
Stream: Energy and Environment
Invited session
Chair: Jacqueline Bloemhof
Chair: Grit Walther

1 - Sources and Effects of Heterogeneous Willingness to Pay for Remanufactured Products
Rainer Kleber, Gilvan C. Souza, Guido Voigt

Current research on strategic issues in Closed-Loop Supply Chain Management typically assumes that consumers are characterized by a heterogeneous willingness to pay (WTP) for a new product, which is often modeled by using a uniform distribution. In contrast to this, the value that customers assign to a remanufactured product is determined by discounting the new product price with a common factor. Thus, any heterogeneity in consumers' WTP for remanufactured products solely stems from a differing WTP for the new product. This approach typically leads to linear price/quantity relationships, enhancing tractability of the resulting stylized models. However, recent empirical work indicates that consumers are quite different in their relative assessment of the quality of a remanufactured product. Our research aims at (1) assessing the impact of the assumption that consumers homogeneously discount the value of remanufactured goods on the price/quantity decisions of a monopolistic producer offering both new and remanufactured products, (2) identifying important drivers of the heterogeneity, and (3) providing a compelling model based on individual utility that can explain important drivers of the heterogeneity observed and fit the corresponding modeling parameters in an experimental study.

2 - An actor-oriented approach to evaluate climate policies with regard to resource intensive industries
Patrick Breun, Magnus Fröhling, Frank Schultmann

Resource intensive industries are still responsible for a large part of greenhouse gas (GHG) emissions in Germany. While some political stakeholders call for a more restrictive climate policy to force further reductions of GHG emissions, the exceptions made for these industries increase. Currently, there exist financial reliefs of about eight billion Euros due to different taxations and free allocation of certificates to guarantee the global competitiveness of German industries. The question arises how a more restrictive climate policy would affect industrial GHG emissions and the economic situation. Contrary to many other approaches in the field of policy evaluation, the underlying actor-oriented approach of the project DECARBONISE (funded by the BMBF) focuses on the simulation of plant-specific investment decisions as well as the calculation of plant-specific costs and revenues. Therefore, a detailed database of the internal material and energy flows of all relevant plants together with the currently available efficiency-increasing measures is developed. In the subsequent simulation, the plants, modelled as actors, decide on the implementation of these measures dependent on the GHG reduction potentials as well as on the overall economic and political conditions, which can be varied in scenarios. The approach focuses on the iron and steel and the aluminum sector, whose GHG emissions represent about 7% of Germany's overall GHG emissions. The results show that there are only minor reduction potentials in these industries due to already realized high efficiency standards. Thus, more restrictive climate policies only show slight additional GHG emission reductions of the German metal production but go along with a significant cost increase influencing global competitiveness.

3 - Dynamic capabilities and sustainability practices in supply chain management — a causal diagram and a system dynamics model
Marcus Brandenburg, Stefan Seuring, Daniel Thiel


The link between sustainable supply chain management and dynamic capabilities has been conceptualized by Beske (2012) and operationalized by adequate policies in various industries (Beske 2013, Beske et al. 2013). In a first step, the approach of the proposed paper is to develop a causal diagram from the conceptual framework and related literature. Based on these approaches, the different constructs are analyzed with regard to their importance for the coherence and interplay in sustainable supply chains. In a second step, a system dynamics model is developed from this causal diagram to assess the behavior of supply chains with regard to triggers and performance outcomes of sustainability. The system dynamics model is suitable to reflect the high complexity of different constructs and their dynamic interplay, to validate conceptual frameworks and to substantiate conclusions drawn from empirical findings. Faced with different demand fluctuations like pulses, steps and random fluctuations, preeminent loops have been detected for each scenario. These main regulation mechanisms are ensuring a stable and sustainable supply chain management. At the theoretical side, this means that the model can tell us which dynamic capabilities are more suitable for different market fluctuations.

Beske P (2012): Dynamic capabilities and sustainable supply chain management. IJPDLM 42 (4): 372-387.
Beske P (2013): Dynamic Capabilities in Sustainable Supply Chain Management. Kassel University Press, Kassel.
Beske P, Land A, Seuring S (2014): Sustainable supply chain management practices and dynamic capabilities in the food industry: A critical analysis of the literature. IJPE, http://dx.doi.org/10.1016/j.ijpe.2013.12.026
TC-17
Thursday, 11:25-12:55 - 001
Optimization and Statistics
Stream: Finance, Banking, Insurance, and Accounting
Invited session
Chair: Elina Rönnberg

1 - Measuring Parameter Uncertainty in CDO Pricing Models
Martin Schmelzle, Daniel Rösch, Stefan Weber

Uncertainty about the probabilistic modeling of contingent claims is magnified by the uncertainty stemming from the need to adequately parameterize the stochastic dynamics of a given model or family of models. Losses associated with model usage can lead to financial distress for market participants and possibly impose systemic risks spreading throughout the economy, attracting special attention from both risk management and regulatory authorities. This paper proposes a new framework based on convex risk measures to provide robust bid and ask pricing functionals. These uncertainty-capturing pricing functionals adjust models calibrated to benchmark instruments for model risk, additionally reflecting uncertainty about the true parameters if the solution to model calibration is not unique. Based on these robust risk measures, we introduce the notion of uncertainty premiums for discounted payoffs. This enables us to quantify the degree of uncertainty and derive price uncertainty ratios for contingent claims. Numerical case studies using market information from credit index tranches for multi-name credit payoffs subject to default risk capture the impact of uncertainty about the parameterization of probability (pricing) measures, illustrate the characteristics of parameter uncertainty premiums, and document their evolution over time.

2 - Reliability Assessment of Altman's Z-Score Model to Predict Financial Distress of Enterprises in Transitional Economy
Jelena Stankovic, Zarko Popovic, Ivana Veselinovic

The paper presents the results of an empirical testing of Altman's Z-Score model in transitional economy conditions as they are in Serbia today. The authors chose as research sample the enterprises in the recycling industry, because these enterprises are not heavily affected by the financial crisis in the period after 2008. Namely, regardless of the consequences of the economic crisis, the recycling industry in Serbia shows an increase in both total assets and total income of the sector. Also, the percentage of processed municipal and electrical waste in Serbia is still low by EU standards, but has been increasing in recent years. These facts indicate growth and development of the recycling industry in Serbia. On the other hand, the recycling industry is particularly affected by problems specific to transitional economies: (1) an unregulated supply market and landfills that do not meet the standards for collection and classification of waste, leading to difficulties in the continuous supply of recyclers, and (2) corruption scandals related to government subsidies and state financing of waste collection and processing, leading to an unequal market position of enterprises in this industry. These characteristics of the sample create a relevant basis for testing the reliability of the model in terms of issues specific to economies in transition, abstracting from the impact of the financial crisis. The aim of the paper is to test Altman's Z-Score model in terms of its reliability in predicting the performance of the enterprises within a two-year lag period. The predictive value of the Z-Score model is statistically tested by investigating the relationship of the model's results with profitability in terms of ROA, ROE, Liquidity Ratio and Net Income of the enterprises in the sample.

3 - Tight bounds on the cardinality constrained mean-variance portfolio optimization problem using truncated eigendecomposition
Elina Rönnberg, Fred Mayambala, Torbjörn Larsson

The mean-variance problem introduced by Markowitz in 1952 is a fundamental model in portfolio optimization up to date. This problem is particularly hard to solve when cardinality constraints are added, because the problem then becomes non-convex and NP-hard. This problem often includes transaction level constraints, that is, minimum and maximum portions held of an asset, if it is held at all. The existing exact methods for such problems take a huge amount of time to give solutions, which renders them practically hard to apply.

The aim of this talk is to introduce a method that provides tight lower and upper bounds to the mean-variance portfolio optimization problem with cardinality and transaction level constraints. The method involves performing an eigendecomposition of the covariance matrix and then using only a few of the eigenvectors and eigenvalues to obtain a relaxation of the original problem. This relaxation, when solved, gives a lower bound on the optimal value of the original problem. The solution to the relaxed problem is then used to obtain feasible solutions to the original problem and upper bounds.

The obtained upper and lower bounds are tight, and the computing time required to obtain them is much less than what state-of-the-art mixed-integer programming software uses. We test the method on large-scale problems of up to 1,000 assets, and the results are not only good but can also be obtained in a reasonable amount of time, which makes the method practically usable.
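A minimal sketch of why a truncated eigendecomposition yields a lower bound, assuming the m largest eigenvalues are retained (generic notation, not taken from the talk):

% Eigendecomposition of the covariance matrix and its truncation:
\Sigma \;=\; \sum_{i=1}^{n} \lambda_i v_i v_i^{\top},
\qquad
\Sigma_m \;=\; \sum_{i=1}^{m} \lambda_i v_i v_i^{\top}, \quad m < n .
% Since \Sigma - \Sigma_m is positive semidefinite, x^T \Sigma_m x <= x^T \Sigma x for
% every portfolio x; replacing \Sigma by \Sigma_m therefore relaxes the variance term,
% and the optimal value of the relaxed minimization bounds the original from below.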
TC-18
Thursday, 11:25-12:55 - 004
Contracts and Pricing
Stream: Supply Chain Management
Invited session
Chair: Jochen Gönsch

1 - Empirical Newsvendor Decisions under a Service Level Contract
Michael Becker-Peth

Analyzing the newsvendor context in laboratory experiments gives new insights into the behavioral aspects of decision makers. However, a valid question is whether the results of experimental studies can be transferred to real-world decisions. We test this by analyzing the decisions of real decision makers. We derive normative benchmarks for profit-maximizing behavior and compare these to actual data. Our findings indicate that real decision makers show similar decision biases as students in laboratory environments. Being the first to analyze a multi-product setting, we find a new decision bias, as our decision maker is aggregating costs.

2 - Optimal contract length in dynamic markets
Gilles Merckx, Aadhaar Chaturvedi

In some dynamic markets, like electronics, the cost of some components can change from one quarter to the next. Thus, some buyers might be interested in awarding regular short-term contracts to follow the best market price. However, when suppliers can invest in improving their production cost, such short-term contracts can deter suppliers from investing since they have no guarantee about future business. In this paper, we investigate this issue through a two-period model in which the buyer decides whether to auction off one contract at the beginning of each period or a single contract for both periods, depending on the size of its supply base. We assume that only the supplier that wins the first-period auction can invest to improve its second-period cost. We first characterize the endogenous optimal investment under both settings. Then we determine the equilibrium bidding strategy of the suppliers, as well as the optimal mechanism that the buyer should propose to its potential suppliers. Finally, we carry out some numerical analysis to illustrate our results. Our results show that suppliers always make a greater investment when there is only one second-price auction that is organized. Further, we find that the optimal mechanism for the buyer depends on the cost improvement curve and on the supply base size. Indeed, the buyer tends to prefer a single auction when there are few suppliers competing, to take advantage of a greater investment, and two auctions when suppliers are numerous, to benefit from an enhanced competition. Finally, we also find that suppliers are always better off in the two-auction context.


3 - Buying used products for remanufacturing: Negotiating or posted pricing
Jochen Gönsch

Product reclamation is a critical process in remanufacturing. It is generally assumed in the literature that customers simply want to get rid of their used products without expecting any compensation for them. Some authors have only recently started looking into firms that offer a posted (fixed) price for them. Following recent reports suggesting that customers are increasingly open to bargaining, we compare using a posted price and bargaining to obtain used products. In our analysis, we consider an original manufacturer acting as a monopolist as well as a manufacturer and an independent remanufacturer acting in a duopoly. We analytically show that bargaining is always beneficial to the monopoly manufacturer. In the duopoly case, we distinguish a Cournot competition and a market with the manufacturer as Stackelberg leader. The results of a numerical study show that both firms will use posted pricing in the Cournot competition, especially if bargaining is not costless. By contrast, the remanufacturer can significantly increase his profit by using negotiations if he is the Stackelberg follower.

TC-19
Thursday, 11:25-12:55 - I
Collaborative Aspects in Routing and Scheduling
Stream: Traffic and Transportation
Invited session
Chair: Amelie Eilken

1 - Dynamic user-optimal routing based on joint strategy fictitious play
Tai-yu Ma

Dynamic route guidance systems, generally provided by a system administrator, aim to provide road users with en-route recommendations to avoid traffic congestion. In this study, we consider the problem as a multi-player repeated game in a dynamic multi-agent transportation system. A game-theoretical route guidance (user-optimal routing) strategy based on joint strategy fictitious play (JSFP) is proposed to solve the dynamic user-optimal routing problem. Each guided user makes his travel time estimations and local outgoing link decisions based on his historical experiences and received en-route travel time information. The proposed algorithm incorporates users' inertia and en-route traffic information when making local route choice decisions. The en-route traffic information considered in this study consists of periodically updated link travel times and announced travel time delays. The proposed approach is based on individual selfish adaptive online route choice behavior under a JSFP strategy with real-time traffic information provision. The numerical results demonstrate the convergence of the proposed algorithm to a near-Nash equilibrium and travel time and delay reductions in a dynamic congested network. The advantages of the proposed algorithm reside in its distributed and self-guidance aspects. We show that the proposed JSFP strategy can achieve better route guidance performance compared with the existing iterative solution algorithm (Zuurbier, 2010) and the time-dependent shortest path routing method. Computational results for a realistic network based on a queue model will be presented. We demonstrate the performance of the proposed algorithm with respect to different compliance rates and to non-recurrent incident situations.
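As background, a minimal sketch of a fictitious-play-style choice rule with inertia in Python (generic form; the talk's payoff model, information structure and convergence analysis are more elaborate and are not reproduced here):

import random

def jsfp_choose(avg_payoff, prev_action, inertia=0.3):
    """Fictitious-play-style choice with inertia: keep the previous action with
    probability `inertia`, otherwise best-respond to the empirical average payoffs."""
    if prev_action is not None and random.random() < inertia:
        return prev_action
    return max(avg_payoff, key=avg_payoff.get)

def jsfp_update(avg_payoff, counts, action, payoff):
    """Running average of observed payoffs per action (e.g. negative travel time)."""
    counts[action] += 1
    avg_payoff[action] += (payoff - avg_payoff[action]) / counts[action]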
2 - Pre-selection Strategies for Dynamic Collaborative Transportation Planning Problems
Kristian Schopka, Herbert Kopfer

Nowadays, freight carriers are often confronted with customers demanding quick execution of their transportation requests. Due to this need, some new transportation requests may appear during the current planning interval. Especially for Small and Mid-size Carriers (SMCs), it is difficult to deal with the uncertainty pertinent to dynamic situations. In this context, SMCs may find a possibility to increase their transportation efficiency by joining or even establishing a horizontal collaboration within a carrier coalition for freight exchange. For such coalitions, mechanisms for the Dynamic Collaborative Transportation Planning Problem (DCTPP) have to be developed in order to conquer the uncertainty of dynamic situations in collaborative scenarios. In this paper a new heuristic approach for a multi-vehicle version of the DCTPP, using an Adaptive Large Neighborhood Search, is introduced. This framework organizes the collaboration process of some independent SMCs by a stepwise request exchange mechanism. All dynamic aspects are handled by a periodic re-optimization strategy within a rolling horizon planning. One of the main barriers to the establishment of collaborations is the carriers' fear of abandoning their independence. To guarantee a still existing autonomy in our framework, each coalition member decides on its own which of its requests are offered for freight exchange. In a computational study it is analyzed which heuristic strategies are most suitable for selecting those collaborative requests.

3 - Scheduling co-operating stacking cranes with time windows
Amelie Eilken, Malte Fliedner

Due to the ever increasing volume of international container freight, the operational planning of modern seaport container terminals has received plenty of attention in the academic literature. One of the many well-known decision problems in this field of research deals with the scheduling of automated stacking cranes in block storage yards that serve as an intermediate buffer between the seaside operations and the hinterland. This planning task typically comprises the assignment of jobs to cranes, the sequencing of jobs per crane and the scheduling of job executions. For the latter scheduling problem, Briskorn et al. [D. Briskorn, P. Angeloudis, M. Bell (2013). Scheduling co-operating stacking cranes with predetermined container sequences. Submitted for publication.] recently proposed a graphical representation and strongly polynomial algorithms for makespan minimization of typical crane settings. In practical applications, however, the block storage needs to timely serve the succeeding stage of the logistic chain (e.g. seaside quay cranes), so that the compliance of jobs with given time windows becomes a primary concern. Therefore, allowing for release dates and minimizing delay is essential. In this work we generalize and extend the results of Briskorn et al. and develop strongly polynomial time algorithms that observe time windows and weakly polynomial algorithms that minimize maximum delay.

TC-20
Thursday, 11:25-12:55 - II
Production Planning and Order Acceptance
Stream: Production and Operations Management
Invited session
Chair: Benjamin von Eicken
1 - Production Planning with Order Acceptance and Load Dependent Lead Time
Tarik Aouam, Nadjib Brahimi, El-Houssaine Aghezzaf

We consider a tactical planning problem which integrates production planning decisions together with order acceptance decisions, while taking into account the dependency between workload and lead times. The planner needs to decide which orders to accept and the period in which each accepted order should be produced. If an order is accepted, it generates revenue, incurs production and inventory costs, and affects the production lead time. The more orders are accepted, the higher are the workload and the production lead time, resulting in the possibility of missing due dates. If an order is rejected, a lost sale cost occurs. The problem is formulated as a mixed integer linear program and solved using an efficient Lagrangian relaxation heuristic that is based on decomposing the problem into efficiently solvable sub-problems. Numerical results show that the proposed Lagrangian heuristic outperforms a relax-and-fix heuristic and a state-of-the-art solver, providing very small gaps between the obtained lower and upper bounds within reasonable CPU times.
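A minimal sketch of the kind of projected subgradient multiplier update used in Lagrangian relaxation heuristics (generic form; the problem-specific decomposition used in the talk is not reproduced here):

def subgradient_step(multipliers, violations, step):
    """One projected subgradient step for relaxed constraints g(x) <= 0:
    lambda <- max(0, lambda + step * g(x))."""
    return [max(0.0, lam + step * g) for lam, g in zip(multipliers, violations)]

# Example: two relaxed capacity constraints, current violations 3.0 and -1.0
print(subgradient_step([0.5, 0.2], [3.0, -1.0], step=0.1))   # -> [0.8, 0.1]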


2 - How to increase robustness of capable-to-promise? - A numerical analysis of preventive measures
Sonja Kalkowski, Ralf Gössinger

In the past decades, reaching a high level of on-time deliveries became a crucial requirement to ensure customer satisfaction. On the other hand, order- and resource-related uncertainty hampers the generation of reliable delivery promises. Therefore the need for planning approaches that generate robust delivery dates arises. In the context of supply chains, capable-to-promise approaches are suggested to determine delivery dates with respect to available resources. To enhance solution and planning robustness, the paper aims at analyzing three preventive measures that can be applied to cover order- and resource-related uncertainty: capacity nesting, safety capacity and interactive order promising. Therefore, a two-stage approach is modelled and numerically analyzed. The common way of accepting orders according to customers' delivery time specifications is applied at the first planning stage. In order to reduce order-related uncertainty, instead of finally rejecting orders whose requested delivery times cannot be met, deviating delivery dates are proposed at the second planning stage considering customers' response (interactive order promising). At both stages, capacity for future lucrative orders is reserved (capacity nesting) to cope with order-related uncertainty, and safety capacity is provided to handle resource-related uncertainty. The two-stage planning approach is implemented as a dynamic, stochastic mixed-integer programming model, and a numerical analysis based on real-world data of a manufacturer is performed by means of the AIMMS environment. A systematic variation of relevant measure parameters is carried out to identify the impacts of the preventive measures on profit, solution and planning robustness as well as the interactions between the adaptation measures.

3 - Data Quality and Production Planning
Benjamin von Eicken, Peter Letmathe

The amount of data produced by any kind of electronic device increases continuously. With more and more data available, more and more processes and decisions are based on such data. But with an increasing amount of data available, little is said about data quality. A lot of research has been done to define measures for data quality. We do know how to measure different aspects of data quality. But we know little about the effects. With more and more decisions based on an increasing amount of data, it remains open to investigate the impact of poor data quality. Depending on the process using certain data, the effects may be harmful. Knowledge about the effects of using data of low quality is mandatory. Gaining such knowledge, we may find a way to avoid negative effects. It is necessary to keep in mind that poor data quality is not equivalent to data uncertainty. Understanding data quality as the usefulness of data, uncertainty is neither a prerequisite nor an outcome of poor quality. In fact, poor data quality may come along with very certain and specific information. Thus, it is not advisable to reduce data quality to data uncertainty. In this paper, we define a production model. This model is used to solve a cost minimization problem. We use this model to compute different solutions, enriched with different kinds of low data quality aspects. For each quality aspect, we investigate the negative effects, correlated to the input data. Afterwards, we combine different kinds of low data quality aspects and present the gained information. Additionally, we use the computed information for a business process improvement. We aim at a general approach to reduce negative effects of using data of low quality in processes.

TC-21
Thursday, 11:25-12:55 - III
Distributed and Remote MIP Solving I
Stream: Software Applications and Modelling Systems
Invited session
Chair: Bjarni Kristjansson

1 - Robust optimization and distributed computation features of the FICO Xpress Optimization Suite
Sebastien Lannez, Yves Colombani, Zsolt Csizmadia, Susanne Heipcke, Pietro Belotti

In this presentation, we will introduce the robust optimization framework and the distributed computation feature of the FICO Xpress Optimization Suite 7.7. The first part of the talk will present the design principles and incremental modelling feature of the robust optimization framework, and will be continued by a description of the distributed computation engine and the new cloud-based optimization service.

2 - Distributed parallel MIP solving with CPLEX
Daniel Junglas

The two most recent versions of CPLEX have introduced three different ways to solve (mixed) integer problems with CPLEX in a distributed parallel fashion: 1. "Concurrent parallel optimization" starts MIP solvers with different parameter settings on the same problem on a set of (remote) machines. The results of all solvers are monitored and the search stops if the combined primal and dual bounds of all solvers satisfy a termination criterion. 2. "Concurrent parallel optimization with communication between solvers" is similar to the above, but now information found during the search (incumbents, cuts, ...) can be communicated between the solvers so as to speed up the individual solvers. 3. "Distributed parallel tree search" takes the standard MIP tree search algorithm to a cluster of distributed machines. In this talk we will review and compare the three different strategies in detail. We will also report on their respective performance.
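To illustrate the concurrent (portfolio) idea described in this session in a vendor-neutral way, a minimal Python sketch; solve_mip is a hypothetical stand-in for any MIP solver call and is stubbed out here, so the snippet does not reflect the actual CPLEX or Xpress APIs:

import random
from concurrent.futures import ProcessPoolExecutor, as_completed

def solve_mip(instance, seed):
    """Hypothetical solver call; stubbed with a random 'objective' for illustration."""
    random.seed(seed)
    return {"seed": seed, "objective": random.uniform(90, 110)}

def concurrent_solve(instance, seeds):
    """Run several independently parameterized solves and keep the best result.
    A real portfolio would additionally stop early once bounds prove optimality."""
    with ProcessPoolExecutor() as pool:
        futures = [pool.submit(solve_mip, instance, s) for s in seeds]
        results = [f.result() for f in as_completed(futures)]
    return min(results, key=lambda r: r["objective"])

if __name__ == "__main__":
    print(concurrent_solve("model.lp", seeds=[1, 2, 3, 4]))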
3 - Deploying MPL Optimization Models Online on Servers and Mobile Platforms
Bjarni Kristjansson

The IT industry is currently undergoing a major shift, away from traditional standalone applications, to new platforms such as servers, clouds, tablets, and mobile phones. We will demonstrate a new server-based version of the MPL OptiMax Component Library, which makes implementing real-world optimization applications a relatively quick and easy process. We will take you through all the steps of implementing optimization projects, including formulating the model, integrating it seamlessly with data in different formats, and then finally deploying the project on a server for servicing both web and mobile clients, using standard programming languages such as CSharp, Visual Basic, Java, C/C++, or Python.
TC-22
Thursday, 11:25-12:55 - IV
Robust Vehicle Routing Problems under Uncertainty
Stream: Logistics and Inventory
Invited session
Chair: Uwe Clausen
Chair: Jens Baudach

1 - Robust vehicle routing as an impact on LTL terminals
Christian Tesch, Uwe Clausen, Lars Eufinger

Resulting from an increase of shipment quantities, many logistics facilities such as terminals and distribution centers often reach the limit of their performance ability and thus become a bottleneck in supply chains. Our research focuses on less-than-truckload (LTL) forwarding companies providing a pick-up and delivery service within a local area. Whereas delivery orders are already known before tour start, about 50% of daily pick-up orders will be added during the day, at the time the trucks are on tour. This dynamic vehicle route planning has a large impact on the internal processes of the transshipment facility and thus on in-house resource requirements. The daily decisions about which trucks to use, which vehicle routing to plan and how to organize the transshipment in the terminal reach their limits with static methods, since the freight forwarding business becomes more dynamic and unknown customer orders are not considered. The objectives of our research are the development of robust route planning methods considering unknown customer orders, the dynamic dispatching of incoming pick-ups, the integration of varying driving times as well as testing our methods on real-world data. Based on expected pick-ups, a variety of feasible, probable scenarios is generated. Subsequently an initial heuristic is used for our tours and an evolutionary algorithm improves each solution. The parallel computed solutions are then combined to achieve an efficient and robust solution. On the basis of real data sets with about 100 vehicles and 1,000 stops per day, the portability will be shown on typical problem instances.


2 - Route planning under uncertainty
Nadine Wollenberg, Rüdiger Schultz, Michel Gendreau

In the present talk we will introduce an integer L-shaped algorithm for a stochastic extension of the vehicle routing problem with simultaneous delivery and pickup. We assume that the quantities to be delivered are fixed, whereas the quantities to be picked up are not known in advance. For the stochastic pickup data, we contemplate a finite discrete probability distribution. Due to possible route failures, i.e. arriving at a customer with insufficient vehicle capacity, compensation strategies or corrective actions need to be considered. Unserved pickup quantities are collected by an additional vehicle, starting from the depot only after complete information on the unserved pickup amounts has become available. For the single vehicle case, the stochastic model is formulated as a two-stage stochastic program with recourse and solved by means of the integer L-shaped method. Lower bounding functionals (LBFs) are used to improve the efficiency of this algorithm by strengthening the lower bound on the recourse cost associated to partial routes encountered throughout the solution process. The concept of general partial routes is adapted from Jabali, Rei, Gendreau and Laporte.
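For orientation, the generic shape of a two-stage stochastic program with recourse of the kind mentioned above (schematic notation only; the routing-specific model in the talk is richer):

\min_{x \in X} \; c^{\top}x + \mathbb{E}_{\xi}\big[ Q(x,\xi) \big],
\qquad
Q(x,\xi) \;=\; \min_{y \ge 0} \big\{\, q^{\top}y \;:\; W y \ge h(\xi) - T(\xi)\,x \,\big\}.
% The (integer) L-shaped method replaces E[Q] by a variable theta and iteratively adds
% optimality cuts of the form theta >= alpha_k + beta_k^T x derived from second-stage solutions.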
3 - Exact algorithms for the vehicle routing problem with soft time windows
Matteo Salani, Maria Battarra, Luca Maria Gambardella

This paper studies a variant of the Vehicle Routing Problem with Soft Time Windows (VRPSTW) inspired by real-world distribution problems. Soft time window constraints are very common in the distribution industry, but quantifying the trade-off between routing cost and customer inconvenience is a hard task for practitioners. There is no consensus among scientists on how to model time window violations and on how to weight time window violations against routing costs. We therefore develop an alternative interpretation of soft time window constraints. In our model, practitioners impose a minimum routing cost saving (to be achieved with respect to the hard time window solutions) and we minimize solely the customer inconvenience. We propose two exact algorithms based on the branch-and-cut-and-price method. The first algorithm is based on standard branch-and-cut-and-price and uses an embedded relaxation procedure. The second algorithm uses concepts of bi-objective optimization and is based on a bisection algorithm. Our computational experience provides an extensive comparison among the results obtained with our variants and those obtained by imposing hard and soft time window constraints. The performance of the algorithms is also discussed.

TC-23
Thursday, 11:25-12:55 - V
Performance Evaluation
Stream: Production and Operations Management
Invited session
Chair: Matthias Klumpp

1 - Structural versus individual time effects on local governments' service provision efficiency: A comparison of productive- and cost-efficiency
Hannes Lampe, Dennis Hilgers

For decades the public sector has seemed incapable of avoiding new debt or even saving money to overcome the omnipresent stressed financial situation. This grievance does not only exist on a certain level of public sector administration but is rather pervasive over several levels of public bodies including municipalities, federal states as well as entire nations. This maladjustment is further accompanied by calls for an increase in public service provision efficiency to improve the financial situation of public institutions. In our contribution, we test whether and how local public service provision efficiency (differentiated into productive- and cost-efficiency) changes over time, identifying whether the stressed financial situation is counteracted over time. We therefore identify whether a possible municipal service provision efficiency change over time is due to an overall ability change, namely a shift of the production function, or to individual effects. We deploy Stochastic Frontier Analysis to build an efficient frontier based on a novel panel of German municipalities. On the one hand, our results suggest that for both productive- and cost-efficiency a negative structural trend holds. On the other hand, for productive-efficiency, individual effects increase public service delivery efficiency over time. Therefore, an efficiency increase on the cost level might be rejected, but on the productive level mixed results occur.

2 - DEA calculation for production simulation modeling with universities
Matthias Klumpp

University production processes are hard to analyze and to simulate due to the high complexity of inputs and outputs. For example, outputs can be in very different areas like teaching (graduates), research (publications, third party funding) or third mission (co-operations, transfer of knowledge). Therefore, for an efficiency analysis, methods such as data envelopment analysis (DEA) are usually used in order to adhere to the multitude of inputs and outputs. This contribution will discuss in a first step a DEA calculation for 86 German universities and universities of applied sciences with the inputs budget or number of professors and the outputs PhD, MA and BA graduates, and third party funding in the science areas social sciences/humanities, life sciences (including medicine), natural sciences and engineering (data by Statistisches Bundesamt and DFG, 2009-2012). The analysis is then in a second step used to suggest a production function model for simulation purposes. This is accomplished by including the DEA efficiency values as a dependent variable in a regression model calculated with "R" and testing several independent variables as possible explanatory factors. The successfully reviewed factors in this procedure will be included in a draft for a production function for universities in order to allow for simulation of production process results in universities. Finally, the simulation results are backtested with the existing DEA dataset and thereby evaluated for their prognostic power compared to the real input and output values of the 86 listed German universities.
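For reference, the textbook input-oriented CCR envelopment form of DEA in schematic notation (the study above may use a different DEA variant or orientation):

% Efficiency of decision-making unit o with inputs x_{io} and outputs y_{ro}:
\min_{\theta,\;\lambda \ge 0} \ \theta
\quad\text{s.t.}\quad
\sum_{j=1}^{n} \lambda_j x_{ij} \le \theta\, x_{io} \ \ (\forall i),
\qquad
\sum_{j=1}^{n} \lambda_j y_{rj} \ge y_{ro} \ \ (\forall r).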

TC-24
Thursday, 11:25-12:55 - AS
Combinatorial Markets and Pricing
Stream: Pricing, Revenue Management, and Smart Markets
Invited session
Chair: Alessia Violin

1 - Competing Combinatorial Auctions
Thomas Kittsteiner, Marion Ott, Richard Steinberg

We analyze incentives to offer combinatorial auctions in a competing environment with asymmetric information with respect to bidders' preferences. Two sellers each offer the same heterogeneous items. Sellers choose to either use two single-item second-price auctions, a single-bundle second-price auction, or a combinatorial Vickrey auction. We find that if bidders are sufficiently heterogeneous in their demand, sellers might prefer not to offer the combinatorial auction. In equilibrium, one seller offers two single-unit auctions and attracts those bidders who want a single item, and the other seller offers a single-bundle auction and attracts the remaining bidders.

2 - Efficient convex decomposition for truthful-in-expectation approximation mechanisms
Salman Fadaei, Dennis Kraft, Martin Bichler

In a seminal paper, Lavi and Swamy (2011) propose a general framework to obtain approximation mechanisms that are truthful in expectation. The ellipsoid method is pivotal in this framework and, alongside an approximation algorithm, is used to find an integral convex decomposition for the fractional solution to the linear program relaxation. Although its worst-case runtime is polynomial, the ellipsoid method is notoriously known to be inefficient in practice. In this paper, we propose a more efficient method for finding convex decompositions that eliminates the use of the ellipsoid method. Our method requires only a quadratic number of invocations of the gap-verifying approximation algorithm. The number of invocations is quadratic in terms of the number of dimensions with positive entries in the fractional solution. Our method describes a practical way to find integral convex decompositions in order to transform various approximation algorithms for packing problems into truthful-in-expectation approximation mechanisms.


3 - A Branch-and-Cut-and-Price Algorithm for the Network Pricing Problem with Connected Toll Arcs
Alessia Violin, Bernard Fortz, Martine Labbé

Consider a network where there are two types of arcs: a subset of arcs is owned by a company imposing tolls for using them, and a subset of remaining arcs which are toll-free. Furthermore, the toll arcs are connected such that they constitute a single path, as occurs for instance in a highway network. The company is willing to maximize the revenue from tolls, whilst users seek their minimum cost path between their origin and destination. This problem is strongly NP-hard and can be modeled as a bilevel program.

We propose a Dantzig-Wolfe reformulation for this problem, and show that its linear relaxation is stronger than that of the MILP formulation proposed in the literature. The subproblem is non-linear but easily solvable. More advanced techniques have been included in our column generation algorithm, such as initialization alternatives, stabilization of dual variable values and early stopping criteria. Furthermore, we propose a full Branch-and-Price scheme to solve the integer problem, with an ad-hoc branching algorithm using pseudo-costs to guide the choices. An SOS branching scheme has also been proposed. Some rounding heuristics have been investigated to improve the primal bound during the branching. Finally, the framework has been extended to a branch-and-cut-and-price, including some efficient valid inequalities from the literature. Numerical experiments have been run under the SCIP framework.

Thursday, 13:45-14:30

TD-01
Thursday, 13:45-14:30 - Fo1
Semiplenary Boyd
Stream: Invited Presentations and Ceremonies
Semi-plenary session
Chair: Michael Herty

1 - Convex Optimization: From embedded real-time to large-scale distributed
Stephen Boyd

Convex optimization has emerged as a useful tool for applications that include data analysis and model fitting, resource allocation, engineering design, network design and optimization, finance, and control and signal processing. After an overview, the talk will focus on two extremes: real-time embedded convex optimization, and distributed convex optimization. Code generation can be used to generate extremely efficient and reliable solvers for small problems, that can execute in milliseconds or microseconds, and are ideal for embedding in real-time systems. At the other extreme, we describe methods for large-scale distributed optimization, which coordinate many solvers to solve enormous problems.
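As a small, concrete illustration of this modelling style, a hedged example using the open-source CVXPY library (chosen here purely for illustration; it is not necessarily the tooling discussed in the talk):

import numpy as np
import cvxpy as cp

# Tiny convex problem: nonnegative least squares.
rng = np.random.default_rng(0)
A, b = rng.standard_normal((20, 5)), rng.standard_normal(20)

x = cp.Variable(5, nonneg=True)
problem = cp.Problem(cp.Minimize(cp.sum_squares(A @ x - b)))
problem.solve()
print(problem.value, x.value)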
These areas are also important for their social impact, energy production/consumption and the environment. There are many stakeholders and decision makers in the natural resources value chains, which often are decoupled, and several objectives are driving the planning processes. Lastly, the logistic operations typically involve large long-term investments and very large volumes. Optimizing this set of integrated values, which often are conflicting (in terms of how they drive the decisions), is a complex task. This raises the need for sustainable, robust and integrated long-term and short-term planning. In this presentation, we will address a number of applications arising in different value chains. We discuss their properties, interactions and challenges. The applications include transportation, inventory, routing and distribution planning in the forest and mining industries. We also describe how operations research models and methods have been crucial and used efficiently to develop practical decision support tools for the sector. These tools include collaborative logistics, anticipative and integrative planning, and robust optimization.

TD-04
Thursday, 13:45-14:30 - Fo4

Semiplenary Vohra

Stream: Invited Presentations and Ceremonies
Semi-plenary session
Chair: Tobias Harks


1 - One-Sided Matching with Limited Complementarities
Rakesh Vohra

The problem of allocating bundles of indivisible objects without transfers arises in the assignment of courses to students, the assignment of computing resources like CPU time, memory and disk space to computing tasks, and the assignment of truck loads of food to food banks. In these settings the complementarities in preferences are small compared with the size of the market. We exploit this to design mechanisms satisfying efficiency, envy-freeness and asymptotic strategy-proofness.
Informally, we assume that agents do not want bundles that are too large. There will be a parameter k such that the marginal utility of any item relative to a bundle of size k or larger is zero. We call such preferences k-demand preferences. Given this parameter we show how to represent probability shares over bundles as lotteries over approximately deterministic feasible integer allocations. The degree of infeasibility in these integer allocations will be controlled by the parameter k. In particular, ex-post, no good is over-allocated by more than k-1 units.
Based on joint work with Thanh Nguyen and Ahmad Peivandi.

Thursday, 15:00-16:30

TE-02
Thursday, 15:00-16:30 - Fo2

GOR Dissertation Award

Stream: Awards Sessions
Award Competition session
Chair: Alf Kimms

1 - On Arc-Routing Problems
Claudia Schlebusch

The basic multiple-vehicle arc-routing problem is called the Capacitated Arc-Routing Problem (CARP). Applications of the CARP are in waste collection and mail delivery, for example. The goal is to find a cost-minimal set of tours that service all required edges and meet the capacity restriction. In this work, a cut-first, branch-and-price-second approach is developed. In phase one, cutting planes are generated that are introduced to the master problem in the second phase. The subproblem is a shortest path problem with resource constraints. It is solved in order to generate new columns for the master problem. Integer CARP solutions are guaranteed by a new hierarchical branching scheme. Comprehensive computational results show the effectiveness of the algorithm. Combining location problems with arc-routing problems enables one to model more realistic mail delivery applications. In this work, two mathematical formulations each are introduced for park and loop, and for park and loop with curbline. The two models for each problem differ in how they model feasible transfer routes. While the first type of model uses subtour-elimination constraints, the second type uses flow variables and flow conservation constraints. The computational study shows that a MIP solver often needs less computation time to solve the latter type of model, or results in better lower bounds when reaching the time limit.

2 - A Polyhedral Study of Quadratic Traveling Salesman Problems
Anja Fischer

The quadratic traveling salesman problem (QTSP) is an extension of the traveling salesman problem (TSP) where the costs do not depend on two but on each three nodes traversed in succession. It can be formulated as an integer program with a quadratic objective function. The QTSP is motivated by an application in bioinformatics. Important special cases are the angular-metric TSP and the TSP with reload costs. We present polyhedral studies for the linearized integer programming formulations in the symmetric and in the asymmetric case. These include the dimension of the associated polytopes as well as three groups of valid inequalities and facets.
Some are related to the Boolean quadric polytope and some forbid conflicting configurations. Furthermore, we provide two general approaches that allow us to strengthen valid inequalities of the TSP in order to get stronger inequalities for the QTSP. Applying these approaches to the subtour elimination constraints leads to facets in most cases, but in general facetness is not preserved. In addition, the complexity of the separation problems for several facet classes is studied. Finally, we present some computational results using a branch-and-cut framework. Separating the newly derived cutting planes, large instances from biology could be solved to optimality.

3 - Line Planning and Connectivity
Marika Karbstein

A system of lines in public transport should usually be connected, i.e., for each two stations there has to be a connecting path that is covered by the lines. We define this problem in a graph theoretical context and call it the Steiner connectivity problem. It is a generalization of the well-known Steiner tree problem. We discuss complexity and approximation algorithms, give a transformation to the directed Steiner tree problem, and show that directed models provide tighter formulations for the Steiner connectivity problem than undirected models, similar to the Steiner tree problem. Some of these investigations can be used to propose a novel direct connection approach that allows an integrated optimization of line planning and passenger routing. This approach focuses on direct connections. The attractiveness of transfer-free connections is increased by introducing a transfer penalty for each non-direct connection. In a project with the Verkehr in Potsdam GmbH to compute the line plan for 2010 we showed that our approach is applicable in practice and can be used to solve real world problems.
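As a small illustration of the connectivity requirement described in the Karbstein abstract above, the following sketch checks with a union-find structure whether the union of a chosen set of lines contains a connecting path between every pair of required stations. The names and data layout are chosen for illustration only; this is not the model or algorithm proposed in the talk.

# Illustrative sketch only: checks whether the chosen lines connect all terminals.
from typing import Dict, Hashable, Iterable, List, Tuple

Edge = Tuple[Hashable, Hashable]

def covers_terminals(lines: Iterable[List[Edge]], terminals: List[Hashable]) -> bool:
    """Return True if the union of the given lines (edge lists) connects all terminals."""
    parent: Dict[Hashable, Hashable] = {}

    def find(v):
        parent.setdefault(v, v)
        while parent[v] != v:
            parent[v] = parent[parent[v]]  # path halving
            v = parent[v]
        return v

    def union(u, v):
        parent[find(u)] = find(v)

    for line in lines:          # merge all stations touched by each line
        for u, v in line:
            union(u, v)

    return len({find(t) for t in terminals}) <= 1

if __name__ == "__main__":
    line_a = [("A", "B"), ("B", "C")]
    line_b = [("C", "D")]
    print(covers_terminals([line_a, line_b], ["A", "D"]))  # True
    print(covers_terminals([line_a], ["A", "D"]))          # False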


4 - Robustness Concepts for Knapsack and Network Design Problems under Data Uncertainty
Manuel Kutschka

In this thesis, we consider mathematical optimization under data uncertainty using mixed integer linear programming techniques. Our investigations follow the deterministic paradigm known as robust optimization. We investigate four robustness concepts for robust optimization and describe their parametrization, application, and evaluation. The concepts are gamma-robustness, its generalization multi-band robustness, the novel more general submodular robustness, and the two-stage approach called recoverable robustness.

For each concept, we investigate the corresponding robust generalization of the knapsack problem (KP), presenting IP formulations, detailed polyhedral studies including new classes of valid inequalities, and algorithms. In particular, for the submodular KP we establish a connection to polymatroids, and for the recoverable robust KP we derive a nontrivial compact reformulation. Additionally, the recoverable robust KP is experimentally evaluated in detail.

Further, we consider the gamma-robust and multi-band robust generalizations of the capacitated network design problem (NDP), presenting MIP formulations, new detailed polyhedral insights with new classes of valid inequalities, and algorithms. For example, we derive alternative formulations for these robust NDPs by generalizing metric inequalities. Furthermore, we present representative computational results for the gamma-robust NDP using real-life measured uncertain data from telecommunication networks, based on our work within the German ROBUKOM project.

TE-03
Thursday, 15:00-16:30 - Fo3

Multi-objective Integer Programming II

Stream: Decision Theory and Multi-Criteria Optimization
Invited session
Chair: João Paulo Costa

1 - A New Algorithm for Optimizing Over Integer Efficient Stochastic Set
MEBEREK FATMA, chaabane Djamal

In this paper, we study the problem of optimizing a linear function over an integer efficient solution set in the stochastic discrete environment. A new exact technique is proposed to provide the best preference to the decision maker among a set of stochastic non-dominated solutions. Once the problem is converted into a deterministic one by adapting the two-level recourse approach, a new pivoting technique is applied to generate an optimal penalized efficient solution without having to enumerate all of them. The combination of both approaches, the L-Shaped method and the combined method proposed by Chaabane & Pirlot (JIMO 2010), enables us to come up not only with an optimal efficient solution but also with a subset of stochastic efficient solutions defining the path joining the latter.

2 - Optimization of a Choquet integral in a multiple objective integer linear programming problem
assia Menni, chaabane Djamal

We propose an exact method allowing the generation of Choquet-optimal solutions of a multiple objective integer linear programming problem (MOILP). A Choquet-optimal solution is a solution that optimizes the Choquet integral for certain values of its parameters and is Pareto optimal. The method is based on optimization of a Choquet integral and it obviously remains valid for MOLP problems, in which case it is no longer preferable because of its complexity compared to the weighted sum, for example, which in this case generates the same set of Pareto-optimal solutions. For MOILP problems, the main contribution of the proposed method is the ability to generate some unsupported solutions stuck in concave portions of the efficient frontier.

3 - Some experiments with Branch & Bound techniques in MOILFP problems
João Paulo Costa, Maria João Alves

This communication aims to present some computational experiments with Branch & Bound techniques in Multiobjective (mixed) Integer Linear Fractional Programming (MOILFP). The importance and interest of these problems stem from the fact that many real applications involve the optimization of ratios (e.g. the maximization of output per some metric of the dimension of the region), integer variables for modelling several real world issues, and also entail multiple conflicting criteria. One of the most used techniques for computing non-dominated solutions in multiobjective programming problems is the optimization of a weighted sum of the objective functions. This transformation in problems with linear fractional functions leads to the so-called sum of ratios problem, one of the most difficult fractional problems encountered so far. We have already developed algorithms to compute nondominated solutions for Multiobjective Linear Fractional Programming Problems (MOLFP): Branch & Bound algorithms, which were then improved by introducing cuts into Branch & Cut schemas. We will present some adaptations of these algorithms in order to cope with integer variables. Computational experiments for testing performance were carried out, and we will report on the obtained results.

TE-04
Thursday, 15:00-16:30 - Fo4

Multistage Stochastic Programming

Stream: Robust and Stochastic Optimization
Invited session
Chair: Ulf Lorenz

1 - Stochastic Dynamic Programming Solution of a Risk-Adjusted Disaster Preparedness and Relief Distribution Problem
Ebru Angun

We formulate the disaster preparedness and short-term response planning problem through a multistage stochastic optimization model. To account for risk, we also add chance constraints which ensure that the budget limits will not be exceeded with high probability. These chance constraints, however, are replaced by some coherent risk measures and are then added to the objective function. We assume that both the demands and the road capacities have known but continuous distributions, which implies an infinite number of scenarios for the problem. Then, we discretize these continuous distributions using different sampling techniques, and build scenario trees which contain a big number of scenarios. To cope with the computationally intractable multi-dimensional integrations, we estimate the expectations by their Sample Average Approximations. Under some assumptions, we solve the resulting problems through the Stochastic Dual Dynamic Programming algorithm. We numerically derive useful insights for the applications of the algorithm.

2 - Multistage stochastic programs: metrics, approximations, ambiguity
Georg Pflug

We review the notion of nested distance for filtered stochastic processes as a basis for approximation results in multistage stochastic programming.
This notion is used in scenario tree generation, bounding techniques and in ambiguity models.
The talk reviews some results contained in the forthcoming Springer book "Multistage Stochastic Programming" by G. Pflug and A. Pichler.
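For orientation, the multistage stochastic programs discussed in this session are commonly written in the following generic form; this is a standard textbook formulation with illustrative notation, not one taken from the abstracts.

% Generic T-stage stochastic program with nonanticipative decisions x_t.
\begin{align*}
\min_{x_1,\dots,x_T} \quad & \mathbb{E}\left[\sum_{t=1}^{T} c_t(\xi_t)^{\top} x_t\right] \\
\text{s.t.} \quad & x_t \in \mathcal{X}_t(x_{t-1}, \xi_t), \qquad t = 1,\dots,T, \\
 & x_t \text{ depends only on } (\xi_1,\dots,\xi_t), \qquad t = 1,\dots,T.
\end{align*}

The last condition (nonanticipativity) states that each decision may use only the information revealed up to its stage; scenario trees, as discussed above, are discrete approximations of the process (\xi_1,\dots,\xi_T).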


3 - Multistage Optimization with the help of Quantified Linear Programming
Ulf Lorenz

Quantified linear integer programs (QIPs) are linear integer programs (IPs) with variables being either existentially or universally quantified. They can be interpreted as two-person zero-sum games between an existential and a universal player on the one side, or as multistage optimization problems under uncertainty on the other side. Solutions of feasible QIPs are so-called winning strategies for the existential player that specify how to react on moves (certain fixations of universally quantified variables) of the universal player in order to certainly win the game. In order to solve the QIP optimization problem, where the task is to find an especially attractive winning strategy, we examine the problem's hybrid nature and combine linear programming techniques with solution techniques from game-tree search.
Here, we present the algorithmic framework of our basic solver 'Yasol', which is based upon a mixture of the alpha-beta algorithm, cutting plane techniques and Boolean constraint propagation.

TE-05
Thursday, 15:00-16:30 - Fo5

Traffic

Stream: Discrete and Combinatorial Optimization, Graphs and Networks
Invited session
Chair: Andreas Heidt

1 - Homogenous Radial Approach to Fair Public Service System Design
Marek Kvet, Jaroslav Janacek

In most situations, a public service system is designed so that the total disutility, like social costs, is minimized. The social costs are often proportional to the total distance travelled by all users to the nearest service center. Mathematical models of such public service system design are often related to the weighted p-median problem, where the numbers of served users and possible service center locations can reach several thousands. The number of possible center locations impacts the computational time for solving the associated mathematical programming problem. The necessity to solve large instances of the weighted p-median problem has led to an approximate approach based on a radial formulation, which enables solving bigger instances in admissible time using a universal IP solver. This approach pays for shorter computational time or smaller required computer memory with a loss of accuracy. The accuracy can be improved by a more suitable determination of dividing points, which define a homogenous set of radii for each user. The above objective, denoted as the min-sum criterion, often causes a situation in which the total social cost is minimal but the disutility of the worst situated user is extremely high, which is considered an unfair design. A fair approach to public service system design consists in a process in which the disutility of the worst situated users is minimized first, and then the disutility of better located users is optimized subject to the condition that the disutility of the worst situated users does not worsen; this is called lexicographical minimization. In this contribution we present an approximate approach based on a radial formulation of the problem with a homogenous system of radii given by dividing points, applied to fair public service system design.

2 - Models for traffic engineering with multiple spanning tree protocols
Martim Joyce-Moniz, Bernard Fortz, Luís Gouveia

With the increasing demand for Internet and cloud computing services, the need for large scale data centers has become paramount. In these data centers, switched Ethernet networks are becoming popular, because of the way they effectively manage traffic. Their topology must be cycle-free, to avoid broadcast radiation. Therefore, Ethernet networks only activate, at a given time, a subset of the existing links that must verify the IEEE 802.1d standard, which defines the topology of the sub-network as a spanning tree. One of the drawbacks of this protocol is that the network only ends up using a small number of the existing links. To overcome this, Ethernet networks began using the Multiple Spanning Tree Protocol, which maintains a set of spanning trees that are used for routing the traffic demands in the network. This is highly advantageous for the traffic performance of Ethernet networks, as the traffic can be spread over a bigger number of links. We present different mixed integer programming models for the traffic engineering problem of optimally designing a network implementing the Multiple Spanning Tree Protocol, such that link utilization is minimized. Although some variants of this problem have been treated in the literature, this is the first approach that focuses on using exact methods. We present tests in order to compare the formulations, in terms of linear relaxation strength and computing time. We also propose a binary search algorithm that has proven to be efficient in obtaining quasi-optimal solutions for this problem.

3 - Exact Approaches for Runway Schedules in Air Traffic Management: Nominal and Robust Solutions
Andreas Heidt, Hartmut Helmke, Frauke Liers, Alexander Martin

Considering Air Traffic Management (ATM), the runway system is the main element that combines airside and groundside of the ATM system. To achieve efficient planning, exact models are required. First of all, we model the runway scheduling problem in two different ways. One of them decides the ordering of the aircraft together with the landing time. The second uses a time discretization. It is an assignment problem with side constraints that computes for every discretized point in time whether an aircraft is scheduled and, if so, which one is. Furthermore, security distances are respected. For randomly generated instances and different numbers of aircraft, the results for both exact models are evaluated with respect to optimality and computational run time. Different preprocessing rules have been derived and are evaluated for each model. In reality, we have to face disturbances and uncertainties in the aircraft flight times that usually lead to deviations from the actual plan or schedule. Using robust optimization, we protect the model against data uncertainties in order to avoid expensive or even infeasible solutions for the disturbed problem. Robust optimization concepts are transferred to the runway scheduling problem and are incorporated into the exact models. They are tested within a simulation for a planning horizon of up to two hours before landing. Using random initial data for each aircraft and for the uncertainties in the earliest and latest landing or departure times, the robust models are compared to the nominal ones. The comparison is done with respect to the stability of a plan, the number of necessary reschedulings and the number of served aircraft. In the presence of uncertainty, the preliminary computations yield promising results for improved schedules.

TE-06
Thursday, 15:00-16:30 - Fo6

Sustainable Production Planning

Stream: Energy and Environment
Invited session
Chair: Christian Gahm

1 - Improving energy efficiency by integrating energy aspects into short-term production planning
Florian Denz, Christian Gahm

As the producing industry is one of the main energy consumers, increasing energy efficiency in production processes turns out to be a key factor in the energy transition. An analysis of planning approaches for industrial energy supply systems (ESSs) shows that fixed energy demand patterns are an essential planning assumption. This assumption of fixed and unchangeable demand patterns is not necessarily valid for production processes. Instead, we assume that these demand patterns are directly related to the production schedule and thus can be influenced by production planning. In consequence, adjusting the demand patterns offers the possibility to design and operate ESSs better and, in consequence, to increase the energy supply's efficiency. The study at hand transfers potentials to increase the efficiency of ESSs into aspects (e.g., objectives or constraints) which can be integrated into short-term production planning in order to take advantage of the energy-saving potentials of ESSs. In detail, an exemplary short-term production planning approach that tackles the problem of minimizing the difference between the minimum and maximum cumulated energy demand is presented. The resulting energy demand patterns have a reduced range (defined by the minimum and maximum cumulated energy demand) and thus the energy efficiency of the ESS can be improved. Here, we consider a scheduling problem that is defined by a parallel machine environment, whereby production orders are characterized by energy demand profiles. Because it can be shown that the problem is NP-complete, we propose (besides a mixed integer linear program) an iterated greedy local search heuristic for the new scheduling problem. The solution quality of this heuristic is measured with regard to several test instances.
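The planning objective described in the Denz and Gahm abstract above can be sketched as follows; the notation is illustrative and not taken from the talk, with E_t(x) denoting the cumulated energy demand of schedule x in period t and S the set of feasible parallel machine schedules.

% Illustrative sketch of the range-minimization objective.
\[
  \min_{x \in S} \; \left( \max_{t=1,\dots,T} E_t(x) \;-\; \min_{t=1,\dots,T} E_t(x) \right)
\]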


2 - The Integration of Solar Energy Storages in Distribution Grids Using a Fuzzy Control Algorithm
Tobias Lühn, Jutta Geldermann

Over the last decade, the installed peak power of PV plants in Germany has significantly increased from 1 GW in 2004 to over 36 GW in 2014. The majority of photovoltaic plants are connected to the distribution grid. Photovoltaics are subject to large fluctuations in their power generation because of changing weather conditions. Therefore, distribution system operators (DSOs) are confronted with greater problems to prevent the overload of grid components and to keep the voltage range within admissible constraints. In recent years, DSOs normally reacted to those new challenges with conventional grid expansion by increasing cable cross-sections, laying parallel cables and increasing the power capacity of transformers. This may result in expensive and inefficient grid expansion. A solution might be the integration of solar energy storages in private households and the active peak power reduction at the grid connection point of the household. Using conventional operating strategies, solar energy storages are often fully charged when the power generation from photovoltaics is at its maximum at midday. Therefore, peak shaving is not possible and the PV plant wattage has to be controlled. By optimizing the charge/discharge mode of solar energy storages with a fuzzy control system, the peak power generation from photovoltaics and therefore the required grid expansion on the low voltage level can be reduced. In this study, the design and implementation of a fuzzy control system is developed considering the current power generation, the charging level of the battery and the solar forecast. First high-resolution simulations on a 1-min time scale show the high potential of the fuzzy control system to minimize the feed-in management.

3 - Optimization as a means of evaluating the respective advantages of biorefinery concepts
Lars-Peter Lauven

A multitude of biorefinery concepts is discussed for the usage of various forms of biomass. In any case, a biorefinery is meant to make economically advantageous use of the main and co-products resulting from the corresponding biomass conversion process. Whether the upgrading of potential products can be considered economically advantageous depends on the value of the biorefinery's products and the cost associated with the installation of the required upgrading and/or separation equipment. Based upon the model of a biomass-based synthesis gas biorefinery, other biorefinery concepts based on algae and enzymatic breakdown of cellulose are investigated. These concepts require different assumptions concerning the origin of the input biomass, the structure of logistics cost, viable biorefinery capacities, potential products and required upgrading and separation processes. Accordingly, modeling approaches used for the economic assessment of biorefineries need to be adapted in a suitable manner. In spite of these differences, it is attempted to conduct a comprehensive evaluation of biorefinery concepts to determine which concepts appear to be the most promising from the current state of knowledge, and to present an approach to use techno-economic modeling to gain further understanding in this developing key field of biomass conversion.

TE-07
Thursday, 15:00-16:30 - Fo7

Location

Stream: Discrete and Combinatorial Optimization, Graphs and Networks
Invited session
Chair: Mohsen Rezapour

1 - An Approximative Lexicographic Min-Max Approach to the Discrete Facility Location Problem
Lubos Buzna, Michal Kohani, Jaroslav Janacek

We propose a new approximative approach to the discrete facility location problem that provides solutions close to the lexicographic minimax optimum. The lexicographic minimax optimum is a concept that allows finding an equitable location of facilities. Our main contribution is the approximation approach, which is based on rules that allow (i) taking into account the multiplicities assigned to different customers; (ii) detecting whether for a given distance active customers can reach a higher, equal or smaller distance to the closest located facility; and (iii) using methods customized for solving the p-median problem. Customized methods can handle larger problems than up-to-date general purpose integer programming solvers. We use the resulting algorithm to perform an extensive study using well-known benchmarks and benchmarks derived from real-world road network data. We demonstrate that our algorithm allows solving larger problems than existing algorithms and provides high-quality solutions. The algorithm found an optimal solution for all tested benchmarks where we could compare the results with the exact algorithm.

2 - The Multi-service Facility Location Problem
Ping-Ting Lin, Chi-Fen Chang, Chung-Shou Liao

The facility location problem involves locating facilities in potential sites and determining the best strategy for communications between facilities and clients. The objective of the facility location problem is to open a subset of facilities and assign service from facilities to each client such that the total sum of setup cost and connection cost is minimized. During the past decades, there has been a considerable amount of research on the facility location problem and its variations in the operations research community.
To cope with real-world applications in which constraints and requirements of the facility location problem appear in different scenarios, the problem can be formulated in various ways. A facility often cannot afford all kinds of service due to complexity and cost, especially in a large-scale network. Moreover, each client may have different types of demand requirements. In this study, we investigate the multi-service facility location problem, which finds applications in distribution network design. Each facility has the ability to provide at most p kinds of distinct service for clients, while each client is associated with different demand requirements from the p services. The goal is to select a subset of facilities and to identify its corresponding service assignment to clients such that the demand requirement of each client can be satisfied, and the total cost, including the facility setup cost, service cost and connection cost, which is usually measured by the metric distance between facilities and clients, is minimized. Note that the optimal placement of such distinct but cooperative facilities is very different from that of identical facilities. We attempt to explore approximation hardness and develop algorithms for solving the problem.

3 - On the integrality gap of the connected facility location with buy-at-bulk edge costs problem
Mohsen Rezapour, Jose Soto

In the connected facility location with buy-at-bulk edge costs problem we are given a graph containing clients and potential facilities, a core cable type of infinite capacity, and several access cable types with decreasing cost per capacity ratio. The task is to open some facilities, connect them by a Steiner tree of core cables, and build a forest network of access cables such that the edge capacities suffice to route all client demands to open facilities. The objective is to minimize the total cost for opening facilities and installing core and access cables. We consider a natural model for this problem, and show that its integrality gap is a constant.

TE-08
Thursday, 15:00-16:30 - Fo8

Mechanism Design I

Stream: Algorithmic Game Theory
Invited session
Chair: Andre Berger

1 - The Sequential Price of Anarchy
Jasper de Jong, Marc Uetz

The price of anarchy measures the costs to society due to the selfishness of players. More formally, it is a lower bound on the quality of any Nash equilibrium relative to the quality of the global optimum. However, in particular games some Nash equilibria are not realistic; therefore the price of anarchy gives an overly pessimistic view. Instead of assuming that all players choose their strategies simultaneously, we consider games where players choose their strategies sequentially. The sequential price of anarchy is then a lower bound on the quality of any subgame perfect equilibrium of such a game relative to the quality of the global optimum. This idea was introduced in a recent paper by Paes Leme, Syrgkanis, and Tardos, where they indeed give examples where sequential decision making leads to better equilibria. We review some of their results, touch upon our own results for a throughput scheduling problem, and discuss some of our ongoing work on linear congestion games.
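For cost-minimization games, the two ratios discussed in the abstract above are usually stated as follows; these are standard definitions with generic notation, not taken from the talk, where C(s) is the social cost of a strategy profile s.

% Standard definitions; NE = Nash equilibria of the simultaneous game,
% SPE = subgame perfect equilibria of its sequential version.
\[
  \mathrm{PoA} = \frac{\max_{s \in \mathrm{NE}} C(s)}{\min_{s} C(s)},
  \qquad
  \mathrm{SPoA} = \frac{\max_{s \in \mathrm{SPE}} C(s)}{\min_{s} C(s)}.
\]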


2 - Multi-item auctions with exclusivity margin
Rudolf Müller, Hemant Bhargava, Gergely Csapo

We study the problem of finding the profit-maximizing multi-item mechanism in a setting where bidders hold two-dimensional private information: one for the value of the item being sold, the other one for the added value margin they exhibit in case of exclusive allocation. We require the mechanism to be deterministic, individually rational, and implementable in dominant strategies. Our main motivating example comes from online marketing, specifically sales lead generation. Due to the great demand from practitioners for simple and speedy solutions, instead of going for optimality we focus on heuristics that provide good revenue relative to the optimal mechanism. Notably, we demonstrate that even the simplest mechanisms, such as selling always exclusively or always non-exclusively, produce revenue within a constant factor approximation of the optimal revenue. We identify two different single-dimensional relaxations of the problem, for which we determine the optimal auction using well-known techniques. The relaxations provide revenue bounds that can be used to evaluate the quality of heuristic auctions. We also devise a heuristic mechanism from the class of affine maximizers and demonstrate by means of simulation that it yields revenue very close to the upper bounds, and thus very close to optimality.

3 - Characterizing implementable allocation rules in multi-dimensional environments
Andre Berger, Rudolf Müller

We study characterizations of implementable allocation rules when types are multi-dimensional, monetary transfers are allowed, and agents have quasi-linear preferences over outcomes and transfers. Our main characterization theorem implies that allocation rules are implementable if and only if they are implementable on any two-dimensional convex subset of the type set. For finite sets of outcomes, they are implementable if and only if they are implementable on every one-dimensional subset of the type set. Our results complement and significantly extend a characterization result by Saks and Yu (2005), as well as follow-up results thereof.
Our proofs demonstrate that the linear programming approach to mechanism design, pioneered in Gui et al (2004) and Vohra (2011), can be extended from models with linear valuation functions to arbitrary continuous valuation functions. This provides a deeper understanding of the role of monotonicity and local implementation. In particular, we provide a new, simple proof of the Saks and Yu theorem, and generalizations thereof. Modeling multi-dimensional mechanism design the way we propose it here is of relevance whenever types are given by few parameters, while the set of possible outcomes is large. Examples for such types occur in scheduling problems and combinatorial auctions.

TE-09
Thursday, 15:00-16:30 - SFo1

Applications of linear and nonlinear optimization II

Stream: Continuous and Non-linear Optimization
Invited session
Chair: Michael Herty
Chair: Simone Göttlich

1 - A Bound-tightening technique for the global optimization of distillation-based separation processes
Dennis Michaels

Many separation tasks in Chemical Engineering are based on distillation. The determination of an optimal design for such separation processes often requires solving a mixed-integer non-convex optimization problem to global optimality and is, hence, very challenging in general. In this work, we present, for a certain class of distillation processes, a bound-tightening strategy that exploits the problem-specific structure. The bound-tightening strategy is used to define a MINLP relaxation of the original problem. The relaxed MINLP forms the basis of a modified branch-and-bound algorithm, which is used to solve the original problem to global optimality. The performance of the algorithm is demonstrated on a series of test instances.
This is joint work with Martin Ballerstein, Achim Kienle, Christian Kunde and Robert Weismantel.

2 - Lower Bounds for Global Polynomial Integer Optimization
Sönke Behrends, Ruth Hübner, Anita Schöbel

We consider the problem of minimizing a polynomial over the integer lattice, which is an NP-hard problem in general. Since it has many applications and interesting special cases, tractable subclasses of the problem need to be identified. Existence of global integer minimizers is related to the leading form of the polynomial, i.e. its highest order terms: a well-known sufficient condition for the existence of continuous and integer minimizers is a leading form that attains positive values only, except at zero. For example, in the univariate case this condition simplifies to an even degree and a positive leading coefficient.
If this condition holds, we algorithmically determine a finite box containing an integer minimizer, having smaller box size than boxes known from the literature. Once the box is fixed, we may find the minimizer by branch and bound. For an effective bounding, we introduce a new class of underestimators having integer minimizers which can be directly determined. By this method we obtain a lower bound on the value of the integer minimizer of the polynomial. Numerical results show the quality of the lower bound. Since it is possible to compute an optimal underestimator from this class during preprocessing, we omit time-consuming computations at each subproblem. Using results from real algebraic geometry, it is possible to further tighten the lower bound.
The resulting branch and bound procedure has been implemented and tested on random instances.

3 - On the Interaction Between Price and Advertising in a Gould Contagion Model
Gustav Feichtinger, Fouad El Ouardighi, Dieter Grass, Richard Hartl, Peter M. Kort

Dynamic sales models in marketing widely rely on the assumption that the attraction rate of new customers for a given product is critically dependent upon advertising effort. These models fail to take into account the role of sales price in the evolution of the number of potential customers. In addition, they disregard the existence of spontaneous word of mouth regarding the product, which may be crucial independently from advertising effort. These two important omissions may lead to misleading marketing policies, specifically for the management of word of mouth effectiveness. A primary goal of the paper is to analyze the optimal tradeoff between sales price and advertising effort and its implications for word of mouth effectiveness. To address this issue, an optimal control problem is formulated where the attraction of new customers depends both on spontaneous and advertising-based word of mouth, and sales price adjustments are costly.

TE-10
Thursday, 15:00-16:30 - SFo2

Uncertainties in Energy Markets and Stochastic Models for Energy Economics

Stream: Energy and Environment
Invited session
Chair: Dogan Keles

1 - Stochastic simulation of solar radiation in order to generate time series of photovoltaic electricity feed-in
Hans Schermeyer, Hannes Schwarz, Valentin Bertsch, Wolf Fichtner

The increasing impact of electricity generation from renewable energy sources (RES-E) on energy markets in Europe and beyond makes it more and more important to study their generation characteristics in detail. Therefore, in order to design an energy system or to make investment decisions, it is crucial to thoroughly explore the fluctuating and uncertain properties of RES-E generation.
The fluctuating character of RES-E feed-in can adequately be expressed by time series with a high temporal resolution (e.g. hours or 1/4 hours). Since there is little data of a high spatial and temporal resolution available on a European scale, many energy system analysts use historical meteorological data for their analysis and convert it to RES-E generation.


However, when investigating the fluctuating character of RES-E feed-in and the linked need for flexible backup capacity, this data base might not be sufficient to draw robust conclusions. Therefore, in this work we pursue a methodology to generate an infinite volume of realistic photovoltaic feed-in data. This aims at enabling energy systems analysts to base their modelling approaches on a larger set of realistic data and thus to reach more robust and reliable results.
The purpose of this paper is to explore the possibilities of modeling the solar radiation through a stochastic process. After reviewing several stochastic models from the literature, a modeling approach for solar radiation is formulated and calibrated using historical meteorological data. The generated radiation data is converted to electricity generation using a photovoltaic power plant model. Both the modeled radiation and electricity generation are evaluated and discussed with regard to their ability to simulate the RES-E's uncertain and fluctuating character.

2 - Uncertainties in the balancing markets for electricity - barriers for renewable energy sources
Michael Zipf, Dominik Möst

Overview: This presentation deals with uncertainties in electricity markets, especially on the balancing markets. Forecasting quality plays an important role in this context, as shorter lead time leads to better forecasts. Shorter contract durations on the balancing market would reduce the costs for providing balancing energy and would lower barriers for market entry. To measure the efficiency gain of shorter contract durations in balancing markets, an optimization model is applied to the German electricity system. Thereby, the day-ahead and balancing energy markets are modelled in detail.
Methods: A stochastic mixed integer linear optimization model minimizing total system costs is applied. The uncertain parameters are the renewable feed-in. The time series are generated with an ARMA and/or SARIMA approach. Under these uncertainties the balancing commitments are optimized. Based on these commitments, a second model is used, where the need for control energy is modelled as a stochastic component. With this two-stage model approach, the influence of different balancing market design options is analysed concerning effectivity.
Results: With shorter contract duration time a) costs for providing balancing energy decrease, b) the number of market participants increases significantly and c) renewable energy sources have an incentive to participate in the markets for balancing energy. Higher volatility of day-ahead market prices intensifies the above mentioned effects.

3 - The impact of disequilibria in power markets
Thomas Kallabis, Christoph Weber

Frequently, energy market models focus on the analysis of equilibria and equilibrium development paths. But the history of competitive electricity markets in Europe more closely resembles a sequence of booms and busts - with currently a bust period. A key reason are the long lead times for construction. By investigating the impacts of deviations from an (anticipated) equilibrium, the relevance of various risk factors for profitability is highlighted. This will contribute to improving investment decision making under uncertainty.

TE-11
Thursday, 15:00-16:30 - SFo3

Decision Making in Practice

Stream: Decision Theory and Multi-Criteria Optimization
Invited session
Chair: Andreas Schilling

1 - Novel approach for effective campus recruitment
Rajiv Srivastava, Mangesh Gharote, Girish Palshikar

Every year organizations go for campus recruitment of fresh graduates on a large scale. Organizations commonly consider college rankings by external agencies to make hiring decisions. The colleges may be highly ranked externally, but the realized "utility" of the hired candidates and the college rankings from an organization's perspective may differ from the external rankings. Organizational utility is based on factors such as tenure of the candidates and their performance on the job. Also, determining the number of students to be recruited from each ranked college is another challenge faced by managers. In this paper, we present ranking schemes for the colleges, using DEA and average ranking, based on past employee data (performance, tenure, attrition etc.) and college data (joining ratio, external college grading etc.), with the organizational utility perspective. The rankings were correlated with expert opinion using measures like Kendall Tau. Based on the rankings, a new recruitment allocation model is developed to determine the number of students to be recruited from each college, given an overall target number of students to be recruited in that year. Our model tries to maximize the expected organizational utility as well as maintain a healthy diversity. For the solution, we have modified the standard water-filling algorithm from convex optimization. The results show improved organizational utility of the new allocations, assuming the same overall behaviour of hires. We are building an intelligent automated recruitment allocation system to reduce recruitment efforts and enhance the organizational utility from the recruited fresh hires. The proposed approach is applicable to other human resource supply chain problems like the recruitment of experienced professionals from placement agencies.

2 - System Survivability under Attack: Concept and preliminary results
Mohamed Naceur Azaiez, Asma Ben Yaghlane

We investigate the concept of system survivability under attack. This concept is particularly important nowadays given the intelligent threats the world is confronting, including terrorism, rebellions, civil wars, and so on. We discuss the discrete and continuous cases as well as network systems. We display results for some classical configurations including series-parallel, parallel-series, and k-out-of-n systems. We also discuss the case of repetitive attacks. Moreover, we will briefly outline defensive and offensive planning of resources through the expected number of attacks for each system configuration. This gives rise to a game problem between the defender and the attacker, where the first player seeks to preserve system functionality while the second considers the best attack strategies to force disabling the targeted system.

3 - Optimizing information security investments with limited budget
Andreas Schilling, Brigitte Werners

The importance of information security is constantly increasing with technology becoming more pervasive every day. As a result, the necessity and demand for practical methods to evaluate and improve information security is particularly high. The aim of this paper is to apply mathematical optimization techniques to improve information security. According to the identified problem structure, a combinatorial optimization model is established. The objective of the presented approach is to maximize system security by choosing the best combination of security controls limited by the available budget. In addition, by performing a what-if analysis and systematic budget variations, the decision maker can get improved insights and thus determine an ideal budget proposition yielding the highest benefit among all possible control configurations. An exemplary case study demonstrates how this approach can be used as a tool within the risk management process of an organization.

TE-12
Thursday, 15:00-16:30 - SFo4

Airport Operations Scheduling II

Stream: Project Management and Scheduling
Invited session
Chair: Rainer Kolisch

1 - Inbound Baggage Handling at Airports
Markus Frey, Ferdinand Kiermaier, Rainer Kolisch

We consider the planning and scheduling of inbound baggage, which leaves the airport through the baggage claim hall. Although this is a standard process at airports, to the best of our knowledge no mathematical model has been proposed in the literature optimizing the inbound baggage handling process and analyzing its theoretical structure. As the inbound baggage handling problem turns out to be NP-hard, we propose a hybrid heuristic combining a greedy randomized adaptive search procedure (GRASP) with a guided fast local search (GFLS) and path-relinking. We demonstrate how we implemented the proposed algorithm into the running system of a major European airport. In a case study, we compare the results of the MIP with the solution of the GRASP/GFLS heuristic and the solution provided by practice. All computational results are based on real data.
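The GRASP component mentioned in the last abstract follows a well-known generic template. The sketch below shows this template with illustrative parameter names (construct, local_search, cost); it is not the authors' implementation.

# Generic GRASP template (illustrative only; not the authors' algorithm).
import random
from typing import Callable, Tuple, TypeVar

S = TypeVar("S")  # solution type

def grasp(construct: Callable[[random.Random, float], S],
          local_search: Callable[[S], S],
          cost: Callable[[S], float],
          iterations: int = 100,
          alpha: float = 0.3,
          seed: int = 0) -> Tuple[S, float]:
    """Repeat greedy randomized construction plus local search, keep the best solution.

    construct(rng, alpha) builds a solution using a restricted candidate list
    controlled by alpha in [0, 1]; local_search improves it; cost evaluates it.
    """
    rng = random.Random(seed)
    best, best_cost = None, float("inf")
    for _ in range(iterations):
        solution = local_search(construct(rng, alpha))
        value = cost(solution)
        if value < best_cost:
            best, best_cost = solution, value
    return best, best_cost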


2 - Disruption Management for Outbound Baggage Handling with Work Group Pairings
Christian Ruf, Markus Frey, Rainer Kolisch, Marco Lübbecke

Outbound baggage handling at international airports concerns baggage which has to be transferred from the terminal or transfer flights to the departing airplane. In the planning, flights have to be assigned to handling facilities and the baggage handling has to be scheduled. The latter involves setting the start of the baggage handling and the start of the depletion of the central baggage storage. Moreover, to load the incoming baggage into bulk containers at the handling facilities, workers have to be staffed to the flights. We propose a model formulation to plan the outbound baggage handling including the staffing of the workers. We present a Dantzig-Wolfe and a Benders reformulation. An algorithm based on column and row generation is applied to find a feasible assignment and schedule for the baggage handling and to staff the workers at the handling facilities. In a computational study we test the performance of the procedure with real-world instances based on Terminal 2 of Munich Airport. The results show that the algorithm is capable of giving a feasible solution in a reasonable amount of time.

3 - The Airport Ground Staff Tour Scheduling Problem with Flexible Breaks
Ferdinand Kiermaier, Markus Frey, Rainer Kolisch

Ground handling, as well as many other personnel scheduling problems, requires the explicit assignment of shifts and days off to individual employees rather than to a generic workforce. This means that information on skills, availability, and overtime balances must be taken into account. A very flexible component of the shift models used in ground handling is the break regulation. For a large portion of the workforce at AeroGround, a major European ground handler, the total break duration per shift may be split into sub-breaks, where each sub-break has a minimum and maximum length, and workstretch durations regulate the time between them. We will provide an overview and classification scheme for break regulations discussed in the literature. Moreover, we will introduce and present a tour scheduling model based on work templates that includes hierarchical skill levels and the possibility to use different break regulations. We analyze different break regulations and show the advantage of using flexible break regulations.

TE-13
Thursday, 15:00-16:30 - SFo9

Applications of Metaheuristics

Stream: Heuristics, Metaheuristics, and Matheuristics
Invited session
Chair: Djamal Dris

1 - Local search for determining the supplier's optimal discount schedule
Viktoryia Buhayenko

This research examines the problem of determining an optimal discount schedule, where the supplier decides how much (if any) discount should be given to each customer in each period, aiming to maximize his profit. The customers benefit from accepting discounts since the resulting price reduction exceeds their increase in inventory and order costs. The demand of each customer varies from period to period and is independent of the discount offered. Therefore, the total demand for the supplier is constant and he can only influence when the customers place their orders by offering sufficient discounts in these periods. The type of discount studied is a simple price reduction. The customers are heterogeneous in their demand, holding and order costs.
The initial solution is obtained by applying a series of Wagner-Whitin algorithms to the situation without discounts. The neighbouring solutions for finding the best production schedule for the supplier are created by increasing or reducing the number of set-ups. The production periods are spread in such a way that the total customers' demand in-between two set-ups is roughly equal. To calculate the objective value for the new production pattern of the supplier, a second neighbourhood search is performed for each customer to determine the best order pattern that minimizes the discount offered and the inventory cost for the supplier. The neighbourhood structure for this heuristic is based on the observation that customers can order an integer number of times between two set-ups of the supplier. These orders are evenly spaced between the set-up periods. This performs the role of a fitness function to compare the quality of solutions. Optimal discounts are determined while the best order pattern for the customers is obtained.

2 - Intrusion detection system performance in ubiquitous environments using genetic algorithm approach
Djamal Dris, Lynda SELLAMI, Djilali Idoughi

The next generation of distributed computing environments is called ubiquitous computing. The purpose of ubiquitous computing is to make computing ubiquitous and discreetly accessible to the user at any time and anywhere, while integrating technologies and communication strongly into people's daily life. This information accessibility and flexibility of ubiquitous computing makes it vulnerable to attacks. This requires the detection of security breaches when they occur. Our goal in this article is to explore the possibility of detecting intrusions (attacks) that occur in ubiquitous environments using a genetic algorithm approach.

TE-14
Thursday, 15:00-16:30 - SFo10

Transportation in Health Care

Stream: Health Care Management
Invited session
Chair: Teresa Melo

1 - Vaccine Supply Strategies in case of a Smallpox Epidemic
Burcu Adivar

In this study, epidemiological modeling is used to evaluate different ordering policies for vaccine requirements for an anticipated smallpox attack. Based on a compartmental model for the dispersion of the smallpox virus, we consider vaccination as the main control policy in addition to hospitalization and quarantine. The solution to a set of ordinary differential equations is used to estimate the need for vaccines for two different population sizes. Assuming a zero initial stock level for smallpox vaccines, we discuss several supply strategies and evaluate their effect on stopping the dispersion of the epidemic. In order to guarantee a sufficient supply of vaccine, mathematical programming is used to solve the single commodity multi-supplier procurement model under a time-varying demand rate.

2 - A heuristic approach to the home health care problem with working regulations
Daniela Lüers, Leena Suhl

The global demographic change leads to an increasing number of elderly people and therefore people in need of care. One possibility to support these people is home health care, where clients stay at their homes and receive services from home health care providers. To plan the different services for a given time period, the providers have to perform a complex routing and scheduling task. The home health care problem from the literature combines the well-known vehicle routing problem and nurse rostering problem to achieve the daily routes and duty schedules for the nurses. These classical problems are extended by some specific extensions such as skill requirements and personal preferences of clients to be applicable in the home health care context. The routing part of the home health care problem is the object of many publications. In contrast, working regulations for the nurses are usually not considered, although they are widely used in the nurse rostering problem for hospitals. We adapt relevant regulations, such as shift rotations and break rules, and bring them to the home health care problem in order to respect personal preferences and legal requirements. Numerical results from a mixed-integer formulation solved by a commercial solver show that these tend to be noncompetitive with respect to computing time due to the integration of two NP-hard problems. Therefore we present a heuristic approach based on a large neighborhood search to cope with the complexity of the problem and obtain solutions in a reasonable computing time.
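The large neighborhood search mentioned in the last abstract typically alternates destroy and repair steps around an incumbent solution. The following generic sketch uses illustrative names (destroy, repair, cost) and a simple improvement-only acceptance rule; it is not the authors' method.

# Generic large neighborhood search loop (illustrative only; not the authors' method).
import random
from typing import Callable, Tuple, TypeVar

S = TypeVar("S")  # solution type

def large_neighborhood_search(initial: S,
                              destroy: Callable[[S, random.Random], S],
                              repair: Callable[[S, random.Random], S],
                              cost: Callable[[S], float],
                              iterations: int = 1000,
                              seed: int = 0) -> Tuple[S, float]:
    """Iteratively destroy part of the incumbent and repair it, accepting improvements."""
    rng = random.Random(seed)
    current = best = initial
    current_cost = best_cost = cost(initial)
    for _ in range(iterations):
        candidate = repair(destroy(current, rng), rng)
        candidate_cost = cost(candidate)
        if candidate_cost < current_cost:      # simple improvement-only acceptance
            current, current_cost = candidate, candidate_cost
            if candidate_cost < best_cost:
                best, best_cost = candidate, candidate_cost
    return best, best_cost

In practice, acceptance criteria such as simulated annealing are often used instead of accepting improvements only.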


 TE-15  TE-16 Thursday, 15:00-16:30 - SFo11 Thursday, 15:00-16:30 - SFo14 Forecasting with Computational Modelling Material and Energy Flows Intelligence Stream: Energy and Environment Stream: Statistics and Forecasting Invited session Invited session Chair: Matthias Gerhard Wichmann Chair: Sven F. Crone 1 - Operative planning of recycling measures for iron and steel slags 1 - Forecasting Energy Prices with Neural Networks Christoph Meyer, Matthias Gerhard Wichmann, Thomas Ralph Grothmann, Hans Georg Zimmermann Spengler

The liberalization of the German energy market in 2002 along with With a total of 1.6 billion tons of crude steel in 2013 worldwide steel the deregulation of neighbouring European energy markets in recent production has reached its highest level to date. Therefore it is nec- years has created several needs for energy companies, public services, essary to deal with large amounts of by-products. An essential group energy brokers and large scale energy consumers. Innovative pro- of by-products in the iron and steel industry are slags. Slags perform curement concepts must be developed in order to make use of market important metallurgical tasks and are inevitable for iron and steel pro- chances, to minimize risks or to leverage energy resources efficiently. duction processes. Although slag production is inevitable, slags are not considered waste and can be used as secondary resources. For exam- In this context, energy price forecasting is probably one of the most ple, slags are recycled to produce road construction material, cements demanding tasks for market oriented energy procurement. Since the and fertilizers. In order to recycle slags there is a variety of alternative primary energy markets are highly interrelated, an isolated analysis of recycling measures. The potential of a recycling measure strongly de- a single domestic energy market is questionable. What is required is a pends on a multitude of technical, economic and ecological variables. joint modelling of all interrelated energy and commodity markets. Due to the concurrence of these variables a general statement concern- ing the advantage of one specific recycling measure cannot be given. We present an approach of coherent market modeling to forecast en- In short term production planning this leads to the question how slags ergy prices, which is based on large time-delay recurrent neural net- are to be recycled. A planning approach taking into account all relevant works (LRNN). These nonlinear state space models combine different variables is not known. This contribution introduces a production plan- operations of small neural networks into only one shared state transi- ning approach for slag recycling considering technical, economic and tion matrix. We use unfolding in time to transfer the network equations ecological variables. The planning approach comprises a quantity and into a spatial architecture. The training is done with error backpropa- a value structure. The quantity structure is based on an activity analysis gation through time. Unlikely to small networks and standard econo- focusing operating points of possible recycling processes. In order to metric techniques, overfitting and the associated loss of generalization determine relevant operating points, recycling processes are modeled abilities is not a major problem in large networks due to the self-created by means of flowsheet simulation. The information provided by the eigen-noise of the systems. quantity structure is evaluated in the value structure based on manage- ment accounting. Subsequently, the quantity and value structure are We exemplify our approach of market modeling by forecasting the incorporated into a formal mathematical model. The application of the long- and short-term development EEX base future prices. model is illustrated in a case study.

2 - Improved Model Selection Criteria for Neural Networks: an empirical evaluation of trace errors in time series prediction
Sven F. Crone

Selecting appropriate forecasting models is of particular importance for artificial Neural Networks (NN), where many candidate models of varying performance can be created. NNs offer large degrees of freedom in specifying network architectures, and the weights of each architecture must be randomly initialised multiple times in order to account for local minima in parameterisation. As a result, for a single time series a large number of NN candidate models can be created, out of which one must be selected to generate the actual out-of-sample prediction.

Despite the inherent importance of model selection for NNs, only limited research has addressed the issue. In statistics, model selection based on information criteria (IC) is widely accepted (Hyndman, 2010). However, for NNs, Qi & Zhang (2001) compare different in-sample information criteria such as AIC and BIC with forecast errors and find that they fail in selecting models with high out-of-sample performance. Curry and Morgan (2004) raise fundamental theoretical concerns about the adequacy of any IC for NNs, due to the indeterminacy of redundant network weights and the challenges in determining a network's actual degrees of freedom. These limitations question the popular use of IC for the selection of NN models, and fundamentally impair the reliable use of NNs in forecasting.

To address this gap, this study proposes a novel NN selection criterion of multiple-step-ahead out-of-sample trace forecast errors for model selection. We assess its efficacy in a large empirical evaluation on 111 time series of industry data, taken from the popular NN3 competition (Crone et al., 2010), and compare it to the popular information criteria AIC and BIC and the error metrics MSE, MAE and MAPE for in- and out-of-sample NN model selection.

TE-16
Thursday, 15:00-16:30 - SFo14
Modelling Material and Energy Flows
Stream: Energy and Environment
Invited session
Chair: Matthias Gerhard Wichmann

1 - Operative planning of recycling measures for iron and steel slags
Christoph Meyer, Matthias Gerhard Wichmann, Thomas Spengler

With a total of 1.6 billion tons of crude steel in 2013, worldwide steel production has reached its highest level to date. Therefore it is necessary to deal with large amounts of by-products. An essential group of by-products in the iron and steel industry are slags. Slags perform important metallurgical tasks and are inevitable for iron and steel production processes. Although slag production is inevitable, slags are not considered waste and can be used as secondary resources. For example, slags are recycled to produce road construction material, cements and fertilizers. In order to recycle slags there is a variety of alternative recycling measures. The potential of a recycling measure strongly depends on a multitude of technical, economic and ecological variables. Due to the concurrence of these variables, a general statement concerning the advantage of one specific recycling measure cannot be given. In short-term production planning this leads to the question of how slags are to be recycled. A planning approach taking into account all relevant variables is not known. This contribution introduces a production planning approach for slag recycling considering technical, economic and ecological variables. The planning approach comprises a quantity and a value structure. The quantity structure is based on an activity analysis focusing on operating points of possible recycling processes. In order to determine relevant operating points, recycling processes are modeled by means of flowsheet simulation. The information provided by the quantity structure is evaluated in the value structure based on management accounting. Subsequently, the quantity and value structure are incorporated into a formal mathematical model. The application of the model is illustrated in a case study.

2 - Evaluation of managerial and operational measures to improve resource- and energy-efficiency in the automotive industry
Ina Schlei-Peters, Matthias Gerhard Wichmann, Thomas Spengler

The energy- and resource-efficient design of production processes is nowadays a crucial competitive factor for producing companies. Considering the current discussion in the field of sustainability, producing companies strive to improve energy- and resource-efficiency. To achieve this goal, managerial and operational measures (MOM) have to be planned and implemented. A major challenge for the planning of MOMs is the evaluation of their ecological effectiveness. Here, three characteristics arise. First, MOMs in general affect multiple ecological index numbers. Second, there are numerous interdependencies in the flow of materials and energy within a production system, which are affected by the implementation of an individual managerial or operational measure. Third, multiple measures are interdependent as well and thus cannot be cumulated easily. Common approaches to the evaluation of MOMs typically focus on their individual evaluation regarding a specific ecological measure for a specific production process. An evaluation regarding the whole production system as well as the interdependencies with other managerial or operational measures is missing. In this paper, a model-based evaluation approach is developed, which takes into account interdependencies in the flow of materials and energy as well as interdependencies between MOMs. The evaluation approach is based on a modular multi-layer model of the flow of materials and energy. This model explicitly incorporates all described interdependencies. The modular design of this model allows for a sufficiently accurate and problem-adequate modelling of the production system. This approach is the base for the development of a decision support tool for the planning and implementation of a bundle of efficient MOMs.


3 - A Quantitative Model for the Simulation and Optimization of Energy Flows in Buildings
Jens Tiekenheinrich, Maria-Isabella Eickenjäger, Michael H. Breitner

Today the energetic layout of buildings is increasingly shaped by climate change and limited fossil resources. New loads, local power generation, and energy storages rapidly gain influence. Various technologies have appeared on the market, each with specific effects on the energetic footprint of a building. New energy supply technologies are, e.g., photovoltaic (PV) and solar thermal systems, heat pumps, or combined heat and power (CHP) micro systems. On the demand side, new loads such as electric vehicles additionally arise. Besides energy storages, the specific conditions of a building play a crucial role. Important parameters are, e.g., the energy usage of the building (private, commercial or industrial) or the specific location characteristics (ambient temperature, solar radiation, etc.). The cause-and-effect laws of each technology are well studied today for most instances. However, the complexity of mutual relations increases when different technologies are combined under various conditions, caused by the plurality of influencing factors. This trend progressively complicates the simulation of energy flows and the determination of specific energetically optimal layouts. Here, a quantitative model of the holistic energy flows in buildings is presented. This model takes into account building- and location-specific requirements and their influence on energy demand and supply. Furthermore, it involves both the possible usage of different technologies and their combination. It is shown that the model helps to comprehend the complexity of cause-and-effect laws regarding the energy flows in buildings. Results provide a basis for energetically optimal planning.

TE-18
Thursday, 15:00-16:30 - 004
Information Asymmetry and Risk
Stream: Supply Chain Management
Invited session
Chair: Guido Voigt

1 - Supplier selection with a utility range-based interactive group decision making method
Halil Şen, Murat Ayanoglu

Today's decision making problems are discrete, multi-criteria and involve multiple decision makers (DMs). Organizations use group decision making (GDM) techniques because of the problems' complexity. One of the key questions in this type of problem is how the preferences of the DMs can be modeled. DMs are often able to provide only incomplete information because of time pressure, lack of knowledge, and their limited expertise related to the problem domain. In these situations the decision support system should allow modeling of incomplete preference information. In this study we develop an interactive procedure which uses incomplete preference information. The main theme underlying the method is that every group member wants to compare their partial information with that of the other group members. The procedure represents the incomplete information as a linear range, because such ranges can easily be computed from partial utility information. The range format makes the incomplete information effective and efficient to present to the group members. In addition, range-type utility information makes it easy to compare every group member's utility information with the group's information and to collect each group member's utility information within the group's utility information. To obtain the group utility, a preference aggregation method is used, and the interactive procedure helps the group to reach a consensus. The method uses the criterion of realism (Hurwicz). We used this method for the evaluation of the performance of organization companies as the service suppliers of a pharmaceutical company. The suppliers' utility information is calculated by using an optimism coefficient which is determined by the group. The supplier with the highest utility is selected.

2 - Modeling risks in supply chains: an overview of the literature
Betül Özkan, Huseyin Basligil

There are many internal and external factors that affect a supply chain's success and sustainability. All these factors pose risks for the supply chain, so risks should be managed carefully and effectively. The negative effects of potential risks should be minimized for a better supply chain performance. The topic of risk management has received growing attention from academics in recent years. There are different risk management techniques in the literature; some of these techniques are qualitative and some are quantitative. In this study we review quantitative risk minimization models in supply chains. First we introduce the most common risks, and then we analyze quantitative risk models and their solution techniques. Finally, we give some directions for future research on modeling risks in supply chains.

3 - The role of service level based compensation in a profit center organization with demand uncertainty
Guido Voigt, Barbara Schoendube

We consider a decentralized organization. A principal hires an agent to be head of a profit center. The agent's only task is to choose the periodic order quantity for a single selling season in the presence of uncertain demand. One out of two possible demand distributions materializes. By assumption the agent knows which distribution is present, but the principal does not. The principal aims at maximizing long-term firm value while the agent maximizes short-run profit. Given this setting, an adverse selection problem is present. We show that the principal can offer a compensation scheme that results in zero agency costs and fully efficient outcomes.

TE-19
Thursday, 15:00-16:30 - I
Train Path Assignment
Stream: Traffic and Transportation
Invited session
Chair: Karl Nachtigall

1 - New insights using optimized train path assignment for the development of railway infrastructure
Daniel Pöhle, Matthias Feil

The train path assignment optimization algorithm generates an optimal solution for freight train path applications by connecting available slots between several construction nodes without conflicts. This method is used not only for a real timetable, e.g. for the following year, but also for the timetable-based development of railway infrastructure in long-term scenarios. However, for infrastructure development the actual slot connections are not the main concern in this planning step. The railway infrastructure company rather wants to detect bottlenecks in the infrastructure and needs evidence for necessary developments of its railway network. By presenting results of a real-world test case for the German railway network, this paper shows which bottlenecks can be derived from an optimized slot assignment and which measures (in timetable and infrastructure) could eliminate the detected bottlenecks. Necessary key figures for discovering bottlenecks are introduced, too. It is shown that the shadow prices of the developed column generation method are a good indicator for the identification of bottlenecks. For the first time, the comparison of different assignments' key figures makes it possible to state a clear monetary benefit of removing a single bottleneck, e.g. the revenue advantage of an additional track for buffering freight trains. Hence, using the developed optimization algorithm for train path assignment leads to new useful insights for a railway infrastructure company to develop its railway network.


2 - Why does a railway infrastructure company need an optimized train path assignment for industrialized timetabling?
Matthias Feil, Daniel Pöhle

In today's timetabling process for railway freight transportation, train paths are constructed individually on the basis of concrete train path applications. In contrast, an industrialized timetable for rail freight transport is based on the separation of the train path and the train itself. This means that in a first step train paths are constructed between predefined locations in the railway network (construction nodes) with parameters representing many trains. These train paths are called slots and correspond to the available capacity for freight trains in the railway network. After the train path applications have been received at a fixed date, suitable train paths are searched simultaneously for all applications and their complete routes. In this step the railway infrastructure company has to make sure that each slot is used by only one application and that the slots are connected without conflicts in the construction nodes. This paper discusses the necessity of an optimization algorithm for the outlined slot assignment for industrialized timetabling of a railway infrastructure company. To this end, the potential for optimization in comparison to a greedy heuristic approach is shown in an example of a long-term timetabling scenario for the German railway network. It becomes apparent that the developed column generation approach generates good solutions for real-world use cases in an acceptable runtime.

3 - Modelling and Solving a Train Path Assignment Model
Karl Nachtigall

We introduce a binary linear model for solving the train path assignment problem. For each train request a train path has to be constructed from a set of predefined path parts within a time-space network. For each possible path we use a binary decision variable to indicate whether the path is used by the train request. Track and halting capacity constraints are taken into account. We discuss different objective functions, like maximizing revenue or maximizing total train path quality. The problem is solved by using column generation within a branch-and-price approach. This talk gives some modeling and implementation details and presents computational results for real-world instances.
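As a rough illustration of such a path-based model (notation and the unit-capacity form are not taken from the talk), the core formulation reads

    \max \sum_{r \in R} \sum_{p \in P_r} v_p x_p
    \text{s.t.} \quad \sum_{p \in P_r} x_p \le 1 \ \ \forall r \in R, \qquad \sum_{p \,\ni\, s} x_p \le 1 \ \ \forall \text{ slots } s, \qquad x_p \in \{0,1\},

where P_r is the set of feasible paths for train request r and v_p the revenue or quality of path p. Column generation prices out new paths with positive reduced cost, and branch-and-price restores integrality.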
TE-20
Thursday, 15:00-16:30 - II
Manufacturing Applications
Stream: Production and Operations Management
Invited session
Chair: You-Jin Park

1 - "3D-printing" redesigns the procedure of planning, manufacturing and selling goods
Maria Mavri

The evolution of 3D-printing and its applications in sectors such as medicine, culture and manufacturing leads to notable changes in our daily life. 3D-printing and its implications for the production chain are the main research question of this study. As is known, the production chain is the procedure of transforming raw materials into goods. Planning, manufacturing and selling are some of the main steps in this chain. Recently, these steps have been restructured: products are designed and then simply printed and sold. Storage costs and costs from unsold products are eliminated. 3D-printing is a rapid, customized and low-cost form of production. The speed and the low cost are also associated with ease of reproduction: after designing the prototype of a product, it is easy to reprint it in small or large quantities. The problem of supplying small markets is solved by 3D-printing, as these markets can be served without requiring manufacturing companies to warehouse or produce goods at large cost. Customized production is associated with the ability to make small or large changes to a product's prototype, e.g. in color, size or shape, at no cost, and thus to satisfy customers' needs more efficiently. The goal of this study is to examine how the production chain is being transformed: which steps are changing, which steps are being replaced and which new steps are being added. Managing incoming orders, mapping the production procedure, redefining the suppliers, reconsidering the selling points (how consumers receive the products) and pricing these new products are some of the issues under examination in this paper.

2 - Virtual Production Intelligence Providing Analytics in Laser Cutting
Rudolf Reinhard, Urs Eppelt, Toufik Al Khawli

Laser cutting is a thermal separation process widely used in shaping and contour cutting applications. It has the advantage over conventional cutting techniques that it is a very fast and at the same time very accurate technology, with the optical tool laser not being exposed to any wear. There are, however, gaps in understanding the dynamics of the process, especially with regard to issues related to cut quality. Modeling and simulation of the laser cutting process - although quite demanding in computational resources - has been shown to improve that understanding without the need for executing a lot of experiments in the real world. The simulation itself is characterized by a high-dimensional input parameter set. Each parameter has its own range, and together they form the parameter domain space. The quality criteria, which are predicted with the numerical model, are analyzed together with the parameter domain space and are thus used to optimize the process. However, simulation results can only help in the build-up of process understanding if they can be presented in their entirety and together with their origin in the parameter domain. In this paper an approach is shown for the support of experts in the special application field of laser cutting. Furthermore, the approach's basic principles can be transferred to any application domain in which process maps can be useful to gather process knowledge. The paper describes feasible and suitable methods for this purpose. It also presents the validation of these methods with the help of a web application. This web application considers the integration of simulation data as well as their suitable visualization. The paper discusses the current approach and gives an outlook on future work.

3 - A Simulation Study on the Lot Assignment Problem in the Semiconductor Photolithography Process
You-Jin Park

It is important to develop practical and effective methods to improve productivity in a semiconductor manufacturing fab, which involves possibly one of the most complex manufacturing processes ever used. The photolithography process in semiconductor manufacturing is one of the most complex processes and is known as the bottleneck process which significantly affects the entire fab productivity. In this research, we consider the lot assignment problem in the photolithography process in the case that there is a limited number of qualified equipment types and main equipment for each photolithography step, and we apply a simple heuristic approach in order to minimize the makespan of the photolithography process.

TE-21
Thursday, 15:00-16:30 - III
Distributed and Remote MIP Solving II
Stream: Software Applications and Modelling Systems
Invited session
Chair: Bjarni Kristjansson

1 - CmplServer - An open source approach for distributed and grid optimization
Mike Steglich

CMPL (Mathematical Programming Language) is a mathematical programming language and a system for mathematical programming and optimisation of linear optimisation problems. CMPL can be used with the CMPLServer, which is an XML-RPC-based web service for distributed optimisation. After an overview of the main functionality, the XML-based file formats (CmplInstance, CmplSolutions, CmplMessages, CmplInfo) for the communication between a CMPLServer and its clients are described. Since a CMPL model can be solved on a CMPLServer synchronously and asynchronously, both modes are explained in the next step. All these distributed optimisation procedures require a one-to-one connection between a CMPLServer and the client. Furthermore, it will be discussed how CMPLServers from several locations can be coupled to one "virtual CMPLServer", how a client can connect with it and how optimisation jobs are coordinated within the CMPLServer grid. At the end, an analysis of the positive effects of shipping optimisation problems into a grid of CMPLServers versus the corresponding network traffic will be discussed.

2 - Distributed computing using server based optimization in AIMMS PRO
Ovidiu Listes

The AIMMS PRO (Publishing and Remote Optimization) platform allows for remote optimization, queuing and prioritizing of optimization tasks on a central server. We discuss important aspects such as the data exchange and message exchange capabilities offered by the PRO platform. We illustrate these capabilities based on an advanced example where several incumbent solutions of an MIP problem are evaluated remotely on the server using a simulation run. Results are fed back to the client, where they serve for sensitivity analysis and visual comparison of a number of feasible solutions.
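Both talks revolve around shipping an optimization job to a remote server and collecting the results. The following minimal Python sketch only illustrates the general round trip over XML-RPC, the protocol CMPLServer is built on; the endpoint URL and the method names submit/status/retrieve are hypothetical placeholders and do not reflect the actual CMPLServer or AIMMS PRO APIs.

    import time
    import xmlrpc.client

    # Hypothetical remote optimization service; URL and method names are
    # illustrative placeholders, not a real product API.
    server = xmlrpc.client.ServerProxy("http://solver.example.org:8008")

    with open("model.cmpl") as f:              # model file name is assumed
        job_id = server.submit(f.read())       # ship the problem instance

    while server.status(job_id) == "running":  # asynchronous mode: poll
        time.sleep(1)

    print(server.retrieve(job_id))             # fetch the solution report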


TE-22
Thursday, 15:00-16:30 - IV
Attended Home Deliveries
Stream: Logistics and Inventory
Invited session
Chair: Alberto Ceselli

1 - Improving Order Acceptance through Revenue Management Techniques for Attended Home Deliveries in Metropolitan Areas
Catherine Cleophas, Jan Fabian Ehmke

The increase in online retailing has caused a boom in attended home deliveries: for example, when fresh groceries are delivered to the customer's door, this has to happen at a time when someone is at home to receive them. As delivery fees cannot fully compensate for the costs of delivery in tight delivery time windows, we propose a novel, value-based approach to demand fulfilment. The idea is that, to be profitable, businesses should not only minimize the costs of delivery, but should also maximize the overall value of fulfilled orders. We present an iterative solution approach: first, we approximate the transport capacity based on forecasts of expected delivery requests and a cost-minimizing routing. Subsequently, we decide on the acceptance of actual delivery requests with the objective of maximizing the overall value of orders given a fixed transport capacity. Based on the set of accepted requests, we update the routing solution to minimize the costs of delivery. The solution combines well-known techniques from revenue management and time-dependent vehicle routing. In a computational study, the potential and the limits of this approach are investigated for a German metropolitan area. This study particularly considers the sensitivity of our approach regarding forecast accuracy and demand composition.

2 - Optimizing time slot allocation in single operator home delivery problems
Alberto Ceselli, Marco Casazza, Lucas Létocart

Home service optimization is becoming a key issue in many industrial sectors. For instance, it is common practice for large technology stores in Europe to offer both the delivery of products at home after purchase and additional professional services like installation and setup. Crucial decisions must be taken at different levels, like the tactical definition of time slots and the operational scheduling of the operators; different strategies have been developed in the literature, typically trading time slot flexibility against price incentives and discounts. All approaches agree on a common principle: while a service time slot may be negotiated with some flexibility, missing a fixed appointment is perceived as a strong disservice by the customer. We tackle the problem of negotiating service times between customers and service providers (a) at an operational level of detail, that is, explicitly producing hard time windows together with a suitable schedule for an operator to meet them, (b) in an online fashion, that is, answering each customer at his arrival time, without assuming any distribution on future customers and without the possibility of re-negotiating the slots, and (c) with real-time performance, that is, with computational methods yielding decision support options in fractions of seconds. We formalize a time slot allocation problem, we propose diverse negotiation policies and we design algorithms that are able to cope with issues (a), (b) and (c) simultaneously. We also introduce suitable indicators, highlighting level-of-service factors as quality measures. Finally, we perform an experimental campaign, assessing the trade-off between different level-of-service measures and proving the overall effectiveness of the negotiation process.

TE-23
Thursday, 15:00-16:30 - V
Electric and Hybrid Vehicles
Stream: Traffic and Transportation
Invited session
Chair: Reinhard Madlener

1 - Electrification of Postal Delivery Fleets: A Cost-based Analysis of Competing Battery Technologies and Charging Strategies
Stefanie Wolff, Reinhard Madlener

Electric drivetrain technologies play an important role for delivery fleets in light of constantly rising fuel prices and steadily tightening CO2 emission and urban air quality legislation. In this study, we assess the total cost of ownership (TCO) of electrifying a delivery fleet in inner city districts over a ten-year operating life. For the electrification, two battery technologies (ZEBRA, Li-Ion) are used that differ, among other things, in capacity and thus coverage, price, and durability. Along with direct vehicle costs, we also consider indirect vehicle costs, such as installation and operating expenses of the charging infrastructure, and environmental costs (life-cycle analysis, LCA). Empirical data are taken from a field trial of Deutsche Post DHL in Bonn, where delivery is characterized in particular by a high number of stopovers and stop-and-go traffic. The TCO of the electric delivery fleet is compared with that of a diesel-powered delivery fleet. Evidently, the results vary according to battery technology. The TCO of the electric drives is found to be 190-210% higher than that of a diesel-powered fleet. However, intelligent battery charging strategies, smart charging networks, charging algorithms that prolong battery durability, and vehicle-to-grid (V2G) technologies reduce the TCO of electric drives by 8-9%. Sensitivity analyses of the factors with the strongest causalities reveal at which point cost efficiency is attained. In fact, only under the premises of falling manufacturing and battery costs, rising diesel prices, and the adoption of V2G models will electric delivery vehicles become economically viable. Advances in battery technologies and battery size optimization will improve economic efficiency further.

2 - Distribution of deviation distance to alternative fuel stations
Masashi Miyagawa

We derive the distribution of the deviation distance to visit an alternative fuel station. The deviation distance is defined as the sum of the distances from the origin to the station and from the station to the destination. Since refueling demand decreases with the deviation distance, the distribution is useful to estimate the number of vehicles refueled at the station. Distance is measured as the Euclidean distance. Origins and destinations are assumed to be uniformly distributed. Not only the deviation distance but also the vehicle range is significant for alternative fuel vehicles. We therefore also focus on whether the vehicle can make the round trip between origin and destination. The analytical expression for the distribution demonstrates how the vehicle range, the trip length, and the refueling availability at origin and destination affect the deviation distance.
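A minimal sketch of the quantity studied above, assuming planar coordinates and the Euclidean metric; the coordinates and the range value are invented for illustration, and the last line is only a crude stand-in for the round-trip feasibility question analyzed in the talk.

    import math

    def euclid(a, b):
        return math.hypot(a[0] - b[0], a[1] - b[1])

    def deviation_distance(origin, station, destination):
        # Sum of the two legs via the refueling station, as defined above.
        return euclid(origin, station) + euclid(station, destination)

    # Made-up coordinates and vehicle range, purely for illustration.
    O, S, D = (0.0, 0.0), (3.0, 4.0), (10.0, 0.0)
    dev = deviation_distance(O, S, D)
    print(dev)                   # about 13.06
    print(2 * dev <= 30.0)       # crude round-trip check against an assumed range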
3 - Optimal Renewal and Electrification Strategy for Commercial Car Fleets in Germany
Reinhard Madlener, Ricardo Tejada

The present work models the uncertainty of oil, electricity and battery prices in order to find the optimal renewal strategy for transport vehicle fleets in Germany. It presents a comprehensive statistical model of the total operating costs for the usage of vehicles in the transport industry. The model takes into consideration current and future power train technologies, such as internal combustion and electric engines. The framework allows for the calculation of sensitivities of the relevant explanatory variables (fuel price, interest rate, inflation rate, economic life-cycle duration, subsidies, taxing policies, and economic environment). The study also contains the calculation and evaluation of relevant diffusion scenarios for commercially used electric vehicles.

TE-24
Thursday, 15:00-16:30 - AS
Non-Airline Applications of Revenue Management
Stream: Pricing, Revenue Management, and Smart Markets
Invited session
Chair: Jochen Gönsch

1 - Revenue management in a multi-stage ATO production environment
Hendrik Guhlich, Moritz Fleischmann, Raik Stolletz, Lars Moench


In this talk, we consider order acceptance and scheduling decisions in a multi-stage assemble-to-order production system facing stochastic demand. We take a revenue management approach based on bid prices to make these decisions. Revenue management applications in production systems commonly take an aggregated view of capacity. We investigate the appropriateness of this approach. The resulting production plans are evaluated using a detailed simulation model of the production system. We propose an approach that computes bid prices based on a detailed multi-stage production model. Additionally, parameters are updated using data from the simulation.

2 - Controlling the availability of electric vehicles in station-based car sharing systems with round-trips
Natalia Agata Stepien, Isa von Hoesslin, Kerstin Schmidt, Thomas Spengler

We consider the availability control of electric vehicles in station-based car sharing systems (e-car-sharing). In this kind of car sharing system, customers are able to flexibly rent vehicles at rental points that are typically located at easily accessible locations within a metropolitan area. Our focus is on station-based car sharing systems with round-trips, i.e., customers are obliged to return the rented vehicles to the start station. We present a novel approach for the availability control in such systems. The proposed availability control accounts for stochastic demand and the strong temporal interdependencies between acceptance decisions that arise from limited battery capacity and long recharging times. A decomposition approach is presented to approximately solve the model. The result is an acceptance-denial policy that significantly outperforms the first-come-first-served approach currently used in industry.

3 - Automated discount calculation for retail
Alexander Börsch

Discounts are often an important selling point for retail, especially for online shops. If the product assortment of a retailer is very large, then these calculations can cost a lot of resources. Therefore, two algorithms for discount calculation are introduced. The algorithms are based on econometrics. A multi-product model which uses the price elasticity of demand forms the core of the algorithms. To solve the model, regression and dynamic programming are used, amongst others. Generally, data pre-processing is necessary to apply these algorithms. The second part is therefore a discussion of how the data available to a retailer can be processed to meet the requirements of the introduced algorithms and thus allow their application. This includes the consideration of seasonality or the price of a competitor.

4 - Order Acceptance Control in a Regenerate-To-Order Environment with Different Regeneration Modes
Felix Herde

In a regenerate-to-order environment, used products owned by the customers are sent to service providers to restore the products' functionality for another life cycle. Regeneration service providers have to decide whether to accept or to reject incoming regeneration orders, because they have only scarce short-run capacities. The used products have individual conditions which are not known with certainty in advance. Different regeneration modes can be applied in order to regenerate used goods. Besides the possibility of repairing damaged parts of the product, there exists the option to replace those parts by inventory parts. Taking these characteristics into account, we present a bid-price-based approach to capacity control. We develop a randomized linear program with network capacities. The beneficial effect of the flexibility arising from the different modes is shown by numerical studies.
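The first and the last talk in this session both rely on bid-price capacity control. The sketch below shows only the generic acceptance rule; the resource names and numbers are invented, and the actual bid prices in the talks come from much richer models (detailed multi-stage production models and randomized linear programs, respectively).

    def accept(revenue, usage, bid_price):
        """Accept an order if its revenue covers the opportunity cost of the
        capacity it consumes, estimated by resource-level bid prices."""
        opportunity_cost = sum(bid_price[r] * q for r, q in usage.items())
        return revenue >= opportunity_cost

    bid_price = {"disassembly": 40.0, "repair": 25.0, "reassembly": 30.0}  # assumed values
    print(accept(revenue=150.0, usage={"disassembly": 1, "repair": 2},
                 bid_price=bid_price))   # True: 150 >= 40 + 2 * 25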


Friday, 8:15-9:45

FA-02
Friday, 8:15-9:45 - Fo2
Matching
Stream: Discrete and Combinatorial Optimization, Graphs and Networks
Invited session
Chair: Sebastian Meiswinkel

1 - The Duality between Matchings and Vertex Covers in Balanced Hypergraphs
Robert Scheidweiler, Eberhard Triesch

We investigate the class of balanced hypergraphs, a common generalization of bipartite graphs due to Berge. Based on coloring properties of these hypergraphs we present a new min-max theorem for an optimization problem closely connected to matchings and vertex covers. The result generalizes König's Theorem and Hall's Theorem for balanced hypergraphs.

2 - A Hall condition for normal hypergraphs
Isabel Beckenbach, Ralf Borndörfer

We investigate a sufficient and necessary condition for the existence of a perfect matching in normal hypergraphs. The class of normal hypergraphs strictly contains all balanced hypergraphs, for which Conforti et al. proved a Hall-type condition for the existence of a perfect matching. We show that this condition can be generalized to normal hypergraphs by multiplying vertices, and we give a tight upper bound on the number of times a vertex has to be multiplied.

3 - The Partitioning Min-Max Weighted Matching Problem
Sebastian Meiswinkel, Dominik Kress, Erwin Pesch

We introduce and analyze the Partitioning Min-Max Weighted Matching (PMMWM) Problem. PMMWM combines the problem of partitioning a set of vertices into disjoint subsets of restricted size with the strongly NP-hard Min-Max Weighted Matching (MMWM) Problem, which has recently been introduced in the literature. In contrast to PMMWM, the latter problem assumes the partitioning to be given. Potential applications arise in the field of container transshipment in rail-road terminals. We propose a MILP formulation for PMMWM and prove that the problem is NP-hard in the strong sense. Two heuristic frameworks are presented. Both of them outperform standard optimization software. Our extensive computational study shows that the algorithms provide high-quality solutions within reasonable time.

FA-03
Friday, 8:15-9:45 - Fo3
Network Games
Stream: Algorithmic Game Theory
Invited session
Chair: Tobias Harks

1 - Fare Evasion in Transit Networks
Jannik Matuschke, José Correa, Tobias Harks, Vincent Kreuzen

Fare evasion in public transit systems causes significant losses to society. In order to decrease evasion rates and minimize these losses, transportation companies conduct fare inspections to check traveling passengers for a valid ticket. We discuss new models for optimizing the distribution of fare inspections within the network based on bilevel programming. In the first level, the leader (the network operator) determines probabilities for inspecting passengers at different locations, while in the second level, the followers (the fare-evading passengers) respond by optimizing their routes given the inspection probabilities and travel times.

To model the followers' behavior we study both a non-adaptive variant, in which passengers select a path a priori and continue along it throughout their journey, and an adaptive variant, in which they gain information along the way and use it to update their route. For these problems, which are interesting in their own right, we design exact and approximation algorithms. We also prove a tight bound of 3/4 on the ratio of the optimal cost between adaptive and non-adaptive strategies.

For the leader's optimization problem, we study a fixed-fare and a flexible-fare variant, where ticket prices may or may not be set at the operator's will. For the latter variant, we design an LP-based approximation algorithm. For all variants of the problem, we devise a local search procedure that shifts inspection probabilities within an initially determined support set. We finally present the results of an extensive computational study on instances of the Dutch railway and the Amsterdam subway network. This study reveals that our solutions are within 95% of theoretical upper bounds drawn from the LP relaxation.

2 - Quantitative Comparative Statics for a Multimarket Paradox
Tobias Harks, Philipp von Falkenhausen

Comparative statics is a well-established research field where one analyzes how marginal changes in the parameters of a strategic game affect the resulting equilibria. While classic comparative statics is mainly concerned with qualitative approaches, we aim at quantifying the possible extent of such an effect. We apply our quantitative approach to a multimarket oligopoly model and consider price shocks as the parameter change. We quantify the worst-case profit reduction for multimarket oligopolies with an arbitrary number of markets exhibiting arbitrary positive price shocks. For markets with affine price functions and firms with convex cost technologies, we show that the relative loss of any firm is at most 25%, no matter how many firms compete in the oligopoly. We further investigate the impact of positive price shocks on the total profit of all firms as well as on consumer surplus. We find tight bounds also for these measures, showing that total profit and consumer surplus decrease by at most 25% and 16.6%, respectively.

3 - Resource Competition on Integral Polymatroids
Britta Peis, Tobias Harks, Max Klimm

We study competitive resource allocation problems in which a set of players distribute their demands integrally on a set of resources subject to player-specific submodular capacity constraints. Each player has to pay for each unit of demand a cost that is a nondecreasing and convex function of the total allocation of that resource. This general model of resource allocation generalizes both singleton congestion games with integer-splittable demands and matroid congestion games with player-specific costs. As our main result, we give an algorithm computing a pure Nash equilibrium. The proof rests on a structural result on the sensitivity of optimal solutions minimizing some linear objective over an integral polymatroid base polyhedron, which is of independent interest.


FA-04
Friday, 8:15-9:45 - Fo4
Applications in Scheduling
Stream: Project Management and Scheduling
Invited session
Chair: Thomas Vossen

1 - High Multiplicity Scheduling with Switching Costs for few Products
Tim Oosterwijk, Michaël Gabay, Alexander Grigoriev, Vincent Kreuzen

We study several variants of the single machine capacitated lot sizing problem with sequence-dependent setup costs and product-dependent inventory costs. Here we are given one machine and k types of products that need to be scheduled, each associated with a constant demand rate, a production rate p(i) and inventory costs per unit. When the machine switches from producing product i to product j, setup costs c(i,j) are incurred. The goal is to find a schedule such that demand is met at all times and the average per-time-unit costs are minimized.

This can be seen as lifting a conventional scheduling problem to its more general high-multiplicity counterpart, where there are only a few job types, but each with a high multiplicity. This severely increases the complexity of the problem.

We distinguish three cases. In the continuous case the machine can switch products at any time and it can produce at most p(i) units of product i. In the discrete case the machine can only switch products at the end of every unit of time (e.g. a day) and it can produce at most p(i) units of product i. In the fixed case the machine can only switch products at the end of every unit of time, and if it produces product i during that unit of time, it has to produce exactly p(i) units.

We characterize feasible instances and solve the three cases for k=1 product, where the fixed case is already non-trivial. We prove that the decision variants of these cases are in P and we provide an algorithm which outputs a polynomial-sized representation of an optimal schedule. We also solve the continuous k=2 case by proving results on the structure of the schedule. Future work includes the discrete and fixed cases with k=2 and the problem with any fixed value of k.
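For intuition on the average per-time-unit cost objective, the single-product continuous setting is closely related to the classical economic production quantity trade-off (notation not taken from the talk: c the setup cost per production run, h the inventory cost per unit and time unit, d the demand rate, p > d the production rate, T the cycle length):

    \bar{c}(T) = \frac{c}{T} + \frac{1}{2}\, h\, d \left(1 - \frac{d}{p}\right) T,
    \qquad T^{*} = \sqrt{\frac{2c}{h\, d\, (1 - d/p)}} .

The variants studied in the talk generalize this trade-off to several products with sequence-dependent switching costs, which is where the high-multiplicity difficulty arises.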
2 - Hierarchical Benders Decomposition for Open-pit Mine Block Sequencing
Thomas Vossen

The open-pit mine block sequencing problem (OPBS) models a deposit of ore lying near the earth's surface as a three-dimensional grid of blocks. A solution, in discretized time, identifies a profit-maximizing extraction (mining) schedule for the blocks. Our model variant, a mixed-integer program (MIP), presumes a predetermined destination for each extracted block, namely, processing plant or waste dump. The MIP incorporates standard constructs, but also (i) adds not-so-standard lower bounds on resource consumption in each time period, and (ii) allows fractional block extraction in a novel fashion while still enforcing pit-wall slope restrictions. A new, flexible "hierarchical Benders decomposition" extends nested Benders decomposition to solve the MIP's linear-programming relaxation. Adding constraints aggregated across time to the decomposition's subproblems reduces solution times dramatically. A specialized branch-and-bound heuristic then produces high-quality integer solutions. Medium-sized problems (e.g., 25,000 blocks and 20 time periods) solve to near optimality in minutes. We believe these computational results are the best known for instances of OPBS that enforce lower bounds on resource consumption.

3 - Hybridization of a CSP with a genetic algorithm for energy-aware workflow scheduling in a cloud computing environment
Khaled Sellami, Pierre F Tiako, Rabah Kassa

In this paper, we investigate the problem of scheduling workflow applications on cloud computing infrastructures. Cloud workflow scheduling is a complex optimization problem which requires considering various scheduling criteria. Traditional research mainly focuses on optimizing time and cost without paying much attention to energy consumption. We propose a new approach based on the hybridization of a CSP with a genetic algorithm heuristic to optimize the scheduling performance by (a) formulating a model for task-resource mapping to minimize the overall energy consumption using the dynamic voltage scaling (DVS) technique, and (b) designing a heuristic that uses the hybridization of a CSP with a genetic algorithm to solve the task-resource mapping based on the proposed model. Our approach is validated by simulating a complex workflow application.

FA-05
Friday, 8:15-9:45 - Fo5
Optimization in Engineering Science
Stream: Discrete and Combinatorial Optimization, Graphs and Networks
Invited session
Chair: Ulf Lorenz

1 - Designing a Feedback Control System via Mixed-Integer Programming
Thorsten Ederer, Philipp Pöttgen, Lena Altherr, Ulf Lorenz, Peter Pelz

Analytical or experimental methods can only find an optimal control strategy for technical systems with a fixed setup. To find the overall optimum, it is necessary to find the optimal control strategy for a wide variety of setups, not all of which might be obvious even to an experienced designer. We present an approach that allows one to find the optimal topology with an optimal feedback control system as a whole. This mixed-integer formulation considers and compares numerous discrete topological and continuous control options.

One example of a system with different topological and control strategy options is a filling level application: pumps can either be used or not, and they can be connected in series or in parallel. To attain different filling levels, the rotational speed of the used pumps has to be controlled. The optimum needs to balance energy efficiency against short actuation times between the load cases.

We developed an abstract model of the filling level application as a control circuit with optional elements. To accurately optimize a feedback control system one needs to account for the time-dependent behavior of its components: P (proportional), I (integration) and D (derivation), PT1, PT2 (delay of first or second order) or PTt (dead-time). We discretized the time dependence and obtained a mixed-integer formulation based on a time-expanded flow network. This allows one to include combinatorial decisions such as variation of the network topology. We are able to appraise feasible solutions using the global optimality gap.
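To make the time discretization mentioned above concrete, a standard discrete-time version of the P, I and D elements is (notation not taken from the talk)

    u_k = K_P\, e_k + K_I\, \Delta t \sum_{j \le k} e_j + K_D\, \frac{e_k - e_{k-1}}{\Delta t},

where e_k is the control error in time step k. Relations of this kind are linear in the states and can therefore be embedded as constraints of a time-expanded mixed-integer program, with binary variables switching individual components on or off.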

2 - Experimental Validation of an Enhanced System Synthesis Approach
Lena Altherr, Thorsten Ederer, Ulf Lorenz, Peter Pelz, Philipp Pöttgen

System synthesis is the process of finding a combination of various components such that the resulting technical system fulfills a given purpose. Typically, the workflow is divided into two consecutive stages: first, a set-up is found by an experienced engineer or by heuristic methods; second, optimization techniques are used to compute an optimal usage strategy. This usually results in an optimal operation of a suboptimal system topology.

In contrast, we apply Operations Research methods to find an optimal solution for both stages simultaneously, i.e. we find the best topology to enable the best possible usage. The composite solution is an energy-optimal or cost-optimal system.

We test our approach with a practical test case: a booster station which is used to guarantee the water supply in multistory buildings. The system essentially consists of a combination of pumps and pressure accumulators. For a given flow and pressure demand, we are able to synthesize the best booster station, i.e. we find an optimal combination of available components and an optimal control for the used components.

With this example, we address a ubiquitous problem of Operations Research methods: despite being able to find a provably optimal solution to a model, the modeling error often cannot be quantified. We have validated the quality of our test case model with an experimental set-up. To this end, we consider several load cases and compare our computational results with measurements on a test rig.

3 - Building Nominations for Real World Gas Transportation Problems
Claudia Stangl, Benjamin Hiller, Robert Schwarz

Checking the feasibility of bookings belongs to the key tasks in gas pipeline operation. The customer orders a booking, that is, a maximal in- or output of gas, at a node of the underlying gas network. The gas transportation company has to decide whether to agree to the booking or not. In its most basic form, the company has to be able to send all balanced nominations within the bookings on the exits and entries through the network. A vector of gas input at entries and output at exits, together with allowed pressure intervals for all nodes, is called a nomination. Due to special agreements with customers it is possible that the nomination consists of power intervals at some exits or entries. In this talk a method is presented to generate nominations for given bookings in order to decide whether the booking is feasible or not.


FA-06
Friday, 8:15-9:45 - Fo6
Planning of Energy Availability
Stream: Energy and Environment
Invited session
Chair: Anja Ohsenbruegge

1 - Dynamic Strategies for Amount and Reliability of Control Reserve in Future Smart Grids
Anja Ohsenbruegge

The nuclear phase-out and the increased share of renewable distributed generation imply new challenges for transforming the electric power system into an environmentally sustainable, reliable and cost-efficient system. In particular, the process of control reserve and balancing power to cover power plant or prognosis faults has to be adapted to today's complex decentralized structure. On the one hand, the main influencing parameters have changed and are no longer statistically uncorrelated; on the other hand, the base value given by the loss of load probability (LOLP) originates from power plant blackouts, which implies that the provided reserve power is oversized in the majority of cases. Whereas in the past the activation of reserve energy was caused by random failures, today the demand and activation are related to the actual state of generation and supply, the current network characteristics and also generation and load forecasts. Even the influence of business management reasons could be assessed in the recent past. The aim of this work is to learn these dependencies (patterns) with methods of computational intelligence (k-nearest neighbours) and then use the trained model for predicting future demand. For this purpose, a dynamic method with various bidding periods and various tendering quantities is to be designed and evaluated. The design should no longer be based exclusively on ex-post data but should also factor in ex-ante values like wind, solar or load prognoses. Furthermore, it is examined how a-priori data can reduce the present static provision of operating reserve.

2 - Optimising energy procurement for small and medium-sized enterprises
Nadine Kumbartzky, Brigitte Werners

Increasing energy prices present a challenging task for small and medium-sized enterprises (SMEs). Since the availability of energy plays a crucial role in running the day-to-day business, this paper focuses on the energy procurement of SMEs. Due to limited financial and human resources, SMEs find it difficult to implement measures to reduce energy costs. It is shown that a sustainable reduction of energy purchase prices can be achieved by choosing an appropriate procurement strategy. We develop a quantitative optimisation model that takes into account the specific needs of SMEs. The aim of the model is to minimise energy purchase costs while assuring that demand is fulfilled at all times. Uncertainty regarding future energy prices and consumption is modelled by considering different scenarios for the evolution of the uncertain parameters. A minimax regret approach is used to determine a robust selection of purchase contracts. For strategic decision support, a robust optimisation model is particularly well suited to choose a risk-averse procurement strategy. In an exemplary case study, different procurement strategies are compared. Computational results show that a structured energy procurement concept has a high potential to significantly reduce energy costs if SMEs are willing to take over volume and price risk.

3 - Unit Commitment by Column Generation
Takayuki Shiina, Takahiro Yurugi, Susumu Morito, Jun Imaizumi

The unit commitment problem is to determine the schedule of power generating units and the generating level of each unit. The decisions involve which units to commit at each time period and at what level to generate power to meet the electricity demand. The objective is to minimize the operational cost, which is given by the sum of the fuel cost and the start-up cost. The problem is a typical scheduling problem in an electric power system. It is formulated as a multi-stage nonlinear integer programming problem because the fuel cost function is assumed to be a convex quadratic function. In this paper, we propose a new algorithm that is based on the Dantzig-Wolfe reformulation and column generation to solve the unit commitment problem. The column generation method has not been applied frequently to the unit commitment problem. Several applications of column generation, such as crew scheduling or other scheduling problems, suggest that the column generation approach is encouraging. The only research that uses a column generation technique is a paper by Shiina and Birge (2004). They used a column generation approach in which each column corresponds to a start-stop schedule and output level. Since power output is a continuous quantity, it takes time to generate the required columns efficiently. In our new approach, the problem to be solved is not a simple set partitioning problem, because the columns generated contain only a schedule specified by 0-1 values. We present a new solution algorithm based on column generation. It is shown that the new approach is effective in solving the problem.
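Schematically, the cost structure described above leads to an objective of the following form (notation not taken from the talk), with u_{it} in {0,1} the commitment of unit i in period t, y_{it} its start-up indicator and p_{it} its output:

    \min \sum_{t=1}^{T} \sum_{i=1}^{N} \left( a_i p_{it}^2 + b_i p_{it} + c_i u_{it} + s_i y_{it} \right)
    \quad \text{s.t.} \quad \sum_{i=1}^{N} p_{it} \ge D_t \ \ \forall t .

In the column generation approach, as described above, each generated column encodes only a 0-1 commitment schedule of one unit, while the continuous output levels are handled separately.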
FA-07
Friday, 8:15-9:45 - Fo7
Packing
Stream: Discrete and Combinatorial Optimization, Graphs and Networks
Invited session
Chair: Torsten Buchwald

1 - Multidimensional Strip Packing with partially non-overlapping intervals
Thomas Rieger, Uwe T. Zimmermann

We consider a generalization of the multidimensional Strip Packing Problem (SPP). In a standard d-dimensional SPP (SPP-d), a given set of boxes has to be scheduled within a strip with fixed sizes in the first d-1 dimensions and variable size in dimension d, without rotating any box. The latter size is commonly denoted as the height of the strip and has to be minimized such that the boxes can still be packed pairwise disjointly within the strip.

In a packing that is feasible with respect to SPP-d, each scheduled box is represented as a product of d one-dimensional intervals of corresponding width. Additionally, for a given subset S of dimensions, these intervals are required to be non-overlapping in the problem considered in this talk. We also introduce some variants with further special constraints. For example, we add constraints related to gravity or related to the orientation of boxes with respect to S.

For all considered variants we derive the computational complexity, we introduce preprocessing methods based on conservative scales, i.e. a modification of the widths of the boxes, and we present exact solution methods. For the latter, we use the concept of packing classes well known for the standard SPP-d. Finally, we discuss some computational results for our implementation of the exact solution methods.

2 - New inequalities for one-dimensional relaxations of the two-dimensional strip packing problem
Isabel Friedow, Guntram Scheithauer

We investigate a heuristic for the two-dimensional rectangular strip packing problem (2DSPP) that constructs a feasible two-dimensional packing by placing one-dimensional cutting patterns obtained by solving the horizontal one-dimensional bar relaxation (1DHBR). To represent a solution of the 2DSPP, a 1DHBR solution has to satisfy, among others, the vertical contiguity condition. That means that there must exist an ordering of the cutting patterns such that all items representing one rectangle are located in consecutive patterns. To strengthen the 1DHBR with respect to that vertical contiguity, new inequalities are formulated and numerically analyzed.

3 - Upper Bounds for Heuristic Approaches to the Strip Packing Problem
Torsten Buchwald, Guntram Scheithauer

We present an algorithm for the two-dimensional SPP that improves the packing of the FFDH heuristic, and we state theoretical results for this algorithm. We also present an implementation of the FFDH heuristic for the three-dimensional case, which is used to construct a new algorithm with an absolute performance ratio of at most 5. Based on this algorithm, we prove a general upper bound for the optimal height, which depends on the continuous lower bound and the maximum height lower bound, and show that the combination of both lower bounds also has an absolute worst-case performance ratio of at most 5.
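For reference, the plain textbook FFDH (first-fit decreasing height) heuristic that the last talk builds on can be sketched as follows; the talk's contribution is an algorithm that improves on this packing and its worst-case analysis, not this baseline itself.

    def ffdh(rectangles, strip_width):
        """First-fit decreasing height for the 2D strip packing problem.
        rectangles: list of (width, height) pairs; returns the packing height."""
        levels = []        # each level: [y coordinate, level height, used width]
        height = 0.0
        for w, h in sorted(rectangles, key=lambda r: r[1], reverse=True):
            for level in levels:
                if level[2] + w <= strip_width:   # first level with enough room
                    level[2] += w
                    break
            else:                                 # no level fits: open a new one on top
                levels.append([height, h, w])
                height += h
        return height

    print(ffdh([(3, 4), (2, 3), (4, 2), (1, 1)], strip_width=5))   # 6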
FA-08
Friday, 8:15-9:45 - Fo8
Mechanism Design II
Stream: Algorithmic Game Theory
Invited session
Chair: Marlis Bärthel


1 - Effects of Profit-Taxation in Matrix Games
Marlis Bärthel

There is consensus in different fields of practical relevance that the introduction of taxes might somehow affect the playing behavior of actors. Think of, e.g., a financial transaction tax, taxes on betting or poker platforms, winning taxes in casinos, etc. However, it is not that clear what the effects might really look like. Here, a game-theoretic model is considered that concentrates on taxation effects on the transferred monetary volume. For matrix games it is asked: how do taxes on profits change the behavior of players and the expected transacted volume? Analyzing this basic research model clearly shows that one has to be careful in considering taxes as a panacea to confine aggressive playing behavior: taxes might encourage increased expected transfers!

2 - Optimal Effort in Sports Tournaments
Christian Doegen

Modern professional sports tournaments have grown into a huge economic factor. Sports contests consist of many games, usually between two players. The outcomes of these pairwise comparisons are, with respect to the rules, relevant for the quantitative and qualitative results of the whole tournament. Models for sports contests are also applicable to multilevel economic competitions. The outcome of a game depends on the effort of both players as well as a random component. The probability of a win increases with higher own effort and decreases with higher opponent's effort. This effort fluctuates over the season. Besides random-based variations, the players can control their effort inputs in a strategic way. After some games with the highest input, there is no chance to stay on this level. A wise and sustainable input of the player's effort resources is necessary. The individual aims and the different opponents' strengths influence the optimal effort input of a player. If all players are optimizing their inputs, there are huge effects on the tournament itself. Economic factors, especially a permanently high audience interest, play a major role. A model describing the input of effort resources in sports tournaments with pairwise comparisons and under certain real-world conditions is developed. Main issues are the analysis of the optimal input from the player's point of view and the input allocation of all players in a game-theoretical approach. Existence, uniqueness and special properties of the resulting Nash equilibrium are discussed. Further issues are the effects on the tournament as a whole and the applicability to reality, including specific real-world phenomena.

3 - Design and evaluation of algorithmic mechanisms for hard multilateral multi-issue negotiation problems
Andreas Fink, Jörg Homberger

We consider intertwined optimization problems with multiple autonomous decision makers, which are represented by corresponding agents within a strategic environment. These agents deal with a common set of feasible solutions (contracts). This combined solution space (contract space) is complex since, firstly, even for a single agent the calculation of an optimal solution may constitute an NP-hard problem, and, secondly, one has to cope with self-interested agents with conflicting goals and private information. The problem is to mutually determine a commonly accepted solution as the eventual contract, which should account for all agents' individual goals to some degree. Such kinds of coordination problems impose restrictions on the design of possible solution mechanisms, which may be regarded as negotiation procedures. In particular, one cannot presuppose that the involved parties truthfully disclose private information (i.e., their preferences) and honestly observe any conceivable rule. On the contrary, the applicability and effectiveness of a solution mechanism depends on a sensible design of verifiable negotiation rules under consideration of the incentives of the involved agents. After reviewing research on negotiation-based search mechanisms which address these concerns, we adapt and extend general negotiation procedures regarding i) how new contract proposals are generated, ii) which questions are directed at the involved agents, and iii) how iterative decisions are conducted to advance the search process. These concepts are experimentally evaluated for multi-agent sequencing problems under consideration of the incentives of self-interested agents.

FA-09
Friday, 8:15-9:45 - SFo1
Nonlinear Optimization II
Stream: Continuous and Non-linear Optimization
Invited session
Chair: Lena Michailidis

1 - New developments in the evaluation of Jacobian matrices on grids
Martin Bücker

In this talk we consider nonlinear continuous functions that are defined on structured grids. To evaluate the Jacobian matrix of such functions, techniques based on graph models and their discrete optimization are commonly used. The talk sketches new techniques from this area.

2 - Intersection of calm multifunctions and its application to the argmin mapping
Diethard Klatte

In this talk, we first recall a basic theorem on the calmness of the intersection of calm multifunctions, cf. Klatte, Kummer: Nonsmooth Equations in Optimization, Kluwer 2002. Then we show how to apply this to calmness conditions for the optimal solution set mapping of a (finite and semi-infinite) optimization problem under data perturbations. A main result says that the argmin mapping of a convex semi-infinite program is calm if an auxiliary inequality system has this property, while the opposite direction may fail in the general case. The results were obtained in collaboration with Bernd Kummer (Humboldt University Berlin).
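For reference, the standard calmness notion used above is the following (notation not taken from the talk): a multifunction S from R^m to the subsets of R^n is calm at (\bar{p}, \bar{x}) in gph S if there exist a constant L >= 0 and neighborhoods U of \bar{x} and V of \bar{p} such that

    S(p) \cap U \subseteq S(\bar{p}) + L \, \| p - \bar{p} \| \, \mathbb{B} \qquad \text{for all } p \in V,

where \mathbb{B} denotes the closed unit ball.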
A model describing Precise Force Predictions in Hot Rolling of Steel the input of effort resources in sports tournaments with pairwise com- Johannes Lohmar, Markus Bambach parisons and under certain real world conditions is developed. Main issues are the analysis of the optimal input from the player’s point of Hot rolling is the most important metal forming process in terms of view and the input allocation of all players in a game theoretical ap- production capacity and demand. In this process slabs of cast metal proach. Existence, uniqueness and special properties of the resulting are heated to temperatures up to 1250 degrees C and then formed into Nash equilibrium are discussed. Further issues are the effects on the metal sheets by reducing their thickness and at the same time increas- tournament as a whole and the applicability to the reality including real ing their length in several steps or passes. In each pass the slab is de- specific phenomena. formed via compression between two driven work rolls in a roll stand. To design the schedules for rolling processes the force during each rolling pass is of crucial importance. Typically simplified constitu- 3 - Design and evaluation of algorithmic mechanisms tive modelling equations are used to predict those forces. For precise for hard multilateral multi-issue negotiation prob- predictions the equations use typically around 15 material dependent parameters. These parameters are conventionally determined in labo- lems ratory scale tests with high efforts and costs. In this paper a concept is Andreas Fink, Jörg Homberger detailed that enables the determination of the material model parame- ters via inverse modelling directly from industrially measured process data. The basic idea is to use the deviation between measured and We consider intertwined optimization problems with multiple au- modelled forces as a quality indicator for the material model parame- tonomous decision makers, which are represented by corresponding ter precision. To minimize the force deviations the simplified rolling agents within a strategic environment. These agents deal with a com- model is embedded into a non-linear optimization loop where the ma- mon set of feasible solutions (contracts). This combined solution space terial model parameters serve as input and the objective function is (contract space) is complex since, firstly, even for a single agent the defined as the least squares deviation between measured and modelled calculation of an optimal solution may constitute an NP-hard prob- force. After successful optimization the resulting parameter set en- lem, and, secondly, one has to cope with self-interested agents with ables the prediction of roll forces in industrial rolling processes with conflicting goals and private information. The problem is to mutu- high accuracy without the need for additional material testing. Using ally determine a commonly accepted solution as the eventual contract these forces it is then possible to lay out optimal process schedules. which should account for all agents’ individual goals to some degree. Such kinds of coordination problems impose restrictions on the de- 4 - Kinetic Models for Assembly Lines in Automotive In- sign of possible solution mechanisms, which may be regarded as ne- dustries gotiation procedures. In particular, one cannot presuppose that the in- Lena Michailidis volved parties truthfully disclose private information (i.e., their prefer- ences) and honestly observe any conceivable rule. 
On the contrary, the Part feeding processes at automotive assembly plants deal with the applicability and effectiveness of a solution mechanism depends on timely supply of parts to designated stations at the assembly line. This a sensible design of verifiable negotiations rules under consideration is accomplished by means of an internal shuttle system which supplies of the incentives of the involved agents. After reviewing research on various stations with needed parts. To specify the production processes negotiation-based search mechanisms which address these concerns, by models based on partial differential equations, kinetic equations are we adapt and extend general negotiation procedures regarding i) how derived to model the production flow on assembly lines. Of interest new contract proposals are generated, ii) which questions are directed is a mathematical description able to be used for long-term planning at the involved agents, and iii) how iterative decisions are conducted to of e.g. workforce capacities, storage capacities and workload predic- advance the search process. These concepts are experimentally eval- tions. Due to the high volume assembly lines a discrete mathematical uated for multi-agent sequencing problems under consideration of the model as given by discrete event simulators is challenging to compute incentives of self-interested agents. for large time periods. Compared to contemporary literature the under- lying particle dynamic is more complicated due to different hyperbolic closure relations to derive the macroscopic hyperbolic models. The
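The macroscopic models mentioned in the last abstract are hyperbolic transport equations; as a generic, hypothetical illustration (not the authors' model), a one-dimensional linear transport equation can be stepped forward with a first-order upwind scheme as follows.

```python
# Illustrative sketch: explicit upwind step for d_t rho + v d_x rho = 0,
# with constant velocity v > 0 and hypothetical grid and initial data.
def upwind_step(rho, v, dx, dt):
    """One upwind update; stability requires v*dt/dx <= 1.
    The index wrap rho[i-1] at i=0 gives periodic boundary conditions."""
    c = v * dt / dx
    return [rho[i] - c * (rho[i] - rho[i - 1]) for i in range(len(rho))]

# toy run: a block of parts moving along a line of 20 cells
rho = [1.0 if 3 <= i <= 6 else 0.0 for i in range(20)]
for _ in range(10):
    rho = upwind_step(rho, v=1.0, dx=1.0, dt=0.5)
print([round(x, 2) for x in rho])
```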


FA-10
Friday, 8:15-9:45 - SFo2

Agent-based Simulation and Experimental Economics

Stream: Energy and Environment
Invited session
Chair: Michael H. Breitner

1 - The effect of competition on environmental behavior - Evidence from the lab
Daniel C. Pithan, Heike Schenk-Mathes

Since the seminal work of Rosen/Lazear (1981), many studies on tournaments have been conducted. We refer to the extension of Gilpatric (2005) that allows agents in tournaments to cheat (and thus to possibly gain higher payoffs) and implements incomplete monitoring. For the agents the dominant strategy is to cheat if the audit probability falls below a certain value and not to cheat if the audit probability is higher than another value. Based on the work of Evans (2008), who experimentally investigates how agents react to imperfect monitoring, we create an environmental framework for participants in a lab experiment: cheating is called "do not undertake necessary environmental protection measurements". Participants are informed that doing so will harm other participants in the lab by reducing their payoffs. One third of the participants acts as principals and chooses the audit probability, while the other participants are agents in a tournament and decide whether to cheat after they are informed about their principals' decisions. We create another experiment without the tournament structure. While keeping the eco-framework and having financial incentives as identical as possible, we just remove the tournament. Agents still decide whether to cheat given an audit probability, but they do not act in a competitive environment as in the tournament. We find that principals choose significantly higher audit probabilities in the absence of the tournament structure. Furthermore, for given audit probabilities, agents decide to cheat less without the tournament. It seems that within a competitive structure such as tournaments agents respect the environment less. In addition, principals monitor more intensively, leading to smaller payoffs.

2 - An agent-based model for investment decision in electricity markets
Andreas Bublitz, Dogan Keles, Massimo Genoese, Wolf Fichtner

Investment decisions in electricity markets are complex problems. With technical lifetimes of power plants that extend over 30 years, long time periods have to be regarded when determining the value of an investment. Naturally, this value is subject to uncertainty — which is one of the characteristics of a real investment. Besides uncertainty, two other aspects characterize real investments: investments are not or only partially reversible, and the time for an investment decision is to some extent flexible. Literature suggests treating real investments like American-style call options, whose value can be determined via the Black-Scholes model. In the field of energy economics there exists a broad application of real-option models, e.g. Fleten (2007) or Kumbaroglu (2008). While many models focus on stochastic processes such as Brownian motions to model the electricity prices, in this paper an approach is presented where hourly prices are forecasted based on existing power plants and expected future investments. To analyze investments in electricity markets, an agent-based model for the German market area is chosen, where the generation companies, represented by supply agents, determine individually if and when to build a new power plant. Each supply agent forecasts fundamentally based hourly day-ahead market prices for the year a new power plant could be operated for the first time. To account for uncertainties underlying the investment such as fuel prices, a recombining tree for each investment option is created. Each node of the tree contains for each year the expected hourly profit margins based on the volatility of the margins of past years.

3 - Steering of Small Scale Electricity Consumer and Producer Behavior using Regional Electricity Markets — A Quantitative Model
Michael H. Breitner, Sören Christian Meyer

The progress of the transformation of the German energy system is connected with challenges in the mobility, warmth and electricity sector. Especially the volatile production of renewable electricity from windmills and photovoltaic systems leads to new obstacles regarding grid stability. The two obvious alternatives are balancing electricity supply and demand with storage and conversion technologies or compensating local peaks with an expanded grid. Both options lead to high investments and cause rising electricity prices. This, and the environmental destruction of grid expansion, lower the acceptance of additional renewable energy generation capacities. This impact can be reduced through a better geographical and temporal link between electricity demand and renewable energy generation. This objective can be achieved by regional electricity markets as a steering mechanism between independent producing and consuming agents. The demand side of this regional market consists of perfectly price-inelastic agents (e.g. TV, illumination) and price-elastic agents (e.g. electric vehicle, wash-dryer, heat generation and battery). Likewise, the supply side consists of perfectly inelastic agents (e.g. photovoltaic systems and windmills) and elastic agents (e.g. cogeneration units, batteries). This paper describes a model of rational agent behavior based on marginal costs and game theory. The presented approach leads to a market equilibrium suited to the challenge of grid stability. The presented research allows to develop and evaluate business models for (partial) off-grid systems and the prediction of economically rational agent behavior within them.

FA-11
Friday, 8:15-9:45 - SFo3

Group Decision Making and Cooperation

Stream: Decision Theory and Multi-Criteria Optimization
Invited session
Chair: Erdem Aksakal

1 - A hybrid Delphi and aggregation-disaggregation procedure for group decision-making: algorithms, operators and metrics
Andrej Bregar

In past research work, we have introduced a generic iterative hybrid procedure for quantitative group multi-criteria decision analysis that consolidates an autonomous aggregation-disaggregation mechanism with a moderated Delphi process. By applying a universal framework for the assessment of group decision-making methods and systems, we have shown that the aggregation-disaggregation analysis and the Delphi method can be synergistically combined although they are based on different core principles. In this paper, we operationalize all steps of the hybrid procedure by defining appropriate algorithms, operators and metrics. We focus on initial preference specification, preference aggregation, analysis of (dis)agreements, sensitivity/robustness analysis, communication, adjustment of holistic decisions and preferential parameters, preference disaggregation, and relaxation of constraints. Introduced operators and metrics assess the majority opinion, compute deviations, determine the direction of the group, identify the most discordant decision-makers, and determine the robustness of opinions. Proposed algorithms adjust preferential parameters of the most opposing group members with the purpose to iteratively, convergently and efficiently unify opinions. Derived algorithms, operators and metrics are applied to both most relevant decision-making problematics: ranking of alternatives and sorting of alternatives into arbitrarily many ordered categories.
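The operators and metrics named in the abstract above are not specified there; as one hypothetical example of the kind of computation involved, a majority opinion can be taken as the per-criterion median of elicited weights and each member's discordance as the distance from it. Names and data below are purely illustrative.

```python
# Illustrative sketch: majority opinion as per-criterion medians and an
# L1 discordance metric over hypothetical criterion weights of three DMs.
from statistics import median

weights = {                       # decision maker -> criterion weights
    "DM1": [0.5, 0.3, 0.2],
    "DM2": [0.2, 0.5, 0.3],
    "DM3": [0.3, 0.2, 0.5],
}

majority = [median([w[i] for w in weights.values()]) for i in range(3)]

def deviation(w, ref):
    """L1 distance of one decision maker's weights from the group opinion."""
    return sum(abs(a - b) for a, b in zip(w, ref))

discordance = {dm: deviation(w, majority) for dm, w in weights.items()}
most_discordant = max(discordance, key=discordance.get)
print(majority, discordance, most_discordant)
```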


2 - Multicriteria group choice using the majority preference relation based on cone individual preference relations
Alexey Zakharov

A multicriteria group choice problem is considered. It includes: a set of feasible decisions; a vector criterion reflecting general goals of the group of decision makers (DMs); and asymmetric binary relations of the decision makers, which reflect individual preferences. Individual preferences are given by "quanta" of information, which indicate a compromise between two components of the vector criterion: for the sake of getting profit on one component of the criteria (more important), the DM is ready to lose some value on another component (less important). Each DM's preference relation is a cone relation, characterized by a convex pointed cone which contains the nonnegative orthant and does not contain the origin. The majority preference relation is considered: one decision majority-dominates another if for at least half of the group of DMs the first decision is preferred to the other (by the DM's preference relation). It is proved that such a majority relation is a cone relation, and its cone, in general, is not convex. The property of convexity is equivalent to the transitivity of the corresponding relation. The goal is to construct a convex part of the majority preference relation cone, which gives the transitive part of this relation. The case of a group of three DMs and three components of the criteria is considered. The following information is given: 1) for DM1 the first criterion is more important than the second, and the second criterion is more important than the third; 2) for DM2 the second is more important than the third, and the third is more important than the first; 3) for DM3 the third is more important than the first, and the first is more important than the second. It is shown how to specify the convex part of the majority preference relation cone, and how to construct the set of nondominated vectors.

3 - The risk factors analysis that can be faced in the logistics sector with the DEMATEL method
Erdem Aksakal, Metin Dagdeviren

The logistics industry, which continues to grow every day in a changing and evolving world economy, serves people's expectations and requests, such as providing products at any time. While performing its activities under the effects of globalization, it is faced with various risks. Risk, generally perceived as a negative concept, and the way it is resolved are of paramount importance for companies; incorporating human factors is also important for reducing risks in the logistics industry. In this study, the risks that logistics sector firms may encounter are analyzed with one of the multi-criteria decision-making techniques, the Decision Making Trial and Evaluation Laboratory (DEMATEL) method. The method considers four main criteria (Financial Risks, Strategic Risks, Physical Risks, Operational Risks) and twelve sub-criteria (Liquidity Management, Contracts, Credibility Management, Political Risks, Customer Satisfaction, E-commerce Logistics, Weather Conditions & Climate, Cargo Security, Dependence on Key Area, Fleet Management, Employee Suitability, Asset Utilization and Management). Finally, the results obtained with the method are presented.

FA-12
Friday, 8:15-9:45 - SFo4

Project Scheduling: Stochastic And Game Theoretic Aspects

Stream: Project Management and Scheduling
Invited session
Chair: Erik Demeulemeester

1 - How to abolish the barriers to entry in Public-Private Partnership bidding: A game-theoretical assessment of governmental policies
Dennis De Clerck, Erik Demeulemeester

Due to the extensive timeframes and the transfer of risk towards the preferred special purpose vehicle, there is more value at stake in public-private partnership projects in relation to traditional public procurement projects. It is in the social interest that public entities select consortia capable of performing the project with outstanding quality, yet at a reasonable price. Usually, detailed and costly project proposals are required from the bidding consortia. These high bidding costs are often seen as an inhibitor for contractors to enter the playing field, and governments are currently seeking mechanisms to increase competition. This paper models the procurement process in a bi-level experimental bidding setting with discrepancies in the bidders' experience levels. A contractor's strategy is composed of the pre-tender investment willingness and the targeted mark-up. An approximation algorithm for the determination of the bidding equilibrium at the contractors' level is presented and relies on the best response heuristic. At the upper level, the impact of common governmental policies on the bidding equilibrium is investigated. Firstly, the number of bidders invited for tender is a critical factor for the bidding behavior, and especially inexperienced bidders need additional incentives to penetrate the market. The introduction of a partial government contribution in the bidding cost is proven to add value. Last but not least, a project pipeline effectively stirs up the enthusiasm of new entrants but results mostly in lower costs for the government and not necessarily in higher-quality proposal documents.

2 - A branch and bound algorithm for the chance-constrained RCPSP with stochastic activity durations
Patricio Lamas, Erik Demeulemeester

In reality, projects are executed under high levels of uncertainty. Empirical and theoretical research shows that the consideration of such a factor is crucial for achieving a successful project implementation. We consider an extension of the resource-constrained project scheduling problem (RCPSP), where the durations of the activities are stochastic variables with a known probabilistic distribution and both the precedence and the resource constraints must hold with a given probability. In this paper we present a branch and bound (b&b) algorithm for solving the chance-constrained RCPSP. General chance-constrained programming (CCP) problems are extremely difficult to solve. In this particular case such complexity is amplified by the inherent difficulty of the (deterministic) RCPSP. Typical approaches for solving CCP problems are based on an extended integer programming (IP) formulation, where the constraints that define the set of feasible solutions of the deterministic counterpart problem are slightly modified and extended in order to obtain the new set of feasible solutions. The efficiency of those approaches is determined by the quality of the linear programming (LP) relaxation of the deterministic counterpart problem. The IP formulations for the RCPSP have weak LP relaxations in general. In the literature there are a number of papers that present valid (sometimes facet-defining) inequalities that make the different formulations stronger. However, the efficiency gap (in terms of computation time) between methods based on an IP formulation and ad hoc methods (i.e. not based on LP relaxations) is still important. We developed a b&b method for solving the chance-constrained RCPSP. It is based on bounds obtained by using an efficient ad hoc method for the RCPSP.

3 - New inequalities for solving proactive RCPSP with a chance constraint
Morteza Davari, Erik Demeulemeester, Patricio Lamas

During the last decade, generating proactive baseline schedules for the RCPSP has been considered many times by different authors. Among the different approaches to construct a baseline schedule, we select the chance-constrained programming approach because it has the advantage of being independent from the reactive scheduling approach and shows better performance in comparison with its competitors. Assuming that a set of supporting realizations (scenarios) exists for the random (stochastic) parameters, the proactive chance-constrained RCPSP can be formulated as a MIP. The goal is to minimize the makespan such that the constraints are satisfied for a certain number (total number of scenarios x (1 - the confidence level)) of selected scenarios. Lamas and Demeulemeester (2014) solve this MIP formulation using a branch and cut algorithm. Since solving instances with a large (possibly infinite) number of scenarios may be computationally exhaustive, the total number of scenarios can be reduced using the sample average approximation technique. Selecting a subset of scenarios is equivalent to excluding its complement subset from the set of all scenarios. We construct an undirected graph of conflicting scenarios which cannot be excluded together. Each node represents a scenario and each edge represents a conflict. These conflicts lead to a set of node packing inequalities, some of which are lifted to become stronger. In this paper, a polynomial algorithm is devised to construct the node packing constraints. The results are discussed and compared with those provided in Lamas and Demeulemeester (2014).
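The scenario-based chance constraint used in the two abstracts above can be illustrated with a minimal, hypothetical sketch: a candidate schedule is acceptable if its deadline is violated in at most floor(N * (1 - alpha)) of N sampled scenarios. The scenario data, the deadline and the serial activity structure below are assumptions for illustration only.

```python
# Illustrative sample-average check of a chance constraint over scenarios.
import math
import random

random.seed(1)
N, alpha, deadline = 200, 0.9, 14.0
# each scenario: sampled durations of three serial activities (hypothetical)
scenarios = [[random.uniform(2, 6) for _ in range(3)] for _ in range(N)]

def violations(scens, deadline):
    """Count scenarios whose realized makespan exceeds the deadline."""
    return sum(1 for durs in scens if sum(durs) > deadline)

allowed = math.floor(N * (1 - alpha))       # scenarios that may be violated
ok = violations(scenarios, deadline) <= allowed
print(violations(scenarios, deadline), allowed, ok)
```

In the MIP formulations discussed above, the same counting logic is expressed with one binary "exclusion" variable per scenario, which is where the conflict graph and node packing inequalities come in.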


FA-14
Friday, 8:15-9:45 - SFo10

Operating Room Planning & Scheduling

Stream: Health Care Management
Invited session
Chair: Erik Demeulemeester

1 - Master Surgical Scheduling with stochastic surgery durations
Alexander Kressner, Katja Schimmelpfeng

Operating rooms (OR) are a hospital's most important and expensive resources. Thus hospitals strive to operate ORs at high utilization without jeopardizing patient service. In this context, one of the main challenges is to cope with the natural uncertainty in surgery durations. We consider the problem of scheduling types of elective procedures to ORs over a mid-term planning horizon (Master Surgical Scheduling). The resulting OR-planning model is stochastic and allows to control overtime. We present different linearization approaches for the non-linear base model and indicate further extensions.

2 - How to schedule patients to operating rooms when simulation results suggest FCFS to perform best
Michael Samudra, Erik Demeulemeester, Brecht Cardoen

In most hospitals there are patients who receive surgery later than medically advised. In one of Belgium's largest hospitals, UZ Leuven, this is the case for approximately every third patient. Patients could be served in a timelier manner if the hospital would increase its capacities, i.e., opening a new operating room and hiring the necessary additional personnel. Unfortunately, this is not an option. Another way to improve the amount of patients served in time is to improve the way they are scheduled. Surgeons do their scheduling themselves and do not use algorithms to support their decision. As a consequence, we analyzed scheduling mechanisms that are easy to apply manually. We checked three mechanisms. Firstly, we checked whether it is beneficial to allow different groups of patients to be served one day after it was decided that they need surgery. Secondly, we checked whether it is beneficial to allow some group of patients to be served on a FCFS basis. Thirdly and lastly, we investigated the effects of pushing low-urgency patients more into the future in order to serve high-urgency patients quicker. We will show extensive computational results to demonstrate the impact that combinations of the different scheduling mechanisms have on various patient-related performance measures. Additionally, we also identify those combinations that best match the hospital data.

3 - Evaluating different policies on scheduling elective surgical procedures: A case study
Teresa Melo, Alexandra Bernhardt

In many hospitals, surgical departments are typically allocated operating rooms and time on given weekdays into which they are able to book elective surgeries. This is the case of the operating theatre of a German hospital where each surgical group informs the operating room (OR) manager of the elective cases that will be performed on the following day. During the surgery day, the OR manager gradually details the OR schedule as time unfolds. Currently, there is the belief that this modus operandi neither maximizes the utilization of the available resources nor minimizes the costs of the operating theatre. Before changing the current practice, the hospital would like to know the potential impact of a number of scenarios. We developed a mixed integer linear programming model that is embedded in a preemptive goal programming framework to create a schedule for next day's surgeries. The base scenario depicts the current situation of rooms dedicated to surgical groups. In the second scenario, surgeries may be performed in non-preferred rooms and deviations from dedicated rooms are minimized. The third scenario analyses the policy of opening the ORs to all surgical groups. This option would dramatically change the current role of the OR manager. In a second step, each scenario was also considered with the option of performing anaesthesia induction outside the OR, a practice that is currently not followed by the hospital but that could allow a higher utilization of the operating theatre. All scenarios were evaluated using real data provided by the hospital. We will compare the corresponding OR schedules by means of several performance indicators. Based on the computational study, recommendations for the implementation of a new policy are discussed.

FA-16
Friday, 8:15-9:45 - SFo14

Location of Renewable Energy Sources

Stream: Energy and Environment
Invited session
Chair: Kai Plociennik

1 - Diffusion of Photovoltaic Installations in Germany
Sven Müller

The purpose of this study is to investigate factors on the very local level (households, addresses) that determine whether or not photovoltaic systems - solar cell systems to generate electricity - are to be installed on buildings. We consider a case study of Germany. We aim to identify if the decision of households to install photovoltaics can be explained by peer effects measured by pre-existing proximate installations (also known as the installed base). Since our analysis is based on individual decisions of households, we employ a discrete choice model with panel data in order to model the decisions whether to install in a certain period or not. We employ a geocoded data set of 21 million addresses and the grid-connected photovoltaic systems set up in Germany through 2010 (we consider 11 periods). In total, our data set used for analysis comprises nearly 210 million observations. Our analysis reveals a positive influence of previously installed systems located nearby on the decision to install a photovoltaic system. However, this effect decreases in time. We present an outlook on how our model can be used for forecasting purposes and locational decision making. Finally, a concept is proposed how Google Trends data can be used within our model.

2 - Optimized Pattern Design for Photovoltaic Power Stations
Kai Plociennik, Alena Klug, Karl-Heinz Küfer, Ingmar Schüle, Martin Bischoff

The task of planning photovoltaic (PV) power plants is very challenging, due to several degrees of freedom and constraints. The decision makers have to consider the local weather conditions, the area topography, the physical behavior of the technical components and many more complex aspects. Currently, engineers often make decisions only based on their personal experience and with rules of thumb. But the problem is far too complex to be solved with simple rules. Hence, in most cases this process results in suboptimal solutions. We present an approach for optimizing one variant of the way routing problem for PV plants. These ways are needed for site access during construction and maintenance and must be placed under certain restrictions. Mathematically, our problem is: Given the area polygon and the ways' angle (orientation), choose feasible positions for the ways by placing corresponding parallel stripes, so that the space available for placing PV tables in the polygon is maximized. This space is given by certain parallelograms in the polygon between the ways. In our solution concept, we discretize the problem and regard the angle of the ways as fixed. Then we formulate the problem as an Integer Program (IP) which can be solved by standard solvers. In addition, we reformulate the IP as a maximum independent set problem on interval graphs. This graph-theoretic problem can be solved in polynomial time. Using the latter approach, we are able to generate a variety of solutions with different angles for the ways with little effort of time. This enables us to also analyze the effect of different angles. Summarizing, we present a time-efficient, exact solution approach for dealing with a complex problem which is motivated by an industrial optimization task.

3 - Mean-variance models for the location problem of renewable power plants
Marius Radulescu, Constanta Zoie Radulescu

In this paper financial portfolio theory is applied in order to obtain optimal locations for renewable power plants. A geographically dispersed set of wind farms and solar power plants provides a more stable energy supply than the energy provided in the case the renewable energy plants are concentrated in a small area. Two single-period portfolio selection models for optimal location of renewable energy power plants are presented. The models belong to the class of mean-variance models. One of them is a minimum risk model and the other one is a maximum expected return model. Decisions of investment in renewable energy power plants are connected with land use decisions and the development of the grid infrastructure. In our models a renewable power plant is a wind farm or a solar power plant. A wind farm is a set of wind turbines and a solar power plant is a set of solar panels. A portfolio is a vector with integer components which show how many wind turbines or solar panels should be installed at various locations. Some of the variables of the models are non-negative integers and others are binary variables. Input data in the models are represented by wind data sets (mean wind speed and velocity variances) and a solar data set (which establishes the site's irradiance and weather variability) collected from geographically dispersed sites.
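As a minimal illustration of the mean-variance idea in the last abstract, a small integer portfolio of turbines/panel blocks over candidate sites can be evaluated by brute force; the expected yields, covariance matrix, budget and return target below are hypothetical placeholders, not data from the talk.

```python
# Illustrative minimum-variance siting sketch over hypothetical data.
import numpy as np

mu = np.array([120.0, 100.0, 90.0])          # expected yield per unit at 3 sites
cov = np.array([[400.0,  80.0,  20.0],       # yield covariance between sites
                [ 80.0, 300.0,  40.0],
                [ 20.0,  40.0, 250.0]])
budget, target = 10, 1000.0                  # units to place, minimum expected yield

best = None
for x1 in range(budget + 1):                 # brute force over integer portfolios
    for x2 in range(budget + 1 - x1):
        x = np.array([x1, x2, budget - x1 - x2], dtype=float)
        if mu @ x < target:                  # expected-return constraint
            continue
        var = x @ cov @ x                    # portfolio variance (risk)
        if best is None or var < best[0]:
            best = (var, x)
print(best)
```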


FA-17
Friday, 8:15-9:45 - 001

Financial Modeling

Stream: Finance, Banking, Insurance, and Accounting
Invited session
Chair: Michael Tuchscherer

1 - Advantages of Multi-level Fund Structures for Urban Development Investments
Bertram Steininger, Wolfgang Breuer, Dominique Brueser

Holding funds have become increasingly popular for asset classes such as real estate, hedge funds or mixed-asset funds over the last years. This second level of funds is plausible if there are some practical limitations on the first level that hinder diversification. The European Union regional policy with its European Urban Development Funds creates such limited investment universes. In a first step, our paper analyses the theoretical reasons for the benefits of a second fund level. In a second step, we employ mathematical modelling techniques to quantitatively evaluate its diversification effects. Therefore, we use a Monte Carlo method to simulate the development of Polish projects within possible credit portfolios. Geometric Brownian motion processes help to determine the cash flows of these Polish projects by allowing for stochastic deviations from a given drift rate. As an interim result of this application, we obtain the discrete probability density of investors' terminal wealth generated by the respective credit funds. Finally, an assessment of the utility (CRRA) of an investment measured through the terminal wealth in either one of the Urban Development Funds (first level) or the Holding Fund (second level) reveals that the diversification benefits through the latter exceed the additional transaction costs for a wide range of investors' risk preferences. Our paper contributes to the scarce research on urban development investments within multi-level fund structures. To the best of our knowledge, we are the first to quantify the benefits from the second level of such funds on the one hand and to consider volatile cash flows of projects suitable for Urban Development Funds on the other hand.

2 - Capital distribution in the bank currency exchange system
Grzegorz Pawlak, Mateusz Cichenski

The research is motivated by a practical internet exchange platform in a multibank system. A model of capital relocation is considered for currency exchange transactions in the multibank environment. The paper focuses on the capital distribution in the bank exchange system with the aim of revenue maximization according to the demand forecast. A practical application simulation model with transaction emulation has been developed. Several capital distribution models were evaluated and compared. The model assumes an initial capital distribution between banks and capital transfers during the transaction time periods. The transfer time depends on the bank and the currency. Several algorithms for capital relocation have been proposed and their effectiveness analyzed under several conditions. Simulation experiments have been performed. Profit calculation algorithms have been proposed and evaluated, calculating the possible pair matching coefficient.

3 - Model and estimation risk in quantitative credit risk stress tests
Michael Tuchscherer, Peter Grundke, Kamil Pliszka

After the financial crisis 2007-2009, supervisory authorities endorsed more comprehensive stress testing frameworks. For many risk types, model-based stress tests are usually carried out by banks. Taking credit risk as an example, the scenario of a severe macroeconomic downturn has to be translated into the corresponding changes of credit risk parameters. These stressed risk parameters are needed for computing the regulatory and economic capital requirements in the assumed stress scenario. As the results of other quantitative risk management tools, stress test results are prone to model and estimation risk. Surprisingly, the discussion of these issues in the context of stress tests is relatively sparse in the literature. This paper contributes to this discussion in the specific field of credit risk stress tests. Based on the credit portfolio model CreditPortfolioView, we show how model and estimation risk can influence the stressed default probabilities and, hence, the regulatory and economic capital requirements. Starting from a base case model, we analyze, among others, the impact of the usage of different methods for making the data stationary, lagged variables in the regression equation for the macroeconomic index and different time series models for the explaining risk factors. Furthermore, we show how degrees of freedom with respect to the choice of the stress scenario can influence the results. Although the analyzed specifications satisfy all statistical and econometrical demands, we find that stressed default probabilities and regulatory capital requirements can vary significantly. This shows the vulnerability of the employed stress test to model and estimation risk and calls for extensive robustness checks of stress test results.

FA-18
Friday, 8:15-9:45 - 004

Supply Chain Planning

Stream: Supply Chain Management
Invited session
Chair: Alexander Hübner

1 - Designing a planning system for suppliers of the machine building industry
Nicolas Justus, Herbert Meyr

ERP-based (Enterprise Resource Planning) advanced planning systems use Operations Research methods for solving planning tasks within ERP systems as optimally as possible. The companies using such planning systems mostly make quite diverse demands on them. Still, in most cases it is possible to find a number of common demands for a group of companies which operate in the same industry sector. The presentation analyzes how the planning tasks of the Supply Chain Matrix can be modeled for such a planning system taking into account the specific demands of the machine building industry. The focus lies on the required variables, the constraints and the interaction of the various optimization models. The outcome should be a model of a planning system consisting of the particular relevant planning modules and a definition of the information flow between the modules (a branch workflow) that can be used by system developers for developing specific solutions for suppliers of the machine building industry.

2 - Oracle In-Memory Consumption-Driven Planning: Extreme Performance for the Demand-Driven Value Chain
Hans-H. Schulz, Andreas Brock

Oracle's In-Memory Consumption-Driven Planning (IMCDP) is a new product which offers the step change in performance, scalability, and new functionality needed for forecasting and replenishment planning at a highly granular level. Supply chain performance, e.g. for consumer products manufacturers, is most effective when forecasting and replenishment planning are done as close to the end customer as possible using demand or consumption data, e.g. daily Point of Sale (POS) data, in order to avoid the bullwhip effect and to enable shelf-connected and collaborative supply chain planning. However, limitations in software and hardware performance have precluded the use of sophisticated planning algorithms in most situations, so many companies plan at distribution center level and week, using outbound shipments instead of end-customer demand. This approach leads to suboptimal forecasts, service levels, and inventory turns. IMCDP overcomes these limitations and makes daily store-level planning a reality. IMCDP has the scalability to plan across all of the business and can accommodate planning at different levels in a single system. The level at which a line of business or account will be planned will be tied to the criticality to the business as well as data availability and planning needs. That way one can plan at a granular level for one business process without incurring performance impacts on other business processes such as Sales and Operations Planning. IMCDP calculates a time-phased replenishment plan that combines current on-hand and inventory targets with in-transit, on-order and shipment lead time to create an order which will meet end-customer demand.
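The time-phased replenishment logic sketched in the abstract above can be illustrated generically as follows; the function name, the parameters and all numbers are hypothetical and do not describe the IMCDP product itself.

```python
# Illustrative time-phased replenishment sketch (hypothetical data only).
def replenishment_order(on_hand, in_transit, on_order, target,
                        daily_forecast, lead_time):
    """Order quantity so projected inventory after the lead time meets the target:
    project on-hand plus open supply minus forecast demand over the lead time,
    then order the gap up to the inventory target (never negative)."""
    projected = on_hand + in_transit + on_order - sum(daily_forecast[:lead_time])
    return max(0, target - projected)

print(replenishment_order(on_hand=40, in_transit=25, on_order=0, target=120,
                          daily_forecast=[18, 22, 20, 19, 21], lead_time=3))
```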


3 - Selecting delivery patterns for grocery chains
Andreas Holzapfel, Michael Sternbeck, Alexander Hübner

On a tactical level, retailers face the problem to determine on which weekdays stores should be delivered and to set a frame for short-term vehicle routing. Especially in grocery retail, weekly repetitive delivery patterns are applied to increase planning stability for the stores and to balance picking workload at the distribution center. A delivery pattern is defined as a store-specific combination of weekdays on which a delivery takes place. As several processes in the logistics subsystems distribution center, transportation and instore logistics of the retail supply chain are influenced by the delivery pattern decision, an integrated approach is necessary to solve the problem. Therefore we propose an IP model that considers the decision-relevant costs and capacities at the distribution center, in transportation and instore. We especially focus on instore handling aspects, bundling issues in transportation and associated interrelations. To solve the trade-off between the different cost components which are aligned to the delivery pattern decision, we propose a simultaneous and a sequential solution approach. We show significant cost saving potentials by applying the model and approaches proposed using a case from a major European grocery retailer. An extensive sensitivity analysis gives further insights into cost and capacity effects.

FA-19
Friday, 8:15-9:45 - I

Routing: Pickup-and-Delivery

Stream: Traffic and Transportation
Invited session
Chair: Stephan Buetikofer

1 - A New Approach to Freight Consolidation For a Real-World Pickup-and-Delivery Problem
Curt Nowak, Felix Hahne, Klaus Ambrosi

During courier and express providers' operational dispatching, vehicles are assigned to customer orders. This task is complex, combinatorially comprehensive, and contains facets that defy modeling within reasonable effort, e.g. due to a lack of structured data. Hence, a fully automated solution cannot be achieved. In practice, human dispatchers often use dialog-oriented decision support systems (DSS). These systems generate recommendations from which the human dispatchers select the most profitable one, additionally taking into account domain-specific knowledge. Solutions that consolidate the freight of multiple customer orders onto a single vehicle are usually particularly favorable. Generally, consolidating leads to a higher degree of vehicle capacity utilization, which in turn increases cost effectiveness and lowers the resulting environmental burden. We present a new recursive heuristic for this scenario based on the well-known savings algorithm. A central parameter of the algorithm limits the number of interdependent single tours. Through the appropriate setting of this parameter, one can control the results' complexity and ensure their transparency and acceptance by human dispatchers. Using real-world data benchmarks, we prove its effectiveness empirically.

2 - A simple Heuristic for the Hybrid Electric Vehicle — Vehicle Routing Problem
Christian Doppstadt, Achim Koberstein, Daniele Vigo

The delivery of goods from a central depot to different customer locations with a given set of vehicles is a well-known and widely studied problem in Operations Research. We extend the so-called Vehicle Routing Problem (VRP) in a way that the fleet of delivery vehicles consists of Hybrid Electric Vehicles (HEVs) having a combustion and an additional electric engine, which is powered by an integrated battery with a given capacity. The resulting problem is introduced as the Hybrid Electric Vehicle — Vehicle Routing Problem (HEV-VRP) and supports decision makers in evaluating the acquisition of HEVs for their specific operation. We assume vehicles that are able to use four different modes of operation: pure combustion, pure electric, charging the battery while driving in combustion mode, and a boost mode, where the electric and combustion engine are combined. The modes of operation differ in cost and travel time for each arc within the delivery network, and we restrict the maximum working hours for the drivers. Moreover, the vehicles have a limited capacity that must not be exceeded by the given demand of the customers visited. As the HEV-VRP is a generalization of the NP-hard VRP, the HEV-VRP is NP-hard, too. As the complexity of the problem is further increased by the different modes of operation compared to the VRP, practical problem sizes require the usage of heuristics to find solutions in a reasonable time. Therefore, we present a simple heuristic approach, combining modifications on the tour structure and the modes of operation. To test our approach, we generated test instances based on the Solomon benchmark instances for the VRP with Time Windows (VRPTW) and point out the potential savings by using HEVs for delivery tours.

3 - Extension and application of a general pickup and delivery model for evaluating the impact of bundling goods in an urban environment
Stephan Buetikofer, Albert Steiner

Over the past decades, many metropolitan areas were facing a continuing increase of their population leading, amongst others, to a higher demand of transport of goods, both by the industry and suppliers, and by households. In addition, various new types of services were developed to deliver goods. For delivery companies, short delivery times, low costs and high quality are some of the main targets of a shipment, whereas for public authorities and the society, the minimization of greenhouse gas emissions is of increasing importance. To address some of these requirements, various models to bundle and deliver goods were developed. In an applied research project presented here, a cooperation platform for several logistic companies will be developed, where bundling of goods is a key component. Therefore, an existing pickup and delivery model from Savelsbergh and Sol (1995) was extended. This model is flexible enough for the extensions developed while at the same time covering all essential requirements. A consistent path was developed from the raw data to the parameters of the model. The model was implemented in GAMS and solved with CPLEX. Furthermore, a methodology was developed to allow for meeting multi-criteria objectives (costs, emissions, etc.) together with the definition of measures to quantify the impact for different scenarios. The scenarios are based on real world data from different shipping companies. In this paper, the focus is not on computational aspects related to the size of the problem, but instead on (i) illustrating a consistent methodology for integrating raw data into the model (network layout, costs, ...), (ii) the measures to quantify the impact of bundling goods, and (iii) presenting some preliminary results from tests with real world data.

FA-20
Friday, 8:15-9:45 - II

Analytics in the Automotive Sector

Stream: Production and Operations Management
Invited session
Chair: Mahmut Ali Gokce

1 - Programm-Füllungs-Assistenzsystem (program filling assistance system)
Benjamin Korth, Christian Schwede, Julien Weierke

Within weekly program filling, a sub-process of weekly program planning in the automotive industry, the orders of the current planning week are assigned such that the plant capacities, in terms of order volume, are utilized as much as possible. A large number of restrictions has to be taken into account, for example the available capacities of the assembly lines or of the suppliers. The underlying task is an integer optimization problem in which orders must be selected under consideration of these restrictions and objectives. In practice, the objectives usually cannot be reached by a clever selection of orders alone. Therefore, an elaborate coordination process between logistics and sales tries to determine and implement the necessary restriction adjustments. In this contribution, a "Programm-Füllungs-Assistenzsystem" (program filling assistance system) is presented that was developed for a German automobile manufacturer. The logistics assistance system makes it possible to simulate the order assignment and to determine the necessary restriction adjustments. On this basis, the assistance system supports the iterative weekly program filling process between the departments involved. For the assignment, the heuristic employed uses an evaluation of the orders based on the available capacities forecast by linear extrapolation. This procedure reduces the risk that restrictions become critical early on. Restrictions are adjusted automatically if this is necessary and permitted. With this procedure the planning effort decreases considerably, while the quality of the results increases. The present paper introduces the assistance system, the heuristic and results from practice.
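The core selection problem described in the abstract above is a capacitated, knapsack-type order selection; the following is a hypothetical toy sketch (not the manufacturer's heuristic) with one assembly-hour capacity and one supplier capacity as the only restrictions, solved by enumeration.

```python
# Illustrative order-selection sketch over hypothetical orders and capacities.
from itertools import combinations

orders = [  # (volume, assembly hours, units of a constrained supplier part)
    (10, 4, 1), (7, 3, 0), (12, 6, 2), (5, 2, 1), (9, 5, 1), (6, 2, 2),
]
assembly_cap, supplier_cap = 14, 4

best_volume, best_set = 0, ()
for r in range(len(orders) + 1):
    for subset in combinations(range(len(orders)), r):
        hours = sum(orders[i][1] for i in subset)
        parts = sum(orders[i][2] for i in subset)
        if hours <= assembly_cap and parts <= supplier_cap:
            volume = sum(orders[i][0] for i in subset)
            if volume > best_volume:
                best_volume, best_set = volume, subset
print(best_volume, best_set)
```

At realistic problem sizes the enumeration would be replaced by an integer programming model or a heuristic, which is where the order evaluation and automatic restriction adjustment described above come in.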


2 - Integrating satisfiability framework in association rule for better customer buying behavior analysis in mass customization
Tilak Raj Singh, Narayan Rangaraj

Continuous changes in product design and market conditions imply that product variants which have been produced in the past may not be valid in the future. Nevertheless, customer order history is an important input for capturing customer buying behavior, required for future planning activities. In order to extrapolate associations among product features and their validity against new product designs, we propose a fully automated association rule mining with a satisfiability (SAT) framework. Design rules are modeled as a SAT problem, and generated association rules are filtered by solving an instance of the SAT problem. The methodology is demonstrated using an industry-size example.

3 - Simulation based Scheduling Using Dispatching and Batching Rules in Due Date Priority Production Planning
Mahmut Ali Gokce, Gulsum Ozer, Merve Ilbeyi, Cansel Uzaras, Cenk Tasyurek

This study is done over the planning/scheduling problem of the production line of one of the world's largest automotive wheel suppliers. The production process consists of casting, heat treatment, machining, leveling, leakage testing, brushing and the paint shop. This study specifically focuses on the WIP build-up and solution methods to remedy this WIP build-up on the conveyors, starting from machining to the end of the brushing process. Although more than 300 models can be manufactured in the plant, the daily average number of models in production varies between 20 and 25. Especially in machining and leveling, cycle times of products change significantly based on their models. Cycle times range in [103, 240] sec. in machining and [58, 112] sec. in leveling. The conveyor system carrying the products moves at a constant speed of 9 m/sec. This situation, along with pressure to make due dates, creates significant WIP build-up and congestion on the conveyor system. Due to the long production line, product variety etc., it is very difficult to model and solve this problem analytically. For these reasons, the production line is modeled in detail using simulation. The simulation model is verified and validated. We present results of 3 different solution methods and their effects on the system using real-life data. The 3 methods are the use of dispatching decision rules, different layouts, and increasing/decreasing numbers of machines and/or operators in a design of experiments.

FA-21
Friday, 8:15-9:45 - III

Math Programming Solvers

Stream: Software Applications and Modelling Systems
Invited session
Chair: Timo Berthold

1 - LocalSolver: a new kind of math programming solver
Frédéric Gardi, Thierry Benoist, Julien Darlay, Bertrand Estellon, Romain Megel

The LocalSolver project (http://www.localsolver.com) aims at providing a mathematical programming solver for large-scale mixed-variable non-convex optimization. To scale, LocalSolver is based on variable neighborhood search as global search. Its architecture can be viewed as hybrid, tending to integrate all optimization techniques appropriate to explore from small to large neighborhoods: local and direct search, constraint propagation and inference, mixed-integer linear programming techniques, nonlinear programming techniques, etc. The functional and technical novelties coming with LocalSolver 5.0 will be presented. Then, some benchmarks and practical applications will be detailed to assess the performance and the relevance of LocalSolver for solving large-scale real-life optimization problems.

2 - SCIP Optimization Suite 3.1
Matthias Miltenberger

We present the new features and performance improvements included in the latest release of the SCIP Optimization Suite. The Optimization Suite consists of the constraint integer programming toolkit SCIP and ships with the LP solver SoPlex and the modeling language ZIMPL. It also contains the parallelization framework UG and the column generation extension GCG. We report on our new interfaces to Python and Java and also on our rewritten internal interface between SCIP and SoPlex, which allows for better control over the LP solver. More information and downloadable libraries, binaries as well as source code packages can be found at scip.zib.de.

3 - Recent enhancements of the FICO Xpress Optimizer
Timo Berthold

In this presentation, we will discuss improvements in the linear, mixed integer and nonlinear solvers in the latest release of the FICO Xpress Optimization Suite version 7.7. This includes a multi-start heuristic for nonlinear programming, the detection of nearly parallel objective functions for MIP, solving and modelling capabilities for robust optimization, and an innovative method for parallelizing the dual simplex algorithm (parallelization across multiple iterations).

FA-22
Friday, 8:15-9:45 - IV

Vehicle Routing and Scheduling and Pickup and Delivery

Stream: Logistics and Inventory
Invited session
Chair: Aydin Sipahioglu

1 - 2D Rail Mounted Vehicle Scheduling with Non-Crossing Constraints
Torsten Gellert

In many logistics applications several rail-mounted vehicles are used to organize a warehouse, a shipyard or a storage of goods. It is crucial for an efficient production environment to find conflict-free tours - every vehicle limits the others due to its position along the rail - minimizing the total execution time. We consider a vehicle scheduling problem (called 2DVS), where n items need to be transported in the plane using k identical vehicles. Throughout the whole process, all vehicles need to stay in their initial order along the x-axis. The aim is to find a solution where all jobs are transported, the tours do not cross along the x-axis and the makespan is minimized. As a generalization that fits well with the practical circumstances, we consider a safety distance that has to be kept between the vehicles. Our main contribution is a model for 2DVS and some nearby variations. More specifically, we show that 2DVS is NP-hard even if we know the optimal starting times of all jobs. Both the starting times and the assignment of each job can be treated as a solution to the problem. Interestingly, the problem can be seen as a k-Stacker Crane Problem (k-SCP) in a grid graph with a maximum metric as distance. We combine some well-known approximation algorithms for the k-SCP to get a better approximation for the special distance function.
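The non-crossing requirement in the abstract above can be illustrated with a small, hypothetical feasibility check: given piecewise-linear x-position trajectories of the vehicles, verify that the initial left-to-right order plus a safety distance is respected at sampled time points. All trajectories and parameters below are made up for illustration.

```python
# Illustrative non-crossing / safety-distance check for rail-mounted vehicles.
def position(traj, t):
    """Linear interpolation in a list of (time, x) breakpoints
    (times assumed strictly increasing); after the last breakpoint, hold x."""
    for (t0, x0), (t1, x1) in zip(traj, traj[1:]):
        if t0 <= t <= t1:
            return x0 + (x1 - x0) * (t - t0) / (t1 - t0)
    return traj[-1][1]

def order_preserved(trajs, safety, t_end, steps=100):
    """Check x_i(t) + safety <= x_{i+1}(t) for adjacent vehicles at sampled t."""
    for k in range(steps + 1):
        t = t_end * k / steps
        xs = [position(tr, t) for tr in trajs]
        if any(xs[i] + safety > xs[i + 1] for i in range(len(xs) - 1)):
            return False
    return True

trajs = [[(0, 0), (5, 8), (10, 8)],      # vehicle 1 (leftmost)
         [(0, 10), (5, 12), (10, 20)]]   # vehicle 2
print(order_preserved(trajs, safety=1.0, t_end=10))
```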


2 - Real-time rerouting with time windows and ad-hoc demand changes
Olga Bock, Andreas Braun

This paper considers the exploitation of available real-time information in parcel pick-up processes based on a real-world application with a German parcel service provider. In particular, the effects of real-time changes in business customers' demands are analyzed. During execution of an a priori planned master-schedule, vehicle capacity may be exceeded in the case of unexpectedly increased customer demands. Thanks to new information systems, changes in demand can be monitored in real-time during execution of the schedule. Therefore, it is possible to reschedule before a problem occurs and the remaining tour becomes infeasible. To address this issue, we model the problem as a Vehicle Rescheduling Problem with Time Windows (VSRPTW). A ruin-and-recreate heuristic is presented to allow for real-time rescheduling based on newly available data. Via simulation, the system is evaluated in a computational study. The study shows that exploiting real-time information with the developed method has positive effects on the cost and on the service level of logistics providers.

3 - Optimization of Vehicle Routes with Delivery and Pickup for a Rental Business: A Case Study
Susumu Morito, Tatsuki Inoue, Takuya Hirota, Ryo Nakahara

Optimization of vehicle routes with delivery and pickup for a rental industry is considered. The company delivers rented products to customers or picks them up. Several types of products exist, and customers rent a specified number of products of a specific type. Time windows exist for delivery and pickup. There are two sizes of vehicles; their trips start from and end at the depot, and vehicles can make several trips during a day. Delivery must precede pickup on any trip of a vehicle. The capacity of a vehicle depends on the product type and also on how products are loaded on the vehicle, i.e., whether they are loaded in a folded form or in an unfolded or assembled form. Depending on the order quantity, split deliveries/pickups may be necessary. The company wants to minimize the total transportation cost. Based on the fact that the total number of distinct trips is rather small due to the limited capacity of the vehicles, our solution strategy first enumerates all possible trips. Routes (i.e., collections of trips) are obtained by assigning trips to vehicles so that the total cost is minimized subject to constraints on demand, an upper limit on the number of trips per vehicle, and time compatibility of the trips assigned to a specific vehicle. Since there exist many time compatibility constraints, the problem is first solved without them; we then check compatibility, add violated compatibility constraints if necessary, and solve the problem again until all routes become time compatible. The computational performance of the proposed solution approach and the quality of the generated routes are evaluated on several sets of real data. Routes obtained in 30-60 minutes of CPU time were found to be 4-10% lower in cost than the actual routes produced manually by the planning personnel.

4 - A New Approach for Pickup and Delivery Problem
Aydin Sipahioglu, Gökhan Çelik

Pickup and delivery problems (PDP) are an important class of VRP; the aim is to find good tours for vehicles performing both pickup and delivery operations. PDPs can basically be classified into three groups: 1-1 (one to one), M-M (many to many) and 1-M-1 (one to many to one). 1-1 means each commodity has only one supply and one demand point. M-M means there can be more than one supply and demand point for any commodity. 1-M-1 means some commodities at the depot are delivered to demand points and some commodities at customers are delivered to the depot. Additionally, a new problem can be defined in which each commodity has more than one supply and demand point and every node in the network acts like a depot. Since these problems are NP-hard, obtaining an optimal solution in a reasonable time is not easy. In this study, integer models with an energy-minimizing objective are proposed for 1-1 and 1-M-1 type PDPs, and it is shown on small-scale test instances that the models can find the optimal solution. Additionally, a new problem for the PDP is defined and its model is discussed.

 FA-23
Friday, 8:15-9:45 - V
Lotsizing and Product Returns
Stream: Production and Operations Management
Invited session
Chair: Simme Douwe Flapper

1 - A three-level capacitated lot sizing model for production control
Florian Isenberg, Leena Suhl

Based on various requirements for different horizons in production planning and control, a three-level mathematical model for lot sizing and scheduling has been developed. The requirements originate from practical problems and challenges in production control, found in small and medium enterprises in the metalworking industry. The idea is to divide the planning horizon into three different planning scopes, to cope with the requirements on an adequate level. Each scope is implemented by a suitable model based on the literature. A short-term scope includes the next few days and maps the production control of this scope to a 'proportional lot sizing and scheduling' model. The medium-term scope covers several days up to more than a week and differs from the short-term scope by a larger period size and a lower level of detail. It is modeled by a 'continuous setup lot sizing problem'. Different from the two previous models, the long-term scope is based on a model which does not simultaneously build lot sizes and a sequence. This model extends the 'capacitated lot sizing problem' using periods the size of a whole week. The three models are combined into one major model, using the inventory constraints as synchronization. This way, consistency of already produced items and demanded items can be guaranteed. First results will be discussed in the talk.

2 - Capacitated lot sizing with quality-dependent remanufacturing
Kristina Burmeister, Florian Sahling

We present a new model formulation for a multi-product capacitated lot sizing problem with remanufacturing. We assume that the returned products possess different quality levels. The process of (re)manufacturing is located on multiple machines. The returned products with a given quality level can be remanufactured to the next better or a higher quality level. External demands for new products as well as for remanufactured products depending on the quality are considered. The demand for remanufactured products of a given quality level can also be satisfied by any remanufactured product with a higher quality level or, in addition, by a new product. However, this substitution is not allowed in the other direction. Furthermore, a solution approach based on mathematical programming is proposed.

3 - On the value of warranty returns
Simme Douwe Flapper

Every company selling a physical product has to decide on the related warranty issues. One of these issues concerns what to do with products or parts replaced in the context of a warranty claim. An overview is given of a number of potential reasons for taking care of these products or parts, as well as simple mathematical models to estimate the costs and benefits related to each reason individually as well as for combinations of these reasons. Application of the models in practice is briefly discussed and directions for further research are indicated.

 FA-24
Friday, 8:15-9:45 - AS
Revenue Management
Stream: Pricing, Revenue Management, and Smart Markets
Invited session
Chair: Subrata Mitra

1 - Capacity Uncertainty in Airline Revenue Management
Daniel Kadatz, Catherine Cleophas, Natalia Kliewer

A classic assumption in airline revenue management (ARM) is that capacity will be fixed for the entire booking horizon, i.e., the number of seats is supposed to be constant for the whole revenue management process. However, execution difficulties can lead to an unexpected change of capacity, and thus the number of potential tickets for sale can be altered. This situation can arise, e.g., due to aircraft on ground, technical issues, crew planning, special sales, or weather conditions. It involves the danger of a non-optimal revenue management inventory control, as the system will optimize via incorrect input parameters.
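The ruin-and-recreate rescheduling heuristic mentioned in the Bock and Braun abstract above can be illustrated with a minimal sketch. The code below is not the authors' implementation: it assumes a single vehicle, Euclidean travel costs and a fixed depot, omits time windows and capacities, and all names (tour_cost, cheapest_insertion, ruin_and_recreate) and parameter values are invented for illustration. The ruin step removes a few random customers; the recreate step reinserts them at cheapest-insertion positions.

    # Toy ruin-and-recreate sketch (illustrative only, not from the talk).
    import math, random

    def tour_cost(tour, pts):
        # cost of depot (index 0) -> customers in 'tour' order -> depot
        path = [0] + tour + [0]
        return sum(math.dist(pts[a], pts[b]) for a, b in zip(path, path[1:]))

    def cheapest_insertion(tour, cust, pts):
        # insert 'cust' at the position with the smallest cost increase
        best = min(range(len(tour) + 1),
                   key=lambda i: tour_cost(tour[:i] + [cust] + tour[i:], pts))
        return tour[:best] + [cust] + tour[best:]

    def ruin_and_recreate(tour, pts, iters=200, ruin_size=3, seed=0):
        rng = random.Random(seed)
        best = tour[:]
        for _ in range(iters):
            cand = best[:]
            removed = rng.sample(cand, min(ruin_size, len(cand)))  # ruin
            for c in removed:
                cand.remove(c)
            for c in removed:                                      # recreate
                cand = cheapest_insertion(cand, c, pts)
            if tour_cost(cand, pts) < tour_cost(best, pts):
                best = cand
        return best

    pts = [(0, 0)] + [(random.random() * 10, random.random() * 10) for _ in range(8)]
    initial = list(range(1, 9))
    improved = ruin_and_recreate(initial, pts)
    print(round(tour_cost(initial, pts), 2), "->", round(tour_cost(improved, pts), 2))

In a real-time setting, the same loop would be re-run on the remaining part of the tour whenever a demand change is observed, warm-started from the current plan.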


The possibility of deliberately altering the final capacity of a flight, to better match the demand that has arrived to date, has already been analyzed extensively. In the literature, established approaches are known, among others, under the names of demand driven dispatch, dynamic capacity management, or demand driven swapping. Here, a capacity change is internally motivated, i.e. by revenue management itself, while the major problem of external capacity changes is that revenue management systems are not able to consider these changes yet. Neither their appearance nor their impact has been sufficiently explored.
This talk introduces a formalization of the problem and will highlight what an integration of external capacity changes in ARM forecast and optimization could look like. It will present a simulation model for analyzing the effects of capacity changes and their integration in ARM.

2 - Optimal Pricing and Core Acquisition Strategy for a Hybrid Manufacturing/Remanufacturing System
Subrata Mitra

Remanufacturing is one of the product recovery options where the quality of used products (cores) is upgraded to 'as-good-as-new' conditions. In this paper, we consider a monopolist firm selling new and remanufactured products to primary and secondary customers, respectively, with one-way substitution, i.e. primary customers may substitute new products by remanufactured products while secondary customers never consider buying new products. We develop economic models under two scenarios: when the supply of cores is unconstrained and when manufacturers have to procure cores at an acquisition price. The major observations of the paper are as follows. A firm is better off when there is no constraint on the supply of cores. Even when cores have to be acquired at an acquisition price, the profitability is higher than when the firm does not engage in remanufacturing activities. When a larger number of primary customers replace new products with remanufactured products, there is partial cannibalization of new product sales; however, the combined market share and profitability of the firm increase. When core supply is constrained and customers are less sensitive to core prices, the limited supply of cores may render remanufacturing an infeasible option for the firm. Therefore, firms should not only generate awareness among primary customers to buy remanufactured products, but also step up efforts to ensure a steady supply of cores. We conclude the paper with managerial implications and directions for future research.

Friday, 10:10-10:55

 FB-01
Friday, 10:10-10:55 - Fo1
Semiplenary Puget
Stream: Invited Presentations and Ceremonies
Semi-plenary session
Chair: Arie Koster

1 - Optimization in the Big Data age
Jean Francois Puget

Big Data is a shortcut for a very interesting phenomenon: we now have data about almost everything. Managing and using that data can be challenging. Indeed, data can come in very large volumes. Data can also come in a large variety of forms (e.g. audio, video, free text, text feeds, sensor measures, etc). Data can also be in motion (streamed) as opposed to at rest. Each of these Big Data dimensions (volume, variety, velocity) creates challenges and opportunities for optimization techniques and applications. We will review these challenges and explore potential approaches. We will also provide some actual examples where Big Data and Optimization are used together in new, innovative applications.

 FB-02
Friday, 10:10-10:55 - Fo2
Semiplenary Dörner
Stream: Invited Presentations and Ceremonies
Semi-plenary session
Chair: Stefan Voss

1 - Matheuristic Design Concepts for Rich Problems
Karl Doerner

Metaheuristic algorithms and frameworks, such as tabu search, genetic algorithms, variable neighborhood search, etc., were in fact usually proposed in years when mixed integer programming (MIP) was seldom a viable option for solving real-world problem instances. However, research on mathematical programming, and in particular on discrete optimization, has led to a state of the art where MIP solvers or customized mathematical programming codes can be effective even in a heuristic context, both as primary solvers and as subprocedures. In the past years different hybrid variants of exact and metaheuristic search techniques - also called matheuristics - were developed especially for rich problems in the field of logistics and transportation. In this talk different design concepts of matheuristics will be presented.

 FB-04
Friday, 10:10-10:55 - Fo4
Semiplenary Barber
Stream: Invited Presentations and Ceremonies
Semi-plenary session
Chair: Catherine Cleophas

1 - Deep Learning
David Barber

Deep Learning is the field of study of hierarchical information processing systems, more historically known as neural networks. Recently the field has received great attention from academia, industry and the media as new techniques have emerged that have advanced the state of the art in reinforcement learning, natural language modelling, compression and computer vision, amongst others. I'll give a brief overview of the field, the techniques involved and aspirations for future intelligent information processing algorithms.


Friday, 11:20-12:50

 FC-02
Friday, 11:20-12:50 - Fo2
Complexity
Stream: Discrete and Combinatorial Optimization, Graphs and Networks
Invited session
Chair: Frank Gurski

1 - Algorithms for Controlling Palletizers
Jochen Rethmann, Frank Gurski, Egon Wanke

Palletizers are widely used in the delivery industry. We consider a large palletizer where each stacker crane grabs a bin from one of k conveyors and positions it onto a pallet located at one of p stack-up places. All bins have the same size. Each pallet is destined for one customer. A completely stacked pallet will be removed automatically and a new empty pallet is placed at the palletizer. The FIFO Stack-up problem is to decide whether the bins can be palletized by using at most p stack-up places. Since the FIFO stack-up problem is computationally intractable in general, we study the fixed-parameter tractability of this problem. The idea behind fixed-parameter tractability is to try to separate out the complexity into two pieces - some piece that depends purely on the size of the input, and some piece that depends on some parameter of the problem that tends to be small in practice. We introduce a digraph and a linear programming model for the problem. Based on these characterizations we give algorithms to show that the number n of bins, the number m of pallets, and k to the power of m can be chosen as a parameter such that the problem is fixed-parameter tractable. Thus for a lot of small parameter values we obtain efficient solutions for the FIFO stack-up problem. We also discuss approximation results for the problem.

2 - The Focus of Attention Problem
Dries Goossens, Frits Spieksma, Sergey Polyakovskiy, Gerhard J. Woeginger

Sensor networks offer exciting new possibilities for achieving sensory omnipresence. The main trouble is that the used sensors are inherently limited and individually incapable of estimating the state of a target, and that the measurements provided by these sensors are strongly corrupted by noise. We consider the problem of assigning sensors to track targets so as to minimize the error cost in the resulting estimation of target locations. More in particular, 2n sensors are located on a straight line, and need to be assigned in disjoint pairs to n targets, which are somewhere in the plane. This so-called Focus of Attention problem is a special case of a three index assignment problem.
We provide a complete complexity and approximability analysis of the Focus of Attention problem. We establish strong NP-hardness, and we construct a polynomial time approximation scheme. Furthermore, we describe error cost functions for which a polynomial time algorithm exists. Finally, we discuss the setting where the sensors are at unit distances from each other, and prove that even in this special case strong NP-hardness still applies.

3 - Capital Budgeting Problems: A parameterized point of view
Frank Gurski, Jochen Rethmann, Eda Yilmaz

A fundamental financial problem is budgeting. A firm is given a set of financial instruments X over a number of time periods. Every instrument has a return and, for every time period, a price. Further, for every time period there is a budget. The task is to choose a portfolio X' from X such that for every time period the prices of the portfolio do not exceed the budget and the return of the portfolio is maximized. Since the capital budgeting problem is computationally intractable and is defined on inputs with various kinds of information, we study the fixed-parameter tractability of the problem. The idea behind fixed-parameter tractability is to split the complexity into two parts - one part that depends purely on the size of the input, and one part that depends on some parameter of the problem that tends to be small in practice. We show that for the multi-period problem the number of instruments and the sum of all budgets can be chosen as a parameter such that the problem is fixed-parameter tractable. For the single-period problem additionally the threshold value of the return can be chosen as a parameter. Thus for a lot of small parameter values we obtain efficient solutions for the capital budgeting problem. We also consider the connection between these parameterized problems and approximation and pseudopolynomial algorithms.

 FC-03
Friday, 11:20-12:50 - Fo3
Multi-criteria Decision Making Methods
Stream: Decision Theory and Multi-Criteria Optimization
Invited session
Chair: Nils Lerche

1 - Deriving Priorities From Inconsistent PCM using the Network Algorithms
Marcin Anholcer, János Fülöp

In several multiobjective decision problems Pairwise Comparison Matrices (PCM) are applied to evaluate the decision variants. The problem that arises very often is the inconsistency of a given PCM. In such a situation it is important to approximate the PCM with a consistent one. The most common way is to minimize the Euclidean distance between the matrices. In the paper we consider minimization of the maximum distance. After applying a logarithmic transformation we are able to formulate the obtained subproblem as a Shortest Path Problem and solve it more efficiently. We analyze and completely characterize the form of the set of optimal solutions and provide an algorithm that results in a unique optimum regardless of the initial conditions.

2 - A Decision Support System to Optimize Car Sharing Stations with Electric Vehicles
Kathrin Kühne, Tim A. Rickenberg, Michael H. Breitner

An increasing environmental awareness, rising energy costs, progressing urbanization, and shortage of space lead to a rethinking of individual mobility behavior and personal car ownership in cities. Car sharing is a sustainable mobility concept that allows individuals to satisfy their mobility needs without owning a car and addresses modern mobility. Car sharing is particularly suitable to cover medium-range distances and can be linked to the public transport of major cities (intermodal mobility). Within this context, the integration of electric vehicles represents an opportunity to further protect the environment and potentially save energy costs. In order to create an efficient car sharing transportation network, the location of stations, the number of vehicles and the availability of electric fast charging stations are critical success factors. Based on an existing optimization approach for fossil-fuel car sharing, we provide a decision support system (DSS) to plan and optimize car sharing stations for electric vehicles. Within design-oriented research, we refine and evaluate research artefacts. An optimization model and the DSS OptCarShare 1.1 make it possible to optimize stations and visualize results. Parameters such as the annual lease payment for a charging station, the expected travel time of consumers, and the charging time of an electric vehicle dependent on available charging stations affect decision variables such as the number of car sharing stations, vehicles and fast charging stations. On the basis of evaluations and benchmarks for the cities of Hanover and Zurich, we establish generalizations for the parameters of the model. The results show a high impact of the fast charging stations (half an hour to fill 80% of the battery) on the current model and the optimal solution.

3 - Integration of Prospect Theory into the outranking approach PROMETHEE
Nils Lerche

Outranking methods as a specific application of Multi-Criteria Decision Analysis (MCDA) are applied to structure complex decision problems as well as to elicit the decision makers' preferences. Therefore, a consideration of behavioral effects within outranking methods seems to be meaningful. Several behavioral effects and biases have been identified in previous studies; however, only few approaches exist to explicitly consider such behavioral issues within the application of MCDA methods. The prospect theory developed by Kahneman and Tversky (1979) represents one of the most prominent theories from behavioral decision theory. Their findings concerning the decision behavior of humans, e.g. loss aversion or reference dependency, are broadly supported and confirmed through a variety of empirical research. Hence, the aim of this paper is to integrate these elements from prospect theory within the outranking approach PROMETHEE. For that purpose, an additional discrete reference alternative is incorporated. A case study concerning the sustainable usage of biomass for energy conversion illustrates the newly developed method.
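The min-max approximation described in the Anholcer and Fülöp abstract above can be written compactly. The following is a hedged sketch of one common way to state it, not necessarily the exact formulation of the paper: for a positive reciprocal matrix A = (a_ij), one seeks a consistent approximation with entries e^{w_i - w_j} whose log-scale Chebyshev distance to A is minimal,

    \min_{w \in \mathbb{R}^n} \ \max_{1 \le i, j \le n} \bigl| \log a_{ij} - (w_i - w_j) \bigr| ,

after which priorities can be recovered (up to normalization) as v_i = e^{w_i}. The network connection mentioned in the abstract arises because, for a fixed tolerance, the conditions w_i - w_j <= c_ij are difference constraints, whose feasibility is naturally checked on a weighted digraph.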


 FC-04
Friday, 11:20-12:50 - Fo4
Advanced techniques in robust optimization
Stream: Robust and Stochastic Optimization
Invited session
Chair: Sebastian Stiller
Chair: Britta Peis

1 - Single-Commodity Robust Network Design with Simple Polyhedral Uncertainty
Daniel Schmidt, Valentina Cacchiani, Michael Juenger, Frauke Liers, Andrea Lodi

We seek to design optimal robust client-server networks: Suppose that we want to transfer a single commodity (e.g., data) among the nodes in a network. Each node has a minimum and a maximum balance limiting how much of the unique commodity the node can supply or demand. Our aim is to find minimum cost integer capacities for the network's links such that all possible realizations of supplies and demands can be routed through the network. This gives us a worst-case robust model with polyhedral uncertainties. Applications for the model lie in networks where all servers can answer the clients' requests, which is, for instance, true for movie streaming networks.
We build on previous work by Buchheim, Liers and Sanità (INOC 2011) and previous joint work by the authors (ISCO 2012) with Álvarez, Dorneth and Parriani to develop a branch-and-cut algorithm for the problem. It uses a capacity based linear program to obtain lower bounds for the objective value and derives upper bounds with problem specific rounding heuristics. To solve the linear program, we give a separation algorithm and tighten our formulation with 3-partition inequalities. Finally, we evaluate the algorithm experimentally.

2 - Strategic planning in large-scale logistic networks
Alexander Richter, Sebastian Stiller, Daniel Karch

We consider routing through hub networks from the perspective of a logistics customer. We devise an optimization method to decide on renting hubs and routing several different goods from multiple sources to multiple sinks in a large scale network with realistic transportation costs. As the strategic planning must be completed before the actual demand is known, our solutions are robust optimal, i.e., have lowest cost under the worst case of a restricted fluctuating demand. Our case studies are outbound networks of retailers, and the inbound network of an automotive company. Given multi-commodity demands, the goal is to choose cost minimal hubs and subroutes offered by a multitude of transportation companies at an involved system of rates. Typical rates depend on several properties (usually two: weight and volume). We model the cost on an edge by different, abstract containers, each having a fixed cost for buying copies and capacities for each property. In one respect, fluctuating demand makes optimization easier, as it limits the benefit of planning for tightly filled containers. Therefore, we allow for buying a fractional number of copies greater than one, or no copy. The crucial optimization potential stems from the consolidation of goods with different properties. We apply the model of budgeted interval uncertainty as in Bertsimas and Sim, with uncertain demand values. We extend their methodology to obtain a compact MILP formulation for a robust counterpart. Preliminary experiments on a real instance with roughly 400 sources, 20 sinks and 10000 demands indicate that for a small set of hubs (about 10) the MILP can be solved close to optimality, while for larger hub sets (about 100) a solution to the resulting LP relaxation can be rounded to integrality.

3 - Robustness to Time Discretization Errors in Water Network Optimization
Nicole Taheri

Water network optimization can benefit water utilities by improving operations, such as finding lowest-cost pump schedules or determining valve placements and settings that result in minimal leakage. The formulation of such an optimization problem requires modeling the network and the progression of node pressures and water flows over time. The time discretization step for the resulting differential-algebraic equation must be chosen carefully, because a large time step can result in a solution that is feasible for the discretized model, but not feasible for the physical (continuous) system, whereas small time steps impact the tractability of the optimization problem at hand. In the study of water networks, the basic forward Euler method is the universally accepted method of choice, even though its limitations are well-known. We show that a large time step can result in meaningless numerical results, and we construct an upper bound on the error in the tank pressures found by using a forward Euler scheme for a general network. This error bound is then used to construct an optimization formulation that is robust to the discretization error. While many classical approaches for robust optimization dealing with uncertainty in the input data have been proposed, robust optimization with respect to uncertainty in the modeling of the system is novel in water network optimization.
We provide methods to find an upper bound on the discretization error for a chosen network and to formulate an optimization problem that is robust to the model uncertainty. These methods can be used to optimize water network operation given the error introduced by time discretization. Results are shown for these methods using our test network.

 FC-05
Friday, 11:20-12:50 - Fo5
Approximation Algorithms
Stream: Discrete and Combinatorial Optimization, Graphs and Networks
Invited session
Chair: Marc Uetz

1 - Primal-Dual Algorithms for Precedence Constrained Covering Problems
Andreas Wierz, Britta Peis, Thomas S. McCormick

We discuss precedence constrained knapsack problems (PCKP) and some extensions which are of high theoretical and practical interest, for example in portfolio optimization. Let us consider the following portfolio optimization problem: We are given a fixed budget and a set of assets which may be included in the portfolio. Each asset has a cost and an arbitrary, non-negative utility associated, for example projected profits. The optimization problem seeks a portfolio, that is, a set of assets, with maximal utility such that the budget constraint is met. In the presence of precedence constraints, some assets may only be included in the portfolio if a set of preceding assets was also selected. The additional restriction of precedence constraints often appears in practice, for example in real estate markets. Buying a single object may be expensive at low projected profit; however, buying an entire street block might enable us to design highly profitable building complexes. An intuitive extension of this problem has to ensure multiple budget constraints, which might arise for legal reasons. We present the first approximation algorithms for PCKP in the presence of arbitrary precedence constraints for both single and multiple budget constraints. In either case, the result is tight in the sense that the approximation algorithm may in fact output a solution meeting our approximation guarantee, which, interestingly, is independent of the number of budget constraints. The approximation algorithm uses a novel linear programming relaxation of PCKP which exploits the structure of the underlying precedence constraints and hence might be of independent interest. Finally, we show that there is no polynomial-time approximation scheme for PCKP under standard complexity assumptions.

2 - Almost Tight Approximation Results for Biclique Cover and Partition
Andreas Karrenbauer

We consider the Minimum Biclique Cover and Minimum Biclique Partition problems on bipartite graphs. That is, for a given bipartite graph, we wish to compute a small number of complete bipartite subgraphs (also called bicliques) such that each edge is contained in at least one of them. This problem, besides its correspondence to the well-studied notion of bipartite dimension in graph theory, has applications in many other research areas such as artificial intelligence, computer security, automata theory, electrical engineering, and biology. Since it is NP-hard, past research has focused on approximation algorithms, fixed parameter tractability, and special graph classes that admit polynomial time exact algorithms. For the minimum biclique partition problem, we are interested in a biclique cover that covers each edge exactly once.
We revisit the problems from the approximation algorithms' perspective and give nearly tight lower and upper bound results. We first show that both problems are as hard to approximate as the Node Coloring problem. That is, by exploiting properties of graph products, we obtain lower bounds for the approximation guarantee of polynomial-time approximation algorithms which grow almost linearly in the number of nodes. We thereby raise the best-known lower bound for Minimum Biclique Cover by Gruber and Holzer to a power of 3. The improvement of the lower bound for Minimum Biclique Partition is even more dramatic, where only APX-hardness with a constant slightly above 1 was known before. Furthermore, we show that sub-linear approximation factors can be obtained, which almost closes the remaining gap between upper and lower bounds.
Joint work with Parinya Chalermsook, Sandy Heydrich, and Eugenia Holm.
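The forward Euler discretization discussed in the Taheri abstract above can be illustrated on a single tank. The sketch below is a generic textbook example, not the network model from the talk; the function simulate_tank, the inflow/outflow profiles and the tank area are invented for illustration. It only shows how a coarse time step distorts the computed trajectory relative to a fine one.

    # Illustrative forward Euler integration of a single tank's water level.
    def simulate_tank(level0, inflow, outflow, area, dt, horizon):
        # level_{k+1} = level_k + dt * (inflow(t_k) - outflow(t_k)) / area
        levels = [level0]
        for k in range(int(horizon / dt)):
            t = k * dt
            levels.append(levels[-1] + dt * (inflow(t) - outflow(t)) / area)
        return levels

    inflow = lambda t: 2.0                            # m^3/h, assumed constant supply
    outflow = lambda t: 3.0 if t % 24 < 10 else 1.0   # m^3/h, assumed demand pattern
    coarse = simulate_tank(5.0, inflow, outflow, area=10.0, dt=6.0, horizon=48.0)
    fine = simulate_tank(5.0, inflow, outflow, area=10.0, dt=0.5, horizon=48.0)
    # the coarse step misses the demand switch and ends noticeably off (about 5.0 vs 5.8 here)
    print(round(coarse[-1], 2), round(fine[-1], 2))

A schedule that keeps the coarse trajectory within the tank's bounds may violate them in continuous time, which is exactly the discretization error the talk's robust formulation guards against.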


3 - Unrelated Machine Scheduling with Stochastic Processing Times
Marc Uetz, Martin Skutella, Maxim Sviridenko

Two important characteristics encountered in many real-world scheduling problems are heterogeneous processors and a certain degree of uncertainty about the sizes of jobs. In this paper we address both, and study for the first time a scheduling problem that combines the classical unrelated machine scheduling model with stochastic processing times of jobs. Here, the processing times of jobs on machines are governed by independent random variables, and their realizations become known only upon job completion. We study the objective of minimizing the expected total weighted completion time. By means of a novel time-indexed linear programming relaxation, we compute in polynomial time a non-anticipatory scheduling policy with performance guarantee arbitrarily close to 3/2+D/2. Here, D is an upper bound on the squared coefficient of variation of the processing times. When jobs also have individual release dates, our bound is 2+D. We also show that the dependence of the performance guarantees on D is tight. Notably, via D=0 the currently best known bounds for deterministic scheduling on unrelated machines, 3/2 and 2 respectively, are contained as a special case.

 FC-06
Friday, 11:20-12:50 - Fo6
Sustainable Energy Supply Networks
Stream: Energy and Environment
Invited session
Chair: Laura Elisabeth Hombach

1 - Sustainability Assessment of Hydropower Plants: A Review of Applications and Methods
Lea Berg, Simon Hirzel, Benedikt Freiherr von Lüninck, Felix Tettenborn

The rising energy demand of rapidly developing countries, such as China, has to be met with a fast expansion in electricity generation capacity. As electricity generation from fossil fuels is linked to environmental pollution, capacity expansion is increasingly governed by the paradigm of sustainability. Small hydropower plants (SHPs) cater to the principle of sustainable development through beneficial effects such as poverty alleviation in underdeveloped rural regions resulting from the provision of affordable electricity, or the associated reductions in greenhouse gas emissions. Impacts both on the environment and on the society nearby, however, are adversely connoted in sustainability considerations.
Decision making on design, location and management issues of SHPs, with a view to increasing the sustainability of projects, poses a multi-dimensional problem, to which many scholars apply methods from operations research, in particular multi-criteria analysis. A review of the pertinent literature yields two insights. First, sustainability lacks a clear-cut definition, giving room to a plethora of approaches and virtuous sets of criteria. Second, authors focus either on existing SHPs or on planning new SHPs. Yet an assessment of the sustainability of SHPs needs to grasp all relevant impacts in an integrated manner. Therefore, this contribution draws up a methodology applicable to ex-ante as well as ex-post assessments of SHPs based on a review of existing applications and methods, and it derives a catalogue of sustainability criteria for evaluating SHPs.

2 - Developing a decision support framework to regulate the fuel / bio-fuel sector under consideration of economical, ecological and social objectives
Laura Elisabeth Hombach, Grit Walther

In the future, the usage of fossil fuels must be reduced in order to ensure supply security as well as projected emission savings within the transportation sector. One option for achieving these targets is the substitution of fossil fuels by biofuels. When using biofuels, the occurrence of negative social side-effects has to be observed. Those social side-effects include the competition of biofuels with food/fodder production as well as the appearance of (direct/indirect) land use change. To avoid land use change, the minimization of life cycle emissions of the fuel sector is not sufficient, as the emissions emitted by biofuel production (especially from the 2nd generation) are still below those of fossil fuels including land use change emissions. Thus, we develop a three-objective, multi-period optimization model, considering cultivation of biomass, production of biofuels, import of biofuels and biomass, as well as blending of fuels. Our aim is to identify Pareto-efficient solutions and to derive trade-off relations for political decision makers regarding profit maximization, emission minimization, and land use change minimization. To calculate the efficient frontier of the three-objective MILP we use the augmented epsilon-constraint approach. To estimate the optimal solutions of existing political regulations according to ecological, economic and social targets, lexicographical ordering is used. The model is applied to a case study of the German (bio)diesel market, and the existing political regulations are analyzed to derive implications and recommendations for political decision makers.

3 - Demand side management for the household sector - case study of prospective household technologies
Martin Bock, Grit Walther

Electrical power demand and supply fluctuate over time, but have to be balanced in the electrical power net. For the private household sector, further fluctuations of demand are expected to increase with the diffusion of power-intensive technology like battery electric vehicles (BEVs) and heat pumps. By applying demand side management (DSM), either by incentives or by direct control instruments, utilities influence energy demand. Against this background, the aim of this paper is to present a methodology to analyze the impact of DSM on household energy demand. Thus, we develop a two-step model which, first, generates household load profiles and, second, adjusts the load profiles in reaction to the used DSM instrument. The load profile of each household is generated by a bottom-up simulation model due to the complexity and randomness of residential appliance usage. In reaction to DSM, each household may alter its appliance usage, weighing cost minimization against its own comfort of living. Therefore, each household is modeled as a bi-criterial, mixed integer linear program for the second step. The MILP considers both a structural level of the energy infrastructure of the household and a behavioral level of the resident behavior. By the disaggregated level of modeling, different energy carriers, energy storage and transformation are considered for each household. The developed model is applied to a case study of a future household sector which includes prospective appliances. Different types of appliances such as BEVs, heat pumps, battery storages and combined heat and power generators are considered. Implications from the utilization of DSM for the households, the utility and the power net are derived from the case study.

 FC-07
Friday, 11:20-12:50 - Fo7
New effective approaches in combinatorial optimization
Stream: Discrete and Combinatorial Optimization, Graphs and Networks
Invited session
Chair: Frauke Liers
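The augmented epsilon-constraint scalarization mentioned in the Hombach and Walther abstract above has a standard generic form (in the style of Mavrotas); the sketch below is that textbook form, not the authors' exact model. Keeping one objective (say profit f_1, maximized) and bounding the others (emissions f_2 and land use change f_3, minimized) with epsilon values, slack variables s_l, objective ranges r_l taken from a payoff table, and a small delta > 0:

    \max_{x \in X,\; s_2, s_3 \ge 0} \; f_1(x) + \delta \Bigl( \tfrac{s_2}{r_2} + \tfrac{s_3}{r_3} \Bigr)
    \quad \text{s.t.} \quad f_2(x) + s_2 = \varepsilon_2, \qquad f_3(x) + s_3 = \varepsilon_3 .

Sweeping epsilon_2 and epsilon_3 over a grid between each objective's best and worst payoff-table values and solving the resulting single-objective MILPs traces an approximation of the efficient frontier; the augmentation term avoids returning only weakly efficient solutions.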


1 - ILP Formulations for the Multiple Constant Multiplication Problem
Diana Fanghaenel, Martin Kumm

The Multiple Constant Multiplication problem is an important problem in the field of digital signal processing that has been investigated for decades by engineers. There are many applications, such as the implementation of digital filters and linear transforms, where a variable (input signal) has to be multiplied with a set of constants. In hardware, this is realized using addition/subtraction and bit shift operations. The problem is to find a circuit that realizes the multiplications with a minimum number of additions/subtractions (bit shifts are assumed to be without costs). The Multiple Constant Multiplication Problem can be described as a Steiner tree problem in directed hypergraphs.
We present different ILP and MILP formulations of the problem. Finally we discuss different extensions of the Multiple Constant Multiplication Problem.

2 - On bounding the bandwidth of graphs
Renata Sotirov, Edwin van Dam, Franz Rendl

For (undirected) graphs, the bandwidth problem is the problem of labeling the vertices of a given graph with distinct integers such that the maximum difference between the labels of adjacent vertices is minimal. In this talk we present two new semidefinite programming bounds for the problem; one is suitable for graphs with symmetry, and the other one is applicable to any graph of moderate size. In order to evaluate the lower bounds, we also compute upper bounds. Consequently, we are able to determine an optimal labeling for several graphs under consideration.

3 - An Exact Solution Method for the Quadratic Matching Problem: The One-Quadratic Term Approach and Generalizations
Lena Maria Hupp, Frauke Liers, Laura Klein

The quadratic matching problem (QMP) asks for a matching in a graph that optimizes a quadratic objective in the edge variables. The QMP generalizes the quadratic assignment problem. Applications of the QMP exist in computer vision, when for example a moving person is identified automatically on photos that are taken within a short period of time. More generally, the problem of finding 'highly similar' subgraphs in two given graphs can be solved by determining a QMP. When using branch-and-cut approaches, usually the binary quadratic problems are linearized by introducing additional variables that model the product terms, together with linearization constraints. However, in general the LP-relaxations of the linearized IP-formulation yield weak bounds. In our approach, we strengthen the linearized IP-formulation by cutting planes that are derived from facets of the corresponding matching problem where only one quadratic term is assumed in the objective function (QMP1). We present new classes of facets that arise from the well known blossom inequalities of the matching problem. We show that separation of these new inequalities is polynomially solvable. We present different methods to strengthen the new relaxations for the general QMP. Thus, during separation, additional cutting planes for the general QMP can be derived from valid inequalities of the QMP1. In particular, we introduce a method based on the linearization-reformulation technique that generates valid inequalities for the QMP from valid inequalities of the QMP1. Based on these results, we design and implement an exact branch-and-cut approach and report computational results.

4 - The Assignment Problem with Precedence Constraints and its Application in Air Traffic Management
Andrea Peter, Thorsten Ederer, Frauke Liers, Alexander Martin

Optimization problems abound in air traffic management. A relevant problem consists in scheduling the landing and departing airplanes on a runway. In this talk, we consider a single runway. The task is to assign each airplane a discrete time slot such that the makespan is minimized. Each time slot can be assigned at most once, to either a landing or a departing aircraft. The freedom of the ordering is limited by precedence constraints that are present between some pairs of aircraft. In practice, there exist an arrival planner and a departure planner that can assign time slots to aircraft. First, we present a mathematical model for the 'worst-case' scenario in which the arrival planner and the departure planner act as adversaries and alternatingly assign a certain time slot to some aircraft, together with computational results. The model yields an integer program together with quantifier variables. For an instance, it answers the question whether there exists a successful scheduling strategy for the arrival planner, irrespective of the strategy of the departure planner. Secondly, we present a model for runway scheduling with precedence constraints in which the arrival and the departure planner cooperate in order to derive improved schedules. The task of determining an optimum slot assignment for landing and departing airplanes simultaneously is modeled as a linear mixed-integer problem. Its polyhedral structure is analyzed. Furthermore, the complete facial description is derived for the corresponding assignment problem with one additional precedence constraint.

 FC-08
Friday, 11:20-12:50 - Fo8
Resource Allocation Games
Stream: Algorithmic Game Theory
Invited session
Chair: Sascha Kurz

1 - Budget-restricted utility games with ordered strategic decisions
Maximilian Drees, Alexander Skopalik

Recent advancements of network technology have enabled and simplified outsourcing of processing and storing information to remote facilities. The offering of such services in a competitive environment has become known as cloud computing. The competitive aspect is twofold. On the one hand, customers compete over the allocation of various types of services and resources like bandwidth or computing power. These resources are usually limited in capacity, and when the demand exceeds that capacity, customers' demand can only be satisfied partially. On the other hand, service providers face strategic decisions in the markets which have to take into account the budgets of their clients. As long as a client can afford all the desired products, this has no consequence. But once their total costs exceed his budget, he has to split it between them. When deciding to offer a product, a provider therefore has to consider the remaining budgets of the interested clients. We study budget games as strategic games as well as in a variant that takes temporal aspects into account. Strategic games are often analyzed as one-shot games, which do not capture situations like a new provider having a disadvantage against those already established. The clients prioritize the products they already know and spend only what is left of their budget on what a new provider offers. As a result, he cannot gain more than what is left of a client's budget. In this approach, called ordered budget games, we take the order of strategy changes into account. Each client has an ordering of the products and its budget is allocated to them in that order. If a player decides to change its strategy, its supply of products changes and new products are moved to the last position in the ordering of the clients in the target group.

2 - Sharing costs for good equilibria
Daniel Schmand, Max Klimm

In cost sharing games, the existence and efficiency of pure Nash equilibria fundamentally depends on the underlying cost sharing protocol. We consider a general class of resource allocation problems in which a set of resources is used by a heterogeneous set of selfish users. The cost of a resource is a (non-decreasing) function of the set of its users. The set-dependency of the cost functions allows to model different technologies at the resources required by different users, such as machines, bandwidth, personnel, etc.
Under the assumption that the costs of the resources are shared by uniform cost sharing protocols, i.e., protocols that use only local information of the resource's cost structure and its users to determine the cost shares, we give (asymptotically) tight bounds on the inefficiency of the resulting pure Nash equilibria. Specifically, we show tight bounds on the price of stability and anarchy for games with only submodular, only supermodular, or arbitrary cost functions, respectively. While all our upper bounds are attained for the well-known Shapley cost sharing protocol, all our lower bounds hold for arbitrary uniform protocols and even for games with anonymous costs, i.e., games in which the cost only depends on the cardinality of the set of its users.
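For reference, the Shapley cost sharing protocol named in the Schmand and Klimm abstract above assigns each user its average marginal contribution; the formula below is the standard textbook definition, stated here only as background and not as the paper's notation. For a resource with user set S, cost function C and a user i in S, the cost share is

    \xi_i(S) \;=\; \sum_{T \subseteq S \setminus \{i\}} \frac{|T|!\,\bigl(|S|-|T|-1\bigr)!}{|S|!}\; \Bigl( C\bigl(T \cup \{i\}\bigr) - C(T) \Bigr) ,

and the shares of all users of the resource sum to C(S), i.e. the protocol is budget balanced.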


3 - The price of fairness for a small number of indivisible items
Sascha Kurz

Assume that a set of agents has to subdivide a set of indivisible items. Preferences over the items may be diverse for different agents. We study the impact of fairness on the efficiency of allocations, looking at different fairness criteria from the theory of fair division. Current research has focused on the case where the number of items is arbitrary. Here we consider situations where the number of items is relatively small compared to the number of agents, which is relevant in practice and indeed makes a difference.

 FC-09
Friday, 11:20-12:50 - SFo1
Variational Inequalities and Related Topics II
Stream: Continuous and Non-linear Optimization
Invited session
Chair: Steffensen Sonja

1 - Algorithmic models of market equilibrium
Vladimir Shikhman, Yurii Nesterov

In this talk we suggest a new framework for constructing mathematical models of market activity. Contrary to the majority of the classical economic models (e.g. Arrow-Debreu, Walras, etc.), we get a characterization of general equilibrium of the market as a saddle point in a convex-concave game. This feature significantly simplifies the proof of existence theorems and the construction of adjustment processes both for producers and consumers. Moreover, we argue that the unique equilibrium prices can be characterized as a unique limiting point of some simple price dynamics. In our model, the equilibrium prices have a natural explanation: they minimize the total excessive revenue of the market's participants. Due to convexity, all our adjustment processes have an unambiguous behavioral and algorithmic interpretation. From the technical point of view, the most unusual feature of our approach is the absence of the budget constraint in its classical form.

2 - Coercivity of Multivariate Polynomials in Terms of Their Newton Polytopes
Tomas Bajbar, Oliver Stein

We state necessary conditions for coercivity of a multivariate polynomial involving the vertex set of its Newton polytope. We also discuss the issues around proving the sufficiency of these conditions.

3 - On proper efficiency in multiobjective semi-infinite optimization
Jan-J Ruckmann, Francisco Guerra-Vázquez

We consider multiobjective semi-infinite optimization problems which are defined by finitely many objective functions and infinitely many inequality constraints in a finite-dimensional space. We discuss constraint qualifications as well as necessary and sufficient conditions for locally weakly efficient solutions. Furthermore, we generalize two concepts of properly efficient solutions to the semi-infinite setting and present corresponding optimality conditions.

 FC-10
Friday, 11:20-12:50 - SFo2
Recycling and Electrical Vehicles
Stream: Energy and Environment
Invited session
Chair: Marcus Schröter

1 - A Quantitative Model of Electric Vehicles' Energy Consumption
Kim Lana Köhler, Michael H. Breitner

Electric mobility enables sustainable, climate- and environmentally friendly mobility to address globally discussed topics like the reduction of CO2 emissions, the reduction of total energy consumption, and the efficient use of available resources. For example, the German government gives a clear sign for the principal role of electric vehicles in the near future with the goal of one million electric vehicles on German roads by 2020. But the range of an electric vehicle is currently limited by battery cost and weight. It must be the goal of each driver to use the given energy as well as possible. Compared to conventional cars, where the driving behavior is a pivotal factor for fuel consumption, there are many more influences which have a significant impact on power consumption while driving electric vehicles. For example, the use of auxiliary equipment (e.g. interior heating) has a high influence and results in a significant range reduction. The missing understanding of the cause-and-effect laws for environmental (e.g. weather, route) and additional (e.g. number of car passengers) factors on the power consumption of an electric vehicle induces uncertainty and range anxiety, i.e. the fear of not reaching the desired destination. We describe a model which explains these interdependencies and a cause-and-effect model. The applied model provides assistance for the driver: it quantifies the interaction between the factors which influence the power consumption. To validate the model, test cycles with real-time data are conducted. The test cycles exemplarily show different effects on the power consumption by varying the identified influence factors.

2 - Strategic network planning of recycling of photovoltaic modules
Eva Johanna Degel, Grit Walther

The energy production of renewable energy sources is an essential means to stop the consequences of climate change. In Germany, the installed photovoltaic (PV) capacity increased steeply over the last years due to subsidies provided by the government. Considering the lifetime of PV modules of 25-30 years, the related amount of PV waste will increase during the next decades. Thus, PV modules have been integrated into the WEEE directive in the year 2012. In order to fulfil the WEEE recycling and recovery quotas, it will be sufficient to recover the fractions with the highest mass, i.e. glass and frame. However, it could be reasonable to recover other rare materials, like silver, copper, tellurium or indium, due to the evolving scarcity of certain resources and the limited availability of primary resources in Germany. The necessary technologies are still in the development or pilot stage. Against this background, the aim is to develop a strategic planning approach in order to analyse the early installation of appropriate collection and recycling infrastructures with a special focus on future resource criticalities. In order to do so, a multi-periodic MILP is developed regarding capacity, technology and location decisions for collection and recycling of PV modules. Decisions are evaluated with regard to economic aspects. Additionally, information on resource criticalities derived from criticality indicators is integrated. By using resource price scenarios and subjective probabilities for the scenarios, the model evaluates the effects of resource criticality on economic decisions in recycling network planning. A case study illustrates the approach and its results.

3 - Reuse and recycling of batteries from electric cars: Economic and Market consequences
Ramajothi Ramsundar, Peter Letmathe, Ilhana Mulic

In the future, electric mobility will become more important in the traffic sector. Lithium-ion batteries will power most of the electric vehicles. The batteries used in electric vehicles retain 80% of their capacity even after 7-8 years of use, which is sufficient for alternate use. The research question of this paper is to calculate the value of second-life batteries. We also determine the economic (price, profit) and market (demand) consequences of the second life of used electric vehicle batteries.
We use a closed loop supply chain (CLSC) framework for electric vehicles, which includes reuse, recycling and disposal of electric vehicle batteries. In our model, we use a generalized Bass diffusion model for the demand of electric vehicles. The reuse and recycling of used batteries from electric cars also have a significant effect on the pricing decision and profits of OEMs. We model this effect by optimizing the OEMs' profit in the CLSC framework. Another important factor that affects the profit is the return rate of used batteries.
We find that the optimal price for electric vehicles is influenced by the economic value of used batteries and can potentially increase the demand for electric vehicles. We show that the profits of OEMs are higher when considering a realistic closed loop supply chain framework for electric vehicles. Since their profit is also affected by the return rate of used batteries, OEMs should incentivize the customers to return the used batteries by applying different return policies.
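The generalized Bass diffusion model mentioned in the Ramsundar, Letmathe and Mulic abstract above builds on the basic Bass recursion. The snippet below is only an illustrative textbook version; the function name bass_adoption and the parameter values (market potential m, innovation coefficient p, imitation coefficient q) are invented and are not the model or data from the paper.

    # Illustrative discrete-time Bass diffusion of EV adoption (not from the paper).
    def bass_adoption(m, p, q, periods):
        cumulative, new_per_period = 0.0, []
        for _ in range(periods):
            n = (p + q * cumulative / m) * (m - cumulative)  # adopters this period
            new_per_period.append(n)
            cumulative += n
        return new_per_period

    demand = bass_adoption(m=1_000_000, p=0.03, q=0.38, periods=10)
    print([round(d) for d in demand])

In a closed-loop setting, a demand path of this kind also drives the later stream of battery returns, which is why the return rate enters the OEM's pricing problem.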


higher when considering a realistic closed loop supply chain frame- the activities is maximized. Doersch and Patterson (1977) discuss a work for electric vehicles. Since their profit is also affected by the mixed-integer linear programming formulation (MILP). Despite im- return rate of used batteries, OEMs should incentivize the customers provements in optimization software and computer hardware, nowa- to return the used batteries by applying different return policies. days such formulations are applicable to small-sized problem instances only. We present a heuristic method that combines mixed-integer linear pro- gramming with a network-based decomposition. The heuristic consists of two phases: In the first phase, the project network is decomposed  FC-11 into subsets of activities. In the second phase, these subsets of activities Friday, 11:20-12:50 - SFo3 are iteratively added to the partial schedule, applying an exact MILP formulation. Thereby, we do not fix the starting times of the activities which have already been scheduled, but we allow a time window in Vector and Set Optimization II which the activities can be shifted when inserting new activities to the partial schedule. A major advantage of such MILP-based heuristics is Stream: Decision Theory and Multi-Criteria Optimiza- the flexibility to account for additional constraints or modified plan- tion ning objectives. Our computational results indicate that the heuristic is Invited session able to devise optimal solutions to non-trivial problem instances, and outperforms the MILP of Doersch and Patterson (1977). Chair: Andreas Löhne Chair: Benjamin Weißing 2 - A Hybrid Metaheuristic for the Resource-Constrained Project Scheduling Problem with Flexible Resource 1 - Optimality conditions in set-valued programming Profiles Maria Pilecka Martin Tritschler, Anulark Naber, Rainer Kolisch We consider a set-valued optimization problem where a set-valued ob- We consider a generalization of the resource-constrained project jective mapping is minimized over a feasible set given by a closed con- scheduling problem (RCPSP), namely the RCPSP with flexible re- vex set. In this talk the notion of optimality introduced by Kuroiwa is source profiles (FRCPSP) in discrete-time periods. In the FRCPSP, regarded. The images of the set-valued mapping are assumed to be only the total required amount of each resource is given for each activ- compact and convex whereas the mapping is not convex. Such kind ity, whereas the activity duration and the resource allocation have to be of optimization problem may be interpreted as a bilevel optimization determined. As the resource allocation of an activity can be adjusted problem with a convex lower level problem possessing a compact fea- between time periods, the resulting resource profile of the activity is sible set. After introducing a special set difference, we define a di- flexible. The FRCPSP, therefore, determines for each activity the start rectional derivative and a subdifferential for set-valued mappings and time, the duration, and the resource allocation per time period in order investigate properties of these tools. We derive new optimality con- to minimize the project makespan. To solve the FRCPSP, we propose ditions for unrestricted set-valued optimization problems using both a hybrid metaheuristic that integrates a genetic algorithm and a vari- directional derivative and subdifferential. In addition, we also present able neighborhood search. 
The genetic algorithm employs a modified optimality conditions for restricted problems with the aid of the tangent parallel schedule generation scheme with a two-step resource alloca- cone (Bouligand cone) to the feasible set and the directional derivative. tion heuristic to generate schedules. We then further improve the best- found schedules with a variable neighborhood search by reallocating 2 - Set-valued shortfall risk measures via Lagrange du- resources among activities based on critical path calculations. We eval- ality uate the performance of our proposed method and compare the results Cagin Ararat to those of other heuristic and exact methods. Set-valued risk measures have been recently used to quantify risk in 3 - A Flow-Based Tabu Search Algorithm for the RCPSP multi-asset financial markets with transaction costs or other frictions. with Transfer Times In this work, it is assumed that there is an individual utility function Sigrid Knust, Jens Poppenborg for every asset and the set-valued shortfall risk measures are studied based on these utility functions. The value of a shortfall risk measure In this talk we present a tabu search algorithm for the resource- at a fixed random vector is defined as the solution of a certain convex constrained project scheduling problem (RCPSP) with transfer times. set optimization problem. Using a recent Lagrange duality for set opti- Solutions are represented by resource flows extending the disjunctive mization, the corresponding dual problem is obtained, which gives rise graph model for shop scheduling problems. Neighborhoods are de- to divergence risk measures - another new class of convex set-valued fined by parallel and serial modifications as suggested in Fortemps risk measures. The value of a divergence risk measure can be inter- and Hapke [1997]. This approach is evaluated from a theoretical and preted as a "partially scalarized" set optimization problem where one practical point of view. Besides studying the connectivity of differ- of the dual variables coming from Lagrange duality has the role of a ent neighborhoods, computational results are presented for benchmark scalarizing vector. It is shown that a shortfall risk measure can be writ- instances with and without transfer times. ten as an intersection, that is, a set-valued supremum, over a family of divergence risk measures. Examples of these risk measures include the set-valued versions of the entropic risk measure and average value at risk.  FC-13 Friday, 11:20-12:50 - SFo9 Machine Learning and Data Mining  FC-12 Friday, 11:20-12:50 - SFo4 Stream: Artificial Intelligence, Big Data, and Data Min- ing New Models in Project Scheduling Invited session Stream: Project Management and Scheduling Chair: Max Krueger Invited session 1 - Row and Column Generation Algorithm for Maxi- Chair: Sigrid Knust mization of Minimum Margin for Ranking Problem Yoichi Izunaga, Keisuke Sato, Keiji Tatsumi, Yoshitsugu 1 - An MILP-based Heuristic for the Capital-Constrained Yamamoto Net Present Value Problem Tom Rihm, Norbert Trautmann Ranking problem is a problem of learning a ranking function from the data set of n objects each of which is endowed with an attribute vec- The capital-constrained net present value problem consists of schedul- tor of m dimension and a ranking label chosen from the ordered set ing several project activities subject to completion-start precedence and of labels. 
2 - A Hybrid Metaheuristic for the Resource-Constrained Project Scheduling Problem with Flexible Resource Profiles
Martin Tritschler, Anulark Naber, Rainer Kolisch

We consider a generalization of the resource-constrained project scheduling problem (RCPSP), namely the RCPSP with flexible resource profiles (FRCPSP) in discrete-time periods. In the FRCPSP, only the total required amount of each resource is given for each activity, whereas the activity duration and the resource allocation have to be determined. As the resource allocation of an activity can be adjusted between time periods, the resulting resource profile of the activity is flexible. The FRCPSP, therefore, determines for each activity the start time, the duration, and the resource allocation per time period in order to minimize the project makespan. To solve the FRCPSP, we propose a hybrid metaheuristic that integrates a genetic algorithm and a variable neighborhood search. The genetic algorithm employs a modified parallel schedule generation scheme with a two-step resource allocation heuristic to generate schedules. We then further improve the best-found schedules with a variable neighborhood search by reallocating resources among activities based on critical path calculations. We evaluate the performance of our proposed method and compare the results to those of other heuristic and exact methods.

3 - A Flow-Based Tabu Search Algorithm for the RCPSP with Transfer Times
Sigrid Knust, Jens Poppenborg

In this talk we present a tabu search algorithm for the resource-constrained project scheduling problem (RCPSP) with transfer times. Solutions are represented by resource flows extending the disjunctive graph model for shop scheduling problems. Neighborhoods are defined by parallel and serial modifications as suggested in Fortemps and Hapke [1997]. This approach is evaluated from a theoretical and practical point of view. Besides studying the connectivity of different neighborhoods, computational results are presented for benchmark instances with and without transfer times.



 FC-13
Friday, 11:20-12:50 - SFo9

Machine Learning and Data Mining

Stream: Artificial Intelligence, Big Data, and Data Mining
Invited session
Chair: Max Krueger

1 - Row and Column Generation Algorithm for Maximization of Minimum Margin for Ranking Problem
Yoichi Izunaga, Keisuke Sato, Keiji Tatsumi, Yoshitsugu Yamamoto

The ranking problem is the problem of learning a ranking function from a data set of n objects, each of which is endowed with an attribute vector of dimension m and a ranking label chosen from an ordered set of labels. Attribute vectors of objects are separated by hyperplanes which share a common normal vector, and each object is then given a label according to the layer it is located in. The problem is to find the normal vector as well as the thresholds of each layer that best fit the input data. What distinguishes the problem from conventional multi-class classification problems is that the identical normal vector should be shared by all the separating hyperplanes. We propose to apply the dual representation of the normal vector to the formulation based on the fixed margin strategy by Shashua and Levin for the ranking problem. We keep the original constraints and replace the objective function by a quadratic function of the dual representation. The problem thus obtained has the drawback that it has n variables as well as n constraints; however, the fact that it enables the application of the kernel technique outweighs the drawback. The key idea is twofold: the dimension m of the attribute vectors is usually much smaller than the number n of objects, hence we need a small number of attribute vectors for the dual representation, and it is very likely that most of the constraints are redundant at the optimal solution. Thus we propose a row and column generation algorithm. Namely, we start the algorithm with a sub-problem which is much smaller than the master problem in both variables and constraints, and then increment both of them as the computation goes on. Some computational results will be reported on site.

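One common way to write the fixed-margin formulation referred to above (hard-margin version; the talk's variant may differ in details) uses ordered thresholds \(b_1 \le \dots \le b_{K-1}\) with \(b_0 = -\infty\), \(b_K = +\infty\) and label \(y_i\) for object \(i\):

\[
\min_{w,\,b}\ \tfrac12\lVert w\rVert^2
\quad\text{s.t.}\quad
b_{y_i-1}+1 \;\le\; \langle w, x_i\rangle \;\le\; b_{y_i}-1, \qquad i=1,\dots,n.
\]

Substituting the dual representation \(w = \sum_{j=1}^n \alpha_j x_j\) turns every inner product into \(\sum_j \alpha_j \langle x_j, x_i\rangle\) (hence kernelizable) and the objective into the quadratic form \(\tfrac12\,\alpha^\top K \alpha\) with \(K_{ij} = \langle x_i, x_j\rangle\); generating the \(\alpha_j\) (columns) and the layer constraints (rows) on demand is exactly the row and column generation described above.
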
2 - Bayesian network learning using integer programming
James Cussens

Bayesian networks (BNs) represent relations of conditional independence between random variables. Learning BNs from data (big or otherwise) is an important task that is known to be NP-hard in general. By casting BN learning as constrained optimisation, (constraint) integer programming (CIP) has been used to attack the problem - in our own work via the SCIP framework. In this talk I will report on what has been achieved by taking this approach and what remains to be done. An important advantage of using CIP is that it facilitates a 'declarative' approach to machine learning where the user need only declare the data and prior knowledge and the solver/developer has the job of finding an optimal BN given that information. I will discuss recent work where this has been exploited to find optimal sets of related BNs.

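The constrained-optimisation view can be made concrete with the widely used 'parent set' integer program (a sketch; in the CIP setting the acyclicity constraints are typically separated as cutting planes). With binary variables \(x_{W\to v}\) selecting parent set \(W\) for node \(v\) and local scores \(c_v(W)\):

\[
\max \ \sum_{v\in V}\ \sum_{W\subseteq V\setminus\{v\}} c_v(W)\, x_{W\to v}
\quad\text{s.t.}\quad
\sum_{W} x_{W\to v} = 1 \ \ (v\in V), \qquad
\sum_{v\in C}\ \sum_{W:\,W\cap C=\emptyset} x_{W\to v} \;\ge\; 1 \ \ (\emptyset \neq C\subseteq V),
\qquad x_{W\to v}\in\{0,1\}.
\]

The 'cluster' constraints require every set \(C\) of nodes to contain at least one node whose parents all lie outside \(C\), which excludes directed cycles.
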
3 - Monitoring of Bayesian Network Sources in Repeatedly Performed Classification Tasks
Max Krueger

Bayesian Networks are established means in various kinds of classification tasks, e.g., technical or medical diagnosis, pattern recognition, as well as air and sea surveillance. In Bayesian Networks, the probabilities of the query states of a node (the classification results) are determined based on declarations (findings of features) from different evidence nodes. Conflicts are pieces of such evidence from different sources that carry substantially different, but reliable information on the same object. They can be detected by an adequate conflict measure. In normal operations, conflicts between sources occasionally appear due to rare cases, situations not covered by the underlying Bayesian model, or inaccuracies of sensor measurements. Failing sources showing no declaration or the same declaration at all times can easily be detected by simple statistical means, whereas systematically deviating or random behavior of defective sources is harder to detect. Our working hypothesis is that these types of failure result in a significantly increased level of conflicts in applications where the same classification task is repeatedly performed in a large number of cases within a short time frame. We propose an approach to monitor all sources in such a Bayesian Classification Network by evaluating the conflict-ratio levels in a sliding window covering a certain number of previous classification cases. In a simulated scenario of maritime surveillance with the task of smugglers' detection, the ability of this monitoring approach to detect defective sources is evaluated by determining typical parameters which describe the diagnostic test performance of this approach.

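The sliding-window evaluation can be sketched in a few lines; the window length and the alarm threshold below are illustrative choices, not values from the talk:

    from collections import deque

    def monitor_conflict_ratio(conflict_flags, window=200, threshold=0.15):
        """Return the case indices at which the share of conflicting
        classifications within the last `window` cases exceeds `threshold`.
        `conflict_flags` holds one boolean per classification case (True =
        the conflict measure exceeded its critical value)."""
        recent, alarms = deque(maxlen=window), []
        for t, in_conflict in enumerate(conflict_flags):
            recent.append(bool(in_conflict))
            if len(recent) == window and sum(recent) / window > threshold:
                alarms.append(t)
        return alarms
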


 FC-14
Friday, 11:20-12:50 - SFo10

Planning Emergency Medical Services

Stream: Health Care Management
Invited session
Chair: Pascal Lutter

1 - Planning the Patient Transport as Part of the German EMS System
Melanie Reuter, Stefan Nickel

In Germany, the federal states have sovereignty over the Emergency Medical Service (EMS) system. Therefore, each state has its own EMS law including different rules and specific definitions of the provided services. Each state is then divided into smaller EMS regions with a rescue coordination center being responsible for the allocation and organization of the services. EMS systems in Germany are not only responsible for emergency services but also for the transport of patients if the attendance of an emergency medical assistant is necessary. Even if many of the transportation tasks are known in advance, trips are usually not planned at present, especially not automatically. One of the main problems in practice is the waiting time for the patients when ambulances arrive too late, but also the waiting time for the staff if patients are not ready at the hospital when they are supposed to be picked up. We want to show that, by modeling it as a dial-a-ride problem and solving it with (online) heuristics to include short-term demands, scheduling the patient transports can be reasonable in practice. Due to cost increases and cost pressure, which are typical for the healthcare sector, efficient planning methods become more and more important. We present a mathematical model and an algorithm for solving the patient transportation problem. We test these using data from a rescue coordination center located in the south of Germany. The long-term goal is to build a platform that connects the EMS regions within a federal state. It integrates the ability to schedule patient transports between different regions to avoid empty trips on the way back.

2 - Optimal adaptation process of Emergency Medical Services systems in a changing environment
Dirk Degel

Providing high quality emergency medical services (EMS) and ensuring accessibility to these services for the general public is a key task for health care systems. Given a limited budget, available resources, e.g., ambulances, have to be used economically in order to ensure a high quality coverage. Demographic changes, increased traffic volume and structural modifications in the urban infrastructure lead to permanent changes in EMS demand. In particular, the developments in the urban infrastructure include modifications of the road network, the provision of developing areas and the incorporation of neighboring cities including the centralization of EMS. An appropriate EMS infrastructure (number and positions of stations) and configuration of the EMS system (number, positions and relocations of ambulances) is needed to ensure an adequate coverage and high service quality. Most approaches in the literature dealing with strategic location and resource planning neglect the consideration of the initial state of EMS systems and possible future structural changes. In contrast, the presented approach identifies an optimal adaptation process, i.e., an enhancement of the existing EMS system to meet future requirements. This adjustment takes into account the existing EMS infrastructure and future developments in a dynamic manner, while respecting the EMS quality criteria. Tactical decisions are included into strategic planning and are combined with strategic decisions in order to stabilize the EMS system advancement against environmental changes. A linear multi-criteria program is developed and solved using a weighted sum approach. It supports EMS decision makers in dynamically improving an existing EMS system with respect to multiple requirements during a strategic time horizon.

3 - Analysis of ambulance location models using discrete event simulation
Pascal Lutter, Dirk Degel, Lara Wiesche, Brigitte Werners

Rescue services are an important part of public health care, offered by the state to the general public. A crucial aspect of rescue service is the first aid of patients provided by local Emergency Medical Services (EMS). The quality of a rescue service system is typically evaluated ex post by the proportion of emergencies reached within the legal time frame. Optimization models in the literature consider different variants of demand area coverage, such as single coverage, double coverage and empirically required coverage. Additionally, models with busy fractions and reliability levels serving as a proxy for EMS quality are suggested. All models support the decision maker on the strategic and tactical level of ambulance location planning, but differ regarding the specification of objective functions as well as concerning input parameters and model assumptions. In the literature, no comparisons of the mentioned models with respect to their influence on the EMS quality are found. In order to evaluate the performance of different optimization models, a detailed simulation study is conducted. We analyze the influence of different objective functions and the resulting positioning of EMS resources on real world outcome measures. Test instances include data sets of a large German city as well as randomly generated samples with different urban structures.

 FC-15
Friday, 11:20-12:50 - SFo11

Advances in credit scoring methodology II

Stream: Statistics and Forecasting
Invited session
Chair: Gero Szepannek

1 - Do we need scorecard cut-offs? From matching accept rates to maximising return on capital
Gerard Scallan

Credit scorecards estimate risk - or probability of default. For many years, credit grantors used score (with policy rules) as a 1-dimensional scale for decision making. This gave better operational control and reduced losses. Credit granting strategy largely reduced to fixing a scorecard cutoff. Over time, cutoffs became more sophisticated, taking into account loss levels, revenue, customer lifetime value and most recently capital requirements. However, PD is not profit. There will be many credit applicants below the scorecard cutoff who would be profitable - and many accepted applicants who have little chance of ever turning a profit. This has led to lenders switching dimensions, to look at profit rather than risk. But profit is largely driven by lending margins - in other words, price. What is the "correct" price for each customer? Each customer should be offered an individualised price which gives the lender an acceptable return on capital - taking risk into account. But there are (important) complications. Organisation: whose budget is responsible for the trade-offs involved in RORAC based decisions? How should the performance of different functions be measured? Stability: revenue and profit estimates are less stable than estimates of PD. What happens when real life departs from the models? Price sensitivity: in many markets, customers are increasingly sensitive to price. If the price is too high, only customers who can't get credit elsewhere will take the offer. Their risk and revenue performance may be very different from what was assumed. Recent experience - especially during the financial crisis - has tested some of the assumptions. But there are enough open questions to keep credit analysts in work for years to come!

2 - On class imbalancy correction for classification algorithms in credit scoring
Bernd Bischl, Gero Szepannek

The mathematical problem of credit scoring is often formulated as a binary classification task. Due to the nature of the problem, defaults rarely occur and the classes are therefore highly imbalanced. In contrast to the balanced setting and the many methods available for it, this aspect is still underrepresented in research despite its great relevance for many business applications, e.g., in response modeling or medical diagnosis. Imbalancy can substantially degrade the performance of a binary classifier - in the extreme resulting in a trivial majority vote. We have set up a systematic benchmark study for a real world credit scoring classification task in the open source framework "Machine Learning in R" (mlr). We will compare popular ways of mitigating the imbalancy problem in this setting. Results are presented and discussed concerning both the benefit of possible strategies as well as the effect of classifier choice and tuning parameters.

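Two of the correction strategies typically compared in such benchmarks, resampling and cost-sensitive class weights, can be sketched as follows (illustrative Python; the study itself is run in the R package mlr):

    import numpy as np

    def random_oversample(X, y, random_state=0):
        """Balance a binary data set by resampling the minority class with
        replacement (one of several possible correction strategies)."""
        rng = np.random.default_rng(random_state)
        classes, counts = np.unique(y, return_counts=True)
        minority = classes[np.argmin(counts)]
        extra = rng.choice(np.flatnonzero(y == minority),
                           size=counts.max() - counts.min(), replace=True)
        keep = np.concatenate([np.arange(len(y)), extra])
        return X[keep], y[keep]

    def balanced_class_weights(y):
        """Cost-sensitive alternative: weight each class inversely to its frequency."""
        classes, counts = np.unique(y, return_counts=True)
        return {c: len(y) / (len(classes) * n) for c, n in zip(classes, counts)}
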
 FC-16
Friday, 11:20-12:50 - SFo14

Energy Markets

Stream: Energy and Environment
Invited session
Chair: Katharina Wachter

1 - How Much is Enough? Optimal Capacity Payments in a Renewable-Rich Power System
Tuomas Rintamäki, Afzal Siddiqui, Ahti Salo

In Germany, the aggressive expansion of wind and solar power is decreasing power prices and eroding the viability of conventional electricity generation units of established utilities. The intermittency of renewables and insufficient transmission capacity from the windy north to the energy-intensive south has increased the need for grid congestion management (BMWi (2012)). Low operating hours have led to threats to decommission fast-adjusting conventional plants, which poses risks for the security of supply and grid stability. As a response, the transmission system operator, TenneT, and the Federal Network Agency, Bundesnetzagentur, have agreed to compensate costs of two plants in Bavaria (TenneT (2013)). As a long-term solution, BMWi (2013) has proposed a central capacity market. We build a complementarity model to investigate this market and to determine optimal compensations to conventional generators. Specifically, we recast the sequential model in Kunz (2013) as a bilevel optimisation model in which the day-ahead market decisions are taken at the upper level and congestion management decisions at the lower level, guided by the upper level compensation decisions that the regulator takes in order to minimize the total generation costs. We calibrate the model to the German power system and identify the congested parts of the transmission network: this gives insights about the geographical distribution of capacity payments under different demand and renewable energy scenarios. We also extend the model to multiple time periods to show how capacity payments create incentives to dispatch flexible units and thus mitigate the impacts of intermittent renewables. Our framework can be modified to study the impacts of other policy measures such as higher CO2 prices.


2 - Equilibrium pricing of reserve power
Lenja Niesen, Christoph Weber

With increasing shares of renewable generation, reserve power markets are expected to gain in importance. Competitively organized markets like in Germany are characterized by high prices with considerable fluctuations. Analytical investigations in a partial equilibrium framework reveal, however, that capacity prices should be rather low if reserve power is auctioned on an hourly basis. Numerical analyses are then used to quantify, in a large European electricity market model, the impact of different specifications of the reserve power products.

3 - A Real Options Model for the Disinvestment in Conventional Power Plants
Barbara Glensk, Christiane Rosen, Reinhard Madlener

The liberalization of the energy market and the merit-order effects lead to difficulties in the profitable operation of some modern conventional power plants. Although they are highly efficient with state-of-the-art technical properties, these power plants are underutilized or even mothballed. Decisions about further operation or shut-down of these conventional power plants are in most cases characterized by being irreversible, implying uncertainty about future rewards, and being flexible in timing. A relatively new approach for evaluating investment/disinvestment projects with uncertainties has been introduced by the real options approach (ROA) (Dixit and Pindyck, 1994; Schwartz and Trigeorgis, 2001). This valuation technique is based on option pricing methods used in finance and has been developed by Black, Scholes, and Merton (Black and Scholes, 1973; Merton, 1973), who based their valuation on partial differential equations. In the last two decades, real options models have been widely applied to analyze especially investment decisions under dynamic market conditions. ROA dominates the net present value approach and accounts for flexibility in the decision-making process. Nevertheless, the analysis of disinvestment decisions considering uncertainty of the market has been of high relevance in recent years, but so far it has not been applied to the energy sector. Moreover, disregarding disinvestment options in decision-making processes can lead to incorrect valuations of investment strategies at the firm level. In this paper, we develop a real options model for the disinvestment in conventional power plants. Using the real options approach we aim at determining the optimal timing for shut-down of unprofitable power plants.

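The flavour of such a valuation can be illustrated with a textbook binomial-lattice abandonment option (continuation value V, constant salvage value S, no intermediate cash flows); this is a generic sketch with made-up parameters, not the authors' model:

    import math

    def value_with_abandonment(V0, S, r, sigma, T, steps):
        """Value of a plant that may be shut down for salvage S at any period,
        computed by backward induction on a CRR binomial lattice of the
        continuation value V (illustrative sketch only)."""
        dt = T / steps
        u = math.exp(sigma * math.sqrt(dt))
        d = 1.0 / u
        p = (math.exp(r * dt) - d) / (u - d)            # risk-neutral probability
        disc = math.exp(-r * dt)
        values = [max(V0 * u**j * d**(steps - j), S) for j in range(steps + 1)]
        for step in range(steps - 1, -1, -1):           # step back through the lattice
            values = [max(S, disc * (p * values[j + 1] + (1 - p) * values[j]))
                      for j in range(step + 1)]
        return values[0]

    print(value_with_abandonment(V0=100.0, S=80.0, r=0.03, sigma=0.25, T=5.0, steps=200))
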


 FC-18
Friday, 11:20-12:50 - 004

Uncertainty

Stream: Supply Chain Management
Invited session
Chair: Stefan Woerner

1 - Robustification of 2-stage last mile delivery tour planning for stochastic demand
Jürgen Pannek, Matthias Klumpp, Nihat Engin Toklu, Roberto Montemanni

Uncertainty of demand is usually modelled in stochastic and multi-stage tour planning algorithms. Optimization in this context looks for minima of costs and travel distances (and therefore also carbon footprint size). As an example algorithm, the contribution explains a 2-stage depot-store last mile distribution based on a metaheuristic method modelling stochastic demand. As a further enhancement, a robustification with an alternative objective function is outlined and discussed. The idea of the modification is to compute a solution which remains optimal and unchanged even if unknown but bounded disturbances occur at runtime. To this end, costs are not exclusively assigned to travel distances, but also to switches in the solution structure. This is applied in order to further broaden the methodological basis of the model and therefore also the width of practical usefulness in business applications. Practical implications are discussed further by relating the results to the green bullwhip effect concept, assuming additionally increased order and safety stock levels in supply chains when green measures such as slow steaming, electric cars or load optimization are applied. Since robustness compensates for the decreased flexibility that comes with these measures but might lead to an increase in the travel distances, the approach represents a calculation model for the increased costs of green measures through flexibility reduction. Real cost increases, as modeled in a second stage delivery to satisfy all uncertain demand (realistic for most B2C industries like fashion, food and electronics shops), therefore represent the trigger and motivation for logistics managers to increase order and safety stock levels in all upstream sections from last mile delivery.


2 - Vendor Managed Inventory vs installation stock replenishment policy in a 3-stage supply chain under demand and supply uncertainty
Michael Vidalis

Supply chain management (SCM) is the management of flows (products, capital and information) among the stages of the supply chain, aiming to maximize the total expected profitability. Inventory control plays an important role in supply chain management. Properly controlled inventory can satisfy customers' demands, smooth the production plans, and reduce the operation costs. One alternative stock policy is Vendor Managed Inventory, where a single decision maker, the supplier, has the decision rights on all echelons in the supply chain. In this study, the system under consideration consists of three stages: a Distribution Center (DC), a wholesaler and a retailer. The wholesaler follows a continuous review (s, S) policy and acts as a Vendor Manager of the retailer's inventory level. The retailer faces Poisson demand and the excess demand is lost. The lead times are exponentially distributed. We model this supply chain network as a continuous Markov process with discrete states. The transition matrices have a blocked structure due to the fact that the system's flows constitute a QBD or left skip-free process. A computational algorithm is developed in order to generate the performance measures for different values of the system's characteristics. The major task is to compare the current VMI results with the results of the same supply chain network where both wholesaler and retailer follow continuous review installation policies.

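At the core of such an evaluation is the stationary distribution of the underlying continuous-time Markov chain. A generic sketch for a small finite state space is shown below (a toy birth-death generator stands in for the actual blocked QBD matrices, which would be exploited with matrix-analytic methods):

    import numpy as np

    def stationary_distribution(Q):
        """Solve pi Q = 0 with sum(pi) = 1 for a finite-state CTMC generator Q."""
        n = Q.shape[0]
        A = np.vstack([Q.T, np.ones(n)])     # append the normalization equation
        b = np.zeros(n + 1)
        b[-1] = 1.0
        pi, *_ = np.linalg.lstsq(A, b, rcond=None)
        return pi

    lam, mu = 1.0, 1.5                        # toy M/M/1/3-type chain
    Q = np.array([[-lam,       lam,       0.0,  0.0],
                  [  mu, -(lam+mu),       lam,  0.0],
                  [ 0.0,        mu, -(lam+mu),  lam],
                  [ 0.0,       0.0,        mu,  -mu]])
    print(stationary_distribution(Q))
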
3 - Multi-modal service level distributions in stochastic production networks
Stefan Woerner, Ulrich Schimpel

This work investigates multi-stage stochastic manufacturing and inventory systems in the semi-conductor industry. The systems use reorder point replenishment policies and face supply variability, demand variability and shared capacities among other business constraints. In this context the customer service level for the end product might follow a multi-modal distribution. This phenomenon challenges the usage of low-order moments for planning and analyzing the customer service level. We show the causes of this multi-modal behavior as well as a feasible approach to detect and mitigate these situations.

 FC-19
Friday, 11:20-12:50 - I

Rail Freight Transportation

Stream: Traffic and Transportation
Invited session
Chair: Alena Otto

1 - Efficient Rail Freight Transport
Frederik Fiand, Uwe T. Zimmermann

Based on a real world problem, we optimize efficiency in railway transportation. Given a set of shipment requests and predefined train schedules, we assign shipments to trains in an efficient way. For our industrial partner, the Kombiverkehr GmbH & Co KG (KV), we formulate a MIP model that minimizes the number of train changes for single loading units. In a feasible transportation plan every shipment has a delivery due date that needs to be met. To be able to handle the tremendous model size for realistic instances, we combine the usage of state-of-the-art MIP solvers with tailor-made techniques from the field of Combinatorial Optimization and a custom-built preprocessing. The KV provides us with real world data for that problem. As a next step, we consider energy efficiency as well. For that purpose, the power consumption of a train between two terminals is estimated as a function of transported weight, covered distance and difference in altitude. In cooperation with our other industrial partner, the Deutsche Bahn Mobility Logistics AG, we are working on a further extension where the predefined train schedules are not fixed anymore but train departure and arrival times can be varied within certain time intervals. The project is part of the BMBF-supported joint research project "e-motion".

2 - On the rail-rail transshipment yard scheduling problem
Mateusz Cichenski, Jacek Blazewicz, Grzegorz Pawlak, Erwin Pesch, Gaurav Singh

A hub-and-spoke railway system is an efficient way of handling freight transport by land. A modern rail-rail train yard consists of huge gantry cranes that move the containers between the trains. In this context, we can consider a rail-rail transshipment yard scheduling problem (TYSP) where the containers arrive at the hub and need to be placed on a train that will deliver them to their destination. In the literature the problem is decomposed hierarchically into five sub-problems, which are solved separately. First, we have to group the trains into bundles, in which the trains visit the yard and are processed at the same time. Next, we assign tracks to trains within these groups, namely parking positions. Then we have to find final positions for the containers on trains. Next we generate container moves that need to be performed to repack the trains. Finally, we have to assign those moves to the cranes for processing. We propose a model which solves the TYSP as a single problem. A mathematical formulation has been proposed, which enables us to define more robust and complex objective functions which include the key characteristics from each of the sub-problems. A batch of computational experiments has been conducted and compared to the results from the literature. The conclusion from the performed experiments is that the transshipment yard scheduling problem can be solved without the use of decomposition techniques.

3 - Allocating classification tasks at multiple-sided shunting yards
Alena Otto, Christian Otto, Erwin Pesch

Notwithstanding political initiatives to promote rail freight transportation, the actual share of freight rail transport is decreasing in the EU (from 15% in 1980 to 10% in 2010, EU statistics "Transport in Figures"). In fact, rail logistics fails to make a competitive offer to firms. The actual average speed of freight trains is estimated at about 10 km/h and only half of the freight trains reach their destination with less than a 30-minute delay. A recognized bottleneck in rail logistics is the inefficient operation of the shunting yards. We consider a tactical planning task of allocating the classification work within a multiple-sided shunting yard. The inbound freight trains have to be re-assembled, or classified, to form the outbound trains. Thereby the classification tasks can be distributed among several classification units. Rail cars can be transferred between classification units at rather high additional cost. Typically, the hub-yards within the rail network, such as Maschen (Hamburg), represent examples of such multiple-sided shunting yards. In our paper, we investigate how to allocate the classification tasks within multiple-sided shunting yards in an efficient way. We model the problem as a clique partitioning problem with additional constraints. We present a heuristic and an exact solution method.

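In its basic form (without the additional side constraints mentioned above), clique partitioning can be written with one binary variable per pair of items and transitivity constraints; a sketch:

\[
\min \ \sum_{i<j} c_{ij}\, x_{ij}
\quad\text{s.t.}\quad
x_{ij}+x_{jk}-x_{ik}\le 1,\;\;
x_{ij}-x_{jk}+x_{ik}\le 1,\;\;
-\,x_{ij}+x_{jk}+x_{ik}\le 1 \quad (i<j<k),
\qquad x_{ij}\in\{0,1\},
\]

where \(x_{ij}=1\) indicates that items \(i\) and \(j\) are assigned to the same class and \(c_{ij}\) is the (possibly negative) cost of grouping them together.
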


 FC-20
Friday, 11:20-12:50 - II

Flow Lines: Line Balancing and Scheduling

Stream: Production and Operations Management
Invited session
Chair: Marc-Andre Weber

1 - A New Design Team-Oriented Assembly Line Balancing Problem
Hamid Yılmaz, Mustafa Yilmaz

This paper draws attention to the assembly line balancing problem in which workers have been assigned to teams in advance, inspired by assembly lines with multi-manned work stations that differ from conventional ones. Team-oriented assembly line balancing is the problem of assigning tasks to multi-manned workstations while satisfying some constraints. Most research on team-oriented assembly line balancing problems focuses on conventional industrial measures, minimizing the total number of workers, the number of multi-manned workstations, or both. But in real life problems, workload density plays an important role for ergonomic measures. Therefore, a mathematical model is proposed that combines the minimization of multi-manned stations and of the difference in physical workload between workers.


2 - A Variable Neighborhood Search for a Re-entrant Permutation Flow Shop Scheduling Problem
Richard Hinze, Dirk Sackmann

This paper discusses a re-entrant permutation flow shop scheduling problem with missing operations. The two considered objective functions are makespan and total throughput time. Re-entrant flows are characterized by the multiple processing of jobs on one or more machines. The reasons for a re-entry of a job can be, e.g., rework or process-related. Each re-entry starts the job on a new production level. After giving an overview of recent literature on re-entrant scheduling problems, we introduce two possibilities to represent the job sequence for a re-entrant permutation flow shop problem with missing operations. Because the problem is NP-hard, we propose a heuristic for solving the problem. We chose Variable Neighborhood Search (VNS), since there have been promising approaches in the literature on other scheduling problems. This meta-heuristic framework combines the advantages of local search algorithms and tries to avoid getting stuck in local optima. The initial solution for the VNS is obtained by a dispatching rule. A hill climbing algorithm has been implemented as the integrated local search method of the VNS. The VNS is compared to a MIP formulation from the literature and one from our earlier work. The computational results show that the VNS delivers better objective values for larger problems if the computation time for the solution methods is limited to one hour. We also show for what problem sizes the VNS is applicable and what measures could be taken to make it applicable to even larger problem sizes.

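The overall control flow of such a VNS can be summarized in a generic template (the neighborhood moves, the hill-climbing local search and the cost function are passed in by the caller; this is an illustrative skeleton, not the authors' code):

    def variable_neighborhood_search(initial, neighborhoods, local_search, cost,
                                     max_rounds=100):
        """Generic VNS skeleton: shake in neighborhood k, improve by local
        search, and fall back to the first neighborhood after an improvement."""
        best = local_search(initial)
        for _ in range(max_rounds):
            k = 0
            while k < len(neighborhoods):
                shaken = neighborhoods[k](best)      # random move in neighborhood k
                candidate = local_search(shaken)     # e.g. hill climbing
                if cost(candidate) < cost(best):
                    best, k = candidate, 0           # improvement: restart at k = 0
                else:
                    k += 1                           # otherwise try a larger neighborhood
        return best
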
3 - Permutation Flow Shop Scheduling using Lot Streaming for Job-specific Due Date Vectors minimizing Due Date Deviation
Marc-Andre Weber, Rainer Leisten

Splitting of a number of jobs consisting of several identical items into consistent sublots with sublot-attached setups is studied. Processing is conducted in an overlapping way on at least two successive machines in a permutation flow shop environment, known as lot streaming. The minimization of the total time deviation from due windows under the assumption of several due windows per job is considered, including earliness and tardiness penalties. Lot streaming research has paid little attention to due date objective functions in general, and the research question of due date vectors per job in particular has not been addressed before. However, it is important if several identical items have to be produced and delivered to customers in various time slots. A decomposition based solution heuristic is presented, and sublots are allowed to hold between one item and the maximum number of items of the job. In the first step, priority rules are used to assign sublots to so-called dispatching positions, in which each position holds exactly one sublot and vice versa. In the second step, a mixed integer linear programming model obtains the sublot number, sublot sizes and scheduling plan simultaneously. To improve the solution quality, a simple genetic algorithm is presented to enhance the position allocation in the first step, while the decomposition approach is iteratively re-run until a stopping criterion is fulfilled. First numerical results are provided proving the model's effectiveness. The allocation of sublots to due dates and vice versa is analyzed, as well as the influence of the number of sublot positions and the setup durations.


 FC-21
Friday, 11:20-12:50 - III

Optimization Software II

Stream: Software Applications and Modelling Systems
Invited session
Chair: Franz Nelissen

1 - A new Java Framework for Resource Scheduling Problems
Christian Gahm

The Java Framework for Resource Scheduling (JFORS) is designed to support scientists in developing, implementing, enhancing, and evaluating solution methods (optimization algorithms) for Resource Scheduling Problems (RSPs). The application domain RSP includes different kinds of scheduling problems (e.g., single machine scheduling or flow shops), lot-sizing problems, or resource-constrained project scheduling problems. The basic purpose of JFORS is to enable the scientist to concentrate on the development and (performance) analysis of solution methods. Therefore, JFORS provides a comprehensive data model for all kinds of RSPs, problem instances, solution methods and their parameters, objective functions and their parameters, key figures, experiments, schedules, evaluation settings, and execution settings. All this information is saved persistently in a relational database (MySQL) by the object-relational mapping framework Hibernate. JFORS itself is divided into four software components: The first component comprises basic functionalities regarding problem specification, algorithm configuration, and key figure definition. The second component provides possibilities to import and manipulate problem instances and to define experiments. The third component offers functionalities to execute experiments on the local workstation in a single or multiple threads (by using the Parallel Java Library), or remotely on a computing grid (by using the JPPF framework). Functionalities to evaluate the performance of an algorithm and to export and process the results are bundled in the fourth component. All four components provide comprehensive user interfaces to simplify tasks like solver configuration, experiment design, or result analysis (e.g., concerning objective values, schedules, or key figures).

2 - LAVES: A(nother) Software for Supporting Students by Visualizing Algorithms
Dominik Kress, Jan Dornseifer, Erwin Pesch

The Logistics Algorithms Visualization and Education Software (LAVES) is an open source project at the University of Siegen, aiming at supporting non-mathematics and non-informatics (bachelor-level) students in understanding the basic concepts of algorithms that are applied to solve problems arising in Operations Research, especially in logistics, by means of visualization. It allows students to create problem instances click-by-click with direct graphical feedback, offers a set of algorithm related controls, presents execution-table views as used by the students when manually processing algorithms, depicts and highlights related pseudocode (including LaTeX formulas), and includes an exercise mode. LAVES is accompanied by a Development Kit (LAVES-DK) that allows instructors (with Java knowledge) to implement course-specific algorithm visualizations (called plugins) to be used with LAVES. The DK is generic and provides a broad range of tools. We will present the basic features of LAVES and LAVES-DK and report on first classroom experiences and student feedback.



3 - Design Principles that make the Difference
Franz Nelissen

Optimization is a small but essential element of many applications. Thus the capability to interact with other systems is a key design principle of GAMS - not just a superficial check on the feature list. With its open architecture, powerful modeling language and integrated state-of-the-art solvers, GAMS provides you the best tools to develop and seamlessly integrate optimization models into various environments without locking you into a particular solution. During this workshop we will illustrate these design principles in GAMS and show how they contribute to the success of several commercial and academic applications.

 FC-22
Friday, 11:20-12:50 - IV

Inventory Management

Stream: Logistics and Inventory
Invited session
Chair: Marko Jaksic

1 - Selling over an Uncertain Season: Scale and Timing of Inventory Availability
Jochen Schlapp, Moritz Fleischmann

Unknown customer demand patterns are a significant challenge for many firms. This is particularly true for seasonal products which oftentimes do not only suffer from an uncertain demand scale, but also from an uncertain demand timing. To ensure a product's profitability in such adverse market environments, firms have to coordinate a product's inventory scale with the inventory timing. Motivated by this commonly observed challenge, we address the following question: for a seasonal product, when and how much inventory should a firm stock to best satisfy uncertain customer demand over an uncertain selling season? Our analysis reveals that for an efficient inventory strategy, the firm has to actively manage a tradeoff between the product's market potential and the product's costly market time. We also find that the timing uncertainty has more severe repercussions on a product's profitability than an unknown demand scale. We discuss important managerial implications arising from these findings.


2 - Inventory policies for systems with updated supplier delivery information
Mahesh Srinivasan, Douglas Thomas

We consider a single item periodic-review inventory system with lost sales having iid demand and lead times. Customers receive periodic order updates including updated delivery information from the vendor. With advancement in information technologies, the availability of information systems and the ability to share, transmit and access information has increased. Based on our interaction with sourcing professionals in the industry, information access is not an issue anymore and information is readily available. The problem lies in knowing how to use information updates for better decision making. In this paper, we demonstrate how updated supply delivery information can drive better inventory decisions. We use dynamic programming based optimization and simulation to investigate improved inventory policies under such a system which uses updated delivery information. It is seen that such policies perform significantly better as compared to the classical base-stock order-up-to inventory policies. We demonstrate conditions under which such updated delivery information could be useful and the value of such information.

3 - Dual Sourcing Inventory Model with uncertain Supply and Advance Capacity Information
Marko Jaksic, Jan C. Fransoo

Lead time reduction is one of the main goals when one wants to pursue a concept of a lean and agile modern supply chain. However, many companies that have actively embarked on projects related to reducing lead times were, at least in the short run, faced with the fact that their customer service performance suffered. This has forced the customers to search for, or stick with, alternative, more reliable supply channels, through which they would improve the supply process reliability. We study a customer's perspective of this problem by modelling a periodic review, single stage dual sourcing inventory system with non-stationary stochastic demand, where replenishment can occur either through a regular stochastically capacitated supply channel and/or an alternative uncapacitated supply channel with a longer fixed lead time. While most of the multiple supplier research explores the trade-off between purchasing costs and indirect costs of holding safety inventory to cover against demand and supply variability, our focus lies more in studying the effect of capacity and lead time on supply reliability and the customer's order allocation decision to suppliers. In addition, we study a situation in which the unreliable supplier provides upfront information on capacity availability, denoted as advance capacity information, to the customer. We derive the optimal dynamic programming formulation and we show some of the properties of the optimal policy by carrying out a numerical analysis. Additionally, our numerical results on the benefits of dual sourcing and the value of sharing advance capacity information reveal several managerial insights.

 FC-23
Friday, 11:20-12:50 - V

Airline Applications

Stream: Traffic and Transportation
Invited session
Chair: Zhi Yuan

1 - Optimal Airline Networks, Flight Volumes, and the Number of Crafts for New Low-cost Carrier in Japan
Ryosuke Yabe, Yudai Honma

Recently, many low-cost carrier (LCC) companies have been founded, and LCCs have become popular as a new airline style. In Japan, "Peach Co." started business as an LCC in 2012. However, it is true that some companies are suffering from a slump in business. When founding a new LCC company, the design of the airline network is the most important factor for success. Therefore, in this research, we propose a mathematical model to optimize the airline network, the number of flights, and the number of airplanes in order to maximize a new LCC's profit, assuming a hub-and-spoke system. To calculate the solution, we incorporate real data on LCC revenues and costs. For revenue, we consider freight and incidental business revenue. For cost, airport set-up expenses, sales administrative expenses, expenses for holding airplanes, fuel expenses, employment expenses, and some other expenses are included. There are also some constraints; the most significant one is the flight time restriction. We used the "record of transportation" published by the Ministry of Land, Infrastructure, Transport and Tourism. In addition, for calculating each cost we used actual airlines' profit-and-loss statements, because the particular cost items are reported there. In this research, Narita Airport and the other top 15 airports are set as candidate airports. Out of these 16 airports, the top 6, which include Narita and Kansai International, can become hub airports. We use Wolfram Mathematica 9 to solve the optimization problem. First, we calculated the single-hub case. The result is that profit is maximized when Narita is chosen as the hub airport. Next, we considered the 2-hub case, with 15 possible pairs of hub airports. As a result, the Narita-Kansai International pair attained the maximum profit, and the top four pairs all involved Narita Airport.

2 - Fuel Efficient Vertical Optimization of Passenger Flight Trajectories
Zhi Yuan, Armin Fügenschuh, Anton Kaier, Swen Schlobach

Planning a fuel-efficient flight trajectory connecting the departure and destination is a hard optimization problem. The solution space of a flight trajectory is four-dimensional: a 2D horizontal space on the earth surface, a vertical dimension consisting of a number of discrete altitude levels, and a time dimension controlled by the aircraft speed. In practice, the flight planning problem is solved in two separate phases: a horizontal phase that finds an optimal 2D trajectory consisting of a series of segments, followed by a vertical phase that assigns optimal flight altitude and speed to each segment, where the altitude and speed can only be changed at the beginning of a segment. In this work, we focus on the vertical phase. In general, the higher an aircraft flies, the more fuel efficient it is. However, the optimal altitude for each segment also depends on several interconnected and dynamically changing aspects: weight, which decreases as fuel burns; speed, which changes flight time; and weather conditions, especially wind, which changes considerably by altitude and time. A time constraint is enforced such that a flight should arrive within a certain time window due to gate availability. How to assign altitude and speed to each segment of a flight trajectory while satisfying the arrival time constraint is then a challenging optimization problem. We formulate this problem as a mixed integer nonlinear programming model, and apply different acceleration techniques. Besides, general-purpose black-box optimizers and problem-specific heuristics are also applied. Experiments are conducted based on the real-world data provided by our industrial partner Lufthansa Systems. The experimental results confirm the fuel saving potential of the vertical profile optimization in flight planning.

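Stripped of weight, speed, wind and the arrival-time window, the vertical phase has a simple dynamic-programming core: choose an altitude level per segment so that fuel burn plus climb/descent cost is minimal. The sketch below illustrates only this simplified recursion; the inputs and the restriction to small level changes are illustrative assumptions, not the authors' MINLP:

    def optimize_vertical_profile(fuel, change_cost, max_level_change=1):
        """fuel[s][a]: fuel burned on segment s at altitude level a;
        change_cost[b][a]: extra fuel for moving from level b to level a.
        Returns (altitude profile, total cost) of a cheapest profile whose
        level changes between consecutive segments are bounded."""
        n_seg, n_alt = len(fuel), len(fuel[0])
        best = list(fuel[0])                       # cost of ending segment 0 at level a
        choice = [[0] * n_alt for _ in range(n_seg)]
        for s in range(1, n_seg):
            new_best = [float("inf")] * n_alt
            for a in range(n_alt):
                lo = max(0, a - max_level_change)
                hi = min(n_alt, a + max_level_change + 1)
                for b in range(lo, hi):
                    cand = best[b] + change_cost[b][a] + fuel[s][a]
                    if cand < new_best[a]:
                        new_best[a], choice[s][a] = cand, b
            best = new_best
        total = min(best)
        a = best.index(total)
        profile = [a]
        for s in range(n_seg - 1, 0, -1):          # backtrack the optimal levels
            a = choice[s][a]
            profile.append(a)
        return profile[::-1], total
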


 FC-24
Friday, 11:20-12:50 - AS

Combinatorial Clock Auctions

Stream: Pricing, Revenue Management, and Smart Markets
Invited session
Chair: Thomas Kittsteiner

1 - A Practical Guide to the Combinatorial Clock Auction
Oleg Baranov, Lawrence Ausubel

The Combinatorial Clock Auction (CCA) is an important recent innovation in auction design that has been utilised for many spectrum auctions worldwide. While the theoretical foundations of the CCA are described in a growing literature, many of the practical implementation choices are omitted. In this paper, we review and discuss the most critical practical decisions for a regulator implementing the CCA. The list of topics includes: incorporation of competition policy objectives, implementation of reserve prices, activity rules, price incrementing policy, and accommodation of technological choice. We illustrate our discussion with examples from recent CCAs, including UK and Ireland spectrum auctions.


2 - Strategic Spiteful Bidding in Combinatorial Clock Auctions
Maarten Janssen

Combinatorial clock auctions (CCAs) have recently been used around the globe to allocate mobile telecom licenses. The CCA is presented to national authorities as a superior auction model that, because of its second-price rule, eliminates the scope for strategic bidding or "gaming" (see, e.g., Cramton, 2012), so that bidders can "simply bid their valuations". The bidding behaviour of firms in many of the recent real-world auctions is difficult to rationalize by bidders bidding value. This article analyzes the properties of the CCA in case bidders have a spite motive in that, ceteris paribus, they prefer outcomes where rivals pay more for their winning allocation. We show that if firms have a lexicographic spite motive, the Vickrey-Clarke-Groves (VCG) mechanism underlying the CCA does not have a (weakly) dominant strategy. Nevertheless, under additional conditions, the CCA, unlike the VCG mechanism, can be solved using iterative elimination of (weakly) dominated strategies. In the resulting equilibrium, bidders express aggressive bids above value on some packages, and they may even bid on packages without intrinsic value. Bidding truthfully is an iteratively dominated strategy. We show that bidders in the clock phase strategically expand their demand, while a former result on strategic demand reduction only holds true when there are 2 bidders. The spite motive also interacts in a complicated way with budget constraints. A budget constraint implies that bidders cannot pay more than a certain exogenously determined amount. Paradoxically (maybe), in a CCA this does not mean that (in the supplementary round) a bidder cannot bid more than its budget. We discuss some results showing that there will always be at least one bidder who bids above budget.


3 - Nash Equilibria of Sealed-Bid Combinatorial Auctions
Marion Ott, Marissa Beck

This paper characterizes the complete set of full-information Nash equilibria of a class of sealed-bid combinatorial auctions. This class contains the Vickrey auction, bidder-optimal core-selecting auctions, and the pay-as-bid auction as well as any other core-selecting auction. The characterizations for particular auctions are simple and intuitive, and allow for straightforward comparisons between auctions. All of the Nash equilibria of the pay-as-bid auction are Nash equilibria of every bidder-optimal core-selecting auction, and all of the latter's equilibria are also equilibria of the Vickrey auction. This paper also analyzes the possible equilibrium outcomes - assignments, payments, and payoffs - of these auctions. Any assignment and payments that generate individually rational payoffs are the result of some Nash equilibrium of the Vickrey auction and of every bidder-optimal core-selecting auction. We provide a necessary and sufficient condition for an outcome to result from an equilibrium of the pay-as-bid auction. This condition is stricter than individual rationality but still allows payoffs outside of the core. We consider extensions of the auction games to address the seller's incentives, the impact of budget constraints, and the implementation of the tie-breaking rule.



Friday, 13:00-14:15

 FD-01
Friday, 13:00-14:15 - Fo1

Closing Ceremony

Stream: Invited Presentations and Ceremonies
Plenary session
Chair: Marco Lübbecke

1 - Operations Research in the Era of Cognitive Computing
Brenda Dietrich

Cognitive computing systems learn and interact naturally with people to extend what either humans or machines could do on their own. They help human experts make better decisions by penetrating the complexity of Big Data. This talk will provide an overview of cognitive computing, discuss current applications of it, and explore the role operations research can play in extending cognitive computing beyond the domain of language-based reasoning.

SESSION INDEX

Wednesday, 9:00-10:30

WA-01: Opening Ceremony (Fo1) ...... 1

Wednesday, 10:50-12:20

WB-02: Mixed Integer Linear Programming (Fo2) ...... 1 WB-03: Computational Social Choice (Fo3) ...... 2 WB-04: Robust knapsack problems (Fo4) ...... 2 WB-05: Polyhedra (Fo5) ...... 3 WB-06: Recent Developments in Stochastic Programming (Fo6)...... 3 WB-07: Branch-and-Price/Branch-and-Cut (Fo7) ...... 3 WB-08: Cooperative Games (Fo8) ...... 4 WB-09: Variational Inequalities and Related Topics I (SFo1) ...... 4 WB-10: Sustainable Transport (SFo2)...... 5 WB-11: Data Envelopment Analysis (SFo3) ...... 5 WB-12: Flow Shop Scheduling (SFo4) ...... 6 WB-13: Fuzzy Expert Systems/Fuzzy Expertensysteme (SFo9) ...... 6 WB-15: Forecasting Stream Keynote (SFo11) ...... 7 WB-16: Optimization in Regional Energy Systems (SFo14) ...... 7 WB-19: Staff Scheduling and Rostering (I) ...... 8 WB-20: Integrating Lotsizing and Scheduling (II) ...... 8 WB-21: Optimization Software I (III) ...... 9 WB-22: Transport Network Planning & Operation (IV) ...... 9 WB-23: Service Analytics and Optimization (V) ...... 10 WB-24: Combinatorial Auctions (AS) ...... 11 WB-25: OR Success Stories I (AachenMuenchener Halle (Aula)) ...... 11

Wednesday, 13:10-14:40

WC-02: Online-Optimization (Fo2) ...... 12 WC-03: Computational Social Choice (Fo3) ...... 13 WC-04: Complex Scheduling (Fo4) ...... 13 WC-05: Combinatorial Algorithms (Fo5) ...... 14 WC-07: Logic-Based Benders Decomposition (Fo7) ...... 14 WC-08: Congestion Games (Fo8) ...... 15 WC-09: Nonlinear Optimization I (SFo1) ...... 15 WC-11: Multi-objective Optimization (SFo3) ...... 16 WC-12: Airport Operations Scheduling I (SFo4) ...... 16 WC-13: Matheuristics (SFo9) ...... 17 WC-15: Forecasting for Business Analytics I (SFo11) ...... 17 WC-16: Optimal Design and Operation of Pipeline Networks (SFo14) ...... 18 WC-17: Risk and Uncertainty (001) ...... 19 WC-19: Rail Transportation (I) ...... 19 WC-20: Robust Production and Distribution Planning (II) ...... 20 WC-21: Optimization Modeling I (III) ...... 20 WC-22: Vehicle Routing and Scheduling with Column Generation (IV) ...... 21 WC-23: Stochastic Flow Lines: Analysis and Optimization (V) ...... 21 WC-24: Smart Electricity Markets (AS) ...... 22 WC-25: OR Success Stories II (AachenMuenchener Halle (Aula)) ...... 22

Wednesday, 15:00-15:45

WD-01: Semiplenary Gritzmann (Fo1) ...... 23


WD-02: Semiplenary Fransoo (Fo2) ...... 23 WD-05: Semiplenary Lee (Fo5) ...... 23 WD-25: Company Award (AachenMuenchener Halle (Aula)) ...... 24

Wednesday, 16:05-17:35

WE-02: GOR Masterthesis Award (Fo2) ...... 24 WE-03: Security and Inspection Games (Fo3) ...... 25 WE-04: Scheduling with Uncertainties (Fo4) ...... 25 WE-05: Combinatorial & Polyhedral Aspects of Scheduling (Fo5) ...... 26 WE-06: Robustness Issues (Fo6) ...... 26 WE-07: Column Generation (Fo7) ...... 27 WE-08: Network Creation (Fo8) ...... 27 WE-09: Applications of linear and nonlinear optimization I (SFo1) ...... 28 WE-11: Decision Making and Game Theory (SFo3) ...... 28 WE-12: Resource-Constrained Project Scheduling: Efficient Solution Procedures (SFo4) ...... 29 WE-13: Analytics (SFo9) ...... 29 WE-15: Forecasting Applications for Quantitative Trading and Investing (SFo11) ...... 30 WE-19: Routing (I) ...... 30 WE-20: Supply Chain Design (II) ...... 31 WE-21: Distribution & Inventory Management (III) ...... 31 WE-22: Vehicle Routing with Intermediate Facilities (IV) ...... 32 WE-23: Energy, Heat and Steel Production (V) ...... 33 WE-24: Auction Theory (AS) ...... 33 WE-25: OR Success Stories III (AachenMuenchener Halle (Aula)) ...... 34

Wednesday, 17:45-18:45

WF-25: Business Panel Discussion (AachenMuenchener Halle (Aula)) ...... 35

Thursday, 8:15-9:45

TA-02: Modeling (Fo2) ...... 36 TA-03: Network Design (Fo3) ...... 36 TA-04: Approximation Algorithms in Robust Optimization (Fo4) ...... 37 TA-05: Transportation (Fo5) ...... 37 TA-06: Optimization Methods for Energy and Environment (Fo6) ...... 38 TA-07: Routing (Fo7) ...... 38 TA-08: Computing Equilibria (Fo8) ...... 39 TA-10: Planning of Local Generation and Consumption (SFo2) ...... 39 TA-11: Applications of Data Analysis and Data Envelopment Analysis (SFo3) ...... 40 TA-12: Scheduling on multiple machines (SFo4) ...... 41 TA-13: Metaheuristics in Production and Logistics (SFo9) ...... 41 TA-14: Robust and stochastic optimization in transportation (SFo10) ...... 42 TA-15: Advances in credit scoring methodology I (SFo11) ...... 42 TA-16: Energy Systems Engineering: Methods and Real-Life Applications (SFo14) ...... 43 TA-17: Accounting (001)...... 44 TA-18: Supply Chain Design and Collaboration (004) ...... 44 TA-19: Routing and Vehicle Logistics (I) ...... 45 TA-20: Modeling and Analysis of Complex Manufacturing Systems (II) ...... 45 TA-21: Optimization Modeling II (III) ...... 46 TA-22: Applications of combined Simulation & Optimization Methods in Logistics (IV) ...... 46 TA-23: Public Transport (V) ...... 47 TA-24: Revenue Management and Flexible Products (AS) ...... 47

Thursday, 10:15-11:00

99 SESSION INDEX OR 2014 - Aachen

TB-01: Semiplenary Ben-Tal (Fo1) ...... 48 TB-02: Semiplenary Marschner (Fo2) ...... 48 TB-04: Semiplenary McLay (Fo4) ...... 48

Thursday, 11:25-12:55

TC-02: Coloring (Fo2) ...... 49 TC-03: Multi-objective Integer Programming I (Fo3) ...... 49 TC-04: Logistics Scheduling (Fo4) ...... 50 TC-05: Network Design (Fo5) ...... 50 TC-06: Stochastic Programming Concepts and Models (Fo6) ...... 51 TC-07: Algorithm Engineering (Fo7) ...... 51 TC-08: Allocation under Preferences (Fo8) ...... 52 TC-10: Combined Heat and Power (SFo2) ...... 52 TC-11: Vector and Set Optimization I (SFo3) ...... 52 TC-12: Applications and Models in Project Scheduling (SFo4) ...... 53 TC-13: Metaheuristics for Assignment Problems (SFo9) ...... 54 TC-14: Health Care Operations Management (SFo10) ...... 54 TC-15: Forecasting for Business Analytics II (SFo11) ...... 55 TC-16: Sustainable Supply Chains (SFo14) ...... 55 TC-17: Optimization and Statistics (001) ...... 56 TC-18: Contracts and Pricing (004) ...... 56 TC-19: Collaborative Aspects in Routing and Scheduling (I) ...... 57 TC-20: Production Planning and Order Acceptance (II) ...... 57 TC-21: Distributed and Remote MIP Solving I (III) ...... 58 TC-22: Robust Vehicle Routing Problems under Uncertainty (IV) ...... 58 TC-23: Performance Evaluation (V) ...... 59 TC-24: Combinatorial Markets and Pricing (AS) ...... 59

Thursday, 13:45-14:30

TD-01: Semiplenary Boyd (Fo1) ...... 60 TD-02: Semiplenary Rönnqvist (Fo2) ...... 60 TD-04: Semiplenary Vohra (Fo4) ...... 60

Thursday, 15:00-16:30

TE-02: GOR Dissertation Award (Fo2) ...... 61 TE-03: Multi-objective Integer Programming II (Fo3) ...... 62 TE-04: Multistage Stochastic Programming (Fo4) ...... 62 TE-05: Traffic (Fo5) ...... 63 TE-06: Sustainable Production Planning (Fo6) ...... 63 TE-07: Location (Fo7) ...... 64 TE-08: Mechanism Design I (Fo8) ...... 64 TE-09: Applications of linear and nonlinear optimization II (SFo1) ...... 65 TE-10: Uncertainties in Energy Markets and Stochastic Models for Energy Economics (SFo2) ...... 65 TE-11: Decision Making in Practice (SFo3) ...... 66 TE-12: Airport Operations Scheduling II (SFo4) ...... 66 TE-13: Applications of Metaheuristics (SFo9) ...... 67 TE-14: Transportation in Health Care (SFo10) ...... 67 TE-15: Forecasting with Computational Intelligence (SFo11) ...... 68 TE-16: Modelling Material and Energy Flows (SFo14) ...... 68 TE-18: Information Asymmetry and Risk (004) ...... 69 TE-19: Train Path Assignment (I) ...... 69 TE-20: Manufacturing Applications (II) ...... 70 TE-21: Distributed and Remote MIP Solving II (III)...... 70 TE-22: Attended Home Deliveries (IV) ...... 71 TE-23: Electric and Hybrid Vehicles (V) ...... 71

100 OR 2014 - Aachen SESSION INDEX

TE-24: Non-Airline Applications of Revenue Management (AS) ...... 71

Friday, 8:15-9:45

FA-02: Matching (Fo2) ...... 73 FA-03: Network Games (Fo3) ...... 73 FA-04: Applications in Scheduling (Fo4) ...... 73 FA-05: Optimization in Engineering Science (Fo5) ...... 74 FA-06: Planning of Energy Availability (Fo6) ...... 74 FA-07: Packing (Fo7) ...... 75 FA-08: Mechanism Design II (Fo8) ...... 75 FA-09: Nonlinear Optimization II (SFo1) ...... 76 FA-10: Agent-based Simulation and Experimental Economics (SFo2) ...... 77 FA-11: Group Decision Making and Cooperation (SFo3) ...... 77 FA-12: Project Scheduling: Stochastic And Game Theoretic Aspects (SFo4) ...... 78 FA-14: Operating Room Planning & Scheduling (SFo10) ...... 79 FA-16: Location of Renewable Energy Sources (SFo14) ...... 79 FA-17: Financial Modeling (001) ...... 80 FA-18: Supply Chain Planning (004) ...... 80 FA-19: Routing: Pickup-and-Delivery (I) ...... 81 FA-20: Analytics in the Automotive Sector (II) ...... 81 FA-21: Math Programming Solvers (III) ...... 82 FA-22: Vehicle Routing and Scheduling and Pickup and Delivery (IV) ...... 82 FA-23: Lotsizing and Product Returns (V) ...... 83 FA-24: Revenue Management (AS) ...... 83

Friday, 10:10-10:55

FB-01: Semiplenary Puget (Fo1) ...... 84 FB-02: Semiplenary Dörner (Fo2) ...... 84 FB-04: Semiplenary Barber (Fo4) ...... 84

Friday, 11:20-12:50

FC-02: Complexity (Fo2) ...... 85 FC-03: Multi-criteria Decision Making Methods (Fo3) ...... 85 FC-04: Advanced techniques in robust optimization (Fo4) ...... 86 FC-05: Approximation Algorithms (Fo5) ...... 86 FC-06: Sustainable Energy Supply Networks (Fo6) ...... 87 FC-07: New effective approaches in combinatorial optimization (Fo7)...... 87 FC-08: Resource Allocation Games (Fo8) ...... 88 FC-09: Variational Inequalities and Related Topics II (SFo1) ...... 89 FC-10: Recycling and Electrical Vehicles (SFo2) ...... 89 FC-11: Vector and Set Optimization II (SFo3) ...... 90 FC-12: New Models in Project Scheduling (SFo4) ...... 90 FC-13: Machine Learning and Data Mining (SFo9) ...... 90 FC-14: Planning Emergency Medical Services (SFo10) ...... 91 FC-15: Advances in credit scoring methodology II (SFo11) ...... 92 FC-16: Energy Markets (SFo14) ...... 92 FC-18: Uncertainty (004) ...... 92 FC-19: Rail Freight Transportation (I) ...... 93 FC-20: Flow Lines: Line Balancing and Scheduling (II) ...... 94 FC-21: Optimization Software II (III) ...... 94 FC-22: Inventory Management (IV) ...... 95 FC-23: Airline Applications (V) ...... 95 FC-24: Combinatorial Clock Auctions (AS) ...... 96

101 SESSION INDEX OR 2014 - Aachen

Friday, 13:00-14:15

FD-01: Closing Ceremony (Fo1) ...... 97
