Conference 2023

 
 

48th CONFERENCE

 

Tuesday | Abstracts of presentations by PhD students III

 

Session A

 

A renewed take on weighted sum in sandwich algorithms: Modification of the criterion space

 

Session A: St. Jan-zaal

Melissa Koenen - m.f.koenen@tilburguniversity.edu

Tuesday, 14.30 - 14.55

Supervisors: Marleen Balvert, Hein Fleuren

 

 

Multi-objective optimization (MO) concerns the simultaneous optimization of more than one objective. When the objectives and constraints are convex, we are dealing with convex MO, for which the Pareto front is convex as well. Sandwich algorithms use an inner and an outer approximation that are iteratively updated to approximate the Pareto front of a convex MO problem up to epsilon precision, where the maximum distance between the inner and the outer approximation determines the direction in which the algorithm searches. Among the first to consider such a sandwich algorithm were Solanki et al. (1993), who use weighted sum scalarization to expand the inner approximation. Using the inner normals as weights, one is only guaranteed to find efficient points when the inner normals are positive. Although non-positive weights help to expand the inner approximation, they also start to describe part of the non-Pareto boundary of the convex MO problem. In our work we consider a modification of the criterion space, the MCS, which aims to describe less of this non-Pareto part than the formulation of Solanki et al., thereby avoiding the solution of uninformative LPs. Furthermore, we prove that for MO problems with linear objectives and constraints (MOLPs), all Pareto points can be obtained using weighted sum scalarization, as long as the considered criterion space is (made) bounded and contains all Pareto points. Finally, we present a new measure to approximate the distance between the inner and the outer approximation, which is computationally cheaper than the exact measure of Solanki et al. (1993) and less involved than that of Craft et al. (2006).
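
As a generic illustration of the weighted sum scalarization used above (standard background, not taken from the paper): for objectives f_1, ..., f_k over a convex feasible set X, each expansion step solves

    \min_{x \in X} \; \sum_{i=1}^{k} w_i f_i(x), \qquad w \ge 0, \; w \ne 0,

with the inner normal playing the role of the weight vector w. For w > 0 every minimizer is Pareto efficient, whereas for weights with zero or negative components the minimizer may lie on a non-Pareto part of the boundary of the criterion space, which is precisely the issue that the modified criterion space aims to mitigate.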

 

 

A hybrid local search algorithm for the Continuous Energy-Constrained Scheduling Problem

 

Session A: St. Jan-zaal

Roel Brouwer - R.J.J.Brouwer@uu.nl

Tuesday, 14.55 - 15.20

Supervisor: Marjan van den Akker

 

 

We consider the Continuous Energy-Constrained Scheduling Problem (CECSP), introduced by Nattaf et al. (2014). A set of jobs has to be processed on a continuous resource. Each job requires a given total amount of resource during its execution. We want to find a schedule in which each job starts no earlier than its release time, is completed by its deadline, and respects its lower and upper bounds on resource consumption during processing. Our objective is to minimize the total weighted completion time. We look at the case where both the resource and time are continuous, and we assume that there is no efficiency function influencing resource consumption. We present a hybrid local search algorithm that exploits a decomposition of the problem: local search is used to find an order of events (the starts and completions of jobs), and for a given order the optimal start and completion times, as well as the resource consumption, are determined with an event-based LP formulation. We perform computational experiments to compare the performance of our algorithm with exact approaches and to show its ability to deal with larger problem sizes. Our approach can be extended to deal with piecewise linear resource availability functions, explicit precedence relations and linear efficiency functions.
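
As an illustration of the decomposition described above, the sketch below separates the search over event orders from the timing subproblem. It is a simplified, hypothetical skeleton, not the authors' implementation: solve_timing_subproblem stands in for the event-based LP and is replaced here by a crude surrogate so that the example runs.

    def feasible(order):
        # Each job's start event must precede its completion event.
        started = set()
        for kind, job in order:
            if kind == "start":
                started.add(job)
            elif job not in started:
                return False
        return True

    def solve_timing_subproblem(order, weights):
        # In the real algorithm this is the event-based LP that, for a fixed event
        # order, returns optimal event times, resource consumptions and the total
        # weighted completion time. Here: a surrogate that uses the position of each
        # job's completion event as a proxy for its completion time.
        if not feasible(order):
            return None
        return sum(weights[job] * pos
                   for pos, (kind, job) in enumerate(order) if kind == "end")

    def neighbors(order):
        # Neighborhood: swap two adjacent events.
        for i in range(len(order) - 1):
            cand = list(order)
            cand[i], cand[i + 1] = cand[i + 1], cand[i]
            yield cand

    def hybrid_local_search(order, weights):
        best_cost = solve_timing_subproblem(order, weights)
        improved = True
        while improved:
            improved = False
            for cand in neighbors(order):
                cost = solve_timing_subproblem(cand, weights)
                if cost is not None and cost < best_cost:
                    order, best_cost, improved = cand, cost, True
                    break
        return order, best_cost

    # Illustrative data: three jobs with made-up completion-time weights.
    weights = {0: 2.0, 1: 1.0, 2: 3.0}
    order = [("start", j) for j in weights] + [("end", j) for j in weights]
    print(hybrid_local_search(order, weights))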

 

 

A novel approach for a broad class of nonconvex optimization problems

 

Session A: St. Jan-zaal

Danique de Moor - d.demoor@uva.nl

Tuesday, 15.20 - 15.45

Supervisor: Dick den Hertog

 

 

We introduce a novel approach, called the Reformulation-Perspectification Technique (RPT), to obtain convex approximations of nonconvex continuous optimization problems. RPT consists of two steps: the reformulation step generates redundant nonconvex constraints by pairwise multiplication of the existing constraints; the perspectification step then convexifies the nonconvex components by using perspective functions. RPT extends the existing Reformulation-Linearization Technique (RLT) in two ways. First, it can multiply constraints that are neither linear nor quadratic, and thereby obtain tighter approximations than RLT. Second, it can also handle more types of nonconvexity than RLT.
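
As a generic illustration of the two steps (standard facts, not a reproduction of the paper's derivations): given constraints g_1(x) \ge 0 and g_2(x) \ge 0, the reformulation step adds the redundant product constraint

    g_1(x) \, g_2(x) \ge 0,

which is in general nonconvex. The perspectification step then exploits the fact that, for any convex function f, its perspective

    h(x, t) = t \, f(x / t), \qquad t > 0,

is jointly convex in (x, t); after introducing auxiliary variables for the product terms, such perspectives yield a convex approximation of the reformulated problem.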

 

 

 

 

 

Session B

 

 

A model-based evolutionary algorithm for home health care scheduling

 

Session B: Congo-zaal

Yoram Clapper - y.clapper@vu.nl

Tuesday, 14.30 - 14.55

Supervisor: Rene Bekker

 

 

For the first time, a model-based evolutionary algorithm is presented for a real-life Home Health Care Routing and Scheduling Problem (HHCRSP). The algorithm generates routes consisting of care activities jointly with the underlying shift schedule, while taking the qualification levels into account. The performance is optimized in terms of travel time, time window waiting time and shift overtime. The algorithm is a novel extension of the permutation Gene-Pool Optimal Mixing Evolutionary Algorithm. Numerical experiments using real-life data show that the algorithm performs close to optimal for small instances and outperforms the schedules from a case study, leading to efficiency gains of 41%. Furthermore, it is shown that the model-based evolutionary algorithm performs better than a more traditional evolutionary algorithm, which demonstrates the importance of learning and exploiting a model to guide the optimization of the HHCRSP.
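
For illustration only, the sketch below shows the gene-pool optimal mixing idea on a random-key encoding of a single visit sequence. The travel-time matrix, the fixed linkage sets and the decoding are simplified assumptions and do not reflect the actual HHCRSP model (which also builds shift schedules and handles qualification levels and time windows).

    import random

    def decode(keys):
        # Random-key decoding: the visit order is obtained by sorting the keys.
        return sorted(range(len(keys)), key=lambda i: keys[i])

    def route_cost(perm, travel):
        # Toy objective: total travel time along the visits in the decoded order.
        return sum(travel[perm[i]][perm[i + 1]] for i in range(len(perm) - 1))

    def optimal_mixing(keys, population, linkage_sets, travel):
        # Copy variable subsets (linkage sets) from random donors and keep each
        # change only if the objective does not get worse.
        best = route_cost(decode(keys), travel)
        for subset in linkage_sets:
            donor = random.choice(population)
            trial = list(keys)
            for i in subset:
                trial[i] = donor[i]
            cost = route_cost(decode(trial), travel)
            if cost <= best:
                keys, best = trial, cost
        return keys

    def model_based_ea(travel, pop_size=20, generations=50, seed=1):
        random.seed(seed)
        n = len(travel)
        population = [[random.random() for _ in range(n)] for _ in range(pop_size)]
        # Simplistic fixed linkage model: singletons and adjacent pairs. A real
        # model-based EA would learn the linkage model from the population.
        linkage_sets = [[i] for i in range(n)] + [[i, i + 1] for i in range(n - 1)]
        for _ in range(generations):
            population = [optimal_mixing(ind, population, linkage_sets, travel)
                          for ind in population]
        best = min(population, key=lambda ind: route_cost(decode(ind), travel))
        return decode(best), route_cost(decode(best), travel)

    # Illustrative 4-client travel-time matrix (made-up numbers).
    travel = [[0, 3, 4, 2],
              [3, 0, 5, 6],
              [4, 5, 0, 1],
              [2, 6, 1, 0]]
    print(model_based_ea(travel))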

 

A distributionally robust perspective on the extremal queue problem

 

Session B: Congo-zaal

Wouter van Eekelen - w.j.e.c.vaneekelen@tilburguniversity.edu

Tuesday, 14.55 - 15.20

Supervisors: Johan S.H. van Leeuwaarden, Dick den Hertog

 

 

Stochastic models are traditionally built on the assumption that the probability distributions of the driving random variables are known, whereas a distribution-free perspective assumes that these distributions are only partially known. Distributionally robust analysis seeks to calculate the worst-case model performance over the set of distributions satisfying this partial information. For most stochastic models, finding this worst-case scenario requires convex optimization methods, which are strongly linked to the generalized moment problem. However, in this talk, we emphasize specific methodological challenges for stochastic models with multiple random variables that are independent and identically distributed (i.i.d.). Applying existing optimization methods then becomes difficult, or even impossible, since the resulting optimization problem is no longer convex.

We will connect these challenges with the problem of finding tight bounds for waiting time moments in the GI/GI/1 queue, as one of many examples from applied probability with i.i.d. random variables as input. For example, one of the most notorious (and still open) problems in queueing theory is to compute the worst-possible expected waiting time of the GI/GI/1 queue under mean-variance constraints for the interarrival- and service-time distributions. In this setting, the extremal distribution has only been determined numerically as the solution of a nonconvex optimization problem. We will instead address the extremal queue problem by measuring dispersion in terms of Mean Absolute Deviation (MAD) as a substitute for the more conventional variance. Using MAD instead of variance alleviates the computational intractability of the extremal GI/GI/1 queue problem, which arises from the i.i.d. property, enabling us to state the worst-case distributions explicitly. Combined with classical random walk theory, we then obtain explicit expressions for the best-possible upper bounds for all moments of the waiting time.
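
For concreteness, two standard ingredients behind the above (textbook definitions, not results from the talk): the waiting times of the GI/GI/1 queue follow Lindley's recursion, and dispersion is measured by the mean absolute deviation,

    W_{n+1} = \max\{0, \; W_n + S_n - A_{n+1}\}, \qquad \mathrm{MAD}(X) = \mathbb{E}\,|X - \mathbb{E}X|,

with S_n the service time of the n-th customer and A_{n+1} the subsequent interarrival time. The extremal (distributionally robust) problem then asks for the supremum of E[W] over all interarrival- and service-time distributions with the prescribed mean and dispersion.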

 

 

A framework for efficient dynamic routing under stochastically varying conditions

 

Session B: Congo-zaal

Nikki Levering - n.a.c.levering@uva.nl

Tuesday, 15.20 - 15.45

Supervisors: Marko Boon, Michel Mandjes, Sindo Nunez Queija

 

 

Despite measures to reduce congestion, both recurrent patterns (e.g., rush hours) and non-recurrent events (e.g., traffic incidents) cause large delays in highway networks with important economic implications. In this talk, we will introduce the Markovian Velocity Model (MVM), a framework capable of incorporating both effects. The model uses an environmental background process that tracks the random and (semi-)predictable events affecting the vehicle speeds in a highway network. The resulting continuous-time model, which allows for correlation between velocities on the arcs, is highly flexible. This flexibility can be employed to fit the model on a real-life network. The fitted model can then be used to predict travel time distributions and, as a result, provide guidance for travelers in this highway network.
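
As a toy illustration of a background process modulating vehicle speeds (a drastic simplification of the MVM, with made-up rates and speeds): the sketch below samples the travel time on a single arc whose speed alternates between a free-flow and a congested state according to a two-state continuous-time Markov chain.

    import random

    def sample_travel_time(length_km, speeds, rates, state=0, rng=random):
        # speeds[k]: speed (km/h) while the background process is in state k
        # rates[k]: exponential rate (per hour) of leaving state k (two-state chain)
        elapsed, remaining = 0.0, length_km
        while True:
            sojourn = rng.expovariate(rates[state])   # time until the next state change
            covered = speeds[state] * sojourn
            if covered >= remaining:                  # arc finished before the change
                return elapsed + remaining / speeds[state]
            elapsed += sojourn
            remaining -= covered
            state = 1 - state                         # switch free flow <-> congestion

    # Monte Carlo estimate of the mean travel time on a 20 km arc.
    rng = random.Random(7)
    samples = [sample_travel_time(20.0, speeds=(100.0, 30.0), rates=(2.0, 6.0), rng=rng)
               for _ in range(10000)]
    print(sum(samples) / len(samples))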

 

 

Redundancy systems with data locality constraints

 

Session B: Congo-zaal

Ellen Cardinaels - e.cardinaels@tue.nl

Tuesday, 15.45 - 16.10

Supervisors: Sem Borst, Johan van Leeuwaarden

 

 

Heterogeneity and compatibility relations between jobs and servers are becoming ubiquitous in cloud computing platforms due to data locality and network topology constraints. These features strongly diverge from the inherent symmetry of the supermarket model, the baseline scenario for performance benchmarking in parallel-server systems.

We focus in particular on redundancy scheduling systems, which exploit the variability of the service times of a job on different servers. More specifically, we gain insight into the performance of these systems with compatibility constraints in limiting regimes via product-form stationary distributions. For instance, in a heavy-traffic regime the system achieves complete resource pooling and exhibits the same behavior across a broad range of scenarios.
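
To indicate the type of product form involved (a known result for the classical redundancy model with independent exponential copies and cancel-on-completion, stated here only as an illustration): with the jobs ordered from oldest (c_1) to youngest (c_n),

    \pi(c_1, \ldots, c_n) = C \prod_{i=1}^{n} \frac{\lambda_{c_i}}{\mu\left(S(c_1) \cup \cdots \cup S(c_i)\right)},

where \lambda_c is the arrival rate of type-c jobs, S(c) the set of servers compatible with type c, \mu(A) the aggregate service rate of the servers in A, and C a normalizing constant.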