Conference 2015
Abstracts and Bios of Speakers, NGB/LNMB Seminar

Back to school: learn about the latest developments in Operations Research

Robert Bixby (Gurobi Optimization), day chair

Brief resume: Dr. Robert Bixby has a BS in Industrial Engineering and Operations Research from the University of California, Berkeley (1968), and a PhD in Operations Research from Cornell University (1972). He has held academic positions at the University of Kentucky, Northwestern University, and Rice University, as well as visiting positions at the University of Wisconsin, Cornell University, the Forschungsinstitut für Diskrete Mathematik, Bonn, Universität Augsburg, and the Konrad Zuse Zentrum, Berlin. He is currently Noah Harding Professor Emeritus of Computational and Applied Mathematics at Rice University, Research Professor of Management in Rice's Jones School of Management, and visiting Professor in the Department of Mathematics at Universität Erlangen. He is also the co-founder (2008) and CEO of Gurobi Optimization.
Dr. Bixby has published over fifty journal articles and is an acknowledged expert on the computational aspects of linear and integer programming. He has won several awards for his work in optimization, including a Humboldt Senior Scientist award, the Beale-Orchard-Hays Prize of the Mathematical Programming Society, and the INFORMS Impact and Frederick W. Lanchester Prizes. He was Editor-in-Chief of Mathematical Programming, Series A, from 1989 to 1994, and Chairman of the Mathematical Programming Society from 2001 to 2004. In 1997 he was elected to the National Academy of Engineering for his contributions to the theory and practice of optimization. In 2012 he was awarded an honorary doctorate in Mathematics from the University of Waterloo, Canada.
Dr. Bixby has over twenty-five years of experience in the optimization software business. He co-founded CPLEX Optimization, Inc., in 1987. CPLEX was acquired by ILOG, Inc., in 1997, after which he served on the ILOG Board of Directors and as manager of the ILOG CPLEX Development Team, President of the ILOG Technical Advisory Board, and General Manager of ILOG's Semiconductor Business Division.

Laurens van der Maaten (TU Delft)

Brief resume: Laurens van der Maaten is currently an Assistant Professor in the Intelligent Systems department of Delft University of Technology. Previously, he worked as a post-doctoral scholar at the University of California, San Diego, and as a (visiting) PhD student at Maastricht University, Tilburg University, and the University of Toronto. In February 2015, he will join Facebook AI Research in New York as a research scientist. His research interests include dimensionality reduction, embedding, metric learning, generative models, deep learning, time series modeling, structured prediction, regularization, face recognition, and object tracking. For his work on dimensionality reduction, he received the SNN Machine Learning Award 2012.

Title: Tutorial "Constructing Maps to Visualize Big Data"

Abstract: Visualization techniques are essential tools for every data scientist. Unfortunately, the majority of visualization techniques can only be used to inspect a limited number of variables of interest simultaneously. As a result, these techniques are not suitable for big data sets that are very high-dimensional.
An effective way to visualize high-dimensional data is to represent each data object by a two-dimensional point in such a way that similar objects are represented by nearby points, and that dissimilar objects are represented by distant points. The resulting two-dimensional points can be visualized in a scatter plot. This leads to a map of the data that reveals the underlying structure of the objects, such as the presence of clusters.
The talk gives an overview of techniques that can be used to construct such maps. In addition, we present a new technique to construct such maps, called t-Distributed Stochastic Neighbor Embedding (t-SNE). We demonstrate the value of t-SNE in domains such as computer vision and bioinformatics, and we show how to scale up t-SNE to Big Data sets with millions of objects.
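As a rough illustration of the idea behind t-SNE, the following sketch embeds high-dimensional points into 2-D using a Gaussian affinity in the input space and a heavy-tailed Student-t kernel in the map. This is a simplified sketch, not the algorithm as published: it uses a fixed bandwidth and plain gradient descent, omitting the perplexity calibration, momentum, and early exaggeration of the full method.

```python
import numpy as np

def tsne_sketch(X, n_iter=500, lr=1.0, seed=0):
    """Embed the rows of X as 2-D points so that similar rows end up nearby."""
    n = X.shape[0]
    # High-dimensional affinities: Gaussian kernel with a fixed bandwidth
    # (the published algorithm tunes a per-point bandwidth to a target perplexity).
    D = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    P = np.exp(-D)
    np.fill_diagonal(P, 0.0)
    P = np.maximum(P / P.sum(), 1e-12)
    rng = np.random.default_rng(seed)
    Y = rng.normal(scale=1e-2, size=(n, 2))  # random initial map
    for _ in range(n_iter):
        # Low-dimensional affinities: heavy-tailed Student-t kernel, which
        # gives t-SNE its name and alleviates the crowding problem.
        Dy = ((Y[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
        W = 1.0 / (1.0 + Dy)
        np.fill_diagonal(W, 0.0)
        Q = np.maximum(W / W.sum(), 1e-12)
        # Gradient of KL(P || Q) w.r.t. the map points: similar pairs
        # (P > Q) attract, dissimilar pairs (Q > P) repel.
        G = 4.0 * ((P - Q) * W)[:, :, None] * (Y[:, None, :] - Y[None, :, :])
        Y = Y - lr * G.sum(axis=1)
    return Y
```

Running the sketch on two well-separated clusters produces a map in which points from the same cluster lie closer together than points from different clusters, the structure a scatter plot of the result would reveal.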


Danny Holten (SynerScope)

Brief resume: Danny Holten received his MSc (hon., 2005) and PhD (2009) degrees in computer science from Eindhoven University of Technology (TU/e). His PhD research, carried out in the Visualization group of prof. Jack van Wijk at the same university, focused on the development of techniques for the visualization of graphs and trees to aid in program-understanding tasks. From 2009 to 2011, he worked as a postdoctoral researcher at the TU/e Dept. of Math. & Comp. Science and LaQuSo, the TU/e Laboratory for Quality Software, on the NWO "Expression of Interest" project with prof. Jack van Wijk, studying how interesting data aspects can be visualized in a clear way; this work led to generic models that formed the basis for a number of visualization techniques.
He (co-)received awards for his visualization work -- IEEE InfoVis 2006, best paper (Hierarchical Edge Bundling); ACM CHI 2009, best-paper nominee (visualization of directed edges in graphs); IEEE PacificVis 2013, best paper (reordering Massive Sequence Views) -- as well as the TU/e Doctoral Project Academic Award 2010 for the best PhD dissertation at TU/e.
Since April 2011, Danny Holten has been co-founder and Lead Visualization Scientist at SynerScope B.V., a Dutch TU/e spin-off company that combines his PhD research with the business vision of SynerScope's CEO and co-founder Jan-Kees Buenen to create Big Data analysis and visualization solutions that enable domain experts and analysts to make sense of their Big Data.

Title: "Applications of Data Visualization"

Abstract: Making sense of Big Data within various, often highly disparate, data domains has become an important research, industrial, and commercial topic during the last couple of years. The sense-making process used to turn raw, often unstructured, high-volume data into actionable information and insights spans a broad collection of (un)supervised and (semi-)automated approaches, techniques, and algorithms that can be used in complementary fashion. SynerScope demonstrates how interactive visual analysis solutions can be used to perform Active Discovery by means of a "Human-in-the-Loop" approach, in conjunction with (automated) data mining, machine learning, and analytics, to quickly discover unexpected (or confirm expected) trends, patterns, outliers, and interesting data aspects.
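The interplay between automated analytics and a human in the loop can be caricatured as follows: an automated pass flags statistically unusual records, and the analyst inspects only those candidates in an interactive visualization instead of scanning everything. The `flag_outliers` helper below is a hypothetical illustration using a simple z-score rule; it is not part of SynerScope's product.

```python
def flag_outliers(values, z=3.0):
    """Return the indices of values lying more than z standard deviations
    from the mean; these are the candidates handed to the human analyst."""
    mean = sum(values) / len(values)
    sd = (sum((v - mean) ** 2 for v in values) / len(values)) ** 0.5
    return [i for i, v in enumerate(values) if abs(v - mean) > z * sd]
```

For example, on a list of twenty ordinary transaction amounts plus one extreme value, only the extreme record is flagged, so the analyst's attention is directed to a single candidate rather than the whole data set.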


Jean-Francois Puget (IBM)

Brief resume: Jean-Francois Puget is an IBM Distinguished Engineer. He currently oversees the use of optimization in IBM analytics offerings and solutions for the IBM Software Group. In particular, he contributes code to the CPLEX Optimizer product as well as to new cloud offerings that leverage optimization and business analytics technologies. He managed the ILOG Optimization development team for over 22 years, where he led the introduction of successful new products and techniques to the market. He holds a PhD in Machine Learning from Universite Paris XI Orsay and is an alumnus of Ecole Normale Superieure Paris-Ulm in mathematics.

Title: Tutorial "Big Data Optimization"

Abstract: Big Data is a shortcut for a very interesting phenomenon: we now have data about almost everything. The value of Big Data lies in using data to make better decisions. Optimization, being the science of better decisions, has a natural role to play here. However, managing and using data can be challenging. Data can come in very large volumes. It can also come in a large variety of forms (e.g., audio, video, free text, text feeds, sensor measurements). Data can also be in motion (streamed) as opposed to at rest. Each of these Big Data dimensions (volume, variety, velocity) creates challenges and opportunities for optimization techniques and applications. We will review these challenges and explore potential approaches. We will also provide some actual examples where Big Data and optimization are used together in new, innovative applications.
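To make the velocity dimension concrete, here is a small, hypothetical sketch (not from the talk): decisions are re-optimized each time a new batch of data arrives, rather than once over a static data set. The fractional knapsack problem, which a greedy value/weight rule solves optimally, stands in for a real decision model.

```python
def fractional_knapsack(items, capacity):
    """Optimal value of the fractional knapsack: greedily take the
    items with the best value/weight ratio, splitting the last one."""
    total, cap = 0.0, capacity
    for value, weight in sorted(items, key=lambda it: it[0] / it[1], reverse=True):
        take = min(weight, cap)
        total += value * (take / weight)
        cap -= take
        if cap <= 0:
            break
    return total

def reoptimize_stream(batches, capacity):
    """Data in motion: accumulate each arriving batch of (value, weight)
    items and re-solve the decision problem on everything seen so far."""
    seen, decisions = [], []
    for batch in batches:
        seen.extend(batch)
        decisions.append(fractional_knapsack(seen, capacity))
    return decisions
```

Each new batch can change the optimal decision, which is exactly why streamed data forces re-optimization rather than a one-off solve.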



Björn Geißler (Friedrich-Alexander-Universität Erlangen-Nürnberg)

Brief resume: Björn Geißler studied computer science at Technische Universität Darmstadt, receiving his diploma in 2007. In 2011, he received his doctorate in mathematics from Friedrich-Alexander-Universität Erlangen-Nürnberg under the supervision of Prof. Dr. Alexander Martin for his thesis "Towards Globally Optimal Solutions for MINLPs by Discretization Techniques with Applications in Gas Network Optimization". He is interested in the solution of difficult mixed-integer linear and nonlinear optimization problems, in particular those with an underlying network structure. He has written more than 10 publications in international books and peer-reviewed journals. In 2013 he co-founded develOPT GmbH, a company for commercial optimization software and algorithm development. Since then, he has worked as managing director of develOPT and as a research associate at Friedrich-Alexander-Universität Erlangen-Nürnberg.

Title: Applying Mathematical Optimization to Germany's Largest Gas Transport Network

Abstract: German gas transport system operators (TSOs) are obliged to offer as much freely allocable capacity as possible. Freely allocable capacities enable gas traders to feed in or withdraw gas at certain entries and exits without having to care where the gas is withdrawn or fed in, respectively. When offering transmission capacities, the TSO has to ensure that every gas flow situation that may result from the offered capacities can be technically realized. This requirement can hardly be verified with existing simulation-based planning tools. In response, Germany's largest TSO, Open Grid Europe GmbH, initiated the research project "ForNe - Research Cooperation Network Optimization" in 2009, in which more than 30 mathematicians from 7 German research institutes successfully developed mathematical optimization models and algorithms to tackle this problem. The results of this research are now about to be transformed into application software that fulfills all the requirements of a production network planning tool.
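The discretization techniques named in the thesis title replace nonlinear terms, such as the pressure-loss functions on pipes, by piecewise-linear approximations, turning the MINLP into a mixed-integer linear program that standard solvers can handle. The sketch below is a hypothetical illustration of the core idea (with f(x) = x² standing in for a nonlinear pipe term): it builds a piecewise-linear interpolant on a uniform grid and measures the approximation error, which shrinks quadratically as the grid is refined.

```python
def pwl_approx(f, a, b, n):
    """Piecewise-linear interpolant of f on [a, b] over n uniform segments."""
    xs = [a + (b - a) * i / n for i in range(n + 1)]  # breakpoints
    ys = [f(x) for x in xs]

    def g(x):
        # Locate the segment containing x and interpolate linearly.
        for i in range(n):
            if xs[i] <= x <= xs[i + 1]:
                t = (x - xs[i]) / (xs[i + 1] - xs[i])
                return ys[i] + t * (ys[i + 1] - ys[i])
        raise ValueError("x outside [a, b]")

    return g

def max_error(f, g, a, b, samples=1000):
    """Largest observed gap between f and its approximation on a sample grid."""
    pts = (a + (b - a) * k / samples for k in range(samples + 1))
    return max(abs(f(x) - g(x)) for x in pts)
```

For f(x) = x² on [0, 1], the worst-case interpolation error on a grid with segment length h is f''·h²/8 = h²/4, so doubling the number of segments cuts the error by a factor of four; in a solver, each added breakpoint corresponds to extra binary or SOS2 variables, so grid resolution trades accuracy against MILP size.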