ZIB Forty – Past and Future

The Zuse Institute Berlin celebrates its 40th birthday in 2024, among other events with this scientific conference, held from 11 to 13 June. To contact the organizers, please send an email to 40@zib.de.

Program

June 11

Time / Speaker / Talk Title (for abstracts see below)
10:00-11:00 Registration, coffee
11:00-11:15 Welcome by Christof Schütte and Sebastian Pokutta
11:15-12:15 Session Chair: Christof Schütte and Sebastian Pokutta
11:15-11:45 Martin Grötschel
On the Role of Applications in Mathematical Theory and vice versa

11:45-12:15 Volker Kaibel
Cyclic Transversal Polytopes

12:15-12:45 Harry Yserentant
Finite Elements, Multigrid Methods, and Homogenization

12:45-14:00 Lunch
14:00-15:00 Session Chair: Tim Conrad and Max Fackeldey
14:00-14:30 Hans-Christian Hege
From data visualization to data reasoning

14:30-15:00 Stefan Weltge
Integer programming and subdeterminants

15:00-15:30 Coffee break
15:30-16:30 Session Chair: Martin Weiser
15:30-16:00 Günter Ziegler
Historical Perspective Already Now?
16:00-16:30 Jens Lang
Adaptive Algorithms for Elliptic PDEs with Random Data

16:30-16:45 Day Wrap-Up by Christof Schütte and Sebastian Pokutta and Group Photo

June 12

Time / Speaker / Talk Title (for abstracts see below)
09:30-10:00 Coffee
10:00-10:15 Welcome by Christof Schütte and Sebastian Pokutta
10:15-11:15 Session Chair: Christof Schütte and Sebastian Pokutta
10:15-10:45 Rupert Klein
Pressure gain combustion for gas turbines: Analysis of a fully coupled engine model

10:45-11:15 Wilhelm Huisinga
We can do better than starting from scratch: How to leverage detailed mechanistic models for the analysis of sparse clinical data?

11:15-11:45 Coffee break
11:45-12:45 Session Chair: Daniel Baum
11:45-12:15 Malte Westerhoff
From Bench to Bedside: Translating AI Algorithms into Real-World Solutions for Medical Imaging

12:15-12:45 Susanna Röblitz
Markov State Modelling for Gene Regulatory Networks: Challenges and Opportunities

12:45-14:00 Lunch
14:00-15:00 Session Chair: Ralf Borndörfer
14:00-14:30 Ralf Kornhuber
Tempora mutantur, nos et mutamur in illis

14:30-15:00 Christoph Helmberg
Conic Bundle Developments

15:00-15:30 Coffee break
15:30-16:15 Session Chair: Ralf Borndörfer
15:30-16:15 Sven Krumke
The Joy of “Good Enough”: Tackling Robust Covering Problems

16:15-16:30 Day Wrap-Up by Christof Schütte and Sebastian Pokutta
16:30-23:00 Summer festival

June 13

Time / Speaker / Talk Title (for abstracts see below)
09:30-10:00 Coffee
10:00-10:15 Welcome by Christof Schütte and Sebastian Pokutta
10:15-11:15 Session Chair: Christof Schütte and Sebastian Pokutta
10:15-10:45 Antoine Deza
All optimal paths lead to ZIB

10:45-11:15 Michael Wulkow
32 Years after leaving the ZIB—Challenges along the road

11:15-11:45 Coffee break
11:45-12:45 Session Chair: Christoph Spiegel
11:45-12:15 Martin Skutella
On the Complexity of Neural Networks

12:15-12:45 Robert Weismantel
ZIB 40 = 33 years of research in the theory of Integer Programming

12:45-13:00 Final Wrap-Up by Christof Schütte and Sebastian Pokutta
13:00-14:00 Lunch

Abstracts

On the Role of Applications in Mathematical Theory and vice versa
Martin Grötschel

The origins of mathematics can be traced back to tasks such as counting animals or determining the size of a piece of land. Issues occurring in the daily life of humans have led to a rich system of mathematical structures; and the related theories have opened up novel approaches to understand complex situations arising in practice. In my lecture, I will address success stories of the productive interplay of applications and mathematical theory and indicate, in particular, developments where the work of ZIB members has played an important role.

Cyclic Transversal Polytopes
Volker Kaibel

With every family of finitely many subsets of a finite-dimensional vector space over the Galois field with two elements we associate a cyclic transversal polytope. It turns out that these polytopes generalize several well-known polytopes that are relevant in combinatorial optimization, among them cut polytopes as well as stable set and matching polytopes. We introduce the class of lifted odd-set inequalities and provide results demonstrating their strength. In particular, it turns out that they generalize well-known inequalities such as Edmonds’s inequalities for matching polytopes, and that they suffice to describe cyclic transversal polytopes if the union of the sets in the family has rank at most two. We also describe extended formulations for cyclic transversal polytopes and introduce a special relaxation hierarchy for them. The talk is based on joint work with Jonas Frede, Jannik Trappe (OVGU Magdeburg), and Maximilian Merkert (TU Braunschweig).

Finite Elements, Multigrid Methods, and Homogenization
Harry Yserentant

Forty years ago, when ZIB was founded, finite elements and multigrid methods were a hot topic in numerical analysis. To a lesser extent, this is still true today, but the focus has shifted from simpler problems of the type of the Laplace equation to more challenging problems such as equations with rough, strongly oscillating coefficients. A particular approach of this kind related to these old developments is presented in more detail.
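
A prototypical model problem of this kind (added here for illustration; the talk may treat a different formulation) is the elliptic equation with rapidly oscillating coefficients,

\[ -\nabla \cdot \big( A(x/\varepsilon)\, \nabla u_\varepsilon \big) = f \quad \text{in } \Omega, \qquad u_\varepsilon = 0 \quad \text{on } \partial\Omega, \]

where the coefficient field A varies on a small scale \varepsilon that a standard finite element mesh cannot afford to resolve.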

From data visualization to data reasoning
Hans-Christian Hege

Research and development in the field of data visualization have changed the world. Powerful visualization tools and techniques have led to data visualization finding its way into almost all areas of life. Today, data visualization supports companies in decision-making and communication, journalism and media in the presentation of facts, and science in the analysis of complex data sets, the formulation and testing of hypotheses, and the presentation of results.

What is the future of data visualization? Answers to this question can be found by extrapolating current developments such as the further use of augmented and virtual reality (AR and VR), cloud and streaming technologies, and improved integration with other technologies such as the Internet of Things (IoT), edge computing, and 5G networks. These developments will improve location independence, situational awareness, and decision-making capabilities.

Furthermore, it is clear that artificial intelligence and machine learning will play a pivotal role, e.g. in analyzing huge amounts of data from different sources and automatically creating insightful visualizations, detecting patterns, trends, and anomalies in complex data sets to enable more accurate and actionable visualizations, personalizing visualizations based on user preferences and goals, or developing natural-language user interfaces.

These are all fairly obvious developments. The question is whether there will also be less obvious and perhaps more promising, if not disruptive, developments. It is worth taking a closer look at the cognitive foundations of visualization. After all, visualization not only makes use of the far greater computing capacities of computers, but above all compensates for weaknesses in human cognition. The following questions will therefore be discussed: What are these weaknesses of human cognition, and how can they be better compensated for by future tools? Furthermore, what are the special features of human problem solving, and how can these be taken into account in future tools?

Adaptive Algorithms for Elliptic PDEs with Random Data
Jens Lang

In this talk, I will present a fully adaptive strategy for solving elliptic PDEs with random data. A hierarchical sequence of adaptive mesh refinements for the spatial approximation is combined with adaptive anisotropic sparse Smolyak grids in the stochastic space in such a way as to minimize the computational cost. The novel aspect of the strategy is that the hierarchy of spatial approximations is sample-dependent so that the computational effort at each collocation point can be optimized individually. I will present results from a rigorous analysis for the computational complexity of the adaptive multilevel algorithm. Numerical examples demonstrate the reliability of the error control.
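
For illustration (our notation, not necessarily the talk's), the underlying model problem can be written as an elliptic equation whose coefficient depends on a random parameter \omega,

\[ -\nabla \cdot \big( a(x, \omega)\, \nabla u(x, \omega) \big) = f(x) \quad \text{in } \Omega, \]

so that each collocation point \omega yields a deterministic PDE that may be discretized on its own, individually refined mesh.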

Integer programming and subdeterminants
Stefan Weltge

During my time at ZIB I learnt a lot about integer programming, and the fascination for this topic has driven my research ever since. My talk is about which parameters are responsible for integer optimization problems being theoretically difficult or easy to solve. In particular, I will highlight a number of results from recent years that deal with the role of subdeterminants as such parameters.
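
For context, a standard instance of such a parameter (a known result, not specific to the talk) is the largest absolute subdeterminant of the constraint matrix,

\[ \Delta(A) = \max \{ |\det B| : B \text{ a square submatrix of } A \}. \]

If \Delta(A) = 1 (total unimodularity), the integer program \max\{ c^\top x : Ax \le b,\ x \in \mathbb{Z}^n \} is solvable in polynomial time via its LP relaxation; Artmann, Weismantel, and Zenklusen (2017) extended polynomial-time solvability to \Delta(A) \le 2, while the case of larger constant \Delta remains open.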

Pressure gain combustion for gas turbines: Analysis of a fully coupled engine model
Rupert Klein

To fill in for missing solar and wind energy during calm weather and cloudy or nighttime conditions, gas turbines will be with us for some time to come. The higher their efficiency, the better, for obvious reasons. Concepts currently in use have been optimized over decades, and increasing their efficiency further is becoming exceedingly difficult and expensive. Today, billions of Euros are spent to achieve efficiency gains of less than one percent. Here we present Shockless Explosion Combustion (SEC), a novel concept for what is called “pressure gain combustion”, which, based on a computational model for a complete engine, promises efficiency gains exceeding 15%. In this presentation, mathematical modelling meets theory and numerics for hyperbolic systems with stiff source terms to develop an understanding of the intriguing multiscale interactions constituting the SEC process.

Joint work with Maikel Nadolski (FUB), Christian Zenker (BTU), Michael Oevermann (BTU), and Oliver Paschereit (TUB).
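
Schematically (a generic form added for illustration, not the talk's exact model), the governing equations are hyperbolic balance laws with stiff source terms,

\[ \partial_t u + \partial_x f(u) = \frac{1}{\varepsilon}\, s(u), \qquad 0 < \varepsilon \ll 1, \]

where the small parameter \varepsilon makes the reactive source much faster than the convective transport and is at the heart of the multiscale interactions mentioned above.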

We can do better than starting from scratch: How to leverage detailed mechanistic models for the analysis of sparse clinical data?
Wilhelm Huisinga

A growing understanding of complex processes in biology and physiology has led to large-scale quantitative systems pharmacology (QSP) models of pharmacologically relevant processes. These models are increasingly used to study the response of the system to a given input or stimulus, including drug perturbations. For the analysis of clinical data, however, these models are rarely used; due to their size, it is not feasible to estimate individual model parameters based on common clinical data. As a solution, researchers resort to small-scale data-driven and/or semi-mechanistic pharmacokinetic (PK)/pharmacodynamic (PD) models. To what extent these models are consistent with the accumulated knowledge present in the large-scale QSP models remains open. An approach that allows one to systematically derive small-scale PK/PD models from a large-scale QSP model is therefore highly desirable and would support the translation of models between different phases of the drug discovery and development process. We present a model order reduction approach based on a novel sensitivity-based input-response index. It is linked to the product of two local sensitivity coefficients: the first coefficient quantifies the impact of the input on a given state variable at a given time; the second coefficient quantifies how a perturbation of a given state variable at a given time impacts the output in the future. We rank state variables according to their input-response index and reduce the complexity of a model in three steps: elimination of state variables of negligible impact, approximation of fast molecular species by their quasi-steady state, and exploitation of conservation laws. A key feature of the reduced model is its mechanistic interpretability in terms of quantities of the original system. We illustrate our approach by application to the humoral blood coagulation system. The input-response indices and the reduced models give insight into which molecular players are key, and during which time spans. Exemplified for the anticoagulant warfarin, we demonstrate that subsequent reduction of reactions (in addition to molecular species) can further reduce the complexity to a level that is suitable for the analysis of sparse clinical data in the context of model-informed precision dosing.
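
In symbols (our schematic notation, not necessarily the authors' exact definition), the index of state variable x_k at time t combines the two sensitivities described above:

\[ \mathrm{ir}_k(t) \;=\; \frac{\partial x_k(t)}{\partial u} \cdot \frac{\partial y(T)}{\partial x_k(t)}, \qquad T > t, \]

where u denotes the input and y the output; state variables whose index remains negligible over the whole time course are candidates for elimination.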

From Bench to Bedside: Translating AI Algorithms into Real-World Solutions for Medical Imaging
Malte Westerhoff

Artificial intelligence (AI) is poised to transform most areas of our lives, with medical imaging and radiology at the forefront. The pace at which research in this area evolves is breathtakingly fast and seems to be accelerating exponentially. Translating groundbreaking research results into products that practically help healthcare professionals and ultimately patients opens up exciting opportunities and demands innovative approaches.

Through specific examples, such as the application of AI in cancer detection and medical education, this talk will highlight key methodologies for developing and translating AI algorithms into products, ensuring their reliability and efficacy in real-world applications.

By examining successful case studies, the presentation will showcase the collaborative efforts among researchers, clinicians, and industry partners to integrate AI solutions into everyday clinical workflows. We will explore the promising potential of AI in improving diagnostic accuracy, enhancing medical education, and ultimately transforming patient care. Ethical considerations, regulatory frameworks, data availability, privacy concerns, and the future directions of AI in medical imaging are just some of the issues that need to be considered.

This talk aims to inspire and inform on the innovative strategies and collaborative efforts necessary to transition from cutting-edge research to impactful clinical tools, showcasing the vast opportunities AI offers to enhance medical practice and patient outcomes.

Markov State Modelling for Gene Regulatory Networks: Challenges and Opportunities
Susanna Röblitz

Markov state modelling (MSM) is a powerful framework for studying the long-timescale behavior of high-dimensional dynamical systems. Originally developed in the context of molecular dynamics (with foundational contributions from researchers at ZIB), MSM has been extended in recent years towards stochastic gene regulatory networks governed by the chemical master equation, which provides a novel tool for characterizing cellular phenotypes. In fact, there are indications that other modelling frameworks typically used to describe stochastic gene expression, like asynchronous Boolean networks or piecewise deterministic Markov processes, could benefit from the application of MSM. In my talk, I’ll give an overview of these developments and their impact on our understanding of cellular decision making, and I’ll outline some of the challenges we are faced with when transferring MSM from molecular dynamics to other simulation methods.
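
For reference (standard form, added here for context), the chemical master equation governs the probability p(x, t) of the molecular copy-number state x under R reactions with propensities a_r and stoichiometric vectors \nu_r:

\[ \frac{\mathrm{d}}{\mathrm{d}t}\, p(x, t) = \sum_{r=1}^{R} \big( a_r(x - \nu_r)\, p(x - \nu_r, t) - a_r(x)\, p(x, t) \big). \]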

Tempora mutantur, nos et mutamur in illis
Ralf Kornhuber

There was a time when there was a strong belief that the world can be described in terms of differential equations and that their fast and reliable solution would be the key to finding out what is the case. Meanwhile, society believes that this can be done better directly, by collecting and interpreting sufficiently large amounts of data. In this talk, we provide a historical case study on a project at ZIB from the year 1990 focusing on reverse-biased pn-junctions, and then move to Fredholm integral operators in very high-dimensional spaces for function approximation and training of neural networks.

Conic Bundle Developments
Christoph Helmberg

Conic Bundle is a callable library for optimizing sums of convex functions by a proximal bundle method. It has its origins in the spectral bundle approach for large-scale semidefinite programming and builds on the idea of using convex function models that may be represented as support functions over suitable conic sets. Cones serve multiple purposes, and we illustrate some typical use cases. In bundle methods the key step is to efficiently determine the next candidate via a quadratic bundle subproblem. The conic setting allows us to formulate and solve this in a uniform way by interior point methods. The recent development of a preconditioned iterative solver for the associated primal-dual KKT system revives hopes for efficient dynamic restarting, so as to admit model updates embedded directly into the interior point approach. Here, again, the uniform and simple conic structure comes in handy. We present our current ideas and attempts to make this work in practice, together with first numerical results.
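
In a generic proximal bundle method (our notation, for illustration), the quadratic bundle subproblem determining the next candidate reads

\[ x^{+} = \operatorname*{arg\,min}_{x} \; \hat{f}(x) + \frac{u}{2} \lVert x - \hat{x} \rVert^2, \qquad \hat{f}(x) = \max_i \big( f(x_i) + \langle g_i, x - x_i \rangle \big), \]

with stability center \hat{x}, proximal weight u > 0, and a cutting-plane model \hat{f} built from subgradients g_i; in the conic setting described above, \hat{f} is instead represented as a support function over a suitable conic set.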

The Joy of “Good Enough”: Tackling Robust Covering Problems
Sven Krumke

Set cover problems are fundamental to combinatorial optimization, with applications spanning logistics, resource allocation, and scheduling. However, classic formulations typically assume perfect knowledge of input data. Robust set cover problems introduce uncertainty into this model, requiring solutions that remain effective even when costs, coverage sets, or demands change unexpectedly.

The strategic placement of emergency doctors is crucial for ensuring rapid response times and saving lives. We delve into the mathematical complexities underlying this critical optimization problem by presenting results for various versions of robust covering problems, such as the Multi-Set Multi-Cover Problem (MSMC) and the Partial Scenario Set Cover Problem (PSSC), which are both motivated by an application for locating emergency doctors. In the MSMC one is given a finite ground set J, a demand value d(j) ≥ 0 for every element j of J, and a collection C of subsets of J. The goal is to choose a minimum total number of subsets from C, where multiple choices of a subset are allowed, such that the demand of every element in J is covered. In the robust version, the demand of each element j is no longer given but replaced by a set of scenarios U, the uncertainty set, containing various possible demand vectors. The PSSC generalizes the Partial Set Cover problem, which is itself a generalization of the classical Set Cover problem. Again we are given a finite ground set J, a collection C of subsets to choose from, each of which is associated with a nonnegative cost, and a second collection U of subsets of J, of which a given number l must be covered. The task is to choose a minimum-cost sub-collection from C that covers at least l sets from U. We present new approximation approaches that combine LP-based rounding with a greedy consideration of the scenarios, or directly use variants of the greedy set cover algorithm which in each iteration tries to minimize the ratio of cost to the number of newly covered scenarios.
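
To make the last idea concrete, here is a minimal sketch (in Python, with hypothetical names, not the authors' implementation) of a greedy heuristic for the PSSC that in each iteration picks the subset with the best ratio of cost to newly covered scenarios:

# Hedged sketch (hypothetical names): greedy heuristic for the Partial
# Scenario Set Cover Problem. A scenario u (a subset of J) counts as
# covered once the union of the chosen subsets contains u.

def greedy_pssc(subsets, costs, scenarios, l):
    chosen = []            # indices of selected subsets
    covered = set()        # union of selected subsets

    def n_covered(elems):
        return sum(1 for u in scenarios if u <= elems)

    while n_covered(covered) < l:
        # pick the subset minimizing cost / (newly covered scenarios)
        best, best_ratio = None, float("inf")
        for i, s in enumerate(subsets):
            gain = n_covered(covered | s) - n_covered(covered)
            if gain > 0 and costs[i] / gain < best_ratio:
                best, best_ratio = i, costs[i] / gain
        if best is None:
            raise ValueError("fewer than l scenarios can ever be covered")
        chosen.append(best)
        covered |= subsets[best]
    return chosen

# tiny example: cover at least 2 of the 3 scenarios
subsets   = [frozenset({1, 2}), frozenset({3}), frozenset({2, 3, 4})]
costs     = [1.0, 1.0, 1.5]
scenarios = [frozenset({1, 2}), frozenset({3, 4}), frozenset({2, 3})]
print(greedy_pssc(subsets, costs, scenarios, 2))  # -> [2]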

All optimal paths lead to ZIB
Antoine Deza

Worst-case constructions have helped provide a deeper understanding of how the structural properties of the input affect the computational performance of optimization algorithms. Recent examples include the construction of Allamigeon, Benchimol, Gaubert, and Joswig, for which the interior point method performs an exponential number of iterations. In a similar spirit, we investigate the following question: how close can two disjoint lattice polytopes contained in a fixed hypercube be? This question stems from various contexts where the minimal distance between such polytopes appears in complexity bounds of optimization algorithms. We provide nearly matching lower and upper bounds on this distance and discuss its exact computation. Based on joint work with Shmuel Onn (Technion), Sebastian Pokutta (ZIB), and Lionel Pournin (Paris XIII).
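
In formulas (our notation, merely restating the question above), the quantity of interest is

\[ d(n, k) = \min \big\{ \operatorname{dist}(P, Q) : P, Q \subseteq [0, k]^n \text{ disjoint lattice polytopes} \big\}, \]

the smallest Euclidean distance achievable by two disjoint polytopes with vertices in \{0, 1, \dots, k\}^n.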

32 Years after leaving the ZIB—Challenges along the road
Michael Wulkow

In the spring of 1992, my departure from ZIB marked the beginning of a journey that continues to this day: solving complex systems of differential equations, developing algorithms for fitting and control tasks, developing models, and linking structures from different fields. Moreover, the software tools developed during this period have not only endured but have evolved into versatile platforms, tailored to the needs of research, development, and production alike. These tools are still quite general, even if designed for particular fields of application, since they are written from the standpoint of mathematicians. In practice, this means that they do not contain too many fixed structures, but possess the flexibility to accommodate novel concepts and foster innovation. In other words: we develop and sell modeling tools, not models.

This adaptability is evidenced by an extensive body of work, comprising hundreds of publications authored by users and collaborators, that has gone well beyond the state of knowledge at the time the respective software version was developed. In this lecture, I will delve into a few foundational numerical principles that still underpin these tools, emphasizing their enduring significance in contemporary practice. Furthermore, I will highlight selected projects that exemplify the unique challenges encountered and overcome along the way. This talk aims to provide a succinct yet insightful glimpse into the evolution of mathematical practice since my time at ZIB.

On the Complexity of Neural Networks
Martin Skutella

We discuss various recent results on neural networks with ReLU activations. With respect to the expressivity of neural networks, we provide a mathematical counterbalance to the universal approximation theorems, which suggest that a single hidden layer is sufficient for learning any function. In particular, we investigate whether the class of exactly representable functions strictly increases when more layers are added. We also present results on how basic problems in combinatorial optimization can be solved via neural networks with ReLU activations. Finally, we discuss recent results on the algorithmic complexity of determining basic properties of neural networks with given weights. Our approaches build on techniques from mixed-integer and combinatorial optimization, efficient algorithms, polyhedral theory, as well as discrete and tropical geometry.

The talk is based on joint work with Moritz Grillo, Vincent Froese, Christoph Hertrich, Amitabh Basu, and Marco Di Summa.
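
As a small illustration of exact representability (our example, not from the talk): a ReLU network with a single hidden layer computes max(x, y) exactly via the identity max(x, y) = relu(y - x) + relu(x) - relu(-x); whether analogous exact representations, e.g. of the maximum of many numbers, require additional layers is precisely the kind of question the talk addresses.

import numpy as np

# Hedged illustration: one hidden ReLU layer computing max(x, y) exactly.
W1 = np.array([[-1.0, 1.0],      # computes y - x
               [ 1.0, 0.0],      # computes  x
               [-1.0, 0.0]])     # computes -x
w2 = np.array([1.0, 1.0, -1.0])  # relu(y-x) + relu(x) - relu(-x)

def net(x, y):
    h = np.maximum(W1 @ np.array([x, y]), 0.0)  # hidden ReLU layer
    return float(w2 @ h)

assert net(3.0, 5.0) == 5.0
assert net(-2.0, -7.0) == -2.0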

ZIB 40 = 33 years of research in the theory of Integer Programming
Robert Weismantel

We will refer to several important results in the theory of IP that have been discovered around the time when ZIB was founded. We will put them into perspective, discuss some achievements in the past 40 years and outline several major questions that are open today.