No.125 Piecewise smooth system and optimization with piecewise linearization via algorithmic differentiation

NII Shonan Meeting Seminar 125

Organizers

Prof. Andreas Griewank, Universidad Yachay Tech
Prof. Andrea Walther, University of Paderborn
Prof. Siegfried M. Rump, Hamburg University of Technology
Prof. Koichi Kubota, Chuo University

Participants

Prof. Paul Barton, MIT
Prof. Bradley Bell, University of Washington
Prof. Francesc Anton Castro, Universidad Yachay Tech
Ms. Olga Ebel, University of Paderborn
Dr. Laurent Hascoet, INRIA
Prof. Xiaolin Huang, Shanghai Jiao Tong University
Prof. Satoru Iwata, University of Tokyo
Prof. Vladik Kreinovich, University of Texas at El Paso
Prof. Fritz Mayer-Lindenberg, Technische Universität Hamburg
Mr. Taihei Oki, University of Tokyo
Dr. Manuel Radons, Humboldt University of Berlin
Dr. Stephan Schmidt, University of Wuerzburg
Dr. Florian Steinberg, INRIA
Mr. Tom Streubel, Humboldt University of Berlin
Prof. Mizuyo Takamatsu, Chuo University
Dr. Takahito Tanabe, NTT DATA Mathematical Systems Inc.
Prof. Takashi Tsuchiya, National Graduate Institute for Policy Studies
Dr. Jean Utke, Allstate Insurance Company
Dr. Kathrin Welker, Trier University
(Mr. William Rodolfo Arellano Tamayo, Universidad Yachay Tech)

Schedule

* all talks are about 30 min, plus 10 min for Q&A

24th June (Sunday)
  15:00-19:00 Check-in
  19:00-21:00 Welcome Banquet

25th June (Monday)
   7:30- 9:00 Breakfast
   9:00-10:30 Session 1
              Introduction (Prof. Andrea Walther)
                "Abs-Linearization for Piecewise Smooth Optimization"
              Overview of PS and PL functions
              Discussion: Representation and conversion of PS functions
  10:30-11:00 Break
  11:00-12:00 Session 2
              (talk) Ms. Olga Ebel
                 "Dealing with Optimization Problems
                  Constrained by Nonlinear Piecewise Smooth PDEs"
              Discussion: PL optimization
  12:00-13:30 Lunch
  13:30-14:00 Group Photo Shooting
  14:00-15:30 Session 3
              (talk) Prof. Xiaolin Huang
                 "Parametric and Non-parametric Piecewise 
                  Linear Model and its Optimization"
              [(short talk) Prof. Seidel; arrival expected around 14:30]
              Discussion: PL optimization  
  15:30-16:00 Break
  16:00-18:00 Session 4
              (talk) Dr. Jean Utke
                 "Applying PL optimization to a data science problem"
              (new talk) Dr. S. Schmidt
                 "Non-Smooth Geometric Inverse Problems"
              Discussion: Basic PL subroutines

  18:00-19:30 Dinner

26th June (Tuesday)
   7:30- 9:00 Breakfast
   9:00-10:30 Session 1
              (talk) Dr. Laurent Hascoet
                 "Non-smooth tangent differentiation
                  experiments with Source-Transformation AD tools"
              Discussion: Implementation
  10:30-11:00 Break
  11:00-12:00 Session 2
              (new talk) Prof. B. Bell
                 "CppAD’s Abs-normal Representation"
              Discussion: Approximate functions and solvers
  12:00-13:30 Lunch
  13:30-15:30 Session 3
              (talk) Prof. Fritz Mayer-Lindenberg
                 "Applying automatic functorial substitutions
                  to simplify programming"
              (new talk) Prof. Francesc Anton Castro
                 "Introduction to Algebraic and Computational Geometry"
              Discussion: AD / Interval Implementation
  15:30-16:00 Break
  16:00-18:00 Session 4
              (talk) Dr. Kathrin Welker
                 "Solution Techniques for Constrained Shape
                  Optimization Problems in Shape Spaces"
              (new talk) Prof. P. Barton
                 "Lexicographic Derivatives"
              Discussion: Approximate functions and solvers
  18:00-19:30 Dinner

27th June (Wednesday)
   7:30- 9:00 Breakfast
   9:00-10:30 Session 1
              (new talk) Prof. S. Rump
                 "Introduction to INTLAB"
              (talk) Prof. Vladik Kreinovich
                 "How Interval Measurement Uncertainty Affects 
                   the Results of Data Processing: 
                   A Calculus-Based Approach to 
                   Computing the Range of a Box"
              Discussion: Interval Computation and PL/PS
  10:30-11:00 Break
  11:00-12:00 Session 2
              (talk) Prof. Takashi Tsuchiya
                 "Chubanov's new polynomial-time algorithm 
                  for linear programming and extensions"
                 "A Recursive Recomputation Approach to Smoothing in Nonlinear and Bayesian State-Space Modeling"
              Discussion: Semi-definite Programming
  12:00-13:30 Lunch
  13:30-18:00 Excursion
              Great Buddha, Hase Temple

  18:15-21:00 Main Banquet

28th June (Thursday)
   7:30- 9:00 Breakfast
   9:00-10:30 Session 1
              (talk) Mr. Taihei Oki
                 "Index Reduction for Nonlinear 
                  Differential-Algebraic Equations via 
                  Combinatorial Relaxation"
              Discussion: PL/PS and DAE/ODE
  10:30-11:00 Break
  11:00-12:00 Session 2
              (new talk) Mr. Tom Streubel
                 "High Order Taylor-like Expansions of PS functions and their Application to DAEs"
              Discussion: (all topics or new topics)
  12:00-13:30 Lunch
  13:30-      Departure

Abstracts
=====
Prof. Xiaolin Huang, Shanghai Jiao Tong University
   "Parametric and Non-parametric Piecewise 
    Linear Model and its Optimization"

    Optimization algorithms are designed for specific formulations,
  which are also the basis of modeling piecewise linear systems. Since
  the proposal of the compact representation, parametric models of
  continuous piecewise linear functions have been studied in depth,
  and models with full representation capability, including our
  contributions, have been established. We have also proposed
  non-parametric piecewise linear models by designing specific
  kernels. Recent progress in machine learning makes both parametric
  and non-parametric piecewise linear models promising for describing
  complicated systems. It therefore becomes increasingly important to
  investigate optimization methods based on the learned functions. It
  is also desirable to develop efficient optimization methods to train
  those piecewise linear models for machine learning.
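  [An illustrative aside, not part of the abstract: one classical
  parametric representation with full capability writes any continuous
  piecewise linear function as the difference of two convex max-affine
  terms. A minimal Python sketch, with made-up helper names:]

```python
def max_affine(x, planes):
    # f(x) = max_k (a_k . x + b_k): a convex, continuous PL function
    return max(sum(a_i * x_i for a_i, x_i in zip(a, x)) + b
               for a, b in planes)

def dc_pl(x, planes1, planes2):
    # A difference of two max-affine terms can represent any
    # continuous piecewise linear function (a classical result).
    return max_affine(x, planes1) - max_affine(x, planes2)

# |x| = max(x, -x) in one dimension
abs_planes = [([1.0], 0.0), ([-1.0], 0.0)]
print(max_affine([-3.0], abs_planes))   # 3.0

# concave "hat" h(x) = min(x, 2 - x) = 0 - max(-x, x - 2)
def hat(x):
    return dc_pl([x], [([0.0], 0.0)], [([-1.0], 0.0), ([1.0], -2.0)])

print(hat(1.0))                         # 1.0
```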

=====
Dr. Laurent Hascoet, INRIA
   "Non-smooth tangent differentiation experiments with
    Source-Transformation AD tools"

    The research of Andreas Griewank, Andrea Walther, and their
  students has shown that Algorithmic Differentiation can be
  used to derive tangent models that cope with a certain class
  of non-smoothness, through the use of the so-called
  Abs-linear form (ALF). These tangent models incorporate some
  knowledge of the nearby discontinuities of the
  derivatives. These models seem to bring some additional power
  to processes that use tangent approximations, such as
  simulation, optimization, or solution of differential
  equations. The mechanics to derive these special tangent
  models may seem at first sight exotic and remote from
  standard tangent-linear Algorithmic Differentiation. However,
  successful experiments with Adol-C have shown that tangent AD
  may be adapted to produce these ALF models. Together with
  Krishna Narayanan and following a suggestion from Torsten
  Bosse, we recently tried a similar adaptation on
  Source-Transformation AD tools. It appears that very little
  development is needed in the AD-tool, be it OpenAD or
  Tapenade. Specifically for Tapenade, it appears that no
  development at all is needed in the tool itself. Any end user
  can already produce ALF tangents without needing access to
  the tool source. We will detail the required work, which
  amounts essentially to hand-writing, once and for all, a
  customized derivative of the absolute-value function
  (ABS). This is currently less than 40 lines of code.
  Building the ALF of a given program introduces one new
  variable per run-time execution of the ABS function. As the
  number of rows and columns of the constructed extended
  Jacobian are both more or less equal to the number of
  variables, this extended Jacobian may end up using
  unreasonably large memory space for large codes. To overcome
  this limitation of the approach, we would like to discuss the
  possibility of finding at run-time the "important" ABS calls
  that deserve this treatment, and those that don't. We believe
  we can base this decision on a notion of distance to the kink
  induced by this ABS call, using ideas from the PhD thesis of
  Mauricio Araya. We believe that we can also decide at
  run-time to forget a previously "important" kink, if we
  encounter another one which is closer and therefore more
  "important". Thus limiting the number of "active" ABS calls,
  we can limit the size of the extended Jacobian to a fixed
  number, taking into account only the nearest kinks in the
  primal function. We hope to be able to demonstrate this on a
  few examples.
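
  [A purely illustrative sketch, not the actual Tapenade or ADOL-C
  code: the hand-written ABS derivative can be pictured as a
  tangent-mode abs that records one switching variable per run-time
  call, the raw material of the abs-linear form (ALF).]

```python
# Hypothetical sketch of a tangent-mode "abs" for ALF building.
switching = []  # value of the argument at each run-time abs call

def abs_alf(u, du):
    switching.append(u)        # record the switching variable
    if u > 0:
        return u, du           # smooth branch: d|u| = du
    elif u < 0:
        return -u, -du         # smooth branch: d|u| = -du
    else:
        return 0.0, abs(du)    # on the kink, the PL model keeps |du|

# tangent evaluation of f(x) = |x - 1| + 2x at x = 3 with dx = 1
def f_tangent(x, dx):
    v, dv = abs_alf(x - 1.0, dx)
    return v + 2.0 * x, dv + 2.0 * dx

y, dy = f_tangent(3.0, 1.0)
print(y, dy)  # 8.0 3.0
```

  [Each run-time ABS call adds one entry to `switching`; this is why
  the extended Jacobian grows with the number of such calls, and why
  pruning "unimportant" kinks, as proposed above, would directly
  bound its size.]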

=====
Prof. Fritz Mayer-Lindenberg, Technische Universität Hamburg
"Applying automatic functorial substitutions to simplify programming"

    Automatic differentiation is presented as one special case of
  such substitutions (there are others). The talk discusses its
  embedding into a particular programming language for computing
  with real numbers that attempts to combine simplicity with
  support for difficult targets.

=====
Dr. Kathrin Welker, Trier University
"Solution Techniques for Constrained Shape
Optimization Problems in Shape Spaces"

    Shape optimization problems arise frequently in
  technological processes which are modelled in the form of
  partial differential equations (PDEs) or variational
  inequalities (VIs). In many practical circumstances, the
  shape under investigation is parametrized by finitely many
  parameters, which on the one hand allows the application of
  standard optimization approaches, but on the other hand
  limits the space of reachable shapes unnecessarily. In this
  talk, the theory of shape optimization is connected to the
  differential-geometric structure of shape spaces. In
  particular, efficient algorithms in terms of shape spaces and
  the resulting framework from infinite dimensional Riemannian
  geometry are presented. Moreover, VI constrained shape
  optimization problems are treated from an analytical and
  numerical point of view in order to formulate approaches
  aiming at semi-smooth Newton methods on shape vector
  bundles. Shape optimization problems constrained by VIs are
  very challenging because of the necessity to operate in
  inherently non-linear and non-convex shape spaces. In
  classical VIs there is no explicit dependence on the domain;
  here, in contrast, the domain itself is the optimization
  variable, which adds an unavoidable source of non-linearity
  and non-convexity due to the non-linear and non-convex nature
  of shape spaces.

=====
Prof. Takashi Tsuchiya, National Graduate Institute for Policy Studies
"Chubanov's new polynomial-time algorithm
for linear programming and extensions"

    Recently, Chubanov proposed a third polynomial-time algorithm
  for linear programming. The algorithm deals with the homogeneous
  feasibility problem of a linear program and finds an interior
  feasible solution by repeating projection and scaling. We
  extended this algorithm to symmetric cone programming,
  including SDP and SOCP. In this talk, we introduce Chubanov's
  algorithm and discuss further extensions, seeking a new
  direction for conic linear programming.
=====
Mr. Taihei Oki, University of Tokyo
"Index Reduction for Nonlinear Differential-Algebraic Equations via
 Combinatorial Relaxation"

    Differential-algebraic equations (DAEs) are widely used for
  modeling of dynamical systems. The difficulty in numerically solving
  a DAE is measured by its differentiation index. For highly accurate
  simulation of dynamical systems, it is important to convert
  high-index DAEs into low-index DAEs. Most existing simulation
  software packages for dynamical systems are equipped with an
  index reduction algorithm given by Mattsson and Söderlind.
  Unfortunately, this
  algorithm fails if there are unlucky numerical or symbolic
  cancellations.  This talk gives a new index reduction algorithm for
  nonlinear DAEs. This algorithm modifies a DAE into another DAE to
  which Mattsson–Söderlind's index reduction algorithm is applicable
  by iteratively applying the implicit function theorem. Our approach
  is based on the combinatorial relaxation approach, which is a
  framework to solve a linear algebraic problem by iteratively
  relaxing it into an efficiently solvable combinatorial optimization
  problem. Although the algorithm relies heavily on symbolic
  manipulation, we give implementation strategies to overcome this
  drawback.
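
  [A toy illustration of the differentiation index, added here as an
  aside and not taken from the talk: for the DAE x'(t) = y(t),
  0 = x(t) - sin(t), differentiating the constraint once yields
  y(t) = cos(t), and differentiating once more gives an ODE for y,
  so the index is 2. A quick numerical consistency check:]

```python
import math

# Toy DAE:  x'(t) = y(t),  0 = x(t) - sin(t).
# Differentiating the constraint once:  0 = x'(t) - cos(t),
# i.e. y(t) = cos(t)  (the index drops by one); differentiating
# again yields the ODE y'(t) = -sin(t), so the index is 2.

def x_exact(t): return math.sin(t)
def y_exact(t): return math.cos(t)  # from the differentiated constraint

# Check the reduced system against the original equations.
t, h = 0.7, 1e-6
x_dot = (x_exact(t + h) - x_exact(t - h)) / (2 * h)  # central difference
assert abs(x_dot - y_exact(t)) < 1e-8                # x' = y holds
assert abs(x_exact(t) - math.sin(t)) < 1e-12         # constraint holds
print("reduced system consistent")
```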

=====


Overview

http://shonan.nii.ac.jp/shonan/blog/2017/08/18/piecewise-smooth-system-and-optimization-with-piecewise-linearization-via-algorithmic-differentiation/

Topics
1. Introduction
2. Representation and conversion of PS functions
3. Approximate functions and solvers
4. Basic PL subroutines
5. PS to PL and back
6. Dealing with discontinuities
7. Complexity bounds
8. Interval computation
9. Semi-definite Programming

accommodation fee and full board:
http://shonan.nii.ac.jp/shonan/proposal-submissions/expenses