


Tensor decompositions in convex optimization problems
Mon, 16:35--17:00
  • Falcó, Antonio (Department of Physics, Mathematics and Computer Science, Universidad CEU Cardenal Herrera, Spain)
  • Nouy, Anthony (GeM - Institut de Recherche en Génie Civil et Mécanique, UMR CNRS 6183, Ecole Centrale Nantes, Université de Nantes, France)

Model reduction techniques based on the construction of separated representations are receiving growing interest in scientific computing. A family of methods, recently called Proper Generalized Decomposition (PGD) methods, has been introduced for the a priori construction of separated representations (or tensor product approximations) of the solution $u$ of problems defined in tensor product spaces: \begin{align} u\in V=V_1\otimes \ldots \otimes V_d,\quad A(u) = l. \tag{1} \end{align} PGD methods can be interpreted as generalizations of Proper Orthogonal Decomposition (or Singular Value Decomposition, or Karhunen-Loève Decomposition) for the a priori construction of a separated representation $u_m$ of the solution $u$ (i.e. without knowing $u$ a priori): \begin{align} u\approx u_m = \sum_{i=1}^m w_i^1\otimes \ldots \otimes w_i^d,\quad w_i^k\in V_k. \end{align} A priori construction means that we only know the equation solved by the function $u$, not the function $u$ itself.

Several definitions of PGDs have been proposed. Basic PGDs are based on a progressive construction of the sequence $u_m$: at each step, an additional rank-one element $\otimes_{k=1}^d w_{m}^k$ is added to the previously computed decomposition $u_{m-1}$. These progressive definitions of PGDs can therefore be considered as greedy algorithms for constructing separated representations. A possible improvement of these progressive decompositions consists in introducing updating steps in order to approximate the optimal decomposition, which would be obtained by defining the whole set of functions simultaneously (rather than progressively). For many applications, these updates recover the good convergence properties of separated representations. A minimal numerical sketch of the progressive construction is given below.
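To make the progressive construction concrete, here is a minimal sketch (an illustration on our part, not part of the abstract) for the case $d=2$ with a symmetric positive definite Kronecker-structured operator, where elements of $V_1\otimes V_2$ are identified with matrices and $A(u)=l$ reads $A_1 U A_2 = L$. Each greedy step computes a rank-one correction by alternating minimization of the quadratic energy; the function names (`greedy_pgd`, `rank_one_step`) and the toy data are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def rank_one_step(A1, A2, R, iters=25):
    """Alternating minimization for the rank-one correction w1 (x) w2
    of the energy functional, given the current residual R = L - A1 U A2.
    A1, A2 are assumed symmetric positive definite."""
    w2 = np.random.default_rng(0).standard_normal(R.shape[1])
    w2 /= np.linalg.norm(w2)
    for _ in range(iters):
        # Fix w2, minimize over w1:  (w2' A2 w2) A1 w1 = R w2
        w1 = np.linalg.solve(A1, R @ w2) / (w2 @ A2 @ w2)
        # Fix w1, minimize over w2:  (w1' A1 w1) A2 w2 = R' w1
        w2 = np.linalg.solve(A2, R.T @ w1) / (w1 @ A1 @ w1)
    return w1, w2

def greedy_pgd(A1, A2, L, m=10):
    """Progressive PGD: build u_m = sum_{i=1}^m w_i^1 (x) w_i^2 by adding
    one rank-one element per step to the previous decomposition u_{m-1}."""
    U = np.zeros_like(L)
    for _ in range(m):
        R = L - A1 @ U @ A2        # residual of the Euler equation A(u) = l
        w1, w2 = rank_one_step(A1, A2, R)
        U = U + np.outer(w1, w2)   # u_m = u_{m-1} + w_m^1 (x) w_m^2
    return U

# Toy data (assumed): 1D Laplacian factors and a random right-hand side.
n = 40
T = 2.0 * np.eye(n) - np.eye(n, k=1) - np.eye(n, k=-1)
L = np.random.default_rng(1).standard_normal((n, n))
U = greedy_pgd(T, T, L, m=15)
print(np.linalg.norm(L - T @ U @ T) / np.linalg.norm(L))  # residual decay
```

An updated PGD, as discussed above, would additionally re-optimize the previously computed factors $w_i^k$ after each greedy step rather than keeping them fixed; this is omitted here for brevity.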

In this work, we propose a theoretical analysis of progressive and updated Proper Generalized Decompositions for a general class of problems in which (1) is equivalent to the minimization of an elliptic and differentiable functional $J$: $$ J(u) = \min_{v\in V} J(v) $$ where $V$ is a tensor product of reflexive Banach spaces.
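For orientation, a standard instance of this setting (an illustration on our part, not stated in the abstract) is a linear, symmetric, coercive operator $A$ on a Hilbert space $V$, for which (1) is equivalent to minimizing the quadratic energy $$ J(v) = \frac{1}{2}\langle A v, v\rangle - \langle l, v\rangle, $$ whose Euler equation $J'(v) = A v - l = 0$ recovers (1); the sketch above treats exactly this case.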