
Brief Description of
A New Adaptive Algorithm for
Convex Quadratic Multicriteria Optimization



by Christoph Heermann, Bernd Weyers, & Jörg Fliege



Multicriteria optimization problems are a class of difficult optimization problems in which several different objective functions have to be minimized at the same time. Usually, no single point minimizes all of the given objective functions at once. Therefore, we search for so-called efficient points, i.e. feasible points for which there exists no other feasible point whose objective function values are all the same or smaller, with a strict decrease in at least one objective function value. More precisely, let

\begin{displaymath}
f : G \longrightarrow \mbox{{\sf I\hspace{-0.083em}R}}^p
\end{displaymath}

be a function and let $f_i$ ( $i = 1, \ldots, p$) be the objective functions to be minimized. Then, an element $y^* \in f(G)$ is called efficient if and only if there is no other $y \in f(G)$ with

\begin{displaymath}
y_i \leq y_i^* \quad \forall \, i \in \{1,2,\ldots,p\}
\end{displaymath}

and

\begin{displaymath}
y_k < y_k^* \quad \mbox{for at least one } k \in \{1,2,\ldots,p\}.
\end{displaymath}

The set of all efficient points of the set $f(G)$ is called the efficient set, denoted by $E(f(G))$. Note that two efficient points $f(x^{(1)}), f(x^{(2)}) \in E(f(G))$ ( $x^{(1)}, x^{(2)} \in G$) with $f(x^{(1)}) \neq f(x^{(2)})$ are incomparable with respect to the order defined above: by their very nature of being efficient, there exist two indices $i, j \in \{ 1, \ldots, p \}$ such that $f_i (x^{(1)}) < f_i (x^{(2)})$ and $f_j (x^{(2)}) < f_j (x^{(1)})$. Therefore, a single efficient point cannot capture all the optimal alternatives we face when solving a multicriteria optimization problem. This clearly shows that human decision makers need information about the whole set $E(f(G))$, which is the subject of this paper.
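For a finite set of image points, the definition above can be checked directly. The following sketch (an illustration only; the function names are our own, not from the paper) filters a finite subset of the image space down to its efficient points:

```python
# Illustration: extract the efficient (non-dominated) points of a finite
# set of objective vectors, following the componentwise order above.

def dominates(y, z):
    """True if y is at least as good as z in every objective and
    strictly better in at least one (componentwise minimization)."""
    return all(a <= b for a, b in zip(y, z)) and \
           any(a < b for a, b in zip(y, z))

def efficient_points(points):
    """Return the points of the set not dominated by any other point."""
    return [y for y in points
            if not any(dominates(z, y) for z in points if z != y)]

image = [(1, 4), (2, 2), (3, 1), (2, 3), (4, 4)]
print(efficient_points(image))  # the three mutually incomparable points
```

Here $(2, 3)$ and $(4, 4)$ are both dominated by $(2, 2)$, while the remaining three points are pairwise incomparable, exactly as described above.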

The basic idea is as follows. Our aim is to compute a discrete approximation of the set of efficient points. By a standard scalarization approach, we have to solve one scalar optimization problem for each efficient point we want to compute. These scalar problems can be viewed as perturbations of each other, parameterized by a well-defined perturbation parameter.
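As an illustration, one standard scalarization (the weighted-sum approach; the paper's specific scalarization may differ) attaches to each weight vector $w$ the scalar problem

\begin{displaymath}
\min_{x \in G} \; \sum_{i=1}^{p} w_i f_i(x), \qquad
w_i \geq 0 \quad \forall \, i, \qquad \sum_{i=1}^{p} w_i = 1.
\end{displaymath}

For convex problems, varying $w$ traces out efficient points, and the weight vector plays the role of the perturbation parameter mentioned above.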

We propose to use an adaptive discretization technique for the set of perturbations. Basically, we want to place more parameters in those regions of the parameter space where perturbations that are close together result in efficient points whose images in the image space  $\mbox{{\sf I\hspace{-0.083em}R}}^p$ are far apart from each other. We will introduce these additional parameters (i.e. additional scalar problems) during the solution process of other scalar problems, even before those problems are completely solved.
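The refinement criterion can be sketched as follows. This is a simplified illustration under our own assumptions (a one-dimensional parameter set, midpoint insertion, a Euclidean distance threshold), not the paper's actual algorithm:

```python
# Sketch: refine a list of scalarization parameters wherever two
# neighboring parameters yield efficient points that are far apart
# in image space.

import math

def refine(params, solve, tol):
    """Insert a midpoint parameter between each pair of neighbors whose
    solutions are farther than tol apart in image space."""
    refined = [params[0]]
    for a, b in zip(params, params[1:]):
        if math.dist(solve(a), solve(b)) > tol:
            refined.append((a + b) / 2.0)  # large gap: add a parameter
        refined.append(b)
    return refined

# Toy bicriteria image curve: parameter t -> (t, (1 - t)^2).
solve = lambda t: (t, (1.0 - t) ** 2)
print(refine([0.0, 0.5, 1.0], solve, tol=0.6))
```

On this toy curve, only the gap between $t = 0$ and $t = 0.5$ exceeds the threshold, so only that interval receives a new parameter; in the actual method, `solve` would of course be the (possibly incomplete) solution of the corresponding scalar problem.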

Furthermore, to save work when computing the new efficient points (i.e. when solving the new optimization problems), we propose to use a warm-start strategy. With such a strategy, points from the iteration history of scalar problems already considered (but not necessarily solved to optimality) are used as starting points for the optimization problems currently under consideration. In this way, even intermediate results obtained during the solution process already form a well-defined approximation to the set of efficient points.
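The warm-start idea can be made concrete with a deliberately simple example (our own construction, not the paper's interior-point setting): two convex quadratic objectives in one variable, solved by plain gradient descent, where each scalar problem starts from the minimizer of the previous one:

```python
# Sketch of the warm-start idea: minimize g_w(x) = w*(x-1)^2 + (1-w)*(x+1)^2
# for a sequence of weights w, reusing each minimizer as the next start.

def solve_scalar(w, x0, step=0.1, iters=200):
    """Plain gradient descent on g_w; x0 is the (warm) starting point."""
    x = x0
    for _ in range(iters):
        grad = 2.0 * w * (x - 1.0) + 2.0 * (1.0 - w) * (x + 1.0)
        x -= step * grad
    return x

x = 0.0  # cold start, used only for the first problem
for w in (0.1, 0.2, 0.3, 0.4, 0.5):
    x = solve_scalar(w, x)  # warm start: previous minimizer as x0
    print(round(x, 3))
```

Since the exact minimizer is $x^* = 2w - 1$, neighboring weights have nearby minimizers, so each warm start is already close to the next solution; the same effect is what the proposed strategy exploits for the perturbed interior-point problems.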

The paper is therefore organized as follows. First, we derive a long-step interior-point algorithm for convex quadratic optimization problems subject to linear equality and inequality constraints that is able to use infeasible starting points. Then, perturbed optimization problems which stem from a standard scalarization approach for multicriteria problems are considered, and a warm-start strategy for these problems is derived. After that, we describe a new, efficient adaptive interior-point technique for solving convex quadratic multicriteria problems. Finally, numerical results are presented and discussed; it turns out that the method is faster than a standard method for solving multicriteria optimization problems.




Joerg Fliege 2004-04-15