Re: [Help-glpk] Exclude a possible zero solution
From: Brady Hunsaker
Subject: Re: [Help-glpk] Exclude a possible zero solution
Date: Tue, 25 May 2004 09:45:22 -0400
On Mon, 2004-05-24 at 12:45, Cordian Benedikt Riener wrote:
> I am faced with the following problem.
> I have an LP that maximizes a linear function subject to Ax <= 0.
> My problem is that I want to exclude the trivial solution x = 0, but I
> don't know how I can do this, because the solver stops once it has found
> that all other possible solutions are smaller than the one with x = 0. Is
> there a possibility to make the solver run through all the edges of Ax <= 0?
> Or some other trick to find the next optimal solution?
In a linear program, it is not possible to exclude a single point in
this way for the same reason that it's not possible to have strict
inequalities in the formulation.
If your problem is an integer program, then it might be possible to
devise constraints ensuring that at least one coordinate of x is
non-zero. For example, if you know that every x_j >= 0, then you could
simply require that sum_j x_j >= 1. But if the x_j are allowed to be
negative, this simple trick won't work.
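A quick sketch of that idea, using SciPy's linprog rather than GLPK just
because it is short to write down (the toy objective and constraint
matrix below are made up for illustration):

```python
# Maximize x1 - 2*x2  subject to  x1 - x2 <= 0,  x >= 0.
# The unique optimum is the trivial point x = 0; adding
# sum_j x_j >= 1 (valid here because x >= 0) cuts it off.
from scipy.optimize import linprog

c = [-1, 2]            # linprog minimizes, so negate the objective
A_ub = [[1, -1]]       # x1 - x2 <= 0  (the "Ax <= 0" system)
b_ub = [0]

res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=[(0, None)] * 2)
print(res.x)           # the trivial solution [0, 0]

# Append  sum_j x_j >= 1,  written as  -x1 - x2 <= -1:
A_ub2 = [[1, -1], [-1, -1]]
b_ub2 = [0, -1]
res2 = linprog(c, A_ub=A_ub2, b_ub=b_ub2, bounds=[(0, None)] * 2)
print(res2.x)          # a non-zero optimum, here (0.5, 0.5)
```

The same two models are straightforward to state in GLPK's MathProg if
you prefer to stay inside GLPK.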
Back to the linear program. Based on your previous post, x represents an
inequality, so it can be scaled. If the right-hand side x_0 is not
zero, then you can scale the solution so that x_0 = 1. By adding this
constraint to your original formulation you will determine the
following:
- If infeasible, then p is an interior point or p lies on a facet with
right-hand side 0.
- If feasible and the objective is zero, then the solution gives a facet
that p lies on.
- If feasible and the objective is positive, then the solution gives a
separating hyperplane.
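Since I don't have your exact formulation from the earlier thread in
front of me, here is a hypothetical stand-in for the separation LP with
the scaling x_0 = 1 built in: find an inequality a.v <= a0 valid for
every vertex v of a polytope, maximizing the violation a.p - a0 at a
query point p (the unit square and the point p are made up for
illustration):

```python
# Separation with the right-hand side fixed to a0 = 1:
# maximize a.p - a0  subject to  a.v <= a0 for all vertices v.
from scipy.optimize import linprog

V = [(0, 0), (1, 0), (0, 1), (1, 1)]   # vertices of the unit square
p = (2.0, 0.5)                          # a point outside the square
a0 = 1.0                                # the scaling x_0 = 1

A_ub = [list(v) for v in V]             # a.v <= a0 for each vertex
b_ub = [a0] * len(V)
c = [-p[0], -p[1]]                      # minimize -(a.p); constant -a0 dropped

res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=[(None, None)] * 2)
violation = -res.fun - a0               # a.p - a0 at the optimum
print(res.x, violation)                 # positive -> separating hyperplane
```

Here the LP returns a = (1, 0), i.e. the inequality x1 <= 1, which p
violates, so it is a separating hyperplane; a zero optimum would mean
the tightest valid inequality is a facet containing p.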
That's a good approach if you can handle constraints with right-hand
side 0 separately. Alternatively, you could scale so that some single
coordinate x_j = 1. In each case, however, you will miss any solution
for which that x_j = 0. By solving n different LPs, one with each
x_j = 1, and comparing the results, you can be sure you have determined
the answer to your problem.
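Schematically, that loop over normalizations looks like this (again a
made-up two-variable instance, with the variables left free since the
whole point is that they may be negative):

```python
# Solve one LP per normalization x_j = 1 and keep the best result.
from scipy.optimize import linprog

c = [-1, 2]            # maximize x1 - 2*x2 -> minimize -(x1 - 2*x2)
A_ub = [[1, -1]]       # x1 - x2 <= 0
b_ub = [0]
n = 2

best = None
for j in range(n):
    A_eq = [[1 if k == j else 0 for k in range(n)]]   # fix x_j = 1
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=[1],
                  bounds=[(None, None)] * n)
    if res.status == 0 and (best is None or res.fun < best.fun):
        best = res
print(best.x)          # best non-zero solution across normalizations
```

Infeasible subproblems (status != 0) are simply skipped; only the
normalizations that admit a solution contribute to the comparison.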
To your question about searching for alternative optima: I don't
believe GLPK (or most solvers) provides an easy way to do this. You
could do it manually by checking for variables with zero reduced cost
and attempting to pivot them into the basis. In many cases this may be
a degenerate pivot, but you should eventually be able to pivot to
alternative optima in this way. I'm not sure that approach is worth the
effort.
Brady
p.s. Theory questions like this are probably better sent to the
newsgroup sci.op-research rather than to help-glpk.
--
Brady Hunsaker
Assistant Professor
Industrial Engineering
University of Pittsburgh
http://www.engr.pitt.edu/hunsaker/