Re: [Help-glpk] [Fwd: Exit code 3 for larger datasets]

From: glpk xypron
Subject: Re: [Help-glpk] [Fwd: Exit code 3 for larger datasets]
Date: Mon, 25 Jul 2011 21:51:35 +0200

Hello Sheetal,

unfortunately your mail does not contain enough information
to analyze your problem.

I suggest that before each GLPK routine you call, you write a log
statement to a file and flush the file before entering the routine, e.g.

fprintf(file, "Calling GLPK in file %s, line %d\n", __FILE__, __LINE__);
fflush(file);

You could also use an error hook:

#include <setjmp.h>

jmp_buf env;

void error_hook(void *in) {
  /* free GLPK memory, e.g. glp_free_env() */
  /* safely return to the setjmp point */
  longjmp(*((jmp_buf *) in), 1);
}

...

if (setjmp(env)) {
  fprintf(file, "function glp_check_dup failed\n");
} else {
  glp_error_hook(error_hook, &env);
  result = glp_check_dup(m, n, ne, ia, ja);
}

Given the size of your memory consumption (more than 1.5 GB), you should
consider using a 64-bit system with sufficient memory.

Best regards


> -------- Forwarded Message --------
> Subject: Exit code 3 for larger datasets
> Date: Sun, 24 Jul 2011 15:00:27 -0400
> Hi,
> I am using Visual C 2010 on a windows system.
> My code reads a data-file, solves the MIP using default settings and
> finally writes the optimal solution array x[ ] to a text file.
> I am collecting the results for different data-sets.
> I have tested for some data-sets which run for less than 15,000 sec.
> The code works fine for these data-sets.
> The code ends with exit code 3 for larger data-sets.
> It's problematic to trace it as this happens sometime after 20,000 sec
> (more than 1.5 Gb runtime memory) of computation time.
> As many GLPK routines are called within the code, I am also not able to
> trace exactly what point the code ends at.
> Can you please guide me regarding this?
> -Thanks with Best Regards,
>  Sheetal
> _______________________________________________
> Help-glpk mailing list
> address@hidden

