Re: [Gnumed-devel] speeding up GNUmed

From: Sebastian Hilbert
Subject: Re: [Gnumed-devel] speeding up GNUmed
Date: Wed, 7 Apr 2010 07:18:34 +0200
User-agent: KMail/1.13.2 (Linux/; KDE/4.4.2; i686; ; )

On Tuesday, 6 April 2010 15:08:48, Karsten Hilbert wrote:
> Recently I have had access to a live GNUmed installation. I
> noticed that even over a LAN populating the document tree of
> a patient took a rather long time.
> The facts:
> - Intel Pentium Mobile 1.6 GHz
> - IDE ATA drive
> - 1.5 GB RAM
> - 32 MB swap in use
> - GNUmed 0.7 / v13 (for testing)
> - patient with 190 encounters, 36 documents and 56 pages therein
> While this laptop was both bzip2ing up a 7 GB backup and
> rsyncing another one to a second machine it took 30 seconds
> to populate the document tree.
> A few document load SQL tweaks later it took well under one
> second under the same load.

So the slowness resulted from the SQL? How did you go about optimizing it? I 
would have thought that unless the SQL itself carried retrieval overhead, there 
would be little room to speed up the query as such.

My impression was that parsing and working with the returned data can be done 
in various ways, at varying speeds.
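To illustrate that point (a small sketch, again with sqlite3 standing in for the real database): the same result set can be consumed either by converting rows to name/value mappings in Python, or by letting the driver do the mapping via a row factory, which shifts the per-row work out of the interpreted loop:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE doc (pk INTEGER, comment TEXT)")
conn.executemany("INSERT INTO doc VALUES (?, ?)",
                 [(i, "doc %d" % i) for i in range(1000)])

# Variant 1: fetch plain tuples, then build a dict per row by hand.
cur = conn.execute("SELECT pk, comment FROM doc")
cols = [d[0] for d in cur.description]
rows_manual = [dict(zip(cols, row)) for row in cur.fetchall()]

# Variant 2: let the driver map column names to values (no per-row zip
# in Python code).
conn.row_factory = sqlite3.Row
rows_factory = conn.execute("SELECT pk, comment FROM doc").fetchall()

# Same data either way; only the cost of the client-side conversion differs.
assert rows_manual[0]["comment"] == rows_factory[0]["comment"]
```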

I suppose the connection to the public database is physically slower, and is 
therefore a good way to identify candidates for a speedup.

