
[Gnumed-devel] GNUmed (debian) servers and security

From: James Busser
Subject: [Gnumed-devel] GNUmed (debian) servers and security
Date: Sun, 27 Jan 2008 10:32:29 -0800

Well, I am starting to get organized, imagining a time when I can put GNUmed into production use. There are still many things to figure out. For a production machine, I am most concerned about security and data integrity. Here are a variety of thoughts; I would appreciate any input that strengthens them:

1. The server needs adequate physical protection. Even if the room in which it resides can be accessed by thieves, it would be good to have some additional physical lockdown of the machine. I understand that it is not unusual for thieves to bring bolt cutters with them, so a special hardened chain that cannot be severed with bolt cutters but must instead be cut with a grinder may be better in this situation.

2. Debian etch - what should be done to make it more secure? Does it come with services that should be removed or turned off? What sort of things (like Bastille Linux?) should be activated? Is there a set of practices we would encourage, or somewhere to be pointed to?
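As a starting point, a minimal audit sketch: list which daemons are bound to network sockets, since anything unneeded on a dedicated database server is a candidate for removal (the service names in the comments are illustrative, not a recommendation about any particular package):

```shell
# List every daemon listening on a TCP socket.
# netstat comes from net-tools on etch; ss from iproute2 also works.
listeners=$(netstat -tln 2>/dev/null || ss -tln 2>/dev/null || true)
echo "$listeners"
# Example cleanup for an unwanted service found above (names are examples):
#   /etc/init.d/portmap stop
#   update-rc.d -f portmap remove
#   apt-get remove --purge portmap
```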

3. The server's medical data (the Postgres cluster for GNUmed, dumps, downloaded HL7 messages etc.) should live on an encrypted partition. Truecrypt seems to have become the standard for multi-OS encryption, but its license does not qualify it for distribution in Debian proper. Is it still wiser to use it over, say, cryptmount?
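For comparison, one dm-crypt/LUKS arrangement that Debian supports out of the box (device name and mount point are examples; cryptmount achieves much the same via /etc/cryptmount/cmtab): after initializing the partition with `cryptsetup luksFormat /dev/sda3` and creating a filesystem on the opened mapping, these fragments open and mount it at boot, prompting for the passphrase:

```
# /etc/crypttab -- map /dev/sda3 to /dev/mapper/pgcrypt at boot
# <target>  <source>   <key file>  <options>
pgcrypt     /dev/sda3  none        luks

# /etc/fstab -- mount the opened volume where the Postgres cluster lives
/dev/mapper/pgcrypt  /var/lib/postgresql  ext3  defaults  0  2
```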

4. Access to the database. Should Postgres and the machine it sits on be better protected behind some other machine, or is it acceptable for this machine to be connected directly to the router/internet? Is there anything about this set-up that needs to be carefully considered? It seems to me that having Apache/Tomcat serve Oscar's MySQL data was counted as a strength, perhaps because Apache's security has been well tested, whereas if Postgres serves our data directly, are we in a less well tested environment?
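Whatever the network topology, Postgres itself can be narrowed considerably. A hedged sketch (the server address, subnet and database name are examples, and etch's PostgreSQL 8.1 supports all of these settings): bind only to the LAN interface, enable SSL, and accept only SSL connections with encrypted password authentication from the practice subnet:

```
# postgresql.conf -- listen only on loopback and the LAN interface
listen_addresses = 'localhost, 192.168.1.10'   # example server address
ssl = on

# pg_hba.conf -- SSL-only, md5 password auth, one example subnet
# TYPE     DATABASE    USER   CIDR-ADDRESS      METHOD
hostssl    gnumed_v9   all    192.168.1.0/24    md5
local      all         all                      ident sameuser
```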

5. Data integrity / safety (backups). We previously discussed the need for regular (including offsite) backups. Personally, I will be happier once an actual backup/restore cycle has been tested and shown to recover data that was known to exist! For sites which do not want to suffer downtime, a secondary server would be a good idea. Did Karsten or anyone else ever establish slave/replication services on their database(s)?
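The dump-and-copy half of this (not the replication question) can be sketched as a crontab for the postgres user; database name, paths and offsite host are examples, and the `\%` escapes are required because cron treats a bare `%` specially. Note that the `pg_restore -l` step only verifies the dump's table of contents is readable; the real proof remains an actual restore into a scratch cluster:

```
# m  h  dom mon dow  command (crontab for user postgres)
30   2  *   *   *    pg_dump -Fc gnumed_v9 > /var/backups/gnumed/gnumed-$(date +\%F).dump
40   2  *   *   *    pg_restore -l /var/backups/gnumed/gnumed-$(date +\%F).dump > /dev/null
45   2  *   *   *    rsync -a /var/backups/gnumed/ backup@offsite.example.org:gnumed/
```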
