
Re: Capacity of NSDictionary


From: Dr. Rolf Jansen
Subject: Re: Capacity of NSDictionary
Date: Fri, 1 Feb 2008 16:51:13 -0200

Are you talking about 3 GB of raw data, or about 3 GB of memory allocated for the NSDictionary?

Does Solaris 10 run in 64-bit mode, and did you also compile GNUstep in 64-bit mode? If not, the memory available to your process is limited to 4 GB, and whether swapping occurs beyond that depends on the memory management of Solaris.
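A quick sanity check from inside the process itself (just a sketch for a minimal Foundation tool, nothing assumed beyond Foundation): print the pointer size; 8 bytes means the process really has a 64-bit address space.

#import <Foundation/Foundation.h>

int main(void)
{
   NSAutoreleasePool *pool = [[NSAutoreleasePool alloc] init];

   /* 8 bytes per pointer => 64-bit address space, 4 bytes => 32-bit */
   NSLog(@"pointer size: %lu bytes", (unsigned long)sizeof(void *));

   [pool release];
   return 0;
}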

Are you really talking about an immutable NSDictionary? In that case, and if you are completely in 64-bit mode, 16 GB of physical memory might be somehow sufficient. If you are working with a 3 GB mutable dictionary, you should be aware that it will occasionally move and copy itself from one location in memory to another when it has to grow and therefore needs to find a bigger free block of memory. This will of course also take some seconds.
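If you know roughly how many objects will end up in the dictionary, you can at least hint the capacity up front, so the table does not have to grow and rehash over and over while you fill it. A minimal sketch (the count is a made-up placeholder, use your own estimate):

/* sketch only: preallocate for the expected number of business objects;
   the capacity is a hint, the dictionary still grows if it is exceeded */
NSMutableDictionary *objects =
   [[NSMutableDictionary alloc] initWithCapacity: 5000000];

Whether the GNUstep-base implementation honours that hint aggressively enough for millions of entries is something you would have to measure, of course.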

If you are on a 32-bit machine, then you should forget about working with 3 GB chunks of data in memory and instead use a database backend of your choice for your data.

Best regards

Rolf Jansen


On 01.02.2008, at 15:45, Andreas Höschler wrote:

Hi all,

I am using a dictionary to hold business objects. As the key I use a class SOObjectID, which basically looks as follows:

@interface SOObjectID : NSObject < NSCopying, SRArchivingProtocol >
{
   int _identifier; // entity
   int _primaryKey;
}

- (unsigned int)hash
{
   return (_primaryKey * 1000 + _identifier);
}

- (BOOL)isEqual:(id)object
{
   if (self == object) return YES;
   if ([object isKindOfClass:[SOObjectID class]])
     {
       return ([(SOObjectID *)object identifier] == _identifier
         && [(SOObjectID *)object primaryKey] == _primaryKey);
     }
   else return NO;
}

I am hitting a serious performance problem when holding many objects in the dictionary. Currently I am running a test with 3 GB of data loaded into the process (running under Solaris 10), and it is getting slow as hell. The machine has 16 GB of RAM, so swapping is not an issue. I know that the above is not the cutest way to hold large chunks of data. I have already wondered whether it would be a good idea to replace the NSDictionary with something that uses a binary tree to hold the data. However, since that is not done in a couple of minutes, I would appreciate an easier solution for now. How many objects should a dictionary be able to handle with reasonable performance? Is there anything I could do in my SOObjectID class to improve performance?
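One idea I have not tried yet (just a sketch, and it assumes my entity identifiers fit into 8 bits and the primary keys into 24 bits): if identifiers can reach 1000 or more, the hash above can map distinct keys to the same value, so packing the two fields into disjoint bit ranges might spread the keys better:

- (unsigned int)hash
{
   /* sketch: entity identifier in the high 8 bits, primary key in the
      low 24 bits; under the assumption that _identifier < 256 and
      _primaryKey < 2^24, no two distinct keys share a hash value */
   return ((unsigned int)_identifier << 24) ^ (unsigned int)_primaryKey;
}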

Thanks a lot!

Regards,

  Andreas


