
Re: [help-GIFT] some concept problems

From: Wolfgang Müller
Subject: Re: [help-GIFT] some concept problems
Date: Mon, 13 Apr 2009 10:51:44 +0200

The data structure of the GIFT is the inverted file. Each image is transformed into a pseudo text with terms like "dark_red_most_frequent_color_in_square_region_22". Each image has many square regions and a correspondingly large number of terms. Details are in the Squire/Müller/Müller/Pun/Raki papers you can find on the Geneva website.
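To make the idea concrete, here is a minimal sketch (not GIFT's actual code; the term names and function names are illustrative) of turning images into pseudo documents of feature terms and building an inverted file from them:

```python
from collections import defaultdict

def image_to_terms(block_colors):
    """Turn the dominant color of each square region into a pseudo-text term.

    block_colors: list of color names, one per square region.
    """
    return [f"{color}_most_frequent_color_in_square_region_{i}"
            for i, color in enumerate(block_colors)]

def build_inverted_file(corpus):
    """corpus: dict image_id -> list of terms.

    Returns the inverted file: term -> set of image ids containing it.
    """
    index = defaultdict(set)
    for image_id, terms in corpus.items():
        for term in terms:
            index[term].add(image_id)
    return index

# Two tiny "images", each described by the dominant color of two regions.
corpus = {
    "img1": image_to_terms(["dark_red", "blue"]),
    "img2": image_to_terms(["dark_red", "green"]),
}
index = build_inverted_file(corpus)

# Both images share the dark_red term for region 0.
print(sorted(index["dark_red_most_frequent_color_in_square_region_0"]))
# -> ['img1', 'img2']
```

Querying is then a text-retrieval problem: look up each query term in the inverted file and accumulate scores for the images it points to.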

The pruning is about not evaluating all query terms (features), but only those most likely to influence the result, i.e. the terms with the lowest document frequency. Evaluating 70% is a good compromise between computing time and result quality; the result is actually better than when evaluating 100%.
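A minimal sketch of this document-frequency pruning (function and variable names are my own, not GIFT's): rank the query terms by how many images contain them, and keep only the rarest fraction, since rare terms are the most discriminative.

```python
def prune_query_terms(query_terms, index, keep_fraction=0.7):
    """Keep the keep_fraction of query terms with the lowest document
    frequency; the rarest terms influence the ranking the most.

    index: inverted file, term -> set of image ids.
    """
    ranked = sorted(query_terms, key=lambda t: len(index.get(t, ())))
    keep = max(1, int(len(ranked) * keep_fraction))
    return ranked[:keep]

# Toy inverted file: document frequencies 1, 2 and 3.
index = {
    "rare_term":   {"img1"},
    "mid_term":    {"img1", "img2"},
    "common_term": {"img1", "img2", "img3"},
}
query = ["common_term", "rare_term", "mid_term"]
print(prune_query_terms(query, index, keep_fraction=0.7))
# -> ['rare_term', 'mid_term']  (the common term is pruned away)
```

With 70% kept, the most common 30% of the query terms are simply never looked up, which is where the time savings come from.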

In addition to these square-region (a.k.a. block) features, there are two types of histograms: color histograms and texture histograms. These need to be evaluated completely to make sense; at least, the type of pruning used for the block features does not work there. As an exercise you *could* add pruning for this type of feature: you would have to prune not by document frequency but by term frequency. Smith and Chang showed somewhere in the nineties that pruning works well in this case. There is another paper by Arjen P. de Vries and colleagues which goes in the same direction; I believe it is one of the BOND papers.
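Term-frequency pruning on a histogram could be sketched like this (again a hypothetical illustration, not GIFT code): instead of dropping terms that are common across the collection, you drop the *small* bins of the query image's own histogram, keeping only the bins with the highest weight, since they dominate the similarity score.

```python
def prune_histogram(hist, keep_fraction=0.7):
    """hist: dict bin_label -> normalized frequency within the query image.

    Keep the keep_fraction of bins with the highest term frequency; the
    largest bins contribute most to any histogram similarity measure.
    """
    ranked = sorted(hist.items(), key=lambda kv: kv[1], reverse=True)
    keep = max(1, int(len(ranked) * keep_fraction))
    return dict(ranked[:keep])

# Toy color histogram of one query image (frequencies sum to 1).
hist = {"dark_red": 0.50, "blue": 0.30, "green": 0.15, "yellow": 0.05}
print(prune_histogram(hist, keep_fraction=0.5))
# -> {'dark_red': 0.5, 'blue': 0.3}
```

The two smallest bins are discarded before scoring, trading a little accuracy for fewer bin comparisons per candidate image.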


On Sun, Apr 12, 2009 at 11:13 AM, higo ic <> wrote:
I see the caption "prune % of features" in gift-config and in some clients.
What does this concept mean?
I see that the algorithms sub1 and sub3 are at "prune 100% of features", and I don't understand why both of these algorithms are at 100%.
Could someone explain the "prune feature" setting in brief?

help-GIFT mailing list
