monotone-devel

Re: [Monotone-devel] Re: Monotone spam (was: wiki spam)


From: hendrik
Subject: Re: [Monotone-devel] Re: Monotone spam (was: wiki spam)
Date: Tue, 29 Apr 2008 09:53:41 -0400
User-agent: Mutt/1.5.9i

On Mon, Apr 28, 2008 at 03:48:27PM +1000, William Uther wrote:
> 
> On 27/04/2008, at 9:58 AM, address@hidden wrote:
> 
> >
> >On Sun, Apr 27, 2008 at 12:39:25AM +0200, Thomas Keller wrote:
> >>
> >>The amount of wiki spam gets more and more annoying - do we already
> >>have something set up like this [0] for our MoinMoin installation?
> >>
> >>And, can we somehow easily block certain IP (ranges) through
> >>LocalBadContent as well?  Whoever recently defaced our FrontPage used
> >>an IP inside the range 89.149.241.0 - 89.149.244.255, which is
> >>assigned to a small German web hoster [1], so it would not be a big /
> >>global issue to just block his further attempts.
> >
> >Eventually, there's going to be monotone spam.
> 
> Monotone is generally used in situations where there is more control
> over the user-base than you're assuming.  We don't have CVS spam, and
> the wiki is no more distributed than CVS.
> 
> >In a distributed network of monotone installations, especially in a
> >large one, someone, somewhere, is going to check in a revision
> >containing loads of spam, and netsync will quickly, and efficiently,
> >spread this to all the machines in the net.  Once it's done once, of
> >course, it will be done again and again.  Now the security model can
> >be configured to ignore the certificates relating to the spam, and
> >maybe repeatedly reconfigured as necessary, but do we have any
> >effective way of reclaiming its disk space?
> 
> Not yet.  Does MoinMoin reclaim the space of old spammed pages, or does
> it just revert the main page and leave the spam in the history?
> 
> Hrm.  As an aside, is there something stopping Google from indexing the
> history on the wiki (nofollow or robots.txt), or are those history
> links useful for SEO even once they've been reverted?
> 
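[A sketch of the crawler-blocking approach asked about above.  This
assumes MoinMoin's usual URL scheme, where history and diff views hang
off "?action=" query parameters; check your installation's actual URLs
before relying on it.]

```
User-agent: *
# Block action views such as ?action=info (history), ?action=diff and
# ?action=edit -- the rendered page content itself stays indexable.
Disallow: /*?action=
```

Note that wildcard Disallow patterns are an extension honoured by the
major crawlers (Googlebot, Bingbot), not part of the original
robots.txt convention, so emitting rel="nofollow" on those links as
well is a sensible belt-and-braces measure.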
> It would be nice to have a way to reject revs signed by a particular
> person on sync, unless they're also signed by someone else, or a child
> rev is signed by someone else.  This would stop bad revs from being
> sync'd all over a cloud.  This feature would be much easier to
> implement in nuskool, I suspect.
> 
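[The co-signature rule above can be sketched in a few lines of Python.
This is purely illustrative -- the data structures and function name
are invented here, not monotone's actual API or its Lua trust hooks.]

```python
# Illustrative trust rule: distrust revisions signed only by a blocked
# key, unless someone else co-signed the revision or signed a child.

def rev_is_acceptable(rev, signers_of, children_of, blocked_key):
    """signers_of:  revision id -> set of signing key names
    children_of: revision id -> list of child revision ids
    blocked_key: the key whose solo signatures we reject"""
    signers = signers_of.get(rev, set())
    # Fine if anyone besides the blocked key signed the revision itself.
    if signers - {blocked_key}:
        return True
    # Otherwise accept only if another key signed a child revision,
    # i.e. someone independently built on top of this one.
    return any(signers_of.get(child, set()) - {blocked_key}
               for child in children_of.get(rev, []))

# Example: "spammer" alone signed r1 and r2; "alice" co-signed r3,
# which is a child of r1.
signers = {"r1": {"spammer"}, "r2": {"spammer"}, "r3": {"spammer", "alice"}}
children = {"r1": ["r3"], "r2": [], "r3": []}
```

Here r1 is accepted because alice signed its child r3, and r3 is
accepted on its own co-signature, while r2 -- signed only by the
spammer, with no co-signed descendants -- is rejected on sync.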
> >Now I don't think it's urgent to solve this problem now, but it's
> >important to brainstorm solutions, so that when the time comes, we
> >won't have implemented a lot of things that make spamfighting hard.
> >
> >This is related to the problem discussed a few months ago about a
> >mechanism to permanently expunge a particular revision from all
> >copies that might exist anywhere.  There I believe the motivation
> >was to revoke copyright violations.
> 
> Yeah - it's useful.  I don't think it is the highest priority though.

I agree.  But it's one of the reservations I have about a completely 
distributed wiki.  If it's popular enough to make distribution 
necessary for scalability (as opposed to backup and convenience), it 
will be big enough for spam to be a problem.

-- hendrik



