From: Noah Slater
Subject: Re: [Savannah-hackers-public] robots.txt disallows all spiders for mailing lists
Date: Sun, 5 Jul 2009 21:06:01 +0100
User-agent: Mutt/1.5.18 (2008-05-17)

On Sun, Jul 05, 2009 at 10:47:23PM +0300, Yavor Doganov wrote:
> Thanks, I was sure the reasons were purely technical, although I
> didn't remember the details.

Ah, this makes sense.

How about we disallow all spiders, but add an exemption rule for Google?
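
For example (a minimal sketch; Googlebot is Google's documented crawler
token, and an empty Disallow line means "allow everything" under the
original robots.txt convention):

    # Let Google's crawler index the archives.
    User-agent: Googlebot
    Disallow:

    # Keep all other spiders out.
    User-agent: *
    Disallow: /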

Sylvain, Karl, thoughts?

> I avoid using Google, Yahoo!, etc. search engines

That is good, for you.

> the GNU project does not have a mission to make their life easier

You have the wrong framing; this is about making it easy for users.

> If I have to read an old discussion from the time I was not subscribed
> to that particular list, I download it in mbox format, grep through it
> to discover the right file, and then use my MUA's facilities to search
> and read the interesting messages.  This has the extra advantage of
> following the discussion in the same natural order it happened, and
> (if necessary) replying without breaking threads.

And you expect all users of GNU software to do the same? Heh.
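
For the curious, that workflow amounts to something like this (a rough
sketch in Python using the standard mailbox module; the archive URL and
search term are hypothetical placeholders, not real list addresses):

    import mailbox
    import urllib.request

    # Fetch a monthly archive in mbox format (hypothetical URL).
    url = "https://lists.example.org/mbox/some-list/2009-07"
    urllib.request.urlretrieve(url, "archive.mbox")

    # Grep through it for the interesting messages.
    for msg in mailbox.mbox("archive.mbox"):
        if "robots.txt" in (msg["Subject"] or ""):
            print(msg["Date"], "--", msg["Subject"])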

> The main GNU site is different -- we want to be as convenient as possible for
> everyone to discover GNU and learn about the free software movement.

Ah, but once they've decided to use GNU, we make it as hard as possible?

Strong rhetoric, I know, but my point stands.

Anyway, thanks for the follow-up!

Best,

-- 
Noah Slater, http://tumbolia.org/nslater
