
RE: [lwip-users] scribblewiki down


From: Pettinato, Jim
Subject: RE: [lwip-users] scribblewiki down
Date: Thu, 25 Sep 2008 16:00:59 -0400

Funny you should mention a logo... I noticed the lack of a logo with the new
wiki site also, and have begun working on something... How does everyone
feel about a hummingbird for an lwIP mascot? Lightweight, fast, doesn't
consume much in the way of resources... And it flies!!! I think it's a
suitable fit, but I'm not sure if it's been done elsewhere. Input?
 
__
 
James M. Pettinato, Jr.
Software Engineer
E: address@hidden | P: 814 898 5250 


FMC Technologies Measurement Solutions Inc.
1602 Wagner Avenue | Erie PA | 16510 USA 
Phone: 814 898 5000 | Fax: 814 899-3414
www.fmctechnologies.com 

 

-----Original Message-----
From: address@hidden
[mailto:address@hidden On Behalf
Of Thomas Taranowski
Sent: Thursday, September 25, 2008 2:45 PM
To: Mailing list for lwIP users
Subject: Re: [lwip-users] scribblewiki down

If we had a listing of all the wiki pages, we could use a wget script to
grab them all from the Google cache. It would then be a straightforward
copy and paste into the new MediaWiki. Ideally, though, we would get an
lwIP DB dump from the scribblewiki folks, which we could then merge into
our own wiki DB.
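
Something along these lines could be a starting point (a rough, untested
Python sketch; the Google cache URL pattern, the wiki_page_list.txt listing
file, and the output naming are only assumptions for illustration):

    # Fetch the Google-cached copy of each wiki page listed in a text file
    # and save the raw HTML locally for later copy-and-paste into MediaWiki.
    import os
    import time
    import urllib.parse
    import urllib.request

    # Assumed Google cache URL prefix; adjust if the cache lookup differs.
    CACHE_PREFIX = "http://webcache.googleusercontent.com/search?q=cache:"

    def fetch_cached(page_url, out_dir="cache_dump"):
        os.makedirs(out_dir, exist_ok=True)
        cache_url = CACHE_PREFIX + urllib.parse.quote(page_url, safe="")
        # Percent-encode the page URL so it can double as a file name.
        out_name = urllib.parse.quote(page_url, safe="") + ".html"
        req = urllib.request.Request(
            cache_url, headers={"User-Agent": "Mozilla/5.0"})
        with urllib.request.urlopen(req) as resp:
            html = resp.read()
        with open(os.path.join(out_dir, out_name), "wb") as f:
            f.write(html)

    if __name__ == "__main__":
        # Hypothetical listing of wiki page URLs, one per line.
        with open("wiki_page_list.txt") as listing:
            for line in listing:
                url = line.strip()
                if url:
                    fetch_cached(url)
                    time.sleep(2)  # go easy on the cache server

The saved HTML would still need hand-cleaning before pasting into the new
wiki, but at least nothing would be lost if the cache entries expire.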

As a side note, does anyone have a cool lwIP logo?  Everything I do
looks like bad programmer art.

On Thu, Sep 25, 2008 at 10:04 AM, address@hidden <address@hidden>
wrote:
> Grubb, Jared wrote:
>>
>> So, what we need now is a DB dump from the old wiki... I'm not sure 
>> how to get that.
>> jared
>>
>
> That won't be easy - unless there is some way other than asking the
> hosts of the 'old' wiki (which really is the 'current' one :) Anyway,
> who says the 'old' wiki will be online again at all so we can grab the
> pages back? I tried the Google cache method, but all the links lead to
> the real site, which makes it pretty hard to even grab the whole wiki
> as a backup of HTML pages... (we can't use a simple crawler and tell it
> to stay on the Google cache server, ignoring external links)
>
> Does anyone have a better idea? I'm a little afraid of losing the 
> pages built so far!
>
> Simon
>
>
> _______________________________________________
> lwip-users mailing list
> address@hidden
> http://lists.nongnu.org/mailman/listinfo/lwip-users
>



--
Thomas Taranowski
Certified NetBurner consultant
baringforge.com


_______________________________________________
lwip-users mailing list
address@hidden
http://lists.nongnu.org/mailman/listinfo/lwip-users



