From: SciFi
Subject: Re: [Pan-users] Please test my updated github master branch - "patches_for_imhotep82_master_GIT-ef96111.diff" - yEnc (1/1)
Date: Wed, 13 Jul 2011 05:01:22 +0000 (UTC)
User-agent: Pan/0.135 (Tomorrow I'll Wake Up and Scald Myself with Tea; GIT ef96111 (github.com/imhotep82/pan2/master); x86_64-apple-darwin10.8.0; gcc-4.2.1 (build 5666 (dot 3)); 32-bit mode)
On Tue, 12 Jul 2011 21:28:13 +0200, Heinrich Mueller wrote:
> 500 is obvious. The server doesn't support the command. If so,
> the program barfs atm. As I said, experimental ;)
> I think I'll put it into another branch.
Please do. That way we can test different builds as our personal time
allows, then switch back to a more-proven functioning version for
regular use.
> On 07/12/11 08:58, SciFi wrote:
>> (b) With my world-wide Usenet subscriptions at Giganews (primary) and
>> Astraweb (fallback) both, no red-flag warnings were seen, but we
>> also saw *no* updates in the newsgroup-subscription panel at all
>> -- none were bold-faced, none had '(numbers)' after them, etc.
>> This also was the result at Gmane, together with the already-
>> mentioned red-flag warnings in its log.
> I tested it with Astraweb and it worked, so I can't guess what's wrong.
As for Giganews, the only public article I could find is here:
<http://www.giganews.com/accelerator.html>
They do mention Newsbin there.
I wonder if I should open a support request to ask whether they implement
this compression via the XZVER/XZHDR commands as Astraweb has documented?
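Something like this rough Python sketch (my own guess at a probe, not Pan
code; the host and group names are placeholders) could at least tell us
whether a server even recognizes the command:

import socket

HOST, PORT = "news.example.com", 119          # placeholder server

with socket.create_connection((HOST, PORT), timeout=10) as s:
    f = s.makefile("rwb")
    f.readline()                              # 200/201 greeting
    f.write(b"GROUP alt.test\r\n")            # overview commands need a group
    f.flush()
    f.readline()                              # 211 group status (or 411)
    f.write(b"XZVER\r\n")                     # ask for compressed overview
    f.flush()
    reply = f.readline().decode("latin-1").strip()
    print(reply)  # "500 ..." = command unknown; 2xx = compressed data follows
    f.write(b"QUIT\r\n")
    f.flush()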
> Thank you for your feedback, much appreciated.
> I can't recreate your yenc-error, sorry.
I cranked the bytes-per-file spinner way, way up,
to ensure each file would be posted as one single slice
(I knew they were very small files).
>> Also, my text introduction was repeated for each file in separate
>> postings, with no Reference line(s) joining the messages themselves
>> (while the proper Reference was included to follow your original
>> posting only).
> Well, how many references would you like? GNKSA says the limit is
> something around 936 chars, so this would be impossible for large upload
> queues. Or am I understanding you wrong??
Please disregard what the GNKSA says about Reference lines.
I believe there are no _effective_ limits in the NNTP protocol for these,
since a long header line is "continued" onto the next line
by starting each continuation line with a space or tab.
But *EACH* Message/Reference ID *STRING* might have a server limit.
However, don't confuse these two aspects, please. ;)
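To illustrate the folding (a quick Python sketch using the stdlib email
module; the Message-IDs are invented):

from email.header import Header

# one logical References header, folded per RFC 5322: every physical
# line after the first begins with whitespace
ids = " ".join(f"<slice{i}.demo@news.invalid>" for i in range(1, 30))
folded = Header(ids, header_name="References", maxlinelen=78).encode()
print("References:", folded)

The header can grow as long as it likes; it's only each individual
Message-ID string that a server might cap.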
> I _could_ do as you suggested as an option, to let the user choose the
> repetition of the initial message in the posts or just the first, or let
> the message be separate from the binaries. I don't know what's better, yet.
For the "all-in-one-post" version of "multi-file" postings,
I kinda view this as what a PowerPoint™ presentation might look like:
---MIME-separator---
Content-Type: text/plain
initial message text etc.
---MIME-separator---
Content-Type: application/octet-stream (or image/png or whatever)
encoded data for the first file
---MIME-separator---
Content-Type: application/octet-stream (or whatever)
encoded data for the 2nd file
---MIME-separator---
etc etc etc
… just as MIME is used in e-mail to provide all the parts of the
whole message, esp'ly in HTML e-mail (gfx/pictures/etc), see.
This was what I was hoping for the two files I tried to post.
yeah I know this type of post can get awfully complex,
that's why there are specialized apps for this kind of stuff ;)
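In Python terms it would look something like this (a bare sketch of the
idea, not Pan's actual code; the filenames, group, and address are all
placeholders):

from email.message import EmailMessage

msg = EmailMessage()
msg["From"] = "poster@example.invalid"
msg["Newsgroups"] = "alt.test"
msg["Subject"] = "two small files in one article"
msg.set_content("initial message text etc.")        # the text/plain intro

for name in ("first.png", "second.png"):            # placeholder files
    with open(name, "rb") as fh:
        # each file becomes its own MIME part after the intro text
        msg.add_attachment(fh.read(),
                           maintype="application",
                           subtype="octet-stream",
                           filename=name)

print(msg.as_string())   # one article: intro + both files, MIME-separated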
As for the other kind of "multi-file" postings,
think of it as the usual big massive binary postings,
perhaps a RAR-split collection of files comprising a 700MB AVI for example.
(yencee allows an initial text-mode post ahead of the binary posts,
separated and not even Reference'd, as a "subject.nfo" file type thing.
I'll try to binsearch for such a post I did with yencee, I'll send
another message with the nzb for it.)
I haven't studied whether all of the subsequent
article-sized slices need the entire wad of Reference lines
to keep everything together. I kinda think it only needs to reference
the immediately-preceding ('just posted') Message-ID,
and not _all_ of the preceding ones that form 'this' file (RAR-part).
Much simpler in this case, of course.
Visually:
file.part01.rar (01/10) (first article-sized slice)
file.part01.rar (02/10) (2nd slice, needs to Reference the ID for 01/10)
file.part01.rar (03/10) (3rd slice: does it need to Reference 02/10 *and*
01/10? I think it only needs 02/10)
…etc…
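A quick sketch of that simpler chain-to-the-previous-slice scheme
(Python; make_msgid just generates unique IDs, the domain is made up):

from email.utils import make_msgid

prev_id = None
for n in range(1, 11):
    msgid = make_msgid(domain="news.invalid")    # placeholder domain
    headers = [f"Subject: file.part01.rar ({n:02d}/10)",
               f"Message-ID: {msgid}"]
    if prev_id:                                  # chain to the prior slice only
        headers.append(f"References: {prev_id}")
    print("\r\n".join(headers) + "\r\n")
    prev_id = msgid

Each slice carries exactly one Reference, so the header never grows no
matter how many slices a file needs.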
>> These files should have been small enough such that they won't need the
>> "part" designation, based on the Preference setting (byte-count per file).
>> -OR-
>> There is no proper yEnc "ending part" to say there is no more.
>
> UUEncodePartial appends a part counter for every upload, but I _could_
> change that.
As I alluded to, either pre-determine that the small file is a single-part
piece entirely, not needing any "slices" at all (comparing against the
bytes-per-file setting), or
be sure the single piece is completely self-contained,
including any "ending" gizmos that
tell the downloading app there is nothing more needed.
I think this latter mode is what got my posts into trouble,
yydecode was quite emphatic about the missing "ending" gizmos ISTM.
Again, look at yencee-at-Sourceforge,
it seems to be doing the first mode properly
i.e. it does not create "yEnc part=#" slices if it is the only piece.
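For reference, the framing difference looks roughly like this (a Python
sketch of the yEnc header/trailer lines only; the helper name is my own,
and payload escaping plus the crc32 fields are omitted):

def yenc_framing(name, size, part=None, total=None, begin=1, end=None):
    """Return the framing lines for a single- or multi-part yEnc post."""
    if part is None:
        # single-part: no part=, no =ypart line, one terminating =yend
        head = f"=ybegin line=128 size={size} name={name}"
        return [head, f"=yend size={size}"]
    end = end or size
    head = f"=ybegin part={part} total={total} line=128 size={size} name={name}"
    ypart = f"=ypart begin={begin} end={end}"        # byte range of this slice
    tail = f"=yend size={end - begin + 1} part={part}"
    return [head, ypart, tail]

print(*yenc_framing("small.png", 4321), sep="\n")
print(*yenc_framing("big.rar", 700_000_000,
                    part=1, total=10, begin=1, end=70_000_000), sep="\n")

The single-part form has its =yend right there, so a decoder like yydecode
knows the file is complete.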
It is imperative we get any/all encoding methods working as expected,
matching the tried-&-true apps already out there
being used esp'ly by the wynderz™ people.
> I'll try and fix those bugs. (And read your mails thoroughly this time ;) )
Upload test messages,
download 'em as original text mode,
use different tools (try using yydecode and yencee etc.),
use other newsreaders to ensure they like your test-posts (other volunteers can
do this),
tweak things as you go along,
and study-study-study. ;)
> See you.
Holler if needed.
Thanks much for taking time for all this. :)