Re: api.header.include and backward-compatible .y files

From: Akim Demaille
Subject: Re: api.header.include and backward-compatible .y files
Date: Sat, 19 Sep 2020 11:07:15 +0200

Hi Kaz,

> On 7 Sep 2020, at 11:30, Kaz Kylheku <kaz@kylheku.com> wrote:
> On 2020-09-06 00:46, Akim Demaille wrote:
>> Kaz,
>> You are vastly simplifying things.  In particular, you completely
>> discard the problems with evolutions here.
> I see simplifying things as my job, really.

So do I.  But ignoring the problem is not simplifying it.

>>> Bison's user is whoever runs Bison. Bison's user is not that one
>>> who runs the program built with Bison; that is the user's user.
>>> The user's user is not your user.
>>> You cannot assume that your user is just a middleman in
>>> a delivery chain, who can deal with any nuisance that lands his way,
>>> because it's his job. That user may be a free software developer,
>>> like you.
>> You are misunderstanding my point.  My point is that back in the
>> day people were shipping releases, and releases are self-contained:
>> they protect the end user from any non-standard dependency such
>> as Autoconf, Automake, Bison, Flex, Gperf, Libtool, Gettext, just
>> to name a few close to the GNU project.  Installing a release
>> was super easy, because you hardly had any dependency.
> Well, Autoconf without question! If the end user of a program is
> required to have Autoconf, then the developer has misunderstood the
> meaning of Autoconf, which is to generate a configure script that
> assumes little about the environment.
> I think that Autoconf and Automake are so thoroughly baked into
> the "DNA" of Bison, that you may be losing touch with the idea that
> a parser generator is not Autoconf.

I think you and I are wasting time trying to rationalize things.  It
just boils down, IMHO, to deep disagreements in how things should be
done.  Each time one of us puts forward a "fact", the other one advances
a counter-argument.  This is driving us nowhere.

So, let's be done with all these pseudo-philosophical issues ASAP,
and let's go back to addressing real problems.

You claim that Bison is a program like any other and it's fine to
require it on the user's machine, and I disagree.  Not only do I,
Akim, disagree, but it's the whole GNU project that disagrees, and
GNU is _the_ standard I follow.


> Naturally, all the source files must be in the distribution. It is okay to 
> include non-source files in the distribution along with the source files they 
> are generated from, provided they are up-to-date with the source they are 
> made from, and machine-independent, so that normal building of the 
> distribution will never modify them. We commonly include non-source files 
> produced by Autoconf, Automake, Bison, flex, TeX, and makeinfo; this helps 
> avoid unnecessary dependencies between our distributions, so that users can 
> install whichever versions of whichever packages they like. Do not induce new 
> dependencies on other software lightly.

> Look, Bison's tree makes more use of M4 macros than anything I remember
> ever seeing. It's not just for configuration but elements like
> parser skeletons and test cases. It's kind of weird!

Granted.  But I don't see what your point is here.

> In the Unix world, Yacc is standard. You can rely on it for building
> your program as surely as you can rely on make, awk, sed, or the
> shell.

You keep on _not_ making the distinction between a package, a git
checkout, and a distro.  We cannot understand each other if you
continue discarding that distinction.

>> Maintainers and contributors had a way more complex task: setting
>> up a *developer*  environment with all the required versions of the
>> required tools.  And they had to keep their environment fresh.  On
>> occasions it meant using non released versions of these tools.  But
>> that was not a problem, because it was only on the shoulders of a
>> few experienced people.
>> Way too often today people no longer make self-contained releases,
>> and releases are hardly different from a git snapshot.  That is
>> wrong.
> Nope; what is wrong is thinking there is a difference.

I believe you live in a perfect world, where all the features you
need already exist in every available version of the tools you
depend upon.


But it turns out that some of the tools evolve to offer new
features.  And if you want/need to depend upon such a feature,
either you wait for *years* until it is ubiquitous enough for you
to start using it, or you ship the generated result.

You are simplifying the problem, and then, naturally, you don't
understand the proposed solution.

Again, another example of a thread of conversation where you and I
can just not converge to an understanding.

Let's stop that, it's pointless.

Rather, let's fix problems.  Start a thread with a short description
of *one* issue to address.  Don't send several emails, or one with
many items; I just don't have the bandwidth for that.  Let's work
through problems one after the other, please.

Or just wake up a thread I have left unanswered.

>> This is wrong because now end users need to install tons of tools.
> Good incentive to keep that tool count down, right?
> If you think the user will hate installing a ton of tools, what
> makes you think the contributing user won't hate it?

If a contributor is not ready to install the dependencies to become
a contributor, well, too bad.

>> And most of them don't want to install recent versions of these tools
>> (and I don't blame them), they just want to use the one provided by
>> their distro.
> Are you the same person who reminded me not to use GCC-specific
> warning options in a patch, because Bison builds with many C compilers?
> :)

I don't see what's funny/ironic here.  Yes, Bison aims for full
portability, so, yes, I don't want to bake gratuitous dependencies
into Bison.  What is your point?

>> So today, some maintainers locked themselves into not being able
>> to use tools that are not widespread enough.  Not to mention that
>> they might even have to deal with different behaviors from
>> different versions of the tool.  Then they find it convenient to
>> blame the evolution of the tool.
> I'd love to see you maintain Bison stoically, without complaining,
> if different versions of your compiler (or other tools) were
> producing different results.

I spend an enormous amount of my own private time, of my very own
life, addressing portability issues.  I certainly pay more attention
to portability than most maintainers need to, because Bison is a
code generator: I don't just fight my own fights (getting bison
itself to compile), I also fight my users' fights (getting the
generated parsers to compile cleanly everywhere).

Sure, I do have my complaints.  For instance, if you can figure out
what problem ICC has with Bison itself, I'd be happy to address
it (e.g., https://travis-ci.org/github/akimd/bison/jobs/727908860).

> We have detailed requirements for those things, and international
> standards, for good reasons.

You're only seeing the tip of the iceberg.  There are compiler
bugs, there are warnings that users don't want to see in the
generated code, etc.

The standards shield us from most issues.  Not all of them.

>> But the problem is rather their use of the generator.  *They* are
>> in charge of generating, say, Bison parsers, and of shipping them
>> in their releases.  That's a mild effort, but with a huge ROI: you
>> no longer, ever, have to face the nightmare of having to support
>> very different versions of the tool, and you also can *immediately*
>> benefit from new features.
> What do you do if you've maintained a program for over a decade,
> and none of your old commits have a copy of the generated parser?
> When you do a "git bisect", the old builds have to build!

I agree.  But I've never had to bisect over several *years*.  If
I had to, I would install the required versions of the tools, and
certainly not push onto everybody a constraint that I can easily
address on my side for my one-time problem.

>> I know of several projects, some very important ones, that are
>> stuck with old versions of Bison although they could benefit from
> Are they really stuck with old versions of Bison, though?
> That only happens if their code actually doesn't work with newer Bison.
> *Using* only the features provided by Ubuntu 18's packaged Bison
> is not the same as being *stuck*.
> I'm using Ubuntu 18's Bison myself, but I'm not stuck. It looks
> like my stuff works with 3.7.

You didn't get my point.  Actually, your answer is in the middle
of my sentence, so let's finish the sentence first:

>> newer features, features that have sometimes been written *for them*,
>> to simplify *their* problems.  But they still have the old hackish
>> code because the recent releases of Bison are not available "yet"
>> in Ubuntu 18.04...  Gee.

I was saying: projects that agree to ship the generated sources
can _immediately_ benefit from new features.  People who expect
the releases to be "widely available" (whatever that means exactly)
have to wait for years.

Shipping the generated sources not only relieves the user from
having to install some specific version of a tool, and not only
makes the version-to-version issue nonexistent: it also buys you
freedom.
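The release layout this implies can be sketched as follows (file names and the make rule are illustrative, not taken from any particular project):

```shell
# A sketch of the "ship the generated sources" layout: the tarball
# carries both the grammar and the pregenerated parser, so end users
# build without Bison installed.
mkdir -p demo-release
printf '%%%%\nstart: ;\n' > demo-release/parser.y    # the real source
printf '/* pregenerated by the maintainer */\n' > demo-release/parser.c
# A make rule such as
#     parser.c: parser.y
#             bison -o $@ $<
# only fires when a maintainer edits parser.y; for a fresh tarball,
# parser.c is already up to date, so bison is never invoked.
test -f demo-release/parser.c && echo 'builds without bison'
```

The key point is that the timestamp-based rule stays inert for end users, and only maintainers who touch the grammar ever need the generator.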

> Can't Bison have improvements that are internal? As in, nothing
> changes in the input file, but the output is better?

It happens, marginally, and it depends on what you call "better".
Most such changes are only about silencing warnings from some
compilers.  It did happen that I nuked a useless table, so the
parser got smaller.  Other changes have varying mileage; for
instance, if you have a really large parser, the integer types used
to be too small, so there "better" means "goes from not working to
working".

> Suppose I want the user to benefit from the newest, shiniest Bison
> they can get their hands on.
> The following is the actual situation.
> My code works with old Bison, as far back as 2.x.

You are accepting a strong constraint here.  *You* decided that.
Bison 3, as its version number indicates, has cut some old cruft.
Supporting both 2.x and 3.x works well if you stay on the main
features, and a lot of effort was put into backward compatibility,
but there are undoubtedly differences.

Bison 3.0 was released more than 7 years ago.

Some people, though, try to stick with 2.3 (fourteen years old...)
for just one single reason: Bison v2.4 moved to GPLv3, and Apple does
not like the GPLv3, so it still ships 2.3 today.

I have no reason to comply with Apple's dictatorship here.

> The Bison I'm using is behind; it's the Ubuntu one.
> But, I have downstream packagers who are on Bison 3.7.
> Maybe that puts out better code. Maybe tables are compressed better,
> or the skeleton has some new tricks to run faster or whatever.
> Maybe some buffer overflow has been fixed somewhere.

[Off topic, but: no, I am not aware of any such issue in the
generated code.  Ever.]

> I have no idea.
> Just because I'm not using new syntax doesn't mean I'm not
> using new Bison. Just like just because I'm using C99
> or C90 doesn't mean I'm not getting better code generation
> or diagnostics.
> Why would I ship frozen parser output? Why recommend that to me?

Because you know it works.  Because that's the one you ran your test
suite with.  At my day job, when we ship a product, it comes with
its dependencies and is deployed with them.  If you change your
dependencies, you are taking chances.

In the precise case of Bison, backward compatibility is maintained.
Your threads give the impression that it is not the case, but it is,
and it represents a significant share of my work: how to make Bison
evolve without breaking anything existing.  But whatever you change
in a program, there will always be someone for whom it did change
something, most of the time because they relied on private,
undocumented features of the generated code.  I do what I can for
them, but they are in the realm of undefined behavior.

Back to the point: in the precise case of Bison, backward
compatibility _is_ maintained.  So more often than not, running
bison again will give you something functional.
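That backward compatibility is easiest to rely on with grammars that stick to long-stable constructs. A hypothetical minimal example (the file and rule names are made up):

```shell
# A hypothetical minimal grammar using only constructs that have been
# stable across Bison 2.x and 3.x (and indeed POSIX yacc): %token,
# the %% separators, and plain rules.  Such a file can be regenerated
# with either major version.
cat > calc.y <<'EOF'
%token NUM
%%
expr: NUM
    | expr '+' NUM
    ;
%%
EOF
grep -c '^%token' calc.y   # prints 1: one token declaration
```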

> The downstream packagers have chosen Bison 3.7 for their distro,
> and expect all programs to use that Bison.

No, that is wrong.  This only happens if you *force* them to run
the generator because your release does not ship the generated
sources.

> If there is some security issue found in Bison-generated code,
> they expect to be able to upgrade Bison and rebuild all
> packages that name it as a dependency.

That is definitely a scenario where the GNU way has issues.
But finding one such scenario does not invalidate all the others.
Engineering is exactly about that: finding the best compromise.

> Shipping a frozen parser is downright antisocial.

Nice motto :)  But that does not make it true.

> Today, the consumers of the free software developer's code base
> are downstream packagers. They have the whole suite of tools, by definition.
> The users who run the program get binaries from the packagers;
> they need no tools.

Exactly my point too.

> If they do need tools, their packagers have them all,
> in package form, so they can be almost instantly as well-tooled
> as the distro itself.
> The imaginary user with the "medium amount of tools" went
> extinct in the 1990's.
> The modern user has all the tools. He or she just doesn't have the latest
> version of all of them, necessarily.


>>> The consumers of programming languages
>>> are programmers. Yet, we broadly value stability of programming
>>> languages. Multiple implementations that adhere to common standards
>>> are also a boon.
>> True, but moot.  There's one Bison.
> There is one C#. So Microsoft should just break all C# code
> written before 2014.

I don't know whether I truly fail to make you understand what I
mean, or whether you just enjoy extracting from my sentences things
I did not say.

>> The right approach is rather to see how your need is part of general
>> pattern, and how that need can be fulfilled in a clean way.
>> But don't, say, happily sed the generated output, and expect it to
>> work forever.
> If Bison has a test case representing a usage, then that will continue
> to work.

You are not the one who decides what is part of the contract and
what is not.  This thread is plainly useless, and just seems to
demonstrate that either we don't understand each other, or we stand
at opposite ends and cannot find a way to converge.  Maybe a
nightly discussion over a lot of beer would make it easier, but we
are too far away from one another (I believe you're in Vancouver,
aren't you?).

You are not the one who decides what is part of the contract, but
you do contribute to shaping it by stating the problems you are
facing and helping us see how we can address them.  So let's go
back, as I said, to concrete issues.

> sed-ing the output is a poor approach, which was made necessary due to
> not having a test case in the parser generator to check the behavior.



And no.

*You* seem to believe Bison should not generate the signature of yyparse
in the header.  But that's Kaz, not the typical use case.  And instead
of just accepting that, you decided to fight it.

Fighting your tools is certainly one way to have endless problems with
them.  So let's start a thread, again, for just that one issue.  We'll
see the other ones afterwards.

> Well, not exactly necessary. A fix was necessary, and there are always
> alternative solutions to seding the output.
> However, seding the output is (sometimes) the simplest solution, which
> has the virtue of having the highest probability of easily backporting
> to old baselines whose builds require the fix.

These are workarounds, and Bison does what it can to make such
workarounds unnecessary by providing features instead.  Let's work
this out.
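The contrast can be sketched concretely. The file below merely stands in for generated output (it is not real Bison output), and `%define api.prefix` is the kind of supported feature meant to replace such seding:

```shell
# The workaround under discussion: post-processing generated output
# with sed.  This stand-in file mimics a declaration a generated
# parser would contain.
printf 'int yyparse (void);\n' > parser.c
sed 's/yyparse/foo_parse/' parser.c > parser_renamed.c
grep -q 'foo_parse' parser_renamed.c && echo 'sed workaround applied'
# The supported route is a feature instead, e.g. putting
#     %define api.prefix {foo_}
# in the grammar so Bison itself emits foo_parse -- no fragile
# dependence on the exact textual shape of the generated code.
```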

> If you go back with "git bisect", that seding is very easy to apply.
> Even if there is a conflict in that Makefile rule (rare), it can be
> added by hand. A refactoring of the code may not backport as easily.

Sorry, you lost me here.

If you don't tweak the output, but just use the tool, there's nothing
to do at all.

