Re: [Gnu-arch-users] [OT] GCC (was Re: Re: Command abbreviations)

From: Pierce T. Wetter III
Subject: Re: [Gnu-arch-users] [OT] GCC (was Re: Re: Command abbreviations)
Date: Fri, 19 Mar 2004 09:00:17 -0700

On Mar 18, 2004, at 4:36 PM, Tom Lord wrote:

From: "Pierce T. Wetter III" <address@hidden>

Anyway, GCC isn't the only game in town and, looking at my crystal
ball, it's going to be blown away w/in the next 10 years by something
about 10x smaller (measured by source-code).  Lcc is most definitely
not that GCC killer -- but it proves the point.

I don't know why that would be true, since it would reverse the trend
of all the other software projects in existence.

Out of curiosity, please elaborate.

Once upon a time, programming was a lot harder than it is now. There was less memory, less processor speed, less disk space. So in order to accomplish anything at all, programmers had to be smarter.

These days, there's so much memory, so much processor speed, and so much disk space that everything just keeps bloating, because it doesn't have to be concise and elegant. Good programmers spend their time solving more different problems and ignoring speed or size. But it's the bad programmers who then end up implementing all the other stuff, and it's crap, and there are more of them.

I'm not sure that any of the applications I use today work as well as, say, MacWrite 1.0 in its 64K.

That's what provoked me to respond, really. I miss Lotus Jazz every time I use Excel; it was a better program...

My claim is grounded, to whatever degree it's grounded at all, in
technology considerations: that the essential information content of
GCC source code can be far more concisely expressed;

We can both be right on this one. There's a big chunk of GCC that's actually produced by bison and flex. Adding a regular expression engine library to my own code made a lot of the other code much simpler. So I can definitely see your point that there can be "levels" of source code, and that a much higher-level description could produce GCC, especially since compiling is an extremely well-specified problem.

In my daily work, I've switched to using Python for a big chunk of it, because with PyObjC (a transparent Objective-C-to-Python bridge), I can access our data model directly and produce programs in the time I used to spend setting up an IDE project. I can even write programs in the interpreter to do one-offs when need be.

So that's how you're right. In fact, I would go on to say that the great strength of Perl and Python, and the reason for their success, is that you can code at a much higher level because arrays and maps are so well integrated into the language. I expect someone to finally clue in and integrate that into a more C-like language. There's no reason why:

 foreach (o,array)
    print o;

couldn't be legal C, which would shrink the equivalent C source:

 for (i = 0; i < array.count; i++)
      print array[i];

How I'm right is that if I count the source code to Python or Perl as part of "our" source code, it will take a long while before I've written enough negative code* to make up for the size of the Python or Perl executables. When someone writes hlcc (high level compiler compiler) to replace most of the guts of gcc, I would suspect that much of the current gcc code would end up moving to that. So now to work on gcc, you have to work on TWO programs: changes to hlcc, and changes to gcc...


 * negative code

If you change how you were doing something in a piece of code and it does the same thing in fewer lines, you've written negative code. Programmer productivity can be measured as:

 positive lines + 10*(negative lines)
