From paul.winalski at gmail.com  Tue Oct  1 01:49:28 2024
From: paul.winalski at gmail.com (Paul Winalski)
Date: Mon, 30 Sep 2024 11:49:28 -0400
Subject: [COFF] [TUHS] Re: Minimum Array Sizes in 16 bit C (was Maximum)
In-Reply-To: <20240928180559.GF9067@mcvoy.com>
References: <20240928165812.4uyturluj4dsuwef@illithid>
 <20240928180559.GF9067@mcvoy.com>
Message-ID:

[moving to COFF as this has drifted away from Unix]

On Sat, Sep 28, 2024 at 2:06 PM Larry McVoy wrote:

> I have a somewhat different view. I have a son who is learning to program
> and he asked me about C. I said "C is like driving a sports car on a
> twisty mountain road that has cliffs and no guard rails. If you want to
> check your phone while you are driving, it's not for you. It requires
> your full, focussed attention. So that sounds bad, right? Well, if
> you are someone who enjoys driving a sports car, and are good at it,
> perhaps C is for you."

If you really want a language with no guard rails, try programming in
BLISS.

Regarding C and C++ having dangerous language features--of course they
do. Every higher-level language I've ever seen has its set of toxic
language features that should be avoided if you want reliability and
maintainability for your programs. And a set of things to avoid if you
want portability.

Regarding managed dynamic memory allocation schemes that use garbage
collection vs. malloc()/free(), there are some applications where they
are not suitable. I'm thinking about real-time programs. You can't have
your missile defense software pause to do garbage collection when you're
trying to shoot down an incoming ballistic missile.

-Paul W.

From lm at mcvoy.com  Tue Oct  1 03:59:49 2024
From: lm at mcvoy.com (Larry McVoy)
Date: Mon, 30 Sep 2024 10:59:49 -0700
Subject: [COFF] [TUHS] Re: Minimum Array Sizes in 16 bit C (was Maximum)
In-Reply-To:
References: <20240928165812.4uyturluj4dsuwef@illithid>
 <20240928180559.GF9067@mcvoy.com>
Message-ID: <20240930175949.GI17434@mcvoy.com>

On Mon, Sep 30, 2024 at 11:49:28AM -0400, Paul Winalski wrote:
> [moving to COFF as this has drifted away from Unix]
>
> On Sat, Sep 28, 2024 at 2:06 PM Larry McVoy wrote:
>
> > I have a somewhat different view. I have a son who is learning to program
> > and he asked me about C. I said "C is like driving a sports car on a
> > twisty mountain road that has cliffs and no guard rails. If you want to
> > check your phone while you are driving, it's not for you. It requires
> > your full, focussed attention. So that sounds bad, right? Well, if
> > you are someone who enjoys driving a sports car, and are good at it,
> > perhaps C is for you."
>
> If you really want a language with no guard rails, try programming in
> BLISS.
>
> Regarding C and C++ having dangerous language features--of course they
> do. Every higher-level language I've ever seen has its set of toxic
> language features that should be avoided if you want reliability and
> maintainability for your programs. And a set of things to avoid if you
> want portability.
>
> Regarding managed dynamic memory allocation schemes that use garbage
> collection vs. malloc()/free(), there are some applications where they
> are not suitable. I'm thinking about real-time programs. You can't have
> your missile defense software pause to do garbage collection when you're
> trying to shoot down an incoming ballistic missile.

That's why I like reference counting. It doesn't have the long pauses
that other garbage collection systems have: when the variable goes out
of scope, you decrement; the last guy frees. Seems pretty simple.

--
---
Larry McVoy          Retired to fishing          http://www.mcvoy.com/lm/boat
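A minimal sketch in C of the discipline Larry describes, for readers who
haven't rolled one by hand. The type and function names are invented for
illustration, and a threaded program would need atomic increments and
decrements:

    #include <stdlib.h>
    #include <string.h>

    /* A hypothetical reference-counted buffer. */
    typedef struct {
        int   refs;    /* number of live references */
        char *data;
    } rcbuf;

    rcbuf *rc_new(const char *s) {
        rcbuf *b = malloc(sizeof *b);
        if (b == NULL)
            return NULL;
        b->refs = 1;               /* the creator holds the first reference */
        b->data = strdup(s);
        return b;
    }

    rcbuf *rc_retain(rcbuf *b) {   /* a new holder takes a reference */
        b->refs++;
        return b;
    }

    void rc_release(rcbuf *b) {    /* "you decrement, last guy frees" */
        if (--b->refs == 0) {
            free(b->data);
            free(b);
        }
    }

Every scope that stores the pointer calls rc_retain() on the way in and
rc_release() on the way out. The appeal for real-time work is that the
cost is spread evenly across execution rather than concentrated in
collection pauses, though, as Ralph notes later in the thread, the final
release can still trigger a variable amount of work inside free().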
From crossd at gmail.com  Tue Oct  1 04:08:58 2024
From: crossd at gmail.com (Dan Cross)
Date: Mon, 30 Sep 2024 14:08:58 -0400
Subject: [COFF] [TUHS] Re: Minimum Array Sizes in 16 bit C (was Maximum)
In-Reply-To: <20240930175949.GI17434@mcvoy.com>
References: <20240928165812.4uyturluj4dsuwef@illithid>
 <20240928180559.GF9067@mcvoy.com>
 <20240930175949.GI17434@mcvoy.com>
Message-ID:

On Mon, Sep 30, 2024 at 2:07 PM Larry McVoy wrote:
> On Mon, Sep 30, 2024 at 11:49:28AM -0400, Paul Winalski wrote:
> > [moving to COFF as this has drifted away from Unix]
> >
> > On Sat, Sep 28, 2024 at 2:06 PM Larry McVoy wrote:
> >
> > > I have a somewhat different view. I have a son who is learning to
> > > program and he asked me about C. I said "C is like driving a sports
> > > car on a twisty mountain road that has cliffs and no guard rails.
> > > If you want to check your phone while you are driving, it's not for
> > > you. It requires your full, focussed attention. So that sounds bad,
> > > right? Well, if you are someone who enjoys driving a sports car,
> > > and are good at it, perhaps C is for you."
> >
> > If you really want a language with no guard rails, try programming in
> > BLISS.
> >
> > Regarding C and C++ having dangerous language features--of course
> > they do. Every higher-level language I've ever seen has its set of
> > toxic language features that should be avoided if you want
> > reliability and maintainability for your programs. And a set of
> > things to avoid if you want portability.
> >
> > Regarding managed dynamic memory allocation schemes that use garbage
> > collection vs. malloc()/free(), there are some applications where
> > they are not suitable. I'm thinking about real-time programs. You
> > can't have your missile defense software pause to do garbage
> > collection when you're trying to shoot down an incoming ballistic
> > missile.
>
> That's why I like reference counting. It doesn't have the long pauses
> that other garbage collection systems have: when the variable goes out
> of scope, you decrement; the last guy frees. Seems pretty simple.

The problem with ref counting is that it's not completely general;
circular data structures will never be collected, even if all external
references to them disappear. That said, reference counting is a really
powerful technique; it's just that it must be used carefully.

        - Dan C.
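Dan's caveat is easy to demonstrate with the kind of counting scheme
sketched above (again with invented names, purely for illustration):
two nodes that point at each other hold each other's counts at one
forever, so neither is freed even after the program drops its last
pointers to them.

    #include <stdlib.h>

    /* An illustrative refcounted node that may reference another node. */
    typedef struct node {
        int refs;
        struct node *next;
    } node;

    node *node_new(void) {
        node *n = calloc(1, sizeof *n);
        if (n != NULL)
            n->refs = 1;
        return n;
    }

    void node_release(node *n) {
        if (n != NULL && --n->refs == 0) {
            node_release(n->next);   /* drop our reference to next */
            free(n);
        }
    }

    int main(void) {
        node *a = node_new();
        node *b = node_new();
        a->next = b; b->refs++;   /* a references b */
        b->next = a; a->refs++;   /* b references a: a cycle */
        node_release(a);          /* count drops 2 -> 1 ... */
        node_release(b);          /* ... and sticks there: both nodes leak */
        return 0;
    }

A tracing collector reclaims this case because it works outward from the
roots; with pure reference counting you need weak references or an
explicit cycle breaker.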
From jcapp at anteil.com  Tue Oct  1 06:12:00 2024
From: jcapp at anteil.com (Jim Capp)
Date: Mon, 30 Sep 2024 16:12:00 -0400 (EDT)
Subject: [COFF] [TUHS] Re: Minimum Array Sizes in 16 bit C (was Maximum)
In-Reply-To:
Message-ID: <5182549.12624.1727727120300.JavaMail.root@zimbraanteil>

Moving to COFF ...

From: "Rich Salz"
To: "TUHS main list"
Cc: "Douglas McIlroy"
Sent: Monday, September 30, 2024 4:03:15 PM
Subject: [TUHS] Re: Minimum Array Sizes in 16 bit C (was Maximum)

On Mon, Sep 30, 2024 at 3:12 PM Steffen Nurpmeso <steffen at sdaoden.eu>
wrote:

> noone ever told them that even the eldest C can be used in a safe way;

Perhaps we have different meanings of the word safe.

    void foo(char *p) {
        /* interesting stuff here */
        free(p);
    }

    void bar() {
        char *p = malloc(20);
        foo(p);
        printf("foo is %s\n", p);
        foo(p);
    }

Why should I have to think about this code when the language already
knows what is wrong.

No one would make the claim that programming in machine "language" is
safe. No one would make the claim that programming in assembly
"language" is safe. I've always viewed C as a portable assembler. I
think the real issue has nothing to do with the "safety" of C, but
rather the "safety" of your-choice-of-C-libraries-and-methods.

My $.02

Jim

From coff at tuhs.org  Tue Oct  1 14:31:38 2024
From: coff at tuhs.org (Warren Toomey via COFF)
Date: Tue, 1 Oct 2024 14:31:38 +1000
Subject: [COFF] Fwd: Re: Trove of CSTR's
Message-ID:

Poul-Henning also suggests this link as well ... Warren

----- Forwarded message from Poul-Henning Kamp -----

There is also 3B stuff in various other subdirectories on that site,
for instance:

  https://www.telecomarchive.com/six-digit.html

----- End forwarded message -----

From ralph at inputplus.co.uk  Tue Oct  1 21:52:36 2024
From: ralph at inputplus.co.uk (Ralph Corderoy)
Date: Tue, 01 Oct 2024 12:52:36 +0100
Subject: [COFF] Minimum Array Sizes in 16 bit C (was Maximum)
In-Reply-To: <20240930175949.GI17434@mcvoy.com>
References: <20240928165812.4uyturluj4dsuwef@illithid>
 <20240928180559.GF9067@mcvoy.com>
 <20240930175949.GI17434@mcvoy.com>
Message-ID: <20241001115236.D5993207D9@orac.inputplus.co.uk>

Hi Larry,

> > You can't have your missile defense software pause to do garbage
> > collection when you're trying to shoot down an incoming ballistic
> > missile.
>
> That's why I like reference counting. It doesn't have the long pauses
> that other garbage collection systems have: when the variable goes out
> of scope, you decrement; the last guy frees. Seems pretty simple.

It's better, but the free() might cause a varying amount of work, say
due to coalescing.

Garbage collection these days doesn't have to be long-pause ‘stop the
world’. Go's GC thread runs concurrently with the others, for example.
https://tip.golang.org/doc/gc-guide

-- 
Cheers, Ralph.

From ralph at inputplus.co.uk  Tue Oct  1 23:20:59 2024
From: ralph at inputplus.co.uk (Ralph Corderoy)
Date: Tue, 01 Oct 2024 14:20:59 +0100
Subject: [COFF] Shadowing variables in Go.
In-Reply-To: <202410011243.491ChKiV419651@freefriends.org>
References: <20240930003630.GE17434@mcvoy.com>
 <202410011243.491ChKiV419651@freefriends.org>
Message-ID: <20241001132059.511CD1FB21@orac.inputplus.co.uk>

Taken to COFF...

Hi Arnold,

> In main(), I *think* I'm assigning to the global clientSet so that
> I can use it later. But because of the 'err' and the :=, I've
> actually created a local variable that shadows the global one, and in
> otherfunc(), the global clientSet is still nil. Kaboom!
>
> The correct way to write the code is:
>
>     var err error
>     clientSet, err = cluster.MakeClient() // or whatever

I think this is a common problem when learning Go, like assigning
getchar()'s value to a char in C.

It was back in ’14 anyway, when I saw
https://www.qureet.com/blog/golang-beartrap/ which has an ‘err’ at an
outer scope left unwritten by the ‘:=’, with the new, assigned-to ‘err’
going unchecked.

The author mentions ‘go vet’ highlights these cases with -shadow, which
is off by default.
https://pkg.go.dev/github.com/golangci/govet#hdr-Shadowed_variables
suggests that's still the case.

-- 
Cheers, Ralph.
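Ralph's example is specific to Go's ‘:=’, but the underlying trap — an
inner declaration silently shadowing an outer variable — has a direct C
spelling too. A contrived sketch, invented here for illustration; gcc
and clang will warn about it with -Wshadow, which, like the vet check
above, is not enabled by default:

    #include <stdio.h>

    int err = 0;               /* global status, like the global clientSet */

    void setup(void) {
        int err = -1;          /* oops: declares a new local err */
        (void)err;             /* the global err is never written */
    }

    int main(void) {
        setup();
        printf("err = %d\n", err);   /* prints 0, not -1 */
        return 0;
    }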
From stuff at riddermarkfarm.ca  Wed Oct  2 01:08:06 2024
From: stuff at riddermarkfarm.ca (Stuff Received)
Date: Tue, 1 Oct 2024 11:08:06 -0400
Subject: [COFF] [TUHS] Re: Minimum Array Sizes in 16 bit C (was Maximum)
In-Reply-To:
References: <20240928165812.4uyturluj4dsuwef@illithid>
 <20240928180138.aygrwqdwrvq3n6xt@illithid>
 <202410011313.491DD4ac421643@freefriends.org>
 <20241001133231.GE13777@mcvoy.com>
 <202410011347.491DlAsJ423777@freefriends.org>
 <20241001140101.GG13777@mcvoy.com>
 <024bd803-2852-c0d0-5f15-30ec65c45cb4@makerlisp.com>
Message-ID:

[-->COFF]

On 2024-10-01 10:56, Dan Cross wrote (in part):

> I've found a grounding in mathematics useful for programming, but
> beyond some knowledge of the physical constraints that the universe
> places on us and a very healthy appreciation for the scientific
> method, I'm having a hard time understanding how the hard sciences
> would help out too much. Electrical engineering seems like it would be
> more useful, than, say, chemistry or geology.

I see this as related to the old question about whether it is easier to
teach domain experts to program or teach programmers about the domain.
(I worked for a company that wrote/sold scientific libraries for
embedded systems.) We had a mixture but the former was often easier.

S.

> I talk to a lot of academics, and I think they see the situation
> differently than is presented here. In a nutshell, the way a lot of
> them look at it, the amount of computer science in the world increases
> constantly while the amount of time they have to teach that to
> undergraduates remains fixed. As a result, they have to pick and
> choose what they teach very, very carefully, balancing a number of
> criteria as they do so. What this translates to in the real world
> isn't that the bar is lowered, but that the bar is different.
>
> - Dan C.

From paul.winalski at gmail.com  Wed Oct  2 02:40:32 2024
From: paul.winalski at gmail.com (Paul Winalski)
Date: Tue, 1 Oct 2024 12:40:32 -0400
Subject: [COFF] [TUHS] Re: Minimum Array Sizes in 16 bit C (was Maximum)
In-Reply-To: <202410011313.491DD4ac421643@freefriends.org>
References: <20240928165812.4uyturluj4dsuwef@illithid>
 <20240928180138.aygrwqdwrvq3n6xt@illithid>
 <202410011313.491DD4ac421643@freefriends.org>
Message-ID:

On Tue, Oct 1, 2024 at 9:13 AM wrote:

> This goes back to the evolution thing. At the time, C was a huge
> step up from FORTRAN and assembly.

Certainly it's a step up (and a BIG step up) from assembly. But I'd say
C is a step sidewise from Fortran. An awful lot of HPTC programming
involves throwing multidimensional arrays around and C is not suitable
for that.

-Paul W.
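A concrete, hedged illustration of Paul's point about arrays. In C
before C99, passing a matrix whose dimensions are known only at run
time meant flattening it and doing the index arithmetic yourself —
bookkeeping that Fortran's A(I,J) has always done for the programmer;
C99's variably modified parameter types recover some of the
convenience. Both functions below are sketches with invented names:

    /* Row-major indexing by hand: a is an nrows-by-ncols matrix. */
    double get(const double *a, int ncols, int i, int j) {
        return a[(long)i * ncols + j];
    }

    /* C99: dimensions passed as parameters, so a[i][j] works again. */
    double trace(int n, const double a[n][n]) {
        double t = 0.0;
        for (int i = 0; i < n; i++)
            t += a[i][i];
        return t;
    }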
From paul.winalski at gmail.com  Wed Oct  2 02:49:10 2024
From: paul.winalski at gmail.com (Paul Winalski)
Date: Tue, 1 Oct 2024 12:49:10 -0400
Subject: [COFF] [TUHS] Re: Minimum Array Sizes in 16 bit C (was Maximum)
In-Reply-To: <202410011347.491DlAsJ423777@freefriends.org>
References: <20240928165812.4uyturluj4dsuwef@illithid>
 <20240928180138.aygrwqdwrvq3n6xt@illithid>
 <202410011313.491DD4ac421643@freefriends.org>
 <20241001133231.GE13777@mcvoy.com>
 <202410011347.491DlAsJ423777@freefriends.org>
Message-ID:

On Tue, Oct 1, 2024 at 10:07 AM wrote:

[regarding writing an Ada compiler as a class project]

> Did you do generics? That and the run time, which had some real-time
> bits to it (*IIRC*, it's been a long time), as well as the cross
> object code type checking, would have been real bears.
>
> Like many things, the first 90% is easy, the second 90% is hard. :-)

I was in DEC's compiler group when they were implementing Ada for
VAX/VMS. It gets very tricky when routine libraries are involved. Just
figuring out the compilation order can be a real bear (part of this is
the cross object code type checking you mention).

From my viewpoint Ada suffered two problems. First, it was such a large
language and very tricky to implement--even more so than PL/I. Second,
it had US Government cooties.

-Paul W.

From ralph at inputplus.co.uk  Wed Oct  2 03:09:27 2024
From: ralph at inputplus.co.uk (Ralph Corderoy)
Date: Tue, 01 Oct 2024 18:09:27 +0100
Subject: [COFF] C a step up from Fortran.
In-Reply-To:
References: <20240928165812.4uyturluj4dsuwef@illithid>
 <20240928180138.aygrwqdwrvq3n6xt@illithid>
 <202410011313.491DD4ac421643@freefriends.org>
Message-ID: <20241001170927.4C66F1FB21@orac.inputplus.co.uk>

Hi Paul,

> Arnold wrote:
> > This goes back to the evolution thing. At the time, C was a huge
> > step up from FORTRAN and assembly.
>
> Certainly it's a step up (and a BIG step up) from assembly.
> But I'd say C is a step sidewise from Fortran. An awful lot of HPTC
> programming involves throwing multidimensional arrays around and C is
> not suitable for that.

I expect the structured-programming parts of C were being referred to
as a step up compared to Fortran 77. Ratfor was a step up in the same
direction.

Don't disagree about the HPTC side.

-- 
Cheers, Ralph.

From arnold at skeeve.com  Wed Oct  2 05:08:49 2024
From: arnold at skeeve.com (arnold at skeeve.com)
Date: Tue, 01 Oct 2024 13:08:49 -0600
Subject: [COFF] [TUHS] Re: Minimum Array Sizes in 16 bit C (was Maximum)
In-Reply-To:
References: <20240928165812.4uyturluj4dsuwef@illithid>
 <20240928180138.aygrwqdwrvq3n6xt@illithid>
 <202410011313.491DD4ac421643@freefriends.org>
Message-ID: <202410011908.491J8nM9448504@freefriends.org>

Paul Winalski wrote:

> On Tue, Oct 1, 2024 at 9:13 AM wrote:
>
> > This goes back to the evolution thing. At the time, C was a huge
> > step up from FORTRAN and assembly.
>
> Certainly it's a step up (and a BIG step up) from assembly. But I'd say
> C is a step sidewise from Fortran. An awful lot of HPTC programming
> involves throwing multidimensional arrays around and C is not suitable
> for that.
>
> -Paul W.

In my head, FORTRAN is still FORTRAN 66, where there are no while
loops, or else statements, and the only data structure is the array.
For non-HPC stuff, C was a huge step up.

From g.branden.robinson at gmail.com  Thu Oct  3 10:46:31 2024
From: g.branden.robinson at gmail.com (G. Branden Robinson)
Date: Wed, 2 Oct 2024 19:46:31 -0500
Subject: [COFF] Penn State
In-Reply-To:
References: <20240928165812.4uyturluj4dsuwef@illithid>
 <20240928180138.aygrwqdwrvq3n6xt@illithid>
 <202410011313.491DD4ac421643@freefriends.org>
 <8A3A3643-B1B2-4C41-A36F-BF00AF3C7020@iitbombay.org>
 <202410011907.491J74wT448406@freefriends.org>
 <202410020549.4925nXHf489153@freefriends.org>
Message-ID: <20241003004631.xy2xezik7jyhikr6@illithid>

At 2024-10-02T16:42:59-0400, Dan Cross wrote:
> On Wed, Oct 2, 2024 at 2:27 AM wrote:
> > Also true. In the late 80s I was a sysadmin at Emory U. We had a Vax
> > connected to BITNET with funky hardware and UREP, the Unix RSCS
> > Emulation Program, from the University of Pennsylvania. Every time I
> > had to dive into that code, I felt like I needed a shower
> > afterwards.
> > :-)
>
> Uh oh, lest the UPenn alumni among us get angry (high, Ron!) I feel I
> must point out that UREP wasn't from the University of Pennsylvania,
> but rather, from The Pennsylvania State University (yes, "The" is part
> of the name). UPenn (upenn.edu) is an Ivy in Philly; Penn State
> (psu.edu) is a state school in University Park, which is next to State
> College (really, that's the name of the town) with satellite campuses
> scattered around the state.

There's another method of distinguishing UPenn from Penn State. Permit
me to share my favorite joke on the subject, from ten years ago.

  "STATE COLLEGE, Pa. -- Construction workers tore down Penn State's
  iconic Joe Paterno statue on campus two years ago -- but this town
  might not be without one for much longer. Two alumni already have
  received the OK from the borough to install a projected $300,000
  life-sized bronze sculpture downtown, about two miles from the
  original site." -- ESPN ([1])

  "The key difference is that the new statue will look the other way."
  -- Chris Lawrence

Regards,
Branden

[1] https://www.espn.com/college-football/story/_/id/10828351/joe-paterno-honored-new-statue-state-college-pennsylvania

From athornton at gmail.com  Fri Oct  4 13:57:24 2024
From: athornton at gmail.com (Adam Thornton)
Date: Thu, 3 Oct 2024 20:57:24 -0700
Subject: [COFF] [TUHS] Re: Minimum Array Sizes in 16 bit C (was Maximum)
In-Reply-To: <20241001133231.GE13777@mcvoy.com>
References: <20240928165812.4uyturluj4dsuwef@illithid>
 <20240928180138.aygrwqdwrvq3n6xt@illithid>
 <202410011313.491DD4ac421643@freefriends.org>
 <20241001133231.GE13777@mcvoy.com>
Message-ID:

Moving over to COFF from TUHS.

The following was Larry McVoy:

> I don't consider myself to be that good of a programmer, I can point to
> dozens of people my age that can run circles around me and I'm sure there
> are many more. But apparently the bar is pretty low these days and I
> agree, that's sad.

It's hard not to feel like the bar is lower. I feel like since Steve
Grandi retired at NOIRLab, I and Josh Hoblitt are the only people left
who actually understand how IP networks work. And I'm not great, never
was, but I know a lot more than...everyone else. And kids these days,
well, I'm not very fluent in TypeScript and I really don't understand
why every damn thing needs to be asynchronous especially if you're just
awaiting its completion anyway. But, hey, it ain't that hard to do.

But again, there's a part of me that wonders how relevant the skills I
miss *are* anymore. I'm a software developer now, but I always thought
of myself as basically a sysadmin. It's just that we had automated away
all of what I started out doing (which was, what, 35-ish years ago?) by
20 years ago, and staying ahead of the automation has made me, of
necessity, a software developer now.

But I was also thinking of Larry saying he wouldn't last a week in
today's workplace, and I'm not sure that's true. I mean, there's a lot
of stuff that you once COULD say that would these days get you a quick
trip through HR and your crap in a box and a walk to the curb...but I am
a pretty foul-mouthed individual, and I have said nasty things about
people's code, and, indeed, the people who are repeat offenders with
respect to said code, and nevertheless I have had surprisingly few
issues with HR these last couple decades.
So in some sense it really DOES matter WHAT it is that's offensive that
you're saying, and I am living-and-still-employed proof. If you
generally treat people with respect until they prove they don't deserve
it, and you base your calumny on the bad technical decisions they make
and not their inherent characteristics, then it really ain't that hard
to get along in a woke workplace. And I say this as an abrasive
coworker, who happens to be a cis het white dude from a
fairly-mainstream Christian background and the usual set of academic
credentials.

Let's face it: to do a good job as a software developer or generally an
IT person, you do not need a penis. You do not need to worship the way
most people at your workplace do. You do not need a college degree, let
alone in CS. You do not need to be sexually attracted to the opposite
sex. You do not need to have the same gender now that you were assigned
at birth. You do not need two (or given the current state of the art,
ANY) working eyes. Or hands. You do not need to be under 40. You do not
need to be able to walk. You do not need pale skin. And anyone who's
saying shit about someone else based on THAT sort of thing *should* be
shown the curb, and quickly. And the fact that many employers are
willing to do this now is, in my opinion, a really good thing.

On the other hand, if someone reliably makes terrible technical
decisions, well, yeah, you should spend a little time understanding
whether there is a structural incentive to steer them that way and try
to help them if they're trainable, but sometimes there isn't and
they're not. And those people, it's OK to say they've got bad taste and
their implementations of their poor taste are worse. And at least in my
little corner of the world, which is quasi-academic and scientific,
there's a lot of that. Just because you're really really good at
astronomy doesn't mean you're good at writing intelligible, testable,
maintainable programs. Some very smart people have written really awful
code that solved their immediate problems, but that's no way to start a
library used by thousands of astronomers. But whether or not they're
competent software engineers ain't got shit to do with what they have
in their pants or what color their skin is.

And it's not always even obvious bigotry. I don't want to work with
toxic geniuses anymore. Even if the only awful things they do and say
are to people that they regard as intellectually inferior and are not
based on bullshit as above...look, I'd much rather work with someone
who writes just-OK code and is pleasant than someone who writes
brilliant code and who's always a quarter-second from going off on
someone not quite as smart as they are. Cleverness is vastly overrated.
I'd rather have someone with whom I don't dread interacting writing the
stuff I have to interface with, even if it means the code runs 25%
slower. Machine cycles are dirt cheap now. The number of places where
you SHOULD have to put up with toxicity because you get more efficient
code and it actually matters has been pretty tiny my entire adult
lifetime, and has been shrinking over that lifetime as well. And from a
maintainability standpoint...if I encounter someone else's just-OK
code, well, I can probably figure out what it's doing and why it's
there way, way more easily than someone's code that used to be blazing
fast, is now broken, and it turns out that's because it encodes
assumptions about the runtime environment that were true five years ago
and are no longer correct.
That said, it's (again, in my not-necessarily-representative
experience) not usually the nonspecific toxic genius people who get in
trouble with HR. The ones who do, well, much, MUCH, too often, are the
people complaining about wokeness in the workplace who just want to be
able to say bad things about their coworkers based on their race or
gender (or...) rather than the quality of their work, and I'm totally
happy to be in the "That's not OK" camp, and I applaud it when HR
repeats that and walks them out the door.

Adam

From paul.winalski at gmail.com  Sat Oct  5 01:39:56 2024
From: paul.winalski at gmail.com (Paul Winalski)
Date: Fri, 4 Oct 2024 11:39:56 -0400
Subject: [COFF] Modern programming vs. the Good Ole Days (was Minimum
 Array Sizes in 16 bit C) (was Maximum)
In-Reply-To:
References: <20240928165812.4uyturluj4dsuwef@illithid>
 <20240928180138.aygrwqdwrvq3n6xt@illithid>
 <202410011313.491DD4ac421643@freefriends.org>
 <20241001133231.GE13777@mcvoy.com>
Message-ID:

A bit about my background: The first machine I programmed was ca. 1970
in high school--a Monroe programmable calculator with four registers
and a maximum of 40 program steps. The most complicated thing I got it
to do was to iteratively calculate factorials (of course it got
arithmetic overflow pretty quickly). As an undergrad Biology major I
learned BASIC on DTSS and later Fortran IV and PL/I for an S/360 model
25 (32K max memory). I decided to go into computer technology rather
than Biology and attended CS grad school. I completed all the
coursework but never finished my thesis. After interning at IBM's
Cambridge Scientific Center for a few years, I joined DEC's software
development tools group in early 1980, later joining the GEM compiler
team to design and implement the non-compiler parts of the compiler
back end (heap memory management, generating listing files, command
line parsing, object file generation, etc.). Compaq acquired DEC and
about a year later sold the Alpha chip technology--including the GEM
optimizing compiler back end--to Intel. Many of the
engineers--including me--went with it. I retired from Intel in 2016.

One important thing I learned early on in my career at DEC is that
there is a big difference between Computer Science and Software
Engineering. I was very lucky that many of the top engineers in DEC's
compiler group had studied at CMU--one of the few schools that taught
SW Engineering skills as well as CS. I learned good SW engineering
practices from the get-go. Unlike CS programming, SW engineering has to
worry about things such as:

o design and implementation for testability and maintainability
o test system development
o commenting and documentation so that others can pick up and maintain
  your code
o algorithm scalability

This thread has spent a lot of time discussing how programming has
changed over the years. I bring the SW Engineering skill set up because
IMO it's just as relevant today as it was in the past. Perhaps even
more so.

My observation is that programming style has changed in response to
hardware getting faster and memory capacity getting larger. If your
program has to fit into 8K or 32K you have to make every byte
count--often at the expense of maintainability and algorithmic
efficiency. As machines got larger and faster, programming for small
size became less and less important.

The first revolution along these lines was the switch from writing in
machine code (assembler) to higher-level languages.
Machines had become fast enough that in general it didn't matter if the
compiler didn't generate the most efficient code possible. In most
cases the increase in productivity (HLLs are less error-prone than
assembler) and maintainability more than made up for less efficient
code.

The second revolution was structured programming. Machines had become
fast enough and large enough that one didn't have to resort to rat's
nest coding to make the program small and fast enough to be useful.
Structured programming made code more easily understood--both by humans
and by optimizing compilers.

These days we have machines with several levels of data caching and
multiple processor cores running asynchronously. If (as in the HPTC
world) you want to get the maximum performance out of the hardware, you
have to worry about framing your program in a way that can be
multitasked (to take advantage of all those cores) and you have to
worry about efficient cache management and interprocessor
communication. The ways to do this are not always intuitively obvious.
Modern optimizing compilers know all the (often completely
non-intuitive) efficiency rules and can best apply them when you write
your code in an algorithmically clean manner and leave the grunt work
of running it efficiently on the hardware to the compiler.

It's a very different world than when you had to figure out how to fit
your code and data into 8K!

-Paul W.
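One small, hedged illustration of the kind of non-intuitive efficiency
rule Paul describes, in C (the array size and function names are
invented for the example). C stores arrays row-major, so two loop nests
that compute exactly the same sum can behave very differently in the
cache--and a modern optimizing compiler, given the algorithmically
clean version, will often interchange the loops itself:

    #define N 1024
    static double a[N][N];

    /* Walks memory sequentially; each cache line is used fully. */
    double sum_row_order(void) {
        double s = 0.0;
        for (int i = 0; i < N; i++)
            for (int j = 0; j < N; j++)
                s += a[i][j];
        return s;
    }

    /* Same arithmetic, but strides N doubles between accesses,
       touching a new cache line on nearly every load. */
    double sum_column_order(void) {
        double s = 0.0;
        for (int j = 0; j < N; j++)
            for (int i = 0; i < N; i++)
                s += a[i][j];
        return s;
    }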