History of programming languages - Ada

I've been reading History of programming languages---II, the proceedings of a 1993 conference. There's lots of interesting stuff, if you like that kind of thing, but I'm particularly struck by the section on Ada. There's fun like the distinction between general-purpose and "embedded" computing, which is always somewhat hard to define:

In the early seventies these were generally called "weapons system computers." A short time later they were called "embedded systems," to convey the message that they also included functions such as control, communications, and intelligence as part of an overall system, not to their physically being "embedded" in a weapon... Later an Appropriations Act invented the name "mission critical," which is certainly morale boosting anyway. Any attempt to parse these terms out of historical context is doomed. These distinctions may seem esoteric to an outsider, but within the DoD the great religious conflicts of history pale to insignificance by comparison...

But the bit that struck me was at the far end, once they'd defined their language, they needed actual compilers for it:

It was never the intent that the HOLWG [High Order Language Working Group] would implement compilers. This was the prerogative of the individual Services and of industry. It was hoped that settling on one language would make it attractive for the industry to produce compilers as commercial products, without government funding or control (as it has worked out). However, it was important that the Services show support for the standard by putting their money into some products. If no one thought the Services were interested (and money spells interest), then why should industry risk its own money? It was not actually necessary that the Service programs be successful, just that they exist. Indeed, there was a certain inhibitory factor; a company may not want to invest in a compiler for a particular machine if the government was doing the same and it might be available for free later.

(More here, that's in the Language control section). I find that a rather nice statement of a general problem; and it's good to see a govt organisation actually thinking about it.

I've no familiarity with Ada itself; on a quick glance, it looks clunky. But I'm a serial zealot - I was a Fortran zealot, then a Perl zealot, and am now a C zealot, so don't expect me to impartially evaluate anything.

Refs

* Fiordland rangers prepare for stoat plague
* The Top 10 Retractions of 2014
* Does irony have a place in science?
* TIOBE Index for December 2014 - C wins; langpop.com - C wins; IEEE - Java edges out C (boo!); Redmonk - C doesn't win :-(. Ada is hard to find in all of these.


John M can probably add more detail, but the problem with ADA was mission creep. Each of the armed services (it was a DOD project) wanted their own pet thing shoved in. And then another, and then . . .

If you want small, elegant and optimized for instrument control, try FORTH

By Eli Rabett (not verified) on 23 Dec 2014 #permalink

Never dabbled with Ada myself. I suppose the most exotic language I've come across is APL, which was an old IBM language originally used to model processors, I believe. It was really conducive to producing 'write-only' code. 6 months after you wrote stuff in APL, you couldn't even figure out what your own program used to do. IIRC, there was a single operator that did a complete 3D matrix reduction. Now that's really useful.

Funny that you mentioned military applications, because my first job out of uni circa 1978 was working on the F-16 radar. One of the fun things I did was to model the complete processor instruction set on an old HP micro in basic. It was really handy if you wanted to find out what the actual firmware would do without having to program it up in punch cards.

Mashey will probably be along here in a shot. I suspect we are functional equivalents from about the same era.

By metzomagic (not verified) on 23 Dec 2014 #permalink

Cute! I started a first year computer science degree in 1990, and they taught us basic programming using Ada because (their explanation, from memory) as a language for military embedded systems it had to be very robust to failure, and they believed that through using it they could get us to learn the proper respect due to careful attention to data types, careful declaration of variables, etc. From memory, you couldn't easily get away with sloppy behavior in this regard, and this forced good practice on the students.

I have no idea if Ada is a good teaching language, because I dropped out of that course after 6 weeks when I realized how insufferable and unpleasant computer science students were.

By Captain Flashheart (not verified) on 23 Dec 2014 #permalink

No ADA experience. Perl is my favorite, especially since Strawberry Perl means I can do things with a PC no one else can. I also liked SAS once upon a time, though not sure if that qualifies as a language. I always thought of "embedded" as being the microcode executed by the processor, but I am not mainly a programmer. First programming language in school was Pascal. Lisp and Skill are close to each other, and I still need it for emacs. TCL is apparently free, so many tool vendors have a TCL interface now. Python seems to be what kids like now. Fortran reminds me of the old card-based instruction, followed by Spice. On the other side of the coin is the hardware developer, either tweaking the CPU for a pre-compiled benchmark regardless of the compromise or (more sensibly) trying to optimize the compiler for the CPU. There were many tradeoffs made so that a certain sequence of self-modifying code in the old "Doom" video game would run unchanged on newer CPUs.

By Tim Beatty (not verified) on 24 Dec 2014 #permalink

If your project requires you to use anything other than Fortran it means you're just playing with toys.

By Raymond Arritt (not verified) on 24 Dec 2014 #permalink

"If your project requires you to use anything other than Fortran it means you’re just playing with toys."

Well, pilots would probably tell you that the F-16 is a hell of a fun toy (the software was written in Ada). And, no, Hank, you wouldn't want to write something that large and complex in Forth, though Forth's fine for the relatively simple embedded control systems it was developed for.

Ada was mostly adopted by the USAF, which had a history of interest in forward-looking languages.

JOVIAL, for instance - Jules' Own Version of the International Algebraic Language (IAL was the early working name for what became Algol 60). USAF used it for years.

FORTRAN, well, we lost a spacecraft (unmanned, fortunately) because of a missing comma in a DO statement that turned it into a variable assignment statement ...

Eli's right that a big problem with Ada was that too much was crammed into the spec, making the four language proposals submitted to DOD overly complex. I was one of many, many compiler specialists who reviewed the four proposed language designs (for the record, "Green" won). I wasn't happy with any of them, having experience both in writing compilers and in developing embedded systems.

The following URL includes pointers to Dijkstra's comments on the process, the specifications, and the four language proposals for those whose interest in the history of Ada exceeds their common sense :)

Eli will disagree a bit with dhogaza. FORTH was really good in memory limited micro and minicomputers. You could buy small Z-80 boards with a FORTH system in ROM. Much more powerful than the BASIC that was the other choice, and much faster for development than compiling on a mini or mainframe. As fast or faster than compiled code in many cases.

FORTH was not a toy, and was used, indeed developed for, controlling large telescopes and image processing. What killed it was Moore's law, with memory and storage limits disappearing for all practical purposes.

[The system I program for - in C - has 56k bytes of RAM -W]

In a FORTH environment you would have a small Z-80 board running each part of the system and communicating with a central controller, which, come to think of it, is what we have today.

By Eli Rabett (not verified) on 24 Dec 2014 #permalink

Eli: "li will disagree a bit with dhogaza. FORTH was really good in memory limited micro and minicomputers"

Not sure about how this disagrees with "Forth’s fine for the relatively simple embedded control systems it was developed for."

What you describe is far simpler than the fly-by-wire software that controls today's Boeing 777 and 787, whose software is also written in Ada. Or the TGV. Or the Canadian air traffic control system.

Disclaimer: though I've worked as a compiler consultant for a major vendor of Ada, I've never used Ada professionally. In fact, I was hired to integrate a C++ front-end into their compiler and debugger technology ...

However, Ada has a fine track record in the aviation and related industries. I know of one experiment in which two teams implemented a complex project in both Ada and C, and the bug rate in the Ada code was about 50% of that in the C code. Which helps explain why it is still used for mission-critical software in the aviation world.

Modern (i.e. ANSI) C (which is pretty ancient now) does incorporate much of the type-checking and a few other notions that first cropped up in C++, so some of the sillier and easily prevented errors (for instance, calling functions with either too few or too many arguments) that used to plague C code have disappeared as the language and its compilers have improved and matured.
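
For instance, a minimal sketch (with an invented function) of the checking that ANSI prototypes bought us:

```c
#include <math.h>
#include <stdio.h>

/* With a full ANSI prototype in scope, every call site is checked
   for argument count and types. */
static double hyp(double a, double b)
{
    return sqrt(a * a + b * b);
}

int main(void)
{
    printf("%f\n", hyp(3.0, 4.0));   /* fine: prints 5.000000 */
    /* hyp(3.0);            rejected by an ANSI compiler: too few args */
    /* hyp(3.0, 4.0, 5.0);  rejected: too many args                    */
    /* Under K&R C, with only `double hyp();` visible, both calls
       would compile and misbehave at run time.                        */
    return 0;
}
```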

What's the line? C treats you like an adult. Pascal treats you like a naughty child. Ada treats you like a criminal.

By Eli Rabett (not verified) on 24 Dec 2014 #permalink

Because professional programmers don't need safety features.

Just like fighter pilots and race car drivers don't need four-point harnesses, race car drivers don't need rear-view mirrors, professional airline pilots don't need checklists, etc etc. Because, you know, they're professionals.

That attitude has led to more broken code than I care to think about.

Are you aware that the vast majority of breaches into Unix-based systems are due to buffer overflows caused by C programmers failing to manually check array bounds?

"C provides no built-in protection against accessing or overwriting data in any part of memory; more specifically, it does not check that data written to a buffer is within the boundaries of that buffer."

...

"Many other programming languages provide runtime checking and in some cases even compile-time checking which might send a warning or raise an exception when C or C++ would overwrite data and continue to execute further instructions until erroneous results are obtained which might or might not cause the program to crash. Examples of such languages include Ada, Eiffel, Lisp, Modula-2, Smalltalk, OCaml and such C-derivatives as Cyclone, Rust and D. The Java and .NET Framework bytecode environments also require bounds checking on all arrays."

http://en.wikipedia.org/wiki/Buffer_overflow#Barriers_to_exploitation
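
To make that concrete, here is a minimal sketch (function names invented) of the unchecked write the quote describes, next to the manual discipline C expects instead:

```c
#include <stdio.h>
#include <string.h>

/* Deliberately unsafe: C performs no bounds check, so a sufficiently
   long name overruns buf. This is the classic overflow. */
static void unsafe_copy(const char *name)
{
    char buf[8];
    strcpy(buf, name);   /* undefined behaviour if strlen(name) >= 8 */
    printf("%s\n", buf);
}

/* The manual discipline C demands instead: truncate to the buffer. */
static void safer_copy(const char *name)
{
    char buf[8];
    snprintf(buf, sizeof buf, "%s", name);   /* always NUL-terminates */
    printf("%s\n", buf);
}

int main(void)
{
    safer_copy("a perfectly reasonable but overlong input");
    /* unsafe_copy() with the same input would smash the stack. */
    return 0;
}
```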

This feature, if you will, of the C language has caused enormous economic damage, though in recent years buffer overflows have been replaced by sql injection as the most financially damaging error on the part of programmers (which, like buffer overflows, is easily prevented by use of proper tools and techniques, yet we hear of credit card thefts happening routinely, and usually it is due to sql injection).

Now, I write professionally in C and C++ for the most part. However, my code is routinely audited by a third-party security firm expressly for errors which are easily made in those languages (C++ provides many ways to lower the risk, which I routinely use). "C treats you as an adult". Well, there is no software engineer on the planet who writes error-free code for their entire career, though some are better than others. The fact that my code is audited by other security experts helps me sleep at night.

Next time you're at 30,000 feet in a fly-by-wire airplane, ask yourself: do you want the FAA to trust the programmers involved, or do you want the FAA to be suspicious of their capacity to make mistakes, and subject their code to the closest scrutiny? And if automated tools and features built into a modern language can ensure that certain errors simply can't be made, is there any particular reason why those tools shouldn't be used when lives are at stake?

[C is fine, as long as you turn up the checking and use lint sensibly. Any language that doesn't allow you to make that kind of error probably isn't very useful. However, I didn't intend this post to be a re-run of the language wars. I was trying to draw attention to the attitude within the Ada procurement / design process -W]

"Design by Committee" was the slur used against Ada in the FORTH community in the 80's... considering I was, at that point, one of the foremost Novix NC4000 and Harris RTX2000 programmers in the southern hemisphere, I guess I was one of those making the slurs.

Its formality was viewed as ugliness - strict checking was also seen as a negative thing.

Guess what: Ada lived.
It has, I think, fulfilled many of its goals.

I've never written code in it.
I did follow with interest a tutorial series in "Embedded System Design" magazine. I recall that it covered some of the more interesting features for modularisation and re-entrant code (which was its whole reason for being).
I note you can still get solid Ada support on most platforms.

I stopped using FORTH and went C++ in about 1998 and never looked back - that "formality and strict checking" business grows on you when your projects get to sufficient complexity.
In my dotage, I shall write a formal and strictly typed FORTH and close the circle :^

By Happy Heyoka (not verified) on 24 Dec 2014 #permalink

In response to WMC's last inline comment: there are several excellent programming languages which protect against array indexing errors (by raising an exception). Ada was one of the first although Standard ML, a later language, is my favorite, being one of the very few with a completely formal definition.

In Standard ML there is an 'unsafe' module so one can avoid index checking. Considerable experience shows almost no time is saved by doing so.
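
Written out by hand in C, the check those languages generate on every indexing operation looks roughly like this (a sketch, names invented; C has no exceptions, so abort() stands in for the raise):

```c
#include <stdio.h>
#include <stdlib.h>

/* A hand-written version of the check that Ada and Standard ML
   perform automatically on every array access. */
static int checked_get(const int *a, size_t len, size_t i)
{
    if (i >= len) {
        fprintf(stderr, "index %zu out of bounds (length %zu)\n", i, len);
        abort();   /* stands in for raising an exception */
    }
    return a[i];
}

int main(void)
{
    int data[4] = {1, 2, 3, 4};
    printf("%d\n", checked_get(data, 4, 2));   /* prints 3 */
    printf("%d\n", checked_get(data, 4, 7));   /* aborts with a message */
    return 0;
}
```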

At the time Ada was being specified and designed there simply were not enough good programmers, not of the caliber of the astronomers using FORTH. Ada was a fine step in the right direction.

By David B. Benson (not verified) on 24 Dec 2014 #permalink

1) See "Languages, Levels, Libraries, and Longevity". I have no clue why a 10-year-old article has gotten half its downloads in 2014. This is one of those strange articles that gets used much more often than it gets cited :-)

2) Recall that C really got going ~1973, when the UNIX kernel was rewritten in it, at a time when the biggest UNIX systems were 248KB PDP-11/45s, with 64KI+64KD memory per process.

Outside Burroughs, it was considered somewhat weird to write an OS in even a modestly high-level language like C.

3) I've never written any Ada, although I did give the keynote for Tri-Ada '95 and I followed this at the time, since we (SGI) did support Ada. A good rule is always to write in the highest-level language that is practical.

4) C of course started life amidst some rather good programmers, and no one had the foggiest clue of the impact that C and its derivatives would have. Many people thought we were nuts to be messing with minicomputers and then microprocessors.

5) In some languages, subscript range checking is pretty easy, and good optimizing compilers can even optimize away a lot of range checks. The problem for C was the need to use efficient pointers and often to deal with data structures created by somebody else, which makes bounds checking sometimes awkward and sometimes impossible at compile time (see the sketch at the end of this comment).

6) Many errors would have been avoided if more thought had been given early to standard functions & macros that would help write code to access memory, and especially write into memory, when the bounds were not statically visible.
For example, functions like strncpy(3) and strncmp(3) originated ~40 years ago from a need to copy strings into fixed buffers for system accounting, truncating if necessary. I'll admit that when I wrote the originals I might have thought about them more extensively.
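
A small illustration of those fixed-buffer semantics, and the sharp edge they left behind (buffer name invented):

```c
#include <stdio.h>
#include <string.h>

int main(void)
{
    char acct[8];   /* a fixed-width field, accounting-file style */

    /* strncpy truncates to fit the field, its original purpose, but
       does not NUL-terminate when the source fills the buffer ... */
    strncpy(acct, "verylongname", sizeof acct);
    acct[sizeof acct - 1] = '\0';   /* ... so the caller must remember
                                       to terminate it by hand */
    printf("%s\n", acct);           /* prints "verylon" */
    return 0;
}
```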

I can't point at examples, it's been a long time, but I recall that Bell Labs programming projects often layered project coding conventions atop C, and I think I recall that some used functions/macros/typedefs to make it safer ... although anything with C's pointers is hard.
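
A tiny sketch of that pointer problem (function names invented): with an explicit index and length, a checker has something to prove things about; with idiomatic pointer traversal, the bounds are simply not there to check.

```c
/* Indexed form: a bounds-checking compiler can prove 0 <= i < n
   throughout the loop and hoist or drop the per-access check. */
int sum_indexed(const int *a, int n)
{
    int s = 0;
    for (int i = 0; i < n; i++)
        s += a[i];
    return s;
}

/* Pointer form, idiomatic C: the extent of the object being walked
   is nowhere visible, so there is nothing for a checker to test. */
int sum_pointer(const int *p, const int *end)
{
    int s = 0;
    while (p != end)
        s += *p++;
    return s;
}
```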

By John Mashey (not verified) on 25 Dec 2014 #permalink

dhogaza said:

"Well, pilots would probably tell you that the F-16 is a hell of a fun toy (the software was written in Ada)."

Now that's interesting. As a matter of curiosity, what part of the avionics was written in Ada? The fly-by-wire bit, the HUD, or something? IIRC, the pilots didn't like the fly-by-wire concept at first until they finally built some feedback indications into the stick.

And... sounds like you're the same generation as Mashey and myself.

By metzomagic (not verified) on 25 Dec 2014 #permalink

I was working for a defense contractor in the late 80s, early 90s when the ADA Directive came down. Though I wasn't affected, I remember looking through various magazines that catered to the defense electronics industry and they seemed little more than wall-to-wall classifieds for ADA programmers.

With a dearth of qualified programmers available, I'd assume there were many projects that suffered - and/or had to endure a long learning curve until the supply of qualified ADA programmers caught up and enabled companies to be a bit more selective.

But while ADA had to be used/embedded in all of our military systems/products, it was rarely if ever used to create those products. I.e., most development and test was still done in Rocky Mountain Basic, C, or Pascal.

On a side note, I do understand the military's quandary. When the Army fielded their Calibration Management Information System (CALMIS) in the mid-80s it came with programming manuals for at least 5 different languages. Different modules were written in different languages, and to make a patch you had to be ready to program in whatever source the module was written in. Considering my only exposure to programming at the time was Apple's ProBasic, it was a great learning experience :)

By Kevin ONeill (not verified) on 26 Dec 2014 #permalink

I've been considering learning Ada just for the hell of it, and I've come to the conclusion it's exactly what it needs to be.

"Treats you like a criminal"? Look, there isn't a whole lot of desktop development in Ada, for good reason; it's a pain in the ass. But it is the way it is for a reason, and that's because when you're in a plane, a train, or a spacecraft going at speeds that nothing living can possibly approach, an out-of-bounds error can kill you. (I suspect Ada would be a good choice for high-availabililty client-server computing as well; it's hard to make a bug like Heartbleed happen when the compiler won't let a buffer overflow happen in the first place.)

The world probably won't end because LibreOffice crashed and ate the last revision of your book report. But that's not the kind of app Ada was created for.