Dennis Ritchie has died

Dennis Ritchie, creator of C and co-creator of Unix, has died.

John Mashey writes:

Dennis was an old friend, and I'd heard this yesterday from Doug McIlroy. See this for how Dennis, Steve Bourne and I evolved my PWB stuff into UNIX V7's environment variables. Dennis in particular suggested the idea of just making them a 2nd argument list, which kept most of it out of the kernel and kept simple semantics.

Fortunately, Dennis got a Computer History Museum Fellow Award a while back, among other honors.

While sad, I'd like to remember that Dennis had fun, as with Rob Pike in this prank on Arno.

Dennis has already been covered by NPR and is getting lots of other press, as he should.

But, while Dennis and Steve were about as far apart as you could get in personality and nature of accomplishments, honoring DMR takes nothing from Steve. Both were great, just in very different directions. Of course, it is interesting that the iPhone runs a UNIX-derivative with apps written in a C+Smalltalk derivative.
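
An aside for the curious: the "2nd argument list" Mashey mentions above is still how the environment reaches every UNIX process today. Here is a minimal sketch, using the POSIX-guaranteed `environ` variable; the values in the comments are just examples.

```c
#include <stdio.h>

/* The "2nd argument list": alongside argv, each process is handed a
   second NULL-terminated vector of "NAME=value" strings.  The kernel
   just copies it across exec(); interpretation is left to user code,
   which is part of what kept the mechanism simple. */
extern char **environ;           /* POSIX-guaranteed access to that vector */

int main(void) {
    for (char **e = environ; *e != NULL; e++)
        printf("%s\n", *e);      /* e.g. HOME=/home/dmr, PATH=/bin:/usr/bin */
    return 0;
}
```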

This discussion on Usenet in 1989 has a lesson. Dennis Ritchie wrote:

The question arose: why does C use a terminating character for
strings instead of a count?

Discussion of the representation of strings in C is not fruitful
unless it is realized that there are no strings in C. There
are character arrays, which serve a similar purpose, but no
strings.

Things very deep in the design of the language, and in the customs
of its use, make strings a mess to add. The intent was that
the behavior of character arrays should be exactly like that
of other arrays, and the hope was that stringish operations
on these character arrays should be convenient enough. ...

Given the explicit use of character arrays, and explicit pointers to
sequences of characters, the conventional use of a terminating
marker is hard to avoid. The history of this convention and
of the general array scheme had little to do with the PDP-11; it
was inherited from BCPL and B.

Robert Firth replied:

A correction here: the C scheme was NOT inherited from BCPL.
BCPL strings are not confused with character arrays; their
implementation is not normally visible to the programmer, and
their semantics are respectably robust.

Eric S Raymond replied to Firth:

I've seen bonehead idiocy on the net before, but this tops it all -- this takes
the cut-glass flyswatter. Mr. Firth, do you read what you're replying to
before you pontificate? Didn't the name 'Dennis Ritchie' register in whatever
soggy lump of excrement you're using as a central nervous system? Do you
realize that the person you just incorrectly 'corrected' on a point of C's
intellectual antecedents is the inventor of C himself!?!

Sheesh. No wonder Dennis doesn't post more often.

Next time dmr posts something, I suggest you shut up and listen. Respectfully.

Dennis Ritchie himself, meanwhile, replied to Firth:

Robert Firth justifiably corrects my misstatement about
BCPL strings; they were indeed counted. I evidently edited
my memory.


And, reading that thread, ESR apologized to Firth in typical ESR fashion ...

(I'm being sarcastic, though accurate: he didn't post again in the thread).
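
To make the two conventions in that exchange concrete for non-C readers, here is a minimal sketch (illustrative names only): a C character array with a terminating NUL next to a BCPL-style counted representation.

```c
#include <stdio.h>

/* C convention: a character array whose end is marked by a '\0' byte.
   The array carries no length; you find the end by scanning for the
   terminator (which is all strlen() does). */
static size_t c_length(const char *s) {
    const char *p = s;
    while (*p != '\0')
        p++;
    return (size_t)(p - s);
}

/* BCPL-style convention (a sketch of the idea): the length is stored
   alongside the characters, so no terminator is needed and the length
   is known in constant time. */
struct counted_string {
    size_t len;
    const char *chars;           /* need not be NUL-terminated */
};

int main(void) {
    char greeting[] = "hello";                 /* 6 bytes: 'h','e','l','l','o','\0' */
    struct counted_string cs = { 5, "hello" };

    printf("terminated: %zu\n", c_length(greeting));
    printf("counted:    %zu\n", cs.len);
    return 0;
}
```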

Offhand, I'm not sure that anyone (with the possible exception of Bill Gates) had a greater impact on modern computing. Ritchie really was a giant. [CNN's article](http://www.cnn.com/2011/10/14/tech/innovation/dennis-ritchie-obit-bell-…) referred to Ritchie as "The shoulders Steve Jobs stood on," but in a way that's misleading: the metaphor puts Jobs above Ritchie, and in truth Ritchie cast the far longer shadow.

Meanwhile, after C, there was C++. And if you've ever wondered why C++ is the way it is, well... [here's the story, which I'm certain is all 100% true](http://harmful.cat-v.org/software/c++/I_did_it_for_you_all)...

I've been wrong so often and have apologized and owned up so often that I've joked that my distinction is that I'm the only man on the internet to have ever been wrong. It's great to discover that I was (again) wrong about that and that I've had far more distinguished company than I imagined.

Well, don't forget Ken Thompson...
Also, and not to take anything away from Dennis, that lab at Murray Hill was simply awesome, one of the two densest concentrations of brilliant and productive computer scientists ever, and much longer-lived. (The other being Xerox PARC, with a nod to IBM TJ Watson Research Ctr, but IBM's research was spread around a bit more.)

But Dennis' footprint is bigger than people might realize, because C had an incredibly strong influence on microprocessor architecture. Most RISCs were designed to run C well, with some influence from FORTRAN, COBOL, etc. x86s have long been performance-tuned for C, and of course the strong presence of C codes in SPEC has been an influence.

But most of the Internet plumbing is RISC-based (MIPS and others) and of course smartphones are ARM-based... So it isn't just that software is permeated by C and its descendants, it's that most of the computing infrastructure we use was either designed for C or was influenced by it (x86, even current IBM mainframes ...).

By John Mashey (not verified) on 14 Oct 2011 #permalink

"Things very deep in the design of the language, and in the customs of its use, make strings a mess to add."

I think the D language indicates that they can be integrated cleanly ... given at least a rudimentary object system, which of course C did/does not have. (C++, OTOH, shows how not to do it, which can be said of a lot of its features.)

By https://me.yah… (not verified) on 14 Oct 2011 #permalink

And for further evidence of his prankster-ism, take a look in the index of your handy 2nd Edition K&R and look up the term "recursive", a vital programming concept. Notice anything interesting about that last page reference?

Thank you, and rest in peace, Dennis, for teaching us how to think succinctly.

Offhand, I'm not sure that anyone

This sort of nonsense is no better than the idolization of Jobs. Try Tim Berners-Lee, Vint Cerf and Bob Kahn, J.C.R. Licklider, Alan Turing, John von Neumann, Christopher Strachey, John Backus and Peter Naur and the rest of the Algol 60 team ... just because you aren't familiar with people and their contributions doesn't mean they didn't make them. And you can't use the "modern computing" dodge to eliminate these folks from consideration.

Ritchie really was a giant

His influence was great largely because of placement. Bjarne Stroustrup has made the same point about himself and C++, and it also applies to Linus Torvalds. There are many people as skilled and creative or more so whose work doesn't have the same impact for various reasons. Dennis deserves his due, which is considerable, but kindly take the very lesson that he taught in that exchange above.

Languages, Levels, Libraries, and Longevity: ... Software designers must continue to build better languages that harness increasing performance to unchanging human characteristics. They must continually raise the level of abstraction, so that humans can ignore more details. Continual improvement is needed in tools for organizing software, so that people can more easily discover existing code, reuse it, and adapt it.

Precisely these considerations are behind the design of Scala.

Most RISCs were designed to run C well

Since C gives close access to the hardware, this really isn't saying much, unless it refers to zero-terminated character arrays or possibly integer promotion rules. Certainly RISC designs don't reflect C's curly braces and free format, its lowercase reserved keywords, or much else in its design. Actually, C's sparseness, its lack of a garbage collector or object system, may have had a negative impact on hardware design (not that Dennis Ritchie or Ken Thompson -- who did much of the relevant design work in producing B -- can be faulted for that since they had to squeeze the compiler into a handful of kilobytes).
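
To put the zero-terminated-array point in concrete terms: the canonical K&R-style copy loop below (a sketch, not any libc's actual strcpy) is what a byte-addressed machine runs as roughly a load, a store and a branch per character, and what a word-addressed machine has to synthesize with shifts and masks.

```c
#include <stdio.h>

/* K&R-style string copy (a sketch, not a libc implementation): copy
   bytes until the terminating '\0' itself has been copied.  The whole
   convention fits in one expression precisely because the end marker
   travels with the data. */
static char *copy_string(char *dst, const char *src) {
    char *d = dst;
    while ((*d++ = *src++) != '\0')
        ;                        /* empty body: all the work is in the condition */
    return dst;
}

int main(void) {
    char buf[32];
    copy_string(buf, "hello, world");
    printf("%s\n", buf);
    return 0;
}
```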

@7 John Mashey

I remember chuckling to myself reading that section about 01/01/1970 in "A Deepness In The Sky".

And in a nice coincidence (harking back to the Wegman thread) Vernor Vinge's "A Fire Upon The Deep" is another literary work where one can find the term "flensing" in common usage.

Ianam, I didn't say Ritchie was the greatest computer scientist ever; I offered the opinion that his impact was greatest. You're right that this is partly a matter of placement, but the same could be said of the Wright Brothers or Henry Ford or any other innovator you care to name.

As to the "'modern computing' dodge," it's precisely because there are many, many people who've made enormous contributions that I chose to make the more limited statement. If you prefer sweeping generalizations and blanket statements, feel free to make them.

Bruce, you said two different things, one about dmr's impact and another about him being a "giant" and I answered them both. And I would add another real giant with greater impact: Claude Shannon. There are also giants with less impact, such as Donald Knuth, Edsger Dijkstra, and Tony Hoare (actually, considering Hoare's work on type systems, and the regretted null reference in particular, his total impact may have been greater). As for "modern", you missed the point ... a number of non-"modern" people had tremendous impact on "modern" computing ... the restriction is pointless unless we were talking about people who only had an impact on past computing but not on present computing. George Boole is a fellow from long ago who had a huge impact ... and Frege too. If you narrow your vision just right, you can always zero in on one person, like dmr, while ignoring every aspect of computing that might falsify the claim. dmr was more humble and more rational ... let's honor that.

One of my favorite anecdotes about the working relationship of Ken Thompson and Dennis Ritchie, from Thompson's Turing lecture, "Reflections on Trusting Trust":

That brings me to Dennis Ritchie. Our collaboration has been a thing of beauty. In the ten years that we have worked together, I can recall only one case of miscoordination of work. On that occasion, I discovered that we both had written the same 20-line assembly language program. I compared the sources and was astounded to find that they matched character-for-character. The result of our work together has been far greater than the work that we each contributed.

Twenty lines, identical to the character. That's a shared style!

By Ted Kirkpatrick (not verified) on 14 Oct 2011 #permalink

I'm a Trustee of the Computer History Museum in Mountain View CA.
Come visit. Modern computing technology is the product of a vast number of people, some of whom contributed more tech genes to the gene pool, but public visibility and contributions often don't match that well. Dennis was certainly one of the quieter folks with huge impact, but nobody at the Museum tries to create Top10 lists and argue the order. We do pick ~3 per year to honor, and that Museum Fellows page shows them.

By John Mashey (not verified) on 14 Oct 2011 #permalink

Re 10:
I hesitate to turn this into comp.arch, but:

1) C could be mapped onto many ISAs and was.

2) Some ISAs forced truly awkward implementations, like the hundreds of lines of code in Univac 11xx strcpy().

3) Some ISAs had numerous features that were really hard to get at from C, but which of course still cost.

In the last 20-25 years, the ISAs in 2) have died off, in part because C-conducive ISAs were just easier to deal with.

Most of the ISAs in 3) have died off too, for the opposite reason: they could run C just fine, but they had a lot of baggage that made it difficult to increase performance cost-competitively.
VAX was elegant, but impossible to make go fast, in many cases to support features C compilers didn't use. MC68010 was not bad, but Motorola added features in MC68020 that proved troublesome later. Going further back, if Algol had taken over the world, the (interesting) Burroughs B5000 hardware implementation of call-by-name would have been a great winner.

Finally, from personal experience, Stanford MIPS was a word-addressed machine with byte manipulation and no halfwords, as the statistics driving it were from Pascal, C and FORTRAN.
Fortunately I talked Hennessy out of that by observation of real-world usage of C, and MIPS R2000s were byte-addressed. I.e., the word-addressed hardware worked, and you could compile C for it, but it would not have been successful.

So it isn't a question of C being close to hardware, it is that it wants certain features and doesn't care about others, and ISAs that were missing the former and had too much of the latter were selected against by C's prevalence.
Of course, the 2 major pre-RISC general-purpose ISAs remaining in wide use have subsets that fit C fairly well, and at least in the IBM mainframe case, a vast installed base of software, including plenty of COBOL, etc., to drive some of the usage. The x86, messy as it is, avoided some of the problematic features of the otherwise more elegant VAX or 68020.

By John Mashey (not verified) on 14 Oct 2011 #permalink

In the case of the Intel ISA, starting around the introduction of the 486, Intel took much of the instruction set which mapped onto C and moved it from microcode to hardwired instructions that decoded and executed much more quickly. With the 386 generation, they got rid of the need for the messy segmented addressing scheme (although of course this was retained for backwards compatibility), which made C programs much easier to support on the Intel boxes.

But I do have to say that the nicest ISA I've written for was the MIPS. It was just sweet, and being able to do something useful in the delay slot made writing ASM for it fun.

By Rattus Norvegicus (not verified) on 14 Oct 2011 #permalink

re: 5 (this for computer and statistical folks)
I have old BTL phone books. The March 1974 one, from 6 months after I arrived, reads as follows (using hierarchical numbering, with more digits added lower in the organization). See if you recognize any names. This is just a sample. NOTE: I was not in this organization; I just talked with some of these people a lot and knew others by reputation.

12 Division - Research, Communication Principles Division
Prim, Robert, Executive Director
See his algorithm, very important to Bell System network design.

Tukey, John, Associate Director [i.e., does what he wants, minimal administrative hassle.]

121 Mathematics and Statistics Research Center
Henry Pollak, Director,
whom I was lucky to hear talking about Minimal Spanning Tree algorithms when I was a high school junior.

He had 9 Departments, 1211-1219, usually 6-8 people each including

1212 ...
Garey, Michael
Kruskal, Joseph

1214 ...
Chambers, John

1215 ...
Cleveland, William S. (get his books)

1216 Graham, Ron
Johnson, David S.

122 Acoustical and Behavioral Research Center
Matthews, Max
He had ~8 departments, for example:

1223 Julesz, Bela.

Located physically one floor up from 121 was:
127 Computing Science Research Center, run by Sam Morgan

1271 Doug McIlroy
(who sent me the email on DMR) See also diff.
Aho, Al, the A in "awk" ... well he did other stuff, too.
Knowlton, Ken.
Morris, Robert (Sr.). As a teenager, his son used to come by the Labs.
Ossanna, Joe
Thompson, Ken

1273 Elliott Pinson
Johnson, Steve
Kernighan, Brian
Ritchie, Dennis

1274 Brown, Stanley
Feldman, Stu
Lesk, Mike

1275 Hamming, Richard

That's just a sample. Some people may have heard of some of these folks. Steve Bourne, Rob Pike, Pete Weinberger, etc. came later.

To be hired then as a Member of Technical Staff at Bell Labs, you pretty much needed a relevant MS or PhD from a good school. Bell Labs didn't use recruiters, but rather teams of managers, usually run by a Director (typically managed 80-150 people), dedicated to each school. They'd come by a few times a year, knew the faculty, and would be tracking students well in advance. If you were working on a PhD, a year before you were done, some Bell Labs Director or Department head would wander into your office, introduce themselves and ask if you'd considered industrial R&D...

The "typical" MTS would have been high school valedictorian. (The one random sample I had was 11 of 11.)

Still, it was almost guaranteed that no matter how good you were in college, and how expert you were, BTL would have many people who were either smarter or knew more, or both. New hires needed to learn to ask questions, especially across disciplines.

This was not a place for Dunning-Kruger afflictees.

By John Mashey (not verified) on 14 Oct 2011 #permalink

I hesitate to turn this into comp.arch, but:

You don't appear hesitant to me. And you quite miss the point. Any language and any compiler implementation of a scope and nature similar to C would have had similar impacts on hardware designs driven by benchmarks. So yes, C had a strong influence on microprocessor design, but not a very special one, not a particularly positive one, and not a very "impactful" one, since these specifics of microprocessor design, matched to the language implementation to satisfy benchmarks, aren't visible at higher levels. The RISC machine designers care, but people building portable systems do not.

So it isn't a question of C being close to hardware, it is that it wants certain features and doesn't care about others

This too misses the point. Languages at the level of C will generally want and be indifferent to a similar set of features, except for specific representational issues like NUL-terminated strings (and thus those awkward strcpy implementations). OTOH, higher level languages can differ far more in the generated code and thus put less consistent pressures on hardware design. They also have a different set of demands, such as the need in functional languages such as Scala for rapid memory allocation and deallocation (something not provided well by the JVM), the need for type tagging, hardware support of vtbls, etc.

Intel took much of the instruction set which mapped onto C

Which was not specific to C. Some people still think that C's autoincrement and autodecrement syntax was influenced by the PDP-11 instruction set, but it wasn't (the operators came from B, which predates the PDP-11), and the reverse is also true.

With the 386 generation, they got rid of the need for the messy segmented addressing scheme (although of course this was retained for backwards compatibility) which made C programs much easier to support in the Intel boxes.

This too, of course, was not specific to C. Any other systems programming language would also have needed those near and far keywords and it too would have been easier to program without the segmentation needs ... this was certainly true of ASM programs. My point is that nothing about the C language or its design had anything to do with providing a flat address space on the 386 ... it was provided because segmented addressing is a pain, period.

This was not a place for Dunning-Kruger afflictees.

I wonder how Lycklama made it.

> This was not a place for Dunning-Kruger afflictees.

I think that observation touches on what makes folk susceptible to D-K and dismissivism: being in a small pond where the "big fish" aren't all that big yet don't realize it, since they aren't themselves around bigger fish (or when they are, it's in a contrived context where the bigger ones are circumspect about pointing out errors).

By Anna Haynes (not verified) on 15 Oct 2011 #permalink

"creator of C and co-creator of Unix" ... that's a wealth of beauty contributed over a lifetime, thank you very much dmr!

With the 386 generation, they got rid of the need for the messy segmented addressing scheme (although of course this was retained for backwards compatibility) which made C programs much easier to support in the Intel boxes.

Yet the PDP-11/45 on which C/Unix was implemented had... segmented memory addressing!

It was simply transparent to the user because you were limited to one 64 KB text segment and one 64 KB data/stack segment.

If you were happy with these memory limitations, you could program exactly the same way on real-mode Intel x86! (Some messiness came into play for I/O, admittedly.) Compilers for real mode called this the "small model." The complexity only came in if you wanted to use more memory.

In fact the x86 was a 16-bit word / 20-bit address line architecture, similar in design to the PDP-11/45 (16-bit word / 22-bit address lines), but stripped of the complexity of its Memory Management Unit.

The 386 did not "get rid of" segmented memory, but essentially moved segment management to the operating system, similar to the PDP-11, making segmentation transparent to the programmer. The instruction set remained microcoded. There were attempts to optimize the microcode to make the most common instructions the fastest to execute, but this is just common sense. Although the instruction set "mapped onto C", C basically assumes the minimal subset of the typical computer architecture anyway.
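
For anyone who never met real mode: the 8086 formed a 20-bit physical address from a 16-bit segment and a 16-bit offset, which is where both the extra reach and the mess came from. A minimal sketch of just the arithmetic (not tied to any particular compiler or memory model):

```c
#include <stdio.h>
#include <stdint.h>

/* Real-mode 8086 addressing: physical = (segment << 4) + offset.
   Two consequences follow directly: a single 16-bit offset spans at
   most 64 KB, and the same physical byte has many segment:offset
   names.  That is the messiness that flat 32-bit addressing on the
   386 and later removed for protected-mode code. */
static uint32_t real_mode_phys(uint16_t seg, uint16_t off) {
    return ((uint32_t)seg << 4) + off;
}

int main(void) {
    /* Classic example: the text-mode video buffer at 0xB8000. */
    printf("B800:0000 -> %05lX\n", (unsigned long)real_mode_phys(0xB800, 0x0000));
    printf("B000:8000 -> %05lX\n", (unsigned long)real_mode_phys(0xB000, 0x8000)); /* same byte */
    return 0;
}
```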

Anna, you're offering a rather different explanation (one that may be accurate for some people) for erroneous self-assessments than that of Dunning and Kruger. One should be careful about applying the term "Dunning-Kruger" to the condition they sought to explain, rather than to the explanation they offered for it, when another explanation is in play.

As for weathercasters, I doubt that, as a group, their degree of self-assessment error is atypical. The problem is that the failure to know/understand the difference between climate and weather is a double whammy: many weathercasters don't understand that their knowledge about weather and the ability to predict it does not apply to climate change, and the public doesn't understand that experts on weather forecasting aren't experts on climate change.

Yet the PDP-11/45 on which C/Unix was implemented

It also ran on the PDP-11/40 (among others), which did not have separate I/D address spaces.

had... segmented memory addressing!

Technically, but not in a sense relevant here, where the discussion was about "the messy segmented addressing scheme", not segmented memory per se.

If you were happy with these memory limitations

It's not about "happiness", it's about requirements.

The 386 did not "get rid of" segmented memory, but essentially moved segment management to the operating system, similar to the PDP-11, making segmentation transparent to the programmer.

This is very wrong. The PDP-11/45 fetched instructions from I space and data from D space ... that has nothing to do with the OS. And the 386 was a 32-bit machine with 32-bit address registers which could reference the entire physical address space without use of segment registers ... that has nothing to do with the OS. The role of the OS was in memory protection, which is a different issue than segmentation.

Bruce Sharp --- Bjarne didn't talk like that so I don't believe a word of it.

By David B. Benson (not verified) on 15 Oct 2011 #permalink

(ianam, you're right, it's Dunning-Kruger once removed. The locally-biggest fish are prone to DK, & then are influencers for those around them. also: sorry to have gone off topic.)

David, it's a well-known parody (some have speculated that dmr wrote it), and Bruce was obviously joking when he said he's certain that it's 100% true. Of course, you may be joking too, but you didn't present the sort of humor cues that Bruce did (such cues are also present in his latest comment).

By https://me.yah… (not verified) on 16 Oct 2011 #permalink