Bash 4.0 = **, &>>, and associative arrays

Bash 4.0 includes associative arrays, a superwildcard called "**", and a new redirection operator, "&>>". Details are here.
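Roughly, the three new pieces look like this (a quick sketch against bash 4.0; note that ** only recurses once the globstar option is switched on, and associative arrays have to be declared with declare -A):

```bash
# ** recurses into subdirectories once globstar is enabled
shopt -s globstar
ls **/*.sh               # every .sh file anywhere below the current directory

# associative arrays: string keys instead of integer indexes
declare -A port
port[http]=80
port[ssh]=22
echo "${port[ssh]}"      # prints 22

# &>> appends both stdout and stderr to a file
some_command &>> build.log
```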

Some of my favorite new features:

There is a new shell option: `dirspell'. When enabled, the filename completion code performs spelling correction on directory names during completion.

My sepllnig skucs
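Trying it out is just a shopt toggle (this assumes a bash 4.0 build with completion enabled):

```bash
# turn on spelling correction for directory names during completion
shopt -s dirspell

# a transposed directory name now gets fixed when you hit TAB, e.g.
#   cd /usr/lcoal/sh<TAB>   becomes   cd /usr/local/share/
```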

The parser now understands `|&' as a synonym for `2>&1 |', which redirects the standard error for a command through a pipe.

For some reason I always forget the correct order of that string of characters.
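For the record, these two pipelines should behave the same; the second is just the bash 4.0 shorthand:

```bash
# the old incantation: duplicate stderr onto stdout, then pipe
make 2>&1 | tee build.log

# the bash 4.0 shorthand: |& pipes both stdout and stderr
make |& tee build.log
```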

If a command is not found, the shell attempts to execute a shell function named 'command_not_found_handle', supplying the command words as the function arguments.

Very cool from a user interface perspective. Now I can write my own snarky error messages if the user doesn't have the prereqs installed.
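A minimal (and only mildly snarky) version would go in ~/.bashrc, something like the sketch below; the function name is fixed by bash, but the message and exit status are up to you (127 is the conventional "command not found" status):

```bash
# bash invokes this with the missing command name and its arguments
command_not_found_handle() {
    echo "bash: $1: command not found. Install the prereqs and try again." >&2
    return 127
}
```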

Most of it looks very good, but I have to admit that when I see 'associative arrays' mentioned, I start to worry that they are getting a bit close to the line of what is better left to the likes of Perl or Python. I suppose it can't do much harm (you can always just not use it...). I prefer to keep shell scripts simple and hand over to Perl, etc., if I want something with more logic.

By Heraclides (not verified) on 26 Feb 2009 #permalink

Automatic spelling correction trains your fingers to spell badly. It's much harder to untrain fingers that have been mis-trained than to train right in the first place.

By Nathan Myers (not verified) on 26 Feb 2009 #permalink

Interesting philosophical divergences here. Heraclides: I find it interesting that the array is possibly on the other side of the Script River Styx. I'm not disagreeing (or agreeing) with you, just bringing it up. It would be like this: a shell scripting language would be going too far if ... then fill in the blank (a priori) with certain features being implemented. Structurally, an SSL should not be object oriented. With respect to text manipulation, an SSL should not have regular expressions fully implemented. With respect to variables, an SSL should not have arrays. With respect to file manipulation, an SSL should not have complex built-in search functions, and so on.

Were I to agree with you, I'd say that the way to implement arrays would be the same way one implements searching or other text manipulation: with a set of stand-alone programs such as tr, grep, awk, and so on. We would want an "array" program.

Independent programs have all sorts of upsides, including that they follow the basic Unix philosophy. The downside is that they run as separate processes. We have somehow come to fetishize process count as a number to minimize (rather than redesigning our systems so that an increase in process count is efficient and not undesirable).

One thing I would say is that arrays may not be making bash more complicated. They may be making bash less complicated.

Nathan: I understand why you say what you say, but it is a little like saying "Wheelchairs are not helping the paraplegics. If they didn't use wheelchairs, they'd be better off." Aspellia is a perfectly valid disability.

Too short on time to think properly, but if you go way back, scripting was literally just commands in a file. Add to this a means to pass in external variables and substitute them, and you have a reusable command stream. In many ways this should still be the essence of scripts.

As a rule of thumb, I find it wise to move "up" to the next level of programming if there is any risk of the current level getting more complex than it is well suited to. This requires forethought, which isn't a perfect art... The general cut-offs are vague, of course, but the point is that some things that are quite clumsy in, say, bash are trivial in a language suited to them, and you're better off moving "up" earlier rather than later, so that the code is smaller (less migration work).

It's not about "complexity"; it's about "appropriate tools".

My usual rough rule is that scripting should be limited to commands, variables, and dead-simple logic. If you start looking at working around a lack of data structures, more complex logic, numerical work, etc., you are (well and truly) ready to move up to Perl or whatever.

A lot can be done with grep, cut, paste, etc., if you use a file as your "data structure", and for some tasks that's appropriate, but a characteristic of these tools is that their logic is largely linear.
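Something along these lines, say, where a colon-delimited file stands in for the data structure (a made-up illustration, not from the post):

```bash
# list each account and its shell, skipping the nologin ones
cut -d: -f1,7 /etc/passwd | grep -v nologin | sort
```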

This "moving up" theme also applies to Perl: Perl (5) really isn't that good for writing anything too large in my experience! (Been there a few too many times...)

Reminds me that one of those things I'm "supposed" to do before I keel over is write my own programming language...

By Heraclides (not verified) on 02 Mar 2009 #permalink