Bash 4.0 includes associative arrays, a super-wildcard called “**”, and a new redirection operator. Details are here.
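For the curious, the “**” super-wildcard is the new `globstar` shell option. A quick sketch (the `demo` directory is made up for illustration):

```shell
# With 'globstar' set, '**' matches across directory boundaries,
# where a plain '*' stops at a single level.
shopt -s globstar
mkdir -p demo/a/b && touch demo/a/b/x.txt
printf '%s\n' demo/**/*.txt   # finds demo/a/b/x.txt
rm -r demo
```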
Some of my favorite new features:
There is a new shell option: ‘dirspell’. When enabled, the filename completion code performs spelling correction on directory names during completion.
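Turning it on is a one-liner; the correction itself only shows up interactively, at the completion prompt:

```shell
# 'dirspell' corrects minor directory-name typos during tab
# completion (interactive feature, bash 4.0+).
shopt -s dirspell
# With it enabled, typing:       cd /usr/lcoal/<TAB>
# completes as if you had typed: cd /usr/local/
```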
My sepllnig skucs
The parser now understands ‘|&’ as a synonym for ‘2>&1 |’, which redirects the standard error for a command through a pipe.
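A minimal sketch of the two equivalent spellings (the `err` function is made up for the demo):

```shell
# A command that writes only to stderr:
err() { echo 'something failed' >&2; }

# Classic form: redirect stderr into stdout, then pipe.
err 2>&1 | grep -q failed && echo 'classic: matched'

# Bash 4.0 shorthand: '|&' means exactly '2>&1 |'.
err |& grep -q failed && echo 'shorthand: matched'
```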
For some reason I always forget the correct order of that string of characters.
If a command is not found, the shell attempts to execute a shell function named ‘command_not_found_handle’, supplying the command words as the function arguments.
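A sketch of such a snarky handler (`frobnicate` is deliberately a command that does not exist; the message text is made up):

```shell
# Bash invokes this function, instead of printing its stock error,
# whenever a command isn't found; $1 is the attempted command name.
command_not_found_handle() {
    echo "bash: $1: not found -- did you forget to install it?" >&2
    return 127   # conventional 'command not found' status
}

frobnicate --now || true   # triggers the handler; 127 swallowed here
```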
Very cool from a user interface perspective. Now I can write my own snarky error messages if the user doesn’t have the prereqs installed.
Bash 4.0. It’s like a whole new world.
Most of it looks very good, but I have to admit that when I see ‘associative arrays’ mentioned, I start to worry that they are getting a bit close to the line of what is better left to the likes of Perl or Python. I suppose it can’t do much harm (you can always just not use it…). I prefer to keep shell scripts simple and hand over to Perl, etc., if I want something with more logic.
Automatic spelling correction trains your fingers to spell badly. It’s much harder to untrain fingers that have been mis-trained than to train right in the first place.
Interesting philosophical divergences here. Heraclides: I find it interesting that the array is possibly on the other side of the Script River Styx. I’m not disagreeing (or agreeing) with you, just bringing it up. It would be like this: a shell scripting language (SSL) would be going too far if … then fill in the blank (a priori) if certain things are implemented. Structurally, an SSL should not be object oriented. With respect to text manipulation, an SSL should not have regular expressions fully implemented. With respect to variables, an SSL should not have arrays. With respect to file manipulation, an SSL should not have complex built-in search functions, and so on.
Were I to agree with you, I’d say that the way to implement arrays would be the same way one implements searching or other text manipulation: with a set of stand-alone programs such as tr, grep, awk, and so on. We would want an “array” program.
Independent programs have all sorts of upsides, including that they follow the basic Unix Philosophy. The downside is that they would be independent processes. We have somehow come to fetishize process count as a number to minimize (rather than redesign our systems so that an increase in process count is efficient and not undesirable).
One thing I would say is that arrays may not be making bash more complicated. They may be making bash less complicated.
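For what it’s worth, the new syntax is small; a sketch (bash 4.0+, the names are made up):

```shell
# Associative arrays must be declared with 'declare -A'.
declare -A port
port[http]=80
port[ssh]=22

echo "${port[ssh]}"    # prints: 22
echo "${#port[@]}"     # number of keys: 2

# "${!port[@]}" expands to the keys, in unspecified order.
for svc in "${!port[@]}"; do
    echo "$svc -> ${port[$svc]}"
done
```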
Nathan: I understand why you say what you say, but it is a little like saying “Wheelchairs are not helping the paraplegics. If they didn’t use wheelchairs they’d be better off.” Aspellia is a perfectly valid disability.
Too short on time to think properly, but if you go way back, scripting was literally just commands in a file. Add to this a means to pass in external variables and substitute them, and you have a reusable command stream. In many ways this should still be the essence of scripts.
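That essence still works verbatim today; a sketch (`greet.sh` is a made-up name):

```shell
#!/bin/bash
# greet.sh -- stored commands plus substituted external variables:
# "$1" and "$2" are whatever arguments were passed on the command line.
echo "Hello, $1, from $2"
```

Running `bash greet.sh world 1979` substitutes the two arguments and prints `Hello, world, from 1979`.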
As a rule of thumb, I find it wise to move “up” to the next level of programming if there is any risk of the current level getting more complex than it is well suited to. This requires forethought, which isn’t a perfect art… The general cut-offs are vague, of course, but the point is that some things that are quite clumsy in, say, bash, are trivial in a language suited to them, and you’re better off moving “up” earlier rather than later, so that the code is smaller (less migration work).
It’s not about “complexity”, it’s about “appropriate tools”.
My usual rough rule is that scripting should be limited to commands, variables, and dead-simple logic. If you start looking at working around a lack of data structures, more complex logic, numerical work, etc., you are (well and truly) ready to move up to Perl or whatever.
A lot can be done with grep, cut, paste, etc., if you use a file as your “data structure” and for some tasks that’s appropriate, but a character of these is that their logic is largely linear.
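A sketch of that linear, file-as-data-structure style (`users.txt` and its contents are illustrative):

```shell
# A colon-delimited flat file standing in for a table; each tool
# makes one linear pass over it.
printf '%s\n' 'alice:admin' 'bob:user' 'carol:admin' > users.txt

cut -d: -f1 users.txt                     # every name
grep ':admin$' users.txt | cut -d: -f1    # names of admins only

rm users.txt
```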
This “moving up” theme also applies to Perl: Perl (5) really isn’t that good for writing anything too large in my experience! (Been there a few too many times…)
Reminds me that one of those things I’m “supposed” to do before I keel over is write my own programming language…
Young Heraclides: Scripting was originally commands on punch cards!!!!
But yes, I see your points.