This is a continuation of a discussion of the role of the command line in both the functionality and the culture of the three main operating systems used today on general purpose desktop computers: Linux, and the other two.

Today’s topic: Underlying power.

Underlying all of this are two fundamental philosophical features of Linux vs. Windows, and for the most part Mac OS X falls in with Linux in this regard. One is the link between “commands” and the GUI, and the other is the link between commands and configuration files, which in turn is part of the way in which applications are set up or “registered” on a system.

It is more or less true (and I oversimplify) that most Linux GUI software can be represented as a set of commands that could be entered on the command line. For some software there may be a totally different command for each function; for other software, there may be a smaller number of commands with switches or options. There are some major exceptions to this, but it is very, very common, and if you exclude all software that does not follow The Unix Philosophy, then it will be generally true. This is not true in Windows. Not even close.
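
To make that concrete, here is a rough sketch of the idea (the file names and the mount point below are made up for illustration, and exact paths vary from system to system). Each of these commands does a job you would otherwise do by pointing and clicking in a GUI:

    cp ~/Documents/report.odt ~/backup/     # copy a file, same as dragging it to a folder
    mount /dev/cdrom /mnt/cdrom             # mount a disc (as root), same as clicking its icon
    lpr report.odt                          # print it, same as File > Print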

With respect to configurations and installation or registration of software, there is a more subtle but I think very real and important link between function and the command line approach. Commands can be placed into files. A command might do nothing more than open a certain text file (or create it if it does not exist) and change it or set the text to say a certain thing. This text is then the configuration data for some aspect of your system or your software. This whole approach of commands, text files, and so on is played out on a file system that is itself both simple and significant. There are directories in the standard Unix/Linux installation that are simply known, by the system and the software that runs on it, to contain information regarding the system. In Linux, go to your /proc directory and get a listing. You will see that a lot of the ‘files’ in that directory are numbers. These numbers are process numbers. When you look at this listing, you are looking at the processes that are running on your system right now!!! It’s like looking in the mirror at a certain angle and seeing the thoughts that are in your brain represented as little blue dots or something!!! How cool is that?!?!
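
If you want to poke at this yourself, here is a minimal sketch (the process number 1234 below is made up; substitute one of the numbers you actually see in your own listing):

    ls /proc                    # the entries with numeric names are running processes
    cat /proc/1234/cmdline      # the command that started process 1234
    cat /proc/cpuinfo           # your processor, described as a plain text file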

This exposes one of the great features of Linux: Everything is a file. You put a CD or DVD in the player, and it becomes a file. An “iso” image on your hard drive and a CD with that image on it in your drive are, for all practical purposes, the same thing. What are they? Files! Your keyboard is a file. The list of software installed on your system is a file. A running piece of software is a file (in the /proc directory). Everything is a file, all files are text files (philosophically if not in reality), and text files can contain lists of commands. How amazingly simple. How amazingly powerful. How amazingly unlike the Windows Registry. How amazingly accessible and understandable, both to the tech expert you need to rely on and to you, if you decide to get into it.
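
A few hedged examples of what “everything is a file” looks like in practice (the iso file name here is a placeholder, and some of these need root):

    cat /proc/cpuinfo                     # your hardware, readable as text
    mount -o loop image.iso /mnt/iso      # treat an iso file on your disk exactly like a disc in the drive
    cat /dev/input/mice                   # your mouse really is a file (press Ctrl-C to stop the gibberish)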

Earlier on in this discussion I gave the example of how to make your computer squawk like a cuckoo bird. Go look back at that example. The cat command simply streams the text out of a file. The file being streamed here is an open source format audio file, so it is just a stream of text characters that are the data for sound. The data are being streamed out of the file and into whatever is next in the sequence (the “pipe”) that the command represents. In this case, the output of the file is being streamed into a “file” which is actually the sound system on that computer. Since the sound system is actually a driver linked to some hardware, the text output of the audio file is played. This is so incredibly straightforward that it is easy to understand, difficult to screw up, and easy to fix. It is the way computers should work.
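
A sketch of that kind of command (the sound file name here is a placeholder, and the device details vary by system: older OSS-based systems expose the sound card as /dev/dsp, while ALSA systems would pipe into a player instead):

    cat cuckoo.wav > /dev/dsp       # the sound card is literally a file you can write to
    cat cuckoo.wav | aplay          # the same idea with a pipe, on ALSA systems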

That simple yet powerful engineering, and the transparent, single-layered connection between the files, the file system, and operations, are what make Linux so powerful and secure, but they are also what make the command line not only necessary but a very powerful tool for those who choose to see it for what it is rather than what they are told it is.

Comments

  1. #1 D. C. Sessions
    August 12, 2009

    Greg, have I mentioned that it’s gratifying to see these tutorials coming from an anthropologist? You’re doing a lot just by who you aren’t.

    Steady on!


    dcs (yeah, I have a degree in CS. Totally irrelevant.)

  2. #2 MadScientist
    August 12, 2009

    [OT] I couldn’t resist the Microsoft news: A Texas court has issued an injunction against sales of Microsoft’s XML-based MSOffice.

    [more info: http://www.i4i.com/collateral/05-26-09-Release-Jury_orders_Microsoft_to_pay_i4i.pdf]

    That should be fun to watch. In a disturbingly increasing number of rare events, I’m hoping MS fight and win.

  3. #3 Gray Gaffer
    August 12, 2009

    Not to forget the companion philosophy to this: write programs that do one thing very well and obey the command line and piping rules. Complex tasks can then be built ad-hoc out of simpler and easier to understand sub-tasks. Unlike traditional coding, think data flow not control flow for these.

    I have a GUI language that can wrap this nicely.

  4. #4 Jeff Knapp
    August 13, 2009

    Here are two examples of high-end creative software packages used in CGI and VFX that I believe follow this philosophy: Modo and Nuke. Especially in the case of Modo, a beautifully designed GUI has made the program a joy to use for us right-brained types while also giving you full access to the tool pipe and underlying command structure if you want it.

  5. #5 Jeff Knapp
    August 13, 2009

    And here is an example of such a package with that philosophy but with an absolute nightmare of a GUI (an industry standard, btw): Maya.

  6. #6 Jeff Knapp
    August 13, 2009

    Here is an example of such a package with that philosophy and with a beautiful and functional GUI that makes the program an absolute joy to use for us right-brainers, yet gives you full access to the underlying command structure if you want it: Modo.

  7. #7 travc
    August 13, 2009

    Jeff, I think I should just make sure… You realize you aren’t even close to a “normal user”, right? Right brain stuff aside, you seem to be doing some rather specialist/professional image work on a high-end system.

    That is all quite cool, but you seem to be projecting a bit much as to what median folks want in a system. Yeah, pot meet kettle ;)

    BTW: When doing distributed rendering, you have to boil down the operation into a series of instructions to send out after all. Which brings up another good reason for the primacy of the command line. Not saying a good GUI isn’t really nice, but a bad or incomplete command line interface is a killer.

  8. #8 Virgil Samms
    August 13, 2009

    I’d love to take advantage of all that underlying power! But Fedora 11 won’t install on my machine.

  9. #9 Ray Ingles
    August 13, 2009

    Aww, Samms, it shouldn’t be that hard for the First Lensman! :->

    But seriously, there are other options if Fedora’s not working for you. You could, for example, try Ubuntu – if you can’t burn a CD they’ll mail you one free. If you’re able to burn CDs, then you could give SUSE or Linux Mint a shot.

  10. #10 Jeff Knapp
    August 13, 2009

    @7 True – in all of what you are saying. And the render-wranglers (guys who push rendering jobs through render farms) are the ones who have to deal with stuff on that level (thankfully). For me, I just dump render jobs into the render queue. I use the GUI interface for that render queue. At one gig I had, there were no render wranglers and I often had to go into the server room and restart crashed servers and the like myself. In that case, the render queue and server were purely a GUI affair, though, if one wanted, the command line was available.

    BTW: When doing distributed rendering, you have to boil down the operation into a series of instructions to send out after all. Which brings up another good reason for the primacy of the command line. Not saying a good GUI isn’t really nice, but a bad or incomplete command line interface is a killer.

    Likewise, a bad or incomplete GUI is every bit as much a killer. Maya is one package that, unfortunately, forces the artist to access much of its underlying power from scripts (MEL) and a command line. It is one of the reasons TDs (technical directors) love it but some artists (like me), not so much. That and its GUI is an awful mess. When given the choice, I use other 3D apps such as Modo and Cinema 4D, which have much better, more accessible “artist friendly” GUIs (as well as a lot of power and capability – and command lines for those that want to use them).

    That is all quite cool, but you seem to be projecting a bit much as to what median folks want in a system. Yeah, pot meet kettle ;)

    Much of my projecting is from me but it is also from the experience I have with my kids, my wife and my mother-in-law (grandma). Grandma insists on staying with Windows and I am constantly having to support her system for her, updating this, that and the other thing, fixing constant, recurring software issues, etc. It is a real pain. I switched my wife to the Mac about a year ago from a POS Windows box and she absolutely loves it and would never go back. And, the time I spend supporting that system for her is almost non-existent. I strongly suspect that if either one of them had to use a command line, they simply would be lost.

    With that said, I have tried to talk “Grandma” into letting me replace the Windows on her PC with Ubuntu, knowing that the standard 32 bit installation with Firefox, Thunderbird and the office suite it comes with would be all she would ever need, and that it would be more stable and quicker for her, and easier for me to maintain (it wouldn’t really need that much compared to Windows – much like the Mac). And here is where I can definitely thank Greg for convincing me to give it a try in the first place.