So, if I can build a wagon with a lawnmower engine from parts, and it is able to putter down the street (barely) faster than I can walk, why should I use a car?
That's my question.
I'm a sysadmin. I have been for years. My coworker calls me "the encyclopaedia of Linux", because I know all the command line tools, their functions and features, and how to tie them together. I write one-liners that are 5 lines long. And I like it. I love the Unix philosophy and design and tools.
But that doesn't make it a perfect fit for everything. Shell scripts are very limited and painful for a lot of things. When I'm writing something I expect to be 50 lines or less, shell scripts and *nix utilities will usually (not always, but usually) get the job done (especially if you include sed/awk in there). For anything more than that, I break out Perl, and I never regret it.
Complicated logic, non-trivial data structures, multiple chains of processing: these are things shell scripts and the standard tools just don't handle well. It's also a lot cleaner and simpler to use a real data structure than to force everything through pipes and temporary files.
Note: Your comment that Perl "reads entire files into memory" shows your ignorance. Perl is an immensely flexible tool: it lets you process files line-by-line, stream-style, or read the entire file into memory. Both methods are trivial, and having the choice means you can use whichever is appropriate for the task at hand. Your comment suggests that reading the whole file is somehow "wrong", but for many tasks it's a faster and cleaner method.