History of the command line

Since the very first electronic machines, people have strived to communicate with them the same way that we humans talk to each other. But since natural-language processing was beyond the technological grasp of early computer systems, engineers fairly quickly replaced the punch cards, dials, and knobs of early computing machines with teletypes: typewriter-like machines that enabled keyed input and textual output to a display. Teletypes were in turn replaced by video monitors, enabling a world of graphical displays. Yet even as they became a novelty, teletypes served a function that was missing in graphical environments, and so terminal emulators were born to serve as the modern interface to the command line. The programs behind the terminals started out as an ingrained part of the computer itself: resident monitor programs that were able to start a job, detect when it was done, and clean up.

As computers grew in complexity, so did the programs controlling them. Resident monitors gave way to operating systems that were able to share time among multiple jobs. In the early 1960s, Louis Pouzin had the brilliant idea of treating the commands being fed to the computer as a kind of program in their own right: a shell around the operating system.

"After having written dozens of commands for CTSS, I reached the stage where I felt that commands should be usable as building blocks for writing more commands, just like subroutine libraries. Hence, I wrote RUNCOM, a sort of shell that drives the execution of command scripts, with argument substitution. The tool became instantly popular, as it became possible to go home in the evening and leaving long runcoms to execute overnight."

Scripting in this way, and the reuse of tooling, would become an ingrained trope in the exciting new world of programmable computing. Pouzin's concepts for a programmable shell made their way into the design and philosophy of Multics in the 1960s and its Bell Labs successor, Unix.

In the Bell System Technical Journal from 1978, Doug McIlroy wrote the following regarding the Unix system:

"A number of maxims have gained currency among the builders and users of the UNIX system to explain and promote its characteristic style: Make each program do one thing well. To do a new job, build afresh rather than complicate old programs by adding new features."
  • Expect the output of every program to become the input to another, as yet unknown, program. Don't clutter output with extraneous information. Avoid stringently columnar or binary input formats. Don't insist on interactive input.
  • Design and build software, even operating systems, to be tried early, ideally within weeks. Don't hesitate to throw away the clumsy parts and rebuild them.
  • Use tools in preference to unskilled help to lighten a programming task, even if you have to detour to build the tools and expect to throw some of them out after you've finished using them.

This is the core of the Unix philosophy: the key tenets that make the command line not just a way to launch programs or list files, but a gateway to a powerful collection of community-built tools that work together to process data in a clean, simple manner. In fact, McIlroy follows up with a great example of how this approach had already led to success with data processing, even back in 1978:

"Unexpected uses of files abound: programs may be compiled to be run and also typeset to be published in a book from the same text without human intervention; text intended for publication serves as grist for statistical studies of English to help in data compression or cryptography; mailing lists turn into maps. The prevalence of free-format text, even in "data" files, makes the text-processing utilities useful for many strictly data processing functions such as shuffling fields, counting, or collating."

Having access to simple yet powerful components, programmers needed an easy way to construct, reuse, and execute more complicated commands and scripts to do the processing specific to their needs. Enter the first fully featured command-line shell: the Bourne shell. Developed by Stephen Bourne (also at Bell Labs) in the late 1970s for Version 7 Unix, the Bourne shell was designed from the start with programmers like us in mind: it had all the scripting tools needed to put the community-developed single-purpose tools to good use. It was the right tool, in the right place, at the right time; almost all Unix systems today descend from Version 7, and nearly all still provide a Bourne-compatible shell. In this book, we will use a descendant of the venerable Bourne shell known as Bash, a rewrite released in 1989 for the GNU project that incorporated the best features of the Bourne shell along with those of several of its earlier spinoffs, such as the Korn shell (ksh) and the C shell (csh).
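
As a small taste of that scripting, here is a hypothetical Bash script (the name top_words.sh and its arguments are illustrative, not taken from the book) that wraps the word-counting pipeline shown earlier so it can be reused like any other command, much as Pouzin's runcoms turned sequences of commands into new building blocks:

    #!/usr/bin/env bash
    # top_words.sh -- print the N most frequent words in a text file.
    # A hypothetical example script, not from the book.
    # Usage: ./top_words.sh FILE [N]
    set -eu            # exit on errors and on use of unset variables

    file=$1            # required: the text file to analyze
    count=${2:-10}     # optional: how many words to show (defaults to ten)

    tr -cs '[:alpha:]' '\n' < "$file" |
      tr '[:upper:]' '[:lower:]' |
      sort |
      uniq -c |
      sort -rn |
      head -n "$count"

Marked executable with chmod +x top_words.sh, it behaves exactly like the single-purpose tools it is built from, and could itself become a stage in someone else's pipeline.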
