Getting Started

Fabian Cook

November 2015

In this article by Fabian Cook, the author of the book Node.js Essentials, we get started with Node.js. Every web developer comes across JavaScript every once in a while, even if they just dabble in simple web pages. Whenever you want to make your web page a little more interactive, you grab your trustworthy friends, such as JavaScript and jQuery, and hack together something new. You might have developed some exciting frontend applications using AngularJS or Backbone and want to learn more about what else you can do with JavaScript.

While testing your website on multiple browsers, you have probably come across Google Chrome at some point and noticed that your JavaScript code runs really well on it.

Google Chrome and Node.js have something very big in common: they are both built on Google's high-performance V8 JavaScript engine. This gives us the same engine in the browser and in the backend, pretty cool, right?


Setting up

In order to get started and use Node.js, we need to download and install it. The best way is to head over to https://nodejs.org/ and download the installer.

At the time of writing, the current version of Node is 0.12.7.

To ensure consistency, we are going to use an npm package to install the correct version of Node.js. For this, we are going to use the n package described at https://www.npmjs.com/package/n.

Currently, this package only supports *nix machines. For Windows, see nvm-windows or download the binary for 0.12.7 from https://nodejs.org/dist/v0.12.7/.

Once you have Node.js installed, open a terminal and run:

[~]$ npm install -g n

The -g argument installs the package globally so that we can use it anywhere. Linux users may need to use sudo to run commands that install global packages.

Using the recently installed package, run:

[~]$ n

This will display a list of available versions, something such as the following:

node/0.10.38
node/0.11.16
node/0.12.0
node/0.12.2
node/0.12.7

If node/0.12.7 isn't marked, we can simply run the following command to ensure that node/0.12.7 gets installed:

[~]$ sudo n 0.12.7

To ensure that Node.js is all good to go, let's create and run a simple hello world example:

[~/src/examples/example-1]$ touch example.js
[~/src/examples/example-1]$ echo "console.log(\"Hello world\")" > example.js
[~/src/examples/example-1]$ node example.js
Hello world

Cool, it works, now let's get down to business.
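As a side note, Node.js can also evaluate a string of JavaScript passed with the -e flag, which is handy for quick one-off checks like this:

```shell
node -e "console.log('Hello world')"
```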

Hello require

In the preceding example, we just logged a simple message, nothing interesting, so let's dive a bit deeper.

When using multiple scripts in the browser, we usually just include another script tag such as:

<script type='application/javascript' src='script_a.js'>
</script>
<script type='application/javascript' src='script_b.js'>
</script>

Both these scripts share the same global scope, which usually leads to some unusual conflicts when two scripts want to use the same names.

//script_a.js
function run( ) {
   console.log( "I'm running from script_a.js!" );
}
$( run );

//script_b.js
function run( ) {
   console.log( "I'm running from script_b.js!" );
}
$( run );

This can lead to confusion when many files are minified and crammed together, and it causes a problem when the code is run: the global run function from script_a.js is replaced by the one from script_b.js, so on running the code, we see the following on the console:

> I'm running from script_b.js!
> I'm running from script_b.js!

The most common method to get around this, and to limit the pollution of the global scope, is to wrap our files in an anonymous function, as shown:

//script_a.js
(function( $, undefined ) {
   function run( ) {
       console.log( "I'm running from script_a.js!" );
   }
   $( run );
})( jQuery );

//script_b.js
(function( $, undefined ) {
   function run( ) {
       console.log( "I'm running from script_b.js!" );
   }
   $( run );
})( jQuery );

Now when we run this, it works as expected:

> I'm running from script_a.js!
> I'm running from script_b.js!

This is good for code that we do not depend on outside the scope of the anonymous function, but what do we do with the code that we do depend on? We just export it, right?

Something similar to the following code will do:

(function( undefined ) {
   function Logger(){
   }
   Logger.prototype.log = function( message /*...*/ ){
       console.log.apply( console, arguments );
   }
   this.Logger = Logger;
})( )

Now, when we run this script, we can access Logger from the global scope:

var logger = new Logger( );
logger.log( "This", "is", "pretty", "cool" )
> This is pretty cool

So now, we can share our libraries and everything is good, but what if someone else already has a library that exposes its own Logger class?

So what does node do to solve this issue? Hello require!

Node.js has a simple way to bring in scripts and modules from external sources; it is comparable to require in PHP.

Let's create a few files in this structure:

/example-2
   /util
       index.js
       logger.js
   main.js

/* util/index.js */
var logger = new Logger( )
var util = {
   logger: logger
};

/* util/logger.js */

function Logger(){
}
Logger.prototype.log = function( message /*...*/ ){
   console.log.apply( console, arguments );
};

/* main.js */
util.logger.log( "This is pretty cool" );

We can see that main.js is dependent on util/index.js, which is in turn dependent on util/logger.js.

This should just work, right? Or maybe not. Let's run the command:

[~/src/examples/example-2]$ node main.js
ReferenceError: logger is not defined
   at Object.<anonymous> (/Users/fabian/examples/example-2/main.js:1:63)
   /* Removed for simplicity */
   at Node.js:814:3

So why is this? Shouldn't they be sharing the same global scope? Well, in Node.js the story is a bit different. Remember those anonymous functions that we were wrapping our files in earlier? Node.js wraps our scripts in them automatically and this is where require fits in.

Let's fix our files, as shown:

/* util/index.js */
Logger = require( "./logger" )

/* main.js */
util = require( "./util" );

If you noticed, I didn't use index.js when requiring util/index.js. The reason for this is that when you require a folder rather than a file, you can specify an index file to represent that folder's code. This can be handy for something such as a models folder, where you expose all your models in one require rather than a separate require for each model.

So now, we have required our files. But what do we get back?

[~/src/examples/example-2]$ node
> var util = require( "./util" );
> console.log( util );
{} 

Still, there is no logger. We have missed an important step; we haven't told Node.js what we want to expose from our files.

To expose something in Node.js, we use an object called module.exports. There is a shorthand reference to it that is just exports. When our file is wrapped in an anonymous function, both module and exports are passed as parameters, as shown in the following example:

function Module( ) {
   this.exports = { };
}

function require( file ) {
   // .....
   return module.exports;
}

var module = new Module( );
var exports = module.exports;

(function( exports, require, module ) {
   exports = "Value a";
   module.exports = "Value b";
})( exports, require, module );
console.log( module.exports );
// Value b

The example shows that exports is initially just a reference to module.exports. This means that if you assign exports = { }, the value you set wouldn't be accessible outside the function's scope. However, when you add properties to the exports object, you are actually adding properties to the module.exports object, as they are both the same value. Assigning a value to module.exports will export that value, because module.exports is accessible outside the function's scope through the module.

With this knowledge, we can finally run our script in the following manner:

/* util/index.js */
Logger = require( "./logger.js" );
exports.logger = new Logger( );

/* util/logger.js */
function Logger( ){
}
Logger.prototype.log = function( message /*...*/ ) {
   console.log.apply( console, arguments );
};
module.exports = Logger;

/* main.js */
util = require( "./util" );
util.logger.log( "This is pretty cool" );

Running main.js:

[~/src/examples/example-2]$ node main.js
This is pretty cool

Require can also be used to include modules in our code. When requiring modules, we don't need to use a file path; we just need the name of the node module that we want.

Node.js includes many prebuilt core modules, one of which is the util module. You can find details on the util module at https://nodejs.org/api/util.html.

Let's try the util module:

[~]$ node
> var util = require( "util" )
> util.log( 'This is pretty cool as well' )
01 Jan 00:00:00 - This is pretty cool as well 

Hello npm

Along with internal modules, there is also an entire ecosystem of packages; the most common package manager for Node.js is npm. At the time of writing this article, there are a total of 192,875 packages available.

We can use npm to access packages that do many things for us, from routing HTTP requests to building our projects. You can also browse the packages available at https://www.npmjs.com/.

Using a package manager, you can bring in other modules, which is great as you can spend more time working on your business logic rather than reinventing the wheel.

Let's download the following package to make our log messages colorful:

[~/src/examples/example-3]$ npm install chalk

Now, to use it, create a file and require it:

[~/src/examples/example-3]$ touch index.js
/* index.js */
var chalk = require( "chalk" );
console.log( "I am just normal text" )
console.log( chalk.blue( "I am blue text!" ) )

On running this code, you will see the first message in the default color and the second message in blue. Let's look at the command:

[~/src/examples/example-3]$ node index.js
I am just normal text
I am blue text!

Having the ability to download existing packages comes in handy when you require something that someone else has already implemented. As we said earlier, there are many packages out there to choose from.

We need to keep track of these dependencies and there is a simple solution to that: package.json.

Using package.json, we can define things such as the name of our project, the main script, how to run tests, our dependencies, and so on. You can find a full list of properties at https://docs.npmjs.com/files/package.json.

npm provides a handy command to create these files; it will ask you the relevant questions needed to create your package.json file:

[~/src/examples/example-3]$ npm init

The preceding utility will walk you through the creation of a package.json file.

It only covers the most common items and tries to guess valid defaults.

Run the npm help json command for definitive documentation on these fields and to know what they do exactly.

Afterwards, use npm install <pkg> --save to install a package and save it as a dependency in the package.json file.

Press ^C to quit at any time:

name: (example-3)
version: (1.0.0)
description:
entry point: (main.js)
test command:
git repository:
keywords:
license: (ISC)

About to write to /examples/example-3/package.json:

{
"name": "example-3",
"version": "1.0.0",
"description": "",
"main": "main.js",
"scripts": {
   "test": "echo \"Error: no test specified\" && exit 1"
},
"author": "....",
"license": "ISC"
}
Is this ok? (yes)

The tool will provide you with default values, so it is easier to just skip through them using the enter key.

Now, when installing a package, we can use the --save option to save chalk as a dependency, as shown:

[~/src/examples/example-3]$ npm install --save chalk

We can see chalk has been added:

[~/examples/example-3]$ cat package.json
{
"name": "example-3",
"version": "1.0.0",
"description": "",
"main": "main.js",
"scripts": {
   "test": "echo \"Error: no test specified\" && exit 1"
},
"author": "...",
"license": "ISC",
"dependencies": {
   "chalk": "^1.0.0"
}
}

We can add these dependencies manually by modifying package.json; however, saving dependencies on installation is the most common method.

You can read more about the package file at: https://docs.npmjs.com/files/package.json.

If you are creating a server or an application rather than a module, you most likely want a way to start your process without having to give a path to your main file all the time; this is where the scripts object in your package.json file comes into play.

To set your startup script, you just need to set the start property in the scripts object, as shown:

"scripts": {
   "test": "echo \"Error: no test specified\" && exit 1",
     "start": "node server.js"
}

Now, all we need to do is run npm start and then npm will run the start script we have already specified.

We can also define more scripts, such as a script for the development environment; however, for non-standard script names, instead of just using npm <script> we need to use npm run <script>. For example, if we want to run our new development script, we will have to use npm run development.
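To make that concrete, here is a sketch of a scripts object with a hypothetical development entry (the --dev flag is just an example argument for our imaginary server.js):

```json
"scripts": {
   "test": "echo \"Error: no test specified\" && exit 1",
   "start": "node server.js",
   "development": "node server.js --dev"
}
```

With this in place, npm start runs the standard script and npm run development runs the custom one.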

npm also has scripts that are triggered at different times. For example, we can define a postinstall script that runs after npm install; we can use this if we want to trigger another package manager (such as bower) to install its modules.

You can read more about the scripts object here: https://docs.npmjs.com/misc/scripts.

You need to define a package file if you are working in a team of developers, where the project is to be installed on different machines. If you are using a source control tool such as git, it is recommended that you add the node_modules directory to your ignore file, as shown:

[~/examples/example-3]$ echo "node_modules" > .gitignore
[~/examples/example-3]$ cat .gitignore
node_modules

Summary

That was quick, wasn't it? We have covered the fundamentals of Node.js that we need to continue on our journey.

We have covered how easy it is to expose and protect public and private code compared to regular JavaScript code in the browser where the global scope can get very polluted.

We also know how to include packages and code from external sources and how to ensure that the packages included are consistent.

As you can see there is a huge ecosystem of packages in one of the many JavaScript package managers, such as npm, just waiting for us to use and consume.

In the next chapter, we will focus on creating a simple server to route, authenticate and consume requests.

You've been reading an excerpt of:

Node.js Essentials
