Communicating with Servers

HTML5 Data and Services Cookbook
by Gorgi Kosev and Mite Mitreski | September 2013

In this article, by Gorgi Kosev and Mite Mitreski, the authors of HTML5 Data and Services Cookbook, we're going to cover the following topics:

  • Creating an HTTP GET request to fetch JSON
  • Creating a request with custom headers
  • Versioning your API
  • Fetching JSON data with JSONP
  • Reading XML data from the server
  • Using the FormData interface
  • Posting a binary file to the server
  • Creating an SSL connection with Node.js
  • Making real-time updates with Ajax Push
  • Exchanging real-time messages using WebSockets

Creating an HTTP GET request to fetch JSON

One of the basic means of retrieving information from the server is HTTP GET. In a RESTful architecture, this method should only be used for reading data, so GET calls should never change server state. Now, this may not be true for every possible case; for example, if we have a view counter on a certain resource, is that a real change? Well, if we follow the definition literally then yes, it is a change, but it is far too insignificant to be taken into account.

Opening a web page in a browser does a GET request, but we often want a scripted way of retrieving data. This is usually achieved with Asynchronous JavaScript and XML (AJAX), allowing data to be reloaded without a complete page reload. Despite the name, the use of XML is not required, and these days, JSON is the format of choice.

A combination of JavaScript and the XMLHttpRequest object provides a method for exchanging data asynchronously, and in this recipe, we are going to see how to read JSON from the server using plain JavaScript and jQuery. Why use plain JavaScript rather than jQuery directly? We strongly believe that jQuery simplifies the DOM API, but it is not always available to us, and additionally, we need to know the underlying code behind asynchronous data transfer in order to fully grasp how applications work.

Getting ready

The server will be implemented using Node.js. In this example, for simplicity, we will use restify (http://mcavage.github.io/node-restify/), a Node.js module for creation of correct REST web services.

How to do it...

Let's perform the following steps.

  1. In order to include restify in our project, run the following command in the root directory of our server-side scripts:

    npm install restify

  2. After adding the dependency, we can proceed to creating the server code. We create a server.js file that will be run by Node.js, and at the beginning of it we add restify:

    var restify = require('restify');

  3. With this restify object, we can now create a server object and add handlers for get methods:

    var server = restify.createServer();
    server.get('hi', respond);
    server.get('hi/:index', respond);

  4. The get handlers do a callback to a function called respond, so we can now define this function that will return the JSON data. We will create a sample JavaScript object called hello; if the request contains an index parameter, the call came from the "hi/:index" handler:

    function respond(req, res, next) {
      console.log("Got HTTP " + req.method + " on " + req.url + " responding");
      addHeaders(req, res);
      var hello = [{
        'id': '0',
        'hello': 'world'
      }, {
        'id': '1',
        'say': 'what'
      }];
      if (req.params.index) {
        var found = hello[req.params.index];
        if (found) {
          res.send(found);
        } else {
          res.status(404);
          res.send();
        }
        return next();
      }
      res.send(hello);
      return next();
    }

  5. The addHeaders function that we call at the beginning adds headers that enable access to resources served from a different domain or a different server port:

    function addHeaders(req, res) {
      res.header("Access-Control-Allow-Origin", "*");
      res.header("Access-Control-Allow-Headers", "X-Requested-With");
    }

  6. The definition of these headers and what they mean will be discussed later in this article. For now, let's just say they enable access to the resources from a browser using AJAX. At the end, we add a block of code that will set the server to listen on port 8080:

    server.listen(8080, function() {
      console.log('%s listening at %s', server.name, server.url);
    });

  7. To start the server from the command line, we type the following command:

    node server.js

  8. If everything went as it should, we will get a message in the log:

    restify listening at http://0.0.0.0:8080

  9. We can then test it by accessing the URL we defined directly from the browser: http://localhost:8080/hi.

Now we can proceed with the client-side HTML and JavaScript. We will implement two ways for reading data from the server, one using standard XMLHttpRequest and the other using jQuery.get(). Note that not all features are fully compatible with all browsers.

  1. We create a simple page where we have two div elements, one with the ID data and another with the ID say. These elements will be used as placeholders to load data from the server into them:

    Hello <div id="data">loading</div>
    <hr/>
    Say <div id="say">No</div>
    <script src="http://ajax.googleapis.com/ajax/libs/jquery/1.8.2/jquery.min.js"></script>
    <script src="example.js"></script>
    <script src="exampleJQuery.js"></script>

  2. In the example.js file, we define a function called getData that will create an AJAX call to a given URL and invoke a callback if the request succeeds:

    function getData(url, onSuccess) {
      var request = new XMLHttpRequest();
      request.open("GET", url, true);
      request.onload = function() {
        if (request.status === 200) {
          console.log(request);
          onSuccess(request.response);
        }
      };
      request.send(null);
    }

  3. After that, we can call the function directly, but in order to demonstrate that the call happens after the page is loaded, we will call it after a timeout of three seconds:

    setTimeout(function() {
      getData('http://localhost:8080/hi', function(response) {
        console.log('finished getting data');
        var div = document.getElementById('data');
        var data = JSON.parse(response);
        div.innerHTML = data[0].hello;
      });
    }, 3000);

  4. The jQuery version is a lot cleaner, as the complexity that comes with the standard DOM API and the event handling is reduced substantially:

    (function() {
      $.getJSON('http://localhost:8080/hi/1', function(data) {
        $('#say').text(data.say);
      });
    }());

How it works...

At the beginning, we installed the dependency using npm install restify; this is sufficient to have it working, but npm also has a more expressive way of specifying dependencies. We can add a file called package.json, a packaging format that is mainly used for publishing details for Node.js applications. In our case, we can define package.json with the following code:

{
  "name": "ch8-tip1-http-get-example",
  "description": "example on http get",
  "dependencies": {
    "restify": "*"
  },
  "author": "Mite Mitreski",
  "main": "html5dasc",
  "version": "0.0.1"
}

If we have a file like this, npm will automatically handle the installation of dependencies after calling npm install from the command line in the directory where the package.json file is placed.
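A manifest like this is just JSON, so its dependency list can be inspected programmatically. As a rough sketch (the manifest string below is a hypothetical example, not the book's exact file):

```javascript
// Sketch: parse a package.json manifest and list its dependency names.
// The manifest below is illustrative, not the book's actual file.
var manifest = JSON.stringify({
  name: 'ch8-tip1-http-get-example',
  version: '0.0.1',
  dependencies: { restify: '*' }
});

function listDependencies(json) {
  var pkg = JSON.parse(json);
  // npm expects "dependencies" to be an object mapping name -> version range
  return Object.keys(pkg.dependencies || {});
}

console.log(listDependencies(manifest)); // [ 'restify' ]
```

This is essentially what npm does when it resolves what to install.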

Restify has a simple routing where functions are mapped to appropriate methods for a given URL. The HTTP GET request for '/hi' is mapped with server.get('hi', theCallback), where theCallback is executed, and a response should be returned.

When we have a parameterized resource, for example in 'hi/:index', the value associated with :index will be available under req.params. For example, in a request to '/hi/john', to access the john value, we simply use req.params.index. Additionally, the value for index will automatically get URL-decoded before it is passed to our handler. One other notable part of the request handlers in restify is the next() function that we called at the end. In our case, it does not make much sense, but in general, we are responsible for calling it if we want the next handler function in the chain to be called. For exceptional circumstances, there is also an option to call next() with an error object, triggering custom responses.
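The parameter extraction described above can be sketched in a few lines (a simplification for illustration; restify's real router is considerably more involved):

```javascript
// Simplified sketch of matching a parameterized route such as 'hi/:index'
// and extracting URL-decoded parameters, as restify does for req.params.
function matchRoute(pattern, path) {
  var patternParts = pattern.split('/');
  var pathParts = path.split('/');
  if (patternParts.length !== pathParts.length) return null;
  var params = {};
  for (var i = 0; i < patternParts.length; i++) {
    if (patternParts[i].charAt(0) === ':') {
      // parameter values are URL-decoded before reaching the handler
      params[patternParts[i].slice(1)] = decodeURIComponent(pathParts[i]);
    } else if (patternParts[i] !== pathParts[i]) {
      return null;
    }
  }
  return params;
}

console.log(matchRoute('hi/:index', 'hi/john'));       // { index: 'john' }
console.log(matchRoute('hi/:index', 'hi/john%20doe')); // { index: 'john doe' }
```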

When it comes to the client-side code, XMLHttpRequest is the mechanism behind the async calls, and on calling request.open("GET", url, true) with the last parameter set to true, we get a truly asynchronous execution. Now you might be wondering why this parameter is here; isn't the call already done after loading the page? That is true, the call is done after loading the page, but if, for example, the parameter were set to false, the execution of the request would block, or to put it in layman's terms, the script would pause until we get a response. This might look like a small detail, but it can have a huge impact on performance.

The jQuery part is pretty straightforward; there is a function that accepts the URL of the resource, optional request data, and a success function that gets called after a response is successfully received:

jQuery.getJSON( url [, data ] [, success(data, textStatus, jqXHR) ] )

When we open index.htm, the server should log something like the following:

Got HTTP GET on /hi/1 responding
Got HTTP GET on /hi responding

Here one is from the jQuery request and the other from the plain JavaScript.

There's more...

XMLHttpRequest Level 2 is one of the newer improvements being added to browsers; although not part of HTML5, it is still a significant change. There are several features in the Level 2 changes, mostly to enable working with files and data streams, but there is one simplification we already used. Earlier, we would have to use onreadystatechange and go through all of the states, and if the readyState was 4, which is equal to DONE, we could read the data:

var xhr = new XMLHttpRequest();
xhr.open('GET', 'someurl', true);
xhr.onreadystatechange = function(e) {
  if (this.readyState == 4 && this.status == 200) {
    // response is loaded
  }
};

In a Level 2 request, however, we can use request.onload = function() {} directly without checking states. The possible states are as follows:

    Value    State               Description
    0        UNSENT              open() has not been called yet
    1        OPENED              open() has been called
    2        HEADERS_RECEIVED    send() has been called and headers are available
    3        LOADING             the response body is being received
    4        DONE                the operation is complete

One other thing to note is that XMLHttpRequest Level 2 is supported in all major browsers and IE 10; the older XMLHttpRequest has a different way of instantiation on old versions of IE (older than IE 7), where we can access it through an ActiveX object via new ActiveXObject("Msxml2.XMLHTTP.6.0");.

Creating a request with custom headers

The HTTP headers are a part of the request object being sent to the server. Many of them give information about the client's user agent setup and configuration, as that is sometimes the basis for deciding how the resources fetched from the server are represented. Several of them, such as Etag, Expires, and If-Modified-Since, are closely related to caching, while others such as DNT, which stands for "Do Not Track" (http://www.w3.org/2011/tracking-protection/drafts/tracking-dnt.html), can be quite controversial. In this recipe, we will take a look at a way of using the custom X-Myapp header in our server and client-side code.

Getting ready

The server will be implemented using Node.js. In this example, again for simplicity, we will use restify (http://mcavage.github.io/node-restify/). Also, monitoring the console in your browser and server is crucial in order to understand what happens in the background.

How to do it...

  1. We can start by defining the dependencies for the server side in package.json file:

{
  "name": "ch8-tip2-custom-headers",
  "dependencies": {
    "restify": "*"
  },
  "main": "html5dasc",
  "version": "0.0.1"
}

  2. After that, we can call npm install from the command line; this will automatically retrieve restify and place it in a node_modules folder created in the root directory of the project. We can then proceed to creating the server-side code in a server.js file, where we set the server to listen on port 8080 and add a route handler for 'hi', as well as one for every other path when the request method is HTTP OPTIONS:

    var restify = require('restify');
    var server = restify.createServer();
    server.get('hi', addHeaders, respond);
    server.opts(/\.*/, addHeaders, function (req, res, next) {
      console.log("Got HTTP " + req.method + " on " + req.url + " with headers\n");
      res.send(200);
      return next();
    });
    server.listen(8080, function() {
      console.log('%s listening at %s', server.name, server.url);
    });

    In most cases, the documentation should be enough when we write applications built on restify, but sometimes, it is a good idea to take a look at the source code as well. It can be found at https://github.com/mcavage/node-restify/.

  3. One thing to notice is that we can have multiple chained handlers; in this case, we have addHeaders before the others. In order for every handler to be propagated, next() should be called:

    function addHeaders(req, res, next) {
      res.setHeader("Access-Control-Allow-Origin", "*");
      res.setHeader('Access-Control-Allow-Headers', 'X-Requested-With, X-Myapp');
      res.setHeader('Access-Control-Allow-Methods', 'GET, OPTIONS');
      res.setHeader('Access-Control-Expose-Headers', 'X-Myapp, X-Requested-With');
      return next();
    }

    The addHeaders function adds access control options in order to enable cross-origin resource sharing. Cross-origin resource sharing (CORS) defines a way in which the browser and server can interact to determine whether the request should be allowed. It is more secure than simply allowing all cross-origin requests, yet more powerful than allowing only same-origin ones.

  4. After this, we can create the handler function that will return a JSON response with the headers the server received and a hello world kind of object:

    function respond(req, res, next) {
      console.log("Got HTTP " + req.method + " on " + req.url + " with headers\n");
      console.log("Request: ", req.headers);
      var hello = [{
        'id': '0',
        'hello': 'world',
        'headers': req.headers
      }];
      res.send(hello);
      console.log('Response:\n ', res.headers());
      return next();
    }

    We additionally log the request and response headers to the server console log in order to see what happens in the background.

  5. For the client-side code, we will use both a plain "vanilla" JavaScript approach and a jQuery one, so we include example.js and exampleJquery.js, as well as a few div elements that we will use to display data retrieved from the server:

    Hi <div id="data">loading</div>
    <hr/>
    Headers list from the request: <div id="headers"></div>
    <hr/>
    Data from jQuery: <div id="dataRecieved">loading</div>
    <script src="http://ajax.googleapis.com/ajax/libs/jquery/1.8.2/jquery.min.js"></script>
    <script src="example.js"></script>
    <script src="exampleJQuery.js"></script>

  6. A simple way to add the headers is to call setRequestHeader on an XMLHttpRequest object after the call to open():

    function getData(url, onSuccess) {
      var request = new XMLHttpRequest();
      request.open("GET", url, true);
      request.setRequestHeader("X-Myapp", "super");
      request.setRequestHeader("X-Myapp", "awesome");
      request.onload = function() {
        if (request.status === 200) {
          onSuccess(request.response);
        }
      };
      request.send(null);
    }

  7. The XMLHttpRequest automatically sets headers such as "Content-Length", "Referer", and "User-Agent", and does not allow you to change them using JavaScript.

    A more complete list of headers and the reasoning behind this can be found in the W3C documentation at http://www.w3.org/TR/XMLHttpRequest/#the-setrequestheader%28%29-method.

  8. To print out the results, we add a function that will add each of the header keys and values to an unordered list:

    getData('http://localhost:8080/hi', function(response) {
      console.log('finished getting data');
      var data = JSON.parse(response);
      document.getElementById('data').innerHTML = data[0].hello;
      var headers = data[0].headers,
          headersList = "<ul>";
      for (var key in headers) {
        headersList += '<li><b>' + key + '</b>: ' + headers[key] + '</li>';
      }
      headersList += "</ul>";
      document.getElementById('headers').innerHTML = headersList;
    });

  9. When this gets executed, a list of all the request headers should be displayed on the page, and our custom x-myapp should be shown:

    host: localhost:8080
    connection: keep-alive
    origin: http://localhost:8000
    x-myapp: super, awesome
    user-agent: Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/537.27 (KHTML, like Gecko) Chrome/26.0.1386.0 Safari/537.27

  10. The jQuery approach is far simpler: we can use the beforeSend hook to call a function that will set the 'x-myapp' header. When we receive the response, we write it to the element with the ID dataRecieved:

    $.ajax({
      url: 'http://localhost:8080/hi',
      beforeSend: function (xhr) {
        xhr.setRequestHeader('x-myapp', 'this was easy');
      },
      success: function (data) {
        $('#dataRecieved').text(data[0].headers['x-myapp']);
      }
    });

  11. Output from the jQuery example will be the data contained in the x-myapp header:

    Data from jQuery: this was easy

How it works...

You may have noticed that on the server side, we added a route that has a handler for HTTP OPTIONS method, but we never explicitly did a call there. If we take a look at the server log, there should be something like the following output:

Got HTTP OPTIONS on /hi with headers
Got HTTP GET on /hi with headers

This happens because the browser first issues a preflight request, which in a way is the browser's question whether or not there is permission to make the "real" request. Once the permission has been received, the original GET request happens. If the OPTIONS response is cached, the browser will not issue any extra preflight calls for subsequent requests.
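The server's side of that preflight exchange boils down to a permission check. A minimal sketch (the helper below is ours, not restify API; the allowed lists mirror the Access-Control-Allow-* values our addHeaders function sends):

```javascript
// Sketch: decide whether a preflight (OPTIONS) request should be allowed.
// The browser sends the intended method and custom headers; the server
// compares them against what it is willing to accept.
function isPreflightAllowed(requestedMethod, requestedHeaders) {
  var allowedMethods = ['GET', 'OPTIONS'];
  var allowedHeaders = ['x-requested-with', 'x-myapp'];
  if (allowedMethods.indexOf(requestedMethod) === -1) return false;
  return requestedHeaders.every(function (h) {
    return allowedHeaders.indexOf(h.toLowerCase()) !== -1;
  });
}

console.log(isPreflightAllowed('GET', ['X-Myapp']));    // true
console.log(isPreflightAllowed('DELETE', ['X-Myapp'])); // false
```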

The setRequestHeader function of XMLHttpRequest actually appends each value as a comma-separated list of values. As we called the function two times, the value for the header is as follows:

'x-myapp': 'super, awesome'
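This appending behavior can be modeled in a few lines (a sketch of the spec'd behavior, not the browser's actual implementation):

```javascript
// Sketch of how setRequestHeader combines values for a repeated header.
function makeHeaderStore() {
  var headers = {};
  return {
    setRequestHeader: function (name, value) {
      var key = name.toLowerCase();
      // per the XMLHttpRequest spec, repeated calls append with ", "
      headers[key] = key in headers ? headers[key] + ', ' + value : value;
    },
    get: function (name) { return headers[name.toLowerCase()]; }
  };
}

var store = makeHeaderStore();
store.setRequestHeader('X-Myapp', 'super');
store.setRequestHeader('X-Myapp', 'awesome');
console.log(store.get('x-myapp')); // super, awesome
```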

There's more...

For most use cases, we do not need custom headers as part of our logic, but there are plenty of APIs that make good use of them. For example, many server-side technologies add the X-Powered-By header that contains some meta information, such as JBoss 6 or PHP/5.3.0. Another example is Google Cloud Storage, where among other headers there are x-goog-meta-prefixed headers such as x-goog-meta-project-name and x-goog-meta-project-manager.

Versioning your API

We do not always have the best solution on the first implementation. The API can be extended up to a certain point, but afterwards it needs to undergo some structural changes. But we might already have users that depend on the current version, so we need a way to serve different representation versions of the same resource. Once a module has users, the API cannot be changed at will.

One way to resolve this issue is to use a so-called URL versioning, where we simply add a prefix. For example, if the old URL was http://example.com/rest/employees, the new one could be http://example.com/rest/v1/employees, or under a subdomain it could be http://v1.example.com/rest/employee. This approach only works if you have direct control over all the servers and clients. Otherwise, you need to have a way of handling fallback to older versions.
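Inserting such a version segment into a URL is a simple string operation; as a sketch (this helper is purely illustrative, real routing would live on the server):

```javascript
// Sketch: add a version segment after the /rest/ prefix of an API URL.
function addVersionPrefix(url, version) {
  return url.replace('/rest/', '/rest/' + version + '/');
}

console.log(addVersionPrefix('http://example.com/rest/employees', 'v1'));
// http://example.com/rest/v1/employees
```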

In this recipe, we are going to implement so-called semantic versioning (http://semver.org/), using HTTP headers to specify accepted versions.

Getting ready

The server will be implemented using Node.js. In this example, we will use restify (http://mcavage.github.io/node-restify/) for the server-side logic; monitoring the requests helps to understand what is sent.

How to do it...

Let's perform the following steps.

  1. We need to define the dependencies first, and after installing restify, we can proceed to the creation of the server code. The main difference from the previous examples is the definition of the "Accept-version" header. restify has built-in handling for this header using versioned routes. After creating the server object, we can set which methods will get called for which version:

    server.get({ path: "hi", version: '2.1.1' }, addHeaders, helloV2, logReqRes);
    server.get({ path: "hi", version: '1.1.1' }, addHeaders, helloV1, logReqRes);

  2. We also need the handler for the HTTP OPTIONS, as we are using cross-origin resource sharing and the browser needs to do the additional request in order to get permissions:

    server.opts(/\.*/, addHeaders, logReqRes, function (req, res, next) {
      res.send(200);
      return next();
    });

  3. The handlers for Version 1 and Version 2 will return different objects in order for us to easily notice the difference between the API calls. In the general case, the resource should be the same, but can have different structural changes. For Version 1, we can have the following:

    function helloV1(req, res, next) {
      var hello = [{
        'id': '0',
        'hello': 'grumpy old data',
        'headers': req.headers
      }];
      res.send(hello);
      return next();
    }

  4. As for Version 2, we have the following:

    function helloV2(req, res, next) {
      var hello = [{
        'id': '0',
        'awesome-new-feature': {
          'hello': 'awesomeness'
        },
        'headers': req.headers
      }];
      res.send(hello);
      return next();
    }

  5. One other thing we must do is add the CORS headers in order to enable the accept-version header, so the addHeaders function we included in the routes should look something like the following:

    function addHeaders(req, res, next) {
      res.setHeader("Access-Control-Allow-Origin", "*");
      res.setHeader('Access-Control-Allow-Headers', 'X-Requested-With, accept-version');
      res.setHeader('Access-Control-Allow-Methods', 'GET, OPTIONS');
      res.setHeader('Access-Control-Expose-Headers', 'X-Requested-With, accept-version');
      return next();
    }

    Note that you should not forget the call to next(), which invokes the next function in the route chain.

  6. For simplicity, we will only implement the client side in jQuery, so we create a simple HTML document, where we include the necessary JavaScript dependencies:

    Old api: <div id="data">loading</div>
    <hr/>
    New one: <div id="dataNew"> </div>
    <hr/>
    <script src="http://ajax.googleapis.com/ajax/libs/jquery/1.8.2/jquery.min.js"></script>
    <script src="exampleJQuery.js"></script>

  7. In the exampleJQuery.js file, we make two AJAX calls to our REST API; one is set to use Version 1 and the other to use Version 2:

    $.ajax({
      url: 'http://localhost:8080/hi',
      type: 'GET',
      dataType: 'json',
      success: function (data) {
        $('#data').text(data[0].hello);
      },
      beforeSend: function (xhr) {
        xhr.setRequestHeader('accept-version', '~1');
      }
    });
    $.ajax({
      url: 'http://localhost:8080/hi',
      type: 'GET',
      dataType: 'json',
      success: function (data) {
        $('#dataNew').text(data[0]['awesome-new-feature'].hello);
      },
      beforeSend: function (xhr) {
        xhr.setRequestHeader('accept-version', '~2');
      }
    });

Notice that the accept-version header contains the values ~1 and ~2. These designate that all the semantic versions such as 1.1.0, 1.1.1, and 1.2.1 will get matched by ~1, and similarly for ~2. At the end, we should get an output like the following text:

Old api: grumpy old data
New one: awesomeness

How it works...

Versioned routes are a built-in feature of restify that works through the use of accept-version. In our example, we used versions ~1 and ~2, but what happens if we don't specify a version? restify will make the choice for us, as the request will be treated in the same manner as if the client had sent a * version; the first defined matching route in our code will be used. There is also an option to set up the routes to match multiple versions by adding a list of versions for a certain handler:

server.get({path: 'hi', version: ['1.1.0', '1.1.1', '1.2.1']}, sendOld);

The reason why this type of versioning is very suitable for constantly growing applications is that as the API changes, clients can stick with their version of the API without any additional effort or changes in client-side development, meaning we don't have to update the application. On the other hand, if a client is sure that their application will work on newer API versions, they can simply change the request headers.
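The tilde matching can be approximated with a small check. The following is a deliberate simplification of real semver range matching (where, for instance, ~1.2 also pins the minor version); restify delegates the real work to a semver library:

```javascript
// Simplified sketch of matching a '~N' range against a
// 'major.minor.patch' version: only the major version is compared.
function satisfiesTilde(range, version) {
  var wantedMajor = parseInt(range.replace('~', ''), 10);
  var actualMajor = parseInt(version.split('.')[0], 10);
  return wantedMajor === actualMajor;
}

console.log(satisfiesTilde('~1', '1.1.1')); // true
console.log(satisfiesTilde('~2', '1.1.1')); // false
```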

There's more...

Versioning can also be implemented by using custom content types prefixed with vnd, for example, application/vnd.mycompany.user-v1. An example of this is Google Earth's KML content type, which is defined as application/vnd.google-earth.kml+xml. Notice that the content type can be in two parts; we could have application/vnd.mycompany-v1+json, where the second part is the format of the response.
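A vendor content type like this can be picked apart with a small parser (a sketch; the function name and returned shape are ours):

```javascript
// Sketch: split a vendor media type such as
// 'application/vnd.mycompany.user-v1+json' into vendor, version, format.
function parseVendorType(contentType) {
  var match = /^application\/vnd\.(.+?)-v(\d+)(?:\+(\w+))?$/.exec(contentType);
  if (!match) return null;
  return { vendor: match[1], version: match[2], format: match[3] || null };
}

console.log(parseVendorType('application/vnd.mycompany.user-v1+json'));
// { vendor: 'mycompany.user', version: '1', format: 'json' }
```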

Fetching JSON data with JSONP

JSONP, or JSON with padding, is a mechanism for making cross-domain requests by taking advantage of the <script> tag. The transport is done by simply setting the src attribute of a script element, or by adding the element itself if it is not present. The browser will do an HTTP request to download the URL specified, and that is not subject to the same-origin policy, meaning that we can use it to get data from servers that are not under our control. In this recipe, we will create a simple JSONP request, and a simple server to back that up.

Getting ready

We will make a simplified implementation of the server we used in previous examples, so we need Node.js and restify (http://mcavage.github.io/node-restify/) installed, either via a package.json definition or a simple npm install.

How to do it...

  1. First, we will create a simple route handler that will return a JSON object:

    function respond(req, res, next) {
      console.log("Got HTTP " + req.method + " on " + req.url + " responding");
      var hello = [{
        'id': '0',
        'what': 'hi there stranger'
      }];
      res.send(hello);
      return next();
    }

  2. We could roll our own version that wraps the response into a JavaScript function with the given name, but in order to enable JSONP when using restify, we can simply enable the bundled plugin. This is done by specifying which plugin should be used:

    var server = restify.createServer();
    server.use(restify.jsonp());
    server.get('hi', respond);

  3. After this, we just set the server to listen on port 8080:

    server.listen(8080, function() {
      console.log('%s listening at %s', server.name, server.url);
    });

  4. The built-in plugin checks the query string for parameters called callback or jsonp, and if one is found, the result will be JSONP with the function name passed as the value of that parameter. For example, in our case, if we open the browser on http://localhost:8080/hi, we get the following:

    [{"id":"0","what":"hi there stranger"}]

  5. If we access the same URL with the callback or jsonp parameter set, such as http://localhost:8080/hi?callback=great, we should receive the same data wrapped with that function name:

    great([{"id":"0","what":"hi there stranger"}]);

    This is where the P in JSONP, which stands for padding, comes into the picture.

  6. So, what we need to do next is create an HTML file where we would show the data from the server and include two scripts, one for the pure JavaScript approach and another for the jQuery way:

    <b>Hello far away server: </b>
    <div id="data">loading</div>
    <hr/>
    <div id="oneMoreTime">...</div>
    <script src="http://ajax.googleapis.com/ajax/libs/jquery/1.8.2/jquery.min.js"></script>
    <script src="example.js"></script>
    <script src="exampleJQuery.js"></script>

  7. We can proceed with the creation of example.js, where we create two functions; one will create a script element and set the value of src to http://localhost:8080/hi?callback=cool.run, and the other will serve as a callback upon receiving the data:

    var cool = (function() {
      var module = {};
      module.run = function(data) {
        document.getElementById('data').innerHTML = data[0].what;
      };
      module.addElement = function() {
        var script = document.createElement('script');
        script.src = 'http://localhost:8080/hi?callback=cool.run';
        document.getElementById('data').appendChild(script);
        return true;
      };
      return module;
    }());

  8. Afterwards, we only need to call the function that adds the element:

    cool.addElement();

    This should read the data from the server and show a result similar to the following:

    Hello far away server: hi there stranger

    We can call the addElement function on the cool object directly, because the module was created by a self-executing function.

  9. The jQuery example is a lot simpler; we can set the dataType to "jsonp" and everything else is the same as any other AJAX call, at least from the API point of view:

    $.ajax({
      type: "GET",
      dataType: "jsonp",
      url: 'http://localhost:8080/hi',
      success: function(obj) {
        $('#oneMoreTime').text(obj[0].what);
      }
    });

We can now use the standard success callback to handle the data received from the server, and we don't have to specify the parameter in the request. jQuery will automatically append a callback parameter to the URL and delegate the call to the success callback.

How it works...

The first large leap we are taking here is trusting the source of the data, as results from the server are evaluated as JavaScript after being downloaded. There have been some efforts to define a safer JSONP at http://json-p.org/, but they are far from widespread.

The download itself is an HTTP GET request, which adds another major limitation to usability. Hypermedia as the Engine of Application State (HATEOAS), among other things, defines the use of HTTP methods for the create, update, and delete operations, making JSONP unsuitable for those use cases.

Another interesting point is how jQuery delegates the call to the success callback. In order to achieve this, a unique function name is created and is sent to the callback parameter, for example:

/hi?callback=jQuery182031846177391707897_1359599143721&_=1359599143727

This function later delegates the call to the appropriate handler of jQuery.ajax.
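The two halves of this mechanism can be sketched together: the server pads the JSON with a function call, and the client generates a unique global callback name, similar in spirit to what jQuery does internally (the helper names below are ours, not jQuery's):

```javascript
// Sketch: server-side padding of a JSON payload, and client-side
// generation of a unique-ish callback name.
function padResponse(callbackName, data) {
  return callbackName + '(' + JSON.stringify(data) + ');';
}

function makeCallbackName() {
  // timestamp + random suffix, in the spirit of jQuery's generated names
  return 'jsonp_' + Date.now() + '_' + Math.floor(Math.random() * 1e6);
}

var name = makeCallbackName();
console.log(padResponse(name, [{ id: '0', what: 'hi there stranger' }]));
```

When the padded script is evaluated by the browser, it simply calls the named global function with the data as its argument.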

There's more...

With jQuery, we can also use a custom function name if the server parameter that handles JSONP is not called callback. This is done using the following config:

jsonp: false,
jsonpCallback: "myCallback"

Since JSONP does not use XMLHttpRequest, we should not expect the functions and parameters associated with a regular AJAX call to be executed or filled in; it is a very common mistake to expect just that. More on this can be found in the jQuery documentation at http://api.jquery.com/category/ajax/.

Reading XML data from the server

Another common data format for REST services is XML. If we have the option to choose a format, there is only a small number of cases where JSON is not a better choice. XML is a better option if we need strict message validation using multiple namespaces and schemas, or if, for some reason, we use Extensible Stylesheet Language Transformations (XSLT). The biggest reason of all is the need to work with and support legacy environments that don't use JSON. Most modern server-side frameworks have built-in support for content negotiation, meaning that depending on the client's request, they can serve up the same resource in different formats. In this recipe, we are going to create a simple XML server and use it from the client side.
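Content negotiation itself boils down to inspecting the Accept header of the request; a rough sketch (a hypothetical helper, not restify's implementation, and it ignores quality values and wildcards that real negotiation handles):

```javascript
// Sketch: pick a response format based on the Accept header.
function chooseFormat(acceptHeader, supported) {
  var accepted = (acceptHeader || '').split(',').map(function (t) {
    return t.split(';')[0].trim(); // drop any ';q=...' quality parameter
  });
  for (var i = 0; i < accepted.length; i++) {
    if (supported.indexOf(accepted[i]) !== -1) return accepted[i];
  }
  return supported[0]; // fall back to the server's default format
}

console.log(chooseFormat('application/xml, text/html',
  ['application/json', 'application/xml'])); // application/xml
```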

Getting ready

For the server side, we will use Node.js with restify (http://mcavage.github.io/node-restify/) for the REST services, and xmlbuilder (https://github.com/oozcitak/xmlbuilder-js) for creating simple XML documents. To do this, we can use npm to install the dependencies or define a simple package.json file, such as the one available in the example files.

How to do it...

Let's follow these steps to demonstrate the use of XML.

  1. The server code is similar to other restify-based examples that we created previously. As we just want to demonstrate the use of XML, we can create a simple structure with xmlbuilder:

    var restify = require('restify');
    var builder = require('xmlbuilder');

    var doc = builder.create();
    doc.begin('root')
      .ele('human')
        .att('type', 'female')
        .txt('some gal')
      .up()
      .ele('human')
        .att('type', 'male')
        .txt('some guy')
      .up()
      .ele('alien')
        .txt('complete');

  2. Its use is very straightforward: the doc.begin('root') statement creates the root of the document, and the ele() and att() calls create an element and an attribute respectively. As new parts are always added at the nesting level of the last one, we call the up() function to move the cursor one level up.

    In our case, the document that will be generated is as follows:

    <root>
      <human type="female">some gal</human>
      <human type="male">some guy</human>
      <alien>complete</alien>
    </root>

  3. To create the route for the resource, we can call server.get('hi', addHeaders, respond), where addHeaders sets the CORS headers and respond returns the XML document we created as a string:

    function respond(req, res, next) {
      res.setHeader('content-type', 'application/xml');
      res.send(doc.toString({ pretty: true }));
      return next();
    }

  4. restify does not have direct support for application/xml; if we leave it like this, the server's response will be of type application/octet-stream. In order to add support, we create the restify server object with a formatter that will accept XML:

    var server = restify.createServer({
      formatters: {
        'application/xml': function formatXML(req, res, body) {
          if (body instanceof Error) return body.stack;
          if (Buffer.isBuffer(body)) return body.toString('base64');
          return body;
        }
      }
    });

    The server should now return the correct content-type and CORS headers together with the response data:

    < HTTP/1.1 200 OK
    < Access-Control-Allow-Origin: *
    < Access-Control-Allow-Headers: X-Requested-With
    < content-type: application/xml
    < Date: Sat, 02 Feb 2013 13:08:20 GMT
    < Connection: keep-alive
    < Transfer-Encoding: chunked

  5. As we have the server ready, we can proceed with the client side by creating a basic HTML file in which we will include jQuery and a simple script:

    Hello
    <div id="humans"></div>
    <hr/>
    <script src = "http://ajax.googleapis.com/ajax/libs/jquery/1.8.2/jquery.min.js"></script>
    <script src = "exampleJQuery.js"></script>

  6. For simplicity, we use jQuery.ajax(), where the value of dataType will be xml:

    (function() {
      $.ajax({
        type: "GET",
        url: "http://localhost:8080/hi",
        dataType: "xml",
        success: function(xml) {
          $("root > human", xml).each(function() {
            var p = $("<p></p>");
            $(p).text($(this).text()).appendTo("#humans");
          });
        }
      });
    }())

How it works...

While most of the example code should be straightforward, the first thing you might be wondering is what application/octet-stream is. It is an Internet media type for a generic binary data stream. If we were to open the resource in a browser, it would ask us where to save the file or with which application to open it.

The formatter we added in the restify implementation accepts a function with the request, the response, and the body. It is the body object that is of most interest to us; we check whether it is an instance of Error in order to handle that case. The other check that needs to be done is whether the body is an instance of Buffer. JavaScript does not play very well with binary data, so the Buffer object was created to store raw data. In our case, we just return the body, as we have already constructed the XML. If we do a lot of processing like this, it might make sense to add formatting for JavaScript objects directly rather than manually creating a string with XML data.
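That last idea can be sketched as a small serializer of our own (toXML is a made-up helper; it does not escape XML entities or handle attributes, so it is an illustration rather than a production formatter):

```javascript
// Turn a plain JavaScript object into an XML string.
// Arrays become repeated elements; nested objects become nested elements.
function toXML(tag, value) {
  if (Array.isArray(value)) {
    return value.map(function (item) { return toXML(tag, item); }).join('');
  }
  if (typeof value === 'object' && value !== null) {
    var inner = Object.keys(value).map(function (key) {
      return toXML(key, value[key]);
    }).join('');
    return '<' + tag + '>' + inner + '</' + tag + '>';
  }
  return '<' + tag + '>' + String(value) + '</' + tag + '>';
}

console.log(toXML('root', { human: ['some gal', 'some guy'], alien: 'complete' }));
// <root><human>some gal</human><human>some guy</human><alien>complete</alien></root>
```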

On the client side, we used jQuery.ajax() to get the XML; when that happens, the success callback does not just receive text, it receives a DOM element that we can traverse using standard jQuery selectors. In our case, with "root > human" we select all the human elements, and for each of them we append a paragraph with its text to "#humans", just like working with HTML:

$("root > human", xml).each(function() {
  var p = $("<p></p>");
  $(p).text($(this).text()).appendTo("#humans");
});

There's more...

JXON (https://developer.mozilla.org/en-US/docs/JXON) is one good alternative when we have to support XML. Although not standardized, it follows a simple convention for transforming XML to JSON. Another good option for working with XML is XPath, the XML Path Language (http://www.w3.org/TR/xpath/), a query language that can be used to retrieve values from certain nodes or to select them for other manipulation. XPath is the simplest option in most use cases and as such should often be our first choice.

Older versions of jQuery (before version 1.1.2) had support for XPath out of the box, but it was later removed, as the standard selectors are a lot more powerful when doing HTML transformations.

ECMAScript for XML, commonly known as E4X, is a programming language extension that enables native support for XML. Although it had several implementations, it is being phased out; the newest versions of Firefox no longer support it.

Using the FormData interface

One of the new features added to XMLHttpRequest Level 2 (http://www.w3.org/TR/XMLHttpRequest2/) is the FormData object. This enables us to use a set of key-value pairs that can be sent using AJAX. The most common use is in sending binary files or any other large amount of data. In this recipe, we will create two scripts that will send FormData, one with a plain JavaScript and the other with jQuery, as well as the server-side code to support it.

Getting ready

The server will be done in Node.js using restify (http://mcavage.github.io/node-restify/). In order to install the dependencies, a package.json file can be created where restify will be added.

How to do it...

  1. The server should be able to accept an HTTP POST of type multipart/form-data; that is why we use the built-in restify plugin called bodyParser. This plugin blocks the request chain while it parses the HTTP request body:

    var server = restify.createServer();
    server.use(restify.bodyParser({ mapParams: false }));
    server.post('hi', addHeaders, doPost);

  2. The parser switches on the content type and applies the appropriate logic for application/json, application/x-www-form-urlencoded, and multipart/form-data. The addHeaders parameter is the same one we added in the other examples to enable CORS. For simplicity, in our doPost handler we just log the request body and return HTTP 200:

    function doPost(req, res, next) {
      console.log("Got HTTP " + req.method + " on " + req.url + " responding");
      console.log(req.body);
      res.send(200);
      return next();
    }

  3. For the client side, we create an HTML file that will have a simple script:

    (function() {
      var myForm = new FormData();
      myForm.append("username", "johndoe");
      myForm.append("books", 7);
      var xhr = new XMLHttpRequest();
      xhr.open("POST", "http://localhost:8080/hi");
      xhr.send(myForm);
    }());

  4. The jQuery way is a lot simpler; we can set the FormData object as the data attribute in jQuery.ajax(), where we additionally need to disable data processing before sending and leave the content type untouched:

    (function() {
      var formData = new FormData();
      formData.append("text", "some strange data");
      $.ajax({
        url: "http://localhost:8080/hi",
        type: "POST",
        data: formData,
        processData: false, // don't process data
        contentType: false  // don't set contentType
      });
    }());

How it works...

The transmitted data will have the same format as it would if we submitted a form with the multipart/form-data encoding type. The need for this type of encoding comes from sending files mixed together with other data. The encoding is supported by most web browsers and web servers, and it can also be used by clients that are not HTML forms or are not even part of a browser.

If we take a look at request being sent, we can see that it has the following data:

Content-Length: 239
Content-Type: multipart/form-data; boundary=----WebKitFormBoundaryQXGzNXa82frwui6S

The payload will be as follows:

------WebKitFormBoundaryQXGzNXa82frwui6S
Content-Disposition: form-data; name="username"

johndoe
------WebKitFormBoundaryQXGzNXa82frwui6S
Content-Disposition: form-data; name="books"

7
------WebKitFormBoundaryQXGzNXa82frwui6S--

You may notice that each of these parts contains a Content-Disposition section with the name of the control that is the origin of the data or, in our case, the key we set in every append to the FormData object. There is also an option to set the content type on each individual part; for example, if we had an image from a control named profileImage, that part could look as follows:

Content-Disposition: form-data; name="profileImage"; filename="me.png"
Content-Type: image/png

The last call to xhr.send() in example.js sets the content type automatically when we are sending an object of type FormData.
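To see how such a payload is put together, here is a sketch of our own that assembles a multipart/form-data body from key-value pairs the way FormData does internally (the boundary string is made up; real browsers generate a random one, and file parts would carry extra headers as shown above):

```javascript
// Assemble a multipart/form-data body from simple text fields.
// `buildMultipart` is our own illustrative helper, not a browser API.
function buildMultipart(fields, boundary) {
  var body = '';
  Object.keys(fields).forEach(function (name) {
    body += '--' + boundary + '\r\n';
    body += 'Content-Disposition: form-data; name="' + name + '"\r\n\r\n';
    body += fields[name] + '\r\n';
  });
  // the closing boundary carries two trailing dashes
  return body + '--' + boundary + '--\r\n';
}

var payload = buildMultipart({ username: 'johndoe', books: 7 }, 'XBoundaryX');
console.log(payload);
```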

And if we need to support older legacy browsers that don't have XMLHttpRequest Level 2, we can check whether FormData is present and handle that case accordingly:

if (typeof FormData === "undefined")

The method we use as a fallback then cannot be an AJAX call with FormData, but this should not be a problem, as all modern browsers support it; only older browsers such as Internet Explorer versions below 10 lack support.
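A sketch of one such fallback path, using a made-up helper that serializes the fields as application/x-www-form-urlencoded so they could be posted with a plain form or a classic XHR:

```javascript
// Serialize simple key-value fields for an
// application/x-www-form-urlencoded POST body.
// `toUrlEncoded` is our own name, not a standard API.
function toUrlEncoded(fields) {
  return Object.keys(fields).map(function (name) {
    return encodeURIComponent(name) + '=' + encodeURIComponent(fields[name]);
  }).join('&');
}

console.log(toUrlEncoded({ username: 'johndoe', books: 7 }));
// username=johndoe&books=7
```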

Posting a binary file to the server

Posting text, XML, or JSON to the server is relatively easy, and most JavaScript libraries are optimized for that scenario.

Posting binary data is slightly trickier. Modern applications may need to be able to upload the generated binary files; examples include images drawn on an HTML5 canvas, ZIP files created with JSZip, and so on.

Additionally, it's convenient to be able to upload files selected using the HTML5 file API. We can do some interesting things with it, such as resumable file uploads by splitting the file into smaller parts and uploading every part separately to the server.
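The splitting step can be sketched as a helper of our own that computes the byte ranges; each range could then be cut out of a File object with slice() and uploaded separately:

```javascript
// Compute [start, end) byte ranges for uploading a file in chunks.
// `chunkRanges` is an illustrative helper, not part of the File API.
function chunkRanges(fileSize, chunkSize) {
  var ranges = [];
  for (var start = 0; start < fileSize; start += chunkSize) {
    ranges.push({ start: start, end: Math.min(start + chunkSize, fileSize) });
  }
  return ranges;
}

console.log(chunkRanges(10, 4)); // three ranges: 0-4, 4-8, 8-10
```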

In this recipe, we're going to upload files selected by the user using a file input.

Getting ready

The server will be implemented using Node.js—you can download and install Node.js from http://nodejs.org/. The server will be implemented with the Node.js framework Connect (http://www.senchalabs.org/connect/).

How to do it...

Let's write the client and server code.

  1. Create a file named index.html—the file upload page that includes a file input, upload button, a progress bar, and a message container:
    <!DOCTYPE HTML>
    <html>
    <head>
    <title>Upload binary file</title>
    <style type="text/css">
    .progress {
    position:relative;
    height:1em; width: 12em;
    border: solid 1px #aaa;
    }
    .progress div {
    position: absolute;
    top:0; bottom:0; left:0;
    background-color:#336699;
    }
    </style>
    </head>
    <body>
    <input type="file" id="file" value="Choose file">
    <input type="button" id="upload" value="Upload"><br>
    <p id="info"></p>
    <div class="progress"><div id="progress"></div></div>
    <script src = "http://ajax.googleapis.com/ajax/libs/jquery/1.8.2/jquery.min.js"></script>
    <script type="text/javascript" src = "uploader.js"></script>
    <script type="text/javascript" src = "example.js"></script>
    </body>
    </html>
  2. Create a file named uploader.js that implements a binary file uploader. It posts the file to a specified URL and returns an object that enables the binding of progress events:
    window.postBinary = function(url, data) {
    var self = {},
    xhr = new XMLHttpRequest();
    xhr.open('POST', url, true);
    xhr.responseType = 'text';
    self.done = function(cb) {
    xhr.addEventListener('load', function() {
    if (this.status == 200)
    cb(null, this.response)
    else
    cb(this.status, this.response)
    });
    return self;
    }
    self.progress = function(cb) {
    xhr.upload.addEventListener('progress', function(e) {
    if (e.lengthComputable)
    cb(null, e.loaded / e.total);
    else
    cb('Progress not available');
    });
    return self;
    };
    xhr.send(data);
    return self;
    };
  3. Create a file named example.js that uses the API provided by uploader.js to add the upload functionality to the upload form:
    $(function() {
    var file;
    $("#file").on('change', function(e) {
    file = this.files[0]
    });
    $("#upload").on('click', function() {
    $("#info").text("Uploading...");
    $("#progress").css({width:0});
    if (!file) {
    $("#info").text('No file selected')
    return;
    }
    var upload = postBinary('/upload/' + file.name, file);
    upload.progress(function(err, percent) {
    if (err) {
    $("#info").text(err);
    return;
    }
    $("#progress").css({width: percent + '%'});
    });
    upload.done(function(err, res) {
    if (err) {
    $("#info").text(err + ' ' + res);
    return;
    }
    $("#progress").css({width: '100%'});
    $("#info").text("Upload complete");
    });
    });
    });
  4. Create a file named server.js—a Node.js server based on the Node.js Connect framework that handles the file uploads and serves the static files:
    var path = require('path'),
    connect = require('connect'),
    fs = require('fs');
    connect()
    .use('/upload', function(req, res) {
    var file = fs.createWriteStream(
    path.join(__dirname, 'uploads', req.url))
    req.pipe(file);
    req.on('end', function() {
    res.end("ok");
    });
    })
    .use(connect.static(__dirname))
    .listen(8080);
  5. Open a command prompt from the directory where server.js is located and type the following commands to create a directory for the uploads, install the connect library, and start the server:
    mkdir uploads
    npm install connect
    node server.js
  6. Navigate your browser to http://localhost:8080 to test the example. All the created files (including server.js) should be in the same directory.

How it works...

The new XMLHttpRequest object in HTML5 has a send method that supports more types of data. It can accept File, Blob, and ArrayBuffer objects. We use this new functionality together with the HTML5 File API to upload the file selected by the user. 

The new API also provides an upload object, which is of type XMLHttpRequestUpload. It allows us to attach event listeners to monitor the upload progress. We use this functionality to show a progress bar for the upload.

The server accepts the uploads at '/upload' and saves the files to the uploads directory. Additionally, it serves the static files in the example directory.

There's more…

The new XHR API is only available in Internet Explorer 10 and up.

Some browsers may fail to fire upload progress events.

Creating an SSL connection with Node.js

A common security problem is the so-called man-in-the-middle attack, a form of eavesdropping in which the attacker makes independent connections to the victims and relays messages between them. To do this, the attacker must be able to intercept the messages and change them, which is only possible if the attacker can successfully impersonate both involved parties. Secure Sockets Layer (SSL) and its successor Transport Layer Security (TLS) prevent these types of attacks by encrypting the data. In this recipe, we create a Node.js server using restify that has support for HTTPS.

Getting ready

We will use a certificate and a server private key in order to enable HTTPS. To generate this, we need OpenSSL (http://www.openssl.org/), a fully featured open source toolkit implementing SSL and TLS, as well as a general purpose cryptography library.

First, on the command line, generate an RSA (http://en.wikipedia.org/wiki/RSA_(algorithm)) private key:

openssl genrsa -out privatekey.pem 1024

The privatekey.pem file that gets generated contains a PEM-encoded block delimited by -----BEGIN RSA PRIVATE KEY----- and -----END RSA PRIVATE KEY----- lines; the key you generate will be unique to you.

Note that the private key is called private for a reason: you should not put it in any version control system or make it accessible to everyone. It should be kept safe, as it is your real identification.

Next, we will create a Certificate Signing Request (CSR) file using the private key that we just created, along with some additional information that we will be prompted to enter:

openssl req -new -key privatekey.pem -out csr.pem

After filling out the form, we get a generated CSR file, which is intended for asking a Certificate Authority to sign our certificate. We could send this file to them for processing, and they would give us a certificate. As we are only creating a simple example, we will self-sign the certificate using our private key:

openssl x509 -req -in csr.pem -signkey privatekey.pem -out publiccert.pem

The publiccert.pem file is the one that we will use as a certificate in our server.

How to do it...

  1. First we add the dependencies, and then we create an options object where we read out the key and the certificate that we generated:

    var restify = require('restify');
    var fs = require('fs');

    // create options for the https server instance
    var httpsOptions = {
      key: fs.readFileSync('privatekey.pem'),        // private key
      certificate: fs.readFileSync('publiccert.pem') // certificate
    };

    File IO in Node.js is provided using the fs module. This is a wrapper to the standard POSIX functionality. The documentation on it can be found at http://nodejs.org/api/fs.html.

  2. We continue with the creation of the routes and handlers, and in order not to duplicate the logic for the two server instances, we create a common serverCreate function:

    var serverCreate = function(app) {
      function doHi(req, res, next) {
        var name = 'nobody';
        if (req.params.name) {
          name = req.params.name;
        }
        res.send('Hi ' + name);
        return next();
      }
      app.get('/hi/', doHi);
      app.get('/hi/:name', doHi);
    }

  3. Then we can use this function to create instances of the two servers:

    var server = restify.createServer();
    var httpsServer = restify.createServer(httpsOptions);

    serverCreate(server);
    serverCreate(httpsServer);

  4. We can set the standard server to listen to port 80 and the HTTPS version to port 443:

    server.listen(80, function() {
      console.log('started at %s', server.url);
    });
    httpsServer.listen(443, function() {
      console.log('started at %s', httpsServer.url);
    });

  5. Now we can call node server.js to start the servers and try to access the following pages from the browser:
    • http://localhost:80/hi/John
    • https://localhost:443/hi/UncleSam

How it works...

The first thing you might encounter when running the server is an error similar to the following:

Error: listen EACCES
    at errnoException (net.js:770:11)
    at Server._listen2 (net.js:893:19)

The problem here is that the server cannot bind to a port number smaller than 1024 unless it has root or administrative privileges.

The HTTPS server we just created uses public key cryptography. Each peer has two keys: one public and one private.

In cryptography, commonly the involved parties are called Alice and Bob, so we will use the same names. More on the topic can be found on Wikipedia at http://en.wikipedia.org/wiki/Alice_and_Bob.

Alice and Bob's public keys are shared with everyone, while their private keys are kept secret. In order for Alice to encrypt a message that she needs to send to Bob, she needs Bob's public key; to prove it came from her, she signs it with her own private key. On the other side, Bob decrypts the message with his private key, and verifies the signature with Alice's public key.

In TLS connections, the public key is the certificate, as it is signed to prove that the real owner is who they claim to be; for example, Bob. TLS certificates can be signed by a Certificate Authority (CA) that actually confirms that Bob is who he claims to be. Firefox, Chrome, and the other browsers have a list of root CAs that are trusted for issuing certificates. A root CA may issue certificates to other signing authorities that sell them to the general public; a very interesting business, don't you think?

In our case, we self-signed our certificate, so it is not trusted by the browsers; when we open the page, we get a warning that the connection is untrusted.

This message will not appear when we use a CA-signed certificate, as we would have an authority recognized by our browser as a trusted one.

There's more...

The Open Web Application Security Project, or OWASP (https://www.owasp.org/), has a comprehensive database of common security problems and pitfalls when creating web applications. There you can find a great security cheat sheet for HTML5 applications (https://www.owasp.org/index.php/HTML5_Security_Cheat_Sheet). When it comes to HTTPS, one common problem is having mixed content that does not all come over the same protocol. One simple way to increase security is to have every request sent over TLS/SSL.


Making real-time updates with Ajax Push

Comet is a web model in which a long-held HTTP request allows the server to "push" data to the browser without the browser making a request explicitly. Comet is known by many different names: Ajax Push, Server Push, Reverse Ajax, the Two-Way Web, and so on. In this recipe, we are going to create a simple server that sends, or "pushes", its current time to the client.

Getting ready

For this example, we will use Node.js and a library called Socket.IO (http://socket.io/). The dependency can be included in the package.json file or directly installed from npm.

How to do it...

Let's get started.

  1. First, we will start with the server side, where we will add the needed require statements for Socket.IO, HTTP, and filesystem:

    var app = require('http').createServer(requestHandler),
        io = require('socket.io').listen(app),
        fs = require('fs');

  2. The server is initialized with requestHandler, where we will just serve an index.html file, placed in the same directory, that we will create a bit later:

    function requestHandler(req, res) {
      fs.readFile('index.html', function (err, data) {
        if (err) {
          res.writeHead(500);
          return res.end('Error loading index.html');
        }
        res.writeHead(200);
        res.end(data);
      });
    }

  3. If the file cannot be read, the handler returns HTTP 500, and if everything is fine, it just returns the data; a very simplified handler. We set the server to listen on port 80 with app.listen(80), and afterwards we can continue with the Socket.IO-related configuration:

    io.configure(function () {
      io.set("transports", ["xhr-polling"]);
      io.set("polling duration", 10);
    });

    Here we set the only allowed transport to be xhr-polling for the purpose of the example. Socket.IO has support for multiple different ways of sending server-side events to the client, so we disabled everything else.

    Note that in a real-life application, you probably will want to leave the other transport methods as they might be a better option for the given client or act as a fallback mechanism.

  4. Afterwards, we can continue with the events. On every connection we get, we first emit a ping event with some JSON data towards the client, and on every received pong event, we wait 15 seconds and then again send some JSON data with the current server time:

    io.sockets.on('connection', function (socket) {
      socket.emit('ping', { timeIs: new Date() });
      socket.on('pong', function (data) {
        setTimeout(function() {
          socket.emit('ping', { timeIs: new Date() });
          console.log(data);
        }, 15000);
      });
    });

  5. Now on the client side, we will include the socket.io.js file; as we are serving our index.html file from Node, it is available at the following default path:

    <script src = "/socket.io/socket.io.js"></script>

  6. After that, we connect to localhost and wait for a ping event, and on every such event, we append a p element with the server time. We then emit a pong event to the server:

    <script>
      var socket = io.connect('http://localhost');
      socket.on('ping', function (data) {
        var p = document.createElement("p");
        p.textContent = 'Server time is ' + data.timeIs;
        document.body.appendChild(p);
        socket.emit('pong', { my: 'clientData' });
      });
    </script>

Now when we start the server and access index.html by opening http://localhost, we should be getting server updates without explicitly asking for them:

Server time is 2013-02-05T06:14:33.052Z

How it works...

If we don't restrict the transport method to Ajax polling (xhr-polling), Socket.IO will attempt to use the best method available. Currently, several transports are supported: WebSocket, Adobe Flash Socket, AJAX long polling, AJAX multipart streaming, Forever Iframe, and JSONP polling.

Depending on the browser used, different methods might be better, worse, or not available, but it's safe to say that WebSockets are the future. Long polling is easier to implement on the browser side and works with every browser that supports XMLHttpRequest.

As the name suggests, long polling works with the client requesting the server for an event. This request is left open until the server has sent some new data to the browser or has closed the connection.
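The loop just described can be sketched independently of any transport; here `request` is any function that eventually calls back with an event, which in a browser would wrap an XMLHttpRequest (the helper and the simulation below are our own, not Socket.IO code):

```javascript
// The long-polling loop: issue a request, wait for the server to answer
// with an event, hand it to the callback, then immediately reconnect.
function longPoll(request, onEvent, shouldStop) {
  request(function (event) {
    onEvent(event);
    if (!shouldStop()) longPoll(request, onEvent, shouldStop);
  });
}

// Simulate three polls with a fake, synchronous request function.
var events = [];
var n = 0;
longPoll(
  function (cb) { cb('tick ' + (++n)); }, // fake server response
  function (e) { events.push(e); },       // event handler
  function () { return n >= 3; }          // stop after three events
);
console.log(events); // [ 'tick 1', 'tick 2', 'tick 3' ]
```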

If we open up the browser's network console in our example, we can see that a request is made towards the server but kept open, as the response has not yet finished.

As we configured the polling duration to 10 seconds with io.set("polling duration", 10), the connection will be closed after that and another one opened. The first thing you might be wondering is why we ever need to close the connections. Well, if we don't, the resources on the server will easily get depleted.

You may notice the closing and sending of the data in the server console:

   debug - xhr-polling received data packet 5:::{"name":"pong","args":[{"my":"clientData"}]}
   debug - setting request GET /socket.io/1/xhr-polling/5jBJdDQ6Uc2ZYXzZHcqd?t=1360050667340
   debug - setting poll timeout
   debug - discarding transport

One additional thing to note is that as soon as the connection is closed, either due to a received response or a server-side timeout, a new one is created. The newly created request usually finds the server already waiting for it, resulting in a significant reduction in latency.

There's more...

Socket.IO has plenty of other features that we did not cover. One of them is the broadcasting of messages to all the connected clients. For example, to let everyone know that a new user connected, we can do the following:

io.sockets.on('connection', function (soc) {
  soc.broadcast.emit('user connected');
});

Even if we don't use Node.js, the Comet techniques or "hacks" are available in most programming languages, and they are a great way to improve the user experience.

Exchanging real-time messages using WebSockets

Before HTML5 Web Sockets, web applications that needed to implement real-time updates, such as chat messages and game moves, had to resort to inefficient methods.

The most popular method was long polling, where a connection to the server is kept open until an event arrives. Another popular method was streaming chunked blocks of JavaScript to an iframe element, also known as Comet streaming.

HTML5 WebSockets enable the exchange of real-time messages with the web server. The API is much cleaner and easier to use, less error-prone, and provides lower message latency.

In this recipe, we're going to implement a simple chat system based on WebSockets. To make the system easier to extend, we're going to use dnode on top of the underlying WebSockets. The dnode library provides full callback-based RPC for multiple languages and platforms: Node.js, Ruby, Java, and Perl. Essentially, it enables us to call server-side code as if it were executing on the client side.

Getting ready

The server will be implemented using Node.js—you can download and install Node.js from http://nodejs.org/.

To prepare yourself, you will also need to install some node modules. Create a new directory for the recipe and type in the following commands to install node modules:

npm install -g browserify
npm install express shoe dnode

How to do it...

Let's write the client and the server.

  1. Create the main chat page containing a list of messages, a list of users, and a text input box in index.html. The chat page is styled to fill the whole browser viewport.

    <!DOCTYPE HTML>
    <html>
    <head>
    <title>Using websockets</title>
    <style type="text/css">
    #chat {
      position: absolute; overflow: auto;
      top: 0; left: 0; bottom: 2em; right: 12em;
    }
    #users {
      position: absolute; overflow: auto;
      top: 0; right: 0; width: 12em; bottom: 0;
    }
    #input {
      position: absolute; overflow: auto;
      bottom: 0; height: 2em; left: 0; right: 12em;
    }
    #chat .name { padding-right: 1em; font-weight: bold; }
    #chat .msg { padding: 0.33em; }
    </style>
    </head>
    <body>
    <div id="chat"></div>
    <div id="users"></div>
    <input type="text" id="input">
    <script src = "http://ajax.googleapis.com/ajax/libs/jquery/1.8.2/jquery.min.js"></script>
    <script type="text/javascript" src = "example.min.js"></script>
    </body>
    </html>

  2. Create a file named chat.js—a chat room implementation in JavaScript. The chat() function creates a chat room and returns the public API of the chatroom, consisting of the join, leave, msg, ping, and listen functions.

    function keysOf(obj) {
      var k = [];
      for (var key in obj)
        if (obj.hasOwnProperty(key)) k.push(key);
      return k;
    }

    function chat() {
      var self = {}, users = {}, messages = [];

      // Identify the user by comparing the data provided
      // for identification with the data stored server-side
      function identify(user) {
        return users[user.name] && user.token == users[user.name].token;
      }

      // Send an event to all connected chat users that
      // are listening for events
      function emit(event) {
        console.log(event);
        for (var key in users)
          if (users.hasOwnProperty(key))
            if (users[key].send) users[key].send(event);
      }

      // This function resets the timeout countdown for a
      // specified user. The countdown is reset on every user
      // action and every time the browser sends a ping.
      // If the countdown expires, the user is considered
      // to have closed the browser window and no longer present.
      function resetTimeout(user) {
        if (user.timeout) {
          clearTimeout(user.timeout);
          user.timeout = null;
        }
        user.timeout = setTimeout(function() {
          self.leave(user, function() {});
        }, 60000);
      }

      // When a user attempts to join, he must reserve a
      // unique name. If this succeeds, he is given an auth
      // token along with the name. Only actions performed
      // using this token will be accepted as coming from
      // the user. After the user joins, a list of users and
      // past messages are sent to him along with the
      // authentication information.
      self.join = function(name, cb) {
        if (users[name]) return cb(name + " is in use");
        users[name] = {
          name: name,
          token: Math.round(Math.random() * Math.pow(2, 30))
        };
        resetTimeout(users[name]);
        emit({type: 'join', name: name});
        cb(null, {
          you: users[name],
          messages: messages,
          users: keysOf(users)
        });
      }

      // The leave function is called when the user leaves
      // after closing the browser window.
      self.leave = function(user, cb) {
        if (!identify(user)) return;
        clearTimeout(users[user.name].timeout);
        delete users[user.name];
        emit({type: 'leave', name: user.name});
        cb(null);
      }

      // The msg function allows the user to send a
      // message. The message is saved with a timestamp,
      // then sent to all users as an event.
      self.msg = function(user, text) {
        if (!identify(user)) return;
        resetTimeout(users[user.name]);
        var msg = {
          type: 'msg',
          name: user.name,
          text: text,
          time: Date.now()
        };
        messages.push(msg);
        emit(msg);
      }

      // The ping function allows the browser to reset
      // the timeout. It lets the server know that the
      // user hasn't closed the chat yet.
      self.ping = function(user) {
        if (identify(user)) resetTimeout(users[user.name]);
      }

      // The listen function allows the user to provide
      // a callback function to be called for every event.
      // This way the server can call client-side code.
      self.listen = function(user, send, cb) {
        if (!identify(user)) return;
        users[user.name].send = send;
      }

      return self;
    };

    module.exports = chat;

  3. Let's create the Node.js script named server.js, implementing the web server:

    var express = require('express'),
        http = require('http'),
        chat = require('./chat.js'),
        shoe = require('shoe'),
        dnode = require('dnode');

    // Create an express app
    var app = express();
    // that serves the static files in this directory
    app.use('/', express.static(__dirname));
    // then create a web server with this app
    var server = http.createServer(app);

    // Create a chat room instance,
    var room = chat();
    // then create a websocket stream that
    // provides the chat room API via dnode
    // and install that stream on the http server
    // at the address /chat
    shoe(function (stream) {
        var d = dnode(room);
        d.pipe(stream).pipe(d);
    }).install(server, '/chat');

    // start the server
    server.listen(8080);

  4. Create a file named example.js to implement the chat client:

    var shoe = require('shoe'),
        dnode = require('dnode');

    $(function() {
        // Add a message to the message div
        function addMsg(msg) {
            var dMsg = $("<div />").addClass('msg'),
                dName = $("<span />").addClass('name')
                    .text(msg.name).appendTo(dMsg),
                dText = $("<span />").addClass('text')
                    .text(msg.text).appendTo(dMsg);
            dMsg.appendTo("#chat");
            $("#chat").scrollTop($("#chat")[0].scrollHeight);
        }

        // Re-display the list of present users.
        function showUsers(users) {
            $("#users").html('');
            users.forEach(function(name) {
                $("<div />").addClass('user')
                    .text(name).appendTo('#users');
            });
        }

        // Create a client-side websocket stream
        // piped to a dnode instance
        var stream = shoe('/chat');
        var d = dnode();

        // When the remote chat API becomes available
        d.on('remote', function (chat) {
            // Attempt to join the room until a suitable
            // nickname that is not already in use is found
            function join(cb, msg) {
                var name = prompt(msg || "Enter a name");
                chat.join(name, function(err, data) {
                    if (err) join(cb, err);
                    else cb(data);
                });
            }
            join(function(data) {
                var me = data.you, users = data.users;
                // Show the users and messages after joining
                showUsers(users);
                data.messages.forEach(addMsg);
                // Allow the user to send messages
                $("#input").on('keydown', function(e) {
                    if (e.keyCode == 13) {
                        // sending works by calling the
                        // remote's msg function.
                        chat.msg(me, $(this).val());
                        $(this).val('');
                    }
                });
                // Tell the remote we're listening for events
                chat.listen(me, function(e) {
                    if (e.type == 'msg') return addMsg(e);
                    if (e.type == 'leave')
                        users.splice(users.indexOf(e.name), 1);
                    else if (e.type == 'join')
                        users.push(e.name);
                    showUsers(users);
                });
                // Tell the remote every 30 seconds that
                // we're still active
                setInterval(function() { chat.ping(me); }, 30000);
            });
        });

        // pipe dnode messages to the websocket stream
        // and messages from the stream to dnode
        d.pipe(stream).pipe(d);
    });

  5. Use browserify to create example.min.js:

    browserify example.js --debug -o example.min.js

  6. Start the node server:

    node server.js

  7. Navigate your browser to http://localhost:8080 to test the example.

How it works...

We're not using the WebSockets API directly here. The reason is that it's not easy to send responses to messages using raw WebSockets: they don't support a request-response cycle. Because of that, it would be much harder to implement some of the RPC calls, such as asking the server whether a name is available.

On the other hand, the dnode protocol supports passing local callbacks to remote functions, which in turn can pass callbacks of their own to the callbacks they receive, and so on, resulting in a very powerful, full RPC implementation. This allows us to extend our application to meet new demands as they arise. As a bonus, the resulting API is much clearer and more expressive.

Here is what we did to implement a chatroom with dnode:

  1. We created a simple object that uses continuation-passing style to return errors and values for all functions. This is our chatroom object and defines the RPC API for our application.
  2. We defined a WebSockets server based on the shoe library that creates a new Node.js stream for every connected client. Then we installed it to the regular HTTP server at the /chat route.
  3. We connected the two by piping every connected client stream to a newly created dnode stream based on the chatroom object.

That's all! Then, to use the API on the client, we did the following:

  1. We defined a WebSockets client based on the shoe library that connects to the HTTP server at the /chat route and creates a new Node.js stream when the connection is established.
  2. We piped that stream to a newly created dnode client.
  3. After establishing a connection, the dnode client received an object containing the API defined in step 1—all the functions are available.

Summary

This article explained how to exchange real-time messages between browser and server using WebSockets and dnode. Find out more about dnode at https://github.com/substack/dnode.

IE versions up to IE 9 don't support the WebSockets API. As of February 2013, the built-in browser in the latest version of Android (v 4.2) doesn't support the WebSockets API either.

About the Authors


Gorgi Kosev

Gorgi Kosev is the lead software engineer at CreationPal, where he currently works on the development of mobile and desktop HTML5 applications, as well as cloud solutions based on Node.js. He is also responsible for the selection and development of the technology stack used in CreationPal products, such as SportyPal and Appzer for Google Docs.

He received his degree in Electrical Engineering from the University of Ss. Cyril and Methodius in 2008, and his Master's in Intelligent Information Systems in 2011. His research interests include collaborative computer systems and machine learning.

In his spare time, he enjoys sharing code and hardware hacks with members of the local hack lab and playing the piano.

Mite Mitreski

Mite Mitreski works on custom enterprise application development with a primary focus on Java and JVM-based solutions. He has worked as a programming course trainer in the past. He is deeply involved in activities surrounding developer groups in Macedonia, where he is currently the leader of the Java User Group, Macedonia. Mite has a great passion for free and open source software, open data formats, and the open web. Occasionally, he writes at his blog at http://blog.mitemitreski.com and can be reached on Twitter at @mitemitreski.
