An Overview of Web Services in Sakai

Sakai CLE Courseware Management: The Official Guide
by Alan Mark Berg | July 2011 | Open Source

When Sakai was first designed, the specifics of the majority of the connected systems were not knowable. To adapt to these tough circumstances, Sakai supplies web services that are easy to hook into or write. Sakai exposes services for creating and maintaining users, sites, and groups. These services are easily extensible to include any part of the Sakai framework.

This article by Alan Berg, author of Sakai CLE Courseware Management, explains the two main types of web service, Simple Object Access Protocol (SOAP) and Representational State Transfer (REST). It also covers the already-existing web services and describes how to hook into them. If you follow the examples, you will be able to write and deploy your first service. Lastly, this article includes a few simple client-side Perl scripts that create new users, employing both the SOAP and RESTful approaches.

 


Connecting to Sakai is straightforward, and simple tasks, such as automatic course creation, take only a few lines of programming effort.

There are significant advantages to having web services in the enterprise. If a developer writes an application that calls a number of web services, the application does not need to know the hidden details behind the services; it just needs to agree on what data to send. This loosely couples the application to the services. Later, if one web service is replaced with another, programmers do not need to change the code on the application side. SOAP works well with most organizations' firewalls (http://en.wikipedia.org/wiki/Firewall), as SOAP uses the same protocol as web browsers. System administrators have a tendency to protect an organization's network by closing unused ports to the outside world. This means that most of the time there is no extra network configuration effort required to enable web services.

Another simplifying factor is that a programmer does not need to know the details of SOAP or REST, as there are libraries and frameworks that hide the underlying magic. For the Sakai implementation of SOAP, to add a new service is as simple as writing a small amount of Java code within a text file, which is then compiled automatically and run the first time the service is called. This is great for rapid application development and deployment, as the system administrator does not need to restart Sakai for each change. Just as importantly, the Sakai services use the well-known libraries from the Apache Axis project (http://ws.apache.org/axis/).

SOAP is an XML message-passing protocol that, in the case of Sakai, sits on top of the Hypertext Transfer Protocol (HTTP), the protocol used by web browsers to obtain web pages from a server. The client sends messages in XML format to a service, including the information that the service needs. The service then returns a message with the results or an error message.

The full definition of SOAP is given at http://www.w3.org/TR/soap12-part1.

The architects introduced SOAP-based web services to Sakai first, adding RESTful services later. Unlike SOAP, instead of sending XML via HTTP POSTs to one URL that points to a service, REST sends requests to a URL that includes information about the entity, such as a user, with which the client wishes to interact. For example, a REST URL for viewing an address book item could look similar to http://host/direct/addressbook_item/15. Applying URLs in this way makes for understandable, human-readable address spaces. This more intuitive approach simplifies coding. Further, SOAP XML passing requires that both the client and the server parse the XML, and at times the parsing effort is expensive in CPU cycles and response times.

The Entity Broker is an internal service that makes life easier for programmers and helps them manipulate entities. Entities in Sakai are managed pieces of data such as representations of courses, users, grade books, and so on. In the newer versions of Sakai, the Entity Broker has the power to expose entities as RESTful services. In contrast, for SOAP services, if you wanted a new service, you would need to write it yourself. Over time, the Entity Broker exposes more and more entities RESTfully, delivering more free integration hooks for connecting to other enterprise systems.

Both SOAP and REST services sit on top of the HTTP protocol.

 

Protocols

This section explains how web browsers talk to servers in order to gather web pages. It explains how to use the telnet command and a visual tool called TCPMON (http://ws.apache.org/commons/tcpmon/tcpmontutorial.html) to gain insight into how web services and Web 2.0 technologies work.

Playing with Telnet

It turns out that message passing occurs via text commands between the browser and the server. Web browsers use HTTP (http://www.w3.org/Protocols/rfc2616/rfc2616.html) to get web pages and the embedded content from the server and to send form information to the server. HTTP talks between the client and server via text (7-bit ASCII) commands. When humans talk with each other, they have a wide vocabulary. However, HTTP uses fewer than twenty words.

You can directly experiment with HTTP using a Telnet client to send your commands to a web server. For example, if your demonstration Sakai instance is running on port 8080, the following command will get you the login page:

telnet localhost 8080
GET /portal/login

The GET command does what it sounds like and gets a web page. Forms can use the GET verb to send data at the end of the URL. For example, GET /portal/login?name=alan&age=15 is sending the variables name=alan and age=15 to the server.
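To make the encoding concrete, here is a short illustrative sketch (in Python rather than the Perl used later in this article) that builds and decodes the same query string; the field names are taken from the example above:

```python
from urllib.parse import urlencode, parse_qs

# Build the query string for GET /portal/login?name=alan&age=15
params = {"name": "alan", "age": 15}
query = urlencode(params)
print("GET /portal/login?" + query)

# A server-side framework decodes the string back into fields
decoded = parse_qs(query)
print(decoded)  # {'name': ['alan'], 'age': ['15']}
```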

Installing TCPMON

You can use the TCPMON tool to view requests and responses from a web browser such as Firefox. One of TCPMON's abilities is that it can act as an invisible man in the middle, recording the messages between the web browser and the server. Once set up, the requests sent from the browser go to TCPMON and it passes the request on to the server. The server passes back a response and then TCPMON, a transparent proxy (http://en.wikipedia.org/wiki/Proxy_server), returns the response to the web browser. This allows us to look at all requests and responses graphically.

First, you can set up TCPMON to listen on a given port number (by convention, normally port 8888) and then configure your web browser to send its requests through the proxy. Then, when you type the address of a given page into the web browser, instead of going directly to the relevant server, the browser sends the request to the proxy, which passes it on and passes the response back. TCPMON displays both the requests and the responses in a window.

You can download TCPMON from http://ws.apache.org/commons/tcpmon/download.cgi.

After downloading and unpacking, you can—from within the build directory—run either tcpmon.bat for the Windows environment or tcpmon.sh for the UNIX/Linux environment. To configure a proxy, you can click on the Admin tab and then set the Listen Port to 8888 and select the Proxy radio button. After that, clicking on Add will create a new tab, where the requests and responses will be displayed later.

Your favorite web browser now has to recognize the newly-setup proxy. For Firefox 3, you can do this by selecting the menu option Edit/Preferences, and then choosing the Advanced tab and the Network tab, as shown in the next screenshot. You will need to set the proxy options, HTTP proxy to 127.0.0.1, and the port number to 8888. If you do this, you will need to ensure that the No proxies text input is blank. Clicking on the OK button enables the new settings.


To use the Proxy from within Internet Explorer 7 for a Local Area Network (LAN), you can edit the dialog box found under Tools | Internet Options | Connections | LAN settings.


Once the proxy is working, typing http://localhost:8080/portal/login in the address bar will seamlessly return the login page of your local Sakai instance. Otherwise, you will see an error message similar to Proxy Server Refused Connection for Firefox or Internet Explorer cannot display the webpage.

To turn off the proxy settings in Firefox 3, simply select the No Proxies radio button and click on OK; in Internet Explorer 7, unselect the Use a proxy server for the LAN tick box and click on OK.

Requests and returned status codes

When TCPMON is running a proxy on port 8888, it allows you to view the requests from the browser and the response in an extra tab, as shown in the following screenshot. Notice the extra information that the browser sends as part of the request. HTTP/1.1 defines the protocol and version level and the lines below GET are the header variables. The User-Agent defines which client sends the request. The Accept headers tell the server what the capabilities of the browser are, and the Cookie header defines the value stored in a cookie. HTTP is stateless, in principle; each response is based only on the current request. However, to get around this, persistent information can be stored in cookies. Web browsers normally store their representation of a cookie as a little text file or in a small database on the end users' computers.

Sakai uses the supporting features of a servlet container, such as Tomcat, to maintain state in cookies. A cookie stores a session ID, and when the server sees the session ID, it can look up the request's server-side state. This state contains information such as whether the user is logged in, or what he or she has ordered. The web browser deletes the local representation of the cookie each time the browser closes.

A cookie that is deleted when a web browser closes is known as a session cookie.

The server response starts with the protocol followed by a status number. HTTP/1.1 200 OK tells the web browser that the server is using HTTP version 1.1 and was able to return the requested web page successfully. 2xx status codes imply success. 3xx status codes imply some form of redirection and tell the web browser where to try to pick up the requested resource. 4xx status codes are for client errors, such as malformed requests or lack of permission to obtain the resource. 4xx states are fertile grounds for security managers to look in log files for attempted hacking. 5xx status codes mostly have to do with a failure of the server itself and are mostly of interest to system administrators and programmers during the debugging cycle. In most cases, 5xx status numbers are about either high server load or a broken piece of code. Sakai is changing rapidly and even with the most vigorous testing, there are bound to be the occasional hiccups. You will find accurate details of the full range of status codes at: http://www.w3.org/Protocols/rfc2616/rfc2616-sec10.html.
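The grouping by leading digit is easy to apply mechanically. As a quick illustrative sketch (in Python, with a hypothetical helper name), classifying a status code needs only integer division by 100:

```python
def status_class(code):
    """Classify an HTTP status code by its leading digit,
    as summarized above."""
    classes = {2: "success", 3: "redirection",
               4: "client error", 5: "server error"}
    return classes.get(code // 100, "other")

print(status_class(200))  # success: the page was returned
print(status_class(302))  # redirection: look elsewhere for the resource
print(status_class(404))  # client error: resource not found
print(status_class(500))  # server error: broken code or high load
```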

Another important part of the response is Content-Type, which tells the web browser which type of material the response is returning, so the browser knows how to handle it. For example, the web browser may want to run a plug-in for video types and display text natively. Content-Length in characters is normally also given. After the header information is finished, there is a newline followed by the content itself.

Web browsers interpret any redirects that are returned by sending extra requests. Web browsers also interpret any HTML pages and make multiple requests for resources such as JavaScript files and images. Modern browsers do not wait until the server returns all the requests, but render the HTML page live as the server returns the parts.

The GET verb is not very efficient for posting a large amount of data, as the URL has a length limit of around 2000 characters. Further, the end user can see the form data, and the browser may encode entities such as spaces to make the URL unreadable. There is also a security aspect: if you are typing passwords in forms using GET, others may see your password or other details. This is not a good idea, especially at Internet Cafés where the next user who logs on can see the password in the browsing history. The POST verb is a better choice. Let us take as an example the Sakai demonstration login page (http://localhost:8080/portal/login). The login page itself contains a form tag that points to the relogin page with the POST method.

<form method="post" action="http://localhost:8080/portal/relogin"
enctype="application/x-www-form-urlencoded">

Note that the HTML tag also defines the content type. Key features of the POST request compared to GET are:

  • The form values are stored as content after the header values
  • There is a newline between the end of the header and the data
  • The request mentions data and the amount of data by the use of the Content-Length header value

The essential POST values for a login form with user admin (eid=admin) and password admin (pw=admin) will look like:

POST http://localhost:8080/portal/relogin HTTP/1.1
Content-Type: application/x-www-form-urlencoded
Content-Length: 31

eid=admin&pw=admin&submit=Login

POST requests can contain much more information than GET requests, and they hide the values from the address bar of the web browser. This does not make them secure, however: the request body is just as visible on the wire as the URL, so POST values are neither hidden nor secure. The only viable solution is for your web browser to encrypt your transactions using SSL/TLS (http://www.ietf.org/rfc/rfc2246.txt), and this occurs every time you connect to a server using an HTTPS URL.
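To check the arithmetic in the example above, this small sketch (in Python; the form fields are the ones from the Sakai login form) builds the urlencoded body and computes the Content-Length value:

```python
from urllib.parse import urlencode

# The three form fields sent by the Sakai login page
fields = {"eid": "admin", "pw": "admin", "submit": "Login"}
body = urlencode(fields)

print(body)       # eid=admin&pw=admin&submit=Login
print(len(body))  # 31, the value of the Content-Length header
```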

SOAP

Sakai uses the Apache Axis framework, which the developers have configured to accept SOAP calls via POST. SOAP sends messages in a specific XML format with the Content-Type, otherwise known as MIME type, application/soap+xml. A programmer does not need to know more than that, as the client libraries take care of the majority of the excruciating low-level details. An example SOAP message generated by the Perl module, SOAP::Lite (http://www.soaplite.com/), for creating a login session in Sakai will look like the following POST data:

<?xml version="1.0" encoding="UTF-8"?>
<soap:Envelope
    xmlns:soap="http://schemas.xmlsoap.org/soap/envelope/"
    xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
    xmlns:xsd="http://www.w3.org/2001/XMLSchema"
    xmlns:soapenc="http://schemas.xmlsoap.org/soap/encoding/"
    soap:encodingStyle="http://schemas.xmlsoap.org/soap/encoding/">
  <soap:Body>
    <login xmlns="http://localhost:8081/sakai-axis/SakaiLogin.jws">
      <c-gensym3 xsi:type="xsd:string">admin</c-gensym3>
      <c-gensym5 xsi:type="xsd:string">admin</c-gensym5>
    </login>
  </soap:Body>
</soap:Envelope>

There is an envelope with a body containing data for the service to consume. The important point to remember is that both the client and the server have to be able to parse the specific XML schema. SOAP messages can include extra security features, but Sakai does not require these. The architects expect organizations to encrypt web services using SSL/TLS.
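Under the hood, a SOAP library simply builds such an envelope as a string and POSTs it. The following sketch (using Python's standard XML parser; the service URL and the auto-generated c-gensym parameter names are copied from the envelope above, purely for illustration) assembles a minimal envelope and confirms that it parses:

```python
import xml.etree.ElementTree as ET

SOAP_ENV = "http://schemas.xmlsoap.org/soap/envelope/"

# A minimal version of the login envelope shown above
envelope = (
    '<?xml version="1.0" encoding="UTF-8"?>'
    '<soap:Envelope xmlns:soap="' + SOAP_ENV + '"><soap:Body>'
    '<login xmlns="http://localhost:8081/sakai-axis/SakaiLogin.jws">'
    '<c-gensym3>admin</c-gensym3><c-gensym5>admin</c-gensym5>'
    '</login></soap:Body></soap:Envelope>'
)

# Both client and server must be able to parse the XML
root = ET.fromstring(envelope)
body = root.find("{" + SOAP_ENV + "}Body")
print(body[0].tag)  # the <login> element, namespaced by the service URL
```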

The last extra SOAP-related complexity is the Web Services Description Language (WSDL) (http://www.w3.org/TR/wsdl). Web services may change location or exist in multiple locations for redundancy. The service writer can define the location of the services and the data types involved with those services in another file, in XML format.

JSON

Also worth mentioning is JavaScript Object Notation (JSON) (http://tools.ietf.org/html/rfc4627), another popular format passed using HTTP. When web developers realized that they could make browsers load parts of a web page a piece at a time, it significantly improved the quality of the web browsing experience for the end user. This asynchronous loading enables all kinds of whiz-bang features, such as typing in a search term and choosing from a set of search-term completions before pressing the Submit button. Asynchronous loading delivers more responsive and richer web pages that feel more like traditional desktop applications than plain old web pages. JSON is one of the formats of choice for passing asynchronous requests and responses.

The asynchronous communication normally occurs through HTTP GET or POST, but with a specific content structure that is designed to be human readable and script language parser-friendly. JSON calls have the file extension .json as part of the URL. As mentioned in RFC 4627, an example image object communicated in JSON looks like:

{
  "Image": {
    "Width": 800,
    "Height": 600,
    "Title": "View from 15th Floor",
    "Thumbnail": {
      "Url": "http://www.example.com/image/481989943",
      "Height": 125,
      "Width": "100"
    },
    "IDs": [116, 943, 234, 38793]
  }
}
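The same structure is trivial for a scripting language to consume; for instance, with Python's standard json module (an illustrative sketch using the RFC 4627 example above):

```python
import json

doc = '''
{
  "Image": {
    "Width": 800,
    "Height": 600,
    "Title": "View from 15th Floor",
    "Thumbnail": {
      "Url": "http://www.example.com/image/481989943",
      "Height": 125,
      "Width": "100"
    },
    "IDs": [116, 943, 234, 38793]
  }
}
'''

image = json.loads(doc)["Image"]
print(image["Title"])   # View from 15th Floor
print(image["IDs"][0])  # 116
```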

By blurring the boundaries between client and server, a lot of the presentation and business logic now lives on the client side in scripting languages such as JavaScript. The scripting language orchestrates the loading of parts of pages and the generation of widget sets. Frameworks such as jQuery (http://jquery.com/) and MyFaces (http://myfaces.apache.org/) significantly ease the client-side programming burden.

REST

To understand REST, you need to understand the other verbs in HTTP (http://www.w3.org/Protocols/rfc2616/rfc2616-sec9.html). The full HTTP set is OPTIONS, GET, HEAD, POST, PUT, DELETE, and TRACE.

The HEAD verb returns from the server only the headers of the response without the content, and is useful for clients that want to see if the content has changed since the last request. PUT requests that the content in the request be stored at a particular location mentioned in the request. DELETE is for deleting the entity.

REST uses the URL of the request to route to the resource, and the HTTP verbs map naturally onto operations: POST creates an item, GET reads it, PUT updates it, and DELETE deletes it.

In SOAP, you are pointing directly towards the service the client calls or indirectly via the web service description. However, in REST, part of the URL describes the resource or resources you wish to work with. For example, a hypothetical address book application that lists all e-mail addresses in HTML format would look similar to the following:

GET /email

To list the addresses in XML format or JSON format:

GET /email.xml
GET /email.json

To get the first e-mail address in the list:

GET /email/1

To create a new e-mail address (remembering, of course, to include the rest of the e-mail details in the body of the request):

POST /email

In addition, to delete address 5 from the list use the following command:

DELETE /email/5

To obtain address 5 in other formats such as JSON or XML, use file extensions at the end of the URL, for example:

GET /email/5.json
GET /email/5.xml

RESTful services are intuitively more descriptive than SOAP services, and they enable easy switching of the format from HTML to JSON to fuel the dynamic and asynchronous loading of websites. Due to the direct use of HTTP verbs by REST, this methodology also fits well with the most common application type: CRUD (Create, Read, Update, and Delete) applications, such as the site or user tools within Sakai. Now that we have discussed the theory, in the next section we shall discuss which Sakai-related SOAP services already exist.
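The verb-plus-URL pattern above is regular enough to capture in a few lines. This sketch (in Python; rest_request is a hypothetical helper, not part of Sakai) builds the request line for each CRUD operation on the hypothetical address book:

```python
def rest_request(action, resource, item_id=None, fmt=None):
    """Map a CRUD action onto an HTTP verb and a RESTful URL."""
    verbs = {"create": "POST", "read": "GET",
             "update": "PUT", "delete": "DELETE"}
    url = "/" + resource
    if item_id is not None:
        url += "/" + str(item_id)
    if fmt is not None:
        url += "." + fmt
    return verbs[action] + " " + url

print(rest_request("read", "email"))             # GET /email
print(rest_request("read", "email", 5, "json"))  # GET /email/5.json
print(rest_request("create", "email"))           # POST /email
print(rest_request("delete", "email", 5))        # DELETE /email/5
```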

 


Existing web services

Sakai includes by default most community-requested web services, and there are a few more services in the contributed section of the source code repository. This section describes the currently available services and the next section explains an example use, creating a new user.

Recapping terminology

In general, developers write web services for other developers' code to connect to (consume), so it is worth recapping the terminology. When you create a site, a copy is made from a specific realm template for that particular site type. The permissions can then be modified for the roles in the site, and members added to the site with any of the specific roles. Internally, Sakai uses Authz Groups to keep track of groups of users. An Authz Group is an authorization group: a group of users, each with a role, and a set of function permissions assigned to each role. A site contains pages; when you click on the tool menu for a given tool, normally you will see only one tool displayed in a page. However, for the home page, you will see several tools contained within one page.

Default web services

Sakai has a comprehensive set of web services, with the list increasing with each new version. This section lists the services for Sakai 2.8.

To enable the web services, you will need to add the following property in sakai/sakai.properties:

webservices.allowlogin=true

Starting from Sakai 2.6, you also need to set the list of IP addresses or hosts that can connect to the web services. This property takes a comma-separated list of IP addresses/hosts and can include ranges. Note, however, that it is important to escape the period. For example:

webservices.allow=localhost,127\.0\.0\.1,192\.168\.[0-9.]+,domain\.somewhere\.edu\.au,123\.45\.678\.90

You can also set a few optional properties to log allowed and denied access attempts:

webservices.log-allowed=true
webservices.log-denied=true
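The escaped periods matter because each entry in webservices.allow is treated as a regular expression, in which an unescaped period matches any character. Here is a quick check of the 192\.168\.[0-9.]+ range (sketched with Python's regex engine; Sakai itself matches in Java, but these patterns behave the same way in both):

```python
import re

# One entry from the webservices.allow example above
pattern = re.compile(r"192\.168\.[0-9.]+")

print(bool(pattern.fullmatch("192.168.1.20")))  # True: inside the range
print(bool(pattern.fullmatch("192x168.1.20")))  # False: the escaped dots
                                                # match only literal periods
```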

The following table defines the default web services and the methods included. Notice that the SakaiScript service is the most comprehensive.

Service Methods Description
SakaiLogin login, logout Web services need to log in before they can call other services that do work
SakaiPortalLogin login, loginAndCreate, UsageSessionService_loginDirect Web services to help connections from Portal software such as uPortal
SakaiScript checkSession, addNewUser, removeUser, changeUserInfo, changeUserName, changeUserEmail, changeUserType, changeUserPassword, getUserEmail, getUserDisplayName, addNewAuthzGroup, removeAuthzGroup, addNewRoleToAuthzGroup, removeAllRolesFromAuthzGroup, removeRoleFromAuthzGroup, allowFunctionForRole, disallowAllFunctionsForRole, setRoleDescription, addMemberToAuthzGroupWithRole, removeMemberFromAuthzGroup, removeAllMembersFromAuthzGroup, setRoleForAuthzGroupMaintenance, addMemberToSiteWithRole, addNewSite, removeSite, copySite, addNewPageToSite, removePageFromSite, addNewToolToPage, addConfigPropertyToTool, checkForUser, checkForSite, checkForMemberInAuthzGroupWithRole, getSitesUserCanAccess, getAllSitesForUser, getSiteTitle, getSiteDescription, getSiteSkin, isSiteJoinable, changeSiteTitle, changeSiteSkin, changeSiteJoinable, changeSiteIconUrl, changeSiteDescription, getSiteProperty, setSiteProperty, removeSiteProperty, checkForRoleInAuthzGroup, searchForUsers, checkForAuthzGroup, removeMemberFromSite, checkForUserInAuthzGroup, getUsersInAuthzGroupWithRole, getUsersInAuthzGroup, copyCalendarEvents, getUserType, addNewToolToAllWorkspaces, copyRole, getAllUsers, getSessionForUser, getUserId, getPagesAndToolsForSite Function-rich service that includes the main services you would expect for manipulating users, sites, memberships, and permissions in sites
SakaiSession checkSession, getSessionUser Service that returns the session information of the string sent to it
SakaiSigning establishSession, testsign, verifysign, getsession, touchsession Enables an external application to verify a user and is normally used in conjunction with the Rutgers Link tool
SakaiSite establishSession, getUserSite, getSiteList, joinAllSites, getSitesDom, getToolsDom These are site manipulation services. The methods with the word DOM return strings in a specific XML format

A number of the services have the same establishSession method. This saves the client code calling a second service (SakaiLogin).

A consumer of web services is the Rutgers Link tool (https://source.sakaiproject.org/svn/linktool/). This is a tool within Sakai that points outwards to an external application of choice, and makes the end user believe that the external tool is actually a part of Sakai. When the end user clicks the tool link, the link directs the user's browser to the external application. As part of the request, the browser passes on a cookie containing an encrypted session ID. The external application then sends the encrypted session back to the testsign method contained within the SakaiSigning web service, which returns true if the Link tool generated the session. Through this approach, Single Sign-On (SSO) between Sakai and an external application is achieved, and the external tool now looks like a part of the Sakai site.

There are extra web services available in the contributed source repository (https://source.sakaiproject.org/contrib), including:

  • /rutgers/webservices/ for grade book manipulation
  • /sakaiadminx/trunk/ws/ to support delegated administration
  • /uct/webservices/ for manipulating assignments, users, content, and the message center, presence, and profile
  • /qa/trunk/provisioning/version_2/, an offering from the University of Michigan and Amsterdam University to support the generation of large numbers of populated sites, ready for use as part of a realistic stress-testing environment

 

Sakai and SOAP

Sakai SOAP web services piggyback on top of the Apache Axis project. Creating basic Sakai web services is programmer-friendly because Apache Axis removes many of the hard chores. All you have to do is create a Java class in a text file under the /web-apps/sakai-axis directory, and any public method is automatically compiled into a service with a WSDL file automatically generated for it, ready for discovery by the client program. The compilation of the web service occurs after creation or modification and is triggered by the next incoming request. Helpfully, when you make a typo or any other mistake, the server displays the compilation error as a web page at the URL of the broken service, as shown in the next screenshot. You'll notice that the line number and type of error are included. The combination of text processing with a rich set of services to call on, plus the fact that it is not necessary to restart the server every time you compile, makes for rapid development cycles.

My first web service

To create your first web service, you can add the file /web-apps/sakai-axis/MyTest.jws with the following contents to a running demonstration instance of Sakai:

public class MyTest {
    public String YouSaid(String message) {
        return "You said: " + message;
    }
}

Then, requesting http://localhost:8080/sakai-axis/MyTest.jws?wsdl will return a corresponding WSDL file similar to the following screenshot. Notice that it would take a human perhaps 30 minutes to write such a file by hand, whereas the computer took just milliseconds.

My first client

For the programmers among you, the following piece of Perl code consumes the service:

#!/usr/bin/perl
use SOAP::Lite;
my $host='http://localhost:8080';
my $soap = SOAP::Lite -> proxy("$host/sakai-axis/MyTest.jws?wsdl");
my $result =$soap->YouSaid("WHAT!");
print $result->result()."\n\n";

The returned result is:

You said: WHAT!

The SOAP Lite module interprets the WSDL file and after that, you can name the web service method directly in the code with the correct number of parameters. This feature results in code that is much more readable and hence maintainable. Changing the variable $host changes the server location. Changing the service and method requires the little nudge of modifying the last two lines of code.

A more realistic client example

Sakai web services will not let you perform any action without fulfilling two prerequisites: first, you need to enable the web services in sakai/sakai.properties (see section above on how to enable the web services). Second, the client code needs to obtain a session string from the login web service, and then use this string as part of any calls you make to other services. If the client code tries to perform any action without logging in, the server returns an error message.

The login service requires a username and password and it is very important to note that in production, you are expected to run the client code over an SSL/TLS connection.

The following piece of Perl code gets a session string and then uses it as a part of a second web service call to the addNewUser method which, as you would expect from the name, then creates a new user in Sakai.

#!/usr/bin/perl
use SOAP::Lite;

my $host = 'http://localhost:8080';

# Log in and keep the returned session string
my $soap = SOAP::Lite
    -> proxy("$host/sakai-axis/SakaiLogin.jws?wsdl");
my $result = $soap->login("admin", "admin");
my $sessionid = $result->result();

# Reuse the session string when calling SakaiScript
$soap = SOAP::Lite
    -> proxy("$host/sakai-axis/SakaiScript.jws?wsdl");
$result = $soap->addNewUser($sessionid, 'alanberg', 'Alan', 'Berg',
    'berg@xx.nl', '', 'useruser');
if ($result->fault) {
    print "Error\n";
} else {
    print "Success\n";
}

Even if you find the Perl code unreadable, the point of the example is to show that only a few lines of code are required for an enterprise to hook into Sakai.

 

Entity Broker

Over time, more and more tools and services are included within Sakai. Therefore, there is an ever-expanding set of data, such as courses, users, polls, forums, grade books, assessments, and new data structures, available for integration.

It would indeed be handy if, instead of needing to write a custom web service for each new entity, a tool programmer could call a service to register his or her data for exposure. The kernel would then become responsible for the end delivery and the RESTful web services. Because the programmer no longer has to deal with the low-level details, this structure reduces duplication of code and effort. It also increases maintainability, quality, and scalability, and generally eases the programmer's burden. Further, because entities are exposed by default as the MIME types HTML, JSON, and XML, you can write rich web-based applications and widget sets that consume the JSON and XML representations of the data within Sakai.

The Entity Broker is one such service that allows code to find and get at important data in Sakai, and easily manipulate that data from within Java objects. To accommodate the ever-changing set of requirements, the data needs to have some uniform parts to it, such as an ID and an associated URL, and it needs to have the ability to register its existence to a central service. If the data has this kind of a structure, it is called an entity, the original technical details of which you can find in the source code under: /reference/docs/architecture/sakai_entity.doc.

You can find the Java-specific details of the Entity Broker on Confluence: (http://confluence.sakaiproject.org/confluence/display/SAKDEV/Entity+Provider+and+Broker).

Unless you are a hardcore Sakai kernel programmer, it isn't important to understand the hidden and subtle details. You just need to know how to find out which services exist, and how to do business with those services.

Finding descriptions of services

For the demonstration instance, the Entity Broker services exist under the /direct URL space. To view a human-readable description of all the services available in your Sakai installation, visit http://localhost:8080/direct/describe. The following figure is the description of services available on one of the Sakai QA servers.

To read the description of the user service, use the following demonstration URL: http://localhost:8080/direct/user/describe.

Different tagged versions of Sakai expose different services. However, every available entity is described by the same URL structure: http://hostname:port/direct/entity_prefix/describe.
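As a quick sketch of this pattern (the prefixes below are common examples; check http://localhost:8080/direct/describe for the actual list on your server), the describe URL can be built from any entity prefix:

```shell
# Build describe URLs from entity prefixes.
# The prefixes here are illustrative; your Sakai version may expose others.
HOST="http://localhost:8080"
for PREFIX in user site session; do
  echo "${HOST}/direct/${PREFIX}/describe"
done
```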

It is helpful to read the specific description page for each entity, as the Entity Broker empowers the programmer to add custom actions. The describe page shown next is for the user entity.

Notice that custom actions currently exist and that the server returns data in either XML or JSON format.

Before logging in to the demonstration instance of Sakai, first visit the URL http://localhost:8080/direct/user/current. Notice that the returned HTML page tells you, in a 400 HTTP status error message, that there is no current user to get information about. This makes sense, as you have not logged in. After you log in and revisit the page, the server still does not return the user information. This is because, without an extension on the URL, the Entity Broker assumes HTML and tries to return the data in HTML format. Because HTML is not one of the supported return formats for this entity, an error occurs. One of the supported formats is JSON, so to obtain your current information in JSON format, simply visit http://localhost:8080/direct/user/current.json.
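As a small sketch of the two URL forms (the curl call is left as a comment because it assumes a running local instance and a valid session cookie, whose value is a placeholder here):

```shell
# Without an extension, the Entity Broker assumes HTML, which this entity
# does not support; appending .json selects a supported format.
BASE="http://localhost:8080/direct/user/current"
echo "Assumed HTML (fails for this entity): ${BASE}"
echo "JSON (supported): ${BASE}.json"
# From the command line you would also need a session, for example:
#   curl -b "JSESSIONID=<session-id>" "${BASE}.json"
```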

Authenticating

At this point, if you have TCPMON set up to watch the requests and responses, then running the example code below will let you see how REST works in practice.

For a client-side application to create a new user, it must first obtain a session via a POST to the URL http://localhost:8080/direct/session/new with the variables _username and _password set, as described at http://localhost:8080/direct/session/describe. The server returns the session ID in one of the response headers, EntityId, which the script then passes on in any requests it sends. You can also pass the session ID as sakai.session=sessionId, either as a header or in the URL, or use a cookie with the same values.

To create a user, the client application needs to POST to the user service with, at a minimum, the eid (Enterprise ID) variable set. The user/describe URL explains which name and value pairs are valid.

A client-side coding example

For the programming-inclined, the following listing creates a session as the user admin with the password admin, and then creates a user in Sakai with eid=its_alive, firstName=The, and lastName=Monster. For the sake of brevity, there is no programmatic error checking.

#!/usr/bin/perl -w
use strict;
use LWP::UserAgent;
use HTTP::Request::Common;

my $host       = 'http://localhost:8080';
my $credential = '_username=admin&_password=admin';
my $user       = 'eid=its_alive&firstName=The&lastName=Monster';

# A cookie jar lets the user agent carry the session cookie from the
# login request over to the user-creation request
my $userAgent = LWP::UserAgent->new(cookie_jar => {});

# Obtain a session as admin; the EntityId header holds the session ID
my $response = $userAgent->request(POST "$host/direct/session/new",
    Content_Type => 'application/x-www-form-urlencoded',
    Content      => $credential);
my $entityid = $response->header('EntityId');
print "Session: $host/direct/session/$entityid\n";

# Create the new user; the EntityId header holds the new user's ID
$response = $userAgent->request(POST "$host/direct/user/new",
    Content_Type => 'application/x-www-form-urlencoded',
    Content      => $user);
$entityid = $response->header('EntityId');
print "User [json format]: $host/direct/user/$entityid.json\n";
print "User [XML format]: $host/direct/user/$entityid.xml\n";

On running, the output of the script should look similar to the following:

Session: http://localhost:8080/direct/session/770588c7-9a58-46f6-8d47-7c92cab93759
User [json format]: http://localhost:8080/direct/user/c9ab941f-3fac-4827-ad00-c4f98cf9ad5e.json
User [XML format]: http://localhost:8080/direct/user/c9ab941f-3fac-4827-ad00-c4f98cf9ad5e.xml

Once you have written one client script, any new scripts will be quite similar. Expect an ever-expanding set of client scripts in the Contrib section, waiting for new organizations to pick them up.


WSRP

Portals such as uPortal (http://www.uportal.org) aggregate information from various systems into channels that form part of one view for the user. A typical university portal may include an accumulation of the newest e-mails, RSS feeds for upcoming events, and links into important systems such as Sakai and the library systems. An institution can enforce a single corporate look and feel through a portal and empower the end user to traverse their most current personalized information efficiently.

In the Java world, programmers can package channels into little applications that interact in a standard way with the portal system. These standard packages are called portlets and the interactions are standardized via JSR-168 (http://jcp.org/aboutJava/communityprocess/review/jsr168/). This standardization allows the portlets to be shared between different commercial and non-commercial portals and enables organizations to avoid locking in to a particular vendor's solution.

The issue with JSR-168 portlets is that the standardization constrains the range of events the portlet can react to, and consequently makes the user experience less rich.

Portlets reside on the portal and traditionally get their own external data from RSS feeds, or under the hood via web services.

Sakai is thoroughly RSS-enabled. For example, visit the main page of your demonstration server at http://localhost:8080/portal/rss and you will see an RSS-rendered version of the main page. After logging in and visiting the page again, you will see more details. For a PDA-compliant page, visit http://localhost:8080/portal/pda.

Having all the portlet code on the portal system concentrates a lot of code in one place, which risks later trouble in terms of performance, code duplication, maintainability, and consistent connection to external data sources. The Web Services for Remote Portlets (WSRP) standard (http://oasis-open.org/committee/wsrp) allows a portal to call WSRP-enabled portlets via web services. On the portal side, all you need is a connector that an administrator can configure to target a specific service.

Building a viable portal system has knock-on effects on the background systems. If users (potentially an organization's whole population) hit the portal heavily, then expect a considerable increase in the usage of the secondary systems; the deploying organization needs to strengthen the legacy systems preemptively. Further, end users naturally expect to follow links from the various feeds directly into the associated background application, and they do not expect to log in more than once, or anywhere other than the portal. Single sign-on through mechanisms such as CAS (http://www.ja-sig.org/products/cas/) or Shibboleth (http://shibboleth.internet2.edu/) makes this viable. Uniform provisioning of user accounts across the full spectrum of linked-to applications is also a concern.

For Sakai, it makes sense to expose to a portal user a list of what is new in the user's courses, schedules, the Message of the Day, and other facets of the daily interaction between learners and Sakai. Whenever possible, it is a good idea for system integrators to use current standards to do so.

Activating the services within Sakai requires downloading and installing an extra web application (servlet) that runs within a specific Sakai instance and delivers the WSRP producer services. The most up-to-date README is at https://source.sakaiproject.org/svn/wsrp/trunk/producer/README.txt. The code is based on the WSRP4J framework (http://portals.apache.org/wsrp4j).

Summary

Web services are one of the standard approaches to enterprise integration. The services allow for loose coupling with consuming applications. Loose coupling implies that you can replace one service with another without needing to change the code in a client application.

Sakai has a basic set of SOAP-based web services available, which an administrator can turn on by setting webservices.allowlogin=true in the sakai/sakai.properties file. More services that you can deploy are stored in the Contrib section of Sakai.
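As a reminder of the exact setting, the relevant line in sakai/sakai.properties is simply:

```
# sakai/sakai.properties: allow clients to log in via the SOAP web services
webservices.allowlogin=true
```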

By placing text files containing a few lines of Java in the right location in Sakai, a programmer can create new web services rapidly. Many client-side libraries remove the need to understand the underlying complexities of the protocols involved. The Entity Broker exposes managed data (entities) within Sakai, such as the representations of users and sites, via RESTful web services. You can discover the currently available services by visiting http://host/direct/describe.

It is possible to connect Sakai to Portal systems via the WSRP standard.



About the Author


Alan Mark Berg

Alan Mark Berg, BSc MSc PGCE, has for the last twelve years been the lead developer at the Central Computer Services at the University of Amsterdam. In his famously scarce spare time, he writes. Alan has a degree, two master's degrees, and a teaching qualification. He has co-authored two books about Sakai (http://sakaiproject.org), a highly successful open source learning management platform used by many millions of students around the world, and has won a Sakai Fellowship.

In previous incarnations, Alan was a technical writer, an Internet/Linux course writer, a product line development officer, and a teacher. He likes to get his hands dirty with the building and gluing of systems. He remains agile by ruining various development and acceptance environments.

