ASP.NET Site Performance: Reducing Long Wait Times

by Matt Perdeck | October 2010 | .NET Web Development

If the server is not running out of memory, CPU capacity, or threads, and requests are still taking a long time to complete, chances are that the server has to wait too long for off-box resources, such as the database.

In this article by Matt Perdeck, author of ASP.NET Site Performance Secrets, we'll cover the following topics:

  • How to measure the wait times for each off-box resource using custom counters
  • Waiting concurrently instead of sequentially
  • Improving session state performance
  • Reducing thread-locking delays

 


Measuring wait times

There are a number of ways to find out which external requests are most frequent and how long the site has to wait for a response:

  • Run the code in the debugger with breakpoints around each external request. This will give you a quick hint of which external request is the likely culprit. However, you wouldn't do this in a production environment, as it only gives you information for a few requests.
  • Use the Trace class (in the namespace System.Diagnostics) to trace how long each request takes. This will give you a lot of detailed information. However, the overhead incurred by processing all the trace messages may be too high to use in a production environment, and you would have to somehow aggregate the trace data to find which requests are the most frequent and take the longest.
  • Build performance counters into your code that record the frequency of each request and the average wait time. These counters are light-weight, and hence, can be used in a production environment. Also, you can readily access them via perfmon, along with the counters provided by ASP.NET, SQL Server, and so on that you have already come across.

The remainder of this section focuses on performance counters, which are also a convenient way to keep an eye on off-box requests on a day-to-day basis, rather than as a one-off.

Windows offers you 28 types of performance counters to choose from. Some of these are esoteric, others extremely useful. For example, you can measure the rate per second that a request is made, and the average time in milliseconds that the site waits for a response. Adding your own custom counters is easy, and you can see their real-time values in perfmon, along with that of the built-in counters.

The runtime overhead of counters is minimal. You have already come across some of the hundreds of counters published by ASP.NET, SQL Server, and Windows itself. Even if you add a lot of counters, CPU overhead would be well under one percent.

This section describes only three commonly used counters: simple number, rate per second, and time. A list of all types of counters with examples of their use is available at http://msdn.microsoft.com/en-us/library/system.diagnostics.performancecountertype.aspx.

To use the counters, you need to follow these three steps:

  1. Create custom counters.
  2. Update them in your code.
  3. See their values in perfmon.

Creating custom counters

In this example, we'll put counters on a page that simply waits for one second to simulate waiting for an external resource.

Windows allows you to group counters into categories. We'll create a new category "Test Counters" for the new counters.

Counter Name        Counter Type               Description
Nbr Page Hits       NumberOfItems64            64-bit counter, counting the total number of hits on the page since the website started.
Hits/second         RateOfCountsPerSecond32    Hits per second.
Average Wait        AverageTimer32             Time taken by the resource. In spite of the name, it is used here to simply measure an interval, not an average.
Average Wait Base*  AverageBase                Utility counter required by Average Wait.

*The text says there are three counters, but the table lists four. Why? The last counter, Average Wait Base, doesn't provide information on its own, but helps to compute the value of counter Average Wait. Later on, we'll see how this works.

There are two ways to create the "Test Counters" category and the counters themselves:

  • Using Visual Studio: This is relatively quick, but if you want to apply the same counters to, for example, your development and production environments, you'll have to enter them separately in each environment.
  • Programmatically: Because this involves writing code, it takes a bit longer upfront, but it makes it easier to apply the same counters to multiple environments and to place the counters under source control.

Creating counters with Visual Studio

To create the counters in Visual Studio:

  1. Make sure you have administrative privileges or are a member of the Performance Monitor Users group.
  2. Open Visual Studio.
  3. Click on the Server Explorer tab.
  4. Expand Servers.
  5. Expand your machine.
  6. Right-click on Performance Counters and choose Create New Category.
  7. Enter Test Counters in the Category Name field.
  8. Click on the New button for each of the four counters to add, as listed in the table you saw earlier. Be sure to add the Average Wait Base counter right after Average Wait, to properly associate the two counters.
  9. Click on OK when you're done.


This technique is easy. However, you'll need to remember to add the same counters to the production machine when you release new code with new custom counters. Writing a program to create the counters is more work initially, but gives you easier maintenance in the long run. Let's see how to do this.

Creating counters programmatically

From a maintenance point of view, it would be best to create the counters when the web application starts, in the Global.asax file. However, you would then have to make the account under which the application pool runs part of the Performance Monitor Users group.

An alternative is to create the counters in a separate console program. An administrator can then run the program to create the counters on the server. Here is the code:

using System;
using System.Diagnostics;

namespace CreateCounters
{
    class Program
    {
        static void Main(string[] args)
        {

To create a group of counters, you create each one in turn, and add them to a CounterCreationDataCollection object:

CounterCreationDataCollection ccdc =
    new CounterCreationDataCollection();

Create the first counter, Nbr Page Hits. Give it a short help message and the counter type. Now, add it to the CounterCreationDataCollection object:

CounterCreationData ccd = new CounterCreationData(
    "Nbr Page Hits", "Total number of page hits",
    PerformanceCounterType.NumberOfItems64);
ccdc.Add(ccd);

Add the second, third, and fourth counters along the same lines:

ccd = new CounterCreationData("Hits / second",
    "Total number of page hits / sec",
    PerformanceCounterType.RateOfCountsPerSecond32);
ccdc.Add(ccd);

ccd = new CounterCreationData("Average Wait",
    "Average wait in seconds",
    PerformanceCounterType.AverageTimer32);
ccdc.Add(ccd);

ccd = new CounterCreationData("Average Wait Base", "",
    PerformanceCounterType.AverageBase);
ccdc.Add(ccd);

Now, it's time to take the CounterCreationDataCollection object and turn it into a category. If a category with the same name already exists, you'll get an exception when you try to create it, so delete any existing category first. Because you can't add new counters to an existing category, there is no simple work-around for this:

if (PerformanceCounterCategory.Exists("Test Counters"))
{
    PerformanceCounterCategory.Delete("Test Counters");
}

Finally, create the Test Counters category. Give it a short help message, and make it a single instance. You can also make a category multi-instance, which allows you to split the category into instances. Also, pass in the CounterCreationDataCollection object with all the counters. This creates the complete category with all your counters in one go, as shown in the following code:

PerformanceCounterCategory.Create("Test Counters",
    "Counters for test site",
    PerformanceCounterCategoryType.SingleInstance, ccdc);
        }
    }
}

Now that you know how to create the counters, let's see how to update them in your code.

Updating counters in your code

To keep things simple, this example uses the counters in a page that simply waits for a second to simulate waiting for an external resource:

using System;
using System.Diagnostics;

public partial class _Default : System.Web.UI.Page
{
    protected void Page_Load(object sender, EventArgs e)
    {

First, increment the Nbr Page Hits counter. To do this, create a PerformanceCounter object, attaching it to the Nbr Page Hits counter in the Test Counters category. Then, increment the PerformanceCounter object:

PerformanceCounter nbrPageHitsCounter =
    new PerformanceCounter("Test Counters", "Nbr Page Hits", false);
nbrPageHitsCounter.Increment();

Now, do the same with the Hits/second counter. Because you set its type to RateOfCountsPerSecond32 when you generated it in the console program, the counter will automatically give you a rate per second when viewed in perfmon:

PerformanceCounter nbrPageHitsPerSecCounter =
    new PerformanceCounter("Test Counters", "Hits / second", false);
nbrPageHitsPerSecCounter.Increment();

To measure how long the actual operation takes, create a Stopwatch object, and start it:

Stopwatch sw = new Stopwatch();
sw.Start();

Execute the simulated operation:

// Simulate actual operation
System.Threading.Thread.Sleep(1000);

Stop the stopwatch:

sw.Stop();

Update the Average Wait counter and the associated Average Wait Base counter to record the elapsed time in the stopwatch:

PerformanceCounter waitTimeCounter =
    new PerformanceCounter("Test Counters", "Average Wait", false);
waitTimeCounter.IncrementBy(sw.ElapsedTicks);

PerformanceCounter waitTimeBaseCounter =
    new PerformanceCounter("Test Counters", "Average Wait Base", false);
waitTimeBaseCounter.Increment();
    }
}

Now that we've seen how to create and use the most commonly used counters, it's time to retrieve their values.

Viewing custom counters in perfmon

To access your custom counters, follow these steps:

  1. On the server, run perfmon from the command prompt. To open the command prompt on Vista, click on Start | All Programs | Accessories | Command Prompt. This opens the monitor window.
  2. Expand Monitoring Tools and click on Performance Monitor.
  3. Click on the green "plus" sign.
  4. In the Add Counters dialog, scroll down to your new Test Counters category.
  5. Expand that category and add your new counters. Click on OK.


  6. To see the counters in action, run a load test. If you use WCAT, you could use files runwcat_testcounters.bat and testcounters_scenario.ubr from the downloaded code bundle.

Now that you have seen how to measure wait times, let's turn to a number of ways to reduce those wait times.


Waiting concurrently

If your site needs to wait for responses from multiple external resources, and those requests are not dependent on each other, initiate those requests in one go and wait for all responses in parallel instead of one after the other. If you need information from three web services, each taking five seconds to respond, you'll now wait for five seconds only, instead of 3*5=15 seconds.

You can easily implement this using asynchronous code. When you register each asynchronous task, pass true in the executeInParallel parameter of the PageAsyncTask constructor, as shown in the following code:

bool executeInParallel = true;
PageAsyncTask pageAsyncTask =
    new PageAsyncTask(BeginAsync, EndAsync, null, null, executeInParallel);
RegisterAsyncTask(pageAsyncTask);
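The BeginAsync and EndAsync handlers registered above are not shown in this fragment. A minimal sketch of what they might look like, assuming the external resource is a web service reached over HTTP (the URL is made up for illustration), is:

```csharp
using System;
using System.Net;
using System.Web.UI;

public partial class ConcurrentPage : Page
{
    private WebRequest request1;

    protected void Page_Load(object sender, EventArgs e)
    {
        // true = run in parallel with other registered tasks
        RegisterAsyncTask(new PageAsyncTask(BeginAsync, EndAsync, null, null, true));
    }

    private IAsyncResult BeginAsync(object sender, EventArgs e,
                                    AsyncCallback cb, object extraData)
    {
        // Hypothetical external resource; replace with your own URL.
        request1 = WebRequest.Create("http://example.com/service1");
        return request1.BeginGetResponse(cb, extraData);
    }

    private void EndAsync(IAsyncResult ar)
    {
        // Called when the response arrives; meanwhile, the worker
        // thread was free to process other requests.
        using (WebResponse response = request1.EndGetResponse(ar))
        {
            // process the response
        }
    }
}
```

Remember that the page's @ Page directive needs Async="true" for asynchronous tasks to run.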

Retrieving multiple result sets from the database

ADO.NET allows you to retrieve multiple result sets from the database in a single round trip, instead of retrieving them one-by-one.
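As a sketch of how this looks (the connection string and the Products and Categories tables are assumptions for illustration), you can batch two SELECT statements into one SqlCommand and step through the result sets with SqlDataReader.NextResult:

```csharp
using System.Data.SqlClient;

// Batch two SELECTs into one command: one round trip to the database.
string sql = "SELECT Id, Name FROM Products; SELECT Id, Name FROM Categories";

using (SqlConnection conn = new SqlConnection(connectionString))
using (SqlCommand cmd = new SqlCommand(sql, conn))
{
    conn.Open();
    using (SqlDataReader reader = cmd.ExecuteReader())
    {
        while (reader.Read())
        {
            // first result set: products
        }

        reader.NextResult();   // advance to the second result set

        while (reader.Read())
        {
            // second result set: categories
        }
    }
}
```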

Reducing overhead by using off-box session modes

If you use session state on a server farm, you probably use StateServer or SqlServer mode rather than InProc mode, because requests coming from the one visitor may be processed at different servers.

This means that when a request starts being processed, ASP.NET retrieves the current session state from the StateServer or the SQL Server database, and de-serializes it so that your code can work with it. Then, towards the end of the life cycle of the page, the session state is serialized again and stored in the StateServer or the SQL Server database. As part of this, ASP.NET updates the last update time of the session, so that it can expire the session if it hasn't been used for too long. If you use SqlServer mode, this all means two trips to the database per request.

A few ways to reduce all this overhead are discussed in the following sections.

Reducing trips to the database

You can reduce the number of trips to the database through the EnableSessionState attribute in the Page directive. For example, setting it to False prevents the initial read:

<%@ Page EnableSessionState="False" ... %>

EnableSessionState takes these values:

  • True: It is the default value; you get both trips to the database.
  • False: It disables access to session state on the page and prevents the initial read. However, to prevent session expiry, the session state in the store will still be marked as accessed towards the end of the page life cycle. Hence, you wind up with just one trip to the database.
  • ReadOnly: This value makes the session state read-only. When page processing starts, the session state is still retrieved and deserialized. However, towards the end of the page life cycle, there is no update of the session state. This means you wind up with just one trip to the database. An added advantage is that this mode uses only read locks, enabling multiple read-only requests to access the session state concurrently. As a result, it prevents lock contention when multiple requests are processed for the same visitor.

Setting EnableSessionState

As you saw, you can set EnableSessionState in the Page directive:

<%@ Page EnableSessionState="ReadOnly" %>

You can also set it on a site-wide basis in web.config:

<configuration>
  <system.web>
    <pages enableSessionState="ReadOnly" />
  </system.web>
</configuration>

You can then override this default state in the Page directive of each page.

Reducing serialization and transfer overhead

In addition to reducing the number of trips to the database, it makes sense to reduce serialization and transfer overhead to save time and CPU usage.

Instead of storing an object with multiple fields in one go in Session, store its individual fields. This has the following advantages:

  • Serializing .NET Framework primitive types such as String, Boolean, DateTime, TimeSpan, Int16, Int32, Int64, Byte, Char, Single, Double, Decimal, SByte, UInt16, UInt32, UInt64, Guid, and IntPtr is very quick and efficient. Serializing object types, however, uses the BinaryFormatter, which is much slower.
  • It allows you to access only those individual fields that really need to be accessed. Fields that do not get accessed do not get updated in the session store, saving serialization and transfer overhead.

Suppose you use objects of this class (page Original.aspx.cs in folder LongWaitTimes_ReduceSessionSerializatonOverhead in the code bundle):

[Serializable]
private class Person
{
    public string FirstName { get; set; }
    public string LastName { get; set; }
}

You would retrieve and store this in Session as shown:

// Get object from session
Person myPerson = (Person)Session["MyPerson"];

// Make changes
myPerson.LastName = "Jones";

// Store object in session
Session["MyPerson"] = myPerson;

This will use the BinaryFormatter to deserialize/serialize the entire myPerson object and transfer it in its entirety from/to the session store.

Now look at the alternative (page Improved.aspx.cs in folder LongWaitTimes_ReduceSessionSerializatonOverhead in the downloaded code bundle):

private class SessionBackedPerson
{
    private string _id;

    public SessionBackedPerson(string id)
    {
        _id = id;
    }

    private string _firstName;
    public string FirstName
    {
        get
        {
            _firstName = HttpContext.Current.Session[_id + "_firstName"].ToString();
            return _firstName;
        }
        set
        {
            if (value != _firstName)
            {
                _firstName = value;
                HttpContext.Current.Session[_id + "_firstName"] = value;
            }
        }
    }

    private string _lastName;
    public string LastName
    {
        get
        {
            _lastName = HttpContext.Current.Session[_id + "_lastName"].ToString();
            return _lastName;
        }
        set
        {
            if (value != _lastName)
            {
                _lastName = value;
                HttpContext.Current.Session[_id + "_lastName"] = value;
            }
        }
    }
}

This class takes care of storing its own individual properties in Session. Because of this, it needs to know its ID when it is constructed, so that it can construct a unique session key. When setting a property value, the Session object is only accessed when the new value is actually different from the old value.

As a result, this solution only stores individual primitives that are quick to serialize, rather than the entire object. It also only updates those fields in Session that have actually been updated.

Working with this new class requires changing the page code. Instead of retrieving an object from Session, the code needs to simply instantiate a new object, passing in the ID to the constructor. Then when the code updates any properties, they are stored in Session right away, with no need to store the entire object in Session at the end:

protected void Page_Load(object sender, EventArgs e)
{
    SessionBackedPerson myPerson = new SessionBackedPerson("myPerson");

    // Update object, and session, in one go.
    // Only touch LastName, not FirstName.
    myPerson.LastName = "Jones";
}

We've now seen a number of ways to reduce the cost of sessions. But what about getting rid of them altogether? Refer to the next section for more about that option.

Cutting your dependence on sessions

The great advantage of session state is that it lives on the server, so is more difficult to access or modify by unauthorized people. However, if this is not an issue, here are some options to get rid of session state and its overhead in your pages:

  • If you are not keeping a lot of session data, use simple cookies instead.
  • Store session data in ViewState. This requires more bandwidth but reduces database traffic.
  • Use AJAX-style asynchronous callbacks on your pages instead of full-page refreshes, so that you can keep session information on the page.
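As a minimal sketch of the first option, a small piece of data such as a display name could live in a cookie rather than in session state (the cookie name and expiry here are arbitrary choices):

```csharp
using System;
using System.Web;

// Store a small value in a cookie instead of Session
HttpCookie cookie = new HttpCookie("displayName", "Jones");
cookie.Expires = DateTime.Now.AddMinutes(20);   // survives across requests
cookie.HttpOnly = true;                         // not readable by client script
Response.Cookies.Add(cookie);

// On a later request, read it back
HttpCookie incoming = Request.Cookies["displayName"];
string displayName = (incoming != null) ? incoming.Value : null;
```

Keep in mind that cookies travel with every request and live on the visitor's machine, so this only suits small, non-sensitive data.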

Thread locking

If you use locking to ensure only a single thread can access a given resource, some threads may have to wait for a lock to become available.

To see if this is an issue, use perfmon to check the following counters, both in the category .NET CLR LocksAndThreads:

Counter                Description
Contention Rate / sec  The rate at which the runtime tries to acquire a managed lock and fails to do so.
Current Queue Length   The last recorded number of threads waiting to acquire a managed lock.

If you consistently have threads failing to get a managed lock, you are looking at a source of delays. You can consider the following ways to reduce these delays:

  • Minimize the duration of locks
  • Use granular locks
  • Use System.Threading.Interlocked
  • Use ReaderWriterLock

Minimizing the duration of locks

Acquire locks on shared resources just before you access them, and release them immediately after you are finished with them. By limiting the time each resource is locked, you minimize the time threads need to wait for resources to become available.
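For example, do any expensive work, such as formatting, before acquiring the lock, so that the lock only covers the shared update itself. A minimal sketch (the class and its members are made up for illustration):

```csharp
using System;
using System.Collections.Generic;

class LogBuffer
{
    private readonly object padlock = new object();
    private readonly List<string> entries = new List<string>();

    public void Add(string rawMessage)
    {
        // Expensive work (formatting) happens outside the lock ...
        string formatted = string.Format("{0:u} {1}", DateTime.UtcNow, rawMessage);

        // ... so the lock is held only for the quick list update.
        lock (padlock)
        {
            entries.Add(formatted);
        }
    }

    public int Count
    {
        get { lock (padlock) { return entries.Count; } }
    }
}
```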

Using granular locks

If you use the C# lock statement, or the Monitor object, lock as small an object as possible.

Take for example the following code:

lock (protectedObject)
{
    // protected code ...
}

This is shorthand for the following code:

try
{
    Monitor.Enter(protectedObject);
    // protected code ...
}
finally
{
    Monitor.Exit(protectedObject);
}

Note that Monitor.Enter effectively locks the given object protectedObject. Because only a single thread can lock the object, this has the result of allowing only a single thread to execute the protected code.

This works well, as long as the object that is locked is solely related to the protected code. Only lock on private or internal objects. Otherwise, some unrelated code might try to lock on the same object to protect some other bit of code, leading to unnecessary delays. For example, do not lock on this:

lock (this)
{
    // protected code ...
}

Instead, lock on a private object:

private readonly object privateObject = new object();

public void MyMethod()
{
    lock (privateObject)
    {
        // protected code ...
    }
}

If you are protecting static code, do not lock on the class type:

lock (typeof(MyClass))
{
    // protected code ...
}

Instead, use a static object:

private static readonly object privateStaticObject = new object();

public void MyMethod()
{
    lock (privateStaticObject)
    {
        // protected code ...
    }
}

Using System.Threading.Interlocked

If your protected code simply increments or decrements an integer, adds one integer to another, or exchanges two values, consider using the System.Threading.Interlocked class instead of lock. Interlocked executes a lot faster than lock, so should result in less waiting for locks.

For example, instead of the following:

lock (privateObject)
{
    counter++;
}

Use the following:

Interlocked.Increment(ref counter);
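Interlocked also covers the other operations mentioned: adding one integer to another and exchanging values. A short sketch:

```csharp
using System.Threading;

int counter = 0;

Interlocked.Increment(ref counter);              // counter is now 1
Interlocked.Add(ref counter, 10);                // counter is now 11
int old = Interlocked.Exchange(ref counter, 5);  // counter becomes 5; old is 11

// Set counter to 7 only if it currently equals 5 (it does here)
Interlocked.CompareExchange(ref counter, 7, 5);  // counter is now 7
```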

Using ReaderWriterLock

If most threads accessing a protected object read only that object, and relatively few threads update the object, consider using a ReaderWriterLock. This allows multiple readers to access the protected code, but only a single writer to access it.

Acquiring a reader lock

When using a ReaderWriterLock, you declare the ReaderWriterLock at the class level:

static ReaderWriterLock readerWriterLock = new ReaderWriterLock();

Then, to acquire a reader lock within a method, call AcquireReaderLock. You can give this a timeout; when a timeout occurs, an ApplicationException is thrown. To release the lock, call ReleaseReaderLock. Be sure to only release the lock if you actually hold it, that is, you didn't suffer a timeout; otherwise, an ApplicationException is thrown.

try
{
    readerWriterLock.AcquireReaderLock(millisecondsTimeout);
    // Read the protected object
}
catch (ApplicationException)
{
    // The reader lock request timed out.
}
finally
{
    // Ensure that the lock is released, provided there was no timeout.
    if (readerWriterLock.IsReaderLockHeld)
    {
        readerWriterLock.ReleaseReaderLock();
    }
}

Acquiring a writer lock

Using a writer lock goes along the same lines. Acquire a writer lock by calling AcquireWriterLock. When you're done with it, call ReleaseWriterLock, making sure you actually had the lock:

try
{
    readerWriterLock.AcquireWriterLock(millisecondsTimeout);
    // Update the protected object
}
catch (ApplicationException)
{
    // The writer lock request timed out.
}
finally
{
    // Ensure that the lock is released, provided there was no timeout.
    if (readerWriterLock.IsWriterLockHeld)
    {
        readerWriterLock.ReleaseWriterLock();
    }
}

If your code holds a reader lock and then decides that it needs to update the protected object, it can either release the reader lock and acquire a writer lock, or call the UpgradeToWriterLock method. You can also downgrade from a writer lock to a reader lock using DowngradeFromWriterLock, allowing the waiting reader threads to start reading.
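A sketch of the upgrade path described above (the protected value and the update condition are made up for illustration):

```csharp
using System.Threading;

static class ProtectedResource
{
    static readonly ReaderWriterLock rwLock = new ReaderWriterLock();
    static int value;

    public static int Value { get { return value; } }

    public static void ReadAndMaybeUpdate(int millisecondsTimeout)
    {
        rwLock.AcquireReaderLock(millisecondsTimeout);
        try
        {
            // ... read the protected object ...
            bool needsUpdate = (value == 0);   // hypothetical condition
            if (needsUpdate)
            {
                // Temporarily upgrade to a writer lock; the cookie lets
                // DowngradeFromWriterLock restore the original reader lock.
                LockCookie cookie = rwLock.UpgradeToWriterLock(millisecondsTimeout);
                try
                {
                    value = 42;   // ... update the protected object ...
                }
                finally
                {
                    rwLock.DowngradeFromWriterLock(ref cookie);
                }
            }
        }
        finally
        {
            rwLock.ReleaseReaderLock();
        }
    }
}
```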

Alternating readers and writers

While it is fine for multiple threads to read the protected object simultaneously, a thread that updates the protected object needs exclusive access. That way, while the thread is updating the object, no other threads can read or update that same object. To implement this, threads waiting for reader locks and threads waiting for writer locks sit in separate queues. When a writer releases its lock, all threads waiting for a reader lock have their locks granted and proceed through the protected code. When they have all released their reader locks, the next thread waiting for a writer lock has its lock granted. This way, the protected code is executed alternately by reader and writer threads.

To make sure that writer threads are not locked indefinitely by a constant stream of reading threads, if a new thread tries to acquire a reader lock while other reading threads are already executing the protected code, the new thread has to wait until the next writer thread has finished. Obviously, if there is no thread waiting for a writer lock, the new reader thread has its lock granted right away.

All of this is depicted in the following diagram:

[Diagram: reader and writer threads alternating access to the protected code, with separate queues for threads waiting on reader locks and writer locks]

Optimizing disk writes

If your site creates many new files on disk, such as files uploaded by visitors, consider these performance improvements:

  • Avoid head seeks
  • Use FileStream.SetLength to avoid fragmentation
  • Use 64 K buffers
  • Disable 8.3 filenames

Avoiding head seeks

Writing bytes sequentially without moving the read/write head happens much faster than random access. If you are only writing the files and not reading them, try writing them on a dedicated disk drive using a single dedicated thread. That way, other processes won't move the read/write head on the drive.

Using FileStream.SetLength to avoid fragmentation

If multiple threads write files at the same time, space used for those files will become interleaved, leading to instant fragmentation.

To prevent this, use the FileStream.SetLength method to reserve enough space for the file before you start writing.

If you use the ASP.NET FileUpload control to receive files from a visitor, you can get the length of the uploaded file as shown:

int bufferLength = FileUpload1.PostedFile.ContentLength;
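Putting SetLength together with the actual write, a sketch along these lines (the helper class name and the 64 KB chunk size are illustrative choices):

```csharp
using System.IO;

static class UploadSaver
{
    public static void SaveWithPreallocation(Stream input, string path, int contentLength)
    {
        using (FileStream fs = new FileStream(path, FileMode.Create,
                                              FileAccess.Write, FileShare.None))
        {
            // Reserve the full file size up front, so concurrent writers
            // don't interleave their allocations (instant fragmentation).
            fs.SetLength(contentLength);

            byte[] buffer = new byte[64 * 1024];
            int bytesRead;
            while ((bytesRead = input.Read(buffer, 0, buffer.Length)) > 0)
            {
                fs.Write(buffer, 0, bytesRead);
            }
        }
    }
}
```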

Using 64 K buffers

The NTFS file system uses an internal buffer of 64 KB. The FileStream constructor allows you to set the buffer size for file writes. By setting the FileStream buffer size to 64 KB, writes are handed to the operating system in blocks that match the NTFS internal buffer, which can result in higher performance.
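The buffer size goes in the last parameter of this FileStream constructor overload. A sketch (the file path is illustrative):

```csharp
using System;
using System.IO;

const int BufferSize = 64 * 1024;   // 64 KB, matching the NTFS internal buffer

string path = Path.Combine(Path.GetTempPath(), "buffer64k_demo.dat");
using (FileStream fs = new FileStream(path, FileMode.Create, FileAccess.Write,
                                      FileShare.None, BufferSize))
{
    byte[] payload = new byte[128 * 1024];
    fs.Write(payload, 0, payload.Length);   // written through a 64 KB buffer
}
```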

Disabling 8.3 filenames

To retain backwards compatibility with MS-DOS, the NTFS file system maintains an 8.3 filename for each file or directory. This creates some overhead, because the system has to make sure that the 8.3 filename is unique, and so has to check the other names in the directory. You would have to have over 20,000 files in a single directory though for this to become significant.

Before disabling 8.3 filenames, make sure there are no applications on your system that rely on these names. Test the change first on a test system with the same operating system as the operational system.

To disable 8.3 filenames, execute this from the command prompt (after you back up the registry):

fsutil behavior set disable8dot3

Because this changes the registry, you need to restart the machine for this to take effect.

Summary

In this article, we first saw how to implement performance counters to keep track of the frequency and response times of off-box requests. We then discussed a number of ways to reduce wait times. These included waiting concurrently instead of sequentially, reducing the overhead of session state kept on a state server or database server, minimizing delays due to thread locking and optimizing disk writes. We also had a quick look at disabling 8.3 filenames.


About the Author


Matt Perdeck

Matt Perdeck has over 20 years' experience developing high-performance software systems, ranging from the largest ATM network in the Netherlands to embedded software in advanced Wide Area Networks. He has a B.Sc. in Computer Science from the University of Technology Twente, the Netherlands, and an M.B.A. from the University of Western Australia and Copenhagen Business School. He has worked in Australia, the Netherlands, Slovakia, and Thailand. He has extensive .NET and SQL Server development experience and has written several .NET articles. As an old-school software engineer, he is passionate about making things faster, simpler, and more efficient.
