Over the past decade or so, we have come to realize that computers are an indispensable necessity. They are all around us, from our households to rovers on other planets. Nowadays, it is not uncommon for even a small company whose core business has nothing to do with IT to have a few dozen office computers and other IT equipment in its infrastructure.
It should not surprise anyone that business environments need some form of streamlined inventory, especially when we consider that a network might comprise several hundred, if not thousands, of workstation computers, servers, portable devices, office equipment such as printers and scanners, and other networking components.
Resource management, in essence, when viewed from an IT perspective, means providing a method to gather and store all kinds of information about the items in our infrastructure, and then supporting the means to maintain that inventory. It also means performing routine tasks based on the collected data, such as generating reports, locating relevant information easily (for example, finding a specific memory module by its model number), auditing the type of software installed on workstation computers, and more.
Our plan of action is pretty straightforward: we analyze the IT inventorying needs and some general requisites when it comes to managing those assets. We will also present the client-server model, the underlying foundation on which most centralized management solutions work. This is where OCS Inventory NG pops into the picture to save the day, as we will soon see.
We will get to know OCS Inventory NG better soon; for now, it is enough to know that it is an open source project. No matter how successful a company is, open source solutions are always appreciated by IT staff and management, as long as the project is actively developed, fairly popular, well documented, backed by community support, and meets their needs. Among other benefits, open source projects tend to end up modular and flexible.
Inventorying requirements in the real world
One of the general requirements of an IT inventory is that it be efficient and practical. The entire process should be seamless to the clients and require little (or no) user interaction. Once set up, it should automatically update the inventory database with the latest changes, without anyone being required to do so manually. Thereafter, the collected data ought to be organized and labeled the way we want.
Businesses everywhere have come to realize that process integration is the best method for querying, standardizing, and organizing information about the infrastructure. The age of hi-tech computing has made this possible by speeding up routine tasks, saving employee time, and eliminating bureaucracy and the unnecessary filing of papers, all of which lead to frustration and wasted resources. Implementing integrated processes can change the structure and behavior of an organization, but finding the right integration often becomes a dilemma.
Feasible solution to avoid inevitable havoc
Drifting back to the case of the IT department, the necessity of an integrated and centralized solution to manage numerous systems and other hardware becomes obvious. The higher the number of systems and the bigger the volume to be managed, the easier the situation can get out of control, leading to crisis: everyone runs around in panic like headless chickens, trying to figure out who can be held responsible and what can be done to avoid such scenarios.
Taking a rational approach soon enough can improve the stability of the entire organization. Chances are you already know this, but system administrators tend to dislike working with paper: filling in forms, storing them purely for archival purposes, and then, when they least expect it, having to dig through them for relevant information. A system like that won't make anyone happy.
A centralized repository, in some shape or form of a database, gives almost instant access to results whenever such a query happens. That it always stays up-to-date and reflects the actual state of the infrastructure can be guaranteed by implementing an automatic updating mechanism.
Later on, once the database is in a healthy state and the process is integrated, tried, and proven, there will be no significant difference between managing dozens of computers and managing thousands. A well-designed integrated process is future-proof and scalable, so it won't become a setback if and when the company decides to expand.
Streamlining software auditing and license management
As mentioned earlier, it is important to understand that auditing workstation machines cannot be neglected. In certain environments, users or employees have limited access and work within a sort of enclosed program area, able to do little to nothing outside their specialization. But there are situations where employees are supposed to have administrative access and full permissions. It is for the good of both the user and the company to monitor and pay attention to what happens on each and every computer.
An up-to-par auditing mechanism can integrate license management as well. The persons responsible can track the total number of licenses used and owned by the company, calculate the balance, send a notification when licenses are about to run out, and so forth. It isn't all that uncommon to automate the purchasing of licenses either.
The license management process varies from firm to firm, but usually it goes something like this: a user requests a license, a supervisor approves the request, and it arrives at the relevant IT staff. The request is then analyzed and, based on the result, the license is either handed out from the existing stock or ordered/acquired if necessary.
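The grant-or-order decision at the end of that flow can be sketched in a few lines of Python. This is a minimal, hypothetical model of our own; the `LicensePool` class and its fields are invented for illustration and are not part of any real inventory product:

```python
from dataclasses import dataclass

@dataclass
class LicensePool:
    """A hypothetical per-product license pool tracked by the inventory."""
    product: str
    owned: int    # licenses the company has purchased
    in_use: int   # licenses currently handed out

    def available(self) -> int:
        return self.owned - self.in_use

    def handle_request(self) -> str:
        """Grant from stock if possible; otherwise signal that one must be ordered."""
        if self.available() > 0:
            self.in_use += 1
            return "granted"
        return "order required"

pool = LicensePool(product="IDE Pro", owned=2, in_use=1)
print(pool.handle_request())  # granted (one license was still free)
print(pool.handle_request())  # order required (stock exhausted)
```

In a real integrated process, the "order required" branch is where automated purchasing, mentioned above, would hook in.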
If the process is not automated, all of this involves paperwork, and soon you will see frustrated employees running back and forth between departments, asking who else needs to sign a given paper. Automating the process and printing only the end result is elegant and trouble-free: the responsible department can store the printed document for archival purposes, if required. But the key to the process lies in integration, and inventorying can help here too.
More uses of an integrated IT inventory solution
The stock of office consumables can also be tracked and maintained. This is a trickier process because it cannot be made totally unattended, short of installing some sort of sensor to count printer cartridges inside an office cabinet or the warehouse. However, the corresponding field can be updated each time the item in question gets restocked.
A centralized method for consumables means the responsible parties can get notified before running out of stock. Once again, this step eliminates unexpected scenarios and unnecessary tasks.
The beauty of centralized management solutions in the IT world is that, done right, they open doors to numerous other activities as well. For example, in the case of workstation PCs, the integrated process can be expanded to provide remote administration and similar activities carried out remotely on the client machine.
Package deployment and the execution of scripts are just a few distinctive examples. Think of it like this: a license is granted, the package is deployed, and a script is run to ensure proper registration of the application, if required. System administrators can often fix common employee issues via remote execution of scripts. Surely there are other means to administer the machines, but we're focusing on all-in-one integrated solutions.
Another possibility is integrating the help-desk and ticketing system within the centralized inventory's management control panel. This way, when an employee asks for help or reports a hardware issue, the system administrator can take a look at what's inside that system (hardware specifications, software installed, and so on). The system administrator thus knows the situation beforehand and can use the right tools to troubleshoot the issue.
Gathering relevant inventory information
We can conclude that in order to have a complete inventory on top of which we can build and implement other IT-related and administrative tasks, we need at least the following:
- Collecting relevant hardware information in case of workstation computers
  - Manufacturer, serial number, and model number of every component
  - When applicable, some of the following: revision number, size, speed, memory, type, description, designation, connection port, interface, slot number, driver, MAC and IP address, and so on
- Collecting installed software/OS (licensing) information
  - Operating system: name, version, and registration information
  - Application name, publisher, version, and location
  - Custom queries from the Windows registry (if applicable)
- Collecting information about networking equipment and office peripherals
  - Manufacturer, serial number, model, type of component, and so on
  - MAC and IP address
  - When applicable: revision number, firmware, total uptime, and so on
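To make the list above concrete, here is one possible way to represent such an inventory record in Python. The `InventoryItem` class and its field names are illustrative assumptions of ours, not the schema of any particular product:

```python
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class InventoryItem:
    """One inventoried asset: a workstation component, printer, switch, and so on."""
    category: str                       # e.g. "workstation", "printer", "network"
    manufacturer: str
    model: str
    serial_number: str
    mac_address: Optional[str] = None   # networked items only
    ip_address: Optional[str] = None
    extra: dict = field(default_factory=dict)  # revision, firmware, uptime, slot, ...

switch = InventoryItem(category="network", manufacturer="Acme", model="SW-24",
                       serial_number="SN123", mac_address="00:11:22:33:44:55",
                       extra={"firmware": "1.4.2", "uptime_days": 37})
print(switch.category, switch.extra["firmware"])  # network 1.4.2
```

The open-ended `extra` dictionary mirrors the "when applicable" items in the list: different asset types carry different optional attributes, so a fixed set of columns would not fit them all.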
Overall inventory demands to enhance usability
Now let's create a list of criteria that we want our IT inventory solution to meet. In the previous section, we enumerated some of the must-have data that cannot be left out of our inventory. Likewise, we have expectations regarding how the process should work.
From the users' perspective, the software running in the background must be transparent and must not become a resource hog. The bandwidth required to communicate with the centralized management server should be minimal. The inventorying mechanism must be automatic and discover every item within the environment on its own. Once everything is recorded, the copy stored in the database must always be kept up-to-date, and backed up.
The inventorying client that sweeps through the entire network should be cross-platform. As always, everyone likes an intuitive and fast user interface, and this is especially important when managing inventories and working with large volumes of data. The control panel or management center is the place where we organize, label, and work with the gathered information; if the interface is way too complex or overcrowded, it leads to frustration.
Queries against the database must be snappy enough that we don't have to wait, bored to tears, while some rotating hourglass animates on screen.
In addition, we want integrated backup functions. It's always possible to manually create database dumps or backup points, but being able to do so directly from the interface is much easier and makes it possible for non-IT-proficient individuals as well.
Assuming that the web interface can be configured for access by multiple users with different permissions and rights, it can become quite a useful tool for employees working in non-IT departments such as accounting and management. The process of inventorying becomes streamlined, and everyone can work with the inventory information to get their share of tasks done.
In a corporate environment, it happens quite often that an employee receives a new computer and the older computer is passed on to another user with different needs. The inventory must be able to automatically detect these situations and track the history of each machine.
The ability to custom-specify, define, and set labels for the inventoried items is really important. When done professionally, companies might agree upon a naming convention for labeling inventoried items, for example: pc001 for workstation computers, nt001 for networking equipment, sv001 for servers, ph001 for phones, pr001 for printers, and so on.
This means our IT inventory solution needs to track these inventory IDs as well. Should you want to take this idea further, you can generate and print barcodes and stick them on the side of those items. A feature-laden IT inventory can systematize the way tasks are carried out within an organization.
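A naming convention like this is easy to automate. The sketch below is our own illustration (the `PREFIXES` mapping simply follows the example convention above); it picks the next free, zero-padded ID for a given category:

```python
# Assumed category-to-prefix mapping, following the convention above.
PREFIXES = {"workstation": "pc", "network": "nt", "server": "sv",
            "phone": "ph", "printer": "pr"}

def next_inventory_id(category: str, existing_ids: list[str]) -> str:
    """Return the next zero-padded inventory ID for a category, e.g. 'pc003'."""
    prefix = PREFIXES[category]
    # Collect the numeric parts of IDs that already use this prefix.
    taken = [int(i[len(prefix):]) for i in existing_ids if i.startswith(prefix)]
    return f"{prefix}{max(taken, default=0) + 1:03d}"

print(next_inventory_id("workstation", ["pc001", "pc002", "sv001"]))  # pc003
print(next_inventory_id("printer", []))                               # pr001
```

Generating IDs from a single function like this keeps the convention consistent, which matters if barcodes are later printed from the same values.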
Summing up, we have looked at the most common inventorying requirements faced within a corporate environment. These are the necessities that the solution we implement must meet. In order to understand how it accomplishes this, we will first talk about the client-server model; once we know that, we will review how OCS-NG ticks those inventory requisite checkboxes.
Centralization: Introducing the client-server model
Ever since distributed applications appeared, the client-server model has been popular. In the simplest terms, the server is a computer (usually a high-performing one) running the service that centralizes some kind of information. It is able to receive connections from clients, process their requests, and return results whenever necessary.
Clients establish a connection with the server in order to request or upload some content. This communication model describes one of the most basic relationships and architectures. Typically, servers can accept and process requests simultaneously; this is done with multithreaded programming. At other times, the queries are so fast that sequential execution is enough.
The communication between clients and the server can happen either via the Internet, in the case of a wide area network (WAN), or just locally, when it's limited to the local area network (LAN). When necessary to enhance scalability, it is possible to incorporate more than one server into the client-server model. The servers form a pool and share the load between each other; thus, a rather balanced workload and bandwidth usage is achieved.
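The request/response relationship just described can be reduced to a minimal sketch using Python's standard socket library. The payload and the acknowledgement format are invented for the demonstration; a real inventory agent would exchange a structured report instead:

```python
import socket
import threading

def serve_once(server_sock: socket.socket) -> None:
    """The server side: accept one client, read its request, acknowledge it."""
    conn, _addr = server_sock.accept()
    with conn:
        request = conn.recv(1024).decode()
        conn.sendall(f"ACK: {request}".encode())

# The server listens on an ephemeral loopback port.
server = socket.socket()
server.bind(("127.0.0.1", 0))
server.listen(1)
threading.Thread(target=serve_once, args=(server,), daemon=True).start()

# The client connects, uploads a tiny payload, and reads the reply.
client = socket.create_connection(server.getsockname())
client.sendall(b"hostname=pc001")
reply = client.recv(1024).decode()
client.close()
server.close()
print(reply)  # ACK: hostname=pc001
```

Note how small the exchanged payload is; this is why, as discussed later, per-client bandwidth is rarely a concern in inventorying.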
Example of the client-server model: an Internet forum
The service that runs on the server is a computer application, and it usually builds on other services. Let's consider the example of a PHP-based web application that everyone knows: forums, or bulletin boards. The forum application is the service running on the server, and the clients are the members visiting the site, posting, reading posts, and so on.
The forum service cannot run on its own; it needs a set of other vital server components. A web server is necessary to listen for, accept, and serve HTTP requests from visitors. On the user's side, the web browser formulates the HTTP requests, establishes the communication with the target web server, and retrieves its HTTP responses. This is how simple web surfing can be explained from a client-server architecture perspective.
Nevertheless, this is not sufficient for the forum script to function properly. It is heavily dependent on a database service as well; this is where the data is stored. If the script is PHP-based, then the PHP interpreter is also a prerequisite so that the dynamically generated web pages can be processed. Other services may also be required but, for the sake of keeping things simple and presenting the basics of client-server, these will suffice.
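The web-server half of this stack can be exercised with Python's built-in HTTP machinery. The handler below merely stands in for the forum application (the page content is made up), while a one-line client plays the visitor's browser:

```python
import threading
import urllib.request
from http.server import BaseHTTPRequestHandler, HTTPServer

class ForumHandler(BaseHTTPRequestHandler):
    """Stands in for the forum service: answers every GET with a small page."""
    def do_GET(self):
        body = b"<html><body>Latest posts...</body></html>"
        self.send_response(200)
        self.send_header("Content-Type", "text/html")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, *args):  # keep the demo output quiet
        pass

# The web server component, on an ephemeral loopback port.
server = HTTPServer(("127.0.0.1", 0), ForumHandler)
threading.Thread(target=server.serve_forever, daemon=True).start()

# The visitor's browser, reduced to a single HTTP GET.
with urllib.request.urlopen(f"http://127.0.0.1:{server.server_port}/") as resp:
    status, page = resp.status, resp.read().decode()
server.shutdown()
print(status)  # 200
```

In the real forum, `do_GET` would hand the request to the PHP interpreter and query the database before responding; the HTTP conversation around it stays the same.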
Client-server model versus peer-to-peer paradigm
The client-server model has its share of advantages and drawbacks when compared with similar models such as the peer-to-peer paradigm. First and foremost, the client-server model is based on having only one place where the data is stored: the server. This provides enhanced security and manageability. The server can be tightly secured, firewalled, powered by high-performing components, given plenty of system resources, backed up regularly, and maintained appropriately.
Because the data is centralized, there is no need to maintain an error-free copy of it on every client, which makes for a safer infrastructure. From the clients' perspective, the server can be replaced, upgraded, or migrated to another machine without affecting them: they only know the path and destination by which to reach the server. If the migration or the maintenance is carried out properly, clients will not even be aware of it.
The peer-to-peer (P2P) paradigm takes a different approach from the client-server model. It presumes that every end point can act as both server and client. Undoubtedly, this brings the advantage of greater scalability and flexibility, but it is tougher and more time-consuming to maintain an up-to-date copy of the database on every end-point client.
The P2P paradigm reduces the possibility of network traffic congestion, since there is no dedicated server to get overloaded. Ultimately, though, it is not a magic pill either, as all the cross-talk between clients contributes to increased overall network traffic.
On the other hand, the client-server model does not provide such a high degree of robustness. If and when the server fails at the hardware level, clients won't be able to connect and get any data out of the management server at all until it gets replaced, repaired, or fixed. However, there are various workarounds to enhance the uptime of servers and ensure a balanced workflow, and redundancy can also be implemented within the model if it is truly necessary.
IT inventorying based on the client-server model
Each of the paradigms mentioned has its best-fit scenarios, where using one in favor of the other is the better decision. In the case of IT inventorying and resource management solutions, the first model, client-server centralization, is the better approach. Overloading the server is not easy, because the volume of data exchanged is really low, a few kilobytes at most, so the bandwidth usage is lightweight.
Most importantly, the client-server model yields immediate access to the information stored (and secured) in the database. Centralization is an advantage here.
In this article, we were introduced to IT inventory and resource management. We also learned about centralization with the help of the client-server model.