No Rain On Your Parade – The Cloud is Coming, Relax

By Ed Cartier, xAssets Ltd.

It can be argued that the practice of IT asset management (ITAM) was born out of a number of unforeseen changes. To describe the impact of those changes, let me lay a little groundwork.

For years, data processing was centralized. Employees (not quite thought of as end-users yet) logged into a mainframe or a bank of mini-computers through terminals to run “home-grown” software written by COBOL programmers in the MIS department. Requests for new programs were submitted, and MIS decided whether each request had merit and assigned it a priority number. Life went on that way until about 1982.

Around that time, IBM and two small companies (Lotus and Microsoft) shook the MIS world to its foundations. Using IBM’s PC and Microsoft’s and Lotus’ software, employees (now eligible to be called end-users) could, for the first time, create budgets, forecasts and even graphics without asking someone to write a program for them. Individuals actually purchased their own computers and software and brought them to work. With control of IT operations seemingly lost, MIS Directors (soon to become CIOs) implemented PC programs and budgets. Lists of standard hardware configurations and software titles were developed, and budget dollars were re-allocated. The PCs, individual printers and floppy disks became the property of the company, and order was almost restored. And what, you may ask, happened to the people who ran the computer rooms and time-sharing operations? Some stayed in their jobs, as not all centralized computing was abandoned. Others were assigned to the new PC operations and became the forerunners of today’s IT asset managers.

Life settled down until 1999. At that point the use of PCs was well entrenched, but a new century was on the horizon. Forecasts of doom and pestilence abounded. All of a sudden companies had to know exactly what software was running, what computers were in use and where it all was located. Configuration checks had to be performed and software reviewed. Through the use of a series of new ITAM software products and much work and worry, disaster was avoided and the IT asset manager’s role grew in prominence.

We are now on the cusp of a new IT paradigm: the cloud is coming, and many IT asset managers are thinking, “What if there are no software licenses to reconcile against and my servers are replaced by some strange remote data center in the sky? Will my job survive?” My educated guess is: “most likely.” Let me explain.

In the early ’80s, not all applications were transferred from the data center to the desktop. Similarly, it is likely that not all of your end-user applications will migrate to the cloud. In fact, the cloud will probably not remove work from your team at all. It really adds a layer of complexity: a virtual data center joins your operations, and more than likely a set of new vendors comes with it. Now, instead of working with your data center managers and reconciling software purchases against lists from one Large Account Reseller (LAR) and a few direct suppliers, you will be managing most of your current inventory of purchased software along with the cloud software vendor and any Infrastructure as a Service (IaaS) or remote storage provider. There may be fewer software contracts to review and fewer configurations to track, but staying compliant with the terms of the cloud subscription and IaaS provider contracts will take up the slack.

Moreover, as some (and perhaps a lot of) installed software will remain, you will still need to provide proof of proper licensing. Not all software will be available in the cloud, and not all stakeholders will want to convert. At this writing, there seems to be a consensus that critical financial applications will not be put into the cloud, nor will key corporate data be stored offsite. In addition, some applications that are deemed mission critical or core to an organization’s operations, or in which there has been considerable investment, will probably remain on-site. Thus, you will not be totally relieved of your software license reconciliation duties.

To follow that thought, the general consensus is that compliant usage of Microsoft’s products (and probably those of all other software vendors) will still require that you retain the original base license to prove compliance. Not only that, but you will need to reconcile all purchases with the software that is in use, track and record all “software disposals,” and keep that information available and up-to-date. As any move to the cloud will likely be a phased program across a large organization, the implementation will be gradual, requiring management and monitoring in all environments. You will be managing the cloud applications, the installed applications, all of the changes and the de-installs. The work just piles on!
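At its core, the purchase-versus-install reconciliation described above is set arithmetic. The sketch below illustrates the idea; every title and count is invented for illustration, not drawn from any real entitlement record.

```python
# A minimal sketch of license reconciliation: installs discovered on the
# network versus purchase records. All titles and counts are hypothetical.

purchased = {"Visio": 40, "Project": 25, "AutoCAD": 10}   # entitlements owned
installed = {"Visio": 37, "Project": 31, "AutoCAD": 10}   # installs discovered

for title in sorted(set(purchased) | set(installed)):
    gap = purchased.get(title, 0) - installed.get(title, 0)
    if gap < 0:
        print(f"{title}: {-gap} more installs than licenses -- compliance exposure")
    elif gap > 0:
        print(f"{title}: {gap} spare licenses -- harvest or record as disposals")
    else:
        print(f"{title}: reconciled")
```

The “spare licenses” branch is where tracking de-installs and disposals pays off: without that record, surplus entitlements quietly expire instead of being reharvested.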

But, you may say, “Cloud computing is a pay-as-you-go system. Plus, we don’t actually license the software, so we can’t be out of compliance.” True, to a point. Subscription software can be tailored to match the exact number of users, but first you need to know what that number is. In reality, it is often “pay-for-what-you-guess-you-need.” It is easy to over-buy cloud solutions without concrete data on what you actually need. Consequently, monitoring usage of cloud-based applications becomes yet another task on your list. Just as you work to avoid over-licensing now, you will need to work to avoid over-subscribing to cloud-based software. That becomes even more critical for companies and organizations that are growing (adding staff) or restructuring (shedding staff). In dynamic situations it is very easy to have too many subscriptions, or not enough.
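The over- and under-subscription problem above boils down to comparing seats purchased against seats actually used. A toy sketch, with entirely made-up application names and figures:

```python
# Hypothetical subscription right-sizing check: purchased seats versus
# observed active users. Names and numbers are illustrative only.

subscriptions = {
    # app: (seats_purchased, active_users_last_90_days)
    "crm_suite": (500, 412),
    "office_suite": (1200, 1230),
    "design_tool": (50, 18),
}

for app, (seats, active) in subscriptions.items():
    surplus = seats - active
    if surplus > 0:
        print(f"{app}: {surplus} unused seats -- candidate for reduction at renewal")
    elif surplus < 0:
        print(f"{app}: short by {-surplus} seats -- compliance and renewal risk")
    else:
        print(f"{app}: exactly right-sized")
```

The hard part in practice is the second number: usage data has to be gathered continuously, because headcount changes move it in both directions.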

Also, you won’t be completely free of license reconciliations. Software subscriptions have terms as well. Fees and usage rights can be granted according to the number of users, devices or concurrent users, and compliance with the terms of the subscription agreement will still need to be monitored. Likewise, storage in the cloud may free up hard drives, and IaaS may reduce the number of servers in use, but cost thresholds and contract terms will still have to be monitored and adhered to.

But wait, there’s more! Most people equate cloud computing with Software as a Service (SaaS). However, SaaS is only one component of the cloud. In an IaaS environment, cloud versions of your current applications, such as those from Oracle and IBM, reside on a cloud infrastructure such as Amazon’s Elastic Compute Cloud®. The software vendor (e.g. Oracle) will then require you to obtain a cloud license for the software, which is often based on equivalency tables. You will have the pleasure of paying up front for the full suite of the software’s capabilities even if you only use part of the technology. Moreover, the IaaS provider (e.g. Amazon) will charge for each hour of system time (remember timesharing?) or by some other metric. So now you need to right-size both the infrastructure and the new software agreement, and keep track of what is being used, what is being charged and what you are entitled to use.
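The hourly-metering point is worth making concrete with back-of-the-envelope arithmetic. The rate below is a made-up placeholder, not actual Amazon pricing:

```python
# Rough sketch of a metered IaaS charge for an always-on workload.
# The hourly rate is an assumed placeholder, not a real provider price.

hourly_rate = 0.48          # assumed $/hour for one instance
instances = 4               # instances running the workload
hours_per_month = 24 * 30   # always-on usage, ~720 hours

monthly_cost = hourly_rate * instances * hours_per_month
print(f"Estimated monthly IaaS charge: ${monthly_cost:,.2f}")
```

Small numbers, but they compound: an instance someone forgot to shut down bills every hour, which is exactly why metered infrastructure needs the same usage tracking discipline as licensed software.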

However, before any organization needs to deal with any of these details, the transition of any application or process to the cloud must be planned in detail. Data regarding the number of licenses, the number of active users, relative cost per user, current total cost of ownership, existing investment, the life-cycle status and criticality of the application, and the consequences of a service outage all need to be considered. Guess who has most of that data? You will have a key role in planning any transition, and that role will likely grow after whatever service you select is adopted.

Once the service is adopted, there is the issue of service continuity and system availability. In a conventional on-premises environment, if the system went down, you called the help desk, reported the problem and expected them to fix it right away. In a cloud-based environment, the service level agreements (SLAs) include terms governing up-time and availability. You can also buy different levels of service, depending on how critical the application is to the organization. The point here is that someone (you) will need to track SLA compliance, complain to the provider when the system is down and ensure that you are getting what you are paying for.
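Tracking SLA compliance is, at bottom, a comparison of measured downtime against a contractual uptime target. A minimal sketch, assuming a 99.9% tier (a common figure, used here only as an example) and a hypothetical measured outage:

```python
# Illustrative SLA check: measured availability versus the contracted
# target. The target tier and outage duration are assumed examples.

sla_uptime_target = 0.999     # 99.9% promised availability
hours_in_month = 24 * 30      # ~720 hours
downtime_hours = 1.5          # hypothetical measured outage time

measured_uptime = (hours_in_month - downtime_hours) / hours_in_month
if measured_uptime < sla_uptime_target:
    print(f"SLA breach: {measured_uptime:.4%} uptime vs {sla_uptime_target:.1%} promised")
else:
    print("Provider met the SLA this month")
```

Note how little slack a 99.9% target leaves: roughly 43 minutes of downtime per month, so even a short outage can trigger the service-credit clause, provided someone is measuring.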

But let’s not forget that, in any environment where cloud services are adopted, your end-users and data centers will still be there. That means that most of the hardware side of an ITAM manager’s job remains intact. IMACs (installs, moves, adds and changes) will still have to be monitored, and life cycles will need to be managed. Configurations will need to be monitored, end-of-lease returns and disposals will need to be managed, and new or replacement computers will need to be purchased. Just add all that to your new list of duties.

I will stipulate that I do not have a crystal ball that predicts with certainty exactly what will happen as some premises-based computing functions and applications migrate to the cloud. However, I am certain that companies and organizations will still need to manage their IT costs, regardless of where the applications and infrastructure reside. In an environment where there may actually be dual infrastructures and parallel software usage and licensing terms, the need for a function dedicated to managing those IT assets is obvious. So, relax. Although the cloud is coming, it seems unlikely to me that your job is going anywhere.