SAM Failure Points – A Change of Focus Provides Insight

By Jason Keogh, iQuate

The profile of IT Asset Management (ITAM) within organizations has been rising in recent years as IT managers realize that Asset Management has a significant role to play in managing budgets, negotiating better contracts and preventing inadvertent non-compliance and the financial risk that comes with it.

Software Asset Management (SAM) is a critical part of ITAM because software is typically one of the hardest assets to manage – especially within large-scale server environments. The metrics for licensing server software are complex and often tied to the underlying physical hardware, which is becoming harder to identify in this age of virtualization and cloud-based infrastructure. On top of that, server software typically relies on intricate licensing rules that are constantly evolving.

To understand failure, one has to understand what promise SAM has been making to the business – what is it failing to deliver? In essence, SAM exists for two reasons:

  1. To reduce the risk involved in running IT within an organization.
  2. To reduce the cost involved in running IT within an organization.

Inaccuracy Leads to Failure

There are several reasons for SAM failure, but ultimately it can be traced back to one of the earliest maxims of the computer industry – garbage in, garbage out.

In recent years, organizations have been told to expect that investment in SAM will reap significant returns by optimizing software licensing to reduce costs. A secondary goal is to avoid unexpected expenditure by keeping software deployment compliant with license entitlement. If the output from a SAM initiative is based on inaccurate inventory data, the organization immediately faces increased financial, contractual and operational risk.

Failure from Concentrating on the Wrong Software

The growth in recent years of audit activity by major software vendors is well documented. When a company is hit with an unexpected, unbudgeted expenditure as a result of a vendor audit, the SAM processes in that organization have failed.

To date, most SAM projects have focused on desktop environments, yet much of the recent unbudgeted expenditure resulting from vendor audits has come from vendors focused on server environments.

Desktop software is often governed by license keys. Well-established approaches have been developed over the years by vendors and clients to control and manage the deployment and usage of software in this environment.

Server software typically accounts for 70% to 85% of a large organization’s total software spend, yet SAM in server environments is nowhere near as established and mature as it is within the desktop estate. Again, there are many underlying reasons why – people often assume that, because server environments are more closely controlled, the data required for SAM will be easily available. This is typically not the case. Change management is normally focused on minimizing risks to availability and performance for mission-critical services, not on Software Asset Management requirements.

The systems management software deployed to servers is focused on availability and performance – it may tell you, for example, which processes are running – but knowing you have 100 servers with “Oracle database” on them is next to useless for establishing license requirements. For that, you also need information on edition, options, packs, clustering, and the underlying virtual and physical hardware.
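
To make the gap concrete, the sketch below is a simplified, hypothetical illustration of how a license position depends on exactly those details. The field names, editions and core factors are assumptions chosen for illustration only and do not represent any vendor's actual licensing rules.

    # Simplified, hypothetical illustration of why "a server is running Oracle"
    # is not enough: edition, options and physical hardware drive the license
    # requirement. All values are assumptions, not any vendor's actual rules.

    CORE_FACTOR = {"x86_64": 0.5, "power": 1.0}    # assumed per-architecture multipliers

    def processor_licenses(host: dict) -> float:
        """Estimate processor-metric licenses required for one host."""
        if host["edition"] == "Standard":
            return host["physical_sockets"]        # assumed per-socket metric
        factor = CORE_FACTOR.get(host["cpu_architecture"], 1.0)
        return host["physical_cores"] * factor     # assumed per-core metric

    inventory = [
        # Two hosts that look identical in a process listing, yet carry very
        # different license requirements and option exposure.
        {"name": "db01", "edition": "Standard", "cpu_architecture": "x86_64",
         "physical_sockets": 2, "physical_cores": 16, "options": []},
        {"name": "db02", "edition": "Enterprise", "cpu_architecture": "x86_64",
         "physical_sockets": 2, "physical_cores": 16,
         "options": ["Partitioning", "Diagnostics Pack"]},
    ]

    for host in inventory:
        print(host["name"], processor_licenses(host), host["options"])

Even in this toy model, two servers that report the same running process differ by a factor of four in license demand once edition and hardware are taken into account – and the separately licensed options on the second host would never show up in a simple process listing.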

SAM Issues for Networks

Modern networks are increasingly complex, relying on different technologies from multiple vendors, each with its own licensing metrics and characteristics. Much server-based, enterprise-level software has few or no built-in licensing controls, so it is very easy for unauthorized deployments to occur.

At the same time, these environments are also protected by various gatekeepers within the organization who are responsible for enforcing access and security policies. The better they do their job, the harder it can be for SAM practitioners to have the access they need.

The complexity, diversity and size of modern large-scale enterprise networks present many real challenges for those trying to get the accurate inventory data they need.

Networks have Technical Diversity and Complexity

Large networks comprise a range of technologies, platforms and versions. The person tasked with providing accurate inventory data needs to be well-versed in all of these technologies to build a real and accurate picture of the software assets deployed and in use, and of the hardware environment on which they are installed. These technologies often include Windows, Solaris, Linux, HP-UX, AIX, VMware, LPAR, databases, middleware, etc. – the list goes on and on.
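
As a minimal sketch of what that breadth of expertise looks like in practice, the fragment below maps a handful of platforms to the native commands typically used to gather CPU and hardware facts. The command list is illustrative rather than exhaustive, and the exact flags and output formats vary by version.

    # Minimal sketch: each platform exposes license-relevant hardware facts
    # through a different native command. Illustrative only; flags and output
    # formats vary by OS version.

    import subprocess

    HARDWARE_COMMANDS = {
        "linux":   ["lscpu"],                 # sockets, cores, threads
        "solaris": ["psrinfo", "-pv"],        # physical processor details
        "aix":     ["prtconf"],               # system configuration, LPAR data
        "hp-ux":   ["machinfo"],              # CPU and firmware details
        "windows": ["powershell", "-Command",
                    "Get-CimInstance Win32_Processor | "
                    "Select-Object NumberOfCores, NumberOfLogicalProcessors"],
    }

    def collect_cpu_facts(platform: str) -> str:
        """Run the platform-appropriate command and return its raw output."""
        result = subprocess.run(HARDWARE_COMMANDS[platform],
                                capture_output=True, text=True, check=True)
        return result.stdout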

Network Scale

Even assuming you have the skills to interrogate every server and complex application accurately, the size of modern networks makes this a very time-consuming process. While the data may be accurate, it is unlikely to be up to date and easily accessible. When the servers are spread across multiple sites, the problem is exacerbated.
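
A back-of-envelope calculation, using assumed figures, makes the point; substitute your own estate size, effort per server and refresh cadence.

    # Back-of-envelope illustration with assumed figures; substitute your own.
    servers = 5_000                 # assumed estate size
    minutes_per_server = 20         # assumed manual effort per server
    refreshes_per_year = 4          # assumed quarterly data refresh

    person_hours = servers * minutes_per_server * refreshes_per_year / 60
    print(f"{person_hours:,.0f} person-hours per year")    # roughly 6,700 hours

Even with these modest assumptions, keeping the data current is equivalent to three or four full-time staff doing nothing but inventory collection.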

Network Consistency

One approach to resolving the problem of scale is to have a team of people collecting this data. However, each person, regardless of experience and training, brings a different level of expertise and is liable to make manual mistakes, which can produce inconsistent and inaccurate results.

Network Inventory Accuracy

The potential for inconsistency inevitably leads to concerns about the accuracy of the inventory data. The complexity of networks also means that servers can be overlooked, so accuracy is further compromised by incomplete coverage.

Automated Solutions

Adopting an automated approach to inventory is a potential answer to some of these problems, but there are pros and cons to be weighed when evaluating automated tools. Security, ease of deployment, and the coverage and level of detail for different operating systems, databases, middleware and platforms all vary from tool to tool. Flexibility is also a key consideration – how will the automation cope with in-house developments that embed and extend commercial software, and how will it identify the users of in-house or bespoke products for database licensing?

SAM is at a crucial point in its history. Most organizations see the benefit in adopting SAM, but many are disappointed that they are not getting the return on investment they expected.

Many of the processes and procedures embedded in organizations today were developed for desktop applications, and these are rarely sufficient for dealing with the wider server compliance requirements and license positions. The reason for desktop or server management failure can often be traced back to inaccurate inventory – garbage in, garbage out. While the data exists for the desktop estate, it is often unavailable for complex server environments.

Assuming competent people armed with business support and the right processes are in place, success is achievable only if the data upon which they base their decisions is not flawed. The “foundation data” – details of software deployment, the hardware it runs on, virtual-to-physical mappings and an understanding of license entitlement – is critical to the success of any SAM project.
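
As a closing sketch, the structures below illustrate one way that foundation data might be modeled. The field names are assumptions chosen for illustration rather than a prescribed schema.

    # Illustrative sketch of the "foundation data" behind a license position.
    # Field names are assumptions, not a prescribed schema.
    from dataclasses import dataclass, field
    from typing import List, Optional

    @dataclass
    class Deployment:
        product: str                        # e.g. "Oracle Database"
        edition: str                        # e.g. "Enterprise"
        options: List[str] = field(default_factory=list)
        host: str = ""                      # virtual or physical host name

    @dataclass
    class Host:
        name: str
        physical_parent: Optional[str]      # None if the host is itself physical
        sockets: int
        cores: int

    @dataclass
    class Entitlement:
        product: str
        metric: str                         # e.g. "Processor", "Named User Plus"
        quantity: int

    # A license position compares deployments (resolved through the
    # virtual-to-physical mapping) against entitlements for the same product.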