IAITAM Preview: Strategic License Management in the Virtualized World

04 April 2014
10 minute read
ITAM News & Analysis

Live by the rule, “Content is King.”

Ahead of the IAITAM Spring Conference taking place in Las Vegas, April 28 – May 1, I interviewed Robbie Plourde, VP of Consulting and Services at Aspera, about his presentation, “Strategic License Management in the Virtualized World”.

In just a few words, tell us what it is you do. What does an average day look like for you?

I am currently the VP of Consulting and Services for Aspera Technologies, responsible for overseeing the consulting and services function in the US. An average day for me consists of interactions with customers and prospects, helping them to better understand Software License Management. I work with them to align their needs, goals, and solutions, improving their compliance positions as well as their license cost savings initiatives.

How does virtualization change license management?

Server licenses, which were already perplexing IT departments, have become even more complex with the advent of virtualization. Similar to the advancement of the mobile device industry, software publishers are constantly coming up with new license models. Think of it this way: clusters and server farms allow for rapid creation of virtual instances, flexible distribution of load demand, and immediate access to high processing power. But how should the software be licensed when it can, in principle, run on the entire cluster—but doesn’t?

The abstraction of hardware through virtualization means the medium-term, but nonetheless certain, retirement of traditional license models. Still, most models can be roughly placed into two categories: licenses based on the end device, or licensing by the number of users.

User-based license models focus on the number of authorized users. To accommodate licenses based on an end device, publishers have modified existing license models. These modified models include more complex metrics such as IBM’s PVUs. The metrics require multiple sets of data (e.g. CPUs/cores, type of virtualization – hard or soft partitioning –, hardware manufacturer, configuration, etc.) and mathematical calculations. To determine the effective license demand (how many licenses are needed to cover the software usage), the metrics must be followed while also taking the licensing rules and product use rights into consideration. For example, Microsoft’s 90-day rule states: “Volume Licensing product licenses can be reassigned to other devices every 90 days, not more frequently.”
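To make that kind of calculation concrete, here is a minimal Python sketch of an effective-license-demand calculation for a PVU-style server metric. The PVU-per-core values, field names, and partitioning rules are illustrative placeholders, not IBM’s actual tables or terms.

```python
# Minimal sketch of an "effective license demand" calculation for a
# PVU-style server metric. The PVU-per-core values and partitioning rules
# below are illustrative placeholders, not IBM's actual tables.

PVU_PER_CORE = {          # hypothetical lookup by processor family
    "xeon_e5": 70,
    "power8": 120,
}

def effective_pvu_demand(hosts):
    """Sum the PVUs required for a list of host records.

    Each record is assumed to carry: processor family, total cores on the
    physical host, cores assigned to the licensed workload, and whether the
    virtualization counts as hard partitioning (sub-capacity eligible).
    """
    total = 0
    for h in hosts:
        rate = PVU_PER_CORE[h["cpu_family"]]
        if h["hard_partitioned"]:
            # Sub-capacity: license only the cores allocated to the workload.
            cores = h["workload_cores"]
        else:
            # Soft partitioning / full capacity: license every core in the host.
            cores = h["physical_cores"]
        total += rate * cores
    return total

demand = effective_pvu_demand([
    {"cpu_family": "xeon_e5", "physical_cores": 16, "workload_cores": 4,
     "hard_partitioned": True},
    {"cpu_family": "power8", "physical_cores": 24, "workload_cores": 24,
     "hard_partitioned": False},
])
print(demand)  # 70*4 + 120*24 = 3160 PVUs in this toy example
```

Note how the partitioning type alone changes whether the whole host or only the allocated cores must be licensed, which is exactly why the underlying hardware and virtualization data have to be tracked.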

Unfortunately, the required data for server metrics is difficult for organizations to gather and track on an ongoing basis, and the ever-changing licensing terms have left ample room for legal gray zones.

Can it help with cost saving? If so, how?

Yes. (There’s a “but” coming later.)

Ultimately, in a virtualized environment license costs should only apply when a program is actually loaded or used by a user. That means costs which directly correlate to the value provided to the business. Another benefit would be more precise license terms, in which the actual use of the software is measured.

Compliance with license terms can then be better controlled, translating into less over-buying to compensate for audit risks and under-licensing. For at least a good two years now, companies such as Amazon and 3Tera have been offering Cloud Computing at inexpensive rates.

BUT, a precondition for user-oriented licenses is having comprehensive, and often complex, data collection methods for software operation, in which continual tracking of the usage data is recorded down to the second, similar to early billing plans in the cell phone industry. In the future, Software Asset Management will attach high importance to evaluating effective usage and boosting cost optimization, rather than to revealing where a company is under- or over-licensed.
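As a rough illustration of what second-level usage tracking involves, here is a minimal Python sketch of per-user, per-application metering. It assumes the application’s start and stop events can be hooked; the class and field names are hypothetical.

```python
# Minimal sketch of per-user usage metering, assuming application start and
# stop events can be hooked; names and structures are illustrative.
import time
from collections import defaultdict

class UsageMeter:
    def __init__(self):
        self._open = {}                      # (user, app) -> start timestamp
        self._seconds = defaultdict(float)   # (user, app) -> accumulated seconds

    def start(self, user, app):
        self._open[(user, app)] = time.time()

    def stop(self, user, app):
        started = self._open.pop((user, app), None)
        if started is not None:
            self._seconds[(user, app)] += time.time() - started

    def report(self):
        # Usage per user/application, recorded to the second.
        return {k: round(v) for k, v in self._seconds.items()}

meter = UsageMeter()
meter.start("alice", "cad_suite")
time.sleep(1)
meter.stop("alice", "cad_suite")
print(meter.report())   # e.g. {('alice', 'cad_suite'): 1}
```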

Many are still struggling with license management even without adding virtualization to the mix. What would your advice to those people be?

SAM covers a complex cross section of IT processes and can pose an integration problem between IT, strategic sourcing and finance. For this reason the focal point of license management should be transparency.

Most global enterprises have decentralized organizational structures. In general, licenses and contracts are purchased using different order systems and manually collected in Excel spreadsheets. And while individual tools may be sporadically used for specific data or vendors, there’s no specialized license management tool in place to handle all licenses and all vendors. This creates isolated silos of information.

What’s more, organizations with existing IT Asset Management (ITAM) practices discover that their enterprise ITAM technologies cannot adequately deliver license management. These tools do not provide the level of functionality required to manage entitlements, product use rights or to calculate license metrics.

Often, these organizations find themselves looking for a specialized license management tool to fill the license management—and SAM—gap in their existing ITAM solutions.

My advice is to invest in a dedicated license management tool to consolidate, centralize and normalize licensing data, and forget about customizing your ITAM solution. Customization will only lead to costly, complex, and frequent adaptations of the technology to deal with ever-changing licensing terms and server software.

Can virtualization help decrease operational spend? If so, how?

Servers typically run at very low average utilization levels (less than 15%). Virtualization increases utilization, which means that for a given workload that can be virtualized, a company can typically reduce the number of physical servers. From an operational spend perspective, this means hardware and energy costs can be reduced. But no technology in itself is ever a cost saver; only appropriate demand management can decrease operational spend.
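As a back-of-the-envelope illustration of that consolidation effect, the Python sketch below estimates how many hosts remain if a fleet running at the ~15% utilization mentioned above is consolidated to an assumed target utilization. The numbers are illustrative, not benchmarks.

```python
import math

# Rough consolidation estimate: keep the total load constant and ask how many
# hosts are needed at a higher target utilization. Figures are illustrative.

def consolidated_hosts(physical_hosts, avg_utilization, target_utilization):
    """Estimate hosts needed after virtualization for the same total load."""
    total_load = physical_hosts * avg_utilization
    return math.ceil(total_load / target_utilization)

before = 100
after = consolidated_hosts(before, avg_utilization=0.15, target_utilization=0.60)
print(f"{before} hosts -> {after} hosts")   # 100 hosts -> 25 hosts
```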

What would your top 5 tips for success be when it comes to license management in the virtualized world?

  • Don’t assume a tool replaces the need for license expertise. Only a licensing expert can help you interpret the gray zones in the licensing.
  • SAM requires several discovery tools. When choosing a SAM provider, a key evaluation criterion should therefore be the provider’s experience with integrating its solution with the existing discovery tools in your environment. You should establish an evaluation/validation checklist covering:

i. How many out-of-the-box connectors they provide;

ii. Their ability to extend the capabilities of these existing technologies to evaluate the required data; and

iii. Their specialized knowledge to create additional scripts to fill in the data holes.

Typically, a discovery tool will only cover one or a few platforms, but not all. Integrated discovery tools in suite products do not dive deeply enough to gather all the required information to manage server license metrics. Generally, these tools are designed for desktop scans and superficial server inventory. This means that an organization must invest in additional tools and scripts to fill the data holes.

  • Think about SLM data management before/while you implement. Use the ITIL process steps:

i. Strategy;

ii. Design (a complete solution covering not only the virtual infrastructure but also data management for SAM);

iii. Transition (both the infrastructure and the SLM data stream);

iv. Operation (the infrastructure and your ongoing compliance position).

  • Plan for dynamic Virtual Machine usage. Because VMs tend to be dynamically created on an as-needed basis and are often temporary in nature, it is useful to take the overall VM lifecycle into account when planning for license compliance (see the sketch after this list).
  • Use asset management tools designed to handle the virtualization landscape.  Software makers have begun adapting their licensing policies to make it easier to use their products in a virtual environment, but such policies may not always mesh well with license metering software.
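To illustrate the point about dynamic VM usage, here is a minimal Python sketch that derives the peak number of concurrently running VMs from lifecycle events. The event data and format are hypothetical, and a real compliance calculation would also need to know which licensed products run on each VM and under which metric.

```python
# Minimal sketch of tracking peak concurrent VMs across the VM lifecycle;
# the event data is hypothetical and stands in for hypervisor logs.

from datetime import datetime

events = [  # (timestamp, vm_id, action)
    (datetime(2014, 4, 1, 9, 0),  "vm-01", "created"),
    (datetime(2014, 4, 1, 9, 30), "vm-02", "created"),
    (datetime(2014, 4, 1, 11, 0), "vm-01", "destroyed"),
    (datetime(2014, 4, 1, 12, 0), "vm-03", "created"),
]

def peak_concurrent_vms(events):
    """Return the peak number of VMs alive at any point in the event stream."""
    running, peak = set(), 0
    for _, vm_id, action in sorted(events):
        if action == "created":
            running.add(vm_id)
        else:
            running.discard(vm_id)
        peak = max(peak, len(running))
    return peak

print(peak_concurrent_vms(events))  # 2 in this toy timeline
```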

What is your advice for staying in control of your license management position?

Manage your data quality. This is why it’s imperative that SAM tools have data quality management features built-in—and I mean more than just log files.

Every SAM program lives and dies by the quality of its data. So my advice is to live by the rule, “Content is King.” Content is data, so in other words: only consistent and high quality data should make its way into the SAM tool.

A good starting point is to define what quality data is. Quality data is having all the information you need, when you need it. The two main criteria to determine quality:

  1. Completeness – Data completeness in license management is three-dimensional. It requires having a) full organizational data (legal entities, cost centers, users, accounts); b) tools/scripts to gather data on each platform; and c) comprehensive hardware data. The data is not complete if you need configuration information on a Unix server but only have tools to scan Windows. To manage many server licenses you need to know the number of CPUs. If you don’t have this information, then you need to work on processes to get it.
  2. Consistency – Consistency draws on the reliability of the data. If server “SRV004” is recorded in your CMDB, then you should also find “SRV004” in your discovery data. If this is not the case, your data is not consistent. Similarly, if “SRV004” is marked in the CMDB as having four cores, then the scan data should provide the same information. Expected results vs. delivered results are a key factor in determining data consistency. If you expect 3,000 servers to be inventoried and only 1,500 show up in the data, then you have a consistency problem (a sketch of such a check follows below).
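Here is a minimal Python sketch of that kind of CMDB-vs-discovery consistency check. The record structures are illustrative, not any specific tool’s export format.

```python
# Minimal sketch of a CMDB-vs-discovery consistency check; record
# structures are illustrative, not a specific tool's export format.

cmdb = {"SRV004": {"cores": 4}, "SRV005": {"cores": 8}}
discovery = {"SRV004": {"cores": 4}}          # SRV005 missing from scan data

def consistency_report(cmdb, discovery):
    issues = []
    for server, expected in cmdb.items():
        found = discovery.get(server)
        if found is None:
            issues.append(f"{server}: in CMDB but not in discovery data")
        elif found["cores"] != expected["cores"]:
            issues.append(f"{server}: CMDB says {expected['cores']} cores, "
                          f"scan says {found['cores']}")
    coverage = len(discovery) / len(cmdb)
    issues.append(f"Coverage: {coverage:.0%} of expected servers inventoried")
    return issues

for line in consistency_report(cmdb, discovery):
    print(line)
```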

To ensure high quality data, I recommend companies gradually expand the coverage of data collection metric by metric, using quality gates before moving on, while keeping the existing data up to date.

If you could only give one piece of advice on this topic what would it be?

Manage your data quality!

What is the most important lesson you have ever learned when it comes to ITAM?

  1. Plan-Do-Check-Act (PDCA Cycle)
  2. Strategy-Design-Transition-Operation (ITIL processes)

Sticking close to those principles makes ITAM/SAM life so much easier.

Any final pieces of advice?

Look for a reliable partner, not just a tool vendor, when seeking out Software Asset Management solutions. There are many vendors in the market; do your research and make sure the vendor you select is willing to work with you as a partner and is not just selling you a solution.
