The Internet of Things has some interesting implications for Software Asset Managers (part 2)

Gartner Projection for Growth in Internet of Things, from 0.9bn in 2009 to 25bn in 2020.

In part one of this blog, I discussed the potential impact of an exponential growth in connected devices on today’s software licenses.

However, the Internet of Things is already bringing a different licensing challenge. An ever-increasing number of smart devices are being brought into homes, streets and workplaces. As with many rapid technical revolutions, the focus has been on innovation and market growth. Monetisation has been a secondary concern. Reality, however, catches up in time, and we are now seeing a number of blogs and articles (such as this example from Wired’s Jerome Buvat) debating how IoT vendors might actually start to make some money back.

So why does this matter to Software Asset Managers?

In April 2014, Gartner published a research paper, aimed at IoT vendors, entitled “Licensing and Entitlement Management is One of the Keys to Monetizing the Internet of Things”. Its author, Lawrie Wurster, argued strongly that vendors should see IoT devices not so much as hardware assets, but as platforms for software:

“…to secure additional revenue, manufacturers need to recognize the role that embedded software and applications play in the IoT, and they need to monetize this value”

Gartner point out a number of big advantages for vendors:

  • New offerings can be created with software enhancements, increasing speed to market and removing the expensive retooling of production lines.
  • A single license can bundle hardware, feature offerings and supporting services such as consulting.
  • Vendors can create tiered offerings, enabling the customer to start with basic levels of capability, but with the possibility to purchase more advanced features as they mature.
  • Offerings can be diversified. The vendor can create specific regional offerings, or niche solutions for specialist markets, without needing to manufacture different hardware.

This is not merely analyst speculation, though. It is already happening, and there are already vendors like Flexera helping to enable it. Flexera are a well-known name to software asset managers (and my employer, BMC, works in close partnership with them in the SAM space), but another significant part of their business is the provision of licensing frameworks to vendors. This year, they co-published a report with IDC which presented some striking findings from a survey of 172 device vendors:

  • 60% of the vendors are already bundling a mixture of device, software and services into licenses. (The Flexera chart puts it as vendors agreeing that “We use licensing and entitlement management systems to develop new offerings”.)
  • 32% already use software to enable upsold options. More than half will be doing this by 2017.
  • 27% already use a pay-per-use model, charging by the amount of consumption of the software on the devices. A further 22% plan to do so by 2017.
While there are clear advantages, both to vendors and consumers, there is a big unspoken challenge here. With licensing comes the difficulty of license management. This is not something that the industry has done well even before the smart device revolution: billions of dollars are paid annually by enterprises in compliance settlements.

Many ITAM functions depend heavily on automated discovery of the software installed and used on devices. However, today’s discovery tools may not be able to discover the many new classes of IP-connected devices. Even when the devices are visible, it may not be easy to detect which licensed options have been purchased and enabled.

Another big challenge may arise from a lack of centralisation. The growth of smart devices will be seen right across the business: in vehicles, facilities, logistics, manufacturing, even on the very products the company itself is selling. With software, the IT department typically had some oversight, although even this has been eroding (Skyhigh Networks, for example, now put the average number of cloud services in an enterprise at over 900… and it’s likely that a significant number of these were bought outside IT’s line of sight). Put bluntly: IT may simply have no mandate to control the purchasing of licensed devices.

This puts the IT Asset Management function in an interesting position. Software Asset Management and Hardware Asset Management, traditionally seen as two related but separable personas, are going to converge when it comes to smart devices. More widely, businesses may need guidance and support, to learn the lessons from IT’s previous difficulties in this area, and avoid even greater compliance and cost-sprawl problems in future.

The Internet of Things has some interesting implications for Software Asset Managers (part 1)

Gartner Projection for Growth in Internet of Things, from 0.9bn in 2009 to 25bn in 2020.

The phrase “the Internet of Things” is believed to have been coined by a brand manager at Procter & Gamble. Kevin Ashton, in a 1999 presentation, envisaged an exponential growth of connected devices as supply chain logistics got smarter.

Today, the Internet of Things is seen as one of the most significant technology trends, with analysts predicting that the number of connected, smart devices will grow to tens of billions over the next few years.

Much of this proliferation will happen in the workplace. For Software Asset Managers, this could have significant implications. The Internet of Things will not merely be a corner case for SAM: it could impact some of the biggest contracts, with key vendors like Oracle.

Oracle’s licensing rules are explained, at some length, in their Software Investment Guide. One commonly-used license type is called “Named User Plus”. Aimed at “environments where users and/or devices can be easily identified and counted”, the license model is illustrated with the following example:

Forklift-based licensing example from the Oracle Software Investment guide

Here, 15 fixed temperature devices are communicating directly with an Oracle database. There are also 30 forklifts, each of which has a transporter that also writes to the database.

In this case, a total of 415 licenses are required: 15 for the temperature sensors, and 400 for the humans operating the forklifts (because “the forklift is not a non-human-operated device”).

In the past, I’ve used this example, only semi-seriously, to illustrate what might happen if the Internet of Things grows at the speed of the pundits’ projections. Recently, the 2015 Gartner Predicts report for the Internet of Things projected an almost 28-fold growth in connected devices from 2009 to 2020.

Gartner Projection for Growth in Internet of Things, from 0.9bn in 2009 to 25bn in 2020.

The year 2009 is rather pertinent, because Oracle’s forklift example seems to have first appeared in the Software Investment Guide in that year (here’s an example at the Internet Archive).

If we crudely apply Gartner’s connected-device growth rate to the number of devices shown in Oracle’s forklift example, there would be well over 1,200 connected devices to license by 2020. That is roughly a trebling of the cost; the sketch below reproduces the arithmetic.
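For what it’s worth, the crude extrapolation is easy to reproduce. Here is a minimal Python sketch, with the same deliberately naive assumption that licensable counts scale in line with device counts:

```python
# Gartner's projection: 0.9bn connected devices in 2009 -> 25bn in 2020.
growth = 25.0 / 0.9                     # roughly 27.8x

devices_2009 = 15 + 30                  # temperature sensors + forklift devices
licenses_2009 = 15 + 400                # device licenses + Named User Plus licenses

devices_2020 = devices_2009 * growth
print(round(devices_2020))              # ~1250 connected devices to license

# If the license bill scales with the device count, the cost roughly trebles:
print(round(devices_2020 / licenses_2009, 1))   # ~3.0
```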

I have always laughingly acknowledged this as a crude illustration, until I chanced upon a March 2015 Forbes article titled “The Intelligent Forklift in the Age of the Industrial Internet of Things”:

Today’s “smart” forklift includes diagnostics that allow the equipment to signal when it needs to be serviced, speed controls, anti-slip technology that monitors wheel spin and improve traction on slick floors, collision detection, fork speed optimization, and more.

All of a sudden, my deliberately far-fetched example didn’t seem quite so unlikely.

As always, in Software Asset Management, the challenge is unlikely to be simple or contained. Software Asset Managers deal with many vendors, with many license types. Many of those licenses may depend on counts of connected devices. Many contracts pre-date the Internet of Things, which means costing models are outdated. Unfortunately, that’s unlikely to make the consumer any less liable.

In part 2 of this article, we will look at another major challenge already arising from the Internet of Things: the increasing application of software-style license terms to hardware.

Does SaaS mean the end of audits? The BSA don’t think so.

BSA document cover

In an industry which has struggled with year-on-year rises in the number of vendor-imposed software compliance audits, it can be tempting to see SaaS software, with its subscription pricing models, as a panacea. If we can replace a complex web of installation, site, and user-based licenses with a set of simple subscriptions, won’t that make the compliance challenge much simpler?

Unfortunately, it’s not as straightforward as that. This white paper (pdf, opens in new tab) by industry watchdog BSA – The Software Alliance – explores the breadth of ways it’ll be possible to breach terms and conditions of SaaS software.

A basic SaaS subscription for a simple client application might seem very easy to manage. BSA’s document, however, effectively arms auditors with a checklist of breaches to look for, including:

  • Accessing the service from prohibited geographies.
  • Sharing user accounts.
  • Allowing systems to pose as users.
  • Providing access to non-employees (e.g. contractors) where such access is prohibited.

For companies working with Cloud Service Providers, BSA goes into significant detail on the difficulties they may face in remaining compliant with their existing licensing agreements: intellectual property issues, geographical limitations, and providing auditors with the required access to Cloud infrastructure environments.

BSA represents many of the most assertive organizations involved in license audits, and this document suggests, firmly, that the challenge of audits will not be disappearing soon.  As the document states, while Cloud-based software “solves some license compliance challenges, it also creates new ones”.

Are enterprise Software License Audits costing businesses over $4bn per year?

KPMG survey responses about revenue derived from software audits

I blogged yesterday about the recently released KPMG survey of the software compliance industry.

One very interesting graph breaks down the percentage revenue derived by software vendors from their compliance programs:

KPMG survey responses about revenue derived from software audits

The overall survey is framed as follows:

(KPMG) surveyed 31 software companies representing more than 50 percent of the revenue in the software industry, where enterprise software revenue is expected to total $301 billion in 2013 (Gartner). 

If we take just 50% of that total US $301 billion enterprise software market (to represent – conservatively – the market share of the 31 companies that responded), and extrapolate from the mid-points of the buckets in the diagram (e.g. take “2% to less than 4%” as 3%), then we get an estimated figure for the total revenue derived from compliance programs of $3.99 billion.

That, of course, assumes an even distribution of software company sizes across each of the response levels.  That’s not a sound assumption, but it could push the figure higher as well as lower.   This also discounts the remainder of the enterprise market that did not respond to the survey, or were not surveyed, which could add an unknown amount to the figure.  Additionally, the figure above uses a value of 10% for the “10% or more” bucket – in reality this is likely to sit somewhere over 10%, but we have no data to indicate by how much.
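For transparency, here is the midpoint method in Python. The bucket shares below are placeholders (the actual response distribution is in the KPMG chart, not reproduced here); with the survey’s real shares, the same method yields the $3.99 billion figure above:

```python
MARKET = 301e9            # Gartner: 2013 enterprise software revenue
COVERED = 0.5 * MARKET    # conservative share held by the 31 respondents

# Bucket midpoint -> share of respondents. These shares are illustrative
# placeholders, not the survey's actual distribution.
buckets = {
    0.00: 0.12,   # no revenue from compliance programs
    0.01: 0.48,   # "less than 2%"        -> midpoint 1%
    0.03: 0.22,   # "2% to less than 4%"  -> midpoint 3%
    0.07: 0.08,   # "4% to less than 10%" -> midpoint 7%
    0.10: 0.10,   # "10% or more"         -> floored at 10%
}

weighted_rate = sum(mid * share for mid, share in buckets.items())
print(f"${COVERED * weighted_rate / 1e9:.2f}bn")   # ~$4bn
```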

What seems safe to say is that response to compliance enforcement is costing enterprise software consumers billions of dollars, and there is a good chance that the overall figure will be in excess of US $4 billion.

It has been difficult, to date, to estimate a reliable market size for the Software Asset Management market (not least because it is difficult to define: how much of the market is already accounted for in estimates for technologies such as discovery and the IT Asset repository?). However, if the damage caused by a lack of control is already counted in the billions, this suggests a significant addressable market.

Even a modest 5% estimate for the market value of saved compliance penalties would suggest an overall market of $200 million for preventative SAM alone, and this is before we consider the value of optimization rather than threat reduction.

This fascinating KPMG survey reveals the software license auditor’s viewpoint

KPMG survey front cover - "Is unlicensed software hurting your bottom line?"

Software licensing audits are a big challenge for IT departments.  65% of respondents to a 2012 Gartner survey reported that they had been audited by at least one software vendor in the past 12 months, a figure which has been on a steady upward trajectory for a number of years.

Often, companies being audited for software compliance will actually deal, at the front-line, with a 3rd party audit provider. One of the big names in this niche is KPMG, whose freely-downloadable November 2013 report, “Is unlicensed software hurting your bottom line?”, provides a very interesting window into the software compliance business.

The report details the results of a survey conducted between February and April 2013, with respondents made up of “31 software companies representing more than 50 percent of the revenue in the software industry”.

Revenue is driving software audits

The survey results show, rather conclusively, a belief in the business value of tackling non-compliance:

  • 52% of companies felt that their losses through unlicensed use of software amounted to more than 10% of their revenue.
  • Almost 90% reported that their compliance program is a source of revenue. For about a tenth, it makes up more than 10% of their overall software revenue.  For roughly half, it is at least 4%.

Compliance audits are increasingly seen as a sales process

  • In more than half of responding organisations, the software compliance function is part of Sales. This is reported as being up from 1 in 3, in an equivalent 2007 survey.
  • In 2007, 47% of compliance teams were part of the Finance department. This figure has plummeted to just 13%.

This shift is not universal, and some companies seem committed to a non-Sales model for their compliance team.  A compliance team member from one major software vendor talked to me about the benefit of this to his role: He can tell the customer he is completely independent of the sales function, and is paid no commission or bonus based on audit findings.  Many other vendors, however, structure audits as a fully-commissioned role.  As the survey points out:

  • Only 20% of companies pay no commission to any individuals involved in the compliance process.
  • In 59% of cases, the commission structure used is the same as the normal sales commission program.

There is further indication of the role of sales in the audit process, in the answers to the question on “settlement philosophy”.  More than half of the respondents reported a preference for using audit findings as leverage in a “forward-looking sales approach”, rather than wanting to seek an immediate financial settlement.

Almost half of vendors select audit targets based on profiling

The biggest single selection reason for a compliance review was nomination by the sales account team (53%), with previous account history in close second place (50%).

Interestingly, however, 47% reported selecting customers for review based on “Data analytics suggesting higher risk of non-compliance”, with 7% stating that random selection is used.  It seems that audits are still a strong likelihood regardless of an organisation’s actual compliance management.

Auditors prefer their own proprietary tools to customers’ SAM tools

There seems to be a distinct lack of regard for Software Asset Management tools. 42% of respondents seek to use their own discovery scripts in the audit process. Only 26% of the vendors stated that they use customers’ SAM tools, and remarkably this is down from 29% in 2007, when one might expect few SAM tools would have been found on customer sites anyway.

This echoes the experience of a number of customers with whom I have previously spoken, and it can be a real source of annoyance. How, some argue, is it fair that license models are so complex that it takes a secretive proprietary script, only available to the auditor, to perform a definitive deployment count?

Other observations

  • Software tagging has not been widely adopted: Less than half of respondents do it, or have plans to do so.
  • SaaS reduces the role of the software auditor. Only 15% reported any compliance issues, and more than half don’t even look for them.
  • Few companies seek to build protection against overdeployment into their software. From conversations I have had, most seem to want to encourage wide distribution. Some desktop software was deliberately released in a manner that has encouraged wide, almost viral distribution. In at least one case, an acquisition by a larger company has been the trigger for a significant and aggressive audit program, targeting almost every large company on the assumption that the software is likely to be found there.

Conclusions?

It is very clear from the survey results that many large software vendors have established their compliance program as a significant revenue generator, and with a significant shift of these functions into the sales department, we can probably assume that there is a broad intent to maintain or even grow this role.

Whether this is even compatible with a more collaborative model of software compliance management is highly questionable: the business case for the status quo seems very sound, from the vendor’s point of view. With so many vendors only trusting the discovery scripts used by their auditors, the situation for customers is nearly impossible: how can they verify compliance if the only counting tool is in the hand of the vendor?

The light at the end of the tunnel for many customers may be SaaS: SaaS software tends to be more self-policing, and consumption models are often simpler. However, it brings its own challenges: zombie accounts, decentralised purchasing, and a new set of inconsistent consumption models. Meanwhile, hosted software does not go away.

Microsoft hike key license price by 15%. How can you offset the rise?

A few days ago, Microsoft (or rather, many of its resellers) announced a 15% price rise for its user-based Client Access Licenses, covering a range of applications. The price hike was pretty much immediate, taking effect from 1st December 2012.

The change affects a comprehensive list of applications, so it’s likely that most organizations will be affected (although there are some exceptions, such as the PSA12 agreement in the UK public sector).

Under Microsoft’s client/server licensing system, Client Access Licenses (CALs) are required for every user or device accessing a server.

Customers using these models need to purchase these licenses in addition to the server application licenses themselves (and in fact, some analysts claim that CALs provide up to 80% of license revenue derived from these models).

What’s interesting is that the price rise only affects User-based CALs, not Device-based CALs. Prior to this change, the price of each CAL was typically the same for any given application/option, regardless of type.

This is likely to be a response to a significant industry shift towards user-based licensing, driven to a large extent by the rise of “Bring your own Device” (BYOD). As employees use more and more devices to connect to server-based applications, the Device CAL becomes less and less attractive.

As a result, many customers are shifting to user-based licensing, and with good reason.

Chart showing CAL prices before and after the change.

15% is a big rise to swallow. However, CAL licensing has often been pretty inefficient. With the burden of proof firmly on the customer, a true-up or audit often results in “precautionary spending”: “You’re not sure how many of our 5,000 users will be using this system, so we’d suggest just buying 5,000 CALs“. This may be compounded by ineffective use of the different licensing options available.

Here are three questions that every Microsoft customer affected by this change should be asking:

Do we know how many of our users actually use the software?
This is the most important question of all. It’s very easy to over-purchase CALs, particularly if you don’t have good data on actual usage. But if you can credibly show that 20% of that user base is not using the software, that could be a huge saving.

Could we save money by using both CAL types?
Microsoft and their resellers typically recommend that companies stick to one type of CAL or the other, for each application. But this is normally based on ease of management, not a specific prohibition of this approach.
But what if your sales force uses lots of mobile devices and laptops, while your warehouse staff only access a small number of shared PCs? It is likely to be far more cost-effective to purchase User CALs for the former group, while licensing the shared PCs with Device CALs. The saving may make the additional management overhead very worthwhile, as the sketch below illustrates.
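To make that concrete, here is a rough comparison in Python. The prices are hypothetical placeholders (actual CAL prices depend on your agreement and reseller); what matters is the shape of the calculation:

```python
# Hypothetical CAL prices -- real prices depend on your agreement.
USER_CAL = 38.0    # per user (after the 15% rise)
DEVICE_CAL = 33.0  # per device (unchanged)

sales_users, sales_devices = 500, 1500        # many devices per user
warehouse_users, warehouse_devices = 800, 60  # many users per shared PC

# Single-type strategies:
all_user = (sales_users + warehouse_users) * USER_CAL
all_device = (sales_devices + warehouse_devices) * DEVICE_CAL

# Mixed strategy: User CALs for sales, Device CALs for the warehouse.
mixed = sales_users * USER_CAL + warehouse_devices * DEVICE_CAL

print(all_user, all_device, mixed)
# 49400.0 51480.0 20980.0 -> the mixed approach is far cheaper here
```

In practice you would feed in real user and device counts from your discovery data, but the break-even logic stays the same.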

Do we have a lot of access by non-employee third parties such as contractors?
If so, look into the option of purchasing an External Connector license for the application, rather than individual CALs for those users or their devices.  External Connectors are typically a fixed price option, rather than a per-user CAL, so understand the breakpoints at which they become cost effective.  The exercise is described at the Emma Explains Microsoft Licensing in Depth blog.  Microsoft’s explanation of this license type is here.
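The breakpoint calculation itself is trivial once you know your prices. A sketch with placeholder numbers (External Connector licenses are priced per server, so the number of servers running the application matters):

```python
import math

USER_CAL = 38.0               # hypothetical per-user CAL price
EXTERNAL_CONNECTOR = 2000.0   # hypothetical per-server connector price
servers = 2                   # servers the external users access

# Number of external users above which connectors beat individual CALs:
breakpoint_users = math.ceil(servers * EXTERNAL_CONNECTOR / USER_CAL)
print(breakpoint_users)       # 106 external users in this example
```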

The good news is that the price hike will usually kick in at most customers’ next renewal. If you have a current volume licensing agreement, the previous prices should still apply until then.

This gives most Software Asset Managers a bit of time to do some thinking. If you can arm your company with the answer to the above questions by the time your next renewal comes around, you could potentially save a significant sum of money, and put a big dent in that unwelcome 15% price hike.

Image courtesy of Howard Lake on Flickr, used under Creative Commons licensing

Painted into a Corner: Why Software Licensing isn’t getting simpler

It’s not easy being a Software License Manager.

It’s really not easy being a Software License Manager in a company which uses products from one or more of the “usual suspects” among the major software vendors.  Some of the largest have spent recent years creating a licensing puzzle of staggering complexity.

There’s an optimistic school of thought which supposes that the next big change in the software industry – a shift to service-oriented, cloud-based software delivery – will make this particular challenge go away.  But how true is this? To answer the question, we need to take a look back, and understand how we arrived at the current problem.

In short, today’s complexity was driven by the last big industry megatrend: virtualization.

In an old-fashioned datacenter, licensing was pretty straightforward. You’re running our software on a box? License that box, please. Some boxes have got bigger? Okay, count the CPUs, thanks. It was nothing that should have been a big issue for an organized Asset Manager with an effective discovery tool. But as servers started to look a bit less like, well, servers, things changed, and it was a change that became rather dramatic.

The same humming metal boxes were still there in the data center, but the operating system instances they were supporting had become much more difficult to pin down.  Software vendors found themselves in a tricky situation, because suddenly there were plenty of options to tweak the infrastructure to deliver the same amount of software at a lower license cost. This, of course, posed a direct threat to revenues.

The license models had to be changed, and quickly. The result was a new set of metrics, based on assessment of the actual capacity delivered, rather than on direct counting of physical components.

In 2006, in line with a ramping up of the processor core count in its POWER5 server offering, IBM announced its new licensing rules. “We want customers to think in terms of ‘processor value units’ instead of cores”, said their spokesman. A key message was simplification, but that was at best debatable: CPUs and cores can be counted, whereas processor-specific unit costs have to be looked up. And note the timing: this was not something that arrived with the first POWER5 servers. It was well into the lifetime of that particular product line. Oh, and by the way, older environments like POWER4 were brought into the model, too.

And what about the costs? “This is not a pricing action. We aren’t changing prices,” added the spokesman.

For a vendor, that assertion is important. Changing pricing frameworks is a dangerous game for software companies, even if on paper it looks like a zero-sum game. The consequences of deviating significantly from the current mean can be severe: the customers whose prices rise tend to leave; those whose prices drop pay you less. Balance isn’t enough – you need to make the transition smooth for every customer.

Of course, virtualization didn’t stand still from August 2006 onwards, and hence neither did the license models.  With customers often using increasingly sophisticated data centers, built on large physical platforms, the actual processing capacity allocated to software might be significantly less than the total capacity of the server farm.  You can’t get away with charging for hundreds of processors where software is perhaps running on a handful of VMs.

So once again, those license models needed to change.  And, as is typical for revisions like this, sub-capacity licensing was achieved through the addition of more details, and more rules.  It was pretty much impossible to make any such change reductive.

This trend has continued: IBM’s Passport Advantage framework, at the time of writing, has an astonishing 46 different scenarios modelled in its processor value unit counting rules, and this number keeps increasing as new virtualization technologies are released. Most aren’t simple to measure: the Asset Manager needs access to a number of detailed facts and statistics. Cores, CPUs, capacity caps, partitioning, the ability of VMs to leap from one physical box to another – all of these and more may be involved in the complex calculations. Simply getting hold of the raw data is a big challenge.
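To give a flavour of the data required, here is a minimal sketch of one sub-capacity scenario in Python. The PVU rating and core counts are illustrative only (the real values come from IBM’s PVU table and from your virtualization layer), and the genuine rules cover dozens of scenarios beyond this one:

```python
# Illustrative sub-capacity PVU count -- values are not IBM's actual table.
PVU_PER_CORE = 70            # looked up per processor model

physical_cores = 64          # full capacity of the host server
vm_cores = [4, 8, 2]         # virtual cores allocated to VMs running the product

# Sub-capacity rule of thumb: license the allocated virtual cores,
# capped at the host's full physical capacity.
licensable_cores = min(sum(vm_cores), physical_cores)
print(licensable_cores * PVU_PER_CORE)    # 980 PVUs, vs 4480 at full capacity
```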

Another problem for the Software Asset Manager is the fact that there is often a significant and annoying lag between the emergence of a new technology, and the revision of software pricing models to fit it. In 2006, Amazon transformed IT infrastructure with their cloud offering. Oracle’s guidelines for applying its licensing rules in that environment only date back to 2008. Until the models are clarified, there’s ambiguity. Afterwards, there are more rules to account for.

(Incidentally, this problem is not limited to server-based software.  A literal interpretation of many desktop applications’ EULAs can be quite frightening for companies using widespread thin-client deployment. You might only have one user licensed to work with a specialist tool, but if they can theoretically open it on all 50,000 devices in the company, a bad-tempered auditor might be within their rights to demand 50,000 licenses.)

License models catch up slowly, and they catch up reactively, only when vendors feel the pressure to change them. This highlights another problem: despite the fine efforts of industry bodies like the SAM Standards Working Group, vendors have not found a way to collaborate.  As the IBM spokesman put it in that initial announcement: “We can’t tell the other vendors how to do their pricing structure”.

As a result, the problem is not just that these license models are complex. There are also lots of them. Despite fundamentally measuring the same thing, Oracle’s Processor Core Factors are completely different to IBM’s Processor Value Units. Each vendor deals with sub-capacity in its own way, not just in terms of counting rules but even in terms of which virtual systems can be costed on this basis. Running stuff in the cloud? There are still endless uncertainties and ambiguities. Each vendor is playing a constant game of catch-up, and they’re each separately writing their own rules for the game. And meanwhile, their auditors knock on the door more and more.
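The divergence is easy to illustrate: the same physical server produces numbers in two entirely different currencies. Another hedged sketch, with typical published factor values (0.5 and 70 appear as common x86 table entries, but always check the vendors’ current tables):

```python
import math

cores = 16   # one physical server, counted under two vendors' rules

# Oracle-style: physical cores x core factor -> processor licenses.
oracle_licenses = math.ceil(cores * 0.5)

# IBM-style: physical cores x PVU rating -> Processor Value Units.
ibm_pvus = cores * 70

print(oracle_licenses, ibm_pvus)   # 8 licenses vs 1120 PVUs
```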

Customers, of course, want simplification. But the industry is not delivering it. And the key problem is that pricing challenge.  A YouTube video from 2009 shows Microsoft’s Steve Ballmer responding to a customer looking for a simpler set of license models.  An edited transcript is as follows:

Questioner:

“Particularly in application virtualization and general virtualization, some of Microsoft’s licensing is full of challenging fine print…

…I would appreciate your thoughts on simplifying the licensing applications and the licensing policies.”

Ballmer:

“I don’t anticipate a big round of simplifying our licenses. It turns out every time you simplify something, you get rid of something. And usually what we get rid of, somebody has used to keep their prices down…

…The last round of simplification we did of licensing was six years ago… it turned out that a lot of the footnotes, a lot of the fine print, a lot of the caveats, were there because somebody had used them to reduce their costs…

…I know we would all like the goal to be simplification, but I think the goal is simplification without price increase. And our shareholders would like it to be a simplification without price decreases…

…I’d say we succeeded on simplification, and our customer satisfaction numbers plummeted for two and a half years.”

In engineering circles there is a wise saying: “Strong, light, cheap: Pick any two”. The lesson from the last few years in IT is that we can apply a similar mantra to software licensing: Simple, Flexible, Consistently Priced: Pick any two.

Vendors have almost always chosen the latter two.

This brings us to the present day, and the next great trend in the industry. According to IDC’s 2011 Software Licensing and Pricing survey, a significant majority of the new commercial applications brought to market in 2012 would be built for the Cloud. Vendors are seeing declining revenues from perpetual license models, while subscription-based revenue increases. Some commentators view this as a trend that will lead to the simplification of software license management. After all, people are easier to count than dynamic infrastructure… right?

However, for this simplification to occur, the previous pattern has to change, and it’s not showing any sign of doing so.  The IDC survey reported that nearly half of the vendors who are imminently moving to usage-based pricing models still had no means to track that usage. But no tracking will mean no revenue, so we know they’ll need to implement something. Once again, the software industry is in an individual and reactive state, rather than a collaborative one, and that will mean different metrics, multiple data collection tools, and a new set of complex challenges for the software asset manager.

And usage-based pricing is no guarantee of simplicity. A glance at the Platform-as-a-Service sector illustrates this problem neatly. Microsoft’s Azure, announced in 2009 and launched in 2010, promised new flexibility and scalability… and simplicity. But again, flexibility and simplicity don’t seem to be sitting well together.

To work out the price of an Azure service, the Asset Manager needs to understand a huge range of facts, including (but by no means limited to) usage times, usage volumes, and secondary options such as caching (both performance and geographic), messaging and storage.  Got all that?  Good, because now we have to get to grips with the contractual complications: MSDN subscriptions have to be accounted for, along with the impact of any existing Enterprise Agreements. Microsoft recognized the challenge and provided a handy calculator, only to acknowledge that “you will most likely find that the details of your scenario warrant a more comprehensive solution”. Simplicity, Flexibility, Consistent Pricing: Pick any two.
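To see why the calculator exists, consider even a heavily simplified cost model. Everything below is a hypothetical placeholder (the rates, credit and discount values are invented for illustration), and the real calculation adds caching, messaging, storage tiers and more:

```python
# A deliberately oversimplified Azure-style monthly cost model.
# All rates and contract terms below are invented placeholders.
def monthly_cost(instance_hours, storage_gb, egress_gb,
                 hourly_rate=0.12, storage_rate=0.07, egress_rate=0.09,
                 msdn_credit=0.0, ea_discount=0.0):
    usage = (instance_hours * hourly_rate
             + storage_gb * storage_rate
             + egress_gb * egress_rate)
    # Contractual layers: subscription credits first, then agreement discount.
    return max(usage - msdn_credit, 0.0) * (1.0 - ea_discount)

# Four small instances running all month, plus storage and bandwidth:
print(monthly_cost(instance_hours=4 * 720, storage_gb=500, egress_gb=200,
                   msdn_credit=100.0, ea_discount=0.15))   # ~253.81
```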

And, of course, the old models won’t go away either. Even in a service-oriented future, there will still be on-premise IT, particularly amongst the organizations providing those services.

Software vendors have painted themselves into a corner with their license models, and unless they can find a way to break that pattern, we face a real risk that the license management challenge will get even more complex. Entrenched complexity in the on-premise sector will be joined by a new set of challenges in the cloud.

The pattern needs to change. If it doesn’t change, be nice to your Software Asset Manager. They’ll need a coffee.