The Internet of Things has some interesting implications for Software Asset Managers (part 2)

Gartner Projection for Growth in Internet of Things, from 0.9bn in 2009 to 25bn in 2020.

In part one of this blog, I discussed the potential impact of an exponential growth in connected devices on today’s software licenses.

However, the Internet of Things is already bringing a different licensing challenge. An ever-increasing number of smart devices is arriving in our homes, streets and workplaces. As with many rapid technical revolutions, the focus has been on innovation and market growth, with monetisation a secondary concern. Reality, however, catches up eventually, and we are now seeing a number of blogs and articles (such as this example from Wired’s Jerome Buvat) debating how IoT vendors might actually start to make some money back.

So why does this matter to Software Asset Managers?

In April 2014, Gartner published a research paper, aimed at IoT vendors, entitled “Licensing and Entitlement Management is One of the Keys to Monetizing the Internet of Things”. Its author, Lawrie Wurster, argued strongly that vendors should see IoT devices not so much as hardware assets, but as platforms for software:

“…to secure additional revenue, manufacturers need to recognize the role that embedded software and applications play in the IoT, and they need to monetize this value”

Gartner point out a number of big advantages for vendors:

  • New offerings can be created through software enhancements, increasing speed to market and avoiding expensive retooling of production lines.
  • A single license can bundle hardware, feature offerings and supporting services such as consulting.
  • Vendors can create tiered offerings, enabling the customer to start with basic levels of capability, but with the possibility to purchase more advanced features as they mature.
  • Offerings can be diversified. The vendor can create specific regional offerings, or niche solutions for specialist markets, without needing to manufacture different hardware.

This is not merely analyst speculation, though. It is already happening, and there are already vendors like Flexera helping to enable it. Flexera are a well-known name to software asset managers (and my employer, BMC, works in close partnership with them in the SAM space), but another significant part of their business is the provision of licensing frameworks to vendors. This year, they co-published a report with IDC which presented some striking findings from a survey of 172 device vendors:

  • 60% of the vendors are already bundling a mixture of device, software and services into licenses.
    Chart from the Internet of Things study by Flexera, showing that 60% of vendors say “We use licensing and entitlement management systems to develop new offerings”.
  • 32% already use software to enable upsold options. More than half will be doing this by 2017.
  • 27% already use a pay-per-use model, charging by the amount of consumption of the software on the devices. A further 22% plan to do so by 2017.
While there are clear advantages, both to vendors and consumers, there is a big unspoken challenge here. With licensing comes the difficulty of license management. This is not something the industry has done well, even before the smart device revolution: enterprises pay billions of dollars annually in compliance settlements.

Many ITAM functions depend heavily on automated discovery of the software installed and used on devices. However, today’s discovery tools may not adapt easily to the many new classes of IP-connected devices. Even when the devices are visible, it may not be easy to detect which licensed options have been purchased and enabled.

Another big challenge might arise from a lack of centralisation. The growth of smart devices will be seen right across the business: in vehicles, facilities, logistics, manufacturing, even on the very products the company itself is selling. With software, the IT department typically had some oversight, although even this has been eroding (SkyHigh Networks, for example, now put the average number of cloud services in an enterprise at over 900… and it’s likely that a significant number of these were bought outside IT’s line of sight). Put bluntly: IT may simply have no mandate to control the purchasing of licensed devices.

This puts the IT Asset Management function in an interesting position. Software Asset Management and Hardware Asset Management, traditionally seen as two related but separable personas, are going to converge when it comes to smart devices. More widely, businesses may need guidance and support, to learn the lessons from IT’s previous difficulties in this area, and avoid even greater compliance and cost-sprawl problems in future.

The Internet of Things has some interesting implications for Software Asset Managers (part 1)

Gartner Projection for Growth in Internet of Things, from 0.9bn in 2009 to 25bn in 2020.

The phrase “the Internet of Things” is believed to have been coined by Kevin Ashton, then a brand manager at Procter & Gamble. In a 1999 presentation, he envisaged an exponential growth of connected devices as supply chain logistics got smarter.

Today, the Internet of Things is seen as one of the most significant technology trends, with analysts predicting that the number of connected, smart devices will grow to tens of billions over the next few years.

Much of this proliferation will happen in the workplace. For Software Asset Managers, this could have significant implications. The Internet of Things will not merely be a corner case for SAM: it could impact some of the biggest contracts, with key vendors like Oracle.

Oracle’s licensing rules are explained, at some length, in their Software Investment Guide. One commonly used license type is called “Named User Plus”. Aimed at “environments where users and/or devices can be easily identified and counted”, the license model is illustrated with the following example:

Forklift-based licensing example from the Oracle Software Investment guide

Here, 15 fixed temperature devices are communicating directly with an Oracle database. There are also 30 forklifts, each of which has a transporter that also writes to the database.

In this case, a total of 415 licenses are required: 15 for the temperature sensors, and 400 for the humans operating the forklifts (because “the forklift is not a non-human-operated device”).

In the past, I’ve used this example, only semi-seriously, to illustrate what might happen if the Internet of Things grows at the speed of the pundits’ projections. Recently, the 2015 Gartner Predicts report for the Internet of Things projected an almost 28-fold growth in connected devices from 2009 to 2020.

Gartner Projection for Growth in Internet of Things, from 0.9bn in 2009 to 25bn in 2020.

The year 2009 is rather pertinent, because Oracle’s forklift example seems to have first appeared in the Software Investment Guide in that year (here’s an example at the Internet Archive).

If we crudely apply Gartner’s connected-device growth rate to the number of devices shown in Oracle’s forklift example, there would be well over 1,200 connected devices to license by 2020. That’s a trebling of the cost.
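For those who like to see the arithmetic, here is a back-of-the-envelope sketch of that extrapolation. It simply combines the figures quoted above (15 sensors, 30 forklifts, 400 operators, and Gartner’s roughly 28-fold device growth); per-license prices are deliberately left out, and the scaling is exactly as crude as the paragraph admits.

```python
# Back-of-the-envelope sketch of the forklift extrapolation described above.
# Figures come from Oracle's published example and Gartner's 0.9bn -> 25bn projection;
# the growth is applied crudely and uniformly, exactly as caveated in the text.

GROWTH_FACTOR = 25 / 0.9              # ~27.8x growth in connected devices, 2009 -> 2020

# 2009 scenario from the Software Investment Guide
temperature_sensors = 15              # non-human-operated devices: licensed per device
forklifts = 30                        # human-operated, so the 400 operators are licensed instead
forklift_operators = 400

licenses_2009 = temperature_sensors + forklift_operators
print(f"2009 example: {licenses_2009} Named User Plus licenses")            # 415

# Crude 2020 extrapolation: scale only the connected devices
devices_2009 = temperature_sensors + forklifts                              # 45
devices_2020 = round(devices_2009 * GROWTH_FACTOR)
print(f"2020 extrapolation: ~{devices_2020} connected devices to license")  # ~1250
print(f"Rough cost multiple vs 2009: {devices_2020 / licenses_2009:.1f}x")  # ~3x
```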

I had always laughingly acknowledged this as a crude illustration, until I chanced upon a March 2015 Forbes article titled “The Intelligent Forklift in the Age of the Industrial Internet of Things”:

Today’s “smart” forklift includes diagnostics that allow the equipment to signal when it needs to be serviced, speed controls, anti-slip technology that monitors wheel spin and improve traction on slick floors, collision detection, fork speed optimization, and more.

All of a sudden, my deliberately far-fetched example didn’t seem quite so unlikely.

As always, in Software Asset Management, the challenge is unlikely to be simple or contained. Software Asset Managers deal with many vendors, with many license types. Many of those licenses may depend on counts of connected devices. Many contracts pre-date the Internet of Things, which means costing models are outdated. Unfortunately, that’s unlikely to make the consumer any less liable.

In part 2 of this article, we will look at another major challenge already arising from the Internet of Things: the increasing application of software-style license terms to hardware.

Does SaaS mean the end of audits? The BSA don’t think so.

BSA document cover

In an industry which has struggled with year-on-year rises in the number of vendor-imposed software compliance audits, it can be tempting to see SaaS software, with its subscription pricing models, as a panacea. If we can replace a complex web of installation, site, and user-based licenses with a set of simple subscriptions, won’t that make the compliance challenge much simpler?

Unfortunately, it’s not as straightforward as that. This white paper (pdf, opens in new tab) by the industry watchdog BSA (The Software Alliance) explores the breadth of ways in which it is possible to breach the terms and conditions of SaaS software.

A basic SaaS subscription for a simple client application might seem very easy to manage. BSA’s document, however, effectively arms auditors with a checklist of breaches to look for (a rough detection sketch follows the list), including:

  • Accessing the service from prohibited geographies.
  • Sharing user accounts.
  • Allowing systems to pose as users.
  • Providing access to non-employees (e.g. contractors) where such access is prohibited.
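For illustration only, here is a minimal sketch of how an asset manager might scan SaaS access logs for the breach types listed above. Everything in it is assumed: the log fields (“user”, “email”, “country”, “client”), the prohibited geographies, the employee domain and the account-sharing heuristic are hypothetical placeholders, and any real check would depend on the specific service’s logs and contract terms.

```python
# A rough sketch (not a real tool) of flagging the breach types listed above from
# SaaS access logs. The log format, field names and policy values are entirely
# hypothetical; they will differ for every service and every agreement.
from collections import defaultdict

PROHIBITED_COUNTRIES = {"XX", "YY"}          # geographies excluded by the agreement (placeholder)
EMPLOYEE_DOMAIN = "example.com"              # anything else is treated as a third party (placeholder)

def flag_breaches(access_log):
    """access_log: iterable of dicts with 'user', 'email', 'country' and 'client' keys."""
    findings = []
    countries_per_user = defaultdict(set)
    for event in access_log:
        if event["country"] in PROHIBITED_COUNTRIES:
            findings.append(f"{event['user']}: access from prohibited geography {event['country']}")
        if not event["email"].endswith("@" + EMPLOYEE_DOMAIN):
            findings.append(f"{event['user']}: access by non-employee account")
        if event["client"] == "api":                     # a system posing as a named user
            findings.append(f"{event['user']}: non-human (API) access on a user subscription")
        countries_per_user[event["user"]].add(event["country"])
    for user, countries in countries_per_user.items():
        if len(countries) > 2:                           # crude account-sharing heuristic
            findings.append(f"{user}: logins from {len(countries)} countries suggests account sharing")
    return findings
```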

For companies working with Cloud Service Providers, BSA goes into significant detail on the difficulties they may face in remaining compliant with their existing licensing agreements, including intellectual property issues, geographical limitations, and the obligation to give auditors the required access to Cloud infrastructure environments.

BSA represents many of the most assertive organizations involved in license audits, and this document suggests, firmly, that the challenge of audits will not be disappearing soon.  As the document states, while Cloud-based software “solves some license compliance challenges, it also creates new ones”.

Is the lack of ITSM and ITAM alignment causing application sprawl?

Urban sprawl

I’ve written before about the negative consequences of the lack of industry alignment between ITIL-focused ITSM functions, and the IT Asset Management groups which typically evolved somewhat separately.

A recent CapGemini study of CIOs and IT decision makers concisely illustrated one impact this is having:

  • 48% believe their business has more applications than it needs (up from 34% over the previous three years).
  • Only 37% believe the majority of their applications are mission critical.
  • 70% believe at least a fifth of their company’s applications share similar functionality and could be consolidated.

The majority believe a fifth of those applications should be retired or replaced.

This shows a very strong consensus amongst IT leaders: IT is spending too much money and time on too many applications, with too much overlap. And in the rapidly evolving application landscape, this impact is by no means limited to traditional on-premise software: Skyhigh’s 2013 study on cloud service adoption found that enterprise respondents used, on average, well over 500 cloud services (the largest number of services found in one organisation was an eye-watering 1,769). [Update for Q1 2015: SkyHigh now put the average at over 900.]

If we want to remain serious about understanding the business services our IT organizations are managing, overseeing and underpinning, surely we can’t lose track of key assets like this?

How can IT possibly aim to control this sprawl, understand its impact, pinpoint its risks and remove its vulnerabilities, if there is no unified overseeing function? Who is tracking which users are entitled to which services? Who ensures that users are equipped with the right services, and who removes their access once they leave, to ensure both data security and cost control? Who can identify the impact on key services if an application is removed or consolidated?

Concerningly, this does not appear to be high on the agenda in ITSM discussions. We still see two separate threads in the conference ecosystem: ITSM conferences rarely address asset management. Asset management conferences talk about suppliers and infrastructure without putting them in the context of the services they underpin. My own role involves product management of an ITAM system which is part of an ITSM suite, so I attend both sets of conferences, see both parallel tracks, and experience nagging concerns in each case that the other side of the picture is overlooked.

Recent initiatives such as the Pink Think Tank 14 are a welcome move, addressing in increasing detail the multi-sourced, multi-vendor evolution of IT service delivery, but there still does not appear to be a detailed focus on the actual assets and software being supplied by those vendors. That’s a gap. Those vendors fill the IT environment with assets, from physical kit through software services to less tangible “assets” like critical people with vital knowledge. All of those things cost money. They may have contractual associations. We may need to know, very quickly, who owns and supports them. And if a supplier is replaced, we need to know what they might take with them.

The harsh reality, as clearly shown by CapGemini’s study, is that CIOs and leaders are asking questions about consolidation that will require a detailed, holistic understanding of what we are actually spending money on, and why it is there.

Gartner’s London summit message: Make ITAM important!

Gartner’s IT Financial, Procurement and Asset Management summit rolled into London last week (11th and 12th September 2013), and promptly kicked off on an ominous note: Stewart Buchanan’s opening keynote warned that certain roles in IT, including that of the IT Asset Manager, risk becoming obsolete.

As the two day event progressed, however, it became increasingly clear that Gartner’s analysts don’t see ITAM as a complete anachronism. It is important, however, that it evolves with the technology and practices around it. Asset Management needs to become a key strategic tool to the business. For those of us who have been blogging on this theme for some time, and who have witnessed the best ITAM professionals in the industry delivering huge results from this approach, it is great to hear Gartner emphasising it so strongly.

Research Director Victoria Barber stressed the power of a strong “symbiotic relationship” between the Asset Management function, and IT’s financial controllers. “Finance needs to understand how it can leverage the data from Asset; Asset Management needs to understand how to support it”.

Barber’s fellow Research Director Patricia Adams described the evolving role of the IT Asset Management team in an increasingly virtualised environment. By Monday morning, she advised, the ITAM team should ensure that it is part of the process for spinning up a virtual machine.

Moving forward, Adams continued, the team needs to be aware of emerging technologies and to prepare for their potential adoption. This requires good awareness of what is going on in the business: “You want to make sure the asset team has the skills to work with the config team, to work with the virtualisation team, to understand what those teams are doing”.

As Buchanan concluded in a later session, companies should “use ITAM to continually improve and optimise both IT operations and the business use of IT”.

To this audience, at least, Gartner’s message is an encouraging one.

Microsoft hike key license price by 15%. How can you offset the rise?

A few days ago, Microsoft (or rather, many of its resellers) announced a 15% price rise for its user-based Client Access License, covering a range of applications. The price hike was pretty much immediate, taking effect from 1st December 2012.

The change affects a comprehensive list of applications, so it’s likely that most organizations will be affected (although there are some exceptions, such as the PSA12 agreement in the UK public sector).

Under Microsoft’s client/server licensing system, Client Access Licenses (CALs) are required for every user or device accessing a server.

Customers using these models need to purchase these licenses in addition to the server application licenses themselves (and in fact, some analysts claim that CALs provide up to 80% of  license revenue derived from these models).

What’s interesting is that the price rise only affects User-based CALs, not Device-based CALs. Prior to this change, the price of each CAL was typically the same for any given application/option, regardless of type.

This is likely to be a response to a significant industry shift towards user-based licensing, driven to a large extent by the rise of “Bring your own Device” (BYOD). As employees use more and more devices to connect to server-based applications, the Device CAL becomes less and less attractive.

As a result, many customers are shifting to user-based licensing, and with good reason.

15% is a big rise to swallow. However, CAL licensing has often been pretty inefficient. With the burden of proof firmly on the customer, a true-up or audit often results in “precautionary spending”: “You’re not sure how many of your 5,000 users will be using this system, so we’d suggest just buying 5,000 CALs“. This may be compounded by ineffective use of the different licensing options available.

Here are three questions that every Microsoft customer affected by this change should be asking:

Do we know how many of our users actually use the software?
This is the most important question of all. It’s very easy to over-purchase CALs, particularly if you don’t have good data on actual usage. But if you can credibly show that 20% of that user base is not using the software, that could be a huge saving.
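The arithmetic is trivial, but worth making explicit. A minimal sketch, using entirely hypothetical headcounts and a made-up per-CAL price:

```python
# Hypothetical illustration of the saving from counting actual users before a true-up.
# The headcounts and per-CAL price are invented; substitute your own figures.
total_users = 5000
actually_using = 4000            # e.g. measured from server access or usage logs
price_per_user_cal = 40.0        # hypothetical unit price after the 15% rise

precautionary_cost = total_users * price_per_user_cal
measured_cost = actually_using * price_per_user_cal
print(f"Buying for everyone:   {precautionary_cost:,.0f}")
print(f"Buying for actual use: {measured_cost:,.0f}")
print(f"Potential saving:      {precautionary_cost - measured_cost:,.0f} (20%)")
```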

Could we save money by using both CAL types?
Microsoft and their resellers typically recommend that companies stick to one type of CAL or the other, for each application. But this is normally based on ease of management, not a specific prohibition of this approach.
But what if your sales force uses lots of mobile devices and laptops, while your warehouse staff only access a small number of shared PCs? It is likely to be far more cost-effective to purchase user CALs for the former group, while licensing the shared PCs with device CALs. The saving may make the additional management overhead very worthwhile.
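A rough sketch of that comparison, again with invented headcounts and prices (check your own price list before drawing any conclusions):

```python
# Hypothetical comparison of a single-CAL-type purchase vs a mixed approach.
# All prices and headcounts below are illustrative only.
USER_CAL = 40.0                  # hypothetical per-user CAL price
DEVICE_CAL = 35.0                # hypothetical per-device CAL price

sales_staff = 300                # many devices each, so user CALs make sense
warehouse_staff = 200            # share a handful of PCs
shared_pcs = 25

all_user_cals = (sales_staff + warehouse_staff) * USER_CAL
mixed = sales_staff * USER_CAL + shared_pcs * DEVICE_CAL

print(f"User CALs for everyone: {all_user_cals:,.0f}")
print(f"Mixed user/device CALs: {mixed:,.0f}")
print(f"Saving from mixing:     {all_user_cals - mixed:,.0f}")
```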

Do we have a lot of access by non-employee third parties such as contractors?
If so, look into the option of purchasing an External Connector license for the application, rather than individual CALs for those users or their devices. External Connectors are typically a fixed-price option, rather than a per-user CAL, so it is worth understanding the breakpoints at which they become cost-effective. The exercise is described on the Emma Explains Microsoft Licensing in Depth blog. Microsoft’s explanation of this license type is here.
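The breakpoint itself is a one-line calculation once you have real prices. A sketch, with a purely hypothetical External Connector price and per-user CAL price:

```python
# Hypothetical breakpoint calculation for an External Connector licence.
# The prices below are made up; use the figures from your own price list or reseller quote.
EXTERNAL_CONNECTOR = 2000.0      # hypothetical fixed price per licensed server
USER_CAL = 40.0                  # hypothetical per-external-user CAL price
servers_accessed = 2             # servers the external users will access

connector_cost = EXTERNAL_CONNECTOR * servers_accessed
breakeven_users = connector_cost / USER_CAL
print(f"External Connectors beat per-user CALs above ~{breakeven_users:.0f} external users")
```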

The good news is that the price hike will usually kick in at most customers’ next renewal. If you have a current volume licensing agreement, the previous prices should still apply until then.

This gives most Software Asset Managers a bit of time to do some thinking. If you can arm your company with the answers to the above questions by the time your next renewal comes around, you could potentially save a significant sum of money, and put a big dent in that unwelcome 15% price hike.

Image courtesy of Howard Lake on Flickr, used under Creative Commons licensing

Painted into a Corner: Why Software Licensing isn’t getting simpler

It’s not easy being a Software License Manager.

It’s really not easy being a Software License Manager in a company which uses products from one or more of the “usual suspects” among the major software vendors.  Some of the largest have spent recent years creating a licensing puzzle of staggering complexity.

There’s an optimistic school of thought which supposes that the next big change in the software industry – a shift to service-oriented, cloud-based software delivery – will make this particular challenge go away.  But how true is this? To answer the question, we need to take a look back, and understand how we arrived at the current problem.

In short, today’s complexity was driven by the last big industry megatrend: virtualization.

In an old-fashioned datacenter, licensing was pretty straightforward.  You’re running our software on a box?  License that box,  please.  Some boxes have got bigger?  Okay, count the CPUs, thanks. It was nothing that should have been a big issue for an organized Asset Manager with an effective discovery tool.  But as servers started to look a bit less like, well, servers, things changed, and it was a change that became rather dramatic.

The same humming metal boxes were still there in the data center, but the operating system instances they were supporting had become much more difficult to pin down.  Software vendors found themselves in a tricky situation, because suddenly there were plenty of options to tweak the infrastructure to deliver the same amount of software at a lower license cost. This, of course, posed a direct threat to revenues.

The license models had to be changed, and quickly. The result was a new set of metrics, based on assessment of the actual capacity delivered, rather than on direct counting of physical components.

In 2006, in line with a ramping-up of the processor core count in its Power5 server offering, IBM announced its new licensing rules. “We want customers to think in terms of ‘processor value units’ instead of cores”, said their spokesman. A key message was simplification, but that was at best debatable: CPUs and cores can be counted, whereas processor-specific unit costs have to be looked up. And note the timing: this was not something that arrived with the first Power5 servers. It came well into the lifetime of that particular product line. Oh, and by the way, older environments like Power4 were brought into the model too.

And what about the costs? “This is not a pricing action. We aren’t changing prices,” added the spokesman.

For a vendor, that assertion is important. Changing pricing frameworks is a dangerous game for software companies, even if on paper it looks like a zero-sum game. The consequences of deviating significantly around the current mean can be severe: the customers whose prices rise tend to leave, while those whose prices drop pay you less. Balance isn’t enough; you need to make the transition smooth for every customer.

Of course, virtualization didn’t stand still from August 2006 onwards, and hence neither did the license models.  With customers often using increasingly sophisticated data centers, built on large physical platforms, the actual processing capacity allocated to software might be significantly less than the total capacity of the server farm.  You can’t get away with charging for hundreds of processors where software is perhaps running on a handful of VMs.

So once again, those license models needed to change.  And, as is typical for revisions like this, sub-capacity licensing was achieved through the addition of more details, and more rules.  It was pretty much impossible to make any such change reductive.

This trend has continued:  IBM’s Passport Advantage framework, at the time of writing, has an astonishing  46 different scenarios modelled in its processor value unit counting rules, and this number keeps increasing as new virtualization technologies are released. Most aren’t simple to measure: the Asset Manager needs access to a number of detailed facts and statistics.  Cores, CPUs, capacity caps, partitioning, the ability of VMs to leap from one physical box to another – all of these and more may be involved in the complex calculations. Simply getting hold of the raw data is a big challenge.
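To make the sub-capacity idea concrete, here is an illustrative sketch of a PVU-style calculation. The per-core PVU rating, core counts and VM caps are placeholders rather than values from IBM’s actual tables, and real Passport Advantage rules add many more conditions; the point is simply how many separate facts have to be gathered before a number can be produced.

```python
# Illustrative sketch of a PVU-style sub-capacity calculation. The per-core PVU
# rating and core counts are placeholders, not values from IBM's published tables.
def pvu_licences_required(physical_cores, pvu_per_core, vm_core_caps):
    """Charge the lower of full physical capacity and the summed, capped virtual capacity."""
    full_capacity = physical_cores * pvu_per_core
    sub_capacity = sum(min(cap, physical_cores) for cap in vm_core_caps) * pvu_per_core
    return min(full_capacity, sub_capacity)

# A hypothetical 32-core host rated at 70 PVUs per core, running the product
# in three VMs capped at 4, 4 and 8 cores respectively.
print(pvu_licences_required(physical_cores=32, pvu_per_core=70, vm_core_caps=[4, 4, 8]))
# -> 1120 PVUs under sub-capacity, versus 2240 PVUs at full capacity
```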

Another problem for the Software Asset Manager is the fact that there is often a significant and annoying lag between the emergence of a new technology, and the revision of software pricing models to fit it. In 2006, Amazon transformed IT infrastructure with their cloud offering. Oracle’s guidelines for applying its licensing rules in that environment only date back to 2008. Until the models are clarified, there’s ambiguity. Afterwards, there are more rules to account for.

(Incidentally, this problem is not limited to server-based software.  A literal interpretation of many desktop applications’ EULAs can be quite frightening for companies using widespread thin-client deployment. You might only have one user licensed to work with a specialist tool, but if they can theoretically open it on all 50,000 devices in the company, a bad-tempered auditor might be within their rights to demand 50,000 licenses.)

License models catch up slowly, and they catch up reactively, only when vendors feel the pressure to change them. This highlights another problem: despite the fine efforts of industry bodies like the SAM Standards Working Group, vendors have not found a way to collaborate.  As the IBM spokesman put it in that initial announcement: “We can’t tell the other vendors how to do their pricing structure”.

As a result, the problem is not just that these license models are complex. There are also lots of them. Despite fundamentally measuring the same thing, Oracle’s Processor Core Factors are completely different to IBM’s Processor Value Units. Each vendor deals with sub-capacity in its own way, not just in terms of counting rules but even in terms of which virtual systems can be costed on this basis. Running stuff in the cloud? There are still endless uncertainties and ambiguities. Each vendor is playing a constant game of catch-up, and each is separately writing its own rules for the game. And meanwhile, their auditors knock on the door more and more.
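As a small illustration of that incompatibility, consider the same hypothetical 16-core server measured under two vendors’ metrics. The core factor and PVU rating below are illustrative, not taken from either vendor’s published tables, yet even this toy example produces two numbers that cannot be meaningfully compared.

```python
# Two different vendor formulas applied to the same hypothetical 16-core server,
# to show that "measuring the same thing" still yields incomparable numbers.
# The core factor and PVU rating are illustrative placeholders.
import math

cores = 16
oracle_core_factor = 0.5          # illustrative Processor Core Factor
ibm_pvu_per_core = 70             # illustrative PVU rating

oracle_processor_licences = math.ceil(cores * oracle_core_factor)   # counted in "processors"
ibm_pvus = cores * ibm_pvu_per_core                                  # counted in "PVUs"

print(f"Oracle: {oracle_processor_licences} processor licences")     # 8
print(f"IBM:    {ibm_pvus} PVUs")                                     # 1120
```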

Customers, of course, want simplification. But the industry is not delivering it. And the key problem is that pricing challenge.  A YouTube video from 2009 shows Microsoft’s Steve Ballmer responding to a customer looking for a simpler set of license models.  An edited transcript is as follows:

Questioner:

“Particularly in application virtualization and general virtualization, some of Microsoft’s licensing is full of challenging fine print…

…I would appreciate your thoughts on simplifying the licensing applications and the licensing policies.”

Ballmer:

“I don’t anticipate a big round of simplifying our licenses.  It turns out every time you simplify something, you get rid of something.  And usually what we get rid of, somebody has used to keep their prices down…

…The last round of simplification we did of licensing was six years ago…. it turned out that a lot of the footnotes, a lot of the fine print,  a lot of the caveats, were there because somebody had used them to reduce their costs…

…I know we would all like the goal to be simplification, but I think the goal is simplification without price increase. And our shareholders would like it to be a simplification without price decreases”…

…I’d say we succeeded on simplification, and our customer satisfaction numbers plummeted for two and a half years”.

In engineering circles there is a wise saying: “Strong, light, cheap: Pick any two”. The lesson from the last few years in IT  is that we can apply a similar mantra to software licensing:  Simple, Flexible, Consistently Priced: Pick any two.

Vendors have almost always chosen the latter two.

This brings us to the present day, and the next great trend in the industry. According to IDC’s 2011 Software Licensing and Pricing survey, a significant majority of the new commercial applications brought to market in 2012 will be built for the Cloud. Vendors are seeing declining revenues from perpetual license models, while subscription-based revenue increases. Some commentators view this as a trend that will lead to the simplification of software license management. After all, people are easier to count than dynamic infrastructure… right?

However, for this simplification to occur, the previous pattern has to change, and it’s not showing any sign of doing so.  The IDC survey reported that nearly half of the vendors who are imminently moving to usage-based pricing models still had no means to track that usage. But no tracking will mean no revenue, so we know they’ll need to implement something. Once again, the software industry is in an individual and reactive state, rather than a collaborative one, and that will mean different metrics, multiple data collection tools, and a new set of complex challenges for the software asset manager.

And usage-based pricing is no guarantee of simplicity. A glance at the Platform-as-a-Service sector illustrates this problem neatly. Microsoft’s Azure, announced in 2009 and launched in 2010, promised new flexibility and scalability… and simplicity. But again, flexibility and simplicity don’t seem to be sitting well together.

To work out the price of an Azure service, the Asset Manager needs to understand a huge range of facts, including (but by no means limited to) usage times, usage volumes, and secondary options such as caching (both performance and geographic), messaging and storage.  Got all that?  Good, because now we have to get to grips with the contractual complications: MSDN subscriptions have to be accounted for, along with the impact of any existing Enterprise Agreements. Microsoft recognized the challenge and provided a handy calculator, only to acknowledge that “you will most likely find that the details of your scenario warrant a more comprehensive solution”. Simplicity, Flexibility, Consistent Pricing: Pick any two.
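To give a flavour of the meter-by-meter arithmetic involved, here is a deliberately simplified estimate. Every meter name and rate below is hypothetical, and a real Azure bill has far more dimensions (caching, messaging, MSDN benefits, Enterprise Agreement adjustments and so on):

```python
# A deliberately simplified sketch of the meter-by-meter arithmetic needed to
# estimate the cost of a cloud service. All meter names and rates are hypothetical.
hypothetical_rates = {
    "compute_hours": 0.12,       # per instance-hour
    "storage_gb_month": 0.05,    # per GB stored per month
    "egress_gb": 0.09,           # per GB transferred out
}

monthly_usage = {
    "compute_hours": 2 * 730,    # two instances running all month
    "storage_gb_month": 500,
    "egress_gb": 200,
}

estimate = sum(hypothetical_rates[m] * monthly_usage[m] for m in monthly_usage)
print(f"Rough monthly estimate: ${estimate:,.2f}")   # before any agreement-level adjustments
```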

And, of course, the old models won’t go away either. Even in a service-oriented future, there will still be on-premise IT, particularly amongst the organizations providing those services.

Software vendors have painted themselves into a corner with their license models, and unless they can find a way to break that pattern, we face a real risk that the license management challenge will get even more complex. Entrenched complexity in the on-premise sector will be joined by a new set of challenges in the cloud.

The pattern needs to change. If it doesn’t change, be nice to your Software Asset Manager. They’ll need a coffee.