Knowing what you DON’T know

Question Mark

I presented an Asset Management breakout session at the BMC Engage conference in Las Vegas today.  The slides are here:

An interesting question came up at the end: What percentage accuracy is good enough, in an IT Asset Management system?  It’s a question that might get many different answers.  Context is important: you might expect a much higher percentage (maybe 98%?) in a datacentre, but it’s not so realistic to achieve that for client devices which are less governable… and more likely to be locked away in forgotten drawers.

However, I think any percentage figure is pretty meaningless without another important detail: a good understanding of what you don’t know. Understanding what makes up the percentage of things that you don’t have accurate data on is arguably just as important as achieving a good positive score.

One of the key points of my presentation is that there has been a rapid broadening of the entities that might be defined as an IT Asset:

The evolution of IT Assets

The digital services of today and the future will likely be underpinned by a broader range of Asset types than ever.  A single service, when triggered, may touch everything from a 30-year-old mainframe to a seconds-old Docker instance. Any or all of those underpinning components may be of importance to the IT Asset Manager. After all, they cost money. They may trigger licensing requirements. They need to be supported. The Service Desk may need to log tickets against them.

The trouble is, not all of the new devices can be identified, discovered and managed in the same way as the old ones.  The “discover and reconcile” approach to Asset data maintenance still works for many Asset types, but we may need a completely different approach for new Asset classes like SaaS services, or volatile container instances.

The IT Asset Manager may not be able to solve all those problems.  They may not even be in a position to have visibility, particularly if IT has lost its overarching governance role over what Assets come into use in the organization (SkyHigh Networks' most recent Cloud Adoption and Risk Report puts the average number of Cloud Services in use in an enterprise at almost 1,100. Does anyone think IT has oversight of all of those, anywhere?).

However, it’s still important to understand and communicate those limitations.  With CIOs increasingly focused on ITAM-dependent data such as the overall cost of running a digital service, any blind spots should be identified, understood, and communicated. It’s professional, it’s helpful, it enables a case to be made for corrective action, and it avoids something that senior IT executives hate: surprises.

Question mark image courtesy Cesar Bojorquez on Flickr. Used under Creative Commons licensing.


The Internet of Things has some interesting implications for Software Asset Managers (part 2)

Gartner Projection for Growth in Internet of Things, from 0.9bn in 2009 to 25bn in 2020.

In part one of this blog, I discussed the potential impact of an exponential growth of connected devices, in terms of today’s software licenses.

However, the Internet of Things is already bringing a different licensing challenge. An ever-increasing number of smart devices is arriving in homes, streets and workplaces. As with many rapid technical revolutions, the focus has been on innovation and market growth, with monetisation a secondary concern.  Reality, however, catches up in time, and now we are seeing a number of blogs and articles (such as this example from Wired’s Jerome Buvat) debating how IoT vendors might actually start to make some money back.

So why does this matter to Software Asset Managers?

In April 2014, Gartner published a research paper, aimed at IoT vendors, entitled “Licensing and Entitlement Management is One of the Keys to Monetizing the Internet of Things“. Its author, Lawrie Wurster, argued strongly that vendors should see IoT devices not so much as hardware assets, but as platforms for software:

“…to secure additional revenue, manufacturers need to recognize the role that embedded software and applications play in the IoT, and they need to monetize this value”

Gartner point out a number of big advantages for vendors:

  • New offerings can be created with software enhancements, increasing speed to market and removing the expensive retooling of production lines.
  • A single license can bundle hardware, feature offerings and supporting services such as consulting.
  • Vendors can create tiered offerings, enabling the customer to start with basic levels of capability, but with the possibility to purchase more advanced features as they mature.
  • Offerings can be diversified. The vendor can create specific regional offerings, or niche solutions for specialist markets, without needing to manufacture different hardware.

This is not merely analyst speculation, though. It is already happening, and there are already vendors like Flexera helping to enable it. Flexera are a well-known name to software asset managers (and my employer, BMC, works in close partnership with them in the SAM space), but another significant part of their business is the provision of licensing frameworks to vendors.  This year, they co-published a report with IDC which presented some striking findings from a survey of 172 device vendors:

  • 60% of the vendors are already bundling a mixture of device, software and services into licenses.
    (Chart from the Internet of Things study by Flexera: 60% of vendors say “We use licensing and entitlement management systems to develop new offerings”.)
  • 32% already use software to enable upsold options. More than half will be doing this by 2017.
  • 27% already use a pay-per-use model, charging by the amount of consumption of the software on the devices. A further 22% plan to do so by 2017.
While there are clear advantages, both to vendors and consumers, there is a big unspoken challenge here. With licensing comes the difficulty of license management. This is not something that the industry has done well, even before the smart device revolution: billions of dollars are paid annually by enterprises in compliance settlements.

Many ITAM functions depend heavily on automated discovery of software installed and used on devices. However, today’s discovery tools may not adapt to discovering multiple new classes of IP-connected devices. Even when the devices are visible, it may not be easy to detect which licensed options have been purchased and enabled.

Another big challenge might arise from a lack of centralisation. The growth of smart devices will be seen right across the business: in vehicles, facilities, logistics, manufacturing, even on the very products the company itself is selling. With software, the IT department typically had some oversight, although even this has been eroding (SkyHigh Networks, for example, now put the average number of cloud services in an enterprise at over 900… and it’s likely that a significant number of these were bought outside IT’s line of sight).  Put bluntly: IT may simply have no mandate to control the purchasing of licensed devices.

This puts the IT Asset Management function in an interesting position. Software Asset Management and Hardware Asset Management, traditionally seen as two related but separable personas, are going to converge when it comes to smart devices. More widely, businesses may need guidance and support, to learn the lessons from IT’s previous difficulties in this area, and avoid even greater compliance and cost-sprawl problems in future.

The Internet of Things has some interesting implications for Software Asset Managers (part 1)

Gartner Projection for Growth in Internet of Things, from 0.9bn in 2009 to 25bn in 2020.

The phrase “the Internet of Things” is believed to have been coined by Kevin Ashton, a brand manager at Procter and Gamble, who in a 1999 presentation envisaged an exponential growth of connected devices as supply chain logistics got smarter.

Today, the Internet of Things is seen as one of the most significant technology trends, with analysts predicting that the number of connected, smart devices will grow to tens of billions over the next few years.

Much of this proliferation will happen in the workplace. For Software Asset Managers, this could have significant implications. The Internet of Things will not merely be a corner case for SAM: it could impact some of the biggest contracts, with key vendors like Oracle.

Oracle’s licensing rules are explained, at some length, in their Software Investment Guide. One commonly-used license type is called “Named-User Plus”.  Aimed at “environments where users and/or devices can be easily identified and counted”, the license model is illustrated with the following example:

Forklift-based licensing example from the Oracle Software Investment guide

Here, 15 fixed temperature devices are communicating directly with an Oracle database.  There are also 30 forklifts, each of which has a transporter which also writes to the database.

In this case, a total of 415 licenses are required: 15 for the temperature sensors, and 400 for the humans operating the forklifts (because “the forklift is not a non-human-operated device”).

In the past, I’ve used this example, only semi-seriously, to illustrate what might happen if the Internet of Things grows at the speed of the pundits’ projections. Recently, the 2015 Gartner Predicts report for the Internet of Things projected an almost 28-fold growth in connected devices from 2009 to 2020.

Gartner Projection for Growth in Internet of Things, from 0.9bn in 2009 to 25bn in 2020.

The year 2009 is rather pertinent, because Oracle’s forklift example seems to have first appeared in the Software Investment Guide in that year (here’s an example at the Internet Archive).

If we crudely apply Gartner’s projected growth rate to the number of devices shown in Oracle’s forklift example, there would be well over 1,200 connected devices to license by 2020. That’s roughly a trebling of the cost.
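As a back-of-envelope sketch (using the Gartner figures of 0.9bn devices in 2009 and 25bn in 2020, and Oracle’s example counts of 45 devices and 415 licenses, with everything else an assumption for illustration), the arithmetic looks like this:

```python
# Crude projection: apply Gartner's IoT growth rate to Oracle's
# 2009 forklift licensing example. All figures are from the post.
devices_2009_bn = 0.9    # Gartner: connected devices in 2009 (billions)
devices_2020_bn = 25.0   # Gartner: projected connected devices in 2020
growth_factor = devices_2020_bn / devices_2009_bn  # ~27.8x

example_devices_2009 = 15 + 30    # temperature sensors + forklifts
example_licenses_2009 = 15 + 400  # sensor licenses + Named-User Plus for operators

# Scale the device count by the same growth factor (the crude assumption:
# every 2020 device in the example is individually licensable)
example_devices_2020 = example_devices_2009 * growth_factor  # ~1,250 devices

cost_multiple = example_devices_2020 / example_licenses_2009  # ~3x
print(f"{growth_factor:.1f}x growth -> ~{example_devices_2020:.0f} devices, "
      f"~{cost_multiple:.1f}x the original license count")
```

The “trebling” is simply ~1,250 projected licensable devices against the original 415 licenses; as the post says, it is deliberately crude, but it illustrates the direction of travel.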

I have always laughingly acknowledged this as a crude illustration, until I chanced upon a March 2015 Forbes article titled “The Intelligent Forklift in the Age of the Industrial Internet of Things”:

Today’s “smart” forklift includes diagnostics that allow the equipment to signal when it needs to be serviced, speed controls, anti-slip technology that monitors wheel spin and improve traction on slick floors, collision detection, fork speed optimization, and more.

All of a sudden, my deliberately far-fetched example didn’t seem quite so unlikely.

As always, in Software Asset Management, the challenge is unlikely to be simple or contained. Software Asset Managers deal with many vendors, with many license types. Many of those licenses may depend on counts of connected devices. Many contracts pre-date the Internet of Things, which means costing models are outdated. Unfortunately, that’s unlikely to make the consumer any less liable.

In part 2 of this article, we will look at another major challenge already arising from the Internet of Things: the increasing application of software-style license terms to hardware.

Does SaaS mean the end of audits? The BSA don’t think so.

BSA document cover

In an industry which has struggled with year-on-year rises in the number of vendor-imposed software compliance audits, it can be tempting to see SaaS software, with its subscription pricing models, as a panacea. If we can replace a complex web of installation, site, and user-based licenses with a set of simple subscriptions, won’t that make the compliance challenge much simpler?

Unfortunately, it’s not as straightforward as that. This white paper (pdf, opens in new tab) by industry watchdog BSA – The Software Alliance – explores the breadth of ways it’ll be possible to breach terms and conditions of SaaS software.

A basic SaaS subscription for a simple client application might seem very easy to manage. BSA’s document, however, effectively arms auditors with a checklist of breaches to look for, including:

  • Accessing the service from prohibited geographies.
  • Sharing user accounts.
  • Allowing systems to pose as users.
  • Providing access to non-employees (e.g. contractors) where such access is prohibited.

For companies working with Cloud Service Providers, BSA goes into significant detail on the challenges they may face in retaining compliance with their existing licensing agreements: a range of challenges including IP challenges, geographical limitations, and providing auditors with required access to Cloud infrastructure environments.

BSA represents many of the most assertive organizations involved in license audits, and this document suggests, firmly, that the challenge of audits will not be disappearing soon.  As the document states, while Cloud-based software “solves some license compliance challenges, it also creates new ones”.

Is the lack of ITSM and ITAM alignment causing application sprawl?

Urban sprawl

I’ve written before about the negative consequences of the lack of industry alignment between ITIL-focused ITSM functions, and the IT Asset Management groups which typically evolved somewhat separately.

A recent CapGemini study of CIOs and IT decision makers concisely illustrated one impact this is having:

  • 48% believe their business has more applications than it needs (up from 34% over the previous three years).
  • Only 37% believe the majority of their applications are mission critical.
  • 70% believe at least a fifth of their company’s applications share similar functionality and could be consolidated.

The majority believe a fifth of those applications should be retired or replaced.

This shows a very strong consensus amongst IT leaders: IT is spending too much money and time on too many applications, with too much overlap. And in the rapidly evolving application landscape, this impact is by no means limited to traditional on-premise software: SkyHigh’s 2013 study on cloud service adoption found that enterprise respondents used, on average, well over 500 cloud services (the largest number of services found in one organisation was an eye-watering 1,769). [Update for Q1 2015: SkyHigh now put the average at over 900.]

If we want to remain serious about understanding the business services our IT organizations are managing, overseeing and underpinning, surely we can’t lose track of key assets like this?

How can IT possibly aim to control this sprawl, understand its impact, pinpoint its risks and remove its vulnerabilities, if there is no unified overseeing function? Who is tracking which users are entitled to which services? Who ensures that users are equipped with the right services, and who removes their access once they leave, to ensure both data security and cost control? Who can identify the impact on key services if an application is removed or consolidated?

Concerningly, this does not appear to be high on the agenda in ITSM discussions. We still see two separate threads in the conference ecosystem: ITSM conferences rarely address asset management. Asset management conferences talk about suppliers and infrastructure without putting them in the context of the services they underpin. My own role involves product management of an ITAM system which is part of an ITSM suite, so I attend both sets of conferences, see both parallel tracks, and experience nagging concerns in each case that the other side of the picture is overlooked.

Recent initiatives such as the Pink Think Tank 14 are addressing, in welcome detail, the multi-sourced, multi-vendor evolution of IT service delivery, but there still does not appear to be a detailed focus on the actual assets and software being supplied by those vendors.  That’s a gap. Those vendors fill the IT environment with assets, from physical kit through software services to less tangible “assets” like critical people with vital knowledge.  All those things cost money. They may have contractual associations. We may need to know, very quickly, who owns and supports them. And if a supplier is replaced, we need to know what they might take with them.

The harsh reality, as clearly shown by CapGemini’s study, is that CIOs and leaders are asking questions about consolidation that will require a detailed, holistic understanding of what we are actually spending money on, and why it is there.

This fascinating KPMG survey reveals the software license auditor’s viewpoint

KPMG survey front cover - "Is unlicensed software hurting your bottom line"

Software licensing audits are a big challenge for IT departments.  65% of respondents to a 2012 Gartner survey reported that they had been audited by at least one software vendor in the past 12 months, a figure which has been on a steady upward trajectory for a number of years.

Often, companies being audited for software compliance will actually deal, at the front-line, with a 3rd party audit provider. One of the big names in this niche is KPMG, whose freely-downloadable November 2013 report, “Is unlicensed software hurting your bottom line?”, provides a very interesting window into the software compliance business.

The report details the results of a survey conducted between February and April 2013, with respondents made up of “31 software companies representing more than 50 percent of the revenue in the software industry”.

Revenue is driving software audits

The survey results show, rather conclusively, a belief in the business value of tackling non-compliance:

  • 52% of companies felt that their losses through unlicensed use of software amounted to more than 10% of their revenue.
  • Almost 90% reported that their compliance program is a source of revenue. For about a tenth, it makes up more than 10% of their overall software revenue.  For roughly half, it is at least 4%.

Compliance audits are increasingly seen as a sales process

  • In more than half of responding organisations, the software compliance function is part of Sales. This is reported as being up from 1 in 3, in an equivalent 2007 survey.
  • In 2007, 47% of compliance teams were part of the Finance department. This figure has plummeted to just 13%.

This shift is not universal, and some companies seem committed to a non-Sales model for their compliance team.  A compliance team member from one major software vendor talked to me about the benefit of this to his role: He can tell the customer he is completely independent of the sales function, and is paid no commission or bonus based on audit findings.  Many other vendors, however, structure audits as a fully-commissioned role.  As the survey points out:

  • Only 20% of companies pay no commission to any individuals involved in the compliance process.
  • In 59% of cases, the commission structure used is the same as the normal sales commission program.

There is further indication of the role of sales in the audit process, in the answers to the question on “settlement philosophy”.  More than half of the respondents reported a preference for using audit findings as leverage in a “forward-looking sales approach”, rather than wanting to seek an immediate financial settlement.

Almost half of vendors select audit targets based on profiling

The biggest single selection reason for a compliance review was nomination by the sales account team (53%), with previous account history in close second place (50%).

Interestingly, however, 47% reported selecting customers for review based on “Data analytics suggesting higher risk of non-compliance”, with 7% stating that random selection is used.  It seems that audits are still a strong likelihood regardless of an organisation’s actual compliance management.

Auditors prefer their own proprietary tools to customers’ SAM tools

There seems to be a distinct lack of regard for Software Asset Management tools. 42% of respondents seek to use their own discovery scripts in the audit process. Only 26% of the vendors stated that they use customers’ SAM tools, and remarkably this is down from 29% in 2007, when one might expect few SAM tools would have been found on customer sites anyway.

This echoes the experience of a number of customers with whom I have previously spoken, and it can be a real source of annoyance. How, some argue, is it fair that license models are so complex that it takes a secretive proprietary script, only available to the auditor, to perform a definitive deployment count?

Other observations

  • Software tagging has not been widely adopted: Less than half of respondents do it, or have plans to do so.
  • SaaS reduces the role of the software auditor. Only 15% reported any compliance issues, and more than half don’t even look for them.
  • Few companies seek to build protection against overdeployment into their software. From conversations I have had, most seem to want to encourage wide distribution. Some desktop software was deliberately released in a manner that has encouraged wide, almost viral distribution. In at least one case, an acquisition by a larger company has been the trigger for a significant and aggressive audit program, targeting almost every large company on the assumption that the software is likely to be found there.

Conclusions?

It is very clear from the survey results that many large software vendors have established their compliance program as a significant revenue generator, and with a significant shift of these functions into the sales department, we can probably assume that there is a broad intent to maintain or even grow this role.

Whether this is even compatible with a more collaborative model of software compliance management is highly questionable: the business case for the status quo seems very sound, from the vendor’s point of view.  With so many vendors only trusting the discovery scripts used by their auditors, the situation for customers is nearly impossible: how can they verify compliance if the only counting tool is in the hands of the vendor?

The light at the end of the tunnel for many customers may be SaaS:  SaaS software tends to be more self-policing, and consumption models are often simpler. However, it brings its own challenges: zombie accounts, decentralised purchasing, and a new set of inconsistent consumption models. Meanwhile, hosted software does not go away.

The Zombie Apocalypse: an IT Asset Manager’s Survival Guide

Zombie Response Van

IT Asset Management is not a profession commonly associated with the undead peril.  Little do their colleagues know that the beleaguered ITAM specialist faces an ever-increasing horde of mysterious, shambling, moaning zombies.

Here, we detail some of the most common zombie types, and tell you how to spot them…

 

1) The Iron Zombie

Physical zombie server. Trip hazard, vermin house, dust collector...

This increasingly rare zombie species is nevertheless still found in forgotten corners of IT offices, blinking its faded LEDs in sinister fashion, and blowing dust out of its 3.5″ disk drive.

In its laptop variant, this is where your Visio licenses go to die.

Typical Habitats:

  • The footwell under sysadmins’ desks.
  • Corners of network switch rooms.
  • Third drawer down in the filing cabinet (laptop subspecies)

Hazards:

  • Ancient support contracts.
  • Last resting place for expensive developer tool licenses.
  • Heat output overwhelming air conditioning.
  • Incoming malware easily able to overcome an unpatched 8-year-old operating system.
  • Support or lease payments for an expensive paperweight.
  • Broken toes.
  • Mice.

Ways to find them:

  • Trip over them.
  • Follow the sound of dust-clogged fan bearings.
  • Invite a software license auditor into the building.
  • Physical audit of technical office locations.

2) Virtual Zombies

Zombie virtual machine. You can't photograph these, so here's a diagram.

This modern zombie species is increasingly prevalent, both on-site and off.  As well as simply being untidy, they can have all manner of impacts on the business: one forgotten major-vendor database instance, for example, can suddenly make every processor core on the entire physical backend entirely licensable (including backdated support. At full list price. Scared yet?).

Gartner analyst Philip Dawson, at the Gartner Datacenter Summit in London, in November 2013, stated that 40% of VMs are over 3 years old, with 20% at least 5 years old.

Typical Habitats:

  • The company virtual farm.
  • Amazon Web Services.

Hazards:

  • Invisibility (or frustratingly visible opacity).
  • Tendency to be service critical without anyone realising. If you turn it off, who is going to scream?
  • You know all that careful capacity optimization you did on the server farm?
  • You can’t patch what you can’t see.

Ways to find them:

  • Invite a software license auditor onto the company network with their own discovery scripts.  This may be expensive.
  • Trawl credit card records for Amazon spend.
  • Agentless discovery, preferably with good quality application and dependency mapping.

3) Bring-Your-Own-Zombie

Bring your own zombie will eat your MDM licenses.

A recently discovered zombie species, the Bring Your Own Zombie is typically created when a user acquires a shiny new device, and either forgets or declines to deregister the old one.  It’s early days for BYOD, of course, so stats are hard to come by, but Amtel estimate a 10% rate of zombification for mobile devices. Okay, they’re an MDM vendor, but even at half that rate, a company with 10,000 BYOD users refreshing hardware on a two-year cycle will build up a zombie army of a thousand devices over the next four years. That’s a lot of risky data, and a five- or six-figure excess MDM spend.
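That thousand-device figure is easy to reproduce. A minimal sketch, assuming the halved 5% zombification rate and the refresh cycle described above:

```python
# Hypothetical BYOD zombie build-up, using the post's illustrative figures:
# 10,000 devices, refreshed every 2 years, 5% never deregistered.
users = 10_000
refresh_cycle_years = 2
zombie_rate = 0.05   # half of Amtel's 10% estimate, to be conservative
horizon_years = 4

refreshes = users * (horizon_years // refresh_cycle_years)  # 20,000 device swaps
zombies = int(refreshes * zombie_rate)
print(f"~{zombies} zombie devices after {horizon_years} years")  # ~1000
```

Each of those thousand devices is potentially still consuming a per-device MDM subscription, which is where the five- or six-figure excess spend comes from.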

With many Mobile Device Management applications being paid for on a per-device subscription basis, the gradual buildup of BYODZ’s can steadily increase your bills, to no actual benefit.  And what of the device itself?  With no clean deregistration, and cleansing of corporate data, your data can become very viral, very quickly.

Typical Habitats:

  • Odd drawers in employees’ houses.
  • Ebay.

Hazards:

  • Will eat your MDM licenses.
  • Software Auditor: “So you’re licensing this software by device? Excellent, can I just take a look at your list of registered tablets and smartphones?”.
  • Never underestimate the corporate-data bandwidth of a padded envelope.

Ways to find them:

  • Amnesty.
  • Ask Joe in Accounting if he’s really still using a Nokia N85.

4) Zombie.bat

Zombie script file

This broad category of zombie includes all scripts, undocumented file imports, complex spreadsheets, mysterious VBA code, and the like, that get created in a productive afternoon by a sysadmin, intern or helpful hobbyist, and which embed themselves into nondescript but rather important tasks like starting up the directory server, or producing billable timesheet reports.

Gartner, at their 2013 Datacenter Summit, expressed a concern in one keynote that undocumented code is on the rise even as IT departments look increasingly to industrialise infrastructure.

Typical Habitats:

  • The finance department. In fact, any department.
  • Microsoft Access.
  • Arcane startup scripts on important servers.

Hazards:

  • Easy to create, difficult to support.
  • Undocumented, unattributed, unseen.

Ways to find them:

  • Have a major outage, trace it back to a six year old Perl script.
  • Wait for a call to the Helpdesk about the important and complicated Excel sales spreadsheet that was written by an intern several years ago, and which has broken.
  • Work with sysadmins to catalog critical code, and preferably build it into a solid CMDB with critical service dependencies.

The serious points

Zombie assets are a genuine and growing issue. At best, the problem means that the return on investment in IT infrastructure is not what it should be. With IT budgets squeezed and the increasing demand on CIOs to run their functions as an effective business unit, this is an unnecessary impact on the bottom line, arising directly from IT Assets.  IT Asset Managers should never ignore that.

Additionally, there are plenty of circumstances where a lack of control over assets at the end of their lifecycle can lead to unforeseen and even dramatic negative consequences:

  • Zombie hardware may still be under support contract.  Leased hardware, if not returned, can incur significant penalties and additional costs.
  • Uncontrolled end-of-life can mean uncontrolled disposal, with the associated risks of data loss, environmental damage and penalty, and negative publicity events arising from either.
  • The relative ease of deploying VMs in the datacenter inevitably risks sprawl.  Datacenters end up “fragmented” in the same way that a PC’s hard drive can, with pockets of unused capacity walled off around badly optimised server images.  “Lost” VMs in particular are a big threat: even if you can’t find them, a hacker or a software auditor might be able to.

What can be done?

At the 2013 Gartner IT Financial, Procurement and Asset Management summit, research VP Patricia Adams recommended an “Action Plan for IT Asset Managers”:

  • From “next Monday”, Adams advised, IT Asset Managers should ensure their team is part of the process for staging a VM, focusing on collection of data prior to deployment (as this is easier than doing it reactively).
  • In the “next 90 days”, define an end of life process for virtual applications, and ensure that data on assets and software is accurate.

A recent CIO Asia guest article recommends adopting the ecological principle of “Reduce, Re-use, Recycle” in managing VMs.  Reduction, in this case, by controlling the VM request process and ensuring that each request receives appropriate review and authorisation. Re-use, through control of unused VMs, e.g. by archiving permanently or temporarily, to allow their underpinning architecture to be repurposed. Recycling, by identifying and releasing stranded capacity, where other bottlenecks in the system mean that resources sit unused.

Emerging challenges like BYOD sprawl need new initiatives to reduce risk. Last week I attended a seminar held by members of the software compliance industry (in other words, auditors), and BYOD was a headline presentation topic. Compliance teams are establishing ways to audit these devices, so software consumers need to develop processes to keep them in check.

If Asset Management is accountable for the optimised use of IT assets, then the IT Asset Manager needs to consider their own accountability, even where these functions are directly controlled by other teams.  Get involved, work cross functionally, and ensure that the risks are communicated clearly and vigorously.

Photo credits:
Zombie Response Van: Author’s own photo. The van belongs to Zed Events who hold “Zombie Apocalypse” events in a disused shopping centre in my home town of Reading, UK. I’ve not been, but it looks awesome, and I imagine it’s actually very good practice for the IT Asset Manager faced with a particularly gnarly, uncontrolled Amazon account.
Iron Zombie: From Flickr, used/modified under Creative Commons license, thanks to Vinny Malek.
Virtual Zombie: Author’s own diagram.
Bring-Your-Own-Zombie: From Flickr, used/modified under Creative Commons license, thanks to magic_quote
Zombie.bat: From Flickr, used/modified under Creative Commons license, thanks to *n3wjack’s world in pixels.

itSMF UK and the mysterious case of the missing Asset Managers

logo of the ITSM13 conference

Something is bothering me.

When I first looked at the agenda for the 2013 itSMF UK conference in November, what stood out for me was a glaring omission: where is the IT Asset Management content?

First, let me state: it’s a really good agenda, full of really interesting speakers, and I will certainly aim to be there. I’ve been privileged to work in the UK ITSM sector for the thick end of two decades, and many of the names on the agenda are people I feel lucky to have worked and interacted with.

If you can, you should definitely go.

However, the lack of any ITAM focus, across more than 40 presentation sessions, is strange. If we want to understand our business services, we have to have a grasp on the assets underpinning them. The nearest this agenda appears to get to that is an interesting looking session on Supplier Management – important, but only part of the picture, and again, something that doesn’t really work without a good knowledge of what we are actually buying.

It took ITIL a while to come to the realisation that an asset is relevant in more ways than being just a depreciating item on a balance sheet, but version 3 finally got there, and then some:

  • “Service Asset” (ITIL v3): Any Capability or Resource of a Service Provider.
  • Resource (ITIL v3): [Service Strategy] A generic term that includes IT Infrastructure, people, money or anything else that might help to deliver an IT Service. Resources are considered to be Assets of an Organization.
  • Capability (ITIL v3): [Service Strategy] The ability of an Organization, person, Process, Application, Configuration Item or IT Service to carry out an Activity. Capabilities are intangible Assets of an Organization.

So… we consider our service-underpinning capabilities and resources to be our assets, but we don’t discuss managing those assets at the premier conference about managing the services? More importantly, we offer nothing to its increasingly important practitioners?

As long as ITAM is only discussed at ITAM conferences, and ITSM keeps up the habit of excluding it (this isn’t universal, mind: this presentation by Scott Shaw at Fusion 13 seems to hit the perfect message), then we risk looking disjointed and ineffective to CIOs who depend on the complete picture. To me, that’s pretty worrying.

(Footnote: I did submit a speaker proposal, but this isn’t about my proposal specifically – I’m sure lots of proposals couldn’t make the list)

Gartner’s London summit message: Make ITAM important!

Gartner’s IT Financial, Procurement and Asset Management Summit rolled into London last week (11th and 12th September 2013), and promptly kicked off on an ominous note: Stewart Buchanan’s opening keynote warned that certain roles in IT, including that of the IT Asset Manager, risk becoming obsolete.

As the two-day event progressed, however, it became increasingly clear that Gartner’s analysts don’t see ITAM as a complete anachronism. It is important, however, that it evolves with the technology and practices around it. Asset Management needs to become a key strategic tool for the business. For those of us who have been blogging on this theme for some time, and who have witnessed the best ITAM professionals in the industry delivering huge results from this approach, it is great to hear Gartner emphasising it so strongly.

Research Director Victoria Barber stressed the power of a strong “symbiotic relationship” between the Asset Management function, and IT’s financial controllers. “Finance needs to understand how it can leverage the data from Asset; Asset Management needs to understand how to support it”.

Barber’s fellow Research Director Patricia Adams described the evolving role of the IT Asset Management team in an increasingly virtualised environment. By Monday morning, she advised, the ITAM team should ensure that it is part of the process for spinning up a virtual machine.

Moving forward, Adams continued, the team needs to be aware of emerging technologies and to prepare for their potential adoption. This requires good awareness of what is going on in the business: “You want to make sure the asset team has the skills to work with the config team, to work with the virtualisation team, to understand what those teams are doing”.

As Buchanan concluded in a later session, companies should “use ITAM to continually improve and optimise both IT operations and the business use of IT”.

To this audience, at least, Gartner’s message is an encouraging one.

Let’s work together to fix ITAM’s image problem

Intel Datacenter


This is a long article, but I hope it is an important one. I think IT Asset Management has an image problem, and it’s one that we need to address.

I want to start with a quick story:

Representing BMC software, I recently had the privilege of speaking at the Annual Conference and Exhibition of the International Association of IT Asset Managers (IAITAM).  I was curious about how well attended my presentation would be. It was up against seven other simultaneous tracks, and the presentation wasn’t about the latest new-fangled technology or hot industry trend. In fact, I was concerned that it might seem a bit dry, even though I felt pretty passionate that it was a message worth presenting.

It turned out that my worries were completely unfounded.  “Benchmarking ITAM; Understand and grow your organization’s Asset Management maturity”  filled the room on day 1, and earned a repeat show on day 2. That was nice after such a long flight. It proved to be as important to the audience as I hoped it would be.

I was even more confident that I’d picked the right topic when, having finished my introduction and my obligatory joke about the weather (I’m British, it was hot, it’s the rules), I asked the first few questions of my audience:

“How many of you are involved in hands-on IT Asset Management?”

Of the fifty or so people present, about 48 hands went up.

“And how many of you feel that if your companies invested more in your function, you could really repay that strongly?”

There were still at least 46 hands in the air.

IT Asset Management is in an interesting position right now.  Gartner’s 2012 Hype Cycle for IT Operations Management placed it at the bottom of the “Trough of Disillusionment”… that deep low point where the hype and expectations have faded.  Looking on the bright side, the only way is up from here.

It’s all a bit strange, because there is a massive role for ITAM right now. Software auditors keep on auditing. Departments keep buying on their own credit cards. Even as we move to a more virtualized, cloud-driven world, there are still flashing boxes to maintain and patch, as well as a host of virtual IT assets which still cost us money to support and license. We need to address BYOD and mobile device management. Cloud doesn’t remove the role of ITAM, it intensifies it.

There are probably many reasons for this image problem, but I want to present an idea that I hope will help us to fix it.

One of the massive drivers of the ITSM market as a whole has been the development of a recognized framework of processes, objectives, and – to an extent – standards. The IT Infrastructure Library, or ITIL, has been a huge success story for the UK’s Office of Government Commerce since its creation in the 1980s.

ITIL gave ITSM a means to define and shape itself, perfectly judging the tipping point between not-enough-substance and too-much-detail.

Many people, however, contend that ITIL never quite got Asset Management. As a discipline, ITAM evolved in different markets at different times, often driven by local policies such as taxation on IT equipment. Some vendors such as France’s Staff&Line go right back to the 1980s. ITIL’s focus on the Configuration Management Database (CMDB) worked for some organizations, but was irrelevant to many people focused solely on the business of managing IT assets in their own right.  ITIL v3’s Service Asset Management is arguably something of an end-around.

However, ITIL came with a whole set of tools, practices and service providers that helped organizations to understand where they currently sat on an ITSM maturity curve, and where they could be. ITIL has an ecosystem – and it’s a really big one.

Time for another story…

In my first role as an IT professional, back in 1997, I worked for a company whose IT department boldly drove a multi-year transformation around ITIL. Each year auditors spoke with ITIL process owners, prodded and poked around the toolsets (this was my part of the story), and rated our progress in each of the ITIL disciplines.

Each year we could demonstrate our progress in Change Management, or Capacity Management, or Configuration Management, or any of the other ITIL disciplines. It told us where we were succeeding and where we needed to pick up. And because this was based on a commonly understood framework, we could also benchmark against other companies and organizations. As the transformation progressed, we started setting the highest benchmark scores in the business. That felt good, and it showed our company what they were getting for their investment.

But at the same time, there was a successful little team, also working with our custom Remedy apps, who were automating the process of asset request, approval and fulfillment.  Sadly, they didn’t really figure in the ITIL assessments, because, well, there was no “Asset Management” discipline defined in ITIL version 2. We all knew how good they were, but the wider audience didn’t hear about them.

Even today, we don’t have a benchmarking structure for IT Asset Management that is widely shared across the industry. There are examples of proprietary frameworks like Microsoft’s SAM Optimization Model, but it seems to me that there is no specific open “ITIL for ITAM”.

This is a real shame, because Benchmarking could be a really strong tool for the IT Asset Manager to win backing from their business. There are many reasons why:

  • Benchmarking helps us to understand where we are today.
  • More importantly, it helps us to show where we could get, how difficult and expensive that might be, and what we’re missing by not being there.

Those two points alone start to show us what a good tool it is for building a case for investment. Furthermore:

  • Asset Management is a very broad topic. If we benchmark each aspect of it in our organizations, we can get a better idea of where our key strengths and weaknesses are, and where we should focus our efforts.
  • Importantly, we can also show what we have achieved. If Asset Management has an image problem, then we need a way to show off our successes.

And then, provided we work to a common framework…

  • Benchmarking gives us an effective way of comparing with our peers, and with the best (and worst!) in the industry.

At the IAITAM conference, and every time I’ve raised this topic with customers since, there has been a really positive response. There seems to be a real hunger for a straightforward and consistent way of ranking ITAM maturity, and using it to reinforce our business cases.

For our presentation at IAITAM, we wanted to have a starting point, so we built one, using some simple benchmarking principles.

First, we came up with a simple scoring system. “1 to 4” or “1 to 5”, it doesn’t really matter, but we went for the former.  Next, we identified what an organization might look like, at a broad ITAM level, at each score. That’s pretty straightforward too:

Asset Maturity – General Scoring Guidelines

  • Level 1: Little or no effective management, process or automation.
  • Level 2: Evidence of established processes, automation and management. Partial coverage and value realization.
  • Level 3: Fully established and comprehensive processes. Centralized data repository. Significant automation.
  • Level 4: Best-in-class processes, tools and results. Integral part of wider business decision support and strategy. Extensive automation.

In other words, Level 4 would be off-the-chart, industry-leading good. Level 1 would be head-in-the-sand, barely started. Next, we need to tackle that breadth. Asset Management, as we’ve said, is a broad subject: software and hardware, datacenter and desktop, and so on.

We did this by specifying two broad areas of measurement scope:

  • Structural:  How we do things.  Tools, processes, people, coverage.
  • Value: What we achieve with those things.  Financial effectiveness, compliance, environmental.

Each of these areas can now be divided into sub-categories. For example, on “Coverage” we can now describe in a bit more detail how we’d expect an organization at each level to look:

“Asset Coverage” Scoring Levels

  • Level 1: None, or negligible amount, of the organization’s IT Assets under management
  • Level 2: Key parts of the IT Asset estate under management, but some significant gaps remaining
  • Level 3: Majority of the IT Asset estate is under management, with few gaps
  • Level 4: Entire IT Asset estate under full management by the ITAM function.

This process repeats for each measurement area. Once each is defined, the method of application is up to the user (for example, separate assessments might be appropriate for datacenter assets and laptops/desktops, perhaps with different ranking/weighting for each).
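To make the roll-up concrete, here is a minimal sketch in Python of how per-category scores might be combined into a weighted maturity score. The category names, scores and weights are invented for illustration; they are not taken from the actual worksheet, which defines its own measurement areas:

```python
# Hypothetical sketch of a weighted maturity roll-up.
# Categories, scores and weights below are invented examples.

def maturity_score(assessments):
    """Weighted average of per-category maturity scores (each 1-4).

    assessments maps category name -> (score, weight).
    """
    total_weight = sum(weight for _, weight in assessments.values())
    weighted_sum = sum(score * weight for score, weight in assessments.values())
    return weighted_sum / total_weight

# Separate assessments for datacenter and client assets, with
# different weightings for each, as suggested above.
datacenter = {
    "Coverage":   (3, 2.0),   # (score 1-4, weight)
    "Processes":  (3, 1.5),
    "Compliance": (2, 1.0),
}
desktop = {
    "Coverage":   (2, 1.0),
    "Processes":  (2, 1.0),
    "Compliance": (1, 1.5),
}

print(round(maturity_score(datacenter), 2))  # 2.78
print(round(maturity_score(desktop), 2))     # 1.57
```

The choice of weights is where the “different ranking/weighting” idea comes in: an organization that cares most about datacenter coverage can weight it accordingly, while still producing a single comparable figure per estate.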

You can see our initial, work-in-progress take on this at our Communities website at BMC, without needing to log in: https://communities.bmc.com/communities/people/JonHall/blog/2012/10/17/asset-management-benchmarking-worksheet.  We feel that this resource is strongest as a community resource. If it helps IT Asset managers to build a strong case for investment, then it helps the ITAM sector.

Does this look like something that would be useful to you as an IT Asset Manager, and if so, would you like to be part of the community that builds it out?

Photo from the IntelFreePress Flickr feed and used here under Creative Commons Licensing without implying any endorsement by its creator.