The GPS in your car's satnav can tell you your location by listening to signals from satellites orbiting the Earth.
Your GPS won't work with a single satellite; it typically needs four different satellite signals to provide a confident fix on your location.
Each satellite in the sky transmits a message, which includes its current position and the time of transmission.
From this data the GPS's onboard computer can crunch the navigation sums (how far away each satellite is and when the message was sent) and work out your location. Only when it has four data sources can it validate your position.
How many signals are you listening to for your asset data? How do you verify that your data is in fact accurate?
Let’s say you have 20,000 assets under management, you use SCCM for inventory, and a license management system of some kind harnesses that SCCM data. Let’s say that through hard work and perseverance you’ve managed to reach a compliance position of 100% (give or take 5%) for your most important software publishers. How accurate is that 100%?
What if your SCCM data is only 80% accurate? After all, an SCCM specialist will tell you that 90% coverage while deploying a new package is a job well done (agents fail, deployments fail, systems go missing or hide under desks). Based on our figure of 20,000 machines, we potentially have 4,000 devices unaccounted for, with inaccurate or out-of-date records. Your 100% compliance rate is suddenly looking less rosy. Not to mention all that unaccounted-for hardware – is it encrypted?
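The arithmetic above can be sketched in a few lines. This is a back-of-envelope illustration using the article's example figures (20,000 assets, 80% inventory accuracy, 100% reported compliance), not a real SAM calculation:

```python
# Back-of-envelope sketch of the coverage arithmetic above.
# All figures are the article's illustrative numbers, not real data.

def unaccounted_devices(total_assets: int, inventory_accuracy: float) -> int:
    """Devices with missing, inaccurate, or out-of-date inventory records."""
    return round(total_assets * (1 - inventory_accuracy))

def confidence_adjusted_compliance(reported: float, inventory_accuracy: float) -> float:
    """A reported compliance figure only covers the devices you can actually
    see, so scale it by the accuracy of the underlying inventory data."""
    return reported * inventory_accuracy

total = 20_000
accuracy = 0.80   # SCCM data assumed 80% accurate
reported = 1.00   # "100%" compliance from the SAM tool

print(unaccounted_devices(total, accuracy))   # 4000 devices in the dark
print(f"{confidence_adjusted_compliance(reported, accuracy):.0%}")   # 80%
```

In other words, a "100%" figure built on 80%-accurate data is really a statement about 16,000 machines, with 4,000 unknowns attached.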
I say this not to spread FUD or undermine your work. I say this because this is how auditors work, and this is the level of verification you need in your ongoing SAM practice. Ultimately, you need to deliver your compliance figures or asset metrics with an accompanying confidence measure.
Multiple Verification Points
What other satellites can you collect data from to verify your results? “We’re compliant based on what we can see right now” might not be good enough.
ITAM tools and CMDBs already store a lot of this data and should be doing more to help you. I think it would really add to the business value of an ITAM tool if it could calculate the health of your data sources by cross-checking them against one another:
- 30% of your Altiris data is over 6 months old
- 40% of your accounts in AD are dormant
- SCCM is only working on 80% of your estate
- There are 20% more inventory records than exist within AD
- And so on
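Health checks like those above boil down to comparing snapshots of your data sources. A minimal sketch, using hypothetical hostnames and dates in place of real SCCM and Active Directory exports, might look like this:

```python
from datetime import date

# Hypothetical snapshots of two data sources; in practice these would be
# exports from your ITAM tool, SCCM, and Active Directory.
sccm_records = {
    "PC-001": date(2024, 1, 10),   # hostname -> last inventory date
    "PC-002": date(2023, 5, 2),
    "PC-003": date(2024, 2, 1),
    "PC-004": date(2023, 4, 20),
}
ad_computers = {"PC-001", "PC-002", "PC-005"}

def stale_fraction(records, today, max_age_days=180):
    """Fraction of inventory records older than max_age_days (~6 months)."""
    stale = sum(1 for seen in records.values()
                if (today - seen).days > max_age_days)
    return stale / len(records)

def not_in_ad_fraction(records, ad):
    """Fraction of inventory records with no matching AD computer account."""
    missing = sum(1 for host in records if host not in ad)
    return missing / len(records)

today = date(2024, 3, 1)
print(f"{stale_fraction(sccm_records, today):.0%} of inventory records are stale")
print(f"{not_in_ad_fraction(sccm_records, ad_computers):.0%} of records have no AD account")
```

The point is not the specific metrics but the pattern: each extra source you can cross-check against is another satellite, and the disagreements between them are your confidence measure.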
Holes in your asset data are a fact of life. I’m not saying that, as an Asset Manager, you need to solve them all. But knowing they exist will increase your awareness of risk, strengthen the value of your information and bolster your arguments for change.
About Martin Thompson
Martin is also the founder of ITAM Forum, a not-for-profit trade body for the ITAM industry created to raise the profile of the profession and bring an organisational certification to market. On a voluntary basis Martin is a contributor to ISO WG21 which develops the ITAM International Standard ISO/IEC 19770.
He is also the author of the book "Practical ITAM - The essential guide for IT Asset Managers", a book that describes how to get started and make a difference in the field of IT Asset Management. In addition, Martin developed the PITAM training course and certification.
Prior to founding the ITAM Review in 2008 Martin worked for Centennial Software (Ivanti), Silicon Graphics, CA Technologies and Computer 2000 (Tech Data).
When not working, Martin likes to ski, hike, motorbike and spend time with his young family.
Connect with Martin on LinkedIn.