The ISO SAM Standard refers to the concept of ‘Trustworthy Data’.
As the name suggests – it is data that the organization trusts.
Whilst no enterprise level asset database is 100% accurate, we need to have a sufficient level of confidence in our asset data so we can make the right decisions.
We want to trust this data and have the confidence that it won’t come back to bite us later.
A few weak reports with dodgy results and teams across the organization begin to doubt the validity of your data.
In order to keep asset data alive, and therefore trustworthy, we need to think about what makes it go stale in the first place. How does data become bloated with duplicates and riddled with errors and discrepancies?
First of all, let’s look at the inputs and outputs: what is likely to be flowing in and out of your asset database?
New additions to your asset database are likely to be new software installations, software upgrades, new devices connected to the network, rebuilt machines, hardware upgrades and configuration changes.
Outgoing from your asset database is likely to be uninstalled software and devices that are placed in storage, stolen, retired or otherwise removed.
In addition to these network based changes, you then might have your financial, contractual and time based changes:
- FINANCIAL: New purchases are made, new invoices are received
- CONTRACTUAL: Terms of agreements and contractual arrangements change
- VIRTUAL: virtual machines and logins spawn like rabbits, disks get full, space and services get consumed
- TIME: time goes by and maintenance contracts and leases expire
- POLITICAL: Users come and go, change departments, acquire second devices etc.
An inventory and discovery solution will go some way towards taking on this heavy lifting of tracking changes. However, inventory and discovery technology is never a plug-and-play experience. Even the most sophisticated automated technology requires ‘babysitting’ and ongoing upkeep.
Critically, the technology needs to be on the same page as the people and processes. You have to remember that an inventory and discovery solution is network based. It can only tell you what is happening on the network. It can tell you Bob’s machine was last audited last Tuesday; it can’t tell you Bob left the company on Wednesday and took his laptop with him.

For example, one of the most common reasons for a bloated database full of out-of-date data is a disconnect between staff leaving the company and the redistribution or loss of their assets. Another is machines being rebuilt and reconnected to the network without the old record being updated.
Regular software auditing and reconciliations for your major vendors will help keep you up to date with a lot of the software changes. In terms of hardware, you may wish to consider the following:
Sample housekeeping duties and routine checks:
- Have we captured all devices? Do we know them? Can they be identified and allocated to the appropriate department or cost centre?
- Are they responding in a timely manner? Have any devices gone AWOL?
- Are they reporting accurate data? Can we cross-reference our asset data with other sources, such as Active Directory, in order to identify anomalies?
- Which machines are not responding or not communicating properly?
- Which machines are duplicates?
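The duplicate, AWOL and cross-referencing checks above lend themselves to simple automation. The sketch below is a minimal, hypothetical example: it assumes an inventory export with hostnames and last-audit dates, and a set of computer names pulled from Active Directory (the field names, hostnames and 30-day staleness threshold are illustrative assumptions, not prescriptions from any particular tool).

```python
from datetime import datetime, timedelta

# Hypothetical data: in practice these would come from your inventory
# tool's export and an Active Directory computer listing.
inventory = [
    {"hostname": "WS-001", "last_audit": "2024-05-01"},
    {"hostname": "WS-002", "last_audit": "2024-01-15"},
    {"hostname": "WS-002", "last_audit": "2024-04-20"},  # duplicate record
    {"hostname": "WS-003", "last_audit": "2024-02-02"},  # gone quiet
]
ad_computers = {"WS-001", "WS-002", "WS-004"}

STALE_AFTER = timedelta(days=30)   # example threshold for "gone AWOL"
today = datetime(2024, 5, 10)

# Collapse duplicate records, keeping the most recent audit date per host.
latest, duplicates = {}, set()
for record in inventory:
    name = record["hostname"]
    seen_at = datetime.strptime(record["last_audit"], "%Y-%m-%d")
    if name in latest:
        duplicates.add(name)
        latest[name] = max(latest[name], seen_at)
    else:
        latest[name] = seen_at

# Devices that have not reported within the threshold.
stale = {n for n, seen_at in latest.items() if today - seen_at > STALE_AFTER}

# Cross-reference against Active Directory to spot anomalies.
missing_from_ad = set(latest) - ad_computers         # in inventory, not in AD
missing_from_inventory = ad_computers - set(latest)  # in AD, not in inventory

print("Duplicates:", sorted(duplicates))
print("Stale (>30 days):", sorted(stale))
print("In inventory but not AD:", sorted(missing_from_ad))
print("In AD but not inventory:", sorted(missing_from_inventory))
```

Each discrepancy list is then a work queue for a human: a hostname in AD but not in inventory may be a machine that was rebuilt without its old record being retired; a stale device may belong to a leaver whose assets were never recovered.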
Finally, your team needs to maintain the health of the asset database infrastructure itself by checking server logs, backups, alerts and messages to ensure the whole system maintains a healthy heartbeat.
Your view? How do you maintain accuracy in your asset data system? What is a good level of accuracy to aim for?
About Martin Thompson
Martin is also the founder of ITAM Forum, a not-for-profit trade body for the ITAM industry created to raise the profile of the profession and bring an organisational certification to market. On a voluntary basis Martin is a contributor to ISO WG21 which develops the ITAM International Standard ISO/IEC 19770.
He is also the author of the book "Practical ITAM - The essential guide for IT Asset Managers", a book that describes how to get started and make a difference in the field of IT Asset Management. In addition, Martin developed the PITAM training course and certification.
Prior to founding the ITAM Review in 2008 Martin worked for Centennial Software (Ivanti), Silicon Graphics, CA Technologies and Computer 2000 (Tech Data).
When not working, Martin likes to Ski, Hike, Motorbike and spend time with his young family.
Connect with Martin on LinkedIn.