I recently spoke with BDNA about the upcoming release of BDNA Insight.
BDNA offer some interesting technology around their core IT discovery capability. Their ‘BDNA Catalog’ adds business intelligence to the IT assets discovered, such as detailed product information and power consumption, while ‘BDNA Maps’ manages application dependencies, virtualization relationships and storage dependencies.
They seem to be working towards a ‘CMDB meets IT Asset Management’ offering, which I assume will dovetail nicely with the service management-focused PS Soft product line they acquired late last year.
Unlike most IT Asset Management vendors, BDNA do not use an agent. At some point during any IT audit project the issue of agents will arise: if we want to collect an accurate inventory of the assets on our network, we need to decide how the required information will be collected from each machine.
Using Existing Systems Management Tools
Some organisations might choose to harness their existing systems management tools, such as Microsoft SMS, to collect the data. This looks like a sensible shortcut, since the infrastructure is already in place, but it commonly fails because the tool was not deployed with auditing in mind.
In my experience, audit data from such systems is either inaccurate, takes far too long to generate or is simply not fit for purpose. This is often as much a political debate within an organisation as a technological one, as people fight the cause of their respective tools.
When it comes to collecting data using a dedicated audit and inventory tool, I believe it is fair to say that the majority of inventory tool vendors have chosen agent technology, whereby a central system monitors and collects audit results from remote agents deployed on networked desktops, servers, laptops or any other networked device you wish to audit.
There are arguments for and against using agents: some IT Asset Management vendors offer an agent, some offer agentless technology and some offer their clients both. It could be argued that agentless technology is a good tactical tool and that an agent is the better long-term solution, but equally it could be argued that the reverse is true. Ultimately it boils down to what will work for your organisation, what sort of information you wish to collect and the unique challenges you are facing.
Benefits of Using an Agent
- Depth of Inventory – It is said that an agent can offer a ‘deep dive’ in terms of depth of data. For example, it might be difficult to record a daily account of what software is being used on a machine without an agent in place.
- Remote Machines – Agent-driven vendors argue that it makes more sense to deploy an agent to a machine that only periodically connects to the network if you wish to maintain an accurate inventory.
- Network Bandwidth – This will depend on the strength of your network connections and remote locations, but it is argued that it is more network-friendly to have an agent transmitting its audit results over the network.
Benefits of Agentless Technology
- Fewer Political Hurdles – The main benefit of going the agentless route is that there are fewer political hurdles to leap in order to get the system deployed.
- Less Change Management / Build Process Concerns – No code is deployed to machines and no changes to builds occur, so less overhead is required for the deployment.
- Non-Intrusive – No agent resides on the local machine, and deployment commonly does not require administrative access to machines (a common hiccup in agent-based deployments).
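Whichever route you choose, both approaches ultimately produce the same artefact: a machine-level inventory record sent to (or pulled into) a central system. As a minimal sketch, here is the kind of snapshot an agent might gather locally before transmitting it to the audit server; an agentless tool would assemble equivalent data remotely (for example over WMI or SSH). The function name and field names are illustrative, not taken from any vendor's product.

```python
import json
import platform
import socket
from datetime import datetime, timezone

def collect_inventory() -> dict:
    """Gather a basic OS/hardware snapshot of the local machine.

    In an agent-based tool this would run on every device and report
    to a central server; an agentless tool would collect the same
    facts remotely. Field names here are purely illustrative.
    """
    return {
        "hostname": socket.gethostname(),
        "os": platform.system(),
        "os_version": platform.version(),
        "architecture": platform.machine(),
        # Timestamping each audit matters for the 'Remote Machines'
        # point above: it shows how stale a record has become.
        "collected_at": datetime.now(timezone.utc).isoformat(),
    }

if __name__ == "__main__":
    # An agent would transmit this JSON payload over the network
    # to the central audit server rather than printing it.
    print(json.dumps(collect_inventory(), indent=2))
```

The JSON serialisation step also illustrates the bandwidth argument: an agent can compute and diff this snapshot locally and send only a few hundred bytes, rather than having a central scanner interrogate the machine in full on every pass.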
Have I missed any benefits for either method? What are your experiences of deploying agents or using agentless technology?
About Martin Thompson
Martin is also the founder of ITAM Forum, a not-for-profit trade body for the ITAM industry created to raise the profile of the profession and bring an organisational certification to market. On a voluntary basis Martin is a contributor to ISO WG21 which develops the ITAM International Standard ISO/IEC 19770.
He is also the author of the book "Practical ITAM - The essential guide for IT Asset Managers", a book that describes how to get started and make a difference in the field of IT Asset Management. In addition, Martin developed the PITAM training course and certification.
Prior to founding the ITAM Review in 2008 Martin worked for Centennial Software (Ivanti), Silicon Graphics, CA Technologies and Computer 2000 (Tech Data).
When not working, Martin likes to ski, hike, motorbike and spend time with his young family.
Connect with Martin on LinkedIn.