Softline Group Northern Europe // Knowledge Base

Data Quality vs Data Quantity: part 2 – End point solution vs. Platform

In part one of this blog, we discussed how context is king when it comes to interpreting data. And one of the most striking contextual changes we see at our clients lately is the difference in data quality and management tooling between endpoint solutions and platform solutions. In the past, an endpoint solution for ITAM was the standard, but increasingly ITAM is operated within a platform, as a module of a platform, or linked to a platform. And that brings its own challenges.

 

Endpoint solution

An ITAM implementation using an endpoint solution is often a greenfield situation. An ITAM project starts, a new server is installed, the tooling is installed and then one by one the agents and data sources are connected and the solution is built step by manageable step. The sources that are going to be connected are usually known in advance. The user group of the ITAM solution is relatively small and often already part of the project team. The stakeholders and their wishes are reasonably well known in advance.

All phases of data processing take place within the environment of the endpoint solution. Discovery, consolidation, normalisation and consumption happen in an environment over which the organisation itself has control. Even when data sources are added for consumption, it is only a matter of bringing information in. It is fairly controlled.
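As a sketch, that controlled endpoint flow could look like the snippet below. All field names and the `normalise_vendor` helper are invented for illustration; no specific ITAM product works exactly this way.

```python
# Illustrative sketch of the four phases in an endpoint solution:
# discovery -> consolidation -> normalisation -> consumption.

# Discovery: raw records as they arrive from agents and data sources.
discovered = [
    {"hostname": "SRV01", "vendor": "microsoft corp.", "product": "SQL Server"},
    {"hostname": "srv01", "vendor": "Microsoft", "product": "SQL Server"},
]

def normalise_vendor(raw: str) -> str:
    """Normalisation: map vendor spellings to one canonical name."""
    canonical = {"microsoft corp.": "Microsoft", "microsoft": "Microsoft"}
    return canonical.get(raw.strip().lower(), raw)

# Consolidation: collapse duplicate records onto one asset key.
assets: dict[str, dict] = {}
for rec in discovered:
    key = rec["hostname"].lower()
    rec = {**rec, "vendor": normalise_vendor(rec["vendor"])}
    assets.setdefault(key, rec)

# Consumption: the ITAM team reads one clean record per asset.
for key, rec in assets.items():
    print(key, rec["vendor"], rec["product"])
```

Because every phase runs inside one environment, the de-duplication and normalisation rules live in a single place, which is exactly what makes the endpoint scenario relatively easy to control.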

Platform solution

When organisations start working towards a platform implementation, the target platform usually already contains data, and we are immediately faced with all the classic challenges we discussed in part 1. A platform also already has users, processes and stakeholders. There are people already working with asset data, running an incident process, reporting, or even operating a change process. This makes the dynamics different not only during the implementation phase - after all, much more analysis is needed when the data is combined - but also during the RUN phase, after the project is handed over to business as usual.

Another difference is that the data is often consumed by different groups: not only by the ITAM team, but also by existing stakeholders using the tooling. Even when only one additional component of a platform is activated, some parts may still need to be integrated, and ITAM may not be the only team performing discovery, consolidation, normalisation and enrichment.

When two different tools are combined, for example when an extra connector or an ITAM tool is added to the platform, an extra challenge arises: where should the connection be made? Should that be during an early phase of data collection, within discovery, or at a later point in time, during consolidation or normalisation, or perhaps at the very end? And what prevails? What are the criteria for normalisation, de-duplication and enrichment? The challenges are not limited to the implementation phase either. During the RUN phase, we noticed that the ITAM team is rarely the most important stakeholder in the management of a platform: there are many other teams and individuals to take into account. From a technical point of view, a platform has its own roll-out cadence, and ITAM has no choice but to follow it. All of these factors make the dynamics very different compared to an endpoint implementation. So naturally, that also requires a very different approach to stakeholder management.
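To make the "what prevails?" question concrete, here is a hypothetical sketch of merging one asset record from a platform with the same asset as seen by an ITAM tool. The field names, records and precedence rules are our own assumptions, not tied to any real platform; the point is that per-field precedence is an explicit criterion the organisation has to agree on.

```python
# Hypothetical merge of one asset record from two sources.
# Which source "prevails" is decided per field.

platform_record = {"serial": "ABC123", "owner": "Service Desk", "os": "Windows Server 2019"}
itam_record     = {"serial": "ABC123", "owner": None, "os": "Windows Server 2022"}

# Agreed precedence per field: which source is authoritative.
PRECEDENCE = {"owner": "platform", "os": "itam"}

def merge(platform: dict, itam: dict) -> dict:
    merged = {}
    for field in platform.keys() | itam.keys():
        if PRECEDENCE.get(field) == "itam":
            # ITAM is authoritative; fall back to the platform if ITAM has no value.
            merged[field] = itam.get(field) or platform.get(field)
        else:
            # Default: the platform is authoritative, ITAM fills the gaps.
            merged[field] = platform.get(field) or itam.get(field)
    return merged

result = merge(platform_record, itam_record)
# ITAM's fresher discovery wins for "os"; the platform keeps "owner".
```

In practice, these precedence rules are exactly the kind of criteria that have to be negotiated with the platform's existing stakeholders before the connection is built.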
 

Stakeholder management opportunities

Yet there are not only challenges, but also opportunities! The moment tooling is combined and used to enrich data, it usually brings to light what is really going on within an organisation, especially in the area of data ownership. An impact analysis can help answer the important question of who owns what data. The same applies to the coverage of the tooling: when two or more(!) different sources are combined, the chance that they match one-to-one is slim. In addition, there may be different views on the 'single point of truth', sometimes called the 'golden source', within the organisation, a question that frequently has a large political or even emotional component. It also brings to light choices, implicit or explicit, that were made within the ITAM scoping process.
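The coverage point can be illustrated with a toy example (the hostnames are invented): simply comparing which assets each source sees makes the mismatch, and each source's blind spots, immediately visible.

```python
# Toy illustration of coverage: two sources seldom see exactly
# the same set of assets. Hostnames are invented for the example.
source_a = {"srv01", "srv02", "lap07", "lap09"}
source_b = {"srv01", "srv02", "db03"}

both   = source_a & source_b   # assets seen by both sources
only_a = source_a - source_b   # blind spots of source B
only_b = source_b - source_a   # blind spots of source A

print("in both:  ", sorted(both))
print("only in A:", sorted(only_a))
print("only in B:", sorted(only_b))
```

Each "only in" bucket is a conversation starter: is it a coverage gap, a scoping choice, or an asset that should not exist at all?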

The key is not to shy away from these challenges, but to embrace them as a starting point for discussions. They can help you on your journey to maturity and be a catalyst for further development. The context that was hidden now becomes clear.

Our context-aware consultants are happy to assist you with any data challenges you might have, so don’t hesitate to reach out. (The high-quality contact data is below)

Or read part 1 of this series: “Big Data” challenges