How to manage Analytics drift in your organization

Companies experience “drift” in their analytical applications when those applications lose accuracy and effectiveness. The analytics then underperform in the business use cases for which they were originally designed. There are many reasons why analytics drift from their original purpose and become less effective; most relate to changes in data, algorithms, or business use cases.

When analytics drift occurs, it is damaging to proponents of analytics in organizations. Ineffective analytics make CEOs and other top-line leaders less trustful of analytics—and less likely to rely on or endorse them.

IT and analytics advocates can prevent these situations by proactively looking for instances where analytics are weakening and then taking corrective action. The first signs of poor performance may be reports that are no longer used as often as they once were, or results that are frequently questioned. Once IT finds an analytics application that is performing poorly, the application can be investigated more closely.
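One way to spot these early signs is to watch report usage over time and flag reports whose recent usage has fallen well below their historical average. The sketch below assumes hypothetical weekly view counts per report; the function name, threshold, and data are illustrative, not from the original article.

```python
from statistics import mean

def flag_declining_reports(usage_by_report, threshold=0.5):
    """Flag reports whose recent usage has fallen below a fraction
    of their historical average -- an early sign of analytics drift."""
    flagged = []
    for report, weekly_views in usage_by_report.items():
        history, recent = weekly_views[:-4], weekly_views[-4:]
        if history and mean(recent) < threshold * mean(history):
            flagged.append(report)
    return flagged

# Hypothetical usage data: weekly view counts per report
usage = {
    "shipment_losses": [120, 115, 118, 110, 60, 45, 30, 25],
    "inventory_counts": [80, 85, 78, 82, 81, 79, 84, 80],
}
print(flag_declining_reports(usage))  # ['shipment_losses']
```

A report flagged this way is a candidate for the closer investigation described above, not proof of drift on its own.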

Here are the most logical places IT should look when an analytics application is losing performance:

Data

Are new data sources available that would improve the quality and completeness of the data that the analysis consults?

New data sources regularly come online that can improve the results of analytics queries because their data is more complete than what you had before. The key to improving analytics is making sure that the most up-to-date data sources are incorporated into the data repository your company uses for queries.

Is the data corrupt?

How often do you update the data in your analytics data repository? Is the data properly cleaned and prepared before it is loaded into the main repository, or have users (or IT) modified the data in ways that make it less reliable?
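Basic validation before loading can catch much of this corruption. The following is a minimal sketch, with a hypothetical function name and sample rows: it separates rows that pass a required-fields check from rows that should be rejected or repaired before they reach the repository.

```python
def validate_rows(rows, required_fields):
    """Separate clean rows from corrupt ones before loading them
    into the analytics repository (a minimal sketch)."""
    clean, rejected = [], []
    for row in rows:
        missing = [f for f in required_fields if row.get(f) in (None, "")]
        if missing:
            rejected.append((row, missing))  # keep the reason for rejection
        else:
            clean.append(row)
    return clean, rejected

# Hypothetical shipment records; the second is missing its amount
rows = [
    {"id": 1, "amount": 19.99, "region": "NE"},
    {"id": 2, "amount": None, "region": "SW"},
]
clean, rejected = validate_rows(rows, ["id", "amount", "region"])
print(len(clean), len(rejected))  # 1 1
```

Real pipelines would add type and range checks, but even a check this simple keeps obviously broken records out of the main repository.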

Is there data lag?

If your industry is transportation, do you know about the latest road closures and repairs in the different regions of the country where your fleet of trucks is traveling? And do you regularly ask your data providers how often the data they provide is updated?
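Data lag can be surfaced automatically by comparing each source's last-update timestamp against the maximum lag your use case tolerates. The feed names and the 24-hour limit below are illustrative assumptions, not from the article.

```python
from datetime import datetime, timedelta

def stale_sources(last_updated, max_lag, now=None):
    """Return data sources whose last update exceeds the allowed lag --
    a simple way to surface data lag before it degrades analytics."""
    now = now or datetime.utcnow()
    return [name for name, ts in last_updated.items() if now - ts > max_lag]

# Hypothetical transportation feeds and their last-update times
now = datetime(2024, 1, 15, 12, 0)
feeds = {
    "road_closures": datetime(2024, 1, 15, 11, 30),
    "repair_schedules": datetime(2024, 1, 12, 9, 0),  # three days old
}
print(stale_sources(feeds, timedelta(hours=24), now))  # ['repair_schedules']
```

Running a check like this on a schedule turns "ask your data providers how often they update" into something IT can verify continuously.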

Has the business use case changed?

Yesterday’s analysis could have been based on lost and uncollected shipments, but today the focus may be on incorrect inventory counts. If a business use case has significantly diverged from what the analytics were originally designed for, it might be time to rewrite the analytics or discontinue them.

Algorithms and queries

Are the algorithms and queries that users pose getting the desired results?

It may be time to refine your algorithms so that they more accurately retrieve the information users are looking for. This can be done by iteratively testing different variations of algorithms and queries and then reviewing the results.
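That iterative testing can be made systematic: score each variant against a small labeled sample and keep the best performer. The sketch below is a hypothetical example using recall against known-relevant record IDs; the variant functions and data are invented for illustration.

```python
def best_variant(variants, evaluate):
    """Score each query/algorithm variant with an evaluation
    function and return the best performer (a minimal sketch)."""
    scores = {name: evaluate(fn) for name, fn in variants.items()}
    return max(scores, key=scores.get), scores

# Hypothetical labeled sample: record IDs known to be relevant
relevant = {101, 102, 103, 104}

def variant_a():
    return {101, 102}            # finds two of four relevant records

def variant_b():
    return {101, 102, 103, 105}  # finds three of four

def recall(query):
    return len(query() & relevant) / len(relevant)

winner, scores = best_variant({"a": variant_a, "b": variant_b}, recall)
print(winner, scores)  # b {'a': 0.5, 'b': 0.75}
```

In practice the evaluation function would reflect what users actually need (precision, recall, timeliness), but the loop of score, compare, and refine stays the same.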

Has the business use case changed?

A significant change in a business use case can render most algorithms and queries useless overnight. In this case, it is time to redesign the queries and algorithms to match the goals of the new business use case.

Other ways to mitigate analytics drift

There are many different reasons why analytics can become ineffective. When this happens, companies begin to distrust their analytics, and usage declines. This also puts IT where it doesn’t want to be: trying to champion analytics while key people in the organization have begun to distrust them.

In addition to the data practices and algorithms IT can use to keep analytics relevant, IT can also do the following:

  • Regularly monitor new data sources that may add meaning to existing analytics;
  • Perform robust cleansing and preparation of data before it is placed in analytical data stores; and
  • Implement machine learning that can recognize repetitive data patterns and infer meanings that can be added to the artificial intelligence “brain” for processing, so analytics can get “smarter” and respond better to changing business conditions.
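One common, lightweight way to recognize when incoming data no longer looks like the data the analytics were built on is the Population Stability Index (PSI), which compares binned distributions of a key field. The sketch below uses invented bin counts; the 0.2 threshold is a widely used rule of thumb, not a value from the article.

```python
from math import log

def psi(expected_counts, actual_counts, eps=1e-6):
    """Population Stability Index over pre-binned counts: a common
    drift signal (rule of thumb: PSI > 0.2 suggests meaningful drift)."""
    e_total, a_total = sum(expected_counts), sum(actual_counts)
    score = 0.0
    for e, a in zip(expected_counts, actual_counts):
        e_pct = max(e / e_total, eps)  # eps avoids log(0) on empty bins
        a_pct = max(a / a_total, eps)
        score += (a_pct - e_pct) * log(a_pct / e_pct)
    return score

# Hypothetical binned distributions of one key input field
baseline = [50, 30, 20]   # distribution when the analytics were built
similar  = [48, 32, 20]   # recent data, roughly unchanged
shifted  = [10, 30, 60]   # recent data, heavily shifted
print(psi(baseline, similar) < 0.2 < psi(baseline, shifted))  # True
```

A check like this can run automatically on each data load, alerting IT to investigate before users notice degraded results.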
