Retail Intelligence Reporting That Works

Applying AI & Data to Retail Intelligence

Retail Intelligence Dashboards are hard to get right when you can't really trust the underlying data.

Retail Intelligence is all the rage. It’s about tracking your operations against consumer behavior and sales performance across a swathe of touchpoints. But through a BI dashboard, darkly, we may scratch beneath the surface to find our insights… and ultimately discover what lurks beneath the dashboards.

Retail Intelligence depends on sourcing data from multiple points that span the globe:

  1. Manufacturers
  2. Suppliers
  3. Distributors
  4. Shipping
  5. Channel Partners
  6. Stores
  7. Websites
  8. Social Commerce

Then there is the packaging and provisioning of this data from disparate systems such as:

  1. Point of Sale
  2. E-Commerce
  3. Inventory
  4. Order Management
  5. Payment Gateways
  6. Credit Controls
  7. ERP / Finance Systems
  8. Bank Reconciliations

These are hardly exhaustive lists, but one thing remains constant: the volume of data being amassed across retail organizations is huge and growing.

By some estimates, 97% of the data in an average enterprise goes unused, and yet the “monetisation of data” has been cited as a top-three concern of the C-suite at a number of Fortune 500 companies.

Everything is being tracked, yet very little thought goes into productizing the data that already exists within your business. It’s a big job to get right and, quite often, the goalposts move every quarter.

It’s hard for in-house data analytics teams to cover both the implementation and maintenance of the underlying data platforms and infrastructure while also delivering the “analytics use cases” that finance and sales decision makers are demanding.

Dashboards Are The Decision Maker’s Window

Dashboards are all the rage. They provide a consolidated view of information that helps you run your business and keep the most important data up front. 

Let’s take the dashboard in a car as an analogy: 

You know how fast you’re going, how much fuel is in the tank (or how much charge is left in the battery) and if there are any issues with the engine. You need this data to drive safely and effectively, right?

Imagine what it’s like if you don’t trust that data. 

What if you’re driving 50 mph but the speedometer says you’re doing 30 mph? 

Or if you’ve got a 150 mile journey ahead and, even though the tank says ‘full’, you’re running on fumes and are likely to run out on the highway. 

This analogy extends quite nicely to how we use dashboards in business. 

We need accurate information, clearly visualized and performant, to make the right management decisions. 

A good dashboard is one that is clear, accurate and user-friendly. 

The components of the dashboard should be fit for purpose and speed up a person’s ability to spot trends, identify issues and take corrective action when a particular problem surfaces. 

Much of the value a dashboard provides depends on how it has been set up; that value is lost when components in the underlying data pipeline fail and nobody knows why.

It’s quite likely that the pressure to deliver something quickly has meant that corners were cut to ship features. Whilst the unsuspecting end-user may not notice a change in the interface they rely on, something far more sinister may be lurking beneath. 

That sinister ‘something’ could be the one domino to fall, creating a cascade of issues that are hard to resolve because of the spaghetti code and technical debt lingering since the day the dashboard was unveiled.

Understanding The Approach

There are two approaches to building dashboards: back-to-front and front-to-back.

  • A back-to-front approach starts with the underlying data and works its way forward to the business user. This approach pays more attention to the data pipelines and the provisioning of access to the data than to the design and efficacy of the metrics.

  • A front-to-back approach starts with the design of the dashboard and works its way back to the data sources. This approach usually pays more attention to how the dashboard will be used than to the technical work necessary to create robust data pipelines.

We would argue that either approach is insufficient unless it really works its way to the end of the line.

That is to say, if a data-first approach stops before it reaches the dashboard design phase, end users will have problems adopting the dashboard. If a metrics-first approach stops before it gets to the deeper data engineering and ETL questions, the resulting dashboard will likely fail to deliver accurate data continuously.

Many businesses rightly choose the front-to-back approach because it is easier to react to a tangible mockup of a dashboard than to conceptualize a data model and its queries.

It is easier to ask a team what they would like to see than to show them what they could have. However, it is often important to go a few levels deeper than simply provisioning API access to the underlying database or data service, and to evaluate how robust the pipeline actually is.

If the dashboard only works with real-time data, for example, that’s a complex engineering task. If the dashboards that business users rely on are based upon data they do not own or directly control, then safeguards must be put in place to measure the reliability, frequency and accuracy of the data that is provided by a 3rd party. 
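One practical safeguard is to score every third-party feed on freshness and completeness before its numbers ever reach a dashboard. The sketch below is a minimal illustration; the field names (`captured_at`, `units_sold`) and the 24-hour threshold are assumptions for the example, not a reference to any real feed:

```python
from datetime import datetime, timedelta, timezone

def check_feed(records, max_age_hours=24, required_fields=("captured_at", "units_sold")):
    """Return basic reliability metrics for an external data feed."""
    now = datetime.now(timezone.utc)
    # Completeness: rows where every required field is present and non-null.
    complete = [r for r in records
                if all(f in r and r[f] is not None for f in required_fields)]
    # Freshness: complete rows captured within the agreed window.
    fresh = [r for r in complete
             if now - r["captured_at"] <= timedelta(hours=max_age_hours)]
    total = len(records)
    return {
        "row_count": total,
        "completeness": len(complete) / total if total else 0.0,
        "freshness": len(fresh) / total if total else 0.0,
    }

# Illustrative feed: one good row, one stale row, one incomplete row.
feed = [
    {"captured_at": datetime.now(timezone.utc), "units_sold": 12},
    {"captured_at": datetime.now(timezone.utc) - timedelta(hours=48), "units_sold": 3},
    {"captured_at": datetime.now(timezone.utc), "units_sold": None},
]
metrics = check_feed(feed)
```

In practice you would persist these metrics over time and alert when they fall below the service level agreed with the third party.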

This is probably one of the most common issues that we come across as a data engineering firm with our clients: upstream data is not subjected to the same level of scrutiny as downstream analytics. 

Data quality, integrity and lineage are beyond the scope, understanding and ownership of most downstream users so why would they think to ask about it? And yet, this is where most problems originate…

To create a good retail intelligence dashboard, you need to take the time to understand what lurks beneath.

  • What is the origin of the data and who is responsible for it?
  • What has to happen to turn the data into something usable?
  • What are the stages of data transformation? Is it monitored?
  • If something goes wrong, can we identify the cause easily?
  • Is there a point of contact we can turn to in case of data issues?
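To make the “can we identify the cause easily?” question concrete, here is a minimal sketch of stage-level monitoring: each transformation stage logs its outcome, so a failure points straight at the culprit rather than surfacing as a silently wrong dashboard. The stage names and the deliberately bad input are invented for illustration:

```python
def run_pipeline(raw, stages):
    """Run `raw` through an ordered list of (name, fn) stages.

    Returns (result, log); on failure, result is None and the last
    log entry records which stage broke and why.
    """
    log, data = [], raw
    for name, fn in stages:
        try:
            data = fn(data)
            log.append((name, "ok", len(data)))
        except Exception as exc:
            log.append((name, f"failed: {exc}", None))
            return None, log
    return data, log

stages = [
    ("parse", lambda rows: [r.strip() for r in rows]),
    ("validate", lambda rows: [r for r in rows if r]),   # drop blank rows
    ("cast", lambda rows: [int(r) for r in rows]),       # fails on non-numeric input
]
result, log = run_pipeline([" 1 ", "2", "", "oops"], stages)
```

Here the log shows “parse” and “validate” succeeded and “cast” failed on the non-numeric value, which is exactly the breadcrumb trail you want when a dashboard goes dark.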

We are not saying that you need to become a pseudo-data engineer just to build a dashboard. In fact, most of the time, out-of-the-box SaaS dashboards will do just fine and have been specifically designed to abstract the end user from having to deal with what lurks beneath. 

However, once dashboards start to get really specific or begin to ingest sensitive data, such as customer or financial data, then the templated plug-n-play providers just won’t be able to deliver the use case sufficiently. 

You’ll need strong capabilities in ETL, data warehousing, process automation, monitoring and data processing before you even think about the presentation layer and what that dashboard looks like. If not, you’ll end up spending more time choosing the color of the bar charts than worrying about the quality of the underlying data.

Dashboards are one of the best ways to provide a consolidated view of important information in a visual format to end users – typically those tasked with making operational decisions or analyzing performance. However, they are the tip of the iceberg in the data value chain. 

Adopting A New Approach

When everyone is selling you their flavor of “we turn raw data into actionable insight” it is useful to come back to first principles to visualize how data is made useful in an organization… without regressing into technobabble!

Step 1 – Data Sources

In this first step, you should clearly define where the data resides, how you’re going to access it, how often you’re going to access it and what rules you need to comply with. For most modern use cases, you will have multiple data sources that you’ll want to access. Understanding the lineage of this data and maintaining a catalog are essential.
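A lightweight way to start is a source catalog kept in code. The sketch below is illustrative only; the system names, locations, cadences and compliance tags are assumptions for the example, not a prescribed schema:

```python
# Minimal, illustrative source catalog: where each source lives, how it is
# accessed, how often, and which rules apply. All entries are hypothetical.
SOURCES = {
    "pos_transactions": {
        "owner": "store-ops",
        "location": "pos_db.transactions",      # hypothetical table name
        "access": "nightly batch export",
        "cadence_hours": 24,
        "compliance": ["PCI-DSS"],
    },
    "ecommerce_orders": {
        "owner": "digital",
        "location": "https://shop.example.com/api/orders",  # placeholder URL
        "access": "REST API",
        "cadence_hours": 1,
        "compliance": ["GDPR"],
    },
}

def sources_needing_refresh(catalog, hours_since_last_pull):
    """List sources whose last pull is older than their agreed cadence."""
    return sorted(
        name for name, meta in catalog.items()
        if hours_since_last_pull.get(name, float("inf")) >= meta["cadence_hours"]
    )

stale = sources_needing_refresh(
    SOURCES, {"pos_transactions": 30, "ecommerce_orders": 0.5}
)
```

Even a toy catalog like this answers the first two “what lurks beneath” questions: where the data comes from and who is responsible for it.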

Step 2 – Data Staging

Now that you’ve identified where the data is, you need to decide what data you actually need and extract it. This is typically an exercise that requires careful decomposition of the data model you have designed and a mapping of its base components at source. You’re then going to keep this “raw” data in a staging environment, ready to be accessed.
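As a minimal illustration of staging, the sketch below extracts only the fields a hypothetical order model needs and writes the raw rows to a staging file as JSON lines; all field names are invented for the example:

```python
import json
import pathlib
import tempfile

# Fields the (hypothetical) order model actually needs from the source.
NEEDED = ("order_id", "sku", "qty", "sold_at")

def stage(rows, staging_dir):
    """Write only the needed fields of each source row to a raw staging file."""
    path = pathlib.Path(staging_dir) / "orders.raw.jsonl"
    with path.open("w") as fh:
        for row in rows:
            fh.write(json.dumps({k: row.get(k) for k in NEEDED}) + "\n")
    return path

source_rows = [
    {"order_id": 1, "sku": "A-100", "qty": 2, "sold_at": "2024-01-05",
     "internal_note": "not needed downstream"},   # dropped during staging
    {"order_id": 2, "sku": "B-200", "qty": 1, "sold_at": "2024-01-05"},
]
staged_path = stage(source_rows, tempfile.mkdtemp())
staged = [json.loads(line) for line in staged_path.read_text().splitlines()]
```

The point of the exercise is the mapping itself: deciding, field by field, what the downstream model needs and leaving the rest at source.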

Step 3 – Data Transformation

You’ve managed to extract the data you need (possibly from a variety of sources) and have it sitting in a database or a storage folder somewhere. Cleansing, reformatting, merging, enriching and integrating these separate sources into a single dataset requires some code and some rules to shape the data into a format that can then be successfully ingested into your database.
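A small sketch of that transformation step, assuming a hypothetical product lookup table: SKUs are normalized, unusable rows are dropped, and each surviving row is enriched with a revenue figure. The products, prices and field names are all invented for illustration:

```python
# Hypothetical enrichment lookup: SKU -> product details.
PRODUCTS = {
    "A-100": {"name": "Mug", "unit_price": 4.50},
    "B-200": {"name": "Tote", "unit_price": 9.00},
}

def transform(raw_rows):
    """Cleanse, filter and enrich staged rows into an ingestible dataset."""
    out = []
    for row in raw_rows:
        sku = (row.get("sku") or "").strip().upper()   # normalize the key
        if sku not in PRODUCTS or not row.get("qty"):  # drop unusable rows
            continue
        product = PRODUCTS[sku]
        qty = int(row["qty"])
        out.append({
            "sku": sku,
            "product_name": product["name"],
            "qty": qty,
            "revenue": round(qty * product["unit_price"], 2),
        })
    return out

dataset = transform([
    {"sku": " a-100 ", "qty": "2"},   # messy but recoverable
    {"sku": "B-200", "qty": 1},
    {"sku": "ZZZ", "qty": 5},         # unknown product, dropped
    {"sku": "A-100", "qty": None},    # missing quantity, dropped
])
```

Real pipelines encode dozens of such rules; the discipline is writing them down explicitly rather than burying them in a dashboard formula.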

Step 4 – Data Warehousing

With “clean” inputs – an ingestible file to be imported, or ingested via an API connection from the staging environment – you’ll be ready to test the next component of your data pipeline. Data warehouses are systems designed for the analysis and reporting of large amounts of structured and semi-structured data. Choose a platform that best meets your needs.
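To keep the idea concrete, the sketch below loads a clean dataset into SQLite as a stand-in for a real warehouse; Snowflake, BigQuery, Redshift and friends follow the same load-then-query pattern at a very different scale:

```python
import sqlite3

def load_and_summarise(rows):
    """Load clean rows into a warehouse table and run a reporting query."""
    con = sqlite3.connect(":memory:")   # stand-in for a real warehouse
    con.execute("CREATE TABLE sales (sku TEXT, qty INTEGER, revenue REAL)")
    con.executemany("INSERT INTO sales VALUES (:sku, :qty, :revenue)", rows)
    # The kind of aggregate a dashboard tile would sit on top of.
    (total,) = con.execute("SELECT SUM(revenue) FROM sales").fetchone()
    con.close()
    return total

total_revenue = load_and_summarise([
    {"sku": "A-100", "qty": 2, "revenue": 9.0},
    {"sku": "B-200", "qty": 1, "revenue": 9.0},
])
```

The dashboard only ever sees the output of queries like this, which is why everything upstream of the warehouse matters so much.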

Step 5 – Data Visualization

Now we get to the Dashboard. Having defined the important things you want to see, the questions you want answered and the ways in which the information is best visualized, you’re ready to begin your first dashboard deployment. There are many platforms out there such as PowerBI, Tableau or Qlik, but the design principles remain the same: make it useful for users.

With a deeper understanding of the layers underpinning your Business Intelligence dashboards, you can see that there is much more than meets the eye. The accuracy, consistency and quality of the information presented to the end user is only as good as the weakest link in the data processing cycle. If you’re struggling to make your dashboards work for you, it’s likely an upstream issue that the data experts should take a look at.

Join us next month for our webinar entitled “I’m Not Your Chief Dashboard Officer” as we explore the key steps in fixing data pipelines for a number of enterprise use cases.