One of Parker Avery’s key service areas focuses on the identification and implementation of new business capabilities to achieve retailers’ strategic objectives. In this capacity, we work closely with our retail clients to understand their desired future state, develop the detailed requirements necessary to achieve those capabilities, and then assess, vet, and select new retail solutions, typically followed by implementing the selected systems. Lately, we have been working on projects that encompass demand forecasting and other analytic optimization tools. This intensified interest was also evident at NRF’s Big Show, where we confirmed that the analytics marketplace is denser than ever with advanced software offering cutting-edge capabilities driven by machine learning (ML) and artificial intelligence (AI). You can read Parker Avery’s NRF recap here. A number of innovations frankly may not be mature enough for much retail adoption any time soon, or, on the flip side, retailers simply aren’t foundationally ready for them, particularly with respect to the state of their data. We talk about building a strong foundation, and specifically data maturity, in a recent Parker Avery point of view titled “Future-Proofing Retail.”

Let me give you a simple example of what I mean by being foundationally ready. Over the past few years, we’ve all heard about the increasing prevalence of ‘smart homes’ with automated lighting, security, thermostat controls, and the like. It all sounds fantastic: at the touch of a button on a smartphone, everything ‘magically’ works, turns on, sends alerts, and so on. Conceptually, a homeowner buys the smart home technology, effortlessly installs it, and suddenly has a smart home with everything connected. But what if that home was built several decades ago and still has the original furnace, thermostat, and other appliances that are suddenly expected to be ‘smart’? This aged infrastructure necessitates additional effort and investment, as much of the equipment simply cannot communicate with the new smart home apps or technology. The scenario leads to frustration and disappointment for the homeowner, and often eventual abandonment (or at least minimized use) of the new technology soon after the initial honeymoon period, since it didn’t end up being the smart home panacea.

Sometimes people seem to expect a magic result that has very little to do with their capacity to contribute to the process.

My smart home analogy is an example of where expectations can veer away from reality if you’re not careful. We’ve had clients insist on prioritizing a solution that includes AI because they know this is the direction all developing analytics are headed; however, they don’t always know the difference between machine learning and AI, or what might be required to truly achieve the expected results. When most people hear the term ‘artificial intelligence,’ they immediately think of autonomous, life-like robots that seem to ‘think’ and act independently (like the ones in The Terminator or I, Robot). In reality, this level of AI is not yet possible, and may not be in our lifetimes. However, whether you think AI is the future of technology, exaggerated science fiction, or already part of our everyday lives, you are right.

The most common form of AI today is machine learning, which at its most basic is the practice of using algorithms to parse data, identify patterns, and learn from them in order to make predictions about a future event. Most analytic platforms now include some level of machine learning, as it’s a staple for basic trend prediction and a completely realistic expectation when looking at new capabilities. However, today when software companies refer to AI, they are usually referring to some form of ‘artificial neural networks’ (ANN) or ‘deep machine learning.’ This is where predictive analytics starts getting interesting, as it involves layers of neural networks that can consume deeper levels of abstract data to make more complex predictions (think facial recognition software and self-driving cars).
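To make “basic trend prediction” concrete, here is a minimal sketch of the idea: an algorithm parses historical data (six weeks of unit sales), identifies a pattern (a linear trend), and uses it to predict a future event (next week’s sales). The sales figures are invented for illustration.

```python
# Minimal sketch: fit a linear trend to weekly unit sales, then predict
# the next week. The sales figures below are illustrative, not real data.

def fit_linear_trend(y):
    """Ordinary least squares for y = a + b*x, with x = 0, 1, ..., n-1."""
    n = len(y)
    xs = range(n)
    mean_x = sum(xs) / n
    mean_y = sum(y) / n
    b = (sum((x - mean_x) * (v - mean_y) for x, v in zip(xs, y))
         / sum((x - mean_x) ** 2 for x in xs))
    a = mean_y - b * mean_x
    return a, b

weekly_sales = [100, 104, 109, 113, 118, 121]  # six weeks of history
a, b = fit_linear_trend(weekly_sales)
next_week = a + b * len(weekly_sales)  # extrapolate one week ahead
print(round(next_week, 1))  # → 125.9
```

Real ML platforms do far more than a straight line, of course, but the loop is the same: learn parameters from history, then apply them forward.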

To truly realize the life-changing advantages associated with this deep learning, you must be able to feed the system a lot of data points in order to train its predictive algorithms. In the example of facial recognition, you would have to feed it hundreds (actually, thousands) of examples of male faces, female faces, and even non-human faces so that it could even determine the most basic structures of a human face and begin training its probability algorithms. There are several examples of this already in use in social media channels.
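The role of labeled training examples can be sketched with a toy classifier. This is purely illustrative: the two features and their values are invented, and real facial recognition relies on deep neural networks trained on thousands of images, not a two-feature nearest-neighbor lookup. But the principle carries over: the model only knows what its labeled examples teach it.

```python
# Toy sketch of learning from labeled examples: classify a query by the
# label of its nearest training example. Features are hypothetical.
import math

def nearest_neighbor(train, query):
    """Return the label of the training example closest to the query vector."""
    best_label, best_dist = None, math.inf
    for features, label in train:
        dist = math.dist(features, query)  # Euclidean distance (Python 3.8+)
        if dist < best_dist:
            best_label, best_dist = label, dist
    return best_label

# Hand-labeled examples with invented 2-D features
# (e.g., eye spacing, facial symmetry scores).
training_set = [
    ((0.90, 0.80), "face"), ((0.85, 0.90), "face"),
    ((0.10, 0.20), "not face"), ((0.20, 0.10), "not face"),
]
print(nearest_neighbor(training_set, (0.80, 0.85)))  # → face
```

With only four examples, the classifier is fragile; feed it thousands and its decision boundary sharpens, which is exactly why deep learning is so data-hungry.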

This seems straightforward, right? We’ll just feed it all our data and wait for the AI magic to begin. Well, it’s not always that easy, since many retail companies today are still struggling with basic master data governance.

Whoa. Master data governance?

You are probably wondering how a conversation about cool new analytics capabilities just took a sharp left turn into the (seemingly boring) world of master data. Remember my earlier point about being foundationally ready? Let’s explore this. Many retail companies have too little product attribute, customer, event, or promotion data, or what they do have is inconsistently maintained or missing altogether. Sometimes the data is held in disparate systems that are not integrated with or accessible to their analytics tools. All of this data is a critical component to making the magic of AI (i.e., deep machine learning) even possible.

Remember the smart home analogy we talked about earlier? Well, your reliable, 15-year-old furnace and water heater represent your available data (attributes, events, causal factors), and the smart home technology you want is AI-driven advanced predictive analytics. Your results can only be as good as the base you are working with. So, when you think about your analytic goals, the first thing you should do is inspect your master data. Is it standardized and centralized? Is it being governed consistently and accurately? Do you have, or can you get, the data required to analyze trends related to your goals?
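That kind of master data inspection can start very simply. Here is a sketch of a completeness audit over product records; the attribute names and records are hypothetical, and a real audit would also check standardization and cross-system consistency.

```python
# Hypothetical sketch: measure how complete product master records are
# before pointing an analytics tool at them. Attribute names are invented.

REQUIRED_ATTRIBUTES = ["sku", "category", "color", "size", "vendor"]

def audit_completeness(records):
    """Return the share of records missing (or blank for) each attribute."""
    gaps = {attr: 0 for attr in REQUIRED_ATTRIBUTES}
    for rec in records:
        for attr in REQUIRED_ATTRIBUTES:
            if not rec.get(attr):  # missing key or empty value
                gaps[attr] += 1
    return {attr: count / len(records) for attr, count in gaps.items()}

products = [
    {"sku": "A1", "category": "tops", "color": "red", "size": "M", "vendor": "V1"},
    {"sku": "A2", "category": "tops", "color": "", "size": "L", "vendor": "V1"},
    {"sku": "A3", "category": "", "color": "blue", "size": "S", "vendor": ""},
]
print(audit_completeness(products))
```

A report like this turns “our data might be a problem” into a concrete gap list that a governance effort can work down.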

For example, if you want to incorporate the impact of weather on your sales, start capturing weather impacts by location, or invest in a third-party provider to import the required data points. There is nothing wrong with wanting to build advanced analytic capabilities, but if you don’t have the underlying foundation to support them, you need to reconsider your approach or your expectations.
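Once you have weather observations, whether captured in-house or purchased from a provider, the integration step looks roughly like this sketch: join the observations onto sales by store and date so a forecasting tool can treat weather as a causal factor. All names and figures here are illustrative assumptions.

```python
# Sketch: attach daily weather observations to sales records by
# (store, date) so weather can be used as a causal factor in forecasting.
# Stores, dates, and readings below are invented for illustration.

sales = [
    {"store": "S1", "date": "2019-03-01", "units": 40},
    {"store": "S1", "date": "2019-03-02", "units": 55},
]
weather = {
    ("S1", "2019-03-01"): {"high_temp_f": 48, "precip_in": 0.3},
    ("S1", "2019-03-02"): {"high_temp_f": 65, "precip_in": 0.0},
}

def enrich_with_weather(sales_rows, weather_by_key):
    """Merge matching weather onto each sales row; None when no observation."""
    enriched = []
    for row in sales_rows:
        obs = weather_by_key.get((row["store"], row["date"]))
        enriched.append({**row, **(obs or {"high_temp_f": None, "precip_in": None})})
    return enriched

for row in enrich_with_weather(sales, weather):
    print(row)
```

Notice the design point: the join key (store and date) only works if store IDs and dates are governed consistently across systems, which is the foundational readiness this whole discussion is about.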

If you want to hear more about machine learning and AI, there are some entertaining and informative videos on the Doctor Data show. Here is an episode where Dr. Eric Siegel, host and former Columbia University analytics professor, debates the existence of AI altogether.


Published On: March 22, 2019 | Categories: Analytics, Big Data, Innovation, Marty Anderson, Master Data