Walmart – the world’s biggest retailer, with over 20,000 stores in 28 countries – is in the process of building the world’s biggest private cloud, capable of processing 2.5 petabytes of data every hour.
To make sense of all of this information, and put it to work solving problems, the company has recently completed construction of what it calls its Data Café – a state-of-the-art analytics hub located within its Bentonville, Arkansas headquarters.
Here, over 200 streams of internal and external data, including 40 petabytes of recent transactional data, can be modelled, manipulated and visualized. Teams from any part of the business are invited to bring their problems to the analytics experts and then see a solution appear before their eyes on the nerve centre’s touch screen “smart boards”.
This tool has cut the time it takes to answer complex business questions – ones that depend on multiple external and internal variables – from weeks to minutes.
Naveen Peddamail told me “If you can’t get insights until you’ve analyzed your sales for a week or a month, then you’ve lost sales within that time.
“If you can cut down that time from two or three weeks to 20 or 30 minutes, then that saves a lot of money for Walmart and stops us losing sales. That’s the real value of what we have built with the Data Café.”
When analysts can draw on huge amounts of verified, quantifiable data at high speed, problems caused by human error or miscalculation at the planning or execution stage of a particular business activity will often simply melt away.
For example, Naveen told me about a grocery team who could not understand why sales had suddenly declined in a particular product category. The team came to the Café to find out why and, by drilling into the data, were quickly able to see that pricing miscalculations had led to the products being listed at a higher price than they should have been in some regions.
In another example, during Halloween, sales analysts were able to see in real time that although a particular novelty cookie was very popular in most stores, there were two stores where it wasn’t selling at all. The alert allowed the situation to be quickly investigated, and it was found that a simple stocking oversight meant the cookies had never been put on the shelves. The company was then able to rectify the situation immediately, avoiding further lost sales.
The system also provides automated alerts: when particular metrics fall below a set threshold in any department, the relevant team is invited to bring the problem to the Data Café and, hopefully, find a quick solution.
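To make the idea concrete, the snippet below is a minimal sketch – in Python, with entirely hypothetical store IDs, sales figures and thresholds rather than anything from Walmart’s actual systems – of the kind of check that sits behind such an alert: flag any store selling far below a threshold for an item the rest of the chain is selling briskly, which is exactly the pattern that exposed the Halloween cookie oversight.

```python
from statistics import mean

# Hypothetical hourly unit sales per store for one item (store_id -> units sold).
# In a real pipeline these figures would stream in from point-of-sale systems.
hourly_sales = {
    "store_0017": 42,
    "store_0231": 38,
    "store_0412": 0,    # the cookies never made it onto the shelf
    "store_0788": 51,
    "store_0903": 0,    # same stocking oversight
    "store_1204": 47,
}

ALERT_THRESHOLD = 5        # units/hour below which a store is flagged
CHAIN_HEALTH_MINIMUM = 20  # chain-wide average indicating the item is genuinely popular


def find_underperforming_stores(sales, threshold, chain_minimum):
    """Return stores selling far below a threshold while the chain average is healthy."""
    chain_average = mean(sales.values())
    if chain_average < chain_minimum:
        return []  # demand is weak everywhere: not a store-level anomaly
    return [store for store, units in sales.items() if units < threshold]


if __name__ == "__main__":
    flagged = find_underperforming_stores(hourly_sales, ALERT_THRESHOLD, CHAIN_HEALTH_MINIMUM)
    for store in flagged:
        # In practice this would raise an alert for the relevant team to investigate.
        print(f"ALERT: {store} selling well below chain average – check stocking and pricing")
```

In a production setting the same logic would run continuously against streaming point-of-sale data rather than a hard-coded dictionary, but the principle – compare each store against a healthy chain-wide baseline and alert on the outliers – is the same.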
Peddamail told me “Our goal is always to get information to our business partners as fast as we can, so they can take action and cut down the turnaround time. It is proactive and reactive analytics.”
As well as 200 billion rows of transactional data (representing only the past few weeks!), the Café pulls in information from 200 sources including meteorological data, economic data, Nielsen data, telecom data, social media data, gas prices, and local events databases.
Anything within these vast and varied datasets could hold the key to the solution to a particular problem, and Walmart’s algorithms are designed to blaze through them in microseconds to come up with real-time solutions.
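As a simplified illustration of what blending these streams can look like – again a sketch with made-up numbers, not Walmart’s own code – the following joins hypothetical daily sales of one product with local temperature readings and measures how closely they move together, the kind of external signal (weather, local events, fuel prices) an analyst at the Café might fold into a query.

```python
import pandas as pd

# Hypothetical daily figures for a single store: units sold of one product
# and the local daily high temperature (an external, meteorological stream).
sales = pd.DataFrame({
    "date": pd.date_range("2015-07-01", periods=7),
    "units_sold": [120, 135, 90, 180, 210, 95, 200],
})
weather = pd.DataFrame({
    "date": pd.date_range("2015-07-01", periods=7),
    "high_temp_f": [88, 91, 79, 97, 101, 80, 99],
})

# Blend the internal (transactional) and external (weather) streams on date.
blended = sales.merge(weather, on="date")

# A simple correlation hints at how strongly the external signal tracks demand.
correlation = blended["units_sold"].corr(blended["high_temp_f"])
print(f"Correlation between temperature and units sold: {correlation:.2f}")
```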
At the moment, the solution is generally geared towards solving problems in the merchandising arm of the business, but in time it will be expanded to other areas such as HR and marketing. This is all part of the company’s plan to build the world’s largest cloud-based database.
While separate, siloed systems are used within stores for many functions – such as inventory management, customer loyalty and price comparisons with local rivals – the eventual aim is to bring all of this information under one roof, where the impact on all of the company’s operations can be assessed and assimilated into the analytics.
Kevin Thornton, director of communications within the Walmart Technology Division, told me “Currently it’s about setting up the infrastructure, so data teams can relay information to store managers. But going forward we will be looking at implementing some kind of analytics solution at store level as well.
“We are looking at ways we can combine it, so we can all pool data and look at it using the same kind of visualization tools. We’re not there quite yet, but that is the goal.”
It seems strange to apply an adjective such as “agile” to a behemoth like Walmart. However, in Big Data terms, its size certainly gives it power – it has huge amounts of data at its fingertips and the resources to go on collecting far more. By combining this with the ability to make very fast decisions and implement changes based on incoming, real-time data, it is clear Walmart sees data as key to keeping itself at the top.