Challenge

Changes in consumer behavior have led retailers to pursue greater customer proximity by increasing the number of stores while focusing on convenience formats. This shift requires heightened agility when analyzing potential new store locations.

This project aimed to improve a retailer’s accuracy when predicting the sales potential of candidate store locations, while also increasing the speed at which these forecasts are generated. The goal was to find the highest-value locations for the retailer’s future stores.

Solution

We started by estimating the total market size, combining different data sources such as external reports, demographic statistics and raw datasets (e.g., using Airbnb listings to map the impact of tourism across key urban areas). This gave us a remarkable level of geographic detail without forgoing precision.

Afterwards, we built an interactive app that lets users study the performance of future stores, given specific characteristics such as exact location, store size and store brand. The tool has a built-in machine learning model that forecasts store sales by analyzing the competitive landscape and estimating the store’s market share within its area of influence.
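The source does not disclose the model itself, but the idea of estimating a store’s market share within its area of influence can be illustrated with a classic Huff-style gravity model. All store sizes, distances and the market-size figure below are hypothetical:

```python
def huff_share(candidate, competitors, beta=2.0):
    """Estimate a store's market share within its area of influence.

    Each store is an (attractiveness, distance_km) pair, e.g. sales
    area in m2 and distance to the demand centroid; beta controls how
    quickly a store's pull decays with distance.
    """
    def utility(attractiveness, distance):
        return attractiveness / (distance ** beta)

    u_candidate = utility(*candidate)
    total = u_candidate + sum(utility(a, d) for a, d in competitors)
    return u_candidate / total

# Hypothetical candidate: 1,200 m2, 0.5 km from the demand centroid,
# facing two competitors in the same area of influence.
share = huff_share((1200, 0.5), [(800, 1.0), (2000, 2.0)])
estimated_sales = share * 10_000_000  # illustrative market size
```

A real implementation would replace the distance-decay utility with a learned model over store, competition and demographic features, but the share-of-total structure is the same.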

Results

The developed solution enabled the client to increase sales forecast accuracy by 15% and to cut the time needed to analyze a new store location from 5-10 days to 1-2 minutes. Since a solid forecast could be delivered much faster, the team could focus on promising locations only.

Multiple validation scenarios were tested to prove the robustness of the solution, which allowed it to become a central piece in the client’s expansion process.

Related Case Studies

UNIFYING AN ELECTRONIC RETAILER LOGISTICS FLOW FROM SUPPLIERS TO STORES
RESHAPING THE ROUTING OPERATION AND PLANNING OF A THIRD PARTY LOGISTICS PROVIDER
APPLYING ANALYTICS TO IMPROVE PEOPLE PLANNING IN A RETAIL CHAIN

Insights

Warehouses are a critical part of a seamless supply chain. The current focus on assigning value-added activities to warehouses, in order to reduce downstream labour costs, means that warehouses need to be designed in detail and carefully thought through. The significant number of processes happening simultaneously, and the interdependencies between them, make it crucial to take accurate decisions when implementing any change. When studying a process improvement, a layout change or the introduction of a modern technology in the warehouse operation, the critical question is: will this change negatively affect the warehouse’s performance?

What are the benefits of a more balanced operation? A warehouse can only operate at peak efficiency if all its processes are balanced as a whole and communicate without major disruptions. Disruptions can stem from suppliers delivering too early or too late, from a bottleneck in the warehouse operation, or from the difficulty of finding a specific item in the warehouse. Taking the example of an unbalanced reception: if the warehouse’s suppliers are not allocated to the best time slots, more operators and more buffer space may be needed in the reception area. An unbalanced reception can also mean that downstream processes, such as sorting or shipping, need more operators to run at maximum capacity. Simulating warehouse flow can help determine bottlenecks in the operation and the hidden cost of an unbalanced warehouse operation.

Where should products be placed to improve sorting efficiency? The current demand for more products with shorter lifecycles, and for increased differentiation, is a challenge to warehouse operations. This evolution prompts more investigation into the placement of products in the warehouse, to reduce the time spent on picking, the most labor-intensive warehouse activity. Simulation is a proven methodology to test different location assignments and to verify the impact of changes on the time and resources a warehouse needs.

Can an innovative technology or layout improve warehouse performance? A crucial part of managing a warehouse is improving its performance while making complex decisions among diverse options for achieving those improvements. Could a change of layout increase congestion between operators? Would adopting a new automation process that doubles the warehouse’s overall processing rate be worthwhile without an increase in the reception rate? Would it lead to savings, or just make the warehouse more unbalanced? All these scenarios can be simulated in detail, evaluating the trade-off of each measure and weighing the distinct opportunities holistically.

Minimising the risk of warehouse changes. Warehouse changes usually require big investments, whether in new equipment, in outsourcing or relocating the operation while the change takes place, or in overtime compensation for workers. In the worst case, these changes can affect the performance of the whole supply chain and put it at risk. Testing different scenarios in a real warehouse is highly uneconomical and extremely risky, which makes virtual experimentation all the more important. Simulation brings a new level of dynamism to decision making and shows enormous growth potential in companies.
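The hidden cost of an unbalanced reception can be made concrete with a minimal discrete-time simulation. The hourly arrival and processing rates below are purely illustrative; a real study would use a discrete-event simulation of the full operation:

```python
def simulate_reception(arrivals_per_hour, capacity_per_hour):
    """Track the reception buffer hour by hour.

    A growing buffer signals a bottleneck: it translates into extra
    buffer space and, downstream, extra operators to catch up.
    Returns the peak number of pallets waiting in reception.
    """
    buffer = 0
    peak = 0
    for arrivals in arrivals_per_hour:
        buffer += arrivals                       # supplier deliveries
        buffer -= min(buffer, capacity_per_hour)  # pallets processed
        peak = max(peak, buffer)
    return peak

# Same daily volume, different supplier schedules (illustrative):
unbalanced = simulate_reception([60, 60, 60, 0, 0, 0], capacity_per_hour=30)
balanced = simulate_reception([30, 30, 30, 30, 30, 30], capacity_per_hour=30)
# unbalanced -> a 90-pallet backlog; balanced -> no backlog at all
```

Even this toy model shows why allocating suppliers to the right time slots reduces the operators and buffer space the reception area needs.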


To have impact, numbers need stories, and vice versa. To form a complete picture, businesses need both Big Data (quantitative information) and Thick Data (qualitative information). Each produces different types of insights at varying scales and depths, and they can yield interesting synergies and complement each other.

In particular, the tactical decision-making level has a large decision-making gap that can leverage this ‘dual’ approach. On the one hand, analytics-based methods can quantify and generalize insights (Big Data); on the other, empirical approaches can improve the business perspective and the depth of analysis (Thick Data). We’ve devised four main methods to integrate Thick Data into Big Data:

Raise hypotheses: use Thick Data as a source of inspiration to raise hypotheses that are then tested over the population with Big Data
Confirm correlations: check Thick Data insights to confirm correlations found through exploratory Big Data analysis
Warm-up: plug Thick Data numbers into the warm-up phase of a Big Data project
Full connector: start using Thick Data as in the warm-up method, then continue using it to keep calibrating the Big Data model

Big & Thick Data in the Telco industry with a full connector approach

A great example of blending Big & Thick Data is a program launched by a leading Telco provider with the goal of empowering staff to interact with customers on a personalized level. The first step was to combine 79k data variables around customer service and marketing into a single Net Promoter Score (NPS). This model helped move from around 30k NPS attributions per year to scoring the whole customer base with an accuracy of around 80%. The next step was a sentiment analysis over call center records to move into contextual customer insights, thus refining the NPS score.

Leveraging NPS data made it possible to differentiate marketing strategy and interaction guidelines between customer groups (detractors vs. passives vs. advocates) — for example, making carefully crafted outbound calls to detractors and using electronic direct marketing (EDM) with advocates. The insights obtained were also essential to ensure there was a clear follow-up point for each customer. To engage staff, a simple internal portal was built with NPS, interaction scripts and key client info; at this stage, creating strong engagement links with IT was critical. During the first six months, 50k customer issues were identified and solved, and customer perception (NPS scores) improved as a result.

The path towards transformational projects

As this case study shows, transformational projects involve cross-functional skills in Thick Data, Data Science and Management. Effective transformational projects that leverage Big and Thick Data require:
a holistic perspective of the problem, covering multiple integrated processes, stakeholders and KPIs
a solid methodological approach, to grasp the synergies of both data streams
a sustainable cultural change towards the execution and adoption of such projects to support decision making
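The group-based differentiation described above can be sketched with the standard NPS bands (0-6 detractor, 7-8 passive, 9-10 promoter — the article calls promoters "advocates"). The channel mapping mirrors the case study; the scoring model itself is out of scope here:

```python
def nps_group(score):
    """Classify a 0-10 score into the standard NPS bands."""
    if score <= 6:
        return "detractor"
    if score <= 8:
        return "passive"
    return "promoter"

# Interaction guideline per group, following the case study.
CHANNEL = {
    "detractor": "outbound call",     # carefully crafted personal contact
    "passive": "standard campaign",   # assumed default for passives
    "promoter": "EDM",                # electronic direct marketing
}

def net_promoter_score(scores):
    """NPS = % promoters minus % detractors, over the whole base."""
    groups = [nps_group(s) for s in scores]
    promoters = groups.count("promoter")
    detractors = groups.count("detractor")
    return 100 * (promoters - detractors) / len(scores)
```

In the case study the input scores came from a model over 79k variables rather than from surveys, which is what made it possible to score the whole customer base instead of 30k respondents a year.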


Business analytics involves several methods and tools that can be organized into three dimensions:
Descriptive analytics – understanding past performance (i.e., reporting)
Predictive analytics – using data to anticipate what the future will look like (i.e., forecasting)
Prescriptive analytics – suggesting a course of action to improve your business (i.e., optimizing)

As a manager, you should know what you want the data to do and recognize the five key benefits that business analytics yields:

1. Improved return on investment compared to ‘pure analytics’
‘Pure analytics’ means starting to mine data without a specific business objective in mind. Such projects carry a high risk of producing no results. With business analytics, the business comes ahead of the data and guides the exploration process more purposefully.

2. Superior robustness and interpretability of results compared to ‘pure analytics’
Because the business guides the overall process, from problem framing to the validation of solutions, teams are more likely to understand and use the results in the operation. Moreover, business sense should steer teams away from solutions that only perform under strong assumptions.

3. More detailed decision making than relying on business sense alone
Without advanced analytics, managers often make decisions that work well on average but fail to recognize the complexity of the business landscape. The power of data lies in the possibility of tailoring decisions appropriately to different situations.

4. More alignment within your organization than with other approaches to decision making
Business analytics strikes a good balance between grounding decisions in bottom-up evidence – the data – and ensuring appropriate business guidance. This equilibrium translates into teams’ comfort, as data captures the operational complexity within the right business framing.

5. An opportunity to challenge your business beliefs
With the fast-paced evolution of expectations across the different echelons of the supply chain (e.g., suppliers and customers), it is ever more crucial to continuously challenge business beliefs that often lead to poor decision making. Business analytics, by enabling data exploration, is a very good instrument to cross-check managers’ intuitions.

Conclusion
To reap the five key benefits that business analytics can bring, C-level support is mandatory, along with the right team blending business expertise and analytics prowess. The more stakeholders who can manage both skill sets, the better.
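The three dimensions of analytics can be made concrete on a toy example. The weekly sales figures, the naive trend forecast and the stocking rule below are all illustrative assumptions, chosen only to show one instance of each dimension:

```python
sales = [100, 110, 120, 130]  # toy data: last four weeks of sales

# Descriptive: understand past performance (reporting).
average = sum(sales) / len(sales)  # -> 115.0

# Predictive: a naive linear-trend forecast for next week (forecasting).
trend = sales[-1] - sales[-2]      # +10 units per week
forecast = sales[-1] + trend       # -> 140

# Prescriptive: suggest an action (optimizing) — pick the smallest
# feasible order quantity that covers the forecast demand.
options = [120, 140, 160]
order = min(o for o in options if o >= forecast)  # -> 140
```

Each step answers a different question: what happened, what will likely happen, and what should be done about it.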
