Operations Research

Assembly Line

Optimizing Order Picking to Increase Omnichannel Profitability with Databricks

📅 Date:

✍️ Authors: Peyman Mohajerian, Bryan Smith

🔖 Topics: BOPIS, Operations Research

🏢 Organizations: Databricks


The core challenge most retailers are facing today is not how to deliver goods to customers in a timely manner, but how to do so while retaining profitability. It is estimated that margins are reduced 3 to 8 percentage points on each order placed online for rapid fulfillment. The cost of sending a worker to store shelves to pick the items for each order is the primary culprit, and with the cost of labor only rising (and customers expressing little interest in paying a premium for what are increasingly seen as baseline services), retailers are feeling squeezed.

But by parallelizing the work, the days or even weeks often spent evaluating an approach can be reduced to hours or even minutes. The key is to identify discrete, independent units of work within the larger evaluation set and then to leverage technology to distribute these across a large computational infrastructure. In the picking optimization explored above, each order represents such a unit of work, as the sequencing of the items in one order has no impact on the sequencing of any others. At the extreme, we might optimize all 3.3 million orders simultaneously to perform our work incredibly quickly.
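The pattern described above can be sketched in a few lines. This is a toy illustration, not the Databricks implementation: `pick_distance` stands in for a real routing cost over shelf locations, and the per-order optimizer is brute force, which only works for small orders.

```python
from concurrent.futures import ProcessPoolExecutor
from itertools import permutations

def pick_distance(seq):
    # Stand-in cost: total walking distance if shelf positions are 1-D.
    return sum(abs(a - b) for a, b in zip(seq, seq[1:]))

def optimize_order(items):
    # One independent unit of work: find the cheapest picking sequence
    # for a single order by brute force (fine for a handful of items).
    return min(permutations(items), key=pick_distance)

def optimize_all(orders):
    # Orders don't interact, so they can be farmed out across workers
    # (or, at scale, across a cluster) and solved simultaneously.
    with ProcessPoolExecutor() as pool:
        return list(pool.map(optimize_order, orders))
```

Because each order's result is independent, the speedup is close to linear in the number of workers until the scheduler or data movement becomes the bottleneck.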

Read more at Databricks Blog

auton-survival: An Open-Source Package for Regression, Counterfactual Estimation, Evaluation

📅 Date:

✍️ Authors: Chirag Nagpal, Willa Potosnak

🔖 Topics: Operations Research, Predictive Maintenance

🏢 Organizations: Carnegie Mellon


Real-world decision-making often requires reasoning about when an event will occur. The overarching goal of such reasoning is to aid decision-making for optimal triage and subsequent intervention. Problems involving estimation of time-to-an-event arise across multiple application areas, including predictive maintenance. Reliability engineering and systems-safety research uses remaining-useful-life prediction models to help extend the longevity of machinery and equipment through proactive part and component replacement.

Discretizing time-to-event outcomes to predict if an event will occur is a common approach in standard machine learning. However, this neglects temporal context, which could result in models that misestimate and lead to poorer generalization.
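The information loss from discretization is easy to see with a toy example (the failure times and horizon below are hypothetical, not from the package):

```python
# Binarizing time-to-event at a fixed horizon discards how soon the event occurred.
failure_times = [3, 29, 31, 400]  # days until failure, hypothetical
horizon = 30

binary_labels = [t <= horizon for t in failure_times]
# The machine that failed on day 3 and the one that failed on day 29
# receive the same label, while day 29 and day 31 -- nearly identical
# outcomes -- receive opposite labels.
```

A survival model trained on the raw times retains this temporal context instead of forcing it through an arbitrary cutoff.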

Read more at CMU ML Blog

The evolution of Amazon’s inventory planning system

📅 Date:

🔖 Topics: demand planning, operations research, E-commerce, glocalization

🏢 Organizations: Amazon


Forecasting models developed by Amazon’s Supply Chain Optimization Technologies organization predict the demand for every product. Buying systems determine the right level of product to purchase from different suppliers, while large-scale placement systems determine the optimal location for products across the hundreds of facilities belonging to Amazon’s global fulfillment network.

“In 2016, Amazon’s supply chain network was designed for scenarios where inventory from any fulfillment center could be shipped to any customer to meet a two-day promise,” said Salal Humair, senior principal research scientist at Amazon who has been with the company for seven years. This design was inadequate for the new world in which Amazon was operating: one shaped by what Humair calls the “globalization-localization imperative.”

A new multi-echelon inventory system developed by SCOT (a project whose roots stretch back to 2016) is a significant break from the past. The heart of the model is a multi-product, multi-fulfillment-center, capacity-constrained model for optimizing inventory levels for multiple delivery speeds under a dynamic fulfillment policy. The framework then uses Lagrangian-type decomposition to control and optimize inventory levels across Amazon’s network in near real time.

Broadly speaking, decomposition is a mathematical technique that breaks a large, complex problem up into smaller and simpler ones. Each of these problems is then solved in parallel or sequentially. The Lagrangian method of decomposition factors complicated constraints into the solution, while providing a ‘cost’ for violating these constraints. This cost makes the problem easier to solve by providing an upper bound to the maximization problem, which is critical when planning for inventory levels at Amazon’s scale.
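A minimal sketch of the idea, far simpler than Amazon's actual model and using made-up numbers: maximize the value of stocked units across products sharing one capacity constraint. Pricing that constraint with a multiplier makes the relaxed problem separate by product, and its optimum upper-bounds the true optimum, exactly the 'cost for violating constraints' and 'upper bound' described above.

```python
# Toy Lagrangian relaxation: maximize sum(v[i]*x[i]) s.t. sum(x[i]) <= capacity,
# 0 <= x[i] <= u[i]. All data below is hypothetical.
v = [5.0, 3.0, 1.0]   # per-unit value of each product
u = [4, 4, 4]         # per-product stock limits
capacity = 6          # shared constraint coupling the products

def solve_relaxed(lam):
    # With the shared constraint priced at lam, each product decouples:
    # stock to the limit whenever its value exceeds the price.
    x = [ui if vi - lam > 0 else 0 for vi, ui in zip(v, u)]
    bound = sum(vi * xi for vi, xi in zip(v, x)) + lam * (capacity - sum(x))
    return x, bound  # bound is an upper bound on the true optimum

lam = 0.0
for _ in range(100):
    x, bound = solve_relaxed(lam)
    # Subgradient step: raise the price if capacity is violated, lower otherwise.
    lam = max(0.0, lam + 0.1 * (sum(x) - capacity))
```

Each `solve_relaxed` call decomposes into independent per-product decisions, which is what lets the real system solve at Amazon's scale: the coupled network problem becomes many small problems coordinated only through the multiplier.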

Read more at Amazon Science

How Amazon's Middle Mile team helps packages make the journey to your doorstep

📅 Date:

🔖 Topics: E-commerce, operations research

🏢 Organizations: Amazon


“To give you an idea of the scale and complexity we’re managing, our trucking network alone presents us with over 10⁸⁸ — or ten octovigintillion — possible routing solutions,” says Tim Jacobs, director of Middle Mile Research Science and Optimization. “This is an especially large number when you consider that there are 10⁸² atoms in the visible universe.”

And that’s just for the trucking network.

When a product is ordered on the Amazon Store, there are several ways it can make its way from a fulfillment center to the customer’s residence.

Read more at Amazon Science