Supply chain technology vendors are increasingly incorporating machine learning into their applications, helping their solutions more accurately understand and react to changing conditions.
One of the biggest developing trends in the logistics technology space is the growing application of machine learning in warehousing and transportation. In fact, something of an arms race has developed among technology providers as they try to leverage machine learning to differentiate their applications.
Machine learning is a branch of artificial intelligence. "Learning" occurs when a machine takes an existing data set, measures the accuracy of its outputs, and updates its own model so that future outputs improve. Any machine that does this is using machine learning, regardless of whether formal data science methods are involved or whether the underlying technique is a neural network or some other form of supervised or unsupervised learning. From a user's perspective, it's not necessary to get bogged down in the specific technique.
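To make that loop concrete, here is a minimal sketch in Python of the observe-measure-update cycle described above; the single-parameter model, data, and learning rate are illustrative only.

```python
# Minimal sketch of the observe-measure-update loop: predict, check the error,
# and nudge the model so the next prediction is better. Data are made up.

def update(weight, feature, actual, learning_rate=0.1):
    """One learning step: predict, measure the error, adjust the model."""
    predicted = weight * feature
    error = actual - predicted                        # how wrong was the output?
    return weight + learning_rate * error * feature   # move toward better outputs

weight = 0.0
observations = [(2.0, 4.1), (3.0, 5.9), (1.0, 2.2), (4.0, 7.8)]  # (input, outcome) pairs

for feature, actual in observations:
    weight = update(weight, feature, actual)

print(f"learned weight: {weight:.2f}")  # approaches ~2.0 as the data accumulate
```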
[Figure 1] Machine learning can improve algorithms used by warehouse management systems
Warehouse applications
Technology providers are already applying machine learning to many areas of the warehouse. Part of what makes warehousing a suitable application for machine learning is the fact that a warehouse operating environment is constantly in flux, especially in today's direct-to-consumer facilities. These facilities must constantly balance the competing priorities of efficiency and responsiveness. At the same time, there are numerous potential constraints on warehouse operations, and it is difficult to predict under which circumstances a given function or resource may become a constraint on throughput. Predictability becomes especially difficult when a facility dynamically introduces orders into an existing workload. Machine learning's ability to adapt to changing conditions in complex environments means that it can produce insights that would not be possible with traditional software.
For example, Manhattan Associates utilizes machine learning within the Order Streaming component of its warehouse management system (WMS) to determine the amount of time required to complete a certain task in a given set of circumstances. The machine learning algorithm reviews past data including type of task, historic duration, and item characteristics. It then identifies which conditions will affect how long it takes to complete a task. The next time that task is assigned, the system can take those conditions into account when estimating how long it will take to complete the task.
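The article does not describe Manhattan's implementation in detail, but a hypothetical sketch of the general idea (learning task durations from historical records, then applying the model to the next assignment) might look like the following. The feature names, data, and scikit-learn model choice are assumptions for illustration.

```python
# Hypothetical sketch: learn how long tasks take from historical records, then
# estimate the duration of the next assignment. Features, data, and the model
# choice are illustrative assumptions, not Manhattan Associates' implementation.
import pandas as pd
from sklearn.ensemble import GradientBoostingRegressor

# Past tasks: conditions observed at execution time plus the actual duration (minutes).
history = pd.DataFrame({
    "task_type":     [0, 0, 1, 1, 0, 1],      # e.g., 0 = each pick, 1 = case pick
    "item_weight":   [2.0, 5.5, 40.0, 60.0, 1.2, 55.0],
    "travel_meters": [30, 80, 45, 120, 15, 90],
    "duration_min":  [1.5, 2.8, 3.1, 6.0, 0.9, 4.7],
})

features = ["task_type", "item_weight", "travel_meters"]
model = GradientBoostingRegressor().fit(history[features], history["duration_min"])

# The next time a similar task is assigned, take those conditions into account.
next_task = pd.DataFrame([{"task_type": 1, "item_weight": 50.0, "travel_meters": 100}])
print(f"estimated duration: {model.predict(next_task)[0]:.1f} minutes")
```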
As another example, JDA Software is exploring machine learning within its Luminate Warehouse Tasking application to simulate the correlations between multiple attributes (such as congestion and increasing/decreasing demand for a particular resource) and order processing times.
Figure 1 offers a conceptual illustration. One might assume that the primary factor affecting order processing time is the distance from the dispatch point to the pick point. However, the first chart in Figure 1 shows that an algorithm built on that assumption (shown by the orange line) predicts poorly for some of the picks. When the picks are divided into two subsets based on weight, the accuracy of the algorithm changes. Machine learning can recognize this degradation and create a new input-output relationship with more robust predictive power. For example, it may determine that distance to dispatch is the determining factor for items under 100 pounds, but that weight is the determining factor for items over 100 pounds.
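The same idea can be sketched in a few lines of Python: a single distance-only model is compared against separate models fitted above and below a 100-pound threshold. The data below are synthetic and the threshold comes from the example above, not from JDA's application.

```python
# Illustrative only: a single distance-based model degrades on heavy picks, so the
# data is split at a weight threshold and each subset gets its own predictor.
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(0)
n = 200
distance = rng.uniform(10, 200, n)    # meters from dispatch point to pick point
weight = rng.uniform(1, 300, n)       # pounds
# Synthetic truth: time tracks distance for light items but weight for heavy ones.
time = np.where(weight < 100, 0.05 * distance, 0.04 * weight) + rng.normal(0, 0.5, n)

def mae(model, X, y):
    return float(np.mean(np.abs(model.predict(X) - y)))

# One global model with distance as the only input.
global_model = LinearRegression().fit(distance.reshape(-1, 1), time)
print("global MAE:", round(mae(global_model, distance.reshape(-1, 1), time), 2))

# Split at 100 pounds and fit each subset on the factor that actually drives it.
light, heavy = weight < 100, weight >= 100
light_model = LinearRegression().fit(distance[light].reshape(-1, 1), time[light])
heavy_model = LinearRegression().fit(weight[heavy].reshape(-1, 1), time[heavy])
split_mae = (mae(light_model, distance[light].reshape(-1, 1), time[light]) * light.sum()
             + mae(heavy_model, weight[heavy].reshape(-1, 1), time[heavy]) * heavy.sum()) / n
print("split MAE:", round(split_mae, 2))   # noticeably lower than the global error
```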
Machine learning is also currently used in support of warehouse automation. RightPick, the piece-picking solution from RightHand Robotics, encounters a wide range of items and utilizes machine learning to improve its performance based on the prior experience of its robots. RightPick captures an abundance of data from its autonomous picks, such as what the robot saw (camera), what it did (including approach and pick method), and what happened (such as success, failure, or placement). This data then feeds convolutional neural networks that enable the robot to distinguish between adjacent items, which helps improve picking accuracy. The solution's software intelligence, driven by machine learning, is enabling the robots to pick 50 percent faster than they did a year prior, thanks to a higher pick-completion ratio and a shorter pick-attempt time. Knapp, an Austria-based warehouse automation provider, also applies machine learning to the piece-picking process. Machine learning supports Knapp's Pick-it-Easy Robot by identifying item shape and determining the best grip method and ideal grip point.
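The article does not describe RightHand Robotics' network architecture, but a toy convolutional classifier of the kind described above, sketched here in PyTorch, gives a sense of how image crops can be mapped to item identities; every detail below is an assumption for illustration.

```python
# Toy convolutional network: classify an image crop as one of N known items.
# Purely illustrative; not RightHand Robotics' actual model or training data.
import torch
import torch.nn as nn

class ItemClassifier(nn.Module):
    def __init__(self, num_items: int):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
        )
        self.head = nn.Linear(32 * 16 * 16, num_items)  # assumes 64x64 input crops

    def forward(self, x):
        x = self.features(x)
        return self.head(x.flatten(1))

model = ItemClassifier(num_items=10)
crop = torch.randn(1, 3, 64, 64)      # a single RGB crop around a candidate pick
print(model(crop).argmax(dim=1))      # index of the most likely item
```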
Transportation applications
Machine learning is also becoming increasingly important in transportation management and execution systems. The most notable application is generating a more informed and up-to-date estimated time of arrival (ETA) for shipments. Machine learning works alongside real-time visibility solutions to learn about constraints (such as capacity, regulations, and hours of service) and then uses that information to provide much better ETAs for shipments to warehouses, stores, and the end customer.
These ETA systems are using a variety of data streams. One emerging data stream involves using Internet of Things (IoT) data from trucks to get a better understanding of driver behavior, such as typical driving speeds and times, as well as how drivers operate in heavily congested areas. Trimble Transportation's True ETA application, for example, takes sensor data from trucks and incorporates hours-of-service rules to know when, where, and for how long a driver needs to stop. The application also understands that where and when the driver stops will have an impact on the ETA. This is especially true if drivers stop before a major city and will have to endure rush-hour traffic once they start driving again.
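As a rough illustration of how a rest requirement changes an arrival estimate, the sketch below projects an ETA while inserting a mandatory off-duty break whenever the driver runs out of allowable driving hours. The rule values and speed are simplified assumptions and are not meant to represent Trimble's logic or the full U.S. hours-of-service regulations.

```python
# Simplified sketch: project an arrival time, inserting a mandatory rest break
# when the driver exhausts the allowable driving hours before the destination.
# The limits below are illustrative, not the full hours-of-service regulations.
from datetime import datetime, timedelta

MAX_DRIVING_HOURS = 11     # simplified daily driving limit
REQUIRED_REST_HOURS = 10   # simplified off-duty break

def estimate_eta(depart: datetime, remaining_miles: float,
                 avg_speed_mph: float, hours_already_driven: float) -> datetime:
    eta = depart
    while True:
        hours_available = MAX_DRIVING_HOURS - hours_already_driven
        hours_needed = remaining_miles / avg_speed_mph
        if hours_needed <= hours_available:           # can finish in this driving window
            return eta + timedelta(hours=hours_needed)
        eta += timedelta(hours=hours_available)        # drive until the limit is hit
        remaining_miles -= hours_available * avg_speed_mph
        eta += timedelta(hours=REQUIRED_REST_HOURS)    # mandatory rest before continuing
        hours_already_driven = 0.0

# A driver 600 miles out who has already logged 4 driving hours today.
print(estimate_eta(datetime(2025, 1, 6, 8, 0), remaining_miles=600,
                   avg_speed_mph=55, hours_already_driven=4.0))
```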
Other data streams include port data; social, news, events, and weather (SNEW) data; and traffic data. Many transportation management system (TMS) providers are partnering with data aggregators such as FourKites, project44, 10-4 Systems, and others to use this data for improved ETAs. This data helps to develop forward-looking transportation plans. JDA is an example of a TMS provider that is bringing in multiple external data sources as part of transportation planning and execution. JDA uses these data streams to better understand potential disruptions in the travel time for shipments. Using machine learning, companies can make more resilient plans that can absorb disruption without major changes. An example is learning about the downstream effect that a late container at the port has on the overall transportation network and adjusting plans and ETAs accordingly. Most importantly, this information can help companies proactively communicate with customers when a disruption occurs.
Machine learning is playing a role in other aspects of transportation management as well. Companies buy a TMS to achieve freight savings by enabling network simulation and design, load consolidation, lower-cost mode selections, and multi-stop route optimization. Machine learning gives companies the ability to maintain high service levels while achieving these savings. Shippers can learn which carriers meet on-time service levels and which do not, which lanes typically carry a greater chance of delays, and whether there is an optimal number of stops before shipments become late. Machine learning can aid shippers in better understanding how to drive efficiencies without sacrificing service levels.
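A hedged illustration of this kind of service-level analysis: the snippet below computes on-time rates by carrier, lane, and stop count from a small, made-up shipment history.

```python
# Illustrative service-level analysis over made-up shipment history: which
# carriers hit on-time targets, which lanes run late, and how stop count matters.
import pandas as pd

shipments = pd.DataFrame({
    "carrier": ["A", "A", "B", "B", "B", "C", "C", "A"],
    "lane":    ["CHI-DAL", "CHI-DAL", "CHI-DAL", "ATL-MIA",
                "ATL-MIA", "ATL-MIA", "CHI-DAL", "ATL-MIA"],
    "stops":   [1, 3, 2, 1, 4, 2, 5, 1],
    "on_time": [1, 0, 1, 1, 0, 1, 0, 1],   # 1 = delivered on time
})

print(shipments.groupby("carrier")["on_time"].mean())  # carriers that meet service levels
print(shipments.groupby("lane")["on_time"].mean())     # lanes with a greater chance of delays
print(shipments.groupby("stops")["on_time"].mean())    # whether extra stops erode on-time delivery
```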
Supply chain software companies are in the early stages of learning how to incorporate these technologies into their solutions. The solutions available today will only continue to improve. When a shipper implements a machine-learning solution, its individual solution will improve over time as it accumulates more and more data. Additionally, some supply chain solutions are offered in a many-to-many cloud architecture. These solutions have the ability to improve based upon the data not just of one shipper, but of all the shippers that are using the solution.
The launch is based on “Amazon Nova,” the company’s new generation of foundation models, Amazon said in a blog post. Data scientists use foundation models (FMs) to develop machine learning (ML) platforms more quickly than they could from scratch; because FMs are trained on a broad spectrum of generalized data, the resulting artificial intelligence applications can perform a wide variety of general tasks, Amazon says.
The new models are integrated with Amazon Bedrock, a managed service that makes FMs from AI companies and Amazon available for use through a single API. Using Amazon Bedrock, customers can experiment with and evaluate Amazon Nova models, as well as other FMs, to determine the best model for an application.
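For a sense of what a single API means in practice, here is a brief sketch of calling a model through Amazon Bedrock with the AWS SDK for Python (boto3) and its Converse operation; the model ID, region, and prompt are illustrative, and the identifiers available vary by account and region.

```python
# Brief sketch of Amazon Bedrock's single-API pattern using boto3's Converse
# operation. The model ID, region, and prompt below are illustrative only.
import boto3

client = boto3.client("bedrock-runtime", region_name="us-east-1")

response = client.converse(
    modelId="amazon.nova-lite-v1:0",   # swapping this string targets a different FM
    messages=[{
        "role": "user",
        "content": [{"text": "Summarize this shipment exception in one sentence."}],
    }],
)
print(response["output"]["message"]["content"][0]["text"])
```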
Calling the launch “the next step in our AI journey,” the company says Amazon Nova has the ability to process text, image, and video as prompts, so customers can use Amazon Nova-powered generative AI applications to understand videos, charts, and documents, or to generate videos and other multimedia content.
“Inside Amazon, we have about 1,000 Gen AI applications in motion, and we’ve had a bird’s-eye view of what application builders are still grappling with,” Rohit Prasad, SVP of Amazon Artificial General Intelligence, said in a release. “Our new Amazon Nova models are intended to help with these challenges for internal and external builders, and provide compelling intelligence and content generation while also delivering meaningful progress on latency, cost-effectiveness, customization, information grounding, and agentic capabilities.”
The new Amazon Nova models available in Amazon Bedrock include:
Amazon Nova Micro, a text-only model that delivers the lowest latency responses at very low cost.
Amazon Nova Lite, a very low-cost multimodal model that is lightning fast for processing image, video, and text inputs.
Amazon Nova Pro, a highly capable multimodal model with the best combination of accuracy, speed, and cost for a wide range of tasks.
Amazon Nova Premier, the most capable of Amazon’s multimodal models for complex reasoning tasks and for use as the best teacher for distilling custom models.
Amazon Nova Canvas, a state-of-the-art image generation model.
Amazon Nova Reel, a state-of-the-art video generation model that can transform a single image input into a brief video using a text prompt such as “dolly forward.”
Economic activity in the logistics industry expanded in November, continuing a steady growth pattern that began earlier this year and signaling a return to seasonality after several years of fluctuating conditions, according to the latest Logistics Managers’ Index report (LMI), released today.
The November LMI registered 58.4, down slightly from October’s reading of 58.9, which was the highest level in two years. The LMI is a monthly gauge of business conditions across warehousing and logistics markets; a reading above 50 indicates growth and a reading below 50 indicates contraction.
“The overall index has been very consistent in the past three months, with readings of 58.6, 58.9, and 58.4,” LMI analyst Zac Rogers, associate professor of supply chain management at Colorado State University, wrote in the November LMI report. “This plateau is slightly higher than a similar plateau of consistency earlier in the year when May to August saw four readings between 55.3 and 56.4. Seasonally speaking, it is consistent that this later year run of readings would be the highest all year.”
Separately, Rogers said the end-of-year growth reflects the return to a healthy holiday peak, which started when inventory levels expanded in late summer and early fall as retailers began stocking up to meet consumer demand. Pandemic-driven shifts in consumer buying behavior, inflation, and economic uncertainty contributed to volatile peak season conditions over the past four years, with the LMI swinging from record-high growth in late 2020 and 2021 to slower growth in 2022 and contraction in 2023.
“The LMI contracted at this time a year ago, so basically [there was] no peak season,” Rogers said, citing inflation as a drag on demand. “To have a normal November … [really] for the first time in five years, justifies what we’ve seen all these companies doing—building up inventory in a sustainable, seasonal way.
“Based on what we’re seeing, a lot of supply chains called it right and were ready for [a] healthy holiday season, so far.”
The LMI has remained in the mid to high 50s range since January—with the exception of April, when the index dipped to 52.9—signaling strong and consistent demand for warehousing and transportation services.
The LMI is a monthly survey of logistics managers from across the country. It tracks industry growth overall and across eight areas: inventory levels and costs; warehousing capacity, utilization, and prices; and transportation capacity, utilization, and prices. The report is released monthly by researchers from Arizona State University, Colorado State University, Rochester Institute of Technology, Rutgers University, and the University of Nevada, Reno, in conjunction with the Council of Supply Chain Management Professionals (CSCMP).
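The report does not spell out the index arithmetic, but diffusion indexes of this kind are commonly computed as the share of respondents reporting growth plus half the share reporting no change; the short sketch below assumes that convention with made-up response counts.

```python
# Assumed convention for a diffusion index: percent reporting growth plus half
# the percent reporting no change. The response counts below are made up.
def diffusion_index(expanding: int, no_change: int, contracting: int) -> float:
    total = expanding + no_change + contracting
    return 100 * (expanding + 0.5 * no_change) / total

print(round(diffusion_index(45, 27, 28), 1))  # 58.5 -- above 50, so the component is growing
```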
In the survey, 48% of respondents identified rising tariffs and trade barriers as their top concern, followed by supply chain disruptions at 45% and geopolitical instability at 41%. Moreover, tariffs and trade barriers ranked as the priority issue regardless of company size: respondents at companies with fewer than 250 employees, 251-500, 501-1,000, 1,001-50,000, and 50,000+ employees all cited it as the most significant issue they are currently facing.
“Evolving tariffs and trade policies are one of a number of complex issues requiring organizations to build more resilience into their supply chains through compliance, technology and strategic planning,” Jackson Wood, Director, Industry Strategy at Descartes, said in a release. “With the potential for the incoming U.S. administration to impose new and additional tariffs on a wide variety of goods and countries of origin, U.S. importers may need to significantly re-engineer their sourcing strategies to mitigate potentially higher costs.”
Grocers and retailers are struggling to get their systems back online just before the winter holiday peak, following a software hack that hit the supply chain software provider Blue Yonder this week.
The ransomware attack is snarling inventory distribution patterns because of its impact on systems such as the employee scheduling system for coffee stalwart Starbucks, according to a published report. Scottsdale, Arizona-based Blue Yonder provides a wide range of supply chain software, including warehouse management system (WMS), transportation management system (TMS), order management and commerce, network and control tower, returns management, and others.
Blue Yonder today acknowledged the disruptions, saying they were the result of a ransomware incident affecting its managed services hosted environment. The company has established a dedicated cybersecurity incident update webpage to communicate its recovery progress, but it had not been updated for nearly two days as of Tuesday afternoon. “Since learning of the incident, the Blue Yonder team has been working diligently together with external cybersecurity firms to make progress in their recovery process. We have implemented several defensive and forensic protocols,” a Blue Yonder spokesperson said in an email.
The timing of the attack suggests that hackers may have targeted Blue Yonder in a calculated attack based on the upcoming Thanksgiving break, since many U.S. organizations downsize their security staffing on holidays and weekends, according to a statement from Dan Lattimer, VP of Semperis, a New Jersey-based computer and network security firm.
“While details on the specifics of the Blue Yonder attack are scant, it is yet another reminder [of] how damaging supply chain disruptions become when suppliers are taken offline. Kudos to Blue Yonder for dealing with this cyberattack head on, but we still don’t know how far-reaching the business disruptions will be in the UK, U.S. and other countries,” Lattimer said. “Now is [the] time for organizations to fight back against threat actors. Deciding whether or not to pay a ransom is a personal decision that each company has to make, but paying emboldens threat actors and throws more fuel onto an already burning inferno. Simply, it doesn’t pay-to-pay,” he said.
The incident closely followed an unrelated cybersecurity issue at the grocery giant Ahold Delhaize, which has been recovering from impacts to the Stop & Shop chain that it operates across the U.S. Northeast region. In a statement apologizing to customers for the inconvenience of the cybersecurity issue, Netherlands-based Ahold Delhaize said its top priority is the security of its customers, associates, and partners, and that the company’s internal IT security staff was working with external cybersecurity experts and law enforcement to speed recovery. “Our teams are taking steps to assess and mitigate the issue. This includes taking some systems offline to help protect them. This issue and subsequent mitigating actions have affected certain Ahold Delhaize USA brands and services including a number of pharmacies and certain e-commerce operations,” the company said.
Editor's note: This article was revised on November 27 to indicate that the cybersecurity issue at Ahold Delhaize was unrelated to the Blue Yonder hack.
The new funding brings Amazon's total investment in Anthropic to $8 billion, while maintaining the e-commerce giant’s position as a minority investor, according to Anthropic. The partnership was launched in 2023, when Amazon made its initial $4 billion investment in the firm.
Anthropic’s “Claude” family of AI assistant models is available on AWS’s Amazon Bedrock, which is a cloud-based managed service that lets companies build specialized generative AI applications by choosing from an array of foundation models (FMs) developed by AI providers like AI21 Labs, Anthropic, Cohere, Meta, Mistral AI, Stability AI, and Amazon itself.
According to Amazon, tens of thousands of customers, from startups to enterprises and government institutions, are currently running their generative AI workloads using Anthropic’s models in the AWS cloud. Those GenAI tools are powering tasks such as customer service chatbots, coding assistants, translation applications, drug discovery, engineering design, and complex business processes.
"The response from AWS customers who are developing generative AI applications powered by Anthropic in Amazon Bedrock has been remarkable," Matt Garman, AWS CEO, said in a release. "By continuing to deploy Anthropic models in Amazon Bedrock and collaborating with Anthropic on the development of our custom Trainium chips, we’ll keep pushing the boundaries of what customers can achieve with generative AI technologies. We’ve been impressed by Anthropic’s pace of innovation and commitment to responsible development of generative AI, and look forward to deepening our collaboration."