Because it reveals hidden costs, the "unit total cost" approach to purchasing can help companies accurately evaluate the cost of doing business with individual suppliers.
Many companies want to create the optimum supply chain for their organizations. Often, however, they don't have the information they need to reach that goal. The problem is that optimization requires visibility into all supply chain costs—yet companies frequently limit their focus to supplier- and transportation-related expenses.
Such a limited view fails to consider costs that organizations absorb internally. This is especially true when it comes to purchasing products and services. Because these internal costs can significantly increase the total cost of supply, organizations should have a method for identifying and quantifying them; otherwise they are likely to make purchasing decisions based on incomplete information.
The first step in determining total supply chain costs is to create a context and a format for assembling all cost factors. It also is important to identify relevant internal "issues" and translate them into their dollar values. Examples of internal issues that could affect overall costs include physical occurrences, such as scrap rates, in-transit damage, or quality problems; administrative problems, such as incorrect invoices or recurrent expediting of shipments; and preferences, such as disadvantaged business status or a high cost to switch sources.
One proven method for evaluating both internal and external issues is "unit total cost" (UTC).
Unit total cost is defined as the unit purchase price amended by an appropriate monetary factor assigned to each issue. UTC creates a clearer picture of what a given source of supply actually costs an organization; it is most useful when selecting or negotiating with suppliers. It also provides a context for all stakeholders to see the total picture of their organization's costs.
In this article, we will outline how to calculate and apply unit total cost, step by step.
Five-step framework
Five basic steps provide a framework for successfully applying unit total cost in purchasing decisions:
1. Identify all of the total-cost factors that are important to the organization. This is best done by inviting all stakeholders to identify cost elements and other issues of interest to them regarding the source of supply that is under evaluation.
2. Develop a "price adder" formula that will translate each total-cost factor into dollars based on its perceived level of importance.
3. Add to each supplier's quoted price a debit (or credit) for each total-cost factor, appropriate to that supplier's performance in relation to those factors.
4. Add together the quoted price plus all total-cost factors to get unit total cost.
5. Award the business to the supplier with the lowest unit total cost.
People who are involved along the supply path are stakeholders who have a vested interest in the choice of a product or supplier, and they should be invited to participate in determining the total cost of a given source. They participate by identifying issues of concern and potential cost drivers in their particular areas. Stakeholders need and usually want to be involved so they can champion their own issues, and by participating in this process they will also learn about other groups' issues. This cross-functional sharing of information is one of the benefits of the UTC approach. Before applying unit total cost, therefore, it's important to get stakeholders and management to agree on how to treat issues that are likely to come up repeatedly.
Once agreement has been reached, select a purchased item as the first total-cost candidate and begin the process of identifying total-cost factors. Be sure to flush all relevant issues out into the open by mapping the processes associated with that item as far upstream and downstream as possible—not only its physical movement but also administrative processes such as purchasing and accounts payable. Document that flow and make notes, including the name of a key stakeholder for each step along the way. Ask these people what issues affect them relative to the item being evaluated. Then map the process flow and its associated issues. Figure 1 shows a sample process map with stakeholders and the concerns they identified.
Next, quantify the costs associated with those issues. These can be sorted into two groups: "hard costs" and "soft costs." Hard costs are those for which there is an invoice or a direct cash outlay, such as freight payments or inventory. Soft costs consume resources but have no direct cash outlay; they measure productivity. An example is the cost to an organization of lost time caused by correcting errors or expediting shipments. Time has value, and time that is repeatedly lost is a cost the organization can potentially recover.
Some organizations may be hesitant to include productivity losses in their calculations of supply chain costs, but it is important to understand that those losses are indeed costly. They can be kept separate from hard costs but they should be considered.
Monetary measures
Once the question of how to handle soft costs has been resolved, sort the cost factors that have been identified into three categories for inclusion in UTC calculations:
Cost factors: Hard costs that have already been quantified.
Performance factors: The cost to the organization of a supplier's failure to perform.
Policy factors: Issues of policy, preference, and all other issues that are not data-related.
Although the costs for each of these categories are calculated differently, money is a good common denominator. When every issue is measured in terms of its financial impact, all of those factors can be added together to determine the total cost of doing business with a particular supplier.
Let's look at each of the three categories.
Cost factors. Cost factors are the easiest to calculate because they already are measured monetarily. The calculation converts costs into their per-unit equivalent. For each factor, divide the total cost by the number of units over which it applies. Since each occurrence may not be identical, calculate the data for several typical examples and average them to derive a mean cost. Then use this mean cost in total-cost calculations.
Take transportation as an example. To calculate the per-unit freight cost, divide the total freight charges incurred over a period of time by the total number of units shipped during that period. Using the total charges over a period of time yields an automatic averaging of the costs.
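The per-unit conversion described above can be sketched in a few lines of Python. All figures here are hypothetical, used only to illustrate the arithmetic:

```python
def per_unit_cost(total_cost, units):
    """Convert a hard cost into its per-unit equivalent."""
    return total_cost / units

# Freight example: dividing total charges over a period by total units
# shipped in that period yields an automatic averaging of the costs.
freight_per_unit = per_unit_cost(12_500.00, 5_000)  # $2.50 per unit

# Where occurrences vary, average several typical examples to get a mean cost:
damage_costs = [340.00, 410.00, 375.00]   # cost of three typical damage incidents
units_affected = [100, 100, 100]          # units covered by each incident
mean_damage_per_unit = sum(damage_costs) / sum(units_affected)  # $3.75 per unit
```

A cost saving would enter the same way, but as a negative per-unit amount subtracted from the total.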
Cost factors should also include areas where suppliers save money. Cost savings can be included in UTC as subtractions from the total cost. Figure 2 provides examples of cost-factor calculations.
Performance factors. Calculating the cost impact of anything classified as a performance issue requires data on the actual level of performance, which usually is measured as a percentage. Commonly measured performance factors include on-time delivery, product quality, and lead time.
Unlike cost factors, performance factors do not have costs directly associated with them, so a "price adder" formula must be established for each one. Two criteria apply:
1. The formula must be relevant to the factor in question and easy to calculate.
2. The formula must be applicable to multiple suppliers so that it can be used to differentiate their performance.
Performance factors can be calculated exactly or approximately. Approximations that meet the above criteria are valid. They also are much easier: Simply use the nonperformance percentage as a price adder. For example, if a supplier has an on-time record of 89 percent, then it is not on time 11 percent of the time, and 11 percent would be added to its quoted price.
Several methods of measuring quality have been used successfully. Some organizations have calculated the actual cost of dealing with nonconforming material. Others apply a price adder based on the percentage of products with unacceptable quality.
Long lead times limit flexibility and may drive high levels of inventory. If lead time matters (and it should!), then a "tax" on lead time is a relevant price adder. One percent for each week of a supplier's quoted lead time is certainly within the realm of reason. The more important lead-time reduction is, the higher the tax should be.
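The two approximations just described, using the nonperformance percentage and a lead-time "tax" as price adders, might look like this in Python. The figures and the 1-percent-per-week tax rate are illustrative only:

```python
def performance_adder(quoted_price, on_time_rate, lead_time_weeks,
                      tax_per_week=0.01):
    """Price adder for performance factors: the nonperformance percentage
    plus a 'tax' of tax_per_week for each week of quoted lead time."""
    nonperformance = 1.0 - on_time_rate        # 89% on time -> 11% adder
    lead_time_tax = lead_time_weeks * tax_per_week
    return quoted_price * (nonperformance + lead_time_tax)

# Supplier quoting $10.00 per unit, 89% on-time, 6-week lead time:
# 10.00 * (0.11 + 0.06) -> roughly a $1.70 per-unit debit.
adder = performance_adder(10.00, 0.89, 6)
```

Raising `tax_per_week` expresses how much the organization values lead-time reduction.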
Examples of three types of performance calculations are shown in Figure 3.
Policy factors. Policy factors can be applied either as a credit to suppliers that comply with the policy in question or a tax on those that don't. Policy issues often are subjective and normally are not considered in monetary terms. Examples include stakeholder preferences, risk management issues, disadvantaged business status, and social responsibility commitments.
Because policy factors often are not quantified, the sponsors who put them on the list of issues need to provide a statement of value for each one. The purpose of this statement is to develop numeric data that can be used in total-cost calculations. This is not necessary for other types of cost factors because they already have quantifiable data.
The sponsor defines the boundary between policy-related costs that are allowable and those that are too high by answering this question: "How much more are you willing to pay to give preference to a supplier who incorporates your issue in its business activities over a supplier that does not?" Please note that this is a boundary definition only and does not necessarily mean that prices will change in actual practice. Once such a boundary has been defined, suppliers that comply with the relevant policy can be credited up to the established limit. An example of policy-factor calculations is shown in Figure 4.
Examples of policy issues that some organizations have chosen to address include:
Consensual reciprocity: Companies that want to favor other businesses for various reasons may want to include these as factors when selecting suppliers. For instance, companies that do business in countries that require a certain level of local content may give local suppliers a policy edge.
Contributions: If a nonprofit organization would like to favor donors when it awards business, unit total cost will allow policy makers to decide how much more they are willing to pay in order to give the business to a donor rather than to a company that is not a donor but whose products or services may be cheaper.
Recycled content: As environmental awareness increases, recycled and recyclable materials are becoming more important to many companies. To include this issue in UTC, establish an appropriate percentage credit for recycled content. (Credits of 5 percent to 10 percent are common.)
Heroics: If a supplier has put forth extraordinary effort to support a customer in times of distress, the customer may wish to reward that company by giving it preference.
Design support: Development groups such as research or engineering often push for favoring suppliers that have supported research efforts with fast prototyping and other special services.
Some policy factors—for example, disadvantaged business status—can be considered on a yes-or-no basis when definitions are clear and either a business fits a category or it doesn't. For this type of factor, credit a qualifying supplier up to the boundary limit.
Other policy factors—heroics, for example—are variable in nature. Depending on how a company has defined "heroic," a supplier might meet those criteria more than once. For this type of factor, develop an agreed-upon amount of credit per occurrence. Multiply that amount by the number of times a supplier has met the criteria to determine the total credit.
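The two kinds of policy credit, yes-or-no and per-occurrence, can be sketched as follows. The dollar amounts are hypothetical, and capping the variable credit at the sponsor's boundary limit is an assumption drawn from the rule that suppliers are credited only "up to the established limit":

```python
def policy_credit(compliant, boundary_credit):
    """Yes-or-no policy factor: credit a qualifying supplier
    up to the boundary limit, zero otherwise."""
    return boundary_credit if compliant else 0.0

def variable_policy_credit(occurrences, credit_per_occurrence, boundary_credit):
    """Variable policy factor (e.g., 'heroics'): an agreed credit per
    occurrence, assumed here to be capped at the boundary limit."""
    return min(occurrences * credit_per_occurrence, boundary_credit)

# Disadvantaged business status, worth up to $0.50 per unit:
status_credit = policy_credit(True, 0.50)
# Three 'heroic' events at $0.05 per unit each, capped at $0.10:
heroics_credit = variable_policy_credit(3, 0.05, 0.10)
```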
After all of the factors in all three categories have been calculated, add them into the unit price of the item to get the unit total cost, and use the resulting totals to make business decisions. Figure 5 provides an example of unit total cost calculations.
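Putting the pieces together, the final summation is simple: start from the quoted price, add each per-unit debit, and subtract each credit. A minimal sketch, with hypothetical per-unit figures:

```python
def unit_total_cost(quoted_price, cost_factors=(), performance_adders=(),
                    policy_credits=()):
    """Unit total cost: quoted price plus per-unit cost and performance
    debits, minus per-unit policy credits."""
    return (quoted_price
            + sum(cost_factors)
            + sum(performance_adders)
            - sum(policy_credits))

# Supplier A: $10.00 quote, $0.40 freight, $1.10 on-time adder,
# $0.50 policy credit -> roughly $11.00 unit total cost.
utc_a = unit_total_cost(10.00, [0.40], [1.10], [0.50])
```

Computing this for each candidate supplier and awarding the business to the lowest result completes the five-step framework.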
Making healthier choices
Applying the unit total cost approach to supplier evaluation will provide benefits both within and outside the organization. Internally, using total-cost calculations to select and manage suppliers allows all cost factors, not just the obvious ones, to be included. UTC will make clear the justifications for choosing a particular supplier—especially when it is not the lowest-priced supplier—while teaching the whole organization about where and how costs are incurred. It forces policy makers to decide what they value and how much they are willing to pay to exercise those values. Moreover, it gets the monkey off supply management's back when it comes to deciding how to treat issues that are not monetary in nature.
Externally, companies can share UTC calculations with suppliers to provide a powerful message about what they have to do to earn their business. When suppliers are shown those costs, they can clearly see which factors drag them down and by how much, and they can see how improvement would affect their competitiveness. In response, they could choose to slash prices (a choice that might be hazardous to their long-term health), and/or they could work on improving performance to meet their customers' needs (a much healthier choice). In short, they could see exactly how performance improvement affects their "bottom line" from their customers' point of view.
Supply chain costs include many factors beyond logistics, transportation, and physical handling. When examining a supply chain for ways to reduce expenses and control costs, managers should look carefully within their own organizations. In many cases, internal issues offer more opportunities for improvement than do external issues.
It is important to have both the knowledge to recognize those opportunities and the analytical skills to capitalize on them. Unit total cost is a simple but valuable tool that can help supply chain managers achieve both of those objectives.
The launch is based on "Amazon Nova," the company's new generation of foundation models, the company said in a blog post. Data scientists use foundation models (FMs) to develop machine learning (ML) platforms more quickly than they could by starting from scratch. Because FMs are trained on a broad spectrum of generalized data, they allow developers to create artificial intelligence applications capable of performing a wide variety of general tasks, Amazon says.
The new models are integrated with Amazon Bedrock, a managed service that makes FMs from AI companies and Amazon available for use through a single API. Using Amazon Bedrock, customers can experiment with and evaluate Amazon Nova models, as well as other FMs, to determine the best model for an application.
Calling the launch “the next step in our AI journey,” the company says Amazon Nova has the ability to process text, image, and video as prompts, so customers can use Amazon Nova-powered generative AI applications to understand videos, charts, and documents, or to generate videos and other multimedia content.
“Inside Amazon, we have about 1,000 Gen AI applications in motion, and we’ve had a bird’s-eye view of what application builders are still grappling with,” Rohit Prasad, SVP of Amazon Artificial General Intelligence, said in a release. “Our new Amazon Nova models are intended to help with these challenges for internal and external builders, and provide compelling intelligence and content generation while also delivering meaningful progress on latency, cost-effectiveness, customization, information grounding, and agentic capabilities.”
The new Amazon Nova models available in Amazon Bedrock include:
Amazon Nova Micro, a text-only model that delivers the lowest latency responses at very low cost.
Amazon Nova Lite, a very low-cost multimodal model that is lightning fast for processing image, video, and text inputs.
Amazon Nova Pro, a highly capable multimodal model with the best combination of accuracy, speed, and cost for a wide range of tasks.
Amazon Nova Premier, the most capable of Amazon's multimodal models for complex reasoning tasks and for use as the best teacher for distilling custom models.
Amazon Nova Canvas, a state-of-the-art image generation model.
Amazon Nova Reel, a state-of-the-art video generation model that can transform a single image input into a brief video using a natural-language prompt such as "dolly forward."
Economic activity in the logistics industry expanded in November, continuing a steady growth pattern that began earlier this year and signaling a return to seasonality after several years of fluctuating conditions, according to the latest Logistics Managers’ Index report (LMI), released today.
The November LMI registered 58.4, down slightly from October’s reading of 58.9, which was the highest level in two years. The LMI is a monthly gauge of business conditions across warehousing and logistics markets; a reading above 50 indicates growth and a reading below 50 indicates contraction.
“The overall index has been very consistent in the past three months, with readings of 58.6, 58.9, and 58.4,” LMI analyst Zac Rogers, associate professor of supply chain management at Colorado State University, wrote in the November LMI report. “This plateau is slightly higher than a similar plateau of consistency earlier in the year when May to August saw four readings between 55.3 and 56.4. Seasonally speaking, it is consistent that this later year run of readings would be the highest all year.”
Separately, Rogers said the end-of-year growth reflects the return to a healthy holiday peak, which started when inventory levels expanded in late summer and early fall as retailers began stocking up to meet consumer demand. Pandemic-driven shifts in consumer buying behavior, inflation, and economic uncertainty contributed to volatile peak season conditions over the past four years, with the LMI swinging from record-high growth in late 2020 and 2021 to slower growth in 2022 and contraction in 2023.
“The LMI contracted at this time a year ago, so basically [there was] no peak season,” Rogers said, citing inflation as a drag on demand. “To have a normal November … [really] for the first time in five years, justifies what we’ve seen all these companies doing—building up inventory in a sustainable, seasonal way.
“Based on what we’re seeing, a lot of supply chains called it right and were ready for healthy holiday season, so far.”
The LMI has remained in the mid to high 50s range since January—with the exception of April, when the index dipped to 52.9—signaling strong and consistent demand for warehousing and transportation services.
The LMI is a monthly survey of logistics managers from across the country. It tracks industry growth overall and across eight areas: inventory levels and costs; warehousing capacity, utilization, and prices; and transportation capacity, utilization, and prices. The report is released monthly by researchers from Arizona State University, Colorado State University, Rochester Institute of Technology, Rutgers University, and the University of Nevada, Reno, in conjunction with the Council of Supply Chain Management Professionals (CSCMP).
Specifically, 48% of respondents identified rising tariffs and trade barriers as their top concern, followed by supply chain disruptions at 45% and geopolitical instability at 41%. Moreover, tariffs and trade barriers ranked as the priority issue regardless of company size, as respondents at companies with less than 250 employees, 251-500, 501-1,000, 1,001-50,000 and 50,000+ employees all cited it as the most significant issue they are currently facing.
“Evolving tariffs and trade policies are one of a number of complex issues requiring organizations to build more resilience into their supply chains through compliance, technology and strategic planning,” Jackson Wood, Director, Industry Strategy at Descartes, said in a release. “With the potential for the incoming U.S. administration to impose new and additional tariffs on a wide variety of goods and countries of origin, U.S. importers may need to significantly re-engineer their sourcing strategies to mitigate potentially higher costs.”
Grocers and retailers are struggling to get their systems back online just before the winter holiday peak, following a software hack that hit the supply chain software provider Blue Yonder this week.
The ransomware attack is snarling inventory distribution patterns because of its impact on systems such as the employee scheduling system for coffee stalwart Starbucks, according to a published report. Scottsdale, Arizona-based Blue Yonder provides a wide range of supply chain software, including warehouse management system (WMS), transportation management system (TMS), order management and commerce, network and control tower, returns management, and others.
Blue Yonder today acknowledged the disruptions, saying they were the result of a ransomware incident affecting its managed services hosted environment. The company has established a dedicated cybersecurity incident update webpage to communicate its recovery progress, but it had not been updated for nearly two days as of Tuesday afternoon. “Since learning of the incident, the Blue Yonder team has been working diligently together with external cybersecurity firms to make progress in their recovery process. We have implemented several defensive and forensic protocols,” a Blue Yonder spokesperson said in an email.
The timing of the attack suggests that hackers may have targeted Blue Yonder in a calculated attack based on the upcoming Thanksgiving break, since many U.S. organizations downsize their security staffing on holidays and weekends, according to a statement from Dan Lattimer, VP of Semperis, a New Jersey-based computer and network security firm.
“While details on the specifics of the Blue Yonder attack are scant, it is yet another reminder how damaging supply chain disruptions become when suppliers are taken offline. Kudos to Blue Yonder for dealing with this cyberattack head on but we still don’t know how far reaching the business disruptions will be in the UK, U.S. and other countries,” Lattimer said. “Now is time for organizations to fight back against threat actors. Deciding whether or not to pay a ransom is a personal decision that each company has to make, but paying emboldens threat actors and throws more fuel onto an already burning inferno. Simply, it doesn’t pay-to-pay,” he said.
The incident closely followed an unrelated cybersecurity issue at the grocery giant Ahold Delhaize, which has been recovering from impacts to the Stop & Shop chain that it operates across the U.S. Northeast region. In a statement apologizing to customers for the inconvenience of the cybersecurity issue, Netherlands-based Ahold Delhaize said its top priority is the security of its customers, associates and partners, and that the company's internal IT security staff was working with external cybersecurity experts and law enforcement to speed recovery. "Our teams are taking steps to assess and mitigate the issue. This includes taking some systems offline to help protect them. This issue and subsequent mitigating actions have affected certain Ahold Delhaize USA brands and services including a number of pharmacies and certain e-commerce operations," the company said.
Editor's note: This article was revised on November 27 to indicate that the cybersecurity issue at Ahold Delhaize was unrelated to the Blue Yonder hack.
The new funding brings Amazon's total investment in Anthropic to $8 billion, while maintaining the e-commerce giant’s position as a minority investor, according to Anthropic. The partnership was launched in 2023, when Amazon invested its first $4 billion round in the firm.
Anthropic’s “Claude” family of AI assistant models is available on AWS’s Amazon Bedrock, which is a cloud-based managed service that lets companies build specialized generative AI applications by choosing from an array of foundation models (FMs) developed by AI providers like AI21 Labs, Anthropic, Cohere, Meta, Mistral AI, Stability AI, and Amazon itself.
According to Amazon, tens of thousands of customers, from startups to enterprises and government institutions, are currently running their generative AI workloads using Anthropic’s models in the AWS cloud. Those GenAI tools are powering tasks such as customer service chatbots, coding assistants, translation applications, drug discovery, engineering design, and complex business processes.
"The response from AWS customers who are developing generative AI applications powered by Anthropic in Amazon Bedrock has been remarkable," Matt Garman, AWS CEO, said in a release. "By continuing to deploy Anthropic models in Amazon Bedrock and collaborating with Anthropic on the development of our custom Trainium chips, we’ll keep pushing the boundaries of what customers can achieve with generative AI technologies. We’ve been impressed by Anthropic’s pace of innovation and commitment to responsible development of generative AI, and look forward to deepening our collaboration."