Commentary: Six best practices for new product introduction
The new product introduction process has not changed very much over the years. Here are six ways to make it more strategic, collaborative, and digital.
In the digital economy, products are becoming more complex, lifecycles are getting shorter, and market environments continue to evolve. While much has changed in new product introduction (NPI) over the years, the process that supports it hasn't.
Static solutions cannot keep up with dynamic conditions. LNS Research reveals that 91% of the market still uses spreadsheets and electronic documents to track product requirements. With innovation largely focused on the design and engineering phases of product development, the NPI process has been left behind.
Unexpected product delays and costs happen to the best of us. In 2012, Apple announced that it would make a Mac in the U.S., but when manufacturing began, the company struggled to find enough parts domestically. The unraveling began with a single part: a custom screw. Because the NPI team did not review the entire bill of material and confirm all sources of supply down to the screw level, the new Mac's launch was delayed for months, according to The New York Times.
To improve time to market, increase gross margins, and successfully navigate risk, it's time to overhaul our outdated approach to NPI. Change starts by making the process more strategic, collaborative, and digital with these six best practices.
1. Pre-mortem planning. Hindsight is 20/20, so a post-mortem exercise conducted after product release, with lessons learned informing a better path forward, is a recommended best practice. But we can benefit from this point of view sooner by conducting scenario planning earlier in the process, anticipating failures, and mitigating the probable sources. Which suppliers are at risk? Is manufacturing in an area subject to higher tariffs? Which expedited options are available if the schedule slips?
Pre-mortem planning is a powerful way to identify possible problems. Traditional challenges such as quality, risk, and forecast accuracy are amplified by shorter product lifecycles and unpredictable markets. Anticipating the unexpected minimizes the potential impact.
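One lightweight way to run the exercise, for teams that want something more structured than a spreadsheet, is a scored risk register. Below is a minimal sketch in Python; the scenarios, probabilities, and impact weights are purely hypothetical:

```python
# Minimal pre-mortem risk register: score hypothetical failure
# scenarios by likelihood x impact so the team can rank mitigations.
# All risks, probabilities, and impacts below are illustrative.

risks = [
    {"scenario": "Sole-source supplier slips delivery", "probability": 0.30, "impact": 9},
    {"scenario": "Tariff increase on key component",     "probability": 0.20, "impact": 6},
    {"scenario": "Custom part fails qualification",      "probability": 0.15, "impact": 8},
]

for risk in risks:
    risk["score"] = risk["probability"] * risk["impact"]

# Rank so the highest-exposure scenarios get mitigation plans first.
for risk in sorted(risks, key=lambda r: r["score"], reverse=True):
    print(f'{risk["scenario"]}: exposure = {risk["score"]:.2f}')
```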
2. Earlier sourcing input. By sourcing the bill of materials (BOM) right the first time, schedule delays, expensive redesigns, or additional charges from rush fees or tariffs are less likely to cause disruption at an inopportune time. Procurement can help optimize the build by providing sourcing recommendations including synergies with other products for deeper discounts, tariff implications, and shipping alternatives.
With an estimated 60-80% of a product's cost and risk determined at the design stage, late changes can be more expensive than necessary. By aligning processes, technologies, and metrics to bring sourcing knowledge closer to the point of design, trade-offs happen earlier, such as reducing material expenses and increasing parts reuse. The cost to qualify a supplier or shift to a new location is steep; engaging sourcing teams sooner will help NPI teams navigate supplier and part selection in a fluctuating global economy.
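To illustrate the kind of trade-off sourcing input can surface before the design locks, the sketch below compares total landed cost per unit across supplier quotes, folding in tariff and freight adders. All suppliers, prices, and rates here are invented for the example:

```python
# Compare landed cost per unit for one BOM line across supplier quotes.
# Quotes, tariff rates, and freight figures are hypothetical.

quotes = [
    {"supplier": "Supplier A (domestic)", "unit_price": 1.40, "tariff_rate": 0.00, "freight": 0.02},
    {"supplier": "Supplier B (offshore)", "unit_price": 0.95, "tariff_rate": 0.25, "freight": 0.08},
]

def landed_cost(q):
    # Landed cost = unit price + tariff on that price + per-unit freight.
    return q["unit_price"] * (1 + q["tariff_rate"]) + q["freight"]

# Rank quotes by true landed cost rather than sticker price.
for q in sorted(quotes, key=landed_cost):
    print(f'{q["supplier"]}: ${landed_cost(q):.2f} per unit')
```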
3. Target-based costing. Identifying the desired profit margin and maximum allowable cost to meet that margin upfront emphasizes cost in the design phase and distributes competitive pressure across the supply chain. When all team members understand targets and key budget drivers at the start, these factors will be included in decisions about the product from design to production and shipping. The upstream decisions made during NPI can significantly impact the timeframe and cost of a product during its lifecycle. When the product moves to manufacturing and ramps to target volumes, it is difficult to change plans, find new suppliers, or cut expenses.
Target-based costing was pioneered by the automotive industry and is a key reason that Japanese automakers gained a competitive advantage over their U.S. and European rivals. Integrating artificial intelligence (AI) into this methodology increases the speed and accuracy of projections when examining market conditions, reviewing target price against projected costs, and setting a target margin. Teams then work backwards to identify the constraints around allowable cost and drive strategies with suppliers on component-level target costing and alignment.
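The core arithmetic of target-based costing is straightforward: work backwards from the target price and margin to a maximum allowable cost, then allocate that budget across subsystems. A minimal sketch with hypothetical figures:

```python
# Target-based costing: work backwards from market price and margin
# to the maximum allowable product cost, then allocate it to components.
# Prices, margin, and allocation weights are hypothetical.

target_price = 499.00   # what the market will bear
target_margin = 0.40    # desired gross margin

allowable_cost = target_price * (1 - target_margin)
print(f"Allowable cost: ${allowable_cost:.2f}")  # $299.40

# Distribute the budget across subsystems by weight to set
# component-level cost targets for supplier negotiations.
allocation = {"electronics": 0.50, "enclosure": 0.20, "assembly": 0.20, "packaging": 0.10}
for subsystem, share in allocation.items():
    print(f"{subsystem}: ${allowable_cost * share:.2f}")
```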
4. Data integration. Here's where advanced technology and big data analytics can really make a difference. Distributed teams are trying to compare all the risks, costs, and trade-offs to select the right suppliers, locations, and alternate parts for more products in compressed timelines. Disparate tools, such as spreadsheets, make it difficult to understand the factors that can influence the full product lifecycle. Manual methods of managing information and tracking changes limit the ability to collaborate and scale.
Cognitive NPI solutions consolidate diverse data sources in a centralized repository, allowing AI to analyze the information. This outside-in view, which aggregates external data (such as market intelligence and raw material input costs) with internal data (such as demand forecasts and purchase history), yields new insights and additional visibility.
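In practice, consolidation often begins by joining internal and external feeds on a shared key such as a part number, so that analytics works from one enriched record. A sketch of that idea using pandas; the columns and values are illustrative, not a vendor's actual schema:

```python
# Join internal purchase history with external market data on part number
# so downstream analytics (or an AI model) works from one enriched view.
# All part numbers and values are illustrative.
import pandas as pd

internal = pd.DataFrame({
    "part": ["SCR-001", "PCB-114"],
    "avg_purchase_price": [0.12, 4.80],
    "forecast_qty": [500_000, 40_000],
})
external = pd.DataFrame({
    "part": ["SCR-001", "PCB-114"],
    "market_price_index": [1.08, 0.94],   # >1 means input costs rising
    "supply_risk": ["high", "low"],
})

combined = internal.merge(external, on="part")
# Flag parts where rising input costs meet high forecast exposure.
combined["watchlist"] = (combined["market_price_index"] > 1.0) & (combined["forecast_qty"] > 100_000)
print(combined)
```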
5. Cross-functional coordination. Additional efficiency in NPI requires cross-functional coordination between manufacturing, quality, marketing, packaging, customer service, regulatory, and sourcing departments. Systems with shared tools and automated processes are one way to improve team integration. These tools can also provide an audit trail so stakeholders can easily evaluate changes, fix errors, work on exceptions, answer questions, and make feature-cost tradeoffs to launch at the optimal level of risk.
6. Post-mortem analysis. The final step in the NPI process, post-mortem analysis identifies improvements for the future. Did pre-mortem planning predictions bear out to avoid pitfalls, or are there new scenarios to consider for next time? Again, procurement should be involved in this step to evaluate the impact of decisions on direct material cost and risk.
Post-mortem analysis also highlights opportunities to reevaluate the NPI process and implement innovation. Where are the bottlenecks? What steps can be streamlined or automated? Are the right technology solutions in place? There's very little AI-based decision-making within NPI today, which provides a massive opportunity for improved cross-functional collaboration and intelligence.
By following the steps laid out here and focusing on increasing planning and collaboration across functions, you can improve your new product introduction process and decrease the chances of following in Apple's footsteps. Hopefully, your next heavily anticipated product launch will not be delayed for want of a screw. Better data visibility and integration can go a long way toward enabling this planning and collaboration; fragmented data hinders your ability to optimize costs, improve supplier selection, and mitigate risk. Applying artificial intelligence to that data can help enterprises produce insights that accelerate time to market, improve gross margins, and decrease risk.
All of these efforts to strengthen the new product introduction process will help companies save time and money by increasing their ability to respond to ever-changing customer and market demand.
The launch is based on “Amazon Nova,” Amazon’s new generation of foundation models, the company said in a blog post. Data scientists use foundation models (FMs) to develop machine learning (ML) platforms more quickly than starting from scratch; because FMs are trained on a broad spectrum of generalized data, they enable artificial intelligence applications capable of performing a wide variety of general tasks, Amazon says.
The new models are integrated with Amazon Bedrock, a managed service that makes FMs from AI companies and Amazon available for use through a single API. Using Amazon Bedrock, customers can experiment with and evaluate Amazon Nova models, as well as other FMs, to determine the best model for an application.
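For developers, that single-API promise means switching among hosted models largely comes down to changing a model identifier. A minimal sketch using boto3’s Converse API; the Nova model ID shown is an assumption and should be verified against the Bedrock model catalog:

```python
# Invoke a Bedrock-hosted model through the unified Converse API.
# The model ID below is an assumed Nova identifier; verify against
# the Bedrock model catalog before use.
import boto3

client = boto3.client("bedrock-runtime", region_name="us-east-1")

response = client.converse(
    modelId="amazon.nova-lite-v1:0",  # assumed ID; swap to compare other FMs
    messages=[{"role": "user", "content": [{"text": "Summarize this shipment delay notice."}]}],
)
print(response["output"]["message"]["content"][0]["text"])
```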
Calling the launch “the next step in our AI journey,” the company says Amazon Nova has the ability to process text, image, and video as prompts, so customers can use Amazon Nova-powered generative AI applications to understand videos, charts, and documents, or to generate videos and other multimedia content.
“Inside Amazon, we have about 1,000 Gen AI applications in motion, and we’ve had a bird’s-eye view of what application builders are still grappling with,” Rohit Prasad, SVP of Amazon Artificial General Intelligence, said in a release. “Our new Amazon Nova models are intended to help with these challenges for internal and external builders, and provide compelling intelligence and content generation while also delivering meaningful progress on latency, cost-effectiveness, customization, information grounding, and agentic capabilities.”
The new Amazon Nova models available in Amazon Bedrock include:
Amazon Nova Micro, a text-only model that delivers the lowest latency responses at very low cost.
Amazon Nova Lite, a very low-cost multimodal model that is lightning fast for processing image, video, and text inputs.
Amazon Nova Pro, a highly capable multimodal model with the best combination of accuracy, speed, and cost for a wide range of tasks.
Amazon Nova Premier, the most capable of Amazon’s multimodal models for complex reasoning tasks and for use as the best teacher for distilling custom models.
Amazon Nova Canvas, a state-of-the-art image generation model.
Amazon Nova Reel, a state-of-the-art video generation model that can transform a single image input into a brief video from a natural-language prompt such as “dolly forward.”
Economic activity in the logistics industry expanded in November, continuing a steady growth pattern that began earlier this year and signaling a return to seasonality after several years of fluctuating conditions, according to the latest Logistics Managers’ Index report (LMI), released today.
The November LMI registered 58.4, down slightly from October’s reading of 58.9, which was the highest level in two years. The LMI is a monthly gauge of business conditions across warehousing and logistics markets; a reading above 50 indicates growth and a reading below 50 indicates contraction.
“The overall index has been very consistent in the past three months, with readings of 58.6, 58.9, and 58.4,” LMI analyst Zac Rogers, associate professor of supply chain management at Colorado State University, wrote in the November LMI report. “This plateau is slightly higher than a similar plateau of consistency earlier in the year when May to August saw four readings between 55.3 and 56.4. Seasonally speaking, it is consistent that this later year run of readings would be the highest all year.”
Separately, Rogers said the end-of-year growth reflects the return to a healthy holiday peak, which started when inventory levels expanded in late summer and early fall as retailers began stocking up to meet consumer demand. Pandemic-driven shifts in consumer buying behavior, inflation, and economic uncertainty contributed to volatile peak season conditions over the past four years, with the LMI swinging from record-high growth in late 2020 and 2021 to slower growth in 2022 and contraction in 2023.
“The LMI contracted at this time a year ago, so basically [there was] no peak season,” Rogers said, citing inflation as a drag on demand. “To have a normal November … [really] for the first time in five years, justifies what we’ve seen all these companies doing—building up inventory in a sustainable, seasonal way.
“Based on what we’re seeing, a lot of supply chains called it right and were ready for [a] healthy holiday season, so far.”
The LMI has remained in the mid to high 50s range since January—with the exception of April, when the index dipped to 52.9—signaling strong and consistent demand for warehousing and transportation services.
The LMI is a monthly survey of logistics managers from across the country. It tracks industry growth overall and across eight areas: inventory levels and costs; warehousing capacity, utilization, and prices; and transportation capacity, utilization, and prices. The report is released monthly by researchers from Arizona State University, Colorado State University, Rochester Institute of Technology, Rutgers University, and the University of Nevada, Reno, in conjunction with the Council of Supply Chain Management Professionals (CSCMP).
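Indexes of this kind are typically computed as diffusion indexes. The sketch below shows the standard formula, in which a reading above 50 means more respondents report growth than contraction; note that this is the generic calculation, and the LMI’s exact methodology may differ:

```python
# Standard diffusion-index calculation: percent of respondents reporting
# growth plus half the percent reporting no change. Values above 50
# indicate expansion. Response counts below are hypothetical.

def diffusion_index(up: int, same: int, down: int) -> float:
    total = up + same + down
    return 100 * (up + 0.5 * same) / total

# Example: 45 respondents see growth, 40 no change, 15 contraction.
print(f"{diffusion_index(45, 40, 15):.1f}")  # 65.0 -> expansion
```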
Specifically, 48% of respondents identified rising tariffs and trade barriers as their top concern, followed by supply chain disruptions at 45% and geopolitical instability at 41%. Moreover, tariffs and trade barriers ranked as the priority issue regardless of company size: respondents at companies with fewer than 250 employees, 251-500, 501-1,000, 1,001-50,000, and 50,000+ employees all cited it as the most significant issue they are currently facing.
“Evolving tariffs and trade policies are one of a number of complex issues requiring organizations to build more resilience into their supply chains through compliance, technology and strategic planning,” Jackson Wood, Director, Industry Strategy at Descartes, said in a release. “With the potential for the incoming U.S. administration to impose new and additional tariffs on a wide variety of goods and countries of origin, U.S. importers may need to significantly re-engineer their sourcing strategies to mitigate potentially higher costs.”
Grocers and retailers are struggling to get their systems back online just before the winter holiday peak, following a software hack that hit the supply chain software provider Blue Yonder this week.
The ransomware attack is snarling inventory distribution patterns and has affected systems such as the employee scheduling platform used by coffee stalwart Starbucks, according to a published report. Scottsdale, Arizona-based Blue Yonder provides a wide range of supply chain software, including warehouse management system (WMS), transportation management system (TMS), order management and commerce, network and control tower, and returns management applications.
Blue Yonder today acknowledged the disruptions, saying they were the result of a ransomware incident affecting its managed services hosted environment. The company has established a dedicated cybersecurity incident update webpage to communicate its recovery progress, but it had not been updated for nearly two days as of Tuesday afternoon. “Since learning of the incident, the Blue Yonder team has been working diligently together with external cybersecurity firms to make progress in their recovery process. We have implemented several defensive and forensic protocols,” a Blue Yonder spokesperson said in an email.
The timing of the attack suggests that hackers may have targeted Blue Yonder in a calculated attack based on the upcoming Thanksgiving break, since many U.S. organizations downsize their security staffing on holidays and weekends, according to a statement from Dan Lattimer, VP of Semperis, a New Jersey-based computer and network security firm.
“While details on the specifics of the Blue Yonder attack are scant, it is yet another reminder of how damaging supply chain disruptions become when suppliers are taken offline. Kudos to Blue Yonder for dealing with this cyberattack head on, but we still don’t know how far-reaching the business disruptions will be in the UK, U.S., and other countries,” Lattimer said. “Now is the time for organizations to fight back against threat actors. Deciding whether or not to pay a ransom is a personal decision that each company has to make, but paying emboldens threat actors and throws more fuel onto an already burning inferno. Simply, it doesn’t pay to pay,” he said.
The incident closely followed an unrelated cybersecurity issue at the grocery giant Ahold Delhaize, which has been recovering from impacts to the Stop & Shop chain that it operates across the U.S. Northeast region. In a statement apologizing to customers for the inconvenience of the cybersecurity issue, Netherlands-based Ahold Delhaize said its top priority is the security of its customers, associates, and partners, and that the company’s internal IT security staff was working with external cybersecurity experts and law enforcement to speed recovery. “Our teams are taking steps to assess and mitigate the issue. This includes taking some systems offline to help protect them. This issue and subsequent mitigating actions have affected certain Ahold Delhaize USA brands and services including a number of pharmacies and certain e-commerce operations,” the company said.
Editor's note: This article was revised on November 27 to indicate that the cybersecurity issue at Ahold Delhaize was unrelated to the Blue Yonder hack.
The new funding brings Amazon's total investment in Anthropic to $8 billion, while maintaining the e-commerce giant’s position as a minority investor, according to Anthropic. The partnership launched in 2023, when Amazon made its first $4 billion investment in the firm.
Anthropic’s “Claude” family of AI assistant models is available on AWS’s Amazon Bedrock, which is a cloud-based managed service that lets companies build specialized generative AI applications by choosing from an array of foundation models (FMs) developed by AI providers like AI21 Labs, Anthropic, Cohere, Meta, Mistral AI, Stability AI, and Amazon itself.
According to Amazon, tens of thousands of customers, from startups to enterprises and government institutions, are currently running their generative AI workloads using Anthropic’s models in the AWS cloud. Those GenAI tools are powering tasks such as customer service chatbots, coding assistants, translation applications, drug discovery, engineering design, and complex business processes.
"The response from AWS customers who are developing generative AI applications powered by Anthropic in Amazon Bedrock has been remarkable," Matt Garman, AWS CEO, said in a release. "By continuing to deploy Anthropic models in Amazon Bedrock and collaborating with Anthropic on the development of our custom Trainium chips, we’ll keep pushing the boundaries of what customers can achieve with generative AI technologies. We’ve been impressed by Anthropic’s pace of innovation and commitment to responsible development of generative AI, and look forward to deepening our collaboration."