Ben Ames has spent 20 years as a journalist since starting out as a daily newspaper reporter in Pennsylvania in 1995. From 1999 forward, he has focused on business and technology reporting for a number of trade journals, beginning when he joined Design News and Modern Materials Handling magazines. Ames is author of the trail guide "Hiking Massachusetts" and is a graduate of the Columbia School of Journalism.
The 5,500-mile pipeline was turned off on Friday after Atlanta-based Colonial Pipeline Co. said it discovered that hackers had installed ransomware in its systems. In response, the company halted all pipeline operations, hired a third-party cybersecurity firm, and contacted law enforcement and other federal agencies.
By Sunday night, the company had restarted some “smaller lateral lines between terminals and delivery points,” but said its four main lines remained switched off. As IT experts continue efforts to repair the computer damage, the company on Monday said it was looking for ways to restore oil flows.
“We continue to evaluate product inventory in storage tanks at our facilities and others along our system and are working with our shippers to move this product to terminals for local delivery,” Colonial Pipeline said in a release. “Actions taken by the Federal Government to issue a temporary hours of service exemption for motor carriers and drivers transporting refined products across Colonial’s footprint should help alleviate local supply disruptions and we thank our government partners for their assistance in resolving this matter.”
The shutdown is significant because Colonial Pipeline says its system delivers some 45% of total fuel consumed on the East Coast.
By temporarily lifting its caps on truck drivers’ hours behind the wheel, the Federal Motor Carrier Safety Administration (FMCSA) could help the transportation industry use trucks to work around the frozen pipeline. The exemption applies to drivers hauling certain refined petroleum products in Alabama, Arkansas, District of Columbia, Delaware, Florida, Georgia, Kentucky, Louisiana, Maryland, Mississippi, New Jersey, New York, North Carolina, Pennsylvania, South Carolina, Tennessee, Texas, and Virginia.
According to the load board network operator DAT Freight & Analytics, spot truckload rates remained near all-time highs during the week ending May 3, just a year after they bottomed out during pandemic business and travel closures.
The Portland, Oregon-based firm’s statistics did not cover tanker trucks, but it said the seven-day average line-haul rate (excluding fuel surcharges) for dry vans was $2.27 a mile last week, 95 cents higher than in the same period a year ago. Likewise, spot refrigerated freight averaged $2.61 per mile (up 94 cents), and the average flatbed rate was $2.62 per mile (up 93 cents).
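To put those cent-per-mile increases in percentage terms, a back-of-the-envelope calculation using only the figures DAT reported looks like the quick sketch below (it is not DAT’s own methodology):

    # Year-over-year spot rate arithmetic based on the DAT figures cited above:
    # current seven-day average line-haul rates ($/mile) and their YoY increases.
    rates = {
        "dry van":      {"current": 2.27, "increase": 0.95},
        "refrigerated": {"current": 2.61, "increase": 0.94},
        "flatbed":      {"current": 2.62, "increase": 0.93},
    }

    for mode, r in rates.items():
        year_ago = r["current"] - r["increase"]      # implied rate one year earlier
        pct_change = r["increase"] / year_ago * 100  # year-over-year percentage gain
        print(f"{mode}: ${r['current']:.2f}/mi now vs. ${year_ago:.2f}/mi a year ago (+{pct_change:.0f}%)")

By that arithmetic, dry van spot rates ran roughly 70% above their year-ago level, with refrigerated and flatbed rates up by more than half.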
The launch is based on “Amazon Nova,” the company’s new generation of foundation models, Amazon said in a blog post. Because foundation models (FMs) are trained on a broad spectrum of generalized data, data scientists can use them to develop machine learning (ML) platforms much more quickly than starting from scratch, building artificial intelligence applications capable of performing a wide variety of general tasks, the company says.
The new models are integrated with Amazon Bedrock, a managed service that makes FMs from AI companies and Amazon available for use through a single API. Using Amazon Bedrock, customers can experiment with and evaluate Amazon Nova models, as well as other FMs, to determine the best model for an application.
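As a rough illustration of that “single API” point, the sketch below calls a Nova model through Bedrock’s Converse API using the AWS SDK for Python (boto3); the model ID, region, and prompt are illustrative assumptions, and access to the model must already be enabled on the account.

    # Minimal sketch: invoking an Amazon Nova model via the Bedrock Converse API.
    # The model ID and region are assumptions; any Bedrock-hosted FM the account
    # can access could be substituted by changing the modelId string.
    import boto3

    client = boto3.client("bedrock-runtime", region_name="us-east-1")

    response = client.converse(
        modelId="amazon.nova-lite-v1:0",  # assumed ID; swap in another FM to compare
        messages=[{
            "role": "user",
            "content": [{"text": "Summarize this carrier delay notice in two sentences."}],
        }],
        inferenceConfig={"maxTokens": 300, "temperature": 0.2},
    )

    print(response["output"]["message"]["content"][0]["text"])

Because the Converse call shape stays the same regardless of which model the modelId points to, comparing models to determine the best fit for an application does not require rewriting application code.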
Calling the launch “the next step in our AI journey,” the company says Amazon Nova has the ability to process text, image, and video as prompts, so customers can use Amazon Nova-powered generative AI applications to understand videos, charts, and documents, or to generate videos and other multimedia content.
“Inside Amazon, we have about 1,000 Gen AI applications in motion, and we’ve had a bird’s-eye view of what application builders are still grappling with,” Rohit Prasad, SVP of Amazon Artificial General Intelligence, said in a release. “Our new Amazon Nova models are intended to help with these challenges for internal and external builders, and provide compelling intelligence and content generation while also delivering meaningful progress on latency, cost-effectiveness, customization, information grounding, and agentic capabilities.”
The new Amazon Nova models available in Amazon Bedrock include:
Amazon Nova Micro, a text-only model that delivers the lowest latency responses at very low cost.
Amazon Nova Lite, a very low-cost multimodal model that is lightning fast for processing image, video, and text inputs.
Amazon Nova Pro, a highly capable multimodal model with the best combination of accuracy, speed, and cost for a wide range of tasks.
Amazon Nova Premier, the most capable of Amazon’s multimodal models for complex reasoning tasks and for use as the best teacher for distilling custom models.
Amazon Nova Canvas, a state-of-the-art image generation model.
Amazon Nova Reel, a state-of-the-art video generation model that can transform a single image input into a brief video, guided by a natural-language prompt such as “dolly forward.”
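For the multimodal models in the list above, the same Converse API accepts image (and video) content blocks alongside text. The fragment below is a minimal sketch of an image-understanding call; the model ID and file name are assumptions, not details from Amazon’s announcement.

    # Minimal sketch: passing an image plus a text prompt to a multimodal Nova
    # model through the Bedrock Converse API. Model ID and file name are
    # illustrative assumptions.
    import boto3

    client = boto3.client("bedrock-runtime", region_name="us-east-1")

    with open("inventory_chart.png", "rb") as f:  # hypothetical local image file
        image_bytes = f.read()

    response = client.converse(
        modelId="amazon.nova-lite-v1:0",  # assumed ID for a multimodal Nova model
        messages=[{
            "role": "user",
            "content": [
                {"image": {"format": "png", "source": {"bytes": image_bytes}}},
                {"text": "What trend does this chart show?"},
            ],
        }],
    )

    print(response["output"]["message"]["content"][0]["text"])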
Economic activity in the logistics industry expanded in November, continuing a steady growth pattern that began earlier this year and signaling a return to seasonality after several years of fluctuating conditions, according to the latest Logistics Managers’ Index report (LMI), released today.
The November LMI registered 58.4, down slightly from October’s reading of 58.9, which was the highest level in two years. The LMI is a monthly gauge of business conditions across warehousing and logistics markets; a reading above 50 indicates growth and a reading below 50 indicates contraction.
“The overall index has been very consistent in the past three months, with readings of 58.6, 58.9, and 58.4,” LMI analyst Zac Rogers, associate professor of supply chain management at Colorado State University, wrote in the November LMI report. “This plateau is slightly higher than a similar plateau of consistency earlier in the year when May to August saw four readings between 55.3 and 56.4. Seasonally speaking, it is consistent that this later year run of readings would be the highest all year.”
Separately, Rogers said the end-of-year growth reflects the return to a healthy holiday peak, which started when inventory levels expanded in late summer and early fall as retailers began stocking up to meet consumer demand. Pandemic-driven shifts in consumer buying behavior, inflation, and economic uncertainty contributed to volatile peak season conditions over the past four years, with the LMI swinging from record-high growth in late 2020 and 2021 to slower growth in 2022 and contraction in 2023.
“The LMI contracted at this time a year ago, so basically [there was] no peak season,” Rogers said, citing inflation as a drag on demand. “To have a normal November … [really] for the first time in five years, justifies what we’ve seen all these companies doing—building up inventory in a sustainable, seasonal way.
“Based on what we’re seeing, a lot of supply chains called it right and were ready for [a] healthy holiday season, so far.”
The LMI has remained in the mid to high 50s range since January—with the exception of April, when the index dipped to 52.9—signaling strong and consistent demand for warehousing and transportation services.
The LMI is a monthly survey of logistics managers from across the country. It tracks industry growth overall and across eight areas: inventory levels and costs; warehousing capacity, utilization, and prices; and transportation capacity, utilization, and prices. The report is released monthly by researchers from Arizona State University, Colorado State University, Rochester Institute of Technology, Rutgers University, and the University of Nevada, Reno, in conjunction with the Council of Supply Chain Management Professionals (CSCMP).
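The 50-point threshold mentioned above reflects standard diffusion-index arithmetic of the kind used in PMI-style surveys. Assuming the LMI follows that convention, a component reading is computed roughly as in the sketch below, with invented response counts for illustration:

    # Illustrative diffusion-index arithmetic for a PMI-style reading.
    # Assumption: score = (% reporting growth) + 0.5 * (% reporting no change).
    # The response counts below are invented for illustration only.
    responses = {"growing": 62, "no_change": 25, "contracting": 13}

    total = sum(responses.values())
    index = (responses["growing"] + 0.5 * responses["no_change"]) / total * 100

    print(f"Diffusion index: {index:.1f}")  # 74.5 here; above 50 signals growth

If exactly half of respondents reported growth and the other half reported contraction, the index would land at 50, which is why readings above 50 indicate expansion and readings below 50 indicate contraction.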
Specifically, 48% of respondents identified rising tariffs and trade barriers as their top concern, followed by supply chain disruptions at 45% and geopolitical instability at 41%. Moreover, tariffs and trade barriers ranked as the priority issue regardless of company size, as respondents at companies with fewer than 250 employees, 251-500, 501-1,000, 1,001-50,000, and 50,000+ employees all cited it as the most significant issue they are currently facing.
“Evolving tariffs and trade policies are one of a number of complex issues requiring organizations to build more resilience into their supply chains through compliance, technology and strategic planning,” Jackson Wood, Director, Industry Strategy at Descartes, said in a release. “With the potential for the incoming U.S. administration to impose new and additional tariffs on a wide variety of goods and countries of origin, U.S. importers may need to significantly re-engineer their sourcing strategies to mitigate potentially higher costs.”
The new funding brings Amazon's total investment in Anthropic to $8 billion, while maintaining the e-commerce giant’s position as a minority investor, according to Anthropic. The partnership was launched in 2023, when Amazon made its initial $4 billion investment in the firm.
Anthropic’s “Claude” family of AI assistant models is available on AWS’s Amazon Bedrock, which is a cloud-based managed service that lets companies build specialized generative AI applications by choosing from an array of foundation models (FMs) developed by AI providers like AI21 Labs, Anthropic, Cohere, Meta, Mistral AI, Stability AI, and Amazon itself.
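To make that array of foundation models concrete, a short sketch like the one below lists the models a given account can see in Bedrock, filtered here to Anthropic; the region and provider filter are assumptions, and availability varies by account and region.

    # Minimal sketch: enumerating the foundation models visible to a Bedrock
    # account, filtered to Anthropic as an illustration. What appears depends
    # on the account, region, and model-access settings.
    import boto3

    bedrock = boto3.client("bedrock", region_name="us-east-1")

    result = bedrock.list_foundation_models(byProvider="Anthropic")

    for summary in result["modelSummaries"]:
        print(summary["modelId"], "-", summary.get("modelName", ""))

Any of the returned Claude model IDs can then be invoked with the same model-agnostic Converse call shown earlier, which is the design that lets a single application switch between providers.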
According to Amazon, tens of thousands of customers, from startups to enterprises and government institutions, are currently running their generative AI workloads using Anthropic’s models in the AWS cloud. Those GenAI tools are powering tasks such as customer service chatbots, coding assistants, translation applications, drug discovery, engineering design, and complex business processes.
"The response from AWS customers who are developing generative AI applications powered by Anthropic in Amazon Bedrock has been remarkable," Matt Garman, AWS CEO, said in a release. "By continuing to deploy Anthropic models in Amazon Bedrock and collaborating with Anthropic on the development of our custom Trainium chips, we’ll keep pushing the boundaries of what customers can achieve with generative AI technologies. We’ve been impressed by Anthropic’s pace of innovation and commitment to responsible development of generative AI, and look forward to deepening our collaboration."
A growing number of organizations are identifying ways to use GenAI to streamline their operations and accelerate innovation, using that new automation and efficiency to cut costs, carry out tasks faster and more accurately, and foster the creation of new products and services for additional revenue streams. That was the conclusion from ISG’s “2024 ISG Provider Lens global Generative AI Services” report.
The most rapid development of enterprise GenAI projects today is happening on text-based applications, primarily due to relatively simple interfaces, rapid ROI, and broad usefulness. Companies have been especially aggressive in implementing chatbots powered by large language models (LLMs), which can provide personalized assistance, customer support, and automated communication on a massive scale, ISG said.
However, most organizations have yet to tap GenAI’s potential for applications based on images, audio, video, and data, the report says. Multimodal GenAI is still evolving toward mainstream adoption, but use cases are rapidly emerging, and with ongoing advances in neural networks and deep learning, multimodal applications are expected to become highly integrated and sophisticated soon.
Future GenAI projects will also be more customized, as the sector undergoes a major shift from fine-tuning general-purpose LLMs to building smaller models that serve specific industries, such as healthcare, finance, and manufacturing, ISG says. Enterprises and service providers increasingly recognize that customized, domain-specific AI models offer significant advantages in terms of cost, scalability, and performance. Customized GenAI can also meet demands such as privacy and security, task specialization, and the integration of AI into existing operations.