Want to be agile? Master your data

Your supply chain's speed and efficiency depend on the flow of synchronized data between you and your partners. The first step: making sure your own data is clean and consistent.

When change occurs, the agile organization will spring into action, delivering a swift and effective response. The era of lumbering corporate giants has finally given way to the age of the nimble, adaptive competitor—one that considers agility to be both a competitive weapon and a corporate strategy. The more agile your organization, the new thinking goes, the better it will be able to handle such challenges as growth, structural change, globalization, and regulatory pressures. But how do you achieve agility?

A key prerequisite for becoming agile is to make sure all stakeholders within the enterprise share a common view of the data that they use to make business decisions. This enables all parts of your business to work together more effectively than if each part had its own view of the business and an independent plan of action. To do that, enterprises need to achieve a clean and consistent interpretation of master data, or standardized attributes that are common across multiple items—for example, price, size, location, and Global Trade Item Number (GTIN). Master data management (MDM) is a program that helps an enterprise ensure that the master data used by all of its stakeholders remains high in quality.
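
As a rough illustration, the Python sketch below models one master-data record with a few of the standardized attributes named above and a simple quality check of the kind an MDM program might enforce. The field names, the 14-digit GTIN convention, and the validation rules are assumptions made for this example, not anything prescribed by the article or by GS1.

```python
from dataclasses import dataclass

@dataclass
class MasterItem:
    """One master-data record; the attribute names are illustrative only."""
    gtin: str          # Global Trade Item Number, assumed padded to 14 digits
    description: str
    price: float
    currency: str      # e.g., "USD"
    size_cm: float
    location_gln: str  # Global Location Number of the owning site (assumed field)

def validate(item: MasterItem) -> list[str]:
    """Return a list of rule violations for a single record (hypothetical rules)."""
    problems = []
    if len(item.gtin) != 14 or not item.gtin.isdigit():
        problems.append("GTIN should be 14 digits")
    if item.price <= 0:
        problems.append("price must be positive")
    if not item.description.strip():
        problems.append("description is missing")
    return problems
```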


[Figure 1] Global data synchronization network conceptual topology

However, an enterprise is only one node within a value chain. As competition continues to increase, more enterprises are recognizing that their ability to become (or remain) agile depends heavily on their trading partners. If those trading partners stumble, an enterprise's breakthrough performance may not translate into overall value chain improvements, and its gains will largely be wasted. For an entire value chain to increase its agility or achieve extended breakthrough performance, semantic reconciliation of master data needs to take place across multiple enterprises. Known as "global data synchronization" (GDS), this program should incorporate any relevant industry standards and stretch across multienterprise business processes.

The benefits of such a program can be significant. Managing information more effectively across the value chain not only increases agility but also reduces costs, keeping information integration problems to a minimum and allowing everyone to work from a single consistent, consolidated, and authoritative version of the information. In addition, companies that share information across the value chain are increasingly deriving competitive advantage because they can marshal their assets, partners, and people to work in unison to outmaneuver business rivals.

Sharing data beyond the enterprise
There are already many examples today of master data being extended or shared across trading boundaries:

  • In the automotive industry, data and semantics (the intended meaning of the data) are shared to support collaborative product design as well as multienterprise business processes related to forecasting and replenishment.
  • In multichannel retailing (where an enterprise sells the same product, sometimes to the same customer, via different business channels—stores, kiosks, direct/catalog sales, Web sites, and so on), data on products, customers, and locations must be aligned across all of the channels' business processes as well as with third-party logistics operators.
  • In the consumer goods industries, significant amounts of data are shared with trading partners in order to support dynamic business processes such as promotions, new-product introductions, and product substitutions.
  • In the food-service industries (where buying products is consolidated through a small number of companies that represent large numbers of national and local restaurant chains), trading partners require alignment of data for common items needed by multiple customers.
  • In both business-to-business and business-to-consumer commercial trading, many suppliers across many industries sell their products and services via a "browse and buy" catalog or service based on commonly accepted data definitions.

Yet in spite of all these data-sharing efforts, the poor quality of master data has become a hot topic for many industries. When you look at an enterprise as part of a wider value chain, the reconciliation of master data becomes much more complicated than it is internally. That's largely because the majority of the data that are consumed by an enterprise reside outside the enterprise. As a result, the enterprise lacks any formal control over those data or the associated external business processes. Therefore, internal enterprise information management and external global data synchronization are two core programs that should be part of any chaos-resilient supply chain management strategy.

An example from the consumer-goods industry
Although the approaches and the IT solutions may vary somewhat, the issues and general concepts of data synchronization apply across most industries. Two sectors that have made significant strides in advancing global data synchronization are retail and consumer goods. These efforts have grown out of an industrywide focus on improving product introduction and replenishment processes. A number of large global retailers—including Wal-Mart, Target, Carrefour, Metro AG, Home Depot, and Ace Hardware—have been working with their distributors and suppliers to develop governance models for the creation and maintenance of master data, industry data dictionaries and data models, and compliance and certification processes for those data directories.

The result of their efforts is what the industry calls the Global Data Synchronization Network (GDSN), which is managed by the not-for-profit global standards organization GS1. GDSN is a collection of enterprise and industry data repositories that are connected electronically in order to assure that product and attribute information is aligned across the value chain. It is focused on ensuring the semantic consistency of product data and other associated master data to improve business agility and to reduce waste.

GDS is achieved via the deployment of a large global product directory that is referenced by all buying and selling processes. Called the Global Registry, this directory serves as a very large "look-up" table of core data such as Global Trade Item Number and Global Location Number (which is used to identify legal entities, trading partners, and locations). The registry is accessed by local or remote data pools, which are repositories of both the core data and extended data (such as price) pertaining to products or commercial contracts shared between trading partners.

Sellers use their data pools to publish product data and other master data. Buyers use their data pools to subscribe to these data. Manufacturers and distributors (sellers/publishers) create and enrich these data (which can include packaging instructions, operating manuals, and warranties) throughout the product's lifecycle. Buyers consume and further enrich the data, even to the point of sharing it with consumers. These data pools can be hosted by data-synchronization services such as 1Sync, Agentrics, GXS, or Sterling Commerce, or they can be maintained behind an enterprise firewall.
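
In rough terms, that publish-and-subscribe relationship can be sketched as below. The DataPool class and its methods are invented for illustration only and do not correspond to any actual GDSN or data-pool vendor API.

```python
class DataPool:
    """Toy data pool: sellers publish item data, buyer pools subscribe to it."""

    def __init__(self, name: str):
        self.name = name
        self.items: dict[str, dict] = {}          # item records keyed by GTIN
        self.subscribers: list["DataPool"] = []   # buyer pools receiving updates

    def subscribe(self, buyer_pool: "DataPool") -> None:
        self.subscribers.append(buyer_pool)

    def publish(self, gtin: str, attributes: dict) -> None:
        """Seller side: store or enrich an item, then push it to subscribers."""
        self.items.setdefault(gtin, {}).update(attributes)
        for pool in self.subscribers:
            pool.receive(gtin, self.items[gtin])

    def receive(self, gtin: str, attributes: dict) -> None:
        """Buyer side: consume the data; the buyer may enrich it further."""
        self.items[gtin] = dict(attributes)

# A supplier pool publishes a record; the retailer pool ends up with the same data.
supplier = DataPool("supplier-pool")
retailer = DataPool("retailer-pool")
supplier.subscribe(retailer)
supplier.publish("00012345678905", {"description": "Sample item", "price": 9.99})
```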

The information is stored in the data pools and linked via the Global Registry to the rest of the community (see Figure 1). Only the data needed to uniquely identify the item and seller (the "thin" data) are stored in and registered with the Global Registry. These data include item code, Global Trade Item Number, product category, owner, and other core item attributes. That amounts to only a handful of attributes per item, but across the many products registered worldwide it nonetheless translates into a very large database.
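
A minimal sketch of that "thin" registration and look-up appears below, assuming the registry is keyed on the GTIN and the owner's Global Location Number; the field names and keying scheme are assumptions made for illustration, not the actual GDSN design.

```python
# "Thin" attributes, drawn from the list above; names are illustrative only.
THIN_FIELDS = ("item_code", "gtin", "product_category", "owner_gln")

# The Global Registry modeled as a look-up table keyed by (GTIN, owner GLN).
global_registry: dict[tuple[str, str], dict] = {}

def register(entry: dict) -> None:
    """Store only the thin attributes needed to uniquely identify item and seller."""
    thin = {field: entry[field] for field in THIN_FIELDS}
    global_registry[(thin["gtin"], thin["owner_gln"])] = thin

def lookup(gtin: str, owner_gln: str) -> dict | None:
    """Find the registered thin record, which points back to the owner's data pool."""
    return global_registry.get((gtin, owner_gln))
```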

Additional data needed to support all current and future business processes (the "fat" data) are synchronized automatically between publishers and subscribers based on the rules related to the subscription (such as frequency and location) and the relationship requirements. Some partners require unique data attributes to do business; others require more standard data elements. In all cases, this data flow (represented by the blue line in the figure) takes place outside of the normal electronic data interchange or business transaction flow (represented by the black line).
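
The rule-driven flow of "fat" data might be sketched along these lines, assuming a subscription expresses a refresh frequency and a set of locations of interest; the field names are hypothetical and greatly simplified.

```python
from datetime import datetime, timedelta

# Hypothetical subscription rules: how often to refresh and which locations apply.
subscription = {
    "locations": {"US-NE", "US-SE"},
    "frequency": timedelta(days=1),
}

def sync_due(last_sync: datetime, now: datetime, rules: dict) -> bool:
    """Has enough time passed since the last synchronization for this subscriber?"""
    return now - last_sync >= rules["frequency"]

def select_items(rules: dict, published: list[dict]) -> list[dict]:
    """Return only the 'fat' records relevant to this subscriber's locations."""
    return [item for item in published if item["location"] in rules["locations"]]
```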

Figure 1 shows a conceptual or logical topology of the GDSN with publishers (suppliers), subscribers (retailers/buyers), and their respective (country, regional) data pools synchronizing messages via a (global) product registry.

Currently, the GDSN is still at the foundational level. Further development will be needed before the network can fulfill its promise of helping participants increase agility across the value chain. Through 2009, the IT consultancy Gartner expects to see improvements in agility through more efficient business processes related to new-product introduction and price/promotion collaboration as adoption increases in North America, Europe, and Asia.

Other industries follow suit
Global data-synchronization efforts are not unique to the consumer goods and retail sectors. Other industries are also looking to improve supply chain agility and end-to-end decision making across their value chains. We have seen comparable initiatives emerge in the life science industry, such as the Global Healthcare Exchange, or GHX (www.ghx.com), which is building an industry product catalog. Similar programs have been introduced in the automotive manufacturing and retail sector through the Automotive Aftermarket Industry Association (www.aftermarket.org) and National Automotive Dealers Association (www.nada.org). Likewise, the Coalition for Healthcare eStandards (www.chestandards.org) and the Health Care EBusiness Collaborative (www.hcec.org) are developing standards for data synchronization for medical logistics.

Indeed, the GDS framework is valid across any network of trading members and, therefore, can be extended across any industry. It is a framework that can support any number of stakeholders in an industry where the community desires to improve value chain agility.

The right start: intra-enterprise synchronization
As users embark on their GDS programs, they will realize that they can't share even simple data with trading partners unless the data behind their own firewalls are clean. Master data management, then, is a prerequisite for global data synchronization. If your internal data aren't semantically consistent (for example, if one application represents price in U.S. dollars and another in euros), then you will fail to realize the benefits of data synchronization and will be unable to mitigate the costs of external integration. Furthermore, by feeding your partner bad data, you risk causing significant harm to the relationship.
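
As a simple illustration of that kind of semantic check, the sketch below flags records whose price is not expressed in the agreed enterprise-wide currency before they are synchronized. The single-currency convention and the field names are assumptions made for the example.

```python
EXPECTED_CURRENCY = "USD"   # assumed enterprise-wide convention

def find_currency_conflicts(records: list[dict]) -> list[dict]:
    """Flag records whose price is not expressed in the agreed currency."""
    return [r for r in records if r.get("currency") != EXPECTED_CURRENCY]

# Two internal systems describe the same item differently; the second is flagged
# for reconciliation before the data are shared with trading partners.
erp_record = {"gtin": "00012345678905", "price": 9.99, "currency": "USD"}
crm_record = {"gtin": "00012345678905", "price": 8.49, "currency": "EUR"}
print(find_currency_conflicts([erp_record, crm_record]))
```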

MDM is almost a mirror image of GDS. If GDS is about semantic reconciliation of master data among enterprises, then MDM is partly about the semantic reconciliation of master data inside the enterprise. Yet MDM is the process by which all forms of master data will be managed, not just the product and commercial data needed for GDS.

Different industries are at different stages of the data management effort. Tier 1 members of the retail and consumer goods industries are already active in data synchronization. Tier 2 and below are also active, although often by category/vertical industry: consumer electronics, hard lines (that is, household goods like garden, kitchen, and household furniture), and so on. In the life science industry, Tier 1 members are working with GHX to build their first centralized product data catalog.

The technologies needed to achieve MDM aren't all new, and much effort will be expended in aligning technologies that are already deployed in the enterprise—for example, for product information management and customer data integration. Rationalizing data management across enterprise resource planning, customer relationship management, supply chain management, supplier relationship management, and product lifecycle management systems is only the beginning of the effort. Yet MDM must be addressed if you are to derive any value from GDS. It won't do you any good to learn a new language if you can't then translate the data and information back to your peers.

Technology requirements
Business agility is predicated on the understanding that stakeholders within your enterprise are coordinated. This coordination requires the use of data that is semantically consistent within and across business applications. The more complex (that is, heterogeneous) your IT landscape is, the harder and more costly it will be to achieve that consistency. When agility depends on coordination of business activities across the trading-partner boundary, that complexity requires a new level of attention and IT investment. MDM is the program you need to adopt within the enterprise, and it must be extended beyond the enterprise with a GDS program.

Here are some of the steps that must be taken to assure that data are synchronized both within the enterprise and with your external partners:

  • Understand what is needed to support GDS and MDM.
  • Assess your IT projects today to ensure that MDM and GDS requirements are being met, are aligned, and are reconciled.
  • Work with strategic customers and/or suppliers to orchestrate GDS programs and to ensure that these programs exploit the necessary technical and industry standards (such as those laid out by GDSN for retail and consumer products or by the Global Healthcare Exchange for life sciences).

The future of GDS
The extension of master data management to external trading partners will only increase. Gartner predicts that through 2007, 30 percent of Global 1000 enterprises involved in the manufacture, movement, buying, and selling of consumer goods will require external data synchronization with their top 10 trading partners. Furthermore, through 2010, 30 percent of Global 1000 enterprises involved in the manufacture, movement, buying, and selling of non-consumer goods (mining, aerospace and defense, electronics, and chemicals) will participate in external data-synchronization programs with their top 10 trading partners.

An enterprise's agility is significantly inhibited by the struggle to ensure that information can flow seamlessly and continuously across all boundaries. Without that seamless flow of information, business transactions can become bogged down, and enterprises may find themselves needlessly spending time and money to reconcile data discrepancies. MDM overcomes those limitations within the enterprise, and GDS extends that work across the value chain; combined, these two programs establish a new discipline for enabling business agility.

This article is printed with permission of Gartner Inc. Copyright 2007.
