These companies typically share three common approaches. First, they select AI use cases based on business priorities, with a rigorous focus on value. This often involves building a dedicated organizational unit to orchestrate and accelerate scaling.
Second, they establish an enablement function to ensure that, as new skills and capabilities are developed, they are available to the teams that need them across the organization. In some instances, the enablement function may assign dedicated teams to develop needed capabilities that are lacking.
Third, they employ a consistent execution model, with AI use cases running through agile build and validation cycles. Prototypes collect early end-user feedback and lead to the creation of MVPs, which add features and users as they are scaled and integrated into the operating model. Ultimately, use cases and their operating models are deployed across the organization.
Management teams should ask themselves three questions about their ability to prioritize and scale use cases:
- Do we have a systematic approach that prioritizes scaling of use cases based on value?
- Do we build use cases using a consistent agile execution model that allows for accelerated scaling?
- Do we track the use cases in the pipeline against clear objectives and key results?
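A systematic, value-based prioritization of the kind described above can be made concrete with a simple weighted scoring model. The sketch below is illustrative only; the use-case names, weights, and scoring formula are assumptions, not a method prescribed in the text.

```python
from dataclasses import dataclass

@dataclass
class UseCase:
    name: str
    value: float           # estimated annual value, e.g., in $ millions
    feasibility: float     # 0.0 to 1.0
    time_to_impact: float  # months until measurable results

def priority_score(uc: UseCase, w_value: float = 0.5,
                   w_feas: float = 0.3, w_speed: float = 0.2) -> float:
    """Weighted score: higher value, higher feasibility, and faster
    time to impact all push a use case up the pipeline."""
    speed = 1.0 / max(uc.time_to_impact, 1.0)  # faster impact -> higher score
    return w_value * uc.value + w_feas * uc.feasibility * 100 + w_speed * speed * 100

# Hypothetical pipeline of candidate use cases
pipeline = [
    UseCase("demand forecasting", value=12.0, feasibility=0.8, time_to_impact=6),
    UseCase("churn prediction", value=8.0, feasibility=0.9, time_to_impact=3),
    UseCase("warranty analytics", value=3.0, feasibility=0.6, time_to_impact=12),
]
ranked = sorted(pipeline, key=priority_score, reverse=True)
```

The weights encode the management team's answer to the first question; tracking the resulting ranked pipeline against objectives and key results answers the third.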
Accessible Data and Technology. Leaders make data, technology, and algorithms available for use by teams across the organization. Other companies maintain siloed tech and data that impede scaling; common barriers include the use of individual data sets rather than a single accessible data pool, separately built and incompatible tech stacks, and inconsistent or redundant algorithms. Companies that make more than 75% of technology and data available have a 40% greater likelihood of realizing AI use cases at scale than those that make 25% or less widely available.
A modular data and digital platform serves as the technical foundation for data accessibility and allows for rapid release cycles of AI use cases. The platform typically uses cloud infrastructure, core systems, and clearly defined interfaces. In addition, data models are built to support all core workflows. For example, a data model for an AI-powered sales process would ensure accessibility of the relevant data during the presales phase, negotiation, deal closing, and delivery.
Three questions that management teams should ask about the accessibility of data and technology:
- Have we implemented a data and digital platform that uses clearly defined interfaces that allow teams to access and process data across the organization?
- Have we identified the data assets that provide competitive advantage, specified ownership and quality assurance, and organized them into data models that can be utilized along the full value chain?
- Have we made algorithms, and the software they are embedded in, available across the organization to avoid inconsistencies and redundant work?
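The third question, avoiding inconsistent and redundant algorithms, is often answered in practice with a shared registry: teams register a model once and look it up by name and version instead of rebuilding it. The sketch below is a minimal illustration under that assumption; the registry API and the lead-scoring function are invented for the example.

```python
class AlgorithmRegistry:
    """A minimal shared registry: one place to publish and retrieve
    algorithms, so teams reuse rather than reimplement them."""
    def __init__(self) -> None:
        self._algos: dict[tuple[str, str], object] = {}

    def register(self, name: str, version: str, fn) -> None:
        self._algos[(name, version)] = fn

    def get(self, name: str, version: str):
        try:
            return self._algos[(name, version)]
        except KeyError:
            raise KeyError(f"{name} v{version} is not registered") from None

registry = AlgorithmRegistry()
# A placeholder scoring function; in practice this would be a trained model
registry.register("lead_scoring", "1.0",
                  lambda features: sum(features) / len(features))

# Any team across the organization retrieves the same versioned algorithm
score = registry.get("lead_scoring", "1.0")([0.7, 0.9])
```

Versioning matters here: two teams calling `get("lead_scoring", "1.0")` are guaranteed consistent results, which is exactly the inconsistency the question targets.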
Leadership and Talent. AI leaders recognize that putting the right people in the right roles is a critical foundation for success. By emphasizing human as well as technological capabilities, they seek to facilitate organizational and machine learning. Companies that dedicate 10% or more of their digital staff to AI-specific roles, and that have 30% or more of their staff using AI solutions daily, generate more than twice the added EBIT (11%) of companies that fall below these thresholds. In addition, companies with aligned leadership, strong collaboration across business units and functions, and end-to-end agile product delivery increase their AI use case maturity by an average of about 25%. Aligned leadership is essential to setting priorities, such as trimming a large number of initiatives down to a few high-potential use cases that promise the most value and organizing them into an integrated roadmap.
Three questions to ask about leadership and talent:
- Have we taken all the relevant steps to attract the necessary AI talent—for example, by leveraging talent ecosystems and strengthening our employer value proposition with a focus on technical skills?
- Have we empowered the organization to build foundational technical and analytical skills and to set up a creative learning environment that fosters the use of AI solutions on a daily basis?
- Have we created an organizational environment in which teams are able to make decisions, generate new ideas for leveraging AI, and take risks, knowing that they are supported by an aligned C-suite?
AI Scaling in Practice
Two examples illustrate how leading companies put the measures described above into practice.
A European industrial goods company with global reach had made earlier attempts at digital transformation and had hundreds of digital, automation, and AI projects in the works. All were lodged in functional silos, each with its own operating model and none with strategic guidance or direction. The company had made significant investments in “data platforms” managed by its IT function but had seen no financial impact, in part because it had no overarching plan or approach.
A new head of digital transformation, who reported to the CEO, had a strong operational background. He took a fundamentally different approach, comprising a half-dozen guiding principles:
- Anchor the case for change and the AI priorities at the top.
- Define the target state and the objectives and key results for top-priority cross-functional domains.
- Prove value first, then scale the capability incrementally.
- Involve people with both business and technical expertise.
- Ensure senior-management involvement to help embed change on the front line.
- Initially locate the work in an accelerator unit outside of IT and bring it into the full organization when mature.
Senior management identified ten "lighthouse" use cases that would generate value and form the basis for refreshing the company’s data platform, IT operating model, and organization. The use cases forced the company to identify the data assets that provided competitive advantage and establish a single, accessible database. They also helped specify the requirements for the data platform and overall architecture.
As the company eliminated the barriers to data access, it established a stronger foundation for developing and scaling the selected AI use cases. From the lighthouse pilots, a team dedicated to enablement identified missing technical capabilities and skills, such as expertise in cloud platforms, and supported the creation of new teams to fill the gaps. It also established a consistent agile execution model for developing and scaling the use cases. Over time, momentum grew as more data and tools became available across the organization, further strengthening the foundation for future use cases.
A global consumer goods company operating in more than 150 markets was determined to use AI to boost revenues and profitability. It started from a position of low digital maturity, with limited data science capability and no digital use cases operating at scale. It identified 20 initial AI use cases, from which it prioritized three AI solutions to pilot:
- Marketing Budget Allocation. Where to invest in advertising and marketing to maximize return.
- Sales Force Effectiveness. Which stores to focus on in any given week and the next best action for each one.
- Product Promotion and Pricing. The optimal product price and most effective promotion calendar to maximize sales and margins in the next year.
The company framed a business case and roadmap to scope its ambition, prioritized use cases based on feasibility, size of the prize, and time to impact, and selected pilot countries. It then defined the data and IT architecture that it needed to build, designed a modeling approach, and tested MVPs in two or three pilot markets for each use case. As it proceeded, the company built out a data and tech platform that enabled scaling of use cases, global data hubs, and a cloud-based technology stack. It also developed the necessary human capabilities, including a dedicated central organization to accelerate the building and deployment of AI solutions, recruitment of about 70 data and digital experts, and launch of an ambitious “upskilling” program structured around data management, AI, and agile product ownership.
About eight months after launching its initial MVP, the company was able to deploy AI solutions in more than ten markets covering a majority of its sales and embed them in its IT systems. The program achieved results in line with financial and organizational benchmarks targeted at the outset, and all deployed AI solutions had a positive impact on financial performance.
Even small investments in AI can pay off. But making AI work requires targeting and discipline—and a focus on human skills as well as technology. A well-planned approach based on building the digital foundation and capabilities to scale up AI use cases can serve as a powerful and profitable accelerant.