The widespread adoption of large language models (LLMs) powering generative AI has followed a now-familiar curve—from initial skepticism to growing acceptance to rapid implementation across nearly every sector. As organizations recognize the transformative capabilities of these tools, adoption is surging, reaching 72 percent in 2023, a nearly 20 percent leap over the prior year.
This growth is driven by the measurable value generative AI creates. Companies that deploy AI at scale report significant benefits, with some achieving up to 40 percent cost savings and a 66 percent lift in productivity in key areas like customer service and operations. These benefits are expected to grow further as the technology advances.
(How was 2024 for the AI industry? Read this Bold report card and find out.)
Despite these benefits, many organizations remain hesitant, unsure how to integrate these tools or concerned about costly missteps. Inaction, however, isn’t an option for companies aiming to stay competitive. Conversational AI tools leveraging LLMs have great potential as critical strategic differentiators, enabling organizations to transform operations and unlock unprecedented opportunities for efficiency and innovation. Given the fast pace of change, business and tech leaders need to carefully think through the use cases and implementation approach that will help them realize value quickly and sustainably.
Emerging Use Cases of Conversational AI
Applying LLMs to conversational AI further enhances traditional use cases such as customer service and enables new ones, such as personal assistants, task copilots, and education. As a result, numerous companies offering dedicated AI solutions have emerged in recent years. Enterprises can adopt these ready-to-use solutions or build custom ones to reduce costs and drive revenue growth.
Use Case #1—Customer Service
Customer service, the oldest and most established application of conversational AI, is being revolutionized through automated support that improves agent productivity and customer experience. AI assistants can autonomously handle routine inquiries and help human agents resolve complex issues quickly and accurately. A leading example is Bank of America’s “Erica,” which handles around two million interactions daily. Erica provides customers with account information, such as balances or recent transactions, and helps agents quickly find relevant information, such as missing documents in a loan application. It also handles mundane, repetitive tasks like automatically updating the support ticket based on the call transcript, allowing agents to focus on high-value work. The result is faster service, lower costs, and greater satisfaction for customers and employees. Now, LLMs are taking customer support a step further by enabling more natural interactions, resolving complex queries promptly, and providing personalized responses.
Use Case #2—Personal Assistants
At work, conversational AI-powered personal assistants boost productivity by streamlining time-consuming tasks, such as onboarding a new team member, preparing for meetings, and searching enterprise knowledge. For example, tools like Motion and Clara assist employees with scheduling and task management. RingSense integrates with customer relationship management (CRM) software and helps sales reps track customer journeys, understand interactions, and automate data entry. With an AI tool like Coveo, employees can search across their enterprise data and reduce time wasted looking for information.
Use Case #3—Task Copilots
Conversational AI copilots reduce complex actions to natural language commands, eliminating many of the steps usually required to complete a task. For example, users can talk to their documents or spreadsheets with AI-enabled tools like Microsoft 365 Copilot to reformat documents, create presentations, and generate insights from Excel data—all through conversational commands. GitHub Copilot assists with coding by explaining complex architectures, suggesting optimizations, and identifying potential security vulnerabilities, enhancing human expertise rather than replacing it. In addition to improving productivity, copilots bridge skill gaps and democratize specialized work. Canva’s Magic Design, for instance, helps users without graphic design experience create professional visuals by translating ideas into design principles.
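To make the copilot pattern concrete, here is one common way a conversational command becomes a structured action: the application registers a tool with an LLM, and the model returns a machine-readable call instead of free text. The sketch below is a minimal illustration assuming the OpenAI Python SDK; the reformat_document tool and model name are hypothetical placeholders, and this is not how Microsoft 365 Copilot or GitHub Copilot are actually built.

```python
# Minimal sketch: turning a conversational command into a structured action
# via an LLM tool-calling interface. Assumes the OpenAI Python SDK; the
# "reformat_document" tool and the model name are illustrative placeholders.
import json
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

tools = [{
    "type": "function",
    "function": {
        "name": "reformat_document",
        "description": "Apply a formatting style to a named document.",
        "parameters": {
            "type": "object",
            "properties": {
                "document": {"type": "string"},
                "style": {"type": "string", "enum": ["report", "memo", "slide_outline"]},
            },
            "required": ["document", "style"],
        },
    },
}]

response = client.chat.completions.create(
    model="gpt-4o-mini",  # placeholder; any tool-calling-capable model
    messages=[{"role": "user", "content": "Turn Q3_notes.docx into a memo."}],
    tools=tools,
)

# The model replies with a structured call the application can then execute.
for call in response.choices[0].message.tool_calls or []:
    print(call.function.name, json.loads(call.function.arguments))
```

The same pattern generalizes: each copilot capability is registered as a tool, and the LLM’s job is simply to choose the right one and fill in its arguments from the user’s request.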
Use Case #4—Education
Conversational AI is transforming education by providing personalized support for teachers and students. For teachers, tools like Carnegie Learning’s AI-powered platform help identify learning gaps, create lesson plans, suggest differentiated instruction approaches, and handle grading and other routine tasks. Student-focused applications offer learning support through interactive explanations and tailored practice problems. For example, Quizlet uses AI to generate personalized quizzes and practice tests based on students’ study materials, adapting to their learning progress. These teaching and study assistance applications could be transformative for remote and underserved communities, where access to specialized teachers or tutors may be limited.
Strategies for Deploying Conversational AI
A phased approach is crucial to realizing a return on investment (ROI) when implementing these use cases. Given its maturity, customer support often serves as an ideal first step, with many industry- and function-specific solutions readily available in the market. Organizations also typically have clearly defined use cases, training data, and established performance frameworks for customer support, which make it easier to implement. Out-of-the-box copilots, such as Microsoft 365 Copilot, are another way for enterprises to get started quickly.
It is also vital to stage the development process and test AI solutions throughout implementation. Enterprises can begin with proof-of-concept projects to validate the ROI, then scale successful pilots into more specialized, capital-intensive implementations. They can use off-the-shelf offerings to get started on standard use cases, and enterprise versions of public LLMs to build custom applications with simple no-code prompts. As initial solutions prove valuable, companies can invest in more sophisticated deployments, including fine-tuned models.
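For teams weighing that first proof of concept, the pilot can start very small: a hosted LLM, a system prompt, and a slice of real support content. The sketch below is a minimal illustration assuming the OpenAI Python SDK; the model name, policy text, and escalation rule are placeholders rather than a recommended production design.

```python
# Minimal proof-of-concept sketch: a grounded customer-support responder
# built on a hosted LLM API. Assumes the OpenAI Python SDK; the model name,
# policy snippet, and sample question are placeholders for illustration.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

POLICY_SNIPPET = (
    "Refunds are available within 30 days of purchase with a receipt. "
    "Digital goods are refundable only if they have not been downloaded."
)

def answer_support_question(question: str) -> str:
    """Answer a customer question using only the supplied policy text."""
    response = client.chat.completions.create(
        model="gpt-4o-mini",  # placeholder; any enterprise-approved model
        messages=[
            {"role": "system",
             "content": "You are a customer-support assistant. Answer only "
                        "from the policy text provided; if the answer is not "
                        "there, say you will escalate to a human agent."},
            {"role": "user",
             "content": f"Policy:\n{POLICY_SNIPPET}\n\nQuestion: {question}"},
        ],
    )
    return response.choices[0].message.content

if __name__ == "__main__":
    print(answer_support_question("Can I get a refund on an ebook I already downloaded?"))
```

A pilot like this makes it easy to compare answer quality against existing support metrics before committing to retrieval pipelines, fine-tuned models, or deeper system integrations.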
About the Author:
Surabhi Sankhla is a vice president of product and an expert in conversational AI. With more than 14 years of experience building digital platforms and solutions, she is currently leading the launch of an enterprise virtual assistant. During her tenure at Amazon, Surabhi built, launched, and expanded products for Alexa. She also led several international digital transformation projects during her time at The Boston Consulting Group (BCG) and Uber. Surabhi holds a bachelor’s degree in electrical engineering from the Indian Institute of Technology (IIT) Delhi and an MBA from Stanford. Connect with Surabhi on LinkedIn.