nodes, forming a knowledge graph. This graph structure clearly displays the entities and their
interrelationships within documents, providing robust support for subsequent knowledge
retrieval and answer generation.
Business Level: The Hubei Telecom 10000-system currently supports over 200 business
scenarios and nearly 800 business interfaces. The "Diting" Intelligent Customer Service Agent
introduces a two-layer knowledge graph to encapsulate the expertise of service experts,
precisely matching different intents and providing personalized service solutions for customers.
The first layer of the graph focuses on service plans, ensuring accurate execution of service
actions during the service process to meet diverse customer needs. The second layer focuses
on service components, defining the specific business interfaces required for each service
action. All necessary interfaces for current customer service scenarios are registered in the
"Diting" agent to ensure on-demand invocation and efficient service execution.
The system uses LightRAG, which consists of two main stages: index building and query response. Index building splits the original documents into small paragraphs and uses a language model to extract key entities and their relationships. Duplicate content is removed to create a more compact knowledge structure. All concepts and relationships are stored as a knowledge graph in a graph database, while each concept and text chunk is also converted into a vector embedding and stored in a vector database for fast similarity search. When new data is added, LightRAG uses an incremental update approach that updates only the affected parts, avoiding full rebuilds.
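As a rough illustration of these index-building stages, the following sketch walks through chunking, LLM-based extraction, de-duplication, and storage in the graph and vector stores. It is a conceptual outline under stated assumptions, not the library's actual API; the splitter and the callables passed in are hypothetical stand-ins.

```python
# Conceptual sketch of LightRAG-style index building; the helpers and callables
# (llm_extract, graph_db, vector_db, embed) are hypothetical stand-ins.
import hashlib


def split_into_paragraphs(doc, max_chars=500):
    """Naive paragraph splitter used only for illustration."""
    paragraphs = [p.strip() for p in doc.split("\n\n") if p.strip()]
    return [p[:max_chars] for p in paragraphs]


def build_index(documents, llm_extract, graph_db, vector_db, embed):
    seen = set()  # de-duplicate extracted concepts for a more compact structure
    for doc in documents:
        for chunk in split_into_paragraphs(doc):        # 1. split into small paragraphs
            entities, relations = llm_extract(chunk)    # 2. LLM extracts entities/relations
            for ent in entities:
                key = hashlib.md5(ent.lower().encode()).hexdigest()
                if key in seen:                         # 3. drop duplicate concepts
                    continue
                seen.add(key)
                graph_db.add_node(ent)                  # 4. store concept in the graph DB
                vector_db.add(ent, embed(ent))          # 5. embed concept for similarity search
            for src, rel, dst in relations:
                graph_db.add_edge(src, dst, label=rel)  # store relationships as graph edges
            vector_db.add(chunk, embed(chunk))          # text chunks are embedded as well
```

An incremental update would call the same routine on only the newly added documents, leaving the existing graph and vector entries untouched.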
In the query response stage, the system analyzes the user's query and identifies two types of keywords: local keywords and global keywords. This two-step retrieval uses the local keywords to find the most relevant content in the vector database and the global keywords to explore related information in the knowledge graph via its relationships. The retrieved results are combined into a context prompt and fed into a large language model (LLM). The system organizes the output using a predefined response template, ensuring the final answer is coherent, logical, and complete.
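The dual-keyword flow can be sketched as follows. This is again only an assumed outline: the keyword extractor, vector store, graph store, LLM handle, and response template are hypothetical placeholders rather than the actual LightRAG query implementation.

```python
# Illustrative sketch of the dual-keyword query flow; the callables and the
# template below are hypothetical placeholders.
RESPONSE_TEMPLATE = (
    "Answer the customer using only the context below.\n\n"
    "Context:\n{context}\n\nQuestion: {question}\nAnswer:"
)


def answer_query(question, extract_keywords, vector_db, graph_db, llm):
    # 1. Identify the two keyword types from the user's query.
    local_kw, global_kw = extract_keywords(question)

    # 2. Local keywords -> vector similarity search over text chunks.
    chunk_hits = [hit for kw in local_kw for hit in vector_db.search(kw, top_k=3)]

    # 3. Global keywords -> expansion through knowledge-graph relationships.
    graph_hits = [fact for kw in global_kw for fact in graph_db.neighbors(kw)]

    # 4. Merge both result sets into one context prompt and call the LLM.
    context = "\n".join(dict.fromkeys(chunk_hits + graph_hits))  # keep order, drop duplicates
    return llm(RESPONSE_TEMPLATE.format(context=context, question=question))
```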
Partners
• China Telecom Artificial Intelligence Technology Co., Ltd., provides computing power
platforms and some AI capabilities. [7]
2.2 Benefits of the use case
Reduction in Training Time: Traditional customer service relies heavily on repetitive manual
labor, resulting in high resource consumption and low efficiency. In contrast, "Diting" automates
the handling of user requests through AI, reducing the workload of customer service personnel.
The training period for newly hired agents has been shortened from three months to one week,
significantly reducing operational costs.
Improvement in Service Efficiency: This case builds an intelligent customer service system that introduces AI capabilities to assist customer service personnel in handling calls. By automatically recognizing intents, extracting information, invoking component interfaces (e.g., a phone-bill inquiry interface), and executing service solutions, it significantly improves the efficiency and quality of customer service. The application of "Diting" has reduced the time spent on information sorting and confirmation by 20%, and the average handling time per operation has been cut by 10%. The time required