Page 328 - AI for Good Innovate for Impact
• REQ-02: Model Capabilities: Open-source large AI models must be acquired and adapted
to the specific requirements through secondary training and development.
• REQ-03: Computing Power Requirements: Sufficient computing power must be provisioned
for inference, fine-tuning, and training of the large model.
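To size the computing power called for in REQ-03, a common back-of-envelope approach is the "~6 FLOPs per parameter per token" training rule of thumb and a per-parameter memory estimate for optimizer state. The sketch below is illustrative only: the function names, the 6x constant, and the 16 bytes/parameter figure (mixed-precision weights, gradients, and Adam states, excluding activations) are assumptions, not figures from this document.

```python
def train_flops(n_params: float, n_tokens: float) -> float:
    """Rough rule of thumb: ~6 FLOPs per parameter per training token
    (forward + backward pass). Illustrative estimate only."""
    return 6.0 * n_params * n_tokens


def finetune_memory_gb(n_params: float, bytes_per_param: int = 16) -> float:
    """Approximate GPU memory for full fine-tuning with Adam in mixed
    precision: ~16 bytes/param (fp16 weights + gradients + fp32 optimizer
    states), excluding activation memory. Illustrative assumption."""
    return n_params * bytes_per_param / 1e9


# Example: fine-tuning a hypothetical 7B-parameter model on 1B tokens
flops = train_flops(7e9, 1e9)        # ~4.2e19 FLOPs
memory = finetune_memory_gb(7e9)     # ~112 GB (before activations)
```

Estimates like these explain why full fine-tuning of even a 7B model typically requires multiple accelerators, and why parameter-efficient methods (e.g., LoRA) are often used to reduce the optimizer-state footprint.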
4 Sequence Diagram
5 References
[1] GSMA, Operator Best Practices: AI Large Model Empowering Verticals (Use Cases).
[Online]. Available: https://www.gsma.com/about-us/regions/greater-china/wp-content/uploads/2024/12/Operator-Best-Practices-AI-Large-Model-Empowering-Verticals-Use-Cases-CN.pdf
[2] Wolai, “Vertical Industry Large Model Application Scenarios.” [Online]. Available:
https://www.wolai.com/grk6SvFBakREpFkAKkuTv9
[3] Y. Liu, M. Ott, N. Goyal, et al., “RoBERTa: A Robustly Optimized BERT Pretraining
Approach,” arXiv preprint, arXiv:1907.11692, 2019. [Online]. Available: https://arxiv.org/abs/1907.11692
[4] Google Research, “BERT: TensorFlow code and pre-trained models for BERT,” GitHub
repository, 2018. [Online]. Available: https://github.com/google-research/bert
[5] Tele-AI, TeleChat: A Telecom Assistant Powered by LLM. GitHub repository. [Online].
Available: https://github.com/Tele-AI/Telechat
[6] QwenLM, Qwen3: Next-Generation Open LLM by Alibaba. GitHub repository. [Online].
Available: https://github.com/QwenLM/Qwen3
[7] China Telecom Global, “China Telecom Announces New AI Advancements.” [Online].
Available: https://www.chinatelecomglobal.com/sc/whats-new/37325