Authors
ChunLin Yin,KunPeng Du,Qiong Nong,H Zhang,Li Yang,Biao Yan,Xiang Huang,Xiaobo Wang,Xuan Zhang
Abstract
Recently, large language models (LLMs) such as ChatGPT and GPT-4 have demonstrated remarkable performance in the general domain. However, their lack of adaptation to particular domains leads these LLMs to hallucinate when responding in specialized contexts. This issue has attracted widespread attention, yet existing domain-centered fine-tuning efforts have focused predominantly on sectors such as medicine, finance, and law, leaving critical areas such as power energy relatively unexplored. To bridge this gap, this paper introduces PowerPulse, a novel chat model for the power energy sector. Built on the LLaMA (open and efficient foundation language models) architecture, PowerPulse is fine-tuned specifically on Chinese power-sector domain knowledge; this work marks the first application of the LLaMA model in the field of power energy. By leveraging pertinent pre-training data and instruction fine-tuning datasets tailored to the power energy domain, PowerPulse achieves strong performance on tasks such as text generation, summary extraction, and topic classification. Experimental results validate the efficacy of PowerPulse, contributing to the advancement of specialized language models for specific domains.
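To make the instruction fine-tuning step concrete: the paper does not publish its data format, so the sketch below assumes an Alpaca-style prompt template, a convention commonly used when instruction-tuning LLaMA-family models. The template text, field names, and example records are all illustrative assumptions, not the authors' actual dataset.

```python
# Hypothetical sketch of instruction fine-tuning data preparation.
# Assumption: an Alpaca-style template; PowerPulse's real format is not published.

PROMPT_TEMPLATE = (
    "Below is an instruction that describes a task in the power energy domain. "
    "Write a response that appropriately completes the request.\n\n"
    "### Instruction:\n{instruction}\n\n"
    "### Response:\n"
)

def build_training_example(instruction: str, response: str) -> str:
    """Combine one instruction and its reference answer into a single
    training string, as typically fed to a causal-LM fine-tuning loop."""
    return PROMPT_TEMPLATE.format(instruction=instruction) + response

# Illustrative record (invented for this sketch):
example = build_training_example(
    "Summarize the main causes of transformer overheating.",
    "Overloading, poor cooling, and insulation degradation are common causes.",
)
print(example)
```

During fine-tuning, the loss is usually computed only on the response tokens so the model learns to answer rather than to reproduce the prompt; that masking step is omitted here for brevity.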