Schizophrenia
Natural language processing
Artificial intelligence
Computer science
Nature
Psychology
Geography
Programming languages
Archaeology
Authors
Pei-Yun Lin, Ying-Hsuan Chen, Yuh-Jer Chang, Tsung-Tse Ho, Tai-Chuan Shih, Chih-Hung Ko, Ying-Hui Lai
Source
Journal: Research Square
Date: 2024-01-09
Cited by: 2
Identifier
DOI: 10.21203/rs.3.rs-3836497/v1
Abstract

Background: The correct diagnosis of schizophrenia is essential to reduce the economic burden of the disease and to avoid worsening patients' comorbidities. However, current clinical diagnosis is subjective and time-consuming. We propose a deep learning method using bidirectional encoder representations from transformers (BERT) to identify lexical incoherence related to schizophrenia.

Methods: We use a fine-tuned BERT model to extract schizophrenia-related text features and detect possible schizophrenia. Our study enrolled 13 participants diagnosed with schizophrenia and 13 participants without schizophrenia. After collecting speech data, we created a training set by sampling from 10 speakers in each group; the remaining speakers' data was reserved for external testing of the model's performance.

Results: After tuning the BERT model's parameters, we achieved strong detection results, with an average accuracy of 84%, a true positive rate of 95%, and an F1 score of 0.806. These results underscore the efficacy of the proposed system in identifying lexical incoherence related to schizophrenia.

Conclusions: The proposed method, leveraging the deep learning BERT model, shows promise in aiding schizophrenia diagnosis. The model's self-attention mechanism extracts representative schizophrenia-related text features, providing an objective indicator for psychiatrists. With ongoing refinement, the BERT model can serve as a valuable auxiliary tool for faster, more objective schizophrenia diagnosis, ultimately alleviating societal economic burdens and preventing major complications in patients.
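The self-attention mechanism that the abstract credits with extracting representative text features can be illustrated with a minimal NumPy sketch of scaled dot-product attention, the core operation inside BERT's encoder layers. This is an illustrative toy only: the function name, dimensions, and random data below are our assumptions, not the authors' fine-tuned BERT pipeline.

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Scaled dot-product attention, the core of BERT-style self-attention.

    Q, K, V: (seq_len, d_k) arrays of query/key/value vectors.
    Returns the attended output and the attention weight matrix.
    """
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)               # pairwise token affinities
    scores -= scores.max(axis=-1, keepdims=True)  # subtract row max for numerical stability
    weights = np.exp(scores)
    weights /= weights.sum(axis=-1, keepdims=True)  # softmax over keys: rows sum to 1
    return weights @ V, weights

# Toy self-attention: 4 "tokens" with 8-dimensional representations,
# attending to themselves (Q = K = V), as in an encoder layer.
rng = np.random.default_rng(0)
X = rng.standard_normal((4, 8))
output, weights = scaled_dot_product_attention(X, X, X)
print(output.shape)   # attended token representations: (4, 8)
print(weights.shape)  # token-to-token attention weights: (4, 4)
```

Each row of `weights` shows how strongly one token attends to every other token; in a fine-tuned classifier, these weights are what let the model focus on schizophrenia-related lexical patterns in the transcript.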