Keywords: Aviation, Computer Science, Field (mathematical analysis), Transformer, Aviation Safety, Artificial Intelligence, Strengths and Weaknesses, Aviation Accident, Natural Language Processing, Data Science, Engineering, Aerospace Engineering, Mathematical Analysis, Philosophy, Mathematics, Electrical Engineering, Epistemology, Voltage
Authors
Xiao Jing,Akul Chennakesavan,Chetan Chandra,Mayank V. Bendarkar,Michelle Kirby,Dimitri N. Mavris
Source
Journal: AIAA AVIATION 2023 Forum
Date: 2023-06-08
Citations: 11
Abstract
View Video Presentation: https://doi.org/10.2514/6.2023-3438.vid

The advent of transformer-based models pre-trained on large-scale text corpora has revolutionized Natural Language Processing (NLP) in recent years. Models such as BERT (Bidirectional Encoder Representations from Transformers) offer powerful tools for understanding contextual information and have achieved impressive results in numerous language understanding tasks. However, their application in the aviation domain remains relatively unexplored. This study discusses the challenges of applying multi-label classification problems on aviation text data. A custom aviation domain specific BERT model (Aviation-BERT) is compared against BERT-base-uncased for anomaly event classification in the Aviation Safety Reporting System (ASRS) data. Aviation-BERT is shown to have superior performance based on multiple metrics. By focusing on the potential of NLP in advancing complex aviation safety report analysis, the present work offers a comprehensive evaluation of BERT on aviation domain datasets and discusses its strengths and weaknesses. This research highlights the significance of domain-specific NLP models in improving the accuracy and efficiency of safety report classification and analysis in the aviation industry.
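Anomaly categories in ASRS reports can co-occur, which is why the abstract frames the task as multi-label (rather than single-label) classification: the classifier emits one score per category and each category is accepted or rejected independently. The sketch below illustrates that decision rule and a micro-averaged F1 metric in plain Python; it is illustrative only — the threshold value and the specific evaluation metrics are assumptions, not details taken from the paper.

```python
import math

def sigmoid(x):
    """Logistic function mapping a raw logit to a probability in (0, 1)."""
    return 1.0 / (1.0 + math.exp(-x))

def predict_labels(logits, threshold=0.5):
    """Multi-label decision rule: keep every category whose sigmoid
    probability clears the threshold (0.5 is an assumed default)."""
    return [i for i, z in enumerate(logits) if sigmoid(z) >= threshold]

def micro_f1(predictions, gold, ):
    """Micro-averaged F1: pool true/false positives and false negatives
    over all examples and labels, then compute a single F1 score."""
    tp = fp = fn = 0
    for pred, true in zip(predictions, gold):
        p, t = set(pred), set(true)
        tp += len(p & t)   # labels correctly predicted
        fp += len(p - t)   # labels predicted but not present
        fn += len(t - p)   # labels present but missed
    if tp == 0:
        return 0.0
    precision = tp / (tp + fp)
    recall = tp / (tp + fn)
    return 2 * precision * recall / (precision + recall)

# One report scored against three hypothetical anomaly categories:
# only categories 0 and 2 clear the 0.5 threshold.
labels = predict_labels([2.0, -1.0, 0.3])  # → [0, 2]
score = micro_f1([labels], [[0, 1]])       # 1 TP, 1 FP, 1 FN → 0.5
```

In practice a fine-tuned BERT model would supply the per-category logits; the thresholding and pooled-count evaluation shown here are standard for multi-label text classification regardless of the underlying encoder.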