Keywords
Workflow, Computer science, Transformer, Deep learning, Learning network, Artificial intelligence, Seismology, Geology, Engineering, Electrical engineering, Database, Voltage
Authors
Randy Harsuko, Tariq Alkhalifah
Source
Journal: Geophysics
[Society of Exploration Geophysicists]
Date: 2024-04-25
Volume/Issue: 1-64
Citations: 1
Identifiers
DOI: 10.1190/geo2023-0403.1
Abstract
StorSeismic is a recently introduced model based on the Transformer network that adapts to various seismic processing tasks through its pretraining and fine-tuning strategy. In the original implementation, StorSeismic utilized a sinusoidal positional encoding and a conventional self-attention mechanism, both borrowed from natural language processing (NLP) applications. For seismic processing, they provided good results but also hinted at limitations in efficiency and expressiveness. We propose modifications to these two key components, utilizing relative positional encoding and low-rank attention matrices as replacements for the standard ones. The proposed changes are tested on processing tasks applied to realistic synthetic data from the Marmousi model and to offshore field data in a sequential strategy, starting with denoising, followed by direct arrival removal, multiple attenuation, and finally root-mean-squared velocity (V_RMS) prediction for normal moveout (NMO) correction. We observe faster pretraining, competitive results on the fine-tuning tasks, and fewer parameters to train compared to the standard model.
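For orientation, the following is a minimal sketch, in PyTorch, of how the two modifications named in the abstract can be combined in one self-attention layer: a learned bias indexed by relative trace offset replaces the sinusoidal positional encoding, and queries/keys are projected into a small rank-r space so each head's attention score matrix has rank at most r (one common way to obtain a low-rank attention matrix and reduce parameters). All class names, sizes, and the specific low-rank construction are illustrative assumptions, not the authors' StorSeismic implementation.

```python
# Sketch only: learned relative positional bias + low-rank query/key projections.
# Shapes and hyperparameters are assumptions for illustration.
import torch
import torch.nn as nn
import torch.nn.functional as F


class LowRankRelativeSelfAttention(nn.Module):
    def __init__(self, d_model=256, n_heads=8, rank=8, max_len=512):
        super().__init__()
        self.n_heads = n_heads
        self.rank = rank
        self.max_len = max_len
        # Low-rank factorization: queries/keys live in a rank-dim space,
        # so each head's score matrix Q K^T has rank <= `rank`.
        self.q_proj = nn.Linear(d_model, n_heads * rank, bias=False)
        self.k_proj = nn.Linear(d_model, n_heads * rank, bias=False)
        self.v_proj = nn.Linear(d_model, d_model, bias=False)
        self.out_proj = nn.Linear(d_model, d_model)
        # One learned scalar per head and per relative offset (-max_len+1 .. max_len-1),
        # replacing absolute sinusoidal positional encodings.
        self.rel_bias = nn.Parameter(torch.zeros(n_heads, 2 * max_len - 1))

    def forward(self, x):
        # x: (batch, seq_len, d_model), e.g. traces of a shot gather as tokens
        b, n, _ = x.shape
        q = self.q_proj(x).view(b, n, self.n_heads, self.rank).transpose(1, 2)
        k = self.k_proj(x).view(b, n, self.n_heads, self.rank).transpose(1, 2)
        v = self.v_proj(x).view(b, n, self.n_heads, -1).transpose(1, 2)

        # (b, h, n, n) score matrix, rank <= `rank` per head before the bias.
        scores = q @ k.transpose(-2, -1) / self.rank ** 0.5

        # Add the relative bias for every (i, j) pair: offset = j - i.
        idx = torch.arange(n, device=x.device)
        offsets = idx[None, :] - idx[:, None] + self.max_len - 1   # values in [0, 2*max_len-2]
        scores = scores + self.rel_bias[:, offsets]                # broadcast over batch

        attn = F.softmax(scores, dim=-1)
        out = (attn @ v).transpose(1, 2).reshape(b, n, -1)
        return self.out_proj(out)


# Usage with assumed shapes: 16 gathers, 128 traces, 256 features per trace.
layer = LowRankRelativeSelfAttention()
y = layer(torch.randn(16, 128, 256))
print(y.shape)  # torch.Size([16, 128, 256])
```

With `rank` smaller than the per-head dimension, the query and key projections carry fewer weights than in standard multi-head attention, which is consistent with the abstract's observation of fewer trainable parameters, though the paper's exact construction may differ. The predicted V_RMS then feeds the standard hyperbolic NMO correction, t(x) = sqrt(t0^2 + x^2 / V_RMS^2), where x is the source-receiver offset and t0 the zero-offset two-way time.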