Abstract

Objective. Automatic segmentation of gastric lesions in ultrasound images is crucial for the early diagnosis and treatment of gastric cancer, the second leading cause of cancer-related deaths worldwide. However, the limited amount of related research and the challenges posed by variable lesion morphology, artifacts, blurred boundaries, and intensity inhomogeneity make accurate segmentation particularly difficult, especially for less-experienced clinicians.

Approach. To address these challenges, we propose a novel attention-integrated pyramid network (AIP-Net) designed to enhance segmentation accuracy and support clinical decision-making. In the encoder phase, the model integrates convolutional neural networks with a lesion boundary detection (BD) module to strengthen lesion-specific feature extraction, while a connected mask captures complex and subtle directional information. In the decoder phase, segmentation performance is further improved through spatial feature fusion and channel processing.

Main Results. Experimental results demonstrate that the proposed method outperforms state-of-the-art segmentation approaches on gastric cancer ultrasound datasets, particularly in cases involving lesions with unclear or ambiguous boundaries. Additional experiments on a breast ultrasound dataset further verify the generalization capability of the proposed model.

Significance. The proposed AIP-Net provides valuable diagnostic support by improving targeting accuracy and facilitating precise diagnosis and treatment planning in gastric ultrasound detection. Its robust performance and strong generalizability also highlight its potential for broader clinical applications in ultrasound-based lesion analysis.