Prostate cancer is the second most common cancer in men, and transrectal ultrasound (TRUS) guided biopsy is the standard procedure for its diagnosis. Accurate prostate segmentation in TRUS images is crucial for precise biopsy, yet manual segmentation is laborious, and automated segmentation is hampered by the low signal-to-noise ratio, blurred boundaries, and artifacts typical of ultrasound. To address these issues, this paper proposes a 3D edge-attention denoising diffusion network that aims for high accuracy and generalizability in prostate segmentation for TRUS-guided biopsy. The network incorporates an edge attention denoising U-Net (EAD U-Net) that extracts and exploits edge information in TRUS images, improving segmentation accuracy in challenging regions of the prostate. To reduce uncertainty and further improve accuracy, a Kalman fusion module applies the Kalman filter to all estimations produced by the EAD U-Net during the reverse diffusion process, yielding an optimal segmentation estimate. The proposed network was evaluated on 1834 3D ultrasound images from two open-source datasets. Comparative experiments show that it surpasses state-of-the-art methods, achieving average Dice similarity coefficients of 92.92% and 94.0%, and 95th-percentile Hausdorff distances of 1.07 mm and 0.77 mm, on the two datasets, demonstrating its potential to facilitate accurate MRI-TRUS fusion guided prostate biopsy.
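
The abstract does not specify the internals of the Kalman fusion module, but the general idea of fusing a sequence of noisy per-step estimates (here, the EAD U-Net's segmentation estimates from the reverse diffusion process) with a Kalman update can be sketched as follows. The function name, the per-voxel scalar formulation, and the assumption that each step supplies an observation-noise variance are all illustrative, not the paper's actual implementation.

```python
import numpy as np

def kalman_fuse(estimates, variances):
    """Fuse successive noisy estimates via a per-element Kalman update.

    estimates : sequence of arrays (or scalars), e.g. segmentation logits
                produced at each reverse diffusion step (hypothetical input).
    variances : matching sequence of observation-noise variances per step.

    Returns the fused estimate and its posterior variance; later, lower-noise
    observations pull the fused state toward themselves more strongly.
    """
    x = np.asarray(estimates[0], dtype=float)  # initial state estimate
    p = np.asarray(variances[0], dtype=float)  # initial estimate variance
    for z, r in zip(estimates[1:], variances[1:]):
        k = p / (p + r)        # Kalman gain: trust in the new observation
        x = x + k * (np.asarray(z, dtype=float) - x)  # state update
        p = (1.0 - k) * p      # variance shrinks as evidence accumulates
    return x, p
```

With equal variances this reduces to a running average; the fused variance after fusing n independent equal-variance estimates is the original variance divided by n, which is the uncertainty-reduction effect the abstract attributes to the fusion module.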