Encrypted traffic classification (ETC) is essential for network security and efficient network management. Despite advances in deep learning, ETC remains challenging because existing models struggle to learn robust, discriminative representations from content-encrypted, highly imbalanced traffic. To address these challenges, we propose BTRFormer, a novel ETC approach that exploits the inherent properties of encryption algorithms to improve classification accuracy. At the core of BTRFormer lies a block-based, multi-layer traffic representation that adopts a 4×4 byte block as its fundamental unit, mirroring the 16-byte blocks on which encryption algorithms operate. This representation preserves the intrinsic structure of encrypted payloads, helping the model learn deep semantic features. A transformer-based model then learns from this multi-layer representation, capturing intra-block, inter-block, and inter-packet dependencies through block-wise attention. Finally, BTRFormer is pre-trained on large-scale unlabeled traffic and fine-tuned with a small number of labeled samples to improve generalization and adaptability. Experimental results on six real-world datasets show that BTRFormer significantly outperforms state-of-the-art methods, demonstrating its effectiveness for encrypted traffic classification and secure network management.
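To make the block-based representation concrete, the following is a minimal sketch of how an encrypted payload could be segmented into 4×4 byte blocks aligned with the 16-byte cipher block size. The function name `payload_to_blocks` and parameters such as `max_blocks` are illustrative assumptions, not BTRFormer's actual preprocessing code.

```python
import numpy as np

BLOCK_BYTES = 16  # one 4x4 block, matching the 16-byte cipher block size

def payload_to_blocks(payload: bytes, max_blocks: int = 32) -> np.ndarray:
    """Reshape an encrypted payload into a sequence of 4x4 byte blocks.

    Returns an array of shape (max_blocks, 4, 4); short payloads are
    zero-padded and long ones truncated. Parameters are illustrative.
    """
    buf = payload[: max_blocks * BLOCK_BYTES]
    buf = buf + b"\x00" * (max_blocks * BLOCK_BYTES - len(buf))  # pad to fixed size
    arr = np.frombuffer(buf, dtype=np.uint8)
    return arr.reshape(max_blocks, 4, 4)

# Example: a 64-byte payload becomes a sequence of 4x4 blocks a model can attend over.
blocks = payload_to_blocks(b"\x17\x03\x03" + bytes(range(61)))
print(blocks.shape)  # (32, 4, 4)
```

Because encryption operates on whole 16-byte blocks, segmenting the payload on these boundaries keeps each block's bytes together rather than splitting them across arbitrary token windows.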
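The hierarchical attention over this representation could be structured as in the sketch below: intra-block attention first relates the 16 bytes within each block, then inter-block attention relates block-level summaries across the packet (inter-packet attention would stack a further layer in the same way). The class name, dimensions, pooling choice, and layer counts here are assumptions for illustration, not BTRFormer's published architecture.

```python
import torch
import torch.nn as nn

class BlockWiseAttention(nn.Module):
    """Minimal sketch of hierarchical block-wise attention: intra-block
    attention over the 16 bytes of each 4x4 block, then inter-block
    attention over per-block summaries. Hyperparameters are illustrative."""

    def __init__(self, d_model: int = 64, n_heads: int = 4):
        super().__init__()
        self.byte_embed = nn.Embedding(256, d_model)  # one embedding per byte value
        self.intra = nn.TransformerEncoderLayer(d_model, n_heads, batch_first=True)
        self.inter = nn.TransformerEncoderLayer(d_model, n_heads, batch_first=True)

    def forward(self, blocks: torch.Tensor) -> torch.Tensor:
        # blocks: (batch, n_blocks, 4, 4) integer byte values in [0, 255]
        b, n, _, _ = blocks.shape
        x = self.byte_embed(blocks.long()).view(b * n, 16, -1)
        x = self.intra(x)                  # intra-block: bytes within a block attend to each other
        x = x.mean(dim=1).view(b, n, -1)   # pool each block to a single token
        return self.inter(x)               # inter-block: blocks attend across the packet

model = BlockWiseAttention()
dummy = torch.randint(0, 256, (2, 32, 4, 4))  # two packets of 32 blocks each
print(model(dummy).shape)  # torch.Size([2, 32, 64])
```

The design intuition is that restricting the first attention stage to block boundaries keeps the model's receptive structure consistent with how the ciphertext was actually produced.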