Lightweight remote sensing image super-resolution (RSISR) methods aim to reconstruct high-resolution remote sensing images (RSIs) while keeping computational complexity low. Previous lightweight model development has primarily focused on the design of convolutional neural networks (CNNs). While CNNs excel at capturing local features, they struggle to establish long-range dependencies. Mamba, a long-range modeling architecture with linear computational complexity, is therefore a viable choice for lightweight models. Based on these considerations, this paper proposes an efficient Mamba-attention network (EMAN) that efficiently captures both the intricate details and the broader semantic information in RSIs. Specifically, we design a multi-scale detail extraction unit (MDEU) and a multi-dimensional Mamba-attention (MDMA) module. In the MDEU, we introduce a multi-scale mechanism and local variance to focus on structural information in RSIs. In the MDMA, we integrate spatial expansion with an atrous-based selective scan mechanism to design an efficient scanning method that establishes global correlations while keeping the model lightweight. In addition, the MDMA models inter-channel correlations to enhance information exchange. We conduct a comprehensive evaluation of the proposed method on two remote sensing datasets and five benchmark super-resolution (SR) datasets. Extensive experiments demonstrate that our method achieves superior performance while maintaining a model complexity comparable to that of other lightweight models.