We propose a scheme for quantum sensing of extremely low temperatures in a quasi-one-dimensional dipolar Bose-Einstein condensate reservoir, using a magnetically driven impurity atom as the quantum sensor. Taking the quantum signal-to-noise ratio (QSNR) as the figure of merit for sensing performance, and fully accounting for the non-Markovian dynamics of the sensor, we demonstrate that an attractive dipolar interaction in the reservoir significantly enhances the estimation efficiency. We further derive an analytical expression for the steady-state estimation efficiency in the long-encoding-time limit, which shows that the optimal QSNR depends on the magnetic field driving the impurity atom. By tuning this field, our scheme achieves high-efficiency temperature sensing at arbitrarily low temperatures, suggesting potential applications in high-resolution quantum thermometry.
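As a generic illustration of the QSNR figure of merit (not the dipolar-BEC model of this work), one can consider the textbook case of a single two-level sensor fully thermalized with its reservoir. With level splitting ω, temperature T, and k_B = 1, the QSNR is T²F(T), where F(T) is the quantum Fisher information with respect to T; for a Gibbs qubit this reduces to x²eˣ/(1+eˣ)² with x = ω/T, which is maximized near x ≈ 2.4. A minimal numerical sketch:

```python
import numpy as np

def qsnr(x):
    """QSNR = T^2 F(T) for a thermalized two-level sensor, x = omega / (k_B T).

    For a Gibbs qubit with excited-state population p = e^{-x} / (1 + e^{-x}),
    the quantum Fisher information for T is F = (dp/dT)^2 / (p (1 - p)),
    which gives T^2 F(T) = x^2 e^x / (1 + e^x)^2 in units with k_B = 1.
    Illustrative only; the paper's driven, non-Markovian sensor differs.
    """
    return x**2 * np.exp(x) / (1.0 + np.exp(x))**2

# Locate the optimal splitting-to-temperature ratio on a fine grid
x = np.linspace(0.1, 10.0, 10000)
x_opt = x[np.argmax(qsnr(x))]
print(round(x_opt, 2))  # optimum near x = 2.40
```

The fixed optimum x = ω/T of this equilibrium probe is what motivates a tunable sensor: adjusting the drive (here, the magnetic field) keeps the sensor near its optimal operating point as T varies.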