We study the effect of localized modes in lattices of size $N$ with parity-time ($\mathcal{PT}$) symmetry. Such modes are arranged in pairs of quasidegenerate levels with splitting $\delta \sim \exp(-N/\xi)$, where $\xi$ is their localization length. The level ``evolution'' with respect to the $\mathcal{PT}$-breaking parameter $\gamma$ shows a cascade of bifurcations during which a pair of real levels becomes complex. The spontaneous $\mathcal{PT}$-symmetry breaking occurs at $\gamma_{\mathcal{PT}} \sim \min\{\delta\}$, thus resulting in an exponentially narrow exact $\mathcal{PT}$ phase. As $N/\xi$ decreases, the exact phase becomes more robust, with $\gamma_{\mathcal{PT}} \sim 1/N^2$, and the distribution $\mathcal{P}(\gamma_{\mathcal{PT}})$ changes from log-normal to semi-Gaussian. Our theory can be tested in the framework of optical lattices.
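As a rough numerical illustration of the mechanism summarized above (not the specific model of this work), the following minimal sketch diagonalizes a generic $\mathcal{PT}$-symmetric tight-binding chain: the real on-site energies are taken mirror-symmetric and disordered (so that localized-mode pairs appear), while an antisymmetric imaginary part $\pm i\gamma$ supplies balanced gain and loss. Scanning $\gamma$ and locating the smallest value at which an eigenvalue acquires a nonzero imaginary part gives an estimate of $\gamma_{\mathcal{PT}}$. The function names (`pt_hamiltonian`, `gamma_pt`) and the parameters `W` (disorder strength) and `seed` are illustrative assumptions.

\begin{verbatim}
import numpy as np

def pt_hamiltonian(N, gamma, W=2.0, seed=0):
    """Illustrative PT-symmetric tight-binding chain (N even, hopping t = 1).

    Assumption: real on-site energies are mirror-symmetric and random
    (strength W), imaginary parts +/- gamma are mirror-antisymmetric
    (gain on one half of the chain, loss on the other).
    """
    rng = np.random.default_rng(seed)
    eps = rng.uniform(-W, W, size=N // 2)
    real_part = np.concatenate([eps, eps[::-1]])          # P-symmetric disorder
    imag_part = gamma * np.concatenate([np.ones(N // 2),
                                        -np.ones(N // 2)])  # balanced gain/loss
    H = np.diag(real_part + 1j * imag_part).astype(complex)
    H += np.diag(np.ones(N - 1), 1) + np.diag(np.ones(N - 1), -1)
    return H

def gamma_pt(N, gammas, tol=1e-10, **kw):
    """Smallest gamma in the scan for which the spectrum becomes complex."""
    for g in gammas:
        E = np.linalg.eigvals(pt_hamiltonian(N, g, **kw))
        if np.max(np.abs(E.imag)) > tol:
            return g
    return None

# The PT-breaking threshold shrinks rapidly with the lattice size N.
for N in (10, 20, 40):
    print(N, gamma_pt(N, np.logspace(-12, 0, 200), seed=1))
\end{verbatim}

In this toy setting the printed threshold drops quickly with $N$ for strong disorder (small $\xi$), consistent with the exponentially narrow exact phase described above, while for weak disorder it decays only algebraically.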