ABSTRACT The famous Tucker decomposition has been widely and successfully used in many fields. However, it often suffers from the curse of dimensionality due to its core tensor and large ranks. To tackle this issue, we introduce an additional core tensor into the Tucker decomposition and propose the so-called double-Tucker (dTucker) decomposition. The additional core can share the ranks of the original Tucker decomposition and hence greatly reduces the number of parameters of the new decomposition. We employ the alternating least squares (ALS) method, exploiting explicit structures in the coefficient matrices of the ALS subproblems, to compute the dTucker decomposition. To characterize these structures, a new tensor product is defined. Its properties, together with the aforementioned structures, motivate an ALS-based randomized algorithm built on the Kronecker sub-sampled randomized Fourier transform for our new decomposition. A special case of this algorithm leads to a more efficient leverage-based random sampling algorithm. These randomized algorithms avoid forming the full coefficient matrices of the ALS subproblems by performing the projection and sampling directly on the factor tensors. Numerical experiments, including tensor reconstruction and multi-view subspace clustering, are presented to test our decomposition and algorithms. They show that the dTucker decomposition can effectively decrease the ranks of the classical decomposition and hence the total number of parameters, and that the randomized algorithms greatly reduce the running time while maintaining similar accuracy. Moreover, the numerical results also show that our decomposition can even outperform the popular tensor train decomposition and the newly developed tensor wheel decomposition in terms of parameter compression.
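For readers unfamiliar with the baseline the abstract builds on, the following is a minimal sketch of the classical Tucker decomposition computed by an ALS-type scheme (higher-order orthogonal iteration). It is illustrative only: it is NOT the paper's dTucker algorithm, whose second core tensor and randomized solvers are not reproduced here, and the function and helper names are our own.

```python
import numpy as np

def unfold(T, n):
    # Mode-n unfolding: move mode n to the front and flatten the rest.
    return np.moveaxis(T, n, 0).reshape(T.shape[n], -1)

def mode_n_product(T, M, n):
    # Multiply mode n of tensor T by matrix M (i.e., M @ unfold_n(T), refolded).
    out = M @ unfold(T, n)
    rest = [T.shape[i] for i in range(T.ndim) if i != n]
    return np.moveaxis(out.reshape([M.shape[0]] + rest), 0, n)

def tucker_hooi(X, ranks, n_iter=20, seed=0):
    # Classical Tucker via higher-order orthogonal iteration (an ALS scheme).
    rng = np.random.default_rng(seed)
    N = X.ndim
    # Random orthonormal initial factors U_n of size (I_n, r_n).
    U = [np.linalg.qr(rng.standard_normal((X.shape[n], r)))[0]
         for n, r in enumerate(ranks)]
    for _ in range(n_iter):
        for n in range(N):
            # Project X onto all current factors except mode n.
            Y = X
            for m in range(N):
                if m != n:
                    Y = mode_n_product(Y, U[m].T, m)
            # Update U_n with leading left singular vectors of the unfolding.
            Un, _, _ = np.linalg.svd(unfold(Y, n), full_matrices=False)
            U[n] = Un[:, :ranks[n]]
    # Core tensor: X contracted with all factor transposes.
    core = X
    for m in range(N):
        core = mode_n_product(core, U[m].T, m)
    return core, U
```

The storage cost this incurs, roughly `prod(ranks)` entries for the core plus the factor matrices, is exactly the curse-of-dimensionality issue the dTucker decomposition targets by sharing ranks through a second core.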