Few-Shot Class-Incremental Learning (FSCIL) aims to continually recognize novel classes from a limited number of samples after training on abundant base-class data, while maintaining performance on the old classes. The challenge, however, is that the limited data of the new classes leads not only to overfitting but also to catastrophic forgetting. To address these two issues, we introduce a causal inference strategy into the mainstream FSCIL framework: by strengthening the learning of causal relationships between features and predictions on perturbed samples, the model is encouraged to acquire more discriminative knowledge in the base training session and to extract more robust features for the unseen classes that emerge in the incremental sessions. In addition, to learn new tasks effectively in the incremental sessions while preventing the model from overfitting to the novel-class data, we freeze the feature extractor and append a Fourier transform module after it during the incremental sessions. This module denoises the features, strengthens the novel-class representations, and suppresses the errors that arise when features of the few available samples are extracted directly from the feature extractor. Extensive experiments on the CIFAR100, Caltech-UCSD Birds-200-2011, and miniImageNet datasets show that our proposed framework achieves state-of-the-art performance on FSCIL. The source code of our framework is available at https://github.com/SWU-CS-MediaLab/CIFSCIL.
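To make the incremental-session idea concrete, the following is a minimal PyTorch sketch of a frozen feature extractor followed by a Fourier-transform step on its output features. The module name `FourierFeatureRefiner`, the fixed low-pass masking, and the `keep_ratio` parameter are illustrative assumptions, not the paper's exact design; they only show one plausible way to denoise features in the frequency domain after the backbone is frozen.

```python
import torch
import torch.fft
import torch.nn as nn


class FourierFeatureRefiner(nn.Module):
    """Hypothetical sketch: refine frozen-extractor features in the frequency domain.

    A fixed low-pass mask keeps the lowest-frequency components of each feature
    vector and discards the rest, acting as a simple denoising step before
    classification. All names and design choices here are illustrative only.
    """

    def __init__(self, feat_dim: int, keep_ratio: float = 0.5):
        super().__init__()
        self.feat_dim = feat_dim
        # Number of low-frequency components (of the rfft output) to keep.
        self.keep = max(1, int((feat_dim // 2 + 1) * keep_ratio))

    def forward(self, feats: torch.Tensor) -> torch.Tensor:
        # feats: (batch, feat_dim) produced by the frozen feature extractor.
        spec = torch.fft.rfft(feats, dim=-1)           # frequency representation
        mask = torch.zeros_like(spec)
        mask[..., : self.keep] = 1.0                   # keep low frequencies, drop the rest
        denoised = torch.fft.irfft(spec * mask, n=self.feat_dim, dim=-1)
        return denoised


def incremental_session_features(backbone: nn.Module,
                                 refiner: FourierFeatureRefiner,
                                 images: torch.Tensor) -> torch.Tensor:
    """Extract features with a frozen backbone, then refine them with the FFT module."""
    backbone.eval()                                    # the feature extractor stays frozen
    with torch.no_grad():
        raw = backbone(images)
    return refiner(raw)
```

In this sketch only the lightweight frequency-domain refinement operates on the few novel-class samples, while the backbone parameters remain untouched, which is the intuition behind freezing the extractor to avoid overfitting in the incremental sessions.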