Multi-task learning (MTL) has become a research hotspot in the analysis of whole-slide histopathological images (WSIs), since it captures representations shared across tasks to improve the performance of each individual task. However, the shared representations learned by MTL are dominated by the tasks present in the training set, which makes them difficult to apply directly to unseen (new) tasks, especially when the unseen tasks differ significantly from the known ones. To address this issue, we develop a Task-Agnostic Feature-Learner (TAFL) for efficient adaptation to unseen clinical tasks, which leverages useful image information from existing tasks for new clinical trials with minimal task-specific modifications. Specifically, we first develop a neural architecture search (NAS) module that automatically designs the network architecture of TAFL. Then, a novel task-level meta-learning algorithm is developed to extract efficient and universal information from the known tasks, improving prediction performance on unseen tasks. We evaluate our method on three publicly available datasets derived from The Cancer Genome Atlas (TCGA) for various clinical prediction tasks (i.e., staging, cancer subtyping, and survival prediction), and the experimental results show that TAFL adapts effectively to unseen tasks with better prediction performance.
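To make the idea of task-level meta-learning concrete, the following is a minimal, hedged sketch of a first-order meta-learning loop in the spirit of Reptile. The actual TAFL algorithm, losses, and architectures are not specified in this abstract; the toy linear-regression tasks, the model, and all hyperparameters below are illustrative assumptions, not the paper's method.

```python
# Illustrative first-order (Reptile-style) task-level meta-learning sketch.
# All task definitions and hyperparameters are assumptions for demonstration.
import numpy as np

rng = np.random.default_rng(0)
W_MEAN = np.array([2.0, -1.0, 0.5])  # shared structure across tasks (assumed)

def make_task():
    """A toy 'task': linear regression whose weights vary around a shared mean."""
    w_true = W_MEAN + 0.1 * rng.normal(size=3)
    X = rng.normal(size=(32, 3))
    y = X @ w_true
    return X, y

def inner_adapt(w, X, y, lr=0.05, steps=5):
    """A few gradient steps of task-specific adaptation from shared weights."""
    for _ in range(steps):
        grad = 2.0 * X.T @ (X @ w - y) / len(y)
        w = w - lr * grad
    return w

def meta_train(n_iters=200, meta_lr=0.1):
    """Outer loop: move the shared initialization toward each task's adapted weights."""
    w = np.zeros(3)
    for _ in range(n_iters):
        X, y = make_task()
        w_adapted = inner_adapt(w, X, y)
        w = w + meta_lr * (w_adapted - w)  # Reptile-style meta-update
    return w

def eval_unseen(w_init, n_tasks=20):
    """Mean loss on new (unseen) tasks after brief adaptation from a given init."""
    losses = []
    for _ in range(n_tasks):
        X, y = make_task()
        w = inner_adapt(w_init, X, y)
        losses.append(float(np.mean((X @ w - y) ** 2)))
    return float(np.mean(losses))

w_meta = meta_train()
# The meta-learned initialization should adapt to unseen tasks with lower
# loss than a naive (zero) initialization, mirroring the adaptation goal.
print(eval_unseen(w_meta), eval_unseen(np.zeros(3)))
```

This sketch only illustrates the general principle of extracting shared information from known tasks to speed adaptation on unseen ones; it does not reflect the NAS module, the WSI feature extractor, or the clinical prediction heads described in the paper.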