Abstract
Physics-Informed Neural Networks (PINNs) represent a rapidly evolving class of scientific machine learning models that tightly integrate physical laws, typically expressed as partial differential equations (PDEs), into neural network training. By embedding differential constraints directly into the loss function, PINNs enable data-efficient, mesh-free, and equation-consistent approximations of complex physical systems. Over the past few years, PINNs have emerged as a compelling framework for a wide range of forward and inverse problems across disciplines such as fluid dynamics, materials science, electromagnetism, biomechanics, and geophysics. This survey provides a comprehensive and critical review of the current state of PINNs. We begin by establishing the mathematical foundations and core architecture of PINNs, illustrating how governing equations, boundary conditions, and measurement data can be unified within a single learning framework. We then explore recent architectural advances and algorithmic innovations, including domain decomposition (e.g., XPINNs), adaptive sampling, spectral PINNs, and stochastic extensions, which address challenges related to scalability, convergence, and uncertainty quantification. Furthermore, we examine benchmark problems, evaluation protocols, and application-specific customizations that have shaped the empirical development of the field. The survey also delves deeply into the limitations of existing approaches, including optimization difficulties, challenges in capturing multi-scale and discontinuous phenomena, generalization gaps, and interpretability concerns. We articulate open research challenges and outline emerging directions such as operator learning, meta-learning, hybrid neural-simulation frameworks, neurosymbolic PINNs, and hardware-efficient implementations. By unifying theory, practice, and future vision, this survey aims to serve as a foundational reference for researchers and practitioners across scientific computing, applied mathematics, and machine learning. As PINNs continue to evolve, they offer the promise of enabling a new paradigm of physics-aware, data-driven modeling that is both computationally efficient and scientifically grounded.
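To make the core construction concrete before the formal treatment in later sections, the following is a minimal illustrative sketch, not drawn from any specific implementation discussed in this survey, of a PINN loss for a one-dimensional Poisson problem u''(x) = f(x) with homogeneous Dirichlet boundary conditions, written in JAX. All function and variable names (init_params, mlp, pde_residual, pinn_loss) are assumptions introduced purely for illustration, and the plain gradient-descent loop stands in for the Adam or L-BFGS optimizers typically used in practice.

```python
# Minimal PINN sketch for u''(x) = f(x) on [0, 1] with u(0) = u(1) = 0.
# f is chosen so that the exact solution is u(x) = sin(pi * x).
import jax
import jax.numpy as jnp

def init_params(key, layers=(1, 32, 32, 1)):
    # Xavier-style initialization of a small fully connected network.
    params = []
    for n_in, n_out in zip(layers[:-1], layers[1:]):
        key, sub = jax.random.split(key)
        w = jax.random.normal(sub, (n_in, n_out)) * jnp.sqrt(2.0 / (n_in + n_out))
        params.append((w, jnp.zeros(n_out)))
    return params

def mlp(params, x):
    # Network surrogate u_theta(x) for a scalar input x.
    h = jnp.atleast_1d(x)
    for w, b in params[:-1]:
        h = jnp.tanh(h @ w + b)
    w, b = params[-1]
    return (h @ w + b)[0]

def pde_residual(params, x):
    # PDE residual r(x) = u''(x) - f(x), using automatic differentiation.
    u_xx = jax.grad(jax.grad(lambda z: mlp(params, z)))(x)
    f = -jnp.pi**2 * jnp.sin(jnp.pi * x)
    return u_xx - f

def pinn_loss(params, x_interior):
    # Composite loss: mean squared PDE residual plus boundary penalties.
    res = jax.vmap(lambda x: pde_residual(params, x))(x_interior)
    bc = mlp(params, 0.0) ** 2 + mlp(params, 1.0) ** 2
    return jnp.mean(res**2) + bc

key = jax.random.PRNGKey(0)
params = init_params(key)
x_int = jnp.linspace(0.01, 0.99, 64)  # interior collocation points

# Plain gradient descent for brevity; Adam or L-BFGS is the usual choice.
lr = 1e-3
for step in range(2000):
    loss, grads = jax.value_and_grad(pinn_loss)(params, x_int)
    params = jax.tree_util.tree_map(lambda p, g: p - lr * g, params, grads)
```

The sketch shows the single learning framework referred to above: the governing equation enters through the residual term at collocation points, the boundary conditions enter as penalty terms, and a data-misfit term over measurements could be added to the same loss in exactly the same way.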