Latent semantic analysis
Probabilistic latent semantic analysis
Representation (politics)
Natural language processing
Knowledge representation and reasoning
Computer science
Artificial intelligence
Cognitive psychology
Psychology
Mathematics
Cognitive science
Political science
Politics
Law
Authors
Thomas K. Landauer, Susan Dumais
Source
Journal: Psychological Review [American Psychological Association]
Date: 1997-04-01
Volume/Issue: 104 (2): 211-240
Citations: 5904
Identifier
DOI: 10.1037/0033-295X.104.2.211
Abstract
How do people know as much as they do with as little information as they get? The problem takes many forms; learning vocabulary from text is an especially dramatic and convenient case for research. A new general theory of acquired similarity and knowledge representation, latent semantic analysis (LSA), is presented and used to successfully simulate such learning and several other psycholinguistic phenomena. By inducing global knowledge indirectly from local co-occurrence data in a large body of representative text, LSA acquired knowledge about the full vocabulary of English at a comparable rate to schoolchildren. LSA uses no prior linguistic or perceptual similarity knowledge; it is based solely on a general mathematical learning method that achieves powerful inductive effects by extracting the right number of dimensions (e.g., 300) to represent objects and contexts. Relations to other theories, phenomena, and problems are sketched.
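The abstract describes LSA's mechanism only at a high level: global knowledge is induced from local co-occurrence data by keeping a reduced number of dimensions. The sketch below is a minimal illustration of the standard LSA recipe (a term-by-context count matrix reduced by truncated singular value decomposition) in plain NumPy. The toy matrix, the absence of term weighting, and the choice k = 2 are assumptions made for illustration only; the paper works with a large representative text corpus and around 300 dimensions.

```python
import numpy as np

# Toy term-by-context matrix (rows: terms, columns: text passages).
# "doctor" and "nurse" never share a passage, but both co-occur with
# "hospital"; "car" and "engine" form a separate cluster.
terms = ["doctor", "nurse", "hospital", "car", "engine"]
X = np.array([
    [2, 0, 0, 0],   # doctor
    [0, 2, 0, 0],   # nurse
    [1, 1, 0, 0],   # hospital
    [0, 0, 2, 1],   # car
    [0, 0, 1, 2],   # engine
], dtype=float)

# Truncated SVD: keep only the top k dimensions. The paper reports that an
# intermediate dimensionality (around 300 for a full-scale corpus) maximizes
# inductive power; k = 2 plays that role for this tiny example.
U, s, Vt = np.linalg.svd(X, full_matrices=False)
k = 2
term_vectors = U[:, :k] * s[:k]      # term coordinates in the latent space

def cosine(a, b):
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

# Indirect induction: doctor and nurse never co-occur directly, yet after
# dimension reduction they end up nearly identical, while doctor and car
# remain unrelated.
print(cosine(term_vectors[terms.index("doctor")],
             term_vectors[terms.index("nurse")]))   # ~1.0
print(cosine(term_vectors[terms.index("doctor")],
             term_vectors[terms.index("car")]))     # ~0.0
```

In the full-rank space (k = 4) "doctor" and "nurse" would be orthogonal; the similarity appears only after truncation, which is the inductive effect the abstract attributes to extracting the right number of dimensions.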