Transformer
Business
Computer Science
Marketing
Engineering
Electrical Engineering
Voltage
Authors
Zhenghua Lu,P. K. Kannan
Identifier
DOI:10.1177/00222437251347268
Abstract
When analyzing a sequence of customer interactions, it is important for firms to understand how these interactions align with key objectives, such as generating qualified customer leads, driving conversion events, or reducing churn. The authors introduce a transformer-based framework that models customer interactions as a sequence, much as large language models treat a sentence as a sequence of words. They propose a heterogeneous-mixture multihead self-attention mechanism that captures individual heterogeneity in touchpoint effects. The model identifies self-attention patterns that reflect both population-level trends and the unique relationships between touchpoints within each customer journey. By assigning varying weights to each attention head, the model accounts for the distinctive aspects of each user's journey. This results in more accurate predictions, enabling precise targeting and outperforming existing approaches such as hidden Markov models, point process models, and long short-term memory (LSTM) models. An empirical application in a multichannel marketing context demonstrates how managers can leverage the model's features to identify high-potential customers for targeting. Extensive simulations further establish the model's superiority over competing approaches. Beyond multichannel marketing, the transformer-based model also has broad applicability to customer journeys in other domains.
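The core idea the abstract describes, multihead self-attention over a customer's touchpoint sequence with customer-specific weights mixing the heads, can be sketched as below. This is a minimal illustrative sketch, not the authors' implementation: the function name, shapes, and the simple softmax gate over heads are all assumptions for exposition.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax.
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def heterogeneous_mixture_attention(X, Wq, Wk, Wv, gate_logits):
    """Sketch of mixture-of-heads self-attention (illustrative, hypothetical API).

    X           : (T, d)      embeddings of one customer's T touchpoints.
    Wq, Wk, Wv  : (H, d, dk)  per-head query/key/value projections.
    gate_logits : (H,)        customer-specific logits weighting the heads,
                              capturing individual heterogeneity.
    Returns     : (T, dk)     head outputs mixed by the customer's gate.
    """
    H, d, dk = Wq.shape
    head_outputs = []
    for h in range(H):
        Q, K, V = X @ Wq[h], X @ Wk[h], X @ Wv[h]
        # (T, T) attention pattern: how each touchpoint attends to earlier ones.
        A = softmax(Q @ K.T / np.sqrt(dk), axis=-1)
        head_outputs.append(A @ V)
    # Per-customer mixture weights over heads; shared heads capture
    # population-level patterns, the gate captures the individual journey.
    g = softmax(gate_logits)
    return np.tensordot(g, np.stack(head_outputs), axes=1)
```

A head shared across the population learns a common attention pattern (e.g., conversions attending to recent email touchpoints), while the gate lets each customer weight those patterns differently, which is the heterogeneity mechanism the abstract emphasizes.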