Keywords
Best linear unbiased prediction; Trait; Quantitative trait locus; Genomic information; Statistics; Absorption; Selection; Biology; Mathematics; Computer science; Computational biology; Genetics; Gene; Machine learning; Genome
Authors
Tu Luan,Øyvind Nordbø,Ina Andersen-Ranberg,Theo H. E. Meuwissen
Abstract
Many quantitative traits measured in breeding programs are genetically correlated. These genetic correlations mean that the measurement of one trait carries information on the others. To exploit this information, multi-trait genomic prediction (MTGP) is preferable to single-trait genomic prediction (STGP). However, MTGP is more difficult to implement than STGP, and it becomes even more challenging when the goal is to exploit not only the information on other traits but also the information on ungenotyped animals. This can be accomplished with either single-step or multistep methods. The single-step method was implemented as single-step genomic best linear unbiased prediction (ssGBLUP) with a multi-trait model. Here, we examined a multistep analysis based on an approach called "Absorption" to achieve the same goal. The Absorption approach absorbs all available information, including the phenotypic information on ungenotyped animals and, where applicable, the information on other traits, into the mixed model equations of the genotyped animals. The multistep analysis consisted of (1) applying the Absorption approach to exploit all available information and (2) running genomic BLUP (GBLUP) prediction on the absorbed dataset. In this study, ssGBLUP and the multistep analysis were applied to five traits in Duroc pigs: slaughter percentage, feed consumption from 40 to 120 kg (FC40_120), days of growth from 40 to 120 kg (D40_120), age at 40 kg (A40), and lean meat percentage. The results showed that MTGP yielded higher accuracy than STGP: on average 0.057 higher for the multistep method and 0.045 higher for ssGBLUP. The multistep method achieved prediction accuracy similar to that of ssGBLUP, but its prediction bias was in general lower.
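For readers unfamiliar with absorption, the step can be illustrated with standard mixed-model-equations algebra. The sketch below assumes absorption is the usual Gaussian-elimination (Schur complement) step on a coefficient matrix partitioned into ungenotyped (subscript 1) and genotyped (subscript 2) animals; it is a generic illustration, not the authors' exact formulation:

\[
\begin{bmatrix} \mathbf{C}_{11} & \mathbf{C}_{12} \\ \mathbf{C}_{21} & \mathbf{C}_{22} \end{bmatrix}
\begin{bmatrix} \hat{\mathbf{u}}_1 \\ \hat{\mathbf{u}}_2 \end{bmatrix}
=
\begin{bmatrix} \mathbf{r}_1 \\ \mathbf{r}_2 \end{bmatrix}
\;\Longrightarrow\;
\left(\mathbf{C}_{22} - \mathbf{C}_{21}\mathbf{C}_{11}^{-1}\mathbf{C}_{12}\right)\hat{\mathbf{u}}_2
= \mathbf{r}_2 - \mathbf{C}_{21}\mathbf{C}_{11}^{-1}\mathbf{r}_1 ,
\]

so the absorbed system involves only the genotyped animals while retaining the information contributed by the ungenotyped ones. A minimal numerical sketch of the same step follows, with illustrative matrices only (not the paper's data or software):

```python
import numpy as np

# Minimal sketch of the absorption (Schur complement) step, assuming the
# mixed model equations are partitioned into ungenotyped (block 1) and
# genotyped (block 2) animals. Values are illustrative only.
rng = np.random.default_rng(0)

n1, n2 = 4, 3                      # ungenotyped, genotyped animals
M = rng.standard_normal((n1 + n2, n1 + n2))
C = M @ M.T + np.eye(n1 + n2)      # symmetric positive-definite coefficient matrix
r = rng.standard_normal(n1 + n2)   # right-hand side of the mixed model equations

C11, C12 = C[:n1, :n1], C[:n1, n1:]
C21, C22 = C[n1:, :n1], C[n1:, n1:]
r1, r2 = r[:n1], r[n1:]

# Absorb block 1 into block 2: Schur complement and adjusted right-hand side.
C_abs = C22 - C21 @ np.linalg.solve(C11, C12)
r_abs = r2 - C21 @ np.linalg.solve(C11, r1)

# Solving the absorbed system gives the same solutions for the genotyped
# animals as solving the full system.
u2_absorbed = np.linalg.solve(C_abs, r_abs)
u_full = np.linalg.solve(C, r)
assert np.allclose(u2_absorbed, u_full[n1:])
```

In the multistep analysis described above, step (2) would then run a GBLUP-type prediction on this absorbed system, which contains only genotyped animals.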