A computer vision image differential approach for automatic detection of aggressive behavior in pigs using deep learning

Keywords: Convolutional neural network, Artificial intelligence, Pooling, Pattern recognition (psychology), Deep learning, Computer science, Dropout (neural networks), Image (mathematics), Sigmoid function, Machine learning, Artificial neural network
Authors
Jasmine Fraser, Harry Aricibasi, Dan Tulpan, Renée Bergeron
Source
Journal: Journal of Animal Science [Oxford University Press]
Volume: 101
Identifier
DOI: 10.1093/jas/skad347
Abstract

Pig aggression is a major problem facing the industry, as it negatively affects both the welfare and the productivity of group-housed pigs. This study aimed to use a supervised deep learning (DL) approach based on a convolutional neural network (CNN) and image differentials to automatically detect aggressive behaviors in pairs of pigs. Pairs of unfamiliar piglets (N = 32) were placed into one of two observation pens for 3 d, where they were video recorded for 1 h each day following mixing, resulting in 16 h of video recordings, of which 1.25 h were selected for modeling. Four different approaches based on the number of frames skipped (1, 5, or 10 for Diff1, Diff5, and Diff10, respectively) and the amalgamation of multiple image differences into one (blended) were used to create four different datasets. Two CNN models were tested, with architectures based on the Visual Geometry Group (VGG) VGG-16 model, consisting of convolutional layers, max-pooling layers, dense layers, and dropout layers. While both models had similar architectures, the second CNN model included stacked convolutional layers. Nine different sigmoid activation function thresholds between 0.1 and 1.0 were evaluated, and a threshold of 0.5 was selected for testing. The stacked CNN model correctly predicted aggressive behaviors with the highest testing accuracy (0.79), precision (0.81), recall (0.77), and area under the curve (0.86) values. When analyzing the model recall for behavior subtype prediction, mounting and mobile non-aggressive behaviors were the hardest to classify (recall = 0.63 and 0.75), while head biting, immobile, and parallel pressing were easy to classify (recall = 0.95, 0.94, and 0.91). Runtimes were also analyzed, with the blended dataset taking four times less time to train and validate than the Diff1, Diff5, and Diff10 datasets. Preprocessing time was reduced by up to 2.3 times for the blended dataset compared to the other datasets, and, when combined with testing runtimes, it satisfied the requirements for real-time systems capable of detecting aggressive behavior in pairs of pigs. Overall, these results show that a CNN and image differential-based deep learning approach can be an effective and computationally efficient technique to automatically detect aggressive behaviors in pigs.
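The image-differential preprocessing described in the abstract (Diff1/Diff5/Diff10, plus a blended variant that amalgamates several differences into one image) can be sketched as follows. This is a minimal NumPy illustration, not the authors' code: the abstract does not specify the amalgamation operator, so mean blending here is an assumption, and the function names are hypothetical.

```python
import numpy as np

def frame_difference(frames, skip):
    """Absolute pixel-wise difference between frames `skip` steps apart.

    frames: array of shape (T, H, W) with uint8 grayscale frames.
    Returns an array of shape (T - skip, H, W).
    """
    frames = frames.astype(np.int16)  # widen to avoid uint8 wraparound
    return np.abs(frames[skip:] - frames[:-skip]).astype(np.uint8)

def blended_difference(frames, skips=(1, 5, 10)):
    """Amalgamate several frame differences into one image per time step.

    Assumption: blending is done by averaging the per-skip difference
    images; the abstract only states that multiple differences are
    combined into one.
    """
    n = len(frames) - max(skips)  # common length across all skips
    diffs = [frame_difference(frames, s)[:n] for s in skips]
    return np.mean(diffs, axis=0).astype(np.uint8)
```

Because the blended variant produces a single input image where the per-skip variants produce three, it reduces both the preprocessing and training workload, which is consistent with the runtime reductions reported above.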

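The threshold evaluation described in the abstract (binarizing a sigmoid output at a cut-off such as 0.5, then scoring precision and recall) can be illustrated with a short NumPy sketch. This is a generic illustration of the metric computation, not the authors' evaluation code, and the function names are hypothetical.

```python
import numpy as np

def classify(probs, threshold=0.5):
    """Binarize sigmoid outputs: 1 = aggressive, 0 = non-aggressive."""
    return (np.asarray(probs) >= threshold).astype(int)

def precision_recall(y_true, y_pred):
    """Precision and recall for binary labels (1 = aggressive)."""
    y_true, y_pred = np.asarray(y_true), np.asarray(y_pred)
    tp = np.sum((y_pred == 1) & (y_true == 1))
    fp = np.sum((y_pred == 1) & (y_true == 0))
    fn = np.sum((y_pred == 0) & (y_true == 1))
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    return precision, recall
```

Sweeping `threshold` over a grid (the study evaluated nine values between 0.1 and 1.0) trades precision against recall; the reported choice of 0.5 balanced the two on the validation data.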