MNIST database
Sigmoid function
Activation function
Convolutional neural network
Computer science
Artificial intelligence
Function (biology)
Artificial neural network
Pattern recognition (psychology)
Evolutionary biology
Biology
Authors
Archana Tomar, Harish Patidar
Identifier
DOI: 10.1109/iccubea58933.2023.10392280
Abstract
In this work, we implemented and evaluated the accuracy of two commonly used activation functions in convolutional neural networks (CNNs), the Rectified Linear Unit (ReLU) and the Sigmoid, as well as a newer activation function called the Residual (Res) activation function. We trained CNN models with each of these activation functions on the MNIST and CIFAR datasets and evaluated their performance. Overall, this work provides insight into the performance of different activation functions in CNNs. On the CIFAR dataset, the Res activation function yields slightly better accuracy than the Leaky ReLU and PReLU activation functions, while rectify_sigmoid achieves a slightly higher accuracy of 89.82% compared to the Res activation function.
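As an illustration of the experimental setup described in the abstract, the following is a minimal sketch of a small CNN in which the activation function is passed in as a parameter, so that ReLU, Sigmoid, Leaky ReLU, and PReLU can be trained on MNIST under otherwise identical settings and their test accuracies compared. The use of PyTorch and all architectural details here are assumptions for illustration, not the authors' implementation; the Res and rectify_sigmoid activations evaluated in the paper are not reproduced, since their formulations are not given in the abstract.

# Hypothetical sketch: compare standard activation functions on MNIST.
# Not the authors' code; architecture and hyperparameters are illustrative.
import torch
import torch.nn as nn
import torch.nn.functional as F
from torchvision import datasets, transforms

class SmallCNN(nn.Module):
    def __init__(self, activation: nn.Module):
        super().__init__()
        self.act = activation                      # swappable activation module
        self.conv1 = nn.Conv2d(1, 16, kernel_size=3, padding=1)
        self.conv2 = nn.Conv2d(16, 32, kernel_size=3, padding=1)
        self.fc1 = nn.Linear(32 * 7 * 7, 128)
        self.fc2 = nn.Linear(128, 10)

    def forward(self, x):
        x = F.max_pool2d(self.act(self.conv1(x)), 2)   # 28x28 -> 14x14
        x = F.max_pool2d(self.act(self.conv2(x)), 2)   # 14x14 -> 7x7
        x = torch.flatten(x, 1)
        x = self.act(self.fc1(x))
        return self.fc2(x)

def test_accuracy(model, loader, device):
    model.eval()
    correct = total = 0
    with torch.no_grad():
        for x, y in loader:
            x, y = x.to(device), y.to(device)
            correct += (model(x).argmax(dim=1) == y).sum().item()
            total += y.numel()
    return correct / total

def train_and_eval(activation, epochs=1, device="cpu"):
    tfm = transforms.ToTensor()
    train = datasets.MNIST("data", train=True, download=True, transform=tfm)
    test = datasets.MNIST("data", train=False, download=True, transform=tfm)
    train_loader = torch.utils.data.DataLoader(train, batch_size=128, shuffle=True)
    test_loader = torch.utils.data.DataLoader(test, batch_size=256)

    model = SmallCNN(activation).to(device)
    opt = torch.optim.Adam(model.parameters(), lr=1e-3)
    for _ in range(epochs):
        model.train()
        for x, y in train_loader:
            x, y = x.to(device), y.to(device)
            opt.zero_grad()
            F.cross_entropy(model(x), y).backward()
            opt.step()
    return test_accuracy(model, test_loader, device)

if __name__ == "__main__":
    # Only standard activations are compared here; the paper's Res and
    # rectify_sigmoid functions would require the authors' definitions.
    for name, act in [("ReLU", nn.ReLU()), ("Sigmoid", nn.Sigmoid()),
                      ("LeakyReLU", nn.LeakyReLU()), ("PReLU", nn.PReLU())]:
        print(name, train_and_eval(act))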