Softmax function
Activation function
Artificial neural network
Function (biology)
Artificial intelligence
Class (philosophy)
Pattern recognition (psychology)
Computer science
Deep neural network
Mathematics
Algorithm
Evolutionary biology
Biology
Source
Journal: Cornell University - arXiv
Date: 2018-01-01
Citations: 1846
Identifier
DOI: 10.48550/arxiv.1803.08375
Abstract
We introduce the use of rectified linear units (ReLU) as the classification function in a deep neural network (DNN). Conventionally, ReLU is used as an activation function in DNNs, with the Softmax function as their classification function. However, there have been several studies on using a classification function other than Softmax, and this study is an addition to those. We accomplish this by taking the activation of the penultimate layer $h_{n - 1}$ in a neural network, then multiplying it by the weight parameters $\theta$ to get the raw scores $o_{i}$. Afterwards, we threshold the raw scores $o_{i}$ at $0$, i.e. $f(o_{i}) = \max(0, o_{i})$, where $f$ is the ReLU function. We provide class predictions $\hat{y}$ through the argmax function, i.e. $\hat{y} = \arg\max f(o)$.
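The scoring-and-prediction step described in the abstract maps directly onto a few lines of array code. Below is a minimal NumPy sketch of that forward step only (it does not cover training); the function names, tensor shapes, and the optional bias term are illustrative assumptions and not taken from the paper.

```python
import numpy as np

def relu(x):
    # Element-wise ReLU: f(o) = max(0, o)
    return np.maximum(0.0, x)

def predict_with_relu_classifier(h_penultimate, theta, bias=None):
    """Score and classify using ReLU in place of Softmax.

    h_penultimate : (batch, d) activations of the penultimate layer h_{n-1}
    theta         : (d, num_classes) weight parameters
    bias          : optional (num_classes,) bias term (an assumption; the
                    abstract only mentions the weights theta)
    """
    # Raw scores o = h_{n-1} @ theta (+ bias)
    o = h_penultimate @ theta
    if bias is not None:
        o = o + bias
    # Threshold the raw scores at 0 with ReLU instead of normalizing with Softmax
    scores = relu(o)
    # Class predictions y_hat = argmax f(o)
    return np.argmax(scores, axis=1), scores

# Toy usage with random values (purely illustrative)
rng = np.random.default_rng(0)
h = rng.normal(size=(4, 8))        # penultimate activations for a batch of 4
theta = rng.normal(size=(8, 3))    # weights mapping to 3 classes
y_hat, _ = predict_with_relu_classifier(h, theta)
print(y_hat)
```

Note that, unlike Softmax, the thresholded scores are not normalized into a probability distribution; the argmax over the ReLU-clipped raw scores is what yields the class prediction.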