Keywords
MNIST database, Computer science, Activation function, Artificial neural network, Function (biology), Artificial intelligence, Radial basis function, Nonlinear system, Net (polyhedron), Feature (linguistics), Kernel (algebra), Radial basis function kernel, Machine learning, Pattern recognition (psychology), Data mining, Kernel method, Mathematics, Support vector machine, Physics, Combinatorics, Philosophy, Biology, Evolutionary biology, Quantum mechanics, Linguistics, Geometry
Authors
Chi-Chun Zhou, Hai-Long Tu, Yi Liu, Jianhua Ma
Source
Journal: Cornell University - arXiv
Date: 2020-01-01
Citations: 2
Identifier
DOI: 10.48550/arxiv.2005.06678
Abstract
A deep neural network for classification tasks essentially consists of two components: feature extractors and function approximators. They usually work as an integrated whole; however, improving either component can improve the performance of the whole algorithm. This paper focuses on designing a new function approximator. Conventionally, a function approximator is built from nonlinear activation functions or nonlinear kernel functions, yielding classical networks such as the feed-forward neural network (MLP) and the radial basis function network (RBF). In this paper, a new function approximator that is effective and efficient is proposed. Instead of designing new activation functions or kernel functions, the proposed network uses a fractional form. For the sake of convenience, we name the network the ratio net. We compare the effectiveness and efficiency of the ratio net against the RBF and the MLP with various activation functions on classification tasks using the MNIST database of handwritten digits and the Internet Movie Database (IMDb), a binary sentiment analysis dataset. The results show that, in most cases, the ratio net converges faster and outperforms both the MLP and the RBF.
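The abstract describes the ratio net only at a high level: a function approximator built from a fractional form rather than a new activation or kernel function. The sketch below illustrates one plausible reading of that idea, a layer whose output is the element-wise quotient of two learned affine maps. The specific parameterization (two `nn.Linear` maps, the absolute-value-plus-epsilon denominator guard, and the layer sizes) is an assumption for illustration, not the authors' exact design.

```python
import torch
import torch.nn as nn

class RatioLayer(nn.Module):
    """Hypothetical ratio-net layer: f(x) = P(x) / Q(x), where P and Q
    are learned affine maps. This is a sketch of the 'fractional form'
    idea from the abstract, not the paper's exact parameterization."""
    def __init__(self, in_features, out_features, eps=1e-3):
        super().__init__()
        self.num = nn.Linear(in_features, out_features)  # numerator P(x)
        self.den = nn.Linear(in_features, out_features)  # denominator Q(x)
        self.eps = eps  # keeps the denominator bounded away from zero

    def forward(self, x):
        # Element-wise quotient of the two affine maps; the abs + eps
        # guard (an assumption here) avoids division by zero.
        return self.num(x) / (self.den(x).abs() + self.eps)

# A toy MNIST-style classifier built from ratio layers (assumed sizes).
model = nn.Sequential(
    nn.Flatten(),
    RatioLayer(28 * 28, 128),
    RatioLayer(128, 10),
)

x = torch.randn(4, 1, 28, 28)  # dummy batch of four 28x28 images
print(model(x).shape)          # torch.Size([4, 10])
```

Because no fixed saturating nonlinearity is involved, the nonlinearity comes entirely from the division itself, which is one way such a fractional form could converge differently from an MLP with conventional activations.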