
Record Details

基函数神经网络逼近能力探讨及全局收敛性分析    (Cited by: 7)

Approximation-Performance and Global-Convergence Analysis of Basis-Function Feedforward Neural Network

Document Type: Journal Article

Chinese Title: 基函数神经网络逼近能力探讨及全局收敛性分析

English Title: Approximation-Performance and Global-Convergence Analysis of Basis-Function Feedforward Neural Network

Authors: Xiao Xiuchun [1,2]; Zhang Yunong [2]; Jiang Xiaohua [2]; Peng Xiaoya [2]

Affiliations: [1] College of Information, Guangdong Ocean University, Zhanjiang 524088, China; [2] School of Information Science and Technology, Sun Yat-sen University, Guangzhou 510275, China

Year: 2009

Volume: 15

Issue: 2

Pages: 4

Chinese Journal Title: 现代计算机 (Modern Computer)

English Journal Title: Modern Computer

Funding: National Natural Science Foundation of China (Nos. 60643004, 60775050); Sun Yat-sen University Scientific Research Start-up Fund; Key Reserve Project Funding

Language: Chinese

Chinese Keywords: basis-function neural network; weights direct determination; global convergence; approximation performance

English Keywords: Basis-Function Neural Network; Weights Direct Determination; Global Convergence; Approximation Performance

Chinese Abstract: A new type of basis-function neural network is constructed. Following the idea of the gradient-descent method, a weight-update formula for the network is given, and the iterative sequence is proved to converge globally to the network's optimal weights. From this, a one-step formula for computing the optimal weights via the pseudoinverse is derived, referred to as the weights-direct-determination method. Theoretical analysis shows that the new network possesses optimal mean-square approximation capability and global convergence, and that the weights-direct-determination method avoids problems that conventional BP neural networks find hard to overcome, such as lengthy iterative computation, the tendency to become trapped in local minima, and the difficulty of selecting a learning rate. Simulation results show that, compared with various improved BP algorithms, it offers faster computation and higher accuracy, and exhibits good noise-filtering behavior.

English Abstract: This paper constructs a new type of basis-function feedforward neural network. A weight-update formula for the constructed network is derived from the gradient-descent method, and its global convergence and least-squares approximation property are then analyzed. Moreover, a weights-direct-determination method based on the pseudoinverse is presented. Theoretical analysis demonstrates that the constructed network remedies the weaknesses of conventional BP neural networks, such as the local-minima phenomenon and the difficulty of choosing a learning rate, since the weights-direct-determination method obtains the optimal weights in a single step without lengthy BP iterative training. Computer simulations substantiate the advantages of the proposed network and its weights-direct-determination method in terms of fast computation and high precision.
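The pseudoinverse-based, one-step weight computation described in the abstracts can be illustrated with a minimal sketch: once the hidden-layer basis functions are fixed, the mean-square-optimal output weights follow directly from the pseudoinverse of the basis-activation matrix, with no iterative BP-style training. The sketch below assumes a power basis and NumPy; the function names basis_matrix and direct_weights and the choice of basis are illustrative assumptions, not the authors' actual implementation.

```python
import numpy as np

# A minimal sketch (not the paper's exact formulation) of the
# weights-direct-determination idea: with fixed basis functions,
# the mean-square-optimal output weights are obtained in one step
# from the pseudoinverse of the basis-activation matrix.

def basis_matrix(x, n_basis):
    # Hypothetical power basis phi_j(x) = x**j, j = 0 .. n_basis-1;
    # the paper's actual basis functions may differ.
    return np.vander(x, N=n_basis, increasing=True)

def direct_weights(x, y, n_basis):
    # One-step least-squares solution w = pinv(Phi) @ y.
    phi = basis_matrix(x, n_basis)
    return np.linalg.pinv(phi) @ y

# Usage: approximate a target function from noisy samples.
x = np.linspace(-1.0, 1.0, 200)
y = np.sin(np.pi * x) + 0.01 * np.random.randn(x.size)  # noisy training targets
w = direct_weights(x, y, n_basis=8)
y_hat = basis_matrix(x, 8) @ w
print("MSE vs. clean target:", np.mean((y_hat - np.sin(np.pi * x)) ** 2))
```

Because the solution is a single least-squares step, there is no learning rate to tune and no risk of stopping at a local minimum of the training error, which is the advantage the abstracts claim over iterative BP training.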

