Improper initialization will cause the algorithm to converge to a local minimum.
不恰当的初始化会造成算法收敛到局域极小值。
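A minimal sketch (not from the source) of how initialization decides which minimum plain gradient descent reaches, using an illustrative quartic f(x) = x⁴ − 3x² + x that has a global minimum near x ≈ −1.366 and a shallower local minimum at x = 1:

```python
def grad(x):
    # Derivative of f(x) = x**4 - 3*x**2 + x.
    return 4 * x**3 - 6 * x + 2

def descend(x, lr=0.01, steps=2000):
    # Plain gradient descent from a given starting point.
    for _ in range(steps):
        x -= lr * grad(x)
    return x

print(round(descend(-2.0), 3))  # -1.366: the global minimum
print(round(descend(2.0), 3))   # 1.0: trapped in the local minimum
```

Both runs use the same algorithm and learning rate; only the starting point differs, so the basin the start falls into determines the answer.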
A stationary point that is neither a local maximum nor a local minimum is called a saddle point.
一个既不是局部极大点又不是局部极小点的平稳点称为一个鞍点。
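The classic illustration (an assumption of mine, not from the source) is f(x, y) = x² − y²: its gradient (2x, −2y) vanishes at the origin, yet the surface rises along one axis and falls along the other:

```python
def f(x, y):
    # The origin is a stationary point of f, but f increases along
    # the x-axis and decreases along the y-axis, so (0, 0) is
    # neither a local maximum nor a local minimum: a saddle point.
    return x**2 - y**2

print(f(0.0, 0.0))      # 0.0
print(f(0.1, 0.0) > 0)  # True: higher along x
print(f(0.0, 0.1) < 0)  # True: lower along y
```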
Ma Jin, a 13-year-old science student in Henan, scored above the local minimum mark for admission to key universities.
河南省有一名13岁理科生马进,成绩超过本科重点线。
The local minimum problem is an important factor that restricts the performance of the classic LBG algorithm.
经典LBG算法的局部极小值问题是制约其性能的重要因素。
Traditional neural network algorithms tend to fall into local minima and converge slowly when applied to fault diagnosis.
传统的神经网络算法应用于故障诊断时,具有易陷入局部极小值,收敛速度较慢等缺点。
The variation of the local friction coefficient is similar to that of the velocity, and a local minimum occurs in the reverse-flow region.
摩擦阻力的变化趋势与速度类似,在产生逆向流处出现局部极小值。
By studying the improved potential field function method, it is found that the method cannot solve all local minimum problems.
研究了改进人工势场法,发现该方法并不能完全解决局部极小点问题。
The strategies of limiting training time and retraining prevent the network from over-training and from being trapped in a local minimum.
在培训过程中采用培训时间控制策略和再学习策略,有效地防止了培训过度,可作为解决局部极小值问题的一种实用措施。
From next month, McDonald's will increase workers' pay to levels 12 to 56 percent above local minimum wages, McDonald's China spokesman George Gu said.
麦当劳公司中国发言人GeorgeGu说,从下个月起,麦当劳将全面上调员工工资,上调后员工的工资水平将高出当地最低工资标准的12%至56%。
Provided that Party B performs normal work within legal working hours, the wage paid to Party B shall not be less than the local minimum wage standard.
乙方在法定工作时间内提供了正常劳动,甲方向乙方支付的工资不得低于当地最低工资标准。
And for the latter, the initial value should be sufficiently close to the solution to avoid troublesome local minima and save precious computation time.
后者需要初值足够接近结果,以避免陷入讨厌的局部极小点并节约宝贵的计算时间。
A weighting strategy solves the problem of becoming trapped in local minima, which is inherent in learning algorithms based on the BP principle.
但是BP网络极容易陷入局部极小(值),应用加权策略解决了此问题。
Then defects of the BP algorithm, such as slow convergence and entrapment in local minima, are pointed out, and the root causes of these defects are presented.
分析了BP算法的基本原理,指出了BP算法具有收敛速度慢、易陷入局部极小点等缺陷以及这些缺陷产生的根源。
To overcome the defect that the BP algorithm for feedforward neural networks falls into local minima when the initial weights are chosen improperly, a new global optimization training algorithm is proposed.
针对前向神经网络BP算法由于初始权值选择不当而陷入局部极小点这一缺陷,提出新的全局优化训练算法。
Compared with past schemes, the proposed one can escape from local minimum points and guarantees convergence of the joint angles to the desired configuration.
与以往的算法相比,所提出的算法可以跳出局部最小点,并使关节收敛到期望构形。
Modeling practice shows that the improved BP algorithm can drive the network error function to a local minimum and improves the fitting accuracy of the model.
建模实践表明,改进后的BP算法可能使网络误差函数达到局部极小点,提高了算法的拟合精度。
Considering that fuzzy C-means clustering algorithms are sensitive to initialization and easily fall into local minima, a novel optimization method is proposed.
针对模糊C均值聚类算法对初始值敏感、易陷入局部最优的缺陷,提出一种新的优化方法。
Because the objective function contains many local minimum points, waveform inversion is of limited effectiveness when processing synthetic data and real data.
由于目标函数中存在大量局部极小点,在处理合成数据和实际数据时,该方法效果很差。
Moreover, an improved conjugate gradient algorithm is used to train the network, overcoming the tendency of gradient learning to become trapped in local minima.
网络训练时采用共轭梯度学习算法并对此算法进行了改进,有效的克服了梯度学习算法容易陷入局部极小的缺点。
This method is robust to Gaussian noise and to constellation rotation caused by the signal's initial phase, and it avoids over-fitting and local minima in neural networks.
这种方法对高斯噪声和星座图由于信号初始相位而引入的旋转具有良好的稳健性,并避免了神经网络中的过学习和局部极小点等缺陷。
An improved BP neural network is proposed to overcome the slow convergence and local minima of the conventional BP neural network.
先对传统的BP人工神经网络进行了分析,针对其收敛速度慢,存在局部极小值的缺点提出了一种改进后的BP人工神将网络。
Therefore, the traditional methods are prone to problems such as model selection, over-fitting, non-linearity, the curse of dimensionality, and local minima.
于是,这些传统方法常常被模型选择与过学习问题、非线性和维数灾难问题、局部极小点问题等困扰。
In this algorithm, a subproblem is set up to search for a new feasible point at which the value of the objective function is lower than the current local minimum.
该算法通过构造子问题来寻找优于当前局部最优解的可行解。
The BP algorithm is the most popular training algorithm for feedforward neural network learning, but falling into local minima and slow convergence are its drawbacks.
BP算法是前馈神经网络训练中应用最多的算法,但其具有收敛慢和陷入局部极值的严重缺点。
The particle swarm optimization (PSO) algorithm is used to train the neural network, overcoming the BP algorithm's drawbacks of local minima and slow convergence.
针对多层前馈网络的误差反传算法存在的收敛速度慢,且易陷入局部极小的缺点,提出了采用微粒群算法(PSO)训练多层前馈网络权值的方法。
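A minimal PSO sketch (an illustrative stand-in, not the paper's implementation) minimizing a quartic with a deceptive local minimum; the function, parameters, and bounds are all assumptions. Because particles share the swarm's best-known position, the swarm can escape a basin that would trap a single gradient run:

```python
import random

def f(x):
    # Global minimum near x = -1.366, shallower local minimum at x = 1.
    return x**4 - 3 * x**2 + x

def pso(n=30, iters=200, w=0.7, c1=1.5, c2=1.5, seed=0):
    rng = random.Random(seed)
    xs = [rng.uniform(-3, 3) for _ in range(n)]  # particle positions
    vs = [0.0] * n                               # particle velocities
    pbest = xs[:]                                # each particle's best position
    gbest = min(pbest, key=f)                    # swarm's best position
    for _ in range(iters):
        for i in range(n):
            # Velocity blends inertia, pull toward the particle's own best,
            # and pull toward the swarm's best.
            vs[i] = (w * vs[i]
                     + c1 * rng.random() * (pbest[i] - xs[i])
                     + c2 * rng.random() * (gbest - xs[i]))
            xs[i] += vs[i]
            if f(xs[i]) < f(pbest[i]):
                pbest[i] = xs[i]
        gbest = min(pbest, key=f)
    return gbest

print(pso())  # converges near the global minimum, x ~ -1.366
```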
The result of the maximum energy method depends on the initial values of the residual statics, and the solution is often trapped in a local minimum of the objective function.
最大能量法的结果受初始静校正量的影响,它的解容易陷入目标函数的局部极小值。
Combining the gradient descent method with chaotic optimization, the neural network model achieves rapid training and avoids local minima when a large number of samples are to be trained.
考虑神经网络在训练大规模样品时易陷入局部极小,用梯度下降法与混沌优化方法相结合,使神经网络实现快速训练的同时,避免陷入局部极小。
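The source pairs gradient descent with chaotic optimization; as a simpler hedged stand-in for that idea, random restarts sample several basins and keep the best descent result (the function and all parameters below are illustrative assumptions):

```python
import random

def f(x):
    # Global minimum near x = -1.366, local minimum at x = 1.
    return x**4 - 3 * x**2 + x

def grad(x):
    return 4 * x**3 - 6 * x + 2

def descend(x, lr=0.01, steps=2000):
    for _ in range(steps):
        x -= lr * grad(x)
    return x

def multistart(restarts=10, seed=0):
    rng = random.Random(seed)
    # Restarting from several random points avoids committing to the
    # basin of a single local minimum; keep the lowest-f result.
    return min((descend(rng.uniform(-3, 3)) for _ in range(restarts)), key=f)

print(multistart())  # typically the global minimum, x ~ -1.366
```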
Simulation results demonstrate that the expert system converges quickly in training, avoids falling into local minima, and has good fault tolerance and high stability.
仿真结果表明,此专家系统在训练中能够迅速收敛避免陷入局部极小值,而且同时具有较好的容错性和较高的稳定性。