• The standard particle swarm algorithm easily falls into a local optimum.

  • The BP neural network, being a gradient descent method by nature, easily falls into a local optimum.

  • Falling easily into a local optimum is the salient shortcoming of conventional, so-called deterministic, optimization methods.

  • This method keeps the tracing method's advantages of simple computation and low cost while overcoming the G-S algorithm's tendency to become trapped in a local optimum.

  • However, standard particle swarm optimization easily falls into a local optimum and converges slowly in its later stages.

  • In addition, the hybrid algorithm effectively avoids becoming trapped in a local optimum and does not require an initial feasible solution.

  • The experimental results indicate that the modified particle swarm algorithm is markedly better at breaking away from local optima.

  • Approaches based on artificial neural networks or gradient hill-climbing algorithms have limitations such as restrictions on the form of the objective function and entrapment in local optima.

  • A local optimum of permutation-based chromosomes is defined, and on that basis a hill-climbing algorithm is constructed to find it.
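As a concrete illustration of the sentence above, here is a minimal hill-climbing sketch over permutations (the function names, the swap neighborhood, and the toy distance matrix are illustrative assumptions, not taken from the cited paper). It applies pairwise swaps until no single swap shortens the tour; by definition, the resulting permutation is a local optimum of the swap neighborhood.

```python
import math
from itertools import combinations

def tour_length(perm, dist):
    # Length of the closed tour visiting cities in the order given by perm.
    n = len(perm)
    return sum(dist[perm[i]][perm[(i + 1) % n]] for i in range(n))

def hill_climb(perm, dist):
    # Repeatedly apply improving pairwise swaps; stop when no single swap
    # helps, i.e. when perm is a local optimum of the swap neighborhood.
    perm = list(perm)
    best = tour_length(perm, dist)
    improved = True
    while improved:
        improved = False
        for i, j in combinations(range(len(perm)), 2):
            perm[i], perm[j] = perm[j], perm[i]
            cost = tour_length(perm, dist)
            if cost < best:
                best, improved = cost, True
            else:
                perm[i], perm[j] = perm[j], perm[i]  # undo the swap
    return perm, best

# Toy instance: six points in the plane (coordinates chosen arbitrarily).
pts = [(0, 0), (1, 5), (2, 1), (5, 4), (6, 0), (3, 3)]
dist = [[math.dist(a, b) for b in pts] for a in pts]
perm, best = hill_climb(list(range(6)), dist)
```

The returned tour may or may not be globally optimal; the only guarantee is that no single swap improves it.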

  • The results show that the optimized BP neural network effectively avoids converging to a local optimum and greatly shortens training time.

  • In training the fuzzy neural network's weights, the improved PSO outperforms the BP algorithm in both convergence speed and the ability to jump out of local optima.

  • But a genetic algorithm with a randomly chosen initial population easily drives the result into a local optimum, and more control points are needed to reach higher fitting accuracy.

  • Based on the objective function for the insulation design, a local stepwise iterative method was adopted to compute the optimum economical thickness of multilayer insulation.

  • Experimental results show that the improved algorithm searches better than the traditional PSO and, to a certain extent, avoids falling into local optima.

  • However, tests show that the common genetic algorithm converges slowly and easily gets stuck at a merely local optimum.

  • Adopting a memory-guided search strategy concentrates the search on the local optimum within each memory segment, which avoids the blindness of a global search.

  • To address the particle swarm optimization (PSO) algorithm's tendency to fall into local optima during search, this paper presents a PSO algorithm based on sub-regions.

  • The experimental results show that the proposed method accurately segments lesion regions in PET images, avoids falling into local optima, and has good real-time performance.

  • The new algorithm adds a random mutation operator during the run: by randomly mutating the current best particle, it strengthens the PSO algorithm's ability to break away from local optima.
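A minimal sketch of that idea, assuming a standard PSO on the multimodal Rastrigin benchmark plus an improve-only random mutation of the current best particle (all parameter values, names, and the accept-if-better rule are illustrative assumptions, not details from the paper):

```python
import math
import random

def rastrigin(pos):
    # Multimodal benchmark with many local optima; global minimum 0 at the origin.
    return sum(x * x - 10 * math.cos(2 * math.pi * x) + 10 for x in pos)

def pso_with_mutation(f, dim=2, swarm=20, iters=200, seed=1):
    rng = random.Random(seed)
    lo, hi = -5.12, 5.12
    xs = [[rng.uniform(lo, hi) for _ in range(dim)] for _ in range(swarm)]
    vs = [[0.0] * dim for _ in range(swarm)]
    pbest = [list(p) for p in xs]
    pval = [f(p) for p in xs]
    g = min(range(swarm), key=lambda i: pval[i])
    gbest, gval = list(pbest[g]), pval[g]
    w, c1, c2 = 0.7, 1.5, 1.5  # inertia and acceleration coefficients
    for _ in range(iters):
        for i in range(swarm):
            for d in range(dim):
                vs[i][d] = (w * vs[i][d]
                            + c1 * rng.random() * (pbest[i][d] - xs[i][d])
                            + c2 * rng.random() * (gbest[d] - xs[i][d]))
                xs[i][d] = min(hi, max(lo, xs[i][d] + vs[i][d]))
            v = f(xs[i])
            if v < pval[i]:
                pval[i], pbest[i] = v, list(xs[i])
                if v < gval:
                    gval, gbest = v, list(xs[i])
        # Random mutation of the current best particle: perturb a copy of
        # gbest and keep it only if it improves, so the mutation can help
        # the swarm escape a local optimum but never worsens the result.
        trial = [min(hi, max(lo, x + rng.gauss(0, 0.5))) for x in gbest]
        tv = f(trial)
        if tv < gval:
            gval, gbest = tv, trial
    return gbest, gval

gbest, gval = pso_with_mutation(rastrigin)
```

Because only improving mutations are accepted, the best value is monotonically non-increasing over the run.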

  • Examples verify that the algorithm accelerates convergence and improves search efficiency and solution precision while avoiding entrapment in local optima.

  • To solve the problem that the basic particle swarm algorithm cannot easily escape a local optimum, a cooperative particle swarm optimization algorithm is proposed.

  • The BP training algorithm for neural networks, based on gradient descent, easily becomes trapped in a local minimum, so that the network cannot classify input patterns accurately.
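The entrapment described above is easy to reproduce on a toy problem (this is an illustrative sketch of plain gradient descent on an asymmetric double-well function, not the cited work's setup): the method converges to whichever minimum lies downhill from its starting point.

```python
def f(x):
    # Asymmetric double-well: shallow local minimum near x ≈ +1.36,
    # deeper global minimum near x ≈ -1.47.
    return x ** 4 - 4 * x ** 2 + x

def grad(x):
    # Analytic derivative of f.
    return 4 * x ** 3 - 8 * x + 1

def gradient_descent(x, lr=0.01, steps=2000):
    # Plain fixed-step gradient descent: follows the local slope only.
    for _ in range(steps):
        x -= lr * grad(x)
    return x

# Starting on the right slope, descent settles into the shallower local
# minimum; starting on the left, it reaches the global one.
x_local = gradient_descent(2.0)
x_global = gradient_descent(-2.0)
```

Both runs satisfy the first-order condition grad(x) ≈ 0, yet only one of them is the global minimum.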

  • General dynamic clustering algorithms work on static sample data; their clustering results not only depend on the initial classification but also easily fall into a local minimum.

  • Although the traditional K-means algorithm converges quickly and is simple to implement, it is easily trapped in a local optimum and is sensitive to the initial setting.

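The initialization sensitivity mentioned above can be shown with a toy Lloyd's-algorithm run (a self-contained sketch, not code from any source cited here): on the same four points, one initialization converges to the good clustering while another converges to a stable local optimum with a hundred times the inertia.

```python
def kmeans(points, centers, iters=50):
    # Lloyd's algorithm: alternate assignment and mean-update steps.
    for _ in range(iters):
        clusters = [[] for _ in centers]
        for p in points:
            i = min(range(len(centers)),
                    key=lambda j: sum((a - b) ** 2 for a, b in zip(p, centers[j])))
            clusters[i].append(p)
        centers = [tuple(sum(c) / len(c) for c in zip(*cl)) if cl else centers[i]
                   for i, cl in enumerate(clusters)]
    # Inertia: total squared distance of each point to its nearest center.
    inertia = sum(min(sum((a - b) ** 2 for a, b in zip(p, c)) for c in centers)
                  for p in points)
    return centers, inertia

# Four corners of a wide, flat rectangle; the natural k=2 clustering is
# left pair vs right pair.
points = [(0.0, 0.0), (0.0, 1.0), (10.0, 0.0), (10.0, 1.0)]
_, inertia_good = kmeans(points, [(0.0, 0.0), (10.0, 0.0)])  # good init
_, inertia_bad = kmeans(points, [(5.0, 0.0), (5.0, 1.0)])    # bad init
```

The bad initialization is a fixed point of Lloyd's algorithm: no assignment or update step changes it, so the run terminates there even though a far better clustering exists.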
