Enhanced Backpropagation Algorithm by Auto-Tuning Method of Learning Rate using Fuzzy Control System


  • 김광백 (Dept. of Computer Engineering, Silla University)
  • 박충식 (Dept. of Computer Engineering, Youngdong University)
  • Published : 2004.04.01

Abstract

We propose an enhanced backpropagation algorithm that improves the performance of standard backpropagation by auto-tuning the learning rate with a fuzzy control system. The proposed approach addresses the local-minima and slow-convergence problems in two steps. First, if the absolute difference between the target value and the actual output value is smaller than or equal to $\varepsilon$, the pattern is classified as correct; if it is larger than $\varepsilon$, it is classified as incorrect. Second, instead of using a fixed learning rate, the learning rate is adjusted dynamically by a fuzzy control system whose inputs are the numbers of correct and incorrect patterns and whose output is the learning rate. To evaluate the performance of the proposed method, we applied it to the XOR problem and to numeral pattern classification. The experimental results show that the proposed method outperforms conventional backpropagation, backpropagation with momentum, and Jacobs' delta-bar-delta method.

In this paper, we propose an enhanced backpropagation algorithm that automatically adjusts the learning rate with a fuzzy control system in order to improve the performance of the backpropagation algorithm. The proposed method classifies a pattern as correct when the absolute difference between the target value and the output value is smaller than or equal to $\varepsilon$, and as incorrect when it is larger. The numbers of correct and incorrect patterns are fed into the fuzzy control system, which dynamically adjusts the learning rate. Experiments on the XOR problem and on numeral pattern classification confirm that the proposed method performs better than the conventional backpropagation algorithm, the momentum method, and Jacobs' delta-bar-delta method.
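
The abstract specifies only the controller's inputs (the counts of correct and incorrect patterns, where a pattern counts as correct when the absolute difference between target and output is at most $\varepsilon$) and its output (the learning rate). The following Python sketch shows one way such a controller could be realized; the triangular membership functions, the rule base, and the learning-rate levels are illustrative assumptions, not the design reported in the paper.

```python
import numpy as np

def tri(x, a, b, c):
    """Triangular membership function with support [a, c] and peak at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def fuzzy_learning_rate(n_correct, n_incorrect, n_patterns,
                        lr_low=0.05, lr_mid=0.3, lr_high=0.9):
    """Map per-epoch correctness counts to a learning rate.

    A pattern is "correct" when |target - output| <= eps, "incorrect" otherwise
    (as in the abstract). The membership shapes, rules, and output levels below
    are illustrative assumptions.
    """
    c = n_correct / n_patterns      # fraction of correct patterns in the epoch
    i = n_incorrect / n_patterns    # fraction of incorrect patterns

    # Fuzzify each input into LOW / HIGH grades (assumed triangular shapes).
    c_low, c_high = tri(c, -0.5, 0.0, 0.6), tri(c, 0.4, 1.0, 1.5)
    i_low, i_high = tri(i, -0.5, 0.0, 0.6), tri(i, 0.4, 1.0, 1.5)

    # Assumed rule base:
    #   mostly incorrect -> HIGH learning rate (take larger steps)
    #   mixed            -> MID learning rate
    #   mostly correct   -> LOW learning rate (fine-tune near a solution)
    w_high = min(c_low, i_high)
    w_mid = max(min(c_low, i_low), min(c_high, i_high))
    w_low = min(c_high, i_low)

    # Weighted-average (centroid-style) defuzzification over the three levels.
    den = w_low + w_mid + w_high
    if den == 0.0:
        return lr_mid
    return (w_low * lr_low + w_mid * lr_mid + w_high * lr_high) / den

# Example per-epoch use (eps and the network outputs are placeholders):
eps = 0.1
targets = np.array([0.0, 1.0, 1.0, 0.0])   # XOR targets
outputs = np.array([0.2, 0.7, 0.9, 0.4])   # hypothetical network outputs
n_correct = int(np.sum(np.abs(targets - outputs) <= eps))
lr = fuzzy_learning_rate(n_correct, len(targets) - n_correct, len(targets))
```

In each epoch, the patterns whose absolute output error falls within $\varepsilon$ would be counted as correct, the two counts passed to fuzzy_learning_rate, and the returned value used as the learning rate for the next round of weight updates.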

References

  1. R. Hecht-Nielsen, 'Theory of the Backpropagation Neural Network,' Proceedings of IJCNN, Vol.1, pp.593-605, 1989
  2. Peiman G. Maghami and Dean W. Sparks, 'Design of Neural Networks for Fast Convergence and Accuracy: Dynamics and Control,' IEEE Transactions on Neural Networks, Vol.11, No.1, pp.113-123, 2000 https://doi.org/10.1109/72.822515
  3. R. A. Jacobs, 'Increased rates of convergence through learning rate adaptation,' Neural Networks, Vol.1, No.4, pp.295-307, 1988 https://doi.org/10.1016/0893-6080(88)90003-2
  4. J. W. Kim, K. K. Jung and K. H. Eom, 'Auto-Tuning Method of Learning Rate for Performance Improvement of Backpropagation Algorithm,' Journal of Korea Institute of Electronics Engineers, Vol.39, No.4, pp.19-27, 2002
  5. Cheung et al., 'Relative Effectiveness of Training Set Patterns for Backpropagation,' Proceedings of IJCNN, Vol.1, pp.673-678, 1990
  6. M. T. Hagan and M. Menhaj, 'Training Feedforward Networks with the Marquardt Algorithm,' IEEE Transactions on Neural Networks, Vol.5, No.6, 1994
  7. C. Charalambous, 'Conjugate gradient algorithm for efficient training of artificial neural networks,' IEE Proceedings G (Circuits, Devices and Systems), Vol.139, No.3, pp.301-310, 1992
  8. M. Hagiwara, 'Theoretical Derivation of Momentum Term in Backpropagation,' Proceedings of IJCNN, Vol.1, pp.682-686, 1992
  9. Y. Hirose, K. Yamashita and S. Hijiya, 'Backpropagation Algorithm Which Varies the Number of Hidden Units', Neural Networks, Vol.4, pp.61-66, 1991 https://doi.org/10.1016/0893-6080(91)90032-Z
  10. M. Jamshidi, N. Vadiee and T. J. Ross, Fuzzy Logic and Control, Prentice-Hall, 1993