Title page for 965201102



Student Number 965201102
Author Yan-Fong Kuo(郭彥鋒)
Author's Email Address p58905222@yahoo.com.tw
Statistics This thesis has been viewed 1786 times and downloaded 1002 times.
Department Electrical Engineering
Year 2009
Semester 1
Degree Master
Type of Document Master's Thesis
Language zh-TW.Big5 Chinese
Title Comparisons of Neural Network Classifiers Based on Learning Algorithms with Different Structures
Date of Defense 2009-10-06
Page Count 65
Keyword
  • back propagation neural network
  • dynamic fuzzy neural network
  • pruning technique
Abstract This thesis investigates and evaluates neural network classifiers, focusing on the back propagation neural network and the dynamic fuzzy neural network. Both classifiers are further analyzed and improved to achieve high classification accuracy. For the back propagation neural network, the work concentrates on the learning algorithm and adopts the Levenberg-Marquardt method to improve training performance. The discussion of the dynamic fuzzy neural network is divided into two parts: structure learning and parameter learning. Optimal parameter learning is the main work of this study; a pruning technique is applied to the dynamic fuzzy neural network structure, which simplifies the network and makes it easier to operate and implement. Finally, classification experiments on data sets from the UCI repository evaluate the accuracy of both classifiers.
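The Levenberg-Marquardt method named in the abstract replaces plain gradient descent with a damped Gauss-Newton step, w ← w − (JᵀJ + μI)⁻¹Jᵀr. The following is an illustrative sketch only, not the thesis implementation: the toy model, the finite-difference Jacobian, and the damping schedule are all assumptions made for the example.

```python
import numpy as np

# Illustrative Levenberg-Marquardt sketch (assumption: not the thesis code):
# minimize the sum of squared residuals r(w) with damped Gauss-Newton steps
#   w <- w - (J^T J + mu*I)^{-1} J^T r,
# raising the damping mu after a rejected step, lowering it after an accepted one.
def levenberg_marquardt(residual, w0, mu=1e-2, n_iter=50):
    w = np.asarray(w0, dtype=float)
    n = len(w)
    for _ in range(n_iter):
        r = residual(w)
        eps = 1e-6
        # finite-difference Jacobian of the residual vector (one column per parameter)
        J = np.stack([(residual(w + eps * np.eye(n)[j]) - r) / eps
                      for j in range(n)], axis=1)
        step = np.linalg.solve(J.T @ J + mu * np.eye(n), J.T @ r)
        if np.sum(residual(w - step) ** 2) < np.sum(r ** 2):
            w = w - step          # accept: trust the Gauss-Newton model more
            mu = max(mu * 0.5, 1e-12)
        else:
            mu *= 2.0             # reject: back off toward gradient descent
    return w

# Toy single-neuron fit: recover a and b in y = a * tanh(b * x)
x = np.linspace(-2.0, 2.0, 40)
y = 1.5 * np.tanh(0.8 * x)
w_hat = levenberg_marquardt(lambda w: w[0] * np.tanh(w[1] * x) - y, [1.0, 1.0])
```

The damping factor μ interpolates between gradient descent (large μ) and Gauss-Newton (small μ), which is why the method typically converges much faster than plain back propagation training near a minimum.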
Table of Contents
    Chapter 1 Introduction
    1.1 Research Background
    1.2 Research Motivation and Objectives
    1.3 Main Contributions
    1.4 Thesis Organization
    Chapter 2 Architecture and Characteristics of Neural Networks
    2.1 The Nervous System of the Human Brain
    2.1.1 Artificial Neuron Architecture
    2.1.2 Artificial Neuron Processing Model
    2.2 Learning Rules of Neural Networks
    2.2.1 Classification of Neural Network Learning Algorithms
    Chapter 3 Neural-Network-Based Classifiers
    3.1 Introduction
    3.2 Linear Classifiers
    3.2.1 The Perceptron
    3.2.2 Perceptron Activation Functions
    3.2.3 The Perceptron Algorithm
    3.2.4 Perceptron Optimization
    Chapter 4 Static Back Propagation Neural Networks
    4.1 Introduction
    4.2 Back Propagation Network Architecture
    4.2.1 Activation Functions of the Back Propagation Network
    4.2.2 The Back Propagation Algorithm
    4.2.3 Design of the Back Propagation Network
    4.3 Improvements to the Back Propagation Network
    4.3.1 The Momentum Method
    4.3.2 The Levenberg-Marquardt Method
    4.3.3 Convergence Speed Comparison of Training Algorithms
    Chapter 5 Dynamic Fuzzy Neural Networks
    5.1 Introduction
    5.2 Fuzzy Systems
    5.2.1 Fuzzy Sets
    5.2.2 Fuzzy Rules
    5.2.3 Fuzzy Inference Systems
    5.3 Radial Basis Function Neural Networks
    5.4 Dynamic Fuzzy Neural Networks
    5.5 Learning Algorithms for Dynamic Fuzzy Neural Networks
    5.5.1 Rule Generation Criteria
    5.5.2 The Hierarchical Learning Concept
    5.5.3 Premise Parameter Allocation
    5.5.4 Determination of Consequent Parameters
    5.5.5 Improvement of the Network Pruning Technique
    Chapter 6 Experimental Results and Discussion
    6.1 Evaluation of Recognition Rate
    6.1.1 Experiment 1
    6.1.2 Experiment 2
    6.1.3 Experiment 3
    6.2 Discussion
    Chapter 7 Conclusions and Future Work
    7.1 Conclusions
    7.2 Future Work
    References
Advisor Hung-Yuan Chung(鍾鴻源)
Files 965201102.pdf (approved for immediate release)
Date of Submission 2009-10-15


