Title page for 955202006



Student Number 955202006
Author Chin-Yen Yeh(葉錦諺)
Author's Email Address doomforest@yahoo.com.tw
Statistics This thesis has been viewed 1170 times and downloaded 793 times.
Department Computer Science and Information Engineering
Year 2007
Semester 2
Degree Master
Type of Document Master's Thesis
Language zh-TW.Big5 Chinese
Title Image-Based Human Computer Interfaces for People with Severe Disabilities
Date of Defense 2008-07-01
Page Count 74
Keyword
  • Amyotrophic Lateral Sclerosis
  • Communication Aid
  • Eye-Blink Communication
  • Human Computer Interface
Abstract This paper presents an implementation of a low-cost, image-based communication aid system that allows people with disabilities to use their limited voluntary motions to communicate with family and friends, access computers, and control home appliances such as TVs and air conditioners. The system consists of only one low-cost web camera and a personal computer.
The communication aid categorizes daily living necessities into seven groups, namely Voiced Messages, Typing, Home Appliance Control, Help, A/V Entertainments, Web Surfing, and Messages. In addition to these seven selections, Suspend and Exit are two further options. The system provides users with three kinds of operational interfaces. First, people who are still able to control their arms can operate the system through arm movements. Second, people who can still voluntarily open and close their mouths can operate the system by opening and closing the mouth. Third, people with an extreme disability, such as severe cerebral palsy or amyotrophic lateral sclerosis, can operate the system through eye blinks. People with different kinds of disabilities can thus choose the most appropriate of these three interfaces to operate the communication aid.
Experimental results show that the performance of the proposed communication aid is very encouraging.
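The thesis's own detection algorithms are given in Chapter 3 and are not reproduced in this record. As a rough illustration of the eye-blink clicking idea described above, the sketch below assumes OpenCV (cv2), a stock Haar cascade for locating one eye in the single web-camera stream, and a simple frame-difference threshold; the threshold value and the re-detection logic are assumptions for the example, not the thesis's method.

    # Illustrative sketch only: the thesis's blink-detection algorithm (Chapter 3)
    # is not reproduced here. Assumptions: OpenCV (cv2), a stock Haar cascade to
    # locate one eye, and a frame-difference threshold whose value is made up.
    import cv2

    EYE_CASCADE = cv2.CascadeClassifier(cv2.data.haarcascades + "haarcascade_eye.xml")
    DIFF_THRESHOLD = 12.0  # assumed mean absolute difference that counts as a blink-like change

    def find_eye(gray):
        """Return the bounding box (x, y, w, h) of the largest detected eye, or None."""
        eyes = EYE_CASCADE.detectMultiScale(gray, scaleFactor=1.3, minNeighbors=5)
        if len(eyes) == 0:
            return None
        return max(eyes, key=lambda e: e[2] * e[3])

    def main():
        cap = cv2.VideoCapture(0)          # the single low-cost web camera from the abstract
        box, prev_roi = None, None
        while True:
            ok, frame = cap.read()
            if not ok:
                break
            gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
            if box is None:                # (re)locate the eye when no region is known yet
                box = find_eye(gray)
                prev_roi = None
            else:
                x, y, w, h = box
                roi = gray[y:y + h, x:x + w]
                if prev_roi is not None and prev_roi.shape == roi.shape:
                    diff = cv2.absdiff(roi, prev_roi)
                    if diff.mean() > DIFF_THRESHOLD:
                        # A large change inside the eye region is treated as a
                        # blink-like event, i.e. a click/selection in the aid.
                        print("blink-like event -> selection")
                prev_roi = roi
            cv2.imshow("camera", frame)
            if cv2.waitKey(30) & 0xFF == 27:   # Esc quits
                break
        cap.release()
        cv2.destroyAllWindows()

    if __name__ == "__main__":
        main()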
Table of Content
Abstract (Chinese) i
Abstract (English) ii
Acknowledgments iii
Table of Contents iv
List of Figures vi
List of Tables viii
1. Introduction 1
1-1 Research Motivation 1
1-2 Research Objectives 2
1-3 Thesis Organization 2
2. Related Work on Communication Aid Systems 3
2-1 Eye-Blink Communication 3
2-2 Electro-Oculography (EOG) Method 4
2-3 Infrared Video Systems 5
2-4 Infrared Oculography Method 6
2-5 Optical Pupil-Position Tracking Systems 7
3. Communication Aid System Algorithms 10
3-1 System Calibration Algorithm 11
3-1-1 Erosion and Dilation 12
3-1-2 Labeling Algorithm 13
3-2 Automatic Selection of the Clicking Method 15
3-2-1 Eye-Movement Clicking Algorithm 16
3-2-2 Mouth-Movement Clicking Algorithm 26
3-2-3 Large-Motion Clicking Algorithm 29
3-3 Click Verification 31
4. Communication Aid System Interface 32
4-1 Hardware Environment 32
4-2 System Operation Flow 33
4-3 Main System Functions 35
4-3-1 Chinese Speech Database 38
4-3-2 Audio/Video Playback Interface 39
4-3-3 Chinese and English Input Interface 40
4-3-4 Web Browsing Interface 43
4-3-5 Home Appliance Control Interface 44
4-3-6 Help Request Interface 46
4-3-7 Message Sending Interface 47
5. Experimental Results 49
5-1 Eye Detection Test 49
5-2 Eye-Blink Test 50
5-3 Mouth-Movement Test 50
5-4 Large-Motion Test 51
5-5 Chinese Input Interface Comparison and Test 51
5-6 Automatic Clicking-Method Detection Test 53
5-7 Click-Typing Test 54
6. Conclusions and Future Work 56
6-1 Conclusions 56
6-2 Future Work 56
References 58
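Sections 3-1-1 and 3-1-2 above list erosion/dilation and a labeling algorithm as steps of the system calibration. The thesis's implementation is not available in this record; the sketch below only illustrates those two standard operations using OpenCV's erode/dilate and connected-component labeling on a toy binary mask (the kernel size and the toy mask are assumptions for the example).

    # Illustrative sketch only, not the thesis implementation. It shows the
    # standard forms of the two calibration steps named in sections 3-1-1 and
    # 3-1-2: erosion/dilation for noise removal and connected-component labeling.
    # The 3x3 kernel and the toy mask are assumptions for the example.
    import cv2
    import numpy as np

    def clean_and_label(mask):
        """Erode then dilate a binary mask, then label the remaining blobs."""
        kernel = np.ones((3, 3), np.uint8)
        cleaned = cv2.erode(mask, kernel, iterations=1)      # drop isolated noise pixels
        cleaned = cv2.dilate(cleaned, kernel, iterations=1)   # restore the surviving blobs
        num, labels, stats, centroids = cv2.connectedComponentsWithStats(cleaned)
        # Index 0 is the background component; the rest are candidate regions.
        return num - 1, stats[1:], centroids[1:]

    if __name__ == "__main__":
        mask = np.zeros((40, 40), np.uint8)
        mask[5:15, 5:15] = 255     # blob 1
        mask[25:35, 20:30] = 255   # blob 2
        mask[2, 38] = 255          # single-pixel noise, removed by erosion
        count, stats, centroids = clean_and_label(mask)
        print(count, "regions found")  # expected: 2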
Advisor
  • Mu-Chun Su(蘇木春)
Files
  • 955202006.pdf (approved for immediate public access)
Date of Submission 2008-07-17


