

Author: 鄭欽澤
Author (English): Chin-Tse Cheng
Title: 運用正弦函數組合技術於小樣本職能治療推斷之研究
Title (English): Using a sinusoidal combining technology in a small sample in the study of Occupational Therapy
Advisor: 蔡東亦
Advisor (English): Tung-I Tsai
Degree: Master's
Institution: 樹德科技大學
Department: Master's Program, Department of Information Management
Year of Publication: 2011
Graduation Academic Year: 99
Language: Chinese
Pages: 55
Keywords (Chinese): 職能治療、褚氏日常生活評量表、小樣本、類神經網路
Keywords (English): occupational therapy, Daily Living Function Scale, small samples, neural network
Because neural network systems have high memory capacity and efficient learning modes, they can simulate many types of problems and are able to handle non-linear problems. However, a neural network must rely on the information provided by training samples to build its learning model, and only sufficient training data can produce an accurate network. In practice, the number of original data records is often limited, which makes the model's predictions unstable.

As technology advances and medical care improves, mortality rates decline, but the pace of life also quickens and stress increases. Occupational therapy has therefore drawn growing public attention in recent years. Because clinical information is, as a matter of principle, not openly shared, it is difficult to obtain, and physicians' experience is hard to pass on. The Daily Living Function Scale is not widely used, so the amount of sample data that can be collected is limited. How to diagnose patients' conditions accurately despite scarce samples and experience is therefore a central issue.

Because of the neural network's high memory capacity and efficient learning models, it can simulate different types of problems and handle non-linear problems. However, users must provide enough training samples to build reliable learning models. In practice, the amount of training data is often limited, which makes the resulting models unstable.
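The instability described above can be sketched numerically. The toy below is an assumed illustration, not the thesis's method: it fits a plain least-squares line rather than a back-propagation network, but the effect is the same in kind. It refits the same model on many independent draws from one population and measures how much the fitted parameter varies; the spread shrinks markedly as the training set grows.

```python
import random

def ols_slope(xs, ys):
    """Least-squares slope of y on x."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    num = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    den = sum((x - mx) ** 2 for x in xs)
    return num / den

def slope_spread(n_samples, trials=300, seed=1):
    """Fit the same model on `trials` independent draws of n_samples points
    from one population (y = 2x + Gaussian noise) and return the standard
    deviation of the fitted slope across draws -- a direct measure of how
    unstable a model trained on that few samples is."""
    rng = random.Random(seed)
    slopes = []
    for _ in range(trials):
        xs = [rng.uniform(0.0, 1.0) for _ in range(n_samples)]
        ys = [2.0 * x + rng.gauss(0.0, 0.3) for x in xs]
        slopes.append(ols_slope(xs, ys))
    mean = sum(slopes) / trials
    return (sum((s - mean) ** 2 for s in slopes) / trials) ** 0.5

small = slope_spread(5)    # small-sample regime: unstable fits
large = slope_spread(50)   # data-rich regime: stable fits
print(small, large)        # the spread for n=5 is markedly larger
```

The same variance-versus-sample-size behaviour is what makes virtual-sample generation attractive: it effectively enlarges the training set before the network is fitted.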

This study uses a combination of sinusoidal functions to generate virtual factors, which supply critical information that cannot be learned when the number of samples is insufficient. Extreme value theory is then applied to estimate the range of the population, and virtual samples are generated within that range. Finally, these virtual samples are used to construct neural network models suited to small samples.
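The two steps above — widening the sample range toward the population range, then drawing virtual samples inside it — can be sketched as follows. This is an assumed, simplified illustration: the thesis's extreme value theory step is replaced here by the classical order-statistic widening of a uniform sample's range, and the sinusoidal combination is an arbitrary two-term example, not the exact formulation of Chapter 3.

```python
import math
import random

def estimate_population_range(sample):
    """Extrapolate population bounds from a small sample (needs >= 2 points).
    For n uniform draws on [a, b], the expected gap between the sample
    extremes and the true bounds is (b - a) / (n + 1), which equals the
    expected observed range divided by (n - 1) -- so we widen by that much."""
    n = len(sample)
    lo, hi = min(sample), max(sample)
    margin = (hi - lo) / (n - 1)
    return lo - margin, hi + margin

def virtual_samples(sample, k, seed=0):
    """Generate k virtual samples inside the estimated population range,
    warping uniform draws with a combination of two sinusoids (an
    illustrative scheme, not the thesis's exact one)."""
    rng = random.Random(seed)
    lo, hi = estimate_population_range(sample)
    out = []
    for _ in range(k):
        t = rng.random()
        # sum of two sinusoids, scaled so u always stays within (0, 1)
        u = 0.5 + 0.25 * math.sin(2 * math.pi * t) + 0.25 * math.sin(4 * math.pi * t)
        out.append(lo + u * (hi - lo))
    return out
```

For example, `estimate_population_range([0, 1, 2, 3, 4])` widens the observed range [0, 4] to (-1, 5), and every virtual sample then falls inside those estimated bounds, so the augmented training set can cover regions the original small sample missed.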

Abstract  i
Acknowledgements  iii
Contents  iv
List of Tables  vi
List of Figures  vii
Chapter 1 Introduction  1
1.1 Research Motivation  1
1.2 Research Objectives  4
1.3 Research Framework  5
1.4 Thesis Organization  6
Chapter 2 Literature Review  7
2.1 Mental Illness  7
2.1.1 Psychosis  7
2.1.2 Treatment Methods  9
2.2 Measurement Methods  11
2.3 Small-Sample Learning  13
2.3.1 Virtual Samples  13
2.3.2 Information Diffusion  14
2.3.3 Bayesian Networks  22
2.3.4 Support Vector Machines  24
2.4 Neural Networks  26
2.4.1 Basic Concepts of Neural Networks  26
2.4.2 Neural Network Architecture  27
2.4.3 Back-Propagation Networks  30
2.5 Summary  31
Chapter 3 Research Method  32
3.1 Extreme Value Theory  32
3.2 Research Procedure  35
Chapter 4 Results  45
Chapter 5 Conclusions and Suggestions  50
References  53

