Combining Automata Inference with Optimization Algorithms
Abstract
Grammar inference has attracted growing attention owing to its broad range of applications: it has been successfully applied to syntactic pattern recognition [13], speech and natural-language processing, gene analysis, image processing, sequence prediction, information retrieval, cryptography, and so on. Since inferring a grammar is equivalent to inferring the corresponding automaton [14], the author concentrates on obtaining an automaton that recognizes a given set of samples. This thesis studies the application of several optimization algorithms to this field.
    Research on combining neural networks with automata, both in China and abroad, has already produced a series of results. Chapter 1 first gives an overview of these results and of the ideas and methods of this field, and then proposes a constructive algorithm for inferring fuzzy finite-state automata, which settles the problem of determining the number of hidden-layer neurons for the concrete networks used in the simulation experiments. In the experiment, the samples are first fed to a recurrent network with one hidden neuron; after 150 epochs a neuron is added, and the resulting network converges after 124 epochs, whereas the fixed-topology network of Blanco [3] needs 432 epochs to learn the same samples.
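The constructive idea above, start with one hidden unit and grow the hidden layer only when learning stalls, can be sketched as follows. This is a toy feedforward stand-in (the thesis uses recurrent networks), and all sizes, rates, and the XOR task are illustrative choices, not the thesis's actual experimental setup:

```python
import numpy as np

def train_growing_network(X, y, max_hidden=8, epochs_per_stage=150,
                          tol=1e-3, lr=0.5, seed=0):
    """Constructive training sketch: begin with one hidden unit and add
    a fresh unit whenever the loss has not reached `tol` after a stage,
    rather than fixing the hidden-layer size in advance."""
    rng = np.random.default_rng(seed)
    n_in = X.shape[1]
    W1 = rng.normal(scale=0.5, size=(1, n_in))  # hidden-layer weights
    w2 = rng.normal(scale=0.5, size=1)          # output weights

    while True:
        loss = np.inf
        for _ in range(epochs_per_stage):
            h = np.tanh(X @ W1.T)                   # hidden activations
            out = 1.0 / (1.0 + np.exp(-(h @ w2)))   # sigmoid output
            err = out - y
            loss = float(np.mean(err ** 2))
            g = 2.0 * err * out * (1.0 - out) / len(y)  # dL/d(pre-output)
            w2 -= lr * (h.T @ g)
            W1 -= lr * ((np.outer(g, w2) * (1.0 - h ** 2)).T @ X)
        if loss < tol or W1.shape[0] >= max_hidden:
            return W1, w2, loss
        # loss still high after a full stage: grow the hidden layer
        W1 = np.vstack([W1, rng.normal(scale=0.5, size=(1, n_in))])
        w2 = np.append(w2, 0.0)

# usage on XOR (bias folded in as a constant third input column)
X = np.array([[0, 0, 1], [0, 1, 1], [1, 0, 1], [1, 1, 1]], float)
y = np.array([0.0, 1.0, 1.0, 0.0])
W1, w2, loss = train_growing_network(X, y)
```

The point of the sketch is the outer loop: the topology is a result of training rather than an input to it, which is what removes the empirical hidden-layer-size choice of a fixed-topology network.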
    Chapter 2 designs an evolution strategy for the inference of fuzzy finite automata, the (μ,λ)-FA strategy. The strategy represents the transition function and the output function of an automaton in matrix form, and defines a mutation operator matched to this encoding together with a fitness function that measures an automaton individual against a fuzzy training set. Experiments show that the strategy is effective, so it offers a new method for automata inference. Compared with neural-network approaches, this method is more direct and simpler, and it is also widely applicable in other studies. Finally, we give guiding implementation steps for inferring automata (finite-state automata, pushdown automata, fuzzy automata) by combining evolutionary algorithms, which excel at global search, with gradient descent, which excels at local search: first train on the given sample set with an evolutionary algorithm; when the fitness of the best individual in the population stops changing noticeably, or the number of generations reaches a preset value, encode the automaton obtained at that point into a network and train the network until the error reaches the expected value. Once the network is trained, the automaton is extracted from it by an extraction algorithm.
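A minimal sketch of such a matrix-encoded evolution strategy in Python. The fuzzy run uses max-min composition; the state count, mutation step size σ, alphabet, training pairs, and the (μ, λ) = (5, 25) setting are all illustrative assumptions, not the thesis's actual parameters:

```python
import numpy as np

rng = np.random.default_rng(1)

N_STATES, SIGMA, MU, LAM = 3, 0.2, 5, 25  # illustrative sizes

def run_fuzzy_fa(T, out, string):
    """Run a fuzzy finite automaton encoded as matrices: T[a] is the
    fuzzy transition matrix for symbol a, out is the fuzzy output
    (membership) vector over states."""
    mu = np.zeros(N_STATES); mu[0] = 1.0      # start fully in state 0
    for a in string:
        # fuzzy state update: mu'_j = max_i min(mu_i, T[a][i, j])
        mu = np.max(np.minimum(mu[:, None], T[a]), axis=0)
    return np.max(np.minimum(mu, out))        # membership of the string

def fitness(ind, samples):
    """Negative total error on fuzzy training pairs (string, degree)."""
    T, out = ind
    return -sum(abs(run_fuzzy_fa(T, out, s) - d) for s, d in samples)

def mutate(ind):
    """Gaussian perturbation of every matrix entry, clipped to [0, 1]."""
    T, out = ind
    T2 = {a: np.clip(M + rng.normal(0.0, SIGMA, M.shape), 0.0, 1.0)
          for a, M in T.items()}
    return T2, np.clip(out + rng.normal(0.0, SIGMA, out.shape), 0.0, 1.0)

def evolve(samples, alphabet="ab", gens=60):
    """(mu, lambda) selection: lambda offspring, keep the best mu."""
    pop = [({a: rng.random((N_STATES, N_STATES)) for a in alphabet},
            rng.random(N_STATES)) for _ in range(MU)]
    for _ in range(gens):
        offspring = [mutate(pop[rng.integers(MU)]) for _ in range(LAM)]
        offspring.sort(key=lambda ind: fitness(ind, samples), reverse=True)
        pop = offspring[:MU]   # comma selection: parents are discarded
    return pop[0]

# usage on a tiny made-up fuzzy training set
samples = [("a", 1.0), ("b", 0.0), ("aa", 0.8), ("ab", 0.3)]
best_T, best_out = evolve(samples)
```

Because the individual is just a pair of real-valued matrices in [0, 1], the mutation operator is a plain clipped Gaussian perturbation, which is what makes this encoding simpler and more direct than training a recurrent network for the same task.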
    Chapter 3 poses several open problems for further study.
Grammar inference has received increasing attention due to its wide range of applications. This thesis is concerned with the application of several optimization algorithms to this research field.
    In Chapter 1, we begin with a survey of results on grammar inference using neural networks, and then propose a constructive method for inducing fuzzy finite-state automata. This algorithm eliminates the empirical choice of hidden-layer size required by fixed-topology networks and generates a suitable topology automatically. We end the chapter with some problems for future work.
    In Chapter 2, we present an evolution strategy for inferring fuzzy finite-state automata. We give the fitness function of a generated automaton with respect to a set of examples of a fuzzy language, the matrix representation of the transition and output functions of the automaton, and the simple mutation operators that act on these representations. Results are reported on the inference of a fuzzy language.
    Finally, an evolutionary algorithm and a gradient-descent algorithm are combined for grammar inference: first use the evolutionary algorithm to infer an automaton from examples of a language, and then encode the generated automaton into a network and train it. The expected automaton is finally extracted from the trained network.
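The global-then-local control flow described above can be sketched generically: run the evolutionary phase until the best fitness plateaus, then hand the incumbent to a gradient-style local phase. The one-dimensional toy objective and all thresholds below are illustrative stand-ins for the automaton/network machinery:

```python
import numpy as np

def hybrid_search(global_step, local_step, fit, x0,
                  patience=10, eps=1e-4, max_iters=500):
    """Two-phase driver: run the global (evolutionary) phase until the
    best fitness fails to improve by more than `eps` for `patience`
    consecutive iterations, then refine the incumbent locally."""
    best, best_f, stall = x0, fit(x0), 0
    for _ in range(max_iters):            # phase 1: global search
        cand = global_step(best)
        f = fit(cand)
        if f > best_f + eps:
            best, best_f, stall = cand, f, 0
        else:
            stall += 1
        if stall >= patience:             # fitness plateau reached
            break
    for _ in range(max_iters):            # phase 2: local refinement
        best = local_step(best)
    return best

# toy usage: maximize f(x) = -(x - 3)^2
rng = np.random.default_rng(0)
fit = lambda x: -(x - 3.0) ** 2
jump = lambda x: x + rng.normal(0.0, 1.0)        # "mutation" step
grad_step = lambda x: x - 0.1 * 2.0 * (x - 3.0)  # gradient ascent on fit
x_star = hybrid_search(jump, grad_step, fit, x0=0.0)
```

In the thesis's setting, `global_step` would be a generation of the evolution strategy, the hand-off between phases would encode the best automaton into the network's weights, and `local_step` would be a gradient-descent epoch on the network, followed by rule extraction.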
References
[1] J. M. Howie, Automata and Languages, Oxford University Press, 1991.
    [2] C. L. Giles, C. B. Miller, D. Chen, et al., "Learning and extracting finite state automata with second-order recurrent neural networks," Neural Computation 4 (1992) 393-405.
    [3] A. Blanco, M. Delgado, M. C. Pegalajar, "Fuzzy automaton induction using neural network," International Journal of Approximate Reasoning 27 (2001) 1-26.
    [4] C. W. Omlin, C. L. Giles, "Training second-order recurrent neural networks using hints," in Machine Learning: Proceedings of the Ninth International Conference, Morgan Kaufmann, San Mateo, CA.
    [5] M. Boden, J. Wiles, "Context-free and context-sensitive dynamics in recurrent neural networks," Connection Science 12 (3) (2000) 197-210.
    [6] M. Blanco, C. Mantas, M. C. Pegalajar, "Extracting rules from recurrent neural network using Kohonen network," in Proceedings of Information Processing and Management of Uncertainty in Knowledge-Based Systems (IPMU'96), vol. 3, 1996, pp. 1055-1060.
    [7] C. W. Omlin, K. Thornber, C. L. Giles, "Fuzzy finite-state automata can be deterministically encoded into recurrent neural networks," IEEE Transactions on Fuzzy Systems 6 (1) (1998).
    [8] R. J. Williams, et al., "A learning algorithm for continually running fully recurrent neural networks," Neural Computation 1 (1989) 270-280.
    [9] R. J. Williams, et al., "An efficient gradient-based algorithm for on-line training of recurrent network trajectories," Neural Computation 2 (1990) 490-501.
    [10] C. W. Omlin, C. L. Giles, "Recurrent neural networks learn deterministic representations of fuzzy finite-state automata."
    [11] X. Yao, "Evolutionary artificial neural networks," International Journal of Neural Systems 4 (3) (1993) 203-222.
    [12] X. Yao, et al., "A new evolutionary system for evolving artificial neural networks," IEEE Transactions on Neural Networks 8 (2) (1997) 694-713.
    [13] K. Fu, T. Booth, "Grammatical inference: introduction and survey," IEEE Transactions on Pattern Analysis and Machine Intelligence 8 (1986) 343-375.
    [14] T. Kohonen, "The self-organizing map," Proceedings of the IEEE 78 (9) (1990) 1464-1480.
    [15] D. Dubois, H. Prade, Fuzzy Sets and Systems: Theory and Applications, Mathematics in Science and Engineering, vol. 144, Academic Press, New York, 1980, pp. 220-226.
    [16] A. Cleeremans, D. Servan-Schreiber, J. McClelland, "Finite state automata and simple recurrent networks," Neural Computation 1 (3) (1989) 372.
    [17] J. L. Elman, "Finding structure in time," Cognitive Science 14 (1990) 179.
    [18] J. B. Pollack, "The induction of dynamical recognizers," Tech. Rep. 90-JP-Automata, Dept. of Computer and Information Science, Ohio State University, 1990.
    [19] Z. Zeng, R. Goodman, P. Smyth, "Learning finite state machines with self-clustering recurrent networks," Neural Computation 5 (6) (1993) 976-990.
    [20] G. Z. Sun, C. L. Giles, H. H. Chen, Y. C. Lee, "The neural network pushdown automaton: model, stack and learning simulations," Technical Report CS-TR-3118, University of Maryland, College Park, 1993.
    [21] H. P. Schwefel, Evolution and Optimum Seeking, Wiley, New York, 1995.
    [22] S. Fahlman, "The cascade-correlation learning architecture," in Advances in Neural Information Processing Systems 2 (D. Touretzky, ed.), Morgan Kaufmann, San Mateo, CA, 1990, pp. 524-532.
    [23] M. L. Minsky, Computation: Finite and Infinite Machines, Chap. 3.5, Prentice Hall, Englewood Cliffs, NJ, 1967.
    [24] K. Doya, "Recurrent networks: learning algorithms," in M. Arbib (ed.), Handbook of Brain Theory and Neural Networks, 2nd edition, MIT Press.
    [25] C. L. Giles, G. Z. Sun, H. H. Chen, Y. C. Lee, D. Chen, "Higher order recurrent networks and grammatical inference," in Advances in Neural Information Processing Systems 2 (D. S. Touretzky, ed.), Morgan Kaufmann, San Mateo, CA, 1990, p. 380.
    [26] C. L. Giles, D. Chen, C. B. Miller, H. H. Chen, G. Z. Sun, Y. C. Lee, "Grammatical inference using second-order recurrent neural networks," in Proceedings of the International Joint Conference on Neural Networks, IEEE 91CH3049-4, vol. 2, IEEE, 1991, p. 357.
    [27] R. L. Watrous, G. M. Kuhn, "Induction of finite-state languages using second-order recurrent networks," Neural Computation 4 (1992) 406-414.
    [28] Y. D. Liu, G. Z. Sun, H. H. Chen, Y. C. Lee, C. L. Giles, "Grammatical inference and neural network state machines," in Proceedings of the International Joint Conference on Neural Networks, IJCNN-90-WASH-DC, vol. I, Lawrence Erlbaum, Hillsdale, NJ, 1990, p. 285.
    [29] J. L. Elman, "Distributed representation, simple recurrent networks, and grammatical structure," Machine Learning 7 (2/3) (1991) 195-225.
    [30] Min Wan, Zhiwen Mo, "An evolution strategy for fuzzy grammar inference," Journal of University of Electronic Science and Technology of China (Natural Science Edition) (accepted).
