Analysis of Nonsmooth Dynamical Systems Based on Differential Inclusions and Its Applications
Abstract
Based on differential inclusions and nonsmooth analysis, this dissertation systematically studies the dynamical properties of four classes of differential inclusions: delayed neural networks with discontinuous activation functions, subgradient-system neural networks, nonsmooth gradient-like systems, and evolution inclusions with Clarke subdifferential in Hilbert spaces. The results obtained are as follows:
     1. The exponential stability and finite-time convergence of a class of delayed neural networks are studied. Existing results on this problem have been obtained almost exclusively under the assumption that the activation functions are continuous and bounded; this dissertation proves two kinds of stability, exponential stability and finite-time convergence, in the case where the activation functions are discontinuous and unbounded. First, the topological degree theory for set-valued maps is used to prove that the network has a unique equilibrium point. Then, by constructing a Lyapunov function, it is shown that through any initial point the network has a unique global solution, and that this solution converges to the equilibrium point at an exponential rate; that is, the network is exponentially stable. The conclusions of many papers can be viewed as corollaries of this theorem; moreover, its conditions are easy to verify and are robust. Next, under certain conditions, it is proved that every trajectory of the network converges to the equilibrium point in finite time, the so-called finite-time convergence, a phenomenon peculiar to discontinuous systems. Two numerical examples illustrate the applicability of these results.
     2. The dynamical behavior of a class of subgradient-system neural networks is studied. This subgradient system is abstracted from the many neural-network models currently used to solve optimization problems, and the full-range cellular neural network is a special case of it. The existence of global solutions and equilibrium points is proved first. The stability of the system is then investigated; previously known stability results for such systems are quasi-convergence results. Using the nonsmooth Łojasiewicz inequality, this dissertation proves global asymptotic stability: every solution starting from an arbitrary point converges asymptotically to an equilibrium point. A direct corollary is the asymptotic stability of full-range cellular neural networks, which greatly improves earlier stability results for those networks. In addition, the convergence rate of solutions can be computed from the Łojasiewicz exponent. A constrained minimization problem associated with this system is then studied, and it is proved that the set of (asymptotically) stable equilibrium points of the subgradient system coincides exactly with the set of (strict) local minimizers of this constrained problem. Finally, two approximation theorems for the solutions of the subgradient system are given and illustrated with concrete examples.
     3. The dynamical behavior of a class of nonsmooth gradient-like systems is studied. The well-known Hopfield neural network and cellular neural network can both be viewed as special cases of this system. First, the homotopy invariance of topological degree and the maximal monotonicity of the subdifferential of a convex function are used to prove that the system has equilibrium points and global solutions. Then a Lyapunov function is constructed and, by proof by contradiction, the asymptotic stability of the global solutions is obtained. The system is then applied to a class of local minimization problems for nonsmooth functions over the set {0,1}^n and to a class of nonlinear programming problems, with numerical examples given in detail. Finally, the existence of periodic solutions is studied in three cases: (1) the activation function is bounded; (2) the activation function satisfies a sublinear growth condition; (3) the activation function is C^2 and strictly increasing.
     4. The existence of solutions to a class of evolution inclusions in Hilbert spaces is studied. In recent decades, research has concentrated on evolution inclusions with the subdifferential of a convex function; this dissertation studies the more general case of evolution inclusions with the Clarke subdifferential. Compared with the convex subdifferential, the Clarke subdifferential has broader theoretical and practical applications, but it lacks maximal monotonicity, which greatly increases the difficulty of the problem. The dissertation first proves an existence-and-uniqueness theorem for solutions when the perturbation term is single-valued, together with two crucial inequality estimates. Based on these estimates, the continuous selection theorem and the Schauder fixed-point theorem yield an existence theorem for strong solutions when the perturbation is a lower semicontinuous set-valued map. The existence of extremal solutions is then proved via the extremal selection theorem, and on this basis a relaxation theorem is obtained: the extremal solution set is dense in the strong solution set. Finally, these results are applied to two examples of parabolic partial differential equations, for which existence theorems are proved.
Based on differential inclusions and nonsmooth analysis, the dissertation studies the dynamical properties of delayed neural networks with discontinuous activations, neural networks of subgradient-system type, nonsmooth gradient-like systems, and evolution inclusions with Clarke subdifferential in Hilbert space. The main results of the dissertation are as follows:
     1. The exponential stability and finite-time convergence of delayed neural networks are studied. Most existing stability results for such networks rely on the continuity and boundedness of the activation function. In this dissertation, without assuming continuity or boundedness of the activation function, we prove two kinds of stability for such networks: exponential stability and convergence in finite time. Firstly, using the topological degree theory for set-valued maps, we obtain the existence and uniqueness of the equilibrium point. Then, by constructing a Lyapunov function, we prove that every initial value problem has a unique global solution and that this solution converges to the equilibrium point at an exponential rate, i.e., the network is exponentially stable. Many existing results can be regarded as corollaries of this theorem; moreover, its conditions are easily verifiable and robust. Finally, under some mild hypotheses, we prove that every trajectory of the network converges to the equilibrium point in finite time, i.e., convergence in finite time, a phenomenon peculiar to discontinuous systems. Two numerical examples are presented to illustrate the applicability of our results.
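For orientation, networks of this type are usually formalized in the literature on discontinuous activations (e.g. Forti and Nistri) as a delay differential inclusion in the Filippov sense. The following sketch uses illustrative symbols, not the dissertation's own notation:

```latex
% Filippov-type model of a delayed neural network with discontinuous
% activation g (illustrative symbols: D positive diagonal, A and B
% interconnection matrices, \tau > 0 the delay, I a constant input):
\[
  \dot{x}(t) \;\in\; -D\,x(t)
    \;+\; A\,\overline{\mathrm{co}}\!\left[ g\big(x(t)\big) \right]
    \;+\; B\,\overline{\mathrm{co}}\!\left[ g\big(x(t-\tau)\big) \right]
    \;+\; I
  \qquad \text{for a.e.\ } t \ge 0,
\]
% where \overline{\mathrm{co}}[\,\cdot\,] denotes the closed convex hull
% of the nearby values of g (the Filippov regularization), which makes
% solutions well defined despite the jumps in g.
```

Finite-time convergence is possible precisely because the regularized right-hand side need not shrink continuously to zero as the trajectory approaches the equilibrium.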
     2. The dynamical behavior of a class of subgradient-system neural networks is studied. This class can be regarded as a generalization of the neural-network models considered in the optimization context, and full-range cellular neural networks (FR-CNNs) are a special case. At first, we prove the existence of global solutions and equilibrium points, and then study stability. Most stability results for such systems are quasi-convergence results. Using the nonsmooth Łojasiewicz inequality, we prove asymptotic convergence of the trajectories of this subgradient system: starting from any initial point, a trajectory converges to an equilibrium point. As a direct application, this theorem implies the asymptotic stability of FR-CNNs, which greatly improves earlier stability results for them. Moreover, the Łojasiewicz exponent allows the convergence rate of the solution to be computed easily. A constrained minimization problem associated with this network is then studied; it is proved that the constrained local (strict) minimizers of the objective function coincide with the (asymptotically) stable equilibrium points of the network. Finally, we present two theorems on the approximation of solutions of this subgradient system, and several examples are given to illustrate these theorems.
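A minimal sketch of the two central objects of this part, in standard illustrative notation: the subgradient system itself, and the nonsmooth Łojasiewicz inequality that drives the convergence argument.

```latex
% The subgradient system (illustrative standard form):
\[
  \dot{x}(t) \;\in\; -\partial f\big(x(t)\big)
  \qquad \text{for a.e.\ } t \ge 0 .
\]
% Nonsmooth Łojasiewicz inequality near a critical point \bar{x}:
% there exist c > 0 and \theta \in [1/2, 1) such that
\[
  \big|\, f(x) - f(\bar{x}) \,\big|^{\theta}
  \;\le\; c \, \operatorname{dist}\!\big( 0,\ \partial f(x) \big)
  \qquad \text{for all } x \text{ near } \bar{x}.
\]
```

The inequality bounds the length of trajectories near a critical point, which is what upgrades mere quasi-convergence to convergence to a single equilibrium; the exponent θ governs the convergence rate.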
     3. The dynamical behavior of a class of nonsmooth gradient-like systems is studied. The well-known Hopfield neural network and cellular neural network are both special cases of it. Firstly, using the homotopy invariance of topological degree and the maximal monotonicity of the convex subdifferential, we prove the existence and uniqueness of global solutions and equilibrium points of this system. Then, by means of a constructed Lyapunov function and proof by contradiction, we obtain the asymptotic stability of the system. After that, we apply these results to finding local minimum points of a nonsmooth function over {0, 1}^n and to a class of nonlinear programming problems, and some examples are presented to show their applicability. In the end, we investigate the existence of periodic solutions of this nonsmooth gradient-like system in three cases: (1) the activation function is bounded; (2) the activation function satisfies a sublinear growth condition; (3) the activation function is C^2 and strictly increasing.
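As a concrete smooth special case of the gradient-like systems of this part, the classical Hopfield circuit equations (Hopfield, 1984) can be written as follows; the notation is the standard one and is given only for orientation:

```latex
% Classical Hopfield network, a smooth special case of the
% gradient-like systems studied (standard notation):
\[
  C_i \, \dot{u}_i(t)
    \;=\; -\frac{u_i(t)}{R_i}
    \;+\; \sum_{j=1}^{n} T_{ij} \, g_j\big(u_j(t)\big)
    \;+\; I_i ,
  \qquad i = 1, \dots, n,
\]
% with symmetric weights T_{ij} = T_{ji}. Trajectories decrease an
% energy (Lyapunov) function, and this descent property is exactly what
% the nonsmooth gradient-like framework extends to discontinuous g_j.
```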
     4. The existence of solutions of evolution inclusions in Hilbert space is studied. In recent decades, research has focused on evolution inclusions with the convex subdifferential; in this dissertation, we study the more general case of evolution inclusions with the Clarke subdifferential. Compared with the convex subdifferential, the Clarke subdifferential has wider applications in theory and practice; however, it lacks maximal monotonicity, which makes such evolution inclusions considerably harder to study. In this dissertation, we first obtain the existence and uniqueness of the solution in the case where the perturbation is a single-valued function, together with two important inequality estimates. Based on these two inequalities, the continuous selection theorem and the Schauder fixed-point theorem yield an existence theorem for strong solutions of this evolution inclusion when the perturbation is a lower semicontinuous multivalued map. Then, the existence of extremal solutions is proved via an extremal selection theorem, from which we obtain a relaxation theorem: the extremal solution set is dense in the strong solution set of this evolution inclusion. Finally, we apply these results to two examples of parabolic PDEs and obtain existence theorems for their solutions.
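In standard illustrative notation, the evolution inclusions of this part take roughly the following form; the specific symbols are not taken from the dissertation:

```latex
% Evolution inclusion with Clarke subdifferential in a Hilbert space H
% (illustrative standard form):
\[
  -\dot{u}(t) \;\in\; \partial_C \varphi\big(u(t)\big)
      \;+\; F\big(t, u(t)\big)
  \quad \text{a.e.\ on } [0, T],
  \qquad u(0) = u_0 \in H,
\]
% where \varphi : H \to \mathbb{R} is locally Lipschitz, \partial_C
% \varphi is its Clarke subdifferential, and F is the (possibly
% multivalued) perturbation term. When \varphi is convex, \partial_C
% \varphi reduces to the convex subdifferential and the classical
% maximal-monotone semigroup theory applies.
```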