Expressive Power of Evolving Neural Networks Working on Infinite Input Streams
- Authors
- Jérémie Cabessa, Laboratoire d'économie mathématique et de microéconomie appliquée (LEMMA), Université Panthéon-Assas (UP2)
- Olivier Finkel, Institut de Mathématiques de Jussieu - Paris Rive Gauche (IMJ-PRG), Université Pierre et Marie Curie - Paris 6 (UPMC), Université Paris Diderot - Paris 7 (UPD7), Centre National de la Recherche Scientifique (CNRS)
- Editors: Ralf Klasing and Marc Zeitoun
- Subjects
- Computer science, discrete mathematics, distributed computing, neural networks, recurrent neural networks, evolving networks, attractors, formal languages, ω-languages, nondeterminism, topological complexity, Borel sets, analytic sets, effective Borel and analytic sets, uncountable sets, [INFO.INFO-LO]Computer Science [cs]/Logic in Computer Science [cs.LO], [MATH.MATH-LO]Mathematics [math]/Logic [math.LO], [SDV.NEU]Life Sciences [q-bio]/Neurons and Cognition [q-bio.NC]
- Abstract
Evolving recurrent neural networks constitute a natural model of computation beyond the Turing limits. Here, we consider evolving recurrent neural networks working on infinite input streams. The expressive power of these networks is related to their attractor dynamics and is measured by the topological complexity of their underlying neural \(\omega\)-languages. In this context, deterministic and nondeterministic evolving neural networks recognize the (boldface) topological classes of \(BC(\boldsymbol{\Pi}^0_2)\) and \(\boldsymbol{\Sigma}^1_1\) \(\omega\)-languages, respectively. These results can be significantly refined: deterministic and nondeterministic evolving networks that employ \(\alpha \in 2^\omega\) as their sole binary evolving weight recognize the (lightface) relativized topological classes of \(BC(\Pi^0_2)(\alpha)\) and \(\Sigma^1_1(\alpha)\) \(\omega\)-languages, respectively. As a consequence, a proper hierarchy of classes of evolving neural networks, based on the complexity of their underlying evolving weights, is obtained. This hierarchy contains chains of length \(\omega_1\) as well as uncountable antichains.
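The two recognition results stated in the abstract can be written compactly as set equalities. The notation below is an illustrative sketch, not the paper's own: \(\mathrm{EV_{det}}(\alpha)\) and \(\mathrm{EV_{nondet}}(\alpha)\) are hypothetical shorthands for the classes of neural \(\omega\)-languages recognized by deterministic and nondeterministic evolving networks whose sole binary evolving weight is \(\alpha \in 2^\omega\).

```latex
% Sketch of the stated results (EV_det, EV_nondet are hypothetical
% shorthands introduced here, not notation from the paper).
% Lightface, relativized form: for each evolving weight alpha in 2^omega,
\[
  \mathrm{EV_{det}}(\alpha) = BC(\Pi^0_2)(\alpha),
  \qquad
  \mathrm{EV_{nondet}}(\alpha) = \Sigma^1_1(\alpha).
\]
% Taking the union over all parameters alpha recovers the boldface classes,
% since every boldface set is lightface relative to some alpha:
\[
  \bigcup_{\alpha \in 2^\omega} \mathrm{EV_{det}}(\alpha) = BC(\boldsymbol{\Pi}^0_2),
  \qquad
  \bigcup_{\alpha \in 2^\omega} \mathrm{EV_{nondet}}(\alpha) = \boldsymbol{\Sigma}^1_1.
\]
```

The second pair of equalities uses the standard descriptive-set-theoretic fact that the boldface pointclasses are the unions of their lightface relativizations over all real parameters.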
- Published
- 2017