47 results for "Kühn, Tobias"
Search Results
2. Diagrammatics for the Inverse Problem in Spin Systems and Simple Liquids
- Author
-
Kühn, Tobias and van Wijland, Frédéric
- Subjects
Condensed Matter - Statistical Mechanics, Condensed Matter - Disordered Systems and Neural Networks, Condensed Matter - Soft Condensed Matter - Abstract
Modeling complex systems, like neural networks, simple liquids or flocks of birds, often works in reverse to textbook approaches: given data for which averages and correlations are known, we try to find the parameters of a given model consistent with them. In general, no exact calculation directly from the model is available and we are left with expensive numerical approaches. A particular situation is that of a perturbed Gaussian model with polynomial corrections for continuous degrees of freedom. Indeed, perturbation expansions for this case have been worked out over the last 60 years. However, there are models for which the exactly solvable part is non-Gaussian, such as independent Ising spins in a field, or an ideal gas of particles. We implement a diagrammatic perturbative scheme in weak correlations around a non-Gaussian yet solvable probability weight. This applies in particular to spin models (Ising, Potts, Heisenberg) with weak couplings, or to a simple liquid with a weak interaction potential. Our method casts systems with discrete degrees of freedom and those with continuous ones within the same theoretical framework. When the core theory is Gaussian, it reduces to the well-known Feynman diagrammatics., Comment: 34 pages, 3 figures. Equivalent to published version
- Published
- 2022
- Full Text
- View/download PDF
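The kind of weak-coupling expansion around a solvable non-Gaussian reference described in this abstract can be illustrated with a minimal toy check: two Ising spins with a weak coupling J, expanded around independent spins in a field h. The values, function names, and first-order formula below are an illustrative sketch, not the paper's actual diagrammatic scheme.

```python
import itertools
import math

h, J = 0.3, 0.05  # field and weak coupling (illustrative values)

def exact_mean(h, J):
    """Exact <s1> for two Ising spins with weight exp(h*(s1+s2) + J*s1*s2)."""
    Z = m = 0.0
    for s1, s2 in itertools.product((-1, 1), repeat=2):
        w = math.exp(h * (s1 + s2) + J * s1 * s2)
        Z += w
        m += s1 * w
    return m / Z

def first_order(h, J):
    """First order in J around the solvable reference of independent spins:
    <s1> ~ m0 + J*(1 - m0**2)*m0, where m0 = tanh(h) is the unperturbed mean."""
    m0 = math.tanh(h)
    return m0 + J * (1 - m0 ** 2) * m0

print(exact_mean(h, J), first_order(h, J))
```

For small J the two numbers agree up to corrections of order J², which is the sense in which such a perturbative expansion around the non-Gaussian reference is controlled.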
3. Gell-Mann-Low criticality in neural networks
- Author
-
Tiberi, Lorenzo, Stapmanns, Jonas, Kühn, Tobias, Luu, Thomas, Dahmen, David, and Helias, Moritz
- Subjects
Condensed Matter - Disordered Systems and Neural Networks - Abstract
Criticality is deeply related to optimal computational capacity. The lack of a renormalized theory of critical brain dynamics, however, so far limits insights into this form of biological information processing to mean-field results. These methods neglect a key feature of critical systems: the interaction between degrees of freedom across all length scales, which allows for complex nonlinear computation. We present a renormalized theory of a prototypical neural field theory, the stochastic Wilson-Cowan equation. We compute the flow of couplings, which parameterize interactions on increasing length scales. Despite similarities with the Kardar-Parisi-Zhang model, the theory is of a Gell-Mann-Low type, the archetypal form of a renormalizable quantum field theory. Here, nonlinear couplings vanish, flowing towards the Gaussian fixed point, but logarithmically slowly, thus remaining effective on most scales. We show this critical structure of interactions to implement a desirable trade-off between linearity, optimal for information storage, and nonlinearity, required for computation.
- Published
- 2021
- Full Text
- View/download PDF
4. Large Deviations Approach to Random Recurrent Neuronal Networks: Parameter Inference and Fluctuation-Induced Transitions
- Author
-
van Meegen, Alexander, Kühn, Tobias, and Helias, Moritz
- Subjects
Condensed Matter - Disordered Systems and Neural Networks, Quantitative Biology - Neurons and Cognition - Abstract
We here unify the field theoretical approach to neuronal networks with large deviations theory. For a prototypical random recurrent network model with continuous-valued units, we show that the effective action is identical to the rate function and derive the latter using field theory. This rate function takes the form of a Kullback-Leibler divergence which enables data-driven inference of model parameters and calculation of fluctuations beyond mean-field theory. Lastly, we expose a regime with fluctuation-induced transitions between mean-field solutions., Comment: Extension to multiple populations
- Published
- 2020
- Full Text
- View/download PDF
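The Kullback-Leibler form of the rate function mentioned in this abstract implies that parameter inference reduces to minimizing a KL divergence over model parameters. The following is a minimal illustration of that principle with unit-variance Gaussians; the function names and values are hypothetical and not the network model of the paper.

```python
# Toy illustration: a rate function of Kullback-Leibler form is
# minimized by the parameter that matches the data statistics.
def kl_gauss(mu_data, mu_model, var=1.0):
    """KL divergence between two Gaussians with equal variance."""
    return (mu_data - mu_model) ** 2 / (2 * var)

# Scan candidate model parameters and pick the KL-minimizing one.
mus = [m / 10 for m in range(-20, 21)]
best = min(mus, key=lambda m: kl_gauss(0.7, m))
print(best)
```

The minimizer coincides with the data mean, mirroring how a KL-shaped rate function enables data-driven inference of model parameters.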
5. Transient chaotic dimensionality expansion by recurrent networks
- Author
-
Keup, Christian, Kühn, Tobias, Dahmen, David, and Helias, Moritz
- Subjects
Condensed Matter - Disordered Systems and Neural Networks, Quantitative Biology - Neurons and Cognition - Abstract
Neurons in the brain communicate with spikes, which are discrete events in time and value. Functional network models often employ rate units that are continuously coupled by analog signals. Is there a qualitative difference implied by these two forms of signaling? We develop a unified mean-field theory for large random networks to show that first- and second-order statistics in rate and binary networks are in fact identical if rate neurons receive the right amount of noise. Their response to presented stimuli, however, can be radically different. We quantify these differences by studying how nearby state trajectories evolve over time, asking to what extent the dynamics is chaotic. Chaos in the two models is found to be qualitatively different. In binary networks we find a network-size-dependent transition to chaos and a chaotic submanifold whose dimensionality expands stereotypically with time, while rate networks with matched statistics are nonchaotic. Dimensionality expansion in chaotic binary networks aids classification in reservoir computing and optimal performance is reached within about a single activation per neuron; a fast mechanism for computation that we demonstrate also in spiking networks. A generalization of this mechanism extends to rate networks in their respective chaotic regimes., Comment: Final, shortened post-print version. (see v3 for extended material on microscopic and macroscopic chaos)
- Published
- 2020
- Full Text
- View/download PDF
6. Self-consistent formulations for stochastic nonlinear neuronal dynamics
- Author
-
Stapmanns, Jonas, Kühn, Tobias, Dahmen, David, Luu, Thomas, Honerkamp, Carsten, and Helias, Moritz
- Subjects
Condensed Matter - Statistical Mechanics - Abstract
Neural dynamics is often investigated with tools from bifurcation theory. However, many neuron models are stochastic, mimicking fluctuations in the input from unknown parts of the brain or the spiking nature of signals. Noise changes the dynamics with respect to the deterministic model; in particular, bifurcation theory cannot be applied. We formulate stochastic neuronal dynamics in the Martin-Siggia-Rose de Dominicis-Janssen (MSRDJ) formalism and present the fluctuation expansion of the effective action and the functional renormalization group (fRG) as two systematic ways to incorporate corrections to the mean dynamics and time-dependent statistics due to fluctuations in the presence of nonlinear neuronal gain. To formulate self-consistency equations, we derive a fundamental link between the effective action in the Onsager-Machlup (OM) formalism, which allows the study of phase transitions, and the MSRDJ effective action, which is computationally advantageous. These results in particular allow the derivation of an OM effective action for systems with non-Gaussian noise. This approach naturally leads to effective deterministic equations for the first moment of the stochastic system; they explain how nonlinearities and noise cooperate to produce memory effects. Moreover, the MSRDJ formulation yields an effective linear system that has identical power spectra and linear response. Starting from the better known loopwise approximation, we then discuss the use of the fRG as a method to obtain self-consistency beyond the mean. We present a new efficient truncation scheme for the hierarchy of flow equations for the vertex functions by adapting the Blaizot, Méndez and Wschebor approximation from the derivative expansion to the vertex expansion.
The methods are presented by means of the simplest possible example of a stochastic differential equation that has generic features of neuronal dynamics., Comment: Equivalent to published version, including two minor typo fixes in Eq (6) and Fig 6. All conclusions unchanged. 7 figures
- Published
- 2018
- Full Text
- View/download PDF
7. Expansion of the effective action around non-Gaussian theories
- Author
-
Kühn, Tobias and Helias, Moritz
- Subjects
Condensed Matter - Statistical Mechanics, Mathematical Physics - Abstract
This paper derives the Feynman rules for the diagrammatic perturbation expansion of the effective action around an arbitrary solvable problem. The perturbation expansion around a Gaussian theory is well known and composed of one-line irreducible diagrams only. For the expansions around an arbitrary, non-Gaussian problem, we show that a more general class of irreducible diagrams remains in addition to a second set of diagrams that has no analogue in the Gaussian case. The effective action is central to field theory, in particular to the study of phase transitions, symmetry breaking, effective equations of motion, and renormalization. We exemplify the method on the Ising model, where the effective action amounts to the Gibbs free energy, recovering the Thouless-Anderson-Palmer mean-field theory in a fully diagrammatic derivation. Higher order corrections follow with only minimal effort compared to existing techniques. Our results show further that the Plefka expansion and the high-temperature expansion are special cases of the general formalism presented here., Comment: 37 pages, published version
- Published
- 2017
- Full Text
- View/download PDF
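For context, the Thouless-Anderson-Palmer mean-field theory that this abstract recovers diagrammatically is usually written (at unit temperature, with couplings $J_{ij}$ and fields $h_i$; shown here in its standard textbook form as a reminder, not in the paper's notation) as

```latex
m_i = \tanh\Big( h_i + \sum_j J_{ij}\, m_j \;-\; m_i \sum_j J_{ij}^2 \big(1 - m_j^2\big) \Big),
```

where the last term is the Onsager reaction correction that distinguishes TAP from naive mean-field theory.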
8. mlr Tutorial
- Author
-
Schiffner, Julia, Bischl, Bernd, Lang, Michel, Richter, Jakob, Jones, Zachary M., Probst, Philipp, Pfisterer, Florian, Gallo, Mason, Kirchhoff, Dominik, Kühn, Tobias, Thomas, Janek, and Kotthoff, Lars
- Subjects
Computer Science - Learning - Abstract
This document provides an in-depth introduction to the mlr framework for machine learning experiments in R.
- Published
- 2016
9. Locking of correlated neural activity to ongoing oscillations
- Author
-
Kühn, Tobias and Helias, Moritz
- Subjects
Quantitative Biology - Neurons and Cognition, Condensed Matter - Disordered Systems and Neural Networks - Abstract
Population-wide oscillations are ubiquitously observed in mesoscopic signals of cortical activity. In these network states a global oscillatory cycle modulates the propensity of neurons to fire. Synchronous activation of neurons has been hypothesized to be a separate channel of information processing in the brain. A salient question is therefore if and how oscillations interact with spike synchrony and to what extent these channels can be considered separate. Experiments indeed showed that correlated spiking co-modulates with the static firing rate and is also tightly locked to the phase of beta-oscillations. While the dependence of correlations on the mean rate is well understood in feed-forward networks, it remains unclear why and by which mechanisms correlations tightly lock to an oscillatory cycle. We here demonstrate that such correlated activation of pairs of neurons is qualitatively explained by periodically-driven random networks. We identify the mechanisms by which covariances depend on a driving periodic stimulus. Mean-field theory combined with linear response theory yields closed-form expressions for the cyclostationary mean activities and pairwise zero-time-lag covariances of binary recurrent random networks. Two distinct mechanisms cause time-dependent covariances: the modulation of the susceptibility of single neurons (via the external input and network feedback) and the time-varying variances of single unit activities. For some parameters, the effectively inhibitory recurrent feedback leads to resonant covariances even if mean activities show non-resonant behavior. Our analytical results open the question of time-modulated synchronous activity to a quantitative analysis., Comment: 57 pages, 12 figures, published version
- Published
- 2016
- Full Text
- View/download PDF
10. Information content in continuous attractor neural networks is preserved in the presence of moderate disordered background connectivity
- Author
-
Kühn, Tobias, primary and Monasson, Rémi, additional
- Published
- 2023
- Full Text
- View/download PDF
11. On Class Imbalance Correction for Classification Algorithms in Credit Scoring
- Author
-
Bischl, Bernd, Kühn, Tobias, Szepannek, Gero, Lübbecke, Marco, editor, Koster, Arie, editor, Letmathe, Peter, editor, Madlener, Reinhard, editor, Peis, Britta, editor, and Walther, Grit, editor
- Published
- 2016
- Full Text
- View/download PDF
12. Diagrammatics for the inverse problem in spin systems and simple liquids
- Author
-
Kühn, Tobias, primary and van Wijland, Frédéric, additional
- Published
- 2023
- Full Text
- View/download PDF
13. Network inference for spike-count neurons
- Author
-
Kühn, Tobias and Ferrari, Ulisse
- Subjects
Computational Neuroscience, Networks, dynamical systems - Abstract
Bernstein Conference 2022 abstract. http://bernstein-conference.de
- Published
- 2022
- Full Text
- View/download PDF
14. SPACE-M
- Author
-
Blumenthal, Marcel and Kühn, Tobias
- Subjects
FOS: Psychology, face perception, Psychology, Celebrity Familiarity, Social and Behavioral Sciences, self-prioritization - Abstract
Our study refers to the experiment conducted by Woźniak and Knoblich (2019), in which they found the self-prioritization effect (SPE) introduced by Sui et al. (2012) for unfamiliar faces matched with the label “You” and a prioritization for the name of the best friend. We aim to further investigate whether the prioritization observed for the best friend can be explained by a familiarity effect based on personal familiarity and can be replicated with labels of publicly famous politicians' names. We further want to cover the unfamiliar faces with a mask to examine whether the SPE and the potential familiarity/celebrity-prioritization effect are eliminated when fewer facial features are presented, since the faces' identity cannot be fully accessed. Additionally, we expect to further increase reaction times by presenting the mask completely black and without spatial cues.
- Published
- 2022
- Full Text
- View/download PDF
15. Erratum: Self-consistent formulations for stochastic nonlinear neuronal dynamics [Phys. Rev. E 101 , 042124 (2020)]
- Author
-
Stapmanns, Jonas, primary, Kühn, Tobias, additional, Dahmen, David, additional, Luu, Thomas, additional, Honerkamp, Carsten, additional, and Helias, Moritz, additional
- Published
- 2022
- Full Text
- View/download PDF
16. Gell-Mann–Low Criticality in Neural Networks
- Author
-
Tiberi, Lorenzo, primary, Stapmanns, Jonas, additional, Kühn, Tobias, additional, Luu, Thomas, additional, Dahmen, David, additional, and Helias, Moritz, additional
- Published
- 2022
- Full Text
- View/download PDF
17. Large-Deviation Approach to Random Recurrent Neuronal Networks: Parameter Inference and Fluctuation-Induced Transitions
- Author
-
van Meegen, Alexander, primary, Kühn, Tobias, additional, and Helias, Moritz, additional
- Published
- 2021
- Full Text
- View/download PDF
18. Transient Chaotic Dimensionality Expansion by Recurrent Networks
- Author
-
Keup, Christian, primary, Kühn, Tobias, additional, Dahmen, David, additional, and Helias, Moritz, additional
- Published
- 2021
- Full Text
- View/download PDF
19. Transient chaotic SNR amplification
- Author
-
Keup, Christian, Kühn, Tobias, Dahmen, David, and Helias, Moritz
- Abstract
Strongly chaotic non-linear networks strongly separate inputs, but are believed to be useless for classification tasks because also irrelevant (noise) differences within any class are exacerbated, leading to bad generalization. We show this is actually not the case during the initial time period following input presentation: During this time, the representation is dominated by expansion, but not by mixing, and larger differences (between classes) expand faster than smaller differences (within classes). Therefore, the representation is disentangled by the dynamics, and when classifying the network state by linear readouts, the signal-to-noise ratio (SNR) actually increases, before it eventually deteriorates when mixing begins to dominate. We show that this is a general effect in high-dimensional non-linear chaotic systems, and demonstrate it in spiking, continuous rate, and LSTM networks. The transient SNR amplification is always fast (within 50 ms) for spiking networks, while its timescale in continuous valued networks depends on the distance to the edge of chaos. Moreover, this fast, noise-resilient transient disentanglement of representations is in line with empirical evidence: the olfactory bulb, for example, rapidly enhances the separability of sensory representations in a single recurrent layer, being the initial processing stage of a relatively flat hierarchy.
- Published
- 2021
20. Inferring random network parameters from continuous-time trajectories
- Author
-
van Meegen, Alexander, Kühn, Tobias, and Helias, Moritz
- Published
- 2020
21. Large Deviation Approach to Random Recurrent Neuronal Networks: Rate Function, Parameter Inference, and Activity Prediction
- Author
-
van Meegen, Alexander, Kühn, Tobias, and Helias, Moritz
- Abstract
Statistical field theory captures collective non-equilibrium dynamics of neuronal networks, but it does not address the inverse problem of searching the connectivity to implement a desired dynamics. We here show for an analytically solvable network model that the effective action in statistical field theory is identical to the rate function in large deviation theory; using field theoretical methods we derive this rate function. It takes the form of a Kullback-Leibler divergence and enables data-driven inference of model parameters and Bayesian prediction of time series.
- Published
- 2020
22. Self-consistent formulations for stochastic nonlinear neuronal dynamics
- Author
-
Stapmanns, Jonas, primary, Kühn, Tobias, additional, Dahmen, David, additional, Luu, Thomas, additional, Honerkamp, Carsten, additional, and Helias, Moritz, additional
- Published
- 2020
- Full Text
- View/download PDF
23. Renormalization Group for Spatially Extended Neuronal Networks
- Author
-
Stapmanns, Jonas, Kühn, Tobias, Dahmen, David, Luu, Thomas, Honerkamp, Carsten, and Helias, Moritz
- Subjects
Computational Neuroscience, Neurons, networks, dynamical systems - Abstract
Many phenomena observed in biological neural networks can only be explained by assuming nonlinear interactions. Due to effects like synaptic failure and channel noise, neuronal dynamics is also inherently stochastic. The investigation of the interaction of both of these properties is challenging because, due to the nonlinearity, correlations of higher order influence those of lower order. Nonlinear, stochastic systems exhibit a plethora of dynamical states. The cortex, especially, is often suggested to operate close to a critical point at which linear response theory fails since the neural dynamics is dominated by large fluctuations on all length scales [1]. This is the realm of the Renormalization Group (RG), stemming from statistical and particle physics. We use this technique in the form introduced by Kenneth G. Wilson [2] to study a two-dimensional stochastic neural field model. Its connectivity is composed of two Gaussian kernels, one mimicking the excitatory and the other the inhibitory input. Its dynamics is given by a nonlinear gain and intrinsic nonlinear neuron dynamics. Gaussian white noise accounting for unspecified external input and intrinsic stochasticity drives our system. In its long-distance approximation, this model is similar to that proposed by Kardar, Parisi, and Zhang (KPZ) [3]. Along the lines taken in their approach, we derive RG-flow equations describing the couplings in our neural field model on different length scales. From this, one finds the upper critical dimension dc = 2, which corresponds to the dimension of networks in cortex. Above dc, mean-field theory is exact as the Gaussian fixed point is attractive for small interactions, whereas below dc, the interaction dominates the behavior. For d = dc, however, we find that the Gaussian fixed point becomes unstable and the interaction parameter flows into a strong coupling regime, similar to the KPZ model.
A strong coupling fixed point may be present, which would indicate self-similarity, the signature of a critical state. Our analysis therefore implies certain constraints on the architecture of the neural network (within our model) if it is supposed to work at a critical point. For example, we conclude that we get stable dynamics only if the excitatory inputs extend wider than the inhibitory ones.
Acknowledgements: Partly supported by seed funds MSCALE and CLS002 of the RWTH University; the JARA Center for Doctoral studies within the graduate School for Simulation and Data Science (SSD); the Helmholtz association: Young investigator's grant VH-NG-1028; EU-Grant 785907 (HBP).
References:
[1] Beggs, J. M., Plenz, D. (2003), Neuronal avalanches in neocortical circuits, Journal of Neuroscience, 23, 35, 11167–11177. 10.1523/JNEUROSCI.23-35-11167.2003
[2] Wilson, K. G., Kogut, J. (1974), The renormalization group and the ε expansion, Physics Reports, 12, 2, 75–199.
[3] Kardar, M., Parisi, G., Zhang, Y. (1986), Dynamic Scaling of Growing Interfaces, Phys. Rev. Lett. 56, 889–892. 10.1103/PhysRevLett.56.88
- Published
- 2019
24. Un vent frais souffle sur le Musée sonore
- Author
-
Kühn, Tobias
- Subjects
M Music - Published
- 2019
- Full Text
- View/download PDF
25. A bridge from large deviation theory to statistical field theory
- Author
-
Van Meegen, Alexander, Kühn, Tobias, and Helias, Moritz
- Subjects
Computational Neuroscience, Neurons, networks, dynamical systems - Published
- 2019
26. Path integral methods for correlated activity in neuronal networks
- Author
-
Kühn, Tobias, Helias, Moritz, and Honerkamp, Carsten
- Subjects
local field potential ,Quantitative Biology::Neurons and Cognition ,effective action ,stochastic field theory ,chaos ,correlation ,Ising model ,functional renormalization group ,oscillations ,ddc:530 ,binary neuron - Abstract
Dissertation, RWTH Aachen University, 2019; Aachen, 1 online resource (xviii, 271 pages): illustrations, diagrams (2019). Nervous systems of highly developed organisms consist of very many cells. The human brain, to name a very complex example, is composed of nearly 100 billion neurons that are connected via up to a thousand trillion synapses. It is an essential aim of theoretical neuroscience to discover the functioning of this complicated system on the basis of the interaction of its individual parts. Many methods for achieving this goal are borrowed from many-particle physics - classical (non-quantum-mechanical) statistical physics, to be precise. An important common property of biological neuronal networks and the usual subjects of statistical physics is that both can be described by models of stochastic (“noisy”) processes. The calculation of measurable quantities from these models is often difficult, for which reason approximate solutions are sought. Here, the role of mean-field theory deserves to be emphasized, in which fluctuations are treated in a strongly simplified form. Even though in most cases many effects are neglected by this approach, it often yields quantitatively correct results in neuroscience. In this work, we use statistical field theory to derive this and related approximations for different systems, apply them to concrete problems and examine methods to improve them. To capture the interaction amongst different neurons, the description of the activity of a single neuron is often reduced to the question whether it is active or not (binary model neuron). By means of its mean-field theory, we describe by which mechanisms the correlations between pairs of neurons change when a network is driven by a stimulus varying in time. For inferences about the connections in an examined network from experimentally detected activity, the binary representation of neuronal activity is frequently used as well.
This method relies on the Ising model, whose mean-field theory we derive using Feynman diagrams. We extend the formalism needed for this purpose to include expansions around non-Gaussian theories like the Ising model without coupling. Furthermore, we examine the statistics of the neuronal activity in a disordered network and its susceptibility to perturbations in mean-field theory. A generalized framework enables us to compare these results with the statistics and dynamics of networks consisting of rate model neurons. In the latter model, each nerve cell is solely characterized by the rate which indicates how frequently it becomes active. We use it in a different context to compare different path integral formalisms representing neuronal activity described by stochastic differential equations. Here we show how mean-field theory can be systematically corrected by the so-called loop expansion and how the emerging correction terms can be interpreted in case mean-field theory should prove insufficient for a certain set of parameters. Another possibility to improve mean-field approximations is given by the functional Renormalization Group, whose application to simple models of biological networks we demonstrate for an example., Published by Aachen
- Published
- 2019
- Full Text
- View/download PDF
27. Functional Renormalization Group for Stochastic Rate Neurons
- Author
-
Kühn, Tobias, Stapmanns, Jonas, Dahmen, David, Honerkamp, Carsten, and Helias, Moritz
- Abstract
It is often suggested that the cortex operates close to a critical point at which linear response theory fails since the neural dynamics is dominated by large fluctuations on all length scales. The functional Renormalization Group (fRG) is not stained with this flaw because in principle it treats statistics of arbitrary order in an unbiased and self-consistent way. We apply fRG to a self-interacting, stochastic, quadratic rate neuron and show how this method incorporates corrections to the mean dynamics and time-dependent statistics due to fluctuations in the presence of nonlinear neuronal gain. To obtain a simplified treatment of the frequency-dependence of all observables, we adapt the Blaizot Méndez-Galain Wschebor (BMW) scheme to the vertex expansion, which yields good predictions.We expect that the insights into fRG-techniques gained within our study will help to tackle challenges occurring in the description of phenomena in spatially extended networks, notably the calculation of critical exponents and the coarse-graining of microscopic models.
- Published
- 2019
28. Taming Stochastic, Nonlinear Rate Neurons With Field Theory
- Author
-
Stapmanns, Jonas, Kühn, Tobias, Dahmen, David, Honerkamp, Carsten, and Helias, Moritz
- Published
- 2019
29. Field Theory for Nonlinear Stochastic Rate Neurons
- Author
-
Stapmanns, Jonas, Kühn, Tobias, Dahmen, David, Honerkamp, Carsten, and Helias, Moritz
- Subjects
Computational Neuroscience, Neurons, networks, dynamical systems - Published
- 2018
- Full Text
- View/download PDF
30. TAP-Approximation and beyond with Feynman diagrams
- Author
-
Kühn, Tobias and Helias, Moritz
- Published
- 2018
31. Expanding the effective action around non-Gaussian theories
- Author
-
Kühn, Tobias and Helias, Moritz
- Abstract
The effective action or Gibbs Free Energy is the central quantity to study phase transitions and is at the core of effective theories constructed, for example, by the renormalization group. It is known that only one-line-irreducible Feynman diagrams contribute in the case that the theory, about which one expands, is Gaussian. We introduce a generalized notion of one-line-irreducibility: diagrams that remain connected after detaching a single leg of an interaction vertex. We show that the effective action decomposes into diagrams that are either irreducible in this more general sense or belong to a second class of diagrams that has no analogue in Gaussian theories [Kühn & Helias 2017, arXiv:1711.05599]. The presented method allows the efficient diagrammatic perturbative computation of the effective action around any exactly solvable problem. We illustrate this method by application to the (classical) Ising model expanded in the coupling strength. This reproduces the Plefka expansion [Plefka 1982], including the TAP-correction [Thouless et al. 1977] to mean-field theory. We find that the diagrammatic formulation considerably simplifies the calculation compared to existing techniques [Takayama & Nakanishi 1997, Georges & Yedidia 1991]. Supported by the Helmholtz foundation (VH-NG-1028, SMHB); EU Grant 604102 (HBP).
- Published
- 2018
32. Dynamics of Cell Assemblies in Binary Neuronal Networks
- Author
-
Keup, Christian, Kühn, Tobias, and Helias, Moritz
- Subjects
ddc:610 - Abstract
Connectivity in local cortical networks is far from random: Reciprocal connections are over-represented, and there are subgroups of neurons which are more strongly connected among each other than to the remainder of the network [1,2]. These observations provide growing evidence for the existence of neuronal assemblies, that is, groups of neurons with stronger and/or more numerous connections between members compared to non-members. To study quantitatively the dynamics of these building blocks, we consider a single assembly of binary neurons embedded in a larger randomly connected EI-network and explore its properties by analytical methods and simulation. In dynamical mean field theory [3], we obtain expressions for mean activities, auto- and cross-correlations, and response to input fluctuations using a Gaussian closure. For sufficiently strong assembly self-feedback, a bifurcation from a mono-stable to a bistable regime exists. The critical regime around the bifurcation is of interest, as input variations can drive the assembly to high or low activity states and large spontaneous fluctuations are present. These could be a source of neuronal avalanches observed in cortex, and the robust response to input could constitute attractor states supporting classification in sensory perception. In this regime, however, the Gaussian approximation is not accurate due to large fluctuation corrections. We therefore work on a path-integral formulation of such systems built on developments in the application of statistical field theory to neuronal networks [4]. This formulation allows the derivation of an effective potential, a systematic treatment of approximations and the quantification of the response to inputs.
References
1. Ko H, Hofer SB, Pichler B, Buchanan KA, Sjöström PJ, Mrsic-Flogel TD (2011) Functional specificity of local synaptic connections in neocortical networks. Nature 473: 87-91
2. Perin R, Berger TK, Markram H (2011) A synaptic organizing principle for cortical neuronal groups. PNAS 108: 5419-5424
3. Helias M, Tetzlaff T, Diesmann M (2014) The Correlation Structure of Local Neuronal Networks Intrinsically Results from Recurrent Dynamics. PLoS Comput Biol 10(1): e1003428
4. Schücker J, Goedeke S, Dahmen D, Helias M (2016) Functional methods for disordered neural networks. arXiv 1605:06758v2
- Published
- 2018
33. A diagrammatic derivation of the TAP-approximation
- Author
-
Kühn, Tobias and Helias, Moritz
- Subjects
Computational Neuroscience ,Data analysis, machine learning, neuroinformatics - Abstract
Originally invented to describe magnetism, the Ising model has proven useful in many other applications, such as inference problems in computer science, socioeconomic physics, the analysis of neural data [1,2,3] and the modeling of neural networks (binary neurons). Despite its simplicity, there exists no general solution to the Ising model, i.e. the partition function is unknown in the case of an interacting system. Mean-field theory is often used as an approximation; it is exact in the noninteracting case and for infinite dimensions. A correction term to the mean-field approximation of the Gibbs free energy (the effective action) of the Ising model was given by Thouless, Anderson and Palmer (TAP) [4] as a "fait accompli" and was later derived by different methods in [5,6,7], where higher-order terms were also computed. We present a diagrammatic derivation (Feynman diagrams) of these correction terms and embed the problem in the language of field theory. Furthermore, we show how the iterative construction of the effective action used in the Ising case generalizes to arbitrary non-Gaussian theories.
References:
[1] Tkacik, G., Schneidman, E., Berry II, M. J., Bialek, W. (2008): Ising models for networks of real neurons. arXiv:q-bio/0611072
[2] Roudi, Y., Tyrcha, J. and Hertz, J.A. (2009): Ising model for neural data: Model quality and approximate methods for extracting functional connectivity. Phys. Rev. E 79, 051915
[3] Hertz, J.A., Roudi, Y. and Tyrcha, J. (2011): Ising models for inferring network structure from spike data. arXiv:1106.1752
[4] Thouless, D.J., Anderson, P.W. and Palmer, R.G. (1977): Solution of 'Solvable model of a spin glass'. Phil. Mag. 35 3, 593 - 601
[5] Georges, A. and Yedidia, J.S. (1991): How to expand around mean-field theory using high-temperature expansions. J. Phys. A 24, 2173 - 2192
[6] Parisi, G. and Potters, M. (1995): Mean-field equations for spin models with orthogonal interaction matrices. J. Phys. A 28, 5267 - 5285
[7] Tanaka, T. (2000): Information Geometry of Mean-Field Approximation. Neur. Comp. 12, 1951 - 1968
Acknowledgements: This work was partially supported by HGF young investigator's group VH-NG-1028, Helmholtz portfolio theme SMHB, Juelich Aachen Research Alliance (JARA), and EU Grant 604102 (Human Brain Project, HBP).
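The self-consistency equations behind the abstract can be iterated numerically. Below is a minimal sketch, not taken from the cited works: the naive mean-field equations m_i = tanh(β(h_i + Σ_j J_ij m_j)) and the TAP equations with the Onsager reaction term. The couplings, fields, iteration count and damping scheme are illustrative choices.

```python
import numpy as np

def mean_field_magnetizations(J, h, beta=1.0, n_iter=200, damping=0.5):
    """Iterate the naive mean-field equations
       m_i = tanh(beta * (h_i + sum_j J_ij m_j))."""
    m = np.zeros(len(h))
    for _ in range(n_iter):
        m = damping * m + (1 - damping) * np.tanh(beta * (h + J @ m))
    return m

def tap_magnetizations(J, h, beta=1.0, n_iter=200, damping=0.5):
    """Iterate the TAP equations: mean field plus the Onsager
       reaction term -beta * m_i * sum_j J_ij^2 * (1 - m_j^2)."""
    m = np.zeros(len(h))
    for _ in range(n_iter):
        reaction = beta * m * ((J ** 2) @ (1.0 - m ** 2))
        m = damping * m + (1 - damping) * np.tanh(beta * (h + J @ m - reaction))
    return m

# Small example with weak random symmetric couplings and random fields.
rng = np.random.default_rng(0)
n = 10
J = rng.normal(0.0, 0.1 / np.sqrt(n), (n, n))
J = 0.5 * (J + J.T)            # symmetrize
np.fill_diagonal(J, 0.0)       # no self-coupling
h = rng.normal(0.0, 0.5, n)
m_mf = mean_field_magnetizations(J, h)
m_tap = tap_magnetizations(J, h)
```

For weak couplings the two solutions are close and the reaction term is a small correction; damping is used only to stabilize the fixed-point iteration.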
- Published
- 2017
- Full Text
- View/download PDF
34. Temporal structure of synchrony and Unitary Events in periodically-driven balanced networks
- Author
-
Kühn, Tobias, Denker, Michael, Mana, PierGianLuca, Grün, Sonja, and Helias, Moritz
- Abstract
Whether the brain employs the temporal domain for the representation of information is still a matter of ongoing debate. Theory and experiments point toward an entanglement of firing rates and correlations [1]. Moreover, in [2] it was shown by Unitary Event (UE) analysis [3] that excess synchronous spike events of neurons observed in parallel are more strongly locked to the phase of local field potential (LFP) beta-oscillations than chance synchronous events or individual spikes, which was related to the concept of cell assemblies. We want to study the influence of oscillatory drive from remote brain areas - expressed as oscillations in the LFP - on the correlation of single-neuron activities in a small cortical subnetwork. A balanced random network of homogeneously connected binary model neurons [4] receiving input from a sinusoidal perturbation [5] captures the main properties of this type of system and illustrates mechanisms that cause time-modulated covariances. Using linear response theory, we compute the time-dependent averages and covariances of the stochastic neuronal activity in mean-field theory, which agree with their simulated counterparts provided that the perturbation is of the order of the fluctuations of the inputs. We find that the zero-time-lag pairwise covariances consist of two terms, one due to the modulated susceptibility (via external input and recurrent feedback) and one due to the time-varying autocovariances. For some connectivity parameters, this leads to resonant covariances and non-resonant mean activities. The resonant behavior of the covariances occurs because the susceptibility is modulated by two terms with different signs and different dependence on the perturbing frequency: the direct drive and the recurrent feedback. The application of the UE analysis to data emerging from the model network shows that the probability for UEs to occur is indeed oscillatory already in an unstructured network.
A locking as strong as described in [2], however, is not observed. An interesting extension of our model would therefore be to include cell assemblies as additional populations of excitatory neurons that are connected more densely amongst themselves than to the rest [6]. That would allow a closer comparison to experimental findings. However, already the results for the random network can help to answer the salient question of how oscillations in mesoscopic signals and spike correlations interact.
Acknowledgements: Supported by the Helmholtz foundation (VH-NG-1028, SMHB); EU Grant 720270 (HBP). Simulations with NEST (nest-simulator.org).
References:
[1] de la Rocha J, Doiron B, Shea-Brown E, Josic K, Reyes A: Correlation between neural spike trains increases with firing rate. Nature 2007, 448(7155):802 - 806.
[2] Denker M, Roux S, Lindén H, Diesmann M, Riehle A, Grün S: The Local Field Potential reflects surplus Spike Synchrony. Cereb Cortex 2011, 21:2681 - 2695.
[3] Grün S, Diesmann M, Aertsen A: 'Unitary Events' in Multiple Single-Neuron Spiking Activity. II. Non-Stationary Data. Neural Comput 2002, 14(1):81 - 119.
[4] Ginzburg I, Sompolinsky H: Theory of correlations in stochastic neural networks. Phys Rev E 1994, 50(4):3171 - 3191.
[5] Kühn T, Helias M: Correlated activity of periodically driven binary networks. arXiv:1607.08552v2
[6] Litwin-Kumar A, Chacron MJ, Doiron B: The spatial Structure of Stimuli shapes the Timescale of Correlations in Population Spiking Activity. PLoS Comput Biol 2012, 8(9):e1002667.
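To illustrate the kind of model discussed in this abstract, here is a minimal sketch of Glauber dynamics for a sparsely connected excitatory-inhibitory binary network under a weak sinusoidal drive. All parameters (network size, connection probability, weights, noise temperature, bias, drive amplitude and frequency) are illustrative assumptions, not the values used in the study.

```python
import numpy as np

def simulate_driven_binary_network(n=200, frac_exc=0.8, eps=0.1, g=5.0,
                                   bias=0.1, temp=0.2, amp=0.05,
                                   freq=0.001, steps=20000, seed=1):
    """Glauber dynamics of a sparse binary network (states in {0, 1})
    of excitatory and inhibitory units, weakly driven by a sinusoid.
    Returns the population-averaged activity at every update step."""
    rng = np.random.default_rng(seed)
    n_exc = int(frac_exc * n)
    w = 1.0 / (eps * n)                       # synaptic weight scale
    W = np.where(rng.random((n, n)) < eps, w, 0.0)
    W[:, n_exc:] *= -g                        # inhibitory columns
    np.fill_diagonal(W, 0.0)                  # no self-coupling
    s = rng.integers(0, 2, n).astype(float)
    activity = np.empty(steps)
    for t in range(steps):
        i = rng.integers(n)                   # asynchronous single-unit update
        u = W[i] @ s + bias + amp * np.sin(2 * np.pi * freq * t)
        s[i] = float(rng.random() < 1.0 / (1.0 + np.exp(-u / temp)))
        activity[t] = s.mean()
    return activity
```

With these parameters the recurrent input is net inhibitory and the population settles near half activation, so a weak periodic modulation of the population activity can be read off the returned trace.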
- Published
- 2017
35. Corrigendum: Expansion of the effective action around non-Gaussian theories (2018 J. Phys. A: Math. Theor. 51 375004)
- Author
-
Kühn, Tobias, primary and Helias, Moritz, additional
- Published
- 2018
- Full Text
- View/download PDF
36. Expansion of the effective action around non-Gaussian theories
- Author
-
Kühn, Tobias, primary and Helias, Moritz, additional
- Published
- 2018
- Full Text
- View/download PDF
37. Correlated activity of periodically driven binary networks
- Author
-
Kühn, Tobias, Denker, Michael, Mana, PierGianLuca, Grün, Sonja, and Helias, Moritz
- Abstract
Experiments showed that excess synchronous spike events are locked to the phase of LFP beta-oscillations more strongly than spikes not part of such events [Denker et al. 2011, Cereb. Cortex]. To identify the mechanisms by which correlations depend on the phase of the LFP, which primarily reflects input activity, we examine a balanced network of homogeneously connected binary model neurons [Ginzburg et al. 1994, PRE] receiving input from a sinusoidal perturbation. The Glauber dynamics of the network is simulated and approximated by mean-field theory. Treating the periodic input in linear response theory, the cyclostationary first two moments are analytically computed. They agree with their simulated counterparts over a wide parameter range. The zero-time-lag correlations consist of two terms, one due to the modulated susceptibility (via the external input and network feedback) and one due to the time-varying autocorrelations. For some parameters, this leads to resonant correlations and non-resonant mean activities. Our results can help to answer the salient question of how oscillations in mesoscopic signals and spike correlations interact. Supported by the Helmholtz foundation (VH-NG-1028, SMHB); EU Grant 604102 (HBP). Simulations with NEST (nest-simulator.org).
- Published
- 2016
38. Correlations in binary networks with time-dependent input
- Author
-
Kühn, Tobias, Denker, Michael, Mana, PierGianLuca, Grün, Sonja, and Helias, Moritz
- Published
- 2016
39. How does an oscillatory drive shape the correlations in binary networks?
- Author
-
Kühn, Tobias, Denker, Michael, PortaMana, PierGianLuca, Grün, Sonja, and Helias, Moritz
- Subjects
Computational Neuroscience, Quantitative Biology::Neurons and Cognition, Bernstein Conference - Abstract
Two important parts of electrophysiological recordings are the spike times and the local field potential (LFP), which is considered to primarily reflect input activity. In [1], it was shown by unitary event analysis [2,3] that excess synchronous spike events are locked to the phase of LFP beta-oscillations more strongly than spikes not part of such events. Denker et al. showed by a statistical model that this finding could be explained by the existence of cell assemblies, i.e. groups of (excitatory) neurons that are more strongly connected amongst each other than to the rest of the network. To study the influence of the LFP on correlated single-neuron activities, first for a simple model capturing the main properties of cortical neural networks, we examine a balanced network of homogeneously connected binary model neurons [4] receiving input from a sinusoidal perturbation [5]. The Glauber dynamics of the network is simulated and approximated by mean-field theory. Treating the periodic input in linear response theory, the cyclostationary first two moments are analytically computed; they agree with their simulated counterparts over a wide parameter range. The deviations of the zero-time-lag correlations from their stationary values consist of three summands: two owing to the modulated susceptibility (one via direct modulation, one via the modulated mean activity) and one owing to the driving of the autocorrelations. For some parameters, this leads to resonant correlations and non-resonant mean activities. Our results can help to answer the question of how oscillations in mesoscopic signals and spike correlations interact. As a next step, our model could be extended to include cell assemblies [6], which will allow a closer comparison of our results with the experimental findings.
Figure caption: A: Contributions to the time-dependent variation of the correlations in linear perturbation theory. B: The deviation of the correlations from their stationary value is maximal for a certain frequency, even for this setting with a connectivity matrix having solely purely real eigenvalues.
References:
[1] Denker M et al., Cerebral Cortex, 21:2681--2695, 2011, The Local Field Potential Reflects Surplus Spike Synchrony.
[2] Grün S, Diesmann M, Aertsen A, Neural Comput., 14:43--80, 2002a, Unitary events in multiple single-neuron spiking activity: I. Detection and significance.
[3] Grün S, Diesmann M, Aertsen A, Neural Comput., 14:81--119, 2002b, Unitary events in multiple single-neuron spiking activity: II. Nonstationary data.
[4] Ginzburg I, Sompolinsky H, Phys. Rev. E 50(4):3171--3191, 1994, Theory of correlations in stochastic neural networks.
[5] Kühn T, Helias M, arXiv:1607.08552, 2016, Correlated activity of periodically driven binary networks.
[6] Litwin-Kumar A, Doiron B, Nature Neur., 15(11):1498--1505, 2012, Slow dynamics and high variability in balanced cortical networks with clustered connections.
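The linear-response treatment of a weak periodic drive can be illustrated in the simplest setting, a single-population mean-field equation dm/dt = -m + tanh(β(J₀m + h(t))). The sketch below is not the network calculation of the abstract; J₀, h₀, β, the drive amplitude and frequency are all illustrative assumptions. It compares the simulated oscillation amplitude with the linear-response prediction |χ(ω)|·ε, where χ(ω) = βs/(1 + iω − βJ₀s) and s is the squared hyperbolic secant of the input at the stationary point.

```python
import math
import numpy as np

def mean_field_response(J0=0.8, h0=0.2, beta=2.0, eps=0.02,
                        omega=0.5, dt=0.005, steps=400000):
    """Euler-integrate dm/dt = -m + tanh(beta*(J0*m + h0 + eps*sin(omega*t)))
    and compare the simulated oscillation amplitude with the
    linear-response prediction eps * |chi(omega)|."""
    m = 0.0
    for _ in range(steps // 2):          # relax to the stationary state
        m += dt * (-m + math.tanh(beta * (J0 * m + h0)))
    m0 = m
    traj = np.empty(steps)
    for t in range(steps):               # switch on the weak drive
        h = h0 + eps * math.sin(omega * t * dt)
        m += dt * (-m + math.tanh(beta * (J0 * m + h)))
        traj[t] = m
    half = traj[steps // 2:]             # discard the onset transient
    amp_sim = 0.5 * (half.max() - half.min())
    # linear response around m0: chi(omega) = beta*s / (1 + i*omega - beta*J0*s)
    s = 1.0 / math.cosh(beta * (J0 * m0 + h0)) ** 2   # gain at the fixed point
    amp_lin = eps * abs(beta * s / (1 + 1j * omega - beta * J0 * s))
    return amp_sim, amp_lin
```

For a drive small compared with the distance to instability (βJ₀s < 1), the two amplitudes agree to a few percent; the frequency dependence of χ(ω) is what makes resonant behavior possible when several such terms with different signs combine.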
- Published
- 2016
40. 26th Annual Computational Neuroscience Meeting (CNS*2017): Part 2
- Author
-
Rubchinsky, Leonid L., primary, Ahn, Sungwoo, additional, Klijn, Wouter, additional, Cumming, Ben, additional, Yates, Stuart, additional, Karakasis, Vasileios, additional, Peyser, Alexander, additional, Woodman, Marmaduke, additional, Diaz-Pier, Sandra, additional, Deraeve, James, additional, Vassena, Eliana, additional, Alexander, William, additional, Beeman, David, additional, Kudela, Pawel, additional, Boatman-Reich, Dana, additional, Anderson, William S., additional, Luque, Niceto R., additional, Naveros, Francisco, additional, Carrillo, Richard R., additional, Ros, Eduardo, additional, Arleo, Angelo, additional, Huth, Jacob, additional, Ichinose, Koki, additional, Park, Jihoon, additional, Kawai, Yuji, additional, Suzuki, Junichi, additional, Mori, Hiroki, additional, Asada, Minoru, additional, Oprisan, Sorinel A., additional, Dave, Austin I., additional, Babaie, Tahereh, additional, Robinson, Peter, additional, Tabas, Alejandro, additional, Andermann, Martin, additional, Rupp, André, additional, Balaguer-Ballester, Emili, additional, Lindén, Henrik, additional, Christensen, Rasmus K., additional, Nakamura, Mari, additional, Barkat, Tania R., additional, Tosi, Zach, additional, Beggs, John, additional, Lonardoni, Davide, additional, Boi, Fabio, additional, Di Marco, Stefano, additional, Maccione, Alessandro, additional, Berdondini, Luca, additional, Jędrzejewska-Szmek, Joanna, additional, Dorman, Daniel B., additional, Blackwell, Kim T., additional, Bauermeister, Christoph, additional, Keren, Hanna, additional, Braun, Jochen, additional, Dornas, João V., additional, Mavritsaki, Eirini, additional, Aldrovandi, Silvio, additional, Bridger, Emma, additional, Lim, Sukbin, additional, Brunel, Nicolas, additional, Buchin, Anatoly, additional, Kerr, Clifford Charles, additional, Chizhov, Anton, additional, Huberfeld, Gilles, additional, Miles, Richard, additional, Gutkin, Boris, additional, Spencer, Martin J., additional, Meffin, Hamish, additional, Grayden, David 
B., additional, Burkitt, Anthony N., additional, Davey, Catherine E., additional, Tao, Liangyu, additional, Tiruvadi, Vineet, additional, Ali, Rehman, additional, Mayberg, Helen, additional, Butera, Robert, additional, Gunay, Cengiz, additional, Lamb, Damon, additional, Calabrese, Ronald L., additional, Doloc-Mihu, Anca, additional, López-Madrona, Víctor J., additional, Matias, Fernanda S., additional, Pereda, Ernesto, additional, Mirasso, Claudio R., additional, Canals, Santiago, additional, Geminiani, Alice, additional, Pedrocchi, Alessandra, additional, D’Angelo, Egidio, additional, Casellato, Claudia, additional, Chauhan, Ankur, additional, Soman, Karthik, additional, Srinivasa Chakravarthy, V., additional, Muddapu, Vignayanandam R., additional, Chuang, Chao-Chun, additional, Chen, Nan-yow, additional, Bayati, Mehdi, additional, Melchior, Jan, additional, Wiskott, Laurenz, additional, Azizi, Amir Hossein, additional, Diba, Kamran, additional, Cheng, Sen, additional, Smirnova, Elena Y., additional, Yakimova, Elena G., additional, Chizhov, Anton V., additional, Chen, Nan-Yow, additional, Shih, Chi-Tin, additional, Florescu, Dorian, additional, Coca, Daniel, additional, Courtiol, Julie, additional, Jirsa, Viktor K., additional, Covolan, Roberto J. 
M., additional, Teleńczuk, Bartosz, additional, Kempter, Richard, additional, Curio, Gabriel, additional, Destexhe, Alain, additional, Parker, Jessica, additional, Klishko, Alexander N., additional, Prilutsky, Boris I., additional, Cymbalyuk, Gennady, additional, Franke, Felix, additional, Hierlemann, Andreas, additional, da Silveira, Rava Azeredo, additional, Casali, Stefano, additional, Masoli, Stefano, additional, Rizza, Martina, additional, Rizza, Martina Francesca, additional, Sun, Yinming, additional, Wong, Willy, additional, Farzan, Faranak, additional, Blumberger, Daniel M., additional, Daskalakis, Zafiris J., additional, Popovych, Svitlana, additional, Viswanathan, Shivakumar, additional, Rosjat, Nils, additional, Grefkes, Christian, additional, Daun, Silvia, additional, Gentiletti, Damiano, additional, Suffczynski, Piotr, additional, Gnatkovski, Vadym, additional, De Curtis, Marco, additional, Lee, Hyeonsu, additional, Paik, Se-Bum, additional, Choi, Woochul, additional, Jang, Jaeson, additional, Park, Youngjin, additional, Song, Jun Ho, additional, Song, Min, additional, Pallarés, Vicente, additional, Gilson, Matthieu, additional, Kühn, Simone, additional, Insabato, Andrea, additional, Deco, Gustavo, additional, Glomb, Katharina, additional, Ponce-Alvarez, Adrián, additional, Ritter, Petra, additional, Campo, Adria Tauste, additional, Thiele, Alexander, additional, Deeba, Farah, additional, Robinson, P. 
A., additional, van Albada, Sacha J., additional, Rowley, Andrew, additional, Hopkins, Michael, additional, Schmidt, Maximilian, additional, Stokes, Alan B., additional, Lester, David R., additional, Furber, Steve, additional, Diesmann, Markus, additional, Barri, Alessandro, additional, Wiechert, Martin T., additional, DiGregorio, David A., additional, Dimitrov, Alexander G., additional, Vich, Catalina, additional, Berg, Rune W., additional, Guillamon, Antoni, additional, Ditlevsen, Susanne, additional, Cazé, Romain D., additional, Girard, Benoît, additional, Doncieux, Stéphane, additional, Doyon, Nicolas, additional, Boahen, Frank, additional, Desrosiers, Patrick, additional, Laurence, Edward, additional, Dubé, Louis J., additional, Eleonora, Russo, additional, Durstewitz, Daniel, additional, Schmidt, Dominik, additional, Mäki-Marttunen, Tuomo, additional, Krull, Florian, additional, Bettella, Francesco, additional, Metzner, Christoph, additional, Devor, Anna, additional, Djurovic, Srdjan, additional, Dale, Anders M., additional, Andreassen, Ole A., additional, Einevoll, Gaute T., additional, Næss, Solveig, additional, Ness, Torbjørn V., additional, Halnes, Geir, additional, Halgren, Eric, additional, Pettersen, Klas H., additional, Sætra, Marte J., additional, Hagen, Espen, additional, Schiffer, Alina, additional, Grzymisch, Axel, additional, Persike, Malte, additional, Ernst, Udo, additional, Harnack, Daniel, additional, Ernst, Udo A., additional, Tomen, Nergis, additional, Zucca, Stefano, additional, Pasquale, Valentina, additional, Pica, Giuseppe, additional, Molano-Mazón, Manuel, additional, Chiappalone, Michela, additional, Panzeri, Stefano, additional, Fellin, Tommaso, additional, Oie, Kelvin S., additional, Boothe, David L., additional, Crone, Joshua C., additional, Yu, Alfred B., additional, Felton, Melvin A., additional, Zulfiqar, Isma, additional, Moerel, Michelle, additional, De Weerd, Peter, additional, Formisano, Elia, additional, Oie, Kelvin, 
additional, Franaszczuk, Piotr, additional, Diggelmann, Roland, additional, Fiscella, Michele, additional, Guarino, Domenico, additional, Antolík, Jan, additional, Davison, Andrew P., additional, Frègnac, Yves, additional, Etienne, Benjamin Xavier, additional, Frohlich, Flavio, additional, Lefebvre, Jérémie, additional, Marcos, Encarni, additional, Mattia, Maurizio, additional, Genovesio, Aldo, additional, Fedorov, Leonid A., additional, Dijkstra, Tjeerd M.H., additional, Sting, Louisa, additional, Hock, Howard, additional, Giese, Martin A., additional, Buhry, Laure, additional, Langlet, Clément, additional, Giovannini, Francesco, additional, Verbist, Christophe, additional, Salvadé, Stefano, additional, Giugliano, Michele, additional, Henderson, James A., additional, Wernecke, Hendrik, additional, Sándor, Bulcsú, additional, Gros, Claudius, additional, Voges, Nicole, additional, Dabrovska, Paulina, additional, Riehle, Alexa, additional, Brochier, Thomas, additional, Grün, Sonja, additional, Gu, Yifan, additional, Gong, Pulin, additional, Dumont, Grégory, additional, Novikov, Nikita A., additional, Gutkin, Boris S., additional, Tewatia, Parul, additional, Eriksson, Olivia, additional, Kramer, Andrei, additional, Santos, Joao, additional, Jauhiainen, Alexandra, additional, Kotaleski, Jeanette H., additional, Belić, Jovana J., additional, Kumar, Arvind, additional, Kotaleski, Jeanette Hellgren, additional, Shimono, Masanori, additional, Hatano, Naomichi, additional, Ahmad, Subutai, additional, Cui, Yuwei, additional, Hawkins, Jeff, additional, Senk, Johanna, additional, Korvasová, Karolína, additional, Tetzlaff, Tom, additional, Helias, Moritz, additional, Kühn, Tobias, additional, Denker, Michael, additional, Mana, PierGianLuca, additional, Dahmen, David, additional, Schuecker, Jannis, additional, Goedeke, Sven, additional, Keup, Christian, additional, Heuer, Katja, additional, Bakker, Rembrandt, additional, Tiesinga, Paul, additional, Toro, Roberto, additional, 
Qin, Wei, additional, Hadjinicolaou, Alex, additional, Ibbotson, Michael R., additional, Kameneva, Tatiana, additional, Lytton, William W., additional, Mulugeta, Lealem, additional, Drach, Andrew, additional, Myers, Jerry G., additional, Horner, Marc, additional, Vadigepalli, Rajanikanth, additional, Morrison, Tina, additional, Walton, Marlei, additional, Steele, Martin, additional, Anthony Hunt, C., additional, Tam, Nicoladie, additional, Amaducci, Rodrigo, additional, Muñiz, Carlos, additional, Reyes-Sánchez, Manuel, additional, Rodríguez, Francisco B., additional, Varona, Pablo, additional, Cronin, Joseph T., additional, Hennig, Matthias H., additional, Iavarone, Elisabetta, additional, Yi, Jane, additional, Shi, Ying, additional, Zandt, Bas-Jan, additional, Van Geit, Werner, additional, Rössert, Christian, additional, Markram, Henry, additional, Hill, Sean, additional, O’Reilly, Christian, additional, Perin, Rodrigo, additional, Lu, Huanxiang, additional, Bryson, Alexander, additional, Hadrava, Michal, additional, Hlinka, Jaroslav, additional, Hosaka, Ryosuke, additional, Olenik, Mark, additional, Houghton, Conor, additional, Iannella, Nicolangelo, additional, Launey, Thomas, additional, Kotsakidis, Rebecca, additional, Soriano, Jaymar, additional, Kubo, Takatomi, additional, Inoue, Takao, additional, Kida, Hiroyuki, additional, Yamakawa, Toshitaka, additional, Suzuki, Michiyasu, additional, Ikeda, Kazushi, additional, Abbasi, Samira, additional, Hudson, Amber E., additional, Heck, Detlef H., additional, Jaeger, Dieter, additional, Lee, Joel, additional, Janušonis, Skirmantas, additional, Saggio, Maria Luisa, additional, Spiegler, Andreas, additional, Stacey, William C., additional, Bernard, Christophe, additional, Lillo, Davide, additional, Petkoski, Spase, additional, Drakesmith, Mark, additional, Jones, Derek K., additional, Zadeh, Ali Sadegh, additional, Kambhampati, Chandra, additional, Karbowski, Jan, additional, Kaya, Zeynep Gokcen, additional, Lakretz, 
Yair, additional, Treves, Alessandro, additional, Li, Lily W., additional, Lizier, Joseph, additional, Kerr, Cliff C., additional, Masquelier, Timothée, additional, Kheradpisheh, Saeed Reza, additional, Kim, Hojeong, additional, Kim, Chang Sub, additional, Marakshina, Julia A., additional, Vartanov, Alexander V., additional, Neklyudova, Anastasia A., additional, Kozlovskiy, Stanislav A., additional, Kiselnikov, Andrey A., additional, Taniguchi, Kanako, additional, Kitano, Katsunori, additional, Schmitt, Oliver, additional, Lessmann, Felix, additional, Schwanke, Sebastian, additional, Eipert, Peter, additional, Meinhardt, Jennifer, additional, Beier, Julia, additional, Kadir, Kanar, additional, Karnitzki, Adrian, additional, Sellner, Linda, additional, Klünker, Ann-Christin, additional, Kuch, Lena, additional, Ruß, Frauke, additional, Jenssen, Jörg, additional, Wree, Andreas, additional, Sanz-Leon, Paula, additional, Knock, Stuart A., additional, Chien, Shih-Cheng, additional, Maess, Burkhard, additional, Knösche, Thomas R., additional, Cohen, Charles C., additional, Popovic, Marko A., additional, Klooster, Jan, additional, Kole, Maarten H.P., additional, Roberts, Erik A., additional, Kopell, Nancy J., additional, Kepple, Daniel, additional, Giaffar, Hamza, additional, Rinberg, Dima, additional, Koulakov, Alex, additional, Forlim, Caroline Garcia, additional, Klock, Leonie, additional, Bächle, Johanna, additional, Stoll, Laura, additional, Giemsa, Patrick, additional, Fuchs, Marie, additional, Schoofs, Nikola, additional, Montag, Christiane, additional, Gallinat, Jürgen, additional, Lee, Ray X., additional, Stephens, Greg J., additional, Kuhn, Bernd, additional, Tauffer, Luiz, additional, Isope, Philippe, additional, Inoue, Katsuma, additional, Ohmura, Yoshiyuki, additional, Yonekura, Shogo, additional, Kuniyoshi, Yasuo, additional, Jang, Hyun Jae, additional, Kwag, Jeehyun, additional, de Kamps, Marc, additional, Lai, Yi Ming, additional, dos Santos, Filipa, 
additional, Lam, K. P., additional, Andras, Peter, additional, Imperatore, Julia, additional, Helms, Jessica, additional, Tompa, Tamas, additional, Lavin, Antonieta, additional, Inkpen, Felicity H., additional, Ashby, Michael C., additional, Lepora, Nathan F., additional, Shifman, Aaron R., additional, Lewis, John E., additional, Zhang, Zhong, additional, Feng, Yeqian, additional, Tetzlaff, Christian, additional, Kulvicius, Tomas, additional, Li, Yinyun, additional, Pena, Rodrigo F. O., additional, Bernardi, Davide, additional, Roque, Antonio C., additional, Lindner, Benjamin, additional, Vellmer, Sebastian, additional, Saudargiene, Ausra, additional, Maninen, Tiina, additional, Havela, Riikka, additional, Linne, Marja-Leena, additional, Powanwe, Arthur, additional, Longtin, Andre, additional, Garrido, Jesús A., additional, Graham, Joe W., additional, Dura-Bernal, Salvador, additional, Angulo, Sergio L., additional, Neymotin, Samuel A., additional, and Antic, Srdjan D., additional
- Published
- 2017
- Full Text
- View/download PDF
41. Locking of correlated neural activity to ongoing oscillations
- Author
-
Kühn, Tobias, primary and Helias, Moritz, additional
- Published
- 2017
- Full Text
- View/download PDF
Implementierung und Evaluation ergänzender Korrekturmethoden für statistische Lernverfahren bei unbalancierten Klassifikationsproblemen [Implementation and evaluation of supplementary correction methods for statistical learning procedures on unbalanced classification problems]
- Author
-
Kühn, Tobias
- Published
- 2014
- Full Text
- View/download PDF
Kostenschätzungen für die Reaktivierung passiver Gleisanschlüsse : eine neue Methode für Kostenschätzungen mithilfe von Case-based Reasoning (CBR) basiert auf der Wiederverwendung von historischem Projektwissen [Cost estimates for the reactivation of inactive rail sidings: a new method for cost estimation using case-based reasoning (CBR), based on the reuse of historical project knowledge]
- Author
-
Kowalski, Martin, Zelewski, Stephan, Günes, Nazif, and Kühn, Tobias
- Subjects
Wirtschaftswissenschaften (economics) - Published
- 2011
44. Gebäudesanierung : öko, aber unsozial? [Building refurbishment: eco-friendly, but socially unfair?]
- Author
-
Kopatz, Michael, Bierwirth, Anja, and Kühn, Tobias
- Published
- 2013
45. A Neuron Model Independent Path Integral Explored via Binary Assemblies
- Author
-
Keup, Christian, Kühn, Tobias, and Helias, Moritz
- Subjects
Quantitative Biology::Neurons and Cognition - Abstract
We present a basic exploration of a novel path-integral formulation for models of biological neuronal networks that allows one to keep the neuron model unspecified until quantities are explicitly calculated. This is done using the example of a binary neuron network containing an assembly, which is suited to discuss the limits of standard mean-field theory and the feasibility of a path-integral approach, while also being of some neuroscientific interest. Advanced theoretical approaches to the description of neuronal network activity are still in their infancy, but much needed, owing to its nonlinearities, statistical nature, and nonequilibrium dynamics. There is thus renewed interest in transferring mathematical tools developed in statistical physics to the theory of neural networks. We model a Hebbian cell assembly as a group of O(100) excitatory binary neurons with increased coupling embedded in a larger balanced random network. Using standard mean-field theory and simulation, the system properties and parameter dependencies are analysed, especially the emergence of the high-activity state, spontaneous transitions, and pairwise correlations. We then introduce the path-integral formulation, applying it to a single population of binary neurons. We show the relation of the tree-level approximation to mean-field theory, calculate propagators and a 1-loop diagram, and generalize to multiple populations. The formulation specifically uses generic properties of neuronal networks, which allows a formal description of the system's properties before an effective neuron model is fixed. It is analytically feasible for rate, binary and, possibly, spiking neurons. The implications of the results for the assembly model and the relation of our formulation to other path-integral approaches are discussed.
We stress that a functional form unlocks tools to treat critical phenomena, large fluctuations and disorder, and may eventually lead to effective coarse-grained theories by using renormalization group methods.
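The emergence of a high-activity state in an assembly, as mentioned above, already shows up in a one-dimensional mean-field self-consistency a = f(wa + h) for the assembly's mean activity a, with f a logistic gain function. The following sketch locates the self-consistent activities by a sign-change scan; the parameters w, h and β are illustrative assumptions, not those of the study.

```python
import numpy as np

def assembly_fixed_points(w=6.0, h=-3.0, beta=1.0, n_grid=10001):
    """Locate fixed points of the assembly's mean activity,
       a = f(w*a + h) with logistic gain f, by scanning for
       sign changes of g(a) = a - f(w*a + h) on a grid."""
    f = lambda x: 1.0 / (1.0 + np.exp(-beta * x))
    a = np.linspace(0.0, 1.0, n_grid)
    g = a - f(w * a + h)          # zero at a self-consistent activity
    roots = []
    for i in range(n_grid - 1):
        if g[i] == 0.0 or g[i] * g[i + 1] < 0.0:
            roots.append(0.5 * (a[i] + a[i + 1]))
    return roots
```

For these parameters the scan yields three self-consistent activities: a stable low-activity and a stable high-activity state, separated by an unstable point at a = 0.5; spontaneous transitions between the two stable states are fluctuation effects beyond this mean-field picture.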
- Published
- 2017
46. [Letter to J. Weiss, R. Kuhn, S. Wentrock, J. Malitz, M. Reuss-Borst: Can young cancer patients reintegrate professionally? In: Versicherungsmedizin 65 (2013), No. 4, p. 197].
- Author
-
Kühn T
- Subjects
- Female, Humans, Male, Disability Evaluation, Neoplasms rehabilitation, Rehabilitation, Vocational psychology
- Published
- 2014
47. [Letter to E. Pickering, J. Becher, A. Regenauer: Is family history dispensable? In: Versicherungsmedizin 65 (2013), No. 2, p. 73].
- Author
-
Kühn T
- Subjects
- Female, Humans, Male, Breast Neoplasms epidemiology, Family Health statistics & numerical data, Insurance, Life statistics & numerical data, Medical History Taking statistics & numerical data, Risk Assessment methods, Schizophrenia epidemiology
- Published
- 2013