Dr. Christian Tetzlaff

Group(s): Neural Computation
Email: tetzlaff@phys.uni-goettingen.de
Phone: +49 551/ 39 10762

    Faghihi, F. and Kolodziejski, C. and Fiala, A. and Wörgötter, F. and Tetzlaff, C. (2013).
    An Information Theoretic Model of Information Processing in the Drosophila Olfactory System: the Role of Inhibitory Neurons for System Efficiency. Frontiers in Computational Neuroscience, 7, 183. DOI: 10.3389/fncom.2013.00183.
    BibTeX:
    @article{faghihikolodziejskifiala2013,
      author = {Faghihi, F. and Kolodziejski, C. and Fiala, A. and Wörgötter, F. and Tetzlaff, C.},
      title = {An Information Theoretic Model of Information Processing in the Drosophila Olfactory System: the Role of Inhibitory Neurons for System Efficiency},
      journal = {Frontiers in Computational Neuroscience},
      year = {2013},
      volume= {7},
      number = {183},
      url = {http://journal.frontiersin.org/Journal/10.3389/fncom.2013.00183/full},
      doi = {10.3389/fncom.2013.00183},
      abstract = {Fruit flies (Drosophila melanogaster) rely on their olfactory system to process environmental information. This information has to be transmitted without system-relevant loss by the olfactory system to deeper brain areas for learning. Here we study the role of several parameters of the fly's olfactory system and the environment and how they influence olfactory information transmission. We have designed an abstract model of the antennal lobe, the mushroom body and the inhibitory circuitry. Mutual information between the olfactory environment, simulated in terms of different odor concentrations, and a sub-population of intrinsic mushroom body neurons (Kenyon cells) was calculated to quantify the efficiency of information transmission. With this method we study, on the one hand, the effect of different connectivity rates between olfactory projection neurons and firing thresholds of Kenyon cells. On the other hand, we analyze the influence of inhibition on mutual information between environment and mushroom body. Our simulations show an expected linear relation between the connectivity rate between the antennal lobe and the mushroom body and firing threshold of the Kenyon cells to obtain maximum mutual information for both low and high odor concentrations. However, contradicting everyday experience, high odor concentrations cause a drastic, and unrealistic, decrease in mutual information for all connectivity rates compared to low concentration. But when inhibition on the mushroom body is included, mutual information remains at high levels independent of other system parameters. This finding points to a pivotal role of inhibition in fly information processing without which the system's efficiency will be substantially reduced.}}
    Abstract: Fruit flies (Drosophila melanogaster) rely on their olfactory system to process environmental information. This information has to be transmitted without system-relevant loss by the olfactory system to deeper brain areas for learning. Here we study the role of several parameters of the fly's olfactory system and the environment and how they influence olfactory information transmission. We have designed an abstract model of the antennal lobe, the mushroom body and the inhibitory circuitry. Mutual information between the olfactory environment, simulated in terms of different odor concentrations, and a sub-population of intrinsic mushroom body neurons (Kenyon cells) was calculated to quantify the efficiency of information transmission. With this method we study, on the one hand, the effect of different connectivity rates between olfactory projection neurons and firing thresholds of Kenyon cells. On the other hand, we analyze the influence of inhibition on mutual information between environment and mushroom body. Our simulations show an expected linear relation between the connectivity rate between the antennal lobe and the mushroom body and firing threshold of the Kenyon cells to obtain maximum mutual information for both low and high odor concentrations. However, contradicting everyday experience, high odor concentrations cause a drastic, and unrealistic, decrease in mutual information for all connectivity rates compared to low concentration. But when inhibition on the mushroom body is included, mutual information remains at high levels independent of other system parameters. This finding points to a pivotal role of inhibition in fly information processing without which the system's efficiency will be substantially reduced.
    Review:
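    A minimal, self-contained sketch of the kind of histogram-based mutual-information estimate the abstract above is built around; the toy stimulus, the binomial Kenyon-cell response model, and all parameters are assumptions for illustration, not the paper's model:

    import numpy as np

    rng = np.random.default_rng(0)

    # Toy stimulus: two odor concentration levels, presented with equal probability.
    concentrations = rng.choice([0.2, 1.0], size=5000)

    # Toy Kenyon-cell (KC) response: number of active cells out of 100, increasing
    # with concentration (an invented response model, not the paper's).
    n_kc = 100
    active_counts = rng.binomial(n_kc, 0.05 + 0.3 * concentrations)

    def mutual_information(x, y, bins=20):
        """Histogram-based estimate of I(X;Y) in bits."""
        joint, _, _ = np.histogram2d(x, y, bins=bins)
        p_xy = joint / joint.sum()
        p_x = p_xy.sum(axis=1, keepdims=True)
        p_y = p_xy.sum(axis=0, keepdims=True)
        nz = p_xy > 0
        return float(np.sum(p_xy[nz] * np.log2(p_xy[nz] / (p_x @ p_y)[nz])))

    print("I(concentration; KC activity) =",
          round(mutual_information(concentrations, active_counts), 3), "bits")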
    Tetzlaff, C. and Kolodziejski, C. and Timme, M. and Tsodyks, M. and Wörgötter, F. (2013).
    Synaptic scaling enables dynamically distinct short- and long-term memory formation. PLoS Computational Biology, e1003307, 9, 10. DOI: 10.1371/journal.pcbi.1003307.
    BibTeX:
    @article{tetzlaffkolodziejskitimme2013,
      author = {Tetzlaff, C. and Kolodziejski, C. and Timme, M. and Tsodyks, M. and Wörgötter, F.},
      title = {Synaptic scaling enables dynamically distinct short- and long-term memory formation},
      pages = {e1003307},
      journal = {PLoS Computational Biology},
      year = {2013},
      volume= {9},
      number = {10},
      doi = {10.1371/journal.pcbi.1003307},
      abstract = {Memory storage in the brain relies on mechanisms acting on time scales from minutes, for long-term synaptic potentiation, to days, for memory consolidation. During such processes, neural circuits distinguish synapses relevant for forming a long-term storage, which are consolidated, from synapses of short-term storage, which fade. How time scale integration and synaptic differentiation is simultaneously achieved remains unclear. Here we show that synaptic scaling - a slow process usually associated with the maintenance of activity homeostasis - combined with synaptic plasticity may simultaneously achieve both, thereby providing a natural separation of short- from long-term storage. The interaction between plasticity and scaling provides also an explanation for an established paradox where memory consolidation critically depends on the exact order of learning and recall. These results indicate that scaling may be fundamental for stabilizing memories, providing a dynamic link between early and late memory formation processes.}}
    Abstract: Memory storage in the brain relies on mechanisms acting on time scales from minutes, for long-term synaptic potentiation, to days, for memory consolidation. During such processes, neural circuits distinguish synapses relevant for forming a long-term storage, which are consolidated, from synapses of short-term storage, which fade. How time scale integration and synaptic differentiation is simultaneously achieved remains unclear. Here we show that synaptic scaling - a slow process usually associated with the maintenance of activity homeostasis - combined with synaptic plasticity may simultaneously achieve both, thereby providing a natural separation of short- from long-term storage. The interaction between plasticity and scaling provides also an explanation for an established paradox where memory consolidation critically depends on the exact order of learning and recall. These results indicate that scaling may be fundamental for stabilizing memories, providing a dynamic link between early and late memory formation processes.
    Review:
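    One common way to write the interplay of Hebbian plasticity and synaptic scaling described above is a single weight-update rule with a correlation-driven term and a homeostatic term that pushes the output rate back toward a target. The sketch below uses a quadratic weight dependence for the scaling term and invented constants; both are assumptions for illustration rather than the paper's equations:

    # Single plastic synapse onto a linear rate neuron (v = w * u).
    mu = 0.001       # Hebbian learning rate (assumed)
    gamma = 0.005    # synaptic-scaling rate (assumed)
    v_target = 0.5   # homeostatic target output rate (assumed)

    def final_weight(u, w=0.1, steps=20000, dt=1.0):
        """Euler-integrate dw/dt = mu*u*v + gamma*(v_target - v)*w**2."""
        for _ in range(steps):
            v = w * u
            w += dt * (mu * u * v + gamma * (v_target - v) * w ** 2)
        return w

    for u in (1.0, 0.1):
        w = final_weight(u)
        print(f"presynaptic rate {u}: weight -> {w:.2f}, output rate -> {w * u:.2f}")

    With the scaling term the weight settles at a finite value in both cases and the output rate is pulled back toward the target, whereas the Hebbian term alone (gamma = 0) would let the weight grow without bound.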
    Tetzlaff, C. and Kolodziejski, C. and Timme, M. and Wörgötter, F. (2012).
    Analysis of synaptic scaling in combination with Hebbian plasticity in several simple networks. Front. Comput. Neurosci, 36, 6. DOI: 10.3389/fncom.2012.00036.
    BibTeX:
    @article{tetzlaffkolodziejskitimme2012,
      author = {Tetzlaff, C. and Kolodziejski, C. and Timme, M. and Wörgötter, F.},
      title = {Analysis of synaptic scaling in combination with Hebbian plasticity in several simple networks},
      pages = {36},
      journal = {Front. Comput. Neurosci},
      year = {2012},
      volume= {6},
      doi = {10.3389/fncom.2012.00036},
      abstract = {Conventional synaptic plasticity in combination with synaptic scaling is a biologically plausible plasticity rule that guides the development of synapses toward stability. Here we analyze the development of synaptic connections and the resulting activity patterns in different feed-forward and recurrent neural networks, with plasticity and scaling. We show under which constraints an external input given to a feed-forward network forms an input trace similar to a cell assembly (Hebb, 1949) by enhancing synaptic weights to larger stable values as compared to the rest of the network. For instance, a weak input creates a less strong representation in the network than a strong input which produces a trace along large parts of the network. These processes are strongly influenced by the underlying connectivity. For example, when embedding recurrent structures (excitatory rings, etc.) into a feed-forward network, the input trace is extended into more distant layers, while inhibition shortens it. These findings provide a better understanding of the dynamics of generic network structures where plasticity is combined with scaling. This makes it also possible to use this rule for constructing an artificial network with certain desired storage properties.}}
    Abstract: Conventional synaptic plasticity in combination with synaptic scaling is a biologically plausible plasticity rule that guides the development of synapses toward stability. Here we analyze the development of synaptic connections and the resulting activity patterns in different feed-forward and recurrent neural networks, with plasticity and scaling. We show under which constraints an external input given to a feed-forward network forms an input trace similar to a cell assembly (Hebb, 1949) by enhancing synaptic weights to larger stable values as compared to the rest of the network. For instance, a weak input creates a less strong representation in the network than a strong input which produces a trace along large parts of the network. These processes are strongly influenced by the underlying connectivity. For example, when embedding recurrent structures (excitatory rings, etc.) into a feed-forward network, the input trace is extended into more distant layers, while inhibition shortens it. These findings provide a better understanding of the dynamics of generic network structures where plasticity is combined with scaling. This makes it also possible to use this rule for constructing an artificial network with certain desired storage properties.
    Review:
    Tetzlaff, C. and Kolodziejski, C. and Markelic, I. and Wörgötter, F. (2012).
    Time scales of memory, learning, and plasticity. Biol. Cybern, 715-726, 106, 11. DOI: 10.1007/s00422-012-0529.
    BibTeX:
    @article{tetzlaffkolodziejskimarkelic2012,
      author = {Tetzlaff, C. and Kolodziejski, C. and Markelic, I. and Wörgötter, F.},
      title = {Time scales of memory, learning, and plasticity},
      pages = {715-726},
      journal = {Biol. Cybern},
      year = {2012},
      volume= {106},
      number = {11},
      url = {http://dx.doi.org/10.1007/s00422-012-0529},
      doi = {10.1007/s00422-012-0529},
      abstract = {If we stored every bit of input, the storage capacity of our nervous system would be reached after only about 10 days. The nervous system relies on at least two mechanisms that counteract this capacity limit: compression and forgetting. But the latter mechanism needs to know how long an entity should be stored: some memories are relevant only for the next few minutes, some are important even after the passage of several years. Psychology and physiology have found and described many different memory mechanisms, and these mechanisms indeed use different time scales. In this prospect we review these mechanisms with respect to their time scale and propose relations between mechanisms in learning and memory and their underlying physiological basis}}
    Abstract: If we stored every bit of input, the storage capacity of our nervous system would be reached after only about 10 days. The nervous system relies on at least two mechanisms that counteract this capacity limit: compression and forgetting. But the latter mechanism needs to know how long an entity should be stored: some memories are relevant only for the next few minutes, some are important even after the passage of several years. Psychology and physiology have found and described many different memory mechanisms, and these mechanisms indeed use different time scales. In this prospect we review these mechanisms with respect to their time scale and propose relations between mechanisms in learning and memory and their underlying physiological basis
    Review:
    Tetzlaff, C. and Kolodziejski, C. and Timm, M. and Wörgötter, F. (2011).
    Synaptic Scaling in Combination with many Generic Plasticity Mechanisms Stabilizes Circuit Connectivity. Front. Comput. Neurosci, 47, 5. DOI: 10.3389/fncom.2011.00047.
    BibTeX:
    @article{tetzlaffkolodziejskitimm2011,
      author = {Tetzlaff, C. and Kolodziejski, C. and Timm, M. and Wörgötter, F.},
      title = {Synaptic Scaling in Combination with many Generic Plasticity Mechanisms Stabilizes Circuit Connectivity},
      pages = {47},
      journal = {Front. Comput. Neurosci},
      year = {2011},
      volume= {5},
      doi = {10.3389/fncom.2011.00047},
      abstract = {Synaptic scaling is a slow process that modifies synapses, keeping the firing rate of neural circuits in specific regimes. Together with other processes, such as conventional synaptic plasticity in the form of long term depression and potentiation, synaptic scaling changes the synaptic patterns in a network, ensuring diverse, functionally relevant, stable, and input-dependent connectivity. How synaptic patterns are generated and stabilized, however, is largely unknown. Here we formally describe and analyze synaptic scaling based on results from experimental studies and demonstrate that the combination of different conventional plasticity mechanisms and synaptic scaling provides a powerful general framework for regulating network connectivity. In addition, we design several simple models that reproduce experimentally observed synaptic distributions as well as the observed synaptic modifications during sustained activity changes. These models predict that the combination of plasticity with scaling generates globally stable, input-controlled synaptic patterns, also in recurrent networks. Thus, in combination with other forms of plasticity, synaptic scaling can robustly yield neuronal circuits with high synaptic diversity, which potentially enables robust dynamic storage of complex activation patterns. This mechanism is even more pronounced when considering networks with a realistic degree of inhibition. Synaptic scaling combined with plasticity could thus be the basis for learning structured behavior even in initially random networks}}
    Abstract: Synaptic scaling is a slow process that modifies synapses, keeping the firing rate of neural circuits in specific regimes. Together with other processes, such as conventional synaptic plasticity in the form of long term depression and potentiation, synaptic scaling changes the synaptic patterns in a network, ensuring diverse, functionally relevant, stable, and input-dependent connectivity. How synaptic patterns are generated and stabilized, however, is largely unknown. Here we formally describe and analyze synaptic scaling based on results from experimental studies and demonstrate that the combination of different conventional plasticity mechanisms and synaptic scaling provides a powerful general framework for regulating network connectivity. In addition, we design several simple models that reproduce experimentally observed synaptic distributions as well as the observed synaptic modifications during sustained activity changes. These models predict that the combination of plasticity with scaling generates globally stable, input-controlled synaptic patterns, also in recurrent networks. Thus, in combination with other forms of plasticity, synaptic scaling can robustly yield neuronal circuits with high synaptic diversity, which potentially enables robust dynamic storage of complex activation patterns. This mechanism is even more pronounced when considering networks with a realistic degree of inhibition. Synaptic scaling combined with plasticity could thus be the basis for learning structured behavior even in initially random networks
    Review:
    Tetzlaff, C. and Okujeni, S. and Egert, U. and Wörgötter, F. and Butz, M. (2010).
    Self-Organized Criticality in Developing Neuronal Networks. PLoS Comput Biol, 6, 12. DOI: 10.1371/journal.pcbi.1001013.
    BibTeX:
    @article{tetzlaffokujeniegert2010,
      author = {Tetzlaff, C. and Okujeni, S. and Egert, U. and Wörgötter, F. and Butz, M.},
      title = {Self-Organized Criticality in Developing Neuronal Networks},
      journal = {PLoS Comput Biol},
      year = {2010},
      volume= {6},
      number = {12},
      url = {http://www.ploscompbiol.org/article/info%3Adoi%2F10.1371%2Fjournal.pcbi.1001013},
      doi = {10.1371/journal.pcbi.1001013},
      abstract = {Recently evidence has accumulated that many neural networks exhibit self-organized criticality. In this state, activity is similar across temporal scales and this is beneficial with respect to information flow. If subcritical, activity can die out; if supercritical, epileptiform patterns may occur. Little is known about how developing networks will reach and stabilize criticality. Here we monitor the development between 13 and 95 days in vitro of cortical cell cultures (n = 20) and find four different phases, related to their morphological maturation: An initial low-activity state is followed by a supercritical and then a subcritical one until the network finally reaches stable criticality. Using network modeling and mathematical analysis we describe the dynamics of the emergent connectivity in such developing systems. Based on physiological observations, the synaptic development in the model is determined by the drive of the neurons to adjust their connectivity for reaching on average firing rate homeostasis. We predict a specific time course for the maturation of inhibition, with strong onset and delayed pruning, and that total synaptic connectivity should be strongly linked to the relative levels of excitation and inhibition. These results demonstrate that the interplay between activity and connectivity guides developing networks into criticality, suggesting that this may be a generic and stable state of many networks in vivo and in vitro.}}
    Abstract: Recently evidence has accumulated that many neural networks exhibit self-organized criticality. In this state, activity is similar across temporal scales and this is beneficial with respect to information flow. If subcritical, activity can die out; if supercritical, epileptiform patterns may occur. Little is known about how developing networks will reach and stabilize criticality. Here we monitor the development between 13 and 95 days in vitro of cortical cell cultures (n = 20) and find four different phases, related to their morphological maturation: An initial low-activity state is followed by a supercritical and then a subcritical one until the network finally reaches stable criticality. Using network modeling and mathematical analysis we describe the dynamics of the emergent connectivity in such developing systems. Based on physiological observations, the synaptic development in the model is determined by the drive of the neurons to adjust their connectivity for reaching on average firing rate homeostasis. We predict a specific time course for the maturation of inhibition, with strong onset and delayed pruning, and that total synaptic connectivity should be strongly linked to the relative levels of excitation and inhibition. These results demonstrate that the interplay between activity and connectivity guides developing networks into criticality, suggesting that this may be a generic and stable state of many networks in vivo and in vitro.
    Review:
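    Criticality in such data is commonly quantified through neuronal avalanches, i.e. bouts of activity separated by silent time bins, whose size distribution broadens near the critical point. The sketch below runs this avalanche analysis on a toy branching process rather than on culture recordings; the branching surrogate and all parameters are assumptions:

    import numpy as np

    rng = np.random.default_rng(1)

    def branching_activity(sigma, steps=100000, drive=0.01, n_max=1000):
        """Toy branching process: each active unit triggers Poisson(sigma) units in the next bin."""
        a, trace = 0, []
        for _ in range(steps):
            a = min(n_max, rng.poisson(sigma * a) + rng.poisson(drive))
            trace.append(a)
        return np.array(trace)

    def avalanche_sizes(trace):
        """Avalanche = summed activity between two silent time bins."""
        sizes, current = [], 0
        for a in trace:
            if a > 0:
                current += a
            elif current > 0:
                sizes.append(current)
                current = 0
        return np.array(sizes)

    for sigma in (0.7, 0.98):   # sub-critical vs near-critical branching parameter
        sizes = avalanche_sizes(branching_activity(sigma))
        print(f"sigma={sigma}: {len(sizes)} avalanches, "
              f"mean size {sizes.mean():.1f}, max size {sizes.max()}")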
    Kolodziejski, C. and Tetzlaff, C. and Wörgötter, F. (2010).
    Closed-form treatment of the interactions between neuronal activity and timing-dependent plasticity in networks of linear neurons. Front. Comput. Neurosci, 1-15, 4. DOI: 10.3389/fncom.2010.00134.
    BibTeX:
    @article{kolodziejskitetzlaffwoergoetter2010,
      author = {Kolodziejski, C. and Tetzlaff, C. and Wörgötter, F.},
      title = {Closed-form treatment of the interactions between neuronal activity and timing-dependent plasticity in networks of linear neurons},
      pages = {1-15},
      journal = {Front. Comput. Neurosci},
      year = {2010},
      volume= {4},
      doi = {10.3389/fncom.2010.00134},
      abstract = {Network activity and network connectivity mutually influence each other. Especially for fast processes, like spike-timing-dependent plasticity (STDP), which depends on the interaction of few (two) signals, the question arises how these interactions are continuously altering the behavior and structure of the network. To address this question a time-continuous treatment of plasticity is required. However, this is currently not possible even in simple recurrent network structures. Thus, here we develop for a linear differential Hebbian learning system a method by which we can analytically investigate the dynamics and stability of the connections in recurrent networks. We use noisy periodic external input signals, which through the recurrent connections lead to complex actual ongoing inputs and observe that large stable ranges emerge in these networks without boundaries or weight-normalization. Somewhat counter-intuitively, we find that about 40% of these cases are obtained with an LTP-dominated STDP curve. Noise can reduce stability in some cases, but generally this does not occur. Instead stable domains are often enlarged. This study is a first step towards a better understanding of the ongoing interactions between activity and plasticity in recurrent networks using STDP. The results suggest that stability of sub-networks should generically be present also in larger structures.}}
    Abstract: Network activity and network connectivity mutually influence each other. Especially for fast processes, like spike-timing-dependent plasticity (STDP), which depends on the interaction of few (two) signals, the question arises how these interactions are continuously altering the behavior and structure of the network. To address this question a time-continuous treatment of plasticity is required. However, this is currently not possible even in simple recurrent network structures. Thus, here we develop for a linear differential Hebbian learning system a method by which we can analytically investigate the dynamics and stability of the connections in recurrent networks. We use noisy periodic external input signals, which through the recurrent connections lead to complex actual ongoing inputs and observe that large stable ranges emerge in these networks without boundaries or weight-normalization. Somewhat counter-intuitively, we find that about 40% of these cases are obtained with an LTP-dominated STDP curve. Noise can reduce stability in some cases, but generally this does not occur. Instead stable domains are often enlarged. This study is a first step towards a better understanding of the ongoing interactions between activity and plasticity in recurrent networks using STDP. The results suggest that stability of sub-networks should generically be present also in larger structures.
    Review:
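    The closed-form analysis above concerns differential Hebbian learning in linear rate neurons, where the weight change follows the product of the presynaptic rate and the temporal derivative of the postsynaptic rate. A minimal sketch of such a rule for a single weight under periodic drive (learning rate and input are assumptions); over full input periods the weight shows only bounded excursions, which is the kind of stability question the paper analyzes:

    import numpy as np

    dt = 0.001
    t = np.arange(0.0, 10.0, dt)
    u = 1.0 + 0.5 * np.sin(2.0 * np.pi * t)   # periodic presynaptic rate (1 Hz, noise omitted)
    mu = 0.05                                  # learning rate (assumed)

    w = 0.5                                    # initial weight
    w_min = w_max = w
    v_prev = w * u[0]
    for ui in u:
        v = w * ui                             # linear postsynaptic neuron
        dv = (v - v_prev) / dt                 # temporal derivative of the output
        w += dt * mu * ui * dv                 # differential Hebbian update dw/dt = mu * u * dv/dt
        v_prev = v
        w_min, w_max = min(w_min, w), max(w_max, w)

    print(f"weight started at 0.50 and stayed within [{w_min:.3f}, {w_max:.3f}] over 10 input periods")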
    Fauth, M. and Wörgötter, F. and Tetzlaff, C. (2015).
    The Formation of Multi-synaptic Connections by the Interaction of Synaptic and Structural Plasticity and Their Functional Consequences. PLoS Comput Biol, e1004031, 11, 1. DOI: 10.1371/journal.pcbi.1004031.
    BibTeX:
    @article{fauthwoergoettertetzlaff2015,
      author = {Fauth, M. and Wörgötter, F. and Tetzlaff, C.},
      title = {The Formation of Multi-synaptic Connections by the Interaction of Synaptic and Structural Plasticity and Their Functional Consequences},
      pages = {e1004031},
      journal = {PLoS Comput Biol},
      year = {2015},
      volume= {11},
      number = {1},
      month = {01},
      publisher = {Public Library of Science},
      url = {http://dx.doi.org/10.1371%2Fjournal.pcbi.1004031},
      doi = {10.1371/journal.pcbi.1004031},
      abstract = {Author Summary: The connectivity between neurons is modified by different mechanisms. On a time scale of minutes to hours one finds synaptic plasticity, whereas mechanisms for structural changes at axons or dendrites may take days. One main factor determining structural changes is the weight of a connection, which, in turn, is adapted by synaptic plasticity. Both mechanisms, synaptic and structural plasticity, are influenced and determined by the activity pattern in the network. Hence, it is important to understand how activity and the different plasticity mechanisms influence each other. Especially how activity influences rewiring in adult networks is still an open question. We present a model, which captures these complex interactions by abstracting structural plasticity with weight-dependent probabilities. This allows for calculating the distribution of the number of synapses between two neurons analytically. We report that biologically realistic connection patterns for different cortical layers generically arise with synaptic plasticity rules in which the synaptic weights grow with postsynaptic activity. The connectivity patterns also lead to different activity levels resembling those found in the different cortical layers. Interestingly such a system exhibits a hysteresis by which connections remain stable longer than expected, which may add to the stability of information storage in the network.}}
    Abstract: Author Summary: The connectivity between neurons is modified by different mechanisms. On a time scale of minutes to hours one finds synaptic plasticity, whereas mechanisms for structural changes at axons or dendrites may take days. One main factor determining structural changes is the weight of a connection, which, in turn, is adapted by synaptic plasticity. Both mechanisms, synaptic and structural plasticity, are influenced and determined by the activity pattern in the network. Hence, it is important to understand how activity and the different plasticity mechanisms influence each other. Especially how activity influences rewiring in adult networks is still an open question. We present a model, which captures these complex interactions by abstracting structural plasticity with weight-dependent probabilities. This allows for calculating the distribution of the number of synapses between two neurons analytically. We report that biologically realistic connection patterns for different cortical layers generically arise with synaptic plasticity rules in which the synaptic weights grow with postsynaptic activity. The connectivity patterns also lead to different activity levels resembling those found in the different cortical layers. Interestingly such a system exhibits a hysteresis by which connections remain stable longer than expected, which may add to the stability of information storage in the network.
    Review:
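    The model referred to above treats the number of synapses between a neuron pair as a stochastic process with weight-dependent creation and removal probabilities. A minimal sketch of such a birth-death chain and its stationary distribution; the sigmoidal removal probability and all constants are invented for illustration (the paper instead derives general conditions under which bimodal distributions arise):

    import numpy as np

    S_MAX = 10
    p_build = 0.08                     # per-step probability of forming one more synapse (assumed)

    def p_remove(s):
        """Per-synapse removal probability; removal is suppressed once several synapses
        (i.e. a large compound weight) stabilize the connection (assumed functional form)."""
        return 0.01 + 0.2 / (1.0 + np.exp(2.0 * (s - 3)))

    # Discrete-time birth-death chain over s = 0..S_MAX synapses between one neuron pair.
    # Stationary distribution via detailed balance: pi(s+1)/pi(s) = build / removal(s+1).
    pi = np.ones(S_MAX + 1)
    for s in range(S_MAX):
        pi[s + 1] = pi[s] * p_build / ((s + 1) * p_remove(s + 1))
    pi /= pi.sum()

    print("stationary P(#synapses = 0..10):", np.round(pi, 3))
    # With this (assumed) stabilizing removal rule the distribution has a large mode at zero
    # and a second, shallow mode at several synapses per connection.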
    Tetzlaff, C. and Dasgupta, S. and Kulvicius, T. and Wörgötter, F. (2015).
    The Use of Hebbian Cell Assemblies for Nonlinear Computation. Scientific Reports, 5. DOI: 10.1038/srep12866.
    BibTeX:
    @article{tetzlaffdasguptakulvicius2015,
      author = {Tetzlaff, C. and Dasgupta, S. and Kulvicius, T. and Wörgötter, F.},
      title = {The Use of Hebbian Cell Assemblies for Nonlinear Computation},
      journal = {Scientific Reports},
      year = {2015},
      volume= {5},
      publisher = {Nature Publishing Group},
      url = {http://www.nature.com/articles/srep12866},
      doi = {10.1038/srep12866},
      abstract = {When learning a complex task our nervous system self-organizes large groups of neurons into coherent dynamic activity patterns. During this, a network with multiple, simultaneously active, and computationally powerful cell assemblies is created. How such ordered structures are formed while preserving a rich diversity of neural dynamics needed for computation is still unknown. Here we show that the combination of synaptic plasticity with the slower process of synaptic scaling achieves (i) the formation of cell assemblies and (ii) enhances the diversity of neural dynamics facilitating the learning of complex calculations. Due to synaptic scaling the dynamics of different cell assemblies do not interfere with each other. As a consequence, this type of self-organization allows executing a difficult, six degrees of freedom, manipulation task with a robot where assemblies need to learn computing complex non-linear transforms and - for execution - must cooperate with each other without interference. This mechanism, thus, permits the self-organization of computationally powerful sub-structures in dynamic networks for behavior control.}}
    Abstract: When learning a complex task our nervous system self-organizes large groups of neurons into coherent dynamic activity patterns. During this, a network with multiple, simultaneously active, and computationally powerful cell assemblies is created. How such ordered structures are formed while preserving a rich diversity of neural dynamics needed for computation is still unknown. Here we show that the combination of synaptic plasticity with the slower process of synaptic scaling achieves (i) the formation of cell assemblies and (ii) enhances the diversity of neural dynamics facilitating the learning of complex calculations. Due to synaptic scaling the dynamics of different cell assemblies do not interfere with each other. As a consequence, this type of self-organization allows executing a difficult, six degrees of freedom, manipulation task with a robot where assemblies need to learn computing complex non-linear transforms and - for execution - must cooperate with each other without interference. This mechanism, thus, permits the self-organization of computationally powerful sub-structures in dynamic networks for behavior control.
    Review:
    Fauth, M. and Wörgötter, F. and Tetzlaff, C. (2015).
    The Formation of Multi-synaptic Connections by the Interaction of Synaptic and Structural Plasticity and Their Functional Consequences. PLoS Comput Biol, e1004031, 11, 1. DOI: 10.1371/journal.pcbi.1004031.
    BibTeX:
    @article{fauthwoergoettertetzlaff2015a,
      author = {Fauth, M. and Wörgötter, F. and Tetzlaff, C.},
      title = {The Formation of Multi-synaptic Connections by the Interaction of Synaptic and Structural Plasticity and Their Functional Consequences},
      pages = {e1004031},
      journal = {PLoS Comput Biol},
      year = {2015},
      volume= {11},
      number = {1},
      institution = {Georg-August University Göttingen, Third Institute of Physics, Bernstein Center for Computational Neuroscience, Göttingen, Germany.},
      language = {english},
      month = {Jan},
      doi = {10.1371/journal.pcbi.1004031},
      abstract = {Cortical connectivity emerges from the permanent interaction between neuronal activity and synaptic as well as structural plasticity. An important experimentally observed feature of this connectivity is the distribution of the number of synapses from one neuron to another, which has been measured in several cortical layers. All of these distributions are bimodal with one peak at zero and a second one at a small number (3-8) of synapses. In this study, using a probabilistic model of structural plasticity, which depends on the synaptic weights, we explore how these distributions can emerge and which functional consequences they have. We find that bimodal distributions arise generically from the interaction of structural plasticity with synaptic plasticity rules that fulfill the following biologically realistic constraints: First, the synaptic weights have to grow with the postsynaptic activity. Second, this growth curve and/or the input-output relation of the postsynaptic neuron have to change sub-linearly (negative curvature). As most neurons show such input-output-relations, these constraints can be fulfilled by many biologically reasonable systems. Given such a system, we show that the different activities, which can explain the layer-specific distributions, correspond to experimentally observed activities. Considering these activities as the working point of the system and varying the pre- or postsynaptic stimulation reveals a hysteresis in the number of synapses. As a consequence of this, the connectivity between two neurons can be controlled by activity but is also safeguarded against overly fast changes. These results indicate that the complex dynamics between activity and plasticity will, already between a pair of neurons, induce a variety of possible stable synaptic distributions, which could support memory mechanisms.}}
    Abstract: Cortical connectivity emerges from the permanent interaction between neuronal activity and synaptic as well as structural plasticity. An important experimentally observed feature of this connectivity is the distribution of the number of synapses from one neuron to another, which has been measured in several cortical layers. All of these distributions are bimodal with one peak at zero and a second one at a small number (3-8) of synapses. In this study, using a probabilistic model of structural plasticity, which depends on the synaptic weights, we explore how these distributions can emerge and which functional consequences they have. We find that bimodal distributions arise generically from the interaction of structural plasticity with synaptic plasticity rules that fulfill the following biologically realistic constraints: First, the synaptic weights have to grow with the postsynaptic activity. Second, this growth curve and/or the input-output relation of the postsynaptic neuron have to change sub-linearly (negative curvature). As most neurons show such input-output-relations, these constraints can be fulfilled by many biologically reasonable systems. Given such a system, we show that the different activities, which can explain the layer-specific distributions, correspond to experimentally observed activities. Considering these activities as the working point of the system and varying the pre- or postsynaptic stimulation reveals a hysteresis in the number of synapses. As a consequence of this, the connectivity between two neurons can be controlled by activity but is also safeguarded against overly fast changes. These results indicate that the complex dynamics between activity and plasticity will, already between a pair of neurons, induce a variety of possible stable synaptic distributions, which could support memory mechanisms.
    Review:
    Fauth, M. and Wörgötter, F. and Tetzlaff, C. (2015).
    Formation and Maintenance of Robust Long-Term Information Storage in the Presence of Synaptic Turnover. PLoS Comput Biol, e1004684, 11, 12. DOI: 10.1371/journal.pcbi.1004684.
    BibTeX:
    @article{fauthwoergoettertetzlaff2015b,
      author = {Fauth, M. and Wörgötter, F. and Tetzlaff, C.},
      title = {Formation and Maintenance of Robust Long-Term Information Storage in the Presence of Synaptic Turnover},
      pages = {e1004684},
      journal = {PLoS Comput Biol},
      year = {2015},
      volume= {11},
      number = {12},
      institution = {Department of Neurobiology, Weizmann Institute of Science, Rehovot, Israel.},
      language = {eng},
      month = {Dec},
      doi = {10.1371/journal.pcbi.1004684},
      abstract = {A long-standing problem is how memories can be stored for very long times despite the volatility of the underlying neural substrate, most notably the high turnover of dendritic spines and synapses. To address this problem, here we are using a generic and simple probabilistic model for the creation and removal of synapses. We show that information can be stored for several months when utilizing the intrinsic dynamics of multi-synapse connections. In such systems, single synapses can still show high turnover, which enables fast learning of new information, but this will not perturb prior stored information (slow forgetting), which is represented by the compound state of the connections. The model matches the time course of recent experimental spine data during learning and memory in mice supporting the assumption of multi-synapse connections as the basis for long-term storage.}}
    Abstract: A long-standing problem is how memories can be stored for very long times despite the volatility of the underlying neural substrate, most notably the high turnover of dendritic spines and synapses. To address this problem, here we are using a generic and simple probabilistic model for the creation and removal of synapses. We show that information can be stored for several months when utilizing the intrinsic dynamics of multi-synapse connections. In such systems, single synapses can still show high turnover, which enables fast learning of new information, but this will not perturb prior stored information (slow forgetting), which is represented by the compound state of the connections. The model matches the time course of recent experimental spine data during learning and memory in mice supporting the assumption of multi-synapse connections as the basis for long-term storage.
    Review:
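    The separation above between the lifetime of individual synapses and that of the compound multi-synapse connection can be illustrated with a small Monte-Carlo sketch; the turnover and formation probabilities are assumptions, not the paper's fitted values:

    import numpy as np

    rng = np.random.default_rng(2)

    P_LOSS = 0.05   # per-synapse, per-step removal probability (assumed)
    P_GAIN = 0.3    # per-step probability of adding a synapse while the connection exists (assumed)

    def connection_lifetime(s0=4, t_max=20000):
        """Steps until a multi-synapse connection loses its last synapse."""
        s = s0
        for t in range(1, t_max + 1):
            s -= rng.binomial(s, P_LOSS)            # each synapse is removed independently
            if s > 0 and rng.random() < P_GAIN:     # new synapses form only at existing connections
                s += 1
            if s == 0:
                return t
        return t_max

    lifetimes = [connection_lifetime() for _ in range(200)]
    print("mean lifetime of a single synapse ≈", round(1 / P_LOSS), "steps")
    print("median lifetime of the connection ≈", int(np.median(lifetimes)), "steps")

    In this toy setting individual synapses turn over on a time scale of tens of steps, while the connection as a whole typically survives orders of magnitude longer, which is the qualitative separation the abstract describes.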
    Nachstedt, T. and Tetzlaff, C. (2017).
    Working Memory Requires a Combination of Transient and Attractor-Dominated Dynamics to Process Unreliably Timed Inputs. Scientific Reports, 2473, 7, 1. DOI: 10.1038/s41598-017-02471-z.
    BibTeX:
    @article{nachstedttetzlaff2017,
      author = {Nachstedt, T. and Tetzlaff, C.},
      title = {Working Memory Requires a Combination of Transient and Attractor-Dominated Dynamics to Process Unreliably Timed Inputs},
      pages = {2473},
      journal = {Scientific Reports},
      year = {2017},
      volume= {7},
      number = {1},
      publisher = {Springer US},
      url = {http://www.nature.com/articles/s41598-017-02471-z},
      doi = {10.1038/s41598-017-02471-z},
      abstract = {Working memory stores and processes information received as a stream of continuously incoming stimuli. This requires accurate sequencing and it remains puzzling how this can be reliably achieved by the neuronal system as our perceptual inputs show a high degree of temporal variability. One hypothesis is that accurate timing is achieved by purely transient neuronal dynamics; by contrast, a second hypothesis states that the underlying network dynamics are dominated by attractor states. In this study, we resolve this contradiction by theoretically investigating the performance of the system using stimuli with differently accurate timing. Interestingly, only the combination of attractor and transient dynamics enables the network to perform with a low error rate. Further analysis reveals that the transient dynamics of the system are used to process information, while the attractor states store it. The interaction between both types of dynamics yields experimentally testable predictions and we show that this way the system can reliably interact with a timing-unreliable Hebbian-network representing long-term memory. Thus, this study provides a potential solution to the long-standing problem of the basic neuronal dynamics underlying working memory.}}
    Abstract: Working memory stores and processes information received as a stream of continuously incoming stimuli. This requires accurate sequencing and it remains puzzling how this can be reliably achieved by the neuronal system as our perceptual inputs show a high degree of temporal variability. One hypothesis is that accurate timing is achieved by purely transient neuronal dynamics; by contrast, a second hypothesis states that the underlying network dynamics are dominated by attractor states. In this study, we resolve this contradiction by theoretically investigating the performance of the system using stimuli with differently accurate timing. Interestingly, only the combination of attractor and transient dynamics enables the network to perform with a low error rate. Further analysis reveals that the transient dynamics of the system are used to process information, while the attractor states store it. The interaction between both types of dynamics yields experimentally testable predictions and we show that this way the system can reliably interact with a timing-unreliable Hebbian-network representing long-term memory. Thus, this study provides a potential solution to the long-standing problem of the basic neuronal dynamics underlying working memory.
    Review:
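    The attractor-dominated storage contrasted with transient processing above can be illustrated by a single rate unit with strong recurrent self-excitation, which latches a brief input pulse into persistent activity. A minimal sketch with assumed parameters (the paper's network combines such attractor states with transient dynamics, which this toy unit does not capture):

    import numpy as np

    def sigmoid(x):
        return 1.0 / (1.0 + np.exp(-x))

    dt, tau = 0.1, 1.0
    w_rec, theta = 8.0, 4.0        # recurrent weight and activation threshold (assumed)

    r = 0.0
    rates = []
    for step in range(2000):
        t = step * dt
        inp = 3.0 if 20.0 <= t < 25.0 else 0.0      # brief input pulse
        r += dt / tau * (-r + sigmoid(w_rec * r + inp - theta))
        rates.append(r)

    print("rate before the pulse:", round(rates[int(15 / dt)], 3))
    print("rate long after pulse:", round(rates[-1], 3))   # stays high: the attractor stores the input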
    Nachstedt, T. and Tetzlaff, C. and Manoonpong, P. (2017).
    Fast dynamical coupling enhances frequency adaptation of oscillators for robotic locomotion control. Frontiers in Neurorobotics, 1--14, 11. DOI: 10.3389/fnbot.2017.00014.
    BibTeX:
    @article{nachstedttetzlaffmanoonpong2017,
      author = {Nachstedt, T. and Tetzlaff, C. and Manoonpong, P.},
      title = {Fast dynamical coupling enhances frequency adaptation of oscillators for robotic locomotion control},
      pages = {1--14},
      journal = {Frontiers in Neurorobotics},
      year = {2017},
      volume= {11},
      url = {http://journal.frontiersin.org/article/10.3389/fnbot.2017.00014},
      doi = {10.3389/fnbot.2017.00014},
      abstract = {Rhythmic neural signals serve as basis of many brain processes, in particular of locomotion control and generation of rhythmic movements. It has been found that specific neural circuits, named central pattern generators (CPGs), are able to autonomously produce such rhythmic activities. In order to tune, shape and coordinate the produced rhythmic activity, CPGs require sensory feedback, i.e., external signals. Nonlinear oscillators are a standard model of CPGs and are used in various robotic applications. A special class of nonlinear oscillators are adaptive frequency oscillators (AFOs). AFOs are able to adapt their frequency toward the frequency of an external periodic signal and to keep this learned frequency once the external signal vanishes. AFOs have been successfully used, for instance, for resonant tuning of robotic locomotion control. However, the choice of parameters for a standard AFO is characterized by a trade-off between the speed of the adaptation and its precision and, additionally, is strongly dependent on the range of frequencies the AFO is confronted with. As a result, AFOs are typically tuned such that they require a comparably long time for their adaptation. To overcome the problem, here, we improve the standard AFO by introducing a novel adaptation mechanism based on dynamical coupling strengths. The dynamical adaptation mechanism enhances both the speed and precision of the frequency adaptation. In contrast to standard AFOs, in this system, the interplay of dynamics on short and long time scales enables fast as well as precise adaptation of the oscillator for a wide range of frequencies. Amongst others, a very natural implementation of this mechanism is in terms of neural networks. The proposed system enables robotic applications which require fast retuning of locomotion control in order to react to environmental changes or conditions.}}
    Abstract: Rhythmic neural signals serve as basis of many brain processes, in particular of locomotion control and generation of rhythmic movements. It has been found that specific neural circuits, named central pattern generators (CPGs), are able to autonomously produce such rhythmic activities. In order to tune, shape and coordinate the produced rhythmic activity, CPGs require sensory feedback, i.e., external signals. Nonlinear oscillators are a standard model of CPGs and are used in various robotic applications. A special class of nonlinear oscillators are adaptive frequency oscillators (AFOs). AFOs are able to adapt their frequency toward the frequency of an external periodic signal and to keep this learned frequency once the external signal vanishes. AFOs have been successfully used, for instance, for resonant tuning of robotic locomotion control. However, the choice of parameters for a standard AFO is characterized by a trade-off between the speed of the adaptation and its precision and, additionally, is strongly dependent on the range of frequencies the AFO is confronted with. As a result, AFOs are typically tuned such that they require a comparably long time for their adaptation. To overcome the problem, here, we improve the standard AFO by introducing a novel adaptation mechanism based on dynamical coupling strengths. The dynamical adaptation mechanism enhances both the speed and precision of the frequency adaptation. In contrast to standard AFOs, in this system, the interplay of dynamics on short and long time scales enables fast as well as precise adaptation of the oscillator for a wide range of frequencies. Amongst others, a very natural implementation of this mechanism is in terms of neural networks. The proposed system enables robotic applications which require fast retuning of locomotion control in order to react to environmental changes or conditions.
    Review:
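    Adaptive frequency oscillators of the kind discussed above are commonly written as a Hopf oscillator whose intrinsic frequency is slowly pulled toward that of a periodic teaching signal. The sketch below implements only this standard formulation with assumed constants; the paper's dynamical coupling strengths, which speed up and sharpen the adaptation, are not reproduced here:

    import numpy as np

    dt = 0.001
    gamma, mu, eps = 10.0, 1.0, 0.9    # Hopf parameters and input coupling (assumed)
    omega_teach = 5.0                  # frequency of the external periodic signal (rad/s)

    x, y, omega = 1.0, 0.0, 2.0        # oscillator starts far below the teaching frequency
    for step in range(int(100.0 / dt)):
        F = np.sin(omega_teach * step * dt)            # external periodic drive
        r = max(np.hypot(x, y), 1e-9)
        dx = gamma * (mu - r**2) * x - omega * y + eps * F
        dy = gamma * (mu - r**2) * y + omega * x
        domega = -eps * F * y / r                      # adaptation of the intrinsic frequency
        x, y, omega = x + dt * dx, y + dt * dy, omega + dt * domega

    print("teaching frequency:", omega_teach, "rad/s")
    print("adapted frequency :", round(omega, 2), "rad/s")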
