What are the response frequencies of sensory neurons?

Both visual and auditory stimuli are sent to the brain via ganglion cells (retinal and spiral ganglion cells, respectively). Both are the first cells along their respective pathways that produce action potentials.

My question concerns the typical frequencies of action potentials sent along the axons of retinal vs. spiral ganglion cells in response to a "typical stimulus", i.e. a moderately long, moderately strong signal of some fixed frequency (e.g. red light, or a 440 Hz tone) against a white or silent background, respectively.

Are these frequencies of comparable range, or does one type of ganglion cell (retinal vs. spiral) fire with a significantly higher or lower rate than the other?

(The question would not make sense if the physical frequencies of light and sound, which trigger the receptor cells, were coded directly as frequencies of action potentials. But I assume that this is not the case, is it?)


The auditory brainstem shows "phase-locking" typically up to 1-3 kHz at most. 3000 Hz would be an incredibly high firing rate for a single neuron, but this phase-locking is achieved not by individual cells firing in phase with an auditory stimulus, but rather by a population of cells that tend to fire in phase, such that if you average across the population you get a phase-locked population volley.

In some cases, in some animals, this phase locking can even get to the higher frequencies (see here for example).

However, this phase locking seems primarily important for sound localization via interaural time differences. Frequency itself is encoded by which population of hair cells is activated, according to the properties of the basilar membrane. Firing rates of individual spiral ganglion cells are only faster than 100 Hz at very high stimulus intensities.

Similar to the spiral ganglion cells, retinal ganglion cells primarily encode intensity information in their firing rates.

However, in both cases, it's important to recognize how crucial adaptation is in sensory systems. RGCs in particular fire primarily to transients, so it is typical to use light flashes, drifting gratings, or other dynamic stimuli. The response to a "medium long, medium strong" signal of some fixed wavelength is going to be brief, followed by silence, not the constant response you imply.


A developmental switch in the response of DRG neurons to ETS transcription factor signaling

Two ETS transcription factors of the Pea3 subfamily are induced in subpopulations of dorsal root ganglion (DRG) sensory and spinal motor neurons by target-derived factors. Their expression controls late aspects of neuronal differentiation such as target invasion and branching. Here, we show that the late onset of ETS gene expression is an essential requirement for normal sensory neuron differentiation. We provide genetic evidence in the mouse that precocious ETS expression in DRG sensory neurons perturbs axonal projections, the acquisition of terminal differentiation markers, and their dependence on neurotrophic support. Together, our findings indicate that DRG sensory neurons exhibit a temporal developmental switch that can be revealed by distinct responses to ETS transcription factor signaling at sequential steps of neuronal maturation.

Figures

Figure 1. Replacement of Er81 by EWS-Pea3. (A) Generation of Er81 EWS-Pea3 mutant mice. Above…
Figure 2. Rescue of Ia Proprioceptive Afferent Projections into the Ventral Spinal Cord in Er81…
Figure 3. Defects in the Establishment of Sensory Afferent Projections upon Precocious Expression of EWS-Pea3…
Figure 4. Neurotrophin-Independent Neurite Outgrowth In Vitro of DRG Neurons Expressing EWS-Pea3 Precociously
Figure 5. DRG Neurons Expressing EWS-Pea3 Isochronically Depend on Neurotrophins for Survival
Figure 6. Loss of Trk Receptor Expression and Increased Survival in DRG Neurons upon Precocious…
Figure 7. Gene Expression Analysis upon Induction of Precocious or Isochronic ETS Signaling
Figure 8. Precocious ETS Signaling Induces Gene Expression Changes Cell-Autonomously
Figure 9. Progressive Neuronal Specification Is Paralleled by a Developmental Shift in Response to ETS…


The Effect of Lipopolysaccharides on Primary Sensory Neurons in Crustacean Models

Many types of gram-negative bacteria are responsible for serious infections, such as septicemia. Lipopolysaccharides (LPS), the endotoxins released from these bacteria, are responsible for inducing the immune response of organisms such as crustaceans, which have well-conserved Toll-like receptors. Little is known about the direct impact LPS has on primary sensory neurons apart from this immune reaction. Previous studies have demonstrated that motor neurons increase both spontaneous and evoked firing frequencies with LPS, but differences have been observed across species. Here, the effects of LPS from two strains of gram-negative bacteria (Serratia marcescens and Pseudomonas aeruginosa) on the firing frequency of primary sensory proprioceptors in the crab propodite-dactylopodite (PD) organ and crayfish muscle receptor organ (MRO) are examined. These sensory organs correlate to mammalian proprioception, as the MRO is analogous to the mammalian muscle spindle, and the PD organ allows for the separation of motor nerve function from sensory neuronal transduction. The neuronal function of the two model organisms was studied through the stretch-activation of rapidly-adapting and slowly-adapting sensory neurons. Results indicated that there is no statistically significant impact on sensory transduction from the application of LPS; however, in the crab PD organ, the application of LPS from either strain decreased the nerve activity except when the LPS from both bacteria was applied together. In the crayfish MRO, there usually was an increase in nerve activity. In saline controls, there was also an increase in firing of the neurons in both preparations, but this also was not statistically significant. Interestingly, the MRO muscle fibers often contracted upon the addition of LPS, perhaps indicating that the known impact LPS has on motor nerve function is partially responsible for the results obtained.


The Brain ‘Rotates’ Memories to Save Them From New Sensations

Research in mice shows that neural representations of sensory information get rotated 90 degrees to transform them into memories. In this orthogonal arrangement, the memories and sensations do not interfere with one another. Illustration: Samuel Velasco/Quanta Magazine

During every waking moment, we humans and other animals have to balance on the edge of our awareness of past and present. We must absorb new sensory information about the world around us while holding on to short-term memories of earlier observations or events. Our ability to make sense of our surroundings, to learn, to act, and to think all depend on constant, nimble interactions between perception and memory.

But to accomplish this, the brain has to keep the two distinct; otherwise, incoming data streams could interfere with representations of previous stimuli and cause us to overwrite or misinterpret important contextual information. Compounding that challenge, a body of research hints that the brain does not neatly partition short-term memory function exclusively into higher cognitive areas like the prefrontal cortex. Instead, the sensory regions and other lower cortical centers that detect and represent experiences may also encode and store memories of them. And yet those memories can’t be allowed to intrude on our perception of the present, or to be randomly rewritten by new experiences.

A paper published recently in Nature Neuroscience may finally explain how the brain’s protective buffer works. A pair of researchers showed that, to represent current and past stimuli simultaneously without mutual interference, the brain essentially “rotates” sensory information to encode it as a memory. The two orthogonal representations can then draw from overlapping neural activity without intruding on each other. The details of this mechanism may help to resolve several long-standing debates about memory processing.

To figure out how the brain prevents new information and short-term memories from blurring together, Timothy Buschman, a neuroscientist at Princeton University, and Alexandra Libby, a graduate student in his lab, decided to focus on auditory perception in mice. They had the animals passively listen to sequences of four chords over and over again, in what Buschman dubbed “the worst concert ever.”

These sequences allowed the mice to establish associations between certain chords, so that when they heard one initial chord versus another, they could predict what sounds would follow. Meanwhile, the researchers trained machine-learning classifiers to analyze the neural activity recorded from the rodents’ auditory cortex during these listening sessions, to determine how the neurons collectively represented each stimulus in the sequence.

Buschman and Libby watched how those patterns changed as the mice built up their associations. They found that over time, the neural representations of associated chords began to resemble each other. But they also observed that new, unexpected sensory inputs, such as unfamiliar sequences of chords, could interfere with a mouse’s representations of what it was hearing—in effect, by overwriting its representation of previous inputs. The neurons retroactively changed their encoding of a past stimulus to match what the animal associated with the later stimulus—even if that was wrong.

The researchers wanted to determine how the brain must be correcting for this retroactive interference to preserve accurate memories. So they trained another classifier to identify and differentiate neural patterns that represented memories of the chords in the sequences—the way the neurons were firing, for instance, when an unexpected chord evoked a comparison to a more familiar sequence. The classifier did find intact patterns of activity from memories of the actual chords that had been heard—rather than the false “corrections” written retroactively to uphold older associations—but those memory encodings looked very different from the sensory representations.

The memory representations were organized in what neuroscientists describe as an “orthogonal” dimension to the sensory representations, all within the same population of neurons. Buschman likened it to running out of room while taking handwritten notes on a piece of paper. When that happens, “you will rotate your piece of paper 90 degrees and start writing in the margins,” he said. “And that’s basically what the brain is doing. It gets that first sensory input, it writes it down on the piece of paper, and then it rotates that piece of paper 90 degrees so that it can write in a new sensory input without interfering or literally overwriting.”

In other words, sensory data was transformed into a memory through a morphing of the neuronal firing patterns. “The information changes because it needs to be protected,” said Anastasia Kiyonaga, a cognitive neuroscientist at UC San Diego who was not involved in the study.

This use of orthogonal coding to separate and protect information in the brain has been seen before. For instance, when monkeys are preparing to move, neural activity in their motor cortex represents the potential movement but does so orthogonally to avoid interfering with signals driving actual commands to the muscles.

Still, it often hasn’t been clear how the neural activity gets transformed in this way. Buschman and Libby wanted to answer that question for what they were observing in the auditory cortex of their mice. “When I first started in the lab, it was hard for me to imagine how something like that could happen with neural firing activity,” Libby said. She wanted to “open the black box of what the neural network is doing to create this orthogonality.”

Experimentally sifting through the possibilities, they ruled out the possibility that different subsets of neurons in the auditory cortex were independently handling the sensory and memory representations. Instead, they showed that the same general population of neurons was involved, and that the activity of the neurons could be divided neatly into two categories. Some were “stable” in their behavior during both the sensory and memory representations, while other “switching” neurons flipped the patterns of their responses for each use.

To the researchers’ surprise, this combination of stable and switching neurons was enough to rotate the sensory information and transform it into memory. “That’s the entire magic,” Buschman said.

In fact, he and Libby used computational modeling approaches to show that this mechanism was the most efficient way to build the orthogonal representations of sensation and memory: It required fewer neurons and less energy than the alternatives.
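A toy numerical sketch (not the authors' model or data, and with invented numbers) may make the geometry concrete: give each neuron a ±1 selectivity for the stimulus during the sensory period, let "stable" neurons keep that selectivity during the memory period while "switching" neurons invert it, and the sensory and memory coding axes come out close to orthogonal.

```python
# Toy illustration of the stable/switching idea described above (not the
# authors' model): with roughly half the population switching sign, the memory
# axis ends up nearly orthogonal to the sensory axis.
import numpy as np

rng = np.random.default_rng(0)
n_neurons = 100                                   # hypothetical population size
stable = rng.random(n_neurons) < 0.5              # ~half stable, ~half switching (assumption)

# Sensory axis: each neuron's selectivity (+1 or -1) for "chord A vs chord B"
# while the sound is playing.
sensory_axis = rng.choice([-1.0, 1.0], size=n_neurons)

# Memory axis: stable neurons keep their selectivity, switching neurons invert it.
memory_axis = np.where(stable, sensory_axis, -sensory_axis)

cos_angle = sensory_axis @ memory_axis / (np.linalg.norm(sensory_axis) * np.linalg.norm(memory_axis))
print(f"angle between sensory and memory axes: {np.degrees(np.arccos(cos_angle)):.1f} degrees")
# An exactly even stable/switching split gives exactly 90 degrees; a random
# ~50/50 split gives an angle close to 90 degrees, so the stored memory and the
# incoming sensory representation barely interfere.
```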

Buschman and Libby’s findings feed into an emerging trend in neuroscience: that populations of neurons, even in lower sensory regions, are engaged in richer dynamic coding than was previously thought. “These parts of the cortex that are lower down in the food chain are also fitted out with really interesting dynamics that maybe we haven’t really appreciated until now,” said Miguel Maravall, a neuroscientist at the University of Sussex who was not involved in the new study.

The work could help reconcile two sides of an ongoing debate about whether short-term memories are maintained through constant, persistent representations or through dynamic neural codes that change over time. Instead of coming down on one side or the other, “our results show that basically they were both right,” Buschman said, with stable neurons achieving the former and switching neurons the latter. The combination of processes is useful because “it actually helps with preventing interference and doing this orthogonal rotation.”

Buschman and Libby’s study might be relevant in contexts beyond sensory representation. They and other researchers hope to look for this mechanism of orthogonal rotation in other processes: in how the brain keeps track of multiple thoughts or goals at once; in how it engages in a task while dealing with distractions; in how it represents internal states; and in how it controls cognition, including attention processes.

“I’m really excited,” Buschman said. Looking at other researchers’ work, “I just remember seeing, there’s a stable neuron, there’s a switching neuron! You see them all over the place now.”

Libby is interested in the implications of their results for artificial intelligence research, particularly in the design of architectures useful for AI networks that have to multitask. “I would want to see if people pre-allocating neurons in their neural networks to have stable and switching properties, instead of just random properties, helped their networks in some way,” she said.

All in all, “the consequences of this kind of coding of information are going to be really important and really interesting to figure out,” Maravall said.

Original story reprinted with permission from Quanta Magazine, an editorially independent publication of the Simons Foundation whose mission is to enhance public understanding of science by covering research developments and trends in mathematics and the physical and life sciences.


Discussion

In this work, we asked how information from somatosensory and auditory inputs is integrated in the mouse neocortex. With two-photon Ca2+ imaging, we investigated large populations of layer 2/3 neurons across somatosensory and auditory areas with single cell resolution. We found that neurons across somatosensory cortices are tuned to the frequency of tactile stimulation. The addition of concurrent sound resulted in modulation of these tactile responses in both S1 and S2, and this modulation typically manifested as a suppression of the response. Moreover, the degree of suppression depended on tactile frequency, with responses to low frequencies more inhibited than responses to high frequencies. We also identified a population of neurons in S2 responsive to sound but not to touch. Unlike in auditory cortex, sound responses of many (31 of 82) sound-selective neurons in S2 were strongly inhibited by addition of tactile stimuli at high tactile frequencies. These neurons were spatially colocalized with S2 touch-selective neurons.

The detection of the frequency of mechanical vibrations is important for animals to discern surface texture and to handle tools 30,31 , and tuning to spectral frequency in the somatosensory system can encode texture information 32 . In our study, the presence of well-tuned neurons in both S1 and S2 supports the notion that tactile frequency tuning may be a general organizational feature for mouse tactile sensation. The higher proportion of neurons with tuning to lower tactile frequencies in S2 than in S1 may reflect differences in thalamocortical inputs to the two regions. S1 receives strong thalamic drive from the ventral posterior medial nucleus (VPM), while S2 receives a larger share of its thalamocortical input from the posterior medial nucleus (POm) 33,34 . Interestingly, although both POm and VPM cells show adaptation, causing decreased response amplitude under high frequency stimulation, POm cells exhibit earlier adaptation than VPM cells 35 and as a result are tuned to lower frequencies than VPM cells. Thus, the tuning properties of neurons in S2 may be inherited from the response properties of thalamic neurons, although it could also reflect longer temporal integration windows in higher areas of cortex 36 .

We found that the addition of an auditory stimulus modulated tactile responses in both S1 and S2, consistent with the sound-driven hyperpolarizing currents previously observed in mouse S1 37 . This modulation has three notable features: (1) Although a similar proportion of neurons in both S1 and S2 were facilitated by sound, more neurons in S2 were inhibited than in S1 (Figs. 3d and 5d). (2) Inhibition of neurons tuned to low tactile frequencies in both S1 and S2 was more severe than inhibition of neurons to high tactile frequencies in the same regions (Figs. 3e and 5e). (3) Sound-driven suppression in S2 is tactile frequency dependent, with stronger inhibition occurring at lower tactile frequencies (Figs. 7 and 8). Previous studies in human and non-human primates have revealed that multimodal integration improves detection of events in the environment 3,38,39 . The optimal integration of competing sensory cues involves dominance of the most reliable sensory cue to minimize variance in the estimation of the true stimulus 3 . This evaluation of reliability between different sensory cues is a dynamic process, with the weight or value of each stimulus modality being continuously updated 39 . Low frequency tactile stimulation is potentially less salient of a signal than high frequency tactile stimulation, since it comprises lower velocity whisker motions. Indeed, we observed more suppression of tactile responses at lower tactile stimulus frequencies than at high frequencies (Figs. 7 and 8), indicating that auditory responses are more dominant when tactile stimuli are weak. This result is consistent with the prior observation that, during optimal multimodal integration, the more reliable stimulus modality dominates the response 40 . On the other hand, this frequency-dependent integration is complementary to “inverse effectiveness,” where multimodal integration is largest for weak multimodal stimuli near threshold and decreases with increasing stimulus intensity, as has been reported in the superior colliculus 41,42 .

Sound-touch modulations may involve more than just direct interactions between the unimodal stimuli themselves. Attention, arousal, motor behavior, and hidden internal states can be influenced by sensory stimuli and they, in turn, can influence the response to a sensory stimulus 23,24,43,44,45 . Indeed, multisensory integration, if relevant to behavior, should be associated with a change in the internal state of the animal. Pointing towards this complex interplay of stimuli and internal states, we found that locomotive behavior, while able to influence sensory responses, could not explain sound-driven inhibition of tactile responses on its own (Fig. 4). To untangle these potentially complex interconnections, the underlying cellular and network mechanisms that mediate these interactions need to be uncovered. While our present work focused on neurons in layers 2/3, a fruitful locus of study would be layers 1 and 6, where crossmodal 46 and neuromodulatory 47 inputs are known to be stronger and may thus gate sensory inputs and mediate attentional effects.

Previously, it was believed that multimodal influences on activities within classically defined unimodal areas are mediated by feedback from multisensory integration in higher-order cortical regions 48,49. However, human studies using event-related potentials (ERPs) suggest that these multimodal influences may also be carried in the feedforward inputs coming from subcortical regions to unimodal regions 48,50,51. In the present study, we identified a small (1.2%) population of sound-selective neurons within S2 itself. Although prior studies have shown non-matching neurons in primary cortices that respond solely to other sensory modality inputs 52, the sound-selective neurons we found may play a special computational role in multimodal integration. The sound-driven responses in these neurons were strongly suppressed at high tactile frequencies (Fig. 9a–e), and those neurons inhibited by tactile stimuli are clustered near the center of the whisker-responsive region of S2 (Fig. 10), similar to the spatial organization of non-matching neurons seen in other studies 51,52. The existence of touch-inhibited sound-selective neurons in S2 indicates that they may play a role in the local sound-driven suppression observed in tactile-selective neurons of S2. This winner-take-all circuit (Fig. 10a) could dynamically select a stimulus modality at each moment and, under the right conditions, would be consistent with divisive normalization, a model that has been proposed as a driving force behind multisensory interactions 53,54,55.
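As a rough illustration of how such a divisive-normalization scheme could produce the frequency-dependent suppression described above, the sketch below uses invented drive values for a model S2 tactile neuron whose touch-driven response is divided by the pooled touch-plus-sound drive; this is a generic normalization model, not the circuit identified in the study.

```python
# Minimal sketch of divisive normalization applied to sound-touch interaction.
# All drive values, the sound weight, and the semi-saturation constant are
# invented for illustration.
def normalized_response(drive_touch, drive_sound, w_sound=1.0, sigma=1.0):
    """Touch-driven response divided by the pooled (touch + weighted sound) drive."""
    return drive_touch / (sigma + drive_touch + w_sound * drive_sound)

for label, drive in [("low-frequency touch (weak drive)", 1.0),
                     ("high-frequency touch (strong drive)", 4.0)]:
    alone = normalized_response(drive, drive_sound=0.0)
    with_sound = normalized_response(drive, drive_sound=2.0)
    suppression = 100 * (1 - with_sound / alone)
    print(f"{label}: alone={alone:.2f}, with sound={with_sound:.2f}, suppression={suppression:.0f}%")
# The same sound drive is a larger fraction of the denominator when the tactile
# drive is weak, so the weak (low-frequency) tactile response is suppressed
# proportionally more than the strong (high-frequency) one.
```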


Organization of receptive field properties

There is a serial and hierarchical organization of receptive field properties. Each sensory modality is composed of multiple brain areas. As one proceeds from receptor to thalamus to the primary sensory cortex and higher cognitive areas of the brain, receptive fields demonstrate increasingly complex stimulus requirements. For example, in the auditory system, peripheral neurons may respond well to pure tones, whereas some central neurons respond better to frequency-modulated sounds. In the primary visual and somatosensory cortex, receptive fields are selective for the orientation or direction of motion of a stimulus, whereas in higher visual cortical areas, neurons may respond best to images of faces or objects.

In the visual and somatosensory systems, receptive fields can be essentially circular or oval regions of retina or skin. In the thalamus, visual and somatosensory receptive fields are also circular but exhibit centre-surround antagonism, in which the onset of a stimulus in one skin or retinal region elicits activating responses while the same stimulus in surrounding regions elicits inhibitory responses. Thus, the same stimulus produces opposite responses in those regions. The effects of stimulus antagonism at different locations are a manifestation of the phenomenon called lateral inhibition. With lateral inhibition the optimal stimulus is not spatially uniform across the receptive field; rather, it is a discrete spot of light (in the case of the eye) or contact (in the case of a body surface), with contrast between central and surrounding regions.
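A minimal numerical sketch of such a centre-surround field, modelled as a difference of Gaussians with arbitrary parameter values, makes the effect of lateral inhibition concrete: a small spot confined to the centre drives the model cell strongly, while uniform stimulation of centre and surround largely cancels.

```python
# Difference-of-Gaussians (DoG) centre-surround receptive field, 1-D slice.
# Parameter values are arbitrary choices for the demonstration.
import numpy as np

x = np.linspace(-3, 3, 601)                       # position across the receptive field
centre   = np.exp(-x**2 / (2 * 0.3**2))           # narrow excitatory centre
surround = 0.4 * np.exp(-x**2 / (2 * 1.0**2))     # broad inhibitory surround
rf = centre - surround                            # DoG sensitivity profile

spot    = (np.abs(x) < 0.3).astype(float)         # small spot covering only the centre
uniform = np.ones_like(x)                         # spatially uniform stimulation

dx = x[1] - x[0]
print(f"response to centred spot:  {np.sum(rf * spot) * dx:+.2f}")
print(f"response to uniform field: {np.sum(rf * uniform) * dx:+.2f}")
# The centred spot gives a clearly positive response, while the uniform
# stimulus is largely cancelled by the inhibitory surround -- the signature of
# lateral inhibition described above.
```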

Referring as it does to a region, a receptive field is fundamentally a spatial entity (a portion of the visual field or retina, or a portion of the body surface), so the concept makes the most sense in the visual and somatosensory systems. In the auditory system, hair cells tuned to particular frequencies are located at different positions along the basilar membrane, implying a spatial relevance for auditory receptive fields; alternatively, one could define an auditory cell’s receptive field as the specific set of frequencies to which the cell responds. In the nervous system generally, the receptive field of a sensory neuron is defined by its synaptic inputs: each cell’s receptive field results from the combination of the fields of all of the neurons providing input to it. Because inputs are not simply summed, the receptive field properties of a neuron commonly are described in terms of the stimuli that elicit responses from the cell.


Integration of Signals from Mechanoreceptors

The configuration of the different types of receptors working in concert in human skin results in a very refined sense of touch. The nociceptive receptors—those that detect pain—are located near the surface. Small, finely calibrated mechanoreceptors—Merkel’s disks and Meissner’s corpuscles—are located in the upper layers and can precisely localize even gentle touch. The large mechanoreceptors—Pacinian corpuscles and Ruffini endings—are located in the lower layers and respond to deeper touch. (Consider that the deep pressure that reaches those deeper receptors would not need to be finely localized.) Both the upper and lower layers of the skin hold rapidly and slowly adapting receptors. Both primary somatosensory cortex and secondary cortical areas are responsible for processing the complex picture of stimuli transmitted from the interplay of mechanoreceptors.


Stochastic Resonance

Stochastic resonance was first discovered in a study of the periodic recurrence of Earth's ice ages. [2] [3] The theory developed out of an effort to understand how the earth's climate oscillates periodically between two relatively stable global temperature states, one "normal" and the other an "ice age" state. The conventional explanation was that variations in the eccentricity of earth's orbital path occurred with a period of about 100,000 years and caused the average temperature to shift dramatically. The measured variation in the eccentricity had a relatively small amplitude compared to the dramatic temperature change, however, and stochastic resonance was developed to show that the temperature change due to the weak eccentricity oscillation and added stochastic variation due to the unpredictable energy output of the sun (known as the solar constant) could cause the temperature to move in a nonlinear fashion between two stable dynamic states.

As an example of stochastic resonance, consider the following demonstration after Simonotto et al. [4]

The image to the left shows an original picture of the Arc de Triomphe in Paris. If this image is passed through a nonlinear threshold filter in which each pixel detects light intensity as above or below a given threshold, a representation of the image is obtained as in the images to the right. It can be hard to discern the objects in the filtered image because of the reduced amount of information present. The addition of noise before the threshold operation can result in a more recognizable output. The image below shows four versions of the image after the threshold operation with different levels of noise variance; the image in the top right-hand corner appears to have the optimal level of noise, allowing the Arc to be recognized, but other noise variances reveal different features.

The quality of the image resulting from stochastic resonance can be improved further by blurring, or subjecting the image to low-pass spatial filtering. This can be approximated in the visual system by squinting one's eyes or moving away from the image. This allows the observer's visual system to average the pixel intensities over areas, which is in effect a low-pass filter. The resonance breaks up the harmonic distortion due to the threshold operation by spreading the distortion across the spectrum, and the low-pass filter eliminates much of the noise that has been pushed into higher spatial frequencies.

A similar output could be achieved by examining multiple threshold levels, so in a sense the addition of noise creates a new effective threshold for the measurement device.
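The same logic can be reproduced numerically in one dimension, with a sub-threshold sine wave standing in for the photograph's pixel intensities (all values below are illustrative): with too little noise the signal never crosses the detector's threshold, with too much noise the output is dominated by noise, and an intermediate level recovers the signal best.

```python
# 1-D analogue of the noisy-threshold image demonstration above.
import numpy as np

rng = np.random.default_rng(1)
t = np.linspace(0, 1, 5000)
signal = 0.4 * np.sin(2 * np.pi * 5 * t)   # peak amplitude 0.4, below the detector threshold
threshold = 0.5

for noise_sd in (0.05, 0.3, 10.0):
    detected = (signal + noise_sd * rng.normal(size=t.size) > threshold).astype(float)
    corr = np.corrcoef(signal, detected)[0, 1]
    print(f"noise sd = {noise_sd:5.2f} -> correlation of thresholded output with signal = {corr:5.2f}")
# The correlation peaks at the intermediate noise level, just as an intermediate
# pixel-noise variance makes the Arc most recognizable; averaging several noisy
# thresholded "frames" (the analogue of the blurring described above) improves
# the reconstruction further.
```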

Cuticular mechanoreceptors in crayfish

Evidence for stochastic resonance in a sensory system was first found in nerve signals from the mechanoreceptors located on the tail fan of the crayfish (Procambarus clarkii). [5] An appendage from the tail fan was mechanically stimulated to trigger the cuticular hairs that the crayfish uses to detect pressure waves in water. The stimulus consisted of sinusoidal motion at 55.2 Hz with random Gaussian noise at varying levels of average intensity. Spikes along the nerve root of the terminal abdominal ganglion were recorded extracellularly for 11 cells and analyzed to determine the SNR.

Two separate measurements were used to estimate the signal-to-noise ratio of the neural response. The first was based on the Fourier power spectrum of the spike time series response. The power spectra from the averaged spike data for three different noise intensities all showed a clear peak at the 55.2 Hz component with different average levels of broadband noise. The relatively low- and mid-level added noise conditions also show a second harmonic component at about 110 Hz. The mid-level noise condition clearly shows a stronger component at the signal of interest than either low- or high-level noise, and the harmonic component is greatly reduced at mid-level noise and not present in the high-level noise. A standard measure of the SNR as a function of noise variance shows a clear peak at the mid-level noise condition. The other measure used for SNR was based on the inter-spike interval histogram instead of the power spectrum. A similar peak was found on a plot of SNR as a function of noise variance for mid-level noise, although it was slightly different from that found using the power spectrum measurement.
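A sketch of that power-spectrum SNR measure is shown below, applied to a crude threshold-crossing stand-in for the mechanoreceptor rather than to the published data; sweeping the added noise reproduces the characteristic peak in SNR at an intermediate noise level. All parameters are invented for illustration.

```python
# Spectral SNR of a simulated "spike train" from a threshold unit driven by a
# weak 55.2 Hz sinusoid plus Gaussian noise (illustrative stand-in only).
import numpy as np

rng = np.random.default_rng(2)
fs, dur, f_sig = 1000.0, 60.0, 55.2              # sample rate (Hz), duration (s), stimulus frequency
t = np.arange(0, dur, 1 / fs)
stim = 0.6 * np.sin(2 * np.pi * f_sig * t)       # sub-threshold stimulus (threshold = 1)

def spectrum_snr(noise_sd):
    # Binary "spike train": 1 in every 1 ms bin where stimulus + noise crosses threshold.
    spikes = (stim + noise_sd * rng.normal(size=t.size) > 1.0).astype(float)
    if spikes.sum() == 0:                        # too little noise: no spikes at all
        return 0.0
    psd = np.abs(np.fft.rfft(spikes - spikes.mean())) ** 2
    freqs = np.fft.rfftfreq(t.size, 1 / fs)
    k = np.argmin(np.abs(freqs - f_sig))         # bin containing the 55.2 Hz component
    background = np.median(psd[max(k - 50, 1):k + 50])   # local broadband noise floor
    return psd[k] / background

for sd in (0.1, 0.3, 0.7, 3.0):
    print(f"noise sd = {sd:3.1f} -> spectral SNR at 55.2 Hz ~ {spectrum_snr(sd):8.1f}")
# The SNR is tiny when noise is too weak for the sub-threshold stimulus to evoke
# spikes, peaks at an intermediate noise level, and falls again as noise
# dominates -- the signature reported for the crayfish data.
```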

These data support the claim that noise can enhance detection at the single neuron level, but they are not enough to establish that noise helps the crayfish detect weak signals in a natural setting. Experiments performed after this at a slightly higher level of analysis establish behavioral effects of stochastic resonance in other organisms; these are described below.

Cercal mechanoreceptors in crickets

A similar experiment was performed on the cricket (Acheta domestica), an arthropod like the crayfish. [6] The cercal system in the cricket senses the displacement of particles due to air currents utilizing filiform hairs covering the cerci, the two antenna-like appendages extending from the posterior section of the abdomen. Sensory interneurons in terminal abdominal ganglion carry information about intensity and direction of pressure perturbations. Crickets were presented with signal plus noise stimuli and the spikes from cercal interneurons due to this input were recorded.

Two types of measurements of stochastic resonance were conducted. The first, like the crayfish experiment, consisted of a pure tone pressure signal at 23 Hz in a broadband noise background of varying intensities. A power spectrum analysis of the signals yielded maximum SNR for a noise intensity equal to 25 times the signal stimulus, resulting in a maximum increase of 600% in SNR. Fourteen cells in 12 animals were tested, and all showed an increased SNR at a particular level of noise, meeting the requirements for the occurrence of stochastic resonance.

The other measurement consisted of the rate of mutual information transfer between the nerve signal and a broadband stimulus combined with varying levels of broadband noise uncorrelated with the signal. The power spectrum SNR could not be calculated in the same manner as before because there were signal and noise components present at the same frequencies. Mutual information measures the degree to which one signal predicts another: independent signals carry no mutual information, while perfectly identical signals carry maximal mutual information. For varying low amplitudes of signal, stochastic resonance peaks were found in plots of mutual information transfer rate as a function of input noise, with a maximum increase in information transfer rate of 150%. For stronger signal amplitudes that stimulated the interneurons in the absence of noise, however, the addition of noise always decreased the mutual information transfer, demonstrating that stochastic resonance only works in the presence of low-intensity signals. The information carried in each spike at different levels of input noise was also calculated. At the optimum level of noise, the cells were more likely to spike, resulting in spikes with more information and more precise temporal coherence with the stimulus.
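The sketch below shows one standard way to obtain such a mutual-information estimate for a simple threshold-style model unit: discretize the stimulus, tabulate the joint histogram of stimulus bin and spike/no-spike, and apply the plug-in formula. The stimulus, threshold, and bin count are stand-ins chosen for illustration, not the quantities used in the cricket experiment.

```python
# Plug-in estimate of I(stimulus; spike) in bits for a noisy threshold unit.
import numpy as np

rng = np.random.default_rng(3)
n = 200_000
stimulus = rng.uniform(-1.0, 1.0, size=n)        # broadband, strictly sub-threshold stimulus

def mutual_information(noise_sd, n_bins=16, threshold=1.5):
    spikes = (stimulus + noise_sd * rng.normal(size=n) > threshold).astype(int)
    edges = np.quantile(stimulus, np.linspace(0, 1, n_bins + 1)[1:-1])
    s_bins = np.digitize(stimulus, edges)        # stimulus discretized into equal-occupancy bins
    joint = np.zeros((n_bins, 2))
    np.add.at(joint, (s_bins, spikes), 1.0)      # joint histogram of (stimulus bin, spike)
    joint /= joint.sum()
    p_s = joint.sum(axis=1, keepdims=True)
    p_r = joint.sum(axis=0, keepdims=True)
    with np.errstate(divide="ignore", invalid="ignore"):
        terms = joint * np.log2(joint / (p_s * p_r))
    return float(np.nansum(terms))               # zero-probability cells contribute nothing

for sd in (0.1, 0.5, 1.0, 3.0):
    print(f"noise sd = {sd:3.1f} -> I(stimulus; spike) ~ {mutual_information(sd):.3f} bits")
# With almost no noise the unit never fires and carries no information; at an
# intermediate noise level the spikes are informative about the stimulus; with
# very strong noise they become nearly independent of it again.
```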

Stochastic resonance is a possible cause of escape behavior in crickets to attacks from predators that cause pressure waves in the tested frequency range at very low amplitudes, like the wasp Liris niger. Similar effects have also been noted in cockroaches. [6]

Cutaneous mechanoreceptors in rats

Another investigation of stochastic resonance in broadband (or, equivalently, aperiodic) signals was conducted by probing cutaneous mechanoreceptors in the rat. [7] A patch of skin from the thigh and its corresponding section of the saphenous nerve were removed and mounted on a test stand immersed in interstitial fluid. Slowly adapting type 1 (SA1) mechanoreceptors output signals in response to mechanical vibrations below 500 Hz.

The skin was mechanically stimulated with a broadband pressure signal with varying amounts of broadband noise using the up-and-down motion of a cylindrical probe. The intensity of the pressure signal was tested without noise and then set at a near sub-threshold intensity that would evoke 10 action potentials over a 60-second stimulation time. Several trials were then conducted with noise of increasing amplitude variance. Extracellular recordings were made of the mechanoreceptor response from the extracted nerve.

The encoding of the pressure stimulus in the neural signal was measured by the coherence of the stimulus and response. The coherence was found to be maximized by a particular level of input Gaussian noise, consistent with the occurrence of stochastic resonance.
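The sketch below illustrates that coherence measure using scipy.signal.coherence on a simulated threshold unit driven by a band-limited stimulus; the stimulus, threshold, and noise levels are invented for illustration, not taken from the study. The mean stimulus-response coherence should be highest at an intermediate noise level, consistent with aperiodic stochastic resonance.

```python
# Magnitude-squared coherence between a broadband sub-threshold stimulus and the
# binary response of a noisy threshold unit (illustrative stand-in for the SA1 data).
import numpy as np
from scipy.signal import butter, coherence, filtfilt

rng = np.random.default_rng(4)
fs, n = 2000.0, 120_000                           # 60 s sampled at 2 kHz
b, a = butter(4, 500 / (fs / 2))                  # stimulus band-limited below 500 Hz
stim = filtfilt(b, a, rng.normal(size=n))
stim *= 0.3 / stim.std()                          # keep the stimulus well below threshold

for noise_sd in (0.05, 0.5, 3.0):
    response = (stim + noise_sd * rng.normal(size=n) > 1.0).astype(float)
    f, cxy = coherence(stim, response, fs=fs, nperseg=1024)
    band = (f > 5) & (f < 500)
    print(f"noise sd = {noise_sd:4.2f} -> mean stimulus-response coherence (5-500 Hz) = {cxy[band].mean():.3f}")
# Coherence is low when the noise is too weak for the stimulus to be encoded at
# all, highest at an intermediate noise level, and low again when noise swamps
# the stimulus.
```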

Electroreceptors in paddlefish

The paddlefish (Polyodon spathula) hunts plankton using thousands of tiny passive electroreceptors located on its extended snout, or rostrum. The paddlefish is able to detect electric fields that oscillate at 0.5–20 Hz, and large groups of plankton generate this type of signal.

Due to the small magnitude of the generated fields, plankton are usually caught by the paddlefish when they are within 40 mm of the fish's rostrum. An experiment was performed to test the hunting ability of the paddlefish in environments with different levels of background noise. [8] It was found that the paddlefish had a wider distance range of successful strikes in an electrical background with a low level of noise than in the absence of noise. In other words, there was a peak noise level, implying effects of stochastic resonance.

In the absence of noise, the distribution of successful strikes has greater variance in the horizontal direction than in the vertical direction. With the optimal level of noise, the variance in the vertical direction increased relative to the horizontal direction and also shifted to a peak slightly below center, although the horizontal variance did not increase.

Another measure of the increase in accuracy due to the optimal noise background is the number of plankton captured per unit time. For four paddlefish tested, two showed no increase in capture rate, while the other two showed a 50% increase in capture rate.

Separate observations of the paddlefish hunting in the wild have provided evidence that the background noise generated by plankton increases the paddlefish's hunting abilities. Each individual organism generates a particular electrical signal; together, these individual signals cause massed groups of plankton to emit what amounts to a noisy background signal. It has been found that the paddlefish does not respond to the noise alone, without signals from nearby individual organisms; rather, it uses the strong individual signals of nearby plankton to acquire specific targets, while the background electrical noise provides a cue to their presence. For these reasons, it is likely that the paddlefish takes advantage of stochastic resonance to improve its sensitivity to prey.

Individual model neurons

Stochastic resonance was demonstrated in a high-level mathematical model of a single neuron using a dynamical systems approach. [9] The model neuron was composed of a bi-stable potential energy function treated as a dynamical system that was set up to fire spikes in response to a pure tonal input with broadband noise, and the SNR was calculated from the power spectrum of the potential energy function, which loosely corresponds to an actual neuron's spike-rate output. The characteristic peak on a plot of the SNR as a function of noise variance was apparent, demonstrating the occurrence of stochastic resonance.
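A minimal sketch of such a bistable model is given below: an overdamped state variable in the double-well potential V(x) = x^4/4 - x^2/2, driven by a weak sinusoid plus noise and integrated with the Euler-Maruyama method. The parameters are illustrative rather than those of the cited model, and the two-state (which-well) readout is used as a loose stand-in for the neuron's firing output.

```python
# Double-well ("bistable") model driven by a weak periodic signal plus noise.
import numpy as np

rng = np.random.default_rng(5)
dt, n_steps = 0.005, 200_000                     # 1000 s of simulated time
f_sig, amp = 0.01, 0.25                          # weak forcing: amplitude below the escape threshold
t = np.arange(n_steps) * dt
forcing = amp * np.sin(2 * np.pi * f_sig * t)

def spectral_snr(noise_sd):
    """SNR at the forcing frequency, read out from which well the state occupies."""
    x = np.empty(n_steps)
    x[0] = -1.0                                  # start in the left well
    kicks = noise_sd * np.sqrt(dt) * rng.normal(size=n_steps)
    for i in range(1, n_steps):
        drift = x[i-1] - x[i-1]**3 + forcing[i-1]   # -dV/dx for V(x) = x^4/4 - x^2/2, plus forcing
        x[i] = x[i-1] + drift * dt + kicks[i]
    state = np.sign(x)                           # two-state readout: which well the system is in
    if np.all(state == state[0]):                # no well-hopping at all
        return 0.0
    psd = np.abs(np.fft.rfft(state - state.mean())) ** 2
    freqs = np.fft.rfftfreq(n_steps, dt)
    k = np.argmin(np.abs(freqs - f_sig))
    background = np.median(psd[max(k - 40, 1):k + 40])
    return psd[k] / background

for sd in (0.15, 0.45, 1.5):
    print(f"noise sd = {sd:4.2f} -> SNR at the forcing frequency ~ {spectral_snr(sd):10.1f}")
# Too little noise: the state stays trapped in one well and the forcing barely
# registers. Intermediate noise: hops between wells synchronize with the forcing
# and produce a large spectral peak. Too much noise: hopping becomes random and
# the peak shrinks again -- the characteristic stochastic-resonance curve.
```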

Inverse stochastic resonance

Another phenomenon closely related to stochastic resonance is inverse stochastic resonance. It occurs in bistable dynamical systems that have both limit cycle and stable fixed point solutions. In this case, noise of a particular variance can efficiently inhibit spiking activity by moving the trajectory to the stable fixed point. It was initially found in single-neuron models, including the classical Hodgkin-Huxley system. [10] [11] Inverse stochastic resonance has since been confirmed in cerebellar Purkinje cells, [12] where it could play a role in generating pauses in spiking activity in vivo.

Multi-unit systems of model neurons

An aspect of stochastic resonance that is not entirely understood has to do with the relative magnitude of stimuli and the threshold for triggering the sensory neurons that measure them. If the stimuli are generally of a certain magnitude, it seems that it would be more evolutionarily advantageous for the threshold of the neuron to match that of the stimuli. In systems with noise, however, tuning thresholds for taking advantage of stochastic resonance may be the best strategy.

A theoretical account of how a large model network (up to 1000) of summed FitzHugh–Nagumo neurons could adjust the threshold of the system based on the noise level present in the environment was devised. [13] [14] This can be equivalently conceived of as the system lowering its threshold, and this is accomplished such that the ability to detect suprathreshold signals is not degraded.

Stochastic resonance in large-scale physiological systems of neurons (above the single-neuron level but below the behavioral level) has not yet been investigated experimentally.

Psychophysical experiments testing the thresholds of sensory systems have also been performed in humans across sensory modalities and have yielded evidence that our systems make use of stochastic resonance as well.

Vision

The above demonstration using the Arc de Triomphe photo is a simplified version of an earlier experiment. A photo of a clocktower was made into a video by adding noise with a particular variance a number of times to create successive frames. This was done for different levels of noise variance, and an optimal level was found for discerning the appearance of the clocktower. [4] Similar experiments also demonstrated an increased level of contrast sensitivity to sine wave gratings. [4]

Tactility

Human subjects who undergo mechanical stimulation of a fingertip are able to detect a subthreshold impulse signal in the presence of a noisy mechanical vibration. The percentage of correct detections of the presence of the signal was maximized for a particular value of noise. [15]

Audition

The auditory intensity detection thresholds of a number of human subjects were tested in the presence of noise. [16] The subjects included four people with normal hearing, two with cochlear implants, and one with an auditory brainstem implant.

The normal subjects were presented with two sound samples, one with a pure tone plus white noise and one with just white noise, and asked which one contained the pure tone. The level of noise which optimized the detection threshold in all four subjects was found to be between -15 and -20 dB relative to the pure tone, showing evidence for stochastic resonance in normal human hearing.

A similar test in the subjects with cochlear implants only found improved detection thresholds for pure tones below 300 Hz, while improvements were found at frequencies greater than 60 Hz in the brainstem implant subject. The reason for the limited range of resonance effects is unknown. Additionally, the addition of noise to cochlear implant signals improved the threshold for frequency discrimination. The authors recommend that some type of white noise addition to cochlear implant signals could well improve the utility of such devices.


HOW THE SEROTONIN SYSTEM INFLUENCES SENSORY PROCESSING [MURTHY LAB]

Brains process external information rapidly at a sub-second time scale, which is set by the dynamic electrophysiological properties of neurons and the fast communication within neuronal populations. This fast neural processing is complemented by the so-called neuromodulatory systems (involving certain classes of neurotransmitters, such as dopamine and serotonin). Neuromodulation has generally been thought to occur at slower time scales, for example during periods of alertness following some salient event or over circadian periods for sleep-wake modulation. The fast and slow systems working together allow the brain to not only react rapidly to external stimuli, but also assign context or meaning to these stimuli. In our recent study, appearing in Nature Neuroscience, we set out to understand how a particular neuromodulatory system involving serotonin influences information processing in a sensory system. What we found was unexpected and exciting.

Serotonin is a chemical that has been linked to high level cognitive features such as depression, aggression and mood. Although we are far from understanding the neuronal architecture underlying any of these effects, it is generally theorized that the serotonin system affects neural processing by slowly altering the properties of circuit elements, the neurons and synapses. In mammals, serotonin is secreted by neurons located in the raphe nuclei, which send their axons widely throughout the brain, including very dense projections to the early stages of olfactory system. This fact, combined with the importance of olfaction for mice, prompted us to examine the involvement of the serotonin system in odor processing.
Our experiments were enabled by the explosive advances in neuroscience techniques, including optogenetics (which allowed us to selectively activate specific neurons and axons with light) and optical reporters of activity (genetically-encoded calcium indicators that transduce neural activity to light). We used multiphoton microscopy to look at the activity of two different populations of output neurons (mitral and tufted cells) in the olfactory bulb, the first odor processing stage in vertebrates. To our surprise, we found that even brief activation of raphe neurons caused immediate excitation of mitral and tufted cells. An even greater surprise was in store when we complemented our whole-animal experiments with mechanistic studies in ex vivo brain slices: in addition to releasing serotonin, raphe neurons also released a fast excitatory neurotransmitter, glutamate. In fact, glutamate mediates much of the excitation of mitral and tufted cells in our experiments, with serotonin release likely requiring more intense activity in raphe neurons.
Sensory systems are not only required to detect external stimuli (odors in the case of the olfactory system), but they also need to make distinctions between different stimuli. We asked how the activation of raphe neurons modulates these functions of the olfactory system. Uncharacteristically for a neuromodulatory system, qualitatively distinct effects were seen in the two types of olfactory bulb neurons: activating raphe neurons enhanced the response of tufted cells to odors, but bi-directionally modulated the odor response of mitral cells. A quantitative analysis of the population coding of odors revealed that raphe activation makes tufted cells more sensitive to detecting odors and the mitral cells better at discriminating different odors.
Overall, our study indicates that a “neuromodulatory” system, traditionally considered to have slow actions, can actually be part of fast ongoing information processing by releasing multiple types of neurotransmitters. Further, these modulatory neurons need not have monolithic effects, but can influence different channels of information processing in distinct ways. Conceptually, our study blurs the distinction between neuromodulation and computation itself. This research was supported by grants from the NIH, a seed grant from the Harvard Brain Initiative, and fellowships from the NSF and the Sackler Foundation.

Read more in Nature Neuroscience
Read more in News and Views


Reflex Arcs

Reflex arcs are an interesting phenomenon for considering how the PNS and CNS work together. Reflexes are quick, unconscious movements, like automatically removing a hand from a hot object. Reflexes are so fast because they involve local synaptic connections in the spinal cord, rather than relay of information to the brain. For example, the knee reflex that a doctor tests during a routine physical is controlled by a single synapse between a sensory neuron and a motor neuron. While a reflex may only require the involvement of one or two synapses, synapses with interneurons in the spinal cord transmit information to the brain to convey what happened after the event is already over (the knee jerked, or the hand was hot). This means that the brain is not involved at all in the movement associated with the reflex, but it is certainly involved in learning from the experience – most people only have to touch a hot stove once to learn that they should never do it again!

The simplest neuronal circuits are those that underlie muscle stretch responses, such as the knee-jerk reflex that occurs when someone hits the tendon below your knee (the patellar tendon) with a hammer. Tapping on that tendon stretches the quadriceps muscle of the thigh, stimulating the sensory neurons that innervate it to fire. Axons from these sensory neurons extend to the spinal cord, where they connect to the motor neurons that establish connections with (innervate) the quadriceps. The sensory neurons send an excitatory signal to the motor neurons, causing them to fire too. The motor neurons, in turn, stimulate the quadriceps to contract, straightening the knee. In the knee-jerk reflex, the sensory neurons from a particular muscle connect directly to the motor neurons that innervate that same muscle, causing it to contract after it has been stretched.

Image credit: https://www.khanacademy.org/science/biology/ap-biology/human-biology/neuron-nervous-system/a/overview-of-neuron-structure-and-function, modified from “Patellar tendon reflex arc,” by Amiya Sarkar (CC BY-SA 4.0). The modified image is licensed under a CC BY-SA 4.0 license.


