Open access peer-reviewed chapter

Brownian Computing and Its Similarities with Our Brains

Written By

Alexandra Pinto

Submitted: 05 April 2024 Reviewed: 22 April 2024 Published: 31 July 2024

DOI: 10.5772/intechopen.1005437

From the Edited Volume

Stochastic Processes - Theoretical Advances and Applications in Complex Systems

Don Kulasiri


Abstract

With the current need for the miniaturization of electronic circuits and transistors, there is a critical problem of fluctuations that create signal interference for the outputs of those circuits. One solution is to use those fluctuations and stochastic manipulation to the benefit of computing architectures. Such a solution is defined as Brownian computing, and it is possible to apply it with Brownian circuits. At those physical limits, nonlinear dynamics dominate, and synaptic modeling is critical. Synapses are the target of most, if not all, brain disorders. Because of the enormous number of synapses in our brain (~ 100,000,000,000), it is often difficult to comprehend how defects in one type of synapse yield the overall loss in brain performance characteristic of all brain disorders. Computational modeling of synapses has been a slow process that is constrained by advances in the interdisciplinary fusion of physics, both for measuring and data analysis, and advances in molecular biology or biophysics. These constraints are also computational: some models have detailed sets of coupled equations with experimental parameters that, although accurate, are impossible to solve because of the enormous number of synapses. This limitation impedes understanding of brain information processing, both in health and in disease states.

Keywords

  • Brownian computing
  • beyond von Neumann computing
  • synapses
  • memristors
  • non-linear dynamics

1. Introduction

With the current need for miniaturization of electronic circuits and transistors, there is a critical problem of fluctuations that create signal interference for the outputs of those circuits. One solution is to use those fluctuations and stochastic manipulation to the benefit of computing architectures. Such a solution is defined as Brownian computing, and it is possible to apply it with Brownian circuits [1]. At those physical limits, nonlinear dynamics dominate.

In this chapter, we are going to focus on one of the similarities between our brain and a type of brain-inspired computing approach known as reservoir computing (RC) [2], which exploits the nonlinear dynamics of an echo state to map a complex problem onto a much simpler linear problem, a well-known concept in computer science: problem reduction. Our brains constantly transform our inputs into a simpler representation through nonlinear dynamics, to obtain perception and action in a synchronous way. In the next section, I will introduce a source of fluctuations in our brains analogous to that arising from the miniaturization of transistors. We will focus on synaptic modeling, since synapses are the main source of nonlinearity and fluctuation in the brain, and they will play the role of the echo-state reservoir in analogy with reservoir computing. Tuning synaptic weights during learning is equivalent to biasing the operating point with the corresponding slope of the flux-versus-charge dynamics, where the synaptic strength is equivalent to the memristance [3].
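To make the reservoir-computing analogy concrete, the following minimal sketch (not taken from the cited works; all sizes, scalings, and the toy task are illustrative assumptions) shows how a fixed, random nonlinear reservoir reduces a temporal problem to a linear regression on its echo states:

```python
# Minimal echo state network (ESN): a fixed random reservoir of nonlinear
# units turns a temporal problem into a linear regression on the
# reservoir states. All sizes and scalings here are illustrative.
import numpy as np

rng = np.random.default_rng(0)
n_in, n_res, T = 1, 200, 1000

# Fixed random input and reservoir weights; spectral radius < 1 helps
# give the "echo state" (fading-memory) property.
W_in = rng.uniform(-0.5, 0.5, (n_res, n_in))
W = rng.uniform(-0.5, 0.5, (n_res, n_res))
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))

# Toy task: predict a nonlinear function of a delayed input.
u = rng.uniform(-1, 1, (T, n_in))
y_target = np.sin(3 * np.roll(u[:, 0], 2))

# Drive the reservoir; tanh supplies the nonlinear dynamics.
x = np.zeros(n_res)
states = np.empty((T, n_res))
for t in range(T):
    x = np.tanh(W_in @ u[t] + W @ x)
    states[t] = x

# Linear readout (ridge regression) is the only trained part:
# the nonlinear problem has been reduced to a linear one.
washout = 100
S, y = states[washout:], y_target[washout:]
W_out = np.linalg.solve(S.T @ S + 1e-6 * np.eye(n_res), S.T @ y)
print("train MSE:", np.mean((S @ W_out - y) ** 2))
```

Only the readout W_out is trained; the fixed nonlinear dynamics perform the problem reduction described above.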


2. Computation with fluctuations in the brain: a quantized approach

Networks of neurons have the task of converting a digital n-dimensional input space, comprising each signal detected by each neuron, into an analog four-dimensional space: our output space, the environment we perceive dynamically as a space-time continuum. This transformation from input to output must be created both in a continuous manner, according to classical mechanics, and conditioned on the Planck constant, the minimum action according to quantum theory, such that our perception of time and space can be smooth and efficient, minimizing errors in perception.

The brain uses these mechanisms for the projection of what we perceive as reality. This can only happen by computing with noise, or spontaneous fluctuations of periodic origin, in chemical synapses [4]. This is analogous to Brownian circuits, to the synaptic modeling with memristor mechanisms presented below, and to what could be seen as a mesoscopic-scale phase-transition prediction obtained by considering long-range correlations through a Bose-Einstein condensate. This can only be done with memristors, since the flux-to-charge relation captures the mesoscopic and quantum effects needed.

As a side effect, the model yields the Fokker-Planck equation as a result of dynamical considerations. The scales and frequencies involved in the study of Brownian circuit fluctuations, lasers, and the synaptic cleft lie at the limit between classical and quantum effects. This is why we can understand this model as a memristor generating synaptic self-sustained oscillations, and it is relevant to explore its connection with fluctuations of quantum origin.
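As an aside on the Fokker-Planck connection, here is a minimal numerical sketch (an illustrative double-well Langevin system, not the chapter's synaptic model): the ensemble density of the simulated fluctuations converges to the stationary solution of the corresponding Fokker-Planck equation:

```python
# Overdamped Langevin simulation whose ensemble density obeys the
# Fokker-Planck equation  dp/dt = d/dx[V'(x) p] + D d2p/dx2.
# The double-well potential V(x) = x^4/4 - x^2/2 and the noise
# strength D are illustrative choices.
import numpy as np

rng = np.random.default_rng(1)
D, dt, n_steps, n_particles = 0.5, 1e-3, 20_000, 5_000

def V_prime(x):
    return x**3 - x          # derivative of the double-well potential

x = rng.normal(0.0, 0.1, n_particles)
for _ in range(n_steps):     # Euler-Maruyama integration
    x += -V_prime(x) * dt + np.sqrt(2 * D * dt) * rng.standard_normal(n_particles)

# Compare the empirical density with the stationary Fokker-Planck
# solution p(x) proportional to exp(-V(x)/D).
hist, edges = np.histogram(x, bins=50, density=True)
centers = 0.5 * (edges[:-1] + edges[1:])
p_theory = np.exp(-(centers**4 / 4 - centers**2 / 2) / D)
p_theory /= p_theory.sum() * (centers[1] - centers[0])
print("max density error:", np.max(np.abs(hist - p_theory)))
```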

The aforementioned proposition implies that synaptic behavior cannot be captured by the Hebbian rule alone; it has to be added as oscillations accounting for the quantum effects created by the dispersion pattern of interference. As a consequence of the quantum correction for interference in neuronal activity, we can understand inter-neural communication through the particle-wave duality principle applied to neurons.

This proposition is necessary because input information in the nervous system takes the form of an analog continuous wave that guarantees the flow and appropriate matching of information. There is a change in the biophysical and chemical substrate from the hydrophobic composition of the pre-synapse and its electromagnetic content to the more open, hydrated space of the synaptic cleft, and this change affects the biophysical and electromagnetic characteristics of the post-synapse.

The nature of the information along the transmission must carry a transformation from the wave state of the pre-synapse to the probabilistic particle state of the post-synapse. This transformation is carried out at the synaptic cleft, where, as a consequence of the diffracted interference pattern, the wave probability density is calculated in the quantum mechanical formalism. The most relevant characteristic of this model can be seen in learning and subsequently in memory, given that the oscillatory state at the synapse receives a recurrent update from the probabilistic action potential state, which depends on the postsynaptic effect. This behavior captures attenuation, which is the basis for learning.

This proposition has a large impact on memory and plasticity, given that only recurrent patterns construct a stable channel at the post-synapse. However, its impact is not necessarily local, given the quantum mechanical considerations; non-local quantum effects can be carried by long-range correlations in the postsynaptic probability pattern. For this special consideration, we need to go beyond quantum effects and add fluctuating effects.

Even though the origin of the effect is complex, it is possible to simplify the model to a deterministic point. This does not imply that decision-making is fully determined, but the neural activity, as a physical substrate, can be fully determined by energetic considerations. Energy quantization can be used in a deterministic fashion for prediction purposes.


3. Synapses, source of fluctuations in the brain

Fluctuations in the brain can be modeled as a wave-to-pulse-to-wave conversion in the neuron's trigger zones, associated with the chemical synapses occurring in the nervous system. The wave-to-pulse-to-wave conversion was studied in depth by Freeman at the mesoscopic scale [5]. The current experimental database and theoretical explanations on this subject say very little about the different processes that convert electrical into chemical and back into electrical energy via the synaptic cleft, where conservation of information and frequency must hold [4]. Interest in studying this problem arose from the knowledge that trains of action potentials certainly play a part in transmitting information through the nervous system, together with the fact that most of the time neurons are silent, without transmitting action potentials. Generally speaking, it is observed that 99 percent of the time, the electrical activity of neurons is in the subthreshold regime [6].

It is the combination of those low-amplitude fluctuating signals, without the energy to trigger an action potential but carrying the majority of the information, that, after broadcasting and at higher cortical levels of integration, generates the knowledge and meaning that allow us to: a) perceive the world subjectively, b) communicate via the use of language, c) retain valuable memories, and d) in general, perform activities that do not require the fast-response interneuron communication associated with electrical synapses.

The model proposed in this chapter treats the synapse as an oscillator. This oscillator undergoes constructive and destructive interference patterns of wave activity. The function of this wave pattern is to smooth the train of action potentials via the release of neurotransmitters from the presynaptic neuron onto the post-synaptic neuron. The frequency is preserved during the process of transforming electrical into chemical and back into electrical energy. In summary, in this theoretical proposal, synapses are described as a system composed of an input wave that is transformed quantum mechanically through interferometry.

A synapse is replaced by an oscillation that depends on the wavelength of the input signal. The collective synaptic interference pattern of waves determines the points of maximum amplitude of the synaptic wave density function, where the action potential is observed.

There is a relation between the pulse train arriving at the synaptic terminal and the release of neurotransmitters that manifests as a chemical wave-like pattern, which is finally converted, in the post-synaptic neuron, into a smoother electrical wave-like pattern; hence the term pulse-to-wave conversion. Note that this model also describes a chemical-wave to electrical-wave conversion, contrary to the current assumption in neuroscience. This new proposal of the synapse as a memristor [7] allows us to recover the density function of the postsynaptic neuron. The final electrical activity of the resting potentials, described as a wave pattern, will, when above threshold, generate a new action potential; hence the term wave-to-pulse conversion [8]. Memristors also capture the memory capabilities of synapses and the nonlinear, chaotic behavior of the action potential.
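A toy rendering of this wave-to-pulse-to-wave conversion may help fix ideas; everything here (frequencies, phases, threshold, smoothing kernel) is an illustrative assumption rather than a biophysical fit:

```python
# Toy wave-to-pulse-to-wave conversion: subthreshold synaptic waves
# superpose; where their sum crosses a threshold a pulse is emitted,
# and the pulse train is smoothed back into a wave postsynaptically.
import numpy as np

t = np.linspace(0, 1, 5000)                  # 1 s sampled at 5 kHz
freqs, phases = [7.0, 11.0, 13.0], [0.0, 1.2, 2.5]   # illustrative
waves = [0.4 * np.sin(2 * np.pi * f * t + p) for f, p in zip(freqs, phases)]
membrane = np.sum(waves, axis=0)             # constructive/destructive interference

threshold = 0.9                              # illustrative firing threshold
crossings = np.flatnonzero((membrane[1:] >= threshold) & (membrane[:-1] < threshold))
pulses = np.zeros_like(t)
pulses[crossings] = 1.0                      # wave -> pulse conversion

# Pulse -> wave: convolve the spike train with an exponential kernel,
# standing in for smoothing by transmitter release and diffusion.
kernel = np.exp(-np.arange(500) / 100.0)
postsynaptic = np.convolve(pulses, kernel)[: t.size]
print("spikes emitted:", crossings.size)
```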


4. Why are oscillations at synapses represented by memristors?

Memristors, as well as synapses, store information, and the continuous flow of information calls for a flux-charge measure. The slope of this relation is set by the variable amount of neurotransmitter in the synaptic cleft. This is why we model the synapse as an oscillator; besides, oscillations in nature are ubiquitous, both mechanical (movement of fluids) and electromagnetic (movement of ions).
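The flux-charge picture can be sketched numerically. Below is a minimal charge-controlled memristor in the sense of Chua [7], where the slope of the flux-charge curve plays the role of the synaptic strength; the coefficients and the drive are illustrative assumptions:

```python
# Charge-controlled memristor in the sense of Chua [7]: flux is a
# nonlinear function of charge, phi(q) = a*q + b*q**3, so the
# memristance M(q) = dphi/dq = a + 3*b*q**2 plays the role of the
# synaptic strength. Coefficients and drive are illustrative.
import numpy as np

a, b = 1.0, 0.5
t = np.linspace(0, 4 * np.pi, 4000)
dt = t[1] - t[0]

i = np.sin(t)                    # sinusoidal current drive
q = np.cumsum(i) * dt            # charge = time integral of current
M = a + 3 * b * q**2             # memristance = slope of the flux-charge curve
v = M * i                        # memristor law: v = M(q) * i

# The v-i curve is a pinched hysteresis loop: v = 0 whenever i = 0,
# but the slope (the stored "weight") depends on history through q.
print("memristance range:", M.min(), "to", M.max())
```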

Most of the time, neurons are not evoking action potentials, but they are not at the zero equilibrium point either: the rest state requires a continuous flow of information based mainly on fluctuations at synapses. The transformation from the synaptic fluctuating nonlinear state to the action potential state requires the addition and subtraction of memristors, forming an interference pattern that depends on the other synapses connected to the same neuron and that, as a result, forms the wave probability density.

The model of the synapse is the same for all of them (a biologically plausible oscillator), but each location plays a special part in the pattern formation (not always constructively). Learning at the synapse (the wave state) updates depending on the action potential (the particle state), as in the uncertainty principle, where certainty of position is constrained by certainty in momentum and vice versa.

There is constant oscillation in the interior of the membrane: the potential is never zero, and a non-stopping flow of information occurs even without action potential initiation. Assuming that this interference is also responsible for background noise generation, the result is a cyclic process that is sensitive to small perturbations. Evoking an action potential can then be seen as the synaptic wave state functions overlapping and locking their phases, creating a stronger state.

Given that spontaneous oscillations are an important assumption, and considering the most common scenario in the lifespan of a neuron, where it spends 99 percent of the time in the sub-threshold regime and many neurons die without ever producing a spike, we do not need a forcing term in the oscillator model, but rather a very small initial condition that moves the system away from the unstable origin. Neurons, in the absence of action potential stimulation, are in a latent, constant transmission of information in the form of small-amplitude waves that reside in the sub-threshold regime and that spontaneously emerge as a consequence of constructive interference. In this case, the resonant state is found for slight perturbations of the initial condition.
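A minimal sketch of such an unforced oscillator follows, using a van der Pol unit as an illustrative stand-in (the chapter does not prescribe this particular equation): a tiny initial condition near the unstable origin grows into a self-sustained oscillation without any forcing term:

```python
# Unforced self-sustained oscillator with an unstable origin: a van der
# Pol unit started from a tiny perturbation spirals out to a stable
# limit cycle without any forcing term. mu, dt, and the initial
# condition are illustrative.
import numpy as np

mu, dt, n_steps = 1.0, 1e-3, 50_000
x, y = 1e-6, 0.0                 # tiny perturbation of the unstable origin

trace = np.empty(n_steps)
for k in range(n_steps):         # explicit Euler, adequate for a sketch
    dx = y
    dy = mu * (1 - x**2) * y - x
    x, y = x + dx * dt, y + dy * dt
    trace[k] = x

# The amplitude grows from 1e-6 to the limit-cycle value (about 2).
print("final oscillation amplitude ~", round(np.max(np.abs(trace[-5000:])), 2))
```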

Modeling synapses has been a slow process, constrained by advances in the interdisciplinary fusion of physics, both for measurement and data analysis, and advances in molecular biology and biophysics [9]. Nonetheless, nowadays the constraints are also computational: some models have detailed sets of coupled equations with experimental parameters that are quite accurate, but solving them for each one of the roughly 5 × 10¹³ synapses is an impossible computational task with current resources. The main consequence of this computational limit is that there are different forms of synapse modeling, from detailed biophysical models that sacrifice the possibility of analyzing the behavior of several synapses, to reductionist models that treat the phenomenon probabilistically. Analyzing synapses from the wave-particle viewpoint is computationally efficient [10].


5. Electronic modeling of synapses

In this section, we consider the electronic modeling of synapses with memristors, adding the proposed synaptic interference for computing collective behavior as synchronous oscillators. The memristor acts as a synapse, with the energy supplied through a nonlinear resistor with a pumping effect, connected to an inductance as a magnetic-field connectivity constraint. Action potentials are generated at the locations of maximum amplitude of the memristors' interference pattern. The analogy with the double-slit problem provides the representation of wave interference and of wave-particle duality, that is, of the synapse-action potential dynamics. Synapses act as memristors generating waves that interfere to create a probability density for action potential generation in the form of constructive interference. This means that synapses cannot be added as discrete particles; they must be combined through the alternative approach of constructive and destructive interference.

Synaptic activity as a source of fluctuations for Brownian computing with interference is valuable for reducing computational costs. Tunneling effects are possible at resonance points, since we are interested in the subthreshold regime, where the amplitude of the signal is very small compared with the height of the potential barrier; computation under uncertainty is possible. Both inhibitory and excitatory synapses (and consequently inhibitory and excitatory neurons) can be described with the same model, memristors, which makes the computation easier than working from ionic activity; inhibition is expressed as destructive interference and excitation as constructive interference.
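The following sketch illustrates this interference bookkeeping (geometry, phases, and wavenumber are illustrative assumptions): excitatory synapses contribute in phase, inhibitory ones half a period out of phase, and the firing probability density is the squared magnitude of the summed amplitudes:

```python
# Adding synapses by interference rather than as discrete terms: each
# synapse contributes a complex amplitude; excitatory synapses add in
# phase (constructive) and inhibitory ones half a period out of phase
# (destructive). Locations, phases, and wavenumber are illustrative.
import numpy as np

x = np.linspace(-5, 5, 1000)                 # position along the membrane
k = 4.0                                      # illustrative wavenumber
# (location, phase): phase 0 = excitatory, phase pi = inhibitory
sources = [(-2.0, 0.0), (0.0, 0.0), (2.0, np.pi)]

psi = np.zeros_like(x, dtype=complex)
for x0, phase in sources:
    r = np.abs(x - x0) + 0.1                 # avoid the singularity at the source
    psi += np.exp(1j * (k * r + phase)) / np.sqrt(r)

density = np.abs(psi) ** 2                   # probability density of firing
print("most likely firing site: x =", round(x[np.argmax(density)], 2))
```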


6. Algorithmic explanation

  1. Compute the neural transmission as a function of the distribution of the post-synaptic neurons, starting from the delta distribution of observations: that is, the distribution of features, depending on k, the sub-network within each node of the network, and on the delta distribution.

  2. The weight matrix connects pre-synaptic neurons with post-synaptic neurons. In this case, the weight matrix is the synapse model of diffusion, determined by oscillators with memristor elements. The weight matrix depends on the generative function of the post-synaptic neurons, that is, the current prediction as a function of the neurons' positions.

  3. Calculate the synaptic weight for each particle, using each calculated synaptic weight matrix to update the state of the pre-synaptic and post-synaptic neurons, for each neuron.

  4. Update the pre-synaptic neuron from the distribution of observations and the generative function of the current prediction. In this case, it is the communication between the pre-synaptic and post-synaptic neuron that changes the current state of the pre-synaptic neuron, through interference oscillatory patterns that reflect the particle-wave duality.

  5. Given that communication between pre- and post-synaptic neurons runs in both directions, the post-synaptic neurons require an update rule that depends on the pre-synaptic ones. Pre-synaptic neurons are the input that transmits sensory information; even if communication happens along a preferential directional stream, this does not mean that there cannot be a response in the opposite direction, reflecting the action-reaction nature of states in the particle-wave description of communication. This update is given by the addition of the current post-synaptic neuron state, the deterministic classical drift function, the synaptic connectivity from the distribution of pre-synaptic neurons, and, lastly, a Brownian motion term that comes from the fact that the process requires a spontaneous component.

  6. The algorithm finishes when the total number of neurons has been processed.

  7. At the end, the distribution of post-synaptic neurons, in these terms the wave distribution of the postsynaptic neurons, is found after each neuron passes through the weight matrix, in these terms the splitting phenomenon produced by the diffusion process experienced by waves passing through the synaptic cleft. Once these waves come into contact with the wall of the postsynaptic activity distribution, there is a reflective component that changes the activity of the pre-synaptic distribution. A minimal sketch of this loop is given below.
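Here is one toy rendering of the loop above; every modeling choice (sizes, drift, the cosine coupling) is an illustrative assumption, not a prescription of the chapter:

```python
# Toy rendering of steps 1-7: oscillator-derived weights, bidirectional
# pre/post updates, and a Brownian term. All modeling choices here
# (sizes, drift, cosine coupling) are illustrative.
import numpy as np

rng = np.random.default_rng(2)
n_pre, n_post, n_steps, dt = 20, 20, 200, 0.01

pre = rng.normal(0, 0.1, n_pre)              # pre-synaptic states
post = rng.normal(0, 0.1, n_post)            # post-synaptic states
phase = rng.uniform(0, 2 * np.pi, (n_post, n_pre))  # one oscillator per synapse

for step in range(n_steps):
    # Steps 2-3: weight matrix from memristor-like oscillators;
    # interference enters through the cosine of the evolving phases.
    W = np.cos(phase + step * dt)

    # Step 4: pre-synaptic update from the post-synaptic "reflection".
    pre += dt * (W.T @ post) / n_post

    # Step 5: post-synaptic update = drift + synaptic input + Brownian term.
    drift = -post                            # simple deterministic relaxation
    noise = np.sqrt(dt) * 0.05 * rng.standard_normal(n_post)
    post += dt * (drift + (W @ pre) / n_pre) + noise

# Step 7: the resulting post-synaptic (wave) distribution.
print("post-synaptic mean/std:", round(post.mean(), 3), round(post.std(), 3))
```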


7. Mathematical foundation

Analyzing synapses from the wave-particle viewpoint is computationally efficient for studies of memory and learning, where it is crucial to detect the appearance and disappearance of synaptic contacts and plasticity. Within this framework, synapses are amplitudes of wave activity that vanish or emanate depending on the collective oscillatory behavior. This is a novel way of modeling interneuron communication that does not rest on the complexity of detailed biophysical models of neurons or their synapses, for which it is necessary to compute around 25 coupled differential equations.

Another desirable characteristic of this proposal is that the required noise generation, proper to real synaptic recordings, is implicit in the wave pattern generation, as a consequence of the distribution and structural complexity that can be reduced to the location of the slit, again in analogy with Young's experiment. This is a valuable feature, given that it is not necessary to add extra noise terms which, in the case of biophysical or integrate-and-fire models, make the system harder to solve computationally.

This study is a potential tool for the analysis of data where a transformation from the postsynaptic signal to the presynaptic one, or vice versa, is required. The approach can also be useful in computational tasks, both in software and hardware, that use real biological network architectures as templates of interference generators able to solve different computations.

A simplified explanation of the model of synaptic wave signal detection follows. Suppose that the distance and speed of a synaptic signal are to be determined. For this purpose, a transmitter/receiver neuron sends a test action potential signal f toward the synaptic wave. The signal f is reflected by the synaptic wave, and part of the reflected signal is received by the neuron as an echo. The test signal sent is a pulse of short duration:

f(x) = σ(x) exp(2πiω·x)    (E1)

with an envelope signal σ of slow variation. This means that the support of the Fourier transform of σ is contained in [−A, A], with A small compared to the carrier frequency ω. The support of the Fourier transform of f is then contained in [ω − A, ω + A].

Consider r as the distance between the pre- and postsynaptic neurons (the transmitter and the receiver), v as the relative velocity between the receiver and the synaptic signal, and c as the speed of light. Then the "echo" (in analogy with the echo state in reservoir computing, which is also represented by synaptic activity) is received with a time lag

δt = 2r/c    (E2)

Each frequency of f undergoes a Doppler shift δω = 2ωv/c. Since the bandwidth 2A is small compared to the carrier frequency ω, the frequency shifts can be approximated by the single shift δω ≈ 2ωv/c, independently of the exact form of f. No further distortion of f is taken into account; therefore the "echo" has the form:

e = M_δω T_δt f    (E3)

where T_δt f(x) = f(x − δt) denotes the time shift and M_δω f(x) = exp(2πiδω·x) f(x) the frequency shift.

At the receiver neuron, the "echo" is compared with the time-frequency shifts of the original signal f by taking the absolute value of the correlation, that is, of the inner product of the "echo" signal with M_ω T_x f. This is equivalent to the absolute value of the short-time Fourier transform (STFT) of f with window f, evaluated at (x − δt, ω − δω), which in turn equals the ambiguity function of f at that point.

The values of δt and δω, and thus the distance r and the velocity v of the synaptic wave, can be determined by means of the following lemma: suppose that f belongs to the Hilbert space L²(ℝᵈ) and f ≠ 0; then |Af(x, ω)| < Af(0, 0) for all (x, ω) ≠ (0, 0), where Af denotes the ambiguity function of f. In other words, the correlation is maximized exactly at the true shifts.
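A numerical check of this lemma, under illustrative assumptions (a Gaussian pulse and arbitrarily chosen true shifts), recovers δt and δω by scanning the correlation with time-frequency shifted copies of f:

```python
# Numerical check of the lemma: the correlation of the echo with
# time-frequency shifted copies of f peaks at the true (dt, dw).
# Pulse shape, carrier, and the true shifts are illustrative.
import numpy as np

x = np.linspace(-20, 20, 4000)
dx = x[1] - x[0]
omega0 = 2.0
f = np.exp(-x**2 / 2) * np.exp(2j * np.pi * omega0 * x)   # pulse as in (E1)

true_dt, true_dw = 3.0, 0.4          # "unknown" to the receiver neuron
echo = np.exp(2j * np.pi * true_dw * x) * \
       np.exp(-(x - true_dt)**2 / 2) * np.exp(2j * np.pi * omega0 * (x - true_dt))

best, best_val = (0.0, 0.0), -1.0
for tau in np.linspace(0, 6, 61):            # candidate time lags
    shifted = np.interp(x - tau, x, f.real) + 1j * np.interp(x - tau, x, f.imag)
    for dw in np.linspace(0, 1, 41):         # candidate Doppler shifts
        corr = abs(np.sum(echo * np.conj(np.exp(2j * np.pi * dw * x) * shifted)) * dx)
        if corr > best_val:
            best, best_val = (tau, dw), corr

print("estimated (dt, dw):", best)           # close to (3.0, 0.4)
```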

The synaptic wave is the support of the function σ with which the Wigner distribution should be convolved in order to average each of its points. The lack of positivity of the Wigner distribution is similar to the behavior of the membrane potential in the neuron; in mathematics, these negative point values do not have any meaningful interpretation. We have to understand the oscillation properties of the neuron membrane, and this translates into understanding the oscillation properties of the Wigner distribution Wf. As a solution for the negative values, one might take averages at each point. The standard averaging procedure in mathematics is the convolution of Wf with a smoothing function σ centered at the origin; the convolution can then be seen as a local average of Wf at (x, ω), where, as already explained, the ω and x signals carry the information of the action potential frequency and the membrane voltage, respectively. Since only regions of phase space of area δx·δω ≥ 1 are relevant, one may conjecture that the oscillations cancel out and that the convolution is non-negative for all functions whenever the support of σ is large enough.
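This conjecture can be probed numerically for the Gaussian case: the spectrogram taken with a Gaussian window equals the Wigner distribution of the signal convolved with the (Gaussian) Wigner distribution of the window, and it is nonnegative pointwise. A minimal sketch, with illustrative signal and window widths:

```python
# Gaussian-smoothed Wigner distribution via the spectrogram: |STFT|^2
# with a Gaussian window equals the Wigner distribution of f convolved
# with the Gaussian's own Wigner distribution, and it is nonnegative
# pointwise. Signal and window widths are illustrative.
import numpy as np

t = np.linspace(-10, 10, 2000)
dt = t[1] - t[0]
f = np.exp(-t**2 / 4) * np.cos(2 * np.pi * 1.5 * t)   # test signal
freqs = np.linspace(-3, 3, 120)
centers = t[::50]                                      # STFT time grid

spec = np.empty((centers.size, freqs.size))
for i, tc in enumerate(centers):
    window = np.exp(-((t - tc) ** 2) / 2)              # Gaussian window at tc
    for j, w in enumerate(freqs):
        coef = np.sum(f * window * np.exp(-2j * np.pi * w * t)) * dt
        spec[i, j] = abs(coef) ** 2                    # |STFT|^2 >= 0

print("minimum of the smoothed distribution:", spec.min())   # never negative
```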

It is important to find the kernels σ for which the averaged Wigner distribution is always positive. If we adopt the assumption, and the intuition, of the synaptic wave as a Gaussian σ, then the convolution works and we can take the average at each point. This can be pictured through an optical analogy: a circle with the light spectrum just a few millimeters from the edge, white light in the center, and the index of each color defining a different diameter for the cone of light.

Considering the synaptic wave as a general Gaussian has important implications, in the sense that we can recover all the signals from the general form, and from this kernel recover Wf. I refer the reader to [11] for more details on the step from the uncertainty principle to the Wigner distribution, based on the work of A.J.E.M. Janssen.


8. Conclusions

Synapses are the target of most, if not all, brain disorders. Because of the enormous number of synapses in our brain (approximately 100,000,000,000), it is often difficult to comprehend how defects in one type of synapse yield the overall loss in brain performance characteristic of all brain disorders. Computational modeling of synapses has been a slow process that is constrained by advances in the interdisciplinary fusion of physics, both for measuring and data analysis, and advances in molecular biology or biophysics. These constraints are also computational: some models have detailed sets of coupled equations with experimental parameters that, although accurate, are impossible to solve because of the enormous number of synapses. This limitation impedes understanding of brain information processing, both in health and in disease states.

Modeling synapses as oscillators with memristors seeks to overcome this computational limitation via an entirely new theoretical approach, based on modeling synapses from a wave-particle viewpoint that is computationally efficient. The framework considers synapses as amplitudes of wave activity that vanish or emanate depending on the collective oscillatory behavior. This is a novel way of modeling interneuron communication that does not rest on the complexity of detailed biophysical models of neurons or their synapses. As future consequences of the theoretical proposal, the following implications should be analyzed once experiments at the Angstrom scale are available:

  • Analysis of ion channels as wave perturbations, and analysis of energy constraints considering oscillations in the glutamate cycle, in interaction with astrocytes, as new carriers of long-range correlations, allowing the generation of long-range wave patterns.

  • Analysis of mesoscopic oscillations and their connection with macroscopic oscillations by applying quantum field theory to the analysis of long-range correlations.

  • Improvement of the model considering perturbation details of the arriving wave function.

Perhaps the most valuable contribution of this theoretical proposal is that these equations have an implicit noise generation that allows us to compute under uncertainty without the need for complex stochastic equations for Brownian motion. This can be applied to computational tasks in hardware and software.


Acknowledgments

The author would like to thank the Hoursec team for their motivation to work on a new computational paradigm that will impact the lives of many. The author also thanks our partners, whose support and early market traction reflect awareness of the environmental problems of conventional computing architectures.


References

  1. Lee J, et al. Brownian circuits: Designs. International Journal of Unconventional Computing. 2016;12:341-362
  2. Tanaka G, et al. Recent advances in physical reservoir computing: A review. Neural Networks. 2019;115:100-123
  3. Mainzer K, Chua L. Local Activity Principle. London: Imperial College Press; 2013
  4. Dale H. Otto Loewi. 1873-1961. Biographical Memoirs of Fellows of the Royal Society. 1962;8:67-89
  5. Freeman WJ. Neurodynamics: An Exploration in Mesoscopic Brain Dynamics. London: Springer; 2000
  6. Bullock TH. How Do Brains Work?: Papers of a Comparative Neurophysiologist. Boston: Birkhäuser; 1993
  7. Chua L. Memristor - the missing circuit element. IEEE Transactions on Circuit Theory. 1971;18:507-519
  8. Katchalsky A. Thermodynamics of flow and biological organization. In: Biophysics and Other Topics. 1976. pp. 521-547
  9. Li Z. A model of olfactory adaptation and sensitivity enhancement in the olfactory bulb. Biological Cybernetics. 1990;62:349-361
  10. Nicolis G, Prigogine I. Irreversible processes at nonequilibrium steady states and Lyapounov functions. Proceedings of the National Academy of Sciences. 1979;76:6060-6061
  11. Janssen A, van Leeuwaarden J. Cumulants of the maximum of the Gaussian random walk. Stochastic Processes and their Applications. 2007;117:1928-1959
