1. Introduction
There are hybrid neural networks, which are neither purely continuous-time nor purely discrete-time, and among them are dynamical systems with impulses and models with piecewise constant arguments [1,2,3,4,5,6,7,8,9,10]. In recent years, the dynamics of Hopfield-type neural networks have been studied and developed by many authors by using impulsive differential equations [11,12,13,14,15] and differential equations with a piecewise constant argument [16,17,18,19]. In this paper, a new model of Hopfield-type neural networks with an unpredictable input-output, as well as a delayed and advanced generalized piecewise constant argument, is proposed. Hopfield-type neural networks are effective at adaptive pattern recognition, vision, and image processing [20,21,22]. Differential equations with a piecewise constant argument describing the Hopfield neural networks may "memorize" the values of the phase variable at certain moments of time and utilize these values during the intermediate process until the next moment [5,6,7,8,9,10,16,17,18,19,23,24,25,26,27,28]. Neural networks comprised of chaotically oscillating elements store and transmit information in almost the same way as nerve cells in the brain. It is known that unpredictable oscillations cause chaotic behavior [29,30,31,32,33,34,35,36,37,38,39,40,41]. Therefore, their presence is necessary for studying chaotic dynamics in neural networks.
The novelty of our results should be considered with respect to oscillations, chaos, and modeling for neural networks. Oscillations such as periodic and almost periodic ones were discussed intensively in [16,17,18,19,23,24,25,26,27,42,43,44,45]. However, the most developed are unpredictable oscillations, which were introduced and studied in [31,32,33,34,35,36,37,38,39]. This is the first time unpredictable oscillations have been considered for neural networks with a generalized-type piecewise constant argument. The argument can be both delayed and advanced, and consequently, it provides rich opportunities for the investigation and application of neural networks.
It is known that oscillations and periodic motions are frequently observed in the activities of neurons in the brain. Recent developments in the field of neural networks have led to an increased interest in the complexity of the dynamics. Oscillations and chaos in neural networks are topical and have stimulated the interest of many scientists [46,47,48,49,50]. They occur in a neural network system due to the properties of single neurons [46,50,51] and synaptic connections among neurons [52,53]. The neural networks in the present research display unpredictable oscillations and chaos. The unpredictable function was introduced in [29] and is based on the dynamics of unpredictable points and Poincaré chaos [30]. More precisely, the function is an unpredictable point of the Bebutov dynamics, and consequently, it is a member of the chaotic set [31]. The notion of the unpredictable point extends the frontiers of the classical theory of dynamical systems, and the unpredictable function poses new problems on the existence of unpredictable oscillations for the theory of differential equations [29,30,31,32,33,34,35,36]. These studies have been identified as major contributing factors for the emergence of new types of sophisticated motion. Significant results have been obtained for unpredictable oscillations of Hopfield-type neural networks, shunting inhibitory cellular neural networks, and inertial neural networks [37,38,39].
To the best of our knowledge, there have been very few results on the dynamical behavior of Hopfield-type neural networks with piecewise constant arguments [16,17,18,19,26,27]. In the present paper, we try to expand them by considering piecewise constant arguments of the generalized type [5,6,7,8,9,10,16,17,18,19,23,24,25,26,27,28,43,44] and by using the theory of unpredictable functions. We improve on previous methods by considering unpredictable inputs, which allow studying the distribution of chaotic signals in neural networks.
2. Preliminaries
Denote by $\mathbb{R}$, $\mathbb{N}$, and $\mathbb{Z}$ the sets of all real numbers, natural numbers, and integers, respectively. Introduce a norm for the vector $u = (u_1, \ldots, u_m)^T$ as $\|u\| = \max_{1 \le i \le m} |u_i|$, where $|\cdot|$ is the absolute value. Correspondingly, for a square matrix $A = (a_{ij})_{m \times m}$, the norm $\|A\| = \max_{1 \le i \le m} \sum_{j=1}^{m} |a_{ij}|$ is utilized.
We fix two real-valued sequences $\theta_k$, $\zeta_k$, $k \in \mathbb{Z}$, such that $\theta_k \le \zeta_k \le \theta_{k+1}$ for all $k \in \mathbb{Z}$ and $|\theta_k| \to \infty$ as $|k| \to \infty$. It is assumed that there exists a positive number $\theta$ such that $\theta_{k+1} - \theta_k \le \theta$ for all integers $k$.
The main subject under investigation in this paper is the following Hopfield-type neural network system with a piecewise constant argument:

$$x_i'(t) = -a_i x_i(t) + \sum_{j=1}^{m} b_{ij} f_j(x_j(t)) + \sum_{j=1}^{m} c_{ij} g_j(x_j(\gamma(t))) + v_i(t), \quad i = 1, \ldots, m, \qquad (1)$$

where $\gamma(t) = \zeta_k$ if $t \in [\theta_k, \theta_{k+1})$, $k \in \mathbb{Z}$;
$a_i$—the rates with which the units self-regulate or reset their potentials when isolated from other units and inputs;
$m$—the number of neurons in the network;
$x_i(t)$—the state of the $i$th unit at time $t$;
$f_j$, $g_j$—the activation functions of the incoming potentials of the unit $j$;
$b_{ij}$, $c_{ij}$—the synaptic connection weights of the unit $j$ on the unit $i$;
$v_i(t)$—the time-varying stimulus, corresponding to the external input from outside the network to the unit $i$.
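To make the delayed and advanced character of the argument concrete, the following sketch implements γ(t) for illustrative sequences (θ_k = k/2 and ζ_k chosen as the midpoint of each interval are assumptions for demonstration only, not values taken from the paper):

```python
import bisect

# Illustrative sequences (assumed for this sketch only): switching moments
# theta_k = k/2 and zeta_k = theta_k + 1/4, so theta_k <= zeta_k <= theta_{k+1}.
THETA = [k / 2 for k in range(-20, 21)]
ZETA = [t + 0.25 for t in THETA]

def gamma(t):
    """Generalized piecewise constant argument:
    gamma(t) = zeta_k for t in [theta_k, theta_{k+1})."""
    k = bisect.bisect_right(THETA, t) - 1
    return ZETA[k]

# On each interval the argument is advanced while t < zeta_k and
# delayed once t > zeta_k:
assert gamma(0.1) == 0.25 and gamma(0.1) > 0.1   # advanced
assert gamma(0.4) == 0.25 and gamma(0.4) < 0.4   # delayed
```

Within every interval [θ_k, θ_{k+1}) the state is evaluated at the fixed moment ζ_k, so the same term acts as an advanced argument at the beginning of the interval and as a delayed one toward its end.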
Throughout this paper, we assume that the parameters and are real and that the activation functions are continuous functions. Moreover, suppose that there exist positive constants and such that the inequality holds for each .
We present system (1) in the following vector form:

$$x'(t) = -Ax(t) + Bf(x(t)) + Cg(x(\gamma(t))) + v(t), \qquad (2)$$

where $x = (x_1, \ldots, x_m)^T$ is the neuron state vector, $f = (f_1, \ldots, f_m)^T$ and $g = (g_1, \ldots, g_m)^T$ are the activations, and $v = (v_1, \ldots, v_m)^T$ is the input vector. Moreover, $A = \operatorname{diag}(a_1, \ldots, a_m)$, $B = (b_{ij})_{m \times m}$, and $C = (c_{ij})_{m \times m}$ are matrices.
As the usual activations for continuous-time neural network dynamics, the following sigmoidal functions are considered [45]:

They are used in neural networks as activation functions, since they amplify weak signals while not becoming saturated by strong signals. The activation function and the output function are jointly referred to by the term transfer functions: the activation function determines the total signal a neuron receives, while the transfer function translates the input signals to the output signals.
The block diagram of the Hopfield-type neural network system with a piecewise constant argument is shown in Figure 1, and the symbols for the diagram are described in Table 1.
Definition 1 ([29]). A uniformly continuous and bounded function $\vartheta : \mathbb{R} \to \mathbb{R}^m$ is unpredictable if there exist positive numbers $\epsilon_0$, $\delta$ and sequences $t_n$, $s_n$, both of which diverge to infinity, such that $\vartheta(t + t_n) \to \vartheta(t)$ as $n \to \infty$ uniformly on compact subsets of $\mathbb{R}$ and $\|\vartheta(t + t_n) - \vartheta(t)\| \ge \epsilon_0$ for each $t \in [s_n - \delta, s_n + \delta]$ and $n \in \mathbb{N}$.

3. Main Results
Let denote the space of m-dimensional vector-functions , with the norm . The functions of this space are assumed to satisfy the following properties:
- (A1)
they are uniformly continuous;
- (A2)
there exists a number such that for each function ;
- (A3)
there exists a sequence that diverges to infinity such that uniformly on each closed and bounded interval of the real axis for each function .
The following conditions on the system (2) are assumed:
- (C1)
and for all where are positive constants;
- (C2)
there exist positive numbers , such that and ;
- (C3)
is a function from the space , and there exists a positive number such that
- (C4)
- (C5)
- (C6)
, where
;
- (C7)
;
- (C8)
there exists a sequence with the property as such that and as on each finite interval of integers, where is the sequence given in Definition 1.
Lemma 1 ([10]). A function is a bounded solution of Equation (1) if and only if it is a solution of the following integral equation:

Let us introduce the operator Π on such that:

Lemma 2. The operator Π is invariant in .
Proof. Let us evaluate the derivative of with respect to the time variable . Then, we have:

Hence, we can find for all that:

Since the derivative of is bounded, is uniformly continuous. This means that satisfies the property (A1). Moreover, we have for that:
The last inequality together with the condition (C4) implies that

Thus, satisfies the property (A2).

Now, we need to check the last property (A3) for . In other words, we have to verify that there exists a sequence that diverges to infinity such that for each uniformly on each closed and bounded interval of the real axis. Fix an arbitrary positive number and a closed interval , where with . It is enough to show that for sufficiently large n and . We choose two numbers and such that:

Take n large enough such that and on . Then, for , by writing:

one can see that:

is valid. If we divide the last integral into two parts, we get for that:
We need to find an upper bound for the last integral. For this purpose, we shall evaluate it by dividing the interval of integration into subintervals as follows. For a fixed , we assume without loss of generality that and , so that there exist exactly p discontinuity moments in the interval . Let us denote:

We shall need the following presentation of the last integral.

Now, if we define the integrals in the last expression as:

and:

where , then we can write that:

For , we have by the condition (C8) that . Hence, we obtain that:

Since the function is uniformly continuous, for large n and , we can find a such that if . As a result of this discussion, we conclude that:

Moreover, we have by the condition (C8) that:

Applying a similar idea used for the integral , we get:

Thus, it is true that:

As a result of these evaluations, it follows that:

for all . Hence, the inequalities (4)–(6) give that for . Thus, the function satisfies the property (A3). As a result, the operator is invariant in . □
Lemma 3. The operator Π is a contraction on
Proof. Let the functions and belong to the space . We obtain for all that:

Then, it is true for all that:

Consequently, the condition (C5) implies that the operator is contractive. The lemma is proven. □
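The role of the contraction property can be seen in a toy computation. The scalar map below is an illustrative stand-in for the operator Π (the assumption is a Lipschitz constant of 1/2, mirroring the requirement that the contraction coefficient be less than one); successive approximations then converge geometrically to a unique fixed point, which is exactly how the contraction mapping theorem is applied in Theorem 1.

```python
import math

# A scalar analogue of the contraction argument: P has Lipschitz constant
# 1/2 < 1 (the slope of 0.5*tanh is at most 0.5), so Picard iteration
# converges to a unique fixed point.
def P(x):
    return 0.5 * math.tanh(x) + 1.0

x, gaps = 0.0, []
for _ in range(40):
    x_next = P(x)
    gaps.append(abs(x_next - x))    # distance between successive iterates
    x = x_next

assert abs(P(x) - x) < 1e-12        # x is (numerically) the fixed point
# Successive gaps shrink at least geometrically with ratio 1/2:
assert all(b <= 0.5 * a + 1e-15 for a, b in zip(gaps, gaps[1:]))
```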
The following assertion is needed in the proof of the stability of the solution.
Lemma 4 ([10]). Assume that the conditions (C1) and (C7) are fulfilled and is a continuous function with . If is a solution of:

then the following inequality:

holds for all , where K is as defined in (C6).

Proof. First, we fix an integer i such that and then consider two alternative cases: (a) and (b) .
For (a) , we have:

The Gronwall–Bellman lemma yields that:

Moreover, for , we have:

Consequently, it follows from the condition (C7) that , for . Therefore, (8) holds for all , .

The assertion for case (b) can be proven in the same way. Thus, one can conclude that (8) holds for all . The lemma is proven. □
Theorem 1. Assume that the conditions (C1)–(C8) hold true. If the function ϑ is unpredictable, then the system (1) has a unique exponentially stable unpredictable solution.

Proof. First, we show that
is a complete space. Let , which has a limit on , be a Cauchy sequence in the space . It can be easily shown that the limit function
is uniformly continuous and bounded, and hence, it satisfies the properties (A1) and (A2). It remains only to show that
satisfies the property (A3). Consider a closed and bounded interval . We have:

If one takes sufficiently large n and k such that each term on the right-hand side of the last inequality is less than for a small enough and , then the inequality is satisfied on . This implies that the sequence converges uniformly to on . Thus, the space is complete. Since the operator is invariant and contractive in , according to Lemmas 2 and 3, respectively, it follows from the contraction mapping theorem that the operator has a unique fixed point , which is the unique solution of the neural network system (1). Hence, the uniqueness of the solution is shown.
Next, we verify that this solution is unpredictable. We can find a positive number and such that the following inequalities:

and:

are satisfied. Suppose that the numbers and are fixed. Denote:

and consider the cases: (i) (ii)

(i) If holds, we have:

for

(ii) If is true, it follows from (11) that:

for . We can see that:

and:

Subtracting the first equation from the second one, we get:

Therefore, we have that:

for . For a fixed , we can take sufficiently small so that for some . Hence, for , which implies together with the condition (C8) that . Since , the function z is uniformly continuous. Using this fact, for and for large n, we can find a such that:

if . At the end, we have by the inequality (10) that:

Based on the inequalities obtained in cases (i) and (ii), we see that the solution is unpredictable with and .
Lastly, let us consider the stability of the solution . Denote , where is another solution of the neural network system (1) with a piecewise constant argument of the generalized type. Then, will be a solution of (7). By applying the inequalities (8) to (12), we obtain that:

Hence, we find that:

The last inequality can be written as follows:

If we apply the Gronwall–Bellman lemma to the last inequality, it leads to:

In other words, we have:

Now, based on the condition (C6), we conclude that the solution of (1) is uniformly exponentially stable. The theorem is proven. □
4. Examples and Numerical Simulations
We present two examples in this section. First, we construct an example of an unpredictable function by means of the logistic map considered in [29]. Then, we make use of this function in the second example, which deals with a Hopfield-type neural network system.

Example 1. Consider the following discrete logistic map given by:

where . We know that if , then the iterations of this map belong to the interval [54]. Moreover, if , Equation (13) has an unpredictable solution. Let denote an unpredictable solution of (13) for . There exist a positive number and sequences that diverge to infinity such that as for each i in bounded intervals of integers and for each .

Consider the following integral defined by:

where for . It is worth noting that is bounded on the whole real axis such that . In [37], it was proven that the function is an unpredictable function. Since we do not know the initial value of the function , we are not able to visualize it. For this reason, we represent the function as follows:

where . The simulation of an unpredictable function is impossible, since the initial value is not known. That is why we simulate a function that approaches the function as time increases. Let us determine:

where is a fixed number, which is not necessarily equal to . Subtracting equality (15) from equality (14), we obtain that , . The last equation demonstrates that the difference diminishes exponentially. This means that the function tends exponentially to the unpredictable function , i.e., the graphs of these functions approach each other as time increases. The functions and are the solutions of the differential equation:

and instead of the curve describing the unpredictable solution , we can take the graph of , which approximates the first one asymptotically. In Figure 2, we depict the graph defined by the initial value .

Example 2. Consider the following Hopfield-type neural network given by:

where , ,

and the time-varying stimulus is:

Here, is the unpredictable function mentioned in Example 1. Let the argument function be defined by the sequences ,
We can see that the conditions (C1)–(C8) are valid for the neural network (16) with for , and moreover, , . If we accept , then the system (16) satisfies all conditions of Theorem 1. Therefore, (16) has a unique exponentially stable unpredictable solution . Since the initial value is not known precisely, it is not possible to simulate the unpredictable solution . For this reason, we consider another solution , which starts initially at the point .
The graph of the function approaches the unpredictable solution of Equation (16) as time increases. That is, instead of the curve describing the unpredictable solution, one can consider the graph of . We present the coordinates of the solution in Figure 3. Moreover, Figure 4 indicates the trajectory of the solution.
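The simulation strategy of both examples — iterate the logistic map, pass it through an exponentially decaying integral, and rely on the fact that any two solutions merge exponentially — can be sketched numerically. The values μ = 3.91 for the logistic map and decay rate α = 2 below are illustrative assumptions, not necessarily the paper's choices.

```python
import math

MU, ALPHA = 3.91, 2.0   # assumed: mu in the chaotic range [3.8, 4], decay rate 2

def logistic_orbit(s0, n):
    """Iterations of the logistic map F(s) = mu*s*(1 - s)."""
    out = [s0]
    for _ in range(n - 1):
        out.append(MU * out[-1] * (1.0 - out[-1]))
    return out

def filtered(w0, lam):
    """Exact solution of w' = -ALPHA*w + lam_i on each unit interval [i, i+1),
    i.e., the piecewise constant logistic signal passed through the decaying
    integral."""
    w, traj = w0, [w0]
    decay = math.exp(-ALPHA)
    for l in lam:
        w = w * decay + (l / ALPHA) * (1.0 - decay)
        traj.append(w)
    return traj

lam = logistic_orbit(0.4, 200)
assert all(0.0 <= s <= 1.0 for s in lam)    # iterations stay in [0, 1]

# Two solutions started from different initial values approach each other
# exponentially, which is why simulating any one nearby trajectory
# approximates the unpredictable function for large time.
w1, w2 = filtered(0.0, lam), filtered(0.9, lam)
diff0, diffT = abs(w1[0] - w2[0]), abs(w1[-1] - w2[-1])
assert diffT <= diff0 * math.exp(-ALPHA * 199)
```

The difference of the two trajectories contracts by the exact factor e^(-α) per unit interval, matching the statement that the simulated function tends exponentially to the unpredictable one.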
Further, we describe a circuit implementation of the proposed Hopfield-type neural network (16) using MATLAB Simulink. The Simulink model of the Hopfield-type neural network is depicted in Figure 5, and the symbols are described in Table 2.
In the block diagram, we took the hyperbolic tangent transfer function as the sigmoid functions f and g. To implement the block diagram, we used the transfer function “tansig” from the MATLAB Simulink library. Inputs are unpredictable functions.
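For readers without Simulink, a minimal plain-Python counterpart of the diagram can be sketched. All coefficients below are illustrative assumptions (they are not the coefficients of system (16)), the transfer function is the hyperbolic tangent as in the diagram, and for simplicity the piecewise constant argument is taken purely delayed, γ(t) = ⌊t⌋, with a bounded stand-in input in place of the unpredictable one.

```python
import math

# Illustrative 2-neuron parameters (assumed; not those of system (16)).
A = [2.0, 2.0]                          # self-regulation rates a_i
B = [[0.1, -0.05], [0.05, 0.1]]         # weights b_ij
C = [[0.05, 0.02], [-0.02, 0.05]]       # weights c_ij

f = g = math.tanh                       # "tansig"-style transfer function

def v(t):                               # bounded stand-in for the input
    return [0.5 * math.sin(3.0 * t), 0.5 * math.cos(2.0 * t)]

def simulate(x0, T=40.0, h=0.001):
    """Euler scheme for x' = -Ax + B f(x(t)) + C g(x(gamma(t))) + v(t)
    with the simplified argument gamma(t) = floor(t)."""
    x = list(x0)
    x_gamma = list(x0)                  # holds x(gamma(t))
    steps_per_unit = round(1.0 / h)
    for n in range(int(round(T / h))):
        if n % steps_per_unit == 0:     # a new interval [k, k+1) begins
            x_gamma = list(x)
        t = n * h
        vt = v(t)
        x = [x[i] + h * (-A[i] * x[i]
                         + sum(B[i][j] * f(x[j]) for j in range(2))
                         + sum(C[i][j] * g(x_gamma[j]) for j in range(2))
                         + vt[i])
             for i in range(2)]
    return x

# Exponential stability in practice: two solutions started far apart
# become numerically indistinguishable, so either one traces the
# attracting solution.
xa, xb = simulate([1.0, -1.0]), simulate([-2.0, 3.0])
assert max(abs(a - b) for a, b in zip(xa, xb)) < 1e-3
```

This mirrors Theorem 1 qualitatively: with strong self-regulation (a_i large relative to the coupling norms), whichever trajectory is simulated converges to the unique attracting solution, which is why Figures 3 and 4 can display it through a nearby solution.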
Author Contributions
M.A., conceptualization, methodology, and investigation; D.A.Ç., conceptualization, investigation, and writing—review and editing; M.T., investigation, supervision, and writing—review and editing; Z.N., software, investigation, and writing—original draft. All authors read and agreed to the published version of the manuscript.
Funding
This research received no external funding.
Institutional Review Board Statement
Not applicable.
Informed Consent Statement
Not applicable.
Data Availability Statement
Not applicable.
Acknowledgments
The authors wish to express their sincere gratitude to the referees for the helpful criticism and valuable suggestions. M. Tleubergenova was supported by the Science Committee of the Ministry of Education and Science of the Republic of Kazakhstan (Grant No. AP08955400). M. Akhmet was supported by 2247—National Leading Researchers Program of TÜBİTAK, Turkey, N 1199B472000670. M. Akhmet and D. Aruğaslan Çinçin were supported by TÜBİTAK, the Scientific and Technological Research Council of Turkey, under the project 118F161. Z. Nugayeva was supported by the Science Committee of the Ministry of Education and Science of the Republic of Kazakhstan (Grant No. AP08856170). M. Tleubergenova and Z. Nugayeva were supported by the Science Committee of the Ministry of Education and Science of the Republic of Kazakhstan (Grant No. AP09258737).
Conflicts of Interest
The authors declare no conflict of interest.
References
- Akhmet, M. Principles of Discontinuous Dynamical Systems; Springer: New York, NY, USA, 2010.
- Wiener, J. Generalized Solutions of Functional Differential Equations; World Scientific: Singapore, 1993.
- Lakshmikantham, V.; Bainov, D.D.; Simeonov, P.S. Theory of Impulsive Differential Equations; World Scientific: Singapore, 1989.
- Gopalsamy, K. Stability and Oscillation in Delay Differential Equations of Population Dynamics; Kluwer Academic Publishers: Dordrecht, The Netherlands, 1992.
- Pinto, M. Asymptotic equivalence of nonlinear and quasi linear differential equations with piecewise constant arguments. Math. Comput. Model. 2009, 49, 1750–1758.
- Coronel, A.; Maulén, C.; Pinto, M.; Sepúlveda, D. Dichotomies and asymptotic equivalence in alternately advanced and delayed differential systems. J. Math. Anal. Appl. 2017, 450, 1434–1458.
- Huang, Z.; Wang, X.; Xia, Y. A topological approach to the existence of solutions for nonlinear differential equations with piecewise constant argument. Chaos Solitons Fractals 2009, 39, 1121–1131.
- Akhmet, M.U. Stability of differential equations with piecewise constant arguments of generalized type. Nonlinear Anal. 2008, 68, 794–803.
- Akhmet, M.U.; Aruğaslan, D. Lyapunov-Razumikhin method for differential equations with piecewise constant argument. Discret. Contin. Dyn. Syst. 2009, 25, 457–466.
- Akhmet, M. Nonlinear Hybrid Continuous/Discrete-Time Models; Atlantis Press: Paris, France, 2011.
- Gopalsamy, K. Stability of artificial neural networks with impulses. Appl. Math. Comput. 2004, 154, 783–813.
- Guan, Z.; Chen, G. On delayed impulsive Hopfield neural networks. Neural Netw. 1999, 12, 273–280.
- Xu, D.; Yang, Z. Impulsive delay differential inequality and stability of neural networks. J. Math. Anal. Appl. 2005, 305, 107–120.
- Mohammad, S. Exponential stability in Hopfield-type neural networks with impulses. Chaos Solitons Fractals 2007, 32, 456–467.
- Li, Y.; Lu, L. Global exponential stability and existence of periodic solution of Hopfield-type neural networks with impulses. Phys. Lett. A 2004, 333, 62–71.
- Akhmet, M.; Yilmaz, E. Hopfield-type neural network system with piecewise constant argument. Int. J. Qual. Theory Differ. Equ. Appl. 2009, 3, 8–14.
- Wan, L.; Wu, A. Stabilization control of generalized type neural networks with piecewise constant argument. J. Nonlinear Sci. Appl. 2016, 9, 3580–3599.
- Pinto, M.; Sepúlveda, D.; Torres, R. Exponential periodic attractor of impulsive Hopfield-type neural network system with piecewise constant argument. Electron. J. Qual. Theory Differ. Equ. 2018, 34, 1–28.
- Akhmet, M.; Yilmaz, E. Impulsive Hopfield-type neural network system with piecewise constant argument. Nonlinear Anal. Real World Appl. 2010, 11, 2584–2593.
- Pajares, G. A Hopfield neural network for image change detection. IEEE Trans. Neural Netw. 2006, 17, 1250–1264.
- Ramya, C.; Kavitha, G.; Shreedhara, K.S. Recalling of images using Hopfield neural network model. arXiv 2011, arXiv:1105.0332.
- Soni, N.; Sharma, E.K.; Kapoor, A. Application of Hopfield neural network for facial image recognition. IJRTE 2019, 8, 3101–3105.
- Akhmet, M.U.; Yilmaz, E. Neural Networks with Discontinuous/Impact Activations; Springer: New York, NY, USA, 2014.
- Akhmet, M.; Aruğaslan, D.; Cengiz, N. Exponential stability of periodic solutions of recurrent neural networks with functional dependence on piecewise constant argument. Turk. J. Math. 2018, 42, 272–292.
- Akhmet, M.U. Integral manifolds of differential equations with piecewise constant argument of generalized type. Nonlinear Anal. 2007, 66, 367–383.
- Chiu, K.S.; Pinto, M.; Jeng, J. Existence and global convergence of periodic solutions in recurrent neural network models with a general piecewise alternately advanced and retarded argument. Acta Appl. Math. 2014, 133, 133–152.
- Torres, R.; Pinto, M.; Castillo, S.; Kostić, M. Uniform approximation of impulsive Hopfield cellular neural networks by piecewise constant arguments on [τ,∞). Acta Appl. Math. 2021, 171, 8.
- Akhmet, M.U. On the reduction principle for differential equations with piecewise constant argument of generalized type. J. Math. Anal. Appl. 2007, 336, 646–663.
- Akhmet, M.; Fen, M.O. Poincaré chaos and unpredictable functions. Commun. Nonlinear Sci. Numer. Simulat. 2017, 48, 85–94.
- Akhmet, M.; Fen, M.O. Unpredictable points and chaos. Commun. Nonlinear Sci. Numer. Simulat. 2016, 40, 1–5.
- Akhmet, M.; Fen, M.O. Existence of unpredictable solutions and chaos. Turk. J. Math. 2017, 41, 254–266.
- Akhmet, M.; Fen, M.O. Non-autonomous equations with unpredictable solutions. Commun. Nonlinear Sci. Numer. Simulat. 2018, 59, 657–670.
- Akhmet, M.; Fen, M.O.; Tleubergenova, M.; Zhamanshin, A. Unpredictable solutions of linear differential and discrete equations. Turk. J. Math. 2019, 43, 2377–2389.
- Akhmet, M.; Tleubergenova, M.; Zhamanshin, A. Quasilinear differential equations with strongly unpredictable solutions. Carpathian J. Math. 2020, 36, 341–349.
- Akhmet, M.U.; Fen, M.O.; Alejaily, E.M. Dynamics with Chaos and Fractals; Springer: Cham, Switzerland, 2020.
- Akhmet, M.; Tleubergenova, M.; Fen, M.O.; Nugayeva, Z. Unpredictable solutions of linear impulsive systems. Mathematics 2020, 8, 1798.
- Akhmet, M.; Tleubergenova, M.; Nugayeva, Z. Strongly unpredictable oscillations of Hopfield-type neural networks. Mathematics 2020, 8, 1791.
- Akhmet, M.; Seilova, R.; Tleubergenova, M.; Zhamanshin, A. Shunting inhibitory cellular neural networks with strongly unpredictable oscillations. Commun. Nonlinear Sci. Numer. Simulat. 2020, 89, 105287.
- Akhmet, M.; Tleubergenova, M.; Zhamanshin, A. Inertial neural networks with unpredictable oscillations. Mathematics 2020, 8, 1797.
- Miller, A. Unpredictable points and stronger versions of Ruelle–Takens and Auslander–Yorke chaos. Topol. Appl. 2019, 253, 7–16.
- Thakur, R.; Das, R. Strongly Ruelle-Takens, strongly Auslander-Yorke and Poincaré chaos on semiflows. Commun. Nonlinear Sci. Numer. Simulat. 2019, 81, 105018.
- Akhmet, M.U. Almost Periodicity, Chaos, and Asymptotic Equivalence; Springer: New York, NY, USA, 2020.
- Yu, T.; Cao, D.; Liu, S.; Chen, H. Stability analysis of neural networks with periodic coefficients and piecewise constant arguments. J. Frankl. Inst. 2016, 353, 409–425.
- Xi, Q. Global exponential stability of Cohen-Grossberg neural networks with piecewise constant argument of generalized type and impulses. Neural Comput. 2016, 28, 229–255.
- Danciu, D. Qualitative behavior of the time delay Hopfield type neural networks with time varying stimulus. Ann. Univ. Craiova Ser. El. Eng. 2002, 26, 72–82.
- Guevara, M.R.; Glass, L.; Mackey, M.C.; Shrier, A. Chaos in neurobiology. IEEE Trans. Syst. Man Cybern. 1983, 13, 790–798.
- Derrida, B.; Meir, R. Chaotic behavior of a layered neural network. Phys. Rev. A 1988, 38, 3116–3119.
- Wang, L.; Pichler, E.E.; Ross, J. Oscillations and chaos in neural networks: An exactly solvable model. Proc. Natl. Acad. Sci. USA 1990, 87, 9467–9471.
- Landau, I.D.; Sompolinsky, H. Coherent chaos in a recurrent neural network with structured connectivity. PLoS Comput. Biol. 2018, 14, e1006309.
- Qu, J.; Wang, R.; Yan, C.; Du, Y. Oscillations and synchrony in a cortical neural network. Cogn. Neurodyn. 2014, 8, 157–166.
- Muscinelli, S.P.; Gerstner, W.; Schwalger, T. How single neuron properties shape chaotic dynamics and signal transmission in random neural networks. PLoS Comput. Biol. 2019, 15, e1007122.
- Penn, Y.; Segal, M.; Moses, E. Network synchronization in hippocampal neurons. Proc. Natl. Acad. Sci. USA 2016, 113, 3341–3346.
- Bel, A.; Rotstein, H.G. Membrane potential resonance in non-oscillatory neurons interacts with synaptic connectivity to produce network oscillations. J. Comput. Neurosci. 2019, 46, 169–195.
- Hale, J.; Koçak, H. Dynamics and Bifurcations; Springer: New York, NY, USA, 1991.
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.
© 2021 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (http://creativecommons.org/licenses/by/4.0/).