In recent years, the global stability of recurrent neural networks (RNNs) has been investigated extensively. It is well known that time delays and external disturbances can undermine the stability of RNNs. In this paper, we analyze the robustness of global exponential stability of RNNs subject to time delays and random disturbances. Given a globally exponentially stable neural network, the problem addressed here is how much time delay and noise the RNN can withstand while remaining globally exponentially stable. The upper bounds of the time delay and noise intensity under which the RNNs sustain global exponential stability are characterized by transcendental equations. Moreover, we prove theoretically that, for any globally exponentially stable RNN, if the additive noise intensity and time delay are smaller than the derived upper bounds, then the perturbed RNN is guaranteed to remain globally exponentially stable. Three numerical examples are provided to substantiate the theoretical results.
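The bounds described above are roots of transcendental equations, which generally have no closed-form solution and must be found numerically. The following sketch is purely illustrative and is not the paper's actual stability condition: it assumes a hypothetical condition of the form beta * tau * exp(alpha * tau) < 1, where `alpha` stands for the decay rate of the nominal RNN and `beta` for an aggregate gain of the delayed terms (both names are assumptions for this example), and solves for the admissible delay bound by bisection.

```python
import math

def delay_bound(alpha, beta, tol=1e-12):
    """Solve the (hypothetical) transcendental condition
        g(tau) = beta * tau * exp(alpha * tau) - 1 = 0
    for the admissible delay upper bound tau_bar.

    alpha: assumed exponential decay rate of the nominal RNN.
    beta:  assumed aggregate Lipschitz/gain constant of delayed terms.
    Delays below tau_bar keep g(tau) < 1, i.e. stability is preserved
    in this illustrative model.
    """
    g = lambda tau: beta * tau * math.exp(alpha * tau) - 1.0
    # Bracket the root: g(0) = -1 < 0, and g grows without bound.
    lo, hi = 0.0, 1.0
    while g(hi) < 0.0:
        hi *= 2.0
    # Bisection: shrink [lo, hi] until the bound is resolved to tol.
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if g(mid) < 0.0:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

tau_bar = delay_bound(alpha=1.0, beta=2.0)
print(tau_bar)
```

For these sample constants, `tau_bar` is the value where the illustrative condition becomes tight; any smaller delay satisfies it strictly. The same bracketing-and-bisection pattern applies to whatever transcendental equation a given stability criterion produces, provided the left-hand side is monotone in the delay.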