The static properties of the fundamental model for epidemics of diseases conferring immunity (the susceptible-infected-removed model) are known to be derivable via an exact mapping to bond percolation. Yet when performing numerical simulations of these dynamics on a network, a number of subtleties must be taken into account in order to correctly estimate the transition point and the associated critical properties. We expose these subtleties and identify the different quantities that play the role of criticality detector in the two dynamics.
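The SIR-percolation correspondence can be illustrated numerically: occupy each edge of a network with the transmissibility T and measure the largest occupied cluster, whose relative size estimates the final epidemic fraction. Below is a minimal sketch (not the paper's code) on an assumed Erdős–Rényi substrate, using union-find; graph model and parameters are illustrative choices.

```python
import random
from collections import Counter

def percolation_giant_fraction(n, mean_degree, T, seed=0):
    """Bond-percolation proxy for the SIR final size: occupy each edge of an
    Erdos-Renyi-like graph with probability T (the transmissibility) and
    return the fraction of nodes in the largest occupied cluster."""
    rng = random.Random(seed)
    parent = list(range(n))

    def find(x):  # union-find with path halving
        while parent[x] != x:
            parent[x] = parent[parent[x]]
            x = parent[x]
        return x

    n_edges = n * mean_degree // 2          # expected number of edges
    for _ in range(n_edges):
        u, v = rng.randrange(n), rng.randrange(n)
        if u != v and rng.random() < T:     # edge occupied with prob. T
            ru, rv = find(u), find(v)
            if ru != rv:
                parent[ru] = rv
    sizes = Counter(find(x) for x in range(n))
    return max(sizes.values()) / n
```

Above the percolation threshold (occupied mean degree greater than 1) the giant cluster, and hence the epidemic, spans a finite fraction of the network; below it, only microscopic clusters appear.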
We investigate the dynamic relaxation of random walks on temporal networks by focusing on the recently proposed activity-driven model [N. Perra, B. Gonçalves, R. Pastor-Satorras, A. Vespignani, Sci. Rep. 2, 469 (2012)]. For realistic activity distributions with a power-law form, we observe a very slow relaxation dynamics compatible with aging effects. A theoretical description of this process is achieved by means of a mapping to Bouchaud's trap model. The mapping highlights the profound difference in the dynamics of the random walks according to the value of the exponent gamma of the activity distribution.
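For concreteness, one time step of the activity-driven model consists of each node i firing with probability given by its activity a_i, drawn from a power law, and creating m ephemeral links. The sketch below is illustrative, not the authors' code; the lower cutoff eps and inverse-transform sampling are assumptions.

```python
import random

def sample_activities(n, gamma, eps, rng):
    """Draw n activities from p(a) ~ a^(-gamma) on [eps, 1] by inverse transform."""
    lo = eps ** (1.0 - gamma)
    return [(lo + rng.random() * (1.0 - lo)) ** (1.0 / (1.0 - gamma))
            for _ in range(n)]

def activity_driven_step(activities, m, rng):
    """One time step: each node activates with probability a_i and attaches
    m edges to uniformly chosen targets; edges last only this step."""
    n = len(activities)
    edges = []
    for i, a in enumerate(activities):
        if rng.random() < a:
            for _ in range(m):
                j = rng.randrange(n)
                if j != i:
                    edges.append((i, j))
    return edges
```

A random walker on this sequence of instantaneous graphs spends long stretches trapped on low-activity nodes, which is the mechanism behind the trap-model mapping described above.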
Many sociological networks, as well as biological and technological ones, can be represented as complex networks with a heterogeneous connectivity pattern. Dynamical processes taking place on top of them can be strongly influenced by this topological feature. In this paper we consider a paradigmatic model of non-equilibrium dynamics, namely the forest fire model, whose relevance lies in its capacity to represent several epidemic processes in a general parametrization. We study the behavior of this model in complex networks by developing the corresponding heterogeneous mean-field theory and solving it in its steady state. We provide exact and approximate expressions for homogeneous networks and several instances of heterogeneous networks. A comparison of our analytical results with extensive numerical simulations allows us to draw the region of the parameter space in which heterogeneous mean-field theory provides an accurate description of the dynamics, and highlights the limits of validity of the mean-field approach in situations where dynamical correlations become important.
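As a concrete reference for the dynamics, one synchronous sweep of the Drossel–Schwabl forest fire rules on an arbitrary network can be sketched as follows (an illustrative sketch, not the paper's code; the synchronous update scheme and the parameter names p for tree growth and f for lightning are assumptions):

```python
import random

EMPTY, TREE, FIRE = 0, 1, 2

def forest_fire_sweep(state, neighbors, p, f, rng):
    """One synchronous sweep of forest-fire rules on a graph:
    a burning site becomes empty; a tree catches fire if any neighbor
    burns, or spontaneously with lightning probability f; an empty
    site grows a tree with probability p."""
    new = list(state)
    for i, s in enumerate(state):
        if s == FIRE:
            new[i] = EMPTY
        elif s == TREE:
            if any(state[j] == FIRE for j in neighbors[i]) or rng.random() < f:
                new[i] = FIRE
        else:  # EMPTY
            if rng.random() < p:
                new[i] = TREE
    return new
```

On a heterogeneous network, `neighbors` would be built from the degree distribution under study; the heterogeneous mean-field theory then tracks the densities of the three states per degree class.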
We show that Information Theory quantifiers are suitable tools for detecting and quantifying noise-induced temporal correlations in stochastic resonance phenomena. We use the Bandt & Pompe (BP) method [Phys. Rev. Lett. 88, 174102 (2002)] to define a probability distribution, P, that fully characterizes temporal correlations. The BP method is based on a comparison of neighboring values, and here it is applied to the temporal sequence of residence-time intervals generated by the paradigmatic model of a Brownian particle in a sinusoidally modulated bistable potential. The probability distribution P generated via the BP method has an associated normalized Shannon entropy, H[P], and a statistical complexity measure, C[P], defined as proposed by Rosso et al. [Phys. Rev. Lett. 99, 154102 (2007)]. The statistical complexity quantifies not only randomness but also the presence of correlational structures, the two extreme circumstances of maximum knowledge (“perfect order”) and maximum ignorance (“complete randomness”) being regarded as “trivial” and, in consequence, having complexity C = 0. We show that both H and C display resonant features as a function of the noise intensity: for an optimal level of noise the entropy displays a minimum and the complexity a maximum. This resonant behavior indicates noise-enhanced temporal correlations in the sequence of residence-time intervals. The methodology proposed here has great potential for the precise detection of subtle signatures of noise-induced temporal correlations in real-world complex signals.
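The quantifiers involved can be computed directly from a scalar time series: the Bandt–Pompe distribution of ordinal patterns of order d, its normalized Shannon entropy H, and the Jensen–Shannon statistical complexity C in the Rosso et al. construction. The sketch below is illustrative (embedding delay 1 is assumed, and ties are broken by position):

```python
import math
from itertools import permutations
from collections import Counter

def ordinal_distribution(series, d):
    """Bandt-Pompe probability distribution of ordinal patterns of order d."""
    pats = Counter()
    for i in range(len(series) - d + 1):
        window = series[i:i + d]
        pats[tuple(sorted(range(d), key=window.__getitem__))] += 1
    total = sum(pats.values())
    return [pats.get(p, 0) / total for p in permutations(range(d))]

def shannon(P):
    return -sum(p * math.log(p) for p in P if p > 0)

def normalized_entropy(P):
    """H[P] = S[P] / S_max with S_max = log of the number of patterns."""
    return shannon(P) / math.log(len(P))

def statistical_complexity(P):
    """C[P] = Q_J[P, P_e] * H[P], where Q_J is the Jensen-Shannon divergence
    to the uniform distribution P_e, normalized by its maximum value."""
    n = len(P)
    Pe = [1.0 / n] * n
    M = [(p + q) / 2 for p, q in zip(P, Pe)]
    JS = shannon(M) - 0.5 * shannon(P) - 0.5 * shannon(Pe)
    # maximum of JS, attained for a delta distribution vs. the uniform one
    JS_max = -0.5 * ((n + 1) / n * math.log(n + 1)
                     - 2 * math.log(2 * n) + math.log(n))
    return (JS / JS_max) * normalized_entropy(P)
```

A monotone series yields H = C = 0 (perfect order), while a chaotic series such as the logistic map gives intermediate H with nonzero C, reflecting its correlational structure.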
We study a network of coupled logistic maps whose interactions occur with a certain distribution of delay times. The local dynamics is chaotic in the absence of coupling, and thus the network is a paradigm of a complex system. There are two regimes of synchronization, depending on the distribution of delays: when the delays are sufficiently heterogeneous, the network synchronizes to a steady state (which is unstable for the uncoupled maps); when the delays are homogeneous, it synchronizes to a time-dependent state (either periodic or chaotic). Using two global indicators, we quantify the synchronizability in the two regimes, focusing on the roles of network connectivity and topology. The connectivity is measured in terms of the average number of links per node, and we consider various topologies (scale-free, small-world, star, and nearest-neighbor with and without a central hub). With weak connectivity and weak coupling strength, the network displays an irregular oscillatory dynamics that is largely independent of the topology and of the delay distribution. With heterogeneous delays, we find a threshold connectivity level below which the network does not synchronize, regardless of the network size. This minimum average number of neighbors seems to be independent of the delay distribution. We also analyze the effect of self-feedback loops and find that they have an impact on the synchronizability of small networks with large coupling strengths. The influence of feedback, enhancing or degrading synchronization, depends on the topology and on the distribution of delays.
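One update of a delayed-coupled map network can be sketched as below. This is illustrative, not the authors' code; the diffusive coupling form x_i(t+1) = (1-ε) f(x_i(t)) + (ε/k_i) Σ_j f(x_j(t-τ_ij)) is a standard choice assumed here.

```python
def logistic(x, a=4.0):
    """Local logistic map; chaotic for a = 4."""
    return a * x * (1.0 - x)

def step(history, t, adj, delays, eps, a=4.0):
    """x_i(t+1) = (1-eps)*f(x_i(t)) + (eps/k_i) * sum_j f(x_j(t - tau_ij)).
    history[s] is the list of node states at time s; adj[i] lists the
    neighbors of node i, and delays[i][k] is the delay on its k-th link."""
    new = []
    for i, nbrs in enumerate(adj):
        local = logistic(history[t][i], a)
        if nbrs:
            coupling = sum(logistic(history[t - delays[i][k]][j], a)
                           for k, j in enumerate(nbrs)) / len(nbrs)
            new.append((1.0 - eps) * local + eps * coupling)
        else:
            new.append(local)
    return new
```

With identical initial histories and symmetric delays, two mutually coupled maps remain exactly synchronized, which is the invariance underlying the synchronization manifold studied above.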
Word frequencies in human language follow the so-called Zipf's law. More precisely, the word frequency spectrum follows a power law whose typical exponent is β ≈ 2, although significant variations are found. We hypothesize that the full range of variation reflects our ability to balance the goal of communication, i.e. maximizing information transfer, against the cost of communication imposed by the limitations of the human brain. We show that the higher the importance of satisfying the goal of communication, the higher the exponent. Here, assuming that words are used according to their meaning, we explain why variation in β should be limited to a particular domain. On the one hand, we derive a non-trivial lower bound at about β = 1.6 for communication systems neglecting the goal of communication. On the other hand, we find a sudden divergence of β if a certain critical balance is crossed, accompanied by a sharp transition to maximum information transfer and, unfortunately, maximum communication cost. Consistently with the upper bound of real exponents, the maximum finite value predicted is about β = 2.4. It is convenient for human language not to cross the transition and to remain in a domain where information transfer is high but at a reasonable cost. Therefore, only a particular range of exponents should be found in human speakers. The exponent β contains information about the balance between cost and communicative efficiency.
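The exponent β of the frequency spectrum N(f) ∝ f^(−β) can be estimated from a list of word frequencies. Below is a minimal least-squares sketch on log-log axes (illustrative only; careful estimates would prefer maximum-likelihood fitting over this simple regression):

```python
import math
from collections import Counter

def spectrum_exponent(frequencies):
    """Estimate beta from the frequency spectrum: N(f) counts the distinct
    words occurring exactly f times, and beta is minus the least-squares
    slope of log N(f) versus log f."""
    spec = Counter(frequencies)               # f -> N(f)
    xs = [math.log(f) for f in spec]
    ys = [math.log(c) for c in spec.values()]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    return -slope
```

On a synthetic spectrum built with N(f) ∝ f^(−2), the estimator recovers β close to the typical Zipfian value of 2 quoted above.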