Abstract

In analyzing a complex network in the real world, identifying its universality class is of great help. Biological networks, in particular, grow under various ‘learning rules’, but the impact of these rules on scaling has not yet been sufficiently characterized. Here we applied the Hodge-Kodaira decomposition, a topological method for counting global loops, to neural networks grown under different learning rules and with different edge densities. Interestingly, the networks that evolved under different learning rules showed different scaling with edge density. The network grown under the causal learning rule scaled similarly to its underlying graph (an Erdős-Rényi random graph in this study), on which the network can grow, whereas the network grown under the Hebbian-like rule did not.
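As context for the method named above, the sketch below illustrates, under stated assumptions, how a combinatorial Hodge(-Kodaira) decomposition splits an edge flow on a graph into gradient, curl, and harmonic parts, and how the dimension of the harmonic subspace counts global loops (cycles not filled by triangles). This is not the paper's pipeline; the function name `hodge_decomposition`, the use of NumPy, and the convention that 3-cliques are treated as filled 2-cells are illustrative assumptions.

```python
# Minimal sketch (not the paper's code) of the combinatorial Hodge(-Kodaira)
# decomposition of an edge flow, assuming the 2-cells are the graph's
# triangles (3-cliques). The harmonic component is the part of the flow
# circulating around global loops, i.e. cycles not filled in by triangles.
import numpy as np
from itertools import combinations


def hodge_decomposition(nodes, edges, flow):
    """Split an edge flow into gradient + curl + harmonic components.

    nodes : list of node labels
    edges : list of oriented edges (u, v)
    flow  : array of flow values, positive along the orientation u -> v
    """
    n_idx = {v: i for i, v in enumerate(nodes)}
    e_idx = {e: k for k, e in enumerate(edges)}

    # B1: node-by-edge incidence matrix (boundary operator on edges).
    B1 = np.zeros((len(nodes), len(edges)))
    for k, (u, v) in enumerate(edges):
        B1[n_idx[u], k] = -1.0
        B1[n_idx[v], k] = 1.0

    # Triangles (3-cliques) of the underlying undirected graph.
    und = {frozenset(e) for e in edges}
    tris = [t for t in combinations(nodes, 3)
            if all(frozenset(p) in und for p in combinations(t, 2))]

    # B2: edge-by-triangle incidence matrix (boundary operator on triangles).
    B2 = np.zeros((len(edges), len(tris)))
    for t_k, (a, b, c) in enumerate(tris):
        # Oriented boundary of [a, b, c] is [b, c] - [a, c] + [a, b].
        for (u, v), sgn in (((b, c), 1.0), ((a, c), -1.0), ((a, b), 1.0)):
            if (u, v) in e_idx:
                B2[e_idx[(u, v)], t_k] = sgn
            else:                        # edge stored with opposite orientation
                B2[e_idx[(v, u)], t_k] = -sgn

    # Gradient component: projection onto flows induced by node potentials.
    s, *_ = np.linalg.lstsq(B1.T, flow, rcond=None)
    grad = B1.T @ s

    # Curl component: projection onto flows circulating inside triangles.
    if tris:
        phi, *_ = np.linalg.lstsq(B2, flow - grad, rcond=None)
        curl = B2 @ phi
    else:
        curl = np.zeros_like(flow)

    harmonic = flow - grad - curl        # flow around global loops

    # Number of global loops = dim ker(L1), with L1 the Hodge 1-Laplacian.
    L1 = B1.T @ B1 + B2 @ B2.T
    n_loops = len(edges) - np.linalg.matrix_rank(L1)
    return grad, curl, harmonic, n_loops


# Example: a 4-cycle contains one global loop (no triangle fills it), so a
# pure circulation around it is entirely harmonic.
nodes = [0, 1, 2, 3]
edges = [(0, 1), (1, 2), (2, 3), (3, 0)]
flow = np.array([1.0, 1.0, 1.0, 1.0])
grad, curl, harm, n_loops = hodge_decomposition(nodes, edges, flow)
print(n_loops)            # -> 1
print(np.round(harm, 3))  # -> [1. 1. 1. 1.]
```

The example uses dense least squares for clarity; for the network sizes and edge densities discussed here, a sparse formulation of the same operators would be the natural choice.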