Abstract: We consider the open issue of how the energy efficiency of neural information transmission in a general neuronal array constrains the network size, and how well this network size ensures reliable transmission of neural information in a noisy environment. By direct mathematical analysis, we obtain general solutions showing that there exists an optimal number of neurons in the network, at which the average coding energy cost (defined as energy consumption divided by mutual information) per neuron passes through a global minimum for both subthreshold and suprathreshold signals. As background noise intensity increases, the optimal neuronal number decreases for subthreshold signals and increases for suprathreshold signals. The existence of an optimal number of neurons in an array network reveals a general rule for population coding: the neuronal number should be large enough to ensure reliable information transmission that is robust to background noise, yet small enough to minimize energy cost.
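To make the optimized quantity concrete, one plausible reading of the definition stated above, written with illustrative symbols $E(N)$ for the total energy consumed by an array of $N$ neurons and $I(N)$ for the mutual information between the input signal and the population response (these symbols are ours, not necessarily the paper's own notation), is
\[
  \varepsilon(N) \;=\; \frac{E(N)}{N\,I(N)}, \qquad
  N^{*} \;=\; \operatorname*{arg\,min}_{N \ge 1}\, \varepsilon(N),
\]
so that the per-neuron energy cost of coding, $\varepsilon(N)$, attains its global minimum at the optimal neuronal number $N^{*}$ for both subthreshold and suprathreshold signals.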