Abstract: Agents in a network want to learn the true state of the world from their own signals and their neighbors' reports. Agents know only their local networks, consisting of their neighbors and the links among them. Every agent is Bayesian with the (possibly misspecified) prior belief that her local network is the entire network. We present a tractable learning rule to implement such locally Bayesian learning: each agent extracts new information using the full history of observed reports in her local network. Despite their limited network knowledge, agents learn correctly when the network is a social quilt, a tree-like union of cliques. But they fail to learn when a network contains interlinked circles (echo chambers), despite an arbitrarily large number of correct signals.
Keywords: Locally Bayesian learning; rational learning with misspecified priors; efficient learning in finite networks