We show that distributed detection over random networks, or with a randomized protocol such as gossip, is asymptotically optimal if the rate of information flow across the random network is large enough. Asymptotic optimality is in the sense of Chernoff information; in other words, we determine when the exponential rate of decay of the error probability for distributed detection is the best possible and equals the decay rate of the best centralized detector. The rate of information flow is defined by |log r|, where r is the second largest eigenvalue of the second moment of the random consensus weight matrix. We quantify interesting tradeoffs in distributed detection between the rate of information flow and the achievable detection performance.
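To make the quantity |log r| concrete, the following is a minimal sketch, assuming pairwise gossip as the random protocol: it samples random weight matrices W, estimates the second moment E[W^2] by Monte Carlo, and takes the second largest eigenvalue r of that matrix. The function names (gossip_weight_matrix, information_flow_rate) and the parameters (network size n, number of samples) are illustrative assumptions, not part of the paper.

```python
import numpy as np

def gossip_weight_matrix(n, rng):
    """Sample one pairwise-gossip weight matrix: a random pair of nodes averages its values."""
    i, j = rng.choice(n, size=2, replace=False)
    W = np.eye(n)
    W[i, i] = W[j, j] = 0.5
    W[i, j] = W[j, i] = 0.5
    return W

def information_flow_rate(n=10, num_samples=20000, seed=0):
    """Monte Carlo estimate of |log r|, where r is the second largest
    eigenvalue of the second moment E[W^2] of the random weight matrix."""
    rng = np.random.default_rng(seed)
    second_moment = np.zeros((n, n))
    for _ in range(num_samples):
        W = gossip_weight_matrix(n, rng)
        second_moment += W @ W
    second_moment /= num_samples
    eigvals = np.sort(np.linalg.eigvalsh(second_moment))[::-1]
    r = eigvals[1]  # the largest eigenvalue is 1, along the consensus direction
    return abs(np.log(r)), r

rate, r = information_flow_rate()
print(f"r = {r:.4f}, |log r| = {rate:.4f}")
```

A larger |log r| corresponds to faster averaging of the random protocol, i.e., a higher rate of information flow across the network.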