We first analyze Trickle using an idealized single-cell network model with perfect synchronization and no packet loss. Progressively relaxing these assumptions, we evaluate the algorithm in simulation, first without synchronization, then in the presence of loss, and finally in the multi-cell case. We validate these simulation results against empirical data from a real-world deployment. We show that Trickle scales well, with the aggregate network transmission count increasing logarithmically with cell density. We also show that by dynamically adjusting listening periods, Trickle can propagate new code in a matter of seconds while keeping maintenance costs to a few sends per hour per node.
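The dynamic adjustment of listening periods mentioned above can be illustrated with a minimal sketch of a single node's Trickle timer. The constants below (TAU_LOW, TAU_HIGH, and the redundancy constant K) are illustrative assumptions, not values from the deployment; the sketch only shows the general pattern of shrinking the interval on new code and doubling it during steady state.

```python
import random

# Hypothetical parameters for illustration only; real deployments tune these.
TAU_LOW = 1.0    # smallest interval, in seconds (fast propagation)
TAU_HIGH = 60.0  # largest interval, in seconds (cheap maintenance)
K = 1            # redundancy constant: suppress after K consistent overhears

class TrickleTimer:
    """Sketch of one node's Trickle interval adjustment."""

    def __init__(self):
        self.tau = TAU_LOW
        self._new_interval()

    def _new_interval(self):
        self.counter = 0
        # Schedule the (possible) transmission in the second half of the
        # interval, so the node listens before it talks.
        self.t = random.uniform(self.tau / 2, self.tau)

    def hear_consistent(self):
        # Overhearing an identical code summary increments the counter,
        # which may suppress this node's own transmission.
        self.counter += 1

    def hear_inconsistent(self):
        # New code detected: shrink the interval so updates spread quickly.
        self.tau = TAU_LOW
        self._new_interval()

    def interval_expired(self):
        # Transmit only if fewer than K consistent summaries were overheard.
        should_send = self.counter < K
        # Double the interval, up to TAU_HIGH, for low steady-state cost.
        self.tau = min(2 * self.tau, TAU_HIGH)
        self._new_interval()
        return should_send
```

In steady state every node's interval climbs toward TAU_HIGH and most transmissions are suppressed, which is why maintenance costs stay at a few sends per hour; when an inconsistency appears, the interval collapses to TAU_LOW and propagation completes in seconds.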