In order to put our results in context, we provide a detailed taxonomy of the known results on AVCs in a unified setting. We then prove new results on list decoding with constrained states, a relaxation of the main problem in which the receiver may output a short list of possible messages. In particular, we show that constant list sizes can achieve capacity under an average-error criterion, and that a list of size L can achieve rates within O(1/L) of the capacity under a maximal-error criterion, complementing the known results for unconstrained state sequences.
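Schematically, and in notation of our own choosing (not fixed by the text above), writing $C$ for the capacity of the state-constrained AVC and $C_L$ for the best rate achievable with lists of size $L$, the two claims take the form
\[
C_L^{\mathrm{avg}} = C \quad \text{for some constant } L, \qquad C_L^{\mathrm{max}} \;\ge\; C - O(1/L),
\]
where the superscripts indicate the average- and maximal-error criteria, respectively.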
If the encoder and decoder share a secret key, they can use a randomized code to make their communication more robust. An important practical consideration in using common randomness for communication schemes is the tradeoff between key size and error probability. Inspired by ad-hoc networks, we propose a new AVC model called the AVC with "nosy noise," in which the jammer can observe the transmitted codeword non-causally. We show that a key size of O(\log n) bits is sufficient to achieve capacity for codes of blocklength n in this model, as well as for the standard AVC. If a secure feedback channel is available, the key can be shared via feedback. Limited feedback can also be used to adapt the rate to the actual channel state sequence. We develop an AVC framework for rateless coding and exhibit schemes that achieve rates arbitrarily close to the empirical mutual information.
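To make the rateless target concrete (the notation here is an assumption of ours, not drawn from the text): if the codeword symbols are drawn from an input distribution $P$ and the jammer's realized state sequence $s^n$ has empirical distribution (type) $\hat{q}$, the empirical mutual information is the mutual information of the state-averaged channel
\[
I\big(P, W_{\hat{q}}\big), \qquad W_{\hat{q}}(y \mid x) \;=\; \sum_{s} \hat{q}(s)\, W(y \mid x, s),
\]
so the achievable rate adapts to the state sequence the jammer actually chose, rather than to the worst case.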
Finally, we address the Gaussian version of the AVC, where we show that a key size of O(\log n) bits is again sufficient to achieve capacity. This result allows us to find an achievable rate region for degraded broadcast channels. In the case where randomized coding is infeasible, we show how a known interference signal at the transmitter can enlarge the capacity region. This result has applications to watermarking and to a model for spectrum-sharing communication systems.
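For reference, the randomized-coding capacity of the Gaussian AVC is the classical expression (with our notation, not the text's: input power constraint $P$, jammer power constraint $\Lambda$, and ambient noise variance $\sigma^2$):
\[
C \;=\; \frac{1}{2}\log\!\left(1 + \frac{P}{\sigma^2 + \Lambda}\right),
\]
i.e., the jammer's power budget simply adds to the noise; the result above says this rate is attainable with only O(\log n) bits of shared key.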