In this paper we study interactive ``one-shot'' analogues of the classical Slepian-Wolf theorem. Alice receives a value of a random variable $X$, Bob receives a value of another random variable $Y$ that is jointly distributed with $X$. Alice's goal is to transmit $X$ to Bob (with some error probability $\varepsilon$). Instead of one-way transmission, which is studied in classical coding theory, we allow them to interact. They may also use shared randomness.
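Formally, one natural way to state the task is the following (the notation $\pi$, $R$, $\hat{X}$ is introduced here only for concreteness): Alice and Bob run a (possibly many-round) protocol $\pi$ with shared randomness $R$; after the interaction Bob outputs an estimate $\hat{X}$, computed from $Y$, $R$ and the transcript, and we require
\[
\Pr\left[\hat{X} \neq X\right] \leq \varepsilon,
\]
where the cost of $\pi$ is its expected transcript length $\mathbb{E}\,|\pi(X, Y, R)|$.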
We show that Alice can transmit $X$ to Bob using, on average, $H(X|Y) + 2\sqrt{H(X|Y)} + O\left(\log_2\frac{1}{\varepsilon}\right)$ bits. Moreover, we show that every one-round protocol with information complexity $I$ can be compressed to a (many-round) protocol with expected communication about $I + 2\sqrt{I}$ bits. This improves a result by Braverman and Rao \cite{braverman2011information}, where the corresponding term was $5\sqrt{I}$. Further, we show how to solve this problem (transmitting $X$) using $3H(X|Y) + O\left(\log_2\frac{1}{\varepsilon}\right)$ bits and 4 rounds on average. This improves a result of~\cite{brody2013towards}, where they had $4H(X|Y) + O\left(\log\frac{1}{\varepsilon}\right)$ bits and 10 rounds on average.
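For intuition, consider an illustrative instantiation (ours, not taken from the cited works): with $H(X|Y) = 100$ and the $O\left(\log_2\frac{1}{\varepsilon}\right)$ terms suppressed, the bounds above give
\[
H(X|Y) + 2\sqrt{H(X|Y)} = 120, \qquad 3H(X|Y) = 300, \qquad 4H(X|Y) = 400,
\]
and, for a one-round protocol with information complexity $I = 100$, compression costs about $I + 2\sqrt{I} = 120$ bits here versus $I + 5\sqrt{I} = 150$ bits via \cite{braverman2011information}.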
At the end of the paper we discuss how many bits Alice and Bob may need to communicate on average beyond $H(X|Y)$. The main question is whether the upper bounds mentioned above are tight. We provide an example of $(X, Y)$ such that transmission of $X$ from Alice to Bob with error probability $\varepsilon$ requires $H(X|Y) + \sqrt{\log_2\frac{1}{\varepsilon}}$ bits on average.
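Juxtaposing this example with the first upper bound above, for this pair $(X, Y)$ we get
\[
H(X|Y) + \sqrt{\log_2\tfrac{1}{\varepsilon}} \;\leq\; \text{expected communication} \;\leq\; H(X|Y) + 2\sqrt{H(X|Y)} + O\left(\log_2\tfrac{1}{\varepsilon}\right),
\]
so some additive dependence on $\varepsilon$ is unavoidable, whereas this particular example says nothing about the necessity of the $2\sqrt{H(X|Y)}$ term.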