Communication complexity is a central model of computation, introduced by Yao in 1979, where two players, Alice and Bob, receive inputs x and y respectively and want to compute f(x, y) for some fixed function f with the least amount of communication. Recently, the question of the privacy of such protocols has been revisited: is it possible for Alice and Bob to compute f(x, y) without revealing too much information about their inputs? Two types of privacy for communication protocols have been proposed: first, an information-theoretic definition (by Bar-Yehuda et al. and Klauck), which for Boolean functions is equivalent to the notion of information cost introduced by Chakrabarti et al. in 2001 and which has since found many important applications; second, a combinatorial definition introduced by Feigenbaum et al. in 2010 and further developed by Ada et al. in 2012.

We provide new results for both notions of privacy, as well as for the relation between them. Our new lower-bound techniques, for both the combinatorial and the information-theoretic definitions, enable us to give tight bounds on the privacy of several functions, including Equality, Disjointness, Inner Product, and Greater Than. In the process we also prove tight bounds (up to 1 or 2 additive bits) on the external information complexity of these functions.

We also extend the definitions of privacy to bounded-error randomized protocols and relate the two notions to communication complexity. Again, we are able to prove tight bounds for the above-mentioned functions, as well as for the Vector in Subspace and Gap Hamming Distance problems.
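For readers unfamiliar with the external information complexity mentioned above, the standard definition (following Chakrabarti et al. and subsequent work; the abstract itself does not spell it out) is, for inputs (X, Y) drawn from a distribution \(\mu\) and a protocol \(\pi\) with transcript \(\Pi\),

\[
\mathrm{IC}^{\mathrm{ext}}_{\mu}(\pi) \;=\; I\big(XY \,:\, \Pi(X,Y)\big),
\]

the mutual information between the players' inputs and the transcript as seen by an external observer; the external information complexity of a function f is then obtained by minimizing this quantity over all protocols \(\pi\) that compute f (with the appropriate error guarantee).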