Affiliation
  • Department of Electrical Communication Engineering, Indian Institute of Science, Bangalore
Contact
  • Phone: +91 80 2293-2277
  • Email: htyagi@ece.iisc.ernet.in
Academic Experience
  • Postdoc at the ITA Center, UCSD (2014)
  • Ph.D. in ECE, UMD (2013)
  • Dual degree in EE, IIT Delhi (2007)

Research Interests

My research interests lie broadly in the information theoretic analysis of emerging problems in computer science, communication theory, and statistical learning. In particular, I have focused on applications of information theory to security, interactive communication, and statistical learning. Lately, I have been interested in practical data exchange protocols, compression of interactive communication, secure computing, distributed inference, and distributed algorithms for communication over MANETs.


Courses


Group

  • Raghava GD (Ph.D. student, Aug 2015-)
  • Sahasranand KR (Ph.D. student, Aug 2015-)
  • Lekshmi Ramesh (Ph.D. student, Aug 2015-) [Jointly advised with Chandra Murthy]
  • Rohit Chatterjee (UG student, Dec 2015 - Jun 2016) [Jointly advised with Bhavana Kanukurthi]
  • Srikanth Pai (Postdoctoral researcher, Dec 2015 - Oct 2016)

Publications

Preprints

  • With S. Venkatakrishnan, P. Viswanath, and S. Watanabe, Information Complexity Density and Simulation of Protocols. On arXiv.
    [Abstract] [pdf]

    A simulation of an interactive protocol entails the use of interactive communication to produce the output of the protocol to within a fixed statistical distance $\epsilon$. Recent works have proposed that the information complexity of the protocol plays a central role in characterizing the minimum number of bits that the parties must exchange for a successful simulation, namely the distributional communication complexity of simulating the protocol. Several simulation protocols have been proposed with communication complexity depending on the information complexity of the simulated protocol. However, in the absence of any general lower bounds for distributional communication complexity, the conjectured central role of information complexity is far from settled. We fill this gap and show that the distributional communication complexity of $\epsilon$-simulating a protocol is bounded below by the $\epsilon$-tail $\lambda_\epsilon$ of the information complexity density, a random variable with information complexity as its expected value. For protocols with bounded number of rounds, we give a simulation protocol that yields a matching upper bound. Thus, it is not information complexity but $\lambda_\epsilon$ that governs the distributional communication complexity. As applications of our bounds, in the amortized regime for product protocols, we identify the exact second order term, together with the precise dependence on $\epsilon$. For general protocols such as a mixture of two product protocols or for the amortized case when the repetitions are not independent, we derive a general formula for the leading asymptotic term. These results sharpen and significantly extend known results in the amortized regime. In the single-shot regime, our lower bound clarifies the dependence of communication complexity on $\epsilon$. We illustrate this with an example that exhibits an arbitrary separation between distributional communication complexity and information complexity for all sufficiently small $\epsilon$.
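
    For context (this is a reading of the standard definitions, not a quotation from the paper): for a protocol $\pi$ with inputs $(X, Y)$ and transcript $\Pi$, the internal information complexity and the information complexity density referred to above can be written as
    $$ \mathrm{IC}(\pi) = I(\Pi \wedge X \mid Y) + I(\Pi \wedge Y \mid X), \qquad \mathrm{ic}(\pi) = \log\frac{P_{\Pi \mid XY}}{P_{\Pi \mid Y}} + \log\frac{P_{\Pi \mid XY}}{P_{\Pi \mid X}}, $$
    so that $\mathbb{E}[\mathrm{ic}(\pi)] = \mathrm{IC}(\pi)$ and $\lambda_\epsilon$ is the $\epsilon$-tail of $\mathrm{ic}(\pi)$.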

  • With P. Viswanath and S. Watanabe, Interactive Communication for Data Exchange. On arXiv.
    [Abstract] [pdf]

    Two parties observing correlated data seek to exchange their data using interactive communication. How many bits must they communicate? We propose a new interactive protocol for data exchange which increases the communication size in steps until the task is done. We also derive a lower bound on the minimum number of bits that is based on relating the data exchange problem to the secret key agreement problem. Our single-shot analysis applies to all discrete random variables and yields upper and lower bounds of a similar form. In fact, the bounds are asymptotically tight and lead to a characterization of the optimal rate of communication needed for data exchange for a general source sequence such as a mixture of IID random variables as well as the optimal second-order asymptotic term in the length of communication needed for data exchange for IID random variables, when the probability of error is fixed. This gives a precise characterization of the asymptotic reduction in the length of optimal communication due to interaction; in particular, two-sided Slepian-Wolf compression is strictly suboptimal.
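
    To make the "increase the communication size in steps" idea concrete, the following toy Python sketch shows a one-sided version under simplifying assumptions (party 1's data is a 64-bit integer, party 2 knows the joint pmf, and the hash bits are random parities); the function name and interface are hypothetical, and the sketch illustrates only the flavour of incremental hashing, not the protocol or the bounds analysed in the paper.

        import random

        def incremental_exchange(x, y, support, joint_pmf, step_bits=2, max_bits=64):
            # Party 1 reveals random parity (hash) bits of x a few at a time; party 2,
            # which observes y and knows the joint pmf, keeps the candidates consistent
            # with y and with every parity revealed so far, and stops once a single
            # candidate remains.
            masks, bits = [], []
            while len(bits) < max_bits:
                for _ in range(step_bits):
                    m = random.getrandbits(64)
                    masks.append(m)
                    bits.append(bin(m & x).count("1") % 2)
                candidates = [
                    xp for xp in support
                    if joint_pmf.get((xp, y), 0.0) > 0.0
                    and all(bin(m & xp).count("1") % 2 == b for m, b in zip(masks, bits))
                ]
                if len(candidates) == 1:
                    return candidates[0], len(bits)      # decoded value, bits spent
            return None, len(bits)

        # Example: X on {0,...,7}, Y within distance 1 of X
        # (unnormalized weights; only the support matters for this toy decoder).
        pmf = {(xv, yv): 1.0 for xv in range(8) for yv in range(8) if abs(xv - yv) <= 1}
        print(incremental_exchange(5, 5, list(range(8)), pmf))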

Monographs

Journal Papers

  • With S. Watanabe, Universal Multiparty Data Exchange and Secret Key Agreement. To appear, IEEE Transactions on Information Theory, 2017.
    [Abstract] [pdf]

    Multiple parties observing correlated data seek to recover each other’s data and attain omniscience. To that end, they communicate interactively over a noiseless broadcast channel: Each bit transmitted over this channel is received by all the parties. We give a universal interactive communication protocol for omniscience which requires communication of rate only $\mathcal{O}\left(n^{-1/2}\sqrt{\log n}\right)$ more than the optimal rate for every independent and identically distributed (in time) sequence of data. Drawing on the duality between omniscience and secret key agreement due to Csiszár and Narayan, as a by-product, we obtain a universal protocol for generating a multiparty secret key of rate at most $\mathcal{O}\left(n^{-1/2}\sqrt{\log n}\right)$ less than the maximum rate possible.

  • With J. Acharya, A. Orlitsky, and A. T. Suresh, Estimating Rényi Entropy of Discrete Distributions. IEEE Transactions on Information Theory, vol. 63, no. 1, 2017.
    [Abstract] [pdf]

    It was shown recently that estimating the Shannon entropy $H(p)$ of a discrete $k$-symbol distribution $p$ requires $\Theta(k/\log k)$ samples, a number that grows near-linearly in the support size. In many applications $H(p)$ can be replaced by the more general Rényi entropy of order $\alpha$, $H_\alpha(p)$. We determine the number of samples needed to estimate $H_\alpha(p)$ for all $\alpha$, showing that $\alpha < 1$ requires a super-linear, roughly $k^{1/\alpha}$ samples, noninteger $\alpha>1$ requires a near-linear $k$ samples, but, perhaps surprisingly, integer $\alpha>1$ requires only $\Theta(k^{1-1/\alpha})$ samples. Furthermore, developing on a recently established connection between polynomial approximation and estimation of additive functions of the form $\sum_x f(p_x)$, we reduce the sample complexity for noninteger values of $\alpha$ by a factor of $\log k$ compared to the empirical estimator. The estimators achieving these bounds are simple and run in time linear in the number of samples. Our lower bounds provide explicit constructions of distributions with different Rényi entropies that are hard to distinguish.
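
    As a hedged illustration of the integer-order case, the following Python sketch implements the natural bias-corrected estimator based on falling factorials of the empirical counts, which is unbiased for the power sum $\sum_x p_x^\alpha$ when $\alpha$ is an integer; the estimators and sample-complexity bounds in the paper are more refined.

        import math
        import random
        from collections import Counter

        def renyi_entropy_est(samples, alpha):
            # Estimate H_alpha = log(sum_x p_x^alpha) / (1 - alpha) (in nats) for an
            # integer alpha >= 2, using the power-sum estimator
            # sum_x (N_x)_alpha / (n)_alpha, where (m)_a = m(m-1)...(m-a+1) is the
            # falling factorial and N_x are the empirical counts.
            assert isinstance(alpha, int) and alpha >= 2
            n = len(samples)
            counts = Counter(samples)
            falling = lambda m, a: math.prod(m - i for i in range(a))
            p_alpha = sum(falling(c, alpha) for c in counts.values()) / falling(n, alpha)
            return math.log(max(p_alpha, 1e-12)) / (1 - alpha)

        # Example: collision entropy (alpha = 2) of a biased coin from 10000 tosses;
        # the true value is -log(0.7**2 + 0.3**2) ~ 0.545 nats.
        sample = [0 if random.random() < 0.7 else 1 for _ in range(10000)]
        print(renyi_entropy_est(sample, 2))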

  • With M. Hayashi and S. Watanabe, Secret Key Agreement: General Capacity and Second-Order Asymptotics. IEEE Transactions on Information Theory, vol. 62, no. 7, 2016.
    [Abstract] [pdf]

    We revisit the problem of secret key agreement using interactive public communication for two parties and propose a new secret key agreement protocol. The protocol attains the secret key capacity for general observations and attains the second-order asymptotic term in the maximum length of a secret key for independent and identically distributed observations. In contrast to the previously suggested secret key agreement protocols, the proposed protocol uses interactive communication. In fact, the standard one-way communication protocol used prior to this work fails to attain the asymptotic results above. Our converse proofs rely on a recently established upper bound for secret key lengths. Both our lower and upper bounds are derived in a single-shot setup and the asymptotic results are obtained as corollaries.

  • With A. Vardy, Universal Hashing for Information Theoretic Security (review article). Proceedings of the IEEE, vol. 103, no. 10, 2015.
    [Abstract] [pdf]

    The information theoretic approach to security entails harnessing the correlated randomness available in nature to establish security. It uses tools from information theory and coding and yields provable security, even against an adversary with unbounded computational power. However, the feasibility of this approach in practice depends on the development of efficiently implementable schemes. In this article, we review a special class of practical schemes for information theoretic security that are based on 2-universal hash families. Specific cases of secret key agreement and wiretap coding are considered, and general themes are identified. The scheme presented for wiretap coding is modular and can be implemented easily by including an extra pre-processing layer over the existing transmission codes.
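
    For concreteness, here is a Python sketch of one standard 2-universal family (the Carter-Wegman construction); it is meant only to illustrate the kind of hash families such schemes build on, and the specific constructions, parameters, and security arguments are in the article.

        import random

        def sample_carter_wegman(p, m):
            # Random member of the family h_{a,b}(x) = ((a*x + b) mod p) mod m, which is
            # 2-universal on {0, ..., p-1} when p is prime: for x != x',
            # Pr[h(x) = h(x')] <= 1/m over the random choice of (a, b).
            a = random.randrange(1, p)
            b = random.randrange(0, p)
            return lambda x: ((a * x + b) % p) % m

        # Privacy-amplification flavour: both legitimate parties apply the same random
        # hash to a shared value about which the eavesdropper has limited knowledge;
        # the leftover hash lemma bounds what the eavesdropper learns about the output.
        p = 2**61 - 1                        # a prime larger than the input alphabet
        h = sample_carter_wegman(p, 2**16)   # extract a 16-bit key
        print(h(123456789))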

  • With S. Watanabe, Converses for Secret Key Agreement and Secure Computing. IEEE Transactions on Information Theory, vol. 61, no. 9, 2015.
    [Abstract] [pdf]

    We consider information theoretic secret key agreement and secure function computation by multiple parties observing correlated data, with access to an interactive public communication channel. Our main result is an upper bound on the secret key length, which is derived using a reduction of binary hypothesis testing to multiparty secret key agreement. Building on this basic result, we derive new converses for multiparty secret key agreement. Furthermore, we derive converse results for the oblivious transfer problem and the bit commitment problem by relating them to secret key agreement. Finally, we derive a necessary condition for the feasibility of secure computing by trusted parties that seek to compute a function of their collective data, using interactive public communication that by itself does not give away the value of the function. In many cases, we strengthen and improve upon previously known converse bounds. Our results are single-shot and do not assume that the observations are independent and identically distributed. For the case when the observations are indeed independent and identically distributed, we derive strong versions of previously known converses.

  • With P. Narayan, How Many Queries will Resolve Common Randomness? IEEE Transactions on Information Theory, vol. 59, no. 9, 2013.
    [Abstract] [pdf]

    A set of m terminals, observing correlated signals, communicate interactively to generate common randomness for a given subset of them. Knowing only the communication, how many direct queries of the value of the common randomness will resolve it? A general upper bound, valid for arbitrary signal alphabets, is developed for the number of such queries by using a query strategy that applies to all common randomness and associated communication. When the underlying signals are independent and identically distributed repetitions of m correlated random variables, the number of queries can be exponential in signal length. For this case, the mentioned upper bound is tight and leads to a single-letter formula for the largest query exponent, which coincides with the secret key capacity of a corresponding multiterminal source model. In fact, the upper bound constitutes a strong converse for the optimum query exponent, and implies also a new strong converse for secret key capacity. A key tool, estimating the size of a large probability set in terms of Rényi entropy, is interpreted separately also as a lossless block coding result for general sources. As a particularization, it yields the classic result for a discrete memoryless source.

  • Common Information and Secret Key Capacity. IEEE Transactions on Information Theory, vol. 59, no. 9, 2013.
    [Abstract] [pdf]

    We study the generation of a secret key of maximum rate by a pair of terminals observing correlated sources and with the means to communicate over a noiseless public communication channel. Our main result establishes a structural equivalence between the generation of a maximum rate secret key and the generation of a common randomness that renders the observations of the two terminals conditionally independent. The minimum rate of such common randomness, termed interactive common information, is related to Wyner's notion of common information, and serves to characterize the minimum rate of interactive public communication required to generate an optimum rate secret key. This characterization yields a single-letter expression for the aforementioned communication rate when the number of rounds of interaction is bounded. An application of our results shows that interaction does not reduce this rate for binary symmetric sources. Further, we provide an example for which interaction does reduce the rate of communication above. Also, certain invariance properties of common information quantities are established that may be of independent interest.
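
    For reference (as context, not a quotation from the paper), Wyner's common information of a pair $(X, Y)$ is
    $$ C_{\mathrm{W}}(X; Y) = \min_{W \,:\, X - W - Y} I(X, Y \wedge W), $$
    the minimum being over auxiliary random variables $W$ that render $X$ and $Y$ conditionally independent; the interactive common information defined in the paper is a variant of this quantity adapted to interactive communication.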

  • Distributed Function Computation with Confidentiality. IEEE JSAC: In-Network Function Computation, April 2013.
    [Abstract] [pdf]

    A set of terminals observe correlated data and seek to compute functions of the data using interactive public communication. At the same time, it is required that the value of a private function of the data remains concealed from an eavesdropper observing this communication. In general, the private function and the functions computed by the nodes can be all different. We show that a class of functions are securely computable if and only if the conditional entropy of the data given the value of the private function is greater than the least rate of interactive communication required for a related multiterminal source-coding task. A single-letter formula is provided for this rate in special cases.

  • With P. Narayan, State Dependent Channels: Strong Converse and Bounds on Reliability Function. Excursions in Harmonic Analysis: Applied and Numerical Harmonic Analysis, Springer, 2013.
    [Abstract] [pdf]

    We consider an information theoretic model of a communication channel with a time-varying probability law. Specifically, our model consists of a state dependent discrete memoryless channel, in which the underlying state process is independent and identically distributed with known probability distribution, and for which the channel output at any time instant depends on the inputs and states only through their current values. For this channel, we provide a strong converse result for its capacity, explaining the structure of optimal transmission codes. Exploiting this structure, we obtain upper bounds for the reliability function when the transmitter is provided channel state information causally and noncausally. Instrumental to our proofs is a new technical result which provides an upper bound on the rate of codes with codewords that are "conditionally typical over large message dependent subsets of a typical set of state sequences." This technical result is a nonstraightforward extension of an analogous result for a discrete memoryless channel without states; the latter provides a bound on the rate of a good code with codewords of a fixed composition.

  • With P. Narayan and P. Gupta, When is a Function Securely Computable? IEEE Transactions on Information Theory, vol. 57, no. 10, 2011.
    [Abstract] [pdf]

    A subset of a set of terminals that observe correlated signals seek to compute a function of the signals using public communication. It is required that the value of the function be concealed from an eavesdropper with access to the communication. We show that the function is securely computable if and only if its entropy is less than the capacity of a new secrecy generation model, for which a single-letter characterization is provided.

  • With A. Tripathi, A Simple Criterion on Degree Sequences of Graphs. Discrete Applied Mathematics, vol. 156, no. 18, pp. 3513-3517, 2008.
    [Abstract] [pdf]

    A finite sequence of nonnegative integers is called graphic if the terms in the sequence can be realized as the degrees of vertices of a finite simple graph. We present two new characterizations of graphic sequences. The first of these is similar to a result of Havel and Hakimi, and the second is equivalent to a result of Erdős and Gallai, thus providing a short proof of the latter result. We also show how some known results concerning degree sets and degree sequences follow from our results.
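
    As a hedged illustration (this codes the classical Erdős-Gallai inequalities, not the new characterizations proposed in the paper), a graphicality test in Python:

        def is_graphic(degrees):
            # Erdos-Gallai: a nonincreasing sequence d_1 >= ... >= d_n of nonnegative
            # integers is graphic iff its sum is even and, for every k,
            # sum_{i<=k} d_i <= k(k-1) + sum_{i>k} min(d_i, k).
            d = sorted(degrees, reverse=True)
            if any(x < 0 for x in d) or sum(d) % 2 != 0:
                return False
            n = len(d)
            for k in range(1, n + 1):
                if sum(d[:k]) > k * (k - 1) + sum(min(x, k) for x in d[k:]):
                    return False
            return True

        print(is_graphic([3, 3, 2, 1, 1]))   # True
        print(is_graphic([3, 3, 3, 1]))      # False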

Conference Proceedings

  • With S. Watanabe, Optimality of the Recursive Data Exchange Protocol. ISIT, 2017.
    [Abstract] [pdf]

    Multiple parties observe correlated data generated independent and identically (in time) from a known joint distribution. Parties communicate with each other interactively to enable each party to recover the data observed by all the other parties and attain omniscience. We characterize the asymptotic growth of the number of bits of interactive communication required by the parties to attain omniscience up to the second-order term. For the converse, we provide a single-shot lower bound for the required number of bits of communication, which yields the asymptotic result as a special case. It is shown that the knowledge of the distribution can be used to modify the recently proposed recursive universal data exchange protocol to render it optimal up to the second-order term. As a corollary, we provide a precise characterization of the reduction in communication for omniscience due to interaction.

  • With S. Watanabe, Universal Multiparty Data Exchange. ISIT, 2016.
    [Abstract] [pdf]

    Multiple parties observing correlated data seek to recover each other’s data and attain omniscience. To that end, they communicate interactively over a noiseless broadcast channel: Each bit transmitted over this channel is received by all the parties. We give a universal interactive protocol for omniscience which requires communication of rate only $\mathcal{O}\left(n^{-1/2}\sqrt{\log n}\right)$ more than the optimal rate for every independent and identically distributed (in time) sequence of data.

  • With S. Venkatakrishnan, P. Viswanath, and S. Watanabe, Information Complexity Density and Simulation of Protocols. ITCS, 2016.
    [Abstract] [pdf]

    A simulation of an interactive protocol entails the use of interactive communication to produce the output of the protocol to within a fixed statistical distance $\epsilon$. Recent works have proposed that the information complexity of the protocol plays a central role in characterizing the minimum number of bits that the parties must exchange for a successful simulation, namely the distributional communication complexity of simulating the protocol. Several simulation protocols have been proposed with communication complexity depending on the information complexity of the simulated protocol. However, in the absence of any general lower bounds for distributional communication complexity, the conjectured central role of information complexity is far from settled. We fill this gap and show that the distributional communication complexity of $\epsilon$-simulating a protocol is bounded below by the $\epsilon$-tail $\lambda_\epsilon$ of the information complexity density, a random variable with information complexity as its expected value. For protocols with bounded number of rounds, we give a simulation protocol that yields a matching upper bound. Thus, it is not information complexity but $\lambda_\epsilon$ that governs the distributional communication complexity. As applications of our bounds, in the amortized regime for product protocols, we identify the exact second order term, together with the precise dependence on $\epsilon$. For general protocols such as a mixture of two product protocols or for the amortized case when the repetitions are not independent, we derive a general formula for the leading asymptotic term. These results sharpen and significantly extend known results in the amortized regime. In the single-shot regime, our lower bound clarifies the dependence of communication complexity on $\epsilon$. We illustrate this with an example that exhibits an arbitrary separation between distributional communication complexity and information complexity for all sufficiently small $\epsilon$.

  • With P. Viswanath and S. Watanabe, Interactive Communication for Data Exchange. ISIT, 2015.
    [Abstract] [pdf]

    Two parties observing correlated data seek to exchange their data using interactive communication. How many bits must they communicate? We propose a new interactive protocol for data exchange which increases the communication size in steps until the task is done. Next, we derive a lower bound on the minimum number of bits that is based on relating the data exchange problem to the secret key agreement problem. Our single-shot analysis applies to all discrete random variables and yields upper and lower bounds of a similar form. In fact, the bounds are asymptotically tight and lead to a characterization of the optimal rate of communication needed for data exchange for a general sequence such as a mixture of IID random variables as well as the optimal second-order asymptotic term in the minimum length of communication needed for data exchange for IID random variables, when the probability of error is fixed. This gives a precise characterization of the asymptotic reduction in the length of optimal communication due to interaction; in particular, two-sided Slepian-Wolf compression is strictly suboptimal.

  • With S. Watanabe, Impossibility Bounds for Secure Computing. ISIT, 2015.
    [Abstract] [pdf]

    We derive impossibility (converse) bounds for the efficiency of implementing information theoretically secure oblivious transfer and bit commitment using correlated observations. Our approach is based on relating these problems to that of testing if the observations of the parties are conditionally independent given the adversary's observation. The resulting bounds strengthen and improve upon several previously known results.

  • With P. Narayan and S. Watanabe, Common Randomness for Secure Computing. ISIT, 2015.
    [Abstract] [pdf]

    We revisit A.C. Yao's classic problem of secure function computation by interactive communication, in an information theoretic setting. Our approach, based on examining the underlying common randomness, provides a new proof of the characterization of a securely computable function by deterministic protocols. This approach also yields a characterization of the minimum communication needed for secure computability.

  • With T. Javidi and Y. Kaspi, Gaussian Estimation under Attack Uncertainty. ITW Jerusalem, 2015.
    [Abstract] [pdf]

    We consider the estimation of a standard Gaussian random variable under an observation attack where an adversary may add a zero mean Gaussian noise with variance in a bounded, closed interval to an otherwise noiseless observation. A straightforward approach would entail either ignoring the attack and simply using an optimal estimator under normal operation or taking the worst-case attack into account and using a minimax estimator that minimizes the cost under the worst-case attack. In contrast, we seek to characterize the optimal tradeoff between the MSE under normal operation and the MSE under the worst-case attack. Equivalently, we seek a minimax estimator for any fixed prior probability of attack. Our main result shows that a unique minimax estimator exists for every fixed probability of attack and is given by the Bayesian estimator for a least-favorable prior on the set of possible variances. Furthermore, the least-favorable prior is unique and has a finite support. While the minimax estimator is linear when the probability of attack is 0 or 1, our numerical results show that the minimax linear estimator is far from optimal for all other probabilities of attack and a simple nonlinear estimator does much better.
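
    To illustrate the form of the estimators discussed above, here is a Python sketch of the Bayes (MMSE) estimate of $X \sim \mathcal{N}(0,1)$ under a given finitely supported prior on the attack variance; the paper's contribution is identifying the least-favorable prior and proving minimaxity, which this sketch does not attempt.

        import math

        def mmse_estimate(y, attack_prob, attack_prior):
            # With probability 1 - attack_prob the observation is noiseless (Y = X);
            # with probability attack_prob an attacker adds N(0, v) noise, with v drawn
            # from the discrete prior attack_prior = {v: weight}.  E[X | Y = y] is a
            # likelihood-weighted mixture of the per-component linear estimates y/(1+v).
            components = [(1.0 - attack_prob, 0.0)] + [
                (attack_prob * w, v) for v, w in attack_prior.items()
            ]
            num = den = 0.0
            for weight, v in components:
                var_y = 1.0 + v                            # Var(Y) under this component
                lik = weight * math.exp(-y * y / (2 * var_y)) / math.sqrt(2 * math.pi * var_y)
                num += lik * y / var_y                     # E[X | Y = y, component]
                den += lik
            return num / den

        # Example with a (hypothetical) two-point prior on the attack variance:
        print(mmse_estimate(1.3, attack_prob=0.5, attack_prior={0.5: 0.6, 2.0: 0.4}))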

  • With J. Acharya, A. Orlitsky, and A. T. Suresh, The Complexity of Estimating Rényi Entropy. SODA, 2015.
    [Abstract] [pdf]

    We study the number of independent samples from a $k$-symbol distribution needed to estimate its Rényi entropy of order $\alpha$ and show that super-linear (in $k$) samples are needed for $\alpha < 1$, near linear $k$ samples are needed for noninteger $\alpha>1$, but, perhaps surprisingly, for integer $\alpha>1$ only $k^{1-1/\alpha}$ samples suffice (up to constants).

  • With S. Watanabe, Converse Results for Secrecy Generation over Channels. Invited, Asilomar, 2014.
    [Abstract] [pdf]

    We revisit the problem of secret key agreement in channel models, where in addition to a noisy, albeit secure channel, the terminals have access to a noiseless public communication channel. We show a strong converse for the secret key capacity in the point-to-point model and give upper bounds for the general case. Underlying our proofs is a recently discovered single-shot converse for secret key rates in multiterminal source models.

  • With M. Hayashi and S. Watanabe, Strong Converse for a Degraded Wiretap Channel via Active Hypothesis Testing. Invited, Allerton, 2014.
    [Abstract] [pdf]

    We establish an upper bound on the rate of codes for a wiretap channel with public feedback for a fixed probability of error and secrecy parameter. As a corollary, we obtain a strong converse for the capacity of a degraded wiretap channel with public feedback. Our converse proof is based on a reduction of active hypothesis testing for discriminating between two channels to coding for wiretap channel with feedback.

  • With A. Vardy, Explicit Capacity-Achieving Coding Scheme for the Gaussian Wiretap Channel. ISIT, 2014.
    [Abstract] [pdf]

    We extend the Bellare-Tessaro coding scheme for a discrete, degraded, symmetric wiretap channel to a Gaussian wiretap channel. Denoting by SNR the signal-to-noise ratio of the eavesdropper's channel, the proposed scheme converts a transmission code of rate R for the channel of the legitimate receiver into a code of rate R - 0.5 log(1 + SNR) for the Gaussian wiretap channel. The conversion has a polynomial complexity in the codeword length and the proposed scheme achieves strong security. In particular, when the underlying transmission code is capacity achieving, this scheme achieves the secrecy capacity of the Gaussian wiretap channel. In effect, this work allows us to convert any constructive scheme for transmitting over a Gaussian channel into a constructive scheme for the Gaussian wiretap channel.
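
    A quick numerical reading of the rate expression (assuming base-2 logarithms, so rates are in bits per channel use): if the eavesdropper's channel has SNR = 3, then $0.5\log(1+\mathrm{SNR}) = 0.5\log 4 = 1$, so a rate-$R$ transmission code for the legitimate receiver is converted into a wiretap code of rate $R - 1$.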

  • With M. Hayashi and S. Watanabe, Secret Key Agreement: General Capacity and Second-Order Asymptotics. ISIT, 2014.
    [Abstract] [pdf]

    We revisit the problem of secret key agreement using interactive public communication for two parties. When the underlying observations are independent and identically distributed, we establish the second-order asymptotic term in the maximum length of a secret key. Furthermore, for general observations, we establish the secret key capacity. Underlying our proofs is a new secret key agreement scheme and a recently established upper bound on secret key lengths.

  • With S. Watanabe, A Bound For Multiparty Secret Key Agreement And Implications For A Problem Of Secure Computing. EUROCRYPT, 2014.
    [Abstract] [pdf]

    We consider secret key agreement by multiple parties observing correlated data and communicating interactively over an insecure communication channel. Our main contribution is a single-shot upper bound on the length of the secret keys that can be generated, without making any assumptions on the distribution of the underlying data. Heuristically, we bound the secret key length in terms of "how far" the joint distribution of the initial observations of the parties and the eavesdropper is from a distribution that renders the observations of the parties conditionally independent across some partition, when conditioned on the eavesdropper's side information. The closeness of the two distributions is measured in terms of the exponent of the probability of error of type II for a binary hypothesis testing problem, thus bringing out a structural connection between secret key agreement and binary hypothesis testing. When the underlying data consists of an independent and identically distributed sequence, an application of our bound recovers several known upper bounds for the asymptotic rate of a secret key that can be generated, without requiring the agreement error probability or the security index to vanish to 0 asymptotically.
    Also, we consider the following problem of secure function computation with trusted parties: Multiple parties observing correlated data seek to compute a function of their collective data. To this end, they communicate interactively over an insecure communication channel. It is required that the value of the function be concealed from an eavesdropper with access to the communication. When is such a secure computation of a given function feasible? Using the aforementioned upper bound, we derive a necessary condition for the existence of a communication protocol that allows the parties to reliably recover the value of a given function, while keeping this value concealed from an eavesdropper with access to (only) the communication.

  • With S. Watanabe, Secret Key Capacity for Multiple Access Channel with Public Feedback. Invited, Allerton, 2013.
    [Abstract] [pdf]

    We consider the generation of a secret key (SK) by the inputs and the output of a secure multiple access channel (MAC) that additionally have access to a noiseless public communication channel. Under specific restrictions on the protocols, we derive various upper bounds on the rate of such SKs. Specifically, if the public communication consists of only the feedback from the output terminal, then the rate of SKs that can be generated is bounded above by the maximum symmetric rate $R_f^*$ in the capacity region of the MAC with feedback. On the other hand, if the public communication is allowed only before and after the transmission over the MAC, then the rate of SKs is bounded above by the maximum symmetric rate $R^*$ in the capacity region of the MAC without feedback. Furthermore, for a symmetric MAC, we present a scheme that generates an SK of rate $R_f^*$, improving the best previously known achievable rate $R^*$. An application of our results establishes the SK capacity for the adder MAC, without any restrictions on the protocols.

  • With P. Narayan, How Many Queries Will Resolve Common Randomness? ISIT, 2013.
    [Abstract] [pdf]

    A set of m terminals, observing correlated signals, communicate interactively to generate common randomness for a given subset of them. Knowing only the communication, how many direct queries of the value of the common randomness will resolve it? A general upper bound, valid for arbitrary signal alphabets, is developed for the number of such queries by using a query strategy that applies to all common randomness and associated communication. When the underlying signals are independent and identically distributed repetitions of m correlated random variables, the number of queries can be exponential in signal length. For this case, the mentioned upper bound is tight and leads to a single-letter formula for the largest query exponent, which coincides with the secret key capacity of a corresponding multiterminal source model. In fact, the upper bound constitutes a strong converse for the optimum query exponent, and implies also a new strong converse for secret key capacity. A key lemma, estimating the size of a large probability set in terms of Rényi entropy, is interpreted separately also as a lossless block coding result for a general source. As a particularization, it yields the classic result for a discrete memoryless source.

  • With N. Kashyap, Y. Sankarasubramaniam, and K. Viswanathan, Fault-Tolerant Secret Key Generation. ISIT, 2012.
    [Abstract] [pdf]

    Mobile nodes observing correlated data communicate using an insecure bidirectional switch to generate a secret key, which remains concealed from the switch. We are interested in fault-tolerant secret key rates, i.e., the rates of secret key generated even if a subset of nodes drop out before the completion of the communication protocol. We formulate a new notion of fault-tolerant secret key capacity, and present an upper bound on it. This upper bound is shown to be tight when the random variables corresponding to the observations of nodes are exchangeable. Further, it is shown that one round of interaction achieves the fault-tolerant secret key capacity in this case. The upper bound is also tight for the case of a pairwise independent network model consisting of a complete graph, and can be attained by a noninteractive protocol.

  • Distributed Computing with Privacy. ISIT, 2012.
    [Abstract] [pdf]

    A set of terminals that observe correlated data seek to compute functions of the data using interactive public communication. At the same time, it is required that this communication, observed by an eavesdropper, does not reveal the value of a private function of the data. In general, the private function and the functions computed by the nodes can be all different. We show that a class of functions are securely computable if and only if the conditional entropy of the data given the value of the private function is greater than the least rate of interactive communication required for an appropriately chosen multiterminal source coding task. A single-letter formula is provided for this rate in special cases.

  • Minimal Public Communication for Maximum Rate Secret Key Generation. ISIT, 2011.
    [Abstract] [pdf]

    Secret key generation is considered for a pair of terminals that observe correlated sources and communicate interactively over a public channel. It is argued that optimum rate secret key generation is linked inherently to Wyner's notion of common information between two dependent random variables. The minimum rate of interactive public communication required to generate an optimum rate secret key is characterized in terms of a variant of this notion of common information.

  • With P. Narayan and P. Gupta, When Is a Function Securely Computable? ISIT, 2011.
    [Abstract] [pdf]

    A subset of a set of terminals that observe correlated signals seek to compute a given function of the signals using public communication. It is required that the value of the function be kept secret from an eavesdropper with access to the communication. We show that the function is securely computable if and only if its entropy is less than the aided secret key capacity of an associated secrecy generation model, for which a single-letter characterization is provided.

  • With P. Narayan and P. Gupta, Secure Computing. ISIT, 2010.
    [Abstract] [pdf]

    We study a problem of secure computation by multiple parties of a given function of their cumulative observations, using public communication but without revealing the value of the function to an eavesdropper with access to this communication. A Shannon theoretic formulation is introduced to characterize necessary and sufficient conditions for secure computability. Drawing on innate connections of this formulation to the problem of secret key generation by the same parties using public communication, we show that a function is securely computable if and only if its entropy is smaller than the secret key capacity. Conditions for secure computability at a lone terminal are also derived by association with an appropriate secret key generation problem.
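
    As a hedged illustration of this threshold in the simplest setting: for two terminals observing $X_1, X_2$, with the eavesdropper seeing only the public communication, the secret key capacity equals $I(X_1 \wedge X_2)$, so the characterization above reads $H\left(g(X_1, X_2)\right) < I(X_1 \wedge X_2)$.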

  • With P. Narayan, The Gelfand-Pinsker Channel: Strong Converse and Upper Bound for the Reliability Function. ISIT, 2009.
    [Abstract] [pdf]

    We consider a Gelfand-Pinsker discrete memoryless channel (DMC) model and provide a strong converse for its capacity. The strong converse is then used to obtain an upper bound on the reliability function. Instrumental in our proofs is a new technical lemma which provides an upper bound for the rate of codes with codewords that are conditionally typical over large message dependent subsets of a typical set of state sequences. This technical result is a nonstraightforward analog of a known result for a DMC without states that provides an upper bound on the rate of a good code with codewords of a fixed type (to be found in, for instance, the Csiszár-Körner book).

  • With R. K. Mallik and S. Raina. Optimal Receiver for MPSK Signaling with Imperfect Channel Estimation. WCNC, 2007.
    [Abstract] [pdf]

    We derive the structure of the optimal receiver for MPSK signaling over a correlated Rayleigh fading channel. The channel is estimated using the minimum mean square error criterion by means of pilot symbols. For the case of high signal-to-noise ratio, an approximate expression for the symbol error probability (SEP) of this scheme is obtained as an integral, and compared with the SEP of a suboptimal receiver which uses maximal-ratio combining. Numerical results show that the performance gap between the optimal and suboptimal receivers increases with increase of the channel correlation and the number of diversity branches, whereas it decreases with increase of pilot-to-signal ratio.

Theses

  • Master's Thesis: Optimal Receivers for MPSK Signaling with Imperfect Channel Estimation
    Indian Institute of Technology, Delhi, 2007.
    Supervisor: Prof. Ranjan K. Mallik

  • PhD Dissertation: Common Randomness Principles of Secrecy
    University of Maryland, College Park, 2013. [pdf]
    Supervisor: Prof. Prakash Narayan


Talks

Talks with Videos

  • Institut Henri Poincaré, Trimester on the Nexus of Information and Computation Theories, 2016
    Tutorial on Information Theoretic Secrecy and Interactive Communication.



  • Simons Institute for Theory of Computing, University of California Berkeley, Workshop on Information Theory, Learning and Big Data, 2015
    Sample Complexity of Estimating Entropy.
    [Slides]



  • Interactive Information Theory Workshop, Banff International Research Station, Canada, 2012
    Function Computation, Secrecy Generation and Common Randomness.
    [Slides]



Some Other Invited Talks

  • Indian Institute of Science, ECE Faculty Seminar, 2016
    Universal Multiparty Data Exchange.
    [Slides]
  • University of Delaware; CyLab, Carnegie Mellon University; UC San Diego, 2014
    Converse Results for Information Theoretic Cryptography.
    [Slides]
  • Ph.D. Defense, University of Maryland, College Park, 2013
    Common Randomness Principles of Secrecy.
    [Slides]
  • University of Illinois, Urbana-Champaign, 2013
    How Many Queries Will Resolve Common Randomness?
    [Slides]
  • Hewlett Packard Labs, Bangalore, India, 2012
    Secret Key Generation and Secure Computing.
    [Slides]

Conference Talks

  • ISIT 2016 Universal Multiparty Data Exchange.
    [Slides]
  • ITCS 2016 Information Complexity Density and Simulation of Protocols.
    [Slides]
  • Bombay Information Theory Seminar (BITS) 2016 Independence Testing Bound and Interactive Communication.
    [Slides]
  • ISIT 2015 Interactive Communication for Data Exchange.
    [Slides]
  • Simons Institute Workshop 2015 and NCC 2015 Sample Complexity of Estimating Entropy.
    [Slides]
  • Allerton 2014 Strong Converse for a Degraded Wiretap Channel via Active Hypothesis Testing.
    [Slides]
  • ISIT 2014 Explicit Capacity Achieving Codes for the Gaussian Wiretap Channel.
    [Slides]
  • ISIT 2014 Secret Key Agreement: General Capacity and Second-Order Asymptotics.
    [Slides]
  • EUROCRYPT 2014 A Bound for Multiparty Secret Key Agreement and Implications for a Problem of Secure Computing.
    [Slides]
  • CISS 2014 A Converse for Secret Key Agreement.
    [Slides]
  • ITA 2014 A Bound on Secret Key Length via Binary Hypothesis Testing.
    [Slides]
  • Allerton 2013 Secret Key Capacity for Multiple Access Channel with Public Feedback.
    [Slides]
  • ISIT 2013 How Many Queries Will Resolve Common Randomness?
    [Slides]
  • ISIT 2012 Fault Tolerant Secret Key Generation.
    [Slides]
  • ISIT 2011 Minimal Public Communication for Maximum Rate Secret Key Generation.
    [Slides]
  • ISIT 2011 When is a Function Securely Computable?
    [Slides]
  • ISIT 2009 The Gelfand-Pinsker Channel: Strong Converse and Upper Bound for the Reliability Function.
    [Slides]