Composition of Privacy Guarantees: Classical and Quantum

I try to always consider the classical alternative to any quantum computation or quantum information-theoretic primitive. This is a deliberate choice. I am not a pure quantum theorist in the sense of studying quantum models in isolation, nor am I interested in quantum advantage as an article of faith. Rather, my goal is to delineate (as precisely as possible) the boundary between what classical and quantum theories can guarantee, especially when privacy guarantees are composed over time, across mechanisms, or between interacting systems.

In the context of privacy, composition is where theory meets reality: real systems are never single-shot. They involve repeated interactions, adaptive adversaries, and layered mechanisms. Quantum information introduces new phenomena (entanglement, non-commutativity, and measurement disturbance) that complicate classical intuitions about composition. At the same time, classical privacy theory has developed remarkably robust tools that often remain surprisingly competitive, even when quantum resources are allowed.

The guiding question of this post is therefore not “What can quantum systems do that classical ones cannot?” but rather:

When privacy guarantees are composed, what genuinely changes in the transition from classical to quantum, and what does not?

By keeping classical alternatives explicitly in view, we can better understand which privacy phenomena are inherently quantum, which are artifacts of modeling choices, and which reflect deeper structural principles that transcend the classical vs. quantum divide.

Classical Composition of Differential Privacy

Recall the definition of differential privacy:

Approximate Differential Privacy
Let \mathcal{X} denote the data universe and let \mathcal{D} \subseteq \mathcal{X}^n be the set of datasets.
Two datasets D,D'\in\mathcal{D} are called neighbors, denoted D\sim D', if they differ in the data of exactly one individual.

A (possibly randomized) algorithm \mathcal{M} : \mathcal{D} \to (\mathcal{Y},\mathcal{F}) is said to be
(\varepsilon,\delta)-differentially private if for all neighboring datasets D\sim D' and all measurable events
S \in \mathcal{F},
\Pr[\mathcal{M}(D)\in S] \;\le\; e^{\varepsilon}\Pr[\mathcal{M}(D')\in S] + \delta.
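To make the definition concrete before we discuss composition, here is a minimal classical example (my illustration, not part of the development above): the Laplace mechanism, which releases a query answer plus Laplace noise scaled to the query's sensitivity, and which satisfies (\varepsilon, 0)-differential privacy.

    import numpy as np

    def laplace_mechanism(dataset, query, sensitivity, epsilon, rng=None):
        # Standard (epsilon, 0)-DP Laplace mechanism: add noise with scale
        # sensitivity / epsilon to the true query value.
        rng = rng if rng is not None else np.random.default_rng()
        noise = rng.laplace(loc=0.0, scale=sensitivity / epsilon)
        return query(dataset) + noise

    # Counting query: changing one individual's record changes the count by at
    # most 1, so the L1-sensitivity is 1.
    dataset = [0, 1, 1, 0, 1, 1, 1]
    noisy_count = laplace_mechanism(dataset, query=sum, sensitivity=1.0, epsilon=0.5)
    print(noisy_count)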

Basic composition is a standard result in the differential privacy literature. We recall the statement:

Theorem (Basic sequential composition for approximate differential privacy)
Fix k\in\mathbb{N}. For each i\in\{1,\ldots,k\} let \mathcal{M}_i be a (possibly randomized) algorithm that, on input a dataset D, outputs a random variable in some measurable output space (\mathcal{Y}_i,\mathcal{F}_i).
Assume that for every i, \mathcal{M}_i is (\varepsilon_i,\delta_i)-differentially private.

Define the k-round interactive (sequential) mechanism \mathcal{M} as follows: on input D, for i=1,\ldots,k, it outputs Y_i \leftarrow \mathcal{M}_i (D; Y_1,\ldots,Y_{i-1}),
where \mathcal{M}_i(\cdot; y_{<i}) denotes the ith mechanism possibly chosen adaptively as a (measurable) function of the past transcript y_{<i}=(y_1,\ldots,y_{i-1}).
Let Y=(Y_1,\ldots,Y_k) denote the full transcript in the product space
(\mathcal{Y},\mathcal{F}) := \prod_{i=1}^k (\mathcal{Y}_i,\mathcal{F}_i).

Then \mathcal{M} is \left(\sum_{i=1}^k \varepsilon_i,\ \sum_{i=1}^k \delta_i\right)-differentially private.

In particular, if \varepsilon_i=\varepsilon and \delta_i=\delta for all i, then \mathcal{M} is (k\varepsilon, k\delta)-differentially private.
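In code, basic composition is pure bookkeeping. A minimal sketch (the helper below is hypothetical, not taken from any DP library):

    def basic_composition(budgets):
        # budgets: list of (epsilon_i, delta_i) pairs, one per mechanism.
        # Basic composition: the total budget is the coordinate-wise sum.
        budgets = list(budgets)
        eps_total = sum(eps for eps, _ in budgets)
        delta_total = sum(delta for _, delta in budgets)
        return eps_total, delta_total

    # k = 10 identical mechanisms, each (0.5, 1e-6)-DP, compose to roughly (5.0, 1e-5)-DP.
    print(basic_composition([(0.5, 1e-6)] * 10))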

What happens in the quantum setting?

Composition of Quantum Differential Privacy

The central classical DP intuition we have already set up is that per-step privacy bounds stack: in the simplest form the parameters simply add, e.g., (\varepsilon, \delta) per round accumulates to (k\varepsilon, k\delta) over k rounds. In the quantum world, however, DP is commonly defined operationally against arbitrary measurements, and this makes the usual classical composition proofs, which rely on a scalar privacy-loss random variable, no longer directly applicable.

In a recent work, Theshani Nuradha and I make the following points, starting with a negative one (a barrier) and then a positive one:

  1. Composition can fail in full generality for approximate QDP (POVM-based).
    We show that if you allow correlated joint implementations when combining mechanisms/channels, then “classical-style” composition need not hold: even channels that are “individually perfectly private” can lose privacy drastically when composed in this fully general way.
  2. Composition can be restored under explicit structural assumptions.
    Then we identify a regime where you can recover clean composition statements: tensor-product channels acting on product neighboring inputs. In that regime, we propose a quantum moments accountant built from an operator-valued notion of privacy loss and a matrix moment-generating function (MGF).
  3. How we get operational guarantees (despite a key obstacle).
    A subtlety we highlight: the Rényi-type divergence we consider for the moments accountant does not satisfy a data-processing inequality. Nevertheless, we prove that controlling appropriate moments is still enough to upper bound measured Rényi divergence, which does correspond to operational privacy against arbitrary measurements.
  4. End result: advanced-composition-style behavior (in the right setting).
    Under those structural assumptions, the paper obtains advanced-composition-style bounds with the same leading-order behavior as in classical DP (the classical benchmark is recalled below for reference). That is, you can once again reason modularly about long pipelines, but only after carefully stating what “composition” means (joint, tensor-product, or factorized) physically and operationally in the quantum setting.
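For reference, the classical benchmark is the advanced composition theorem (Dwork, Rothblum, and Vadhan): if each of k adaptively chosen mechanisms is (\varepsilon,\delta)-differentially private, then for every \delta'>0 the k-fold composition is (\varepsilon', k\delta+\delta')-differentially private with

\varepsilon' \;=\; \sqrt{2k\ln(1/\delta')}\,\varepsilon \;+\; k\varepsilon\,(e^{\varepsilon}-1),

so for small \varepsilon the privacy loss grows like \sqrt{k}\,\varepsilon rather than k\varepsilon. This \sqrt{k}-type scaling is the leading-order behavior matched by the quantum bounds above, under the stated structural assumptions.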

Check out the paper. Feedback/comments are welcome!

The Talking Drum as a Communication Channel

We just wrapped up Week 1 of my UIUC course, ECE598DA: Topics in Information-Theoretic Cryptography. The class introduces students to how tools from information theory can be used to design and analyze both privacy applications and foundational cryptographic protocols. Like many courses in privacy and security, we began with the classic one-time pad as our entry point into the fascinating world of secure communication.

We also explored another ‘tool’ for communication: the talking drum. This musical tradition offers a striking example of how information can be encoded, transmitted, and understood only by those familiar with the underlying code. In class, I played a video of a master drummer to bring this idea to life.

What Are Talking Drums?

Talking drums, especially those like the Yoruba dùndún, are traditional African hourglass‑shaped percussion instruments prized for their ability to mimic speech. Skilled drummers can vary pitch and rhythm to convey tonal patterns, effectively transmitting messages over short distances.

  • Speech surrogacy: The drum replicates the microstructure of tonal languages by adjusting pitch and rhythm, embodying what researchers call a “speech surrogate”.
  • Cultural ingenuity: Historically, these drums served as everyday communication tools, not merely for music or rituals but for sharing proverbs, announcements, secure messages, and more.

Here’s one of the exercises I gave students in Week 1:

Exercise: Talking drums. Chapter 1 of Gleick’s The Information highlights the talking drum as an early information technology: a medium that compresses, encodes, and transmits messages across distance. Through a communications theory lens, can you describe the talking drum as a medium that achieves a form of secure communication?

And here’s a possible solution:

African talking drums (e.g., Yoruba “dùndún”) reproduce the pitch contours and tonal patterns of speech. Since many West African languages are tonal, the drum reproduces structure without literal words.

  • Encoding: A spoken sentence is mapped into rhythmic and tonal patterns.
  • Compression: The drum strips away vowels and consonants, leaving tonal “skeletons.”
  • Security implication: To an outsider unfamiliar with the tonal code or local idioms, the message is incomprehensible. In effect, the drum acts as an encryption device where the key is cultural and linguistic context (a toy sketch of this asymmetry follows below).
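A toy sketch of this asymmetry (purely illustrative: the tone inventory, phrase book, and syllables below are invented, not actual Yoruba or dùndún practice):

    # The drummer keeps only each syllable's tone (H = high, M = mid, L = low),
    # discarding vowels and consonants -- the "tonal skeleton."
    def encode(syllables):
        return tuple(tone for _, tone in syllables)

    # Insiders share a cultural/linguistic key: stock phrases and proverbs indexed
    # by their tonal skeletons. Outsiders hear the same tones but hold no such key.
    PHRASE_BOOK = {
        ("L", "H", "H", "L"): "the chief calls a meeting",
        ("H", "H", "L", "L"): "strangers approach the river",
    }

    def decode_insider(skeleton):
        return PHRASE_BOOK.get(skeleton, "<unresolved skeleton>")

    def decode_outsider(skeleton):
        return "<tones heard, meaning opaque>"

    message = [("o", "L"), ("ba", "H"), ("pe", "H"), ("wa", "L")]  # invented syllables
    skeleton = encode(message)
    print(skeleton)                  # ('L', 'H', 'H', 'L')
    print(decode_insider(skeleton))  # the chief calls a meeting
    print(decode_outsider(skeleton))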

There are a few entities to model:

  • Source: Message in natural language (tonal West African language, e.g., Yoruba).
  • Encoder: Drummer maps source to a drummed signal using tonal contours and rhythmic patterns.
  • Channel: Physical propagation of drum beats across distance, subject to noise (wind, echo, competing sounds).
  • Legitimate receiver: Villager fluent in both the spoken language and cultural conventions.
  • Adversary: Outsider (colonial administrator, rival tribe, foreign merchant) who hears the same signal but lacks full knowledge of mapping or redundancy rules.

Let X denote a message in a tonal language (e.g., Yoruba). A drummer acts as an encoder E mapping X to a drummed signal S = E(X,K), where K denotes shared cultural/linguistic knowledge (idioms, proverbs, discourse templates) known to legitimate receivers but not to outsiders. The signal S traverses a physical channel C and is received as Y_R by insiders and as Y_A by an adversary (outsider). Decoders D_R and D_A then attempt to reconstruct X; one way to write this out is sketched below.
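One way to complete the formalization (my own gloss on the exercise, not a claim about actual drumming practice): the legitimate receiver decodes with the shared key K, while the adversary must decode without it,

\hat{X}_R \;=\; D_R(Y_R, K), \qquad \hat{X}_A \;=\; D_A(Y_A).

The drum then achieves a weak, key-based form of secure communication when \Pr[\hat{X}_R = X] is high, helped by the redundancy of proverbs and stock phrases, while \Pr[\hat{X}_A = X] stays close to the adversary's prior guess, since without K many distinct sentences collapse onto the same tonal skeleton.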

Privacy and Security in Data Markets

At SIGMOD 2025, my collaborators and I are scheduled to give a tutorial on Privacy and Security in Distributed Data Markets. The core material that will be presented is summarized in the accompanying paper.

Abstract

Data markets play a pivotal role in modern industries by facilitating the exchange of data for predictive modeling, targeted marketing, and research. However, as data becomes a valuable commodity, privacy and security concerns have grown, particularly regarding the personal information of individuals. This tutorial explores privacy and security issues when integrating different data sources in data market platforms. As motivation for the importance of enforcing privacy requirements, we discuss attacks on data markets focusing on membership inference and reconstruction attacks. We also discuss security vulnerabilities in decentralized data marketplaces, including adversarial manipulations by buyers or sellers. We provide an overview of privacy and security mechanisms designed to mitigate these risks. To minimize the trust required of buyers and sellers, we focus on distributed protocols. Finally, we conclude with opportunities for future research on understanding and mitigating privacy and security concerns in distributed data markets.

Schedule

Part I: Survey on Data Markets

Part II: Privacy and Security Risks

Part III: Privacy-Preserving Technologies and Security Tools

Part IV: Regulatory Considerations

Part V: Open Problems & Future Work

Part VI: Q & A

Leading up to the conference, I’m planning to post on different aspects of the tutorial.