# Adaptors generalised

To what extent might it be possible to have actions on the Bitcoin blockchain be enforced by verification of statements that involve generators other than the standard generator \(G\)? In general the answer seems like it should be "not at all", but it isn't trivial. I started wondering if we could generalize the signature adaptor concept for this.

For the knowledgeable or the impatient, feel free to skip ahead here to the more concrete idea on this theme; you can always rewind afterwards for more background detail. For the rest, before we begin, a few notes:

- "multidimensions" - this is used as an analogy, but it's one that is implicit in a lot of modern applications of public key cryptography. We can choose a set of generators for a discrete-log hard group (such as a curve like secp256k1) whose relative discrete logs are believed unknown (usually generated by hashing to curve, an application of the concept of NUMS). We can then find *representations* for a secret *tuple* \(x_1, x_2 \ldots\ x_n\) over the "basis" of generators \(G_1, G_2 \ldots\ G_n\). The reason for the scare quotes and the word "analogy" is that this is not a space of dimension \(n\); the group itself is a space of dimension 1; each point can be represented with one free variable as \(\lambda G\), \(\lambda \in \mathbb{Z}_p\) for group order \(p\). We could perhaps call it a "computational n-dimensional space", because the problem of finding an *alternate* representation can be reduced to the original discrete log hardness problem. This is justified rigorously by Brands in his thesis, see specifically Corollary 8.
- notation: because of extensive use of tuples, we adopt these conventions: a tuple of scalar values will sometimes be written with the shorthand \((x)\). Similarly for curve points: \((G)\) will be shorthand for \(G_1, G_2, \ldots G_n\). When considering "products" like \(x_1G_1 + x_2G_2 + \ldots\), we may sometimes use vector notation: \(\underline{\mathbf{x}} \cdot \underline{\mathbf{G}}\).

### \(\Sigma\)-protocols for alternate homomorphisms

You probably know what they are if you're reading this, but for learners, I can suggest e.g. this article as an introduction to the topic of \(\Sigma\)-protocols, not to mention several articles and even videos/talks of mine. A meatier discussion from Damgård can be found here.

We observe now that, while the standard construction uses as its one-way homomorphism \(f: \mathbb{F} \longrightarrow \mathbb{G}\) a single elliptic curve scalar multiplication (or exponentiation in the discrete log setting), the same construction works with any other one-way homomorphism \(f'\); one only has to ensure that the 'one-wayness' property actually applies, i.e., that reversal of \(f'\) is reducible to a computationally hard problem.

So, let's consider \(f: (x_1, x_2, \ldots x_n) \longrightarrow P = x_1 G_1 + x_2 G_2 +\ldots x_nG_n\) as an example. As per our notation note above, we can shorten this to \(f: (x) \longrightarrow P = \underline{\mathbf{x}} \cdot \underline{\mathbf{G}}\). Finding \((x)\) given \((P, (G))\) is what is termed the **representation problem**. This problem is reducible to (EC)DLP as explained by Brands here in Corollary 8.

It follows that we can build a \(\Sigma\)-protocol possessing 2-special soundness, using \(f'\), i.e., the representation problem, and we will next do that before applying the concept of adaptors to it.

Thus, to prove knowledge of the "representation", or opening \((x)\), of a vector commitment \(V = \underline{\mathbf{x}} \cdot \underline{\mathbf{G}}\), we can construct the corresponding sigma protocol:

- choose \((k)\) by sampling each element from \(\mathbb{F}\) at random
- construct \(R = \underline{\mathbf{k}} \cdot \underline{\mathbf{G}}\)
- receive hash challenge via Fiat Shamir heuristic: \(e = \mathbb{H}(R||V||m)\) where \(m\) is a protocol-specific message.
- construct the *set* of responses: \((s) = (k) + e(x)\)
- verification is \(\underline{\mathbf{s}} \cdot \underline{\mathbf{G}} \stackrel{?}{=} R + eV\).

This protocol has 2-special soundness and honest verifier zero knowledge by the same argument, extended trivially to multiple responses for each of the two executions, as for the Schnorr identity protocol and its Fiat-Shamir transformed signature scheme.
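As a toy illustration (emphatically not production code), the protocol above can be sketched in Python over a small Schnorr group written multiplicatively, so that \(x_iG_i\) becomes \(g_i^{x_i} \bmod p\). The parameters (p = 1019, q = 509) and the generator list are assumptions chosen only to make the arithmetic visible, not for security:

```python
import hashlib
import random

# Toy Schnorr group: p = 2q + 1, both prime; we work in the order-q
# subgroup of Z_p^*, written multiplicatively. Illustrative only, NOT secure.
p, q = 1019, 509
G = [4, 9, 25]  # squares mod p lie in the order-q subgroup; in practice
                # these must be NUMS generators with unknown relative dlogs

def H(*items):
    # Fiat-Shamir hash, mapping the transcript to a challenge in Z_q
    data = b"||".join(str(i).encode() for i in items)
    return int.from_bytes(hashlib.sha256(data).digest(), "big") % q

def commit(scalars):
    # the multi-base "product" x1*G1 + x2*G2 + ... (here: prod g_i^x_i mod p)
    out = 1
    for g, s in zip(G, scalars):
        out = out * pow(g, s, p) % p
    return out

def prove(xs, V, m):
    ks = [random.randrange(q) for _ in G]            # tuple of nonces (k)
    R = commit(ks)
    e = H(R, V, m)
    ss = [(k + e * x) % q for k, x in zip(ks, xs)]   # (s) = (k) + e(x)
    return R, ss

def verify(V, R, ss, m):
    e = H(R, V, m)
    return commit(ss) == R * pow(V, e, p) % p        # s.G == R + e.V

xs = [random.randrange(q) for _ in G]  # the secret representation
V = commit(xs)                         # the vector commitment
R, ss = prove(xs, V, "msg")
assert verify(V, R, ss, "msg")
```

The verification identity is exactly the last bullet above: \(\prod g_i^{s_i} = \prod g_i^{k_i} \cdot (\prod g_i^{x_i})^e\), i.e. \(\underline{\mathbf{s}} \cdot \underline{\mathbf{G}} = R + eV\).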

### Extending the adaptor to multidimensions

The adaptor can be applied in the same way as in the 1-dimensional case:

- choose \((t)\), a tuple of adaptor secrets. Then, calculate the point \(T = \underline{\mathbf{t}} \cdot \underline{\mathbf{G}}\).
- construct the tuple \((s')\), each element of which is calculated as \(s_i' = k_i + \mathbb{H}(R+T||V||m)x_i\).
- The verifier can then verify the adaptor with: \(\underline{\mathbf{s}}' \cdot \underline{\mathbf{G}} \stackrel{?}{=} R + \mathbb{H}(R+T||V||m)V\).

As in the 1-dimensional case, if this verification passes, then the verifier knows that if a valid final response tuple in the sigma protocol, for the nonce point \(R + T\), is revealed, it will yield the adaptor secret tuple \((t)\):

\(s_i - s_i' = t_i \quad \forall i\)

Note that the security of this construction, i.e., that the prover cannot equivocate on the values of the adaptor secrets \(t_i\), relies on the hardness of the representation problem.
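The mechanics can be checked concretely in the same toy multiplicative-group sketch as earlier (parameters are illustrative assumptions): the adaptor responses omit the \(t_i\), the completed responses include them, and subtracting the two tuples reveals every adaptor secret.

```python
import hashlib
import random

# Toy Schnorr group (p = 2q + 1), multiplicative notation: x*G -> g^x mod p.
# Illustrative assumptions only, NOT secure parameters.
p, q = 1019, 509
G = [4, 9, 25]  # stand-ins for NUMS generators

def H(*items):
    data = b"||".join(str(i).encode() for i in items)
    return int.from_bytes(hashlib.sha256(data).digest(), "big") % q

def commit(scalars):
    out = 1
    for g, s in zip(G, scalars):
        out = out * pow(g, s, p) % p
    return out

xs = [random.randrange(q) for _ in G]  # secret representation of V
ts = [random.randrange(q) for _ in G]  # tuple of adaptor secrets (t)
ks = [random.randrange(q) for _ in G]  # tuple of nonces (k)
V, T, R = commit(xs), commit(ts), commit(ks)

# the challenge commits to R + T (multiplicatively: R*T), but the
# adaptor responses s'_i leave out the t_i
e = H(R * T % p, V, "msg")
s_adapt = [(k + e * x) % q for k, x in zip(ks, xs)]

# adaptor verification: s'.G == R + e.V
assert commit(s_adapt) == R * pow(V, e, p) % p

# completion: the full responses use the offset nonces k_i + t_i,
# and verify against the nonce point R + T
s_full = [(k + t + e * x) % q for k, t, x in zip(ks, ts, xs)]
assert commit(s_full) == (R * T % p) * pow(V, e, p) % p

# anyone holding both tuples learns every adaptor secret: s_i - s'_i = t_i
assert [(sf - sa) % q for sf, sa in zip(s_full, s_adapt)] == ts
```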

#### Multi-adaptors are still not proofs of knowledge

There are two typical modes of usage of adaptors; one is where Alice gives Bob the adaptor point \(T\) and Bob constructs an adaptor \(s'\) using *his* private key and the provided \(T\). The other is when Bob gives both \(s'\) and \(T\) to Alice so that she is guaranteed to receive the adaptor secret if \(s\) is published.

Both usages will typically involve multisignature/multi-authorization in some way to enforce non-equivocation, but we will elide these details for now.

Let's note though, that just as for the one-dimensional case, this "multidimensional adaptor" never serves as a proof of knowledge of either the secret keys \((x)\) nor the adaptor secret \((t)\). To illustrate that concretely, a person could follow this protocol to generate a validating multidimensional adaptor:

- \((s') \stackrel{\$}{\longleftarrow} \mathbb{F}^{n}\)
- \(Q \stackrel{\$}{\longleftarrow} \mathbb{G}\)
- \(R = \underline{\textbf{s}}' \cdot \underline{\textbf{G}} - \mathbb{H}(Q||V||m)V\)
- \(T = Q - R\)

After doing this, they can present \((T, (s'))\) as a validating set of adaptors for the commitment \(V\).

Notice that, in the above calculation of \(R\), the "forger" cannot find a representation of it with respect to the base set \((G)\) unless they already know the representation of \(V\); thus, by the hardness of the representation problem, a representation of \(T\) is also not knowable.
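The forgery recipe above can be run directly in the toy group (parameters assumed, as before): the forger knows no representation of \(V\), \(R\) or \(T\), yet the adaptor verification equation passes.

```python
import hashlib
import random

# Toy Schnorr group (p = 2q + 1), multiplicative notation; illustrative only.
p, q = 1019, 509
G = [4, 9, 25]

def H(*items):
    data = b"||".join(str(i).encode() for i in items)
    return int.from_bytes(hashlib.sha256(data).digest(), "big") % q

def commit(scalars):
    out = 1
    for g, s in zip(G, scalars):
        out = out * pow(g, s, p) % p
    return out

# a commitment V whose representation the forger does NOT know
V = pow(4, random.randrange(1, q), p)

# forgery: sample the responses and Q at random, then solve backwards
s_forged = [random.randrange(q) for _ in G]        # (s') <-$- F^n
Q = pow(4, random.randrange(q), p)                 # Q <-$- group
e = H(Q, V, "msg")                                 # Q plays the role of R + T
R = commit(s_forged) * pow(V, (q - e) % q, p) % p  # R = s'.G - e.V
T = Q * pow(R, p - 2, p) % p                       # T = Q - R (mult.: Q/R)

# the adaptor verifies: R + T reproduces the hashed point Q, so the
# recomputed challenge matches, and s'.G == R + e.V -- with no secrets known
assert R * T % p == Q
assert H(R * T % p, V, "msg") == e
assert commit(s_forged) == R * pow(V, e, p) % p
```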

### Can adaptors also be applied to discrete log equivalence?

Those who are aware of "generalised \(\Sigma\)-protocols" or perhaps just DLEQs may wonder if we can generalize adaptors further to cases like an adaptor on a DLEQ proof.

To recap the DLEQ:

Given agreement on \(n\) relative-discrete-log-unknown generators \((G)\), we can prove in HVZK that a set of curve points \((P)\) have the property that \(P_i = x G_i\) is true \(\forall i\). The way to do this is to construct multiple ephemeral commitments \((R)\) from one nonce \(k\), and then return one response \(s\) to the (Fiat Shamir generated) challenge. The protocol is therefore:

- Generate \(k\) at random from \(\mathbb{F}\)
- Calculate \((R)\) from \(R_i = kG_i\)
- Receive the hash challenge \(e = \mathbb{H}((R)||(P)||m)\) where each element of the tuples should be concatenated individually (not summed), and \(m\) is some protocol specific message.
- Return the single response \(s = k + ex\).
- Verification requires \(s G_i \stackrel{?}{=} R_i + e P_i \ \forall i\).

Successful verification convinces the verifier of two facts: that the \(n\) points have a common discrete log, and that the prover knows this value.
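As a toy sketch in the same assumed multiplicative notation, the DLEQ protocol has one nonce and one response, but one ephemeral commitment and one verification equation per generator:

```python
import hashlib
import random

# Toy Schnorr group (p = 2q + 1), multiplicative notation: xG -> g^x mod p.
# Illustrative assumptions only, NOT secure parameters.
p, q = 1019, 509
G1, G2 = 4, 9  # stand-ins for relative-dlog-unknown generators

def H(*items):
    # each element of the tuples is hashed individually (not summed)
    data = b"||".join(str(i).encode() for i in items)
    return int.from_bytes(hashlib.sha256(data).digest(), "big") % q

x = random.randrange(1, q)                 # the single secret
P1, P2 = pow(G1, x, p), pow(G2, x, p)      # claim: same dlog over both bases

k = random.randrange(q)                    # one nonce ...
R1, R2 = pow(G1, k, p), pow(G2, k, p)      # ... two ephemeral commitments
e = H(R1, R2, P1, P2, "msg")
s = (k + e * x) % q                        # one response

# verification: s*G_i == R_i + e*P_i must hold for every i
for g, R, P in [(G1, R1, P1), (G2, R2, P2)]:
    assert pow(g, s, p) == R * pow(P, e, p) % p
```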

Note that, instead of (a response \(s_i\) per generator) as in the previous section, we now have **one** response for multiple generators together.

So, let's try to apply the idea of adaptor to this structure in the most natural way. We'll focus on a case of only two generators \(G_1, G_2\) and two points \(P_1, P_2\) that we are claiming have the same discrete log:

- Generate \(k\) at random from \(\mathbb{F}\)
- Generate \(t\) at random from \(\mathbb{F}\)
- Calculate \(R_1, R_2\) from \(R_i = k G_i\), *and* calculate \(T_i = t G_i\)
- The hash challenge must be different: \(e = \mathbb{H}((R+T)||(P)||m)\) where, note, \((R+T)\) means a tuple of individual sums: \(R_1 + T_1||R_2+T_2||\ldots\) in the general case; here only the first two terms exist.
- The adaptor is now one scalar: \(s' = k + ex\).
- Adaptor verification: \(s' G_i \stackrel{?}{=} R_i + e P_i \ \forall i\).

Consider that last step; does passing of each of these verification equations for the "adaptor" prove that *if the full signature \(s\) is revealed,* the DLEQ statement (that \(P_1 = xG_1\) and \(P_2 = xG_2\)) will be proven true?

**No**: suppose your goal is to forge this proof, in the specific sense of starting with \(P_1=xG_1\) and then just choosing \(P_2\) as a random curve point. The goal is to create an \(s'\) which verifies, without the later revelation of \(s\) requiring that the DLEQ statement actually be true.

To do this we follow these steps:

- Choose \(k, t \stackrel{\$}{\leftarrow} \mathbb{F}\) and \(Q \stackrel{\$}{\leftarrow} \mathbb{G}\)
- set \(e = \mathbb{H}(R_1 + T_1||Q||P_1||P_2||m)\)
- Calculate \(s' = k + ex\)
- Calculate \(R_2 = s'G_2 - eP_2\)
- Calculate \(T_2 = Q - R_2\)

After this process, both the first (trivially, since it was performed honestly) and the second verification steps will pass for the adaptor. If the dishonest prover then completes the adaptor, revealing \(s = s' + t\), this \(s\) will "half-pass", in the sense that \(R_1 + tG_1\) will match the first part of the hash serialization, but it will fail if the verifier checks against \(G_2\): the check \(sG_2 \stackrel{?}{=} R_2 + T_2 + eP_2\) fails because \(T_2 \ne tG_2\). **This matters because we are looking for an onchain verification mechanism; if the first half (in which \(G_1\) could be the secp256k1 generator) verifies successfully but the DLEQ statement is false, it is not useful.**
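The failure of this naive DLEQ adaptor can be demonstrated in the toy group (assumed parameters as before): the forged adaptor passes both verification equations, but the completed \(s = s' + t\) fails the \(G_2\)-side check.

```python
import hashlib
import random

# Toy Schnorr group (p = 2q + 1), multiplicative notation; illustrative only.
p, q = 1019, 509
G1, G2 = 4, 9

def H(*items):
    data = b"||".join(str(i).encode() for i in items)
    return int.from_bytes(hashlib.sha256(data).digest(), "big") % q

x = random.randrange(1, q)
P1 = pow(G1, x, p)                       # honest: P1 = x*G1
P2 = pow(G2, random.randrange(1, q), p)  # dishonest: random point, DLEQ false

while True:
    k, t = random.randrange(q), random.randrange(q)
    R1, T1 = pow(G1, k, p), pow(G1, t, p)
    Q = pow(G1, random.randrange(q), p)  # Q <-$- group, plays R2 + T2
    e = H(R1 * T1 % p, Q, P1, P2, "msg")
    s_adapt = (k + e * x) % q
    R2 = pow(G2, s_adapt, p) * pow(P2, (q - e) % q, p) % p  # R2 = s'G2 - eP2
    T2 = Q * pow(R2, p - 2, p) % p                          # T2 = Q - R2
    if T2 != pow(G2, t, p):
        break  # overwhelmingly the first sample; guard keeps the demo exact

# both adaptor verification equations pass ...
assert pow(G1, s_adapt, p) == R1 * pow(P1, e, p) % p
assert pow(G2, s_adapt, p) == R2 * pow(P2, e, p) % p

# ... but the completion s = s' + t only "half-passes": the G1 side
# verifies against the nonce point R1 + T1, while the G2 side fails,
# because T2 != t*G2
s = (s_adapt + t) % q
assert pow(G1, s, p) == (R1 * T1 % p) * pow(P1, e, p) % p
assert pow(G2, s, p) != (R2 * T2 % p) * pow(P2, e, p) % p
```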

*Minor addendum: a natural approach, and one I've considered, is appending a DLEQ proof that \(T_1, T_2\) have the same preimage; this approach seems a little dubious and complex, though; even ignoring the amusingly recursive idea of proving DLEQ by proving another DLEQ. I think the next section describes a better solution.*

### The half-adaptor for DLEQ

The process is failing because we have given the prover too much "wiggle room": they have two free variables in the two nonce fields, but only one adaptor variable. The natural solution is to limit to only one free variable, so that the hash serialization becomes:

\(e = \mathbb{H}(R_1 + T||R_2||P_1||P_2||m)\)

In other words, the adaptor is only "malleating" the first half of the two-curve-point DLEQ proof. In this case we maintain the delicate balance: the adaptor value \(s'\) *will* prove knowledge of the secret for the second generator \(G_2\), but it will *not* prove discrete log equivalence; only the completed \(s\) will do that. We first give the construction, then the (by now familiar!) forgery argument for the second half of the previous sentence.

- Generate \(k\) at random from \(\mathbb{F}\)
- Generate \(t\) at random from \(\mathbb{F}\)
- Calculate \(R_1, R_2\) from \(R_i = k G_i\), *and* calculate \(T = t G_1\)
- Calculate \(e = \mathbb{H}(R_1 + T||R_2||P_1||P_2||m)\)
- The adaptor: \(s' = k + ex\).
- Adaptor verification: \(s' G_i \stackrel{?}{=} R_i + e P_i \ \forall i\).
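A toy sketch of the honest half-adaptor, in the same assumed multiplicative-group parameters used throughout: both verification equations hold for \(s'\), and completion verifies against \(R_1 + T\) on the \(G_1\) side, revealing \(t\).

```python
import hashlib
import random

# Toy Schnorr group (p = 2q + 1), multiplicative notation; illustrative only.
p, q = 1019, 509
G1, G2 = 4, 9

def H(*items):
    data = b"||".join(str(i).encode() for i in items)
    return int.from_bytes(hashlib.sha256(data).digest(), "big") % q

x = random.randrange(1, q)
P1, P2 = pow(G1, x, p), pow(G2, x, p)  # honest DLEQ pair

k = random.randrange(q)
t = random.randrange(q)
R1, R2 = pow(G1, k, p), pow(G2, k, p)
T = pow(G1, t, p)                      # only ONE adaptor point, on G1

# the challenge offsets only the G1 commitment: H(R1+T || R2 || ...)
e = H(R1 * T % p, R2, P1, P2, "msg")
s_adapt = (k + e * x) % q              # the half-adaptor s'

# adaptor verification on BOTH bases; on G2 this is already a full
# Schnorr-style verification (proof of knowledge of x for P2)
assert pow(G1, s_adapt, p) == R1 * pow(P1, e, p) % p
assert pow(G2, s_adapt, p) == R2 * pow(P2, e, p) % p

# completion: s = s' + t verifies against R1 + T, and reveals t
s = (s_adapt + t) % q
assert pow(G1, s, p) == (R1 * T % p) * pow(P1, e, p) % p
assert (s - s_adapt) % q == t
```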

Notice that this "half-adaptor" is verified on *both* bases, which may seem strange: what's actually happening is that, for base \(G_2\), because of no \(T\) offset, we are fully verifying proof of knowledge at this step; for \(G_1\), there is not a proof of knowledge as per previous arguments. Concretely, we can forge this without actual DLEQ of \(P_1, P_2\) as follows:

- Choose \(x\) and set \(P_2 = xG_2\) but choose \(P_1 \stackrel{\$}{\leftarrow} \mathbb{G}\)
- Choose \(k \stackrel{\$}{\leftarrow} \mathbb{F}\) and \(Q \stackrel{\$}{\leftarrow} \mathbb{G}\); set \(R_2 = kG_2\)
- set \(e = \mathbb{H}(Q||R_2||P_1||P_2||m)\)
- Calculate \(s' = k + ex\) (note how this is really a signature on \(P_2\))
- Calculate \(R_1 = s'G_1 - eP_1\)
- Calculate \(T = Q - R_1\)
- Both adaptor verifications \(s'G_1 \stackrel{?}{=} R_1 + eP_1\) and \(s'G_2 \stackrel{?}{=} R_2 + eP_2\) will pass.

However, if we wait for the prover to produce \(s = s' + t\), then, because the hash fixes \(R_1 + T\), they do indeed prove that \(x\) is also the discrete log of \(P_1\), i.e. that DLEQ holds.
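Both halves of this claim can be checked in the toy group (assumed parameters): the forged \(s'\) passes both adaptor verifications despite \(P_1\) being a random point, but the completion \(s = s' + t\) cannot satisfy the \(G_1\)-side check against \(R_1 + T\).

```python
import hashlib
import random

# Toy Schnorr group (p = 2q + 1), multiplicative notation; illustrative only.
p, q = 1019, 509
G1, G2 = 4, 9

def H(*items):
    data = b"||".join(str(i).encode() for i in items)
    return int.from_bytes(hashlib.sha256(data).digest(), "big") % q

x = random.randrange(1, q)
P2 = pow(G2, x, p)                       # honest only on the G2 side
P1 = pow(G1, random.randrange(1, q), p)  # random point: DLEQ is FALSE

while True:
    k, t = random.randrange(q), random.randrange(q)
    R2 = pow(G2, k, p)
    Q = pow(G1, random.randrange(q), p)  # Q <-$- group, plays R1 + T
    e = H(Q, R2, P1, P2, "msg")
    s_adapt = (k + e * x) % q            # really a signature on P2
    R1 = pow(G1, s_adapt, p) * pow(P1, (q - e) % q, p) % p  # R1 = s'G1 - eP1
    T = Q * pow(R1, p - 2, p) % p                           # T = Q - R1
    if T != pow(G1, t, p):
        break  # overwhelmingly the first sample; guard keeps the demo exact

# both adaptor verifications pass, although DLEQ does not hold
assert pow(G1, s_adapt, p) == R1 * pow(P1, e, p) % p
assert pow(G2, s_adapt, p) == R2 * pow(P2, e, p) % p

# but the forger cannot complete: s = s' + t fails the G1-side check
# against the nonce point R1 + T, since T != t*G1
s = (s_adapt + t) % q
assert pow(G1, s, p) != (R1 * T % p) * pow(P1, e, p) % p
```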

### Embedding the DLEQ adaptor into BIP340

To give a concrete usage example, we attempt to make the value \(s = s' + t\) from the previous algorithm, be an actually valid BIP340 signature.

Scenario: create a half-adaptor on the DLEQ where \(P = xG\), with \(G\) the secp256k1 generator, i.e. \(P\) is a standard Bitcoin public key, and \(Q = xJ\) with \(J\) some other (relative discrete log unknown) generator.

It's important to remember that the SIDP/Schnorr signature security argument requires that the ephemeral nonce curve point \(R\) be *in* the hash challenge \(e\); but it doesn't matter *where* it is. For a specification like BIP340, it of course does matter where it is, so that interacting agents can reach the same conclusion about validity.

Given this perspective, it's easy to understand what we must do. Since the *message* portion of the hash challenge in BIP340 is actually a transaction serialization, we just need to include the \(R\)-value for \(G\) in the standard position, and the \(R\)-value(s) for the other generator(s) **inside the transaction message**. This can be achieved with e.g. an unspendable output in the transaction, or otherwise embedding in custom scripts. The details are glossed over here; it's clearly *possible*.

Thus we have \(e = \mathbb{H}(R_G+T_G||P||\textrm{txmsg})\) with txmsg *embedding* \(R_J, Q\) (note - it may be possible to omit \(Q\) in certain protocols); with this slight alteration, we can see that the protocol of the previous part can be followed to generate an \(s'\) that the verifier can verify. Once received, the verifier knows that if a valid BIP340 signature \(s\) for this pubkey \(P\) and the transaction defined by txmsg is published, they will receive proof that the discrete logs of \(P\) and \(Q\) are equal, *atomically with* revelation of the adaptor secret \(t\).

I'm not yet sure about extending this from 2 to \(n\) bases ... at minimum it is clearly possible to aggregate multiple individual proofs in pairs to get the same effect, but it warrants more investigation.

### Forgeability of this construction; DLEQ is only proved on publishing

See the above for why this is true. Our goal was specifically that a bitcoin payment or state transition be locked behind proof of the statement of DLEQ, i.e. proof that the discrete logs of two curve points \(P\) and \(Q\) with respect to bases \(G\) and \(J\) are the same. In this specific construction, the pre-sent adaptor \(s'\) already proves knowledge of the discrete log of \(Q\), but as per the above argument, only when the full signature \(s\) is published, at the same time atomically revealing the adaptor secret \(t\), does the prover successfully prove the discrete log equivalence statement.

### Conclusion

It isn't yet clear to me to what extent these constructions could be useful in Bitcoin. Having the *proof* of a statement over arbitrary bases/generators be able to "unlock" a payment has enormous potential applications, but one usually imagines the verification occurring directly onchain. A construction like the above fits that definition, but only just! The fact that the preparatory phase involves convincing a second party of the conditionality - that, **given this preparatory argument**, if I claim the money I prove the statement - makes it a clearly weaker concept, specifically because it involves interactivity.

Finally, there is potential that this can be applied to proofs of representation or even generalised sigma protocols, but it is far from clear.