[Cryptography] Quillon Graph: A private, post-quantum electronic cash system

Viktor S. Kristensen overdrevetfedmetodologi at pm.me
Mon Jan 12 00:21:57 EST 2026



Peter,

  Scenario D is a fair concern, and honestly it worries me more than "secret quantum superpowers".

  You're right that new cryptography introduces fresh implementation risk, new side-channels, new libraries, and new opportunities for failure. If the threat we're preparing for never materializes, then we've paid real complexity costs for nothing — and possibly created vulnerabilities that wouldn't otherwise exist. That is a valid engineering objection.

  Where I think the ledger case differs from the general "PQC migration" narrative is in the irreversibility of exposure.

  For TLS, messaging, VPNs, etc., your point holds strongly: if the quantum threat never arrives, PQC complexity was unnecessary; if it does arrive, we patch, rotate, and move on. The past fades.

  For an append-only public record, the past does not fade. The cryptography chosen at time of writing fixes the confidentiality horizon permanently. There is no later upgrade path for data already committed. The temporal security paper (Section 11) formalizes this in terms of harvest-now-decrypt-later (HNDL) exposure:

  Blockchain systems present the worst-case scenario for HNDL:
  1. Immutability: τ_secrecy = ∞ by design
  2. Public broadcast: Pr[adversary has data] = 1
  3. Financial value: Non-decaying utility
  4. Key exposure: Signing transactions exposes public keys

  This makes the cost of being wrong asymmetric:

  - If PQC is unnecessary → we carry extra complexity and bandwidth.
  - If PQC is necessary and we didn't deploy it → irreversible retrospective disclosure.

  Section 10 ("Decision-Theoretic Analysis") quantifies this asymmetry:

  Theorem 10.1 (Preparation Dominance). For L ≫ c and any prior π = Pr[θ = 1] > c/L, preparation is decision-theoretically dominant.

  For financial systems with L/c ~ 10⁶, any π > 10⁻⁶ justifies preparation.

  The payoff matrix is stark:
  ┌─────────────┬─────────────────────┬──────────────────────┐
  │             │   θ = 0 (No CRQC)   │ θ = 1 (CRQC arrives) │
  ├─────────────┼─────────────────────┼──────────────────────┤
  │ a = wait    │ 0                   │ −L (catastrophic)    │
  ├─────────────┼─────────────────────┼──────────────────────┤
  │ a = prepare │ −c (migration cost) │ 0                    │
  └─────────────┴─────────────────────┴──────────────────────┘
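  A short numerical sketch (Python, standard library only) makes the break-even prior concrete. The L and c figures below are illustrative assumptions chosen only to reproduce the L/c ~ 10⁶ ratio; the structure of the payoffs is exactly the matrix above.

  # Expected payoffs from the Section 10 matrix; the concrete numbers are illustrative.
  def expected_payoff(action: str, pi: float, L: float, c: float) -> float:
      """Expected payoff under prior pi = Pr[theta = 1] (a CRQC arrives)."""
      if action == "wait":
          return pi * (-L)        # catastrophic loss L only if a CRQC arrives
      if action == "prepare":
          return (1 - pi) * (-c)  # migration cost c, wasted only if no CRQC ever arrives
      raise ValueError(f"unknown action: {action}")

  # Preparation wins whenever (1 - pi) * c < pi * L, i.e. pi > c / (L + c),
  # which is approximately c / L when L >> c.
  L, c = 1e9, 1e3                 # illustrative units giving L/c ~ 10^6
  print(f"break-even prior: {c / (L + c):.2e}")      # ~ 1.0e-06

  for pi in (1e-7, 1e-6, 1e-3):
      best = max(("wait", "prepare"), key=lambda a: expected_payoff(a, pi, L, c))
      print(f"pi = {pi:.0e}: best action = {best}")
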
  That asymmetry is the entire motivation. It's not "quantum hype", it's "no second chance". Section 9 frames this as the Hellman Imperative:

  Axiom 9.1 (Hellman's Responsibility Principle).
  1. Waiting for certainty is itself a decision—often the wrong one
  2. The cost of preparation is bounded; the cost of failure may be unbounded

  On disinformation: I agree completely that focusing only on speculative future threats while ignoring current attack surfaces would be bad engineering. That's why the design goals have been:

  - Keep the consensus layer conventional and auditable.
  - Keep cryptographic composition simple (KDF(classical || PQ)).
  - Avoid novel primitives outside NIST-selected or widely deployed classical ones.
  - Treat PQC as additive, not substitutive.

  Section 12.1 ("Hybrid Classical-Postquantum Construction") formalizes this as defense in depth:

  Definition 12.1 (Defense in Depth). Security requires breaking both classical and post-quantum components:

  Security = Sec(Classical) ∨ Sec(Post-Quantum)

  Concrete instantiation (Q-NarwhalKnight):
  - Symmetric: XChaCha20-Poly1305 (256-bit)
  - KEM: ML-KEM-1024 (Kyber, lattice-based)
  - Signatures: ML-DSA-87 (Dilithium5, lattice-based)
  - Hash: BLAKE3 / SHA-3 (quantum-resistant)
  - Key derivation: HKDF-SHA3-256

  In other words: we add one new assumption rather than replacing the existing ones.
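
  To make the KDF(classical || PQ) combiner concrete, here is a minimal sketch using only the Python standard library. The HKDF-SHA3-256 routine follows RFC 5869; the two shared secrets are stand-ins for what would, in the real construction, come from the classical key exchange and an ML-KEM-1024 decapsulation, and the salt/info labels are hypothetical. Treat it as an illustration of the composition, not as Q-NarwhalKnight's actual key schedule.

  import hashlib, hmac, os

  def hkdf_sha3_256(ikm: bytes, salt: bytes, info: bytes, length: int = 32) -> bytes:
      """HKDF (RFC 5869) instantiated with SHA3-256."""
      # Extract: PRK = HMAC-Hash(salt, IKM)
      prk = hmac.new(salt, ikm, hashlib.sha3_256).digest()
      # Expand: T(1) || T(2) || ... until `length` bytes have been produced
      okm, block, counter = b"", b"", 1
      while len(okm) < length:
          block = hmac.new(prk, block + info + bytes([counter]), hashlib.sha3_256).digest()
          okm += block
          counter += 1
      return okm[:length]

  # Stand-ins for the two shared secrets. In the real construction these would
  # come from the classical key exchange and from ML-KEM-1024 decapsulation.
  classical_ss = os.urandom(32)
  pq_ss = os.urandom(32)

  # Hybrid combiner: the derived key depends on both inputs, so recovering it
  # requires recovering both the classical and the post-quantum shared secret.
  session_key = hkdf_sha3_256(
      ikm=classical_ss + pq_ss,
      salt=b"",                              # a real protocol would bind a transcript hash here
      info=b"quillon/hybrid-session-key",    # hypothetical context label
      length=32,                             # 256-bit key for XChaCha20-Poly1305
  )
  print(session_key.hex())

  The derived key collapses to classical security if the lattice assumptions fail, and to ML-KEM security if the classical exchange fails; that is the whole content of Definition 12.1.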

  If PQC turns out to be a long-term distraction (your Scenario D), the system remains secure under classical assumptions — the XChaCha20 layer doesn't disappear because Kyber was unnecessary. If a CRQC arrives, the system avoids the "no remediation possible" trap. Section 12.2 acknowledges the residual uncertainty explicitly:

  Per Shannon: this is not perfect secrecy. It remains vulnerable to sufficiently advanced future capability.
  Per Witten: the mathematical foundations are unproven. Novel algorithms could emerge.
  Per Hellman: despite uncertainty, this is the responsible action.

  Your scenario — that PQC is a distraction manufactured to introduce complexity vulnerabilities — would require the hybrid approach to be worse than classical-only. But hybrid means an attacker must break both layers, not either. The attack surface for the classical layer remains unchanged; the PQC layer is purely additive. Unless NIST PQ algorithms contain active backdoors (not just weaknesses), Scenario D doesn't produce a worse security posture than doing nothing.
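
  In inequality form (treating the two layers as independent only for the sake of the bound; correlated implementation bugs are of course possible, which is exactly your complexity concern):

  Pr[hybrid broken] = Pr[classical broken] · Pr[PQ broken] ≤ Pr[classical broken]

  So as long as the classical layer itself is left untouched, adding the PQ layer cannot raise the probability of a confidentiality break; the residual risk lives in the implementation surface, not in the mathematics.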

  That tradeoff may still be wrong — but at least it is a concrete, testable engineering choice, not a claim about the inevitability of quantum computers.

  As always, I appreciate the skepticism. It forces the argument to stay grounded in operational reality rather than futurism.

  Best regards,
  Viktor



Sent with Proton Mail secure email.