[Cryptography] How to De-Bollocks Cryptography?
iang
iang at iang.org
Fri Sep 27 16:17:21 EDT 2024
Serialising madness seems to be a permapolemic. (Top-posting because this
is a general comment, not a specific reply.)
Unlike PHB I've only worked with maybe half a dozen schemes, and the one I
stick with is both the simplest and the oddest. It works like this:
1. each object is responsible for serialising and recovering itself - in
code.
2. each object must export a serialise-to-stream and a
recover-from-stream method.
3. The streams themselves expose methods to handle most types, but
almost everything is done in varints and byte arrays.
4. The objects also expose two methods for testing: example() and
equals(). A simple ouroboros test calls example -> serialise -> recover
-> equals a million times.
That's it. It's all done directly in code, and while there is a single line
of code per variable in each of the four methods, it's otherwise about as
much work as using an off-the-shelf serialiser. Advantages include: no
supply-chain dependency, no arbitrary changes, no extra system or language
to learn, and the ability to migrate objects dynamically through versions.
iang
ps: for those who wish to follow the rabbit down the hole
http://www.webfunds.org/guide/ouroboros.pdf
On 30/08/2024 18:16, Phillip Hallam-Baker wrote:
>
>
> On Fri, Aug 30, 2024 at 3:24 AM Chris Frey <cdfrey at foursquare.net> wrote:
>
> On Mon, Aug 12, 2024 at 11:52:53PM -0400, Phillip Hallam-Baker wrote:
> > The complexity in SAML came from using XML as a base and we did that
> > because, well ASN.1. ASN.1 was a very good idea that got ruined
> > because of one half baked proposal, DER encoding which is utterly
> > unnecessary and a bad way to achieve the intended outcome.
> > Canonicalization is just as bad in XML.
>
> I found this comment perplexing, considering this article from
> Phrack magazine, issue #70:
>
> http://phrack.org/issues/70/12.html#article
>
>
> Oh, yes, Phrack, the unrivaled source of security implementation
> expertise.
>
> The reason we have DER encoding is the insane notion that people are
> going to store X.509 certs in X.500 directories as a set of
> disassembled attributes and fields rather than keep the binary cert
> around.
>
> It is a fact that VeriSign issued BER certificates for many years
> before people noticed and nothing bad happened. According to X.509v3,
> the extensions are a set and in DER encoding require the encoder to
> sort them. An utter piffling waste of time and effort for absolutely
> no security or interoperability value.
>
> Worse still, DER uses definite-length encodings, which are a security
> risk in parsers for a start. And since the length of the length field
> itself depends on the length, nested encodings force you to encode
> structures backwards to do it efficiently. Even then there is a penalty.
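To make the definite-length complaint concrete, here is a toy Java encoder
(not a real DER library, and not from the thread): the outer header cannot
be emitted until everything inside it has been encoded, because the length
field, and the length of that field, depend on the total size of the
contents.

    import java.io.ByteArrayOutputStream;
    import java.io.IOException;

    // Toy illustration of DER's definite-length TLV nesting, not a real library.
    class DerSketch {
        // Emit tag || definite length || contents.
        static byte[] tlv(int tag, byte[] contents) throws IOException {
            ByteArrayOutputStream out = new ByteArrayOutputStream();
            out.write(tag);
            int len = contents.length;
            if (len < 0x80) {                 // short form
                out.write(len);
            } else {                          // long form: length-of-length, then length
                int n = (32 - Integer.numberOfLeadingZeros(len) + 7) / 8;
                out.write(0x80 | n);
                for (int i = n - 1; i >= 0; i--) out.write((len >> (8 * i)) & 0xFF);
            }
            out.write(contents);
            return out.toByteArray();
        }

        public static void main(String[] args) throws IOException {
            byte[] inner = tlv(0x04, new byte[200]);   // OCTET STRING, 200 bytes
            // The inner value had to be fully built before the outer header
            // could be written: the SEQUENCE length (and how many bytes that
            // length needs) depends on the inner encoding's total size.
            byte[] outer = tlv(0x30, inner);           // SEQUENCE { OCTET STRING }
            System.out.printf("inner %d bytes, outer %d bytes%n", inner.length, outer.length);
        }
    }

A streaming encoder therefore either buffers every nesting level (as this
sketch does) or works back to front, which is the penalty being described.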
>
> I am not sure if I count as an expert as I only have 35 years
> experience designing and implementing encodings. I have done
> serializers and deserializers for all the commonly used schemes except
> for CBOR which I refuse to touch on principle. I have done four for
> ASN.1 alone, all of them DER. And in my non-expert opinion, DER stinks
> and is the very worst of the lot, even worse than SGML before we fixed
> the worst of that pile of pus. DER is bad security wise and bad
> efficiency wise. It is just bad.
>
> The new ASN.1 schema revisions are even worse. I don't use ASN.1
> schema. It is incomprehensible.
>
>
> People get serialization wrong by thinking they need to do fancy
> schema stuff. You really don't. JSON got that part right. But the
> various attempts at doing a schema get it wrong, wrong, wrong with a
> side order of wrong sauce.
>
> No, you don't have to specify the order in which fields are emitted or
> how long lists of sub items should be or the range of integers or any
> of the stuff people put into serializations. All you need to know is
> enough information to map the internal structures of the program in a
> modern language (C#, Java, Rust, Python) to a stream of bits and then
> convert those bits back to data structures.
>
> Canonicalization and data validation are bogus; they do not have a
> proper place in the serialization schema because it will never be
> expressive enough. So what happens is that people rely on the
> deserializer to do the data validation, when it can at best do a
> mono-buttocked job of it.
>
> If the schema is giving six options to serialize the same data
> structure, it is a bad schema format.
>
> Much better to do the data validation explicitly. Not least because
> the most important issue for validation is usually size of the package
> and that is almost never specified.
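A minimal sketch of the "validate explicitly" approach argued for above,
in Java, with an assumed size cap and assumed field rules (neither is from
the thread): reject oversized packages before parsing, then apply the
application's own rules after deserialising.

    import java.io.*;

    // Sketch: explicit validation outside the serialiser. Limits are illustrative.
    class ValidateSketch {
        static final int MAX_MESSAGE = 64 * 1024;   // assumed cap, not from the thread

        record Order(long quantity, String symbol) {}

        static Order parse(byte[] wire) throws IOException {
            // Size check first -- before the parser ever touches the bytes.
            if (wire.length > MAX_MESSAGE)
                throw new IOException("rejected before parsing: " + wire.length + " bytes");
            DataInputStream in = new DataInputStream(new ByteArrayInputStream(wire));
            Order o = new Order(in.readLong(), in.readUTF());
            // Application-level rules the schema could never express.
            if (o.quantity() <= 0 || o.quantity() > 1_000_000)
                throw new IOException("quantity out of range");
            if (!o.symbol().matches("[A-Z]{1,8}"))
                throw new IOException("bad symbol");
            return o;
        }
    }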
>
>
> The reason I like JSON is because it gets rid of the cruft. The only
> thing bad about it is that it lacks binary. But what we needed was
> extensions to JSON that allowed a file to incorporate binary chunks of
> data. The CBOR folk went off and designed a scheme based on a
> different data model which is not compatible. And they told those of
> us that wanted JSON extensions to go away, our input was not wanted
> and they were not obliged to listen.
>
> PHB
>
> _______________________________________________
> The cryptography mailing list
> cryptography at metzdowd.com
> https://www.metzdowd.com/mailman/listinfo/cryptography