[Cryptography] Use of RDRAND in Haskell's TLS RNG?
Viktor Dukhovni
cryptography at dukhovni.org
Mon Nov 21 03:18:41 EST 2016
> On Nov 21, 2016, at 2:12 AM, John Gilmore <gnu at toad.com> wrote:
>
>> Anyone else care to comment on the wisdom or folly of RDRAND as a
>> principal (sole) seeding mechanism for a TLS stack?
>
> Folly.
OK, got the message.
> PS: Isn't Haskell a portable language? There are plenty of systems
> that run on systems other than modern x86 chips. How could it depend
> on RDRAND and remain portable?
The RDRAND support in the Haskell cryptography package is conditional
on the compilation environment of that module. While Haskell code is
generally portable, I/O libraries can provide system-dependent features
(sometimes by linking with suitable external C code). Since entropy
sources are definitely I/O, their implementation is platform-dependent.
So Haskell's TLS stack uses RDRAND for entropy only on CPUs on which it
detects RDRAND support. Both of my laptops sport RDRAND-capable CPUs.
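To make the shape of that concrete, here's a hedged sketch (not the
actual cryptonite/tls source; the HAVE_RDRAND flag and rdrandBytes
binding are assumed names) of an entropy routine that picks its backend
at compile time via CPP, falling back to /dev/urandom elsewhere:

```haskell
{-# LANGUAGE CPP #-}
-- Hypothetical sketch only: backend selection at compile time.
import qualified Data.ByteString as B
import System.IO (IOMode (ReadMode), withBinaryFile)

#if defined(HAVE_RDRAND)
-- On an RDRAND-capable x86 build, one would link a small C shim
-- that loops on the RDRAND instruction; assumed binding:
foreign import ccall unsafe "rdrand_bytes"
  rdrandBytes :: Int -> IO B.ByteString
#endif

getEntropy :: Int -> IO B.ByteString
#if defined(HAVE_RDRAND)
getEntropy = rdrandBytes          -- assumed FFI binding, see above
#else
-- Portable Unix fallback: read n bytes from /dev/urandom.
getEntropy n = withBinaryFile "/dev/urandom" ReadMode (\h -> B.hGet h n)
#endif
```

Whether HAVE_RDRAND gets defined is then a build-system question
(CPU-feature detection at configure time), not a source-code one.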
These days it probably makes sense to implement backends for the various
new Unix entropy APIs:
https://en.wikipedia.org/wiki/Entropy-supplying_system_calls
when available. Anyone care to volunteer a patch?
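For anyone tempted, a minimal sketch of one such backend: a direct FFI
binding to Linux's getrandom(2) (glibc >= 2.25). Error and EINTR
handling here is deliberately minimal; a real patch would retry short
reads and map errno properly.

```haskell
{-# LANGUAGE ForeignFunctionInterface #-}
-- Sketch of a getrandom(2) entropy backend for Linux.
import qualified Data.ByteString as B
import Data.Word (Word8)
import Foreign.C.Types (CSize (..), CUInt (..))
import Foreign.Marshal.Alloc (allocaBytes)
import Foreign.Ptr (Ptr, castPtr)
import System.Posix.Types (CSsize (..))

-- ssize_t getrandom(void *buf, size_t buflen, unsigned int flags);
foreign import ccall unsafe "sys/random.h getrandom"
  c_getrandom :: Ptr Word8 -> CSize -> CUInt -> IO CSsize

getRandomBytes :: Int -> IO B.ByteString
getRandomBytes n =
  allocaBytes n $ \buf -> do
    -- flags = 0: draw from urandom pool, blocking only until seeded.
    r <- c_getrandom buf (fromIntegral n) 0
    if fromIntegral r == n
      then B.packCStringLen (castPtr buf, n)
      else ioError (userError "getrandom: short or failed read")
```

The other platforms (getentropy on OpenBSD/macOS, arc4random_buf, the
BSD sysctl) would be further CPP-guarded variants of the same idea.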
I'm working on DANE support for the Haskell X509 chain validation code,
where I feel I am not out of my depth. If someone else cares to contribute
patches that improve the Unix entropy backend, that'd be just swell.
Relevant upthread messages are:
http://www.metzdowd.com/pipermail/cryptography/2016-November/030859.html
http://www.metzdowd.com/pipermail/cryptography/2016-November/030864.html
--
Viktor.