compressing randomly-generated numbers

Jeremy Hansen jhansen at rairtech.com
Thu Aug 10 21:36:25 EDT 2006


> I was mulling over some old emails about randomly-generated 
> numbers and realized that if I had an imperfectly random 
> source (something less than 100% unpredictable), that 
> compressing the output would compress it to the point where 
> it was nearly so.  Would there be any reason to choose one 
> algorithm over another for this application?

I see where you're coming from, but take an imperfectly random source
and apply a deterministic function to it, and, if I recall correctly,
you still have an imperfectly random output. It would be better to use
something like Von Neumann's unbiasing algorithm (or any of the newer
improvements) to strip out the non-randomness.
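
In case it's useful, here's a toy Python sketch of the Von Neumann
trick (my own throwaway code, nothing standard): take the input bits in
non-overlapping pairs, keep the first bit of every 01 or 10 pair, and
drop 00 and 11. Note the assumption: the input bits must be
independent -- this removes bias, not correlation.

    def von_neumann_unbias(bits):
        # Pair up the bits; emit the first bit of each 01/10 pair,
        # discard 00 and 11.  With independent (even if biased) input
        # bits, the surviving bits are unbiased.
        out = []
        for i in range(0, len(bits) - 1, 2):
            a, b = bits[i], bits[i + 1]
            if a != b:
                out.append(a)
        return out

    # Example: a coin that comes up 1 with probability 0.7
    # import random
    # biased = [int(random.random() < 0.7) for _ in range(100000)]
    # unbiased = von_neumann_unbias(biased)
    # print(sum(unbiased) / len(unbiased))   # close to 0.5

The price is throughput: the more biased the source, the more pairs get
thrown away.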
 
> I recall talking to a CS prof once who said that LZW 
> compression was "optimal", which seemed and still seems 
> really odd to me because optimal compression would generate a 
> null output file.  So surely he meant asymptotically optimal, 
> or e-close to optimal, or something like that... anyone know?

He probably meant optimal in the information theoretic sense. If that
was the case, then no, optimal compression will not yield a null-length
output -- it will give you the shortest output that still distinguishes
your input from every other possible input, which for a source that is
already close to random is not much shorter than the input itself. Or
maybe he didn't ;)
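
To make that concrete: an optimal code needs about H bits per symbol,
where H is the source's Shannon entropy, so the best achievable output
length is roughly H times the input length, not zero. A toy Python
sketch (again my own, illustrative only):

    import math
    from collections import Counter

    def entropy_bits_per_symbol(data):
        # Empirical Shannon entropy of the symbol distribution in data.
        counts = Counter(data)
        n = len(data)
        return -sum((c / n) * math.log2(c / n) for c in counts.values())

    # Example (commented out): a two-symbol source with about 1 bit of
    # entropy per byte; compare the bound with what zlib actually gets.
    # import random, zlib
    # data = bytes(random.choice(b"ab") for _ in range(100000))
    # print(entropy_bits_per_symbol(data) * len(data) / 8)  # bound, bytes
    # print(len(zlib.compress(data, 9)))                    # zlib output size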

Regards,
Jeremy Hansen, MS, CISSP
Director of Security
RAIR Technologies

Any views or opinions presented in this email are solely those of the
author and do not necessarily represent those of RAIR Technologies,
Inc. The individual author will be personally liable for any damages or
other liability arising from this email.
RAIR Technologies * Brookfield, WI * 262-780-6000

