[Cryptography] pseudo-homomorphic encryption ??

Henry Baker hbaker1 at pipeline.com
Mon Jan 14 13:01:38 EST 2019


Here's a real-world problem that just came to
light as a result of a recent Ring.com (now
part of Amazon) screwup:

You have some sort of IoT (Internet of Things)
sensor -- perhaps a camera -- and you want to
train an AI/machine learning algorithm to
recognize something that is exposed to the
camera.

But perhaps you don't trust the AI/ML developer.

So you send him/her only an encrypted dataset
along with the classification data (yes/no, or
perhaps a label from some finite set); this
classification data isn't encrypted, and ideally
there isn't any easy way to recover useful
information about the encrypted dataset from
the sequence of classifications alone.
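As a toy illustration of that handoff (all names
here are my own, and a fixed secret pixel
permutation stands in for a real cipher), the
data owner might "encrypt" each image before
shipping it, leaving the labels in the clear:

```python
import numpy as np

def make_key(n_pixels, seed=0):
    """Secret key: a fixed random permutation of pixel positions."""
    rng = np.random.default_rng(seed)
    return rng.permutation(n_pixels)

def encrypt(images, key):
    """'Encrypt' each flattened image by permuting its pixels.
    (A stand-in for a real cipher; it leaks pixel-value statistics.)"""
    return images[:, key]

def decrypt(images_enc, key):
    """Invert the permutation to recover the plaintext images."""
    inv = np.argsort(key)
    return images_enc[:, inv]

# The data owner ships (encrypted images, plaintext labels).
images = np.arange(12.0).reshape(3, 4)   # 3 tiny "images", 4 pixels each
labels = np.array([0, 1, 1])             # classification data, unencrypted
key = make_key(images.shape[1], seed=42)
shipped = (encrypt(images, key), labels)
```

A real scheme would need to hide far more than
a permutation does, but it shows the shape of
the protocol: only the owner holds the key, and
the developer sees ciphertext plus labels.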

So far as I know, homomorphic encryption hasn't
matured to the point where the entire training
process could operate on homomorphically encrypted
data.

But we're not talking here about completely
generic calculations -- we're talking about
quite limited calculations, just in enormous
quantities (10^18 calculations).

Perhaps there are "homomorphic" encryption
systems that do *just enough* and AI/ML systems
that are dumbed down *just enough* that the
two constraints can meet in the middle.

After all, AI/ML systems don't seem to care
about most kinds of image distortions, so
perhaps they could still characterize certain
pictures even after encryption?
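A minimal sketch of that intuition (my own toy
example, not a real system): for a linear model,
a fixed pixel permutation merely relabels the
features, so training on the "encrypted" images
yields the same fitted predictions as training
in the clear:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy dataset: 200 "images" of 16 pixels, labels from a hidden linear rule.
X = rng.normal(size=(200, 16))
y = (X @ rng.normal(size=16) > 0).astype(float)

# "Encrypt" with a fixed secret pixel permutation.
key = rng.permutation(16)
X_enc = X[:, key]

# Least-squares linear classifier trained in the clear vs. on ciphertext.
w_plain, *_ = np.linalg.lstsq(X, y, rcond=None)
w_enc, *_ = np.linalg.lstsq(X_enc, y, rcond=None)

pred_plain = (X @ w_plain > 0.5)
pred_enc = (X_enc @ w_enc > 0.5)
# The permutation only relabels features, so the two fits agree.
```

Of course a permutation is a very weak cipher;
the open question is whether anything stronger
still leaves enough structure for training.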

Obviously, if such things are possible, then
the encryption necessarily leaks some
information -- but that leakage might even be
useful.

The following link was suggested to me, but
I don't know enough about AI/ML to fully
appreciate it:

"Federated Learning: Collaborative Machine Learning without Centralized Training Data"

https://ai.googleblog.com/2017/04/federated-learning-collaborative.html
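For what it's worth, the core idea in that post
is federated averaging: each device trains on
its own never-shared data and sends back only a
model update, which the server averages. A toy
numpy sketch (my own simplification, not
Google's implementation):

```python
import numpy as np

def local_update(w, X, y, lr=0.1, steps=10):
    """One client's local training: a few gradient-descent steps on its
    own least-squares loss. The raw data never leaves the device."""
    for _ in range(steps):
        grad = X.T @ (X @ w - y) / len(y)
        w = w - lr * grad
    return w

def federated_round(w_global, client_data):
    """Server broadcasts the global model, then averages the updates."""
    updates = [local_update(w_global.copy(), X, y) for X, y in client_data]
    return np.mean(updates, axis=0)

rng = np.random.default_rng(1)
true_w = np.array([2.0, -1.0])
clients = []
for _ in range(3):                  # three devices with private datasets
    X = rng.normal(size=(50, 2))
    clients.append((X, X @ true_w))

w = np.zeros(2)
for _ in range(100):                # many communication rounds
    w = federated_round(w, clients)
```

Note the privacy model is different from
encryption: the data stays put, but the shared
model updates can themselves leak information
about it.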


