[Cryptography] Claims of factoring 2048-bit RSA

Ron Garret ron at flownet.com
Tue Nov 7 10:52:35 EST 2023


> On Nov 6, 2023, at 6:48 AM, Stephan Neuhaus <stephan.neuhaus at zhaw.ch> wrote:
> 
> On 11/3/23 17:34, Amarendra Godbole wrote:
>> https://www.bankinfosecurity.com/blogs/researcher-claims-to-crack-rsa-2048-quantum-computer-p-3536
>> Of course quantum computer. I am not qualified enough to comment on
>> this article and its claims, though this group has many people who
>> are.
> 
> I, too, am not an expert,

I am, and everything you say is essentially correct.

On the other hand, it was not so long ago that you would have found experts saying it would be many decades before AI technology reached current levels of performance, and you can see how that turned out.

The thing that would keep me up at night, if I were inclined to worry about this (see below), is that if someone has a real breakthrough in QC technology they are very likely to keep it a secret.  There is as yet no known reason why it should be impossible to build a QC large enough to break RSA (and ECC), and in general, for things that are not impossible, it is a question of when, not if, humans figure out a way to do them.
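(For readers wondering why a large enough QC breaks RSA specifically: Shor's algorithm only needs the quantum machine for one step, order-finding; everything else is elementary classical number theory.  Here is a rough Python sketch of that classical half -- my own illustration, not from any of the referenced articles, with the quantum subroutine replaced by a brute-force stand-in that only works for toy moduli:

    from math import gcd
    import random

    def find_order(a, n):
        # Stand-in for the quantum subroutine: the smallest r > 0 with
        # a^r = 1 (mod n).  Brute force here, so only feasible for toy n;
        # this is exactly the step a QC does in polynomial time.
        r, x = 1, a % n
        while x != 1:
            x = (x * a) % n
            r += 1
        return r

    def shor_factor(n):
        # Classical reduction: a random a whose order r is even and has
        # a^(r/2) != -1 (mod n) yields a nontrivial factor of n via gcd.
        while True:
            a = random.randrange(2, n)
            g = gcd(a, n)
            if g > 1:
                return g                    # lucky hit: a shares a factor with n
            r = find_order(a, n)
            if r % 2 == 0:
                y = pow(a, r // 2, n)
                if y != n - 1:
                    f = gcd(y - 1, n)
                    if 1 < f < n:
                        return f

    print(shor_factor(3233))                # toy RSA modulus, 61 * 53

The point is that the hard part is entirely the quantum hardware; there is no additional mathematical obstacle waiting on the classical side.)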

There are two additional things to consider.  First, it is extremely unlikely that a breakthrough will be made by an individual; it is much more likely to come from a large corporation or a state actor.  And second, if one of these entities does figure out how to build a sufficiently large QC, they will have a huge incentive to keep it secret.  The consequence is that if a breakthrough does happen, you are extremely unlikely to learn about it by reading a published paper.  Much more likely you will wake up one morning to discover that the stock market has crashed, ATMs and POS terminals have stopped working, your local bank branches are closed, and there are riots in the streets.

Personally, none of this keeps me up at night.  I see nothing on the horizon that will make a large QC possible any time soon.  I am much more worried about climate change and U.S. politics than the quantum apocalypse.  But I was also an AI expert once, and I left that field because I thought it was stagnating and I would not see significant progress in my lifetime, so my track record on predicting this sort of thing is not very good.  You should take that into account when performing your own risk assessment.

rg
