CGC Bulletin 1
Contents
- When should a PK cryptosystem be considered broken?
- Setting priority on discoveries in the field
1. When should a PK cryptosystem be considered broken?
In the paper "Thompson's group and public key cryptography" (ACNS 2005, Lecture Notes in Computer Science 3531 (2005), 151--164), Shpilrain and Ushakov suggest a new PKC based on Thompson's group, together with concrete parameters.
In the Bochum Workshop, we announced a cryptanalysis of this PKC; see http://homepage.ruhr-uni-bochum.de/Arkadius.Kalka/workshop05/articles/researchannouncement.pdf. We reported there a 0.7% success rate in finding the secret keys of this PKC.
Cramer suggests that Shpilrain's iteration solution to such problems (consult Shpilrain's papers) is not satisfactory, since history shows that success rates of attacks usually improve considerably as time passes, and having 0.7% today could mean 99% within several years. So, iterating the protocol enough times to make it secure could make it impractical.
Shpilrain replies: "I agree that success rates of attacks may be improved over time when better computers become available, but within reasonable limits. It is not unlikely that the success rate of an attack can be improved from, say, 80% to 99% in a few years, but an improvement from 0.7% to 99% is extremely unlikely in the foreseeable future. I also agree that iterating the protocol too many times could make it impractical. For example, if an attack has a success rate of 99%, then to reduce this rate to 'practical zero', one would need hundreds of iterations, which is, indeed, impractical. On the other hand, to reduce 0.7% to 'practical zero', one would need just about 10 iterations, which is still practical. The bottom line is: you have to do much better than 0.7% to claim that this particular attack is a threat to the security of our protocol."
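To make the arithmetic in this exchange concrete, here is a minimal sketch (in Python, not part of the original discussion) of one plausible reading of the iteration argument: if a single run of the attack succeeds with probability p, and the iterated protocol is broken only when the attack succeeds on every one of k independent iterations, then the overall success rate is p^k. The function name and the "practical zero" threshold below are illustrative assumptions, not values taken from the correspondence.

import math

# Sketch: smallest number of independent iterations k for which the overall
# attack success rate p**k falls below a chosen "practical zero" threshold.
# The threshold value (1e-6) is an illustrative assumption.
def iterations_needed(p, threshold=1e-6):
    return math.ceil(math.log(threshold) / math.log(p))

for p in (0.99, 0.007):
    k = iterations_needed(p)
    print(f"per-iteration success rate {p}: {k} iterations give overall rate {p**k:.1e}")

Even under this crude model the asymmetry Shpilrain points to is visible: a 0.7% per-iteration rate vanishes after a handful of iterations, while a 99% rate requires orders of magnitude more.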
I wish to add that our claim that the system is insecure is only valid if one uses the protocol without iterations, and that this was the way we understood the proposal in the Shpilrain-Ushakov paper mentioned above. While this is the standard way to use classical protocols, it may not be enough for some of the modern suggestions.
Certainly, more research on the Shpilrain-Ushakov PKC is needed in order to understand its limits. Our student Dima Ruinsky is currently working on that.
2. Setting priority on discoveries in the field
We have received some reactions to our initial email, asking why we do not consider the IACR cryptology eprint server "good enough".
This was a misunderstanding.
In our opinion, the IACR server is the _best_ as far as exposure to the general cryptographic community is concerned. Its only disadvantage is that it allows you to replace files without leaving any trace of the earlier version. Assume that you work on breaking Arithmetica and wish to get priority on its cryptanalysis. Then you publish in IACR some note with no actual numbers, continue working, and when you have the actual result you make a replacement and claim that the new version is not substantially different from the older one. It would be possible but difficult to check your claim.
The way I see things, the ArXiv is more suitable for that purpose, since there one can access all versions of a submitted paper. Submitting a paper to the ArXiv and then sending a short announcement to the present bulletin, with the exact link to the ArXived paper, seems a good way to establish your priority on works of interest to readers of this bulletin. Publishing in IACR is also recommended, for yet greater exposure.