We are implementing many algorithms that typically share a number of publicly known, security-relevant parameters.
Currently, we simply use a class holding all the parameters and two predefined global objects:
class PublicParams(object):
    p = q = 0

    def __init__(self, p, q):
        self.p = p
        self.q = q

# used for tests
publicParams_test = PublicParams(15, 7)

# some 2048 bit numbers in production, for example
publicParams_secure = PublicParams(128378947298374928374, 128378947298374928374)
The algorithms then take a PublicParams object as an argument, which defaults to the production object publicParams_secure:
def AlgoOne(n, publicParams=publicParams_secure):
    # do stuff with publicParams.p
    # ...
    AlgoTwo(n, publicParams)
and
def AlgoTwo(x, publicParams=publicParams_secure):
    # do stuff with publicParams.q
This way we can still inject different public parameters for easier unit testing:
import unittest

class AlgoOneTest(unittest.TestCase):
    def test(self):
        # compare with manually computed result
        self.assertEqual(AlgoOne(1, publicParams_test), 10)
What I don't like about this approach:
- Giving publicParams a default value makes it optional when calling an algorithm. However, it then becomes easy to forget to pass it on when calling AlgoTwo from within AlgoOne, which results in two different objects being used if the test object was passed to AlgoOne (see the sketch below).
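For illustration, here is a hypothetical buggy variant (AlgoOneBuggy and the call site are made up, not our real code) showing that failure mode; the inner call silently falls back to publicParams_secure even though the caller passed publicParams_test:

def AlgoOneBuggy(n, publicParams=publicParams_secure):
    # do stuff with publicParams.p -- here the injected test parameters
    # ...
    AlgoTwo(n)  # BUG: forgot to forward publicParams,
                # so AlgoTwo silently uses publicParams_secure.q

AlgoOneBuggy(1, publicParams_test)  # mixes the test p with the secure q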
Is there a better way that is less error-prone but still offers flexibility for unit testing? Is this really best practice?