slowdown.token – The Python token implementation

Example:

>>> import time
>>> import Crypto.Random
>>> tokenizer = \
...     AES_RC4(
...         aes_key=Crypto.Random.get_random_bytes(16),  # 16 bytes
...         rc4_key=Crypto.Random.get_random_bytes(8),   #  8 bytes
...
...         # Token decryption can be sped up by caching.
...         # The default `cache_size` is 256.
...         cache_size=256,
...
...         # Tokens should be limited to a maximum length
...         # to guard against attacks using crafted tokens.
...         # The default `max_token` is 1048.
...         max_token=1048
...     )
>>> token = \
...     tokenizer.pack(
...         {
...             'expiration_time': time.time() + 3600,
...             'username': 'guest'
...         }
...     )
>>> tokenizer.unpack(token)
{'expiration_time': 1600000000.0, 'username': 'guest'}
class slowdown.token.AES_RC4(aes_key: bytes = None, rc4_key: bytes = None, cache_size: int = -1, max_token: int = -1, magic_bytes: str = None)

A token encrypted with both the AES and ARC4 algorithms.

pack(data: object) → str

Generate an encrypted token from the marshallable data.

unpack(token: str) → object

Validate the token and return the data contained in the encrypted token string.
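A common use of the packed payload is session expiry: after `unpack`, the caller checks the `expiration_time` field itself. A minimal sketch, assuming a plain-dict payload like the one in the example above (the `is_expired` helper is illustrative, not part of slowdown):

```python
import time

def is_expired(data: dict) -> bool:
    """Return True when the payload's expiration_time has passed."""
    return time.time() >= data['expiration_time']

# Payload as it would come back from tokenizer.unpack(token)
payload = {'expiration_time': time.time() + 3600, 'username': 'guest'}
print(is_expired(payload))  # a token valid for another hour -> False
```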

class slowdown.token.Hash(salt: bytes, max_token: int = -1, hash_func=None)

A token signed with a salted hash.

pack(data: object) → str

Generate a token from the marshallable data.

unpack(token: str) → object

Validate the token and return the data contained in the token string.
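The salted-hash scheme can be sketched with the standard library alone: serialize the data, prepend a digest of salt plus payload, and verify that digest on unpack. This is an illustrative stand-in modeled on the `Hash` interface, not slowdown's actual implementation:

```python
import base64
import hashlib
import hmac
import marshal

class HashSketch:
    """Illustrative salted-hash token (pack/unpack mirror slowdown.token.Hash)."""

    def __init__(self, salt: bytes):
        self.salt = salt

    def pack(self, data: object) -> str:
        # Serialize the marshallable data and sign it with the salt.
        payload = marshal.dumps(data)
        digest = hashlib.sha256(self.salt + payload).digest()
        return base64.urlsafe_b64encode(digest + payload).decode('ascii')

    def unpack(self, token: str) -> object:
        raw = base64.urlsafe_b64decode(token.encode('ascii'))
        digest, payload = raw[:32], raw[32:]
        expected = hashlib.sha256(self.salt + payload).digest()
        # Constant-time comparison to avoid timing side channels.
        if not hmac.compare_digest(digest, expected):
            raise ValueError('corrupted token')
        return marshal.loads(payload)

tokenizer = HashSketch(salt=b'secret-salt')
token = tokenizer.pack({'username': 'guest'})
print(tokenizer.unpack(token))  # {'username': 'guest'}
```

A token whose digest does not match the recomputed salted hash fails validation, so tampering with either half of the string is detected.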