mokka.e2e
Module with blocks for end-to-end simulations.

mokka.e2e.torch
Module with blocks for end-to-end simulations in PyTorch.
- class mokka.e2e.torch.BitwiseAutoEncoder(m, demapper_width=127, demapper_depth=3, channel=None, mod_extra_params=(0,), demod_extra_params=(0,), mapper=None, demapper=None)
Bases:
Module
Bitwise Auto-Encoder implementation.
- Parameters:
m – bits per symbol; also the input and output width of the auto-encoder
demapper_width – Width of the hidden layers in the demapper
demapper_depth – Number of hidden layers in the demapper
- __init__(m, demapper_width=127, demapper_depth=3, channel=None, mod_extra_params=(0,), demod_extra_params=(0,), mapper=None, demapper=None)
Construct BitwiseAutoEncoder.
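To illustrate what the constructor parameters control, here is a minimal, self-contained sketch of a bitwise auto-encoder in plain PyTorch. The class name `TinyBitwiseAE`, the layer sizes, and the AWGN channel are illustrative assumptions, not mokka's actual internals; only the roles of `m`, `demapper_width`, and `demapper_depth` follow the parameter list above.

```python
import torch
import torch.nn as nn

class TinyBitwiseAE(nn.Module):
    """Illustrative bitwise auto-encoder: m input bits are mapped to a
    2-D constellation point, passed through an AWGN channel, and demapped
    back to m LLRs. Architecture details are assumptions for this sketch."""

    def __init__(self, m, demapper_width=127, demapper_depth=3):
        super().__init__()
        self.m = m
        # Mapper: m bits -> 2 real dimensions (one complex symbol)
        self.mapper = nn.Sequential(nn.Linear(m, 32), nn.ReLU(), nn.Linear(32, 2))
        # Demapper: 2 real dimensions -> m LLRs, with `demapper_depth`
        # hidden layers of width `demapper_width`
        layers = [nn.Linear(2, demapper_width), nn.ReLU()]
        for _ in range(demapper_depth - 1):
            layers += [nn.Linear(demapper_width, demapper_width), nn.ReLU()]
        layers.append(nn.Linear(demapper_width, m))
        self.demapper = nn.Sequential(*layers)

    def forward(self, b, noise_sigma=0.1):
        x = self.mapper(b.float())                  # bits -> symbol
        y = x + noise_sigma * torch.randn_like(x)   # AWGN channel
        return self.demapper(y)                     # one LLR per input bit

ae = TinyBitwiseAE(m=4)
b = torch.randint(0, 2, (8, 4))   # batch of 8 four-bit strings
llrs = ae(b)
print(llrs.shape)  # torch.Size([8, 4])
```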
- forward(b, *args)
Perform an end-to-end simulation with the BitwiseAutoEncoder.
Take bits as input and return LLRs; the loss can then be computed outside the auto-encoder.
- Parameters:
b – tensor of bit-strings.
- Returns:
tensor of LLRs.
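Since the loss is computed outside the auto-encoder, a typical choice is a bitwise binary cross-entropy on the returned LLRs. The sketch below assumes the LLRs follow PyTorch's logit convention for P(b=1); mokka's actual sign convention may differ (LLRs are often defined as log P(b=0)/P(b=1), which would require a sign flip).

```python
import torch
import torch.nn as nn

b = torch.randint(0, 2, (8, 4)).float()          # transmitted bits
llrs = torch.randn(8, 4, requires_grad=True)     # stand-in for forward() output
# Bitwise binary cross-entropy, applied directly to the LLRs (logits)
loss = nn.BCEWithLogitsLoss()(llrs, b)
loss.backward()  # gradients would drive mapper/demapper updates via an optimizer
```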
- static load_model(model_dict)
Load model weights from a dictionary.
- Parameters:
model_dict – model weights loaded from a file with torch.load.
- Returns:
BitwiseAutoEncoder model initialized with saved weights.
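load_model consumes a dictionary produced by torch.load. It presumably builds on the standard PyTorch save/load pattern, sketched here with a stand-in `nn.Linear` module instead of an actual BitwiseAutoEncoder:

```python
import os
import tempfile
import torch
import torch.nn as nn

model = nn.Linear(4, 2)                       # stand-in for an auto-encoder
with tempfile.TemporaryDirectory() as d:
    path = os.path.join(d, "model.pt")
    torch.save(model.state_dict(), path)      # persist weights to disk
    model_dict = torch.load(path)             # the dict passed to load_model
    restored = nn.Linear(4, 2)
    restored.load_state_dict(model_dict)      # initialize with saved weights
assert torch.equal(restored.weight, model.weight)
```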