mokka.inft
Module implementing functions in information theory for MOKKa. The mokka.inft.torch submodule provides PyTorch implementations of these functions.
- mokka.inft.torch.BMI(m, N, b_ij, L_ij, p=None)
Calculate the bitwise mutual information.
Originally written by Benedikt Geiger
Parameters
- m : int
number of bits
- N : int
number of sent/received symbols
- b_ij : 2D-matrix_like
value of the i-th (sent) modulation symbol at the j-th position in the bit-string
- L_ij : 2D-matrix_like
LLRs
- p : 1D-matrix_like
probability of occurrence of the sent symbols to adjust source entropy.
Returns
- BMI : float
bitwise mutual information
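The exact conventions MOKKa uses (matrix orientation, LLR sign) are not stated here. As an illustrative sketch only — assuming b_ij and L_ij have shape (N, m) and the LLR convention L = log P(b=0)/P(b=1) — a BMI estimate for uniformly distributed symbols can be computed like this:

```python
import numpy as np

def bmi_sketch(m, N, b_ij, L_ij):
    """Illustrative BMI estimate for uniform symbols (not MOKKa's code).

    b_ij : (N, m) array of transmitted bits
    L_ij : (N, m) array of LLRs, convention L = log P(b=0)/P(b=1)
    """
    # Per-bit loss log2(1 + exp(-(1 - 2*b) * L)); near 0 for confident,
    # correct-sign LLRs, so BMI approaches m in that case.
    loss = np.log2(1.0 + np.exp(-(1.0 - 2.0 * b_ij) * L_ij))
    return m - loss.sum() / N

# Perfect LLRs for m = 2 bits over N = 2 symbols: BMI is close to 2.0
b = np.array([[0, 1], [1, 0]])
L = 20.0 * (1 - 2 * b)  # large LLRs with the correct sign
print(bmi_sketch(2, 2, b, L))
```

With the `p` argument of the real function, the source entropy term would additionally be adjusted for shaped (non-uniform) inputs.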
- mokka.inft.torch.GMI(m, N, b_ij, L_ij, p=None)
Calculate the generalized mutual information.
Written by Benedikt Geiger
Parameters
- m : int
number of bits
- N : int
number of sent/received symbols
- b_ij : 2D-matrix_like
value of the i-th (sent) modulation symbol at the j-th position in the bit-string
- L_ij : 2D-matrix_like
LLRs
- p : 1D-matrix_like
probability of occurrence of the sent symbols to adjust source entropy.
Returns
- GMI : float
generalized mutual information
- mokka.inft.torch.MI(M, PX, N, symbol_idx, Q_YX)
Calculate the mutual information.
Based on “Numerically Computing Achievable Rates of Memoryless Channels”.
Parameters
- M : int
number of symbols
- PX : 2D-matrix_like
1 x M vector of symbol probabilities
- N : int
number of sent/received symbols
- symbol_idx : 1D-matrix_like
sent symbol indices
- Q_YX : 2D-matrix_like
N x M conditional probabilities
Returns
- MI : float
mutual information
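Given the parameter shapes above, the standard plug-in MI estimate averages the log-ratio of the conditional channel density at the transmitted symbol to the output density. This NumPy sketch (not MOKKa's code) illustrates the computation:

```python
import numpy as np

def mi_sketch(M, PX, N, symbol_idx, Q_YX):
    """Illustrative MI estimate (not MOKKa's code).

    PX         : length-M symbol probabilities
    symbol_idx : length-N transmitted symbol indices
    Q_YX       : (N, M) conditional probabilities q(y_n | x_m)
    """
    q_cond = Q_YX[np.arange(N), symbol_idx]  # q(y_n | x_n), transmitted symbol
    q_marg = Q_YX @ PX                       # q(y_n) = sum_m P(x_m) q(y_n | x_m)
    return np.mean(np.log2(q_cond / q_marg))

# Noiseless binary channel with uniform input: MI is exactly 1 bit
PX = np.array([0.5, 0.5])
Q_YX = np.array([[1.0, 0.0], [0.0, 1.0]])
print(mi_sketch(2, PX, 2, np.array([0, 1]), Q_YX))
```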
- mokka.inft.torch.MI_AWGN(received_symbols: tensor, transmitted_sybmols: tensor, constellation_points: tensor, constellation_probabilites: tensor, noise_power: tensor) → tensor
Calculate the mutual information under the assumption of an AWGN channel.
- Parameters:
received_symbols – t.tensor of received symbols
transmitted_sybmols – t.tensor of transmitted symbols
constellation_points – position of the constellation points
constellation_probabilites – probabilities of the constellation points
noise_power – linear noise power (sum of both dimensions)
- Returns:
estimate of the mutual information (MI)
Written by Benedikt Geiger, 11.12.2024
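Under the AWGN assumption, the conditional density is a complex Gaussian, so the MI can be estimated directly from the received samples. This NumPy sketch (not MOKKa's code) assumes a circularly symmetric Gaussian with `noise_power` being the total variance over both I/Q dimensions:

```python
import numpy as np

def mi_awgn_sketch(rx, tx, points, probs, noise_power):
    """Illustrative AWGN MI estimate (not MOKKa's code).

    Uses q(y | x) = exp(-|y - x|^2 / sigma^2) / (pi * sigma^2), where
    sigma^2 = noise_power is the variance summed over both dimensions.
    """
    d2 = np.abs(rx[:, None] - points[None, :]) ** 2
    q_all = np.exp(-d2 / noise_power) / (np.pi * noise_power)  # q(y_n | x_k)
    q_marg = q_all @ probs                                     # q(y_n)
    q_cond = np.exp(-np.abs(rx - tx) ** 2 / noise_power) / (np.pi * noise_power)
    return np.mean(np.log2(q_cond / q_marg))

# Well-separated BPSK at very low noise: MI approaches 1 bit
points = np.array([-1.0, 1.0])
probs = np.array([0.5, 0.5])
tx = np.array([-1.0, 1.0, -1.0, 1.0])
print(mi_awgn_sketch(tx, tx, points, probs, 0.01))
```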
- mokka.inft.torch.SNR(M, sym_idx, rx_syms)
Calculate the signal to noise ratio.
Parameters
- M : int
number of constellation points
- sym_idx : Vector
Transmitted symbol indices between 0 and M-1
- rx_syms : Vector
Complex received symbols
Returns
- SNR : float
Signal to noise ratio in linear units
- SNR_dB : float
Signal to noise ratio in decibels
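A common data-aided SNR estimator, sketched here in NumPy (MOKKa's exact estimator is not shown), takes the centroid of the received cloud for each transmitted index as the signal and the residual spread as the noise:

```python
import numpy as np

def snr_sketch(M, sym_idx, rx_syms):
    """Illustrative data-aided SNR estimate (not MOKKa's code)."""
    # Estimate each constellation point as the centroid of its received cloud
    means = np.array([rx_syms[sym_idx == k].mean() for k in range(M)])
    noise = rx_syms - means[sym_idx]  # residual around the centroids
    snr = np.mean(np.abs(means[sym_idx]) ** 2) / np.mean(np.abs(noise) ** 2)
    return snr, 10.0 * np.log10(snr)

# Real-valued example for simplicity: centroids at +/-1, residual +/-0.1
sym_idx = np.array([0, 0, 1, 1])
rx = np.array([1.1, 0.9, -1.1, -0.9])
print(snr_sketch(2, sym_idx, rx))  # SNR 100 linear, 20 dB
```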
- mokka.inft.torch.calculate_BER(predictions, labels)
Calculate the bit error rate.
Parameters
- predictions : Vector
Log-likelihood ratios (LLRs)
- labels : Vector
Transmitted bits
Returns
- BER : float
Bit Error Rate (BER)
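The BER follows from hard decisions on the LLR signs. This NumPy sketch (not MOKKa's code) assumes the convention L = log P(b=0)/P(b=1), so a negative LLR decides bit 1:

```python
import numpy as np

def ber_sketch(predictions, labels):
    """Illustrative BER from LLRs (not MOKKa's code).

    Assumes L = log P(b=0)/P(b=1), i.e. L < 0 decides bit 1.
    """
    decisions = (predictions < 0).astype(int)  # hard decision on the LLR sign
    return np.mean(decisions != labels)

# Decisions [0, 1, 0, 1] vs. bits [0, 1, 1, 0]: 2 of 4 wrong -> BER 0.5
print(ber_sketch(np.array([2.0, -3.0, 1.0, -1.0]), np.array([0, 1, 1, 0])))
```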
- mokka.inft.torch.calculate_SER(predictions, labels)
Calculate the symbol error rate.
Parameters
- predictions : Vector
Probabilities that a certain symbol was transmitted
- labels : Vector
Transmitted symbols
Returns
- SER : float
Symbol Error Rate (SER)
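Analogously to the BER, the SER compares the most likely symbol per observation with the transmitted one. A NumPy sketch (not MOKKa's code), assuming `predictions` is an (N, M) matrix of per-symbol probabilities:

```python
import numpy as np

def ser_sketch(predictions, labels):
    """Illustrative SER (not MOKKa's code).

    predictions : (N, M) per-symbol probabilities
    labels      : (N,) transmitted symbol indices
    """
    decisions = predictions.argmax(axis=1)  # most likely symbol per observation
    return np.mean(decisions != labels)

# Third observation is decided as symbol 0 but symbol 1 was sent -> SER 1/3
p = np.array([[0.9, 0.1], [0.2, 0.8], [0.6, 0.4]])
print(ser_sketch(p, np.array([0, 1, 1])))
```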
- mokka.inft.torch.hMI(m, N, symbol_idx, demapped_symbol_idx)
Calculate the mutual information according to “Achievable Rates for Probabilistic Shaping” (Eq. 87).
If probabilistic shaping is applied, this has to be reflected in the transmitted symbol indices.
Parameters
- m : int
number of bits per symbol
- N : int
number of sent/received symbols
- symbol_idx : 1D-matrix_like
sent symbol indices
- demapped_symbol_idx : 1D-matrix_like
demapped symbol indices
Returns
- BMI : float
bitwise mutual information
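A hard-decision MI of this kind can be obtained as a plug-in estimate from the empirical confusion matrix of sent versus demapped symbol indices. Whether this matches Eq. 87 of the cited paper exactly is an assumption; the following NumPy sketch only illustrates the plug-in estimate of I(X; X̂):

```python
import numpy as np

def hmi_sketch(m, N, symbol_idx, demapped_symbol_idx):
    """Illustrative plug-in estimate of I(X; Xhat) (not MOKKa's code)."""
    M = 2 ** m
    joint = np.zeros((M, M))
    # Count (sent, demapped) index pairs to form the empirical joint pmf
    np.add.at(joint, (symbol_idx, demapped_symbol_idx), 1.0)
    joint /= N
    px = joint.sum(axis=1)  # sent-symbol marginal
    py = joint.sum(axis=0)  # demapped-symbol marginal
    nz = joint > 0          # skip zero cells (0 * log 0 := 0)
    return np.sum(joint[nz] * np.log2(joint[nz] / np.outer(px, py)[nz]))

# Error-free demapping with uniform binary input: MI is exactly 1 bit
idx = np.array([0, 1, 0, 1])
print(hmi_sketch(1, 4, idx, idx))
```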