mokka.inft

Module implementing functions in information theory for MOKKa.

Implementation of information theoretic functions in PyTorch for MOKKa.

mokka.inft.torch.BMI(m, N, b_ij, L_ij, p=None)

Calculate the bitwise mutual information.

Originally written by Benedikt Geiger

Parameters

m : int

number of bits

N : int

number of sent/received symbols

b_ij : 2D-matrix_like

value of the i-th (sent) modulation symbol at the j-th position in the bit string

L_ij : 2D-matrix_like

log-likelihood ratios (LLRs)

p : 1D-matrix_like, optional

probability of occurrence of the sent symbols, to adjust the source entropy.

Returns

BMI : float

bitwise mutual information
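The estimate can be sketched in plain Python. This is a minimal illustration, not mokka's implementation: the name bmi_sketch and the LLR sign convention L = log(P(b=0|y) / P(b=1|y)) are assumptions, and the optional prior p is omitted.

```python
import math

def bmi_sketch(m, N, b_ij, L_ij):
    """Estimate the bitwise mutual information from sent bits and LLRs.

    Assumed LLR sign convention: L = log(P(b=0|y) / P(b=1|y)).
    Under it the per-bit penalty is log2(1 + exp((2*b - 1) * L)),
    which vanishes for confident, correct LLRs.
    """
    loss = 0.0
    for bits_row, llr_row in zip(b_ij, L_ij):
        for b, llr in zip(bits_row, llr_row):
            loss += math.log2(1.0 + math.exp((2 * b - 1) * llr))
    return m - loss / N

# Confident, always-correct LLRs: the BMI approaches m = 2 bits.
bits = [[0, 1], [1, 0]]                  # N = 2 symbols, m = 2 bits each
llrs = [[20.0, -20.0], [-20.0, 20.0]]
bmi = bmi_sketch(2, 2, bits, llrs)
```

With all-zero LLRs (no information in the demapper output) the same sketch returns 0.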

mokka.inft.torch.MI(M, PX, N, symbol_idx, Q_YX)

Calculate the mutual information.

Follows "Numerically Computing Achievable Rates of Memoryless Channels".

Parameters

M : int

number of symbols

PX : 2D-matrix_like

1 x M matrix of input symbol probabilities

N : int

number of sent/received symbols

symbol_idx : 1D-matrix_like

sent symbol indices

Q_YX : 2D-matrix_like

N x M conditional probabilities

Returns

MI : float

mutual information
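The Monte-Carlo estimator from that paper can be sketched as follows; mi_sketch is a hypothetical name, and plain-Python lists stand in for the torch tensors the module actually works with:

```python
import math

def mi_sketch(M, PX, N, symbol_idx, Q_YX):
    """Estimate I(X; Y) as (1/N) * sum_n log2( q(y_n|x_n) / sum_i P(x_i) q(y_n|x_i) )."""
    total = 0.0
    for n in range(N):
        num = Q_YX[n][symbol_idx[n]]                     # likelihood of the sent symbol
        den = sum(PX[i] * Q_YX[n][i] for i in range(M))  # output density under PX
        total += math.log2(num / den)
    return total / N

# Noiseless binary channel with uniform input: I(X; Y) = 1 bit.
Q = [[1.0, 0.0], [0.0, 1.0]]
mi = mi_sketch(2, [0.5, 0.5], 2, [0, 1], Q)   # → 1.0
```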

mokka.inft.torch.SNR(M, sym_idx, rx_syms)

Calculate the signal-to-noise ratio.

Parameters

M : int

number of constellation points

sym_idx : Vector

Transmitted symbol indices between 0 and M-1

rx_syms : Vector

Complex received symbols

Returns

SNR : float

Signal-to-noise ratio in linear units

SNR_dB : float

Signal-to-noise ratio in decibels
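One common way to obtain such an estimate is data-aided: cluster the received symbols by their transmitted index, take the cluster means as the signal and the spread around them as the noise. The sketch below illustrates that idea; it is not necessarily the estimator mokka uses.

```python
import math

def snr_sketch(M, sym_idx, rx_syms):
    """Data-aided SNR estimate from transmitted indices and received symbols."""
    sums = [0j] * M
    counts = [0] * M
    for idx, y in zip(sym_idx, rx_syms):
        sums[idx] += y
        counts[idx] += 1
    # Cluster means approximate the noise-free constellation points.
    means = [sums[i] / counts[i] if counts[i] else 0j for i in range(M)]
    n_total = len(rx_syms)
    sig = sum(abs(means[i]) ** 2 * counts[i] for i in range(M)) / n_total
    noise = sum(abs(y - means[i]) ** 2 for i, y in zip(sym_idx, rx_syms)) / n_total
    snr = sig / noise
    return snr, 10.0 * math.log10(snr)

# BPSK at roughly 20 dB: received points near +1 and -1 with small deviations.
snr, snr_db = snr_sketch(2, [0, 0, 1, 1], [1.1 + 0j, 0.9 + 0j, -1.1 + 0j, -0.9 + 0j])
```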

mokka.inft.torch.hMI(m, N, symbol_idx, demapped_symbol_idx)

Calculate the mutual information according to “Achievable Rates for Probabilistic Shaping” (Eq. 87).

If probabilistic shaping is applied, this has to be reflected in the transmitted symbol indices.

Parameters

m : int

number of bits per symbol

N : int

number of sent/received symbols

symbol_idx : 1D-matrix_like

sent symbol indices

demapped_symbol_idx : 1D-matrix_like

demapped symbol indices

Returns

MI : float

mutual information
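The hard-decision mutual information I(X; X̂) can be estimated from the empirical joint distribution of sent and demapped indices. A plain-Python sketch follows; hmi_sketch is a hypothetical name, and this is the generic plug-in estimator rather than mokka's exact computation. Probabilistic shaping enters through the empirical frequencies of symbol_idx.

```python
import math

def hmi_sketch(m, N, symbol_idx, demapped_symbol_idx):
    """Plug-in estimate of I(X; X_hat) from sent/demapped symbol indices."""
    M = 2 ** m
    # Empirical joint distribution P(X = x, X_hat = xh).
    joint = [[0.0] * M for _ in range(M)]
    for x, xh in zip(symbol_idx, demapped_symbol_idx):
        joint[x][xh] += 1.0 / N
    px = [sum(row) for row in joint]                                # marginal of X
    pxh = [sum(joint[x][xh] for x in range(M)) for xh in range(M)]  # marginal of X_hat
    mi = 0.0
    for x in range(M):
        for xh in range(M):
            if joint[x][xh] > 0.0:
                mi += joint[x][xh] * math.log2(joint[x][xh] / (px[x] * pxh[xh]))
    return mi

# Error-free demapping of uniform binary symbols: 1 bit per symbol.
mi = hmi_sketch(1, 4, [0, 1, 0, 1], [0, 1, 0, 1])   # → 1.0
```

A demapper that always outputs the same index carries no information, and the same sketch returns 0.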