# How to Compute GMI

Assuming that

i) the code bits are independent and uniformly distributed, and

ii) the log-likelihood ratios (LLRs) for the code bits are correctly calculated,

the generalized mutual information (GMI) can be approximated as

$$\mathrm{GMI} \approx m - \frac{1}{n}\sum_{k=1}^{n}\sum_{l=1}^{m}\log_2\!\left(1 + e^{(-1)^{c_{k,l}}\lambda_{k,l}}\right),$$

where *m* is the number of bit positions in the multilevel modulation format, *n* is the number of transmitted symbols, and *c*_{k,l} and *λ*_{k,l} are the code bits and the corresponding LLRs, respectively. The LLRs are assumed to follow the sign convention in which a positive *λ*_{k,l} points towards the bit *c*_{k,l} = 1.
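This approximation is straightforward to evaluate numerically. The following Python sketch is my own minimal implementation (the function name and the (n, m) array layout are choices for illustration, not from the references); it assumes LLRs with the sign convention that positive values favour bit 1:

```python
import numpy as np

def gmi_matched(c, llr):
    """Estimate the GMI from code bits and matched LLRs.

    c, llr : arrays of shape (n, m) -- n transmitted symbols,
    m bit positions per symbol. Bits are 0/1; a positive LLR
    points towards bit 1 (assumed sign convention).
    """
    c = np.asarray(c)
    llr = np.asarray(llr)
    n, m = c.shape
    # log2(1 + exp((-1)^c * llr)), computed stably via logaddexp
    terms = np.logaddexp(0.0, (-1.0) ** c * llr) / np.log(2)
    return m - terms.sum() / n
```

With very reliable LLRs of the correct sign the estimate approaches *m* bits/symbol, and with all-zero (uninformative) LLRs it is 0.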

When the LLRs are “mismatched”, i.e., when they are incorrectly calculated or when approximations are used (for example, the max-log approximation), the GMI should instead be estimated as

$$\mathrm{GMI} \approx m - \min_{s \ge 0}\, \frac{1}{n}\sum_{k=1}^{n}\sum_{l=1}^{m}\log_2\!\left(1 + e^{s(-1)^{c_{k,l}}\lambda_{k,l}}\right),$$

where the minimization over the scaling factor *s* accounts for the mismatch.
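The optimization over the scaling factor *s* can be sketched with a simple one-dimensional grid search; this is an illustration only (the grid range and resolution are my own choices, and a proper scalar optimizer could be used instead):

```python
import numpy as np

def gmi_mismatched(c, llr, s_grid=None):
    """GMI estimate for mismatched LLRs via a grid search over s > 0.

    c, llr : arrays of shape (n, m). The minimization over the
    scaling s compensates for mismatched (e.g. max-log) LLRs.
    The grid below is an illustrative choice, not from [R3].
    """
    c = np.asarray(c)
    llr = np.asarray(llr)
    n, m = c.shape
    if s_grid is None:
        s_grid = np.linspace(0.01, 3.0, 300)
    signed = (-1.0) ** c * llr
    # For each candidate s, evaluate the inner double sum in bits
    vals = [np.logaddexp(0.0, s * signed).sum() / np.log(2) / n
            for s in s_grid]
    return m - min(vals)
```

For matched LLRs the minimizing *s* is close to 1 and the result coincides with the matched estimate; for scaled or clipped LLRs the search recovers the achievable rate of the mismatched decoder.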

For more details, see:

[R1] L. Szczecinski and A. Alvarado, “Bit-Interleaved Coded Modulation: Fundamentals, Analysis and Design,” John Wiley & Sons, ISBN: 978-0-470-68617-1, January 2015.

[R2] A. Alvarado and E. Agrell, “Four-Dimensional Coded Modulation with Bit-wise Decoders for Future Optical Communications,” J. Lightw. Technol., vol. 33, no. 10, pp. 1993–2003, May 2015.

[R3] A. Alvarado, E. Agrell, D. Lavery, R. Maher, and P. Bayvel, “Replacing the Soft FEC Limit Paradigm in the Design of Optical Communication Systems,” J. Lightw. Technol., 2015.

*Please note that the expressions in Sec. III-D of [R3] are valid only if the constellation is normalized to unit energy. We erroneously claimed in Sec. III-D of [R3] that those expressions are valid for any average symbol energy.*
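Normalizing a constellation to unit average symbol energy, as required for those expressions, is a one-line operation; the 4-PAM points below are only an example:

```python
import numpy as np

# Example constellation: 4-PAM points (illustrative choice)
X = np.array([-3.0, -1.0, 1.0, 3.0])

# Scale so that the average symbol energy E_s = mean(|x|^2) equals 1,
# as required by the expressions in Sec. III-D of [R3]
X_norm = X / np.sqrt(np.mean(np.abs(X) ** 2))
```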
