Stephan ten Brink has been a faculty member at the University of Stuttgart, Germany, since July 2013, where he is head of the Institute of Telecommunications.
From 1995 to 1997 and 2000 to 2003, Dr. ten Brink was with Bell Laboratories in Holmdel, New Jersey, conducting research on multiple antenna systems.
From July 2003 to March 2010, he was with Realtek Semiconductor Corp., Irvine, California, as Director of the Wireless ASIC Department, developing WLAN and UWB single-chip MAC/PHY CMOS solutions.
In April 2010 he returned to Bell Laboratories as Department Head of the Wireless Physical Layer Research Department in Stuttgart, Germany.
Dr. ten Brink is an IEEE Fellow, and recipient and co-recipient of several awards, including the Vodafone Innovation Award, the IEEE Stephen O. Rice Paper Prize, and the IEEE Communications Society Leonard G. Abraham Prize for contributions to channel coding and signal detection for multiple-antenna systems.
He is best known for his work on iterative decoding (EXIT charts), MIMO communications (soft sphere detection, massive MIMO), and deep learning applied to communications.
Tutorial: “Basics of Iterative Processing in Detection and Decoding”
Abstract: This tutorial explains how iterative information exchange helps to improve detection and decoding in digital receivers, using the EXIT chart technique. It outlines recent developments in the field, including polar code decoding, non-orthogonal multiple access (NOMA), and MIMO detection.
Keynote: “Optimizing Pulse Shaping for the Nonlinear Optical Fiber Channel using Deep Learning and the Autoencoder Concept”
Based on research work by Tim Uhlemann, Sebastian Cammerer, Alexander Span, Sebastian Dörner, and Stephan ten Brink at the Institute of Telecommunications, University of Stuttgart.
Abstract: The optical fiber channel is dominated by effects such as chromatic dispersion and Kerr nonlinearity, and is described by the nonlinear Schrödinger equation. In the early days of coherent communication over optical fibers in the 2000s, e.g., as used in the 5G backhaul and core network, the nonlinearity was widely ignored in order to arrive at simple transmit and receive/detection schemes. With increasing reach, data rates, and launch powers, however, the nonlinear effects can no longer be neglected. This work shows how pulse shaping can be optimized using an autoencoder and deep learning to find waveforms that jointly mitigate both chromatic dispersion and Kerr nonlinearity.
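For reference, the nonlinear Schrödinger equation mentioned in the abstract is commonly written in the following standard form (the notation here is conventional and not taken from the talk):

```latex
% Evolution of the complex field envelope A(z,t) along the fiber:
%   attenuation        chromatic dispersion        Kerr nonlinearity
\frac{\partial A}{\partial z}
  = -\frac{\alpha}{2}\, A
    \;-\; i\,\frac{\beta_2}{2}\,\frac{\partial^2 A}{\partial t^2}
    \;+\; i\,\gamma\, |A|^2 A
```

Here α is the fiber attenuation coefficient, β₂ the group-velocity dispersion parameter (the chromatic dispersion term), and γ the Kerr nonlinearity coefficient; the interplay of the last two terms is what the autoencoder-optimized pulse shapes aim to mitigate.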