Pre-Trained Language Models in Semantic Communication
Luiz Fernando Gontijo, Paulo Cardieri

DOI: 10.14209/sbrt.2024.1571036810
Event: XLII Simpósio Brasileiro de Telecomunicações e Processamento de Sinais (SBrT2024)
Keywords: Semantic Communications, Language Models, Signal Processing, Deep Learning
Abstract
Communication systems traditionally focus on accurately transmitting signals without considering their semantic content. This paper introduces semantic communication models based on the pre-trained language models T5 and BART, compared against conventional methods such as Huffman and Turbo coding. Numerical results, measured by the BLEU and BERTScore metrics, demonstrate the superiority of the semantic models, especially under low-SNR conditions. Moreover, the proposed system requires only fine-tuning to achieve good performance in environments with severe fading. The results suggest a paradigm shift in which decoding semantic meaning, rather than exact message replication, becomes crucial. Such models pave the way for novel communication architectures and emphasize the importance of semantic understanding in communication systems.
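As an illustration of the evaluation described in the abstract (not the authors' code), the sketch below scores a hypothetical recovered sentence against the transmitted one with BLEU and BERTScore, the two metrics cited above. It assumes the nltk and bert-score Python packages are available; the example sentences are invented for demonstration.

```python
# Hedged sketch: comparing a transmitted sentence with the decoder output
# using BLEU (exact n-gram overlap) and BERTScore (semantic similarity).
from nltk.translate.bleu_score import sentence_bleu, SmoothingFunction
from bert_score import score as bert_score

transmitted = "the weather service issued a storm warning for the coast"
recovered = "a storm warning was issued for the coastal area"  # hypothetical decoder output

# BLEU: word-level n-gram overlap between reference and candidate.
bleu = sentence_bleu(
    [transmitted.split()],          # list of reference token lists
    recovered.split(),              # candidate tokens
    smoothing_function=SmoothingFunction().method1,
)

# BERTScore: cosine similarity of contextual embeddings, so a paraphrase
# that preserves the meaning still scores highly even when BLEU is low.
_, _, f1 = bert_score([recovered], [transmitted], lang="en")

print(f"BLEU: {bleu:.3f}  BERTScore F1: {f1.item():.3f}")
```

The contrast between the two scores is the point of using both metrics: BLEU penalizes any deviation from the exact wording, while BERTScore rewards a reconstruction that preserves the semantic content, which is the criterion the paper argues should matter.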
