Published on Fri Jan 31 2020

Data-Driven Factor Graphs for Deep Symbol Detection

Nir Shlezinger, Nariman Farsad, Yonina C. Eldar, Andrea J. Goldsmith

Abstract

Many important schemes in signal processing and communications, ranging from the BCJR algorithm to the Kalman filter, are instances of factor graph methods. This family of algorithms is based on recursive message-passing computations carried out over graphical models representing a factorization of the underlying statistics. Consequently, in order to implement these algorithms, one must have accurate knowledge of the statistical model of the considered signals. In this work we propose to implement factor graph methods in a data-driven manner. In particular, we propose to use machine learning (ML) tools to learn the factor graph itself, rather than the overall system task; the learned graph is then used for inference by message passing. We apply the proposed approach to learn the factor graph representing a finite-memory channel, demonstrating the resulting ability to implement BCJR detection in a data-driven fashion. We demonstrate that the proposed system, referred to as BCJRNet, learns to implement the BCJR algorithm from a small training set, and that the resulting receiver exhibits improved robustness to inaccurate training compared to the conventional channel-model-based receiver operating under the same level of uncertainty. Our results indicate that by utilizing ML tools to learn factor graphs from labeled data, one can implement a broad range of model-based algorithms, which traditionally require full knowledge of the underlying statistics, in a data-driven fashion.
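To make the message-passing idea concrete, the following is a minimal sketch of BCJR (forward-backward) detection over a two-state trellis for a memory-one channel. In BCJRNet the per-time factor values would be produced by a learned DNN; here an exact Gaussian intersymbol-interference likelihood is used as a stand-in, and all channel parameters (taps `h`, noise level `sigma`) are hypothetical illustration values, not taken from the paper.

```python
import numpy as np

def bcjr_detect(factors):
    """Forward-backward (BCJR) message passing over a trellis.

    factors[t][i, j] is proportional to p(y_t, s_t = j | s_{t-1} = i).
    In BCJRNet these entries come from a learned model; here they are
    supplied externally. Returns posterior marginals p(s_t | y), (T, S).
    """
    T, S, _ = factors.shape
    alpha = np.zeros((T, S))
    beta = np.zeros((T, S))
    a = np.full(S, 1.0 / S)            # uniform initial state prior
    for t in range(T):                 # forward recursion
        a = a @ factors[t]
        a /= a.sum()                   # normalize for numerical stability
        alpha[t] = a
    b = np.ones(S)
    for t in range(T - 1, -1, -1):     # backward recursion
        beta[t] = b
        b = factors[t] @ b
        b /= b.sum()
    post = alpha * beta
    return post / post.sum(axis=1, keepdims=True)

def gaussian_factors(y, symbols, h, sigma):
    """Stand-in for the learned factor: exact Gaussian likelihoods for
    y_t = h[0]*x_t + h[1]*x_{t-1} + noise (x_{-1} taken as 0), where the
    state s_t is identified with the symbol index of x_t."""
    T, S = len(y), len(symbols)
    g = np.zeros((T, S, S))
    for t in range(T):
        for i in range(S):
            for j in range(S):
                prev = symbols[i] if t > 0 else 0.0
                mean = h[0] * symbols[j] + h[1] * prev
                g[t, i, j] = np.exp(-(y[t] - mean) ** 2 / (2 * sigma**2)) / S
    return g

# BPSK over a memory-one ISI channel (illustrative parameters)
rng = np.random.default_rng(0)
symbols = np.array([1.0, -1.0])
x = np.array([1.0, -1.0, 1.0, 1.0, -1.0])
h, sigma = [1.0, 0.5], 0.1
y = h[0] * x + h[1] * np.concatenate(([0.0], x[:-1])) \
    + sigma * rng.standard_normal(len(x))
posteriors = bcjr_detect(gaussian_factors(y, symbols, h, sigma))
x_hat = symbols[np.argmax(posteriors, axis=1)]
```

Replacing `gaussian_factors` with a small classifier trained on labeled (symbol, observation) pairs, while keeping `bcjr_detect` untouched, captures the paper's core point: only the factors need to be learned, not the inference procedure.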

Related Papers

Wed Mar 04 2020
Machine Learning
Neural Enhanced Belief Propagation on Factor Graphs
A graphical model is a structured representation of locally dependent random variables. A traditional method to reason over these random variables is to perform inference using belief propagation. We propose a new hybrid model that runs an FG-GNN (a graph neural network defined over the factor graph) conjointly with belief propagation.
Fri Jun 05 2020
Machine Learning
Learned Factor Graphs for Inference from Stationary Time Sequences
The design of methods for inference from time sequences has traditionally relied on statistical models. Here we propose a framework that combines model-based algorithms and data-driven ML tools for stationary time sequences. We present an inference algorithm based on learned stationary factor graphs.
Mon Feb 16 2015
Machine Learning
Towards Building Deep Networks with Bayesian Factor Graphs
We propose a multi-layer network based on the Bayesian framework of Factor Graphs in Reduced Normal Form (FGrn). The Latent Variable Model (LVM) is the basic building block of a quadtree hierarchy built on top of a bottom layer of random variables.
Sat Jun 01 2019
Machine Learning
GLAD: Learning Sparse Graph Recovery
Recovering sparse conditional independence graphs from data is a fundamental problem in machine learning. Many optimization algorithms have been designed to solve this problem. We propose a deep learning architecture, GLAD, which uses an Alternating Minimization (AM) algorithm as our model inductive bias.
Sat Jul 11 2020
Machine Learning
Graph Neural Networks for Massive MIMO Detection
The paper uses graph neural networks (GNNs) to learn a message-passing solution for the inference task of massive multiple-input multiple-output (MIMO) detection. We adopt a graphical model based on the Markov random field.
Wed Nov 13 2013
Machine Learning
Sparse Matrix Factorization
We investigate the problem of factorizing a matrix into several sparse matrices. We propose an algorithm for this under randomness and sparsity assumptions. This problem can be viewed as a simplification of the deep learning problem.