Published on Thu Jul 13 2017

Improving Sparsity in Kernel Adaptive Filters Using a Unit-Norm Dictionary

Felipe Tobar

New observations are incorporated into the dictionary when they are far from what the algorithm has seen in the past. The new methodology is validated on two real-world datasets against standard KAFs.

Abstract

Kernel adaptive filters, a class of adaptive nonlinear time-series models, are known for their ability to learn expressive autoregressive patterns from sequential data. However, even for trivial monotonic signals, they struggle to produce accurate predictions while keeping computational complexity within desired bounds. This is because new observations are incorporated into the dictionary whenever they are far from what the algorithm has seen in the past. We propose a novel approach to kernel adaptive filtering that compares new observations against dictionary samples in terms of their unit-norm (normalised) versions, so that new observations that look like previous samples but have a different magnitude are not added to the dictionary. We achieve this by proposing the unit-norm Gaussian kernel and defining a sparsification criterion for this novel kernel. The new methodology is validated on two real-world datasets against standard KAFs in terms of the normalised mean square error and the dictionary size.
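As a rough illustration of the idea, here is a minimal Python sketch, assuming a coherence-style admission rule: the Gaussian kernel is evaluated on the normalised inputs, and a candidate is admitted only if its kernel similarity to every stored sample stays below a threshold. The function names and the threshold delta are illustrative, not taken from the paper.

import numpy as np

def unit_norm_gaussian(x, y, sigma=1.0, eps=1e-12):
    # Gaussian kernel evaluated on the unit-norm versions of the inputs,
    # so samples that differ only in magnitude look identical.
    xn = x / (np.linalg.norm(x) + eps)
    yn = y / (np.linalg.norm(y) + eps)
    return np.exp(-np.sum((xn - yn) ** 2) / (2.0 * sigma ** 2))

def admit_to_dictionary(x, dictionary, delta=0.9, sigma=1.0):
    # Coherence-style sparsification (assumed here): admit x only if its
    # normalised version is sufficiently dissimilar from every stored sample.
    if len(dictionary) == 0:
        return True
    return max(unit_norm_gaussian(x, d, sigma) for d in dictionary) < delta

Under this rule, a positively scaled copy of an existing dictionary sample yields a kernel value of 1 against that sample and is therefore rejected, which is precisely the redundancy the unit-norm comparison is meant to remove.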

Tue Jul 11 2017
Machine Learning
Initialising Kernel Adaptive Filters via Probabilistic Inference
We present a probabilistic framework for both determining the initial settings of kernel adaptive filters (KAFs) and constructing fully-adaptive KAFs. In addition to weights and dictionaries, kernel parameters are learnt sequentially.
Wed Aug 15 2018
Machine Learning
Study of Set-Membership Adaptive Kernel Algorithms
In the last decade, a considerable research effort has been devoted to developing adaptive algorithms based on kernel functions. We present data-selective adaptive kernel normalized least-mean square (KNLMS) algorithms that can increase their learning rate and reduce their computational complexity.
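A minimal sketch of the data-selective idea, assuming a standard normalised kernel LMS update and an error bound gamma (names and step sizes illustrative): the filter skips the update whenever the a priori error already lies within the bound, which is where the reduced computational complexity comes from.

import numpy as np

def sm_knlms_step(weights, dictionary, x, d, kernel, gamma=0.1, eta=0.5, eps=1e-6):
    # Kernel evaluations of the input against the current dictionary.
    k = np.array([kernel(x, c) for c in dictionary])
    err = d - weights @ k            # a priori prediction error
    if abs(err) <= gamma:            # set-membership test: estimate still
        return weights               # acceptable, so no update is performed
    return weights + eta * err * k / (eps + k @ k)   # normalised LMS correction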
Mon Jun 12 2017
Machine Learning
Recursive Multikernel Filters Exploiting Nonlinear Temporal Structure
In kernel methods, temporal information about the data is commonly included by using time-delayed embeddings as inputs. We overcome the drawbacks of this approach by considering the different kernels separately. The resulting algorithms automatically learn to process highly nonlinear temporal information extracted from the input signal.
Wed Aug 18 2021
Machine Learning
Structure Parameter Optimized Kernel Based Online Prediction with a Generalized Optimization Strategy for Nonstationary Time Series
In this paper, sparsification techniques are used to aid online prediction algorithms for nonstationary time series. The generalized optimization strategy provides a more self-contained way to construct the entire kernel connection modes, which enhances the ability to adaptively track changing dynamic characteristics.
Sat Jun 22 2013
Machine Learning
Online dictionary learning for kernel LMS. Analysis and forward-backward splitting algorithm
The order of such filters grows linearly with the number of input data, which dramatically increases the computational burden and memory requirement. We introduce a kernel least-mean-square algorithm with L1-norm regularization to automatically sparsify the dictionary.
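A minimal sketch of one forward-backward splitting step, assuming a scalar-output kernel LMS with coefficients in a weight vector (names and step sizes illustrative): a gradient step on the squared error is followed by the L1 proximal operator, whose soft-thresholding drives small coefficients to exactly zero so the corresponding dictionary entries can be pruned.

import numpy as np

def soft_threshold(w, tau):
    # Proximal operator of tau * ||w||_1.
    return np.sign(w) * np.maximum(np.abs(w) - tau, 0.0)

def klms_l1_step(weights, dictionary, x, d, kernel, eta=0.1, lam=0.01):
    k = np.array([kernel(x, c) for c in dictionary])
    err = d - weights @ k
    w_half = weights + eta * err * k           # forward (gradient) step
    return soft_threshold(w_half, eta * lam)   # backward (proximal) step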
Wed Apr 25 2018
Machine Learning
Generalized Gaussian Kernel Adaptive Filtering
The present paper proposes generalized Gaussian kernel adaptive filtering, where the kernel parameters are adaptive and data-driven. Unlike conventional kernel adaptive filters, the proposed regressor is a superposition of Gaussian kernels, each with its own parameters.
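As a rough sketch of what such a regressor looks like, assuming for simplicity that each kernel carries its own scalar bandwidth (the paper's parameterisation may be richer, e.g. full covariance matrices):

import numpy as np

def generalized_gaussian_prediction(weights, centres, sigmas, x):
    # Superposition of Gaussian kernels, each centre with its own
    # bandwidth, rather than a single shared kernel parameter.
    k = np.array([np.exp(-np.sum((x - c) ** 2) / (2.0 * s ** 2))
                  for c, s in zip(centres, sigmas)])
    return weights @ k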