Published on Thu Sep 26 2019

B-Spline CNNs on Lie Groups

Erik J Bekkers


Abstract

Group convolutional neural networks (G-CNNs) can be used to improve classical CNNs by equipping them with the geometric structure of groups. Central to the success of G-CNNs is the lifting of feature maps to higher dimensional disentangled representations, in which data characteristics are effectively learned, geometric data augmentations are made obsolete, and predictable behavior under geometric transformations (equivariance) is guaranteed via group theory. Currently, however, the practical implementations of G-CNNs are limited to either discrete groups (that leave the grid intact) or continuous compact groups such as rotations (that enable the use of Fourier theory). In this paper we lift these limitations and propose a modular framework for the design and implementation of G-CNNs for arbitrary Lie groups. In our approach the differential structure of Lie groups is used to expand convolution kernels in a generic basis of B-splines that is defined on the Lie algebra. This leads to a flexible framework that enables localized, atrous, and deformable convolutions in G-CNNs by means of localized, sparse, and non-uniform B-spline expansions, respectively. The impact and potential of our approach are studied on two benchmark datasets: cancer detection in histopathology slides, in which rotation equivariance plays a key role, and facial landmark localization, in which scale equivariance is important. In both cases, G-CNN architectures outperform their classical 2D counterparts, and the added value of atrous and localized group convolutions is studied in detail.
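For context, the lifting step mentioned in the abstract is usually written as a lifting correlation that maps a feature map f on R^d to a function on the group G (this is the standard G-CNN formulation; the notation here is generic rather than copied from the paper):

$$ (k \,\tilde{\star}\, f)(g) \;=\; \int_{\mathbb{R}^d} k\!\left(g^{-1} \cdot x\right) f(x)\, \mathrm{d}x, \qquad g \in G. $$

The kernel k must therefore be evaluated at arbitrary, off-grid transformed coordinates, which is where the B-spline expansion comes in. Below is a minimal NumPy sketch of the core idea as described in the abstract: expand a kernel on the group in a separable B-spline basis whose centers live in the Lie algebra (reached via the logarithm map). All names, the choice of quadratic splines, and the uniform center grid are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def bspline2(x):
    # Cardinal quadratic B-spline; compact support on [-1.5, 1.5].
    x = np.abs(x)
    return np.where(x < 0.5, 0.75 - x**2,
           np.where(x < 1.5, 0.5 * (1.5 - x)**2, 0.0))

def kernel_on_group(log_g, centers, weights, scale=1.0):
    """Evaluate k(g) = sum_i w_i * prod_d B((log(g)_d - c_{i,d}) / s).

    log_g   : (..., D) Lie-algebra coordinates of group elements (log map)
    centers : (N, D)   B-spline centers placed in the Lie algebra
    weights : (N,)     trainable expansion coefficients
    """
    diff = (log_g[..., None, :] - centers) / scale   # (..., N, D)
    basis = np.prod(bspline2(diff), axis=-1)         # separable basis, (..., N)
    return basis @ weights                           # kernel values, (...)
```

In this picture the variants named in the abstract fall out naturally: truncating the set of centers gives localized kernels, spacing the centers sparsely gives atrous kernels, and letting the centers move off a uniform grid gives deformable kernels.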

Fri Jan 24 2020
Machine Learning
PDE-based Group Equivariant Convolutional Neural Networks
We present a PDE-based framework that generalizes Group equivariant Convolutional Neural Networks (G-CNNs). In this framework, a network layer is seen as a set of PDE solvers in which geometrically meaningful PDE coefficients become the layer's trainable weights. Formulating our PDEs on homogeneous spaces allows these networks to be designed with …
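To make "a layer as a PDE solver with trainable coefficients" concrete, here is a toy single-channel illustration (not the paper's construction, which formulates its PDEs on homogeneous spaces): one explicit Euler step of isotropic diffusion on a flat 2D feature map, with the diffusivity alpha playing the role of a trainable coefficient.

```python
import numpy as np

def diffusion_layer(f, alpha, steps=1):
    """Evolve feature map f under u_t = alpha * Laplace(u) with explicit Euler.
    alpha acts as a trainable PDE coefficient; alpha <= 0.25 keeps the
    scheme stable on a unit grid (periodic boundaries via np.roll)."""
    for _ in range(steps):
        lap = (np.roll(f, 1, 0) + np.roll(f, -1, 0)
             + np.roll(f, 1, 1) + np.roll(f, -1, 1) - 4.0 * f)
        f = f + alpha * lap
    return f
```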
Fri Nov 24 2017
Computer Vision
SplineCNN: Fast Geometric Deep Learning with Continuous B-Spline Kernels
We present Spline-based Convolutional Neural Networks (SplineCNNs), a variant of deep neural networks for irregularly structured and geometric input. Our main contribution is a novel convolution operator based on B-splines that makes the computation time independent of the kernel size.
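The kernel-size independence comes from the compact support of B-splines: at any query point, only p+1 basis functions of a degree-p spline are nonzero. A toy 1D illustration of this locality (illustrative names, degree 1, not SplineCNN's actual code):

```python
import numpy as np

def linear_bspline_kernel(u, weights):
    """Evaluate a degree-1 B-spline kernel at u in [0, 1]: exactly two of
    the len(weights) control values contribute, so the cost is O(1) in
    the kernel size."""
    n = len(weights)
    t = u * (n - 1)
    i = min(int(np.floor(t)), n - 2)       # index of the left control point
    frac = t - i
    return (1.0 - frac) * weights[i] + frac * weights[i + 1]
```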
Sun Dec 20 2020
Machine Learning
LieTransformer: Equivariant self-attention for Lie Groups
Group equivariant neural networks are used as building blocks of deep learning models. We propose an architecture composed of LieSelfAttention layers that are equivariant to arbitrary Lie groups and their discrete subgroups.
Tue Nov 19 2019
Computer Vision
General E(2)-Equivariant Steerable CNNs
The theory of Steerable CNNs yields constraints on the convolution kernels which depend on group representations. We show that these constraints for arbitrary group representations can be reduced to constraints under irreducible representations. E(2)-steerable convolutions are further shown to yield remarkable …
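The kernel constraint referred to here is, in its standard form (the general steerable-CNN condition, not a result specific to this paper): a kernel k from R^2 to R^{c_out x c_in} is equivariant under input/output representations rho_in, rho_out of a group H iff

$$ k(h \cdot x) \;=\; \rho_{\mathrm{out}}(h)\, k(x)\, \rho_{\mathrm{in}}(h)^{-1} \qquad \text{for all } h \in H,\ x \in \mathbb{R}^2. $$

Decomposing rho_in and rho_out into irreducible representations splits this constraint into independent blocks, one per pair of irreps, which is the reduction the abstract refers to.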
Wed Nov 13 2019
Machine Learning
SpiralNet++: A Fast and Highly Efficient Mesh Convolution Operator
Intrinsic graph convolution operators with differentiable kernel functions play a crucial role in analyzing 3D shape meshes. In this paper, we present a fast and efficient intrinsic mesh convolution operator that does not rely on an intricate design of the kernel function.
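A minimal sketch of the spiral-convolution idea behind this operator (names and shapes are illustrative assumptions): gather vertex features along precomputed, fixed-length spiral sequences around each vertex and apply one shared linear map, so no hand-designed kernel function is needed.

```python
import numpy as np

def spiral_conv(features, spirals, weight, bias=None):
    """features: (V, C) vertex features; spirals: (V, L) integer indices of
    each vertex's fixed-length spiral neighborhood; weight: (L*C, C_out)."""
    V, C = features.shape
    gathered = features[spirals]             # (V, L, C) neighbor features
    out = gathered.reshape(V, -1) @ weight   # shared linear map, (V, C_out)
    return out if bias is None else out + bias
```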
Thu Apr 12 2018
Artificial Intelligence
CubeNet: Equivariance to 3D Rotation and Translation
3D Convolutional Neural Networks are sensitive to transformations applied to their input. A voxelized version of a 3D object and its rotated clone will look unrelated to each other after passing through the last layer of a network.