A structured CNN filter basis allows incorporating priors about natural image statistics and thus requires fewer training examples, saving valuable annotation time. Here, we build on the Gaussian derivative CNN filter basis, which learns both the orientation and scale of the filters. However, this Gaussian filter basis depends on a predetermined derivative order, which fixes the frequency responses of the basis functions, whereas the optimal filter frequencies should depend on the data and the downstream learning task. We show that by learning the order of the basis we can accurately learn the frequency of the filters and hence adapt to the optimal frequencies for the underlying task. We draw on the well-founded mathematical formulation of fractional derivatives to adapt the filter frequencies during training. Our formulation yields parameter savings and improved data efficiency compared to standard CNNs and to the Gaussian derivative CNN filter networks that we build on.
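The central idea, a Gaussian derivative filter whose order is a continuous parameter rather than a fixed integer, can be illustrated in the Fourier domain: the order-alpha derivative of a Gaussian has frequency response (i*omega)^alpha * exp(-sigma^2 * omega^2 / 2), which is well defined for non-integer alpha. A minimal NumPy sketch (the function name, sampling scheme, and taking the real part of the principal branch are our illustrative assumptions, not the paper's implementation):

```python
import numpy as np

def fractional_gaussian_derivative(alpha, sigma=1.0, size=65):
    """Sample a 1-D Gaussian derivative filter of (possibly fractional) order alpha.

    Constructed in the Fourier domain, where differentiation of order alpha
    is multiplication by (i*omega)**alpha. Because alpha can vary continuously,
    the peak frequency of the filter can be tuned (or learned) continuously.
    """
    # Frequency grid matching an odd-sized spatial filter.
    omega = 2.0 * np.pi * np.fft.fftfreq(size)
    # Frequency response of the order-alpha Gaussian derivative.
    response = (1j * omega) ** alpha * np.exp(-0.5 * (sigma * omega) ** 2)
    # Back to the spatial domain; the real part of the principal branch
    # is one simple choice for fractional alpha.
    kernel = np.fft.fftshift(np.fft.ifft(response)).real
    return kernel

# Integer orders recover the familiar cases: alpha=0 is a smoothing
# (unit-sum) Gaussian, alpha=1 a zero-sum first-derivative filter;
# alpha=0.5 interpolates between them in frequency.
g0 = fractional_gaussian_derivative(0.0, sigma=2.0)
g1 = fractional_gaussian_derivative(1.0, sigma=2.0)
gh = fractional_gaussian_derivative(0.5, sigma=2.0)
```

In a CNN, alpha (together with sigma and orientation) would be a trainable scalar per basis function, so gradient descent can move each filter toward the frequency band the task needs.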