This chapter formulates a new cost function for adaptive filtering based on Renyi's quadratic error entropy. The problem of estimating the linear system parameters \(\mathbf{w} = {[{w}_{0},\ldots, {w}_{M-1}]}^{\mathrm{T}}\) in the setting of Figure 3.1, where x(n) and z(n) are random variables, can be framed as model-based inference, because it relates measured data, uncertainty, and the functional description of the system and its parameters. The desired response z(n) can be thought of as being created by an unknown transformation of the input vector \(\mathbf{x} = {[x(n),\ldots, x(n - M + 1)]}^{\mathrm{T}}\). Adaptive filtering theory [143, 284] addresses this problem using the MSE criterion applied to the error signal \(e(n) = z(n) - f(\mathbf{w},x(n))\),

$$J_{w}(e(n)) = E[{(z(n) - f(\mathbf{w},x(n)))}^{2}]$$ (3.1)

where, for a linear finite impulse response (FIR) filter, the output is

$$y(n) =\sum\limits_{k=0}^{M-1}{w}_{k}\,x(n - k).$$ (3.2)
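As a minimal sketch of Eqs. (3.1) and (3.2), the snippet below computes the FIR filter output y(n) and a sample estimate of the MSE cost for given weights. The particular signal values, the two-tap system, and the helper names `fir_output` and `mse` are illustrative assumptions, not part of the text.

```python
# Sketch of the MSE criterion (3.1) for an FIR filter (3.2).
# Signals, weights, and helper names here are illustrative only.

def fir_output(w, x, n):
    """y(n) = sum_{k=0}^{M-1} w_k x(n-k), with x[n] standing for x(n)."""
    return sum(w[k] * x[n - k] for k in range(len(w)))

def mse(w, x, z, start):
    """Sample estimate of E[(z(n) - y(n))^2] over the valid indices."""
    errors = [(z[n] - fir_output(w, x, n)) ** 2 for n in range(start, len(x))]
    return sum(errors) / len(errors)

# Example: desired response from a hypothetical 2-tap system
# z(n) = 0.5 x(n) + 0.25 x(n-1).
x = [1.0, -2.0, 3.0, 0.5, -1.0]
z = [0.0] + [0.5 * x[n] + 0.25 * x[n - 1] for n in range(1, len(x))]

print(mse([0.5, 0.25], x, z, start=1))  # exact weights -> MSE of 0.0
```

With the true weights the error e(n) is identically zero, so the sample MSE vanishes; any other weight vector yields a strictly positive cost, which is what the adaptation in later sections descends.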