In my research I take a different view of signal processing, one that
leads to adaptive methods which handle uncertainty automatically and
are therefore robust to noise and missing data. The new
perspective is to view signal processing as solving inference
problems. For example, from this perspective, time-frequency analysis
is an inference about which sinusoids are present in the signal. The
solution to this inference problem combines what is known before we
see the signal (any prior information we have about the likely
sinusoids) and what the data tells us (which sinusoids are consistent
with the observed data). Often these problems involve estimating more
quantities than there are data points, and they are consequently
ill-posed. For this reason, there isn't one solution to the problem,
but a range of plausible estimates that are consistent with both prior
knowledge and the observed data.
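
To make this concrete, here is a minimal sketch of sinusoid inference as a linear-Gaussian model in Python. The model, function name, and prior choices are illustrative assumptions rather than a specific published method: independent Gaussian priors on the sinusoid amplitudes encode what we expect before seeing the signal, and a Gaussian likelihood encodes which amplitudes are consistent with the observations. Because the prior regularises the problem, the posterior is well defined even when there are more amplitudes than data points.

```python
import numpy as np

def sinusoid_posterior(t, y, freqs, noise_var=0.1, prior_var=1.0):
    """Gaussian posterior over sinusoid amplitudes given samples y at times t.

    Illustrative model:
        y_n = sum_k a_k cos(2 pi f_k t_n) + b_k sin(2 pi f_k t_n) + noise,
    with priors a_k, b_k ~ N(0, prior_var) and noise ~ N(0, noise_var).
    """
    # Design matrix of cosine and sine features evaluated at the sample
    # times; t may be unevenly spaced, nothing here assumes a regular grid.
    Phi = np.concatenate([np.cos(2 * np.pi * np.outer(t, freqs)),
                          np.sin(2 * np.pi * np.outer(t, freqs))], axis=1)
    # Standard linear-Gaussian posterior: the prior term I / prior_var keeps
    # the inverse well conditioned even with more amplitudes than samples.
    S = np.linalg.inv(Phi.T @ Phi / noise_var + np.eye(Phi.shape[1]) / prior_var)
    m = S @ (Phi.T @ y) / noise_var
    return m, S
```
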
Once these analyses are framed as inference problems, the machinery of
probabilistic inference automatically provides methods for handling
missing and noisy data and for adapting the parameters of the
representations to match the statistics of the signal. For
time-frequency analysis, this means that unevenly sampled data are
handled automatically and that the properties of the filters or
wavelets can be adapted to the signal.
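
As an illustration (continuing the sketch above, with a made-up signal and sampling pattern), the same posterior computation applies unchanged to irregularly sampled observations; no interpolation or windowing is needed, and the posterior covariance reports greater uncertainty wherever data are missing:

```python
rng = np.random.default_rng(0)
t = np.sort(rng.uniform(0.0, 4.0, size=30))    # irregular sample times
y = np.sin(2 * np.pi * 1.5 * t) + 0.1 * rng.standard_normal(t.size)

freqs = np.linspace(0.5, 3.0, 50)              # 100 amplitudes vs. 30 samples
m, S = sinusoid_posterior(t, y, freqs)
power = m[:50] ** 2 + m[50:] ** 2              # posterior spectrum peaks near 1.5
```
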
The new view of signal processing feels quite different from the
classical perspective. However, many simple inference problems end up recovering
standard signal processing algorithms. That is, signal processing and
probabilistic inference are two sides of the same coin.
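
One concrete instance of this correspondence, stated for the illustrative model above rather than as a general result: the posterior mean of the amplitudes is

\[
m = \Bigl(\Phi^\top \Phi + \tfrac{\sigma^2}{\lambda} I\Bigr)^{-1} \Phi^\top y,
\]

where $\sigma^2$ and $\lambda$ are the noise and prior variances. For $N$ evenly spaced samples with frequencies on the discrete Fourier grid, the columns of $\Phi$ are orthogonal (ignoring the DC and Nyquist terms), so $\Phi^\top \Phi = \tfrac{N}{2} I$ and $m \propto \Phi^\top y$: the posterior mean is a rescaled discrete Fourier transform of the data.
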
Although probabilistic inference is, in principle, the most accurate
way of solving these estimation tasks, it is often computationally
intensive. However, the connection to traditional signal processing
allows us to borrow efficient methods and apply them to these
inference problems.
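
A small example of the kind of saving this enables, again under the illustrative model above and only for the evenly sampled case: the matrix-vector product $\Phi^\top y$ that dominates the posterior computation is exactly what the fast Fourier transform evaluates, in $O(N \log N)$ rather than $O(NK)$ time.

```python
# Evenly sampled data with frequencies on the Fourier grid
# (DC and Nyquist omitted for simplicity).
N = 256
t = np.arange(N, dtype=float)
y = np.sin(2 * np.pi * 0.1 * t) + 0.1 * np.random.default_rng(1).standard_normal(N)
freqs = np.arange(1, N // 2) / N

# Phi.T @ y via the FFT: the real part collects the cosine sums and the
# negated imaginary part the sine sums.
F = np.fft.rfft(y)
fast = np.concatenate([F.real[1:-1], -F.imag[1:-1]])

# Same quantity by explicit matrix multiplication, as in sinusoid_posterior.
Phi = np.concatenate([np.cos(2 * np.pi * np.outer(t, freqs)),
                      np.sin(2 * np.pi * np.outer(t, freqs))], axis=1)
assert np.allclose(fast, Phi.T @ y)
```
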