Automatic data reweighting

Suppose you are designing an autonomous system that will gather data and adapt its behavior to that data.

At first you face the so-called cold-start problem.

You don’t have any data when you first turn the system on, and yet the system needs to do something before it has accumulated data.

So you prime the pump by having the system act at first on what you believe to be true prior to seeing any data.

Now you face a problem.

You initially let the system operate on assumptions rather than data out of necessity, but you’d like to go by data rather than assumptions once you have enough data.

Not just some data, but enough data.

Once you have a single data point, you have some data, but you can hardly expect a system to act reasonably based on one datum.

Rather than abruptly switching from prior assumptions to empirical data, you’d like the system to transition gradually from reliance on the former to reliance on the latter, weaning itself off its initial assumptions as data accumulate.

The delicate part is how to manage this transition.

How often should you adjust the relative weight of prior assumptions and empirical data? And how should you determine what weights to use? Should you set the weight given to the prior assumptions to zero at some point, or should you let the weight asymptotically approach zero? Fortunately, there is a general theory of how to design such systems.

It’s called Bayesian statistics.

The design issues raised above are all handled automatically by Bayes theorem.

You start with a prior distribution summarizing what you believe to be true before collecting new data.

Then as new data arrive, the magic of Bayes theorem adjusts this distribution, putting more emphasis on the empirical data and less on the prior.
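
To see the reweighting explicitly in one standard example, take a conjugate beta-binomial model (one convenient illustration; other conjugate pairs behave the same way). With a Beta(α, β) prior and s successes observed in n trials, the posterior mean is

    E[θ | data] = (α + s) / (α + β + n) = w · α/(α + β) + (1 − w) · s/n,

where w = (α + β)/(α + β + n). The posterior mean is a weighted average of the prior mean α/(α + β) and the empirical proportion s/n, and the weight w on the prior decays smoothly toward zero as n grows, never abruptly hitting zero.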

The effect of the prior gradually fades away as more data become available.
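
Here is a minimal sketch in Python of that fading influence, assuming the same beta-binomial setup as above (the parameter values and variable names are illustrative, not from any particular library):

    import random

    random.seed(42)
    true_p = 0.8        # the unknown rate the data actually follow
    a, b = 2.0, 2.0     # Beta(2, 2) prior: belief centered at 0.5

    successes = trials = 0
    for _ in range(1000):
        successes += random.random() < true_p   # one new observation
        trials += 1
        if trials in (1, 10, 100, 1000):
            w = (a + b) / (a + b + trials)                   # weight on the prior mean
            post_mean = (a + successes) / (a + b + trials)   # conjugate Bayes update
            print(f"n={trials:5d}  prior weight w={w:.3f}  posterior mean={post_mean:.3f}")

No schedule of weights is ever chosen by hand: the prior’s weight falls off like 1/n while the posterior mean converges to the empirical rate, which is exactly the gradual handoff described above.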

Related posts:

- A Bayesian view of Amazon resellers
- Functional programming and conjugate models
- Occam’s razor and Bayes theorem