May 23, 2015

Minimum Dyspoissonism vs. Qualitative Entropy

The quantification of entropy is in the eye of the beholder. In other words, we can only measure entropy through a metric of our own choosing. What one metric regards as orderly, another might regard as entropic.

This is a stern warning to anyone who might otherwise consider dyspoissonism to be some archetypal entropy metric: there is no such thing. It's merely another tool in the chest.

Take the following mask list with (Q = Z = 16), for example:

G = {0, 1, 2, 3, 4, 5, 6, 7, 7, 8, 8, 9, 9, 10, 10, 10}

G has a population list (where the entry at index K counts the masks which occur exactly K times) given by:

H = {7, 3, 1}

which turns out to have what I believe to be the minimum dyspoissonism (maximum logfreedom) for this Q and Z, namely about (4.2368106628E+01). But it hardly looks random to a human. If you're confused as to how this could possibly occur at maximum logfreedom, reread the first sentence in this post.

This is one particularly perverse case in which, perhaps, someone has attempted to pass a dyspoissonism filter by contriving an easily constructed mask list which happens to have minimum dyspoissonism. I just thought I should point this out for those who might assume that such a construction would be impossible.

What to do? Fortunately, there are plenty of other tools in the chest.

For one thing, we could take a simple derivative (next minus previous) transformation, yielding G':

G' = {0, 1, 1, 1, 1, 1, 1, 1, 0, 1, 0, 1, 0, 1, 0, 0}
H' = {0, 0, 0, 0, 0, 1, 0, 0, 0, 1}
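The derivative transform and the population-list recount can be sketched as follows. I'm assuming the first mask is carried through unchanged and that differences wrap modulo Z; both assumptions are consistent with the G' shown above:

```python
from collections import Counter

def delta_transform(masks, z):
    # first mask carried through unchanged; each later mask becomes
    # (current - previous) mod z
    return [masks[0]] + [(b - a) % z for a, b in zip(masks, masks[1:])]

def population_list(masks):
    # pop[k-1] = number of distinct mask values occurring exactly k times
    freq = Counter(masks)
    pop = [0] * max(freq.values())
    for f in freq.values():
        pop[f - 1] += 1
    return pop

g = [0, 1, 2, 3, 4, 5, 6, 7, 7, 8, 8, 9, 9, 10, 10, 10]
g_prime = delta_transform(g, 16)
print(g_prime)                   # the G' shown above
print(population_list(g_prime))  # the H' shown above
```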

Now the logfreedom collapses and the dyspoissonism surges closer to one. This unremarkable data set has been revealed for what it is. For maximum paranoia, we could continue differentiating until we hit maximum dyspoissonism, searching for what I call "maximum differential dyspoissonism". This technique is particularly useful when analyzing PRNGs that tend to fall apart after several differentiations.
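A brute-force search over successive derivatives might look like the sketch below, which tracks the pass with the lowest logfreedom (equivalently, the highest dyspoissonism). The function names and the fixed pass budget are my own choices, not part of any published tool:

```python
from collections import Counter
from math import lgamma

def logfreedom_of(masks, z):
    # logfreedom computed straight from a mask list
    freq = Counter(masks)
    lf = lgamma(len(masks) + 1) + lgamma(z + 1) - lgamma(z - len(freq) + 1)
    for f in freq.values():                    # sum of H[k] * ln(k!)
        lf -= lgamma(f + 1)
    for h in Counter(freq.values()).values():  # sum of ln(H[k]!)
        lf -= lgamma(h + 1)
    return lf

def max_differential_dyspoissonism(masks, z, passes=8):
    # repeatedly differentiate (first mask kept, differences mod z), tracking
    # the pass with minimum logfreedom, i.e. maximum dyspoissonism
    best = (logfreedom_of(masks, z), 0)
    for p in range(1, passes + 1):
        masks = [masks[0]] + [(b - a) % z for a, b in zip(masks, masks[1:])]
        best = min(best, (logfreedom_of(masks, z), p))
    return best  # (lowest logfreedom seen, pass index at which it occurred)
```

Applied to the G above, the very first derivative already drops the logfreedom far below its undifferentiated value, confirming the collapse described in this post.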

Alternatively, one could evaluate the kernel skew of G, purely as a backstop between dyspoissonism and the human sense of entropy.
