Entropy

Entropy describes how repetitive or varied the submitted order sizes are on each side of the book. If most updates keep using the same handful of sizes, entropy is low, while a broader mix of sizes pushes entropy higher. The bid and ask series are normalized before plotting, so the chart runs from 0 to 1 and is easier to compare across days with different numbers of observed size categories.

Mathematics

This metric builds a histogram of observed order size frequencies for bids and asks and computes Shannon entropy, $H = -\sum_i p_i \log_2 p_i$, where $p_i$ is the empirical probability of size bucket $i$. Because the raw value depends on how many categories appear, it is normalized as $H / \log_2 k$ when there are $k > 1$ nonzero categories, producing a unitless score in $[0, 1]$. Values near $0$ indicate activity concentrated in a very small set of sizes, while values near $1$ indicate sizes that are distributed much more evenly across the observed categories.
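As a sketch of the computation above, the histogram, Shannon entropy, and $\log_2 k$ normalization can be written in a few lines of standard-library Python (the function name is illustrative, not the project's actual code):

```python
import math
from collections import Counter

def normalized_size_entropy(sizes):
    """Shannon entropy of observed order sizes, normalized to [0, 1].

    Illustrative sketch: histogram the sizes, compute
    H = -sum(p_i * log2(p_i)), then divide by log2(k) for the
    k > 1 nonzero categories so scores are comparable across days.
    """
    counts = Counter(sizes)
    total = sum(counts.values())
    if total == 0:
        return 0.0
    probs = [c / total for c in counts.values()]
    h = -sum(p * math.log2(p) for p in probs)
    k = len(counts)  # number of nonzero size categories
    return h / math.log2(k) if k > 1 else 0.0

# Repeated sizes score near 0; an even spread scores exactly 1.
normalized_size_entropy([100] * 99 + [250])    # low, near 0
normalized_size_entropy([100, 200, 300, 400])  # → 1.0
```

A single repeated size gives $k = 1$, so the function falls back to $0$, matching the convention that a degenerate distribution has no entropy.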

[Figure: one dominant order-size bucket and several small ones.] Low entropy: activity is concentrated in one or two repeated sizes.

[Figure: many order-size buckets appearing at similar frequencies.] High entropy: activity is spread more evenly across many different sizes.