# Counting Matroids by Entropy

Guest post by Rudi Pendavingh and Jorn van der Pol

# Introduction

Bounding the entropy of a random variable often gives surprisingly strong bounds on the cardinality of its support. With Nikhil Bansal [BPvdP14], we gave a short entropy argument to obtain a bound on the number of matroids on groundset $\{1,2,\ldots,n\}$ that is quite close to the best known upper bound.

Roughly, the method works by bounding the number of matroids of rank 2, and then using an entropy argument to derive a bound on the number of matroids of rank $r$. The same method works for a number of other minor-closed classes, possibly after bounding the number of matroids of rank 3 (rather than 2) in the class under consideration.

In this blog post, we show how entropy can be used for counting problems in general and for counting matroids in particular, and we give a new proof for the following theorem from [PvdP15].

Theorem. If $N=U_{2,k}$ or $N=U_{3,6}$, then asymptotically almost all matroids have an $N$-minor.

# Entropy

Let $X$ be a random variable taking its values in a set $\mathcal{X}$ (we will always assume that $\mathcal{X}$ is a finite set). For $x \in \mathcal{X}$, let us write $P(x) = \mathbb{P}(X=x)$ for the probability that $X=x$. The entropy of $X$ is defined as
$$\mathcal{H}(X) = -\sum_{x \in \mathcal{X}} P(x) \log P(x),$$
where $\log$ denotes the binary logarithm, and for convenience we write $0\log 0 = 0$.

As an example, let us consider the case where $X$ takes the values 0 and 1 with probability $1-p$ and $p$, respectively. The entropy of $X$ is given by the binary entropy function,
$$\mathcal{H}(X) = H(p) := -p \log p - (1-p) \log (1-p).$$
If $p=1/2$, $X$ is a purely random bit of information, and we have $\mathcal{H}(X) = 1$. If $p=0$ or $p=1$, there is no randomness involved, and $\mathcal{H}(X) = 0$.
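The two extreme cases are easy to verify directly. Here is a minimal sketch in Python (the function name `binary_entropy` is ours):

```python
from math import log2

def binary_entropy(p):
    """H(p) = -p log p - (1-p) log(1-p), with the convention 0 log 0 = 0."""
    if p in (0, 1):  # no randomness involved
        return 0.0
    return -p * log2(p) - (1 - p) * log2(1 - p)

print(binary_entropy(0.5))  # a purely random bit: 1.0
print(binary_entropy(0.0))  # a deterministic bit: 0.0
```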

More generally, the entropy of $X$ measures the amount of information that is gained by learning the value of $X$. This may provide some intuition for the following properties of $\mathcal{H}(X)$.

• Boundedness: $\mathcal{H}(X) \le \log |\mathcal{X}|$.
• Subadditivity: $\mathcal{H}(X,Y) \le \mathcal{H}(X) + \mathcal{H}(Y)$.

Boundedness holds with equality if and only if $X$ has the uniform distribution on $\mathcal{X}$, i.e. $P(x) = 1/|\mathcal{X}|$ for all $x$. This makes entropy very useful for counting problems. If the random variable $X$ has the uniform distribution over $\mathcal{X}$, then $|\mathcal{X}| = 2^{\mathcal{H}(X)}$. So, bounds on the entropy of $X$ translate directly to bounds on $|\mathcal{X}|$.
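The counting connection can be seen concretely in a small sketch (the helper `entropy`, taking an explicit probability vector, is ours):

```python
from math import log2

def entropy(dist):
    """Entropy of a distribution given as a list of probabilities summing to 1."""
    return -sum(p * log2(p) for p in dist if p > 0)

# Uniform distribution on a set of 8 outcomes: H(X) = log 8 = 3 bits,
# so the support size is recovered as |X| = 2**H(X) = 8.
H = entropy([1 / 8] * 8)
print(H, 2 ** H)  # 3.0 8.0

# A non-uniform distribution on the same support has strictly smaller entropy.
print(entropy([1 / 2, 1 / 4, 1 / 8, 1 / 8]) < log2(4))  # True
```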

In our applications, the space $\mathcal{X}$ consists of binary vectors, indexed by some set $J$. This allows us to define projections $X_T = (X_j : j \in T)$ for $T\subseteq J$.

Our main tool is the following result, which was first obtained in [CFGS86] in a different form. A proof and further discussion can also be found in [AS08].

Shearer’s Entropy Lemma. If $X$ is a random variable taking values in $\mathcal{X} \subseteq \{0,1\}^J$, and $T_1, T_2, \ldots, T_m$ is a collection of subsets of $J$ such that each $j \in J$ is contained in at least $k$ of the $T_i$, then
$$\mathcal{H}(X) \le \frac{1}{k} \sum_{i=1}^m \mathcal{H}(X_{T_i}).$$

Intuitively, the lemma asserts that if every bit of information is contained in at least $k$ of the projected random variables, then the total amount of information contained in the projections is at least $k$ times the amount of information in $X$. Note that subadditivity follows as a special case of Shearer’s Lemma.
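A quick numerical check of the lemma (a sketch with a hand-picked $\mathcal{X} \subseteq \{0,1\}^3$; the cover consists of the three coordinate pairs, so each index is covered $k=2$ times):

```python
from collections import Counter
from math import log2

def entropy_uniform(values):
    """Entropy of the uniform distribution over a list (duplicates allowed)."""
    counts = Counter(values)
    n = len(values)
    return -sum(c / n * log2(c / n) for c in counts.values())

# X uniform on a set of binary vectors indexed by J = {0, 1, 2}.
X = [(0, 0, 0), (0, 1, 1), (1, 0, 1), (1, 1, 0), (1, 1, 1)]

# Each index j in J lies in exactly k = 2 of the index sets T_i.
T, k = [(0, 1), (1, 2), (0, 2)], 2

lhs = entropy_uniform(X)  # H(X) = log 5
rhs = sum(entropy_uniform([tuple(x[j] for j in Ti) for x in X]) for Ti in T) / k
print(lhs <= rhs)  # True: Shearer's inequality holds
```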

# Matroid counting

In what follows, we will write $\mathbb{M}$ for the class of all matroids, $\mathbb{M}(E,r)$ for the class of matroids on groundset $E$ of rank $r$, and so on. $\mathcal{M}$ will be a subclass of matroids that is always closed under isomorphism, and possibly satisfies some additional properties.

As $\mathcal{M}$ is closed under isomorphism, $|\mathcal{M}\cap\mathbb{M}(E,r)|$ depends on $E$ only through its cardinality, and we will write $m_\mathcal{M}(n,r) = |\mathcal{M}\cap\mathbb{M}([n],r)|$.

The following result illustrates Shearer’s Entropy Lemma. As its proof is short, we will include it here.

Lemma. If $\mathcal{M}$ is closed under taking submatroids, then $$\frac{\log\left(1+m_{\mathcal{M}}(n,r)\right)}{ \binom{n}{r}} \leq \frac{\log \left(1+m_{\mathcal{M}}(n-1,r)\right)}{ \binom{n-1}{r}}.$$

Proof: If $M$ is a matroid with set of bases $\mathcal{B}$, and $e\in E(M)$ is not a coloop of $M$, then $\{B \in \mathcal{B} : e\not\in B\}$ is the set of bases of $M\backslash e$. We will apply Shearer’s Entropy Lemma in such a way that its projections correspond to deletions.

Let $E$ be a set of cardinality $n$. We encode $M\in\mathcal{M}\cap\mathbb{M}(E,r)$ by the incidence vector of its bases. This is the binary vector $\chi$, indexed by the $r$-subsets of $[n]$, such that $\chi(B) = 1$ if and only if $B$ is a basis of $M$. Our space $\mathcal{X} \equiv \mathcal{X}(E,r)$ consists of all incidence vectors corresponding to $M\in\mathcal{M}\cap\mathbb{M}(E,r)$, as well as the all-zero vector. Thus, $|\mathcal{X}| = 1+m_\mathcal{M}(n,r)$, and if $X$ has the uniform distribution on $\mathcal{X}$, then $\mathcal{H}(X) = \log\left(1+ m_\mathcal{M}(n,r) \right)$.

If $e$ is not a coloop of $M$, then the incidence vector of $M\backslash e$ is obtained from $\chi$ by restricting $\chi$ to the entries avoiding $e$. (If $e$ is a coloop, this operation yields the all-zero vector, which is the reason that we need to include the all-zero vector in $\mathcal{X}$.)

We make two observations.

• For $e \in E$, let $T_e = \binom{E-e}{r}$. It follows from the previous paragraph that if $x \in \mathcal{X}(E,r)$, then the projection $x_{T_e}\in\mathcal{X}(E-e,r)$.
• Each $r$-subset of $E$ is contained in $|E|-r$ different $T_e$.

An application of Shearer’s Entropy Lemma, followed by an application of the Boundedness Property shows
$$\mathcal{H}(X) \le \frac{1}{n-r} \sum_{e \in E} \mathcal{H}(X_{T_e}) \le \frac{n}{n-r} \log \left(1+ m_\mathcal{M}(n-1,r) \right),$$
which, after dividing both sides by $\binom{n}{r}$ and using $\frac{n-r}{n}\binom{n}{r} = \binom{n-1}{r}$, concludes the proof. $\square$
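The lemma can be checked by brute force in tiny cases. The sketch below (helper names are ours; here $\mathcal{M} = \mathbb{M}$, which is certainly closed under taking submatroids) enumerates all families of $2$-subsets satisfying the basis-exchange axiom, i.e. all rank-$2$ matroids on $3$ and on $4$ elements, and verifies the inequality:

```python
from itertools import combinations
from math import comb, log2

def is_basis_family(bases):
    """Basis-exchange axiom for a nonempty family of equal-size sets."""
    return all(any((B1 - {e}) | {f} in bases for f in B2 - B1)
               for B1 in bases for B2 in bases for e in B1 - B2)

def m_rank2(n):
    """Number of rank-2 matroids on groundset {0, ..., n-1}, by brute force."""
    pairs = [frozenset(p) for p in combinations(range(n), 2)]
    return sum(is_basis_family(set(fam))
               for size in range(1, len(pairs) + 1)
               for fam in combinations(pairs, size))

m3, m4 = m_rank2(3), m_rank2(4)
lhs = log2(1 + m4) / comb(4, 2)
rhs = log2(1 + m3) / comb(3, 2)
print(m3, m4, lhs <= rhs)  # 7 36 True
```

Each accepted family is the set of bases of a rank-$2$ matroid (loops plus a partition of the remaining elements into at least two parallel classes), which is what makes the counts so small.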

Repeated application of the statement dual to the lemma above (with deletion replaced by contraction) yields the following variant:

Matroid Entropy Lemma. If $\mathcal{M}$ is closed under contraction, then for all $t \le r$
$$\frac{\log\left(1+m_{\mathcal{M}}(n,r)\right)}{\binom{n}{r}} \le \frac{\log\left(1+m_{\mathcal{M}}(n-t,r-t) \right)}{\binom{n-t}{r-t}}.$$

This variant enables us to lift upper bounds on the number of matroids of a fixed rank $s$  to upper bounds on the number of matroids in any rank $r\geq s$.

The first natural application of this scheme is counting matroids in general, i.e. the case $\mathcal{M}=\mathbb{M}$. We will abbreviate $m(n,r):=m_{\mathbb{M}}(n,r)$, and consider the matroids of fixed rank $s=2$. Since each matroid of rank 2 on $n$ elements determines a nontrivial partition of a set of $n+1$ elements, we have $1+m(n,2)\leq (n+1)^n$. Hence, $$\frac{\log (1+m(n,r))}{ \binom{n}{r}} \leq \frac{\log (1+m(n-r+2,2))}{\binom{n-r+2}{2}}\leq \frac{2\log(n-r+3)}{n-r+1}.$$ It is then straightforward to derive the following for $m(n):=\sum_r m(n,r)$.
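To see the resulting bound concretely, the rough script below (our own sketch) sums the per-rank bound over all ranks, handling the trivial ranks exactly ($m(n,0)=1$ and $m(n,1)=2^n-1$), and compares it with the lower bound $\binom{n}{n/2}/n$ stated next; the growing ratio reflects the $\log n$ gap between the two.

```python
from math import comb, log2

def upper_log_m(n):
    """Entropy upper bound on log m(n): exact values for ranks 0 and 1, and
    log(1 + m(n, r)) <= binom(n, r) * 2 log(n - r + 3) / (n - r + 1)
    for ranks r >= 2."""
    return (1 + n) + sum(comb(n, r) * 2 * log2(n - r + 3) / (n - r + 1)
                         for r in range(2, n + 1))

def lower_log_m(n):
    """Knuth / Graham-Sloane lower bound: log m(n) >= binom(n, n/2) / n."""
    return comb(n, n // 2) / n

for n in (20, 40, 80):
    print(n, upper_log_m(n) / lower_log_m(n))
```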

Theorem (Bansal, Pendavingh, Van der Pol 2014).

$$\log m(n) \leq O\left(\binom{n}{n/2}\frac{\log(n)}{n}\right)\text{ as }n\rightarrow\infty$$

For comparison, we state the following lower bound.

Theorem (Knuth 1974;  Graham, Sloane 1980). $\log m(n) \geq \binom{n}{n/2}/n$.

So our entropy bound on $\log m(n)$ is off by a factor $\log n$, which is not too bad given the simplicity of the argument. The $\log n$ factor will not go away if we use a fixed rank $s>2$ for the base case. The following paraphrases a result due to Bennett and Bohman (see [Kee15]).

Theorem. For any fixed $s\geq 3$, we have  $$\frac{\log m(n,s)}{\binom{n}{s}}\geq \frac{\log(e^{1-s}(n-s+1))}{n-s+1}$$ for all sufficiently large $n$.

We defer the details of these lower bounds to a later post.

# Further applications

To show that asymptotically almost all matroids have a certain fixed matroid $N$ as a minor, it suffices to find an upper bound on the number of matroids without such a minor which is vanishing compared to the above lower bound on $m(n)$. We next show how entropy counting gives a sufficient upper bound in case $N=U_{2,k}$ or $N=U_{3,6}$. Let $$Ex(N):= \{M \in \mathbb{M}: M\not \geq N\}.$$ We first consider a fixed $N=U_{2,k}$, and bound $m'(n,r):=m_{Ex(U_{2,k})}(n,r)$.

Each matroid of rank 2 on $n$ elements without a $U_{2,k}$-minor determines a partition of $n+1$ items into at most $k$ parts, so that $1+m'(n,2)\leq k^{n+1}$. This gives $$\frac{\log (1+m'(n,r))}{ \binom{n}{r}} \leq \frac{\log (1+m'(n-r+2,2))}{\binom{n-r+2}{2}}\leq \frac{2\log(k)}{n}(1+o(1)),$$ which does not suffice to push the upper bound on $m_{Ex(U_{2,k})}(n):=\sum_r m'(n, r)$ below the lower bound on $m(n)$.
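The encoding can be tested directly in a tiny case. The sketch below (helper names are ours) builds every rank-$2$ matroid with at most $k-1$ parallel classes on $[5]$, i.e. without a $U_{2,k}$-minor, from a labelling of its elements — label $0$ for loops, labels $1,\ldots,k-1$ for the parallel classes — deduplicates by basis family, and compares the count with $k^{n+1}$:

```python
from itertools import combinations, product

def rank2_bases(labels):
    """Bases of the rank-2 matroid encoded by `labels`: element i is a loop
    if labels[i] == 0, else lies in parallel class labels[i]; the bases are
    the pairs taken from two distinct classes. Returns None if the rank < 2."""
    bases = frozenset(frozenset((i, j))
                      for i, j in combinations(range(len(labels)), 2)
                      if 0 != labels[i] != labels[j] != 0)
    return bases or None

n, k = 5, 3
matroids = {b for labels in product(range(k), repeat=n)
            if (b := rank2_bases(labels)) is not None}
print(len(matroids), k ** (n + 1))  # the count stays well below the bound
```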

At the time of writing of [PvdP15], we made this same calculation and then gave up on using the matroid entropy lemma to produce the upper bound. Instead, we developed cover complexity to do the job. But recently we noticed that we were wrong to give up on entropy so soon: it works beautifully if we proceed from matroids of fixed rank $s=3$ rather than $s=2$.

It is easy to show that a simple matroid of rank 3 without $U_{2,k}$ as a minor has at most $1+(k-1)(k-2)$ points (fix a point $p$; every other point lies on one of the at most $k-1$ lines through $p$, and each of these lines contains at most $k-2$ points besides $p$). Hence, there is a global upper bound $c_k$ on the number of such (simple) matroids. Since each matroid is fully determined by its simplification and the assignment of its non-loop elements to parallel classes, we obtain the upper bound $1+m'(n,3)\leq c_k (k^2)^n$. It follows that $$\frac{\log (1+m'(n,r))}{ \binom{n}{r}} \leq \frac{\log (1+m'(n-r+3,3))}{\binom{n-r+3}{3}}\leq \frac{12\log(k)}{n^2}(1+o(1)).$$ That bound does suffice:

Theorem (Pendavingh, Van der Pol 2015).  For any fixed $k$, we have $$\log m_{Ex(U_{2,k})}(n) \leq O\left(\binom{n}{n/2}/n^2\right)\text{ as } n\rightarrow \infty$$

Hence $\log m_{Ex(U_{2,k})}(n)\leq o(\log m(n))$, and it follows that asymptotically almost all matroids on $n$ elements have $U_{2,k}$ as a minor.

To bound the matroids without $U_{3,6}$ as a minor, it will also suffice to consider simple matroids in base rank 3.

Lemma. There is a constant $n_0$ so that if $M$ is a simple matroid of rank 3 on $E$ without a $U_{3,6}$-minor, and $|E|\geq n_0$, then there are lines $\ell_1$ and $\ell_2$ and a point $p$ of $M$ so that $E= \ell_1\cup\ell_2\cup\{p\}$.

Our shortest argument works for $n_0=56$. You may enjoy improving that constant.

The structural description of the lemma implies a bound on the number of simple matroids without $U_{3,6}$ in rank 3, which again is enough to prove that asymptotically almost all matroids on $n$ elements have $U_{3,6}$ as a minor.

# Conjectures

When applying the matroid entropy lemma, a key issue seems to be the choice of fixed rank $s$ for the base case. The lower we pick $s$, the less work we have proving the base case, but a higher $s$ will give better bounds. And as we found out to our embarrassment, we sometimes need to go to a sufficiently high $s$ before our bounds are any good.

So far we have not ventured beyond base rank $s=3$, which of course comes with an attractive arena of points-and-lines-with-your-favourite-property. We think this perhaps means that the entropy counting lemma has not been used to its full potential. To illustrate,  we offer the following conjectures.

Matroids without the 3-whirl

Conjecture. Asymptotically almost all matroids have the 3-whirl $W^3$ as a minor.

It is a theorem that almost all matroids are 3-connected [OSWW13], and we hazard the guess that most of those 3-connected matroids will have $M(K_4)$ or $W^3$ as a minor. Having $M(K_4)$ as a minor seems strictly more of a 'coincidence' than having $W^3$ as a minor, because $M(K_4)$ has all the hyperplanes of $W^3$, plus one extra. So if at least one of $W^3$ and $M(K_4)$ is a minor of almost all matroids, and the former is more likely than the latter, our conjecture becomes quite believable.

We could not obtain sufficiently strong upper bounds on $m_{Ex(W^3)}(n,s)$ in fixed rank $s=3$, but $s=4$ is open.

Matroids without a fixed uniform minor

We have very recently established the following [PvdP16].

Theorem. Let $N$ be a uniform matroid. Asymptotically almost all matroids have $N$ as a minor.

The technique we used for this is much more involved than the above entropy method, and it does not give good bounds in fixed rank. The following is open.

Conjecture. Let $N$ be a uniform matroid. For any $c$ there exists an $s$ so that $$\log\left(1+m_{Ex(N)}(n,s)\right) / \binom{n}{s} \leq \frac{c}{n}$$ for all sufficiently large $n$.

If this conjecture is true for some $c<1/2$, this would yield an alternative proof of our theorem.

Oriented matroids

Conjecture. Asymptotically almost all matroids are not orientable.

This conjecture may not appeal to all of you. An enumeration of small matroids reveals that the majority is, in fact, orientable [MMIB12].

Let $\bar{m}(n,r)$ denote the number of oriented matroids on $E=\{1,\ldots, n\}$ and of rank $r$. There is a perfect analogue of the matroid entropy lemma for oriented matroids, with an entirely analogous proof (you use chirotopes rather than incidence vectors of base sets, otherwise the proof is identical).

Oriented Matroid Entropy Lemma. For all $t \le r$
$$\frac{\log\left(1+\bar{m}(n,r)\right)}{\binom{n}{r}} \le \frac{\log\left(1+\bar{m}(n-t,r-t) \right)}{\binom{n-t}{r-t}}.$$

If $\log \bar{m}(n)\leq (1-\epsilon)\log m(n)$ for sufficiently large $n$, then the number of oriented matroids, and hence the number of orientable matroids, will be vanishing compared to the number of matroids, which would prove the conjecture. It would suffice to establish:

Conjecture. There is an $s$ and a $c<1/2$ such that $$\frac{\log\left(1+\bar{m}(n,s)\right)}{ \binom{n}{s}} \leq \frac{c}{n}$$ for all sufficiently large $n$.

Except for the constant $c<1/2$, that does not seem too much to ask, given the following theorem [ERS86].

Theorem. For each $s$ there is a $c$ such that $$\frac{\log \bar{m}(n,s)} { \binom{n}{s} }\leq \frac{c}{n}$$ for all sufficiently large $n$.

Note that this theorem already implies that for any fixed $s$, we have $$\log \bar{m}(n,s)\leq o(\log m(n,s))$$ for $n\rightarrow\infty$, since $\log m(n,s)/\binom{n}{s}\sim \log (n)/n$ as $n\rightarrow\infty$.

Finally, a construction due to Felsner and Valtr [FV11] shows that $$\log \bar{m}(n,3)\geq 0.1887 n^2,$$ i.e. $\log \bar{m}(n,3)/ \binom{n}{3} \geq 6\cdot 0.1887/n \geq 1.13/n$, using $\binom{n}{3}\leq n^3/6$. So base rank $s=3$ will not do, but $s=4$ is open…

# References

[AS08] Noga Alon and Joel Spencer, The probabilistic method, 3rd edition.

[BPvdP14] N. Bansal, R.A. Pendavingh, and J.G. van der Pol. An entropy argument for counting matroids. Journal of Combinatorial Theory B, 109:258-262, 2014.

[CFGS86] F. R. K. Chung, R. L. Graham, P. Frankl, and J. B. Shearer. Some intersection theorems for ordered sets and graphs. Journal of Combinatorial Theory A, 43(1):23-37, 1986.

[ERS86] H. Edelsbrunner, J. O’Rourke, and R. Seidel. Constructing arrangements of lines and hyperplanes with applications. SIAM Journal on Computing 15 (2): 341–363, 1986 doi:10.1137/0215024.

[FV11] S. Felsner and P. Valtr. Coding and Counting Arrangements of Pseudolines. Discrete & Computational Geometry, 46:405

[Kee15] P. Keevash. Counting designs. arXiv preprint 1504.02909.

[MMIB12] Y. Matsumoto, S. Moriyama, H. Imai and D. Bremner. Matroid enumeration for incidence geometry. Discrete and Computational Geometry. Vol. 47, issue 1, pp. 17-43, 2012.

[OSWW13] J. Oxley, C. Semple, L. Warshauer, and D. Welsh. On properties of almost all matroids. Adv. Appl. Math., 50(1):115–124, 2013.

[PvdP15] R.A. Pendavingh and J.G. van der Pol. Counting matroids in minor-closed classes. Journal of Combinatorial Theory B, 111:126–147, 2015.

[PvdP16] R.A. Pendavingh and J.G. van der Pol. On the number of bases in almost all matroids. arXiv preprint 1602.04763.
