## Section 11.4 Applying Probability to Ramsey Theory

The following theorem, due to P. Erdős, is a true classic, and it is presented here in a manner faithful to how it was first published. As we shall see shortly, it was subsequently recast, but that would be putting the cart before the horse.

###### Theorem 11.4.

If \(n\ge 3\) is an integer, then

\[R(n,n)>2^{n/2}\text{.}\]

###### Proof.

Let \(t\) be an integer with \(t>n\) and consider the family \(\cgF\) of all labeled graphs with vertex set \(\{1,2,\dots,t\}\text{.}\) Clearly, there are \(2^{\binom{t}{2}}\) graphs in this family. Let \(\cgF_1\) denote the subfamily consisting of those graphs which contain a complete subgraph of size \(n\text{.}\) Since every such graph arises by selecting the \(n\) vertices of a complete subgraph and then filling in the remaining \(\binom{t}{2}-\binom{n}{2}\) potential edges arbitrarily, it is easy to see that

\[|\cgF_1|\le\binom{t}{n}2^{\binom{t}{2}-\binom{n}{2}}\text{.}\]

Similarly, let \(\cgF_2\) denote the subfamily consisting of those graphs which contain an independent set of size \(n\text{.}\) The same counting shows that

\[|\cgF_2|\le\binom{t}{n}2^{\binom{t}{2}-\binom{n}{2}}\text{.}\]
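These bounds overcount, since a graph may contain several complete subgraphs of size \(n\text{,}\) but they are all we need. As a sanity check, a short brute-force computation (a sketch, not part of the text) can compare the bound \(\binom{t}{n}2^{\binom{t}{2}-\binom{n}{2}}\) with the exact size of \(\cgF_1\) in the tiny case \(t=4\text{,}\) \(n=3\text{:}\)

```python
from itertools import combinations
from math import comb

def count_containing_clique(t, n):
    """Exact number of labeled graphs on {0,...,t-1} that contain a
    complete subgraph of size n, by enumerating all 2^C(t,2) edge
    subsets (feasible only for tiny t)."""
    pairs = list(combinations(range(t), 2))
    count = 0
    for mask in range(1 << len(pairs)):
        edges = {p for i, p in enumerate(pairs) if mask >> i & 1}
        # is some n-subset complete, i.e., are all C(n,2) pairs edges?
        if any(all(p in edges for p in combinations(s, 2))
               for s in combinations(range(t), n)):
            count += 1
    return count

t, n = 4, 3
exact = count_containing_clique(t, n)
bound = comb(t, n) * 2 ** (comb(t, 2) - comb(n, 2))
print(exact, bound)  # → 23 32
```

Here \(|\cgF_1|=23\) while the bound gives \(4\cdot 2^{3}=32\text{,}\) so the estimate is generous but of the right order.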

We want to take the integer \(t\) as large as we can while still guaranteeing that \(|\cgF_1|+|\cgF_2|\lt |\cgF|\text{.}\) This will imply that there is a graph \(G\) in \(\cgF\) which contains neither a complete subgraph of size \(n\) nor an independent set of size \(n\text{.}\) So consider the following inequality:

\[2\binom{t}{n}2^{\binom{t}{2}-\binom{n}{2}}\lt 2^{\binom{t}{2}}\text{.}\tag{11.4.1}\]

Now we ask: how large can \(t\) be without violating inequality (11.4.1)? Canceling the common factor of \(2^{\binom{t}{2}-\binom{n}{2}}\text{,}\) the inequality reduces to \(2\binom{t}{n}\lt 2^{\binom{n}{2}}\text{.}\) To answer the question, we use the trivial inequality \(\binom{t}{n}\le t^n/n!\) and then the consequence \(n!\ge (n/e)^n\) of the Stirling approximation for \(n!\text{.}\) After some algebra and taking the \(n^{\text{th}}\) root of both sides, we see that we need only guarantee that

\[t\le\frac{n}{e}\,2^{\frac{n-1}{2}-\frac{1}{n}}\text{.}\]

Since this quantity exceeds \(2^{n/2}\) for all \(n\ge 5\text{,}\) and the cases \(n=3\) and \(n=4\) follow directly from \(R(3,3)=6\) and \(R(4,4)=18\text{,}\) the theorem follows.
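In fact, inequality (11.4.1) can also be solved exactly, with no approximation at all, since after cancellation it only requires \(2\binom{t}{n}\lt 2^{\binom{n}{2}}\text{.}\) A few lines of exact integer arithmetic (a sketch, not part of the original text) find the largest admissible \(t\) and compare it with \(2^{n/2}\text{:}\)

```python
from math import comb

def largest_t(n):
    """Largest t with 2*C(t,n) < 2^C(n,2); for any such t there must be
    a graph on t vertices with no complete or independent set of size n."""
    threshold = 2 ** comb(n, 2)
    t = n
    while 2 * comb(t + 1, n) < threshold:
        t += 1
    return t

for n in range(4, 11):
    # largest t from the exact inequality vs. the 2^(n/2) benchmark
    print(n, largest_t(n), round(2 ** (n / 2)))
```

For example, \(n=4\) allows \(t=6\) and \(n=6\) allows \(t=17\text{,}\) comfortably above \(2^{n/2}=4\) and \(8\) respectively, which is why the approximations in the proof cost nothing in the final bound.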

Now let's take a second look at the proof of Theorem 11.4. We consider a probability space \((S,P)\) where the outcomes are graphs with vertex set \(\{1,2,\dots,t\}\text{.}\) For each \(i\) and \(j\) with \(1\le i \lt j\le t\text{,}\) edge \(ij\) is present in the graph with probability \(1/2\text{.}\) Furthermore, the events for distinct pairs are independent.

Let \(X_1\) denote the random variable which counts the number of \(n\)-element subsets of \(\{1,2,\dots,t\}\) for which all \(\binom{n}{2}\) pairs are edges in the graph. Similarly, \(X_2\) is the random variable which counts the number of \(n\)-element independent subsets of \(\{1,2,\dots,t\}\text{.}\) Then set \(X=X_1+X_2\text{.}\)

By linearity of expectation, \(E(X)=E(X_1)+E(X_2)\) while

\[E(X_1)=E(X_2)=\binom{t}{n}\left(\frac{1}{2}\right)^{\binom{n}{2}}\text{,}\]

so that \(E(X)=2\binom{t}{n}(1/2)^{\binom{n}{2}}\text{.}\)
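To see this expectation in action, one can sample random graphs and compare the empirical average of \(X\) against \(2\binom{t}{n}(1/2)^{\binom{n}{2}}\text{.}\) The snippet below (an illustrative sketch, not from the text) uses the small case \(t=5\text{,}\) \(n=3\text{:}\)

```python
import random
from itertools import combinations
from math import comb

def sample_X(t, n, rng):
    """One draw of X = X1 + X2: include each edge independently with
    probability 1/2, then count the n-subsets that are complete plus
    those that are independent."""
    edges = {p for p in combinations(range(t), 2) if rng.random() < 0.5}
    x = 0
    for s in combinations(range(t), n):
        inside = [p in edges for p in combinations(s, 2)]
        if all(inside) or not any(inside):
            x += 1
    return x

t, n, trials = 5, 3, 20000
rng = random.Random(0)
estimate = sum(sample_X(t, n, rng) for _ in range(trials)) / trials
exact = 2 * comb(t, n) * (1 / 2) ** comb(n, 2)
print(exact)               # → 2.5
print(round(estimate, 2))  # close to 2.5
```

With \(t=5\) and \(n=3\) the formula gives \(E(X)=2\cdot 10\cdot(1/2)^3=2.5\text{,}\) and the Monte Carlo estimate agrees closely.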

If \(E(X)\lt 1\text{,}\) then there must exist a graph with vertex set \(\{1,2,\dots,t\}\) without a \(K_n\) or an \(I_n\text{.}\) And the question of how large \(t\) can be while maintaining \(E(X)\lt 1\) leads to exactly the same calculation we had before.
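Since \(E(X)\lt 1\) guarantees only existence, actually exhibiting such a graph is a matter of sampling until one appears. A sketch of this search (the helper names are hypothetical, not from the text) for \(n=4\) and \(t=6\text{,}\) where \(E(X)=2\binom{6}{4}(1/2)^6=30/64\lt 1\text{:}\)

```python
import random
from itertools import combinations

def is_good(edges, t, n):
    """True if the graph has no complete n-subset and no independent n-subset."""
    for s in combinations(range(t), n):
        inside = [p in edges for p in combinations(s, 2)]
        if all(inside) or not any(inside):
            return False
    return True

def find_witness(t, n, rng, tries=1000):
    """Sample random graphs until one avoids both K_n and I_n."""
    for _ in range(tries):
        edges = frozenset(p for p in combinations(range(t), 2)
                          if rng.random() < 0.5)
        if is_good(edges, t, n):
            return edges
    return None  # extremely unlikely when E(X) < 1

witness = find_witness(6, 4, random.Random(1))
print(witness is not None)
```

Because \(E(X)\lt 1\text{,}\) Markov's inequality says each sample is good with probability at least \(1-E(X)\approx 0.53\text{,}\) so a witness turns up almost immediately; this is the constructive face of the probabilistic method, even though the proof itself never exhibits a graph.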

After more than fifty years and the efforts of many very bright researchers, only marginal improvements have been made on the bounds on \(R(n,n)\) from Theorem 11.2 and Theorem 11.4. In particular, no one has been able to settle whether there is some constant \(c\lt 2\) and an integer \(n_0\) so that \(R(n,n)\lt 2^{cn}\) when \(n>n_0\text{.}\) Similarly, no one has been able to determine whether there is some constant \(d>1/2\) and an integer \(n_1\) so that \(R(n,n)>2^{dn}\) when \(n>n_1\text{.}\) We would certainly give you an \(A\) for this course if you managed to do either.

###### Discussion 11.5.

Carlos said that he had been trying to prove a good lower bound on \(R(n,n)\) using only constructive methods, i.e., with no random techniques allowed. But he was having problems. Anything he tried seemed only to show that \(R(n,n)\ge n^c\) where \(c\) is a constant, which seems terribly weak compared to the exponential bound that the probabilistic method gives so easily. Usually Alice was not very sympathetic to the complaints of others, and certainly not to those of Carlos, who always seemed to be out front. But this time, Alice said to Carlos, in a manner that all could hear, “Maybe you shouldn't be so hard on yourself. I read an article on the web saying that nobody has been able to show that there is a constant \(c>1\) and an integer \(n_0\) so that \(R(n,n)>c^n\) when \(n>n_0\text{,}\) provided that only constructive methods are allowed. And maybe, just maybe, being unable to do something that lots of other famous people also seem unable to do is not so bad.” Bob saw a new side of Alice, and this too wasn't all bad.