# Spectrum auctions: elegance is in the mundane

As a student, Anthony Lee Zhang, now a professor at the Booth School of Business, was puzzled that Paul Milgrom chose to devote so much time to spectrum auctions. Milgrom, co-recipient of the 2020 Nobel Prize in Economics, helped design the world’s first spectrum auction in 1994 and has been closely involved in the design process ever since. His work engages with the real world, gory details and all, and Stanford estimates that spectrum auctions have generated over $100 billion in government revenue worldwide.

When I joined Stanford GSB as a PhD student in 2014, I didn’t have a great idea of what I wanted to do. My sense was that there were two groups of economists. Policy-minded people spent their time studying regulatory changes and running lots of regressions. Theorists were much more technical, solving abstract and beautiful mathematical models, but the connection between their work and reality sometimes seemed a little tenuous. There seemed to be a bit of a trade-off, and I wasn’t sure of my own marginal rate of substitution between theoretical elegance and policy relevance.

I started talking to Paul Milgrom towards the end of my first year. Even before the Nobel Prize, Paul was a bit of a living legend among theorists. His ten most-cited works cover five or six different subfields of theory. It was difficult to find a topic to work on that Paul had not made a fundamental contribution to.

At the time, I was actually a little puzzled that Paul chose to spend so much time on spectrum auctions in particular. Spectrum auctions are a very specific problem. Only one paper on auction theory appears among Paul’s top ten on Google Scholar, and nothing directly related to spectrum auctions appears on the first page. Why spend so much time on spectrum auctions, rather than on something more general and fundamental?

In my third year, I had the chance to work with Paul on a short policy proposal for the 3.5 GHz spectrum band, and I learned a couple of things from the experience. Dealing with real policy was about as complicated as I had imagined: the FCC’s 2015 Report and Order for the 3.5 GHz band runs to 187 pages. I was deeply impressed that Paul, a legendary theorist, was thoroughly familiar with the institutional details of this and previous auctions.

I was struck by the fact that Paul is not the kind of academic who pontificates from on high, leaving the details of implementation to others. He helped design the world’s first spectrum auction in 1994 and has been closely involved in the design process ever since. Two and a half decades after the field began, he is still leading from the trenches.

Paul recently led the design of the FCC’s Incentive Auction, which concluded in 2017. What set it apart from previous auctions is that it was a *double auction*: the FCC bought spectrum from approximately 1,000 TV broadcasters, repackaged it, and resold it for broadband use. The design is described in detail in Paul’s recent book, *Discovering Prices*; I will describe it briefly here.

The basic idea is this. Suppose stations A and B are broadcasting on channel 7 and station C is broadcasting on channel 8. If the FCC buys spectrum from station B, it can free up all of channel 8 by requiring station C to switch to channel 7. This “repacking” process allows the FCC to assemble contiguous blocks of spectrum, which it can then sell for broadband use. The FCC’s goal in the reverse auction is thus to find the lowest total cost at which it can buy enough spectrum from TV broadcasters that the remaining stations can be repacked into fewer channels, freeing up the spectrum it needs.

This process is difficult because two broadcasters that are too close to each other cannot use the same channel. In the example, stations A and C can only be repacked into the same channel if they are far enough apart that their transmissions do not interfere with each other.
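The repacking constraint can be viewed as a graph-coloring problem: stations are nodes, interference pairs are edges, and channels are colors. Here is a toy sketch, with invented stations and interference pairs mirroring the A/B/C example (this is an illustration, not the FCC’s actual solver):

```python
from itertools import product

def feasible_repack(stations, interference, channels):
    """Brute-force check of whether every station can be assigned a
    channel such that no two interfering stations share one. This is
    exactly graph coloring: stations = nodes, interference = edges,
    channels = colors."""
    for assignment in product(channels, repeat=len(stations)):
        chan = dict(zip(stations, assignment))
        if all(chan[a] != chan[b] for a, b in interference):
            return chan  # a valid repacking
    return None  # no valid repacking exists

# Mirroring the example: A-B and B-C interfere, but A and C are far
# enough apart that they can share a channel.
print(feasible_repack(["A", "B", "C"], [("A", "B"), ("B", "C")], [7, 8]))

# If the FCC buys out station B, the remaining stations fit into
# channel 7 alone, freeing channel 8 entirely.
print(feasible_repack(["A", "C"], [], [7]))
```

The brute-force loop works for three stations; the point of the next paragraphs is that it fails catastrophically at realistic scale.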

Finding the optimal solution to the FCC’s problem is a bit like assembling a thousand-piece jigsaw puzzle – except that the pieces are all different sizes, the edges don’t fit together perfectly, and there is no picture on the box to tell you whether you are anywhere near the right answer. You might decide that the best solution probably puts station C in the same channel as station D, and then piece the rest of the puzzle together under that assumption. But you have no way of verifying whether your initial guess was correct, and if it wasn’t, it ruins the entire block of pieces you built around it. It’s like getting halfway through a Sudoku puzzle and realizing you made a mistake at the start.

How hard can it be? If there are only 1,000 stations, can’t we just have a computer search through all the possible solutions? It turns out that these problems are incredibly difficult, even for computers. With 1,000 stations, the number of different sets of stations we would need to consider is 2^{1000}, or about 10^{300}.

To get a sense of how large this number is: there are about 10^{80} atoms in the universe. Imagine that each atom in the universe contains another universe, and that every atom in each of those universes contains another universe, and so on. We would have to go down to the *fourth* layer of nested universes to get enough atoms simply to *count* the possible combinations of stations that the FCC would have to consider.
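These back-of-the-envelope numbers are easy to verify directly (a quick sketch; the level counting here assumes the top universe is level 1):

```python
from math import log10

# Number of subsets of 1,000 stations: 2^1000 is roughly 10^301.
print(round(log10(2 ** 1000)))  # → 301

# Nested universes: ~10^80 atoms per universe, each atom holding a
# whole sub-universe. Counting the top universe as level 1, level k
# contains 10^(80*k) atoms in total.
for level in range(1, 5):
    print(level, 10 ** (80 * level) >= 10 ** 300)
# Only at the fourth level (10^320 atoms) are there enough atoms to
# count the ~10^300 station combinations.
```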

There is a sort of Hayekian point here. Optimal resource-allocation decisions are just big optimization problems. But they are *really* big problems, and solving them by brute computational force is almost unbelievably hard. Even the relatively small problem of buying 100 MHz of spectrum from 1,000 US broadcasters would require far more than all the computing power in the known universe.

The magic, of course, is that there is a shortcut: the price system. Under some circumstances, we can simply run an *auction*. The FCC quotes a price to each station, and each station announces whether it is willing to sell at that price. The auctioneer starts each price high enough that more than enough stations are willing to sell, and gradually lowers it. Stations drop out one by one, until we reach prices at which the supply of spectrum offered equals the amount we want to buy.
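A minimal sketch of this descending-clock idea (the reservation values, starting price, and target supply below are invented for illustration; the real FCC auction uses station-specific prices and repacking-feasibility checks at every step):

```python
def descending_clock(values, start_price, target, step=1):
    """Stylized descending-clock reverse auction: lower a uniform price
    until the number of stations still willing to sell is at most the
    target supply. `values` are the stations' reservation values, i.e.
    the lowest price each station would accept."""
    price = start_price
    while True:
        sellers = [v for v in values if v <= price]  # still willing to sell
        if len(sellers) <= target or price <= 0:
            return price, sellers
        price -= step  # the clock ticks down

# Hypothetical reservation values for five stations; the FCC wants to buy 2.
price, sellers = descending_clock([10, 30, 55, 80, 120], start_price=200, target=2)
print(price, sellers)  # → 54 [10, 30]
```

Note the shortcut: the clock visits at most a few hundred prices, rather than searching over the 2^{1000} possible sets of stations.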

This doesn’t always work. Without going into details, the FCC’s preferences over licenses must satisfy a *substitutes* condition. As long as the substitutes condition holds, the auction is guaranteed to navigate quickly through the vast space of possible station combinations and find the best one. Mathematically, the substitutes condition turns a fundamentally intractable problem into an almost trivial one. The definition of substitutes in Paul’s book takes three lines, and the proof that the auction finds the optimal allocation takes about six.

The real world is complicated, however, and the substitutes condition is unlikely to hold exactly in practice: what happens when it doesn’t? Paul shows that the result does not depend too delicately on the assumption holding exactly: when preferences are *almost* substitutes, in a sense that Paul defines rigorously, the auction algorithm is guaranteed to find an outcome close to the optimum. In an article co-authored with Kevin Leyton-Brown, Neil Newman, and Ilya Segal, Paul finds that the auction performs quite well in simulations.

A common theme in mathematics is that subtlety and complexity can arise from the simplest of assumptions. Fermat’s Last Theorem fits in the margin of a book, yet it took 358 years and hundreds of pages to prove. Paul’s work, in a sense, does the opposite: volumes upon volumes of regulatory filings and institutional constraints on complex allocation problems are distilled and refined into simple theorems, in which the classical ideas of supply, demand, and equilibrium shine through, and market-like mechanisms emerge to solve these problems.

I learned two things from Paul. First, to make a real difference in the world, academics have to engage seriously with the gory details. This is not always academically rewarding: Paul’s work on spectrum auctions is not his most cited, and not all of it is well known, even among economists.

But the economic impact of Paul’s work on spectrum is immense. Stanford estimates the total government revenue generated by spectrum auctions worldwide at over $100 billion. A 2012 PCAST report on wireless spectrum estimates the welfare effects of improved spectrum use at more than $1 trillion. I think it is entirely fitting that the Nobel Committee chose to highlight auctions, and spectrum auctions in particular, among Paul’s countless contributions to economic theory.

Second, I learned that, as a 20-year-old entering graduate school, I was wrong about the constraint set an economic theorist faces. Working on policy issues does not always mean giving up the search for theoretical beauty. Paul’s work has shown me that, hidden in the mundane details of policy problems, there is often a surprising amount of elegance, if you know where to look.