
Le Monde tells us (in French) that researchers found 47-million-year-old fossils of nine mating pairs of turtles of the species Allaeochelys crassesculpta in a lake in Germany. These fossils, which, according to the researchers, are the only known fossils of animals that died while mating, taught the researchers a lot about this extinct species. But what caught my eye is a probabilistic statement made by one of the researchers:

“Millions of animals live and die every year, and many of them happen to fossilize by chance, but there is really no reason for that to happen while you are mating. It is highly improbable that both partners die at the same time, and the chances that both are fossilized together are even slimmer,” explained Walter Joyce, of the German University of Tübingen. With several pairs, the probabilities shrink even further.

Since the name Walter Joyce sounds English-speaking, I searched for an English version of the statement. I found this:

“No other vertebrates have ever been found like these, so these are truly exceptional fossils,” Joyce said. “The chances of both partners dying while mating are extremely low, and the chances of both partners being preserved as fossils afterward even lower. These fossils show that the fossil record has the potential to document even the most unlikely event if the conditions are right.”

The difference between the English version and the French version is slight yet important. The French reader learns that it is improbable for a pair of animals to die together while mating, that it is even less probable for them to be fossilized together, and that the chances of this happening to several mating pairs are smaller still. This would be true if the death-while-mating-and-fossilization events of the different couples were independent. However, the article itself explains to the readers that here the events are dependent: during the mating process the turtles freeze, and sometimes sink to the bottom of the lake, where the water is poisonous. Thus, in this lake, finding a fossil of a single turtle may well be less probable than finding the fossil of a mating pair of turtles, and the probability of finding several fossils of mating pairs is not necessarily lower than the probability of finding a single fossil of a mating pair… The English version does not correct this inaccuracy, but it does end with a note that can be interpreted as a correction of previous mistakes: “These fossils show that the fossil record has the potential to document even the most unlikely event if the conditions are right.”
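The dependence point can be made concrete with a toy calculation. All the probabilities below are invented for illustration; the only thing that matters is the structure: under independence a pair fossil requires two rare events to coincide, while under the sinking story a single rare event preserves both partners at once.

```python
# Toy model of the dependence argument; every number here is made up.
# Independence story: each partner fossilizes with some small probability q,
# so a pair would fossilize together with probability q * q.
q = 1e-4                       # hypothetical chance that a lone turtle fossilizes
independent_pair = q * q       # what the quoted reasoning suggests

# Dependence story (as the article describes it): mating pairs freeze,
# sink together into the poisonous layer of the lake, and are then very
# likely to be preserved together.
p_sink = 1e-3                  # hypothetical chance that a mating pair sinks
p_preserved_given_sink = 0.5   # preservation is likely at the poisonous bottom
dependent_pair = p_sink * p_preserved_given_sink

print(independent_pair)        # 1e-08
print(dependent_pair)          # 0.0005: orders of magnitude larger
```

With these (made-up) numbers the dependent story makes a fossilized pair tens of thousands of times more likely than the independence reasoning suggests, which is exactly why the "even more improbable" conclusion does not follow.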
Next week the semester starts, and I will get a new batch of first-year undergraduate students who are eager to study Introduction to Probability. I will ask them to read this article and spot the mistakes; I wonder how many will succeed.

In a previous post I described an infinite-horizon perfect-information game without a subgame-perfect equilibrium. Does there exist another refinement of Nash equilibrium that keeps the spirit of subgame perfection and is nonempty in every infinite-horizon perfect-information game?

One refinement of the concept of Nash equilibrium is that of trembling-hand equilibrium for extensive-form games, also called extensive-form perfect equilibrium. Call a strategy profile σ an extensive-form perfect equilibrium if there exists a sequence (σn) that satisfies the following properties:

  1. The sequence (σn) converges to σ.
  2. Under σn, each player in each of his decision nodes plays each action with probability at least 1/n.
  3. σn is a Nash equilibrium in the game Γn in which every player must play in each of his decision nodes each action with probability at least 1/n. (Note that the third requirement implies the second requirement).
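The definition is easiest to see in the simplest possible case, a one-shot game viewed as a trivial extensive form. Here is a minimal sketch for the Prisoner's Dilemma (the payoff numbers are made up, only their ordering matters): in the constrained game Γn, defection strictly dominates, so the unique equilibrium of Γn puts the largest allowed weight, 1 − 1/n, on Defect, and the sequence σn converges to (Defect, Defect).

```python
# Row player's payoffs in a Prisoner's Dilemma (illustrative numbers):
# (C, C) -> 3, (C, D) -> 0, (D, C) -> 4, (D, D) -> 1.
PAYOFF = {('C', 'C'): 3, ('C', 'D'): 0, ('D', 'C'): 4, ('D', 'D'): 1}

def best_response_in_gamma_n(opponent_defect_prob, n):
    """Constrained best response in Γn: probability of D, given that the
    opponent defects with probability q and every action must get
    probability at least 1/n."""
    q = opponent_defect_prob
    payoff_C = (1 - q) * PAYOFF[('C', 'C')] + q * PAYOFF[('C', 'D')]
    payoff_D = (1 - q) * PAYOFF[('D', 'C')] + q * PAYOFF[('D', 'D')]
    # put as much weight as the constraint allows on the better action
    return 1 - 1 / n if payoff_D > payoff_C else 1 / n

# σn: both players defect with probability 1 - 1/n. Check that σn is a
# fixed point of the constrained best response, i.e. an equilibrium of Γn,
# and note that σn -> (Defect, Defect) as n grows.
for n in (2, 10, 1000):
    sigma_n = 1 - 1 / n
    assert best_response_in_gamma_n(sigma_n, n) == sigma_n
print("sigma_n converges to Defect with probability 1")
```

The limit profile (Defect, Defect) is therefore extensive-form perfect in this toy game; the question below is whether such a sequence always exists.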

Does every game admit such an extensive-form perfect equilibrium? The following example, to appear in a new paper by János Flesch, Jeroen Kuipers, Ayala Mashiah-Yaakovi, Gijs Schoenmakers, Eran Shmaya, your humble servant, and Koos Vrieze, is a variant of the example I provided here, and it shows that this is not the case. The conclusion is that at present we do not have any refinement of the concept of Nash equilibrium for infinite-horizon games that keeps the spirit of subgame perfection and is always nonempty.


I first met Jean-Francois Mertens while I was a graduate student at the Hebrew University of Jerusalem. He often visited the Center for Rationality at the Hebrew University, and in one of those visits Abraham Neyman, my Ph.D. adviser, asked Mertens to sit with me and hear about my research.

Mertens has contributed to several areas in game theory.

His paper with Elon Kohlberg on the stability of equilibria opened a new research area.

His work with Shmuel Zamir on the universal belief space is fundamental and complements Harsanyi’s work on belief hierarchies.

His study of the min-max and max-min values in general repeated games had a huge influence on research in this area. The Mertens conjecture on repeated games with signals, which states that in zero-sum repeated games with signals, whenever player 1 knows everything that player 2 knows (that is, the signal of player 1 contains the signal of player 2), the limit of the values of the n-stage games equals the max-min value, has been the starting point of many works on this topic. It is still open, though some advances toward solving it have recently been made.

His contribution to the Shapley value in games with a continuum of players includes the extension of the diagonal formula, which historically applied to smooth nonatomic games, to a larger class of games that includes nondifferentiable and even discontinuous games.

For me, his most significant contribution is to the area of stochastic games, where he proved, together with Abraham Neyman, that every two-player zero-sum stochastic game has a uniform value. When Mertens and Neyman proved this result I was still in elementary school. The story I heard about the inception of this result is that in a workshop on repeated games at CORE in the winter of 1978–1979, Sylvain Sorin gave a talk on the Big Match, the first nontrivial two-player zero-sum undiscounted stochastic game to be solved. For Abraham Neyman, who was sitting in the audience, this was his first encounter with repeated games and stochastic games, and so he asked many questions, ranging from basic ones to ideas for extending the result to general stochastic games. Jean-Francois Mertens, who was also in the audience, patiently answered Neyman’s questions and explained why each route that Neyman suggested for extending the result was bound to fail. Like two swordsmen they sparred: Neyman made another suggestion, and Mertens fended it off. The result of this match was one of the most beautiful papers in game theory.

I met Jean-Francois Mertens in Toulouse in September last year, during the conference that Jérôme Renault organized. We had dinner together. During the conference Mertens was very tired, and when he returned to Belgium he was diagnosed with lung cancer at an advanced stage. He passed away on the night of July 17, 2012. We will all miss him. יהי זכרו ברוך (may his memory be blessed).

Every finite-horizon game in extensive form in which the number of actions at every node is finite has a subgame-perfect equilibrium. Indeed, consider a smallest subgame, that is, a subgame that contains no proper subgame. By Nash’s theorem this subgame has a Nash equilibrium in mixed strategies. Since the subgame contains no proper subgame, this Nash equilibrium is a subgame-perfect equilibrium of the subgame. We can then replace the whole subgame by a single node whose terminal payoff coincides with the equilibrium payoff of the subgame, and continue inductively to construct a subgame-perfect equilibrium (in mixed strategies) of the whole game.
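For perfect-information trees the induction above is just backward induction, and pure strategies suffice. Here is a minimal sketch on a hypothetical two-stage tree (the tree encoding and payoff numbers are made up for illustration):

```python
# Backward induction on a finite perfect-information game tree, following
# the inductive argument above. A node is either ('terminal', payoffs) or
# (player, [children]). Each smallest subgame is solved first, and its
# value replaces the subgame, exactly as in the text.

def backward_induction(node):
    """Return the subgame-perfect equilibrium payoff vector of the subtree."""
    if node[0] == 'terminal':
        return node[1]
    player, children = node
    # solve each subgame, then let the moving player pick the best one
    values = [backward_induction(child) for child in children]
    return max(values, key=lambda payoff: payoff[player])

# A two-stage example: player 0 moves first, then player 1.
tree = (0, [
    (1, [('terminal', (3, 1)), ('terminal', (0, 0))]),
    (1, [('terminal', (2, 2)), ('terminal', (1, 3))]),
])
print(backward_induction(tree))   # (3, 1)
```

In the left subgame player 1 prefers (3, 1) to (0, 0); in the right one, (1, 3) to (2, 2); anticipating this, player 0 moves left.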

When uncountably many actions are available at some nodes, a subgame-perfect equilibrium need not exist (Harris, Reny, and Robson, 1995).

What happens in infinite-horizon games? When the horizon is infinite, a Nash equilibrium need not exist. Indeed, consider a one-player infinite-horizon game in which the player, Alice, has two possible actions, A and B, at every stage. Alice’s payoff is 1 - 1/n, where n is the first stage in which she chooses action B; if Alice never chooses action B, her payoff is 0. In this game Alice would like to choose action B for the first time as late as possible, and whenever she plans to choose B, she is better off waiting one more stage. Therefore Alice has no optimal strategy, and there is no equilibrium.

Plainly this toy game does have an ε-equilibrium: choose B for the first time at stage ⌈1/ε⌉.
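Both claims, that waiting is always strictly better and that stopping at stage ⌈1/ε⌉ is ε-optimal, can be checked directly (exact rational arithmetic avoids floating-point noise):

```python
import math
from fractions import Fraction

def payoff(n):
    """Alice's payoff when she first chooses B at stage n (0 if she never does)."""
    return Fraction(n - 1, n)        # equals 1 - 1/n, computed exactly

# No optimal strategy: waiting one more stage is always strictly better ...
assert payoff(10) < payoff(11) < 1

# ... but stopping at stage ceil(1/eps) is an ε-equilibrium: the payoff is
# within eps of the supremum 1, which is never attained.
eps = Fraction(1, 100)
n = math.ceil(1 / eps)
assert 1 - payoff(n) <= eps
print(n, float(payoff(n)))           # 100 0.99
```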

Does there exist a subgame-perfect ε-equilibrium in every infinite-horizon perfect-information extensive-form game? It turns out that the answer is negative. The following example, to appear in a new paper by János Flesch, Jeroen Kuipers, Ayala Mashiah-Yaakovi, Gijs Schoenmakers, Eran Shmaya, your humble servant, and Koos Vrieze, proves this point.

Alice and Bob play an infinite-stage game. In each stage one of them is the active player who chooses the action (and the other does nothing). The active player has two actions, A and B, that correspond to the identity of the next active player. That is, if the current active player chooses action A, then Alice is the active player in the next stage, while if the current active player chooses action B, then Bob is the active player in the next stage.

What are the payoffs?

  • If there is a stage t such that, from stage t and on, Alice is the active player, then the payoff to Alice is -1 and Bob’s payoff is 2. In such a case we say that the game absorbs with Alice.
  • If there is a stage t such that, from stage t and on, Bob is the active player, then the payoff to Alice is -2 and Bob’s payoff is 1. In such a case we say that the game absorbs with Bob.
  • Otherwise the payoff to both is 0.
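The three cases above depend only on the tail behaviour of the play, so the payoff rule can be written down directly; a minimal sketch:

```python
def payoffs(tail):
    """(Alice, Bob) payoffs as a function of the tail behaviour of the play.

    tail is 'A' if from some stage on Alice is always the active player,
    'B' if from some stage on Bob is always the active player, and 'none'
    if the identity of the active player changes infinitely often.
    """
    if tail == 'A':
        return (-1, 2)    # the game absorbs with Alice
    if tail == 'B':
        return (-2, 1)    # the game absorbs with Bob
    return (0, 0)         # no absorption

# Bob can guarantee 1 in any subgame where he is active, by playing B forever:
assert payoffs('B')[1] == 1
# Alice can guarantee -1 by playing A forever:
assert payoffs('A')[0] == -1
```

The two assertions are exactly the guarantee levels used in the argument below.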

Let us argue that in this game there is no subgame-perfect ε-equilibrium, provided ε is sufficiently small. So fix ε>0 sufficiently small and suppose by contradiction that there is a subgame-perfect ε-equilibrium.

In any subgame in which Bob is the first active player he can guarantee 1 by always choosing B. In particular, Bob’s payoff in this subgame must be at least 1-ε.

Fix now a subgame in which Alice is the first active player. Can it be that the play absorbs with Alice with probability less than ε? If so, then, since Bob’s payoff is at least 1-ε, the probability that the play absorbs with Bob is at least 1-3ε. But then Alice’s payoff is at most (1-3ε)·(-2) + 3ε·0 = -2+6ε, well below the -1 that she can guarantee by always playing A. Thus, in any subgame in which Alice is the first active player, play absorbs with Alice with probability at least ε.

By Lévy’s zero-one law, this conclusion implies that with probability 1 the play absorbs with Alice. This in turn implies that play never absorbs with Bob: he will never start an infinite sequence of B’s. But then Alice has no incentive to play A: by always playing B she guarantees a payoff of 0, whereas absorbing with Alice yields her -1, a contradiction.

This example shows that, apart from the well-beloved concept of ε-equilibrium, we do not yet have a good concept of subgame-perfect ε-equilibrium in infinite-horizon games. In another post I will comment on the concept of trembling-hand equilibrium in infinite-horizon perfect-information games.

In Israel, when a graduate student completes writing his thesis, the thesis is sent to several referees, and based on their reports the university decides whether to grant the Ph.D. degree to the student.

France has a different system. Two referees provide reports that determine whether the student is allowed to defend his thesis. During the defense, a jury made up of three judges, the two referees, and the Ph.D. adviser listens to the student’s talk and decides whether or not to grant the Ph.D. degree. Given that the student’s family makes up the majority of the audience, and given that they have already prepared a banquet in the adjacent room, the outcome is pretty obvious. I guess that 100 years ago, when commuting was more difficult and the student’s family was not in the audience, the outcome varied more than it does today.


BusinessInsider reports that Paul Krugman sees two possible outcomes to the Euro crisis, both of which the Germans currently consider unacceptable:

“I say it’s 50/50 … Either the Germans have to accept something they consider unacceptable, or they have to accept something, the breakup of the euro, that they consider something unacceptable.”

Read the original article here. I still have to fully understand and digest the game-theoretic implications of Krugman’s point of view.

The field trials for the women’s 100-meter dash at the London Olympic Games had a somewhat unusual twist: Allyson Felix and Jeneba Tarmoh clocked the same time of 11.068 seconds. Three runners will represent the US in this event; the problem is that Felix and Tarmoh finished tied for third place. I read contradictory articles about whether USA Track and Field, which is in charge of the procedure for choosing the representatives, did or did not have a rule for handling such a case. In any case, it was decided that both athletes would be given the choice of a run-off or a coin toss to determine the final representative.

I will let the poor runners and the USATF solve the issue of who will represent the US in the women’s 100 meters in London. What interests me is what Bobby Kersee, coach of both Felix and Tarmoh, thinks of the suggested solution of flipping a coin.

Yahoo reports that Bobby Kersee told the Associated Press that “Nine times out of 10, most athletes aren’t going to want to flip a coin. Would you go to the Super Bowl and after two overtimes or what have you, have the referees take both coaches to the middle of the field and say, ‘We’re going to flip to see who wins the Super Bowl?’ I don’t see that.”

Why not? Suppose that a game does not end, like the three-month-long quidditch game or the six-overtime basketball game between the Indianapolis Olympians and the Rochester Royals in 1951. Does driving the players to exhaustion make more sense than flipping a coin? After all, the way players play when exhausted does not resemble their usual play, and it may well be that the eventual result is as random as a coin flip. It is easy to make a case for playing until the end. But sometimes, as in the Felix-Tarmoh case, such a solution is not feasible due to various constraints, like deadlines and other races. In that case, why not flip a coin?

We are all descendants of Adam and Eve. But there is no need to go that far back. The family tree of the great Chinese philosopher Confucius includes more than 80 generations and more than 2 million people, most of them alive. Researchers say that 17 million people are direct descendants of Genghis Khan, the greatest conqueror who ever lived.

Our world holds more and more people. We no longer live in villages where everyone knows everyone else, but in large cities where we know only a few score people. We become strangers to each other; we close our eyes and hearts to people in need. We are not family.

But we are. A couple of years ago I prepared a family tree. Many archives in Eastern Europe were lost and my family was somewhat trimmed 70 years ago, and therefore I could climb up the tree only to 1880; along one branch I climbed up to 1830. Overall I have about 1,800 people in my tree. Admittedly, many of them are related not by blood but through marriage, but nonetheless we are all family.

And then I thought: suppose that one prepared a huge family tree that includes us all, all 7 billion souls who roam the earth. We draw a huge graph in which each person who ever lived is a node, and each family tie, each marriage and parenthood, is an edge. Every connected component of this graph is one huge family, a group of people related by blood and marriage. I guess that we would not end up with a single connected component, because of small communities that have remained closed for centuries, yet I bet that we would end up with one huge connected component containing the vast majority of us.
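The thought experiment amounts to computing connected components of the marriage-and-parenthood graph. Here is a minimal union-find sketch on a hypothetical five-person population (names and ties are invented):

```python
# People are nodes, marriage/parenthood ties are edges; a "family" is a
# connected component, computed with union-find.

def find(parent, x):
    while parent[x] != x:
        parent[x] = parent[parent[x]]   # path compression
        x = parent[x]
    return x

def components(people, ties):
    """Group people into connected components (families)."""
    parent = {p: p for p in people}
    for a, b in ties:
        parent[find(parent, a)] = find(parent, b)   # union the two families
    groups = {}
    for p in people:
        groups.setdefault(find(parent, p), set()).add(p)
    return list(groups.values())

people = ['Ada', 'Ben', 'Cleo', 'Dov', 'Esti']
ties = [('Ada', 'Ben'), ('Ben', 'Cleo'), ('Dov', 'Esti')]   # two families
fams = components(people, ties)
print(sorted(len(f) for f in fams))   # [2, 3]
```

On the full 7-billion-node graph the same computation would reveal how large the biggest "family" really is.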

Suppose that you knew that 95% of the people you see are part of your family. That person who wants to cross the street is the cousin of the cousin of the cousin of aunt Bess. And that driver who drives so slowly is the nephew of the great-grandfather of your neighbor, who is the second niece, twice removed, of the wife of uncle Jim. Would you still honk like crazy?

A family tree may make us all feel like one huge family and help us be better people. On the other hand, it might diminish the significance of the notion of family: if we are all one family, including those we dislike, then perhaps one should honk at everyone, family or not. What do you think?

Sergiu Hart organized a small conference on Game Dynamics that took place last week at the Center for Rationality in the Hebrew University of Jerusalem. Luckily for me, Sergiu also likes dynamic games, and therefore talks were on both subjects.

The conference was Sergiu’s style: first talk at 10:30, second at 11:45, then a long lunch break, and then two additional talks, at 15:00 and 16:15. In one of the lunch breaks I asked John Levy to explain to me his construction of a discounted game without a stationary discounted equilibrium. I will try to share with you what I learned. (John’s discussion paper appears here.)

There are m players, labeled 1, 2, …, m. What is m? It is sufficiently large that the total discounted payoff from stage m onward has little effect on a player’s payoff. That is, if the discount factor is β (so that the total discounted payoff is the sum over n of β^n times the stage payoff at stage n), then we require that β^m be a small number.
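How large must m be? Requiring β^m < δ for some hypothetical tolerance δ and solving gives m > log(δ)/log(β) (both logarithms are negative); a quick sketch:

```python
import math

def smallest_m(beta, delta):
    """Smallest m with beta**m < delta, i.e. m > log(delta)/log(beta)."""
    return math.ceil(math.log(delta) / math.log(beta))

# Illustrative numbers: with discount factor 0.99 and tolerance 1e-3,
# roughly 700 players/stages are needed before beta^m becomes negligible.
beta, delta = 0.99, 1e-3
m = smallest_m(beta, delta)
print(m, beta ** m < delta)   # 688 True
```

The closer β is to 1, the larger the number of players the construction needs.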


Lowest unique bid auctions have become popular in recent years. In such an auction, a prize of value $M (a car, an electronic gadget, etc.) is offered for sale. Each participant can purchase any number of natural numbers at a price of $c per natural number. The winner is the participant who purchased the minimal number among all the numbers that were purchased by a single player. He pays his winning bid (that is, the minimal number that was purchased only by him) and gets the object. If no number was purchased by a single player, no one wins the object. I explained this game in more detail in this post.
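The winner-determination rule is easy to state in code; here is a minimal sketch (the player names and bids are made up for illustration):

```python
from collections import Counter

def lowest_unique_bid(bids):
    """bids: dict mapping player -> set of purchased natural numbers.
    Returns (winner, winning_number), or (None, None) if no number was
    purchased by exactly one player."""
    counts = Counter(n for numbers in bids.values() for n in numbers)
    unique = [n for n, c in counts.items() if c == 1]
    if not unique:
        return None, None
    winning = min(unique)                 # lowest number bought by one player
    winner = next(p for p, numbers in bids.items() if winning in numbers)
    return winner, winning

bids = {'p1': {1, 2, 5}, 'p2': {1, 3}, 'p3': {2, 3, 4}}
# 1, 2, 3 were each bought twice; 4 and 5 are unique, and 4 is lower.
print(lowest_unique_bid(bids))   # ('p3', 4)
```

In the first auction variant the winner then pays the winning number; in the variant discussed below he does not.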

This game has a symmetric equilibrium. Indeed, in equilibrium no player will purchase more than M/c numbers, and no player will purchase a number higher than M (in fact, no player will purchase M itself either). Therefore the set of pure strategies that one needs to consider for an equilibrium is finite (all subsets of {1, 2, …, M-1} that contain at most M/c numbers), so by Nash’s theorem the game has an equilibrium in mixed strategies. Since the game is symmetric, it in fact has a symmetric equilibrium in mixed strategies.

Consider the following variation, in which the winner does not pay his winning bid. Now the set of pure strategies that can be played in equilibrium is infinite: it consists of all subsets of the natural numbers that contain at most M/c numbers. Does this game have an equilibrium? A symmetric equilibrium? When there are two participants the answer is positive. What about more than two participants? I have no idea. Can anyone find the answer?
