You are currently browsing the category archive for the ‘economics’ category.

Volume 42 of the AER, published in 1952, contains an article by Paul Samuelson entitled ‘Spatial Price Equilibrium and Linear Programming’. In it, Samuelson uses a model of Enke (1951) as a vehicle to introduce Economists to the usefulness of linear programming techniques. The second paragraph of the paper is as follows:

In recent years economists have begun to hear about a new type of theory called linear programming. Developed by such mathematicians as G. B. Dantzig, J. v. Neumann, A. W. Tucker, and G. W. Brown, and by such economists as R. Dorfman, T. C. Koopmans, W. Leontief, and others, this field admirably illustrates the failure of marginal equalization as a rule for defining equilibrium. A number of books and articles on this subject are beginning to appear. It is the modest purpose of the following discussion to present a classical economics problem which illustrates many of the characteristics of linear programming. However, the problem is of economic interest for its own sake and because of its ancient heritage.

Of interest are the five reasons that Samuelson gives for why readers of the AER should care.

1. This viewpoint might aid in the choice of convergent numerical iterations to a solution.

2. From the extensive theory of maxima, it enables us immediately to evaluate the sign of various comparative-statics changes. (E.g., an increase in net supply at any point can never in a stable system decrease the region’s exports.)

3. By establishing an equivalence between the Enke problem and a maximum problem, we may be able to use the known electric devices for solving the former to solve still other maximum problems, and perhaps some of the linear programming type.

4. The maximum problem under consideration is of interest because of its unusual type: it involves in an essential way such non-analytic functions as absolute value of X, which has a discontinuous derivative and a corner; this makes it different from the conventionally studied types and somewhat similar to the inequality problems met with in linear programming.

5. Finally, there is general methodological and mathematical interest in the question of the conditions under which a given equilibrium problem can be significantly related to a maximum or minimum problem.

My students often use me as a sounding board for their new ventures. A sign that the modern University could pass for a hedge fund with classrooms. The request brings a chuckle as it always reminds me of my first exposure to entrepreneurial activity.

It happened in the most unlikely of places and times. A public (i.e. private) school in pre-Thatcherite England. England was then the sick man of Europe, and its decline was blamed upon the public schools. Martin Wiener’s English Culture and the Decline of the Industrial Spirit, for example, argued that the schools had turned a nation of shopkeepers into one of lotus eaters.

Among the boys was a fellow I’ll call Hodge. He was a well-established source of contraband like cigarettes and pornographic magazines. He operated out of a warren of toilets in the middle of the school grounds called the White City. That the school needed a small building devoted entirely to toilets was a product of the English distrust of indoor plumbing and central heating.

One lesson I learnt from Hodge was never to buy a pornographic magazine sight unseen. The Romans called it caveat emptor, but I think this version more vivid.

Hodge was always on the lookout for new goods and services that he could offer at a profit to the other boys. One day, he hit upon the idea of buying a rubber woman (it was plastic and inflatable) and renting it out. The customer base consisted of 400 teenage boys confined to a penal colony upon a wind-blasted heath.

Consider the challenges. How was he to procure one (no internet)? Where would he hide the plastic inamorata to prevent theft or confiscation by the authorities? How would he find customers (no smart phones)? What should he charge? What was to prevent competition? And, of course, what happened? All, I think, best left to the imagination.

The various US intelligence agencies have identified three ways in which the Russian state meddled with the recent US elections:

1. Intrusions into voter registration systems.
2. Cyberattack on the DNC and subsequent release of hacked material.
3. Deployment of ‘fake’ news and internet trolls.

The first two items on this list are illegal. Had a PAC or a US plutocrat (or green-card holder) deployed their resources on the third item, it would have been perfectly legal. While one should expect the Russians to continue with item 3 in the next election, so will each of the main political parties.

Why is ‘fake’ news influential? Shouldn’t information from a source of unknown and uncertain quality be treated like a lemon? For example, it is impossible for a user to distinguish a Twitter account associated with a real human from a bot. Nor can a user tell whether individual Twitter yawps are independent or correlated.

Perhaps it depends on the distinction between information used to make a decision, like which restaurant to go to, and information consumed for its own sake (gossip). There appears to be no fake-news crisis in restaurant reviews. There could be a number of reasons for this. The presence of non-crowd-sourced reviews, the relatively low cost of experimentation coupled with frequent repetition, and the fact that my decision to go to a restaurant does not compel you to do so come to mind.

Political communication seems to be different, closer to entertainment than to informing decision making. If I consume political news that coincides with my partisan leanings because it entertains me the most, then the news did not persuade me to lean that way (it follows that suppressing fake news should not change the distribution of political preferences). So such news must serve another purpose; perhaps it increases turnout. If so, we should expect the DNC to be much more active in the deployment of ‘fake’ news, and turnout to increase.

Platooning, driverless cars, and ride-hailing services have all been suggested as ways to reduce congestion. In this post I want to examine one of them: coordination via ride-hailing services. Assume that large numbers of riders decide to rely on ride-hailing services. Because the services use Google Maps or Waze for route selection, it would be possible to coordinate riders’ route choices to reduce congestion.

To think through the implications of this, it’s useful to revisit an example of Arthur Pigou. There is a measure 1 of travelers, all of whom wish to leave the same origin (${s}$) for the same destination (${t}$). There are two possible paths from ${s}$ to ${t}$. The ‘top’ one has a travel time of 1 unit, independent of the measure of travelers who use it. The ‘bottom’ one has a travel time that grows linearly with the measure of travelers who employ it. Thus, if a fraction ${x}$ of travelers take the bottom path, each incurs a travel time of ${x}$ units.

A central planner, say, Uber, interested in minimizing total travel time will route half of all travelers through the top and the remainder through the bottom. Total travel time will be ${0.5 \times 1 + 0.5 \times 0.5 = 0.75}$. The only Nash equilibrium of the path selection game is for all travelers to choose the bottom path yielding a total travel time of ${1}$. Thus, if the only choice is to delegate my route selection to Uber or make it myself, there is no equilibrium where all travelers delegate to Uber.
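Pigou’s example is small enough to check directly. A minimal sketch in Python (the function name and the first-order condition are mine, derived from the travel times stated above):

```python
def total_travel_time(x_bottom):
    """Total travel time when a fraction x_bottom of travelers takes the
    bottom path: the top path costs 1, the bottom path costs x_bottom."""
    x_top = 1.0 - x_bottom
    return x_top * 1.0 + x_bottom * x_bottom

# Planner's optimum: minimize (1 - x) + x^2; the first-order condition
# -1 + 2x = 0 gives x = 1/2.
print(total_travel_time(0.5))  # 0.75

# Nash equilibrium: for any x < 1 the bottom path is strictly faster
# than the top path, so every traveler takes the bottom path.
print(total_travel_time(1.0))  # 1.0
```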

Now suppose, there are two competing ride hailing services. Assume fraction ${\alpha}$ of travelers are signed up with Uber and fraction ${1-\alpha}$ are signed up with Lyft. To avoid annoying corner cases, ${\alpha \in [1/3, 2/3]}$. Each firm routes its users so as to minimize the total travel time that their users incur. Uber will choose fraction ${\lambda_1}$ of its subscribers to use the top path and the remaining fraction will use the bottom path. Lyft will choose a fraction ${\lambda_2}$ of its subscribers to use the top path and the remaining fraction will use the bottom path.

A straightforward calculation reveals that the only Nash equilibrium of the Uber vs. Lyft game is ${\lambda_1 = 1 - \frac{1}{3 \alpha}}$ and ${\lambda_2 = 1 - \frac{1}{3(1-\alpha)}}$. An interesting case is when ${\alpha = 2/3}$, i.e., Uber has a dominant market share. In this case ${\lambda_2 = 0}$, i.e., Lyft sends none of its users through the top path. Uber, on the other hand, will send half its users via the top path and the remainder via the bottom. Assuming Uber randomly assigns its users to top and bottom with equal probability, the average travel time for an Uber user will be

$\displaystyle 0.5 \times 1 + 0.5 \times [0.5 \times (2/3) + 1/3] = 5/6.$

The travel time for a Lyft user will be

$\displaystyle [0.5 \times (2/3) + 1/3] = 2/3.$

Total travel time will be ${7/9}$, less than in the Nash equilibrium outcome. However, Lyft offers travelers a lower travel time than Uber. This is because Uber, which carries the bulk of travelers, must use the top path to keep total travel time down. If travelers could choose their service, they would switch from Uber to Lyft. This conclusion ignores prices, which at present are not part of the model.
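The travel times above are easy to verify numerically. A sketch, taking the equilibrium fractions ${\lambda_1, \lambda_2}$ as given (function and variable names are mine):

```python
def travel_times(alpha, lam1, lam2):
    """Average travel times when a fraction alpha rides with Uber and
    platform i sends a fraction lam_i of its riders along the top path."""
    bottom_load = alpha * (1 - lam1) + (1 - alpha) * (1 - lam2)
    uber_avg = lam1 * 1.0 + (1 - lam1) * bottom_load
    lyft_avg = lam2 * 1.0 + (1 - lam2) * bottom_load
    total = alpha * uber_avg + (1 - alpha) * lyft_avg
    return uber_avg, lyft_avg, total

alpha = 2 / 3
lam1 = 1 - 1 / (3 * alpha)        # = 1/2
lam2 = 1 - 1 / (3 * (1 - alpha))  # = 0
uber_avg, lyft_avg, total = travel_times(alpha, lam1, lam2)
print(uber_avg, lyft_avg, total)  # 5/6, 2/3 and 7/9, up to rounding
```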

Suppose we include prices and assume that travelers now evaluate a ride-hailing service based on the delivered price, that is, price plus travel time. Thus, we are assuming that all travelers value time at \$1 per unit of time. The volume of customers served by Uber and Lyft is no longer fixed, and each will focus on minimizing average travel time per customer. A plausible guess is that there will be an equal-price equilibrium where travelers divide evenly between the two services, i.e., ${\alpha = 0.5}$. Each service will route ${1/3}$ of its customers through the top path and the remainder through the bottom. Average travel time per customer will be ${7/9}$. However, the travel time on the bottom path will be ${2/3}$, giving every customer an incentive to opt out and drive their own car on the bottom path.
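A quick arithmetic check of the symmetric case, under the stated assumption that each service sends a third of its riders along the top path:

```python
lam = 1 / 3                        # fraction each service routes along the top path
bottom_load = 2 * 0.5 * (1 - lam)  # both services together put 2/3 on the bottom
avg_time = lam * 1.0 + (1 - lam) * bottom_load
print(bottom_load, avg_time)       # 2/3 and 7/9, up to rounding

# A lone driver on the bottom path faces travel time bottom_load,
# which is below avg_time: hence the incentive to opt out.
```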

What this simple-minded analysis highlights is that the benefits of coordination may be hard to achieve if travelers can opt out and drive themselves. To minimize congestion, the ride-hailing services must limit traffic on the bottom path, the one that is congestible. However, doing so makes it attractive in terms of travel time, encouraging travelers to opt out.

Colleagues outside of Economics often marvel at the coordinated nature of the Economics job market. The job market is so efficient that the profession no longer wastes resources by having everyone read each candidate’s job market paper. That task is assigned to one person (Tyler Cowen), who reports back to the rest of us. In case you missed the report, here it is.

Economics is not alone in having a coordinated job market. Philosophy has one, but it has begun to show signs of unraveling. The ability to interview via Skype, for example, has reduced the value, in the eyes of many, of a preliminary interview at the annual meeting. In response, the American Philosophical Association posted the following statement regarding the job market calendar:

For tenure-track/continuing positions advertised in the second half of the calendar year, we recommend an application deadline of November 1 or later. It is further recommended that positions be advertised at least 30 days prior to the application deadline to ensure that candidates have ample time to apply.

In normal circumstances a prospective employee should have at least two weeks for consideration of a written offer from the hiring institution, and responses to offers of a position whose duties begin in the succeeding fall should not be required before February 1.

When advertising in PhilJobs: Jobs for Philosophers, advertisers will be asked to confirm that the hiring institution will follow the above guidelines. If an advertiser does not do so, the advertisement will include a notice to that effect.

It’s natural to wonder if the Economics market is not far behind. Skype interviews are already taking place. The current setup requires a department to evaluate and select candidates for preliminary interviews within a month (roughly mid-November to mid-December), which is hardly conducive to mature reflection (and argument).

I don’t often go to empirical talks, but when I do, I fall asleep. Recently, while so engaged, I dreamt of the ‘replicability crisis’ in Economics (see Chang and Li (2015)). The penultimate line of their abstract is the following bleak assessment:

‘Because we are able to replicate less than half of the papers in our sample even with help from the authors, we assert that economics research is usually not replicable.’

Eager to help my empirical colleagues snatch victory from the jaws of defeat, I did what all theorists do: build a model. Here it is.

The journal editor is the principal and the agent is an author. Agent has a paper characterized by two numbers ${(v, p)}$. The first is the value of the findings in the paper assuming they are replicable. The second is the probability that the findings are indeed replicable. The expected benefit of the paper is ${pv}$. Assume that ${v}$ is common knowledge but ${p}$ is the private information of agent. The probability that agent is of type ${(v,p)}$ is ${\pi(v,p)}$.

Given a paper, the principal can at a cost ${K}$ inspect the paper. With probability ${p}$ the inspection process will replicate the findings of the paper. Principal proposes an incentive compatible direct mechanism. Agent reports their type, ${(v, p)}$. Let ${a(v, p)}$ denote the interim probability that agent’s paper is provisionally accepted. Let ${c(v, p)}$ be the interim probability of agent’s paper not being inspected given it has been provisionally accepted. If a provisionally accepted paper is not inspected, it is published. If a paper subject to inspection is successfully replicated, the paper is published. Otherwise it is rejected and, per custom, the outcome is kept private. Agent cares only about the paper being accepted. Hence, agent cares only about

$\displaystyle a(v, p)c(v,p) + a(v, p)(1-c(v,p))p.$

The principal cares about replicability of papers and suffers a penalty of ${R > K}$ for publishing a paper that is not replicable. Principal also cares about the cost of inspection. Therefore she maximizes

$\displaystyle \sum_{v,p}\pi(v,p)[pv - (1-p)c(v,p)R]a(v,p) - K \sum_{v,p}\pi(v,p)a(v,p)(1-c(v,p))$

$\displaystyle = \sum_{v,p}\pi(v,p)[pv-K]a(v,p) + \sum_{v,p}\pi(v,p)a(v,p)c(v,p)[K - (1-p)R].$
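The rearrangement is routine algebra, but it can be sanity-checked numerically on random instances; a minimal sketch (all names are mine):

```python
import random

def objective(pi, v, p, a, c, R, K):
    """Principal's objective in its original form."""
    n = len(v)
    benefit = sum(pi[i] * (p[i] * v[i] - (1 - p[i]) * c[i] * R) * a[i]
                  for i in range(n))
    inspection_cost = K * sum(pi[i] * a[i] * (1 - c[i]) for i in range(n))
    return benefit - inspection_cost

def objective_rearranged(pi, v, p, a, c, R, K):
    """The same objective after the rearrangement above."""
    n = len(v)
    return (sum(pi[i] * (p[i] * v[i] - K) * a[i] for i in range(n))
            + sum(pi[i] * a[i] * c[i] * (K - (1 - p[i]) * R) for i in range(n)))

random.seed(0)
n = 5
pi, v, p, a, c = ([random.random() for _ in range(n)] for _ in range(5))
x = objective(pi, v, p, a, c, 2.0, 1.0)
y = objective_rearranged(pi, v, p, a, c, 2.0, 1.0)
print(abs(x - y) < 1e-12)  # True
```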

The incentive compatibility constraint is
$\displaystyle a(v, p)c(v,p) + a(v, p)(1-c(v,p))p \geq a(v, p')c(v,p') + a(v, p')(1-c(v,p'))p.$

Recall, an agent cannot lie about the value component of the type.
We cannot screen on ${p}$, so all that matters is the distribution of ${p}$ conditional on ${v}$. Let ${p_v = E(p|v)}$. For a given ${v}$ there are only three possibilities: accept without inspection, reject outright, or inspect and publish upon successful replication. The first possibility has an expected payoff of

$\displaystyle vp_v - (1-p_v) R = (v+R) p_v - R$

for the principal. The second possibility has value zero. The third has value ${ vp_v -K }$.
The principal prefers to accept immediately over inspection if

$\displaystyle (v+R) p_v - R > vp_v - K \Rightarrow p_v > (R-K)/R.$

The principal will prefer inspection to rejection if ${ vp_v \geq K}$. The principal prefers acceptance to rejection if ${p_v \geq R/(v+R)}$.
Under a suitable condition on ${p_v}$ as a function of ${v}$ (for example, ${p_v}$ nondecreasing in ${v}$), the optimal mechanism can be characterized by two cutoffs ${\tau_2 > \tau_1}$. Choose ${\tau_2}$ to be the smallest ${v}$ such that

$\displaystyle p_v \geq \max\left( \frac{R}{v+R}, \frac{R-K}{R} \right).$

Choose ${\tau_1}$ to be the largest ${v}$ such that ${p_v \leq \min (K/v, R/(v+R))}$.
A paper with ${v \geq \tau_2}$ will be accepted without inspection. A paper with ${v \leq \tau_1}$ will be rejected. A paper with ${v \in (\tau_1, \tau_2)}$ will be provisionally accepted and then inspected.
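The cutoff structure can be illustrated with a toy implementation. A sketch, assuming a conditional mean ${p_v}$ that rises with ${v}$ (the particular function below is purely illustrative):

```python
def decision(v, p_v, R, K):
    """Classify a paper of value v with conditional replication
    probability p_v, using the pairwise comparisons above."""
    accept_beats_inspect = p_v > (R - K) / R
    accept_beats_reject = p_v >= R / (v + R)
    inspect_beats_reject = v * p_v >= K
    if accept_beats_inspect and accept_beats_reject:
        return "accept"
    if inspect_beats_reject:
        return "inspect"
    return "reject"

R, K = 2.0, 0.5
p_of_v = lambda v: v / (1 + v)  # illustrative: p_v increasing in v
for v in [0.2, 1.0, 5.0]:
    print(v, decision(v, p_of_v(v), R, K))  # reject, inspect, accept
```

Low-${v}$ papers are rejected, an intermediate band is inspected, and high-${v}$ papers are accepted outright, which is the two-cutoff structure just described.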

For empiricists, the advice would be to shoot for high ${v}$ and damn the ${p}$!

More seriously, the model points out that even a journal that cares about replicability and bears the cost of verifying this will publish papers that have a low probability of being replicable. Hence, the presence of published papers that are not replicable is not, by itself, a sign of something rotten in Denmark.

One could improve outcomes by making authors bear the costs of a paper not being replicated. This points to a larger question. Replication is costly. How should the cost of replication be apportioned? In my model, the journal bore the entire cost. One could pass it on to the authors but this may have the effect of discouraging empirical research. One could rely on third parties (voluntary, like civic associations, or professionals supported by subscription). Or, one could rely on competing partisan groups pursuing their agendas to keep the claims of each side in check. The last seems at odds with the romantic ideal of disinterested scientists but could be efficient. The risk is partisan capture of journals which would shut down cross-checking.

From Kris Shaw, a TA for my ECON 101 class, I learnt that the band Van Halen once required that brown M&M’s not darken their dressing room door. Why? Maybe it was a lark. Perhaps a member of the band (or two) could not resist chuckling over the idea of a minor factotum appointed to the task of sorting the M&M’s. When the minor factotum was asked what they did that day, the response was bound to elicit guffaws. However, the minor factotum might have made it a point not to wash their hands before sorting the M&M’s. Then who would be laughing harder?

A copy of the M&M rider can be found here, along with Van Halen’s explanation of why the rider was included:

…the group has said the M&M provision was included to make sure that promoters had actually read its lengthy rider. If brown M&M’s were in the backstage candy bowl, Van Halen surmised that more important aspects of a performance–lighting, staging, security, ticketing–may have been botched by an inattentive promoter.

So the rider helps screen, apparently, for whether the promoter pays attention to detail. I think the explanation problematic. It suggests that it is hard to monitor the effort the promoter expends on important things, staging for example, so one should monitor something completely irrelevant instead. But then the strategic promoter should shirk on the staging and expend effort on the M&M’s.

Duppe and Weintraub date the birth of Economic Theory to June 1949, when Koopmans organized the Cowles Commission Activity Analysis Conference. It is also counted as conference zero of the Mathematical Programming Symposium. I mention this because the connections between Economic Theory, Mathematical Programming, and Operations Research were, at one time, very strong. The conference, for example, was conceived by Tjalling Koopmans, Harold Kuhn, George Dantzig, Albert Tucker, Oskar Morgenstern, and Wassily Leontief with the support of the RAND Corporation.

One of the last remaining links to this period who straddled, like a Colossus, both Economic Theory and Operations Research, Herbert Eli Scarf, passed away on November 15th, 2015.

Scarf came to Economics and Operations Research by way of Princeton’s mathematics department. Among his classmates were Gomory of cutting-plane fame, Milnor of topology fame, and Shapley. Subsequently, he went on to RAND (Dantzig, Bellman, Ford & Fulkerson). While there he met Samuel Karlin and Kenneth Arrow, who introduced him to inventory theory. It was in this subject that Scarf made the first of many important contributions: the optimality of (S, s) policies. He would go on to establish the equivalence of the core and competitive equilibrium (jointly with Debreu), identify a sufficient condition for non-emptiness of the core of an NTU game (now known as Scarf’s Lemma), anticipate the application of Groebner bases in integer programming (neighborhood systems) and, of course, write his magnificent ‘The Computation of Economic Equilibria’.

Exegi monumentum aere perennius regalique situ pyramidum altius, quod non imber edax, non Aquilo impotens possit diruere aut innumerabilis annorum series et fuga temporum. Non omnis moriar…

I have finished a monument more lasting than bronze and higher than the royal structure of the pyramids, which neither the destructive rain, nor wild North wind is able to destroy, nor the countless series of years and flight of ages. I will not wholly die…

You shouldn’t swing a dead cat, but if you did, you’d hit an economist doing data. Wolfers wrote:

“…modern microeconomists are more likely to spend their days knee-deep in large-scale data sets describing the real-world decisions made by millions of people, and less likely to be mired in Greek-letter abstractions.”

Knee-deep usually goes with shit, while mired with bog. I’ll pick bog over shit, but suspect that that was not Wolfers’ intent.

The recent paper by Chang and Li about the difficulty of replicating empirical papers  does rather take the wind out of the empirical sails. One cannot help but wonder about the replicability of replicability studies. No doubt, a paper on the subject will be forthcoming.

Noah Smith on his blog wrote:

So the supply of both good and mediocre empirics has increased, but only the supply of mediocre theory has increased. And demand for good papers – in the form of top-journal publications – is basically constant. The natural result is that empirical papers are crowding out theory papers.

Even if one accepts the last sentence, the first can only be conjecture. One might very well think that the supply of mediocre empirical papers is caused entirely by an increase in the supply of mediocre theory papers whose deficiencies are glossed over with a patina of empirics. Interestingly, when reviewers could find nothing nice to say about Piketty’s theories, they praised his data instead. It’s like praising the author of a false theorem by saying that while the proof is wrong, it is long.

The whole business has the feel of  tulip mania. Empirical papers as abundant as weeds. Analytics startups as plentiful as hedge funds. Analytics degree programs spreading like herpes. Positively Gradgrindian.

“THOMAS GRADGRIND, sir. A man of realities. A man of facts and calculations. A man who proceeds upon the principle that two and two are four, and nothing over, and who is not to be talked into allowing for anything over.”

In empirical econ classes around the world I imagine (because I’ve never been in one) Gradgrindian figures laying down the law:

“Facts alone are wanted in life. Plant nothing else, and root out everything else. You can only form the minds of reasoning animals upon Facts: nothing else will ever be of any service to them.”

I have nothing against facts. I am quite partial to some.  But, they do not speak for themselves without an underlying theory.

Chu Kin Chan, an undergraduate student from the Chinese University of Hong Kong, has collected the placement statistics of the top 10 PhD programs in Economics over the last 4 years. You can find the report here. In it you will find the definition of top 10 as well as which placements ‘counted’. Given that not all PhDs in economics who get academic positions do so in Economics departments, some judgement is required in deciding whether a placement counts as ‘top 10’ or ‘top 20’.

The results are similar to findings in other disciplines (the report refers to some of these). The top 10 departments place 5 times as many students in the top 20 departments as do those ranked 11 through 20. If you score a top 10 placement as +1, any other academic placement as a 0 and a non-academic placement as a -1, and then compute an average score per school, only one school gets a positive average score: MIT.
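The scoring rule is easy to state in code. A sketch with made-up counts (the numbers below are hypothetical, not Chan’s data):

```python
def average_score(top10, other_academic, non_academic):
    """Average placement score: +1 per top-10 placement, 0 per other
    academic placement, -1 per non-academic placement."""
    total = top10 + other_academic + non_academic
    return (top10 - non_academic) / total

# Hypothetical department: 8 top-10, 20 other academic, 12 non-academic.
print(average_score(8, 20, 12))  # -0.1
```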

Chan also compares a ranking of departments by placement with a ranking based on a measure of scholarly impact proposed by Glenn Ellison. What is interesting is that departments that are very close to each other in the scholarly-impact ranking can differ quite a lot in terms of placement outcomes.

Reading this in tandem with the Card and DellaVigna study on falling acceptance rates in top journals and the recent Baghestanian and Popov piece on alma mater effects makes me glad not to be young again!