This morning, a missive from the Econometric Society arrived in my inbox announcing “two modest fees associated with the submission and publication of papers in its three journals.” As of May 1st, 2020, the Society will assess a submission fee of $50 and a page charge of $10 per page for accepted papers. With papers on the short side running to around 30 pages plus 10-page appendices, this comes out to about $450. By the standards of the natural sciences this is indeed modest.

At the low end, the American Meteorological Society charges $120 per page with no submission fee. In the middle tier, the largest open-access publishers — BioMed Central and PLoS — charge $1,350–2,250 to publish peer-reviewed articles in many of their journals, and their most selective offerings charge $2,700–2,900. At the luxury end of the market is the Proceedings of the National Academy of Sciences, which starts at $1,590 for 6 pages and rises up to $4,215 for a 12-page paper.

My colleague Aislinn Bohren has suggested rewarding referees with free page coupons: publish one page free for each five pages you referee. This may suffer the same fate as the Capitol Hill Baby-Sitting Co-operative.

In the short run the effect will be to drive papers to JET and GEB, as not all academics have research budgets that will cover the fees. An alternative is to submit the paper for $50. If accepted, decline to have it published. Send it elsewhere and send a copy of the acceptance letter to one’s promotion and tenure committee. Voilà, a new category in the CV: accepted at Econometrica but not published.

With the move to on-line classes after spring break in the wake of Covid-19, my University has allowed students to opt to take some, all or none of their courses as pass/fail this semester. By making it optional, students have the opportunity to engage in signaling. A student doing well entering spring break may elect to take the course for a regular grade, confident of earning a high one. A student doing poorly entering spring break may elect to take the course pass/fail. It is easy to concoct a simplified model (think Grossman (1981) or Milgrom (1981)) in which there is no equilibrium where all students elect to take the course pass/fail. The student confident of being at the top of the grade distribution has an incentive to choose the regular grading option. The next student will do the same for fear of signaling a poor grade, and so on down the line. In equilibrium all the private information will unravel.
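
The unraveling logic can be sketched in a few lines of Python. This is a toy version of the Grossman/Milgrom argument with invented grades, assuming employers value a student at their grade and that an indifferent student reveals (the standard tie-break):

```python
# Toy unraveling: employers infer that a pass/fail student has the
# average grade of the pool of students electing pass/fail.
grades = [60, 70, 80, 90, 100]   # hypothetical numerical grades

pool = set(grades)               # students currently electing pass/fail
while pool:
    inferred = sum(pool) / len(pool)   # employer's inference about a "pass"
    # anyone at or above the pooled average prefers to reveal their grade
    # (indifferent students reveal -- the standard tie-break)
    leavers = {g for g in pool if g >= inferred}
    if not leavers:
        break
    pool -= leavers

print(sorted(pool))  # [] : all private information unravels
```

Each round, the students above the pool average defect to regular grading, which lowers the inference about those who remain, until the pool is empty.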

This simple intuition ignores heterogeneity in student circumstances. It is possible that a student with a good score going into spring break now faces straitened circumstances after it. How they decide depends on what inferences they think others (employers, for example) will make about grades earned during this period. Should an employer simply ignore any grades earned during this period, and should universities issue Covid-19 adjusted GPAs? Should an employer conclude that a student with a poor grade is actually a good student (because they did not choose the pass/fail option) who has suffered bad luck?

In response to the Covid-19 virus a number of American Universities are moving instruction on-line. Some see this as a great natural experiment to test the efficacy of virtual instruction (NO). Others believe it will speed the pace at which instruction moves on-line (NO). The focus now is on execution at scale in a short period of time. We would be better off canceling the rest of term and giving all the students A’s.

Here is what I predict will happen. Students will be dilatory in viewing lectures. Temptation and the difficulty of adjusting to new habits will be obstacles. When the exams approach, some will complain that they are unprepared: virtual is not as good as live, their instructor made a hash of things, there were no live office hours, and so on. The exams will be take-home without any proctors. While one’s own spirit is willing, there are doubts about the rectitude of one’s classmates.

On the other hand, during this period of exile, perhaps, there will emerge another Newton.

An agent with an infectious disease confers a negative externality on the rest of the community. If the cost of infection is sufficiently high, they are encouraged, and in some cases required, to quarantine themselves. Is this the efficient outcome? One might wonder if a Coasian approach would generate it instead: define a right to walk around when infected, which can be bought and sold. Alas, infection has the nature of a public bad, which is non-rivalrous and non-excludable. There is no efficient, incentive-compatible, individually rational (IR) mechanism for the allocation of such public bads (or goods). So something has to give. The mandatory quarantine of those who might be infected can be interpreted as relaxing the IR constraint for some.

If one is going to relax the IR constraint, it is far from obvious that it should be the IR constraint of the infected. What if the costs of being infected vary dramatically? Imagine a well-defined subset of the population bears a huge cost from infection while the cost for everyone else is minuscule. If that subset is small, then mandatory quarantine (and other mitigation strategies) could be far from efficient. It might be more efficient for the subset that bears the larger cost of infection to quarantine itself from the rest of the community.

Six years ago, I decided to teach intermediate microeconomics. I described my views on how it should be taught in an earlier post. The notes for that course grew into a textbook that is now available in Europe and will be available in the US this April. I am particularly delighted at being able to sport Paolo Uccello’s `The Hunt’ upon the cover. The publishers, Cambridge University Press, asked me to provide an explanation for why I had chosen it, which appears on the rear cover. Should you make your way to Oxford, be sure to stop by the Ashmolean Museum to see it, the painting of course, in all its glory. I daydream that, like Samuelson’s `Economics’, it will sell bigly.

Over a Rabelaisian feast with convivial company, conversation turned to a Twitter contretemps between economic theorists known to us at table. Its proximate cause was the design of the incentive auction for radio spectrum. The curious can dig around on Twitter for the cut and thrust. A summary of the salient economic issues might be helpful for those following the matter.

Three years ago, in the cruelest of months, the FCC conducted an auction to reallocate radio spectrum. It had a procurement phase in which spectrum would be purchased from current holders and a second phase in which it was resold to others. The goal was to shift spectrum, where appropriate, from current holders to others who might use this scarce resource more efficiently.

It is the procurement phase that concerns us. The precise details of the auction in this phase will not matter. Its design is rooted in Ausubel’s clinching auction, by way of Bikhchandani et al. (2011), culminating in Milgrom and Segal (2019).

The pricing rule of the procurement auction was chosen under the assumption that each seller owned a single license. If that assumption fails, the rule allows a seller with multiple licenses to engage in what is known as supply reduction to push up the price. Even if each seller initially owned a single license, a subset of sellers could benefit from merging their assets and coordinating their bids (or an outsider could come in and aggregate some sellers prior to the auction). A recent paper by my colleagues Doraszelski, Seim, Sinkinson and Wang offers estimates of how much sellers might have gained from strategic supply reduction.
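
The mechanics of supply reduction can be illustrated with a stylized uniform-price procurement. This is not the incentive auction's actual rule; the lowest-rejected-ask pricing and the cost numbers are assumptions for the example:

```python
# Stylized reverse auction: the buyer needs two licenses and pays every
# winning seller the lowest rejected ask.
def run_auction(asks, demand=2):
    """asks: list of (seller, cost) pairs, each bid truthfully at cost."""
    ranked = sorted(asks, key=lambda a: a[1])
    winners = ranked[:demand]
    price = ranked[demand][1]        # lowest rejected ask sets the price
    return winners, price

def profit_of(seller, asks):
    winners, price = run_auction(asks)
    return sum(price - cost for s, cost in winners if s == seller)

# Seller A owns two licenses (costs 1 and 2); B and C own one each.
honest   = profit_of('A', [('A', 1), ('A', 2), ('B', 5), ('C', 10)])
withheld = profit_of('A', [('A', 1), ('B', 5), ('C', 10)])  # A withholds one

print(honest, withheld)  # 7 9 : withholding a license raises A's profit
```

Offering both licenses, A sells two at price 5 for a profit of 7; withholding one pushes the price-setting ask up to 10, and A earns 9 on a single license.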

Was the choice of pricing rule a design flaw? I say: compared to what? How about the VCG mechanism? It would award a seller owning multiple licenses the marginal product associated with their set of licenses. In general, if the assets held by sellers are substitutes for each other, the marginal product of a set will exceed the sum of the marginal products of its individual elements. Thus, the VCG auction would have left the seller with a higher surplus than they would have obtained under the procurement auction assuming no supply reduction. As noted in Paul Milgrom’s book, when goods are substitutes, the VCG auction creates an incentive for mergers. This is formalized in Sher (2010). The pricing rule of the procurement auction could be modified to account for multiple ownership (see Bikhchandani et al. (2011)), but it would have the same qualitative effect: a seller would earn a higher surplus than under the procurement auction assuming no supply reduction. A second point of comparison would be an auction explicitly designed to discourage mergers of this kind. If memory serves, this reduces the auction to a posted-price mechanism.
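
A back-of-the-envelope example makes the merger incentive concrete. The valuation numbers are invented; substitutes here means the buyer's value for license sets is submodular:

```python
# Buyer's value for a set of licenses from sellers {a, b, c}: singles
# worth 10, pairs 18, all three 24 -- diminishing, i.e. substitutes.
def v(licenses):
    return {0: 0, 1: 10, 2: 18, 3: 24}[len(licenses)]

sellers = frozenset('abc')

def vcg_surplus(block):
    """VCG awards a seller (or merged block) its marginal product:
    welfare with its licenses minus welfare without (costs set to 0)."""
    return v(sellers) - v(sellers - block)

separate = vcg_surplus({'a'}) + vcg_surplus({'b'})   # (24-18)+(24-18) = 12
merged   = vcg_surplus({'a', 'b'})                   # 24 - 10 = 14
print(separate, merged)  # 12 14 : merging raises the pair's VCG surplus
```

With submodular values, removing two licenses at once destroys more welfare than the sum of removing each alone, so the merged block's marginal product, and hence its VCG surplus, exceeds the sum of the individual surpluses.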

Was there anything that could have been done to discourage mergers? The auction did have reserve prices, so an upper limit was set on how much would be paid for licenses. Legal action is a possibility, but it’s not clear whether that could have been pursued without delaying the auction.

Stepping back, one might ask a more basic question: should the reallocation of spectrum have been done by auction? Why not follow Coase and let the market sort it out? The orthodox answer is no because of hold-up and transaction costs. However, as Thomas Hazlett has argued, there are transaction costs on the auction side as well.

Hot on the heels of its new Division of Linear Algebra, Empire State’s President announced a new Institute for Wow. Unlike other centers and institutes on campus that were dedicated to basic research or innovation, this one would focus only on research that would grab attention. “Universities,” she said, “have tried for centuries to inform and educate. We’ve learnt in the last decade from all the data collected that this just annoys the students, frustrates the professors and bores donors. Instead,” she continued, “we are going to entertain.” She went on to say that the institute would jettison traditional measures of impact and significance and focus on media mentions, `likes’ and `followers’. The president emphasized that this was in keeping with Empire State’s mission to be not just the best University in the world, but the best University for the world. “On the increasingly long road from birth to death we want to make sure that people are not bored,” she said.

The Institute for Wow will be directed by celebrity academic Isaac Bickerstaff, one of the new breed of `click-bait’ style scholars that Empire hopes to attract. Bickerstaff first shot to fame with his `named cheese’ experiment. He took a large wheel of cheddar cheese and sliced it in two. One half he labeled `cheddar cheese’ and the other half he named `Partridge Farms cheddar cheese’. He then asked subjects to report their willingness to pay for each kind of cheese and discovered that on average they were willing to pay more for the `named’ cheddar. Grey hairs dismissed the work as not accounting for fixed cheddar-cheese effects, which so incensed Bickerstaff that he went on to test his hypothesis on Red Leicester, Camembert, Wensleydale, Limburger and Stinking Bishop.

Bickerstaff subsequently went on to test the hypothesis on humans and discovered that papers written by `named’ professors were ranked more highly than the same paper written by a professor with no such honorific. On the strength of this, Bickerstaff persuaded the dean of Empire State’s business school to raise money to endow a chair for every faculty member in the School. Within two years the publication output of the school had doubled and it had risen ten places in the rankings. The strategy was not without controversy. The University’s academic senate thought this cheapened the idea of a chaired professorship. In a compromise, it was decided to call the new positions ottoman rather than chaired professorships.

Professor Bernard Drapier, well known faculty gadfly and guardian of traditions has railed against the Institute for Wow. He says it is yet another example of the University’s subjugation to the military-entertainment complex: “We must guard against the acquisition of unwarranted influence, whether sought or unsought, by the military-entertainment complex. ”

Empire State University today announced a new Division of Linear Algebra and Information. It is the university’s largest program change in decades and helps secure its status among the country’s top Linear Algebra research and training hubs.

“The division will enable students and researchers to tackle not just the scientific challenges opened up by pervasive linear algebra, but the societal, economic, and environmental impacts as well,” the university said.

Empire State is in an elite group with Carnegie Mellon University, MIT, Stanford, and the University of Washington in the caliber and scope of its linear algebra program, said A. N. Other, chief executive of the Plutocrat Institute of Artificial Intelligence, a computer-science professor at the University of Ruritania, and a tech entrepreneur. In creating the new division, Empire State is responding to two issues, Other said. The first is a large, chronic shortage of well-trained linear algebraists. The second is what value a university can add when technical courses are widely available through platforms like Coursera and Udacity. In emphasizing interdisciplinary training among scientists, engineers, social scientists, and humanists, Empire State firmly integrates linear algebra into its prestigious academic offerings, he said.

Empire’s move follows MIT’s announcement last month that it was investing $1 billion in a new college of linear algebra. But leaders at Empire State say their disclosure of the division today was driven by an imminent international search for a director, who will hold the title of associate provost, putting the program on an institutional par with the State’s colleges and schools. They explain that in creating a division rather than a new college, they are reflecting the way linear algebra has become woven into every discipline.

Full article at the Chronicle of Higher Ed.

Apparently, it is quite the rage to festoon one’s slides with company logos, particularly those of the frightful five. At present this is done for free. It suggests a new business: a platform that matches advertisers to faculty. Faculty can offer up their slides and advertisers can bid for the right to place their logos on them.

Volume 42 of the AER, published in 1952, contains an article by Paul Samuelson entitled `Spatial Price Equilibrium and Linear Programming’. In it, Samuelson uses a model of Enke (1951) as a vehicle to introduce the usefulness of linear programming techniques to Economists. The second paragraph of the paper is as follows:

In recent years economists have begun to hear about a new type of theory called linear programming. Developed by such mathematicians as G. B. Dantzig, J. v. Neumann, A. W. Tucker, and G. W. Brown, and by such economists as R. Dorfman, T. C. Koopmans, W. Leontief, and others, this field admirably illustrates the failure of marginal equalization as a rule for defining equilibrium. A number of books and articles on this subject are beginning to appear. It is the modest purpose of the following discussion to present a classical economics problem which illustrates many of the characteristics of linear programming. However, the problem is of economic interest for its own sake and because of its ancient heritage.

Of interest are the five reasons Samuelson gives for why readers of the AER should care.

  1. This viewpoint might aid in the choice of convergent numerical iterations to a solution.

  2. From the extensive theory of maxima, it enables us immediately to evaluate the sign of various comparative-statics changes. (E.g., an increase in net supply at any point can never in a stable system decrease the region’s exports.)

  3. By establishing an equivalence between the Enke problem and a maximum problem, we may be able to use the known electric devices for solving the former to solve still other maximum problems, and perhaps some of the linear programming type.

  4. The maximum problem under consideration is of interest because of its unusual type: it involves in an essential way such non-analytic functions as absolute value of X, which has a discontinuous derivative and a corner; this makes it different from the conventionally studied types and somewhat similar to the inequality problems met with in linear programming.

  5. Finally, there is general methodological and mathematical interest in the question of the conditions under which a given equilibrium problem can be significantly related to a maximum or minimum problem.
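
Samuelson’s observation, that the Enke equilibrium solves a maximum problem, can be checked on a toy two-region example. The linear curves and transport cost below are invented for illustration: maximizing the net social payoff over the shipment recovers the point at which the inter-regional price gap equals the transport cost.

```python
# Two regions: exporting-region price p1(x) = 4 + x and importing-region
# price p2(x) = 20 - x when x units are shipped; unit transport cost T = 4.
T = 4.0

def net_social_payoff(x):
    # integral_0^x [(20 - s) - (4 + s)] ds - T*x  =  16x - x^2 - T*x
    return 16 * x - x ** 2 - T * x

grid = [i / 100 for i in range(1001)]          # shipments 0.00 .. 10.00
x_star = max(grid, key=net_social_payoff)
p1, p2 = 4 + x_star, 20 - x_star
print(x_star, p2 - p1)  # 6.0 4.0 : price gap equals the transport cost
```

At the maximizing shipment the marginal gain from arbitrage, the price gap, just covers the transport cost, which is exactly Enke’s equilibrium condition.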
