Thursday, 30 September 2010

George Stigler on what makes a teacher

The following quotation is from George Stigler's collection of essays on academia and society, "The Intellectual and the Marketplace".
The good teacher is a mysterious person, and yet we must know his character before we can prescribe his training. In my view, the good teacher is not distinguished by the breadth of his knowledge, by the lucidity of his exposition, or by the immediate reactions of his students. His fundamental task is not to dispense information, for in this role he is incomparably inferior to the written word. His task is to fan the spark of genuine intellectual curiosity and to instill the conscience of a scholar--to communicate the enormous adventure and the knightly conduct in the quest for knowledge.


To this end, the fundamental requirements of the good teacher are competence (How can the incompetent be other than slovenly?) and intellectual vitality (How can the sedentary excite us to bold adventure?).

These traits may be acquired by wide reading and deep reflection, without engaging in research and becoming a specialist. But it is an improbable event. It is improbable psychologically: it asks a man to have the energy to read widely and the intellectual power to think freshly, and yet to do no research. He is to acquire knowledge and construct ideas--and keep them a secret. It is improbable scientifically: it asks a man to be competent in his understanding of work that he has had no part in constructing. At least in economics, this is almost impossible. There is no book that states the consensus of the profession on the ideas that are changing--and these are naturally the most interesting ideas. Only the man who has tried to improve the ideas will know their strengths and weaknesses. Scholarship is not a spectator sport.
In other words, research and teaching are linked. One wonders what the powers that be in our universities would make of this.

(HT: Division of Labour)

Who gains from minimum prices for alcohol?

An article in the Guardian in the UK notes,
Large supermarket chains would benefit from a £700m windfall if minimum pricing for alcohol was introduced across the UK, new research indicated today.

Tesco, the UK's biggest supermarket, stands to reap the most rewards, according to the Institute for Fiscal Studies (IFS).

The thinktank researched the likely impact of a 45p minimum unit price for alcohol – the controversial measure which had been proposed by the Scottish government but was recently rejected by opposition parties.

The IFS said such a policy would benefit retailers rather than the public purse, echoing an argument that critics of minimum pricing have used against the measure.

The stores which sell the most alcohol – Tesco, Asda and Sainsbury's – stand to gain the most from the measure.
And yes, the article also reports that Tesco (which would gain an estimated £230 million) is backing the plan for minimum pricing. And why not? Currently some supermarkets, and I'm guessing Tesco would be one, use low booze prices as loss leaders. If some nice government were to come along and ban them, and their competitors, from doing so, they would not, of course, have to make such losses on these items. And as every firm has to act in the same way, there is no competitive pressure to undercut each other. Basically, retailers would be given the legal right to set up an oligopoly and thus to reap the profits from it. The biggest relative gains would be made by low-price and discount supermarkets, which sell the largest proportion of their alcohol below the 45p threshold.

So producers win, consumers lose. Don't supporters of minimum prices see this?

Wednesday, 29 September 2010

Texting bans may add risk to roads

In a previous post I commented on some interesting research being carried out by Jeffrey Miron into the effects of banning the use of cell phones while driving. Now I see from the Not PC blog that there are already reports out on this issue. In fact, texting bans may add risk to roads. The USA Today article says,
KANSAS CITY, Mo. — Laws banning texting while driving actually may prompt a slight increase in road crashes, research out today shows.
"Texting bans haven't reduced crashes at all," says Adrian Lund, president of the Insurance Institute for Highway Safety, whose research arm studied the effectiveness of the laws.
Researchers at the Highway Loss Data Institute compared rates of collision insurance claims in four states — California, Louisiana, Minnesota and Washington — before and after they enacted texting bans. Crash rates rose in three of the states after bans were enacted.

The Highway Loss group theorizes that drivers try to evade police by lowering their phones when texting, increasing the risk by taking their eyes even further from the road and for a longer time.
So, again, a law seems not to be achieving what it set out to achieve. When will lawmakers learn?

John Taylor on policy in face of the financial crisis

At his blog Economics One John Taylor writes about his testimony before the US Senate Budget Committee. He writes,
My testimony summarized the results of studies conducted at Stanford during the past three years examining the empirical impact of the policies (the studies are described in the appendix).

One simple fact which I reported received considerable attention in the senators’ discussion. It was that only $2.4 billion of the $862 billion in the 2009 stimulus package (ARRA) has been spent on federal infrastructure—three-tenths of a percent. More may have resulted at the state and local level but there is no clear connection between the federal grants and such spending.

More generally I reported that on balance the federal policy responses to the crisis have not been effective. Three years after the crisis began the recovery is weak and unemployment is high. A direct examination of the fiscal stimulus packages shows that they had little effect and have left a harmful legacy of higher debt. The impact of the extraordinary monetary actions has been mixed: while some actions were helpful during the panic stage of the crisis, others brought the panic on in the first place and have had little or no impact since the panic. The monetary actions have also left a legacy of a large monetary overhang which must eventually be unwound.

I am frequently asked what I would have done differently. It turns out that I testified before the same Senate Budget Committee two years ago in November 2008 and recommended a specific four part fiscal policy response to the crisis. The response was based on certain established economic principles, which I summarized by saying that policy should be predictable, permanent and pervasive affecting incentives throughout the economy.

But this is not the policy we got. Rather than predictable, the policy has created uncertainty about the debt, growing federal spending, future tax rate increases, new regulations, and the exit from the unorthodox monetary policy. Rather than permanent, it has been temporary and thereby has not created a lasting economic recovery. And rather than pervasive, it has targeted certain sectors or groups such as automobiles, first time home buyers, large financial firms and not others. It is not surprising, therefore, that the policy response has left us with high unemployment and low growth. Given these facts, the best that one can say about the policy response is that things could have been even worse, a claim that I disagree with and see no evidence to support.
So the stimulus has not stimulated. It has, however, created regime uncertainty. Again this shows that New Zealand's somewhat low-key approach to the crisis may have been the right one.

Just for fun: theory of the firm 11

Two of the main theories of the firm are the transaction cost approach, associated with people like last year's Nobel Prize winner in economics Oliver Williamson, and the property rights approach, associated with Sanford J. Grossman, Oliver Hart and John Moore. Recently Hart and Moore developed a new approach to contracts which has also been applied to the theory of the firm. This is commonly referred to as the "reference point" approach. An interesting question is how these different models are related.

In the past it was commonly argued that the property rights approach was a formalisation of the transaction cost approach. As Robert Gibbons (2005: 201) has written,
For example, one still sometimes hears the claim that “Grossman and Hart (1986) formalized Williamson (1979).” Indeed, I have heard this claim with two opposite spins: “Grossman–Hart merely formalized Williamson,” and “Finally, someone formalized Williamson.”
This, however, turns out not to be true. In a discussion of the differences between the Grossman-Hart-Moore (GHM) theory of the firm and the transaction-cost approach, Williamson (2000: 605-6) argues that the most important difference between them is that GHM introduce inefficiencies at the ex ante investment stage, that is, before the state of the world is known, while the transaction-cost approach assumes that ex post contractual problems drive inefficiencies. There are no ex post inefficiencies in GHM because of their assumption of ex post common knowledge and costless bargaining. Williamson writes,
The most consequential difference between the TCE and GHM setups is that the former holds that maladaptation in the contract execution interval is the principal source of inefficiency, whereas GHM vaporize ex post maladaptation by their assumptions of common knowledge and costless ex post bargaining. The upshot is that all of the inefficiency in GHM is concentrated in the ex ante investments in human assets (which are conditional on the ownership of physical assets). (Williamson 2000: 605).
This shift from ex post maladaptation to ex ante investment distortions, in Williamson's view, matters. He says,
(1) The TCE rendition of the make-or-buy decision between successive stages (A and B) asks whether A and B should be separately owned and operated or if the ownership and operation of these two stages should be unified. If independent, then each stage appropriates its net receipts (high-powered incentives obtain) but maladaptation problems can arise during contract execution. If unified, then the two stages are managed coordinately through hierarchy. (Maladaptation problems are thereby relieved; incentives are lower-powered; and added bureaucratic costs arise.) By contrast, GHM view vertical integration in a directional way: either A buys B or B buys A, and it matters which way this is done. That is because common ownership under GHM does not imply unified management. Instead, each stage (in all configurations A and B are independent; A buys B; B buys A) appropriates its net receipts. This last is a very unusual condition, in that unified ownership is normally thought of as a means by which to effect cooperation. (2) TCE maintains that each generic mode of governance spot market, incomplete long-term contract, firm, bureau, etc. is defined by a syndrome of attributes to which distinctive strengths and weaknesses accrue. Specifically, TCE holds that alternative modes differ in incentive intensity, administrative controls (to include auditing, accounting, and transfer pricing), access to the courts, and informal organization (to include politicking). GHM assume that incentive intensity, administrative controls, and informal organization are unchanged by ownership and that courts are irrelevant (because of costless renegotiation). None of the physical asset utilization and transfer pricing distortions that I associate with the "impossibility of selective intervention" (Williamson 1985, pp. 135-40) thus occur under the GHM setup.
(3) TCE examines a wide range of ex post devices for infusing credible commitments into contracts and applies this reasoning to a wide set of transactions. Variations on this theme include hybrid modes of organization, exchange agreements and other uses of hostages to support exchange, the organization of work, the organization of labor and human resources more generally, corporate governance, regulation (and deregulation), public bureaus, and project financing. Because GHM is a property rights and property rights only construction, it relates to some of these issues not at all and others very selectively. (Williamson 2000: 606) (I have removed references from this quote.)
So these theories are, as Gibbons puts it, essentially orthogonal.

What then of the new "reference point" approach to the firm?

I would argue that the reference point approach can be seen as a movement back toward transaction cost type thinking in that contracting is not fully enforceable ex post. Perfunctory performance can be enforced, consummate performance cannot, and thus ex post inefficiencies can arise. Hart (2008) argues that aggrievement/shading costs, which drive the reference point approach, are akin to what he calls "haggling costs". Hart (2008: 405) explains,
He [Coase] suggested that the two most obvious costs of using the market/price mechanism are (i) discovering what the market prices are and (ii) negotiating a contract for each exchange transaction. Economists since Coase have referred to these as ‘haggling’ costs (although I do not believe that Coase uses this term). ‘Argument’ costs might also be appropriate for (ii).
Modelling haggling costs can be seen as a move in the direction of modelling transaction costs, that is, a move towards modelling the costs of market contracting. In the reference point approach ex post inefficiencies return, via shading costs, without the need for ex ante investment problems. So the action in these models, like the transaction cost models, is at the ex post stage. However, it can also be argued that core features of the transaction cost theory may still be being left out.

Thus Robert Gibbons (2010: 283) has a point when he says, while discussing some opportunities for the future of transaction-cost economics, that only time will tell if the reference point theory is a productive path to follow.

  • Gibbons, Robert (2005). `Four formal(izable) theories of the firm?', Journal of Economic Behavior and Organization, 58(2): 200-45.
  • Gibbons, Robert (2010). `Transaction-Cost Economics: Past, Present, and Future?', Scandinavian Journal of Economics, 112(2): 263-88.
  • Grossman, Sanford J. and Oliver D. Hart (1986). `The Costs and Benefits of Ownership: A Theory of Vertical and Lateral Integration', Journal of Political Economy, 94(4): 691-719.
  • Hart, Oliver D. (2008). `Economica Coase Lecture: Reference Points and the Theory of the Firm', Economica, 75(299) August: 404-11.
  • Williamson, Oliver E. (1979). `Transaction cost economics: the governance of contractual relations', Journal of Law and Economics, 22: 233-261.
  • Williamson, Oliver E. (2000). `The New Institutional Economics: Taking Stock, Looking Ahead', Journal of Economic Literature, 38(3) September: 595-613.

Tuesday, 28 September 2010

Who should own schools?

Tim Worstall asks the following question:
If customer mutuals are such a great idea, you know Building Societies, the Co Op, organisations owned and ultimately managed by their customers, why are schools run by parents (acting, of course, in loco parentis for the ankle biters who are the real customers) such a bad idea?
Chris Dillow tries to answer it. Dillow draws on the work of Henry Hansmann and his excellent book The Ownership of Enterprise and thus takes a transaction-cost approach to his answer.
1. Lock-in. Consumers don’t own fruit and veg stalls. But in American rural areas they often own utility companies. There’s a reason for this. With fruit and veg, a dissatisfied consumer can easily go to a different supplier. But a dissatisfied rural electricity customer cannot so easily shop around. They are locked into one supplier. Customer ownership stops this supplier ripping them off. It acts as a substitute for exit.
This suggests that the more students are locked into one school, the more they should own it.
But one could ask: how locked-in are students? In most cases there are alternative schools available if students, or their parents, want to move.
2. Monitoring. Ownership should flow to those best able to control an asset. Here, the case for parental ownership is mixed. On the one hand, they can monitor outputs very well - the progress their child is making. This argues for them controlling the school. But on the other hand, they (despite what they might think) know little about the inputs into education - the mechanics of teaching. This argues against them getting ownership.
But if the outputs are OK, how much do they need to know about the inputs?
3. The dangers of predation. A good argument for mutual building societies is that bankers would otherwise be able to rip-off customers, by paying them low deposits and giving themselves high salaries. To the extent that there’s an analogous danger with schools, this argues for customer ownership. But is there? Dickens, Orwell and the Economist say yes. But most people’s experience of UK education, I suspect, suggests otherwise.
So does Dillow really think that the teacher unions do not take advantage of the situation to make themselves better off if they can?
4. The costs of organizing. The harder it is to get hundreds of parents to agree upon how the school should be run, the weaker is the case for parental ownership.
One potential problem here is that parents will have different interests: some, say, will want the school to be strong in arts, others in science or sports. The more this is the case, the weaker the case for parental ownership.
A reasonable point. I have noted the problems with heterogeneous owners before. But the real question is a relative one: is there a group with less heterogeneous interests who could be the owners?
5. Risk bearing. Ownership should flow to those who bear the most risk. As it is students who suffer most from a bad school, this suggests a case for parental ownership.
Chris then argues that,
These principles suggest that parental ownership is not obviously stupid. But they are also consistent with teacher-ownership, especially if this is associated with competition: principles (2) and (4) might argue for this, whilst competition helps satisfy principles (1), (3) and (5).
One thing that argues against both parent and teacher ownership is the question of who makes the investment in the school. Neither the teachers nor the parents may have the funds available to set up and maintain a school. And even if they do have the funds, what happens when either the teacher or the child leaves the school? How does the teacher or the parent get their investment back?

Chris Dillow continues,
But whilst the case for teacher vs. parental ownership is mixed according to these principles, the case for state ownership doesn’t leap out from them - unless we put great weight upon (4). Which raises the question: what can justify state ownership being the dominant form?
Funding may be the answer. As with most firms, which are investor-owned, investors, the state in this case, may be the only group who have relatively homogeneous interests and the ability to fund the school.

The alternative is for schools to be private, as many are now. Here they are owned by a group who are, by and large, neither parents nor teachers and who supply education on a not-for-profit or for-profit basis, but this requires parents to pay for their ankle biters' education directly.

So in either case ownership of schools seems to come down to funding. Schools are owned by whoever, state or private, has the money to invest in them.

EconTalk this week

Gary Greenberg, psychologist and author of The Noble Lie and Manufacturing Depression, talks with EconTalk host Russ Roberts about the nature of addiction, depression and mental illness. Drawing on ideas in the two books, Greenberg argues that there are strong monetary incentives to define various problems as illnesses that psychiatrists "cure" with drugs. Greenberg argues that this distorts science and has strong impacts, good and bad, on how we view ourselves and the challenges of life. The conversation looks at the scientific basis for addiction and the role of brain chemistry in depression. The conversation closes with a discussion of Greenberg's correspondence with the Unabomber.

Permanent disaster law better than legislation on the hoof?

Over at her blog Homepaddock writes,
Radio NZ reports the government is considering permanent legislation to deal with natural disasters.
While the Canterbury Earthquake Response and Recovery Act has been criticised for overriding existing laws, Civil Defence Minister John Carter says it’s already proving its worth.

And Mr Carter says the Government is considering whether new legislation is needed to deal with disasters, rather than having to rush through emergency laws, as has happened after the earthquake.
She then adds,
Legislation made with public input and careful consideration will almost always be better than that made on the hoof.
But is this so? It depends on how completely any law we write today can cover what will happen in an unknowable future. If, as seems reasonable to at least consider, each disaster is largely unique, then it is not obvious that we can write a law today that can deal with all possible disasters in the future. The best we could do is write in a mechanism to decide what should be done in any situation, but such a mechanism must itself be incomplete, if only for reasons of bounded rationality. We cannot ever hope to cover all possible eventualities even with a general mechanism. Thus we will still need, in the future, some way of "completing" the incomplete mechanism. Would it not then be better to write a totally incomplete law, i.e. no law at all, and react at the time of a disaster, given that we will then have much better information about the particular problems people face and can therefore tailor the legislation to the needs of the people at the time?

Here we have a trade-off between certainty and flexibility. A law written today will give people certainty as to what actions would be carried out and by whom, but making legislation "on the hoof" gives flexibility, so that legislation can be tailored to the particular circumstances.

Add to this learning by doing, so that any future legislation can incorporate what we have learnt up until that time about what works and what doesn't.

Monday, 27 September 2010

Bernanke on "Implications of the Financial Crisis for Economics"

Ben S. Bernanke gave a speech on the "Implications of the Financial Crisis for Economics" at Princeton. In it he said,
Although economists have much to learn from this crisis, as I will discuss, I think that calls for a radical reworking of the field go too far. In particular, it seems to me that current critiques of economics sometimes conflate three overlapping yet separate enterprises, which, for the purposes of my remarks today, I will call economic science, economic engineering, and economic management. Economic science concerns itself primarily with theoretical and empirical generalizations about the behavior of individuals, institutions, markets, and national economies. Most academic research falls in this category. Economic engineering is about the design and analysis of frameworks for achieving specific economic objectives. Examples of such frameworks are the risk-management systems of financial institutions and the financial regulatory systems of the United States and other countries. Economic management involves the operation of economic frameworks in real time--for example, in the private sector, the management of complex financial institutions or, in the public sector, the day-to-day supervision of those institutions.
For me, someone should point out to Ben that the public sector cannot carry out the real-time, day-to-day supervision of financial institutions, and in trying may well cause more problems than it solves. Did Hayek not point out the importance of the knowledge of time and place, knowledge that central planners and regulators just cannot get? Without it regulators cannot regulate.

Interesting research

Below is a posting from Jeffrey Miron's blog:
Last year, Americans sent 1.6 trillion text messages and spent 2.3 trillion minutes talking on cellphones. They also drove nearly 3 trillion miles. More often than many will admit, those activities happened at the same time.

On Thursday, the Safe Driving Act takes effect here, placing Massachusetts among the majority of states where the law forbids texting while driving, and where 16- and 17-year-old motorists will not be allowed to use a cellphone in any manner while at the wheel, unless it is to dial 911.
I am not convinced such laws generate more benefit than cost, as I explained in an earlier post.

Rather than just speculate about this, however, I decided to get some evidence. I have two students writing senior theses on whether these laws reduce traffic accidents. I’ll report on their results in about six months.
I for one look forward to seeing the results. I will say now that my guess is that such laws will do little to reduce accidents, but we will see. One wonders what the unintended consequences of such laws will be.

The irrelevance of modern political science

Steven F. Hayward writes on the above topic in The American. He writes about the recent meeting of the American Political Science Association (APSA) in Washington,
Most conventions in Washington are able to attract at least a bit of the city's star power. Obscure trade associations get House members. Larger groups get senators, or maybe, if they're lucky, a member of the White House's senior staff . . . There were no political luminaries in attendance at the American Political Science Association's convention last week, however. The fact that the country's brightest political scholars had all gathered at the Marriott Wardman Park barely seemed to register on the rest of the town. Worse, you got the feeling that the political scientists knew it. One of the conference's highlights, according to its Web site, was a panel titled "Is Political Science Relevant?”
Later he continues,
Ask yourself this question: Among policy makers in Washington, are you more likely to find academic economic journals in their offices, or academic political science journals? Why do economists not face the same kind of worry about their "relevance," even though their mathematical approaches to the subject matter can be even more esoteric and forbidding?
As my late friend Tom Silver once wrote: "Imagine yourself marooned on a desert island with only ten books to read, but in this case ten books not of your choosing. Suppose them all to be books written by behavioral political scientists during the past twenty years. Question: Do you think that you would die first of boredom, or of self-inflicted wounds?"
An interesting question. But I make no comment.

Sunday, 26 September 2010

Question for a new MP

Kiwiblog writes
Have a look at the questions asked by the ODT of new ACT MP Hilary Calvert:

* Any convictions you would like to declare now?
* Any misbehaviour, non-criminal, that might come back to bite you?
* Any family members involved in criminal behaviour?
* Could you ever contemplate supporting Heather Roy?
* According to a radio report, you have a pecuniary interest in a licensed massage parlour.
For me, I would ask just one question of a new MP: name one basic economic principle that the general public does not understand, and explain how you would get it across to them.

My answer, if I were to be asked the question, would involve a discussion of comparative advantage, but I'm guessing most MPs wouldn't say that.

Friday, 24 September 2010

Stable democracies are more likely to enjoy sustained financial development

What do countries need for sustainable financial development? This is the question asked in a new column, “Mother, can I trust the government?” Stable democracies are more likely to enjoy sustained financial development, by Marc Quintyn and Geneviève Verdier.

In the article it is argued that protection of property rights is a necessary but not sufficient condition for financial development. Using a sample of 160 countries from 1960 to 2005, Quintyn and Verdier find that checks and balances on power and political stability are also vital ingredients. There is research showing that effective enforcement of property rights is an institution contributing to financial development, more so than other legal institutions. This does, however, raise the question: what is the ultimate source of effective protection of property rights? Quintyn and Verdier write,
A number of authors have argued that political institutions represent this ultimate source. They argue that effective protection of property rights can only be established in an environment where political institutions are willing to limit their own powers, through systems of checks and balances (Haber et al. 2008). Political institutions that provide checks and balances instil the necessary confidence in market participants that allows a country’s financial system to develop healthily. So far, a limited number of studies have supported this view (see Bordo and Rousseau (2006), Keefer (2008), Roe and Siegel (2008) and Tressel and Detragiache (2008)).

In a recent paper (Quintyn and Verdier 2010), we test this hypothesis and find that most long-lasting episodes of financial deepening indeed have occurred in countries with high-quality political institutions.

Our research analyses a standard measure of financial development – the ratio of private sector credit to GDP – from a different perspective than the typical time-series analysis. For a sample of 160 countries over the period 1960-2005 we identify periods during which countries experienced an acceleration in financial development. We identify a total of 209 such accelerations – defined as the ratio growing more than 2% annually for a minimum of 5 years – and estimate the impact of economic and institutional conditions around the take-off, on the likelihood of such an event. Episodes of financial acceleration range in length from 5 years (the imposed minimum) to as long as 22 years. Obviously, the nature of these periods can be different. The literature generally identifies three types of accelerations:
  • Type 1. At beginning of a cyclical upturn, credit typically expands faster than output, due to the need to finance investments and higher working capital. Such episodes are associated with the working of the conventional accelerator;
  • Type 2. Excessive credit expansions or “credit booms” may result from inappropriate responses of market participants to changes in risks over time, sometimes following financial liberalisation. In the best case, the boom comes to a soft landing, in the worst case it results in a banking (and real sector) crisis;
  • Type 3. Longer periods of “financial deepening”, with rates of expansion of the financial system typically lower than type 1 or 2, but sustainable for a longer period of time, leading to a more sophisticated financial system.
To test the political-institutions hypothesis, we have a special interest in analysing the conditions under which sustained accelerations (type 3) are likely to occur. Based on the literature on lending booms, we identify type-3 takeoffs as those lasting 10 years or longer, allowing us to compare the prevailing economic and institutional conditions at the start of short-lived and longer-lived periods of financial deepening. Of the 209 episodes, 161 are short episodes and 48 are long ones – just over one-fifth of the total.

The significance of having experienced (at least) one long episode of financial acceleration for financial development is obvious from Figure 1. A majority of countries that now have highly developed financial systems experienced a long episode of acceleration (black bars) at some point over the past 50 years. On the other hand, a sustained acceleration is no guarantee of success, as reversals seem to have occurred in a number of countries.

Figure 1. Cross-country disparities in the ratio of private sector credit to GDP (2005) (each bar represents a country)
Quintyn and Verdier continue,
How does an episode of financial acceleration become a period of financial deepening? We consider the hypothesis that a set of factors – macroeconomic variables (real GDP growth, inflation, GDP per capita), financial reforms (index of financial liberalisation from Tressel and Detragiache, 2008), the quality of political institutions (from Polity IV) – explain the probability of a take-off at time t. The dependent variable in our probit analysis is a dummy that is 1 in the years associated with a financial acceleration and 0 otherwise. All explanatory variables are lagged to reduce the risk of endogeneity.

Our main results [ ... ] show that the drivers of short and long episodes are different. Financial liberalisation has a significant and large impact on the probability of a take-off. However, the effect differs according to the duration of the take-off. The likelihood of a short episode increases significantly following successive efforts to liberalise the financial system. In contrast, mainly contemporaneous financial liberalisation seems to matter for long episodes. The impact of improved bank supervision seems rather weak across the board.

Our results lend strong support to the political institutions view. The Polity variable has a significant and negative effect on the probability of a take-off lasting less than ten years and, contrasting with this, a significant and positive impact on sustained episodes of financial development. This suggests that countries with checks and balances in the political system are more likely to experience genuine financial deepening. Recent regime changes (positive or negative) do not seem to increase the likelihood of short or long episodes significantly. This confirms that political stability and a confidence-enhancing institutional setup are important for financial development.

To further investigate the importance of political stability for financial development, we also considered the durability of the political regime. The results [ ... ] show that the durability of a democratic regime, in other words, a combination of stability and quality of the polity, greatly increases the probability of a sustained period of financial development.
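The probit setup Quintyn and Verdier describe amounts to assembling, for each year, an episode dummy and a set of lagged regressors. A minimal sketch of that data-assembly step, with made-up variable names and values (the estimation itself would then be run with any standard probit routine):

```python
def build_probit_dataset(years, episode_years, covariates, lag=1):
    """Assemble (y, X) rows for a take-off probit.

    years: sorted list of sample years.
    episode_years: set of years falling inside a financial acceleration.
    covariates: dict mapping year -> list of explanatory values
        (e.g. real GDP growth, inflation, liberalisation index, Polity score).
    All regressors are lagged `lag` years, mirroring the paper's strategy
    of lagging explanatory variables to reduce the risk of endogeneity.
    """
    y, X = [], []
    for t in years:
        if (t - lag) in covariates:        # drop years with no lagged data
            y.append(1 if t in episode_years else 0)
            X.append(covariates[t - lag])
    return y, X
```

The dependent variable is 1 in acceleration years and 0 otherwise, exactly as in the quoted passage; everything else here (the single covariate, the years) is illustrative.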
References
  • Bordo, Michael and Peter Rousseau (2006), “Legal-Political Factors and the Historical Evolution of the Finance-Growth Link”, NBER Working Paper 12035.
  • Campos, Nauro and Fabrizio Coricelli (2009), “Financial Liberalization and Democracy: The Role of Reform Reversals”, CEPR Discussion Paper 7393, 41pp.
  • Gourinchas, Pierre-Olivier, Rodrigo Valdés and Oscar Landerretche (2001), “Lending Booms: Latin America and the World”, Economia, Spring, 47-99.
  • Haber, Stephen, Douglass North and Barry Weingast (2008), “Political Institutions and Financial Development”, 1-9, in Haber, Stephen, Douglass North and Barry Weingast (eds.), Political Institutions and Financial Development, Stanford University Press.
  • Hilbers, Paul, Inci Otker-Robe, Ceyla Pazarbasioglu and Gudrun Johnsen (2005), “Assessing and Managing Rapid Credit Growth and the Role of Supervisory and Prudential Policies”, IMF Working Paper 05/151.
  • Keefer, Philip (2008), “Beyond Legal Origin and Checks and Balances: Political Credibility, Citizen Information and Financial Sector Development”, 125-155, in Haber, Stephen, Douglass North and Barry Weingast (eds.), Political Institutions and Financial Development, Stanford University Press.
  • Quintyn, Marc and Geneviève Verdier (2010), “‘Mother, can I trust the Government?’ Sustained Financial Deepening – A Political Institutions View”, IMF Working Paper 10/210.
  • Roe, Mark, and Jordan Siegel (2008), “Political Instability: Its Effects on Financial Development, Its Roots in the Severity of Economic Inequality”, SSRN, July.
  • Tressel, Thierry and Enrica Detragiache (2008), “Do Financial Reforms Lead to Financial Development? Evidence from a New Dataset”, IMF Working Paper 08/265.

Greg Mankiw on being an economic advisor to the US President

Past wages and unemployment durations

Great Expectations: Past Wages and Unemployment Durations is an interesting looking new paper from René Böheim, Gerard Thomas Horvath and Rudolf Winter-Ebmer.

The abstract reads:
Decomposing wages into worker and firm wage components, we find that firm-fixed components (firm rents) are sizeable parts of workers' wages. If workers can only imperfectly observe the extent of firm rents in their wages, they might be misled about the overall wage distribution. Such misperceptions may lead to unjustified high reservation wages, resulting in overly long unemployment durations. We examine the influence of previous wages on unemployment durations for workers after exogenous lay-offs and, using Austrian administrative data, we find that younger workers are, in fact, unemployed longer if they profited from high firm rents in the past. We interpret our findings as evidence for overconfidence generated by imperfectly observed productivity.
So if you can't tell whether your wage reflects you or your firm, you may mistakenly attribute all of it to your own effort and ability, become overconfident, and set a reservation wage that is too high should you become unemployed. The result is a longer spell of unemployment if you are made redundant. I wonder how true this is of the New Zealand labour market. Do our unemployed have too great expectations?
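The mechanism can be made concrete with a toy search model. All the numbers, the function name, and the offer distribution below are made up for illustration; they are not taken from the paper.

```python
import random

def expected_search_weeks(own_wage, firm_rent, misperceive, offers,
                          trials=10_000, seed=0):
    """Average weeks of search until an offer meets the reservation wage.

    A laid-off worker's old wage was own_wage + firm_rent. If she
    misperceives the firm rent as her own productivity, her reservation
    wage is the full old wage; otherwise it is just own_wage. One offer
    is drawn per week, with replacement, from `offers`.
    """
    rng = random.Random(seed)
    reservation = own_wage + firm_rent if misperceive else own_wage
    total = 0
    for _ in range(trials):
        weeks = 1
        while rng.choice(offers) < reservation:  # reject and keep searching
            weeks += 1
        total += weeks
    return total / trials
```

With offers of 900, 1000 and 1100, an old own-wage of 1000 and a firm rent of 100, a worker who treats the rent as her own productivity holds out for 1100 and waits about three weeks on average, while a worker with an accurate reservation wage of 1000 waits about one and a half: misperceived firm rents translate directly into longer unemployment spells.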

Thursday, 23 September 2010

No change here then

The government has announced that
At New Zealand Post, Hon Michael Cullen has been appointed chair from November 1 to replace Rt Hon Jim Bolger.
So we can be pretty sure there will be no change in policy at NZ Post. :-(

What have we learned from market design?

Or so asks Alvin Roth in a new paper: What have we learned from market design? Update to Roth (2008). The abstract reads:
"After a market has been designed, adopted, and implemented, it has a continuing life of its own. For those involved directly in the market, it is useful to continue to monitor it to make sure it is functioning well. For those of us involved in market design, it is also good to check how things are going, as a way to find out if there are unanticipated problems that still need to be addressed. Finally, the design and operation of new marketplaces also raises new theoretical questions, which sometimes lead to progress in economic theory. In this update, I’ll briefly point to developments of each of these kinds, since the publication of Roth (2008), What have we learned from market design?. I’ll discuss theoretical results only informally, to avoid having to introduce the full apparatus of notation and technical assumptions."
There is nobody better to ask, and answer, such a question than Roth as he is one of the leaders in the field.

The 2008 paper referred to is available from: Roth, Alvin E. "What have we learned from market design?" Hahn Lecture, Economic Journal, 118 (March), 2008, 285-310.

Nice billboard

This picture of a Turners & Growers billboard in Wellington comes via Roger Kerr's blog:

The billboard is a small part of a big battle to remove New Zealand’s last remaining export monopoly. Yes, we still have them! :-(

The company Zespri has a monopoly on exporting New Zealand kiwifruit to all countries except Australia. Roger Kerr writes,
On the issue, Agriculture Minister David Carter has said the government will be guided by what most growers want – but what about the other growers? Blocking non-Zespri growers from exporting overseas because they are a minority is like banning all dairy producers who don’t supply Fonterra from exporting overseas. Fonterra is subject to competition as Zespri should be.
What does Zespri have to fear from competition? If it is doing as well as it can in export markets, it has nothing to fear; no one will be able to take market share away from it. If not, then it has much to fear, but that's the point of competition. New Zealand needs the most efficient exporters driving our export sectors. Competition is one way of making sure we have them. This is as true for kiwifruit as for milk or any other commodity. It is time this bit of blatant protectionism went the way of the dinosaurs.

Wednesday, 22 September 2010

How did the poor come to be poor?

To this question Tim Worstall gives an excellent answer:
The central economic question is not how people came to be poor, not how poverty is created, for it isn’t. It’s how wealth is created, that wondrous process which pre-1700 really didn’t happen to anybody (no, the elites, while fabulously wealthy in terms of the portion of wealth they controlled were, by our standards today, dirt poor) and now happens to any country which cares to take up the process which is the interesting and important question.
Yes, there are exceptions, Zimbabwe under Mugabe being an obvious recent example, but by and large Worstall has it right. Wealth is the exception; the norm for most of human existence was poverty, so wealth is what needs explaining. Economic growth is the mystery (What causes it? Why post-1700? Why did it start in the UK? How did it spread to other places?), and this is why economic historians spend so much time trying to explain it, rather than trying to explain poverty.

Tuesday, 21 September 2010

Is anyone surprised by this?

This paper investigates the institutional causes of China’s Great Famine. It presents two empirical findings: 1) in 1959, when the famine began, food production was almost three times more than population subsistence needs; and 2) regions with higher per capita food production that year suffered higher famine mortality rates, a surprising reversal of a typically negative correlation. A simple model based on historical institutional details shows that these patterns are consistent with the policy outcomes in a centrally planned economy in which the government is unable to easily collect and respond to new information in the presence of an aggregate shock to production.
This is the abstract of a new paper: The Institutional Causes of China's Great Famine, 1959-61 by Xin Meng, Nancy Qian and Pierre Yared.

The authors note that,
Our third contribution is to add to studies on the efficiency of central planning and the trade-off between quantity and price controls. Our model generally builds off of arguments made during the historic Socialist Calculation Debate, when Austrian economists such as Von Mises (1935) and Hayek (1945) argued that it was practically impossible for central planners to aggregate necessary information in a timely fashion.
The paper's conclusion begins,
Our study points to inflexible government policy for food distribution as an important factor in causing the largest famine in history. We show that even if a government is not obviously malevolent or incompetent, an ideological commitment to central planning, together with practical constraints for gathering and responding to information, result in an inflexible policy that can cause a famine when aggregate production falls.
So central planning doesn't work. As I said, is anyone surprised by this result?

EconTalk this week

Richard Epstein of New York University and Stanford University's Hoover Institution talks with EconTalk host Russ Roberts about the current state of the economy, particularly the regulatory climate. Epstein argues the current level of regulation is producing unusually high costs. He digs more deeply into the pharmaceutical industry and discusses various regulations and alternative ways to encourage drug safety and innovation.

A transformation economy: shaping the future of EU trade policy

In this audio David O’Sullivan, Director General for Trade at the European Commission, joins the VoxEU trade policy debate. He talks to CEPR's Viv Davies about, amongst other things, the EU’s responsibility within the world trading system, trade governance and the WTO, the role of reciprocity, the BRICs, and the importance of successfully concluding the DDA negotiations; he also comments on the issue of ‘multilateralising regionalism.’

I have two words for Mr O'Sullivan: Free Trade.

Monday, 20 September 2010

Game theory and the law

A picture:

and an explanation:
The local sheriff’s office had established the signs as a “ruse” to direct motorists to exit off the highway after viewing the warning of the upcoming DUI/Narcotics checkpoint. In fact, there was no checkpoint further down I-40. Instead, the sheriff set up a checkpoint at the end of the ramp of the first exit available to motorists after the posted signs, an exit not frequently used since no services were offered at the exit.
Separating equilibria do in some cases exist.
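For what it's worth, the ruse is easy to formalise as a toy signalling game (all payoff numbers below are made up for illustration): a driver who believes the sign compares the small cost of exiting with the expected cost of being searched at the supposed checkpoint ahead, a cost that only bites if he is carrying something.

```python
def driver_choice(carrying, believed_p_checkpoint_ahead=1.0,
                  exit_cost=1, search_penalty=10):
    """Choice of a driver who believes the warning sign.

    Exiting costs a small detour; driving on risks a search, which is
    only costly (search_penalty) if the driver is carrying contraband.
    All numbers are illustrative, not from any source.
    """
    cost_exit = exit_cost
    cost_drive_on = believed_p_checkpoint_ahead * (search_penalty if carrying else 0)
    return "exit" if cost_exit < cost_drive_on else "drive on"
```

Drivers carrying contraband exit; clean drivers drive on. Because the two types choose different actions, the action itself reveals the type, a separating equilibrium, which is exactly what the sheriff exploits by putting the real checkpoint at the end of the ramp.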

Blame Canada France for the great depression

In an article at VoxEU, Douglas Irwin asks Did France cause the Great Depression? Irwin writes,
A large body of research has linked the gold standard to the severity of the Great Depression. This column argues that while economic historians have focused on the role of tightened US monetary policy, not enough attention has been given to the role of France, whose share of world gold reserves soared from 7% in 1926 to 27% in 1932. It suggests that France’s policies directly account for about half of the 30% deflation experienced in 1930 and 1931.
France was accumulating and sterilising gold reserves at a much more rapid rate than the US. Partly as a result of the undervaluation of the franc in 1926, the Bank of France began to accumulate gold reserves at a rapid rate. France’s share of world gold reserves soared from 7% in 1926 to 27% in 1932.

Irwin continues,
The redistribution of gold put other countries under enormous deflationary pressure. In 1929, 1930, and 1931, the rest of the world lost the equivalent of about 8% of the world’s gold stock, an enormous proportion – 15% – of the rest of the world’s December 1928 reserve holdings. This massive redistribution of gold would not have been a problem for the world economy if the US and France had been monetising the gold inflows. Then the gold inflows would have led to a monetary expansion in those countries, just as the gold outflows from other countries led to a monetary contraction for them. That would have been playing by the “rules of the game” of the classical gold standard. But during the interwar gold standard, there were no agreed-upon rules of the game, and both France and the US were effectively sterilising the inflows to ensure that they did not have an expansionary effect.
Next Irwin asks what was the effect on world prices:
In his 1752 essay “Of Money,” David Hume remarked: “If the coin be locked up in chests, it is the same thing with regard to prices, as if it were annihilated”. So what was the effect of the effective withdrawal of this gold from circulation on the world price level? In recent research (Irwin 2010), I find that a 1% increase in the gold stock increases world prices by 1.5%. Since the US and France effectively withdrew 11% of the world’s gold stock from circulation, this would have led to a fall in world prices of about 16%. From this simple exercise, we can conclude that the Federal Reserve and Bank of France directly account for about half of the 30% deflation experienced in 1930 and 1931 (see Sumner (1991) for a different calculation that is generally consistent with this finding).
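Irwin's back-of-the-envelope arithmetic is easy to verify; the 1.5 elasticity and the 11% withdrawal figure are his, from the passage above.

```python
# Irwin's estimate: a 1% rise in the monetary gold stock raises world
# prices by about 1.5%, so withdrawing gold lowers them symmetrically.
elasticity = 1.5           # % price change per % change in the gold stock
gold_withdrawn = 11.0      # % of the world gold stock sterilised by US + France
observed_deflation = 30.0  # % fall in world prices over 1930-31

implied_price_fall = elasticity * gold_withdrawn            # 16.5%, "about 16%"
share_explained = implied_price_fall / observed_deflation   # 0.55, "about half"
```

So the two central banks' sterilisation alone accounts for roughly 16 of the 30 percentage points of deflation, which is where the "about half" in the quoted passage comes from.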

Of course, once the deflationary spiral began, other factors began to reinforce it. The most important factor was that growing insolvency (due to debt-deflation problems identified by Irving Fisher) contributed to bank failures, which in turn led to a reduction in the money multiplier as the currency to deposit ratio increased. However, these endogenous responses cannot be considered as independent of the initial deflationary impulse, and therefore US and French policies can be held indirectly responsible for at least some portion of the remaining “unexplained” part of the price decline.
The conclusion?
In sum, economic historians have traditionally focused on the tightening of US monetary policy as the origin of the Great Depression. These findings suggest that the French contribution to the worldwide deflationary spiral deserves much greater prominence than it has thus far received.

Political parties and the theory of the firm

MacDoctor has this to say on the current problems in the ACT party:
It was utterly inevitable that the schizophrenic nature of ACT (Law and Order vs Economic reform) would lead to it imploding, and, sure enough, it did. Garrett’s demise merely underlines the fact that Rodney Hide lost his way years ago, undoubtably out of his depth trying to run a party with such diverse people. He could have managed a re-emergence of ACT into the party of economic reform, but instead, in true zombie fashion, was attracted more to the semblance of life with the bright, shiny and noisy populist law and order plan.
And MacD has a point. One of the most basic things the theory of the firm has to say about ownership of a firm is that it will be given to a homogeneous group. It doesn't matter who that group is: investors, workers, suppliers, or consumers, as long as they have a homogeneity of interest. As an aside, this helps explain why so many firms are in fact investor owned: investors are a group with a (largely) common interest, maximising profits. Thinking of the firm as a political institution, when ownership is widely held some form of voting mechanism must be used for decision making. But political representation performs poorly, relative to markets, when there is any significant conflict of interest among the participants.

This is the lesson Rodney Hide needs to take on board. Trying to run a party with the diverse interests of the current ACT party, law and order versus economic reform in particular, simply will not work. ACT needs homogeneity of interest: it can go down either the law and order path or the economic reform path, but trying to get the conservative law and order group to mesh with a more liberal (in the classical sense) economic reform group just isn't going to work.

"Ownership" of ACT need to be clearly determined. ACT needs to workout what it really stands for or it will just continue on its current road to oblivion.

Sunday, 19 September 2010

Mario Rizzo on The Second Austrian Moment

Mario Rizzo over at ThinkMarkets writes on The Second Austrian Moment. He says
Now we see the unsustainability of the current entitlements built on the New Deal “principle.” And then we see a government creating a large new one in the midst of the crisis. We are told that Obamacare will save money. Like all of the other entitlements?

We also see the folly of many of the New Deal institutions like Fannie Mae and later Freddie Mac. We see their role in the housing bubble.

Now we see the folly of monetary and fiscal policies based on temporary expedients. Economic agents cannot rationally plan when the role of the state is so uncertain and so liable to come up with arbitrary policy interventions, as in the recent bailouts. In many ways, the government told us that the ordinary laws of economics and classical wisdom about sound policy have been temporarily – but indefinitely – suspended.
New Zealand is lucky that in the face of the financial crisis it didn't go as far down the government intervention path as many countries like the US, but that is not to say we haven't gone too far. One thing that has long been a problem in New Zealand is the arbitrary nature of policy interventions: remember Muldoon's mini-budgets, or the much more recent Christchurch Reconstruction Act? As Robert Higgs has pointed out, private investment will not take place in an environment where investors are unsure whether they will receive a return on their investments. Government interventions such as those above make investors unsure.

With regard to the response of Austrian economists to the current problems Rizzo writes,
Yet there is a critical deficiency. We continue to lack empirical work, on a large enough scale, to convince other economists that we have something relevant to say. The macro-economic framework has created a demand and supply for certain kinds of aggregated data at the expense of data that might be more useful to Austrians. (But I am reminded that George Stigler used to say, “It is no excuse to say the data are not available – you just must be clever.”)

This is where, perhaps, those non-Austrians with a similar mindset may be very important. We need good empirical researchers. I am, quite frankly, not interested in reviewing all of the qualms about certain kinds of econometric work. No single econometric result is definitive but little by little a case for taking a theory seriously can be built.
A call for empirical work, and for work with non-Austrians, will not go down well with some in the Austrian camp, but I think it is a good call on Rizzo's part. There are many economists out there who may not be Austrian, for any number of reasons, but who still share many of the Austrians' concerns about economic policy. Building bridges between such groups can only help the Austrians get their message across.

Rizzo ends by saying
With a little bit of luck, lots of hard work, and a smart sense of making intellectual alliances, we can do better than Ludwig von Mises and Friedrich Hayek did during the Great Depression and its aftermath. We have their legacy as well as the new legions to make the case.

Friday, 17 September 2010

Colander on economics (updated)

The following is from the Coordination Problem blog. It is David Colander's view of the state of economics and how to fix it. There are many interesting ideas in it; I'm sure economists will find much to agree with and to disagree with in what Colander has to say.
Written Testimony of David Colander
Submitted to the Congress of the United States, House Science and Technology Committee
July 20th, 2010

Mr. Chairman and members of the committee: I thank you for the opportunity to testify. My name is David Colander. I am the Christian A. Johnson Distinguished Professor of Economics at Middlebury College. I have written or edited over forty books, including a top-selling principles of economics textbook, and 150 articles on various aspects of economics. I was invited to speak because I am an economist watcher who has written extensively on the economics profession and its foibles, and specifically, how those foibles played a role in economists’ failure to adequately warn society about the recent financial crisis. I have been asked to expand on a couple of proposals I made for NSF in a hearing a year and a half ago.

I’m known in the economics profession as the Economics Court Jester because I am the person who says what everyone knows, but which everyone in polite company knows better than to say. As the court jester, I see it as appropriate to start my testimony with a variation of a well-known joke. It begins with a Congressman walking home late at night; he notices an economist searching under a lamppost for his keys. Recognizing that the economist is a potential voter, he stops to help. After searching a while without luck he asks the economist where he lost his keys. The economist points far off into the dark abyss. The Congressman asks, incredulously, “Then why the heck are you searching here?” To which the economist responds—“This is where the light is.”

Critics of economists like this joke because it nicely captures economic theorists’ tendency to be, what critics consider, overly mathematical and technical in their research. Searching where the light is (letting available analytic technology guide one’s technical research), on the surface, is clearly a stupid strategy; the obvious place to search is where you lost the keys.

That, in my view, is the wrong lesson to take from this joke. I would argue that for pure scientific economic research, the “searching where the light is” strategy is far from stupid. The reason is that the subject matter of social science is highly complex—arguably far more complex than the subject matter of most natural sciences. It is as if the social science policy keys are lost in the equivalent of almost total darkness, and you have no idea where in the darkness you lost them. In such a situation, where else but in the light can you reasonably search in a scientific way?

What is stupid, however, is if the scientist thinks he is going to find the keys under the lamppost. Searching where the light is only makes good sense if the goal of the search is not to find the keys, but rather to understand the topography of the illuminated land, and how that lighted topography relates to the topography in the dark where the keys are lost. In the long run, such knowledge is extraordinarily helpful in the practical search for the keys out in the dark, but it is only helpful where the topography that the people find when they search in the dark matches the topography of the lighted area being studied.

What I’m arguing is that it is most useful to think of the search for the social science policy keys as a two-part search, each of which requires a quite different set of skills and knowledge set. Pure scientific research—the type of research the NSF is currently designed to support—ideally involves searches of the entire illuminated domain, even those regions only dimly lit. It should also involve building new lamps and lampposts to expand the topography that one can formally search. This is pure research; it is highly technical; it incorporates the latest advances in mathematical and statistical technology. Put simply, it is rocket (social) science that is concerned with understanding for the sake of understanding. Trying to draw direct practical policy conclusions from models developed in this theoretical search is generally a distraction to scientific searchers.

The policy search is a search in the dark, where one thinks one has lost the keys. This policy search requires a practical sense of real-world institutions, a comprehensive knowledge of past literature, familiarity with history, and a well-tuned sense of nuance. While this search requires a knowledge of what the cutting edge scientific research is telling researchers about illuminated topography, the knowledge required is a consumer’s knowledge of that research, not a producer’s knowledge.

How Economists Failed Society
In my testimony last year, I argued that the economics profession failed society in the recent financial crisis in two ways. First, it failed society because it over-researched a particular version of the dynamic stochastic general equilibrium (DSGE) model that happened to have a tractable formal solution, whereas more realistic models that incorporated purposeful forward looking agents were formally unsolvable. That tractable DSGE model attracted macro economists as a light attracts moths. Almost all mainstream macroeconomic researchers were searching the same lighted area. While the initial idea was neat, and an advance, much of the later research was essentially dotting i's and crossing t's of that original DSGE macro model. What that meant was that macroeconomists were not imaginatively exploring the multitude of complex models that could have, and should have, been explored. Far too small a topography of the illuminated area was studied, and far too little focus was given to whether the topography of the model matched the topography of the real world problems.

What macroeconomic scientific researchers more appropriately could have been working on is a multiple set of models that incorporated purposeful forward looking agents. This would have included models with multiple equilibria, high level agent interdependence, varying degrees of information processing capacity, true uncertainty rather than risk, and non-linear dynamics, all of which seem intuitively central in macroeconomic issues, and which we have the analytical tools to begin dealing with. Combined, these models would have revealed that complex models are just that—complex, and just about anything could happen in the macro-economy. This knowledge that just about anything could happen in various models would have warned society to be prepared for possible crises, and suggested that society should develop a strategy and triage policies to deal with possible crises. In other words, it would have revealed that, at best, the DSGE models were of only limited direct policy relevance, since by changing the assumptions of the model slightly, one would change the policy recommendation of the model. The economics profession didn’t warn society about the limitations of its DSGE models.

The second way in which the economics profession failed society was by letting policy makers believe, and sometimes assuring policy makers, that the topography of the real-world matched the topography of the highly simplified DSGE models, even though it was obvious to anyone with a modicum of institutional knowledge and educated common sense that the topography of the DSGE model and the topography of the real-world macro economy generally were no way near a close match. Telling policy makers that existing DSGE models could guide policy makers in their search in the dark was equivalent to telling someone that studying tic-tac toe models can guide him or her in playing 20th dimensional chess. Too strong reliance by policy makers on DSGE models and reasoning led those policy makers searching out there in the dark to think that they could crawl in the dark without concern, only to discover there was a cliff there that they fell off, pulling the US economy with it.

Economists aren’t stupid, and the macro economists working on DSGE models are among the brightest. What then accounts for these really bright people continuing working on simple versions of the DSGE model, and implying to policy makers that these simple versions were useful policy models? The answer goes back to the lamppost joke. If the economist had answered honestly, he would have explained that he was searching for the keys in one place under the lamppost because that is where the research money was. In order to get funding, he or she had to appear to be looking for the keys in his or her research. Funders of economic research wanted policy answers from the models, not wild abstract research that concluded with the statement that their model has little to no direct implications for policy.

Classical economists, and followers of Classical economic methodology, which included economists up through Lionel Robbins (See Colander, 2009), maintained a strict separation between pure scientific research, which was designed to be as objective as possible, and which developed theorems and facts, and applied policy research, which involved integrating the models developed in science to real world issues. That separation helped keep economists in their role as scientific economists out of policy.

It did not prevent them from talking about, or taking positions on, policy. It simply required them to make it clear that, when they did so, they were not speaking with the certitude of economic science, but rather in their role as an economic statesman. The reason this distinction is important is that being a good scientist does not necessarily make one a good statesman. Being an economic statesman requires a different set of skills than being an economic scientist. An economic statesman needs a well-tuned educated common sense. He or she should be able to subject the results of models to a “sensibility test” that relates the topography illuminated by the model to the topography of the real world. Some scientific researchers made good statesmen; they had the expertise and training to be great policy statesmen as well as great scientists. John Maynard Keynes, Friedrich Hayek, and Paul Samuelson come to mind. Others did not; Abba Lerner and Gerard Debreu come to mind.

The need to separate out policy from scientific research in social science is due to the complexity of economic policy problems. Once one allows for all the complexities of interaction of forward looking purposeful agents and the paucity of data to choose among models, it is impossible to avoid judgments when relating models to policy. Unfortunately, what Lionel Robbins said in the 1920s remains true today, “What precision economists can claim at this stage is largely a sham precision. In the present state of knowledge, the man who can claim for economic science much exactitude is a quack.”

Why Economists Failed Society
One of J.M. Keynes’s most famous quotes, which economists like to repeat, highlights the power of academic economists. He writes, “the ideas of economists and political philosophers, both when they are right and when they are wrong, are more powerful than is commonly understood. Indeed, the world is ruled by little else. Practical men, who believe themselves to be quite exempt from any intellectual influences, are usually the slaves of some defunct economist. Madmen in authority, who hear voices in the air, are distilling their frenzy from some academic scribbler of a few years back.” (Keynes, 1936: 135) What this quotation misses is the circularity of the idea generating process. The ideas of economists and political philosophers do not appear out of nowhere. Ideas that succeed are those that develop in the then existing institutional structure. The reality is that academic economists, who believe themselves quite exempt from any practical influence, are in fact guided by an incentive structure created by some now defunct politicians and administrators.

Bringing the issue home to this committee, what I am saying is that you will become the defunct politicians and administrators of the future. Your role in guiding research is pivotal in the future of science and society. So, when economists fail, it means that your predecessors have failed. What I mean by this is that when, over drinks, I have pushed macroeconomic researchers on why they focused on the DSGE model, and why they implied, or at least allowed others to believe, that it had policy relevance beyond what could reasonably be given to it, they responded that that was what they believed the National Science Foundation, and other research support providers, wanted. That view of what funding agencies wanted fits my sense of the macroeconomic research funding environment of the last thirty years. During that time the NSF and other research funding institutions strongly supported DSGE research, and were far less likely to fund alternative macroeconomic research. The process became self-fulfilling, and ultimately, all macro researchers knew that to get funding you needed to accept the DSGE modeling approach, and draw policy conclusions from that DSGE model in your research. Ultimately, successful researchers follow the money and provide what funders want, even if those funders want the impossible. If you told funders it is impossible, you did not stay in the research game.

One would think that competition in ideas would lead to the stronger ideas winning out. Unfortunately, because the macroeconomy is so complex, macro theory is, of necessity, highly speculative, and it is almost impossible to tell a priori what the strongest ideas are. The macroeconomics profession is just too small and too oligopolistic to have workable competition among supporters of a wide variety of ideas and alternative models. Most top researchers are located at a small number of interrelated and inbred schools. This highly oligopolistic nature of the scientific economics profession tends to reinforce one approach rather than foster an environment in which a variety of approaches can flourish. When scientific models are judged by their current policy relevance, if a model seems temporarily to be matching what policy makers are finding in the dark, it can become built in and its premature adoption as “the model” can preclude the study of other models. That is what happened with what economists called the “great moderation” and the premature acceptance of the DSGE model.

Most researchers, if pushed, fully recognize the limitations of formal models for policy. But more and more macroeconomists are willing to draw strong policy conclusions from their DSGE model, and hold them regardless of what the empirical evidence and common sense might tell them. Some of the most outspoken advocates of this approach are Varadarajan Chari, Patrick Kehoe and Ellen McGrattan. They admit that the DSGE model does not fit the data, but state that a model neither “can nor should fit most aspects of the data” (Chari, Kehoe and McGrattan, 2009, p. 243). Despite their agreement that their model does not fit the data, they are willing to draw strong policy implications from it. For example, they write “discretionary policy making has only costs and no benefits, so that if government policymakers can be made to commit to a policy rule, society should make them do so.”

While they qualify this strong conclusion slightly later on, and agree that unforeseen events should allow breaking of the rule, they provide no method of deciding what qualifies as an unforeseen event, nor do they explain how the possibility of unforeseen events might have affected the agents’ decisions in their DSGE model, and hence affected the conclusions of their model. Specifying how agents react to unexpected events in uncertain environments where true uncertainty, not just risk, exists is hard. It requires what Robert Shiller and George Akerlof call an animal spirits model; the DSGE model does not deal with animal spirits.

Let’s say that the US had followed their policy advice against any discretionary policy, and had set a specific monetary policy rule that had not taken into account the possibility of financial collapse. That fixed rule could have totally tied the hands of the Fed, and the US economy today would likely be in a depression.

Relating this discussion back to the initial searching in the light metaphor, the really difficult problem is not developing models; the really difficult policy problem is relating models to real world events. The DSGE model is most appropriate for a relatively smooth terrain. When the terrain out in the dark where policy actually is done is full of mountains and cliffs, relying on the DSGE model to guide policy, even if that DSGE model has been massaged to make it seem to fit the terrain, can lead us off a cliff, as it did in the recent crisis. My point is a simple one: Models can, and should, be used in policy, but they should be used with judgment and common sense.

DSGE supporters’ primary argument for using the DSGE model over all other models is based on their model having what they call micro foundations. As we discuss in Colander, et al. (2008), what they call micro foundations are totally ad hoc micro foundations. As almost all scientists, except macroeconomic scientists, fully recognize, when dealing with complex systems such as the economy, macro behavior cannot be derived from a consideration of the behavior of the components taken in isolation. Interaction matters, and unless one has a model that captures the full range of agent interaction, with full inter-agent feedbacks, one does not have an acceptable micro foundation for a macro model. Economists are now working on gaining insight into such interactive micro foundations using computer-generated agent-based models. These agent-based models can come to quite different conclusions about policy than DSGE models, which calls into question any policy conclusion coming from DSGE models that do not account for agent interaction.

If one gives up the purely aesthetic micro foundations argument for DSGE models, the conclusion one arrives at is that none of the DSGE models are ready to be used directly in policy making. The reality is that given the complexity of the economy and lack of formal statistical evidence leading us to conclude that any particular model is definitely best on empirical grounds, policy must remain a matter of judgment about which reasonable economists may disagree.

How the Economics Profession Can Do Better
I believe the macroeconomics profession has arrived at its current situation because of serious structural problems in the economics profession and in the incentives that researchers face. The current incentives facing young economic researchers lead them both to focus on abstract models that downplay the complexity of the economy and to overemphasize the direct policy implications of those abstract models.

The reason I am testifying today is that I believe the NSF can take the lead in changing this current institutional incentive structure by implementing two structural changes in the NSF program funding economics. These structural changes would provide economists with more appropriate incentives, and I will end my testimony by outlining those proposals.

Include a wider range of peers in peer review
The first structural change is a proposal to make diversity of the reviewer pool an explicit goal of the reviewing process of NSF grants to the social sciences. This would involve consciously including what are often called heterodox and other dissenting economists as part of the peer reviewer pool, as well as including reviewers from outside economics. Along with economists, these reviewer panels for economic proposals might include physicists, mathematicians, statisticians, and individuals with real-world business and governmental experience. Such a broader peer review process would likely encourage research on a much wider range of models, promote more creative work, and provide common-sense feedback from real world researchers about whether the topography of the models matches the topography of the real world the models are designed to illuminate.

Increase the number of researchers trained to interpret models
The second structural change is a proposal to increase the number of researchers explicitly trained in interpreting and relating models to the real world. This can be done by explicitly providing research grants to interpret, rather than develop, models. In a sense, what I am suggesting is an applied science division of the National Science Foundation’s social science component. This division would fund work on the appropriateness of models being developed for the real world.

This applied science division would see applied research as true “applied research,” not as “econometric research.” It would not be highly technical and would involve a quite different set of skills from those currently required by standard scientific research. It would require researchers who had a solid consumer’s knowledge of economic theory and econometrics, but not necessarily a producer’s knowledge. In addition, it would require a knowledge of institutions, methodology, previous literature, and a sensibility about how the system works—a sensibility that would likely have been gained from discussions with real-world practitioners, or better yet, from having actually worked in the area.

The skills involved in interpreting models are skills that currently are not taught in graduate economics programs, but they are the skills that underlie judgment and common sense. By providing NSF grants for this interpretative work, the NSF would encourage the development of a group of economists who specialize in interpreting models and applying them to the real world. The development of such a group would go a long way towards placing the necessary warning labels on models, making it less likely that fiascos such as the recent financial crisis would happen again.
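To see concretely why Colander's point about interaction matters, here is a toy agent-based sketch. The model, the parameters, and every number in it are invented for illustration and stand in for no published model: agents follow identical, trivially simple spending rules, yet the aggregate becomes several times more volatile once they imitate one another.

```python
import random

def simulate(n_agents=500, n_steps=200, herding=0.0, seed=42):
    """Toy binary-choice economy: each period every agent either spends (True)
    or saves (False). With herding > 0, an agent's spending probability is
    pulled toward the fraction of agents who spent last period -- a crude
    stand-in for inter-agent feedback. With herding = 0, agents decide
    independently, as a representative-agent story would have it."""
    rng = random.Random(seed)
    state = [rng.random() < 0.5 for _ in range(n_agents)]
    path = []
    for _ in range(n_steps):
        spending_frac = sum(state) / n_agents
        p = (1 - herding) * 0.5 + herding * spending_frac
        state = [rng.random() < p for _ in range(n_agents)]
        path.append(sum(state) / n_agents)
    return path

def variance(xs):
    mean = sum(xs) / len(xs)
    return sum((x - mean) ** 2 for x in xs) / len(xs)

independent = simulate(herding=0.0)
imitative = simulate(herding=0.9)
# Identical individual rules, but aggregate "demand" is several times more
# volatile once agents respond to one another.
```

The single `herding` knob is the point of the design: at zero the model collapses to independent coin flips, so any extra aggregate volatility is attributable purely to the inter-agent feedback that DSGE-style aggregation assumes away.
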
Update: Tim Worstall sums Colander up this way:
One way of reading it is that as a source of policy advice macroeconomics is bunkum: and will remain bunkum until it’s based on solid microeconomic foundations.

Yes, this does mean neo-classical economics wins: for that’s pretty much what micro is.

Thursday, 16 September 2010

The effects of price controls

In light of the excellent piece on "Price Gouging" by Eric Crampton, of Offsetting Behaviour fame, in this morning's Press here in Christchurch, there was much discussion over lunch about the effects of price controls. Seamus Hogan, also of Offsetting Behaviour, gave the following wonderful example:
A classic example of price controls making a bad situation much worse occurred in 1584-85, when Spanish forces under the Duke of Parma besieged the port city of Antwerp on land and gradually blockaded it from the sea as well. As food became scarce, the city fathers imposed price controls. While food was plentiful elsewhere, and merchants could have delivered vast quantities of supplies before the Spanish tightened their blockade, relief never came. The reason:

Antwerp's price controls meant that merchants would get only the same price for their goods in Antwerp as they would get for selling them elsewhere at a much lower cost and risk. Naturally, the merchants sold elsewhere. At the same time, the artificially low prices set by the city government discouraged the citizens from limiting their consumption of scarce foodstuffs. The result: The population continued to eat heartily as if there was no shortage until the food ran out and they were forced to surrender. In the words of one historian, "the city, by its own stupidity, blockaded itself, far more effectively than the Duke of Parma could have done." Schuettinger and Butler, op. cit., p. 33.
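Seamus's Antwerp story is the textbook price-ceiling mechanism, and it is simple enough to put in a few lines. The linear demand and supply curves below are invented purely for illustration:

```python
def quantity_demanded(price):
    return 100 - 2 * price   # hypothetical linear demand

def quantity_supplied(price):
    return 4 * price - 20    # hypothetical linear supply

# The market clears where the curves cross: 100 - 2p = 4p - 20, so p* = 20
# and 60 units trade.
market_price = 20

# A "fair price" ceiling set below p* inflates demand and chokes off supply:
ceiling = 10
shortage = quantity_demanded(ceiling) - quantity_supplied(ceiling)
# quantity_demanded(10) = 80, quantity_supplied(10) = 20: 60 units short
```

At the ceiling, buyers want 80 units while sellers offer only 20. As in Antwerp, the below-market price simultaneously encourages consumption and drives suppliers elsewhere.
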

Wednesday, 15 September 2010

Remember 'Cash for Clunkers'?

Well, it turns out it's a clunker. Who would have guessed?

The abstract for a new paper, The Effects of Fiscal Stimulus: Evidence from the 2009 'Cash for Clunkers' Program by Atif Mian and Amir Sufi, on the 'Cash for Clunkers' program reads:
A key rationale for fiscal stimulus is to boost consumption when aggregate demand is perceived to be inefficiently low. We examine the ability of the government to increase consumption by evaluating the impact of the 2009 “Cash for Clunkers” program on short and medium run auto purchases. Our empirical strategy exploits variation across U.S. cities in ex-ante exposure to the program as measured by the number of “clunkers” in the city as of the summer of 2008. We find that the program induced the purchase of an additional 360,000 cars in July and August of 2009. However, almost all of the additional purchases under the program were pulled forward from the very near future; the effect of the program on auto purchases is almost completely reversed by as early as March 2010 – only seven months after the program ended. The effect of the program on auto purchases was significantly more short-lived than previously suggested. We also find no evidence of an effect on employment, house prices, or household default rates in cities with higher exposure to the program.

Up and running

My first posting from my office for a while. I'm pleased to say very little in the way of damage, just a few papers and books over the floor. Computer is ok, obviously. Now to catch up on all the work I haven't done for the best part of two weeks. :-(

Tuesday, 14 September 2010

The importance of being an optimist

According to new research, The Importance of Being an Optimist: Evidence from Labor Markets by Ron Kaniel, Cade Massey and David T. Robinson, being an optimist is good for your job prospects. The abstract reads:
Dispositional optimism is a personality trait associated with individuals who believe, either rightly or wrongly, that in general good things tend to happen to them more often than bad things. Using a novel longitudinal data set that tracks the job search performance of MBA students, we show that dispositional optimists experience significantly better job search outcomes than pessimists with similar skills. During the job search process, they spend less effort searching and are offered jobs more quickly. They are choosier and are more likely to be promoted than others. Although we find optimists are more charismatic and are perceived by others to be more likely to succeed, these factors alone do not explain away the findings. Most of the effect of optimism on economic outcomes stems from the part that is not readily observed by one's peers.
So you didn't get that great job, not because you are a moron, but because you are a pessimist. This explains much!

EconTalk this week

Author Alain de Botton talks with EconTalk host Russ Roberts about his latest book, The Pleasures and Sorrows of Work. How has the nature of work changed with the increase in specialization? Why is the search for meaningful work a modern phenomenon? Has the change in the workplace changed parenting? Why does technology become invisible? These are some of the questions discussed by de Botton in a wide-ranging discussion of the modern workplace and the modern worker.

Sunday, 12 September 2010

Trust me, I'm an expert

Arnold Kling writes at Cato on The Era of Expert Failure. He notes that the growth in the use of "experts" in the Obama administration is striking.
However, equally striking is the failure of such experts. They failed to prevent the financial crisis, they failed to stimulate the economy to create jobs, they have failed in Massachusetts to hold down the cost of health care, and sometimes they have failed to prevent terrorist attacks that instead had to be thwarted by ordinary civilians.
and the important point is that,
Ironically, whenever government experts fail, their instinctive reaction is to ask for more power and more resources. Instead, we need to step back and recognize that what we are seeing is not the vindication of Keynes, but the vindication of Hayek. That is, decentralized knowledge is becoming increasingly important, and that in turn makes centralized power increasingly anomalous.
Growth in what is sometimes called the "knowledge economy" is making the life of the government expert difficult to the point of impossible. The philosopher H. B. Acton wrote back in 1971,
[b]ut the range of scientific discovery and technological invention is enormous, and as specialisation increases, it becomes more difficult for any man or even committee to know what is afoot everywhere. Even if planning a whole economy were a valid concept (in fact it is a confused one), and even if it were a feasible economic exercise (and this may well be doubted), the planners would still be faced with the paradox that the more successfully science and technology are pursued the more uncontrollable they become and the more social surprises they will give rise to. Scientists and technologists make the central planner's task impossible.
As the knowledge economy grows, we each know more about less; knowledge is becoming increasingly specialised and more dispersed, and markets are better at dealing with specialised, dispersed knowledge than government experts are. As Acton put it, as knowledge develops at an ever-increasing rate and thus becomes more specialised,
it becomes more difficult for any man or even committee to know what is afoot everywhere.
And thus experts, no matter how expert they are, cannot keep up.

This is not to say experts are not useful; they are, and we depend on them daily. Every time we use the services of an accountant, a lawyer, a doctor or a dentist, we show faith in their expertise. Kling notes that,
In fact, I would say that our dependence on experts has never been greater. It might seem romantic to live without experts and instead to rely solely on your own instinct and know-how, but such a life would be primitive.
The problem is when we add expertise to power. Kling argues,
First, it creates a problem for democratic governance. The elected officials who are accountable to voters lack the competence to make well-informed decisions. And, the experts to whom legislators cede authority are unelected. The citizens who are affected by the decisions of these experts have no input into their selection, evaluation, or removal.

A second problem with linking expertise to power is that it diminishes the diversity and competitive pressure faced by the experts.

A key difference between experts in the private sector and experts in the government sector is that the latter have monopoly power, ultimately backed by force. The power of government experts is concentrated and unchecked (or at best checked very poorly), whereas the power of experts in the private sector is constrained by competition and checked by choice. Private organizations have to satisfy the needs of their constituents in order to survive. Ultimately, private experts have to respect the dignity of the individual, because the individual has the freedom to ignore the expert.
Kling ends by saying,
To summarize: We live in an increasingly complex world. We depend on experts more than ever. Yet experts are prone to failure, and there are no perfect experts.

Given the complexity of the world, it is tempting to combine expertise with power, by having government delegate power to experts. However, concentration of power makes our society more brittle, because the mistakes made by government experts propagate widely and are difficult to correct.

It is unlikely that we will be able to greatly improve the quality of government experts.

Instead, if we wish to reduce the knowledge-power discrepancy, we need to be willing to allow private-sector experts to grope toward solutions to problems, rather than place unwarranted faith in experts backed by the power of the state.

Islam, institutions and economic development

In this audio, Eric Chaney of Harvard University talks to Romesh Vaitilingam about his research on the evolution of institutions in the Islamic world and the relationship with economic development. Among other things, they discuss the rise and fall of Muslim science; and the balance of power between ‘church’ and ‘state’ in times of catastrophe.

Friday, 10 September 2010

Differing views on law and economics

Peter Klein writes at the Organizations and Markets blog:
Via Josh Wright, here’s an announcement for the Levy Fellowship at George Mason University School of Law. It’s a program to support PhD economists (and ABDs) pursuing law degrees. These days, a JD and a PhD are pretty much required for an academic post at a good law school, so check it out if you’re interested in teaching. After all, the world clearly needs more economists and more lawyers. . . .
Well, both a law degree and an economics degree may be pretty much required for an academic post in a good law school in the U.S., but not, it seems, here in New Zealand. In fact I have yet to see any real interest shown by any law school in New Zealand in law and economics. The only law and economics courses I know of are in economics departments. ECON327, for example, is a third-year course on the Economic Analysis of Law offered in the economics department here at Canterbury. But very few law students seem to take it. What I don't get is why law schools here are so anti law and economics. After all, as the Klein posting makes clear, it is big business overseas. Why are our guys so far behind the times?

Stating the obvious

It's not a good sign when the government has to intervene to prevent a run on a bank that is already owned by the government.
But Megan McArdle points out that this is what's happening in Ireland. The state-owned Anglo Irish Bank Corp. is to be divided into a government-backed bank that would hold customer deposits and an "asset recovery bank" holding the bank's increasingly bad loans. What's the bet the "asset recovery bank" isn't worth a lot? But it could be sold in whole or part down the road. I wish them luck.

Thursday, 9 September 2010

Armen Alchian is cool

“There is no such thing as macroeconomics.”
Or so William Allen reports Armen Alchian as saying. The quote comes from a new article in Econ Journal Watch entitled A Life among the Econ, Particularly at UCLA by William R. Allen.

Jerry O'Driscoll argues that Alchian didn't deny there were aggregate economic phenomena, only that theory must be microeconomic. O'Driscoll also argues that this was in substance Hayek's view as well.

Makes sense to me.

Bureaucrats are people

In the comments to a previous post BK Drinkwater said:
1) People are often stupid
2) Bureaucrats are the same stupid people, with bad incentives.
Now Gareth Morgan writes in the New Zealand Herald on problems with the Reserve Bank's supervision of the banking system,
But despite a plethora of taskforces, we still haven't paid any attention to the standard of competence that has undermined the Reserve Bank's prudential supervision of the banking system, which time and time again can be shown to have been the largest single piece of policy negligence of the past 20 years.

The officials responsible for it should at least be subject to an inquiry - not as part of a witch hunt but to make them justify (or not) the directives they've made to the banks - the bill for which we as taxpayers now have to fund.
A public self-examination is overdue for Reserve Bank governance of the banks - and we all agree, don't we, that taxpayer-guaranteed banks these days are little more than arms of government lending policies. Bring it on.
There are many questions to be answered about the banking sector and its regulation. Officials from the Reserve Bank and other government departments should be held accountable for their actions. Problems with the Government guarantee scheme and South Canterbury Finance only highlight the need for a "public self-examination".

Tuesday, 7 September 2010

Measuring economic welfare (updated)

John Taylor at the Economics One blog draws our attention to a new measure of economic welfare. The measure combines consumption, leisure, mortality, and even inequality. Interestingly, if not surprisingly, the new measure is positively correlated with GDP per capita. But as Taylor points out there are differences.
For example, income per capita in France is only 70 percent of that in the United States, while the new welfare measure for France is 97 percent of that in the United States. The difference is mainly due to more leisure and less income inequality in France.
One point that I'm sure many people will not like is
[t]he gains and losses of utility from different levels of income inequality are based on the Rawls abstract concept of the veil of ignorance in which each person enters a lottery each year determining what country he or she will live in--one with less or more income inequality.
This idea has been criticized by a number of welfare economists.
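To get a feel for how a lottery-based measure can rank a lower-income country much closer than raw income comparisons suggest, here is a toy version. The functional form and every number below are invented; this is not the actual calibration in the paper Taylor discusses:

```python
import math

def toy_welfare(mean_income, leisure_share, income_spread):
    """Toy 'Rawlsian lottery' flow utility: an agent is dropped at random
    into a high- or low-income life (mean times 1 +/- spread) and values
    log consumption plus log leisure. Entirely illustrative."""
    hi = mean_income * (1 + income_spread)
    lo = mean_income * (1 - income_spread)
    expected_log_consumption = 0.5 * (math.log(hi) + math.log(lo))
    return expected_log_consumption + math.log(leisure_share)

# Made-up calibration: country B has 70% of country A's income, but more
# leisure and a narrower income distribution.
country_a = toy_welfare(mean_income=1.0, leisure_share=0.60, income_spread=0.5)
country_b = toy_welfare(mean_income=0.7, leisure_share=0.70, income_spread=0.2)

income_gap = math.log(1.0) - math.log(0.7)   # ~0.36 log points
welfare_gap = country_a - country_b          # ~0.08 log points
```

With these made-up numbers the welfare gap comes out around 0.08 log points against an income gap of about 0.36: the extra leisure and the narrower income distribution offset most of the income shortfall, which is the flavour of the France comparison Taylor highlights.
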

The last point Taylor makes may be the most important,
Chad and Pete have a whole section on “caveats” in their interesting paper.
but you can bet they will be ignored.

Update: Tim Worstall comments here.