macroresilience

resilience, not stability

Archive for the ‘Goodhart’s Law’ Category

On The Futility of Banning Proprietary Risk-Taking By Banks


In his interview in Der Spiegel, Paul Volcker argues that banks must not be allowed to take on proprietary risk except for risk incidental to “client activities”. Quoting from the interview:

SPIEGEL: Banking should become boring again?

Volcker: Banking will never be boring. Banking is a risky business. They are going to have plenty of activity. They can do underwriting. They can do securitization. They can do a lot of lending. They can do merger and acquisition advice. They can do investment management. These are all client activities. What I don’t want them doing is piling on top of that risky capital market business. That also leads to conflicts of interest.

This is a more nuanced version of the argument that calls for the reinstatement of the Glass-Steagall Act. But it suffers from two fatal flaws:

  • Regulatory Arbitrage: Separating “client risk” from “proprietary risk” sounds good in theory but is almost impossible to enforce in practice. As I’ve discussed previously, a detailed and fine-tuned regulatory policy will be easy to arbitrage, while a blunt policy will result in a grossly inefficient financial system.
  • Losses on “Client Activities” were the major driver of the current crisis: My analysis of the UBS shareholder report highlighted how the accumulation of super-senior CDO tranches was justified primarily by their perceived importance in facilitating the sale of fee-generating junior tranches to clients. It was the losses on these tranches, issued in the name of facilitating client business, that were at the core of the crisis: they accounted for the majority of the losses on banks’ balance sheets, and losses on insuring them brought down AIG. Segregated proprietary risk is monitored closely by almost all banks. The real villain of the piece was proprietary risk taken on under the cover of facilitating client business, as the stylized sketch below illustrates.
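
As a rough illustration of that asymmetry, here is a stylized Python sketch with purely hypothetical numbers (a $1bn pool, illustrative tranche sizes, fee rates and loss scenarios; none of these figures come from the actual UBS report): the fees earned by distributing the junior tranches are tiny relative to the tail losses that the retained super-senior tranche can suffer.

```python
# Stylized illustration: hypothetical $1bn CDO with illustrative tranche sizes,
# fee rates and loss scenarios. None of these numbers come from the UBS report.

pool_notional = 1_000_000_000  # hypothetical $1bn collateral pool

# Fractions of pool notional per tranche; the junior tranches are sold to
# clients, the super-senior tranche is retained on the bank's balance sheet
tranches = {
    "equity": 0.03,
    "mezzanine": 0.07,
    "senior": 0.15,
    "super_senior": 0.75,
}

fee_rate = 0.01  # hypothetical 1% structuring/placement fee on distributed notional

distributed_notional = sum(
    frac for name, frac in tranches.items() if name != "super_senior"
) * pool_notional
fee_income = distributed_notional * fee_rate

def super_senior_loss(pool_loss_rate):
    """Loss hitting the retained super-senior tranche once the junior
    tranches below it have been wiped out."""
    pool_loss = pool_loss_rate * pool_notional
    junior_cushion = (1 - tranches["super_senior"]) * pool_notional
    return max(0.0, pool_loss - junior_cushion)

print(f"Fee income from distributing junior tranches: ${fee_income:,.0f}")
for loss_rate in (0.05, 0.30, 0.40):
    print(f"Pool loss of {loss_rate:.0%}: "
          f"retained super-senior loss = ${super_senior_loss(loss_rate):,.0f}")
```

In this toy example the bank earns $2.5 million in fees while standing to lose $150 million on the retained tranche in a severe scenario, which is exactly the kind of asymmetric payoff that hides behind the label of “client activity”.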

Written by Ashwin Parameswaran

December 13th, 2009 at 1:36 pm

Regulatory Arbitrage and the Efficiency-Resilience Tradeoff


On the subject of securitization and regulatory arbitrage, Daniel Tarullo notes:

“securitization appears to present a case in which efforts to plug gaps in regulatory coverage are quickly and repeatedly overtaken by innovative arbitraging measures.”

Arnold Kling noted the problem of adaptation of economic agents to changes in the regulatory regime in his paper on the financial crisis:

“The lesson is that financial regulation is not like a math problem, where once you solve it the problem stays solved. Instead, a regulatory regime elicits responses from firms in the private sector. As financial institutions adapt to regulations, they seek to maximize returns within the regulatory constraints. This takes the institutions in the direction of constantly seeking to reduce the regulatory “tax” by pushing to amend rules and by coming up with practices that are within the letter of the rules but contrary to their spirit. This natural process of seeking to maximize profits places any regulatory regime under continual assault, so that over time the regime’s ability to prevent crises degrades.”

Regulatory arbitrage follows from the application of Goodhart’s Law to financial regulation. One of Daniel Tarullo’s key recommendations to counter this arbitrage is the adoption of a “simple leverage ratio requirement”. Such blunt measures reduce efficiency: of course, we can make the system more resilient by insisting on blanket 25% bank capital ratios and banning all bonuses, but this would be a grossly inefficient solution.
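To make the arbitrage concrete, here is a minimal Python sketch with hypothetical balance sheets and illustrative risk weights (stand-ins, not the actual Basel weights or Tarullo’s proposal): repackaging the same exposures into highly rated securitized tranches flatters a risk-weighted capital ratio, while a blunt leverage ratio does not move at all.

```python
# Minimal sketch with hypothetical balance sheets and illustrative risk weights
# (not the actual Basel weights). Shows how a risk-weighted capital rule can be
# gamed while a simple leverage ratio cannot.

def risk_weighted_ratio(capital, assets, risk_weights):
    """Capital divided by risk-weighted assets."""
    rwa = sum(notional * risk_weights[asset] for asset, notional in assets.items())
    return capital / rwa

def leverage_ratio(capital, assets):
    """Capital divided by total (unweighted) assets."""
    return capital / sum(assets.values())

# Illustrative regulatory risk weights
risk_weights = {"corporate_loans": 1.0, "aaa_securitisation": 0.2}

capital = 5.0  # e.g. $5bn of equity capital

# Before "optimisation": a plain corporate loan book
before = {"corporate_loans": 100.0, "aaa_securitisation": 0.0}

# After: the same economic exposure repackaged into AAA-rated tranches
# that happen to attract a much lower regulatory risk weight
after = {"corporate_loans": 20.0, "aaa_securitisation": 80.0}

for label, assets in (("before", before), ("after", after)):
    print(f"{label}: risk-weighted capital ratio = "
          f"{risk_weighted_ratio(capital, assets, risk_weights):.1%}, "
          f"leverage ratio = {leverage_ratio(capital, assets):.1%}")
```

The blunt measure is robust to this kind of gaming precisely because it ignores the risk information that the fine-tuned rule relies on, which is also why it is the less efficient of the two.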

The tradeoff between efficiency and resilience is a constant theme in fields as diverse as corporate risk management, ecosystem management and, in this case, financial regulation.


Written by Ashwin Parameswaran

December 5th, 2009 at 7:19 am