In some ways the Gulf of Mexico oil spill seems like a replay of the subprime lending disaster. Clever technological innovations blew up in a mess that nobody knew how to control, wreaking devastation on those innocently standing by. The actors and the scenes have changed, but you can't shake the feeling you've been through this nightmare before.
Ken Rogoff sees the parallels this way:
The accelerating speed of innovation seems to be outstripping government regulators' capacity to deal with risks, much less anticipate them.
The parallels between the oil spill and the recent financial crisis are all too painful: the promise of innovation, unfathomable complexity, and lack of transparency (scientists estimate that we know only a very small fraction of what goes on at the oceans' depths). Wealthy and politically powerful lobbies put enormous pressure on even the most robust governance structures....
The oil technology story, like the one for exotic financial instruments, was very compelling and seductive. Oil executives bragged that they could drill a couple of kilometers down, then a kilometer across, and hit their target within a few meters.
This rings true to me. New financial instruments and new technologies for extracting oil require changes in regulatory oversight. And this is the kind of adaptation that established bureaucracies often find impossible to implement.
Ed Dolan thinks the common element is gambling with other people's money:
Executive compensation plans that emphasize short-term bonuses, include golden parachutes, and lack clawback provisions are one example. Nor do only top executives face such incentives: mid-level traders, engineers, and analysts may also take risks in the hope of bonuses or promotions, with the expectation that the worst that can happen in case of catastrophe is that they lose their jobs. Stockholders may condone such risk taking because they are protected by limited liability.
Both the Gulf oil spill and the financial crisis had their origins in negatively skewed risks. Investigators in the Gulf disaster are looking at whether BP and its contractors underplayed downside risks when they made technical choices, ignored warning signs, and neglected preparations for dealing with a worst-case spill. In the financial crisis, negatively skewed risks involved excessive leverage, manipulation of ratings, design of complex securities, and several other factors.
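The negative skew Ed describes can be made concrete with a small simulation. This is a hypothetical illustration of my own, not anything from either post, and the payoff numbers are arbitrary assumptions: a strategy that earns a small, steady gain almost every period but occasionally takes a catastrophic loss looks profitable over almost any short window, even when its true expected value is negative.

```python
import random

def simulate(periods, p_disaster=0.01, gain=1.0, loss=-250.0, seed=42):
    """Cumulative payoff of a negatively skewed strategy:
    small steady gains, rare catastrophic loss."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(periods):
        total += loss if rng.random() < p_disaster else gain
    return total

# Expected value per period is 0.01 * (-250) + 0.99 * 1.0 = -1.51, i.e. negative,
# yet a short lucky run of periods can easily show nothing but gains.
print(simulate(50))    # a short window may never hit the disaster branch
print(simulate(5000))  # a long window tends to reveal the negative expectation
```

The trader, engineer, or analyst evaluated on the short window sees only the steady gains; the tail loss lands on someone else, later.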
I agree with Ed that intra-organizational incentives contributed to the problem in both cases, and that in many details of the financial debacle, government policy allowed the firms that created the problems to pass some of the costs on to others. But I am less persuaded that limited liability explains BP's decisions at the corporate level. The company's market value has declined by over $75 billion since April. Here was an entity with more than just skin in the game, and looking more than just flayed at the moment. And yet, the company opted not to invest $500,000 in a secondary acoustic shut-off switch, which is essentially required in Norway and Brazil, and which Royal Dutch Shell and France's Total SA sometimes use even when not required. BP's backup plans B, C, and D all seemed to come out of the playbook for dealing with the 1979 Ixtoc disaster; none of them worked that well there, either. So why did the company take such risks?
I think part of the answer, for both toxic assets and toxic oil, has to do with a kind of groupthink that can take over among the smart folks who are supposed to be evaluating these risks. It's so hard to be the one raising the possibility that real estate prices could decline nationally by 25% when it's never happened before and all the guys who say it won't are making money hand over fist. And this interacts with the forces mentioned above. When the probability of spectacular failure appears remote, and moreover it hasn't happened yet, it's hard to set up incentives, whether you're talking about a corporation or a regulatory body, in which the person who makes sure that the risks stay contained is the person who gets rewarded. When everyone around you starts thinking that nothing can go wrong, it's hard for you not to do the same. It can become awfully lonely in those environments to try to be the voice of prudence.
And yet, prudent judgment is the thing I most desperately wish decision-makers had more of in these times of dazzling new technological capabilities.
Posted by James Hamilton at June 9, 2010 05:09 AM
amen
Posted by: spencer at June 9, 2010 06:08 AM
If "smart folks" are prone to groupthink that blinds them to risk, then what does that say about the Fed? Clearly, they succumbed to some of the same groupthink that afflicted all financial engineers during both the stock and mortgage bubbles. They went shopping for analytical models that justified the bubbles ("productivity" in the case of stocks, the "savings glut" and "great moderation" in the case of mortgages). And they largely shut out opposing views. Reading the transcripts for 2004, one gets the sense that FOMC members would put up mild resistance against the Chairman's pet views, only to finally give in using self-deprecating humor, as if it were a debate over where to go for lunch with a strong-willed colleague.
It seems to me that since the two bubbles, a fringe group -- now larger but still fringe -- believes that aggressive monetary policy in the hands of the Fed is a recipe for large "oil spills". Consensus economists, in contrast, seem reluctant to hold the Fed accountable for its actions. Witness that Tim Geithner gets little blowback for studying derivatives markets for years as head of the NY Fed before concluding that the risks they posed were acceptable. That was his job, and he failed, just as the BP engineer in charge of the shut-off valve system failed. "Whocouldhaveknown?" seems to be an unacceptable defense for BP, but an entirely acceptable one for the Fed.
Posted by: David Pearson at June 9, 2010 06:57 AM
This is a basic problem with human perception and rare events. A person who takes some risk and succeeds a few times comes to believe the risk does not apply to them. For example, car drivers drive in a way that is generally safe, but not safe with respect to events that are rare for any one driver; hence we have tens of thousands of deaths and hundreds of thousands of injuries each year. Large corporations under regulation can do much better, but when the rare event must not happen even once a decade, as with a global financial meltdown, an offshore oil disaster, or a nuclear power plant disaster, human systems have not been shown to work. It is not always possible, but it is best to set up systems that allow occasional acceptable failure, so that there can be learning and correction. In financial systems this means smaller, less integrated systems, so that we do not suffer coordinated global failure.
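The perception problem this comment describes has simple arithmetic behind it. As a back-of-the-envelope sketch, with failure probabilities that are purely illustrative assumptions and not estimates for any real industry, consider how likely an individual is to go an entire career without ever witnessing a rare failure:

```python
# Back-of-the-envelope arithmetic for the "it hasn't happened to me" effect.
# The per-period failure probabilities below are illustrative assumptions only.

def prob_never_seen(p_failure, periods):
    """Probability of observing zero failures in `periods` independent trials."""
    return (1.0 - p_failure) ** periods

# Even at a 1%-per-year failure probability, roughly four out of five
# 20-year careers are failure-free, quietly teaching people that
# "the risk is not there for them."
for p in (0.01, 0.001):
    print(f"p = {p}: chance of a clean 20-year record = {prob_never_seen(p, 20):.3f}")
```

Since most operators personally observe nothing but success, individual experience systematically understates the hazard, which is exactly why these systems need something other than experience to keep them honest.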
Posted by: David Kardatzke at June 9, 2010 08:00 AM
Include lack of integrity and negligence by regulators and, more proximately, by senior management and boards of directors. Sounds a lot like subprime.
Posted by: bp at June 9, 2010 08:22 AM
With the utmost respect, I have to say that "New financial instruments and new technologies for extracting oil require changes in regulatory oversight. And this is the kind of adaptation that established bureaucracies often find impossible to implement" is utter nonsense. The administrations in charge cut off regulation at the knees, and refused to allow their bureaucracies to deal with the problems.
It has very little to do with the lower-level people at those agencies; their imaginations are as good as anyone's. But when a junior employee says to their boss, "You know, that could result in a blowout. Why aren't we inspecting those BOPs?", and the boss says, "We should, but there's no way the appointees would allow it," you have your problem.
Pretty hard to have agencies doing their jobs and proactively regulating problem areas when the agency heads think regulation is ineffective and "the market" will deal with potential problems.
Why didn't BP prevent the problem? Because businesses are constantly looking for efficiencies, and any method of dealing with a problem that hasn't happened or isn't likely looks like an inefficiency. That's why we have regulators: to require those slight inefficiencies (from the bean-counter perspective) that companies would otherwise ignore, but that prevent or mitigate problems.
Posted by: KJMClark at June 9, 2010 08:29 AM
I agree the underlying failure is the same. Self-regulating industries with weak, underfunded oversight will have market participants that go boom, with all the associated collateral damage. That said, I must disagree with Ken Rogoff. While this may be true for derivatives (and I stress may), the issue isn't well served by assuming that regulators are incapable. Go listen to the audio of the meeting where the investment bankers convinced the regulators to allow the banks to increase their leverage. The regulators knew this was a bad idea; the regulators' advisors are very smart and very sophisticated, and they knew it was a bad idea. The regulators simply lacked the power to deny the investment bankers what they wanted. It wasn't that the regulators got out-thought; it was that they couldn't say no even when they knew it was the right answer.
Posted by: Frank Gooodman at June 9, 2010 09:40 AM
I think these events tend to share certain common traits:
- haste
- inexperience
- misinterpretation of data
- failure to listen to subordinates' concerns
- mindset
- a three-standard-deviation event
- equipment failure
Of these, only mindset and a three-standard-deviation event apply to the Fed or subprime mortgages. There was plenty of time for the Fed, and plenty of data. In addition, the Fed and the subprime markets were characterized by principal-agent problems. Investment bankers are interested in, and paid to, consummate a transaction. As a matter of incentives, they don't really care what happens afterwards. Similarly, the Fed didn't want to spoil the housing party, so it failed to act, even though it had plenty of information at senior levels to do so.
For BP, the situation was different. The BP manager on the rig, according to press reports, was under pressure to make progress on an overdue well. I have read that he was primarily a land driller, but I don't know if that's true. Certainly, he appears to have failed to listen to the top TransOcean man on the rig. I'm sure he didn't take the risk seriously (TransOcean did), possibly because there hadn't been a serious rig incident in decades, a mindset problem. Moreover, he, and the rest of us in the industry, placed great faith in the blowout preventer, which failed.
Finally, this is one nasty, nasty well. To quote one worker who survived the blast: "I've seen a lot of gas coming up from different wells, and the highest I've seen in my 11 years is 1,500 units. This well gave us 3,000. I've never been on a well with that high gas coming out of the mud. That was kind of letting me know this well was something to be reckoned with." (Peak Oil News, June 9) I have every expectation this will prove to be a three-standard-deviation event (and I wouldn't be surprised, ironically, if it proves to be one of the most prolific wells in the Gulf).
So those are the factors. I personally don't see a regulatory failure, as there was with, say, the oversight of AIG. Overseeing a drilling operation would be a bit like overseeing a brain surgeon during an operation. You sure you want to second guess him when the clock is ticking at $25k an hour and the patient is on the table?
A better comparison, to my mind, is the KLM/PanAm collision at Tenerife airport in 1977, still the worst air disaster ever. (http://en.wikipedia.org/wiki/Tenerife_airport_disaster) What many don't know is that the stage was set by a bombing at the main airport in Las Palmas, which diverted the incoming aircraft to Tenerife; the KLM and PanAm aircraft weren't even supposed to be there. The immediate cause was KLM's top training pilot, who had to take off because his allowable flight window was about to expire: he misheard the tower, ignored the co-pilot, and took off on a foggy runway. The disaster was ultimately the product of a confluence of factors, not least of which were safety regulations limiting flying time.
Posted by: Steven Kopits at June 9, 2010 12:14 PM
A very useful comparison. And a strong argument for reform of executive compensation in all publicly traded companies.
Posted by: don at June 9, 2010 01:47 PM
If this analogy is correct, the US government has adopted a very different strategy to deal with both scenarios. The government jumped in with both feet in the financial crisis, but prefers to not get involved with the oil spill, instead spending its efforts on looking for "ass to kick."
Posted by: MikeH at June 9, 2010 02:09 PM
I work as an exploitation engineer, and I'm inclined to disagree with Ed Dolan's sentiment that this was due to "gambling with other people's money." There probably aren't a lot of studies on the failure rate of BOPs, and there are probably only a handful of competent specialists in the world who fully understand BOPs and exactly how and why these things fail. Quantifying an extremely low-probability, very high-cost event is a difficult thing to do. I believe the result of this spill will be better regulations that ensure a secondary shutoff is installed. Humans learn by making mistakes; this is the process of engineering and discovery. I think we need to backstop what we have learned from this disaster with better regulations.
Posted by: Jonesie at June 9, 2010 02:50 PM
Richard Thaler talks about this in his almost 20-year-old book "The Winner's Curse": status quo bias, where otherwise smart people feel they have more to lose by sticking their necks out than by just following the status quo.
The solution is to create an isolated and well incentivized group of engineers to act as risk managers.
Maybe we should work aggressively to make these 'auditors' part of the regulatory side instead, and again, properly incentivize them to think outside the box the next time big oil or a big bank starts doing something new.
Maybe a decent way to do this is to implement privatized but publicly funded regulators who receive 'awards' (funded from public dollars) for staying on top of technological innovation in private industry. Make regulation a competitive business, where the most aggressive policy/research gets rewarded the most.
Posted by: Michael Krause at June 9, 2010 03:26 PM
Incentives are the real problem. If you rob a 7-11 store of a hundred bucks you can get 10 years in jail. But if your shortcuts cause the fiery death of 11 people on a drilling rig, you get a multi-million dollar salary. If you kill 25 people in a coal mine by ignoring safety violations, you get a multi-million dollar salary. If you defraud customers of billions of dollars at Goldman Sachs, you get a multi-million dollar bonus.
I think a few more perp walks with CEOs in orange jumpsuits and handcuffs might have a much greater benefit than more regulations.