Saturday, February 19, 2011

Finance: Elusive information

By Tom Braithwaite

Published: February 15 2011 22:25



It was Friday August 15 2008 and a senior official at the US Federal Reserve in Washington wrestled with a thorny problem: he wanted to know what was happening inside Lehman Brothers but was afraid to ask.


Pat Parkinson, now the Fed’s top bank supervisor, was trying to find out which companies had derivatives contracts with Lehman as he gauged how severe the impact would be if the investment bank collapsed. But colleagues in New York told him that just requesting the data would be “a huge negative signal” for the bank’s prospects and they were “very reluctant” to do anything that might spook the market.


Lehman’s implosion the following month was not the only recent instance where a calamitous lack of decent data has plagued financial markets. Others range from the European bank stress tests carried out last year, which officials admit relied on information “polluted by accounting”, to the US stock market “flash crash” on May 6 that left the Securities and Exchange Commission floundering for answers.


But for the first time in decades there is a growing movement to rebuild the creaking data architecture that underpins modern finance.


Regulators’ need to understand the crisis provided the impetus. When Lehman fell in September 2008, not only did institutions not know their rivals’ exposure to Lehman, or to other problem areas such as subprime mortgages; they were sometimes unable to map their own with any speed. In the vortex, stock prices collapsed, liquidity dried up and investors and bankers ran scared.


“Everyone’s going, ‘what do I hold that is Lehman?’” says Mike Atkin, head of the Enterprise Data Management Council, a group of banks, information technology companies and regulators. “Wait a minute ... what is Lehman? Lehman isn’t one entity – it’s 10,000 entities. We don’t know what our exposure is because we’re not sure what Lehman is.”
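
The aggregation problem Mr Atkin describes can be made concrete in a few lines of code. A minimal sketch, with invented entity names, hierarchy and figures: totting up “Lehman” exposure only works if every counterparty on the blotter can be walked up a legal-entity hierarchy to its ultimate parent.

```python
# Minimal sketch of exposure aggregation across a legal-entity hierarchy.
# All names, relationships and amounts below are invented for illustration.

# Child entity -> immediate parent.
PARENT = {
    "Lehman Brothers Special Financing": "Lehman Brothers Holdings",
    "Lehman Brothers International (Europe)": "Lehman Brothers Holdings",
    "LB Commercial Paper Inc": "Lehman Brothers Special Financing",
}

def ultimate_parent(entity: str) -> str:
    """Walk up the hierarchy until an entity has no recorded parent."""
    while entity in PARENT:
        entity = PARENT[entity]
    return entity

# Trade blotter: (counterparty as booked, exposure in $m).
trades = [
    ("Lehman Brothers Special Financing", 120.0),
    ("Lehman Brothers International (Europe)", 75.5),
    ("LB Commercial Paper Inc", 30.0),
    ("Deutsche Bank AG", 50.0),
]

exposure: dict[str, float] = {}
for counterparty, amount in trades:
    group = ultimate_parent(counterparty)
    exposure[group] = exposure.get(group, 0.0) + amount

print(exposure)
# {'Lehman Brothers Holdings': 225.5, 'Deutsche Bank AG': 50.0}
```

Without the hierarchy table – the part firms did not have – the loop above cannot even be written, which is Mr Atkin’s point.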


Soon afterwards, John Liechty, a Pennsylvania State University professor, started asking regulators which of them collected data on the financial system as a whole. “I figured somebody would have the data so they could piece together the financial system and work out where the risk is. They said: ‘Nobody’s got it. Some is there and unshared. Other parts are not there at all.’ I said: ‘This is dangerous; it’s crazy.’”


For years, individual statisticians, technology specialists and economists from regulators, financial institutions and academia had warned of the dangers. For years, they were dismissed as Jeremiahs and a root-and-branch reform of the data networks underpinning the financial system was rejected by the industry and regulators, which saw big costs and limited benefits.


John Geanakoplos, a Yale University professor, blames the Fed for not using data well, sometimes because of bureaucratic blockages, in one instance because officials balked at paying $400,000 for mortgage information, and sometimes because of a philosophical belief in self-correcting markets. In a recent presentation to the European Central Bank, he took aim at the “Greenspan-Bernanke doctrine” that “denied that there are bubbles, or that they could recognise one if they saw it”.


In less damning terms, Keith Saxton, IBM’s London-based global director of financial markets, points to the same problem. “Most of the data that the regulators and the central banks collect are what I call quite traditional,” he says. “They have a view of ‘The Bank’ and everyone assumed that because that bank was healthy maybe there wasn’t a problem with the system. It has turned out that the data they had about that bank weren’t granular enough to be accurate: these guys can’t get to the cash flow of an instrument.”


. . .


But across the world, the evangelical geeks are gaining ground. In the US, Jack Reed, the Democratic senator from Rhode Island, took up the cause and managed to slip a new early-warning agency – the Office of Financial Research – into the mammoth Dodd-Frank financial reform bill that passed Congress last summer.


The OFR, which is being incubated in the Treasury before it gains independence, has begun work on standardising the components of every significant financial transaction. Eventually it will collect data, pull it all into a supercomputer and analyse the results, with the ultimate aim of spotting bubbles before they burst.
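
What a standardised transaction record might look like is easy to imagine, even if the OFR’s actual schema was still being designed. A hypothetical sketch, with invented field names and identifier formats, in which counterparties and instruments are referenced by shared identifiers rather than each firm’s internal codes:

```python
# Hypothetical standardised transaction record; field names and identifier
# formats are invented for illustration, not the OFR's actual schema.
from dataclasses import dataclass
from datetime import date

@dataclass(frozen=True)
class StandardTransaction:
    trade_date: date
    buyer_id: str        # shared legal-entity identifier of the buyer
    seller_id: str       # shared legal-entity identifier of the seller
    instrument_id: str   # shared instrument identifier
    notional: float
    currency: str

tx = StandardTransaction(
    trade_date=date(2011, 2, 15),
    buyer_id="LEI-EXAMPLE-0001",
    seller_id="LEI-EXAMPLE-0002",
    instrument_id="CDS-EXAMPLE-5Y",
    notional=10_000_000,
    currency="USD",
)
print(tx)
```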


In a rare moment for Washington, an idea that was not proposed by the executive branch and without a big bloc of business support became law. A loose coalition of advocates managed to persuade one lawmaker, who sold the initiative to his colleagues in the face of scepticism from existing agencies. “Within the government there was a hesitancy to create an agency that was independent,” says Mr Reed. The senator, an army veteran, says he wants the OFR to act like a “red team” – the military term for a group of soldiers charged with probing the weaknesses of their own comrades.


Words of change from previously sceptical bureaucrats are borne out by actions. The Fed does now buy the expensive granular mortgage data that Prof Geanakoplos highlighted. New rules will force more derivatives through clearing houses, allowing Mr Parkinson to grab data from fewer sources without alerting the market.


But the real data zealots think that regulators should go much further: the ultimate prize is a “dashboard” of the whole financial system. While anyone can view a snapshot of the stock market, the OFR and sister agencies in Europe and Asia would have the same visibility over darker parts of finance, allowing them to test various scenarios on Wall Street at the push of a button. The trouble is, at the moment, this is science fiction; it is impossible to create.


“When something starts to go awry we go, ‘what am I holding?’ and I need to know it down to the loan level so I can run it against a scenario,” says Mr Atkin.


“If Ford goes bankrupt what happens to the homes in Detroit? Well, I don’t know how many homes in Detroit I’ve got in my mortgage-backed security because I can’t unravel the bloody thing.”
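
Mr Atkin’s Ford example amounts to a loan-level stress test. A toy sketch, with invented loans and parameters, of why scenario analysis needs the pool decomposed into individual loans rather than left as one opaque security:

```python
# Toy loan-level scenario analysis; all loans and parameters are invented.
from dataclasses import dataclass

@dataclass
class Loan:
    city: str
    balance: float       # outstanding balance, $
    default_prob: float  # baseline probability of default

# A mortgage pool, decomposed to loan level.
pool = [
    Loan("Detroit", 180_000, 0.04),
    Loan("Detroit", 150_000, 0.05),
    Loan("Austin", 220_000, 0.02),
]

def expected_loss(loans, shocked_city=None, multiplier=1.0,
                  loss_given_default=0.4):
    """Expected loss on the pool, optionally stressing one city's defaults."""
    total = 0.0
    for loan in loans:
        p = loan.default_prob
        if loan.city == shocked_city:
            p = min(1.0, p * multiplier)
        total += p * loss_given_default * loan.balance
    return total

base = expected_loss(pool)
stressed = expected_loss(pool, shocked_city="Detroit", multiplier=3.0)
print(f"Detroit shock adds ${stressed - base:,.0f} of expected loss")
```

If the security cannot be unravelled to the loan level, the `city` field simply does not exist and the scenario cannot be run.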


Francis Gross, head of external statistics at the Frankfurt-based ECB, points to the same issue: “The basic assumption is you have these vast pools of data that are quite homogeneous so that a spade is a spade – and that’s where the problem comes.”


The reason is that the building blocks of so-called “reference data” are not standard. As banks have grown by acquisition they have acquired thousands of legacy systems spread across different businesses – they do not have standard ways of recording data even within the group. So plotting relationships across the financial system is almost impossible. “The trading desk may book a transaction as Deutsche Bank and code that as ‘DB’,” says Lew Alexander, the Treasury official in charge of setting up the OFR. “When it gets to accounting, ‘DB’ may mean Dresdner Bank.”
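
A hedged sketch of the ambiguity Mr Alexander describes, with invented code tables: the same local code resolves to different firms in different systems, so records cannot be joined until every system’s codes are mapped to one canonical identifier.

```python
# Invented per-system code tables illustrating the "DB" ambiguity.
TRADING_DESK = {"DB": "DEUTSCHE BANK AG"}
ACCOUNTING = {"DB": "DRESDNER BANK AG"}

def canonical(system_map: dict, local_code: str) -> str:
    """Resolve a system-local code to a canonical entity name."""
    if local_code not in system_map:
        raise ValueError(f"unmapped code {local_code!r}: cannot reconcile")
    return system_map[local_code]

# The same raw code means two different counterparties:
print(canonical(TRADING_DESK, "DB"))  # DEUTSCHE BANK AG
print(canonical(ACCOUNTING, "DB"))    # DRESDNER BANK AG
```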


. . .


It is these non-standard “identifiers”, for companies and for instruments, that Mr Gross and Mr Alexander are now trying to reconcile on both sides of the Atlantic. Unravelling garbled transactions costs the industry hundreds of millions of dollars a year in people and IT but during the boom years it always seemed too fiddly to fix.


One banker says: “We do a transaction and we get a lot of breaks and reconciliations ... Generally over 90 per cent of them are due to inconsistency of reference data, not due to misunderstanding between parties. If we all had the same reference data we would revolutionise how operations are done in our firms.” He adds: “Logistically it’s impossible to do unless it’s imposed.”
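
The breaks the banker describes are easy to reproduce in miniature. An invented example: two firms record the same economic trade, but their counterparty codes differ, so a field-by-field match fails even though nothing about the deal is in dispute.

```python
# Invented records of one trade as booked by each side of the deal.
ours = {"counterparty": "DB", "notional": 10_000_000, "currency": "USD"}
theirs = {"counterparty": "DEUT", "notional": 10_000_000, "currency": "USD"}

def matches(a: dict, b: dict) -> bool:
    return all(a[k] == b[k] for k in ("counterparty", "notional", "currency"))

print(matches(ours, theirs))  # False: a reconciliation "break"

# With shared reference data mapping both codes to one (invented)
# identifier, the same comparison succeeds.
SHARED = {"DB": "LEI-DEUTSCHE-BANK", "DEUT": "LEI-DEUTSCHE-BANK"}
ours["counterparty"] = SHARED[ours["counterparty"]]
theirs["counterparty"] = SHARED[theirs["counterparty"]]
print(matches(ours, theirs))  # True
```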


Mr Gross, who is leading calls for international standards of reference data, says regulators must be in the driving seat or it will be “ask[ing] cats to herd themselves”.


The US now has the structure to start work, although even after Mr Reed managed to get the OFR through Congress, a handful of powerful Republicans continue to oppose it. Karl Rove, an adviser to former president George W. Bush, and Richard Shelby, the senior Republican on the Senate banking committee, both think it reeks of big government.


“I believe that the Democrats’ new Office of Financial Research will not only fail to detect systemic threats and asset price bubbles in the future, it may threaten the civil liberties and privacy of Americans, waste billions of dollars of taxpayer resources and lull markets into the false belief that this new government power will protect the financial system from risk,” says Mr Shelby.


Some Wall Street executives are worried about disclosing their trading positions to anyone, even regulators policing the system for systemic risk. Says the banker who has watched the OFR closely and supports it: “There is a concern about people giving up their positions ... There was a time in 1905 when people didn’t report their income to the [Internal Revenue Service]. I think it will become: you do a transaction, you report it to the OFR.”


From the other end of the political spectrum, liberal Democrats in Congress are worried the Treasury and the Fed are paying lip service to the OFR and will not give it the tools to be intrusive enough. They think scepticism among senior officials endures.


It is certainly a never-ending struggle. Forty years before Mr Parkinson grappled with Lehman, one of his antecedents extolled the virtues of technology for market supervision. In 1968, a year in which the New York Stock Exchange had to close for days at a time because paper records of trades were so out of hand, Manuel Cohen, chairman of the SEC, boasted that his agency was now using “its own computer” to monitor markets.


“We are able to provide a measure of protection to investors that theretofore had been virtually impossible due to budget and ‘manpower’ limitations,” he said. “But our techniques in this area are not as fully developed as they will be.”


..........................................


Information technology


How innovation has come to mean different things on different coasts


In recent decades, a curious paradox has hung over the American economy. On the west coast, a gaggle of entrepreneurial companies, filled with some of the country’s brightest brains, has been scrambling to track data in the smartest and most innovative way, writes Gillian Tett.


Companies such as Google, Amazon and Facebook are now able to monitor what consumers and businesses are doing around the world in real time. They can track everything from book purchases to friendship links and the consumption of breakfast cereal.


But on the east coast, another collection of highly talented brains has been delivering a very different form of innovation. Wall Street has produced a plethora of products and processes that has made the financial system more complex and (often) more opaque.


But while bankers have used cutting-edge computer technology to, say, develop ultra-fast automatic trading strategies, the data-handling innovations developed on the west coast have been slow to move east.


As recently as six years ago, traders in the credit default swaps market, for example, were still conducting deals by fax. Banks’ back offices were not standardised and regulators could not collect data from them in anything resembling a timely manner.


Beyond banking, many other parts of the financial world went almost entirely untracked by regulators, who remain behind the technological curve.


The question that hangs over the Office of Financial Research, being set up as part of reforms to the sector, is whether these different west and east coast worlds can now meet – and apply Silicon Valley-style innovation to the financial system as a whole. Can the techniques that allow Facebook to aggregate data on online friends in a flash be used to track derivatives trades, say?


Optimists argue that the answer is yes, given the extraordinary strides in computing power that have already occurred. Officials linked to the OFR have started talking to companies such as IBM about how to transplant innovations in the non-financial world into a coherent form of data collection in finance.


But pessimists retort that financial companies have little incentive to co-operate; after all, opacity has on the whole served Wall Street well, enabling traders to enjoy fat profits.


Either way, the really big question is whether the type of entrepreneurial, innovative drive that inspires Silicon Valley can be transplanted to the state sector.


“If you really wanted to revolutionise [data collection], you should ask somebody like Google to run it, and pay them properly,” observes one senior banker, only partly in jest.


Right now, however, that prospect seems even harder to imagine than a world where the OFR starts to fly.


Lowdown on the OFR



The Office of Financial Research, set up by the Dodd-Frank financial reform act last year to improve the quality and analysis of US data, has wide-ranging powers enabling it to compel institutions to provide information.


Lewis Alexander, interim head and former senior Citigroup economist, has “a few million” dollars from the Federal Reserve for staffing. By 2012 the OFR will be funded by a tax on big banks.


Based in the Treasury, the OFR will have an independent chair who reports to Congress. The White House has approached candidates.

Copyright The Financial Times Limited 2011
