Approaching the 10-year Anniversary

Doug Nolan

We're rapidly Approaching the 10-year Anniversary of the 2008 financial crisis. Exactly one decade ago to the day (September 7, 2008), Fannie Mae and Freddie Mac were placed into government conservatorship. And for at least a decade, there has been nothing more than talk of reforming the government-sponsored enterprises.

It's worth noting that total GSE (MBS and debt) Securities ended Q3 2008 at $8.070 TN, having about doubled from year 2000. The government agencies were integral to the mortgage finance Bubble - fundamental to liquidity excess, pricing distortions (finance and housing), general financial market misperceptions and the misallocation of resources. GSE Securities did contract post-crisis, reaching a low of $7.544 TN during Q1 2012. Since then, with crisis memories fading and new priorities appearing, GSE Securities expanded $1.341 TN to a record $8.874 TN. Of that growth, $970 billion has come during the past three years, as financial markets boomed and the economy gathered momentum. A lesson not learned.

Scores of lessons from the crisis went unheeded. The Financial Times' Gillian Tett was the star journalist from the mortgage finance Bubble period. I read with keen interest her piece this week, "Five Surprising Outcomes of the Financial Crisis - We Learnt the Dangers Posed by 'Too Big to Fail' Banks but Now They Are Even Bigger."

Tett's article is worthy of extended excerpts: "What are these surprises? Start with the issue of debt. Ten years ago, investors and financial institutions re-learnt the hard way that excess leverage can be dangerous. So it seemed natural to think that debt would decline, as chastened lenders and borrowers ran scared. Not so. The American mortgage market did experience deleveraging. So did the bank and hedge fund sectors. But overall global debt has surged: last year it was 217% of gross domestic product, nearly 40 percentage points higher - not lower - than 2007."

"A second surprise is the size of banks. The knock-on effects of the Lehman bankruptcy made clear the dangers posed by 'too big to fail' financial institutions with extreme concentrations of market power and risks. Unsurprisingly, there were calls to break them up. The big beasts are even bigger: at the last count America's top five banks controlled 47% of banking assets, compared with 44% in 2007, and the top 1% of mutual funds have 45% of assets."

"A third counter-intuitive development is the relative power of American finance. In 2008, the crisis seemed to be a 'made in America' saga: US subprime mortgages and Wall Street financial engineering were at the root of the meltdown. So it seemed natural to presume that American finance might be subsequently humbled. Not so. American investment banks today eclipse their European rivals in almost every sense… and the financial centres of New York and Chicago continue to swell…"

"Then there is the issue of non-bank financial companies. A decade ago, investors discovered the world of 'shadow banks', when they learnt that a vast hidden ecosystem of opaque investment vehicles posed systemic risks. Regulators pledged to clamp down. So did the shadow banks shrink? Not quite: a conservative definition of the shadow bank sector suggests that it is now $45tn in size, controlling 13% of the world's financial assets, up from $28tn in 2010. A regulatory clampdown on the banks has only pushed more activity to the shadows."

"A fifth issue to ponder is the post-crisis retribution. Back when lenders were falling over by the dozens, it seemed natural to presume that some bankers would end up in jail. After all, there were hundreds of prosecutions after the US savings and loans scandals of the 1980s. But while banks have been hit with fines in the past decade, totalling more than $321bn, (almost) the only financiers who have done jail time are those who committed crimes that were not directly linked to the crisis, such as traders who rigged the Libor rate."

The FT's Martin Wolf weighed in with, "Why So Little Has Changed Since the Financial Crash." I greatly respect Gillian Tett's insight. Martin Wolf is exceptionally knowledgeable and an esteemed journalist, but I don't hold his perspective in the same high regard.

Wolf: "So what happened after the global financial crisis? Have politicians and policymakers tried to get us back to the past or go into a different future? The answer is clear: it is the former… After the crisis of 2008, they wanted to go back to a better version of the past in financial regulation. In both cases, all else was to stay the way it was."

Wolf: "The financial crisis was a devastating failure of the free market that followed a period of rising inequality within many countries. Yet, contrary to what happened in the 1970s, policymakers have barely questioned the relative roles of government and markets."

I've never viewed the 2008 fiasco as a "failure of the free markets." It was instead an abject failure of policymaking - of government policy and central bank doctrine and methods. At its roots, the crisis was the inevitable consequence of unsound money and Credit - finance that over time became increasingly unstable specifically because of government intervention and manipulation. "Activist" central banks were manipulating the price of finance and the quantity and allocation of Credit, along with increasingly heavy-handed interventions to backstop dysfunctional markets.

The crisis was a predictable failure in inflationism. Sure, it's reasonable to blame the reckless behavior of Wall Street. But risk-taking, leveraging, speculation and chicanery were all incentivized by policy measures employed to inflate both asset prices and the general price level.

Instead of the crisis focusing attention on the root causes of perilous financial and economic fragilities, the panicked backdrop proved conducive only to more egregious government and central bank intervention. Rather than exhaustive discussions of the roles played by "The Maestro's" "asymmetric" market-friendly policy approach, Bernanke's pledge of "helicopter money," and central bank "puts" in inflating the Bubble, Dr. Bernanke was cast as the superhero figure with the smarts, determination and academic creed to reflate the securities markets for the good of humanity. It was a grand illusion: Enlightened inflationism was viewed as the solution - and not the core problem that it was. And inflationists - including the FT's Martin Wolf - cheered on zero rates, Trillions of QE and the resulting inflation of the greatest Bubble in human history.

It became common to compare 2008 to 1929, and we were darn lucky that Chairman Bernanke had trained his entire academic career to ensure a different outcome. This comparison continued for some years, 2009 to 1930, 2011 to 1932, and so on. I never bought into this line of analysis. As it turns out, 2008 did not mark a major inflection point in finance, in policymaking or in economic structure. I would argue that the unprecedented reflation merely extended the cycle, with essentially the same policy doctrine, financial apparatus and market structure that ensured the previous crisis. Same cycle, but just a much more comprehensive Bubble, across markets and economies on a global scale - and on powerful steroids.

It's popular to blame the rise of populism on the financial crisis. I believe the issue is more about economic structure. It is interesting to note that back in 2006, at the height of the U.S. Credit expansion, manufacturing jobs actually contracted during the year. The financial backdrop ensured that it was much easier to generate profits lending money, in structured finance and speculating in the markets than it was producing goods in the U.S. Productive investment (and manufacturing employment) has bounced back somewhat in recent years. Yet post-crisis inflationism has only widened the gap between real economic investment and the easy returns available from asset inflation, securities trading and financial engineering.

It's very much a minority view. But I believe we'd be in a much better place today had we not reflated the previous Bubble. It was a mistake to aggressively promote securities market inflation, once again incentivizing financial speculation; once again favoring the Financial Sphere over the Real Economy Sphere. Such favoritism specifically favors segments of the economy and population over others. The ongoing financial incentive structure foments financial and economic instability (ensuring a more outlandish and protracted cycle of central bank inflationism).

Warren Buffett is known for his focus on ensuring the right incentives are in place. Few have benefitted more from central bank-created incentives and securities market favoritism - along with inflationism more generally. I would add that no investor's reputation has gained as much from crisis policymaking. If there is a paramount investment truth today, it's that we all must invest for the long-term like the great Warren Buffett. Buy and hold, never try to time the market - but simply invest in America for the long-term. It's a sure thing.

As part of 10-year crisis anniversary coverage, the Wall Street Journal interviewed Buffett. The title of the video was enticing: "Warren Buffett Explains the 2008 Financial Crisis."

Buffett: "In 2008, you had something close to a bubble in home real estate. Fifty million people had mortgages roughly at that time, out of 75 million homeowners. When that bubble burst, it hit home to probably 40% of the households in the country - these people that had mortgages on their houses. Fear spread in the month of September 2008 at a rate that was like a tsunami."

WSJ: Who do you hold responsible for that?

"Bubbles are always hard to ascertain the originators of it. There really aren't originators. Everybody got caught in. Some were foolish, some were crooked - some were both. But you had a mass illusion that it could go on forever. You had Wall Street firms participating. Mortgage originators participating. But you had the public participating. It was a lot of fun. It was like Cinderella going to the party. We were all going to turn and buy some pumpkins at midnight, but nobody wanted to leave until one minute to midnight. And the rush for the door couldn't be handled."

WSJ: For you, what were the lessons you learned in 2008?

"I didn't really learn any new lessons in 2008 or 2009. I had emphasized to me some of the things that I'd always believed. That you do need somebody who can say 'do whatever it takes.' The U.S. government had to do the right things - not perfect things - but generally the right things starting in September. And they did a fantastic job, actually, of getting the train back on the tracks. There was still damage for a long period thereafter. But it was really important to have fast action at that time. We were very fortunate we had the leaders we did. If we'd had people that would have waited for all the information to be right, or for committees to work - that sort of thing - it would have been far, far worse. People talk about a fog of war, but there's a fog of panic too. And during that panic you're getting inaccurate information, you're hearing rumors. If you wait until you know everything, it's too late."

"…I can understand how people that lost their houses or lost their jobs - whatever may have happened to them - feel that there must be somebody out there that was profiting from this, that did some things that should send them to jail. The people that ran most of the institutions - the big institutions that got in trouble - probably shouldn't name names - they went away rich. They may have been disgraced to some degree, but they went away rich. So I don't think the incentive system has been improved a lot from what it was ten years ago."

WSJ: What could the next crisis look like?

"If I knew what the next crisis would look like, I might be a little helpful in stopping it. But there will be other crises. There's no way of knowing, when we're in a situation like we were in the fall of 2008, when or precisely how it will end. You know the United States will come back. The factories don't disappear. The farm land doesn't disappear. The skills of the people don't disappear. But you had a system which was going to put them in an idle position - or could do it - there's no way to know how far it was going to go.

"What's left from the crisis is pretty much memories. The tracks are still there. The train in still there. But we had a big interruption in 2008 and nine - and now the train has been running pretty darn well. We've shown that America can't be stopped."  

I find Buffett's comments disappointing. For someone with his experience and intelligence, it seems there should be deeper insight regarding the forces behind such a major financial crisis.

For me, it's reminiscent of the mindset at the market top in the late-twenties. And, of course, the factories, farms and human skills didn't disappear after the Great Crash. America wasn't stopped. But the financial apparatus that inflated to extraordinary excess during the boom came to a grinding halt, with momentous ramifications for economies, societies and geopolitics.

In contrast to 2008, that crash and the resulting crisis in confidence - in the markets, in finance, in policymaking and in the real economy - concluded the cycle.

Hopefully the bullish consensus view is correct. But the current backdrop sure seems late cycle - "permanent plateau" - manic wishful thinking to me. This whole buy and hold and ignore risk delirium - the product of decades of "activist" central banks jamming too many "coins in the fuse box" - espoused by the great market oracle Warren Buffett - is a trap. It's been a while since investors have experienced a protracted bear market. Central bankers have too quickly come to the markets' defense. The next crisis could prove much more difficult to manage. Long-term investors, convinced to hold tight, may find it's a long time before they see these securities prices again.

The way I see it, a lot of faith has been placed in enhanced bank supervision, larger bank capital buffers and the almighty power of "whatever it takes" central banking. But despite the propaganda, irresponsible bank lending was not the root cause of 2008 fragilities. It was dysfunctional financial markets, replete with mispricing, misperceptions, rank speculation, leverage and resource misallocation. It was a massive and unwieldy derivatives marketplace. It was the view that the securities and derivatives markets were too big to fail - that central banks could ensure uninterrupted liquid and robust markets.

And this is where critical lessons went unlearned and, as a consequence, where danger lurks today. From my vantage point, all the previous key forces fomenting latent fragilities are greater today than a decade ago. From a global perspective, unsound "money" and Credit back in 2008 appears pristine in comparison. And if you think populism, nationalism, socialism and mayhem are on the rise, just wait until this global Bubble bursts.

The Forgotten History of the Financial Crisis

What the World Should Have Learned in 2008

By Adam Tooze

"September and October of 2008 was the worst financial crisis in global history, including the Great Depression.” Ben Bernanke, then the chair of the U.S. Federal Reserve, made this remarkable claim in November 2009, just one year after the meltdown. Looking back today, a decade after the crisis, there is every reason to agree with Bernanke’s assessment: 2008 should serve as a warning of the scale and speed with which global financial crises can unfold in the twenty-first century.

The basic story of the financial crisis is familiar enough. The trouble began in 2007 with a downturn in U.S. and European real estate markets; as housing prices plunged from California to Ireland, homeowners fell behind on their mortgage payments, and lenders soon began to feel the heat. Thanks to the deep integration of global banking, securities, and funding markets, the contagion quickly spread to major financial institutions around the world. By late 2008, banks in Belgium, France, Germany, Ireland, Latvia, the Netherlands, Portugal, Russia, Spain, South Korea, the United Kingdom, and the United States were all facing existential crises. Many had already collapsed, and many others would before long.

The Great Depression of the 1930s is remembered as the worst economic disaster in modern history—one that resulted in large part from inept policy responses—but it was far less synchronized than the crash in 2008. Although more banks failed during the Depression, these failures were scattered between 1929 and 1933 and involved far smaller balance sheets. In 2008, both the scale and the speed of the implosion were breathtaking. According to data from the Bank for International Settlements, gross capital flows around the world plunged by 90 percent between 2007 and 2008.

As capital flows dried up, the crisis soon morphed into a crushing recession in the real economy. The “great trade collapse” of 2008 was the most severe synchronized contraction in international trade ever recorded. Within nine months of their pre-crisis peak, in April 2008, global exports were down by 22 percent. (During the Great Depression, it took nearly two years for trade to slump by a similar amount.) In the United States between late 2008 and early 2009, 800,000 people were losing their jobs every month. By 2015, over nine million American families would lose their homes to foreclosure—the largest forced population movement in the United States since the Dust Bowl. In Europe, meanwhile, failing banks and fragile public finances created a crisis that nearly split the eurozone.

Ten years later, there is little consensus about the meaning of 2008 and its aftermath. Partial narratives have emerged to highlight this or that aspect of the crisis, even as crucial elements of the story have been forgotten. In the United States, memories have centered on the government recklessness and private criminality that led up to the crash; in Europe, leaders have been content to blame everything on the Americans.

In fact, bankers on both sides of the Atlantic created the system that imploded in 2008. The collapse could easily have devastated both the U.S. and the European economies had it not been for improvisation on the part of U.S. officials at the Federal Reserve, who leveraged transatlantic connections they had inherited from the twentieth century to stop the global bank run.

That this reality has been obscured speaks both to the contentious politics of managing global finances and to the growing distance between the United States and Europe. More important, it forces a question about the future of financial globalization: How will a multipolar world that has moved beyond the transatlantic structures of the last century cope with the next crisis?


One of the more common tropes to emerge since 2008 is that no one predicted the crisis. This is an after-the-fact construction. In truth, there were many predictions of a crisis—just not of the crisis that ultimately arrived.

Macroeconomists around the world had long warned of global imbalances stemming from U.S. trade and budget deficits and China’s accumulation of U.S. debt, which they feared could trigger a global dollar selloff. The economist Paul Krugman warned in 2006 of “a Wile E. Coyote moment,” in which investors, recognizing the poor fundamentals of the U.S. economy, would suddenly flee dollar-denominated assets, crippling the world economy and sending interest rates sky-high.

But the best and the brightest were reading the wrong signs. When the crisis came, the Chinese did not sell off U.S. assets. Although they reduced their holdings in U.S.-government-sponsored enterprises such as the mortgage lenders Fannie Mae and Freddie Mac, they increased their purchases of U.S. Treasury bonds, refusing to join the Russians in a bear raid on the dollar. Rather than falling as predicted, the dollar actually rose in the fall of 2008. What U.S. authorities were facing was not a Sino-American meltdown but an implosion of the transatlantic banking system, a crisis of financial capitalism.

And the crisis was general, not just American, although the Europeans had a hard time believing it. When, over the weekend of September 13–14, 2008, U.S. Treasury Secretary Henry Paulson and other officials tried to arrange the sale of the failed investment bank Lehman Brothers to the British bank Barclays, the reaction of Alistair Darling, the British chancellor of the exchequer, was telling. He did not want, he told his American counterparts, to “import” the United States’ “cancer”—this despite the fact that the United Kingdom’s own banks were already tumbling around him.


The French and the Germans were no less emphatic. In September 2008, as the crisis was going global, the German finance minister, Peer Steinbrück, declared that it was “an American problem” that would cause the United States to “lose its status as the superpower of the world financial system.” French President Nicolas Sarkozy announced that U.S.-style “laissez faire” was “finished.”

To Europeans, the idea of an American crisis made sense. The United States had allowed itself to be sucked into misguided wars of choice while refusing to pay for them. It was living far beyond its means, and the crisis was its comeuppance. But confident predictions that this was a U.S. problem were quickly overtaken by events. Not only were Europe’s banks deeply involved in the U.S. subprime crisis, but their business models left them desperately dependent on dollar funding. The result was to send the continent into an economic and political crisis from which it is only now recovering.

Even today, Americans and Europeans have very different memories of the financial crisis. For many American commentators, it stands as a moment in a protracted arc of national decline and the prehistory of the radicalization of the Republican Party. In September 2008, the Republican-led House of Representatives voted against the Bush administration’s bailout plan to save the national economy from imminent implosion (although it passed a similar bill in early October); a few months later, after a lost election and at a time when 800,000 Americans were being thrown out of work every month, House Republicans voted nearly unanimously against President Barack Obama’s stimulus bill. The crisis ushered in a new era of absolute partisan antagonism that would rock American democracy to its foundations.

Europeans, meanwhile, remain content to let the United States shoulder the blame. France and Germany have no equivalent of The Big Short—the best-selling book (and later movie) that dramatized the events of 2008 as an all-American conflict between the forces of herd instinct and rugged individualism, embodied by the heterodox speculators who saw the crisis coming. Germans cannot ignore that Deutsche Bank was a major player in those events, but they can easily explain this away by claiming that the bank abandoned its German soul. And just as the Europeans have chosen to forget their own mistakes, so, too, have they forgotten what the crisis revealed about Europe’s dependence on the United States—an inconvenient truth for European elites at a time when Brussels and Washington are drifting apart.


Europe’s persistent illusions were on full display in an August 9, 2017, press release from the European Commission. In it, the commission announced that the “crisis did not start in Europe” and that the underlying problem had been “exposure to sub-prime mortgage markets in the United States,” which triggered the deep European recession that followed. Brussels went on to take credit for mitigating that recession through the “strong political decisions” of EU institutions and member states.

The timing of the press release was significant. It came on the tenth anniversary of what most experts consider to be the true start of the global financial crisis—the moment on August 9, 2007, when the French bank BNP Paribas announced that it was freezing three of its investment funds due to volatility in asset-backed securities markets in the United States. This was the first indication that the downturn in housing prices, which had begun in early 2007, would have global ramifications. That same day, the European Central Bank (ECB) was sufficiently alarmed to inject $131 billion in liquidity into Europe’s banking system.

The commission’s analysis of what happened in 2007 was telling. Set aside, for a moment, the fact that problems at a French bank were the occasion of the anniversary, that there were massive homegrown real estate busts in Ireland and Spain, and that Greece and Italy had accumulated dangerous debt stocks of their own. What, exactly, did the implosion of U.S. subprime mortgage markets expose?

The United States’ mortgage system was obviously broken. Some of the lending was criminal. And the design of mortgage-backed securities, many of which earned the highest bond ratings by bundling together bad mortgages, was flawed. But none of these problems explains why the downturn precipitated a global banking crisis. After all, investors lost more money when the dot-com bubble burst in 2000 and 2001, but that did not bring the global financial system to the brink of disaster.

What turned 2008 into the worst banking crisis in history was a new business model for banks.

Traditionally, most banks had funded their operations through what is known as “retail” banking, in which consumers lend money to banks in the form of deposits, which banks use to make loans.

Beginning in the 1980s, however, banks across the world increasingly moved toward “wholesale” banking, funding their operations through large, short-term loans from other financial institutions, such as other banks and money market funds. The motive for this shift was profit and competitive survival. Wholesale funding gave banks the ability to borrow much larger sums of money than they could in the retail market, allowing them to become more leveraged—and thus more exposed to risk—than ever before.
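The leverage arithmetic implicit in that shift is simple to illustrate. The sketch below uses invented balance-sheet figures (none come from the article) to show how layering wholesale borrowing on a fixed equity base multiplies leverage:

```python
# Hypothetical illustration of how wholesale funding raises bank leverage.
# All figures are invented for the example; none come from the article.

def leverage_ratio(equity, deposits, wholesale_funding):
    """Assets / equity, assuming every dollar of funding is deployed into assets."""
    assets = equity + deposits + wholesale_funding
    return assets / equity

equity = 5.0      # $5bn of shareholder capital
deposits = 45.0   # $45bn of retail deposits, slow to grow

# Retail-funded balance sheet: (5 + 45) / 5 = 10x leverage.
retail_only = leverage_ratio(equity, deposits, wholesale_funding=0.0)

# Add $100bn of short-term wholesale borrowing: (5 + 45 + 100) / 5 = 30x.
with_wholesale = leverage_ratio(equity, deposits, wholesale_funding=100.0)

print(retail_only, with_wholesale)  # 10.0 30.0
```

The same equity cushion now stands behind three times the assets, which is why a modest markdown in asset values could wipe out a wholesale-funded bank.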

But the real threat to the global economy was not just that banks in the United States, Europe, and, to some extent, Russia and Asia were becoming overleveraged; it was also that much of these banks’ short-term funding involved currency mismatches. In order to do business in the United States, non-U.S. banks needed dollars, which they obtained from wholesale markets through a variety of methods: borrowing unsecured cash from U.S. sources, issuing commercial paper (essentially short-term IOUs), and, crucially, using currency-swap markets to receive short-term dollar loans in exchange for their own local currencies, with a promise to “swap” the currencies back at the end of the loan term. In short, foreign banks were racking up sizable liabilities that had to be paid in dollars.

If the money markets where they obtained these dollars ceased to function, many of the world’s banks would immediately be at risk of failure.

And in fact, that is precisely what happened. The first big bank to fail spectacularly was the British lender Northern Rock, in August and September 2007. It had no exposure to American subprime mortgages, but its funding model relied overwhelmingly on wholesale borrowing from around the world. What cut off Northern Rock’s access to funding was BNP Paribas’ August 9 announcement.

This sent a signal to wholesale lenders that more banks were holding bad assets than anyone had previously understood. With the extent of the contagion unknown, wholesale lending ground to a halt. Five days later, Northern Rock informed British regulators that it would need assistance.

The shutdown in bank funding quickly rippled across the global financial system, even reaching Russia and South Korea, countries remote from the subprime debacle but whose banks relied on the same wholesale markets now under stress. By late 2007, the world was witnessing a trillion-dollar, transnational bank run.


People tend to think of globalization as involving the rise of emerging markets such as China and India, and in manufacturing and commodities, these countries have indeed been the engines of growth. But in the early twenty-first century, financial globalization still revolved around the transatlantic axis, and it was between the United States and Europe that the real disaster threatened. The Bank for International Settlements estimated that all told, by the end of 2007, European banks would have needed to raise somewhere between $1 trillion and $1.2 trillion in order to cover the gaps on their balance sheets between dollar assets and dollar funding. In the good times, these banks had easily obtained funding through currency swaps and wholesale markets. Now, with interbank markets drying up, they were desperate for dollars.

By the fall of 2007, officials in the United States had begun to fear that European banks, in a frantic bid to earn dollars to pay their bills, would liquidate their dollar portfolios in a giant fire sale. And because these banks owned 29 percent of all nonconforming, high-risk mortgage-backed securities in the United States, this was not just a European problem. The nightmare scenario for the Americans was that European banks would dump their dollar holdings, driving the prices of mortgage-backed securities to rock bottom and forcing U.S. banks, which held even larger quantities of those securities, to recognize huge losses, thus triggering a bank run that would have overwhelmed the furious efforts of the U.S. authorities to restore stability. It was this risk of simultaneous implosion on both sides of the Atlantic that made 2008 the most dangerous crisis ever witnessed.


With disaster threatening, the question became how to respond. In the fall of 2008, governments across the West rushed to bail out their ailing financial institutions. In the United States, Washington came to the aid of the investment bank Bear Stearns, Fannie Mae and Freddie Mac, and the insurance giant AIG. The United Kingdom effectively nationalized HBOS, Lloyds, and the Royal Bank of Scotland. Belgium, France, Germany, Ireland, and Switzerland all took emergency measures to rescue their own banking sectors. 

As the trouble spread, crisis diplomacy kicked in. The inaugural G-20 leadership summit convened in November 2008, bringing together heads of state from developing countries such as Brazil, China, and India, in addition to those from the developed world. The birth of the G-20 reflected a multipolar world economy in which emerging markets had new weight. But it also made recourse to institutions such as the International Monetary Fund, which many developing countries viewed with hostility, all the more sensitive. No one in Washington wanted a repeat of the controversies of the Asian financial crisis in the late 1990s, when the IMF's draconian loans came to be seen by their recipients as violations of national sovereignty. 

Behind the scenes, U.S. officials were putting an alternative rescue mechanism in place. The central problem was that the world’s banks needed dollar funding. And the only institution that could fill that need was the Federal Reserve. 

Officials at the Fed had already started to worry about European banks’ funding gaps toward the end of 2007. By December of that year, Bernanke and Timothy Geithner, then the president of the New York Federal Reserve Bank, had begun offering special liquidity programs to Wall Street, giving U.S. financial institutions access to cheap cash in the hopes of stabilizing their balance sheets and avoiding a ruinous selloff of mortgage-backed securities. Immediately, European banks started dipping into these funds. The Europeans took more than half of the $3.3 trillion offered through the Fed’s Term Auction Facility, which auctioned off low-interest short-term loans, and 72 percent of the deals provided through the Single-Tranche Open Market Operation, a little-publicized Fed liquidity program that ran from March to December of 2008. (Credit Suisse alone took one-quarter of that program’s funds.) 

For the Fed to be acting as lender of last resort to foreign banks was no doubt unusual, but these were desperate times, and it needed to avoid a European fire sale of U.S. assets at all costs. As the crisis intensified, however, the Fed’s leaders found that simply providing the European banks with access to the Wall Street liquidity programs would not be enough. Their funding needs were too great, and they lacked sufficient high-quality collateral in New York. So Geithner and the New York Federal Reserve resorted to an indirect mechanism for providing them with dollars, repurposing a long-forgotten instrument known as a “liquidity swap line.”
First responders: Henry Paulson and Ben Bernanke testifying in Washington, July 2008.

Liquidity swap lines are contracts between two central banks, in this case, the Fed and a foreign central bank, to temporarily exchange currencies: the Fed provides its counterpart with a fixed amount of dollars and in return receives an equivalent amount of that bank’s local currency. (The foreign central bank also pays a margin of interest to the Fed.) Liquidity swap lines had been used extensively in the 1960s to deal with tensions in the Bretton Woods system—which, by compelling countries to back their money with gold, led to frequent currency imbalances—but had since been confined to emergencies, as when they were used to help the Bank of Mexico during the peso crisis of 1994–95. The revival of liquidity swap lines in 2007–8 ensured that there would be no dangerous spikes in the funding costs of key Asian, European, and Latin American banks. If interbank funding got too tight, the global financial system would receive dollars directly from the Fed. 
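The mechanics described above reduce to simple arithmetic. The sketch below is illustrative only, with hypothetical amounts, rates, and day-count convention; the actual terms and margins of the Fed's swap lines varied by counterparty and program.

```python
# Hedged sketch of one swap-line draw, per the description above:
# leg 1: the Fed sends dollars and receives local currency at the spot rate;
# leg 2: the swap unwinds at the SAME rate, plus an interest margin paid
#        to the Fed, so the Fed bears no exchange-rate risk on the draw.
# All numbers below are hypothetical.

def swap_cash_flows(usd_amount, fx_per_usd, annual_rate, term_days):
    """Return (fx_received_by_fed, usd_returned_to_fed) for one draw."""
    fx_received = usd_amount * fx_per_usd                   # leg 1
    interest = usd_amount * annual_rate * term_days / 360   # money-market day count
    return fx_received, usd_amount + interest               # leg 2: unwind + margin

# e.g. a $10 billion, 28-day draw at an assumed 0.80 EUR per USD and a 2% rate
eur_leg, usd_leg = swap_cash_flows(10e9, 0.80, 0.02, 28)
```

Because both legs settle at the original spot rate, a currency move during the term leaves the Fed whole; the counterparty central bank, not the Fed, carries the credit risk of the commercial banks it lends the dollars on to.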

The major beneficiaries of the swap lines were the central banks of Japan, Europe, and the major emerging-market countries, which could now take dollars from the Fed to pass on to their own struggling banks. The Fed introduced the liquidity swap lines in December 2007, and they were rapidly increased to a permissible limit of $620 billion. On October 13, 2008, they were uncapped, giving the major foreign central banks unlimited dollar drawing rights. By December 2008, the swap lines were the single largest outstanding item on the Fed’s balance sheet. The swap lines operated over various terms, ranging from overnight to three months. But if, for accounting purposes, they were standardized to a 28-day term, between December 2007 and August 2010, the Fed provided its Asian, European, and Latin American counterparts with just shy of $4.5 trillion in liquidity, of which the ECB alone took $2.5 trillion. That the European banks’ giant funding gap did not escalate into a full-blown transatlantic financial crisis is thanks in large part to these swap lines. 
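The 28-day standardization mentioned above is straightforward: each draw is weighted by its term relative to 28 days, so cumulative lending across mixed maturities can be summed on one basis. A minimal sketch with made-up draws (the roughly $4.5 trillion figure comes from applying this kind of normalization to the actual swap-line records):

```python
# Normalize loans of different terms to 28-day equivalents, as described
# above, so cumulative lending can be totaled on a common basis.
# The draws listed here are hypothetical.

def total_28day_equivalent(draws):
    """draws: iterable of (usd_amount, term_days) pairs."""
    return sum(amount * days / 28 for amount, days in draws)

draws = [
    (20e9, 7),    # a one-week draw counts as a quarter of a 28-day loan
    (50e9, 28),   # a 28-day draw counts at face value
    (30e9, 84),   # a three-month draw counts three times over
]
total = total_28day_equivalent(draws)  # 5e9 + 50e9 + 90e9 = 145e9
```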

Although the swap lines could be dismissed as technical in-house arrangements between central banks, they represented a fundamental transformation of the global financial system. The world’s central banks effectively became offshore divisions of the Fed, conduits for whatever dollar liquidity the financial system required. The Fed, that is, made itself into a global lender of last resort. Whereas before 2008 many had expected an imminent dollar selloff, the crisis ended up confirming the centrality of the Fed to the global financial system. And by successfully managing the crisis, the Fed reinforced the dollar’s attractiveness as the world’s commercial currency. 

But in establishing the swap-line system, the Fed also confirmed a hierarchy of central banks.

The system included the obvious European central banks, such as the ECB, the Bank of England, and the Swiss National Bank, and those of Canada and Japan. But it also included the central banks of major emerging-market financial centers, such as Brazil, Mexico, Singapore, and South Korea. They were in; the likes of China, India, and Russia were not. Veterans of the swap-line program at the Fed, who spoke to me on the condition of anonymity, admitted that they knew that by rolling it out they were straying into geopolitical terrain. They carefully compiled a list of the 14 central banks that were to participate in the program, all of which had to be approved by the U.S. Treasury Department and the State Department. The Fed’s minutes from the meeting of the Federal Open Market Committee on October 29, 2008, record that at least two applicants were rejected, but their names were redacted. Not everyone was sufficiently important—or sufficiently politically and economically aligned with the United States—to qualify. 

The swap-line system wasn’t secret, but it wasn’t trumpeted, either. This was no Marshall Plan moment, and U.S. officials had no desire to publicize the fact that they were coming to the world’s rescue. The inability of Europe’s megabanks to come up with the trillions of dollars they owed posed such a risk to the U.S. economy that doing nothing was simply not an option. So discreetly, the Fed offered the Europeans a helping hand.


The liquidity swap lines wound down rapidly in 2009, as private credit markets began to recover. The full details of the liquidity programs were not disclosed until 2011, when the U.S. Supreme Court ordered the Fed to release the data to reporters from Bloomberg. There was good reason for secrecy: central banks do not wish to stigmatize borrowers that avail themselves of support when they need it, and announcing that the world’s most important central banks were desperate for dollar funding could have frightened international markets.

The result, however, is that the Fed’s actions to save the global financial system have largely been forgotten. An unprecedented intervention effectively disappeared down a memory hole.


Today, the swap lines are an obscure part of the narrative in the United States; in Europe, they have been forgotten altogether. The European Commission is free to peddle its story that it was prompt action by the European authorities that saved Europe from a crisis made in the United States.

European banks such as Barclays and Deutsche Bank can proudly proclaim that, unlike their American counterparts, they came through the crisis without state assistance, despite the fact that they took hundreds of billions of dollars in liquidity from the Fed. Although such depictions are profoundly misleading, they speak volumes about the state of transatlantic relations in the early twenty-first century. The United States and Europe remain massively interdependent, but they lack a common story to glue the relationship together.

The year 2008 can thus be seen as a moment of transition. On the one hand, it marked a twenty-first-century global crisis. On the other hand, the management of that crisis relied on networks of interdependence shaped by the twentieth-century history of the transatlantic relationship—networks that were deep but that leaders on both sides of the divide now seem eager to leave behind. 

What are the implications for the future? Many predicted that in the aftermath of the crisis, the dollar would lose its status as the world’s leading currency, but the opposite has happened. According to figures compiled by the economists Ethan Ilzetzki, Carmen Reinhart, and Kenneth Rogoff, today the dollar is the anchor currency—the standard against which other currencies are pegged—for countries representing around 70 percent of global GDP, up from closer to 60 percent at the turn of the millennium. It was European, not American, finance that retreated. The events of 2008 left the European banks in a weakened position, and since then, they have repeatedly looked to Washington for support. When the eurozone crisis was at its most acute, in 2010, the Fed reopened its swap lines, and in November 2013, they were made permanent.

At the same time as the Fed tided the European banks over during the crisis, U.S. regulators began to take an increasingly dim view of their stability. During negotiations in the Basel Committee on Banking Supervision throughout 2010, U.S. and European officials clashed over tightening banking rules and capital requirements. And after Obama signed the Dodd-Frank financial regulations into law in July of that year, U.S. regulators began using the law’s provisions to force European banks in the United States to either comply with the tougher standards or exit the U.S. market.  

The ultimate outcome of the crisis was thus an unwinding of the extraordinarily tight connection between U.S. and European finance that had characterized the 1990s and early years of this century. Between 2009 and 2017, the foreign claims of banks as a share of global GDP—a rough proxy for financial globalization—fell by roughly 22 percentage points, or around $9.5 trillion. The entirety of that reduction was attributable to European banks, with much of it coming in 2009 through a collapse of European claims on the United States. Deutsche Bank’s April 2018 decision to reduce its presence on Wall Street was a belated example of this broader European retreat.

At the same time as European finance has deglobalized, emerging markets have taken center stage. Cheap dollar finance enabled by the Fed’s policy of low interest rates has sucked emerging markets into a deep entanglement with the U.S.-dominated financial system. By 2015, China’s businesses had borrowed over $1.7 trillion in foreign currency, the largest part of that in dollars, to feed their rampant need for investment finance. This is profitable for everyone involved and widely seen as a harbinger of China’s integration into international finance; yet with this new development come new dangers. The actions taken by the Fed to manage the 2008 crisis were underpinned by the remnants of a transatlantic relationship dating back to the end of World War II; given today’s fraying transatlantic ties, it is an open question whether it will be able to repeat its efforts on a truly global scale when the next crisis arrives. 

Nor is it clear that the Fed will have as much political leeway as it did in 2008. When asked about the politics of the swap lines back then, one Fed veteran who spoke to me on the condition of anonymity remarked that it had been as though the world’s central bankers had a guardian angel watching over them on Capitol Hill. Some legislators clearly understood what was going on, but no unhelpful and potentially inflammatory questions were asked, such as whom the billions of dollars flushing through the swap lines would ultimately benefit. The Fed had carte blanche to do what was necessary. Given what has since emerged about the scale of its actions, the shift in the political climate in the United States, and the likelihood that the next crisis will be in the emerging markets, and quite possibly in China, it may take more than a guardian angel to save the global economy next time.

As Euro Crisis Ends, Italy Stokes Fear of a Revival

Concern comes amid market jitters over Italian debt, attacks by politicians in Rome on Europe’s establishment

By Marcus Walker

Italian Prime Minister Giuseppe Conte has sought to reassure investors with vows of fiscal discipline. Photo: Angelo Carconi/EPA/Shutterstock

ROME—The end of Greece’s marathon bailout on Monday would mark the closure of the eurozone crisis—if only it weren’t for Italy, and nagging fears that the euro isn’t fixed after all.

European Union authorities will hail as a victory the completion of Greece’s financial-rescue program, an eight-year drama that triggered a wider European sovereign-debt panic. Greece’s economy has begun to grow again, although recovery has far to go. Defying many predictions, Greece has stayed in the euro, thanks to the strength of public support for keeping the currency, even amid one of the deepest economic depressions of modern times.

Meanwhile, French President Emmanuel Macron, German Chancellor Angela Merkel and other EU leaders are discussing the next moves to bolster the currency union, building on various overhauls since the crisis.

Italy shows it might not be enough.

Renewed market tremors last week over Italian debt, and fresh verbal attacks on Europe’s establishment by politicians in Rome, suggest the specter of destabilizing capital flight from a eurozone country could return.

A first test will come this fall, when Italy’s new populist government must present a budget and explain how it will pay for its costly promises to voters.

“I am as serene as the rainbow,” parliamentary budget committee chairman Claudio Borghi tweeted on Aug. 13 as investors sold off Italian bonds. Either the European Central Bank will guarantee Italy’s debt, “or everything will be dismantled,” said Mr. Borghi, a euroskeptic economic adviser to Matteo Salvini, head of Italy’s nationalist League party.

A day later, the prime minister’s office sought to reassure investors with a statement pledging fiscal discipline.

EU authorities have drawn many lessons from bond-market breakdowns that nearly destroyed the euro in 2010-12. They have built safeguards ranging from a permanent bailout fund to centralized banking supervision. Leaders are haggling over an embryonic common budget for the eurozone.

But the causes of Europe’s debt crisis haven’t gone away.

The currency union facilitated massive capital flows during the 2000s from Europe’s economic core around Germany to its periphery. That fed credit bubbles that distorted national economies, then left whole countries gasping for liquidity when investors lost confidence and fled.

Advanced economies, which normally control the currency they borrow in, became as vulnerable to investor stampedes as emerging economies—such as Turkey—that borrow in foreign currencies.

The euro still feels like a foreign currency to some Italians. The euroskeptic Minister for Europe Paolo Savona has called it a “German cage.” During the crisis, the ECB acted to save Italy’s bond market from collapse only after Rome inflicted painful fiscal austerity to satisfy the central bank and Berlin.

The nativist League, part of Rome’s new governing coalition, feeds partly off lingering resentment about perceived German bullying.

ECB intervention in bond markets from 2012 onward, led by the bank’s Italian president, Mario Draghi, was the key to ending the financial disintegration and saving the euro. Government-to-government loans were enough to bail out smaller countries such as Greece, Ireland and Portugal—but only the ECB has the firepower to defend Italy, with its €2.3 trillion ($2.6 trillion) national debt.

Mr. Draghi, who retires next year, famously vowed to do “whatever it takes.” But he also needed supportive political leaders in Berlin, Rome and other important capitals.

“Are we sure that the next ECB head will be willing to do that?” says Paul De Grauwe, one of Europe’s most prominent economists. “Are we sure the political configuration in Europe will allow it? We don’t really know.” In an age of voter backlash against Europe’s political establishment, he says, “the leading actors next time may not be as invested in saving the euro.”

Italy’s new governing coalition, comprising the League and the antiestablishment 5 Star Movement, dominates in opinion polls at the expense of Italy’s centrist establishment, which backed the austere fiscal policies in the crisis. Neither governing party advocates leaving the euro, but each contains vocal euro-skeptics and has in the past called for a referendum on returning to the lira.

“If markets are seen as punishing Italy, it could intensify political animosity against the eurozone,” says Mr. De Grauwe.

Optimists say Europe has made good progress in reducing one of the big reasons why the debt crisis escalated: the mutual dependence of banks and governments.

When government-bond prices plunged, inflicting losses on banks that held them, markets doubted that crisis-hit governments could afford to support their country’s banks, leading to more selloffs. Struggling banks choked off credit to their national economies, deepening recessions.

Europe’s new banking union, still under construction, aims to break the vicious circle. In future, the Europe-wide banking sector and its investors are to carry the cost of bank failures, rather than taxpayers.

“We may be closer to disentangling banks and sovereigns than is generally realized,” says Nicolas Veron, senior fellow at Brussels think tank Bruegel. “Even so, I would recommend going much further,” by deterring banks from owning too many of their government’s bonds.

Ideally, says Mr. Veron, bond markets would be able to punish governments for reckless policies, without capital flight spilling over into banks and the wider economy.

Others say the banking union isn’t enough. A banking crisis such as 2008 would overwhelm the sector’s limited new defenses, again burdening governments. Italy’s bond market is so big that a crash can’t be isolated from the economy, especially if linked to fears of a euro exit.

And the eurozone still lacks tools to fight recessions in countries whose brittle bond markets force governments to cut spending in a downturn. The small eurozone investment fund envisaged by Mr. Macron and Ms. Merkel would be largely symbolic.

Ultimately the euro’s defense rests on the unwillingness of ordinary European voters to see their savings and livelihoods decimated in the chaos of a breakup, says Jacob Funk Kirkegaard, senior fellow at the Peterson Institute for International Economics in Washington.

“This is the key lesson of the Greek crisis,” he says. “Leaving the euro perhaps isn’t impossible, but the costs are so catastrophic that, politically, it’s unbearable.” That is why Greece turned back from the brink of exit in 2015.

“And Italy—a richer country with high savings that’s more deeply integrated into the European economy—has so much more to lose than Greece.”

What Will Trigger the Next Crisis?

Potential threats include bad loans, a euro exodus, China’s debt levels, earthquakes


The person who predicts the next financial crisis, and there will be at least one, should get credit for luck rather than forecasting skill. A decade of extraordinarily low interest rates has created multiple distortions in the global economy and financial system. Any of those can unwind painfully, but predicting what would trigger a global downturn is near impossible. Five columnists from Heard on the Street give it a try.

Interest Rates Jump

The biggest distortion in global markets is also the most important. Rock-bottom interest rates, driven by central banks to allow economies to heal, have encouraged risk-taking. Barring a downturn, interest rates will rise in most of the world, and rising interest rates always expose cracks in the financial system.

That was clear in February when a slight uptick in U.S. inflation expectations sent rates higher and ultimately caused the implosion of a multibillion-dollar fund that bet against market volatility.

Higher rates have typically pushed down stocks and commodities. In a crisis scenario, that would be just the beginning. The losses could be magnified by leverage and lead to higher defaults on public and private corporate debt, which could trigger capital flight and currency depreciations in emerging markets. The result would be an economic slowdown that could make each of these problems worse.

Interest rates globally have never been this low for this long. A sustained increase puts the global financial system in uncharted territory.

—Aaron Back

Bad-Loan Boom

The flip side to low interest rates has been a chase for yield that has driven investors to embrace riskier bonds. The result has been a boom in corporate borrowing and a race to the bottom both for high-quality and junk-rated companies. This drop in credit quality means that when the economy slows, losses are likely to be worse than in previous downturns.

Companies rated BBB, the lowest investment-grade rating, now account for almost half of all U.S. investment-grade corporate bonds by value, the highest share in more than 15 years. Among high-yield borrowers in bond and loan markets, new issuance from companies rated at the lowest B grade also makes up a record share of new volume.

That raises the risk of significant losses for credit investors—from pension funds and insurers to mutual funds, ETFs and banks. The downturn could be made worse because banks are less able to trade debt than in the past, leaving some investors unable to sell their holdings. 

—Paul J. Davies

Italy Dumps the Euro

The recovery of the eurozone from the 2012 bond crisis has been predicated on Italy, Spain and Portugal pledging to meet the European Union’s budget rules, and Germany overlooking the fact they haven’t. As long as countries stick with the project, markets have rightly ignored debts and deficits.

But Italy may be wavering. Italian bond yields spiked in May after two parties with anti-euro leanings tried to form a new government. The crisis could escalate again once politicians return from holidays. Some 59% of Italians want to keep the common currency, official surveys show—the slimmest majority in the eurozone.

If Italy left the euro, Italian banks would face runs on their deposits and would be crushed—sovereign debt accounts for 9% of their assets. This would paralyze lending and the economy. The shock waves would ripple abroad: Foreigners own 36% of Italian government debt. Markets would lose faith in Spain and Portugal’s debt and in eurozone banks outside of Italy, which own $140 billion of Italian debt. This would weigh on the eurozone, one of the world’s three main economic engines.

—Jon Sindreu

China Cracks

China and the U.S. are the two other big global economies. When external demand weakens, China pumps up investment at home, as it did in 2009. Its growth has been powered in part by one of the fastest buildups of debt in history by a major country. That would worsen any financial crisis at home.

China has so far weathered the dual threats of trade war and a rising dollar well. The most likely causes for a China-triggered global crisis would be a real-estate crash or rolling defaults by local government-owned fundraising vehicles, severely damaging bank balance sheets, tanking investment and driving big capital outflows.

Abroad, a weakening of China’s economic might would drive down prices for commodities and the value of many emerging-market currencies, prompting widespread dollar-bond defaults which could damage Western lenders. Chinese capital fleeing the country would sharply drive up the dollar. Slower emerging market growth would hit U.S. and European exporters.

—Nathaniel Taplin

Supply-Chain Disruptions

Catastrophe can strike at any moment—and quickly reverberate. Decades of globalization and technological progress have made it easy for the fallout of natural or man-made disasters to become global.  

Consider a handful of isolated events with outsize impact. In 2011, flooding in Thailand slowed the global supply chain for personal computers as makers of hard drives were shut down. Last year, a computer virus crippled the fleet of A.P. Moeller-Maersk, the world’s largest container-shipping firm. The 2011 earthquake and tsunami in Japan caused auto plants world-wide to shut down as key components became unavailable. In 2010, the eruption of an Icelandic volcano caused the cancellation of over 100,000 flights, affecting about 10 million passengers.

And in 2016, a construction accident cut off a third of the U.S. East Coast’s gasoline and jet-fuel supply, sparking price spikes and shortages.

Unpredictable weather and rogue nations could do far worse, enough to bankrupt major companies or spark a recession.

—Spencer Jakab


The Scientist Who Scrambled Darwin’s Tree of Life

How the microbiologist Carl Woese fundamentally changed the way we think about evolution and the origins of life.

By David Quammen

Credit: Illustration by Michael Houtz

On Nov. 3, 1977, a new scientific revolution was heralded to the world — but it came cryptically, in slightly confused form. The front page of that day’s New York Times carried a headline: “Scientists Discover a Form of Life That Predates Higher Organisms.” A photograph showed a man named Carl R. Woese, a microbiologist at the University of Illinois in Urbana, with his feet up on his office desk. He was 50ish, with unruly hair, wearing a sport shirt and Adidas sneakers. Behind him was a blackboard, on which was scrawled a simple treelike figure in chalk. The article, by a veteran Times reporter named Richard D. Lyons, began:

Scientists studying the evolution of primitive organisms reported today the existence of a separate form of life that is hard to find in nature. They described it as a “third kingdom” of living material, composed of ancestral cells that abhor oxygen, digest carbon dioxide and produce methane.

This “separate form of life” would become known as the archaea, reflecting the impression that these organisms were primitive, primordial, especially old. They were single-celled creatures, simple in structure, with no cell nucleus. Through a microscope, they looked like bacteria, and they had been mistaken for bacteria by all earlier microbiologists. They lived in extreme environments, at least some of them — hot springs, salty lakes, sewage — and some had unusual metabolic habits, such as metabolizing without oxygen and, as the Times account said, producing methane.

But these archaea, these whatevers, were drastically unlike bacteria if you looked at their DNA, which is what (indirectly) Woese had done. They lacked certain bits that characterized all bacteria, and they contained other bits that shouldn’t have been present. They constituted a “third kingdom” of living creatures because they fit within neither of the existing two, the bacterial kingdom (bacteria) and the kingdom of everything else (eukarya), including animals and plants, amoebas and fungi, you and me.

Charles Darwin himself suggested (first in an early notebook, later in “On the Origin of Species”) that the history of life could be drawn as a tree — all creatures originating in a single trunk, then diverging into different lineages like major limbs, branches and twigs, with leaves of the canopy representing the multiplicity of living species. But if that simile was valid, then the prevailing tree of 1977, the orthodox image of life’s history, was wrong. It showed two major limbs arising from the trunk. According to what Woese had just announced to the world, it ought to show three. 

Woese was a rebel researcher, obscure but ingenious, crotchety, driven. He had his Warholian 15 minutes of fame on the front page of The Times, and then disappeared back into his lab in Urbana, scarcely touched by popular limelight throughout the remaining 35 years of his career. But he is the most important biologist of the 20th century that you’ve never heard of. He asked profound questions that few other scientists had asked. He created a method — clumsy and dangerous, but effective — for answering those questions. And in the process, he effectively founded a new branch of science.

It began with a casual suggestion made to Woese by Francis Crick, the co-discoverer of DNA’s structure, who mentioned passingly in a scientific paper that certain long molecules in living creatures, because they are built of multiple small units, coded in sequences that change gradually over time, could serve as signatures of the relatedness between one form of life and another. The more similar the sequence, the closer the relative. In other words, comparing such molecules could reveal phylogeny. The new branch of science is called molecular phylogenetics. Wrinkle your nose at that fancy phrase, if you will, and I’ll wrinkle with you, but in fact what it means is fairly simple: reading the ancient history of life from the different sequences built into such molecules. The molecules mainly in question were DNA, RNA and a few select proteins. Carried far beyond Woese and his lab, these efforts have brought unexpected and unimaginable discoveries, fundamentally reshaping what we think we know about life’s history, the process of evolution and the functional parts of living beings, including ourselves.
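The core idea, that more similar sequences imply closer relatives, can be illustrated with a toy calculation. The sequences below are invented for illustration; Woese's actual work compared fragments of ribosomal RNA, and real phylogenetics uses far more sophisticated distance models.

```python
# Toy illustration of the principle behind molecular phylogenetics:
# the fraction of sites at which two aligned sequences differ (the
# "p-distance") serves as a rough proxy for evolutionary distance.
# The sequences below are invented; real studies used ribosomal RNA.

def p_distance(a, b):
    """Fraction of differing sites between two equal-length aligned sequences."""
    if len(a) != len(b):
        raise ValueError("sequences must be aligned to equal length")
    return sum(x != y for x, y in zip(a, b)) / len(a)

seq_a = "AGGCUUACGAUU"   # hypothetical organism A
seq_b = "AGGCUAACGCUU"   # hypothetical organism B
seq_c = "ACGCAAUCGCAA"   # hypothetical organism C

# A is a closer relative of B than of C: fewer differing sites.
assert p_distance(seq_a, seq_b) < p_distance(seq_a, seq_c)
```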

Woese vanished into his lab, but his insights and methods, and his successors in applying them, have produced in particular one cardinal revelation: The tree of life is not a tree. That old metaphor is obsolete. Life’s history has been far more tangled.

The idea of a “tree of life,” variously construed, goes back a long way in Western thinking — to the Book of Revelation, for instance, wherein the image of the tree seems to represent Christ, with his leafy and fruity blessings for the world. In 1801 the French botanist Augustin Augier used a tree as a kind of chart, for bringing order to the diversity of plants. He clustered major groups together on limbs and depicted minor groups as leaves at the ends of smaller branches. This wasn’t evolutionary thinking; it was just data management.

That simple, pragmatic use of the arboreal metaphor changed profoundly in 1837, when young Charles Darwin, just back from the Beagle voyage and scribbling reckless thoughts in a notebook, drew a small sketch of the first evolutionary tree. Above it he wrote: “I think.” This tree was hypothetical, its branches labeled with letters, not actual species, but what it meant to Darwin was: I think all creatures have arisen from a single source, diverging and changing somehow over time. He didn’t yet have a theory of the evolutionary process — the concept of natural selection would come later — but his sketch at least gave him an image of evolutionary history and its results. From that he could work backward, attempting to deduce the mechanism. 

He took his time and refined his ideas, working secretly. Twenty-two years later, finally announcing his theory in “On the Origin of Species,” Darwin wrote: “The affinities of all the beings of the same class have sometimes been represented by a great tree. I believe this simile largely speaks the truth.” But there was a big difference between his tree and Augier’s, or anyone else’s: His implied common origins (in the trunk), descent with modification (in the limbs) and adaptation by evolutionary change (in the twigs and leaves).

That image, the tree, defined the shape of evolutionary thinking from Darwin’s time until the 1990s, when new discoveries following from Woese’s rebel initiative suggested that it was inadequate. Among the most basic elements of the tree figure is continuous divergence, only divergence, through the passage of time and the lineages of creatures. Limbs never converge, never fuse. Those arboreal realities fit the canonical belief that genes flow only vertically, from parents to offspring, and can’t be traded sideways across species boundaries. What made Woese the foremost challenger and modifier of Darwinian orthodoxy — as Einstein was to Newtonian orthodoxy — is that his work led to recognition that the tree’s cardinal premise is wrong. Branches do sometimes fuse. Limbs do sometimes converge. The scientific term for this phenomenon is horizontal gene transfer (H.G.T.). DNA itself can indeed move sideways, between limbs, across barriers, from one kind of creature into another.

Those were just two of three big surprises that flowed from the work and the influence of Woese — the existence of the archaea (that third kingdom of life) and the prevalence of H.G.T. (sideways heredity). The third big surprise is a revelation, or anyway a strong likelihood, about our own deepest ancestry. We ourselves — we humans — probably come from creatures that, as recently as 41 years ago, were not known to exist. How so? Because the latest news on archaea is that all animals, all plants, all fungi and all other complex creatures composed of cells bearing DNA within nuclei — that list includes us — may have descended from these odd, ancient microbes. Our limb, eukarya, seems to branch off the limb labeled archaea. The cells that compose our human bodies are now known to resemble, in telling ways, the cells of one group of archaea known as the Lokiarcheota, recently discovered in marine ooze almost 11,000 feet deep, between Norway and Greenland, near an ocean-bottom hydrothermal vent. It’s a little like learning, with a jolt, that your great-great-great-grandfather came not from Lithuania but from Mars.

Woese with an RNA model at G.E. in 1961. Credit: Associated Press

We are not precisely who we thought we were. We are composite creatures, and our ancestry seems to arise from a dark zone of the living world, a group of creatures about which science, until recent decades, was ignorant. Evolution is trickier, far more complicated, than we realized. The tree of life is more tangled. Genes don’t just move vertically. They can also pass laterally across species boundaries, across wider gaps, even between different kingdoms of life, and some have come sideways into our own lineage — the primate lineage — from unsuspected, nonprimate sources. It’s the genetic equivalent of a blood transfusion or (to use a different metaphor preferred by some scientists) an infection that transforms identity. They called it “infective heredity.”

Such revelations, beginning in 1977 and continuing to break in the world’s leading scientific journals — but seldom explained to the general public — challenge us to adjust our basic understanding of who we humans are. You can blame it, if you want to blame someone, on the little white-haired man in Urbana, Ill.

Carl Woese was tangled himself. A proudly independent soul, very private, he flouted some of the rules of scientific decorum, made enemies, ignored niceties, said what he thought, focused obsessively on his own research program to the exclusion of most other concerns and turned up discoveries that shook the pillars of biological thought. To his close friends he was an easy, funny guy, caustic but wry, with a love for jazz, a taste for beer and Scotch and an amateurish facility on piano. To his grad students and postdoctoral fellows and laboratory assistants, most of them, he was a good boss and an inspirational mentor, sometimes (but not always) generous, wise and caring. As a teacher in the narrower sense — a professor of microbiology — he was almost nonexistent as far as undergraduates were concerned. He didn’t stand before large banks of eager, clueless students, patiently explaining the ABCs of bacteria. Lecturing wasn’t his strength, or his interest, and he lacked eloquent forcefulness even when presenting his work at scientific meetings. He didn’t like meetings. He didn’t like travel. He didn’t create a joyous, collegial culture within his lab, hosting seminars and Christmas parties to be captured in group photos, as many senior scientists do. He had his chosen young friends, and some of them remember good times, laughter, beery barbecues at the Woese home, just a short walk from the university campus. But those friends were the select few who, somehow, by charm or by luck, had gotten through his shell.
By 1969, at age 41, Woese was a tenured but unexceptional professor at the University of Illinois in Urbana. On June 24 that year, he wrote a revealing letter to Francis Crick in Cambridge, England. He had struck up an acquaintance with Crick about eight years earlier, while Woese was working at the General Electric Research Laboratory in Schenectady, N.Y., as an unguided biophysicist not quite sure what his employers wanted from him. Crick was already world renowned for the co-discovery, with James Watson, of DNA’s structure, but he hadn’t yet won his share of the Nobel Prize. The Woese-Crick interaction began as a tenuous exchange of courtesies through the mail — Woese requesting, and receiving, a reprint of one of Crick’s papers on genetic coding — but by 1969 they were friendly enough that he could be more personal and ask a larger favor. “Dear Francis,” he wrote, “I’m about to make what for me is an important and nearly irreversible decision,” adding that he would be grateful for Crick’s thoughts and his moral support.

What he hoped to do, Woese confided, was to “unravel the course of events” leading to the origin of the simplest cells — the cells that microbiologists called prokaryotes, by which they meant bacteria. Eukaryotes constituted the other big category, and all forms of cellular life (that is, not including viruses) were classified as one or the other. Although bacteria are still around, still vastly successful, dominating many parts of the planet, they were thought in 1969 to be the closest living approximations of early life-forms. Investigating their origins, Woese told Crick, would require extending the current understanding of evolution “backward in time by a billion years or so,” to that point when cellular life was just taking shape from ... something else, something unknown and precellular.

Oh, just a billion years farther back? Woese was always an ambitious thinker. “There is a possibility, though not a certainty,” he told Crick, “that this can be done by using the cell’s ‘internal fossil record.’ ” What he meant by “internal fossil record” was the evidence of long molecules, the linear sequences of units in DNA, RNA and proteins. Comparing such sequences — variations on the same molecule, as seen in different creatures — would allow him to deduce the “ancient ancestor sequences” from which those molecules, in one lineage and another, had diverged. And from such deductions, such ancestral forms, Woese hoped to glean some understanding of how creatures evolved in the very deep past. He was talking about molecular phylogenetics, without yet using that phrase, and he hoped by this technique to look back at least three billion years.
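The scheme Woese described to Crick — counting the differences between homologous sequences and letting the closest ones cluster together first — is the root logic of molecular phylogenetics. A toy sketch of that logic, with invented 12-letter "fragments" and invented organism names (this is an illustration of the general idea, not Woese's data or his actual method):

```python
from itertools import combinations

def diff(a, b):
    """Fraction of aligned positions that differ (a toy distance)."""
    return sum(x != y for x, y in zip(a, b)) / len(a)

def upgma(seqs):
    """Minimal UPGMA-style clustering: repeatedly merge the two closest
    clusters (by average pairwise distance) until one tree remains.
    Each cluster is represented as a nested tuple of sequence names."""
    clusters = {(name,): [name] for name in seqs}
    while len(clusters) > 1:
        def avg(pair):
            a, b = pair
            return (sum(diff(seqs[x], seqs[y])
                        for x in clusters[a] for y in clusters[b])
                    / (len(clusters[a]) * len(clusters[b])))
        t1, t2 = min(combinations(clusters, 2), key=avg)
        clusters[(t1, t2)] = clusters.pop(t1) + clusters.pop(t2)
    return next(iter(clusters))

# Invented 12-letter "rRNA fragments" for four imaginary organisms:
seqs = {
    "bact1": "GAUCGGAACUUC",
    "bact2": "GAUCGGAACUUU",
    "arch1": "GACCGGAGCUAC",
    "euk1":  "AACCGGAGCAAC",
}
tree = upgma(seqs)
print(tree)  # the two near-identical "bacteria" pair off first
```

The nested tuples record the order of merging: whichever pair diverged least recently sits deepest in the nesting, which is exactly the "ancient ancestor" inference Woese had in mind, writ very small.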

But which molecules would be the most telling? Which would represent the best “internal fossil record” of living cells?

Woese had in mind a tiny molecular mechanism, common to all forms of cellular life, called the ribosome. Nearly every cell contains ribosomes in abundance, like flakes of pepper in a stew, and they stay busy with the task of translating genetic information into proteins. Hemoglobin, for instance. That crucial protein transports oxygen through the blood of vertebrate animals. Architectural instructions for building hemoglobin molecules are encoded in the DNA of the animal, but where is hemoglobin actually produced? In the ribosomes. They are the core elements of what Woese called the translation apparatus. In plain words: Ribosomes turn genes into living bodies.

These particles had only recently been discovered, and at first no one knew what they did. Then they became recognized as the sites where proteins are built, but a big question remained: How? Some researchers suspected that ribosomes might actually contain the recipes for proteins, extruding them as an almost autonomous process. That notion collapsed in 1960, almost in a single flash of insight, when another of Crick’s brilliant colleagues, Sydney Brenner, during a lively meeting at Cambridge University, hit upon a better idea. Matt Ridley has described the moment in his biography of Crick:

Then suddenly Brenner let out a “yelp.” He began talking fast. Crick began talking back just as fast. Everybody else in the room watched in amazement. Brenner had seen the answer, and Crick had seen him see it. The ribosome did not contain the recipe for the protein; it was a tape reader. It could make any protein so long as it was fed the right tape of “messenger” RNA.

This was back in the days before digital recording, remember, when sound was recorded on magnetic tape. The “tape” in Brenner’s analogy was a strand of RNA — that particular sort called messenger RNA, because it carries messages from the cell’s DNA genome to the ribosomes, telling them which amino acids to assemble into a specified protein. Because the proteins they produce become three-dimensional molecules, a better metaphor than Brenner’s tape reader, for our own day, might be this: The ribosome is a 3-D printer.
Ribosomes are among the smallest of structures within a cell, but what they lack in size they make up for in abundance and consequence. A single mammalian cell might contain as many as 10 million ribosomes; a single cell of the bacterium Escherichia coli, or E. coli, might get by with just tens of thousands. Each ribosome might crank out protein at the rate of about 20 amino acids (the constituent units of all proteins) per second, altogether producing a sizzle of constructive activity within the cell. And this activity, because it’s so basic to life itself, life in all forms, has presumably been going on for almost four billion years. Few people, in 1969, saw the implications of that ancient, universal role of ribosomes more keenly than Carl Woese. What he saw was that these little flecks — or some molecule within them — might contain evidence about how life worked, and how it diversified, at the very beginning.
“What I propose to do is not elegant science by my definition,” he confided to Crick. Scientific elegance lay in generating the minimum of data needed to answer a question. His approach would be more of a slog. He would need a large laboratory, set up for reading at least portions of the ribosomal RNA. That itself was a stretch, at the time. (The sequencing of very long molecules — DNA, RNA or proteins — is so easily done nowadays, so elegantly automated, that we can scarcely appreciate the challenge Woese faced.) Back in 1969, Woese couldn’t hope to sequence the entirety of a long molecule, let alone a whole genome. He could expect only glimpses, short excerpts, read from fragments of ribosomal RNA molecules, and even that much could be achieved only laboriously, at great cost in time and effort. He planned to sequence what he could, from one creature and another, and then make comparisons, working backward to an inferred view of life in its earliest forms and dynamics. Ribosomal RNA would be his rabbit hole to the beginning of evolution.

In the handful of years following his letter to Crick, Woese developed a unique methodology for this task, limning life’s history by way of the “internal fossil record” within living cells. The mechanics were intricate, laborious and a little spooky. They involved explosive liquids, high voltages, radioactive phosphorus, at least one form of pathogenic bacteria and a loosely improvised set of safety procedures. Courageous young grad students, postdocs and technical assistants, under a driven leader, were pushing their science toward points where no one had gone before. OSHA, though recently founded, was none the wiser.
Woese had already settled on that one universal element of cellular anatomy, the ribosome, as the locus of his internal fossil record. But there remained a crucial decision: Which ribosomal molecule should he study? He settled on a longish molecule that serves as a structural component in one of the two ribosomal subunits. Its shorthand label is 16S rRNA. In English we say “16S ribosomal RNA.”
Mitchell Sogin worked in Woese’s lab, as a grad student and chief technical assistant, during these crucial years leading to the archaea discovery. He had come to the University of Illinois planning to do pre-med, shifted his focus, stayed for a master’s degree in “industrial microbiology” (essentially food-preservation and fermentation technology), and then drifted into Woese’s ambit because of shared interests in deeper questions. Woese noticed something about Sogin during their early interactions: The kid was not just smart but also handy around equipment. Some combination of talents — dexterity, mechanical aptitude, precision, patience, a bit of the plumber, a bit of the electrician — made him good not just at experimental work but at creating the tools for such work.

Another professor had ordered and paid for a collection of apparatus to be used for RNA sequencing, then accepted a position at Columbia University, leaving behind the hardware. “So Carl inherited that equipment, but he had no one that knew how to use it,” Sogin told me, in his office at the Marine Biological Laboratory in Woods Hole, Mass., almost 50 years later. No one who knew how to use the equipment, that is, until Sogin joined his lab. Sogin learned as much as possible about how to operate these tools, then became Woese’s handyman as well as his doctoral student, assembling and maintaining an array of paraphernalia to enable the sequencing of ribosomal RNA.
Woese himself was not an experimentalist. He was a theorist, a thinker, like Francis Crick. “He never used any of the equipment in his own lab,” Sogin said. None of it — unless you count the light boxes for reading film images of RNA fragments, the shorter pieces of the molecule once Sogin had used enzymes to cut them into workable bits. Sogin himself built these fluorescent light boxes, on which the images of the fragments, cast by radioactive phosphorus onto large X-ray films, could be examined. He converted an entire wall of bookshelves, using translucent plastic sheeting and fluorescent tubes, into a single big vertical light box, like a bulletin board. They called that one the light board. Viewed over a box or taped up on the light board, every new film would show a pattern of dark ovals, like a herd of giant amoebas racing across a bright plain. This was the fingerprint of an RNA molecule. Recollections from his lab members at the time, as well as a few old photographs, portray Woese gazing intently at those fingerprints, hour upon hour.

“It was routine work, boring, but demanding full concentration,” Woese himself later recalled. Each spot represented a small string of RNA letters — like the letters of the DNA code, A, C, G, T, but with U replacing the T. The shortest useful fragments were at least four letters long, and no more than about 20. Each film, each fingerprint, represented ribosomal RNA from a different creature. The sum of the patterns, taking form in Woese’s brain, represented a new draft of the tree of life.
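Woese and Fox boiled each pairwise comparison of these fragment catalogs down to a single number, an association coefficient S_AB = 2·N_AB / (N_A + N_B): counting the fragments in each organism's catalog and the fragments shared between the two. A minimal sketch of that arithmetic, with invented catalogs and each fragment counted equally (the published work's exact counting and length-weighting conventions are simplified here):

```python
def s_ab(catalog_a, catalog_b, min_len=6):
    """Association coefficient S_AB = 2*N_AB / (N_A + N_B).
    N_A and N_B count the fragments of at least min_len letters in each
    catalog; N_AB counts those common to both. Identical catalogs give
    1.0; catalogs sharing nothing give 0.0."""
    a = {f for f in catalog_a if len(f) >= min_len}
    b = {f for f in catalog_b if len(f) >= min_len}
    return 2 * len(a & b) / (len(a) + len(b))

# Invented fragment catalogs for two hypothetical organisms
# ("UUCGA" is below the length cutoff and is ignored):
org_x = {"AUCGAG", "GGAUCCA", "UUCGA", "CCGUAGG"}
org_y = {"AUCGAG", "GGAUCCA", "AAGGCU"}
print(round(s_ab(org_x, org_y), 3))  # 2 shared of 3 + 3 -> 0.667
```

A table of such coefficients over every pair of catalogued organisms was the raw material for a tree; an organism whose catalog scored anomalously low against everything already known was the kind of signal that could mark it as belonging to neither familiar kingdom.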

The work was deceptively perilous. Sogin described to me the deliveries of radioactive phosphorus (an isotope designated as P32, with a half-life of 14 days), which amounted to a sizable quantity arriving every other Monday. The P32 came as liquid within a lead “pig,” a shipping container designed to protect the shipper, though not whoever opened it. Sogin would draw out a measured amount of the liquid and add it to whatever bacterial culture he intended to process next. “I was growing stuff with P32,” he said, tossing that off as a casual memory. “It was crazy. I don’t know why I’m alive today.”

By 1973, the Woese lab had become one of the foremost users of such RNA-sequencing technology in the world. While the grad students and technicians produced fingerprints, Woese spent his time staring at the spots. Was this effort tedious in practice as well as profound in its potential results? Yes. “There were days,” he wrote later, “when I’d walk home from work saying to myself, ‘Woese, you have destroyed your mind again today.’ ”

George Fox, a rangy young man from Syracuse, came to Urbana in 1973 for a postdoctoral position in Woese’s lab. Fox was not a natural experimentalist and had aspirations to work on the “theoretical stuff,” the deep evolutionary analysis of molecular data, alongside Woese himself. Failing initially to persuade Woese of his aptitude for that, Fox was banished back to the lab, set to the tasks of growing radioactive cells and extracting their ribosomal RNA. But he continued, in flashes, to show his value to Woese as a thinker. Gradually he proved himself, not just sufficiently to work on sequence comparisons but well enough to become Woese’s trusted partner, and sole co-author, on the culminating paper in 1977, with its announcement of a “third kingdom” of life.

The paper announcing that revelation, now considered to be among the most important works ever published in microbiology, is known in the professional shorthand as “Woese and Fox (1977).” But the paper’s immediate reception, by the community of biologists who worked on such subjects, was far from universally admiring. Part of the problem was a matter of scientific protocol: Woese’s discovery had been announced in a news release — issued from NASA, one of his grant sources — just as the paper itself appeared. That offended some scientists. Another factor was that Woese lacked facility as an explainer. He had never developed the skills to give a good lecture. He stood before audiences — when he did so at all, which wasn’t often — and thought deeply, groped for words, started and stopped, generally failing to inspire or persuade. Then suddenly that November, for a very few days, he had the world’s attention.

“When reporters called him up and tried to find out what this was all about,” according to Ralph Wolfe, a microbiologist and colleague, “he couldn’t communicate with them. Because they didn’t understand his vocabulary.” Wolfe helped with growing these organisms in the lab, though he wasn’t credited (or implicated, depending on your view of it) as a co-author on the controversial paper. “Finally he said, ‘This is a third form of life.’ Well, wow! Rockets took off, and they wrote the most unscientific nonsense you can imagine.” The Chicago Tribune, for instance, carried a dizzy headline asserting that “Martianlike Bugs May Be Oldest Life.” The news-release approach backfired, the popular news accounts overshadowed the careful scientific paper and many scientists who didn’t know Woese concluded, according to Wolfe, that “he was a nut.”
Wolfe himself heard from colleagues immediately. Among his phone calls on the morning of Nov. 3, 1977, he recollected in a personal essay, “the most civil and free of four-letter words” was from Salvador Luria, one of the early giants of molecular biology, a Nobel Prize winner in 1969 and a professor at Illinois during Wolfe’s earlier years, who called now from M.I.T., saying, “Ralph, you must dissociate yourself from this nonsense or you’re going to ruin your career!” Luria had seen the newspaper coverage but not yet read the scientific report, with its supporting data, to which Wolfe referred him. He never called back. But the broader damage was done. After Luria’s call and others, Wolfe added, “I wanted to crawl under something and hide.”
To me, during a chat in his office, Wolfe said: “We had a whole bunch of calls, all negative, people outraged at this nonsense. The scientific community just totally rejected the thing. As a result, this whole concept was set back by at least a decade or 15 years.”
Woese’s ideas eventually found purchase in Europe, and in time, scientists in the United States recognized him as well. In 1984 Woese received a MacArthur Fellowship for his efforts in phylogenetic analysis and his discovery of the archaea, and in 1988 he was elected to the National Academy of Sciences. Despite the MacArthur honor, and because the Academy had elected him relatively late (at 59), he still thought of himself as a neglected outsider. That gave him some latitude to continue being ambitious, bold and ornery. And he wanted to revisit the status of his beloved archaea.
Woese, around 1982. Credit: Charlie Vossbrinck

As an outlet for this work, Woese turned to the Proceedings of the National Academy of Sciences, a journal in which — as a member now of the Academy — he could be a bit more speculative than he would at Nature or Science. His next big paper, published in June 1990 with two co-authors and titled “Towards a Natural System of Organisms,” made several main assertions. First, any system of classification should be strictly “natural,” as the title suggested — meaning phylogenetic, reflecting evolutionary relationships. Second, there should be three major divisions of life, not two (the predominant view), not five (an alternative proposal, recognizing animals, plants, bacteria, fungi and a catchall group of other eukaryotes), and those divisions should be known as domains. Three domains, recognized above the old kingdoms rather than replacing them: It was ingenious strategically, transcending rather than rejoining the battle over kingdoms.

Last of the paper’s main points was that these three domains should henceforth be known as the Bacteria, the Eucarya (now known as Eukarya, a better transliteration of the Greek roots, meaning “true kernel,” because of the cell nucleus) and the Archaea. And of course there was a tree. It was drawn in straight, simple lines, but it was rich and provocative nonetheless.

It was the last of the great classical trees, authoritative, profound, completely new to science and correct to some degree. But it only served as a point of departure for what came next. 
The following decade saw an explosive recognition of the bizarre, counterintuitive phenomenon called horizontal gene transfer and the role it has played throughout the history of life. That explosion occurred during the 1990s but had deep precedents, even before Woese’s work opened the door to appreciating its unimaginable prevalence and significance.
The first recognition by science that any such thing as H.G.T. might be possible dates to 1928, when an English medical researcher named Fred Griffith first detected a puzzling transformation among the bacteria that cause pneumococcal pneumonia: one strain changing suddenly into another strain, presto, from harmless to deadly virulent. At the Rockefeller Institute in New York during the 1940s, Oswald Avery and two colleagues identified the “transforming principle” in such instantaneous transmogrifications as naked DNA — that is, genes, moving sideways, from one strain of bacteria into another. To say that seemed odd is an understatement. Genes weren’t supposed to move sideways; they were supposed to move vertically, from parents to offspring — even when the “parents” were bacteria, reproducing by fission. But by 1953, the great Joshua Lederberg, then at the University of Wisconsin, had shown that this sort of transformation, relabeled “infective heredity,” is a routine and important process in bacteria. Still more unexpectedly, as later work would reveal, H.G.T. is not unique to bacteria.
Slowly at first, during the 1980s and early 1990s, H.G.T. became a favored research focus in more than a few labs. Many researchers had followed Woese’s lead, using ribosomal RNA as the basis for comparing one organism with another, judging relatedness and constructing trees of life. But then, as new tools and methods made gene sequencing easier and faster, and as more powerful computers allowed analysis of the vast troves of genomic data, researchers went far beyond 16S rRNA, comparing other genes and whole genomes. What they found surprised them: that many genes had moved sideways from one lineage of life into another. Such a gene might be absent from most living species within a group (say, a family of butterfly species), implying that it was absent too from the common ancestral form, but it might show up unexpectedly in one species of butterfly in that family, closely matching a gene that exists only in another kind of creature (say, a bacterium), classified to an entirely different part of the tree of life. How could that happen? If the gene was absent from the common ancestor, it couldn’t have gotten into the anomalous butterfly species by vertical descent.

Researchers have identified three primary mechanisms by which H.G.T. occurs, each of which has a formalized label: conjugation, transformation and transduction. Conjugation is sometimes loosely called “bacterial sex.” It occurs when two individual bacteria (they needn’t be of the same species) form a copulation-like connection, and a segment of DNA passes from one to the other. (It isn’t really bacterial sex, because it involves gene exchange but not reproduction.) Transformation is what Fred Griffith noticed in 1928: uptake of naked DNA, left floating in the environment after the rupture of some living cell, by another living cell (again, not necessarily of the same species). Transduction is a sort of drag-and-drop trick performed by viruses, picking up bits of DNA from cells they infect, then dropping those DNA bits later within other infected cells, where they may become incorporated into the genomes.

Conjugation was known to be widespread and common among bacteria. H.G.T. by transformation and transduction could potentially occur among other creatures too, even eukaryotes — even animals and plants — though that prospect was far more uncertain and startling, into the 1990s and beyond. Then improved genome sequencing and closer scrutiny brought more surprises. A bacterium had sent bits of its DNA into the nuclear genomes of infected plants. How was that possible? A species of sea urchin seemed to have shared one of its genes with a very different species of sea urchin, from which its lineage diverged millions of years earlier. That was a stretch. Still another bacterium, the familiar E. coli, transferred DNA into brewer’s yeast, which is a fungus. Brewer’s yeast is microbial, a relatively simple little creature, but nonetheless eukaryotic. This mixing of fungal host and bacterial genes happened via a smooching process that looked much like bacterial conjugation, the researchers reported, and “could be evolutionarily significant in promoting trans-kingdom genetic exchange.” Trans-kingdom is a long way for a gene to go.

New investigations, as time passed and improvements in gene-sequencing technology made more complete genomes available, showed that far more radical leaps were happening, and not infrequently. For instance: There’s a peculiar group of tiny animals known as rotifers, once studied only by invertebrate zoologists but now notable throughout molecular biology for their “massive” uploads of alien genes. Rotifers are homely beyond imagining. They live in water, mainly freshwater, and in moist environments such as soils and mosses, rain gutters and sewage-treatment tanks. Some species favor harsh, changeable environments that sometimes go dry, and their individuals reproduce without sex. Despite the absence of sexual recombination, which shuffles the genetic deck in a population and offers new combinations of genes, these rotifers have managed to find newness by other means. One means is horizontal gene transfer. Three researchers at Harvard and Woods Hole sequenced portions of the genome of a certain rotifer and found all sorts of craziness that shouldn’t have been there. More specifically, they found at least 22 genes from other creatures, most of which, they concluded, must have arrived by H.G.T. Some of those were bacterial genes, some were fungal. One gene had come from a plant. At least a few of those genes were still functional, producing enzymes or other products useful to the rotifer.

Some of these individual cases were later challenged, but the trend of discoveries held. H.G.T. also started showing up among insects. Again this was supposed to be impossible. There were fervent doubters. Alien genes cannot move from one species to another, they insisted. The germ line of animals, meaning the eggs and the sperm and the reproductive cells that give rise to them, is held separate from such influences. It’s sequestered behind what biologists call the Weismann barrier, named for August Weismann, the German biologist of the 19th and early 20th centuries who defined the concept. Bacteria cannot cross that barricade, the Weismann barrier — so said the skeptical view — to insert bits of their own DNA into animal genomes. Impossible. But again it turned out to be possible.
Beyond the realm of insects and rotifers, evidence of H.G.T. has even been found in mammals — an opossum from South America, a tenrec from Madagascar, a frog from West Africa, all carrying long sections of similar DNA that seem to have come to them sideways, by some sort of infection. “Infective heredity” again. Even the human genome has been laterally invaded. Its sequencing has revealed the boggling reality that 8 percent of our human genome consists of viral DNA inserted sideways into our lineage by retroviruses. Some of those viral genes, as illuminated by a French scientist named Thierry Heidmann and his colleagues, have even been co-opted to function in human physiology, such as creating an essential layer between the placenta and the fetus during pregnancy. 

These and other discoveries of H.G.T. had an impact on evolutionary thinking. One form of that impact, like the blade of an ax, was on the very idea of the tree — and particularly on the tree as Woese had drawn it, using ribosomal RNA as the definitive signal of life’s ever-diverging history. Other researchers began offering other images of evolutionary history, other “trees,” some of them not very treelike, that took account of H.G.T. and represented those entanglements of evolutionary history. Among the most vivid was one drawn by Ford Doolittle, an American biologist at Dalhousie University in Halifax, Nova Scotia, who had known Woese during Doolittle’s years as a postdoc in Urbana. In a review article for Science in 1999, Doolittle offered his own hand-drawn cartoon as an alternative to Woese’s three-limb tree. Doolittle called his “a reticulated tree,” but it suggested also a tangle of pipes in someone’s basement, set in place by a manic plumber.
Maybe, Doolittle said in his text, as well as with his drawing, the history of life just can’t be shown as a tree.
Ford Doolittle’s reticulated tree. Credit: From Ford Doolittle and the A.A.A.S.

Carl Woese, as his research career ended, assumed his new role as a much-honored but cranky elder, with strong opinions. He collected kudos, and he wrote. Having already received a MacArthur and an award from the National Academy of Sciences and the Leeuwenhoek Medal (microbiology’s highest honor) from the Royal Netherlands Academy of Arts and Sciences, in 2000 he was announced as a winner of the National Medal of Science, bestowed by the president of the United States with advice from scientific counselors. Woese declined to attend the event in Washington because, according to a friend, he didn’t want to shake Bill Clinton’s hand. In 2003 came the Crafoord Prize, given by the Royal Swedish Academy of Sciences as a complement to the Nobel Prizes and presented by Sweden’s king. Woese hated travel, but he did go to Stockholm for that event and had no scruples about shaking the hand of King Carl XVI Gustaf. The Crafoord was gratifying, but he seems to have yearned for more. Woese had been nominated for a Nobel, but maybe his discovery of the archaea seemed a little too obscure, and maybe he just didn’t live long enough.

Woese had a sort of bifurcated brain, one of his oldest friends, Larry Gold, told me. Gold, now a distinguished molecular biologist and biotech entrepreneur, knew Woese from the early days in Schenectady when they both worked for G.E., and remained close to him through the years. On one side of Woese, he said, was this great depth of learning — mostly acquired by self-instruction, not formal training — and a relentless questioning. Woese had trained at Yale as a biophysicist, Gold reminded me, not a biologist. “He didn’t know any biology. He knew less biology by the time he died than I know,” Gold said self-deprecatingly. “That’s a terrible thing to say. But he didn’t really think about biology. He was thinking about what happened 3.5 billion years ago. That’s not biology.” It’s more a gumbo of physics and molecular evolution and geology, Gold meant. But the deep history, going back those billions of years, lay at the core of understanding evolution, as Woese tried to do it.

One year after the Crafoord Prize, in 2004, he published another of his big, ambitious treatises. This appeared not in Nature or Science but in a narrower journal, Microbiology and Molecular Biology Reviews, the editors of which allowed him 14 pages to vent. It was an appropriate outlet, not just a spacious one, because he wanted to tell the field of molecular biology just what he thought of it. He wanted to piss in the punch bowl. 
He titled this essay “A New Biology for a New Century.” His central point was that molecular biology had strayed from its early promise and declined to “an engineering discipline.” By that he meant it had come to concern itself with applications, such as genetic modification of organisms for agriculture or environmental remediation, and the concerns of human health, rather than the fundamental questions about how life had arisen, become complex and evolved for billions of years. Worse, molecular biology took a “reductionist” perspective on what it saw as mechanistic problems, Woese argued, such as the workings of the gene and the cell. It lost sight of “the holistic problems” of evolution, life’s ultimate origins and the deepest mysteries of how life-forms became organized. It lost interest, or never had any, in the big story over four billion years.
“How else could one rationalize the strange claim,” Woese wrote, “by some of the world’s leading molecular biologists (among others) that the human genome (a medically inspired problem) is the ‘Holy Grail’ of biology? What a stunning example of a biology that operates from an engineering perspective, a biology that has no genuine guiding vision!” A science like that, intent on changing the living world without trying to understand it, he added, “is a danger to itself.”
No one ever accused Woese of pulling his punches. And as he got older, ever more pugnacious, he harbored an increasing disdain for Charles Darwin, distinct from but alongside his disdain for molecular biology. The Darwin animus had kindled within him as an off-and-on resentment of the distant figure with the big name. Part of it might have been substantive disagreement: Darwin himself, and the neo-Darwinian synthesis of ideas that became orthodoxy during the 20th century, saw evolutionary change as inherently incremental and gave little attention to the processes of inheritance, variation and reproduction as they occur among microbes, as opposed to animals and plants. Woese saw microbial evolution and (later in his life) H.G.T. as essential to understanding deep history, eons before the time, the threshold, when Darwin’s vision became relevant. Another part was probably sheer jealousy. He came to believe himself a more important, more profound and more revolutionary thinker than Darwin himself.
Woese was bitter and needy toward the end of his life. But he was also a great scientist, one of the greatest, if not Darwin’s peer as a visionary of evolution’s mysteries.

Among the essential points of the upheaval that Woese helped initiate are three counterintuitive insights, three challenges to categorical thinking about aspects of life on earth. The categoricals are these: species, individual, tree. Species: It’s a collective entity but a discrete one, like a club with a fixed membership list. The lines between this species and that one don’t blur. Individual: An organism is also discrete, with a unitary identity. There’s a brown dog named Rufus; there’s an elephant with extraordinary tusks; there’s a human known as Charles Robert Darwin. No mixing of identities compromises the oneness of an individual. Tree: Inheritance flows always vertically from ancestor to descendant, always branching and diverging, never converging. So the history of life is shaped like a tree.

Now we know, thanks to Carl Woese and those who have followed him, that each of those three categoricals is wrong.

In the early summer of 2012, while vacationing with his wife on Martha’s Vineyard, Woese became ill — an intestinal blockage that proved to be pancreatic cancer. He sent for his trusted administrative assistant and friend, Debbie Piper, to join him and his family. She flew into Boston, and he pleaded with her to rescue him from Massachusetts General Hospital and the mind-dulling medication he was given after surgery. He wanted clarity more than he wanted comfort. Piper and Woese’s daughter helped get him aboard a medical charter flight and back to Urbana.
That August, he consented to endure a series of video interviews for the historical record. Several friends came to town for that purpose, to assist in the questioning, and Woese did his best to respond, with halting reflections on his work, his discoveries, the science of his time.

Pale and manifestly uncomfortable, seated before bookshelves and an ivy plant, he spoke to the camera for seven hours spread across three days, laboring to remember facts and names, to express ideas, frustrated when he was unable to. There was so much that still needed saying. Now it was too late. He took long pauses. He blinked back his own mortality. At one point he said, “My memory serves badly, badly, badly.” The camera captured it all. At the time of his memorial service, months afterward, someone raised the idea of playing some of this video to bring his voice and image into the event.
Piper, when we spoke, recalled her reaction to that thought: “Oh, please don’t. Because he just looks and sounds like a sick old man.”

But inside the sick old man was a multiplicity of other realities. Some had arisen straight, and some had arrived sideways.

This article is adapted from “The Tangled Tree: A Radical New History of Life,” published by Simon & Schuster.