Modern Monetary Madness

By John Mauldin


More than 10 years ago, some Australian readers began regaling me with the ideas of economist Bill Mitchell of the University of Newcastle in New South Wales. He was teaching something he called Modern Monetary Theory (he coined the term). I looked into it and fairly quickly dismissed it as silly. Actually printing money as an economic policy? Get serious.

MMT is a revival of an early 1900s idea called chartalism. It is now influencing the thinking of new socialist-like movements in the US and elsewhere, and it is being cited by politicians. MMT is also increasingly appearing in mainstream media, like this sobering Financial Times article. Since it is being discussed in ever more public venues, you should know more about it, and that will be today’s topic.


Join me in Dallas. Now, on with our letter.


Modern Monetary Madness

Essentially, MMT espouses that the public, through the government, owns the process of money creation, and that the government, in addition to borrowing and taxing, should simply issue currency as payment for its obligations. This is not the sleight-of-hand that quantitative easing was. This is direct monetization in lieu of borrowing.

If that sounds like printing money, that’s because it is. Upfront and in-your-face as a serious economic proposal. Most of the time when I am talking with my fellow writers and economists, when somebody mentions MMT, everybody smiles, maybe chuckles, and shakes their heads. The problem is, what seems like a joke is actually getting traction.

Let’s get the official definition of MMT from Wikipedia. My comments are inserted in brackets.

In MMT, "vertical" money (money created by the government and spent in the private sector) enters circulation through government spending. Taxation and its legal tender power to discharge debt establish the fiat money as currency, giving it value by creating demand for it in the form of a private tax obligation that must be met. [And thus higher taxes create more demand for the currency and help to maintain the value thereof.]

In addition, fines, fees and licenses create demand for the currency. An ongoing tax obligation, in concert with private confidence and acceptance of the currency, maintains its value. Because the government can issue its own currency at will, MMT maintains that the level of taxation relative to government spending (the government's deficit spending or budget surplus) is in reality a policy tool that regulates inflation and unemployment, and not a means of funding the government's activities by itself. [The more you want the government to spend, the higher the taxes have to be in order to keep from creating inflation, or so the theory goes.]

Proponents argue that unemployment is caused by lack of demand and lack of demand is caused by insufficient money entering the private sector, a problem the government can solve by creating money and spending it in the private sector. Voilà, demand is created and unemployment goes down. Inflation? That can be controlled by higher taxes. Hey, it’s their theory. Don’t ask me to explain it.

Economists advising major presidential and congressional candidates on the progressive and even “moderate” left are more and more openly talking about MMT and its practical applications.

Pet Economists

I have said before that economists are the modern-day equivalent of shamans and priests. Rather than looking at sheep entrails, economists look at “data” and tell the politician (king, emperor, or chief…) what they want to hear. I have been in more than one meeting where a politician is clearly shopping for a rationale for something that they would like to propose and do. Any serious politician is going to have more than a few economic advisors attached in one form or another to their campaign.

Let me quickly state that I am not disparaging the role of economists when they act as political advisors. I have done that myself. It is actually one of the rationales for the discipline. Indeed, it would be strange if that were not the case.
 

90% of readers may wonder why we are even talking about this in a serious letter. The rest of you may tell me how I’m wrong and it really will work. Let me hasten to say that 10 years ago that second group was far less than 1% of readers. And the pushback is beginning to come from readers I recognize as fairly serious.

There are multiple and growing motivations and rationales for adopting MMT into one’s philosophical base.

Why should this be on your radar? Let me give you just a few scenarios…

Politicians are increasingly talking about “free stuff.” Free college, guaranteed basic income, more total healthcare paid for by the public, basic housing, and more. It is almost like there will be an auction to see who can promise the most free benefits, paid for by taxes on the rich. They will cite economic advisors who say it is completely doable and even necessary for the general welfare.

“The richest country in the history of rich countries can easily afford to spend more on its citizens ensuring basic income and wealth equality.” More or less a direct quote from several interviews. Forget mere income taxes. The new political ante will be a wealth tax.

That means these ideas will be increasingly promoted in the public space. More politicians will argue for increased spending and/or at least different spending priorities. Guns and butter.

Over the next few years this will enter the national mindset. An increasingly large group of voters, especially younger voters, will feel a natural affinity with the idealism. Why shouldn’t a rich nation help those who are less advantaged?

Then somewhere, while we are having this conversation, there will be a recession. Unemployment will rise and deficits will increase until we are on our way to a $30-trillion debt in just a few years. This will crowd out private investment, slowing whatever recovery there might be and making us vulnerable to a quick second recession, not unlike the back-to-back recessions of 1980 and 1981–82.

But it will also produce the potential for a true “change” election. The frustration noted among Trump voters will still be there, but it will also be shared by many on the left who will see the promises as a way to change things. It is hard to argue in the middle of financial crisis and recession that we don’t need change.

There won’t be a President Warren Harding, who essentially decided to do nothing in one of the deepest recessions/depressions in American history in 1920–21. In that case, severe austerity allowed markets to clear, and the recovery gave us the Roaring Twenties. Cause and effect? Numerous scholarly books have been written to suggest so.

But that will not be the case 100 years later as we face the 2020s. There will be an increasing drumbeat for “doing something.” Change will be the mantra.

It is not far-fetched to imagine a White House and Congress beginning to work around the principles of MMT, if not adopt it outright with sharply higher taxes and spending.

Now here’s where it gets a little bit murkier. The Federal Reserve, even if a new president could pack the board with members philosophically attuned to his or her desire to increase public spending through monetary creation, does not have the legal authority to directly create money. That is a right reserved strictly for the federal government, and specifically the US Treasury. The Treasury can issue all the debt into the private sector it wants. The Federal Reserve can then go into the private market and buy all the debt it wants, adding that debt to its balance sheet. That is called quantitative easing, and it is technically not the same thing as direct money creation.

Congress has tried to create agencies which would use the Federal Reserve to directly create money. These agencies and methods have all been ruled overwhelmingly unconstitutional by the Supreme Court. For the Federal Reserve to create money as MMT advocates want, you would have to amend the Federal Reserve Act. Certainly a possibility, but not easy.
 

Proponents of MMT point to how successful Japan has been in implementing what essentially looks to be the same policy. Japan has moved an amount equal to 140% of its GDP onto the balance sheet of the Bank of Japan—essentially buying every bond available in the private markets. That balance sheet keeps growing because the BOJ is also buying stocks and carrying Japan’s entire annual deficit, which is large.

Why can’t we do the same? Japan and the US are both modern countries and economies. Europe, though not to the extent of Japan, also engaged in a large amount of quantitative easing. If it didn’t cause problems the last time, why not try it again on a larger scale? Especially if there is a crisis?

The explanation for why Japan has not had inflation or hyperinflation doesn’t fit into a sound bite, while MMT proponents can answer with dismissive sound bites that will be readily consumed and believed by a public wanting change. Couple that with automation increasingly taking jobs and depressing wages, and you have a firestorm of political backlash and calls for change.

Do Deficits Matter?

The only way to really tackle the increasing deficits is to:

1.   Reduce Medicare and Medicaid benefits, means-test Social Security and at the same time raise the age of eligibility. But few politicians will run on a platform of cutting Medicare and Social Security, because no matter how they propose it, that will be what it means.

2.   Raise Medicare and Social Security taxes, or simply increase taxes on everyone, or at least on “the rich.” A lot. The threshold for “the rich” would have to be lower than you might think; most of my readers will be counted as the rich. Whether you feel rich is beside the point. That will still not balance the budget, and there’s a high probability it will send us into yet another recession, bringing calls for more direct spending and some form of money creation as the answer. That’s what MMT says we should do.

 
Any politician who proposes to limit entitlement spending to balance the budget will be accused of forcing austerity on those least able to afford it. That is not a winning platform. There will be no Clinton/Gingrich compromise. Austerity offers no fun, simple sound bites. It requires a certain amount of pain, which is generally not politically popular.

Oh, a segment of the population will embrace such a platform, but we must remember that elections are won on the margin. President Trump won by razor-thin margins in a few Midwest states. A change election in the middle of a recession or its aftermath could not only see those margins evaporate, but bring a wave of progressive and socialist politicians to join AOC and her friends.

Think 1932. The country was in true turmoil and there was a huge shift to the left. FDR didn’t get every policy he wanted, but he got a lot of them. It was truly transformational and has shaped the US ever since.

What would this look like? How do we get there? We are going to have several sessions at the Strategic Investment Conference to specifically address these issues. Is all this going to happen next year? No, but something along the lines above is my base case for the 2020s. That means you need to begin structuring your life and your portfolios to avoid being in a tunnel with an oncoming train.

These are not simple changes, like simple buy and sell instructions; they will require much deeper structural change in not just your portfolios but perhaps your lives. It is something you want to think about very seriously while you have the luxury of time, not at the last minute. Waiting too long may mean you won’t be as prepared as you will wish you were.

Think about how you would deal with taxes 20 or 30% higher than today’s, and potentially more. How would that change your lifestyle? What can you do today to deal with whatever may come? It may mean adjusting your lifestyle, saving more, and getting out of debt, which takes time. For most families these are not quick decisions. But I think they will become necessary ones, especially if the first wave of a change election happens in 2020. Bluntly, Shane and I have already begun our own changes.

If somehow there is eventually a change back to austerity? Or a crisis forces it? That will mean even more change you need to be prepared for. And unfortunately, it’s not clear which way things will go. We will have to get much closer to the actual events and elections to get a feel for how they may develop.


Dallas, Houston, Cleveland (a lot), New York, back to Puerto Rico, Austin, and Dallas

Shane and I will be traveling to Dallas at the end of this week to begin getting ready to move the rest of our furnishings to Puerto Rico or to a new, smaller apartment that will be our Dallas base. Right now, it appears my Dallas high-rise home will sell within a few weeks. After an Ashford, Inc. board meeting in Athens, Texas, I make a quick dash to Houston to meet with my SMH friends and then attend a meeting with an economics council at Rice University, my alma mater. Then it’s back to Dallas for a few days and then on to Puerto Rico.

Starting in the middle of March, I will be staying in the Cleveland area, where I will have two separate cataract surgeries over a few weeks. I may get in a quick visit to New York, then eventually head back to Puerto Rico, and in early April I have dinners with clients and potential clients in Austin and Dallas. Then, thankfully, the rest of April looks mostly free of travel.

When I write to you about the potential need for lifestyle changes, I am also talking to myself. And taking action. If there is any interest, maybe I will do a private session at the SIC on the changes I am making. But everyone’s life and reality is different. I am still thinking through and making those changes.

It’s time to hit the send button. This has been an interesting letter to write. More speculation than economic rationale, but I can guarantee you this is something I worry about. And I think you should consider what it might mean for you. Have a great week!

Your making his own changes analyst,


 
John Mauldin
Chairman, Mauldin Economics


The Trump era could last 30 years

But the populist movement is going to need more than electoral success

Gideon Rachman




How long is this going to last? Ever since the twin political upheavals of 2016 — Britain’s vote for Brexit and America’s election of Donald Trump — analysts have argued about whether this is a temporary aberration or the beginning of a new era.

It is still early days. But it already seems likely that future historians will look upon the events of 2016 as marking the beginning of a new cycle in international history. The bad news for anguished liberals is that these cycles can last quite a long time — 30 years seems to be about average.

In the years since “Brexit-and-Trump”, a global populist movement has gathered momentum.

The fact that Mr Trump is despised by much of the western establishment and media can obscure this point. But the US president has many admirers, some of them running governments around the world.

Jair Bolsonaro, the new president of Brazil, Latin America’s largest country, is an avowed Trump fan. In the Middle East, the Saudi and Israeli governments much prefer Mr Trump to Barack Obama, his predecessor. His fan club also extends into Europe. The governments of Poland and Hungary are closer ideologically to the Trump White House than to the European Commission in Brussels. Matteo Salvini, the deputy prime minister of Italy (and the country’s most powerful man), also sees Mr Trump as a role model.

The horror show of Brexit means that there are few other European populist parties currently campaigning to leave the EU. But the anti-establishment impulse that gave rise to the Brexit vote is still gathering strength in Europe. It has found expression in diverse forms, from the gilets jaunes movement in France to the rise of the Alternative for Germany party, which is now the official opposition in the German parliament.

Past precedent suggests that if a “populist era” takes hold, it might last as long as three decades. All efforts at historical periodisation are slightly artificial. But it is possible to identify two distinct eras in postwar western politics, both of which lasted roughly 30 years. The period from 1945-1975, known as les trente glorieuses in France, was identified with a period of strong economic growth across the west, alongside the construction of welfare states and Keynesian demand-management — all played out against the international backdrop of the cold war.

By the mid-1970s, this model had run into trouble in the Anglo-American world, with Britain suffering from “stagflation” and President Jimmy Carter diagnosing a national “malaise” in the US. A new era (often termed “neoliberal” by its critics) began in 1979 with the election of Margaret Thatcher in Britain, followed by Ronald Reagan in the US in 1980.

In retrospect, this was also part of a global shift. In 1978, Deng Xiaoping came to power in China and initiated a policy of market-based “reform and opening-up”. The communist bloc in Europe also began to crack with the formation of the Solidarity trade union in Poland in September 1980. The foundations of a globalised capitalist economy were emerging.

This “neoliberal era” also lasted roughly 30 years until it was discredited by the global financial crisis of 2008. As with the end of the trente glorieuses, it took a few years of uncertainty before a new ideological movement emerged. But that happened in 2016, with Mr Trump’s election and Brexit.

But why should cycles in modern history last for roughly 30 years? One possible explanation is that the successful ideologies and the political movements they spawn go through a cycle of emulation followed by overshoot.

If new movements or politicians develop an aura of success, they find imitators around the world. That sense of ideological momentum then creates a demand for the original ideas behind the movement to be pushed further and faster. And that leads to the over-reach phase of the cycle. An example of ideological over-reach is the way in which the Reaganite demand for lower taxes and less red tape eventually led to the excessive deregulation of finance, culminating in the financial crisis.

The fact that populist and nationalist parties around the world are already taking their cue from Mr Trump suggests that the cycle of emulation is already well under way. It is now standard practice for politicians, such as Viktor Orban in Hungary, as well as Messrs Salvini and Bolsonaro, to imitate the Trump playbook — condemning “globalism”, accusing the media of spreading fake news, mocking the “politically correct”, and scorning international organisations that attempt to deal with problems such as climate change or the resettlement of refugees.

The rapid spread of this new political style could be just the beginning of a new era that lasts decades. But there is one major qualification to this idea, one that distressed liberals should hang on to. If the period of emulation and intensification is to last, the populist movement needs more than electoral success. It also needs to point to results in the real world. The trente glorieuses were deemed glorious because living standards were visibly rising across the west. In the same way, the Reagan-Thatcher era was solidified by renewed economic growth and victory in the cold war.

By contrast, Brexit is in deep trouble and the Trump administration is floundering. Unless populists can deliver tangible results, their new era could yet die in its infancy.


Bad News Is Good News for Stocks Again

Stocks are up around the world, even as economic data has disappointed and earnings forecasts have been cut

By Mike Bird





The mantra that gained popularity in the years after the financial crisis—bad news is good news—is back. That should remind investors that finding themselves on the wrong side of major central banks can be perilous.

The MSCI All Country World Index has erased all of the losses it suffered during a panicky end to 2018, when concerns about rising global interest rates alongside slowing growth sparked a sudden selloff. On Friday, the index finished slightly higher than its Dec. 3 close, and most Asian markets continued to rise Monday: The U.S. market is closed for a holiday.

The recovery in global stocks has come despite disappointing economic data from powerhouses such as China, Germany and even the U.S., alongside a broad-based deterioration in company earnings outlooks. This year’s sentiment shift has been so broad that no major country-level stock indexes—besides India’s—are down so far this year. Every sector of the S&P 500 has risen; both growth and value stocks have picked up in the U.S.

The market’s Panglossian approach is reminiscent of the period when Ben Bernanke helmed the Federal Reserve. When the U.S. central bank was buying bonds by the truckload and interest rates were at zero, bad economic news like weak U.S. employment numbers often sparked market rallies, as investors bet that a weaker economy would stall plans to tighten monetary policy.

The Federal Reserve headquarters in Washington. When the U.S. central bank said last month that it may be done with interest-rate rises for now, it triggered a market rally. Photo: Kevin Lamarque/Reuters
 

The same counterintuitive dynamic is in place now. The Fed sparked the current market rally when it signaled last month that it may be done with interest-rate rises for now. Any subsequent weak data has only served to raise hopes it would become still more dovish. In turn, an assumption has taken hold that other central banks around the world—especially in Asia—will feel less pressure to tighten policy.

The Fed’s apparent change of heart has overwhelmed everything else: all of the rebound in equity markets since Christmas has come on the back of rising valuations rather than improving corporate fundamentals.

Investors can’t fight the Fed, but the power the U.S. central bank wields should inspire caution too. If the Fed believes investors have overinterpreted its more cautious stance, any clarification could spark a selloff just as broad as the rally has been this year.


Is the “Populist” Tide Retreating?

Strong support for immigration and globalization in the US sits uneasily with the view that “populism” is a problem. In fact, the term remains vague and explains too little – particularly now, when support for the political forces it attempts to describe seems to be on the wane.

Joseph S. Nye



STANFORD – The dysfunctional politics of Brexit in the United Kingdom, and the midterm election reaction against President Donald Trump in the United States, are generating second thoughts about the populist tide sweeping the world’s democracies in recent years. In fact, second thoughts are long overdue.

Populism is an ambiguous term applied to many different types of political parties and movements, but its common denominator is resentment of powerful elites. In the 2016 presidential election, both major US political parties experienced populist reactions to globalization and trade agreements. Some observers even attributed Trump’s election to the populist reaction to the liberal international order of the past seven decades. But that analysis is too simple. The outcome was over-determined by many factors, and foreign policy was not the main one.

Populism is not new, and it is as American as apple pie. Some populist reactions – for example, Andrew Jackson’s presidency in the 1830s or the Progressive Era at the beginning of the twentieth century – have led to democracy-strengthening reforms. Others, such as the anti-immigrant, anti-Catholic Know-Nothing Party in the 1850s or Senator Joe McCarthy and Governor George Wallace in the 1950s and 1960s, have emphasized xenophobia and exclusion. The recent wave of American populism includes both strands.

The roots of populist reactions are both economic and cultural, and are the subject of important social science research. Pippa Norris of Harvard and Ronald Inglehart of the University of Michigan have found that cultural factors long antedating the 2016 election were very important. Voters who lost jobs to foreign competition tended to support Trump, but so did groups like older white males who lost status in the culture wars that date back to the 1970s and involved changing values related to race, gender, and sexual preference. Alan Abramowitz of Emory University has shown that racial resentment was the single strongest predictor for Trump among Republican primary voters.

But economic and cultural explanations are not mutually exclusive. Trump explicitly connected these issues by arguing that illegal immigrants were taking jobs from American citizens. The symbolism of building a wall along America’s southern border was a useful slogan for uniting his electoral base around these issues. That is why he finds the idea hard to give up.

Even if there had been no economic globalization or liberal international order, and even if there had been no great recession after 2008, domestic cultural and demographic changes in the US would have created some degree of populism. America saw this in the 1920s and 1930s. Fifteen million immigrants had come to the US in the first 20 years of the century, leaving many Americans with an uneasy fear of being overwhelmed. In the early 1920s, the Ku Klux Klan had a resurgence and pushed for the National Origins Act of 1924 to “prevent the Nordic race from being swamped,” and “preserve the older, more homogeneous America they revered.”

Similarly, Donald Trump’s election in 2016 reflected rather than caused the deep racial, ideological, and cultural schisms that had been developing in reaction to the civil rights and women’s liberation movements of the 1960s and 1970s. Populism is likely to continue in the US as jobs are lost to robotics as much as to trade, and cultural change continues to be divisive.

The lesson for policy elites who support globalization and an open economy is that they will have to pay more attention to issues of economic inequality as well as adjustment assistance for those disrupted by change, both domestic and foreign. Attitudes toward immigration improve as the economy improves, but it remains an emotional cultural issue. In mid-2010, when the effects of the Great Recession were at their peak, a Pew survey found that 39% of US adults believed immigrants were strengthening the country, and 50% viewed them as a burden. By 2015, 51% said that immigrants strengthen the country, while 41% said they were a burden. Immigration is a source of America’s comparative advantage, but political leaders will have to show that they can manage national borders – both physical and cultural – if they are to fend off nativist attacks, particularly in times and places of economic stress.

Still, one should not read too much about long-term trends in American public opinion into the heated rhetoric of the 2016 election or Trump’s brilliant use of social media to manipulate the news agenda with cultural wedge issues. While Trump won the Electoral College, he fell three million short in the popular vote. According to a September 2016 poll, 65% of Americans thought that globalization is mostly good for the US, despite their concerns about jobs. While polls are always susceptible to framing by altering the wording and order of questions, the label “isolationism” is not an accurate description of current American attitudes.

Since 1974, the Chicago Council on Global Affairs has asked Americans annually if the US should take an active part in, or stay out of, world affairs. Over that period, roughly a third of the public has been consistently isolationist, harkening back to the nineteenth-century tradition. That number reached 41% in 2014, but, contrary to popular myth, 2016 was not a high point of post-1945 isolationism. At the time of the election, 64% of the respondents said they favored active involvement in world affairs, and that number rose to 70% in the 2018 poll, the highest recorded level since 2002 (which had been reached in the aftermath of the 9/11 terrorist attacks).

Strong support for immigration and globalization in the US sits uneasily with the view that “populism” is a problem. The term remains vague and explains too little – particularly now, when support for the political forces it attempts to describe seems to be on the wane.


Joseph S. Nye, Jr., is a professor at Harvard University and author of Is the American Century Over? and the forthcoming Do Morals Matter? Presidents and Foreign Policy from FDR to Trump.


Remembering to Forget World War II

People tend to compare every modern conflict to Hitler and the Nazis. They shouldn’t.

By Jacob L. Shapiro

 

“Those who cannot remember the past are condemned to repeat it.” It’s a cruel irony that perhaps the most famous warning of how dangerous it is to ignore what came before us is often wrongly attributed to Winston Churchill and not to its true author, Spanish-born philosopher George Santayana. To Santayana, the phrase meant something very specific. Rather than insisting that we merely remember events that already happened, he was imploring us to retain the experience of our history. It’s a subtle difference, but it’s an important one, and to understand the peril in failing to distinguish between the two, we need look only at the legacy of World War II.

It’s hard to overstate just how consequential World War II was. It completely changed the political structure of the world. By removing Europe from the top of the international order, it gave way to the United States as a superpower and created opportunities for newly independent powers in Asia to rise. It destroyed European Jewry, which had played a crucial role in European geopolitics for centuries. It sounded the death knell for the traditional forms of imperialism and colonialism. It confirmed the nation-state as the basic organizing principle of the international system, and in the war’s aftermath, the vast majority of those nation-states were organized into a global political and economic structure that revolved around the United States. It created new and terrible weapons that could wipe out humanity in a matter of hours. From international political structures to still-unresolved territorial disputes – between Japan and Russia, between Ukraine and its neighbors, between Israel and the Palestinians and between everyone in the Balkans – the world today was forged by the changes World War II wrought.


World War II also reconfigured the political imaginations of all who participated in it. And how could it not? In geopolitics, issues tend to be gray, situations rarely zero-sum, with very little on the line. The remarkable and unprecedented thing about World War II was that it was a zero-sum conflict, good versus evil, with nothing less than the fate of the entire world on the line. That was the mindset of every politically conscious man, woman and child for eight years. It’s understandable why we have since grafted that mentality onto nearly every conflict that has taken place since 1945.

It’s understandable but ultimately misguided because it has corrupted the histories the victors wrote. They blamed World War II solely on the men deemed responsible for it – men like Adolf Hitler and Hideki Tojo. By 1953, Leo Strauss, a Jewish philosopher who narrowly escaped Europe early in the war, coined the phrase “reductio ad Hitlerum” to describe a type of ad hominem attack whereby any argument could be defeated by linking it to Hitler, even if in the most circuitous way. (“Hitler liked dogs, and you like dogs. Therefore, dogs are evil and so are you.”) In 1994, when the internet was still in its infancy, an American writer named Mike Godwin coined “Godwin’s law,” the idea that the longer an online discussion goes on, the greater the chance a comparison to Hitler is drawn. (From my experience, Godwin’s law is nearly as immutable as the law of gravity.)

To be clear, there are good reasons men like Hitler and Tojo are so reviled. Their policies were morally reprehensible. They were anti-democratic, racist demagogues who believed in the superiority of their races, and they shoulder plenty of responsibility for World War II. But scapegoating them after the fact had the added benefit of political expedience. When World War II gave way to the Cold War, the U.S. needed to rehabilitate Germany and Japan as allies in this new global conflict, and Washington couldn’t rightly side with the evil people against whom it had just waged an existential battle. Germany was thus remade as a liberal democracy that manufactured great cars and warned about the dangers of anti-Semitism. Japan was remade as a liberal democracy as passionately pacifist as it had been martial just a decade previous. Vilifying Hitler and Tojo absolved the German and Japanese people, who were now free to become trusted U.S. allies. (Bob Dylan articulated this best when he sang that the Germans now had God on their side too.)

This kind of thinking violates the idea on which the entire international system is based: the sovereignty of the people. Governments do not gain power simply by virtue of being governments. Presidents do not achieve power simply by declaring themselves presidents. (Self-proclaimed Venezuelan interim president Juan Guaido is the latest to learn this lesson.) People give governments power – and governments derive power from people. Germany’s embrace of Hitler was not a bout of mass psychosis. The German people weren’t duped by promises of a noble liberal democrat who transformed into a monster that held the country hostage. Hitler came to power because he had the support of millions of Germans – people who voted for him, people who fought and died for the vision of Germany he articulated, people who either supported his government’s policies or found them palatable enough not to protest.

This is not to say that every German supported Hitler, nor is it to say that there weren’t those who resisted. But it is to say that someone like Hitler could not have come to power without the support of his people. People tend to forget this today when they invoke World War II as a warning against regimes or leaders they don’t like. That’s because it’s easier to blame individuals for the horrors of something like World War II than to indict a system that was complicit in its crimes. People are happy to pin problems on individuals they hate and to credit successes to individuals they love. Better to blame it on someone else and get back to tweeting. No sense in accepting the truth – that broad, impersonal forces shape the world and that individuals sometimes bend those forces to their will.

We see this kind of thinking infecting contemporary foreign policy everywhere. The United States recently intervened in Venezuela to attempt to overthrow the Maduro regime. At least one GPF reader wrote in agreeing with the policy because, he argued, had the U.S. intervened in Germany in 1936, Hitler would never have started World War II. (A classic reductio ad Hitlerum.)

A British lawmaker went on the BBC last week and raged against the “Teutonic arrogance” of a German business leader who said Brexit would hurt the British economy. The lawmaker compared his defiance to his father’s landing on the beaches at Normandy.

Western media, meanwhile, routinely compares the treatment of China’s minority Uighur population to the Nazis’ treatment of the Jews. Their treatment is unconscionable, but if it’s true that this is Nazism 2.0, then the U.S. should be preparing for war, not applying economic sanctions.

Elsewhere, internal U.S. political debates about the now-abandoned Iran nuclear deal became a fight about whether the deal was like appeasing Nazi Germany. A few years ago, then-Secretary of State and Democratic presidential candidate Hillary Clinton compared Russian President Vladimir Putin to Hitler when he intervened in eastern Ukraine. The list of examples is endless.

When we invoke such a narrow historical memory of World War II and apply it to any modern conflict, we do so at our own peril. It personalizes issues that cannot be fully understood when personalized. It reduces complex issues into matters of good and evil, and thus it ignores the conflicts of interest around which most international disputes arise and by which most are resolved. If every problem is a nail, every solution is a hammer. If every adversary is Nazi Germany, every fight is existential. If every person you disagree with is Hitler, then every argument ends without compromise. Indulging in a reductio ad Hitlerum or a reductio ad World War II every time a problem arises is not remembering the past, it is invoking archaic tropes to avoid dealing with complicated questions. Learning the lessons of the past isn’t the same as living in the past.

World War II was fought because of the rise and fall of great powers, because of the secular rise of the power of the state, because of the rejection of prior forms of political legitimacy for both liberal and nationalist principles, because of distrust between nations, because of scarcity of resources in a rapidly transforming global economy – to name just a few of the root causes.

Famous as Santayana’s adage has become, the line that precedes it is, to me, equally profound: “When experience is not retained, as among savages, infancy is perpetual.” The most important thing to understand about World War II is that it is over. One can only hope that at some point the world will start acting like it, rather than preparing unconsciously for the next.
   

Here’s What the Fed’s Halt on Interest Rates Means for Your Wallet

There are consequences whether you have a savings account or a credit card (or both).

By Tara Siegel Bernard


Car loans are one of the many consumer financial products affected when the Federal Reserve changes rates. Credit: David Zalubowski/Associated Press


The Federal Reserve indicated on Wednesday that it was done raising interest rates for the foreseeable future, after a run of incremental increases that began to affect the typical consumer’s wallet.

The decision will hold the central bank’s benchmark for short-term rates to a target between 2.25 and 2.5 percent, the level it reached in December after steadily climbing since the end of 2015.

That is the target for the federal funds rate, the interest rate that banks and depository institutions charge one another for overnight loans. It influences how banks and other lenders price certain loans and savings vehicles.

Whether you will cheer or chafe at the halt depends, broadly, on whether you’re a saver or a spender. For savers and retirees, who were only just starting to find accounts that paid more than 2 percent, the end of rate increases means that’s as good as it will get. But people trying to whittle down a pile of credit card debt, thinking about tapping their home equity line of credit or buying a car should welcome the fact that the cost of those loans won’t keep rising.

Deposit Accounts and C.D.s

When the Fed raises rates, some banks may pay more interest on savings accounts, particularly when they want to lure consumers to park their money. But the big banks haven’t been too generous lately, and you shouldn’t expect much to change any time soon. Today, the average savings and money market deposit account pays a paltry 0.23 percent, according to BankRate.com. That’s up from 0.10 percent in 2015, when the Fed started raising rates.

You also shouldn’t rush to tie up your money in certificates of deposit, which tend to move in step with similarly dated Treasury securities. Two-year C.D.s are paying just more than 1 percent on average, but you can find some paying 3 percent if you take the time to comparison shop, according to BankRate.com.

Online Savings

You’ll probably do better with an online savings account; many are already paying more than 2.25 percent and may rise further.

“For the first time in more than a decade, you can earn more than the rate of inflation on your savings account, but only if you shop around,” said Greg McBride, chief financial analyst at BankRate.com.

The inflation rate, which measures how much prices have risen from a year ago, is now roughly 2 percent. If your money isn’t earning at least that much, you’re losing purchasing power.
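For readers who like to see the arithmetic, here is a minimal sketch of that purchasing-power comparison. The $10,000 balance is a made-up example; the 0.23 percent and 2.35 percent rates and the roughly 2 percent inflation figure echo numbers cited in this article.

```python
# Rough sketch: does a savings rate keep up with inflation?
# The $10,000 balance is hypothetical; the rates echo figures cited in the article.

def real_value_after_one_year(balance, savings_rate, inflation_rate):
    """Inflation-adjusted value of a balance after one year."""
    nominal = balance * (1 + savings_rate)   # what the bank statement will show
    return nominal / (1 + inflation_rate)    # what that money actually buys

start = 10_000.00
inflation = 0.02  # roughly 2 percent

for label, rate in [("average big-bank savings", 0.0023), ("online savings", 0.0235)]:
    ending = real_value_after_one_year(start, rate, inflation)
    print(f"{label}: ${start:,.2f} -> ${ending:,.2f} in today's purchasing power")
```

At 0.23 percent you end the year able to buy less than you started with; at 2.35 percent you roughly hold your ground.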

Citizens Access is offering 2.35 percent and CIBC Bank USA 2.39 percent, according to BankRate.com, while at least two other online banks are offering 2.4 percent.

Bonds

Bond investors often get nervous when interest rates rise, because bond prices tend to fall in response. Why? When rates increase, the price of existing (and lower-yielding) bonds drops because investors can buy new bonds that offer higher interest rates.
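To make that inverse relationship concrete, here is a minimal sketch that prices a simple fixed-coupon bond at a few different market yields; the bond's terms (five years, 3 percent annual coupon, $1,000 face value) are made up purely for illustration.

```python
# Minimal sketch of why existing bond prices fall when market rates rise.
# The bond terms below are hypothetical, chosen only to show the mechanics.

def bond_price(face, coupon_rate, years, market_yield):
    """Present value of the annual coupons plus principal, discounted at the market yield."""
    coupon = face * coupon_rate
    pv_coupons = sum(coupon / (1 + market_yield) ** t for t in range(1, years + 1))
    pv_principal = face / (1 + market_yield) ** years
    return pv_coupons + pv_principal

for market_yield in (0.02, 0.03, 0.04):
    price = bond_price(face=1000, coupon_rate=0.03, years=5, market_yield=market_yield)
    print(f"market yield {market_yield:.0%}: price ${price:,.2f}")

# The existing 3% bond is worth more when new bonds yield only 2%,
# and less when new bonds yield 4%.
```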

Now that interest rates have stabilized, at least for now, bond fund investors might be less distracted by rate-related volatility — but that doesn’t mean bonds won’t continue to react to broader economic conditions and news. It’s best to focus on why bonds are in your portfolio to begin with — to act as a buffer against stocks — and to avoid fretting about short-term moves.


Certain adjustable-rate mortgages are linked to the Federal Reserve's short-term rates. Credit: Matt Rourke/Associated Press


Home Loans

Many people think mortgage rates are tied to the Fed’s short-term rate, but there isn’t a direct link. Most 30-year fixed-rate mortgages are priced off the 10-year Treasury bond, which is influenced by a variety of factors, including the outlook for inflation and long-term economic growth here and abroad.

But some home loans are more directly connected to the Fed’s short-term rate, including home equity lines of credit and adjustable-rate mortgages, or A.R.M.s.

A typical home-equity borrower has already seen rates rise to about 6.7 percent, according to BankRate.com, from roughly 4.5 percent three years ago.

The combination of the recent increases and changes in the tax code that restricted the interest deduction “is a bit of a double pinch for some,” said Keith Gumbinger of HSH.com, which tracks the mortgage market.
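To put rough numbers on that double pinch, here is a small illustrative calculation. The $50,000 balance and the 24 percent marginal tax rate are assumptions made up for the example; the 4.5 percent and 6.7 percent rates are the ones cited above.

```python
# Illustrative "double pinch" on a home equity line of credit.
# The $50,000 balance and 24% marginal tax rate are assumptions for this sketch;
# the 4.5% and 6.7% rates are the figures cited in the article.

balance = 50_000
old_rate, new_rate = 0.045, 0.067
marginal_tax_rate = 0.24

# Three years ago: lower rate, and the interest was generally deductible.
old_annual_cost = balance * old_rate * (1 - marginal_tax_rate)

# Today: higher rate, and the deduction is restricted for many borrowers.
new_annual_cost = balance * new_rate

print(f"then: about ${old_annual_cost:,.0f} a year after the deduction")
print(f"now:  about ${new_annual_cost:,.0f} a year with no deduction")
```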