Shelter from the storm

Covid-19 has transformed the welfare state. Which changes will endure?

The pandemic may mark a new chapter in the nature of social safety-nets

“Suddenly everything drops out from under you,” says Will, a 30-year-old Londoner. He has had paid jobs in arts marketing since he graduated from university. The pandemic upended everything. 

Redundancy loomed. Rescue came in the form of the British government’s furlough scheme, without which he would be jobless and penurious. 

The experience has made him more supportive of the welfare state—and even of grander schemes, such as a universal basic income (ubi).

Crises, such as wars or economic collapse, reveal societies’ strengths and weaknesses, and change thinking about how they can and should be organised. 

The pandemic has forced a re-evaluation of the social contract, in particular how risk should be divided among individuals, employers and the state. 

The covid-19 fiscal stimulus packages have made even the interventions of the global financial crisis seem like minnows. 

The expansion of the welfare state has been the greatest in living memory. 

Government bail-outs of citizens, rather than banks, could mark a new chapter in its history.

At its most basic, the welfare state provides some form of social security and poverty relief. 

In 1990 Gøsta Esping-Andersen, a political scientist, identified three models: market-oriented in Anglophone countries, where the state plays a “residual” role; family-oriented in mainland Europe, where the state and employers play a supporting role; and state-oriented for the Scandinavians, with universal protections and services. 

The balance between state, market and family shifts over the course of people’s lives, but most take out about as much as they put in (in any year 36% of Britons receive more than they pay in taxes, but over their lifetimes only 7% do).

When covid-19 struck and economies locked down, entire industries faced obliteration. Since the start of the pandemic countries have announced over $13.8trn (13.5% of global gdp) in total emergency funding, more than four times the support provided during the financial crisis. 

Rich countries have done almost all the spending (see map). Only in 1945, as Europe was rebuilt after the second world war, was government debt as a share of gdp so high. 

Emerging economies have never borrowed as much.

The shape of the welfare state has been transformed, too. Established principles such as means-testing (welfare only for the poorest), social insurance (only for those who paid in) and conditionality (only for those who do something) went out of the window. 

Governments wrote near-blank cheques for everything from job guarantees to food. Some simply sent cash.

As the pandemic abates and economic recovery beckons, how much of this expansion will last? 

The shift in risk in 2020 came after decades during which risks such as living longer than expected, or being replaced by an algorithm or foreign worker, were gradually offloaded from governments and employers onto individuals. 

And just as a flood increases demand for flood insurance, the millions reliant on the state for the first time are demanding stronger safety-nets.

Covid-19 showed that the welfare state needed modernisation. It was born in a different social order, and to protect against different risks. 

Discontent was rising before the pandemic: in 2019 less than one in five people in 26 countries agreed that “the system” was working for them and half said that it was failing, according to the Edelman Trust Barometer. 

That governments had to respond so aggressively to covid-19 shows that the responsibility for some risks sat in the wrong place. 

In a new book about the social contract, Minouche Shafik, the head of the London School of Economics (lse), predicts that “The political turmoil we observe in many countries is only a foretaste of what awaits us if we do not rethink what we owe each other.”

American social security emerged from the Great Depression. Social-insurance programmes appeared in Europe at the turn of the 20th century. 

But it was the second world war that led to the birth of the modern European welfare state, with universal benefits to guard against poverty and provide health care and education.

Before the war, welfare had primarily been understood as poverty relief through redistribution. But the bombs hit both rich and poor, and Europe emerged with a new appetite for something different, and larger: shock relief for everyone through insurance. 

Nicholas Barr of the lse describes this as part of the “piggy bank” objective of welfare: the realisation that even if poverty were eradicated, people still need protection against shocks and periods of dependency over their lifetimes.

The post-war expansion ended with the stagnation and inflation of the 1970s. A new version of the welfare state focused on getting people into jobs. Benefits were made scarcer and stingier to discourage laziness and dependency. 

Work incentives were boosted. Welfare recipients were stigmatised as “scroungers” and universalism gave way to means-testing and conditionality. America replaced many cash benefits for the jobless with tax credits for the working poor. 

Britain renamed unemployment benefits “Jobseeker’s Allowance”. The labour market was made more flexible to entice employers to hire. With full employment, fewer people would need benefits, went the thinking.

Most countries used this second phase, which started in the 1980s, to reduce state intervention and shift risk back to individuals. Unions were successively weakened, and employment protections cut back still further.

In the private sector the certainty of defined-benefit pensions was replaced by the uncertainty of the defined-contribution kind. Between 2004 and 2018 the share of real income replaced by a typical mandatory pension for a private-sector worker fell by 11% in rich countries on average. 

The social-housing stock as a share of total housing decreased, rent controls were trimmed and housing costs went up.

Don’t think twice

But talk of self-sufficiency ended when covid-19 struck. Governments scrambled to get the money out and ask questions later. 

The result was a huge increase in the number and generosity of safety-net measures. 

By January the International Labour Organisation counted over 1,600 social-protection policies launched since February 2020. Record numbers claimed support. 

In some rich countries as many as 60% of those getting help during the pandemic, including through furlough schemes, had never received welfare payments before, according to bcg, a consultancy.

The imf estimates that by January rich economies had increased total direct spending by almost 13% of gdp, about half of it on supporting workers and households. Countries that typically spend a lot on social protection spent comparatively less on emergency funding (see chart). 

Support for employment, such as wage subsidies or furlough schemes, was most popular in Europe (including Britain). In the oecd, a club of mostly rich countries, over one in five employees have had their job rescued by such programmes.

Governments have spent about the same on supporting households through bolstered unemployment benefits, child benefits and cash transfers. 

In America, which has favoured such spending over wage subsidies, the $600 weekly increase in unemployment insurance meant two-thirds of recipients earned more on the dole in the first months of the pandemic than they had when they were working. 

Claims soared: nearly 33m were made in the third week of June, compared with 2m in the last week of February, just before the pandemic struck, and 12m in the peak week of the financial crisis. 

In Britain the government increased universal credit, the main welfare programme before the pandemic, by £1,000 ($1,290) a year. 

Some 6m people claimed it in January compared with 2.6m last February. Britain, like others, snipped some of the strings attached to such benefits and broadened eligibility.

Many also doled out cash. Donald Trump’s administration sent cheques for $1,200 and then $600 to most adults last year. President Joe Biden plans to distribute another $1,400, taking the price tag of the policy to $920bn. 

In Japan every citizen received ¥100,000 ($930).

The pandemic highlighted the outmoded pattern of some welfare spending: designed to fit a mid-skilled worker of a type that has become rare, and is likely to become rarer still. It exposed the vulnerability of the growing group of labour-market outsiders, and how little job and income security many essential workers enjoy.

They say every man needs protection

Economists were already arguing for the need to plug coverage gaps, especially for the one in four workers in oecd countries who are in temporary work or self-employment. 

Over the past 20 years rich countries’ labour markets have become polarised, with growing shares of low- and high-skilled jobs and falling shares of middle-skilled (and -income) jobs. 

Before the pandemic hit, a higher share of people were in work than at the turn of the century, but most of the growth had been in part-time jobs. 

Bureaucratic, inflexible welfare systems were already showing the strain before the shock of the pandemic made changes that had seemed politically unfeasible look not just possible but necessary.

When Margaret Hope, a self-employed chef in Canada, lost all her work due to covid-19 last March, she immediately began selling her kitchen equipment. 

“Here we go again,” she thought, “I’ll get nothing.” 

After Alberta’s oil-price crash in 2014 she had received no government support and had to close up. But this time the federal rescue package covered the self-employed. An emergency monthly benefit of C$2,000 ($1,580) was paid to those who earned less than C$1,000 per month between March and September. Some 8.9m Canadians received it—nearly a quarter of the population—at a cost of C$82bn.

Other governments took similar action. America, for the first time, expanded unemployment insurance to freelancers and contractors. Several extended the coverage of sick leave. 

The public interest in a universal benefit had rarely been clearer.

The vulnerability of workers with family responsibilities became acutely clear when schools closed. 

In America one in four working women considered cutting their hours or quitting. 

Public support for better child-care provision is now more bipartisan. The pandemic put child-care policies on the table even in places where they had been overlooked, for example in Italy. 

Some governments, such as Australia’s, made child care free for a time. Others, including Portugal’s and Germany’s, provided cash for carers and increased child benefits. 

Mr Biden has proposed a temporarily enhanced child tax credit (a policy which would, almost on its own, halve poverty among children). 

“There is total commitment…from the entire Democratic caucus to make this permanent,” says Sherrod Brown, a Democratic senator from Ohio.

The pandemic also underscored the importance of speed to welfare. Analysis by McKinsey, a consultancy, suggests that the magic “troika” of reaching lots of people, quickly and with little fraud was possible only for countries with advanced financial infrastructure, meaning widespread use of digital payments, digital ids and—crucially—relevant data, such as tax returns, linked to these ids. Singapore, which has all three, was able to send wage subsidies to eligible employers automatically.

Other countries had to make trade-offs between speed and fraud, or between scope and successful delivery, says Anu Madgavkar of McKinsey. 

Ms Hope, the chef, was “gobsmacked” to receive her money within days of applying online. 

Most Canadians got their payments within a week. Canada decided to prioritise speed and ask questions later (it has now started asking recipients to prove their eligibility).

Technology was not the only deciding factor in governments’ ability to act with agility; simplifying the claims process proved as important, for example by dropping burdensome tests on assets or assessments of partners’ incomes.

Ensuring that social spending is flexible is crucial, not just in a pandemic. When people know there is a safety-net, they may take healthy risks, such as starting a business. 

If it takes new applicants for out-of-work benefits months to get their money, they may be less keen on jobs that they might later lose. 

Distributing cash quickly in a crisis can help smooth consumption and lessen economic contraction.

Swift action has had remarkable success. Households’ incomes in rich countries were largely protected even as gdp tumbled. 

In April, as the unemployment rate more than tripled, American real disposable income rose by 15.6%, a record. History suggests that increases in social spending rarely disappear entirely after a crisis. 

The question is what will stick.

Many countries seem to have passed the peak of their emergency social spending, as economies begin to recover. Across the oecd take-up of furlough schemes has fallen, from a high of 20% of employed people in May to around 5% in September. In America claims for unemployment benefits have almost halved since their peak, and the unemployment rate dropped from 14.8% in April to 6.3% in January.

Some support programmes, such as Britain’s furlough schemes, have been extended. Others are being wound down. 

Australia is no longer providing child care free, and its “coronavirus supplement” will end this month (to be replaced with a smaller permanent increase in jobseeker’s allowance of A$25 ($20) per week).

Such changes are driven primarily by fiscal necessity. Government debt is piling up to record highs. Tax revenues have fallen. 

Governments worry that overgenerous benefits are themselves a disincentive to taking paid work and can lock people into a “welfare trap”.

And yet even before covid-19, public opinion had been moving in favour of the state, and employers, taking more of the risk away from individuals. 

In 1987, 30% of Britons thought welfare recipients did not deserve benefits; by 2019 this had fallen to 15%, according to the annual Social Attitudes survey. 

The proportion who think benefits are too high and discourage work has fallen from 59% in 2015 to 35%. In America only 56% of people surveyed in 2009 by Pew, a pollster, were in favour of the Obama administration’s $800bn stimulus package, whereas 88% supported the Trump administration’s $2trn covid-19 package last year. 

“It’s rather extraordinary how there’s been all this spending, even sending people cash, and the public has basically accepted it,” says Rachel Lipson, at Harvard University.

The pandemic seems to have shifted the mood from targeting towards universalism. Some claim that, taken to their logical conclusion, the lessons from covid-19 will lead countries to roll out ubi. 

Direct cash transfers, perhaps even universal ones, could become a standard part of governments’ emergency tool-kits. 

But no country is seriously contemplating a full-blown ubi scheme.

When the winds of change shift

More likely is a renewed appreciation of governments’ role in pooling and underwriting risks, in particular those that insurers call “uninsurable”. The pandemic has demonstrated the extent to which governments can smooth shocks. 

On the two days in April when the largest group of Americans received their stimulus cheques, spending by low-income households shot up by 26 percentage points, to near pre-pandemic levels, according to research by Raj Chetty of Harvard University and colleagues. 

Several economists have argued that the pandemic has shown why the generosity of benefits should be pegged to the state of the economy, with welfare acting as a shock absorber when times are toughest.

A revamped welfare state could provide enough flexibility to encourage work but still step in when disaster strikes. It will need to invest in human capital. 

The pandemic has accelerated ongoing changes in the structure of the economy. 

“Buffering alone won’t be enough to fight future shocks,” cautions Anton Hemerijck of the European University Institute. 

“You have to invest in child care, in skills, in health, in people as well if you want to future-proof the welfare state.” 

The impact of climate change, technological innovations and demographic shifts on jobs and livelihoods is hard to predict. 

But further social disruption is almost certain. 

Better preparations cannot start soon enough.

The first task of democracies is to put their own house in order

Joe Biden’s planned summit and Boris Johnson’s ‘D10’ must not turn into anti-Chinese fronts

Tony Barber


Last year’s G7 summit of leading industrialised democracies was planned to be held at Camp David in Maryland, but the pandemic forced its cancellation. 

As a result, the leaders of Canada, France, Germany, Italy, Japan and the UK were denied the pleasure of chewing the geopolitical fat with Donald Trump, who as the then serving US president would have hosted the event. 

One looks in vain for evidence that either Trump or the other six leaders regretted the missed occasion.

The setting for this year’s G7 summit is the English county of Cornwall, and the signs are that the June event will take place in a much-improved atmosphere. 

This is almost entirely down to the speed with which President Joe Biden has reaffirmed Washington’s commitment to allies around the world and reversed Trump’s withdrawal of the US from key international agreements and institutions. 

As Biden told last month’s Munich security conference: “America is back.”

The relief in other G7 capitals at the US return to multilateralism is palpable. 

But the resumption of G7 summits — which have included EU representatives since 1977, two years after the format’s launch — will not quite be business as usual. 

Boris Johnson, the UK prime minister, plans to invite the leaders of Australia, India and South Korea to the Cornwall event, creating what he touts as a “Democratic 10” or D10 group of countries.

Johnson’s initiative is not to be confused with the Biden administration’s emerging plan for a “democracy summit”. 

The new president pledged in his election campaign to hold such an event during his first year in office. 

Nevertheless his proposal clearly has something in common with Johnson’s, insofar as each emphasises the need for democracies to stand together in a world characterised by rising authoritarianism and great power rivalry.

There is certainly a case for the US and its allies to sound the alarm about democracy. Freedom House, a non-partisan, US government-funded organisation, says that 2019 was the 14th consecutive year in which global freedom was in decline. 

Similarly, the World Justice Project, a Washington-based civil society initiative, estimates that the rule of law deteriorated or at best remained the same in 2019 in a majority of countries in every region of the world.

However, the Johnson and Biden initiatives will need careful preparation if they are not to end up as empty and divisive public relations exercises. 

One problem concerns the quality of democracy in Johnson’s putative D10, and in the larger club of friendly countries which the Biden administration may summon into being. 

Both initiatives also risk turning into anti-Chinese fronts, employed for the purpose of promoting hard-nosed geopolitical interests rather than democracy as such. 

That would muddle the purported aim of the two proposals and almost certainly give rise to disagreements inside the new clubs, especially among continental European G7 members that do not want to join an anti-China crusade.

The awkward truth about Johnson’s D10 is that it would contain several countries where standards of democracy and the rule of law have fallen short of late. 

Embarrassingly, one of those was the US itself under Trump. 

Another was the UK, where Johnson’s government unlawfully suspended parliament in 2019 and threatened last year to break international law as a way of ending an impasse with the EU over Brexit.

A third example is India under Prime Minister Narendra Modi’s ruling Bharatiya Janata party. India used to be labelled the world’s largest democracy. 

But its drift from democratic norms under the BJP makes it harder to maintain a clear-cut distinction between India and more authoritarian systems.

As for Biden’s democracy summit, one difficult question is which countries to invite or exclude. 

Should Brazil and Ukraine be in because they are important to US geopolitical interests, despite their flawed records on democracy and the rule of law? 

What about Hungary or Georgia?

At a minimum, Biden should be frank about the retreat of US liberal democracy under Trump. 

He ought to make clear that, under his leadership, Washington is willing to listen and not simply preach to others on matters of freedom. 

Ideally, Biden and other leaders would not shrink from addressing the thorniest point of all — namely, that the threat to democracy comes not only from hostile authoritarian regimes abroad, but from freely elected leaders in our own countries who corrode liberal norms by appealing to an angry popular base, denigrating established institutions and chastising minorities.

Biden struck the right note in his Munich speech by stressing that he had no intention of exploiting the theme of democracy to rebuild the “rigid blocs of the cold war”. 

For all its suspicions of China and growing military co-operation with the US, India is a highly independent-minded country that would want no part of such a scheme.

But this points, too, to the limitations of Johnson’s D10 idea. It is debatable whether the 10 countries have enough in common to be fully united on democratic values or on strategy towards China. 

For this reason, the scheme looks mostly like Johnson’s attempt to put some flesh at long last on the Brexit-era slogan of a “global Britain” foreign policy. 

Whether that will impress other governments, in and outside Europe, is another matter.

Is “Temporary Inflation” A Real Thing?


Fed Chair Jerome Powell just spooked the markets by predicting that inflation will jump when the economy reopens – but don’t worry, it’s just temporary. 

Here’s the real-time CNBC account:

Fed’s Powell says reopening could cause inflation to pick up temporarily

Federal Reserve Chairman Jerome Powell said Thursday that he expects some inflationary pressures in the time ahead but they likely won’t be enough to spur the central bank to hike interest rates.

“We expect that as the economy reopens and hopefully picks up, we will see inflation move up through base effects,” Powell said during a Wall Street Journal conference. 

“That could create some upward pressure on prices.”

The Fed likes inflation to run around 2%, a rate it believes signals a healthy economy and provides some room to cut interest rates during times of crisis. 

However, the rate has run below that for most of the past decade and inflation has been particularly weak during the pandemic.

With the economy increasingly back on its feet, some price pressures are likely to emerge, said Powell, but he said they will likely be transitory and look higher because of “base effects” — the difference that arises when prices are measured against last year’s deeply depressed levels, from just as the Covid-19 crisis began.
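The “base effect” Powell describes is simple arithmetic. A minimal sketch with hypothetical price-index numbers (these figures are illustrative, not from any official index) shows how a mere recovery to trend reads as elevated year-over-year inflation when the comparison base is depressed:

```python
# Hypothetical price-index levels illustrating a "base effect":
# the index dips in April 2020, then recovers by April 2021.
index_apr_2019 = 100.0
index_apr_2020 = 97.0   # pandemic-driven dip
index_apr_2021 = 102.0  # normal-looking recovery

def yoy_inflation(current, year_ago):
    """Year-over-year inflation rate, in percent."""
    return (current / year_ago - 1) * 100

# Measured against the depressed 2020 base, inflation looks elevated...
print(round(yoy_inflation(index_apr_2021, index_apr_2020), 2))  # 5.15

# ...but the annualised rate over the two years spanning the dip is ordinary.
two_year_annualised = ((index_apr_2021 / index_apr_2019) ** 0.5 - 1) * 100
print(round(two_year_annualised, 2))  # about 1.0
```

The same 102 reads as roughly 5% inflation against the dip but only about 1% a year against the pre-pandemic level, which is why the Fed argues the spike should fade once the depressed months roll out of the comparison window.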

Now, this comes as the federal government is preparing to hit the economy with … wait for it… $4 trillion of created-out-of-thin-air cash, in the form of the covid stimulus bill currently in the Senate and the blockbuster “infrastructure” plan that’s apparently next in the pipeline. 

Here’s Goldman Sachs’ prediction on that one:

Biden’s Infrastructure Bill Could Be $2 Trillion Behemoth—Here’s What Goldman Sachs Is Expecting

With President Biden’s $1.9 trillion stimulus bill now making its way to the Senate, analysts from investment giant Goldman Sachs laid out their expectations for his forthcoming infrastructure spending initiative—the second phase of his ambitious plan to revitalize the American economy.

In a research note late Sunday, Goldman Sachs’ analysts said they expect the proposal will be worth at least $2 trillion—and potentially even double that—over the next 10 years based on previous proposals and estimates of how much investment will be necessary to shore up U.S. infrastructure.

They also note that the upcoming package could be broader in scope than expected and has the potential to expand beyond infrastructure, green energy, and climate change initiatives to include a spate of Democratic priorities including childcare, healthcare, and education initiatives, though the Washington Post reported last month that Democrats are far from united on what the final proposal should look like.

“We are so far behind the curve,” Biden said last month ahead of a meeting with labor leaders to discuss his stimulus and infrastructure legislation. “We rank something like 38th in the world in terms of our infrastructure—everything from canals to highways to airports.” He added that the United States needs to do “everything we can do…to make ourselves competitive in the 21st century.”

A quick digression: Notice the neat bit of misdirection here? 

“Infrastructure,” like “covid relief,” can in practice mean pretty much anything a political party wants to give its favorite voting blocs. 

So childcare, education, you name it, it’s either relief or infrastructure.

It might be simpler to just view this year’s spending as a back-door bailout for badly run cities and states. 

After all, part of being “badly-run” is skimping on roads, bridges, etc., in order to fund wildly overgenerous public sector pensions. 

Because most of these places have been ruined by politicians from the same party that now controls Washington, a direct bailout would be politically dangerous. 

But give a bill a popular name and Congress can lard it with everything on the wish list.

Now back to an analysis of the numbers, the point of which is that in this late, decadent stage of rampant financialization, governments will always find a way to borrow and spend more, because there’s really no other option if they want to avoid presiding over Great Depression 2.0. 

More debt won’t delay the Big D forever of course, but it might move it beyond the retirement date of most of the current political class, and for them, that seems to be enough.

As for temporary inflation, one could make the case that it’s already raging — in lumber, for instance, in copper, and of course in interest rates. The 10-year Treasury Note responded to Powell’s speech by jumping back up to 1.5%.

These are all signs of an overheating economy – BEFORE several trillion more dollars inundate construction, housing, and finance. 

So how exactly can the Fed know that the even more broad-based inflation likely to result from tossing new gasoline onto this fire will burn itself out in a few months? 

The answer is they can’t possibly know that. 

And history says that this particular genie, once out of the bottle, likes to stick around for a good long while.

The Case for a Higher Minimum Wage

In their push to increase the US federal minimum wage from $7.25 to $15 per hour, President Joe Biden and his fellow Democrats are on solid ground not just economically but also politically. A higher wage floor would create an impetus for good jobs, which is precisely what Western economies are lacking.

Daron Acemoglu

BOSTON – Efforts in the United States to increase the federal minimum wage from $7.25 to $15 per hour have gained steam now that the Democratic Party controls the White House and Congress. Such a move makes sense both economically and politically.

Economists are no longer as skeptical of minimum wages as they once were. It used to be assumed that labor markets worked flawlessly, thereby denying employers the monopoly power with which to extract “rents” above the fair return for their physical capital investments. Under such circumstances, basic economics predicts that a higher minimum wage would reduce employment.

But research since the late 1980s has, for the most part, failed to find major disemployment effects from modestly higher minimum wages. The first salvo came from David Card of the University of California, Berkeley and the late Alan B. Krueger of Princeton University (partly building on joint work with Lawrence F. Katz). 

Their seminal work – summarized in their book Myth and Measurement: The New Economics of the Minimum Wage – found that reduced employment did not follow minimum-wage hikes; in some cases, employment actually rose when wage floors were raised.

Although these findings incited controversy at the time, additional evidence based on larger samples and more fine-tuned empirical approaches confirmed them. If minimum wages don’t reduce employment by much, if at all, it may be inferred that large employers of low-wage workers (like McDonald’s or Walmart) do have market power with which to earn rents (though the jury remains out on this question).

The earlier economics literature may also have underestimated other potential gains from minimum wages. After all, such policies do more than merely increase low-wage workers’ earnings. 

My own work finds that minimum wages tend to discourage low-pay employment and create an impetus for the creation of good jobs with higher wages, more security, and possibilities for career advancement. 

Now that opportunities are dwindling for workers without a college degree – many of whom must resort to the gig economy and zero-hour contracts – the need for such an impetus has become more urgent.

True, some economists worry that minimum wages can discourage skills training and other investments in worker productivity. But as Steve Pischke of the London School of Economics and I have shown, this concern has been exaggerated. 

When employers are earning rents – as seems to be the case in US low-wage markets – they can accommodate a small increase in the minimum wage without having to fire their employees. Better yet, when an employer must pay its workers higher wages, it has a stronger incentive to boost their productivity.

Moreover, while Democrats are already on solid empirical ground for advocating a higher minimum wage, the case for doing so is even stronger when one considers non-economic factors. 

As the philosopher Philip Pettit explains, humans strive for freedom from “dominance,” which he defines as living “at the mercy of another, having to live in a manner that leaves you vulnerable to some ill that the other is in a position arbitrarily to impose.” One is being dominated when one is “subject to arbitrary sway; being subject to the potentially capricious will or the potentially idiosyncratic judgment of another.”

This definition captures the experience of those throughout human history who have lived in servitude. But as James A. Robinson and I emphasize in our book The Narrow Corridor, even though most workers in the West no longer need to worry about the most brutal forms of labor coercion, the absence of job security and pay sufficient to meet one’s needs means that one is still subject to “dominance.”

Of course, neither Pettit nor James and I were the first to seize on this point. One of the architects of the British welfare state, William Beveridge, argued in 1945 that “Liberty means more than freedom from the arbitrary power of governments. It means freedom from economic servitude to Want and Squalor and other social evils; it means freedom from arbitrary power in any form. A starving man is not free.” Likewise, Article 23 of the 1948 Universal Declaration of Human Rights states that “Everyone who works has the right to just and favorable remuneration ensuring for himself and his family an existence worthy of human dignity.”

Viewed in this light, the Democrats’ efforts to increase the minimum wage and expand worker protections should be viewed as a return to a social agenda that has been ignored for too long. In an increasingly unequal and stratified economy, policies to level the playing field and reduce dominance are long overdue.

As always, policy design matters. At some point, raising the federal minimum wage probably would start to produce disemployment, and it is reasonable to question whether the same minimum wage should be applied to all parts of the country, considering the cost-of-living differences between New York and Mississippi, or Massachusetts and Louisiana. 

Hence, some economists call for state minimum wages to be calibrated to average earnings in local labor markets. But most states have not taken the initiative to raise their minimum wages, leaving the federal government to set a new floor.

A higher federal minimum wage would have a powerful economic as well as symbolic effect; but it’s no panacea. Without a voice in the workplace and a safe working environment, workers will remain under the “arbitrary sway” of their employers. 

If raising the federal minimum wage is the only substantive labor-market policy the Democrats enact during President Joe Biden’s first term, they will not have achieved much, and may even have created stronger incentives for employers to automate more tasks.

The biggest problem facing Western economies today is a shortage of good jobs, owing to an excessive focus on automation and insufficient efforts to develop new technologies and tasks that benefit workers from all backgrounds. 

A minimum-wage hike would represent an important first step, but it must be accompanied by policies to redirect technological change and provide incentives for employers to create good jobs and better working conditions.

Daron Acemoglu, Professor of Economics at MIT, is co-author (with James A. Robinson) of Why Nations Fail: The Origins of Power, Prosperity and Poverty and The Narrow Corridor: States, Societies, and the Fate of Liberty.