About Me

An investor and counsellor in financial markets

Wednesday, July 31, 2019

Why investors favour economically orthodox political strongmen

A guide to auto-technocracy

A good way to start an argument that never ends is to try to define populism. Dictionaries say it is politics directed at ordinary people who feel neglected by elites. That leaves a lot out, not least economics. In 1990 Sebastian Edwards and the late Rudiger Dornbusch sketched what is meant by “economic populism”. It is an approach, they wrote, that denies that budget deficits or inflation are constraints on economic growth. The Latin American populists they studied printed money to pay for public-spending binges. It ended in tears.
There is no shortage of leaders in middle-income countries who fit the dictionary definition of a populist. But economic populism in its purest form is now quite rare (though its results are sadly evident in Venezuela). These days a lot of would-be champions of the people prefer their macroeconomic policies on the orthodox side: inflation targets, fiscal restraints, free-floating currencies, that sort of thing. They are happy to let technocrats get on with it.
This poses a mild dilemma for rich-world investors—which is soon resolved. They may be appalled by the social and foreign policies of such strongmen. Yet as professionals they are also unthrilled by inflation, default, devaluation and adverse shifts in politics, which are hazardous to a bond portfolio. They tend to favour autocrats who like technocrats. You might call it the Putin Principle.
Russia’s president, Vladimir Putin, is not much loved in the West. Buttressed by suppression at home and military adventures abroad, he is the archetypal strongman. He combines this with an affinity for well-qualified economists. His finance ministry frames the budget by a conservative fiscal rule. Inflation is under control, helped by fairly high interest rates. The central bank’s governor, Elvira Nabiullina, is widely admired.
What pulls investors in or puts them off is hard to pin down. It is never a single factor; the world is more complex than that. But a currency’s value is a clue to general sentiment, because it is a shadow price of a country’s assets relative to everybody else’s. The rouble is one of the best-performing currencies this year (see chart). Just as telling is that countries led by strongmen hostile to orthodox policies have seen their currencies suffer—the inverse of the Putin Principle. Recep Tayyip Erdogan, president of Turkey, is the exemplar. For years he has bullied the central bank. Earlier this month he sacked its governor for keeping interest rates high. The lira has suffered badly.
Other countries can be paired on the Putin-Erdogan scale. Take Egypt and Pakistan. The army looms over both. Under the IMF’s auspices, Egypt has followed orthodox policies. The Egyptian pound has risen. Pakistan lost its policy discipline as soon as its most recent IMF programme ended (though it has just signed up to another one). The rupee is down.
The populist label also fits both Jair Bolsonaro of Brazil and Andrés Manuel López Obrador of Mexico. But when Mr Bolsonaro was elected, investors sensed that he might defer to technocratic advisers such as Paulo Guedes, now his economy minister, who has a doctorate in economics from the University of Chicago. Sure enough, Mr Bolsonaro recently shepherded through Brazil’s lower house a pension reform that is vital to the country’s fiscal stability. The real rallied. Meanwhile, Mr López Obrador’s finance minister abruptly resigned. He complained bitterly that technocrats had been sidelined in favour of unqualified types. The peso wobbled.
The leader who is hardest to place on the scale is Narendra Modi of India. He has all the elements of a strongman: strident nationalism, personality cult, enfeebled opposition. He appears to value technocrats. As prime minister and, before that, as chief minister of Gujarat, he has relied on a small band of trusted civil servants. And though he is no fiscal hawk, he seems to grasp that the budget has limits. But there are marks against him. He lost two well-regarded central-bank governors in his first term as prime minister. His madcap idea to withdraw banknotes from circulation in 2016 was anything but orthodox.
The autocrat-plus-technocrat model is not rock-solid. One frailty is that stability is worth less if it is not married to policies that promote economic growth. Such growth-boosting reforms often, or usually, founder on vested interests on which the autocrat depends. But if living standards are not growing, voters will demand old-style economic populism. And that always ends badly.

Tuesday, July 30, 2019

What economists have gotten wrong for decades

In a House hearing on monetary policy last week, Federal Reserve Chair Jerome Powell made a telling confession in response to a question from Rep. Alexandria Ocasio-Cortez (D-NY). The topic was the so-called natural rate of unemployment: the idea, believed by many economists and policymakers, that there is a rate at which unemployment could get so low that it could trigger ever-rising inflation.

It’s an idea that has governed decades of monetary policymaking, often prompting the Fed to keep interest rates higher than it should — slowing down the economy in the process — out of fear of accelerating inflation.
Ocasio-Cortez didn’t waste time poking holes in it. She pointed out that the unemployment rate, now 3.7 percent, has fallen well below the Fed’s estimates of the natural rate, which it forecast at 5.4 percent in 2014 and 4.2 percent today. And yet, she noted, “inflation is no higher today than it was five years ago. Given these facts, do you think it’s possible that the Fed’s estimates of the lowest sustainable unemployment rate may have been too high?”
Powell’s response, to his credit, was as simple and direct as you’ll ever hear from a central banker: “Absolutely.” He elaborated: “I think we’ve learned that ... this is something you can’t identify directly. I think we’ve learned that it’s lower than we thought, substantially lower than we thought in the past.”
Powell’s response was commendable, perhaps even groundbreaking; here was the Fed chair challenging decades of conventional economic wisdom. It was a welcome sign of a policymaker’s willingness to question age-old assumptions that have dictated policy and affected millions.
And it’s not the only economic “iron law” that we need to revisit. In the spirit of Powell’s act, I’d like to dig deeper into some assumptions that have defined economic policymaking these past few decades, assumptions that have needlessly caused a lot of economic pain.
The natural rate of unemployment that AOC questioned is one such idea (more on that below). There are three others worth singling out:
  • that globalization is a win-win proposition for all, an idea that has deservedly taken a battering in recent years;
  • that federal budget deficits “crowd out” private investments; and
  • that the minimum wage will only have negative effects on jobs and workers.
Economists and policymakers have gotten these ideas wrong for decades, at great cost to the public. Especially hard hit have been the most economically vulnerable, and these mistakes can certainly be blamed for the rise of inequality. It’s time we moved on from them.

1) Going below the natural rate of unemployment could spark an inflationary spiral

The mandate of the Federal Reserve is to achieve maximum employment and stable prices. It has interpreted the latter to mean an inflation rate of 2 percent. For decades, the Fed has used the benchmark interest rate it controls to target that inflation rate, and it’s done so by trying to keep actual unemployment close to its estimate of what’s called the natural rate of unemployment — a rate below which it was believed inflation would spiral up.
The problem is that the core relationship behind this model — the negative correlation between unemployment and inflation — has been weakening for years, and with it any ability to reliably estimate the natural unemployment rate. Moreover, as Powell acknowledged, there’s been an asymmetry: Because the estimates of the natural rate have been too high, the Fed has often intervened in the direction of raising or failing to cut interest rates.
The cost of this asymmetry has been steep. Since 2009, the average of the Fed’s natural rate estimate has been about 5 percent. As Powell stressed, we can’t accurately identify the natural rate of unemployment, but suppose it’s actually 3.5 percent. Targeting 5 percent unemployment when we could achieve 3.5 percent with little risk of spiraling inflation would mean 2.4 million people unnecessarily out of work. Even targeting the Fed’s current natural rate (4.2 percent) would sacrifice a million potential workers to the altar of an empirically elusive concept.
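The back-of-the-envelope arithmetic behind these job counts is easy to check. Here is a minimal sketch, assuming a US labor force of roughly 160 million (a figure not given in the text):

```python
labor_force = 160_000_000  # rough US labor force (assumption, not from the text)

def excess_unemployed(target_rate, achievable_rate, labor_force):
    """People left jobless because policy targets a higher
    unemployment rate than the economy could actually sustain."""
    return (target_rate - achievable_rate) * labor_force

# Targeting 5% unemployment when 3.5% is achievable:
print(round(excess_unemployed(0.050, 0.035, labor_force)))  # 2400000, i.e. ~2.4 million
# Targeting the Fed's current 4.2% natural-rate estimate:
print(round(excess_unemployed(0.042, 0.035, labor_force)))  # 1120000, i.e. ~1.1 million
```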
And the unemployed are just one subgroup that gets hurt in such a scenario. Much research has shown that in slack labor markets, middle- and low-wage earners lack the bargaining clout they have in tight labor markets. As such, they face lower pay, fewer hours of work, higher poverty, and wider racial economic gaps.
By contrast, high-income households are little affected — which means that labor market slack can deepen inequality. The figure below, from a recent paper by Keith Bentele and me, shows the acceleration — the difference between wage growth in strong versus weak labor markets — for real annual earnings.
For low-income workers, we found earnings rose at about a 2 percent annual pace in hot labor markets and fell at about a 4 percent pace in cool ones (the difference, 6 percent, is the first bar). Clearly, the benefits of moving from slack to taut conditions are much more important for low- than for high-earning households.
Such are the costs of over-estimating the natural rate.
Source: Jared Bernstein and Keith Bentele

2) Everybody wins with globalization

Back in the 1990s, when the Clinton administration was trying to sell NAFTA, the view that expanded trade was virtually all upside began to pervade the rhetoric and politics of both parties. It was supported by economic arguments that exporting industries would expand into new markets and add jobs, and that consumers would enjoy cheaper goods. By dint of their superior productivity, US manufacturers and their communities wouldn’t be hurt. Any disruption to workers’ livelihoods was either dismissed as an impossibility or placed under the antiseptic rubric of “transition costs.”
This excerpt from the 1994 economic report of the president nicely captures the zeitgeist:
As economists have long predicted, freer trade has been a win-win strategy for both the United States and its trading partners, allowing all to reap the benefits of enhanced specialization, lower costs, greater choice, and an improved international climate for investment and innovation. American industries—both their workers and their owners—have benefited from increased export markets and from cheaper imported inputs. American consumers have been able to purchase a wider variety of products at lower prices than they could have without the expansion of trade.
When pressed as to how expanded trade could truly be “win-win,” advocates like Clinton’s economics team above cited the economic theory of comparative advantage: When trading partners produce what they’re best at producing, both countries will come out ahead.
But the theory never said expanded trade would be win-win for all. Instead, it (and its more contemporary extensions) explicitly said that expanded trade generates winners and losers, and that the latter would be our blue-collar production workers exposed to international competition. True, the theory maintained (correctly in my view) that the benefits to the winners were large enough to offset the costs to the losers and still come out ahead. But as trade between nations expanded, policymakers quickly forgot about the need to compensate for the losses.
The era of free trade eventually led to large trade deficits with countries whose factories were comparably productive to ours but whose wages were much lower, most notably Mexico and China. As in every other advanced economy, the share of US manufacturing employment had long been drifting down. But the number of US factory jobs held pretty constant at around 17 million — until around 2000, when, over the next decade, almost 6 million such jobs were lost. Economists who’ve studied the period now refer to it as “the China Shock.”
Once again, these impacts didn’t translate into just job losses; wages were hit, too. Between the late 1940s and the late 1970s, when production workers were relatively insulated from foreign competition, blue-collar manufacturing compensation more than doubled. By contrast, it has grown only 5 percent since then.
Did the winners from trade — the multinational corporations that relocated production, the finance sector that made the deals, the retailers that profited from “the China price” — compensate the losers? Of course not. They argued that “everyday low prices” were reward enough.
But not only did the winners fail to help the losers — say, through serious employment-replacement programs, robust safety net assistance, direct job creation, and investments to make our manufacturers more competitive — they instead used their winnings to invest in politicians to cut their taxes and write ever more trade deals favoring investors over workers.
Let me be very clear. Both the US and developing countries have significantly benefitted from global trade. But because of the demonstrably false view that free trade is all upside — win-win — considerable economic pain has been meted out, pain that has not been met with anything approaching an adequate policy response.

3) Deep budget deficits will crowd out private investment

For decades, economists argued that when the federal government runs a budget deficit, it pushes up interest rates and slows economic growth. It’s a theory known as “crowd-out,” suggesting government borrowing from a relatively fixed stock of loanable capital crowds out private borrowing, which in turn raises the cost of capital — i.e., the interest rate.
But this is yet another relationship that has failed to hold up, though not before its adherents created considerable hardship, both here and even more so in Europe, through austere budget policy in the wake of the Great Recession. The belief in this idea prompted policymakers to reduce government spending to avoid alleged crowd-out effects well before the private sector had recovered and could generate enough growth on its own.
There were certainly periods in the past when crowd-out did indeed appear in the data. The 1970s and early 1980s saw larger budget deficits (i.e., more negative) and higher interest rates. But since then, deficits have swung significantly up and down while interest rates have consistently drifted down.
Most recently, we’ve been posting very large budget deficits given the state of the economy (due to both deficit-financed tax cuts and spending) and interest rates are nonetheless hitting historic lows — precisely the opposite of crowd-out predictions.
Source: Federal Reserve and Bureau of Economic Analysis
This all sounds pretty abstract, but it has stark implications on the ground. Based on the deeply embedded notion (at the time) that the deficits built up in the Great Recession needed to come down quickly, the federal government pivoted to deficit reduction well before our private sector had recovered.
As a member of the Obama economic team at the time, I can confirm that crowd-out fears were a motivation for the pivot. According to an analysis by the Brookings Institution, between 2011 and 2014, fiscal policy cut about 1 percentage point per year from real GDP growth. Based on the historical correlation between growth and jobs, this austerity added 2 points to the unemployment rate in those years, costing about 3 million jobs.
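The jump from growth lost to jobs lost runs through the historical growth–jobs correlation (Okun's law). A rough sketch of that arithmetic, with an assumed Okun coefficient of 0.5 and a labor force of about 155 million in those years (both assumptions, not figures from the text):

```python
# Okun's-law sketch of the austerity cost described above.
okun_coefficient = 0.5       # assumed: ~0.5pp of unemployment per 1pp of lost GDP growth
labor_force = 155_000_000    # assumed US labor force, 2011-14

gdp_drag_pp = 1.0 * 4                               # ~1 point a year over 2011-2014
unemployment_pp = gdp_drag_pp * okun_coefficient    # 2.0 extra points of unemployment
jobs_lost = unemployment_pp / 100 * labor_force     # 3100000, i.e. ~3 million jobs
print(unemployment_pp, round(jobs_lost))
```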
I tend not to give Trump a lot of credit for economic policy, and I believe his tax cut will exacerbate inequality and rob the Treasury of needed revenue. But the fiscal economics of Trump’s tax cuts are revealing in ways that relate both to crowd-out and the natural rate of unemployment. As noted, deficits are up and interest rates are down. Meanwhile, the positive fiscal boost has helped drive the unemployment rate down to 50-year lows while inflation remains low and stable. These developments clearly undermine long-held economic doctrines, and they’ve been a boon to working families.
That said, a final point must be underscored: The absence of crowd-out doesn’t mean deficits no longer matter. Even with low rates, we’ll still be devoting more tax revenue to financing our debt, and even more worrisome is the fact that we’re almost certain to enter the next recession with a debt-to-GDP ratio that’s twice that of the historical norm. This will likely lead Congress to be more timid in fighting the next recession. But this is a political constraint, not an economic one.

4) A higher minimum wage will only hurt workers

Another big mistake with lasting consequences has been the assumption that minimum wage increases will hurt their intended beneficiaries: low-wage workers.
The theory is that free markets set an “equilibrium” wage that perfectly matches supply and demand, given employers’ needs and workers’ capabilities. Force that equilibrium wage up, the thinking goes, and rampant unemployment will result.
When I was coming up in the profession, our textbooks argued that believing minimum wages could help low-wage workers was akin to believing that water flowed uphill. Their message was particularly comforting to conservative politicians who wanted to protect the profits of employers of low-wage workers.
Today, decades of high-quality research (much of it initiated by the late, great economist Alan Krueger) have introduced a much more nuanced view about the true impacts of minimum-wage hikes. But years of economists’ opposition to the policy have left us with a national minimum wage of $7.25 per hour, a level far too low to support the many families that depend on the minimum wage. (Another myth was that only teenagers earned the minimum; David Cooper’s work shows the main beneficiaries of higher minimum wages are working adults.)
How the consensus began to change is instructive. To their credit, some state policymakers decided to ignore the economists and raise minimum wages in their states. This provided researchers like Krueger with quasi-natural experiments of a type too rare in economics. The positive results of these studies led many more states and cities to raise their wage floors (29 states plus DC now have minimums above the federal level), and this fed back into the experimental research, creating a powerful loop.
Summarizing a large and still contentious body of research, a fair conclusion is that, conditional on their magnitude, minimum wage increases accomplish their goal of raising pay for low-wage workers without large job-loss effects. But the broader point is that an economic relationship believed to be steadfast was tested and was found wanting.
The changing consensus can be seen in a new report from the Congressional Budget Office — a bastion of mainstream economics — that found an increase in the minimum wage to $15, phased in by 2025, would benefit 27.3 million workers, with an average gain of $1,500 per year, reduce the number of the poor by 1.3 million, but also cut employment of affected workers by 1.3 million. Yes, some would lose jobs, but so many more would benefit — hardly the “everybody loses” prediction that prevailed among economists for decades.
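The CBO's headline numbers can be tallied to show why "so many more would benefit." A quick check using only the figures quoted above:

```python
# Tallying the CBO's $15-by-2025 minimum-wage estimates quoted above.
beneficiaries = 27_300_000     # workers who would see higher pay
avg_annual_gain = 1_500        # average gain in dollars per year
jobs_lost = 1_300_000          # estimated employment reduction

aggregate_gain = beneficiaries * avg_annual_gain
print(aggregate_gain)             # 40950000000, i.e. roughly $41bn a year in wage gains
print(beneficiaries / jobs_lost)  # 21.0: about 21 workers gain for every job lost
```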

What all these economic mistakes have in common

Pegging the “natural rate” too high, ignoring the harm from exposure to international competition, austere budget policy, low and stagnant minimum wages — all of these misunderstood economic relationships have one thing in common.
In every case, the costs fall on the vulnerable: people who depend on full employment to get ahead; blue-collar production workers and communities built around factories; families who suffer from austerity-induced weak recoveries and under-funded safety nets, and who depend on a living wage to make ends meet. These groups are the casualties of faulty economics.
In contrast, the benefits in every case accrue to the wealthy: highly educated workers largely insulated from slack labor markets, executives of outsourcing corporations, the beneficiaries of revenue-losing tax cuts that allegedly require austere budgets, and employers of low-wage workers.
In this regard, there is a clear connection between each one of these mistakes and the rise of economic inequality.
I cannot overemphasize the importance of recognizing who benefits and who loses from these economic mistakes, because that difference is why the mistakes persist. Every one of the wrong assumptions described here benefits conservative causes, from reducing the bargaining clout of wage earners, to strengthening the hand of outsourcers and offshorers, to lowering the labor costs of low-wage employers. These economic assumptions are thus complementary to the conservative agenda, and that, in and of itself, makes them far more enduring than the facts warrant.
It is no coincidence that the assumptions are being so rigorously questioned by a new group of highly progressive politicians, like Rep. Ocasio-Cortez. They are making the critical connections in our political economy to challenge old assumptions that have hurt working people for too long. The vast majority of us will be better off for their work.

Monday, July 29, 2019

Tensions between Iran and the West have the Gulf states on edge

The conflict threatens their infrastructure and their income

Compared with Jebel Ali in Dubai, it feels like a sleepy Mediterranean harbour. The port at Fujairah, on the eastern coast of the United Arab Emirates (UAE), serves mostly as a refuelling depot for ships plying the Strait of Hormuz. It lacks the cargo capacity and the high-tech wizardry of Jebel Ali, the largest port in the Gulf and the ninth-busiest in the world. But Fujairah is the UAE’s only link to the high seas that bypasses the troubled strait, and so it has become a focal point amid worsening tension between Iran and the West.
That tension rose on July 19th when Iran’s navy seized the Stena Impero, a British tanker (pictured), as it sailed west through the strait. The 30,000-tonne ship is now anchored near Bandar Abbas, hostage to a dispute that began on July 4th, when Britain impounded an Iranian tanker (allegedly bound for Syria) as it passed Gibraltar. In one of his final acts as foreign secretary, Jeremy Hunt proposed setting up a European task force to protect commercial vessels in the Gulf.
Britain and its allies worry about the threat to business and energy supplies. For the Gulf Co-operation Council (GCC), though, tensions with Iran border on an existential issue. Despite some hawkish rhetoric, Gulf states are nervous about President Donald Trump’s policy of imposing “maximum pressure” on Iran. Conflict threatens their infrastructure and could hamper the oil and gas shipments that fill their treasuries. “Who’s going to pay the price? It’s us,” says a Qatari diplomat.
For all its threats, Iran cannot close the Strait of Hormuz, the conduit for one-fifth of the world’s traded oil and a quarter of its liquefied natural gas. But it has already raised the cost of commercial shipping. Fujairah became a target in May, when four oil tankers anchored offshore had holes blown in their hulls. Iran is the prime suspect, though investigators have not formally assigned blame. The bunkering business in Fujairah has suffered as a result.
Insurance premiums for the strait have climbed by an average of 10%. For the largest oil tankers, they have doubled, with a transit now costing as much as $500,000 to insure. Some shippers may decide not to take the risk (and bear the cost) of sailing through the strait. That is a concern for the Gulf states, which rely on the waterway to import everything from wheat to cars. Three of them—Bahrain, Kuwait and Qatar—have no other outlet to the sea.
Infrastructure on land is vulnerable too. The Houthis, a Yemeni Shia militia backed by Iran (and which sometimes acts as its proxy), are fighting a Saudi-led coalition at home. But they have also hit soft targets in the region. In May the Houthis took credit for striking an oil pipeline in Saudi Arabia (American officials blamed Shia militias in Iraq). The Houthis claim to have used long-range drones, which could also hit oilfields in the UAE. At least three times since May the group has fired missiles at the international airport in Abha, in southern Saudi Arabia. One person has been killed.


Gulf states struggle to counter these threats. Though they have spent tens of billions of dollars on military kit from America and Europe, it is not always the right kit. Tanks and fighter jets have limited value in an asymmetric conflict. Their navies are small and lack combat experience; they train with the Americans and are investing in new ships, but play only a supporting role in regional security. Years of talk about an integrated GCC missile-defence command have led nowhere, and individual defences are spotty. If drones did hit Saudi Arabia’s oil pipeline in May, they would have spent hours flying undetected over the kingdom.

In June the militia lobbed a rocket at a major Saudi desalination plant in Al-Shuqaiq. It caused little damage but highlighted another vulnerability: the kingdom gets about one-third of its drinking water, more than 1bn cubic metres a year, from such facilities, which are expensive to build and easy to target. The Qataris even worry about their national air carrier, which has been forced, since its Arab neighbours imposed an embargo in 2017, to route hundreds of daily flights over Iran. At least two drones (one American, one Iranian) have recently been shot down in the area.

Saudi Arabia has long viewed Iran as its chief enemy. It still broadly supports American policy. Officials in the smaller Gulf emirates are unhappy, though, and those in the UAE feel particularly stuck. In public they cannot break with Mr Trump or their Saudi allies. But they are subtly distancing themselves. They are withdrawing troops from Yemen partly to lower tensions with the Houthis—and thus avoid being attacked. They have also taken a cautious line on Iran, even suggesting it may not have been responsible for the sabotage in Fujairah. “They could shut this place down with a few missiles,” says an official in Dubai. “We need to protect our own interests.”

Friday, July 26, 2019

The rise and rise of private capital

Why investors are seeking higher returns away from public markets

The biggest event on Wall Street in 1956 was the IPO of a company that had stayed private for more than half a century. Investors queued to secure shares in the Ford Motor Company. Their price jumped $5 on the first day’s trading. Yet by the time the company listed, it had no real need for capital. Henry Ford, its founder, had long been hostile to the idea of going public. It was almost a decade after his death in 1947 when the family foundation at last decided to sell some of its Ford shares to the public.
This year’s wave of high-profile flotations has some similarities. Uber, Lyft, Slack and the rest are not as old as Ford was when it listed. But nor are they in the first flush of youth. New firms are staying private for longer. The number of public firms in America has declined by more than a third since the 1990s.

One explanation is that today’s tech firms have less need of public capital. More of the value of startups is tied up in ideas than in fixed assets such as factories. Tech moguls like the opportunities a public listing brings. But, like Henry Ford, they are less keen on the loss of control—and the scrutiny (business secrecy matters a great deal for ideas-led firms). Yet the crucial shift has been not a fall in demand for capital, but a rise in its supply. Sums that could once be raised only on public markets can now be readily tapped from private sources.
To grasp the rise of private capital, go back to Ford’s day, when finance was simpler. Private capital was a bank loan, a contract between a borrower and lender, or a grubstake raised from family members, friends or business associates. If larger sums were needed a firm might issue securities in the public markets—equity shares or (for the well-established) high-grade bonds.
Around the time that Ford Motors went public, venture-capital firms emerged to provide seed capital to startups. Later on, in the 1980s, Drexel Burnham Lambert, an upstart investment bank, had the idea of issuing low-grade “junk” bonds to buy public companies (so-called “leveraged buy-outs”, or LBOs).
Today’s big private-equity firms, such as KKR and Blackstone, cut their teeth during the buy-out boom of the 1980s. Rival firms, such as Apollo and Ares, were founded by Drexel alumni. Changes to capital-markets regulation in the 1990s and 2000s set the scene for the recent growth of private markets. The pooling of private capital was made easier. Listing on public markets was made harder.
Deeper forces were at work, too, which strengthened after the financial crisis. The secular decline in long-term interest rates, caused in part by abundant savings, was given an extra push by the easy-money policies of central banks. Yields on stocks and corporate bonds also declined. The venturesome looked to private markets for higher returns. By their nature, such markets are illiquid and more lightly regulated. The intrepid hope to be rewarded for tying up their money for longer and for doing their homework.
A gap in the supply of credit was left by capital-constrained banks. One manifestation is the popularity of leveraged loans—bond-like securities sold to syndicates of private investors. Easier than bank finance, and more flexible than junk bonds, they are the preferred fuel for LBOs. Other forms of private credit look a lot like bank credit. A syndicate may consist of just a handful of lenders. The loan is tailored. It might be for an office building, or to buy breathing space for a company to get back on track.
The growing supply of private forms of equity capital is even more remarkable. Young tech firms are almost spoilt for choice. Sovereign-wealth funds, family offices and even staid old pension funds now compete for stakes in newish, unlisted firms. Private-equity firms have $2.4trn waiting to be put to work, according to Preqin, a data-provider. A scarcity of buy-out targets means funds are increasingly reluctant to let go of good firms. The lifespan of some funds is being extended beyond ten years.
Even Henry Ford would applaud such patient capital. It is now received wisdom that ideas-rich firms are better suited to private capital than to public markets, with their endless disclosures and distracting spotlight. But that wisdom tends to play down the supply-side of capital markets. In fact “public markets are not inherently antithetical to tech,” says Ajay Royan of Mithril, a venture-capital firm based in Austin, Texas. Steve Jobs did fine in them. So did Bill Gates. What Silicon Valley takes today as a lasting norm might simply be the outgrowth of unusually low interest rates.

Thursday, July 25, 2019

Socialism: A Man-Made Malthusian Trap

Despite significant economic progress since ancient times, most people in agrarian societies continued to live at the subsistence minimum until modern times. By the nineteenth century, such societies were said to have fallen into a "Malthusian trap": population growth kept pace with available resources, so any increase in income per person was unsustainable in the long run, since economic growth was inevitably consumed by population increases.
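The trap's dynamic, in which income gains are consumed by population growth until incomes fall back toward subsistence, can be illustrated with a toy simulation (every number here is illustrative, not drawn from the historical record):

```python
# Toy Malthusian model: output is shared among the population, and the
# population grows whenever income per person exceeds subsistence,
# pulling income per person back down toward the subsistence level.
subsistence = 1.0
population, output = 100.0, 110.0   # illustrative starting values

for year in range(200):
    income = output / population
    population *= 1 + 0.05 * (income - subsistence)  # growth eats the surplus
    output *= 1.001                                  # slow pre-industrial progress

print(round(output / population, 2))  # 1.02: income settles just above subsistence
```

The point of the sketch is the steady state: output per person ends barely above subsistence no matter how prosperous the starting position, which is exactly the pattern pre-industrial societies exhibited.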
Western European countries, however, managed to escape the Malthusian trap through the Industrial Revolution, which accelerated in the nineteenth century. Escaping the trap meant an increase in both population growth and economic prosperity for the vast majority of people. Europe’s population more than doubled between 1800 and 1900, for example, yet this growth was no longer accompanied by the decline in living standards seen in pre-industrial societies. Economic historians have attributed the phenomenon to technological advances, demographic shifts arising from European marriage patterns (marrying later, establishing a separate household, having fewer children), and increased human intelligence.
All of the above are supposed to secure a systematic excess of output growth over population growth. One crucial factor should be added to the list: capitalism itself, under which economic laws unfold fully and exert their maximum effect on society. Humanity entered a capitalist mode of production, made possible by limits on absolutism and state intrusion into the economy, the creation of democratic institutions, and improvements in human rights and in the supremacy and uniform application of the law.
The ideal market economy emerges in a society described as a collection of numerous self-governing producers who meet multiple independent consumers and freely exchange commodities and services at prices established on the market by the equilibrium of supply and demand. People behave according to the rules and freely exercise their right to enter a business transaction or to refuse to participate. Such a society is characterized by the primacy of private property, an extensive division of labor and cooperation, and a rich assortment of commodities and services. Economic freedom is accompanied by a high degree of personal freedom. The closest historical approximation of this ideal was capitalism in the age of the classical liberals, during the early stages of the Industrial Revolution.
Thus, we already see the following pattern: in hunter-gatherer societies, economic laws had minimal manifestation but the most prolonged influence (more than 150,000 years). The agricultural revolution created more stable and secure communities, but they were characterized by a lack of capital that led to low use of more productive factors of production. Low levels of personal and economic freedom also inhibited growth and productivity. Industrialization, on the other hand, offered an escape.
But the escape is not always permanent. Socialist governments often act to undo the benefits of industrialization and capitalism. Historically, socialist regimes have tried to suppress or override the natural operation of personal choice and capital accumulation in economies. Socialism, in general, encroaches on private property rights, controls the economy, and subordinates individual decision-making to the collective. In this regard, it is reasonable to assume that socialism would push society back toward the Malthusian trap.

Let us examine this hypothesis in light of the case of Venezuela. Judging by GDP per capita (Figure 1), Venezuela escaped the Malthusian trap only in the 1930s. According to scholars, from the 1920s to the 1940s the average annual growth rate was more than 10 percent. Indeed, to escape the gravity of the trap, an economy needs a high magnitude of acceleration. It is hard to believe now, but in 1950 the country ranked fourth in the world in GDP per capita. Unfortunately, as soon as Venezuela established itself as a powerhouse of South America, the government started to implement economic policies from the cookbooks of socialism. Undoubtedly, the country fell prey to the Soviet Union’s influence on Latin America during the Cold War.
The main assault was directed at private property rights in industry and agriculture. In the late 1950s, the government nationalized the telephone company and founded state-owned metallurgical plants and petrochemical and oil corporations. The authorities initiated an agrarian reform whereby the state effectively expropriated land from large landowners and redistributed it among new farmers. Despite continuing economic growth, the Venezuelan economy was poisoned by the venom of socialism. By the 1970s, Venezuela was a mixed economy with a significant share of state-owned enterprises in the most valuable sectors, which were controlled by a central planning agency. Every new government doubled down on socialist measures as the way to solve the socio-economic issues facing society: a continuous trend of nationalizing industries, controlling prices and the minimum wage, unionization, imposing new taxes, and administering exchange and interest rates.
High revenue from the oil boom fueled the economy; the government, however, went on an enormous spending spree. People and the remaining businesses came to live off state generosity rather than creating wealth themselves. By 1980 economic growth had stalled, and over the next decade Venezuela’s economy stagnated. Partial liberal reforms undertaken by the government in conjunction with the IMF could not reverse the unfavorable trend. Several fruitful years in the mid-2000s, owing to super profits from unprecedented prices for petroleum products, were the last breath before the economy went into a nosedive after the market correction. The economy exhibited negative growth, hyperinflation, extreme impoverishment of the population, and shortages of basic food and consumer products.
How can one explain such unfortunate events? Socialism adversely affects personal and economic freedoms—the essential components of socio-economic systems that are subject to universal and natural economic laws. The implementation of socialist measures inhibited the natural flow of market forces in the official economic sphere and funneled them into the shadow economy.
Venezuela has fallen into a man-made socialist Malthusian trap. The socialist Malthusian trap is a condition of politico-economic zugzwang in which every consecutive move leads to an even worse situation. Whatever the government does within the framework of socialist thinking will have an accumulating negative effect on people’s wellbeing. Venezuela entered the territory of a humanitarian crisis, which manifested itself in widespread hunger and weakened health care. Gregory Clark’s “A Farewell to Alms: A Brief Economic History of the World” found that an average daily energy intake of about 2,300 kcal per capita or less is typical of social systems that did not escape the Malthusian trap. For comparison, the WHO has established 2,100 kcal per capita per day as the minimum daily norm.
Figure 2 shows that even during the years of stable GDP growth in 1960–1970, Venezuelans experienced food supply problems. This may be evidence of the effects of the socialist agrarian reform of the late 1950s. The inability to feed its people is, moreover, a common feature of all socialist regimes. The country counted on food imports, and when the price of oil was high, food consumption increased. In the socialist Malthusian trap, people faced acute hunger: a recent study revealed that Venezuelans lost an average of 24 lbs in body weight in 2017. Both indicators thus show that socialism drove the country into the Malthusian trap. In contrast to the original trap that all societies used to experience in their history, the socialist Malthusian trap is man-made. The economic misery was not caused by full-scale warfare or a natural disaster of biblical proportions. Instead, Venezuela had all the ingredients for success that pre-industrial society lacked, but it stepped into the uncharted territory of socialism and lost the bet.
Socialism, as a regime of willful ignorance of fundamental economic laws and of economic illiteracy, drives society back into the Malthusian trap. Venezuela is a vivid and unfortunate example of the implementation of socialist ideas in modern times. The way out of the trap is a full restoration of the economic and individual freedoms that allow the fundamental laws of economics to unfold freely to people’s advantage.

Wednesday, July 24, 2019

The Missing Three-Letter Word In The Iran Crisis

It’s always the oil. While President Donald Trump was hobnobbing with Saudi Crown Prince Mohammed bin Salman at the G-20 summit in Japan, brushing off a recent UN report about the prince’s role in the murder of Washington Post columnist Jamal Khashoggi, Secretary of State Mike Pompeo was in Asia and the Middle East, pleading with foreign leaders to support “Sentinel.” The aim of that administration plan: to protect shipping in the Strait of Hormuz and the Persian Gulf. Both Trump and Pompeo insisted that their efforts were driven by concern over Iranian misbehavior in the region and the need to ensure the safety of maritime commerce. Neither, however, mentioned one inconvenient three-letter word, O-I-L, that lay behind their Iranian maneuvering (as it has impelled every other American incursion in the Middle East since World War II).
Now, it’s true that the United States no longer relies on imported petroleum for a large share of its energy needs. Thanks to the fracking revolution, the country now gets the bulk of its oil, approximately 75 percent, from domestic sources. (In 2008, that share had been closer to 35 percent.) Key allies in NATO and rivals like China, however, continue to depend on Middle Eastern oil for a significant proportion of their energy needs. As it happens, the world economy, of which the U.S. is the leading beneficiary (despite Trump’s self-destructive trade wars), relies on an uninterrupted flow of oil from the Persian Gulf to keep energy prices low. By continuing to serve as the principal overseer of that flow, Washington enjoys striking geopolitical advantages that its foreign policy elites would no more abandon than they would their country’s nuclear supremacy.
Pompeo arriving in Abu Dhabi, June 24, 2019. (State Department/ Ron Przysucha)
This logic was spelled out clearly by President Barack Obama in a September 2013 address to the UN General Assembly in which he declared that “the United States of America is prepared to use all elements of our power, including military force, to secure our core interests” in the Middle East. He then pointed out that, while the U.S. was steadily reducing its reliance on imported oil, “the world still depends on the region’s energy supply and a severe disruption could destabilize the entire global economy.” Accordingly, he concluded, “We will ensure the free flow of energy from the region to the world.”
To some Americans, that dictum — and its continued embrace by Trump and Pompeo — may seem anachronistic. True, Washington fought wars in the Middle East when the American economy was still deeply vulnerable to any disruption in the flow of imported oil. In 1990, this was the key reason President George H.W. Bush gave for his decision to evict Iraqi troops from Kuwait after Saddam Hussein’s invasion of that land. “Our country now imports nearly half the oil it consumes and could face a major threat to its economic independence,” he told a nationwide TV audience. But talk of oil soon disappeared from his comments about what became Washington’s first (but hardly last) Gulf War after his statement provoked widespread public outrage. (“No Blood for Oil” became a widely used protest sign then.) His son, the second President Bush, never even mentioned that three-letter word when announcing his 2003 invasion of Iraq. Yet, as Obama’s UN speech made clear, oil remained, and still remains, at the center of U.S. foreign policy. A quick review of global energy trends helps explain why this has continued to be so.

The World’s Undiminished Reliance on Petroleum

Despite all that’s been said about climate change and oil’s role in causing it — and about the enormous progress being made in bringing solar and wind power online — we remain trapped in a remarkably oil-dependent world. To grasp this reality, all you have to do is read the most recent edition of oil giant BP’s “Statistical Review of World Energy,” published this June. In 2018, according to that report, oil still accounted for by far the largest share of world energy consumption, as it has every year for decades. All told, oil made up 33.6 percent of world energy consumption last year, coal (itself a global disgrace) 27.2 percent, natural gas 23.9 percent, hydro-electricity 6.8 percent, nuclear power 4.4 percent, and renewables a mere 4 percent.
Most energy analysts believe that the global reliance on petroleum as a share of world energy use will decline in the coming decades, as more governments impose restrictions on carbon emissions and as consumers, especially in the developed world, switch from oil-powered to electric vehicles. But such declines are unlikely to prevail in every region of the globe and total oil consumption may not even decline. According to projections from the International Energy Agency (IEA) in its “New Policies Scenario” (which assumes significant but not drastic government efforts to curb carbon emissions globally), Asia, Africa, and the Middle East are likely to experience a substantially increased demand for petroleum in the years to come, which, grimly enough, means global oil consumption will continue to rise.
Concluding that the increased demand for oil in Asia, in particular, will outweigh reduced demand elsewhere, the IEA calculated in its 2017 “World Energy Outlook” that oil will remain the world’s dominant source of energy in 2040, accounting for an estimated 27.5 percent of total global energy consumption. That will indeed be a smaller share than in 2018, but because global energy consumption as a whole is expected to grow substantially during those decades, net oil production could still rise — from an estimated 100 million barrels a day in 2018 to about 105 million barrels in 2040.
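Those figures can be cross-checked with simple arithmetic. The sketch below is my own back-of-the-envelope calculation, not the IEA's: it loosely combines BP's 2018 consumption share with the IEA's 2040 projections (two different sources, so the result is only indicative) to see what the numbers imply for total energy demand:

```python
# Figures quoted in the text above:
oil_2018_mbd = 100.0   # estimated oil output in 2018, million barrels/day
oil_2040_mbd = 105.0   # IEA projection for 2040
share_2018 = 0.336     # oil's share of world energy consumption, 2018 (BP)
share_2040 = 0.275     # oil's projected share in 2040 (IEA)

# If oil output rises 5 percent while oil's share of the energy mix falls
# to 27.5 percent, total energy consumption must grow by roughly:
implied_energy_growth = (oil_2040_mbd / oil_2018_mbd) * (share_2018 / share_2040) - 1
print(f"Implied growth in total energy demand, 2018-2040: {implied_energy_growth:.0%}")
```

The calculation yields growth of roughly 28 percent in total energy demand over those two decades, which is why a shrinking oil share can coexist with rising barrels: the overall pie is expected to grow faster than oil's slice shrinks.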
Of course, no one, including the IEA’s experts, can be sure how future extreme manifestations of global warming like the severe heat waves recently tormenting Europe and South Asia could change such projections. It’s possible that growing public outrage could lead to far tougher restrictions on carbon emissions between now and 2040. Unexpected developments in the field of alternative energy production could also play a role in changing those projections. In other words, oil’s continuing dominance could still be curbed in ways that are now unpredictable.
In the meantime, from a geopolitical perspective, a profound shift is taking place in the worldwide demand for petroleum. In 2000, according to the IEA, older industrialized nations — most of them members of the Organization for Economic Cooperation and Development (OECD) — accounted for about two-thirds of global oil consumption; only about a third went to countries in the developing world. By 2040, the IEA’s experts believe that ratio will be reversed, with the OECD consuming about one-third of the world’s oil and non-OECD nations the rest. More dramatic yet is the growing centrality of the Asia-Pacific region to the global flow of petroleum. In 2000, that region accounted for only 28 percent of world consumption; in 2040, its share is expected to stand at 44 percent, thanks to the growth of China, India, and other Asian countries, whose newly affluent consumers are already buying cars, trucks, motorcycles, and other oil-powered products.
Where will Asia get its oil? Among energy experts, there is little doubt on this matter. Lacking significant reserves of their own, the major Asian consumers will turn to the one place with sufficient capacity to satisfy their rising needs: the Persian Gulf. According to BP, in 2018, Japan already obtained 87 percent of its oil imports from the Middle East, India 64 percent, and China 44 percent. Most analysts assume these percentages will only grow in the years to come, as production in other areas declines.
This will, in turn, lend even greater strategic importance to the Persian Gulf region, which now possesses more than 60 percent of the world’s untapped petroleum reserves, and to the Strait of Hormuz, the narrow passageway through which approximately one-third of the world’s seaborne oil passes daily. Bordered by Iran, Oman, and the United Arab Emirates, the Strait is perhaps the most significant — and contested — geostrategic location on the planet today.
One of hundreds of Kuwaiti oil fires set by retreating Iraqi forces in 1991. (Jonas Jordan, U.S. Army Corps of Engineers via Wikimedia Commons)

Controlling the Spigot

When the Soviet Union invaded Afghanistan in 1979, the same year that militant Shiite fundamentalists overthrew the U.S.-backed Shah of Iran, U.S. policymakers concluded that America’s access to Gulf oil supplies was at risk and a U.S. military presence was needed to guarantee such access. As President Jimmy Carter would say in his State of the Union Address on Jan. 23, 1980:
The region which is now threatened by Soviet troops in Afghanistan is of great strategic importance: It contains more than two thirds of the world’s exportable oil… The Soviet effort to dominate Afghanistan has brought Soviet military forces to within 300 miles of the Indian Ocean and close to the Strait of Hormuz, a waterway through which most of the world’s oil must flow… Let our position be absolutely clear: an attempt by any outside force to gain control of the Persian Gulf region will be regarded as an assault on the vital interests of the United States of America, and such an assault will be repelled by any means necessary, including military force.”
To lend muscle to what would soon be dubbed the “Carter Doctrine,” the president created a new U.S. military organization, the Rapid Deployment Joint Task Force (RDJTF), and obtained basing facilities for it in the Gulf region. Ronald Reagan, who succeeded Carter as president in 1981, made the RDJTF into a full-scale “geographic combatant command,” dubbed Central Command, or CENTCOM, which continues to be tasked with ensuring American access to the Gulf today (as well as overseeing the country’s never-ending wars in the Greater Middle East). Reagan was the first president to activate the Carter Doctrine in 1987 when he ordered Navy warships to escort Kuwaiti tankers, “reflagged” with the stars and stripes, as they traveled through the Strait of Hormuz. From time to time, such vessels had been coming under fire from Iranian gunboats, part of an ongoing “Tanker War,” itself part of the Iran-Iraq War of those years. The Iranian attacks on those tankers were meant to punish Sunni Arab countries for backing Iraqi autocrat Saddam Hussein in that conflict.  The American response, dubbed Operation Earnest Will, offered an early model of what Pompeo is seeking to establish today with his Sentinel program.
Operation Earnest Will was followed two years later by a massive implementation of the Carter Doctrine in Bush’s 1990 decision to push Iraqi forces out of Kuwait. Although he spoke of the need to protect U.S. access to Persian Gulf oil fields, it was evident that ensuring a safe flow of oil imports wasn’t the only motive for such military involvement. Equally important then (and far more so now): the geopolitical advantage controlling the world’s major oil spigot gave Washington.
When ordering U.S. forces into combat in the Gulf, American presidents have always insisted that they were acting in the interests of the entire West. In advocating for the “reflagging” mission of 1987, for instance, Secretary of Defense Caspar Weinberger argued (as he would later recall in his memoir “Fighting for Peace”), “The main thing was for us to protect the right of innocent, nonbelligerent and extremely important commerce to move freely in international open waters — and, by our offering protection, to avoid conceding the mission to the Soviets.” Though rarely so openly acknowledged, the same principle has undergirded Washington’s strategy in the region ever since: the United States alone must be the ultimate guarantor of unimpeded oil commerce in the Persian Gulf.
Look closely and you can find this principle lurking in every fundamental statement of U.S. policy related to that region and among the Washington elite more generally. My own personal favorite, when it comes to pithiness, is a sentence in a report on the geopolitics of energy issued in 2000 by the Center for Strategic and International Studies, a Washington-based think tank well-populated with former government officials (several of whom contributed to the report): “As the world’s only superpower, [the United States] must accept its special responsibilities for preserving access to [the] worldwide energy supply.” You can’t get much more explicit than that.
Of course, along with this “special responsibility” comes a geopolitical advantage: by providing this service, the United States cements its status as the world’s sole superpower and places every other oil-importing nation — and the world at large — in a condition of dependence on its continued performance of this vital function.
Originally, the key dependents in this strategic equation were Europe and Japan, which, in return for assured access to Middle Eastern oil, were expected to subordinate themselves to Washington. Remember, for example, how they helped pay for Bush the elder’s Iraq War (dubbed Operation Desert Storm). Today, however, many of those countries, deeply concerned with the effects of climate change, are seeking to lessen oil’s role in their national fuel mixes. As a result, in 2019, the countries potentially most at the mercy of Washington when it comes to access to Gulf oil are economically fast-expanding China and India, whose oil needs are only likely to grow. That, in turn, will further enhance the geopolitical advantage Washington enjoys as long as it remains the principal guardian of the flow of oil from the Persian Gulf. How it may seek to exploit this advantage remains to be seen, but there is no doubt that all parties involved, including the Chinese, are well aware of this asymmetric equation, which could give the phrase “trade war” a far deeper and more ominous meaning.

The Iranian Challenge and the Specter of War

From Washington’s perspective, the principal challenger to America’s privileged status in the Gulf is Iran. By reason of geography, that country possesses a potentially commanding position along the northern Gulf and the Strait of Hormuz, as the Reagan administration learned in 1987-1988 when Iran threatened American oil dominance there. About this reality President Reagan couldn’t have been clearer. “Mark this point well: the use of the sea lanes of the Persian Gulf will not be dictated by the Iranians,” he declared in 1987 — and Washington’s approach to the situation has never changed.
Guided-missile destroyer USS Porter transits Strait of Hormuz, May 2012. (U.S. Navy/Alex R. Forster)
In more recent times, in response to U.S. and Israeli threats to bomb their nuclear facilities or, as the Trump administration has done, impose economic sanctions on their country, the Iranians have threatened on numerous occasions to block the Strait of Hormuz to oil traffic, squeeze global energy supplies, and precipitate an international crisis. In 2011, for example, Iranian Vice President Mohammad Reza Rahimi warned that, should the West impose sanctions on Iranian oil, “not even one drop of oil can flow through the Strait of Hormuz.” In response, U.S. officials have vowed ever since to let no such thing happen, just as Secretary of Defense Leon Panetta did in response to Rahimi at that time. “We have made very clear,” he said, “that the United States will not tolerate blocking of the Strait of Hormuz.” That, he added, was a “red line for us.”
It remains so today. Hence, the present ongoing crisis in the Gulf, with fierce U.S. sanctions on Iranian oil sales and threatening Iranian gestures toward the regional oil flow in response. “We will make the enemy understand that either everyone can use the Strait of Hormuz or no one,” said Mohammad Ali Jafari, commander of Iran’s elite Revolutionary Guards, in July 2018. And attacks on two oil tankers in the Gulf of Oman near the entrance to the Strait of Hormuz on June 13th could conceivably have been an expression of just that policy, if, as claimed by the U.S., they were indeed carried out by members of the Revolutionary Guards. Any future attacks are only likely to spur U.S. military action against Iran in accordance with the Carter Doctrine. As Pentagon spokesperson Bill Urban put it in response to Jafari’s statement, “We stand ready to ensure the freedom of navigation and the free flow of commerce wherever international law allows.”
As things stand today, any Iranian move in the Strait of Hormuz that can be portrayed as a threat to the “free flow of commerce” (that is, the oil trade) represents the most likely trigger for direct U.S. military action. Yes, Tehran’s pursuit of nuclear weapons and its support for radical Shiite movements throughout the Middle East will be cited as evidence of its leadership’s malevolence, but its true threat will be to American dominance of the oil lanes, a danger Washington will treat as the offense of all offenses to be overcome at any cost.
If the United States goes to war with Iran, you are unlikely to hear the word “oil” uttered by top Trump administration officials, but make no mistake: that three-letter word lies at the root of the present crisis, not to speak of the world’s long-term fate.