Previous blogs in this series asked why real wages stopped growing in the 1970s and whether long-term trends in labor demand and supply can help us answer this question. In this blog I turn to ‘extra-economic’ (non-market) factors, which are even harder to quantify than economic ones.
Non-market forces potentially affecting real wages include a whole host of mechanisms. On one hand we have political factors due to state regulation of (or, as some would put it, meddling in) the economy. Then there are power disparities among various players. If the political and legislative climate favors labor unions, workers gain power by being able to bargain collectively with employers. Otherwise, the employers have the upper hand.
Furthermore, humans are not purely rational agents – far from it (and thank God)! Our ideas about what is right and what is wrong can and do affect our buying and selling decisions. Labor is one of those commodities where cultural attitudes – social norms and values – are especially important.
Finally, all these political and cultural forces are intertwined in intricate ways. Ideas about what level of compensation is ‘fair’ may change together with the willingness of the governing elites to either support or suppress labor unions. Cultural forces (I am using ‘culture’ here as a catch-all category) also spill over into the realm of supply and demand. For example, public opinion may either support immigration or oppose it, which has obvious consequences for labor supply trends. Attitudes about whether it is proper for women to work will influence the likelihood of their entry into the labor force.
How can we deal with such complexity, many factors interacting with each other? My inclination is to start with a model that is as simple as possible (but, as Einstein famously said, no simpler than that). I’ll look for one ‘proxy,’ a variable that best captures changing social and cultural moods, and see how well it works. If it doesn’t work well, we’ll try other things.
Let’s make this discussion more concrete. In the context of American labor history, ‘cultural’ forces (in the broad sense) sometimes worked to encourage wage increases, and sometimes to hold them down. For example, at the beginning of the Great Depression there was a broad consensus among the political and business elites that worker wages should not be lowered. In December 1929 President Hoover addressed four hundred key members of the business community, urging them not to cut wages. Leading executives responded in 1929–30 by pledging to maintain wages at the expense of profits. As a result, real wages actually grew quite vigorously between 1929 and 1941, helped along by a deflation of prices.
During World War II, on the other hand, millions of Americans were put into uniform and sent to fight overseas. The supply of labor dropped (despite many women entering the labor force for the first time). At the same time the war demanded a huge increase in output. During this period worker wages grew, but much less than they would have if driven purely by the economic forces of demand and supply. The reason was that the government (through the National War Labor Board, created by President Roosevelt in 1942) actively intervened to suppress labor disputes and restrain wage growth.
Overall, the period from the New Deal through the Great Society was characterized by government policies that favored labor unions and enforced laws against various business practices designed to suppress unionization. As a result, the proportion of unionized workers increased. Before the passage of the National Labor Relations Act (NLRA) in 1935, only 7–8 percent of workers were unionized. By 1945 this proportion increased to over 25 percent and fluctuated at that level until the late 1960s.
In the 1970s union coverage started to decline. Currently it is at the level of 12 percent. The decline of union membership in the private sector was even more pronounced: from 35 percent in the 1950s to 7.6 percent in 2008.
Various explanations have been proposed for this decline, but recent research, summarized and extended by Schmitt and Zipperer in a 2009 article, indicates that the most important factor was efforts by firms to derail unionization campaigns. One of the methods used to defeat union drives is firing pro-union workers, which is illegal under the NLRA. The frequency of union election campaigns in which employers used illegal firings as a disruptive and intimidating tactic grew during the 1970s and reached a peak in the early 1980s, when roughly one in three unionization campaigns was marred by illegal firings (the bars in the graphic above).
There is no consensus among economists on whether the decline in unionization has contributed to wage stagnation. Labor unions definitely increase the wages of unionized workers – by an estimated 10–15 percent, on average. But most economists believe that unions redistribute income from nonunion to union workers, and that the effect on overall real wages is negligible (this is what, for example, the influential Economics textbook by Paul Samuelson and William Nordhaus says). Whether or not this assessment is correct, the undeniable fact is that the social mood among the American elites with respect to labor unions (and, more generally, to labor-management cooperation) underwent a sea change during the 1970s.
I won’t describe this shift, or its possible causes, here – instead referring the reader to my Aeon article. To cut a long story short, by the late 1970s a new generation of political and business leaders had come to power. Although the election of President Ronald Reagan in 1980 and the beginning of ‘Reaganomics’ was the most visible symbolic manifestation of this shift, the actual cultural change took place several years earlier. While the presidency of Richard Nixon continued the Great Society policies of the Lyndon Johnson era, the policy trends under Jimmy Carter were much more similar to those of the subsequent Reagan era.
Perhaps the best testament to the changing cultural landscape is the famous resignation letter to the Labor-Management Group by the United Auto Workers president Douglas Fraser:
The leaders of industry, commerce and finance in the United States have broken and discarded the fragile, unwritten compact previously existing during a past period of growth and progress.
So how do we capture this cultural shift numerically (which we need for statistical analysis)? What kind of a proxy could we use?
Let’s take a look at another graphic. It shows the trend in the real minimum wage – well known to anybody who has followed the debate about wage and income stagnation:
The green curve follows the fluctuations of real minimum wage: ups, when its nominal value is increased, and downs, when its real value is eroded by inflation. The brown curve depicts the smoothed overall trend, which is a general increase until the late 1960s, followed by a decline during the 1970s and 1980s.
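To make concrete how such a series is constructed, here is a minimal sketch of deflating a nominal minimum wage by a price index and extracting a smoothed trend with a moving average. The wage and CPI numbers below are illustrative placeholders, not actual historical data:

```python
# Sketch: converting a nominal minimum-wage series into real terms
# and extracting a smoothed trend. All numbers are illustrative.

nominal_wage = [1.00, 1.00, 1.25, 1.25, 1.60, 1.60, 2.10, 2.30]  # $/hour
cpi = [29.6, 30.2, 31.5, 32.4, 34.8, 38.8, 49.3, 56.9]           # price index
base_cpi = cpi[-1]  # express everything in final-year dollars

# Real wage = nominal wage scaled by the ratio of base-year to current prices
real_wage = [w * base_cpi / p for w, p in zip(nominal_wage, cpi)]

def moving_average(xs, window=3):
    """Simple centered smoothing to approximate the long-run trend."""
    half = window // 2
    out = []
    for i in range(len(xs)):
        lo, hi = max(0, i - half), min(len(xs), i + half + 1)
        out.append(sum(xs[lo:hi]) / (hi - lo))
    return out

trend = moving_average(real_wage)
```

The ‘ups and downs’ in the green curve come from the fact that the nominal series is a step function (changed only by legislation) while the deflator rises continuously; the smoothing step recovers the brown long-run trend.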
I submit that the smoothed trend of the real minimum wage is an excellent proxy for the hard-to-quantify variable, cultural shift, which we seek.

What should a minimum wage be? Should it cover just basic subsistence, the price of daily bread, or more? Cultural attitudes towards this issue vary from place to place (e.g., India versus Norway) and over time (what was acceptable in nineteenth-century America may not be acceptable today). Furthermore, the actual value of this proxy is set by a political process, the outcome of two countervailing forces: one that pushes the minimum wage up faster than inflation, which prevailed during the period from the New Deal to the Great Society, and an opposing force, which allows the minimum wage to be eroded by inflation, and which gained the upper hand in the late 1970s.
The main point is that the current value of the real minimum wage is set as a result of a complex of political, economic, and cultural influences. It appears to be a good candidate proxy for the variable we need to quantify. An additional plus is that it is already expressed in the same units as the quantity that we aim to model and understand (inflation-adjusted dollars per unit of work time).
I should emphasize that I am not talking about the direct effect of the minimum wage on overall wages; I treat it as a proxy for the complex of non-market forces. There are reasons to believe that the direct effect is slight, because it affects a small proportion of the American population (made even smaller because many states set their minimum wages above the federal level).
We now have all the quantitative ingredients – GDP per capita, the labor demand/supply ratio, and a proxy for non-market forces. The next step is to put it all together in a statistical analysis of the effects of these three factors on the trends in real wages.
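The kind of analysis the next step calls for can be sketched as an ordinary least-squares regression of real wages on the three factors. Everything below – the variable names, the ‘true’ coefficients, and the data themselves – is synthetic, chosen only to illustrate the mechanics, not the actual series used in this blog series:

```python
# Sketch: regressing real wages on three explanatory factors.
# All data here are synthetic, for illustration only.
import numpy as np

rng = np.random.default_rng(0)
n = 50
gdp = np.linspace(10, 60, n)                      # GDP per capita (synthetic)
demand_supply = 1 + 0.3 * np.sin(np.linspace(0, 3, n))  # labor demand/supply ratio
proxy = np.concatenate([np.linspace(3, 9, 30),
                        np.linspace(9, 6, 20)])   # minimum-wage proxy: rise, then decline

# Generate a synthetic wage series from an assumed 'true' relationship
wage = 0.2 * gdp + 4.0 * demand_supply + 0.5 * proxy + rng.normal(0, 0.1, n)

# Fit by ordinary least squares (intercept plus three slopes)
X = np.column_stack([np.ones(n), gdp, demand_supply, proxy])
coef, *_ = np.linalg.lstsq(X, wage, rcond=None)
fitted = X @ coef
```

The real exercise would of course use the observed historical series and worry about issues like collinearity between GDP and the proxy, but the structure of the model is the same.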
To be continued
“Real wages” taking into account inflation certainly makes sense… but… it also seems worthwhile for any broader analysis to take into account “productivity.” e.g., I’ve seen books arguing for “living wages” in which the authors have argued that — adjusted for inflation and productivity (e.g., from technological improvements) — the decline in the minimum wage is actually much more severe. In any case, thanks again for provoking thought and discussion on a very worthwhile topic.
Productivity enters the model in two ways. A direct effect is that when productivity increases, the demand for labor falls, because businesses do not need to hire as many workers to produce a given amount of GDP.
On the other hand, as productivity grows, GDP also tends to increase. This is a complex, indirect, and long-term effect, and it is not completely understood how this connection operates – at least, I have been unable to find any convincing theories explaining this process in a mechanistic way (if someone knows about such theories, I’d like to see references). I include this indirect effect in the model simply by following the trajectory of the observed GDP.
The direct and indirect effects work in opposite directions, and on different time scales. The standard theory is that increased productivity can depress wages in the short run, but as surplus workers retrain and move into different industries, in the long run everyone wins.
This is a beautiful theory, but the data indicate that it does not always work. There is a very well-known graphic showing how productivity and wages increased together in lock-step until the 1970s. But after that, productivity continued its inexorable upward trend, while wages, as we know, stagnated.
“Whether this assessment is correct, or not, the undeniable fact is that the social mood among the American elites with respect to labor unions (and more generally, to labor-management cooperation) has undergone a sea change during the 1970s.”
This was just part of a broader change. It was in the late 70s, and really in the 80s, that shareholder value came to dominate CEO thinking about business. Before the 1980s shareholder value was a peripheral concern of most businesses (and the managers who operated them). They believed that their main goal was the “health” of the company – its long-term success as an institution. The intellectual foundations for shareholder activism were set in the 1970s, it began its swing in the 1980s, and by the early 1990s almost all CEOs’ pay and compensation packages were tied to the company’s stock. CEOs in the 50s and 60s often talked about the businesses they ran as if they were social institutions with many stakeholders (including their employees – and, as your analysis notes, the unions that represented them), and their actions often matched their rhetoric.
The real competition was not between capital and labor, but between shareholders and managers.
In the 1980s the shareholders won. Corporations are no longer run as institutions, but as commodities. Long-term growth, stability, and institutional survival have no purpose in the new order – the shareholder, after all, really doesn’t care if a business fails or succeeds, as long as he sells at the right time. CEOs now base most of their decisions on how things will affect share price, and company after company has been driven into the ground because cuts made in the name of short-term stock jumps are not good for the long-term health of the company.
(Karen Ho’s Wall Street: An Ethnography devotes several chapters to this. I very much recommend it. I will be reviewing the book over at my blog sometime in the next few weeks. Until then, the following article is the best introduction I have found to the topic:
“The Dumbest Idea in the World: Maximizing Shareholder Value”
Steve Denning. Forbes 28 November 2012.)
I suppose that finding a way to quantify company behavior (especially as related to stock prices) might provide another way to “capture this cultural shift numerically.” (Perhaps merger/acquisitions would be a good place to start?)
T. Greer, thank you for this information. I did not know this shift was so recent.
As I think about it, the minimum wage is a good, if not ideal, proxy for unquantifiable ‘cultural forces’. Indeed, the notion of ‘minimum’ is very different from country to country.
The current minimum wage in the US is clearly not a living wage, yet there is not much talk about the need to raise it. So it is acceptable to the society at large. Is it because people still think of it as good enough for a teenager’s summer job?
After all, a living wage used to be a part of the American Dream.
The real minimum wage is a suitable proxy, but it is an outcome of the cultural change you wish to follow. I suggest the presidential oscillator as a measure closer to the source of the cultural change:
The oscillator is simply the fraction of the preceding 30 years that the presidency has been held by the Democratic party and its precursors. It has the advantage of being predictable. Given an electoral outcome you can project the future value of the oscillator. I show three futures depending on the outcome of the next two presidential elections, the blue future is a two-term Democratic president being elected in 2016, the red future is a two-term Republican and the purple is a one-term Republican followed by a Democrat for one term.
The basic idea behind the 30-year period is that it represents a typical length of time that elite political pundits have been following politics. When the oscillator is high, elite opinion tends to have been formed during a period of Democratic dominance, and the cultural “zeitgeist” will be more friendly towards Democratic constituencies like labor. When the oscillator is low, elite opinion will have been formed during a period of Republican dominance, and the zeitgeist will be more friendly towards Republican constituencies like business and finance.
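The oscillator the commenter describes is straightforward to compute. Here is a minimal sketch, with a toy party-by-year series (not the actual historical sequence of presidencies) just to show the mechanics:

```python
# Sketch of the 'presidential oscillator': for each year, the fraction
# of the preceding 30 years in which the presidency was held by the
# Democratic party. The party-by-year data below are a toy example.

def oscillator(party_by_year, years, window=30):
    """party_by_year: dict mapping year -> 'D' or 'R'.
    Returns dict mapping each requested year -> Democratic fraction
    of the preceding `window` years (among years with known data)."""
    out = {}
    for y in years:
        span = [party_by_year.get(y - k) for k in range(1, window + 1)]
        dem = sum(1 for p in span if p == 'D')
        known = sum(1 for p in span if p is not None)
        out[y] = dem / known if known else None
    return out

# Toy example: Democrats hold the presidency 1961-1968, Republicans 1969-1990
parties = {y: ('D' if y <= 1968 else 'R') for y in range(1961, 1991)}
osc = oscillator(parties, [1991])
# For 1991 the preceding 30 years are 1961-1990: 8 Democratic years out of 30
```

Because the input is just a deterministic record of election outcomes, the oscillator can indeed be projected forward under any assumed electoral scenario, as the commenter notes.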
I’ve always considered the minimum wage a cultural proxy (more so than a strict economic policy), but the minimum wage is also a wage. It seems like introducing it as a factor would obviously cause the model to mimic overall wages more accurately. Am I missing something?
(btw, a recent piece you might find interesting: http://www.eoionline.org/blog/x-marks-the-spot-where-inequality-took-root-dig-here/ )