by Dean Prigelmeier, President of Proactive Technologies, Inc.
In Part 1 of this article, I reminisced about the better times for workers several decades ago, drawn from my own experience as a young man entering the labor force through manufacturing. If you asked others who were around then, or did a little research, you would find it was not a fantasy, but the life of a normal American middle-class worker.
Manufacturing was seen as prestigious work, especially among the lower and middle classes. Someone fortunate enough to have a job in manufacturing could expect hard work but a comfortable life.
Before many employers, especially those listed on Wall Street exchanges, began to follow neoliberal economics (not to be confused with “liberalism,” which is “a political and moral philosophy based on liberty, consent of the governed and equality before the law”), companies like Hewlett-Packard, IBM and General Electric, as well as their suppliers, were the “gold standard” of employers. Although the steel industry started moving off-shore in the 1970s, the movement of jobs and decimation of communities seemed to be an isolated occurrence.
Then a series of consequential events set the United States on a path of economic and societal decline for the vast majority of its citizens.
The 1980s and the War on Labor Unions
The PATCO Air Traffic Controller Strike of 1981 ushered in an unrelenting pattern of legal rulings and legislation that eroded the strength of the labor movement, removing a potent alternative to unprincipled employers. Corporations were emboldened by an infamous 1970 New York Times Magazine article in which the Chicago school economist Milton Friedman argued that “businesses’ sole purpose is to generate profit for shareholders.” Moreover, he maintained, companies that adopted “responsible” attitudes would face more binding constraints than companies that did not, rendering them less competitive. That was interpreted very literally by business think tanks, lobbying groups and politicians to mean “increased shareholder value” above all else.
Supported by conservative groups and donors such as the American Legislative Exchange Council (ALEC), the Heritage Foundation, the Koch brothers and the U.S. Chamber of Commerce, a number of states became “right-to-work” states, passing laws prohibiting labor organizing or making it very difficult. Today, 27 states are right-to-work states, encouraging employers to move their operations there to take advantage of the “pro-business” environment.
According to USAFacts, union membership declined to 14.3 million, or 10.8%, of employed U.S. wage and salary workers in 2020 from 20.1% in 1983, when 17.7 million wage and salary workers belonged to unions. The unionization rate in the public sector remained relatively constant, at around 35%. Currently there are movements afoot to unionize more of both public and private sector employment, but there are many decades of anti-labor laws in place to overcome.
United States Trade Imbalance, Growing Wealth Inequality and Worker Insecurity
From the 1980s on, the U.S. Chamber of Commerce, a private lobbying group for business (primarily corporate at the national level), played an increasing role in negotiating trade agreements with our trading partners. As its agenda concentrated around the growing class of multinational corporations, which hold no particular allegiance to any one country including the U.S., trade agreements increasingly traded away good-paying, sustaining U.S. job classifications for lower-wage, lower-protection jobs in developing countries.
Neoliberal trade agreements were concluded that opened the floodgates to the off-shoring of American jobs. The so-called North American Free Trade Agreement (NAFTA) between the United States, Canada and Mexico legalized the migration of American and Canadian manufacturing to Mexico, which had already begun many years prior, and encouraged more. Mexico was a source of cheaper labor and lax environmental and employment laws. The agreement also included stiff penalties for host countries that interfered with the operations of member-country enterprises on their land, even if those operations polluted the environment or exploited workers.
China’s Most Favored Nation status, which the United States had renewed annually through the 1990s, was made permanent in 2001. This significantly liberalized trade between the U.S. and China and loosened enforcement of trade, labor, human rights and environmental laws with China, causing an acceleration in the off-shoring of American manufacturing.
Similar trade agreements were formalized with other countries, such as Brazil and nations across Asia. In each case, as workers’ wages moderately increased and they attempted to form labor unions, employers sought out other countries willing to offer even cheaper labor and even weaker regulatory enforcement, leaving a wake of disrupted communities and broken workers.
Major transnational companies like Walmart forced their suppliers to cut costs each year or move operations to one of these countries in the name of “free trade,” in many cases hollowing out U.S. communities as the main employers in rural areas relocated their operations, in most cases with the financial support of U.S. government policies and programs. The U.S. balance of trade has declined continuously for the last 40 years, meaning far more of the wealth created by the remaining activity in the United States went to purchasing foreign-produced goods and services than foreign countries spent on U.S. goods and services. As a result, “wealth inequality in the United States grew from 1989 to 2019, and wealth became increasingly concentrated in the top 1% and top 10% due in large part to corporate stock ownership concentration in those segments of the population, since the bottom 50% own little if any corporate stock. The gap between the wealth of the top 10% and that of the middle class is over 1,000%; that increases another 1,000% for the top 1%. Although different from income inequality, the two are related. In 2017, an Oxfam study found that only eight people, six of them Americans, own as much combined wealth as half the human race.” So one could say the orchestrated concentration of wealth continues to be successful, even as tragic, life-altering economic events stemming from this imbalance, such as the Crash of 2008 and the effects of the recent Covid-19 pandemic, take their toll on most.
According to the Brookings Institution, in 2018, “Manufacturing constitutes 27 percent of China’s overall national output, which accounts for 20 percent of the world’s manufacturing output. In the United States, it represents 12 percent of the nation’s output and 18 percent of the world’s capacity.” “China is the top nation in terms of manufacturing output and the percentage of its national output that is generated by that sector. Poland meanwhile has the highest percentage of its workforce employed in manufacturing.” In essence, America created its largest competitor in exchange for China’s agreement to generously invest its profits in U.S. Treasury bills, about $1 trillion worth in 2020.
In 2019, the U.S. ranked as the world’s number 2 exporter, its number 1 importer and 127th out of 127 countries in trade balance. With their growing wealth, the Chinese now have better bridges, better roads and better infrastructure in their cities and suburbs, and their presence spreads across the world. In a few decades, the United States created its biggest and most challenging competitor on many levels while giving up, some say, its status as a beacon of capitalism and democracy.
Expanding trade and helping the rest of the world eliminate poverty are noble goals. Doing so at the expense of vast numbers of Americans proved risky to a previously balanced trade arrangement and to a society of mostly sustained and hopeful consumers.
Neoliberal economists like Larry Summers sold political leaders on the notion that offshoring manufacturing jobs to lower-wage countries was strategically the right thing to do, that “we’re going to have so many new service jobs that we don’t need those manufacturing jobs anymore.” He infuriated union leaders by saying, “a job is a job.” It was a phony notion of competitiveness that was supposed to make everything cheaper for American citizens. The result was that products and services were created at less and less cost, and the company and its shareholders retained the growing profit margin. The multinational corporations benefited immensely, since they could sell the same product at the same high price if not higher…with many finding ways to avoid U.S. taxes and to shelter money off-shore that could have been invested in the U.S.
Employees’ wages have not increased much since 1973; in fact, for much of that period wages, in real dollars, declined. For CEOs, on the other hand, according to The Balance Careers, “In 1965, CEOs earned on average 20-times the average employee. By 1978, CEOs earned just less than 30 times the average worker. In 1989, the divergence grew to 59 times and by 1995 it was almost 72 times. By 2014, the EPI suggested that the ratio was 313 times the average worker compensation.” In 2021, CEO compensation continued to rise even through the Covid pandemic, in some cases to thousands of times the median employee wage.
For much of this time, workers were told wages and benefits would need to be cut in order to compete globally, while CEOs were negotiating huge increases for themselves under the premise that companies had to attract the “best and the brightest” (code for management teams that would follow the ongoing strategy of enhancing shareholder value). Having no choice, unions continued to negotiate away wage increases and benefits in an attempt to keep their jobs in the United States. For some, the company still moved, with employers offering existing employees the ultimate indignity: an opportunity to train their replacements. And when that wasn’t enough, corporations used U.S. visa programs to bring in technically skilled foreign workers who would work for less, filling the very jobs for which U.S. adults were attending two- and four-year programs because the pundits told them that was where the future lay.
Mergers and acquisitions made sure that “competitiveness” increased under the storyline that, since we were now competing in a world economy, fewer, larger corporations made sense. Monopolies and oligopolies have continued their growth in every sector, from food production (farm to distribution) to the manufacturing of durable goods and household products. Most everything is now produced by fewer and fewer industry-dominant corporations. There has been very little enforcement of U.S. antitrust laws, and that continues to this day. Hedge funds with an endless supply of low-interest money buy up small and midsize companies to gut them for their assets and turn a quick profit.
Even service-sector jobs once thought immune, the jobs that were supposed to replace disappearing manufacturing jobs, such as information technology, engineering, financial services, paralegal and legal research, and medical services, found new homes in Mexico, China and later India, Pakistan and Vietnam.
When many corporations cut benefits for workers, it gives cover to, and in some ways forces, the others to follow suit. Paid sick leave was vulnerable; so was paid vacation. While overtime rules are still enforceable by the National Labor Relations Board and the Department of Labor, some employers found ways to circumvent the intent of the rules by redefining job titles and using more and more contract employees with no benefits.
Family farms were gobbled up by factory farms, from farming to food processing to wholesale food production and food retail sales (a form of “vertical monopoly”), upsetting the balance of rural communities even further. Generations of farm families suffered the same fate as manufacturing families: an erosion, then loss, of wealth; the upheaval of the family; the loss of hope; and the lack of a clear path forward. The carnage left behind is amassing in America for the next generations to ultimately sort out. Sadly, the social program safety net put in place as a result of the Crash of 1929, built by the contributions of workers, is being drawn on more and more at a time when fewer workers mean fewer contributions. Where this is heading is ominous.
As wages for workers stagnated but the cost of living continued to climb, credit and credit cards filled the gap, much like the “company store” of the turn of the century: workers’ pay was not enough to live on, so the company store extended credit that kept the worker impoverished and beholden to the company. According to CreditCards.com, “Americans’ outstanding revolving debt, most of which is credit card debt, reached $998.4 billion in July 2021, according to data from the Federal Reserve. That’s an increase from a low of $974.6 billion in the fourth quarter of 2020 after the amount of revolving debt owed by U.S. consumers fell throughout the year.”
Pensions and Retirement
The Kline-Miller Multiemployer Pension Reform Act, enacted in the United States on December 16, 2014, allows certain American pension plans that have insufficient funds, and thus are at risk of insolvency, to reduce the benefits they owe to participants. Prior to this, pensions were audited frequently to ensure they would remain solvent into the future and keep their promises to retirees.
The Bankruptcy Reform Act of 1978 and subsequent amendments, in short, made it harder for individuals to discharge debt in bankruptcy but easier for corporations to reorganize pension plans, to offer buyouts to employees (at a fraction of their future value) or to push plans onto the Pension Benefit Guaranty Corporation (PBGC), which is funded by participating employers’ contributions, and by taxpayers when those funds run out. Pensioners are given a percentage of what they would have realized each month.
From 1980 to 2009, the PBGC experienced a rush of bankruptcy-induced pension plans onto its balance sheet. Large corporations hurried to file bankruptcy, if for no other reason than to shed their pension plans: a promise between worker and employer that if employees contributed a good share of their lives to creating value for the company’s, the CEO’s and the shareholders’ benefit, they would be taken care of upon retirement. Some of these jobs left employees badly damaged or crippled over a lifetime of dedicated service, yet healthcare benefits in retirement became “negotiable.”
Today, the PBGC, the organization that collects employer contributions to cover the cost of bankrupt pension plans, is nearly broke on the multi-employer side, requiring a taxpayer bailout. Next to follow is the single-employer pension fund. While corporations were able to file bankruptcy so that their balance sheets could soar, the liability was ultimately shifted to taxpayers. In these cases, pensioners receive just a fraction of the pension their employer promised, and they are on their own to cover exorbitant individual healthcare.
In 2000, a decision was made that the interest on the federal debt was too daunting. The Clinton administration appeared to concede that if the Federal Reserve kept interest rates low, the federal government would loosen its oversight and enforcement of the financial sector. The result: more and more people moved money out of savings, as savings interest rates tanked to less than 1%, and into risky Wall Street investments as an alternative. Wall Street created more and more risky derivative instruments that gave the illusion of saving for retirement but could not guarantee it.
While employers were moving away from defined-benefit retirement plans and offering 401(k)s, employees were forced to “roll the dice” and put blind faith in fund managers’ competence and integrity. For almost 40 years, workers were not told that there was no law in place requiring financial advisors to put the individual investor’s interest above their own: a fiduciary responsibility. Several efforts in Congress to strengthen protections for individual investors were blocked by financial services sector lobbyists. Only recently has the U.S. Department of Labor attempted to enforce fiduciary rules, but the prospects are uncertain.
Wages for workers have not risen in the last 40 years. In order for employers to have justification for keeping wages low, the formula for calculating the cost of living had to change. Inflation was broken into two measures, headline inflation and “core” inflation; core inflation strips out the most volatile parts of the economy, food and energy, which are among the costs households feel most directly, while headline inflation covers everything. Confusing it more, several other metrics and definitions have been created to explain the rising cost of living, so those feeling it will think their perception may be wrong.
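To make the distinction concrete, here is a minimal sketch, using made-up budget shares and price changes rather than official CPI weights or data, of how stripping the volatile food and energy categories out of a price index can make a reported “core” rate read lower than the headline rate a household actually experiences:

```python
# Hypothetical illustration only: the shares and price changes below are invented
# for demonstration and are NOT official CPI figures.

categories = {
    # name: (budget share, year-over-year price change)
    "housing":    (0.35, 0.04),
    "food":       (0.15, 0.09),
    "energy":     (0.08, 0.25),
    "healthcare": (0.12, 0.05),
    "other":      (0.30, 0.02),
}

VOLATILE = {"food", "energy"}  # the categories a "core" measure strips out

def weighted_inflation(items):
    """Share-weighted average price change, renormalized to the included shares."""
    total_share = sum(share for share, _ in items.values())
    return sum(share * change for share, change in items.values()) / total_share

headline = weighted_inflation(categories)
core = weighted_inflation({k: v for k, v in categories.items() if k not in VOLATILE})

print(f"Headline inflation: {headline:.1%}")  # roughly 6% with these made-up numbers
print(f"Core inflation:     {core:.1%}")      # roughly 3.4%, since food and energy are excluded
```

In this illustration the headline rate lands near 6% while the core rate is closer to 3.4%, the kind of gap that can leave workers doubting what they see at the grocery store and the gas pump.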
In 1996, the Welfare Reform Act was signed into law. At a time when U.S. policies encouraged the outsourcing of good-paying jobs, first in manufacturing and then in services such as financial services, law, medicine and customer support, the government calculated that it was time to remove the welfare safety net. States were required to prod current welfare recipients into the workforce. When states reported that there weren’t enough jobs to absorb all of the individuals losing their welfare benefits, Social Security disability attorneys got busy signing those individuals up at an alarming rate…with the Social Security Administration accepting applications it most likely would not have accepted before. After a period of decline from 1975 to 1990, the number of Americans receiving Social Security disability insurance rose rapidly between 1995 and 2010, from 600,000 to over 1,000,000, ironically, since lawmakers in Washington had for decades periodically expressed the need to reform Social Security in order to save it.
Education
As I said in Part 1 of this article, I was the beneficiary of a very robust vocational training program in high school in the 1970s, a program dismantled in the 1980s when the educational system’s focus shifted toward getting children ready for college. This trend continued clumsily over the following decades, leading to a reactionary piece of legislation as test scores continued to decline and the U.S. ranking in educational attainment, compared to other developed countries, continued to embarrass. The No Child Left Behind Act of 2001 changed the emphasis of schools to teaching kids to pass the tests. Schools were leaned out by defunding the teaching of things like history, languages, civics and government, culture and the arts so that distractions from the tests could be eliminated.
The United States continued to decline in its rankings versus the rest of the world. In educational attainment, the U.S. ranks around 50th in the world. In literacy, the U.S. ranks 15th. “The Program for International Student Assessment tests 15-year-old students around the world and is administered by the Organization for Economic Cooperation and Development. In 2018, when the test was last administered, the U.S. placed 11th out of 79 countries in science. It did worse in math, ranking 30th.” Overall, in 2018 the “US now ranks 27th in the world for its levels of healthcare and education.”
After nearly four decades of decline, Science, Technology, Engineering and Math (STEM) is the focus of education today. It seems to be a step back toward the vocational programs of the 1970s, which were similar in structure to apprenticeships (in some cases even partnering with employers) but more focused on school-taught lab activities. Time will tell if it is the right prescription for the ailment.
College – Out of Reach For Most
Even if a child can reach the level of college entrance, the average cost of tuition, books and fees is out of reach for most. Tuition and fees at a ranked in-state public college are about 72% less than the average sticker price at a private college: $9,687 for the 2020-2021 year compared with $35,087, U.S. News data shows. The average cost for out-of-state students at public colleges comes to $21,184 for the same year.
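As a quick arithmetic check of the gap, using only the U.S. News sticker prices quoted above, the comparison works out as follows:

```python
# Sticker prices for the 2020-2021 year as cited above (U.S. News data).
public_in_state = 9_687
public_out_of_state = 21_184
private = 35_087

# How much cheaper each public option is relative to the private-college average.
in_state_discount = 1 - public_in_state / private
out_of_state_discount = 1 - public_out_of_state / private

print(f"In-state public vs. private:     {in_state_discount:.0%} less")      # about 72% less
print(f"Out-of-state public vs. private: {out_of_state_discount:.0%} less")  # about 40% less
```

Even the “cheap” option, in other words, still runs nearly $10,000 a year before room, board and books.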
National college debt reached $1.7 trillion in 2021, placing tremendous burdens on students and their families as they struggle to find a job that pays enough to live, consume and pay down the college debt.
Tax write-offs for employer-provided tuition reimbursement are still in place by law. “A way to provide employees with education benefits without having them be taxable to the employee is to provide working condition benefits. This policy is for businesses that don’t have an educational assistance plan or who provide education assistance over $5,250 in a year. Working condition benefits are given to employees to help them do their jobs, including job-related education benefits. Degree programs usually don’t qualify, because the employee should already have the degree necessary to do the job. To qualify as a working condition fringe benefit: 1) The education must be required by the employer or by law for the employee to keep his or her present salary, status, or job. The required education must serve a bona fide business purpose of the employer. 2) The education maintains or improves skills needed in the job.”
We all know that college tuition has skyrocketed in the last 20 years and that employers rarely offer tuition reimbursement. This leaves many potential workers putting off college, many workers who finish college putting off marriage and childbirth, and families in debt for decades. It also disrupts the consumption power of a large class of people that small businesses rely on to keep their enterprises participating in capitalism. Those who find a job in their chosen field of study often earn only a subsistence wage, if they were lucky enough to find a job at all. It is not uncommon to find college graduates working side-by-side with 18-year-olds in a fast food restaurant, watching their degrees and the skills they represent “become irrelevant.” Furthermore, many retirees aren’t able to retire, since they were not left much of their pension, if they even had one, and now work with, and end up competing against, college graduates and high school graduates for the same low-pay, no-benefit jobs.
Healthcare
In the 1970s, many employers provided healthcare benefits, and healthcare costs were kept low by non-profit community hospitals. Those hospitals were later bought up by private equity firms, but somehow were able to keep their non-profit status for tax purposes while yielding record profits.
When healthcare ballooned as a part of the budget, employers switched to Health Maintenance Organizations (HMOs), which grew out of the Health Maintenance Organization Act of 1973. Preferred Provider Organizations (PPOs) added several layers of cost between a patient and their doctor, with little in the way of rules and oversight.
The so-called “Affordable Care Act of 2010” started as a sincere effort to overhaul a U.S. healthcare system that had become unaffordable to most and inaccessible for many. Other than the elimination of “pre-existing condition” exclusions, which had shut many chronically sick people out of insurance entirely or made healthcare inaccessible through exorbitant premiums, political division left the details of the Act to be worked out by hospital and insurance lobbyists. Today, “out-of-network” surprise billing and non-transparent pricing are still a major source of personal bankruptcy. “Every year 530,000 American families file for bankruptcy due to medical bills. US medical bills and indebtedness are responsible for 66.5% of all American personal insolvencies. On average, Americans spend about $10,000 a year on healthcare costs. 66% of all medical debt in the US comes from one-time medical issues,” according to Balancing Everything’s Medical Bankruptcies Statistics-Update 2021.
In 2021, some 49% of workers received some form of healthcare benefit from their employers. It can take many forms, with varying amounts of copay and contributions required of the employee. The other half of the U.S., roughly 160 million Americans, has little or no access to healthcare other than state-run Medicaid programs.
Many people find it hard to believe the U.S. performs poorly on most measures of health outcomes compared to other high-income countries. But the truth is, study after study supports the same two conclusions: the U.S. spends more on health care but has worse health outcomes than comparable countries around the globe. Shorter lives, bad birth outcomes, more injuries and homicides, heart disease, obesity and diabetes, chronic lung disease, disability, adolescent pregnancy and sexually transmitted disease, HIV and AIDS (we have the second highest prevalence of HIV infection and the highest incidence of AIDS among comparable countries), and drug-related deaths all contribute to these declining outcomes. “In fact, the president’s 2014 National Drug Control Strategy noted overdose deaths now surpass homicides and car crash deaths.”
The U.S. spends more than any other country on health care, but ranks 11th in healthcare outcomes compared to the other high-income countries.
Workers today have high co-pays, if their employer even offers healthcare coverage for families. The cost of healthcare has skyrocketed, since that is another sector where government oversight is negligible. One serious illness can lead to economic ruin for a worker and their family. The fear of that has kept workers on edge on a daily basis.
Media Erosion and Erosion of Trust
The Fairness Doctrine of the United States Federal Communications Commission (FCC), introduced in 1949, “was a policy that required the holders of broadcast licenses both to present controversial issues of public importance and to do so in a manner that fairly reflected differing viewpoints.” In 1987, the FCC abolished the fairness doctrine, prompting some to urge its reintroduction through either Commission policy or congressional legislation. The FCC later removed the rule that implemented the policy from the Federal Register in August 2011.
Media deregulation was pushed by the investment banks. Robert W. McChesney argues that “the concentration of media ownership is caused by a shift to neoliberal deregulation policies, which is a market-driven approach. Deregulation effectively removes governmental barriers to allow for the commercial exploitation of media. Motivation for media firms to merge includes increased profit-margins, reduced risk and maintaining a competitive edge. In contrast to this, those who support deregulation have argued that cultural trade barriers and regulations harm consumers and domestic support in the form of subsidies hinders countries to develop their own strong media firms. The opening of borders is more beneficial to countries than maintaining protectionist regulations.”
“In the case of Sony BMG, there existed a “Big Five” (now “Big Four”) of major record companies, while The CW network’s creation was an attempt to consolidate ratings and stand up to the “Big Four” of American network (terrestrial) television (this despite the fact that the CW was, in fact, partially owned by one of the Big Four, CBS). In television, the vast majority of broadcast and basic cable networks, over a hundred in all, are controlled by eight corporations: Fox Corporation, The Walt Disney Company (which includes the ABC, ESPN, FX and Disney brands), National Amusements (which owns ViacomCBS), Comcast (which owns NBCUniversal), AT&T (which owns WarnerMedia), Discovery, Inc., E. W. Scripps Company, Cablevision (now known as Altice USA), or some combination thereof.”
There may also be some large-scale owners in an industry that are not the causes of monopoly or oligopoly. iHeartMedia (formerly Clear Channel Communications), especially since the Telecommunications Act of 1996, acquired many radio stations across the United States and came to own more than 1,200 stations. However, the radio broadcasting industry in the United States and elsewhere can be regarded as oligopolistic regardless of the existence of such a player. Because radio stations are local in reach, each licensing a specific part of spectrum from the FCC in a specific local area, any local market is served by a limited number of stations. In most countries, this system of licensing makes many markets local oligopolies. A similar market structure exists for television broadcasting, cable systems and the newspaper industry, all of which are characterized by the existence of large-scale owners. Concentration of ownership is often found in these industries.
Subsequent rulings by the FCC facilitated further and further concentration of media outlet ownership. This stifles fair and balanced reporting, as well as the newsworthiness of programs and content. When the large conglomerates all decided to switch to a model of “infotainment,” the quality of news declined as investigative reporters were replaced by opinion entertainers.
Wall Street hedge funds and private equity firms started buying up local media outlets in major cities, creating a black hole of information that keeps voters from knowing about important local, state and national legislation that is negatively impacting their lives and their families’ futures. Their rationale was that consumers can get news from so many sources on the Internet that paper newspapers are no longer needed, yet the wealthy still have their Wall Street Journal, New York Times, Washington Post, Economic Times, The Economist and other major publications, which only they are privileged to read due to the high cost of a subscription. Everyone else is confined to a media landscape that is difficult to navigate for truthful reporting on serious issues, and is manipulated by those who provide the rest.
According to a 2021 Gallup survey, “In all, 7% of U.S. adults say they have “a great deal” and 29% “a fair amount” of trust and confidence in newspapers, television and radio news reporting — which, combined, is four points above the 32% record low in 2016.”
To maintain this information inequality, political campaigns have become more focused on raising money for commercials and on staging media events and sound bites to get time on syndicated political talk shows. Congress’s inability to pass meaningful campaign finance reform, combined with the U.S. Supreme Court’s fateful 2010 decision in Citizens United v. Federal Election Commission, opened the door to unlimited, anonymous campaign contributions. When Congress eliminated “earmarks” as a way of gaining a congressman’s or senator’s support for a proposed bill or law, it made the Speaker of the House and the Senate Majority Leader, as well as the minority leaders of the House and Senate, the clearinghouses for campaign donations. These four individuals decide which bills will be heard and which nominations for office will be considered, all based on the interests of the monied elite. As the wealthy became wealthier, contributions to both parties increased. In 2012, “The final cost of this presidential-year election totaled more than $6 billion — including more than $300 million in dark money spent by politically active 501(c) groups that don’t disclose their donors.” In 2020, it rose to an estimated $14.4 billion.
The same is occurring at the state and local levels as well. Voters have been increasingly pushed out of the democratic process, replaced by money from interests that serve only the few. Some brave members of Congress have tried to change the campaign finance laws, such as Republican John McCain, Democrats Russ Feingold and Elizabeth Warren, and independent Bernie Sanders, but they are up against powerful lobbying organizations and other members of Congress who do not want to go back to shaking hands with voters, hearing their challenges and asking for their contributions. Some states have attempted to rein in campaign contributions, but are fought at every turn.
Home Ownership
Home ownership was dealt a serious blow by the Savings and Loan Scandal of the 1980s and 1990s. While, in typical fashion, the initial blame was put on the social programs that were meant to help people, people who instead became the victims of unscrupulous capitalists, others blamed deregulation of the S&L industry, combined with regulatory forbearance, and the fraud that worsened the crisis.
Many see the irrational housing price rises of the last five years as another looming crisis, similar to the S&L Crisis and the Crash of 2008 and driven by many of the same people and organizations. After all, little in the way of regulation has changed that wasn’t already undone by lobbyists, and only a few of the individuals who disrupted the world economy saw the inside of a prison. Many of the same people who created the S&L Crisis played a role in the subprime housing market crash of 2008 and amassed tremendous wealth. Some, like former Treasury Secretary Steven Mnuchin (dubbed the “Foreclosure King”), were able to acquire foreclosed properties and foreclosed government-backed reverse mortgages after the 2008 Crash for next to nothing and sell them back into the market. Building the same scenario during a pandemic, making living in America unaffordable with skyrocketing mortgages and rents, seems a travesty to so many, but little is done while cities welcome the increased property tax windfall.
The Federal Reserve continues to print money and lend it to wealthy hedge funds that buy up houses and apartment buildings, with cash, and compete with each other to drive up the value of their portfolios while driving those who live in the community away or into homelessness.
Economic Volatility
A deal was also struck between the Clinton Administration and the Federal Reserve to drastically lower the interest rate, and to find reasons to keep it low, so that interest payments on the national debt would be kept down. In exchange, the administration would sign legislation to deregulate the financial industry. The most notorious change was the repeal of the Glass-Steagall Act of 1933 by the Financial Services Modernization Act of 1999. Glass-Steagall was put in place after the Crash of 1929 and the Great Depression and was meant to keep the banking sector from repeating the worst man-made financial event in world history. Repeal of Glass-Steagall allowed banks, once again, to remove the firewall between depositors’ money and speculative money. Regulations and oversight were weakened, and within a decade, despite repeated warnings, the Crash of 2008 was born. Since the banks, and their speculation, were covered by the Federal Deposit Insurance Corporation (FDIC), firms like Goldman Sachs were able to privatize their profits and socialize their risks, leaving taxpayers to bail them out. Collectively, the cost of the Crash of 2008 to the U.S. was more than $22 trillion, a study by the Government Accountability Office (GAO) reported.
Even those without holdings in real estate were damaged. Credit markets in 2008 and 2009 froze as they had in 1929, and overleveraged companies could not access financing. Nearly 3 million workers lost their jobs, and many of them lost value in their homes or their homes entirely, their savings, any equity gains in their 401(k)s if they had them, their health, and even their families due to the unchecked recklessness. Small businesses closed as consumers dealt with the fallout.
The low interest rate on savings (even though consumers do not see similarly low interest rates for borrowing) has made it impossible for workers to save, which used to be a component of every worker’s retirement strategy. Their only option has been to place their money in riskier instruments. Each time an economic crisis hits, workers lose retirement investments, significant value in their riskier 401(k)s and quite possibly their job and their home.
And when Americans need help recovering from natural or manmade crises, The New York Times reported, “critics of post-crisis profiteering responded with disgust to new reporting that private financial institutions in the U.S. are taking advantage of the federal government’s failure to respond swiftly to disasters by providing loans—to be repaid, with interest, by taxpayers—to landlords affected by floods, wildfires, and other catastrophes that are growing in frequency and intensity due to the fossil fuel-driven climate emergency.” Disaster “victims often wait years for help to get back into their homes because money for repairs moves so slowly.” “Disaster capitalism hits new heights.” According to research from the Urban Institute, HUD’s CDBG-DR program starts distributing funds 20 months after a disaster, on average, and is typically still allocating money two years after that.
Society In Trouble
According to Pew Research, Americans across the board have very little trust and faith in government leaders, business leaders and journalists; all rank at the bottom. They also have little faith that these three can help solve the problems of the day. And they have declining trust in each other, probably due to varying perceptions of the problems and of viable solutions.
There are a few rankings where the U.S. excels but may not want to. Over 393 million guns, over 98% of the guns in the country, are in civilian hands, the equivalent of 120 firearms per 100 citizens. The average gun-owning American has 5 firearms, while nearly 22% of gun owners have only a single firearm. Around 1.5 million Americans are incarcerated, the highest number per capita among all the developed countries and many of the developing. There are about 48,000 suicides per year, nearly half by gun. In 2021, sadly, 6,261 of them were suicides of veterans.
According to the Centers for Disease Control and Prevention, “Nearly 841,000 people have died since 1999 from a drug overdose. In 2019, 70,630 drug overdose deaths occurred in the United States. The age-adjusted rate of overdose deaths increased by over 4% from 2018 (20.7 per 100,000) to 2019 (21.6 per 100,000). Opioids—mainly synthetic opioids (other than methadone)—are currently the main driver of drug overdose deaths. 72.9% of opioid-involved overdose deaths involve synthetic opioids.”
None of these indicators reflect a strong, stable and viable society, yet we tolerate the growing numbers. Perhaps we have become ambivalent or see it as “acceptable losses.”
According to the National Institute for Mental Health, “Mental illnesses are common in the United States. Nearly one in five U.S. adults live with a mental illness (51.5 million in 2019). Mental illnesses include many different conditions that vary in degree of severity, ranging from mild to moderate to severe.”
There seems to be a coordinated effort to add more and more division, fear and polarization to an already tenuous situation. Television and radio talk show hosts are paid handsomely to find and exploit wedge issues, even if invented, to stir up their side’s base and elevate their ratings. Higher ratings come from higher viewership or listenership, which means higher costs for advertisers and sponsors and more money for the network. The late Rush Limbaugh made $84.5 million annually. Sean Hannity makes $40 million annually, Anderson Cooper makes $12 million, Joe Scarborough makes $8 million, Rachel Maddow $7 million and Tucker Carlson makes $6 million. Why the spread? Hate, intolerance and conspiracy sell. So do faux outrage and opposition. That is why the number of radio and TV opinion shows continues to grow, mostly negative, because who wants to listen to “upbeat” messaging and happy news when you are struggling through life and looking for someone who seems to hear you? When people fear for their own and their family’s way of life, they want to hear outrage that sounds familiar, even if it is insincere, inaccurate or out-and-out disinformation. The more these outlets sell, the more they are paid and the more divided America becomes.
To most Americans, America feels like it is in a precarious state. Those who thought a civilian uprising was impossible need only look to the events of January 6 and the storming of the Capitol to know that anger is building in America, even though not everyone agrees on the causes or remedies. It is up to leaders to recognize this threat to stability, listen, seek ways to clearly address the concerns and provide real solutions in real time. They need to show that America is on a path that returns it to greatness; that we can all believe again.
It’s natural for those who have benefited from the economic inequality to want to preserve the comfortable lifestyle they’ve created for themselves. But common sense tells us that when so few become so wealthy at the expense of the vast majority, a change will have to come in one form or another. Hopefully it will come through peaceful means and true leaders, not through irrational populism that just wants change for change’s sake.
Some modest movements are taking shape to bring the discussion to the table. The Reshoring Initiative-Bringing Manufacturing Home is a group of manufacturers and leaders encouraging the reshoring of American manufacturing by highlighting the real cost of offshoring versus reshoring. With the current disruption of the global supply chain by the Covid-19 pandemic, maybe this message will reach more ears. The International Organization for Standardization released ISO 30414 on human capital reporting, which guides international businesses to, among other things, treat workers as “human capital” assets, account for them and report on them. Some billionaires are calling on their peers to “change their ways” and move to bring the wealth gap closer to a sustainable equilibrium “before the pitchforks come out.” These are all encouraging, but they will require the solid support of a large part of the U.S. population to have a chance. Altruistic movements usually start with a few people, but need to build momentum to be seen and heard.
My goal in trying to explain some of the many forces working against the worker in America was to show how much has been taken away and how much must be done to reverse the trend, and to allow people to step back and see the factors that contributed to a frightened and stressed working class of Americans. Change cannot begin until the problem is recognized. This happened over several decades and will take several decades to reverse if change starts now. The current generation might not experience the improvement, but we who created these conditions, directly or by remaining silent, owe it to the next generations to “put their interests first” and return our world to what it was…and better.
With the coming new year, what better time to resolve to embrace the saying, “a rising tide lifts all boats.”
Proactive Technologies is a proponent of reshoring manufacturing. Its structured on-the-job training system approach develops incumbent, new-hire and cross-training workers to full job mastery through the accelerated transfer of expertise™, helping employers dramatically increase the value of each worker by building and supporting a training infrastructure around what is already in place. Maximizing work quality, quantity, worker capacity and compliance while lowering the internal costs of developing workers allows employers to raise wages and benefits through efficiencies. In most cases, state workforce development grants are available to help the employer defray most, if not all, of the investment to set up and implement the program. PTI researches available state funds, drafts grant applications and provides grant support for clients. To see how it might work at your firm, your family of facilities or your region, contact a Proactive Technologies representative today to schedule a GoToMeeting videoconference briefing to your computer. This can be followed up with an onsite presentation for you and your colleagues. A 13-minute promo briefing is available at the Proactive Technologies website and provides an overview to get you started and to help you explain the approach to your staff.