by Dean Prigelmeier, President of Proactive Technologies, Inc.
A disturbing trend, particularly over the last three decades, concerns the accuracy and quality of the economic statistics reported to the public. You have probably noticed lately that monthly statistics such as Gross Domestic Product, U.S. International Transactions, Unemployment and Job Creation are issued with encouraging numbers one month, only to be quietly revised downward a few months later. Businesses, consumers and policy makers can only implement effective strategies and correct potentially dangerous courses if they are working with accurate data. One of those measures, concerning worker relevance, development and effectiveness, is “productivity.”
Think tanks have sprung up in Washington issuing reports and policy statements, and some put a cloak of perceived “credibility” around statements they release meant to support a policy direction or change its course – both to the benefit of a segment of subsidizing interests. Confusing us even more is the media’s propensity to report, as “news,” press releases emanating from these think tanks as if they were accurate, unbiased and inherently factual. Some may be, but when they are all passed through the same careless filter, it throws them all under suspicion. The decline in the number of accurate, readily available sources of news and facts can derail a life or business strategy.
Take, for example, the daily explanations by news and business show anchors of why the stock market gyrates up or down, as if the collective market can always be explained simply as, “the stock market reacted to the Federal Reserve’s decision to not act,” or “the stock market tumbled because of the results of the presidential election” – only to recover fully the next day. Could another simple explanation be that the market moved one way or another because groups with large holdings decided to move them?
“Unfortunately, however, figures on productivity in the United States do not help improve productivity in the United States.”
W. Edwards Deming
Another example is the preoccupation with what is referred to as “inflation,” which is based on the Consumer Price Index (“CPI”). A “basket of consumer goods” was selected, and periodic measurements of its retail prices are taken to see, primarily, whether any inflationary forces pushed prices upward or downward during the period in a way that might require an adjustment in central bank monetary policy. First, it is important to know which goods make up the basket.
Many years ago an effort was made to take out the goods most prone to price pressures. This explains the stares at price labels by the shopper who heard on the morning news that inflation has not risen, yet is looking at afternoon prices that seem to rise continually. The decision was made that some goods didn’t need to be in the basket because consumers could substitute them with other, less-expensive goods and still be happy with the experience – for example, substitute mac and cheese for chicken. The trouble being that even those prices rise.
According to Wikipedia, “Core inflation represents the long run trend in the price level. In measuring long run inflation, transitory price changes should be excluded. One way of accomplishing this is by excluding items frequently subject to volatile prices, like food and energy.” Other volatile prices, such as the cost of healthcare, travel, housing and education, are not measured in inflation either – categories causing the biggest dent in household budgets. Retailers have found other ways to further confuse those making the calculation by offering seemingly super discounts…discounts if you buy 10 of the same item, coupons, or sale prices when the beef doesn’t sell at the $14 per pound price. Others have kept their base price low but tacked on endless fees, which are not included in calculating inflation.
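To make the distinction concrete, here is a minimal sketch – using hypothetical categories, weights and prices, not actual BLS data or methodology – of how excluding the “volatile” items changes the inflation number that gets reported.

```python
# Hypothetical basket -- category: (weight, price_last_year, price_this_year, volatile?)
basket = {
    "food":     (0.15, 100.0, 108.0, True),
    "energy":   (0.10, 100.0, 115.0, True),
    "housing":  (0.35, 100.0, 106.0, False),
    "goods":    (0.25, 100.0, 102.0, False),
    "services": (0.15, 100.0, 104.0, False),
}

def price_index(items):
    """Weighted ratio of current to prior prices, re-normalizing the weights."""
    total_weight = sum(w for w, _, _, _ in items)
    return sum(w * (now / before) for w, before, now, _ in items) / total_weight

headline = price_index(list(basket.values()))
core = price_index([v for v in basket.values() if not v[3]])  # drop the volatile items

print(f"Headline inflation: {(headline - 1) * 100:.1f}%")  # ~5.9% with these numbers
print(f"Core inflation:     {(core - 1) * 100:.1f}%")      # ~4.3% with these numbers
```

The shopper staring at the price label is living in the first number; the morning news is reporting the second.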
“There are so many people who can figure costs, and so few who can measure values.”
Larry Fast
Employers have made use of the dichotomy between inflation and core inflation as well. They have learned they can justify no cost-of-living wage adjustment by pointing out that inflation – the partial reflection of price increases – has not risen, so a wage increase isn’t warranted. Ironically, US Supreme Court Chief Justice Roberts asked Congress in January 2009 for a raise, saying in his annual year-end report on the federal judiciary, “I must renew the judiciary’s modest petition: Simply provide cost-of-living increases that have been unfairly denied.” This at a time when millions of Americans were losing their jobs and homes during the Great Recession of 2008; apparently inflation – core and regular – only impacted high-level government officials.
Inflation is typically reported in monthly and yearly increments, in the hope that no one will notice a small increase the way they would a 10- or 20-year trend. Comparing a 20-year rise in CPI against a 20-year rise (or lack thereof) in personal income might dramatically change the discussion.
Another example: in the 1970s, economists became irritated by the effects of declining housing starts on the Gross National Product (i.e., the market value of all the products and services produced in one year by labor and property supplied by the citizens of a country), so they removed it from the calculation. When the housing starts number improved, they added it back in. But when it quickly declined again, Reagan administration economists sought a different measure for evaluating a country’s economic performance. In 1981 they settled on Gross Domestic Product (i.e., the monetary value of final goods and services – those bought by the final user – produced in a country in a given period, say a quarter or a year), a measure which is used to this day.
“If you efficiently increase worker capacity and performance, and even if you raise the wages to reflect the increasing value of the worker, your productivity should increase as your costs stay stable or decline. But far too often a false conclusion is drawn that you can cut costs by cutting the cost of labor and there will not be repercussions in the long-term.”
GNP is an economic statistic that is equal to “GDP plus any income earned by residents from overseas investments minus income earned within the domestic economy by overseas residents.” There are three general ways of calculating GDP; two of them, the income and expenditure approaches, focus on the movement of money. In countries that have a high level of income inequality, GDP may reflect more how one income class is doing while ignoring the other(s). While GNP allocates production based on location of ownership, some critics estimate that today 70 – 80% of GDP measures the turnover of financial transactions (Wall Street trading) and says less about the aggregate economy for all. So when we get excited about economic growth as reported by GDP, we may be cheering or fearing for others rather than ourselves.
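As a back-of-the-envelope illustration of that quoted definition – with hypothetical figures, not actual national accounts data – the relationship can be written out directly:

```python
# GNP from GDP, per the definition quoted above. Figures are hypothetical
# (in billions) and used purely for illustration.
gdp = 25_000.0                 # final goods and services produced domestically
income_from_abroad = 1_200.0   # income earned by residents on overseas investments
income_to_overseas = 900.0     # income earned domestically by overseas residents

gnp = gdp + income_from_abroad - income_to_overseas
print(f"GNP = {gdp:,.0f} + {income_from_abroad:,.0f} - {income_to_overseas:,.0f} = {gnp:,.0f}")
```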
Another suspicious metric is the “Dow Jones Industrial Average” and similar index measures. The Dow Jones is a basket of 30 blue-chip, publicly traded company stocks that are collectively measured for their average increase or decrease in price. The trouble with this measure is that the formula is adjusted by removing chronically poor performers and replacing them with better-performing stocks. Anyone, including economists, knows that if you change a formula to find a more acceptable result, you really aren’t measuring anything but your own creativity.
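For the curious, the mechanics look roughly like this: the Dow is a price-weighted average – the sum of its component prices divided by a divisor – and the divisor is recalculated whenever a component is swapped so the published level doesn’t jump at the moment of substitution. A made-up three-stock index is enough to illustrate the point:

```python
# Hypothetical three-stock "index" showing how a price-weighted average keeps
# its level when a chronic underperformer is swapped out for a stronger stock.
prices = {"LaggardCo": 22.0, "SteadyCorp": 180.0, "HighFlyer": 310.0}
divisor = 3.0  # starts as the simple count of components

index_before = sum(prices.values()) / divisor
print(f"Index before swap: {index_before:.2f}")

# Remove the poor performer and add a better-performing stock...
del prices["LaggardCo"]
prices["NewDarling"] = 95.0

# ...then adjust the divisor so the index level is unchanged at the swap itself.
divisor = sum(prices.values()) / index_before
index_after = sum(prices.values()) / divisor
print(f"Index after swap:  {index_after:.2f} (same level, different formula)")
```

From that point forward, the average tracks the survivors and the newcomers, not the stocks that were quietly shown the door.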
Other measures, such as the “level of employment” (without consideration of wage levels, benefits, hours worked or underemployment), the “level of unemployment,” and the “level of job creation,” may not seem accurate or relevant depending on your economic status. Meanwhile, other things that should be measured never seem to be – the effect of the Crash of 2008 on suicide rates, drug use and abuse, divided families, and declining high school graduation rates, all a cost to society and taxpayers – while the public is given encouraging economic performance numbers that, critics say, confusingly reflect how the wealthy are doing and not main street citizens. Keeping the populace confused or uninformed helps preserve the status quo a little longer.
But for this article, I have chosen to focus on the measure of “productivity.” I mentioned the other measures, which also deserve critical scrutiny, only to demonstrate how inaccurate measures of performance do not lead to measures – or the right measures – to improve what is supposed to be measured. In fact, efforts to improve based on misleading data can lead to results ranging from disappointing to disastrous.
“Productivity” sounds like a simple measure. However, employees and labor leaders are very suspicious when the term is used these days, because productivity is so often associated with justifying a pending decision to lay off employees and/or close a plant. Listening to a Wall Street business show, you can always hear the exciting line, “Good news for Wall Street: company X has announced it will lay off 2,000 workers and move its operation to China.” Good news for whom – the person who might still hold that company’s stock in their portfolio but just lost their job with it?
W. Edwards Deming had something to say about the measure of productivity in his book “Out of the Crisis” first published in 1982. In a chapter entitled, “Quality, Productivity and Lower Costs” Deming stated, “Measures of productivity do not lead to improvement of productivity. There is in the United States, any day, a conference on productivity, usually more than one. There is in fact a permanent conference on productivity, and there is now the President’s Committee of Productivity. The aim of these conferences is to construct measures of productivity…Unfortunately, however, figures on productivity in the United States do not help improve productivity in the United States. Measures of productivity are like statistics on accidents: they tell you all about the number of accidents in the home, on the road, and at the work place, but they do not tell you how to reduce the frequency of accidents.”
He continues, “On the other hand, an orderly study of productivity, to enquire whether any given activity is consistent with the aim of the organization, and what it is costing, can be helpful to management.” Deming, quoting from Marvin E. Mundel’s book “Measuring and Enhancing the Productivity of Service and Government Organizations,” adds, “Outputs…cannot be considered without considering the goals they are designed to achieve…”
According to Wikipedia, “William Edwards Deming (October 14, 1900 – December 20, 1993) was an American engineer, statistician, professor, author, lecturer, and management consultant. Educated initially as an electrical engineer and later specializing in mathematical physics, he helped develop the sampling techniques still used by the U.S. Department of the Census and the Bureau of Labor Statistics. In his book [published in 1994] The New Economics for Industry, Government, and Education,[1] Deming championed the work of Walter Shewhart, including statistical process control, operational definitions, and what Deming called the “Shewhart Cycle,”[2] which had evolved into PDSA (Plan-Do-Study-Act). This was in response to the growing popularity of PDCA, which Deming viewed as tampering with the meaning of Shewhart’s original work.[3]”
Deming’s fame grew out of his work in Japan after WWII, beginning in August 1950, helping the Japanese government rebuild industries devastated by the war. Again, from Wikipedia, “Many in Japan credit Deming as one of the inspirations for what has become known as the Japanese post-war economic miracle of 1950 to 1960, when Japan rose from the ashes of war and started on the road to becoming the second-largest economy in the world through processes partially influenced by the ideas Deming taught:[4]”
For most of his career, Deming was ignored by business and government leaders in the United States. Although in later years Deming became associated more with “quality” – quality assurance and control principles – his conclusions were simple, practical and effective. Take, for example, Deming’s quality philosophy as summarized by some of his Japanese followers:
“(a) When people and organizations focus primarily on quality, defined by the following ratio,
Quality = Results of work efforts / Total costs
quality tends to increase and costs fall over time.
(b) However, when people and organizations focus primarily on costs, costs tend to rise and quality declines over time.”
The same relationship applies to the measure of productivity.
Productivity = Sum of outputs / Cost (or number) of inputs
If the market will accept it, increasing the sum of outputs will be reflected in an increase in “productivity,” as long as the costs of doing so are the same or incrementally lower. If the market slows and will not accept more output, and production levels are lowered, the productivity number will decline. If thinking long-term, a rational organization would wait for the market to stabilize or come back, and/or try to find new markets or develop new products. But if thinking short-term, an irrational organization might seek to cut costs and, far too often, zero in on labor costs – without considering the cost of developing the organizational capacity built thus far, the value of the intellectual property residing in the minds and techniques of its workers, or the cost to product and service quality going forward and what that does to the organization’s brand.
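A rough sketch of the ratio, with hypothetical numbers rather than any real company’s figures, shows why the short-term math can look deceptively good: a labor cut can nudge the ratio up on paper, while the costs to capacity, know-how, quality and brand never appear in it at all.

```python
# Hypothetical numbers only -- a sketch of the ratio above, not any company's data.

def productivity(output_units, labor_cost):
    """Sum of outputs divided by the cost of inputs (here, labor only)."""
    return output_units / labor_cost

baseline = productivity(output_units=10_000, labor_cost=500_000)

# Develop the workforce: output rises 15%, wages rise 5% to reflect the added value.
develop = productivity(output_units=11_500, labor_cost=525_000)

# Cut labor 20%: payroll falls, but output falls too as experience walks out the door.
cut = productivity(output_units=8_500, labor_cost=400_000)

for name, p in [("baseline", baseline), ("develop capacity", develop), ("cut labor", cut)]:
    print(f"{name:>16}: {p * 1000:.1f} units per $1,000 of labor")

# The cut scenario can still look acceptable on paper -- the damage to quality,
# brand and accumulated know-how simply never shows up in the ratio.
```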
Unfortunately, this appears to be the modus operandi especially of publicly traded companies and private equity-held firms worldwide. Wall Street, it appears, demands that profits and earnings per share increase every quarter no matter what – even when companies are struggling through the results of market disruptions they often cause. Originally, this was the rationale for many businesses closing US operations – with their high labor and support costs – and moving them to lower-wage countries, taking on higher supply chain and logistics costs and, it seems to me, costs to quality and brand. In the case of the United States, the reduction in labor costs led to a decline in the consumption levels of their largest market as the best-paying jobs, reflecting the value of decades of expertise and experience, were eliminated. Although seemingly self-destructive in the long term, this still seems to be the underlying principle today. The approach counters every other business philosophy and system in an organization, but these are likely not the concerns of short-term-minded business owners.
When it comes to “worker productivity,” whatever measure is devised points more at the failure to develop the asset(s). If you efficiently increase worker capacity and performance, and even if you raise wages to reflect the increasing value of the worker, your productivity should increase as your costs stay stable or decline. But far too often a false conclusion is drawn from other measures, implying that you can cut costs by cutting the cost of labor without repercussions in the long term.
More modern measures of productivity, such as OEE (“Overall Equipment Effectiveness”), may be just as shallow. An OEE score of 100% means you are manufacturing only good parts, as fast as possible, with no stop time. In the language of OEE, that means 100% quality (only good parts), 100% performance (as fast as possible), and 100% availability (no stop time). While a high bar, even versions of this measure seem narrow and confrontational, rather than remedial, according to Larry Fast.
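For reference, OEE is commonly calculated as Availability × Performance × Quality. A minimal sketch with hypothetical shift numbers shows how the three factors combine – and how much a single score hides about where the losses actually occur:

```python
# Minimal OEE sketch using the common Availability x Performance x Quality
# definition, with hypothetical shift numbers.
planned_time_min = 480        # scheduled production time for the shift
downtime_min = 60             # unplanned stops
ideal_cycle_time_min = 1.0    # ideal minutes per part
total_parts = 380
good_parts = 361

run_time_min = planned_time_min - downtime_min
availability = run_time_min / planned_time_min
performance = (ideal_cycle_time_min * total_parts) / run_time_min
quality = good_parts / total_parts

oee = availability * performance * quality
print(f"Availability {availability:.1%}, Performance {performance:.1%}, "
      f"Quality {quality:.1%} -> OEE {oee:.1%}")
```

The single percentage says nothing about whether the shortfall came from breakdowns, slow cycles or scrap – which is part of why Fast finds it confrontational rather than remedial.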
In a recent article in IndustryWeek entitled “Performance Metrics That Matter: Effective Productivity Measures,” Larry Fast offers three productivity measures he feels are more effective. Still, he warns, “The biggest mistake I see in manufacturing organizations is the tracking of manufacturing production costs as a percentage of sales dollars. It doesn’t matter whether the scorecard is on net sales or gross sales. It’s just the wrong way to think about it.”
“The biggest mistake I see in manufacturing organizations is the tracking of manufacturing production costs as a percentage of sales dollars. It doesn’t matter whether the scorecard is on net sales or gross sales. It’s just the wrong way to think about it.”
Larry Fast
Fast offers three compelling reasons why: “First, the amount of sales dollars in a specific period has no relevance to the shop floor’s cost performance…The second reason is that several non-manufacturing causes affect the calculation…The third reason for a 35-year old manufacturing guy like me is just short of criminal. Too often companies still measure cost by a long outdated formula of ‘standard cost + margin = price.’ So the factory team busts their tails delivering year-after-year cost reductions, which should flow directly to the bottom line (isn’t that the goal?). The effect: Cost reductions get passed straight through to the customer. Pretty discouraging for factory folks and a very negative outcome for the business. I have seen this occur time after time.” These types of cost cuts seem more suited to extreme deflationary periods, but according to the press and business think tanks there isn’t a better time to be in business, so that couldn’t be the case…could it? Maybe it is more about keeping costs low to keep prices low and profits high, while squeezing everything in between to maintain it – even if it dampens the long-term viability and vitality of the organization.
As a final point, Fast offers a quote from an unknown author that seems appropriate to this discussion: “There are so many people who can figure costs, and so few who can measure values.”
Whenever economic terms are used, it is important to have the user clarify what they believe they are measuring, lest the wrong conclusion be drawn and the wrong measures implemented. Be cynical – not to be a contrarian, but to be able to offer better solutions to challenges. It is sometimes difficult to debate people who use faux statistics to defend their position or policy, but those who are unknowingly relying on the wrong formulae may be grateful for your inquisitiveness and, perhaps, opposing view.
Visit the Proactive Technologies website for more information on an approach to maximizing the human component, while lowering direct labor and opportunity costs of a productivity measure. Schedule to attend one of the live online presentations (or schedule your own) to learn more and/or contact us for more information.