Sunday, December 10, 2017

Trump’s Authority to Launch Nuclear Weapons

During the 2016 presidential campaign, the authority the president holds over the use of our nuclear weapons became a topic of considerable interest.  Concern was driven by the fact that Donald Trump could become our next president and gain control over those weapons.  The question was whether a psychologically unstable, perhaps demented, person could launch a nuclear weapon solely on his own initiative.  Several articles appeared in the media essentially claiming that yes, he could.  An article from vox.com titled If President Trump decided to use nukes, he could do it easily was an example.  The implication was that the president had the authority to do almost any damned thing he wanted to do.

Trump has now been president for almost a year, and concerns about his mental state have not subsided; rather, they have been augmented by a perceived lack of knowledge and understanding on his part of what a nuclear explosive is and what it is capable of doing.  The ongoing exchange of schoolboy taunts and threats between Trump and North Korea’s leader has served to elevate the level of concern even further.  Recent articles have raised anew the question of the extent of authority an incompetent president might have in initiating the use of nuclear weapons.  Garry Wills wrote Big Rocket Man for the New York Review of Books, and Adam Shatz produced The President and the Bomb for the London Review of Books.  Both authors expressed the concern that there was no mechanism in place to stop Trump from impulsively launching an unwarranted nuclear attack.

One of the difficulties in addressing this issue is that the actual mechanisms in place are not known to the people writing about them—including myself.  The people who do know are not about to publicly acknowledge the procedures they adhere to.  One must rely on incidents from the past and the experiences of people who have had incomplete knowledge of the nuclear authorization process.  Most of what is discussed is based on what is presumed known of presidential authority in the case of an ongoing nuclear attack on the United States by another power.  In that situation, where mere minutes are available to initiate a response, absolute launch authority vested in a single person, the president, is the only option.  The concern with Trump is that he will go rogue and launch an attack on his own initiative with no credible threat justifying such an act.  That is a completely different situation, one in which his presidential authority could come up against limitations.  In such a situation it would have been foolish to give unlimited authority to a single individual.  One has to assume that after seventy years of wrestling with these issues, a system that makes sense would have been developed.

Wikipedia is usually a credible source of information, but in this case it illustrates the risk in drawing conclusions from dubious sources.  Under the rubric of “National Command Authority” its knowledge is summarized in two brief paragraphs, one of which is merely a definition of the term.  The second is as follows.

“Only the President can direct the use of nuclear weapons by U.S. armed forces, including the Single Integrated Operational Command (SIOP). While the President does have unilateral authority as commander-in-chief to order that nuclear weapons be used for any reason at any time, the actual procedures and technical systems in place for authorizing the execution of a launch order requires a secondary confirmation under a two-man rule, as the President's order is subject to secondary confirmation by the Secretary of Defense. If the Secretary of Defense does not concur, then the President may in his sole discretion fire the Secretary. The Secretary of Defense has legal authority to approve the order, but cannot veto it.  The Secretary of Defense succession plan designates numerous individuals that may serve after a President removes his or her predecessor.  Traditionally, a civilian United States officer must countersign a Presidential order or resign.”

The notion that a two-man rule is in place at the presidential level while our country is presumed to be under a nuclear attack, with only a few minutes available to take action, seems dubious.  At that point there is no time to waste debating the plan to move forward.  As a practical matter, the determination that an actual attack is underway must already have been vetted through the Secretary of Defense or his/her representative.  Before the president even hears about the situation, the system has decided it is real, in which case the president and his advisors should already have gamed out what to do when this occurs.  The idea that the Secretary of Defense could change his mind about the appropriate course of action, forcing everyone to wait while a search is initiated for the next person on the succession list, seems ridiculous.

The paragraph from Wikipedia quoted above lists three references to support it.  One is the vox.com article referred to earlier, one is from Politico, and the third is from the New York Times.  None of these articles offer any hard sources for any of their conclusions.  In fact, only the vox.com article makes the claim about a two-man rule being in place; the other two disagree with this assertion.  So much for reliability in what you read.

The above situation assumes that at least one nuclear weapon is headed toward the United States.  When that is not the case, there is time for a more deliberate process, and one would hope that at least a two-man rule is active and that there are mechanisms whereby others can provide counsel.  It would make no sense to assign the president unquestionable authority in this case.

Some hope of enforced sanity comes from a few remarkable comments by General John Hyten, head of the U.S. Strategic Command (STRATCOM), the entity responsible for operational control of our strategic nuclear weapons.  As reported by Kathryn Watson under the heading Top General says he would resist ‘illegal’ nuke order from Trump (11/18/2017):

“Air Force Gen. John Hyten, commander of the U.S. Strategic Command (STRATCOM), told an audience at the Halifax International Security Forum in Halifax, Nova Scotia, on Saturday that he has given a lot of thought to what he would say if Mr. Trump ordered a strike he considered unlawful.” 

"’I provide advice to the president, he will tell me what to do,’ Hyten added. ‘And if it's illegal, guess what's going to happen? I'm going to say, 'Mr. President, that's illegal.' And guess what he's going to do? He's going to say, “What would be legal?” And we'll come up options, with a mix of capabilities to respond to whatever the situation is, and that's the way it works. It's not that complicated’."

The notion that the president can “put his finger on the nuclear button” seems misleading.  The “nuclear football” that is always near the president is apparently a device that allows the president to authenticate his identity and set up a communication link with those who have operational control.  General Hyten’s comments suggest that the president can’t press the button himself; somebody else has that responsibility.  And that someone else, Hyten for example, seems to have procedures in place that must be followed before acquiescing to whatever the president might wish to do.  It is interesting that the general used the word “illegal” with respect to potential actions Trump might wish to take.  That implies there is a code written down somewhere governing the use of nuclear weapons.

“Hyten said he has been trained every year for decades in the law of armed conflict, which takes into account specific factors to determine legality -- necessity, distinction, proportionality, unnecessary suffering and more. Running through scenarios of how to react in the event of an illegal order is standard practice, he said.”

"’If you execute an unlawful order, you will go to jail. You could go to jail for the rest of your life,’ Hyten said.”

Whether Hyten is in a position to say “hell no” to a president is not clear, but the fact that he refers to the penalty for acquiescing to an “illegal” usage suggests he might have the justification to do just that.

It would seem that the nation’s leaders have put in place a system that has anticipated the existence of a rogue president.

It seems one fool is not sufficient to start a nuclear war, so perhaps we should relax a bit.  It seems it would take a number of fools to stumble into a nuclear conflict… well… maybe we should be at least a little worried.


Thursday, December 7, 2017

How a Value-Added Tax (VAT) Works, and Why Governments—Except for the USA—Love It

We continue to mine T. R. Reid’s wonderful book surveying tax systems in use around the world: A Fine Mess: A Global Quest for a Simpler, Fairer, and More Efficient Tax System.  The subject here is the value-added tax, or VAT, with which most of us become familiar only when we travel overseas.  The United States is one of the few countries that has chosen not to implement one.  Reid quotes Professor Richard Bird of the University of Toronto to provide some perspective.

“Professor Bird, who has helped design tax regimes for numerous countries, told me that ‘a VAT, or its twin a GST [Goods and Services Tax], is an absolutely essential element of any tax structure today.  To set up a tax structure anywhere that didn’t include a VAT would be malpractice; it would be like creating a health-care system without hospitals.  That’s why every responsible Finance Ministry has used it.’”

The VAT was created mainly to solve the problem of tax dodging by French businesses.  Is this something the US should be concerned about?  You bet!  Kenneth Rogoff, in his book The Curse of Cash, provides some startling data.  Tax authorities will, of course, target suspicious tax returns for audit.  However, they can also audit returns selected at random in order to poll the economy and estimate the extent to which tax evasion affects revenue.

“The IRS has used these extensive audits, combined with an array of other information (e.g., investigations into high-income-earner tax shelters) to arrive at an overall estimate of unpaid taxes.  For 2006, the most recent year reported, the IRS found that the ‘tax gap’—the difference between taxes voluntarily paid and taxes due—was $450 billion.  This comprises tax evasion in many different sectors, including underreporting of business income, wage income, and rental income.”

“Of the $450 billion, the IRS expected to recover $65 billion, leaving a net tax gap of $385 billion.  Put differently, roughly 14% of estimated 2006 federal taxes, or 2.7% of 2006 GDP, will never be paid.”
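Rogoff’s percentages are easy to verify, and they also let one back out the totals he is implying.  A quick sketch in Python (the federal-tax and GDP totals below are derived from his percentages, not taken from the book):

```python
# Back out the totals implied by Rogoff's 2006 tax-gap figures.
gross_gap = 450e9              # taxes due but not voluntarily paid
recovered = 65e9               # amount the IRS expected to recover
net_gap = gross_gap - recovered            # $385 billion

# "roughly 14% of estimated 2006 federal taxes, or 2.7% of 2006 GDP"
implied_federal_taxes = net_gap / 0.14     # ~$2.75 trillion
implied_gdp = net_gap / 0.027              # ~$14.3 trillion

print(f"net tax gap:        ${net_gap / 1e9:.0f} billion")
print(f"implied 2006 taxes: ${implied_federal_taxes / 1e12:.2f} trillion")
print(f"implied 2006 GDP:   ${implied_gdp / 1e12:.1f} trillion")
```

Both implied totals come out in the right ballpark for 2006, so the quoted percentages hang together.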

One might suspect that this sort of evasion goes on mainly at income levels where cash is not the logical means of transaction, but that would be wrong.  It seems a staggering number of small businesses use cash transactions as a means of hiding income.

“By far the most important area of tax noncompliance comes from underreporting of business income by individuals who conduct a significant share of their transactions in cash.  The problem extends to individuals operating as partnerships or small corporations.  Overall, small business owners report less than half their income and account for 52% of the tax gap.”

And the problem is even worse when state and local taxes are included.  Rogoff notes that state and local taxes raise revenue equal to about two-thirds of federal revenue.

“Most states have income taxes (where noncompliance is presumably similar to that for the federal income tax), as well as sales taxes, where the scale for noncompliance in cash transactions is enormous.”

Rogoff was focused on cash as a vehicle for committing crime, but the real problem is that there is no mechanism for providing tax collectors with a record of what transactions actually took place.

It was the Frenchman Maurice Lauré who took up the task of inventing a system whereby records would be generated and business participants would willingly report all of their transactions.  The beauty of the system he developed was that it created an incentive for businesses to obey the tax laws.

Before explaining how a VAT works, we must compare it with a standard sales tax as applied across the US.  A sales tax is applied only at the end of the sequence of transactions involved in producing a product.  Consider a chair purchased for $200 by a consumer where the sales tax is 10%.  That person pays $220 for the chair and leaves, unconcerned about whether the $20 she paid in tax actually gets transferred to the government.  As Rogoff’s data shows, much of it doesn’t.

Lauré’s approach would break down that transaction and apply the 10% tax at each step in the production of that chair, but credit each participant for taxes already paid before moving forward. 

Let’s use the chair as an example of how value-added taxes work.  Company A buys wood from a supplier valued at $50.  It adds the tax and pays the supplier $55.  The supplier owes the government $5.  Company A fabricates the chair and sells it to company B, who will stain and finish it, for $100 plus $10 in tax.  Company A is responsible for tax of $10 minus a credit for the $5 it paid in tax on the wood, so it transfers $5 to the government.  Company B then sells the finished chair to company C, who will sell it to a consumer, for $150 plus $15 in tax.  Company B transfers $15 minus its $10 credit to the government.  When the product is sold to a consumer for $200 plus $20 in tax, Company C owes the government $20 minus its credit of $15.  The net result is that the tax collector receives $20 on a $200 purchase, just as in a sales-tax transaction.  The price of the chair to the consumer will have risen to $220 along the way to cover the cost of the tax, but the consumer never explicitly pays the tax.  The difference with a value-added tax is that each business participating in the chair’s production has an incentive to make sure all the other participants record their taxable transactions, so that each can claim its credit for taxes already paid.  If someone chooses to cheat, there will be a record that allows the tax collector to detect it.
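For readers who prefer the bookkeeping spelled out, here is a minimal sketch of that chain in Python, using exactly the prices from the chair example above:

```python
# Each stage of the chair's production: (seller, cumulative pre-tax price).
VAT_RATE = 0.10
chain = [
    ("supplier",  50),   # sells raw wood to company A
    ("company A", 100),  # fabricates the chair, sells it to B
    ("company B", 150),  # stains and finishes it, sells it to C
    ("company C", 200),  # retails the chair to the consumer
]

total_remitted = 0
credit = 0  # tax already paid upstream, claimable against this stage's tax
for seller, price in chain:
    tax_collected = price * VAT_RATE
    owed = tax_collected - credit      # remit only the tax on value added
    total_remitted += owed
    print(f"{seller}: collects ${tax_collected:.0f} in tax, remits ${owed:.0f}")
    credit = tax_collected

print(f"government receives ${total_remitted:.0f} in total")  # $20, as with a 10% sales tax
```

Each seller remits only the tax on the value it added, and each credit claim requires an invoice from the stage before—which is precisely what creates the self-enforcing paper trail.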

“….Lauré’s new tax on commercial transactions worked so well that it began to spread to other countries; Europe’s nascent Common Market (which became the European Union) made the value-added tax mandatory for all its member nations in 1967, on the theory that a unified market should have a unified tax structure.  Fairly quickly, the idea was taken up in South America and Africa, in the fast-growing ‘Tiger economies’ of East Asia, and then, somewhat later, in former British colonies like Australia, New Zealand and Canada.  By 2016, some 175 of the planet’s 200 countries had a value-added tax or a goods and services tax, which is another name for the same thing.”

The VAT is an important source of revenue for nations and it allows them to lower income and other taxes that taxpayers dislike and economists view as providing economic distortions (incentives for economic inefficiencies).

“This form of taxation brings in about 20% of all government revenue in the world; among the members of the OECD, the club of rich nations, VAT payments constitute 33% of all tax revenue.  For many countries, the VAT has become the most important single tax; in France, Lauré’s invention generates about 40% of revenues.”

The VAT is usually buried in the price of the item purchased where it is easily forgotten about rather than tacked on at the end where it is a constant reminder to the taxpayer.  Countries can also choose to provide VAT-free prices on goods that are exported, a benefit that is familiar to aggressive shoppers in other countries.  This also provides an advantage when competing with the United States in international markets.

The VAT has some disadvantages, which probably contribute to the lack of interest exhibited by the United States.  Since the tax is generally applied to all purchases, it is regressive in nature: lower-income individuals will have it applied to a higher portion of their income.  This can be dealt with in a number of ways, but the simplest approach is to offset the regressiveness with income-tax credits for those with low incomes.  Implementation also requires some initial investment, but that is soon more than paid for by the increased efficiency of collection.  For a federal system with state and local sales taxes, some effort is required to layer in a federal tax, but countries such as Canada, New Zealand, Australia, and Great Britain have already figured out ways to do it.

Liberals in the US have demonstrated a lack of interest in a VAT because it is intrinsically regressive.  Conservatives have opposed it because it is too efficient at collecting taxes (Reid chose to title his chapter on VATs “The Money Machine”).  Since the tax is not applied explicitly at the time of purchase, taxpayers tend to forget about it.  And since prices vary continuously from market factors and overall inflation, small changes in the tax rate could generate a lot of government revenue without even being noticed by the public.  Conservatives find such a tax exceedingly dangerous in the wrong hands.  The price conservatives pay in avoiding a VAT, however, is higher income taxes.

The most recent study of a US VAT came from a study group formed by George W. Bush in 2005 and tasked with making the tax code “fairer, simpler, and more efficient.”

“In an earlier version of its report, the advisory panel had proposed a revamped federal income tax, with a progressive rate structure of 15% for the lowest brackets and then additional brackets 25%, 28%, and 33% for people at higher incomes.  But if the United States adopted a 15% VAT on purchases, the income tax could be slashed to two brackets: 5% for lower income families and 15% for taxpayers above the median income.  With a top rate of 15%, few taxpayers would find it necessary or productive to invest in complicated tax avoidance schemes, so compliance would go up and IRS administrative costs would go down.”

Reid points out that by 2005 we had already embarked on a path of irreconcilable partisanship and the study group was, in the end, unable to come up with a consensus on whether to recommend a VAT.

Is there any chance for positive changes in the US tax code aimed at fairness and efficiency?  Not any time soon, but perhaps there is hope that a VAT might be reconsidered in the future.  It would only require that both conservatives and liberals pause, step back, and reconsider the advantages of a VAT.  Reid provides us with this take on the situation from Lawrence Summers.

“The former Treasury secretary Lawrence Summers offered a tongue-in-cheek prediction of when that would happen.  ‘Liberals think the VAT is regressive,’ Summers said, ‘and conservatives think it’s a money machine.  If they reverse their positions, the VAT may happen’.”



Sunday, December 3, 2017

Simplifying the Paying of Taxes—and Saving a Lot of Money

While people will always argue about who should be paying what level of taxes, all would agree that the very process of paying taxes is painful, needlessly complicated, time consuming, and becoming expensive as more people are forced to seek help in filing their returns.  T. R. Reid addresses this issue at length in his recent book: A Fine Mess: A Global Quest for a Simpler, Fairer, and More Efficient Tax System.  In it, he tells us that the tax code does not have to be as complicated as it has become, and that there are other filing mechanisms that are much more accommodating for the taxpayer.  Reid’s goal is to survey systems in place in other countries and determine the approaches that would benefit the United States.  We have a lot to learn from other countries.

Much of our problem arises from the complexity built into the tax code.

“The U.S. tax code has grown so huge that nobody really knows how long it is.  During the 2016 presidential campaign, candidates routinely cited a figure of seventy-three thousand pages—a number that seems to include thirty-five hundred pages of the law itself, plus another seventy thousand pages of regulations.”

We tend to curse the IRS when we struggle with our forms, but the real culprits are our representatives in Congress.

“Members of Congress love to harangue the IRS bureaucrats about lengthy tax forms and unfair rules and complex instructions—but of course the IRS isn’t responsible for the length, the fairness, or the complexity of our tax code.  It is Congress that writes the tax laws.  It’s Congress that adds hundreds of new exemptions, allowances, credits, and calculations to the tax code every year.  It was Congress that decided to give the IRS the responsibility for managing the health insurance subsidies flowing to millions of Americans under the Affordable Care Act (Obamacare)—and then cut the agency’s staff after assigning it this major new task.  It was Congress that assigned to the IRS the management of the earned income tax credit (EITC), which has become one of the nation’s largest support programs for low-income Americans.  It was Congress that crafted the much-hated alternative minimum tax, which spawned whole new levels of complexity, and hours of additional work, for millions of families.  And yet congressmen and senators can’t seem to resist pointing angry fingers at the IRS, as if someone else had created the legislative monster that is the U.S. tax code.”

Reid points out the perversity of a code which burdens those least able to afford tax-paying assistance with some of the most complex filing procedures.  He uses the EITC, which provides a payment to support low-income workers and families, as an example.

“The instruction book for low-income taxpayers hoping to get this benefit (Publication 596) is fifty-nine pages long.  The book lists fifteen separate conditions, spread over three chapters, that you have to meet to claim the credit….The whole thing is so complicated that the error rate is 27%, which means one out of four filers, and the IRS, have to spend even more time trying to get it right.  This has prompted a mini-industry of tax fraud, with shysters going door-to-door in low-rent neighborhoods offering to fill out the EITC forms (for a fee of course) whether the client actually qualifies or not.  Similarly the tax credits for people buying health insurance on the ObamaCare exchanges are generally aimed at low-income taxpayers and are also ridiculously complicated.”

The IRS has the appearance of being an efficient agency in that its cost of collecting taxes compares favorably with the best in the world.  However, it achieves that by putting the burden of tax filing entirely on the taxpayer.

“While the tax agency spends $11.4 billion, American taxpayers end up paying vastly more just to file their annual returns.  The Office of the Taxpayer Advocate says American families spend 3.16 billion hours each year getting their taxes done—gathering the data, keeping records, and filling out forms; businesses spend about 2.9 billion hours on the same tasks….At an average wage, those 6 billion hours devoted to filing tax returns represent about $400 billion per year of working time; six billion hours is the equivalent of 3.1 million people working forty hours per week, fifty weeks per year.  In terms of time and cost, just paying our taxes has become one of the biggest industries in the United States.”
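The Taxpayer Advocate’s equivalences are straightforward to check.  A quick sketch using only the numbers quoted above:

```python
# Check the arithmetic in the Office of the Taxpayer Advocate figures.
household_hours = 3.16e9
business_hours = 2.9e9
total_hours = household_hours + business_hours     # ~6 billion hours

# Full-time equivalents at 40 hours/week, 50 weeks/year.
fte_workers = total_hours / (40 * 50)              # ~3.0 million people

# Average value per hour implied by the $400 billion estimate.
implied_hourly = 400e9 / total_hours               # ~$66 per hour

print(f"{total_hours / 1e9:.2f} billion hours ≈ {fte_workers / 1e6:.1f} million full-time workers")
print(f"implied value of an hour of filing work: ${implied_hourly:.0f}")
```

The figures are mutually consistent: six billion hours, at a couple of thousand working hours per person per year, is indeed about three million full-time workers.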

The complexity of the process is such that very few people remain who attempt to perform the task on their own.

“Because the system is so complicated, hardly any Americans still fill out Form 1040 by themselves….Today, barely 10% of Americans do their own tax returns.  About 60% of all individual taxpayers hire tax-preparation agencies to do the work for them; another 30% buy tax-preparation software each year to get them through the process.  The IRS says an average family at medium income shells out about $260 per year for tax-preparation services; those with higher incomes can easily pay ten times as much.”

One of the curious features of filling out income tax forms is that you spend a lot of time giving the IRS information it already has.  Given that, and given that revenue-neutral simplifications and a small expansion of reporting requirements could be made, one should be able to arrive at a system in which the IRS fills out your tax forms for you and just sends you a summary to approve or dispute.  That is just what other countries have done, with more of them pursuing that goal.

Consider Japan’s system of tax collection.

“Japan’s equivalent of the IRS, Kokuzeicho, gathers all the pertinent data for each worker—income, taxable benefits, number of personal exemptions, tax withheld, and so on—and then computes how much the worker owes in tax, down to the last yen.  Because Japan uses a system known as ‘precision withholding,’ with the amount changing whenever pay goes up or down, most people withhold the exact amount due.  In early March, Kokuzeicho sends a postcard to every citizen that sets forth all this information: how much you earned, how much tax you owe, how much tax you’ve already paid through withholding.  If you’ve paid in more tax than you owe, Kokuzeicho deposits the refund amount in your bank account; if you did not withhold enough, the agency takes the tax that’s due from your bank account.  If the figures on the postcard look about right, the taxpayer does nothing.  The tax has been computed and paid already.  If the numbers look wrong, you go into the local tax office and try to straighten things out.  As a result, paying income tax is a totally automatic process for about 80% of Japanese households, requiring no more than reading a postcard once a year.”
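The reconciliation step Reid describes is simple enough to sketch.  The following is only an illustration of the process as described—made-up names and a placeholder tax schedule, not Kokuzeicho’s actual system:

```python
from dataclasses import dataclass

@dataclass
class Taxpayer:
    income: float     # reported to the agency by the employer
    withheld: float   # collected through precision withholding

def compute_tax(income: float) -> float:
    # Placeholder flat rate; Japan's real brackets are not reproduced here.
    return income * 0.10

def send_postcard(t: Taxpayer) -> str:
    """The agency computes the tax owed and settles the difference itself;
    the taxpayer only has to read the result."""
    due = compute_tax(t.income)
    balance = t.withheld - due
    if balance > 0:
        return f"tax due {due:,.0f}; refund of {balance:,.0f} deposited to your account"
    if balance < 0:
        return f"tax due {due:,.0f}; shortfall of {-balance:,.0f} debited from your account"
    return f"tax due {due:,.0f}; withholding covered it exactly"

# With precision withholding, most postcards report a balance at or near zero.
print(send_postcard(Taxpayer(income=5_000_000, withheld=500_000)))
```

The point of the design is that the taxpayer’s default action is no action at all; only a disputed postcard triggers a trip to the tax office.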

Britain has a system similar to Japan’s.  In fact, it maintains a bureau called the U.K. Office of Tax Simplification (OTS).

“….the office has made some four hundred formal proposals to simplify either the tax code or the tax return forms and 50% of them have been implemented, at least in part.”

“The U.K. has established a system rather like Japan’s, with Her Majesty’s Revenue and Customs filling in the tax return with data it has received from employers, banks, brokerage houses, charitable recipients and so on.  The Brits also use a ‘precision withholding’ system, called Pay as You Earn, or PAYE, which takes into account wages and benefits, Social Security and health-care deductions, student loan deductions, and various other adjustments to income.  With that, most British wage earners find their yearly total tax withholding just about equals the tax they owe; in 2014, according to the Office of Tax Simplification, only one in five Brits had to file a tax return.”

Reid lists a number of countries that have followed or are beginning to follow the path taken by Japan and the U.K.: the Netherlands, Denmark, Sweden, Spain, and Portugal.

Implementing such a system in the United States would be much easier if the tax code were simpler: there would simply be less data to collect.  It would be expensive to move in this direction, but the data indicates that a simpler system increases compliance with tax laws; the initiative would soon pay for itself in increased revenue.

Reid points out that such a system already exists as a trial project in California for state taxes.

“California has launched what it calls an ‘experimental’ program known as CalFile in which the revenue department will send you a state tax return that is already filled in; if the numbers look right, you sign it, and the work is done.  If they don’t, you send back the return with your changes.  The state has never spent much money to publicize this system, so it is poorly known and little used.  But of the ninety thousand Californians who did file through this prefilled form in 2012, 98% subsequently told pollsters that they loved it and would definitely use it again.”

Such a system could be implemented in the United States.  There is no reason, other than politics, why software like the commercial tax-filing programs couldn’t be provided by the IRS, saving the public billions of dollars in unnecessary expenses.  But many of our legislators are not in the business of representing their constituents; rather, they represent the corporations that feed them money for reelection campaigns.

“It turns out that the complexity of the U.S. tax system is a money maker for some large companies.  So they have lobbied strenuously, and successfully, against all efforts to simplify the tax code.  The ‘Tax Complexity Lobby,’ as Forbes magazine called it, includes tax-preparation firms like H&R Block and Jackson Hewitt, as well as companies that make tax-preparation software.”

“The biggest spender in the anti-simplification camp is Intuit, the maker of Turbotax, the top-selling tax software program.”

Yes, the United States continues to strive to be “exceptional,” whether for the good or the bad.



Thursday, November 30, 2017

T. R. Reid: When Tax Reform in the United States Worked: 1986

T. R. Reid has had a long career as a journalist and author.  One of his best products was a volume titled The Healing of America: A Global Quest for Better, Cheaper, and Fairer Health Care.  This book was published around the time that the program now known as Obamacare was being assembled and passed into law.  It provided an excellent comparison between healthcare systems in other countries and that operating in the United States.  Reid’s message was that we in the US can do a lot better if we would only be willing to learn from others.  He has recently produced another timely book focused this time on the topic of taxation: A Fine Mess: A Global Quest for a Simpler, Fairer, and More Efficient Tax System.  Reid again produces a short, clear, highly readable comparison between the US and other approaches to taxation in use around the world.  Again, the message is that we could learn a lot from others.

Reid states that while the federal income tax was first passed in 1913, it was not until 1922 that the law settled into a well-established form.  Over time, conditions change, lawmakers feel compelled to “tweak” the tax system, and it becomes more complicated and unwieldy.  By the time of Eisenhower’s administration it was clear that the legislation needed a major rewrite to render it more efficient.  That new version was generated in 1954.  As before, it was subjected to continual modification as legislators accommodated various special interests.  By the 1980s, a sprawling and confusing tax code had generated enough disgust to make another rewrite necessary.  This took place under Ronald Reagan in 1986.  Reid notes that major rewrites have occurred every 32 years, which puts the next one on schedule for 2018.  How much more timely could an author be?

The path to tax reform in 1986 was not simple, but after several years of partisan back-and-forth, a very simple approach began to be viewed positively by both sides, and strong bipartisan support for the new tax code was attained.  The lessons of that period should inform the current deliberations on tax modification, but the current avoidance of bipartisanship probably means we will not see the needed rewrite on Reid’s 32-year schedule.  Nevertheless, Reid’s perspective is informative.

Reid makes the dubious claim that politics in the Reagan era “was as fractious as it is today.”  At the time of the relevant taxation deliberations we had Republicans in control of the presidency and the Senate while Democrats controlled the House of Representatives.  Reid attributes credit for the needed political momentum to two people.

“These two were strange bedfellows indeed.  One of them, a conservative Republican, was a Wall Street tycoon; the other, a liberal Democrat, was a basketball star.”

The Republican was Donald Regan, who accepted the post of secretary of the Treasury under President Reagan.  He was not an ideologue and thought our messy system could be improved by studying what other countries were doing in the taxation arena.  He was particularly interested in New Zealand, a country already moving in a direction that seemed appropriate for the US as well.

Essentially all nonpartisan tax experts advise countries to adopt an approach tersely summarized as “broad base, low rates,” or even more simply, BBLR.  The notion is that high taxes distort economic decisions, leading to inefficient economic outcomes.  It is better that decisions be based on their intrinsic merits rather than on tax considerations.  The goal, then, is to raise the funds needed to run government at the lowest possible rates of taxation.  The way to do this is to broaden the tax base as much as possible.  Every special tax deduction or allowance narrows the base and requires higher rates.  New Zealand possessed a complex mess of a tax code and decided to apply the BBLR principle to fix it.

“It relied primarily for revenue on a personal income tax, and the tax was riddled with exemptions, credits, and giveaways for particular groups and companies.  In short, it was the opposite of BBLR.  All those preferences made for a fairly narrow tax base, which meant that tax rates had to be high to raise the required revenue.  By the early 1980s, the top marginal income tax rate was 66%.”

The tax specialists were tasked with cutting income tax rates in half.  This they accomplished by eliminating essentially all of the special provisions that had been inserted into the previous code and by establishing a national sales tax that has stayed in the 10-15% range.  This tax was labeled a goods and services tax (GST), but it is equivalent to the value-added taxes (VATs) in place in most nations of the world.

“With that addition, the income tax rates could be cut in half for every taxpayer in the nation—with no loss of revenue to the government.”
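The arithmetic behind BBLR is nothing more than the observation that revenue equals rate times base, so the rate needed to hit a revenue target falls as the base broadens.  A toy illustration with made-up numbers:

```python
def required_rate(revenue_target: float, tax_base: float) -> float:
    """Rate needed to raise the target revenue from a given base."""
    return revenue_target / tax_base

target = 100.0        # revenue the state must raise (arbitrary units)
narrow_base = 200.0   # income base shrunk by exemptions, credits, giveaways
broad_base = 400.0    # the same economy with the loopholes closed

print(f"narrow base requires a rate of {required_rate(target, narrow_base):.0%}")  # 50%
print(f"broad base requires a rate of {required_rate(target, broad_base):.0%}")    # 25%
```

Adding a consumption tax like the GST broadens the base further still, which is how New Zealand could halve its income tax rates without losing revenue.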

Reid estimates that a US couple with the median income pays on average about 35% of earnings in income taxes and healthcare insurance.  That includes federal, state, local, and payroll taxes (Social Security and Medicare).

“In New Zealand, in contrast, the median wage earner pays about 17.5% in income tax.  But that one payment also covers his old-age pension (there’s no separate tax for Social Security), plus free healthcare for life (there’s no separate tax for healthcare), plus free education through college graduation (there’s no separate tax for schools).  So the average New Zealander’s wages are taxed at less than half the rate of the average American’s.  And yet New Zealand provides more government services than the United States—with half the tax rate.  That’s the beauty of BBLR.”

One must remember that New Zealand’s consumption tax (value-added tax) is also much more efficient at collecting funds than the array of state and local sales taxes that are so easy to cheat on, and we waste much more money than anyone else on healthcare.  But yes, there are things to be learned from other countries.

The driver for tax reform from the liberal side was Bill Bradley, a basketball All-American at Princeton and a Rhodes Scholar who followed that up with a career as a star player for the NBA’s New York Knicks.  Bradley’s interest in federal taxes is said to have arisen when he learned that, as a player, he was considered a “depreciable asset” by his employers.  He entered the Senate representing New Jersey in 1978 and began focusing on the issue of taxation.  His goal was to apply the principles of BBLR in order to lower tax rates.

“‘The trade-off between loophole elimination and a lower top rate became obvious,’ Bradley wrote later; ‘the lower the rate, the more loopholes had to be closed to pay for it.’  Bradley stuck to the mantra of ‘broad base, low rates’ for years, telling anybody who would listen that a significant cut in tax rates would win the votes needed to broaden the base.  ‘The key to reform was to focus on the attractiveness of low rates, not on the pain of limiting deductions.’”

When Regan, his successor at Treasury James Baker, and Bradley brought President Reagan a credible proposal in 1984 to significantly cut tax rates, he bought into the idea.  Legislation would eventually pass in 1986 after much political haggling.

The tax legislation passed in 1986 is startling in the changes that were approved.  The fact that it could actually be passed by a bipartisan vote is astonishing given today’s environment.

“The 1986 law, generally recognized as ‘the most significant reform in the history of the income tax,’ reduced the top marginal rate for individual taxpayers from 50% to 28%—the biggest reduction of any tax bill before or since.  It did that by eliminating a broad range of ‘tax shelter’ breaks available only to the rich.  It cut back the deduction for mortgage interest and completely eliminated the deduction for interest on consumer loans, like auto loans and credit cards.  It eliminated the deduction for state and local sales taxes.  It limited deductions for charitable contributions, IRA deposits, medical bills, and other personal expenses.  It set the tax rate on capital gains—that is profit on stocks, real estate deals, and so on—at the same level as the top income tax rate, so that financiers could no longer cut their tax bill by defining all their pay as capital gains.”

“The new law produced significant tax cuts for low-income and median-income Americans and provided tax savings for the rich as well.  But somebody had to pay for all that lost revenue, and the burden was shifted largely to corporations.  Although the bill cut the basic corporate tax rate, from 48% to 34%, it took away so many of industry’s cherished credits, deductions, and depletion allowances that corporate taxes increased by some $120 billion over five years.”

It may be difficult to believe now, but at one time the United States was the shining example of how to produce tax policy.

“This stunning and unexpected tax reform, particularly coming out of a politically polarized Washington, D.C., drew attention, and prompted action, around the world.  When little New Zealand transformed its tax code, the other wealthy nations found it interesting; when the mighty United States did the same thing, the rest of the world found it imperative, on political and fiscal grounds, to do the same.  In short order, Britain, Ireland, Canada, the Netherlands, and other democracies dramatically lowered their tax rates by broadening the tax base.  The OECD called this wave of tax reduction a ‘global revolution,’ and the United States lit the spark.”

But politicians do what politicians do, and “money doesn’t talk, it swears,” so over the years even New Zealand’s tax code became cluttered with exceptions and had to be rewritten in 2010.  Our politicians have also been busy.  Corporate taxes produce ever less relative revenue, and the demand is to cut that revenue even further.  Mechanisms for tax avoidance inexorably grow, forcing the government to either raise rates or borrow money to keep the ship of state afloat.

“After proudly patting itself on the back because of the 1986 reform, Congress in the next three decades made more than thirty thousand changes to the 1986 code.  Most of them ran counter to the ethos of BBLR.  Virtually all of them made the tax code more complicated—including that bizarre ‘anti-complexity clause,’ Section 7803(c)(2)(B)(ii)(IX).”

It is hard to believe that only three decades ago Congress was capable of acting on such a major bill.  Reid may see his 32-year prophecy come to pass, but it is unlikely that the result will be viewed as a worthwhile accomplishment by the American people—let alone by the rest of the world.



Tuesday, November 14, 2017

The Persistence of Implicit Racial Bias

Although many of the world’s most powerful nations have participated in slavery, the United States was unique in the extent to which it based its political structure and its economy on the institution of slavery.  In so doing, it created a situation that it has struggled to deal with for the last 150 years.  The inevitable end of enslavement for millions of African Americans left a white-dominated society with a need to incorporate all of these new citizens.

People who enslave others must produce a moral justification for themselves.  If one chooses to make slaves of members of a race, then that race must consist of people who are deserving of subjugation.  If whites are humans, then blacks must be subhuman in some way or another.  It was as simple as that.  And that belief, imprinted over centuries, did not disappear with the end of the institution of slavery.  In fact, it persisted openly throughout much of the twentieth century.  It seems to continue to propagate within our culture even though overt forms of discrimination have become illegal.

Keith Payne addresses the implicit form of racial bias that persists in the US in his book The Broken Ladder: How Inequality Affects the Way We Think, Live, and Die.  Payne devotes a chapter to the ties between racial bias and inequality.  He concludes that discrimination persists, although it has become more covert.  He also demonstrates that while explicit bias is much less apparent, implicit bias is quite common.  Most troubling, but perhaps most enlightening, he provides examples indicating that racial bias is often subconscious, the result of a lifetime of conditioning.  Payne provides this perspective.

“When the Civil Rights Act of 1964 outlawed overt racial discrimination, and the Voting Rights Act of 1965 ended explicitly discriminatory voting practices, society did not change overnight in response.  Following that 350-year period of perfectly legal subjugation, a mere half century—less than a single lifetime—separates us from whites-only lunch counters, water fountains and schools.  How much have things changed since then?  It depends whom you ask.”

“If you look at polls, the proportion of Americans favoring overtly racist ideas like segregated schools and hiring discrimination has declined from clear majorities in the 1960s to single digits today.  These trends have been regarded as an encouraging sign, but perhaps we have drawn too much encouragement from them.”

There are a number of studies that indicate racial discrimination, while more subtle, is still alive and well.  One of the more famous studies, by sociologist Devah Pager, consisted of sending out equal numbers of young black and white men with résumés crafted to be equivalent.  They were also given equivalent narratives to introduce themselves to prospective employers.

“The white applicant was called back twice as often as the equally qualified black applicant.  Similar studies have been repeated with the same results in New York, Chicago, Atlanta, and other cities.  They have also been replicated in areas other than employment.  Black renters are much more likely than equally qualified white renters to be told there are no vacant apartments.  Black shoppers are offered less favorable deals on cars and higher interest rates on mortgages than equally qualified whites.  Antiblack bias is alive and well in twenty-first-century America.”

People acquire their racial attitudes from family, from friends, and from what they view.  Since de facto segregation is still common, much of what is learned about other races is absorbed from the media.  The sum of all those inputs programs an individual to respond to other races in certain ways.  The examples of bias listed above derive from conscious decisions to discriminate.  What Payne wants us to realize is that racial bias can arise from subconscious mechanisms and lead to what he refers to as implicit bias. One can be biased without actually realizing it.

Payne describes his own enlightenment as he sought a tool to measure implicit bias in a rather important context.  He was interested in learning the probability that a person would mistakenly assume a harmless object was a gun when it was associated with a black person.  His objects were mostly tools, like wrenches and pliers, chosen to be metal and similar in size to a handgun.  The idea was to quickly flash a picture of a white or black person, followed by an image of an object.  The subjects participating in the measurement were given only a brief instant to decide whether the object was or was not a gun.  Payne tried the program on himself as he verified that it was working properly.  He was startled by what he discovered.

“When I looked at my data I got about 80 percent correct.  That was not a bad result, but the pattern of my errors was disturbing: I was much more likely to mistake harmless objects for guns when a black face had been flashed initially.”

“Sitting there in my lab, trying to beat my own bias test and failing, I felt for the first time the discomforting gap between my good intentions and my biased behavior, known as implicit bias.”
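As a concrete illustration of the task’s design, here is a small simulation of the kind of error pattern Payne describes.  Everything here is hypothetical: the error rates are invented to mimic the reported pattern, and this is not Payne’s software or data:

```python
import random
from collections import defaultdict

TRIALS = 1000

def respond(prime: str) -> str:
    """One speeded trial: a harmless tool is flashed after a face prime.
    The injected bias: tools following a black face are misread as guns
    more often (rates are made up for illustration)."""
    error_rate = 0.25 if prime == "black" else 0.10
    return "gun" if random.random() < error_rate else "tool"

errors = defaultdict(int)
counts = defaultdict(int)
for _ in range(TRIALS):
    prime = random.choice(["black", "white"])
    counts[prime] += 1
    errors[prime] += (respond(prime) == "gun")  # every "gun" response here is an error

for prime in ("white", "black"):
    print(f"{prime} face prime: {errors[prime] / counts[prime]:.0%} of tools called guns")
```

Overall accuracy can be high, as Payne’s was, while the errors still skew sharply by prime—which is exactly the pattern that defines implicit bias in this task.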

His explanation for his own behavior, and that of the average person who would take such a test, is that when we are faced with ambiguous data, our subconscious has been programmed to make a choice, and that choice will be the one we subconsciously expect to be the case.

“One of the best-established findings in all of psychology is that people make sense of uncertain or ambiguous circumstances by relying on their expectations.  The less time there is to think carefully, the more they depend on them.”

Payne’s experience certainly is relevant to the rash of police shootings of black people that have occurred in recent years.  His data indicates that not everyone exhibits this bias, but enough do to allow him to assert that the average person will be biased with respect to race.  One might expect that this knowledge can be used to eliminate this tendency, but that is not the case.

“In some versions of the experiment, we even warned the subjects that the race of the face would bias them and urged them to resist that prejudice.  But cautioning didn’t help, and in fact it made the bias even worse, because then the topic of race was more prominent in subjects’ minds.  Good intentions don’t protect us from unintended biases.”

Payne’s results have been duplicated by other researchers in their own laboratories.  This is a significant finding.  One has to wonder how this mental programming was accomplished.  How did this association of black people with guns occur?  Payne perhaps provides us with a clue in a discussion of another bias against blacks.

According to Payne, we have been programmed to think of poor people who are deserving of assistance as whites, and those who are on “welfare” as undeserving blacks.

“Not only does income inequality heighten racial bias, but prejudice can also perpetuate income inequality.  Decades of studies have found a strong correlation between dislike of black people and opposition to social welfare policies aimed at helping the poor.”

“’Welfare’ simply refers to the suite of race-neutral government programs aimed at helping the poor, so these results don’t make much sense on their surface.”

“But it turns out that when Americans talk about ‘the poor,’ they mean something very different from when they talk about ‘welfare recipients.’  The best predictor of wanting to slash funding for welfare recipients is racial prejudice.  People who believe that black Americans are lazy and undeserving are the most likely to oppose welfare spending.”

It seems the traditional media has played a role in establishing biases.  It is scary to consider what social media will contribute to interracial strife.

“While it may not be surprising that the average person views welfare in racially tinged terms, the truth is that welfare recipients are about evenly divided among white, black, and Hispanic recipients.  But when [political scientist Martin] Gilens analyzed depictions of welfare recipients in television and newsmagazines since the 1960s, he found a clear racial bias: When welfare recipients were depicted as the ‘deserving poor,’ they were mostly white, but when they were portrayed as lazy and dishonest, they were overwhelmingly black.”

Insidious cultural messaging from parents and peers might be expected to damp out only over a few generations.  But when the messaging is embedded in our mass media, the situation can be corrected quickly.  Whether or not change occurs depends on people like Payne getting the message out.

Payne leaves us with this final thought on the matter of implicit racial bias.

“Understanding implicit bias requires taking a more nuanced approach to the individuals we are easily tempted to label as ‘racist’ or ‘not racist.’  If you consider whether you yourself are biased, and why, you will likely focus on your conscious thoughts and beliefs, your values and good intentions.  Having reflected on what a fundamentally good person you are, you will conclude that implicit bias is other people’s problem.  Although we would all like to believe ourselves to be members of the ‘not racist’ club, we are all steeped in a culture whose history and present is built on massive racial inequality.  Research has shown that a majority of even well-meaning people—and their children—show signs of implicit bias when tested.”




Wednesday, November 8, 2017

Early States, Capitalism, and the Domestication of Humans

There is a tendency to assume that human evolution has been characterized by an inexorable improvement in humanity’s capabilities and in the societies it develops.  When humans were at the mercy of the elements and could exercise little control over their environments, natural selection would favor characteristics that enhanced survivability in whatever environment existed.  That produces change, but it does not necessarily produce what one might consider, in retrospect, progress.  As time went on, humans became more adept at influencing their own environments, creating new selection pressures.  Around 10,000 BCE humans began to experience and try to manage a number of changing conditions, probably driven mostly by increases in population and the ever-changing climate.  A species that had spent most of its existence hunting and gathering would gradually transition into farmers, herders, and craftspeople.  Small groups would be replaced by larger communities that would ultimately evolve into states organized around an agricultural economy.  This is often viewed as a period of great progress on the part of humanity.  Humans filled the earth and to a great extent molded it to suit their needs.  However, in changing the earth they also changed the factors operative in natural selection.  By changing their environment, humans changed themselves.

It was certainly a period of great change, but can it all be viewed as progress? 

James C. Scott is a political scientist at Yale University with an interest in the characteristics of the earliest states.  Impressed by the amount of new information produced by archeological and anthropological studies, he was moved to present his interpretation of this fresh data in his book Against the Grain: A Deep History of the Earliest States.  His focus is on events in a region roughly equivalent to modern Iraq.  What Scott’s analysis makes clear is that the precursors of the modern state were entities driven by elites whose goals had little to do with any universal benefit to humanity.  Rather, these early political constructs seemed more akin to modern corporations—but ones with horrible human resource policies.

Scott introduces his final chapter with this warning:

“The history of the peasants is written by the townsmen
The history of the nomads is written by the settled
The history of the hunter-gatherers is written by the farmers
The history of the nonstate peoples is written by the court scribes
All may be found in the archives catalogued under ‘Barbarian Histories’”

The immediately accessible record of the distant past is generally self-serving documentation produced by a small element of the population.  In effect, any person not controlled by a state was considered a “barbarian.”  The term had nothing to do with the quality of life or the viability of the society in which these nonstate peoples lived.  In fact, Scott claims one could argue that it was only around 1600 CE that the majority of humans transitioned from “barbarism” to state domination.  That last statement carries a scent of cynicism, an attitude difficult to avoid after reading Scott, who seems to enjoy pointing out the conflicts of interest between state rulers and state subjects.  For example, the most important task of the early states was to prevent the accumulated laborers under state control from escaping and regaining the safer and more comfortable life available under “barbarism.”

Scott provides a brief chronology of the period of interest.

“Homo sapiens appeared as a subspecies about 200,000 years ago and is found outside of Africa and the Levant no more than 60,000 years ago.  The first evidence of cultivated plants and of sedentary communities appears roughly 12,000 years ago.  Until then—that is to say for ninety-five percent of the human experience on earth—we lived in small, mobile, dispersed, relatively egalitarian, hunting-and-gathering bands.  Still more remarkable, for those interested in the state form, is the fact that the first small, stratified, tax-collecting, walled states pop up in the Tigris and Euphrates Valley only around 3,100 BCE, more than four millennia after the first crop domestications and sedentism.  This massive lag is a problem for those theorists who would naturalize the state form and assume that once crops and sedentism, the technological and demographic requirements, respectively, for state formation were established, states/empires would immediately arise as the logical and most efficient units of public order.”

Here is his summary of our conventional wisdom as to our history.

“Historical humankind has been mesmerized by the narrative of progress and civilization as codified by the first great agrarian kingdoms….In its essentials, it was an ‘ascent of man’ story.  Agriculture, it held, replaced the savage, wild, primitive, lawless, and violent world of hunter-gatherers and nomads.  Fixed-field crops, on the other hand, were the origin and the guarantor of the settled life, of formal religion, of society, and of government by laws.  Those who refused to take up agriculture did so out of ignorance or a refusal to adapt.  In virtually all early agricultural settings the superiority of farming was underwritten by an elaborate mythology recounting how a powerful god or goddess entrusted the sacred grain to a chosen people.”

“No one, once shown the techniques of agriculture, would dream of remaining a nomad or forager.  Each step is presumed to represent an epoch-making leap in mankind’s well-being: more leisure, better nutrition, longer life expectancy, and, at long last, a settled life that promoted the household arts and the development of civilization.”

What actually happened was that these “barbarians” had already developed all the technology needed to implement an agricultural economy based on a few dominant crops and animals, yet they decided against it.  They had very good reasons for not following that path, resisting such a move for over 4,000 years.  The region in which they lived, the Tigris and Euphrates Valley, was at the time rather lush, with many wetland areas providing an abundant and diverse assortment of food sources.  Acquiring one’s daily nutrition was a part-time occupation, and if conditions changed it was relatively simple to move on to a new location.

“Having already domesticated some cereals and legumes, as well as goats and sheep, the people of the Mesopotamian alluvium were already agriculturalists and pastoralists as well as hunter-gatherers.  It’s just that so long as there were abundant stands of wild foods they could gather and annual migrations of waterfowl and gazelles they could hunt, there was no earthly reason why they would risk relying mainly, let alone exclusively, on labor-intensive farming and livestock rearing.”

What was driving the development of the agricultural economy based on state control was not the desire to advance civilization, but the desire to earn a profit for the few from the labor of the many.  Among the many available food sources, grain was chosen as the main crop because it provided critical industrial and fiscal advantages: it could be traded, making it the equivalent of money.  Producing that wealth would require a great amount of labor, far more effort than hunting and gathering involved, so a degree of coercion was needed to obtain laborers.  Either environmental conditions had made hunting and gathering no longer competitive, or physical coercion was applied.  What records remain of these early states indicate great concern about maintaining the workforce by preventing escape, or by replacing those who escaped with captives enslaved in raids on other sites.  So much for advances in civilization.

“The key to the nexus between grains and states lies, I believe, in the fact that only the cereal grains can serve as a basis for taxation: visible, divisible, assessable, storable, transportable, and ‘rationable.’  Other crops—legumes, tubers, and starch plants—have some of these desirable state-adapted qualities, but none has all of these advantages.”

“The fact that cereal grains grow above the ground and ripen at roughly the same time makes the job of any would-be taxman that much easier.  If the army or tax officials arrive at the right time, they can cut, thresh, and confiscate the entire harvest in one operation.”

These early states were to be kingdoms, not democracies.  The people, other than a class of elites, were subjects, not citizens.  Wild animals were domesticated by controlling reproduction to produce desired characteristics.  It would take only a few generations of controlled breeding to produce more docile animals better acclimated to life in the agricultural economy.  Something similar must have occurred with humans as they were extracted from the more intellectually challenging and nutritionally superior life of the hunter-gatherer and subjected to generations of simple but strenuous labor.

“’Domiciled’ sheep, for example, are generally smaller than their wild ancestors; they bear telltale signs of domesticate life: bone pathologies typical of crowding and a narrow diet with distinctive deficiencies.  The bones of ‘domiciled’ Homo sapiens compared with those of hunter-gatherers are also distinctive: they are smaller; the bones and teeth often bear the signature of nutritional distress, in particular, an iron-deficiency anemia marked above all in women of reproductive age whose diets consist increasingly of grains.”

“Evidence for the relative restriction and impoverishment of early farmers’ diets comes largely from comparisons of skeletal remains of farmers with those of hunter-gatherers living nearby at the same time.  The hunter-gatherers were several inches taller on average.  This presumably reflected their more varied and abundant diet.”

Animal species that have been domesticated all undergo physiological changes and suffer a loss of brain mass relative to their wild counterparts.  It is not clear exactly what that loss can be attributed to, but it seems foolish to assume that humans could not have been similarly affected.  In fact, physical anthropologists tell us that human brain size has been decreasing for the past 20,000 years.  Could it be that civilization places fewer demands on us and allows smaller brains to prove adequate?

“It is no exaggeration to say that hunting and foraging are, in terms of complexity, as different from cereal grain farming as cereal grain farming is, in turn, removed from repetitive work on a modern assembly line.”

The enshrinement of the agricultural economy, and the increase in population density of both humans and other animals, contributed yet another new feature to civilization: the creation of modern infectious diseases.  A virus or microbe that thrives within an animal host must, if it is to survive, have a mechanism for transferring to another host and a new host available to receive it.  There is therefore a minimum population size required for a disease agent to propagate and persist, a value that differs according to the characteristics of the given agent.  The critical point is that larger, higher-density populations invite new disease agents to move in and take hold.

“The importance of sedentism and the crowding it allowed can hardly be overestimated.  It means that virtually all the infectious diseases due to microorganisms specifically adapted to Homo sapiens came into existence only in the past ten thousand years, many of them perhaps only in the past five thousand.  They were, in the strong sense, a ‘civilizational effect.’  These historically novel diseases—cholera, smallpox, mumps, measles, influenza, chicken pox, and perhaps malaria—arose only as a result of the beginnings of urbanism and, as we shall see, agriculture.  Until very recently they collectively represented the major overall cause of human mortality.”

Agriculture’s specific contribution to the misery of humankind is the close association it created between herds of humans and herds of other animals.  This gave infectious agents from humans the opportunity to attack other species (anthroponosis) and, more important to us, gave infectious agents carried by animals the opportunity to infect humans (zoonosis).

“Estimates vary, but of the fourteen hundred known human pathogenic organisms, between eight hundred and nine hundred are zoonotic diseases, originating in nonhuman hosts.”

“In an outdated list, now surely even longer, we humans share twenty-six diseases with poultry, thirty-two with rats and mice, thirty-five with horses, forty-two with pigs, forty-six with sheep and goats, fifty with cattle, and sixty-five with our much studied and oldest domesticate, the dog.”

The zoonosis process continues: HIV and Ebola are recent zoonotic diseases.  Some have suggested that the rate of zoonosis is increasing as humans continue to grow in number and push into unexplored ecosystems harboring new sources of pathogens.

Over the millennia, these new diseases had devastating effects on human populations, contributing to the rise and fall of states and severely limiting population growth—at least for a while.  Since survivors of a given disease acquire immunity, populations would eventually stabilize while allowing the disease to lurk in the background and survive—provided there was a sufficient flux of new potential hosts.  The disease became endemic within that population, ready to leap out and devastate any group of humans without acquired immunity that it might encounter.  One legacy of the human civilization project is the burden of living under the continuous threat of new and even more dangerous epidemics.
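The point about population thresholds can be made concrete with a toy model.  The sketch below is my own illustration, not anything from Scott’s book, and its parameter values are invented: it runs a simple stochastic SIR (susceptible-infectious-recovered) epidemic with births and measures how long an infection circulates before dying out.  In a small community the pathogen quickly exhausts its hosts; as the population grows, the steady inflow of newly born susceptibles lets it persist far longer.

```python
# Toy stochastic SIR model with births (illustrative parameters only).
import numpy as np

rng = np.random.default_rng(0)

def persistence_days(population, beta=0.3, gamma=0.1, birth_rate=5e-4,
                     max_days=3650):
    """Return how many days an infection persists in a closed community."""
    s, i, r = population - 1, 1, 0            # one initial infectious person
    for day in range(max_days):
        if i == 0:
            return day                        # the pathogen has died out locally
        n = s + i + r
        # Each susceptible risks infection in proportion to the share of
        # the community that is currently infectious.
        new_inf = rng.binomial(s, beta * i / n)
        new_rec = rng.binomial(i, gamma)      # infections last ~10 days
        births = rng.binomial(n, birth_rate)  # newborns join the susceptibles
        s += births - new_inf
        i += new_inf - new_rec
        r += new_rec
    return max_days                           # still circulating after ten years

for size in (300, 3000, 30000):
    runs = [persistence_days(size) for _ in range(20)]
    print(f"population {size:>6}: mean persistence {np.mean(runs):7.0f} days")
```

The numbers are not to be taken seriously; the point is only that persistence time climbs steeply with community size, which is why these crowd diseases could not have existed before sedentism and urbanism supplied the crowds.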

We know, of course, that the hunter-gatherer lifestyle lost out in the long run as formalized, hierarchical states became dominant.  Some combination of population pressure and climate change probably forced people to accept the less desirable role of state-controlled laborers.  Can this be considered progress?  Many view this transition as the source of the tale of humans being driven out of the Garden of Eden and forced to live a life of suffering and toil.

Tales of paradise lost are probably extreme, but it is clear that this transition was quite painful and not without unforeseen consequences.  Scott is cautious in hypothesizing about what effect the state-dominated regime, with its economic demands, might have had on humans.  He believes not enough generations have passed for genetic consequences to be readily apparent.  He may be wrong.

Consider that we began with a picture of humans living in small bands where wealth was not accumulated and individuals were more or less equal in status.  The dynamic of a small band demands that members look out for each other and share their individual bounties when appropriate.  Would anyone use those same words to describe current societies?  What the “expulsion from Eden” did was create a hierarchical society where class mattered and there were always elites who accumulated wealth through their control of society.  We went from living in a world of relative equality to one of rampant inequality.  We went from a society in which cooperation was demanded to one in which competition is required.  We went from a place where wealth was barely even a concept to one in which it is deemed worth breaking all societal rules to acquire it.

Living in such an environment for a few hundred generations is plenty of time for natural selection to have carried us off in some different direction.  We are not who we were, and we do not know who we will become.

There are at least two rapidly approaching crises with which humans will have to deal.  One is climate change; the other is the growth of automation and artificial intelligence.  It is likely that both will require a considerable retreat from the inequality and individual competitiveness we have grown accustomed to if we are to survive intact.  The question is: Can we discard the lessons of a hundred generations’ worth of natural selection in the next generation or two?

I fear not.


Sunday, October 29, 2017

South Korea: Managing Growth by Increasing the Minimum Wage

A significant number of economists have begun to realize that the US economy has been mired in stagnation: demand has been too weak to induce corporations to invest their earnings in new production.  This situation is usually addressed by the federal government providing a fiscal stimulus, funding projects that generate economic activity through the purchase of goods and services.  The stimulus then comes from the increased sales some companies will see and the extra wages that will be paid in providing those goods and services.  The economy as a whole benefits to the degree that the effects of additional production and increased wages are widely distributed.  There is no guarantee that increased production will directly generate spin-off production, but it is certain that a large fraction of any increase in wages will be spent and contribute to general economic activity.

Given the above reasoning, it would seem most efficient to increase wages directly if one wishes to stimulate the economy.  One can try to accomplish this with tax cuts, but those tend to go preferentially to the highest taxpayers, who are the least likely to spend any largess in ways that stimulate the general economy.  One could conjure up a tax cut that delivers money directly into the hands of lower-income people, or declare a tax holiday that benefits some sector of the population preferentially.  In either case, consumers who are also debtors will tend to use much of the extra income to pay off bills, particularly if the increase in income is viewed as temporary.

The most efficient way to inject money directly into the economy, then, is to increase the wages of people who have no choice but to spend the money.  Fortunately (or not), the US has a large number of workers living at or near the poverty line who are in great need of the means for increased consumption.  An efficient way to boost the economy, and by so doing provide income to those most in need, would be to increase the minimum wage.  Presumably, the greater the increase, the greater the stimulus.
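The reasoning here is the standard spending multiplier.  As a back-of-the-envelope sketch, with numbers of my own invention rather than figures from any study, each extra dollar spent becomes someone else’s income, and the successive rounds of re-spending sum to mpc/(1-mpc) dollars of follow-on activity, where mpc is the marginal propensity to consume:

```python
# Spending-multiplier sketch (illustrative mpc values, not measured ones).

def follow_on_spending(initial, mpc, rounds=100):
    """Total extra spending generated as an income boost circulates.

    mpc is the marginal propensity to consume: the fraction of each new
    dollar of income that is spent rather than saved.
    """
    total, income = 0.0, initial
    for _ in range(rounds):
        spent = income * mpc    # this round's spending...
        total += spent
        income = spent          # ...becomes the next round's income
    return total                # converges to initial * mpc / (1 - mpc)

# A minimum-wage worker might spend ~90 cents of each extra dollar,
# while a high earner receiving a tax cut might spend ~30 cents.
for label, mpc in [("wage increase for low earners", 0.9),
                   ("tax cut for high earners", 0.3)]:
    print(f"{label}: $1.00 of new income yields "
          f"${follow_on_spending(1.0, mpc):.2f} of follow-on spending")
```

With those assumed propensities, a dollar added to low wages generates roughly nine dollars of follow-on spending, while a dollar given to a high earner generates about 43 cents.  That is the arithmetic behind the claim that raising the minimum wage is the more efficient stimulus.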

Such a move is said to produce severe economic consequences.  Higher wages will cause some businesses to raise prices, some to eliminate workers, and some to close up shop.  However, the increase in wages will show up as increased economic activity, which will have a counterbalancing effect.  Will such a move improve the economy and the lives of its workers, or will it diminish both?  No one really knows.  Small changes in the minimum wage, or large changes applied only locally, have led to arguable results.  What one needs in order to understand the dynamics is a large increase in the minimum wage applied throughout an entire economy, with the long-term results observed.  Fortunately, South Korea seems poised to perform that experiment for us.

An article in The Economist, titled Promising the Moon in the paper edition (a play on the name of the new President of South Korea), appeared online under the more descriptive title South Korea tries to boost the economy by hiking the minimum wage.  South Korea might seem an unlikely nation to pursue such a policy.  The country has long been blessed with high growth rates reflecting its ability to produce high-value goods that sold well throughout the world.  But all growth brings problems with it.  One is the inequality that inevitably comes with capitalism.  Dependency on export-driven growth can also leave a nation too vulnerable to the whims of international markets.  The goal of the new President, Moon Jae-in, is to address both issues by increasing the minimum wage by about 55% by 2020.

“On the face of things, the South Korean economy is doing well. Growth has averaged 3% annually over the past six years, a decent outcome for a period when global trade was sluggish. Income per person is about two-thirds of America’s, up from a third 25 years ago. The unemployment rate is just 3.6%. South Korea spends more as a share of GDP on research and development than almost any other country.”

“Nonetheless, poorer Koreans resent rising inequality….A study by the International Monetary Fund last year found that the top 10% of South Koreans receive 45% of total income—a greater concentration than in other big economies in Asia. The proportion has risen sharply over the past two decades as the wages of the rich have grown faster than those of the poor….Adjusted for inflation, household incomes fell last year, something that in recent decades had happened only in the wake of financial crises.”

“The bet is that the jump in wages will feed through to stronger consumption, particularly as low-earners tend to spend more of their pay than the rich do. In addition to propping up growth, stronger consumption would make South Korea less reliant on exports and so less beholden to the whims of China and America, Mr Moon predicts. It should also help reduce inequality.”

The article provides a chart placing what South Korea intends to do in comparison with minimum-wage policies in other countries.  [Chart from The Economist not reproduced here.]
Since The Economist is quite conservative in its economic principles, it suggests that disaster might be the outcome.  Let us hope not.  History tells us that a required minimum wage provides a very stable floor to wages.  The only way to improve the wages of the lowest-paid workers is to raise that minimum.

