Monday, September 18, 2023

Is Increasing Inequality Inevitable?

I believe that human societies naturally trend toward greater economic inequality because of obvious feedback mechanisms that favor higher-income and wealthier families.  Only revolution or other cataclysmic events seem to reverse this trend.  And those events are a high price to pay.

Some of the feedback mechanisms that increase inequality are:

  • The compounding growth of investment value
  • The double-income multiplier (the wealthy marrying the wealthy)
  • Education inequality (private schools, tutoring, college/exam prep)
  • Legacy college admissions
  • Hiring through networks and friendships
  • Investment opportunities through friendships and networks

The wealthier one is, the more weight these feedback mechanisms have.
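
The first of these mechanisms is easy to underestimate.  Here is a minimal sketch (with invented numbers, purely for illustration, not drawn from any data) of how a modest difference in investment returns, compounded over decades, widens the gap all by itself:

```python
# Illustrative only: invented numbers showing how a small, persistent
# advantage in investment returns compounds into a widening gap.

def project_wealth(start: float, annual_return: float, years: int) -> float:
    """Compound `start` at `annual_return` for `years` years."""
    return start * (1 + annual_return) ** years

years = 30
modest = project_wealth(50_000, 0.04, years)    # limited investment access
wealthy = project_wealth(500_000, 0.07, years)  # better opportunities and advisors

print(f"Modest household after {years} years:  ${modest:,.0f}")
print(f"Wealthy household after {years} years: ${wealthy:,.0f}")
print(f"Wealth ratio: {wealthy / modest:.1f}x (it started at 10.0x)")
```

The wealthier family here starts ten times ahead but ends roughly twenty-three times ahead, before any difference in work or talent enters the calculation.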

 

One school of thought argues that economic inequality is a necessary component of economic progress and is at least partly attributable to differences in human nature and behavior – ambition, talent, intelligence, work ethic, addictions, delinquency, and so on.  The argument extends even to the postulate that large income inequalities accelerate overall economic growth (a rising tide lifting all boats), since job and company creation, investment and innovation depend on high-net-worth individuals or families.  On this theory, large inequalities ultimately have a net positive effect.

 

A strong counterargument to this school of thought can be seen in the weak economic development of many countries in Latin America and Africa.  Large inequalities exist and there are many high-net-worth individuals, yet economic woes persist and in many cases worsen.  These nations also often suffer rampant corruption and a lack of judicial and political stability, societal defects at least partially rooted in their histories of colonial rule and exploitation.  These problems help entrench inequality.

 

The fact that something occurs naturally does not imply that it is good.  The naturally occurring increase in inequality is certainly influenced by the behavioral traits of both the rich (self-interest, selfishness, opportunism, greed, vanity) and the poor (poor life choices, poor work ethic).  But I would argue that the positive feedback mechanisms listed above accelerate and magnify these ‘natural’ tendencies.  There will always be inequality based on differences in human behavior and potential, but the ever-growing, runaway inequality we see today is the result of factors that have nothing to do with hard work and talent.

 

The Pew Research Center conducted a survey in 2019 (pre-pandemic) to better understand prevailing views on economic inequality.  It found that a majority (61%) of Americans felt there was ‘too much economic inequality’.  But as one might expect, there were significant differences between left and right: that overall figure of 61% reflected agreement from only 41% of right-leaning respondents versus 78% of left-leaning ones.
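
A quick back-of-envelope check (purely illustrative, and it assumes, hypothetically, that the two partisan groups were roughly equal in size) shows that the partisan figures are consistent with the overall number:

```python
# Back-of-envelope check: with a hypothetical 50/50 split between
# left- and right-leaning respondents, the overall share saying there
# is 'too much economic inequality' would be about:
left_share, right_share = 0.78, 0.41
overall = 0.5 * left_share + 0.5 * right_share
print(f"{overall:.1%}")  # prints 59.5%, close to the reported 61%
```

The small remaining gap would reflect unequal group sizes and respondents who lean neither way.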

 

It is likely that the 39% who don’t believe there is too much inequality (and this includes 22% of those on the left!) think that a certain amount of inequality is inevitable and even desirable in an economy.  Views on what causes inequality also varied predictably, with right-leaning respondents citing personal factors far more often than left-leaning ones.

 

If one looks at the history of global economic inequality, it is clear that inequality is a stubborn and unyielding aspect of human development.  Thomas Piketty, the French economist, has written two very well-researched (and long!) books on this topic.  What I find particularly interesting in his data is that over the last hundred and fifty years or so, the industrial world has several times seen inequality rise and then fairly suddenly moderate.

 

Unfortunately, this moderation has generally come after cataclysmic events – world wars, revolution, or the Great Depression.  And it has at least partially been the result of major changes in the taxation of income and wealth that became necessary to pay down national debt.  Other factors, such as education, labor union strength, and social benefits, play a role as well.

 

All of the above-mentioned factors rely on the political will to initiate changes to reduce inequality, a political will that is naturally weakened as more power and influence accrue to the wealthy.

 

The Pew survey found that 84% of respondents felt that taxes should be raised on the wealthy, including a surprising 65% of right-leaning respondents.  But only 14% felt that their own taxes should be raised.

 

If these numbers are valid, then there is at least hope for some sort of future income and/or wealth tax that could reduce economic inequality.  But for this to happen there must be a much stronger groundswell of concern to overcome the reluctance of conservative legislators, who are probably part of the 16% that don’t believe in raising taxes on the wealthy (i.e., themselves).

 

We face a plethora of challenges in this world – climate change, immigration and refugee crises, regional and global conflicts, and political instability, to name a few.  Economic inequality is not generally at the top of that list for most people (the Pew survey found only 42% consider it a major priority), but to me it is a symptom of an increasingly sick society that will be less resilient in facing other crises.  We neglect it at our peril.

Saturday, September 16, 2023

A Journey of Faith, Reason, Logic, Belief and Doubt

Christians often talk about a faith journey.  It has been a staple of Christian group interactions for one person to speak about how they were raised in the church and the various phases of faith and belief they went through.  These ‘witnessings’ are understandably often quite emotional and powerful for both the person testifying and the audience, as one’s core beliefs about religion and spirituality are inextricably bound to one’s self-image, self-worth and deep longing for meaning.

I have had a journey too, but I would characterize it as a combination of faith, reason, and logic, with doubt as a driving force.  I went through an early childhood of Episcopal church attendance, which ended in the middle of my sixth-grade year when my family moved to California and we stopped going to church.  My interest in matters of church and spirituality was minimal throughout high school and college, though I had brief involvements with Young Life, a Christian youth movement that recruited high school students, and with a soccer teammate in college who attempted to ‘bring me to Christ’.

 

After my short Naval career ended and I went to grad school, I began attending church again on my own in Boston and ultimately became very intrigued by Christian theology.  When I married my wife, Karen, who had grown up as a Methodist minister’s daughter and was deeply committed to Christian social justice, my infatuation with Christianity accelerated.  We became very involved in our church and I read widely in Christian literature.  I even spent a long weekend at a Christian retreat known as the Walk to Emmaus (named for the road on which the resurrected Jesus revealed himself to two disciples) and wrote a long essay proclaiming my beliefs.

 

This period of my life was very exciting and passionate as I explored my ‘faith’ within communities of very avid Christians.  I was almost totally convinced that this faith in Christ and the tenets of Christian theology were the ultimate truth about our existence and purpose.  Karen and I left our careers to join a Christian ministry, Habitat for Humanity, and immersed ourselves totally in this world.

 

But even in the midst of this most passionate embrace of Christianity, there were questions I posed to myself that slowly began to undermine the fervor of my belief.  These were questions about the exclusive nature of Christianity – “I am the way, the truth, and the life; no one comes to the Father except through me” – and the obvious contradiction between a loving God and the eternal damnation of non-Christians.

 

I was able to reconcile my ardent faith with these apparent inconsistencies by means of the oft-employed explanation that ‘with God all things are possible’ and that how He judges the world is a mystery we will neither solve nor understand.  We must simply have faith.  This seemed reasonable at the time.

 

But then, as my experience of the world and my knowledge of people, power, and history expanded, other doubts began to nibble at the edges of my belief.  Closer readings of the New Testament revealed multiple inconsistencies that could be explained only by blind acceptance of the text as directly God-given and inerrant.  I read several scholarly analyses of biblical history that explained how Christian doctrine had been established and how the texts were copied hundreds of times over the centuries.

 

The mere fact that the gospels and letters were written decades after the events they describe, and were clearly written with specific audiences and goals in mind, calls their accuracy into question.  And the biblical rehash of themes that had already appeared in multiple other religions and mythologies (virgin birth, sacrifice, resurrection, etc.) seemed less like a revelation of divine truth and more in line with the long history of the human desire to understand our existence, and with the human tendency to channel that desire into structures of obedience and control.

 

But the most difficult thing for me to ignore was the long list of illogical aspects of religious belief: the paradox of creation versus evolution; the incongruence of ‘God’s plan’ and free will (not to mention the sheer leap of faith required to imagine a God listening to prayers, deciding where and when to act, allowing huge injustices to occur, etc.); the idea of souls being inserted into humans who sometimes die after a few days, months, or years – before they are even cognitive beings; the idea of heaven, and how our eternal reward will juggle family and friends from across our lives and sustain us in a blissful state for eternity; the occurrence of miraculous events two thousand years ago, in an age of ignorance and superstition, versus the absence of religious miracles today.

 

These questions and doubts made it much more difficult for me to immerse myself fully in Christian faith.  I loved the sense of community and the emotional highs that spiritual liturgy and music provided, but I found my own beliefs becoming ever more abstract and uncertain.  I felt like a hypocrite and a charlatan as I mouthed the doxology and the articles of faith.

 

Religion recognizes doubts and questions, but it insists that one can overcome them with faith, that ‘substance of things hoped for, evidence of things not seen’.  Yet things hoped for and unseen can take almost any form.  How can one choose to hold a very specific faith when so much evidence contradicts it and so much uncertainty and mystery enshrouds all matters outside our physical and material experience?  Even our physical world continues to defy full understanding as quantum physics and cosmology evolve.

 

It is tempting to disparage religious belief as simplistic and many intellectuals, scientists and atheists energetically ridicule religion.  Humans can be very arrogant and vicious, and there is a lot of ego and vanity at play in the battle between so-called believers and non-believers.  It is a sad testament to the inevitable potential for conflict in all human affairs.

 

My own journey continues.  I have accepted the doubt, the mystery and the uncertainty, though I cannot say I am at peace with it.  I claim neither belief nor disbelief.  I search for insights without expecting resolution. I continue to love the idea of a soul or spirit, the hope of existing beyond my physical death, the vague image of some sort of loving force in the universe, whether pantheistic or deistic.  But I will not pretend to know or even to have ‘faith’.  This is not a comfortable state of mind, but it is an honest one and I cannot imagine any other way to live.

Tuesday, September 5, 2023

Oppenheimer and the Absurdity of Moral Distinctions in War

The three-hour film Oppenheimer was generally an interesting portrayal of Robert Oppenheimer and the invention of nuclear weaponry.  It took the typical Hollywood liberties – tossing in gratuitous nude scenes with Oppenheimer’s lover and lots of silly gee-whiz science moments and clever repartee that probably never occurred – but it did a good job of exposing the paranoid idiocy of 1950s anticommunist hysteria and the moral conundrum that faced the scientists who worked on the Manhattan Project.

 

The McCarthy era, with the shameful slander and penalties it inflicted on so many Americans, has been the topic of several movies, and there is not a lot more to say about it.  People who pursued social change in good faith through socialist and communist organizations should never have been persecuted unless they actively advocated or engaged in violent revolution.

 

But the questions of morality that surrounded the Manhattan Project and subsequent weapons programs are more complex and less easily navigated.  In fact, I would argue that war and weapons quickly ascend to a level of moral absurdity at which any rational conclusion becomes unattainable.

 

The Manhattan Project was launched at the instigation of several leading scientists (Einstein being the most notable) who were concerned that the Nazis might develop a nuclear bomb.  The conviction that the Allies must ‘beat’ the Nazis to the bomb seemed logical to these scientists and to the bureaucrats and military leaders who went on to fund and initiate the project.

 

But ‘beating the Nazis to the bomb’ implied that the bomb would be used on Germany if the war were still ongoing – regardless of whether the Germans were close to having their own – to ensure that the Nazis could not succeed in their own pursuit.  So from the very beginning, the Manhattan Project rested on the inescapably absurd moral calculus of the ‘lesser evil’, a calculation that lies at the heart of every modern war decision.

 

The ‘lesser evil’ proposition justifies an action by the hypothesis that, in the long term, fewer people will be killed (and usually that means fewer on ‘our’ side) by undertaking that action than by any alternative.

 

The decision to use a weapon on a civilian population, whether in response to an enemy’s strike or as a strategy to ‘break the will’ of the enemy by killing women and children, is mass murder no matter the rationale.  It may seem logical in time of war, but only because wartime has already suspended all morality and placed all decision-making on an absurd, amoral footing.  Comparing one set of deaths with another is a fool’s errand.  There is simply no satisfactory answer.

 

Many of the Manhattan Project scientists protested the use of the bomb on Japan.  The argument that fewer lives would ultimately be lost by dropping two atomic bombs and killing hundreds of thousands of civilians did not seem morally legitimate even if the calculations made sense.  This must have left a bitter aftertaste once the exultation of the successful Trinity test faded and the nightmarish news came in from Japan.

 

It seemed that Oppenheimer, upon confronting this dilemma and the ensuing debates over the development of the hydrogen bomb, became acutely aware of the moral absurdity of the scientists’ accomplishment and the inevitable arms race it would engender.  The long-term likelihood of a nuclear holocaust, which today once again looms over our bitterly divided world, recalled the earlier fear that the fission chain reaction might ignite the atmosphere and end the world.

 

There is no right and wrong in war, no moral path to seek in the murder of innocents.  I think Oppenheimer realized this at the end.  He thought and felt too deeply to avoid the doubts or to find solace in rationalizations.  And because of that he is all the more sympathetic as a human being. 

Wednesday, July 26, 2023

Is Human Creativity Really Different From AI?

It is clear that ChatGPT and other forms of generative artificial intelligence have catapulted into the general public’s consciousness and created a mixture of fear, glee, and unlimited pontificating.  The usual suspects who leap onto technology bandwagons are, of course, leading the charge.  They have dumped crypto and found a new, even sexier infatuation.

 

The prospect of generative AI saturating our society with fake news, deepfake videos and photos, and other chaos-creating content is disturbing, to say the least.  One doubts that there is any means to stem that onslaught, and the task of distinguishing truth from falsehood will grow progressively more challenging.

 

But in addition to that inevitable scourge, there is the somewhat dispiriting prospect of people using chatbots to do most if not all of their creative work – writing emails, letters, and essays; creating videos or photo albums; coming up with poems, songs, and melodies.  One can only imagine the nightmare for schoolteachers and professors trying to assess the capabilities of their students.  Or perhaps the only skill necessary or assessed in the future will be the ability to guide generative AI to whatever end product one desires.

 

We are reassured by some pundits that human creativity will not be in jeopardy because we are uniquely capable of innovation and modes of thought that computers cannot replicate.  But is this really true?  AI learns from the data it consumes.  Don’t humans do the same?  Isn’t our entire life a consumption of data?  We use the books we read, the music we heard, the conversations we had, and the movies and series we watched to construct new thoughts and new ideas, and these are the sources of all our creative output.
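
The parallel can be made concrete with a deliberately tiny sketch (a gross oversimplification of how modern chatbots actually work, offered only as an illustration): a program whose entire ‘creative’ output is recombined from the text it has consumed.

```python
import random
from collections import defaultdict

# A toy Markov-chain text generator: it 'learns' only from the text it
# consumes, then recombines those patterns into new output.

corpus = "the books we read and the music we heard and the movies we watched"
words = corpus.split()

# Learn: record which words follow each word in the consumed text.
follows = defaultdict(list)
for current, nxt in zip(words, words[1:]):
    follows[current].append(nxt)

# Generate: produce 'new' text built entirely from consumed patterns.
random.seed(1)
word = "the"
output = [word]
for _ in range(8):
    if word not in follows:
        break  # reached a word with no recorded successor
    word = random.choice(follows[word])
    output.append(word)

print(" ".join(output))
```

Everything the program produces traces back to what it read, yet the combinations can be new, which is precisely the question raised about our own creativity.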

 

It is true that there are subtleties to human thought and feeling that are more difficult to imagine being mastered by AI – irony, humor, sarcasm, empathy, sorrow, ecstasy, to name a few.  But these are also acquired over years of training and interaction.  A newborn has no empathy, no irony, no sarcasm.  Would it be so difficult for a computer to likewise be introduced to all of these and become conversant with them?

 

Most of us would like to believe that there is some aspect of the human brain or ‘spirit’, something beyond the purely material realm, that gives us our ‘humanity’ and our moral and ethical compass.  But others are at peace with the idea that human beings are simply incredibly complex and beautiful machines.

 

I suspect that the next few years will bring us the rather depressing realization that human creativity is not all that amazing after all.  We will find that a chatbot can come up with a catchy tune and lyric that rivals the Beatles or Cole Porter, or a novel that would make F. Scott Fitzgerald envious.

 

But then again, maybe after all is said and done, we will find that there is that certain ineffable genius of human creativity that is missing in the deluge of content created by generative AI.  And we may find ourselves all the poorer for having allowed it to dominate our world.  Who can say?

Sunday, July 23, 2023

Some Thoughts on Immigration and Its Downside

An academic study released in 2019 found that the children of poor immigrants climbed the income ladder much more successfully than the children of poor native-born parents.  This was true not only for immigrants from India, elsewhere in Asia, and Europe, but also for those from Africa, the Middle East, and Latin America – regions containing countries former President Trump labeled ‘shithole’ countries.

The authors theorized that these immigrants’ tendency to live in areas with more employment opportunities, as well as their willingness to move wherever new opportunities arise, might explain some of the difference.  There is also the fact that, in some cases, immigrant parents take jobs at a lower level than those they held in their mother country and are not really at a socioeconomic level comparable to the native-born poor.  This may significantly improve their children’s odds of success.

 

In addition to these so-called poor immigrants who achieve social mobility, there is also a significant number of immigrants who arrive in the USA as students or skilled workers to fill jobs that might not otherwise be filled.  This is especially true in technology and science.

 

The USA has always prided itself on attracting the best and brightest from across the world.  Our universities eagerly pursue bright international students who are willing to pay the tuition or who are particularly capable, and most of them stay and work in the USA.  They are also more likely than native-born students to pursue postgraduate education and then move into academia.  A whopping 22% of post-secondary teachers are immigrants!

 

In a world where most industrialized nations are beginning to experience population decline, immigration is one way to counteract the trend and sustain economic growth.  However, this solution for the so-called first world comes at what must surely be a high cost for non-industrialized countries.

 

For not only are these nations losing their best aspiring students and skilled workers to the lure of the industrialized world, but the poor emigrants who flee these countries are in most cases a highly motivated and industrious group whose departure constitutes a major loss for the mother country as well.  Is it any surprise that many countries remain impoverished and in a failed state when their most valuable resource is being siphoned off?

 

Worldwide competition for people may contribute to innovation and economic prowess in the winning countries, but it also exacerbates many of the problems that plague our increasingly globalized world.  There was a time when the USA could remain blissfully unaffected by the chaos and deprivation outside its borders, but that time is past.  If we cannot find a way to help developing nations retain their best, brightest and most motivated then we will all ultimately suffer.

Tuesday, May 30, 2023

Memorial Day - Placing a Value on Death?

In the USA we have two federal holidays dedicated to the military – Memorial Day and Veterans Day.  The former specifically honors those who have lost their lives in the wars of our nation, while the latter is a general recognition of all veterans, though still heavily focused on those who died.

The wars of my generation – Vietnam, Iraq and Afghanistan – have been almost universally acknowledged as tragic mistakes and failures.  The loved ones of those who lost their lives (or were horribly maimed) in those conflicts probably find little solace in the idea of a higher cause.  There was no victory, no great achievement, no noble sacrifice.  They must wrestle with the notion that their sons and daughters, husbands and wives, fathers and mothers died for nothing, for no reason at all.

 

I would offer a counterargument: any sacrifice of a life made in good faith, regardless of the outcome, should be honored.  The firefighter who rushes into a building to save a child and is killed should be honored even if the child dies or was no longer in the building.  It is the act and the intention that count, not the outcome.

 

But every attempt to place some sort of value on a death must ultimately seem a pitiful effort in the face of the horrible injustice of an early death.  The teenager who dies in a car crash, the child who succumbs to cancer, the young adult who overdoses, the kid who is murdered in a drive-by fusillade, the school children massacred in a mass shooting – the incomprehensible tragedy of it all haunts us.

 

Death, even in old age, unleashes a barrage of painful implications – the loss of a loved one and the horrific realization that one will never see them again; the potent reminder of our mortality and the rapid falling of the sand in our own hourglass; the question of life’s meaning and the troubling enigma of our existence.   And these thoughts are ever so much more poignant and relentless when the death is a youthful one.

 

Yes, time does partially heal the wounds.  And yes, we are resilient creatures who carry on even in the face of all of our doubts and fears.  And yes, there is joy to be had in this life no matter what hardships and tragedies confront us.  But death, and especially the death of the young, is never easily rationalized, and it remains a confounding aspect of our lives and rattles our faith and our spirit.  Attempting to place a value on a death is to a great extent a self-delusion, and I wonder whether it offers any real consolation.

 

Friday, May 26, 2023

AI and Genetic Engineering - Twin Horsemen of the Apocalypse?

The tsunami of fawning and fearful AI articles in the media over the last few months is breathtaking.  It might lead one to wonder whether chatbots are auto-generating all of these articles as part of an evil ploy to create widespread panic and prepare the world for AI’s takeover!  

The hyperbole reminds me of the hysteria that has frequently accompanied news about genetic engineering.  These two technological frontiers are two sides of the same coin – both change the basic nature and scope of humanity.  They are simultaneously thrilling and terrifying, harbingers of a very uncertain but intriguing future world.

 

Artificial Intelligence (AI) has been a topic of discussion for several decades.  And like much technology jargon, it is a very broad term that people overuse either to make themselves seem knowledgeable or, in this case, to sell content to a public hungry for apocalyptic rumors and new things to obsess about.

 

So, what is AI?  Wikipedia defines it as: “intelligence—perceiving, synthesizing, and inferring information—demonstrated by machines, as opposed to intelligence displayed by humans or by other animals.”

 

On the one hand, one can argue that all computer applications, and even many mechanical or electrical machines, exhibit artificial intelligence, in that they independently perform tasks that mimic human tasks or behavior.  This type of artificial intelligence is confined to specific tasks and limited by the set of instructions that a human being has programmed into the computer.  The tasks can be quite complicated, but the machine is constrained to act in a previously defined manner.

 

But when computer scientists talk about AI, they are more likely talking about systems and software that can ‘learn’ to perform a task rather than merely perform it.  These learning systems rely on massive amounts of data to adapt their capabilities, just as we humans require years of training to learn to understand, speak, move, and reason.  How closely these machine-learning algorithms mimic our own brains is difficult to say, as we are still in the early days of understanding the human brain’s inner workings.

 

The point of these machine-learning systems is to enable computers to perform tasks that would be impossible for earlier pre-programmed systems – for example, recognizing objects or faces, driving cars, or creating novel content or images.  The learning systems are essentially writing their own code, or at least adapting it, through a process of trial and error.
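
As a minimal illustration of that trial-and-error adaptation (a toy sketch with invented parameters, not a depiction of any real AI system), consider a program that adjusts its own guess from feedback rather than following a fixed rule written by a programmer:

```python
import random

# Toy 'learning by trial and error': the program adapts a parameter
# based on feedback instead of executing a fixed, pre-written rule.

hidden_target = 7.3  # the goal, unknown to the learner

def feedback(guess: float) -> float:
    """Error signal: how far the guess is from the hidden target."""
    return abs(guess - hidden_target)

random.seed(0)
guess = random.uniform(0, 10)  # start with a random guess
step = 1.0
for trial in range(50):
    candidate = guess + random.uniform(-step, step)  # try a variation
    if feedback(candidate) < feedback(guess):        # keep changes that help
        guess = candidate
        step *= 0.95  # refine more cautiously as the guess improves

print(f"Learned estimate: {guess:.2f} (hidden target was {hidden_target})")
```

No programmer specified the answer; the program converges on it by consuming feedback, which is the essential difference from the fixed-instruction software described above.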

 

The ability of learning systems to adapt and change has both advantages and disadvantages.  The advantage is obvious: they can accomplish far more than their fixed-program predecessors.  The disadvantage – and the thing that provokes understandable fear and hysteria in the media, and even among many pioneers of this technology – is that once these programs are unconstrained, they may do unexpected and even unwanted things.  The danger is that we will lose control of how their software grows and how they ultimately behave.

 

This unpredictability is not a big problem in an AI-driven robot vacuum, but it could be one in autonomous vehicles, drones, robotic soldiers, and, yes, even content-generating applications like ChatGPT.

 

AI ultimately seeks to aid or replace human intelligence with a potentially unbounded and unregulated alternative intelligence.  Genetic engineering, on the other hand, offers the capability to change the human vessel itself.  Though currently held somewhat in check by international agreements, the ability to edit gene sequences and alter genomes tempts us to both repair and ‘optimize’ human beings, with all the inherent uncertainties and risks.  As in all technological advancement, there is potential for both good and evil, and for a whole plethora of unintended consequences.

 

Can future AI and genetic engineering efforts be regulated in such a way as to put controls or curbs in place and ensure that no harmful consequences ensue?  That is the difficult challenge facing the world today.  The genies are already out of the bottle.  If we lose control of them, or, even more sadly, employ them indiscriminately in a mad arms race for power and global dominance, then the apocalypse may be just around the corner.