Four Mennonite Sons


There was, one hundred years ago, a Mennonite family with four sons. They lived near a small rural village on the outskirts of a bustling city, with their three sisters and two parents. Life was simple. They would get up early, milk the cows, then clean the stalls before heading in for a hearty breakfast at mom’s dining room table, then out for fieldwork or to cut firewood. The seasons of planting and harvesting were busy times, but there was always plenty of work year-round. There were community events, almost always involving donated labor, to raise a barn or help some struggling neighbor harvest their crops, but life revolved around the daily chores, tending to the animals, the repetitive cycles of the crops, occasional trips to town, and church attendance.

In their spare time, evenings before going to bed or after dinnertime in the slower seasons, these boys would read. They had a keen interest in history and current events. The books gave them a window into the world beyond the horizon, beyond the slow pace of their agricultural lifestyle: tales of great men making important decisions, tales of war, of how their Anabaptist ancestors had suffered intensely for their faith, stories of missionaries traveling to exotic locations, reports of new technology that promised to change everything. All of it captivated these young men. Their 8th-grade education and sheltered agrarian lifestyle may have left them in wide-eyed wonderment—like the first time they saw that WWI-surplus Curtiss JN-4 “Jenny” biplane flying over the family farm—but this did not make them ignorant or lacking in intelligence.

The eldest son, John, was the spitting image of his father. He had seen the farm grow and had participated in the hard work and toil right from the beginning; this simple lifestyle was as ingrained in his heart as the dirt was ground into his calloused hands. He had his dad’s work ethic, would never complain about physical labor, and had that wiry strength common to farm boys. It is said that once, as a teenager, one of the town boys, seeing this naive Mennonite, tried to pick a fight, even landing a blow, before John gave his antagonist a big bear hug, repeated “I don’t want to fight,” and then put the stunned bully down. That bully would go on to become the mill owner and a friend who would always tell that story, though John would laugh and claim it was exaggerated, a tall tale. John, who had basically inherited his father’s farm, continued to implement new techniques, was very successful and a respected member of the local community, married his sweetheart, and faithfully attended the church of his childhood.

The second oldest, the ever-inventive Henry, found a way to improve a farm implement. He started manufacturing in the shop on his dad’s farm but soon outgrew it and purchased some land nearer to the city, where he built his first factory. By following his passion for business and employing his hardworking heritage, he became very wealthy and could afford to treat his children to luxuries he could not have even imagined at their age. His life was always full of activities, parties, baby showers, vacations (his wife loved the beach, how could he say no?) and, of course, the daily grind of running an industrial production schedule. His life was dominated by the clock, by the calendar of events, the sports teams, politics, etc. He loved technology and one day brought home a brand new cabinet radio/record player that he had purchased at Sears. But, as busy as they were, and despite leaving his father’s old-fashioned church behind, religion still played an important role in the life of his family and he did his best to instill conservative values. His charitable giving (never for attention) made him a noteworthy character, admired among those in need.

Hudson, the third of the sons, said to be named after the famed Protestant missionary to China (although it may have been the automobile of the same name), was the most earnest of the four. One day an evangelist came to town and, despite attendance being discouraged by the church elders, he (with his brothers) was in the audience. The message tugged at his sensitive heart; he rose to his feet shaking, walked the sawdust trail, and had a “born again” experience. Now, truth be told, he had never really been that rebellious. He had carried some terrible guilt about seeing some female peers taking a dip in the pond and spending an extra moment observing, but he had always been a thoughtful, considerate, and conscientious sort. But now, freed from his sin, he was determined to serve. He taught at the newly formed Mennonite high school, eventually became a founding member of Mennonite World Aid, an outreach of the conference created to appease those longing to be missionaries, and even did a stint in post-WW2 Europe. He raised his large family to be Anabaptist (although he saturated them with fundamentalist literature) and was followed everywhere by his adoring, perpetually pregnant wife.

Then there was Clyde. Clyde was the black sheep of the family: he saw John as naive, was not as interested in technology as Henry (other than his camera), and was certainly far more cynical than Hudson. He didn’t have much appreciation for the farm life. He soon concluded that his church was taught by ignorant rubes who got their “ordination” by seeming sincere enough to be nominated and then picking up the right Bible. He at first decided to do the Mennonite missionary thing, though he was more or less there to observe and take pictures, and then headed off to university to satiate his hunger for knowledge. Yet, beneath all of this ‘liberal’ smugness was a compassionate and caring heart. He would go on to write books, people loving to hear about his experience growing up as a traditional Mennonite (although things had really changed significantly before he was old enough to remember), and he was eventually hired as the pastor of the big conference church. Unlike his forebears, he used his pulpit to preach about social issues, encouraging diversity and reprimanding the “ethnic church” for not caring enough about minorities, the poor, victims, etc.

All of the sons remained Mennonite. And yet all, besides John, had dramatically changed what it meant to be Mennonite. Even John’s life became more chaotic and cluttered than his father’s: some of his sons gave up farming (land was too expensive) and worked at their uncle Henry’s factory, others (also smitten by an emotional ‘revival’ preacher) carried out Hudson’s vision, but all remained active in their congregations. Henry’s sons embraced the comforts of modern life; they drove muscle cars, listened to popular music, and were a little wild before settling down. Of course, Hudson’s sons, all home-schooled (a necessity on the mission field), were a mixture of sheltered and exposed; they all thought of their father as a sort of saintly character and were determined to spread ‘Anabaptism’ to the corners of the world. Then there was Clyde’s only child, an avowed feminist, decrying the patriarchy, privilege, and police brutality, basically indistinguishable from the other trust-fund babies who shared his far-leftist views—to him Jesus was basically a political tool, a means to shame his more practical cousins, and a philosopher superseded by Karl Marx.

Nothing about the new generation was the same as it had been for their grandfathers. Horses had long been replaced by tractors, the suburbs had encroached on the farmland inheritance, and the influence of the ‘liberal’ cousins was having an impact on John’s old Mennonite orthodoxy, which had gone unquestioned for decades; more and more switched from farming to carpentry or manufacturing as economic realities pressed into their communities. More of Henry’s and Hudson’s descendants (who still crossed paths as conservative Mennonites) became disenchanted with the status quo: some looked for a livelier worship experience, others, disillusioned by the Protestant influence, started to question the foundation of their religious tradition, some were angry about hidden abuses, and there were special conferences held to discuss the “Anabaptist identity” crisis. The trappings of modern life had slowly but surely crept into their lifestyle; smartphones were prevalent, pornography caused anxiety amongst many, and the austerity of the past would have been appealing if they had the time to stop and think about it.


Dangerous Complexity: What To Do About the Complex Problem of Complexity?


Air travel has become safer than ever, due in large part to the increase in automated systems in the cockpit. However, this advanced technology comes with a downside: an otherwise perfectly functional aircraft (i.e., mechanically sound) with competent operators can be lost because of a small electronic glitch somewhere in the system.

This issue was discussed at length in response to the crash of Air France flight 447, an Airbus A330, in 2009, when a problem with an airspeed indicator and the automated systems led to pilot confusion—which, in the end, resulted in a plunge into the ocean and the loss of all 228 people on board. The pilots were ultimately responsible for not responding in the correct way (they were in a stall and needed to push the nose down to recover lift), and yet the reason for their failure is as complex as the automated systems that were there to help them manage the cockpit.

It is this problem with advanced electronics that is summarized as a “systemic problem with complexity” in the quote below:

One of the more common questions asked in cockpits today is “What’s it doing now?” Robert’s “We don’t understand anything!” was an extreme version of the same. Sarter said, “We now have this systemic problem with complexity, and it does not involve just one manufacturer. I could easily list 10 or more incidents from either manufacturer where the problem was related to automation and confusion. Complexity means you have a large number of subcomponents and they interact in sometimes unexpected ways. Pilots don’t know, because they haven’t experienced the fringe conditions that are built into the system. I was once in a room with five engineers who had been involved in building a particular airplane, and I started asking, ‘Well, how does this or that work?’ And they could not agree on the answers. So I was thinking, If these five engineers cannot agree, the poor pilot, if he ever encounters that particular situation . . . well, good luck.” (“Should Airplanes Be Flying Themselves?,” The Human Factor)

More recently this problem of complexity has come back into focus after a couple of disasters involving Boeing 737 MAX 8 and 9 aircraft. Initial reports have suggested that an automated system on the aircraft malfunctioned—pushing the nose down at low altitude on takeoff as if responding to a stall—with catastrophic consequences.

It could very well be something as simple as one sensor going haywire. It could very well be that everything else on the aircraft is functioning properly except this one small part. If that is the case, it is certainly not something that should bring down an aircraft, and it would not have in years past, when there was an actual direct mechanical linkage between pilot and control surfaces. But now, since automated systems can override pilot inputs and take away some of the intuitive ‘feel’ of things in a cockpit, the possibility is very real that the pilots simply did not have enough time to sift through the possibilities of what was going wrong, diagnose the issue, switch to a manual mode, and prevent disaster.
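To make the concern concrete, here is a minimal, purely hypothetical sketch (ordinary Python, nothing resembling actual avionics software) of how an automated stall-protection routine that trusts a single angle-of-attack sensor could keep overriding the pilot. The threshold and trim values are invented for illustration:

```python
# A minimal, hypothetical sketch (not actual avionics logic) of how an
# automated stall-protection routine that trusts a single angle-of-attack
# sensor could keep overriding the pilot. The threshold and trim values
# are invented for illustration.

STALL_AOA_THRESHOLD = 15.0  # degrees; illustrative value only

def trim_command(aoa_sensor_reading, pilot_trim_input):
    """Return the trim command actually sent to the stabilizer."""
    if aoa_sensor_reading > STALL_AOA_THRESHOLD:
        # The automation believes a stall is imminent and pushes the
        # nose down, regardless of what the pilot is asking for.
        return -2.5
    return pilot_trim_input

# One failed sensor reporting a bogus 40-degree angle of attack is enough:
print(trim_command(aoa_sensor_reading=40.0, pilot_trim_input=+1.0))
# -> -2.5  (the pilot's nose-up input is overridden by the automation)
```

Even in this cartoon version, the pilot’s input never reaches the stabilizer while the bad sensor reading persists, which is the essence of the concern.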

The FAA, following the lead of China and the Europeans, has decided to ground the entire fleet of Boeing 737 MAX 8 and 9 aircraft pending the results of the investigations. This move on the part of regulators will probably be a big inconvenience for air travelers. Nevertheless, after two incidents, and hundreds dead, it is better to take the precaution and get to the bottom of the issue.


President Trump’s off-the-cuff Twitter response, basically stating “the complexity creates danger,” was met with the usual ridicule from those who hate the man and apparently do not understand hyperbole. (It is ironic that some, who likely see themselves as sophisticated, have yet to see through Trump’s putting-it-in-simple-layman’s-terms shtick.) However, technically incorrect is not the same as totally wrong, and there is absolutely nothing ridiculous about the general point being made—there are unique (and unforeseeable) problems that come with complex systems.

The “keep it simple, stupid” mantra (aka the KISS principle) is not without merit in an age when our technology is advancing beyond our ability to control it. If a minor glitch in a system can lead to a major disaster, that is dangerous complexity and a real problem that needs to be addressed. Furthermore, if something as simple as flight can be made incomprehensible, even for a trained professional crew, then imagine the risk when a system is too complicated for humans alone to operate—a nuclear power plant, for example.

Systems too complex for humans to operate?

On the topic of dangerous complexity, I’m reminded of the meltdown of reactor two at Three Mile Island and the series of small human errors leading up to the big event. A few men, who held the fate of a wide swath of central Pennsylvania in their hands, made a few blunders in diagnosing the issue, with serious consequences.

Human operators aren’t even able to comprehend the enormous (and awful) potential of their errors in such circumstances—they cannot fear in proportion to the possible fallout of their actions—let alone respond correctly to the cascade of blaring alarms when things do start to go south:

Perrow concluded that the failure at Three Mile Island was a consequence of the system’s immense complexity. Such modern high-risk systems, he realized, were prone to failures however well they were managed. It was inevitable that they would eventually suffer what he termed a ‘normal accident’. Therefore, he suggested, we might do better to contemplate a radical redesign, or if that was not possible, to abandon such technology entirely. (“In retrospect: Normal accidents“. Nature.)

The system accident (also called the “normal” accident by Yale sociologist Charles Perrow, who wrote a book on the topic) is when a series of minor things go wrong together, or combine in an unexpected way, and eventually lead to a cataclysmic failure. This “unanticipated interaction of multiple factors” is what happened at Three Mile Island. It is called ‘normal’ because people, put in these immensely complex situations, revert to their normal routines and (like a pilot who has the nose of his aircraft inexplicably pitch down on a routine takeoff) lose (or just plain lack) the “narrative thread” necessary to properly respond to an emerging crisis situation.

Such was the case at Three Mile Island. It was not gross misconduct on the part of one person, nor a terrible flaw in the design of the reactor itself, but rather a series of minor issues that led to operator confusion and a number of small mistakes that soon snowballed into something gravely serious. The accident was a result of the complexity of the system and of our difficulty predicting how various factors can interact in ways that lead to failure, and it is something we can expect more of as systems become more and more complex.

And increased automation does not eliminate this problem. No, quite the opposite: it compounds the problem by adding another layer of management that clouds our ability to understand what is going on before it is too late. In other words, with automation, not only do you have the possibility of mechanical failure and human error, but you also have the potential for the automation itself failing, and failing in a way that leaves the human operators too perplexed to sort through the mess of layered systems and unable to respond in time. As the list of interactions between various systems grows, so does the risk of a complex failure.
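Some rough, back-of-the-envelope arithmetic illustrates why. The per-interaction failure probability below is invented purely for illustration, and the sketch assumes every pairwise interaction can fail independently:

```python
# Back-of-the-envelope arithmetic: if every pair of components can interact,
# the number of interactions (and the chance that some interaction misbehaves)
# grows much faster than the number of components. The per-interaction
# failure probability below is invented purely for illustration.

p_single = 0.0001  # assumed chance that any one pairwise interaction misbehaves

def chance_of_some_failure(n_components, p=p_single):
    n_interactions = n_components * (n_components - 1) // 2  # all pairs
    return 1 - (1 - p) ** n_interactions

for n in (10, 100, 1000):
    print(n, round(chance_of_some_failure(n), 3))
# 10 components (45 pairs)        -> ~0.004
# 100 components (4,950 pairs)    -> ~0.39
# 1000 components (~500,000 pairs) -> ~1.0
```

The exact numbers are meaningless; the point is that the interaction count, not the component count, is what drives the risk.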

[As a footnote, nuclear energy is cleaner, safer, and far more reliable than wind and solar farms. And, in the same way that it is safer to fly than to drive, despite perceptions to the contrary, the dangers of nuclear are simply more obvious to the casual observer than those of the alternatives. So, again, given the fierce opposition to nuclear power by those who are unwittingly promoting less effective and more dangerous solutions, the human capacity to make good decisions when faced with the ambiguous problems created by the interaction of various complex systems does certainly come into question.]

Has modern life become dangerously complex?

There is no question that technological advancement has greatly benefited this generation in many ways, and few would really be willing to give up modern conveniences. That said, this change has not come without a cost. I had to think of that reality over the past few weeks while doing a major overhaul of how we manage information at the office and considering how quickly years of work could vanish into thin air. Yes, I suppose that paper files, like the burned Library of Alexandria, are always susceptible to flames or other destructive forces of nature. But at least fire (unlike the infamous “blue screen of death“) is a somewhat predictable phenomenon.

Does anyone know why the Bluetooth in my car syncs up sometimes and not always?

Or why plugging my Android phone into the charger causes my calls in Facebook Messenger to hiccup (i.e., disconnecting and reconnecting multiple times) sometimes but not always?

I’m sure there is a reason hidden somewhere in the code, a failed interaction between several components in the system, but it would take an expert to get to the bottom of the issue. That’s quite a bit different from the times when the problem was the rain and the solution was cutting down a few trees to create a shelter. The same was true in the early days of machines—a somewhat mechanically inclined person could maintain and repair their own automobile. However, the complicating factor of modern electronics has put this do-it-yourself option out of reach for all but the most dedicated mechanics.

Life for this generation has also become exponentially more complex than it was for prior generations, when travel was as fast as your horse and you were watching your crops grow rather than checking your Facebook feed every other minute. It is very easy to be overwhelmed, as individuals, by information overload. The common man is increasingly in over his head in dealing with the technological onslaught. We have become increasingly dependent on technology that we cannot understand ourselves and that fails spontaneously, without warning, at seemingly the most inopportune times.

Advanced modern technology represents a paradigm shift as much as the invention of the automobile was a revolution for personal transportation. We have gone from analog to digital—a change that has opened a whole new realm of possibilities but also comes with a new set of vulnerabilities that go beyond the occasional annoyance of a computer crash. We really have no idea how the complexity of the current system will fare against the next Carrington Event (a solar storm that caused widespread damage and disruption to telegraph systems in 1859—a time of very basic and sturdy technology), nor are we able to foresee the many other potential glitches that could crash the entire system.

It is easy to be lulled into thinking everything will be okay because it has been so far. But that is a false security in a time of complex systems that are extremely sensitive and vulnerable. Like a pilot of a sophisticated airliner failing to comprehend the inputs, or the flustered operators of a nuclear reactor when the alarm bells ring, our civilization may be unable to respond when the complex systems we now rely on fail in an unexpected way that we could not predict. It is not completely unlikely that a relatively small glitch could crash the entire system and lead to a collapse of the current civilization. That is the danger of complexity: having systems that are well beyond our ability to fix should they fail in the right way at the wrong time.

The last human invention will be too complex to control and could be our demise…

Computers far exceed the human capacity to process information. We’ve come a long way from Deep Blue versus Garry Kasparov in the ’90s, and the gap between man and machine has continued to grow wider since our best representatives were surpassed. Yet, while vastly faster in their abilities, computers have long been able to do only what they were programmed to do, and thus their intelligence is limited by the abilities of their human programmers.

However, we are on the cusp of a development in this technology with implications far beyond the finite capacity of the human mind to grasp. We could very soon couple the processing speed of a computer with a problem-solving ability similar to that of a human. Except that, unlike us, who are limited by brain size and relatively slow processing speed, this “machine learning” invention (a video on the progress so far) could continue to expand its own intellectual abilities.
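To illustrate the difference in the simplest possible terms, here is a toy sketch contrasting a rule a human wrote by hand with a rule a machine infers from examples; the numbers and the crude “training” step are made up for illustration and bear no resemblance to real machine-learning systems:

```python
# Toy illustration of "programmed" versus "learned" behavior.
# The threshold and the example data are made up for this sketch.

# 1. The traditional approach: a human writes the rule explicitly.
def programmed_rule(x):
    return x > 100  # the computer does exactly this, and nothing more

# 2. The learning approach: the rule is inferred from labeled examples.
examples = [(20, False), (80, False), (120, True), (200, True)]

def learn_threshold(data):
    # Put the boundary midway between the largest "False" example and the
    # smallest "True" example -- a crude stand-in for real training.
    highest_false = max(x for x, label in data if not label)
    lowest_true = min(x for x, label in data if label)
    return (highest_false + lowest_true) / 2

learned_boundary = learn_threshold(examples)  # -> 100.0, derived from the data

def learned_rule(x):
    return x > learned_boundary

print(programmed_rule(150), learned_rule(150))  # -> True True
```

The second rule was never written by a person; it came out of the examples, and with different examples it would behave differently.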

Machine learning is a massive paradigm shift from the programmed computers we currently use. It would lead to super-intelligence beyond our ability to fathom (literally), and it could no more be stopped by us than we can be controlled by a monkey. Imagine something that is always a hundred steps beyond any scenario we could imagine and has less in common with us (in terms of raw intelligence) than we do with an ant—would it have any reason to treat us any better than we treat bacteria?

There was a time when I would not have believed that artificial intelligence was possible in my lifetime, and a time after that when I would’ve thought it was something we could control. That was naive; artificial intelligence would, at the very least, be unpredictable and almost totally unstoppable once the ball got rolling. It could see us as a curiosity and solve cancer in a few nanoseconds simply because it could—or it could kill us off for basically the same reason. Hopefully, in the latter case, it would see our extermination as not worth the effort and move on to far greater things.

It remains to be seen whether artificial intelligence will solve all of our problems or see us as a problem and remove us from the equation. This is why very intelligent men who love science and technological advancement, like Elon Musk, are fearful. Like the atomic age, it is a Pandora’s box that, once opened, cannot be closed again. But unlike a fission bomb, which is dependent on human operators, this is a technology that could shape a destiny for itself—an invention that could quite possibly make us obsolete, hardly even worth a footnote in history, as it expanded across our planet and into the universe.

Whatever the case, we will soon have an answer…

Neural nets, the key component of artificial super-intelligence, are already here…

In fact, this technology is already in your smartphone, enabling facial recognition and language translation. It also helps you pick a movie on Amazon by predicting what might interest you based on your prior choices.
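At its core, “predicting what might interest you” is just scoring new items against a profile built from prior choices. Here is a toy sketch of that idea (nothing like a production recommender or an actual neural net; the films, genres, and weights are invented):

```python
# A toy sketch of "predicting what might interest you based on prior choices."
# This is nothing like a production recommender (or a neural net); the films,
# genres, and weights are invented purely for illustration.

liked_profile = {"western": 1, "drama": 1}  # genres of films the viewer enjoyed

candidates = {
    "Film A": {"western": 1, "comedy": 1},
    "Film B": {"sci-fi": 1, "horror": 1},
}

def score(profile, item_tags):
    # Higher score = more overlap with what the viewer liked before.
    return sum(profile.get(tag, 0) * weight for tag, weight in item_tags.items())

best_pick = max(candidates, key=lambda name: score(liked_profile, candidates[name]))
print(best_pick)  # -> "Film A", the closest match to the viewer's history
```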

Artificial intelligence technology could be our future. It could be that last invention that can finally manage all of these dangerous complex systems that modern convenience is so dependent upon and allow us to return to our simple pleasures. Or it could be a dangerous complexity in and of itself, something impossible to control, indifferent to our suffering and basically (from a human perspective) the greatest evil we ever face in the moments before it ensures our extinction.

Artificial super-intelligence will be complexity beyond our control, a dangerous complexity, and it comes with risks that are humanly unimaginable. It could either solve all of our problems in dealing with disease and the complexity of our current technology—or it could make our woes exponentially greater and erase our civilization from the universe in the same way we apply an antibiotic to a pathogen. It is not ridiculous or absurd to think a little about the consequences before flipping the “on” switch of our last invention.

Should we think about simplifying our lives?

It is important, while we still reign supreme as the most inventive, intelligent and complex creatures on this planet, that we consider where our current trajectory will lead. Technological advancement has offered us unique advantages over previous generations but has also exposed us to unique stresses and incredible risks as well. Through technology, we have gained the ability to go to the moon and also to destroy all life on this planet with the push of a button.

Our technologies have always come as two-edged swords, with a good side and a bad side. Discovering how to use fire, for example, provided us with warmth on a winter night and eventually internal combustion engines, but fire has often escaped our containment, destroyed our property, cost countless lives, and created air pollution. Rocks, likewise, became useful tools in our hands and increased our productivity in dramatic fashion, but they also became a means to bash in the skulls of other humans. For every positive development, there seem to be corresponding negative consequences, and automation has proved to be no different.

The dramatic changes of the past century will likely seem small by comparison to what is coming next, and there really is no way to be adequately prepared. Normal people can barely keep up with the increased complexity of our time as it is, and we are already being manipulated by our own devices—scammers use our technology against us (soon spoof callers, using neural networks, will be able to perfectly mimic your voice or that of a loved one for any nefarious purpose they can imagine), and it is likely big corporations will continue to do the same. Most of us will only fall further behind as our human weaknesses are easily exploited by computer algorithms and artificial intelligence.

It would be nice to have the option to reconsider our decisions of the past few decades. Alas, this flight has already departed; we have no choice but to continue forward, hope for the best, and prepare for the worst. We really do need to weigh, along with the benefits, the potential cost of our increased dependence on complex systems and automation. And there is good reason to think (as individuals and also as a civilization) about the value of simplifying our lives. It is not regressive or wrong to hold back a little on complexity and go with what is simple, tried, and true.