Understanding the “Other”

I was in the 6th grade, in a modern Orthodox Jewish Day school. The Hebrew teacher got very angry at one of the pupils and slapped him in the face – not altogether uncommon back in the early 1960s when “light” corporal punishment was handed out in schools. But the pupil’s reaction was certainly uncommon: he punched the teacher back – in the stomach! And then he followed that with (in Hebrew): “don’t you ever touch me again!” (And the teacher never did.)

It goes without saying that we were all shocked back then by our “colleague’s” response. I’ve thought about it over the years (the only “violence” I ever witnessed in my entire educational experience): how and why did he have the gumption to react like that? My conclusion: he was born and raised in Israel (his family had come to the States not long before that event). Over “there,” no one takes “shit” from anyone else, whether age 12 or 52.

As a dual citizen, born and raised in the U.S., but living most of my life in Israel (with occasional sabbatical years back in the States), I am eminently qualified to compare these two allied countries that have very dissimilar cultures. My purpose here, however, is not an academic exercise in cross-cultural comparison, but to note the difficulty that certain cultures have in “understanding” others. This is especially true of America.

First, some facts. It’s banal to note that the U.S. exists on a continent several thousands of miles away from Europe on one side and the Pacific on the other side. But this has had profound consequences for the American mentality. For most of its history that meant little international warfare and political isolationism (George Washington’s Farewell Address is the prime example, where he warned Americans against getting involved in European politics and wars). That macro approach has filtered down deeply into the micro-level as well. Only 42% of Americans have a passport – up from 32% in 2009, when for the first time they needed one to get into Canada or Mexico. Compare that with another “even-more-distant-over-the-ocean” Anglo country, Australia, with 57%.

I could not find any similar data on Israel regarding passports, but in 2019 (the last pre-Corona year), an astounding 4.3 million Israelis went overseas – almost half the country’s population (and half of them went overseas more than once). Clearly, the percentage of Israeli passport holders is well over 50%. From a macro standpoint, Israel is the mirror image of the U.S., with constant foreign travel. In part, this might also be due to the fact that a majority of Israel’s Jewish population was either born overseas or is the child of overseas-born parents.

What does this all mean? First, xenophobia is much higher in the U.S. than in Israel. I would venture a guess that most American “white nationalists” do not hold foreign passports, and thus have not really had (nor do they want) any significant exposure to “others.” Israel has some xenophobic racism, but nowhere near the magnitude found in the U.S. Second, and more important for the world scene, American foreign policy – when it does decide to enter world politics, especially post-WW2 – tries to mold the rest of the world in its own image. Democracy in Afghanistan? That U.S. debacle is but the latest result of a profound syndrome in which even (most) American foreign policymakers see the world through the prism of their country’s culture and creed. It goes without saying, though, that American “exceptionalism” not only holds within it the benefits of a liberal, democratic order, but also the disadvantage of cultural blinders: a failure to see that others do not have the same history, culture, or even aspirations that America has.

This is particularly perplexing for American and Israeli Jewry. On the face of it, one would expect Jews to think and behave about the same everywhere. However, after seventy-plus years of incessant conflict – in a mostly hostile regional neighborhood – Israelis have developed a culture, not only politically but also on the micro-personal level, that is worlds apart from their American (and western European) counterparts. American Jews cannot understand why Israelis are so “belligerent” towards their immediate neighbors, the Palestinians (forgetting that Israel had no problem signing peace treaties with several former enemies). Even on a personal level, the gap is wide. For example, few Israelis residing in America feel comfortable or socialize with native co-religionists for all sorts of cultural-behavioral reasons – and vice versa.

Which brings me back to the school incident. Was the teacher right in slapping the boy? No. Was the boy right in punching our teacher? Again, no. Two noes don’t make a right, of course – which is exactly the point. Jewish history is full of examples where Jews from around the world didn’t get along because of the widely disparate cultures they came from. To take but one egregious example from two centuries ago: upper class Sephardi Jews asked Napoleon not to “emancipate” (i.e., grant wider civil rights to) their poorer Ashkenazi Jewish counterparts.

The best we can hope for, indeed strive for, is greater toleration of the “other” (Jew and non-Jew alike) and even some appreciation of the fact that any culture tends not to be “superior” (or inferior) to most others – just humanly different.

Cancel Culture: A Serious Satire

For my PhD studies at Harvard, I had to choose four “sub-fields” for my oral exams. Among the four, I selected Modern Jewish History. I then took a course with one of the most dynamic teachers I have ever had the privilege of hearing: Prof. Yosef Yerushalmi. His most famous book (the core of our course) was “Zakhor: Jewish History and Jewish Memory.”

It strikes me today, in our era of “Cancel Culture,” that the Jewish approach to history is diametrically opposite: we are commanded to remember the past. This is not to say that Jews glorify anything heinous in the past. Quite the opposite! The Bible commands us to actively remember the past, especially if immoral or otherwise disastrous. Other than the High Holidays, that’s the basis of almost all Jewish holidays and fast days: Passover (freedom after centuries of slavery), Purim (Haman), Hanukkah (assimilation), Tisha B’av (Temple destruction, twice), 17th of Tammuz (breaching Jerusalem’s walls), the Fast of Gedaliah (assassination of the Jewish viceroy), and so on. Even more germane is the commandment to publicly remember what Amalek did to the Israelites in the desert – a passage read every pre-Purim Sabbath. Indeed, this passage is particularly “paradoxical” because it demands that Jews “erase the memory of Amalek” by recounting every year what they did way back then! In other words, the Jewish approach to “canceling” is remembering!!

In that latter spirit, and as a protest (also quintessentially Jewish) against Cancel Culture, I decided this time to do something very different: pen a quasi-serious satire of Cancel Culture, taking place in the relatively distant future, but looking back at the present. Call it a “future memorial” if you wish.

**************


October 2071

“Hi Gramps. How did your genetification treatment go today?”

“Pretty well. Instead of 75, my biocell-age is down to 52. Should be able to run the marathon next week in under two and a half hours. How’s college going?”

“Actually, better than I thought. We spend half our time outside the campus.”

“Virtual classes from afar?”

“That’s a good one! No, we’re into historical justice activism.”

“Historical justice? What’s that? In my day, we were into social justice.”

“Well, it’s almost the same thing. But instead of fixing the contemporary system, we are pointing fingers at past injustices.”

“Interesting. But how does that help us change things today?”

“If we let the historicriminals of the past maintain their visibility, we can’t deal with the foundations of today’s injustices.”

“Did you say historicriminals? Is that one word or two?”

“Wow, gramps, don’t you receive any BrainBook news these days?”

“BrainBook. What’s that?”

“It’s the relatively new way of social mediating – based on Machine-to-Brain communication. Sort of a neuronal Facebook.”

“Sorry, too newfangled for me. I’ll stick to reading books. Anyway, the news is too depressing; haven’t gotten any in a few years. Much better for my blood pressure.”

“Have it your way. In any case, historicriminality is all about important people in the past who influenced or caused serious harm to society or lots of individuals.”

“For example?”

“Last week our class did some serious research about the major meat producers. A few of my friends didn’t even last an hour. Can you imagine that people once killed animals and then ate them? How gross and cruel can you get? When we got to augmediated clips of early 21st century slaughterhouses, one acquaintance fainted, and a few others couldn’t continue.”

“What was the purpose of that research?”

“To go through all Wokepedia items that covered such murderous companies and their founders or major corporate managers.”

“And then?”

“And then to Cancel all the ones we could find.”

“Cancel?”

“Come on, gramps. You may be a bit old, but you’re not senile. From what I understand, the idea of Cancel Culture started when you were about my age. We can’t allow historicriminals to be glorified or even memorialized. That just preserves and continues the long chain of immorality.”

“So you just say ‘delete’ and Wokepedia erases these people?”

“It’s not that simple. We’ve got to show that each of these individuals caused great suffering or evil. But with universal genemeat consumption today, that’s a lot easier. Public opinion is on our side.”

“Anything else that’s harder?”

“You bet!”

“Like what?”

“Jesus.”

“Excuse me? Did you say Jesus? What did he do wrong?”

“Gramps, do you know how many Jews, Moslems and other non-believers were tortured and murdered by the Church as an institution, not to mention Christian believers who took matters into their own hands without Church encouragement?”

“So Jesus was to blame for all that? Don’t tell me that you…what did you call it… ah yes, Canceled him?”

“In principle, we could have. Your generation did a great job eliminating Columbus, so why not Jesus?”

“Why not, indeed.”

“Well, here’s a surprise for you. At first, we thought to do exactly that. But you know, college is for learning stuff. We discovered that Jesus wasn’t at fault at all – in fact, he wasn’t even a Christian!”

“You don’t say! A ‘surprise’, indeed.”

“That sounded like sarcasm, Gramps. That’s not like you.”

“Sorry. There are some things that old fogies like us are aware of, that your generation seems to be ‘discovering’ anew.”

“In any case, turns out that Jesus was Jewish to the very end. It was one of his students—”

“Disciples.”

“Right, disciples. It was the one called Paul who really started Christianity as a religion.”

“So, what did you people do then?”

“Well, there was a huge battle to Cancel city names with ‘Paul’ in them. Minnesotans put up a furious fight. We lost that one.”

“So ‘Paul’ is saved?”

“Are you kidding? Of course not! We put the name Paul on our Cancel baby list.”

“Baby list?”

“The baby name list. Once a name gets put on that list, almost no one will call their child by that name.”

“Sounds pretty draconian to me.”

“Gramps, don’t be such a hypocrite. When you were growing up, how many American or European kids were given the name Adolf?”

“So St. Paul and Adolf Hitler get the same treatment?”

“Not yet. We’ve succeeded in Canceling the word Hitler from all U.S. education textbooks. That probably won’t happen with Paul.”

“You can’t win ‘em all, I guess.”

“Right. But we’ve scored quite a number of major successes.”

“Like what? Or should I say: like who?”

“Well, if we got rid of Hitler, we had to do the same with Stalin and Mao. You know, some historians argue that both of them led to more people dying than Hitler! Scary.”

“China and Russia went along with that?”

“China, yes. Ever since the Third Revolution, they are willing to join in expunging historicriminals from their records too.”

“But Russia not?”

“Ever since Putin’s grandson took over, their history has become sacrosanct.”

“I wasn’t aware he had a grandson.”

“See what you could learn if you’d be on BrainBook? Putin’s son-in-law was Kirill Shamalov, and his son is Shamalov, Jr. But we prefer to call him shame, no love.”

“That’s a good one.”

“Thanks. Who says we historic justicians don’t have a sense of humor?”

“Not me. Any other campaigns you’ve been part of?”

“Yeah, the biggest one of all – climate warming deniers. What a massive project! You wouldn’t believe how many climate deniers there were.”

“I sure would. Remember, I lived much of my early life in that atmosphere.”

“Good pun, Gramps!”

“Unintended, I assure you. Who’s the leading historicriminal of this campaign?”

“Well, we had so many names that it took us a long time to decide who was disworthy, and on the other hand who we ought to leave alone.”

“Disworthy? Is that a new term?”

“Not at all. Boy, you really haven’t been following the news for a long time. It’s a mashup meaning whoever is worthy of dissing.”

“And the disworthy winner was?”

“President Trump. It wasn’t that he was any more forceful in his denial than others. But he came much later than most of the others, at a time when the evidence was already overwhelming. And despite that, he not only stuck to his guns on the issue but also pushed his government agencies to allow for more pollution. Can’t get more disworthy than that.”

“Must have been easy to Cancel him all over. One of the history books I read recently called him the worst U.S. President. Not sure how they measured that, but it seems you guys weren’t the only ones on his case.”

“Gramps, please don’t use that expression.”

“On his case?”

“No. You guys. Language has to be gender-neutral.”

“OK, sorry again. Bad habits die hard.”

“Sure. But that’s why we’re cleaning up history. To clear the air: language, people, movements – everything and everyone that doesn’t deserve to be remembered.”

“What’s next? You’ve got two more years of college.”

“Are you kidding? The list is almost endless. Two years won’t be enough to Cancel everything disworthy.”

“Almost endless? Just give me another example or two. That will suffice.”

“Across the country, our college movement will be taking courses next semester to Cancel warmongers.”

“Warmongers?”

“Yeah, those leaders who pushed their nations into unnecessary wars. You should know – some of them lived close to your generation: LBJ in Vietnam, George W. Bush in Afghanistan and Iraq. They weren’t as bad as Hitler and Stalin, but still their actions were unconscionable.”

“I suppose you’re also going after some non-Americans too. Like Théoneste Bagosora.”

“Who?”

“Too complicated to explain for now. He’s African – from a country called Rwanda. You should… what did you call it?”

“BrainBook it?”

“Right. You rely on the brain; I’ll rely on the book. Anyway, what he did is enough material for another semester’s Cancel work.”

“OK, sounds worthwhile.”

“You mean disworthwhile.”

“Are you trying to be funny? This is serious stuff we’re doing.”

“I agree. It’s very serious.”

“By the way, this Rwanda fellow. Which other country did he attack?”

“His own.”

“His own country?”

“His own countrymen…I mean countrypeople. You know, sometimes civil wars are far worse than wars between nations.”

“Really? I wasn’t aware of that.”

“That’s what history is for. So, after you’re done with the warmongers – what then?”

“For next year, we’ve convinced the Union of American colleges to offer Cancel Courses on the anti-vaxxers. Crazy movement. Hard to believe that people won’t learn from history.”

“Learn from history?”

“Sure. Until the 19th century, hundreds of millions of people died from cholera, smallpox, bubonic plague, typhoid and other infectious diseases. By the late 20th century the vaccines almost entirely eliminated those diseases. Not to mention the mutating Corona viruses this century – finally got that under vaccine control a few decades ago.”

“You forgot this century’s malaria and AIDS vaccines.”

“Didn’t forget them. I could add the universal flu vaccine and a few others. But you get the idea, Gramps. And despite all that, there are still people out there who are against vaccination!”

“I suppose you’re right. I think we can both agree that remembering the details of history is important.”

“Absolutely!”

************

Change Culture; Don’t Cancel History

In 2008 I was teaching at Brown University in Providence, Rhode Island, during my sabbatical year. One course was a research seminar entitled “Zionist Intellectual History.” I had nine students – seven Jewish, one a Catholic, and one a Native American. I invited them for an erev Shabbat dinner at our home, where they could relax with their “professor.” During the meal, I asked the Native American student (who happened to be the best one in the class) why he was taking such a course. His answer floored me: “My goal in life is to help my people, but they are not at all united; I wanted to understand how Herzl united the Jews and started them on the way to their own sovereign state!”

I was doubly impressed. First, by his lofty (if probably unattainable) goal of uniting the First Nations into a political force to improve their lot. Second, because here was a student willing to learn from history – and not only his own nation’s. The study of history is not normally high on the list of things that today’s students are interested in learning about. Which brings me to what happened next…

A couple of weeks later, he approached me and noted that he couldn’t make the next week’s class. I asked why. “I am leading a protest against Columbus Day,” came his reply. Once again, I was taken aback. 

What he (and his fellow protesters) obviously hadn’t taken into account was the fact that Rhode Island had the proportionally largest Italian-American population of all fifty states! To put it mildly, the protest was not very popular.

That was then; today there’s “Indigenous Peoples’ Day” as an alternative to Columbus Day (each state or city can take their pick – or celebrate both). My question is whether this will stay as a double commemoration, or whether we are halfway to the cancellation of Columbus Day. To be clear, this is but the tip of the iceberg; American society is steaming toward a Titanic-style collision with it in many walks of life, under the banner of “Cancel Culture.”

The problem is not that progressives want to be sure that we remember the dark sides of our history (whatever nation we belong to) – that is not only worthy but also socially cathartic. The problem starts when they want to delete history, e.g., tear down statues, expunge historical people from the history books, etc. Replacing one “narrative” with another unidimensional counter-narrative is illogical and even self-defeating. Illogical, because you cannot argue that history is multi-dimensional and then go ahead and replace it with a different, unidimensional history. Self-defeating, because if we expunge the darker sides of history, we lose the ability to properly learn from those past mistakes. As the Spanish-American philosopher George Santayana opined: “Those who cannot remember the past are condemned to repeat it.”

And there’s another negative aspect to Cancel Culture: the feeling of moral superiority and lack of historical perspective. Cancel Culture progressives feel that their stance not only takes the ethical high ground, but that it is the only high ground to take. There are no other ideas permissible or even thinkable. This in turn leads to a lack of self-reflection: will anything they stand for today actually hold up in the future, or will their progeny look back with dismay at their supercilious approach to social morality?

Does this mean that we shouldn’t get rid of any historical statues (or similarly cancel other historical mementos)? Not at all. The basic criterion has to be what was considered acceptable at the time – not what we anachronistically evaluate through the prism of contemporary mores. Otherwise, we have to cancel the Hebrew Bible (slavery was sanctioned!), the entire Hellenistic world (Greek women were not allowed to vote in the Agora!), the Catholic Church (supporting the Inquisition!), and the list goes on (potentially endlessly).

I’m all for Indigenous Peoples’ Day. But it should be a celebration of First Nation culture and history, and not a cudgel against European Colonizers and their supposed genocide (95% of Native Americans died from European plagues to which they had no immunity, not in warfare with the Europeans). Similarly, Columbus Day is an excellent opportunity not only to highlight the travails of the early settlers trying to tame a harsh physical environment but also to note what such colonization ultimately meant for Native American society.

No one can change the past, but we can influence the future. However, that will only succeed if we commemorate and remember all of history. The good, the bad, and yes – even the ugly.

Changing One’s Own Behavior

“Everyone thinks of changing the world,
but no one thinks of changing himself.”

— CAL THOMAS

Although I look much like my mother, and seem to have her healthy genes too, when it comes to personality – especially temperament – I take after my father’s side of the family: fiery. Alongside many positive traits, the Wilzigs have a temper. So did I. Which leads to my question: can one change one’s own basic personality and behavior?

Neuroscientific research in the past few decades has come to quite a clear conclusion: we have great brain plasticity, i.e., the brain can change in response to external challenges; we can also change our brains by practicing. For instance, the area of the brain connecting the two hemispheres – the corpus callosum – is larger among musicians than the average non-musician. The causation runs “practice → larger corpus callosum,” and not that people born with a larger corpus callosum tend to become musicians.

Does this apply to personality and general behavior as well? Not so much regarding the first, but certainly yes for the second. The kind of people we are seems to be relatively “hard-wired”: introvert vs. extrovert, adventurous vs home-body, etc. However, none of this means that with some effort we can’t “overcome” some personal trait that we (or others around us) find unbecoming or problematic. To put it in “philosophical” terms: just because we do not have complete free will does not mean that we are slaves to our essential being.

However, in such cases, precisely because we are attempting to do something that goes against our innate personality, it takes a lot of work to change a habitual behavioral trait or pattern. And motivation.

At the relatively young age of 57 my father suffered a massive heart attack and passed away half a year later. His sister, my aunt Rosa (a warm, wonderful woman, but another “yeller”) also died relatively early – and my father’s brother Freddy (the biggest yeller of all, but with a big heart) also didn’t live past his early 70s. As I approached my 50s, I began to think about the connection. Type A personalities (intensely temperamental) don’t have the longest lifespans. Could I do anything about this?

This was motivation enough, even if it was somewhat selfish, as the other major problem with a temper is that it makes people living around you very uncomfortable – and the bigger/more frequent the temper, the tenser family (and work) life becomes. I did think about this as well, but being brutally honest (in retrospect), it was self-preservation that constituted the main push to change.

In any case, I figuratively “sat down with myself” and made a conscious effort to control my temper whenever something (in the past, LOTS of things) would get my goat. It took a while. In fact, I found this not too much different than “practicing” other more mundane things in life, like shooting a basketball from the foul line, or improving my writing skills. I tried different approaches, keeping those that worked best for me. After a few years (yes, YEARS), I had succeeded in cutting down my temper outbursts by about 80% – to the extent that one day my son Boaz actually asked my wife Tami: “what happened to Abba? He hardly yells anymore…”

Should I have tried to do this earlier in life? Absolutely. Could I have succeeded earlier? I doubt it. Successful self-change can really come about only after a certain amount of “living life” (some call this “maturity” – but that’s not right because maturity is the outcome and not the driver of change; perhaps “aging” is better).

I didn’t intend (or think about) this but there are ancillary benefits to such change. First, success begets success. If we succeed in changing one problematic aspect of our behavior, it then becomes more likely that we will attempt to change another one as well. That doesn’t mean we will always succeed (I have had no success in stopping my finger-picking habit), but even trying to change can be a salutary enterprise. Second, not only will such change improve us personally, but as noted above it will also benefit our loved ones who have had to “put up” with the problem. And if they are happier, then we become even more satisfied with our personal effort.

I’ll conclude with what really is the hardest part of this whole “project”: admitting to ourselves in the first place that we have a serious personality flaw! Looking in the mirror is not for the weak-hearted (or much fun) but it’s crucial for self-growth. Once we get past that emotional obstacle, the rest of the self-change project is almost easy by comparison. In short, the expression “physician, heal thyself” is intended for all of us to rectify our basest behavior.

“Most of us are about as eager to be changed as we were to be born,
and go through our changes in a similar state of shock.”

— JAMES BALDWIN

The Process Should Be the Product

I was accepted to an experimental program for my freshman year at City College of New York, with our “campus” in the CUNY Graduate Center. We had the very best lecturers, one of whom taught English Composition: FIVE days a week for the entire academic year!

 You probably think that there can’t be a worse college experience than that. I certainly did. In fact, it was quite painful – but “boring” certainly not. Indeed, looking back on this after five decades, it probably was the most important course I took in my whole academic career – not merely for the skills it gave me, but for its life lessons as well.

On the very first day of class, Mr. Gordon C. Lea (a well-tanned Brit) asked us to sit there and write a two-page essay on anything we chose, as long as it contained an argument for or against something. We handed them in before the class was over. By the very next day, he had already marked all thirty essays and handed them back in class, one student after another in alphabetical order. Being a “Wilzig,” I was going to be one of the last to receive mine. As the “returns” went along I began to hear sobbing in the classroom; several students were silently tearing up and others had gone white in the face. With growing apprehension, I waited for mine to be returned. 

I had always known how to write. I liked words, I had a “Germanic” disposition for grammar, and my elementary school and high school made us sweat “book reports.” Moreover, I was chosen to be one of the editors of my high school yearbook…

“Wilzig!” he called out. I went up to Mr. Lea and took my paper without peeking at it until I returned to my seat, although out of the corner of my eye, I could see a lot of red markings. Then I looked directly: the grade was C-! I was shocked. Later I discovered that this was the second-best grade in the class!!

Here was the first life lesson: failure is a relative matter. Relative to one’s expectations (subjective), and also relative to what others have done (objective). Which is more important? As we go through life, most of us tend to focus on the objective aspect: how did we do compared to others? Where does this leave me on the social (or professional) totem pole? But that’s not the way to go through life, because we can never be the “best” at anything, or (in almost all cases) even close to the best. Yes, it pays to have an external benchmark, but this should be set by what each of us is capable of reaching. In short, success in life has to be based on some inner-directed criterion.

“Most of you came here thinking that you knew how to write. Hopefully, now you understand otherwise. But…” Mr. Lea stopped for dramatic effect, “by the time this course is over, you WILL know how to write – as long as you put in the effort. And I’m going to give you an incentive for that: your final grade will not be an average of all your paper grades throughout the course but rather it will be based on your degree of IMPROVEMENT from now to then.” Hearing this, I didn’t know whether to be sad or happy: sad, because as one of the “top” scorers, I would have less “improvement” to make; happy, because I was closer to an “A” than the others.

That was life lesson number two: in the end, success should be measured more by the process, by the effort, than by the final product. Of course, if we finish the “job” with failure, that’s not enough; however, we shouldn’t measure our success only by what we accomplished, but also (primarily?) by how well we did relative to our ability.

As the year went along, almost everyone in class made great strides in improving their writing skills. At the end, we arrived at the final exam – only to have one last surprise waiting for us. The main essay question was this: “In 500 words or less, make an argument to convince me [the teacher] that you should receive the course grade that you think you deserve.”

Life lesson number three: technical skills are useless without accompanying cognitive skills – and even more so, without a clear goal. I am sure that when you read “English Composition” at the start of this essay, you said to yourself: grammar, spelling, punctuation… Of course, those rudimentary skills are the foundation for good writing, but far from enough. On top of that, we need three additional elements: the ability to organize our thoughts into a meaningful whole (“composition”), the ability to think logically and persuasively (“rhetoric”), and the ability to know what you want to achieve, i.e., where you want to “go” with what you’re writing (“goal-orientation”). In fact, these should be undertaken in reverse order: First, where am I heading here? Second, how can I organize the argument? Third, what specifically should I argue and how do I put it into words?

I argued in my essay that I deserved an “A,” and that’s what he gave me for the course. I was more pleased by this “A” than any other I received in college precisely because I had to “sweat bullets” to achieve it.

That was the main (fourth) life lesson I “processed” successfully: no pain, no gain…

Without God (2)

As a practicing social scientist, I eagerly devour the latest research on religion and…war; and also religion and…peace. Although academics are used to some ambiguity in research results (after all, humans don’t behave predictably like most atoms), on the issue of the relationship between religion and violence the results and “opinions” are all over the place.

After last week’s post on individual religious belief, the time has come to tackle some broader, societal question(s).

First, notice the word “relationship” above. There might well be some connection between religion and violence/war, but that doesn’t in any way “prove” that the first causes the second. As the truism goes: “correlation is not causality.” One can think of a huge number of leaders who use religion as a (false) basis for going to war, e.g., Saddam Hussein. The power-hungry will turn to any useful “ideology” to further their own ambitions.

Second, it is also clear that if religion does cause people to go to war (or act violently), it certainly isn’t the only “ism” to do that. In the 20th century, Marxism (Stalin and Mao) killed far more people than all the religion-based wars put together (admittedly, in Stalin’s case, Marxism was a cover for pure self-aggrandizement; Mao, though, probably believed in Marxism). Not to mention Hitler, whose war-craze was partly based on anti-religion (expunging Judaism and Jews). So even if religion can be said to be an important factor in world history for causing war – and that is correct only from Christianity onwards; in the BCE era, religion per se almost never played a part in international warfare, although it had some influence on civil wars e.g., Jews vs. local idol worshipping Canaanites – it is certainly not the only, or even the major cause.

Having said all that, we do live in an era of intense religious strife – today mostly inter-religious, i.e., between different religions (Moslem vs. Christian), instead of what previously was the dominant paradigm: intra-religious strife (Catholic vs. Protestant), along with Christian vs. Jew. The intra-religious kind still exists: Sunni vs. Shiite Islam (located mostly in the Middle East and northern Africa).

With all that, why are so many people in the world still religious? Well, first of all it has to be noted that the vast majority of them are peaceful, so it’s not as if “religious belief” automatically drives people to violence; there is no evidence for that whatsoever. Just as a relative handful of secular extremist ideologues (Marxists, Libertarians, etc.) use violence to further their ideals, so too does a similarly small proportion of religious extremists give religion a bad name.

The answer to religion’s continuing popularity is that it provides several advantages from a societal standpoint. First, it turns out that on average, regular worshippers have an added few years of lifespan! Speaking as a social scientist, that’s not because God is looking out for them, but rather because of the health benefits (mental and physical) of sociality and communality. Going to synagogue, church, mosque, or temple on a regular basis brings people together. If the number one killer of older people is “loneliness” (so great a problem that both Great Britain and Japan have established a Minister for Loneliness!), then clearly regular, communal religious gatherings are going to alleviate that critical problem, whatever the religion or level of belief.

Second, as I discussed in my previous post, another possible factor in increased lifespan is the believer’s reduction of existential angst; we know how much “stress” in general can cause illness and general bodily malfunction. Thus, instead of “rage, rage against the dying of the light” (Dylan Thomas’s immortal verse – pun intended), a true believer can face the eventual prospect of death with greater tranquility.

Third, religion has given humanity most of its moral code – or at least has provided a strong underpinning to buttress Homo Sapiens’ “natural” moral tendencies. It’s one thing to fear the government’s threat of punishment for transgressions, but those police powers cannot be everywhere; for true believers, God is everywhere and sees everything, so that further encourages rightful behavior. Having said that, the latest data clearly show something “peculiar”: the countries with the lowest levels of criminality and violence are the most secular (e.g., Scandinavia); the ones with the highest levels of corruption are the most religious (Middle East and parts of Africa). Of course, that might have nothing to do with religion per se, but rather a function of socio-economic level (that is also negatively correlated with religion – see the next paragraph).

Which brings us to the social downsides of religiosity – at least from a modern standpoint. It can be a stultifying, overly conservative force: maintaining patriarchy, continuing homophobia, undermining personal freedom, and in general leaving the population behind socio-economically. It is not a coincidence that the further one goes from south to north in Europe, the less religious and the wealthier/more socially advanced the countries are. But again, is that because secularism leads to more wealth, or because more wealth (and especially education) leads to less religiosity? Probably more the latter than the former, but there’s no unequivocal evidence either way.

Confusing? That’s precisely the point I started out with: there is no clear, one-way relationship – positive or negative – between religion and positive/negative societal outcomes. Like every other area of life, one factor – as important as it might be – cannot explain or influence the rest of our very complex, social world.

You better believe that!

Without God (1)

Many years ago, as I was putting my (then) 7-year-old, youngest son to bed, I was ready to relate another nightly story. But before I could start, he turned to me with a serious face and said: “Abba, I have something to tell you.” I thought to myself: “here comes a confession of some minor infraction; after all, Avihai didn’t like to lie.”

“What is it?” I asked, with an inner smile.

“God doesn’t exist,” he responded.

To say that I was dumbfounded would be an understatement – by the very thought, and even more so by the fact that a 7-year-old was even considering such a weighty matter!

But then again, why not? We all constantly mention the Almighty – whether in saying “My God!” or “God help you…”. Moreover, who doesn’t think about the big question: “who’s really in charge here?” And for children of religiously observant parents, it’s hard to avoid God altogether.

So let’s return to the substance of what my son said. I’ve been thinking about that for decades and have come to the conclusion (well, one of many on this broad issue) that the ability or tendency to Believe in God (or not) is something that we are born with, i.e., it’s part of people’s nature. Of course, deep in their hearts many are somewhere in the middle: agnostic – not sure that He (or She) exists, but not sure either that an Almighty Being doesn’t exist.

To be sure, this isn’t only a matter of a person’s nature. Socialization is a large part of it as well (family and peer environment) – although that tends to influence the way we express our religion more than the actual belief itself. In short, it’s Nature and Nurture.

However, there’s a third aspect that is less talked about, but in my estimation is central to a person’s level of belief: existential angst. I know, that’s a mouthful. In simpler language this involves two quite different things: 1- the fear of “nothingness” that humans feel regarding what happens after they die; 2- the question of whether there is any rhyme or reason to life (or, for that matter, the universe).

The first fear can be felt at an early age, especially if a young person sees someone in the family (or close friend’s family member) pass away. (In Avihai’s case, it might not be coincidental that he made his declaration quite soon after Israel’s Gulf War in 1991, when everyone in the country was deathly worried about Scuds landing on their home.) This doesn’t mean that everyone – child and adult – will find succor in God’s hands. Many people will not accept a Being that they can’t see, hear, or touch – indeed, one (not One) who’s completely invisible. For a 3-year-old, the “virtual” is real (make-believe characters they converse with); by the age of 7 or thereabouts, the only real is the really real – for them, “make-believe is for babies.”

But for much of humanity, it’s not “make-believe” but rather they really believe. Or should I say, really need to believe. Which brings me to the “war” between atheists and believers. My first reaction to Avihai’s statement was to try and convince him otherwise. But I stopped myself, for if such a young boy can come to such a conclusion by himself (he certainly didn’t get this from our home!), why try at this stage to argue with what is (for him) a very natural conclusion? Yet, there’s a converse lesson here too: why should atheists try to convince religious believers of their “error”?

Here’s why they shouldn’t. Let’s try a “thought experiment.” You are told by a doctor friend with many years of experience as a family GP that for so-called ailments, the best thing she can do is prescribe the “patient” a placebo pill to be taken three times a day. “It’s amazing how many of my patients return after a while thanking me for the great medicine I had them take!”

Do you run off to inform her patients that it is all a scam; that the “medicine” is a sugar pill? Of course not! Why not? Because in fact, it works! Now, for the above doctor’s vignette, substitute “existential angst” for “ailment”, and “God and prayer” for the “placebo pill.” You as an atheist might feel that they are being duped; they themselves feel (and in actuality, receive) relief from their spiritual “ailment.”

Thus, there is nothing so exasperating to me as religious people who try to convince atheists that God exists – and equally maddening, atheists who try to undercut the religious belief of the observant and the worshipper. If both sides are happy with their (un)belief, they should leave the other side alone.

Which leaves open one gigantic question. All this might be fine and dandy on the micro-individual level, but don’t religious belief and conversely atheism have consequences on the macro-societal level? I’ll relate to that question in my next post.

New

In my lifetime, I have gone through several significant changes that in a sense made me start “anew.” Leaving the cloistered world of Jewish Day School education to go to City College; moving to Israel from comfortable America; changing my academic research and teaching discipline mid-career from Political Science to Communications; and so on. Which gets me thinking – as we enter the Jewish NEW Year – about the concept of “new” in our life.


Human beings like to feel comfortable, another word for “habitual.” Radical change – except for those whose lives are truly miserable – is not something sought after. Think of the expression “tried and true.” That doesn’t merely mean that based on past experience it’s the correct thing to do (or that “it works”); it can also mean that what we have done in the past is the “true me.”


Unfortunately, people also tend to get bored doing the same thing over and over again. Assembly-line work is definitely alienating (a la Charlie Chaplin’s hilarious Modern Times scene where he can’t keep up with the objects flying by). Office work can also be mind-numbing. Thus, at some point we need to find something “new.” But that runs the gamut from the trivial to the truly life-altering.


The question for each of us is finding the right balance between doing something new and continuing the tried and true. A lot depends on personality: some are thrill seekers; others, safe and sound bodies. Some are very good at finding the right amount of “new” by themselves; others need some outside push.


Which is where a New Year comes into play. We are well aware that Rosh Hashana or January 1 do not mark something really “new.” They mainly signify starting the same old cycle (“calendar”) all over again. But what they do offer is the opportunity for each of us to really think about whether – and to what extent – we do want to have something “new” in our life.


Once a year is obviously not enough, so we invent other “new-thinking” devices: a birthday; an anniversary. These are what I would call “potential-new”: getting us to consider what could be new the coming year, if we so willed it. Then there are the “already-new” events: engagement party, wedding, housewarming: these symbolize that we have already decided and undertaken to begin something new – but that still leaves the question of how to “manage” this new life.


Overall, there are three main “new” events in our life: marriage; children; retirement. (Of course, they can include some variations: divorce; empty nesting; spouse’s passing.) In each of these, we are never completely ready – or fully cognizant – of what this “new” entails, but we are willing to jump in. However, there are two differences between the first two and the third. First, marriage and children are almost always events that we have control over (excepting shotgun weddings and pregnancy “mistakes”). On the other hand, retirement is largely “forced” on us by law or physical/mental frailty.


That’s the bad news. The second difference is better news: whereas marriage and children restrict our ability to pursue the “new,” retirement opens up a whole world of new opportunities, without many of the life encumbrances we had pre-retirement. For the thrill-seeking types, that’s great; for more conservative individuals it can be a big problem, because after 65-75 years of habitual life it isn’t easy to switch to new types of activity, new outlooks, new ways of looking at our personal world.


The bottom line: we all hope to retire someday (that’s much better than the alternative, except for the “lucky” few who want to, and can, work until they drop). That’s a new situation – but it won’t be very successful unless we prepare ourselves way ahead of time, psychologically and practically (hobbies, interests, etc.). And since we’re already speaking of the Jewish New Year, this coming one also happens to be a once-in-seven-years “Shmitta” year, when traditionally the land lies fallow and we all take a long rest from work. As good a time as any to think not only of the coming year but of future retirement as well, for a successful and fruitful new start.

Perfection(ism)

I am a rehabilitated perfectionist. How did I originally get to be that type of person? Probably because I was born with that personality trait. Or perhaps it’s a result of my German-origin parents (what’s called in Yiddish a “yekke” – but that can degenerate into “yekke-putz”). I can recall only one incident when I was young that might have egged it on: I came home from school one day with a 98 on my math test, and my father asked me: “why didn’t you get 100?” But that was the only time I ever heard that line at home (although it has stuck in my head, so who knows?). “Tiger parents” they were not, although they expected that at least I always put in a good effort.

Perfectionism is a silent “disease” – not anatomical but rather psychological. When we are faced with such a person, it’s also almost always hidden from sight – or perhaps I should say that most of us can’t see it hiding in front of our very eyes. That’s because we tend to look with envy or awe at successful people, or at least those who produce things that are way above what we are capable of. But all that hides the inner turmoil – or at least nagging angst – of the “successful” person. Their “problem” is that they set an impossibly high bar for themselves, and since they can’t really reach it, they get disappointed with what they have produced. That at times can lead them to waste inordinate amounts of time “fixing,” “improving,” “redoing” or other types of “productive procrastinations” that are actually very unproductive.

There are two reasons for such added unproductivity. First, perfectionists can never fulfill their own Olympian expectations. The attempt is akin to Zeno’s Paradox: you can keep on getting closer and closer to your goal, but each “half-increment advance” only brings you that much closer – you don’t ever “arrive.” A second reason, as with almost every other area of life: the “Last Mile” is the most difficult or most expensive (in time or money).

But let’s say that ultimately a perfectionist does succeed in reaching perfection after great effort. What was gained? What was lost? The gain is minimal, if we are to compare the “really good” (even “great”) initial product with the final one of “perfection.” Meanwhile, what is lost is precisely that: WHILE s/he was redoing and refining the product again and again (and again), s/he could have produced all sorts of other very good/great things – worth far more than the minor incremental improvement of that initial, one product.

There are, of course, some exceptions. If we are working on a work of art (fiction, article, computer program, or any other “product” which we hope/expect to last for a very long time), then it does make some sense to take the time to refine over and over. Mozart was notorious for simply dashing off whole symphonies and sonatas with nary a second look – until he worked on his later quartets and quintets (revolutionary for their time), whose scores show a huge number of changes. For understandable reasons, composers in general can be given lots of leeway in their path to musical “perfection.” As the 20th century compositional giant Arnold Schoenberg once explained: “A composer’s most important tool is an eraser.”

Furthermore, none of this is to suggest that we shouldn’t look at our “first draft” as a rough sample of the final product. Some people (“hares”) work straight through in a creative frenzy; they need to go back and carefully polish their work. Other people (“turtles”) slog through an initial creation; their work might be closer to “ready for prime time,” but here they have to consider the totality of what they have done, given how much time elapsed between the beginning part and end section of their work – two poles that might be less connected than warranted. In either case (and other working styles as well), checking and refining is not neurotic perfectionism but rather good work practice. Checking and refining several times over is a problem, probably hiding such a psychological issue.

So how did I “heal” myself from my “disease”? I simply set a hard and fast rule: maximum TWO additional “polishing/proofing” run-throughs. After that, no matter what, I send it off. Therefore, if you find a minor error here and there in this essay, so be it. My “extra” time – as will be yours, whatever you’re doing – is better spent moving on to the next project.

In the final analysis (hopefully, not too much analysis), it behooves all of us to internalize the immortal saying: “perfection is the enemy of accomplishment.”

Solving Problems by Looking Elsewhere

I usually start off a post here with a personal vignette from my life. This time it will be a news item that I read almost 50 years ago (!) – one that obviously made a deep impression.

Back in the 1970s the Federal Aviation Administration set up a committee to find a solution to the (relatively rare) disaster of babies dying while sitting on their parents’ lap during a plane’s takeoff or emergency braking (whereupon the baby would go flying forward). After due deliberation and much expert testimony, the committee announced that its recommendation was… to do NOTHING!

I was taken aback – as you probably are right now. Then I read on. Of course, a few babies’ lives could be saved by requiring parents to buy a reduced-rate ticket for their tot and strap them into a bassinet or whatever. That would lead to zero in-flight deaths. But the committee members went further: what happens when you require a ticket to be bought for the baby? The answer (they had economists figure this out): a certain percentage of people will decide to drive to their destination instead of flying (especially for relatively short, inter-city distances), and then the number of babies dying in automobile accidents will be far higher than the airplane deaths! (As is well known, air transportation – per mile – is the SAFEST mode of travel in the world.)
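
For readers who like to see the logic spelled out, here is a back-of-the-envelope sketch of that expected-fatalities comparison. To be clear: every number below is an invented placeholder (the committee’s actual figures are not reproduced here); only the structure of the trade-off matters.

```python
# Hypothetical expected-fatalities comparison illustrating the committee's logic.
# All numbers are invented placeholders, not data from the actual study.

trips_per_year = 1_000_000       # family air trips with a lap infant (placeholder)
risk_per_flight_trip = 1e-8      # infant fatality risk per flight as a lap child (placeholder)
risk_per_car_trip = 1e-6         # infant fatality risk per comparable car trip (placeholder)
share_switching_to_car = 0.05    # families who would drive rather than buy an extra seat (placeholder)

# Option A: do nothing -- every trip stays on a plane, lap-child risk remains.
deaths_do_nothing = trips_per_year * risk_per_flight_trip

# Option B: require a paid seat -- assume in-flight deaths drop to zero,
# but some families switch to the far riskier car trip.
deaths_require_seat = trips_per_year * share_switching_to_car * risk_per_car_trip

print(f"Expected infant deaths per year, do nothing:     {deaths_do_nothing:.2f}")
print(f"Expected infant deaths per year, require a seat: {deaths_require_seat:.2f}")
# With these placeholder values, the 'safer' rule produces five times the expected
# deaths, because driving is so much riskier per trip than flying.
```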

That was a lesson that I have never forgotten – when faced with a problem, don’t only concentrate on the data within the problem proper, but rather think about the (seemingly irrelevant) “ancillary” elements that could be decisive.

This all came back to me recently in our “post-Corona” situation in which the following question is being bandied about: should companies and organizations force their workers to return to the office, or enable them to continue working from home?

The main elements seem clear cut regarding both options:

Return to the office: a- organizations need an esprit de corps, i.e., a sense of community (or common purpose) that can’t be maintained if everyone is at home or elsewhere; b- managers need to stay on top of workers for guidance and some supervision; c- the “water cooler effect” – serendipitous conversations that lead to insights and breakthroughs – happens only when people are physically together.

Work from home: a- huge savings in rent for the company with far less office space necessary; b- higher worker morale (and productivity?) when workers don’t have to spend lots of time on the road to and from work; c- flexible worktime, something especially advantageous for parents (usually mothers) of young children.

As one can see, these are all directly related to the work situation itself. However, if one takes a wider perspective, then some other considerations bubble to the surface. Here are several.

Saving money? Not necessarily. Electricity bills can skyrocket, with far greater use of computers and other appliances to communicate from afar. And does anyone think that workers will agree to pay for their own increased electricity bills (computers, air-conditioning, etc.)? In fact, many home A/Cs will use far more electricity than one gigantic central A/C at the office. Thus, even without considering the economic cost for company or worker, society as a whole might actually LOSE out here with INCREASED carbon emissions!
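
To make the aggregation argument concrete, here is a similarly rough sketch. Again, all the figures are invented placeholders (real consumption depends on climate, building, and equipment); the point is only how many individually small home loads can add up to more than one shared system.

```python
# Purely illustrative comparison: distributed home cooling vs. one shared office system.
# Every figure below is an invented placeholder, not a measurement.

employees = 200              # workers now at home instead of the office (placeholder)
home_ac_kw = 1.2             # average draw of one home A/C unit, in kW (placeholder)
office_hvac_kw = 120         # draw of the office's single central system, in kW (placeholder)
hours_per_day = 9            # cooling hours during a workday

home_total_kwh = employees * home_ac_kw * hours_per_day   # 2,160 kWh per day
office_total_kwh = office_hvac_kw * hours_per_day         # 1,080 kWh per day

print(f"All-home cooling:       {home_total_kwh:,.0f} kWh/day")
print(f"Central office cooling: {office_total_kwh:,.0f} kWh/day")
# With these placeholders, two hundred small home units burn roughly twice the
# energy of one shared office system -- the company's 'savings' can be society's loss.
```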

And then there’s a possible, unintended tertiary effect. If workers can work from anywhere, many will then move to more “amenable” (for them) places to live. But no company will allow workers never to meet each other face-to-face, so then (here we go again with the airplanes!) we have to factor airfare costs into the company’s “savings” equation, for the quarterly “ingathering” of the worker self-exiles.

Moreover, as we already see with many Silicon Valley employees moving to Colorado and other rural and suburban areas – away from the denser cities – here too society exacerbates the carbon footprint problem, as it is now well documented that city folk use far less “carbon” than their suburban/exurban compatriots. Why so? Not only do city dwellers spend far less on transportation, but their housing carbon footprint (urban apartment buildings vs. suburban one-family homes) is much lower as well. So society loses out again…

Improved worker morale? Not for everyone. Many (perhaps most) people need to get away from “the house” as a break from the humdrum of everyday life. Others are “physical presence” type of people who need the visceral immediacy of “corporeal” interaction, so that meeting virtually might actually be depressing, not morale-boosting. And still others have marriages that are holding on by a thread mainly because one spouse (or both) are out of the house for many hours. (Which is why divorce rates spike immediately after retirement.) Then there are those who like the clear differentiation between private life and workplace – getting a frantic email from the boss late in the evening is not what they bargained for. Not to mention the vastly different “working at home experience” between parents with children at home, and empty nesters who can work in quiet. (Then again, there might be more parenting being done – good for the kids and society at large; bad for the parents’ employers.)

Organizational culture: The ones probably most against permanent work-from-home are… managers! It’s not only more challenging to manage workers from afar, but more “problematic” (from their standpoint): what if it turns out that many workers don’t need managers at all?! Today’s “information worker” is well educated and socialized to think independently. That’s great for the organization as a whole – but not necessarily for many who run the organization.

I’m sure that with a little thought, you too could come up with several additional, broader societal considerations, unintended consequences, and wider ramifications for this burning issue. More important, though, is to look beyond whatever other narrow problem you face now and in the future. What you see is not necessarily what you’ll get if you focus only on what you see right in front of you.