Maturity

In my teens, I was quite the voracious reader of fiction. Several books impressed me – e.g. Somerset Maugham’s Of Human Bondage, Thomas Hardy’s Tess of the d’Urbervilles – but above all, Ayn Rand’s Atlas Shrugged had a tremendous influence. Not so much for its literary “merit” (even then I realized that Shakespeare she’s not) but for the ideas Rand espoused so forcefully. I recently returned to the book. My reaction: what was I thinking back then???

Precisely the point. When we’re young, we think that we’re thinking. In a way, of course, we are “thinking” – but without much reflection. Strong ideas, simple to grasp, tend to grab our attention; complexity is perceived as wishy-washy. Just like our teenage hormones “raging” in one clear direction, so too our cognitive faculties home in, laser-like, on any idea that represents protest against the status quo. It’s an age when we can’t be too bothered by facts that get in the way.

As we grow older, most of us manage to “mature”. To a large extent, that’s another way of saying that our life experience wears down the sharp edges of assuredness, bringing a more complex and rounded understanding of social reality. That doesn’t mean that we stop strongly believing in ideas, ideologies, movements; it does mean that there’s also a “yes, but…” lying in the thicket of whatever opinion or belief we happen to hold.

I came across Ayn Rand’s tome (over 1,000 pages long!!) in the early 1960s. It was a stunning (for me) paean to capitalism and freedom. As a teenager, I hadn’t thought much about “economics”, but “freedom” was certainly on my mind – as it would be for any adolescent beginning to forge their own identity. The idea that we are the masters of our own fate, and that society shouldn’t restrict us in any significant way, had great meaning for anyone at my early stage in life.

What has happened since then? I began to look at the world in all its intricacy. Simultaneously, I started reading social-science research (after all, that is my general field): the way humans think and behave (psychology), the way societies work (sociology, politics, economics), and from there to the way (and why) they are constructed (biology, neuroscience). The more I read and delved, the clearer the picture became – not simpler, but rather far more nuanced and complex.

Put simply (and somewhat simplistically): there is no such thing as the “individual” divorced from society. Everything we believe in, all that we create and do, is to a very large extent a product of our social and physical environment. Indeed, Hebrew (Judaism) has two words for what in English we call “creation”: “briyah” and “yetzirah” – the difference between creatio ex nihilo (something from nothing) that only God can perform, and creatio ex materia (something from something) that describes what humans are (only) capable of. In other words, when we think of something “new”, it is basically a recombination of something (physical or ideational) that already exists in our world.

Most people are not voracious readers like I am, but everyone lives life in all its variety, up close and vicariously too (“the news”). The “up close” teaches us how to relate better to others: what drives others and ourselves, where they’re “coming from”, when to say (or not say) what in response. The news – external to our immediate life – provides perspective on “others”: different cultures and ways of living, different “narratives” and ways of thinking. Assuming that our eyes and ears are somewhat open to all this, we cannot help but become more “mature”, i.e., tolerant and able to accept life’s others and “otherness” with humility and grace.

Maturity has another side effect: happiness! Recent research clearly shows that around the world, as an age cohort, older people (60s and up) are happier than any other age group. That’s true for three main reasons: first, the kids are out of the house; second, professionally we have achieved (or not) most of what we will ever accomplish; third, and perhaps most important of all, we don’t take things so much to heart anymore – we’ve “seen it all”, and for the “mature” person the good and the bad are accepted with a gracious chuckle. Thus, as people approach the last stage of life, they paradoxically also tend to accept life far more from a “middle” standpoint.

Finally, there’s one more important aspect of the mature mind: accepting that whatever the current conventional wisdom – and even the hard, factual evidence – we have to maintain a level of skepticism, because absolute certainty is never attainable. Even Newton’s “absolute” laws of physics eventually succumbed to new facts and a new way of understanding the universe (Einstein). Metaphorically, Sir Isaac was our teenage certainty; Albert constitutes later-age relativity. Great minds might not think alike, but mature minds do take the same approach to thinking about life.

Are We a Whole, or Many Holes?

For those of you who have read several (or most) of these “Reflections” essays, it might seem that they present a disjointed picture of yours truly. There isn’t much of an “autobiographical narrative” here. Which brings up a very interesting – indeed, even hugely “personal-philosophical” – question for each and every one of us. Through a lifetime, does a person consist of a stable “self”?

At the most basic biological level, you are not the same person you were yesterday – and certainly not what you were a few months ago, given that almost every cell in your body has died and regrown since then. We shed skin and water – with new skin and liquids constantly replenished; we lose bone and cartilage – and grow new cells to keep our bodies functioning; all the while, “zillions” of bacteria and viruses that reside permanently in our gut and elsewhere are constantly dying off, with others taking their place. Most fantastic of all (and still perplexing to neuroscientists) is the fact that even our brain cells are constantly dying, and new ones take their place – so how do we retain our memories??

Nevertheless, my initial question above is not biological but rather psychological: are we the same person(ality) we were yesteryear in our thoughts, our emotions, our way(s) of viewing and dealing with the world? Is there any consistent “me” from childhood to late adulthood? Notice that I don’t even include here “infancy” or “dotage”, because clearly at the very start and very end of our life we are mostly not the person we will become or once were. However, in the vast middle – are we really the same “person” in the deepest psychological and behavioral sense of the term?

There is no clear answer. On the one hand, most of us would unthinkingly say “of course I’m the same Sam”. Although our face and body change through the years, we can see the gradual progression; although we might be somewhat more “mature” later in life, we still react to things in generally the same idiosyncratic fashion as before.

Or do we? Here we come to the other side of the coin: memory. If we wish to view ourselves as the same person we always were, there has to be some internal narrative to support this. But except for the very few (un)lucky individuals who can recall every moment of their life, we have huge holes in our “internal narrative”. Even worse, much of what we do “remember” did not happen in the way we “recall”; in other words, much of the time we “invent” our life. This is a common (heavily researched and proven) problem in court cases where witnesses literally “re”member what they saw or heard. Little do we know that we are quite bad “witnesses” of our own life!

And now for the BIG question: do you (past or present) try to “make sense” of your life, to see some overarching framework in what you have accomplished (personally and professionally)? In other words, notwithstanding the memory holes we share, even if we remembered everything, would that in itself constitute a “life narrative” with any motif? In fact, have you ever asked yourself that question?

I definitely do not mean to suggest that if you haven’t even asked it (meta-self-examination), there’s something wrong with you. You might actually believe – perhaps correctly – that there is no such thing as a “life narrative” that is stable, consistent, and unchanging. The reason there might not be such a thing is that we are not completely sovereign over our life; our social environment (broadly defined) affects and influences us every moment of the day. Thus, like a ship at sea buffeted by the waves, winds and currents, our life deals more with “staying alive” than inexorably charging forward to our self-defined “life-goal”. Moreover, we also might not feel that there’s a consistent narrative because we actively tried “maturing” and changing “who we were”. Why look at a steady life as something admirable? As Ralph Waldo Emerson opined: “A foolish consistency is the hobgoblin of little minds.”

Yet ultimately that is somewhat unsatisfying because we want to feel that we “stand for something”, i.e., that there’s a “there, there” within us. We all want to believe that we have some control over who we were, and are, and will be – and if we do change, then at least it’s because we willed it.

The American moral philosopher J. David Velleman put it pithily: “We invent ourselves… but we really are the characters we invent.” In short, even if we aren’t in full control of who we actually are, we are in total control of who we say we are (to ourselves, and to a lesser extent, to others). Some of us see in ourselves a one-act play; others perceive themselves to be a kaleidoscope of colors and shapes. Whether whole or full of holes, human beings are at least free to paint on their own canvas – even if the picture isn’t Realistic but rather Impressionistic or Surrealistic.

The Limits of Freedom

When I was a young boy, my parents bought us two parakeets. We kept them in their cage, well fed and taken care of, in the bedroom I shared with David. One day I felt sad for them because they didn’t have the opportunity to fly. So, I opened the cage door, gently took one out, and set it free. The bird was unsure what to do, but soon enough started flying around the room and then… flew straight into the window, cracked open its head, and died on the spot. 

At the time, it was a “tragedy” for a young kid like me. Looking back, though, I realize that there were a few lessons to be learned from this incident.

First, don’t do something “different” unless you consider possible unintended consequences. Of course, the world is too complex to be able to foresee everything that might happen, but we can certainly think through several, relatively likely possibilities and perhaps a couple of unlikely ones as well.

Second, and more difficult to absorb, is that unfettered freedom is actually counterproductive. Indeed, there’s a word for that: anarchy. We intuitively understand the problem in a few areas of life. Take sports: imagine a game where there are no rules, and every player can do whatever s/he wants in order to “score”. Not only wouldn’t that be very interesting for the participants, but it’s doubtful many spectators would find such a sport compelling. Or take music: what do most people want to hear – Mozart or some hyper-modern, “anything goes” composition? It is precisely the “limiting” structure of pre-20th century classical music that renders the music of Mozart (or any of his composer compatriots) so compelling.

This holds true not only for culture and entertainment. Even in general social life, complete freedom does not exist. One can’t walk anywhere, anytime: strolling down the middle of the street or crossing at a red light is not acceptable. One can’t even say whatever one wants, the First Amendment notwithstanding: yelling “Fire!” in a crowded theater (when there’s no fire) has no Constitutional protection. We even accept restrictions with no danger or harm in sight: even if all parties are willing, no more than two people can marry each other.

Why would this paradox hold true? That true freedom is only found within some structured or even “restrictive” framework? One way of looking at this is that society emulates Nature. The universe has “laws” of physics – there is no “cosmic chaos”, even if black holes and other outer space phenomena might seem that way. Given that human beings are part of that same natural world, why wouldn’t homo sapiens behave along the same principle? Indeed, we don’t think twice about the fact that gravity stops us from flying – it’s just one of the numerous “limitations” on our freedom that we accept because… well, because that’s the way the world works.

Another approach is to consider what drives humans after their basic needs are met. In a word: challenge. By our very psychological nature we are problem-solvers. At first, to stay alive; later, to keep us mentally stimulated or having a “better” life. But a problem, task, or challenge to be solved, completed or overcome can only be stimulating if it is circumscribed in some way: a frame around a painting; rules to complete a puzzle; laws allowing and forbidding certain advertising practices; professional licenses for those completing a specific course of study and passing an exam; and so on. The fun is in difficulty overcome, not in taking the easy way home.

In retrospect, as that young child I had two other choices: open my bedroom windows before releasing the parakeet – or lower the shades.

The first option would have provided my bird with complete freedom – and then it would most probably have starved to death, not being trained to find food by itself. The second option would have restricted the parakeet’s freedom of movement even more by darkening the room – but that in itself would have forced it to be more careful, and thus survive the experience!

At the risk of sounding “pedantic”, there’s a much larger lesson here. I’ll take the latest “political” brouhaha as my example. During Covid-19, every country has had to deal with the following dilemma: bar those who refuse to wear a mask or get vaccinated from certain activities – or enable “complete freedom” in the name of individual rights? At least for me, my parakeet episode provides the correct answer.

Hedgehog or Fox?

Back in the mid-1980s, when I was about a decade into my academic career, a senior colleague and close friend of mine gave me some advice as I was preparing my “file” for promotion: “Sam, you’re studying too many different subjects. Try to focus on one and be an expert in that. Academia today wants specialists, not generalists.”

One of the ironies in this vignette is that my friend is British – the same country where one of the foremost philosophers of the 20th century worked: Prof. Isaiah Berlin. Among other things, Berlin is famous for his essay “The Hedgehog and the Fox”. Taking his cue from the ancient Greek poet Archilochus, Berlin divided thinkers into two categories: hedgehogs, who see the world through the perspective of one single, defining idea; and foxes, whose standpoint is a wide variety of life experiences and areas of knowledge. As Berlin explained, the former digs deep into one subject area, gaining great expertise in that field; the latter brings together ideas from several sources, even if the fox’s knowledge base is more superficial than the hedgehog’s.

In our world, both are useful and even necessary. We accumulate detailed knowledge incrementally through many hedgehogs – and then a few foxes try and “connect the dots” between ostensibly disparate fields. Most big paradigmatic advances in modern times have been contributed by foxes, but they could not have done it without the groundwork laid by the hedgehogs.

So why the friendly “warning” from my friend and colleague? Because the world of academic research has come to reward – almost exclusively – hedgehog work. The main reason for that is the “measurability” of the hedgehog’s contribution, especially the number of articles published in prestigious journals. Foxy work, on the other hand, is far harder to evaluate, especially because almost all of it appears in books and not articles – and producing a book is far more painstaking than putting out articles. Thus, the fox’s “output” tends to be far smaller than that of the scholarly hedgehog.

I knew that my friend was correct, functionally. But I basically ignored his advice and to a certain extent “suffered” academically from that decision. Why ignore something that I understood as being correct, at least from a “career” perspective?

Two reasons. First, by nature I am a tried-and-true fox – deeply interested in several areas of life, and able to contribute something in each by combining a few of them (I try not to use this platform for self-aggrandizement, but just this once: back in 1981 I published the first-ever scholarly article on the legal ramifications of artificial intelligence, still being cited today in the latest research literature. End of my “promo”).

However, the second reason is far more important, and I already hinted at it above: although few and far between, it is the fox who has the chance to make some major breakthrough, or at least to shed new light on social phenomena, bringing to bear knowledge that seems to be far removed from the issue at hand.

By now you are probably saying to yourself: OK, interesting, but what does this have to do with “real life”? Here we come to the crux of the matter: everything in our world is connected somehow to virtually everything else! The challenge is how to perceive and act on those connections. This problem is especially acute in politics and policymaking. Government agencies and ministries tend to be intellectual silos – hedgehogs burying (and buried) deep within their own field. But this invariably leads to bad policy.

I will offer just one policymaking example: trivial and critical at one and the same time. High school educators are steeped in the ins and outs of pedagogy but are oblivious to the discipline of chronobiology. You’re surely saying: chrono-what? Chronobiology is the science of our body’s internal clock, and it turns out that teenagers simply cannot get up early in the morning. So why do high schools still start classes so early in the morning??? It is completely counterproductive pedagogically!

And just because you aren’t part of public policymaking, you shouldn’t think this doesn’t affect you as well. We all have to make serious decisions in our personal life. Here too is but one example: thinking of buying a house? Always wanted to have a view of the sea? Well, think again. The only real estate in the U.S. that over the past decade has gone DOWN in price is housing within half a mile of the shore. The reason: global warming leads to sea level rise, which brings on increased flooding during storms and in general. Not to mention that it’s worth considering whether to live in a wood-frame or stone/concrete home; the latter is the way to go in order to keep cool during increasingly hot weather! Who would have thought that real estate and climatology are linked?

I bring these examples to show that the “interaction” between fields of knowledge is not necessarily within the same general scientific area, i.e., not within the social sciences exclusively, ditto the natural sciences, or the same for the humanities – but rather between them. Of course, this is a huge problem, because when they were in college those same governmental policymakers studied almost exclusively within one general scientific field (e.g., psychology, economics and political science, without any history and literature, or biology and physics). I was lucky enough to have studied across such boundaries (ergo my article combining Law and Computer Science), and was thus equipped to tackle some “foxy” issues.

I am not suggesting that everyone become a fox. Again, the world needs all the hedgehogs it can muster in order to gain discrete knowledge. However, it is important to break out of one’s own intellectual ghetto – and certainly in the public sphere to find ways to have a real dialogue, if not institutional arrangements, between very disparate fields of endeavor.

Then and Now: Staying Upbeat

Tami calls me a “Pollyanna” – almost always upbeat and optimistic about the present and future. Is that realistic? No… and yes. Is it beneficial? Absolutely YES!

 Let’s start with a fact: some people are born with a natural disposition to be pessimistic, whereas others come into this world with an optimistic temperament. That doesn’t mean we can’t do anything to change or moderate our general perspective on life; personal change is possible, and even advisable. But such change is not necessarily easy, because it strikes at the very core of our personality. So, if what follows is persuasive but goes against your “grain”, with some mental effort you can certainly “modify” yourself.

What is not realistic about being constantly upbeat? Simply put, not everything “works out for the better”. As the colloquial saying goes: “shit happens”. Even if you believe in historical “progress” (more on that in a moment), that is not a constant, linear, upward slope. The slope of history acts more like a roller coaster, but despite the dips it’s one that generally moves higher. Sort of “two steps forward, one step backward.”

On the other hand, general optimism is definitely realistic – if one takes the “longer view”. I have always felt – and mention to my friends when they talk about the “good old days” – that if I had one power to turn everyone in today’s world into an optimist, it would be this: a time machine to return them a few centuries back for a mere 24 hours. Let them freely wander about. They would probably not even last the full day.

For instance, back then the public stench was overwhelming, with garbage simply thrown out the window into the street – which is why to this day in older parts of European cities one can still see the “indentation” running down the stone street so that the effluents would be washed away during the rains. And if you tried to take refuge in people’s homes, well, their personal body odor would overwhelm us moderns as well: people showered once a month or at best once a week. Indeed, that’s reputedly the origin of the saying “don’t throw out the baby with the bathwater”: the entire family would bathe in the bathtub in sequential order – dad, mom, older siblings and down the line – so that by the time the baby’s turn came the bathwater was so dark and murky from everyone’s dirt that you couldn’t see “junior”, and might empty the bath (out the window), baby and all!

Actually, the situation was far worse: imagine living at a time when one in three of your children (and you had many!) died before reaching the age of 5; when a toothache meant physical extraction with pliers (and no anesthesia) – don’t even think about other, more invasive operations; and when you were lucky to live past your mid-thirties (roughly the average lifespan in Europe in the late 18th century – Mozart was only 35 when he died). One can go on and on about nutrition (severely lacking), poverty (the lot of the vast majority) – and that’s if you were lucky enough not to be a slave, or slaving away in indentured servitude.

In short, our life today is immeasurably better than anything that came before. So why shouldn’t we be optimistic about our personal as well as our social/national/global future?

But you might ask: just because a person becomes (more) optimistic, does that make their life better? The surprising answer is a definite “yes”. First, although this sounds banal, it is also true: optimistic people enjoy life more, so there is a qualitative benefit. Even more germane is the “surprising” (?) fact that, all else being equal, optimistic people live a few years longer than pessimistic people!

None of this is to say that we should go through life apathetically because “everything will work out for the better”. Things won’t (as I already noted above) – unless we do something about it proactively. The human race in general, and each of us specifically, have powers of creative problem-solving that we are hardly aware of, until we start flexing that “muscle”. Here’s just one example among many.

A huge best-seller in the late 1960s was Paul Ehrlich’s The Population Bomb, a modern updating of Thomas Malthus’ infamous prediction that humanity would forever suffer from starvation and over-population warfare. What has happened since then? On the one hand, the world developed agricultural technology (e.g., “Golden Rice”: a genetically modified type of rice fortified with beta-carotene – the precursor of vitamin A – designed to save millions of children from blindness). On the other hand, fertility rates have plummeted almost everywhere in the world, as parents decided to have fewer children in order to provide more to those that they do have.

So, when I hear the fear-mongers regarding, for example, climate change, I react in two ways: first, I’m happy that they’re warning us; second, precisely because of those warnings, I’m very optimistic that humanity will find the solutions to this real problem – just as it did with “overpopulation”. As the title of this post suggests: what happened “then” is not what will happen “now” or in the future. Optimism coupled with creative adaptation almost always brings about a positive outcome.

The Aging Brain

Like almost everyone else I know around my age, I find that remembering things is not as easy as it used to be. That refers more to recent stuff in my life than “remembrances of things past” (to quote Proust). Nothing unusual about that. But what is somewhat “peculiar” is the fact that when it comes to another important part of my mental functioning, there is no diminution whatsoever: analysis and problem-solving. How come?

Let’s take memory first. There’s a common misconception that the trajectory of “memory” follows a parabola: slowly rising until our early Thirties, and then a slow decline for the rest of our life (for some, unfortunately, a faster decline in their Eighties and later). This is incorrect. In fact, we all undergo a massive form of “amnesia” around the age of eight (yes: 8), when the brain prunes most of our memories, probably to “make room” for the huge memory needs as we enter puberty and the main learning period of our lives through the ensuing decade or so. That’s why we don’t remember anything from our toddler past (before 3 years old) – all those memories have been mind-expunged, as new neurons and synapses form.

Second, there’s the question: memory of what? I’m terrible at remembering names of people, even though I recognize the faces of people I haven’t seen in decades. If I see any word in print even once, I never forget how it’s spelled; but I can’t very well recall events that I attended only a few years ago. Speaking of “years”, numbers of almost any sort (dates, how much something cost, etc.) are all totally “sticky” in my brain. In short, there’s no one overall memory ability.

Why would that be? Simply put, because there is no single place in our brain where “memory” resides. Rather, memories are spread over different parts of our brain, each one close to a different sensory “module”. So if we went to a restaurant and had a great meal, the memory of that would be evoked (recalled) from the olfactory or taste sections of our brain. And as we are well aware, people’s senses vary in strength. For instance, I am very strong on “music” – once I hear a tune, I never forget it. But please don’t ask me what the song (or classical piece) is called. In fact, in most people “musical memory” is incredibly strong – one of the last things to go, even in the later stages of dementia.

So, if one cognitive ability – memory – has many different “parts”, it shouldn’t be very surprising that other mental abilities are also differential in their “stickiness”. Whereas I have increasing difficulty remembering what I ate yesterday evening, I can analyze problems just as quickly and sharply as in the past. Indeed, maybe even better than in the past. Why? That leads to another aspect of the brain.

There is an old adage: the wise person knows how much s/he doesn’t know. But the converse is true too: none of us knows how much we really know. That means that buried deep in our mind is a huge amount of latent knowledge based on our life experience. It lies there dormant until something in our life demands a response for which this or that piece of “forgotten” information or knowledge is called – and up it pops! Our brain is a massive and (usually) terrific archivist. And the longer we live, the more experience/info we accumulate, so that we are armed with more ammunition to resolve problems or analyze challenging issues. And that stuff doesn’t seem to decline much with age. For anyone even remotely active mentally, the amount of new information surpasses by far the info our brain forgets from lack of use.

That’s why one of the three most important things we should be doing as we age is to keep the brain “challenged” with new types of information: learning a new language, honing a new skill, getting educated in an unfamiliar subject area. (Now you’re asking: what are the other two important things? Good nutrition, and aerobic physical exercise, such as fast walking, swimming, etc.)

In short, the brain isn’t so much a single, three-pound lump of flesh as it is a jigsaw puzzle of many pieces that make up a complete picture. You can lose a piece here and there and still see what the whole is all about. Obviously, if you lost the whole upper right-hand quarter of the jigsaw, you might not be able to see what was there at all. Luckily for us, however, the brain is different in an important way: if we lose some neurons or even small sections of the “mind”, it is able to find “detours” and pick up that “lost” ability (at least in part) through other parts of the brain. So, if our olfactory sense and memory start failing, we might still be able to recall that delightful restaurant dinner through the “taste” or “sight” sections of our brain. And keeping the brain “challenged” throughout “retirement” ensures that it will be able to construct “detours” more effectively.

The famous actress Bette Davis once said: “Growing old is not for sissies!” And yet she herself had an indomitable spirit and stayed “sharp” to her dying day (obviously – that’s one of the pithiest comments about aging you will ever read!); she continued acting in film and on television until shortly before her death. The aging brain can continue to do many wonderful things for us despite some partial breakdowns here and there. Just keep it as oiled as you can.

Talking to Children (chapter 2)

Prologue: Several weeks ago I reflected on what parents should or should not tell their children regarding “problematic” subjects of parental sensitivity (“Talking to Children”: http://profslw.com/talking-to-children/). This time the topic is more “benign” – the need to offer autobiographical details.

* * * * * * * *

 If there is one major thing that I regret regarding the relationship I had with my father, Arthur Wilzig, it is that he almost never talked about his youth and early adulthood – and being a teenager, of course I never asked. In fact, most of what I do know about the Cuba years and thereafter was told to me by my mother. There’s an object lesson here.

Children are naturally inquisitive. But there seems to be a blind spot in their questioning: although they want to know about virtually everything in their world, they seem to be naturally uncurious about their parents’ past! Perhaps that has to do with the fact that young kids have a problem even envisioning their parents as children, or as anything other than grown adults. In any case, as I’ve found from talking to many friends, once the children become adults themselves – and their parents may no longer be alive – this turns into one of the bigger regrets of their lives.

To be sure, parents can’t just sit their kids down and lecture to them about the parent’s past. Nevertheless, there are all sorts of opportunities in which that sort of information can be made interesting to the child. For example, when coming across something new or recently invented with the kid in tow, the parent can point out: “that didn’t exist in my day” or “instead, we had to do, or deal with…” – and from there can easily segue into a description of some aspect of the parent’s early life.

Nor does this kind of retelling have to be done in some kind of chronological order. Children really don’t care exactly what age the parent did something, i.e., they are not little historians! As long as each story or anecdote/event/vignette makes some cohesive sense, and the parent offers a general ballpark age, the child will eventually put the “chapters” together in their approximate chronology.

Why is this important? Given the growing interest in DNA heritage – how much of our ancestry comes from this, that, and/or the other ethnic/racial/tribal/national group – the social history of our father, mother, and their fore-parents is but the other side of the same coin. Humans naturally want to know from whence they came – biologically and historically-collectively. True, given the digitization of archival records around the world, it is somewhat easier today to find information about the “recent” (post-19th century) past. However, this is mostly dry data: birthdates, names, towns, and in rarer cases some other types of information, e.g., school records. What we really seek are the flesh-and-blood stories of our progenitors: what were they like (personality)? what did they do? what did they look like? in what way might we be like some of them?

This does not mean that parents will be forthcoming with all the information they know. There could be a black sheep in the family – the less said the better. A parent might have undergone some serious trauma – again, discretion is the better part of (their) valor. This was a particular problem with Holocaust survivors, but not only them.

When my son Avihai was around six years old, instead of reading to him a bedtime book I started telling him what we called “Sammy stories” – taken straight out of my memory. In fact, the present series of my “Reflections” memoirs could be considered a direct continuation of that experience, just some “levels” higher.

Perhaps the best reason for a parent to provide information within some past social context is the opportunity to “model” growing up. All children look up to their parents for “guidance” as to how to deal with the world. Of course, actions tend to speak louder than words, but words do carry weight as well – especially when they are about the parents’ actions when they were the same age. Education need not be just sitting down to help with our kids’ homework; “informal” education sometimes is even more important, especially when it involves “socialization” – how to behave, how to react to challenges, how to control one’s emotions, and so on. “When-I-was-a-kid” stories make at least as strong an impact as straightforward lecturing around the dinner table.

Again, the initiative has to come from the parent. Yes, some kids will be bored; others fascinated; and most mildly interested, depending on the way it’s told and what it’s about. But rest assured, even if today they don’t fully appreciate these “roots” excursions, they certainly will later in life.

One final note: whatever I said here goes double for grandchildren! They are two generations away (imagine explaining a rotary phone to contemporary digital natives); grandparents are perceived as being “wiser” than parents; and they certainly have more time and patience for such family storytelling. Although I would not overdo this next point, a few stories from grandparents about their own children – the kids’ parents and uncles/aunts – will definitely grab their attention!

May we all get to be great-grandparents (I’m sure we are, or will be, great grandparents), to regale dozens of our progeny – the real route to roots.

Purim’s Double Narrative: Celebration and Travesty

The holiday of Purim is my least favorite Jewish holiday – and Tami’s too. Why this should be so is an object lesson of what can happen when we become too comfortable with a traditional narrative.

All human beings have an amazing ability to take a relatively objective (factual) situation and interpret it in several ways, sometimes completely contradicting each other. As someone once said: “Variety is the spice of life.” But as we all know, sometimes spices can be very “hot” – burning us in the process of ingestion. In other words, if two people, camps, sectors, or population groups have widely different “tales” on what happened or “who did what to whom”, that could be a recipe for serious societal trouble.

This conflicting way of understanding the present is also true of the past (perhaps even more so!). There’s nothing like a contretemps between historians regarding a past event, or as Dr. Henry Kissinger once opined: “Academic arguments are so virulent because the stakes are so low…”.

Which brings me to the Purim holiday. As a mostly observant Jew (“Conservadox” is the best way to describe me), I have no intention of starting another “cultural war”. Rather, I want to explicitly state here something that other Jewish friends and acquaintances have sheepishly mentioned to me over the years: Purim is one “strange” story (and they don’t mean that positively). Some of you readers might have had the same queasy feeling.

The standard narrative is well known. Indeed, it is the classic basis for the Jewish trope (and joke): “What’s a Jewish holiday all about? The Gentiles tried to kill us, we fought back and won, and now let’s eat…”. Haman tried to manipulate King Ahasuerus into decreeing the destruction of the Jews, Esther devised a plan to turn the tables on Haman and succeeded, the Jews killed their Persian enemies, and we Jews celebrate to this day by eating and drinking ourselves into a stupor (the only day of the Jewish calendar when drunkenness is permitted). Kids celebrate by masquerading, adults by gorging on “hamantaschen” (in Hebrew “oznei Haman” = Haman’s Ears), and a good time is had by all.

So, what’s not to like? Well, when read a bit more closely, the Purim story is a complete travesty of Jewish ethics and commandments! Esther, an orphan, is brought up by her “uncle” (or cousin?), and when the king decides to find a new queen through a “beauty contest” (in the king’s bedchambers) of all the country’s virgins, she joins!! In other words, she is willing to have sexual relations before marriage, and with a Gentile no less. Then when she wins the competition, she actually marries the Gentile king!!! When was that ever condoned in Judaism? Indeed, in Jewish Law there are only three transgressions that may not be committed even to save a life – one of them, illicit sexual relations – so how could Mordechai even suggest that all this happened to save the Jews??

The Rabbis came up with all sorts of convoluted “explanations”, e.g., Esther was actually married to Mordechai (!?!) and didn’t consummate anything with the king (??); or, she would go to the mikveh (ritual pool for cleansing) before having relations with the king, and then again when she snuck out of the palace to have relations with Mordechai. (I am not making this up.)

Not as egregious from the standpoint of Jewish Law, but quite unJewish nonetheless, is Mordechai’s self-aggrandizement in the concluding sections. We are asked (actually commanded) to repeat every year the heroics of Mordechai (“the great man”, as the book puts it) – that he ostensibly wrote himself! Where did Jewish modesty go?

Which of the two main narratives is correct – the traditional one representing the Jewish Diaspora experience through the ages (trying to successfully fight anti-Semitism), or a highly problematic “outlier” in the Biblical canon? Obviously, I’m not objective given my antipathy to the entire Esther story, but consider these two points. First, where did the names Mordechai and Esther come from? The ancient Mesopotamian gods Marduk and Ishtar!! In other words, not only is their behavior reprehensible (by Jewish standards), but their very names suggest that they are not acting Jewishly – because maybe they aren’t?

Second, other than the Song of Songs (a love poem), the Book of Esther is the only book in the entire Bible where God’s name is not mentioned! The Almighty was obviously as embarrassed by this narrative as I am…

Religious Observance: Dispensable or Dutiful?


When I entered City College of New York for my undergraduate education, I decided to try out for the tennis team. Although my first love was basketball – I played on my championship-winning high school varsity team for two years – I wasn’t good (or tall) enough to play hoops at the college level. But as I was a tennis teacher in summer camp, I figured I might be able to play college varsity tennis. So, I went to the tryout.

After a few matches against other candidates, I made the team. Then came the hard part: informing the coach that I could not play on Saturdays, as I was a sabbath observer. That meant I would miss about a third of the matches. The coach was a stout fellow from Alabama who had perhaps not had much contact with Jews throughout his life. He looked me in the face, thought about it for a few seconds, and then asked: “Can’t you get special dispensation from your Rabbi?”

OK, you can stop chuckling now. It is funny (if you know Judaism), but there’s a serious issue behind a question like that. Religion demands adherence to strict “commandments”. Some, like Christianity, have relatively few “do’s” and a few more “don’ts”; Judaism has a huge number of them: 613 main ones and uncountable lemmas, additional “fences”, and assorted customs that have eventually morphed into “edicts”. Either way, if the underlying assumption is that these are God-given (or at least God-inspired), there isn’t too much wiggle room for the religiously observant.

Or perhaps there is? I am no expert on Christianity (or Islam), but papal dispensations for all sorts of future (or past) transgressions are well known. Judaism, however, has only one central “dispensation”: protecting human life. If there is any sort of mortal danger to a person, then (almost all) commandments not only can, but SHOULD, be abrogated in order to ensure the person’s continued life. This need not be a clear case of life and death; even preventing a relatively minor illness – that theoretically could lead to death – is enough to allow transgression. There are three exceptions – idol worship, murder, and illicit sexual relations: e.g., if someone says “kill her or I will kill you”, the Jew cannot pull the trigger on the woman.

However, this “thou must LIVE by them” dispensation means that there are no other circumstances that permit transgression. Which leads to the next question: what’s the utility of being a stickler in performing religious commandments?

First, any “legal” system needs performance consistency. It wouldn’t work if any pedestrian could decide that it’s OK to cross the street at the red light because that’s their favorite color. All religions constitute a sort of “legal system”, notwithstanding the origin (after all, we obey secular laws even if not mandated “by God”). Second, as humans we are creatures of habit – comfortable in repeating activity that we enjoy or find meaning in doing. Third, and somewhat related, all religions are social – designed to strengthen communal solidarity, something critical for mental health and, it turns out, quite important for physical health too! Religious people on average live 3 to 6 years LONGER than those who are non-religious!! (See: https://www.medicalnewstoday.com/articles/322175) The main reason for that in the modern world seems to be social solidarity: loneliness is the number ONE killer of senior citizens!

For Jews in the past, there were other more prosaic reasons. For instance, the edict that one has to wash hands (and make a blessing) before eating was obviously a major hygiene boost in an era when people had no idea about germs and the like. Circumcision seems to reduce the incidence of sexually transmitted disease. And so on.

Of course, we never know which aspects of religious practice are beneficial and which are not – and that’s precisely why sticking to tried and true beliefs and accompanying rituals is worthwhile. Just ask my college tennis teammates.

During my senior year, my tennis team was scheduled to play Temple University (Philadelphia) – that’s not the Jewish “Temple”, or even a religiously-oriented college, although it did start out Baptist. The problem? The match was to be played on the interim Passover days (Chol Ha’moed), so that with the long, round-trip bus ride and several hours of the match, I had to bring my own food along. So I’m on the bus with my teammates on the way to Philadelphia when I get hungry and take out a matzoh sandwich I made for myself. The first bite – “CRUNCH!” – and everyone turns around. “What’s that?” asks one of my teammates. I try to explain, but they look at my gigantic “cracker” and start laughing, with some good-natured mocking of my culinary habit.

We were thoroughly beaten in the match: 7-2. The only two wins: my singles and my doubles matches. On the bus ride back, all I could hear was my teammates imploring me: “Can I have one of those? There’s got to be some secret ingredient in those crackers!”

 If you stick to your religious guns, it all works out in the end. No need for dispensations…

The Wondering Jew (Not a Typo!)

Ever since I was a tot, I would ask “why?” (My baby-sitting twin cousins, Ruthie and Naomi, called me “the mouth” – at age 3!) But as the stereotype has it, that seems to be something natural to most Jews – asking, questioning, disagreeing, and protesting. And it’s worth remembering that not all “stereotypes” are wrong. If indeed this one is correct (as I believe it is), the question is – sorry about this! – WHY?

As with every good question, there are several possible answers – each probably true to a certain extent. I’ll start with a few that historians have bandied about, and then I’ll add my take. First, the Bible (“Old Testament”) is replete with arguments (Abraham telling God that “He” can’t destroy Sodom and Gomorrah if there are righteous people living there), protests (the Israelites in the desert), and sundry questioning of authority (the Prophets). Not for nothing does God call the Children of Israel “a stiff-necked people”!

Second, more than a thousand years later, after the destruction of the Second Temple (70 CE), Judaism took a radical turn away from the Temple cult (priests, sacrifices, etc.) to scholarship. Religious learning and education evolved and soon developed into what later came to be known as the Talmud, a gigantic compendium – not of laws, but of arguments. The questions, debating, arguing go on for pages and pages, each camp (many times more than two) utilizing all the tools of rhetoric and logic. Over many hundreds of years (the Talmud, originally purely oral and based on memory, was finally written down around the 5th-6th centuries CE), this created a culture of learning through interrogation and verbal give-and-take without parallel.

Third, anti-semitism also played its part over the past 2000 years. When the world views you as an outcast or “inferior” (in the Moslem world, that was called “dhimmi”), then at some point you begin to view yourself as well as an “other” – and start thinking as an “other”. Actually, I should take back the word “start”; as the first two factors above note, the Jews have viewed themselves as “other” from their very start – whether as “The Chosen People” or simply believing differently than everyone else around them (monotheism vs the ancient world’s polytheism).

When I was ten years old, my mother took David and me for an entire summer vacation to England and Switzerland to see the family. Each direction on a famous ocean liner. Being with my overseas family was fun, but the real “added value” was seeing “alien” things: cricket, farthings, fish & chips, even a royal palace!

In retrospect, it strikes me that a major reason for the Jewish mindset is the Jewish People’s never-ending wanderings. Think about it: Abraham moved from Babylonia to Assyria to Canaan to Egypt and back to Canaan; Jacob moved his whole family to Egypt (and we know how that turned out); the Israelites sojourned in the desert for 40 years and then entered Canaan; 10 tribes were expelled in the 8th century BCE, and about a century and a half later the last two tribes’ leaders were also forced to leave, though they managed to return to the Holy Land after 50 years; then some six centuries later came the Roman destruction of the Temple and more expulsions – this time to Babylon (again), Egypt and Rome. The next 2,000-year period (until today) constitute(d) the Diaspora, with Jews ever on the move from one continent and country to another.

That’s why we are also called “The Wandering Jew”. And it seems to me that all this Wandering also makes us Wondering, i.e., wondering why we are not accepted almost anywhere, and also wondering (in the new place we settle) why “they” do things the way they do.

Note how this plays into all three of the original factors mentioned earlier: Jews don’t accept the “conventional wisdom” of anyone; they were highly literate and educated among other nations with very low educational levels, so it was only natural to ask “why?” when the only answer forthcoming was “tradition”; and all this, of course, gets the Gentiles angry, leading to more anti-semitism that brings more expulsions etc., thereby starting the cycle all over again.

Thus, the Jewish way of doing (and thinking) things is “Wonderful” – as in “I wonder why it’s done that way?” Terrific for progress: scientific, technological and even social (Jews were/are in the forefront of most major social protest movements). But that also leads to Wanderful: sometimes a “push” by the Gentile world, and sometimes a “pull” with the Jew seeking out more amenable pastures. As one example: look how many Jews have immigrated to Palestine/Israel in the past century – and how many have also left Israel since its establishment!

Can we have our cake (enjoy staying put in our home country) and eat it too (see other places to broaden horizons)? Yes. The answer is vacation travel – not only to have fun but to learn how human nature is highly variegated, how others can live quite differently from us and still make sense of their lives. Americans are notoriously “parochial” (the center of the world), and part of the reason for that is never leaving the U.S. Indeed, even after the law was changed to require a passport for visiting Canada and Mexico, only about 30% of Americans have passports! I am willing to bet that the vast majority of the ultra-nationalist racists in America have never left the good ol’ U.S.A. If solitary confinement in jail is inhuman, then self-confinement to one’s national borders is a recipe for anti-humanity.

Obviously, the present Corona period is not the time for travel. But once we’re past the pandemic, all of us should “get up and go”… far away. Not only for our mental health, but our moral health as well.

To make a bilingual pun (from my Mom’s home country), seeing the world is wunderbar!