The Limits of Freedom

When I was a young boy, my parents bought us two parakeets. We kept them in their cage, well fed and taken care of, in the bedroom I shared with David. One day I felt sad for them because they didn’t have the opportunity to fly. So, I opened the cage door, gently took one out, and set it free. The bird was unsure what to do, but soon enough started flying around the room and then… flew straight into the window, cracked open its head, and died on the spot. 

At the time, it was a “tragedy” for a young kid like me. Looking back, though, I realize that there were a few lessons to be learned from this incident.

First, don’t do something “different” unless you consider possible unintended consequences. Of course, the world is too complex to be able to foresee everything that might happen, but we can certainly think through several, relatively likely possibilities and perhaps a couple of unlikely ones as well.

Second, and more difficult to absorb, is that unfettered freedom is actually counterproductive. Indeed, there’s a word for that: anarchy. We intuitively understand the problem in a few areas of life. Take sports: imagine a game with no rules, where every player can do whatever s/he wants in order to “score”. Not only would that not be very interesting for the participants; it’s doubtful many spectators would find such a sport compelling. Or take music: what do most people want to hear – Mozart, or some hyper-modern, “anything goes” composition? It is precisely the “limiting” structure of pre-20th-century classical music that renders the music of Mozart (or any of his composer compatriots) so compelling.

This holds true not only for culture and entertainment. Even in general social life, complete freedom does not exist. One can’t walk anywhere, anytime; down the middle of the street or crossing at a red light is not acceptable. One can’t even say whatever one wants, the First Amendment notwithstanding: yelling “Fire!” in a crowded theater (when there’s no fire) has no Constitutional protection. We even accept restrictions with no danger or harm in sight: even if all parties are willing, no more than two people can marry each other.

Why would this paradox – that true freedom is only found within some structured or even “restrictive” framework – hold true? One way of looking at this is that society emulates Nature. The universe has “laws” of physics – there is no “cosmic chaos”, even if black holes and other outer space phenomena might seem that way. Given that human beings are part of that same natural world, why wouldn’t homo sapiens behave along the same principle? Indeed, we don’t think twice about the fact that gravity stops us from flying – it’s just one of the numerous “limitations” on our freedom that we accept because… well, because that’s the way the world works.

Another approach is to consider what drives humans after their basic needs are met. In a word: challenge. By our very psychological nature we are problem-solvers. At first, to stay alive; later, to keep us mentally stimulated or having a “better” life. But a problem, task, or challenge to be solved, completed or overcome can only be stimulating if it is circumscribed in some way: a frame around a painting; rules to complete a puzzle; laws allowing and forbidding certain advertising practices; professional licenses for those completing a specific course of study and passing an exam; and so on. The fun is in difficulty overcome, not in taking the easy way home.

In retrospect, as that young child I had two other choices: open my bedroom windows before releasing the parakeet – or lower the shades.

The first option would have provided my bird with complete freedom – and then it would most probably have starved to death, not being trained to find food by itself. The second option would have restricted the parakeet’s freedom of movement even more by darkening the room – but that in itself would have forced it to be more careful, and thus survive the experience!

At the risk of sounding “pedantic”, there’s a much larger lesson here. I’ll take the latest “political” brouhaha as my example: during Covid-19, every country had to deal with the following dilemma: prohibit certain activities for those who refuse to wear a mask and get vaccinated – or enable “complete freedom” as a function of individual rights? At least for me, my parakeet episode provides the correct answer.

Hedgehog or Fox?

Back in the mid-1980s, when I was about a decade into my academic career, a senior colleague and close friend of mine gave me some advice as I was preparing my “file” for promotion: “Sam, you’re studying too many different subjects. Try to focus on one and be an expert in that. Academia today wants specialists, not generalists.”

One of the ironies in this vignette is that my friend is British – the same country where one of the foremost philosophers of the 20th century worked: Prof. Isaiah Berlin. Among other things, Berlin is famous for his essay “The Hedgehog and the Fox”. Taking his cue from the ancient Greek poet Archilochus, Berlin divided thinkers into two categories: hedgehogs, who see the world through the lens of one single, defining idea; and foxes, whose standpoint is a wide variety of life experiences and areas of knowledge. As Berlin explained, the former digs deep into one subject area, gaining great expertise in that field; the latter brings together ideas from several sources, even if the fox’s knowledge base is more superficial than the hedgehog’s.

In our world, both are useful and even necessary. We accumulate detailed knowledge incrementally through many hedgehogs – and then a few foxes try to “connect the dots” between ostensibly disparate fields. Most big paradigmatic advances in modern times have been contributed by foxes, but they could not have done it without the groundwork laid by the hedgehogs.

So why the friendly “warning” of my friend and colleague? Because the world of academic research has come to reward – almost exclusively – hedgehog work. The main reason for that is the “measurability” of their contribution, especially the number of articles they publish in prestigious journals. Foxy work, on the other hand, is far harder to evaluate, especially because almost all of it appears in books and not articles – and producing a book is far more painstaking than putting out articles. Thus, the fox’s “output” tends to be far smaller than that of the scholarly hedgehog.

I knew that my friend was correct, functionally. But I basically ignored his advice and to a certain extent “suffered” academically from that decision. Why ignore something that I understood as being correct, at least from a “career” perspective?

Two reasons. First, by nature I am a tried-and-true fox – deeply interested in several areas of life, and able to contribute something in each by combining a few of them. (I try not to use this platform for self-aggrandizement, but just this once: back in 1981 I published the first-ever scholarly article on the legal ramifications of artificial intelligence, still cited today in the latest research literature. End of my “promo”.)

However, the second reason is far more important, and I already hinted at it above: although such breakthroughs are few and far between, it is the fox who has the chance to make a major breakthrough, or at least to shed new light on social phenomena by bringing to bear knowledge that seems far removed from the issue at hand.

By now you are probably saying to yourself: OK, interesting, but what does this have to do with “real life”? Here we come to the crux of the matter: everything in our world is connected somehow to virtually everything else! The challenge is how to perceive and act on those connections. This problem is especially acute in politics and policymaking. Government agencies and ministries tend to be intellectual silos – hedgehogs burying (and buried) deep within their own field. But this invariably leads to bad policy.

I will offer just one policymaking example: trivial and critical at one and the same time. High school educators are steeped in the ins and outs of pedagogy but are oblivious to the discipline of chronobiology. You’re surely saying: chrono-what? Chronobiology is the science of our body’s internal clock, and it turns out that teenagers simply cannot get up early in the morning. So why do high schools still start classes at 8:30 or 9:00 AM? It is completely counterproductive pedagogically!

And just because you aren’t part of public policymaking, you shouldn’t think this doesn’t affect you as well. We all have to make serious decisions in our personal lives. Here too is but one example: thinking of buying a house? Always wanted to have a view of the sea? Well, think again. Just about the only real estate in the U.S. that has gone DOWN in price over the past decade is housing within half a mile of the shore. The reason: global warming leads to sea-level rise, which brings on increased flooding, in storms and in general. Not to mention that it’s worth considering whether to live in a wood-frame or stone/concrete home; the latter is the way to go to keep cool during increasingly hot weather! Who would have thought that real estate and climatology are linked?

I bring these examples to show that the “interaction” between fields of knowledge is not necessarily within the same general scientific area – i.e., not exclusively within the social sciences, the natural sciences, or the humanities – but rather between them. Of course, this is a huge problem, because when they were in college those same governmental policymakers studied almost exclusively within one general field (e.g., psychology, economics, and political science – without any history and literature, or biology and physics). I was lucky enough to have studied across such boundaries (ergo my article combining Law and Computer Science), and was thus equipped to tackle some “foxy” issues.

I am not suggesting that everyone become a fox. Again, the world needs all the hedgehogs it can muster in order to gain discrete knowledge. However, it is important to break out of one’s own intellectual ghetto – and certainly in the public sphere to find ways to have a real dialogue, if not institutional arrangements, between very disparate fields of endeavor.

Then and Now: Staying Upbeat

Tami calls me a “Pollyanna” – almost always upbeat and optimistic about the present and future. Is that realistic? No… and yes. Is it beneficial? Absolutely YES!

 Let’s start with a fact: some people are born with a natural disposition to be pessimistic, whereas others come into this world with an optimistic temperament. That doesn’t mean we can’t do anything to change or moderate our general perspective on life; personal change is possible, and even advisable. But such change is not necessarily easy, because it strikes at the very core of our personality. So, if what follows is persuasive but goes against your “grain”, with some mental effort you can certainly “modify” yourself.

What is not realistic about being constantly upbeat? Simply put, not everything “works out for the better”. As the colloquial saying goes: “shit happens”. Even if you believe in historical “progress” (more on that in a moment), that is not a constant, linear, upward slope. The slope of history acts more like a roller coaster – but despite the dips, one that generally moves higher. Sort of “two steps forward, one step backward.”

On the other hand, general optimism is definitely realistic – if one takes the “longer view”. I have always felt – and mention to my friends when they talk about the “good old days” – that if I had one power to turn everyone in today’s world into an optimist, it would be a time machine: return them a few centuries back for a mere 24 hours and let them freely wander about. They would probably not even last the full day.

For instance, back then the public stench was overwhelming, with garbage simply thrown out the window into the street – which is why to this day, in older parts of European cities, one can still see the “indentation” running down the stone street so that the effluents would be washed away by the rains. And if you tried to take refuge in people’s homes, their personal body odor would overwhelm us moderns as well: people showered once a month, or at best once a week. Indeed, that is supposedly the origin of the saying “don’t throw out the baby with the bathwater”: the entire family would bathe in the same tub in sequential order – dad, mom, older siblings, and on down the line – so that by the time the baby’s turn came, the bathwater was so dark and murky from everyone’s dirt that you couldn’t see “junior”, and might empty the bath (out the window), baby and all!

Actually, the situation was far worse: imagine living at a time when one in three of your children (and you had many!) died before reaching the age of 5; when a toothache meant physical extraction with pliers (and no anesthesia) – don’t even think about other, more invasive operations; and when you were lucky to live past age 35 (roughly the average lifespan in Europe in the late 18th century – and Mozart’s age when he died). One can go on and on about nutrition (severely lacking), poverty (the lot of the vast majority) – and that’s if you were lucky enough not to be a slave, or slaving away in indentured servitude.

In short, our life today is immeasurably better than anything that came before. So why shouldn’t we be optimistic about our personal as well as our social/national/global future?

But you might ask: does merely becoming (more) optimistic actually make a person’s life better? The surprising answer is a definite “yes”. First, although this sounds banal, it is also true: optimistic people enjoy life more, so there is a qualitative benefit. Even more germane is the “surprising” (?) fact that, all else being equal, optimistic people live a few years longer than pessimists!

None of this is to say that we should go through life apathetically because “everything will work out for the better”. It won’t (as I already noted above) – unless we do something about it proactively. The human race in general, and each of us specifically, has powers of creative problem-solving that we are hardly aware of – until we start flexing that “muscle”. Here’s just one example among myriads.

A huge best-seller in the late 1960s was Paul Ehrlich’s The Population Bomb, a modern updating of Thomas Malthus’ infamous prediction that humanity would forever suffer from starvation and over-population warfare. What has happened since then? On the one hand, the world developed agricultural technology (e.g., “Golden Rice”: a genetically modified rice fortified with beta-carotene – the precursor of vitamin A – designed to save millions of children from blindness). On the other hand, fertility rates have plummeted almost everywhere in the world, as parents decided to have fewer children in order to provide more for those they do have.

So, when I hear the fear-mongers regarding, for example, climate change, I react in two ways: first, I’m happy that they’re warning us; second, precisely because of those warnings, I’m very optimistic that humanity will find the solutions to this real problem – just as they did with “overpopulation”. As the title of this post suggests: what happened “then” is not what will happen “now” or in the future. Optimism coupled with creative adaptation almost always brings about a positive outcome.

The Aging Brain

Like almost everyone else I know around my age, remembering things is not as easy as it used to be. That refers more to recent stuff in my life than “remembrances of things past” (to quote Proust). Nothing unusual about that. But what is somewhat “peculiar” is that when it comes to another important part of my mental functioning – analysis and problem-solving – there is no diminution whatsoever. How come?

Let’s take memory first. There’s a common misconception that the progress of “memory” works as a parabola: slowly rising until our early Thirties, and then a slow decline for the rest of our life (for some, unfortunately, a faster decline in their Eighties and later). This is incorrect. In fact, we all undergo a massive form of “amnesia” around the age of eight (yes: 8) when the brain prunes most of our memories, probably to “make room” for the huge memory needs as we enter puberty and into the main learning period of our lives through the ensuing decade or so. That’s why we don’t remember anything from our toddler past (before 3 years old) – all those memories have been mind-expunged, as new neurons and synapses form.

Second, there’s the question: memory of what? I’m terrible at remembering names of people, even though I recognize the faces of people I haven’t seen in decades. If I see any word in print even only once, I never forget how it’s spelled; but I can’t very well recall events that I attended only a few years ago. Speaking of “years”, numbers of almost any sort (dates, how much something cost, etc.) – they’re all totally “sticky” in my brain. In short, there’s no one overall memory ability.

Why would that be? Simply put, because there is no single place in our brain where “memory” resides. Rather, memories are spread over different parts of our brain, each one close to a different sensory “module”. So if we went to a restaurant and had a great meal, the memory of that would be evoked (recalled) from the olfactory or taste sections of our brain. And as we are well aware, people’s senses vary in strength. For instance, I am very strong on “music” – once I hear a tune, I never forget it. But please don’t ask me what the song (or classical piece) is called. In fact, in most people “musical memory” is incredibly strong – one of the last things to go, even in the latter stages of dementia.

So, if one cognitive ability – memory – has many different “parts”, it shouldn’t be very surprising that other mental abilities are also differential in their “stickiness”. Whereas I have increasing difficulty remembering what I ate yesterday evening, I can analyze problems just as quickly and sharply as in the past. Indeed, maybe even better than in the past. Why? That leads to another aspect of the brain.

There is an old adage: the wise person knows how much s/he doesn’t know. But the converse is true too: none of us knows how much we really know. That means that buried deep in our mind is a huge amount of accumulated knowledge based on our life experience. It lies there dormant until something in our life demands a response for which this or that piece of “forgotten” information or knowledge is called – and up it pops! Our brain is a massive and (usually) terrific archivist. And the longer we live, the more experience/info we accumulate, so that we are armed with more ammunition to resolve problems or analyze challenging issues. And that stuff doesn’t seem to decline much with age. For anyone even remotely active mentally, the amount of new information surpasses by far the info our brain forgets from lack of use.

That’s why one of the three most important things we should be doing as we age is to keep the brain “challenged” with new types of information: learning a new language, honing a new skill, getting educated in an unfamiliar subject area. (Now you’re asking: what are the other two important things? Good nutrition, and aerobic physical exercise such as fast walking, swimming, etc.)

In short, the brain isn’t so much a single, three-pound lump of flesh as it is a jigsaw puzzle of many pieces that make up a complete picture. You can lose a piece here and there and still see what the whole is all about. Obviously, if you lost the whole upper right-hand quarter of the jigsaw, you might not be able to see what was there at all. Luckily for us, however, the brain is different in an important way: if we lose some neurons or even small sections of the “mind”, it is able to find “detours” and pick up that “lost” ability (at least in part) through other parts of the brain. So, if our olfactory sense and memory starts failing, we might still be able to recall that delightful restaurant dinner through the “taste” or “sight” sections of our brain. And keeping the brain “challenged” throughout “retirement” ensures that it will be able to construct “detours” more effectively.

The famous actress Bette Davis once said: “Growing old is not for sissies!” And yet she herself had an indomitable spirit and stayed “sharp” to her dying day (obviously so – that quip is one of the pithiest comments about aging you will ever read!); she continued acting in film and on television until shortly before her death. The aging brain can continue to do many wonderful things for us despite some partial breakdowns here and there. Just keep it as oiled as you can.

Talking to Children (chapter 2)

Prologue: Several weeks ago I reflected on what parents should or should not tell their children regarding “problematic” subjects of parental sensitivity (“Talking to Children”). This time the topic is more “benign”: the need to offer autobiographical details.

* * * * * * * *

 If there is one major thing that I regret regarding the relationship I had with my father, Arthur Wilzig, it is that he almost never talked about his youth and early adulthood – and being a teenager, of course I never asked. In fact, most of what I do know about the Cuba years and thereafter was told to me by my mother. There’s an object lesson here.

Children are naturally inquisitive. But there seems to be a blind spot in their questioning: although they want to know about virtually everything in their world, they seem to be naturally uncurious about their parents’ past! Perhaps that has to do with the fact that young kids have a problem even envisioning their parents as children, or as anything other than grown adults. In any case, as I’ve found from talking to many friends, once the children become adults themselves – and their parents may no longer be alive – this lack of curiosity turns into one of the bigger regrets of their lives.

To be sure, parents can’t just sit their kids down and lecture them about the parent’s past. Nevertheless, there are all sorts of opportunities in which that sort of information can be made interesting to the child. For example, when coming across some recent innovation with the kid in tow, the parent can point out: “that didn’t exist in my day” or “instead, we had to do, or deal with…” – and from there easily segue into a description of some aspect of the parent’s early life.

Nor does this kind of retelling have to be done in some kind of chronological order. Children really don’t care exactly what age the parent did something, i.e., they are not little historians! As long as each story or anecdote/event/vignette makes some cohesive sense, and the parent offers a general ballpark age, the child will eventually put the “chapters” together in their approximate chronology.

Why is this important? Given the growing interest in DNA heritage – what percentage of us belongs to this, that, and/or the other ethnic/racial/tribal/national group – the social history of our father, mother, and their forebears is but the other side of the same coin. Humans naturally want to know from whence they came – biologically and historically-collectively. True, given the digitization of archival records around the world, it is somewhat easier today to find information about the “recent” (post-19th-century) past. However, this is mostly dry data: birthdates, names, towns, and in rarer cases some other types of information, e.g., school records. What we really seek are the flesh-and-blood stories of our progenitors: what were they like (personality)? What did they do? What did they look like? In what ways might we be like some of them?

This does not mean that parents will be forthcoming with all the information they know. There could be a black sheep in the family – the less said the better. A parent might have undergone some serious trauma – again, discretion is the better part of (their) valor. This was a particular problem with Holocaust survivors, but not only them.

When my son Avihai was around six years old, instead of reading to him a bedtime book I started telling him what we called “Sammy stories” – taken straight out of my memory. In fact, the present series of my “Reflections” memoirs could be considered a direct continuation of that experience, just some “levels” higher.

Perhaps the best reason for a parent to provide information within some past social context is the opportunity to “model” growing up. All children look up to their parents for “guidance” as to how to deal with the world. Of course, actions tend to speak louder than words, but words do carry weight as well – especially when they are about the parents’ actions when they were the same age. Education need not be just sitting down to help with our kids’ homework; “informal” education sometimes is even more important, especially when it involves “socialization” – how to behave, how to react to challenges, how to control one’s emotions, and so on. “When-I-was-a-kid” stories make at least as strong an impact as straightforward lecturing around the dinner table.

Again, the initiative has to come from the parent. Yes, some kids will be bored; others fascinated; and most mildly interested, depending on the way it’s told and what it’s about. But rest assured, even if today they don’t fully appreciate these “roots” excursions, they certainly will later in life.

One final note: whatever I said here goes double for grandchildren! They are two generations away (imagine explaining a rotary phone to contemporary digital natives); grandparents are perceived as being “wiser” than parents; and they certainly have more time and patience for such family storytelling. I would not overdo this next point, but a few stories from grandparents about their own children – the kids’ parents and uncles/aunts – will definitely grab their attention!

May we all get to be great-grandparents (I’m sure we are, or will be, great grandparents), to regale dozens of our progeny – the real route to roots.

Purim’s Double Narrative: Celebration and Travesty

The holiday of Purim is my least favorite Jewish holiday – and Tami’s too. Why this should be so is an object lesson of what can happen when we become too comfortable with a traditional narrative.

All human beings have an amazing ability to take a relatively objective (factual) situation and interpret it in several ways, sometimes completely contradicting each other. As someone once said: “Variety is the spice of life.” But as we all know, sometimes spices can be very “hot” – burning us in the process of ingestion. In other words, if two people, camps, sectors, or population groups have widely different “tales” about what happened or “who did what to whom”, that can be a recipe for serious societal trouble.

This conflicting way of understanding the present is also true of the past (perhaps even more so!). There’s nothing like a contretemps between historians regarding a past event, or as Dr. Henry Kissinger once opined: “Academic arguments are so virulent because the stakes are so low…”.

Which brings me to the Purim holiday. As a mostly observant Jew (“Conservadox” is the best way to describe me), I have no intention of starting another “cultural war”. Rather, I want to explicitly state here something that other Jewish friends and acquaintances have sheepishly mentioned to me over the years: Purim is one “strange” story (and they don’t mean that positively). Some of you readers might have had the same queasy feeling.

The standard narrative is well known. Indeed, it is the classic basis for the Jewish trope (and joke): “What’s a Jewish holiday all about? The Gentiles tried to kill us, we fought back and won, and now let’s eat…”. Haman tried to manipulate King Ahasuerus into decreeing the destruction of the Jews, Esther devised a plan to turn the tables on Haman and succeeded, the Jews killed their Persian enemies, and we Jews celebrate to this day by eating and drinking ourselves into a stupor (the only day in the Jewish calendar when drunkenness is permitted). Kids celebrate by masquerading, adults by gorging on “hamantaschen” (in Hebrew “oznei Haman” = Haman’s Ears), and a good time is had by all.

So, what’s not to like? Well, when read a bit more closely, the Purim story is a complete travesty of Jewish ethics and commandments! Esther, an orphan, is brought up by her “uncle” (or cousin?), and when the king decides to find a new queen through a “beauty contest” (in the king’s bedchambers) of all the country’s virgins, she joins!! In other words, she is willing to have sexual relations before marriage, and with a Gentile no less. Then when she wins the competition, she actually marries the Gentile king!!! When was that ever condoned in Judaism? Indeed, in Jewish Law there are only three transgressions that may not be committed even to save a life – one of them, illicit sexual relations – so how does Mordechai even suggest that all this happened to save the Jews??

The Rabbis came up with all sorts of convoluted “explanations”, e.g., Esther was actually married to Mordechai (!?!) and didn’t consummate anything with the king (??); or, she would go to the mikveh (ritual pool for cleansing) before having relations with the king, and then again when she snuck out of the palace to have relations with Mordechai. (I am not making this up.)

Not as egregious from the standpoint of Jewish Law, but quite unJewish nonetheless, is Mordechai’s self-aggrandizement in the concluding sections. We are asked (actually commanded) to repeat every year the heroics of Mordechai (“the great man”, as the book puts it) – that he ostensibly wrote himself! Where did Jewish modesty go?

Which of the two main narratives is correct – the traditional one representing the Jewish Diaspora experience through the ages (trying to successfully fight anti-Semitism), or a highly problematic “outlier” in the Biblical canon? Obviously, I’m not objective given my antipathy to the entire Esther story, but consider these two points. First, where did the names Mordechai and Esther come from? The ancient Mesopotamian gods Marduk and Ishtar!! In other words, not only is their behavior reprehensible (by Jewish standards), but their very names suggest that they are not acting Jewishly – because maybe they aren’t?

Second, the Book of Esther is the only book in the entire Bible – other than the Song of Songs (a love poem) – in which God’s name is not mentioned! The Almighty was obviously as embarrassed by this narrative as I am…

Religious Observance: Dispensable or Dutiful?

When I entered City College of New York for my undergraduate education, I decided to try out for the tennis team. Although my first love was basketball, playing on my championship-winning, high school varsity team for two years, I wasn’t good (or tall) enough to play hoops at a college level. But as I was a tennis teacher in summer camp, I figured I might be able to play college varsity tennis. So, I went to the tryout.

After a few matches against other candidates, I made the team. Then came the hard part: informing the coach that I could not play on Saturdays, as I was a Sabbath observer. That meant I would miss about a third of the matches. The coach was a stout fellow from Alabama who had perhaps not had much contact with Jews in his life. He looked me in the face, thought about it for a few seconds, and then asked: “Can’t you get special dispensation from your Rabbi?”

OK, you can stop chuckling now. It is funny (if you know Judaism), but there’s a serious issue behind a question like that. Religion demands adherence to strict “commandments”. Some, like Christianity, have a relatively low number of “do’s” and a few more “don’ts”; Judaism has a huge number of them: 613 main ones, plus uncountable lemmas, additional “fences”, and assorted customs that have eventually morphed into “edicts”. Either way, if the underlying assumption is that these are God-given (or at least God-inspired), there isn’t much wiggle room for the religiously observant.

Or perhaps there is? I am no expert on Christianity (or Islam), but papal dispensations for all sorts of future (or past) transgressions are well known. Judaism, however, has only one central “dispensation”: protecting human life. If there is any sort of mortal danger to a person, then (almost all) commandments not only can, but SHOULD, be abrogated in order to ensure the person’s continued life. This need not be a clear case of life and death; even preventing a relatively minor illness – one that theoretically could lead to death – is enough to allow transgression. There are three exceptions: idol worship, murder, and forbidden sexual relations. For example, if someone says “kill her or I will kill you,” the Jew cannot pull the trigger on the woman.

However, this “thou must LIVE by them” dispensation means that there are no other circumstances that permit transgression. Which leads to the next question: what’s the utility of being a stickler in performing religious commandments?

First, any “legal” system needs performance consistency. It wouldn’t work if any pedestrian could decide that it’s OK to cross the street at a red light because that’s their favorite color. All religions constitute a sort of “legal system”, notwithstanding their origin (after all, we obey secular laws even if they are not mandated “by God”). Second, as humans we are creatures of habit – comfortable repeating activities that we enjoy or find meaning in doing. Third, and somewhat related, all religions are social – designed to strengthen communal solidarity, something critical for mental health and, it turns out, quite important for physical health too! Religious people on average live 3 to 6 years LONGER than those who are non-religious! The main reason for that in the modern world seems to be social solidarity: loneliness is the number ONE killer of senior citizens!

For Jews in the past, there were other more prosaic reasons. For instance, the edict that one has to wash hands (and make a blessing) before eating was obviously a major hygiene boost in an era when people had no idea about germs and the like. Circumcision seems to reduce the incidence of sexually transmitted disease. And so on.

Of course, we never know which aspects of religious practice are beneficial and which are not – and that’s precisely why sticking to tried-and-true beliefs and their accompanying rituals is worthwhile. Just ask my college tennis teammates.

During my senior year, my tennis team was scheduled to play Temple University (Philadelphia) – not the Jewish “Temple”, nor even a religiously oriented college, although it did start out Baptist. The problem? The match was to be played during the interim Passover days (Chol Ha’moed), so with the long round-trip bus ride and the several hours of the match, I had to bring my own food along. So I’m on the bus with my teammates on the way to Philadelphia when I get hungry and take out a matzoh sandwich I had made for myself. The first bite – “CRUNCH!” – and everyone turns around. “What’s that?” asks one of my teammates. I try to explain, but they look at my gigantic “cracker” and start laughing, with some good-natured mocking of my culinary habit.

We were thoroughly beaten in the match: 7-2. The only two wins: my singles and also my doubles matches. On the bus ride back, all I could hear was my teammates imploring me: “Can I have one of those? There’s got to be some secret ingredient in those crackers!” 

If you stick to your religious guns, it all works out in the end. No need for dispensations…

The Wondering Jew (Not a Typo!)

Ever since I was a tot, I would ask “why?” (My baby-sitting twin cousins, Ruthie and Naomi, called me “the mouth” – at age 3!) But as the stereotype has it, that seems to be natural to most Jews – asking, questioning, disagreeing, and protesting. And it’s worth remembering that not all “stereotypes” are wrong. If indeed this one is correct (as I believe it is), the question is – sorry about this! – WHY?

As with every good question, there are several possible answers – each probably true to a certain extent. I’ll start with a few that historians have bandied about, and then add my own take. First, the Bible (“Old Testament”) is replete with arguments (Abraham telling God that “He” can’t destroy Sodom and Gomorrah if there are righteous people living there), protests (the Israelites in the desert), and sundry questioning of authority (the Prophets). Not for nothing does God call the Children of Israel “a stiff-necked people”!

Second, more than a thousand years later, after the destruction of the Second Temple (70 CE), Judaism took a radical turn away from the Temple cult (priests, sacrifices, etc.) toward scholarship. Religious learning and education evolved into what came to be known as the Talmud, a gigantic compendium – not of laws, but of arguments. The questioning, debating, and arguing go on for pages and pages, each camp (often more than two) utilizing all the tools of rhetoric and logic. Over many hundreds of years (the Talmud, originally transmitted orally from memory, was finally written down around the 5th-6th centuries CE), this created a culture of learning through interrogation and verbal give-and-take without parallel.

Third, anti-Semitism also played its part over the past 2000 years. When the world views you as an outcast or “inferior” (in the Moslem world, that status was called “dhimmi”), then at some point you begin to view yourself as an “other” as well – and start thinking as an “other”. Actually, I should take back the word “start”; as the first two factors above note, the Jews have viewed themselves as “other” from their very beginning – whether as “The Chosen People” or simply by believing differently than everyone else around them (monotheism vs. the ancient world’s polytheism).

When I was ten years old, my mother took David and me for an entire summer vacation to England and Switzerland to see the family – each way on a famous ocean liner. Being with my overseas family was fun, but the real “added value” was seeing “alien” things: cricket, farthings, fish & chips, even a royal palace!

In retrospect, it strikes me that a major reason for the Jewish mindset is the Jewish People’s never-ending wanderings. Think about it: Abraham moved from Babylonia to Assyria to Canaan to Egypt and back to Canaan; Jacob moved his whole family to Egypt (and we know how that turned out); the Israelites sojourned in the desert for 40 years and then entered Canaan; ten tribes were expelled in the 8th century BCE, and some 150 years later the last two tribes’ leaders were also forced to leave, though they managed to return to the Holy Land after 50 years; then some six centuries later came the Roman destruction of the Temple and more expulsions – this time to Babylon (again), Egypt, and Rome. The next 2000 years (until today) constitute the Diaspora, with Jews ever on the move from one continent and country to another.

That’s why we are also called “The Wandering Jew”. And it seems to me that all this Wandering has also made us Wondering, i.e., wondering why we are not accepted almost anywhere, and also wondering (in each new place we settle) why “they” do things the way they do.

Note how this plays into all three of the original factors mentioned earlier: Jews don’t accept the “conventional wisdom” of anyone; they were highly literate and educated among nations with very low educational levels, so it was only natural to ask “why?” when the only answer forthcoming was “tradition”; and all this, of course, angered the Gentiles, leading to more anti-Semitism that brought more expulsions, etc. – thereby starting the cycle all over again.

Thus, the Jewish way of doing (and thinking about) things is “Wonderful” – as in “I wonder why it’s done that way?” Terrific for progress: scientific, technological, and even social (Jews were/are at the forefront of most major social protest movements). But that also leads to “Wanderful”: sometimes a “push” by the Gentile world, and sometimes a “pull”, with the Jew seeking out more amenable pastures. As one example: look how many Jews have immigrated to Palestine/Israel in the past century – and how many have also left Israel since its establishment!

Can we have our cake (enjoy staying put in our home country) and eat it too (see other places to broaden our horizons)? Yes. The answer is vacation travel – not only to have fun but to learn how human nature is highly variegated, how others can live quite differently than us – and still make sense of their lives. Americans are notoriously “parochial” (viewing themselves as the center of the world), and part of the reason for that is never leaving the U.S. Indeed, even after the law was changed to require a passport for visiting Canada and Mexico, only about 30% of Americans have passports! I am willing to bet that the vast majority of ultra-nationalist racists in America have never left the good ol’ U.S.A. If solitary confinement in jail is inhuman, then self-confinement within one’s national borders is a recipe for anti-humanity.

Obviously, the present Corona period is not the time for travel. But once we’re past the pandemic, all of us should “get up and go”… far away. Not only for our mental health, but our moral health as well.

To make a bilingual pun (from my Mom’s home country), seeing the world is wunderbar!

“Perfect” Almost Never Is

In my freshman college year, I took a great course on the History of Art. The professor was one of the world’s leading art historians and critics from the NY Metropolitan Museum of Art – and a really good lecturer to boot. He once devoted a full 90-minute lecture to one painting: Leonardo da Vinci’s “The Last Supper”. The presentation was “eye-opening” (literally and figuratively) and the analysis immensely fascinating, not to mention enlightening. But there was one “small” problem: he deemed this to be a “perfect” painting, perhaps the greatest ever.

One should use superlatives very sparingly – like fine wine. Too much, and one’s senses begin to dull, lessening the enjoyment. Indeed, overusing a superlative can even dull the word itself! For example, I was taken aback the first time I heard a British cousin say “brilliant!” when I made a relatively unremarkable observation. I knew I wasn’t a dumb person, but “brilliant” because I had a “thought”? I was too embarrassed to ask, but I soon realized that the word had lost all its “American” luster when I heard another Brit say “brilliant!” to some other banal comment.

Of course, there are superlatives – and then there are SUPERLATIVES. Among the latter is the word “perfect”. This one should be used extremely sparingly, perhaps once or twice a lifetime. The reason goes far beyond the diminution phenomenon I just mentioned; that’s the least of it. The real problem is that “perfect(ion)” hardly exists in the real world – and even when it does (very, very rarely), reasonable people could argue about whether it is indeed “perfect”.

Back to that college class lecture. The professor explained how this painting was basically the first ever to get the three-dimensional perspective absolutely right. He further pointed out how each of the twelve disciples was doing something that foretold his future (end). And so on. The lecture ended with the claim that this was as perfect a painting as one could imagine. And then he asked: “any questions?”

I was never known for my shyness, at least not in a classroom. So I raised my hand and asked: “How can you say the painting is perfect, when we know that this was the Passover seder – and da Vinci drew bread rolls on the table instead of matzohs (unleavened bread)?” The entire class went silent – several students turned around, looking at me with astonishment that I had the audacity to confront this august professor. But to me, more astonishing was the fact that the professor himself was struck dumb. After what seemed an eternity (probably only a few seconds to all of us), he replied: “I never noticed that. Interesting point!” (The most incredible part of this story is that he was Jewish!)

From that incident I took away two central lessons. The first – really important in light of my future academic career – was to forthrightly admit my error if a student caught me in some factual mistake. Indeed, with any such student I go out of my way to duplicate the quasi-compliment I received from that world expert. That’s part of higher education: students should never hesitate to ask, or even “confront”, a teacher with facts (or opinions, for that matter). Encouraging independent thought is what education is all about.

The second lesson I learned is that the most “deadly” word in the English language is also the most positive of all: “perfect”. There are two, almost contradictory, reasons for that.

On the one hand, as my Art History lecture story suggests, even when one thinks that something is “perfect”, it probably isn’t, because some flaw might (probably will?) be discovered later on. (Newton’s physics worked “perfectly” – until Einstein showed that it was not universally true.) On the other hand, if by some “miracle” perfection is attained, this will only induce others to try to reach “perfect” in their own bailiwick. But as the adage wisely states: “Perfection is the enemy of accomplishment”. In other words, if you try to be perfect, you will spend so much time on this almost impossible goal that you won’t get much else done.

There’s a relatively new word in the English language: “satisficing” – reaching a level that’s sufficient to accomplish the purpose you started out with. That’s not “satisfying”; i.e., we might not be completely satisfied (happy) with what we’ve done, but if it achieves the goal, then that’s good enough. So instead of “deadly”, maybe I should have said “deadening”. Running after perfection not only won’t lead to satisfaction, but it might actually cause a contraction – a word that combines the contradictory terms “contra” and “action”.

Bottom line: the perfect way to do things in life is not to try and do them to perfection.

Passing the Marshmallow Test


I have now gone thirty years without any significant weight change; indeed, throughout these three decades it has not fluctuated more than 3 pounds either way. In fact, today I weigh exactly what I did when I was eighteen years old. Genes? Nope – my parents gained weight as the years went by. Biology? Not at all – during my 1989-90 sabbatical in San Diego I gained twenty pounds, just from adding a frozen yogurt every day in the university lunchroom. The answer: I passed the “Marshmallow Test” – possibly the most important exam in the life of any child or adult.

What’s the Marshmallow Test? First, you don’t have to use a marshmallow; any tempting treat will do (tempting for the person being tested). Second, this test is usually given to a child around the age of four or five; it’s harder to do with adults, but it could work under the right conditions. Here’s how it goes:

Put the kid in a relatively empty room – the emptier the better because you don’t want other “interesting” things to be within reach that could enable the child to distract him/herself. Sit the child down at a table that has only one plate on it with one, and only one, marshmallow (or other candy). Tell the child that you will be leaving the room for five minutes and that s/he can eat the marshmallow/candy whenever s/he wants, but if s/he does not eat it, then when you return to the room you’ll give them two marshmallows/candies to eat!

This experiment has been performed many times under controlled research conditions, and it tests for “delayed gratification”: whether a person can put off obtaining something desirable right now in order to attain something even more desirable in the “future”. It turned out that many children were able to hold off eating that marshmallow – but about the same number were not able to control themselves.

So what? Although the test and its outcome were interesting in themselves, the real ramification of the results came only 15 years later – quite clearly (and shockingly). Those children (by then entering college) who back then were able to hold off in order to get the larger (later) reward tended to have better life outcomes, as measured by SAT scores, educational accomplishment, lower BMI (body mass index – i.e., they were slimmer), and other positive life measures! In other words, one simple test at a very early age was better at predicting life success or failure than almost anything else a child could be tested for.

Further exploration of the issue revealed some additional insights. Part of the ability to delay gratification was clearly inborn, but that was hardly the full story. Socio-economic background and parental upbringing practices were also influential. Thus, this “ability” is not either-or but rather can be learned or self-taught.

My personal experience (no weight gain) can offer one more insight. If a person has enough motivation for “holding off”, that can make it easier to push back against temptation. In my case, I had two motivations. First, my father died of heart failure at age 57, and as I started to see that “age” coming up the pike, I felt the need to start taking better care of myself. Second, my love of playing basketball: unfortunately, one can’t do that well being overweight, especially when passing – not the ball – but age 40. As I am wont to joke: when I was young, I would eat in order to play basketball; now I don’t eat, in order to play basketball!

Of course, delaying gratification is easy to say, but a lot harder to do. For each such goal, one needs a “system”. Mine was (still is) simple albeit for some people pretty drastic: I get on the scale every morning! If I’m a pound “overweight”, I diet down a few days. That’s it. I don’t write off cake completely – just a tiny bit here and there; no empty calories – but I will “indulge” once in a blue moon if something really scrumptious comes along.

That’s not for everyone – and in any case, there are many other goals in life for which we could use some delayed gratification, e.g., saving for retirement (or even that expensive vacation you’ve been dreaming about). What’s important is to have a reasonable plan (not completely inflexible) and to stick to it by periodically checking on your progress – e.g., on that growing pension plan you’re saving up.

And if you have a child or grandchild, do the Marshmallow Test. If the results are not satisfactory, it’s time to start “gratification training”…