Identifying with One’s People: An Emotional Journey

I attended a modern Orthodox Jewish Day School from 1st through 8th grade, and then a similar high school. The secular education was quite good, and the Jewish part – at least from the standpoint of “do’s and don’ts” – wasn’t bad either. But something was missing, although I didn’t realize what it was until I went to college and took a course in Jewish History. For me, that’s when everything changed…

This essay is not about “Judaism” per se; it’s relevant to anyone and everyone of whatever ethnic, national or religious background – each with their own variation. But the underlying premise is the same: there is a difference between “knowing the rules” of your group and “feeling the connection”.

In those twelve years, I learned about the stories in the Bible; what a Jew is commanded to do; and how to study the sacred books. The approach was hyper-rational and didactic: this is what the Bible states; that is what the Talmud’s Rabbis argued. What was completely missing? Any real historical framework that would give “life” to those people and events of yesteryear. Sure, the Bible’s heroes were “heroic” (although each with flaws as well), but they existed in a world without socio-cultural context.

One central example: the Talmud (including the Mishnah) covers 500 years of history in the Land of Israel and Babylon, but not once were we taught that “this Rabbi lived in the 2nd century CE in Palestine” (the name given by the Romans), whereas “that Rabbi lived in Babylon in the 5th century CE”. In and of itself the specific “century” was not all that important; what was critical for understanding was the political, social and cultural milieu in which these rabbis worked and argued – and the impact of their environment on their argumentation and legal rulings. To mix metaphors, this was teaching Jewish history and law from “Olympus” on high, and not within the messiness and complexity of real life, evolving over time.

In registering for that Jewish History course, I thought that I would merely be filling in some lacunae in my knowledge. What I got was a double shock. First, I wasn’t missing a few pieces of knowledge; it was more like 450 pieces of a 500-piece jigsaw puzzle had gone AWOL without my knowing! Second, whatever I did “know” about Judaism was cut and dried, without much emotional resonance. Yes, I knew about the destruction of the two Temples – but more as a Fast day (the 9th of Av) with some specific prayers, than any flesh and blood detail about who, why, and how those catastrophes occurred.

The rest of Jewish history? The word “history” never arose in my 12 years of education; again, the Jewish calendar was taught as a litany of holidays and commandments, with almost no historical context (and certainly without questioning the extent to which the “history” that was taught is actually accurate). But far more important, any event that did not have a place in the Jewish calendar was simply invisible: the 8th century BCE exile of ten (!) tribes, never to be seen again; Byzantine Emperor Theodosius’s persecution of the Jews in the Holy Land; periodic Christian campaigns of mass conversion of Jews; Crusader pogroms through the mid-Middle Ages; the Spanish Inquisition; the Chmielnicki massacres of Polish and Ukrainian Jews in 1648; the mass movement around the false messiah Shabtai Zvi (most of Western European Jewry following his lead); and other significant, lugubrious events.

Of course, Jewish history is full of other, more positive (or “neutral”) developments, and these too were never mentioned: post-Temple Jewish Babylonian society with the “Rosh Golah” (Diaspora Head) opposite the Yeshiva rabbis; the various Jewish mass migrations from East to West, setting up huge, new communities in North Africa and later in Europe; the literary and commercial effervescence and affluence of Spanish Jewry under Moslem authority; Spinoza and later the Jewish Enlightenment (e.g. Moses Mendelssohn). And, astoundingly, completely missing from Jewish Day School (or modern yeshiva) education is Zionism, as this was largely a secular movement of national independence.

Confronting all this in my college course – the good, the bad, and the really ugly things in Jewish history – was a wrenching experience for me. When the emotional dust settled, I had a much greater appreciation of Jewish history. Far more important, I now had a greater visceral attachment to the Jewish people and my heritage. Judaism wasn’t only 613 Commandments; it was the product of a 3000-year journey from depths to heights and back again – over and over – each time with the Jews exhibiting the highest level of fortitude and adaptability in the face of some of the greatest challenges that any nation has ever faced. And still the Jews endure(d).

I started this essay with these words: “it’s relevant to anyone and everyone of whatever ethnic, national or religious background.” So I’ll end with an example. Today is Christmas. Why does it fall on the 25th of December? Indeed, why is Hanukkah also celebrated around this time of the year? (The Jewish calendar is lunisolar, so the “civil date” shifts back and forth by a few weeks each year.) After all, the Maccabee revolt against the Seleucids went on for seven years; however, Hanukkah’s date does not celebrate the revolt but rather the rededication of the Temple when its Menorah was relit. The answer to both these questions is the same: throughout the ancient (pagan) world the 25th of December was celebrated as the first day on which one could see the sun actually “regenerating”, i.e., until Dec. 21 the days got shorter and shorter – only four days later could it be discerned that the days were getting longer and “the world was saved”.

That’s why Hanukkah is celebrated mainly as the holiday of LIGHTS!! That’s why Jesus was (supposedly) born on that day, as the harbinger of a “new era”. Both these religions understood that in order to survive they had to “piggyback” on the pagan world’s age-old holiday festivals.

Does this “cheapen” Hanukkah? In my opinion, quite the reverse. It is but one example of many as to how Judaism over the millennia adapted to the exigencies of the time, turning something “unholy Gentile” (a superstitious belief in the Sun’s demise) into a holy celebration of national independence and religious rededication. (The Christians, in their completely different way, did this too. And while we’re on the topic: why is New Year’s on Jan. 1? That’s eight days after Dec. 25 – when Jewish Jesus had his brit milah [circumcision]! Christians too need to understand their religious connection to Judaism – not just to paganism.)

In sum, a true emotional connection to one’s people – whether religious or national – can only arise from a deeper and especially wider historical comprehension of who they were and why, i.e., with what our forebears had to contend. It is only through such an understanding that we too today will be better able to deal with the newer challenges that our people face. 

Why Are Individual Humans Complex?

My thinking is “out-of-the-box”; my behavior is distinctly conservative. How can that be? I am an irrepressible punster; an original problem-solver; an unorthodox thinker. Conversely, I dress conservatively, follow rules and the law to the letter, have never used any drugs (or even smoked cigarettes) – in short, my actions are very “yekke” (German for a stickler). What accounts for people being so “inconsistent”?

This is not about “me” as Sam. I am simply using myself as someone representative of most of humanity in general. The question is: why are people so “complicated”, even “contradictory”?

Obviously, every person is “complex” in a different way. Some are conservative thinkers but act radically. Others are “off-beat” in one area of life but quite “straight” in other areas. Still others stick to all the social rules of the game in their lifestyle, but deep-down dream of other ways of living. The combinations are almost endless.

Why should this be? The question itself is interesting because one can easily turn it around: why would we think otherwise? The answer to that is the fact that when we look at almost all other species in the animal kingdom, we find a conservative regularity: birds don’t crawl, snakes don’t fly, ants have no individuality, and so on. Their behavior seems to be “wholistic” in nature (and in Nature). If we too are part of the natural world, why should we be different (from other species – and most important, from other humans)? But we are.

What makes us different (in both directions) is “culture”, i.e., the ability to communicate with each other and expand our knowledge of the natural world and each other. Over the eons, this enabled us to sustain ourselves without having to find food non-stop (anthropologists estimate that hunter-gatherers “worked” only about 3-4 hours a day), leaving ample time for “leisure”. Back then that probably meant telling stories around the campfire, creating ornaments, etc. In short, broadening our horizons – with new physical artifacts and also novel thoughts, intellectual ideas, and perspectives. Still later, we started expanding our social horizons – villages, towns, cities, states, empires – and that led to markedly different ways of living and thinking, as greater human density meant greater intellectual cross-pollination. Ultimately, we spanned (and communicated/traded across) the entire globe. As opposed to Homo sapiens, African lions don’t “talk” to Bengal tigers; they each stay mostly in their own habitat, and if there are any behavioral differences between these two “cousin species” (based on climatic, environmental and topographical differences), those differences won’t transfer from one to the other, even if by chance they do meet up.

From a physical standpoint, humans look pretty much the same: two eyes, two ears, and so on. The real difference is found in our brain, which has two main functions: consciously thinking thoughts and unconsciously controlling most bodily functions (you don’t usually think to breathe and certainly not to keep your heart beating). It’s conscious thought that makes all the difference – even to the extent that, for example, people born without arms develop incredible dexterity in their toes; the brain simply finds another “outlet” to do what the body needs to do!

However, what truly makes the human brain stand out is its multifaceted structure. It might look like one 3-pound piece of wet meat, but the human brain is actually a modular machine with numerous parts, each of which works on a different aspect of our “reality”. There are sections for receiving and interpreting inputs from the external world (perception); other parts do the rational thinking; an important sector runs our emotions; and then there’s a section that filters thoughts and emotions so that not everything we feel or think comes out of our mouth.

It is here that we return to the original question: how can people be so different? Easy: one section of the brain does not necessarily have to be consistent with another. There are people so emotional that their rational side gets lost; others whose rational thought suppresses their emotions; still other humans whose perceptual apparatus works in weird ways (look up “synesthesia”: https://www.healthline.com/health/synesthesia); and so on. The various permutations and combinations of our brain’s “modules” are almost endless – ergo, people’s thoughts and behaviors differ widely.

The irony in all this is that we tend to judge a person by externalities: how good looking/pretty s/he is, how tall, their skin color, muscles, body size (lithe or heavy), etc. But the real “action” (insofar as differences are concerned) hides away within our skull.

“Hold on!” you are probably asking: what about that “culture” mentioned earlier? Are people different because of their brain biology or because of their social environment: parental upbringing, formal education, peer socialization, etc.? The answer is “both”. However, we do not come into this world as a tabula rasa on which society can “write” whatever it wants to form our personality; we start out with a particular brain in all its complex modularity. Our social environment can indeed “mold” that brain to an extent, but such an influence is severely restricted by the particular biology of the baby’s (and growing adolescent’s) brain. Scientists are still arguing about the degree of Nature vs Nurture, but no one doubts that both are important in what makes me “me”.

In short, you probably know several “weird” people. But they’re not actually strange; rather, they are as complex as you are – just in different ways, and it’s not always easy for you (or anyone) to understand what makes that person tick. It’s only relatively easy to understand ourself: if you want to know why “you” are a complex person, first look in the mirror (at your head), and then go outside and survey your social environment. Given the innumerable elements that go into making “you” you, it’s actually pretty surprising that you are at least somewhat similar to other people! So try focusing more on how other individuals are similar to you, and less on how they’re different. We’re all “different” from each other in some ways, but that’s precisely what makes us all similar: we’re complex humans.

MEeting the “Enemy”

In my early 20’s, after many summers spent in Orthodox summer camps, I was hired by the Conservative Movement’s Ramah Camp to be its Sports Director in one of their camps. There I met a young woman (for the sake of her privacy, I’ll here call her Toni), whom I started “dating” in camp. She was nothing like most girls/women I had ever met – very serious about religion and other social issues. We had many discussions, among other matters about “feminism” – something completely alien to someone like me, a product of Jewish Orthodox education through high school. After a month, she told me her not so secret “secret”: she was a lesbian. Didn’t I say “nothing like most girls/women I had ever met”?

No, that’s not a misprint in the title. It does say “MEeting”. (I’ll end this essay with another “misprint” – something to look forward to…)

Almost everyone – except perhaps for children of diplomats and military officers moving from country to country every few years – spends all their youth growing up in much the same social milieu, even if they move once or twice to a different city. That provides comfort, for we quickly learn who’s who, how to behave, what is expected of us, and most of all: what’s “normal”. The downside, of course, is that such a milieu is a social bubble; we basically have no idea how other social groups and certainly different cultural enclaves live their lives and what they consider to be important or “acceptable”.

Frankly, this is especially true and problematic for citizens of large countries and/or countries that are mostly separated territorially from the rest of the world. Those two elements characterize Americans more than any other significant nation on the planet. If you’ve never seen it, look up Saul Steinberg’s famous New Yorker cartoon cover; it even has its own Wikipedia page! https://en.wikipedia.org/wiki/View_of_the_World_from_9th_Avenue

It is also true for very cohesive social groups (religious, ethnic etc) – Orthodox Jews among them. This is not necessarily negative; that cultural self-ghettoization has enabled the Jewish People to survive longer than any other national culture on the face of the Earth (Chinese culture as we understand it today goes back “only” 2500 years to Confucius). But it does leave the average individual within that cultural enclave pretty clueless about other ways of life. Like I was.

It might seem that there isn’t much of a cultural gap between modern Orthodoxy and Conservative Judaism. (As an aside: as a matter of fact, theologically [not sociologically] there isn’t that great a difference; one could somewhat simplistically but still truthfully argue that modern Orthodoxy is today’s Beit Shammai and the Conservative approach to halakha [Jewish law] is Beit Hillel. The former school [from 2000 years ago] was very strict in its interpretation of the law, whereas the latter were far more lenient and flexible. Interestingly – something that many Orthodox today have forgotten – the final decision on what was to be done was almost always according to Beit Hillel!) But the milieu from which Orthodox Jews emerge is quite constricted, as opposed to Conservative Jews who tend to live in less homogeneously Jewish enclaves.

So when I met Toni, it was a religio-cultural “oil and water” interaction (I guess, sometimes opposites do attract). My first shock: she wore a kippah (skullcap) all the time – something that back in the 1970s was unheard of (at least for me). The second shock: she put on tefillin (phylacteries) every morning! Ensuing discussions about what this was all about were not exactly shocking to me, but were certainly eye-opening. In a word – which I had read about, and only understood in very general terms – “feminism”.

It’s hard to say exactly why this deeply influenced my thinking. In retrospect, I can speculate about possible reasons: ancient Jewish law actually treated women quite well for the “norms” of that age; Judaism has always placed social justice at a high normative level; I never viewed girls as inferior (indeed, in the 8th grade at graduation I was voted Most Popular Boy, even though not one boy voted for me – I seemed to be one of the few boys who spoke to my classmate girls as equals); and finally, although my mother was never a “feminist”, she had several professional lives that were decidedly not of the “stay-at-home” garden variety: the first woman diamond cutter in Havana (back then, one of the world’s diamond centers), and in America an assistant fashion designer to one of Seventh Avenue’s leading designers, Bob McKintosh. So, while the idea of modern feminism was beyond my ken, the practice (and Jewish antecedent) perhaps smoothed the way to my “natural” acceptance of the whole basket.

I have been a strong feminist ever since (my revised surname is testament to that – a story for another day). Indeed, other than my wife Tami, without doubt Toni influenced me for the long term more than any other female in my life (Moms aren’t included in this regard). This is because her influence went far beyond teaching me what feminism is all about. Rather, the experience was an “education” in the real sense of the term. The main lesson: keep one’s eyes and ears open for other ways of thinking and living. Of course, this does not in any way mean that one has to (or should) accept and adopt those dissimilar perspectives and (sorry for this big word, but English doesn’t have a real parallel) different Weltanschauungen.

What’s the value, then, of opening up to other ways of life? First, additive: we can learn about new things or other ways of thinking. Second, refractive: it’s a mirror to our own way of doing things, i.e. forcing us to consider whether everything we’re used to doing makes sense. And third, it improves our social intelligence – better understanding and tolerance of “strange” things that others might be saying/doing. For instance, different cultures have different conceptions of “social distance”: Latins (southern Europe and South America) tend to stand very close to each other when conversing; Anglos keep a significant distance. When the two get together without understanding the other’s culture, Latins consider Anglos to be weak and off-putting for not getting up close, whereas Anglos consider Latins to be “pushy” and aggressive for constantly “invading my space”.

Back to Toni and feminism. Why would I use the word “enemy” in the title of this essay? Because in Orthodox circles, anyone of another Jewish “sect” was (and certainly in Israel, still is) considered to be a dangerous “enemy” – the Conservative movement perhaps even more so than Reform Judaism, precisely because the former adheres to the same general basis: halakha. In any case, meeting such an “enemy” in Toni not only showed me how absurd such theological demonization can be, but also constituted a life lesson in listening to different ideas, opinions and practices – even if ultimately one does not necessarily accept (parts of) that way of living.

The bottom line: in life when we come across a person who is “strange” or even seemingly “threatening” (our values, not physically) – what sociologists call “the other” – we should stop and think. Perhaps what we need to face is ourselves: the eneME?

Body and Brain

As far back as I can recall, I loved (and relatively excelled in) athletics. I have kept this up through the years: tennis (until my shoulder started aching in my 50s) and basketball still today (my early 70s). As all athletes know, sports activity is a mood enhancer – you feel great after working up a sweat. But who would have thought that intensive physical activity could also be a brain-intellect enhancer? Well, recent research has discovered some very interesting things about the body-brain connection…

We can start at the very beginning. Eons ago, our forefathers and foremothers did lots of walking, running, and other physical activities just to stay alive – mostly in the hunt for meat. (Yes, women too; it was reported recently that female bones were found buried with full hunting paraphernalia.) So, it stands to reason that the human body would evolve in such a way as to prioritize the ability to move quickly and for long distances – the better “proto-athletes” had a better chance of survival, and thus more progeny.

If hominids have been around for millions of years, and Homo sapiens for one or two hundred thousand, then “modern” humans – the sedentary ones sitting in an office all day, or worse, “couch potatoing” in front of a screen of one sort or another – are not doing what our body was designed for.

However, it turns out that not all types of “exercise” are equal. Let’s take a simple comparison: running on a treadmill vs. jogging in the park. The former is excellent for keeping your body in aerobic shape – good for your heart, arteries, and some leg and arm muscles. But the brain? No benefit there. On the other hand (or leg), running outdoors is different – especially for the brain. The reason is simple. On a treadmill, there isn’t much thinking we have to do; it becomes almost automatic – one leg in front of the other, on and on…. But outside? We have to keep track of the following: not tripping on some hole or object on the ground; where we are going (to avoid getting lost, or how to get back to where we started); not bumping into other people or a pole. We are also receiving far more stimuli: birdsong, animals scooting around, people doing interesting things, flowers blooming, etc.

In short, when we’re running outside, we are basically replicating the experience our fore-parents underwent – the experience that kept not only their bodies fit, but their brains as well. Indeed, to continue the “evolutionary” description I mentioned earlier, those who exercised their mind in the long hunt were also evolving in a positive, cognitive direction – and not just improving their physical capabilities.

What about sports? Is the above description also germane to competitive sports? Is the sky blue?? If anything, competitive sports are even more brain-enhancing than running outdoors. Just think (pun intended) of everything a sport demands of the player: cooperate with teammates (in group sports), follow the direction of a ball and coordinate the body with its movement (catch the ball, hit it, kick it, etc.), think of our next tactical move within a broader strategy – all this while running around and not being certain of our opponent’s next counter-move!

Many athletes are vaguely aware that competitive sports are not merely a physical activity but demand mental exercise as well. However, until very recently, no one was able to show that such physical activity had a highly positive effect on our brain in the long run. Now researchers have begun to do controlled experiments with people, using MRI brain scans to test “before” and “after” effects (usually mid-term – a few weeks – and not one-time exercise) to see what happens in the brain. Without boring you with the neurological details, it turns out that consistent athletic/sports activity of the type I described here improves our memory and general cognitive functioning – indeed, it can slow down or even hold back gradual dementia!

And please don’t say “it’s too late for me, I’ve been sedentary all my life”. It turns out that it is never too late to start exercising. Sure, you’re not going to win any Olympic medal at this stage of your life, but in the “race of life” you can certainly extend your own “finish line”! That’s quantitatively (lifespan) and qualitatively (brain and body functioning).

How much time does one have to “invest” in this? For medium exercise (e.g. fast walking – break a sweat, get the heart rate up just a bit), 150 minutes a week; for intense exercise (e.g. running, full court basketball), 75 minutes a week – for the intense option, that’s a mere 10-12 minutes a day!

Too much “work” for you to keep your brain in shape? Then consider this: in 2016, a large-scale study found that very active people were much less likely to develop thirteen different types of cancer than people who rarely moved! Indeed, an even more recent research study (American College of Sports Medicine) discovered that regular exercise can reduce the risk of developing some cancers by as much as 69 percent – and also might improve the outcome of cancer treatments, thereby extending cancer patients’ lifespan!! If you want to read more, see this: https://www.nytimes.com/2020/11/11/well/move/how-exercise-might-affect-immunity-to-lower-cancer-risk.html

Ah yes, one more thing to consider: exercise also aids in losing weight – another life extender. But that’s a topic for a different day. For now, I trust I’ve given you some food for thought with regard to the connection between exercise/sports and mental/bodily health.

Talking to Children

This time I’ll start with two seemingly similar vignettes.

1- Like many young couples immigrating to the U.S., my parents had a very hard time at first making ends meet. The question of working on shabbat was especially vexing in their case: on the one hand, the economic need was clear; on the other hand, my father grew up without any significant religious education but promised my mother that the household and family would be strictly Orthodox. It was only when I was an adult and after he passed away that my mother told me about their “solution”.

2- One day when I was ten years old, my father told me that mom was going away on a vacation to Florida for “a few weeks”. At the time, I thought it a bit strange, but 10-year-olds don’t ask too many questions (certainly not back then). Here again, it was only much, much later in life (well into my 50s) that I realized that she must have had a nervous breakdown.

What should parents tell their children? Or more to the point, what should they hold back from their progeny – if anything? Obviously, there’s a “when” question here too; you can’t tell a three-year-old what you could a teen who’s thirteen. But I’ll leave the “when” question in abeyance.

Regarding vignette #1, my parents decided that my father would open his lingerie store on shabbat – until I was three years old, and then stop Saturday openings. Why three years old? Because that’s the age when young children begin to “follow” what’s going on around them and start asking questions – in this case, something like “why can’t I go to synagogue with dad like my friends do?” Indeed, in the Jewish tradition, 3 is considered to be the age when tots turn into cognitive children and real “education” starts. It is also the age when many traditional Jews give their sons their first haircut (similar to a tree that starts providing fruit after three years).

As to vignette #2, my mother – still alive today [2020] at age 95 (until 120…) – had a difficult youth. Fleeing Germany by herself in 1939 at age 14 on a Nazi ship (!) going to Lisbon; a year later setting up with her mother the only kosher “pension” (bed & breakfast) for fleeing Jewish refugees, working there night and day; a year or two later moving to the island of Jamaica, where the British had set up a refugee camp; and then in 1944 (age 18) relocating once again, to Cuba – all that effectively eliminated any possibility of a carefree teenage life. There were other family tribulations that I won’t get into here, but in retrospect a nervous breakdown made sense – even if a decade and a half later. Amazingly, she returned quite “recuperated” and functioned fantastically for many decades thereafter.

Should I have been told at some later point about the lingerie store and shabbat? That would have defeated the very purpose of stopping the Saturday opening when I turned three. In addition, certainly until the teenage years there is little understanding of “home economics” matters, not to mention the hard tradeoffs that we are sometimes forced to make in our lives.

Should my parents have told me the real reason for my mother’s “vacation”? (My brother David was only six at the time – far too young for that sort of information.) Here my answer is different: I believe they should have, considering that they knew me as quite an emotionally stable and pretty intelligent kid.

One might ask: what good would it have done? Perhaps greater consideration on my part going forward regarding my mother’s emotional state (I was as much a “fashtunkener” teenager as most). Perhaps to understand more about her past specifically, and the Holocaust period in general. And perhaps even to be a greater helping hand around the house. In retrospect, one thing is clear: had I known, I would have been even prouder of my mother for all she accomplished despite her sensitive emotional state.

This is not to say that I feel any animosity whatsoever over the fact that my parents did not tell me the truth about either situation. In most cases, parents have the right to keep certain things close to the chest – especially with pre-teenage children.

However, one also has to take something else into consideration: whether the “secret” is hush-hush only for the child in question – and public for everyone else! That’s a situation that almost demands disclosure. Hearing about the “secret” from a kid’s parents, at a level appropriate for the child’s age, is infinitely better than stumbling on it through a relative’s (or worse – a stranger’s!) slip of the tongue. In the latter case, not only is the “shock” greater, but it could theoretically impact the child’s trust in her/his parents: “If they didn’t tell me about that, what other secrets (skeletons) are there in my family’s closet?”

In short, parents have to think carefully about what they tell their children; but they equally have to consider the downside of NOT relating important family matters.


Courage

At the start of the last month of my high school senior year (June 1967), our social science teacher Mr. Ya’acov Aronson walked into the classroom and made a startling announcement: “As you know, war has broken out in Israel and I feel it my duty to go there to do whatever I can to help. I’m sorry that I won’t be here to help you with the last-minute preparations for your NY State Regents exam, but sometimes we have to do what we have to do.”

Incredibly, the school’s principal – this was Yeshiva University High School, Manhattan – told him he would be fired if he left before the school year was over! Mr. Aronson stuck to his guns (pun intended); my classmates had our parents call the principal in protest, and the principal ultimately backed down from his threat.

When we think of the word “courage”, it usually means a sort of physical heroism: in war, or saving a child who has run into the street from an oncoming car. But these instances are relatively rare (except for soldiers). The courage needed in everyday life is of a different sort altogether. It entails going against the social stream, standing up for what one believes despite that being a distinctly minority opinion.

In my specialty field of expertise – mass communications – there’s a well-documented theory called “The Spiral of Silence”. It works like this: in a group, several of the “leading” individuals will voice an opinion; a few others will feel that they are in a minority, and therefore not voice a contrasting opinion. Most people in that group might have no opinion at first, but seeing/hearing only one opinion being voiced, they will gradually begin to believe in – and express – that same opinion, until the whole group becomes homogeneous on that topic. The “silence” of the minority eventually leads to what’s called “group-think”. This is no small matter; the first rendition of this theory was based on the majority’s silence in pre-Nazi Weimar Germany. As Sartre put it: “Every word has consequences. Every silence, too.”
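For readers who like to see a dynamic spelled out mechanically, here is a minimal toy sketch of that spiral – my own illustration, not anything drawn from the theory’s formal literature, and every number in it (group size, number of vocal leaders, conversion odds) is invented purely for demonstration:

```python
import random

def spiral_of_silence(group_size=50, vocal_leaders=3, rounds=20):
    """Toy model: only holders of the 'leading' opinion speak up;
    everyone else stays silent, and silent members gradually adopt
    the only opinion they ever hear voiced."""
    # 1 = holds (and voices) the leaders' opinion; 0 = silent/undecided
    opinions = [1] * vocal_leaders + [0] * (group_size - vocal_leaders)
    for _ in range(rounds):
        # share of the room heard voicing the "safe" opinion
        perceived_support = sum(opinions) / group_size
        for i, op in enumerate(opinions):
            # a silent member converts with probability equal to the
            # apparent unanimity of the room
            if op == 0 and random.random() < perceived_support:
                opinions[i] = 1
    return sum(opinions) / group_size

# Three vocal members out of fifty typically "convert" nearly the whole group.
print(f"Final share voicing the leaders' opinion: {spiral_of_silence():.0%}")
```

Run it a few times: starting from a 6% vocal minority, the group usually ends up close to unanimous – the silence itself does the convincing.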

Doing the opposite (speaking out) not only clears one’s conscience; it can also make such a person more popular – not with the “in-group” but rather with others who appreciate a “straight-shooter”. Indeed, in my other academic specialty area, political science, I have noticed that many popular elected leaders have policies that do not necessarily reflect their supporters’ interests, but these voters appreciate those leaders’ vocal honesty (or at least what passes for “honest talk”, e.g. Donald Trump in his 2016 campaign). In Israel we’ve had straight shooters like PM Yitzchak Rabin and Jerusalem Mayor Teddy Kollek, who never minced words; Kollek was reelected five (!) times and served for 28 years in one of the world’s toughest cities. The U.S. had Harry Truman and Ronald Reagan (not to mention “Honest Abe” way back when) – the first two not the brightest of leaders, but respected for telling it like it is.

Saying something (seemingly) unpopular takes gumption; acting on one’s beliefs is an even higher level of courage. To paraphrase: it’s important to put your legs where your mouth is – otherwise known as “talk the talk AND walk the walk.” Mr. Aronson was willing to do just that – at potentially great sacrifice (life, limb, and employment). That was a rare example of extreme civilian courage – a model of doing what one feels is the right thing to do and damn the consequences. (Interesting coda: more than a decade later I bumped into him when I started to teach at Bar-Ilan University in Israel; he had become the Head Librarian at my university!)

I will now offer a speculation (take it or leave it): Jews are culturally predisposed to such types of “courage”. First, the Jewish tradition is heavily steeped in argumentation, e.g. the Talmud is one gigantic compendium of disagreements between the rabbis – no spiral of silence there. However, even more germane is the fact that from the start, Judaism has been “oppositionist”. The biblical Prophets were paragons of this, railing against the Israelite kings to their royal faces. Simultaneously, Judaism fought tooth and nail against the ancient world’s polytheism; later, Jews stood steadfast in their denial of Christianity and Islam, despite their extreme minority position vis-à-vis both those major religions.

Indeed, one could take such speculation one step farther (this will be controversial): the State of Israel today is the epitome of “moral courage” – emphasizing national ethnicity over contemporary, western, “civic” statism; refusing to be labeled “colonial”, as Jews return to their historical homeland.

Be that as it may, moral courage is a universal phenomenon, albeit sadly all too rare. Social pressure (many times self-inflicted – we only imagine that others demand that we “toe their line”) is not easy to overcome. When is it easier? When we have an internal compass, some deeply held belief or opinion (hopefully based on fact). Armed with that, we can more easily defend ourselves psychologically when the counterattack is launched.

Courage, then, is somewhat paradoxical: in order to be more popular with ourself, we might have to suffer some modicum of unpopularity from others. Is it worth that? I believe so: every time we look in the mirror, we see ourself and not others. It’s far more important to live with (and up to the standards of) that face than to face the opprobrium of those who think that you don’t know what’s right and they do.

Life Directions: Inner or Outer

Early in my last undergraduate year I had to decide “what to do with my life”. Clearly, this entailed graduate work, but in what? My inclination (and cognitive skills) was to go for either a law degree or a PhD, i.e. either practice law or teach political science (my B.A. field). As the title of this set of essays is “Prof. Sam”, you already know what I chose. But why? And what can one learn from my example?

Given that my decision was going to be truly a momentous one for the rest of my life, I decided to consult with a few of my professors. They tended not to convince me one way or the other, but rather to present me with the “bottom line” (literally and figuratively) of each choice. It was clear from my grades that I most probably would get into an elite university, e.g. Harvard Law School or Harvard University Graduate School. Their advice boiled down to this in a nutshell: if I chose the former, I would probably be making $500,000 after five years of practicing law (this in the mid-1970s!); if I went for a PhD, I’d be making $50,000 around the same time.

Why in the world, then, would I even consider graduate school? Because my professors’ advice came with another “prediction” or caveat: practicing law meant 12-hour work days, usually 6 days a week; academic life, while intensive in its own way (“publish or perish”), involved almost total freedom to decide when, what, and how much to work. So I decided to take a “pay cut” of $450,000 and become a teacher/scholar.

I never learned how to play any musical instrument well, but from the start of my life I marched to the beat of my own drum. Only decades later did I discover that there is a psychological term for this: “inner-directed”. This is one of the hardest things for most people to learn, if they were not so inclined from birth. The reason is simple, although the phenomenon itself is complex: humans are social animals. From the earliest era of Homo sapiens (and even before, with Homo erectus), we lived in packs – what today would be called an “extended family” or, as they say in the Middle East, a “hamula”. This ensured our personal survival back then, and we haven’t changed much in that regard since, despite huge advances in technology and the expansion of our social structures.

Thus, as we go through life, we are constantly on the lookout for what others are doing, what is expected of us, how we can “fit in”. Not fitting in can be emotionally painful, something most people will avoid at almost all costs (see Asch’s classic “Conformity Experiment”: https://www.simplypsychology.org/asch-conformity.html). Over time, this becomes second nature – not only in our behavior, but in our values, beliefs and norms. To take my case as an example, there were two norms I had to overcome: America’s Protestant Ethic of “money = success” and the Jewish-American “my son the lawyer”.

Obviously, trying to shake off such “outer direction” comes with a price – sometimes economic, sometimes psychological, many times both. None of this is to say that being outer-directed (i.e. taking one’s cues from others) is in any way “wrong”. Indeed, if everyone did what popped into their head, or even thought carefully about doing the opposite of what society expects just to be different, we would be living in the jungle (which might be unfair to jungle animals, which mostly do cooperate with each other). Accepting most societal norms is healthy for the individual and certainly for society writ large. However, when this becomes blindly following the crowd, or when going against a social norm is not harmful to the group, then inner-directedness is called for.

One could even make the argument that the inner direction of individuals is useful for the group as a whole. For instance – and to be clear, this is not me – many of humanity’s greatest social and scientific advances were made by highly inner-directed thinkers and doers. Think Martin Luther, who publicly railed against the greatest power in late medieval Europe, the Catholic Church; or his namesake Martin Luther King, who stood up against deeply ingrained southern racism in America (perhaps his name provided him with the necessary inner-directed gumption to take on the entire southern caste system?). We definitely need occasional “mavericks” to move society forward.

In our own small, personal way, though, each and every one of us has a niche area where something really counts for us, despite society’s looking askance. Doing what we really want to do might cost us money or even a friend or two, but that’s really a small price to pay for being able to smile when we look in the mirror every morning – not to mention the satisfaction of accomplishing something “abnormal” that we’ve always wanted to try. Sure, it takes some courage (the topic of next week’s post). But that word has an interesting etymology: its root is cor – Latin for “heart” (French speakers know it as coeur). In its early form it meant “to speak one’s mind by telling all that’s in one’s heart.” You don’t have to speak this to others. To get started, it’s simply enough to tell yourself what your real inner desires or needs are, and then act on them.

Informaddiction

I don’t recall all that much of my youth (although it was generally pleasant), but one “regular” event sticks in my mind. Every Friday night – after our traditional erev-Shabbat dinner with all the Orthodox accoutrements (Kiddush, Blessing the Bread, singing Shabbat songs, Grace After Meals), we would all “retire” to the living room (right next to the dining room). My brother David and I would play some board game – and my parents would sit in “their” lounge chairs, intently (and contentedly) reading the newspaper.

I am an addict. Not to drugs, nor to sex. It’s a curious and mostly harmless form of addiction, one that I’m pretty sure affects other people. But I can’t say how many; no one has done any research on the matter. In any case, I’m addicted to information.

That’s not only what’s called “news”, or as we put it in the academic world of mass communications, “hard news”.[1] Rather, I constantly seek dopamine gratification from learning about something new: a scientific discovery or idea, a philosophical argument, a social phenomenon, a historical analysis or finding – you name it, and as long as it has some intrinsic worth or even surprise, I’m ready to absorb it.

Of course, I don’t spend all my waking hours “sponging” for new information – just most of the time. My purpose? Much as a person might take an “upper” in order to boost performance at work, study, or even sports, so too these info-bites (occasionally full-size meals) serve me as food for thought and action – in my case, researching, writing, and “getting through the vagaries of life”.

You might be asking: how is this an “addiction”? In my case, we can start with my post-breakfast routine: several newsletters filling my email inbox from highbrow and middlebrow intellectual, political, and scientific sites. I spend at least an hour omnivorously consuming such brain food before turning to “my life”. Once chores are out of the way, it’s back to scanning the online horizon for more mind nutrition. Indeed, other than quality time with my wife Tami or playing a game of basketball with my hoop buddies, there’s not much out there I would rather do than read the latest… whatever (of intellectual novelty or practical use). I literally have to drag myself away from a book or computer screen to see a good movie (for me, “good” almost always meaning “thought-provoking”); the latest David Brooks opinion piece will attract me far more than some juicy “news” about this or that politician’s goings-on. Overall, each day I will spend hours surfing the Net to find interesting (to me) reviews of “challenging” books (in the double meaning of the term: going against conventional wisdom; complex in substance) – and from there (occasionally) hitting my Amazon button to download the tome itself.

Another sign of “addiction”: what can be called multinforming – ingesting information while doing something else. Some examples: driving my car while listening to the radio news or a lecture; ditto (the latter) in the fitness gym (doing weights or on the treadmill); morning walks in the park while viewing a lecture series on Oceanography or The Evolution of Birds on my smartphone. Even “worse”: speeding up these lectures (usually to 1.5 or 1.75 speed) to be able to finish two 30-minute lectures during my 40-minute park walk. In my life, there is no “wasted time”: the radio news is my best friend while washing the dishes and taking a shower; mini-articles on my iPhone are consumed at the supermarket checkout line before reaching the cash register.

By coincidence, in the middle of my penning these lines, the magazine Scientific American just reported on medical research that shows doing exercise AND simultaneously “exercising” the mind does more for our brain than simple aerobic exercise (e.g. running on a treadmill). So, if you’re going to jog, do it outside (the brain has to keep track of the terrain) and if you can handle it, listen to a lecture or do some other mentally challenging work.

Back to the overall issue with my next question: is such overall “addiction” unusual? Probably yes and no. Let’s start with the “no”. All children are born with an insatiable hunger to learn. They are literally information sponges (being cute doesn’t hurt either – it magnetizes others to feed them with constant stimuli). Their curiosity is boundless; anything and everything is fair game to learn and understand. And even if there is little stimulus to be had at the moment, they have two other tactics: first, use their curiosity to grab things and figure out what they are and how they work; second, if a human is around, ask “why?” – again and again and again and… “Enough already! Go play with your sister…”.

When does such info-sponging become not normal (as opposed to “abnormal”), i.e. when does the answer become “yes, it’s unusual”? Gradually, as we become older and “life takes over”. An older friend of ours once served on the New York City Board of Education. When we told her that our oldest son was about to enter first grade, we were taken aback by her response: “Too bad; school will spoil his curiosity.”

Some of this is inevitable. After all, not every child will want to know the basics of physics or how to calculate the radius of a circle, but things like that need to be learned in order to function in our increasingly complex world. The bigger problem is that most educational systems are still more oriented to rote-learning than to teaching through figure-it-out-for-yourself education.

And then, of course, there’s life: making a living, raising a family, and the like, which take up most of our time and energy after leaving the educational system. In short, if school didn’t kill our kids’ curiosity and thirst for learning, life tends to do the job just as effectively.

And yet, this isn’t the whole explanation. We do seek out “information”, just of a different type: celebrity goings-on, political machinations (some important; others far less so), cute cat videos – you get the (YouTube/Instagram/TikTok) picture. Of course, there’s nothing wrong with harmless fun; the question is one of degree. If a person’s life is taken up mostly (or completely) by fun and games – “bread and circus” as the Romans put it – then the important information simply collapses under the weight of the weightless.

Can the opposite be true too? If over my lifetime I have spent about 90% of my free (i.e. non-working/eating/sleeping/parenting/hygiene) time sponging “hard” information and news, could that too be considered an imbalanced life? Perhaps. But that sort of depends, among other things, on the nature of the information and to what use the 90% is put: to better one’s health? professional expertise? parental and social capability? civic action? Or simply to quench the information craving, however esoteric and useless it might be? As with eating culinary food, so too with ingesting food for thought there’s a difference between gourmand and glutton.

I want to believe that I have found some semblance of balance here: a good part of my self-education is admittedly a function of what catches my intellectual fancy at the moment. But a not inconsiderable amount is purposeful – or as we academics call it: utilitarian. I read lots about medicine that helps keep me very healthy and fit; about evolutionary biology that (perhaps surprisingly) is a boon in being a better communicator with my fellow bipedal primates (also called humans); about economics, obviously helpful in investing and otherwise keeping my bank balance in the black; and so on. Even strange esoterica can be useful if one knows how to dole it out (in small amounts) during social get-togethers.

Does this make me a better person? Not at all. There’s no correlation – let alone causation – between intellectual curiosity (or even brilliance) and social-mindedness, good-heartedness, or any other definition of what a “good” person is. However – admittedly one can argue with me on this – “informaddiction” properly activated can lead to the “good life” in Platonic terms. If the term “mind your own business” is familiar to all of us, I believe that “your business should be mind” is even more apt.

To be sure, not everyone has the capability for this sort of life predilection. And many people who have a relatively high IQ might still prefer to live the entertained life rather than one of sustained self-education. If that makes them happy, fine with me (and hopefully them). To a certain extent, this type of addiction is environmentally and culturally learned: a home with shelves of books; growing up with dinner conversations about the wonders of the world; stimulating teachers. Nevertheless, it seems to me that in the final analysis, to put it simply, simplistically, and also truthfully: you are what you are.

In my case, if that makes me an egghead, so be it. The egg preceded the chicken by about 150,000,000 years (see: esoterica can be interesting!), so I figure that I have a pretty good head start in continuing the advance of Homo sapiens sapiens – and even (if I and/or they are lucky) resurrecting others’ curiosity.

[1] From the standpoint of journalism, this is what I (along with my co-author Michal Seletzky) called “general news” – not yesterday’s “political” event or economic datum (hard news), nor soft news items found in the middle and back pages of even the most erudite news institutions: food, travel, sports, culture, gossip, and the like.

Freedom of Choice? Beyond Nature & Nurture

As I entered my senior year in college (CCNY), I had to start seriously thinking about what I was going to do with my life “when I grew up”. With an almost straight “A” average in my college grades, I understood that I could get into almost any graduate school of my choice – but in what? For the first time in my life, I had to “introspect” – not something easy to do for a 21-year-old “whippersnapper” (for those too young to have heard this term, it means a young and inexperienced person considered to be presumptuous or overconfident). I had a good mind, and an even “better” mouth – indeed, my cousin babysitters, Ruthie and Naomi, used to say about me when I toddled into the room: “here comes the mouth”!

After a while and some “consultation” with a few of my college professors, I whittled down my choices to two relevant (for me) possibilities: Law School or Graduate School (for a PhD). With my grades, I was almost a sure-bet to get into Harvard Law and Harvard Graduate School for Arts & Sciences (GSAS). Which was it to be?

The considerations were pretty straightforward (I am only slightly exaggerating here): if I go to Harvard Law, within 5 years of graduating I would probably be earning $500,000 a year (and that’s back in the mid-1970s!). If I attend Harvard GSAS, then I would be earning about $50,000 annually. A no-brainer? Not exactly. A high-powered career in Law meant (still means) that I would be working close to 24/7/365: oodles of money and no personal, free time. A career as a professor meant far less money but a lot more freedom to do what I want professionally, when I want, and how I want.

“Prof. Sam” provides the clue to my ultimate choice. And I have never regretted that decision.

One of the most hotly debated issues in academia these days is the Nature/Nurture divide. Simply put: in the way we behave and think, are humans mostly/completely a product of our biological-genetic makeup, or mainly influenced by our environment (social and physical), e.g. parental upbringing, education, societal norms, weather, etc.? I do not intend here to dive into this very thorny controversy, but will only remark that the latest scientific research (e.g. epigenetics) clearly shows the interaction between the two.

However, there is a third factor here that is not given much attention: personal choice, otherwise known as “free will”. That too is a highly fraught term in contemporary scholarship, with serious arguments – philosophical and neuro-scientific – on both sides of the issue. Some argue that our decisions are a deterministic product of all the internal and external forces that impinge on us. For instance, why do I choose to eat a banana right now? First, because I am hungry (internal pressure); second, because I read somewhere (external) that bananas have potassium which I need after playing an hour or so of intensive basketball. Others claim that “we” don’t really have free will because it turns out (incontrovertible empirical research) that our brains make a decision (for us?) a split second before we are aware that we have decided!

Our common sense understanding of free will, however, accepts that at the extreme margins we do not have “free will”. We are all aware that we can’t decide to have our bodies fly through the air or see through walls; without the necessary wealth, most of us can’t simply decide to take a round-the-world cruise over the next two years; and so on. Our internal, physical makeup and our external, social environment put quite a lot of restrictions on our “free will”. We live with that because “that’s the way it is”.

Between those polar extremes, though, we do feel that we can make choices large and small – even if they are in some loose way influenced by other life factors. After all, if instead of educating us our parents had put us in an isolation cage for eighteen years with almost no external stimuli, our choices in life would be far more circumscribed (no language, no education, no exercise, etc.). By being “out in the world”, we are pushed and pulled by an almost infinite number of “influences”. And yet, we are not a planet stuck in orbit around a sun since time immemorial; we can determine our personal life orbit to some – even a large – extent.

However – and here’s the key point – in order to make choices based on some measure of free will, we have to be aware of the “deterministic” factors around and within us that might cause us to act more like a gravity-captured planet rather than individuals with freedom of choice. In short, it is not so much political dictators that prevent us from acting freely; it is our own lack of self-cognizance regarding what is pushing and pulling us down a “pre-determined” path.

This involves several things:

First, as Nobel-prize winner Daniel Kahneman (and his research partner Amos Tversky, who died before the Nobel was awarded) has shown, our cognitive apparatus (aka: “brain”) is full of traps and obstacles to clear thinking. We leap (in this case, “jump” is too mild a word) to conclusions without sufficient evidence or logical thought. Not only the “uneducated”; professors and researchers/scientists are almost as guilty of this – especially regarding almost any topic not within their field of expertise. (By the way, it is not expert knowledge per se that keeps them from falling into mental traps within their own expertise; rather, in such fields they have been trained to weigh the evidence in “reasonable” fashion. Unfortunately, such thinking patterns are not easily transferable to other topics.)

Second, we are all influenced in some way by social norms: some blatant, others subtle. Blatant: in theory, there is no reason why we couldn’t all walk around without clothes on (when it’s warm enough), but we don’t because society clearly does not approve – which is why “nudist colonies” are almost always found in remote regions. Subtle: in America, we expect our conversation partner to stand about a yard/meter away from us; in Latin countries, that’s considered “distant” (literally and figuratively). No one in any of these countries ever thinks about the norm of “personal space” until meeting another culture where the norm is different. But in our own society we all act (more or less) in accordance with that norm. I won’t repeat here what I have already noted in a previous “Prof. Sam” essay (“Connections”) regarding the great impact that living within a social environment has upon us.

Third, if societal norms reflect the macro-situation, then our family and close friends also have expectations regarding our behavior. Indeed, as Muzafer Sherif’s famous 1950s “color war” experiment with pre-teenagers found, it doesn’t even have to be someone we are close to – it’s enough to feel “part of the group”. We are all social animals, having evolved eons ago in groups of around 50 people – so when we’re in a group of any “small” size, we will quickly adhere to what’s “expected of us” in order to “survive” (other researchers have since found that the maximum number of people we can be truly friendly with, or close to, is around 150). To take but one kind of example, movies such as “My Big Fat Greek Wedding” and “The Big Sick” seriously (and hilariously) show how family pressure can significantly restrict our ability to make major life decisions for ourselves.

Luckily for me, my mother was not the “my son the lawyer” type. Would my decision to pursue a career in academia have been different had she been that sort of mom? Who knows? But had I had to make that decision thirty years earlier (in the 1930s), when antisemitism ran rampant in academia (not that the legal profession back then was a bastion of tolerance), I probably would not have chosen this career.

Fourth and finally, the matter of free will and our conscious decision-making process is greatly complicated by a phenomenon that neuroscientists have only recently become aware of, one that I mentioned above: it is not clear who/what is the “I” making the decision! Before I continue, apologies for the semantic confusion this might cause; our language has not caught up with the latest research. I will use the first person here to make things easier.

When I am faced with a decision of any sort, I eventually decide what to do. But it turns out that my brain makes the decision about four-tenths of a second before “I” (consciously) do! Of course, my brain “belongs” to me, but still there is a difference between “me” deciding and my brain deciding for me before I am aware of it. How does this affect our concept of “free will”? You decide (pun intended): either our free will stays intact (I am my brain, so in fact I made that decision), or it becomes a slippery concept (I was not aware that my brain was “deciding for me”).

The bottom line: we are always “free” in theory to make our own decisions, but the degree of such freedom is heavily dependent on the number and power of the cognitive and social obstacles we have to overcome to make a personal choice – not to mention our understanding of what/who exactly “we” are when deciding something. Greater personal freedom, then, is not only a matter of “freeing” ourselves but also (perhaps primarily?) of reducing society’s strictures and expectations regarding what each of us should be doing and deciding, as well as being aware of subconscious processes deep in the recesses of “our” mind.

Two Bads Can Be G😃😃d

After several years of marriage, it became clear that Tami and I were having problems getting pregnant. After both of us underwent all sorts of tests (and subsequent “procedures”), the doctors asked us to query our mothers about a drug called DES that decades earlier was given to pregnant women. Amazingly, both my mother and Tami’s mom had been given DES in the very late 1940s – the cause of our mutual infertility.

In math, we are all taught that multiplying two negative numbers renders a positive. But in real life, it turns out that adding two negatives can also end up as a “positive”.

A couple’s infertility has the potential to be a marriage-breaker – for two reasons. First, it demands of the couple some soul-searching and heavy decision-making: Do they go childless? Do they try to adopt – and if so, how, and whom? Or perhaps surrogacy?

These are very difficult choices, each with substantial advantages and downsides. Childless through life? Lots of freedom and secure finances, but with a “hole” in the family unit, not to mention serious familial loneliness in old age. Adopt through an agency? Not that expensive, but not much choice regarding the type of child, unless you are willing to forgo a baby for a somewhat older kid. A legal, private adoption? More control over what you are getting – but frightfully expensive (lawyers’ fees, the biological mother’s medical costs, etc.). Hiring a surrogate and/or using a donated ovum or a sperm bank (depending on who is infertile)? Cost and/or parentage issues. In short, any one of these questions can lead to a serious rift within a married couple.

By far the worst issue, though, is the “blame game”: who is the infertile one? Whether husband or wife, emotions can run riot. For the infertile spouse, a major blow to self-esteem and perhaps jealousy of the “healthy” partner. The fertile spouse faces a tough, almost Solomonic choice: continue the marriage at the cost of never having biological progeny, or sacrifice a marriage partner for “genetic continuity” (if not biological immortality). In short, minus one added to plus one = a huge negative.

But if both spouses are negative, the equation pretty much straightens itself out! Neither is jealous of the other. Surrogacy is out (except for sperm and ovum donations). Only the quandary of childless freedom vs. (type of) adoption remains as a tough decision. In our case, we quickly agreed on adoption (although the process for each of our two sons was wildly different).

While not at the same level of “severity”, another ostensible double whammy had no less an impact on my life. Indeed, I considered it at the time to be so “horrendous” that it occasioned the only time in my life I was really furious at my mother. In the 8th grade of my Jewish Day School, we had the option of taking the entrance exam for the Bronx High School of Science – the highest-ranked and most well-regarded high school in all of New York City. My mom allowed me to take the test, which I passed – and then she wouldn’t let me go! “You need to continue getting a good Jewish education,” and that was that.

Anger and frustration hardly begin to describe my feelings back then – for two reasons (the “double whammy”). First, what kid would not want to be in such an elite high school? I always had a keen interest in science and was pretty good at math. Second (the other side of the coin), eight years of Jewish education was quite enough for me; what was there still to learn? (Young teenagers are not known for their “wisdom”; they would more correctly be called “wise-dumb”!) The thought of four more years studying Talmud and Hebrew (not to mention it being an all-boys school!) was not something I looked forward to.

In retrospect, my mother’s decision changed my life in unintended, positive ways. Of course, one can never know “what if” I had gone to Bronx Science; but here is what did happen at Yeshiva University High School in Manhattan. First and foremost, I got a solid Jewish & Hebrew education that enabled me later on to offer a tentative “yes” to Tami’s “ridiculous” demand before she would go on a second date with me: would I consider making “aliyah” (moving to Israel)? That education also formed part of my secondary research agenda later in my academic career: writing on the Jewish Political Tradition.

Second – and this might seem a rather minor outcome, but in my eyes it was of major importance down the road (something I already alluded to in my previous post, entitled Fa(r)ther) – I was able to play on my high school’s varsity basketball team. (I most probably wasn’t good enough to make the Bronx Science team – who says “nerds” can’t be athletes? – and anyway, they played many games on Saturday, my shabbat.) Over 50 years later, at age 71 (pre-Corona), I continue to play intensive hoops twice a week with some guys around half my age – and I’m one of the more energetic players among them.

What’s the big deal? As I have already mentioned, my father died at 57 from heart failure; similarly, his sister and brother passed away when only somewhat older than him. I have been aware of this “genetic threat” since my twenties, so exercise became for me a potential lifesaver. However, as many exercise wannabes know only too well, if it’s drudgery you won’t stick to it. For me, basketball was always FUN – easy to stick with. Moreover, there was a secondary element to this: as I got older, into my 40s and 50s, it became clear that in order to run up and down the court for an hour and a half, I had to stay relatively thin – thus impacting my eating habits: healthy and minimalistic. People constantly tell me, “Of course you’re thin – you play basketball.” They have the cause and effect backwards: in fact, I stay thin in order to play basketball! In short, what for me back then was my mother’s double “terrible” decision (letting me take the entrance exam – and then not allowing me to go to that school) ultimately turned out really well for all concerned.

These were my double “bads” that flipped into big positives – the first averting marital disaster, the second setting me on a life path that at the time (as a 14-year-old whippersnapper!) I viewed as “calamitous” but that turned out far better than I could have imagined. Obviously, there are many other life situations where a double whammy can ultimately prove quite beneficial. To take a common one: I am sure there are numerous men and women out there who become unemployed and/or get divorced, and as a result move to a new city – only to find the true love of their life or get hired for their dream job.

The lesson is universal: the paths of life are never linear; what seems negative at first – especially when it comes in double doses – can hold profound (and unexpected) positive consequences. Does this mean that everything bad ends up good? Of course not! But it does suggest that whatever knocks we encounter in life, it pays to maintain a long-term perspective. What we feel at the time they occur might not at all be how we view them in hindsight decades later. And sometimes, despair twice over, shared with another person close to us, can have its own mutual, positively reinforcing benefits.