Body and Brain

As far back as I can recall, I loved (and relatively excelled at) athletics. I have kept this up through the years: tennis (until my shoulder started aching in my 50s), and basketball still today (in my early 70s). As all athletes know, sports activity is a mood enhancer – you feel great after working up a sweat. But who would have thought that intensive physical activity could also be a brain-intellect enhancer? Well, recent research has discovered some very interesting things about the body-brain connection…

We can start at the very beginning. Eons ago, our forefathers and foremothers did lots of walking, running, and other physical activities just to stay alive – mostly in the hunt for meat. (Yes, women too; it was recently reported that female bones were found buried with full hunting paraphernalia.) So it stands to reason that the human body would evolve in such a way as to prioritize the ability to move quickly and over long distances – the better “proto-athletes” had a better chance of survival, and thus more progeny.

If hominids have been around for millions of years, and homo sapiens for one or two hundred thousand, then “modern” humans – the sedentary ones sitting in an office all day or, worse, “couch potatoing” in front of a screen of one sort or another – are not doing what our bodies were designed for.

However, it turns out that not all types of “exercise” are equal. Let’s take a simple comparison: running on a treadmill vs. jogging in the park. The former is excellent for keeping your body in aerobic shape – good for your heart, arteries, and some leg and arm muscles. But the brain? Little benefit there. On the other hand (or leg), running outdoors is different – especially for the brain. The reason is simple. On a treadmill, there isn’t much thinking we have to do; it becomes almost automatic – one leg in front of the other, on and on…. But outside? We have to keep track of many things: not tripping on some hole or object on the ground; where we are going (to avoid getting lost, or how to get back to where we started); not bumping into other people or a pole. We are also receiving far more stimuli: birdsong, animals scooting around, people doing interesting things, flowers blooming, etc.

In short, when we’re running outside, we are basically replicating the experience our fore-parents underwent – an experience that kept not only their bodies fit, but their brains as well. Indeed, to continue the “evolutionary” description I mentioned earlier, those who exercised their minds in the long hunt were also evolving in a positive, cognitive direction – and not just improving their physical capabilities.

What about sports? Is the above description also germane to competitive sports? Is the sky blue?? If anything, competitive sports are even more brain-enhancing than running outdoors. Just think (pun intended) of everything a sport demands of the player: cooperating with teammates (in group sports); following the direction of a ball and coordinating the body with its movement (catching it, hitting it, kicking it, etc.); planning the next tactical move within a broader strategy – all this while running around and not being certain of the opponent’s next counter-move!

Many athletes are vaguely aware that competitive sports are not merely a physical activity but demand mental exercise as well. However, until very recently, no one was able to show that such physical activity has a highly positive effect on our brain in the long run. Now researchers have begun to conduct controlled experiments with people, using MRI brain scans to test “before” and “after” effects (usually over the mid-term – a few weeks – and not a one-time exercise session) to see what happens in the brain. Without boring you with the neurological details, it turns out that consistent athletic/sports activity of the type I described here improves our memory and general cognitive functioning – indeed, it can slow down or even stave off gradual dementia!

And please don’t say “it’s too late for me, I’ve been sedentary all my life”. It turns out that it is never too late to start exercising. Sure, you’re not going to win any Olympic medal at this stage of your life, but in the “race of life” you can certainly extend your own “finish line”! That holds both quantitatively (lifespan) and qualitatively (brain and body functioning).

How much time does one have to “invest” in this? For medium exercise (e.g. fast walking – break a sweat, get the heart rate up just a bit), 150 minutes a week; for intense exercise (e.g. running, full-court basketball), 75 minutes a week – a mere 10-12 minutes a day!

Too much “work” for you to keep your brain in shape? Then consider this: in 2016, a large-scale study found that very active people were much less likely to develop thirteen different types of cancer than people who rarely moved! Indeed, an even more recent study (American College of Sports Medicine) found that regular exercise can reduce the risk of developing some cancers by as much as 69 percent – and might also improve the outcome of cancer treatments, thereby extending cancer patients’ lifespan!! If you want to read more, see this: https://www.nytimes.com/2020/11/11/well/move/how-exercise-might-affect-immunity-to-lower-cancer-risk.html

Ah yes, one more thing to consider: exercise also aids in losing weight – another life extender. But that’s a topic for a different day. For now, I trust I’ve given you some food for thought with regard to the connection between exercise/sports and mental/bodily health.

Talking to Children

This time I’ll start with two seemingly similar vignettes.

1- Like many young couples immigrating to the U.S., my parents had a very hard time at first making ends meet. The question of working on shabbat was especially vexing in their case: on the one hand, the economic need was clear; on the other hand, my father had grown up without any significant religious education but had promised my mother that the household and family would be strictly Orthodox. It was only when I was an adult, after he had passed away, that my mother told me about their “solution”.

2- One day when I was ten years old, my father told me that mom was going away on a vacation to Florida for “a few weeks”. At the time, I thought it a bit strange, but 10-year-olds don’t ask too many questions (certainly not back then). Here again, it was only much, much later in life (well into my 50s) that I realized that she must have had a nervous breakdown.

What should parents tell their children? Or more to the point, what should they hold back from their progeny – if anything? Obviously, there’s a “when” question here too; you can’t tell a three-year-old what you could tell a thirteen-year-old. But I’ll leave the “when” question in abeyance.

Regarding vignette #1: my parents decided that my father would open his lingerie store on shabbat – until I was three years old, and then stop the Saturday openings. Why three years old? Because that’s the age when young children begin to “follow” what’s going on around them and start asking questions – in this case, something like “why can’t I go to synagogue with dad like my friends do?” Indeed, in the Jewish tradition, 3 is considered to be the age when tots turn into cognitive children and real “education” starts. It is also the age when many traditional Jews give their sons their first haircut (similar to a tree that starts providing fruit after three years).

As to vignette #2, my mother – still alive today [2020] at age 95 (until 120…) – had a difficult youth. Fleeing Germany by herself in 1939, at age 14, on a Nazi ship (!) going to Lisbon; a year later setting up with her mother the only kosher “pension” (bed & breakfast) for fleeing Jewish refugees, working there night and day; a year or two later moving to the island of Jamaica, where the British had set up a refugee camp; and then in 1944 (age 18) relocating once again, to Cuba – all that effectively eliminated any possibility of a carefree teenage life. There were other family tribulations that I won’t get into here, but in retrospect a nervous breakdown made sense – even if a decade and a half later. Amazingly, she returned quite “recuperated” and functioned fantastically for many decades thereafter.

Should I have been told at some later point about the lingerie store and shabbat? That would have defeated the very purpose of stopping the Saturday opening when I turned three. In addition, certainly until the teenage years there is little understanding of “home economics” matters, not to mention the hard tradeoffs that we are sometimes forced to make in life.

Should my parents have told me the real reason for my mother’s “vacation”? (My brother David was only six at the time – far too young for that sort of information.) Here my answer is different: I believe they should have, considering that they knew me as quite an emotionally stable and pretty intelligent kid.

One might ask: what good would it have done? Perhaps greater consideration on my part going forward regarding my mother’s emotional state (I was as much a “fashtunkener” teenager as most). Perhaps to understand more about her past specifically, and the Holocaust period in general. And even perhaps to be a greater helping hand around the house. In retrospect, one thing is clear: had I known, I would have been even prouder of my mother for all she accomplished despite her sensitive emotional state.

This is not to say that I feel any animosity whatsoever over the fact that my parents did not tell me the truth about either situation. In most cases, parents have the right to keep certain things close to their chest – especially with pre-teenage children.

However, one has to also take something else into consideration: whether the “secret” is only hush-hush for the child in question – and public for everyone else! That’s a situation that almost demands disclosure. Hearing about the “secret” from a kid’s parents, at a level appropriate for the child’s age, is infinitely better than stumbling on it through a relative’s (or worse – a stranger’s!) slip of the tongue. In the latter case, not only is the “shock” greater, but it could theoretically impact the child’s trust in her/his parents: “If they didn’t tell me about that, what other secrets (skeletons) are there in my family’s closet?”

In short, parents have to think carefully about what they tell their children; but they equally have to consider the downside of NOT relating important family matters.

 

Courage

At the start of the last month of my high school senior year (June 1967), our social science teacher Mr. Ya’acov Aronson walked into the classroom and made a startling announcement: “As you know, war has broken out in Israel and I feel it my duty to go there to do whatever I can to help. I’m sorry that I won’t be here to help you with the last-minute preparations for your NY State Regents exam, but sometimes we have to do what we have to do.”

Incredibly, the school’s principal – this was Yeshiva University High School, Manhattan – told him he would be fired if he left before the school year was over! Mr. Aronson stuck to his guns (pun intended); my classmates had our parents call the principal in protest, and the principal ultimately backed down from his threat.

When we think of the word “courage”, it usually connotes a sort of physical heroism: in war, or saving a baby who has wandered into the street from an oncoming car. But such instances are relatively rare (except for soldiers). The courage needed in everyday life is of a different sort altogether: going against the social stream, standing up for what one believes even when that is a distinctly minority opinion.

In my specialty field of expertise – mass communications – there’s a well-documented theory called “The Spiral of Silence”. It works like this: in a group, several of the “leading” individuals will voice an opinion; a few others will feel that they are in the minority, and therefore not voice a contrasting opinion. Most people in that group might have no opinion at first but, seeing/hearing only one opinion being voiced, they will gradually begin to believe in – and express – that same opinion, until the whole group becomes homogeneous on that topic. The “silence” of the minority eventually leads to what’s called “group-think”. This is no small matter; the first rendition of this theory was based on the majority’s silence in pre-Nazi, Weimar Germany. As Sartre put it: “Every word has consequences. Every silence, too.”

Doing the opposite (speaking out) has the beneficial effect not only of clearing one’s conscience but can also make such a person more popular – not with the “in-group” but rather with others who appreciate a “straight-shooter”. Indeed, in my other academic specialty area, political science, I have noticed that many popular elected leaders have policies that do not necessarily reflect their supporters’ interests, but these voters appreciate such leaders’ vocal honesty (or at least what passes for “honest talk”, e.g. Donald Trump in his 2016 campaign). In Israel we’ve had straight shooters like PM Yitzchak Rabin and Jerusalem Mayor Teddy Kollek, who never minced words; Kollek was reelected five (!) times and served for 28 years in one of the world’s toughest cities. The U.S. had Harry Truman and Ronald Reagan (not to mention “Honest Abe” way back when) – neither among the brightest of leaders, but both respected for telling it like it is.

Saying something (seemingly) unpopular takes gumption; acting on one’s beliefs is an even higher level of courage. To paraphrase: it’s important to put your legs where your mouth is – otherwise known as “talk the talk AND walk the walk.” Mr. Aronson was willing to do just that – at potentially great sacrifice (life, limb, and employment). That was a rare example of extreme civilian courage – a model of doing what one feels is the right thing to do and damn the consequences. (Interesting coda: more than a decade later I bumped into him when I started to teach at Bar-Ilan University in Israel; he had become the Head Librarian at my university!)

I will now offer a speculation (take it or leave it): Jews are culturally predisposed to such types of “courage”. First, the Jewish tradition is heavily steeped in argumentation, e.g. the Talmud is one gigantic compendium of disagreements between the rabbis – no spiral of silence there. However, even more germane is the fact that from the start, Judaism has been “oppositionist”. The biblical Prophets were paragons of this, railing against the Israelite kings to their royal faces. Simultaneously, Judaism fought tooth and nail against the ancient world’s polytheism; later, Jews stood steadfast in their denial of Christianity and Islam, despite their extreme minority position vis-à-vis both those major religions.

Indeed, one could take such speculation one step farther (this will be controversial): the State of Israel today is the epitome of “moral courage” – emphasizing national ethnicity over contemporary, western, “civic” statism; refusing to be labeled “colonial”, as Jews return to their historical homeland.

Be that as it may, moral courage is a universal phenomenon, albeit sadly all too rare. Social pressure (many times self-inflicted – we only imagine that others demand that we “toe their line”) is not easy to overcome. When is it easier? When we have an internal compass, some deeply held belief or opinion (hopefully based on fact). Armed with that, we can more easily defend ourselves psychologically when the counterattack is launched.

Courage, then, is somewhat paradoxical: in order to be more popular with ourselves, we might have to suffer some modicum of unpopularity with others. Is it worth it? I believe so: every time we look in the mirror, we see ourselves, not others. It’s far more important to live with (and up to the standards of) that face than to face the opprobrium of those who think that you don’t know what’s right and they do.

Life Directions: Inner or Outer

Early in my last undergraduate year I had to decide “what to do with my life”. Clearly, this entailed graduate work, but in what? My inclination (and cognitive skills) pointed to either a law degree or a PhD, i.e. either practicing law or teaching political science (my B.A. field). As the title of this set of essays is “Prof. Sam”, you already know what I chose. But why? And what can one learn from my example?

Given that my decision was going to be a truly momentous one for the rest of my life, I decided to consult with a few of my professors. They tended not to convince me one way or the other, but rather to present me with the “bottom line” (literally and figuratively) of each choice. It was clear from my grades that I would most probably get into an elite university, e.g. Harvard Law School or Harvard’s Graduate School. Their advice boiled down to this: if I chose the former, I would probably be making $500,000 a year after five years of practicing law (this in the mid-1970s!); if I went for a PhD, I’d be making $50,000 at around the same point.

Why in the world, then, would I even consider graduate school? Because my professors’ advice came with another “prediction” or caveat: practicing law meant 12-hour work days, usually 6 days a week; academic life, while intensive in its own way (“publish or perish”), involved almost total freedom to decide when, what, and how much to work. So I decided to take a “pay cut” of $450,000 and become a teacher/scholar.

I never learned how to play any musical instrument well, but from the start of my life I marched to the beat of my own drum. Only decades later did I discover that there is a psychological term for this: “inner-directed”. This is one of the hardest things for most people to learn if they were not so inclined from birth. The reason is simple, although the phenomenon itself is complex: humans are social animals. From the earliest era of homo sapiens (and even before, with homo erectus), we lived in packs – what today would be called an “extended family” or, as they say in the Middle East, a “hamula”. This ensured our personal survival back then, and we haven’t changed much in that regard since, despite huge advances in technology and expansions of social structure.

Thus, as we go through life, we are constantly on the lookout for what others are doing, what is expected of us, how we can “fit in”. Not fitting in can be emotionally painful, something most people will avoid at almost all costs (see Asch’s classic “Conformity Experiment”: https://www.simplypsychology.org/asch-conformity.html). Over time, this becomes second nature – not only in our behavior, but in our values, beliefs and norms. To take my case as an example, there were two norms I had to overcome: America’s Protestant Ethic of “money = success” and the Jewish-American “my son the lawyer”.

Obviously, trying to shake off such “outer direction” comes with a price – sometimes economic, sometimes psychological, many times both. None of this is to say that being outer-directed (i.e. taking one’s cues from others) is in any way “wrong”. Indeed, if everyone did whatever popped into their head, or deliberately did the opposite of what society expects just to be different, we would be living in the jungle (which might be unfair to jungle animals, which mostly do cooperate with each other). Accepting most societal norms is healthy for the individual and certainly for society writ large. However, when this becomes blind “follow-the-crowd” behavior, or when going against a social norm would not harm the group, then inner-directedness is called for.

One could even make the argument that the inner direction of individuals is useful for the group as a whole. For instance – and to be clear, this is not me – many of humanity’s greatest social and scientific advances were made by highly inner-directed thinkers and doers. Think of Martin Luther, who publicly railed against the greatest power in late medieval Europe, the Catholic Church; or his namesake Martin Luther King, who stood up against deeply ingrained southern racism in America (perhaps his name provided him with the necessary inner-directed gumption to take on the entire southern caste system?). We definitely need occasional “mavericks” to move society forward.

In our own small, personal way, though, each and every one of us has a niche area where something really counts for us, despite society’s looking askance. Doing what we really want to do might cost us money or even a friend or two, but that’s really a small price to pay for being able to smile when we look in the mirror every morning – not to mention the satisfaction of accomplishing something “abnormal” that we’ve always wanted to try. Sure, it takes some courage (the topic of next week’s post). But that word has an interesting etymology: its root is cor – Latin for “heart” (French speakers know it as coeur). In its early form it meant “to speak one’s mind by telling all that’s in one’s heart.” You don’t have to speak this to others. To get started, it’s simply enough to tell yourself what your real inner desires or needs are, and then act on them.

Informaddiction

I don’t recall all that much of my youth (although it was generally pleasant), but one “regular” event sticks in my mind. Every Friday night – after our traditional erev-Shabbat dinner with all the Orthodox accoutrements (Kiddush, Blessing the Bread, singing Shabbat songs, Grace After Meals), we would all “retire” to the living room (right next to the dining room). My brother David and I would play some board game – and my parents would sit in “their” lounge chairs, intently (and contentedly) reading the newspaper.

I am an addict. Not to drugs, nor to sex. It’s a curious and mostly harmless form of addiction, one that I’m pretty sure affects other people as well. But I can’t say how many; no one has done any research on the matter. In any case, I’m addicted to information.

That’s not only what’s called “news”, or as we put it in the academic world of mass communications, “hard news”.[1] Rather, I constantly seek dopamine gratification from learning about something new: a scientific discovery or idea, a philosophical argument, a social phenomenon, a historical analysis or finding – you name it, and as long as it has some intrinsic worth or even surprise, I’m ready to absorb it.

Of course, I don’t spend my entire waking hours “sponging” for new information – just most of the time. My purpose? Much as a person might take an “upper” in order to boost performance at work, study, or even sports, so too these info-bites (occasionally full-size meals) serve me as food for thought and action – in my case, researching, writing, and “getting through the vagaries of life”.

You might be asking: how is this an “addiction”? In my case, we can start with my post-breakfast routine: several newsletters filling my email inbox from highbrow and middlebrow intellectual, political, and scientific sites. I spend at least an hour omnivorously consuming such brain food before turning to “my life”. Once chores are out of the way, it’s back to scanning the online horizon for more mind nutrition. Indeed, other than quality time with my wife Tami or playing a game of basketball with my hoop buddies, there’s not much out there I would rather do than read the latest… whatever (of intellectual novelty or practical use). I literally have to drag myself away from a book or computer screen to see a good movie (for me, “good” almost always meaning “thought-provoking”); the latest David Brooks opinion piece will attract me far more than some juicy “news” about this or that politician’s goings-on. Overall, each day I will spend hours surfing the Net to find interesting (to me) reviews of “challenging” books (in the double meaning of the term: going against conventional wisdom; complex in substance) – and from there (occasionally) hitting my Amazon button to download the tome itself.

Another sign of “addiction”: what can be called multinforming. Ingesting information while doing something else. Some examples: driving my car and listening to the radio news or hearing a lecture; ditto (the latter) when in the fitness gym (doing weights or on the treadmill); morning walks in the park while smartphone viewing a lecture series on Oceanography or The Evolution of Birds. Even “worse”: speeding up these lectures (usually to 1.5 or 1.75 speed) to be able to finish two 30-minute lectures during my 40-minute park walk. In my life, there is no “wasted time”: the radio news is my best friend while washing the dishes and taking a shower; mini-articles on my iPhone are consumed at the supermarket checkout line before reaching the cash register.

By coincidence, in the middle of my penning these lines, the magazine Scientific American just reported on medical research that shows doing exercise AND simultaneously “exercising” the mind does more for our brain than simple aerobic exercise (e.g. running on a treadmill). So, if you’re going to jog, do it outside (the brain has to keep track of the terrain) and if you can handle it, listen to a lecture or do some other mentally challenging work.

Back to the overall issue with my next question: is such “addiction” unusual? Probably yes and no. Let’s start with the “no”. All children are born with an insatiable hunger to learn. They are literally information sponges (being cute doesn’t hurt either – it magnetizes others to feed them with constant stimuli). Their curiosity is boundless; anything and everything is fair game to learn and understand. And even if there is little stimulus to be had at the moment, they have two other tactics: first, use their curiosity to grab things and figure out what they are and how they work; second, if a human is around, ask “why?” – again and again and again and… “Enough already! Go play with your sister…”.

When does such info-sponging become non-normal (as opposed to “abnormal”) – i.e. when does the answer become “yes, it’s unusual”? Gradually, as we grow older and “life takes over”. An older friend of ours once served on the New York City Board of Education. When we told her that our oldest son was about to enter first grade, we were taken aback by her response: “Too bad; school will spoil his curiosity.”

Some of this is inevitable. After all, not every child will want to know the basics of physics or how to calculate the radius of a circle, but things like that need to be learned in order to function in our increasingly complex world. The bigger problem is that most educational systems are still more oriented to rote learning than to teaching through figure-it-out-for-yourself education.

And then, of course, there’s life: making a living, raising a family, and the like, that takes up most of our time and energy after leaving the educational system. In short, if school didn’t kill our kids’ curiosity and thirst for learning, life tends to do the job just as effectively.

And yet, this isn’t the whole explanation. We do seek out “information”, just of a different type: celebrity goings-on, political machinations (some important; others far less so), cute cat videos – you get the (YouTube/Instagram/TikTok) picture. Of course, there’s nothing wrong with harmless fun; the question is one of degree. If a person’s life is taken up mostly (or completely) by fun and games – “bread and circus” as the Romans put it – then the important information simply collapses under the weight of the weightless.

Can the opposite be true too? If over my lifetime I have spent about 90% of my free (i.e. non-working/eating/sleeping/parenting/hygiene) time sponging “hard” information and news, could that too be considered an imbalanced life? Perhaps. But that depends, among other things, on the nature of the information and to what use the 90% is put: to better one’s health? professional expertise? parental and social capability? civic action? Or simply to quench the information craving, however esoteric and useless it might be? As with eating culinary food, so too with ingesting food for thought: there’s a difference between gourmet and glutton.

I want to believe that I have found some semblance of balance here: a good part of my self-education is admittedly a function of what catches my intellectual fancy at the moment. But a not inconsiderable amount is purposeful – or as we academics call it: utilitarian. I read lots about medicine that helps keep me very healthy and fit; about evolutionary biology that (perhaps surprisingly) is a boon in being a better communicator with my fellow bipedal primates (also called humans); about economics, obviously helpful in investing and otherwise keeping my bank balance in the black; and so on. Even strange esoterica can be useful if one knows how to dole it out (in small amounts) during social get-togethers.

Does this make me a better person? Not at all. There’s no correlation – let alone causation – between intellectual curiosity (or even brilliance) and social-mindedness, good-heartedness, or any other definition of what a “good” person is. However – admittedly one can argue with me on this – “informaddiction” properly activated can lead to the “good life” in Platonic terms. If the term “mind your own business” is familiar to all of us, I believe that “your business should be mind” is even more apt.

To be sure, not everyone has the capability for this sort of life predilection. And many people who have a relatively high IQ might still prefer to live the entertained life rather than one of sustained self-education. If that makes them happy, fine with me (and hopefully them). To a certain extent, this type of addiction is environmentally and culturally learned: a home with shelves of books; growing up with dinner conversations about the wonders of the world; stimulating teachers. Nevertheless, it seems to me that in the final analysis, to put it simply, simplistically, and also truthfully: you are what you are.

In my case, if that makes me an egghead, so be it. The egg preceded the chicken by about 150,000,000 years (see: esoterica can be interesting!), so I figure that I have a pretty good head start continuing the advance of homo sapiens sapiens – and even (if I and/or they are lucky) resurrecting others’ curiosity.

[1] From the standpoint of journalism, this is what I (along with my co-author Michal Seletzky) called “general news” – not yesterday’s “political” event or economic datum (hard news), nor soft news items found in the middle and back pages of even the most erudite news institutions: food, travel, sports, culture, gossip, and the like.

Freedom of Choice? Beyond Nature & Nurture

As I entered my senior year in college (CCNY), I had to start seriously thinking about what I was going to do with my life “when I grew up”. With an almost straight “A” average in my college grades, I understood that I could get into almost any graduate school of my choice – but in what? For the first time in my life, I had to “introspect” – not something easy to do for a 21-year-old “whippersnapper” (for those too young to have heard this term, it means a young and inexperienced person considered to be presumptuous or overconfident). I had a good mind, and an even “better” mouth – indeed, my cousin babysitters, Ruthie and Naomi, used to say about me when I toddled into the room: “here comes the mouth”!

After a while and some “consultation” with a few of my college professors, I whittled down my choices to two relevant (for me) possibilities: Law School or Graduate School (for a PhD). With my grades, I was almost a sure bet to get into Harvard Law and Harvard’s Graduate School of Arts & Sciences (GSAS). Which was it to be?

The considerations were pretty straightforward (I am only slightly exaggerating here): if I go to Harvard Law, within 5 years of graduating I would probably be earning $500,000 a year (and that’s back in the mid-1970s!). If I attend Harvard GSAS, then I would be earning about $50,000 annually. A no-brainer? Not exactly. A high-powered career in Law meant (still means) that I would be working close to 24/7/365: oodles of money and no personal, free time. A career as a professor meant far less money but a lot more freedom to do what I want professionally, when I want, and how I want.

“Prof. Sam” provides the clue to my ultimate choice. And I have never regretted that decision.

One of the most hotly debated issues in academia these days is the Nature/Nurture divide. Simply put: in the way we behave and think, are humans mostly/completely a product of our biological-genetic makeup, or mainly influenced by our environment (social and physical), e.g. parental upbringing, education, societal norms, weather, etc.? I do not intend here to dive into this very thorny controversy, but only to remark that the latest scientific research (e.g. epigenetics) clearly shows the interaction between the two.

However, there is a third factor here that is not given much attention: personal choice, otherwise known as “free will”. That too is a highly fraught term in contemporary scholarship, with serious arguments – philosophical and neuro-scientific – on both sides of the issue. Some argue that our decisions are a deterministic product of all the internal and external forces that impinge on us. For instance, why do I choose to eat a banana right now? First, because I am hungry (internal pressure); second, because I read somewhere (external) that bananas have potassium, which I need after playing an hour or so of intensive basketball. Others claim that “we” don’t really have free will because it turns out (striking empirical research) that our brains make a decision (for us?) a split second before we are aware that we have decided!

Our common sense understanding of free will, however, accepts that at the extreme margins we do not have “free will”. We are all aware that we can’t decide to have our bodies fly through the air or see through walls; without the necessary wealth, most of us can’t simply decide to take a round-the-world cruise over the next two years; and so on. Our internal, physical makeup and our external, social environment put quite a lot of restrictions on our “free will”. We live with that because “that’s the way it is”.

Between those polar extremes, though, we do feel that we can make choices large and small – even if they are in some loose way influenced by other life factors. After all, if instead of educating us our parents had put us in an isolation cage for eighteen years with almost no external stimuli, our choices in life would be far more circumscribed (no language, no education, no exercise, etc.). By being “out in the world”, we are pushed and pulled by an almost infinite number of “influences”. And yet, we are not a planet stuck in orbit around a sun since time immemorial; we can determine to some – even a large – extent our personal life orbit.

However – and here’s the key point – in order to make choices based on some measure of free will, we have to be aware of the “deterministic” factors around and within us that might cause us to act more like a gravity-captured planet rather than individuals with freedom of choice. In short, it is not so much political dictators that prevent us from acting freely; it is our own lack of self-cognizance regarding what is pushing and pulling us down a “pre-determined” path.

This involves several things:

First, as Nobel-prize winner Daniel Kahneman (and his research partner Amos Tversky, who passed away before the Nobel was awarded) have shown, our cognitive apparatus (aka the “brain”) is full of traps and obstacles to clear thinking. We leap (in this case, “jump” is too mild a word) to conclusions without sufficient evidence or logical thought. Not only the “uneducated”; professors and researchers/scientists are almost as guilty of this – especially regarding almost any topic not within their field of expertise. (By the way, it is not expert knowledge per se that keeps them from falling into mental traps within their own field, but rather the fact that in such fields they have been trained to weigh the evidence in “reasonable” fashion; unfortunately, such thinking patterns are not easily transferable to other topics.)

Second, we are all influenced in some way by social norms: some blatant, others subtle. Blatant: in theory, there is no reason why we couldn’t all walk around without clothes on (when it’s warm enough), but we don’t because society clearly does not approve – which is why “nudist colonies” are almost always found in remote regions. Subtle: in America, we expect our conversation partner to stand about a yard/meter away from us; in Latin countries, that’s considered to be “distant” (literally and figuratively). No one in any of these countries ever thinks about the norm of “personal space” until meeting another culture where the norm is different. But in our own society we all act (a bit more or less) in accordance with that norm. I won’t repeat here what I have already noted in a previous “Prof. Sam” essay (“Connections”) regarding the great impact that living within a social environment has upon us.

Third, if societal norms reflect the macro-situation, then our family and close friends also have expectations regarding our behavior. Indeed, as Muzafer Sherif’s famous 1950s “color war” experiment with pre-teenagers found, it doesn’t even have to be someone we are close to – it’s enough to feel “part of the group”. We are all social animals, having evolved eons ago in groups of around 50 people – so when we’re in a group of any “small” size, we will quickly adhere to what’s “expected from us” in order to “survive” (other researchers have since found that the maximum number of people we can be truly friendly-with/close-to is around 150). To take but one kind of example, movies such as “My Big Fat Greek Wedding” and “The Big Sick” seriously (and hilariously) show how family pressure can significantly restrict our ability to make major life decisions for ourselves.

Luckily for me, my mother was not the “my son the lawyer” type. Would my decision to pursue a career in academia have been different were she that sort of mom? Who knows? But if I had to make that decision thirty years earlier (the 1930s), when antisemitism ran rampant in professorial academia (not that the legal profession back then was a bastion of tolerance), I probably would not have chosen this career.

Fourth and finally, the matter of free will and our conscious decision-making process is greatly complicated by a phenomenon that neuroscientists have only recently become aware of, one that I mentioned above: it is not clear who/what is the “I” making the decision! Before I continue, apologies for the semantic confusion this might cause, because our language has not caught up with the latest research. I will use the first person here to make things easier.

When I am faced with a decision of any sort, I eventually decide what to do. But it turns out that my brain makes the decision about four-tenths of a second before “I” (consciously) do! Of course, my brain “belongs” to me, but still there is a difference between “me” deciding and my brain deciding for me before I am aware of it. How does this affect our concept of “free will”? You decide (pun intended): either our free will stays intact (I am my brain, so in fact I made that decision), or it becomes a slippery concept (I was not aware that my brain was “deciding for me”).

The bottom line: we are always “free” in theory to make our own decisions, but the degree of such freedom is heavily dependent on the number and power of the cognitive and social obstacles we have to overcome to make such a personal choice – not to mention our understanding of what/who exactly “we” are when deciding something. Greater personal freedom, then, is not only a matter of “freeing” ourselves but also (perhaps primarily?) of reducing society’s strictures and expectations regarding what each of us should be doing and deciding, as well as being aware of subconscious processes deep in the recesses of “our” mind.

Two Bad Can be G😃😃d

After several years of marriage, it became clear that Tami and I were having problems getting pregnant. After both of us underwent all sorts of tests (and subsequent “procedures”), the doctors asked us to query our mothers about a drug called DES that decades earlier was given to pregnant women. Amazingly, both my mother and Tami’s mom had been given DES in the very late 1940s – the cause of our mutual infertility.

In math, we are all taught that multiplying two negative numbers renders a positive. But in real life, it turns out that adding two negatives can also end up as a “positive”.

A couple’s infertility has the potential of being a marriage-breaker – for two reasons. First, it demands of the couple some soul-searching and heavy decision-making: Do they go childless? Do they try to adopt – and if so, how and who? Or perhaps surrogacy?

These are very difficult choices, each with substantial advantages and downsides. Childless through life? Lots of freedom and secure finances, but with a “hole” in the family unit, not to mention serious familial loneliness in old age. Adopt through an agency? Not that expensive, but not much choice regarding the type of child, unless you are willing to forgo a baby for a somewhat older kid. A legal, private adoption? More control over what you are getting – but frightfully expensive (lawyers’ fees, the biological mother’s medical costs, etc.). Hiring a female surrogate and/or a donated ovum – or using a sperm bank (depending on who is infertile)? Cost and/or parentage issues. In short, any one of these questions can lead to a serious rift between a married couple.

By far the worst issue, though, is the “blame game”: who is the infertile one? Whether husband or wife, emotions can run riot. On the part of the infertile spouse, a major blow to self-esteem and perhaps jealousy of the “healthy” partner. The fertile spouse has a tough choice – almost Solomonic: continue with the marriage at the cost of never having biological progeny, or sacrifice a marriage partner for “genetic continuity” (if not biological immortality). In short, one “minus” spouse plus one “plus” spouse = a huge negative.

But if both spouses are negative, the equation pretty much straightens itself out! Neither is jealous of the other. Surrogacy is out (except for sperm and ovum donations). Only the quandary of childless freedom vs. (type of) adoption remains as a tough decision. In our case, we quickly agreed on adoption (although the process for each of our two sons was wildly different).

While not at the same level of “severity”, another ostensible double whammy had no less an impact on my life. Indeed, I considered it then to be so “horrendous” that it was the only time in my life that I was really furious at my mother. In the 8th grade of my Jewish Day School, we had the choice of taking the entrance exam to the Bronx High School of Science – the highest-ranked and most well-regarded high school in all of New York City. My mom allowed me to take the test, which I passed – and then she wouldn’t let me go! “You need to continue getting a good Jewish education,” and that was that.

Anger and frustration hardly begin to describe my feelings back then – for two reasons (the “double whammy”). First, what kid would not want to be in such an elite high school? I always had a keen interest in science and was pretty good at math. Second (the other side of the coin), eight years of Jewish education was quite enough for me; what was there still to learn? (Young teenagers are not known for their “wisdom”; they would more correctly be called “wise?dumb!”) The thought of four more years studying Talmud and Hebrew (not to mention in an all-boys school!) was not something I looked forward to.

In retrospect, my mother’s decision changed my life in unintended, positive ways. Of course, one can never know “what if” I had gone to Bronx Science. But this was what happened at Yeshiva University High School in Manhattan. First and foremost, I did get a solid Jewish & Hebrew education that enabled me later on to offer a tentative “yes” to Tami’s “ridiculous” demand before she would go on a second date with me: would I consider making “aliyah” (moving to Israel)? That education also formed part of my secondary research agenda later in my academic career: writing on the Jewish Political Tradition.

Second – and this might seem a rather minor outcome, but in my eyes it was of major importance down the road, something I already alluded to in my previous post, entitled Fa(r)ther – I was able to play on my high school varsity basketball team (I most probably wasn’t good enough to make the Bronx Science team; who says “nerds” can’t be athletes? And anyway, they played many games on Saturday, my shabbat). Over 50 years later, at age 71 (pre-Corona), I continue to play intensive hoops twice a week with some guys around half my age – and I’m one of the more energetic players among them.

What’s the big deal? As I have already mentioned, my father died at 57 from heart failure; similarly, his sister and brother passed away when only somewhat older than he was. I have been aware of this “genetic threat” since my twenties, so exercise became for me a potential lifesaver. However, as many exercise wannabes know only too well, if it’s drudgery you won’t stick to it. For me, basketball was always FUN – easy to stick to. Moreover, there was a secondary element to this: as I got older, into my 40s and 50s, it became clear that in order to be able to run up and down the court for an hour and a half, I had to stay relatively thin – thus impacting my eating habits: healthy and minimalistic. People constantly tell me: “of course you’re thin – you play basketball.” They have the cause and effect backwards: in fact, I stay thin in order to play basketball! In short, what for me back then was my mother’s double “terrible” decision (letting me take the entrance exam – and then not allowing me to go to that school) ultimately turned out really well for all concerned.

These were my double “bads” that flipped into big positives – the first averting marital disaster, the second placing me on a life path that at the time (as a 14-year-old whippersnapper!) I viewed as “calamitous”, but which turned out far better than I could have imagined. Obviously, there are many other life situations where a double whammy can ultimately end up being quite beneficial. To take a common one: I am sure there are numerous men and women out there who became unemployed and/or got divorced, and as a result moved to a new city – only to find the true love of their life or get hired for their dream job.

The lesson is universal: the paths of life are never linear; what seems negative at first – especially when doubled over – can hold profound (and unexpected), positive consequences. Does this mean that everything bad ends up good? Of course not! But it does suggest that whatever knocks we encounter in life, it pays to maintain a long-term perspective. What we feel at the time they occur might not at all be how we view them in hindsight decades later. And sometimes, despair twice-over shared with another person close to us can have its own mutual, positively reinforcing benefits.

Fa(r)ther

Like most immigrant men back in the mid-20th century, my father – Arthur Wilzig – did not have an easy time supporting his family economically. And like most native-born teenagers of immigrant parents, not only was I rather clueless about that, but I was also not very interested in his past life or his then-present difficulties. Indeed, our relationship might have been more fraught than most – mostly my fault (in hindsight). But what is most striking (and paradoxical) to me today is how much of myself is a reflection of what he did – and also what he didn’t succeed in doing – back then.

Mark Twain was said to have said (but didn’t): “history doesn’t repeat, but it sure rhymes”. That’s one of those fake aphorisms that contain more truth than many that were actually penned. It’s true on a societal level, and in my case, quite true on a personal level as well – up to a point.

In thinking back to “dad”, several striking parallelisms become clear. First, he was not a religious man but became heavily involved in our New York synagogue as a Trustee. My ideational “religiosity” is quite unorthodox (or should I spell it unOrthodox), but I too have served as president of one synagogue in Israel, and chair of the Ritual Committee in another.

Second, despite becoming a well-respected figure in Havana, Cuba – not to mention a man-about-town and popular ladies’ man – Arthur fell in love with my mom (Jenny) and after marriage agreed to immigrate once again (he was born in what was then eastern Germany; after WW2 the territory was transferred to Poland), this time to New York City. I too fell in love with a “Zionist nut” and agreed (pretty soon after our first date!!) to eventually move to Israel – which Tami and I did three years after marriage.

Third, in New York, my father struggled to support us – among other things, at the expense of spending time with the family, given that he would get home quite late in the evenings from work. I had less economic pressure than my father, but perhaps even more (self)pressure professionally – “publish or perish” – and frankly spent far less time with my two kids than I should have.

Fourth, as “compensation”, dad would spend whatever “surplus” money the family had on summer vacations for my mother, brother, and me – in bungalow resorts, overseas visits, or summer camps. On a somewhat different plane, Tami and I spent a huge amount of money on child psychologists to try and keep our two kids on an even keel.

Fifth, the famous Wilzig family “temper” was in evidence with dad’s raising his voice throughout my childhood, but here I unfortunately “outdid” him. More on this in a moment.

Finally, a sort of “reverse” parallel: my father’s death at the early age of 57 (heart attack; cigarettes; meat & potatoes; no recreation or sports; etc) incentivized me to do the opposite in quite radical fashion – the main subject of this essay.

It’s here that we arrive at the “pun” in the title of this specific essay: Fa(r)ther. Our parents can/should teach us many things directly, and they certainly do indirectly, as behavioral models. But in the final analysis it is up to each of us to take what we find useful and learn what not to do from the things our parents perhaps mishandled. Of course, 20-20 hindsight is easy; we do things that we think are fine – and in the future our kids will look back and say “what was s/he thinking?” Notwithstanding this generational truism, we should always try to go one or two steps farther than our parents were able to in their actions and general behavior. It’s called “maturity”.

My example here is just that – one of many that other people have to “overcome” in their own child-to-parent (and on to our next generation’s children) cycle. The following remarks, then, should be seen as merely illustrative.

It doesn’t take a rocket scientist to understand that an explosive temper is not good for the heart (not to mention for everyone else around). But “understanding” is one thing; behavioral change is another. On rare occasions this can be accomplished purely through self-reflection (i.e. lots of looking in the metaphorical mirror). Normally, though, it needs some external push or motivation. By the time I was approaching my 50s – the same decade in which my father had his heart attack – I started worrying about my “prognosis” in this regard. This led to some soul-searching (perhaps in this specific case, “heart-searching” would be a better term). And from there, I made a really big effort to temper my temper. I succeeded (about 90%), as my oldest son, Boaz, acknowledged to Tami several years later.

Easy? Not at all. But I had “help”: Tami. Not only did she constantly remind me (“complain” might be more appropriate) of my temper, but she pointed out something quite strange. When I was at work, even as a “boss” (chairing academic divisions, departments etc), I almost never raised my voice, forget about losing my temper! In short, this wasn’t an “impossible” goal; it merely meant transferring my behavior from one part of my life to another, more important, sphere.

There was another area of my life that helped as well. I have played basketball (and sporadically, tennis too) on a pretty steady basis since my early teens. Why? Mainly out of love of the sport(s), but also for health reasons. A byproduct, though, is “pressure-reduction”, otherwise called “letting off steam”. Actually, for this specific purpose tennis is superior – physically smashing a ball is a lot better than verbally smashing other people! In any case, I am now 14 years past the age of my father’s demise – still playing basketball (a topic for a future essay).

To be sure, every parent wants their children to “surpass” them. This usually means professionally, or in some cultures with more progeny. Less thought about – but in my opinion, of far greater importance – is behavior. This can entail how we treat family members; how we behave at work (customers, co-workers, employees); and in general, to what extent we find it easier to “look in the mirror”.

Just as there is no such thing as a perfectly smooth mirror, there is also no perfectly behaved human being. (The Bible goes out of its way to even find some minor fault with Moses!) But mirrors have become smoother over time; with lots of self-reflection, we too can attain a level that is a somewhat better reflection of our own parents.

Connections

During my doctoral coursework, I learned from several truly outstanding (and world-famous) professors: Sam Huntington (who would become my dissertation advisor), Seymour Martin Lipset, Louis Hartz, Harvey Mansfield, and Judith Shklar. However, the professor who made the most profound (long-term) impact on me was Daniel Bell. Not because he was a great lecturer (that was Hartz), nor an unusual “mensch” (that was Lipset). What made Bell stand apart was his unbelievable multi-disciplinarity – the man was a polymath.

We’ve all met people who know a lot about a lot. However, what turns such knowledge into something special are the connections made between seemingly unrelated fields of endeavor, and the novel insights such associations can engender. Prof. Bell connected the humanities and physical science in ways that were truly thought-provoking – and wildly unexpected – and he was a social science scholar! The clearest example of this was his seminal book The Coming of Post-Industrial Society (1973) – the first time anyone clearly analyzed (and foresaw) what eventually came to be called “The Information Age”. And then he followed that with his no-less-prescient The Cultural Contradictions of Capitalism (1976) – a perspicacious (non-Marxian) prediction of the inherent weaknesses within the capitalist value system, or if you will: the psychological underpinning of the 2008 Great Recession.

For the past several millennia, political philosophers, sociology scholars, psychology analysts, ethicists, theologians – just about anyone who deals with the study of humanity – have argued about the foundational question: are humans, at base, individualists or social animals? To put it another way: are we basically people who cherish freedom above all else but need and use social contact with others for self-interested, functional reasons – or do we first look for psychological comfort in the presence of others and self-identify first and foremost through a group identity, occasionally withdrawing into ourselves for some privacy and freedom from constraints?

There are two general approaches to answering such a question. The first approach: there isn’t a definitive answer; rather, it’s a matter of degree, depending on the culture and society in which one is educated and lives. Some societies prioritize the individual (America), others place greater value on the group (most Far Eastern nations). The second approach: human nature is the same everywhere, so that whereas every person has a bit of one or the other (individuality vs. sociability), ultimately we will scientifically discover which of these two constitutes the bedrock of humanity.

Growing up in the U.S. and notwithstanding my strongly Jewish education, I started out not only believing in the second approach but I was convinced that we already had the answer: the individual über alles. After all, hadn’t the individualist ethos turned the U.S. into the greatest power on earth – not only militarily but also culturally, intellectually, scientifically, etc?

But the more I studied and researched politics and society – especially other cultures, historical periods, and social-behavioral patterns – the more I moved into the first camp: human behavior and thought were contextual, situational, and relative to their time. Sure, America the Great was highly individualistic (as perhaps was ancient Greece, to a more limited extent) – but the Chinese and Mongol Empires were “Great” in their time as well, yet far more group-oriented than individualistic.

That is what I taught for a few decades (perhaps even pontificating at times). However, as I gradually moved in mid-career from researching politics to studying communications and especially new media – and from there broadened my horizons into technology and science writ large – I changed my opinion once again. I am now convinced that the second approach is correct – a core kernel of human psychology and behavior does exist – but it falls on the side of sociality. The reason? Connections. (I am happy that you stuck with me to this point, probably asking yourself: what does all this have to do with the chapter title?)

Let’s start with one of the keywords in the field of sociology as well as new media: networks. It is a banal truism to state that babies need parents and other caretakers in order to learn and grow; it is less banal (although to me quite obvious) to state that throughout our entire life we continue to “grow”/change (not to mention survive) by developing and nurturing relationships with other people. Indeed, among the facts that science has uncovered are these two: babies who have little physical or social contact tend to get far sicker, and suffer higher infant mortality, than babies who are properly nourished socially (even controlling for nutrition); and the number one “killer” of older people (i.e. those dying before their cohort’s life expectancy) is… LONELINESS!

We are “social” animals by nature. However, society can involve competition or cooperation. Darwin believed that competition was the driving force behind evolution; we now know that cooperation was (and continues to be) at least as important. Indeed, that is what human speech is all about – enabling us to communicate and thus cooperate far more than any animal is capable of. You don’t need speech to kill; you do need it to help others.

It took hundreds of thousands of years for hominids (our ancient fore-species) to slowly evolve into Homo sapiens (“thinking man”), but once we developed speech – roughly 100,000 years ago – evolution started speeding up, increasingly so. In less than 50,000 years we had developed “culture” (burial sites, cave art), and within another 40,000 years we reached “civilization” – leaving the small, extended family as our main social unit and developing the village, town, city, city-state, and finally empire – with all their attendant revolutionary inventions and ideas, the result of so many more people coming into contact with one another.

This is the picture regarding intelligence and creativity, on both the macro level (the human race) and the meso level (individuals). It explains why we have far more inventions in the contemporary age than in the past – indeed, why almost all truly creative work (artistic, scientific, technological, etc.) occurs in cities and not in rural areas. The one-word answer: “connections”. Just as we moved millennia ago from the extended family (50 people) to the city (thousands), so too there are far more people in the world today (close to 8 billion) than in the past (in 1800 there were only about one billion people on Earth). To this we have to add that the means of communication between these social units are far greater and more efficient. The result: collective “social intelligence” has skyrocketed.
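A back-of-envelope illustration (my own, not taken from any specific study) shows why sheer numbers matter so much here: the count of possible pairwise connections grows roughly with the square of the number of people.

    possible pairs among n people: n(n − 1) / 2
    n = 50 (extended family): 50 × 49 / 2 = 1,225 possible pairings
    n = 8 billion (today’s world): ≈ 3.2 × 10^19 possible pairings

No one actually maintains billions of relationships, of course – but every improvement in the means of communication unlocks a larger slice of that vast potential network.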

I mentioned above that “individuals” = the “meso” (intermediate) level. What, then, is the “micro” plane? Our brain! Once again, we find the same story here. It isn’t the 100 billion neurons (brain cells) that render us intelligent, but rather the fact that each neuron is connected to thousands (!) of other neurons – i.e. we have trillions of connections in one brain! (Neurobiology studies also show that people with a faster “connection time” between brain cells happen to be more intelligent – the micro equivalent of the macro-world’s modern media and hyper-fast means of communication.)
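The arithmetic here (my own rough figures; estimates in the neuroscience literature vary) makes the point vivid:

    10^11 neurons × roughly 10^3–10^4 synapses per neuron ≈ 10^14–10^15 connections

In other words, on the order of hundreds of trillions of connections inside a single skull – a number that dwarfs the count of neurons themselves by several orders of magnitude.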

That’s “intelligence”. What about “creativity”? (The two are not synonymous; above an IQ of approximately 120, there is no correlation between them. In short, you have to be reasonably intelligent in order to be creative, but you certainly do not have to be an intellectual genius.) Here too we find that one of the main bases of creativity is the ability to bring together very different types of ideas or concepts to create a new whole – sort of 1 + 4 + 3 = 13, the whole exceeding the sum of its parts. In other words, the act of creation is a function of making something from many prior somethings; only God can perform Creatio ex Nihilo: creation from nothing.

Which is why I laugh (to myself) and groan (out loud) when my students ask: who needs to know all this stuff when we have Wikipedia or Google? My reply: “If you need to know some specific facts, then Wikipedia is great; but if you want to create something new, those facts have to be in your head already and not simply out there for retrieval – otherwise you won’t even realize that these specific facts constitute the building blocks of the new ideational house you want to build.”

In most cases, such facts are not from the same field but rather quite distant from each other – it’s the very “incongruity” of these facts that makes their “miscegenation” so productive! One example: George Lucas (of Star Wars directorial fame) received the National Medal of Technology in 2004 and the National Medal of Arts in 2012. One would be hard put to come up with two disciplines – technology and arts – more distant from each other, at least on the face of it. But that’s precisely what made him so “creative”.

Indeed, this explains why most “Eureka” moments occur when we’re not trying to solve a problem or come up with a novel solution. At rest (napping, showering, daydreaming, etc.) our brain is furiously making connections between all sorts of disparate “factoids” stored in our memory – the vast majority of these connections being useless and thus immediately discarded. But once in a while the brain manages to come up with a “worthwhile” connection between ideas, however “distant” from each other – and voilà! we have the creative solution.

Interestingly, this does not happen with the same frequency when we are actively trying to solve the problem, because our conscious mind is not aware of the myriad “factoids” stored in our brain. As strange as it might sound, our brain does a better job of problem-solving (connection production) when we are not directing it – similar to a decentralized corporation whose workers come up with more and better solutions through unfettered, horizontal contact among themselves than they would if upper management directed them how to act and think about the issue. Corporate management would do well to take lessons from neurobiology.

The bottom line of all this is not that we are all ants in a gigantic colony (although ants too are an excellent example of the connectivity phenomenon: each ant by itself is as dumb as can be, but put them together, connecting through chemical pheromones, and “miraculously” one gets a highly intelligent animal “society”). Humans are individuals and have specific – and occasionally idiosyncratic – wishes, thoughts, and behavior patterns. But in some parts of the world we have taken this concept to its illogical extreme: the individual über alles. Rather, it’s our connection to others that enables us to truly express our full humanity – and it’s our ability to connect seemingly unrelated ideas and concepts that is the fount of our intelligence and creativity.

You’re Incredibly Lucky to be Alive

My parents met in Cuba. My mother Jenny Weinreb was a refugee, escaping Nazi Germany in 1939 (alone on a German boat at age 13!) to Lisbon (where my Aunt Eva lived, married with baby twins), and from there to a British refugee camp in Jamaica (1942?), and finally to Cuba in 1944. My father Arthur Wilzig left Poland in 1936, going straight to Cuba (where a large expatriate community had established itself in 1926, after widespread Polish pogroms).

They married in 1946, and I was conceived in Cuba in 1948 but born in May 1949 in the U.S., to which they had immigrated because “there was no future Jewish life in Cuba”, as my mother explained to me decades later. Thus, I definitely qualified as an “international baby” from several perspectives.

How often have we heard others say that they’re “lucky to be alive” because they just missed being killed in some accident? Indeed, Judaism has a special prayer for someone who escapes danger: “gomel”.

But this is missing the main point: EVERYONE in the world is unbelievably lucky to be alive! From several standpoints.

First, although we do not know for sure whether there is life anywhere else in the universe, it is clear that the chances of life “evolving” on any one planet are very, very small (though as there are trillions upon trillions of stars and planets in the universe, the overall probability is that there is “life” elsewhere: see the “Drake Equation”). You and I happen to be living on a planet where this did happen – the chance of that is less than winning the national Powerball Lottery. So thank your lucky star (pun intended)!
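For the curious, the Drake Equation (formulated by astronomer Frank Drake in 1961) estimates N, the number of communicative civilizations in our galaxy, as the product of a chain of factors:

    N = R* × fp × ne × fl × fi × fc × L

Here R* is the rate of star formation, fp the fraction of stars with planets, ne the number of potentially habitable planets per such system, fl the fraction of those on which life actually arises, fi the fraction that develop intelligence, fc the fraction that develop detectable communication, and L the average lifetime of such civilizations. Most of these factors are still educated guesses – but multiply trillions of planets by even tiny fractions, and a non-trivial N comes out the other end.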

Second, in many cases the chances of your parents meeting when they did are also quite small (notwithstanding the occasional “marrying one’s high school sweetheart”). In the course of their youth and young adulthood, each of your parents met – even superficially – only a few thousand people, out of the hundreds of millions they could have met and the thousands they reasonably could have married. And if either had married someone else, you wouldn’t be here! Doubly lucky.

Third, and something of a head-scratcher, is the most important “lucky event” of them all. In a nutshell: whereas your mother released one ovum each month into her fallopian tubes to be (potentially) fertilized, your father’s ejaculate contained close to one hundred million (100,000,000!) sperm. So think about this: if the specific sperm that fertilized your mom’s egg had been beaten in that swimming race by any one of the other 99,999,999 sperm, would YOU be alive at all? Probably not – just someone pretty similar to you (at least, similar to you as a newborn).
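Put as a rough number (my own back-of-envelope, with the simplifying assumption that every sperm is an equally likely winner):

    P(you, rather than a sibling-who-never-was) ≈ 1 / 100,000,000 = 10^−8

That is of the same order as hitting the Powerball jackpot (roughly 1 in 292 million per ticket) – and that is for just one fertilization event.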

Never thought of that, did you? And that’s the point. Life is largely a matter of attitude and perspective. You can look at the cup that’s missing a few drops on top, or at the cup that’s brimming with life (yours). I am certainly aware that this is not the way people usually think about their life, but that’s my point: as human beings, we are capable of shifting our viewpoint once we look in the mirror straight on, instead of from the side. Some people don’t need to do this – their natural predisposition (what is usually called “attitude”) is to be positive and optimistic. Others are naturally pessimistic, seeing more night than day. And then there’s the broad middle, who can go either way, depending…

On what? On their social circle; on their socio-economic status; on their social happiness (quality of marriage; relationship with their kids; etc); and dare I say it? On how they consume media information.

No, I am not about to partake in modernity’s national sport: media-bashing. Actually, the “people” (that’s you and me) are at fault. When was the last time you read this headline (change the country name as you wish): “Yesterday, 8 million Israelis had an Uneventful Day.” Never. Because we, the readers, listeners, and viewers, want to get “bad news” – almost the only kind that we consider “news” at all.

Why? For that we have to go back several hundred thousand years (or even millions). All creatures on earth – and certainly evolving humans, who were not particularly strong or fast – had (and have) to be on constant lookout for danger that could annihilate them: back then, tigers and human enemies; today, technology (e.g. cars) and human enemies. Thus, danger-sniffing is planted deeply in our genes. And what constitutes “danger” (socially, and not only personally) is not only the usual stuff (plane crashes; war breaking out) but also – even increasingly – unusual stuff that we are not familiar with. Flying machines? (Early 20th century) People changing their sex?? (Mid-20th century) Robots building cars faster than humans can??? (Late 20th century) Designer babies?!?! (21st century)

The media – old (“legacy”) and new (“digital”) – are doing an ever-better job of ferreting out each and every “potentially dangerous”/unusual item of information from around the world. And as we absorb more and more of this (with the occasional respite through sports scores and fashion fads), our perception is that the world is going to hell in a handbasket faster than ever. Ah, nostalgia: “when I was growing up, the world was a better place”. Baloney. As a matter of fact, there is less violence in the world today (per capita) than ever before in human history! (Don’t take my word for it; read Steven Pinker’s The Better Angels of Our Nature.)

So we “moderns” are caught in a paradoxical situation: the better things get (objectively), the worse they seem (subjectively). The solution is not to avoid the media (we do want to know what real dangers lie out there). Rather, it is to choose wisely which media to consume – and then understand that 99% of what is happening in the world is positive, precisely what is NOT found in the media (because we don’t want to read “good news”).

Freed from the shackles of informational negativity, we can then more realistically work on developing a more positive and optimistic attitude towards the world – micro (ours personally) and macro (society and the world at large).

Coda: the latest research shows that people with a generally positive attitude have, on average, a lifespan a few years longer than that of people with ingrained (or acquired) negativity. So not only are the former “lucky to be alive”, but by appreciating that fact they actually live longer – and that’s not a matter of luck.

So here’s my final “you’re really lucky” booster. Remember what I said about your parents’ egg and sperm? Now take that back through every generation of your personal forebears! In other words, if in any act of procreation by any of your great-great-great-(etc.) grandparents a different sperm had won that particular race to the ovum, you would not be here (or anywhere, ever)! Just this one repeated, “generational” piece of good fortune should be enough to get you to count your blessings every minute of your day.
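One last back-of-envelope (mine, with the same simplifying assumption as before): if each fertilization along your direct line was roughly a 1-in-10^8 event, then following even a single chain of ancestors back g generations gives

    P ≈ (10^−8)^g    e.g., g = 10 → 10^−80

– fewer than one chance in the estimated number of atoms in the observable universe (about 10^80). And that counts only one chain; the full family tree of 2 parents, 4 grandparents, 8 great-grandparents, and so on makes the true odds smaller still.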