Intuition?

The brain is a strange organ. I have a good sense of humor – I love to hear/read jokes, and also to tell them (if they’re good). But if you ask me to tell you a good joke that I heard or read recently, I can’t do it. However, ask me to tell you a joke on any specific topic that you choose – a joke I heard years ago will pop up and I have no problem relating it on the spot.

Notice that I started off with “the brain” and not “my brain”. Although there are some minor differences in capabilities between people, overall the human brain works basically the same way for everyone. In the case of memory, passive always beats active, i.e., we remember things a lot better when given a “prompt” than when we’re forced to dredge something up without some external aid.

Which brings me (very circuitously) to the topic and title of this essay. Every one of us has had experiences where we “intuitively” thought something would happen. (Let’s leave aside for now the fact that we “intuit” lots of things and selectively – and conveniently – forget those that didn’t come to pass.) What exactly is this “intuition”?

It does not exist – at least not in the usual way it’s understood, as something “innately” natural to some people as opposed to others. Indeed, if I can make a pun, it’s a figment of our imagination.

So what is “intuition”? Simple: knowledge (mostly based on personal experience) that we are not aware that we have. Brain research has now conclusively proven that the brain is a “prediction machine” – constantly thinking about “what is about to happen” to prepare itself (and you) for the immediately upcoming environmental challenges. That could be when to cross the street based on the oncoming traffic, all the way to how much risk to take every minute in a Covid-19 environment.

Our brain is doing this on a second-to-second basis, outside of our consciousness. In fact, the situation is even weirder – and some would say “scarier” – than that. It turns out that when we have to make a decision (not just trying to “predict” the very-near-term future), our brain makes the decision four-tenths of a second before we are even aware that it/we have made that decision!

Indeed, it is at this point that we arrive at a semantic conundrum. If our brain is deciding things before we consciously “make the decision”, who is in control of whom? Indeed, can one separate the brain (or mind, if you wish) from “us”? Who am “I” if not my brain? And if my conscious thoughts are somehow different from my unconscious decision-making process, who is in charge of the matter? That’s another intentional pun, but a serious one too: are decisions undertaken within the organic brain “matter”, or are they a function of the metaphysical “mind”?

Intuitively, we want to think that “we” are in charge – but that’s precisely where intuition goes astray. The knowledge underlying our “intuition” is somewhere beyond our conscious reach – almost like a computer’s external hard drive, separate from the CPU and internal memory.

Why, then, do some people seem to have a better intuition than others? They don’t; they just have more “hidden” (from themselves) experience and knowledge – or are better at forgetting all the times that their intuition did not materialize!

To be sure, all this is distressing because it raises the question of whether we are really in charge of ourselves – indeed, whether there is such a thing as free will, at least in the extended sense of the term. In any case, the next time you have a hunch about something about to occur, don’t say “my intuition tells me”, but rather: “my brain predicts…”.

(Un)Talented

As I entered the first grade, my mother thought it would be a good time to also get me started on some pastime. So she signed me up for piano lessons. I dutifully went by myself once a week to the next building where the piano teacher lived. I learned, practiced, and… after two years came to the conclusion that I was vastly untalented.

I come from a very musical family, albeit not professionally. I have two uncles who could play the piano without a score, even if they heard the piece only once. Moreover, I loved music, so all in all my non-talent “epiphany” was somewhat disappointing. What to do?

In the third grade, we could join the school choir, which I did. I loved it and even discovered that extemporaneous harmonizing came easily to me. This became my joyful extracurricular activity for the next six years.

The first lesson I learned from this was that “talent” is something we are born with. There’s a “rule of thumb” going around that to become expertly proficient in any field of endeavor, one has to practice approximately 10,000 hours. But “proficiency” is a murky concept. In fact, we have to distinguish between “talent” and “skill”. The 10,000-hour rule relates to skill – another word for turning raw talent into something effective at the highest level. But if one doesn’t have the initial talent, then even 100,000 hours of practice won’t do it.

Which leads to the second lesson: even if we don’t have talent in a field that we love, that doesn’t mean that we can’t still do something, albeit different, in that same field. I couldn’t (still can’t) play the piano; but I do harmonize every chance I get when in a singing environment. (Synagogue is an ideal place for that; God forgives the occasional warble.)

And then there’s the in-between situation: great love mixed with minor talent. I have been playing basketball almost without interruption for close to sixty years – high school varsity, pickup leagues, semi-pro in Israel for a couple of years, and twice-a-week full-court games with a steady group of “guys” for the past several decades. That’s way beyond 10,000 hours. The result? I’m a great shooter, a decent passer, a bad rebounder, and a horrible dribbler.

Very, very few of us will ever make it to the top of our profession or “hobby” – but so what? If I compared myself to Michael Jordan, I’d never have the joy of simply doing something that I love, in decent fashion. Indeed, had I actually continued practicing the piano, I might have been “decent” – but the point is not to be decent in something “expected” of you, but to get the most out of yourself in something that you much enjoy.

So here’s a dilemma for each of us to chew on. Which is preferable: to be really good at an activity that we aren’t crazy about, or to be passably decent at something that we really enjoy? If that sounds like a trivial question regarding hobbies, then try this: what if the question related to your work life? Add to this the monetary compensation issue: the better you are professionally, the more money you’ll make. So, regarding the preference question, now what do you choose to do professionally? Make more money working in a field that “humdrums” you steadily – or earn less money but be excited every day you go to work?

Talent is great, but we shouldn’t let it be the overwhelming consideration in our life decisions – whether regarding which “trivial” hobby to be involved in, or about the “central” profession that we choose to spend decades performing. Muddling along (10,000 hours will raise the muddling to at least middling) while doing something we really love to do is preferable to following our “talent” to an emotional dead end.

As my piano lesson failure taught me – that’s the key to life…

From Meshuggeh to Mainstream

Upon our engagement, Tami and I decided to combine our family names. That necessitated going to court for an official name change. To save her the time, I did it in a Boston court a couple of weeks before the wedding, so that from “Lehman” she “automatically” became a Lehman-Wilzig upon our marriage.

These days there’s a different “craziness” going on in the Western world (mostly the U.S.), with millennials – and some older folk in their wake – signing their emails with “instructions” such as “I wish to be referred to as they, not she/her” – or other, similar nomenclature. To most of us post-50-year-olds, this seems like pure craziness.

The Boston courtroom was filled with Irish: the security guard at the entrance, the stenographer, and the policeman standing next to each witness in the courtroom. With my “luck”, though, I got a Jewish judge. I handed him my papers, in which I had to explain and justify my name change. I had written that “my wife is an only child and wants to keep her parents’ name” and that “I’m a feminist and wish to have an equal marriage, in name too.”

Every generation has its own way of asserting its identity. One can call it “youthful rebellion” or “generational progress” or some other more or less positive description, but it seems to be a set pattern. What accompanies this, though, is generational pushback as well. The older ones don’t “get it”, mainly because that wasn’t the way they did it when growing up and taking over the world.

My judge took the papers in hand, read them with a stolid face, picked up his pen, and as he was signing the papers, declaimed out loud the beginning of Tennyson’s famous line: “Ours not to reason why…”. I smiled wanly, expecting to hear the second part: “ours but to do and die.” Instead, he continued: “but you’re meshuggeh!” I couldn’t help laughing out loud, to the consternation of the Irish cop at my side – probably more perplexed at what the judge had said than at my guffaw.

So I now find myself in the role of my Jewish judge: “they” instead of “her”?!? How far can this gender fluidity take us? I won’t join this trend, nor do I think it will have much staying power. But then again, my last name is still Lehman-Wilzig – forty-seven years later.

The point of all this is that “you never know”. Linguistic, cultural, social trends arrive by the dozens. Most come and go (remember bell-bottom pants?); a few arrive and stay (Black has definitely overtaken Negro). Several are clear signs of moral progress (I do believe feminism – once called “Women’s Liberation” – is permanently here); conversely, a few are evidence of philosophical backsliding (Cancel Culture is antithetical to freedom of expression).

A decade ago, our son Boaz hesitatingly came to us with “bad news”: he wanted to change his last name from Lehman-Wilzig to something that sounded Israeli and was simpler to say. The irony – and the reason we were actually not upset by this – is that Tami’s father’s original family name was Sprinzeles, which he changed to Lehman in Vienna back in the 1930s due to anti-Semitism (“Lehman” was a neutral German name). We suggested “Zeevi” – the Hebrew rendition of “little wolf”, which is what my family name “Vilchek” means in Polish.

Change is inevitable – linguistic and otherwise. Each of us has the right to be called whatever s/he wishes (within limits; New York State will not allow a child to be given the name “Hitler”). That doesn’t mean that the rest of us have to like an individual’s decision – but it has to be respected. Respect, though, also doesn’t mean complete acquiescence but rather a measure of tolerance on our part to what seems at first to be bizarre. In any case, the passage of time and social “convention” will determine whether – and to what extent – the new trend will stick. That’s the name of the game – or the game of the name…

Coincidence

As I mentioned in a previous post (“Courage”), I lost contact with my favorite (best) teacher in high school – until “remarkably” I bumped into him several years later when we both had made Aliyah and landed at the same Israeli institution: Bar-Ilan University. What an amazing coincidence! Well, coincidence, yes; amazing, not really.

Everyone has had at least one remarkable coincidence happen to them in life, probably several that left us flabbergasted. We find it so incredible that we usually can’t stop thinking about it or relating it to others. Well, I’m sorry to be a party-pooper, but coincidences are really no big deal.

Let’s take my own example from above. There are two ways of looking at this:

1) What are the chances that I would bump into a person I know well – both of us thousands of miles away from where we had our relationship? Put this way, the chances are very low.

2) How many times did I walk around Israel and not bump into someone that I knew from back then? The answer: several hundred times a day.

Now multiply #2 by all the days in the year, and we have well over a hundred thousand non-coincidences annually! But we don’t think about these because they didn’t happen. A person would have to be completely crazy to go about life thinking every second about what did not occur at that moment – and if s/he did try to do that, s/he would very soon become crazy!

Coincidences, then, are simple statistical anomalies – what are called outliers. In any situation, there are the “normal” things that happen (and the vast majority of things that do not happen), and then there are the rare things that happen (or don’t occur – e.g., a baby climbing onto and falling out of a fourth-floor window and not breaking a bone because the mattress delivery men just happened to be standing at the entrance to the building). Human psychology, of course, picks up on (and remembers) the “abnormal” event precisely because it is not normal.

In reality, we see this phenomenon everywhere in our life, not only regarding “coincidences”. We walk down the block and pass hundreds of people – but the one who sticks in our mind is the 6’10” (2.08 meters) guy built like a tank. Do you recall any of your losing poker hands – or only the one where you bluffed your way to a huge jackpot?

From this perspective, “amazing” coincidences are, paradoxically, proof of how routine our lives really are. When we are part of a coincidence, it makes us stop and shake our heads in wonderment. But that’s really the time to consider how infrequently a coincidence actually occurs – and why.

Finally, there is another type of “coincidence” – the statistically spurious one. Here’s a classic example: at a party, how many people would have to be present for there to be a 50-50 chance of two attendees having the same birthdate? (Don’t peek below; first make an educated guess.)

Some people will say: 182 (half of 364); others will understand that there are “permutations” here and lower the number to around 100. Here’s the statistically correct number: twenty-three!! Now imagine being at a party of 30 people (not having read this mini-essay of mine) – and in the course of the party chatter you hear two people laughing out loud: “Wow! Amazing that we have the same birthdate!! What are the chances of that?!?” It becomes the dominant conversation of the party, and you wonder at the coincidence. But now you know that with thirty people in the room the chances were far better than “even” that two partygoers would have been born on the same day/month on the calendar.
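For the mathematically curious, here is a sketch of the standard calculation (assuming 365 equally likely birthdays and ignoring leap years):

\[
P(\text{no shared birthday among } n \text{ guests}) \;=\; \prod_{k=0}^{n-1} \frac{365-k}{365}
\]

For n = 23 this product comes out to roughly 0.49, so the chance of at least one shared birthday is about 0.51 – just over “even”. With 30 guests it climbs to roughly 0.71.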

Such statistical “anomalies” can be found everywhere in life. As most of us are not particularly mathematically literate, we see “coincidences” everywhere. So calm down: it’s just normal mathematics at work – plus a healthy dollop of human psychology not working so well.

P.S. None of this is to say that an “amazing coincidence” can’t occur. My all-time favorite: the second and third U.S. presidents – John Adams and Thomas Jefferson – both died on the exact same day (pretty amazing, right?): July 4, 1826, exactly 50 years to the day from the signing of the Declaration of Independence, which both men had signed! Now that’s worth shaking your head over…

Courage (2)

“I cannot think of any need in childhood as strong as the need for a father’s protection.”
— Sigmund Freud

As with most families back in the 1950s, my stay-at-home mother had a much bigger place in my life than my hard-working father. Here and there, though, he stepped in and made an impact.

In school, I had a “friend” who would bully me around – no serious bodily harm, but enough to make my class life miserable. By the fourth grade, I had had enough – but what to do? So I mentioned my problem to Dad. I didn’t really mean for him to get involved (or maybe I did, deep down?). The next day, as he dropped me off at school on his way to work, unbeknownst to me he waited in the car with a clear view of the classroom through the large, front-facing window. Sure enough, as I entered the classroom my “friend” grabbed my kippah (“yarmulkah” back then) and started making fun of me. Within a few seconds, my father stormed into the room, grabbed my bully friend by the collar, pushed him into the lockers, and warned him never to start up with me again. We were all shocked – but it worked. From then on, the boy actually became a real friend…

A few months ago I wrote a Reflections post called “Courage” (http://profslw.com/courage), about moral courage. This time it’s about physical courage – doing something that takes a measure of bravery. I’m not talking about wartime heroism, the supreme courageous act whereby soldiers put their lives on the line. Nor does it have to be an act of war; jumping onto the train tracks to rescue a young child who fell in, with the train quickly approaching, is just as brave. Let’s leave that for others to discuss (my army service was in Education – no line of fire there).

The first point to consider is that day-to-day courage is a relative matter – depending on the individual’s personality. I happen to be a minor “coward” – I really hate any use of force, even when it is possibly justified. Other people might have a more aggressive personality, some even relishing the occasional scrap. For the former, it takes internal courage to stand up against physical threats; for the latter, such “engagement” is part of life – a Hobbesian approach to social intercourse.

Physical courage is also culturally related. Some hyper-aggressive societies view physical bravery as noteworthy, perhaps even an important value. Others have a pacific culture in which any use of force is looked down on and social diplomacy is preferred if and when conflict arises. This is not a matter of primitive societies vs. modern ones. We now know that several primitive societies avoid warfare or internal force through such customs as the “potlatch” (ritualized gift-giving); more advanced ancient and medieval societies used arranged marriages between the leaders of potential rivals to smooth out the rough edges of inter-social contact.

The contemporary world, however, seems to be addressing the issue of physical courage by reducing actual physical contact! As we become more involved in virtual forms of social intercourse and communication – e.g., social media, Zooming, texting etc. – the “bullying” and other forms of “attack” become more verbal or textual, and less physical. This is not necessarily an improvement, given that the old popular rhyme has proven to be quite false: “sticks and stones can break my bones, but words can never hurt me”. Thus, “physical” courage today is being slowly transmuted into verbal/textual courage – whether standing up to online bullying or writing something not “politically correct”.

Of course, being “courageous” online is easier and harder at the same time. It’s obviously much easier to fight someone verbally than physically, given that no bodily harm will occur. On the other hand, physical courage was most often quite circumscribed – witnessed by a small audience in the immediate surroundings. Online, however, most attacks come from a “crowd” in the victim’s filter bubble, a huge quantitative increase. Standing up to a vituperative mob is not as easy as literally facing down a bully or two.

Every age, every generation, and every culture has its own demands and challenges when it comes to courage. As long as individuals think and believe differently and even passionately, moved primarily by emotion (especially fear), we will all still need to find resources within ourselves to stand up – physically and communicatively – to those who wish to force their own behavior and ideas upon us.

Loss Aversion

As I entered my 70s, I naturally started considering questions that younger folk don’t think about much (if at all). One of them is death. And that led to a few other seemingly unrelated questions. After a while, though, I began to see a common thread.

Let’s take these three questions, each from a different aspect of my life (and, I assume, of most people’s):

1) Why do we fear death? (Existential)

2) Why are most people not willing to take a 60% chance of making a “bundle” in the stock market with a 40% chance of losing the same size bundle? (Financial)

3) Why do we read a lot more bad news than good news in our daily news digest? (Information)

Here’s why it’s logical to ask the first question (assuming that one doesn’t believe in a fiery “afterlife”). The universe has existed for about 13.7 billion years, and all that time you did not exist! Did you suffer? No. Were you waiting in anticipation to join the living? Again, no. So why should you fear spending the next X billion years in the same condition you were in for all the time before you came into being?

The behavior behind my second question seems even more illogical – according to simple probability, taking such a stock market “gamble” makes eminent sense, as the chance of gaining is better than the chance of losing money.
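To spell out the arithmetic (writing B for the size of the “bundle” – an illustrative symbol, the same on both sides of the bet, as in the question above):

\[
E[\text{gamble}] \;=\; 0.6 \cdot B \;+\; 0.4 \cdot (-B) \;=\; 0.2\,B \;>\; 0
\]

On average, then, the bet is clearly worth taking – and yet most of us decline it.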

The answer to these first two questions (conundrums, if you wish) is the same: “loss aversion”. Human beings have a greater sensitivity to losing something than to gaining something. Put another way, we feel more pain from losing what we have than pleasure from gaining what we didn’t have. Thus, we aren’t keen on risking our money on the market – even if the chances are better than even of making the same amount. And as for death, well, that’s now obvious: before we were born, we didn’t have anything to lose; after we’ve come into the world, we have a lot to lose in death: our life. It’s a lot more emotionally painful to think about losing the life we have than about gaining a new life (even if we could “think” about that before we were born).

Is this just an abstruse intellectual exercise? Not at all. Loss aversion is something we live with every day of our life. Which brings us to the third question regarding “the news”. If you picked up your daily newspaper (or surfed to its homepage), what would you think if this were what you saw: YESTERDAY MILLIONS OF PEOPLE SPENT THE DAY WITHOUT MISHAP. Ridiculous, no? Well, actually it’s not so silly – in principle. After all, what’s wrong with good news? Objectively, is it any less “newsy” than bad news?

However, as humans we are “primed” to be acutely aware of negative news, a/k/a anything that can harm us. Good things are nice, but they generally won’t severely affect us. Why are we like that? It harks back to early evolution. Imagine an early hominid on the savannah. All of a sudden, he sees movement in the tall grass nearby. Is it the wind? Or a predator on the prowl? If it’s the “good news” (just the wind), well, that’s nice; but if it’s a predator, he had better move fast and get out of there. Obviously, he was (and we still are) primed to consider and act on the possible bad news way ahead of the possible good (or neutral) environmental stimulus.

There have been several serious attempts at publishing “Good News” daily newspapers, and all have failed – despite their good news being invariably significant: cancer cures; new inventions; public policies that actually work to make life better; etc.

The irony, of course, is that good news can also save lives (or significantly improve their quality). Think about the latest health news regarding a new medicine for your condition; the latest gadget that can save your tot’s life (e.g., reminding you not to forget her in the car); the newest non-bank institution (fintech) offering a much cheaper mortgage; the end of a civil war in a country you’ve always wanted to visit; a brand new type of university B.A. program that would let you change your career path to what you really wanted to do all along; and so on. Most of these are not “earth-shaking” news items – but neither are most “bad news” items that pass for journalistic reportage these days.

Good news delights us – we can all use a good smile once in a while. But bad news makes us afraid – and fear beats happy every day of the year. What we need, and have to actively search out, is a lot more of the former to counteract the latter. We should all work harder at turning “loss aversion” into “gain provision”.

Less is More When You Want to Get Something Done

I am about to admit to the most unprofessional thing I have ever “done” – or perhaps I should say “not done”. As a professor of communications, and especially of “New Media”, I have refused from the beginning to join Facebook, Instagram, Twitter, TikTok, and almost every other social medium (except for a couple of specific, limited-to-friends/family WhatsApp groups). I’ll leave for another day how I managed to do serious research – or even to teach – in the field. For now, I want to offer my reasons – ones that everyone else should consider as well.

First and perhaps least, albeit still worthy of consideration, is the personal privacy issue. Social media know just about everything there is to know about each and every one of us – at least those who are “on”. They even know things about you that you might not know about yourself, because their algorithms can correlate different aspects of your online behavior to develop a “profile”: who you are (race, religion, etc.), what you like (pick any field of life), and how you think (extroverted vs. shy, smartass vs. humble, genius or moron…). Even aspects that you have never mentioned online (e.g., religion) will be “guessed” with a very high degree of accuracy, because there are thousands of others out there whose online activity exactly “matches” yours – and who have also mentioned the one “missing” element you haven’t. This profile is then used to “nudge” you – “manipulate” is probably the more appropriate word – in language that best suits your profile, into doing things that “add value” to that social medium, whether through advertising or extracting data to sell to others.

Second, social media are huge time-wasters. Of course, we all wish (and need) to communicate with others, and social media even enable us to find “others” we want to converse with but didn’t know how to reach in the past, e.g., distant relatives, grade school friends, etc. But once we’re online, social media are highly expert at keeping us there – tweeting, sharing, uploading, and liking – well beyond what we might normally do in regular conversation.

Third, we all like to gossip once in a while; it’s totally human. But the Net exponentializes our “gossip” routines. Indeed, that extends not only to our personal lives but – far worse – to the “news”. In a word, we are awash in negativity. Some of it is real and important; we should know about society’s social ills. However, most of it is either superficial or – far worse – manufactured, what we now call “fake news”. There are no inhibitions and (almost) no restrictions on what any person can write or share/forward. And that’s invariably negative stuff: from “the world (as we knew and loved it) is coming to an end” to “they are taking over our life” (aliens, radicals, Antifa or Proud Boys, Hamas or Breaking the Silence, etc.). Online, there is no border between positive reality (read Pinker’s Enlightenment Now for a dose of REAL positivity) and negative unreality.

Fourth, and for me the most important reason of all – connected to a couple of the previous problems – is “information overload”. Part of this is FOMO, i.e. “Fear Of Missing Out”: who is saying what to whom? What just happened that I “need to know”? So we pile it on – and always leave our gadget “on”.

The problem of information surfeit is not just a matter of quantity; it’s first and foremost a mental obstacle to thinking, especially of the problem-solving or creative sort. The simple fact is that only when the brain is at rest – that is, not having to continually deal with external stimuli – can it begin to get “creative”. Most of us have experienced this phenomenon when we have a problem that we can’t solve, and then in the middle of the night, or when daydreaming, “BANG!” – the solution appears “out of nowhere”.

In fact, it’s not at all out of the blue. We are simply not aware that the brain is working even when “we” think it isn’t. Indeed (get ready for this strange conundrum), our brain makes decisions about 0.4 seconds before we are aware that “we” (it) have come to that decision! In short, if the Bible considered Eve a “helpmate” to Adam, there actually was (and is) a helpmate prior even to our spouse/partner: our brain.

Unfortunately, the brain cannot “multitask” (not even female ones). If you keep it continually occupied with social media communication, it doesn’t have the time to do the more important stuff. Even worse, the brain is by far the biggest energy consumer of all our organs – and if it’s expending most of its available energy on social media silliness (and worse), then even if and when you do provide it with some respite, it too will need to rest rather than work on that problem that had you stumped.

The bottom line: when it comes to social media each of us has to decide whether to be primarily a consumer or a producer. I don’t mean being a consumer of others’ tweets and posts or producing your own clips and feeds. I mean either consuming other people’s online “product” or producing your own serious thinking and creativity offline – otherwise called the “real world”.

Recently, this basic idea has been pushed in such books as “24/6” by Tiffany Shlain, and “Sacred Rest” by Dr. Saundra Dalton-Smith. As a “Conservadox” Jew, completely unplugging myself every Saturday (shabbat) is second nature – and the practice has hardly harmed all the modern Orthodox professionals out there economically or otherwise. Quite the opposite! (Orthodox Jews have the highest average income of any ethnic or religious group in the U.S.)

So whatever your method, give yourself (or at least your brain; physically exercising while “thinking sweet-nothings” is recommended) a break! For a few hours a day, pull yourself away from your addictive smartphone – breakfast, lunch, and dinner are excellent times to start. You’ll be surprised at how much of that time actually becomes, and leads to, true “food for thought”.

A Hebrew Lesson in Zionism

In these “memoir” mini-chapters, I usually start with a short personal vignette as a springboard for discussing a larger issue relevant to all. This time I’ll reverse that; the general issue comes first – and then the vignette that will take up the rest of this essay. The theme is “self-negation of social status to help others”. The following story is almost incomprehensible in an American milieu; most Israelis, on the other hand, can tell a similar story. Which is why so many outsiders have fallen in love with the country, despite its many, many flaws…

Nine days after landing in Israel as a new immigrant (oleh khadash) back in 1977, I found myself standing in front of 200 students from the Department of Political Studies at Bar-Ilan University, about to give my inaugural lecture – in Hebrew!! Yes, I had learned Hebrew from the 1st grade onwards in my Jewish Day School, through high school, but it was more “biblical” Hebrew than the modern lingo that had developed over the past century. In any case, I spent several hours preparing the lecture with a dictionary at my side, writing it out word for word – not the most ideal way to present a lecture.

It went OK, or so I thought. Back in my office, I started working on the following week’s lecture (the course was “Introduction to Politics and Government”), when two much older gentlemen walked into my office. I had noticed them out of the corner of my eye during the lecture, and briefly wondered what they were doing there in a sea of youngsters, but I had to maintain my concentration.

“We really appreciate the effort you’re making,” they started off, without the usual niceties that I was used to back in the States. “For an oleh khadash, it was quite impressive. But still, here and there we couldn’t quite figure out what you were trying to say. Sometimes because the word you used wasn’t altogether right, and other times because the syntax of the sentence was too confusing.”

I could feel myself blush from embarrassment. I knew the lecture wasn’t “perfect”, but still…

“So, we would like to make you an offer. We’re willing to sit with you here every week, a day or two before your next lecture, and go over your notes to correct the Hebrew. That way, your talk will be clearer to all the other students, and most important, your Hebrew should get better quite quickly.”

I was completely taken aback. These were not kids trying to “pull one over” on me. A very generous offer, indeed. It was surprising, but you don’t look a gift horse in the mouth. I accepted graciously. The real surprise was yet to come.

Every week they came at the appointed time, and sat with me for close to two hours as we went over the lecture word for word, sentence by sentence. It was illuminating, and also at times funny. Once I had looked up the word for “lobbyist” and ended up (in modern Hebrew) with the person who cleans hotel lobbies. And so it went for several weeks. 

One day as they were sitting there, the department’s manager happened to pass by and peeked into my open-door office; I could see him stop short, startled. After my “buddies” left an hour later, he came into the office and asked what they were doing there. I explained the situation. His eyes bugged out.

“What’s the problem?” I asked. “Do you know who they are?” he responded hesitatingly. “Sure,” I replied, “Yonah and Menachem.”

He laughed. “No, I mean do you know who they ARE?” I had no idea what he meant. “Nope.”

“The man called Yonah was the IDF (Tzahal) Central Command General during the Yom Kippur War – and Menachem was the IDF’s Military Governor of the entire West Bank!”

To say that I was taken aback would be an understatement; blown away was more like it. These two elderly gentlemen (57 and 63 at the time; I was all of 28), at the top of the Israeli social pyramid, had volunteered to spend many hours over the course of a full academic year to help a new immigrant. That was beyond my ken, as someone still harboring a fully American mentality.

And so it went for the whole year. Out of curiosity, I asked them both (gingerly) why they had decided to study at the university at their age. Yonah’s reply was standard: many high-level IDF officers do so upon retirement; back then there was no chance of higher education at the beginning of an army career. Menachem’s answer was far more interesting – here too, another glimpse of what makes Israelis tick (and how).

Back in 1967, the Six-Day War ended quickly – well, in six days. Israel was surprised to find itself in control of the entire West Bank (Israel had actually asked Jordan NOT to enter the war, but for geo-political reasons the Jordanians had “no choice”). What to do with this territory? How to run it? The IDF searched its personnel “database” to see whether anyone had any previous experience in “military government”, and Menachem Arkin’s name popped up – he had been a junior officer in the British military government in North Africa during World War II. So 25 years later, the IDF made him West Bank Military Governor!

It was supposed to be a four-year term of service, but that dragged on. Finally, in early 1973 he called “Moshe” (Dayan): “enough is enough; find a replacement”. Dayan promised to do so by the end of the year – and then in October, another war broke out! By 1976, Menachem was despairing of ever getting out of the army, being too “indispensable”. He asked his army buddies what to do. Their advice: “ask for a one-year leave of absence to study; they can’t deny you that.” And that’s how he found himself doing a B.A. in my department!

Israel: a truly insane country, mostly for the better…

Touching the Stars

I grew up in Washington Heights, in northern Manhattan. Every Friday after school (or Sunday, if the weather was bad on Friday), I’d go to Fort Tryon Park – home of the famous Cloisters, the “medieval-style” museum (the park was designed by Frederick Law Olmsted Jr., son of Central Park’s designer – but I digress). At the entrance to the park was a large basketball playground where guys would show up for serious pick-up games. Occasionally, we would be joined by a very tall teenager by the name of Lew Alcindor…

If that name doesn’t mean anything to you, here he is in a nutshell: possibly the greatest high school basketball player of all time (only Wilt Chamberlain was of the same stature). Of course, it helps that at the age of 16 he was already 6’10” (approximately 2.08m). Alcindor lived about a mile from the park and would take the 10-minute bus ride uptown to shoot hoops with us. As an aside, I can vouch for the fact that even then he was a fine fellow – never playing under the basket (with his height, no competition there) but only shooting from the outside. Oh yes, I forgot to mention (for all you non-basketball mavens): he went on to a stellar college and professional career, converting to Islam and changing his name to Kareem Abdul-Jabbar. (If that name still doesn’t ring a bell, where have you been living these past decades?)

For me, my few pickup basketball games with the “Great One” are a matter of “that’s nice” – but nothing to brag about. After all, I didn’t do anything here other than being at the same place, and playing a few games, where he too happened to be. However, I find it very interesting that every time I mention this vignette in passing, people are not only amazed but also never forget the “story” – even years and decades later.

Which brings me to the point of this essay: humans are social animals through and through, i.e. we haven’t left behind our “caveman” nature by much, if at all. The term “social” sounds nice but embedded within it are all sorts of other characteristics, a central trait being “competitive hierarchy”. Let it be said at the outset that while most “primitive” societies (then and now) have been patriarchal, we know of several that were and are matriarchal, meaning run by women. So the social structure is not gender-specific, even if most have been led by males.

Every human being strives for social status. In our distant past that was mostly (perhaps exclusively) for procreation – the more socially powerful, the more food and other resources the individual gained, thus attracting more fertile mates (yes, plural). Today, such social status is sought for other goals as well, but the underlying drive is still psycho-biological.

Who gets to the top of the pile? The “Alpha Male” (or Alpha Female). Others can be close hangers-on who are in the middle of the social hierarchy, and still others are socially “back of the pack”. The social “rat race” is perpetual, no matter our age: kids do it, adults do it, and even Third Agers do it (for a day or two, try being an anthropologist in a Retirement Home; social “competition” might express itself more subtly, but it’s there).

The Abdul-Jabbars of the world are “Alpha”, but what’s interesting are those in the coterie “circles” around them. Many people are Mid-Social due to their own accomplishments: a good lawyer; an excellent teacher; a minor league ballplayer. However, a large number get to the mid-hierarchy by “touching stardust”, i.e. being near, working for, or having some sort of relationship with, the Alpha.

Among other things, this explains the widespread (universal?) phenomenon we see on social media of “Leaders/Influencers” and their minions, e.g. Twitter “followers”, Facebook “sharers” and “likers”, and so on. By signing up and receiving constant feeds, people have the feeling that they are part of the VIP’s “circle”, even if it’s an outer one. And if they should actually get a real response from that Leader, well they’ve then moved into an inner circle of sorts – something that might be “worth” social status credit with their friends and family.

This sounds awfully crass, but real life isn’t all flowers and cookies. That’s not a “moral” judgment; it’s just calling things the way they really are. There’s no harm in wanting to be part of something “bigger” – except when it prevents us from DOING something bigger ourselves because we’re too caught up in constantly being with someone bigger.

I touched future stardust on the playground with Lew Alcindor, and it was fine. More important was my ability not to view that as the pinnacle of my life. I never made “Alpha” – but achieving “Beta” is nothing to be ashamed of, especially when there are a lot more letters lower down in the alphabet…

Religion: Belief vs. Practice

In our family, if and when I was home at bedtime, I was the one who put our young son Avihai to bed. One evening, when he was around seven years old, he startled me: “Abba,” he looked me straight in the eye, “I don’t believe in God”. We then had a very interesting conversation…

That got me thinking. Several questions ultimately came to mind. First, what do people do if they are constitutionally unable to “believe” in a higher power? Second, is religion “universal” if there are people who honestly do not believe in God? Third, and for me a much more “practical” question: can one partake of a religion, or be part of a religious community, if there’s no real belief in the Almighty (however defined)?

The first is relatively easy to answer: there are many “atheists” in the world – or at least “agnostics” (who are not sure whether there is or isn’t a God). And they manage to lead normal lives. There is no evidence that such non-believers are less or more “moral” than religious believers. However, I would personally argue that they are by far the most existentially “courageous” of all people, because they go through life feeling that there is no “ultimate”/metaphysical meaning to life or, for that matter (pun intended), to the universe. It’s not easy to live a life where “this is all there is”.

The second question is a bit more complex. Religion is NOT “universal”, if we mean that all humans are believers in a higher power. However, it seems to be societally universal, i.e. we haven’t found any society – primitive or “advanced” – where religion doesn’t exist. And this statement goes back as far as we have any evidence regarding how homo sapiens lived: even tens of thousands of years ago they/we had burial rites, among other elements that we recognize as constituting “religion”.

The third question is (for me) the most interesting, for two related reasons. I have doubts regarding “God”, at least as He (I prefer “It”) is generally understood. But I am positive that in this I have lots of company – many people I know who are to some extent religiously observant clearly also have serious (and sincere) doubts about “God”. But notice the curious three words in this last sentence: “are…religiously observant”. How can that be – and why?

I can really only answer from a Jewish perspective. Indeed, it might well be that this is the antithesis of a Christian standpoint. Judaism is a religion first and foremost of PRACTICE. I once asked a few rabbis whom I knew well the following theoretical question: “If a Jew came to you with the following completely dichotomous choice – complete belief in God but no religious practice, or complete acceptance and observance of all the Commandments but with no belief in the Almighty – which would you instruct them to follow?” All the rabbis so queried answered: practice without belief. As best I can tell, Christian theologians would say the opposite, given that their religion is based primarily on belief in Christ, etc.

Nevertheless, how can someone practice their religion without believing in the supernatural foundation or source behind such practice? Several answers are possible (not necessarily contradictory). First, religion deals with moral behavior; just as we accept and hew to secular laws (written by human beings) because they enable society to function, so too we can follow longstanding moral strictures and commandments, even if they evolved from and were written by humans many centuries or even millennia ago. Second, many religious practices might have developed over time because they proved their utility. An example in Judaism: ritually washing hands before mealtime – long before the science of personal hygiene was understood by anyone. But such a “custom” might have morphed into a religious “commandment” as society began to notice that people who hand-wash before eating tended to be healthier than those who didn’t. In Judaism, one can find many such commandments, although we will never know whether “utility” was the original source of any specific one.

A third answer is probably the most profound one of all: religion tends to offer a respite from the travails of life. In olden times, people worked themselves to the bone – religious service enabled some rest and succor from fieldwork; in modern times, with their social anomie in an increasingly atomized society, religious gatherings enable steady social intercourse and the building of a solid “community”. Indeed, this is most probably the reason that recent research has found longer lifespans among communal “worshippers” than among non-“church-goers”: there is less loneliness when one belongs to a religious community (the religious believer, of course, will argue that God is rewarding the worshipper…). In short, there are some highly utilitarian benefits to practicing religion – whether one “believes” or not.

This brings me back to the atheists. I fully accept their right to not believe. However, they are doing a disservice by trying to convince others that religious belief is false. To the atheist, I put the following question: if you were a physician, and a patient came to you with what you understood (after a thorough checkup) to be a psychosomatic “ailment”, would you provide that person with a placebo in the hope that it will help (it usually does!), or tell them to “get real; there’s nothing wrong with you”? The same goes for the religious believer suffering (as most humans tend to do at some point) from existential angst about the meaning of life. At the least, religious belief provides psychological comfort – and in many cases, religious practice can do a lot more than that.