Less is More When You Want to Get Something Done

I am about to admit to the most unprofessional thing I have ever “done” – or perhaps I should say “not done”. As a professor of communications, and especially “New Media”, I have refused from the beginning to join Facebook, Instagram, Twitter, TikTok, and almost every other social medium (except for a couple of specific, limited-to-friends/family WhatsApp groups). I’ll leave for another day how I managed to do serious research – or even to teach – in the field. For now, I want to offer my reasons – ones that everyone else should consider as well.

First and perhaps least, albeit still worthy of consideration, is the personal privacy issue. Social media know just about everything there is to know about each and every one of us – at least those who are “on”. They even know things about you that you might not know about yourself, because their algorithms can correlate different aspects of your online behavior to develop a “profile”: who you are (race, religion, etc.), what you like (pick any field of life), and how you think (extroverted vs. shy, smartass vs. humble, genius or moron…). Even aspects that you have never mentioned online (e.g., religion) will be “guessed” with a very high degree of accuracy, because there are thousands of others out there whose online activity exactly “matches” yours – and they have also mentioned the one “missing” element you haven’t. This profile is then used to “nudge” you – “manipulate” is probably the more appropriate word – in language that best suits your profile, into doing things that “add value” to that social medium, whether through advertising or extracting data to sell to others.
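For readers curious about the mechanics, here is a minimal sketch – in Python, with entirely invented data and a hypothetical infer_missing_trait helper, not any platform’s actual code – of the kind of “matching” inference described above: find the users whose online behavior most resembles yours, and borrow the attribute you never disclosed from what they did disclose.

```python
# A minimal, illustrative sketch (not any platform's actual code) of
# inferring an undisclosed attribute from "matching" users' behavior.
from collections import Counter

# Hypothetical behavior data: which topics each user engaged with,
# plus a trait that some users happened to disclose.
users = {
    "alice": {"likes": {"chess", "opera", "hiking"}, "religion": "A"},
    "bob":   {"likes": {"chess", "opera", "baking"}, "religion": "A"},
    "carol": {"likes": {"surfing", "gaming"},        "religion": "B"},
    "dave":  {"likes": {"chess", "hiking", "opera"}, "religion": "A"},
}

def jaccard(a, b):
    """Similarity between two sets of interests (0 = disjoint, 1 = identical)."""
    return len(a & b) / len(a | b) if a | b else 0.0

def infer_missing_trait(target_likes, k=3):
    """Guess the undisclosed trait by majority vote among the k most similar users."""
    ranked = sorted(users.items(),
                    key=lambda kv: jaccard(target_likes, kv[1]["likes"]),
                    reverse=True)
    votes = Counter(u["religion"] for _, u in ranked[:k])
    return votes.most_common(1)[0][0]

# A new user who never mentioned religion, but whose behavior "matches" others:
print(infer_missing_trait({"chess", "opera"}))  # -> "A" (guessed, never stated)
```

Real systems use vastly larger behavior vectors and statistical models rather than a toy vote, but the underlying logic is the same: traits you never disclosed are inferred from the people who “look like” you online.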

Second, social media are huge time-wasters. Of course, we all wish (and need) to communicate with others, and social media even enable us to find “others” that we want to converse with but didn’t know how to reach in the past, e.g., distant relatives, grade school friends, etc. But once we’re online, social media are highly expert at keeping us there – tweeting, sharing, uploading, and liking – well beyond what we might normally do in regular conversation.

Third, we all like to gossip once in a while; it’s totally human. But the Net amplifies our “gossip” routines exponentially. Indeed, that extends not only to our personal lives; far worse, it also engulfs the “news”. In a word, we are awash in negativity. Some of it is important and real; we do need to know about society’s ills. However, most of it is either superficial – or far worse, manufactured, what we now call “fake news”. There are no inhibitions and (almost) no restrictions on what any person can write or share/forward. And that’s invariably negative stuff: from “the world (as we knew and loved it) is coming to an end” to “they are taking over our life” (aliens, radicals, Antifa or Proud Boys, Hamas or Breaking the Silence, etc.). Online, there is no border between positive reality (read Pinker’s Enlightenment Now for a dose of REAL positivity) and negative unreality.

Fourth, and for me the most important reason of all – connected to a couple of the previous problems – is the “information overload” problem. Part of this is FOMO, i.e., “Fear Of Missing Out”: who is saying what to whom, what just happened that I “need to know”? So we pile it on – and always leave our gadget “on”.

The problem of information surfeit is not just a matter of quantity; it’s first and foremost a mental obstacle to thinking, especially of the problem-solving or creative sort. The simple fact is that only when the brain is at rest – that is, not having to continually deal with external stimuli – can it begin to get “creative”. Most of us have experienced this phenomenon when we have a problem that we can’t solve, and then in the middle of the night or when daydreaming, “BANG!” the solution appears “out of nowhere”.

In fact, it’s not at all out of the blue. We are simply not aware that the brain is working even when “we” think it isn’t. Indeed (get ready for this strange conundrum), our brain makes decisions about 0.4 seconds before “we” become aware that it has come to that decision! In short, if the Bible considered Eve a “helpmate” to Adam, there actually was (and is) a helpmate prior to our spouse/partner: it’s our brain.

Unfortunately, the brain cannot “multitask” (not even female ones). If you’re keeping it continually occupied with social media communication, it doesn’t have the time to do the more important stuff. Even worse, the brain is by far the biggest energy consumer of all our organs – and if it’s expending most of its available energy on social media silliness and worse, then even if and when you provide it with some respite, it will need to rest rather than work on that problem that had you stumped.

The bottom line: when it comes to social media each of us has to decide whether to be primarily a consumer or a producer. I don’t mean being a consumer of others’ tweets and posts or producing your own clips and feeds. I mean either consuming other people’s online “product” or producing your own serious thinking and creativity offline – otherwise called the “real world”.

Recently, this basic idea has been pushed in such books as “24/6” by Tiffany Shlain and “Sacred Rest” by Dr. Saundra Dalton-Smith. As a “Conservadox” Jew, I find completely unplugging every Saturday (Shabbat) to be second nature – and the practice has hardly harmed all the modern Orthodox professionals out there, economically or otherwise. Quite the opposite! (American Jews report among the highest average incomes of any religious group in the U.S.)

So whatever your method, give yourself (or at least your brain; physically exercising while “thinking sweet-nothings” is recommended) a break! For a few hours a day, pull yourself away from your addictive smartphone – breakfast, lunch, and dinner are excellent times to start. You’ll be surprised at how much of that time becomes, and leads to, true “food for thought”.

A Hebrew Lesson in Zionism

In these “memoir” mini-chapters, I usually start with a short personal vignette as a springboard for discussing a larger issue relevant to all. This time I’ll reverse that; the general issue comes first – and then the vignette that will take up the rest of this essay. The theme is “self-negation of social status to help others”. The following story is almost incomprehensible in an American milieu; most Israelis, on the other hand, can tell a similar story. Which is why so many outsiders have fallen in love with the country, despite its many, many flaws…

Nine days after landing in Israel as a new immigrant (oleh khadash) back in 1977, I found myself standing in front of 200 students from the Department of Political Studies at Bar-Ilan University, to give my inaugural lecture – in Hebrew!! Yes, I had learned Hebrew from first grade onwards in my Jewish day school, through high school, but it was more “biblical” Hebrew than the modern lingo that had developed over the past century. In any case, I spent several hours preparing the lecture with a dictionary at my side, writing it out word for word – hardly the ideal way to present a lecture.

It went OK, or so I thought. Back in my office, I started working on the following week’s lecture (the course was “Introduction to Politics and Government”), when two considerably older gentlemen walked into my office. I had noticed them out of the corner of my eye during the lecture, and briefly wondered what they were doing there in a sea of youngsters, but I had to maintain my concentration.

“We really appreciate the effort you’re making,” they started off, without the usual niceties that I was used to back in the States. “For an oleh khadash, it was quite impressive. But still, here and there we couldn’t quite figure out what you were trying to say. Sometimes because the word you used wasn’t altogether right, and other times because the syntax of the sentence was too confusing.”

I could feel myself blush from embarrassment. I knew the lecture wasn’t “perfect”, but still…

“So, we would like to make you an offer. We’re willing to sit with you here every week, a day or two before your next lecture, and go over your notes to correct the Hebrew. That way, your talk will be clearer to all the other students, and most important, your Hebrew should get better quite quickly.”

I was completely taken aback. These were not kids trying to “pull one over” on me. A very generous offer, indeed. It was surprising, but you don’t look a gift horse in the mouth. I accepted graciously. The real surprise was yet to come.

Every week they came at the appointed time, and sat with me for close to two hours as we went over the lecture word for word, sentence by sentence. It was illuminating, and also at times funny. Once I had looked up the word for “lobbyist” and ended up (in modern Hebrew) with the person who cleans hotel lobbies. And so it went for several weeks. 

One day as they were sitting there, the department’s manager happened to pass by and peeked into my open-door office; I could see him stop short, startled. After my “buddies” left an hour later, he came into the office and asked what they were doing there. I explained the situation. His eyes bugged out.

“What’s the problem?” I asked. “Do you know who they are?” he responded hesitantly. “Sure,” I replied, “Yonah and Menachem.”

He laughed. “No, I mean do you know who they ARE?” I had no idea what he meant. “Nope.”

“The man called Yonah was the IDF (Tzahal) Central Command General during the Yom Kippur War – and Menachem was the IDF’s Military Governor of the entire West Bank!”

To say that I was taken aback would be an understatement; blown away was more like it. These two elderly gentlemen (57 and 63 at the time; I was all of 28), at the top of the Israeli social pyramid, had volunteered to spend many hours over the course of a full academic year to help a new immigrant. That was beyond my ken, as someone still harboring a fully American mentality.

And so it went for the whole year. Out of curiosity, I asked them both (gingerly) why they had decided to study in the university at their age. Yonah’s reply was standard: many high-level IDF officers do so upon retirement; back then there was no chance of higher education at the beginning of an army career. Menachem’s answer was far more interesting – here too, another glimpse of what makes Israelis tick (and how).

Back in 1967, the Six-Day War ended quickly – well, in six days. Israel was surprised to find itself in control of the entire West Bank (Israel had actually asked Jordan NOT to enter the war, but for geo-political reasons the Jordanians had “no choice”). What to do with this territory? How to run it? The IDF searched its personnel “database” to see whether anyone had any previous experience in “military government”, and Menachem Arkin’s name popped up – as a junior officer for the British military government in North Africa during World War II. So 25 years later, the IDF made him West Bank Military Governor!

It was supposed to be a four-year term of service, but that dragged on. Finally, in early 1973 he called “Moshe” (Dayan): “enough is enough; find a replacement”. Dayan promised to do so by the end of the year – and then in October, another war broke out! By 1976, Menachem was despairing of ever getting out of the army, being too “indispensable”. He asked his army buddies what to do. Their advice: “ask for a one-year leave of absence to study; they can’t deny you that.” And that’s how he found himself doing a B.A. in my department!

Israel: a truly insane country, mostly for the better…

Touching the Stars

I grew up in Washington Heights, in northern Manhattan. Every Friday after school (or Sunday, if the weather was bad on Friday), I’d go to Fort Tryon Park – home of the famous Cloisters, the “medieval-style” museum (the park was designed by Frederick Law Olmsted Jr., son of Central Park’s designer – but I digress). At the entrance to the park was a large basketball playground where guys would show up for serious pick-up games. Occasionally, we would be joined by a very tall teenager by the name of Lew Alcindor…

If that name doesn’t mean anything to you, here he is in a nutshell: possibly the greatest high school basketball player of all time (only Wilt Chamberlain was of the same stature). Of course, it helps that at the age of 16 he was already 6’ 10” (approximately 2.08m). Alcindor lived about a mile from the park and would take the 10-minute bus uptown to shoot hoops with us. As an aside, I can vouch for the fact that even then he was a fine fellow – never playing under the basket (with his height, no competition there) but only shooting from the outside. Oh yes, I forgot to mention (for all you non-basketball mavens): he went on to a stellar college and professional career, converting to Islam and changing his name to Kareem Abdul-Jabbar. (If that name still doesn’t ring a bell, where have you been living these past decades?)

For me, my few pickup basketball games with the “Great One” are a matter of “that’s nice” – but nothing to brag about. After all, I didn’t do anything here other than be in the same place, playing a few games, where he too happened to be. However, I find it very interesting that every time I mention this vignette in passing, people are not only amazed but also never forget the “story” – even years and decades later.

Which brings me to the point of this essay: humans are social animals through and through, i.e. we haven’t left behind our “caveman” nature by much, if at all. The term “social” sounds nice but embedded within it are all sorts of other characteristics, a central trait being “competitive hierarchy”. Let it be said at the outset that while most “primitive” societies (then and now) have been patriarchal, we know of several that were and are matriarchal, meaning run by women. So the social structure is not gender-specific, even if most have been led by males.

Every human being strives for social status. In our distant past that was mostly (perhaps exclusively) for procreation – the more socially powerful, the more food and other resources the individual gained, thus attracting more fertile mates (yes, plural). Today, such social status is sought for other goals as well, but the underlying drive is still psycho-biological.

Who gets to the top of the pile? The “Alpha Male” (or Alpha Female). Others can be close hangers-on who are in the middle of the social hierarchy, and still others are socially “back of the pack”. The social “rat race” is perpetual, no matter our age: kids do it, adults do it, and even Third Agers do it (for a day or two, try being an anthropologist in a Retirement Home; social “competition” might express itself more subtly, but it’s there).

The Abdul-Jabbars of the world are “Alpha”, but what’s interesting are those in the coterie “circles” around them. Many people are Mid-Social due to their own accomplishments: a good lawyer; an excellent teacher; a minor league ballplayer. However, a large number get to the mid-hierarchy by “touching stardust”, i.e. being near, working for, or having some sort of relationship with, the Alpha.

Among other things, this explains the widespread (universal?) phenomenon we see on social media of “Leaders/Influencers” and their minions, e.g. Twitter “followers”, Facebook “sharers” and “likers”, and so on. By signing up and receiving constant feeds, people have the feeling that they are part of the VIP’s “circle”, even if it’s an outer one. And if they should actually get a real response from that Leader, well they’ve then moved into an inner circle of sorts – something that might be “worth” social status credit with their friends and family.

This sounds awfully crass, but real life isn’t all flowers and cookies. That’s not a “moral” judgment; I’m just calling things the way they really are. There’s no harm in wanting to be part of something “bigger” – except when it prevents us from DOING something bigger ourselves because we’re too caught up in constantly being with someone bigger.

I touched future stardust on the playground with Lew Alcindor, and it was fine. More important was my ability not to view that as the pinnacle of my life. I never made “Alpha” – but achieving “Beta” is nothing to be ashamed of, especially when there are a lot more letters further down the alphabet…

Religion: Belief vs. Practice

In our family, if and when I was home at bedtime, I was the one who put our young son Avihai to bed. One evening, when he was around seven years old, he startled me: “Abba,” he looked me straight in the eye, “I don’t believe in God”. We then had a very interesting conversation…

That got me thinking. Several questions ultimately came to mind. First, what do people do if they are constitutionally unable to “believe” in a higher power? Second, is religion “universal” if there are people who honestly do not believe in God? Third, and for me a much more “practical” question: can one partake of a religion or be part of a religious community, if there’s no real belief in the Almighty (however defined)?

The first is relatively easy to answer: there are many “atheists” in the world – or at least “agnostics” (who are not sure whether there is or isn’t a God). And they manage to lead normal lives. There is no evidence that such non-believers are less or more “moral” than religious believers. However, I would personally argue that they are by far the most existentially “courageous” of all people, because they go through life feeling that there is no “ultimate”/metaphysical meaning to life or, for that matter (pun intended), to the universe. It’s not easy to live a life where “this is all there is”.

The second question is a bit more complex. Religion is NOT “universal”, if we mean that all humans are believers in a higher power. However, it seems to be societally universal, i.e. we haven’t found any society – primitive or “advanced” – where religion doesn’t exist. And this statement goes back as far as we have any evidence regarding how homo sapiens lived: even tens of thousands of years ago they/we had burial rites, among other elements that we recognize as constituting “religion”.

The third question is (for me) the most interesting, for two related reasons. I have doubts regarding “God”, at least as He (I prefer “It”) is generally understood. But I am positive that in this I have lots of company – many people that I know who are to some extent religiously observant, clearly also have serious (and sincere) doubts about “God”. But notice the curious three words in this last sentence: “are…religiously observant”. How can that be – and why?

I can really only answer from a Jewish perspective. Indeed, it might well be that this is the antithesis of a Christian standpoint. Judaism is a religion first and foremost of PRACTICE. I once asked a few rabbis whom I knew well the following theoretical question: “If a Jew came to you with the following completely dichotomous choice – complete belief in God but no religious practice, or complete acceptance and observance of all the Commandments but with no belief in the Almighty – which would you instruct them to follow?” All the rabbis so queried answered: practice without belief. As best I can tell, Christian theologians would say the opposite, given that their religion is based primarily on belief in Christ, etc.

Nevertheless, how can someone practice their religion without believing in the supernatural foundation or source behind such practice? Several answers are possible (not necessarily contradictory). First, religion deals with moral behavior; just as we accept and hew to secular laws (written by human beings) because they enable society to function, so too we can follow longstanding moral strictures and commandments, even if they evolved from and were written by humans many centuries or even millennia ago. Second, many religious practices might have developed over time because they proved their utility. An example in Judaism: ritually washing hands before mealtime – long before the science of personal hygiene was understood by anyone. But such a “custom” might have morphed into a religious “commandment” as society began to notice that people who hand-wash before eating tended to be healthier than those who didn’t. In Judaism, one can find many such commandments, although we will never know whether “utility” was the original source of any specific one.

A third answer is probably the most profound one of all: religion tends to offer a respite from the travails of life. In olden times, people worked themselves to the bone – religious service enabled some rest and succor from fieldwork; in modern times, with the social anomie of an increasingly atomized society, religious gatherings enable steady social intercourse and the building of a solid “community”. Indeed, this is most probably the reason that recent research has found longer lifespans among communal “worshippers” than among non-“church-goers”: there is less loneliness when one belongs to a religious community (the religious believer, of course, will argue that God is rewarding the worshipper…). In short, there are some highly utilitarian benefits to practicing religion – whether one “believes” or not.

This brings me back to the atheists. I fully accept their right not to believe. However, they are doing a disservice by trying to convince others that religious belief is false. To the atheist, I put the following question: if you were a physician, and a patient came to you with what you understood (after a thorough checkup) to be a psychosomatic “ailment”, would you provide that person with a placebo in the hope that it will help (it usually does!), or tell them to “get real; there’s nothing wrong with you”? The same goes for the religious believer suffering (as most humans tend to do at some point) from existential angst about the meaning of life. At the least, religious belief provides psychological comfort – and in many cases, religious practice can do a lot more than that.

Maturity

In my teens, I was quite the voracious reader of fiction. Several books impressed me – e.g. Somerset Maugham’s Of Human Bondage, Thomas Hardy’s Tess of the d’Urbervilles – but above all, Ayn Rand’s Atlas Shrugged had a tremendous influence. Not so much for its literary “merit” (even then I realized that Shakespeare she’s not) but for the ideas Rand espoused so forcefully. I recently returned to the book. My reaction: what was I thinking back then???

Precisely the point. When we’re young, we think that we’re thinking. In a way, of course, we are “thinking” – but without much reflection. Strong ideas, simple to grasp, tend to grab our attention; complexity is perceived as wishy-washy. Just as our teenage hormones “rage” in one clear direction, so too our cognitive faculties home in, laser-like, on any idea that represents protest against the status quo. It’s an age when we can’t be too bothered by facts that get in the way.

As we grow older, most of us manage to “mature”. In a large sense, that’s another way of saying that our life experience wears down the sharp edges of assuredness, bringing a more complex and rounded understanding of social reality. That doesn’t mean that we stop strongly believing in ideas, ideologies, movements; it does mean that there’s also a “yes, but…” lying in the thicket of whatever opinion or belief we happen to hold.

I came across Ayn Rand’s tome (over 1,000 pages long!!) in the early 1960s. It was a stunning (for me) paean to capitalism and freedom. As a teenager, I hadn’t thought much about “economics”, but “freedom” was certainly on my mind – as it would be for any adolescent beginning to forge their own identity. The idea that we are the masters of our own fate, and that society shouldn’t restrict us in any significant way, had great meaning for anyone at my early stage in life.

What has happened since then? I began to look at the world in all its intricacy. Simultaneously, I started reading social-science research (after all, that is my general field): the way humans think and behave (psychology), the way societies work (sociology, politics, economics), and from there to the way (and why) we are constructed (biology, neuroscience). The more I read and delved, the clearer the picture became – not simpler, but rather far more nuanced and complex.

Put simply (and somewhat simplistically): there is no such thing as the “individual” divorced from society. Everything we believe in, all that we create and do, is to a very large extent a product of our social and physical environment. Indeed, Hebrew (Judaism) has two words for what in English we call “creation”: “briyah” and “yetzirah” – the difference between creatio ex nihilo (something from nothing) that only God can perform, and creatio ex materia (something from something) that describes what humans are (only) capable of. In other words, when we think of something “new”, it is basically a recombination of something (physical or ideational) that already exists in our world.

Most people are not voracious readers like I am, but everyone lives life in all its variety, up close and vicariously too (“the news”). The “up close” teaches us how to relate better to others: what drives others and ourselves, where they’re “coming from”, when to say (or not say) what in response. The news – external to our immediate life – provides perspective on “others”: different cultures and ways of living, different “narratives” and ways of thinking. Assuming that our eyes and ears are somewhat open to all this, we cannot help but become more “mature”, i.e., tolerant and able to accept life’s others and “otherness” with humility and grace.

Maturity has another side effect: happiness! Recent research clearly shows that around the world, as an age cohort older people (60s and up) are happier than any other age group. That’s true for three main reasons: first, the kids are out of the house; second, professionally we have achieved (or not) most of what we will ever accomplish. Third, and perhaps most important of all, we don’t take things so much to heart anymore; we’ve “seen it all” and for the “mature” person the good and the bad are all accepted with a gracious chuckle. Thus, as people approach the last stage of life, they paradoxically also tend to accept life far more from a “middle” standpoint.

Finally, there’s one more important aspect of the mature mind: accepting that whatever the current conventional wisdom and even the hard, factual evidence, we have to maintain a level of skepticism because of the lack of absolute surety. Even Newton’s “absolute” laws of physics eventually succumbed to new facts and a new way of understanding the universe (Einstein). Metaphorically, Sir Isaac was our teenage certainty; Albert constitutes later-age relativity. Great minds might not think alike, but mature minds do take the same approach to thinking about life.

Are We a Whole, or Many Holes?

For those of you who have read several (or most) of these “Reflections” essays, it might seem that they present a disjointed picture of yours truly. There isn’t much of an “autobiographical narrative” here. Which brings up a very interesting – indeed, even hugely “personal-philosophical” – question for each and every one of us. Through a lifetime, does a person consist of a stable “self”?

At the most basic biological level, you are not the same person you were yesterday – and certainly not what you were a few months ago, given how much of your body has died off and been replaced since then. We shed skin and water – with new skin and liquids constantly replenished; we lose bone and cartilage – and grow new cells to keep our bodies functioning; all the while, “zillions” of bacteria and viruses that reside permanently in our gut and elsewhere are constantly dying off, with others taking their place. Most fantastic of all (and still perplexing to neuroscientists) is the fact that even as brain cells die and the connections between them are constantly being rebuilt, we somehow retain our memories!

Nevertheless, my initial question above is not biological but rather psychological: are we the same person(ality) we were yesteryear in our thoughts, our emotions, our way(s) of viewing and dealing with the world? Is there any consistent “me” from childhood to late adulthood? Notice that I don’t even include “infancy” or “dotage” here, because clearly at the very start and very end of our life we are mostly not the person we will become or were. However, in the vast middle – are we really the same “person” in the deepest psychological and behavioral sense of the term?

There is no clear answer. On the one hand, most of us would unthinkingly say “of course I’m the same Sam”. Although our face and body change through the years, we can see the gradual progression; although we might be somewhat more “mature” later in life, we still react to things in generally the same idiosyncratic fashion as before.

Or do we? Here we come to the other side of the coin: memory. If we wish to view ourselves as the same person we always were, there has to be some internal narrative to support this. But except for the very few (un)lucky individuals who can recall every moment of their life, we have huge holes in our “internal narrative”. Even worse, much of what we do “remember” did not happen in the way we “recall”; in other words, much of the time we “invent” our life. This is a common (heavily researched and proven) problem in court cases where witnesses literally “re”member what they saw or heard. Little do we know that we are quite bad “witnesses” of our own life!

And now for the BIG question: do you (past or present) try to “make sense” of your life, to see some overarching framework in what you have accomplished (personally and professionally)? In other words, notwithstanding the memory holes we share, even if we remembered everything, would that in itself constitute a “life narrative” with any motif? In fact, have you ever asked yourself that question?

I definitely do not mean to suggest that if you haven’t even asked it (meta-self-examination), there’s something wrong with you. You might actually believe – perhaps correctly – that there is no such thing as a “life narrative” that is stable, consistent, and unchanging. The reason there might not be such a thing is that we are not completely sovereign over our life; our social environment (broadly defined) affects and influences us every moment of the day. Thus, like a ship at sea buffeted by the waves, winds and currents, our life deals more with “staying alive” than with inexorably charging forward to our self-defined “life-goal”. Moreover, we also might not feel that there’s a consistent narrative because we have actively tried “maturing” and changing “who we were”. Why look at a steady life as something admirable? As Emerson famously opined: “A foolish consistency is the hobgoblin of little minds.”

Yet ultimately that is somewhat unsatisfying because we want to feel that we “stand for something”, i.e., that there’s a “there, there” within us. We all want to believe that we have some control over who we were, and are, and will be – and if we do change, then at least it’s because we willed it.

The American moral philosopher J. David Velleman put it pithily: “We invent ourselves… but we really are the characters we invent.” In short, even if we aren’t in full control of who we actually are, we are in total control of what we say (to ourselves, and to a lesser extent to others) about who we are. Some of us see in ourselves a one-act play; others perceive ourselves to be a kaleidoscope of colors and shapes. Whether whole or full of holes, human beings are at least free to paint on their own canvas – even if the picture isn’t Realistic but rather Impressionistic or Surrealistic.

The Limits of Freedom

When I was a young boy, my parents bought us two parakeets. We kept them in their cage, well fed and taken care of, in the bedroom I shared with David. One day I felt sad for them because they didn’t have the opportunity to fly. So, I opened the cage door, gently took one out, and set it free. The bird was unsure what to do, but soon enough started flying around the room and then… flew straight into the window, cracked open its head, and died on the spot. 

At the time, it was a “tragedy” for a young kid like me. Looking back, though, I realize that there were a few lessons to be learned from this incident.

First, don’t do something “different” unless you consider possible unintended consequences. Of course, the world is too complex to be able to foresee everything that might happen, but we can certainly think through several, relatively likely possibilities and perhaps a couple of unlikely ones as well.

Second, and more difficult to absorb, is that unfettered freedom is actually counterproductive. Indeed, there’s a word for that: anarchy. We intuitively understand the problem in a few areas of life. Take sports: imagine a game where there are no rules, and every player can do whatever s/he wants in order to “score”. Not only wouldn’t that be very interesting for the participants, but it’s doubtful many spectators would find such a sport compelling. Or take music: what do most people want to hear – Mozart or some hyper-modern, “anything goes” composition? It is precisely the “limiting” structure of pre-20th century classical music that renders the music of Mozart (or any of his composer compatriots) so compelling.

This holds true not only for culture and entertainment. Even in general social life, complete freedom does not exist. One can’t walk anywhere, anytime; strolling down the middle of the street or crossing at a red light is not acceptable. One can’t even say whatever one wants, the First Amendment notwithstanding: yelling “Fire!” in a crowded theater (when there’s no fire) has no Constitutional protection. We even accept restrictions where no danger or harm is in sight: even if all parties are willing, no more than two people can marry each other.

Why would this paradox hold true – that true freedom is only found within some structured or even “restrictive” framework? One way of looking at this is that society emulates Nature. The universe has “laws” of physics – there is no “cosmic chaos”, even if black holes and other outer-space phenomena might seem that way. Given that human beings are part of that same natural world, why wouldn’t homo sapiens behave along the same principle? Indeed, we don’t think twice about the fact that gravity stops us from flying – it’s just one of the numerous “limitations” on our freedom that we accept because… well, because that’s the way the world works.

Another approach is to consider what drives humans after their basic needs are met. In a word: challenge. By our very psychological nature we are problem-solvers. At first, to stay alive; later, to keep us mentally stimulated or having a “better” life. But a problem, task, or challenge to be solved, completed or overcome can only be stimulating if it is circumscribed in some way: a frame around a painting; rules to complete a puzzle; laws allowing and forbidding certain advertising practices; professional licenses for those completing a specific course of study and passing an exam; and so on. The fun is in difficulty overcome, not in taking the easy way home.

In retrospect, as that young child I had two other choices: open my bedroom windows before releasing the parakeet – or lower the shades.

The first option would have provided my bird with complete freedom – and then it would most probably have starved to death, not being trained to find food by itself. The second option would have restricted the parakeet’s freedom of movement even more by darkening the room – but that in itself would have forced it to be more careful, and thus survive the experience!

At the risk of sounding “pedantic”, there’s a much larger lesson here. I’ll take the latest “political” brouhaha as my example. During Covid-19, every country has had to deal with the following dilemma: bar those who refuse to wear a mask and get vaccinated from certain activities – or enable “complete freedom” as a function of individual rights? At least for me, my parakeet episode provides the correct answer.

Hedgehog or Fox?

Back in the mid-1980s, when I was about a decade into my academic career, a senior colleague and close friend of mine gave me some advice as I was preparing my “file” for promotion: “Sam, you’re studying too many different subjects. Try to focus on one and be an expert in that. Academia today wants specialists, not generalists.”

One of the ironies in this vignette is that my friend is British – from the same country where one of the foremost philosophers of the 20th century worked: Prof. Isaiah Berlin. Among other things, Berlin is famous for his essay “The Hedgehog and the Fox”. Taking his cue from the ancient Greek poet Archilochus, Berlin divided thinkers into two categories: hedgehogs, who see the world through the perspective of one single, defining idea; and foxes, who draw on a wide variety of life experiences and areas of knowledge. As Berlin explained, the former dig deep into one subject area, gaining great expertise in that field; the latter bring together ideas from several sources, even if the fox’s knowledge base is more superficial than the hedgehog’s.

In our world, both are useful and even necessary. We accumulate detailed knowledge incrementally through many hedgehogs – and then a few foxes try and “connect the dots” between ostensibly disparate fields. Most big paradigmatic advances in modern times have been contributed by foxes, but they could not have done it without the groundwork laid by the hedgehogs.

So why the friendly “warning” of my friend and colleague? Because the world of academic research has come to reward – almost exclusively – hedgehog work. The main reason for that is the “measurability” of their contribution, especially the number of articles they publish in prestigious journals. Foxy work, on the other hand, is far harder to evaluate, especially because almost all of it appears in books and not articles – and producing a book is far more painstaking than putting out articles. Thus, the fox’s “output” tends to be far smaller than that of the scholarly hedgehog.

I knew that my friend was correct, functionally. But I basically ignored his advice and to a certain extent “suffered” academically from that decision. Why ignore something that I understood as being correct, at least from a “career” perspective?

Two reasons. First, by nature I am a tried-and-true fox – deeply interested in several areas of life, and able to contribute something in each by combining a few (I try not to use this platform for self-aggrandizement, but just this once: back in 1981 I published the first-ever scholarly article on the legal ramifications of artificial intelligence, still cited today in the latest research literature. End of my “promo”).

However, the second reason is far more important, and I already hinted at it above: although few and far between, it is the fox who has the chance to make some major breakthrough, or at least to shed new light on social phenomena, bringing to bear knowledge that seems to be far removed from the issue at hand.

By now you are probably saying to yourself: OK, interesting, but what does this have to do with “real life”? Here we come to the crux of the matter: everything in our world is connected somehow to virtually everything else! The challenge is how to perceive and act on those connections. This problem is especially acute in politics and policymaking. Government agencies and ministries tend to be intellectual silos – hedgehogs burying (and buried) deep within their own field. But this invariably leads to bad policy.

I will offer just one policymaking example: trivial and critical at one and the same time. High school educators are steeped in the ins and outs of pedagogy but are oblivious to the discipline of chronobiology. You’re surely saying: chrono-what? Chronobiology is the science of our body’s internal clock, and it turns out that teenagers simply cannot get up early in the morning. So why do high schools still start classes so early??? It is completely counterproductive pedagogically!

And just because you aren’t part of public policymaking, you shouldn’t think this doesn’t affect you as well. We all have to make serious decisions in our personal life. Here too is but one example: thinking of buying a house? Always wanted to have a view of the sea? Well, think again. Among the few kinds of real estate in the U.S. that have gone DOWN in price over the past decade is housing within half a mile of the shore. The reason: global warming leads to sea-level rise, which brings on increased flooding in storms and in general. Not to mention that it’s worth considering whether to live in a wood-frame or stone/concrete home; the latter is the way to go in order to keep cool during increasingly hot weather! Who would have thought that real estate and climatology are linked?

I bring these examples to show that the “interaction” between fields of knowledge is not necessarily within the same general scientific area, i.e., not within the social sciences exclusively, ditto the natural sciences, or the same for the humanities – but rather between them. Of course, this is a huge problem, because when they were in college those same governmental policymakers studied almost exclusively within one general scientific field (e.g., psychology, economics and political science, without any history and literature, or biology and physics). I was lucky enough to have studied across fields (ergo my article combining Law and Computer Science), and thus was equipped to tackle some “foxy” issues.

I am not suggesting that everyone become a fox. Again, the world needs all the hedgehogs it can muster in order to gain discrete knowledge. However, it is important to break out of one’s own intellectual ghetto – and certainly in the public sphere to find ways to have a real dialogue, if not institutional arrangements, between very disparate fields of endeavor.

Then and Now: Staying Upbeat

Tami calls me a “Pollyanna” – almost always upbeat and optimistic about the present and future. Is that realistic? No… and yes. Is it beneficial? Absolutely YES!

Let’s start with a fact: some people are born with a natural disposition to be pessimistic, whereas others come into this world with an optimistic temperament. That doesn’t mean we can’t do anything to change or moderate our general perspective on life; personal change is possible, and even advisable. But such change is not necessarily easy, because it strikes at the very core of our personality. So, if what follows is persuasive but goes against your “grain”, with some mental effort you can certainly “modify” yourself.

What is not realistic about being constantly upbeat? Simply put, not everything “works out for the better”. As the colloquial saying goes: “shit happens”. Even if you believe in historical “progress” (more on that in a moment), it is not a constant, linear, upward slope. The slope of history acts more like a roller coaster – but despite the dips, it is one that generally moves higher. Sort of “two steps forward, one step backward.”

On the other hand, general optimism is definitely realistic – if one takes the “longer view”. I have always felt – and mention to my friends when they talk about the “good old days” – that if I had one tool to turn everyone in today’s world into an optimist, it would be this: a time machine to send them a few centuries back for a mere 24 hours. Let them freely wander about. They would probably not even last the full day.

For instance, back then the public stench was overwhelming, with garbage simply thrown out the window into the street – which is why to this day, in older parts of European cities, one can still see the “indentation” running down the stone street so that the effluents would be washed away during the rains. And if you tried to take refuge in people’s homes, well, their personal body odor would overwhelm us moderns as well: people showered once a month or at best once a week. Indeed, that is said to be the origin of the expression “don’t throw out the baby with the bathwater”. The entire family would bathe in the bathtub in sequential order: dad, mom, older siblings and down the line, so that by the time the baby’s turn came the bathwater was so dark and murky from everyone’s dirt that you couldn’t see “junior” – and might empty the bath (out the window), baby and all!

Actually, the situation was far worse: imagine living at a time when one in three of your children (and you had many!) died before reaching the age of 5; when a toothache meant physical extraction with pliers (and no anesthesia) – don’t even think about other, more invasive operations; and when you were lucky to live past your mid-thirties (roughly the average lifespan in Europe in the late 18th century – Mozart was 35 when he died). One can go on and on about nutrition (severely lacking) and poverty (the lot of the vast majority) – and that’s if you were lucky enough not to be a slave, or slaving away in indentured servitude.

In short, our life today is immeasurably better than anything that came before. So why shouldn’t we be optimistic about our personal as well as our social/national/global future?

But you might ask: does merely becoming (more) optimistic actually make a person’s life better? The surprising answer is a definite “yes”. First, although this sounds banal, it is also true: optimistic people enjoy life more, so there is a qualitative benefit. Even more germane is the “surprising” (?) fact that, all else being equal, optimistic people live a few years longer than pessimistic people!

None of this is to say that we should go through life apathetically because “everything will work out for the better”. Things won’t work out (as I already noted above) – unless we do something about them proactively. The human race in general, and each of us specifically, have powers of creative problem-solving that we are hardly aware of, until we start flexing that “muscle”. Here’s just one example among myriads.

A huge best-seller in the late 1960s was Paul Ehrlich’s The Population Bomb, a modern updating of Thomas Malthus’ infamous prediction that humanity would forever suffer from starvation and over-population warfare. What has happened since then? On the one hand, the world developed agricultural technology (e.g., “Golden Rice”: a genetically modified type of rice fortified with beta-carotene – the basis of vitamin A – that could spare millions of children from blindness). On the other hand, fertility rates have plummeted almost everywhere in the world, as parents decided to have fewer children in order to provide more to those that they do have.

So, when I hear the fear-mongers regarding, for example, climate change, I react in two ways: first, I’m happy that they’re warning us; second, precisely because of those warnings, I’m very optimistic that humanity will find the solutions to this real problem – just as they did with “overpopulation”. As the title of this post suggests: what happened “then” is not what will happen “now” or in the future. Optimism coupled with creative adaptation almost always brings about a positive outcome.

The Aging Brain

Like almost everyone else I know around my age, I find that remembering things is not as easy as it used to be. That refers more to recent stuff in my life than to “remembrance of things past” (to quote Proust). Nothing unusual about that. But what is somewhat “peculiar” is the fact that when it comes to another important part of my mental life, there is no diminution whatsoever: analysis and problem-solving. How come?

Let’s take memory first. There’s a common misconception that the progress of “memory” works as a parabola: slowly rising until our early Thirties, and then a slow decline for the rest of our life (for some, unfortunately, a faster decline in their Eighties and later). This is incorrect. In fact, we all undergo a massive form of “amnesia” around the age of eight (yes: 8) when the brain prunes most of our memories, probably to “make room” for the huge memory needs as we enter puberty and into the main learning period of our lives through the ensuing decade or so. That’s why we don’t remember anything from our toddler past (before 3 years old) – all those memories have been mind-expunged, as new neurons and synapses form.

Second, there’s the question: memory of what? I’m terrible at remembering names of people, even though I recognize the faces of people I haven’t seen in decades. If I see any word in print even only once, I never forget how it’s spelled; but I can’t very well recall events that I attended only a few years ago. Speaking of “years”, numbers of almost any sort (dates, how much something cost, etc.) – they’re all totally “sticky” in my brain. In short, there’s no one overall memory ability.

Why would that be? Simply put, because there is no single place in our brain where “memory” resides. Rather, memories are spread over different parts of our brain, each one close to a different sensory “module”. So if we went to a restaurant and had a great meal, the memory of that would be evoked (recalled) from the olfactory or taste sections of our brain. And as we are well aware, people’s senses vary in strength. For instance, I am very strong on “music” – once I hear a tune, I never forget it. But please don’t ask me what the song (or classical piece) is called. In fact, in most people “musical memory” is incredibly strong – one of the last things to go, even in the later stages of dementia.

So, if one cognitive ability – memory – has many different “parts”, it shouldn’t be very surprising that other mental abilities are also differential in their “stickiness”. Whereas I have increasing difficulty remembering what I ate yesterday evening, I can analyze problems just as quickly and sharply as in the past. Indeed, maybe even better than in the past. Why? That leads to another aspect of the brain.

There is an old adage: the wise person knows how much s/he doesn’t know. But the converse is true too: none of us knows how much we really know. That means that buried deep in our mind is a huge amount of latent knowledge based on our life experience. It lies there dormant until something in our life demands a response for which this or that piece of “forgotten” information or knowledge is called – and up it pops! Our brain is a massive and (usually) terrific archivist. And the longer we live, the more experience/info we accumulate, so that we are armed with more ammunition to resolve problems or analyze challenging issues. And that stuff doesn’t seem to decline much with age. For anyone even remotely active mentally, the amount of new information far surpasses the info our brain forgets from lack of use.

That’s why one of the three most important things we should be doing as we age is to keep the brain “challenged” with new types of information: learning a new language, honing a new skill, getting educated in an unfamiliar subject area. (Now you’re asking: what are the other two important things?  Good nutrition; and aerobic, physical exercise, such as fast walking, swimming etc.).

In short, the brain isn’t so much a single, three-pound lump of flesh as it is a jigsaw puzzle of many pieces that make up a complete picture. You can lose a piece here and there and still see what the whole is all about. Obviously, if you lost the whole upper right-hand quarter of the jigsaw, you might not be able to see what was there at all. Luckily for us, however, the brain is different in an important way: if we lose some neurons or even small sections of the “mind”, it is able to find “detours” and pick up that “lost” ability (at least in part) through other parts of the brain. So, if our olfactory sense and memory start failing, we might still be able to recall that delightful restaurant dinner through the “taste” or “sight” sections of our brain. And keeping the brain “challenged” throughout “retirement” ensures that it will be able to construct “detours” more effectively.

The famous actress Bette Davis once said: “Growing old is not for sissies!” And yet she herself had an indomitable spirit and stayed “sharp” to her dying day (obviously – that’s one of the pithiest comments about aging you will ever read!); she continued acting in film and on television until shortly before her death. The aging brain can continue to do many wonderful things for us despite some partial breakdowns here and there. Just keep it as oiled as you can.