Holocaust Survivor Primo Levi on Human Nature, Happiness and Unhappiness, and the Interconnectedness of Our Fates

Primo Levi

“A country is considered the more civilized the more the wisdom and efficiency of its laws hinder a weak man from becoming too weak or a powerful one too powerful.”

“If during the next million generations there is but one human being born in every generation who will not cease to inquire into the nature of his fate, even while it strips and bludgeons him, some day we shall read the riddle of our universe,” Rebecca West wrote in her extraordinary 1941 treatise on survival and the redemption of suffering. One such unrelenting inquirer into the nature of his barely survivable fate was the great Italian Jewish chemist and writer Primo Levi (July 31, 1919–April 11, 1987), who was thrown into a Nazi death camp shortly after West set her timeless words to paper. Arrested as a member of the anti-Fascist resistance and deported to Auschwitz in 1944, Levi lived through the Holocaust and transmuted his horrifying confrontation with death into a humanistic force of justice and empathy under the lifelong conviction that “no human experience is without meaning or unworthy of analysis.”

In Survival in Auschwitz (public library), originally published as If This Is a Man, Levi wrests from what he witnessed and endured profound insight into some of the most elemental questions of human existence: what it means to be happy, why we habitually self-inflict unhappiness, how to fathom unfathomable suffering, where the seedbed of meaning resides.

Of the forty-five people crammed into the train car that took Levi to Auschwitz, which he notes was “by far the most fortunate wagon,” only four survived. Toward the end of his memoir, in diaristic form, he offers a harrowing perspective barely imaginable to any free person:

This time last year I was a free man: an outlaw but free, I had a name and a family, I had an eager and restless mind, an agile and healthy body. I used to think of many, far-away things: of my work, of the end of the war, of good and evil, of the nature of things and of the laws which govern human actions; and also of the mountains, of singing and loving, of music, of poetry. I had an enormous, deep-rooted foolish faith in the benevolence of fate; to kill and to die seemed extraneous literary things to me. My days were both cheerful and sad, but I regretted them equally, they were all full and positive; the future stood before me as a great treasure. Today the only thing left of the life of those days is what one needs to suffer hunger and cold; I am not even alive enough to know how to kill myself.

It takes an extraordinary person to not only survive such a devastating extreme of inhumanity but to emerge from it with the awareness that existence always leans toward equilibrium. Reflecting on his experience in the camp, Levi writes:

Sooner or later in life everyone discovers that perfect happiness is unrealizable, but there are few who pause to consider the antithesis: that perfect unhappiness is equally unattainable. The obstacles preventing the realization of both these extreme states are of the same nature: they derive from our human condition which is opposed to everything infinite. Our ever-insufficient knowledge of the future opposes it: and this is called, in the one instance, hope, and in the other, uncertainty of the following day. The certainty of death opposes it: for it places a limit on every joy, but also on every grief. The inevitable material cares oppose it: for as they poison every lasting happiness, they equally assiduously distract us from our misfortunes and make our consciousness of them intermittent and hence supportable…

more…

https://www.brainpickings.org/

WIKK WEB GURU

How to Tell a True Tale: Neil Gaiman on What Makes a Great Personal Story


Neil Gaiman (Photograph: Amanda Palmer)

“The gulf that exists between us as people is that when we look at each other we might see faces, skin color, gender, race, or attitudes, but we don’t see, we can’t see, the stories.”

“We tell ourselves stories in order to live,” Joan Didion memorably wrote. And perhaps we live in order to tell our stories — or, as Gabriel García Márquez put it in reflecting on his own story, “life is not what one lived, but what one remembers and how one remembers it in order to recount it.” To tell a story, Susan Sontag observed in her timeless advice to writers, “is to reduce the spread and simultaneity of everything to something linear, a path.”

And yet our means of making a clearing through the chaos of events matter as much as, if not more than, the events themselves. The best of our stories are those that transform and redeem us, ones that both ground us in ourselves by reminding us what it means to be human and elevate us by furnishing an instrument of self-transcendence.

What it takes to make such a clearing is what Neil Gaiman, a writer who knows a thing or two about what makes stories last and how storytelling enlarges our humanity, examines in his foreword to All These Wonders: True Stories About Facing the Unknown (public library), celebrating a quarter century of storytelling powerhouse The Moth.

The sequel to the volume that gave us what I continue to consider the greatest Moth story ever told, this wondrous collection contains forty-five stories about courage in the face of uncertainty by tellers as varied as a cognitive scientist and an Ultra-Orthodox Jew.

Reflecting on his own improbable path into the Moth community, where storytellers tell true stories in front of a live audience and end up feeling like they have “walked through fire and been embraced and loved,” Gaiman considers what makes a great Moth story — which is ultimately a question of what it is in a human story that anneals us to one another through the act of its telling:

The strange thing about Moth stories is that none of the tricks we use to make ourselves loved or respected by others work in the ways you would imagine they ought to. The tales of how clever we were, how wise, how we won, they mostly fail. The practiced jokes and the witty one-liners all crash and burn up on a Moth stage.

Honesty matters. Vulnerability matters. Being open about who you were at a moment in time when you were in a difficult or an impossible place matters more than anything.

Having a place the story starts and a place it’s going: that’s important.

Telling your story, as honestly as you can, and leaving out the things you don’t need, that’s vital.

The Moth connects us, as humans. Because we all have stories. Or perhaps, because we are, as humans, already an assemblage of stories. And the gulf that exists between us as people is that when we look at each other we might see faces, skin color, gender, race, or attitudes, but we don’t see, we can’t see, the stories. And once we hear each other’s stories we realize that the things we see as dividing us are, all too often, illusions, falsehoods: that the walls between us are in truth no thicker than scenery.

All These Wonders is replete with wondrous true stories of loves, losses, rerouted dreams, and existential crises of nearly every unsugarcoated flavor. Complement the theme of this new anthology with Anaïs Nin on how inviting the unknown helps us live more richly, Rebecca Solnit on how we find ourselves by getting lost, and Wisława Szymborska’s Nobel Prize acceptance speech on the generative power of not-knowing, then revisit Gaiman on why we read, the power of cautionary questions, and his eight rules of writing.

For a supreme taste of The Moth’s magic, see astrophysicist Janna Levin’s unparalleled story about the Möbius paths that lead us back to ourselves.

https://www.brainpickings.org/


The Deep Space of Digital Reading

16TH-CENTURY INTERNET: The “book wheel,” invented in 1588, was a rotating reading desk that allowed readers to flit among texts by giving the wheel a quick spin. (Wikipedia)

Why we shouldn’t worry about leaving print behind.

In A History of Reading, the Canadian novelist and essayist Alberto Manguel describes a remarkable transformation of human consciousness, which took place around the 10th century A.D.: the advent of silent reading. Human beings have been reading for thousands of years, but in antiquity, the normal thing was to read aloud. When Augustine (the future St. Augustine) went to see his teacher, Ambrose, in Milan, in 384 A.D., he was stunned to see him looking at a book and not saying anything. With the advent of silent reading, Manguel writes,

the reader was at last able to establish an unrestricted relationship with the book and the words. The words no longer needed to occupy the time required to pronounce them. They could exist in interior space, rushing on or barely begun, fully deciphered or only half-said, while the reader’s thoughts inspected them at leisure, drawing new notions from them, allowing comparisons from memory or from other books left open for simultaneous perusal.

To read silently is to free your mind to reflect, to remember, to question and compare. The cognitive scientist Maryanne Wolf calls this freedom “the secret gift of time to think”: When the reading brain becomes able to process written symbols automatically, the thinking brain, the I, has time to go beyond those symbols, to develop itself and the culture in which it lives.

A thousand years later, critics fear that digital technology has put this gift in peril. The Internet’s flood of information, together with the distractions of social media, threaten to overwhelm the interior space of reading, stranding us in what the journalist Nicholas Carr has called “the shallows,” a frenzied flitting from one fact to the next. In Carr’s view, the “endless, mesmerizing buzz” of the Internet imperils our very being: “One of the greatest dangers we face,” he writes, “as we automate the work of our minds, as we cede control over the flow of our thoughts and memories to a powerful electronic system, is … a slow erosion of our humanness and our humanity.”

There’s no question that digital technology presents challenges to the reading brain, but, seen from a historical perspective, these look like differences of degree, rather than of kind. To the extent that digital reading represents something new, its potential cuts both ways. Done badly (which is to say, done cynically), the Internet reduces us to mindless clickers, racing numbly to the bottom of a bottomless feed; but done well, it has the potential to expand and augment the very contemplative space that we have prized in ourselves ever since we learned to read without moving our lips.

Critics like to say the Internet causes our minds to wander off, but we’ve been wandering off all along.

The fear of technology is not new. In the fifth century B.C., Socrates worried that writing would weaken human memory, and stifle judgment. In fact, as Wolf notes in her 2007 book Proust and the Squid: The Story and Science of the Reading Brain, the opposite happened: Faced with the written page, the reader’s brain develops new capacities. The visual cortex forms networks of cells that are capable of recognizing letterforms almost instantaneously; increasingly efficient pathways connect these networks to the phonological and semantic areas of the cortex, freeing up other parts of the brain to put the words we read into sentences, stories, views of the world. We may not keep the Iliad in our heads any longer, but we’re exquisitely capable of reflecting on it, comparing it to other stories we know, and forming conclusions about human beings ancient and modern…

more…

http://nautil.us/issue/47/consciousness/the-deep-space-of-digital-reading-rp


Kierkegaard on Time, the Fullness of the Moment, and How to Bridge the Ephemeral with the Eternal

“The moment is not properly an atom of time but an atom of eternity. It is the first reflection of eternity in time, its first attempt, as it were, at stopping time.”

“All eternity is in the moment,” Mary Oliver wrote with an indebted eye to Blake and Whitman. “[Is] only the present comprehended?” Patti Smith asked two decades later in her magnificent meditation on time and transformation.

This temporal tension between the immediate and the eternal is one of the core characteristics and defining frustrations of the human experience — over and over, we strain to locate ourselves within time, against time, grasping for solid ground while aswirl in its unstoppable flow. We struggle to hold it all with what Bertrand Russell called “a largeness of contemplation,” but we continually suffer at the smallness of our temporal existence — suffering reflected in our cultural fascination with time travel, which illuminates the central mystery of human consciousness.

How to inhabit the time-scale of our existence without suffering and fill the moment with eternity is what the great Danish philosopher Søren Kierkegaard (May 5, 1813–November 11, 1855) explores in a portion of his 1844 classic The Concept of Anxiety, later included in the indispensable volume The Essential Kierkegaard (public library).

Søren Kierkegaard

A century before Borges’s famous proclamation — “time is the substance I am made of”— and more than a century and a half before Einstein revolutionized human thought by annealing our two primary modes of existence to one another in the single entity of spacetime, Kierkegaard writes:

Man … is a synthesis of psyche and body, but he is also a synthesis of the temporal and the eternal.

Centuries before physicists came to explore the science of why we can’t remember the future, Kierkegaard probes our familiar temporal ordering of events and experiences:

If time is correctly defined as an infinite succession, it most likely is also defined as the present, the past, and the future. This distinction, however, is incorrect if it is considered to be implicit in time itself, because the distinction appears only through the relation of time to eternity and through the reflection of eternity in time. If in the infinite succession of time a foothold could be found, i.e., a present, which was the dividing point, the division would be quite correct. However, precisely because every moment, as well as the sum of the moments, is a process (a passing by), no moment is a present, and accordingly there is in time neither present, nor past, nor future. If it is claimed that this division can be maintained, it is because the moment is spatialized, but thereby the infinite succession comes to a halt, it is because representation is introduced that allows time to be represented instead of being thought. Even so, this is not correct procedure, for even as representation, the infinite succession of time is an infinitely contentless present (this is the parody of the eternal).

[…]

The present, however, is not a concept of time, except precisely as something infinitely contentless, which again is the infinite vanishing. If this is not kept in mind, no matter how quickly it may disappear, the present is posited, and being posited it again appears in the categories: the past and the future.

The eternal, on the contrary, is the present. For thought, the eternal is the present in terms of an annulled succession (time is the succession that passes by). For representation, it is a going forth that nevertheless does not get off the spot, because the eternal is for representation the infinitely contentful present. So also in the eternal there is no division into the past and the future, because the present is posited as the annulled succession.

Time is, then, infinite succession; the life that is in time and is only of time has no present. In order to define the sensuous life, it is usually said that it is in the moment and only in the moment. By the moment, then, is understood that abstraction from the eternal that, if it is to be the present, is a parody of it. The present is the eternal, or rather, the eternal is the present, and the present is full…

more…

https://www.brainpickings.org/



The Nothingness of Personality: Young Borges on the Self

Illustration by Mimmo Paladino for a rare edition of James Joyce’s Ulysses

“There is no whole self. It suffices to walk any distance along the inexorable rigidity that the mirrors of the past open to us in order to feel like outsiders, naively flustered by our own bygone days.”

You find yourself in a city you hadn’t visited in years, walking along a street you had once strolled down with your fingers interlacing a long-ago lover’s, someone you then cherished as the most extraordinary person in the world, who is now married in Jersey with two chubby bulldogs. You find yourself shocked by how an experience of such vivid verisimilitude can be fossilized into a mere memory buried in the strata of what feels like a wholly different person, living a wholly different life — it was you who then lived it, and you who now remembers it, and yet the two yous have almost nothing in common. They inhabit different geographical and social loci, lead different lives, love different loves, dream different dreams. Hardly a habit unites them. Even most of the cells in the body striding down that street are different.

What, then, makes you you? And what is inside that cocoon of certitudes we call a self?

It’s an abiding question with which each of us tussles periodically, and one which has occupied some of humanity’s most fertile minds. The ancient Greeks addressed it in the brilliant Ship of Theseus thought experiment. Walt Whitman marveled at the paradox of the self. Simone de Beauvoir contemplated how chance and choice converge to make us who we are. Jack Kerouac denounced “the imaginary idea of a personal self.” Amelie Rorty taxonomized the seven layers of identity. Rebecca Goldstein examined what makes you and your childhood self the “same” person despite a lifetime of change.

The young Jorge Luis Borges (August 24, 1899–June 14, 1986) set out to explore this abiding question in one of his earliest prose pieces, the 1922 essay “The Nothingness of Personality,” found in his splendid posthumously published collection Selected Non-Fictions (public library).

Jorge Luis Borges, 1923

Shortly after his family returned to their native Buenos Aires after a decade in Europe and more than a year before he published his first collection of poems, the 22-year-old Borges begins by setting his unambiguous, unambivalent intention:

I want to tear down the exceptional preeminence now generally awarded to the self, and I pledge to be spurred on by concrete certainty, and not the caprice of an ideological ambush or a dazzling intellectual prank. I propose to prove that personality is a mirage maintained by conceit and custom, without metaphysical foundation or visceral reality. I want to apply to literature the consequences that issue from these premises, and erect upon them an aesthetic hostile to the psychologism inherited from the last century, sympathetic to the classics, yet encouraging to today’s most unruly tendencies.

Exactly three decades before he faced his multitudes in the fantastic Borges and I, he writes:

There is no whole self. Any of life’s present situations is seamless and sufficient. Are you, as you ponder these disquietudes, anything more than an indifference gliding over the argument I make, or an appraisal of the opinions I expound?

I, as I write this, am only a certainty that seeks out the words that are most apt to compel your attention. That proposition and a few muscular sensations, and the sight of the limpid branches that the trees place outside my window, constitute my current I.

It would be vanity to suppose that in order to enjoy absolute validity this psychic aggregate must seize on a self, that conjectural Jorge Luis Borges on whose tongue sophistries are always at the ready and in whose solitary strolls the evenings on the fringes of the city are pleasant.

Illustration by Cecilia Ruiz from The Book of Memory Gaps, inspired by Borges

Half a century before neuroscientists demonstrated that memory is the seedbed of the self, Borges writes:

There is no whole self. He who defines personal identity as the private possession of some depository of memories is mistaken. Whoever affirms such a thing is abusing the symbol that solidifies memory in the form of an enduring and tangible granary or warehouse, when memory is no more than the noun by which we imply that among the innumerable possible states of consciousness, many occur again in an imprecise way. Moreover, if I root personality in remembrance, what claim of ownership can be made on the elapsed instants that, because they were quotidian or stale, did not stamp us with a lasting mark? Heaped up over years, they lie buried, inaccessible to our avid longing. And that much-vaunted memory to whose ruling you made appeal, does it ever manifest all its past plenitude? Does it truly live? The sensualists and their ilk, who conceive of your personality as the sum of your successive states of mind, are similarly deceiving themselves. On closer scrutiny, their formula is no more than an ignominious circumlocution that undermines the very foundation it constructs, an acid that eats away at itself, a prattling fraud and a belabored contradiction…

more…

https://www.brainpickings.org/


Polish Poet and Nobel Laureate Wisława Szymborska on How Our Certitudes Keep Us Small and the Generative Power of Not-Knowing

Art by Salvador Dalí from a rare edition of Alice’s Adventures in Wonderland

“Attempt what is not certain. Certainty may or may not come later. It may then be a valuable delusion,” the great painter Richard Diebenkorn counseled in his ten rules for beginning creative projects. “One doesn’t arrive — in words or in art — by necessarily knowing where one is going,” the artist Ann Hamilton wrote a generation later in her magnificent meditation on the generative power of not-knowing. “In every work of art something appears that does not previously exist, and so, by default, you work from what you know to what you don’t know.”

What is true of art is even truer of life, for a human life is the greatest work of art there is. (In my own life, looking back on my ten most important learnings from the first ten years of Brain Pickings, I placed the practice of the small, mighty phrase “I don’t know” at the very top.) But to live with the untrammeled open-endedness of such fertile not-knowing is no easy task in a world where certitudes are hoarded as the bargaining chips for status and achievement — a world bedeviled, as Rebecca Solnit memorably put it, by “a desire to make certain what is uncertain, to know what is unknowable, to turn the flight across the sky into the roast upon the plate.”

That difficult feat of insurgency is what the great Polish poet Wisława Szymborska (July 2, 1923–February 1, 2012) explored in 1996 when she was awarded the Nobel Prize in Literature for capturing the transcendent fragility of the human experience in masterpieces like “Life-While-You-Wait” and “Possibilities.”

In her acceptance speech, later included in Nobel Lectures: From the Literature Laureates, 1986 to 2006 (public library) — which also gave us the spectacular speech on the power of language Toni Morrison delivered after becoming the first African American woman to win the Nobel Prize — Szymborska considers why artists are so reluctant to answer questions about what inspiration is and where it comes from:

It’s not that they’ve never known the blessing of this inner impulse. It’s just not easy to explain something to someone else that you don’t understand yourself.

Noting that she, too, tends to be rattled by the question, she offers her wieldiest answer:

Inspiration is not the exclusive privilege of poets or artists generally. There is, has been, and will always be a certain group of people whom inspiration visits. It’s made up of all those who’ve consciously chosen their calling and do their job with love and imagination. It may include doctors, teachers, gardeners — and I could list a hundred more professions. Their work becomes one continuous adventure as long as they manage to keep discovering new challenges in it. Difficulties and setbacks never quell their curiosity. A swarm of new questions emerges from every problem they solve. Whatever inspiration is, it’s born from a continuous “I don’t know.”

In a sentiment of chilling prescience today, as we witness tyrants drunk on certainty drain the world of its essential inspiration, Szymborska considers the destructive counterpoint to this generative not-knowing:

All sorts of torturers, dictators, fanatics, and demagogues struggling for power by way of a few loudly shouted slogans also enjoy their jobs, and they too perform their duties with inventive fervor. Well, yes, but they “know.” They know, and whatever they know is enough for them once and for all. They don’t want to find out about anything else, since that might diminish their arguments’ force. And any knowledge that doesn’t lead to new questions quickly dies out: it fails to maintain the temperature required for sustaining life. In the most extreme cases, cases well known from ancient and modern history, it even poses a lethal threat to society.

This is why I value that little phrase “I don’t know” so highly. It’s small, but it flies on mighty wings. It expands our lives to include the spaces within us as well as those outer expanses in which our tiny Earth hangs suspended. If Isaac Newton had never said to himself “I don’t know,” the apples in his little orchard might have dropped to the ground like hailstones and at best he would have stooped to pick them up and gobble them with gusto. Had my compatriot Marie Sklodowska-Curie never said to herself “I don’t know”, she probably would have wound up teaching chemistry at some private high school for young ladies from good families, and would have ended her days performing this otherwise perfectly respectable job. But she kept on saying “I don’t know,” and these words led her, not just once but twice, to Stockholm, where restless, questing spirits are occasionally rewarded with the Nobel Prize…

more…

https://www.brainpickings.org/


The Sane Society: The Great Humanistic Philosopher and Psychologist Erich Fromm on How to Save Us From Ourselves

“The whole life of the individual is nothing but the process of giving birth to himself; indeed, we should be fully born, when we die.”


“Every advance of intellect beyond the ordinary measure,” Schopenhauer wrote in examining the relationship between genius and insanity, “disposes to madness.” But could what is true of the individual also be true of society — could it be that the more so-called progress polishes our collective pride and the more intellectually advanced human civilization becomes, the more it risks madness? And, if so, what is the proper corrective to restore our collective sanity?

That’s what the great German humanistic philosopher and psychologist Erich Fromm (March 23, 1900–March 18, 1980) explores in his timely 1956 treatise The Sane Society (public library).

Fifteen years after his inquiry into why totalitarian regimes rise in Escape from Freedom, Fromm examines the promise and foibles of modern democracy, focusing on its central pitfall of alienation and the means to attaining its full potential — the idea that “progress can only occur when changes are made simultaneously in the economic, socio-political and cultural spheres; that any progress restricted to one sphere is destructive to progress in all spheres.”

Two decades before his elegant case for setting ourselves free from the chains of our culture, Fromm weighs the validity of our core assumption about our collective state:

Nothing is more common than the idea that we, the people living in the Western world of the twentieth century, are eminently sane. Even the fact that a great number of individuals in our midst suffer from more or less severe forms of mental illness produces little doubt with respect to the general standard of our mental health. We are sure that by introducing better methods of mental hygiene we shall improve still further the state of our mental health, and as far as individual mental disturbances are concerned, we look at them as strictly individual incidents, perhaps with some amazement that so many of these incidents should occur in a culture which is supposedly so sane.

Can we be so sure that we are not deceiving ourselves? Many an inmate of an insane asylum is convinced that everybody else is crazy, except himself.

Fromm notes that while modernity has increased the material wealth and comfort of the human race, it has also wrought major wars that killed millions, during which “every participant firmly believed that he was fighting in his self-defense, for his honor, or that he was backed up by God.” In a sentiment of chilling pertinence today, after more than half a century of alleged progress has drowned us in mind-numbing commercial media and left us to helplessly watch military budgets swell at the expense of funding for the arts and humanities, Fromm writes:

We have a literacy above 90 per cent of the population. We have radio, television, movies, a newspaper a day for everybody. But instead of giving us the best of past and present literature and music, these media of communication, supplemented by advertising, fill the minds of men with the cheapest trash, lacking in any sense of reality, with sadistic phantasies which a halfway cultured person would be embarrassed to entertain even once in a while. But while the mind of everybody, young and old, is thus poisoned, we go on blissfully to see to it that no “immorality” occurs on the screen. Any suggestion that the government should finance the production of movies and radio programs which would enlighten and improve the minds of our people would be met again with indignation and accusations in the name of freedom and idealism.

Art by Edward Gorey from The Shrinking of Treehorn

Less than a decade after the German philosopher Josef Pieper made his beautiful case for why leisure is the basis of culture, Fromm adds:

We have reduced the average working hours to about half what they were one hundred years ago. We today have more free time available than our forefathers dared to dream of. But what has happened? We do not know how to use the newly gained free time; we try to kill the time we have saved, and are glad when another day is over… Society as a whole may be lacking in sanity.

Fromm points out that we can only speak of a “sane” society if we acknowledge that a society can be not sane, which in turn requires a departure from previous theories of sociological relativism postulating that “each society is normal inasmuch as it functions, and that pathology can be defined only in terms of the individual’s lack of adjustment to the ways of life in his society.” Instead, Fromm proposes a model of normative humanism — a redemptive notion that relieves some of our self-blame for feeling like we are going crazy, by acknowledging that society itself, when bedeviled by certain pathologies, can be crazy-making for the individual…

more…

https://www.brainpickings.org/
