Category: Literature


Illustration by Maurice Sendak from Open House for Butterflies by Ruth Krauss

“People who for some reason find it impossible to think about themselves, and so really be themselves, try to make up for not thinking with doing.”

In 1926, having just divorced her first husband at the age of twenty-five, the American poet, critic, essayist, and short story writer Laura Riding (January 16, 1901–September 2, 1991) moved to England and founded, together with her friend the poet Robert Graves, a small independent press. As with Anaïs Nin’s publishing venture, all of their early publications — which included work by Gertrude Stein — were typeset and printed by hand.

In 1930, Riding and Graves moved their offices to Majorca. That year, 29-year-old Riding wrote a series of letters to 8-year-old Catherine — the daughter of Graves and the artist Nancy Nicholson. Originally published by a Parisian press in a limited edition of 200 copies, each signed by the author, Four Unposted Letters to Catherine (public library) endures as a small, miraculous book, reminiscent in spirit of Rilke’s Letters to a Young Poet and in style and substance of the Zen teachings of Seung Sahn or Thich Nhat Hanh. With great simplicity and unpretentious sincerity, both comprehensible and enchanting as much to this particular little girl as to any child or even any wakeful grownup at all, Riding addresses some of the most elemental questions of existence — how to live a life of creativity and integrity, why praise and prestige are corrosive objects of success, and above all what it means to be oneself.

Riding eventually returned to America in 1939, remarried and became Laura (Riding) Jackson, continued to write, and lived to be ninety — a long life animated by the conviction that language is “the essential moral meeting-ground.” When she reflected on these letters three decades after writing them, she remarked wistfully that she might no longer be inclined to write “such easy-speaking letters, treating with so much diffident good-humor the stupendous, incessantly-urgent matter of Virtue and the lack of it,” by which she meant “the eternal virtue of good Being, not the mortal virtue of good Custom.” And yet, mercifully, she did once write them, and they did survive, and today they continue to nourish souls of all ages with their unadorned wisdom and transcendent truthfulness.

In the first of the four letters, a meandering meditation on young Catherine’s remark that grownups sometimes seem to “know everything about everything,” Riding explores the nature of knowledge and its essential seedbed of self-knowledge. She writes:

A child should be allowed to take as long as she needs for knowing everything about herself, which is the same as learning to be herself. Even twenty-five years if necessary, or even forever. And it wouldn’t matter if doing things got delayed, because nothing is really important but being oneself.

Nearly a century after Kierkegaard extolled the virtues of idleness and two decades before the German philosopher Josef Pieper argued that not-doing is the basis of culture, Riding urges young Catherine not to worry about being accused of laziness and considers the basic goodness of simply being oneself:

You seem to spend a lot of time dreaming about nothing at all. And yet you are, as the few people who really know you recognise, a perfect child… This is because when you seem to be dreaming about nothing at all you are not being lazy but thinking about yourself. One doesn’t say you are lazy or selfish. If a person is herself she can’t be a bad person in any way; she is always a good person in her own way. For instance, you are very affectionate, but that’s because you are a good person. You are not a good person just because you are affectionate. It wouldn’t matter if you weren’t affectionate, because you are a good person. You are yourself, and whatever you do is sure to be good.

In a passage that radiates a prescient admonition against the perils of our modern Parenting Industrial Complex, Riding adds:

It is very sad then that so many children are hurried along and not given time to think about themselves. People say to them when they think that they have been playing long enough: “You are no longer a child. You must begin to do something.” But although playing is doing nothing, you are really doing something when you play; you are thinking about yourself. Many children play in the wrong way. They make work out of play. They not only seem to be doing something, they really are doing something. They are imitating the grown-ups around them who are always doing as much instead of as little as possible. And they are often encouraged to play in this way by the grown-ups. And they are not learning to be themselves.

In an essential caveat that teases out the nuance of her point, Riding notes that rather than selfishness or narcissism, such thinking about oneself is the only way to conceive of one’s place within a larger world and therefore to think of the world itself. In a sentiment that calls to mind Diane Ackerman’s wonderful notion of “the plain everythingness of everything, in cahoots with the everythingness of everything else,” Riding offers an almost Buddhist perspective:…

more…

https://www.brainpickings.org/

 

WIKK WEB GURU


Art by Bobby Baker from Diary Drawings: Mental Illness and Me

How a visionary woman persisted in leading a quiet revolution in mental health.

“All good teachers know that inside a remote or angry person is a soul, way deep down, capable of a full human life,” Anne Lamott wrote in her beautiful meditation on the life-giving power of great teachers. Those whom teachers — and parents, friends, spouses, lovers — cannot reach, those whose inner turbulence has metastasized into acute mental illness and shipwrecked them on the remotest edges of the mind, are left to psychotherapists. But the most effective therapists are animated by the same unflinching conviction that within each patient lives an almost sacred person, and that no person, no matter how damaged and disturbed, is irredeemable or incapable of having a full life.

This was the animating ethos of pioneering psychotherapist Frieda Fromm-Reichmann (October 23, 1889–April 28, 1957), who had narrowly escaped from Nazi Germany, lived in exile in France and Palestine, and ended up in America to begin nothing short of a revolution in mental health care. (Adding another layer of rebellious complexity to her life was her decision to marry, while still in Germany, the great humanistic philosopher and psychologist Erich Fromm — her colleague and onetime patient, ten years her junior.) In many ways, she was the Oliver Sacks of mental health, not merely applying her robust professional expertise to the healing of her patients but bathing them in largehearted perseverance of faith in the inextinguishable light of their humanity.

Fromm-Reichmann was introduced into the popular imagination by the improbable 1964 hit novel I Never Promised You a Rose Garden — the faintly fictionalized autobiographical account of Joanne Greenberg, one of her patients, who had made a seemingly miraculous recovery from what is considered the most hopeless of mental illnesses: schizophrenia. Greenberg had entered Fromm-Reichmann’s care as a teenager so afflicted as to be gashing her arms with jagged tin can tops and putting out cigarettes in the wounds. She exited four years later as a fully functioning college student who went on to have a family and become a successful writer.

Although Greenberg wrote the novel under the pseudonym Hannah Green and christened Fromm-Reichmann “Dr. Fried,” details about the institution and their respective lives soon revealed their real identities. Against the odds of what seemed like an unusual and ill-advised premise for a popular novel, I Never Promised You a Rose Garden became a sensation, amassing a cult following through the six million copies sold in the decades since. But its most enduring feat was to make its millions of readers fall in love with Frieda Fromm-Reichmann and her maverick insistence that even the most tortured minds have a shot at serenity given enough attentive patience and persistence on behalf of those qualified to help them.

It was the novel that first introduced fifteen-year-old Gail Hornstein to Fromm-Reichmann’s work and planted the long-germinated seed of what would become, thirty-four years later, To Redeem One Person Is to Redeem the World: A Life of Frieda Fromm-Reichmann (public library) — a spectacular biography ten years in the making, which Hornstein, by then a psychologist herself, hadn’t set out to write but found herself unable not to.

Hornstein considers the seedbed of Fromm-Reichmann’s unusually tenacious and patient faith in the potential for healing:

Frieda’s capacity to wait had been honed as a child, when she trained herself to expand to infinity the time she gave her parents to tire of misunderstanding. Medical school in Königsberg was one long act of patience, designed to prove that she and the handful of other women deserved to be there. Later, working at a Prussian army hospital during World War I, she learned from brain-injured soldiers what it was like to have a shell explode in your face and still be alive. Their muteness became her measure. When she took up treating schizophrenics in the 1920s, they seemed so intact by comparison that she found the work a pleasure. Most psychiatrists, accustomed to treating the “worried well,” find the unbearably slow pace of therapy with psychotics intolerable. But Frieda could wait cheerfully through years of infinitesimal gain; the knowledge that recovery was anatomically possible was enough to keep her going. She could tolerate any behavior, no matter how disgusting or bizarre, so long as it seemed necessary to protect a vulnerable person. It was only when symptoms became ruses or habits that she started badgering patients to give them up and get better.

Fromm-Reichmann held nothing back in helping her patients — nothing of herself, and nothing of the often arbitrary rules by which her profession operated. Hornstein writes:

She was willing to try practically anything that might help them, which was a great deal more than most other psychiatrists were willing to do. She saw one patient at ten o’clock at night because that’s when he was most likely to talk. She took others on walks around hospital grounds, or to symphony concerts, or to country inns for lunch. Those too distraught to leave at the end of an hour were permitted to stay for two. If a patient was violent and couldn’t be let off the ward, she went to his room or saw him in restraints, if necessary. “She would have swung from the chandelier like Tarzan if she thought it would help,” Joanne Greenberg later observed. A colleague remarked, not admiringly, that Frieda’s patients got better because she simply gave them no other choice…

more…

https://www.brainpickings.org/

WIKK WEB GURU


Art by Paul Rand from Little 1 by Ann Rand, a vintage concept book about numbers

“If you look at zero you see nothing; but look through it and you will see the world.”

If the ancient Arab world had closed its gates to foreign travelers, we would have no medicine, no astronomy, and no mathematics — at least not as we know them today.

Central to humanity’s quest to grasp the nature of the universe and make sense of our own existence is zero, a concept that spurred one of the most significant paradigm shifts in human consciousness — first invented (or perhaps discovered) in pre-Arab Sumer, in Mesopotamia (modern-day Iraq), and later given symbolic form in ancient India. This twining of meaning and symbol not only shaped mathematics, which underlies our best models of reality, but became woven into the very fabric of human life, from the works of Shakespeare, who famously winked at zero in King Lear by calling it “an O without a figure,” to the invention of the bit that gave us the 1s and 0s underpinning my ability to type these words and your ability to read them on this screen.

Mathematician Robert Kaplan chronicles naught’s revolutionary journey in The Nothing That Is: A Natural History of Zero (public library). It is, in a sense, an archetypal story of scientific discovery, wherein an abstract concept derived from the observed laws of nature is named and given symbolic form. But it is also a kind of cross-cultural fairy tale that romances reason across time and space.

Kaplan writes:

If you look at zero you see nothing; but look through it and you will see the world. For zero brings into focus the great, organic sprawl of mathematics, and mathematics in turn the complex nature of things. From counting to calculating, from estimating the odds to knowing exactly when the tides in our affairs will crest, the shining tools of mathematics let us follow the tacking course everything takes through everything else – and all of their parts swing on the smallest of pivots, zero.

With these mental devices we make visible the hidden laws controlling the objects around us in their cycles and swerves. Even the mind itself is mirrored in mathematics, its endless reflections now confusing, now clarifying insight.

[…]

As we follow the meanderings of zero’s symbols and meanings we’ll see along with it the making and doing of mathematics — by humans, for humans. No god gave it to us. Its muse speaks only to those who ardently pursue her.

With an eye to the eternal question of whether mathematics is discovered or invented — a question famously debated by Kurt Gödel and the Vienna Circle — Kaplan observes:

The disquieting question of whether zero is out there or a fiction will call up the perennial puzzle of whether we invent or discover the way of things, hence the yet deeper issue of where we are in the hierarchy. Are we creatures or creators, less than — or only a little less than — the angels in our power to appraise?

Art by Shel Silverstein from The Missing Piece Meets the Big O

Like all transformative inventions, zero began with necessity — the necessity for counting without getting bemired in the inelegance of increasingly large numbers. Kaplan writes:

Zero began its career as two wedges pressed into a wet lump of clay, in the days when a superb piece of mental engineering gave us the art of counting.

The story begins some 5,000 years ago with the Sumerians, those lively people who settled in Mesopotamia (part of what is now Iraq). When you read, on one of their clay tablets, this exchange between father and son: “Where did you go?” “Nowhere.” “Then why are you late?”, you realize that 5,000 years are like an evening gone.

The Sumerians counted by 1s and 10s but also by 60s. This may seem bizarre until you recall that we do too, using 60 for minutes in an hour (and 6 × 60 = 360 for degrees in a circle). Worse, we also count by 12 when it comes to months in a year, 7 for days in a week, 24 for hours in a day and 16 for ounces in a pound or a pint. Up until 1971 the British counted their pennies in heaps of 12 to a shilling but heaps of 20 shillings to a pound.

Tug on each of these different systems and you’ll unravel a history of customs and compromises, showing what you thought was quirky to be the most natural thing in the world. In the case of the Sumerians, a 60-base (sexagesimal) system most likely sprang from their dealings with another culture whose system of weights — and hence of monetary value — differed from their own…
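
To make the place-value idea behind this passage concrete, here is a minimal, purely illustrative sketch (in Python, not from Kaplan’s book) of how a quantity decomposes into base-60 digits, and why an empty position needs a placeholder like zero:

```python
def to_base_60(n):
    """Decompose a non-negative integer into base-60 digits,
    most significant first (e.g. 3661 -> [1, 1, 1])."""
    if n == 0:
        return [0]
    digits = []
    while n > 0:
        n, remainder = divmod(n, 60)
        digits.append(remainder)
    return digits[::-1]

# One hour of seconds is a single 60-times-60 with the two lower
# positions empty; without a placeholder for those empty positions,
# [1, 0, 0] would be indistinguishable from plain [1] -- exactly
# the ambiguity a symbol for "nothing here" resolves.
print(to_base_60(3600))  # [1, 0, 0]
print(to_base_60(3661))  # [1, 1, 1]  (one hour, one minute, one second)
print(to_base_60(59))    # [59]
```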

more…

https://www.brainpickings.org/

WIKK WEB GURU


Art by Tomi Ungerer from Otto: The Autobiography of a Teddy Bear — his iconic children’s book about the Holocaust

“Society has discovered discrimination as the great social weapon by which one may kill men without any bloodshed.”

“All animals are equal, but some animals are more equal than others,” George Orwell wrote in his cautionary 1945 allegory Animal Farm, the pertinence and prescience of which have continued to ripple through every present since. A generation later, Dr. King cautioned in his piercing 1963 letter on justice and nonviolent resistance: “Injustice anywhere is a threat to justice everywhere. We are caught in an inescapable network of mutuality… Whatever affects one directly, affects all indirectly.”

In times of institutionally condoned injustice and inequality, we ought to find in ourselves the moral courage to reweave that mesh of mutuality with any tools we have, and we hardly have a tool more powerful than the refusal to keep silent about injustice. “We humanize what is going on in the world and in ourselves only by speaking of it,” Hannah Arendt (October 14, 1906–December 4, 1975) wrote the year of Dr. King’s assassination, “and in the course of speaking of it we learn to be human.” But she herself had been incubating these ideas for decades in the cells of her soul and the sinews of her identity as a German Jew who had narrowly escaped the Holocaust to become an American citizen.

Arendt addresses the complexities of that identity in a powerful essay titled “We Refugees,” penned in the 1940s and included in the Arendt anthology The Jewish Writings (public library). Although its subject is Jewishness, the essay speaks stirringly to the broader tragedy of being thrust into refugee status on account of some fragment of one’s identity — be it religion or nationality or gender or ethnicity or any other variable of exclusion and discrimination. “I speak of unpopular facts,” she writes in the piece — a sobering phrase that illuminates why such disquieting truths may give rise to “alternative facts” that offer illusory comfort.


Hannah Arendt by Fred Stein, 1944 (Photograph courtesy of the Fred Stein Archive)

Arendt, still in her thirties and already an intellectual titan, writes:

In the first place, we don’t like to be called “refugees.” We ourselves call each other “newcomers” or “immigrants.”

[…]

A refugee used to be a person driven to seek refuge because of some act committed or some political opinion held. Well, it is true we have had to seek refuge; but we committed no acts and most of us never dreamt of having any radical political opinion. With us the meaning of the term “refugee” has changed.

With an eye to the Stockholm syndrome of the psyche that leads such “refugees” to seek assimilation by the culture into which they’ve immigrated, she writes:

The less we are free to decide who we are or to live as we like, the more we try to pump up a front, to hide the facts, to play roles.

[…]

A man* who wants to lose his self discovers, indeed, the possibilities of human existence, which are infinite, as infinite as its creation. But the recovering of a new personality is as difficult — and as hopeless — as a new creation of the world.

Arendt considers the Jewish plight of identity:

If it is true that men seldom learn from history, it is also true that they may learn from personal experiences which, as in our case, are repeated again and again. But before you cast the first stone at us, remember that being a Jew does not give any legal status in this world. If we should start telling the truth that we are nothing but Jews, it would mean that we expose ourselves to the fate of human beings who, unprotected by any specific law or political convention, are nothing but human beings. I can hardly imagine an attitude more dangerous, since we actually live in a world in which human beings as such have ceased to exist for quite a while; since society has discovered discrimination as the great social weapon by which one may kill men without any bloodshed; since passports or birth certificates, and sometimes even income tax receipts, are no longer formal papers but matters of social distinction. It is true that most of us depend entirely upon social standards; we lose confidence in ourselves if society does not approve us; we are — and always were — ready to pay any price in order to be accepted by society. But it is equally true that the very few among us who have tried to get along without all these tricks and jokes of adjustment and assimilation have paid a much higher price than they could afford: they jeopardized the few chances even outlaws are given in a topsy-turvy world…

more…

https://www.brainpickings.org/

WIKK WEB GURU



Art by Jean-Pierre Weill from The Well of Being

“Cynicism about society is not the only option besides a credulous conformity to this social aeon or a credulous looking-forward to the one that is to come.”

In his timeless and increasingly timely 1972 inquiry into human nature and its capacity for destructiveness, the great humanistic philosopher and psychologist Erich Fromm proposed the notion of humanistic radicalism — a mindset and movement that “seeks to liberate man from the chains of illusions,” one which “postulates that fundamental changes are necessary, not only in our economic and political structure but also in our values, in our concept of man’s aims, and in our personal conduct.” A few years later, in his treatise on the art of living, Fromm argued that any attempt to save our civilization from fatality must begin with liberation “in the classic, humanist sense as well as in the modern, political and social sense.” He wrote: “The only realistic aim is total liberation, a goal that may well be called radical (or revolutionary) humanism.”

But this systematic movement toward humanism as a form of insurgency and an instrument of cultural, social, and political liberation began a decade earlier with the Austrian-American sociologist Peter L. Berger (b. March 17, 1929) in his 1963 classic Invitation to Sociology: A Humanistic Perspective (public library).

Berger makes the case for what he terms “sociological humanism”:

Clearly sociology by itself cannot lead to humanism, as it cannot by itself produce an adequate anthropology… But sociological understanding can be an important part of a certain sense of life that is peculiarly modern, that has its own genius of compassion and that can be the foundation of a genuine humanism. This humanism to which sociology can contribute is one that does not easily wave banners, that is suspicious of too much enthusiasm and too much certainty. It is an uneasy, uncertain, hesitant thing, aware of its own precariousness, circumspect in its moral assertions. But this does not mean that it cannot enter into passionate commitment at those points where its fundamental insights into human existence are touched upon… Before the tribunals that condemn some men to indignity because of their race or sexuality, or that condemn any man to death, this humanism becomes protest, resistance and rebellion.

But because such sociological humanism is predicated on a skeptical questioning of the status quo, its certitudes, and its hubrises, it can often be mistaken for resigned disenchantment or, worse yet, for cynicism — that sewage of the spirit. Berger weighs the crucial difference between cynicism and the healthy skepticism of sociological humanism:

Sociological understanding leads to a considerable measure of disenchantment. The disenchanted man is a poor risk for both conservative and revolutionary movements; for the former because he does not possess the requisite amount of credulity in the ideologies of the status quo, for the latter because he will be skeptical about the Utopian myths that invariably form the nurture of revolutionaries. Such unemployability in the cadres of either present or future regimes need not, however, leave the disenchanted man in the posture of alienated cynicism. It may do that, to be sure. And we find just such postures among some younger sociologists in this country, who find themselves driven to radical diagnoses of society without finding in themselves the capacity for radical political commitments. This leaves them with no place to go except to a sort of masochistic cult of debunkers who reassure each other that things could not possibly be worse.

Echoing Bertrand Russell’s assertion that construction is both more difficult and more satisfying than destruction — for cynicism is, at bottom, a destructive kind of resignation — Berger adds:

This cynical stance is in itself naive and often enough grounded more in a lack of historical perspective than anything else. Cynicism about society is not the only option besides a credulous conformity to this social aeon or a credulous looking-forward to the one that is to come.

Another option is what we regard as the most plausible one to result from sociological understanding, one that can combine compassion, limited commitment and a sense of the comic in man’s social carnival. This will lead to a posture vis-à-vis society based on a perception of the latter as essentially a comedy, in which men parade up and down with their gaudy costumes, change hats and titles, hit each other with the sticks they have or the ones they can persuade their fellow actors to believe in. Such a comic perspective does not overlook the fact that nonexistent sticks can draw real blood, but it will not from this fact fall into the fallacy of mistaking the Potemkin village for the City of God. If one views society as a comedy, one will not hesitate to cheat, especially if by cheating one can alleviate a little pain here or make life a little brighter there. One will refuse to take seriously the rules of the game, except insofar as these rules protect real human beings and foster real human values. Sociological Machiavellianism is thus the very opposite of cynical opportunism. It is the way in which freedom can realize itself in social action.

It was Machiavelli, after all, who wrote half a millennium ago that “there is nothing more difficult to take in hand, more perilous to conduct, or more uncertain in its success, than to take the lead in the introduction of a new order of things.” And today, we need nothing less than a new order.

https://www.brainpickings.org/

 

WIKK WEB GURU


Walt Whitman

“America, if eligible at all to downfall and ruin, is eligible within herself, not without… Always inform yourself; always do the best you can; always vote.”

In 1855, Walt Whitman (May 31, 1819–March 26, 1892) made his debut as a poet and self-published Leaves of Grass. Amid the disheartening initial reception of pervasive indifference pierced by a few shrieks of criticism, the young poet received an extraordinary letter of praise and encouragement from his idol — Ralph Waldo Emerson, the era’s most powerful literary tastemaker. This gesture of tremendous generosity was a creative life-straw for the dispirited artist, who soon became one of the nation’s most celebrated writers and went on to be remembered as America’s greatest poet.

In the late 1860s, working as a federal clerk and approaching his fiftieth birthday, Whitman grew increasingly concerned that America’s then-young democracy was in danger of belying the existential essentials of the human spirit. He voiced his preoccupations in a masterful and lengthy essay titled Democratic Vistas, later included in the indispensable Library of America volume Walt Whitman: Poetry and Prose (free ebook | public library).

Both Whitman’s spirited critique of American democracy and his proposed solution — which calls for an original and ennobling national body of literature as the means to cultivating the people’s mentality, character, and ideals — ring remarkably true today, perhaps even truer amid our modern disenchantment and dearth of idealism, accentuated by the spectacle of an election season.

Literature, Whitman argues, constructs the scaffolding of society’s values and “has become the only general means of morally influencing the world” — its archetypal characters shape the moral character and political ideals of a culture. Long after the political structures of the ancient world have crumbled, he reminds us, what remains of Ancient Greece and Rome and the other great civilizations is their literature. He writes:

At all times, perhaps, the central point in any nation, and that whence it is itself really sway’d the most, and whence it sways others, is its national literature, especially its archetypal poems. Above all previous lands, a great original literature is surely to become the justification and reliance, (in some respects the sole reliance,) of American democracy. Few are aware how the great literature penetrates all, gives hue to all, shapes aggregates and individuals, and, after subtle ways, with irresistible power, constructs, sustains, demolishes at will.

[…]

In the civilization of to-day it is undeniable that, over all the arts, literature dominates, serves beyond all — shapes the character of church and school — or, at any rate, is capable of doing so. Including the literature of science, its scope is indeed unparallel’d.

Illustration by Allen Crawford from Whitman Illuminated: Song of Myself

Lamenting the vacant materialism of consumer society, Whitman writes:

We had best look our times and lands searchingly in the face, like a physician diagnosing some deep disease. Never was there, perhaps, more hollowness at heart than at present, and here in the United States. Genuine belief seems to have left us. The underlying principles of the States are not honestly believ’d in, (for all this hectic glow, and these melodramatic screamings,) nor is humanity itself believ’d in.

[…]

Our New World democracy, however great a success in uplifting the masses out of their sloughs, in materialistic development, products, and in a certain highly-deceptive superficial popular intellectuality, is, so far, an almost complete failure in its social aspects, and in really grand religious, moral, literary, and esthetic results… In vain have we annex’d Texas, California, Alaska, and reach north for Canada and south for Cuba. It is as if we were somehow being endow’d with a vast and more and more thoroughly-appointed body, and then left with little or no soul.

[…]

To take expression, to incarnate, to endow a literature with grand and archetypal models — to fill with pride and love the utmost capacity, and to achieve spiritual meanings, and suggest the future — these, and these only, satisfy the soul. We must not say one word against real materials; but the wise know that they do not become real till touched by emotions, the mind.

The savior of the nation’s soul, Whitman insists, is not the politician but the artist:

Should some two or three really original American poets, (perhaps artists or lecturers,) arise, mounting the horizon like planets, stars of the first magnitude, that, from their eminence, fusing contributions, races, far localities, &c., together they would give more compaction and more moral identity, (the quality to-day most needed,) to these States, than all its Constitutions, legislative and judicial ties, and all its hitherto political, warlike, or materialistic experiences.

Art by Maurice Sendak from his 1993 masterwork We Are All in the Dumps with Jack and Guy, his darkest yet most hopeful book

In a sentiment that makes one shudder imagining what the poet would’ve made of Donald Trump’s presidential candidacy, Whitman writes:

I know nothing grander, better exercise, better digestion, more positive proof of the past, the triumphant result of faith in human kind, than a well-contested American national election.

[…]

America, it may be, is doing very well upon the whole, notwithstanding these antics of the parties and their leaders, these half-brain’d nominees, the many ignorant ballots, and many elected failures and blatherers. It is the dilettantes, and all who shirk their duty, who are not doing well… America, if eligible at all to downfall and ruin, is eligible within herself, not without…

more…

https://www.brainpickings.org/

WIKK WEB GURU


John Cheever (based on photograph by Nancy Crampton)

“A lonely man is a lonesome thing, a stone, a bone, a stick, a receptacle for Gilbey’s gin, a stooped figure sitting at the edge of a hotel bed, heaving copious sighs like the autumn wind.”

“If I could catch the feeling, I would; the feeling of the singing of the real world, as one is driven by loneliness and silence from the habitable world,” Virginia Woolf wrote in contemplating the relationship between loneliness and creativity. Half a century later, Hannah Arendt considered how tyrants use loneliness as the common ground of terror. “Loneliness is difficult to confess; difficult too to categorise, [and] it can run deep in the fabric of a person,” Olivia Laing observed in her exquisite inquiry into the texture of loneliness in art and life.

Few writers have captured the way in which loneliness can rip the fabric of the psyche asunder between the poles of the creative and the tyrannical more articulately than John Cheever (May 27, 1912–June 18, 1982). Loneliness — its anguish, its expression, its antidotes, its eventual acceptance — permeates The Journals of John Cheever (public library), one of those rare masterworks of introspection radiating enormous insight into the universal human experience.

The journals — which Cheever’s son knew his father wanted published — were as much a workbook for the Pulitzer-winning writer’s fiction as they were a workbook for his character, his struggles, and his very self. His son, Benjamin Cheever, writes in the preface:

By 1979 John Cheever had become a literary elder statesman. “I’m a brand name,” he used to say, “like corn flakes, or shredded wheat.” He seemed to enjoy this status. He must have suspected that the publication of the journals would alter it.

[…]

Few people knew of his bisexuality. Very few people knew the extent of his infidelities. And almost nobody could have anticipated the apparent desperation of his inner life, or the caustic nature of his vision. But I don’t think he cared terribly about being corn flakes. He was a writer before he was a breakfast food. He was a writer almost before he was a man.

[…]

He saw the role of the serious writer as both lofty and practical in the same instant. He used to say that literature was one of the first indications of civilization. He used to say that a fine piece of prose could not only cure a depression, it could clear up a sinus headache. Like many great healers, he meant to heal himself.

And what he sought to heal most of all, what saturated his psyche more than anything, was his loneliness. His son writes:

For much of his life he suffered from a loneliness so acute as to be practically indistinguishable from a physical illness.

[…]

He meant by his writing to escape this loneliness, to shatter the isolation of others… With the journals … he meant to show others that their thoughts were not unthinkable.

His was a bone-deep loneliness that had afflicted him since childhood, despite his seemingly idyllic upbringing in a well-to-do family nestled into a genteel New England suburb. Cheever captures this hollowing alienation in an early journal entry:

Walking back from the river I remember the galling loneliness of my adolescence, from which I do not seem to have completely escaped. It is the sense of the voyeur, the lonely, lonely boy with no role in life but to peer in at the lighted windows of other people’s contentment and vitality. It seems comical — farcical — that, having been treated so generously, I should be stuck with this image of a kid in the rain walking along the road shoulders of East Milton…

more…

https://www.brainpickings.org/

WIKK WEB GURU



image edited by Web Investigator – The teaching of Logic or Dialectics from a collection of scientific, philosophical and poetic writings, French, 13th century; Bibliothèque Sainte-Geneviève, Paris, France.

Is logical thinking a way to discover or to debate? The answers from philosophy and mathematics define human knowledge

by Catarina Dutilh Novaes, professor of philosophy and the Rosalind Franklin fellow in the Department of Theoretical Philosophy at the University of Groningen in the Netherlands. Her work focuses on the philosophy of logic and mathematics, and she is broadly interested in philosophy of mind and science. Her latest book is The Cambridge Companion to Medieval Logic (2016).

The history of logic should be of interest to anyone with aspirations to thinking that is correct, or at least reasonable. This story illustrates different approaches to intellectual enquiry and human cognition more generally. Reflecting on the history of logic forces us to reflect on what it means to be a reasonable cognitive agent, to think properly. Is it to engage in discussions with others? Is it to think for ourselves? Is it to perform calculations?

In the Critique of Pure Reason (1781), Immanuel Kant stated that no progress in logic had been made since Aristotle. He therefore concluded that the logic of his time had reached the point of completion. There was no more work to be done. Two hundred years later, after the astonishing developments in the 19th and 20th centuries, with the mathematisation of logic at the hands of thinkers such as George Boole, Gottlob Frege, Bertrand Russell, Alfred Tarski and Kurt Gödel, it’s clear that Kant was dead wrong. But he was also wrong in thinking that there had been no progress since Aristotle up to his time. According to A History of Formal Logic (1961) by the distinguished J M Bocheński, the golden periods for logic were the ancient Greek period, the medieval scholastic period, and the mathematical period of the 19th and 20th centuries. (Throughout this piece, the focus is on the logical traditions that emerged against the background of ancient Greek logic. So Indian and Chinese logic are not included, but medieval Arabic logic is.)

Why did Kant disregard the scholastic tradition? And, more generally, what explains the ‘decline’ of logic after the scholastic period? Though in the modern era logic remained an important part of the educational curriculum, there were no fundamental innovations to speak of (with the important exception of some developments in the 17th century, by Gottfried Wilhelm Leibniz). In fact, much of the scholastic achievement got lost, and the logic taught in this period (the one Kant was referring to) was for the most part rudimentary. To be sure, the decline of scholastic logic didn’t happen at once, and in some regions (eg, Spain) innovative work in the scholastic tradition continued to emerge well into the 16th century. However, generally speaking, scholastic logic became less and less prominent after the end of the Middle Ages, except for educational purposes at universities (but again, in watered-down versions).

There were many causes of the decline of scholastic logic. Perhaps the most famous was the damning criticism by Renaissance authors such as Lorenzo Valla. These thinkers deplored the lack of applicability of scholastic logic. Valla, for example, saw syllogisms – arguments composed of two premises and one conclusion, all of which are of the form ‘Some/All/No A is (not) B’, whose premises necessitate the truth of the conclusion – as an artificial type of reasoning, useless for orators on account of being too far removed from natural ways of speaking and arguing. They harshly criticised the ugly, cumbersome, artificial and overly technical Latin of scholastic authors, and defended a return to the classical Latin of Cicero and Vergil. For the most part, these critics did not belong to the university system, where scholasticism was still the norm in the 15th century. Instead, they tended to be civil servants, and were thus involved in politics, administration and civic life in general. They were much more interested in rhetoric and persuasion than in logic and demonstration.

Another reason logic gradually lost its prominence in the modern period was the abandonment of predominantly dialectical modes of intellectual enquiry. A passage by René Descartes – yes, the fellow who built a whole philosophical system while sitting on his own by the fireplace in a dressing gown – represents this shift in a particularly poignant way. Speaking of how the education of a young pupil should proceed, in Principles of Philosophy (1644) he writes:

After that, he should study logic. I do not mean the logic of the Schools, for this is strictly speaking nothing but a dialectic which teaches ways of expounding to others what one already knows or even of holding forth without judgment about things one does not know. Such logic corrupts good sense rather than increasing it. I mean instead the kind of logic which teaches us to direct our reason with a view to discovering the truths of which we are ignorant.

Descartes hits the nail on the head when he claims that the logic of the Schools (scholastic logic) is not really a logic of discovery. Its chief purpose is justification and exposition, which makes sense particularly against the background of dialectical practices, where interlocutors explain and debate what they themselves already know. Indeed, for much of the history of logic, both in ancient Greece and in the Latin medieval tradition, ‘dialectic’ and ‘logic’ were taken to be synonymous.

Up to Descartes’s time, the chief application of logical theories was to teach students to perform well in debates and disputations, and to theorise on the logical properties of what follows from what, insofar as this is an essential component of such argumentative practices. It’s true that not everyone conceived of logic in this way: Thomas Aquinas, for example, held that logic is about ‘second intentions’, roughly what we call second-order concepts, or concepts of concepts. But as late as in the 16th century, the Spanish theologian Domingo de Soto could write with confidence that ‘dialectic is the art or science of disputing’…

more…

https://aeon.co/essays/the-rise-and-fall-and-rise-of-logic

WIKK WEB GURU


Illustration by Judith Clay from Thea’s Tree

“Sleep acts … more like an emotion than a bodily function. As with desire, it resists pursuit. Sleep must come find you.”

We spend — or are biologically supposed to spend — a third of our lives in sleep, yet it remains a state we neither fully understand nor can bend to our will. A central cog in the machinery of our complex internal clocks, it regulates our negative emotions and affects our every waking moment. “Something nameless / Hums us into sleep,” the poet Mark Strand wrote in his sublime ode to dreams, “Withdraws, and leaves us in / A place that seems / Always vaguely familiar.” But what if the hum never comes, if the place in which night ought to leave us is a terra incognita at best unfamiliar, at worst entirely unreachable?

That’s what writer and photographer Bill Hayes explores in his magnificent 2001 book Sleep Demons: An Insomniac’s Memoir (public library) — part reflection on his own lifelong turmoil in the nocturne, part sweeping inquiry into the sometimes converging, sometimes colliding worlds of sleep research, psychology, medicine, mythology, aging, and mental health. (It is hardly any wonder, though perhaps a most delightful miracle, that Hayes’s writing — philosophical, rigorously researched, immensely poetic — became a channel of love for the late, great Oliver Sacks; it was through writing that he met Hayes, who became the Billy in his memoir and the love of his life.)


Bill Hayes (Photograph: Katy Raddatz)

Hayes writes:

I grew up in a family where the question “How’d you sleep?” was a topic of genuine reflection at the breakfast table. My five sisters and I each rated the last night’s particular qualities — when we fell asleep, how often we woke, what we dreamed, if we dreamed. My father’s response influenced the family’s mood for the day: if “lousy,” the rest of us felt lousy, too. If there’s such a thing as an insomnia gene, Dad passed it on to me, along with green eyes and Irish melancholy.

I lay awake as a young boy, my mind racing like the spell-check function on a computer, scanning all data, lighting on images, moments, fragments of conversation, impossible to turn off. As a sleeping aid, I would try to recall my entire life — a straight narrative from first to last incident — thereby imposing order on the inventory of desire and memory.

For two years of Hayes’s childhood, his particular flavor of nocturnal torment was sleepwalking — all unconscious desire, no conscious memory. He would crawl out of bed, wander into the family living room as if looking for something, but not respond to his mother’s voice. He paints a poetic, if sorrowful, portrait of the sleepless mind trapped in a restless body:

If the insomniac is a shadow of his daylight self, existing nightlong on nothing but the fumes of consciousness, then the somnambulist is like an animal whose back leg drags a steel trap — the mind is fleeing and the body is inextricably attached.

Where did I want to go? Out of that house, I imagine. Away from the person I saw myself becoming. Toward a dreamed-up boy, with a new story, a different version of myself.


Illustration by Tom Seidmann-Freud from a philosophical 1922 children’s book about dreaming

In this lacuna between body and mind, Hayes locates the most elusive essence of sleep:

Sleeping pills can force the body into unconsciousness, it’s true. I’ve slept many times on those delicious, light-blue pillows. But the body is never really tricked. The difference between drugged and natural sleep eventually reveals itself, like the difference between an affair and true romance. It shows up in your eyes. Sleep acts, in this regard, more like an emotion than a bodily function. As with desire, it resists pursuit. Sleep must come find you.

And the compass by which sleep finds us appears to be magnetized by our biology and the fundamental nature of reality itself. With an eye to the legacy of pioneering sleep researcher Nathaniel Kleitman, who kept himself awake in a cave for fifty days in the 1920s at the outset of a career that would revolutionize our understanding of the non-wakeful consciousness, Hayes argues that sleep unlatches its own singular cosmogony:

Our entire lives are shaped by circadian rhythms, gravitational forces, and seasonal cycles (day and night, ebb and flow, growth and decay), all of which, in my view, may be echoed in grander schemes throughout the cosmos. None of which can truly be resisted, only tested and studied, in Kleitman’s cave as in Plato’s. Daylight to darkness, the body mimics the behavior of the earth itself. Perhaps this is why vexing sleep questions (Why do humans dream? Why do we wake up?) sound like great metaphysical questions about the meaning of life; excerpts from a timeless dialogue on truth and illusion, awareness and unconsciousness.

Perhaps it was the inevitable metaphysical nature of these questions that led Nietzsche to believe that dreams are an evolutionary time machine for the human mind, Dostoyevsky to discover the meaning of life in a dream, Margaret Mead to find in one the perfect existential metaphor, and Neil Gaiman to dream his way to a philosophical parable of identity…

more…

https://www.brainpickings.org/

WIKK WEB GURU


“All the goodness and the heroisms will rise up again, then be cut down again and rise up. It isn’t that the evil thing wins — it never will — but that it doesn’t die.”

There are events in our personal lives and our collective history that seem categorically irredeemable, moments in which the grounds for gratefulness and hope have sunk so far below the sea level of sorrow that we have ceased to believe they exist. But we have within us the consecrating capacity to rise above those moments and behold the bigger picture in all of its complexity, complementarity, and temporal sweep, and to find in what we see not illusory consolation but the truest comfort there is: that of perspective.

John Steinbeck (February 27, 1902–December 20, 1968) embodies this difficult, transcendent willingness in an extraordinary letter to his friend Pascal Covici — who would soon become his literary fairy godfather of sorts — penned on the first day of 1941, as World War II was raging and engulfing humanity in unbearable darkness. Found in Steinbeck: A Life in Letters (public library) — which also gave us the beloved writer on the difficult art of the friend breakup, his comical account of a dog-induced “computer crash” decades before computers, and his timeless advice on falling in love — the letter stands as a timeless testament to the consolatory power of rehabilitating nuance, making room for fertile contradiction, and taking a wider perspective.

Steinbeck writes on January 1, 1941:

Speaking of the happy new year, I wonder if any year ever had less chance of being happy. It’s as though the whole race were indulging in a kind of species introversion — as though we looked inward on our neuroses. And the thing we see isn’t very pretty… So we go into this happy new year, knowing that our species has learned nothing, can, as a race, learn nothing — that the experience of ten thousand years has made no impression on the instincts of the million years that preceded.

But Steinbeck, who devoted his life to defending the disenfranchised and celebrating the highest potentiality of the human spirit, refuses to succumb to what Rebecca Solnit has so aptly termed the “despair, defeatism, cynicism[,] amnesia and assumptions” to which we reflexively resort in maladaptive self-defense against overwhelming evil. Instead, more than two millennia after Plato’s brilliant charioteer metaphor for good and evil, Steinbeck quickly adds a perceptive note on the indelible duality of human nature and the cyclical character of the civilizational continuity we call history:

Not that I have lost any hope. All the goodness and the heroisms will rise up again, then be cut down again and rise up. It isn’t that the evil thing wins — it never will — but that it doesn’t die. I don’t know why we should expect it to. It seems fairly obvious that two sides of a mirror are required before one has a mirror, that two forces are necessary in man before he is man. I asked [the influential microbiologist] Paul de Kruif once if he would like to cure all disease and he said yes. Then I suggested that the man he loved and wanted to cure was a product of all his filth and disease and meanness, his hunger and cruelty. Cure those and you would have not man but an entirely new species you wouldn’t recognize and probably wouldn’t like.

Steinbeck’s point is subtle enough to be mistaken for moral relativism, but is in fact quite the opposite — he suggests that our human foibles don’t negate our goodness or our desire for betterment but, rather, provide both the fuel for it and the yardstick by which we measure our moral progress.

He wrests out of this inevitable interplay of order and chaos the mortal flaw of the Nazi regime and the grounds for hope toward surviving the atrocity of WWII, which, lest we forget, much of the world feared was unsurvivable in toto:

It is interesting to watch the German efficiency, which, from the logic of the machine is efficient but which (I suspect) from the mechanics of the human species is suicidal. Certainly man thrives best (or has at least) in a state of semi-anarchy. Then he has been strong, inventive, reliant, moving. But cage him with rules, feed him and make him healthy and I think he will die as surely as a caged wolf dies. I should not be surprised to see a cared for, thought for, planned for nation disintegrate, while a ragged, hungry, lustful nation survived. Surely no great all-encompassing plan has ever succeeded.

Mercifully, Steinbeck was right — the Nazis’ grim world domination plan ultimately failed, humanity as a whole survived these unforgivable crimes against it (though we continually fail to sufficiently reflect upon them), and we commenced another revolution around the cycle of construction and destruction, creating great art and writing great literature and making great scientific discoveries, all the while carrying our parallel capacities for good and evil along for the ride, as we are bound to always do.

So when we witness evil punctuate the line of our moral and humanitarian progress, as we periodically do, may we remember, even within the most difficult moments of that periodicity, Steinbeck’s sobering perspective and lucid faith in the human spirit.

https://www.brainpickings.org/

WIKK WEB GURU

