Polish Poet and Nobel Laureate Wisława Szymborska on How Our Certitudes Keep Us Small and the Generative Power of Not-Knowing

Art by Salvador Dalí from a rare edition of Alice’s Adventures in Wonderland

“Attempt what is not certain. Certainty may or may not come later. It may then be a valuable delusion,” the great painter Richard Diebenkorn counseled in his ten rules for beginning creative projects. “One doesn’t arrive — in words or in art — by necessarily knowing where one is going,” the artist Ann Hamilton wrote a generation later in her magnificent meditation on the generative power of not-knowing. “In every work of art something appears that does not previously exist, and so, by default, you work from what you know to what you don’t know.”

What is true of art is even truer of life, for a human life is the greatest work of art there is. (In my own life, looking back on my ten most important learnings from the first ten years of Brain Pickings, I placed the practice of the small, mighty phrase “I don’t know” at the very top.) But to live with the untrammeled open-endedness of such fertile not-knowing is no easy task in a world where certitudes are hoarded as the bargaining chips for status and achievement — a world bedeviled, as Rebecca Solnit memorably put it, by “a desire to make certain what is uncertain, to know what is unknowable, to turn the flight across the sky into the roast upon the plate.”

That difficult feat of insurgency is what the great Polish poet Wisława Szymborska (July 2, 1923–February 1, 2012) explored in 1996 when she was awarded the Nobel Prize in Literature for capturing the transcendent fragility of the human experience in masterpieces like “Life-While-You-Wait” and “Possibilities.”

In her acceptance speech, later included in Nobel Lectures: From the Literature Laureates, 1986 to 2006 (public library) — which also gave us the spectacular speech on the power of language Toni Morrison delivered after becoming the first African American woman to win the Nobel Prize — Szymborska considers why artists are so reluctant to answer questions about what inspiration is and where it comes from:

It’s not that they’ve never known the blessing of this inner impulse. It’s just not easy to explain something to someone else that you don’t understand yourself.

Noting that she, too, tends to be rattled by the question, she offers her wieldiest answer:

Inspiration is not the exclusive privilege of poets or artists generally. There is, has been, and will always be a certain group of people whom inspiration visits. It’s made up of all those who’ve consciously chosen their calling and do their job with love and imagination. It may include doctors, teachers, gardeners — and I could list a hundred more professions. Their work becomes one continuous adventure as long as they manage to keep discovering new challenges in it. Difficulties and setbacks never quell their curiosity. A swarm of new questions emerges from every problem they solve. Whatever inspiration is, it’s born from a continuous “I don’t know.”

In a sentiment of chilling prescience today, as we witness tyrants drunk on certainty drain the world of its essential inspiration, Szymborska considers the destructive counterpoint to this generative not-knowing:

All sorts of torturers, dictators, fanatics, and demagogues struggling for power by way of a few loudly shouted slogans also enjoy their jobs, and they too perform their duties with inventive fervor. Well, yes, but they “know.” They know, and whatever they know is enough for them once and for all. They don’t want to find out about anything else, since that might diminish their arguments’ force. And any knowledge that doesn’t lead to new questions quickly dies out: it fails to maintain the temperature required for sustaining life. In the most extreme cases, cases well known from ancient and modern history, it even poses a lethal threat to society.

This is why I value that little phrase “I don’t know” so highly. It’s small, but it flies on mighty wings. It expands our lives to include the spaces within us as well as those outer expanses in which our tiny Earth hangs suspended. If Isaac Newton had never said to himself “I don’t know,” the apples in his little orchard might have dropped to the ground like hailstones and at best he would have stooped to pick them up and gobble them with gusto. Had my compatriot Marie Sklodowska-Curie never said to herself “I don’t know”, she probably would have wound up teaching chemistry at some private high school for young ladies from good families, and would have ended her days performing this otherwise perfectly respectable job. But she kept on saying “I don’t know,” and these words led her, not just once but twice, to Stockholm, where restless, questing spirits are occasionally rewarded with the Nobel Prize…

more…

https://www.brainpickings.org/

WIKK WEB GURU

When nations apologise

‘What people do when words fail them’: German Chancellor Willy Brandt kneels before the memorial to the dead of the Warsaw Uprising, 7 December 1970. Photo by Ullstein Bild/Sven Simon (image edited by Web Investigator)

National apologies are a big deal: they acknowledge the past to help move everyone forward. No wonder they’re so hard

by Edwin Battistella, who teaches linguistics and writing at Southern Oregon University. He has a PhD in linguistics from the City University of New York. His most recent book is Sorry About That: The Language of Public Apology (2014).

In May 2016, when Barack Obama visited Hiroshima, some speculated that the president of the United States might offer an apology, on behalf of his country, for the bombing of that city at the close of the Second World War. Instead, in his joint press conference with Japan’s prime minister Shinzo Abe, Obama said that his visit would ‘honour all those who were lost in the Second World War and reaffirm our shared vision of a world without nuclear weapons’. The White House had announced before the visit that it would neither revisit the decision to drop the bomb nor apologise for it. The Obama administration judged that this wartime military action required no apology. 

When do nations apologise? Nearly 30 years earlier, in 1988, the US Congress passed the Civil Liberties Act authorising apologies and redress payments to the Japanese Americans interned during the 1940s. Signing the bill, the US president Ronald Reagan said that ‘here we admit a wrong; here we reaffirm our commitment as a nation to equal justice under the law’. Reagan’s successors, George H W Bush and Bill Clinton, later sent individual apology letters to former internees as their claims were processed.

The apology for internment was a long time coming. In the wave of xenophobia following the 1941 attack on Pearl Harbor, the military removed nearly 120,000 Japanese Americans and Japanese nationals to what were euphemistically called War Relocation Authority camps. Those interned encountered hardship, suffering and loss. In 1944, in Ex parte Endo, the Supreme Court ruled that the government could not continue to detain concededly loyal citizens, a decision that effectively ended internment (its companion case, Korematsu v United States, upheld the exclusion orders themselves). Some Japanese Americans had been imprisoned for as long as three years. They were given a train ticket and $25. But no apology.

Then, 43 years after internment ended, the US Congress apologised. The path to apology began in 1970, with a call to action from the Japanese American Citizens League. A decade later, the US president Jimmy Carter appointed a Commission on Wartime Relocation and Internment of Civilians to recommend a course of action. Getting the apology was controversial, involving issues of cost and accountability, political consensus-building, and philosophical debate about whether later governments were responsible for the moral failures of their predecessors. But, in the eyes of many former internees, the effort was worth it. For them, it was a restoration of honour. For the US government, the apology was an admission of having wronged its citizens and a recommitment to justice.

Sometimes, a whole nation must come to grips with its collective past if it is to move ahead. This was the case for the 50-year process that Germany underwent following Hitler’s regime. With the Nuremberg trials and with reforms of the education system aimed at denazification, the Allied victors focused on accountability and re-education. West German politicians knew that they had to address wartime atrocities if Germany was to rejoin the community of nations. As a step toward this, in 1952 the German chancellor Konrad Adenauer’s government made a 3.5 billion Deutsche mark payment to the new state of Israel. Internally, however, Adenauer was advocating a national policy of forgetting rather than apology and remembrance. In his first address as chancellor in 1949, Adenauer had told the West German parliament that his government was determined to put the past behind. He was concerned with a resurgence of nationalism and sought to de-emphasise wartime guilt in favour of economic revitalisation.

In the post-war years, the German Right and Left would debate whether exploring the past would mire the nation in perpetual guilt, or whether a greater recognition of the past was a necessary step to national dignity. Contrition became the norm, so much so that when the later chancellor Willy Brandt went to Poland in 1970, he knelt before the monument commemorating the Warsaw uprising of 1943. Brandt’s action was widely viewed as a non-verbal apology, and he later explained that he ‘did what people do when words fail them’. In 1995, Helmut Kohl, the chancellor of a unified Germany, found the words. On the 50th anniversary of the liberation of Auschwitz, Kohl was unambiguously apologetic. Auschwitz, he explained, was ‘the darkest and most horrible chapter of German history … one of our priority tasks is to pass on this knowledge to future generations so that the horrible experiences of the past will never be repeated.’ The process of national apology for Germany was one of overcoming amnesia and acknowledging the past. But, importantly, the orientation was on the future. Germany saw facing and apologising for its past as a way to improve the lives of those to come…

more…

https://aeon.co/essays/a-national-apology-has-the-power-to-change-the-future

WIKK WEB GURU

What Really Turned the Sahara Desert From a Green Oasis Into a Wasteland?


One of the world’s most iconic deserts was once lush and green. What happened? (Alamy)

10,000 years ago, this iconic desert was unrecognizable. A new hypothesis suggests that humans may have tipped the balance

SMITHSONIAN.COM

When most people imagine an archetypal desert landscape—with its relentless sun, rippling sand and hidden oases—they often picture the Sahara. But 11,000 years ago, what we know today as the world’s largest hot desert would’ve been unrecognizable. The now-desiccated northern strip of Africa was once green and alive, pocked with lakes, rivers, grasslands and even forests. So where did all that water go?

Archaeologist David Wright has an idea: Maybe humans and their goats tipped the balance, kick-starting this dramatic ecological transformation. In a new study in the journal Frontiers in Earth Science, Wright set out to argue that humans could be the answer to a question that has plagued archaeologists and paleoecologists for years.

The Sahara has long been subject to periodic bouts of humidity and aridity. These fluctuations are caused by slight wobbles in the tilt of the Earth’s orbital axis, which in turn changes the angle at which solar radiation penetrates the atmosphere. At repeated intervals throughout Earth’s history, there’s been more energy pouring in from the sun during the West African monsoon season, and during those times—known as African Humid Periods—much more rain comes down over north Africa.

With more rain, the region gets more greenery and rivers and lakes. All this has been known for decades. But between 8,000 and 4,500 years ago, something strange happened: The transition from humid to dry happened far more rapidly in some areas than could be explained by the orbital precession alone, resulting in the Sahara Desert as we know it today. “Scientists usually call it ‘poor parameterization’ of the data,” Wright said by email. “Which is to say that we have no idea what we’re missing here—but something’s wrong.”

As Wright pored over the archaeological and environmental data (mostly sediment cores and pollen records, all dated to the same time period), he noticed what seemed like a pattern. Wherever the archaeological record showed the presence of “pastoralists”—humans with their domesticated animals—there was a corresponding change in the types and variety of plants. It was as if, every time humans and their goats and cattle hopscotched across the grasslands, they had turned everything to scrub and desert in their wake.

Wright thinks this is exactly what happened. “By overgrazing the grasses, they were reducing the amount of atmospheric moisture—plants give off moisture, which produces clouds—and enhancing albedo,” Wright said. He suggests this may have triggered the end of the humid period more abruptly than can be explained by the orbital changes. These nomadic humans also may have used fire as a land management tool, which would have exacerbated the speed at which the desert took hold.

It’s important to note that the green Sahara always would’ve turned back into a desert even without humans doing anything—that’s just how Earth’s orbit works, says geologist Jessica Tierney, an associate professor of geoscience at the University of Arizona. Moreover, according to Tierney, we don’t necessarily need humans to explain the abruptness of the transition from green to desert.

Instead, the culprits might be regular old vegetation feedbacks and changes in the amount of dust. “At first you have this slow change in the Earth’s orbit,” Tierney explains. “As that’s happening, the West African monsoon is going to get a little bit weaker. Slowly you’ll degrade the landscape, switching from desert to vegetation. And then at some point you pass the tipping point where change accelerates.”

Tierney adds that it’s hard to know what triggered the cascade in the system, because everything is so closely intertwined. During the last humid period, the Sahara was filled with hunter-gatherers. As the orbit slowly changed and less rain fell, humans would have needed to domesticate animals, like cattle and goats, for sustenance. “It could be the climate was pushing people to herd cattle, or the overgrazing practices accelerated denudation [of foliage],” Tierney says.

Which came first? It’s hard to say with the evidence we have now. “The question is: How do we test this hypothesis?” she says. “How do we isolate the climatically driven changes from the role of humans? It’s a bit of a chicken and an egg problem.” Wright, too, cautions that right now we have evidence only for correlation, not causation.

But Tierney is also intrigued by Wright’s research, and agrees with him that much more research needs to be done to answer these questions.

“We need to drill down into the dried-up lake beds that are scattered around the Sahara and look at the pollen and seed data and then match that to the archaeological datasets,” Wright said. “With enough correlations, we may be able to more definitively develop a theory of why the pace of climate change at the end of the AHP doesn’t match orbital timescales and is irregular across northern Africa.”

Tierney suggests researchers could use mathematical models that compare the impact hunter-gatherers would have on the environment versus that of pastoralists herding animals. For such models it would be necessary to have some idea of how many people lived in the Sahara at the time, but Tierney is sure there were more people in the region than there are today, excepting coastal urban areas…

more…

Read more: http://www.smithsonianmag.com/science-nature/what-really-turned-sahara-desert-green-oasis-wasteland-180962668/#3aY85oLSfAGlomkP.99

WIKK WEB GURU

The Real Story of David Rockefeller That the Media Isn’t Telling

Mint Press News
Waking Times

No one person encapsulates the enduring legacy of the “robber barons” of the Industrial Age quite like David Rockefeller. Rockefeller, who died today at the age of 101, was the last surviving grandson of John D. Rockefeller, the oil tycoon who became America’s first billionaire and the patriarch of what would become one of the most powerful and wealthiest families in American history. David Rockefeller, an undeniable product of American nobility, lived his entire life in the upper echelons of U.S. society, becoming symbolic of the elite who often direct public policy to a much greater extent than many realize, albeit often from the shadows.

Rockefeller made it clear that he preferred to operate out of public view despite his great influence in American – and international – politics. Due to his birthright, Rockefeller served as an advisor to every president since Eisenhower, but when offered powerful positions such as Federal Reserve chairman and Secretary of the Treasury, he declined, preferring “a private role.”

As the numerous obituaries bemoaning the loss of the last of John D. Rockefeller’s grandsons attest, he was largely successful in hiding his most significant wrongdoings from public view, remembered instead as a generous philanthropist and influential banker.

But as is often the case, Rockefeller’s true legacy is much more mired in controversy than major publications seem willing to admit. In addition to having the ear of every U.S. president for the better part of the last 70 or so years, Rockefeller – once again operating “behind the scenes” – was instrumental in shaping the more cringe-worthy aspects of U.S. policy during that time, as well as being a major force in establishing banking policies that led to debt crises in the developing world.

Rockefeller – as the head of Chase Manhattan Bank from 1969 to 1981 – worked with government and multinational corporations throughout the world to create a “global order” unequivocally dominated by the 1 percent, of which his family was a part. As the New York Times noted back in the 1970s, Rockefeller became embroiled in controversy when his constant trips overseas caused the bank to become less profitable, as he prioritized the bank’s influence on foreign politics over its actual business dealings.

During his time as Chase CEO, Rockefeller helped lay the foundation for repressive, racist and fascist regimes around the world, as well as the architecture for global inequality. In addition, Rockefeller helped to bring the debt crisis of the 1980s into existence, in part by direct action through Chase Bank and also indirectly through his former employee-turned-Federal Reserve chairman Paul Volcker. Two years before the debt crisis erupted, Rockefeller, Volcker and other top bankers met at the 1980 International Monetary Conference to argue for the establishment of a “safety net” for major banks – like Chase – that were embroiled in bad loans given largely to countries in the developing world.

David Rockefeller, center, chairman of the Chase Manhattan Bank’s International Advisory Committee and the bank’s former chairman of the board and chief executive officer, receives the 1983 International Leadership Award from the U.S. Council for International Business, presented by Dr. Henry A. Kissinger, former Secretary of State, left, and Ralph A. Pfeiffer, Jr., U.S. Council Chairman, at New York’s Pierre Hotel on Thursday, Dec. 9, 1983. The award recognizes outstanding contributions to world trade and investment. (AP Photo/Ron Frehm)

After the crisis brought financial ruin to Latin America and other developing areas throughout the world, Rockefeller – along with other bankers – created austerity programs to “solve” the debt crisis during subsequent IMC meetings, provoking inequality that still persists to this day. However, thanks to the “safety net” conveniently established years prior, Chase avoided the economic consequences for its criminal actions.

In addition, Rockefeller supported the bloody and ruthless dictatorships of the Shah of Iran and Augusto Pinochet of Chile while also supporting Israeli apartheid. Rockefeller then went on to found the influential Trilateral Commission while also serving as a major force on the Council on Foreign Relations that he, along with his close friend Henry Kissinger, would come to dominate.

Both of these organizations have come under fire for using their powerful influence to bring about a “one-world government” ruled by a powerful, ultra-wealthy elite – an accusation that David Rockefeller confirmed as true in his autobiography. Far from the generous philanthropist he is made out to be, David Rockefeller deserves to be remembered for his true legacy – one of elitism, fascism and economic enslavement.

http://www.wakingtimes.com/2017/03/23/real-story-david-rockefeller-media-isnt-telling/

WIKK WEB GURU

Hamburgers Are Bigger Than Ever, but the Meat Has Always Been Questionable

 

by Quinn Myers

From ‘pink slime’ to bug burgers, a look at the quintessential American meal

For most of the world, the symbol most associated with America isn’t the bald eagle, George Washington or even the stars and stripes—it’s the hamburger and fries. But how much has this simple meal — a ground-beef sandwich with fried potatoes — changed since its glory days of the 1950s? Let’s find out.

The Ingredients

1950s: According to Andrew Smith, author of The Encyclopedia of Junk Food and Fast Food, the quality of hamburger meat was so bad in the early 20th century that by the 1950s, customers needed reassuring that what they were getting was actual meat. “Heading into the ‘50s, White Castle had beef slabs delivered to each outlet a couple times a day,” Smith says. “It was ground up in front of any customers in the store to assure everyone that their beef did come from a cow, as opposed to a variety of meat and other products from other slaughtered animals.”

White Castle employee pointing out a White Castle inspected meat sign

So during the burger’s heyday, most people could feel confident that they were, in fact, getting 100 percent ground beef, while the fries were exactly as advertised: potatoes, sliced in the restaurant and fried in animal fat.

Today: In 2008, a study by Brigid Prayson of the Cleveland Clinic Foundation tried to find out whether it was even possible for America to produce as much beef as was apparently being consumed — an interesting question, considering that there are fewer cattle being raised now than in the 1970s, and yet we’re eating more beef than we were then. The answers weren’t encouraging, and a test of a variety of fast-food burgers found that the amount of real meat in burgers ranged from just 2 to 14 percent. The rest was made up of what has become known as “pink slime,” or in the words of the study, “a mash of connective tissue, blood vessels, peripheral nerve, plant material, cartilage and bone.”

This nauseating goop was then doused in ammonium hydroxide, an antimicrobial agent once classified by the Department of Agriculture as “generally recognized as safe,” though the practice is banned in the European Union. McDonald’s and other chains have since claimed that they no longer use the stuff, but after a brief public backlash, it has crept back into grocery stores, with a 2014 study claiming that up to 70 percent of the ground beef sold in stores contains the dreaded pink slime.

McDonald’s pink slime

The meat isn’t the only thing chock-full of chemicals now, either. A quick look at the fry ingredients listed on McDonald’s website reveals not just potatoes but rather a dozen different things, including chemicals with such appetizing names as sodium acid pyrophosphate (that’s the one that maintains their friendly yellow color). Essentially, most of the water in the fries has been replaced with fat, and a bunch of chemicals are added to make them taste like they were fried in animal fat, rather than the mix of corn and soybean oil they’re actually fried in.

The Size

1950s: “The combo of french fries and burgers as a meal became solidified during World War II, since meat was rationed and you needed to bolster what small amount of it you had with something else,” says Smith. How small exactly were the burgers? In 1950, the average burger weighed just 3.9 ounces—not so much bigger than a modern-day White Castle slider, at 2.2 ounces, according to the Centers for Disease Control. For their part, an average order of fries weighed roughly 2.4 ounces.

Today: As rationing came to an end, burgers began to fatten up. “Chains like Burger King came along offering bigger burgers with more meat, and the increased competition led to an arms race of the sizes and the styles of burgers,” Smith says. As a result, the average fast-food burger has roughly tripled in size since the 1950s and now stands at a gut-busting 12 ounces. Fries, meanwhile, have nearly tripled in size, weighing in at 6.7 ounces (again according to the CDC)…

more…

https://melmagazine.com/hamburgers-are-bigger-than-ever-but-the-meat-has-always-been-questionable-ba04dc37f0e7#.mugo2w70w

WIKK WEB GURU

In defence of hierarchy


Daoist power: Herding Horses by Han Gan, Tang dynasty, China. Photo courtesy the National Palace Museum, Taipei/Wikipedia

As a society we have forgotten how to talk about the benefits of hierarchy, expertise and excellence. It’s time we remembered

by Stephen C Angle, professor of philosophy at Wesleyan University. He has written and edited many books on Chinese philosophy, including Sagehood: The Contemporary Significance of Confucian Philosophy (2012). He lives in Middletown, CT.

The modern West has placed a high premium on the value of equality. Equal rights are enshrined in law while old hierarchies of nobility and social class have been challenged, if not completely dismantled. Few would doubt that global society is all the better for these changes. But hierarchies have not disappeared. Society is still stratified according to wealth and status in myriad ways.

On the other hand, the idea of a purely egalitarian world in which there are no hierarchies at all would appear to be both unrealistic and unattractive. Nobody, on reflection, would want to eliminate all hierarchies, for we all benefit from the recognition that some people are more qualified than others to perform certain roles in society. We prefer to be treated by senior surgeons not medical students, get financial advice from professionals not interns. Good and permissible hierarchies are everywhere around us.

Yet hierarchy is an unfashionable thing to defend or to praise. British government ministers denounce experts as out of tune with popular feeling; both Donald Trump and Bernie Sanders built platforms on attacking Washington elites; economists are blamed for not predicting the 2008 crash; and even the best-established practices of medical experts, such as childhood vaccination, are treated with resistance and disbelief. We live in a time when no distinction is drawn between justified and useful hierarchies on the one hand, and self-interested, exploitative elites on the other.

As a group, we believe that clearer thinking about hierarchy and equality is important in business, politics and public life. We should lift the taboo on discussing what makes for a good hierarchy. To the extent that hierarchies are inevitable, it is important to create good ones and avoid those that are pernicious. It is also important to identify the ways in which useful and good hierarchies support and foster good forms of equality. When we talk about hierarchies here, we mean those distinctions and rankings that bring with them clear power differentials.

We are a diverse group of scholars and thinkers who take substantively different views on many political and ethical issues. Recently, we engaged in an intensive discussion of these issues under the aegis of the Berggruen Philosophy and Culture Center in Los Angeles, and we found ourselves agreeing on this: much can be said in defence of some kinds of hierarchy. The ideas we present here are at the very least worthy of more widespread and serious attention. All of this takes on a new urgency given the turn in world politics towards a populism that often attacks establishment hierarchies while paradoxically giving authoritarian power to individuals claiming to speak for ‘the people’.

What then, should be said in praise of hierarchy? 

First, bureaucratic hierarchies can serve democracy. Bureaucracy is even less popular these days than hierarchy. Yet bureaucratic hierarchies can instantiate crucial democratic values, such as the rule of law and equal treatment.

There are at least three ways in which usually hierarchical constitutional institutions can enhance democracy: by protecting minority rights, and thereby ensuring that the basic interests of minorities are not lightly discounted by self-interested or prejudiced majorities; by curbing the power of majority or minority factions to pass legislation favouring themselves at the expense of the public good; and by increasing the epistemic resources that are brought to bear on decision-making, making law and policy more reflective of high-quality deliberation. Hence democracies can embrace hierarchy because hierarchy can enhance democracy itself…

more…

https://aeon.co/essays/hierarchies-have-a-place-even-in-societies-built-on-equality

WIKK WEB GURU

The last hollow laugh


Francis Fukuyama photographed in Paris. Photo by Stephane Grangier/Corbis/Getty

Since Francis Fukuyama proclaimed ‘The End of History’ 25 years ago, he has been much maligned. His work now seems prophetic

Paul Sagar is a junior research fellow in politics and international relations at King’s College, University of Cambridge.

Edited by Nigel Warburton

This year marks the 25th anniversary of Francis Fukuyama’s The End of History and the Last Man (1992). Rarely read but often denigrated, it might be the most maligned, unfairly dismissed and misunderstood book of the post-war era. Which is unfortunate for at least one reason: Fukuyama might have done a better job of predicting the political turmoil that engulfed Western democracies in 2016 – from Brexit, to Trump, to the Italian Referendum – than anybody else.

This should sound surprising. After all, Fukuyama’s name has for more than two decades been synonymous with a fin-de-siècle Western triumphalism. According to the conventional wisdom, he is supposed to have claimed that the collapse of the communist regimes in eastern Europe and the United States’ victory in the Cold War meant that liberal capitalist democracy was unambiguously the best form of human political organisation possible. To his popular critics – sometimes on the Right, but most especially on the Left – The End of History was thus a pseudo-intellectual justification for a hyper-liberal capitalist ideology, whose high-water mark was the disastrous administration of George W Bush. Fukuyama’s tagline – ‘the end of history’ – was seized upon by critics as proof that he was attempting to legitimate neoconservative hubris, cloaking a pernicious ideology with the façade of inevitability.

But (the conventional wisdom continues) hubris was soon followed by nemesis: the 9/11 attacks and the subsequent disaster of the Iraq War showed how wrong any triumphalist vision of liberal-capitalist world order was. Fukuyama took particularly heavy flak in this regard. Francis Wheen, in How Mumbo Jumbo Conquered the World (2004), was typical when he accused Fukuyama of being a shill for neo-con interests. In reply to the question ‘How do you get ahead by boldly making one of the worst predictions in social science?’ Wheen sniped: ‘If you are going to be wrong, be wrong as ostentatiously and extravagantly as possible.’ He claimed that Fukuyama ‘understood what was required to titillate the jaded palate of the chattering classes’ – and played on this for personal gain.

Yet all of this is incorrect. For a start, it is a gross misreading of The End of History to see it as any kind of triumphalism, let alone one subsequently disproved by the rise of radical Islam, or the stalling of capitalist democracies post-2008. It was also deeply unfair to Fukuyama himself. Although a public intellectual rather than a traditional academic, his infamous book displayed an erudition and depth of learning, combined with ambition and panache, that few tenured academics come close to. He might have been wrong, but he was never the dummy his critics made out.

To see this better, it’s worth elucidating the actual argument of The End of History. For a start, Fukuyama never suggested that events would somehow stop happening. Just like any other sane person, he believed that history (with a small h), the continuation of ordinary causal events, would go on as it always had. Elections would be held, sports matches would be won and lost, wars would break out, and so on. The interesting question for Fukuyama was about History (with a big H), a term that, for him, picked out a set of concerns about the deep structure of human social existence.

With regards to History, Fukuyama advanced a complex thesis about the way opposing forces play themselves out in social development. Here, he drew inspiration from the work of the German philosopher Georg Hegel, via the reinterpretations of the Russian émigré Alexandre Kojève. Hegel (and Kojève) proposed that History is a process by which contradictions in the ordering of societies work themselves out by eventually overcoming conflict, so as to move to a higher order of integration, where previous contradictions drop away because the underlying oppositions have been solved. The most famous instance of such a ‘dialectical’ view is Karl Marx’s (also made under Hegel’s influence): that the bourgeoisie and the proletariat would eventually move past their combative opposition, via a period of revolution against capitalism, into the harmony of communism.

In essence, big-H history was, for Fukuyama, an understanding of human development as a logical progression (or dialectical working out of contradictions), generating a grand-narrative of progress, in which each step forward sees the world becoming a more rational place. For Fukuyama, the long-run development of humanity was clearly discernible: from the Dark Ages, to the Renaissance, and then crucially the Enlightenment, with its inventions of secularism, egalitarianism and rational social organisation, paving the way in turn for democratic liberal capitalism. This was the cumulative, and thus far upward-curving, arc of human development…

more…

https://aeon.co/essays/was-francis-fukuyama-the-first-man-to-see-trump-coming

 

WIKK WEB GURU