The legal imagination

I have the right to remain silent (2017). Oil on canvas (35 x 25cm) by Albert Barqué-Duran. Private Collection.

Hypotheticals, fantastical beings, and a fictional omnibus: legal reasoning is made supple by its use of the imagination

Maksymilian Del Mar is a reader in legal theory at the School of Law at Queen Mary University of London. He is the co-editor of Legal Fictions in Theory and Practice (2015). He lives in London.

The legal world is wonderfully strange. Pull down a dusty volume of case law from a barrister’s bookshelf, and you’ll discover a parade of fantastical beings that could have been lifted from the pages of Jorge Luis Borges or Dr Seuss. In the law, constitutions behave like living trees, the island of Minorca is treated as a suburb of London, immobile houses suddenly zoom along beltways, Clapham omnibuses are packed with reasonable men, and spectral officious bystanders routinely spy on contractual negotiations. The legal realm is full of unlikely and improbable possibilities, as well as paths not taken, counterfactuals, mights, perhapses and maybes.

All of this draws on the faculty of the imagination. You’d be forgiven for thinking of a judge as someone who spends all day shoehorning ‘the facts’ into pre-fabricated principles, and laying down determinative rulings like geological strata. In fact, legal reasoning is a much more supple exercise. Individual judges must resolve knotty questions under conditions of uncertainty, and in a context in which there’s usually profound disagreement about both what has happened and what ought to be done about it. 

In these circumstances, imagination performs many salutary functions. Indeed, legal reasoning would be impossible without it. Imagination allows judges to explore what might be at stake in any particular dispute, and to provide a set of resources for future decision-makers. It lets them communicate doubt and express hesitation. And it brings the language of law alive, moving us and inviting us to imagine further – and so enables a thriving, interactive community of enquiry.

Of course, imagination also carries certain dangers. It might encourage bias, or signal a departure from common sense. But overall it should be celebrated – in law and, perhaps, in other domains where people must engage in the messy business of public reasoning.

Legal reasoning has at least four imaginative abilities at its disposal. The first is supposing: pretending that something is the case when you know or suspect that it’s not. Judges have been doing this sort of ‘as-if’ style of imagining for thousands of years. Courts in ancient Rome frequently used a mechanism known as fictio civitatis, the fiction of citizenship, which let authorities rule on the behaviour of ‘aliens’ as if they were Romans. As Gaius, a celebrated jurist in the second century CE, said:

If it appears that a golden cup has been stolen from Lucius Titius by Dio the son of Hermaeus or by his aid and counsel, on which account, if he were a Roman citizen, he would be bound to compound for the wrong as a thief.

Fictions are not just the preserve of the West. In 17th-century China, clans of villagers set up ‘companies’ that collected and distributed capital to their members, who were supposedly united by kinship with common ancestors. But as the legal scholar Teemu Ruskola at Emory University in Atlanta argues in Legal Orientalism (2013), ‘the idiom of the family was frequently only a legal fiction used to recruit members, many of whom were not even related by blood to the clan they joined’. Needless to say, this fiction often proved useful in raising revenue for the company.

However, I will focus on the common law – a tradition that comes from Britain, and in which the authority for a principle is settled through the slow accretion of case law and custom, rather than by setting out everything in statutes or codes. This mode of thought involves its fair share of judge-invented fictions. In the 18th-century case of Mostyn v Fabrigas, for example, a resident of Minorca – an island off the coast of Spain that was under British rule – claimed that he had been falsely imprisoned by the British government. To gain jurisdiction, the British court treated the territory as if it were a suburb of London.

Legal scholars usually dislike such judicial inventiveness. ‘[T]he pestilential breath of Fiction poisons the sense of every instrument it comes near,’ wrote the jurist and philosopher Jeremy Bentham in 1776. He said that imagination had infected the law like syphilis, ‘begotten in the bed of metaphor’ – something of an irony, given his own turn of phrase. Bentham claimed that legal language would reflect the truth of affairs only if it were direct and free of ornament, and accused lawyers of deliberately mystifying the law so as to retain sole guardianship over its mysteries – and thereby enrich themselves…

more…

https://aeon.co/essays/why-judges-and-lawyers-need-imagination-as-much-as-rationality

WIKK WEB GURU

The Secret Playbook of Internet Trolls. “Disrupt, Misdirect and Control Internet Discussions”


Relevant to the evolving fake news saga, this incisive article was first published in 2014. The trolls’ objective is to smear truth in the media

Pleased to meet you
Hope you guess my name
What’s confusing you
Is the nature of my game

– The Rolling Stones

The reason that Internet trolls are effective is that people still don’t understand their game.

There are 15 commonly used trolling tactics to disrupt, misdirect and control internet discussions.

As one interesting example, trolls start flame wars because – according to two professors – swearing and name-calling shut down our ability to think and focus.

And trolls will often spew divisive attacks so that people argue against each other instead of against the bad actions and policies of the powers that be. For example, trolls will:

Start a religious war whenever possible using stereotypes like “all Jews are selfish”, “all Christians are crazy” or “all Muslims are terrorists”.

Yesterday, the alternative news site Common Dreams caught a troll using scores of different user names to spew anti-Semitic bile. (Common Dreams discovered that the same troll was behind the multiple user names by tracking their IP addresses. And the troll confessed to Common Dreams.)
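
For the technically curious, the check Common Dreams describes – grouping usernames by originating IP address – is straightforward to picture in code. The sketch below is a hypothetical illustration only: the comment-log format, field names and addresses are all invented, and a real site would also have to contend with shared and rotating IPs.

```python
# Hypothetical sock-puppet check: group commenter usernames by IP address
# and flag any address posting under multiple names. Everything here
# (log format, names, addresses) is invented for illustration.
from collections import defaultdict

comments = [
    {"user": "alias_one", "ip": "203.0.113.7"},
    {"user": "alias_two", "ip": "203.0.113.7"},
    {"user": "regular_reader", "ip": "198.51.100.23"},
]

aliases_by_ip = defaultdict(set)
for comment in comments:
    aliases_by_ip[comment["ip"]].add(comment["user"])

for ip, names in aliases_by_ip.items():
    if len(names) > 1:
        print(f"{ip} posted under {len(names)} names: {sorted(names)}")
```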

The troll is “a Jewish Harvard graduate in his thirties who was irritated by the website’s discussion of issues involving Israel”.

He posted anti-Semitic diatribes – such as that Hitler should have finished the job and killed all Jews – using one alias. Then, a couple of minutes later, he’d post an attack on the first poster using a different alias, claiming that criticism of Israel is the same thing as anti-Semitism. (Note: Holocaust survivors and Israeli ministers say it’s not.)

Why would a Jew post vile anti-Semitic comments?  Because normal people are offended by – and don’t want to be associated with – pure, naked anti-Semitism, and so they will avoid such discussions.  If the discussion was originally criticizing a specific aspect of Israeli policy, the discussion will break down, and the actual point regarding policy will be lost.

Similarly, anti-Semitic posts weaken websites by making them seem less reputable. Indeed, Common Dreams says that the troll’s anti-Semitic comments drove away many of that site’s largest donors … dealing a severe blow to its continued viability. That’s exactly what trolls spewing anti-Semitic bile are trying to do: shut down logical discussion and discredit and weaken sites which allow rational criticism of policy.

It is well known that foreign governments and large companies troll online. See this, this, this, and this. For example, the Israeli government is paying students to post pro-Israeli comments online.

And American students are also attempting to influence internet discussion.

While the Common Dreams troll claims that he’s not sponsored by the state of Israel, government agencies have manipulated Internet discussion for years. This includes the use of multiple “sock puppet” aliases. The potential for mischief is stunning.

Unless we learn their game…

http://www.globalresearch.ca/the-secret-playbook-of-internet-trolls-disrupt-misdirect-and-control-internet-discussions/5581824

WIKK WEB GURU

The Illusion of America: Has it Ever Really Existed?


by Ryan Cristian, Guest Waking Times

This country has always prided itself on the personal freedom and constitutional rights of the individual, and oddly enough it still does today, despite the fact that the US currently ranks 41st on the 2016 World Press Freedom Index and 11th on the Heritage Foundation’s 2016 Index of Economic Freedom, behind countries such as Singapore, Hong Kong and Chile.

However, the facts have yet to stop the government, or rather the faction that has co-opted the government, from carrying out its will and broadcasting whichever story reinforces that will. From lying about attacks and epidemics to fabricating them altogether, what this country once could have been is simply no longer present where, sadly, it truly counts: within the government, save for a brave few. But the people remain, and that American Dream still resides within their hearts and longs to be realized. Regardless of whether that dream was ever truly seen to fruition, or whether it was sabotaged by some of the very men who took part in its creation, it has taken root in the very foundation that holds this broken country together; it has become eternal. It prevails within the deep-seated beliefs of that which makes this country truly powerful: its people.

Those who have participated in this fabrication are terrified that the people will once and for all see through the lie and find the sinister reality behind it. And that is indeed what we are witnessing in real time, as Americans finally begin to wake up to this gross manipulation.

This awakening has been a long time coming, and was brought on by many different occurrences. Yet the one event that was always meant to reinforce citizens’ blind faith in and dedication to the system – the election process – has, much to the establishment’s dismay, become the largest awakening event in American history. People are being hyper-exposed to the true nature of the government and those who have complete control over its actions, thanks almost entirely to WikiLeaks and those who anonymously leaked the information.

From blaming the Russians with zero evidence, and pitting Americans against each other by race, religion and political standing, to the classic “lesser of two evils” ploy, the entire past election has, weirdly enough, turned into one of the most important things that could have happened to the individual, as it shook us all out of our complacency to see the true level of corruption operating right beneath our noses while we are told we are living the dream.

Does the Government Represent the People?

In today’s “feelings-over-facts” climate, when a study does not embody what one chooses to embrace as “the truth,” it is simply disregarded as “fake news” or dismissed by some other catch-all measure that allows the individual to ignore contradictory information, and to do so with a sense of confidence, albeit a false one. That is the case with the following study, which, wildly enough, has not gotten much attention since its release – something that could only happen among a willfully uninformed populace. The study actually quantifies the irrelevancy of the American public in regard to their influence on the government and its policy.

Researchers at Princeton University looked at more than 20 years’ worth of data to answer a pretty simple question:

Does the government represent the people?

Quoting the Princeton study directly:

“The preferences of the average American appear to have only a minuscule, near-zero, statistically non-significant impact upon public policy.”

[Graph from the Princeton study, comparing the influence of average citizens and of economic elites on the likelihood that a policy is adopted]

Many are beginning to come to grips with the reality that their opinion doesn’t matter in this country, and that the government truly doesn’t care what they think, and sadly… they are correct. But there’s a catch. Economic elites and business interests – those who have enough money to hire lobbyists – have an entirely different line on the graph above than the average American (the bottom 90%), and, surprise surprise, it’s much closer to the ideal representation, which means these elite individuals do in fact directly influence policy in this country. In other words, this study clearly demonstrates that money, and those who have it, dictate this country’s true direction – not the people and their faux democracy.

Ultimately, this shows that the elites have the ability to get what they want, regardless of the American majority will, and the people pay for it, literally and figuratively. With one in every five American children born into poverty, the American people suffer the most expensive healthcare in the world, a floundering education system, and a catastrophically detrimental failure of a drug war, all spurred on by egregiously wasteful spending seemingly without end. Almost every major issue Americans face as a nation can be tied back to the realization brought on by the graph above: the average person has essentially zero influence on the passage of the very laws that affect their daily lives and the direction of their own country, which they then pay for with their hard-earned money, lest they be forced into a cage at gunpoint. Sounds like freedom to me…

more…

About the Author

Ryan Cristian is the author of the website The Last American Vagabond.

This article (The Illusion of America: Has it Ever Really Existed?) was originally created and published by The Last American Vagabond and is printed here with permission. 

http://www.wakingtimes.com/2017/03/27/illusion-america-ever-really-existed/

WIKK WEB GURU

Polish Poet and Nobel Laureate Wisława Szymborska on How Our Certitudes Keep Us Small and the Generative Power of Not-Knowing

Art by Salvador Dalí from a rare edition of Alice’s Adventures in Wonderland

“Attempt what is not certain. Certainty may or may not come later. It may then be a valuable delusion,” the great painter Richard Diebenkorn counseled in his ten rules for beginning creative projects. “One doesn’t arrive — in words or in art — by necessarily knowing where one is going,” the artist Ann Hamilton wrote a generation later in her magnificent meditation on the generative power of not-knowing. “In every work of art something appears that does not previously exist, and so, by default, you work from what you know to what you don’t know.”

What is true of art is even truer of life, for a human life is the greatest work of art there is. (In my own life, looking back on my ten most important learnings from the first ten years of Brain Pickings, I placed the practice of the small, mighty phrase “I don’t know” at the very top.) But to live with the untrammeled open-endedness of such fertile not-knowing is no easy task in a world where certitudes are hoarded as the bargaining chips for status and achievement — a world bedeviled, as Rebecca Solnit memorably put it, by “a desire to make certain what is uncertain, to know what is unknowable, to turn the flight across the sky into the roast upon the plate.”

That difficult feat of insurgency is what the great Polish poet Wisława Szymborska (July 2, 1923–February 1, 2012) explored in 1996 when she was awarded the Nobel Prize in Literature for capturing the transcendent fragility of the human experience in masterpieces like “Life-While-You-Wait” and “Possibilities.”

In her acceptance speech, later included in Nobel Lectures: From the Literature Laureates, 1986 to 2006 (public library) — which also gave us the spectacular speech on the power of language Toni Morrison delivered after becoming the first African American woman to win the Nobel Prize — Szymborska considers why artists are so reluctant to answer questions about what inspiration is and where it comes from:

It’s not that they’ve never known the blessing of this inner impulse. It’s just not easy to explain something to someone else that you don’t understand yourself.

Noting that she, too, tends to be rattled by the question, she offers her wieldiest answer:

Inspiration is not the exclusive privilege of poets or artists generally. There is, has been, and will always be a certain group of people whom inspiration visits. It’s made up of all those who’ve consciously chosen their calling and do their job with love and imagination. It may include doctors, teachers, gardeners — and I could list a hundred more professions. Their work becomes one continuous adventure as long as they manage to keep discovering new challenges in it. Difficulties and setbacks never quell their curiosity. A swarm of new questions emerges from every problem they solve. Whatever inspiration is, it’s born from a continuous “I don’t know.”

In a sentiment of chilling prescience today, as we witness tyrants drunk on certainty drain the world of its essential inspiration, Szymborska considers the destructive counterpoint to this generative not-knowing:

All sorts of torturers, dictators, fanatics, and demagogues struggling for power by way of a few loudly shouted slogans also enjoy their jobs, and they too perform their duties with inventive fervor. Well, yes, but they “know.” They know, and whatever they know is enough for them once and for all. They don’t want to find out about anything else, since that might diminish their arguments’ force. And any knowledge that doesn’t lead to new questions quickly dies out: it fails to maintain the temperature required for sustaining life. In the most extreme cases, cases well known from ancient and modern history, it even poses a lethal threat to society.

This is why I value that little phrase “I don’t know” so highly. It’s small, but it flies on mighty wings. It expands our lives to include the spaces within us as well as those outer expanses in which our tiny Earth hangs suspended. If Isaac Newton had never said to himself “I don’t know,” the apples in his little orchard might have dropped to the ground like hailstones and at best he would have stooped to pick them up and gobble them with gusto. Had my compatriot Marie Sklodowska-Curie never said to herself “I don’t know”, she probably would have wound up teaching chemistry at some private high school for young ladies from good families, and would have ended her days performing this otherwise perfectly respectable job. But she kept on saying “I don’t know,” and these words led her, not just once but twice, to Stockholm, where restless, questing spirits are occasionally rewarded with the Nobel Prize…

more…

https://www.brainpickings.org/

WIKK WEB GURU

Everything That’s Ever Been Said About Boning Before Sporting Events

by Andrew Fiouzi

1. The idea that celibacy breeds maximum athletic performance dates back to 444 B.C., when Plato, of all people, opined, “Olympic competitors before races should avoid sexual intimacy.” A few centuries later, Aretaeus of Cappadocia, a celebrated Greek physician, gave Plato’s thinking a little more color: “If any man is in possession of semen, he is fierce, courageous and physically mighty, like beasts.”

2. The most detailed explanation, though, can be found in Philostratus’ Gymnasticus, the oldest text on sports known to man: “Those who come to the gymnasium straight after sex are exposed by a greater number of indicators when they train, for their strength is diminished and they are short of breath and lack daring in their attacks, and they fade in colour in response to exertion. … And when they strip, their hollow collar-bones give them away, their poorly structured hips, the conspicuous outline of their ribs, and the coldness of their blood. These athletes, even if we dedicated ourselves to them, would have no chance of being crowned in any contest. The part beneath the eyes is weak, the beating of their hearts is weak, their perspiration is weak, their sleep, which controls digestion, is weak, and their eyes glance around in a wandering fashion and indicate an appearance of lustfulness.”

3. Perhaps that’s why Cleitomachus, a star pankratiast (sort of an ancient form of MMA that was a big event during the earliest Greek Olympics), is said to have never slept with his wife, and would avert his gaze when he saw two dogs mating.

4. To ensure that a male athlete’s seed was never spilled — intentionally or otherwise — Galen, another prominent Greek doctor, recommended the following around the 2nd century: “A flattened lead plate is an object to be placed under the muscles of the loins of an athlete in training, chilling them whenever they might have nocturnal emissions of semen.”

5. That said, not everyone thought a little pre-game bacchanal was the mark of a loser. In fact, in 77 A.D., Pliny the Elder, author, philosopher and inspiration for a delicious beer, as well as a naval and army commander of the Roman Empire, argued directly against Plato and everyone else above when he wrote, “Athletes when sluggish are revitalized by lovemaking.”

6. Despite the passage of about 2,000 years, our thinking on the topic has not gotten any clearer. And the lengths some athletes have gone to in order to suppress their libidos are no less barbaric than sticking lead plates down their pants. For instance, Antonio Miguel, head of medical services at the Club Universidad Nacional Pumas, one of the top soccer teams in Mexico, has said, “At the end of the 1950s and beginning of the 1960s, people thought that sex diminished the players’ performance. Coaches gave us nitrate salts (potassium nitrate, a substance used to prevent erections) because, according to them, this would inhibit the sexual desire.”

7. With or without nitrate salts, Muhammad Ali, according to several reports, abstained from having sex for six weeks before a fight.

8. After all, WOMEN WEAKEN LEGS.

9. All of which seems backward, since a 1968 study, “Muscular Performance Following Coitus,” found that men who hadn’t had sex for six days did no better on a strength test than men who’d had sex the previous night.

10. Same for a 2000 study in the Journal of Sports Medicine and Physical Fitness involving 15 high-level athletes between the ages of 20 and 40 who participated in a two-day experiment. Its conclusion? Sexual activity had no significant overall effect on how the athletes performed during exercise and mental tests.

11. In fact, Emmanuele A. Jannini of the University of L’Aquila in Italy has found that sex stimulates the production of testosterone. “After three months without sex, which is not so uncommon for some athletes, testosterone dramatically drops to levels close to children’s levels,” he told National Geographic.

12. Of course, Joe Namath didn’t need Jannini to tell him that. “I try to [have sex the night before a game],” he explained in his 1969 Playboy Interview. “Before one game last year, I just sat home by myself and watched television, drank a little tequila to relax and went to sleep fairly early. But most of the nights before games, I’ll be with a girl. One of the Jets’ team doctors, in fact, told me that it’s a good idea to have sexual relations before a game, because it gets rid of the kind of nervous tension an athlete doesn’t need.”…

more…

https://melmagazine.com/everything-thats-ever-been-said-about-boning-before-sporting-events-e737ca33e1c1#.63byqwokg

WIKK WEB GURU

When nations apologise

‘What people do when words fail them’: German Chancellor Willy Brandt kneels before the memorial to the dead of the Warsaw Uprising, 7 December 1970. Photo by Ullstein Bild/Sven Simon (image edited by Web Investigator)

National apologies are a big deal: they acknowledge the past to help move everyone forward. No wonder they’re so hard

by Edwin Battistella, who teaches linguistics and writing at Southern Oregon University. He has a PhD in linguistics from the City University of New York. His most recent book is Sorry About That: The Language of Public Apology (2014).

In May 2016, when Barack Obama visited Hiroshima, some speculated that the president of the United States might offer an apology, on behalf of his country, for the bombing of that city at the close of the Second World War. Instead, in his joint press conference with Japan’s prime minister Shinzo Abe, Obama said that his visit would ‘honour all those who were lost in the Second World War and reaffirm our shared vision of a world without nuclear weapons’. The White House had announced before the visit that it would neither revisit the decision to drop the bomb nor apologise for it. The Obama administration judged that this wartime military action required no apology. 

When do nations apologise? Nearly 30 years earlier, in 1988, the US Congress passed the Civil Liberties Act authorising apologies and redress payments to the Japanese Americans interned during the 1940s. Signing the bill, the US president Ronald Reagan said that ‘here we admit a wrong; here we reaffirm our commitment as a nation to equal justice under the law’. Reagan’s successors, George H W Bush and Bill Clinton, later sent individual apology letters to former internees as their claims were processed.

The apology for internment was a long time coming. In the wave of xenophobia following the 1941 attack on Pearl Harbor, the military removed nearly 120,000 Japanese Americans and Japanese nationals to what were euphemistically called War Relocation Authority camps. Those interned encountered hardship, suffering and loss. In 1944, in Korematsu v United States, the Supreme Court upheld the constitutionality of the removal; internment ended only after the war. Some Japanese Americans had been imprisoned for as long as three years. They were given a train ticket and $25. But no apology.

Then, 43 years after internment ended, the US Congress apologised. The path to apology began in 1970, with a call to action from the Japanese American Citizens League. A decade later, the US president Jimmy Carter appointed a Commission on Wartime Relocation and Internment of Civilians to recommend a course of action. Getting the apology was controversial, involving issues of cost and accountability, political consensus-building, and philosophical debate about whether later governments were responsible for the moral failures of their predecessors. But, in the eyes of many former internees, the effort was worth it. For them, it was a restoration of honour. For the US government, the apology was an admission of having wronged its citizens and a recommitment to justice.

Sometimes, a whole nation must come to grips with its collective past if it is to move ahead. This was the case for the 50-year process that Germany underwent following Hitler’s regime. With the Nuremberg trials and with reforms of the education system aimed at denazification, the Allied victors focused on accountability and re-education. West German politicians knew that they had to address wartime atrocities if Germany was to rejoin the community of nations. As a step toward this, in 1952 the German chancellor Konrad Adenauer’s government made a 3.5 billion Deutsche mark payment to the new state of Israel. Internally, however, Adenauer was advocating a national policy of forgetting rather than apology and remembrance. In his first address as chancellor in 1949, Adenauer had told the West German parliament that his government was determined to put the past behind. He was concerned with a resurgence of nationalism and sought to de-emphasise wartime guilt in favour of economic revitalisation.

In the post-war years, the German Right and Left would debate whether exploring the past would mire the nation in perpetual guilt, or whether a greater recognition of the past was a necessary step to national dignity. Contrition became the norm, so much so that when the later chancellor Willy Brandt went to Poland in 1970, he knelt before the monument commemorating the Warsaw uprising of 1943. Brandt’s action was widely viewed as a non-verbal apology, and he later explained that he ‘did what people do when words fail them’. In 1995, Helmut Kohl, the chancellor of a unified Germany, found the words. On the 50th anniversary of the liberation of Auschwitz, Kohl was unambiguously apologetic. Auschwitz, he explained, was ‘the darkest and most horrible chapter of German history … one of our priority tasks is to pass on this knowledge to future generations so that the horrible experiences of the past will never be repeated.’ The process of national apology for Germany was one of overcoming amnesia and acknowledging the past. But, importantly, the orientation was on the future. Germany saw facing and apologising for its past as a way to improve the lives of those to come…

more…

https://aeon.co/essays/a-national-apology-has-the-power-to-change-the-future

WIKK WEB GURU

What Really Turned the Sahara Desert From a Green Oasis Into a Wasteland?


One of the world’s most iconic deserts was once lush and green. What happened? (Alamy)

Ten thousand years ago, this iconic desert was unrecognizable. A new hypothesis suggests that humans may have tipped the balance

SMITHSONIAN.COM

When most people imagine an archetypal desert landscape—with its relentless sun, rippling sand and hidden oases—they often picture the Sahara. But 11,000 years ago, what we know today as the world’s largest hot desert would’ve been unrecognizable. The now-desiccated northern strip of Africa was once green and alive, pocked with lakes, rivers, grasslands and even forests. So where did all that water go?

Archaeologist David Wright has an idea: Maybe humans and their goats tipped the balance, kick-starting this dramatic ecological transformation. In a new study in the journal Frontiers in Earth Science, Wright set out to argue that humans could be the answer to a question that has plagued archaeologists and paleoecologists for years.

The Sahara has long been subject to periodic bouts of humidity and aridity. These fluctuations are caused by slight wobbles in the tilt of the Earth’s orbital axis, which in turn changes the angle at which solar radiation penetrates the atmosphere. At repeated intervals throughout Earth’s history, there’s been more energy pouring in from the sun during the West African monsoon season, and during those times—known as African Humid Periods—much more rain comes down over north Africa.

With more rain, the region gets more greenery and rivers and lakes. All this has been known for decades. But between 8,000 and 4,500 years ago, something strange happened: The transition from humid to dry happened far more rapidly in some areas than could be explained by the orbital precession alone, resulting in the Sahara Desert as we know it today. “Scientists usually call it ‘poor parameterization’ of the data,” Wright said by email. “Which is to say that we have no idea what we’re missing here—but something’s wrong.”

As Wright pored over the archaeological and environmental data (mostly sediment cores and pollen records, all dated to the same time period), he noticed what seemed like a pattern. Wherever the archaeological record showed the presence of “pastoralists”—humans with their domesticated animals—there was a corresponding change in the types and variety of plants. It was as if, every time humans and their goats and cattle hopscotched across the grasslands, they had turned everything to scrub and desert in their wake.

Wright thinks this is exactly what happened. “By overgrazing the grasses, they were reducing the amount of atmospheric moisture—plants give off moisture, which produces clouds—and enhancing albedo,” Wright said. He suggests this may have triggered the end of the humid period more abruptly than can be explained by the orbital changes. These nomadic humans also may have used fire as a land management tool, which would have exacerbated the speed at which the desert took hold.

It’s important to note that the green Sahara always would’ve turned back into a desert even without humans doing anything—that’s just how Earth’s orbit works, says geologist Jessica Tierney, an associate professor of geoscience at the University of Arizona. Moreover, according to Tierney, we don’t necessarily need humans to explain the abruptness of the transition from green to desert.

Instead, the culprits might be regular old vegetation feedbacks and changes in the amount of dust. “At first you have this slow change in the Earth’s orbit,” Tierney explains. “As that’s happening, the West African monsoon is going to get a little bit weaker. Slowly you’ll degrade the landscape, switching from desert to vegetation. And then at some point you pass the tipping point where change accelerates.”

Tierney adds that it’s hard to know what triggered the cascade in the system, because everything is so closely intertwined. During the last humid period, the Sahara was filled with hunter-gatherers. As the orbit slowly changed and less rain fell, humans would have needed to domesticate animals, like cattle and goats, for sustenance. “It could be the climate was pushing people to herd cattle, or the overgrazing practices accelerated denudation [of foliage],” Tierney says.

Which came first? It’s hard to say with the evidence we have now. “The question is: How do we test this hypothesis?” she says. “How do we isolate the climatically driven changes from the role of humans? It’s a bit of a chicken and an egg problem.” Wright, too, cautions that right now we have evidence only for correlation, not causation.

But Tierney is also intrigued by Wright’s research, and agrees with him that much more research needs to be done to answer these questions.

“We need to drill down into the dried-up lake beds that are scattered around the Sahara and look at the pollen and seed data and then match that to the archaeological datasets,” Wright said. “With enough correlations, we may be able to more definitively develop a theory of why the pace of climate change at the end of the AHP doesn’t match orbital timescales and is irregular across northern Africa.”

Tierney suggests researchers could use mathematical models that compare the impact hunter-gatherers would have on the environment versus that of pastoralists herding animals. For such models it would be necessary to have some idea of how many people lived in the Sahara at the time, but Tierney is sure there were more people in the region than there are today, excepting coastal urban areas…
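
Tierney’s suggestion is easy to sketch in miniature. What follows is a deliberately toy model – not Wright’s or Tierney’s actual mathematics, and every coefficient is invented – in which vegetation regrows with rainfall, rainfall depends partly on vegetation cover (the moisture feedback Wright describes), and a grazing parameter stands in for pastoralists and their herds.

```python
# Toy vegetation-rainfall feedback model. Illustrative only: the structure
# and all coefficients are invented for this sketch, not taken from research.

def simulate(grazing_rate, years=4000):
    veg = 0.8                                  # vegetated fraction (0..1)
    cover = []
    for t in range(years):
        monsoon = 1.0 - 0.5 * t / years        # slowly weakening orbital forcing
        rain = 0.5 * monsoon + 0.5 * veg       # rainfall partly fed by vegetation
        growth = 0.05 * rain * veg * (1.0 - veg)               # logistic regrowth
        loss = 0.02 * (1.0 - rain) * veg + grazing_rate * veg  # dieback + grazing
        veg = max(0.0, min(1.0, veg + growth - loss))
        cover.append(veg)
    return cover

for g in (0.0, 0.01):
    run = simulate(g)
    # First year cover falls below 10% of the landscape, if it ever does
    collapse = next((t for t, v in enumerate(run) if v < 0.1), None)
    print(f"grazing rate {g}: final cover {run[-1]:.2f}, collapse year {collapse}")
```

Under these made-up numbers, the ungrazed landscape settles at a lower but stable cover as the monsoon weakens, while even a small constant grazing term pushes the system past the point where the vegetation-rainfall feedback can recover. It illustrates the shape of the hypothesis, nothing more.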

more…

Read more: http://www.smithsonianmag.com/science-nature/what-really-turned-sahara-desert-green-oasis-wasteland-180962668/#3aY85oLSfAGlomkP.99

WIKK WEB GURU