Category: Media & Culture


SEEING RED?: Bulls can’t actually see the red of the cape, but its meaning is clear to the matador and spectators. Bullfight, the death of the torero by Pablo Picasso

The innate meanings of color and intensity.

You don’t have to look very hard to see that our culture has some pretty powerful associations between colors and feelings. As a recent example, the new Pixar film Inside Out has characters representing emotions, and the color choices for these characters—red for anger, and blue for sadness—feel right.

Red, specifically, is one of the most powerful colors in terms of its associations and the feelings it generates. Soccer players perceive red-shirted opponents to be better players, and one study indeed found that players wearing red shirts won their matches more often. Looking at red also seems to help people focus. Red enhances performance on detail-oriented tasks, whereas blue and green improve the results of creative tasks. Red is also sexual—men find women wearing red to be more attractive, and women think the same of men.

Why might this be? Although these associations are a part of our culture, are they arbitrary, or did they come about for reasons outside of culture, perhaps having to do with our biology or the environment we all live in?

Color is not distributed randomly in the world around us, and as we experience the world, we build up associations between colors and the things they represent. Yet red is a relatively rare color in the natural environment. Certain fruits, small parts of the sky at times, and blood are all red. When we get angry or embarrassed, our faces get redder (though the effect is less obvious in dark-skinned people).

But these appearances of red in our environment don’t seem to be enough to explain the breadth of red’s various connotations. And there is reason to think the meanings might be inherent to our biology: Red connotes high arousal, passion, and violence even for some non-human animals. When male mandrills face off, for instance, the paler (less red) male stands down. And macaques use red in sexual displays. Yet while poison dart frogs are brightly colored, they don’t skew toward red—many are green, yellow, and blue. This suggests that the general association with red is specific to primates, evolving before mandrills and humans diverged. If it had been learned, as a result of being associated with passion and danger in the environment, we would expect broader cross-species associations; for instance, we might see that all poisonous animals were red. (Red does serve as a warning in some other animals, such as venomous snakes, but it’s not at all universal.) If red is indeed a warning color used to communicate among primates specifically, then the associations with red might be innate yet arbitrary, meaning that it might just as easily have been another color that took on that role.

The difference between light and darkness is the most primitive and most visually and emotionally powerful aspect of color, and the associations in our environment for darkness and light seem clearer. This suggests that the implications of light and dark might be learned as well as evolved—that is, we might be born with certain reactions to light and dark that get reinforced through experience with the natural world and through culture. Darkness is scary because it prevents us from using our dominant sense: vision. So in that sense, the common negative associations with darkness are not arbitrary in the way that primates’ use of red as a warning color may be. If a language has only two words for colors, those two words are invariably light and dark. In one experiment, people were asked to read aloud words flashed on a screen. They were faster at saying words associated with immorality (such as “greed”) when the words appeared in black, and faster at saying “moral” words (such as “honesty”) when they appeared in white—and this happened too quickly for them to deliberate about it. This shows that it was subconsciously easier for them to associate morality with lightness…

more…

http://nautil.us/issue/45/power/why-the-dark-side-of-the-force-had-to-be-dark

WIKK WEB GURU


Chemical compasses may rely on quantum spin (Credit: Andrey Volodin/Alamy)

Few of us really understand the weird world of quantum physics – but our bodies might take advantage of quantum properties

By Martha Henriques

If there’s any subject that perfectly encapsulates the idea that science is hard to understand, it’s quantum physics. Scientists tell us that the miniature denizens of the quantum realm behave in seemingly impossible ways: they can exist in two places at once, or disappear and reappear somewhere else instantly.

The one saving grace is that these truly bizarre quantum behaviours don’t seem to have much of an impact on the macroscopic world as we know it, where “classical” physics rules the roost.

Or, at least, that’s what scientists thought until a few years ago.


Now that reassuring wisdom is starting to fall apart. Quantum processes may occur not quite so far from our ordinary world as we once thought. Quite the opposite: they might be at work behind some very familiar processes, from the photosynthesis that powers plants – and ultimately feeds us all – to the familiar sight of birds on their seasonal migrations. Quantum physics might even play a role in our sense of smell.

In fact, quantum effects could be something that nature has recruited into its battery of tools to make life work better, and to make our bodies into smoother machines. It’s even possible that we can do more with help from the strange quantum world than we could without it.

Photosynthesis looks easy (Credit: Morley Read/Alamy)


At one level, photosynthesis looks very simple. Plants, green algae and some bacteria take in sunlight and carbon dioxide, and turn them into energy. What niggles in the back of biologists’ minds, though, is that photosynthetic organisms make the process look just a little bit too easy.

It’s one part of photosynthesis in particular that puzzles scientists. A photon – a particle of light – hurtles some 150 million kilometres through space before colliding with an electron in a leaf outside your window. The electron, given a serious kick by this energy boost, starts to bounce around, a little like a pinball. It makes its way through a tiny part of the leaf’s cell, and passes on its extra energy to a molecule that can act as an energy currency to fuel the plant.


The trouble is, this tiny pinball machine works suspiciously well. Classical physics suggests the excited electron should take a certain amount of time to career around inside the photosynthetic machinery in the cell before emerging on the other side. In reality, the electron makes the journey far more quickly.

What’s more, the excited electron barely loses any energy at all in the process. Classical physics would predict some wastage of energy in the noisy business of being batted around the molecular pinball machine. The process is too fast, too smooth and too efficient. It just seems too good to be true.

Then, in 2007, photosynthesis researchers began to see the light. Scientists spotted signs of quantum effects in the molecular centres for photosynthesis. Tell-tale signs in the way the electrons were behaving opened the door to the idea that quantum effects could even be playing an important biological role.

This could be part of the answer to how the excited electrons pass through the photosynthetic pinball machine so quickly and efficiently. One quantum effect is the ability to exist in many places at the same time – a property known as quantum superposition. Using this property, the electron could potentially explore many routes around the biological pinball machine at once. In this way it could almost instantly select the shortest, most efficient route, involving the least amount of bouncing about.

Quantum physics had the potential to explain why photosynthesis was suspiciously efficient – a shocking revelation for biologists…

more…

http://www.bbc.com/earth/story/20160715-organisms-might-be-quantum-machines?ocid=fbert

WIKK WEB GURU


by Nathaniel Mauka, Staff Writer, Waking Times

Neuroscientists have argued over whether we even have free will, but now they want to turn it off.

The Libet Experiment

In the 1980s, the scientist Benjamin Libet conducted an experiment. He ‘discovered’ that what seems to be free will, the conscious choice to do or not do something, is really just the observation of something that has already happened. This completely rocked the foundations of what most thought of as a prerequisite for being human, and the long-held religious view that free will must always be honored.

Libet recorded people’s brainwaves as they made spontaneous finger movements while looking at a clock. The participants in the study were asked to tell researchers the time at which they decided to wave their fingers. Libet found that there were several hundred milliseconds of preparatory brain activity prior to the time at which people reported consciously deciding to wave their fingers. His findings were taken as gospel that free will did not exist. We now call this preparatory action of the brain the ‘readiness potential.’

What Libet’s experiment failed to consider, though, was manifold. It is possible that people became conscious of an action only milliseconds after a subconscious realization. It is possible that they could not indicate their intent as fast as their physical bodies could carry it out, a delay between physical and mental activity that has been well documented. And it is also possible that an anticipated event is cognized well before the actual event, because the entire causal field is changed by our consciousness, as evidenced by recent experiments in physics. This is called the observer effect, referring to the changes that the act of observation makes on the phenomenon being observed.

Libet implies that the conscious decision is divorced from free will, in that the action is carried out nonconsciously and the subjective feeling of having made the decision is tagged on afterward. However, we already know from a vast body of research, from Jung and others, that we know a lot more than we consciously allow ourselves to honor.

Nonetheless, Libet’s experiment has weathered such criticism, and its implications have been replicated with even more advanced equipment, including fMRI technology and the direct recording of neuronal activity using implanted electrodes.

How to Reprogram Or Eliminate Free Will

These studies all seem to point to the same troubling conclusion: We don’t really have free will. So why, then, are neuroscientists trying to remove our free will?

A study published in Proceedings of the National Academy of Sciences by researchers in Germany has scientists backtracking on their original assumption that we have no free will.

The German researchers worked backwards, in a way, from Libet’s experimental protocol, using a form of brain-computer interface to see whether participants could cancel a movement after the onset of the unconscious preparatory brain activity identified by Libet.

If they could, it would be a sign that humans can consciously intervene and “veto” processes that neuroscience has previously considered automatic and beyond willful control. The methods were more complex, including the use of colored lights, but in short, the researchers found we could easily undo actions and “veto” them – a sign of undeniable free will.

A quote from the lead researcher, Dr. John-Dylan Haynes of Charité – Universitätsmedizin Berlin, is telling for anyone wondering how neuroscientists working for the deep state could override our own free will:

“A person’s decisions are not at the mercy of unconscious and early brain waves. They are able to actively intervene in the decision-making process and interrupt a movement. Previously people have used the preparatory brain signals to argue against free will. Our study now shows that the freedom is much less limited than previously thought.”

These findings were supported by a French study which found that “nonconscious” preparatory brain activity identified by Libet is really just part of a fairly random ebb and flow of background neural activity, and that movements occur when this activity crosses a certain threshold.

And even more studies confirm what we all suspected regardless of early scientific findings – that we all act consciously, perhaps to different degrees, but certainly with free will.

When we form a vague intention to move, the French researchers explain, this mind-set feeds into the background ebb and flow of neural activity, but the specific decision to act occurs only when the neural activity passes a key threshold — and our all-important subjective feeling of deciding happens at this point or a brief instant afterward.

“All this leaves our common sense picture largely intact,” they write, meaning we can break a chain of events (determinism), but that also implies a certain responsibility for our actions.

The Cooperation of Subconscious and Conscious Awareness

All these studies do suggest, though, that our free will requires healthy partnerships between conscious and unconscious systems. In special circumstances like playing musical instruments, engaging in sports, or driving a car, we apparently recruit specialized unconscious agents with the ability to carry out certain acts quickly without conscious “permission.”

If these “unconscious” agents can be reprogrammed, then we can be controlled, essentially by “disabling” our free will – at least according to pedantic science.

Attempts to Destroy Free Will

Drugs like scopolamine are known to wipe our subconscious plates clean so that new, possibly nefarious programming can be installed, and there are the overt mind-control techniques admittedly researched by the CIA (with the help of Stanford neuroscientists, among others) and other intelligence agencies of our government. Aside from these, there are subtle programming methods used every day in the form of subconscious messages in advertising. There are even cell phone apps meant to control the free will of the user. You can imagine what other technologies have been employed.

My advice? Use your free will to override unwanted subconscious programming. If removing free will requires both conscious and ‘non’-conscious compliance, then we can at least interfere by utilizing our conscious awareness and withdrawing tacit consent. That ought to keep the physicists busy for a while, at any rate, and the deep state wasting our tax dollars on more Mind Kontrol experiments.

About the Author
Nathaniel Mauka is a researcher of the dark side of government and exopolitics, and a staff writer for Waking Times.
This article (Deep State Neuroscientists Believe They Can Turn Off Free Will) was originally created and published by Waking Times and is published here under a Creative Commons license with attribution to Nathaniel Mauka. It may be re-posted freely with proper attribution and author bio.
WIKK WEB GURU



Newly built Volkswagen Beetles ready for shipping from Hamburg in 1972. Photo by Thomas Hoepker/Magnum

Unprecedented growth marked the era from 1948 to 1973. Economists might study it forever, but it can never be repeated. Why?


Marc Levinson is an economist, historian and journalist whose work has appeared in The Harvard Business Review, The Wall Street Journal and Bloomberg.com, among others. His latest book is An Extraordinary Time: The End of the Postwar Boom and the Rise of the Ordinary Economy (2016). He lives in Washington, DC.

The second half of the 20th century divides neatly in two. The divide did not come with the rise of Ronald Reagan or the fall of the Berlin Wall. It is not discernible in a particular event, but rather in a shift in the world economy, and the change continues to shape politics and society in much of the world today.

The shift came at the end of 1973. The quarter-century before then, starting around 1948, saw the most remarkable period of economic growth in human history. In the Golden Age between the end of the Second World War and 1973, people in what was then known as the ‘industrialised world’ – Western Europe, North America, and Japan – saw their living standards improve year after year. They looked forward to even greater prosperity for their children. Culturally, the first half of the Golden Age was a time of conformity, dominated by hard work to recover from the disaster of the war. The second half of the age was culturally very different, marked by protest and artistic and political experimentation. Behind that ferment lay the confidence of people raised in a white-hot economy: if their adventures turned out badly, they knew, they could still find a job.

The year 1973 changed everything. High unemployment and a deep recession made experimentation and protest much riskier, effectively putting an end to much of it. A far more conservative age came with the economic changes, shaped by fears of failing and concerns that one’s children might have it worse, not better. Across the industrialised world, politics moved to the Right – a turn that did not avert wage stagnation, the loss of social benefits such as employer-sponsored pensions and health insurance, or the erosion of the secure, stable employment that had proved instrumental to the rise of a new middle class and that workers had come to take for granted. At the time, an oil crisis took the blame for what seemed to be a sharp but temporary downturn. Only gradually did it become clear that the underlying cause was not costly oil but rather lagging productivity growth – a problem that would defeat a wide variety of government policies put forth to correct it.

The great boom began in the aftermath of the Second World War. The peace treaties of 1945 did not bring prosperity; on the contrary, the post-war world was an economic basket case. Tens of millions of people had been killed, and in some countries a large proportion of productive capacity had been laid to waste. Across Europe and Asia, tens of millions of refugees wandered the roads. Many countries lacked the foreign currency to import food and fuel to keep people alive, much less to buy equipment and raw material for reconstruction. Railroads barely ran; farm tractors stood still for want of fuel. Everywhere, producing enough coal to provide heat through the winter was a challenge. As shoppers mobbed stores seeking basic foodstuffs – never mind luxuries such as coffee and cotton underwear – prices soared. Inflation set off waves of strikes in the United States and Canada as workers demanded higher pay to keep up with rising prices. The world’s economic outlook seemed dim. It did not look like the beginning of a golden age.

As late as 1948, incomes per person in much of Europe and Asia were lower than they had been 10 or even 20 years earlier. But 1948 brought a change for the better. In January, the US military government in Japan announced it would seek to rebuild the economy rather than exacting reparations from a country on the verge of starvation. In April, the US Congress approved the economic aid programme that would be known as the Marshall Plan, providing Western Europe with desperately needed dollars to import machinery, transport equipment, fertiliser and food. In June, the three occupying powers – France, the United Kingdom and the US – rolled out the deutsche mark, a new currency for the western zones of Germany. A new central bank committed to keeping inflation low and the exchange rate steady would oversee the deutsche mark.

Postwar chaos gave way to stability, and the war-torn economies began to grow. In many countries, they grew so fast for so long that people began to speak of the ‘economic miracle’ (West Germany), the ‘era of high economic growth’ (Japan) and the ‘30 glorious years’ (France). In the English-speaking world, this extraordinary period became known as the Golden Age…

more…

https://aeon.co/essays/how-economic-boom-times-in-the-west-came-to-an-end

WIKK WEB GURU


Tracy Moore

Love is a power move

Hot tip: Women don’t typically say “I love you” first in a heterosexual relationship because we know men think we are going to say it first, so we wait and make you say it, because that way it will be more “real.”

Lemme put it another way: Many women play an “I love you” game of chicken, because we have to, or else risk confirming every stereotype alive that we are blinking neon signs of emotional neediness. Yes, there are exceptions, and many people are mature, evolved beings who have no need for such silly games, but we can’t all be brave soldiers in the game of love.

Some context: In an age of Hey, men have feelings, too!, research has resurfaced on the internet showing that, when it comes to those three little words, men not only fall in love faster than women, but say it sooner, too. No shit! While the research was celebrated as heartening — proof that men not only have real feelings, but can actually string sentences together on their own to express them without a hard prompt — the ensuing aha! misses the point: Saying “I love you” is, and always will be, one of the earliest, most important power moves in a relationship, and we’ve typically given all this power to dudes, painting women as militants in the game of locking down love. The result? Women feel pressure to hold back.

The research showed up at Broadly, where Jessica Pan explored a 2011 study of 172 college students at Pennsylvania State University, published in the Journal of Social Psychology. Researchers Marissa Harrison and Jennifer Shortall found that men reported both falling in love earlier and saying “I love you” earlier than women did. This contradicted the authors’ expectation that women would fall in love first and express it first. And popular culture, of course, has long painted women as the more eager gender when it comes to falling in love and committing.

“Surprise!” Redbook wrote of the Broadly piece, remarking that the research “totally debunks the myth that women are the ones who *~fall so fast~* and spend all their time quoting songs about unrequited love.”

Women, of course, know this, but such gendered stereotypes — women be chasin’, men be avoidin’ — hang over all our heads as we move toward the big moment. Harrison’s research was published in 2011; that same year, another study was published on Valentine’s Day in the Journal of Personality and Social Psychology. It looked at six studies of male and female commitment behavior, in terms of who says “I love you” first. It, too, found that “although people think that women are the first to confess love and feel happier when they receive such confessions, it is actually men who confess love first and feel happier when receiving confessions.”

So why do we think of women as the ambulance chasers of love, when research has shown again and again that we aren’t? Because we think of falling in love and saying “I love you” as the same thing as wanting a commitment, and women, as we all know, all want commitment.

Take this old bit from Chris Rock, who says women are perennially ready to settle down. “Shit,” he jokes, “a woman go on four good dates, she’s like, ‘Why we bullshitting? What are you waiting for?’ Men, never ready to settle down. Men don’t settle down. We surrender.”…

more…

https://melmagazine.com/its-not-who-falls-in-love-first-but-who-says-it-first-that-matters-bec005f93992#.403udnnum

WIKK WEB GURU


 

 


Intelligent assumptions? At the Oxford Union, 1950. From the Picture Post feature, Eternal Oxford. Photo by John Chillingworth/Getty

Intelligence has always been used as a fig-leaf to justify domination and destruction. No wonder we fear super-smart robots.

Stephen Cave is executive director and senior research fellow of the Leverhulme Centre for the Future of Intelligence at the University of Cambridge. A philosopher by training, he has also served as a British diplomat, and written widely on philosophical and scientific subjects, including for The New York Times, The Atlantic, the Guardian and others.

As I was growing up in England in the latter half of the 20th century, the concept of intelligence loomed large. It was aspired to, debated and – most important of all – measured. At the age of 11, tens of thousands of us all around the country were ushered into desk-lined halls to take an IQ test known as the 11-Plus. The results of those few short hours would determine who would go to grammar school, to be prepared for university and the professions; who was destined for technical school and thence skilled work; and who would head to secondary modern school, to be drilled in the basics then sent out to a life of low-status manual labour.

The idea that intelligence could be quantified, like blood pressure or shoe size, was barely a century old when I took the test that would decide my place in the world. But the notion that intelligence could determine one’s station in life was already much older. It runs like a red thread through Western thought, from the philosophy of Plato to the policies of UK prime minister Theresa May. To say that someone is or is not intelligent has never been merely a comment on their mental faculties. It is always also a judgment on what they are permitted to do. Intelligence, in other words, is political.

Sometimes, this sort of ranking is sensible: we want doctors, engineers and rulers who are not stupid. But it has a dark side. As well as determining what a person can do, their intelligence – or putative lack of it – has been used to decide what others can do to them. Throughout Western history, those deemed less intelligent have, as a consequence of that judgment, been colonised, enslaved, sterilised and murdered (and indeed eaten, if we include non-human animals in our reckoning).

It’s an old, indeed an ancient, story. But the problem has taken an interesting 21st-century twist with the rise of Artificial Intelligence (AI). In recent years, the progress being made in AI research has picked up significantly, and many experts believe that these breakthroughs will soon lead to more. Pundits are by turns terrified and excited, sprinkling their Twitter feeds with Terminator references. To understand why we care and what we fear, we must understand intelligence as a political concept – and, in particular, its long history as a rationale for domination.

The term ‘intelligence’ itself has never been popular with English-language philosophers. Nor does it have a direct translation into German or ancient Greek, two of the other great languages in the Western philosophical tradition. But that doesn’t mean philosophers weren’t interested in it. Indeed, they were obsessed with it, or more precisely a part of it: reason or rationality. The term ‘intelligence’ managed to eclipse its more old-fashioned relative in popular and political discourse only with the rise of the relatively new-fangled discipline of psychology, which claimed intelligence for itself. Although today many scholars advocate a much broader understanding of intelligence, reason remains a core part of it. So when I talk about the role that intelligence has played historically, I mean to include this forebear.

The story of intelligence begins with Plato. In all his writings, he ascribes a very high value to thinking, declaring (through the mouth of Socrates) that the unexamined life is not worth living. Plato emerged from a world steeped in myth and mysticism to claim something new: that the truth about reality could be established through reason, or what we might consider today to be the application of intelligence. This led him to conclude, in The Republic, that the ideal ruler is ‘the philosopher king’, as only a philosopher can work out the proper order of things. And so he launched the idea that the cleverest should rule over the rest – an intellectual meritocracy.

This idea was revolutionary at the time. Athens had already experimented with democracy, the rule of the people – but to count as one of those ‘people’ you just had to be a male citizen, not necessarily intelligent. Elsewhere, the governing classes were made up of inherited elites (aristocracy), or by those who believed they had received divine instruction (theocracy), or simply by the strongest (tyranny)…

more…

https://aeon.co/essays/on-the-dark-history-of-intelligence-as-domination

WIKK WEB GURU


Illustration by Maurice Sendak from Open House for Butterflies by Ruth Krauss

“People who for some reason find it impossible to think about themselves, and so really be themselves, try to make up for not thinking with doing.”

In 1926, having just divorced her first husband at the age of twenty-five, the American poet, critic, essayist, and short story writer Laura Riding (January 16, 1901–September 2, 1991) moved to England and founded, together with her friend the poet Robert Graves, a small independent press. As with Anaïs Nin’s publishing venture, all of their early publications — which included work by Gertrude Stein — were typeset and printed by hand.

In 1930, Riding and Graves moved their offices to Majorca. That year, 29-year-old Riding wrote a series of letters to 8-year-old Catherine — the daughter of Graves and the artist Nancy Nicholson. Originally published by a Parisian press in a limited edition of 200 copies, each signed by the author, Four Unposted Letters to Catherine (public library) endures as a small, miraculous book, reminiscent in spirit of Rilke’s Letters to a Young Poet and in style and substance of the Zen teachings of Seung Sahn or Thich Nhat Hanh. With great simplicity and unpretentious sincerity, both comprehensible and enchanting as much to this particular little girl as to any child or even any wakeful grownup at all, Riding addresses some of the most elemental questions of existence — how to live a life of creativity and integrity, why praise and prestige are corrosive objects of success, and above all what it means to be oneself.

Riding eventually returned to America in 1939, remarried and became Laura (Riding) Jackson, continued to write, and lived to be ninety — a long life animated by the conviction that language is “the essential moral meeting-ground.” When she reflected on these letters three decades after writing them, she remarked wistfully that she might no longer be inclined to write “such easy-speaking letters, treating with so much diffident good-humor the stupendous, incessantly-urgent matter of Virtue and the lack of it,” by which she meant “the eternal virtue of good Being, not the mortal virtue of good Custom.” And yet, mercifully, she did once write them, and they did survive, and today they continue to nourish souls of all ages with their unadorned wisdom and transcendent truthfulness.

In the first of the four letters, a meandering meditation on young Catherine’s remark that grownups sometimes seem to “know everything about everything,” Riding explores the nature of knowledge and its essential seedbed of self-knowledge. She writes:

A child should be allowed to take as long as she needs for knowing everything about herself, which is the same as learning to be herself. Even twenty-five years if necessary, or even forever. And it wouldn’t matter if doing things got delayed, because nothing is really important but being oneself.

Nearly a century after Kierkegaard extolled the virtues of idleness and two decades before the German philosopher Josef Pieper argued that not-doing is the basis of culture, Riding urges young Catherine not to worry about being accused of laziness and considers the basic goodness of simply being oneself:

You seem to spend a lot of time dreaming about nothing at all. And yet you are, as the few people who really know you recognise, a perfect child… This is because when you seem to be dreaming about nothing at all you are not being lazy but thinking about yourself. One doesn’t say you are lazy or selfish. If a person is herself she can’t be a bad person in any way; she is always a good person in her own way. For instance, you are very affectionate, but that’s because you are a good person. You are not a good person just because you are affectionate. It wouldn’t matter if you weren’t affectionate, because you are a good person. You are yourself, and whatever you do is sure to be good.

In a passage that radiates a prescient admonition against the perils of our modern Parenting Industrial Complex, Riding adds:

It is very sad then that so many children are hurried along and not given time to think about themselves. People say to them when they think that they have been playing long enough: “You are no longer a child. You must begin to do something.” But although playing is doing nothing, you are really doing something when you play; you are thinking about yourself. Many children play in the wrong way. They make work out of play. They not only seem to be doing something, they really are doing something. They are imitating the grown-ups around them who are always doing as much instead of as little as possible. And they are often encouraged to play in this way by the grown-ups. And they are not learning to be themselves.

In an essential caveat that teases out the nuance of her point, Riding notes that rather than selfishness or narcissism, such thinking about oneself is the only way to conceive of one’s place within a larger world and therefore to think of the world itself. In a sentiment that calls to mind Diane Ackerman’s wonderful notion of “the plain everythingness of everything, in cahoots with the everythingness of everything else,” Riding offers an almost Buddhist perspective:…

more…

https://www.brainpickings.org/

 

WIKK WEB GURU



FRIENDS FORMING FAMILIES: The TV series The Golden Girls portrayed a different family model, in which unrelated single women formed a household together. ABC Photo Archives / Contributor / Getty Images

Thinking out of the nuclear family box.

Actually, as it turns out, that might not be the best solution anyway.

by Chris Bourn

There are a couple of undeniable truths about stress. First, it will (eventually) kill you — causing a litany of ailments that includes a weakened immune system, ulcers, memory loss, abdominal obesity and an accelerated aging process. One very recent study, published in The Lancet, even finds a direct link between emotional stress and heart disease. Second, we won’t do shit about it. Mainly because we know the only way to really, truly rid ourselves of stress is to tackle the source of said stress head-on, whether that be quitting your job or busting free from your relationship.

At least, that’s what most of us believe. But there must be some sort of happy medium, right? Something that would allow us to alleviate the short-term pain while still doing nothing about the long-term game. Or at least buy us some time before we need to stare down the very thing that’s making us miserable — and weakening our immune systems, giving us ulcers and stealing our memories from us. You know, punting. Yet while it might seem like a cop-out, it’s not as impossible (or unhealthy) as it seems. And it might just be the best thing for you.

#1: You’re still gonna get peace of mind

“When you’re so stressed that you can’t think with clarity, you need to do some other relaxation techniques first,” says Mary Alvord, director of a group mental-health practice in Maryland. It’s sensible enough advice: Stress floods your system with adrenaline, which increases both your heart and breathing rate to ship more oxygen to your tensed-up muscles. This clenched state — especially the spike in blood pressure — not only causes damage to your body over the long term, it seriously clouds your judgment. So before you even consider confronting the source of your stress, it’s a good idea to find a way to dial down the symptoms.

Which techniques will be most effective depends on the source of your stress, Alvord explains. For chronic stress emanating from uncertainty about the future, or from dwelling on past events in your life (e.g., divorce or death), she suggests training yourself in the psychology cure-all du jour: Mindfulness. “It’s not making you forget,” she says, “but it teaches you to be in the moment and not to get stuck on the past or worry unduly about the future.”

For stress that’s unbearable, she teaches visualization techniques: “This is so you can be at work, in a stressful situation, and imagine yourself by the ocean, if that’s something that’s relaxing for you. Go there for a few minutes, calm yourself down and then you can get back on task.”

#2: You’re gonna figure out what you can control in the first place

At the root of many people’s struggles with stress is a missing sense of control in their lives. It’s no coincidence that being choked by debts and bills, or even just having a career that seemingly offers no room for meaningful advancement, are so often cited among the leading causes of stress in America. But if our stress is really coming from a feeling that we’re not in charge of our own lives, what’s the answer? How do you control what you can’t control?

For some psychologists, cognitive behavior therapy programs — in which patients are encouraged to break down their problems into manageable parts and understand how they’re affecting them — are the best answer. “It’s learning to be proactive and coming up with plans to take initiative, so that you feel like you’re doing something,” Alvord explains. If your stress is work-related, it could take the form of on-the-job training or mentoring, or getting as far from work as possible and doing the things that make you happy. “If you can believe that you have some power to control some aspects of your life — not everything, but some aspects — and take charge of those, you tend to be more resilient in more areas,” she says…

more…

https://melmagazine.com/is-there-a-way-to-deal-with-stress-without-confronting-its-source-aab36896f978#.nsfz3ahh4

WIKK WEB GURU

