Archive for August, 2016
Republican U.S. presidential candidate Donald Trump has been called both a “narcissist” and a “psychopath.” While the two terms are often used interchangeably in common parlance, there are key differences between the two disorders that one should know.
It is quite difficult to identify someone as a psychopath. Psychopaths can seem quite normal, even charming, but their normalcy is a mere façade. They lack conscience and empathy, which leads them to be calculating, manipulative, impulsive and, sometimes, criminal. At the same time, psychopaths are often gifted with considerable intelligence.
They are quite predatory in nature. They constantly seek people out to abuse for their selfish gains moving on to the next person without so much as a second thought. Psychopaths don’t fear anyone and possess a remarkable amount of confidence.
Psychopaths and narcissists have one important trait in common. They both lack empathy.
However, people suffering from Narcissistic Personality Disorder exhibit great arrogance and a constant need for attention both in their personal and professional lives. Narcissists are often described as “cocky, self-centered, manipulative, and demanding.” They believe they are entitled to special privileges and tend to hold themselves in high regard.
Although narcissists appear on the surface to have high self-esteem, underneath it is usually very poor; the grandiosity is a mask. In some cases, however, narcissists have been observed to have high self-esteem at both levels.
Narcissists can also be quite aggressive, often acting on impulse to gain attention. And unlike psychopaths, narcissists openly display emotion: a narcissist can suffer humiliation, while a psychopath cannot.
A lot of what our brain does is synthesize a hallucination, a model of the world that we proceed to live in. This is a model reality; the real reality is completely unknowable. – Dennis McKenna
The earliest known descriptions of lucid dreaming come to us from Hindu scriptures dating back over 3,000 years, in the Upanishads and the Vigyan Bhairav Tantra, where there are instructions for how to direct one’s consciousness within a dream and during sleep. Other ancient descriptions of lucid-dreaming meditations, from the Tibetan Bön and Vajrayāna Buddhist traditions, are over 1,200 years old.
In the West, the earliest mention of lucid dreaming comes from Aristotle, some 2,300 years ago. In his treatise On Sleep and Dreams, Aristotle says that when we are asleep there is often something in our minds telling us that what we are experiencing is only a dream. However, as we learned in the introduction, the first attempt at a systematic scientific study of lucid dreaming began with the French sinologist Marquis d’Hervey de Saint-Denys, in the mid-1800s.
In 1867, Saint-Denys’ book Les rêves et les moyens de les diriger (Dreams and How to Guide Them) was published, and this landmark book is the first known record of a systematic exploration of lucid dreaming.
Originally published anonymously, Saint-Denys’ detailed personal reports span a period of thirty-two years. In this remarkable book, the author describes how he became interested in dreams as a young teenager, and how he learned to become lucid in his dreams and partially direct what happened. Saint-Denys coined the term rêve lucide, “lucid dream,” and he performed many experiments in his lucid dreams.
The first scientist in the West to explore lucid dreaming was Dutch physician Frederik van Eeden, a contemporary of Freud’s who corresponded with the psychoanalyst about dreams. Van Eeden’s famous first scientific paper on lucid dreaming, “A Study of Dreams,” was published in 1913. This landmark paper contains the first mention of the term “lucid dream” in the English language.
P. D. Ouspensky’s essay “On the Study of Dreams and Hypnotism” was published in 1931. Much of it is based on detailed observations of the author’s own accounts of lucid dreaming, which he calls “half-dream states.” Ouspensky, a mathematician, made a number of fascinating observations, along with generalized assertions based on his own experiences that may not be as universal as he believed; but they were an important part of the growing body of knowledge on lucid dreaming that would eventually lead to legitimate scientific study.
In 1965, psychologist Charles Tart wrote a paper for the Psychological Bulletin titled “Towards the Experimental Control of Dreaming,” where the idea of signaling from a dream state was first proposed. He writes, “To what extent could a ‘two-way communication system’ be developed, whereby the experimenter could instruct the subject to do such and such while dreaming, and the subject could report on the events of the dream while they are occurring?”1
Then in 1968, Celia Green, a British writer on philosophical skepticism, twentieth-century thought, and psychology, paved the way for a scientific study of lucid dreams with her seminal book Lucid Dreams. In this book she says, “In view of the fact that subjects very frequently report that the lucid dream arose out of a previous non-lucid dream, we may tentatively expect to find lucid dreams occurring, as do other dreams, during the ‘paradoxical’ phase of sleep characterized by fast low-voltage EEG waves, rapid eye movements and muscular relaxation.”2 It was Green’s book that inspired sleep-laboratory researchers Keith Hearne and Stephen LaBerge to carry out the studies that led to the scientific demonstration that people can make conscious decisions, and carry out instructions, while their bodies are fast asleep.
Signals from Another World
While SETI (the search for extraterrestrial intelligence) researchers scan the skies in vain, monitoring electromagnetic radiation for signs of transmissions from civilizations on other worlds, the first person in human history to send signals from the outer limits of the dream state to the earthbound waking world was Alan Worsley, a British shopkeeper. Worsley was recruited by Keith Hearne, who in the mid-1970s was a doctoral student in psychology at the University of Hull, in Yorkshire, England, when he conducted this experiment. Hearne had read Celia Green’s book and was keen to demonstrate the reality of lucid dreaming. So after recruiting Worsley, a proficient lucid dreamer, he designed the ingenious experiment that I described in the introduction of this book.
In his book The Dream Machine, Hearne tells us what it was like on that magic morning when the first signals from Worsley arrived in the sleep laboratory:
Suddenly, out of the jumbled senseless tos and fros of the two eye movement recording channels, a regular set of large zigzags appeared on the chart. Instantly, I was alert and felt the greatest exhilaration on realizing that I was observing the first ever deliberate signals sent from within a dream to the outside. The signals were coming from another world—the world of dreams—and they were as exciting as if they were emanating from some other solar system in space. A channel of communication had been established from the inner universe of the mind in dreaming sleep.3 …
There’s a reason why mobile devices have been likened to a drug in recent years. We’ve all seen the addictive and stupefying effects that these devices have on people, especially children. But now the medical community is saying that they quite literally have a drug-like effect on patients.
A study presented at the World Congress of Anaesthesiologists found that iPads reduce pre-surgery anxiety in children just as well as midazolam, a commonly prescribed sedative. One group of young patients was given the sedative before surgery, while the other was allowed to play games on an iPad for 20 minutes.
“Our study showed that child and parental anxiety before anaesthesia are equally blunted by midazolam or use of the iPad. However, the quality of induction of anaesthesia, as well as parental satisfaction, were judged better in the iPad group. Use of iPads or other tablet devices is a non-pharmacologic tool which can reduce perioperative stress without any sedative effect in paediatric ambulatory surgery.”
Of course, it’s wonderful that they’ve found a way to reduce anxiety without the use of pharmacological drugs, but the majority of children in the developed world are using these devices on a regular basis. What are the effects of using this pseudo drug every day?
Some child psychologists believe that it causes mental developmental problems in children, while others suggest that using these devices all the time can harm the physical growth of children as well. Is it any wonder that Steve Jobs wouldn’t let his kids use the devices that he created?
Delivered by The Daily Sheeple
(The Anthropocene era is defined as the age in which human activity has become a significant factor in shaping the planet.)
The epoch is thought to have begun in the 1950s, when human activity set global systems on a different trajectory
The Anthropocene Epoch has begun, according to a group of experts assembled at the International Geological Congress in Cape Town, South Africa this week.
After seven years of deliberation, members of an international working group voted unanimously on Monday to acknowledge that the Anthropocene—a geologic time interval so-dubbed by chemists Paul Crutzen and Eugene Stoermer in 2000—is real.
The epoch is thought to have begun in the 1950s, when human activity, namely rapid industrialization and nuclear activity, set global systems on a different trajectory. And there’s evidence in the geologic record. Indeed, scientists say that nuclear bomb testing, industrial agriculture, human-caused global warming, and the proliferation of plastic across the globe have so profoundly altered the planet that it is time to declare the 11,700-year Holocene over.
As the working group articulated in a media note on Monday:
Changes to the Earth system that characterize the potential Anthropocene Epoch include marked acceleration to rates of erosion and sedimentation; large-scale chemical perturbations to the cycles of carbon, nitrogen, phosphorus, and other elements; the inception of significant change to global climate and sea level; and biotic changes such as unprecedented levels of species invasions across the Earth. Many of these changes are geologically long-lasting, and some are effectively irreversible.
These and related processes have left an array of signals in recent strata, including plastic, aluminium and concrete particles, artificial radionuclides, changes to carbon and nitrogen isotope patterns, fly ash particles, and a variety of fossilizable biological remains. Many of these signals will leave a permanent record in the Earth’s strata.
“Being able to pinpoint an interval of time is saying something about how we have had an incredible impact on the environment of our planet,” said Colin Waters, principal geologist at the British Geological Survey and secretary for the working group. “The concept of the Anthropocene manages to pull all these ideas of environmental change together.”
Indeed, the Guardian compiled more “evidence of the Anthropocene,” saying humanity has:
- Pushed extinction rates of animals and plants far above the long-term average. The Earth is now on course to see 75 percent of species become extinct in the next few centuries if current trends continue.
- Increased levels of climate-warming CO2 in the atmosphere at the fastest rate for 66m years, with fossil-fuel burning pushing levels from 280 parts per million before the industrial revolution to 400ppm and rising today.
- Put so much plastic in our waterways and oceans that microplastic particles are now virtually ubiquitous, and plastics will likely leave identifiable fossil records for future generations to discover.
- Doubled the nitrogen and phosphorus in our soils in the past century with our fertilizer use. This is likely to be the largest impact on the nitrogen cycle in 2.5bn years.
- Left a permanent layer of airborne particulates in sediment and glacial ice such as black carbon from fossil fuel burning.
Now, scientists must commence their search for the “golden spike”—explained in the Telegraph as “a physical reference point that can be dated and taken as a representative starting point for the Anthropocene epoch.” This could be found in anything from layers of sediment in a peat bog to a coral reef to tree rings…
The air around the world has recently been declared to be as carcinogenic as secondhand smoke.
The following is an excerpt from the new book The Myth of Human Supremacy by Derrick Jensen (Seven Stories Press, 2016):
“The modern conservative [and, I would say, the human supremacist] is engaged in one of man’s oldest exercises in moral philosophy; that is, the search for a superior moral justification for selfishness.” —John Kenneth Galbraith
I’m sitting by a pond, in sunlight that has the slant and color of early fall. Wind blows through the tops of second-growth redwood, cedar, fir, alder, willow. Breezes make their way down to sedges, rushes, grasses, who nod their heads this way and that. Spider silk glistens. A dragonfly floats a few inches above the water, then suddenly climbs to perch atop a rush.
A family of jays talks among themselves.
I smell the unmistakable, slightly sharp scent of redwood duff, and then smell also the equally unmistakable and also slightly sharp, though entirely different, smell of my own animal body.
A small songbird, I don’t know who, hops on two legs just above the waterline. She stops, cocks her head, then pecks at the ground.
Movement catches my eye, and I see a twig of redwood needles fall gently to the ground. It helped the tree. Now it will help the soil.
Someday I am going to die. Someday so are you. Someday both you and I will feed—even more than we do now, through our sloughed skin, through our excretions, through other means—those communities who now feed us. And right now, amidst all this beauty, all this life, all these others—sedge, willow, dragonfly, redwood, spider, soil, water, sky, wind, clouds—it seems not only ungenerous, but ungrateful to begrudge the present and future gift of my own life to these others without whom neither I nor this place would be who we are, without whom neither I nor this place would even be.
Likewise, in this most beautiful place on Earth—and you do know, don’t you, that each wild and living place on Earth is the most beautiful place on Earth—I can never understand how members of the dominant culture could destroy life on this planet. I can never understand how they could destroy even one place.
Last year someone from Nature [sic] online journal interviewed me by phone. I include the sic because the journal has far more to do with promoting human supremacism—the belief that humans are separate from and superior to everyone else on the planet—than it has to do with the real world. Here is one of the interviewer’s “questions”: “Surely nature can only be appreciated by humans. If nature were to cease to exist, nature itself would not notice, as it is not conscious (at least in the case of most animals and plants, with the possible exception of the great apes and cetaceans) and, other than through life’s drive for homeostasis, is indifferent to its own existence. Nature thus only achieves worth through our consciously valuing it.”
At the precise moment he said this to me, I was watching through my window a mother bear lying on her back in the tall grass, her two children playing on her belly, the three of them clearly enjoying each other and the grass and the sunshine. I responded, “How dare you say these others do not appreciate life!” He insisted they don’t.
I asked him if he knew any bears personally. He thought the question absurd…
My success as chief economist at a major international consulting firm was not due to the lessons I learned in business school. It was not due to the competence of my staff of brilliant econometricians and financial wizards.
Those things may have helped at times. But there was something else that made it all happen. That something else was the same something else that elevated George Washington, Henry Ford, Mahatma Gandhi, Mother Teresa, Martin Luther King Jr., Steve Jobs, and other successful people to the heights of their success.
That something else is available to every one of us.
It is the ability to alter objective reality by changing perceived reality, what we might think of as the Perception Bridge.
As described in my book The New Confessions of an Economic Hit Man, my job was to convince heads of state of countries with resources our corporations coveted, like oil, to accept huge loans from the World Bank and its sister organizations. The stipulation was that these loans would be used to hire our engineering and construction companies, such as Bechtel, Halliburton, and Stone and Webster, to build electric power systems, ports, airports, highways and other infrastructure projects that would bring large profits to those companies and also benefit a few wealthy families in the country, the ones that owned the industries and commercial establishments. Everyone else in the country would suffer because funds were diverted from education, healthcare and other social services to pay interest on the debt. In the end, when the country could not pay down the principal, we would go back and, with the help of the International Monetary Fund (IMF), “restructure” the loans. This included demands that the country sell its resources cheap to our corporations with minimal environmental and social regulations and that it privatize its utility companies and other public service businesses and offer them to our companies at cut-rate prices.
It was a strategy of using perceived reality to change objective reality. In these cases, Objective Reality 1 was that the countries had resources. The Perceived Reality was that using those resources as collateral on loans to finance the building of infrastructure projects would create economic growth and prosperity for all the citizens. Objective Reality 2, however, was that economic growth occurred only among the very wealthy. Since economic statistics (GDP) in such countries are skewed in favor of the wealthy, the fact was that only our companies and the wealthy families benefited. The rest of the population suffered. In many cases this has led to political unrest, resentment, and the rise of various forms of radicalism and terrorism.
“Reality is merely an illusion.” –Albert Einstein
We know from quantum physics and chaos theory that consciousness, observation, and changes in perception have impacts on physical reality that can expand exponentially. Modern psychology teaches that perceived reality governs much of human behavior. Religion, culture, legal and economic systems, corporations – in fact, most human activities – are determined by perceived reality. When enough people accept these perceptions or when they are codified into laws, they have immense impact on objective reality.
Human activities – individual, communal, and global – are driven by this process of altering human perceptions of reality in order to change objective realities. A couple of cases from US corporations illustrate this.
Case #1: Ford Motor Company
In 1914 Henry Ford’s Objective Reality was: A) His company sold Model T cars that were produced through the assembly line process by workers who were paid a standard minimum wage; and B) Because the assembly line was monotonous and workers were under a lot of pressure to reduce the amount of time to build a car from 12.5 hours to less than 100 minutes, there was an extremely high turnover rate in Ford’s work force.
So Ford perceived a new reality. He raised wages from the standard $2.34 for a nine-hour day to $5 for an eight-hour day – at a time when every other car manufacturer was trying to reduce wages. In addition to keeping workers on his assembly line, Ford was motivated by a second perception. He understood that the company, its workers and the buying public all came from the same population and he reasoned that “unless an industry can so manage itself as to keep wages high and prices low it destroys itself, for otherwise it limits the number of its customers. One’s own employees ought to be one’s own best customers.” Ford perceived that increasing the buying power of his workers would have a multiplier effect; it would also increase the buying power of many others.
Objective Reality 2: Ford sold 308,000 Model Ts in 1914—more than all other carmakers combined. In 1915, sales soared to 501,000. In 1920, Ford sold a million cars. In the process, Ford’s actions helped stimulate unprecedented growth in the US middle class.
Case #2: Nike, Adidas and other Retailers
Objective Reality 1: These companies design high-end footwear and clothing that is manufactured in factories that the companies do not own in China, Vietnam, and other “sweatshop” countries.
Perceived Reality on the part of management at these companies: A) Outsourcing production releases their companies of worker-rights responsibilities and minimizes wages; B) Hiring highly-paid athletes to promote products counterbalances the negative publicity generated by activists who advocate more pay for sweatshop workers; and C) These policies, that are diametrically opposed to those of Henry Ford, will maximize profits.
Objective Reality 2: A) Low “non-living” wages and poor working conditions in overseas factories result in high worker turnover, illnesses, and adverse publicity; B) By negatively impacting consumer economic growth, such policies destroy opportunities for new markets that would result if workers were paid enough to buy the products they make and at the same time stimulate the multiplier effect; and C) Neither corporate profits nor overall economic growth in the countries where the factories are located are in fact maximized.
I had the opportunity to highlight the difference between the two cases above when a Portland, Oregon (home of Nike) radio station interviewed me. The host inquired, “If you could ask Nike founder Phil Knight one question, what would it be?”…
ABOUT THE AUTHOR
Phillip J. Watt lives on the Mid North Coast of NSW Australia.
It’s a no-brainer that using our body is good for it. But in a technocratic world full of brain-dimmers like TV ‘programming’, most people just aren’t getting their sweat on like they should. That is, of course, unless they’re ironically into the new mind-masher called Pokemon Go.
There are many well-established lies regarding our food, such as what to eat to be healthy, how much to eat, what’s actually in our food, how nutrition-less some of it is and in which varieties of food true nutrition actually exists. Then of course there’s the BS in the toxic mess of so-called modern medicine.
Essentially, we’ve been lied to a lot, particularly because it supports the pharmaceutical-medico complex and other corporate agendas.
Therefore, it’s not too much of a stretch to think that the recommended levels of physical exercise said to reduce future health complications are also wrong; and a new Australian study appears to bear this out.
The current level of physical activity recommended by the World Health Organisation (WHO) is 10 metabolic equivalent (MET) hours each week, which equates to about 1.75 hours of running per week. The new study conducted by researchers at the University of Queensland suggests that we need at least five times as much:
University of Queensland researcher Dr Lennert Veerman said a significant boost to physical activity level recommendations to the equivalent of 15 to 20 hours of brisk walking or six to eight hours of running a week could reduce breast and bowel cancer, diabetes, heart disease and stroke.
“Although the first minutes of activity do most for health, our research results suggest activity needs to be several times higher than current World Health Organisation (WHO) recommendations to achieve larger reductions in risks of these diseases,” Dr Veerman said.
“WHO advises a minimum total physical activity level of 10 metabolic equivalent (MET) hours a week, but the study found health gains accumulated up to levels of 50 to 70 MET-hours a week.”
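The MET-hour figures above can be turned into rough weekly time budgets with simple division. A minimal sketch follows; the MET values used here are assumed approximations from standard activity compendia (they vary with pace and individual), not figures taken from the study itself:

```python
# Convert a weekly MET-hour target into hours of a given activity.
# MET values are ASSUMED approximations (brisk walking ~3.5 METs,
# running ~8 METs); actual values depend on pace and the person.
MET_VALUES = {"brisk_walking": 3.5, "running": 8.0}

def weekly_hours(met_hours_target: float, activity: str) -> float:
    """Hours per week of `activity` needed to reach a MET-hour target."""
    return met_hours_target / MET_VALUES[activity]

# WHO minimum of 10 MET-hours a week, met by running alone:
print(weekly_hours(10, "running"))        # 1.25 hours

# The study's suggested 50-70 MET-hour range, met by brisk walking:
print(weekly_hours(50, "brisk_walking"))  # roughly 14.3 hours
print(weekly_hours(70, "brisk_walking"))  # 20.0 hours
```

Under these assumed MET values, the 50 to 70 MET-hour range works out to roughly 14 to 20 hours of brisk walking or 6 to 9 hours of running a week, broadly consistent with the figures quoted in the article.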
The researchers came to the conclusion by assessing health literature spanning several decades:
They analysed the results of 174 health studies between 1980 and 2016 and found higher levels of physical activity were linked to the reduced risk of chronic conditions.
Given the toxic food, water, air and medicine we are bombarded with by a profit-before-morality model that makes up the crony capitalism system we have today, it’s no wonder we need to go to an extra effort to cleanse them from our bodies, or avoid them completely if we can.
So this study makes sense. The more we are tainted with chemical and artificial concoctions, the harder we need to work to stop them from giving us one of many chronic illnesses. It won’t be long until we’ll have to quit our jobs and exercise full-time.
We also shouldn’t forget that it is encoded deep in our subconscious, our genealogy and our DNA that it is very natural for us to undertake physical activity on a consistent basis. And not only is it natural, it’s necessary for our health and vitality.
So as we become more aware of what’s really going on in the world, including how we’ve been enslaved to the will of our dying corporate masters, remember that getting into nature and exerting our mind-body, as well as aligning with the plethora of natural energies and therapies, are absolute must-haves if we want to prevent ourselves from contracting illness from an infested-by-infection corporatocracy.
by Jon Rappoport
Dali, the painter: Salvador Domingo Felipe Jacinto Dalí i Domènech, Marqués de Dalí de Pubol (11 May 1904 – 23 January 1989). His self-appointed task: shake up reality. Make the impossible intrude on the ordinary. Expose and confound the critics and the press. Whenever the establishment tries to define who he is, become something else.
The critics would have declared Dali a mental patient if he hadn’t had such formidable classical painting skills. He placed his repeating images (the notorious melting watch, the face and body of his wife, the ornate and fierce skeletal structures of unknown creatures) on the canvas as if they had as much right to be there as any familiar object.
This was quite troubling to many people. If an immense jawbone that was also a rib or a forked femur could rival a perfectly rendered lamp or couch or book (on the same canvas), where were all the safe and easy accoutrements and assurances of modern comfortable living? Where was the pleasantly mesmerizing effect of a predictable existence? Where was a protective class structure?
To make it worse, Dali invented vast comedies. But the overall joke turned, as the viewer’s eye moved, into a nightmare, into an entrancing interlude of music, a memory of something that had never happened, a gang of genies coming out of corked bottles.
What was the man doing? Was he making fun of the audience? Was he simply showing off? Was he inventing waking dreams? Was he, God forbid, actually imagining something entirely new that resisted classification?
Dali’s greatest paintings were undeniable symphonies, and mere acknowledgment of his talent would not explain how he composed the movements.
Words failed viewers and critics and colleagues and enemies. But they didn’t fail Dali. He took every occasion to explain his work. However, his explications were handed out in a way that made it plain he was telling tall tales — interesting, hilarious, and preposterous tall tales.
Every interview and press conference he gave spawned more attacks on him. Was he inviting scorn? Was he really above it all? Was he toying with the press like some perverse Olympian?
Media analysts flocked to make him persona non grata, but what was the persona they were exiling? They had no idea then, and they have no idea now.
It comes back to this: when you invent something truly novel, you know that you are going to stir the forces trapped within others that aspire to do the very same thing. You know that others are going to begin by denying that anything truly NEW even exists. That DOES make it a comedy, whether you want to admit it or not.
It is possible that every statement ever uttered in public by Dali was a lie. A fabrication. An invention dedicated to constructing a massive (and contradictory) persona.
Commentators who try to take on Dali’s life usually center on the early death of his young brother as the core explanation for Dali’s “basic confusion” — which resulted in his “bizarre behavior.”
However, these days, with good reason, we might more correctly say that Dali was playing the media game on his own terms, after realizing that no reporter wanted the real Dali (whatever that might mean) — some fiction was being asked for, and the artist was merely being accommodating.
He was creating a self that matched his paintings…
Photograph by Cadaverexquisito / Wikicommons
There is a church in Argentina called Iglesia Maradona. In this church, God is football—soccer—and its prophet is the renowned player Diego Armando Maradona. Founded in 1998, the year after the star’s retirement, Iglesia Maradona now has some 120,000 members worldwide, who bear its insignia D10S—a portmanteau of Dios, the Spanish word for God, and Maradona’s shirt number, 10. Members congregate in sports bars; transubstantiation occurs not to wine and wafer, but to beer and pizza. They even have their own version of the Lord’s Prayer: “Our Diego, who art on the pitches, hallowed be thy left hand,” alluding to Maradona’s controversial “hand of God” goal in the 1986 World Cup.
It all sounds a bit absurd, but at least some of the church’s founders and followers appear to be serious. Co-founder Hernán Amez told The Argentina Independent in 2008, “It’s not just a bit of fun—it’s a religion. Religion is about feelings, and we feel football.” He is right, psychologically speaking. The power of religion, sociologist Émile Durkheim wrote, stems from its ability to unite two of our deepest yearnings—the universality of God and the cultural specificity of a clan—through totems and rituals. The specific beliefs of a religion do not matter so much as its ability to meet these emotional and social needs. In other words: deed then creed. Given this, Iglesia Maradona doesn’t seem so strange. After all, 90 percent of Argentinians declare allegiance to a soccer team. In many ways, the devotion to soccer in Argentina resembled a religion already.
While it may be common to think that ancient sporting rituals were performed in the service of religion, this modern example, and others, suggest it can just as easily go the other way: Religion adapts itself to sport. Take the United States, where football (American football) and Christianity are closely linked. Football counts 63 percent of Americans as fans, more than any other sport in the country—and 33 percent of them believe God intervenes in football games. As Albert Mohler, president of the Southern Baptist Theological Seminary, wrote in 2014, “The relationship between sports and religion in America has always been close, and it has often been awkward.”
Some might argue that this awkward closeness can be seen in the development of megachurches, defined as churches with 2,000 or more in weekend attendance—they’re often modeled after sports stadiums. Crenshaw Christian Center, perhaps the largest such structure in the U.S., can pack 10,400 churchgoers into its 360-degree stadium seating. Critics of megachurches argue that their large size discourages nuanced discussions of social justice issues and the formation of intimate communities. What they do encourage, though, is group-feel. Sociologist Katie Corcoran has likened sitting in a megachurch to standing in the crush of a packed stadium: Both allow the self to melt away.
“What has Jesus done that Maradona hasn’t?”
Recognizing this potential for self-transcendence, megachurches are now seeking to wield sports’ power for their own ends. In 2005, theologian Matthew Brian White examined the 100 largest megachurches to see how they used sports to win over followers, a practice known as “sports evangelism.” In addition to distributing pamphlets and videos at major sporting events, megachurches have created their own sports associations, such as Upward Basketball, which integrates religious practices into pre-game rituals and play…