Archive for April, 2015



Why are some people sharp as a tack at 95 years old, while others begin struggling with mental clarity in their 50s?

A lot of it has to do with genetics, but certain lifestyle factors also play an important role in how our brain ages. So while you can’t control your genes, you can take advantage of the latest science and avoid these seven big brain mistakes:

Mistake No. 1: Eating a standard American diet

A diet high in sugar, unhealthy fats and processed foods — i.e., the typical American diet — can wreak havoc on your brain over time. Studies have shown that excess sugar consumption can impair learning and memory, and increase your vulnerability to neurodegenerative diseases like Alzheimer’s. Some scientists have even referred to Alzheimer’s as “Type 3 Diabetes,” suggesting that diet may have some role in an individual’s risk for developing the disease.

A Mediterranean-based diet, on the other hand, can help protect the brain from signs of aging and ward off cognitive decline. A recent study showed that following this type of diet — which is a good source of brain-healthy nutrients and includes a lot of fish, healthy fats, whole grains and vegetables — could slash Alzheimer’s risk by up to 50 percent.

Mistake No. 2: Living next to a highway

Living in a smoggy city might be bad news for your brain. According to research published this month in the journal Stroke, exposure to air pollution is linked with premature aging of the brain.

The researchers found that people who lived closer to a major highway had greater markers of pollution in their lungs and blood, which increased their risk for a form of brain damage known as “silent strokes,” or symptomless strokes. Increased exposure to pollution was also linked to decreased brain volume — a major sign of aging.

Mistake No. 3: Drinking a few evening cocktails

Don Draper’s daily cigarettes and two-martini lunches might seem glamorous on “Mad Men,” but research suggests that they’re a fast track to neurodegeneration.

It should come as no surprise that excessive drinking and cigarette smoking at any stage of life can have a negative effect on the brain, damaging brain tissue and leading to cognitive impairment. Alcoholism can cause or accelerate aging of the brain.

But just a couple of glasses of wine a night could pose a risk to brain health, even though there are some cardiovascular benefits. A 2012 Rutgers University study found that moderate to binge drinking — drinking relatively lightly during the week and then heavily on the weekends — can decrease adult brain cell production by 40 percent.

“In the short term there may not be any noticeable motor skills or overall functioning problems, but in the long term this type of behavior could have an adverse effect on learning and memory,” one of the study’s authors, Rutgers neuroscience graduate student Megan Anderson, said in a statement.

Mistake No. 4: Giving in to stress

Living a stressful lifestyle may be the worst thing you can do for your health as you age. Chronic stress is known to shorten the length of telomeres, the sequences at the end of DNA strands that help determine how fast (or slow) the cells in our body age. By shortening telomeres, stress can accelerate the onset of age-related health problems…






by Hanson O’Haver

From the column ‘The VICE Guide to Mental Health’

There’s a reason the image of the floundering, scared, shaky post-teen struggling to enter adulthood is a cliché. Between moving out of your parents’ home, going to college and getting a job, lack of sleep, drugs, and unrestricted access to alcohol, becoming an adult is fucking hard. So it’s no wonder that this period is popularly associated with having a mental breakdown. But is there any truth behind the pop culture trope? What about kids from wealthy families who don’t have the stresses the rest of us do in early adulthood, or people whose most trying times come in their 30s or 40s? Is the appearance of mental illness in young people a matter of environment or biology?

To better understand these questions, I phoned Johanna Jarcho, Ph.D., a postdoctoral fellow at the National Institute of Mental Health whose work studies differences in brain development in healthy people versus those who have mental health problems, with a focus on anxiety. She explained how our brains interact with social conditions to influence our mental health, and why the best way to deal with a problem is to get it diagnosed early.

VICE: I’ve often heard it repeated that mental illnesses frequently begin in a person’s late adolescence or early 20s. Anecdotally that seems consistent with what I’ve seen, but is there any scientific basis to this claim?

Dr. Johanna Jarcho: Yeah, the vast majority of mental health disorders do emerge during one’s adolescence or early 20s. If you’re going to have an anxiety disorder as an adult, there’s a 90% chance that you’ll have had it as an adolescent. Basically, you’re not going to develop an anxiety disorder as an adult. You’re going to develop it as a kid and then it’ll carry through to adulthood. Emerging research suggests that this is because adolescence is a time when the brain is changing to a great degree. We once thought that the brain didn’t change that much after early childhood, but what we’ve seen is that the brain continues to undergo really profound changes up until your early 20s. It’s still quite malleable, so being exposed to different influences in your social environment can really have a profound impact on the way that your brain continues to develop.

You said that much has to do with brain development. At the same time, young adulthood seems to be a time where people are going through major upheavals, both socially and economically—things like college, entering the workforce, or living away from your parents. Is there a way to quantify the effect of environment versus biology?

Some types of mental health disorders are much more genetically based than others. Schizophrenia and bipolar disorder have a much higher rate of inheritance. If you have a first-degree relative like a parent or sibling who has one of those disorders, you’re at a much greater risk for developing it yourself, and there are things in the environment that can potentiate that. Other disorders like depression or anxiety are less heritable. Whether or not you develop one of those disorders is a lot more contingent on your environment. Young adults go through all these different social changes, but we evolved to be able to make this big transition from being with parents to forging adulthood. What happens during this transition can definitely have a profound effect on whether you grow to be “healthy” or to have these types of disorders…




Creflo Dollar, his jet and his demons

Photo Illustration by Emil Lendof/The Daily Beast

by Olivia Nuzzi

The pastor who asked followers to buy him a new $65 million aircraft blames the devil for losing the old one—and says he may need a spaceship to evangelize on Mars.

Satan stole Pastor Creflo Dollar’s private jet, according to a paid spokesman for Satan’s opposition, Pastor Creflo Dollar.

During a recent sermon, Dollar addressed his critics in the media who challenged him when he tried to convince the members of his congregation that they should buy him a $65 million private jet.

In particular, Dollar said those who tried to damn him to a life of TSA pat-downs and long lines were doing the work of Satan, and that one day he would ask God for a billion dollars to buy a spaceship so he can go to Mars.

Wide-eyed and rabid, Dollar told his congregation: “I can dream as long as I want to! I can believe God as long as I want to! If I want to believe God for a $65 million plane, you cannot stop me! You cannot stop me from dreaming!”

In March, The Daily Beast brought you word that Dollar’s private jet was beginning to age somewhat, and that God told him he deserved a new one, at a cost of $65 million to the members of his church, World Changers Church International. 

The crowdsourcing mission was called “Project G650” and Dollar said it was all about “Understanding Grace” and “Empowering Change.”

Dollar said he needed this new jet so that he could continue to travel the world and bring people the word of the gospel, but he did not explain why such a feat could not be accomplished while flying coach. 

Soon after Dollar went public with his plea via blog post, it was inexplicably removed from his website—most likely by Satan, according to Dollar.

“Now you see why the devil tried so aggressively to discredit my voice,” Dollar told the members of his congregation. “The enemy has got to discredit the voices of faith and grace and truth because he don’t want you to know that you can walk on the water if you can look at Jesus.”

The Daily Beast has attempted, for weeks, to speak to Dollar, but he has not returned calls to his assistant. An effort to lure Dollar to The Daily Beast’s headquarters from his church in the Bronx with a trail of money was thwarted by complications with wind…




Dado Ruvic / Reuters

The state of the media in 2015 begins and ends with the tech giant.


Facebook, it seems, is unstoppable. The social publishing site, just 11 years old, is now the dominant force in American media. It drives a quarter of all web traffic. In turn, Facebook sucks up a huge portion of ad revenue—the money that keeps news organizations running—and holds an enormous captive audience.

We already know, from a Pew poll last year, that nearly half of the adults who use the Internet report getting their news from Facebook alone. Now consider some of the latest numbers from Pew, in its annual State of the Media report, which came out on Wednesday:

• As in previous years, just five companies generate the majority (61 percent) of digital ad revenue: Facebook, Google, Microsoft, Yahoo, and AOL.

• Facebook more than doubled digital ad revenue over the course of two years. It made $5 billion in ad money last year. That represents 10 percent of all digital ad revenue.

• Facebook is getting a quarter of all display ad revenue and more than a third (37 percent) of display ads on mobile.

This last point—Facebook’s mobile ad revenue dominance—is worth lingering on for a moment. Facebook has succeeded in thriving financially on mobile while leaving desktop behind. That’s exactly what consumers are doing, but it was unclear for years that money would follow on mobile. (It’s still unclear, for news sites especially, whether mobile revenue will be enough.) Facebook’s share of revenue on desktop dropped 20 percentage points last year, while its share of mobile revenue went up 20 percentage points.

What all this reveals, and what’s evident from the rest of the Pew report, is that Facebook’s already established dominance is only growing.

The Pew report is also a reminder that Facebook — and, to a smaller extent, social publishing sites like Twitter — has changed Americans’ news diets. For most news consumers, long gone are the days of sitting down to read an entire newspaper. Instead, people are increasingly beginning with Facebook and dipping into a variety of news sites on an article-to-article basis.

The average visit to The New York Times’ website and associated apps in January 2015 lasted only 4.6 minutes – and this was the highest of the top 25. Thus, most online newspaper visitors are “flybys,” arriving perhaps through a link on a social networking site or sent in an email, and so may not think of this experience as “reading a newspaper” but simply browsing an article online.

For context: The average American smartphone owner spends more than 42 minutes a day on Facebook, according to CEO Mark Zuckerberg in an earnings call last year. Facebook accounts for one out of every five minutes spent on a smartphone, he said.

To say Facebook is huge is an understatement. Even to call it “the Coca-Cola of social media,” as Austin Carr did in this excellent Fast Company piece, now seems muted. “The great social network of the early 21st century is laying the groundwork,” he wrote, “for a platform that could make Facebook a part of just about every social interaction that takes place around the world.”

Facebook wants to be the portal by which people go online. And, increasingly, it is…




by Jay Dyer

Transcendence is the ultimate transhumanist film to date. Convalescence would have been a better name.

The film begins with some cataclysmic future event that has caused the collapse of civilization to a pre-electrical state. Rewind a few years, and we see Johnny Depp as Dr. Will Caster, a scientist who specializes in artificial intelligence on the verge of discovering the final key to mapping the human mind so it can be downloaded onto a floppy disc. Dr. Caster calls this point of singularity “transcendence,” where man will finally overcome his bodily limitations and upload himself to YouTube.

Following a presentation at a tech conference on how man will create “god,” Dr. Caster is mortally wounded by a radical anti-tech Luddite group, RIFT (Revolutionary Independence From Technology), leading to his desire to be the first willing test subject of human AI. Simultaneously, attacks on AI labs occur nationwide, prompting FBI involvement.

Dr. Caster, as you might guess, is successfully downloaded into a computer, leading to an exponential growth in his intelligence and power enabling him to become a ghost in the machine accessing infinite data.

After becoming Max Headroom, Caster hacks into the supercomputer PINN (Physically Independent Neural Network) he formerly worked on to track and identify all the members of RIFT globally.

The true reveal here is not that Johnny Depp can be placed on a thumb drive like Scarlett in Lucy, but that what is real are the NSA supercomputers that can literally track anyone, anywhere through AI, just like PINN can in the film.

As I have written many times, the NSA supercomputers exist to do this very thing, ultimately achieving real-time, 3D modelling of all events. NSA programmer William Binney recently revealed to the mainstream what had long been known to alternative media: the purpose of this entire surveillance grid has nothing to do with terrorism, and everything to do with AI and panoptic population control.

The Guardian recently reported on Binney:

“At least 80% of fibre-optic cables globally go via the US,” Binney said. “This is no accident and allows the US to view all communication coming in. At least 80% of all audio calls, not just metadata, are recorded and stored in the US. The NSA lies about what it stores.” The NSA will soon be able to collect 966 exabytes a year, the total of internet traffic annually. Former Google head Eric Schmidt once argued that the entire amount of knowledge from the beginning of humankind until 2003 amounted to only five exabytes. “The ultimate goal of the NSA is total population control,” Binney said. “The NSA is mass-collecting on everyone, and it’s said to be about terrorism but inside the US it has stopped zero attacks.”

As usual, the ridiculous pseudo-philosophical question this kind of film raises comes to the fore: is Max Headroom “self-aware”? Here, being “alive” has been replaced with the ambiguously loaded psychological concept of being “self-aware,” instead of the classic idea of the soul. Individual man is no longer viewed as an embodiment of human nature with psyche, nous or mind, but as a single entity, a body, with mind or psyche collapsed into body (“mind is brain”). With these presuppositions, it is only rational to conclude that consciousness is purely a series of algorithms: since fMRI scans can image the energy waves emanating from the brain, the brain is nothing more than these recorded sensory impressions, like an image on film.

Regular readers will immediately see the fallacy here, where once again the entire premise of all transhumanism and enlightenment mythology is the presupposition that “man” is a body with a blank slate video recorder (brain). If that’s all man is, then these electrical impulses can be mapped and put into an algorithm, and thus transferred to a hard drive…




by Jon Rappoport, Guest, Waking Times

This operation has various stages. It can be applied to issues like vaccination, GMO food, climate change, fake epidemics, elections, war—and it can be applied widely across the general subject of reality itself.

It begins with some “authoritative” voice proclaiming an idea is a fact. Well, someone has to start the ball rolling.

Very quickly, according to plan, others pick up the ball and echo the original idea. This includes both “respected” individuals and groups. Media join in.

What follows is a scramble to gain public acceptance. All sorts of approaches are used in this stage of the process:

the claim that a consensus has already been reached (which is a lie);

attacks on dissidents and critics;

warnings that refusal to accept the central idea will have dire consequences “for all of us”;

cooked and slanted scientific studies;

general reminders that the idea being promoted represents the greatest good for the greatest number;

expressions of shock that there are “still people who refuse to accept the obvious”;

new laws and regulations hammered into existence;

subsequent legalized coercion;

arguments so illogical and bizarre that people accept them, thinking they themselves must be missing the point;

relegation of dissent to the margins of “wild conspiracy theory”;

the conjoining of dissent with negative images and feelings;

the marginalizing of dissent into legal cases that wind through seemingly endless corridors of a maze;

and, of course, endless repetition of the original idea.

After a suitable period has passed, most people can’t even recall a time when the original idea was seriously disputed.

And yet…none of these strategies would succeed, unless people felt a strong need for a centralized authority that defines reality.

To put it another way, people have a quite minimal tolerance for chaos, in which there are many conflicting views on a subject of key importance.

Years ago, I did interesting experiments with small groups regarding chaos.

In one part of the experiment, I had the people in the group spontaneously sing a note, all at once, any note. Ten people, ten different notes would emerge.

“Chaos.” Dissonance.

I asked them to listen to the overall musical effect and then voice a note they believed would add greater dissonance to the group sound.

Most of the time, if I let this go on long enough, the group would tend to retreat back into “harmony,” and end up all singing the same note.

It didn’t matter which note. Everyone wound up voicing the same pitch.

This was comfortable. This was acceptable. This was “unity.”

So it is with ideas. Most people prefer that “harmony and unity.” They seek it.

Somewhere in their minds, there is a learned program that asks for consensus, a program that prefers consent to difference.

Dissonance (disagreement) registers as a negative reality.

When I had done these group experiments on and off for a few years, I began my research on what came to be the collections, “The Matrix Revealed” and “Exit From The Matrix.”

I stressed the power of individual unfettered imagination, and offered many exercises to expand the scope and range of imagination.

The program in favor of uniform consent fades and disintegrates in the face of imagination…





Experimental Photography by Ronny Engelmann



I’ve been having this recurring sensation: a feeling that the reality I experience as my waking life is really a dream that I am just about to wake from. I begin to glimpse each person within this waking reality – directly present and historically referenced – as a dream character, a creation of my mind. Like a wonderfully spun yarn, each character in this unfolding dreamscape has become present at the exact moment necessary to further me along a process of awakening to this truth.

Simultaneous to this sense of pending lucidity, there exists a grand paradox of perception: an equally profound and significant counter-observation. Though “I” am the only being that exists and all of waking reality is the illusion of my dream, constructed around me in an unfolding process intended to slowly awaken me to my true existence, I can feel that every other person that exists – though to my perception only a fragment of my greater mind – is actually in the same state of awakening as “I” am.

This means that each individual and sentient human being in my completely fabricated reality is also the only being – the only “I” – that exists in their completely fabricated reality, slowly awakening from a dream their greater mind has constructed, wherein I am just a character and everything I know about myself, the historical progress of my life, my role in relation to others and even the sensation that I am the dreamer, is a construction of their dream – not mine[1].

This exquisite paradox of dual yet simultaneous conceptualizations of reality is reminiscent of a state of being I once may have turned away from in fear, but now openly welcome as a part of my dynamic conscious experience of life unfolding.



Bryon Lippincott (CC BY-ND 2.0)


Oliver Burkeman writes at the Guardian:

If you’ve been following the news recently, you know that human beings are terrible and everything is appalling. Yet the sheer range of ways we find to sabotage our efforts to make the world a better place continues to astonish. Did you know, for example, that last week’s commemorations of the liberation of Auschwitz may have marginally increased the prevalence of antisemitism in the modern world, despite being partly intended as a warning against its consequences? Or that reading about the eye-popping state of economic inequality could make you less likely to support politicians who want to do something about it?

These are among numerous unsettling implications of the “just-world hypothesis”, a psychological bias explored in a new essay by Nicholas Hune-Brown at Hazlitt. The world, obviously, is a manifestly unjust place: people are always meeting fates they didn’t deserve, or not receiving rewards they did deserve for hard work or virtuous behaviour. Yet several decades of research have established that our need to believe otherwise runs deep. Faced with evidence of injustice, we’ll certainly try to alleviate it if we can – but, if we feel powerless to make things right, we’ll do the next best thing, psychologically speaking: we’ll convince ourselves that the world isn’t so unjust after all.

Hence the finding, in a 2009 study, that Holocaust memorials can increase antisemitism. Confronted with an atrocity they otherwise can’t explain, people become slightly more likely, on average, to believe that the victims must have brought it on themselves.

The classic experiment demonstrating the just-world effect took place in 1966, when Melvyn Lerner and Carolyn Simmons showed people what they claimed were live images of a woman receiving agonizing electric shocks for her poor performance in a memory test. Given the option to alleviate her suffering by ending the shocks, almost everybody did so: humans may be terrible, but most of us don’t go around being consciously and deliberately awful. When denied any option to halt her punishment, however – when forced to just sit and watch her apparently suffer – the participants adjusted their opinions of the woman downwards, as if to convince themselves her agony wasn’t so indefensible because she wasn’t really such an innocent victim. “The sight of an innocent person suffering without possibility of reward or compensation”, Lerner and Simmons concluded, “motivated people to devalue the attractiveness of the victim in order to bring about a more appropriate fit between her fate and her character.” It’s easy to see how a similar psychological process might lead, say, to the belief that victims of sexual assault were “asking for it”: if you can convince yourself of that, you can avoid acknowledging the horror of the situation.




This head of Odysseus was discovered in 1957 on the west coast of Italy between Rome and Naples, on the grounds of the former villa of the Roman Emperor Tiberius at Sperlonga. The original sculpture likely dates to the 1st century BC. Source: Wikimedia Commons


Brian C. Muraresku via

“The man of a traditional culture sees himself as real only to the extent that he ceases to be himself. Plato could be regarded as the outstanding philosopher of ‘primitive mentality’ – the thinker who succeeded in giving philosophic currency and validity to the modes of life and behavior of archaic humanity.”1

Mircea Eliade

The Real Hippies

What’s become of religion these days? Seriously. More than a billion people across the planet are religiously unaffiliated. That includes one in every five Americans and Europeans, and – believe it or not – almost half of the British public. Impressive as those numbers are today, just imagine the future of the Western world. Fueling the growth of this segment, after all, is a younger generation that is either uninterested in or entirely fed up with the organized religions of their parents and grandparents. Despite being four months older than the statistically oldest Millennial (born in the distant past of 1980), I can still report the cohort’s emerging preference: 32% of Americans aged 18 to 34 choose not to identify with a particular faith. This is far more than any previous generation, including those groovy Boomers, whose comparative figure in the 1970s was an underwhelming 13%. The graph below gives due credit to the real hippies.

At the end of his brilliant career, mythologist Joseph Campbell concluded that what we’re all seeking is not the meaning of life, but an “experience of being alive”. Across the 200,000-year history of our species, the triggering of “powerful, pervasive, and long-lasting” states of mind has been the essential function of bona fide religion.2 Recently, our fields, stages and screens – the altars of the 21st century – have assumed that sacred responsibility, making organized religion obsolete in a world where the full range of human emotion is available at the tap of a thumb. A half century ago, this is really all John Lennon was trying to say. When people are willing to wait in line for days (yes, days!) to get the latest iPhone or audition for American Idol, what the hell isn’t “more popular than Jesus”? But fear not. There is nothing particularly new or disturbing about that 72% of my “spiritual but not religious” generation intent on reclaiming ownership of its heart and mind. Before the rise of Churchianity, in the long-forgotten cradle of Western Civilization, our ancestors were also drawn to a spiritually independent lifestyle – free from any doctrine or dogma. What united them, however, was not just an “experience of being alive”, but something they thought was the single most important event a human being could ever experience. Its participants, without fail, left permanently transformed. Before religion goes the way of the fax machine, we owe this phenomenon some serious consideration – for the sake of our ancestors, and ourselves.

The Place Where Science Was Born

At the tender age of 14, I began eight years of intensive training in Latin and Greek. Since the two languages account for some 60% of the English vocabulary, I was told, the mandatory Jesuit curriculum would bump my SAT scores. An appeal was also made to the Founding Fathers, who were themselves fairly obsessed with the Classics. James Madison’s opinion that Athens and Rome had “done more honor to our species (humanity) than all the rest” was by no means unique in the late 18th century. The principal drafter of the Declaration of Independence, Thomas Jefferson, looked to classical literature as “the ultimate source of both delight and instruction”3 – admiration that hearkens back 700 years to the Renaissance, when the key feature of the new humanism that came to shape our Western world “was an educational and cultural program based on the study of classical Greek and Latin authors”.4 It is no coincidence the National Mall near my home in Washington DC is flanked on either end by rather explicit tributes to the Athenian Parthenon (Lincoln Memorial) and the Roman Pantheon (Capitol Building)...




By Jeff Berwick

Jeff interviews Illuminati whistle-blower and consciousness liberator Mark Passio. Topics include: satanism in the halls of power, the real tenets of organised satanism, a self serving philosophy, moral relativism, social Darwinism, eugenics, preparation for slavery, propagation of exoteric satanism to an unaware public, an ancient struggle of good vs evil, mind control, the occult, they steer the public mind, psychic self defense, a profound dark influence, Larken Rose’s The Most Dangerous Superstition, freemasonry, a Fabian-socialist plan of action, child abuse part of elite culture…you can help by getting the word out…





