Japan ponders recycling Fukushima soil for public parks & green areas
Soil from Fukushima prefecture may be used as landfill to create “green areas” in Japan, a government panel has proposed, a plan that risks public backlash over fears of exposure to residual radiation from the decontaminated earth.
The advisory panel of the Environment Ministry on Monday proposed reusing soil that was contaminated during the Fukushima nuclear meltdown of 2011 as part of future landfills designated for public use, Kyodo news reported.
In its proposal, the environmental panel avoided openly using the word “park” and instead said “green space,” apparently to avoid a premature public outcry, Mainichi Shimbun reported.
Following an inquiry from the news outlet, the Ministry of the Environment clarified that “parks are included in the green space.”
In addition to decontaminating and recycling the tainted earth for new parks, the ministry also stressed the need to create a new organization tasked with winning public trust in such recycling schemes.
To calm immediate public concerns, the panel said the decontaminated soil will be used away from residential areas and will be covered with a separate layer of vegetation to meet government guidelines approved last year.
In June last year, the Ministry of the Environment decided to reuse contaminated soil with radioactive cesium concentrations of between 5,000 and 8,000 becquerels per kilogram for public works such as roads and tidal banks nationwide.
Under these guidelines, which may now be extended to cover parks, the tainted soil is to be covered with clean earth, concrete or other materials.
Such a landfill, the government said at the time, will not cause harm to nearby residents, as their exposure will amount to less than 0.01 millisieverts (mSv) a year once construction is completed.
The Fukushima Daiichi nuclear power plant suffered a blackout and subsequent failure of its cooling systems in March 2011, when it was hit by an earthquake and a killer tsunami that knocked out the facility, spewing radiation and forcing 160,000 people to flee their homes. Three of the plant’s six reactors were hit by meltdowns, making the Fukushima nuclear disaster the worst since the Chernobyl catastrophe in 1986.
1. The idea that celibacy breeds maximum athletic performance dates back to 444 B.C., when Plato, of all people, opined, “Olympic competitors before races should avoid sexual intimacy.” A few centuries later, Aretaeus of Cappadocia, a celebrated Greek physician, gave Plato’s thinking a little more color: “If any man is in possession of semen, he is fierce, courageous and physically mighty, like beasts.”
2. The most detailed explanation, though, can be found in Philostratus’ Gymnasticus, the oldest text on sports known to man: “Those who come to the gymnasium straight after sex are exposed by a greater number of indicators when they train, for their strength is diminished and they are short of breath and lack daring in their attacks, and they fade in colour in response to exertion. … And when they strip, their hollow collar-bones give them away, their poorly structured hips, the conspicuous outline of their ribs, and the coldness of their blood. These athletes, even if we dedicated ourselves to them, would have no chance of being crowned in any contest. The part beneath the eyes is weak, the beating of their hearts is weak, their perspiration is weak, their sleep, which controls digestion, is weak, and their eyes glance around in a wandering fashion and indicate an appearance of lustfulness.”
3. Perhaps that’s why Cleitomachus, a star pankratiast (pankration being sort of an ancient form of MMA and a big event at the earliest Greek Olympics), is said to have never slept with his wife, and would avert his gaze when he saw two dogs mating.
4. To ensure that a male athlete’s seed was never spilled — intentionally or otherwise — Galen, another prominent Greek doctor, recommended the following around the 2nd century: “A flattened lead plate is an object to be placed under the muscles of the loins of an athlete in training, chilling them whenever they might have nocturnal emissions of semen.”
5. That said, not everyone thought a little pre-game bacchanal was the mark of a loser. In fact, in 77 A.D., Pliny the Elder, author, philosopher and inspiration for a delicious beer, as well as a naval and army commander of the Roman Empire, argued directly against Plato and everyone else above when he wrote, “Athletes when sluggish are revitalized by lovemaking.”
6. Despite the passage of about 2,000 years, our thinking on the topic has not gotten any clearer. And the lengths to which some athletes have gone to suppress their libidos are no less barbaric than sticking lead plates down their pants. For instance, Antonio Miguel, head of medical services at the Club Universidad Nacional Pumas, one of the top soccer teams in Mexico, has said, “At the end of the 1950s and beginning of the 1960s, people thought that sex diminished the players’ performance. Coaches gave us nitrate salts (potassium nitrate, a substance used to prevent erections) because, according to them, this would inhibit the sexual desire.”
9. All of which seems backward, since a 1968 study, “Muscular Performance Following Coitus,” found that men who hadn’t had sex for six days did no better on a strength test than men who’d had sex the previous night.
10. Same for a 2000 study in the Journal of Sports Medicine and Physical Fitness involving 15 high-level athletes between the ages of 20 and 40 who participated in a two-day experiment. Its conclusion? Sexual activity had no significant overall effect on how the athletes performed during exercise and mental tests.
11. In fact, Emmanuele A. Jannini of the University of L’Aquila in Italy has found that sex stimulates the production of testosterone. “After three months without sex, which is not so uncommon for some athletes, testosterone dramatically drops to levels close to children’s levels,” he told National Geographic.
12. Of course, Joe Namath didn’t need Jannini to tell him that. “I try to [have sex the night before a game],” he explained in his 1969 Playboy Interview. “Before one game last year, I just sat home by myself and watched television, drank a little tequila to relax and went to sleep fairly early. But most of the nights before games, I’ll be with a girl. One of the Jets’ team doctors, in fact, told me that it’s a good idea to have sexual relations before a game, because it gets rid of the kind of nervous tension an athlete doesn’t need.”…
“The whole life of the individual is nothing but the process of giving birth to himself; indeed, we should be fully born, when we die.”
“Every advance of intellect beyond the ordinary measure,” Schopenhauer wrote in examining the relationship between genius and insanity, “disposes to madness.” But could what is true of the individual also be true of society — could it be that the more so-called progress polishes our collective pride and the more intellectually advanced human civilization becomes, the more it risks madness? And, if so, what is the proper corrective to restore our collective sanity?
That’s what the great German humanistic philosopher and psychologist Erich Fromm (March 23, 1900–March 18, 1980) explores in his timely 1956 treatise The Sane Society (public library).
Fifteen years after his inquiry into why totalitarian regimes rise in Escape from Freedom, Fromm examines the promise and foibles of modern democracy, focusing on its central pitfall of alienation and the means to attaining its full potential — the idea that “progress can only occur when changes are made simultaneously in the economic, socio-political and cultural spheres; that any progress restricted to one sphere is destructive to progress in all spheres.”
Nothing is more common than the idea that we, the people living in the Western world of the twentieth century, are eminently sane. Even the fact that a great number of individuals in our midst suffer from more or less severe forms of mental illness produces little doubt with respect to the general standard of our mental health. We are sure that by introducing better methods of mental hygiene we shall improve still further the state of our mental health, and as far as individual mental disturbances are concerned, we look at them as strictly individual incidents, perhaps with some amazement that so many of these incidents should occur in a culture which is supposedly so sane.
Can we be so sure that we are not deceiving ourselves? Many an inmate of an insane asylum is convinced that everybody else is crazy, except himself.
Fromm notes that while modernity has increased the material wealth and comfort of the human race, it has also wrought major wars that killed millions, during which “every participant firmly believed that he was fighting in his self-defense, for his honor, or that he was backed up by God.” In a sentiment of chilling pertinence today, after more than half a century of alleged progress has drowned us in mind-numbing commercial media and left us to helplessly watch military budgets swell at the expense of funding for the arts and humanities, Fromm writes:
We have a literacy above 90 per cent of the population. We have radio, television, movies, a newspaper a day for everybody. But instead of giving us the best of past and present literature and music, these media of communication, supplemented by advertising, fill the minds of men with the cheapest trash, lacking in any sense of reality, with sadistic phantasies which a halfway cultured person would be embarrassed to entertain even once in a while. But while the mind of everybody, young and old, is thus poisoned, we go on blissfully to see to it that no “immorality” occurs on the screen. Any suggestion that the government should finance the production of movies and radio programs which would enlighten and improve the minds of our people would be met again with indignation and accusations in the name of freedom and idealism.
We have reduced the average working hours to about half what they were one hundred years ago. We today have more free time available than our forefathers dared to dream of. But what has happened? We do not know how to use the newly gained free time; we try to kill the time we have saved, and are glad when another day is over… Society as a whole may be lacking in sanity.
Fromm points out that we can only speak of a “sane” society if we acknowledge that a society can be not sane, which in turn requires a departure from previous theories of sociological relativism postulating that “each society is normal inasmuch as it functions, and that pathology can be defined only in terms of the individual’s lack of adjustment to the ways of life in his society.” Instead, Fromm proposes a model of normative humanism — a redemptive notion that relieves some of our self-blame for feeling like we are going crazy, by acknowledging that society itself, when bedeviled by certain pathologies, can be crazy-making for the individual…
It’s not just the check—restaurant meals put us in a ‘consumer mindset’
One of the most common pieces of advice you’ll get from personal finance experts is to eat out less. By cooking at home, you almost always manage to spend less money on food (not to mention eat healthier).
But a new study not only confirms that eating out is bad for your finances, it also suggests that eating out is among the worst things you can do for your personal financial health.
“What we saw consistently throughout the study was that when people reported their dining-out budget for the second time during the experiment, it was significantly higher than what they stated the first time,” Penn State professor Amit Sharma, one of the study’s co-authors, tells Futurity. “What this tells us is that obviously they thought they would spend less in a week, but as the week progressed, they realized they were spending a lot more and they rationalized that increase.”
Specifically, people increased their personal dining out budgets from less than $18 in the first week of the study to $55 in week two, when they realized the first figure was unrealistic.
I’m not sure where these study respondents live that any of them think $18 would last them a week’s worth of dining out. They certainly don’t live in L.A., where $18 gets you a tablespoon of quinoa with a side of two fig leaves, or New York, where $18 is the admission price for the privilege of waiting to maybe buy a cronut.
What’s most interesting, though, is the rationalization part. Rather than curb their dining out in the face of that information, people just readjust their budgets to meet their actual spending habits.
People’s tendency to overspend is partially due to valuing immediate gratification over the long-term benefits of saving. In Sharma’s study, people’s weekly budget goals were no match for their pressing desire to go out and eat some delicious food. “We tend to discount the future more than we should and, therefore, place higher value on current consumption,” says Sharma.
Worse, the study suggests that eating out changes people’s mindset from saving to consumption.
Serious savers know that a commitment to saving is about more than abstaining from the occasional splurge — it’s a mindset that informs every aspect of their lives. They understand that while spending $5 at Starbucks may seem like a minor purchase, it adds up: that daily $5 purchase each morning equates to more than $1,200 over the course of a year, so serious savers opt for the shitty office brew. Conduct that calculus on all the small, seemingly inconsequential purchases in one’s life, and you have significant savings.
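The coffee math is easy to check. Here’s a back-of-the-envelope sketch, assuming a $5 purchase on each of roughly 250 workdays a year (the article doesn’t state its exact day count, only that the total comes to “more than $1,200”):

```python
# Back-of-the-envelope "daily coffee" arithmetic from the paragraph
# above. The 250-workday figure is an assumption (52 weeks of
# weekdays minus a couple of weeks off).
daily_coffee = 5.00
workdays_per_year = 250

yearly_cost = daily_coffee * workdays_per_year
print(f"${yearly_cost:,.0f} per year")  # $1,250 per year
```

Even at a more modest 240 workdays the total still clears the $1,200 the article cites.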
The people MEL profiled in our Into the Black series, for instance, didn’t pay off their debts because they refrained from buying expensive cars. They did so by identifying and cutting out any and all unnecessary purchases, no matter the size, and letting the savings accumulate over time. They bought cheap beer, hosted game nights and potlucks and took up free hobbies such as rock climbing instead of meeting their friends out at fancy cocktail bars.
But going out to eat seems to take a person out of that vigilant savings mindset: What’s a $5 coffee when I already spent $12 on lunch?
The suffering that comes with running a marathon or participating in a race like Tough Mudder, which entails crawling around in mud and pretending you’re a Navy SEAL, obviously helps offset the weak bones and paunchy stomach that typically come from hours spent staring blankly at a computer screen. But it also fights against the existential dread of monotonous office work. “For individuals who feel that modern office work has made their bodies redundant, obstacle racing and other forms of short but intense and painful activities provide a brief but acute reappearance of the body,” study coauthor Julien Cayla, assistant professor at the Nanyang Business School in Singapore, tells Futurity.
The findings coincide with pre-existing research about office workers needing to engage in some kind of physical activity to stay sharp. Tough Mudders are on the extreme end of the spectrum, however — most people would benefit from taking periodic walking breaks throughout the day. Sitting for long periods of time slows down your cognitive functioning, while walking gets blood flowing to your brain, makes your thinking more balanced and gives your senses a break from the usual office stimuli, fostering creativity.
But the findings also suggest a disturbing conclusion about the nature of modern work, and our relationship to it. Simply put: Our work doesn’t meet our fundamental needs as humans. The modern “knowledge economy” revolves around people doing silent, sedentary work for more than half their waking hours, creating work that doesn’t occupy physical space, but instead lives only in the ether of the internet.
It’s a scenario so utterly unsatisfying — both physically and spiritually — that people have to crawl under electrified fences in sub-freezing temperatures just to feel alive. Which is not to say we were better off as an agrarian or industrial civilization. But there’s something undeniably fulfilling about working with your hands and forging something tangible, a satisfaction our current job market fails to provide us.
From ‘pink slime’ to bug burgers, a look at the quintessential American meal
For most of the world, the symbol most associated with America isn’t the bald eagle, George Washington or even the stars and stripes—it’s the hamburger and fries. But how much has this simple meal — a ground-beef sandwich with fried potatoes — changed since its glory days of the 1950s? Let’s find out.
1950s: According to Andrew Smith, author of The Encyclopedia of Junk Food and Fast Food, the quality of hamburger meat was so bad in the early 20th century that by the 1950s, customers needed reassuring that what they were getting was actual meat. “Heading into the ‘50s, White Castle had beef slabs delivered to each outlet a couple times a day,” Smith says. “It was ground up in front of any customers in the store to assure everyone that their beef did come from a cow, as opposed to a variety of meat and other products from other slaughtered animals.”
So during the burger’s heyday, most people could feel confident that they were, in fact, getting 100 percent ground beef, while the fries were exactly as advertised: potatoes, sliced in the restaurant and fried in animal fat.
Today: In 2008, a study by Brigid Prayson of the Cleveland Clinic Foundation tried to find out whether it was even possible for America to produce as much beef as was apparently being consumed — an interesting question, considering that there are fewer cattle being raised now than in the 1970s, and yet we’re eating more beef than we were then. The answers weren’t encouraging, and a test of a variety of fast-food burgers found that the amount of real meat in burgers ranged from just 2 to 14 percent. The rest was made up of what has become known as “pink slime,” or in the words of the study, “a mash of connective tissue, blood vessels, peripheral nerve, plant material, cartilage and bone.”
This nauseating goop was then doused in ammonium hydroxide, an antimicrobial agent once classified by the Department of Agriculture as “generally recognized as safe,” though the practice is banned in the European Union. McDonald’s and other chains have since claimed that they no longer use the stuff, but after a brief public backlash, it has crept back into grocery stores, with a 2014 study claiming that up to 70 percent of the ground beef sold in stores contains the dreaded pink slime.
The meat isn’t the only thing chock-full of chemicals now, either. A quick look at the fry ingredients listed on McDonald’s website reveals not just potatoes but rather a dozen different things, including chemicals with such appetizing names as sodium acid pyrophosphate (that’s the one that maintains their friendly yellow color). Essentially, most of the water in the fries has been replaced with fat, and a bunch of chemicals are added to make them taste like they were fried in animal fat, rather than the mix of corn and soybean oil they’re actually fried in.
1950s: “The combo of french fries and burgers as a meal became solidified during World War II, since meat was rationed and you needed to bolster what small amount of it you had with something else,” says Smith. Exactly how small were the burgers? In 1950, the average burger weighed just 3.9 ounces — not much bigger than a modern-day White Castle slider, at 2.2 ounces, according to the Centers for Disease Control. An average order of fries, for its part, weighed roughly 2.4 ounces.
Today: As rationing came to an end, burgers began to fatten up. “Chains like Burger King came along offering bigger burgers with more meat, and the increased competition led to an arms race of the sizes and the styles of burgers,” Smith says. As a result, the average fast-food burger has roughly tripled in size since the 1950s and now stands at a gut-busting 12 ounces. Fries, meanwhile, have nearly tripled as well, weighing in at 6.7 ounces (again according to the CDC)…
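Taking the CDC-sourced ounce figures quoted above at face value, the growth factors work out like this (a quick sketch using only the numbers stated in the text):

```python
# Growth factors implied by the ounce figures quoted above
# (1950s averages vs. today's, per the CDC numbers in the text).
burger_1950s, burger_today = 3.9, 12.0   # ounces
fries_1950s, fries_today = 2.4, 6.7      # ounces

print(f"burger: {burger_today / burger_1950s:.1f}x")  # burger: 3.1x
print(f"fries:  {fries_today / fries_1950s:.1f}x")    # fries:  2.8x
```

In other words, both items have roughly tripled, with burgers growing slightly faster than fries.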