Last month His Holiness the Dalai Lama held the 18th of his celebrated “Mind and Life” conferences – inviting notable neuroscientists to India in the hope that when Buddhist epistemology and western neurology compare notes, the results will be educational for everyone.
It’s the sort of communication that Saybrook faculty say they’d like to see more of: different intellectual approaches coming together to get a bigger sense of the picture.
“Exchanges between spiritual understandings of consciousness and scientific understandings can be mutually enriching,” says Amedeo Giorgi, a Saybrook faculty member in psychology who is a major figure in contemporary phenomenology. “Such exchanges can only be helpful.”
Such exchanges have borne fruit in the past, according to an article in the London Guardian:
(C)onferences have spurred the development of research programmes that examine the effects of Buddhist contemplative techniques and how they might be applied more widely to benefit humanity. They have, for example, been instrumental in the work of Richard Davidson at the University of Wisconsin, whose brain imaging studies found that experienced meditators show increased activity in the left prefrontal cortex of the brain, an area associated with emotional well-being, as well as having stronger immune systems.
But Saybrook psychology faculty member Stanley Krippner, long at the forefront of the exploration of consciousness, says he always has mixed feelings when he hears about the Mind and Life conferences – because he thinks they could be taken to the next level.
When some people say “kids will be kids,” what they really mean is that “children are cruel.”
The assumption behind the saying is that there’s nothing we can do about it. It’s human nature. Kids won’t grow out of cruelty until they mature.
But according to a recent New York Times report, schools around the country – especially middle schools – are betting that children can learn their way out of cruelty through new curriculums.
The idea: “teach” kids empathy. If they can learn to be more empathetic, the thinking goes, children will be less cruel and more supportive to one another. Bullying will be reduced, and kids who might have been pushed over the edge as outcasts will have a better chance for happy childhoods.
It’s a nice plan, but is it realistic? Is empathy something that can be taught?
Even if it can, will it, in fact, likely lead to the desired effect of less bullying?
Good news: Saybrook’s Social Concentration Director Joel Federman has done research indicating that the answer to both questions is “yes” – if you do it right.
Admit it: we all know, at some level, that rational thought can be a smokescreen.
You don’t like strawberries because there’s a rational argument for them … they just taste good. And you don’t abhor murder because there’s a good argument against it, although there is: that good argument is something you use to justify your inherent disgust at the practice.
We know that. From far back in human history people have known that we often use rational justifications as a cover for things we already believe.
But modern neuroscience has now “proven” it – showing that for many decisions the emotional parts of our brain kick in before the rational. Some people are now saying that this changes everything we know about ethics – because ethical behavior is an emotional, rather than a rational, process.
Does that follow?
In a recent New York Times column provocatively entitled “The End of Philosophy,” David Brooks suggests that new evidence that humans make value-laden, emotional decisions will lead to a new “evolutionary” perspective on ethics that doesn’t need all that difficult philosophizing. He writes:
The rise and now dominance of this emotional approach to morality is an epochal change. It challenges all sorts of traditions. It challenges the bookish way philosophy is conceived by most people. It challenges the Talmudic tradition, with its hyper-rational scrutiny of texts. It challenges the new atheists, who see themselves involved in a war of reason against faith and who have an unwarranted faith in the power of pure reason and in the purity of their own reasoning.
Marvin Brown, however, doesn’t believe it.
The three most controversial words in government right now might be: “Pay for performance.”
A number of federal departments have recently announced that they’ll be instituting pay-for-performance plans for the first time, because they say they have a desperate need for motivated, high-quality employees … and that traditional reward structures just aren’t doing it.
At the same time, a number of prominent Democratic lawmakers have asked the Obama administration to suspend all pay-for-performance initiatives until they can be fully evaluated: they suggest that pay-for-performance systems don’t really motivate federal employees.
What’s an administration to do? What’s the best system for motivating civil servants?
Two Saybrook management experts with significant government experience – one ultimately in favor of pay-for-performance, one leaning against it – say that the issue isn’t which reward system you’re using, but whether the system actually can recognize the behaviors you want to reward.
If the system can do that, they agree, it’s probably a good one: if it can’t, it’s likely a bad one.
Somewhere, a committee is trying to draw a line in the sand: check your email so many times a day and you’re healthy; check it so many more and you’re an addict in need of mental health counseling, and possibly drugs.
One year ago the American Journal of Psychiatry published an editorial calling for the upcoming DSM (Diagnostic and Statistical Manual of Mental Disorders) – the “Bible of psychiatry” – to recognize “Internet addiction” as a mental disorder. Late last month, Psychology Today blogger Christopher Lane noted that the effort to include Internet addiction in the DSM is still ongoing … and fairly uncontroversial among the psychiatric community.
For Lane, it should be controversial – and the idea of treating Internet addiction with drugs is ludicrous. Here at Saybrook, the PsyD classes led by Art Bohart are presently examining this very issue: is Internet addiction “real”? If so, what kind of disorder is it? And how can it best be treated?
For Bohart, the very approach taken by psychiatry is the problem – and that has nothing to do with exactly what “Internet addiction” is.
The wars in Iraq and Afghanistan may not be exceptions: they may be the new rule.
According to a recent article in The New Atlantis by former Marine and current Ethics and Public Policy Center senior fellow Keith Pavlischek, the United States’ dominance in conventional warfare has given insurgents the world over the incentive to use different types of tactics. Therefore, “It is likely,” Pavlischek notes, “that the United States will be involved in more irregular conflicts in the years ahead.”
The history of counter-insurgency warfare is pretty brutal, as Pavlischek documents. These kinds of conflicts are much more likely to resemble Afghanistan and Iraq than World War II or the first Gulf War, which for Pavlischek and a host of military scholars and ethicists raises a troubling question: have we learned anything in Afghanistan and Iraq that will help us develop more ethical tactics?
Humanity passed a milestone last month, when a commercial fertility service announced, for the first time ever, that it would allow parents to screen potential offspring for “cosmetic” details such as eye color, hair color, and skin color.
The company (Fertility Institutes) announced that it was dropping the service shortly afterward, as “we remain sensitive to public perception and feel that any benefit the diagnostic studies may offer are far outweighed by the apparent negative societal impacts involved,” according to a company statement.
But even if this was a near miss, the fact remains that genetic research is moving steadily ahead, and its commercial aspects … in this country and in others … are moving quickly too. At some point, some level of “designer humans” appears to be inevitable.
For psychologists, and for everyone, this new era will present some profoundly new versions of old questions: How do we approach issues of identity and moral responsibility when many details of children can be chosen by their parents (or others) as never before? What are the implications for personhood? For the way we think of ourselves, and others?
Saybrook psychology faculty member Eugene Taylor says that however new the technology, the underlying folly of “the commercialization of biology” is an old one: the idea that everything can be rationally managed if we just think hard enough about it.
It was, according to the New York Times, a breakthrough in the study of dreams.
“(S)ocial scientists now have answers” about what dreams “mean,” wrote Times science blogger John Tierney, “and really, it’s about time.”
He was referring to a meta-analysis published by the APA showing that “people engage in motivated interpretation of their dreams and that these interpretations impact their everyday lives.”
In other words, there is a selection bias in the way we interpret dreams: we’re more likely to act on the basis of dreams that reinforce our existing prejudices, and less likely to believe in dreams that tell us things we don’t want to hear.
Voila! Tierney wrote. These “suspiciously convenient correlations” mean that your dreams mean “whatever your bias says.” Problem solved.
Saybrook’s experts in dream studies are not impressed.
“I find it interesting and not a little amusing that one should do studies to show that our cultures and belief systems influence how we interpret dreams,” says Claire Frederick, a faculty member in Saybrook’s Mind-Body Medicine and Consciousness and Spirituality programs. “From a strictly neuroscience point of view, this seems obvious.”
It didn’t get much buzz in America, but across the pond Britons are still talking (so we’re told) about a BBC commentary made last month by Dr. Alan Maryon-Davis, the President of the UK Faculty of Public Health.
In it, Dr. Maryon-Davis says that public health has become a significant enough social issue that the government must intervene at far more significant levels to ensure participation and effectiveness. Sound like a “nanny state”? Yes, says Maryon-Davis, it does: and that’s not a bad thing.
“Is the government 'nannying' us too much” to help prevent killers like heart disease, strokes, and cancer? Maryon-Davis writes. “Is it trying too hard to micro-manage our health? I say firmly - no.”
Here at Saybrook, many faculty have been advocating a changing governmental role in health care for years: Mind-Body Medicine faculty member Marie DiCowden, for example, has overseen public hearings on the way the government – at all levels – can encourage best practice.
But at the same time, DiCowden says, the idea of “nannying” doesn’t seem to get it quite right.
Imagine a highway stretching along the coast from California to Mexico – with alternative, eco-friendly fuels available at every rest stop. Need compressed natural gas? Electricity? Biodiesel? Hydrogen? They’d offer it to every car that passes by.
That’s the dream of three state governors – Gov. Chris Gregoire in Washington, Gov. Ted Kulongoski in Oregon, and Gov. Arnold Schwarzenegger in California – who envision America’s first “Green Highway” across 1,300 miles of coastland.
If the idea can clear federal and state regulations … not to mention opposition from business groups who say alternative refueling stations at rest stops would take business away from nearby private entities … it would be a milestone in both American environmentalism and inter-state cooperation.
Nancy Southern, who directs Saybrook’s Organizational Systems program, says it also might be a good reason to finally buy an alternative fuel vehicle.