Recently Robert Faris, research director at Harvard University's Berkman Center for Internet and Society, made a distressing prediction to the National Endowment for Democracy: international diplomacy is going to get harder than it used to be.
The reason? Not terrorism (though there's that) or fighting over increasingly scarce resources (though that, too), but rather social media like Facebook.
As more people in different countries get on social media, Faris said, more people in different countries talk directly to each other, and this virtual citizen diplomacy makes it very difficult for diplomats to control the conversation.
"The role of diplomacy given social media is going to be more complicated than it used to be," Faris said.
Nor are diplomats the only ones trying to figure out the implications of the new technology. Gail Ervin, a Saybrook PhD student in Human Science who works as an environmental mediator, says that “at this point, most mediators are just learning the basics of social media, and we are far from experiencing the promise of it regarding reducing conflicts.”
“I think we are at the dawn of a grand global experiment regarding these questions,” Ervin added, “and there are only inquiries at this point, no answers.”
However, according to Joel Federman, who directs Saybrook’s concentration in Social Transformation, there is reason for optimism. More people talking to each other directly means more people reacting to actual human beings, instead of crude stereotypes and propaganda. Diplomacy might get harder, but more human relationships across borders means it might get better.
Let’s admit it: we’ve all had some really bad impulses.
Shouting at a lecturer, running into traffic, stealing an inadvisable kiss … who hasn’t had a sudden, mad urge to do the unthinkable?
It’s a basic fact of human life, and once again evolutionary psychology is claiming to have explained it. Turns out, it’s a survival mechanism. Who would have guessed?
In a recent paper published in Science, Harvard researchers say these urges are “ironic processes of control” that help us tame our anti-social impulses.
“These monitoring processes keep us watchful for errors of thought, speech, and action and enable us to avoid the worst thing in most situations,” the authors write.
In a post on the New York Times’ “Mind” blog, author Benedict Carey expanded on the idea that these self-destructive impulses evolved as a way of helping us manage our anti-social tendencies.
Perverse impulses seem to arise when people focus intensely on avoiding specific errors or taboos. The theory is straightforward: to avoid blurting out that a colleague is a raging hypocrite, the brain must first imagine just that; the very presence of that catastrophic insult, in turn, increases the odds that the brain will spit it out.
“We know that what’s accessible in our minds can exert an influence on judgment and behavior simply because it’s there, it’s floating on the surface of consciousness,” said Jamie Arndt, a psychologist at the University of Missouri.
So there you go, question answered, problem solved, right?
Maybe – unless you actually want to understand what’s going on in your mind, with your thoughts, and your impulses. Then this theory has absolutely nothing to tell you.
In fact, says Saybrook faculty member Kirk Schneider, it’s a classic example of what Rollo May, in his book Psychology and the Human Dilemma, called "turning mountains into molehills."
If the scientific establishment didn’t have ADHD, this is the sort of thing it would be paying attention to: a long-term study recently completed by the National Institute of Mental Health (NIMH) showed that there are few-to-no long-term benefits to treating children who have ADHD with Ritalin.
According to the NIMH report:
The eight-year follow-up revealed no differences in symptoms or functioning among the youths assigned to the different treatment groups as children. This result suggests that the type or intensity of a one-year treatment for ADHD in childhood does not predict future functioning.
A majority (61.5 percent) of the children who were medicated at the end of the 14-month trial had stopped taking medication by the eight-year follow-up, suggesting that medication treatment may lose appeal with families over time. The reasons for this decline are under investigation, but they nevertheless signal the need for alternative treatments.
And, perhaps most importantly:
Children who were no longer taking medication at the eight-year follow-up were generally functioning as well as children who were still medicated.
These are the kinds of results that humanistic psychologists have been predicting for some time, and humanistic psychology can be excused an exasperated sigh on reading that the NIMH now thinks the actual symptoms individual children present with might be the most important factor, as noted below:
The researchers also speculate that a child’s initial clinical presentation, including ADHD symptom severity, behavior problems, social skills and family resources, may predict how they will function as teens more so than the type of treatment they received.
For a generation of new college graduates, the future is not what they expected.
It had seemed so easy: get a BA, go get a job at an investment bank or a big company, make lots of money and the rest would take care of itself.
But “the rest” didn’t take care of itself – and as industry after industry has been roiled by social and technological change, there is an increasing drumbeat that “there has to be a better way” to handle work. “The future of work” is big news in the media, even the subject of a Time magazine cover story last week.
Meanwhile, the easy future is no longer an option. According to a terrifying report from ABC News, only 20 percent of new BAs are finding a full-time job right after college. As these students try to piece together a healthy economic life with the tools they have, they are the unwilling vanguard of the new economy.
But what will the future of work look like? What trends will be most important, what skills will be valued, and what will a “day at the office” look like?
Kathia Laszlo, a Saybrook faculty member in Organizational Systems, says that much of the current chaos in the economy comes from the fact that “We have created an artificial separation between work, learning and life.”
On May 26, California made national news when the state’s supreme court upheld Proposition 8 – a ballot initiative that stripped the right to marry away from gay and lesbian couples.
Legal analysts say the court ruled as it did because, while it acknowledged that marriage is a “fundamental right,” the state constitution does not explicitly protect “fundamental rights,” and there is therefore no ground to shield them from a popular vote.
Political analysts, meanwhile, point out that State Supreme Court justices are elected in California, and that the six justices had been threatened with recall efforts had they voted the other way.
But Joel Federman, who directs Saybrook’s Social Transformation Concentration, says there is sufficient precedent already for California to stand behind gay marriage.
“The California court had a precedent they could have followed to declare Proposition 8 unconstitutional: a 1996 US Supreme Court decision, Romer v. Evans, involving a constitutional amendment, Amendment 2, passed by a majority in Colorado and intended to deny state and local government protection of ‘homosexuals, lesbians or bisexuals’ from discrimination,” Federman wrote at topia.net. “As Justice Anthony Kennedy wrote for the majority in that decision, striking down Colorado’s Amendment 2: ‘A state cannot...deem a class of persons a stranger to its laws.’ That eloquent phrasing captured the essential meaning of equal protection under the law, and applied it to same-sex discrimination.”
Eventually, Federman says, same-sex marriage will be an “unquestioned right, as obvious to the fair-minded as interracial marriage.”
“But,” he says, “in the meantime, we protest.”
But what will that protest look like? Civil rights marches emerged for the era of TV – what new forms of protest will emerge for the era of Facebook and Twitter? How effective will they be?
The Ford Foundation recently announced that it is endowing the first permanent arts foundation for the art and culture of American Indian, Native Hawaiian and Alaska Native artists.
“This,” says Saybrook psychology faculty member Stanley Krippner, “can be a bonanza for indigenous arts.”
Krippner is just one of Saybrook’s many community members who have done extensive work with America’s native peoples, and many of them are thrilled at the prospect of native artists finally receiving ongoing support and recognition.
But they also warn that there’s a big difference between “appreciating” native works of art that are preserved behind glass and supporting the living, breathing cultures that create today’s native traditions.
The first is easy, they say. The second is far more complex and challenging.
Seven years ago Alison Shapiro was the picture of a healthy 55-year-old. A happy life, a successful career; no health issues, no weight issues; her blood pressure was normal. She was in the middle of living out a lifelong dream, illustrating her first children’s book. She had three of 17 pictures finished.
Then she had a stroke. Twenty-four hours later, she had another.
You may think you’ve got problems, but probably not like Alison had.
The two strokes struck her brain stem – the most lethal place for a stroke to hit. Fifty percent of brain stem stroke victims die; others suffer from “locked in” syndrome, where they are fully conscious – and fully paralyzed. By the time the strokes were over, Alison’s left side was mostly paralyzed, and her right side was wildly uncoordinated. She couldn’t swallow, she couldn't sit up, her speech was heavily slurred, her eyes wouldn’t focus, she couldn’t walk.
It was the kind of event no one is ever prepared for.
“There I was, in that hospital,” she says. “I was completely stunned. It’s a very sudden event, it’s like a train wreck: one minute your life is fine, the next minute you can’t move. When it happened, I had absolutely no idea what I was going to do. I had no idea how to face it.”
But don’t feel bad for Alison: she figured out how to face it. And she wants you to know that when you have to face it … or any other adversity … that you can, too.
Today Alison is healthy, active, and engaged with life again: a fully functioning person who has published her children’s book. In fact, she says she feels more empowered than she ever has before.
And this month Alison, the Chair of Saybrook’s Board of Trustees, is seeing the release of a book about her recovery experience.
Will file sharing, easy downloads, and a universe of experts all posting on Wikipedia make universities irrelevant within 15 years?
Yes, says David Wiley. Information will be free, and that means universities will have to radically restructure to accommodate that … or else face irrelevance.
Wiley, a leader in the “open content” movement and a professor of psychology and instructional technology at Brigham Young University, made that prediction recently, pointing to student bodies more inclined to download than to watch TV … and to universities putting more and more class lectures online.
Between Facebook, Google, file sharing, YouTube, and universities putting lectures online, Wiley says, all universities have to offer paying students is a credential – and at some point that will be provided by other means, too.
Or will it? Eric Fox, Saybrook’s Dean of Instruction, says that he had a great time reading the article about Wiley’s prediction – but doesn’t think the future will pan out just that way.
That, Fox says, is because having “access” to information isn’t the same as “learning.”
Do you ever worry that maybe you spend too much time updating your Facebook status at work?
Don’t. An Australian study suggests that, in fact, your office should be encouraging it.
According to the research out of the University of Melbourne, people who use the Internet for personal reasons at work are nine percent more productive.
According to Wired magazine, “’workplace Internet leisure browsing,’ or WILB, helped to sharpen workers' concentration,” so long as it took up less than 20 percent of their time at the office.
Wow – who knew YouTube could be a productivity tool?
“This made me smile,” says Nina Serpiello, a PhD student in Saybrook’s Organizational Systems program and a human factors research designer at IDEO. “A traditional company might not encourage goofing off without having a business reason for it, like cultivating creativity for innovation. If a company is interested in empowering employees to offer ideas to outsmart the competition, then it also should promote activities that stimulate creative thinking.”
Students posted about it on their Facebook pages; faculty sent links back and forth; at Saybrook’s San Francisco offices, administrators asked one another about it. Everyone in the community, it seems, has an opinion about last week’s New York Times op-ed by Mark Taylor, “End the University as We Know It.”
In it, Taylor suggests scrapping traditional fields of study in favor of real-world problem-solving clusters; abolishing tenure and replacing it with seven-year, renewable teaching contracts; replacing academic papers, and even dissertations, with scholarly multimedia presentations; and training academics for careers outside of teaching.
It’s not the first time the death of the modern university has come up, but this time it’s engaged the Saybrook community like no other.
Here are some student and faculty reactions. Please continue the conversation by leaving your own responses in the comments section.
Psychology faculty member Eugene Taylor found the document “Orwellian” – a product of the very type of thinking it wishes would end.
“(Mark Taylor) might be more optimistic if he were more person-centered. The very thing all his emphatic points miss is the spiritual side of learning,” Eugene Taylor wrote.