Tuesday, June 22, 2010

You Don't Know That You Don't Know (and Other Such Puzzles)


This series is making my brain hurt... in a good way.
The Anosognosic’s Dilemma: Something’s Wrong but You’ll Never Know What It Is
(Part 1)
DD: There have been many psychological studies that tell us what we see and what we hear are shaped by our preferences, our wishes, our fears, our desires and so forth. We literally see the world the way we want to see it. But the Dunning-Kruger effect suggests that there is a problem beyond that. Even if you are just the most honest, impartial person that you could be, you would still have a problem — namely, when your knowledge or expertise is imperfect, you really don’t know it. Left to your own devices, you just don’t know it. We’re not very good at knowing what we don’t know.
EM:  Knowing what you don’t know?  Is this supposedly the hallmark of an intelligent person?
DD:  That’s absolutely right.  It’s knowing that there are things you don’t know that you don’t know. Donald Rumsfeld gave this speech about “unknown unknowns.”  It goes something like this: “There are things we know we know about terrorism.  There are things we know we don’t know.  And there are things that are unknown unknowns.  We don’t know that we don’t know.”  He got a lot of grief for that.  And I thought, “That’s the smartest and most modest thing I’ve heard in a year.”
In a brief communication presented to the Neurological Society of Paris, Joseph Babinski (1857-1932), a prominent French-Polish neurologist, former student of Charcot and contemporary of Freud, described two patients with “left severe hemiplegia” – a complete paralysis of the left side of the body – left side of the face, left side of the trunk, left leg, left foot. Plus, an extraordinary detail. These patients didn’t know they were paralyzed. To describe their condition, Babinski coined the term anosognosia – taken from the Greek agnosia, lack of knowledge, and nosos, disease.

The contemplation of anosognosia leads to many questions about how the brain puts together a picture of reality and a conception of “the self.” It also suggests that our conception of reality is malleable; that it is possible to not-know something that should be eminently knowable, and perhaps even to know and not-know something at the same time. Beyond that, it puts the question of how we “know” things at the heart of a neurological diagnosis, and raises questions about how we separate the physical from the mental.

Monday, June 21, 2010

Wonks and the Press


The academy and the press have something of a love/hate relationship (ok, maybe mostly hate). Academics don't want to give soundbites, and journalists want a juicy story. Abstract theories just don't mesh well with a nitty-gritty 24-hour news cycle. But in the realm of politics, the relationship may be warming a bit. This recent article in the Columbia Journalism Review traces the growing influence and acceptance of political science research in political reporting and commentary. Some quotes:
In November 2007, The Monkey Cage—the name comes from an H. L. Mencken line about the nature of democracy—was launched...perhaps The Monkey Cage’s greatest influence has been in fostering a nascent poli-sci blogosphere, and in making the field’s insights accessible to a small but influential set of journalists and other commentators who have the inclination—and the opportunity—to approach politics from a different perspective. That perspective differs from the standard journalistic point of view in emphasizing structural, rather than personality-based, explanations for political outcomes.
These powerful, simple explanations are often married to an almost monastic skepticism of narratives that can’t be substantiated, or that are based in data—like voters’ accounts of their own thinking about politics—that are unreliable. Think about that for a moment, and the challenge to journalists becomes obvious: If much of what’s important about politics is either stable and predictable or unknowable, what’s the value of the sort of news—a hyperactive chronicle of the day’s events, coupled with instant speculation about their meaning—that has become a staple of modern political reporting?
The journalists who have engaged most with political science...have something in common: they’re operating under a new model of what it means to be a political reporter, one that allows them to conceive of “news” in a different way.
That’s not to say that traditional reporting tasks will go by the wayside, nor should they. But even in day-to-day coverage, a poli-sci perspective can have value in helping reporters make choices about which storylines, and which nuggets of information, really matter. For that to happen, political scientists must do more to make their work accessible, reaching beyond the circle of journalists who are inclined to, as Sides says, “embrace the wonk.”

Because of course, not everyone is running to embrace the wonk. Here's a (hilarious) response from a journalist at Slate: What if Political Scientists Covered the News? 

Related Link (and comic above): Not political science, but my favorite comic on press coverage out there!

Added 6/23: Bloggingheads discussion between Jay Rosen & Julian Sanchez on the ideology of the press and related topics (they address political journalism in the first section).

Friday, June 11, 2010

The Cost of Machines That Think = People Who Don't?

As I marvel at the new iPhone 4 – and hope to get one of my very own, very soon – I can't help but notice the flurry of recent articles on the mind-altering impacts of technology. The half-empty: irreversible, fundamental changes to the brain caused by the deluge of incoming information, from e-mail to video games to tweets to newsfeeds. Essentially, this digital multitasking is rewiring us to be shallow. The half-full: those same changes could actually be pretty useful – even making us smarter.

From the NY Times article: Hooked on Gadgets, and Paying a Mental Price
Scientists say juggling e-mail, phone calls and other incoming information can change how people think and behave. They say our ability to focus is being undermined by bursts of information. These play to a primitive impulse to respond to immediate opportunities and threats. The stimulation provokes excitement — a dopamine squirt — that researchers say can be addictive. In its absence, people feel bored...
While many people say multitasking makes them more productive, research shows otherwise. Heavy multitaskers actually have more trouble focusing and shutting out irrelevant information, scientists say, and they experience more stress. And scientists are discovering that even after the multitasking ends, fractured thinking and lack of focus persist. In other words, this is also your brain off computers.
“The technology is rewiring our brains,” said Nora Volkow, director of the National Institute on Drug Abuse and one of the world’s leading brain scientists. She and other researchers compare the lure of digital stimulation less to that of drugs and alcohol than to food and sex, which are essential but counterproductive in excess.

From the WSJ article: Does the Internet Make You Dumber?
The picture emerging from the research is deeply troubling, at least to anyone who values the depth, rather than just the velocity, of human thought. People who read text studded with links, the studies show, comprehend less than those who read traditional linear text. People who watch busy multimedia presentations remember less than those who take in information in a more sedate and focused manner. People who are continually distracted by emails, alerts and other messages understand less than those who are able to concentrate. And people who juggle many tasks are less creative and less productive than those who do one thing at a time.
The common thread in these disabilities is the division of attention. The richness of our thoughts, our memories and even our personalities hinges on our ability to focus the mind and sustain concentration. Only when we pay deep attention to a new piece of information are we able to associate it "meaningfully and systematically with knowledge already well established in memory," writes the Nobel Prize-winning neuroscientist Eric Kandel. Such associations are essential to mastering complex concepts.
When we're constantly distracted and interrupted, as we tend to be online, our brains are unable to forge the strong and expansive neural connections that give depth and distinctiveness to our thinking. We become mere signal-processing units, quickly shepherding disjointed bits of information into and then out of short-term memory.

From the NY Times book review: Our Cluttered Minds
There is little doubt that the Internet is changing our brain. Everything changes our brain. What Carr neglects to mention, however, is that the preponderance of scientific evidence suggests that the Internet and related technologies are actually good for the mind. For instance, a comprehensive 2009 review of studies published on the cognitive effects of video games found that gaming led to significant improvements in performance on various cognitive tasks, from visual perception to sustained attention. This surprising result led the scientists to propose that even simple computer games like Tetris can lead to “marked increases in the speed of information processing.” One particularly influential study, published in Nature in 2003, demonstrated that after just 10 days of playing Medal of Honor, a violent first-person shooter game, subjects showed dramatic increases in visual attention and memory.
Carr’s argument also breaks down when it comes to idle Web surfing. A 2009 study by neuroscientists at the University of California, Los Angeles, found that performing Google searches led to increased activity in the dorsolateral prefrontal cortex, at least when compared with reading a “book-like text.” Interestingly, this brain area underlies the precise talents, like selective attention and deliberate analysis, that Carr says have vanished in the age of the Internet. Google, in other words, isn’t making us stupid — it’s exercising the very mental muscles that make us smarter.

Related Link: The Edge Question 2010: How is the Internet Changing the Way You Think?

Saturday, June 5, 2010

Experiments & the Future of News

Want to save the news? Look to Google. So says this Atlantic article about the future of journalism and the sustainability of professional news-gathering. Some quotes:
So how can news be made sustainable? The conceptual leap in Google’s vision is simply to ignore print. It’s not that everyone at the company assumes “dead tree” newspapers and magazines will disappear...But all of their plans for reinventing a business model for journalism involve attracting money to the Web-based news sites now available on computers, and to the portable information streams that will flow to whatever devices evolve from today’s smart phones, iPods and iPads, Nooks and Kindles, and mobile devices of any other sort. This is a natural approach for Google, which is, except for its Nexus One phone, a strictly online company.

The three pillars of the new online business model, as I heard them invariably described, are distribution, engagement, and monetization. That is: getting news to more people, and more people to news-oriented sites; making the presentation of news more interesting, varied, and involving; and converting these larger and more strongly committed audiences into revenue, through both subscription fees and ads.


“The three most important things any newspaper can do now are experiment, experiment, and experiment,” Hal Varian said.


In fact, such advice is both natural and inconceivable for most of today’s journalists. Natural, in that every book, every article, every investigative project, every broadcast is its own form of pure start-up enterprise, with nothing guaranteed until it’s done (if then). Inconceivable, in that news businesses themselves are relatively static, and the very name “Newspaper Guild” suggests how tradition-bound many journalists are. We pride ourselves on defending standards of language, standards of judgment, and even a form of public service that can seem antique. Whether or not this makes for better journalism, it complicates the embrace of radical new experiments.
The other implicitly connecting theme is that an accumulation of small steps can together make a surprisingly large difference. The forces weighing down the news industry are titanic. In contrast, some of the proposed solutions may seem disappointingly small-bore. But many people at Google repeated a maxim from Clay Shirky, of New York University, in an essay last year about the future of the news: “Nothing will work, but everything might.”