Friday, January 25, 2013

ANS -- How do you know what you know?

Here's an interesting article from Doug Muder about journalism and information overload. He draws a parallel between the internet information overload and the Gutenberg revolution, and argues that both make people polarized. Read this.
Find it here:  http://weeklysift.com/2013/01/21/how-do-you-know-what-you-know/   
--Kim


January 21, 2013 – 11:52 am

How do you know what you know?


why the internet isn't making us wiser


If you'd never experienced the flood of information that comes from a revolutionary new technology, you might expect it to power growth in everything downstream from information: knowledge, understanding, and even wisdom. If it's easier to find things out, then people should know more, understand more, and make better choices. You might even expect more consensus. Ignorant people can come to blows debating whether Kansas is north or south of Nebraska, but the more we know and understand about the world we all live in, the more agreement we should find.

Since you're living through the internet revolution right now, though, you know better. More knowledge? Maybe. Understanding? Hard to say. But wisdom? Surely you jest. And consensus … some days we seem lucky just to avoid civil war.

Nate Silver thinks we could have seen this coming, because the same thing happened in the last information revolution. Eventually Gutenberg's printing press led to the Enlightenment, democracy, modern science, and the Industrial Revolution. But that light came at the end of a nasty 300-year tunnel of constant strife and near-genocidal religious wars. In the Thirty Years War alone "the male population of the German states was reduced by almost half."

But why? Nate explains:

The informational shortcut that we take when we have "too much information" is to engage with it selectively, picking out the parts we like and ignoring the remainder, making allies with those who have made the same choices and enemies of the rest.

Reducing that to a bumper sticker: TMI equals polarization.

Picture it: Before Gutenberg, baptism was baptism. The priest did it, and if we wondered what it meant or why he did it that way, maybe we could ask him and maybe he'd explain by waving in the direction of a Bible that some monk had spent years producing by hand. (You could get your own, in Latin, a language that neither you nor Moses ever spoke, for about the cost of a Mercedes today.)

After Gutenberg, you say babies can be baptized by sprinkling water on them, while I accept only full-submersion adult baptism. We each own pamphlets from our own theologians, quoting passages of scripture that we have each checked in our translated Bibles at home. We each belong to religious communities that agree with us, and our respective church libraries are stocked with many other pamphlets listing the outrages that the opposing community has committed against us and providing reams of evidence proving that the conflict is all their fault.

What can we do but kill each other?

Information is great when you have some reasonable way of processing it. But when you don't, it's overwhelming and even threatening. If you try to pay attention to all of it, you'll freeze. And then the people who didn't freeze will eat your lunch, or eat you for lunch.

There are two easy ways to deal with information overload:
  • Submit unquestioningly to an authority who decides what's what.
  • Find a simple worldview that pleasingly organizes the wild flood of facts and interpretations, and then ally with people who subscribe to that worldview.

Both choices are cultish, but the second can seem downright enlightened, at least from the inside. Unlike the unquestioning follower, you're always learning new facts and interpretations. You're getting better and better at explaining why your tribe's view is right and the opposing view is wrong. And you do ask questions, but you've learned to ask the right questions, unlike those mindless sheep in the opposing tribe.

In other words, you live inside a tribal bubble that lets pleasing information in and keeps disturbing information out. The information flood actually helps you do this, because the more details there are, the easier it is to cherry-pick support for whatever you want to believe.

These delusions are easy to see in other people: conspiracy theorists, global-warming deniers, Birthers, and so on. You can never win an argument against such folks, because there is always more information you haven't explained, some new micro-analysis that "proves" Obama's birth certificate is fake or explains why the world is really cooling. You never reach the end of it, precisely because the 21st-century information barrel is bottomless.

That's why liberals like me (and probably Nate Silver more than anybody) had to love watching Republicans cope with the election returns. Nate had dispassionately put together a prediction model, and he faithfully ran new polling data through it every day. It turned out to be down-the-line accurate, but until the votes were actually counted he was vilified by people who wanted to believe Romney would win. And not just ignorantly vilified: vilified with spreadsheets and graphs and detailed explanations of what he must be doing wrong.

It's rare to run into such a perfect bubble-pricking.

But Silver's book (published before the election) isn't about self-congratulation. It's about why accurate prediction is hard and how to do it better. Each chapter describes a prediction-making community (meteorologists, baseball stat geeks, poker players, etc.) and draws some general lesson from its collective success or failure.

Some of those lessons are technical, but a few general-public themes come through:
  • Foxes beat hedgehogs. People who have one big idea do badly in an information flood, because they can always explain away their failures without changing their big idea. But people who juggle multiple competing ideas can use new data to develop the good ones and discredit the bad ones.
  • Data doesn't interpret itself. The best predictions don't come from pure pattern matching, but from a plausible theory that is then tested against the evidence. If you just pattern-match, you'll end up modeling the noise rather than the signal.
  • Make specific predictions so you can recognize your mistakes. Since it always rains eventually, if you aren't specific about when you expect rain and how much, you'll always be able to claim you were right, and you won't learn anything.
  • Be methodical. If you don't define in advance how you're going to judge your results, the temptation to cherry-pick will overwhelm you. (A small scoring sketch follows this list.)
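
Making those last two lessons concrete: here's a minimal sketch, in Python, of one standard way forecasters grade specific probabilistic predictions, the Brier score, which is just the mean squared gap between the probabilities you announced and what actually happened. (The score is a standard forecasting tool, not something taken from Silver's book, and the forecasts and outcomes below are invented for illustration.)

    def brier_score(forecasts, outcomes):
        """Mean squared error between announced probabilities and outcomes.
        0.0 is a perfect record; 0.25 is what hedging at 50% every time earns."""
        return sum((p - o) ** 2 for p, o in zip(forecasts, outcomes)) / len(forecasts)

    # Hypothetical rain forecasts, stated in advance, one per day
    announced = [0.9, 0.7, 0.2, 0.1, 0.8]
    happened  = [1,   1,   0,   0,   1]   # 1 = it rained, 0 = it didn't

    print(brier_score(announced, happened))   # 0.038 -- specific and mostly right
    print(brier_score([0.5] * 5, happened))   # 0.25  -- the never-commit baseline

The forecaster who only says "it will rain eventually" can't be scored at all. Committing to numbers in advance is what gives you mistakes you can recognize.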

Always in the background lies this lesson: Bubbles don't just happen to other people. Falling into one is a universal human tendency in the face of too much information. If you're not constantly on guard (and maybe even if you are), you will fall prey to it.

Western civilization came out of the Gutenberg Tunnel when it developed more rigorous collective methods of handling the increased information flow: science, most obviously, but also market capitalism, journalism, and constitutional democracies that could balance majority rule with tolerance for minority rights. Maybe a similar leap will get us through the Internet Tunnel eventually; better sooner than later.

Bill Kovach and Tom Rosenstiel have a less sweeping focus: How are you personally going to cope?

If we continue the Gutenberg analogy, there's a clear analog to the priest and the universal church he represented: the editor and the culture of journalistic objectivity.

Once upon a time, national news outlets were few and were controlled by gatekeepers who told you "the way it is". Every evening, the remarkably similar news departments of the three major networks told you what you needed to know. If you wanted more detail, you read a daily newspaper or weekly news magazines, but even they wouldn't give you a fundamentally different worldview.

As I've described in more detail elsewhere, this system was both good and bad. (The same could be said of the pre-Gutenberg Catholic Church.) The gatekeepers tried to be accurate, and they had the power to hold a story back until they could verify it. So rumors got squashed, hucksters were weeded out, and special-interest groups couldn't trump up a story out of nothing. And because the gatekeepers defined news by what people should know rather than what they wanted to know, the Vietnam War never vanished from public awareness the way the Afghan War often has.

On the downside, the range of views presented was narrow. Only by staging artificial public events (like Martin Luther King's March on Washington) could marginalized groups push their message through the editorial bottleneck.

Now that's all gone. There is no priest, or rather there are too many would-be priests sprinkling dubious holy water in all directions.

In essence, we are all editors now. We used to get a filtered flow of information, pre-tested and pre-sanitized by experts. Now we're exposed to the raw flood, which we have to test and sanitize for ourselves. So we all need to learn the ways of thought that used to be taught only in journalism school.

That's what Blur is about.

A lot of Kovach and Rosenstiel's advice is common sense. Before you react to a news article or factoid, you need to take a step back and judge it like an editor: Where does this information come from? Are the sources in a position to know? Do they have reason to lie? Am I just being told a story, or are there checkable facts here? Has anybody checked them? What is left out of this article? Does it raise obvious questions that are not answered? If the article focuses on only a few characters in the story, would other characters tell it differently? And so on. If you have a critical, analytical mind, the questions aren't hard to generate once you realize that you need to take a step back and judge.

I found one piece of their analysis very insightful, and I may start using their terminology. They identify three models of journalism: verification, assertion, and affirmation. I don't like how they present affirmation (probably because they belong to the verification tribe and the Weekly Sift is affirmation journalism), but the distinctions themselves are worthwhile.

Journalism of verification. This is the gatekeeper model of the Cronkite Era and the ideal that you will hear expressed by the editors of publications like the New York Times. (For now let's leave alone the question of how well they live up to that ideal.) Check everything. Get it right before you publish. Be objective. Be complete. Put a wall between news and opinion.

Journalism of assertion. The model most often seen on CNN. Put newsmakers on camera and see what they say. (If you can only get them on camera by agreeing not to raise certain subjects, fine.) Let viewers judge for themselves whether they're being lied to. Get information out as quickly as possible, even if you haven't checked that it's true. Strive for balance rather than accuracy; let liberals and conservatives alike spin the story for your audience, and then "leave it there" rather than check who's right.

Journalism of affirmation. The model shared by Fox News, the nighttime line-up of MSNBC, and (mostly) the Weekly Sift. Have a point of view and attract an audience that (mostly) shares that view.

Reading Blur, you will get the idea that verification is the gold standard, while assertion and affirmation are in some way illegitimate. (I was struck by how often Rachel Maddow, whom I admire, came up as a bad example.) I'd express this differently: assertion and affirmation journalism are illegitimate if they pretend to be verification journalism.

That is my biggest objection to Fox News: the pretense that they're "fair and balanced". If they billed themselves as "interpreting the world through a conservative prism", I'd respect them more.

Affirmation journalism is legitimate to the extent that it's honest and tries to serve its audience rather than pander to them so their attention can be sold to advertisers. Like a verification journalist, an affirmation journalist should be trying to get it right, and should also provide a verification trail (that's what the links are for on the Weekly Sift), honestly represent the people s/he quotes, endorse only arguments s/he believes are valid, not intentionally hide facts or points of view from the audience, and so on. (That's my other problem with Fox. I don't think they're just conservative. I think they repeat talking points they know are false and use frames designed to deceive.)

In short, I think affirmation (and assertion too) can be done well. Rachel Maddow isn't just Sean Hannity's mirror image.

Tying this back to Nate Silver and the bubble tendency: Part of being honest and doing affirmation journalism well is recognizing the constant danger of winding up in a delusional bubble. Because there is a real world out there, and it will bite you if you turn your back on it, as Fox News viewers discovered on election night.

So serving you as a reader means not pleasing you too well. I could tell you a lot of things that would make you feel good about yourself and say "Hell yes!". But some of them would set you up for a comeuppance.

And as for the horrors that might still await in the Internet Tunnel: Wishing to be out the other side doesn't make it so, and affirmation journalism is popular because the priesthood of verification journalism is broken; it doesn't know how to handle the flood. Maybe someday they will figure it out, or some new information-processing methodology will burst onto the scene the way science did in the 1600s. But for now, all I know how to do is to choose my simplifying assumptions as best I can, revisit them from time to time, and proceed honestly from there.
