I followed this up with a subsequent post on Saturday, Jan. 16.
The reaction to How News Happens may tell us more about the news industry than the study itself does.
The study of the news ecosystem in Baltimore was published today by the Pew Research Center’s Project for Excellence in Journalism, and news of the report first appeared Sunday. The New York Times, Los Angeles Times, editorsweblog and more tweets than I could count trumpeted the finding that most news originates with newspapers and that those upstart blogs contribute barely a trickle of original news. The favorite fact cited was that 95 percent of stories reporting fresh information came from the endangered old media, newspapers primarily.
This reinforces something that newspaper journalists and executives have been reassuring themselves of ever since we learned about those pesky bloggers: that bloggers and talk radio and really everyone would have nothing to talk and write about if not for newspapers.
Though the report was limited and flawed, it produced some valuable understanding of how journalism works today. But its most important finding was not that most news comes from newspapers. I’m glad PEJ is conducting this kind of research. I hope it does more.
But this research has too many flaws and limitations to be taken very seriously, and few of those flaws were noted in the coverage. Poynter’s Bill Mitchell and David Carr of the New York Times raised important issues, and Jeff Jarvis bluntly pointed out the dangers of reading much into the study (though he noted its value).
I will just run through issues that the study (or those embracing the 95-percent number) should have taken into account (or that future studies should address):
- The study focused on types of news that old media emphasize: crime, government, the justice system, health care. That skewed the findings inherently in the direction of old media. In fact, one of the six stories selected, on juvenile justice, began with an enterprise story in the Baltimore Sun. So of course the other reporting in that case was going to be derivative of the Sun’s original reporting. If old media are losing audience because they are not covering stories relevant to their communities, this study would tell you nothing about whether anyone else was covering that information.
- Areas the study steered clear of included neighborhood news, sports, arts and schools (the study was conducted in July, when school is out). One story (dealing with the sale of a historic movie theater) touched on entertainment and business, though it didn’t fit tidily into either genre. These are areas where specialized blogs and web sites are contributing, or could contribute, meaningfully to the news ecosystem, depending on the community.
- The study reported on only six story narratives from one week (a summer week, as noted, when the news flow, not to mention staffing of any or all news outlets, might have been far from typical). While those six stories were studied in great detail, can six stories during a summer week tell you very much?
- The study’s coverage of Twitter reflected an incomplete understanding of Twitter’s value in providing news to a community. The report took extensive note of the use of Twitter feeds by news outlets and by the Baltimore police department. But it paid no attention to whether and how stories broke via Twitter posts from the general public. I suspect that Twitter produced significant discussion about at least three of the stories studied, if not all six. I would be surprised if that discussion didn’t break some news that old media either reported later or missed entirely.
- While the study encompassed all of the so-called mainstream news outlets, the new-media outlets studied were limited to those that “produce or disseminate local public affairs news.” While that selection of sources fit the narrow selection of issues studied, Mark Potts’ blog post on news sources in Baltimore (posted in June, the month before the study) includes several news outlets not studied by PEJ. If Baltimore Real Estate Investing Blog did a better job of reporting what might happen with the sale of that theater (the report noted that the advance reporting by the media mostly failed to raise the possibility of what actually happened), the study wouldn’t know, because its selection of news outlets was too narrow. Update: I don’t recommend speculation, but indeed I was pretty close on this one, though I had the wrong blog. I just learned that astrogirl’s galaxy guide did in fact blog extensively about the sale of The Senator theater. I count six posts about the Senator the week that Pew studied the Baltimore media. Without studying such an active blogger, the study’s analysis of this story line simply is not valid. Please read her comment below (from Laura); you can also read her post about this post and the Pew study. Also, please note the comment from me, adding a response from Tom Rosenstiel, director of the Project for Excellence in Journalism. (I address Rosenstiel’s response, and the further concerns it raises, in the subsequent post.)
- The study would have been more valid in a city served by a strong online-only news operation. The study acknowledged this with a reference to one such city, San Diego, though referring to such operations as aggregators revealed the study’s bias. San Diego and many other cities have local online news operations with stronger reputations (nationally at least) than those mentioned in Baltimore. This study would have had more validity if it addressed how much original reporting comes from operations such as the Voice of San Diego, Oakland Local, MinnPost, West Seattle Blog (Seattle would have been an interesting city for this study) or St. Louis Beacon. I would love to see such a study of our local news ecosystem, including Eastern Iowa News.
- The finding that one of the stories broke in a blog was heavily minimized. A plan to put listening devices on Maryland Transit Administration buses was reported first in the Maryland Politics Watch blog. As reported in the study, a Baltimore Sun blogger three days later linked to the original post and called the state for comment, stirring up enough fuss that the idea was quickly scrapped. So the report tempers the grudging recognition that new media broke the story by saying that nothing happened until real media weighed in. But is that the full story? Since the study only checked official Twitter accounts, we don’t know whether the original post stirred up some tweeps or generated some calls from citizens to the MTA. But we do know that at least one of these six stories (17 percent) originally broke in new media. And what if that’s not typical? If the typical rate is one-third, I guarantee you would find some six-story samples with only one (or no) story breaking in new media. Or what if that’s not accurate for this sample? I can almost assure you the police-shooting story (and possibly another in the sample) broke on Twitter before it broke in any of the media outlets PEJ studied. How would this study be viewed differently if its conclusion said one-third of the stories studied broke in new media before they did in old media (in a story sample skewed in favor of old media)?
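That point about small samples holds up arithmetically. A quick binomial calculation (my own sketch, not part of the Pew study, and assuming stories break independently) shows that even if one-third of stories typically broke in new media, a random sample of six would still show one or fewer doing so more than a third of the time:

```python
from math import comb

# If the true rate of stories breaking in new media were one-third,
# how often would a sample of six stories show one or fewer doing so?
n, p = 6, 1 / 3
prob = sum(comb(n, k) * (p ** k) * ((1 - p) ** (n - k)) for k in range(2))
print(f"P(0 or 1 of 6 stories break in new media) = {prob:.2f}")  # about 0.35
```

In other words, a six-story sample is far too small to distinguish a 17 percent rate from a one-third rate.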
But even if you accept that the study was completely accurate, valid and important, it had several disturbing findings that were at least as important as, if not more important than, the finding that most news comes from old media:
- 83 percent of stories in the study were essentially repetitive, meaning it’s going to be difficult for newspapers to claim enough value to make paywalls work. And don’t think that the paywalls will keep people from repeating newspaper stories. Broadcast outlets were doing that long before they could link to or rip off online stories.
- In the stories examined, the media were mostly (63 percent) reacting to the government. The report repeatedly mentions the lack of enterprise reporting from even traditional news outlets.
- The report found extensive lifting of material from press releases and other reports, some of it apparently amounting to plagiarism (though the report did not provide details): “We found official press releases often appear word for word in first accounts of events, though often not noted as such. In the growing echo chamber online, formal procedures for citing and crediting can get lost. We found numerous examples of websites carrying sections of other people’s work without attribution and often suggesting original reporting was added when none was. We found elements of this in several major stories we traced.”
I am glad to see someone studying the emerging ecosystem. I hope to see deeper studies that tell us more. And I hope the media that report on those studies make a better effort to understand and report what the studies say. And what they don’t.