This is long, but I recommend that you read it all:
The line between science and journalism is getting blurry....again
By Bora Zivkovic
Human #1: "Hello, nice weather today, isn't it?"
Human #2: "Ummm...actually not. It's a gray, cold, windy, rainy kind of day!"
Many a joke depends on confusion about the meaning of language, as in the example above. But understanding the sources of such confusion is important in realms other than stand-up comedy, including in the attempts to convey facts about the world to one's target audience.
In the example above, Human #1 is using Phatic language, sometimes referred to as 'small talk' and usually exemplified, at least in the British Isles, with the talk about the highly unpredictable weather. (image: by striatic on Flickr)
Phatic language
Phatic discourse is just one of several functions of language. Its role is not to impart any factual information, but to establish a relationship between the people. It conveys things like emotional state, relative social status, alliance, intentions and limits to further conversation (i.e., where the speaker "draws the line").
If a stranger rides into a small town, a carefully chosen yet meaningless phrase establishes a state of mind that goes something like this: "I come in peace, mean no harm, I hope you accept me in the same way". The response of the local conveys how the town looks at strangers riding in, for example: "You are welcome...for a little while - we'll feed you and put you up for the night, but then we hope you leave". (image: Clint Eastwood in 'Fistful of Dollars' from Squidoo)
An important component of phatic discourse is non-verbal communication, as the tone, volume and pitch of the voice, facial expression and body posture modify the language itself and confirm the emotional and intentional state of the speaker.
It does not seem that linguistics has an official term for the opposite - the language that conveys only pure facts - but the term usually seen in such discussions (including the domain of politics and campaigning) is "Conceptual language" so this is what I will use here. Conceptual language is what Human #2 in the joke above was assuming and using - just the facts, ma'am.
Rise of the earliest science and journalism
For the sake of this article, I will use two simplified definitions of science and journalism.
Journalism is communication of 'what's new'. A journalist is anyone who can say "I’m there, you’re not, let me tell you about it."
Science is communication of 'how the world works'. A scientist is anyone who can say "I understand something about the world, you don't, let me explain it to you".
Neither definition necessitates that what they say is True, just what they know to the best of their ability and understanding.
Note that I wrote "science is communication". Yes, science is the process of discovery of facts about the way the world works, but the communication of that discovery is the essential last step of the scientific process, and the discoverer is likely to be the person who understands the discovery the best and is thus likely to be the person with the greatest expertise and authority (and hopefully ability) to do the explaining.
For the greatest part of human history, none of those distinctions made any sense. Most of communication contained information about what is new, some information about the way the world works, and a phatic component. Knowing how the world works, knowing what is happening in that world right now, and knowing if you should trust the messenger, were all important for survival.
For the most part, the information was local, and the messengers were local. A sentry runs back into the village alerting that a neighboring tribe, painted with war-paints, is approaching. Is that person a member of your tribe, or a stranger, or the well-known Boy Who Cried Wolf? What do you know about the meaning of war-paint? What do you know about the neighboring tribe? Does all this information fit with your understanding of the world? Is information coming from this person to be taken seriously? How are village elders responding to the news? Is this piece of news something that can aid in your personal survival?
For the longest time, information was exchanged between people who knew each other to some degree - family, neighbors, friends, business-partners. Like in a fishing village, the news about the state of fishing stocks coming from the ships at sea is important information exchanged at the local tavern. But is that fish-catch information 'journalism' (what's new) or 'science' (how the world works)? It's a little bit of both. And you learn which sailors to trust by observing who is trusted by the locals you have already learned to trust. Trust is transitive.
Someone in the "in-group" is trusted more than a stranger - kids learned from parents, the community elders had the authority: the trust was earned through a combination of who you are, how old you are, and how trustworthy you tended to be in the past. New messengers are harder to pin down on all those criteria, so their information is taken with a degree of skepticism. The art of critical thinking (again, not necessarily meaning that you will always pick the Truth) is an ancient one, as it was essential for day-to-day survival. You trust your parents (or priests or teachers) almost uncritically, but you put up your BS filters when hearing a stranger.
Emergence of science and of journalism
The invention of the printing press precipitated the development of both journalism and science. But that took a very long time - almost two centuries (image: 1851 printing press that produced early issues of Scientific American). After Gutenberg printed the Bible, most of what people printed were political pamphlets, church fliers and what, by the sensibilities of the time, passed for porn.
The London Gazette of 1666 is thought to be the first newspaper in the modern sense of the word. (image: from DavidCo) Until then, newspapers were mostly irregular printings by individuals, combining news, opinion, fiction and entertainment. After this, newspapers gradually became regular (daily, weekly, monthly) collections of writings by numerous people in the same issue.
The first English scientific journal was published a year before - the Philosophical Transactions of the Royal Society of London in 1665 (image: Royal Society of London).
Until then, science was communicated by letters - those letters were often read at the meetings of scientists. Those meetings got formalized into scientific societies and the letters read at such meetings started getting printed. The first scientific journals were collections of such letters, which explains why so many journals have the words "Letters", "Annals" or "Proceedings" in their titles.
Also, before as well as for quite a long time after the inception of the first journals, much of science was communicated via books - a naturalist would spend many years collecting data and ideas before putting it all into a long, leather-bound volume. Those books were then discussed at meetings of other naturalists, who would often respond by writing books of their own. Scientists at the time did not think that Darwin's twenty-year wait to publish The Origin was notable (William Kimler, personal communication) - that was the normal timeline for research and publishing, unusual only to us from a modern perspective of 5-year NIH grants and the 'publish or perish' culture.
As previously oral communication gradually moved to print over the centuries, both journalistic and scientific communication occurred in formats - printed with ink on paper - very similar to blogging (that link leads to the post that served as a seed from which this article grew). If born today, many of the old writers, like Montaigne, would be Natural Born Bloggers ('NBBs' - term coined by protoblogger Dave Winer). A lot of ship captains' logs were essentially tweets with geolocation tags.
People who wanted to inform other people printed fliers and pamphlets and books. Personal letters and diaries were meant to be public: they were as widely shared as was possible, they were publicly read, saved, then eventually collected and published in book-form (at least posthumously). Just like blogs, tweets and Facebook updates today….
The 18th century 'Republic of Letters' (see the amazing visualization of their correspondence) was a social network of intellectual leaders of Europe who exchanged and publicly read their deep philosophical thoughts, scientific ideas, poetry and prose.
Many people during those centuries wrote their letters in duplicate: one copy to send, one to keep for publishing Collected Letters later in life. Charles Darwin did that, for example (well, if I remember correctly, his wife made copies from his illegible originals into something that recipients could actually read), which is why we have such a complete understanding of his work and thought - it is all well preserved, and the availability of such voluminous correspondence gave rise to a small industry of Darwinian historical scholarship.
What is important to note is that, both in journalism and in science, communication could be done by anyone - there was no official seal of approval, or license, to practice either of the two arts. At the same time, communication in print was limited to those who were literate and who could afford to have a book printed - people who, for the most part, were the wealthy elites. Entry into that intellectual elite from a lower social class was possible but very difficult and required a lot of hard work and time (see, for example, a biography of Alfred Russel Wallace). Membership in the worlds of arts, science and letters was automatic for those belonging to the small group of literate aristocracy. They had no need to establish formalized gatekeeping, as bloodlines, personal sponsorship and money did the gatekeeping job quite well on their own.
As communication moved from local to global due to print, trust had to be gained over time - by one's age, stature in society, track record, and by recommendation: whom the people you trust say you should trust. Trust is transitive.
Another thing to note is that each written dispatch contained both 'what's new' and 'how the world works' as well as a degree of phatic discourse: "This is what happened. This is what I think it means. And this is who I am so you know why you should trust me." It is often hard to tell, from today's perspective, what was scientific communication and what was journalism.
Personal - and thus potentially phatic - communication was the norm in early scientific publishing. For example, see "A Letter from Mr J. Breintal to Peter Collinſon, F. R. S. containing an Account of what he felt after being bit by a Rattle-ſnake" in Philosophical Transactions, 1747 - a great account of it can be found at Neurotic Physiology. It is a story of a personal interaction with a rattlesnake and the discovery leading from it. It contained "I was there, you were not, let me tell you what happened" and "I understand something, you don't, let me explain that to you" and "Let me tell you who I am so you can know you can trust me".
Apparently, quite a lot of scientific literature of old involved exciting narratives of people getting bitten by snakes - see this one from 1852 as well.
The anomalous 20th century - effects of technology
The gradual changes in society - invention of printing, rise of science, rise of capitalism, industrial revolution, mass migration from rural to urban areas, improvements in transportation and communication technologies, to name just a few - led to a very different world in the 20th century.
Technology often drives societal change. If you were ever on a horse, you understand why armies that used stirrups defeated the armies that rode horses without this nifty invention.
Earlier, the speed of spreading news was much slower (see image: maps of rates of travel in the 19th century - click on the link to see bigger and more). By 1860, the telegraph had reached St. Louis. During its short run, the Pony Express could cover the rest of the way to San Francisco in 10 days. After that, the telegraph followed the rails; the first transcontinental line was completed in 1869. Except for semaphores (1794), information before the telegraph (1843) could only travel as fast as a rider or a boat. (Thanks to John McKay for this brief primer on the history of the speed of communication in North America. I am assuming that Europe was slightly ahead and the rest of the world somewhat behind.)
The 20th century saw invention or improvement of numerous technologies in transportation - cars, fast trains, airplanes, helicopters, space shuttles - and in communication - telephone, radio, and television. Information could now travel almost instantly.
But those new technologies came with a price - literally. While everyone could write letters and send them by stagecoach, very few people could afford to buy, run and serve printing presses, radio stations and television studios. These things needed capital, and increasingly became owned by rich people and corporations.
Each inch of print or minute of broadcast cost serious money. Thus, people were employed to become official filters of information, the gatekeepers - the editors who decided who would get access to that expensive real estate. As the editors liked some people's work better than others', those people got employed in the nascent newsrooms. Journalism became professionalized. Later, universities started journalism programs and codified instruction for new journalists, professionalizing it even more.
Instead of people informing each other, now the few professionals informed everyone else. And the technology did not allow for everyone else to talk back in the same medium.
The broadcast media, a few large corporations employing professional writers informing millions – with no ability for the receivers of information to fact-check, talk back, ask questions, be a part of the conversation – is an exception in history, something that lasted for just a few decades of the 20th century.
The anomalous 20th century - industrialization
Industrial Revolution brought about massive migration of people into big cities. The new type of work required a new type of workforce, one that was literate and more educated. This led to the invention of public schools and foundation of public universities.
In the area of science, many more people became educated enough (and science was not yet complex and expensive) to start their own surveys, experiments and tinkering. The explosion of research led to an explosion of new journals. Those, too, became expensive to produce and started requiring professional filters - editors. Thus scientific publishing also became professionalized. Not every personal anecdote could make it past the editors any more. Not everyone could call themselves a scientist either - a formal path emerged, ending with a PhD at a university, that ensured that science was done and published by qualified persons only.
By the 1960s, peer review - experimented with by some journals a little earlier - had been adopted en masse by scientific journals. Yes, it is that recent! See, for example, this letter to Physical Review in 1936:
Dear Sir,
We (Mr. Rosen and I) had sent you our manuscript for publication and had not authorized you to show it to specialists before it is printed. I see no reason to address the — in any case erroneous — comments of your anonymous expert. On the basis of this incident I prefer to publish the paper elsewhere.
Respectfully,
Albert Einstein
Or this one:
John Maddox, former editor of Nature: The Watson and Crick paper was not peer-reviewed by Nature… the paper could not have been refereed: its correctness is self-evident. No referee working in the field … could have kept his mouth shut once he saw the structure...
Migration from small towns into big cities also meant that most people one would meet during the day were strangers. Meeting a stranger was no longer extraordinary, so the emergence and enforcement of prescribed codes of conduct in cities replaced the need for one-to-one encounters and sizing up strangers using phatic language. This is why, even today, phatic language is much more important and prevalent in rural areas, where it aids personal survival, than in urban centers, where more general rules of behavior among strangers emerged (which may partially explain why phatic language is generally associated with conservative ideology and conceptual language with political liberalism, aka the "reality-based community").
People moving from small hometowns into big cities also led to breaking up of families and communities of trust. One needed to come up with new methods for figuring out who to trust. One obvious place to go was local media. They were stand-ins for village elders, parents, teachers and priests.
If there were many newspapers in town, one would try them all for a while and settle on one that best fit one's prior worldview. Or one would just continue reading the paper one's parents read.
But other people read other newspapers and brought their own worldviews into the conversation. This continuous presence of a plurality of views kept everyone's BS filters in high gear - it was necessary to constantly question and filter all the incoming information in order to choose what to believe and what to dismiss.
The unease with the exposure to so many strangers with strange ideas also changed our notions of privacy. Suddenly we craved it. Our letters are now meant for one recipient only, with the understanding that they will not be shared. Personal diaries now have locks. After a century of such craving for privacy, we are returning to more historically traditional notions, by much more freely sharing our lives with strangers online.
The anomalous 20th century - cleansing of conceptual language in science and journalism
Until the 20th century we did not see the consolidation of media into large conglomerates, and of course there was no mass radio or TV until mid-century. Only later in the century did we see the monopolization of local media markets by a single newspaper (its competitors having gone belly-up), which then had to serve everyone, and so invented the fake "objective" HeSaidSheSaid timid style of reporting in order not to lose customers of various ideological stripes, and with them advertising revenue.
Professionalising of journalism, coupled with the growth of media giants serving very broad audiences, led to institutionalization of a type of writing that was very much limited to "what's new".
The "let me explain" component of journalism fell out of favor, as there was always a faction of the audience that had a problem with the empirical facts - a faction the company's finances could not afford to lose. The personal - including the phatic - was carefully eliminated, as it was perceived as unobjective and inviting the criticism of bias. The way for a reporter to inject an opinion into the article was to find a person who thinks the same in order to get the target quote. A defensive (perhaps cowardly) move that became the norm. And once the audience caught on, it led to a loss of trust in traditional media.
Reduction of local media to a single newspaper, a couple of local radio stations and a handful of broadcast TV channels (all saying essentially the same thing) left little choice for the audience. With only one source in town, there was no opportunity to filter among a variety of news sources. Thus, many people started unquestioningly accepting what 20th-century-style broadcast media served them.
Just because articles ran under the banners of big companies did not make them any more trustworthy by definition, but with no alternative it was still better to be poorly informed than not informed at all. Thus, in the 20th century we gradually lost the ability to read everything critically, awed by big names like NYT and BBC and CBS and CNN. Those became the new parents, teachers, tribal elders and priests - the authority figures whose words were taken unquestioningly.
In science, an explosion in funding unmatched by a corresponding growth in job positions led to an overproduction of PhDs and the rise of a hyper-competitive culture in academia. Writing books became unproductive. The only way to succeed was to keep getting grants, and the only way to do that was to publish very frequently. Everything else fell by the wayside.
False measures of journal quality - like the infamous Impact Factor - were used to determine who got a job and tenure and who fell out of the pipeline. The progress of science led inevitably to specialization and to the development of specialized jargon. The proliferation of expensive journals ensured that nobody but people at the highest-level research institutions had access to the literature, so scientists started writing only for each other.
Scientific papers became dense, but also narrowed themselves to only "this is how the world works". The "this is new" part was left out, as it went without saying: almost by definition, a paper would not be published if it did not produce something new.
And the personal was so carefully excised for the purpose of seeming unbiased by human beings that it sometimes seems like the laboratory equipment did all the experiments of its own volition.
So, at the close of the 20th century, we had a situation in which journalism and science, for the first time in history, completely separated from each other. Journalism covered what's new without providing the explanation and context for new readers just joining the topic. Science covered only explanation and only to one's peers.
In order to bridge that gap, a whole new profession needed to arise. As scientists understood the last step of the scientific method - communication - to mean only 'communication to colleagues', and as regular press was too scared to put truth-values on any statements of fact, the solution was the invention of the science journalist - someone who can read what scientists write and explain that to the lay audience. With mixed success. Science is hard. It takes years to learn enough to be able to report it well. Only a few science journalists gathered that much expertise over the years of writing (and making mistakes on the way).
So, many science journalists fell back on reporting science as news, leaving the explanation out. Their editors helped in that by severely restricting the space - and good science coverage requires ample space.
A good science story should explain what is known by now (science), what the new study adds (news) and why that matters to you (phatic discourse). The lack of space usually led to omission of context (science) and shortening of what is new (news), leaving only the emotional story intact. Thus, the audience did not learn much; certainly not enough to be able to evaluate next day's and next week's news.
This format also led to the choice of stories. It is easy to report in this way if the news is relevant to the audience anyway, e.g., concerning health (the "relevant" stories). It is also easy to report on misconduct of scientists (the "fishy" stories) - which is not strictly science reporting. But it was hard to report on science that is interesting for its own sake (the "cool" stories).
What did the audience get out of this? Scientists are always up to some mischief. And every week they change the story as to what is good or bad for my health. And it is not very fun, entertaining or exciting. No surprise that science as an endeavour slowly started losing the trust of the (American) public, and that it was easy for groups with financial, political or religious interests to push anti-science rhetoric on topics from the hazards of smoking to stem-cell research to evolution to climate change.
At the end of the 20th century, thus, we had a situation in which journalism and science were completely separate endeavors, and the bridge between them - science journalism - was unfortunately operating under the rules of journalism and not science, messing up the popular trust in both.
Back to the Future
It is 2010. The Internet has been around for 30 years, the World Wide Web for 20. It took some time for the tools to develop and spread, but we are obviously undergoing a revolution in communication. I use the word "revolution" because it is so almost by definition - when the means of production change hands, this is a revolution.
The means of production, in this case the technology for easy, cheap and fast dissemination of information, are now potentially in the hands of everyone. When the people formerly known as the audience employ the press tools they have in their possession to inform one another, we call that 'citizen journalism.' And some of those citizens possess much greater expertise on the topics they cover than the journalists that cover that same beat. This applies to science as well.
In other words, after the deviation that was the 20th century, we are going back to the way we have evolved as a species to communicate - one-to-one and few-to-few instead of one-to-many. Apart from technology (software instead of talking/handwriting/printing), speed (microseconds instead of days and weeks by stagecoach, railroad or Pony Express, see image above) and the number of people reached (potentially - but rarely - millions simultaneously instead of one person or small group at a time), blogging, social networking and other forms of online writing are nothing new – this is how people have always communicated. Like Montaigne. And the Republic of Letters in the 18th century. And Charles Darwin in the 19th century.
All we are doing now is returning to a more natural, straightforward and honest way of sharing information, just using much more efficient ways of doing it. (Images from Cody Brown)
And not even that – where technology is scarce, analog blogging is alive and well (image: Analog blogger, from AfriGadget).
What about trustworthiness of all that online stuff? Some is and some isn’t to be trusted. It’s up to you to figure out your own filters and criteria, and to look for additional sources, just like our grandparents did when they had a choice of dozens of newspapers published in each of their little towns.
With the gradual return of a more natural system of communication, we got to see additional opinions, the regular fact-checks on the media by experts on the topic, and realized that the mainstream media is not to be trusted.
With the return of a more natural system of communication, we will all have to re-learn how to read critically, find second opinions, evaluate sources. Nothing new is there either – that is what people have been doing for millennia – the 20th century is the exception. We will figure out who to trust by trusting the judgment of people we already trust. Trust is transitive.
Return of the phatic language
What does this all mean for the future of journalism, including science journalism?
The growing number of Web-savvy citizens has developed new methods of establishing the trustworthiness of sources. It is actually the old, pre-20th-century method - relying on individuals, not institutions. Instead of treating WaPo, Fox, MSNBC and NPR as proxies for the father, teacher, preacher and medicine man, we now once again evaluate individuals.
As nobody enters a news site via the front page and looks around, but we all get to individual articles via links and searches, we are relying on bylines under the titles, not on the logos up on top. Just like we were not born trusting NYTimes but learned to trust it because our parents and neighbors did (and then perhaps we read it for some time), we are also not born knowing which individuals to trust. We use the same method - we start with recommendations from people we already trust, then make our own decisions over time.
If you don't link to your sources, including to scientific papers, you lose trust. If you quote out of context without providing that context, you lose trust. If you hide who you are and where you are coming from - that is cagey and breeds mistrust. Transparency is the new objectivity.
And transparency is necessarily personal, thus often phatic. It shows who you are as a person, your background, your intentions, your mood, your alliances, your social status.
There are many reasons sciencebloggers are more trusted than journalists covering science.
First, they have the scientific expertise that journalists lack - they really know what they are talking about on the topic of their expertise and the audience understands this.
Second, they link out to more, more diverse and more reliable sources.
Third, being digital natives, they are not familiar with the concept of word limits. They start writing, they explain it as it needs to be explained, and when they are done explaining they end the post - whatever length it takes to give the subject its due.
Finally, not being trained by j-schools, they never learned not to let their personality shine through their writing. So they gain trust by connecting to their readers - the phatic component of communication.
Much of our communication, both offline and online, is phatic. But that is necessary for building trust. Once the trust is there, the conceptual communication can work. If I follow people I trust on Twitter, I will trust that they trust the sources they link to, so I am likely to click on them. Which is why more and more scientists use Twitter to exchange information (PDF). Trust is transitive.
Scientists, becoming journalists
Good science journalists are rare. Cuts in newsrooms, allocation of too little space for science stories, assigning science stories to non-science journalists - all of these factors have resulted in a loss of quantity and quality of science reporting in the mainstream media.
But being a good science journalist is not impossible. People who take the task seriously can, over time, become experts on the topics they cover (and get to a position where they can refuse to cover astronomy if their expertise is evolution). They can become temporary experts if they are given sufficient time to study instead of being tasked with writing ten stories per day.
With the overproduction of PhDs, many scientists are choosing alternative careers, with many becoming science writers and journalists, or Press Information Officers. They thus come into the profession with the expertise already there.
There is not much difference between a research scientist who blogs and thus is an expert on the topic s/he blogs about, and a research scientist who leaves the lab in order to write as a full-time job. They both have scientific expertise and they both love to write or they wouldn't be doing it.
Blog is software. A medium. One of many. No medium has a higher coefficient of trustworthiness than any other. Despite never going to j-school and writing everything on blogs, I consider myself to be a science writer.
Many science journalists - usually younger ones, though some of the old guard caught on quickly and became good at it (generation is a mindset, not an age) - grok the new media ecosystem, in which online collaboration between scientists and journalists is becoming the norm.
At the same time, many active scientists are now using the new tools (the means of production) to do their own communication. As is usually the case with novelty, different people get to it at different rates. Conflicts between 20th- and 21st-century-style thinking inevitably occur. The traditional scientists wish to communicate the old way - in journals, letters to the editor, at conferences. This is the way of gatekeeping they are used to.
But there have been a number of prominent cases of such clashes between old and new models of communication, including the infamous Roosevelts on toilets (the study had nothing to do with either US Presidents or toilets, but it is an instructive case - image by Dr.Isis), and several other smaller cases.
The latest one is the Arsenic Bacteria Saga, in which the old-timers do not seem to understand what a 'blog' means, and are seemingly completely unaware of the important distinction between 'blogs' and 'scienceblogs'. The former are online spaces by just about anyone; the latter are blogs written by people who actually know their science and are vetted or peer-reviewed in some way, e.g., at ResearchBlogging.org or Scienceblogging.org, by virtue of being hand-picked and invited to join one of the science blogging networks (which are often run by traditional media outlets, scientific publishers or societies), or simply by gaining the respect of peers over time.
Case by case, old-time scientists are learning. Note how both in the case of Roosevelts on toilets and the Arsenic bacteria the initially stunned scientists quickly learned and appreciated the new way of communication.
In other words, scientists are slowly starting to get out of the cocoon. Instead of just communicating to their peers behind the closed doors, now they are trying to reach out to the lay audience as well.
As more and more papers are Open Access and can be read by all, they are becoming more readable (as I predicted some years ago). The traditional format of the paper is changing. Scientists are thus covering the "let me explain" portion better, both in papers and on their own blogs.
They may still be a little clumsy about the "what's new" part, over-relying on the traditional media to do it for them via press releases and press conferences (see Darwinius and arsenic bacteria for good examples) instead of doing it themselves or taking control of the message. (They do need to rely on the MSM to some extent, due to the distinction between push and pull strategies: media brands still serve many people as proxies for trustworthy sources.)
But most importantly, they are now again adding the phatic aspect to their communication, revealing a lot of their personality on social networks and blogs, with some even venturing to do it in scientific papers.
By combining all three aspects of good communication, scientists will once again regain the trust of their audience. And what they are starting to do looks more and more like (pre-20th century) journalism.
Journalists, becoming scientists
On the other side of the divide, there is a renewed interest in journalism expanding from just "this is new" to "let me explain how the world works". There are now efforts to build a future of context, and to design explainers.
If you are not well informed on an issue (perhaps because you are too young to remember when it first began, or the issue just started being relevant to you), following a stream of 'what is new' articles will not enlighten you. There is not sufficient information there. There is a lot of tacit knowledge that the writer assumes the readers possess - but many don't.
There has to be a way for news items to link to some kind of collection of background information - an 'explainer'. Such an explainer would be a collection of verifiable facts about the topic. And a collection of verifiable facts about the way the world works is... scientific information!
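The idea above can be sketched in a few lines of code (all names and entries here are hypothetical, invented purely to illustrate the structure): each "what's new" item carries topic tags, and each tag resolves to an explainer - a curated collection of verifiable background facts.

```python
# Hypothetical sketch: news items linking to explainers via topic tags.
explainers = {
    "arsenic-bacteria": [
        "Arsenic sits directly below phosphorus in the periodic table.",
        "DNA ordinarily uses a phosphate backbone.",
    ],
    "lunar-eclipse": [
        "A total lunar eclipse occurs when Earth's shadow fully covers the Moon.",
    ],
}

def background_for(tags):
    """Gather the background facts a reader needs before the news item."""
    facts = []
    for tag in tags:
        facts.extend(explainers.get(tag, []))  # unknown tags yield nothing
    return facts

for fact in background_for(["arsenic-bacteria"]):
    print(fact)
```

A news site could attach such a lookup to every story, so a reader who lacks the tacit knowledge gets it on demand instead of having to reconstruct it from a stream of news items.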
With more and more journalists realizing they need to be transparent about where they are coming from, and injecting personality into their work in order to build trust, some of that phatic language is starting to seep in, completing the trio of elements of effective communication.
Data Journalism - isn't this science?
Some of the best journalism of the past - yes, the abominable 20th century - was done when a reporter was given several months to work on a single story, sifting through boxes and boxes of documents. The reporter became the expert on the topic, started noticing patterns, and wrote a story that brought truly new knowledge to the world. That is practically science! Perhaps not the hardest of the hard sciences like physics, but as good as well-done social science - cultural anthropology, sociology or ethnography. There is a system and a method, very much like the scientific method.
Unfortunately, most reporters are not given such a luxury. They have to take shortcuts - interviewing a few sources to quote for the story. The sources are, of course, a very small and very unrepresentative sample of the relevant population - pulled from a Rolodex. Call a couple of climate scientists and a couple of denialists, grab a quote from each, and stick them into a formulaic article. That is Bad Science as well as Bad Journalism. And now that the people formerly known as the audience, including people with expertise on the topic, have the tools to communicate with the world, they often swiftly point out how poorly such articles represent reality.
But today, most information, data and documents are digital, not in boxes. They are likely to be online and can be accessed without travel and without special permissions (though one may have to steal them, as WikiLeaks does - a perfect example of the new data journalism). Those reams of data can be analyzed by computers to find patterns, and also combed by small armies of journalists (and other experts) for patterns and pieces of information that computer programs miss.
This is what bioinformaticians do (and they have already built tools to do it - contact them, steal their tools!).
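As a toy illustration of that kind of machine-assisted pattern-finding (the corpus and the scoring here are invented for the example, not taken from any real tool), one might surface terms that are over-represented in one document relative to the rest of the pile - a crude first signal of where a story might be hiding:

```python
from collections import Counter
import re

# Toy stand-in for a pile of digitized documents.
corpus = {
    "doc1.txt": "budget approved budget deficit council budget",
    "doc2.txt": "school board meeting minutes council",
    "doc3.txt": "council zoning variance hearing",
}

def tokenize(text):
    return re.findall(r"[a-z]+", text.lower())

# Corpus-wide term counts serve as the background frequency.
background = Counter()
for text in corpus.values():
    background.update(tokenize(text))

def distinctive_terms(doc_name, top_n=3):
    """Rank terms by how over-represented they are in one document
    relative to the whole corpus."""
    counts = Counter(tokenize(corpus[doc_name]))
    scored = {t: c / background[t] for t, c in counts.items()}
    return sorted(scored, key=scored.get, reverse=True)[:top_n]

print(distinctive_terms("doc1.txt"))  # 'budget' ranks first
```

Real data-journalism and bioinformatics pipelines are far more sophisticated, but the shape is the same: count, compare against a background, and flag the outliers for a human to investigate.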
Data journalism. This is what a number of forward-thinking journalists and media organizations are starting to do.
This is science.
On the other hand, a lot of distributed, crowdsourced scientific research, usually called Citizen Science, is in the business of collecting massive amounts of data for analysis. How does that differ from data journalism? Not much.
Look at this scientific paper - Coding Early Naturalists' Accounts into Long-Term Fish Community Changes in the Adriatic Sea (1800–2000). Is this science or data journalism? It is both.
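The general technique in that paper - coding qualitative historical accounts onto an ordinal scale so that change over time becomes measurable - can be sketched roughly like this (the scale, species and records below are made up for illustration, not taken from the paper):

```python
# Hypothetical ordinal coding of qualitative abundance terms.
ABUNDANCE_SCALE = {"absent": 0, "rare": 1, "common": 2, "abundant": 3}

# Invented example records: (year, species, term used in the account).
records = [
    (1820, "sturgeon", "abundant"),
    (1900, "sturgeon", "common"),
    (1980, "sturgeon", "rare"),
]

def coded_series(species):
    """Turn qualitative historical accounts into an ordinal time series."""
    return [(year, ABUNDANCE_SCALE[term])
            for year, sp, term in records if sp == species]

print(coded_series("sturgeon"))  # [(1820, 3), (1900, 2), (1980, 1)]
```

Once the narrative accounts are coded this way, standard statistics can be run on two centuries of observations - which is exactly the move that makes old naturalists' prose usable as data.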
The two domains of communicating about what is new and how the world works - journalism and science - have fused again. Both are now starting to get done by teams that involve both professionals and amateurs. Both are now led by personalities who are getting well-known in the public due to their phatic communication in a variety of old and new media.
It is important to be aware of the shortness of our lives, and thus of our natural tendency toward historical myopia. Just because we were born in the 20th century does not mean that the way things were done then is the way things were 'always done', or the best way to do things - the pinnacle of cultural and social development. The 20th century was just a strange and deviant blip in the course of history.
As we leave the 20th century behind, with all of its unusual historical quirks, we are going back to an older model of communicating facts - but with the new tools we can do it better than ever, including a much broader swath of society: a more democratic system than ever.
By the way, while it's still cold, the rain has stopped. And that is Metaphorical language...
This article was commissioned by Science Progress and will also appear on their site in 24 hours.
19 Comments
1. AndiKuszewski 01:48 PM 12/20/10
It's amazing (yet unsurprising) how easily scientists can fall into that low-effort, blind acceptance state when presented with scientific data as long as it is peer-reviewed. Additionally, if it isn't peer-reviewed, it gets rejected without so much as a second glance. Watson & Crick didn't go through peer review?! Then the whole field of genetics must all be wrong! Built on a false premise! My mind is blown!!
I feel that we go through cycles of communication and critical thinking/questioning of the methods and systems like you mentioned in the explanation of early newspapers and the public becoming the editors.
Once a method of review is accepted by the public as valid, people get lazy, sit back, and stop questioning the actual data. Inevitably, the sources providing the data get lazy as well, and over time the data can't be guaranteed to have that same level of purity - yet the public still accepts it, out of blind acceptance of the messenger. Eventually, people start protesting. I think we are starting to see a backlash against the unconditional acceptance of peer review, which is a very good thing.
I gave a presentation this past summer on ways to maximize your cognitive potential, and one of the things I brought up was how an over-reliance on modern tech (such as GPS) to do our critical thinking for us can result, over time, in a weakening of those cognitive skills. I fear the same is happening in a broader sense with how we think about science: we stop using our critical thinking skills, and they become less sharp. Our sciencey radar system weakens, and we are less effective gatekeepers of scientific data.
Science has become so methodized that we are inclined to accept the process without questioning why we are using that approach in the first place. The small group that continues to question often gets punished or mocked. Creativity and critical thinking have been suffering for the last decade or so, but I think that with the addition of modern technology, social networks, and transparency, things have already begun to change, and it won't be stopped, much as traditional scientists may resist.
The fact that there is so much controversy, friction and discord right now in science, science communication, and journalism in general, can only mean progress. It means the Revolution is already taking place, and change is inevitable. These are all good things.
Also, I love the juxtaposition of old & new methods of science communication - using the basic ideas in a new medium to achieve higher goals.
Loved this piece!
2. billsmith 02:31 PM 12/20/10
Bravo. The distinction between communication of trustworthiness, explanation, and novelty is an excellent one. I hope all Scientific American writers study this essay.
Whether it's recent political events or discoveries in medical science, I often feel like a newspaper article is just pointless fluff added onto the first paragraph. There is no greater context of what was known before and how this new information fits into a bigger picture. (And yet I read them anyway. How sad.)
If you write that a dictator suffers a coup, explain to me in detail who was upset with him and why and explain how similar upheavals affected other countries. If a scientific study seems to contradict one that you wrote about last month, explain how one of the studies is irrelevant, or how they are both part of a more complicated truth.
Science Daily is one of my favorite sources for accurately reported science news, but their "Related Stories" sidebar still often amusingly provides evidence that the "breakthrough" reported by a university press office is just one piece of a decade-long puzzle.
I forgive them the lapses, though, partly because their writers and "Related Stories" often provide a fair amount of context. But more than that, the links to the ACTUAL PAPERS being reported on are highly reassuring and informative.
I hope that Scientific American writers will try to include accurate context whenever possible. I also hope that they will realize the power of transparency and without exception provide links to the original journal articles, abstracts, and researchers' websites. (Thank you to those who have started doing this more often.)
3. EricMJohnson 05:27 PM 12/20/10
As someone with training in science, journalism, and (now) history I think this is a wonderfully synthesized and provocative piece of writing. Too many professionals in journalism as well as science (and the editors in both fields especially) have come to fear phatic language because it reveals that both endeavors are human activities with all of the failings that come along with that. By emphasizing conceptual language, as you say, it merely gives the illusion of objectivity. But we are all biased. Good scientists and good journalists use different tools but their shared goal is to try and move beyond their personal biases and report on the world as it really is. Trust is built through hard work and reputation and it must be constantly earned. It is an individual process and can't be shifted to the anonymous entity of "brand authority" (whether that is Nature, The New York Times, or BBC News).
I would argue that the commitment to promoting brand authority actually serves to undermine critical thinking. Readers turn off their BS filters because the brands have convinced them that the critical thinking has already been done. But when readers must enter into conflicting interpretations it forces them to make up their own mind about the issues based on the integrity of the source and the veracity of the information (and then, perhaps, even write about it themselves which continues to promote the critical thinking process). Blogs fill an important gap in our media ecosystem, one that has increasingly become a monoculture as more and more sources of information are controlled by a handful of media conglomerates.
The critique by some traditional journalists about how some blogs are poor quality misses the point. Just look at the poor quality of the mainstream media today! When Judith Miller can promote known falsehoods about Iraqi WMDs using anonymous sources in the Times, when Fox News receives the highest ratings and is the most trusted news organization despite the fact that viewers are more likely to be woefully misinformed (according to a recent WPO report: http://bit.ly/fNeK1b) the problem is not one of media platform, it is one of media literacy. The traditional media gatekeepers have become less interested in what is true and more interested in what is good for the bottom line (as well as continued access to institutions of power: http://huff.to/an9dAB). We are currently in the Golden Age of media and, I for one, am very hopeful about what this means for scientific literacy as well as for democracy in general.
4. xprof 08:12 PM 12/20/10
When writing about science and journalism, how can one omit reference to science journalists like Natalie Angier and Olivia Judson who are equally at home in the field, the laboratory and at the typewriter, and who superbly articulate scientific concepts and discoveries in a stimulating and informative way?
Leo Toribio
Pittsburgh, PA
5. Bora Zivkovic in reply to xprof 08:56 PM 12/20/10
Excellent writers like that have been mentioned under the link "good science writers are rare" up there. I did not say they don't exist, but that they are rare, and then went on to describe how they come about: they either spend years studying and specializing, or they come to journalism from a scientific background.
6. Bora Zivkovic in reply to AndiKuszewski 08:59 PM 12/20/10
Thank you. I guess in our busy lives it is usually a good heuristic to trust again the people we trusted before. Once we have invested a lot of trust in someone over a prolonged time, it is a difficult realization that they may be wrong - as it implies we are wrong as well, something that is hard to admit.
7. Bora Zivkovic in reply to billsmith 09:00 PM 12/20/10
Thank you. Yes, under pressure from a number of bloggers over time, it is good to see that some media organizations, e.g., BBC and Reuters, have started to link to primary papers more routinely. Let's hope more do.
8. Bora Zivkovic in reply to EricMJohnson 09:04 PM 12/20/10
Thank you. You make a number of excellent points here. I agree that this is a Golden Age of sorts - never before in history did we have such quantity and quality of science coverage, and such ease of access to it by so many people around the world.
9. clrbear430 09:10 PM 12/20/10
Spellcheck, spellcheck!! There are a number of errors in the above Observation. For example:
Charles Darwin did that, for example (well, if I remember correctly, his wife made copies from his illegible originals into something that recipients could actually read), which is why we have such a complete understanding of his work and thought - it is all well preserved and the availability of such voluminouos correspondence gave rise to a small industry of Darwinian historical scholarship. [Voluminous?]
**I do like the abundance of links in this post.**
Just because we were born in the 20th century does not mean that the way things were done then are the way things were 'always done', or the best ways to do things - the pinnacle of cultural and social development.
Don't assume everyone reading this post is older than 10!! I started to read interesting articles from the Internet approximately when I was 8.
Claire Binkley
West Chester, PA
10. Bora Zivkovic in reply to clrbear430 09:21 PM 12/20/10
Thank you. An occasional typo on a blog is actually a part of phatic communication - it is a demonstration that the writer spoke directly, with no filters.
And in such situations typos happen - one becomes typo-blind after spending a lot of time with one's own piece of text. A fresh set of eyes (copyeditor if lucky, spouse more usually) can find them, but that adds one more layer of distance between the writer and the reader.
Which is why blog-readers are so tolerant of typos, even if they vocally point them out in works published by corporate media.
And here's looking into someone's crystal ball: http://www.niemanlab.org/category/themes/predictions-2011/
Series: Predictions for Journalism 2011
To close out 2010, we asked some of the smartest people we know to predict what 2011 will bring for the future of journalism.
Dec. 13: What will 2011 bring for journalism? Clay Shirky predicts widespread disruptions for syndication
Dec. 13: The great paywall debate: Will The New York Times’ new model work?
Dec. 13: Steven Brill: 2011 will bring ebook battles, paywall successes, and a new model for long-form articles
Dec. 14: Better curation on Twitter, pushback against anonymity, and more new startups: Predictions for 2011
Dec. 14: Smartphone growth, Murdoch’s Daily, and journalism for the poor: Predictions for mobile news in 2011
Dec. 15: In-car app stores, success for Xinhua, and more social media: Predictions for journalism in 2011
Dec. 15: Coming soon to journalism: Matt Thompson sees the “Speakularity” and universal instant transcription
Dec. 15: Dave Winer: There’s no good place for a new Maginot Line for the news
Dec. 16: Jonathan Stray: In 2011, news orgs will finally start to move past the borders of their own content
Dec. 16: Gawker copycats, luxurious print, more robots, and a new blogging golden age: Predictions for 2011
Dec. 17: DDoS attacks on the U.S. media, Twitter history searching, and a big blog deal: More predictions for 2011
Dec. 17: Jason Fry: A blow to content farms, Facebook’s continued growth, and the continued pull of the open web
What will 2011 bring for journalism? Clay Shirky predicts widespread disruptions for syndication
By Clay Shirky / Dec. 13 / 10 a.m. / 19 comments
Editor’s Note: To mark the end of the year, we at the Lab decided to ask some of the smartest people we know what they thought 2011 would bring for journalism. We’re very pleased that so many of them agreed to share their predictions with us.
Over the next few days, you’ll hear from Steve Brill, Vivian Schiller, Michael Schudson, Markos Moulitsas, Kevin Kelly, Geneva Overholser, Adrian Holovaty, Jakob Nielsen, Evan Smith, Megan McCarthy, David Fanning, Matt Thompson, Bob Garfield, Matt Haughey, and more.
We also want to hear your predictions: take our Lab reader poll and tell us what you think we’ll be talking about in 2011. We’ll share those results later this week.
To start off our package of predictions, here’s Clay Shirky. Happy holidays.
The old news business model has had a series of shocks in the 15 or so years we’ve had a broadly adopted public web. The first was the loss of geographic limits to competition (every outlet could reach any reader, listener or viewer). Next was the loss of progressive layers of advertising revenue (the rise of Monster and craigslist et alia, as well as the “analog dollars to digital dimes” problem). Then there is the inability to charge readers easily without eviscerating the advertising rate-base (the failure of micropayments and paywalls as general-purpose solutions).
Next up for widespread disruption, I think, is syndication, a key part of the economic structure of the news business since the founding of Havas in the early 19th century. As with so many parts of a news system based on industrial economics, that model is now under pressure.
The great paywall debate: Will The New York Times’ new model work?
By Lois Beckett / Dec. 13 / noon / 16 comments
Editor’s Note: We’re wrapping up 2010 by asking some of the smartest people in journalism what the new year will bring.
Many of their predictions centered on what may be the most anticipated business-model shift of 2011: The New York Times’ shift to charging for full access to NYTimes.com next month. We found voices on both sides of the “will it work” debate. Here are Markos Moulitsas, Megan McCarthy, C.W. Anderson, Paddy Hirsch, Jason Fry, Nikki Usher, and Barry Sussman on how they see the metered model shaking out.
Megan McCarthy, editor, Mediagazer
Prediction for 2011: The building up — and subsequent tearing down — of online paywalls for general news sites. The New York Times are planning to implement their paywall in January and I predict it will be modified enough — either by the Times themselves or outside developers — to be rendered irrelevant by March.
C.W. Anderson, assistant professor of media culture, CUNY
Faced with a massive migration of regular readers to the Guardian and the BBC, The New York Times will abandon its recently enacted paywall.
Steven Brill: 2011 will bring ebook battles, paywall successes, and a new model for long-form articles
By Steven Brill / Dec. 13 / 3 p.m. / 4 comments
Editor’s Note: We’re wrapping up 2010 by asking some of the smartest people in journalism what the new year will bring.
Here, Journalism Online cofounder and long-time journalism entrepreneur Steven Brill lays out three predictions for 2011.
1. E-books will continue to soar — and authors will get into major fights with publishers over who gets what percentage of the take, with more top authors withholding their e-book rights and selling them independently or through specialty distributors.
2. Someone — via Press+, I hope — will go into the business of commissioning long-form magazine articles from top writers and providing the first two or three paragraphs online for free and then selling the rest for, say, 75 cents or a dollar. That trailblazing publisher might call these “mini-e-books” and use a business model of simply splitting the revenues with the author, 50-50. My favorite candidates would be website publishers who already have great brand names, such as the Huffington Post or Daily Beast, but that want to revive long-form journalism and make money doing it (and limit risk by making some top writers their 50-50 business partners, rather than pay high flat fees for their work.)
Better curation on Twitter, pushback against anonymity, and more new startups: Predictions for 2011
By Lois Beckett / Dec. 14 / 10 a.m. / No comments
Editor’s Note: We’re wrapping up 2010 by asking some of the smartest people in journalism what the new year will bring.
Below are predictions from Bob Giles, Alan Murray, David Beard, Geneva Overholser, Alan D. Mutter, Melissa Ludtke, Brooke Kroeger, Jan Schaffer, and Ory Okolloh.
We also want to hear your predictions: take our Lab reader poll and tell us what you think we’ll be talking about in 2011. We’ll share those results later this week.
Bob Giles, curator, Nieman Foundation for Journalism at Harvard
Newspaper companies will regret the deep cutting of newsgathering resources as the economy recovers and advertisers conclude that local newspapers are no longer vital sources of community coverage. Moreover, newspapers will follow their historical pattern of being slow to adapt to what’s new — in this case, opportunities offered by the iPad and other tablets.
Geneva Overholser, director, USC Annenberg’s School of Journalism
This will be the year when collaboration finally, truly, really takes hold. Smart legacy media leaders will determine what they and they alone can do best, then ally themselves with others who can supply the rest. Radio, TV, web-based publications, print publications, bloggers, international and national news providers, journalism schools, nonprofits, and commercial media — the smart ones will figure out their niche and how to partner (strategically) with others to be sure their work is seen. The public will be the biggest beneficiary.
Smartphone growth, Murdoch’s Daily, and journalism for the poor: Predictions for mobile news in 2011
By Lois Beckett / Dec. 14 / noon / 1 comment
Editor’s Note: We’re wrapping up 2010 by asking some of the smartest people in journalism what the new year will bring.
One of the common threads through many of their predictions was mobile — the impact smartphones and tablets and apps will have on how news is reported, produced, distributed, and consumed. (Not to mention how it’s paid for.) Here are Vivian Schiller, Keith Hopper, Jakob Nielsen, Alexis Madrigal, Michael Andersen, Richard Lee Colvin, Megan McCarthy, David Cohn, and David Fanning on what 2011 will bring for the mobile space.
Vivian Schiller, president and CEO, NPR
After two decades of saying that “this is the year of mobile,” 2011 really will be the year of mobile.
Michael Andersen, editor, Portland Afoot
My wild prediction: 2011 will be the year of media initiatives that serve poor and middle-income people.
For 20 years, almost all native Internet content has been made for the niche interests — often the professional interests — of people who make more than the median household income of $50,000 or so. But one of the best things about the mobile Internet is that it’s finally killing (or even reversing) the digital divide.
Poor folks may not have broadband, but they’ve got cell phones. African Americans and Latinos are more likely than white people to use phones for the web, pictures, texts, emails, games, videos, and social networking. As hardware prices keep falling, we’ll see more and more demand for information that is useful to the lower-income half of the population — and thanks to low marginal costs, people will be creating products that fill that need. It’s about damn time, wouldn’t you say?
In-car app stores, success for Xinhua, and more social media: Predictions for journalism in 2011
By Lois Beckett / Dec. 15 / noon / No comments
Editor’s Note: We’re wrapping up 2010 by asking some of the smartest people in journalism what the new year will bring.
Below are predictions from Paul Bass, John Paton, Philip Balboni, Martin Moore, Mark Luckie, Adrian Monck, Ken Doctor, Keith Hopper, and Vivian Schiller.
We also want to hear your predictions: take our Lab reader poll and tell us what you think we’ll be talking about in 2011. We’ll share those results later this week.
Paul Bass, founder, New Haven Independent
Every city of 100,000 or more in America will have its own online-only daily local news site.
Local governments will create their own “news” sources online to try to control the message and compete with new media and compensate for the decline of old media channels.
Newspapers, TV and radio stations, and online news outlets will collaborate on a bigger scale on local coverage and events.
Vivian Schiller, president and CEO, NPR
“Local” takes center stage in online news, as newspaper sites, Patch, Yahoo, NPR member stations and new start ups (not for profit and for profit) form alliances, grow, and compete for audience and revenue online.
Twitter and Facebook become established as journalism platforms for newsgathering, distribution and engagement.
In-car Internet radio becomes a hot media topic, though penetration of enabled cars will lag by a few years.
Coming soon to journalism: Matt Thompson sees the “Speakularity” and universal instant transcription
By Matt Thompson / Dec. 15 / 1 p.m. / 9 comments
Editor’s Note: We’re wrapping up 2010 by asking some of the smartest people in journalism what the new year will bring.
We also want to hear your predictions: Take our Lab reader poll and tell us what you think we’ll be talking about in 2011. We’ll share those results later this week.
Here’s Matt Thompson, he of Newsless, Snarkmarket, and NPR fame.
At some point in the near future, automatic speech transcription will become fast, free, and decent. And this moment — let’s call it the Speakularity — will be a watershed moment for journalism.
So much of the raw material of journalism consists of verbal exchanges — phone conversations, press conferences, meetings. One of journalism’s most significant production challenges, even for those who don’t work at a radio company, is translating these verbal exchanges into text to weave scripts and stories out of them.
After the Speakularity, much more of this raw material would become available. It would render audio recordings accessible to the blind and aid in translation of audio recordings into different languages. Obscure city meetings could be recorded and auto-transcribed; interviews could be published nearly instantly as Q&As; journalists covering events could focus their attention on analyzing rather than capturing the proceedings.
Dave Winer: There’s no good place for a new Maginot Line for the news
By Dave Winer / Dec. 15 / 2 p.m. / 6 comments
Editor’s Note: We’re wrapping up 2010 by asking some of the smartest people in journalism what the new year will bring.
Today, it’s web pioneer Dave Winer, a man key to the evolution of many of the publishing technologies we use online today, currently a visiting scholar in journalism at NYU, and half the team behind the Rebooting the News podcast.
When people in the news business try to figure out how to make news pay after the Internet, it seems analogous to the French, after being invaded by Germany in World War II, trying to figure out where to put the new Maginot Line.
The Maginot Line would have been a perfect defense in World War I. It didn’t help much in the second war.
Analogously, there was a perfect paywall in the pre-Internet news business, the physical product of a newspaper. There is no equivalent in the new distribution system.
Keep reading »
Tags: charging, commerce, CraigsList, deal brokering, Groupon, Maginot Line, paywall
Series: Predictions for Journalism 2011
Jonathan Stray: In 2011, news orgs will finally start to move past the borders of their own content
By Jonathan Stray / Dec. 16 / 11 a.m. / 3 comments
Editor’s Note: We’re wrapping up 2010 by asking some of the smartest people in journalism what the new year will bring.
Today, our predictor is Jonathan Stray, interactive technology editor for the Associated Press and a familiar byline here at the Lab. His subject: the building of new multi-source information products, and whether it’ll be news organizations that do the building.
2011 will be the year that news organizations finally start talking about integrated products designed to serve the complete information needs of consumers, but it won’t be the year that they ship them.
News used to be more or less whatever news organizations published and broadcast. With so many other ways to find out about the world, this is no longer the case. Professional journalism has sometimes displayed an antagonistic streak towards blogs, Wikipedia, and social media of all types, but it’s no longer possible to deny that non-journalism sources of news are exciting and useful to people.
Unencumbered by such tribalism — and lacking content-creation behemoths of its own — the information technology industry has long understood the value of curating multiple sources, including traditional news content. Google web search was the first truly widespread digital public information system. RSS allowed readers to assemble their own news feeds. Mid-decade, Wikipedia exploded into one of the top ten sites on the web, used as much for news as for reference. The business practices of news aggregators angered publishers, but there’s no getting around the fact that they are tremendously useful tools. The most recent change in information distribution is social. Twitter has become an entirely new form of news network, while Facebook wants media organizations to use its social infrastructure to reach users.
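The RSS feeds mentioned above are plain XML, which is a large part of why readers could assemble their own news streams so easily. A minimal sketch of pulling item titles out of an RSS 2.0 document (the feed content below is invented for illustration):

```python
# Minimal sketch of reading an RSS 2.0 feed, the format that let readers
# assemble their own news feeds. The feed content here is invented.
import xml.etree.ElementTree as ET

FEED = """<?xml version="1.0"?>
<rss version="2.0">
  <channel>
    <title>Example News</title>
    <item><title>Prediction one</title><link>http://example.com/1</link></item>
    <item><title>Prediction two</title><link>http://example.com/2</link></item>
  </channel>
</rss>"""

def item_titles(feed_xml):
    """Return the title of every <item> in an RSS 2.0 document."""
    root = ET.fromstring(feed_xml)
    return [item.findtext("title") for item in root.iter("item")]

print(item_titles(FEED))  # ['Prediction one', 'Prediction two']
```

A reader (or aggregator) merging several such feeds is, in miniature, the multi-source curation the paragraph describes.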
Keep reading »
Tags: aggregation, curation, Facebook, Flipboard, Google, information needs, service, Wikipedia
Series: Predictions for Journalism 2011
Gawker copycats, luxurious print, more robots, and a new blogging golden age: Predictions for 2011
By Lois Beckett / Dec. 16 / 1 p.m. / 2 comments
Editor’s Note: We’re wrapping up 2010 by asking some of the smartest people in journalism what the new year will bring.
Below are predictions from Susan Orlean, Joe Grimm, Matt Haughey, Adrian Holovaty, Megan McCarthy, Mark Potts, Jake Shapiro, and Cody Brown.
We also want to hear your predictions: take our Lab reader poll and tell us what you think we’ll be talking about in 2011. We’ll share those results in a couple days.
Susan Orlean, Twitter artist, and staff writer, The New Yorker
We’ll be reading more on our phones, our iPads, and our Super Scout Decoder rings by the end of next year.
Several magazines — maybe Time or Newsweek or both — will go monthly and/or digital only. But there will be new magazine startups in print that will be luxurious and expensive and book-like. 2011 will be the year of those two forms making themselves distinct; things online will become more webby, and print publications will become more “collectible” and classic.
Journalism schools will offer a “web producer” major.
The last typewriter living in the wild will be captured, its DNA sequenced, and then it will be humanely destroyed.
Keep reading »
Tags: Adrian Holovaty, Cody Brown, Jake Shapiro, Joe Grimm, Mark Potts, Matt Haughey, Megan McCarthy, Susan Orlean
Series: Predictions for Journalism 2011
DDoS attacks on the U.S. media, Twitter history searching, and a big blog deal: More predictions for 2011
By Lois Beckett / Dec. 17 / 11 a.m. / 2 comments
Editor’s Note: We’re wrapping up 2010 by asking some of the smartest people in journalism what the new year will bring.
Below are predictions from Michael Schudson, Alexis Madrigal, Markos Moulitsas, Joy Mayer, Nicco Mele, Nikki Usher, Steve Buttry, Paddy Hirsch, John Davidow, Ethan Zuckerman, Richard Lee Colvin, and Kevin Kelly.
We also want to hear your predictions: Today’s the last day we’ll be accepting entries in our Lab reader poll, where you tell us what you think we’ll be talking about in 2011. We’ll share those results in a couple days.
Michael Schudson, historian and sociologist, Columbia Journalism School
Prognosticating about the news media in these times is a risky business, but I’ll try one nonetheless: In 2011, none of the 250 largest U.S. cities will stop publishing (on paper) its last remaining daily newspaper. Cities with more than one daily newspaper may be reduced to one survivor.
Alexis Madrigal, senior editor at The Atlantic and co-founder, Longshot Magazine
One of the truly important big city papers will go digital-only.
Kevin Kelly, author and founder, Wired Magazine
Twitter will go down for 36 hours. The ensuing media attention will prompt a 10 percent increase in signups in the months following.
Keep reading »
Tags: Alexis Madrigal, Ethan Zuckerman, John Davidow, Joy Mayer, Kevin Kelly, Markos Moulitsas, Michael Schudson, Nicco Mele, Nikki Usher, Paddy Hirsch, Richard Colvin, Steve Buttry
Series: Predictions for Journalism 2011
Jason Fry: A blow to content farms, Facebook’s continued growth, and the continued pull of the open web
By Jason Fry / Dec. 17 / 1 p.m. / 3 comments
Editor’s Note: We’re wrapping up 2010 by asking some of the smartest people in journalism what the new year will bring. Today, our predictor is Jason Fry, a familiar byline at the Lab. Jason also prognosticated earlier this week about the potential success of the NYT paywall.
Hyperlocal will remain stubbornly small scale. Large-scale efforts at cracking hyperlocal will seed more news organizations with content, but that content will remain mostly aggregation and data and still feel robotic and cold. Meanwhile, small-scale hyperlocal efforts will continue to win reader loyalty, but struggle to monetize those audiences. By the end of 2011, the most promising developments in hyperlocal will come from social media. Promising efforts to identify and leverage localized news and conversation in social media will be the buzz of late 2011, and we’ll be excited to think that social media is proving an excellent stepping stone to greater involvement in our physical communities.
Google will deal the content farms a big blow by tweaking its algorithms to drive down their search rankings. But the company will be opaque to the point of catatonia about exactly what it did and why it did it, reflecting its reluctance to be drawn into qualitative judgments about content. There will be talk of lawsuits by the spurned content farms and no small amount of jawing about Google’s power, lack of transparency, and whether or not it’s being evil. But even those worried about Google’s actions will admit that search is a much better experience now that results are less cluttered with horribly written crap.
Keep reading »
Tags: content farms, Facebook, Google, hyperlocal, search engine optimization, social media, tablets
The Nieman Journalism Lab is a collaborative attempt to figure out how quality journalism can survive and thrive in the Internet age. (More.)
Series: Predictions for Journalism 2011
To close out 2010, we asked some of the smartest people we know to predict what 2011 will bring for the future of journalism.
Dec. 13: What will 2011 bring for journalism? Clay Shirky predicts widespread disruptions for syndication
Dec. 13: The great paywall debate: Will The New York Times’ new model work?
Dec. 13: Steven Brill: 2011 will bring ebook battles, paywall successes, and a new model for long-form articles
Dec. 14: Better curation on Twitter, pushback against anonymity, and more new startups: Predictions for 2011
Dec. 14: Smartphone growth, Murdoch’s Daily, and journalism for the poor: Predictions for mobile news in 2011
Dec. 15: In-car app stores, success for Xinhua, and more social media: Predictions for journalism in 2011
Dec. 15: Coming soon to journalism: Matt Thompson sees the “Speakularity” and universal instant transcription
Dec. 15: Dave Winer: There’s no good place for a new Maginot Line for the news
Dec. 16: Jonathan Stray: In 2011, news orgs will finally start to move past the borders of their own content
Dec. 16: Gawker copycats, luxurious print, more robots, and a new blogging golden age: Predictions for 2011
Dec. 17: DDoS attacks on the U.S. media, Twitter history searching, and a big blog deal: More predictions for 2011
Dec. 17: Jason Fry: A blow to content farms, Facebook’s continued growth, and the continued pull of the open web
Series: Predictions for Journalism 2011
What will 2011 bring for journalism? Clay Shirky predicts widespread disruptions for syndication
By Clay Shirky / Dec. 13 / 10 a.m. / 19 comments
Editor’s Note: To mark the end of the year, we at the Lab decided to ask some of the smartest people we know what they thought 2011 would bring for journalism. We’re very pleased that so many of them agreed to share their predictions with us.
Over the next few days, you’ll hear from Steve Brill, Vivian Schiller, Michael Schudson, Markos Moulitsas, Kevin Kelly, Geneva Overholser, Adrian Holovaty, Jakob Nielsen, Evan Smith, Megan McCarthy, David Fanning, Matt Thompson, Bob Garfield, Matt Haughey, and more.
We also want to hear your predictions: take our Lab reader poll and tell us what you think we’ll be talking about in 2011. We’ll share those results later this week.
To start off our package of predictions, here’s Clay Shirky. Happy holidays.
The old news business model has had a series of shocks in the 15 or so years we’ve had a broadly adopted public web. The first was the loss of geographic limits to competition (every outlet could reach any reader, listener or viewer). Next was the loss of progressive layers of advertising revenue (the rise of Monster and craigslist et alia, as well as the “analog dollars to digital dimes” problem). Then there is the inability to charge readers easily without eviscerating the advertising rate-base (the failure of micropayments and paywalls as general-purpose solutions).
Next up for widespread disruption, I think, is syndication, a key part of the economic structure of the news business since the founding of Havas in the early 19th century. As with so many parts of a news system based on industrial economics, that model is now under pressure.
Keep reading »
Tags: aggregation, Clay Shirky, Jeff Jarvis, Jonathan Stray, linking, Nicholas Carr, syndication
Series: Predictions for Journalism 2011
The great paywall debate: Will The New York Times’ new model work?
By Lois Beckett / Dec. 13 / noon / 16 comments
Editor’s Note: We’re wrapping up 2010 by asking some of the smartest people in journalism what the new year will bring.
Many of their predictions centered on what may be the most anticipated business-model shift of 2011: The New York Times’ shift to charging for full access to NYTimes.com next month. We found voices on both sides of the “will it work” debate. Here are Markos Moulitsas, Megan McCarthy, C.W. Anderson, Paddy Hirsch, Jason Fry, Nikki Usher, and Barry Sussman on how they see the metered model shaking out.
Megan McCarthy, editor, Mediagazer
Prediction for 2011: The building up — and subsequent tearing down — of online paywalls for general news sites. The New York Times is planning to implement its paywall in January, and I predict it will be modified enough — either by the Times itself or by outside developers — to be rendered irrelevant by March.
C.W. Anderson, assistant professor of media culture, CUNY
Faced with a massive migration of regular readers to the Guardian and the BBC, The New York Times will abandon its recently enacted paywall.
Keep reading »
Tags: Barry Sussman, C. W. Anderson, charging, Jason Fry, Journalism Online, Markos Moulitsas, Megan McCarthy, New York Times, Paddy Hirsch, paywall
Series: Predictions for Journalism 2011
Steven Brill: 2011 will bring ebook battles, paywall successes, and a new model for long-form articles
By Steven Brill / Dec. 13 / 3 p.m. / 4 comments
Editor’s Note: We’re wrapping up 2010 by asking some of the smartest people in journalism what the new year will bring.
Here, Journalism Online cofounder and long-time journalism entrepreneur Steven Brill lays out three predictions for 2011.
1. E-books will continue to soar — and authors will get into major fights with publishers over who gets what percentage of the take, with more top authors withholding their e-book rights and selling them independently or through specialty distributors.
2. Someone — via Press+, I hope — will go into the business of commissioning long-form magazine articles from top writers, providing the first two or three paragraphs online for free, and then selling the rest for, say, 75 cents or a dollar. That trailblazing publisher might call these “mini-e-books” and use a business model of simply splitting the revenues with the author, 50-50. My favorite candidates would be website publishers who already have great brand names, such as the Huffington Post or the Daily Beast, but want to revive long-form journalism and make money doing it (and limit risk by making some top writers their 50-50 business partners, rather than paying high flat fees for their work).
Keep reading »
Tags: business model, charging, ebooks, long-form journalism, paid content, paywall, Press+, Steve Brill
Series: Predictions for Journalism 2011
Better curation on Twitter, pushback against anonymity, and more new startups: Predictions for 2011
By Lois Beckett / Dec. 14 / 10 a.m. / No comments
Editor’s Note: We’re wrapping up 2010 by asking some of the smartest people in journalism what the new year will bring.
Below are predictions from Bob Giles, Alan Murray, David Beard, Geneva Overholser, Alan D. Mutter, Melissa Ludtke, Brooke Kroeger, Jan Schaffer, and Ory Okolloh.
We also want to hear your predictions: take our Lab reader poll and tell us what you think we’ll be talking about in 2011. We’ll share those results later this week.
Bob Giles, curator, Nieman Foundation for Journalism at Harvard
Newspaper companies will regret the deep cutting of newsgathering resources as the economy recovers and advertisers conclude that local newspapers are no longer vital sources of community coverage. Moreover, newspapers will follow their historical pattern of being slow to adapt to what’s new — in this case, opportunities offered by the iPad and other tablets.
Geneva Overholser, director, USC Annenberg’s School of Journalism
This will be the year when collaboration finally, truly, really takes hold. Smart legacy media leaders will determine what they and they alone can do best, then ally themselves with others who can supply the rest. Radio, TV, web-based publications, print publications, bloggers, international and national news providers, journalism schools, nonprofits, and commercial media — the smart ones will figure out their niche and how to partner (strategically) with others to be sure their work is seen. The public will be the biggest beneficiary.
Keep reading »
Tags: Alan Murray, Alan Mutter, Brooke, David Beard, Geneva Overholser, Jan Schaffer, Melissa Ludtke, Ory Okolloh
Series: Predictions for Journalism 2011
Smartphone growth, Murdoch’s Daily, and journalism for the poor: Predictions for mobile news in 2011
By Lois Beckett / Dec. 14 / noon / 1 comment
Editor’s Note: We’re wrapping up 2010 by asking some of the smartest people in journalism what the new year will bring.
One of the common threads through many of their predictions was mobile — the impact smartphones and tablets and apps will have on how news is reported, produced, distributed, and consumed. (Not to mention how it’s paid for.) Here are Vivian Schiller, Keith Hopper, Jakob Nielsen, Alexis Madrigal, Michael Andersen, Richard Lee Colvin, Megan McCarthy, David Cohn, and David Fanning on what 2011 will bring for the mobile space.
Vivian Schiller, president and CEO, NPR
After two decades of saying that “this is the year of mobile,” 2011 really will be the year of mobile.
Michael Andersen, editor, Portland Afoot
My wild prediction: 2011 will be the year of media initiatives that serve poor and middle-income people.
For 20 years, almost all native Internet content has been made for the niche interests — often the professional interests — of people who make more than the median household income of $50,000 or so. But one of the best things about the mobile Internet is that it’s finally killing (or even reversing) the digital divide.
Poor folks may not have broadband, but they’ve got cell phones. African Americans and Latinos are more likely than white people to use phones for the web, pictures, texts, emails, games, videos, and social networking. As hardware prices keep falling, we’ll see more and more demand for information that is useful to the lower-income half of the population — and thanks to low marginal costs, people will be creating products that fill that need. It’s about damn time, wouldn’t you say?
Keep reading »
Tags: Alexis Madrigal, apps, David Cohn, David Fanning, iPad, iPhone, Jakob Nielsen, Keith Hopper, Megan McCarthy, Michael Andersen, mobile, news apps, Richard Colvin, Vivian Schiller
Series: Predictions for Journalism 2011
In-car app stores, success for Xinhua, and more social media: Predictions for journalism in 2011
By Lois Beckett / Dec. 15 / noon / No comments
Editor’s Note: We’re wrapping up 2010 by asking some of the smartest people in journalism what the new year will bring.
Below are predictions from Paul Bass, John Paton, Philip Balboni, Martin Moore, Mark Luckie, Adrian Monck, Ken Doctor, Keith Hopper, and Vivian Schiller.
We also want to hear your predictions: take our Lab reader poll and tell us what you think we’ll be talking about in 2011. We’ll share those results later this week.
Paul Bass, founder, New Haven Independent
Every city of 100,000 or more in America will have its own online-only daily local news site.
Local governments will create their own “news” sources online to try to control the message and compete with new media and compensate for the decline of old media channels.
Newspapers, TV and radio stations, and online news outlets will collaborate on a bigger scale on local coverage and events.
Vivian Schiller, president and CEO, NPR
“Local” takes center stage in online news, as newspaper sites, Patch, Yahoo, NPR member stations, and new startups (nonprofit and for-profit) form alliances, grow, and compete for audience and revenue online.
Twitter and Facebook become established as journalism platforms for newsgathering, distribution and engagement.
In-car Internet radio becomes a hot media topic, though penetration of enabled cars will lag by a few years.
Keep reading »
Tags: Adrian Monck, John Paton, Keith Hopper, Ken Doctor, Mark Luckie, Martin Moore, Paul Bass, Philip Balboni, predictions, Vivian Schiller, Xinhua
Series: Predictions for Journalism 2011
Coming soon to journalism: Matt Thompson sees the “Speakularity” and universal instant transcription
By Matt Thompson / Dec. 15 / 1 p.m. / 9 comments
Editor’s Note: We’re wrapping up 2010 by asking some of the smartest people in journalism what the new year will bring.
We also want to hear your predictions: Take our Lab reader poll and tell us what you think we’ll be talking about in 2011. We’ll share those results later this week.
Here’s Matt Thompson, he of Newsless, Snarkmarket, and NPR fame.
At some point in the near future, automatic speech transcription will become fast, free, and decent. And this moment — let’s call it the Speakularity — will be a watershed moment for journalism.
So much of the raw material of journalism consists of verbal exchanges — phone conversations, press conferences, meetings. One of journalism’s most significant production challenges, even for those who don’t work at a radio company, is translating these verbal exchanges into text to weave scripts and stories out of them.
After the Speakularity, much more of this raw material would become available. It would render audio recordings accessible to the blind and aid in translation of audio recordings into different languages. Obscure city meetings could be recorded and auto-transcribed; interviews could be published nearly instantly as Q&As; journalists covering events could focus their attention on analyzing rather than capturing the proceedings.
Keep reading »
Tags: data, search, transcription, video, YouTube
Series: Predictions for Journalism 2011
Dave Winer: There’s no good place for a new Maginot Line for the news
By Dave Winer / Dec. 15 / 2 p.m. / 6 comments
Editor’s Note: We’re wrapping up 2010 by asking some of the smartest people in journalism what the new year will bring.
Today, it’s web pioneer Dave Winer, a man key to the evolution of many of the publishing technologies we use online today, currently a visiting scholar in journalism at NYU, and half the team behind the Rebooting the News podcast.
When people in the news business try to figure out how to make news pay after the Internet, it seems analogous to the French, after being invaded by Germany in World War II, trying to figure out where to put the new Maginot Line.
The Maginot Line would have been a perfect defense in World War I. It didn’t help much in the second war.
Analogously, there was a perfect paywall in the pre-Internet news business, the physical product of a newspaper. There is no equivalent in the new distribution system.
Keep reading »
Tags: charging, commerce, CraigsList, deal brokering, Groupon, Maginot Line, paywall
Series: Predictions for Journalism 2011
Jonathan Stray: In 2011, news orgs will finally start to move past the borders of their own content
By Jonathan Stray / Dec. 16 / 11 a.m. / 3 comments
Editor’s Note: We’re wrapping up 2010 by asking some of the smartest people in journalism what the new year will bring.
Today, our predictor is Jonathan Stray, interactive technology editor for the Associated Press and a familiar byline here at the Lab. His subject: the building of new multi-source information products, and whether it’ll be news organizations that do the building.
2011 will be the year that news organizations finally start talking about integrated products designed to serve the complete information needs of consumers, but it won’t be the year that they ship them.
News used to be more or less whatever news organizations published and broadcast. With so many other ways to find out about the world, this is no longer the case. Professional journalism has sometimes displayed an antagonistic streak towards blogs, Wikipedia, and social media of all types, but it’s no longer possible to deny that non-journalism sources of news are exciting and useful to people.
Unencumbered by such tribalism — and lacking content creation behemoths of their own — the information technology industry has long understood the value of curating multiple sources, including traditional news content. Google web search was the first truly widespread digital public information system. RSS allowed readers to assemble their own news feeds. Mid-decade, Wikipedia exploded into the one of the top ten sites on the web, used as much for news as for reference. The business practices of news aggregators angered publishers, but there’s no getting around the fact that they are tremendously useful tools. The most recent change in information distribution is social. Twitter has become an entirely new form of news network, while Facebook wants media organizations to use their social infrastructure to reach users.
Keep reading »
Tags: aggregation, curation, Facebook, Flipboard, Google, information needs, service, Wikipedia
Series: Predictions for Journalism 2011
Gawker copycats, luxurious print, more robots, and a new blogging golden age: Predictions for 2011
By Lois Beckett / Dec. 16 / 1 p.m. / 2 comments
Editor’s Note: We’re wrapping up 2010 by asking some of the smartest people in journalism what the new year will bring.
Below are predictions from Susan Orlean, Joe Grimm, Matt Haughey, Adrian Holovaty, Megan McCarthy, Mark Potts, Jake Shapiro, and Cody Brown.
We also want to hear your predictions: take our Lab reader poll and tell us what you think we’ll be talking about in 2011. We’ll share those results in a couple days.
Susan Orlean, Twitter artist, and staff writer, The New Yorker
We’ll be reading more on our phones, our iPads, and our Super Scout Decoder rings by the end of next year.
Several magazines — maybe Time or Newsweek or both — will go monthly and/or digital only. But there will be new magazine startups in print that will be luxurious and expensive and book-like. 2011 will be the year of those two forms making themselves distinct; things on line will become more webby, and print publications will become more “collectible” and classic.
Journalism schools will offer a “web producer” major.
The last typewriter living in the wild will be captured, its DNA sequenced; and then it will be humanely destroyed.
Keep reading »
Tags: Adrian Holovaty, Cody Brown, Jake Shapiro, Joe Grimm, Mark Potts, Matt Haughey, Megan McCarthy, Susan Orlean
Series: Predictions for Journalism 2011
DDoS attacks on the U.S. media, Twitter history searching, and a big blog deal: More predictions for 2011
By Lois Beckett / Dec. 17 / 11 a.m. / 2 comments
Editor’s Note: We’re wrapping up 2010 by asking some of the smartest people in journalism what the new year will bring.
Below are predictions from Michael Schudson, Alexis Madrigal, Markos Moulitsas, Joy Mayer, Nicco Mele, Nikki Usher, Steve Buttry, Paddy Hirsch, John Davidow, Ethan Zuckerman, Richard Lee Colvin, and Kevin Kelly.
We also want to hear your predictions: Today’s the last day we’ll be accepting entries in our Lab reader poll, where you tell us what you think we’ll be talking about in 2011. We’ll share those results in a couple days.
Michael Schudson, historian and sociologist, Columbia Journalism School
Prognosticating about the news media in these times is a risky business, but I’ll try one nonetheless: In 2011, none of the 250 largest U.S. cities will stop publishing (on paper) its last remaining daily newspaper. Cities with more than one daily newspaper may be reduced to one survivor.
Alexis Madrigal, senior editor at The Atlantic and co-founder, Longshot Magazine
One of the truly important big city papers will go digital-only.
Kevin Kelly, author and founder, Wired Magazine
Twitter will go down for 36 hours. The ensuing media attention will prompt a 10 percent increase in signups in the months following.
Keep reading »
Tags: Alexis Madrigal, Ethan Zuckerman, John Davidow, Joy Mayer, Kevin Kelly, Markos Moulitsas, Michael Schudson, Nicco Mele, Nikki Usher, Paddy Hirsch, Richard Colvin, Steve Buttry
Series: Predictions for Journalism 2011
Jason Fry: A blow to content farms, Facebook’s continued growth, and the continued pull of the open web
By Jason Fry / Dec. 17 / 1 p.m. / 3 comments
Editor’s Note: We’re wrapping up 2010 by asking some of the smartest people in journalism what the new year will bring. Today, our predictor is Jason Fry, a familiar byline at the Lab. Jason also prognosticated earlier this week about the potential success of the NYT paywall.
Hyperlocal will remain stubbornly small scale. Large-scale efforts at cracking hyperlocal will seed more news organizations with content, but that content will remain mostly aggregation and data and still feel robotic and cold. Meanwhile, small-scale hyperlocal efforts will continue to win reader loyalty, but struggle to monetize those audiences. By the end of 2011, the most promising developments in hyperlocal will come from social media. Promising efforts to identify and leverage localized news and conversation in social media will be the buzz of late 2011, and we’ll be excited to think that social media is proving an excellent stepping stone to greater involvement in our physical communities.
Google will deal the content farms a big blow by tweaking its algorithms to drive down their search rankings. But the company will be opaque to the point of catatonia about exactly what it did and why it did it, reflecting its reluctance to be drawn into qualitative judgments about content. There will be talk of lawsuits by the spurned content farms and no small amount of jawing about Google’s power, lack of transparency, and whether or not it’s being evil. But even those worried about Google’s actions will admit that search is a much better experience now that results are less cluttered with horribly written crap.
The Nieman Journalism Lab is a collaborative attempt to figure out how quality journalism can survive and thrive in the Internet age. (More.)