Sister blog of Physicists of the Caribbean in which I babble about non-astronomy stuff, because everyone needs a hobby

Sunday, 28 February 2016

Is bad science journalism a fault of the education system ?

"In schools, instead of encouraging active participation by students in exploration and experimentation, science is often taught as a dry collection of facts and rules to be memorised for exams. Moreover, we force students to chose between sciences and humanities far too early. That perpetuates a divide in which most journalists – who are often trained in the humanities – tend to avoid writing about science and editors are more inclined to push science to the margins."

I'm not so sure (except for the bit about exams and rote learning). Early specialisation is almost unique to the UK, yet there is a wealth of awful science journalism from the "let's make everyone well-rounded generalists" US as well.

http://www.theguardian.com/commentisfree/2016/feb/28/science-arts-society

13 comments:

  1. I taught science journalism at a Uni for a couple of years and it was hard work - the students were so ignorant they had no idea what constituted a good question to ask of a scientist, how to balance a story by assessing who had the most genuine evidence to back up their side, how to spot bullshit dressed up in sciency jargon etc. I didn't really feel I achieved much.

    The answer, as far as I am concerned, is to educate scientists in how to be journalists. It's not that difficult; there's no great mystery to the craft, and while their prose might not be as polished as an English graduate's, that's what sub-editors are for. If you haven't got the facts right you are polishing a turd, and if the journalist hasn't got their head round what is relevant in a story then they have failed in their basic function of informing the public.

    The difficulty comes in getting editors and media managers to take the wannabe science communicators seriously. They, as all managers do, recruit in their own image - they like humanities grads as that's what they are themselves. And while they may see the merit in having a political correspondent who actually knows a little about politics, or a sports journalist who is actually interested in sport, they have a bit of a blind spot about doing the same for science. Then there are the standard tropes and practices - the miracle cure, the maverick vs the establishment, the urge to be 'balanced' etc etc (I have a good two hours' worth of lectures on it) - that make any attempt to tell a science story in a standard news format a pain in the jaxie.

    But there is an argument that we get the media we deserve; the reason most science journalism is rubbish is because the audience is ignorant of science. They may be able to spot the bullshit in many other kinds of story, but for science any old crap will do because the audience won't pick up on it.

    ReplyDelete
  2. Barry Blatt I couldn't agree more. In fact I want to +1 your comment several times. The irony is that I found high school humanities courses at least as useful for developing critical thinking skills as a physics degree.

    There also seems to be a chronic misunderstanding about the scientific method. If I see a "mystery solved" or "scientists baffled" headline one more time I think I'm going to scream. That, perhaps, is the fault of rote-learning at school level, where science is seen as producing results which are certainly right or wrong. In turn this leads to people having a bias against experts, citing their mistakes as incompetence when it's (mostly) just part of the process of honest research. Of course, it doesn't help when the researchers themselves directly say that they've solved a mystery when >99% of the time they've done nothing of the sort.

    As well as the problem of getting editors to recruit/interview scientists, there's also the problem of getting university professors to encourage outreach. Many see it as a distraction from research - which it is, but that doesn't mean it's not important. Instead of being a small but integral part of the job it's usually done as a hobby. Mind you, if we didn't have to spend so much time applying for grants... but don't get me started on that one. :)

    ReplyDelete
  3. Could you guys collaborate on an article posted somewhere--a blog or even some kind of print publication with a website--that encapsulates those first two comments here? I'd love to link to that article about a hundred times. I quite literally posted a mini-rant in a comment about 5 minutes ago about how tech journalists, taken as a whole, should have quotes around the second word. Similar with most science journalists (there are some good ones, but hoo boy... the others?)

    ReplyDelete
  4. As a general rule, I would not use my country as a model on how to implement a given theory of teaching, or as an argument that a given theory of teaching is invalid.

    ReplyDelete
  5. > There also seems to be a chronic misunderstanding about the scientific method.

    I'd be interested in your take on that. My own experience suggests that the "scientific method" is akin to the "food pyramid": an abstraction that has little to do with how things are actually structured - the reality being much more ad hoc and complicated - but makes for a nice short 'lesson' on the back of cereal boxes.

    ReplyDelete
  6. Michael J. Coffey I cover the state of science journalism a little bit here :
    http://astrorhysy.blogspot.cz/2015/05/oh-humanities.html
    But if you're looking for something more specific I'd be happy to write up something short. If Barry Blatt wants to collaborate, so much the better.

    ReplyDelete
  7. Sure, I am a long time out of the business but I can provide some examples of bad practice.

    ReplyDelete
  8. Chris Greene I rather like this meme :
    http://ascienceenthusiast.com/wp-content/uploads/2015/08/scientific-method-meme-v2.png
    It has the process as a cycle rather than a series of steps, which is two hundred and seven times better than just ending with "mystery solved". But I agree it's an over-simplification. I'm not sure I know anyone who does all of those steps. Some observers never do any theoretical work at all, and vice-versa. It also misses out the peer review process completely, but I think it's broadly correct for areas where there is a very well-established, detailed theory.

    For the cutting edge I don't think it works well at all. Here you can be dominated by unknown unknowns, so it's difficult to say if your theory and observation are really (in)compatible. "If it disagrees with observation, it is wrong" is fine, but it's often extremely difficult to judge if that's true. Example : LCDM cosmology. Some people would say it's stunningly successful at just about everything. Others would say it's a miserable failure that's been patched to the point it should be abandoned. It isn't always easy to distinguish between "legitimate correction" and "fudge to fit the data".

    ReplyDelete
  9. Rhys Taylor Barry Blatt -- Yeah, the follow-up article you posted, Rhys, is much more like what I was thinking of near the end rather than the beginning.  Something that would show journalists how to do science journalism better, but also show people reading science articles how to tell good ones from bad ones.

    I'd write that article, but I'm neither a scientist nor a journalist.  I just picked up a good sense for those things because (1) I had grandparents and great-grandparents who were journalists, (2) I had a fantastic humanities education that did all the things mentioned in Rhys' article, and (3) I really like science and actually spend time tracking down the sources of articles that I read and looking at the published studies when I can find them for free.  (I've paid for a few, but when you're reading for pleasure and self-enrichment, bookstores are a better use of funds...)

    ReplyDelete
  10. Rhys's point about sensationalism and trivialisation is an important one. In journalism training you are taught to try and make a story relevant to the personal experience of the audience, to enable them to relate to it in some way. Trouble is, personal experience of black holes and the minutiae of the workings of the immune system is rather limited. When a journalist sits down and thinks 'why should the audience be interested in this?', with many science stories they have very little to use, so they overinflate what is being said or turn it into a joke. I'm sure Rhys is sick to death of Star Trek analogies cropping up in stories in his field, but if that is the main touchstone the bulk of the public have with astronomy, it's going to get used. To be fair, this often starts with the PR departments and their efforts to get journalists to take an interest in what their institution is up to.

    My beat was medical and health, and I was lucky enough to be writing for medical professionals in the main, so I could assume a pretty solid background in science and an appreciation of the uncertainties involved. But, to take an example, MERS was a big story in the mainstream press. How to bring it to the attention of those who just might be interested? By slapping overstated headlines on it, of course - it is a nasty disease to be sure, if you get a serious case of it, but it's not that common and there is only very limited human-to-human transmission known, so you aren't likely to pick it up unless you hang around with camels. But 'Camel herders get nasty cough' isn't exactly an attention grabber, and it was hyped up no end - the papers had to make people feel that they might personally be at risk to make it 'relevant'. Better to have left the whole bloody tale alone really and leave it to medics in travel clinics to issue mild warnings to people going to an affected area, but some twit issued an inflammatory press release to panic-exploiting news outlets somewhere along the line and off the avalanche rumbled.

    Understanding and communicating risk is a whole minefield in itself though. Medics get trained in it, journalists unfortunately don't.

    And good on you Michael for actually reading the research, most journalists don't. I have come across plenty of bullshit stories that could have been nipped in the bud if only the people reporting on them had read the effing studies and had half a clue what they were looking at - the godawful MMR fiasco being a case in point. Even before we knew what a charlatan Wakefield was you only had to glance at the Lancet paper as originally presented to see the huge holes, but it seems very few did.

    ReplyDelete
  11. Barry Blatt -- I do most of my stuff with relation to tea, so I'm most sensitive to bogus health claims.  My go-to example was press coverage about "a new study that shows drinking tea is good for your oral health!"  Reading the study showed that (a) the study used a super-concentrated extract unlike the typical brewed tea, (b) it was green tea, whereas most English speakers drink black tea, (c) the study participants gargled with the tea extract, rather than drinking it, and (d) they found a change in the relative numbers of different microbes in the mouth afterwards, not better oral health.  

    I like to show my tea classes this as an example of how a perfectly plausible science article got every single salient detail wrong.

    ReplyDelete
  12. Barry Blatt Great point about relatability. SMBC says it best :
    http://www.smbc-comics.com/?id=2088

    Actually, I don't mind Star Trek analogies for precisely that reason - if it helps people understand a complex piece of research then I'm all for it. What gets me is when the claims are exaggerated beyond all credibility : NASA is working on a warp drive being a prime example. Yes, some very minor functionary is researching a rather off-the-wall idea about how we could eventually build some kind of warp drive. But the popular media would have it that all the problems are sorted and it'll definitely be done next year. I have no problem with fringe research as long as it's presented for what it is (unlike just about everything on the "History" channel these days).

    "When a journalist sits down and thinks 'why should the audience be interested in this?'"
    I wonder if this is part of a wider problem, or just a problem with journalism. Notable and outstanding exception : pretty much any BBC nature documentary. Rarely if ever do they try and make things "relevant", they just tell you interesting stuff. If something is genuinely interesting, the audience will be interested - there's no need to try and force excitement or personal relevancy. Whether the shoddier end of journalism occurs because the journalists don't think people will read anything if it's not exciting enough, or because the audience genuinely won't read anything unless it sounds exciting, I don't know. I'd like to think it's the former.
    (also pertinent, at least the first half : http://astrorhysy.blogspot.cz/2015/10/false-consensus.html)

    "Understanding and communicating risk is a whole minefield in itself though. Medics get trained in it, journalists unfortunately don't."
    Yeah, don't get me started. I'll just leave it to SMBC again :
    http://www.smbc-comics.com/index.php?id=3930

    ReplyDelete

Due to a small but consistent influx of spam, comments will now be checked before publishing. Only egregious spam/illegal/racist crap will be disapproved, everything else will be published.
