I don’t like your data! – my March 2019 “Best of Health” article

It’s always interesting to read the discussion threads on social media when new scientific papers on canine health and welfare are reported and when breed clubs publish their health survey results. I’ve written before about Cognitive Dissonance, a term that captures the many reasons why it’s so hard to get people to see the need for improvement, let alone act to create change. In essence, it means people feel uncomfortable when newly presented evidence clashes with their existing beliefs, so they look for ways to reduce that discomfort.

Albert Einstein is quoted as having said “If we knew what it was we were doing, it wouldn’t be called research, would it?” It should be no surprise that, with some research, the results are completely novel or, in some cases, unexpected. Newly published research should prompt us to ask the question “why?” – why might a particular association have been identified, and why might the results have turned out as they did? Instead, we often find what appears to be cognitive dissonance kicking in. Here are some examples:

The sample is too small: Some research starts with very small samples, often for practical reasons such as cost or convenience. Any interesting findings need to be explored with further studies using bigger and more representative samples, not simply dismissed. Since we know how many dogs are registered in every pedigree breed each year, it is easy enough to estimate the UK population if we also know their average age of death. There are well-established statistical methods for judging the confidence that can be placed in a sample, so it doesn’t take much effort to decide whether a sample really is “too small”. The opposite also happens sometimes: the results of a small study may be misused to provide “evidence” about whole populations.
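As an illustration of how little effort those statistical methods take, here is a minimal sketch of the standard normal-approximation margin of error for a proportion measured in a survey. The function name and the survey numbers are hypothetical, not taken from any actual breed survey:

```python
import math

def margin_of_error(p_hat, n, z=1.96):
    """Approximate 95% margin of error for an observed proportion.

    p_hat -- proportion observed in the sample (e.g. 0.20 for 20% affected)
    n     -- number of responses in the sample
    z     -- critical value; 1.96 gives a ~95% confidence interval
    """
    return z * math.sqrt(p_hat * (1 - p_hat) / n)

# Hypothetical survey: 400 responses, 20% of dogs reported affected.
moe = margin_of_error(0.20, 400)
# The true prevalence is plausibly within 0.20 +/- moe,
# i.e. roughly 16% to 24% -- a quick check on whether
# the sample is really "too small" for the claim being made.
```

A sample of a few hundred responses often narrows an estimate to within a few percentage points, which is why dismissing a survey outright as “too small” usually needs more justification than it gets.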

It’s misleading/skewed: This is a variant of “the sample is too small” but focusing on it being the wrong sample. All samples have some degree of bias; the important thing is to understand what that might be, and most peer-reviewed papers have a section discussing the potential limitations of the study. That’s where you can get an understanding of potential shortcomings in the chosen data and the potential to address these with future studies. A common criticism of breed health surveys is that the responses are skewed by people who have ill dogs, or that “show people” won’t be honest. That’s why it’s important to look for other studies covering different respondent samples and see what results were obtained there. Our first major Dachshund health survey was criticised by some as biased because 85% of responses came from show breeders. When we repeated the survey three years later, with 85% of responses from non-show owners, the findings were very similar.

It’s not scientific: This is a great one that gets trotted out to criticise breed health surveys, in particular. I don’t even know what it means. Is it because the report wasn’t written by someone with a PhD, or administered by someone wearing a white lab coat? The expertise of most Breed Health Coordinators is backed up by advice from the KC’s health team, so there is invariably a strong scientific input into the design and analysis of breed surveys these days. The follow-on criticism of breed survey reports is sometimes that “it’s not peer-reviewed”. That’s probably true, but most aren’t intended for publication in academic journals, and the lack of peer review doesn’t negate their usefulness. Most are reporting basic descriptive statistics such as means or medians, and maybe odds ratios, often with confidence intervals and p-values.
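To show just how routine those statistics are, here is a sketch of an odds ratio with its 95% confidence interval, computed from a 2×2 table using the standard log-odds (Wald) approximation. The counts below are made up for illustration and don’t come from any real survey:

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio and ~95% CI from a 2x2 table.

    a -- cases in the exposed group        b -- non-cases, exposed
    c -- cases in the unexposed group      d -- non-cases, unexposed
    """
    or_ = (a * d) / (b * c)
    # Standard error of the log odds ratio (Wald method)
    se = math.sqrt(1/a + 1/b + 1/c + 1/d)
    lower = math.exp(math.log(or_) - z * se)
    upper = math.exp(math.log(or_) + z * se)
    return or_, lower, upper

# Hypothetical counts: 30 of 100 dogs affected in one group
# versus 15 of 100 in another.
or_, lower, upper = odds_ratio_ci(30, 70, 15, 85)
# If the interval (lower, upper) excludes 1, the association
# would usually be reported as statistically significant.
```

Nothing about this requires a white lab coat; it is exactly the kind of analysis a well-designed breed survey report presents.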

We can’t do anything about it: Findings are written off because, in some people’s view, no action can be taken. For example, we found that the incidence of back disease in Dachshunds is higher in winter than during the other three seasons. That prompts the question: why? We could hypothesise that it’s due to the dogs getting less exercise, or some temperature effect. Whatever the reason (which more investigation might explore), it’s pretty unlikely that nothing can be done. “We can’t do anything about it” is often just a lazy response that avoids looking for something that can be done. It’s just as lazy as asking “what can we do to improve it?” without offering any suggestions.

We need more research: I’ve written before about the parallels between the tobacco industry’s response to the link between smoking and cancer and the dog world’s response to criticisms of health issues in pedigree dogs. A call for more research is sometimes just a smokescreen, looking for the perfect set of data which, of course, will never be found.

We need facts: This one is used alongside “it’s not scientific”, and it’s hard to know what to make of such a comment when the report being referred to is full of data and analyses. The BBC says “a fact is something that can be checked and backed up with evidence” and “facts are often used in conjunction with research and study”, whereas “opinions are based on a belief or view”. Last year, it was reported that chocolate Labradors live significantly shorter lives than the other colours. Although perhaps surprising, this “fact” can be checked by looking at the data and evidence presented in the paper. Additionally, it’s not the first example of a dog’s colour being associated with a particular aspect of its health, so maybe it shouldn’t be so surprising.

My dogs don’t have that: We need to remember that data presented in papers and reports are from samples of populations. These will contain a range of cases and non-cases. Just because one breeder has never had a particular problem doesn’t mean it doesn’t exist. “The plural of anecdote is not data”! Gregoire Leroy has written an excellent blog at dogwellnet.com about this, which he calls the “sampling effect”.

A nudge towards breed health improvement

My Christmas reading was Black Box Thinking by Matthew Syed. It’s all about how people and organisations learn (or don’t). One paragraph really struck a chord with me:

Science is not just about a method, it is also about a mindset. At its best it is driven forward by a restless spirit, an intellectual courage, a willingness to face up to failures and to be honest about key data, even when it undermines cherished beliefs.

We do need to question the research that is published on canine health matters, not to knock it down but to understand how it can be used to help us. Every piece of research and every breed survey has the potential to nudge us towards actions that will improve the lives of our dogs.

