Tuesday, December 3, 2013

A Brief Primer on Research

[contents: fat hate]

I wanted to write a little bit about how I find and evaluate research related to my health and health care.

Being able to find, understand, and evaluate research is a very useful skill, especially when it comes to being a fat person in the doctor's office - or a pregnant person. (Or ANY person.) Yes, you should be able to walk right into the doctor's office and get appropriate, correct care. Aaaaaaand I and many other fat people have run into the exact opposite. I also have problems with the public health campaigns about being informed patients, etc. and so on. Do I think it's amazing to be informed about your treatment? ABSOFUCKINGLUTELY. However, these campaigns put nearly all of the responsibility on the patient, and that's not where it belongs. Doctors need to step up their game (and yes, I realize they're as much a part of a system that incentivizes the exact opposite as anyone else... but that doesn't mean they're completely powerless. And yes, the system absolutely needs to change).

That being said, pretty much the only way I've gotten appropriate treatment is by doing my own research and going in armed with it. But doing research on medical conditions, symptoms, and treatments can be really tough. Sure, the internet is great... and also filled with a lot of misinformation. Plus, getting access to the actual research journal articles is difficult and expensive for most people. I can get many of the actual articles... if I physically go to my alma mater's library. Then you have to evaluate whether the research was actually any good, then you run into websites that purport to have information and it's rubbish... sorting through the utter flood of information can be tough.

But there are a couple of rules that can make it easier. Here are some of the ones I use:

Science reporting is shit.

If it is reported in a magazine, newspaper, whatever, it's shit. There are some blogs that actually do some good reporting - I link to some in the sidebar. But in general, science reporting in the media is terrible. Few if any reporters understand the difference between correlation and causation, few if any reporters are equipped with the skills to evaluate the quality of the research, and some of them will straight up report things that were not actually found in the study. Scientists are biased. So are journalists. Never doubt that they're selling you a narrative, not science.

I won't say "OH NEVER READ STORIES ABOUT NEW STUDIES", because shit, I do, and besides that, they're pretty unavoidable. But please, please, be super skeptical about them, and if you want to check them, go to the source - read at least the abstract of the study yourself to see what it actually says. Abstracts are usually available for free (and are the short summary of the research, usually about a paragraph long). Google Scholar can often help you find them. Another good place to check is PubMed.

For more information on how mainstream science reporting is usually crap, see this post on Well-Rounded Mama.

Most research does not - and CAN not - find the cause of something.

Following on from that first point, how many articles have you read about "OBESITY CAUSES X!", or similar? Probably a lot. The problem is, the vast, vast majority of research doesn't actually find causes. At best, it finds a correlation.

Doing research to determine the cause of something is actually quite complex. It's one of the things I learned to do in graduate school. It's not impossible, but it's also not often done. There are very strict conditions that have to exist in order to do it, and so many things you have to control and account for, and it's just really complicated. To get an idea of the complications, I link to Hill's criteria of causation in the sidebar. In order to prove causation, you have to meet those criteria. Most research doesn't. So any time you see "WE HAVE FOUND THE CAUSE OF X" in health research, be suuuuuper skeptical.

As for what research does usually find, it's a correlation - that is, "we find these two things occurring together pretty often, more often than we figure we would with random chance". You might also hear "is associated with" - that still means "correlation". The problem is, two things can be correlated, but that correlation tells you absolutely nothing about whether one causes the other - or if there's a third factor, unknown or unmeasured, that causes both. Shorthand that's popularly used for this concept is "CORRELATION DOES NOT EQUAL CAUSATION"; you might have seen that on Twitter. It's pretty key in research, and it gets forgotten a lot.

That's not to say that research that finds a correlation is useless - far from it. We can make some pretty good guesses based off of correlative research. But research that finds a correlation does not ever prove we have found the cause of something. So any time someone tells you "well X causes Y medical condition or biological effect", feel free to give them all of the side-eye.
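If you want to see the "third factor" problem in action, here's a minimal sketch (my own illustration, not from any study) that simulates a hidden confounder driving two variables. Neither A nor B causes the other, yet they come out strongly correlated - exactly the trap in reading "X is associated with Y" as "X causes Y". All the variable names here are made up for the example.

```python
import random

random.seed(0)

# Hypothetical setup: a hidden confounder drives both A and B.
# Neither A nor B has any causal effect on the other.
n = 10_000
confounder = [random.gauss(0, 1) for _ in range(n)]
a = [c + random.gauss(0, 1) for c in confounder]  # A depends only on the confounder
b = [c + random.gauss(0, 1) for c in confounder]  # B depends only on the confounder

def pearson(x, y):
    """Pearson correlation coefficient, computed from scratch."""
    mx, my = sum(x) / len(x), sum(y) / len(y)
    cov = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    vx = sum((xi - mx) ** 2 for xi in x)
    vy = sum((yi - my) ** 2 for yi in y)
    return cov / (vx * vy) ** 0.5

r = pearson(a, b)
print(round(r, 2))  # strong correlation, despite zero causation between A and B
```

A study measuring only A and B would report a real, statistically solid association - and it would still tell you nothing about cause, because the whole pattern comes from the unmeasured third variable.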

Research, researchers, and publishers are biased - just like everyone.

A whole lot of research doesn't meet the criteria of good research design. You don't need to take extensive courses in research design; just keep this in the back of your head. There's also a fair amount of published research that is straight up made up. Then there's the bias in publishing - a positive result, as in "this thing we tried showed an effect", is way easier to publish than "we tried this thing and it didn't work". The latter does get published sometimes, but not as often.

There are also dominant narratives in research - it's why you don't hear about the numerous studies showing that weight loss really is difficult and doesn't last long term, and you DO hear about all of the studies showing that oh, this new thing causes weight loss! The acceptable story right now, in medicine, research, and reporting, is that OMG FAT IS BAD and WEIGHT LOSS IS GOOD, and things that are contrary to those notions don't get the press. It's also why you often see studies about obesity that say one thing in their results section (e.g. "no significant reduction in weight was found in the study population") and another in their conclusion ("this intervention is an effective weight loss treatment and should be recommended"). I WISH I WERE MAKING THIS UP.

Then there's my next point...

Follow the money.

Conflict of interest is a real thing in science, and it's a big way that bias gets introduced. Check out who funds studies - chances are, if it's a study that trumpets weight loss, or a weight loss program that works, it was funded by the person or company who invented said program, or who benefits financially from weight loss. Similarly, studies that find that sports drinks are effective are almost always funded by companies that make sports drinks. Isn't it funny how that works out?

Well, no, it's not funny, because it's bullshit, and it leads to a lot of really crap research. That's not to say that, say, federally funded research isn't biased. But research is more likely to be biased when the funder benefits financially from the results.

Check Cochrane.

The Cochrane Collaboration is an independent, international organization dedicated to cataloguing and evaluating health-related research. They are highly respected, and very reliable. I won't say they're completely unbiased, because no one and nothing is, but they work really hard NOT to be.

You can probably find what they call a "Cochrane Review" on just about any health topic. What they do is comb through all the research trials related to a topic, evaluate the quality of those trials, then combine them to come up with what the evidence as a whole says. They provide this in multiple languages, and also do a "plain-language" summary. For free.

For an example, here is their review regarding weight loss in pregnancy for obese women. WELP SORRY MIDWIFE YOUR RECOMMENDATIONS ARE WRONG AND NOT BASED IN THE SCIENCE WE HAVE.

You can also view the Cochrane Summaries, which are the short versions of the reviews. They can be really helpful to print off and bring to your doctor's office. I was recently all up in them because of a finding on an ultrasound I had last week (and helpfully, they confirmed what I thought - that it's not that big of a deal, and if we wanted to do anything, what I got recommended was the thing to do - the finding was about me, not The Kid, btw). I went to my doctor's office with the relevant summaries printed and SHOCKER, did not even need them, because she was as up on the research as I was. But if she had recommended something that I knew from my reading wasn't going to be effective, I could whip out the summary and go "okay, but that's not what the research says", and have the relevant research right there.

Seriously Cochrane is what policy-makers, health care professionals, pretty much everyone uses. Your doctor has probably heard of them, and if you say "Well here's the Cochrane review about that", most will actually listen. If they still say "well that's nice but" after you give them evidence like this... probably find a new doctor if you can. Seriously.

**********

So those are the rules I use when reading research. Anyone else got helpful tips to share?
