Meat

The Science of Dietary Risks

Monday, November 9, 2015

Red wine. Coffee. Dark chocolate. Epidemiologists, doctors, and nutrition scientists are constantly analyzing the good and bad consequences of different food and beverage components of our diets and reporting correlative data.

These studies are important. Just because your grandparents (note: a sample size of four) lived a long and happy life while eating a lot of item X and avoiding item Y doesn’t mean those are wise dietary choices for the general population today. Knowing what kind of diet is healthy on a population scale is good for public health. Additionally, modern scientific studies produce much larger and more reliable data sets, because we’ve gotten better at identifying the caveats and limitations of past studies and correcting them with large, unbiased, and well-controlled samples moving forward.

Although the quality of science generally improves rather than deteriorates, the public mindset still holds a memory of the seemingly flip-flopping nature of past dietary studies. Looking back 50 years, dietary recommendations and restrictions have often contradicted each other, especially when the underlying studies were chewed up and spoon-fed to the public in overly simplified, hyped-up layman’s terms that gloss over the methodology.

While back-and-forth challenges are good for improving scientific rigor, they create a “science fatigue” in the general public that leads to mistrust and, unfortunately, disregard of legitimately important dietary guidelines. Daniel du Plooy’s essay from earlier this week in the online journal The Conversation sums up the problem from the public perspective: “How can science be trusted if it can’t make up its mind?”

Unfortunately, science fatigue can lead to deadly consequences – such as when pregnant women, who are immunocompromised, consume soft unpasteurized cheeses and contract the bacterial pathogen Listeria monocytogenes – so choosing to ignore science entirely is not a great life choice.

In the latest high-profile study of nutrition science, the World Health Organization (WHO) announced that red meat, and in particular processed meat, is carcinogenic. This is based on a recent meeting of 22 scientists from 10 countries at the International Agency for Research on Cancer (IARC) in Lyon, France. Every newspaper in the country has picked up this story by now, mourning bacon’s sad fate in the affair and discussing correlation versus causation. Red meat lovers have been quick to criticize the study as yet another drop in the ocean of back-and-forth nutritional recommendations. But let’s just look at the scientific data.

Red meat was classified as “all mammalian muscle meat, including beef, veal, pork, lamb, mutton, horse, and goat.” Processed meat includes “all meat that has been transformed through salting, curing, fermentation, smoking, and other processes to enhance flavor or improve preservation” – that includes hot dogs, bacon, ham, jerky, and more. In some countries, less than 5% of the population consumes red meat and less than 2% eats processed meat. At the other end of the spectrum, there are countries in which 100% of people eat red meat and 65% eat processed meat. Meat-eaters typically consume 50 to 100 grams of red meat on a daily average, and the IARC working group defined high consumption rates as more than 200 grams per person per day.

Compiling over 800 epidemiological studies and weighing those with clean case-control study designs, the working group found a correlation of high red meat consumption with colorectal cancer in 7/15 European case-control studies, and a correlation of high processed meat consumption with colorectal cancer in 12/18 studies performed in Europe, Japan, and the USA. For red meat, they even observed a dose-dependent relationship – the more meat you eat, the higher your cancer risk – 17% increased risk per 100 grams of red meat and 18% per 50 grams of processed meat. Taken at face value, this data delivers a simple message: a population that consumes large amounts of red or processed meat will have about 18% more colorectal cancer than a population that consumes modest amounts. The increase is modest – which explains why not everyone who eats a lot of red meat gets cancer, and why meat-lovers may use this fact as confirmation bias – but the data is what it is, and isn’t going to be wished away.
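To see what those per-increment figures imply, here is a back-of-envelope sketch. It assumes the reported increases scale linearly with intake and add together – a deliberate simplification for illustration, not the working group’s actual dose-response model:

```python
def relative_risk(red_g_per_day, processed_g_per_day):
    """Rough relative colorectal cancer risk versus a no-meat baseline.

    Uses the IARC figures quoted above: +17% risk per 100 g/day of
    red meat and +18% per 50 g/day of processed meat. Linear,
    additive scaling is an assumption made for illustration only.
    """
    red_increase = 0.17 * (red_g_per_day / 100)
    processed_increase = 0.18 * (processed_g_per_day / 50)
    return 1.0 + red_increase + processed_increase

# A daily diet of 100 g red meat plus 50 g processed meat:
print(round(relative_risk(100, 50), 2))  # 1.35, i.e. roughly +35%
```

Even under this crude model, the increase stays in the tens-of-percent range – meaningful at a population scale, but nothing like the many-fold risk increases associated with, say, smoking.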

The IARC study basically calls for everything in moderation. Red meat has nutritional benefits – it’s a rich source of iron, which is important for red blood cell function – so unless you’re going to increase your intake of beans, lentils, and spinach, you may want to keep some meat in your diet. For quick reference: a hot dog weighs 52 grams, an 8-ounce steak weighs 226 grams, and a slice of bacon weighs 8 grams.
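Those reference weights make it easy to check a daily diet against the working group’s 200-grams-per-day threshold for high consumption. A minimal sketch (the serving weights are the approximate figures from the text):

```python
# Approximate serving weights, in grams, from the reference list above.
SERVING_GRAMS = {"hot dog": 52, "8-oz steak": 226, "bacon slice": 8}

# The IARC working group's definition of high consumption.
HIGH_CONSUMPTION_G_PER_DAY = 200

def daily_grams(servings):
    """servings: dict mapping item name -> count eaten per day."""
    return sum(SERVING_GRAMS[item] * n for item, n in servings.items())

diet = {"hot dog": 1, "bacon slice": 3}          # 52 + 3*8 = 76 g/day
print(daily_grams(diet))                          # 76
print(daily_grams(diet) > HIGH_CONSUMPTION_G_PER_DAY)  # False
```

Note that a single 8-ounce steak every day already exceeds the 200-gram threshold on its own, while a hot dog with a few slices of bacon stays well under it.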

How might eating meat cause cancer? Cooking or processing meat produces carcinogenic chemicals including N-nitroso-compounds (NOC), polycyclic aromatic hydrocarbons (PAH), and heterocyclic aromatic amines (HAA). These are known to produce DNA damage, but studies haven’t yet looked at DNA damage in the colon following meat consumption, so for now the mechanism remains a plausible but unproven hypothesis.

In the end, the WHO classified red meat as “probably carcinogenic” (group 2A) and processed meat as “carcinogenic” (group 1). They are quick to note that, although group 1 also includes tobacco smoke and asbestos: “…This does NOT mean that they are all equally dangerous. The IARC classifications describe the strength of the scientific evidence about an agent being a cause of cancer, rather than assessing the level of risk.”

At the end of the day, epidemiological statistics on appropriate nutrition may point one way, but individuals ultimately make their own choices in their own homes. In a perfect world, those choices are based on a reasonable embrace of scientific recommendations – without going overboard in accepting, or refuting, every study that shows up on the internet. When a major health organization performs a meta-analysis on a body of scientific literature, however, it’s generally reasonable to take its findings into consideration.