The large meta-studies done on this subject say that some polyunsaturated fat offers heart-protective effects, but it hasn't been shown that saturated fat itself can induce damage. At worst, it is neutral.
Recommendations from authorities like the AHA are useful to a certain extent, but they are less so when the science says the opposite and policy-making is lagging as a result. This is one of those instances.
Depends on which meta-analysis you use, but the science does not exactly say the "opposite."
The Cochrane meta-analysis still supports replacing saturated fat with polyunsaturated fat to reduce heart attack risk.
The BMJ meta-analysis you're referring to indeed did not see any association between saturated fat intake and increased mortality in the average population. However, I prefer the methodologies of the studies included in the first meta-analysis over the second's. Why? The first one was interventional; the second one was purely observational (and skipped the data for the middle groups, focusing only on the extremes).
Is there another body of science whose "consensus" is as hard to trust as nutrition's? It sounds like any sort of peer review is just an inquisition into whether your results match what they want to be true.
Absolutely no other body of "science."
The second meta-analysis was based on studies built on food questionnaires, whose bullshittiness so many GAFfers love to point out. Which is hilarious, because that is virtually what any long-term nutritional study is going to be based on. So, after having been relatively enthusiastic about discussing nutritional data (in the average population) on this forum, I now pity my earlier naive self. Nutritional epidemiological science is an oxymoron. Everyone's a potential armchair expert because everyone experiences nutrition their entire lives. Nutrition isn't some acute illness that affects a subset of a population whose confounding variables you can easily control. You can't measure nutrition as easily as you can cigarettes in packs per day, if you want to talk about a case-control study done well. The majority of case-control or cohort studies have binary exposures (yes or no). Nutrition is not like that. You have to measure it. And since measurements are not standardized, every researcher does it differently. Which makes any meta-analysis even more piss-poor.
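To make the binary-exposure point concrete: with a clean yes/no exposure like smoking, a case-control study reduces to a 2x2 table and a simple cross-product odds ratio. A minimal sketch, with entirely made-up counts for illustration:

```python
def odds_ratio(exposed_cases, exposed_controls, unexposed_cases, unexposed_controls):
    """Cross-product odds ratio from a 2x2 case-control table."""
    return (exposed_cases * unexposed_controls) / (exposed_controls * unexposed_cases)

# Hypothetical smoking example: the exposure is a clean yes/no,
# unlike a continuous, self-reported nutrient intake from a questionnaire.
or_smoking = odds_ratio(90, 40, 10, 60)
print(or_smoking)  # 13.5
```

With a continuous, noisily measured exposure like diet, there is no such natural 2x2 table; every study picks its own cut points, which is part of why pooling them in a meta-analysis gets messy.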
Now, before The Lamp gets on my case for seemingly putting down RDs/nutritionists/PhDs in this field, I will say that the majority of these folks are NOT trying to tout a one-diet-fits-all approach to the general population. The best RDs, IMO, are the ones who focus only on the person in front of them and adjust that person's diet to address a quantifiable nutrient deficiency that can be explained by a known physiological/anatomical process. Frankly, I think any health authority should stay out of nutritional recommendations to the average person who doesn't have a specific disease process to address. There is no good data.
When the most famous so-called experts of a supposed health field are the folks who do not do any of the research themselves, that's a red flag. And fuck you, Dr Oz.
Edit: I will also point out that from a purely biostatistical point of view, nutrition and case-control studies should NEVER mix. That defeats the damn purpose of having a case-control study in the first place. They're designed to estimate odds ratios for RARE diseases that do not affect the WHOLE population.
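The rare-disease point can be shown numerically: the odds ratio from a case-control design only approximates the risk ratio when the disease is rare. A quick sketch with made-up counts (a = exposed cases, b = exposed non-cases, c = unexposed cases, d = unexposed non-cases):

```python
def odds_ratio(a, b, c, d):
    return (a * d) / (b * c)

def risk_ratio(a, b, c, d):
    return (a / (a + b)) / (c / (c + d))

# Rare disease (~1% vs ~0.5% risk): OR tracks RR closely.
rare = (10, 990, 5, 995)
print(odds_ratio(*rare), risk_ratio(*rare))      # ~2.01 vs 2.0

# Common disease (50% vs 25% risk): OR overstates RR.
common = (500, 500, 250, 750)
print(odds_ratio(*common), risk_ratio(*common))  # 3.0 vs 2.0
```

Diet-related outcomes like heart disease are common in the population, so the rare-disease assumption fails and the odds ratio exaggerates the apparent effect, which is exactly why case-control designs are a poor fit here.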