Heart Disease: Do Docs Have No Clue?

I am not a doctor, and I don’t even play one on TV. I’m just an academic economist who often reads stuff on blogs and in layman’s publications about health and fitness. And it seems that whenever I read something about heart disease, it contradicts the medical profession’s conventional wisdom. Heart disease is the leading cause of death in America, and doctors act as if they know and can do a lot about it, but much of the stuff I read suggests that they have hardly a clue about what really causes heart disease or how to prevent or treat it.

Your doctor carefully tests your cholesterol level, and tells you that you should follow that number as closely as a CEO follows his corporation’s stock price. But as I reported back in June, the latest research shows that cholesterol is not a problem.

Cholesterol does not cause heart disease in the elderly and trying to reduce it with drugs like statins is a waste of time, an international group of experts has claimed.

A review of research involving nearly 70,000 people found there was no link between what has traditionally been considered “bad” cholesterol and the premature deaths of over 60-year-olds from cardiovascular disease.

Published in the BMJ Open journal, the new study found that 92 percent of people with a high cholesterol level lived longer.
[…]
“What we found in our detailed systematic review was that older people with high LDL (low-density lipoprotein) levels, the so-called “bad” cholesterol, lived longer and had less heart disease.”

Vascular and endovascular surgery expert Professor Sherif Sultan from the University of Ireland, who also worked on the study, said…“Lowering cholesterol with medications for primary cardiovascular prevention in those aged over 60 is a total waste of time and resources, whereas altering your lifestyle is the single most important way to achieve a good quality of life.”…

Lead author Dr Uffe Ravnskov, a former associate professor of renal medicine at Lund University in Sweden, said there was “no reason” to lower high-LDL-cholesterol.

And yet, anti-cholesterol drugs remain today the leading class of drugs prescribed in America. Do docs read the same things I read? One of us must be missing something.

Next, consider last month’s piece in The Atlantic about unnecessary medical procedures. One of the procedures highlighted in the article is the implantation of heart stents.

In 2012, Brown had coauthored a paper that examined every randomized clinical trial that compared stent implantation with more conservative forms of treatment, and he found that stents for stable patients prevent zero heart attacks and extend the lives of patients a grand total of not at all. In general, Brown says, “nobody that’s not having a heart attack needs a stent.” (Brown added that stents may improve chest pain in some patients, albeit fleetingly.) Nonetheless, hundreds of thousands of stable patients receive stents annually, and one in 50 will suffer a serious complication or die as a result of the implantation procedure.

In particular, you can die from a post-operative blood clot. For the sake of an unnecessary procedure. Good work, docs!

Unlike statins, blood pressure medications are something I have never really questioned. Cholesterol numbers might be meaningless, but surely blood pressure means something, right? And blood pressure medications really do bring pressure down effectively. That would seem to be obviously beneficial, since lowering pressure reduces strain on the heart. Indeed, the conventional wisdom holds that one leading class of blood pressure medications, the beta-blockers, has saved untold numbers of lives. And yet, the same Atlantic piece casts doubt on the usefulness of beta-blockers.

[T]he latest review of beta-blockers from the Cochrane Collaboration—an independent, international group of researchers that attempts to synthesize the best available research—reported that they “are not recommended as first line treatment for hypertension as compared to placebo due to their modest effect on stroke and no significant reduction in mortality or coronary heart disease.”

That somewhat awkward language might require a bit of translation. Compared with placebo, beta-blockers produce only a modest reduction in the risk of stroke, and no significant reduction at all in mortality or coronary heart disease. In other words, a drug class taken by millions barely outperforms a sugar pill on stroke and matches it on everything else that matters. What a deal.

Finally, here’s something else I ran across this week. The so-called Seven Countries Study is the most famous study to link heart disease to saturated fat and cholesterol. The 25-year follow-up to the original study again found a significant correlation between cholesterol and heart disease. Researchers in the U.K., however, analyzed the same data and found that heart disease correlated even more closely with…wait for it…latitude.

The Pearson correlation coefficient was calculated as 0.72 for baseline cholesterol and CHD deaths at 25 years. The data in the 1993 Menotti article has been examined to repeat the correlations found with CHD death rates and mean serum cholesterol to understand the data and methodology used. The same methodology was then used to explore alternative correlations. The strongest relationship found was for CHD death rates and the latitude of the country or cohort in The Seven Countries Study. The correlation coefficient for CHD deaths and latitude of the cohort was 0.93. The correlation coefficient for CHD deaths and latitude of the country was 0.96. While Keys did find a strong association with median serum cholesterol and CHD deaths, there were stronger associations that were discoverable.
The latitude finding offers an alternative explanation for the observed relationship with cholesterol and CHD. Vitamin D is made when sunshine synthesises cholesterol in skin membranes. In cohorts further [sic] away from the equator, cholesterol is less able to be turned into vitamin D.
Population mean serum cholesterol levels are higher and concomitantly population mean vitamin D levels are lower. Higher CHD could be associated with lower vitamin D, with cholesterol a marker, not a maker, of heart disease.

So according to this theory, the problem is not too much saturated fat, but too little vitamin D from sunshine. The theory casts doubt, therefore, on the alleged benefits of the ‘Mediterranean diet.’ The Mediterranean advantage would be the sunshine, not the food.
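
The statistic at issue here, by the way, is nothing exotic. The Pearson coefficient the U.K. researchers report is easy to compute yourself; here is a minimal sketch in Python, using made-up illustrative numbers rather than the actual Seven Countries data.

    import math

    def pearson_r(xs, ys):
        """Pearson correlation coefficient between two equal-length samples."""
        n = len(xs)
        mean_x, mean_y = sum(xs) / n, sum(ys) / n
        cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
        sd_x = math.sqrt(sum((x - mean_x) ** 2 for x in xs))
        sd_y = math.sqrt(sum((y - mean_y) ** 2 for y in ys))
        return cov / (sd_x * sd_y)

    # Hypothetical cohorts: latitude in degrees, CHD deaths per 1,000
    latitude = [38, 40, 42, 52, 56, 60]
    chd_deaths = [25, 20, 35, 55, 75, 85]
    print(round(pearson_r(latitude, chd_deaths), 2))  # 0.98: a strong positive correlation

A coefficient of 1.0 would mean a perfect straight-line relationship, so values like the 0.93 and 0.96 reported for latitude are remarkably high for epidemiological data.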

So much of what we think we know might not be so.

Lab-Grown Meat is Coming

Just four years ago, producing a single lab-grown beef burger cost $325,000. Now, researchers have got the cost down to about $11. If progress continues, lab-grown meat might be commercially viable in less than five years.

Mark Post, whose stem cell burger created an international sensation in 2013, recently announced that his company, Mosa Meat, would be selling lab-grown beef in four to five years.

Memphis Meats is developing a way to produce meat directly from animal cells without the need to feed, breed or slaughter actual animals.

In theory, the stem cells could provide a lot of meat. Assuming unlimited nutrients and room to grow, a single satellite cell can undergo 75 generations of division during three months. That means one turkey cell could turn into enough muscle to manufacture over 20 trillion turkey nuggets.
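
That figure is just the arithmetic of exponential doubling. Here’s a quick back-of-the-envelope check; the cells-per-nugget number is my own rough assumption, not something from the article.

    # One cell doubling for 75 generations, and what that implies in nuggets.
    # CELLS_PER_NUGGET is an assumed figure for illustration only.
    GENERATIONS = 75
    CELLS_PER_NUGGET = 1.8e9

    total_cells = 2 ** GENERATIONS            # about 3.78e22 cells
    nuggets = total_cells / CELLS_PER_NUGGET  # about 2.1e13, i.e. 21 trillion

    print(f"{total_cells:.2e} cells -> {nuggets:.2e} nuggets")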

Some animal lovers are welcoming lab-grown meat because it means that cows and pigs will no longer be slaughtered. It also means, however, that nobody will have an incentive to raise them any longer. The population of farm animals would collapse. The future of domesticated livestock might be the zoo, and zoos might offer the only way to save these animals from outright extinction.

By the way, the invention of lab-grown meat was foreseen some 85 years ago by none other than Sir Winston Churchill, who was ridiculed for his prediction. Churchill was just a little too optimistic about the time line, however, as he thought lab-grown meat would be viable by the 1980s.

Progressives Finalizing Their Religious Diet

Increasingly, the only way I can make sense of modern liberalism is by viewing it as a neo-pagan, post-Christian religion. Most religions impose some type of dietary restrictions. Jews, for instance, can’t eat pork or combine meat with dairy. Mormons can’t have alcohol or caffeine. And so on. It’s perhaps only natural, therefore, that liberalism should develop a religiously dictated diet of its own. News out of California–the Holy Land of liberalism–suggests that liberalism’s approved diet is finally taking shape. Like many religious diets, it will include little or no meat.

FOE [Friends of the Earth] gave kids a lunch menu designed to eliminate foods it says are “unsustainable for our planet.” The new menu features far less meat and more plant-based food. Any meat or cheese the school did use came from “pastured, organic dairy cows.” The student’s lunch menu went from beef hot dogs and pepperoni pizza to vegan stir fry tofu and vegan tostadas. The new FOE-approved menu served meat and cheese less frequently and reduced the portion sizes.

Needless to say, none of this has any scientific legitimacy. The term “unsustainable for our planet” has absolutely no scientific meaning.

“This is a landmark moment for school food,” Jennifer LeBarre, head of nutrition services for Oakland Unified School District, said in a FOE press statement. “We were so excited to see how the data showed that we could reduce our carbon and water footprint by serving healthy, delicious food –– like the vegetarian tostadas with fresh made in-house salsa, that kids absolutely love –– all while saving money.”

The old morality: The Ten Commandments

The new morality: Reducing your “carbon and water footprint”

The district and FOE claimed the lunch program was healthier than before, but only on the basis that food from plants is typically healthier than meat.

Of course, the assertion that “food from plants is typically healthier than meat” has no scientific justification. Vegetarians, on average, are less healthy both physically and mentally than meat eaters.

Concerning self-reported health, vegetarians differ significantly from each of the other groups, toward poorer health (p = .000). Moreover, these subjects report higher levels of impairment from disorders (p = .002). Vegetarians additionally report more chronic diseases than those eating a carnivorous diet less rich in meat (p = .000; Table 2). Significantly more vegetarians suffer from allergies, cancer, and mental health ailments (anxiety, or depression) than the other dietary habit groups (Table 3).

A meatless diet seems particularly unhealthy for children, who need dietary fat for brain development.

But that’s science, and therefore irrelevant to the modern religion of liberalism.

The scary part is that if liberals can impose their dietary restrictions on schoolchildren, the next logical step will be to impose them on the rest of us.

Maybe someday the liberal Taliban will force meat-eaters to go into hiding, and dietary dissidents will risk arrest for scoffing brisket at underground barbecues.

Oakland schools partnered with the environmental group Friends of the Earth (FOE) to fight global warming by making student lunches climate-friendly.

Friends of the Earth. Enemies of Humanity.

Are Prescription Drugs All Crocks?

OK, not all drugs are crocks, but a lot of them are. An astonishing 70 percent of Americans take at least one prescription drug. How many of those drugs are useless or even harmful?

Consider the two most prescribed classes of drugs: anti-cholesterol agents (statins) and anti-depressants. The scientific theories underlying both drug classes have more or less been debunked.

The theory underlying statins is the so-called lipid hypothesis of heart disease. This theory has been around for 60 years, but was never supported by very much scientific evidence. The latest and best evidence generally contradicts the lipid hypothesis. Simply put, there is no correlation between cholesterol and heart disease. Heart patients admitted to hospitals have an average cholesterol level no higher than the population as a whole. The overall correlation between cholesterol and life expectancy is positive–people with higher cholesterol live longer on average. Yet reducing serum cholesterol is the intent of the number one class of drugs in America.

The number two class of drugs consists of anti-depressants called selective serotonin-reuptake inhibitors (SSRIs). These drugs are supposed to reduce depression by regulating serotonin in the brain. The problem is that the scientific evidence totally contradicts the serotonin theory of depression.

One of the leading myths that unfortunately still circulates about clinical depression is that it’s caused by low serotonin levels in the brain (or a “biochemical imbalance”). This is a myth because countless scientific studies have specifically examined this theory and have come back universally rejecting it.

So let’s put it to rest once and for all — low levels of serotonin in the brain don’t cause depression.

Regarding SSRIs, there is a growing body of medical literature casting doubt on the serotonin hypothesis, and this body is not reflected in the consumer advertisements. In particular, many SSRI advertisements continue to claim that the mechanism of action of SSRIs is that of correcting a chemical imbalance, such as a paroxetine advertisement, which states, “With continued treatment, Paxil can help restore the balance of serotonin…” [22].

Yet […] there is no such thing as a scientifically established correct “balance” of serotonin. The take-home message for consumers viewing SSRI advertisements is probably that SSRIs work by normalizing neurotransmitters that have gone awry. This was a hopeful notion 30 years ago, but is not an accurate reflection of present-day scientific evidence.

As we reported previously, SSRIs might effectively reduce depression, but only through a placebo effect. Sugar pills also reduce depression, but with the sugar pill you wouldn’t get the nasty SSRI side effects, including increased risk of suicide, stroke, and death. Yet doctors keep handing this stuff out like candy. Perhaps part of the problem is that SSRIs can be prescribed by general practitioners, even though they have no qualifications in psychiatry.

So much for the top two classes of drugs. Just a bit further down the list are antacids known as proton-pump inhibitors (PPIs). Some of the familiar marketing names include Nexium, Prevacid, and Prilosec. As noted in the latest issue of Scientific American, long-term use of these drugs is now being linked to kidney problems as well as dementia.

[T]wo studies linked the regular use of proton-pump inhibitors to conditions that were seemingly unrelated to the acid levels of the stomach. One of the studies, published in JAMA Neurology, found that the drugs increased the risk of developing dementia, including Alzheimer’s disease; the other, published in JAMA Internal Medicine, suggested a greater risk of kidney problems.

The studies reported in 2016 grew out of earlier hints that such chronic use could affect the brain and kidneys. One 2013 study in PLOS ONE, for instance, found that proton-pump inhibitors can enhance the production of beta-amyloid proteins, a hallmark of Alzheimer’s. Three years later the JAMA Neurology study, which included 74,000 Germans older than 75, found that regular PPI users had a 44 percent higher risk of dementia than those not taking PPIs.

Similarly, worries about kidneys emerged from evidence that people with sudden renal damage were more likely to be taking PPIs. In one 2013 study in BMC Nephrology, for example, patients with a diagnosis of kidney disease were found to be twice as likely as the general population to have been prescribed a PPI. The 2016 study of PPIs and kidney disease, which followed 10,482 participants from the 1990s through 2011, showed that those who took the drug suffered a 20 to 50 percent higher risk of chronic kidney disease than those who did not. And anyone who took a double dose of PPIs every day had a much higher risk than study subjects who took a single dose.

Gotta wonder how many of those people who are risking their health by popping purple pills could easily get relief by taking just a relatively harmless Tums.

So statins, antidepressants, and antacids–three of the top eight classes of prescription drugs–appear to do more harm than good. How many others?

What a disgrace. The medical profession and the pharmaceutical industry should be ashamed of themselves.

A Relative Age Effect in ADHD Diagnoses

One of the more interesting–and robust–empirical relationships discovered in recent years is the so-called relative age effect. Schoolkids in a given grade can differ in age by up to a full year, depending on their birthdays. This age difference can result in significant variation in physical and emotional development among kids in the same cohort. The difference in relative maturity between a 36-year-old and a 35-year-old is negligible, but a 7-year-old is nearly 17% older than a 6-year-old. The older kids seem to more easily develop confidence and self-esteem, which bolsters their likelihood of life success in the long run. The younger kids lag behind, and never seem to catch up. The effect seems to carry over even into adulthood. The kids with late birthdays have less success as adults.
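
The arithmetic behind that observation is worth making explicit: a fixed one-year gap shrinks as a share of age, which is why the effect should be strongest in the earliest grades. A trivial illustration:

    # A one-year age gap as a percentage of the younger person's age.
    for age in (6, 10, 14, 35):
        print(f"{age + 1} vs {age}: {100 / age:.1f}% older")
    # 7 vs 6: 16.7% older ... 36 vs 35: 2.9% older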

The relative age effect has been well documented in various contexts including sports and academia. About 40% of pro hockey players from Canada, as well as European soccer players, have birthdays during the first quarter of the year. Other studies show that students with late birthdays are less likely to go to college.

Most recently, studies have shown that students with late birthdays are also more likely to be diagnosed with ADHD.

New research has found the youngest children in West Australian primary school classes are twice as likely as their oldest classmates to receive medication for Attention Deficit Hyperactivity Disorder (ADHD).
Published in the Medical Journal of Australia, the research analysed data for 311,384 WA schoolchildren, of whom 5,937 received at least one government subsidised ADHD prescription in 2013. The proportion of boys receiving medication (2.9%) was much higher than that of girls (0.8%).

Among children aged 6–10 years, those born in June (the last month of the recommended school-year intake) were about twice as likely (boys 1.93 times, girls 2.11 times) to have received ADHD medication as those born in the first intake month (the previous July).

For children aged 11–15 years, the effect was smaller, but still significant.

Previous studies have found similar results for students in the U.S., Canada, and Taiwan.

The likely reason for the higher rate of diagnosis is that kids with late birthdays have more behavioral and emotional problems due to their relative lack of maturity. But that suggests that the students are being misdiagnosed. The purpose of ADHD medications is supposed to be the treatment of a neurological disorder. Youthful immaturity, however, is not a neurological disorder.

Multiple studies, including the WA study, have established boys are three to four times more likely to be medicated for ADHD. If, as is routinely claimed, ADHD is a neurobiological disorder, a child’s birthdate or gender should have no bearing on their chances of being diagnosed.

Well, the article’s point about ‘gender’ doesn’t make sense, because there are plenty of afflictions that have greater incidence among boys, including autism and learning disabilities. We can’t, therefore, conclude that anything is amiss from the fact that boys get diagnosed more. The correlation with birthdate, however, does suggest the drugs are improperly prescribed. There’s no reason that a neurological disorder should correlate with date of birth. This is strong evidence that, at the very least, ADHD drugs are overprescribed. And a more cynical interpretation is that ADHD is not real at all.

Government Wrong About Milk Too

Over a year ago, we reported on a study from New Zealand that found whole milk to be healthier than low-fat. Now comes more evidence, this time from Canada, that whole milk is healthier.

New Canadian research has found that children who drink whole milk are leaner than those who drink low-fat and skimmed versions.
The study, published this week in the American Journal of Clinical Nutrition, also found that kids who consume whole milk have higher vitamin D levels.
Carried out by a team from St. Michael’s Hospital in Toronto, the researchers looked at 2,745 children ages two to six years.
The team surveyed parents on milk consumption, measured the children’s height and weight to calculate Body Mass Index (BMI), and took blood samples to assess vitamin D levels.

The researchers found that the children who drank whole milk had a Body Mass Index score 0.72 units lower than those who drank 1 or 2 per cent milk, comparable to the difference between having a healthy weight and being overweight, commented the study’s lead author Dr. Jonathon Maguire.

In addition, children who drank one cup of whole milk each day had higher vitamin D levels, comparable to those who drank nearly 3 cups of one percent milk.
The team suggested that the higher vitamin D levels could be explained by the vitamin being fat soluble. As it dissolves in fat rather than water, milk with a higher fat content therefore contains more vitamin D.
There could be an inverse relationship between body fat and vitamin D stores, and as children’s body fat increases, their vitamin D levels decrease.
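
For reference, BMI is simply weight in kilograms divided by height in meters squared. A minimal sketch, with made-up numbers for illustration:

    def bmi(weight_kg, height_m):
        """Body Mass Index: kg / m^2."""
        return weight_kg / height_m ** 2

    # Hypothetical five-year-old: 19 kg, 1.08 m tall
    print(round(bmi(19, 1.08), 1))  # 16.3

    # The study's 0.72-unit BMI gap corresponds, at this height,
    # to roughly 0.72 * 1.08**2 = 0.84 kg of body weight.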

And of course, the Canadian government is giving out wrong advice, just as all governments seem to do about everything.

Current guidelines from Health Canada, National Institutes of Health and American Academy of Pediatrics go against the findings from the study, recommending two servings of low fat (one percent or two percent) milk for children over the age of two to reduce the risk of childhood obesity, with Dr. Maguire commenting that the new research indicates a need to look again at existing nutritional guidelines.
Childhood obesity has tripled in North America in the past 30 years while consumption of whole milk has halved over the same period.

I must say I was taken aback recently when I noticed that the small milk bottles offered at McDonald’s and served mostly to kids contain one percent milk. Fat is healthy for kids and important for brain development. Serving kids low-fat or skim milk is just so sad.

Academia Rewards Bad Science

Rewards in academia–promotions, pay increases–are tied to publishing in prestigious academic journals. Publishing in those journals, however, is more likely if the results of the study appear to be novel or surprising. A bias exists, therefore, for academics to produce results that are surprising, even if wrong. Indeed, most results that are surprising are also wrong. The problem is so bad that some have argued that most published research findings are false. Now researchers have run a computer simulation suggesting that published research will indeed tend to be false.

To draw attention to the way good scientists are pressured into publishing bad science (read: sensational and surprising results), researchers in the US developed a computer model to simulate what happens when scientists compete for academic prestige and jobs.
In the model, devised by researchers at the University of California, Merced, all the simulated lab groups they put in these scenarios were honest – they didn’t intentionally cheat or fudge results.

But they received greater rewards if they published ‘novel’ findings – as happens in the real world. They also had to expend greater effort to be rigorous in their methods – which would improve the quality of their research, but lower their academic output.

“The result: Over time, effort decreased to its minimum value, and the rate of false discoveries skyrocketed,” lead researcher Paul Smaldino explains in The Conversation.

“As long as the incentives are in place that reward publishing novel, surprising results, often and in high-visibility journals above other, more nuanced aspects of science, shoddy practices that maximise one’s ability to do so will run rampant,” Smaldino told Hannah Devlin at The Guardian.
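
To see how such a dynamic plays out, here is a toy version of that kind of model, my own simplification rather than the Merced group’s actual code. Every lab is honest; labs differ only in effort. Low effort means more papers but more false findings, and prestige (paper count) determines who gets imitated:

    import random

    random.seed(1)
    labs = [random.uniform(0.2, 1.0) for _ in range(100)]  # effort per lab

    def papers(effort):
        return 10 * (1.1 - effort)   # less rigor -> more output

    def false_rate(effort):
        return 1 - 0.9 * effort      # less rigor -> more false findings

    for generation in range(500):
        # Selection rewards output, not rigor: the most productive lab
        # is copied (with a little mutation); the least productive dies.
        labs.sort(key=papers, reverse=True)
        labs[-1] = min(1.0, max(0.05, labs[0] + random.gauss(0, 0.02)))

    mean_effort = sum(labs) / len(labs)
    print(f"mean effort: {mean_effort:.2f}")   # drifts toward the minimum
    print(f"false discovery rate: {sum(map(false_rate, labs)) / len(labs):.2f}")

The point of the exercise: no dishonesty is required anywhere in the loop. Selection on publication volume alone is enough to drive rigor down and false discoveries up.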

Note carefully that the model predicted science would turn out bad even though the researchers were ‘good’ and did not commit fraud by fabricating data or results. But in the real world, fraud does happen. For instance, Yoshitaka Fujii, a researcher in anesthesiology, was found to have fabricated results in at least 183 papers. Taking into account the possibility of fraud suggests that science in the real world might be even more corrupted than the model predicts.

Bear in mind, however, that the computer simulation is itself an academic study. The simulation says that academic studies tend to be false, which implies the simulation is false, which implies academic studies are not false, which implies the simulation is not false, which implies academic studies are false…in an infinite loop. This is an example of the so-called Liar Paradox.

If “this sentence is false” is true, then the sentence is false, but if the sentence states that it is false, and it is false, then it must be true, and so on.

But leaving aside this problem of logic, my own experience in academia tells me that a lot of published research is in fact bad science.

Tesla’s Scaremongering

Tesla Motors is a company that probably could not survive without the massive subsidies it receives from taxpayers. To keep those subsidies coming, and to convince people with more money than brains that buying a Tesla vehicle means ‘saving the planet,’ it’s in Tesla’s interest to promote the big-money scam known as ‘climate change.’

[Image: Tesla graphic labeling 350 ppm of atmospheric CO2 as “safe” and 400 ppm as “unsafe”]

This is pure scaremongering. There is in fact no scientific justification for calling 350 ppm ‘safe’ and 400 ppm ‘unsafe.’ The climate record has demonstrated relatively little sensitivity to changes in CO2, and so a climate with 400 ppm hardly differs in any noticeable way from a climate with 350 ppm, except by being a bit greener, since CO2 is plant food.

Of course, climate alarmists have often argued that global warming will lead to more storms and/or stronger storms. Basic meteorology, however, states that storminess is caused by the temperature differential between the equator and the poles. And interestingly, models of global warming predict that as the earth warms the temperature differential should decrease (because the poles will warm more than the equator), which suggests fewer storms.

Certainly in the past, when CO2 was lower, Earth experienced no shortage of extreme weather events. As far back as 1362, during the Little Ice Age when the climate was cold and CO2 low, an enormous storm devastated England and Northern Europe. In England, the storm was remembered as The Great Wind, and on the continent as Grote Mandrenke, the “great drowner of men.” A report from Canterbury stated that

around the hour of vespers on that day, dreadful storms and whirlwinds such as never been seen or heard before occurred in England, causing houses and buildings for the most part to come crashing to the ground, while some others, having had their roofs blown off by the force of the winds, were left in the ruined state; and fruit trees in gardens and other places, along with other trees standing in the woods and elsewhere, were wrenched from the earth by their roots with a great crash, as if the Day of Judgement were at hand, and fear and trembling gripped the people of England to such an extent that no one knew where he could safely hide, for church towers, windmills, and many dwelling-houses collapsed to the ground…

The storm killed “at minimum” 25,000 people, and altered the geography of Europe.

An immense storm tide of the North Sea swept far inland from England and the Netherlands to Denmark and the German coast, breaking up islands, making parts of the mainland into islands, and wiping out entire towns and districts, such as Rungholt on the island of Strand in North Frisia, Ravenser Odd in East Yorkshire and the harbour of Dunwich.

Tony Heller, aka Steve Goddard, offers some more recent examples.

[Images: Tony Heller’s examples of extreme weather from earlier, lower-CO2 decades]

Leaving aside the anecdotal evidence, systematic studies have uncovered no convincing evidence that warming in recent decades has increased the magnitude or frequency of storms. For instance, a study published in 2010 concluded the following.

No significant time-dependent trends were identified for precipitation or snowfall from East Coast Winter Storms or for the percentage of precipitation or snowfall from East Coast Winter Storms.

Even the UN’s generally alarmist IPCC report assigns ‘low confidence’ to the notion that storms have become more powerful.

In summary, confidence in large scale changes in the intensity of extreme extra-tropical cyclones since 1900 is low.

CO2 is not a pollutant; it is plant food. All life on Earth depends on CO2, and life evolved at levels of CO2 roughly 15 times as high as exist today.

In a video presentation, Patrick Moore, a founder of Greenpeace, explains that 50 million years ago, CO2 in the atmosphere was nearly 18 times greater than today. The long-term trend in CO2 over the past 150 million years is downward. If CO2 were ever to fall below 150 ppm, life on Earth as we know it would come to an end.

Harvard Misled Public on Heart Disease

Back when I was a student I had a housemate who loved bacon and would cook a whole pound of the stuff and consume it all in one sitting. Our other housemate referred to the dish as “heart attack in a pan.” Like most Americans since the 1970s or 1980s, he believed that saturated fat like bacon grease was a cause of heart disease.

Both newer and older wisdom, however, suggest that heart disease has much more to do with simple carbohydrates, like sugar. Gary Taubes conducted an exhaustive survey of the nutrition literature and found that German researchers in particular were linking heart disease to sugar as early as the 1930s.

Starting in the 1950s and 1960s, however, the narrative shifted away from sugar to saturated fat. What caused this change? Well, as usual, if you want to know the answer, you have to follow the money.

The sugar industry paid scientists in the 1960s to play down the link between sugar and heart disease and promote saturated fat as the culprit instead, newly released historical documents show.

The internal sugar industry documents, recently discovered by a researcher at the University of California, San Francisco, and published Monday in JAMA Internal Medicine, suggest that five decades of research into the role of nutrition and heart disease, including many of today’s dietary recommendations, may have been largely shaped by the sugar industry.

“They were able to derail the discussion about sugar for decades,” said Stanton Glantz, a professor of medicine at U.C.S.F. and an author of the JAMA Internal Medicine paper.

The documents show that a trade group called the Sugar Research Foundation, known today as the Sugar Association, paid three Harvard scientists the equivalent of about $50,000 in today’s dollars to publish a 1967 review of research on sugar, fat and heart disease. The studies used in the review were handpicked by the sugar group, and the article, which was published in the prestigious New England Journal of Medicine, minimized the link between sugar and heart health and cast aspersions on the role of saturated fat.

The Harvard scientists and the sugar executives with whom they collaborated are no longer alive. One of the scientists who was paid by the sugar industry was D. Mark Hegsted, who went on to become the head of nutrition at the United States Department of Agriculture, where in 1977 he helped draft the forerunner to the federal government’s dietary guidelines. Another was Dr. Fredrick J. Stare, the chairman of Harvard’s nutrition department.

The documents show that in 1964, John Hickson, a top sugar industry executive, discussed a plan with others in the industry to shift public opinion “through our research and information and legislative programs.”

In 1965, Mr. Hickson enlisted the Harvard researchers to write a review that would debunk the anti-sugar studies. He paid them a total of $6,500, the equivalent of $49,000 today. Mr. Hickson selected the papers for them to review and made it clear he wanted the result to favor sugar.

Harvard’s Dr. Hegsted reassured the sugar executives. “We are well aware of your particular interest,” he wrote, “and will cover this as well as we can.”

As they worked on their review, the Harvard researchers shared and discussed early drafts with Mr. Hickson, who responded that he was pleased with what they were writing. The Harvard scientists had dismissed the data on sugar as weak and given far more credence to the data implicating saturated fat.

“Let me assure you this is quite what we had in mind, and we look forward to its appearance in print,” Mr. Hickson wrote.

The title of this article in the New York Times is “How the Sugar Industry Shifted Blame to Fat,” and people in the comments thread are focusing their ire on the industry. But it takes two to tango, and the article could just as easily have been titled “How Harvard Deceived the Public About Heart Disease.” The actions of the industry are deplorable, but the fact is that the Harvard academics sold out. I would argue that the role of the academics is even more appalling, because they are the ones who have the social responsibility for maintaining scientific integrity. The primary responsibility of the sugar industry is to produce sugar, and despite this incident, we’ll need to keep them around in order to supply us with the sugar we consume (in moderation!). But if academics won’t adhere to the truth and to scientific integrity, who needs them?

Also revealing is how cheaply the academics were willing to sell out. The Times claims that the amount is equivalent to $49,000 in current dollars. That’s actually pretty close. Adjusting for the growth in average wages gives a figure of about $53,000. There were three authors, so we’re talking an average of roughly $17,000 per author. That seems a pretty small sum for misleading the entire country about heart disease, resulting in the loss of God knows how many lives. I mean, it’s not like $17,000 buys you a summer home in the Hamptons.
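
The arithmetic, for the record; the wage-growth multiplier here is my own approximation, chosen to match the $53,000 figure above:

    payment_1965 = 6_500
    wage_multiplier = 8.2            # assumed growth in average wages since 1965
    authors = 3

    today = payment_1965 * wage_multiplier
    print(round(today))              # ~53,300
    print(round(today / authors))    # ~17,767 per author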

What to do? Well, don’t expect any solution to come out of Harvard. Instead they offer up this howler.

Dr. Walter Willett, chairman of the nutrition department at the Harvard T. H. Chan School of Public Health, said that academic conflict-of-interest rules had changed significantly since the 1960s, but that the industry papers were a reminder of “why research should be supported by public funding rather than depending on industry funding.”

So researchers funded by industry can be expected to do the bidding of industry. But researchers funded by government won’t do the bidding of government?

Send in the clowns.

Are People Getting Dumber?

Psychologist Michael Woodley claims that, since the Victorian Era, average IQ has fallen by 10-15 points, as much as a full standard deviation. Woodley came to this conclusion by measuring reaction times, which correlate with IQ, and comparing his results to measurements made by the famous statistician Francis Galton during the 1880s.
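
To give a flavor of how such an estimate works: under a simple linear model, a shift of x standard deviations in reaction time implies a shift of roughly r·x standard deviations in IQ, where r is the correlation between the two. The numbers below are assumptions for illustration, not Woodley’s actual figures.

    # Mapping a reaction-time slowdown to IQ points under a linear model.
    # All inputs are illustrative assumptions, not Woodley's published values.
    r = -0.5           # assumed correlation between reaction time and IQ
    rt_shift_ms = 70   # assumed slowdown in mean reaction time since the 1880s
    rt_sd_ms = 50      # assumed population SD of reaction time
    iq_sd = 15         # IQ points per standard deviation

    iq_change = r * (rt_shift_ms / rt_sd_ms) * iq_sd
    print(f"implied IQ change: {iq_change:.1f} points")  # -10.5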

Woodley’s conclusion seems to contradict the Flynn Effect, which is the fact that scores on pencil-and-paper IQ tests have increased in the long term by about 3 points per decade. Woodley argues, however, that IQ tests cannot be used to make comparisons in different time periods. Over time, IQ scores on tests have increased, but this doesn’t necessarily imply that people have gotten smarter; it could be that people have just gotten better at taking standardized pencil-and-paper tests.

Woodley says his results on reaction times are reinforced by results using other measures that correlate with IQ, such as repeating digits backwards and (oddly enough) distinguishing between fine gradations of color. These criteria seem relatively immune to the sorts of bias that plague written IQ tests.

The cause of the 10-15 point drop remains a mystery. Woodley says that dysgenic fertility (the tendency for low IQ women to have more children than high IQ women) can account for a loss of only 3 or 4 points during the relevant time span. So about two-thirds of the decline remains unaccounted for. He doesn’t mention it, but it seems to me that the drop in childhood mortality should have had some effect as well. Lower childhood mortality probably implies that more relatively low-IQ children survive to adulthood.

Here’s an interview with Woodley in which he discusses his research. The interview is long and somewhat high-brow, but rewards scrutiny.