Journalists Don’t Know Much About History: An Ongoing Series

Many popular media outlets reported recently on a fascinating new scientific study published in Nature Communications. Here’s how Britain’s Daily Mail interpreted the story.

The first ever full-genome analysis of Ancient Egyptians shows they were more Turkish and European than African.

Scientists analysed ancient DNA from Egyptian mummies dating from 1400 BC to 400 AD and discovered they shared genes with people from the Mediterranean.

They found that ancient Egyptians were closely related to ancient populations in the Levant – now modern day Turkey, Syria, Jordan, Israel and Lebanon.

They were also genetically similar to Neolithic populations from the Anatolian Peninsula and Europe.

The groundbreaking study used recent advances in DNA sequencing techniques to undertake a closer examination of mummy genetics than ever before.

The study is interesting and seems to deliver a devastating blow to the Black Egyptian Hypothesis, which has been advocated by many serious people, including W. E. B. Du Bois. Contrary to the hypothesis, the ancient Egyptians don’t seem to have been appreciably more black than were, say, the ancient Greeks, although the study did not include evidence from southern (Upper) Egypt, where sub-Saharan admixture may have been relatively greater.

What are we to make, however, of the Daily Mail’s claim that ancient Egyptians were “Turkish”? The study itself did not make that claim. What the study showed was that ancient Egyptians shared genetics with ancient inhabitants of Anatolia. But the ancient inhabitants of Anatolia did not include any Turks. The Turks were an Asian tribe that did not enter Anatolia until the medieval period. The door opened to large-scale Turkish migration to Anatolia after the Seljuk Turks defeated the Byzantines at the Battle of Manzikert in 1071.

Of course, modern Turkey must today include many people descended from those ancient Anatolians who were related to the ancient Egyptians. But over the centuries there has occurred so much movement of peoples, including the Turks, that modern populations are very different genetically from the ancient populations who occupied the same territory. So direct comparisons to modern populations are exceedingly hazardous. If anything, it’s probably more accurate to say the ancient Egyptians were “Greek” than to say they were “Turkish,” although the Nature Communications study makes neither assertion.

What the study suggests is merely that the ancient Egyptians were a Mediterranean people, and not sub-Saharan. But under no reasonable interpretation could the ancient Egyptians be called “Turkish,” and I doubt anybody with even a passing familiarity with the history of the Turks or the Byzantine Empire would differ with this conclusion.

And so we have just one more data point, as if any more were needed, that journalists don’t know much. But that, of course, doesn’t stop them from trying to lecture to the rest of us.

A Generational Decline in Testosterone

Ever wonder how, in just three generations, American males went from G.I.s who defeated the Nazis and the Japanese Empire to whiny Pajama Boys who think Barack Obama is cool? Well, I’m just throwing this out there: maybe it has something to do with low-T. Several studies have found that contemporary western males have significantly lower testosterone levels than same-age males had roughly 30 years ago. A couple of studies first reported the secular decline in testosterone about ten years ago. One study focused on men in Massachusetts over age 45.

“Male serum testosterone levels appear to vary by generation, even after age is taken into account,” said Thomas G. Travison, Ph.D., of the New England Research Institutes (NERI) in Watertown, Mass., and lead author of the study. “In 1988, men who were 50 years old had higher serum testosterone concentrations than did comparable 50-year-old men in 1996. This suggests that some factor other than age may be contributing to the observed declines in testosterone over time.”

For men 65-69 years of age in this study, average total testosterone levels fell from 503 ng/dL (nanograms/deciliter) in 1988 to 423 ng/dL in 2003.
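
For scale, that works out to roughly a 16 percent decline over fifteen years. A quick check, using only the two figures quoted above:

```python
# Percentage decline in mean total testosterone for 65-69-year-olds, 1988 -> 2003,
# computed from the two figures quoted above.
t_1988, t_2003 = 503, 423            # ng/dL
decline = (t_1988 - t_2003) / t_1988
print(f"Decline: {decline:.1%}")     # ~15.9%
```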

Another study published the same year found similar results for men in Denmark. But that was 10 years ago, and I was wondering if any follow-up studies had been done since then. All I managed to find was a 2012 study from Finland. This study also found a secular decline in testosterone.

We analysed serum levels of testosterone, gonadotrophin and sex hormone binding globulin (SHBG) in 3271 men representing different ages (25–74 years) and birth cohorts within three large Finnish population surveys conducted in 1972, 1977 and 2002…The more recently born Finnish men have lower testosterone levels than their earlier born peers.

French leftists protesting in skirts

Notably, the fall in T-levels cannot be fully explained by changes in health or lifestyle such as obesity or smoking. Some other environmental factors must be responsible, but nobody knows which ones. Speculation ranges widely, from endocrine disruptors in plastics to tight underwear!

Whatever the cause, I wonder if this change in hormone levels has implications for male behavior and social outcomes. For instance, could low-T have an effect on marriage or divorce rates? And what about birth rates? (There is some mixed evidence suggesting that sperm quality and quantity have also declined.)

Right now, violent crime rates in America are at their lowest level in about 50 years. Could the drop in violent crime be caused at least in part by diminished male aggression due to lower testosterone?

Low-T might offer some benefits, like maybe lower crime rates, but the fact that some unknown factor is adversely affecting men’s health is nonetheless disturbing. And yet, nobody seems to care. As far as I know, there is no concern among advocacy groups or public health officials regarding the problem of secularly declining testosterone. Some endocrinologists have an academic interest in the issue, but it does not show up on the radar screen of people working in public health.

Imagine, however, if the sexes were reversed, and it were women instead of men who had exhibited a long-term decline in hormone levels. In that case, it would be a genuine public health crisis. We would all know about the problem, and the subject would be discussed endlessly on The View.

But when it happens to men: crickets. Men take note: Society does not care about you.

Bring Back Beef Tallow Fries!

For decades now, the government, nutritionists, and the medical profession have told us to replace dietary saturated fats with polyunsaturated fats like vegetable oils. That way, we can reduce the incidence of heart disease and live longer. This idea is so widely accepted that the bottle of Mazola corn oil is labeled “Heart Healthy” with a big red heart. The claim must be true, because otherwise the FDA would never allow Mazola to engage in misleading product labeling, amirite? No, the health benefits of vegetable oil are Settled Science. That’s why, about 25 years ago, McDonald’s had to stop cooking its french fries in beef tallow and switch to vegetable oil.

But how much real evidence do we have that vegetable oil is healthy? In general, it is hard to determine the relationship between diet and health because humans are not guinea pigs that can be subjected to a controlled diet in the laboratory. Usually, the best that researchers can do is to ask people what they eat, but people’s responses are notoriously unreliable.

Turns out, however, that over 40 years ago a rare study was done that did manage to precisely control human diets over a period of several years. The study achieved dietary control by focusing on patients confined to nursing homes or mental institutions. For some reason, however, the results of this uniquely valuable study were never published in full.

Thanks to Christopher Ramsden, who specializes in recovering lost studies, the results of the study were finally published last year, after literally being rescued from a moldy basement. And what did those results indicate?

Ramsden wondered if there was more data from the study somewhere.

In 2011, he sought out the sons of the experiment’s principal scientist, Dr. Ivan Frantz of the University of Minnesota, who died in 2009…

Dr. Robert Frantz, a physician at the Mayo Clinic, drove 90 minutes to his childhood home, to search file cabinets. On his third trip he spied moldering, unlabeled boxes in the far corner of the basement. Inside were ancient magnetic computer tapes and reams of yellowed documents. The subject line in his email to Ramsden was “Eureka.”

After getting the tapes translated into formats that modern computers can read, Ramsden and his colleagues discovered what had been hidden for nearly half a century: records on 9,423 study participants, ages 20 to 97, all living in state mental hospitals or a nursing home. It was the largest experiment of its kind.

It was also one of the most rigorous. Participants were randomly assigned either to the group eating the then-standard diet, which was high in animal fats and margarines, or to a group in which vegetable oil and corn oil margarine replaced about half of those saturated fats. Such a randomized controlled trial is considered less likely to produce misleading results than observational studies, in which volunteers eat whatever they choose. Observational studies are weaker than randomized ones because people who eat one way, rather than another, might have characteristics that benefit their heart health.

And because the Minnesota participants were in institutions that prepared all their meals and kept records, the scientists knew exactly what they ate for up to 56 months. Many nutrition studies have foundered because people misremember, or lie about, what they ate.

Analyzing the reams of old records, Ramsden and his team found, in line with the “diet-heart hypothesis,” that substituting vegetable oils lowered total blood cholesterol levels, by an average of 14 percent.

But that lowered cholesterol did not help people live longer. Instead, the lower cholesterol fell, the higher the risk of dying: 22 percent higher for every 30-point fall. Nor did the corn-oil group have less atherosclerosis or fewer heart attacks. [Emphasis added.]

Indeed, other studies also have shown that the correlation between serum cholesterol and life expectancy is positive, which means lower cholesterol is associated with shorter life. But your doc still wants you to bring down your cholesterol.

Time for McDonald’s to bring back the beef tallow fries. But of course if McDonald’s did that, all the corrupt and ignorant non-profit pressure groups would scream bloody murder.

Truly we live in a frustrating world of rent-seeking and ignorance, and rent-seeking enabled by ignorance.

Heart Disease: Do Docs Have No Clue?

I am not a doctor, and I don’t even play one on TV. I’m just an academic economist who often reads stuff on blogs and in layman’s publications about health and fitness. And it seems that whenever I read something about heart disease, it contradicts the medical profession’s conventional wisdom. Heart disease is the leading cause of death in America, and doctors act as if they know and can do a lot about it, but much of the stuff I read suggests that they have hardly a clue about what really causes heart disease or how to prevent or treat it.

Your doctor carefully tests your cholesterol level, and tells you that you should follow that number as closely as a CEO follows his corporation’s stock price. But as I reported back in June, the latest research shows that cholesterol is not a problem.

Cholesterol does not cause heart disease in the elderly and trying to reduce it with drugs like statins is a waste of time, an international group of experts has claimed.

A review of research involving nearly 70,000 people found there was no link between what has traditionally been considered “bad” cholesterol and the premature deaths of over 60-year-olds from cardiovascular disease.

Published in the BMJ Open journal, the new study found that 92 percent of people with a high cholesterol level lived longer.
[…]
“What we found in our detailed systematic review was that older people with high LDL (low-density lipoprotein) levels, the so-called “bad” cholesterol, lived longer and had less heart disease.”

Vascular and endovascular surgery expert Professor Sherif Sultan from the University of Ireland, who also worked on the study, said…“Lowering cholesterol with medications for primary cardiovascular prevention in those aged over 60 is a total waste of time and resources, whereas altering your lifestyle is the single most important way to achieve a good quality of life.”…

Lead author Dr Uffe Ravnskov, a former associate professor of renal medicine at Lund University in Sweden, said there was “no reason” to lower high-LDL-cholesterol.

And yet, anti-cholesterol drugs remain today the leading class of drugs prescribed in America. Do docs read the same things I read? One of us must be missing something.

Next, consider last month’s piece in The Atlantic about unnecessary medical procedures. One of the procedures highlighted by the article is heart stents.

In 2012, Brown had coauthored a paper that examined every randomized clinical trial that compared stent implantation with more conservative forms of treatment, and he found that stents for stable patients prevent zero heart attacks and extend the lives of patients a grand total of not at all. In general, Brown says, “nobody that’s not having a heart attack needs a stent.” (Brown added that stents may improve chest pain in some patients, albeit fleetingly.) Nonetheless, hundreds of thousands of stable patients receive stents annually, and one in 50 will suffer a serious complication or die as a result of the implantation procedure.

In particular, you can die from a post-operative blood clot. For the sake of an unnecessary procedure. Good work, docs!

Unlike statins, blood pressure medications are something I have never really questioned. Cholesterol numbers might be meaningless, but surely blood pressure means something, right? And blood pressure medications really do effectively bring down pressure. That would seem to be obviously beneficial, since lowering pressure reduces strain on the heart. Indeed, the conventional wisdom holds that one class of blood pressure medications, the beta-blockers, has saved untold numbers of lives. And yet, the same Atlantic piece casts doubt on the usefulness of beta-blockers.

[T]he latest review of beta-blockers from the Cochrane Collaboration—an independent, international group of researchers that attempts to synthesize the best available research—reported that they “are not recommended as first line treatment for hypertension as compared to placebo due to their modest effect on stroke and no significant reduction in mortality or coronary heart disease.”

That somewhat awkward language might require a bit of translation. “Not recommended…compared to placebo” means the beta-blockers are worse than doing nothing. They do more harm than good. And the “modest effect on stroke” refers not to a decreased but to an increased risk of stroke. The beta-blockers modestly increase the risk of stroke without reducing the risk of “mortality or coronary heart disease.” What a deal.

Finally, here’s something else I ran across this week. The so-called Seven Countries Study is the most famous study to link heart disease to saturated fat and cholesterol. The 25-year follow-up to the original study again found a significant correlation between cholesterol and heart disease. Researchers in the U.K., however, analyzed the same data and found that heart disease correlated even more closely with…wait for it…latitude.

The Pearson correlation coefficient was calculated as 0.72 for baseline cholesterol and CHD deaths at 25 years. The data in the 1993 Menotti article has been examined to repeat the correlations found with CHD death rates and mean serum cholesterol to understand the data and methodology used. The same methodology was then used to explore alternative correlations. The strongest relationship found was for CHD death rates and the latitude of the country or cohort in The Seven Countries Study. The correlation coefficient for CHD deaths and latitude of the cohort was 0.93. The correlation coefficient for CHD deaths and latitude of the country was 0.96. While Keys did find a strong association with median serum cholesterol and CHD deaths, there were stronger associations that were discoverable.
The latitude finding offers an alternative explanation for the observed relationship with cholesterol and CHD. Vitamin D is made when sunshine synthesises cholesterol in skin membranes. In cohorts further [sic] away from the equator, cholesterol is less able to be turned into vitamin D.
Population mean serum cholesterol levels are higher and concomitantly population mean vitamin D levels are lower. Higher CHD could be associated with lower vitamin D, with cholesterol a marker, not a maker, of heart disease.
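
Out of curiosity, here is roughly what that kind of comparison looks like in practice. This is only a sketch with made-up cohort numbers, not the actual Seven Countries data; it just shows how a Pearson correlation is computed and how two candidate predictors can be compared using the same method.

```python
# Sketch of comparing Pearson correlations across candidate predictors, in the spirit
# of the U.K. re-analysis described above. All cohort values are invented placeholders,
# NOT the Seven Countries data.
import numpy as np

# One row per cohort: (mean serum cholesterol in mg/dL, latitude in degrees, CHD deaths per 1,000)
cohorts = np.array([
    [200, 35,  20],
    [160, 42,  45],
    [230, 52,  70],
    [270, 55,  90],
    [250, 60, 110],
])
chol, lat, chd = cohorts[:, 0], cohorts[:, 1], cohorts[:, 2]

def pearson(x, y):
    """Pearson r: covariance divided by the product of the standard deviations."""
    x, y = x - x.mean(), y - y.mean()
    return (x * y).sum() / np.sqrt((x * x).sum() * (y * y).sum())

print("r(cholesterol, CHD deaths):", round(pearson(chol, chd), 2))  # ~0.77 with these placeholder numbers
print("r(latitude,    CHD deaths):", round(pearson(lat, chd), 2))   # ~0.99 with these placeholder numbers
# The paper's point is methodological: computed the same way, its latitude correlation
# (0.93-0.96) came out stronger than its cholesterol correlation (0.72).
```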

So according to this theory, the problem is not too much saturated fat, but too little vitamin D from sunshine. The theory casts doubt, therefore, on the alleged benefits of the ‘Mediterranean diet.’ The Mediterranean advantage would be the sunshine, not the food.

So much of what we think we know might not be so.

Lab-Grown Meat is Coming

Just four years ago, a pound of lab-grown beef cost $325,000 to produce. Now, researchers have brought the cost down to about $11. If progress continues, lab-grown meat might be commercially viable in less than five years.

Mark Post, whose stem cell burger created an international sensation in 2013, recently announced that his company, Mosa Meat, would be selling lab-grown beef in four to five years.

Memphis Meats is developing a way to produce meat directly from animal cells without the need to feed, breed or slaughter actual animals.

In theory, the stem cells could provide a lot of meat. Assuming unlimited nutrients and room to grow, a single satellite cell can undergo 75 generations of division during three months. That means one turkey cell could turn into enough muscle to manufacture over 20 trillion turkey nuggets.
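
Just to sanity-check the scale of that claim, here is the back-of-the-envelope arithmetic. The cells-per-nugget figure is simply what the quoted numbers imply; it is not something the article states.

```python
# Back-of-the-envelope check of the "20 trillion nuggets" claim quoted above.
cells = 2 ** 75                 # 75 doublings from a single satellite cell: ~3.8e22 cells
nuggets_claimed = 20e12         # "over 20 trillion" nuggets

cells_per_nugget = cells / nuggets_claimed
print(f"Cells after 75 doublings: {cells:.2e}")
print(f"Implied cells per nugget: {cells_per_nugget:.2e}")   # ~1.9e9
# Roughly two billion cells per bite-sized piece of muscle seems like a plausible
# order of magnitude, so the quoted claim is at least internally consistent.
```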

Some animal lovers are welcoming lab-grown meat because it means that cows and pigs will no longer be slaughtered. It also means, however, that nobody will any longer have an incentive to raise them. The population of farm animals would undergo a complete collapse. The future of domesticated livestock might be the zoo, and zoos might offer the only opportunity to save these animals from extinction.

By the way, the invention of lab-grown meat was foreseen some 85 years ago by none other than Sir Winston Churchill, who was ridiculed for his prediction. Churchill was just a little too optimistic about the time line, however, as he thought lab-grown meat would be viable by the 1980s.

Progressives Finalizing Their Religious Diet

Increasingly, the only way I can make sense of modern liberalism is by viewing it as a neo-pagan, post-Christian religion. Most religions impose some type of dietary restrictions. Jews, for instance, can’t eat pork or combine meat with dairy. Mormons can’t have alcohol or caffeine. And so on. It’s perhaps only natural, therefore, that liberalism should develop a religiously dictated diet of its own. News out of California–the Holy Land of liberalism–suggests that liberalism’s approved diet is finally taking shape. Like many religious diets, it will include little or no meat.

FOE [Friends of the Earth] gave kids a lunch menu designed to eliminate foods it says are “unsustainable for our planet.” The new menu features far less meat and more plant-based food. Any meat or cheese the school did use came from “pastured, organic dairy cows.” The students’ lunch menu went from beef hot dogs and pepperoni pizza to vegan stir fry tofu and vegan tostadas. The new FOE-approved menu served meat and cheese less frequently and reduced the portion sizes.

Needless to say, none of this has any scientific legitimacy. The term “unsustainable for our planet” has absolutely no scientific meaning.

“This is a landmark moment for school food,” Jennifer LeBarre, head of nutrition services for Oakland Unified School District, said in a FOE press statement. “We were so excited to see how the data showed that we could reduce our carbon and water footprint by serving healthy, delicious food –– like the vegetarian tostadas with fresh made in-house salsa, that kids absolutely love –– all while saving money.”

The old morality: The Ten Commandments

The new morality: Reducing your “carbon and water footprint”

The district and FOE claimed the lunch program was healthier than before, but only on the basis that food from plants is typically healthier than meat.

Of course, the assertion that “food from plants is typically healthier than meat,” has no scientific justification. Vegetarians, on average, are less healthy both physically and mentally than meat eaters.

Concerning self-reported health, vegetarians differ significantly from each of the other groups, toward poorer health (p = .000). Moreover, these subjects report higher levels of impairment from disorders (p = .002). Vegetarians additionally report more chronic diseases than those eating a carnivorous diet less rich in meat (p = .000; Table 2). Significantly more vegetarians suffer from allergies, cancer, and mental health ailments (anxiety, or depression) than the other dietary habit groups (Table 3).

A meatless diet seems particularly unhealthy for children, who need dietary fat for brain development.

But that’s science, and therefore irrelevant to the modern religion of liberalism.

The scary part is that if liberals can impose their dietary restrictions on schoolchildren, the next logical step will be to impose them on the rest of us.

Maybe someday the liberal Taliban will force meateaters to go into hiding, and dietary dissidents will risk arrest for scoffing brisket at underground barbecues.

Oakland schools partnered with the environmental group Friends of the Earth (FOE) to fight global warming by making student lunches climate-friendly.

Friends of the Earth. Enemies of Humanity.

Are Prescription Drugs All Crocks?

OK, not all drugs are crocks, but a lot of them are. An astonishing 70 percent of Americans take at least one prescription drug. How many of those drugs are useless or even harmful?

Consider the two most prescribed classes of drugs: anti-cholesterol agents (statins) and anti-depressants. The scientific theories underlying both drug classes have more or less been debunked.

The theory underlying statins is the so-called lipid hypothesis of heart disease. This theory has been around for 60 years, but was never supported by very much scientific evidence. The latest and best evidence generally contradicts the lipid hypothesis. Simply put, there is no correlation between cholesterol and heart disease. Heart patients admitted to hospitals have an average cholesterol level no higher than the population as a whole. The overall correlation between cholesterol and life expectancy is positive–people with higher cholesterol live longer on average. Yet reducing serum cholesterol is the intent of the number one class of drugs in America.

The number two class of drugs consists of anti-depressants called selective serotonin reuptake inhibitors (SSRIs). These drugs are supposed to reduce depression by regulating serotonin in the brain. The problem is that the scientific evidence totally contradicts the serotonin theory of depression.

One of the leading myths that unfortunately still circulates about clinical depression is that it’s caused by low serotonin levels in the brain (or a “biochemical imbalance”). This is a myth because countless scientific studies have specifically examined this theory and have come back universally rejecting it.

So let’s put it to rest once and for all — low levels of serotonin in the brain don’t cause depression.

Regarding SSRIs, there is a growing body of medical literature casting doubt on the serotonin hypothesis, and this body is not reflected in the consumer advertisements. In particular, many SSRI advertisements continue to claim that the mechanism of action of SSRIs is that of correcting a chemical imbalance, such as a paroxetine advertisement, which states, “With continued treatment, Paxil can help restore the balance of serotonin…” [22].

Yet […] there is no such thing as a scientifically established correct “balance” of serotonin. The take-home message for consumers viewing SSRI advertisements is probably that SSRIs work by normalizing neurotransmitters that have gone awry. This was a hopeful notion 30 years ago, but is not an accurate reflection of present-day scientific evidence.

As we reported previously, SSRIs might effectively reduce depression, but only through a placebo effect. Sugar pills also reduce depression, but with the sugar pill you wouldn’t get the nasty SSRI side effects, including increased risk of suicide, stroke, and death. Yet doctors keep handing this stuff out like candy. Perhaps part of the problem is that SSRIs can be prescribed by general practitioners, even though they have no qualifications in psychiatry.

So much for the top two classes of drugs. Just a bit further down the list are antacids known as proton-pump inhibitors (PPIs). Some of the familiar marketing names include Nexium, Prevacid, and Prilosec. As noted in the latest issue of Scientific American, long-term use of these drugs is now being linked to kidney problems as well as dementia.

[T]wo studies linked the regular use of proton-pump inhibitors to conditions that were seemingly unrelated to the acid levels of the stomach. One of the studies, published in JAMA Neurology, found that the drugs increased the risk of developing dementia, including Alzheimer’s disease; the other, published in JAMA Internal Medicine, suggested a greater risk of kidney problems.

The studies reported in 2016 grew out of earlier hints that such chronic use could affect the brain and kidneys. One 2013 study in PLOS ONE, for instance, found that proton-pump inhibitors can enhance the production of beta-amyloid proteins, a hallmark of Alzheimer’s. Three years later the JAMA Neurology study, which included 74,000 Germans older than 75, found that regular PPI users had a 44 percent higher risk of dementia than those not taking PPIs.

Similarly, worries about kidneys emerged from evidence that people with sudden renal damage were more likely to be taking PPIs. In one 2013 study in BMC Nephrology, for example, patients with a diagnosis of kidney disease were found to be twice as likely as the general population to have been prescribed a PPI. The 2016 study of PPIs and kidney disease, which followed 10,482 participants from the 1990s through 2011, showed that those who took the drug suffered a 20 to 50 percent higher risk of chronic kidney disease than those who did not. And anyone who took a double dose of PPIs every day had a much higher risk than study subjects who took a single dose.

Gotta wonder how many of those people who are risking their health by popping purple pills could easily get relief by taking just a relatively harmless Tums.

So statins, antidepressants, and antacids–three of the top eight classes of prescription drugs–appear to do more harm than good. How many others?

What a disgrace. The medical profession and the pharmaceutical industry should be ashamed of themselves.

A Relative Age Effect in ADHD Diagnoses

One of the more interesting–and robust–empirical relationships discovered in recent years is the so-called relative age effect. Schoolkids in a given grade can differ in age by up to a full year, depending on their birthdays. This age difference can result in significant variation in physical and emotional development among kids in the same cohort. The difference in relative maturity between a 36-year-old and a 35-year-old is negligible, but a 7-year-old is nearly 17% older than a 6-year-old. The older kids seem to more easily develop confidence and self-esteem, which bolsters their likelihood of life success in the long run. The younger kids lag behind, and never seem to catch up. The effect seems to carry over even into adulthood. The kids with late birthdays have less success as adults.

The relative age effect has been well documented in various contexts including sports and academia. About 40% of pro hockey players from Canada, as well as European soccer players, have birthdays during the first quarter of the year. Other studies show that students with late birthdays are less likely to go to college.

Most recently, studies have shown that students with late birthdays are also more likely to be diagnosed with ADHD.

New research has found the youngest children in West Australian primary school classes are twice as likely as their oldest classmates to receive medication for Attention Deficit Hyperactivity Disorder (ADHD).
Published in the Medical Journal of Australia, the research analysed data for 311,384 WA schoolchildren, of whom 5,937 received at least one government subsidised ADHD prescription in 2013. The proportion of boys receiving medication (2.9%) was much higher than that of girls (0.8%).

Among children aged 6–10 years, those born in June (the last month of the recommended school-year intake) were about twice as likely (boys 1.93 times, girls 2.11 times) to have received ADHD medication as those born in the first intake month (the previous July).

For children aged 11–15 years, the effect was smaller, but still significant.
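
For what it’s worth, “twice as likely” here is just a ratio of medication rates between the youngest and oldest intake months. A toy calculation with invented counts shows the arithmetic:

```python
# Toy illustration of how a relative-risk figure like the 1.93 above is obtained.
# The counts below are invented for illustration; they are not the WA study's data.
def medication_rate(medicated, total):
    return medicated / total

rate_youngest = medication_rate(medicated=193, total=10_000)   # hypothetical boys born in June (youngest intake)
rate_oldest   = medication_rate(medicated=100, total=10_000)   # hypothetical boys born the previous July (oldest)

relative_risk = rate_youngest / rate_oldest
print(f"Youngest-month rate: {rate_youngest:.2%}")
print(f"Oldest-month rate:   {rate_oldest:.2%}")
print(f"Relative risk:       {relative_risk:.2f}")             # 1.93 with these invented counts
```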

Previous studies have found similar results for students in the U.S., Canada, and Taiwan.

The higher rate of diagnosis probably arises because kids with late birthdays have more behavioral and emotional problems due to their relative lack of maturity. But that suggests that the students are being misdiagnosed. The purpose of ADHD medications is supposed to be the treatment of a neurological disorder. Youthful immaturity, however, is not a neurological disorder.

Multiple studies, including the WA study, have established boys are three to four times more likely to be medicated for ADHD. If, as is routinely claimed, ADHD is a neurobiological disorder, a child’s birthdate or gender should have no bearing on their chances of being diagnosed.

Well, the article’s point about ‘gender’ doesn’t make sense, because there are plenty of afflictions that have greater incidence among boys, including autism and learning disabilities. We therefore can’t conclude, from the fact that boys get diagnosed more, that anything is amiss. The correlation with birthdate, however, does suggest the drugs are improperly prescribed. There’s no reason that neurological disorders should correlate with date of birth. This is strong evidence that, at the very least, ADHD drugs are overprescribed. And a more cynical interpretation is that ADHD is not real at all.

Government Wrong About Milk Too

Over a year ago, we reported on a study from New Zealand that found whole milk to be healthier than low-fat. Now comes more evidence, this time from Canada, that whole milk is healthier.

New Canadian research has found that children who drink whole milk are leaner than those who drink low-fat and skimmed versions.
The study, published this week in the American Journal of Clinical Nutrition, also found that kids who consume whole milk have higher vitamin D levels.
Carried out by a team from St. Michael’s Hospital in Toronto, the researchers looked at 2,745 children ages two to six years.
The team surveyed parents on milk consumption, measured the children’s height and weight to calculate Body Mass Index (BMI), and took blood samples to assess vitamin D levels.

The researchers found that the children who drank whole milk had a Body Mass Index score 0.72 units lower than those who drank 1 or 2 per cent milk, comparable to the difference between having a healthy weight and being overweight, commented the study’s lead author Dr. Jonathon Maguire.

In addition, children who drank one cup of whole milk each day had higher vitamin D levels, comparable to those who drank nearly 3 cups of one percent milk.
The team suggested that the higher vitamin D levels could be explained by the vitamin being fat soluble. As it dissolves in fat rather than water, milk with a higher fat content therefore contains more vitamin D.
There could be an inverse relationship between body fat and vitamin D stores, and as children’s body fat increases, their vitamin D levels decrease.

And of course, the Canadian government is giving out wrong advice, just as all governments seem to do about everything.

Current guidelines from Health Canada, National Institutes of Health and American Academy of Pediatrics go against the findings from the study, recommending two servings of low fat (one percent or two percent) milk for children over the age of two to reduce the risk of childhood obesity, with Dr. Maguire commenting that the new research indicates a need to look again at existing nutritional guidelines.
Childhood obesity has tripled in North America in the past 30 years while consumption of whole milk has halved over the same period.

I must say I was taken aback recently when I noticed that the small milk bottles offered at McDonald’s and served mostly to kids contain one percent milk. Fat is healthy for kids and important for brain development. Serving kids low-fat or skim milk is just so sad.

Academia Rewards Bad Science

Rewards in academia–promotions, pay increases–are tied to publishing in prestigious academic journals. Publishing in those journals, however, is more likely if the results of the study appear to be novel or surprising. A bias exists, therefore, for academics to produce results that are surprising, even if wrong. Indeed, most results that are surprising are also wrong. The problem is so bad that some people have speculated that everything published is false. Now researchers have done a computer simulation suggesting that published research will tend to be false.

To draw attention to the way good scientists are pressured into publishing bad science (read: sensational and surprising results), researchers in the US developed a computer model to simulate what happens when scientists compete for academic prestige and jobs.
In the model, devised by researchers at the University of California, Merced, all the simulated lab groups they put in these scenarios were honest – they didn’t intentionally cheat or fudge results.

But they received greater rewards if they published ‘novel’ findings – as happens in the real world. They also had to expend greater effort to be rigorous in their methods – which would improve the quality of their research, but lower their academic output.

“The result: Over time, effort decreased to its minimum value, and the rate of false discoveries skyrocketed,” lead researcher Paul Smaldino explains in The Conversation.

“As long as the incentives are in place that reward publishing novel, surprising results, often and in high-visibility journals above other, more nuanced aspects of science, shoddy practices that maximise one’s ability to do so will run rampant,” Smaldino told Hannah Devlin at The Guardian.
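
The press description is abstract, so here is a stripped-down toy version of that kind of selection dynamic. To be clear, this is my own illustration with invented parameters, not the Merced team’s actual model; it just shows how rewarding output over rigor, plus imitation of the most productive labs, can drag effort down and push false discoveries up.

```python
# Toy "natural selection of bad science" dynamic: labs that spend less effort on rigor
# produce more publications, the most-published labs get copied, and average rigor drifts
# down while the false-discovery share climbs. All parameters are invented.
import random

random.seed(0)
NUM_LABS, GENERATIONS = 100, 200
labs = [{"effort": random.uniform(0.1, 1.0)} for _ in range(NUM_LABS)]

def run_generation(labs):
    scored = []
    for lab in labs:
        effort = lab["effort"]
        studies = max(1, int(10 * (1 - effort)) + 1)            # less rigor -> more studies attempted
        false_pos_rate = 0.05 + 0.4 * (1 - effort)              # less rigor -> more false positives
        publications = sum(1 for _ in range(studies) if random.random() < 0.5)
        false_pubs = sum(1 for _ in range(publications) if random.random() < false_pos_rate)
        scored.append((publications, false_pubs, lab))
    # Selection: the most-published half of the labs survive and are copied with small mutation.
    scored.sort(key=lambda t: t[0], reverse=True)
    survivors = [lab for _, _, lab in scored[: NUM_LABS // 2]]
    offspring = [{"effort": min(1.0, max(0.05, lab["effort"] + random.gauss(0, 0.02)))}
                 for lab in survivors]
    total_pubs = sum(p for p, _, _ in scored)
    total_false = sum(f for _, f, _ in scored)
    return survivors + offspring, total_false / max(1, total_pubs)

for _ in range(GENERATIONS):
    labs, false_share = run_generation(labs)

avg_effort = sum(lab["effort"] for lab in labs) / len(labs)
print(f"Average effort after {GENERATIONS} generations: {avg_effort:.2f}")   # drifts toward the floor
print(f"False-discovery share in the final generation:  {false_share:.2f}")  # climbs as effort falls
```

Even though every lab in this toy world behaves honestly, the incentive structure alone is enough to erode rigor, which is essentially the point Smaldino is making.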

Note carefully that the model predicted science would turn out bad even though the researchers were ‘good’ and did not commit fraud by fabricating data or results. But in the real world, fraud does happen. For instance, Yoshitaka Fujii, a researcher in anesthesiology, was found to have fabricated results in at least 183 papers. Taking into account the possibility of fraud suggests that science in the real world might be even more corrupted than the model predicts.

Bear in mind, however, that the computer simulation is itself an academic study. But the simulation says that academic studies are false, which implies the simulation is false, and so academic studies are not false, which implies the simulation is not false, but the simulation says that academic studies are false…in an infinite loop. This is an example of the so-called Liar Paradox.

If “this sentence is false” is true, then the sentence is false; but if it is false, then what it says is the case, so it must be true; and so on.

But leaving aside this problem of logic, my own experience in academia tells me that a lot of published research is in fact bad science.