“Sweet and dour” (Artificial sweetener hysteria)

Artificial sweeteners have been the subject of mass hysteria for decades. In the 1970s, studies fueled worries about the possible carcinogenic nature of saccharin. However, this research involved rats being force-fed the synthetic compound at a rate that would have been like a person drinking 100 diet sodas a day for years.

In the early 1990s, the Internet’s first widespread smear campaign listed every malady in the history of Mankind as being the result of aspartame, which raised the question of why humans hadn’t been immortal prior to the artificial sweetener’s creation.

This year, there was an alarmist report about diet soda being responsible for Alzheimer’s, cancer, dementia, and the Smog Monster. This freak-out was based on a horrible misinterpretation of the study, a misreading now repeating itself in yet another fabricated fizzy fear. This latest scare is that artificial sweeteners wreak havoc with one’s gut microbiome.

The human gastrointestinal tract is amazingly complex and is composed of multitudinous organisms that can either help or hinder digestion. These organisms can have a substantial impact on our health, either good or bad. Because of the microbiome’s key role in human wellbeing, research is constantly being done on it.

That includes a study which some media have given plenty of panicky play to. In this experiment, scientists poured artificial sweeteners on bacterial cells. At very high concentrations, most of the bacteria began to act stressed, and researchers deduced that artificial additives were the culprit. This was translated in the press as sweeteners being detrimental to human health.

This was an unfounded conclusion. For starters, the research considered only a few strains of E. coli, which are among the millions of different types of bacteria that have taken up residence in our gastrointestinal tracts. Further, the stressed reaction only occurred when E. coli were subjected to extremely elevated dosages. The bacteria started showing agitation after exposure to four grams per liter of aspartame. The human equivalent of this would be chugging two gallons of Mountain Dew in 15 minutes. Incidentally, I’d be mighty riled myself if strangers kept dousing me with sticky liquids.

Also, reactions from one type of organism seldom translate into the same experience for another type. Epidemiologist and skeptic blogger Gid M-K wrote, “Exposing cells to artificial sweeteners in a lab is very different to a person drinking diet soft drinks.”

Indeed, a 2016 systematic review of studies concluded there is little evidence of a substantial health detriment or benefit to ingesting moderate amounts of artificial sweetener.

This is much shorter than most of my entries, but I’ve got to prep a Thanksgiving meal, one that will safely include some Diet Cherry Dr Pepper.

“Taking baby from a candy” (Halloween hysteria)


It’s time for ghosts, goblins, and gremlins, but if seeking still further imaginary fright, consider some Halloween urban legends and hysterics.

First we have the time-dishonored account of poisoned candy. Even before the Internet and 24-7 news channels, these dark tales made the rounds and almost everyone knew about them. Ann Landers, notorious for passing off urban legends as fact, helped spread the terror. I remember presenting my candy to my parents for inspection and losing one piece because it seemed to be an off-brand, although the neighbor’s long hair, being affixed to a 20-something male in 1974, may have been another factor.

Police departments and hospitals offered free X-rays of the confectioneries, with the uniformly negative results being inconsistent with the panic that the candy from strangers was causing. The munchies madness hit its peak in the mid-1990s when Halloween parties began being held in lieu of neighborhood knockings. Surprisingly, the transformation of the Internet from relative novelty to ubiquitous entity over the last quarter century has not pushed the alarm to greater heights and, in fact, it has tempered somewhat, though warnings still abound.

Fortunately, such advisories are based on fear, not facts. Only one person is confirmed to have died from poisoned Trick or Treat candy, and the victim was targeted by his father, who slipped his son cyanide-laced Pixy Stix. He did the same to his daughter and four of his children’s friends, hoping to make the event seem random, but none of the four other intended victims ate the tainted treat.

While the granular candy was not passed out by a neighbor answering the door, the murderer was trying to take advantage of the myth that such occurrences had been happening nationwide for decades.  Ironically, his attempt to emulate an urban legend helped to spread it. But this was not a case of a homeowner handing out lethal candies. The killer did not pass them out at his door, nor were the Pixy Stix put in his son’s plastic Jack-o-Lantern by someone else. 

There are still periodic reports of children dying after ingesting laced confectioneries, but investigations have always concluded there was another cause. It should be noted that razors, pins, and needles have been found in Halloween candy multiple times, so caution against this occurrence is justified.

But as to toxic treats, the Los Angeles Times quoted Cal State sociology professor Joel Best, who said, “We checked major newspapers from throughout the country from 1958 through 1988, assuming that any story this horrible would certainly be well reported,” and his research group found no cases of intentional, random Halloween candy contamination.

With that, we move from poison to pedophiles to satisfy our All Hallows Eve hysteria.  

According to Reason’s Lenore Skenazy, each October the website Patch publishes maps that show where registered sex offenders live. This is publicly available, accurate information, but the insinuation is that residents of these homes pose a specific danger on Halloween. However, those persons are prohibited from handing out candy on this date, and a study of 67,000 child molestation cases showed no increase in such incidents on Oct. 31. Besides, the great majority of crimes against children are not committed by strangers; an infinitesimal number are committed by a stranger on Halloween; and probably zero have been committed by a stranger passing out Halloween candy on his doorstep. Parents should protect their children, and they do so by accompanying them as they seek sugary snacks on these nocturnal excursions.

Patch defends its actions because a Wisconsin child was raped and murdered on Halloween in 1973. But such tragedies can occur on any date and this one taking place on Halloween was coincidental. Besides, almost all persons convicted of raping and murdering a child are still in prison or have been executed. The killer in this case had not struck before and would not have been on any registry had those existed at the time.  

While such a person should be locked away for life, it is a myth that all sexual offenders are incapable of redemption. In fact, sex criminals have the second-lowest recidivism rate, after murderers. Also, public urination, soliciting a prostitute, and teen sex all land people on the list that Patch distributes.

It promotes this as a public good, so I will counter with my own such public service map. Any red dots below indicate cases where persons already on sex offender registries attacked another child who was Trick-or-Treating:


[U.S. map]




“Milking it” (Baby formula fears)


When my children were born, the biggest decision for me was figuring out which stuffed animal to buy for the crib. But their mother, already suffering through mental and physical anguish, had to decide whether to feed by breast, bottle, or both. The pressure to choose the first can be substantial, based on the notion it is always best. Going this route means further taxing exhausted new mothers since, despite their bundle-of-joy status, newborns need to be nursed about 10 times a day. These feedings may happen at 3 a.m. or 11 p.m., and while mothers can sleep while fathers handle formula feedings, only the maternal antecedent can perform nursing duties.

The reason well-meaning folks laud breastfeeding is that infants so nourished show lower lifetime rates of asthma, cancer, and diabetes, as well as fewer instances of infancy infections and mortality.

Similarly, Emily Oster at 538 cited a study of 345 Scandinavians which compared IQ scores for children who had been breastfed for less than three months with those who had been breastfed for more than six months. The authors found that the children who nursed for longer had higher IQ scores.

But, as always, we must consider correlation and causation. In the developed world, women who breastfeed tend to be nonsmokers, educated, and affluent, and to have better access to quality health care. Mothers with those distinctions who choose formula see no more health problems in their offspring than those who breastfeed.

In the Scandinavian study, breastfeeding mothers were wealthier, better educated, and had higher IQ scores than those in the other group. Once researchers accounted for these variables, the seeming advantage of nursing evaporated.

Now let’s consider mothers in the developing world. There, breast milk substitutes are often prepared without clean water and in unsanitary conditions. Health issues for their newborns arise because of the environment and what the formula was mixed with, not the formula itself.

Because breastfeeding is wrongly presumed to always be best, mothers can be guilted into acquiescing, and this can lead to further problems. Science writer Kavin Senapathy noted there is occasionally an issue with a mother’s breastmilk supply immediately after birth, especially for first-time moms. According to Senapathy, about 15 percent of mothers are incapable of producing enough milk, so if they rely entirely on this source, their baby may suffer dehydration, high blood pressure, hypoglycemia, and excess sodium in the blood.

Senapathy cited Hannah Awadzi, a Ghanaian whose daughter experienced jaundice and hypoglycemia while Awadzi exclusively breastfed her despite inadequate milk supplies. This led to the daughter’s cerebral palsy. Yet Awadzi’s only other option had been formula mixed with deplorable-quality water. Awadzi had no decent alternative, but for mothers who do have a good option, formula would be the way to go in cases like this.

To see if a perceptible difference results from breastfeeding and formula use, we can look at studies in which breastfeeding is assigned randomly to subjects, or ones where adjustments are made for differences among women being tested.

One example comes from Belarus, where women were randomized into two groups. For those in the first group, breastfeeding was encouraged; in the second group, it was not. Infants in the breastfed group had fewer gastrointestinal infections and were less likely to experience eczema. However, there were no significant differences in any other studied outcomes, such as respiratory ailments, ear infections, croup, wheezing, infant mortality, allergies, asthma, cavities, height, blood pressure, obesity, and mental issues.

Another study, published in Social Science & Medicine, compared breastfed children with their siblings who had been given formula. In the health and behavior outcomes that were examined, researchers detected no differences. This is crucial because siblings are on equal ground with regard to their environment and their mother’s parenting style, wealth, education, and health. And if breastfeeding made the difference that proponents claim, there would be universal pronounced detriments among those who were adopted at birth.

There are advantages to breastfeeding, including cost, convenience, and bonding. But babies being nursed won’t enjoy health benefits over those given formula, and they will sleep just as well next to whatever stuffed animal Daddy has chosen.


“Canned response” (Diet soda study)


This past month, my Facebook feed has seen a frenzy focused on fizzy drinks. The panicky parade of posts tied diet soda to dementia, stroke, Alzheimer’s, and wearing linen out of season.

The alarm centered on one study, which was read by Dr. Harriet Hall, but likely not perused by those who posted the terrified screeds. By reading it, Hall learned that there were 2,888 subjects who were followed for 10 years, and that 97 of them had a stroke, while 81 of them developed dementia, most of those displaying symptoms consistent with Alzheimer’s. The authors relied on a survey of test subjects to determine how much diet soda each person consumed. All this led to headlines which warned that drinking one or more artificially sweetened soft drinks per day was associated with a tripling of strokes, dementia, and Alzheimer’s.

While worried posters presumed correlation meant causation, even the lead researcher cautioned against reaching that conclusion. That man, Matthew Pase, said, “It is not clear whether the diet sodas are causing stroke and dementia or whether unhealthy people gravitate more towards these drinks than healthier people.”

Addressing these and other shortcomings in Fortune, Sy Mukherjee observed that an accurate description would be, “Study determines minor observational link, but no direct cause-and-effect, between certain people who drink artificial sugar beverages, but it has a small sample size that doesn’t include minorities or account for a whole bunch of other critical factors.” But there’s little click-bait or alarmist value in that.

Meanwhile, physician Aaron Carroll highlighted several reasons why you should take this study with a grain of, in this case, sugar. Carroll pointed out that while the conclusions were based on results from one model that adjusted for demographics, diet, physical activity, and smoking, another model which adjusted for still more potentially mitigating factors produced less-pronounced results. Yet this model received scant attention.  

Other deficiencies of the study: Various artificial sweeteners have different molecular makeups, meaning they likely have different impacts on the body; the study focused only on diet soda consumption and did not take into account the subjects ingesting artificial sweeteners through other food and drink; and it highlighted relative risk rather than absolute risk, as it declared that subjects were thrice as likely to have a stroke or dementia, instead of pointing out that the percentage of all subjects who had these conditions was one half of one percent. That would fit into what would be expected of the general population.
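The difference between relative and absolute risk is easy to see with a little arithmetic. Below is a minimal sketch, using hypothetical numbers rather than the study’s actual figures, of how a frightening relative risk can coexist with a negligible absolute one:

```python
def risks(events_exposed, n_exposed, events_control, n_control):
    """Return (absolute risk in exposed group, absolute risk in
    control group, relative risk) from raw event counts."""
    ar_exposed = events_exposed / n_exposed
    ar_control = events_control / n_control
    return ar_exposed, ar_control, ar_exposed / ar_control

# Hypothetical cohort: 3 events per 1,000 daily diet-soda drinkers
# versus 1 event per 1,000 abstainers.
ar_e, ar_c, rr = risks(3, 1000, 1, 1000)
print(f"Relative risk: {rr:.0f}x")                  # the headline number
print(f"Absolute risk: {ar_e:.1%} vs {ar_c:.1%}")   # both well under 1%
```

A “tripled” risk sounds dire, but in this hypothetical it means moving from 0.1 percent to 0.3 percent, which is exactly the distinction the headlines glossed over.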

The response to this study was the latest in a long string of doom-and-gloom pronouncements about artificial sweeteners that has been going on since at least the mid-1970s. Then, the worry was over saccharin causing bladder cancer in rats. While this was true if the rats were force-fed large doses of saccharin, the cancer developed through a mechanism that humans lack. So the danger was for rodents, and only if they got into your Diet Dr Pepper and finished off a case in short order. Had there been any truth to the rumor, bladder cancer rates would have plummeted once other sweeteners supplanted saccharin.

In the war on cola, aspartame has been especially slandered. Critics have accused it of causing seizures, Alzheimer’s, arthritis, cancer, diabetes, multiple sclerosis, birth defects, tinnitus, migraines, emphysema, and (fill in the blank with your own malady). Yet, as Hall noted, aspartame is the most frequently evaluated food additive, and meta-analyses of published studies show it is safe for anyone without the genetic disorder phenylketonuria. But those studies don’t get the headlines, and the headlines that do get published usually misrepresent what studies say about carbonated refreshment.

It’s not that there could never, under any circumstance, be any harm in diet soda consumption. We should always be open to new evidence gained through double-blind studies and follow where the science leads. But there is certainly harm in spreading fear through anecdotes, flawed studies, and misinterpretation of good data.


“Plastic Oh No Banned” (Bottled water hysteria)


While residents of Sub-Saharan Africa and Flint, Mich., strive for access to clean drinking water, some in the West are more concerned with the containers which house this liquid. While this could be seen as a First World problem, our focus here is not on affluent privilege but on how factual such worries are. We will examine if plastic water bottles release unsafe levels of chemicals when they are heated, cooled, or reused.

There have been a myriad of such claims on the Internet for at least 15 years, and they often contain a nugget of truth but leave out key facts while leaping to unfounded conclusions. The nuggets include a factoid about heat often releasing chemicals, usually identified in forwarded chain mail as dioxins. This is sometimes accompanied by calls that the bottles be banned. Some plastics do contain levels of chemicals that could be dangerous if released or which seep into containers when heated.

But plastic water bottles are usually made from polyethylene terephthalate (PET), which does not have those qualities. Water can be safely stored in them, whereas gasoline probably could not be and carbolic acid certainly couldn’t be. That’s why manufacturers of containers for water, gasoline, and carbolic acid all utilize different types of plastics for their products. Because they are made from polyethylene terephthalate, water bottles will not become dangerous due to heating, freezing, or reuse.

If concerned about safety, remember to use products for their intended purpose. There are multitudinous plastics, and what each is used for depends on its characteristics. The American Chemistry Council has gone on record that there is no science to support the claim that PET bottles will release dioxins when frozen or heated. It stated, “Dioxins can only be formed at temperatures well above 700 degrees…and there is no scientific basis for expecting dioxins to be present in plastic food or beverage containers.” So unless you live on Venus, don’t worry about leaving bottled water in your car on a summer day.

Another claim is that the plastics additive diethylhydroxylamine may seep into whatever liquid a plastic container is carrying. However, this additive is not used in plastic water bottles, nor is it created through the breakdown of such bottles. And even if it were, the agent has been approved by the FDA for food-contact applications.

Addressing these concerns, civil engineer and biologist Rolf Halden noted the irony of persons being more concerned about the containers than the product inside. “Many people do not feel comfortable drinking tap water, so they buy bottled water instead,” Dr. Halden said. “The truth is that city water is much more highly regulated and monitored for quality. Bottled water can legally contain many things we would not tolerate in municipal drinking water.”

He noted that safety can usually be assured by following warning labels and directions. For example, heat can cause some plastics to release chemicals, which is why there are cautions on certain drinking straws to refrain from using them with hot beverages.

“If you put that straw into a boiling cup of hot coffee, you have hot water extraction going on and chemicals in the straw are being extracted,” Halden explained.

Maybe that’s why I’ve never seen anyone drink coffee with a straw.  


“I Can’t Believe It’s Not Better” (Margarine fears)


These days, it seems that even the most trivial item can become the object of an unwarranted freakout. This includes how we make our English muffins tastier, for a diatribe against margarine has made its way around the Internet. In addressing the faux yellow condiment, the message gets a few items right, but it mostly contains whoppers and misinformation.

It starts with the assertion that margarine was invented as a means to fatten turkeys, but that the concocted food caused the birds to die en masse. Hoping to recoup some of the money lost from the stricken livestock, the farmers added food coloring to the white substance and passed it off as butter to the unsuspecting masses.

In truth, margarine has nothing to do with turkey, or Turkey for that matter, but with France. Napoleon III offered a prize to anyone who could produce a viable, affordable butter substitute that could be consumed by peasants and soldiers. The winner was a mix of beef fat, saltwater, milk, and margaric acid, which gave the nascent substance its name. Today’s margarine is normally composed of refined vegetable oil, water, and sometimes milk.

I have written before that there is enough amazing about science that there’s no reason to make up cool stuff. For instance, humans having landed a probe on a comet is more captivating to me than is pursuing proof that some unknown critters constructed a face on Mars. In the same way, there is enough genuine ghastly gastroenterological unpleasantness that there is no need to fabricate any.

For example, trans fat is legitimately a food boogeyman that increases the chance of Alzheimer’s, cancer, diabetes, liver disorders, and much more. It was prevalent in margarine for years and were that still the norm rather than the exception, the railing against margarine would be justified.

But the key issue is how much trans fat margarine (or any other food) contains. Avoiding all margarine because of the trans fat issue would be like going naked because one dislikes hats. Many brands, including I Can’t Believe It’s Not Butter, no longer contain trans fat, and that’s usually the case for margarine that comes in tubs or in liquid form.

Another assertion from the screed is that butter has been around for centuries, whereas margarine has been around for less than 100 years. The math is off on that, as margarine dates to the 1860s. But the more relevant point is that how long something has been around is unrelated to its other attributes. Trying to score this as a point for butter over margarine is to commit the appeal to tradition fallacy.

The bulk of the rant is a series of unsubstantiated claims that are unsupported by any documentation, evidence, or studies. The claims include: Margarine triples the risk of coronary disease, quintuples the risk of cancer, increases bad cholesterol while lowering good cholesterol, lowers the quality of breast milk, decreases immune and insulin response, increases the risk of heart disease in women by more than 50 percent, and that eating butter increases the absorption of nutrients from other foods.

The claims against margarine would only be true if the specific brand is high in trans fat, and again, that would be true of any food. The boast about butter melts like, well, butter, when examined. Harriet Hall at Science-Based Medicine wrote, “Where did this claim come from? I found no evidence to support it. Perhaps they were thinking about the fact that some vitamins are fat-soluble, but that would apply to margarine as well as to butter.”

Another baseless assertion is that margarine will not attract flies because it has no nutritional value. Any food, by nature, has nutritional value, and while I doubt there is any data on whether winged pests cotton to vegetable oil spreads, I see no evidence for the assertion that they don’t. Feel free to conduct your own experiment and let me know the results.

Like other good fearmongering pieces, this one contains a dose of chemophobia, this time in the form of a caps-friendly alarm: “Margarine is but ONE MOLECULE away from being PLASTIC and shares 27 ingredients with PAINT.”

First, as Hall noted, this is false. She wrote, “Plastics are polymers and completely unrelated to anything in margarine. Paint doesn’t contain any of the ingredients in margarine.”

But even if true, this would be pointless anyway. Any change, no matter how small, in the chemical makeup of a substance can alter its safety, impact, and use. One oxygen atom is all that separates water from hydrogen peroxide, but this would not be a sound reason to drink the latter while using the former to disinfect a scraped finger.

“The nuclear option” (Nuclear power fears)


In the rare times that the left and right are in agreement, it’s usually because both sides are getting something from the deal. But in the case of nuclear power, the objections from a mix of liberals and conservatives are ironically stifling an innovative, pro-environment, pro-business resource. That’s because nuclear power’s efficiency, safety, and low-carbon status are three strong reasons to adopt the technology.

Liberals who object are self-styled environmentalists who embrace the positions of the IPCC and IEA when it comes to climate change. Yet they reject nuclear power, which those organizations call one of the primary solutions to global warming.

Meanwhile on the right, objections seem to be based on oil and coal industry titans potentially seeing their salaries dip into the seven figures if nuclear power becomes too prevalent. So the best way to win over conservatives would be to point out how much money a real-life C. Montgomery Burns could make.

As to trying to convince those on the left, the key point is that all energy sources contain risks and that nuclear is among the least concerning. I find nuclear power akin to airplanes. They are both the safest method of doing what they do, but the failures are spectacular, widely publicized, and most remembered.

But there are more chilling dangers from air pollution and the burning of fossil fuels. According to the criminally underappreciated blogger Thoughtscapism, even wind causes more deaths per kilowatt-hour than nuclear power does. She also cites climate scientists James Hansen and P.A. Kharecha, whose paper on nuclear power concluded that the technology has saved two million lives by producing energy that had previously come via coal.

According to evolutionary and environmental blogger J.M. Korhonen, even when the full lifecycle is considered (uranium mining, accidents, and waste spillage), nuclear energy is still one of the safest energy sources. She also wrote that, when compared to sources that require burning, energy produced from nuclear power is responsible for much less harm to people and the environment. The same conclusion was reached by the EU-funded External Costs of Energy study.

Additionally, Friends of the Earth commissioned an independent research review that deduced, “The overall safety risks associated with nuclear power appear to be more in line with lifecycle impacts from renewable energy technologies, and significantly lower than for coal and natural gas.”

OK, so nuclear power is efficient and the risk of uranium mining is the same as unearthing similar minerals used in renewables, but what about the notorious accidents at Three Mile Island, Chernobyl, and Fukushima? These get the headlines, any loss of life is tragic, and environmental damage is always disconcerting. Yet in more than 50 years, just 75 persons have died directly or indirectly as the result of nuclear power accidents, all but a handful of these at Chernobyl. This is far fewer than from coal, according to an assessment conducted by the University of Stuttgart. The study concluded that the 300 largest coal plants in Europe cause 22,000 deaths per year.
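To put the paragraph’s own figures side by side, here is a back-of-envelope comparison. It uses only the numbers cited above, ignores differences in total energy generated by each source, and is illustrative rather than a proper per-kilowatt-hour safety metric:

```python
# Figures cited above: ~75 deaths from nuclear power accidents over
# roughly 50 years, versus an estimated 22,000 deaths per year from
# Europe's 300 largest coal plants (University of Stuttgart study).
nuclear_deaths = 75
nuclear_years = 50
coal_deaths_per_year = 22_000

nuclear_per_year = nuclear_deaths / nuclear_years   # 1.5 deaths/year
ratio = coal_deaths_per_year / nuclear_per_year

print(f"Nuclear accidents:    ~{nuclear_per_year} deaths per year")
print(f"Coal (300 EU plants): ~{coal_deaths_per_year:,} deaths per year")
print(f"Ratio: roughly {ratio:,.0f} to 1")
```

Even granting generous error bars on both counts, the gap is several orders of magnitude, which is the point the headlines about nuclear accidents routinely miss.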

Beyond safety advantages, another plus of nuclear power is reduced carbon output. For example, the lowest emissions among European countries occur in those nations that use the most nuclear and hydroelectric power. Moreover, the Intergovernmental Panel on Climate Change and the International Energy Agency both hold the position that no single solution will bring sufficient reduction in Earth’s net carbon output. Nuclear power is needed to help make that happen.

Fossil fuel use is still rising, and the IPCC estimates that reliance on the fuels needs to be reduced 40 percent and replaced with nuclear power to achieve a sizable reduction in carbon output by 2030. Meanwhile, the IEA holds that nuclear use must double over the next three decades if humanity is to halt Earth’s rise in average global temperature. We also need bioenergy, wind power, hydroelectricity, reforestation, solar radiation management, lifestyle changes, and other strategies, but we are losing a valuable resource by failing to embrace nuclear power.

“Cuban Whistle Crisis” (Sickened diplomats)


Cuba and the U.S. have a long history of antagonizing one another. Eisenhower targeted Castro with coup attempts and following the Bay of Pigs and Cuban Missile Crisis, these morphed into assassination efforts. The CIA went at Castro with such frequency that there was no questioning the agency’s intent, though there were doubts about its efficiency.

A combination of James Bond and the Keystone Cops, CIA assassination attempts employed exploding cigars, explosive-laden seashells, a diving suit coated with deadly fungus, and a poison pen. After repeated failures, the agency was reduced to trying to humiliate Castro by making his beard fall out, and it failed to manage even this clean-shaven caper.

Castro lasted through 10 U.S. presidents and survived a largely ineffective embargo that included prohibitions on Americans traveling to Cuba. Then there were the trips in the other direction. The most well-known resulted in the Elián González saga, during which right-wingers developed a sudden concern for the residency rights of undocumented immigrants.

Toward the end of the Obama administration, U.S.-Cuba relations thawed, the countries resumed diplomatic ties, and the travel ban was largely rescinded. The freeze soon resumed, however, as President Trump put most of the travel restrictions back in place. There was also a mysterious mass sickening of U.S. State Department employees at the embassy in Havana. Whether there was a connection between these two events is the focus of this post.

There were suggestions that the illnesses resulted from Cuba deploying a sonic weapon. While not specifying what type of attack, White House Chief of Staff John Kelly used that word and blamed it for the sickness surge. The State Department’s website reads, “Over the past several months, numerous U.S. Embassy Havana employees have been targeted in specific attacks.” Consequently, the department recalled nonessential personnel and expelled 15 Cuban diplomats.

There have been 22 confirmed illnesses, but a secret sonic weapon would be an unlikely cause. More likely culprits would include toxins, bacteria, or viruses, as certain strains of all these can damage hearing, which is among the reported symptoms, along with tinnitus, headaches, and dizziness.

The Guardian entertained another possibility. Reporters interviewed neurologists who said that a definitive diagnosis is impossible without access to the stricken diplomats, but that perhaps a “functional disorder” could be affecting nervous system functioning. The newspaper quoted neurologist Mark Hallett, who said it was possible for 22 persons to be impacted by the same disorder, especially when they work close together in a high-stress environment.

Meanwhile, the AP obtained audio tapes of high-pitched whistles, which some workers said they heard through cellphones or from their computers. Yet the recording reveals nothing about the source, its potentially deleterious effect on the human body, or its relationship to the sicknesses. Of relevance, the report noted that not all sickened Americans heard the strange sounds. And acoustics experts have said that it is highly unlikely that the range of symptoms reported could have been caused by any kind of sonic weapon. They said they were unaware of any sound that can cause physical damage when played for a short duration at moderate levels through normal workplace equipment like a cellphone or computer. This works against the idea that an auditory assault is causing issues related to hearing, cognition, vision, vertigo, and sleep.

It’s true that the Navy uses long-range acoustic blasts to target terrorists and pirates, that the Army uses them at checkpoints, and that police employ them to disperse crowds. But these weapons work because of their high volume and cacophony. If such a device were targeting U.S. diplomats in Cuba, there would be no mystery about it. It would be loud and proud. Those intent on finding a Havana connection have speculated the answer may lie in a sinister device producing sounds beyond the human hearing threshold.

One possibility would be infrasound, sound at frequencies too low for humans to hear. It can cause feelings of unease, and many times when persons have reported sensing a ghostly presence, infrasound has proven to be the culprit. Ultrasound is another possibility. At the other end of the spectrum from infrasound, ultrasound is too high-pitched to be heard by people, but it can still cause damage. However, even if Cuba succeeded in developing a secret sonic weapon, the laws of physics would make it unlikely that the device could harm victims from a great distance. Ultrasound has limited range, gets weaker as it travels, and would be further hampered in a humid climate. Moreover, a beam of ultrasound would probably be repelled by a building’s exterior.
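The distance objection can be made concrete with a rough calculation. Here is a minimal sketch, assuming a hypothetical 130 dB ultrasonic source and an atmospheric absorption of about 1 dB per meter; that figure is an illustrative order of magnitude, not a measured value, since real absorption varies with frequency and humidity:

```python
import math

def spl_at_distance(source_spl_db, r_meters, absorption_db_per_m):
    """Estimate the sound pressure level at distance r by combining
    inverse-square (spherical) spreading loss with atmospheric
    absorption. Source level is referenced to a distance of 1 meter."""
    spreading_loss = 20 * math.log10(r_meters)      # dB lost to geometry
    absorption_loss = absorption_db_per_m * r_meters  # dB lost to the air
    return source_spl_db - spreading_loss - absorption_loss

# Assumed values for illustration: a very loud 130 dB ultrasonic
# source and ~1 dB/m of air absorption.
for r in (1, 10, 30, 100):
    print(f"{r:>4} m: {spl_at_distance(130, r, 1.0):6.1f} dB")
```

Even under these generous assumptions, the level collapses with distance: by 100 meters the combined spreading and absorption losses exceed the source level entirely, which is why a device harming people across a street or from a neighboring building strains credibility.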

An ultrasound-emitting device planted inside a building might be close and powerful enough to cause harm to occupants, but it is unlikely that an army of these emitters could be installed without being detected. And even if this happened, it still wouldn’t explain most of the symptoms U.S. diplomats are reporting.

So in summary, the idea of sonic weapons being responsible is about as likely as Castro’s 2016 death being the result of the CIA finally succeeding.


“Err supply” (Food control)


One tenet of the anti-GMO, selectively anti-corporate crowd is that evil, powerful groups are controlling the world’s food supply. I’m generally not much on conspiratorial thinking, but this time, the accusation is correct.

But it comes with a substantial caveat: those making the accusation and those committing the act are one and the same. For it is anti-GMO activists who are corruptly manipulating the food marketplace. It is not being done, as they claim, by food technology companies through patents and seed ownership. Rather, anti-GMO activists artificially constrain GMOs through a three-pronged approach: regulatory control, threats against corporations, and pressure on food importers.

The result is that only 10 crops have ever been approved for genetic modification even though the technique can reduce the chance of a crop being afflicted by drought, disease, or pests. Anti-GMO victories have included preventing the distribution of Vitamin A-rich Golden Rice to Third World countries, which would prevent some instances of childhood blindness.

When anti-GMO forces have failed and farmers have been given the chance to grow biotech crops, they embrace them. Genetic modification allows for the development of traits that provide economic benefit, make for sturdier crops, and carry less risk. But only a small percentage of the world’s fruit, vegetable, and grain producers enjoy this biotechnology option.

One of the more prominent successes of anti-GMO forces was the politically driven decision by several European nations to disallow the cultivation of biotech crops in all or parts of their countries. A related win was the required labeling of genetically modified foods. Most companies avoided such products since the labels invite harassment from activists.

These activists frequently employ the ad populum fallacy, considering the number of countries that have banned the cultivation of genetically modified foods to be evidence of their nefarious nature. But nearly 2,000 studies attest to GMO safety, meaning the restrictions are based on fear and threats, not science and reason. Just how much of a problem this can be was highlighted in a 2014 Guardian article. From the story:

“More than 20 of the most eminent botanists and ecologists in the world warn that it is time to put fears of genetic modification aside and begin widespread field trials. They call for a ‘fundamental revision of GM regulation’ which, they claim, is based not on science, but on politics. Professor Jonathan Jones says British scientists are creating world-changing crops, but they are being blocked by Europe. Jones has developed a blight resistant potato which would avoid the need for farmers to spray crops 15 times a year. Blight is the number one threat to the six million tons of potatoes produced in Britain each year and was responsible for the Irish Famine of the 1840s. But European approval is needed for commercial cultivation and so far the Council of Ministers has vetoed every application.”

This entrenched opposition has extended to other continents. African farmers are denied access to genetically engineered seeds that would improve resistance to insects and drought, and which would make the food they produce hardier, brighter, better tasting, and less susceptible to failure.

Beyond legislation, a second strategy is to threaten corporations with demonization. An insect-resistant potato was developed in 1996 and agricultural scientist Steve Savage reported that he “interviewed many potato growers in the first few years the trait was available and they were extremely happy to have a solution to their most damaging insect pest.”

But after anti-GMO activists threatened McDonald’s and Frito-Lay with boycotts, protests, and ad campaigns if they used this scientific advancement in their products, the companies caved and announced they would not be buying the crop. No small potatoes indeed, as with the two biggest potential customers backing out, the idea fizzled.

This tactic has hobbled other crop developments as well. Savage wrote, “I am aware of projects that have been started or were planned for bananas, coffee, grapes, tomatoes, lettuce, strawberries and apples,” but these were also torpedoed by activists who relied on threats, not data.

The final strategy is to threaten importers from countries which mandate GMO labeling. Savage explains how this derailed a herbicide-resistant wheat strain. “Once again, I had the opportunity to interview many wheat growers to assess their interest in these options,” he wrote. “Most already had positive experiences growing biotech soy, corn or Canola, and they were keen to try the new wheat options. They never got that chance. Major wheat importers from Europe threatened to boycott all North American wheat if any commercial biotech varieties were planted in the U.S. or Canada.”

European bread and pasta producers shied away from having to label their food because they knew this would subject them to activist pressure, so they declined to let the wheat in. The decision was based not on safety or supply and demand, but on the activists’ ability to create marketing issues for food companies that import.  

The activists have yet to get mandatory labeling in the United States. The pro-GMO camp continues to fight this, in part because “If they’re safe, why not label them?” will become, “If they are safe, why are they labeled?” 






“Wishing Welles” (War of the Worlds broadcast)


Though not my intent, I have riled a few people with my posts and comments about topics related to the skeptic movement. Some folks care little for probing questions about their great passions, be they psychic powers, ghost hunters, cryptozoological critters, or cancer-conquering baking soda.

But we should consider sound evidence even when it contradicts a cherished belief, and I strive to be consistent with this. When presented with enough proof, I have discarded ideas that I loved.

For example, while I don’t believe aliens have visited Earth, I once believed that people thought this was happening, and I found the story fascinating. Specifically, they were convinced invading spaceships were ravaging the eastern seaboard the night of Oct. 30, 1938. On that date, Mercury Theater on the Air broadcast Orson Welles’ radio adaptation of his near-namesake’s novel, H.G. Wells’ The War of the Worlds.

Producer and director Welles tasked script writer Howard Koch to frame the play as a radio broadcast featuring a series of breaking news events that interrupted mellow jazz. Each interjection became more disconcerting until finally Martian laser weapons were zapping and frying farmers, state troopers, and newscasters.

Some persons mistook the fictitious newscast for a factual one, with subsequent newspaper reports portraying this as the country losing its collective mind. Sociologists have posited that the tale has endured because it speaks to the power of unrestrained media and has a vague Big Brother feel to it, as an unseen, baritone voice of authority deftly dupes the populace. A competing hypothesis is that modern listeners allow themselves to feel superior to Depression Era rubes who fell for such a preposterous notion. The latter hypothesis ascribes unjustified credulity to modern news consumers, who unquestioningly pass around Poes and Onion articles as authentic.

As for my love of the tale, it had nothing to do with contemplating the reach of powerful media or wanting to feel uppity. It was just a story that was at once intriguing and amusing. No one died, no long-term harm was done, and it was all encapsulated in a well-written, well-acted theater program presented in entertaining crescendo style. For a few months, I made listening to the broadcast a bedtime routine.

For those who tuned in from the beginning, it was obvious that the broadcast was of a dramatic production. But listeners coming across it later might have taken it as fact. On the following day in headlines, and in the following half-century in American folklore, there were tales of near-suicides, impromptu minutemen armies, terrified citizens fleeing to churches and hilltops, and roads and phone lines being jammed.

But while there was panic, little of it centered on an early version of Space Invaders. Rather, the panic came in the form of all caps banner headlines, lawsuits against CBS, Congressional hearings, and calls to tighten broadcast regulations.

Subsequent research has shown that overreaction to the broadcast was localized instead of nationwide, and often came in the form of measured concern rather than full-blown anxiety. The freak-out was likely limited to parts of New York and New Jersey, with the exception of Concrete, Wash. There were persons outside those areas who were taken in by the broadcast and who telephoned relatives in the east, but these responses stopped short of panic.

The idea that millions were pouring into the streets to escape or confront the aliens is vastly different from reality. Four days after it ran a sensational report alleging this, the Washington Post ran a letter from a man who had walked down F Street during the broadcast and witnessed “nothing approximating mass hysteria. In many stores radios were going, yet I observed nothing whatsoever of the absurd supposed terror of the populace.” Then in 1954, Ben Gross, radio editor for the New York Daily News, wrote in his memoir that Gotham’s streets were “nearly deserted” that night.

One of the few instances of confirmed hysteria took place in Grover’s Mill, N.J., where the first alien cylinder was said to have landed. There, residents thought Martians had transformed the water tower into a war machine, so they turned their attention and rifles on it. Meanwhile, in Concrete, Wash., the broadcast reported that Martians were working their way west, destroying railroad tracks, highways, power grids, and communication centers in order to cripple the country. As this happened, a thunderstorm took out a power station and telephone lines. The sudden loss of electricity and phone service seemed consistent with the alien occupation report, so some residents took this as Concrete proof, so to speak, that their village had fallen prey to the invaders.

Despite just two verified cases of terrified throngs, headlines the next day blared, “RADIO FAKE SCARES NATION” and “FAKE RADIO WAR STIRS TERROR THROUGH U.S.” Hitler, who some listeners thought was responsible for the apparent invasion, called the alleged panic “evidence of the decadence and corrupt condition of democracy.”

But while there were tiny pockets of people who took the broadcast as truth, the reactions of editors and genocidal dictators were greatly unwarranted. There were other radio stations to choose from and there were plenty of activities to occupy one’s time besides radio. The most popular show in the time slot, by far, was the Chase and Sanborn Hour, hosted by Edgar Bergen and his wooden sidekick, Charlie McCarthy. I’m not sure I understand the appeal of a radio ventriloquist, but let’s stay on topic.

The strongest evidence for how overblown the supposed size of the panic was comes from a poll of 5,000 households taken by the C.E. Hooper Ratings Service. In the telephone survey, two percent responded that they were listening to the Mercury Theater on the Air production. Of that two percent, none answered that they were listening to news reports of an alien invasion. True, some who took the play to be a newscast were fleeing from or seeking out the invaders and therefore would not have been home to answer the call. Also true is that some of those who thought there were interplanetary interlopers were basing this on third-hand accounts and not the broadcast.

Still, reports of a country teetering on the brink are inconsistent with an estimated audience of 2.6 million in a nation 50 times that size. Moreover, those 2.6 million included many who listened from the beginning and were aware all along that the broadcast was a drama, not a doomsday. Some who tuned in late took it for the theatrical production it was, while others thought the attack was courtesy of the Nazis. The rest went with the Invaders From Mars conclusion. That number would have been in the hundreds, maybe thousands, certainly not millions.
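The arithmetic behind these figures is easy to check. A quick sketch using only the numbers quoted above (a 2.6 million audience, a population 50 times larger, and the Hooper poll’s 5,000-household sample):

```python
# Back-of-the-envelope check of the audience figures cited above.
audience = 2_600_000          # estimated War of the Worlds audience
population = 50 * audience    # "a nation 50 times that size"
share = audience / population # fraction of the country tuned in at all

print(f"population: {population:,}")     # about 130 million
print(f"listening share: {share:.0%}")   # 2%, matching the Hooper poll

# In Hooper's 5,000-household phone sample, a 2% share works out to
# roughly 100 households listening, and none of those reported hearing
# a real news bulletin about an invasion.
print(f"sample households listening: {5000 * audience // population}")
```

A two percent audience share, with not one sampled listener mistaking the play for news, is hard to square with headlines about a nation in flight.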

The main culprits for propping up the mostly-mythical panic were newspapers. Editor & Publisher encapsulated the industry’s combination of haughtiness and concern over dwindling profits by fuming, “The nation continues to face the danger of incomplete, misunderstood news from a medium which has yet to prove that it is competent to perform the job.”

While newspapers were still fumbling around with linotype machines and changing the ink on their printing presses, broadcasters were filing reports on the same story in real time, hours before newspapers could hit the streets. Unlike with the Internet decades later, there was no way for newspapers to embrace this neophyte technology and use it for themselves. The War of the Worlds broadcast presented publishers an opportunity to smear the radio medium as sensationalist, unprincipled, and unwholesome – ironically by displaying those same traits.

Wire service articles conspicuously lacked the names of persons said to have panicked. Subsequent investigations of reports about patients being admitted for shock at St. Michael’s Hospital in Newark, N.J., showed that this was untrue. American University communications professor Joseph Campbell has characterized the newspaper coverage as “almost entirely anecdotal and largely based on sketchy wire service roundups that emphasized breadth over in-depth detail.”

While the number of persons impacted was greatly exaggerated, so too was the nature of their reaction. There were reports of persons feeling “frightened, disturbed, or excited” by the show, but this fails to differentiate between those who thought it was news and those who knew it was a play. One could well be scared or thrilled by a radio drama about invading aliens without thinking it was real.

Some observers suspect that the Depression and the looming threat of global war left the relative few who did panic ripe for doing so. I disagree. People in the 1950s, with the war won and the economy booming, fell for a bogus broadcast about the Swiss spaghetti harvest. The Roaring 20s gave us the Cottingley Fairies hoax. People believe crazy stuff without seeking confirming evidence regardless of what economic circumstances or technological developments define the era.

Much as I loved the tale and wish it were true, evidence shows the panic was sparked not by a Martian militia, but by people’s alleged reaction to it.