The INVISIBLE RAINBOW
A History of Electricity and Life
Arthur Firstenberg
12
The Transformation of Diabetes
IN 1859, AT THE AGE OF TWELVE, the son of a lumber and grain merchant in Port Huron, Michigan, strung a telegraph line one mile long between his house and a friend’s, placing the two into electrical communication. From that day forward Thomas Alva Edison was intimate with the mysterious forces of electricity. He worked as an itinerant telegraph operator from the age of fifteen until he went into business for himself in Boston at age twenty-one, providing private-line telegraph service for Boston firms, stringing the wires from downtown offices, along the rooftops of houses and buildings, to factories and warehouses on the outskirts of the city. By the time he was twenty-nine, when he moved his laboratory to a small hamlet in New Jersey, he had made improvements to telegraph technology and was engaged in perfecting the newly invented telephone. The “Wizard of Menlo Park” became world famous in 1878 for his invention of the phonograph. He then set himself a much more ambitious task: he dreamed of lighting people’s homes with electricity, and displacing the hundred-fifty-million-dollar-a-year gas lighting industry. Before he was done, he had invented the electric light bulb, dynamos that generated electricity at constant voltage, and a system of distributing electricity in parallel circuits. In November 1882, he patented the three-wire distribution system that we all still use today.
At around that time, Edison developed a rare disease known as diabetes.1
Another young man, who grew up in Scotland, was teaching elocution at a school in Bath in 1866 when he hooked up a homemade telegraph system between his house and a neighbor’s. Five years later he found himself teaching the deaf to speak in Boston, where he was also a professor of elocution at Boston University. But he did not give up his lifelong affair with electricity. One of his deaf students, with whose family he boarded, glanced one day into his bedroom. “I found the floor, the chairs, the table, and even the dresser covered with wires, batteries, coils, cigar boxes, and an indescribable mass of miscellaneous equipment,” the man recalled many years later. “The overflow was already in the basement, and it wasn’t many months before he had expanded into the carriage house.” In 1876, after patenting a number of improvements to the telegraph, Alexander Graham Bell invented the telephone, achieving world renown before the age of thirty. His “endless health complaints”—severe headaches, insomnia, sciatic pain, shortness of breath, chest pains, irregular heartbeat, and an abnormal sensitivity to light—dated from his earliest experiments with electricity in Bath.
In 1915 he, too, was diagnosed with diabetes.2
To begin to get a sense of just how rare diabetes once was, I searched the antique books in my medical library. I first looked in the Works of Robert Whytt, a Scottish physician of the early and mid-eighteenth century. I did not find diabetes mentioned in the 750-page volume.
American physician John Brown, at the end of the eighteenth century, devoted two paragraphs to the disorder in his Elements of Medicine. In the Works of Thomas Sydenham, who practiced in the seventeenth century and is known as the Father of English Medicine, I found a single page on diabetes. It set forth a sparse description of the disease, recommended an all-meat diet, and prescribed an herbal remedy.
I opened Benjamin Ward Richardson’s 500-page work, Diseases of Modern Life, published in New York in 1876, a time when Edison and Bell were experimenting intensively with electricity. Four pages were devoted to diabetes. Richardson considered it a modern disease caused by exhaustion from mental overwork or by some shock to the nervous system. But it was still uncommon.
Then I consulted my “bible” of diseases of the nineteenth century, the Handbook of Geographical and Historical Pathology, published in stages between 1881 and 1886 in German and English. In this massive three-volume scholarly work, August Hirsch compiled the history of known diseases, along with their prevalence and distribution throughout the world. Hirsch spared six pages for diabetes, noting primarily that it was rare and that little information about it was known. In ancient Greece, he wrote, in the fourth century B.C., Hippocrates never mentioned it. In the second century A.D., Galen, a Greek-born physician practicing in Rome, devoted some passages to diabetes, but stated that he himself had seen only two cases.
The first book on diabetes had actually been written in 1798, but its author, John Rollo of England, had only seen three cases of it himself in his twenty-three years of practicing medicine.
The statistics Hirsch gathered from around the world confirmed to him that the disease “is one of the rarest.”3 About 16 people per year died of it in Philadelphia, 3 in Brussels, 30 in Berlin, and 550 in all of England. Occasional cases were reported in Turkey, Egypt, Morocco, Mexico, Ceylon, and certain parts of India. But an informant in St. Petersburg had not seen a case in six years. Practitioners in Senegambia and the Guinea Coast had never seen a case, nor was there any record of it occurring in China, Japan, Australia, the islands of the Pacific, Central America, the West Indies, Guiana, or Peru. One informant had never seen a case of diabetes during a practice of many years in Rio de Janeiro.
How, then, did diabetes come to be one of the major killers of humanity? In today’s world, as we will see, limiting one’s intake of sugar plays an important role in the prevention and control of this disease. But, as we will also see, blaming the rise of diabetes on dietary sugars is as unsatisfactory as blaming the rise of heart disease on dietary fats.
In 1976, I was living in Albuquerque when a friend placed a newly published book in my hands that changed the way I ate and drank. William Dufty, the author of Sugar Blues, had done his homework thoroughly. He convinced me that the single most addictive substance that was undermining the health of the masses, and had been doing so for centuries, was not alcohol, tobacco, opium, or marijuana, but sugar. He further blamed four centuries of the African slave trade largely on the need to feed a sugar habit that had been acquired by the Crusaders during the twelfth and thirteenth centuries. Europeans, he said, had wrested control of the world sugar trade from the Arab Empire, and needed a steady supply of labor to tend their sugar plantations. His claim that sugar was “more intoxicating than beer or wine and more potent than many drugs” was supported by an entertaining tale that he spun about his own perplexing illnesses and his heroic efforts to kick the sugar habit, which finally succeeded. Migraine headaches, mysterious fevers, bleeding gums, hemorrhoids, skin eruptions, a tendency to gain weight, chronic fatigue, and an impressive assortment of aches and pains that had tormented him for fifteen years vanished within twenty-four hours, he said, and did not return.
Dufty also explained why sugar causes diabetes. Our cells, especially our brain cells, get their energy from a steady supply of a simple sugar called glucose, which is the end product of digesting the carbohydrates we eat. “The difference between feeling up or down, sane or insane, calm or freaked out, inspired or depressed depends in large measure upon what we put in our mouth,” he wrote. He further explained that the difference between life and death depends upon a precise balance between the amount of glucose in our blood and the amount of blood oxygen, insulin being one of the hormones that maintains this balance. If not enough insulin is secreted by the pancreas after a meal, glucose builds up to a toxic level in the blood and we begin excreting it in our urine. If too much insulin is produced, blood glucose levels drop dangerously low.
The problem with eating pure sugar, wrote Dufty, is that it doesn’t need to be digested and is absorbed into the blood much too fast. Eating complex carbohydrates, fats, and proteins requires the pancreas to secrete an assortment of digestive enzymes into the small intestine so that these foods can be broken down. This takes time. The glucose level in the blood rises gradually. However, when we eat refined sugar it is turned into glucose almost immediately and passes directly into the blood, Dufty explained, “where the glucose level has already been established in precise balance with oxygen. The glucose level in the blood is thus drastically increased. Balance is destroyed. The body is in crisis.”
A year after reading this book I decided to apply to medical school, and first had to take basic courses in biology and chemistry that I had not taken in college. My biochemistry professor at the University of California, San Diego essentially confirmed what I had learned from reading Sugar Blues. We evolved, said my professor, eating foods like potatoes that have to be digested gradually. The pancreas automatically secretes insulin at a rate that exactly corresponds to the rate at which glucose—over a considerable period of time after a meal—enters the bloodstream. Although this mechanism works perfectly if you eat meat, potatoes, and vegetables, a meal containing refined sugar creates a disturbance. The entire load of sugar enters the bloodstream at once. The pancreas, however, hasn’t learned about refined sugar and “thinks” that you have just eaten a meal containing a tremendous amount of potatoes. A lot more glucose should be on its way. The pancreas therefore manufactures an amount of insulin that can deal with a tremendous meal. This overreaction by the pancreas drives the blood glucose level too low, starving the brain and muscles—a condition known as hypoglycemia.4 After years of such overstimulation the pancreas may become exhausted and stop producing enough insulin, or produce none at all. This condition is called diabetes and requires the person to take insulin or other drugs to maintain his or her energy balance and stay alive.
Many besides Dufty have pointed out that an extraordinary rise in sugar consumption has accompanied the equally extraordinary rise in diabetes rates over the past two hundred years. Almost a century ago Dr. Elliott P. Joslin, founder of Boston’s Joslin Diabetes Center, published statistics showing that yearly sugar consumption per person in the United States had increased eightfold between 1800 and 1917.5
But this model of diabetes is missing an important piece. It teaches us how to avoid getting diabetes in the twenty-first century: don’t eat highly refined foods, especially sugar. But it completely fails to explain the terrible prevalence of diabetes in our time. Sugar or no sugar, diabetes was once an impressively rare disease. The vast majority of human beings were once able to digest and metabolize large quantities of pure sugar without eliminating it in their urine and without wearing out their pancreas. Even Joslin, whose clinical experience led him to suspect sugar as a cause of diabetes, pointed out that the consumption of sugar in the United States had increased by only 17 percent between 1900 and 1917, a period during which the death rate from diabetes had nearly doubled. And he underestimated sugar use in the nineteenth century because his statistics were for refined sugar only. They did not include maple syrup, honey, sorghum syrup, cane syrup, and especially molasses. Molasses was cheaper than refined sugar, and until about 1850 Americans consumed more molasses than refined sugar. The following graph6 shows actual sugar consumption during the past two centuries, including the sugar content of syrups and molasses, and it does not fit the dietary model of this disease. In fact, per capita sugar consumption did not rise at all between 1922 and 1984, yet diabetes rates soared tenfold.
That diet alone is not responsible for the modern pandemic of diabetes is clearly shown by the histories of three communities at opposite ends of the world from one another. One has the highest rates of diabetes in the world today. The second is the largest consumer of sugar in the world. And the third, which I will examine in some detail, is the most recently electrified country in the world.
American Indians
The poster child for the diabetes story is supposed to be the American Indian. Supposedly—according to the American Diabetes Association—people today are just eating too much food and not getting enough exercise to burn off all the calories. This causes obesity, which, it is believed, is the real cause of most diabetes. Indians, so the story goes, are genetically predisposed to diabetes, and this predisposition has been triggered by the sedentary lifestyle imposed on them when they were confined to reservations, as well as by an unhealthy diet containing large amounts of white flour, fat, and sugar that have replaced traditional foods. And indeed, today, Indians on most reservations in the United States and Canada have rates of diabetes that are the highest in the world.
Yet this does not explain why diabetes did not exist among Indians until the latter half of the twentieth century, even though all Indian reservations had been created by the end of the nineteenth century, and Indian fry bread, consisting of white flour deep fried in lard and eaten with sugar, became a staple food on most reservations at that time. Before 1940 the Indian Health Service had never listed diabetes as a cause of death for a single Indian. And as late as 1987, surveys done by the Indian Health Service in the United States and the Department of National Health and Welfare in Canada revealed differences in diabetes rates between different populations of Indians that were extreme: 7 cases of diabetes per 1,000 population in the Northwest Territories, 9 in the Yukon, 17 in Alaska, 28 among the Cree/Ojibwa of Ontario and Manitoba, 40 on the Lummi Reservation in Washington, 53 among the Micmac of Nova Scotia and the Makah of Washington, 70 on the Pine Ridge Reservation in South Dakota, 85 on the Crow Reservation in Montana, 125 on the Standing Rock Sioux Reservation in the Dakotas, 148 on the Chippewa Reservation in Minnesota and North Dakota, 218 on the Winnebago/Omaha Reservation in Nebraska, and 380 on the Gila River Reservation in Arizona.7
In 1987, neither diet nor lifestyle in the various communities was different enough to account for a fifty-fold difference in diabetes rates. But one environmental factor could account for the disparities. Electrification came to most Indian reservations later than it came to most American farms. Even in the late twentieth century some reservations were still not electrified. This included most Indian reserves in the Canadian Territories and most native villages in Alaska. When the first electric service came to the Standing Rock Reservation in the Dakotas in the 1950s, diabetes came to that reservation at the same time.8 The Gila River Reservation is located on the outskirts of Phoenix. Not only is it traversed by high voltage power lines serving a metropolis of four million, but the Gila River Indian Community operates its own electric utility and its own telecommunications company. The Pima and Maricopa on this small reservation are exposed to a greater concentration of electromagnetic fields than any other Indian tribes in North America.
Brazil
Brazil, which has grown sugar cane since 1516, has been the largest producer and consumer of that commodity since the seventeenth century. Yet in the 1870s, when diabetes was beginning to be noticed as a disease of civilization in the United States, that disease was completely unknown in the sugar capital of the world, Rio de Janeiro.
Brazil today produces over 30 million metric tons of sugar per year and consumes over 130 pounds of white sugar per person, more than the United States. Analyses of the diets of the two countries—Brazil in 2002–2003, and the United States from 1996–2006—revealed that the average Brazilian obtained 16.7 percent of his or her calories from table sugar or sugar added to processed foods, while Americans consumed only 15.7 percent of their calories from refined sugars. Yet the United States had more than two and a half times the rate of diabetes as Brazil.9
Bhutan
Sandwiched between the mountainous borders of India and China, the isolated Himalayan kingdom of Bhutan may be the last country in the world to be electrified. Until the 1960s, Bhutan had no banking system, no national currency, and no roads. In the late 1980s, I learned something about this Buddhist country, thought by some to be the model for James Hilton’s Shangri-La, when I made the acquaintance of a Canadian woman who worked for CUSO International, the Canadian version of the United States Peace Corps. She had just returned from a four-year stint in a small Bhutanese village, where she taught English to the local children. Bhutan is somewhat larger, in area, than the Netherlands, and has a population just over 750,000. The road system at the time was still extremely limited, and most travel outside the immediate vicinity of the small capital, Thimphu, including travel to my friend’s village, was by foot or horseback. She felt privileged to be able to live in that country at all, because outside visitors to Bhutan were limited to 1,000 per year. The woven baskets and other handcrafts that she brought back were intricate and beautiful. Technology was unknown, as there was no electricity at all in most of the country. Diabetes was extremely rare, and completely unknown outside the capital.
As recently as 2002, fuel wood provided virtually one hundred percent of all non-commercial energy consumption. Fuel wood consumption, at 1.22 tons per capita, was one of the highest, if not the highest, in the world. Bhutan was an ideal laboratory in which to monitor the effects of electricity, because that country was about to be transformed from near zero percent electrification to one hundred percent electrification in a little over a decade.
In 1998, King Jigme Singye Wangchuk ceded some of his powers to a democratic assembly, which wanted to modernize the country. The Department of Energy and the Bhutan Electricity Authority were created on July 1, 2002. That same day the Bhutan Power Corporation was launched. With 1,193 employees, it immediately became the largest corporation in the kingdom. Its mandate was to generate and distribute electricity throughout the kingdom, with a target of full electrification of the country within ten years. By 2012 the proportion of rural households actually reached by electricity was about 84 percent.
In 2004, 634 new cases of diabetes were reported in Bhutan. The next year, 944. The year after that, 1,470. The following year, 1,732. The next year, 2,541, with 15 deaths.10 In 2010, there were 91 deaths and diabetes mellitus was already the eighth most common cause of mortality in the kingdom. Coronary heart disease was number one. Only 66.5 percent of the population had normal blood sugar.11 This sudden change in the health of the population, especially the rural population, was being blamed, incredibly, on the traditional Bhutanese diet, which, however, had not changed. “Bhutanese have a penchant for fat-rich foods,” reported Jigme Wangchuk in the Bhutan Observer. “All Bhutanese delicacies are fat-rich. Salty and fatty foods cause hypertension. Today, one of the main causes of ill-health in Bhutan is hypertension caused by oil-rich and salty traditional Bhutanese diet.” Rice, the article continued, which is the staple food of the Bhutanese, is rich in carbohydrates, which turn into fat unless there is physical activity; perhaps the Bhutanese are not getting enough exercise. Two-thirds of the population, the author complained, are not eating enough fruits and vegetables.
But the Bhutanese diet has not altered. The Bhutanese people are poor. Their country is mountainous with few roads. They have not all gone out and suddenly bought automobiles, refrigerators, washing machines, televisions, and computers, and become a lazy, inactive people. Yet rates of diabetes quadrupled in four years. Bhutan now ranks eighteenth in the world in its mortality rate from heart disease.
Only one other thing has changed so dramatically in Bhutan in the last decade: electrification, and the resulting exposure of the population to electromagnetic fields.
We recall from the last chapter that exposure to electromagnetic fields interferes with basic metabolism. The power plants of our cells, the mitochondria, become less active, slowing the rate at which our cells can burn glucose, fats, and protein. Instead of being taken up by our cells, excess fats accumulate in our blood and are deposited on the walls of our arteries along with the cholesterol that transports them, forming plaques and causing coronary heart disease. This can be prevented by eating a low-fat diet.
In the same way, excess glucose, instead of being taken up by our cells, also backs up and accumulates in our blood. This increases the secretion of insulin by our pancreas. Normally, insulin lowers blood sugar by increasing its uptake by our muscles. But now our muscle cells can’t keep up. They burn glucose as fast as they can after a meal, and it’s no longer fast enough. Most of the excess goes into our fat cells, is converted to fat, and makes us obese. If your pancreas becomes worn out and stops producing insulin, you have Type 1 diabetes. If your pancreas is producing enough, or too much insulin, but your muscles are unable to use glucose quickly enough, this is interpreted as “insulin resistance” and you have Type 2 diabetes.
Eating a diet free of highly refined, quickly digested foods, especially sugar, can prevent this. In fact, before the discovery of insulin in 1922, some doctors, including Elliott Joslin, successfully treated severe cases of diabetes with a near-starvation diet.12 They radically restricted their patients’ intake of not just sugar, but all calories, thus ensuring that glucose entered the bloodstream at a rate no faster than the cells could deal with. After a several days’ fast normalized the blood glucose, first carbohydrates, then proteins, then fats were gradually reintroduced into the patient’s diet. Sugar was eliminated. This saved many people who would have died within a year or two.
But in Joslin’s time the very nature of this disease underwent a mysterious transformation.
Insulin resistance—which accounts for the vast majority of diabetes in the world today—did not exist before the late nineteenth century. Neither did obese diabetic patients. Almost all people with diabetes were insulin-deficient, and they were universally thin: since insulin is needed in order for muscle and fat cells to absorb glucose, people with little or no insulin will waste away. They pee away their glucose instead of using it for energy, and survive by burning their stores of body fat.
In fact, overweight diabetics were at first so unusual that late-nineteenth-century doctors couldn’t quite believe the change in the disease—and some of them didn’t. One of these, John Milner Fothergill, a prominent London physician, wrote a letter to the Philadelphia Medical Times in 1884, in which he stated: “When a corpulent, florid-complexioned man, well-fed and vigorous, passes sugar in his urine, only a tyro would conjecture that he was the victim of classical diabetes, a formidable wasting disease.”13 Dr. Fothergill, as it turned out, was in denial. A corpulent, florid-complexioned man himself, Fothergill died of diabetes five years later.
Today the disease has changed entirely. Even children with Type 1, insulin-deficient diabetes tend to be overweight. They are overweight before they become diabetic because of their cells’ reduced ability to metabolize fats. They are overweight after they become diabetic because the insulin that they take for the rest of their lives makes their fat cells take up lots of glucose and store it as fat.
Diabetes Is Also a Disorder of Fat Metabolism
Nowadays, all blood that is drawn from a patient is sent right off to a laboratory to be analyzed. The doctor rarely looks at it. But a hundred years ago the quality and consistency of the blood were valuable guides to diagnosis. Doctors knew that diabetes involved an inability to metabolize not just sugar but fat, because blood drawn from a diabetic’s vein was milky, and when it was allowed to stand, a thick layer of “cream” invariably floated to the top.
In the early years of the twentieth century, when diabetes had become epidemic and was not yet controllable with any medication, it was not unusual for a diabetic’s blood to contain 15 to 20 percent fat. Joslin even found that blood cholesterol was a more reliable measure of the severity of the disease than blood sugar. He disagreed with those of his contemporaries who were treating diabetes with a low-carbohydrate, high-fat diet. “The importance of the modification of the treatment to include control of the fat of the diet is obvious,” he wrote. He issued a warning, appropriate not only for his contemporaries but for the future: “When fat ceases to be metabolized in a normal manner no striking evidence of it is afforded, and both patient and doctor continue to journey along in innocent oblivion of its existence, and hence fat is often a greater danger to a diabetic than carbohydrate.”14
The linked failure of both carbohydrate and fat metabolism is a sign of disordered respiration in the mitochondria, and the mitochondria, we have seen, are disturbed by electromagnetic fields. Under the influence of such fields, respiratory enzyme activity is slower. After a meal, the cells cannot oxidize the breakdown products of the proteins, fats, and sugars that we eat as quickly as they are being supplied by the blood. Supply outstrips demand. Recent research has shown exactly how this happens.
Glucose and fatty acids, proposed University of Cambridge biochemist Philip J. Randle in 1963, compete with each other for energy production. This mutual competition, he said, operates independently of insulin to regulate glucose levels in the blood. In other words, high fatty acid levels in the blood inhibit glucose metabolism, and vice versa. Evidence in support appeared almost immediately. Jean-Pierre Felber and Alfredo Vannotti at the University of Lausanne gave a glucose tolerance test to five healthy volunteers, and then another one a few days later to the same individuals while they were receiving an intravenous infusion of lipids. Every person responded to the second test as though they were insulin resistant. Although their insulin levels remained the same, they were unable to metabolize the glucose as quickly in the presence of high levels of fatty acids in their blood, competing for the same respiratory enzymes. These experiments were easy to repeat, and overwhelming evidence confirmed the concept of the “glucose-fatty acid cycle.” Some evidence also supported the idea that not only fats, but amino acids as well, competed with glucose for respiration.
Randle had not been thinking in terms of mitochondria, much less what could happen if an environmental factor restricted the ability of the respiratory enzymes to work at all. But during the last decade and a half, finally some diabetes researchers have begun focusing specifically on mitochondrial function.
Remember that our food contains three main types of nutrients—proteins, fats, and carbohydrates—that are broken down into simpler substances before being absorbed into our blood. Proteins become amino acids. Fats become triglycerides and free fatty acids. Carbohydrates become glucose. Some portion of these is used for growth and repair and becomes part of the structure of our body. The rest is burned by our cells for energy.
Within our cells, inside tiny bodies called mitochondria, amino acids, fatty acids, and glucose are all further transformed into even simpler chemicals that feed into a common cellular laboratory called the Krebs cycle, which breaks them down the rest of the way so that they can combine with the oxygen we breathe to produce carbon dioxide, water, and energy. The last component in this process of combustion, the electron transport chain, receives electrons from the Krebs cycle and delivers them, one at a time, to molecules of oxygen. If the speed of those electrons is modified by external electromagnetic fields, as suggested by Blank and Goodman, or if the functioning of any of the elements of the electron transport chain is otherwise altered, the final combustion of our food is impaired. Proteins, fats, and carbohydrates begin to compete with each other and back up into the bloodstream. Fats are deposited in arteries. Glucose is excreted in urine. The brain, heart, muscles, and organs become oxygen-deprived. Life slows down and breaks down.
Only recently was it proven that this actually happens in diabetes. For a century, scientists had assumed that, because most diabetics were fat, obesity causes diabetes. But in 1994, David E. Kelley at the University of Pittsburgh School of Medicine, in collaboration with Jean-Aimé Simoneau at Laval University in Quebec, decided to find out exactly why diabetics have such high fatty acid levels in their blood. Seventy-two years after insulin was discovered, Kelley and Simoneau were among the first to measure cellular respiration in detail in this disease. To their surprise, the defect turned out not to be in the cells’ ability to absorb lipids but in their ability to burn them for energy. Large amounts of fatty acids were being absorbed by the muscles and not metabolized. This led to intensive research into all aspects of respiration at the cellular level in diabetes mellitus. Important work continues to be done at the University of Pittsburgh, as well as at the Joslin Diabetes Center, RMIT University in Victoria, Australia, and other research centers.15
What has been discovered is that cellular metabolism is reduced at all levels. The enzymes that break down fats and feed them into the Krebs cycle are impaired. The enzymes of the Krebs cycle itself, which receives the breakdown products of fats, sugars, and proteins, are impaired. The electron transport chain is impaired. The mitochondria are smaller and reduced in number. Consumption of oxygen by the patient during exercise is reduced. The more severe the insulin resistance—i.e., the more severe the diabetes—the greater the reductions in all these measures of cellular respiratory capacity.
In fact, Clinton Bruce and his colleagues in Australia found that the oxidative capacity of the muscles was a better indicator of insulin resistance than their fat content—which threw into question the traditional wisdom that obesity causes diabetes. Perhaps, they speculated, obesity is not a cause but an effect of the same defect in cellular respiration that causes diabetes. A study involving lean, active young African-American women in Pittsburgh, published in 2014, seemed to confirm this. Although the women were somewhat insulin resistant, they were not yet diabetic, and the medical team could find no other physiological abnormalities in the group except two: their oxygen consumption during exercise was reduced, and mitochondrial respiration in their muscle cells was reduced.16
In 2009, the Pittsburgh team made an extraordinary finding. If the electrons in the electron transport chain are being disturbed by an environmental factor, then one would expect that diet and exercise might improve all components of metabolism except the last, energy-producing step involving oxygen. That is exactly what the Pittsburgh team found. Placing diabetic patients on calorie restriction and a strict exercise regime was beneficial in many respects. It increased the activity of the Krebs cycle enzymes. It reduced the fat content of muscle cells. It increased the number of mitochondria in the cells. These benefits improved insulin sensitivity and helped control blood sugar. But although the number of mitochondria increased, their efficiency did not. The electron transport enzymes in dieted, exercised diabetic patients were still only half as active as the same enzymes in healthy individuals.17
In June 2010, Mary-Elizabeth Patti, a professor at Harvard Medical School and researcher at the Joslin Diabetes Center, and Silvia Corvera, a professor at the University of Massachusetts Medical School in Worcester, published a comprehensive review of existing research on the role of mitochondria in diabetes. They were forced to conclude that a defect of cellular respiration may be the basic problem behind the modern epidemic. Due to “failure of mitochondria to adapt to higher cellular oxidative demands,” they wrote, “a vicious cycle of insulin resistance and impaired insulin secretion can be initiated.”
But they were not willing to take the next step. No diabetes researchers today are looking for an environmental cause of this “failure to adapt” of so many people’s mitochondria. They are still, in the face of evidence refuting it, blaming this disease on faulty diet, lack of exercise, and genetics. This despite the fact that, as Dan Hurley noted in his 2011 book, Diabetes Rising, human genetics has not changed, and none of diet, exercise, or drugs has put a dent in the escalation of this disease during the ninety years since insulin was discovered.
Diabetes in Radio Wave Sickness
In 1917, when Joslin was publishing the second edition of his book on diabetes, radio waves were being massively deployed on and off the battlefield in the service of war. At that point, as we saw in chapter 8, radio waves joined power distribution as a leading source of electromagnetic pollution on this planet. Their contribution has steadily grown until today when radio, television, radar, computers, cell phones, satellites, and millions of transmission towers have made radio waves by far the predominant source of electromagnetic fields bathing living cells.
The effects of radio waves on blood sugar are extremely well documented. However, none of this research has been done in the United States or Europe. It has been possible for western medical authorities to pretend that it doesn’t exist because most of it is published in Czech, Polish, Russian, and other Slavic languages in strange alphabets and has not been translated into familiar tongues.
But some of it has, thanks to the United States military, in documents that have not been widely circulated, and thanks to a few international conferences.
During the Cold War, from the late 1950s through the 1980s, the United States Army, Navy, and Air Force were developing and building enormously powerful early warning radar stations to protect against the possibility of nuclear attack. In order to stand sentinel over the air spaces surrounding the United States, these stations were going to monitor the entire coastline and the borders with Mexico and Canada. This meant that a strip of the American border up to hundreds of miles wide—and everyone who lived there—was going to be continuously bombarded with radio waves at power levels that were unprecedented in human history. Military authorities needed to review all ongoing research into the health effects of such radiation. In essence, they wanted to know the maximum levels of radiation to which they could get away with exposing the American population. And so one of the functions of the Joint Publications Research Service, a federal agency established during the Cold War to translate foreign documents, was to translate into English some of the Soviet and Eastern European research on radio wave sickness. One of the most consistent laboratory findings in this body of literature is a disturbance of carbohydrate metabolism.
In the late 1950s, in Moscow, Maria Sadchikova gave glucose tolerance tests to 57 workers exposed to UHF radiation. The majority had altered sugar curves: their blood sugar remained abnormally high for over two hours after an oral dose of glucose. And a second dose, given after one hour, caused a second spike in some patients, indicating a deficiency of insulin.18
In 1964, V. Bartoníček, in Czechoslovakia, gave glucose tolerance tests to 27 workers exposed to centimeter waves—the type of waves we are all heavily exposed to today from cordless phones, cell phones, and wireless computers. Fourteen of the workers were prediabetic and four had sugar in their urine. This work was summarized by Christopher Dodge in a report he prepared at the United States Naval Observatory and read at a symposium held in Richmond, Virginia in 1969.
In 1973, Sadchikova attended a symposium in Warsaw on the Biologic Effects and Health Hazards of Microwave Radiation. She was able to report on her research team’s observations of 1,180 workers exposed to radio waves over a twenty-year period, of whom about 150 had been diagnosed with radio wave sickness. Both prediabetic and diabetic sugar curves, she said, “accompanied all clinical forms of this disease.”
Eliska Klimková-Deutschová of Czechoslovakia, at the same symposium, reported finding an elevated fasting blood sugar in fully three-quarters of all individuals exposed to centimeter waves.
Valentina Nikitina, who was involved in some of the Soviet research and was continuing to do such research in modern Russia, attended an international conference in St. Petersburg in 2000. She reported that people who maintained and tested radio communication equipment for the Russian Navy—even people who had ceased such employment five to ten years previously—had, on average, higher blood glucose levels than unexposed individuals.
Attached to the same medical centers at which Soviet doctors were examining patients were laboratories where scientists were exposing animals to the very same types of radio waves. They, too, reported seriously disturbed carbohydrate metabolism. They found that the activity of the enzymes in the electron transport chain, including the last enzyme, cytochrome oxidase, is always inhibited. This interferes with the oxidation of sugars, fats and proteins. To compensate, anaerobic (non-oxygen using) metabolism increases, lactic acid builds up in the tissues, and the liver becomes depleted of its energy-rich stores of glycogen. Oxygen consumption declines. The blood sugar curve is affected, and the fasting glucose level rises. The organism craves carbohydrates, and the cells become oxygen starved.19
These changes happen rapidly. As early as 1962, V. A. Syngayevskaya, working in Leningrad, exposed rabbits to low level radio waves and found that the animals’ blood sugar rose by one third in less than an hour. In 1982, Vasily Belokrinitskiy, working in Kiev, reported that the amount of sugar in the urine was in direct proportion to the dose of radiation and the number of times the animal was exposed. Mikhail Navakatikian and Lyudmila Tomashevskaya reported in 1994 that insulin levels decreased by 15 percent in rats exposed for just half an hour, and by 50 percent in rats exposed for twelve hours, to pulsed radiation at a power level of 100 microwatts per square centimeter. This level of exposure is comparable to the radiation a person receives today sitting directly in front of a wireless computer, and considerably less than what a person’s brain receives from a cell phone.
If there wasn’t a public outcry when most of this information was concealed in foreign alphabets, there should be one now, because it has become possible to confirm directly, in human beings, the degree to which cell phones interfere with glucose metabolism, and the outcomes of such studies are being published in English. Finnish researchers reported their alarming findings in the Journal of Cerebral Blood Flow and Metabolism in 2011. Using positron emission tomography (PET) to scan the brain, they found that glucose uptake is considerably reduced in the region of the brain next to a cell phone.20
Even more recently, researchers at Kaiser Permanente in Oakland, California, confirmed that electromagnetic fields cause obesity in children. They gave pregnant women meters to wear for 24 hours to measure their exposure to magnetic fields during an average day. The children of those women were more than six times as likely to be obese when they were teenagers if their mothers’ average exposure during pregnancy had exceeded 2.5 milligauss. Of course, the children were exposed to the same high fields while growing up, so what the study really proved is that magnetic fields cause obesity in children.21
Vital Statistics
As with heart disease, rural mortality from diabetes in the 1930s corresponded closely with rates of rural electrification, and varied as much as tenfold between the least and the most electrified states.
The overall history of diabetes in the United States is similar to that of heart disease.
Death Rate from Diabetes in the United States (per 100,000 population)
1850 1.4
1860 1.7
1870 3.0
1880 3.4
1890 6.4
1900 10.6
1910 15.0
1920 16.2
1930 19.0
1940 26.6
1950 16.2
1960 16.7
1970 18.9
1980 15.4
1990 19.2
2000 25.2
2010 22.3
2017 25.7
Mortality from diabetes increased steadily from 1870 until the 1940s—this, despite the discovery of insulin in 1922.
The apparent drop in mortality in 1950 is not real, but is due to a reclassification that occurred in 1949. Previously, if a person had both diabetes and heart disease, the cause of death was reported as diabetes. Beginning in 1949, those deaths were reported as due to heart disease, diminishing the reported mortality from diabetes by about 40 percent. In the late 1950s, Orinase, Diabinese, and Phenformin were brought to market, the first of many oral medications that helped control the blood sugar of people with “insulin-resistant” diabetes for whom insulin was of limited use. These drugs have restrained, but not reduced, the mortality from this disease. Meanwhile the number of diagnosed cases of diabetes in the United States has steadily increased:
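The size of the reclassification effect can be checked against the mortality table above; a minimal sketch, using the 1940 and 1950 figures from that table:

```python
# Death rates from diabetes per 100,000 population, from the table above.
rate_1940 = 26.6  # last census year reported under the old classification
rate_1950 = 16.2  # first census year reported under the 1949 rules

# Apparent drop between the two census years, attributed in the text
# largely to the 1949 reclassification of combined-cause deaths.
drop = (rate_1940 - rate_1950) / rate_1940
print(f"apparent drop: {drop:.0%}")  # prints 39%, the "about 40 percent" in the text
```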
Year Cases per 1,000 population
1917 1.9 22
1944 5.7
1958 9.3
1963 11.5
1968 16.2
1973 20.4
1978 23.7
1983 24.5
1988 25.6
1990 25.2
1992 29.3
1994 29.8
1996 28.9
1997 38.0
1998 39.0
1999 40.0
2000 44.0
2001 47.5
2002 48.4
2003 49.3
2004 52.9
2005 56.1
2006 59.0
2007 58.6
2008 62.9
2009 68.6
2010 69.6
2011 67.8
2012 69.6
2013 71.8
2014 70.1
2015 72.0
The real change over time may have been even greater because the definition of diabetes, in the United States and worldwide, was relaxed in 1980. A two-hour plasma glucose level of over 130 milligrams per deciliter was formerly taken as an indication of diabetes, but since 1980 diabetes is not diagnosed until the two-hour level exceeds 200 milligrams per deciliter. Levels between 140 and 200, which may not cause sugar in the urine, are now called “prediabetes.”
A sudden spike in diabetes cases occurred nationwide in 1997—a 31 percent increase in a single year. No one was able to explain why. But that was the year the telecommunications industry introduced digital cell phones en masse to the United States. The first such phones went on sale in dozens of American cities during the Christmas season of 1996. Construction of cell towers began in those cities during 1996, but 1997 was the year that battalions of towers, previously confined to metropolises, marched out over the rural landscapes to occupy previously virgin territory. That was the year cell phones were transformed from a rich person’s luxury to the common person’s soon-to-be necessity—the year microwave radiation from towers and antennas became inescapable over large parts of the United States.
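The size of that one-year jump follows directly from the case table above; a minimal sketch:

```python
# Diagnosed diabetes cases per 1,000 population, from the table above.
cases = {1996: 28.9, 1997: 38.0}

# Relative increase from 1996 to 1997.
increase = cases[1997] / cases[1996] - 1
print(f"1996 to 1997 increase: {increase:.0%}")  # prints 31%
```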
The situation today is out of control. The Centers for Disease Control estimates that in addition to the 21 million American adults over the age of twenty who have diagnosed diabetes, 8 million have undiagnosed diabetes, and 86 million have prediabetes. Adding these numbers together gives the shocking statistic that 115 million Americans, or more than half of all adults, have elevated levels of sugar in their blood.
Worldwide, it was estimated that more than 180,000,000 adults had diabetes in 2000. In 2014, the estimate was 387,000,000. In no country on earth is the rate of diabetes, or of obesity, decreasing.
Like diabetes, obesity has tracked exposure to electromagnetic fields. The first official statistics in the United States date from 1960, showing that one-quarter of adults were overweight. That number did not change for twenty years. The fourth survey, however, conducted during 1988–1991, revealed something alarming: fourteen million additional Americans had become fat.
Overweight in the United States (percent of adults 20 through 74 years of age)
1960-1962 24.3
1971-1974 25.0
1976-1980 25.4
1988-1991 33.3
The authors, writing in the Journal of the American Medical Association, commented that studies in Hawaii and England had found similar rises in overweight during the 1980s across the board throughout the population in both sexes and at all ages. They speculated about “dietary knowledge, attitudes, and practices, physical activity levels, and perhaps social, demographic, and health behavior factors” that might have changed, although they did not point to a single piece of evidence that any of those things had changed.23 In rebuttal, British physician Jeremiah Morris noted in a letter to the British Medical Journal that the average lifestyle had improved during this time, not worsened. More people in England were cycling, walking, swimming, and doing aerobics than ever before. Average daily food consumption, even after adjusting for meals eaten outside the home, had declined by 20 percent between 1970 and 1990.
However, in 1977, Apple had marketed its first personal computer, and during the 1980s the majority of people in both the United States and England, either at home or at work or both, were suddenly—and for the first time in history—exposed to high frequency electromagnetic fields continuously for hours every day.
The problem became so huge that in 1991 the Centers for Disease Control began retroactively tracking not just overweight but obesity. For an American man or woman of average height this is defined as being more than about 30 pounds overweight.
Obesity in the United States 24 (percent of adults over 20 years of age)
1960-1962 13.4
1971-1974 14.4
1976-1980 14.7
1988-1991 22.3
1999-2000 30.5
2009-2010 35.7
2015-2016 39.6
Grade III obesity, called “morbid obesity,” has also been rising since 1980. This is defined as being more than about 100 pounds overweight.
Grade III Obesity in the United States (percent of adults over 20 years of age)
1960-1962 0.8
1971-1974 1.3
1976-1980 1.3
1988-1991 2.8
1999-2000 4.7
2009-2010 6.3
2015-2016 7.7
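The weight categories in these tables are defined by body mass index (weight divided by height squared); the “about 30 pounds overweight” and “about 100 pounds overweight” figures correspond, for a person of average U.S. height, to the standard BMI cutoffs of 30 (obesity) and 40 (grade III obesity). A minimal sketch, assuming those standard cutoffs (which are not stated in the text itself):

```python
def bmi(weight_lb: float, height_in: float) -> float:
    """Body mass index in kg/m^2, computed from pounds and inches."""
    return (weight_lb * 0.4536) / (height_in * 0.0254) ** 2

def category(b: float) -> str:
    # Standard cutoffs: 25 overweight, 30 obese, 40 grade III ("morbid") obesity.
    if b >= 40:
        return "grade III obesity"
    if b >= 30:
        return "obese"
    if b >= 25:
        return "overweight"
    return "not overweight"

# For a 5'9" adult, overweight begins near 169 lb and obesity near 203 lb,
# i.e. roughly 30 lb above the overweight line, matching the text.
print(category(bmi(205, 69)))
```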
More than two-thirds of all adults today—about 150 million Americans—are overweight. Eighty million are obese, as are twelve and a half million children, including one and a half million children aged two to five.25 Twelve and a half million adults are more than 100 pounds overweight. The experts at the Centers for Disease Control have been able to do little more than shout that similar trends are being reported elsewhere—more than half a billion adults worldwide are obese—and to throw up their hands and say, “We do not know the causes of these increases in overweight and obesity.”26
Obesity in Wild and Domestic Animals
If obesity is caused by an environmental factor, then it should be occurring in animals too. And it is so.
A few years ago David B. Allison, professor of biostatistics at the University of Alabama School of Public Health, was looking over data on small primates called marmosets from the Wisconsin Non-Human Primate Center, when he noticed that the average weight of the animals had increased remarkably over time. Mystified, he checked with the center, but could find no convincing reason for weight gain in this large population of animals living in a fixed laboratory environment on a controlled diet.
Intrigued, Allison searched online for previous studies of mammals that had lasted at least a decade and contained information about the animals’ weight. He involved colleagues at primate centers, toxicology programs, pet food companies, and veterinary programs. The final paper, published in 2010 in the Proceedings of the Royal Society B, had eleven coauthors from Alabama, Florida, Puerto Rico, Maryland, Wisconsin, North Carolina, and California, and analyzed data on over 20,000 animals from twenty-four populations representing eight species, including laboratory animals, house pets, and feral rats, both rural and urban. In all twenty-four populations, the average weight of the animals rose over time. The odds of this happening by chance were less than ten billion to one.
Animal population Average weight gain per decade
macaques, 1971 to 2006 (Wisconsin Primate Center) 5.3%
macaques, 1981 to 1993 (Oregon Primate Center) 9.6%
macaques, 1979 to 1992 (California Primate Center) 11.5%
chimpanzees, 1985 to 2005 (Yerkes Primate Center, Atlanta) 33.6%
vervets, 1990 to 2006 (UCLA Vervet Research Center) 8.8%
marmosets, 1991 to 2006 (Wisconsin Primate Center) 9.3%
laboratory mice, 1982 to 2005 3.4%
domestic dogs, 1989 to 2001 2.9%
domestic cats, 1989 to 2001 9.7%
feral rats, 1948 to 2006 (urban) 6.9%
feral rats, 1948 to 1986 (rural) 4.8%
Chimpanzees gained the most weight: they were twenty-nine times as likely to be obese in 2005 as they were in 1985. But even among country rats there was 15 percent more obesity every decade, consistently for four decades. The authors found similar studies with the same results elsewhere: 19 percent of light breed horses in Virginia were obese in 2006, versus 5 percent in 1998;27 laboratory rats in France, under identical conditions, had increased in weight between 1979 and 1991.
Because wild and domestic animals were gaining so much weight, and had been doing so since at least the 1940s, Allison and his colleagues challenged the tired old wisdom that the rising tide of human fatness is due to lack of exercise and poor diet. They held up these animals as a warning to us all about an unknown global environmental factor. They titled their report “Canaries in the Coal Mine.”28
13.
Cancer and the Starvation of Life
At the commencement of the twentieth century the great problem of the causation of tumours, like a giant sphinx, looms large on the medical horizon... W. ROGER WILLIAMS, Fellow of the Royal College of Surgeons, England, 1908
ON FEBRUARY 24, 2011, Italy’s Supreme Court upheld the criminal conviction of Cardinal Roberto Tucci, former president of Vatican Radio’s management committee, for creating a public nuisance by polluting the environment with radio waves. The Vatican’s broadcasts to the world, transmitted in forty languages, emanate from fifty-eight powerful radio towers occupying over one thousand acres of land, surrounded by suburban communities. And for decades, residents in those communities had been screaming that the transmissions were destroying their health as well as causing an epidemic of childhood leukemia. At the request of the Public Prosecutor’s office in Rome, which was considering bringing charges against the Vatican for negligent homicide, Judge Zaira Secchi ordered an official investigation by the National Cancer Institute of Milan. The results, released November 13, 2010, were shocking. Between 1997 and 2003, children aged one to fourteen who lived between six and twelve kilometers (3.7 to 7.5 miles) from Vatican Radio’s antenna farm developed leukemia, lymphoma, or myeloma at eight times the rate of children who lived further away. And adults who lived between six and twelve kilometers from the antennas died of leukemia at almost seven times the rate of adults who lived further away.
The third disease of civilization that Samuel Milham related to electrification is cancer. At first blush it is not obvious what the connection is. Impaired metabolism of sugars is surely connected to diabetes, and impaired metabolism of fats to heart disease. But how does cancer fit in? The key was provided by a scientist who studied sea urchin eggs in his laboratory over one hundred years ago. He was a native of the same city where, a century later, three thousand doctors were to sign an appeal to the world stating, among other things, that radio waves cause leukemia.
On October 8, 1883, a son was born to Emil Warburg, a prominent Jewish physicist in Freiburg, Germany. When he was thirteen, the family moved to Berlin, where visitors to his parents’ home included some of the giants of the natural sciences—chemist Emil Fischer, physical chemist Walter Nernst, physiologist Theodor Wilhelm Engelmann. Later, when Albert Einstein moved to Berlin, the great scientist used to come over to play chamber music with his father—Einstein on violin and Emil Warburg on piano. No one was surprised when young Otto, growing up in such an atmosphere, enrolled in the University of Freiburg to study chemistry.
But by the time he received his Ph.D. in 1906, a growing disease epidemic had caught the attention of this ambitious young man. His was the first generation seriously to be affected by it. Cancer rates all over Europe had doubled since he was born, and he determined to devote his life to finding the reason and, hopefully, a cure. With this in mind he returned to school, receiving his M.D. from the University of Heidelberg in 1911.
What fundamental changes, he wondered, take place in the tissue when a normal cell becomes cancerous? “Does the metabolism of tumours,” he asked, “growing in a disorganized manner, differ from the metabolism of orderly cells growing at the same rate?”1 Impressed that both tumors and early embryos consist of undifferentiated cells that multiply rapidly, Otto Warburg began his life’s work by studying fertilized eggs. Perhaps, he speculated, cancer cells are just normal cells that have reverted to an embryonic pattern of growth. He chose the sea urchin egg to study because its embryo is large and grows particularly fast. His first major work, published while he was still in medical school, showed that on fertilization the rate of oxygen consumption of an egg rises sixfold.2
But in 1908, he could pursue his ambition no further because the chemical reactions within cells that involve oxygen were completely unknown. Spectrophotometry—the identification of chemicals from the frequencies of light they absorb—was new, and had not yet been applied to living systems. Existing techniques for culturing cells and measuring gas exchange were primitive. Warburg realized that before any real progress could be made in elucidating the metabolism of cancer, fundamental research on the metabolism of normal cells would have to be done. Cancer research would have to wait.
During the coming years—with a break during which he served in the World War—Warburg, using techniques that he developed, proved that respiration in a cell took place in tiny structures that he called “grana” and that we now call mitochondria. He experimented with the effects of alcohols, cyanide, and other chemicals on respiration and concluded that the enzymes in the “grana” must contain a heavy metal that he suspected, and later proved, was iron. He conducted landmark experiments using spectrophotometry that proved that the portion of the enzyme that reacts with oxygen in a cell is identical with the portion of hemoglobin that binds oxygen in the blood. That chemical, called heme, is a porphyrin bonded to iron, and the enzyme containing it, which exists in every cell and makes breathing possible, is known today as cytochrome oxidase. For this work Warburg was awarded the Nobel Prize in Physiology or Medicine in 1931.
Meanwhile, in 1923, Warburg resumed his research on cancer, picking up where he had left off fifteen years earlier. “The starting point,” he wrote, “has been the fact that the respiration of sea urchin eggs increases six-fold at the moment of fertilization,” i.e. at the moment that it changes from a state of rest to a state of growth. He expected to find a similar increase of respiration in cancer cells. But to his amazement, he found just the opposite. The rat tumor he was working with used considerably less oxygen than normal tissues from healthy rats.
“This result seemed so startling,” he wrote, “that the assumption seemed justified that the tumor lacked suitable material for combustion.” So Warburg added various nutrients to the culture medium, still expecting to see a dramatic rise in oxygen use. Instead, when he added glucose, the tumor’s respiration ceased completely! And in trying to discover why this happened, he found that tremendous amounts of lactic acid were accumulating in the culture medium. The tumor, in fact, was producing fully twelve percent of its weight in lactic acid per hour. Per unit time, it was producing 124 times as much lactic acid as blood, 200 times as much as a frog’s muscle at rest, and eight times as much as a frog’s muscle working to the limit of its capacity. The tumor was consuming the glucose, all right, but it was not making use of oxygen to do it.3
In additional experiments on other types of cancers in animals and humans, Warburg found that this was generally true of all cancer cells, and of no normal cells. This singular fact impressed Warburg as of utmost importance and the key to the causation of this disease.
The extraction of energy from glucose without using oxygen, a type of metabolism called anaerobic glycolysis—also called fermentation—is a highly inefficient process that takes place to a small extent in most living cells but only becomes important when not enough oxygen is available. For example, runners, during a sprint, push their muscles to use energy faster than their lungs can deliver oxygen to them. Their muscles temporarily produce energy anaerobically (without oxygen), incurring an oxygen debt that is repaid when they end their sprint and stop to gulp air. Although capable of supplying energy rapidly in an emergency, anaerobic glycolysis produces much less energy for the same amount of glucose, and deposits lactic acid in the tissues that has to be disposed of.
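The inefficiency can be put in rough numbers. Textbook figures—assumed here, not given in the text—are about 2 ATP per glucose molecule from glycolysis alone, versus roughly 30 or more when the sugar is fully oxidized through the Krebs cycle and electron transport chain. A minimal sketch under those assumed yields:

```python
# Approximate ATP yields per molecule of glucose (common textbook figures).
ATP_GLYCOLYSIS = 2   # anaerobic glycolysis (fermentation) alone
ATP_OXIDATIVE = 32   # full oxidation via Krebs cycle and electron transport

# To match the energy output of respiration, a fermenting cell must
# consume many times more glucose per unit of energy produced.
glucose_penalty = ATP_OXIDATIVE / ATP_GLYCOLYSIS
print(f"glucose needed per unit energy: {glucose_penalty:.0f}x")  # prints 16x
```

This glucose hunger is what makes the PET scanning described later in the chapter possible.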
Fermentation is a very old form of metabolism from which all forms of life obtained their energy for billions of years, before green plants appeared on earth and filled the atmosphere with oxygen. Some primitive forms of life today—many bacteria and yeasts, for example—still rely on it, but all complex organisms have abandoned that way of making a living.
What Warburg discovered in 1923 is that cancer cells differ from normal cells in all higher organisms in this fundamental respect: they maintain high rates of anaerobic glycolysis and produce large amounts of lactic acid even in the presence of oxygen. This discovery, called the Warburg effect, is the basis for the diagnosis and staging of cancer today, using positron emission tomography, or PET scanning. Because anaerobic glycolysis is inefficient and consumes glucose at a tremendous rate, PET scans can easily find tumors in the body by the more rapid uptake of radioactive glucose. And the more malignant the tumor, the more rapidly it takes up glucose.
Warburg reasonably believed he had discovered the cause of cancer. Evidently, in cancer, the respiratory mechanism has been damaged and has lost control over the metabolism of the cell. Unrestrained glycolysis—and unrestrained growth—are the result. In the absence of normal metabolic control the cell reverts to a more primitive state. All complex organisms, proposed Warburg, must have oxygen in order to maintain their highly differentiated forms. Without oxygen, they will revert to a more undifferentiated, simple form of growth, such as existed exclusively on this planet before there was oxygen in the air. “The causative factor in the origin of tumors,” proposed Warburg, “is nothing other than oxygen deficiency.”4 When cells are deprived of oxygen only temporarily, glycolysis takes over during the emergency, but ceases again when oxygen is once more available. But when cells are repeatedly or chronically deprived of oxygen, he said, respiratory control is eventually damaged and glycolysis becomes independent. “If respiration of a growing cell is disturbed,” wrote Warburg in 1930, “as a rule the cell dies. If it does not die, a tumour cell results.”5
Warburg’s hypothesis was first brought to my attention in the mid-1990s by Dr. John Holt, a colorful figure in Australia who was treating cancer with microwave radiation, and who warned his colleagues that the same radiation could convert normal cells into cancerous ones. I didn’t fully understand the connection of Warburg’s work on cancer to my work on electricity, so I filed away the research papers Holt sent me for future reference. Today, with so many more pieces of the puzzle in place, the connection is obvious. Electricity, like rain on a campfire, dampens the flames of combustion in living cells. If Warburg was correct, and chronic lack of oxygen causes cancer, then one need look no further than electrification for the origin of the modern pandemic.
Warburg’s theory was controversial from the beginning. Hundreds of different kinds of cancers were known in the 1920s, triggered by thousands of kinds of chemical and physical agents. Many scientists were reluctant to believe in a common cause that was so simple. Warburg answered them with a simple explanation: each of those thousands of chemicals and agents, in its own way, starves cells of oxygen. Arsenic, he explained by way of example, is a respiratory poison that causes cancer. Urethane is a narcotic that inhibits respiration and causes cancer. When you implant a foreign object under the skin, it causes cancer because it blocks blood circulation, starving neighboring tissues of oxygen.6
Although they didn’t necessarily accept Warburg’s theory of causation, other researchers lost little time confirming the Warburg effect. Tumors did, universally, have the ability to grow without oxygen. By 1942, Dean Burk at the National Cancer Institute was able to report that this was true of over 95 percent of the cancerous tissues he had examined.
Then, in the early 1950s, Harry Goldblatt and Gladys Cameron, at the Institute for Medical Research at Cedars of Lebanon Hospital in Los Angeles, reported to a skeptical public that they had succeeded in transforming normal cells—cultured fibroblasts from the heart of a five-day-old rat—into cancer cells merely by repeatedly depriving them of oxygen.
In 1959, Paul Goldhaber gave further support to Warburg’s hypothesis when he discovered that some types of Millipore diffusion chambers, but not others, when implanted under the skin of mice, caused large tumors to grow around them. Diffusion chambers were used to sample tissue fluid in many kinds of animal experiments. Their ability to cause cancer turned out to depend not on the type of plastic they were made of, but on the size of the pores that allowed fluid to flow through them. Only one animal out of 39 developed a tumor when the pores were 450 millimicrons in diameter. But 9 out of 34 developed tumors when the pore size was 100 millimicrons, and 16 out of 35—close to half—developed tumors when the pore size was only 50 millimicrons. The interference with free fluid circulation when the pore size was too small apparently deprived the tissues next to the chamber walls of oxygen.
In 1967, Burk’s team proved that the more malignant a tumor is, the higher its rate of glycolysis, the more glucose it consumes, and the more lactic acid it produces. “The extreme forms of rapidly growing ascites cancer cells,” Burk reported, “can produce lactic acid from glucose anaerobically at a sustained rate probably faster than any other living mammalian tissue—up to half the tissue dry weight per hour. Even a hummingbird, whose wings may beat up to at least one hundred times a second, consumes at best only half its dry weight of glucose equivalent per day.”
Because he insisted that the origin of cancer was known, Warburg thought that “one could prevent about 80 percent of all cancers if one could keep out the known carcinogens.”7 He therefore advocated, in 1954, for restrictions on cigarette smoking, pesticides, food additives, and air pollution by car exhaust.8 His incorporation of these attitudes into his personal life earned him a reputation as an eccentric. Long before environmentalism was popular, Warburg had a one-acre organic garden, obtained milk from an organically maintained herd, and purchased French butter because in France the use of herbicides and pesticides was more strictly controlled than in Germany.
Otto Warburg passed away in 1970 at the age of 83—the same year the first oncogene was discovered. An oncogene is an abnormal gene, thought to be caused by mutation, that is associated with the development of cancer. The discovery of oncogenes and tumor suppressor genes promoted a widespread belief that cancer was caused by genetic mutations and not by altered metabolism. Warburg’s hypothesis, controversial from the start, was largely abandoned for three decades.
But the widespread use of PET scanning for diagnosing and staging human cancers has catapulted the Warburg effect back onto the main stage of cancer research. No one can now deny that cancers live in anaerobic environments, and that they rely on anaerobic metabolism in order to grow. Even molecular biologists, who once focused exclusively on the oncogene theory, are discovering, after all, that there is a connection between lack of oxygen and cancer. A protein has been discovered that exists in all cells—hypoxia-inducible factor (HIF)—that is activated under conditions of low oxygen, and that in turn activates many of the genes necessary for cancer growth. HIF activity has been found to be elevated in colon, breast, gastric, lung, skin, esophageal, uterine, ovarian, pancreatic, prostate, renal, stomach, and brain cancers.9
Cellular changes that indicate damaged respiration—including reductions in the number and size of mitochondria, abnormal structure of mitochondria, lessened activity of Krebs cycle enzymes, lessened activity of the electron transport chain, and mutations of mitochondrial genes—are being routinely found in most types of cancer. Even in tumors caused by viruses, one of the first signs of malignancy is an increase in the rate of anaerobic metabolism.
Experimentally inhibiting the respiration of cancer cells, or simply depriving them of oxygen, has been shown to alter the expression of hundreds of genes that are involved in malignant transformation and cancer growth. Damaging respiration makes cancer cells more invasive; restoring normal respiration makes them less invasive.10
A consensus is forming among cancer researchers: tumors can only develop if cellular respiration is diminished.11 In 2009, a book dedicated to Otto Warburg was published titled “Cellular Respiration and Carcinogenesis.” Addressing all aspects of this question, it contains contributions from leading cancer researchers from the United States, Germany, France, Italy, Brazil, Japan, and Poland.12 In the foreword, Gregg Semenza wrote: “Warburg invented a device, now known as the Warburg manometer, with which he demonstrated that tumor cells consume less oxygen (and produce more lactate) than do normal cells under the same ambient oxygen concentrations. A century later, the struggle to understand how and why metastatic cancer cells manifest the Warburg effect is still ongoing, and 12 rounds of this heavyweight fight await the reader beyond this brief introduction.”
The question being asked today by cancer researchers is no longer, “Is the Warburg effect real?” but “Is hypoxia a cause, or an effect, of cancer?”13 But, as more and more scientists are admitting, it really doesn’t matter, and may be only a question of semantics. Since cancer cells thrive in the absence of oxygen, oxygen deprivation gives incipient cancer cells a survival advantage.14 Any environmental factor that damages respiration will therefore increase the cancer rate—whether Warburg was right and it directly causes malignant transformation, or whether the skeptics are right and it merely provides an environment in which cancer has an advantage over normal cells.
Electricity, as we have seen, is such a factor.
Diabetes and Cancer
If the same cause—a slowing of metabolism by the electromagnetic fields around us—produces both diabetes and cancer, then one might expect diabetics to have a high rate of cancer, and vice versa. And it is so.
The first person to confirm a connection between the two diseases was South African physician George Darell Maynard in 1910. Unlike almost all other diseases, rates of both cancer and diabetes were steadily rising. Thinking that they might have a common cause, he analyzed mortality statistics from the 15 death registration states in the 1900 Census of the United States. And he found, after correcting for population and age, that the two diseases were strongly related. States that had higher incidences of one also had higher incidences of the other. He proposed that electricity might be that common cause:
“Only one cause, it seems to me, will fit the facts as we know them, viz.: the pressure of modern civilisation and the strain of modern competition, or some factor closely associated with these. Radio-activity and various electric phenomenon have from time to time been accused of producing cancer. The increased use of high tension currents is an undoubted fact in modern city life.”
A century later, it is an accepted fact that diabetes and cancer occur together. More than 160 epidemiological studies have investigated this question worldwide, and the majority have confirmed a link between the two diseases. Diabetics are more likely than non-diabetics to develop, and to die from, cancers of the liver, pancreas, kidney, endometrium, colon, rectum, bladder, and breast, as well as non-Hodgkin’s lymphoma.15 In December 2009, the American Diabetes Association and American Cancer Society convened a joint conference. The consensus report that resulted concurred: “Cancer and diabetes are diagnosed within the same individual more frequently than would be expected by chance.”16
Cancer in Animals
We recall from chapter 11 that complete autopsy records of the Philadelphia zoo, kept since 1901, showed an increase in heart disease that accelerated during the 1930s and 1940s, and that affected all species of animals and birds at the zoo. An equivalent increase occurred in rates of cancer. The 1959 report from the Penrose Research Laboratory at the zoo17 divided the autopsies into two time periods: 1901-1934 and 1935-1955. The rate of malignant tumors among nine families of mammals increased between two- and twenty-fold from the earlier to the later time period. The rate of benign tumors increased even more. Only 3.6 percent of felines, for example, had benign or malignant tumors at autopsy during the earlier period, compared to 18.1 percent during the later period; 7.8 percent of ursines (bears) had tumors during the earlier period, compared to 47 percent during the later period.
The autopsy records of 7,286 birds at the zoo, encompassing four different orders, showed that malignant tumors increased two-and-a-half-fold, and benign tumors eightfold.
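The fold-increases can be computed for the two mammal families whose figures are quoted above (a sketch; only the feline and ursine percentages are given in the text):

```python
# Tumor prevalence at autopsy, Philadelphia zoo: 1901-1934 vs. 1935-1955
# (percentages from the Penrose Research Laboratory report cited above).
prevalence = {
    "felines": (3.6, 18.1),
    "ursines": (7.8, 47.0),
}
for family, (early, late) in prevalence.items():
    print(f"{family}: {early}% -> {late}%, a {late/early:.1f}-fold increase")
```

Both values (about five- and six-fold) fall within the two- to twenty-fold range reported across the nine families.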
Vital Statistics
The real story, again, is revealed by the historical records.
The increase in cancer began slightly before heart disease and diabetes began to rise. Early records from England show that cancer deaths were rising as early as 1850:18
Year Cancer deaths, England (per 100,000 population)
1840 17.7
1850 27.9
1855 31.9
1860 34.3
1865 37.2
1870 42.4
1875 47.1
1880 50.2
1885 57.2
1890 67.6
1895 75.5
1900 82.8
1905 88.5
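The English table implies a remarkably steady compound growth; a quick derivation from its two endpoints (a sketch, using only the 1840 and 1905 values above):

```python
# Cancer deaths per 100,000 in England, endpoints of the table above.
rate_1840, rate_1905 = 17.7, 88.5
span = 1905 - 1840  # years

fold = rate_1905 / rate_1840
cagr = fold ** (1 / span) - 1  # compound annual growth rate
print(f"{fold:.1f}-fold rise over {span} years, about {cagr:.1%} per year")
```

A five-fold rise over 65 years corresponds to roughly 2.5 percent per year, sustained for two generations.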
Cooke and Wheatstone’s first telegraph line, running from London to West Drayton, opened for business on July 9, 1839. By 1850, over two thousand miles of wire ran the length and breadth of England. While we don’t have earlier statistics from England to prove that cancer rates first began rising between 1840 and 1850, or comparable data from any other national government, we do have them for the parish of Fellingsbro, a small well-to-do rural district 90 miles west of Stockholm, Sweden. We have them because in 1902, Swedish physician Adolf Ekblom, in an effort to discover whether cancer rates had really risen during the previous century, consulted the “death and burial book” kept by the clergy of Fellingsbro parish. These are the numbers that he compiled from that book:
Years Average yearly cancer mortality (Fellingsbro, per 100,000 population)
1801-1810 2.1
1811-1820 6.5
1821-1830 8.1
1831-1840 3.5
1841-1850 6.6
1851-1860 14.0
*** ***
1885-1894 72.5
1895-1900 141.0
The records were incomplete from 1863 to 1884. But the records that survive tell the story that we seek.
The population of Fellingsbro was 4,608 at the beginning of the nineteenth century, and 7,104 at the end of it. One person died of cancer about every three years between 1801 and 1850. Then, in 1853, the first telegraph wire in Sweden was strung between Stockholm, the capital, and Uppsala, a city 37 miles north. The following year a line was run southwestward from Uppsala, via Västerås, to Örebro. This line ran right through the middle of Fellingsbro parish. At that time the cancer rate in Fellingsbro began to rise.19 By the turn of the twentieth century, the country folk in Fellingsbro were dying of cancer faster than the average residents of London.
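The "one death about every three years" figure can be checked against the parish numbers quoted above (a rough sketch; it uses the start-of-century population, so it slightly understates the count — with the larger mid-century population the figure approaches one death every three years):

```python
# Rough check of "one cancer death about every three years" in early
# Fellingsbro, from the parish figures quoted in the text.
population = 4608                   # at the start of the 19th century
rates = [2.1, 6.5, 8.1, 3.5, 6.6]  # deaths per 100,000 per year, 1801-1850

mean_rate = sum(rates) / len(rates)           # about 5.4 per 100,000
deaths_per_year = population * mean_rate / 100_000
print(f"about {deaths_per_year:.2f} deaths per year, "
      f"one every {1/deaths_per_year:.1f} years")
```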
In 1900, annual cancer deaths around the world, per 100,000 population, were:
Switzerland 127
Holland 92
Norway 91
England and Wales 83
Scotland 79
Bermuda 75
Germany 72
Austria 71
France 65
USA 64
Australia 63
Ireland 61
New Zealand 56
Belgium 56
Italy 52
Uruguay 50
Japan 46
Spain 39
Hungary 33
Cuba 29
Chile 27
British Guiana 24
Portugal 22
Windward and Leeward Islands 22
Costa Rica 20
British Honduras 19
Jamaica 16
St. Kitts 13
Trinidad 12
Mauritius 12
Serbia 9
Ceylon 5.5
Hong Kong 4.5
Brazil 4.5
Guatemala 4
La Paz, Bolivia 3.4
Bahamas 1.8
Fiji 1.7
New Guinea, Borneo, Java, Sumatra, Philippines, most of Africa, Macao non-existent
Every historical source shows that cancer always accompanied electricity. In 1914, among about 63,000 American Indians living on reservations, none of which had electricity, there were only two deaths from cancer. The cancer mortality in the United States as a whole was 25 times as high.20
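The 25-fold figure can be reproduced from the numbers given (a sketch; the national rate of about 79 per 100,000 for 1914 is my interpolation between the 1910 and 1920 values in the United States table later in this chapter — an assumption):

```python
# The 25-fold disparity: 2 cancer deaths among ~63,000 reservation
# inhabitants in 1914, vs. a US national rate of ~79 per 100,000
# (interpolated between the 1910 and 1920 table values -- an assumption).
reservation_rate = 2 / 63_000 * 100_000  # about 3.2 per 100,000
us_rate_1914 = 79.0

ratio = us_rate_1914 / reservation_rate
print(f"reservation: {reservation_rate:.1f} per 100,000; "
      f"US rate {ratio:.0f} times as high")
```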
An unusual one-year rise in cancer mortality of 3 to 10 percent occurred in every modernizing country in 1920 or 1921. This corresponded to the beginning of commercial AM radio broadcasting. In 1920, cancer deaths rose 8 percent in Norway, 7 percent in South Africa and France, 5 percent in Sweden, 4 percent in the Netherlands, and 3 percent in the United States. In 1921, cancer deaths rose 10 percent in Portugal, 5 percent in England, Germany, Belgium, and Uruguay, and 4 percent in Australia.
Lung cancer, breast cancer, and prostate cancer rates rose spectacularly throughout the first half of the twentieth century in every country for which we have good data. The number of deaths from breast cancer quintupled in Norway, sextupled in the Netherlands, and increased sixteen-fold in the United States. Lung cancer deaths increased twenty-fold in England. Prostate cancer deaths increased eleven-fold in Switzerland, twelve-fold in Australia, and thirteen-fold in England.
Lung cancer was once so uncommon that it was not even listed separately in most countries until 1929. In the few countries that tracked it, it did not start its dramatic rise until about 1920. Benjamin Ward Richardson, in his 1876 book, Diseases of Modern Life, is surprising to a modern reader in this respect. His chapter on “Cancer from Smoking” discusses the controversy over whether tobacco smoking caused cancer of the lip, tongue, or throat, but cancer of the lung is not even mentioned. Lung cancer was still rare in 1913, the year when the American Society for the Control of Cancer was founded. Out of 2,641 cases of cancer reported to the New York State Institute for the Study of Malignant Disease that year, there was only a single case of primary lung cancer. Frederick Hoffman, in his exhaustive 1915 book, The Mortality From Cancer Throughout the World, asserted as a proven fact that smoking caused cancer of the lips, mouth, and throat, but like Richardson four decades previously made no mention of lung cancer in connection with smoking.21
Swedish researchers Örjan Hallberg and Olle Johansson have shown that the rates of lung, breast, and prostate cancer continued to rise, just as spectacularly, in the second half of the twentieth century in forty countries, along with malignant melanoma and cancers of the bladder and colon— and that the overall rate of cancer changed precisely with changes in the exposure of the population to radio waves. The rate of increase in cancer deaths in Sweden accelerated in 1920, 1955, and 1969, and took a downturn in 1978. “In 1920 we got AM radio, in 1955 we got FM radio and TV1, in 1969-70 we got TV2 and colour TV and in 1978 several of the old AM broadcasting transmitters were disrupted,” they note in their article, “Cancer Trends During the 20th Century.” Their data suggest that at least as many cases of lung cancer can be attributed to radio waves as to smoking.
The same authors have focused on FM radio exposure in connection with malignant melanoma, following up on the findings of Helen Dolk at the London School of Hygiene and Tropical Medicine. In 1995, Dolk and her colleagues had shown that the incidence of skin melanoma declined with distance from the powerful television and FM radio transmitters at Sutton Coldfield in the West Midlands, England. Noting that the FM frequency range, 87.5 to 108 MHz, is close to the resonant frequency of the human body, Hallberg and Johansson decided to compare melanoma incidence with exposure to FM radio waves for all 565 Swedish municipalities. The results are startling. When melanoma incidence is plotted on a graph against the average number of FM transmitters to which a municipality is exposed, the points fall on a straight line. Municipalities that get reception from 4.5 FM stations have a rate of malignant melanoma eleven times as high as municipalities that do not get reception from any FM station.
In their article, “Malignant Melanoma of the Skin—Not a Sunshine Story,” they refute the notion that the tremendous increase in this disease since 1955 is caused primarily by the sun. No increase in ultraviolet radiation due to ozone depletion occurred as early as 1955. Nor, until the 1960s, did Swedes begin to travel to more southerly countries in large numbers to soak up the sun. The embarrassing truth is that rates of melanoma on the head and feet hardly rose at all between 1955 and 2008, while rates for sun-protected areas in the middle of the body increased by a factor of twenty. Most moles and melanomas are now occurring not on the head, arms, and feet, but in areas of the body that are not exposed to sunshine.
Elihu Richter, in Israel, has recently published a report on 47 patients, treated at Hebrew University-Hadassah School of Medicine, who developed cancer after occupational exposure to high levels of electromagnetic fields and/or radio waves.22 Many of these people—especially the youngest—developed their cancers within a surprisingly short period of time, some as short as five or six months after the beginning of their exposure. This dispelled the notion that we must wait ten or twenty years to see the effects of cell phones on the world’s population. Richter’s team warns that “with the recent introduction of WiFi into schools, personal computers for each pupil in many schools, high frequency voltage transients measured in schools—as well as the population wide use of cellphones, cordless phones, some exposure to cellphone towers, residential exposure to RF/MW from Smart Meters and other ‘smart’ electronic equipment at the home and possibly also ELF exposures to high power generators and transformers—young people are no longer free from exposure to EMF.”
The range of tumors in Richter’s clinic ran the gamut: leukemias, lymphomas, and cancers of the brain, nasopharynx, rectum, colon, testis, bone, parotid gland, breast, skin, vertebral column, lung, liver, kidney, pituitary gland, pineal gland, prostate, and cheek muscle.
Year Cancer deaths, United States (per 100,000 population)23
1850 10.3
1860 14.7
1870 22.5
1880 31.0
1890 46.9
1900 60.0
1910 76.2
1920 83.4
1930 98.9
1940 120.3
1950 139.8
1960 149.2
1970 162.8
1980 183.9
1990 203.2
2000 200.9
2010 185.9
2017 183.9
You may notice that the position of Nevada shifted more than any other state between 1931 and 1940. For some reason, deaths from heart disease, diabetes, and cancer rose dramatically in Nevada while the rate of household electrification rose only modestly. I propose that the construction of Hoover Dam, completed in 1936, was that reason. The most powerful hydroelectric plant in the world at that time, its one billion watt capacity supplied Las Vegas, Los Angeles, and most of Southern California via high voltage power lines that coursed through southeastern Nevada on their way to their destinations, exposing the surrounding area—where most of the population of the state lived—to some of the world’s highest levels of electromagnetic fields. In June of 1939 the Los Angeles grid was connected to Hoover Dam via a 287,000-volt transmission line, also the most powerful in the world at that time.24
Two types of cancer deserve additional comment: lung cancer and brain cancer. As the following graph shows, the percentage of adults who smoke has declined steadily since 1970 among both men and women. Yet lung cancer mortality has almost quadrupled in women, and is virtually the same in men as it was fifty years ago.25
When non-smoker Dana Reeve, the 46-year-old widow of “Superman” actor Christopher Reeve, died of lung cancer in 2006, the public was stunned because we had had it drummed into us for decades that this type of cancer is caused by smoking. Yet lung cancer in people who have never smoked—if you consider it as a separate category—ranks today as the seventh most common cause of cancer deaths worldwide, ahead of cancers of the cervix, pancreas, and prostate.26
Brain tumors deserve mention for an obvious reason: cell phones. Several billion people in the world are exposing their brains to microwave radiation at point blank range for hours per day—a new situation that began in approximately 1996 or 1997 in most countries. Yet honest data on brain tumors are difficult to obtain because special interests have controlled most of the research funding on brain tumors since the advent of digital cell phones two decades ago. As a result, a media war has pitted the independent scientists, who report a tripling to quintupling of brain cancer rates among those who have used their cell phones for ten years or more, against industry scientists who report no increase in cancer at all.
The problem, as Australian neurosurgeon Charlie Teo tells those who will listen, is that all the data on cell phone usage comes from databanks controlled by cell phone providers, and “no telcos have allowed scientists access to their records for these large studies.”
I found out firsthand how closely not only the telecom providers, but the scientists they fund, guard their data, when I requested access to some of it in 2006. Yet another industry-funded study was published, this time in Denmark, purporting to show not only that cell phones did not cause brain cancer, but that cell phone users even had a lower rate of brain cancer than everyone else. In other words, those scientists would have the world believe that people might actually protect themselves from brain tumors by holding a cell phone to their heads for hours per day. The study, published in the Journal of the National Cancer Institute, was titled “Cellular Telephone Use and Cancer Risks: Update of a Nationwide Danish Cohort.”27 It claimed to come to its conclusions after an examination of the medical records of over 420,000 Danish cell phone users and non-users over a period of two decades. It was clear to me that something was wrong with the statistics.
Although the study found a lower rate of brain cancer—in men only—among cell phone users than non-users, it found a higher rate of exactly those cancers that Swedish scientists Hallberg and Johansson had reported to be caused by radio waves: bladder cancer, breast cancer, lung cancer, and prostate cancer. The Danish study did not report rates of colon cancer or melanoma, the other two types of cancer that the Swedish researchers had mentioned. However, the Danish study did additionally find that testicular cancer in men was higher and that cervical and kidney cancers in women were significantly higher among the cell phone users. I sensed manipulation of the data, because the only type of cancer for which a “protective” effect was reported was the type of cancer these scientists and their funders were trying to convince the public that cell phones did not cause: brain cancer.
It occurred to me that all of the study’s subjects had actually been using cell phones for a long time by the year 2004, when the study ended. The only difference between “users” and “non-users” was the date of first subscription: the “users” first bought a cell phone between 1982 and 1995, while the “non-users” didn’t buy one until after 1995. And all the “users” were lumped together. The study did not distinguish between people who had used cell phones for 9 years and people who had used them for 22 years. But according to the study, those who subscribed prior to 1994 tended to be wealthier, and drank and smoked much less, than those who first subscribed later. I suspected that controlling for length of use might change the results of the study. So I did the natural, normal, accepted thing that scientists do when they wish to validate a study that is published in a peer-reviewed journal: I requested to look at their data. On December 18, 2006, I sent an email to the lead author, Joachim Schüz, telling him that I had colleagues in Denmark who would like to look at their data. And on January 19, 2007, we were cordially refused permission. The letter of refusal was signed by three of the study’s six authors: Schüz, Christoffer Johansen and Jørgen H. Olsen.
Meanwhile, Teo is sounding the alarm. “I see 10 to 20 new patients each week,” he says, “and at least one third of those patients’ tumors are in the area of the brain around the ear. As a neurosurgeon I cannot ignore this fact.”
Many if not most of us have one or more acquaintances or family members who have, or have died from, a brain tumor. My friend Noel Kaufmann, who died in 2012 at the age of 46, never used a cell phone, but he did use a home cordless phone for years, which emits the same type of radiation, and the tumor that killed him was in the part of his brain beneath the ear against which he held that phone. All of us have heard about famous people who have died of brain tumors—Senator Ted Kennedy, attorney Johnnie Cochran, journalist Robert Novak, Vice President Joe Biden’s son Beau. I have in my files, sent to me by the director of the California Brain Tumor Association, a list of over three hundred celebrities who either have a brain tumor or have died from one during the past decade and a half. When I was younger I never heard of any celebrity who had brain cancer.
Yet highly publicized studies assure us that brain tumor rates are not increasing. This is certainly not true, and a little investigation shows why the data cannot be trusted, in the United States or anywhere else. In 2007, researchers at the Swedish National Board of Health and Welfare found out that, for some reason, one-third of the cases of brain cancer diagnosed at university hospitals, and the majority of cases at county hospitals, were not being reported to the Swedish Cancer Registry.28 All other types of cancer were being routinely reported, but not brain cancer.
A 1994 study revealed that difficulties in brain cancer reporting were already occurring in Finland. Although the Finnish cancer registry was complete for most types of cancer, it seriously underreported brain tumors.29
Here in the United States, severe problems have been found with surveillance not just of brain cancer, but across the board. The Surveillance Epidemiology and End Results (SEER) program, run by the National Cancer Institute, depends on state registries to deliver accurate data. But the data are not accurate. American researcher David Harris reported at a conference in Berlin in 2008 that state registries cannot keep up with the increasing load of cancer cases because they are not receiving enough funding to do so. “SEER registries are currently faced with the challenge of collecting more cases in less time with often the same limited resources as the previous year,” he said. This means that the greater the rise in cancer, the less it will be reported, barring an improvement in the American economy.
Even worse has been the deliberate refusal by Veterans Administration hospitals and military base medical facilities to report cases to the state cancer registries. A report by Bryant Furlow that appeared in The Lancet Oncology in 2007 noted “a precipitous decline in VA reporting of new cases to California cancer registries beginning in late 2004—from 3,000 cases in 2003 to almost none by the end of 2005.” After inquiring in other states, Furlow discovered that California was not an exception. The Florida cancer registry had never received any VA case reports, and VA facilities in other states were dealing with years of backlogged, unreported cancer cases. “We’ve been working with the VA for more than 5 years, but it’s just got worse,” Holly Howe told him. She represents the North American Association of Central Cancer Registries. As many as 70,000 cases of cancer from the VA were not being reported each year. And in 2007, the VA made non-reporting official policy when it issued a directive on cancer nullifying all existing agreements between state registries and VA facilities. Furlow reported that the Department of Defense was also not cooperating with the cancer registries. No cancers diagnosed at military base facilities had been reported to any state registries for several years. As a result of all these failures, Dennis Deapen of the Los Angeles Cancer Surveillance Program warned that studies based on the deficient data may be worthless. “Research from the mid-2000’s will forever require an asterisk, or perhaps a sticker on the cover, to remind researchers and the public that they are not correct,” he said.
Doctors at the Southern Alberta Cancer Research Institute at the University of Calgary were shocked when records showed a 30 percent increase in malignant brain tumors in Calgary in the single year between 2012 and 2013,30 despite official government statistics proclaiming no rise in malignant brain tumor rates at all in either the province of Alberta or the nation of Canada. This discrepancy has lit a fire under Faith Davis, professor of epidemiology at the University of Alberta’s School of Public Health. As unreliable as official statistics are for malignant tumors, they are even worse for non-malignant tumors: Canada’s surveillance system does not record them at all. To remedy this incredible situation, the Brain Tumour Foundation of Canada announced in July 2015 that it is raising money to help Davis create a national brain tumour registry that will finally give clinicians and researchers access to accurate information.
The studies that are assuring us that all is well with cell phones have been funded by the telecommunications industry. Yet, in spite of severe underreporting of brain tumors, independent scientists are confirming the impression of brain surgeons and oncologists that their caseloads are increasing, as well as the evident fact that many more people whom we all know and hear about are dying of such tumors than ever before. The most prominent of these independent scientists is Lennart Hardell. Hardell is a professor of oncology and cancer epidemiology at University Hospital in Örebro, Sweden. Although most of his earlier research was on chemicals like dioxins, PCBs, flame retardants, and herbicides, since 1999 he has focused on exposure to cell and cordless telephones. He tells us, based on case control studies involving over 1,250 people with malignant brain tumors, that using both cell phones and cordless phones significantly increases one’s risk for brain cancer. The more years you use such a phone, the more cumulative hours you use one, and the younger you are at first exposure, the greater the odds that you will develop a tumor. Two thousand hours of cell phone use, according to Hardell, triples one’s risk. Two thousand hours on a cordless phone more than doubles one’s risk. First use of a cell phone before the age of twenty increases one’s overall risk of brain cancer three-fold, the risk of an astrocytoma—the most common type of malignant brain tumor—five-fold, and the risk of an astrocytoma on the same side of the head as the phone eight-fold. First use of a cordless phone before the age of twenty doubles the risk of any brain tumor, quadruples the risk of an astrocytoma, and increases the risk of an astrocytoma on the same side of the head eight-fold.31
The literature on cell towers and radio towers is less compromised. Almost all of the existing studies, until recently, have been funded by independent sources and not by the telecommunications industry, and they have yielded consistent results: living near a transmission tower is carcinogenic.
William Morton, at Oregon Health Sciences University, found that living near VHF-TV broadcast antennas was a significant risk for leukemia and breast cancer in the Portland-Vancouver metropolitan area from 1967 to 1982.
In 1986, the Department of Health of the State of Hawaii found that residents of Honolulu who lived in census tracts that had one or more broadcast towers had a 43 percent increased risk for all types of cancer.32
In 1996, Bruce Hocking, an occupational physician in Melbourne, analyzed the childhood cancer incidence for nine Australian municipalities in relation to a group of three high-power television towers. Children who lived closer than four kilometers to the towers were almost two and a half times more likely to die of leukemia than children in more distant cities.
In 1997, Helen Dolk and her colleagues found high rates of adult leukemia, bladder cancer, and skin melanoma near the Sutton Coldfield tower at the northern edge of Birmingham. When Dolk expanded her study to include twenty high power transmission towers throughout Great Britain, she found that, in general, the closer you lived to a tower, the more likely you were to have leukemia.
In 2000, Neil Cherry analyzed the childhood cancer rate in San Francisco as a function of distance from Sutro Tower. Sutro Tower is almost 1,000 feet tall, stands on top of a tall hill, and can be seen from all over San Francisco. At the time of Cherry’s study it was broadcasting nearly one million watts of VHF-TV and FM radio signals, plus over 18 million watts of UHF-TV. The rates of brain cancer, lymphoma, leukemia, and all cancers combined, throughout San Francisco, were related to the distance a child lived from that tower. Children who lived on hills and ridgetops had much more cancer than children who lived in valleys and were shielded from the tower. Children who lived less than one kilometer from the tower had 9 times the rate of leukemia, 15 times the rate of lymphoma, 31 times the rate of brain cancer, and 18 times the total cancer rate, as children in the rest of the city.
In 2004, Ronni and Danny Wolf studied the residents of a small neighborhood surrounding a single cell tower in south Netanya, Israel. During the five years before the tower was erected, two of the 622 residents had developed cancer; during the single year after the tower went up, eight more developed cancer. This turned a neighborhood with one of the lowest cancer rates in the city into a zone where the risk was more than quadruple the average for Netanya.
In the same year, Horst Eger, a physician in Naila, Germany examined 1,000 patient records in his home town. He found that people who lived within 400 meters (1,300 feet) of a cell tower had triple the risk of developing cancer, and developed their cancer, on average, when they were eight years younger, compared to people who lived further away.
In 2011, Adilza Dode headed a team of university scientists and government officials in Belo Horizonte, a metropolis in southeastern Brazil, whose study confirmed the results of all the previous ones: the residents’ risk of cancer decreased uniformly and steadily with distance from a cell tower.
And on February 24, 2011, the Supreme Court of Italy upheld the 2005 conviction of Cardinal Tucci for polluting Rome with radio waves. A ten-day suspended jail sentence was his only punishment. No one has ever been compensated for their injuries. The Prosecutor’s Office has not filed charges of negligent homicide. Vatican Radio’s antennas have not been shut down.
Footnotes
Chapter 12.
The Transformation of Diabetes
1. The Sun 1891; Howe 1931; Joslin Diabetes Clinic 1990.
2. Gray 2006, pp. 46, 261, 414.
3. Hirsch 1885, p. 645.
4. Harris 1924; Brun et al. 2000.
5. Joslin 1917, p. 59.
6. Annual consumption of sugar and other sweeteners from 1822 to 2014 was obtained from tables published in Annual Report of the Commissioner of Agriculture for the Year 1878; American Almanac and Treasury of Facts (New York: American News Company, 1888); Proceedings of the Interstate Sugar Cane Growers First Annual Convention (Macon, GA: Smith and Watson, 1903); A. Bouchereau, Statement of the Sugar Crop Made in Louisiana in 1905-’06 (New Orleans, 1909); Statistical Abstracts of the United States for 1904-1910; Ninth Census of the United States, vol. 3, The Statistics of Wealth and Industry of the United States (1872); Twelfth Census of the United States, vol. 5, Agriculture (1902); Thirteenth Census of the United States, vol. 5, Agriculture (1914); United States Census of Agriculture, vol. 2 (1950); Statistical Bulletin No. 3646 (U.S. Dept. of Agriculture, 1965); Supplement to Agricultural Economic Report No. 138 (U.S. Dept. of Agriculture, 1975); and Sugar and Sweeteners Outlook, Table 50 – U.S. per capita caloric sweeteners estimated deliveries for domestic food and beverage use, by calendar year (U.S. Dept. of Agriculture, 2003). Honey was estimated to contain 81 percent sugar; molasses, 52 percent sugar; cane syrup, 56.3 percent sugar; maple syrup, 66.5 percent sugar; and sorghum syrup, 68 percent sugar.
7. Gohdes 1995.
8. Black Eagle, personal communication.
9. Levy et al. 2012; Welsh et al. 2010.
10. Pelden 2009.
11. Giri et al. 2013.
12. Joslin 1917, 1924, 1927, 1943, 1950; Woodyatt 1921; Allen 1914, 1915, 1916, 1922; Mazur 2011.
13. Fothergill 1884.
14. Joslin 1917, pp. 100, 102, 106, 107.
15. Simoneau et al. 1995; Gerbiz et al. 1996; Kelley et al. 1999; Simoneau and Kelley 1997; Kelley and Mandarino 2000; Kelley et al. 2002; Bruce et al. 2003; Morino et al. 2006; Toledo et al. 2008; Ritov et al. 2010; Patti and Corvera 2010; DeLany et al. 2014; Antoun et al. 2015.
16. DeLany et al. 2014.
17. Ritov et al. 2010.
18. Gel’fon and Sadchikova 1960.
19. Gel’fon and Sadchikova 1960; Syngayevskaya 1962; Bartoníček and Klimková-Deutschová 1964; Petrov 1970a, p. 164; Sadchikova 1974; Klimková-Deutschová 1974; Dumanskiy and Rudichenko 1976; Dumanskiy and Shandala 1974; Dumanskiy and Tomashevskaya 1978; Gabovich et al. 1979; Kolodub and Chernysheva 1980; Belokrinitskiy 1981; Shutenko et al. 1981; Dumanskiy et al. 1982; Dumanskiy and Tomashevskaya 1982; Tomashevskaya and Soleny 1986; Tomashevskaya and Dumanskiy 1988; Navakatikian and Tomashevskaya 1994.
20. Kwon et al. 2011.
21. Li et al. 2012.
22. 1917 figure from Joslin 1917, p. 25.
23. Kuczmarski et al. 1994. See also Prentice and Jebb 1995.
24. Flegal et al. 1998, 2002, 2010; Ogden et al. 2012.
25. Kim et al. 2006.
26. Flegal 1998, p. 45.
27. Thatcher et al. 2009.
28. Klimentidis et al. 2011.
Chapter 13. Cancer and the Starvation of Life
1. Warburg 1925, p. 148.
2. Warburg 1908.
3. Warburg et al. 1924; Warburg 1925.
4. Warburg 1925, p. 162.
5. Warburg 1930, p. x.
6. Warburg 1956.
7. Warburg 1966b.
8. Krebs 1981, pp. 23-24, 74.
9. Harris 2002; Ferreira and Campos 2009.
10. Ristow and Cuezva 2006; van Waveren et al. 2006; Srivastava 2009; Sánchez-Aragó et al. 2010.
11. Kondoh 2009, p. 101; Sánchez-Aragó et al. 2010.
12. Apte and Sarangarajan 2009a.
13. Ferreira and Campos 2009, p. 81.
14. Vaupel et al. 1998; Gatenby and Gillis 2004; McFate et al. 2008; Gonzáles-Cuyar et al. 2009, pp. 134-36; Semenza 2009; Werner 2009, pp. 171-72; Sánchez-Aragó et al. 2010.
15. Vigneri et al. 2009.
16. Giovannucci et al. 2010.
17. Lombard et al. 1959.
18. From Williams 1908, p. 53.
19. Guinchard 1914.
20. Hoffman 1915, p. 151.
21. Ibid., pp. 185-186.
22. Stein et al. 2011.
23. From volumes of Vital Statistics of the United States (United States Bureau of the Census) and National Vital Statistics Reports (Centers for Disease Control and Prevention).
24. Moffat 1988.
25. Data on smoking rates from National Center for Health Statistics. Data on lung cancer from Vital Statistics of the United States (1970, 1980, 1990) and National Vital Statistics Reports (2000, 2010, 2015).
26. National Cancer Institute 2009.
27. Schüz et al. 2006.
28. Barlow et al. 2009.
29. Teppo et al. 1994.
30. Jacob Easaw, Southern Alberta Cancer Research Institute, personal communication.
31. Hardell and Carlberg 2009; Hardell et al. 2011a.
32. Anderson and Henderson 1986.