Tuesday, December 1, 2020

Part 1: The Contagion Myth... Contagion... Electricity & Disease... Pandemics

The Contagion Myth

why viruses including "Coronavirus" are not the cause of disease

by Sally Fallon Morell & Thomas S. Cowan, MD

CHAPTER 1 

CONTAGION 


Let’s get right to the nitty-gritty of this issue: contagion. How do we know whether any set of symptoms has an infectious cause? As we can all imagine, determining the cause of a disease in general, or of a set of symptoms in any particular person, can be a complex and difficult task. Obviously, there are many factors to be considered for any one person at any one time in his or her life. Are the symptoms a result of genetics, poisoning, bad diet and nutrient deficiencies, stress, EMFs, negative emotions, placebo or nocebo effects—or infection from another person by a bacterium or virus? 

In finding our way through this morass, we need well-defined rules to determine how to prove causation—and these rules should be clear, simple, and correct. We do have such rules, but scientists have ignored them for years. Unfortunately, failure to follow these guidelines threatens to destroy the fabric of society. 

Imagine that an inventor calls you up and says he has invented a new ping-pong ball that is able to knock down brick walls and therefore make the process of demolition much easier and safer for builders and carpenters. Sounds interesting, although it is hard to imagine how a ping-pong ball could do such a thing. You ask the inventor to show you how he has determined that the new ping-pong balls are able to destroy brick walls. His company sends you a video. The video shows them putting a ping-pong ball in a bucket of rocks and ice cubes. They then take the bucket and fling it at a small brick wall. The wall goes down—“there’s the proof,” they say.

Wait a minute! How do we know it was the ping-pong ball that knocked the wall down and not the rocks and ice cubes that were also in the bucket?

“Good question,” the inventor replies and then sends you a video showing an animated or virtual ping-pong ball destroying a virtual brick wall. He lets you know that the ball and the wall are exact renditions of the actual ball and brick. Still, something doesn’t seem right; after all, it’s fairly easy to create a computer image or video that shows such an occurrence, yet we would all agree it has nothing to do with what might happen with the actual ball and wall.

The inventor is getting exasperated with all your questions, but since you are a potential investor and he is interested in having your financial support, he persists. He then sends you a detailed analysis of what makes his ping-pong ball special. It has special protrusions on the outside of the ball that “grab onto and destroy the integrity of the cement holding the bricks together.” Also, they have built a lightweight internal system into the ping-pong ball that, according to the inventor, leverages the power of the ball, making it hundreds of times more powerful than the usual ping-pong ball. This, he says, is absolute proof that the new ball can whack down walls. 

At this point, you are ready to hang up on this lunatic, but then he pulls the final trump card. He sends you videos of five esteemed researchers in the new field of ping-pong ball demolition. They, of course, have been funded entirely by the Ping-Pong Ball Demolition Council and have attained prestigious positions in the field. They each separately give testimony about the interesting qualities of this new ping-pong ball. They admit that more research is needed, but they have “presumptive” evidence that the claims of improved efficiency are correct and that a cautious investment is warranted. At that point, you do hang up the phone and check outside to see whether you’ve been dropped into Alice’s Wonderland, and whether you have just been talking to the Mad Hatter.

Now if this ping-pong ball can really knock down brick walls, the obvious thing to do is to take the ping-pong ball, throw it at the wall, and record what happens—then have multiple other non-invested people do the same to make sure the company didn’t put lead in the ball and throw it at a wall made of paper bricks. We could call this the Ultimate Ping-Pong Ball Test (UPPBT). 

As bizarre and crazy as it sounds, this lack of evidence—that a microorganism called coronavirus pulls down the wall of your immune system, invades your cells, and starts replicating in them—is exactly what has happened with the “coronavirus” pandemic. No one has bothered to see what happens if you do the UPPBT, throwing the ball against the wall—and if you even suggest that we should do this, the trolls emerge from the shadows to call you a crazy person spreading “fake news.” 

Most people would agree with the requirement of proving that the ping-pong ball can destroy the brick wall; it’s not something any of us would consider negotiable. And most people would agree that seeing a real brick wall demolished by a ping-pong ball constitutes proof. In other words, sane, rational human beings would accept the above UPPBT as true and relevant.

Heinrich Hermann Robert Koch (1843–1910) is considered one of the founders of modern bacteriology; he created and improved laboratory technologies for isolating bacteria and also developed techniques for photographing bacteria. His research led to the creation of Koch’s postulates, a kind of UPPBT for disease, which consist of four principles linking specific microorganisms to specific diseases. Koch’s postulates are as follows:   

1. The microorganism must be found in abundance in all organisms suffering from the disease but not found in healthy organisms. 

2. The microorganism must be isolated from a diseased organism and grown in a pure culture. 

3. The cultured microorganism should cause disease when introduced into a healthy organism. 

4. The microorganism must be re-isolated from the now diseased experimental host and identified as identical to the original specific causative agent.

If all four conditions are met, you have proven the infectious cause for a specific set of symptoms. This is the only way to prove causation. Interestingly, even Koch could not find proof of contagion using his postulates. He abandoned the requirement of the first postulate when he discovered carriers of cholera and typhoid fever who did not get sick.1 In fact, bacteriologists and virologists today believe that Koch’s sensible and logical postulates “have been recognized as largely obsolete by epidemiologists since the 1950s.”2 
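To make the all-or-nothing logic of the postulates concrete, here is a minimal sketch in Python (the data structure and field names are hypothetical, invented for illustration): causation is asserted only when every condition holds, so a single failure, such as the healthy cholera carriers Koch discovered, blocks the claim.

```python
from dataclasses import dataclass

@dataclass
class Evidence:
    # Hypothetical flags an investigator would fill in for one candidate microbe.
    found_in_all_sick: bool         # Postulate 1: abundant in every diseased organism
    absent_in_healthy: bool         # Postulate 1: not found in healthy organisms
    grown_in_pure_culture: bool     # Postulate 2: isolated and grown alone
    culture_causes_disease: bool    # Postulate 3: pure culture sickens a healthy host
    reisolated_and_identical: bool  # Postulate 4: recovered from the new host, unchanged

def koch_causation_proven(e: Evidence) -> bool:
    # All four postulates must hold; failing any one blocks the claim of causation.
    return all([
        e.found_in_all_sick and e.absent_in_healthy,
        e.grown_in_pure_culture,
        e.culture_causes_disease,
        e.reisolated_and_identical,
    ])

# Cholera as described in this chapter and in chapter 3: Koch cultured the
# organism but found healthy carriers (postulate 1 fails) and could not
# sicken animals with the culture (postulate 3 fails).
cholera = Evidence(True, False, True, False, False)
print(koch_causation_proven(cholera))  # False
```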

Koch’s postulates are for bacteria, not for viruses, which are about one thousand times smaller. In the late nineteenth century, the first evidence for the existence of these tiny particles came from experiments with filters that had pores small enough to retain bacteria and let other particles through. 

In 1937, Thomas Rivers modified Koch’s postulates in order to determine the infectious nature of viruses. Rivers’ postulates are as follows: 

1. The virus can be isolated from diseased hosts. 

2. The virus can be cultivated in host cells. 

3. Proof of filterability—the virus can be filtered from a medium that also contains bacteria. 

4. The filtered virus will produce a comparable disease when the cultivated virus is used to infect experimental animals. 

5. The virus can be re-isolated from the infected experimental animal. 

6. A specific immune response to the virus can be detected. 

Please note that Rivers drops Koch’s first postulate—that’s because many people suffering from “viral” illness do not harbor the offending microorganism. Even with Koch’s first postulate missing, researchers have not been able to prove that a specific virus causes a specific disease using Rivers’ postulates; one study claims that Rivers’ postulates have been met for SARS, said to be a viral disease, but careful examination of this paper demonstrates that none of the postulates have been satisfied.3 

Again, this book’s central claim is that no disease attributed to bacteria or viruses has met all of Koch’s postulates or all of Rivers’ criteria. This is not because the postulates are incorrect or obsolete (in fact, they are entirely logical) but rather because bacteria and viruses don’t cause disease, at least not in any way that we currently understand. 

How did this state of error come about, especially concerning “infections” with bacteria and viruses? It goes back a long time—even to philosophies espoused in ancient Greece. Several philosophers and medics promoted this theory during the Renaissance,4 but it was that great fraud and plagiarist Louis Pasteur, father of the germ theory, who in modern times turned this masquerade into the explanation for most disease. 

Imagine a case in which some people who drink the milk from a certain cow develop profuse, bloody diarrhea. Your job is to find the cause of the problem. You wonder whether there is a transmissible agent in the milk that is being consumed by the unfortunate people, which makes them ill. This seems perfectly reasonable thus far. You then examine the milk under the newly invented microscope apparatus and find a bacterium in the milk; you can tell by its appearance that it is different from the usual bacteria that are found in all milk. You carefully examine the milk and discover that most, if not all, of the people with bloody diarrhea did in fact drink this milk. You then examine the milk consumed by people who didn’t develop diarrhea and find that none of the milk samples contain this particular bacterium. You name the bacterium “listeria” after a fellow scientist. Then, to wrap up the case, you purify the bacteria, so that nothing else from the milk remains. You give this purified bacterial culture to a person who then develops bloody diarrhea; the clincher is that you then find this same bacterium in the person’s stool. Case closed; infection proven. 

Pasteur did this type of experiment for forty years. He found sick people, claimed to have isolated a bacterium, gave the pure culture to animals—often by injecting it into their brains—and made them sick. As a result, he became the celebrity scientist of his time, feted by kings and prime ministers, and hailed as a great scientist. His work led to pasteurization, a technique responsible for destroying the integrity and health-giving properties of milk (see chapter 9). His experiments ushered in the germ theory of disease, and for over a century this radical new theory has dominated not only the practice of Western medicine, but also our cultural and economic life.

We are proposing a different way of understanding the milk study. For example, what if the milk came from cows that were being poisoned or starved? Maybe they were dipped in flea poison; maybe they were fed grains sprayed with arsenic instead of their natural diet of grass; maybe they were fed distillery waste and cardboard—a common practice in Pasteur’s day in many cities around the world. We now know with certainty that any toxins fed to a nursing mammal show up in her milk. What if these listeria bacteria are not the cause of anything but simply nature’s way of digesting and disposing of toxins? After all, this seems to be the role that bacteria play in biological life. If you put stinky stuff in your compost pile, the bacteria feed on the stuff and proliferate. No rational person would claim the compost pile has an infection. In fact, what the bacteria do in the compost pile is more of a bioremediation. Or, consider a pond that has become a dumping ground for poisons. The algae “see” the poison and digest it, returning the pond to a healthier state (as long as you stop poisoning the pond). Again, this is bioremediation, not infection.

If you take aerobic bacteria—bacteria that need oxygen—and put them in an anaerobic environment in which their oxygen supply is reduced, they often produce poisons. Clostridia are a family of bacteria that under healthy circumstances ferment carbohydrates in the lower bowel to produce important compounds like butyric acid; but under anaerobic conditions, these bacteria produce poisons that can cause botulism. It’s the poisons, not the bacteria themselves, that make people sick; or more fundamentally, it’s the environment or terrain that causes the bacteria to create the poisons. 

Isn’t it possible that toxins in the milk—possibly because the cow is not well nourished and cannot easily get rid of the toxins—account for the presence of listeria (which is always present in our bodies, along with billions of other bacteria and particles called viruses)? The listeria is simply biodegrading the toxins that proliferate due to the unhealthy condition of the milk. 

The central question then is how can we prove that the listeria, and not something toxic in the milk, is causing the diarrhea? The answer is the same as in the ping-pong ball example: feeding a healthy person the milk is like throwing the bucket with stones, ice, and (yes) a ping-pong ball at the wall; it proves nothing. You must isolate the ball—in this case, the listeria—and feed only this to the healthy person or animal to see what happens. This is what Pasteur claims to have done in his papers. 

Pasteur passed his laboratory notebooks along to his heirs with the provision that they never make the notebooks public. However, his grandson, Louis Pasteur Vallery-Radot, who apparently didn’t care for Pasteur much, donated the notebooks to the French national library, which published them. In 1995, Professor Gerald Geison of Princeton University published an analysis of these notebooks, which revealed that Pasteur had committed massive fraud in all his studies. For instance, when he said that he injected virulent anthrax spores into vaccinated and unvaccinated animals, he could trumpet the fact that the unvaccinated animals died, but that was because he also injected the unvaccinated animals with poisons. 

In the notebooks, Pasteur states unequivocally that he was unable to transfer disease with a pure culture of bacteria (he obviously wasn’t able to purify viruses at that time). In fact, the only way he could transfer disease was to either insert the whole infected tissue into another animal (he would sometimes inject ground-up brains of an animal into the brain of another animal to “prove” contagion) or resort to adding poisons to his culture, which he knew would cause the symptoms in the recipients.5 

He admitted that the whole effort to prove contagion was a failure, leading to his famous deathbed confession: “The germ is nothing; the terrain is everything.” In this case, terrain refers to the condition of the animal or person and whether the animal or person had been subject to poison. 

Since Pasteur’s day, no one has demonstrated experimentally the transmissibility of disease with pure cultures of bacteria or viruses. No one has bothered since Pasteur’s time to throw a ping-pong ball at a wall and see what happens. Incredible as that may seem, we are sitting on a house of cards that has resulted in incalculable harm to humanity, the biosphere, and the geosphere of the Earth. 

In chapters 2 and 3, we will examine cases in which bacteria or viruses were falsely accused of causing disease. Read on, dear friends; the ride has only started. 

CHAPTER 2 

ELECTRICITY AND DISEASE 


The earliest “electricians” were not technicians who installed wires in houses; they were physicians and “healers” who used the newly discovered phenomena of electric current and static electricity to treat people with ailments—from deafness to headaches to paralysis. The only problem with having patients touch Leyden jars (a device that stores a high-voltage electric charge) or subject themselves to electric currents was that it sometimes caused harm and occasionally killed them. 

One thing these early electrical experimenters noted was that people showed a range of sensitivity to electricity. According to Alexander von Humboldt, a Prussian scientist who (among other experiments) subjected himself and others to the shocks of electric eels, “It is observed that susceptibility to electrical irritation and electrical conductivity, differs as much from one individual to another, as the phenomena of living matter differ from those of dead material.”1 

These early studies captured the imagination of researchers; they began to realize that electric currents ran through the bodies of frogs and humans and that even plants were sensitive to electrical phenomena. After a 1749 earthquake in London, British physician William Stukeley concluded that electricity must play a role in earthquakes because the residents of London felt “pains in their joints, rheumatism, sickness, headache, pain in their back, hysteric and nervous disorders . . . exactly upon electrification, and to some it has proved fatal.”

As early as 1799, researchers puzzled over the cause of influenza, which appeared suddenly, often in diverse places at the same time, and could not be explained by contagion. In 1836, Heinrich Schweich, author of a book on influenza, noted that all physiological processes produce electricity and theorized that an electrical disturbance of the atmosphere may prevent the body from discharging it. He repeated the then-common belief that the accumulation of electricity in the body causes the symptoms of influenza.3 

With the discovery of the sun’s electrical nature, scientists have made some interesting observations. Astronomers call the period 1645–1715 the Maunder Minimum, when the sun was quiet; no sunspots were observed during that span, and the northern lights (aurora borealis) were nonexistent; in 1715, sunspots reappeared, as did the northern lights. Sunspot activity then increased, reaching a high in 1727. In 1728, influenza appeared in waves on every continent. Sunspot activity became more violent until it peaked in 1738, when physicians reported flu in both man and animals (including dogs, horses, and birds, especially sparrows). By some estimates, two million people perished during the ten-year pandemic. 

These and other facts about the relationship of influenza to disturbances in electricity come from a remarkable book, The Invisible Rainbow by Arthur Firstenberg.4 Firstenberg chronicles the history of electricity in the United States and throughout the world, and the outbreaks of illness that accompanied each step toward greater electrification. The first stage involved the installation of telegraph lines; by 1875, these formed a spiderweb over the earth totaling seven hundred thousand miles, with enough copper wire to encircle the globe almost thirty times. With it came a new disease called neurasthenia. Like those suffering today from “chronic fatigue syndrome,” patients felt weak and exhausted and were unable to concentrate. They had headaches, dizziness, tinnitus, floaters in the eyes, racing pulse, pains in the heart region, and palpitations; they were depressed and had panic attacks. Dr. George Miller Beard and the medical community observed that the disease spread along the routes of railroads and telegraph lines; it often resembled the common cold or influenza and commonly seized people in the prime of life.5 
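As a quick check on that figure (this arithmetic is ours, taking the Earth's circumference as roughly 24,900 miles):

$$\frac{700{,}000\ \text{miles of wire}}{24{,}900\ \text{miles per trip around the globe}} \approx 28\ \text{circuits},$$

which indeed rounds to "almost thirty times."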

The year 1889 marks the beginning of the modern electrical era and also of a deadly flu pandemic, which followed the advent of electricity throughout the globe. Said Firstenberg: “Influenza struck explosively and unpredictably, over and over in waves until early 1894. It was as if something fundamental had changed in the atmosphere.”6 

Physicians puzzled over influenza’s capricious spread. For example, William Beveridge, author of a 1975 textbook on influenza, noted, “The English warship Arachne was cruising off the coast of Cuba ‘without any contact with land.’ No less than 114 men out of a crew of 149 fell ill with influenza and only later was it learnt that there had been outbreaks in Cuba at the same time.”7 

During World War I, governments on both sides of the conflict installed antennas, which eventually blanketed the earth with strong radio signals—and during the latter part of 1918, disaster struck. The Spanish flu afflicted a third of the world’s population and killed about fifty million people, more than the Black Death of the fourteenth century. To stop the contagion, communities shut down schools, businesses, and theaters; people were ordered to wear masks and refrain from shaking hands.8 

Those living on military bases, which bristled with antennas, were the most vulnerable. A common symptom was bleeding—from the nostrils, gums, ears, skin, stomach, intestines, uterus, kidneys, and brain. Many died of hemorrhage in the lungs, drowning in their own blood. Tests revealed a decreased ability of the blood to coagulate. Those close to death often developed “that peculiar blue color which seemed to mark all early fatal cases.”9 

Health officials were desperate to find a cause. A team of physicians from the US Public Health Service tried to infect one hundred healthy volunteers at a naval facility on Gallops Island in Boston Harbor. A sense of frustration pervades the report, written by Milton J. Rosenau, MD, and published in the Journal of the American Medical Association.10 Rosenau had built a successful career in public health by instilling a fear of germs, overseeing quarantines, and warning the public about the dangers of raw milk. He believed that something called the Pfeiffer bacillus was the cause. The researchers carefully extracted throat and nasal mucus and even lung material from cadavers and transferred it to the throats, respiratory tracts, and noses of volunteers. “We used some billions of these organisms, according to our estimated counts, on each one of the volunteers, but none of them took sick,” he said. 

Then they drew blood from those who were sick and injected it into ten volunteers. “None of these took sick in any way.” 

Thoroughly perplexed, Rosenau and the other researchers designed the next experiment “to imitate the natural way in which influenza spreads, at least the way in which we believe influenza spreads, and I have no doubt it does [even though his experiments showed that it doesn’t]—by human contact.” They instructed those afflicted to breathe and cough over volunteers. “The volunteer was led up to the bedside of the patient; he was introduced. He sat down alongside the bed of the patient. They shook hands, and by instructions, he got as close as he conveniently could, and they talked for five minutes. At the end of the five minutes, the patient breathed out as hard as he could, while the volunteer, muzzle to muzzle (in accordance with his instructions, about 2 inches between the two), received this expired breath, and at the same time was breathing in as the patient breathed out. This they repeated five times.” The volunteers were watched carefully for seven days, but alas, “none of them took sick in any way.” 

“Perhaps,” said Rosenau, “there are factors, or a factor, in the transmission of influenza that we do not know. . . . Perhaps if we have learned anything, it is that we are not quite sure what we know about the disease.” 

Researchers even tried to infect healthy horses with the mucous secretions of horses with the flu11—yes, animals also became ill during the pandemic—but the results were the same. The Spanish flu was not contagious, and physicians could neither attach blame to the accused bacterium nor provide an explanation for its global reach. 

The year 1957 marked the installation of radar worldwide. The “Asian” influenza pandemic began in February 1957 and lasted for a year. A decade later, the United States launched twenty-eight satellites into the Van Allen belts as part of the Initial Defense Communication Satellite Program (IDCSP), ushering in the Hong Kong flu pandemic, which began in July 1968. 

As Firstenberg observed, “In each case—in 1889, 1918, 1957 and 1968—the electrical envelope of the earth . . . was suddenly and profoundly disturbed,”12 and along with it the electrical circuits in the human body. Western medicine pays scant attention to the electrical nature of living things—plants, animals, and humans—but mountains of evidence indicate that faint currents govern everything that happens in the body to keep us alive and healthy. From the coagulation of the blood to energy production in the mitochondria, even to small amounts of copper in the bones, which create currents for the maintenance of bone structure—all can be influenced by the presence of electricity in the atmosphere, especially “dirty” electricity, characterized by many overlapping frequencies and jagged changes in frequency and voltage. Today we know that each cell in the body has its own electrical grid, maintained by structured water inside the cell membrane (see chapter 8). Cancer occurs when this structure breaks down, and cancer has increased with each new development in the electrification of the earth.13 
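To make the notion of “dirty” electricity concrete, here is a small illustrative Python sketch; the frequencies and amplitudes are arbitrary values chosen for illustration, not measurements of any real power line. A clean 60 Hz sine wave is overlaid with higher-frequency harmonics and sparse, abrupt voltage spikes, yielding the jagged, many-frequency signal described above.

```python
import numpy as np

fs = 10_000                      # samples per second
t = np.arange(0, 0.1, 1 / fs)    # a 0.1-second window

clean = np.sin(2 * np.pi * 60 * t)  # a clean 60 Hz fundamental

# "Dirty" electricity in the sense described above: overlapping higher
# frequencies plus jagged transients riding on the fundamental.
harmonics = 0.3 * np.sin(2 * np.pi * 180 * t) + 0.2 * np.sin(2 * np.pi * 2_000 * t)
rng = np.random.default_rng(0)
spikes = np.where(rng.random(t.size) > 0.995, 1.5, 0.0)  # sparse voltage spikes
dirty = clean + harmonics + spikes

print(f"clean peak: {clean.max():.2f}, dirty peak: {dirty.max():.2f}")
```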

Humankind has lived for thousands of years with our brains tuned to the Schumann resonances of the earth, our bodies and indeed all life bathed in a static electric field of 130 volts per meter. The electronic symphony that gives us life is soft and delicate. Minute electrical currents that course through leaf veins or through the glial cells in our nervous system guide the growth and metabolism of all life-forms. Our cells communicate in whispers in the radiofrequency range. 

Traditional Chinese medicine has long recognized the electrical nature of the human body and has developed a system to defuse the “accumulation of electricity” that leads to disease. It’s called acupuncture. Many things that we do instinctively also help release any unhealthy buildup of current—the mother who strokes her infant’s head or who scratches her children’s backs to put them to sleep, the caresses of lovers, walking barefoot on the earth, massage, even handshakes and hugs—all now discouraged by the frowny faces of health authorities. 

Fast-forward to the Internet and cell phone era. According to Firstenberg, the onset of cell phone service in 1996 resulted in greater levels of mortality in major cities like Los Angeles, New York, San Diego, and Boston.14 Over the years, wireless signals at multiple frequencies have filled the atmosphere to a greater and greater extent, along with mysterious outbreaks like SARS and MERS. 

Today the quiet hum of life-giving current is infiltrated by a jangle of overlapping and jarring frequencies—from power lines to the fridge to the cell phone. It started with the telegraph and progressed to worldwide electricity, then radar, then satellites that disrupt the ionosphere, then ubiquitous Wi-Fi. The most recent addition to this disturbing racket is fifth generation wireless—5G. 

5G is broadcast in a range of microwave frequencies: mostly 24–72 GHz, with the range of 700–2500 MHz also considered 5G. Frequencies in this range (below the frequency of light) are called nonionizing, in contrast with ionizing radiation, which has a higher frequency than light. Ionizing radiation, such as X-rays, causes electrons to split off atoms, obviously something to which exposure must be limited. (This is why a lead shield is put on patients when they get X-rays.) 
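The ionizing/nonionizing divide can be stated quantitatively with the photon-energy relation $E = hf$ (a standard physics calculation, not taken from the book). Even at the top of the 5G microwave range:

$$E = hf = (4.136\times10^{-15}\ \text{eV·s})\times(72\times10^{9}\ \text{Hz}) \approx 3\times10^{-4}\ \text{eV},$$

orders of magnitude below the electron-volt energies needed to strip electrons from atoms (13.6 eV for hydrogen); this is why such frequencies are classified as nonionizing.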

Instead of producing charged ions when passing through matter, nonionizing electromagnetic radiation changes the rotational, vibrational, and electronic valence configurations of molecules and atoms. This produces thermal effects (think microwave ovens). The telecommunications industry flatly denies any nonthermal effects on living tissue, even though a large body of research suggests considerable harm to the delicate electromagnetic systems in the human body from constant exposure to nonionizing frequencies. In particular, high-frequency electromagnetic fields like 5G affect cell membrane permeability15—not a good thing when the architecture of a healthy cell ensures that it is not permeable except in controlled situations. 

We are already familiar with millimeter wave technology; this is the frequency range of airport scanners, which can see through your clothes. Children and pregnant women are not required to go through these scanners, a nod to potential dangers. Adults get zapped for a second or two; 5G bathes us in the same kind of radiation twenty-four seven. 

Of particular concern is the fact that some 5G transmitters broadcast at 60 GHz, a frequency that is absorbed by oxygen, causing the oxygen molecule (composed of two oxygen atoms) to split apart, making it useless for respiration.16 

On September 26, 2019, 5G wireless was turned on in Wuhan, China (and officially launched November 1) with a grid of about ten thousand 5G base stations—more than exist in the entire United States—all concentrated in one city.17 A spike in cases occurred on February 13—the same week that Wuhan turned on its 5G network for monitoring traffic.18 

Illness has followed 5G installation in all the major cities in America, starting in fall 2019 with New York, where 5G was rolled out in Manhattan along with parts of Brooklyn, the Bronx, and Queens—all subsequent coronavirus hot spots. Los Angeles, Las Vegas, Dallas, Cleveland, and Atlanta soon followed, with some five thousand towns and cities now covered. Citizens of the small country of San Marino (the first country in the world to install 5G, in September 2018) have had the longest exposure to 5G and the highest infection rate—four times higher than Italy (which deployed 5G in June 2019), and twenty-seven times higher than Croatia, which has not deployed 5G.19 In rural areas, the illness blamed on the coronavirus is slight to nonexistent.20 

In Europe, illness is highly correlated with 5G rollout. For example, Milan and other areas in northern Italy have the densest 5G coverage, and northern Italy has twenty-two times as many coronavirus cases as Rome.21 

In Switzerland, telecommunications companies have built more than two thousand antennas, but the Swiss have halted at least some of the 5G rollout due to health concerns. Switzerland has had far fewer coronavirus cases than nearby France, Spain, and Germany, where 5G is going full steam ahead. 

Iran announced an official 5G launch in late March 2020; assuming pre-launch testing took place in February, the advent of 5G correlates with the country’s first Covid-19 cases. South Korea has installed over seventy thousand 5G base stations and reported over eight thousand cases of illness by mid-March. Japan began testing 5G in tunnels in Hokkaido in early February 2020, and Hokkaido now has the most cases of coronavirus in Japan, even more than Tokyo.22 

In South America, the 5G rollout has occurred in Brazil, Chile, Ecuador, and Mexico, all of which have many coronavirus cases. Countries and territories without 5G, such as Guyana, Suriname, French Guiana, and Paraguay, have not reported any cases. Paraguay is doing what all countries should do—building a national fiber optics network without resorting to 5G.23 

Bartomeu Payeras i Cifre, a Spanish epidemiologist, has charted the rollout of 5G in European cities and countries with cases per thousand people and demonstrated “a clear and close relationship between the rate of coronavirus infections and 5G antenna location.”24 
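The kind of comparison attributed to Payeras i Cifre can be outlined in a few lines of Python. The figures below are invented placeholders, not his data; the sketch only shows the method: pair each region’s 5G antenna density with its cases per thousand and compute a correlation coefficient. Note that correlation alone cannot establish causation; confounders such as population density and testing rates would have to be ruled out.

```python
import numpy as np

# Hypothetical placeholder figures, NOT Payeras i Cifre's data:
# 5G antennas per 100,000 residents, and reported cases per 1,000 residents.
antenna_density = np.array([0.0, 2.0, 5.0, 9.0, 15.0, 30.0])
cases_per_1000 = np.array([0.1, 0.4, 0.9, 1.5, 2.8, 5.5])

# Pearson correlation coefficient between the two series.
r = np.corrcoef(antenna_density, cases_per_1000)[0, 1]
print(f"Pearson r = {r:.2f}")  # close to 1.0 for these illustrative numbers
```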

What about Covid-19 in the Amazon basin? The Pan American Health Organization (PAHO) estimates that there are at least twenty thousand active coronavirus cases among the indigenous peoples.25 They live a primitive lifestyle, but 5G is already there,26 along with “twenty-five enormously powerful surveillance radars, ten Doppler weather radars, two hundred floating water-monitoring stations, nine hundred radio-equipped ‘listening posts,’ thirty-two radio stations, eight airborne state-of-the-art surveillance jets equipped with fog-penetrating radar, and ninety-nine ‘attack/trainer’ support aircraft, all of which can track individual human beings and ‘hear a twig snap’ anywhere in the Amazon.”27 These were installed in 2002 as part of the System for Vigilance of the Amazon (SIVAM), which monitors activities in a two-million-square-mile area of remote wilderness. All life in the Amazon is bathed with a range of electromagnetic frequencies. 

These 5G frequencies go only a short distance and cannot penetrate into buildings. However, a few tech startups are working to get the 5G signal into the areas where we work, play, and sleep. Pivotal Commware is testing an “Echo 5G In-Building Penetration Device.”28 

Pivotal’s offices are about one mile from the Life Care nursing home in Kirkland, Washington, where the illness first appeared in the United States, and where twenty-five residents died. Was the Life Care center a testing ground for Pivotal’s new device? Health-care facilities also teem with electronic equipment, some of it located right by the heads of sick patients. People who suffer from electrical hypersensitivity cannot go near many hospitals and nursing homes. 

The 5G system is also installed on modern cruise ships. For example, the Diamond Princess cruise ship advertises “the best Wi-Fi at sea.”29 On February 3, 2020, the ship was quarantined in Yokohama, Japan, after many passengers complained of illness. In the end, 381 passengers and crew members became sick, and fourteen died.

The Diamond Princess cruise ship. The four round objects on the top of the ship are 5G antennas and transmitters.

Of interest is the fact that the military has crowd-control devices that operate in the same ranges: 6–100 GHz. The 95 GHz Active Denial System is a weapon that can penetrate the skin and produce intolerable heating sensations, causing people to move away from the beam.30 

The EUROPAEM EMF Guideline 2016 states that “there is strong evidence that long-term exposure to certain EMFs is a risk factor for diseases such as certain cancers, Alzheimer’s disease, and male infertility. . . . Common EHS (electromagnetic hypersensitivity) symptoms include headaches, concentration difficulties, sleep problems, depression, lack of energy, fatigue, and flu-like symptoms [emphasis added].”31 

An article published in May 2020 in Toxicology Letters found that in real-world conditions, exposure to wide-spectrum nonionizing frequencies adversely impacted skin, eyes, heart, liver, kidney, spleen, blood, and bone marrow.32 Electromagnetic frequencies also disturb immune function through stimulation of various allergic and inflammatory responses, and they adversely affect tissue repair.33 

The Russians studied the effects of millimeter waves on animals and humans in 1979. Workers servicing ultra-high-frequency generators complained of fatigue, drowsiness, headaches, and loss of memory. The blood was particularly affected, with a reduction in the amount of hemoglobin and a tendency toward hyper-coagulation.34 Even earlier, in 1971, the US Naval Medical Research Institute published more than twenty-three hundred references in a “Bibliography of Reported Biological Phenomena (‘Effects’) and Clinical Manifestations Attributed to Microwave and Radio-Frequency Radiation.”35 They found adverse effects almost everywhere in the body; in addition to “generalized degeneration of all body tissue,” they noted altered sex ratio of births (more girls), altered fetal development, decreased lactation in nursing mothers, seizures, convulsions, anxiety, thyroid enlargement, decreased testosterone production, and—of particular interest—sparking between dental fillings and a peculiar metallic taste in the mouth. 

One research review of almost two hundred studies36 noted, “nonthermal effects have been clearly demonstrated in thousands of peer reviewed publications.” Whereas some EMF frequency band patterns are coherent and may be health-promoting, “the chosen 5G frequencies belong for a great part to the detrimental zones.” The authors noted that government studies claiming 5G safety have taken no account of the fact that 5G radiation can be pulsating and modulated and emitted from multiple antennas. Of interest is the finding “that EMF waves can also be circularly polarized by interaction with atmospheric dust and therefore may penetrate much deeper into the organism. In addition, 5G waves may exhibit interference with other EMF wave frequencies, resulting in standing waves and environmental ‘hot spots’ of radiation that can be very taxing on EMF hypersensitive individuals.” Air pollution and 5G are not a good mix!

A study published in Frontiers in Oncology describes lung injury from radiation therapy. Radiation therapy uses shorter waves at close range for a shorter period of time, but it stands to reason that 5G millimeter waves, with transmitters nearby, pulsing massive amounts of frequency at all times, could also cause lung injury. According to the authors, “Depending on the dose and volume of lung irradiated, acute radiation pneumonitis may develop, characterized by dry cough and dyspnea (shortness of breath).”37 

Of interest is the fact that Lloyd’s of London and other insurance carriers won’t cover injury from cell phones, Wi-Fi, or smart meters. EMFs are classified as a pollutant, alongside smoke, chemicals, and asbestos: “The Electromagnetic Fields Exclusion (Exclusion 32) is a General Insurance Exclusion and is applied across the market as standard. The purpose of the exclusion is to exclude cover for illnesses caused by continuous, long-term non-ionizing radiation exposure i.e. through mobile phone usage.”38 

According to Dr. Cameron Kyle-Sidell, working in an emergency room (ER) in New York, the afflicted are literally gasping for air. “We’ve never seen anything like it!” he said.39 Covid-19 patients’ symptoms resemble those of high-altitude sickness rather than viral pneumonia. In fact, the ventilators that the hospitals have scrambled to obtain may do more harm than good and may account for the high mortality rate, as they increase pressure on the lungs. These patients don’t need help breathing—they need more oxygen when they take a breath. Many turn blue in the face. These are not signs of a contagious disease but of disruption of our mechanisms for producing energy and getting oxygen to the red blood cells. 

Remember that during the Spanish flu, the problem was the lack of blood coagulability; with Covid-19, a key problem is lack of oxygen in the blood. Both conditions point to electrical toxicity rather than infection, and iron-rich blood cells would be especially vulnerable to the effects of electromagnetism.

And there’s another symptom: fizzing. Many Covid patients report strange buzzing sensations throughout their body, “an electric feeling on the skin,” or skin that feels like it is burning. Those who are electrically sensitive report similar sensations when they are near a cell phone or use GPS-guided cruise control in their cars. Other symptoms include a loss of smell and taste, fever, aches, breathlessness, fatigue, dry cough, diarrhea, strokes, and seizures—all of which are also reported by those who are electrically sensitive.

The correlation of 5G rollout and Covid-19 cases, and the similarity of symptoms, should give us pause. Shouldn’t we look more closely before we institute mandatory vaccination and electronic ID chipping? Shouldn’t we test to see whether this virus is actually contagious before we mandate social distancing and prescribe face masks? 

Today’s pandemic raises many questions. What makes some people more vulnerable than others to the effects of 5G? Why did the thirty-five sailors on the warship Arachne not get sick? Which environmental factors weaken our defenses? How should we treat this disease if it is not a viral disease? What about our diets? Can we protect ourselves with the right food choices? We will address these questions in subsequent chapters. 

Most important, we will show that the minute particles called viruses are actually exosomes—not invaders but toxin-gobbling messengers that our cells produce to help us adjust to environmental assaults, including electro-smog. After all, most people have adjusted to worldwide radio waves, electricity in their homes, and ubiquitous Wi-Fi (and the sparrow population rebounded after the flu of 1738); exosomes are what allow this to happen. These tiny messengers provide real-time and rapid genetic adaptation to environmental changes. Whether these exosomes can help us adapt to the extreme disruption of 5G is the question of the day.


CHAPTER 3 

PANDEMICS 

Throughout history, philosophers believed that comets were “harbingers of doom, disease, and death, infecting men with a blood lust to war, contaminating crops, and dispersing disease and plague.”1 

The Chinese Mawangdui silk texts detail twenty-nine types of comets, dating back to 1500 BC, and the disasters that followed each one. “Comets are vile stars,” wrote a Chinese official in 648 AD. “Every time they appear in the south, they wipe out the old and establish the new. Fish grow sick, crops fail. Emperors and common people die, and men go to war. The people hate life and don’t want to speak of it.”2 

In medieval Europe and even in colonial America, observers associated the appearance of comets with the onset of disease.3 

In the summer of 536 AD, a mysterious and dramatic cloud of dust appeared over the Mediterranean and for eighteen months darkened the sky as far east as China. According to the Byzantine historian Procopius, “During this year a most dread portent took place. For the sun gave forth its light without brightness . . . and it seemed exceedingly like the sun in eclipse, for the beams it shed were not clear.”4 

Analysis of Greenland ice deposited between 533 and 540 AD shows high levels of tin, nickel, and iron oxides, suggesting that a comet or fragment of a comet may have hit the Earth at that time.5 The impact likely triggered volcanic eruptions, which spewed more dust into the atmosphere. With the darkened sky, temperatures dropped, crops failed, and famine descended on many parts of the world.

Depiction of various types of comets in Chinese documents. 

Shortly afterward, in 541 AD, a mysterious illness began to appear on the outskirts of the Byzantine Empire. Victims suffered from delusions, nightmares, and fevers; they had lymph node swellings in the groin, armpits, and behind their ears. The plague, named after the reigning Emperor Justinian, arrived in Constantinople (the capital) in 542. Procopius noted that bodies were left stacked in the open due to a lack of space for proper burial. He estimated that in the city at its peak, the plague was killing ten thousand people per day.6 

The current explanation for the correlation of comets and disease is that of “panspermia.” We now know that outer space is populated by clouds of microorganisms, and the theory holds that comets are watery bodies—dirty snowballs—that rain new microscopic forms on the earth, to which humans and animals have no immunity.7 

However, recent evidence indicates little if any water on comets. Rather, they are asteroids that have an elliptical orbit and become charged electrically as they approach the sun, an exchange that creates the comet’s bright coma and tail. Their surfaces exhibit the kind of features that happen with intense electrical arcing, like craters and cliffs; bright or shiny spots on otherwise barren rocky surfaces indicate areas that are electrically charged. Comets contain mineral alloys requiring temperatures in the thousands of degrees, and they have sufficient energy to emit extreme UV light and even powerful X-rays. Moreover, as comets approach the sun, they can provoke high-energy discharges and flare-ups of solar plasma, which reach out to the comet.8 

Thus, comets can create electrical disturbances in the atmosphere even more powerful than those created by man-made electrification— and this radiation includes demonstrably dangerous ionizing radiation. No wonder the ancients were afraid of comets! 

The conventional view holds that the Plague of Justinian was a case of bubonic plague. Researchers analyzed the remains from graves of the period and detected DNA from Yersinia pestis.9 Mainstream thinking has concluded that rats and other rodents carry Yersinia pestis and pass it along to fleas. When rats die, the blood-sucking fleas leave them to prey on other rats, dogs, and humans. The bacteria then enter humans via flea bites. Researchers believe that during the time of Justinian, rats on merchant ships carried the microorganism to the other Mediterranean ports. 

The classic signs of bubonic plague are buboes—badly swollen lymph nodes. These often appear in the groin because, according to conventional thinking, most fleabites occur on the legs. Those infected will first experience fevers, chills, and muscle pains before developing septicemia or pneumonia. 

The plague reappeared at periodic intervals over the next three hundred years, with the last recorded occurrence in 750 AD—possibly explained by still-orbiting cometary debris. It eventually claimed 25 percent of inhabitants in the Mediterranean region. Then the plague disappeared from Europe until the Black Death of the fourteenth century— also presaged by a comet.

According to historian Thomas Short:

In France . . . was seen the terrible Comet called Negra. In December appeared over Avignon a Pillar of Fire. There were many great Earthquakes, Tempests, Thunders and Lightnings, and thousands of People were swallowed up; the Courses of Rivers were stopt; some Chasms of the Earth sent forth Blood. Terrible Showers of Hail, each stone weighing 1 Pound to 8; Abortions in all Countries; in Germany it rained Blood; in France Blood gushed out of Graves of the Dead, and stained the Rivers crimson; Comets, meteors, Fire-beams, coruscations in the Air, Mock-suns, the Heavens on Fire.10 

According to textbooks, the same bubonic plague organism of Justinian’s time caused the Black Death in Europe, 1347–1350. However, some investigators have pointed out flaws in this theory. Although researchers found evidence of Yersinia pestis in dental pulp from a mass grave of the period in France, other teams of scientists were unable to find evidence of the pathogen in five other grave sites of the period from other parts of Europe.11 

Sociologist Susan Scott and biologist Christopher J. Duncan claim that a hemorrhagic fever, similar to Ebola, caused the Black Death; others blame anthrax or some now-extinct disease. Scott and Duncan note that medieval accounts don’t square with modern descriptions of the illness. Witnesses described a disease that spread at great speed with very high mortality, unlike the plague, which moves slowly and has a death rate of about 60 percent. Accounts describe buboes covering the entire body rather than limited to the groin area, as in the case of plague. Symptom descriptions mention awful odors, splotches resembling bruises, delirium, and stupor—none of which happens with modern-day bubonic plague. Some critics have embraced the theory that a virus caused the disease, but this premise hardly explains the disease’s rapid spread and high mortality any better than bacteria do. 

Then there is the rat problem. No written documents from that time describe vast legions of dead rats required to explain the plague. The Black Death killed over half of Iceland’s population, but rats didn’t reach Iceland until the nineteenth century. And the Black Death continued to kill people during the winter months in northern Europe despite the fact that the plague organism requires relatively warm temperatures.12 

In New Light on the Black Death: The Cosmic Connection, Professor Mike Baillie argues that a comet caused the pandemic. He points out that witnesses of the period describe a significant earthquake on January 25, 1348, with other earthquakes to follow. “There have been masses of dead fish, animals, and other things along the sea shore and in many places covered in dust,” wrote a contemporary observer. “And all these things seem to have come from the great corruption of the air and earth.” Other documents describe tidal waves, rains of fire, foul odors, strange colors in the sky, mists and even dragons, in addition to earthquakes.13 

Baillie believes that fragments from Comet Negra, which passed by earth in 1347, caused the atmospheric phenomena. Some fragments descended and injected huge amounts of dust into the atmosphere. Tree ring analysis indicates that as the material descended from space, it spewed large amounts of chemicals based on carbon and nitrogen into the stratosphere. According to Baillie, illness and death resulted from poisoned water and air as the comet flew overhead.14 

But the symptoms—especially bruise-like blotches on the skin and high fatality rate—indicate radiation poisoning, probably rendered even more deadly by dust and ammonia-like compounds in the atmosphere. Imagine a large comet passing near the earth, crackling with intense electrical arcing, pelting the earth with X-rays and casting off fragments that fall to the earth and spew up toxic clouds of dust, followed immediately by horrible death, sometimes wiping out whole towns. This is not the kind of catastrophe that we can blame on microbes. 

Perhaps our solar system is calming down—mankind has not seen such violent phenomena for centuries. But smaller electrical disturbances, ones that can’t be seen, are still likely to promote outbreaks, albeit less disastrous ones. And if radiation poisoning—whether ionizing or nonionizing—provokes disease, there are obvious cofactors. Poisons in air, water, and food; toxins from insect bites; deadly fungi on grains; exposure to filth; malnutrition and starvation; fear and despair—we don’t need to resort to the notion of contagion to explain outbreaks of disease. 

Let us consider insect-borne diseases. Many (if not most) biting or stinging insects release toxins—often complex chemicals that can target the nervous system. Wasps, bees, flies, beetles, mosquitos,15 ticks, bedbugs, lice, and ants all produce poisonous substances. Early studies suggest that insect saliva has chemicals with vasodilatory, anticoagulant, and immunosuppressive properties, although in recent times there has been little interest in (or research money for) the study of insect saliva. 

In addition to overt poisons, insect saliva may contain parasites or their eggs. Tapeworms can be transmitted by fleas, and mosquito bites can transmit Plasmodium, a parasite said to cause malaria. Mosquitos also carry fly larvae, which can enter the body through bites, causing myiasis, a parasitic infestation in which the larvae (maggots) grow inside the host. Some mosquito species carry filarial worms, parasites that cause a disfiguring condition called elephantiasis. These diseases are “infectious” in the sense that people acquire them from something outside the body, such as an insect, but only in the most bizarre of circumstances can they be transferred from one human being to another. 

Actually, scientists have yet to solve the mystery of malaria, a disease that kills over one thousand people per day. The conventional view is that mosquitoes in tropical and subtropical regions transfer parasites to human blood through their bites, and this parasite then destroys red blood cells and causes intermittent fever. But the type of mosquito said to spread malaria inhabits every continent except Antarctica, including Europe and North America, where malaria is no longer a problem. From the fifteenth century until recent times, many people in England suffered from malaria under the name of “marsh fever” or “ague”—always associated with living in swampy marshes. In fact, what is common to areas known for malaria (both today and in the past) is human habitation in swamps and wetlands—and not just warm wetlands (which are conducive to mosquitos) but also wetlands in cooler areas such as England. 

Wetlands produce swamp gases—a mixture of hydrogen sulfide, carbon dioxide, and especially methane. Methane poisoning causes fever, headaches, muscle weakness, nausea, vomiting, and feelings of asphyxiation—remarkably similar to the symptoms of malaria: fever, muscle weakness, nausea, vomiting, and chest and abdominal pain. Like malaria, methane poisoning can result in the destruction of red blood cells.16 In areas of the world where people still live in swampy areas, intermittent exposure to swamp gases, which are undoubtedly stronger during warm weather or flooding seasons, seems a better explanation than mosquitos for this stubborn disease. 

The conventional view holds that “viral diseases” such as yellow fever, dengue fever, Zika fever, and chikungunya are transmitted by mosquitoes carrying viruses that “attach to and enter susceptible cells.” According to textbooks, once these viruses enter the body and begin to replicate inside the cells, they are contagious and are spread from person to person through airborne droplets, sexual contact, eating food and drinking water contaminated with the virus, and even touching surfaces and bodily fluids contaminated with the virus. But we don’t need the concepts of viruses and contagion to explain these diseases. Environments infested with fleas, mosquitos, lice, and other insects carrying toxins or parasites will result in many individuals, especially those with suboptimal nutrition, manifesting similar symptoms—an “outbreak” that requires no premise of person-to-person contact, only many people subject to the same stressors. For example, the “outbreak” of Zika “virus,” blamed for a rash of babies born with tragically small heads, followed a campaign of DPT vaccinations given to poor pregnant women in Brazil.17 

Toxins are powerful stressors. Sewage fumes contain a mixture of toxic gaseous compounds, such as hydrogen sulfide, carbon dioxide, methane, and ammonia. High concentrations of methane and carbon dioxide displace oxygen. In conditions of low oxygen, beneficial fermentative bacteria begin producing toxins instead of helpful compounds. Industrial chemicals in sewage can add to the adverse effects, especially if these toxins make their way into drinking water. In times past, these toxins included mercury, arsenic, and lead. Lead used for roofing, tanks, gutters, pipes, cables, and winemaking (and even added to recipes in Roman times) poisoned directly, through drinking water, or through the skin. Renaissance noblewomen wore makeup containing white lead ore, vinegar, arsenic, lead hydroxide, and lead carbonate, applied to the face over egg whites or a mercury foundation. Arsenic face powder was the crowning touch.18 The price for a flawless complexion was paralysis, madness, and death. 

Leather tanning contributed greatly to water pollution. Lime, tannin, animal dung, urine, alum, and arsenic were used in the process; the Industrial Revolution added toxic chromium solution to the mix. Production of red paint and dyes, metal extraction, and caustic soda production released mercury. Both mercury and arsenic were popular ingredients in medicines, and they no doubt carried off as many people as the diseases themselves. 

The severe vomiting, diarrhea, dehydration, and muscle cramping of cholera are blamed on the bacterium Vibrio cholerae, acquired either from sewage-tainted water or from shellfish like oysters living in sewage-tainted water. Actually, the killer is a toxin called cholera toxin (CT), which the bacteria produce under low-oxygen conditions. Although CT can be deadly, it also has anti-inflammatory properties and has shown promise as an immunotherapeutic drug.

Cholera affects up to five million people per year, mostly in third world countries, and causes over one hundred thousand deaths annually. Treatment includes oral rehydration therapy and zinc supplementation. Children are highly susceptible to CT, as are those who are malnourished or have lowered immunity. One strange observation is that people with type O blood are more likely to contract cholera.19 

Even today, with the medical world’s fixation on person-to-person transmission of disease and prevention through vaccination, health authorities agree that the solution to cholera is better sanitation. Cholera is rarely spread directly from person to person; it spreads through filthy drinking water. 

An outbreak of cholera occurred in Soho, London, in 1854. According to Judith Summers in Broad Street Pump Outbreak, “by the middle of the [nineteenth] century, Soho had become an insanitary place of cowsheds, animal droppings, slaughterhouses, grease-boiling dens and primitive, decaying sewers. And underneath the floorboards of the overcrowded cellars lurked something even worse—a fetid sea of cesspits as old as the houses, and many of which had never been drained. It was only a matter of time before this hidden festering time-bomb exploded.”20

The previous year, over ten thousand people died of cholera in England. The outbreak in Soho appeared suddenly: “Few families, rich or poor, were spared the loss of at least one member. Within a week, three quarters of the residents had fled from their homes, leaving their shops shuttered, their houses locked and the streets deserted. Only those who could not afford to leave remained there. It was like the Great Plague all over again.” 

Dr. John Snow lived at the epicenter of the outbreak and traced its source to a pump on the corner of Broad and Cambridge Streets. “I found,” he wrote afterward, “that nearly all the deaths had taken place within a short distance of the pump.” In fact, in houses much nearer another pump, only ten deaths occurred—and of those, five victims had drunk the water from the Broad Street pump. Workers in a local brewery did not get sick; they drank beer provided as a perk of employment. Dr. Snow blamed the outbreak not on toxins but on “white, flocculent particles,” which he observed under a microscope.21 

Three decades later, Robert Koch tried injecting a culture of these white flocculent particles into animals, without succeeding in making them sick—so cholera failed his third postulate, which requires that the cultured organism cause the disease in a healthy host. Cholera also failed his first postulate, as Vibrio cholerae appeared in both sick and healthy people.22 Even so, Koch remained convinced that this bacillus was the cause of cholera—old ideas are difficult to dislodge, even in the face of conflicting evidence. 

It bears emphasis that all cities up to the nineteenth century were “fetid seas” of horse droppings, stinking manure piles, primitive water sanitation, toxic chemicals, crowded living conditions, loose pigs, and even raw sewage dumped from houses. Swill from inner-city breweries went to cows in inner-city confinement dairies, producing poisoned milk in conditions of unimaginable filth. The death rate among children born in these conditions was 50 percent. Officials blamed the death rate on the milk, which became the justification for pasteurization laws instituted one hundred years later.23 By then, the problem had resolved itself with improved water and sewer systems, better living conditions, the advent of refrigeration, laws prohibiting inner-city breweries and dairies, and (most important) replacement of the horse with the car. Automobiles and buses brought in a different kind of pollution, but new technologies at least ensured that the water was finally clean. Much “infectious disease” cleared up, thanks not to doctors but rather to inventors and civil engineers. 

One invention that made life safer was the washing machine, making it easier to keep clothes and bedding clean, especially as more and more dwellings had hot running water. Another invention was the vacuum cleaner, which helped keep living quarters free of bugs. (Window screens also helped.) 

At the turn of the twentieth century, health officials considered smallpox to be highly infectious, but one physician disagreed. Dr. Charles A. R. Campbell of San Antonio, Texas, believed that smallpox was transmitted by the bites of bedbugs. 

The modern official view holds that smallpox resulted from contact with a contagious virus—“Transmission occurred through inhalation of airborne Variola virus, usually droplets expressed from the oral, nasal, or pharyngeal mucosa of an infected person. It was transmitted from one person to another primarily through prolonged face-to-face contact with an infected person, usually within a distance of 1.8 m (6 feet), but could also be spread through direct contact with infected bodily fluids or contaminated objects (fomites) such as bedding or clothing . . . the infected person was contagious until the last smallpox scab fell off . . . Smallpox was not known to be transmitted by insects or animals.”24 Note that this description is written in the past tense—the official view is that smallpox has been conquered by vaccination, not by something as simple as getting rid of bedbugs. 

Dr. Campbell ran a “pest house” for smallpox patients in San Antonio, where he tried hard to infect himself and others by “fomites” and direct face-to-face contact with infected persons: 

As even the air itself, without contact, is considered sufficient to convey this disease, and touching the clothes of a smallpox patient considered equivalent to contract it, I exposed myself with the same impunity as my pest-house keeper. . . . After numerous exposures, made in the ordinary manner, by going from house to house where the disease was . . . I have never conveyed this disease to my family, or to any of my patients or friends, although I did not disinfect myself or my clothes, nor take any precautions whatever, except to be sure that no bedbugs got about my clothing. 

Another one of my experiments was thoroughly to beat a rug in a room, only eight or ten feet square, from which had just been removed a smallpox patient. . . . I beat this rug in the room until the air was stifling, and remained therein for thirty minutes. This represented the respiratory as well as the digestive systems as accepted avenues of infection. . . . After inhaling the dust from that rug, I examined my sputum microscopically the following morning and found cotton and woolen fibres, pollen and comminuted manure, and also bacteria of many kinds.25 

Although Dr. Campbell subsequently mingled with family, patients, and friends, none contracted smallpox. He repeated these experiments with others, failing to infect, even when in contact with patients covered in sores, but he always found bedbugs in the houses of those who contracted the disease.26 

The British and American colonists used smallpox as a weapon against the Native Americans—they did so by giving them blankets, thus spreading the bedbug to the New World. 

Campbell treated smallpox by administering sources of vitamin C: 

The most important observation on the medical aspect of this disease is the cachexia [bad condition] with which it is associated, and which is actually the soil requisite for its different degrees of virulence. I refer to the scorbutic cachexia. Among the lower classes of people this particular acquired constitutional perversion of nutrition is most prevalent, primarily on account of their poverty, but also because of the fact that they care little or nothing for fruits or vegetables . . . that it is more prevalent in winter when the anti-scorbutics are scarce and high priced; and finally, that the removal of this perversion of nutrition will so mitigate the virulence of this malady as positively to prevent the pitting or pocking of smallpox. 

A failure of the fruit crop in any particularly large area is always followed the succeeding winter by the presence of smallpox.27 

Dr. Campbell also applied himself to the elimination of mosquitoes by constructing huge bat houses—he was a great admirer of this strange winged creature and knew how to harness its help in eliminating the annoying insects assumed to cause malaria.28 Campbell was an inventive and colorful character, full of good ideas, yet he is hardly mentioned in medical journals or in histories of disease. Where’s the glamor of a solution that involves clean beds and fresh fruit compared with the heroics of vaccination—smallpox vaccinations so toxic that health officials no longer recommend them?

Dr. Campbell’s Municipal Bat-Roost, which eliminated mosquitos from San Antonio without the use of toxic chemicals.


Unlike the forgotten Dr. Campbell, Dr. Robert Koch is immortalized as the father of microbiology and the germ theory. Unable to prove that a microorganism caused cholera,29 and knowing that in the case of rabies Pasteur had been unable even to find an organism,30 Dr. Koch turned his attention to tuberculosis (TB). According to a historical article published in World of Microbiology and Immunology:

In six months, Koch succeeded in isolating a bacillus from tissues of humans and animals infected with tuberculosis. In 1882, he published a paper declaring that this bacillus met his four conditions—that is, it was isolated from diseased animals, it was grown in a pure culture, it was transferred to a healthy animal who then developed the disease, and it was isolated from the animal infected by the cultured organism. When he presented his findings before the Physiological Society in Berlin on March 24, he held the audience spellbound, so logical and thorough was his delivery of this important finding. This day has come to be known as the day modern bacteriology was born.31 

In 1905, Dr. Koch received the Nobel Prize for proving that TB was an infectious disease. 

Except he didn’t. 

In fact, he could find an organism in infected tissue only by using special staining methods after the tissue was heated and dehydrated with alcohol. The stain was a toxic dye, methylene blue, and the solution he used contained another toxin—potassium hydroxide (lye). When he injected the organism stained with these poisons into animals, they got sick. But what caused the illness, the bacillus or the poisons?32 And TB does not even satisfy Koch’s first postulate. Only one person in ten who tests positive for TB actually develops the disease; those who don’t are said to have “latent TB.” 

Even into the 1930s and 1940s, some scientists remained skeptical of the germ theory for TB—many still believed that the cause was genetic. An investigator who disputed both theories was the dentist Weston A. Price, author of the groundbreaking book Nutrition and Physical Degeneration.33 During the 1930s and 1940s, he traveled around the globe to study the health of so-called “primitive peoples” living on ancestral diets. As a dentist, he naturally observed dental and facial formation and the presence or absence of tooth decay. He found fourteen groups, in regions as diverse as the Swiss Alps, the Outer Hebrides, Alaska, South America, Australia, and the South Seas, in which every member of the tribe or village exhibited wide facial structure, naturally straight teeth, and freedom from tooth decay. 

He also noted the absence of disease in these well-nourished groups. As soon as the “displacing foods of modern commerce” made inroads into a population, its members became vulnerable to both chronic and “infectious” disease, especially TB. The children born to those who adopted the Western diet of “sanitary” processed food—sugar, white flour, canned foods, and vegetable oils—were born with narrower faces, crowded and crooked teeth, pinched nasal passages, narrow configuration of the birth canal, and less robust body formation. 

Price rejected the notion that TB was inherited or caused by a microorganism transmitted in droplets released by the coughs and sneezes of the infected; he surmised that the root cause was a malformation of the lungs, akin to the narrowing of the facial structure and the “dental deformities” of those born to parents eating processed foods. On a visit to a pediatric TB ward in Hawaii, he noted that every patient had dental deformities.34 These dental deformities did not cause TB, of course, but Dr. Price believed that the same conditions that prevented optimal formation of the facial bones also prevented optimal formation of the lungs. The bacteria, nature’s cleanup crew, were attracted to the dead and dying tissue in the lungs; the microorganism did not cause the disease. 

He noted that Swiss villagers living on their native diet of raw dairy products, sourdough rye bread, and some meat and organ meats had no TB—and this at a time when TB was the number-one killer in Switzerland and elsewhere.35 Likewise, the inhabitants of the Isle of Lewis in the Outer Hebrides were free of TB. Their nutrient-dense diet consisted of seafood, including fish livers and fish liver oil, along with oat porridge and oatcakes. They lived in close quarters in thatched houses that had no chimneys, breathing smoky, polluted air night and day; still they had no TB. When modern foods made their appearance, the situation changed, and TB took hold. Health workers blamed the smoky air of the cottages (not a microorganism!) and made the islanders install chimneys, but to no avail. Only Weston A. Price was curious about the fact that the well-nourished islanders were immune, even when living in smoke-filled houses.36 

Similarly, he observed that African tribesmen living on traditional foods seemed immune to the diseases in Africa, even though they went barefoot, drank unsanitary water, and lived in areas that swarmed with mosquitos.37 Europeans visiting Africa needed to cover themselves completely and sleep under protective netting to avoid disease. Once the continent of Africa became “coca-colonized,” these diseases proliferated among the Africans. 

During the time of Dr. Price’s research, it was not the so-called infectious diseases of Africa that struck terror in American minds; it was polio. According to health officials, the cause was an infectious virus. This virus didn’t just make people (especially young people) sick; it occasionally left them crippled. Pictures of grown men in iron lungs and children wearing leg braces seared the national consciousness. 

In the mid-1950s, the physician Morton S. Biskind testified before Congress. Dr. Biskind’s message was not what the legislators wanted to hear: polio was the result of a central nervous system (CNS) poison, not a virus, and the chief CNS poison of the day was a chemical called dichlorodiphenyltrichloroethane, commonly known as DDT.38 DDT was used in World War II to control the mosquitoes said to cause malaria and typhus among civilians and troops; its inventor, Paul Hermann Müller,39 was awarded the Nobel Prize in Physiology or Medicine in 1948 “for his discovery of the high efficiency of DDT as a contact poison against several arthropods.” 

By October 1945, DDT was available for public sale in the United States. Government and industry promoted its use as an agricultural and household pesticide—really promoted it. Photographs from the era show housewives filling their houses with DDT fog; dairy farmers dusting cows in their cowsheds, even spraying DDT into the milk; crop dusters depositing DDT on fields and forests; and children on beaches enveloped in the pesticide. An attachment for your mower could distribute DDT over your lawn, and trucks sprayed DDT on city streets while children cheerfully played in the spray. 


DDT largely replaced another CNS poison—lead arsenate, introduced in 1898 for use on crops and orchards. Before that, the preferred spray was plain arsenic. Biskind wrote: 

In 1945, against the advice of investigators who had studied the pharmacology of the compound and found it dangerous for all forms of life, DDT . . . was released in the United States and other countries for general use by the public as an insecticide. . . . It was even known by 1945 that DDT is stored in the body fat of mammals and appears in the milk. With this foreknowledge the series of catastrophic events that followed the most intensive campaign of mass poisoning in known human history, should not have surprised the experts. Yet, far from admitting a causal relationship so obvious that in any other field of biology it would be instantly accepted, virtually the entire apparatus of communication, lay and scientific alike, has been devoted to denying, concealing, suppressing, distorting and attempts to convert into its opposite, the overwhelming evidence. Libel, slander and economic boycott have not been overlooked in this campaign. . . . 

Early in 1949, as a result of studies during the previous year, the author published reports implicating DDT preparations in the syndrome widely attributed to a ‘virus-X’ in man, in ‘X-disease’ in cattle and in often fatal syndromes in dogs and cats. The relationship was promptly denied by government officials, who provided no evidence to contest the author’s observations but relied solely on the prestige of government authority and sheer numbers of experts to bolster their position. . . . 

. . . [‘X-disease’] . . . studied by the author following known exposure to DDT and related compounds and over and over again in the same patients, each time following known exposure. We have described the syndrome as follows: . . . In acute exacerbations, mild clonic convulsions involving mainly the legs, have been observed. Several young children exposed to DDT developed a limp lasting from 2 or 3 days to a week or more. . . . 

Particularly relevant to recent aspects of this problem are neglected studies by Lillie and his collaborators of the National Institutes of Health, published in 1944 and 1947 respectively, which showed that DDT may produce degeneration of the anterior horn cells of the spinal cord in animals. These changes do not occur regularly in exposed animals any more than they do in human beings, but they do appear often enough to be significant. 

When the population is exposed to a chemical agent known to produce in animals lesions in the spinal cord resembling those in human polio, and thereafter the latter disease increases sharply in incidence and maintains its epidemic character year after year, is it unreasonable to suspect an etiologic relationship?40 

Investigator Jim West unearthed Biskind’s writings and testimony, along with other reports about the effects of poisons on the CNS, dating from the mid-nineteenth century. West compiled graphs noting the correlation of pesticide use and polio incidence in the United States.41 

[Graphs: Jim West’s plots of US pesticide production against polio incidence.]

As use of DDT in the United States declined, so did the incidence of polio. Vaccination programs, introduced at the same time, took the credit for the decline. 

West says: 

A clear, direct, one-to-one relationship between pesticides and polio over a period of thirty years, with pesticides preceding polio incidence in the context of the [central nervous system]-related physiology . . . leaves little room for complicated virus arguments, even as a cofactor, unless there exists a rigorous proof for virus causation. Polio shows no movement independent from pesticide movement, as one would expect if it were caused by a virus. Both the medical and popular imaginations are haunted by the image of a virus that invades (or infects) and begins replicating to the point of producing disease. 

In the laboratory, however, poliovirus does not easily behave in such a predatory manner. Laboratory attempts to demonstrate causation are performed under conditions which are extremely artificial and aberrant.42 

West notes that in 1908–1909, the German researchers Landsteiner and Popper claimed to have isolated the poliovirus and used it to cause polio in monkeys. Their method was to inject a pulverized purée of diseased brain tissue into the brains of two monkeys. One monkey died, and the other was sickened. Headlines trumpeted this “proof” of polio virus causation. “The weakness of this method is obvious to everyone except certain viro-pathologists,” said West. Never has “polio contagion” passed muster with Rivers’ postulates.43 

The injection of purée of diseased brain tissue into the brains of dogs was the method preferred by Louis Pasteur to establish microbial causation of rabies; and indeed, injecting smooshed brains into their heads often made them foam at the mouth and die. Many of Pasteur’s contemporaries disagreed strongly that rabies (also called hydrophobia) was a contagious disease and pointed out that the vaccine often caused great harm to animals and people—even Pasteur’s fellow germ theorist Robert Koch discouraged the use of the rabies vaccine.44 Veterinarians of the era believed that dogs became “rabid” when they were starved and mistreated. Dr. Matthew Woods of Philadelphia noted that “at the Philadelphia dog pound, where on average more than 6,000 vagrant dogs are taken annually, and where the catcher and keepers are frequently bitten while handling them, not one case of hydrophobia [rabies] has occurred during its entire history of twenty-five years, in which time 150,000 dogs have been handled.”45 During the 1960s, researchers succeeded in inducing symptoms of rabies in experimental animals by putting them in bat caverns where they breathed the toxic, stultifying vapors of bat guano; the researchers later claimed to have isolated an “airborne rabies virus.” To test whether this so-called virus caused rabies, one researcher “inoculated mice intracerebrally.” Fifty percent died within forty-eight hours, but none developed rabies.46 

As for polio, even with worldwide vaccination programs, the disease has not gone away, either in the United States or in third world countries. Today in the United States it has received a new name—acute flaccid paralysis (AFP)—with symptoms identical to polio and over two hundred cases recorded in 2018. Many parents have observed that the condition appears after a vaccination. The CDC’s pathetic advice: “To prevent infections in general, persons should stay home if they are ill, wash their hands often with soap and water, avoid close contact (such as touching and shaking hands) with those who are ill, and clean and disinfect frequently touched surfaces.”47 

In some areas of the world, such as India and Africa, the incidence of acute flaccid paralysis has skyrocketed, which many blame on campaigns to administer experimental polio vaccines to children ages zero to five. 

Indian researchers described this strong correlation in a 2018 publication in the International Journal of Environmental Research and Public Health and calculated that, countrywide from 2000 to 2017, there were “an additional 491,000 paralyzed children” in excess of “the expected numbers.”48 Dr. Suzanne Humphries suggests that, far from vaccination campaigns deserving credit for eliminating childhood paralysis, “there is strong evidence pointing to the likelihood that experimental polio vaccination is related to the sharp rise in AFP.”49 

If the true cause of epidemics is exposure to electrical pollution or toxins (from insects, industrial poisons, toxins produced by bacteria under conditions of filth, vaccinations, and drugs), with substandard nutrition as a cofactor, what about the outbreaks of disease in the Americas, in Africa, and in the South Seas when aboriginal peoples first met the European colonists? Didn’t they begin to suffer as soon as they came in contact with diseases carried on boats from the Old World to the New—diseases to which they had no immunity? 

Actually, native peoples did not contract disease immediately on contact with the Europeans. For example, fishermen and early explorers visited the northeastern waters along the Atlantic coast during the fifteenth and sixteenth centuries, yet we have no historical commentary on the existence of disease or epidemics among the aboriginal peoples during that time. According to Raymond Obomsawin, in his report “Historical and Scientific Perspectives on the Health of Canada’s First Peoples,”50 “Since the prime purpose of this early contact was to commercially exploit natural resources, any visible evidence of the physical weakness or sickness of the indigenous inhabitants would surely have excited some keen interest.” Instead, these early reports marveled at the Native Americans’ good health and robust constitution. 

Obomsawin notes that the first recorded outbreaks of disease among Native Americans living in the Ottawa Valley occurred between 1734 and 1741. Samuel de Champlain had established the first European settlement at Quebec on the St. Lawrence River over one hundred years earlier, in 1608, and it wasn’t until the 1800s that smallpox, dysentery, typhus, yellow fever, tuberculosis, syphilis, and various other “fevers” became prevalent in the aboriginal population. 

By the mid-eighteenth century, Native American life had suffered serious disruptions. As a result of intensive trapping, the game populations had dwindled, seriously affecting the availability of food and of skins for clothing and footwear. During this period, sugar, white flour, coffee, tea, and alcohol arrived on trading ships—goods that the colonists traded with the Indians for furs. 

The same pattern prevailed on the West Coast, where the salmon fisheries became depleted by the mid-1800s. These northwest peoples spoke of “disease boats” or “pestilence canoes,” the Spanish and British seagoing vessels that arrived with increasing frequency. The ships brought smallpox, but also the foods that made the natives vulnerable to smallpox. An early one-hundred-foot sailing cargo vessel could transport as much as eight hundred thousand pounds of “goods”—or maybe we should say “bads”—including blankets for the Native Americans.51

Tribal peoples largely dependent upon the buffalo were not affected until the early 1870s, when the animals became depleted through exploitation and deliberate campaigns to kill off the herds. 

According to a Canadian government report: 

The transformation of Aboriginal people from the state of good health that had impressed travelers from Europe to one of ill health . . . grew worse as sources of food and clothing from the land declined and traditional economies collapsed. It grew worse still as once mobile peoples were confined to small plots of land where resources and opportunities for natural sanitation were limited. It worsened yet again as long-standing norms, values, social systems and spiritual practices were undermined or outlawed.52 

Regarding the Plymouth colony, the Pilgrims were not the first Europeans in the area. European fishermen had been sailing off the New England coast, with considerable Native American contact, for much of the sixteenth and seventeenth centuries, and trading for beaver skins commenced in the early 1600s, prior to the arrival of the Pilgrims in 1620. 

In 1605, the Frenchman Samuel de Champlain made an extensive and detailed map of the area and the surrounding lands, showing the Patuxet village (site of the future town of Plymouth) as a thriving settlement. 

In 1617–1618, just prior to the arrival of the Mayflower, a mysterious epidemic wiped out up to 90 percent of the Indian population along the Massachusetts coast. History books blame the epidemic on smallpox, but a recent analysis has concluded that it may have been a disease called leptospirosis.53 Even today, leptospirosis kills almost sixty thousand people per year. 

Leptospirosis is a blood infection similar to malaria, associated with various forms of spirochete bacteria. Other spirochete parasites characterize syphilis, yaws, and Lyme disease. Humans encounter these spirochetes when animal urine, or water and soil contaminated with animal urine, comes into contact with the eyes, mouth, nose, or cuts in the skin. The disease is associated with poor sanitation. Both wild and domestic animals can transmit leptospirosis through their urine and other fluids; rodents are the most common vector, and the beaver is a rodent. 

Native Americans traded beaver skins with European colonists for liquor and other items that made them vulnerable to disease. 

One important factor omitted from discussions about Native American diseases is the disruption of the salt trade. The first European explorers in the New World did not come to the East Coast but to Florida and the southeastern part of North America. During the 1540s, eighty years before the Pilgrims landed at Plymouth Rock, the explorer Hernando de Soto led the first European expedition deep into the territory of the modern-day United States. They traversed Florida, Georgia, Alabama, and possibly Arkansas, and they saw the Mississippi River. 

Some anthropologists have insisted that the Native Americans did not consume salt, but de Soto received “an abundance of good salt” as a gift from the Native Americans, and he observed the production and trade of salt in the southeastern part of the country. In the lower Mississippi Valley, he met traveling Native American merchants selling salt. According to the de Soto records, lack of salt could lead to a most unfortunate death: 

Some of those whose constitutions must have demanded salt more than others died a most unusual death for lack of it. They were seized with a very slow fever, on the third or fourth day of which there was no one at fifty feet could endure the stench of their bodies, it being more offensive than that of the carcasses of dogs or cats. Thus they perished without remedy, for they were ignorant as to what their malady might be or what could be done for them since they had neither physicians nor medicines. And it was believed that they could not have benefited from such had they possessed them because from the moment they first felt the fever, their bodies were already in a state of decomposition. Indeed, from the chest down, their bellies and intestines were as green as grass.54 

The most important sources of salt were the salt springs that dotted northwestern Louisiana, western Arkansas, and the Ohio River Valley. Archeological remains in these areas indicate that the Native Americans evaporated the brine in shallow clay salt pans, most likely by adding hot rocks to the brackish water. They also retrieved salt from ashes of certain plants and from salt-impregnated sand; they sometimes gathered rock salt. Well-defined salt trails allowed the transport of salt to the east. Coastal Native Americans generally got their salt through trade rather than the evaporation of seawater, as wood for fire making is sparse near ocean beaches, and the moist sea air is unconducive to evaporation.55 

The salt traders did not belong to any tribal group but traveled alone from tribe to tribe carrying baskets of salt gathered from salt lakes, along with other goods. As Native American cultural life crumbled in the face of the European invasion, the salt trade would have been an early victim of this disruption. Salt is critical for protection against parasites. We need the chloride in salt to make hydrochloric acid; without salt, the stomach will not be sufficiently acidic to kill parasites. 

The point is that the so-called “infectious” diseases that caused so much suffering did not arrive until after a period of disruption and nutritional decline; and fear and despair almost certainly played a role. When disease broke out in a village, the afflicted often found themselves abandoned by those still healthy, so they had no one to care for them. Unable to get water for themselves, they typically died of thirst.56 This may explain why the death rates during outbreaks were so much higher for the Native Americans (typically 90 percent) than for Europeans (typically 30 percent). 

One disease blamed for Native American deaths was measles, considered to be a viral disease. But on February 16, 2016, the Federal Supreme Court of Germany (BGH) made a historic ruling: there is no evidence for the existence of a measles virus. The case grew out of a challenge by the German biologist Stefan Lanka, who offered a sum of one hundred thousand euros to anyone who could supply proof that the measles virus existed. A young doctor, David Bardens, took up the challenge, providing Lanka with six studies as proof that the measles virus did indeed exist. When Lanka claimed that these studies did not meet the standard of evidence needed to claim the prize, Bardens took him to court. The court sided with Bardens and ordered payment of the prize. 

But Lanka took the case to the Supreme Court, where the judges decided in his favor and ordered the plaintiff to bear all the procedural costs. Lanka was able to show that the six studies had misinterpreted “ordinary constituents of cells” as parts of the suspected measles virus.57 

According to Lanka, decades of consensus-building processes have created a model of a measles virus that doesn’t actually exist: “To this day, an actual structure that corresponds to this model has been found neither in a human, nor in an animal. With the results of the genetic tests, all thesis of existence of measles virus has been scientifically disproved.”58 

The existence of a contagious measles virus justified the development of the measles vaccine, which has earned the pharmaceutical industry billions of dollars over a forty-year period. But if such a microorganism does not exist, said Lanka, “This raises the question of what was actually injected into millions of German citizens over the past decades. According to the judgment by the Supreme Court, it may not have been a vaccine against measles.”59 

But what about measles parties? What about parents’ successful attempts to infect their children with common childhood diseases like measles, chicken pox, and mumps? And what about sexually transmitted diseases (STDs) like syphilis—said to be a disease that the Europeans contracted from the Native Americans? The mystery of childhood illnesses and sexually transmitted diseases will be addressed in chapter 7.

next

FROM AIDS TO COVID

Footnotes for these chapters are here, starting at page 203 of the scroll file:

https://www.bibliotecapleyades.net/archivos_pdf/contagion-myth.pdf
