Avian Flu (H5N1), Cervical Cancer (HPV), SARS, BSE, Hepatitis C, AIDS, Polio
How the Medical Industry Continually Invents Epidemics, Making Billion-Dollar Profits At Our Expense
By Torsten Engelbrecht & Claus Köhnlein
The Microbe Hunters Seize Power
"The doctor of the future will give no medicine, but will interest his patients in the care of the human frame, in diet, and in the cause and prevention of disease. "1
Thomas Edison (1847 - 1931)
One of the greatest inventors of history
"The conclusion is unavoidable: Pasteur deliberately deceived the public, including especially those scientists most familiar with his published work. "2
"Modern virus detection methods like PCR tell little or nothing about how a virus multiplies, which animals carry it, or how it makes people sick. It is like trying to say whether somebody has bad breath by looking at his fingerprint. "3
An appeal from 14 top virologists
of the "old guard" to the
new biomedical research generation
Science, 6 July 2001
Pasteur and Koch:
Two of Many Scientific Cheats
The elevated status Louis Pasteur enjoyed during his lifetime is made clear by a quotation from physician Auguste Lutaud in 1887 (eight years before Pasteur's death): "In France, one can be an anarchist, a communist or a nihilist, but not an anti-Pasteurian."4 In truth, however, Pasteur was no paragon with a divinely pure clean slate, but rather a fame-addicted researcher acting on false assumptions; "he misled the world and his fellow scientists about the research behind two of his most famous experiments," as the journal The Lancet stated in 2004.5
In his downright fanatical hatred of microbes, Pasteur actually started from the ludicrous equation that healthy (tissue) equals a sterile (germ-free) environment.6 He believed in all earnestness that bacteria could not be found in a healthy body7 and that microbes flying through the air on dust particles were responsible for all possible diseases.8 At 45 years of age, he "was basking in his fame," as bacteriologist Paul de Kruif writes in his book Microbe Hunters, "and trumpeted his hopes out into the world: 'It must lie within human power to eliminate all diseases caused by parasites [microbes] from the face of the earth.'"9
Flaws in Pasteur's theories were shown long ago in the first half of the 20th century by experiments in which animals were kept completely germ-free. Their birth even took place by Cesarean section; after that, they were locked in microbe-free cages and given sterile food and water-after a few days, all the animals were dead. This made it apparent that "contamination" by exogenous bacteria is absolutely essential to their lives.10
In the early 1960s, scientists succeeded for the first time in keeping germ-free mice alive for more than a few days, namely for several weeks. Seminal research on these germ-free rodents was performed by Morris Pollard in Notre Dame, Indiana.11
However, this does not undermine the fact that germs are essential for life. Mice under natural conditions have a life span of three years, much longer than the average lifespan of these germ-free lab animals.12 Moreover, keeping germ-free animals such as mice or rats alive for a longer time requires highly artificial lab conditions in which the animals are synthetically fed with vitamin supplements and extra calories, conditions that have nothing to do with nature. These specially designed liquid diets are needed because, under normal rearing conditions, animals harbor populations of microorganisms in the digestive tract.13
These microorganisms generate various organic constituents as products or byproducts of metabolism, including various water-soluble vitamins and amino acids. In the rat and mouse, most of the microbial activity is in the colon, and many of the microbially produced nutrients are not available in germ-free animals. This loss of microbial nutrient synthesis, in turn, influences dietary requirements. Adjustments in nutrient concentrations, the kinds of ingredients, and methods of preparation must be considered when formulating diets for laboratory animals reared in germ-free environments or environments free of specific microbes.14 15
One important aim of administering these artificial diets is to avoid the accumulation of metabolic products in the large intestine. However, it has been observed that after only a short time the appendix or cecum of these germ-free reared rodents increased in weight and eventually became abnormally enlarged, filled with mucus which would normally have been broken down by microbes.16 Furthermore, in germ-free conditions rodents typically die of kidney failure,17 a sign that the kidneys are overworked in their function as an excretory organ when the large intestine has been artificially crippled. In any case, it shows that germ-free mice would not be able to survive and reproduce while staying healthy in realistic conditions, which can never be duplicated by researchers, not even approximately.
Apart from this, it is not clear that these germ-free animals were truly 100% germ-free. Obviously not all tissues, and certainly not every single cell, could have been checked for germs. Nobody can know that these animals are absolutely germ-free, especially if one keeps in mind that germs such as Chlamydia trachomatis may "hide" so deeply in the cells that they persist there even after treatment with penicillin.18
Furthermore, even if the specimens of so-called germ-free animals are maintained under optimum conditions-assumed to be perfectly sterile-their tissues do, nevertheless, decay after a time, forming "spontaneous" bacteria. But how do we explain these "spontaneous" bacteria? They cannot come from nothing, so logic allows only one conclusion: the bacteria must have already been present in the so-called "germ-free" mice. (In any case, mice said to be bacteria-free are apparently not virus-free; this was demonstrated in 1964 in the Journal of Experimental Medicine by Etienne de Harven, who observed, by electron microscopy, typical so-called retroviral particles in the thymus of germ-free Swiss and C3H mice.19 Of course, these viruses may be endogenous retroviruses, which are sometimes expressed as particles-but then they are of endogenous origin.)
If nature wanted us bacteria-free, nature would have created us bacteria-free. Germ-free animals, which apparently aren't really germ-free, can only exist under artificial lab conditions, not in nature. The ecosystems of animals living under natural conditions-be it rodents or be it human beings-depend heavily upon the activities of bacteria, and this arrangement must have a purpose.
But back to "Tricky Louis,"20 who deliberately lied, even in his vaccination experiments, which provided him a seat on the Mount Olympus of research gods. In 1881, Pasteur asserted that he had successfully vaccinated sheep against anthrax. But not only does nobody know how Pasteur's open land tests outside the Paris gates really proceeded; the national hero of la grande Nation, as he would later be called, had in fact clandestinely lifted the vaccine mixture from fellow researcher Jean-Joseph Toussaint,21 whose career he had earlier ruined through public verbal attacks.22 And what about Pasteur's purportedly highly successful experiments with a rabies vaccine in 1885? Only much later did the research community learn that they did not satisfy scientific standards at all, and were thus unfit to back up the chorus of praise for his vaccine-mixture. Pasteur's super-vaccine "might have caused rather than prevented rabies," writes scientific historian Horace Judson.23
These experiments weren't debated for decades, largely due to the fastidious secretiveness of the famous Frenchman. During his lifetime, Pasteur permitted absolutely no one-not even his closest co-workers-to inspect his notes. And "Tricky Louis" arranged with his family that the books should remain closed to all even after his death.24 In the late 20th century, Gerald Geison, medical historian at Princeton University, was the first to be given the opportunity to go through Pasteur's records meticulously, and he made the fraud public in 1995.25 That it became so controversial shouldn't be particularly surprising, for sound science thrives in a transparent environment, in which other researchers can verify the conclusions made.26
Secretiveness has the opposite goal: shutting out independent monitoring and verification. When external inspection and verification by independent experts are shut out of the process, the floodgates are open to fraud.27 Of course, we observe this lack of transparency everywhere, be it in politics, in organizations like the international football association FIFA, or in "scientific communities [that] believe that public funding is their right, but so is freedom from public control," according to Judson.28 With this, mainstream research has actually managed to seal off its scientific edifice from public scrutiny.
This set-up lacks critical checks and balances, so no one is ultimately in the position to scrutinize the work of researchers and make sure research is conducted in an honest way. We are left to simply trust that they go about it truthfully.29 But a survey of scientists published in a 2005 issue of Nature showed that a third of researchers admitted they would not avoid deceptive practices, and would simply brush aside any data that did not suit their purposes.30 A crucial aspect of science has been lost; few researchers now trouble themselves to verify data and conclusions presented by fellow researchers.
Such quality checkups are equated with a waste of time and money, and for that reason are also not financed. Instead, medical researchers are completely preoccupied with chasing after the next big high-profit discovery. And many of today's experiments are constructed in such a complicated manner that they cannot be reconstructed and precisely verified at all.31 This makes it very easy for researchers to ask themselves, without having to fear any consequences: why shouldn't I cheat?
One would hope that the so-called peer review system largely eliminates fraud. It is still commonly considered a holy pillar of the temple of science, promising adherence to quality standards.32 But the decades-long practice of peer review is rotten to the core.33 34 It functions like this: experts ("peers") who remain anonymous examine (review) research proposals and journal articles submitted by their scientific competitors. These so-called experts then decide if the proposals should be approved or the articles printed in scientific publications. There are said to be around 50,000 such peer-reviewed publications,35 and all the best-known journals, such as Nature, Science, the New England Journal of Medicine, the British Medical Journal and The Lancet, are peer-reviewed.
There is, however, a fundamental problem: peer reviewing, in its current form, is dangerously flawed. If researchers in other fields conducted studies and published results using this process, what would happen? If their current methods were common in the car industry, for example, BMW's competitors could decide, through an anonymous process, whether or not BMW would be permitted to develop a new car model and bring it to the market. Clearly this would stifle innovation and invite conflicts of interest and fraud.
"Peer review is slow, expensive, a profligate of academic time, highly selective, prone to bias, easily abused, poor at detecting gross defects, and almost useless for detecting fraud," says Richard Smith, former Editor in Chief of the British Medical Journal.36 No wonder, then, that all the cases of fraud which scientific historian Judson outlines in his 2004 book The Great Betrayal: Fraud in Science were not uncovered by the peer review system, but rather by pure coincidence. 37 And next to Pasteur in the pantheon of scientific fraudsters appear such illustrious names as Sigmund Freud and David Baltimore, one of the best -known recipients of the Nobel Prize for medicine 38 (we'll discuss Baltimore in more detail later in this chapter).
The other shining light of modern medicine, German doctor Robert Koch (1843 - 1910), was also an enterprising swindler. At the "10th International Medical Congress" in Berlin in 1890, the microbe hunter "with the oversized ego"39 pronounced that he had developed a miracle substance against tuberculosis.40 And in the German Weekly Medical Journal (Deutsche Medizinische Wochenzeitschrift), Koch even claimed his tests on guinea pigs had proved that it was possible "to bring the disease completely to a halt without damaging the body in other ways."41
The reaction of the world-at-large to this alleged miracle drug "Tuberkulin" was at first so overwhelming that in Berlin, Koch's domain, sanatoria sprang up like mushrooms.42 Sick people from all over the world turned the German capital into a sort of pilgrimage site.43 But soon enough, Tuberkulin was found to be a catastrophic failure. Long-term cures did not emerge, and instead one hearse after another drove up to the sanatoria. And newspapers such as the New Year's edition of the satirical Der wahre Jakob (The Real McCoy) jeered: "Herr Professor Koch! Would you like to reveal a remedy for dizziness bacteria!"44
In the style of Pasteur, Koch had also kept the contents of his alleged miracle substance strictly confidential at first. But as death rates soared, a closer inspection of the drug's properties revealed that Tuberkulin was nothing more than a bacillus culture killed off by heat; even with the best of intentions, no one could have assumed that it would have helped tuberculosis patients suffering from severe illness. On the contrary, all individuals-be it the test patients or the ones who were given it later as an alleged cure-experienced dramatic adverse reactions: chills, high fever, or death. 45
Finally, Koch's critics, including another medical authority of that time, Rudolf Virchow, succeeded in proving that Tuberkulin could not stop tuberculosis. Rather, it was feared, according to the later scathing criticisms, that it made the disease's progress even worse. Authorities demanded that Koch bring forth evidence for his famous guinea pig tests-but he could not.46
Experts such as historian Christoph Gradmann of Heidelberg say that Koch had "cleverly staged" the Tuberkulin launch. Everything seemed to have been planned well in advance. In late October 1890, during the first wave of Tuberkulin euphoria, Koch had taken leave of his hygiene professorship. In confidential letters, he requested his own institute-modeled on the Institut Pasteur in Paris-from the Prussian state in order to be able to research his Tuberkulin extensively.
Professor Koch calculated the expected profit, on the basis of a daily production of 500 portions of Tuberkulin, at 4.5 million marks annually. On the reliability of his prognosis, he dryly observed: "Out of a million people, one can reckon, on average, with 6,000 to 8,000 who suffer from pulmonary tuberculosis. In a country with a population of 30 million, then, there are at least 180,000 phthisical (tubercular) people." Koch's announcement in the German Weekly Medical Journal (Deutsche Medizinische Wochenzeitschrift) appeared simultaneously with excessively positive field reports by his confidantes, which, according to Gradmann, served "for the verification of Tuberkulin just as much as for its propaganda."47
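As a rough sanity check (not from the source itself), Koch's quoted figures can be verified with simple arithmetic; the year-round production schedule and the derived per-portion price are our assumptions, since the source states only the daily output and the annual total:

```python
# Checking the arithmetic behind Koch's Tuberkulin profit projection.

# Koch's prevalence estimate: 6,000-8,000 pulmonary tuberculosis
# sufferers per million inhabitants, in a country of 30 million.
per_million_low = 6_000
population_millions = 30
print(per_million_low * population_millions)  # 180000, matching his "at least 180,000"

# Implied price per portion, assuming production every day of the year
# (an assumption; the source quotes no price).
annual_revenue_marks = 4_500_000
daily_portions = 500
price_per_portion = annual_revenue_marks / (daily_portions * 365)
print(round(price_per_portion, 2))  # roughly 24.66 marks per portion
```

The lower bound of the prevalence range reproduces Koch's 180,000 exactly, which suggests the "at least" phrasing was deliberate salesmanship: the upper bound would have given 240,000.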
Scurvy, Beriberi and Pellagra:
The Microbe Hunters' Many Defeats
At the end of the 19th century, when Pasteur and Koch became celebrities, the general public had hardly a chance to brace itself against microbe propaganda. Medical authorities, who adhered to the microbes = lethal enemies theory, and the rising pharmaceutical industry already had the reins of power and public opinion firmly in their hands. With this, the course was set for the establishment of clinical studies using laboratory animals, with the goal of developing (alleged) miracle pills against very specific diseases.
The scheme was so effective that even a substance like Tuberkulin, which caused such a fatal disaster, was highly profitable. Koch never even admitted that his Tuberkulin had been a failure. And Hoechst, a dye factory looking for a cheap entry into pharmaceutical research, got into Tuberkulin manufacturing. Koch's student Arnold Libbertz was to supervise production in close cooperation with Koch's institute-and so the rising pharmaceutical industry was decisively spurred on.48
From this point on, scientists tried to squeeze virtually everything into the model "one disease-one cause (pathogen)-one miracle cure," something that prompted one failure after another. For example, for a long time, mainstream medicine spiritedly asserted that diseases like scurvy (seamen's disease), pellagra (rough skin), or beriberi (miners' and prisoners' disease) were caused by germs-until the orthodoxy ultimately, with gritted teeth, admitted that vitamin deficiency is the true cause.
With beriberi, for instance, it was decades before the dispute over what caused the degenerative neural disease took its decisive turn when vitamin B1 (thiamine) was isolated in 1911, a vitamin that was absent in refined foods like white rice. Robert R. Williams, one of the discoverers of thiamine, noted that, through the work of Koch and Pasteur, "all young physicians were so imbued with the idea of infection as the cause of disease that it presently came to be accepted as almost axiomatic that disease could have no other cause [than microbes]. The preoccupation of physicians with infection as a cause of disease was doubtless responsible for many digressions from attention to food as the causal factor of beriberi."49
Hippocrates, von Pettenkofer, Bircher-Benner: The Wisdom of the Body
The idea that certain microbes-above all fungi, bacteria and viruses-are our great opponents in battle, causing certain diseases that must be fought with special chemical bombs, has buried itself deep in the collective consciousness. But a dig through history reveals that the Western world has only been dominated by the medical dogma of "one disease, one cause, one miracle pill" since the end of the 19th century, with the emergence of the pharmaceutical industry. Prior to that, we had a very different mindset, and even today, there are still traces everywhere of this different consciousness.50
"Since the time of the ancient Greeks, people did not 'catch' a disease, they slipped into it. To catch something meant that there was something to catch, and until the germ theory of disease became accepted, there was nothing to catch," writes previously mentioned biology professor Edward Golub in his work, The Limits of Medicine: How Science Shapes Our Hope for the Cure. 51 Hippocrates, who is said to have lived around 400 B.C., and Galen (one of the most significant physicians of his day; born in 130 A.D.), represented the view that an individual was, for the most part, in the driver's seat in terms of maintaining health with appropriate behavior and lifestyle choices.
"Most disease [according to ancient philosophy] was due to deviation from a good life," says Golub. " [And when diseases occur] they could most often be set aright by changes in diet-[which] shows dramatically how 1,500 years after Hippocrates and 950 years after Galen, the concepts of health and disease, and the medicines of Europe, had not changed" far into the 19th century. 52
Even into the 1850s, the idea that diseases are contagious found hardly any support in medical and scientific circles. One of the most significant medical authorities of the time was the German Max von Pettenkofer (1818 - 1901), who tried to comprehend things as wholes, and so incorporated various factors into his considerations about the onset of diseases, including individual behavior and social conditions. To von Pettenkofer, the microbe-theoreticians' oversimplified, monocausal hypothesis seemed naive, something that turned him into a proper "anti-contagionist."53 In view of the then-emerging division of medicine into many separate specialized disciplines, the scientist, later appointed rector of the University of Munich, jeered: "Bacteriologists are people who don't look further than their steam boilers, incubators and microscopes."54
And so it was also von Pettenkofer who at this time directed the discussion on the treatment of cholera, a disease so typical of rising industrial nations in the 19th century. He held the same position that the famous doctor Francois Magendie (1783 - 1855) had adopted back in 1831, when he reported to the French Academy of Sciences that cholera was neither imported nor contagious, but rather was caused by excessive dirt as a result of catastrophic living conditions.55 Correspondingly, the poorest quarters in centers like London were, as a rule, also the ones most afflicted by cholera.56
Von Pettenkofer identified drinking water as the main cause. There were no treatment plants in those days, so water was often so visibly and severely contaminated with industrial chemicals and human excrement that people regularly complained about its stink and discoloration. Studies also showed that households with access to clean water had few to no cholera cases at all. 57 Although von Pettenkofer certainly didn't deny the presence of microbes in this cesspool, he argued that these organisms could contribute to the disease's course, but only when the biological terrain was primed so they could thrive. 58
Unfortunately, von Pettenkofer's authority ultimately could not prevent adherents of the microbe theory from taking the matter into their own hands at the end of the 19th century, and they squeezed cholera into their narrow explanatory concept as well. So a microbe (in this case the bacterium Vibrio cholerae or its excretions) was branded as the sole culprit-and Pasteurian microbe theory was falsely decorated for having repelled cholera. Golub was left shouting into the void: "Why does Pasteur get the credit for that which the sanitation movement and public health were primarily responsible?"59
The 1,500-year history of a holistic view of health and disease was much too connected with life and its monstrous complexities to disappear altogether overnight. Yet, it virtually disappeared from the collective consciousness.
Geneticist Barbara McClintock was of the opinion that the concepts that have since posed as sound science cannot sufficiently describe the enormous multi-layered complexities of all forms of natural life, and with that, their secrets. Organisms, according to the Nobel Prize winner for medicine, lead their own lives and comply with an order that can only be partially fathomed by science. No model that we conceive of can even rudimentarily do justice to these organisms' incredible capability to find ways and means of securing their own survival. 60
By the beginning of the 1970s, Nobel laureate for medicine Sir Frank Macfarlane Burnet had also become very skeptical about "the 'usefulness' of molecular biology, [especially because of] the impossible complexity of living structure and particularly of the informational machinery of the cell. [Certainly, molecular biologists are] rightly proud of their achievements and equally rightly feel that they have won the right to go on with their research. But their money comes from politicians, bankers, foundations, who are not capable of recognizing the nature of a scientist's attitude to science and who still feel, as I felt myself 30 years ago, that medical research is concerned only in preventing or curing human disease. So our scientists say what is expected of them, their grants are renewed and both sides are uneasily aware that it has all been a dishonest piece of play-acting-but then most public functions are."61
Certainly not all doctors have clamored for roles on the medical industrial stage, and some were key players in keeping the holistic health viewpoint alive. Swiss doctor Maximilian Bircher-Benner (1867 - 1939) directed his attention to the advantages of nutrition after successfully treating his own jaundice, as well as a patient suffering from severe gastric problems, with a raw foods diet. In 1891, long before the significance of vitamins and dietary fiber to the human body had been recognized, Bircher-Benner took over a small city practice in Zurich, where he developed his nutritional therapy based on a raw foods diet.
By 1897, only a few years later, the practice had grown into a small private clinic, where he also treated in-patients. There was strong interest in his vegetarian raw food diet from all over the world, so Bircher-Benner erected a four-story private sanatorium in 1904 called "Lebendige Kraft" (living force). Besides a raw foods diet, Bircher-Benner (whose name has been immortalized in Bircher-Muesli) promoted natural healing factors like sun-baths, pure water, exercise and psychological health.62 With this, he supported treatments that had become increasingly neglected with the appearance of machines and, particularly, pharmaceuticals: attention to the natural healing powers of the body and the body's cells, which possess their own sort of sensitivity and intelligence.63
Walter Cannon, professor of physiology at Harvard, also made holistic health his central theme, in his 1932 work The Wisdom of the Body. Here, he describes the concept of homeostasis, and underlines that occurrences in the body are connected with each other and self-regulating in an extremely complex way.64 "'Wisdom of the Body' is an attribute of living organisms," wrote Israeli medical researcher Gershom Zajicek in a 1999 issue of the journal Medical Hypotheses. "It directs growing plants toward sunshine, guides amoebas away from noxious agents, and determines the behavior of higher animals. The main task of the wisdom of the body is to maintain health, and improve its quality. The wisdom of the body has its own language and should be considered when examining patients."65
The words of biologist Gregory Bateson from 1970 are certainly still valid today: "[Walter] Cannon wrote a book on the Wisdom of the Body; but nobody has written a book on the wisdom of medical science, because that is precisely the thing it lacks."66
Clustering: How to Make an Epidemic Out of One Infected Patient
After World War II, diseases such as tuberculosis, measles, diphtheria or pneumonia no longer triggered mass fatalities in industrialized nations such as affluent America. This became a huge problem for institutions like the Centers for Disease Control (CDC), America's epidemic authority, as redundancy threatened.67 In 1949, a majority voted to eliminate the CDC completely.68 Instead of bowing out of a potentially very lucrative industry, the CDC went on an arduous search for viruses.69 But how do you find an epidemic where there isn't any? You do "clustering."
This involves a quick scan of your environment-hospitals, daycares, local bars, etc.-to locate one, two, or a few individuals with the same or similar symptoms. This is apparently completely sufficient for virus hunters to declare an impending epidemic. It doesn't matter if these individuals have never had contact with each other, or even that they've been ill at intervals of weeks or even months. Thus, clusters deliver no key clues and provide no actual proof of an existing or imminent microbial epidemic.
Even the fact that a few individuals present the same clinical picture does not necessarily mean that a virus is at work. It can mean all sorts of things, including that the afflicted individuals had the same unhealthy diet, or that they had to fight against the same unhealthy environmental conditions (chemical toxins, etc.). And even if an infectious germ is at work, it may be that only certain groups of people are susceptible to a certain ailment, while many other people who are likewise exposed to the microbe remain healthy.70
For this reason, epidemics rarely occur in affluent societies: these societies offer conditions (sufficient nutrition, clean drinking water, etc.) which allow many people to keep their immune systems so fit that microbes simply do not have a chance to multiply abnormally (although antibiotics are also massively deployed against bacteria, and people who overuse antibiotics and other drugs that affect the immune system are at even greater risk).
Just how ineffective clustering is in finding epidemics becomes evident, moreover, if we look more closely at cases where clustering has been used as a tool to sniff out (allegedly impending) epidemics. This happened with the search for the causes of scurvy, beriberi and pellagra at the beginning of the 20th century. But, as illustrated, it proved groundless to assume that these are infectious diseases with epidemic potential.
The best-known example in recent times is HIV/AIDS. At the beginning of the 1980s, a few doctors tried to construct a purely viral epidemic out of a few patients who had cultivated a drug-taking lifestyle that destroyed the immune system. We'll discuss how virus authorities manufactured this epidemic in Chapter 3. For now, we'll quote CDC officer Bruce Evatt, who admitted that the CDC went to the public with statements for which there was "almost no evidence. We did not have proof it was a contagious agent."71
Unfortunately, the world ignored all kinds of statements like this. So talk of the "AIDS virus" has since kept the world in epidemic fear and virus hunters are now the masters of the medical arena. Every cold, every seasonal influenza, hepatitis disease, or whatever other syndrome has become an inexhaustible source for epidemic hunters armed with their clustering methods to declare ever new epidemics that pose threats to the world.
In 1995, allegedly, "the microbe from hell came to England," according to media scientist Michael Tracey, who was then active in Great Britain and collected media headlines like "Killer Bug Ate My Face," "Flesh Bug Ate My Brother in 18 Hours," and "Flesh Eating Bug Killed My Mother in 20 Minutes." Tracey writes: "The Star was particularly subtle in its subsidiary headline, 'It starts with a sore throat but you can die within 24 hours.'" Yet the bacterium, known to the medical world as Streptococcus A, was anything but new. "Usually only a few people die from it each year," says Tracey. "In that year in England and Wales just 11 people. The chances of getting infected were infinitesimally small but that didn't bother the media at all. A classic example of bad journalism triggering a panic."72
In the same year, the US CDC sounded the alarm, warning insistently of an imminent Ebola virus pandemic. With the assistance of cluster methods, several fever cases in Kikwit, in the Democratic Republic of Congo, were separated out and declared an outbreak of the Ebola epidemic. In their addiction to sensation, the media reported worldwide that a deadly killer virus was about to leave its jungle lair and descend on Europe and the USA.73
Time magazine showed spectacular pictures of CDC "detectives" in spacesuits impermeable to germs and colorful photographs in which the dangerous pathogen could ostensibly be seen.74 The director of the UN AIDS program made the horror tangible by imagining: "It is theoretically possible that an infected person from Kikwit makes it to the capital, Kinshasa, climbs into a plane to New York, gets sick and then poses a risk to the USA." Within a month, however, Ebola was no longer a problem in Africa, and not one single case was ever reported in Europe or North America.75 And a publication in which the Ebola virus is characterized (with its genetic material and virus shell) and shown in an electron micrograph is still nowhere to be found.
Polio: Pesticides Such as DDT and Heavy Metals Under Suspicion
Practically all of the infectious illnesses that afflicted people in industrialized countries in the decades before World War II (tuberculosis etc.) ceased to cause problems after 1945. For a few years, the major exception was polio (infantile paralysis), which continues to be called an infectious disease. In the 1950s, the number of polio cases in developed countries fell drastically-and epidemic authorities attributed this success to their vaccination campaigns. But a look at the statistics reveals that the number of polio victims had already fallen drastically when vaccination activities started (see diagram 2).
Many pieces of evidence justify the suspicion that the cause of infantile paralysis (polio) is not a virus. Many experts, like American physician Benjamin Sandler, believe a decisive factor is a high consumption of refined foods such as granulated sugar.76 Others cite mass vaccinations. Indeed, since the beginning of the 20th century, it has been known that the paralysis so typical of polio has often appeared at the site where an injection was given.77 Additionally, the number of polio cases increased drastically after mass vaccinations against diphtheria and whooping cough in the 1940s, as documented in The Lancet and other publications.78 79 80
Polio, like most diseases, may depend on various factors. It makes particular sense, however, to consider poisoning by industrial and agricultural pollution, which would explain why this nervous disease first appeared in the 19th century, in the course of industrialization. It spread like wildfire through the industrialized West in the first half of the 20th century, while developing countries, in contrast, saw no outbreak.
In the 19th century, the disease was named poliomyelitis, referring to the degeneration of spinal cord nerves (myelitis is an inflammation of the spinal cord) typical of polio.81
Orthodox medical literature can offer no evidence that the poliovirus was anything other than benign until the first polio epidemic, which occurred in Sweden in 1887. This was 13 years after the invention of DDT in Germany (in 1874) and 14 years after the invention of the first mechanical crop sprayer, which was used to spray formulations of water, kerosene, soap and arsenic.
"The epidemic also occurred immediately following an unprecedented flurry of pesticide innovations," says Jim West of New York, who has extensively investigated the subject of polio and pesticides. "This is not to say that DDT was the actual cause of the first polio epidemic, as arsenic was then in widespread use and DDT is said to have been merely an academic exercise. However, DDT or any of several neurotoxic organochlorines already discovered could have caused the first polio epidemic if they had been used experimentally as a pesticide. DDT's absence from early literature is little assurance that it was not used."82
Nearly ten years before, in 1878, Alfred Vulpian, a neurologist, had provided experimental evidence for the poisoning thesis when he discovered that dogs poisoned by lead suffered from the same symptoms as human polio victims. In 1883, the Russian Miezeyeski Popow showed that the same paralysis could be produced with arsenic. These studies should have aroused the scientific community, considering that the arsenic-based pesticide Paris green had been widely used in agriculture to fight "pests" like caterpillars since 1870.83
"But instead of prohibiting the insecticide Paris green, it was replaced by the even more toxic pesticide lead arsenate, which likewise contained heavy metals, in the state of Massachusetts in 1892," according to a 2004 article in the British magazine The Ecologist.84 Indeed, a polio epidemic broke out in Massachusetts two years later. Dr. Charles Caverly, who was responsible for the tests, maintained that a toxin was the more likely culprit, stating that "we are very certainly not dealing with a contagious disease."
Within a short time, however, lead arsenate became the most important pesticide in the industrialized world's fruit cultivation. It was not the only toxic substance used in agricultural industries.85 In 1907, for example, calcium arsenate was introduced in Massachusetts 86 and was used in cotton fields and factories. Months later, 69 children who lived downstream from three cotton factories suddenly became sick and suffered from paralysis. Meanwhile, lead arsenate was also being sprayed on the fruit trees in their gardens.87 But microbe hunters ignored these legitimate "cluster" factors, and instead continued searching for a "responsible" virus.88
A cornerstone for the polio-as-virus theory was laid down in 1908 by scientists Karl Landsteiner and Erwin Popper, both working in Austria.89 90 The World Health Organization calls their experiments one of the "milestones in the obliteration of polio."91 That year, another polio epidemic occurred, and once again there was clear evidence that toxic pesticides were at play. But, astoundingly, instead of following up this evidence, medical authorities viewed the pesticides as weapons in the battle against the arch-enemy microbes. They even neglected to give the children suffering from lameness treatments to alleviate pesticide poisoning and thus establish whether their health could be improved this way.92 (In 1951, Irwin Eskwith did exactly that and succeeded in curing a child suffering from cranial nerve damage (bulbar paralysis, a particularly severe form of polio)93 with dimercaprol, a detoxification substance that binds heavy metals like arsenic and lead.)94 95 96
Landsteiner and Popper instead chose to take a diseased piece of spinal marrow from a lame nine-year-old boy, chopped it up, dissolved it in water and injected one or two whole cups of it intraperitoneally (into the abdominal cavities) of two test monkeys: one died and the other became permanently paralyzed.97 98 Their studies were plagued by a mind-boggling range of basic problems. First, the "glop" they poured into the animals was not even infectious, since paralysis did not appear in the monkeys and guinea pigs given the alleged "virus soup" to drink, or in those that had it injected into their extremities.99 Shortly after, researchers Simon Flexner and Paul Lewis experimented with a comparable mixture, injecting it into monkeys' brains.100 Next, they brewed a new soup from the brains of these monkeys and put the mix into another monkey's head. This monkey did indeed become ill. In 1911, Flexner even boasted in a press release that they had already found out how polio could be prevented, adding, of course, that they were close to developing a cure.101
But this experiment offers no proof of a viral infection. The glop used cannot be termed an isolated virus, even with all the will in the world. Nobody could have seen any virus, as the electron microscope wasn't invented until 1931. Also, Flexner and Lewis did not disclose the ingredients of their "injection soup." As late as 1948, it was still unknown "how the polio virus invades humans," as expert John Paul of Yale University stated at an international poliomyelitis congress in New York City.102
Apart from that, it is very probable that the injection of foreign tissue into the monkeys' craniums triggered their polio-like symptoms (see Chapter 5: BSE). And when one considers the amount of injected material, it can hardly be surprising that the animals became ill. Controlled trials weren't even carried out; that is, the researchers neglected to inject a control group of monkeys with healthy spinal cord tissue. Nor did they test the effects of chemical toxins like heavy metals injected directly into the brain.103 104 All of these factors make the experiments virtually worthless.
Although many scientific factors spoke against the possibility that polio was an infectious viral disease,105 these studies would become the starting point of a decades-long fight, which concentrated exclusively on an imaginary polio virus.106 Anything and everything, including brain parts, feces, and even flies, was injected into monkeys' brains in an attempt to establish a viral connection. Later, monkeys were even captured en masse in the Indian wilderness and transported overseas to the experimental laboratories, with the single aim of producing paralysis. And where virus hunters were working, vaccine manufacturers were not far away.
By the end of the 1930s, vaccine researchers had allegedly discovered a whole range of virus isolates. But these could not have been real isolates. And another problem cropped up along the way: the monkeys didn't get sick when they were orally administered the "glop." These researchers could only produce paralysis by injecting large amounts of substrates of unknown content into the brain.107 In 1941, the polio virus hunters had to accept a bitter setback, when experts reported in the scientific journal Archives of Pediatrics that "human poliomyelitis has not been shown conclusively to be a contagious disease. Neither has the experimental animal disease, produced by the so-called poliomyelitis virus, been shown to be communicable. In 1921, Rosenau stated that monkeys have so far never been known to contract the disease 'spontaneously' even though they are kept in intimate association with infected monkeys."108 This means that if this was not an infectious disease, no virus could be responsible for it, and the search for a vaccine was a pointless venture.
But virus hunters didn't even consider factors that lay outside of their virus obsession. So it happened that, in the middle of the 20th century, researcher Jonas Salk believed he had conclusively found the polio virus.109 Even though he could not prove that what he called the polio virus actually triggered polio in humans, he still somehow believed he could produce a vaccine from it.110
Salk alone is said to have sacrificed 17,000 test monkeys (termed "the heroes" by one of Salk's co-workers) on the altar of vaccine research during the most heated phase of his research;111 in total, the number of slaughtered monkeys reached into the hundreds of thousands.112 But critics objected that what Salk termed the polio virus was simply an "artificial product of the laboratory."113 Indeed, to this day, it remains a huge challenge to find what is termed the polio virus where the patient's nerve cells are damaged, that is to say, in spinal cord tissue.114
In 1954, Bernice Eddy, who was then responsible for the US government's vaccine safety tests, also reported that the Salk vaccine had caused severe paralysis in test monkeys. Eddy was not sure what had triggered the paralysis symptoms: a virus, some other cellular debris, a chemical toxin? But the vaccine contained something that could kill. She photographed the monkeys and submitted the pictures to her boss, but he turned her down and criticized her for creating panic. Instead, of course, he should have taken her misgivings into account and started extensive inquiries. But Eddy was stopped by the microbe establishment and even had to give up her polio research shortly before her warnings proved justified.115
On 12 April 1955, Salk's vaccine was celebrated nationwide as a substance that completely protected against polio outbreaks. US President Dwight Eisenhower awarded Salk a Congressional Gold Medal. American and Canadian television joined in the celebration. And on 16 April, the Manchester Guardian followed suit, stating that "nothing short of the overthrow of the Communist regime in the Soviet Union could bring such rejoicing to the hearths and homes in America as the historic announcement last Tuesday that the 166-year war against paralytic poliomyelitis is almost certainly at an end."117
But the triumph was short-lived. Medical historian Beddow Bayly wrote that "Only thirteen days after the vaccine had been acclaimed by the whole of the American Press and Radio as one of the greatest medical discoveries of the century, and two days after the English Minister of Health had announced he would go right ahead with the manufacture of the vaccine, came the first news of disaster. Children inoculated with one brand of vaccine had developed poliomyelitis. In the following days more and more cases were reported, some of them after inoculation with other brands of the vaccine." According to Bayly, "Then came another, and wholly unlooked for complication. The Denver Medical Officer, Dr. Florio announced the development of what he called 'satellite' polio, that is, cases of the disease in the parents or other close contacts of children who had been inoculated and after a few days illness in hospital, had returned home and communicated the disease to others, although not suffering from it themselves."118
Within only two weeks, the number of polio cases among vaccinated children had climbed to nearly 200.119 On 6 May 1955, the News Chronicle quoted the US government's highest authority on viruses, Carl Eklund, who said that in the country only vaccinated children had been afflicted by polio, and only, in fact, in areas where no polio cases had been reported for a good three-quarters of a year. In nine out of ten cases, the paralysis appeared in the injected arm.120
This triggered panic in the White House. On 8 May, the American government completely halted production of the vaccine.121 A short time later, a further 2,000 polio cases were reported in Boston, where thousands had been vaccinated. In "inoculated" New York, the number of cases doubled; in Rhode Island and Wisconsin, they jumped by 500%. And here as well, the lameness appeared in the inoculated arm of many children.122
Apart from that, an objective look at statistics would have shown that there was no reason to celebrate Salk's vaccine as the great conqueror of an alleged polio virus. "According to international mortality statistics, from 1923 to 1953, before the Salk killed-virus vaccine was introduced, the polio death rate in the United States and England had already declined on its own by 47% and 55% respectively," writes scientific journalist Neil Miller.123
In the Philippines, only a few years before the US catastrophe, the first polio epidemic in the tropics occurred, coinciding, in fact, with the introduction of the insecticide DDT there.124 Around the end of World War II, US troops in the Philippines had sprayed masses of DDT daily to wipe out flies. Just two years later, the well-known Journal of the American Medical Association reported that lameness among soldiers stationed in the Philippines could not be differentiated from polio, and had advanced to become the second most common cause of death. Only combat exercises were said to have claimed more victims. Meanwhile, populations in neighboring areas, where the poison had not been sprayed, experienced no problems with paralysis.125 126 This is further evidence that DDT poisoning can cause the same clinical symptoms as polio (which is claimed to be caused by a virus).
Young people in industrialized countries are hardly acquainted with DDT anymore. It stands for dichlorodiphenyltrichloroethane, a highly toxic substance first synthesized in 1874 by Austrian chemist Othmar Zeidler. Paul Hermann Muller of Switzerland discovered its insect-killing property in 1939, for which he received the Nobel Prize for Medicine in 1948.127 This resulted in its widespread use for pest control, even though there was already strong evidence that it was a severe neurotoxin, dangerous for all forms of life: it is associated with the development of herpes zoster (shingles), produces paralysis, has carcinogenic potential and can be fatal.128 129 130 [See the insets "Public Health Aspects of the New Insecticides" and "The Poison Cause of Poliomyelitis and Obstructions to Its Investigation."]
DDT is also problematic because it biodegrades very slowly in nature, with a half-life of 10 to 20 years. Additionally, through the food chain, it can become concentrated in the fatty tissue of humans and animals. But this toxic substance wasn't outlawed until 1972 in the USA, and even later in most other countries of the prosperous northern hemisphere. Today, its use is prohibited in a large part of the world, and it is one of the "dirty dozen" organic toxins banned worldwide at the Stockholm Convention on 22 May 2001.131
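To make tangible what a 10-to-20-year half-life implies, here is a minimal sketch. It assumes simple first-order (exponential) decay, a textbook simplification of real environmental breakdown; the function name `fraction_remaining` is our own illustration, not taken from any cited source.

```python
def fraction_remaining(years, half_life_years):
    # First-order decay: after each half-life, half of the deposit remains.
    return 0.5 ** (years / half_life_years)

# A deposit sprayed in, say, 1950 would still be substantially present
# decades later, whichever end of the 10-20 year range applies:
for half_life in (10, 20):
    left = fraction_remaining(50, half_life)
    print(f"half-life {half_life} y: {left:.1%} still present after 50 years")
```

Even under the shorter 10-year assumption, several percent of the original deposit persists after half a century, which is why accumulation through the food chain matters.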
Industrial production of DDT started at the beginning of the 1940s. It was first used to fight malaria, and later became a sort of "all-purpose remedy" against all sorts of insects.132 There was also military use of DDT: US army recruits were powdered with it to protect them from lice, and they additionally received DDT-sprayed shirts.133 When the Second World War was over, DDT was sold on markets around the globe, even though strong warnings about its toxicity had been issued. "In the mid-40s, for example, the National Institutes of Health demonstrated that DDT evidently damaged the same part of the spinal cord as polio," writes research scientist Jim West of New York.134 135 136
The classic Harrison's Principles of Internal Medicine states, "Lameness resulting from heavy metal poisoning is clinically sometimes difficult to differentiate from polio."137 Endocrinologist Morton Biskind came to the same conclusion in his research papers describing the physiological evidence that DDT poisoning resembles polio physiology: "Particularly relevant to recent aspects of this problem are neglected studies by Lillie and his collaborators of the National Institutes of Health, published in 1944 and 1947 respectively, which showed that DDT may produce degeneration of the anterior horn cells of the spinal cord in animals. These changes do not occur regularly in exposed animals any more than they do in human beings, but they do appear often enough to be significant."138
Biskind concludes: "When in 1945 DDT was released for use by the general public in the United States and other countries, an impressive background of toxicological investigations had already shown beyond doubt that this compound was dangerous for all animal life from insects to mammals. "139
Despite the fact that DDT is highly toxic for all types of animals, the myth spread that it is harmless, even in very high doses. It was used in many households with a carefree lack of restraint, contaminating people's skin, their beds, kitchens and gardens.140 In Biskind's opinion, the spread of polio after the Second World War was caused "by the most intensive campaign of mass poisoning in known human history."141
Along with DDT, the much more poisonous DDE was also used in the USA. Both toxins are known to break through the hematoencephalic (blood-brain) barrier, which protects the brain from poisons and other harmful substances. Nonetheless, housewives were urged to spray both DDT and DDE to prevent the appearance of polio. Even the wallpaper in children's rooms was soaked in DDT before it was glued to the wall.142
What from today's perspective seems like total blindness was at that time an everyday practice, not only in the United States. After 1945, DDT powder was used in Germany to fight a type of louse said to carry typhus.143 And in agriculture, including fruit and vegetable cultivation, DDT was likewise lavishly dispersed for so-called plant protection. Through this, DDT gradually replaced its predecessor, lead arsenate, a pesticide containing heavy metals.144
A look at statistics shows that the polio epidemic in the USA reached its peak in 1952, and from then on rapidly declined. We have seen that this cannot be explained by the Salk inoculation, since it was first introduced in 1955. There is a most striking parallel between the development of polio and the use of the severe neurotoxin DDT and other highly toxic pesticides like BHC (lindane), which was also hard to degrade and actually much more poisonous than DDT. While use of DDT was eventually drastically reduced because of its extreme harmfulness, the use of BHC was curbed because it produced a bad taste in foods.145
"It is worth noting that DDT production rose dramatically in the United States after 1954," Jim West remarks, "which is primarily connected to the fact that DDT was increasingly exported to the Third World, to be used primarily in programs to fight malaria or in agriculture." As West points out, the following factors contributed to its changed use patterns in the US:
1. Altered legislation led to the use of warning labels, which in turn raised public awareness of DDT's poisonous nature.
2. Eventually, the use of DDT on dairy farms was prohibited. Earlier, Oswald Zimmerman and his fellow research scientists had even advised the daily spraying of a 5% DDT solution directly on cattle and pigs, their feed, drinking water, and resting places.146 In 1950, it was officially recommended to US farmers that they no longer wash cattle with DDT, but at first this advice was largely ignored. In the same year, cows' milk contained up to twice as much DDT as is necessary to trigger serious illness in humans.147
3. In advertisements and press releases, DDT was no longer celebrated as being "good for you," "harmless," and a "miracle substance."148
4. From 1954, concentrated DDT was only used on crops that did not serve food production (for example, cotton).
5. DDT was used with more caution, something that caused decreased human intake of the poison through foodstuffs.
6. The use of DDT was extended to nationally sponsored forestry programs, so, for instance, entire forests were sprayed with it by airplane.
7. DDT was gradually replaced by allegedly "safe" pesticides in the form of organophosphates like malathion. But their uncertain toxicological effects and the new pesticide laws merely changed the type of neurological damage, from acute paralysis to less-paralytic forms: chronic, slow-developing diseases that were difficult to define. This made it particularly difficult to prove, in legal disputes or studies, that these pesticides contributed to or directly caused the illnesses in question (see also Chapter 5, section: "BSE as an Effect of Chemical Poisoning" for more on the organophosphate phosmet). [Diagram 3: Polio cases and DDT production in the USA, 1940-1970. Diagram 4: Polio cases and pesticide production in the USA, 1940-1970.]
Finally, in 1962, US biologist Rachel Carson published her book Silent Spring, in which she gives a vivid account of the fatal repercussions of extensive spraying of plant toxins on insects and particularly on birds, and predicts the consequence of a "silent spring" (one without songbirds). Through this, the public was made aware of the dangers of DDT. But public reaction was slow, because 800 chemical companies reacted hysterically to Carson's book, prophesying hunger and destruction if farmers were no longer permitted to use any pesticides. "The goal was very obviously to create panic and drive farmers into the arms of the chemical industry," as Pete Daniel, expert on the history of pesticides, writes in his 2005 book, Toxic Drift.149
In 1964, a North Carolina turkey breeder named Kenneth Lynch wrote to the Ministry of Health, stating that, since 1957, his home town of Summerville had been enveloped every summer in a mist of DDT or malathion (an insecticide which can have wide-ranging neurotoxic and fatal effects)150 in order to kill mosquitoes. Over the previous years, his turkeys had "more or less abruptly developed advanced paralyses and, even though they had originally been in good health, died within two or three days."
At the same time, the fertility of the eggs had declined from 75% to 10%. "The evidence clearly indicated that the fog of insecticide is to blame," writes Lynch. With the help of a chemistry professor, he turned to the Public Health Service (PHS) and suggested carrying out corresponding studies. The national authorities, however, showed no interest whatsoever. "It seems to me that the ministry's behavior can hardly be interpreted as anything other than a case of bureaucracy being blinded by its own past mistakes," opined Clarence Cottam, a biologist honored by the National Wildlife Federation as a protector of nature.151 152
In their refusal, political decision-makers and the chemical industry's lobbyists153 referred primarily to the "prisoner studies" of PHS scientist Wayland Hayes.154 In these experiments on prisoners, Hayes had aimed to show that it was completely harmless to ingest 35 milligrams of DDT per day.155 But critics like Cottam objected that every test subject could withdraw from the experiments at any time, and indeed, "there were a fair number who withdrew when they became a bit ill."
Since a number of prisoner test patients dropped out of the study, data on adverse effects were largely eliminated, so the study's results were worthless. Cottam points out that Hayes had most likely engaged in researcher bias to substantiate his initial views on pesticides: "Perhaps he is like many human beings who when subjected to criticism become more and more dogmatic in maintaining their initial stand."
Pesticide historian Pete Daniel goes a step further in saying that "the officials in charge knew better, but the bureaucratic imperative to protect pesticides led the division into territory alien to honesty."156
It would be years before the US government held a hearing on DDT, and even longer until they finally prohibited it in 1972. Unfortunately, the government discussions were not widely reported, so the general public remained unaware of the connection between polio (in humans!) and pesticides, and other non-viral factors. To achieve this at the beginning of the 1950s, ten years before Carson's Silent Spring, someone would have had to write a bestseller describing the repercussions of DDT (and other toxins) in humans. Unfortunately, this was not the case; it was not done until this book, Virus Mania, was published.
"Carson's book was good, but it was restricted to the damage to animals, whereas one looks in vain for descriptions of statistical trends or analyses in the work," says Jim West. "Even the research scientists Biskind and Scobey, who had clearly described the damage that DDT causes in humans, were practically unmentioned by Carson. Now who knows what kind of editorial censoring process her book had to go through before its publication."
West points out that this type of censorship became the norm in future virus research: "One needs only consider that her work had been financed by the Rockefeller Foundation. This makes one sit up and take notice, for the Rockefeller Foundation has supported the significant orthodox epidemic programs, including the HIV = AIDS research and numerous vaccination programs. And the great-grandfather Rockefeller had made his money by selling snake venom and pure mineral oil as a universal cure. Carson's book prompted public outcry, which contributed to DDT's ultimate prohibition. But this was a deceptive victory, which only helped to secure the public belief that democratic regulative mechanisms still functioned effectively. In actual fact, the chemical industry (because the public thought the poisonous demon had been defeated) was able to establish its likewise highly toxic organophosphates on the market without a problem. And, fatally, nobody discussed its important central topic: that poisons like DDT could cause severe damage like polio."
Gajdusek's "Slow Virus": Infinite Leeway for Explanations
The virus hunters still had many weapons to pull from their box of tricks, such as the concept of the "slow virus": a virus capable of "sleeping" in a cell for years before striking with its pathogenic or fatal effects. The claim that a disease takes a very long time (decades) to "break out" gained popularity in the 1960s, when virus hunters convinced the medical establishment that the virus concept could even be imposed on cancer,157 158 a disease that generally appears only after years or decades.159
But despite a most arduous search, researchers were simply unable to find any active viruses in tumors. The disappointment and frustration were correspondingly great.160 But a new theory was soon developed: that a virus could provoke an infection, then lie dormant in a cell for as long as it wanted, and finally, at some point, trigger cancer, even when the virus is no longer present. Just as with polio earlier, the nucleic acids of a so-called slow virus have never been isolated and the particles have never been imaged with an electron microscope,161 but the virus hunters embraced this suspect theory and adapted it to a number of modern ailments.162
Scientist Carleton Gajdusek prodded the slow virus concept along; it would later serve as an explanatory model for HIV/AIDS as well.163 In the 1970s in Papua New Guinea, Gajdusek researched a sponge-like alteration in brain tissue associated with dementia, which was predominantly spread among the female population there.164 The disease, called kuru, was only observed in two clans; they often intermarried and, according to Gajdusek, maintained a cult-of-the-dead ritual that involved eating the brains of their deceased (something which was later revealed as a myth).
These transmissible spongiform encephalopathies (softenings of the brain), as they are called, appear sporadically and mostly end fatally within five years. They are generally extremely rare (approximately one case per million people), but appear within some families with a frequency of 1 in 50, which could point to a genetic cause.165 Despite this, Gajdusek received the Nobel Prize in 1976 for his slow virus concept. With this endorsement, his idea that this sponge-like alteration in brain tissue was produced and transmitted by a pathogen achieved widespread acceptance as fact.
A close look at Gajdusek's trials on apes, with which he aimed to show transmissibility, should have shocked the scientific community into disbelief. But instead, they recognized these papers as proof of transmissibility and ignored the fact that neither feeding the apes brain mush nor injecting them with it had any effect on the chimpanzees. So Gajdusek conducted a bizarre experiment in order to finally induce neural symptoms in the test animals.
He ground up the brain of a kuru patient into a mush full of proteins, along with a number of other substances, and poured this into the living apes by drilling holes into their skulls. This so-called disease's alleged transmissibility was founded only upon these experiments.166 How could they possibly provide proof of Gajdusek's cannibalism hypothesis? Particularly since the hypothesis holds that the disease appeared in humans through the ingestion of infected brains, not through direct surgical insertion into the brain.
To compound matters, Gajdusek was the only living witness of cannibalism on Papua New Guinea. He reported on these cannibalistic rites in his 1976 Nobel Prize lecture, even documenting them with photographs. But in the mid-1980s, it was discovered that Gajdusek's photos, with which he aimed to document the cannibalism, actually showed pig flesh, not human flesh. An anthropological team looked into this claim; they did find stories of cannibalism, but no authentic cases.167
Gajdusek later had to admit that neither he himself nor others he met had seen the cannibalistic rites.168 Roland Scholz, a Munich-based professor of biochemistry and cellular biology, responded to this revelation by saying that "the scientific world seems to have been taken in by a myth."169
After World War II: Visible Proof of Viruses? We Don't Need That!
Modern viral research is like Bigfoot hunting. Trackers of this legendary ape-like beast (also called Sasquatch and the Abominable Snowman) trot out the occasional questionable blurry photograph and footprint to claim proof of Bigfoot's existence. Based on this suspect data, they say the beast is up to ten feet tall and 440 pounds, with 17-inch footprints that have even been made into plaster casts to prove its existence.170 Virus hunters also collect dubious data, claiming to have images of the virus, even though electron micrographs of viruses, accompanied by an analysis of their complete genetic material and virus shell, are the only way of proving a virus's existence.
Bigfoot hunting, like virus hunting, is a splendid moneymaker. Along a strip of California's Highway 101, numerous shops hawk Bigfoot souvenirs,171 and they are popular with tourists even though it is generally accepted that Bigfoot is an invention.172 Of course, Bigfoot is nowhere near as lucrative as the international virus industry's multi-billion-dollar business.
We must stress here that electron microscopy is fundamental to virus identification. For a long time, establishing unequivocal proof of a virus meant seeing is believing, as is the case with bacteria and fungi. The one difference is that bacteria and fungi can be seen with a light microscope, whereas viruses are so tiny that only an electron microscope (first patented in 1931) enables detailed imaging to make them visible.
But, first you have to identify exactly what you're looking at, so these particles (possible viruses) must exist in a pure or purified form, in order to be able to differentiate virus particles from virus-like ones. At the beginning of the 1950s, virologists agreed that this was necessary, since, under certain conditions, even healthy cells produce a whole range of particles that could look like so-called tumor viruses (oncoviruses).173 174
The importance of this process was confirmed at an international meeting of the Pasteur Institute in 1972,175 176 and "endured into the early 1980s," according to Val Turner, a physician and member of the Perth Group, an Australian research team.177 "Viruses are not naked bits of RNA (or DNA). They are particles with particular sizes and shapes and other identifying features, which are obliged to replicate at the behest of living cells. They won't multiply in dead meat like bacteria. So there you have it. This predicates experiments to prove particles are a virus and that hasn't changed in a thousand years and certainly not since the 90s."
Turner uses easy-to-grasp language to describe the science: "Think of it like a paternity suit in which DNA evidence will be used and the accused is HIV and the child is a human. The crux of the case is proof that the DNA you found in the human is the same DNA you found in the accused. For the latter, you have to have rock solid proof the DNA came from the accused. Given that in cell cultures all sorts of particles appear, only some of which are viruses, you have to prove that (a) a particular particle is a virus; and (b) your DNA comes from that particle. How can you prove (a) without using electron microscopy (for many reasons) and without purification? You tell me.
Frankly we from the Perth Group do not understand this obsession with 'old data' or 'science moves on.' Has Archimedes' principle* 'moved on'? Do solid objects no longer displace their own volume of liquids? If everything has to be 'up to date' then in ten years nothing that is up to date now will be up to date then. Which means as long as time keeps going nothing will be right."178 This goes for orthodox theories as well!

* Archimedes' principle states that a body immersed in a fluid is buoyed up by a force equal to the weight of the displaced fluid. The principle applies to both floating and submerged bodies and to all fluids, i.e., liquids and gases.
By soundly characterizing virus structure (virus purification), it is theoretically possible to irrefutably differentiate viruses themselves from virus-like particles. If this has taken place, the next step would be to get an electron micrograph of the purified virus (of course, proof that a virus exists does not automatically mean that this virus is also infectious, as had already been established in 1960, at a conference sponsored by the New York Academy of Sciences).179 But this procedure is rarely carried out in modern viral research. Viruses that purportedly threaten to wipe out humanity (H5N1, SARS virus, etc.) have evidently never been seen by anyone.180
"Around 1960, before contemporary molecular biology arose, electron microscopy was held to be the best way of identifying viruses in cell cultures," writes pathology professor Etienne de Harven, a pioneer in electron microscopy and virology. De Harven's research career includes 25 years at the Sloan-Kettering Institute in New York, a private cancer research center founded in 1945, which quickly advanced to become the largest of its kind in the USA.181 "For this reason, laboratories all over the world directed their efforts at this time towards observing particles in cancer cells with ever-improved methods of electron microscopy." In 1962, the central role of electron microscopy was also recognized at the well-known Cold Spring Harbor Conference. Andre Lwoff, who would receive the Nobel Prize for medicine three years later, was among those who designated electron microscopy as likely the most efficient method of proving viruses' existence; he suggested investigating viruses with this procedure and dividing them into classes.182
A focus of medical science then (as now) was cancer. And because cancer researchers had the fixed idea that viruses were definitely cancer triggers,183 they spent a lot of time trying to prove the presence of viruses in human cancer cells with the help of electron microscopy. But these efforts were unsuccessful. "One only found virus-like particles from time to time, while viruses of a certain type could never convincingly be seen," reports de Harven.184
Virus hunters were, once again, crushed by this scientific news. But the scientific world tends not to publicize negative results whenever possible; in scientific language, this is called "publication bias."185 Yet, whether the research claims promoted as evidence involve new patented drugs said to be superior to existing (cheaper) ones, or genetic markers of disease (interpreted as "risk" factors), or statistical relationships, whether the claims are spurious or confirmed by clinical trials can only be ascertained by making the full body of controlled studies publicly available.
In medicine, failure to do so casts doubt on the safety and efficacy of treatments, as well as undermining the integrity of the scientific literature. Scientific journals are supposed to protect the integrity of science-but they don't. As is the case with most deficient practices in medical research and practice, there is an unacknowledged financial motive. And why are scientists coy about publishing negative data? "In some cases," says Scott Kern of Johns Hopkins University and editor of the recently founded online Journal of Negative Observations in Genetic Oncology, "withholding them keeps rivals doing studies that rest on an erroneous premise, thus clearing the field for the team that knows that, say, gene A doesn't really cause disease B. Which goes to show that in scientific journals, no less than in supermarket tabloids, you can't believe everything you read-or shouldn't."186 187
As long ago as the 1960s, the established science community was coy about publishing negative data, but the cancer virus hunters' failures were so universal that it was simply inevitable that one article or another should leak out into medical publications. In 1959, the researcher Hagenaus reported in the journal Etude du Cancer on the difficulties of identifying any typical virus particles in a wide range of breast cancer samples.188 And in 1964, the scientists Bernhard and Leplus were unsuccessful, even with electron microscopy's assistance, in finding virus particles presumed to play a role in the development of Hodgkin's lymphoma (lymphatic cancer), lymphoid leukemia or metastases (tumors in various parts of the body).189
But these scientific studies didn't stop the virus hunters for a second. Instead of disengaging themselves from their virus tunnel vision, they grumbled about the methodology of virus determination: for example, over what are known as thin slices or thin-sections (tissue samples which are extremely precisely dissected and trimmed to size so they can be observed under the electron microscope). Thin sections had proved effective countless times, and had also worked perfectly with mice.190 But, the virus hunters needed a scapegoat and, instead of questioning the cancer-producing virus model, they started griping about the thin-sections. The production of the thin-sections was also thought to be too laborious and time consuming. And who had the time for that once pharmaceutical companies began offering fast cash for quick fixes?
So, scientists turned to the much simpler and faster dye method, in which certain particles of the sample (for instance, DNA and RNA) were marked in color and then imaged by electron micrograph. But from a purely scientific perspective, the results of the dye method were a disaster. Through the air-drying process that was necessary for the staining, the particles became totally deformed, so that they appeared as particles with long tails. They were full-blown artificial products of the laboratory, and they still looked exactly like so many other non-viral cellular components. This, logically, made it impossible to determine whether a virus or a non-viral particle had been found.191 192
A few scientists did in fact acknowledge that the dye method was dubious. But, instead of admitting defeat and returning to the thin-sections method, they began bashing electron microscopy technology! Other researchers were in turn so anxiously preoccupied with finally finding cancer viruses that they casually overlooked the worthlessness of the dye method's results, and theorized that the "tailed" particles were a certain type of virus. As absurd as this may sound to logical thinkers, virus hunters were even rewarded with plenty of research money for this.
As a result, even cow's milk and mother's milk were tested for the presence of "tailed" particles in the mad rush to prove that viruses could produce cancer.193 One well-known molecular biologist, Sol Spiegelman, even warned against breastfeeding in October 1971, and his message made for numerous lurid media headlines.194 These so-called scientists brushed aside the fact that, to date, not a single retrovirus has ever been isolated from breast cancer tissue (and probably not from human tumor tissue or blood plasma in general).195 Shortly thereafter, Spiegelman was quoted in Science saying, "one can't kick off fear mongering on this scale if one doesn't exactly know if a virus particle is the cause."196
But mainstream viral research drifted purposefully further away from the well-established viral proof model. Researchers latched on to Howard Temin's 197 and David Baltimore's 198 1970 description of the activity of the enzyme reverse transcriptase in connection with cancer viruses. Their research seemed so significant to the medical establishment that the two were awarded the Nobel Prize in 1975.199
What was so significant about this enzyme, a substance that, as a sort of catalyst, makes it possible for biochemical reactions to occur? To understand this, we must remember that, in the 1960s, scientists thought they had established that a few viruses did not possess any DNA (complete genetic information), but rather only RNA genes. This baffled the researchers, since they believed viruses without any DNA (only with RNA) were not able to multiply-until Temin and Baltimore delivered an explanation with the enzyme called reverse transcriptase. It, they said, can transform the RNA of RNA viruses (later called retroviruses because of this) into DNA, by which the viruses are then able to multiply (if RNA exists alone, the conditions for replication are not met).200
But there was so much enthusiasm about the discovery of reverse transcriptase that virus hunters rashly assumed that reverse transcriptase was something very typical of retroviruses. They proclaimed something like this: if we observe reverse transcriptase activity in our test tubes (in vitro), then we can be sure that a retrovirus is present as well (even if the virus's existence has never been proven, or reverse transcriptase's role hasn't been established, for instance, in the context of HIV).201 Thus, it was presumed that the (indirectly detected) presence of reverse transcriptase was sufficient to prove the existence of a retrovirus, and even a viral infection of the tested cells in vitro.
This dogma would now become fixed in the minds of mainstream researchers, and it opened the floodgates to allow indirect virus detection methods (known as surrogate markers) to take the place of direct detection procedures (virus purification and characterization, as well as electron micrographs).202
So, in 1983, in a paper printed in Science, researcher Luc Montagnier of the Institut Pasteur in Paris, later celebrated as the discoverer of HIV, asserted that his research team had found a new retrovirus (which would later be named HIV).203 This was claimed only after reverse transcriptase activity had been observed in the cell culture. But, once again, there was no scientific proof for this conclusion.
Eleven years before, in 1972, Temin and Baltimore had stated that "reverse transcriptase is a property that is innate to all cells and is not restricted to retroviruses."204 And even Francoise Barre-Sinoussi and Jean Claude Chermann, the most important co-authors of Montagnier's 1983 Science paper, concluded in 1973 that reverse transcriptase is not specific to retroviruses, but rather exists in all cells.205 In other words, if the enzyme (the surrogate marker) reverse transcriptase is found in laboratory cultures, one cannot conclude, as Luc Montagnier did, that a retrovirus, let alone a particular retrovirus, has been found.
Reverse transcriptase is no longer the most significant surrogate marker, by a long shot. Now the virus hunters are fixated on antibody tests, PCR viral load tests, and helper cell counts. But these tests raise new questions, given their striking weaknesses (see Chapter 3, "HIV Antibody Tests, PCR Viral Load Tests, CD4 Counts: As Informative as a Toss of a Coin"). This prompted 14 renowned virologists of the "old guard" to direct an appeal to the young high-technology-focused generation of researchers, which was published in Science in 2001:
"Modern methods like PCR, with which small genetic sequences are multiplied and detected, are marvelous [but they] tell little or nothing about how a virus multiplies, which animals carry it, how it makes people sick. It is like trying to say whether somebody has bad breath by looking at his fingerprint."206
No less remarkable, in this context, is an early 2006 article in the German Medical Journal (Deutsches Ärzteblatt) about a study by researchers who thought that, with the assistance of PCR, they had discovered new "exotic" bacteria. The article points out that "only genetic traces of the pathogen are detected [with the PCR]. From this, it cannot automatically be concluded that complete bacteria exist as well."207 208
The Virus Disaster of the 1970s and HIV as Salvation in the 1980s
Amid the overall virus mania, such critical thoughts quickly foundered. In the 70s, elite researchers were simply too busy channeling generous government aid into researching the possible connection between viruses and cancer. On 23 December 1971, US President Richard Nixon declared the "War on Cancer" at the behest of the medical establishment and, with this metaphor, carried the militant tradition of the monocausal medical doctrine, attached to the conception of viruses as the enemy, to the extreme. We had now become accustomed to talking about the "weapons," the "strategies," and the "arsenals" of cell-killing preparations, and weren't even taken aback when powerful people like Nixon called the new cancer war "a Christmas present for the people."209
To date, many hundreds of millions of dollars of research funds have been poured into this war (a good part of it paid by taxes)-and the results are staggering.210 Back in 1971, a cure for cancer and a preventive vaccine were promised by 1976-but both are still nowhere in sight.211 Incidentally, in the tradition of celebratory medicine, along with a trust that the public conscience and the media have short-term memory, the medical establishment rarely feels a need to keep its promises. "I am convinced that in the next decade or maybe later, we will have a medication that is just as effective against cancer ... as penicillin against bacterial infections," boasted Cornelius "Dusty" Rhoads as early as 1953. He had been leader of the US Army's Department for Chemical Warfare (the medical division of the US Chemical Warfare branch) during the Second World War, and was director of the Sloan-Kettering Institute for Cancer Research, founded in 1945.212
Death rates have meanwhile increased alongside skyrocketing research expenditures.213 Today in Germany, 220,000 people die annually from cancer; in the USA, it is almost 600,000. Even taking the aging of these populations into consideration, these numbers are staggering. For this reason, experts like George Miklos, one of the most renowned geneticists worldwide, have criticized mainstream cancer research in Nature Biotechnology as "fundamentally flawed" and equated it with "voodoo science."214
By the late 1970s, medical experts were lobbing damning critiques against mainstream cancer research. Medical scientists "had credited the retroviruses with every nasty thing-above all the triggering of cancer-and had to accept constant mockery and countless defeats," Der Spiegel pointed out in 1986.215
And the concept of viruses as the great trigger factors failed with other diseases besides cancer. One notorious example is the swine flu disaster of 1976. During a march, David Lewis, a young American recruit, collapsed. Epidemic experts swooped in with their "magic wand" of clustering in their hands and claimed that they had isolated a swine flu virus from his lung. At the behest of the medical establishment, and particularly the US Centers for Disease Control (CDC), US President Gerald Ford appeared on TV and urged all Americans to get vaccinated against an imminent deadly swine flu epidemic.216 Just like today's avian flu fear mongers, Ford used the great Spanish flu pandemic of 1918 to scare the public into action.
Approximately 50 million US citizens rushed to local health centers for injections of a substance hastily thrown on the market. It produced strong side effects in 20% to 40% of recipients, including paralysis and even death. Consequent damage claims climbed to $2.7 billion. In the end, CDC director David Spencer, who had even set up a swine flu "war room" to bolster public and media support, lost his job. The ultimate bitter irony was that there were no, or only very isolated, reports of swine flu.217
Consequently, at the end of the 1970s, the US National Institutes of Health (NIH) entered unsettled political waters-just like the CDC, which was extensively restructured at the beginning of the 1980s. As a result, at the CDC and NIH, the most powerful organizations in health politics and biomedical science, the great soul-searching began. To redeem themselves, a new "war" would, of course, be the best thing.
Despite perpetual setbacks, an "infectious disease" remained the most effective way to catch public attention and open government pockets. In fact, Red Cross officer Paul Cumming told the San Francisco Chronicle in 1994 that "the CDC increasingly needed a major epidemic" at the beginning of the 80s "to justify its existence."218 And the HIV/AIDS theory was a salvation for American epidemic authorities.
"All the old virus hunters from the National Cancer Institute put new signs on their doors and became AIDS researchers. [US President Ronald] Reagan sent up about a billion dollars just for starters," according to Kary Mullis, Nobel laureate for Chemistry. "And suddenly everybody who could claim to be any kind of medical scientist and who hadn't had anything much to do lately was fully employed. They still are."219
Among those who jumped over from cancer research to AIDS research, the best known is Robert Gallo. Along with Montagnier, Gallo is considered to be the discoverer of the "AIDS virus," enjoys worldwide fame, and has become a millionaire. In his previous life as a cancer researcher, on the other hand, he had almost lost his reputation after his viral hypotheses on diseases like leukemia imploded.220 "HIV didn't suddenly pop out of the rain forest or Haiti," writes Mullis. "It just popped into Bob Gallo's hands at a time when he needed a new career."221
AIDS: From Spare Tire to Multibillion-Dollar Business