
Risk and control of waterborne cryptosporidiosis

Joan B. Rose, Debra E. Huffman, Angela Gennaccaro
DOI: http://dx.doi.org/10.1111/j.1574-6976.2002.tb00604.x, pp. 113–123. First published online: 1 June 2002


Cryptosporidium remains at the forefront of studies on waterborne disease transmission and abatement. Land use patterns that contribute animal and human waste, precipitation (which shows a strong association with outbreaks), and community infrastructure and water treatment are now recognized as contributing factors in the potential for waterborne spread of the protozoan. Advances in detection methodologies, including the ability to genotype various strains of this organism, have shown that human wastes are often the source of contamination, and cell culture techniques have provided insight into the viability of oocyst populations. Current water treatment research has focused on UV and ozone disinfection as the most promising means of inactivating this protozoan pathogen.

  • Drinking water risk
  • Swimming risk
  • UV disinfection
  • Cell culture viability
  • Cryptosporidium

1 Introduction

Cryptosporidium parvum belongs to the phylum Apicomplexa (referred to as the Sporozoa, a group of some 5000 species), the class Coccidea and the family Cryptosporidiidae. It is an obligate enteric coccidian parasite that infects the gastrointestinal tract and is one of the most important enteric pathogens in both humans and animals. The parasite was first described in 1907 by Tyzzer [1], but it is only in the last two decades that there has been an explosion of published material on the biology, genomic characterization, taxonomy, transmission, detection and public health risks, particularly with regard to waterborne cryptosporidiosis [2–6].

Within the genus Cryptosporidium, there are currently 10 recognized species: Cryptosporidium baileyi and Cryptosporidium meleagridis found in birds, Cryptosporidium felis in cats, Cryptosporidium muris predominantly in mice, Cryptosporidium wrairi in guinea pigs, Cryptosporidium andersoni in cattle, Cryptosporidium nasorum in fish, Cryptosporidium serpentis in reptiles and Cryptosporidium saurophilum in skinks [2]. Cryptosporidium parvum or parvum-like infections have been reported in 152 species of mammals, including sea lions, polar bears and dugongs (a marine mammal similar to a manatee) [2]. The species of concern from both a medical and a veterinary perspective is C. parvum.

Cryptosporidium was first diagnosed in humans in 1976. Since that time, it has been well recognized as a cause of severe watery diarrhea lasting several days to a week [7]. C. parvum infections are found in the brush border membrane of the apical enterocytes of the small bowel, from the ileum to the colon, but may develop on any epithelial surface and have been reported in a number of different organs and tissues [7]. Both sexual and asexual replication occur in the life cycle, with the oocyst being the final product shed in the feces [8]. C. parvum oocysts are between 4 and 6 μm in diameter and contain four slender, fusiform sporozoites (the infective stage of the life cycle) that escape by excystation through a suture seam that opens along the oocyst wall after ingestion by a susceptible host [7].

Waterborne transmission of C. parvum remains one of the most prominent public health concerns worldwide. This review will focus on C. parvum, the risk of waterborne disease as a global issue and relatively new techniques including genotyping, cell culture, and disinfection for characterization and control.

1.1 Common global factors influencing the spread of waterborne cryptosporidiosis

The first waterborne outbreak ever documented occurred in the USA in 1985 [9]. However, it was not until 1994, following increased awareness and a number of other significant outbreaks, that the US Council of State and Territorial Epidemiologists recommended that the infection become a nationally notifiable disease [10]. This spurred the inclusion of cryptosporidiosis surveillance by several states in early 1995, and by 1998, 47 states had made cryptosporidiosis reportable. The mean 4-year incidence rate per 100 000 in the USA ranged from a high of 3 to a low of 0.9, with approximately 2903 cases reported per year [10]. Two states with high rates, Nebraska and Minnesota, had experienced waterborne outbreaks [10]. In the UK, surveillance is ongoing through the Public Health Laboratory Service. In 2000 and 2001, case reports totaled 5794 and 3681, respectively, higher than what is seen in the USA [11]. In many other countries, particularly developing nations, however, the records are sporadic and often based on specialized studies, and much higher rates have been reported. In Asia, for example, the rate has been reported to range from 2 to 20%, and in India from 4 to 13% [12]. The disease has been found on every continent, and association with waterborne transmission has been shown or suggested.

There are biological, environmental, climatic and community factors involved in the potential risk for waterborne transmission of cryptosporidiosis. The biological factors include excretion rates, zoonotic transmission, and the environmental stability and infectivity of the oocyst. High production and long-term excretion (weeks) of the infectious oocyst stage by infected hosts (ranging from 10^6 to 10^11 oocysts per gram of feces) have been documented, contributing to high loading to environmental waters [13]. Zoonotic transmission has been clearly documented, and two predominant C. parvum genotypes that infect humans have been identified: a ‘human’ genotype (designated 1), observed exclusively in isolates from humans, and a ‘cattle’ genotype (designated 2), found in domestic livestock and other animals as well as humans [6, 14–19]. The oocyst is extremely resistant once shed into the environment and can survive in water for weeks [20]. In addition, the low infectious dose (an ID50 ranging from 10 to 1042 oocysts, depending on the isolate) contributes to spread of the disease even at low levels of contamination [21].

The water source is one of the key environmental factors. The type of land use contributing feces matters: waters receiving cattle and sewage discharges, for example, have 10–100-fold greater concentrations of oocysts [20]. The occurrence of Cryptosporidium oocysts in surface waters has been reported in 4–100% of the samples examined, at levels between 0.1 and 10 000 oocysts/100 l depending on the impact from sewage and animals [22]. Surface waters used for drinking supplies were shown to be more susceptible to contamination. However, groundwater, once thought to be a more protected source, has shown between 9.5 and 22% of samples positive for Cryptosporidium, although at low concentrations [23]. The climatic factors include temperature, with lower temperatures leading to greater survival, and rainfall, with increased precipitation associated with increased concentrations of oocysts [24]. Finally, community factors include watershed management (i.e. the presence of combined sewer overflows, leading to raw sewage discharges with levels as high as 1.3×10^4 oocysts/100 l) and the type of water treatment (filtration and disinfection) [4, 25].

The role of climate has been hypothesized as a major factor in the transmission of Cryptosporidium. Data on drinking water outbreaks from all infectious agents in the USA from 1971 to 1994 demonstrated a distinct seasonality, a spatial clustering in key watersheds and a statistical association with extreme precipitation [26]. This suggests that in key watersheds, by virtue of the land use, fecal contaminants from both human sewage and animal wastes are transported to waterways and drinking water supplies by precipitation events. Correlations between increased rainfall and increased Cryptosporidium oocyst and Giardia cyst concentrations in river water have been reported [24]. The largest outbreak in the USA occurred in Milwaukee, WI, in 1993, where 400 000 people became ill and 100 died following contamination of the water supply; unusual spring rains and storm runoff were suspected as major reasons that both human and animal waste contamination of Lake Michigan overwhelmed the drinking water treatment process [27]. The Oxford/Swindon Cryptosporidium outbreak in the UK was also associated with a rainfall event [4], and rainfall was implicated in a waterborne outbreak of giardiasis (a diarrheal disease caused by the related protozoan Giardia) [28].

Researchers in Brazil reported that Cryptosporidium was the most common cause of diarrhea in AIDS patients and children showing a distinct seasonality. In these studies again water transmission was related to the seasonality of the cases associated with rainfall [29, 30]. There has also been significant association between the rainy season and cryptosporidiosis cases in parts of India, South Africa and Mexico [12].

Waterborne disease due to any fecal–oral agent such as Cryptosporidium is not only influenced by climate. The incidence of infection in the animal or human population and the excretion of oocysts in certain watersheds are also important factors contributing to waterborne disease. The type of animal waste handling and sewage treatment, as well as the type of disposal will influence the likelihood of oocysts ending up in the environment. The size of the watershed, dilution and hydrology of the system as well as the type and reliability of the drinking water treatment will influence the impact of pathogens in the drinking water. Thus human, infrastructure and engineering factors also play an important role in the potential for waterborne disease, where to date, precipitation has been shown to be important in the transportation of pathogens like Cryptosporidium.

1.2 Monitoring and detection

Methods for the detection of Cryptosporidium in water have been reviewed [5]. The most commonly used method involves filtration of large volumes of water (10–1000 l), followed by processing (washing) of the filter to recover the captured material, centrifugation, clarification (density gradients or, more often, immunomagnetic separation, IMS) and finally microscopic screening of the sample after staining with monoclonal antibodies tagged with fluorescein isothiocyanate, using epifluorescent microscopy (IFA methods). This method has the advantage of being quantitative, but it does not distinguish species or assess viability. While some have argued that monitoring water is difficult, tedious, inefficient and of limited value, the published literature demonstrates that the method is peer-reviewed and scientifically valid, and it is used worldwide by water utilities, public health officials and regulators to provide valuable information.

The first monitoring data for Cryptosporidium were published in 1988 [31]. By 1996, 13 countries had reported research on the protozoan in water, and by 1998, 10 years later, a review of the key literature found data on concentrations of oocysts in wastewater (3–4×10^5 oocysts/100 l), surface waters (0.12–2.5×10^4/100 l) and drinking waters (0.5–1.34×10^2/100 l) from 10 countries (Table 1) [3]. The improvement in methods has been tremendous, addressing Cryptosporidium identification using molecular techniques and viability (discussed below). Finally, the methods have been incorporated into monitoring programs and applied to risk assessments and waterborne investigations.

Table 1

Summary of water monitoring studies for Cryptosporidium [3]

Countries | Type of water | Average percent positive | Range in concentrations (oocysts/100 l)
USA and UK | Wastewater | 62 | 3–4×10^5
Australia, Germany, Israel, Malaysia, Netherlands, Spain, UK and USA | Surface water | 46 | 0.12–2.5×10^4
Brazil, Spain, UK and USA | Drinking water | 24 | 0.5–1.34×10^2

Studies have been ongoing in Asia over the last few years, suggesting that surface waters are fairly contaminated and that, while animals may be the source, the tap water may not be safe to drink [32, 33]. A limited monitoring study of the watershed for Kaohsiung in southern Taiwan found that 75% of the samples from the Kau-Ping River were positive for Cryptosporidium and 40% (2/5) of the treated drinking water samples were positive, at an average concentration of 115 oocysts/100 l [32]. Boiling of the water was recommended. In western Japan, IMS/IFA methods followed by polymerase chain reaction (PCR) and restriction fragment length polymorphism (RFLP) analysis found that an average of 46% of the waters were oocyst positive (74/156 samples), ranging from 37 to 100% depending on the area, and all isolates were of the bovine type [33]. Using this improved methodology, the study confirmed other reports but refuted the low levels reported by the Japan Ministry of Health and Welfare [33].

Studies in the Netherlands and Canada have used monitoring data for Cryptosporidium and Giardia within a risk assessment framework to examine drinking water treatment [34, 35]. Both studies used Monte Carlo analysis to examine distributions around the input variables to the model, and both used protozoan-specific dose–response probability-of-infection models [36, 37]. The Dutch data came from the Biesbos storage reservoir, where water from the river Meuse is stored. The annual mean individual risk was estimated at around 10^−4 (1/10 000) [34]. Sampling in Canada was along the St. Lawrence River from Montreal to Quebec City [35]. The mean annual risk in this case was 8.4×10^−4. The addition of ozone reduced the Giardia risk by 2.6 log10; however, Giardia cysts are much more susceptible to disinfection than Cryptosporidium oocysts, and for this reason, together with cold water temperatures, ozone reduced the Cryptosporidium risk by only about 25%. The differences between the two estimates came from four inputs into the modeling efforts (Table 2): the level of pollution, the method recovery, the viability estimate and the treatment removal (which were lower in the Netherlands study). Uncertainty analysis suggested that treatment removal had the greatest impact on the outcome, followed by the level and prevalence of oocyst contamination.

Table 2

Variables used as input for Cryptosporidium risk assessment Monte Carlo analysis for drinking water [34, 35]

Water system | Mean oocysts/100 l (%+) | Recovery of method | Viability | Treatment type and removal | Annual mean individual risk estimate
Biesbos reservoir (from River Meuse), Netherlands | ∼0.03 (30) | 1.8%, filtration/IFA method | 30%, based on microscopic morphology | 2.8 log10, removals based on Clostridium through filtration | 10^−4
St. Lawrence River from Montreal to Quebec City, Canada | 86 (40) | 90%, centrifugation followed by IFA | Not used | 3.35 log10, removals based on aerobic spore formers through filtration | 8.4×10^−4
  • Probability of infection model: P_i = 1 − e^(−r × dose).
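The exponential probability-of-infection model in the table footnote can be sketched in code. The parameter values below (r = 0.004, 2 l/day consumption, a post-treatment concentration of 0.01 oocysts/100 l) are illustrative assumptions, not the actual inputs of refs. [34, 35]:

```python
import math

def p_infection(dose, r):
    """Exponential dose-response model: Pi = 1 - exp(-r * dose)."""
    return 1.0 - math.exp(-r * dose)

def annual_risk(daily_dose, r, days=365):
    """Probability of at least one infection over a year of independent daily exposures."""
    return 1.0 - (1.0 - p_infection(daily_dose, r)) ** days

# Illustrative exposure: 0.01 oocysts/100 l after treatment, 2 l consumed per day
r = 0.004                          # assumed dose-response parameter
daily_dose = (0.01 / 100.0) * 2.0  # oocysts ingested per day
risk = annual_risk(daily_dose, r)  # on the order of 10^-4, near the
                                   # 1/10 000 level discussed above
```

Such a calculation, repeated over sampled distributions of concentration, recovery and treatment removal, is the core of the Monte Carlo analyses described above.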

In 1998 and 1999, high levels of Cryptosporidium and Giardia were detected in the water supply and distribution system of Sydney, Australia, using a modified IFA technique and flow cytometry. Described as the ‘Sydney Water Crisis’, an expert report and scientific scrutiny suggested that the ingress into the water system occurred when contaminated flood waters impacted the reservoir. Episodic detection followed stream rises in the catchment. Studies by PCR revealed genotype 2 as the only strain, suggesting that animals were a source, and ongoing monitoring and genetic assessment have been recommended [38–40]. Between 43 and 1000 oocysts/100 l were detected during the months of the ‘crisis’, but an epidemiological investigation found no detectable increase in antibody response in 104 people [41]. The authors suggested that this is evidence that the oocysts detected in the distribution system were of no public health consequence. However, the study was not sensitive enough to examine the levels of risk that are of concern in water supplies (10^−4). In fact, the risk assessment models suggest that all 104 people would have had to receive a dose associated with 1000 oocysts/100 l, with all oocysts viable, for eight infections to be observed among the 104 people surveyed. Unless future epidemiological studies take into account the distribution of oocyst concentrations and the variability in exposure, and increase their sensitivity by enlarging the population examined, the money spent will provide very little valuable information on the public health risks associated with contamination of drinking water.
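The consistency check above can be reproduced with the exponential dose-response model; the parameter r = 0.004 (a published exponential dose-response parameter for C. parvum) and a consumption of 2 l/day are assumptions made here for illustration:

```python
import math

r = 0.004                     # assumed exponential dose-response parameter
concentration = 1000 / 100.0  # oocysts per litre (the peak 1000 oocysts/100 l)
dose = concentration * 2.0    # 20 oocysts/day, assuming 2 l/day consumption
                              # and 100% viability

p_infect = 1.0 - math.exp(-r * dose)  # per-person probability of infection
expected_cases = 104 * p_infect       # expected infections among 104 people
# expected_cases rounds to 8, the figure cited in the text
```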

In both the USA and the UK, national occurrence monitoring programs for Cryptosporidium oocysts in water have been undertaken. In the UK, continuous monitoring post-filtration is required (using foam filtration, IMS and IFA) for 1000 l, with a standard of one oocyst/10 l (100/1000 l) [42]. Initial results showed that the standard was violated in 5% of 2076 samples; in 48% of the samples the level was between one and nine oocysts/1000 l, and no oocysts were detected in the remaining 47% [43]. In the USA, the ‘Information Collection Rule’ was implemented by the US EPA and ran from July 1997 to December 1998 [43]. In this case surface waters were collected prior to filtration, much smaller volumes were assayed (median 3 l), and turbidity and other surface water characteristics limited sensitivity. A total of 5838 samples were collected (20% positive), and the mean concentration for a national estimate was approximately 2 oocysts/100 l. This will be used to examine treatment for surface waters in the USA.

Monitoring water for Cryptosporidium oocyst contamination will continue throughout the world, and new methods will allow for greater sensitivity, better specificity and quantitation. This will not only allow for better assessment of endemic and lower public health risks but will elucidate sources and causes of outbreaks. This information can then be used to minimize waterborne cryptosporidiosis.

2 Summary of drinking and recreational outbreaks

Waterborne outbreaks of cryptosporidiosis have been documented in countries around the world. Between the years 1986 and 1996, 16 total cryptosporidiosis outbreaks in drinking water were reported in Europe, the majority in England and Wales, and 14 in North America, the majority in the USA. Two outbreaks were reported in Japan. Most of the Cryptosporidium recreational outbreaks have been documented in the USA and the UK. Rainfall was a strong variable in drinking water outbreaks and fecal accidents in recreational outbreaks [3, 44, 45].

2.1 Drinking and recreational water outbreaks

In the USA there have been 10 drinking water outbreaks of cryptosporidiosis documented from 1984 to 1996. Only in England and Wales have there been more reported outbreaks (Table 3) [3, 44, 45]. In most countries where outbreaks have been documented only a few have been identified as caused by Cryptosporidium. In Canada, between 1993 and 1996, four drinking water outbreaks occurred with 361 laboratory cases reported and an estimated 31 900 people affected. A significant number of cryptosporidiosis outbreaks were associated with contaminated groundwater (wells and springs not properly protected from sewage and runoff or wells located adjacent to rivers and streams) and many of the documented outbreaks associated with surface water contamination were linked to human sewage discharge and runoff, which occurred during heavy rainfall events in the USA, the UK, and Canada.

Table 3

Summary of worldwide drinking water cryptosporidiosis outbreaks [3, 44, 45]

Years | Country | Number of outbreaks | Comments
1993–1996 | Canada | 4 | Three occurred with surface water as the source, one with well water
1986–1996 | Croatia | 1 | 29 total outbreaks, 62% of these caused by bacteria
1986–1996 | England and Wales | 13 | 20 total outbreaks; only one outbreak associated with a well, for all others surface water was the source
1994–1996 | Japan | 2 | One distribution system contamination and one with surface water as the source
1986–1996 | Spain | 1 | 208 total outbreaks, 47% associated with unknown gastroenteritis, 3% caused by Giardia
1986–1996 | Sweden | 1 | 51 total outbreaks, 70% associated with unknown gastroenteritis, 7.8% caused by Giardia
1984–1996 | USA | 10 | 211 total outbreaks; four outbreaks associated with wells, one with a distribution system and the others with surface water as the source

Craun et al. have reviewed some of the outbreaks in North America and the UK [45]. Where information was available, they reported attack rates ranging from 1 to 60% (average 22%) and hospitalization rates from 1 to 44% (average 13%); there was no correlation between these two indices, suggesting that the dose (the level and distribution of contamination) was not associated with the organism's virulence.

The first recreational outbreak of cryptosporidiosis occurred in 1988 in Los Angeles [46, 47]. The majority of recreational water outbreaks have been linked to swimming pools although some have now been documented at water slides, fountains, and water parks. A petting zoo and the subsequent contamination of a fountain were associated with one outbreak. Table 4 summarizes the recreational outbreaks that have occurred between the years 1990 and 2000 in the USA.

Table 4

Recreational water outbreaks in the USA (1990–2000) [2, 46, 47]

Year | Locality | Type of recreational water | Estimated number of cases (confirmed)
1992 | Idaho | Water slide | 500
1992 | Oregon | Wave pool | (52)
1993 | Wisconsin | Motel pool | 51 (22)
1993 | Wisconsin | Motel pool | 64*
1994 | Missouri | Motel pool | 101 (26)
1994 | New Jersey | Lake | 2070 (46)
1995 | Kansas | Pool | 101 (2)
1995 | Georgia | Water park | 2470 (6)
1995 | Nebraska | Water park | (14)
1996 | Florida | Pool | 22 (16)
1996 | California | Water park | 3000 (29)
1997 | Minnesota | Fountain | 369 (73)
1998 | Oregon | Pool | 51 (8)
1999 | Florida | Interactive water fountain | 38 (2)
2000 | Ohio | Pool | 700 (186)
2000 | Nebraska | Pool | 225 (65)

The most common cause of recreational water outbreaks is a fecal accident (termed an AFR, for accidental fecal release), and guidelines have been established for addressing these events. When an AFR involving a formed stool occurs, the common practice is to close the venue, remove as much fecal material as possible with a net or scoop, raise the free available chlorine concentration to 2 mg l−1 (pH 7.2–7.5) and maintain this for at least 25 min. Once this is done and the free available chlorine level is back to normal, the venue can be reopened. For the discharge of a loose stool, the free available chlorine concentration must be raised to and maintained at 20 mg l−1 for at least 8 h (pH 7.2–7.5). The filter should also be backwashed, with the effluent discharged directly to waste. Once this is done and the free available chlorine level has returned to normal, the venue may be reopened [48].
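The two remediation regimes can be summarized as a small lookup. This is only a sketch of the guidance in [48], with the parameter values as given in the text; it is not an operational protocol:

```python
def afr_response(stool_formed: bool) -> dict:
    """Return remediation parameters for an accidental fecal release (AFR),
    following the guidance summarized above [48]."""
    if stool_formed:
        # Formed stool: close venue, net out material, brief superchlorination
        return {
            "free_chlorine_mg_per_l": 2.0,
            "minimum_contact_min": 25,
            "pH_range": (7.2, 7.5),
            "backwash_filter_to_waste": False,
        }
    # Loose stool: higher chlorine, much longer contact, backwash filter to waste
    return {
        "free_chlorine_mg_per_l": 20.0,
        "minimum_contact_min": 8 * 60,
        "pH_range": (7.2, 7.5),
        "backwash_filter_to_waste": True,
    }
```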

It is clearly acknowledged that C. parvum oocysts are resistant to chlorine, and the chlorination of pools is no exception. Disinfection of oocysts in the presence of feces and other organic contaminants may be extremely difficult: experiments simulating a fecal accident found that oocysts remained infectious even after 48 h of chlorine contact. The above recommendations are therefore likely to be revised, or the method of disinfection changed, to address recreational cryptosporidiosis [49]. Alternative disinfection methods now suggested for pools include ozone and ultraviolet light.

2.2 Molecular characterization of waterborne outbreaks

Advances in molecular techniques now provide an approach to better understand the epidemiology of cryptosporidiosis including strain identification for greater accuracy in determination of the source of an outbreak [50, 51]. Two different genotypes of C. parvum have been identified in humans [18]. Genotype 1 isolates have been shown to be infectious in humans and genotype 2 isolates have been shown to be infectious in mice, calves, lambs, goats, horses as well as humans. This suggests the possibility that there are two distinct populations of oocysts cycling in humans with distinct transmission cycles: (1) zoonotic transmission from animal to human with subsequent human to human and human to animal transmission and (2) a transmission cycle exclusively in humans. A third genotype has recently been identified as well but is considered an unusual human isolate [51].

Until recently, only a small fraction of the isolates responsible for drinking water outbreaks in the USA had been genotyped (39 isolates, only four of which were from human cases infected via drinking water) [50]. In the UK, a much larger number of specimens have been studied (95 human specimens from two suspected waterborne outbreaks, 46 sporadic human cases, 60 livestock cases from other areas and 12 samples from patients infected with other protozoan parasites) [50].

PCR/RFLP analysis for the Cryptosporidium outer wall protein (COWP) gene has been successfully utilized in the UK for genotype identification in human and animal fecal samples from two suspected waterborne outbreaks. Ninety-six percent of C. parvum isolates from patients linked to the outbreaks were of genotype 1 while all the isolates from livestock were of genotype 2. Identification of isolates from sporadic human infections revealed 67% were genotype 1 while 34% were genotype 2. Two patient specimens yielded both genotypes and specimens from patients infected with other parasites yielded no amplified DNA fragments of interest [50].

A more recent molecular investigation of Cryptosporidium genotypes associated with waterborne outbreaks in the UK utilized a nested-COWP approach for increased sensitivity. Of >2000 human samples evaluated, genotype 1 was detected in 38.6%, genotype 2 in 59.6%, and both genotypes in 1.0%. A third genotype, suspected to be related to C. meleagridis, was identified in 15 (0.7%) of the samples [51]. When human fecal samples from seven drinking water-related outbreaks were evaluated, genotype 1 was identified most often in four of the outbreaks while genotype 2 was most prevalent in the remaining three. Epidemiologic evidence suggests that the genotype 2 outbreaks, which occurred in the spring (peak lambing season), were due to contamination of water by infected sheep feces, whereas the four outbreaks linked to genotype 1 are suspected to have been caused by contamination of water with human sewage.

In the USA, molecular characterization of Cryptosporidium oocysts from raw surface waters has recently been reported using a small-subunit rRNA-based PCR/RFLP technique [52]. In this study a total of 55 surface water samples and 49 wastewater samples were evaluated. The predominant Cryptosporidium species detected in surface water was C. parvum (genotypes 1 and 2), with C. andersoni detected at a slightly lower frequency. In wastewater, the predominant species detected was C. andersoni (67% of the samples). The high frequency of C. andersoni in the wastewater may be explained by the proximity of cattle slaughterhouses (less than 5 miles away) that drain their contents into the city sewage system. It was speculated that the low prevalence of C. parvum genotype 1 in the sewage samples might be attributed to the sampling period (April to July), when the incidence of human cryptosporidiosis is usually low [52]. While many previous waterborne outbreaks of Cryptosporidium have been presumptively linked to animal fecal contamination of drinking water sources, further investigation using advanced molecular techniques to identify Cryptosporidium species as well as genotypes can provide a more accurate assessment of the sources of contamination.

3 Approaches for control in water

3.1 The cell culture approach to the measurement of viability

The application of cell culture to Cryptosporidium infectivity began in the early 1990s and has been used extensively for testing pharmaceuticals against this parasite [53–56]. Previous research methods used to determine the infectivity of Cryptosporidium oocysts relied heavily on animal models such as mouse infectivity assays. Problems with reproducibility and quantification, as well as ethical issues, have made animal infectivity studies impractical for water research.

In the 1990s Slifko et al. developed a quantitative cell culture assay that could be used to assess the effects of disinfectants against Cryptosporidium as well as to study host cell-parasite interactions in vitro. Initial studies determined that the parasite replicates asynchronously in vitro and could be detected microscopically using fluorescent antibody labeling after as little as 17 h of incubation [57]. The number of infectious oocysts in the initial sample was determined by enumerating the infectious foci, or clusters of reproductive stages including meronts as well as macro- and microgametocytes, in the host cell monolayer. This proved highly labor intensive, requiring several hours of microscopic evaluation. Further research showed that it was possible to quantify the number of infectious oocysts in a sample by scoring replicate host cell monolayers as either positive or negative for infectious foci and using a most-probable-number (MPN) approach to calculate the number of infectious oocysts present [58]. This method has been termed the Foci Detection Method Most Probable Number (FDM-MPN) approach and has been used to assess the effects of disinfectants such as pulsed UV light as well as high hydrostatic pressure [59, 60]. It has also been used to assess the presence of infectious oocysts in surface and filter backwash waters [61].
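The MPN step of such an assay can be illustrated with a maximum likelihood calculation under the usual Poisson assumption. This is a sketch of the general MPN idea, not the published FDM-MPN algorithm of ref. [58], and the dilution scheme shown is hypothetical:

```python
import math

def mpn_estimate(doses, n_replicates, n_positive):
    """Most probable number of infectious units per unit inoculum.
    Assumes infectious oocysts are Poisson distributed among replicate
    monolayers, so P(monolayer positive) = 1 - exp(-lam * dose).
    The estimate maximizes the binomial log-likelihood over a log-spaced grid."""
    def log_likelihood(lam):
        ll = 0.0
        for d, n, p in zip(doses, n_replicates, n_positive):
            p_pos = 1.0 - math.exp(-lam * d)
            if p > 0:
                ll += p * math.log(p_pos)
            ll += (n - p) * (-lam * d)  # log P(negative) = -lam * d
        return ll
    grid = [10 ** (k / 200.0) for k in range(-600, 601)]  # 1e-3 .. 1e3
    return max(grid, key=log_likelihood)

# Five replicate monolayers at three 10-fold dilutions; 5, 3 and 1 positive
estimate = mpn_estimate(doses=[1.0, 0.1, 0.01],
                        n_replicates=[5, 5, 5],
                        n_positive=[5, 3, 1])
```

Multiplying the estimate back through the dilution factors gives the number of infectious oocysts in the original sample.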

The correlation between the cell culture method and the animal infectivity assay can be seen in Fig. 1 [62]. The cell culture data were transformed into a logit response form so that they could be compared directly to the animal infectivity data. The logit dose–response equation has been used to estimate the number of infectious oocysts in a sample [63, 64]. The slope of the mouse model is lower than that achieved by cell culture; however, regression analysis of the average logit dose–response models for cells and mice showed no difference in the slopes (P=0.0977; cell culture: m=0.005341±0.005562, Balb C mice: m=0.0127±0.0063). This indicates that increases in oocyst dose achieve the same increase in infectivity regardless of the system used for analysis. Furthermore, infectivity in cell culture is higher than in animal models: the infectious dose 50 (ID50) found in cell culture ranged from six to eight oocysts, while the ID50 in animal models ranged from nine to 210 oocysts [60]. Interestingly, cell culture more closely resembled the low end of the range of human ID50 data (10 oocysts) [21].
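The logit form used for these comparisons can be written out explicitly. The parameterization below (logit of P as a linear function of log10 dose) is a common choice and is assumed here for illustration rather than taken from ref. [62]; the parameter values are hypothetical:

```python
import math

def logit_response(dose, intercept, slope):
    """P(response) under a logit model: logit(P) = a + b * log10(dose)."""
    x = intercept + slope * math.log10(dose)
    return 1.0 / (1.0 + math.exp(-x))

def id50(intercept, slope):
    """Dose giving a 50% response: logit(P) = 0  =>  log10(dose) = -a / b."""
    return 10 ** (-intercept / slope)

# With hypothetical parameters a = -1.7, b = 2.0 the ID50 is about 7 oocysts,
# inside the 6-8 oocyst range reported for cell culture above
```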

Figure 1

Comparison of FDM-MPN cell culture and Balb C mouse infectivity assays by average logit dose–response models [62].

Several of the more commonly utilized cell lines capable of supporting C. parvum genotype 2 are listed in Table 5. While numerous modifications have been made to the cell culture infectivity assay for Cryptosporidium, the methods for cell growth are very similar: cells are grown aseptically to confluence in tissue culture vessels and then inoculated with the test sample. Several of the infectivity assays pretreat oocyst samples with oxidizing agents such as dilute bleach solution or acidified salt solutions. These pretreatments are intended to surface-sterilize the oocysts (eliminating or reducing contamination of the tissue cultures) as well as to trigger excystation (release of the organism's four internal sporozoites).

Table 5

Cell lines routinely used to study C. parvum genotype 2 infectivity

Cell line (origin) | Isolate or outbreak source | Original isolating host | Infectivity detection method | Ref.
Caco-2 (human) | Ames, IA | Bovine | RT-PCR | [65]
BS-C-1 (African green monkey) | NA | Bovine | Giemsa stain | [66]
BFTE (bovine) | Ames, IA | Bovine | SEM and TEM | [67]
HCT-8 (human) | KSU-1, Ames, IA, TAMU | Bovine, equine | ELISA, PCR, RT-PCR, specific IF, FDM-MPN | [21, 56–58]
MDBK (bovine) | GCH1, Ames, IA | Human, bovine | IF, PCR | [68]
MDCK (canine) | Ames, IA | Bovine | Giemsa stain | [67]

A variety of end points are currently used to determine the concentration of infectious oocysts in water samples. Immuno-based assays that couple antibodies to C. parvum sporozoites and other life-cycle stages with a secondary antibody conjugated to a fluorescent dye or enzyme have been employed [69]. Molecular assays that use PCR or RT-PCR to amplify DNA or RNA extracted from infected cells, or oligonucleotide probes that detect nucleic acids in situ, have also been developed for speciation and genotyping [61, 65]. Environmental samples concentrated by centrifugation have been shown to be suitable for cell culture-PCR analysis with only minor purification of the nucleic acids prior to PCR, with a potential detection limit of one oocyst [65]. This is a major advantage over previous methods, in which environmental substances could and often did inhibit PCR. One of the major disadvantages of cell culture-PCR is the possibility of detecting oocysts or sporozoites on the surface of the cell monolayer that have failed to cause infection.

In a study of surface waters and filter backwash waters from 25 different sites throughout the USA, infectious oocysts of a variety of strains were detected by cell culture in 4.9% and 7.4% of the samples, respectively [61]. This suggests that viable oocysts can penetrate treatment barriers; future application of this technique to fully treated water will allow better public health assessment.

3.2 Disinfection of Cryptosporidium

The small size of Cryptosporidium oocysts (4–6 μm) makes their removal by filtration during water and wastewater treatment difficult to accomplish consistently. Their inactivation by chemical disinfection therefore remains a critical step in water treatment. Normal chlorination levels found in drinking water treatment plants have been shown to be ineffective for the inactivation of Cryptosporidium oocysts even after 18 h of contact time [64, 70]. The relative ineffectiveness of conventional disinfectants such as free chlorine and monochloramine against encysted parasites has led to the evaluation of alternative disinfectants for drinking and wastewater treatment [70, 71]. Chlorine dioxide can provide some inactivation of oocysts (about 90%); however, the two alternative disinfectants that have proven most successful for inactivating Cryptosporidium oocysts are ultraviolet light and ozone [72].

Ultraviolet light has been used to inactivate microorganisms in contaminated water since the early 1900s [73]. Most of the early history of UV disinfection centered on the wastewater industry, where the primary focus was the reduction of fecal coliform bacteria [74]. UV technology was first introduced in the USA in Henderson, KY, in 1916, and the longest continuously operating UV disinfection plant was installed in Ft. Benton, MT, in the early 1970s [75–78].

There are several advantages to using UV as a disinfectant: (1) it is a physical process that does not rely on chemical additions, (2) it is highly effective in the inactivation of protozoa, (3) it requires relatively short contact times, and (4) no UV disinfection by-products have been identified to date. There are, however, several unanswered questions related to the use of UV, including the evaluation of different types of UV lamps, reactor design and scale-up issues such as performance monitoring and process control, and solution matrix effects. Given the high level of interest in UV technology throughout the water industry, a great deal of research is planned and ongoing to answer these questions.

Early studies (prior to 1998) evaluating the effectiveness of UV demonstrated it to be an inadequate disinfectant against protozoa [77, 78]. More recent studies, however, have shown that these initial findings were based on poor choices of analytical methods and did not reflect the true capabilities of UV disinfection [59, 79, 80]. Vital dyes such as 4′,6-diamidino-2-phenylindole (DAPI) and propidium iodide (PI), which had been shown to provide results equivalent to animal infectivity when used to evaluate oocysts exposed to chemical disinfectants, were not capable of accurately estimating oocyst infectivity after UV treatment. Vital dye inclusion/exclusion measures the integrity of an oocyst's outer wall as well as its inner cytoplasmic and nuclear membranes. UV doses capable of damaging an oocyst's DNA so that it can no longer reproduce, however, do not alter the permeability of the organism's membranes, thereby producing confounding results [59, 70, 79]. Huffman et al. [59] showed that vital dyes grossly overpredict infectivity after UV disinfection with a pulsed-UV water treatment device. This was also the first study to compare cell culture with animal infectivity for the determination of infectious oocysts post UV treatment. The FDM-MPN cell culture method revealed inactivation levels comparable to animal infectivity without the expense and highly specialized facilities required for animal studies.

Excystation procedures have likewise been shown to correlate well with animal infectivity analysis for the determination of Cryptosporidium oocyst inactivation by chemical disinfection but not for UV disinfection [59, 70, 79]. Excystation procedures microscopically evaluate the organism's ability to release its four internal sporozoites under specific conditions. These conditions are similar to those found in the human intestinal tract, e.g., appropriate levels of carbon dioxide, acid, pancreatic enzymes and elevated temperature. Excystation post UV disinfection predicts a false level of infectivity that cannot be corroborated with either cell culture or animal infectivity analysis [59].

The biocidal effect of UV is due to the absorption of UV photons by an organism's genomic DNA [81]. Damage to the DNA does not directly kill the organism; however, it does result in the failure of the organism to reproduce and establish infection in the host. Recent advances in UV research include the use of bioassays to more accurately determine UV dose and computer modeling approaches for reactor design validation. These research advances have allowed for the application of UV disinfection technology in large-scale potable water treatment plants.

Both medium- and low-pressure ultraviolet irradiation have been shown to be extremely effective for the inactivation of Cryptosporidium oocysts in drinking water. Relatively low UV doses (9 mJ cm−2) have been shown to inactivate >3 log (99.9%) of Cryptosporidium oocysts [80, 82, 83]. Recent studies by Shin et al. [84], Oguma et al. [85] and Belosevic et al. [86] have shown that while Cryptosporidium oocysts can repair UV-induced pyrimidine dimers in their DNA, they were not capable of recovering their infectivity after UV exposure. Oguma et al. and Belosevic et al. performed their studies using animal infectivity assays, while Shin et al. used both cell culture and animal infectivity analysis, with comparable results.
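The ">3 log (99.9%)" figures above are two expressions of the same quantity. A minimal helper for converting between log reduction and percent inactivation:

```python
import math

def log_reduction(n_initial, n_surviving):
    """Log10 reduction achieved by a disinfection step."""
    return math.log10(n_initial / n_surviving)

def percent_inactivated(log_red):
    """Convert a log10 reduction to percent inactivation."""
    return 100.0 * (1.0 - 10.0 ** (-log_red))

# 1000 oocysts reduced to 1 surviving oocyst is a 3-log reduction,
# equivalent to 99.9% inactivation.
print(log_reduction(1000, 1))               # 3.0
print(round(percent_inactivated(3.0), 1))   # 99.9
```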

Ozone was first used as a drinking water disinfectant in France almost 100 years ago. An estimated 2000 drinking water treatment plants worldwide use ozone, and over 40 ozone plants have been built in the USA in the last two decades [87]. Ozone usage for water treatment has increased dramatically, particularly for color removal, taste-and-odor control, and iron and manganese oxidation, as well as for disinfection. It has the greatest oxidation potential of any water disinfectant and adds no chemicals to the treated water; however, because it decomposes rapidly it leaves no residual disinfectant in the water during distribution. Ozone is produced by passing dry air between electrodes separated by an air gap and a dielectric and applying a voltage between 8000 and 20 000 V.

While Cryptosporidium oocysts have been shown to be relatively impervious to chemical disinfectants such as chlorine and chloramine, ozone has proved much more effective against this protozoan parasite [70, 72, 88]. In aqueous media ozone produces free radicals that affect the permeability of the oocyst's wall and ultimately its DNA. Studies performed in the late 1980s and early 1990s showed some disparities in the kinetics of C. parvum inactivation by ozone: kinetic studies based on excystation methods typically showed slower inactivation rates than those based on animal infectivity [63, 89]. These disparities may be attributed to the various methods used to assess infectivity, as well as to variability in the infectious nature of the oocysts themselves and in how they were collected and purified. Another factor contributing to the variability of the results may be the experimental design of these studies, including reactor design and water matrix.

The kinetics of ozone inactivation of C. parvum may generally be characterized by a lag phase followed by pseudo-first-order kinetics [88]. Ozone disinfection has also been shown to be temperature dependent, with the CT requirements for C. parvum inactivation increasing by a factor of 3 for every 10°C decrease in water temperature. The use of ozone in colder climates may therefore present design restrictions for water treatment plants. Another obstacle to the use of ozone for disinfection is the potential formation of bromate, a possible human carcinogen, when the raw water being treated contains bromide. Bromide concentrations greater than 50 µg l−1 may result in bromate formation at levels greater than the maximum contaminant level (MCL) of 10 µg l−1. However, several methods are available to control bromate formation, including depressing the water pH in the ozone contactor to less than 6.5–7.0. These issues may preclude the use of ozone for drinking water treatment in some instances.
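The lag-phase/pseudo-first-order survival curve and the temperature scaling of the CT requirement described above can be sketched as follows; the rate constant, lag time and reference CT value here are illustrative placeholders, not measured values from the studies cited.

```python
import math

def survival_fraction(t, k, lag):
    """Delayed pseudo-first-order inactivation: no measurable kill
    during the lag phase, exponential decay afterwards."""
    if t <= lag:
        return 1.0
    return math.exp(-k * (t - lag))

def ct_required(ct_ref, temp_ref_c, temp_c, factor=3.0):
    """Scale a reference CT requirement to another water temperature,
    assuming CT increases by `factor` for every 10 degC drop."""
    return ct_ref * factor ** ((temp_ref_c - temp_c) / 10.0)

# A CT requirement established at 20 degC triples at 10 degC
# and increases ninefold at 0 degC.
print(ct_required(ct_ref=10.0, temp_ref_c=20.0, temp_c=10.0))  # 30.0
print(ct_required(ct_ref=10.0, temp_ref_c=20.0, temp_c=0.0))   # 90.0
```

The rapid growth of the required CT at low temperature is the design restriction for cold-climate plants noted above.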

4 Concluding remarks

Cryptosporidium possesses a number of physical and physiological characteristics that have enabled it to become a significant cause of waterborne disease throughout the world. Advances in its recovery and detection in water, and in the identification of genotypes, have revealed previously unknown sources of infection. The body of knowledge concerning the biology, taxonomy, epidemiology and resistance of this organism continues to grow at an unprecedented rate, and it is on this expanding knowledge base that research directed at protecting health through the protection of the world's most precious resource, water, will continue to build.

