

Drinking Water Chlorination - Chlorine Div



Executive Summary

The treatment and distribution of water for safe use is one of the greatest achievements of the twentieth century. Before cities began routinely treating drinking water with chlorine (starting with Chicago and Jersey City in 1908), cholera, typhoid fever, dysentery and hepatitis A killed thousands of U.S. residents annually. Drinking water chlorination and filtration have helped to virtually eliminate these diseases in the U.S. and other developed countries.

Meeting the goal of clean, safe drinking water requires a multi-barrier approach that includes: protecting source water from contamination, appropriately treating raw water, and ensuring safe distribution of treated water to consumers’ taps.

During the treatment process, chlorine is added to drinking water as elemental chlorine (chlorine gas), sodium hypochlorite solution or dry calcium hypochlorite. When applied to water, each of these forms “free chlorine,” which destroys pathogenic (disease-causing) organisms.

Almost all U.S. systems that disinfect their water use some type of chlorine-based process, either alone or in combination with other disinfectants. In addition to controlling disease-causing organisms, chlorination offers a number of benefits including:

* Reduces many disagreeable tastes and odors;
* Eliminates slime bacteria, molds and algae that commonly grow in water supply reservoirs, on the walls of water mains and in storage tanks;
* Removes chemical compounds that have unpleasant tastes and hinder disinfection; and
* Helps remove iron and manganese from raw water.

Just as important, only chlorine-based chemicals provide “residual disinfectant” levels that prevent microbial regrowth and help protect treated water throughout the distribution system.

The Risks of Waterborne Disease

Where adequate water treatment is not readily available, the impact on public health can be devastating. Worldwide, about 1.2 billion people lack access to safe drinking water, and twice that many lack adequate sanitation. As a result, the World Health Organization estimates that 3.4 million people, mostly children, die every year from water-related diseases.

Even where water treatment is widely practiced, constant vigilance is required to guard against waterborne disease outbreaks. Well-known pathogens such as E. coli are easily controlled with chlorination, but can cause deadly outbreaks given conditions of inadequate or no disinfection. A striking example occurred in May 2000 in the Canadian town of Walkerton, Ontario. Seven people died and more than 2,300 became ill after E. coli and other bacteria infected the town’s water supply. A report published by the Ontario Ministry of the Attorney General concludes that, even after the well was contaminated, the Walkerton disaster could have been prevented if the required chlorine residuals had been maintained.

Some emerging pathogens such as Cryptosporidium are resistant to chlorination and can appear even in high quality water supplies. Cryptosporidium was the cause of the largest reported drinking water outbreak in U.S. history, affecting over 400,000 people in Milwaukee in April 1993. More than 100 deaths are attributed to this outbreak. New regulations from the U.S. Environmental Protection Agency (EPA) will require water systems to monitor Cryptosporidium and adopt a range of treatment options based on source water Cryptosporidium concentrations. Most water systems are expected to meet EPA requirements while continuing to use chlorination.

The Challenge of Disinfection Byproducts

While protecting against microbial contamination is the top priority, water systems must also control disinfection byproducts (DBPs), chemical compounds formed unintentionally when chlorine and other disinfectants react with natural organic matter in water. In the early 1970s, EPA scientists first determined that drinking water chlorination could form a group of byproducts known as trihalomethanes (THMs), including chloroform. EPA set the first regulatory limits for THMs in 1979. While the available evidence does not prove that DBPs in drinking water cause adverse health effects in humans, high levels of these chemicals are certainly undesirable. Cost-effective methods to reduce DBP formation are available and should be adopted where possible. However, a report by the International Programme on Chemical Safety (IPCS 2000) strongly cautions:

The health risks from these byproducts at the levels at which they occur in drinking water are extremely small in comparison with the risks associated with inadequate disinfection. Thus, it is important that disinfection not be compromised in attempting to control such byproducts.

Recent EPA regulations have further limited THMs and other DBPs in drinking water. Most water systems are meeting these new standards by controlling the amount of natural organic material prior to disinfection.

Chlorine and Water System Security

The prospect of a terrorist attack has forced all water systems, large and small, to re-evaluate and upgrade existing security measures. Since September 11th, 2001, water system managers have taken unprecedented steps to protect against possible attacks such as chemical or biological contamination of the water supply, disruption of water treatment or distribution, and intentional release of treatment chemicals.

With passage of the Public Health Security and Bioterrorism Response Act of 2002, Congress required community water systems to assess their vulnerability to terrorist attacks and other intentional acts. As part of these vulnerability assessments, systems assess the transportation, storage and use of treatment chemicals. These chemicals are both critical assets (necessary for delivering safe water) and potential vulnerabilities (they may pose significant hazards if released). Water systems using elemental chlorine, in particular, must determine whether existing protection systems are adequate. If not, they must consider additional measures to reduce the likelihood of an attack or to mitigate the potential consequences.

Disinfection is crucial to water system security, providing the “front line” of defense against biological contamination. However, conventional treatment barriers in no way guarantee safety from biological attacks. Additional research and funding are needed to improve prevention, detection and responses to potential threats.

The Future of Chlorine Disinfection

Despite a range of new challenges, drinking water chlorination will remain a cornerstone of waterborne disease prevention. Chlorine’s wide array of benefits cannot be provided by any other single disinfectant. While alternative disinfectants (including chlorine dioxide, ozone, and ultraviolet radiation) are available, all disinfection methods have unique benefits, limitations, and costs. Water system managers must consider these factors, and design a disinfection approach to match each system’s characteristics and source water quality.

In addition, world leaders increasingly recognize safe drinking water as a critical building block of sustainable development. Chlorination can provide cost-effective disinfection for remote rural villages and large cities alike, helping to bring safe water to those in need.

Chlorination and Public Health

Of all the advancements made possible through science and technology, the treatment and distribution of water for safe use is truly one of the greatest. Abundant, clean water is essential for good public health. Humans cannot survive without water; in fact, our bodies are 67% water! Both the U.S. Centers for Disease Control and Prevention and the National Academy of Engineering cite water treatment as one of the most significant advancements of the last century.



Disinfection, a chemical process whose objective is to control disease-causing microorganisms by killing or inactivating them, is unquestionably the most important step in drinking water treatment. By far, the most common method of disinfection in North America is chlorination.

Prior to 1908, no U.S. municipal water systems chemically disinfected water. Consequently, waterborne diseases exacted a heavy toll in illness and death. Without chlorination or other disinfection processes, consumers are at great risk of contracting waterborne diseases. Figure 1-1 shows the decline in the death rate due to typhoid fever following the introduction of chlorine to U.S. municipal drinking water systems in 1908. As more cities adopted water chlorination, U.S. death rates due to cholera and hepatitis A also declined dramatically. Worldwide, significant strides in public health and the quality of life are directly linked to the adoption of drinking water chlorination. Recognizing this success, Life magazine (1997) declared, “The filtration of drinking water plus the use of chlorine is probably the most significant public health advancement of the millennium.”

The timeline at the bottom of these pages highlights important developments in the history of drinking water chlorination.

Providing Safe Drinking Water: A Multi-Barrier Approach

Meeting the goal of clean, safe drinking water requires a multi-barrier approach that includes protecting raw source water from contamination, appropriately treating raw water, and ensuring safe distribution of treated water to consumers’ taps.

Source Water Protection

Source water includes any surface water (rivers and lakes) or groundwater used as a raw water supply. Every drop of rain and melted flake of snow that does not re-enter the atmosphere after falling to the ground wends its way, by the constant pull of gravity, into the vast interconnected system of Earth’s ground- and surface waters. Precipitation ultimately collects into geographic regions known as watersheds or catchment basins, the shapes of which are determined by an area’s topography.

Increasingly, communities are implementing watershed management plans to protect source water from contamination and ecological disruption. For example, stream buffers may be established as natural boundaries between streams and existing areas of development. In addition, land use planning may be employed to minimize the total area of impervious surfaces such as roads and walkways, which prevent water from soaking into the ground. Reservoirs may be protected from contamination by disinfecting wastewater effluents, prohibiting septic system discharges and even controlling beaver activity (beaver feces are a potential source of the harmful protozoan parasites Giardia lamblia and Cryptosporidium parvum). Similarly, the Safe Drinking Water Act requires wellhead protection programs for water systems using groundwater sources. In such programs, the surface region above an aquifer is protected from contaminants that may infiltrate groundwater. Because source water quality affects the kind of treatment needed, watershed management planning is a sustainable, cost-effective step in providing safe drinking water.

Water Treatment

Every day, approximately 170,000 (U.S. EPA, 2002) public water systems treat and convey billions of gallons of water through approximately 880,000 miles (Kirmeyer, 1994) of distribution system piping to U.S. homes, farms and businesses. Broadly speaking, water is treated to render it suitable for human use and consumption. While the primary goal is to produce a product that is biologically safe (disinfected) and chemically safe, other objectives must also be met, including: no objectionable taste or odor; low levels of color and turbidity (cloudiness); and chemical stability (non-corrosive and non-scaling). Individual facilities customize treatment to address the particular natural and manmade contamination characteristic of their raw water. Surface water usually presents a greater treatment challenge than groundwater, which is naturally filtered as it percolates through sediments. Surface water is laden with organic and mineral particulate matter, and may harbor protozoan parasites such as Cryptosporidium parvum and Giardia lamblia. The graphic below illustrates and describes the four main steps in a water treatment plant employing chlorine disinfection.

Water Distribution

In storage and distribution, drinking water must be kept safe from microbial contamination. Frequently, slippery films of bacteria, known as biofilms, develop on the inside walls of pipes and storage containers. Among disinfection techniques, chlorination is unique in that a pre-determined chlorine concentration may be designed to remain in treated water as a measure of protection against harmful microbes encountered after leaving the treatment facility.

In the event of a significant intrusion of pathogens, resulting, for example, from a broken water main, the average “chlorine residual” level will be insufficient to disinfect the contaminated water. In such cases, monitoring for a sudden drop in the chlorine residual provides the critical indication to water system operators that a source of contamination has entered the system.
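
As a rough illustration of this monitoring idea, the short Python sketch below flags readings where the residual either falls below a floor or drops sharply from the previous reading. The threshold values and function name are illustrative assumptions made for this article, not regulatory limits or any utility’s actual alarm logic.

# Hypothetical sketch of residual-monitoring logic: flag readings where the
# free chlorine residual is too low or has dropped sharply since the last
# reading (a possible sign that contamination is consuming the chlorine).
# Thresholds are illustrative assumptions, not regulatory values.
from typing import List

MIN_RESIDUAL_MG_L = 0.2   # assumed minimum acceptable residual, mg/L
SUDDEN_DROP_MG_L = 0.5    # assumed drop between readings worth investigating

def flag_residual_alarms(readings_mg_l: List[float]) -> List[int]:
    """Return indices of readings that warrant investigation."""
    alarms = []
    for i, value in enumerate(readings_mg_l):
        too_low = value < MIN_RESIDUAL_MG_L
        sudden_drop = i > 0 and (readings_mg_l[i - 1] - value) >= SUDDEN_DROP_MG_L
        if too_low or sudden_drop:
            alarms.append(i)
    return alarms

if __name__ == "__main__":
    print(flag_residual_alarms([1.0, 0.9, 0.95, 0.3, 0.1]))  # -> [3, 4]

In practice, a utility would tune such thresholds to its own system and regulatory requirements.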

Water treatment transforms raw surface and groundwater into safe drinking water. Water treatment involves two types of processes: physical removal of solids (mainly mineral and organic particulate matter) and chemical disinfection (killing/inactivating microorganisms). Treatment practices vary from system to system, but there are four generally accepted basic techniques.

1. Coagulation
Alum (an aluminum sulfate) or other metal salts are added to raw water to aggregate particles into masses that settle more readily than individual particles.

2. Sedimentation
Coagulated particles fall, by gravity, through water in a settling tank and accumulate at the bottom of the tank, clearing the water of much of the solid debris.

3. Filtration
Water from the sedimentation tank is forced through sand, gravel, coal, or activated charcoal to remove solid particles not previously removed by sedimentation.

4. Disinfection
Chlorine is added to filtered water to destroy harmful microorganisms. An additional amount, known as a “chlorine residual,” is applied to protect treated water from re-contamination as it travels throughout the distribution system.

Source: Illustration by Bremmer and Goris Communications.
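
For readers who prefer code to diagrams, the toy Python sketch below strings the four steps together as a simple pipeline over a small water-quality record. The stage names follow the list above; the Water record and the numeric effects are invented placeholders, not engineering values.

# Toy pipeline modeling the four treatment steps described above.
# All numbers are invented placeholders for illustration only.
from dataclasses import dataclass, replace

@dataclass
class Water:
    turbidity_ntu: float              # particulate cloudiness
    pathogens_per_l: float            # viable disease-causing organisms
    chlorine_residual_mg_l: float = 0.0

def coagulate_and_settle(w: Water) -> Water:
    # Steps 1-2: coagulated particles settle out, clearing most solids.
    return replace(w, turbidity_ntu=w.turbidity_ntu * 0.2)

def filter_water(w: Water) -> Water:
    # Step 3: filtration removes solids not captured by sedimentation.
    return replace(w, turbidity_ntu=w.turbidity_ntu * 0.1)

def disinfect(w: Water, dose_mg_l: float = 2.0) -> Water:
    # Step 4: chlorination inactivates pathogens and leaves a residual.
    return replace(w, pathogens_per_l=w.pathogens_per_l * 1e-4,
                   chlorine_residual_mg_l=dose_mg_l * 0.5)

if __name__ == "__main__":
    raw = Water(turbidity_ntu=50.0, pathogens_per_l=1e6)
    print(disinfect(filter_water(coagulate_and_settle(raw))))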

Chlorine: The Disinfectant of Choice

Chlorine is added to drinking water to destroy pathogenic (disease-causing) organisms. It can be applied in several forms: elemental chlorine (chlorine gas), sodium hypochlorite solution (bleach) and dry calcium hypochlorite.

When applied to water, each of these forms “free chlorine” (see Sidebar: How Chlorine Kills Pathogens). One pound of elemental chlorine provides approximately as much free available chlorine as one gallon of sodium hypochlorite (12.5% solution) or approximately 1.5 pounds of calcium hypochlorite (65% strength). While any of these forms of chlorine can effectively disinfect drinking water, each has distinct advantages and limitations for particular applications.
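
To make the equivalence concrete, the small Python sketch below converts a quantity of each product form into pounds of free available chlorine using only the approximate ratios quoted above. The names and numbers are illustrative; actual dosing depends on the product’s real strength and the water being treated.

# Approximate available chlorine per unit of product, from the equivalences
# quoted in the text: 1 lb elemental chlorine ~ 1 gal of 12.5% sodium
# hypochlorite ~ 1.5 lb of 65% calcium hypochlorite. Illustrative only.
AVAILABLE_CHLORINE_PER_UNIT = {
    "elemental_chlorine_lb": 1.0,          # lb available chlorine per lb of gas
    "sodium_hypochlorite_gal": 1.0,        # lb per gallon of 12.5% solution
    "calcium_hypochlorite_lb": 1.0 / 1.5,  # lb per lb of 65% product
}

def available_chlorine_lb(form: str, quantity: float) -> float:
    """Pounds of free available chlorine supplied by `quantity` of a product."""
    return AVAILABLE_CHLORINE_PER_UNIT[form] * quantity

if __name__ == "__main__":
    # 10 lb of chlorine gas vs. 15 lb of 65% calcium hypochlorite:
    print(available_chlorine_lb("elemental_chlorine_lb", 10))    # ~10 lb
    print(available_chlorine_lb("calcium_hypochlorite_lb", 15))  # ~10 lb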

Almost all water systems that disinfect their water use some type of chlorine-based process, either alone or in combination with other disinfectants. Table 2-1 shows the percentage of drinking water systems using each of these methods.

The Benefits of Chlorine

Potent Germicide
Chlorine disinfectants can reduce the level of many disease-causing microorganisms in drinking water to almost immeasurable levels.

Taste and Odor Control
Chlorine disinfectants reduce many disagreeable tastes and odors. Chlorine oxidizes many naturally occurring substances such as foul-smelling algae secretions, sulfides and odors from decaying vegetation.

Biological Growth Control
Chlorine disinfectants eliminate slime bacteria, molds and algae that commonly grow in water supply reservoirs, on the walls of water mains and in storage tanks.

Chemical Control
Chlorine disinfectants destroy hydrogen sulfide (which has a rotten egg odor) and remove ammonia and other nitrogenous compounds that have unpleasant tastes and hinder disinfection. They also help to remove iron and manganese from raw water.

How Chlorine Kills Pathogens

How does chlorine carry out its well-known role of making water safe? Upon adding chlorine to water, two chemical species, known together as “free chlorine,” are formed. These species, hypochlorous acid (HOCl, electrically neutral) and hypochlorite ion (OCl-, electrically negative), behave very differently. Hypochlorous acid is not only more reactive than the hypochlorite ion, but is also a stronger disinfectant and oxidant.

The ratio of hypochlorous acid to hypochlorite ion in water is determined by the pH. At low pH (higher acidity), hypochlorous acid dominates, while at high pH hypochlorite ion dominates. Thus, the speed and efficacy of chlorine disinfection against pathogens may be affected by the pH of the water being treated. Fortunately, bacteria and viruses are relatively easy targets of chlorination over a wide range of pH. However, operators of surface water systems whose raw water is contaminated by the parasitic protozoan Giardia may take advantage of the pH-hypochlorous acid relationship, adjusting pH so that chlorination remains effective against Giardia, which is much more resistant to chlorination than either viruses or bacteria.
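
This pH dependence can be expressed through the acid dissociation equilibrium of hypochlorous acid. The short Python sketch below estimates the fraction of free chlorine present as HOCl at a given pH; the pKa value of roughly 7.5 at 25 °C is a commonly cited figure assumed here, not a number taken from this text.

# Fraction of free chlorine present as hypochlorous acid (the stronger
# disinfectant) as a function of pH, using an assumed pKa of ~7.5 at 25 C.
PKA_HOCL = 7.5  # assumed approximate pKa of hypochlorous acid

def hocl_fraction(ph: float, pka: float = PKA_HOCL) -> float:
    """Estimated fraction of free chlorine present as HOCl."""
    return 1.0 / (1.0 + 10 ** (ph - pka))

if __name__ == "__main__":
    for ph in (6.0, 7.0, 7.5, 8.0, 9.0):
        print(f"pH {ph}: about {hocl_fraction(ph):.0%} HOCl")
    # Lower pH -> more HOCl -> faster, more effective disinfection.

With this assumed pKa, the sketch shows roughly three-quarters of free chlorine as HOCl at pH 7 but only about one-quarter at pH 8, which is the shift described above.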

Another reason for maintaining a predominance of hypochlorous acid during treatment has to do with the fact that pathogen surfaces carry a natural negative electrical charge. These surfaces are more readily penetrated by the uncharged, electrically neutral hypochlorous acid than the negatively charged hypochlorite ion. Moving through slime coatings, cell walls and resistant shells of waterborne microorganisms, hypochlorous acid effectively destroys these pathogens. Water is made microbiologically safe as pathogens either die or are rendered incapable of reproducing.
