How many people would be willing to voluntarily drive a car through heavy traffic while blindfolded? What if you were allowed to peek every couple of minutes or hours? Few people could be persuaded to attempt this. Yet, blindfolded is, in effect, the state we are in relative to the real-time operation of our current water supply systems. Other than occasional grab samples, little monitoring of our water occurs outside of the treatment plant. This is similar to being allowed to peek from under the blindfold a couple of times per day.
Some of the vulnerabilities drinking water infrastructures face have been addressed by ramping up physical security and policy, but one real and serious vulnerability remains: an intentional contamination event in the distribution system. Operationally, there is no effective means of completely preventing such an attack. The only remaining option is to detect such an occurrence as soon as possible so its effects can be mitigated. That means monitoring.
The events of September 11, 2001, added a new urgency to monitoring in the distribution system. Various methods and technologies have been proposed to attempt to develop an integrated approach to monitoring that could help lessen the risks associated with contamination events.
There are many challenges in monitoring our water distribution systems. One is the vast number of agents a terrorist could use. The need to detect such a diverse range of contaminants requires a shift away from the traditional approach of developing a sensor specific to a single compound or agent. Many factors shape the design and development of a broad-spectrum monitoring system.
Water quality itself is a challenge: in a given system, basic conditions such as pH, temperature, turbidity and disinfectant residual vary considerably over time. On top of this diversity, the general environment is harsh. The conditions found in the pipes make it difficult to design a monitoring system robust enough to be deployed for extended periods.
Cost constraints are also a major factor in the design of monitoring systems. Cost effectiveness can be addressed in two ways: first, by designing an inexpensive monitor that can be deployed at low cost; second, by developing a monitor whose data can also help decrease the operating costs of the system, making the monitor's expense recoverable.
Some of the various methods and technologies that have been deployed in an attempt to develop an integrated approach to monitoring are discussed here.
Toxicity monitoring
Toxicity testing (the use of organisms to detect changes in water quality) appears to be a reasonable choice when attempting to detect a wide variety of potential threat agents. There have been several attempts to deploy online toxicity monitoring techniques in the distribution system. While it is an intuitively valid approach, there are a number of problems that are inherent in using toxicity methods in the distribution system.
The main concern is culturing and maintenance. The convoluted nature of the distribution system, and the fact that it can be accessed at virtually any point, requires deploying monitoring platforms at many locations to achieve an adequate level of protection. Live organisms demand significant hands-on time; even systems with automated care and feeding tend to need an inordinate amount of attention. Neither factor is amenable to a widely distributed network of sensors.
Another factor that makes toxicity monitoring in the distribution system a questionable undertaking is that compounds such as chlorine must first be removed from the water before the organisms are exposed to it. Chlorine is easily removed, but doing so can alter the toxicity of the water.
The variable environment in the network can also be a problem. Some organisms, such as fish, can be quite sensitive to changes in the general surroundings. Sudden changes in vibration and noise levels could lead to false alarms. Shielding organisms can be costly and limit deployment options. Toxicity monitoring may not be the best option for the distribution system and may serve a more useful role in the monitoring of source water where monitoring points are fewer and more easily controlled.
Lab-on-a-chip technologies
Lab-on-a-chip technologies are an innovation that is rapidly finding use in many fields. These are microscale devices that attempt to miniaturize analytical methods and mass-produce them utilizing the same techniques that facilitate production in the computer chip industry.
One project currently under development uses microfluidic and microchemical techniques to sample and analyze water for a variety of components. These devices are, in effect, miniaturized discrete analyzers that test for specific substances and can use a variety of detection techniques. Recent projects aim to adapt the technology from an independent handheld device to an online configuration.
These devices rely on microfluidic techniques to draw samples and perform analyses, and the distribution system is not a friendly environment for such techniques. Attempts to pre-filter the sample could alter its characteristics. Another issue is that these are discrete analyzers designed to detect specific analytes; they could be thwarted by a toxin the instrumentation was not designed to detect.
Gas chromatography
Various manufacturers have modified gas chromatography methods into online tools. Gas chromatography is a technique used to detect volatile organic compounds (VOCs). Its largest drawback is the limited scope of compounds detected. In addition, some of the instrumentation can be temperamental, and the cost per deployed unit can be quite high.
Optical methods
A variety of new optics-based methods are beginning to come online. One device uses laser-based multi-angle light scattering (MALS) to generate unique bio-optical signatures for microorganisms. The device interrogates a water sample with a laser and analyzes how particles in the water scatter the light.
This appears to be an effective method for monitoring water for biological contamination. However, it would be ineffective against chemical contaminants.
Another potential drawback is sample size. Because the path length is very small, only a small sample can be analyzed, so bacterial or protozoan contaminants present at very low concentrations could easily be missed. The instrumentation would sense large contamination events, but low-level contamination events may go undetected.
Bulk parameter monitoring
Bulk parameter monitoring is the method of monitoring common water quality parameters and then looking for anomalies that may be indicative of a water contamination event. There are a number of manufacturers that are producing multi-parameter instrumentation packages for use in the distribution system.
The problem is what to do with all of this data. Enormous amounts of streaming data need to be processed. Another problem is the minute-to-minute variability present in any system. How are we to determine whether alterations in water quality parameters are significant against a dynamic background? Unless a full-time team of statisticians is employed to make sense of this information, there is a need for intelligent algorithms to streamline the process. Multiple organizations are working toward creating such a system, and one such technology is currently commercially available.
This technology monitors five common bulk parameters simultaneously in real time: pH, conductivity, total organic carbon, turbidity and residual chlorine. Signals from all of the instruments are processed from a five-parameter measure into a single trigger signal. The signal then passes through a proprietary baseline estimator, and the deviation of the signal from the estimated baseline is derived. The magnitude of the deviation is compared to a preset threshold; if it exceeds the threshold, the trigger is activated, indicating a significant water quality change.
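The trigger logic above can be sketched in code. The actual baseline estimator is proprietary and not public, so this minimal version substitutes an exponentially weighted moving average with assumed smoothing and threshold constants; the class name and parameter order are illustrative only.

```python
import numpy as np

# Hypothetical parameter order: pH, conductivity, TOC, turbidity, residual chlorine.
ALPHA = 0.05      # EWMA smoothing factor (assumed)
THRESHOLD = 3.0   # trigger threshold on the fused deviation signal (assumed)

class BulkParameterTrigger:
    """Fuse five bulk water-quality readings into a single trigger signal."""

    def __init__(self, n_params=5, alpha=ALPHA, threshold=THRESHOLD):
        self.alpha = alpha
        self.threshold = threshold
        self.baseline = None          # running EWMA baseline per parameter
        self.var = np.ones(n_params)  # running variance estimate per parameter

    def update(self, reading):
        """Process one timestep of readings; return (deviation, triggered)."""
        x = np.asarray(reading, dtype=float)
        if self.baseline is None:
            self.baseline = x.copy()  # first reading seeds the baseline
            return 0.0, False
        # Normalize each parameter's departure from baseline by its variance,
        # then fuse the five departures into one scalar deviation signal.
        diff = x - self.baseline
        z = diff / np.sqrt(self.var)
        deviation = float(np.linalg.norm(z))
        # Update the baseline and variance estimates (simple EWMA).
        self.baseline += self.alpha * diff
        self.var = np.maximum((1 - self.alpha) * self.var + self.alpha * diff**2, 1e-8)
        return deviation, deviation > self.threshold
```

Feeding the monitor a stretch of stable readings establishes a baseline; a sudden excursion in any one parameter (for example, a jump in total organic carbon) then pushes the fused deviation past the threshold and fires the trigger.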
Once the system triggers, the event's characteristics can be used to classify its cause. Laboratory data for known agents can be used to build a Threat Agent Library of these signals; a signal from the water monitor can then be compared against the library to see if there is a match.
The system is also equipped with a learning algorithm, so that as unknown alarm events occur over time, the system has the ability to store the fingerprint generated during the event. Over time, as the system learns, the probability of an unknown alarm will continue to decrease.
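The library lookup and learning behavior described above might be sketched as a nearest-neighbor match over stored fingerprints. The matching rule, distance tolerance, and event labels here are all assumptions for illustration; the commercial system's actual classification method is not public.

```python
import numpy as np

MATCH_TOLERANCE = 1.0  # maximum distance for a fingerprint match (assumed)

class ThreatAgentLibrary:
    """Match event fingerprints against stored agent signatures."""

    def __init__(self):
        self.signatures = {}  # label -> fingerprint vector

    def add(self, label, fingerprint):
        self.signatures[label] = np.asarray(fingerprint, dtype=float)

    def classify(self, fingerprint):
        """Return the best-matching label, or None if nothing is close enough."""
        x = np.asarray(fingerprint, dtype=float)
        best_label, best_dist = None, float("inf")
        for label, sig in self.signatures.items():
            dist = float(np.linalg.norm(x - sig))
            if dist < best_dist:
                best_label, best_dist = label, dist
        return best_label if best_dist <= MATCH_TOLERANCE else None

    def learn_unknown(self, fingerprint, label):
        """Store the fingerprint of an investigated unknown event so that
        future occurrences match it instead of raising an unknown alarm."""
        self.add(label, fingerprint)
```

A fingerprint near a stored laboratory signature classifies as that agent; one far from everything returns None (an unknown alarm), and once an operator investigates and labels it, the same fingerprint classifies on its next occurrence — which is why the probability of unknown alarms falls over time.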
These systems appear to be a good choice for detecting water quality excursions that could be linked to water security events. One advantage is that these instruments are not new. They are common parameters that the average industry worker is familiar with, thus adding a degree of comfort in operations not afforded by other technologies. They represent measurements that are of interest and use to water utility personnel above and beyond their role as water security devices.
One of the largest advantages to this type of monitoring system is the array’s ability to detect a wide variety of potential threats. The ability to trigger on unique unknown events is also a major plus. A disadvantage is that some events occur during normal operation and may trigger an unknown alarm, although the information can be used to streamline and improve processes. Nonetheless, this learning phase requires time to investigate and classify these alarms. Another disadvantage of such systems is that while they will detect biological events, they are not as sensitive to such events as some other methods.
Syndromic surveillance
As it pertains to water, syndromic surveillance is the concept of using advanced computational techniques and data mining algorithms to monitor a number of non-specific indicators, such as hospital admissions, 911 calls, pharmacy sales and complaints to the utility. These data streams are directed to a centralized computing system that correlates all of the factors and extrapolates the probability of an attack using advanced algorithms.
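In its simplest form, the correlation step described above amounts to combining excess-over-baseline signals from each data stream into one fused score. The baselines, weights, and alert level below are entirely hypothetical; real syndromic systems use far more sophisticated statistical models.

```python
# Hypothetical historical daily baselines and stream weights.
BASELINES = {"hospital_admissions": 40.0, "emergency_calls": 120.0,
             "pharmacy_sales": 300.0, "utility_complaints": 5.0}
WEIGHTS   = {"hospital_admissions": 0.4, "emergency_calls": 0.2,
             "pharmacy_sales": 0.2, "utility_complaints": 0.2}
ALERT_LEVEL = 0.5  # fused score above which an event is flagged (assumed)

def fused_alert_score(todays_counts):
    """Combine each stream's fractional excess over baseline into one weighted score."""
    score = 0.0
    for stream, count in todays_counts.items():
        baseline = BASELINES[stream]
        excess = max(0.0, (count - baseline) / baseline)  # only upward excursions count
        score += WEIGHTS[stream] * excess
    return score

def possible_event(todays_counts):
    """Flag a possible contamination event when the fused score exceeds the alert level."""
    return fused_alert_score(todays_counts) > ALERT_LEVEL
```

A day at baseline scores zero, while simultaneous spikes in several streams, none alarming on its own, push the fused score over the alert level. The point of weighting and fusing is exactly that no single indicator is specific enough to act on alone.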
While much useful information could theoretically be extracted from such a monitoring program, there are drawbacks. Syndromic surveillance was designed to detect naturally occurring outbreaks of disease; the effects of intentional contamination may spread quickly enough to make detection by this means redundant. Moreover, relying on such a mode of detection delays reporting of the event until actual exposures have occurred. That may be adequate for a bacterial contaminant with a fairly long incubation period that can be treated with antibiotics. It is woefully inadequate in the case of a chemical contamination event.
Many chemicals and biotoxins are not detectable by the consumer, as they have no taste or odor. Also, the onset of symptoms may be delayed for some time after exposure. The problem is that many of the chemical contaminants have no known treatment after exposure. In these cases, the reliance on hospital admissions becomes no more than “body count technology.”
One of the largest disadvantages to such a system is that even if an attack is indicated, without more traditional water quality monitoring to correlate, there is nothing to link the attack back to water. Syndromic surveillance does have some merit when the stream of data being analyzed includes real-time water quality monitoring results. When these results are used as the primary means of detecting an attack and the other subsidiary data is used as confirmation, the approach has considerable merit.
The value of monitoring
Monitoring is a critical component of any water security program. There is no other feasible way to address the severe vulnerability presented by the threat of an intentional contamination event in the distribution system. The ability to contain and isolate an incident is critical to preventing loss of life and limiting the scope of cleanup. Remediation could be a very expensive proposition: main pipes may need to be replaced, as well as some household plumbing. The need to rapidly detect and contain an event is therefore critical in reducing casualties and destruction.
With the current state of technology, there is no need for us to operate our water systems as if we were blindfolded. Admittedly, the instrumentation available today isn’t going to give us super X-ray or even 20/20 vision, but it will allow us a clear enough picture to avoid many of the hazards that we would surely encounter if we left the blindfold securely in place.