The 5 Essential Elements of a Hydrological Monitoring Programme

01 July 2012

by Stuart Hamilton¹

Water is the natural capital of the growing world population. Services built on our natural capital are the currency of the 21st century. The timing and spatial distribution of surface water quantity – and the variability in quality of that water – define how we design and build the infrastructure necessary for our energy, agriculture, mining, transportation and industrial sectors.

But water can also take lives. Droughts and floods are threats that require constant vigilance. Our ability to predict flooding, plan for droughts and support healthy ecosystems is challenged by land-use and climate change. Safe drinking water sources and entire ecosystems depend on continuous improvements in our understanding of, and efforts to protect, our water resources.

In fact, it is difficult to overstate the importance of the availability, reliability, and accuracy of data from water monitoring. Today’s hydrometric monitoring networks range from volunteer stewardship of small watersheds to continental-scale initiatives. Collectively, they are the basis for every action taken to support beneficial uses of water and to minimize threats from water.

Written for water resource managers, this paper outlines the five essential elements of a successful hydrological monitoring programme:

1) Quality Management System

2) Network Design

3) Technology

4) Training

5) Data Management

The day-to-day work of the stream hydrographer has changed substantially in just the past decade. It is time to review how these changes affect the end-to-end system for collecting and publishing credible and valid data. This document presents a modern ‘best practices’ approach to hydrometric monitoring. The practices are fully scalable to any size of network and can improve the availability, reliability, and accuracy of all water information assets.


1 Quality Management System


A Quality Management System includes a set of standard operating procedures that govern the data production process to ensure that the data are of consistent, known quality. Every monitoring programme requires clear objectives for (1) data quality, (2) service, and (3) security that are closely linked with the needs of the end users. The Quality Management System provides rules to direct and control an organization towards meeting these quality management objectives.

In evaluating or creating a Quality Management System, water resource managers must keep in mind the concept of “Fitness for Purpose.” Data adequate to order an evacuation of a floodplain, for example, may be inadequate for testing a hypothesis about a trend. End-users of data develop a trust relationship with data providers based on their confidence that the quality management objectives – for data quality, service, and security – have been met with respect to their intended purpose.
 

Quality Objectives

Quality is a result of observation and information production processes. These processes need to be enforced by formal compliance with documented standard operating procedures. There are several industry sources for hydrometric standards, including:

  • Techniques and Methods Reports published by the US Geological Survey (USGS);
  • USGS Techniques of Water-Resources Investigations Reports;
  • International Organization for Standardization (ISO) Technical Committees 113 and 147; and
  • WMO publications prepared as part of the Quality Management Framework – Hydrology: Technical Regulations, Vol. III – Hydrology (WMO-No. 49, 2006 edition), the Guide to Hydrological Practices (WMO-No. 168, sixth edition) and various manuals, including the second edition of the Manual on Stream Gauging. All of these can be downloaded from the WMO website free of charge.

A commitment to internationally accepted technical standards provides a basis for inter-comparability of data. Data produced by different agencies (or even by different hydrographers within the same agency) should have similar accuracy and precision. This means that if hydrographers were to independently monitor the same gauge, the resultant discharge hydrographs would be very similar and without systematic bias.


Service Objectives

The service objectives address the completeness of the data (for given levels of quality assurance at different lag times since observation). Historically, hydrometric data were published annually, as aggregated daily values and extreme statistics. Today, the focus is on real-time, continuous publication of unit value data. A modern hydrometric service needs to address evolving expectations for data reliability and timeliness.

Achieving the desired service objectives is primarily a function of the balance between:

  • Staffing (e.g. response time for instrument failure);
  • Equipment specifications (i.e. instrument reliability);
  • Life-cycle management of equipment (i.e. calibration and control procedures);
  • Efficiencies in data production (e.g. automated notifications, auto-corrections, and auto-publication); and,
  • Feedback from the data production process (e.g. sufficient metadata to support a continuous improvement process).

There is also an increasing expectation that data should be openly discoverable, searchable and accessible. Harmonized standards for data inter-operability are provided by the Open Geospatial Consortium. For example, the WaterML 2.0 standard provides for the exchange of (1) point-based time series data, (2) processed values such as forecasts and aggregations, and (3) relevant information on monitoring points, procedures and context. By working within the Open Geospatial Consortium framework, water resource managers ensure that observations can be provided in the context of relevant coverages and features.
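
As an illustration of what this inter-operability looks like in practice, the following sketch retrieves a WaterML 2.0 time series from an OGC Sensor Observation Service. The endpoint URL, station identifier and observed property are hypothetical placeholders; a real request would use the parameters published by the data provider.

```python
# Minimal sketch of consuming a WaterML 2.0 time series from an OGC Sensor
# Observation Service (SOS). The endpoint URL, station identifier and
# observed property below are hypothetical placeholders.
from urllib.request import urlopen
from urllib.parse import urlencode
import xml.etree.ElementTree as ET

SOS_URL = "https://example.org/sos"  # hypothetical endpoint
WML2 = "http://www.opengis.net/waterml/2.0"

params = urlencode({
    "service": "SOS",
    "version": "2.0.0",
    "request": "GetObservation",
    "observedProperty": "Discharge",    # hypothetical identifier
    "featureOfInterest": "gauge-0001",  # hypothetical station id
    "responseFormat": WML2,
})

with urlopen(f"{SOS_URL}?{params}") as resp:
    tree = ET.parse(resp)

# Each MeasurementTVP element carries one time-value pair of the series.
for tvp in tree.iter(f"{{{WML2}}}MeasurementTVP"):
    t = tvp.findtext(f"{{{WML2}}}time")
    v = tvp.findtext(f"{{{WML2}}}value")
    print(t, v)
```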


Security Objectives

Hydrometric data are valuable, requiring substantial capital, human and operational investments to obtain. The security objectives aim to protect these investments over the life of the data. In a well-maintained data management environment, the value of the data accrues with time.

But any information legacy is vulnerable to neglect, loss and destruction. Technological advancements can result in fragmented records and incompatible formats. Continuity between modern systems and historical archives must be managed with care and diligence.

The Global Climate Observing System (GCOS) Principles provide several best practices for maintaining data integrity when managing time series data. In particular: “The details and history of local conditions, instruments, operating procedures, data processing algorithms, and other factors pertinent to interpreting data (i.e. metadata) should be documented and treated with the same care as the data.”

Best practices for data curation ensure that (1) the data are secure and stored out of harm’s way, (2) metadata are complete, and (3) documentation is available for any changes in methods that could potentially impact the integrity of the data.


Results Focus

It is one thing to clearly articulate the desired data quality, service and security objectives. However, the Quality Management System must also verify that the product meets the needs of end-users. Any departure from expected results should provide feedback, creating a loop of continuous improvement. The needs of end-users change with time so the Quality Management System has to be adaptive.

Verifying that the quality objectives have been met is a two-step process. Quality Control is a system of routine and consistent checks to ensure data integrity, completeness, and compliance with stated standard operating procedures. Quality Assurance is a system of independent review procedures to verify that the data quality objectives are met.

Most National Hydrometric Services have developed their own Quality Management System; however, some are choosing to become certified to the ISO 9000 series of standards.


2 Network Design


Network design is an ongoing process with new stations being established and existing stations being discontinued as priorities and funding evolve. This process must be managed with selective thinning and pruning, while nurturing new growth to fill data voids. Updating the design of a network is fundamentally a sampling problem. The challenge is to find the right balance between hydrometric monitoring objectives and site desirability.


Sampling the Phenomena of Interest

How will the information be used? The design process must begin with the end in mind. Locations upstream and downstream of dams or diversions are both useful, but for very different purposes. An upstream location is an integration of all runoff processes occurring in the contributing watershed, whereas a downstream location is rich in information about what will be happening in receiving aquatic and riparian ecosystems. A good location is one where the variation in discharge is sensitive to the phenomena of interest.

The monitoring objectives determine which parameters need to be included in the network design. If the objective is regulatory compliance or to develop statistics for engineering design, then perhaps the only parameter needed is discharge. However, if the purpose is to understand runoff processes, to develop water management policies, or to calibrate predictive models, then network design should consider all relevant components of the water cycle, including stores (e.g. groundwater, snowpack, and lake levels) and fluxes (e.g. temperature, evaporation, and precipitation). The measurement of some parameters (e.g. sediment and water quality) must be co-located with discharge gauging if loadings are a requirement. Jurisdictional collaboration is integral to the network design process and ensures an efficient, coordinated approach to monitoring within a watershed.


Sampling the Hydroscape

The design of a successful hydrometric monitoring network must next consider how the variability in space needs to be sampled so that the variability in time can be effectively monitored. In other words, the location of gauges should reflect the geophysical complexity of the landscape. In order to satisfy the assumption that the data are scalable and representative, gauges must be located across the scale of the geophysical variability of the watershed.

The WMO Guide to Hydrological Practices recommends minimum station densities for different physiographic regions:

[Table: recommended minimum density per station, expressed as area in km² per station]

Ultimately, the pragmatic station density in a region is a function of risk tolerance. These regional-scale density recommendations may be inadequate to fully characterize local-scale threats from flooding or to provide the needed guidance for local-scale water supply management. Risk tolerance is often particularly high in the developing world, resulting in a perpetual need to react to, rather than prevent, water-related crises.


Selecting the Site

Once the monitoring objectives and criteria for geophysical representativeness are established, then a specific reach of river can be selected for monitoring. A desirable location is one with (1) uniform, gradually varying flow, (2) inexpensive site access, (3) stable geophysical features for vertical control benchmarks and for channel control, and (4) safe stream gauging conditions.

Monitoring objectives often restrict the choice of possible locations to those with adverse monitoring conditions. A mismatch between local conditions and appropriate technology results in poor quality data and high maintenance requirements for both field and office procedures. Technologies are available to mitigate almost any compromise needed in site selection, but the most reliable and affordable solutions are predicated on good site selection.

Site selection affects the following outcomes:

  • Data persistence (i.e. a well selected location should produce data for generations to come),
  • Data quality (e.g. conformance with underlying assumptions),
  • Data representativeness (i.e. relevance to ungauged locations),
  • Operational costs (e.g. site access),
  • Liability risks (i.e. occupational and/or public safety),
  • Selection of methods (e.g. use of rating curve vs. index velocity method), and
  • Reliability risks (e.g. exposure to vandalism).

With so much at stake, a significant investigation is warranted for any change in network size. Unfortunately, water resource managers often come under pressure from management to expand or contract the network on short notice (for example, to make the change by fiscal year end). Thus, many important decisions are made in haste. As a best practice, network design should be an ongoing process with preparedness to make wise choices on short notice.


3 Technology


Selecting the best technology for a given location is more complex than ever before. Even when choosing a simple pressure transducer, a hydrologist must consider the type (e.g. piezoelectric, capacitive, inductive, potentiometric, vibrating wire, vibrating cylinder, or strain-gauge) and the method of deployment (e.g. bubbler, vented or compensated). For each combination of these technologies there are numerous vendors and products available – and each product has a performance specification that can be characterized by an error band, hysteresis, resolution, sensitivity and time constant.

Hydrometric network operators must consider several additional factors:

  • Reliability requirements – an acceptable mean time between failures.
  • Accuracy in the deployed setting – the blanking distance of some Acoustic Doppler Current Profilers (ADCPs), for example, may be too great to correctly measure discharge for some stream geometries.
  • Cost of site access – for remote sites, the incremental cost of an Acoustic Doppler Velocity Meter (ADVM) for use with an index-velocity model may be easily recouped by reduced site visits.
  • Local site factors – high sediment transport, algal blooms, and river ice are all factors that warn against deploying expensive submersible technology.
  • Instrument sensitivity and precision – relates to the time and effort spent on post-processing of the data.
  • Training and familiarity – limiting the variety of products deployed in a region can greatly reduce both the training burden and the likelihood of mistakes caused by a lack of familiarity with a specific device.


Total Cost of Ownership

Factors that affect the total cost of ownership of technology include: the initial capital cost; field calibration and service frequency requirements; unscheduled field visits to repair or replace; time and effort spent on corrections and post-processing of the data; data lost due to sensor failure; amount of data degraded by high uncertainty; and supplies (e.g. compressed gas and/or power source). Money saved at the time of purchase can be easily exceeded by operations and maintenance costs.
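
A back-of-the-envelope comparison makes the point concrete. In the sketch below, every figure is hypothetical; the purpose is only to show how recurring operations and maintenance costs can dwarf the purchase price over a typical service life.

```python
# Back-of-the-envelope total cost of ownership over a 10-year service life.
# Every number here is hypothetical and exists only to illustrate the trade-off.
YEARS = 10

def total_cost(capital, visits_per_year, cost_per_visit,
               annual_data_loss_cost, annual_supplies):
    """Capital cost plus recurring operating costs over the service life."""
    recurring = (visits_per_year * cost_per_visit
                 + annual_data_loss_cost + annual_supplies)
    return capital + YEARS * recurring

# A cheap sensor that needs frequent service and loses more data...
cheap = total_cost(capital=2_000, visits_per_year=6, cost_per_visit=1_500,
                   annual_data_loss_cost=2_000, annual_supplies=300)
# ...versus a premium sensor with fewer visits and less data loss.
premium = total_cost(capital=12_000, visits_per_year=2, cost_per_visit=1_500,
                     annual_data_loss_cost=500, annual_supplies=300)

print(f"cheap sensor:   {cheap:,}")    # 115,000
print(f"premium sensor: {premium:,}")  # 50,000
```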

Low-cost monitoring equipment does, nonetheless, have its place. For example, in monitoring a high-risk location (e.g. during a dynamic river ice breakup), one needs to get as much data as possible before the sensor is inevitably lost or destroyed. There can be as much as an order of magnitude of difference in the cost of sensors. Low-cost sensors have also led to the concept of “a network as a sensor” where several redundant sensors can be deployed at a gauge. In some cases, it is advantageous to use the average of these independent, if imprecise, measurements and also get a measure of the aggregate uncertainty. This concept also lends itself to deploying many low-cost sensors to sample landscapes at the scale of space-based observation systems.
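
As a simple illustration of the “network as a sensor” concept, one scan of several redundant low-cost sensors can be averaged, with the standard error of the mean serving as a first estimate of the aggregate uncertainty (readings below are illustrative):

```python
# One scan of five redundant low-cost stage sensors (illustrative readings):
# report the mean and the standard error of the mean as a first estimate of
# the aggregate uncertainty.
import statistics

readings_m = [2.41, 2.38, 2.44, 2.40, 2.43]

mean = statistics.fmean(readings_m)
sem = statistics.stdev(readings_m) / len(readings_m) ** 0.5

print(f"stage = {mean:.3f} m +/- {sem:.3f} m (n={len(readings_m)})")
```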

In the context of total cost of ownership, telecommunication technologies offer a significant improvement in data reliability by enabling real-time station health monitoring and better timing of stream gauging activities.


4 Training


No investment in technology can compensate for poor choices in data collection and data handling. Errors caused by procedural mistakes are the most difficult to detect and correct in data post-processing. Training accelerates the rate at which competencies are gained while simultaneously reducing the frequency of mistakes. Training is, arguably, more important than ever: the demographic profile of many monitoring agencies is now bimodal, with one hump of new recruits and another of pre-retirees, creating an urgent need to compensate for the loss of experience with improvements in knowledge.

Stream hydrographers must be skilled in many disciplines to be truly effective. The measurement of flowing water is a sophisticated application of science and engineering principles. Decisions made in the field and for data interpretation require a basic understanding of physics, chemistry, biology, hydrology, hydrodynamics, fluvial geomorphology, math and statistics.

Additionally, the installation and operation of hydrometric monitoring equipment requires skills in plumbing, wiring and programming. Stream gauging requires expert interpretation of quality management protocols with respect to the selection and application of methodologies while considering the specific context of the measurement conditions. The stream hydrographer must make decisions to limit adverse environmental effects and to preserve both personal and public safety.

While there are limited options for training, some National Hydrometric Services (e.g. USGS) offer courses to the general public. Short courses in hydrometric methods are also available from hardware and software vendors.

Investments in training improve data quality, increase productivity, improve gauge reliability, and enhance safety. Training in stream hydrography must be a continuous process to keep current with best practices as they apply to new and emerging technologies.


5 Data Management


Improvements to hydrological monitoring programmes often focus on field-based technologies. What is frequently ignored is how the data are managed after acquisition. Hydrological data are complex. Stream hydrographers are responsible for storing, validating, analysing and reporting on vast amounts of water data.

Specialized Hydrological Data Management Systems are available to meet the evolving needs of hydrologists and to support current industry standards for water information management. Software designed specifically for hydrologists is required to achieve excellence and effectiveness in hydrological monitoring.


Auditable & Defensible Data

As discussed, the Quality Management System establishes the credibility of the data production process. One important role of the Data Management System is to establish the defensibility of the data by providing evidence of compliance with the Quality Management System. This means the Data Management System must preserve the full history of the data, including who did what, when, how and why.

As a best practice, raw data must be preserved intact and all changes must be recorded and be reversible, if needed. This means that data can be rolled back in time to show exactly what edits, corrections, approvals or notes were applied at any point in time. This is particularly important when dynamically publishing data using web pages or web services as opposed to static documents. The complete history (of who did what, when, where, how, and why) supports peer quality control and supervisory quality assurance. This history confirms the second half of the quality management mantra: “Say What You Do, Do What You Say.”
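
A minimal sketch of this idea, assuming a simple append-only edit history for a single observation, is shown below: the raw value is never overwritten, every correction records who, when and why, and the series can be rolled back by replaying edits up to any cut-off time.

```python
# Append-only edit history for a single observation: the raw value is never
# overwritten, every correction records who/when/why, and the series can be
# rolled back by replaying edits up to any cut-off time.
from dataclasses import dataclass
from datetime import datetime

@dataclass(frozen=True)
class Edit:
    applied_at: datetime  # when the correction was made
    editor: str           # who made it
    reason: str           # why (supports peer QC and supervisory QA)
    new_value: float

def value_as_of(edits: list[Edit], cutoff: datetime) -> float:
    """Return the value as it stood at `cutoff`, replaying edits in order."""
    value = edits[0].new_value  # edits[0] holds the original raw observation
    for e in edits[1:]:
        if e.applied_at <= cutoff:
            value = e.new_value
    return value

history = [
    Edit(datetime(2012, 3, 1, 8, 0), "logger", "raw observation", 1.92),
    Edit(datetime(2012, 3, 5, 14, 0), "jsmith", "sensor drift correction", 1.87),
]
print(value_as_of(history, datetime(2012, 3, 2)))  # 1.92 (before correction)
print(value_as_of(history, datetime(2012, 3, 6)))  # 1.87 (after correction)
```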


Centralized & Accessible Data

Hydrologists must manage many types of data in all kinds of formats, for example: lab data in Excel, time series in CSV, gauging data in hardware vendor software, and station data in GIS. As a best practice, all of these data and supporting metadata are consolidated and managed as a secure, coherent collection. The best solutions support relational queries of this data collection. Web service connections to this database mean that data and metadata are accessible from anywhere, at any time.
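
A minimal sketch of this consolidation, using a deliberately simplified and hypothetical relational schema, is shown below: once stations and observations live in one store, a single query spans what used to be separate Excel, CSV and GIS silos.

```python
# Hypothetical, deliberately simplified relational schema consolidating
# station metadata and time series observations in one secure store.
import sqlite3

con = sqlite3.connect("hydro.db")
con.executescript("""
CREATE TABLE IF NOT EXISTS station (
    station_id TEXT PRIMARY KEY,
    name       TEXT,
    latitude   REAL,
    longitude  REAL
);
CREATE TABLE IF NOT EXISTS observation (
    station_id TEXT REFERENCES station(station_id),
    parameter  TEXT,   -- e.g. 'stage' or 'discharge'
    obs_time   TEXT,   -- ISO 8601
    value      REAL,
    grade      TEXT    -- quality grade assigned under the QMS
);
""")

# One relational query spans what used to be separate files and formats.
rows = con.execute("""
    SELECT s.name, o.obs_time, o.value
    FROM observation o JOIN station s USING (station_id)
    WHERE o.parameter = 'discharge' AND o.grade = 'approved'
    ORDER BY o.obs_time
""").fetchall()
print(rows)
```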


Real-Time Data & Automation

A modern hydrometric monitoring system delivers data dynamically in real-time. Ideally, the best data are continuously available and can be served using international standards for inter-operability. This means that end-users benefit as soon as new data are appended, erroneous values are filtered, corrections are applied, rating curves are updated or shift corrections are applied. The best solutions also provide end-users with informative metadata about the quality and status of the data. Data can be filtered based on the state of the data in the Quality Management System process. Archival quality data are clearly identified and ‘locked’ from further editing.

Automated notifications provide timely warnings about hydrological events and alert hydrographers to any faults or station health indicators that require immediate attention. Automated data correction algorithms censor invalid values and correct persistent and/or predictable errors in real-time. This eliminates some of the most onerous and repetitive tasks, allowing the stream hydrographer to focus on high value interpretive analysis. Automated reporting provides high value data products to water resources professionals and decision makers on an event-driven or scheduled basis.
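
The sketch below illustrates two such automations in their simplest possible form: censoring values that jump implausibly between scans, and raising an alert when stage crosses a flood threshold. Both thresholds are illustrative assumptions, not recommendations.

```python
# Two simple real-time automations: censor values that jump implausibly
# between scans, and raise an alert when stage crosses a flood threshold.
# Both thresholds are illustrative assumptions, not recommendations.
MAX_STEP_M = 0.50     # assumed maximum plausible change between scans
ALERT_LEVEL_M = 3.20  # assumed flood-watch threshold

def screen(previous, current):
    """Censor `current` as a spike if it jumps implausibly from `previous`."""
    if previous is not None and abs(current - previous) > MAX_STEP_M:
        return None  # flagged invalid; left for interpretive review
    return current

last = None
for stage in [2.10, 2.12, 9.99, 2.15, 2.40, 2.85, 3.25]:  # 9.99 is a spike
    value = screen(last, stage)
    if value is None:
        print(f"censored suspect value {stage}")
        continue
    if value >= ALERT_LEVEL_M:
        print(f"ALERT: stage {value} m exceeds {ALERT_LEVEL_M} m")
    last = value
```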


Credible Rating Curves

The best solutions for developing and validating rating curves are engineered from basic hydraulic principles. The full suite of information gathered in the field is relevant to the calibration process, not just the x, y coordinates of the rating measurements. This includes consideration of site photos, cross sections, field notes, measurement quality, control conditions, historical ratings, and the time series of stage data. An evidence-based approach to curve-fitting has been shown to be both less work and more accurate than forever ‘chasing’ the curve with statistical regression techniques.

With modern hydrometric monitoring systems, discharge derivation models are calibrated with respect to underlying hydraulic science and engineering principles. The result is:

  • Improved confidence in extrapolation (within the range of known channel geometry),
  • Improved agreement on a solution (i.e. different hydrographers will independently produce similar results), and
  • Improved defensibility of results (i.e. rating curve parameters help to constrain the solution).

It is often necessary to accommodate shifting channel control conditions with corrections to the stage-discharge model. The best solutions for managing shift corrections include the inspection and interpretation of field observations, residual plots, and time series visualizations.
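
A minimal sketch of a hydraulically informed rating follows, assuming a single-segment power law Q = C(h − h0)^b with a trial cease-to-flow offset h0 judged from field evidence. The fitted exponent can be checked against hydraulic expectations (roughly 1.6 for a rectangular section under channel control), and a shift term accommodates temporary changes in control. All gaugings are illustrative.

```python
# Simplified single-segment rating curve, Q = C * (h - h0)^b, fitted by
# log-log regression for a trial cease-to-flow stage h0. The recovered
# exponent b can be checked against hydraulic expectations (~1.6 for a
# roughly rectangular section under channel control). Gaugings illustrative.
import math
import statistics

gaugings = [  # (stage h in m, measured discharge Q in m^3/s)
    (0.80, 1.2), (1.10, 3.5), (1.60, 9.8), (2.20, 21.0), (3.00, 43.0),
]
h0 = 0.50  # trial offset (cease-to-flow stage), judged from field evidence

# Linear regression in log space: log Q = log C + b * log(h - h0)
xs = [math.log(h - h0) for h, _ in gaugings]
ys = [math.log(q) for _, q in gaugings]
b = statistics.covariance(xs, ys) / statistics.variance(xs)
C = math.exp(statistics.fmean(ys) - b * statistics.fmean(xs))

def discharge(h: float, shift: float = 0.0) -> float:
    """Rate a stage; `shift` accommodates a temporary change in control."""
    return C * (h + shift - h0) ** b

print(f"Q = {C:.2f} * (h - {h0}) ^ {b:.2f}")
print(f"Q at h=1.60 m: {discharge(1.60):.1f} m^3/s")
print(f"Q with -0.05 m shift: {discharge(1.60, shift=-0.05):.1f} m^3/s")
```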


Data Visualization, Correction, & Markup

Advanced visual interpretation and analysis of the data is needed to identify errors that cannot be detected automatically. Sophisticated graphical tools available with Data Management Systems make it easier to calibrate time series data using field observations from a reference gauge. Specialized corrections can be made for many of the common, often repetitive, errors typical of the technologies used for hydrometric monitoring. Sophisticated methods are needed to estimate longer gaps in the data and periods of ice effect. Comprehensive markup capabilities are also required: to comment on these actions, to add event markers and quality grades, and to change the status of the data.


Reporting and Publication

The best Data Management Systems provide for continuity in reporting with customizable report templates that can be tailored to match legacy reports. New high-value reports can be developed from scratch or by modifying templates for industry standard reports. The content for the reports can be filtered according to status in the Quality Management System so that reports of archival quality data can be readily produced for conventional publication. Access to web services provides the ability to dynamically publish data, based on metadata filters, using industry-wide standards.


Modern Hydrological Monitoring Programmes

Starting from clearly defined water data quality objectives and ending with the timely publication of credible information, the five essential elements presented in this paper are fundamental to any modern hydrometric monitoring programme. Best practices, industry standards, and technologies for hydrometric monitoring have changed substantially in the last decade. A new ‘normal’ is emerging out of this change and it is time to re-engineer hydrometric programmes to improve the availability, reliability, and accuracy of water information assets.

Changes made to optimize efficiencies and maximize effectiveness in the delivery of critical hydrologic data products and services will ensure the success of megaprojects, the preservation of vital ecosystems, and the safety of citizens. Improvements in water data inter-operability and accessibility will support evidence-based decision-making for water-related problems, from the scale of culvert design up to global environmental policy-making, ultimately making the world a better place for generations to come.

References:

[1] US Geological Survey (USGS) Techniques and Methods Reports

[2] USGS Techniques of Water-Resources Investigations Reports

[3] ISO Technical Committee 113

[4] World Meteorological Organization (WMO) Operational Hydrology Reports

[5] WaterML 2.0 Standard

[6] Global Climate Observing System (GCOS) Principles

[7] ISO 9000 Series of Standards

[8] WMO Guide to Hydrological Practices

_________

¹ Associate Expert, WMO Commission for Hydrology; Canadian Liaison, Hydrometry Committee (TC 113), International Organization for Standardization (ISO); President, North American Stream Hydrographers (NASH); and Senior Hydrologist, Aquatic Informatics.


Hydrological Monitoring – 2012 Global Industry Survey

Your Participation Matters!

Hydrological monitoring technologies, industry standards, and best practices have changed drastically in the last decade. This September, the global 2012 Hydrological Monitoring Industry Survey will help quantify current challenges, practices, and trends.

Scientists, hydrologists, hydrographers, and water resource managers from around the globe are invited to participate here: http://www.surveymonkey.com/s/2012Water

Everyone who completes the survey by Friday, 5 October 2012 will be entered to win a new iPad (or a $500 donation to their charity of choice). All participants will also receive a copy of the final report so they can benchmark their organization’s hydrometric monitoring programme.

Hydrological monitoring has changed substantially as a result of adaptation to new and emerging technologies. A new ‘normal’ is emerging and it is time to re-engineer hydrometric programmes to optimize efficiencies and maximize effectiveness in the delivery of programme products and services. But one thing that is lacking is good data on what, exactly, the new normal looks like. The 2012 Hydrological Monitoring Industry Survey is an opportunity for members of the community to work together to gain a better understanding of the current state of the industry.

The survey can be completed here (before 5 October 2012): http://www.surveymonkey.com/s/2012Water
