The effect of differences between rainfall measurement techniques on groundwater and discharge simulations in a lowland catchment


Abstract

Several rainfall measurement techniques are available for hydrological applications, each with its own spatial and temporal resolution and errors. When such rainfall datasets are used as input for hydrological models, their errors and uncertainties propagate through the hydrological system. The aim of this study is to investigate the effect of differences between rainfall measurement techniques on groundwater and discharge simulations in a lowland catchment, the 6.5 km² Hupsel Brook experimental catchment. We used five distinct rainfall data sources: two automatic raingauges (one in the catchment and another one 30 km away), operational (real-time and unadjusted) and gauge-adjusted ground-based C-band weather radar datasets, and finally a novel source of rainfall information for hydrological purposes, namely, microwave link data from a cellular telecommunication network. We used these data as input for a recently developed rainfall-runoff model for lowland catchments, and intercompared the five simulated discharge and groundwater time series for a heavy rainfall event and for a full year. Three types of rainfall errors were found to play an important role in the hydrological simulations: (1) biases, found in the unadjusted radar dataset, are amplified when propagated through the hydrological system; (2) timing errors, found in the nearest automatic raingauge outside the catchment, are attenuated when propagated through the hydrological system; (3) seasonally varying errors, found in the microwave link data, affect the dynamics of the simulated catchment water balance. We conclude that the hydrological potential of novel rainfall observation techniques should be assessed over a long period, preferably a full year or longer, rather than on an event basis, as is often done.