[Figure: valve transient]

Bayesian Modelling of Fluid Flow in Pipelines

Site maintained by Jonathan Rougier. Please contact me if you would like further details.

Statistics Group
Department of Mathematical Sciences
University of Durham
South Road
Durham DH1 3LE
England

Tel: +44 (0) 191 374 2361
Fax: +44 (0) 191 374 7388


We introduce uncertainty into the traditionally deterministic analysis of transient fluid behaviour in pipelines (the water-hammer equations). We are able to represent our beliefs about the fluid as a probabilistic assessment of pressure and flow at vertices along the pipeline, and to propagate these beliefs forward in time. This allows us to perform a more informative off-line analysis of particular operational events (such as rapid valve closure). We can also use our approach for on-line monitoring, e.g. for leak-detection.


On this page you will find information about the following:
  * A generic explanation of our approach
  * Belief structure
  * Leak detection
  * Gallery

You can also download a copy of our paper A Bayesian Analysis of Fluid Flow in Pipelines as a postscript file or a pdf file, from which the outlines below are taken. A version of this paper will be appearing in the Journal of the Royal Statistical Society, Series C, in 2001.


A generic explanation of our approach

In our approach we turn a deterministic physical model into a stochastic dynamic linear model, so that we can exploit the suitability of these models for real-time Bayesian learning. In doing so, we have taken particular care to source our uncertainty correctly, in the following steps (a small illustrative sketch of the first two steps follows the list):
  1. We represent parameter uncertainty as a random field in the spatial variables, by specifying a mean function and a covariance kernel.
  2. We solve the deterministic form of the physical model for a discrete representation in the presence of parameter uncertainty. Thus we create discrete random quantities with mean and covariance structures that follow from our uncertainty about the parameters interacting with the physical process.
  3. We provide stochastic models for the evolving boundary conditions. In this way we create a stochastic dynamic model for the physical process.
  4. We linearise the dynamic model. The result is a stochastic representation in which the mean of the state vector will evolve in exactly the same way as in the original deterministic model. The difference is that in our model we can also evolve a variance for the state vector from an initial variance.
  5. We incorporate the data by updating beliefs about the state vector with due allowance for measurement errors.
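
As a concrete illustration of steps 1 and 2, the Python sketch below (ours, not taken from the paper) treats the pipeline coefficient as a Gaussian random field over the vertex positions, specified by a mean function and a covariance kernel. The constant mean, the squared-exponential kernel and all numerical values are illustrative assumptions only.

    import numpy as np

    def mean_fn(x):
        # assumed constant prior mean for the pipeline (friction) coefficient
        return np.full_like(x, 0.02)

    def cov_kernel(x1, x2, sigma=0.005, length=100.0):
        # assumed squared-exponential kernel: nearby sections of pipe have
        # similar coefficients, distant sections are nearly independent
        d = x1[:, None] - x2[None, :]
        return sigma ** 2 * np.exp(-0.5 * (d / length) ** 2)

    x = np.linspace(0.0, 1000.0, 21)   # vertex positions along the pipeline (m)
    mu = mean_fn(x)                    # prior mean of the discretised coefficient field
    K = cov_kernel(x, x)               # prior covariance between vertices
    K = K + 1e-10 * np.eye(x.size)     # small jitter for numerical stability

    # one realisation of the field, e.g. as an input to the deterministic solver
    draw = np.random.multivariate_normal(mu, K)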


Belief structure

By `state vector' we mean a description of the pipeline at a given time. Traditionally the state vector for transient fluid analysis consists of pressures and flows measured at a specified number of vertices (or nodes) along the pipeline. In the deterministic analysis knowledge of this state vector at time t-1 can be used (with appropriate boundary conditions) to determine the state vector at time t.

We generalise the simple state vector in two ways. First, we augment it with an extra set of quantities denoting the pipeline coefficient (crudely, the local friction effect) between each of the vertices. Therefore the state vector consists of pressures, flows and friction coefficients. These friction coefficients may or may not evolve through time: it is up to the engineer to determine this. Second, we do not treat the state vector as known, but as unknown, with a given mean and variance. Naturally if some quantities are known then their variances can be zero.

The purpose of our analysis is to allow us to evolve not just the state vector, but, more generally, the mean and the variance of the state vector, through time according to the boundary conditions. Therefore uncertainty about the state vector at time t-1 will feed through to uncertainty about the state vector at time t. This propagation of uncertainty is done entirely in accordance with the physics of fluid flow in pipelines. An interesting implication is the existence of `waves' of uncertainty that can traverse the pipeline in much the same way as a pressure wave.
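
A minimal Python sketch of this evolution, under the usual linear-Gaussian assumptions: writing the linearised transition at a given time step as a matrix A with boundary forcing b and evolution variance W (placeholders below, not the matrices derived in the paper), the mean of the state vector moves exactly as the deterministic model would, while a variance is carried along with it.

    import numpy as np

    def evolve(m, V, A, b, W):
        # one time step of the linearised stochastic model:
        # the mean follows the deterministic (linearised) update,
        # the variance is propagated through the same physics
        m_new = A @ m + b
        V_new = A @ V @ A.T + W
        return m_new, V_new

    n = 6                        # e.g. 2 vertices x (pressure, flow, coefficient)
    m = np.zeros(n)              # initial mean of the state vector
    V = 0.1 * np.eye(n)          # initial variance (zero entries where values are known)
    A = np.eye(n)                # placeholder linearised transition matrix
    b = np.zeros(n)              # placeholder boundary-condition forcing
    W = 0.01 * np.eye(n)         # placeholder evolution variance

    for t in range(10):          # evolve beliefs forward ten time steps
        m, V = evolve(m, V, A, b, W)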


Leak detection

A typical use of pipeline models is in leak detection. The usual arrangement is to have one meter providing boundary data for a deterministic model, and other meters providing reference data. If the reference data and the prediction of the deterministic model disagree, then it is possible that a non-simulator event such as a leak has occurred. Unfortunately arrangements of this type provide little guidance about how big a discrepancy is tolerable, resulting in the development of complicated `voting algorithms' that attempt to pool discrepancies across meters and across time.

Our stochastic approach offers the following important advantages.

  1. There is no distinction between boundary meters and reference meters. All meters are treated exactly alike, in that their data can be used both for diagnosis and for learning. One practical implication is that it is possible to have a leak detection system that uses just a single meter.
  2. Due allowance is made for meter imprecision, avoiding the dubious practice of feeding inaccurate data directly into a deterministic system.
  3. Our approach is `self-tuning'. In long pipelines, in which the pipeline coefficient is an important determinant of fluid behaviour, it is possible to learn about its value and to permit this value to evolve spatially.
  4. Most importantly, probability itself can be used as a metric when evaluating the data. The stochastic simulator is effectively the likelihood function given that there have been no non-simulator events. It fully incorporates the possibility of correlation across meters and through time.

The role of the simulator as likelihood function paves the way for a fully Bayesian analysis of leak detection, in which it is possible for experts to incorporate detailed beliefs about the `leakiness' of the pipeline, and derive probabilistic descriptions of the leak (in terms of location, size and time of occurrence) if a leak is thought to have occurred.
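
To make the use of probability as a metric concrete, the Python sketch below (an illustration under standard linear-Gaussian assumptions, not the paper's implementation) scores a set of meter readings by their log-likelihood under the forecast for the metered quantities, with meter imprecision included; a persistently low score is evidence of a non-simulator event such as a leak. The observation matrix H, meter-error variance R, numbers and alarm threshold are all invented for the example.

    import numpy as np
    from scipy.stats import multivariate_normal

    def meter_loglik(y, m, V, H, R):
        # log-likelihood of meter readings y under "no non-simulator event",
        # given forecast mean m and variance V of the state vector,
        # observation matrix H and meter-error variance R
        mean = H @ m               # forecast of the metered quantities
        cov = H @ V @ H.T + R      # forecast variance plus meter imprecision
        return multivariate_normal(mean=mean, cov=cov).logpdf(y)

    # toy numbers: a single meter observing the first state component
    m = np.array([10.0, 1.0])
    V = np.diag([0.5, 0.2])
    H = np.array([[1.0, 0.0]])
    R = np.array([[0.1]])
    y = np.array([12.5])           # a surprisingly high reading

    if meter_loglik(y, m, V, H, R) < -5.0:   # illustrative alarm threshold
        print("reading is improbable under the no-leak model")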


Gallery

The following pictures (postscript format) use experimental data on rapid valve closure from Large Water Hammer Pressures Due to Column Separation in Sloping Pipes, A. R. Simpson, PhD thesis, Univ. Michigan, 1986, as reproduced in Fluid Transients in Systems, E. B. Wylie and V. L. Streeter, p. 50.
  1. The measured data (dashed) for downstream pressure and the deterministic solution (symbols) of the water-hammer equations [Picture]
  2. The measured data (dashed) and forecast uncertainty (+/- 2 sd, shaded) as made at time t=0, for a reasonable description of parameter and modelling uncertainty [Picture]. Also available, a separate plot of the standard deviations [Picture]. Note the nuggets of variance that appear after 50 ms and after 110 ms: these are `waves' of uncertainty originating at the two boundaries at the start of the experiment. The first nugget comes from the upstream boundary, and the second from the downstream boundary after reflection.
  3. The measured data (dashed) and forecast uncertainty using the latest available data sampled every 10 ms (highlighted with symbols) [Picture]. As the data is collected with error, the reduction in uncertainty at the data points is not complete. Note the superimposition of the reflected wave in the low-pressure phase.
  4. Upstream pressure, forecast mean (dotted) and uncertainty using the latest available data sampled every 10 ms [Picture]. There is a strong contemporaneous correlation between upstream and downstream pressure, and so learning about the downstream value has an immediate impact upon our beliefs about the upstream value.
  5. Upstream flow, forecast mean and uncertainty using the latest available data sampled every 10 ms [Picture]. The contemporaneous correlation between upstream flow and downstream pressure is weak, and so most of the information from the downstream pressure meter arrives with a lag.


Page last updated 11.09.00.