Computer Graphics & Geometry

Applying Feature Tracking to Flow Visualisation

Dmitry Chetverikov
Computer and Automation Research Institute
Budapest, Kende u.13-17, H-1111 Hungary


Abstract

Digital Particle Image Velocimetry (DPIV) aims at flow visualisation and measurement of flow dynamics. The fluid is seeded with particles that follow the flow and efficiently scatter light. Traditionally, FFT-based correlation techniques have been used to estimate the displacements of the particles in a digital PIV sequence. Recently, an optical flow estimation technique developed in computer vision has been successfully applied to DPIV. In this paper, we study the efficiency of another group of tracking approaches, the feature tracking techniques, when applied to DPIV. It is concluded that feature tracking algorithms applied to DPIV are a good alternative to both the correlation and the optical flow algorithms.

1  Introduction

In this paper we investigate the applicability and efficiency of feature based tracking algorithms in flow velocity measurement. Flow visualisation and measurement of flow dynamics are important for the analysis of combustion processes, hydrodynamic and aeronautical phenomena, flame propagation, heat exchange problems, construction of artificial heart pumps, and many other practical tasks [6].

In Particle Image Velocimetry (PIV) applications, the fluid is seeded with particles that follow the flow and efficiently scatter light. Conventionally, multiple exposure cameras are used to capture images of the flow at different time instants. The images are recorded on photographic film. Optical correlation methods are then applied point-by-point to the entire negative, and the velocity of the particles between two consecutive images is determined.

Digital PIV, or DPIV, refers to the newly emerged technique of using high-performance CCD cameras and frame-grabbers to store and process the digitised PIV sequences directly by the computer. In DPIV, the conventional cross-correlation methods are implemented using the Fast Fourier Transform [11].

At the same time, estimation and tracking of motion in image sequences is a well-established branch of computer vision (CV). Here, real-world scenes with large, rigid or deformable moving objects are usually considered. Two main classes of motion estimation methods are traditionally distinguished: the optical flow based [2] and the local feature based [4] techniques. Efficient algorithms are available in both classes. Applying them to particle flow image sequences may lead to substantial improvements in visualisation and measurement results.

Recently, Quénot et al. [9] presented an optical flow algorithm based on an orthogonal dynamic programming (ODP) technique and tested it on synthetic and real DPIV sequences, including complex flows. The results compare favourably to those obtained using the classical correlation-based DPIV methods. A substantial increase in both accuracy and spatial resolution of the velocity field has been achieved. Superior robustness to artificially generated noise has also been demonstrated.

Unfortunately, the algorithm [9] is very slow, with the execution time increasing drastically with image size. This motivates our search for reasonable alternatives to both the conventional and the DP techniques. In this paper, we present the feature tracking approach to PIV, which includes novel techniques for coherence filtering and interpolation of a velocity field. Quantitative results of flow estimation are presented. The tests compare three groups of approaches to DPIV: those based on correlation, ODP, and feature tracking.

2  Feature tracking algorithms applied to PIV

Feature tracking techniques extract local regions of interest (features) and identify the corresponding features in each image of a sequence. In this section, we outline two particular algorithms, the KLT Tracker [10] and the IPAN Tracker [4], which we apply to Particle Image Velocimetry.

2.1  The KLT Tracker

The KLT Tracker selects features which are optimal for tracking, and keeps track of these features. A good feature is a textured patch with high intensity variation in both x and y directions, such as a corner. The algorithm defines a measure of dissimilarity that quantifies the change in appearance of a feature between the first and the current frame, allowing for affine distortions. At the same time, a pure translational model of motion is used to estimate the position in the next frame.

Consider the intensity function g(x,y) and its partial derivatives g_x and g_y. The approach assumes small displacements and minimises the linearised dissimilarity. The displacement vector d = (d_x, d_y)^T is computed as the solution of the linear system [10]

Z d = e,     (1)

where the 2×2 matrix Z is built from products of the partial derivatives of g(x,y), e is the corresponding error vector, and both Z and e are integrated over a feature window W. The size of W is normally set to 25×25 pixels. A patch defined by the window is accepted as a candidate feature if both eigenvalues of Z, λ_1 and λ_2, exceed a predefined threshold λ: min(λ_1, λ_2) > λ. This ensures that the matrix Z is well-conditioned and the solution of (1) is accurate. When applied to a PIV image, the KLT selects individual particles as feature centres. To cope with relatively large motion, the algorithm is implemented in a multiresolution way.
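
The selection criterion can be illustrated by the following minimal Python sketch; this is not the KLT implementation of [3], and the function name, the default window size and the threshold value are only illustrative:

    import numpy as np

    def klt_candidate_features(g, win=25, lam_thresh=1000.0):
        # A window is a candidate feature if the smaller eigenvalue of its
        # matrix Z (gradient products summed over the window) exceeds lam_thresh.
        gy, gx = np.gradient(g.astype(float))      # partial derivatives of g(x,y)
        half = win // 2
        H, W = g.shape
        candidates = []
        for y in range(half, H - half):
            for x in range(half, W - half):
                w = (slice(y - half, y + half + 1), slice(x - half, x + half + 1))
                Z = np.array([[np.sum(gx[w] * gx[w]), np.sum(gx[w] * gy[w])],
                              [np.sum(gx[w] * gy[w]), np.sum(gy[w] * gy[w])]])
                lam_min = np.linalg.eigvalsh(Z)[0]  # smaller eigenvalue of Z
                if lam_min > lam_thresh:
                    candidates.append((x, y, lam_min))
        return candidates

In the actual tracker, the system Zd = e is then solved for each retained feature to estimate its displacement between consecutive frames.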

Each feature being tracked is monitored to determine if its current appearance is still similar, under affine distortion, to the initial appearance observed in the first frame. When the dissimilarity exceeds a predefined threshold, the feature is considered to have been lost and will no longer be tracked. At least two frames are needed for the operation of the algorithm.

Since the KLT algorithm incorporates an analytical solution to motion estimation, it is much faster than the methods that use explicit region matching, such as the conventional cross-correlation and the ODP techniques. The source code of the tracker can be downloaded from the web site [3].

2.2  The IPAN Tracker

The IPAN Tracker is a non-iterative, competitive feature point linking algorithm. Its original, application-independent version tracks a moving point set, tolerating point entry and exit as well as false and missing measurements. Position is the only data assigned to a point. The algorithm is based on a repetitive hypothesis testing procedure that switches between three consecutive image frames and maximises the smoothness of the evolving trajectories.

The application-independent version of the tracker is described in full detail in our recent paper [4]. When applied to PIV, the operation of the algorithm remains basically the same. The modifications necessary for the PIV application are as follows: 1. a PIV-specific feature selection mechanism is added; 2. the cost function is modified to include feature appearance. These modifications are described below.

Selection of feature points. A PIV image g(x,y) is smoothed by a 3×3 mean filter and the (real-valued) maxima of the smoothed image s(x,y) are selected as the features. Each bright particle is represented by a maximum of s(x,y). For more precise motion estimation, the position of each maximum is corrected by parabolic interpolation in x and y directions separately. Denote the corrected coordinates of a maximum (x,y) by (xf,yf).
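
As an illustration, a minimal sketch of this selection step is given below; it assumes a grayscale image stored as a NumPy array, uses SciPy filters for the 3×3 mean and the local-maximum test, and the function name is ours:

    import numpy as np
    from scipy.ndimage import maximum_filter, uniform_filter

    def select_maxima(g):
        # 3x3 mean filtering, local-maximum detection, and parabolic
        # sub-pixel correction in the x and y directions separately.
        s = uniform_filter(g.astype(float), size=3)       # smoothed image s(x,y)
        ys, xs = np.nonzero(s == maximum_filter(s, size=3))
        # Flat background pixels also satisfy this test; they receive a low
        # dominance value and are discarded by the ranking step described below.
        feats = []
        for x, y in zip(xs, ys):
            if 0 < x < s.shape[1] - 1 and 0 < y < s.shape[0] - 1:
                # vertex of the parabola through a maximum and its two neighbours
                den_x = s[y, x - 1] - 2 * s[y, x] + s[y, x + 1]
                den_y = s[y - 1, x] - 2 * s[y, x] + s[y + 1, x]
                dx = 0.5 * (s[y, x - 1] - s[y, x + 1]) / den_x if den_x != 0 else 0.0
                dy = 0.5 * (s[y - 1, x] - s[y + 1, x]) / den_y if den_y != 0 else 0.0
                feats.append((x + dx, y + dy))
        return s, feats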

A feature point P(x_f, y_f) is assigned a feature value

f(x_f, y_f) = \frac{\hat{s}(x,y)}{255} \cdot \frac{\hat{s}(x,y)}{s(x,y)},     (2)

where \hat{s}(x,y) is the mean grayvalue in the 3×3 neighbourhood of (x,y).

f(x_f, y_f) characterises the dominance of a feature, so that large and bright particles have high dominance. The features (particles) are ranked according to their dominance, and a specified number of the most dominant features are selected for tracking. Note that the KLT and IPAN algorithms select different points.
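
A possible sketch of the dominance ranking, assuming the form of the feature value given in (2); the function name and the parameter n_keep are illustrative:

    import numpy as np
    from scipy.ndimage import uniform_filter

    def rank_by_dominance(s, feats, n_keep=1000):
        # s: smoothed image; feats: corrected maxima (x_f, y_f).
        s_hat = uniform_filter(s, size=3)           # 3x3 mean grayvalue of s
        scored = []
        for xf, yf in feats:
            x, y = int(round(xf)), int(round(yf))   # integer position of the maximum
            f = (s_hat[y, x] / 255.0) * (s_hat[y, x] / max(s[y, x], 1e-6))
            scored.append((f, xf, yf))
        scored.sort(reverse=True)                   # most dominant features first
        return [(xf, yf) for _, xf, yf in scored[:n_keep]]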

Cost function. The IPAN Tracker processes sequentially each three consecutive frames F_{n-1}, F_n and F_{n+1}, where n is the current frame. Consider a feature point in each of the three frames, A ∈ F_{n-1}, B ∈ F_n and C ∈ F_{n+1}. When applied to PIV, the tracker minimises the cost function d(A,B,C), which accounts for changes in velocity and appearance of a feature:

d(A,B,C) = w_1 Q(A,B,C) + w_2 L(A,B,C) + w_3 F(A,B,C),     (3)

where

Q(A,B,C) = 1 - \frac{\vec{AB} \cdot \vec{BC}}{\|\vec{AB}\| \, \|\vec{BC}\|},
\qquad
L(A,B,C) = 1 - \frac{2 \sqrt{\|\vec{AB}\| \, \|\vec{BC}\|}}{\|\vec{AB}\| + \|\vec{BC}\|}.

\vec{AB} and \vec{BC} are the vectors pointing from A to B and from B to C, respectively, and \vec{AB} \cdot \vec{BC} is their scalar product. Q and L, the trajectory smoothness terms, penalise changes in the direction and the magnitude of the velocity vector. The weights are set as w_1 = 0.1, w_2 = w_3 = 0.45.

F(A,B,C), the appearance term of d(A,B,C), accounts for changes in feature appearance in a 3×3 neighbourhood. It is defined as

F(A,B,C) = \frac{1}{18 \times 255} \sum_{k=1}^{9} \Big( |s(A_k) - s(B_k)| + |s(B_k) - s(C_k)| \Big),

where s(P_k), k = 1, ..., 9, are the grayvalues of the feature point P and its 8 neighbours.

The cost function d(A,B,C) is minimised using the repetitive hypothesis testing procedure presented in [4]. At least three frames are needed for the operation of the algorithm.
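
For clarity, a compact sketch of the cost function (3) follows; it assumes that the feature positions are given as 2-D coordinates and that the 3×3 grayvalue patches around A, B and C are passed as arrays, and all names are ours:

    import numpy as np

    def cost(A, B, C, sA, sB, sC, w1=0.1, w2=0.45, w3=0.45):
        # A, B, C: feature positions in frames n-1, n, n+1; sA, sB, sC: 3x3 patches.
        sA, sB, sC = (np.asarray(p, float) for p in (sA, sB, sC))
        ab = np.asarray(B, float) - np.asarray(A, float)
        bc = np.asarray(C, float) - np.asarray(B, float)
        nab, nbc = np.linalg.norm(ab), np.linalg.norm(bc)
        if nab == 0.0 or nbc == 0.0:
            return np.inf                                  # degenerate hypothesis
        Q = 1.0 - np.dot(ab, bc) / (nab * nbc)             # direction change
        L = 1.0 - 2.0 * np.sqrt(nab * nbc) / (nab + nbc)   # speed change
        F = (np.abs(sA - sB).sum() + np.abs(sB - sC).sum()) / (18.0 * 255.0)
        return w1 * Q + w2 * L + w3 * F

In the tracker, this cost is evaluated for competing candidate triplets and minimised by the hypothesis testing procedure of [4].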

3  Post-processing of velocity vector field

3.1  Coherence filtering

Feature trackers may occasionally yield completely wrong velocity vectors. To enhance the result of measurement, coherence based post-processing is applied to the `raw' velocity field obtained by the trackers. The coherence filter modifies a velocity vector if it is inconsistent with the dominant surrounding vectors. The solution we use is a modified version of the vector median filter [1]. The procedure operates as follows.

Given a feature point P_c with the velocity vector v_c, consider all features P_i, i = 1, 2, ..., p, lying within a distance S from P_c, including P_c itself. Let their velocities be v_i. Due to coherent motion, these vectors are assumed to form a cluster in the velocity space. Introduce the mean cumulative difference between a vector v_i and all the other vectors v_j, j ≠ i:

D_i = \frac{1}{p-1} \sum_{j \neq i} \| v_i - v_j \|.     (4)

The median vector is the vector that minimises the cumulative difference. Its index is

med = \arg \min_i D_i.     (5)

D_med, the mean cumulative difference of the median velocity, characterises the spread of the velocity cluster. The standard median filter [1] substitutes v_c by the median v_med. In our implementation, v_c is substituted by v_med only if the difference between v_c and v_med is significant:

\| v_c - v_med \| > D_med.     (6)

The standard median filter tends to modify most of the measurements and introduce an additional error. The conditional median filter (6) only modifies the vectors that are likely to be imprecise or erroneous measurements. Our tests show that, as far as the overall accuracy is concerned, the conditional median is superior to the standard version.
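
The conditional filter can be sketched as follows; the vector median follows (4) and (5) and the replacement condition is (6), while the neighbourhood radius S and the function name are illustrative:

    import numpy as np

    def conditional_vector_median(points, velocities, S=30.0):
        # For each feature, compute the vector median of the velocities within
        # radius S and replace the vector only if it deviates from the median
        # by more than D_med.
        P = np.asarray(points, float)                 # shape (N, 2)
        V = np.asarray(velocities, float)             # shape (N, 2)
        out = V.copy()
        for c in range(len(P)):
            idx = np.nonzero(np.linalg.norm(P - P[c], axis=1) <= S)[0]
            if len(idx) < 2:
                continue
            Vs = V[idx]
            # mean cumulative difference D_i of each vector to all the others
            D = np.array([np.linalg.norm(Vs - v, axis=1).sum() for v in Vs]) / (len(idx) - 1)
            m = np.argmin(D)
            if np.linalg.norm(V[c] - Vs[m]) > D[m]:   # modify only if inconsistent
                out[c] = Vs[m]
        return out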

3.2  Resampling

Uniform sampling of the measured velocity field is normally required. A number of techniques [5] are available for resampling the results of a measurement. However, most of them perform resampling from one regular grid to another. We use the following procedure.

Given a point G on the required regular grid, consider all feature points P_i, i = 1, 2, ..., p, lying within a certain distance S from G. Let their velocity vectors be v_i. Denote by d_i^2 the squared distance from P_i to G and by \bar{d^2} the mean of d_i^2 over all i. Introduce

a_i = \exp \left( - \frac{d_i^2}{\bar{d^2}} \right), \qquad b_i = \frac{a_i}{\sum_{j=1}^{p} a_j},

where 0 < a_i, b_i ≤ 1 and \sum_{i=1}^{p} b_i = 1. The interpolated velocity vector at G is calculated as

v_G = \sum_{i=1}^{p} b_i v_i.     (7)
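
A sketch of this interpolation is given below; it assumes that the measured feature positions and velocities are given as N×2 arrays and the grid nodes as a list of 2-D points, and the radius S and the names are illustrative:

    import numpy as np

    def resample_to_grid(points, velocities, grid_points, S=30.0):
        # For every grid node G, the velocity is a normalised, distance-weighted
        # sum of the measured vectors lying within radius S of G.
        P = np.asarray(points, float)
        V = np.asarray(velocities, float)
        out = np.zeros((len(grid_points), 2))
        for n, G in enumerate(np.asarray(grid_points, float)):
            d2 = np.sum((P - G) ** 2, axis=1)
            near = d2 <= S * S
            if not np.any(near):
                continue                               # no features near this node
            d2n = d2[near]
            a = np.exp(-d2n / (d2n.mean() + 1e-12))    # a_i = exp(-d_i^2 / mean d^2)
            b = a / a.sum()                            # the b_i sum to 1
            out[n] = (b[:, None] * V[near]).sum(axis=0)
        return out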

The figures below show examples of coherence filtering and resampling.

Figure: Example of vector field post-processing, showing a PIV flow, the original velocity field, its coherence filtering and its resampling.

4  Experimental results

The KLT and IPAN Trackers were run on a large number of standard synthetic and real DPIV sequences available on the Internet. The post-processing algorithms described in section 3 were applied to the initial velocity fields obtained by the KLT and IPAN Trackers. In this section, the two algorithms complemented by the post-processing will be referred to as KLT-PIV and IPAN-PIV, respectively. They will be compared to the correlation (CORR-PIV [11]) and the Orthogonal Dynamic Programming (ODP-PIV [9]) algorithms.

Two sets of synthetic test data with ground truth were used in our experiments. The two datasets were downloaded from the Japanese PIV-STD Project web site [7] and from Quénot's web site [8].

In the tables below, the error is the mean absolute deviation from the ground truth; the variance of the absolute deviation is also given. Typical execution times are also shown, measured on a 333 MHz Pentium under Linux. The execution times are only given to indicate the order of magnitude, as they depend on implementation and hardware.
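
As a reference, one plausible reading of this error measure is sketched below; the deviation of an estimated vector is taken as the length of its difference from the ground-truth vector, and the function name is ours:

    import numpy as np

    def displacement_error(estimated, ground_truth):
        # Mean absolute deviation from the ground truth and its variance,
        # computed over all measured displacement vectors (N x 2 arrays).
        dev = np.linalg.norm(np.asarray(estimated, float) -
                             np.asarray(ground_truth, float), axis=1)
        return dev.mean(), dev.var()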

4.1  The PIV-STD data

The Visualisation Society of Japan has launched a project named PIV-STD [7] whose goal is to generate and distribute standard synthetic test sequences for performance evaluation of PIV algorithms. The web site [7] offers a collection of pregenerated sequences as well as a program for producing sequences with the desired parameters. The basic tunable parameters are the number of particles Np, the average velocity v, the average particle diameter Pa and its standard deviation Pd.

An example of a PIV-STD flow is given in the figure above. Table 1 compares the displacement errors of three methods, IPAN-PIV, KLT-PIV and ODP-PIV2, for five PIV-STD sequences. Results of CORR-PIV were not available for this dataset. ODP-PIV2 is the most precise of the three variants presented in [9].

Table 1: Relative displacement errors (%) for the PIV-STD sequences. Each entry gives the mean absolute error followed, in parentheses, by its variance.

            s01          s03          s04          s05          s06
IPAN-PIV    4.72 (4.3)   10.3 (4.5)   4.86 (6.2)   5.41 (4.9)   4.59 (5.7)
KLT-PIV     3.37 (2.3)   9.82 (4.0)   2.02 (2.1)   3.51 (2.7)   2.02 (1.8)
ODP-PIV2    3.52 (2.9)   9.82 (5.1)   1.97 (2.7)   2.68 (3.1)   1.50 (1.6)

To assess the PIV-STD results, consider Table 2, which describes the five sequences. In this table, `standard' means the standard parameter values, which are Np = 4000 particles, Pa = 5.0 pixels, Pd = 1.4 pixels, and v = 7.39 pixels/frame. `Slow' indicates a slow flow with v = 2.24 pixels/frame. `Same particles' means that Pd = 0 and the particles are indistinguishable. Other parameters have the standard, or close to the standard, values.

Table 2: Description of the five PIV-STD sequences.

Sequence Np Description
s01 4000 standard
s03 4000 slow
s04 10000 dense
s05 1000 sparse
s06 4000 same particles

For the PIV-STD dataset, the performance of KLT-PIV is very close to that of ODP-PIV2. The accuracy of IPAN-PIV is lower: the high particle density and the indistinguishable particles pose problems for IPAN-PIV. Note that the ODP-PIV algorithm is much slower than the trackers, as discussed below.

4.2  The CYLINDER data

Table 3 compares IPAN-PIV, KLT-PIV, ODP-PIV and the correlation method CORR-PIV on the set of synthetic flow sequences called CYLINDER [8]. Typical execution times are indicated in the bottom row of the table.

Table 3: Displacement errors for the CYLINDER sequences. Each entry gives the mean absolute error followed, in parentheses, by its variance.

        IPAN-PIV     KLT-PIV      CORR-PIV     ODP-PIV1     ODP-PIV2
N0      0.42 (0.5)   0.35 (0.6)   0.55 (1.0)   0.13 (0.1)   0.07 (0.1)
N5      0.50 (0.7)   0.35 (0.6)   0.61 (1.2)   0.21 (0.5)   0.08 (0.1)
N10     0.55 (0.7)   0.33 (0.5)   0.77 (1.6)   0.53 (1.4)   0.11 (0.1)
N20     0.90 (1.2)   0.44 (0.6)   3.11 (4.1)   0.88 (1.6)   0.20 (0.1)
Time    20 sec       15 sec       10 min       20 min       200 min

ODP-PIV1 is the fastest and least precise of the three variants presented in [9]. CORR-PIV is a correlator with a 32×32 window. N5, N10 and N20 are noisy versions of the original noise-free sequence N0; the numbers indicate noise levels varying from 5% to 20%. CYLINDER is a complex flow with a mean displacement of 7.6 pixels/frame.

For the CYLINDER dataset, the feature trackers outperform the correlation algorithm in both accuracy and execution time. For noisy (and more realistic) data, they compete in accuracy with the fastest variant of ODP-PIV and are clearly superior in processing speed. ODP-PIV2 is very accurate, but its computational load is almost prohibitive. In addition, KLT-PIV is more robust to noise, as its accuracy deteriorates more slowly than that of ODP-PIV2.

5  Discussion and conclusions

We have presented a novel approach to Particle Image Velocimetry based on feature tracking. Two efficient algorithms have been customised for PIV by adding procedures for coherence filtering and resampling. The procedures have been extended to cope with flow discontinuities. The coherence filtering improves the accuracy of velocity estimation; the resampling provides uniform sampling at the expense of a moderate decrease in accuracy.

The results of the tests demonstrate that the proposed approach offers a good alternative to both correlation and ODP techniques. KLT-PIV and IPAN-PIV provide higher flow estimation accuracy and are faster than the conventional correlation techniques. For noisy images, the feature tracking PIV provides accuracy comparable with that of the precise ODP algorithms, while requiring much less computation. The processing speed of the trackers can potentially make them suitable for fast flow visualisation, qualitative estimation, and analysis of time-varying flows.

The operation of the trackers is based on the detection of prominent features. If no features are found in a region of the flow, interpolation provides measurements in this region as well. Such a method may fail when a large area of poor visibility appears in the image because of low contrast or blur. This problem needs further research.

The most interesting open question is that of flow complexity. Simple, smooth flows like PIV-STD can in most cases be reliably measured by all the methods considered in this paper. Future research should focus on flow complexity and involve the creation of realistic but complex test sequences with ground truth provided.

6  Testing the algorithms over the Internet

Online demonstrations of IPAN-PIV and KLT-PIV are available on the Internet at the web site of the IPAN research group:

http://visual.ipan.sztaki.hu/demo/demo.html

A remote user can select an algorithm, set the parameters and run the algorithm on a short PIV sequence. The flow sequences used in our tests are provided together with the ground truth, when available. The quantitative results of our study can therefore be checked online. Alternatively, the user can upload and process his/her own data and compare the result to the submitted ground truth.

Acknowledgements.

This work was supported by the Hungarian Scientific Research Fund under grants OTKA T026592 and M28078. The author thanks Marcell Nagy and Judit Verestóy for their contributions to the implementation and testing of the algorithms described in this paper.

References

[1]
J. Astola, P. Haavisto, and Y. Neuvo. Vector Median Filters. Proceedings of the IEEE, 78:678-689, 1990.

[2]
J.L. Barron, D.J. Fleet, and S.S. Beauchemin. Performance of optical flow techniques. International Journal of Computer Vision, 12(1):43-77, 1994.

[3]
Stan Birchfield. KLT: An Implementation of the Kanade-Lucas-Tomasi Feature Tracker. http://vision.stanford.edu/~birch/klt/.

[4]
D. Chetverikov and J. Verestóy. Feature Point Tracking for Incomplete Trajectories. Computing, 62:321-338, 1999.

[5]
A.S. Glassner. Principles of Digital Image Synthesis. Morgan Kaufmann, 1995.

[6]
I. Grant. Particle image velocimetry: a review. Proceedings of the Institution of Mechanical Engineers, Part C, 211:55-76, 1997.

[7]
Standard images for particle imaging velocimetry. http://www.vsj.or.jp/piv.

[8]
G. Quénot. Data and procedures for development and testing of PIV applications. ftp://ftp.limsi.fr/pub/quenot/opflow/.

[9]
G. Quénot, J. Pakleza, and T. Kowalewski. Particle image velocimetry with optical flow. Experiments in Fluids, 25(3):177-189, 1998.

[10]
J. Shi and C. Tomasi. Good features to track. In Proc. IEEE Conf. on Computer Vision and Pattern Recognition (CVPR94), Seattle, June 1994.

[11]
P.T. Tokumaru and P.E. Dimotakis. Image correlation velocimetry. Experiments in Fluids, 19:1-15, 1995.
