As with regime shifts in the mean and variance, outliers exert a significant effect on the timing and magnitude of detected shifts in the correlation coefficient. Figure 1 shows two correlated time series of length n = 60 (1901-1960) generated from a bivariate normal distribution with zero mean and unit variance, with the correlation coefficient switching from -0.6 in 1901-1930 to 0.6 in 1931-1960.
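A setup like the one in Figure 1 can be reproduced with a short sketch (NumPy assumed; the function name `correlated_pair` and the random seed are illustrative, not part of the original experiment):

```python
import numpy as np

rng = np.random.default_rng(0)

def correlated_pair(n, rho, rng):
    """Draw n samples from a bivariate normal distribution with
    zero means, unit variances, and correlation rho."""
    cov = np.array([[1.0, rho], [rho, 1.0]])
    return rng.multivariate_normal([0.0, 0.0], cov, size=n)

# 1901-1930: rho = -0.6;  1931-1960: rho = +0.6
xy = np.vstack([correlated_pair(30, -0.6, rng),
                correlated_pair(30, 0.6, rng)])
x, y = xy[:, 0], xy[:, 1]
```

The sample correlations of the two halves will scatter around -0.6 and 0.6, so the true change point sits at observation 31 (the year 1931).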
Now let us introduce an outlier of -4.0 into one of the series in 1928. If this outlier carries the same weight as all other data points, 1928 is detected as the change point instead of the correct year, 1931 (Fig. 2).
Reducing the tuning constant h from six to two fixes the problem (Fig. 3). A tuning constant that is too small is not recommended, however: because the time series are normalized, data values greater than h are replaced with h, which may exaggerate the weighted correlation coefficients. In most cases h = 2 works well, and it is therefore used as the default value.
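The effect of the tuning constant can be sketched as follows, assuming the Huber-style treatment amounts to capping normalized anomalies at a magnitude of h before computing the correlation (equivalent to weights w = h/|z| for |z| > h); the function name `capped_corr` is illustrative:

```python
import numpy as np

def capped_corr(x, y, h=2.0):
    """Normalize each series, replace values whose magnitude exceeds h
    with sign(z) * h, and return the resulting correlation coefficient."""
    zx = (x - x.mean()) / x.std()
    zy = (y - y.mean()) / y.std()
    zx = np.clip(zx, -h, h)
    zy = np.clip(zy, -h, h)
    return float(np.corrcoef(zx, zy)[0, 1])

# Demo: inject an outlier of -4.0 into one of two correlated series.
rng = np.random.default_rng(1)
cov = np.array([[1.0, 0.6], [0.6, 1.0]])
xy = rng.multivariate_normal([0.0, 0.0], cov, size=30)
x, y = xy[:, 0].copy(), xy[:, 1].copy()
x[10] = -4.0  # the outlier

r_h6 = capped_corr(x, y, h=6.0)  # outlier left essentially untouched
r_h2 = capped_corr(x, y, h=2.0)  # outlier capped near -2
```

With h = 6 the outlier's normalized value stays below the cap and dominates the estimate, while with h = 2 it is truncated, illustrating why a moderate default limits outlier influence without distorting the bulk of the data.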