One possible problem with the cluster algorithm is that under some circumstances it will build many very large clusters, occupying most of the lattice. In a situation in which the whole action is reflection-invariant, the flipping of such a ``supercluster'', combined with a symmetry transformation reflecting the whole lattice, is actually equivalent to flipping the much smaller complementary cluster. In this case one may be spending a lot of computational effort building large clusters in order to make only relatively small changes in the configurations, and hence the algorithm may become inefficient.
This kind of problem appears mostly in low dimensions and for very large
values of the coupling (low ``temperatures''), away from the critical region,
situations which are therefore only of secondary interest for those
interested mostly in quantum field theory. However, it might be induced
also by the presence of strong, constant external sources, which tend to
strongly orient the fields and hence to favor the building of large
clusters. In this case not only are large clusters more likely to be
built, but they will also most likely be rejected, because flipping them
will tend to cause large increases of the external-source term of the
action.
Once again a large amount of computational effort will be spent building
clusters that are likely to be rejected, and the algorithm may then
diffuse through configuration space only very slowly.
One way to handle this kind of problem is to use truncated clusters. A
truncated cluster is one that we just stop building at some point,
according to some criterion. For example, one may require that the
clusters have no more than a certain given maximum number of sites in
them. The way to handle such clusters is suggested by the treatment of
fixed boundary conditions described in the previous section. At the point
where one arbitrarily stops building the cluster, one has a list of
links that should have been, but will no longer be, tested for activation.
The corresponding terms of the action have not been taken into account by
the building algorithm, so that flipping this truncated cluster would
violate the condition of detailed balance.
The way to correct this is quite clear now: one examines the remaining
links and calculates the variation of the corresponding terms
of the action due to the flipping of the cluster. Having done this, one
performs a Metropolis test with this variation of the action, thus
establishing a probability for flipping the cluster. This is exactly the
same situation that one has for fixed boundary conditions, when the
cluster has part of its sites connected to the boundary sites. The only
difference is that in our case the situation arises at the parts of
the border of the truncated cluster, consisting of the links that have
not been tested for activation by the building algorithm.
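The procedure can be sketched in code. The following is a minimal illustrative sketch for a two-dimensional Ising-type model with action $S=-\beta\sum_{\langle ij\rangle}s_{i}s_{j}$, using the standard activation probability $1-e^{-2\beta}$ for aligned links; the function name, the lattice representation as a dictionary, and the choice of model are assumptions made for the example, not notation taken from the text.

```python
import math
import random


def truncated_cluster_update(spins, L, beta, max_size, rng=random):
    """One truncated-cluster update on an L x L periodic Ising lattice.

    Builds a cluster in the usual way, but stops growing it once it
    holds max_size sites; the links left untested at the truncation
    border are then accounted for by a Metropolis accept/reject test
    before the cluster is flipped, restoring detailed balance.
    """
    p_add = 1.0 - math.exp(-2.0 * beta)  # activation probability, aligned links
    start = (rng.randrange(L), rng.randrange(L))
    s0 = spins[start]
    cluster = {start}
    frontier = [start]
    untested = []  # links that should have been, but were not, tested

    def neighbors(site):
        x, y = site
        return [((x + 1) % L, y), ((x - 1) % L, y),
                (x, (y + 1) % L), (x, (y - 1) % L)]

    while frontier:
        site = frontier.pop()
        for nb in neighbors(site):
            if nb in cluster:
                continue
            if len(cluster) >= max_size:
                # truncation border: record the link for the correction step
                untested.append((site, nb))
            elif spins[nb] == s0 and rng.random() < p_add:
                cluster.add(nb)
                frontier.append(nb)
            # anti-aligned links are never activated, as usual

    # Metropolis correction: flipping the cluster changes each untested
    # border term of the action by 2*beta*s_i*s_j, with s_i = s0.
    delta_S = sum(2.0 * beta * s0 * spins[nb] for _, nb in untested)
    if delta_S <= 0.0 or rng.random() < math.exp(-delta_S):
        for site in cluster:
            spins[site] = -s0  # accepted: flip the whole truncated cluster
        return True, len(cluster)
    return False, len(cluster)  # rejected: configuration unchanged
```

Note that when the cluster stops short of `max_size` the list of untested links is empty, the variation of the action vanishes, and the flip is accepted with probability one, recovering the usual cluster algorithm.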
The introduction of a new parameter, such as the maximum number of sites in a cluster, gives us a handle with which to try to control the rejection rate and thus to optimize the efficiency of the algorithm, that is, the speed with which it diffuses through configuration space. Note that one could also use other criteria for truncating the cluster, such as requiring that the cluster not extend beyond a certain radius from the initial site. This technique may be useful in some rather extreme situations. For example, one might consider using it when, with the usual algorithm, the average size of the clusters becomes larger than one half of the total number of sites in the lattice.
Note that if one reduces the maximum number of sites in a cluster to just
one, then the cluster algorithm is reduced back to the simple Metropolis
algorithm, with a strictly local update. Therefore, we can see that
reducing the cluster size too much may bring back the usual critical
slowing-down problems of the Metropolis algorithm. Note also that in this
case there is no fixed separation of the action into two parts; the
separation is dynamical, determined only at the point where we decide to
stop building the current cluster. This variation of the algorithm is not
actually difficult to implement, but finding the optimal maximum cluster
size under a given set of circumstances may not be so easy a task.