Antithetic variates

In statistics, the antithetic variates method is a variance reduction technique used in Monte Carlo methods. Because the error in the simulated signal (using Monte Carlo methods) decreases only as one over the square root of the number of sample paths, a very large number of sample paths is required to obtain an accurate result. The antithetic variates method reduces the variance of the simulation results.[1][2]

Underlying principle

The antithetic variates technique consists, for every sample path obtained, in also taking its antithetic path: given a path {ε_1, ..., ε_M}, one also takes {−ε_1, ..., −ε_M}. The advantage of this technique is twofold: it reduces the number of normal samples that must be drawn to generate N paths, and it reduces the variance of the sample paths, improving the precision.

Suppose that we would like to estimate

θ = E[h(X)] = E[Y],

and that we have generated two samples Y_1 and Y_2, each with the same distribution as Y. An unbiased estimate of θ is given by

(Y_1 + Y_2) / 2,

whose variance is

Var((Y_1 + Y_2) / 2) = (Var(Y_1) + Var(Y_2) + 2 Cov(Y_1, Y_2)) / 4.

The variance is therefore reduced, compared with using two independent samples, if Cov(Y_1, Y_2) is negative.

Example 1

If the variable X follows a uniform distribution on [0, 1], the first sample is u_1, ..., u_n, where each u_i is drawn from U(0, 1). The second sample is built as u'_1, ..., u'_n, where u'_i = 1 − u_i for each i. If u_1, ..., u_n are uniform on [0, 1], so are 1 − u_1, ..., 1 − u_n. Furthermore, the covariance Cov(u_i, 1 − u_i) = −Var(u_i) = −1/12 is negative, allowing for variance reduction.
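A minimal sketch of this check in Python with NumPy (the library, sample size, and seed are illustrative choices, not part of the article): it draws a uniform sample, forms the antithetic sample 1 − u_i, and confirms empirically that both samples look uniform and that their covariance is negative, close to −1/12.

```python
import numpy as np

# Illustrative sketch: check the two claims of Example 1, namely that
# 1 - u_i is also uniform on [0, 1] and that Cov(u_i, 1 - u_i) < 0.
rng = np.random.default_rng(seed=0)
n = 100_000

u = rng.random(n)          # first sample: u_i ~ U(0, 1)
u_antithetic = 1.0 - u     # second sample: antithetic counterpart 1 - u_i

# Both samples should have mean close to 1/2 and variance close to 1/12,
# as a U(0, 1) variable does.
print("means:    ", u.mean(), u_antithetic.mean())
print("variances:", u.var(), u_antithetic.var())

# Theoretically Cov(u_i, 1 - u_i) = -Var(u_i) = -1/12; the sample covariance
# should be close to that and, crucially, negative.
print("covariance:", np.cov(u, u_antithetic)[0, 1])
```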
Example 2: integral calculation

We would like to estimate

I = ∫_0^1 1/(1 + x) dx.

The exact result is I = ln 2 ≈ 0.69314718. This integral can be seen as the expected value of f(U), where f(x) = 1/(1 + x) and U follows a uniform distribution on [0, 1]. The classical Monte Carlo estimate (sample size: 2n, where n = 1500) can be compared with the antithetic variates estimate (sample size: n, completed with the transformed sample 1 − u_i). The use of the antithetic variates method to estimate the result shows a substantial variance reduction.
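A minimal sketch of this comparison, assuming Python with NumPy (the sample size n = 1500 follows the text; the seed and variable names are illustrative choices): it estimates the integral with 2n independent draws and, separately, with n antithetic pairs, then prints the sample variances of the two estimators' summands.

```python
import numpy as np

# Illustrative sketch: estimate I = integral_0^1 1/(1+x) dx = ln 2
# with 2n plain uniform draws versus n antithetic pairs.
rng = np.random.default_rng(seed=0)
n = 1500
f = lambda x: 1.0 / (1.0 + x)

# Classical Monte Carlo estimate: average f over 2n independent U(0, 1) samples.
u_classical = rng.random(2 * n)
classical_estimate = f(u_classical).mean()

# Antithetic variates estimate: n samples u_i completed with the transformed
# sample 1 - u_i, averaged pairwise.
u = rng.random(n)
antithetic_estimate = (0.5 * (f(u) + f(1.0 - u))).mean()

print("exact value:        ", np.log(2.0))
print("classical estimate: ", classical_estimate)
print("antithetic estimate:", antithetic_estimate)

# Per-sample variances of the two estimators' summands: the antithetic pairs
# have a much smaller spread because f is monotone, so f(u) and f(1 - u) are
# negatively correlated.
print("classical variance: ", f(u_classical).var())
print("antithetic variance:", (0.5 * (f(u) + f(1.0 - u))).var())
```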