A discrepancy bound for deterministic acceptance-rejection samplers beyond $N^{-1/2}$

Speaker: 

Houying Zhu

Affiliation: 

UNSW Australia, School of Mathematics & Statistics

Date: 

Thu, 05/11/2015 - 4:00pm

Venue: 

RC-M032, The Red Centre, UNSW

Abstract: 

In this talk we consider an acceptance-rejection (AR) sampler based on deterministic driver sequences. We prove that the Kolmogorov-Smirnov distance (which coincides with the so-called star-discrepancy) between the target distribution and the empirical distribution of an $N$-element sample set generated by the AR sampler is bounded by $\mathcal{O} (N^{-2/3}\log N)$, provided that the target density is twice continuously differentiable with non-vanishing curvature and the AR sampler uses the driver sequence

$$\mathcal{K}_M= \{\boldsymbol{x}_n=( n \alpha, n\beta ) \bmod 1 \mbox{ for } n=1,\ldots,M\}. $$

Here $\alpha,\beta$ are real algebraic numbers such that $1,\alpha,\beta$ form a basis of a number field of degree $3$ over $\mathbb{Q}$. For the driver sequence $$\mathcal{F}_k= \{\boldsymbol{x}_j=({j}/{F_k}, \{{jF_{k-1}}/{F_k}\} )\mbox{ for } j=1,\ldots,F_k\},$$ where $F_k$ is the $k$-th Fibonacci number, we can remove the $\log$ factor and improve the convergence rate to $\mathcal{O}(N^{-2/3})$, where again $N$ is the number of accepted samples. We also introduce a criterion for measuring the quality of driver sequences. The proposed approach is tested numerically by calculating the Kolmogorov-Smirnov distance of samples generated for several target densities using $\mathcal{K}_M$ and $\mathcal{F}_k$ as driver sequences. These results confirm that a convergence rate beyond $N^{-1/2}$ is achievable in practice when $\mathcal{K}_M$ and $\mathcal{F}_k$ are used as driver sequences in the AR sampler.
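As a rough illustrative sketch (not the paper's implementation), the construction can be tried in a few lines of Python: generate the Fibonacci driver sequence $\mathcal{F}_k$, accept a point's first coordinate whenever its second coordinate falls below the scaled target density, and measure the Kolmogorov-Smirnov distance of the accepted sample. The target density $f(x)=3x^2$ on $[0,1]$ is a hypothetical choice for illustration; it is twice continuously differentiable with non-vanishing curvature, as the theorem requires.

```python
def fibonacci_driver(k):
    """Driver sequence F_k: points (j/F_k, {j*F_{k-1}/F_k}) for j = 1..F_k."""
    F = [1, 1]
    while len(F) <= k:
        F.append(F[-1] + F[-2])
    Fk, Fk1 = F[k], F[k - 1]
    return [(j / Fk, (j * Fk1 / Fk) % 1.0) for j in range(1, Fk + 1)]

def ar_sample(driver, density, bound):
    """Acceptance-rejection: keep x when the second coordinate of the
    driver point lies below density(x)/bound (bound >= sup of density)."""
    return sorted(x for x, y in driver if y < density(x) / bound)

def ks_distance(samples, cdf):
    """Kolmogorov-Smirnov distance (star-discrepancy) between the sorted
    sample's empirical distribution and the target CDF."""
    N = len(samples)
    return max(max(abs((i + 1) / N - cdf(x)), abs(i / N - cdf(x)))
               for i, x in enumerate(samples))

# Hypothetical target: density f(x) = 3x^2 on [0,1], CDF x^3, bounded by 3.
density = lambda x: 3 * x * x
cdf = lambda x: x ** 3
samples = ar_sample(fibonacci_driver(20), density, 3.0)
print(len(samples), ks_distance(samples, cdf))
```

Roughly a third of the $F_{20}=10946$ driver points should be accepted (the area under $f/3$), and the printed KS distance can be compared against $N^{-2/3}$ and $N^{-1/2}$ to see which rate the sample tracks.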

The seminar will be followed by drinks and finger food in the staff room. All attendees are welcome!
