Variational approximation has recently seen wide use in large-scale Bayesian inference; its simplest form imposes a mean field assumption to approximate complicated latent structures. Despite the computational scalability of mean field methods, theoretical studies of the loss function surface and of the convergence behaviour of the iterative updates that optimise the loss are far from complete. In this paper, we focus on community detection for a simple two-class Stochastic Blockmodel (SBM) with balanced class sizes. Using batch co-ordinate ascent variational inference (BCAVI) for the updates, we show that the convergence behaviour depends critically on the initialisation. When the model parameters are known, or are estimated within a reasonable range and held fixed, we characterise the conditions under which an initialisation converges to the ground truth. When the parameters instead need to be estimated iteratively, a random initialisation converges to an uninformative local optimum.
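To make the setting concrete, the following is a minimal sketch of the batch co-ordinate ascent updates for a two-class SBM with known within-class and between-class edge probabilities (here called `p` and `q`, with `p > q`); the function name and interface are illustrative, not taken from the paper. Each node carries a variational probability `psi_i` of belonging to class 1, and in a batch step every `psi_i` is refreshed simultaneously from the previous iterate via the standard mean field log-odds.

```python
import numpy as np

def bcavi(A, p, q, psi0, n_iter=50):
    """Batch co-ordinate ascent VI for a two-class SBM with known (p, q).

    A:    (n, n) symmetric 0/1 adjacency matrix with zero diagonal
    psi0: length-n initial probabilities of class-1 membership
    Returns the fitted class-1 probabilities psi.
    """
    t1 = np.log(p / q)                # log-odds contribution of an observed edge
    t0 = np.log((1 - p) / (1 - q))    # contribution of a missing edge (negative)
    W = A * t1 + (1 - A) * t0         # per-pair evidence weights
    np.fill_diagonal(W, 0.0)          # no self-edges in the likelihood
    psi = psi0.astype(float).copy()
    for _ in range(n_iter):
        # batch step: all coordinates updated from the previous iterate
        eta = W @ (2.0 * psi - 1.0)   # variational log-odds for class 1
        psi = 1.0 / (1.0 + np.exp(-eta))
    return psi
```

With a warm initialisation (each `psi_i` started on the correct side of 1/2) and a strong signal, the iterates saturate quickly towards the planted labels, consistent with the fixed-parameter regime described above; labels are only identifiable up to a global swap of the two classes.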