In this paper, we propose an acceleration scheme for mini-batch streaming PCA methods based on Stochastic Gradient Ascent (SGA). Our scheme converges to the first $k>1$ eigenvectors in a single data pass, even with very small batch sizes, and requires no prior knowledge of the data distribution, making it well suited to streaming scenarios. In empirical evaluations on the spiked covariance model and on large-scale benchmark datasets, our acceleration scheme outperforms related state-of-the-art online PCA approaches, including SGA, Incremental PCA, and Candid Covariance-free Incremental PCA.
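To make the setting concrete, the following is a minimal sketch of a baseline mini-batch SGA (Oja-style) PCA update on data drawn from a spiked covariance model. All parameters (dimension, number of spikes, spike strengths, step size, batch size) are illustrative assumptions, not the configuration used in the paper, and the sketch shows the plain SGA baseline rather than the proposed acceleration scheme.

```python
import numpy as np

rng = np.random.default_rng(0)

# Spiked covariance model: Sigma = U diag(spikes) U^T + sigma^2 I
# (illustrative parameters, not those of the paper)
d, k, sigma = 50, 3, 0.25
U, _ = np.linalg.qr(rng.standard_normal((d, k)))   # true top-k subspace
spikes = np.array([4.0, 3.0, 2.0])                 # spike strengths

def sample_batch(b):
    """Draw a mini-batch of b samples from the spiked model."""
    z = rng.standard_normal((b, k)) * np.sqrt(spikes)
    return z @ U.T + sigma * rng.standard_normal((b, d))

# Mini-batch SGA update: W <- orthonormalize(W + eta * C_batch @ W)
W, _ = np.linalg.qr(rng.standard_normal((d, k)))   # random orthonormal start
eta, b = 0.05, 32
for t in range(2000):                              # simulated single data pass
    X = sample_batch(b)
    C = X.T @ X / b                                # batch covariance estimate
    W, _ = np.linalg.qr(W + eta * C @ W)           # gradient step + re-orthonormalization

# Subspace error: spectral norm of the difference of projectors
err = np.linalg.norm(U @ U.T - W @ W.T, 2)
print(f"subspace error: {err:.3f}")
```

With a fixed step size the iterates settle into a noise ball around the true subspace; a decaying step-size schedule or an acceleration scheme such as the one proposed here is what drives the error further down per sample.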