Abstract: Mobile visual search is an emerging class of applications that use images captured by camera phones to initiate search queries. It is a challenging task, mainly because of affine image transformations caused by viewpoint changes and motion blur caused by hand tremor. These problems are unavoidable in mobile visual search and often result in low recall. Query expansion is an effective strategy for improving recall, but existing methods are memory- and time-consuming and often involve many redundant features. By integrating robust local patch mining with geometric parameter coding, this paper proposes an accurate offline query expansion method for large-scale mobile visual search. Concretely, a novel criterion is presented for evaluating and mining robust patches. Multiple representative features are then extracted from the selected local patches to handle viewpoint changes. Moreover, the geometric parameters of each representative viewpoint are recorded to support fast and accurate feature matching. Experimental results on several well-known datasets and a large image set (1M images) demonstrate the effectiveness and efficiency of our method, in particular its robustness to viewpoint changes. The proposed approach also generalizes well to other multimedia content analysis tasks.