Varying coefficient models have numerous applications across a wide range of scientific areas. While enjoying good interpretability, they also offer flexibility in modeling the dynamic effects of covariates. In the era of big data, however, selecting the relevant variables from a large pool of candidates is challenging. Several recent works have addressed this important problem under sparsity assumptions, but they are subject to some limitations. We introduce an appealing forward variable selection procedure: it selects important variables sequentially according to a reduction-in-sum-of-squares criterion and employs a BIC-based stopping rule. The procedure is simple to implement and fast to compute, and it possesses many other desirable properties from both theoretical and numerical viewpoints. Note that the BIC is a special case of the EBIC in which an extra tuning parameter in the latter vanishes. We establish rigorous screening consistency results when either the BIC or the EBIC is used as the stopping criterion, although the BIC is preferred to the EBIC on the basis of its superior numerical performance and simplicity. The theoretical results depend on certain eigenvalue conditions related to the design matrices, and we also consider the situation in which these conditions can be relaxed. Results of an extensive simulation study and a real data example demonstrate the efficacy and usefulness of our procedure. Available at http://hermes-ir.lib.hit-u.ac.jp/rs/handle/10086/26895