The Vapnik-Chervonenkis dimension has proven to be of great use in the theoretical study of generalization in artificial neural networks. The `probably approximately correct' learning framework is described and the importance of the VC dimension is illustrated. We then investigate the VC dimension of certain types of linearly weighted neural networks. First, we obtain bounds on the VC dimensions of radial basis function networks with basis functions of several types. Second, we calculate the VC dimension of polynomial discriminant functions defined over both real-valued and binary-valued inputs.