We present closed-form expressions for the entropy rate, statistical complexity, and predictive information of the spike train of a single neuron in terms of the first-passage-time probability distribution. Our analysis applies to any one-dimensional neural model in which observation of a spike causes the neuron to "reset" to some membrane voltage and in which any noise term is uncorrelated in time. We then use these formulae to study the linear leaky integrate-and-fire and quadratic integrate-and-fire neurons driven by white noise in the naturally spiking regime. The statistical complexity is simply related to the interspike interval's mean and coefficient of variation. The excess entropy, or total predictive information, is highest for neural spike trains with low interspike-interval coefficients of variation. Both the statistical complexity and the excess entropy arise naturally in the context of predictive rate-distortion and could be useful for characterizing "predictor" neurons that learn to predict other neurons.