Sampling interval refers to the distance between two successive observations.
According to the sampling theorem of Nyquist and Shannon, the ideal sampling interval is half the resolution, which means two samples per resolution cell. With a larger interval one loses information (undersampling); with a smaller interval one samples more densely than necessary (oversampling). In practice, however, sampling interval and resolution are often roughly equal, although it is good to keep in mind that resolution and sampling interval are two different things. Note, by the way, that in the above example the time domain was taken as the basis, but in the spatial domain the same considerations apply.
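The effect of violating the Nyquist criterion can be demonstrated numerically. The sketch below (a hypothetical example, not from the source text) samples a 10 Hz sine at two rates: one faster than twice the signal frequency and one slower. Undersampling makes the tone reappear at a false, lower frequency (aliasing), which is exactly the information loss described above.

```python
import numpy as np

def dominant_frequency(signal, fs):
    """Return the frequency (Hz) of the strongest non-DC FFT component."""
    spectrum = np.abs(np.fft.rfft(signal))
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    return freqs[np.argmax(spectrum[1:]) + 1]  # skip the DC bin

f_signal = 10.0  # Hz; hypothetical test tone

# Adequate sampling: fs > 2 * f_signal, i.e. the sampling interval is
# shorter than half the signal's period.
fs_good = 100.0
t = np.arange(0, 1, 1 / fs_good)
good = np.sin(2 * np.pi * f_signal * t)

# Undersampling: fs < 2 * f_signal, so the tone aliases to |fs - f| = 2 Hz.
fs_bad = 12.0
t_bad = np.arange(0, 1, 1 / fs_bad)
bad = np.sin(2 * np.pi * f_signal * t_bad)

print(dominant_frequency(good, fs_good))  # 10.0 (recovered correctly)
print(dominant_frequency(bad, fs_bad))    # 2.0 (aliased)
```

The same experiment carries over to the spatial domain: replace seconds with metres and the sampling rate with samples per metre, and an undersampled spatial pattern aliases in exactly the same way.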