Sampling interval

Introduction

Sampling interval refers to the distance, in time or in space, between two successive observations (samples).

Explanation

According to the Nyquist–Shannon sampling theorem, which states that

“If a function x(t) contains no frequencies higher than B hertz, it is completely determined by giving its ordinates at a series of points spaced 1/(2B) seconds apart”,

the ideal sampling interval equals half the resolution, i.e. two samples per resolution cell. A larger interval loses information (undersampling), while a smaller one samples more densely than necessary (oversampling). In practice, however, sampling interval and resolution are often roughly equal, although it is good to keep in mind that resolution and sampling interval are two different things. Note that the theorem above is stated in the time domain, but the same considerations apply in the spatial domain.
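The consequence of undersampling can be illustrated with a small Python sketch. It computes the frequency at which a sampled sine wave appears after sampling; the function name `apparent_frequency` is a hypothetical helper introduced here for illustration, not part of any standard library.

```python
def apparent_frequency(signal_hz, sample_rate_hz):
    """Frequency that a sampled sine wave appears to have.

    If the Nyquist criterion holds (sample_rate_hz >= 2 * signal_hz,
    i.e. the sampling interval is at most 1/(2B) seconds), the signal
    keeps its true frequency; otherwise it aliases to a lower one.
    """
    # Fold the signal frequency into the band [0, sample_rate_hz / 2].
    f = signal_hz % sample_rate_hz
    if f > sample_rate_hz / 2:
        f = sample_rate_hz - f
    return f

# A 10 Hz sine sampled at 25 Hz (interval 0.04 s < 1/20 s): recovered correctly.
print(apparent_frequency(10.0, 25.0))  # 10.0
# The same sine sampled at only 12 Hz (undersampling): aliases to 2 Hz.
print(apparent_frequency(10.0, 12.0))  # 2.0
```

The first call satisfies the theorem's condition, so no information is lost; the second uses a sampling interval larger than 1/(2B) seconds, and the 10 Hz signal becomes indistinguishable from a 2 Hz one.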

Incoming relations