Vintage scopes are better. Back in February, I did a four-part series on why vintage oscilloscopes are better than their modern counterparts (see part one, part two, part three, and part four). Here's another reason that recently bit me.
Reason number 5: Aliasing.
Another reason that digital scopes are inferior to analog scopes is the inherent aliasing that occurs in any sampled-data system. If the time base is set incorrectly (very incorrectly), aliasing can cause significant confusion and can mislead a novice user.
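To make the folding concrete, here is a minimal sketch of where a sampled tone appears. The sample rates are purely illustrative, not the TDS3012B's actual rates:

```python
def alias_frequency(f_signal, f_sample):
    """Apparent frequency of a tone at f_signal when sampled at f_sample.

    Sampling folds every frequency into the range [0, f_sample / 2];
    this returns that folded ("aliased") frequency in Hz.
    """
    f = f_signal % f_sample          # fold into [0, f_sample)
    return min(f, f_sample - f)      # mirror into [0, f_sample / 2]

# An 8 MHz oscillator at a few hypothetical sample rates:
print(alias_frequency(8e6, 250e6))     # fast enough: 8000000.0 (no alias)
print(alias_frequency(8e6, 999e3))     # undersampled: 8000.0 -- looks like 8 kHz
print(alias_frequency(8e6, 999996.0))  # 4 ppm off a submultiple: 32.0 Hz
```

Note the last case: a sample rate only a few parts per million away from an exact submultiple of the signal folds 8 MHz down to tens of hertz, which is exactly the kind of plausible-looking low frequency that can fool you.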
I have heard people say, "Modern digital oscilloscopes do not have a problem with aliasing." This statement is demonstrably false. Here is an example of a Tektronix TDS3012B scope looking at an 8 MHz crystal oscillator. (Embarrassingly, this example occurred in front of a room full of students. Although I realized what was happening and quickly fixed it, it still angered and disappointed me.)
Here is the waveform shown at 40 milliseconds per division. The period appears to be about 32 ms, but the scope is in roll ("strip chart") mode, so I can't get a stationary trace on the screen. The frequency measurement reports about 31 Hz, which matches the period observed.
The waveform at 4 milliseconds per division. The period still appears to be about 32 ms (eight divisions for the brightest trace), but I can't seem to get a stable trigger. Although the oscilloscope says it's triggering, the waveform is rolling around the screen with several ghost images. The frequency measurement reports about 31 Hz, which still matches the period observed.
The waveform at 400 microseconds per division. Now the waveform looks like a smudge, and the frequency measurement reports 500 kHz, which doesn't make sense. The "Low resolution" label is the first indication that the scope thinks that there might be something wrong.
The waveform at 40 microseconds per division. The waveform still looks like a smudge, but the frequency measurement now reports 8.25 MHz, which is in the right ballpark. At least the smudge is uniform across the screen, which implies the time base is much too slow. The "Low resolution" label is still present.
The waveform at 4 microseconds per division. Now we can see many, many rising and falling transitions, and we know that the time base is set too slow. The frequency measurement is 8.015 MHz, which is much better.
The waveform at 400 nanoseconds per division. Finally, we get a clear view of the actual waveform, and a frequency measurement accurate to three significant digits.
The waveform at 40 nanoseconds per division. At last, the correct time-base setting. Now we can see the whole waveform, at the right speed, and we get an accurate frequency measurement.
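One way to see why the slower settings aliased is to back out the effective sample rate from the timebase. The sketch below assumes a 10-division screen and a fixed 10,000-point record; that record length is an assumption, and real scopes vary the record and cap the sample rate at fast settings, so check your scope's manual for the actual numbers:

```python
# Effective sample rate implied by each timebase setting, assuming the
# record length stays fixed (hypothetical; real scopes adjust it).
RECORD_POINTS = 10_000
DIVISIONS = 10
F_SIGNAL = 8e6   # the oscillator under test

for t_per_div in (40e-3, 4e-3, 400e-6, 40e-6, 4e-6, 400e-9, 40e-9):
    fs = RECORD_POINTS / (DIVISIONS * t_per_div)
    status = "OK" if fs > 2 * F_SIGNAL else "ALIASED"
    print(f"{t_per_div:>8.1e} s/div -> {fs:>14,.0f} Sa/s  {status}")
```

Under these assumptions, the first three settings (40 ms to 400 µs per division) put the sample rate below the 16 MSa/s Nyquist requirement for an 8 MHz signal, and the boundary falls right where the measurements started making sense.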
Go back and look at the first two pictures again. Imagine yourself as an undergraduate student, racing to complete a laboratory project by the end of the class period. You don't have a lot of experience with oscilloscopes, and you're just trying to get a frequency measurement. You can see the period on the screen, the automatic frequency measurement matches, and there is no other indication that the scope is lying to you. Would you take the time to get a stable trigger? Or would you just press the STOP button, make the measurement, and go on to the next step?
I admit that this problem is mostly due to user error (when looking at a new waveform, start at the fastest time scale and gradually slow down the horizontal sweep until you see the signal). However, the conclusion is sound: modern digital scopes DO have a problem with aliasing, whereas vintage analog (non-sampling, of course) scopes do not.
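The whole failure mode can be reproduced in a few lines. A sine far above the Nyquist rate, sampled at a hypothetical rate just off an exact submultiple, yields samples indistinguishable from a clean low-frequency sine, and a naive automatic frequency measurement cheerfully reports the alias:

```python
import math

def sampled_sine(f_signal, f_sample, n):
    """n samples of a unit-amplitude sine at f_signal, taken at f_sample."""
    return [math.sin(2 * math.pi * f_signal * k / f_sample) for k in range(n)]

def measured_frequency(samples, f_sample):
    """Naive auto-measurement: count rising zero crossings per second."""
    rising = sum(1 for a, b in zip(samples, samples[1:]) if a < 0 <= b)
    return rising * f_sample / len(samples)

# 8 MHz signal, hypothetical ~1 MSa/s effective rate, 0.2 s of record:
samples = sampled_sine(8e6, 999996.0, 200_000)
print(round(measured_frequency(samples, 999996.0)))   # -> 30, not 8000000
```

Nothing in the sampled data itself betrays the problem; the only defense is the procedure above, sweeping from the fastest timebase downward.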