Hello, I’m Rudi and this is my first post in your group.
Being new to many of the advanced topics in time and frequency measurement, I'm enjoying exploring my new Agilent 53132A counter, including the up to 15 digits it can produce via the GPIB bus or the serial port. Yet one aspect of the 53132A puzzles me: why does a measurement in Digits arming mode take so much longer than one in Time arming mode, for the same number of produced digits?
As I understand the documentation, the two modes are similar in accuracy and other respects (unlike the automatic arming mode, which differs in several ways). But the difference in measurement speed is huge: the attached graph shows my observations for Digits arming, while Time arming is up to 10x or even 100x faster for the same resolution!
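To make the comparison reproducible, here is a minimal pyvisa sketch of the kind of script I use to time one reading in each mode. The SCPI command names follow my reading of the 53131A/53132A Programming Guide and the GPIB address is just an example from my setup, so please double-check both before running:

```python
# Rough timing comparison of Time vs Digits arming on a 53132A.
# SCPI names per my reading of the 53131A/53132A Programming Guide;
# GPIB address is specific to my bench -- adjust as needed.
import time
import pyvisa

rm = pyvisa.ResourceManager()
cnt = rm.open_resource("GPIB0::3::INSTR")  # example address
cnt.timeout = 120_000  # ms; Digits arming can gate for a long time

cnt.write("*RST")
cnt.write("*CLS")
cnt.write(":FUNC 'FREQ 1'")           # frequency on channel 1
cnt.write(":FREQ:ARM:STAR:SOUR IMM")  # start arming immediately

def timed_read(stop_source, setup_cmd):
    """Configure the stop arm, take one reading, return (value, seconds)."""
    cnt.write(f":FREQ:ARM:STOP:SOUR {stop_source}")
    cnt.write(setup_cmd)
    t0 = time.monotonic()
    freq = float(cnt.query("READ?"))
    return freq, time.monotonic() - t0

# Time arming with a 1 s gate vs Digits arming asking for 10 digits
f_tim, dt_tim = timed_read("TIM", ":FREQ:ARM:STOP:TIM 1.0")
f_dig, dt_dig = timed_read("DIG", ":FREQ:ARM:STOP:DIG 10")

print(f"Time arming  : {f_tim:.6f} Hz in {dt_tim:.2f} s")
print(f"Digits arming: {f_dig:.6f} Hz in {dt_dig:.2f} s")
```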
I also noted that the 53132A manual gives a formula for the LSD displayed as a function of gate time, frequency, and more. But that formula puzzles me: when I calculate it for different frequency values (keeping the gate time at 1 s), I get strange results that don't seem to match the LSD actually displayed (see the attached calculation). Also, the last term in the formula, "Frequency or Period", makes it ambiguous to me.
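For reference, here is how I coded up my understanding of that formula, so others can check my arithmetic. Both the rounding rule and the resolution constant are assumptions on my part (the placeholder value below is NOT from the manual), so treat this as a sketch of my calculation, not the manual's definitive formula:

```python
# My reading of the "LSD Displayed" formula: a device resolution
# constant divided by gate time, scaled by the measured value, then
# rounded to a decade. Both the constant and the rounding direction
# are assumptions -- please correct me if I have either wrong.
import math

T_RES = 150e-12  # s; placeholder constant, substitute your manual's value

def lsd_displayed(value, gate_time, t_res=T_RES):
    """value is the measured Frequency (Hz) or Period (s),
    whichever function is selected -- my guess at the 'or'."""
    raw = (t_res / gate_time) * value
    # Round down to the nearest power of ten (assumed rounding rule)
    return 10.0 ** math.floor(math.log10(raw))

for f in (10e3, 10e6, 100e6):
    print(f"{f:>12.0f} Hz -> LSD {lsd_displayed(f, 1.0):.0e} Hz")
```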
Anyway, any thoughts appreciated!
Best, Rudi
YT: Rudi’s Electronics Lab