time-nuts@lists.febo.com

Discussion of precise time and frequency measurement


Re: [time-nuts] Question about frequency counter testing

HM
Hal Murray
Thu, Apr 26, 2018 7:28 PM

> The plots I showed were made with approx. 5*10^6 timestamps per second, so
> theoretically I should get approx. 4 ps equivalent resolution (or 11+
> significant digits in one second).

Is there a term for what I think you are doing?

If I understand (big if), you are doing the digital version of magic
down-conversion with an A/D.  I can't even think of the name for that.

If I have a bunch of digital samples and count the transitions, I can compute
a frequency.  But I would get the same results if the input frequency was X
plus the sampling frequency.  Or plus twice it.  ...  The digital stream is
the beat between the input and the sampling frequency.
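A quick sketch of that ambiguity (all numbers hypothetical): 1-bit sampling a
tone and counting transitions gives exactly the same answer for f, f + fs,
f + 2*fs, and so on.

```python
import numpy as np

fs = 100.0    # sampling frequency, Hz (hypothetical)
f_in = 7.3    # input frequency, Hz (hypothetical)
t = np.arange(400) / fs   # 4 s worth of sample instants

def transition_count(freq):
    """1-bit sample a tone at fs and count the edges in the stream."""
    bits = np.sin(2 * np.pi * freq * t) >= 0
    return int(np.count_nonzero(np.diff(bits)))

# The bit stream only sees the beat against the sampling clock, so an
# input at f, f + fs, f + 2*fs, ... produces the identical stream.
counts = [transition_count(f_in + k * fs) for k in range(3)]
print(counts)
```

All three counts come out identical, which is exactly the aliasing Hal
describes: counting transitions alone cannot tell those inputs apart.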

That technique depends on having a low jitter clock.  There should be some
good math in there, but I don't see it.

A related trick is getting the time from something that ticks slowly, like
the RTC/CMOS clocks on PCs.  They only tick once per second, but you can get
the time with (much) higher resolution if you poll until it ticks.
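A sketch of that polling trick, with a simulated 1 Hz RTC and a fake fast
timer standing in for the real hardware (both hypothetical):

```python
def read_rtc(t):
    """Simulated RTC/CMOS clock: whole seconds only, one tick per second."""
    return int(t)

def time_at_tick(clock):
    """Poll the 1 Hz RTC until it ticks.  At the instant of the tick the
    RTC reading is fresh, so it is good to about one polling interval
    rather than to a whole second."""
    last = read_rtc(clock())
    while True:
        now = clock()
        if read_rtc(now) != last:
            return read_rtc(now), now   # (RTC seconds, fine-grained time)

class FakeClock:
    """Advances 1 ms per reading; stands in for a fast local timer."""
    def __init__(self, start):
        self.t = start
    def __call__(self):
        self.t += 0.001
        return self.t

rtc_seconds, fine_time = time_at_tick(FakeClock(41.9975))
print(rtc_seconds, fine_time)   # tick caught within ~1 ms of 42.0
```

With a 1 ms polling interval the once-per-second clock is read to roughly
millisecond resolution, which is the point of the trick.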

Don't forget about metastability.

--
These are my opinions.  I hate spam.

OS
Oleg Skydan
Thu, Apr 26, 2018 9:28 PM

From: "Hal Murray" <hmurray@megapathdsl.net>
Sent: Thursday, April 26, 2018 10:28 PM

> Is there a term for what I think you are doing?

I have seen different terms, like "omega counter" or "multiple time-stamp
averaging counter"; there are probably others too.

> If I understand (big if), you are doing the digital version of magic
> down-conversion with an A/D.  I can't even think of the name for that.

No, it is much simpler. The hardware saves a time-stamp to memory on each
rising edge of the input signal (let's assume a digital logic input signal
for simplicity). So after some time we have many pairs of
{event number, time-stamp}. We can plot those pairs with the event number on
the X-axis and time on the Y-axis; if we now fit a line to that dataset, the
inverse of the line's slope gives the estimated frequency.

The line is fitted using linear regression.

This technique reduces the fractional frequency uncertainty to approximately

2*sqrt(3)*t_resolution / (MeasurementTime * sqrt(NumberOfEvents - 2))

So if I have 2.5 ns HW time resolution and collect 5e6 events,
processing should result in 3.9 ps equivalent resolution.

Of course, this is for the ideal case. The first real-life problem is
signal drift, for example.
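A minimal sketch of this estimator (hypothetical numbers; np.polyfit stands
in for whatever fit the firmware actually uses):

```python
import numpy as np

f_true = 11.234567e6   # input frequency, Hz (hypothetical)
t_res = 2.5e-9         # hardware time-stamp resolution, s
n_events = 100_000     # rising edges collected

# One time-stamp per rising edge, quantized to the HW resolution.
k = np.arange(n_events)                          # event numbers
t_meas = np.round(k / (f_true * t_res)) * t_res

# Fit a line through {event number, time-stamp}: the slope is the
# estimated period, its inverse the estimated frequency.
period_est, _ = np.polyfit(k, t_meas, 1)
f_est = 1.0 / period_est

# Ideal-case fractional frequency uncertainty from the formula above.
meas_time = t_meas[-1] - t_meas[0]
sigma_y = 2 * np.sqrt(3) * t_res / (meas_time * np.sqrt(n_events - 2))

# Sanity check against the numbers in the post: 2.5 ns resolution and
# 5e6 events over one second give about 3.9e-12 (3.9 ps per second).
sigma_1s = 2 * np.sqrt(3) * t_res / (1.0 * np.sqrt(5e6 - 2))
```

Even with only 2.5 ns single-shot resolution, the fitted frequency lands far
closer to f_true than one quantization step would suggest, because the fit
averages over all the events.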

Hope I was able to explain what I am doing.

BTW, I have fixed a little bug in the firmware and now the ADEV looks a bit
better. Probably I should look for better OCXOs. An interesting fact: the
counter processed 300 GB of time-stamp data during that 8+ hour run :).

All the best!
Oleg

BK
Bob kb8tq
Fri, Apr 27, 2018 12:22 AM

Hi

The degree to which your samples converge to a specific value while being averaged
depends on a bunch of things. The noise processes on the clock and the measured
signal are pretty hard to avoid. It is very easy to overestimate how fast things converge.

Bob

On Apr 26, 2018, at 5:28 PM, Oleg Skydan <olegskydan@gmail.com> wrote:

<1133.png>_______________________________________________
time-nuts mailing list -- time-nuts@febo.com
To unsubscribe, go to https://www.febo.com/cgi-bin/mailman/listinfo/time-nuts
and follow the instructions there.

BK
Bob kb8tq
Fri, Apr 27, 2018 5:22 PM

Hi

So what’s going on here?

With any of a number of modern (and not so modern) FPGAs you can run a clock in the 400 MHz region.
Clocking on a single edge gives you 2.5 ns resolution. On some parts, you are not limited to a single
edge: you can clock on both the rising and falling edges of the clock. That gets you to 1.25 ns. For the
brave, there is the ability to phase shift the clock and do the trick yet one more time. That can get you
to 0.625 ns. You may indeed need to drive more than one input to get that done.
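As a sanity check on that arithmetic, a tiny hypothetical helper (not any
particular FPGA vendor's API):

```python
def timestamp_resolution(clock_hz, both_edges=False, phases=1):
    """Coarse time-stamp resolution of a counter clocked at clock_hz,
    optionally using both clock edges and several phase-shifted copies
    of the clock."""
    edges = 2 if both_edges else 1
    return 1.0 / (clock_hz * edges * phases)

print(timestamp_resolution(400e6))                             # 2.5 ns
print(timestamp_resolution(400e6, both_edges=True))            # 1.25 ns
print(timestamp_resolution(400e6, both_edges=True, phases=2))  # 0.625 ns
```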

As you get more and more fancy, the chip timing gets further into your data. A very simple analogy is
the non-uniform step size you see on an ADC. Effectively you have a number that has a +/- ?.?? sort
of tolerance on it. As before, that may not be what you expect in a frequency counter. It still does not mean
that the data is trash. You just have a source of error to contend with.

You could also feed the data down a “wave union” style delay chain. That would get you into the 100 ps
range, with further linearity issues to contend with. There are also calibration issues, as well as temperature
and voltage dependencies. Even the timing in the multi-phase clock approach will have some voltage
and temperature dependency.

Since it’s an FPGA, coming up with a lot of resources is not all that crazy expensive. You aren’t buying
gate chips and laying out a PCB. A few thousand logic blocks is tiny by modern standards. Your counter
or delay line idea might fit in < 100 logic blocks.  There’s lots of room for pipelines and I/O this and that.
The practical limit is how much you want to put into the “pipe” that gets the data out of the FPGA.

In the end, you are still stuck with the fact that many of the various TDC chips offer higher resolution at lower cost.
You also have a pretty big gap between the raw chip price and what a fully developed instrument will run.
That’s true regardless of what you base it on and how you do the design.

Bob

