volt-nuts@lists.febo.com

Discussion of precise voltage measurement

Re: [volt-nuts] Agilent calibration

Charles Steinmetz
Wed, Aug 14, 2013 5:41 AM

Joe wrote:

The way I read this is that if I send them a DMM that is within spec, they
won't adjust it or provide pre/post data. Is this the case? If I spend over
$200 sending a DMM to them, I want it adjusted to the best possible specs
and I want the data. I do not want someone just saying that it is good
enough and send it back to me. I can get that for $50 in El Paso.

The big difference is not between adjusting and not adjusting -- it
is between getting a calibration "with full data" and getting one
without data.  /The true value of calibration is not the adjustment
-- it is the data./

Agilent doesn't just say it is good enough -- they tell you
specifically how far off it is and quantify the statistical
uncertainty of their measurement.  That is everything you need (i) to
correct readings you make with the instrument and (ii) to be
confident of the potential uncertainty of those measurements.

Let's say your meter has an uncertainty spec of +/- 15 uV (1.5 ppm)
total at 10 V.  If your calibration certificate says the meter reads
dead on at 10.000000 V, the reading shown on the display is your
measurement result (with a certainty of 1.5 ppm, or +/- 15 counts
from the reading) when you measure a 10 V source.  But the cal
certificate could just as well say that the meter reads 10.000008 V
when measuring a 10.000000 V source.  In that case, you know to
subtract 0.000008 V from whatever the meter reads when you measure a
10 V source to get your measurement result (again, with a certainty
of 1.5 ppm, or +/- 15 counts from the corrected reading).  Of course,
in the real world a voltage standard will have its own calibration
offset, so you will make two corrections when you measure your house
"10 V" standard to verify that your meter is still in calibration.

So, why wouldn't they adjust every instrument to be "spot
on"?  Because metrologists have determined that, as a general matter,
not messing with the adjustments results in overall better stability
of instruments.

Adjusting instruments inevitably causes a new drift and settling
cycle, so if you adjust everything as close to perfect as possible
every time you calibrate, you will always be on the steepest portion
of the settling curve.  On the other hand, you can benefit from the
long, ever-decreasing tail of the settling cycle by not adjusting as
long as the instrument is within the manufacturer's
specifications.  Further, seeing the change from one calibration
interval to the next, and the next, etc., increases the confidence
you have in readings you make when the last calibration is not so
fresh any more.
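
As a toy illustration of that argument, suppose (purely for the sake of
example) that the error settles exponentially after every adjustment:

import math

TAU_DAYS = 365.0        # assumed settling time constant
SHIFT_PPM = 3.0         # assumed shift introduced by an adjustment, in ppm
CAL_INTERVAL = 365.0    # one-year calibration interval

def drift_during_next_interval(days_since_adjustment):
    """ppm of change expected over the next calibration interval."""
    start = SHIFT_PPM * math.exp(-days_since_adjustment / TAU_DAYS)
    end = SHIFT_PPM * math.exp(-(days_since_adjustment + CAL_INTERVAL) / TAU_DAYS)
    return start - end

print("adjusted at every cal:  %.2f ppm over the next year" % drift_during_next_interval(0))
print("left alone for 3 years: %.2f ppm over the next year" % drift_during_next_interval(3 * 365))

With these assumed numbers, the freshly adjusted meter moves about 1.9 ppm over
the following year, while the one left alone for three years moves less than
0.1 ppm -- which is the point about living on the flat tail of the settling curve.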

Best regards,

Charles

Dr. David Kirkby
Wed, Aug 14, 2013 11:29 PM

On 14 August 2013 06:41, Charles Steinmetz csteinmetz@yandex.com wrote:

Joe wrote:

The way I read this is that if I send them a DMM that is within spec, they
won't adjust it or provide pre/post data. Is this the case? If I spend
over
$200 sending a DMM to them, I want it adjusted to the best possible specs
and I want the data. I do not want someone just saying that it is good
enough and send it back to me. I can get that for $50 in El Paso.

The big difference is not between adjusting and not adjusting -- it is
between getting a calibration "with full data" and getting one without data.
/The true value of calibration is not the adjustment -- it is the data./

Agilent doesn't just say it is good enough -- they tell you specifically how
far off it is and quantify the statistical uncertainty of their measurement.
That is everything you need (i) to correct readings you make with the
instrument and (ii) to be confident of the potential uncertainty of those
measurements.

Let's say your meter has an uncertainty spec of +/- 15 uV (1.5 ppm) total at
10 V.  If your calibration certificate says the meter reads dead on at
10.000000 V, the reading shown on the display is your measurement result
(with a certainty of 1.5 ppm, or +/- 15 counts from the reading) when you
measure a 10 V source.  But the cal certificate could just as well say that
the meter reads 10.000008 V when measuring a 10.000000 V source.  In that
case, you know to subtract 0.000008 V from whatever the meter reads when you
measure a 10 V source to get your measurement result (again, with a
certainty of 1.5 ppm, or +/- 15 counts from the corrected reading).  Of
course, in the real world a voltage standard will have its own calibration

One of the advantages of modern instruments over older ones is that
measurements are often more convenient to make. This can reduce your
measurement time and so cost. For many companies, a case can be made
to upgrade if a newer instrument will save time and money.

As a rough guess, I would assume 99.999% of instruments sold by
Agilent are for commercial non-metrology work. Those 99.999% of users
do not want to remember to subtract 0.000008 V -- they want that
instrument to be as accurate as possible.

Now if you take an instrument like the Agilent 3458A 8.5 digit DVM,
then the intended user base is going to have a lot of metrologists.
Those people might prefer their instruments are not adjusted, but I
think for 99.999% of users of test equipment, they would want the
instruments adjusted. With so much done in software now, arguments
about pots drifting once adjusted don't make any sense.

By its very nature, the readers of volt-nuts will often fall into the
0.001% that might not want their instruments adjusted, but I think it
is fair to say most would.

Agilent must have thought about these arguments, and have come to a
decision not to adjust. I'm a bit surprised myself, but they obviously
have their reasons. Clearly if an adjustment requires someone to go in
with a screwdriver, then it takes time, has some element of risk of
causing accidental damage, and it might well cause things to drift
more in the short term.

Dave

Daniel Mendes
Thu, Aug 15, 2013 1:17 AM

Sorry if I'm being naive, but what's the difficulty of making a digital
instrument with a memory to store the offset of each scale and subtract
it before sending to the display, with no pot trimming involved?
Why aren't all of the ones made after, let's say, 1995, like this?

Daniel
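
A minimal sketch, in Python, of the sort of firmware correction being asked
about here -- the range names and offset values are made up:

cal_offsets = {   # per-range offsets written at calibration time, kept in non-volatile memory
    "100mV": +0.3e-6,
    "1V": -1.2e-6,
    "10V": +8.0e-6,
}

def displayed_value(raw_volts, active_range):
    """Subtract the stored offset for the active range before display -- no trim pot involved."""
    return raw_volts - cal_offsets[active_range]

print("%.6f" % displayed_value(10.000008, "10V"))   # shows 10.000000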

On 14/08/2013 20:29, Dr. David Kirkby wrote:

On 14 August 2013 06:41, Charles Steinmetz csteinmetz@yandex.com wrote:

Joe wrote:

The way I read this is that if I send them a DMM that is within spec, they
won't adjust it or provide pre/post data. Is this the case? If I spend
over
$200 sending a DMM to them, I want it adjusted to the best possible specs
and I want the data. I do not want someone just saying that it is good
enough and send it back to me. I can get that for $50 in El Paso.

The big difference is not between adjusting and not adjusting -- it is
between getting a calibration "with full data" and getting one without data.
/The true value of calibration is not the adjustment -- it is the data./

Agilent doesn't just say it is good enough -- they tell you specifically how
far off it is and quantify the statistical uncertainty of their measurement.
That is everything you need (i) to correct readings you make with the
instrument and (ii) to be confident of the potential uncertainty of those
measurements.

Let's say your meter has an uncertainty spec of +/- 15 uV (1.5 ppm) total at
10 V.  If your calibration certificate says the meter reads dead on at
10.000000 V, the reading shown on the display is your measurement result
(with a certainty of 1.5 ppm, or +/- 15 counts from the reading) when you
measure a 10 V source.  But the cal certificate could just as well say that
the meter reads 10.000008 V when measuring a 10.000000 V source.  In that
case, you know to subtract 0.000008 V from whatever the meter reads when you
measure a 10 V source to get your measurement result (again, with a
certainty of 1.5 ppm, or +/- 15 counts from the corrected reading).  Of
course, in the real world a voltage standard will have its own calibration

One of the advantages of modern instruments over older ones is that
measurements are often more convenient to make. This can reduce your
measurement time and so cost. For many companies, a case can be made
to upgrade if a newer instrument will save time and money.

As a rough guess, I would assume 99.999% of instruments sold by
Agilent are for commercial non-metrology work. Those 99.999% of users
do not want to remember to subtract 0.000008 V -- they want that
instrument to be as accurate as possible.

Now if you take an instrument like the Agilent 3458A 8.5 digit DVM,
then the intended user base is going to have a lot of metrologists.
Those people might prefer their instruments are not adjusted, but I
think for 99.999% of users of test equipment, they would want the
instruments adjusted. With so much done in software now, arguments
about pots drifting once adjusted don't make any sense.

By its very nature, the readers of volt-nuts will often fall into the
0.001% that might not want their instruments adjusted, but I think it
is fair to say most would.

Agilent must have thought about these arguments, and have come to a
decision not to adjust. I'm a bit surprised myself, but they obviously
have their reasons. Clearly if an adjustment requires someone to go in
with a screwdriver, then it takes time, has some element of risk of
causing accidental damage, and it might well cause things to drift
more in the short term.

Dave


John Phillips
Thu, Aug 15, 2013 3:15 AM

Daniel,
They are made like that... the problem is drift.
When you cal a 3458A, the first step is to short the inputs and wait for the
thermals to die. Then you do a CAL 0 and the instrument stores all the zero
offsets for that set of terminals. Then you switch to the other set and do
it again.
The next step is to apply 10 volts to the terminals -- or in my case 9.9999411 V --
and enter CAL 9.9999411 for the front and back sets of terminals. Remove the
voltage, plug in a 10k standard resistor, and in my case enter CAL 9999.884 for
front and back. In most cases AC does not have to be done. The meter compares
its measured values with the values entered and calculates the correction
factor to be used each time a value is displayed.
Most good meters nowadays have a null function, so you can look at drift
or compare two values.
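
Roughly what that amounts to in software -- the numbers below are illustrative
only, not Agilent's actual firmware: the value entered with CAL is compared
with what the ADC actually read, and the resulting correction factor is stored
and then applied to every subsequent reading on that range.

entered_value = 9.9999411    # certified value of the 10 V standard, typed in with CAL
raw_adc_reading = 9.9999502  # what the converter measured at that moment (made up)

gain_correction = entered_value / raw_adc_reading   # stored in cal RAM at adjustment time

def displayed(raw):
    """Every subsequent reading on this range gets the stored correction applied."""
    return raw * gain_correction

print("%.7f" % displayed(9.9999502))   # shows 9.9999411, the value entered during cal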

On Wed, Aug 14, 2013 at 6:17 PM, Daniel Mendes dmendesf@gmail.com wrote:

Sorry if I'm being naive, but what's the difficulty of making a digital
instrument with a memory to store the offset of each scale and subtract it
before sending to the display, with no pot trimming involved?
Why aren't all of the ones made after, let's say, 1995, like this?

Daniel

On 14/08/2013 20:29, Dr. David Kirkby wrote:

On 14 August 2013 06:41, Charles Steinmetz csteinmetz@yandex.com wrote:

Joe wrote:

The way I read this is that if I send them a DMM that is within spec,

they
won't adjust it or provide pre/post data. Is this the case? If I spend
over
$200 sending a DMM to them, I want it adjusted to the best possible
specs
and I want the data. I do not want someone just saying that it is good
enough and send it back to me. I can get that for $50 in El Paso.

The big difference is not between adjusting and not adjusting -- it is
between getting a calibration "with full data" and getting one without
data.
/The true value of calibration is not the adjustment -- it is the data./

Agilent doesn't just say it is good enough -- they tell you specifically
how
far off it is and quantify the statistical uncertainty of their
measurement.
That is everything you need (i) to correct readings you make with the
instrument and (ii) to be confident of the potential uncertainty of those
measurements.

Let's say your meter has an uncertainty spec of +/- 15 uV (1.5 ppm)
total at
10 V.  If your calibration certificate says the meter reads dead on at
10.000000 V, the reading shown on the display is your measurement result
(with a certainty of 1.5 ppm, or +/- 15 counts from the reading) when you
measure a 10 V source.  But the cal certificate could just as well say
that
the meter reads 10.000008 V when measuring a 10.000000 V source.  In that
case, you know to subtract 0.000008 V from whatever the meter reads when
you
measure a 10 V source to get your measurement result (again, with a
certainty of 1.5 ppm, or +/- 15 counts from the corrected reading).  Of
course, in the real world a voltage standard will have its own
calibration

One of the advantages of modern instruments over older ones is that
measurements are often more convenient to make. This can reduce your
measurement time and so cost. For many companies, a case can be made
to upgrade if a newer instrument will save time and money.

As a rough guess, I would assume 99.999% of instruments sold by
Agilent are for commercial non-metrology work. Those 99.999% of users
do not want to remember to subtract 0.000008 V -- they want that
instrument to be as accurate as possible.

Now if you take an instrument like the Agilent 3458A 8.5 digit DVM,
then the intended user base is going to have a lot of metrologists.
Those people might prefer their instruments are not adjusted, but I
think for 99.999% of users of test equipment, they would want the
instruments adjusted. With so much done in software now, arguments
about pots drifting once adjusted don't make any sense.

By its very nature, the readers of volt-nuts will often fall into the
0.001% that might not want their instruments adjusted, but I think it
is fair to say most would.

Agilent must have thought about these arguments, and have come to a
decision not to adjust. I'm a bit surprised myself, but they obviously
have their reasons. Clearly if an adjustment requires someone to go in
with a screwdriver, then it takes time, has some element of risk of
causing accidental damage, and it might well cause things to drift
more in the short term.

Dave

--
John Phillips
