Tony
Thu, Apr 10, 2014 7:58 PM
Possibly so, but I'd expect that the vast majority of HP 34401A 6-1/2
digit bench multimeters never see/saw > 1kV - even when CRTs were
common. Looking at the specs for several HV probes (1% accuracy was the
best I could find with a quick search), a $10, 3-1/2 digit DVM will be
just as accurate.
"Many of the accurate ones want to see a 10 meg input."
Well if you want high accuracy with such a probe, then you wouldn't
use a 34401A, given that its 10M ohm i/p divider has a tolerance of ±1%,
which would limit the overall accuracy to around .1% at best (for a
1Gohm, 1000:1 passive HV probe).
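For anyone who wants to check that figure, here is a quick sketch in Python. The probe model (a 999M series element with a 10/9 M internal shunt) is my assumption, chosen only so that the nominal ratio works out to exactly 1000:1 into an ideal 10M meter:

```python
# Sketch: how a +/-1% tolerance on the meter's 10M i/p divider propagates
# to the overall ratio of a ~1 Gohm, 1000:1 passive HV probe.
# The probe topology (999M series, 10/9 M internal shunt) is assumed.

def probe_ratio(r_meter, r_series=999e6, r_shunt=10e6 / 9):
    r_bot = r_shunt * r_meter / (r_shunt + r_meter)  # shunt || meter input
    return (r_series + r_bot) / r_bot

nominal = probe_ratio(10e6)        # exactly 1000:1 by construction
worst = probe_ratio(10e6 * 1.01)   # meter's 10M divider at +1%
shift_pct = (worst - nominal) / nominal * 100
print(f"nominal {nominal:.3f}, at +1% {worst:.3f}, shift {shift_pct:.3f}%")
# shift is about -0.099%, i.e. the ~0.1% floor mentioned above
```

So the meter's divider tolerance alone eats the last digit, independent of the probe's own 1% spec.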
The 34401A only offers 10G ohm i/p resistance on the .1, 1 and 10V
ranges, switching to 10M ohm on 100 and 1kV ranges. So selecting the
100V range (much easier than using the menus to change the i/p
resistance) automatically selects the 10M ohm i/p resistance. Using a
1000:1 probe, voltages between 1kV and 10kV would lose a digit of
resolution compared to using the 10V range, but 5-1/2 digits is still
way more than needed given the .1% accuracy limited by the 34401A's i/p
resistance tolerance.
All in all, I think providing a minor convenience feature for HV probe
users (not having to manually select the 100V range) is a very unlikely
reason for selecting 10M as the default given that way more measurements
(source > 10 ohms) require the 10G ohm i/p resistance to justify using a
6-1/2 digit instrument.
Tony H
On 10/04/2014 16:07, Tom Miller wrote:
Think "HV Probe". Many of the accurate ones want to see a 10 meg input.
Also, some meters change input impedance depending on the selected range.
T
----- Original Message ----- From: "Tony" vnuts@toneh.demon.co.uk
To: volt-nuts@febo.com
Sent: Thursday, April 10, 2014 10:23 AM
Subject: [volt-nuts] 34401A Why 10M ohm default i/p resistance?
There is no suggestion in the specifications for the 34401A that the
accuracy suffers by selecting 10G ohm input resistance on the .1 to
10V range so why would they make 10M ohm the default? I can think of
very few cases where having the 10M ohm i/p resistor switched in is
better for accuracy than not.
On the other hand 10M is sufficiently low to produce significant
errors on a 6 1/2 digit DVM for sources with resistances as low as 10
ohms. Measuring 1V divided by a 100k/100k ohm divider for example
causes a .5% error - reading 497.512mV instead of 500.000mV. That might not
be a problem but I wouldn't be surprised if this catches a lot of
people out (including me) when not pausing to do the mental
arithmetic to estimate the error. It's just too easy to be seduced by
all those digits into thinking you've made an accurate measurement
even though you discarded those last three digits.
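The loading arithmetic is easy to check; a minimal Python sketch with the example's values:

```python
# Loading error: 1 V source, 100k/100k divider, meter input resistance
# in parallel with the bottom leg.

def loaded_reading(v_src, r_top, r_bot, r_meter):
    r_eff = r_bot * r_meter / (r_bot + r_meter)  # bottom leg || meter
    return v_src * r_eff / (r_top + r_eff)

v_10m = loaded_reading(1.0, 100e3, 100e3, 10e6)   # default 10M input
v_10g = loaded_reading(1.0, 100e3, 100e3, 10e9)   # 10G input selected
print(f"10M input: {v_10m * 1e3:.3f} mV")  # 497.512 mV - about 0.5% low
print(f"10G input: {v_10g * 1e3:.2f} mV")  # 500.00 mV at this resolution
```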
And if it's not a problem then you probably don't need an expensive 6
1/2 digit meter in the first place.
It's a small point I agree but it can get irritating to have to keep
going into the measurement menus to change it when the meter is
turned on when measuring high impedance sources (e.g. capacitor
leakage testing).
It can't be to improve i/p protection as 10M is too high to make any
significant difference to ESD and in any case there is plenty of
other over-voltage protection. OK, it provides a path for the DC
amplifier's input bias current, specified to be < 30pA at 25 degrees
C, but I imagine that varies significantly from one meter to the
next, and with temperature, so not useful for nulling out that error.
So why would they do this? Could it be psychological? By limiting the
drift caused by the i/p bias current to 300uV max when the meter is
left unconnected? A voltmeter with a rapidly drifting reading
(several mV/s) when not connected to anything is a bit disconcerting
and would probably lead to complaints that the meter is obviously
faulty to users who are used to DVMs which read 0V when open circuit
- because they have i/p resistance << 10G ohms and don't have the
resolution to show the offset voltage caused by the i/p bias current.
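Rough numbers behind that, as a sketch; the 100 pF input capacitance is my guess purely for illustration, and 30 pA is the spec limit rather than a typical value (real inputs usually sit well below it, which is why the observed open-circuit drift is a few mV/s rather than hundreds):

```python
# Offset and drift from the DC amplifier's input bias current.
i_bias = 30e-12   # A - the < 30 pA spec limit at 25 C
r_in = 10e6       # ohm - the default 10M input resistance
c_in = 100e-12    # F - assumed input capacitance, illustration only

offset = i_bias * r_in   # standing offset with the 10M divider in
drift = i_bias / c_in    # open-input drift rate with the divider out
print(f"offset across 10M: {offset * 1e6:.0f} uV")   # 300 uV
print(f"worst-case drift: {drift * 1e3:.0f} mV/s")   # 300 mV/s at the limit
```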
Personally I'd have thought that the default should be the other way
round - especially given that there is no indication on the front
panel or display as to which i/p resistance is currently selected.
Any thoughts? What do other meters do?
Tony H
Tony
Thu, Apr 10, 2014 8:06 PM
That seems a more likely reason - matching users' expectations. It's the
unexpected that trips people up - I doubt many casual users of DVMs ever
see the manuals. I still think it was the wrong choice.
Tony H
On 10/04/2014 18:58, Joel Setton wrote:
I think the 10 Meg default value became a de facto standard at the
time of VTVMs (vacuum-tube volt meters), as a convenient value which
reduced input circuit loading while remaining compatible with the grid
current of the input triode. Designers of early solid-state voltmeters
merely decided not to change a good thing.
Just my $0.02 worth!
Joel Setton
On 10/04/2014 18:55, Steven J Banaska wrote:
As Tom said, the 10M input impedance is used for the high voltage
ranges because it is a resistive divider (9.9M/100k) that can handle
high voltages without much drift. Caddock THV or HVD parts are fairly
common in precision DMMs.
Typically you will find a high impedance (10G) path that can be used
for the ranges 10V and lower, but the 10M divider can be left connected
and will work for any voltage range by changing which side you measure.
As you mentioned, there can be an accuracy sacrifice when you have a
high output impedance from your source. I'm not sure why 10M is the
default, other than it may extend the life of the relay that switches
the 10M divider in or out.
Steve
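To put numbers on Steve's description (plain Python, nominal values only):

```python
# The 9.9M/100k string: the 100k tap gives the divide-by-100 used on the
# high voltage ranges, and the whole string presents the familiar 10M.
r_top, r_tap = 9.9e6, 100e3
print((r_top + r_tap) / r_tap)  # 100.0 -> 100:1 for the HV ranges
print(r_top + r_tap)            # 10000000.0 -> 10M total input resistance
```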
Brent Gordon
Thu, Apr 10, 2014 8:16 PM
Pure conjecture: So that the reading on the 34401A matches that on a
$20 DVM.
Or stated differently: So that the input impedance is the same as other
DVMs.
Brent
On 4/10/2014 8:23 AM, Tony wrote:
There is no suggestion in the specifications for the 34401A that the
accuracy suffers by selecting 10G ohm input resistance on the .1 to
10V range so why would they make 10M ohm the default? I can think of
very few cases where having the 10M ohm i/p resistor switched in is
better for accuracy than not.
On the other hand 10M is sufficiently low to produce significant
errors on a 6 1/2 digit DVM for sources with resistances as low as 10
ohms. Measuring 1V divided by a 100k/100k ohm divider for example
causes a .5% error - reading 497.512mV instead of 500.000mV. That might not be
a problem but I wouldn't be surprised if this catches a lot of people
out (including me) when not pausing to do the mental arithmetic to
estimate the error. It's just too easy to be seduced by all those
digits into thinking you've made an accurate measurement even though
you discarded those last three digits.
And if it's not a problem then you probably don't need an expensive 6
1/2 digit meter in the first place.
It's a small point I agree but it can get irritating to have to keep
going into the measurement menus to change it when the meter is turned
on when measuring high impedance sources (e.g. capacitor leakage
testing).
It can't be to improve i/p protection as 10M is too high to make any
significant difference to ESD and in any case there is plenty of other
over-voltage protection. OK, it provides a path for the DC amplifier's
input bias current, specified to be < 30pA at 25 degrees C, but I
imagine that varies significantly from one meter to the next, and with
temperature, so not useful for nulling out that error.
So why would they do this?
John Phillips
Thu, Apr 10, 2014 8:43 PM
So why do you care what the input is, as long as you know what it is
and how to make it do what you want?
On Thu, Apr 10, 2014 at 1:16 PM, Brent Gordon volt-nuts@adobe-labs.com wrote:
Pure conjecture: So that the reading on the 34401A matches that on a $20
DVM.
Or stated differently: So that the input impedance is the same as other
DVMs.
Brent
On 4/10/2014 8:23 AM, Tony wrote:
There is no suggestion in the specifications for the 34401A that the
accuracy suffers by selecting 10G ohm input resistance on the .1 to 10V
range so why would they make 10M ohm the default? I can think of very few
cases where having the 10M ohm i/p resistor switched in is better for
accuracy than not.
On the other hand 10M is sufficiently low to produce significant errors
on a 6 1/2 digit DVM for sources with resistances as low as 10 ohms.
Measuring 1V divided by a 100k/100k ohm divider for example causes a .5%
error - reading 497.512mV instead of 500.000mV. That might not be a problem but I
wouldn't be surprised if this catches a lot of people out (including me)
when not pausing to do the mental arithmetic to estimate the error. It's
just too easy to be seduced by all those digits into thinking you've made
an accurate measurement even though you discarded those last three digits.
And if it's not a problem then you probably don't need an expensive 6 1/2
digit meter in the first place.
It's a small point I agree but it can get irritating to have to keep
going into the measurement menus to change it when the meter is turned on
when measuring high impedance sources (e.g. capacitor leakage testing).
It can't be to improve i/p protection as 10M is too high to make any
significant difference to ESD and in any case there is plenty of other
over-voltage protection. OK, it provides a path for the DC amplifier's
input bias current, specified to be < 30pA at 25 degrees C, but I imagine
that varies significantly from one meter to the next, and with temperature,
so not useful for nulling out that error.
So why would they do this?
--
John Phillips
Tom Miller
Thu, Apr 10, 2014 8:45 PM
Don't forget: there is accuracy, and then there is precision. You
should not confuse the two.
And many things besides old CRTs use high voltages (>1kV).
T
----- Original Message -----
From: "Brent Gordon" volt-nuts@adobe-labs.com
To: "Discussion of precise voltage measurement" volt-nuts@febo.com
Sent: Thursday, April 10, 2014 4:16 PM
Subject: Re: [volt-nuts] 34401A Why 10M ohm default i/p resistance?
Pure conjecture: So that the reading on the 34401A matches that on a $20
DVM.
Or stated differently: So that the input impedance is the same as other
DVMs.
Brent
On 4/10/2014 8:23 AM, Tony wrote:
There is no suggestion in the specifications for the 34401A that the
accuracy suffers by selecting 10G ohm input resistance on the .1 to 10V
range so why would they make 10M ohm the default? I can think of very few
cases where having the 10M ohm i/p resistor switched in is better for
accuracy than not.
On the other hand 10M is sufficiently low to produce significant errors
on a 6 1/2 digit DVM for sources with resistances as low as 10 ohms.
Measuring 1V divided by a 100k/100k ohm divider for example causes a .5%
error - reading 497.512mV instead of 500.000mV. That might not be a problem but I
wouldn't be surprised if this catches a lot of people out (including me)
when not pausing to do the mental arithmetic to estimate the error. It's
just too easy to be seduced by all those digits into thinking you've made
an accurate measurement even though you discarded those last three
digits.
And if it's not a problem then you probably don't need an expensive 6 1/2
digit meter in the first place.
It's a small point I agree but it can get irritating to have to keep
going into the measurement menus to change it when the meter is turned on
when measuring high impedance sources (e.g. capacitor leakage testing).
It can't be to improve i/p protection as 10M is too high to make any
significant difference to ESD and in any case there is plenty of other
over-voltage protection. OK, it provides a path for the DC amplifier's
input bias current, specified to be < 30pA at 25 degrees C, but I imagine
that varies significantly from one meter to the next, and with
temperature, so not useful for nulling out that error.
So why would they do this?
Tony
Thu, Apr 10, 2014 8:54 PM
Very unlikely I'd have thought - the relay (K104) which selects between
the high and low voltage ranges also selects the i/p resistance. It
wouldn't get used any more than the identical relay (K102) which
switches when changing between 10 and 100V ranges.
In any case a typical signal relay is rated for 10^8 no-load
operations, as in this application, which is over three years of
non-stop switching at one operation per second! How often does even a
heavily used DVM change between the 10 and 100V ranges?
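A one-line check of that arithmetic (Python):

```python
# 1e8 no-load operations at one operation per second, expressed in years.
ops = 1e8
seconds_per_year = 365.25 * 24 * 3600
print(f"{ops / seconds_per_year:.1f} years")  # ~3.2 years of non-stop switching
```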
On 10/04/2014 20:27, Andreas Jahn wrote:
Hello,
perhaps it's just to preserve the lifetime of the input range selection
relays through at least the warranty period.
Just a guess.
With best regards
Andreas
Brooke Clarke
Thu, Apr 10, 2014 8:58 PM
Hi John:
Because when measuring a source with a high resistance you get a different answer.
Some W.W.II electronics specified 1 kOhm/Volt meters and if you used a VTVM you got the wrong results.
If a test procedure specifies a 10MOhm input meter and you use a higher input Z then you may get wrong results.
Have Fun,
Brooke Clarke
http://www.PRC68.com
http://www.end2partygovernment.com/2012Issues.html
John Phillips wrote:
so why do you care what the input is as long as you know what it is and how
to make it do what you want?
On Thu, Apr 10, 2014 at 1:16 PM, Brent Gordon volt-nuts@adobe-labs.com wrote:
Pure conjecture: So that the reading on the 34401A matches that on a $20
DVM.
Or stated differently: So that the input impedance is the same as other
DVMs.
Brent
On 4/10/2014 8:23 AM, Tony wrote:
There is no suggestion in the specifications for the 34401A that the
accuracy suffers by selecting 10G ohm input resistance on the .1 to 10V
range so why would they make 10M ohm the default? I can think of very few
cases where having the 10M ohm i/p resistor switched in is better for
accuracy than not.
On the other hand 10M is sufficiently low to produce significant errors
on a 6 1/2 digit DVM for sources with resistances as low as 10 ohms.
Measuring 1V divided by a 100k/100k ohm divider for example causes a .5%
error - reading 497.512mV instead of 500.000mV. That might not be a problem but I
wouldn't be surprised if this catches a lot of people out (including me)
when not pausing to do the mental arithmetic to estimate the error. It's
just too easy to be seduced by all those digits into thinking you've made
an accurate measurement even though you discarded those last three digits.
And if it's not a problem then you probably don't need an expensive 6 1/2
digit meter in the first place.
It's a small point I agree but it can get irritating to have to keep
going into the measurement menus to change it when the meter is turned on
when measuring high impedance sources (e.g. capacitor leakage testing).
It can't be to improve i/p protection as 10M is too high to make any
significant difference to ESD and in any case there is plenty of other
over-voltage protection. OK, it provides a path for the DC amplifier's
input bias current, specified to be < 30pA at 25 degrees C, but I imagine
that varies significantly from one meter to the next, and with temperature,
so not useful for nulling out that error.
So why would they do this?
Poul-Henning Kamp
Thu, Apr 10, 2014 8:59 PM
In message <534704F7.3030101@toneh.demon.co.uk>, Tony writes:
Very unlikely I'd have thought - the relay (K104) which selects between
the high and low voltage ranges also selects the i/p resistance. It
wouldn't get used any more than the identical relay (K102) which
switches when changing between 10 and 100V ranges.
If you leave your 34401A on 10G input with nothing connected in
a dry atmosphere, it is going to build up charge and trigger the
autorange relay. Anti-wear-out mechanisms are certainly relevant.
A similar mechanism is documented in one of the 3458A manuals.
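As a rough illustration of why (both the charge and the capacitance below are assumed values, not 34401A specs):

```python
# With the 10G setting there is essentially no path to bleed charge off
# the input, so a small static charge develops a large voltage across
# the input capacitance - easily enough to walk the autoranger upward.
q = 1e-9      # C - a modest static charge, assumed
c = 100e-12   # F - assumed input capacitance
print(round(q / c, 1), "V")  # 10.0 V from just 1 nC
```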
--
Poul-Henning Kamp | UNIX since Zilog Zeus 3.20
phk@FreeBSD.ORG | TCP/IP since RFC 956
FreeBSD committer | BSD since 4.3-tahoe
Never attribute to malice what can adequately be explained by incompetence.
In message <534704F7.3030101@toneh.demon.co.uk>, Tony writes:
>Very unlikely I'd have thought - the relay (K104) which selects between
>the high and low voltage ranges also selects the I/P resistance. It
>wouldn't get used any more than the identical relay (K102) which
>switches when changing between 10 and 100V ranges.
If you leave your 34401A on 10G input with nothing connected in
a dry atmosphere, it is going to build up charge and trigger the
autorange relay. Anti-wear-out mechanisms are certainly relevant.
A similar mechanism is documented in one of the 3458A manuals.
--
Poul-Henning Kamp | UNIX since Zilog Zeus 3.20
phk@FreeBSD.ORG | TCP/IP since RFC 956
FreeBSD committer | BSD since 4.3-tahoe
Never attribute to malice what can adequately be explained by incompetence.
JP
John Phillips
Thu, Apr 10, 2014 9:05 PM
With a 1K ohm per volt meter you need to know what range you are using. You do
have to know your meter and know how to correct for loading or not loading.
It is not very practical to have a bunch of different input standards; 10M
works for a lot of things and is the standard voltage divider.
On Thu, Apr 10, 2014 at 1:59 PM, Poul-Henning Kamp <phk@phk.freebsd.dk> wrote:
> In message <534704F7.3030101@toneh.demon.co.uk>, Tony writes:
>
> >Very unlikely I'd have thought - the relay (K104) which selects between
> >the high and low voltage ranges also selects the I/P resistance. It
> >wouldn't get used any more than the identical relay (K102) which
> >switches when changing between 10 and 100V ranges.
>
> If you leave your 34401A on 10G input with nothing connected in
> a dry atmosphere, it is going to build up charge and trigger the
> autorange relay. Anti-wear-out mechanisms are certainly relevant.
>
> A similar mechanism is documented in one of the 3458A manuals.
>
> --
> Poul-Henning Kamp | UNIX since Zilog Zeus 3.20
> phk@FreeBSD.ORG | TCP/IP since RFC 956
> FreeBSD committer | BSD since 4.3-tahoe
> Never attribute to malice what can adequately be explained by incompetence.
--
John Phillips
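[For anyone unfamiliar with the 1K-ohm-per-volt convention: an analog meter's input resistance is its sensitivity times the selected full-scale range, so the loading - and any correction for it - changes with the range. A small illustration using generic example ranges, not the figures for any particular meter:

```python
# A 1 kohm/V meter presents R_in = sensitivity * full_scale_volts,
# so the loading error depends on which range is selected.
sensitivity = 1e3                   # ohms per volt
for full_scale in (10, 100, 1000):  # example ranges in volts
    r_in = sensitivity * full_scale
    print(f"{full_scale} V range: R_in = {r_in / 1e3:g} kohm")
```

This is why a fixed 10M input is the more practical "standard" divider load. -Ed.]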
T
Tony
Thu, Apr 10, 2014 9:18 PM
Gordon wrote:
> Pure conjecture: So that the reading on the 34401A matches that on a
> $20 DVM.
I assume you mean when the DVM is disconnected - otherwise you wouldn't
spend more than $20 on a meter! But I said that in my original post:
So why would they do this? Could it be psychological? By limiting
the drift caused by the i/p bias current to 300uV max when the meter
is left unconnected? A voltmeter with a rapidly drifting reading
(several mV/s) when not connected to anything is a bit disconcerting
and would *probably lead to complaints that the meter is obviously
faulty to users who are used to DVMs which read 0V when open
circuit* - because they have i/p resistance << 10G ohms and don't
have the resolution to show the offset voltage caused by the i/p
bias current.
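[The 300uV figure quoted above is just Ohm's law applied to the spec numbers; a quick sanity check, using the spec's worst-case bias-current limit rather than any measured value:

```python
# Worst-case zero offset when the input bias current flows through the
# default 10 Mohm input resistance: V = I_bias * R_in.
i_bias_max = 30e-12   # < 30 pA at 25 degrees C, per the 34401A spec
r_in_default = 10e6   # default 10 Mohm input resistance
v_offset = i_bias_max * r_in_default
print(f"worst-case offset: {v_offset * 1e6:.0f} uV")  # 300 uV
```

-Ed.]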
> Or stated differently: So that the input impedance is the same as
> other DVMs.
Not really - that's a different reason. Other meters have a variety of
different input resistances, but 10M is probably the most common.
In any case, with the exception of matching the needs of a HV probe, the
higher the input resistance the better. Deliberately compromising the
performance to match cheaper models and making it harder than necessary
(a sequence of 9 button presses!) to de-select that error source seems
to be a bizarre choice.
Tony H
>
> Brent
>
> On 4/10/2014 8:23 AM, Tony wrote:
>> There is no suggestion in the specifications for the 34401A that the
>> accuracy suffers by selecting 10G ohm input resistance on the .1 to
>> 10V range so why would they make 10M ohm the default? I can think of
>> very few cases where having the 10M ohm i/p resistor switched in is
>> better for accuracy than not.
>>
>> On the other hand 10M is sufficiently low to produce significant
>> errors on a 6 1/2 digit DVM for sources with resistances as low as 10
>> ohms. Measuring 1V divided by a 100k/100k ohm divider for example
>> causes a .5% error - 497.512mV instead of 500.000mV. That might not
>> be a problem but I wouldn't be surprised if this catches a lot of
>> people out (including me) when not pausing to do the mental
>> arithmetic to estimate the error. It's just too easy to be seduced by
>> all those digits into thinking you've made an accurate measurement
>> even though you discarded those last three digits.
>>
>> And if it's not a problem then you probably don't need an expensive 6
>> 1/2 digit meter in the first place.
>>
>> It's a small point I agree but it can get irritating to have to keep
>> going into the measurement menus to change it when the meter is
>> turned on when measuring high impedance sources (e.g. capacitor
>> leakage testing).
>>
>> It can't be to improve i/p protection as 10M is too high to make any
>> significant difference to ESD and in any case there is plenty of
>> other over-voltage protection. OK, it provides a path for the DC
>> amplifier's input bias current, specified to be < 30pA at 25 degrees
>> C, but I imagine that varies significantly from one meter to the
>> next, and with temperature, so not useful for nulling out that error.
>>
>> So why would they do this?
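[The divider arithmetic in the quoted post is easy to check numerically; note that the 10 Mohm input pulls the reading *below* the unloaded 500.000mV:

```python
# 1 V across a 100k/100k divider, measured with a 10 Mohm input DMM.
r_top, r_bot, r_meter = 100e3, 100e3, 10e6
r_loaded = r_bot * r_meter / (r_bot + r_meter)  # bottom leg || meter input
v_meas = 1.0 * r_loaded / (r_top + r_loaded)    # what the DMM reads
v_ideal = 1.0 * r_bot / (r_top + r_bot)         # unloaded divider output
print(f"reads {v_meas*1e3:.3f} mV vs {v_ideal*1e3:.3f} mV ideal "
      f"({100*(v_meas - v_ideal)/v_ideal:+.2f} %)")
# reads 497.512 mV vs 500.000 mV ideal (-0.50 %)
```

-Ed.]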