volt-nuts@lists.febo.com

Discussion of precise voltage measurement


Datron 1281 Selcal Multiplier

starbook@uplink.net
Sat, Apr 5, 2014 2:11 AM

These error codes 2292, 2282, 2272, 2262, 2252, 2216, and 2214 all use
the Selfcal Multiplier circuit.
I get all of the error codes listed above when doing an INT SRCE CAL.
The output at M609 pin-2 is not a DC level; as a result, M609 pin-15 is a
sawtooth waveform.
The output at M609 pin-2 is a square wave with a 10% on-time duty cycle
and a positive offset.
All ICs are functional.
What is the problem with this selfcal multiplier circuit?
John

Михаил
Sat, Apr 5, 2014 5:14 AM

Hello, John

The full Service Manual with circuit descriptions for the Datron 1281 is very rare.
If you have it, you could upload it to http://ko4bb.com/

Regards,
Mickle T.

Saturday, April 5, 2014, 6:11:09 AM, you wrote:
sun> These error codes 2292, 2282, 2272, 2262, 2252, 2216, and 2214 all use
sun> the Selfcal Multiplier circuit.
sun> I have all the error codes listed above, when doing a INT SRCE CAL.
sun> Output at M609 pin-2  is not a dc level, therefore M609 pin-15 is a
sun> sawtooth waveform.
sun> Output on M609 pin-2 is a 10% duty on time square wave with a positive
sun> offset.
sun> All IC's are functional
sun> What is the problem with this selfcal multiplier circuit?
sun> John
sun> _______________________________________________
sun> volt-nuts mailing list -- volt-nuts@febo.com
sun> To unsubscribe, go to https://www.febo.com/cgi-bin/mailman/listinfo/volt-nuts
sun> and follow the instructions there.

Tony
Thu, Apr 10, 2014 2:23 PM

There is no suggestion in the specifications for the 34401A that
accuracy suffers when the 10G ohm input resistance is selected on the .1 to 10V
ranges, so why would they make 10M ohm the default? I can think of very
few cases where having the 10M ohm i/p resistor switched in is better
for accuracy than not.

On the other hand, 10M is sufficiently low to produce significant errors
on a 6 1/2 digit DVM even for sources with resistances as low as 10 ohms.
Measuring 1V divided by a 100k/100k ohm divider, for example, causes a 0.5%
error - 502.488mV instead of 500.000mV. That might not be a problem, but
I wouldn't be surprised if this catches a lot of people out (including
me) when they don't pause to do the mental arithmetic to estimate the error.
It's just too easy to be seduced by all those digits into thinking
you've made an accurate measurement, even though you've effectively
discarded the last three digits.
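The loading error above is easy to check numerically. Here is a quick Python sketch using the figures from the example; the 502.488 mV value corresponds to reading across the upper leg of the divider, while the loaded midpoint itself reads about 497.512 mV:

```python
# Loading of a 100k/100k divider (1 V source) by the meter's 10 Mohm input.
R_top, R_bot, R_in = 100e3, 100e3, 10e6

r_bot_loaded = R_bot * R_in / (R_bot + R_in)        # bottom leg in parallel with meter
v_mid = 1.0 * r_bot_loaded / (R_top + r_bot_loaded)

print("ideal midpoint : 500.000 mV")
print(f"loaded midpoint: {v_mid * 1e3:.3f} mV")            # 497.512 mV
print(f"upper leg reads: {(1.0 - v_mid) * 1e3:.3f} mV")    # 502.488 mV, ~0.5% error
```

With the >10G setting selected instead, the same calculation gives an error below 1 ppm, which is under the last digit of a 6 1/2 digit reading.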

And if it's not a problem then you probably don't need an expensive 6
1/2 digit meter in the first place.

It's a small point, I agree, but it can get irritating to have to keep
going into the measurement menus to change the setting every time the meter
is turned on when measuring high-impedance sources (e.g. capacitor leakage testing).

It can't be to improve i/p protection, as 10M is too high to make any
significant difference to ESD, and in any case there is plenty of other
over-voltage protection. OK, it provides a path for the DC amplifier's
input bias current, specified to be < 30pA at 25 degrees C, but I
imagine that varies significantly from one meter to the next, and with
temperature, so it's not useful for nulling out that error.

So why would they do this? Could it be psychological - limiting the
drift caused by the i/p bias current to 300uV max when the meter is left
unconnected? A voltmeter with a rapidly drifting reading (several mV/s)
when not connected to anything is a bit disconcerting, and would probably
lead to complaints that the meter is obviously faulty from users who are
used to DVMs which read 0V when open-circuit - because those have i/p
resistance << 10G ohms and don't have the resolution to show the offset
voltage caused by the i/p bias current.
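The 300uV figure follows directly from Ohm's law applied to the bias-current spec quoted above; with the 10G path selected instead, the same current can develop hundreds of millivolts across an open input:

```python
# Worst-case DC offset developed by the specified input bias current
# across each selectable input resistance, with the input left open.
I_bias = 30e-12                       # < 30 pA at 25 C, per the spec quoted above
for R_in in (10e6, 10e9):
    print(f"{R_in:.0e} ohm: {I_bias * R_in * 1e3:.1f} mV")
# 1e+07 ohm: 0.3 mV
# 1e+10 ohm: 300.0 mV
```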

Personally I'd have thought that the default should be the other way
round - especially given that there is no indication on the front panel
or display as to which i/p resistance is currently selected.

Any thoughts? What do other meters do?

Tony H

Tom Miller
Thu, Apr 10, 2014 3:07 PM

Think "HV Probe". Many of the accurate ones want to see a 10 meg input.

Also, some meters change input impedance depending on the selected range.

T

----- Original Message -----
From: "Tony" vnuts@toneh.demon.co.uk
To: volt-nuts@febo.com
Sent: Thursday, April 10, 2014 10:23 AM
Subject: [volt-nuts] 34401A Why 10M ohm default i/p resistance?

There is no suggestion in the specifications for the 34401A that the
accuracy suffers by selecting 10G ohm input resistance on the .1 to 10V
range so why would they make 10M ohm the default? I can think of very few
cases where having the 10M ohm i/p resistor switched  in is better for
accuracy than not.

On the other hand 10M is sufficiently low to produce significant errors on
a 6 1/2 digit DVM for sources with resistances as low as 10 ohms.
Measuring 1V divided by a 100k/100k ohm divider for example causes a .5%
error - 502.488mV instead of 500.000mV. That might not be a problem but I
wouldn't be surprised if this catches a lot of people out (including me)
when not pausing to do the mental arithmetic to estimate the error. It's
just too easy to be seduced by all those digits into thinking you've made
an accurate measurement even though you discarded those last three digits.

And if it's not a problem then you probably don't need an expensive 6 1/2
digit meter in the first place.

It's a small point I agree but it can get irritating to have to keep going
into the measurement menus to change it when the meter is turned on when
measuring high impedance sources (e.g. capacitor leakage testing).

It can't be to improve i/p protection as 10M is too high to make any
significant difference to ESD and in any case there is plenty of other
over-voltage protection. OK. it provides a path for the  DC amplifier's
input bias current, specified to be < 30pA at 25 degrees C, but I imagine
that varies significantly from one meter to the next, and with
temperature, so not useful for nulling out that error.

So why would they do this? Could it be psychological? By limiting the
drift caused by the i/p bias current to 300uV max when the meter is left
unconnected? A voltmeter with a rapidly drifting reading (several mV/s)
when not connected to anything is a bit disconcerting and would probably
lead to complaints that the meter is obviously faulty to users who are
used to DVMs which read 0V when open circuit - because they have i/p
resistance << 10G ohms and don't have the resolution to show the offset
voltage caused by the i/p bias current.

Personally I'd have though that the default should be the other way
round - especially given that there is no indication on the front panel or
display as to which i/p resistance is currently selected.

Any thoughts? What do other meters do?

Tony H



Steven J Banaska
Thu, Apr 10, 2014 4:55 PM

As Tom said, the 10M input impedance is used for the high-voltage ranges
because it is a resistive divider (9.9M/100k) that can handle high voltages
without much drift. Caddock THV or HVD networks are fairly common in precision DMMs.

Typically you will find a high-impedance (10G) path that can be used for
the 10V and lower ranges, but the 10M divider can be left connected and
will work for any voltage range by changing which side you measure. As you
mentioned, there can be an accuracy sacrifice when your source has a high
output impedance. I'm not sure why 10M is the default, other than that it
may extend the life of the relay that switches the 10M divider in or out.
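The arithmetic behind that divider is simple: tapping the 100k leg gives a divide-by-100 attenuator with an exact 10M total input resistance, which is what the high-voltage ranges need. A quick sketch using the component values given above:

```python
# 9.9M/100k input divider: 10M total input resistance, /100 attenuation.
R_hi, R_lo = 9.9e6, 100e3
atten = R_lo / (R_hi + R_lo)          # 0.01

print(R_hi + R_lo)                    # 10000000.0 -> the 10M input resistance
print(1000.0 * atten)                 # 10.0 -> a 1000 V input reaches the ADC as 10 V
```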

Steve

On Thu, Apr 10, 2014 at 8:07 AM, Tom Miller <tmiller11147@verizon.net> wrote:

Think "HV Probe". Many of the accurate ones want to see a 10 meg input.

Also, some meters change input impedance depending on the selected range.

T

----- Original Message ----- From: "Tony" vnuts@toneh.demon.co.uk
To: volt-nuts@febo.com
Sent: Thursday, April 10, 2014 10:23 AM
Subject: [volt-nuts] 34401A Why 10M ohm default i/p resistance?

There is no suggestion in the specifications for the 34401A that the

accuracy suffers by selecting 10G ohm input resistance on the .1 to 10V
range so why would they make 10M ohm the default? I can think of very few
cases where having the 10M ohm i/p resistor switched  in is better for
accuracy than not.

On the other hand 10M is sufficiently low to produce significant errors
on a 6 1/2 digit DVM for sources with resistances as low as 10 ohms.
Measuring 1V divided by a 100k/100k ohm divider for example causes a .5%
error - 502.488mV instead of 500.000mV. That might not be a problem but I
wouldn't be surprised if this catches a lot of people out (including me)
when not pausing to do the mental arithmetic to estimate the error. It's
just too easy to be seduced by all those digits into thinking you've made
an accurate measurement even though you discarded those last three digits.

And if it's not a problem then you probably don't need an expensive 6 1/2
digit meter in the first place.

It's a small point I agree but it can get irritating to have to keep
going into the measurement menus to change it when the meter is turned on
when measuring high impedance sources (e.g. capacitor leakage testing).

It can't be to improve i/p protection as 10M is too high to make any
significant difference to ESD and in any case there is plenty of other
over-voltage protection. OK. it provides a path for the  DC amplifier's
input bias current, specified to be < 30pA at 25 degrees C, but I imagine
that varies significantly from one meter to the next, and with temperature,
so not useful for nulling out that error.

So why would they do this? Could it be psychological? By limiting the
drift caused by the i/p bias current to 300uV max when the meter is left
unconnected? A voltmeter with a rapidly drifting reading (several mV/s)
when not connected to anything is a bit disconcerting and would probably
lead to complaints that the meter is obviously faulty to users who are used
to DVMs which read 0V when open circuit - because they have i/p resistance
<< 10G ohms and don't have the resolution to show the offset voltage caused
by the i/p bias current.

Personally I'd have though that the default should be the other way round
- especially given that there is no indication on the front panel or
display as to which i/p resistance is currently selected.

Any thoughts? What do other meters do?

Tony H



Joel Setton
Thu, Apr 10, 2014 5:58 PM

I think the 10 Meg default value became a de facto standard in the days
of VTVMs (vacuum-tube voltmeters), as a convenient value which reduced
input-circuit loading while remaining compatible with the grid current
of the input triode. Designers of early solid-state voltmeters simply
decided not to change a good thing.
Just my $0.02 worth!

Joel Setton

On 10/04/2014 18:55, Steven J Banaska wrote:

As Tom said the 10M input impedance is used for the high voltage ranges
because it is a resistive divider (9.9M/100k) that can handle high voltages
without much drift. Caddock THV or HVD are fairly common in precision dmms.

Typically you will find a high impedance (10G) path that can be used for
the ranges 10V and lower, but the 10M divider can be left connected and
will work for any voltage range by changing which side you measure. As you
mentioned there can be an accuracy sacrifice when you have a high output
impedance from your source. I'm not sure why 10M is the default other than
it may extend the life of the relay that switches the 10M divider in or out.

Steve

Brooke Clarke
Thu, Apr 10, 2014 7:23 PM

Hi Tony:

Fluke makes some DMMs that have what they call V-Check, where they put a 1,000 ohm resistor across the voltage input.
When testing lawn sprinkler valves, if you measure the voltage across the valve with a Hi-Z voltmeter it looks normal,
but the V-Check range on the DMM shows the voltage to be almost zero.
http://www.prc68.com/I/DMM.shtml
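This is the same source-loading arithmetic discussed earlier in the thread, used deliberately: a "ghost" voltage behind a high source impedance survives a 10M input but collapses into a 1k load. The 24 V source and 200k source impedance below are made-up illustrative values, not Fluke figures:

```python
# A high-impedance "ghost" voltage read with a 10M input vs. a 1k V-Check load.
V_src, R_src = 24.0, 200e3            # hypothetical coupled source and its impedance
for R_in in (10e6, 1e3):
    v = V_src * R_in / (R_src + R_in)
    print(f"{R_in:.0e} ohm input: {v:.2f} V")
# 1e+07 ohm input: 23.53 V   (looks like a live 24 V circuit)
# 1e+03 ohm input: 0.12 V    (the low-Z range shows there is no real drive behind it)
```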

Have Fun,

Brooke Clarke
http://www.PRC68.com
http://www.end2partygovernment.com/2012Issues.html

Tony wrote:

There is no suggestion in the specifications for the 34401A that the accuracy suffers by selecting 10G ohm input
resistance on the .1 to 10V range so why would they make 10M ohm the default? I can think of very few cases where
having the 10M ohm i/p resistor switched  in is better for accuracy than not.

On the other hand 10M is sufficiently low to produce significant errors on a 6 1/2 digit DVM for sources with
resistances as low as 10 ohms. Measuring 1V divided by a 100k/100k ohm divider for example causes a .5% error -
502.488mV instead of 500.000mV. That might not be a problem but I wouldn't be surprised if this catches a lot of
people out (including me) when not pausing to do the mental arithmetic to estimate the error. It's just too easy to be
seduced by all those digits into thinking you've made an accurate measurement even though you discarded those last
three digits.

And if it's not a problem then you probably don't need an expensive 6 1/2 digit meter in the first place.

It's a small point I agree but it can get irritating to have to keep going into the measurement menus to change it
when the meter is turned on when measuring high impedance sources (e.g. capacitor leakage testing).

It can't be to improve i/p protection as 10M is too high to make any significant difference to ESD and in any case
there is plenty of other over-voltage protection. OK. it provides a path for the DC amplifier's input bias current,
specified to be < 30pA at 25 degrees C, but I imagine that varies significantly from one meter to the next, and with
temperature, so not useful for nulling out that error.

So why would they do this? Could it be psychological? By limiting the drift caused by the i/p bias current to 300uV
max when the meter is left unconnected? A voltmeter with a rapidly drifting reading (several mV/s) when not connected
to anything is a bit disconcerting and would probably lead to complaints that the meter is obviously faulty to users
who are used to DVMs which read 0V when open circuit - because they have i/p resistance << 10G ohms and don't have the
resolution to show the offset voltage caused by the i/p bias current.

Personally I'd have though that the default should be the other way round - especially given that there is no
indication on the front panel or display as to which i/p resistance is currently selected.

Any thoughts? What do other meters do?

Tony H



Andreas Jahn
Thu, Apr 10, 2014 7:27 PM

Hello,

Perhaps it's just to extend the lifetime of the input range-selection
relays to at least the warranty period.
Just a guess.

With best regards

Andreas

Am 10.04.2014 19:58, schrieb Joel Setton:

I think the 10 Meg default value became a de facto standard at the
time of VTVMs (vacuum-tube volt meters), as a convenient value which
reduced input circuit loading while remaining compatible with the grid
current of the input triode. Designers of early solid-state voltmeters
merely decided not to change a good thing.
Just my $0.02 worth!

Joel Setton

On 10/04/2014 18:55, Steven J Banaska wrote:

As Tom said the 10M input impedance is used for the high voltage ranges
because it is a resistive divider (9.9M/100k) that can handle high
voltages
without much drift. Caddock THV or HVD are fairly common in precision
dmms.

Typically you will find a high impedance (10G) path that can be used for
the ranges 10V and lower, but the 10M divider can be left connected and
will work for any voltage range by changing which side you measure.
As you
mentioned there can be an accuracy sacrifice when you have a high output
impedance from your source. I'm not sure why 10M is the default other
than
it may extend the life of the relay that switches the 10M divider in
or out.

Steve



PK
Poul-Henning Kamp
Thu, Apr 10, 2014 7:29 PM

In message <5346A952.9080203@toneh.demon.co.uk>, Tony writes:

There is no suggestion in the specifications for the 34401A that the
accuracy suffers by selecting 10G ohm input resistance on the .1 to 10V
range so why would they make 10M ohm the default?

In addition to the compatibility reasons others have mentioned, it
also protects the input circuits against random electrostatic fluctuations
if nothing is attached, and it delays things just long enough that
if you attach +100V, it doesn't have to wait for a relay to kick in
before autorange can work.

--
Poul-Henning Kamp      | UNIX since Zilog Zeus 3.20
phk@FreeBSD.ORG        | TCP/IP since RFC 956
FreeBSD committer      | BSD since 4.3-tahoe
Never attribute to malice what can adequately be explained by incompetence.

JP
John Phillips
Thu, Apr 10, 2014 7:43 PM

It is more along the lines of building a voltage divider with stable
resistors. A 1G ohm voltage divider would be more expensive to build with
the same stability that you can get from 10M ohms. There are trade-offs in
all designs, and the cost/benefit ratio just is not there. If you really need
high input impedance in a DC meter, go differential. If you are looking
at AC circuits, then stray capacitance will really mess you up with a
high-impedance divider.
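
The stray-capacitance point can be made concrete: the tap of a divider sees
the parallel (Thevenin) resistance of the two legs, and any stray capacitance
there forms a low-pass filter. A rough sketch, assuming hypothetical 100:1
dividers and 10 pF of stray capacitance (both values are illustrative only):

```python
import math

def divider_bandwidth_hz(r_top: float, r_bottom: float, c_stray_f: float) -> float:
    """-3 dB corner of a resistive divider whose tap is loaded by stray
    capacitance; the tap sees r_top in parallel with r_bottom."""
    r_thev = r_top * r_bottom / (r_top + r_bottom)
    return 1.0 / (2 * math.pi * r_thev * c_stray_f)

# 9.9M/100k (10M total) divider: tap resistance ~99k, corner ~160 kHz.
print(divider_bandwidth_hz(9.9e6, 100e3, 10e-12))
# Scale it up to 990M/10M (1G total): tap resistance ~9.9M, corner ~1.6 kHz.
print(divider_bandwidth_hz(990e6, 10e6, 10e-12))
```

Scaling the divider up by 100x drops the bandwidth by the same factor, which
is one reason high-value dividers are unattractive for AC work unless they
are capacitively compensated.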

On Thu, Apr 10, 2014 at 12:27 PM, Andreas Jahn
<Andreas_-_Jahn@t-online.de> wrote:

Hello,

perhaps it's just to preserve the lifetime of the input range selection
relays to at least the warranty time.
Just a guess.

With best regards

Andreas

Am 10.04.2014 19:58, schrieb Joel Setton:

I think the 10 Meg default value became a de facto standard at the time
of VTVMs (vacuum-tube voltmeters), as a convenient value which reduced
input circuit loading while remaining compatible with the grid current of
the input triode. Designers of early solid-state voltmeters merely decided
not to change a good thing.
Just my $0.02 worth!

Joel Setton

On 10/04/2014 18:55, Steven J Banaska wrote:

As Tom said, the 10M input impedance is used for the high voltage ranges
because it is a resistive divider (9.9M/100k) that can handle high
voltages without much drift. Caddock THV or HVD are fairly common in
precision DMMs.

Typically you will find a high impedance (10G) path that can be used for
the ranges 10V and lower, but the 10M divider can be left connected and
will work for any voltage range by changing which side you measure. As
you mentioned, there can be an accuracy sacrifice when you have a high
output impedance from your source. I'm not sure why 10M is the default,
other than that it may extend the life of the relay that switches the
10M divider in or out.

Steve





--
John Phillips
