
Re: Panel Meter Question



Original poster: Terry Fritz <vardin@xxxxxxxxxxxxxxxxxxxxxxx>

Hi Adam,

Most older analog panel meters are just not very good in these days of 0.001% DVMs. The "best" were 5%, with most good ones in the 10% range. Ordinary ones could be off by 20%... I would find the most important voltage point for your application, set the meter to read that point well, and be happy ;-))
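If you can measure the voltage across the movement itself and the current through it at your chosen deflection (a Fluke on its AC mA range in series will do), the resistor is plain Ohm's law. Here is a minimal Python sketch; the 240 V target and the movement figures in it are made-up illustrations, not measurements from this thread:

# Pick a series resistor so the meter reads correctly at one chosen
# point (the single-point calibration suggested above). Assumes you
# have measured, at the target deflection, the voltage across the
# movement and the current through it; the numbers are placeholders.

def series_resistor(v_target, v_movement, i_movement):
    """Series resistance so an input of v_target volts leaves
    v_movement volts across the movement at i_movement amps."""
    return (v_target - v_movement) / i_movement

# Example: meter should read 240 V; say the movement needs 47 V
# across it and draws 4.4 mA at that deflection (made-up numbers).
r = series_resistor(v_target=240.0, v_movement=47.0, i_movement=4.4e-3)
print(f"Series resistance: {r:.0f} ohms")  # about 43.9k in this example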

Some of the very old "laboratory" type meters that claimed high accuracy also do not stand up well to modern voltage standards. Back then, nobody would know if they really were "off" ;-)) They can also be affected by which way North is and by whether they sit upright or lie flat. You might want to be sure the pointer is set right on zero, and run it back and forth across the scale many times just to work tiny dust out of the moving parts.

Modern analog meters can hit 2% full scale, 2% repeatability, and 3% "tracking" thanks to computer modeling and some fancy tricks. They also cost about $100 each!!

If the meter works smoothly, with no sticky spots or other obvious problems, consider it "just fine"!

Cheers,

        Terry


At 07:50 PM 10/1/2005, you wrote:
I have a Weston 7544 panel meter that I bought used.
It reads from 0 to 300 VAC. It came with two 22k
resistors in series with one of the leads, located in
a small compartment. One of them is blown. The other
reads 22.36k on my Fluke DMM. I went to my resistor
drawer and found one that also reads 22.36k. When I
place it in series with the good resistor, the meter
reads too low across the scale. At 100 VAC, it reads
90, and at 279 it reads 273. So I hooked it straight
up to a variac without any resistors, and measured the
input voltage and meter readings from 100 to 300. I
figured I could simply find the ratio and calculate a
better fit for a resistor. The problem is that it's
not linear. It takes 21.86 volts of input to read
100 volts, but 59.1 volts of input to read 300 volts. I
created a spreadsheet and played around a bit. If I
pick a resistor value to make 300 volts correct, I'm
10 volts too low at 100. If I correct at 100, I'm 35
volts too high at 300. Is there a better way to do
this, or should I pick an average value, and call it
close enough?

thanks
Adam
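
For what it's worth, the trade-off in the question above falls straight out of a fixed-divider model: a single series resistor delivers a fixed fraction of the input to the movement, but the no-resistor data show the movement wants 21.86/100 = 0.219 of the indicated value at the bottom of the scale and only 59.1/300 = 0.197 at the top, so no one resistor can be right everywhere. A minimal Python sketch of the spreadsheet exercise (interpolating linearly between the two measured points is my assumption, not something stated in the post):

# Model: with a series resistor, a fixed fraction ("ratio") of the
# input voltage reaches the movement. The movement voltage needed for
# a given scale reading comes from the no-resistor measurements,
# interpolated linearly: 21.86 V -> reads 100, 59.1 V -> reads 300.

V1, READ1 = 21.86, 100.0   # movement volts -> scale reading
V2, READ2 = 59.10, 300.0
SLOPE = (V2 - V1) / (READ2 - READ1)  # ~0.186 movement-volts per scale-volt

def reading(v_movement):
    """Scale reading produced by a given movement voltage."""
    return READ1 + (v_movement - V1) / SLOPE

def predicted(v_input, ratio):
    """Reading when fraction `ratio` of the input reaches the movement."""
    return reading(v_input * ratio)

for label, ratio in [("correct at 300", V2 / 300.0),
                     ("correct at 100", V1 / 100.0),
                     ("average ratio ", (V2 / 300.0 + V1 / 100.0) / 2)]:
    print(f"{label}: reads {predicted(100, ratio):5.1f} at 100 V in, "
          f"{predicted(300, ratio):5.1f} at 300 V in")

This reproduces the reported trade-off: about 10-12 volts low at 100 when set for 300, and about 35 high at 300 when set for 100, with an averaged ratio landing in between at both ends. Which resistor value a given ratio corresponds to depends on the movement's impedance, which this data alone does not pin down, so in practice trimming toward the chosen ratio with a pot and the Fluke is the safer route.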