I have a Weston 7544 panel meter that I bought used.
It reads from 0 to 300 VAC. It came with two 22k
resistors in series with one of the leads, located in
a small compartment. One of them is blown. The other
reads 22.36k on my Fluke DMM. I went to my resistor
drawer and found one that also reads 22.36k. When I
place it in series with the good resistor, the meter
reads too low across the scale. At 100 VAC it reads
90, and at 279 VAC it reads 273. So I hooked it straight
up to a variac without any resistors and measured the
input voltage needed for meter readings from 100 to 300. I
figured I could simply find the ratio and calculate a
better fit for a resistor. The problem is that the meter
itself isn't linear: it takes 21.86 volts applied directly
to read 100, but 59.1 volts to read 300 (a ratio of about
4.6 at the bottom of the scale versus about 5.1 at the top,
so no single multiplier ratio fits both ends). I created a
spreadsheet and played around a bit. If I
pick a resistor value to make 300 volts correct, I'm
10 volts too low at 100. If I correct at 100, I'm 35
volts too high at 300. Is there a better way to do
this, or should I pick an average value, and call it
close enough?
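
In case it helps, here is roughly what my spreadsheet is
doing, written out as a Python sketch. It assumes the
movement looks like a constant resistive load; only the
21.86 V / 100 and 59.1 V / 300 points are real measurements,
and the 11k movement impedance is just a guess you'd replace
with a measured value.

import numpy as np

# Bare-meter calibration: volts at the movement terminals -> value
# indicated on the 0-300 scale.  Only these two points are real
# measurements; add the rest of the variac readings for a better fit
# (np.interp holds the end values outside the table, so the answer
# is only as good as the table).
v_meter   = np.array([21.86, 59.1])
indicated = np.array([100.0, 300.0])

Z_M = 11e3   # assumed movement input impedance in ohms -- measure yours

def reading(v_line, r_series):
    """Indicated value for a given line voltage with r_series in front
    of the movement, interpolating the calibration table."""
    v_at_meter = v_line * Z_M / (Z_M + r_series)
    return np.interp(v_at_meter, v_meter, indicated)

# Sweep candidate multiplier resistors and keep the one with the
# smallest worst-case error between 100 and 300 volts.
v_lines = np.linspace(100.0, 300.0, 201)
best_r, best_err = None, float("inf")
for r in np.linspace(30e3, 60e3, 601):
    err = np.max(np.abs(reading(v_lines, r) - v_lines))
    if err < best_err:
        best_r, best_err = r, err

print(f"best single resistor ~{best_r/1e3:.1f}k, "
      f"worst-case error {best_err:.1f} V")

With a scale that nonlinear, no single resistor will zero the
error everywhere; the sweep just tells me how close "close
enough" can get.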
thanks
Adam