Transformer behavior
Original poster: "S & J Young by way of Terry Fritz <twftesla-at-qwest-dot-net>" <youngs-at-konnections-dot-net>
List,
I need some help understanding power transformer behavior. I just fired up
a DC power supply for a twin TC. The supply has two MOTs running off the
120 V mains via a variac, each MOT feeding a half-wave voltage doubler, for
a total of about 11 kV of filtered DC.
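For reference, a quick no-load sanity check of that figure (a minimal
sketch; the 2 kV secondary voltage and the series stacking of the two
doubler outputs are assumptions on my part, not measurements):

import math

# Rough check of the quoted ~11 kV no-load figure (assumed values only)
V_SEC_RMS  = 2000.0   # assumed MOT secondary voltage, volts RMS
N_DOUBLERS = 2        # one half-wave doubler per MOT, outputs assumed in series

# An ideal half-wave doubler charges its output to twice the secondary peak.
v_doubler_peak = 2 * math.sqrt(2) * V_SEC_RMS   # ~5.7 kV per doubler, no load
v_supply_dc    = N_DOUBLERS * v_doubler_peak    # ~11.3 kV with outputs stacked

print(f"per-doubler open-circuit DC: {v_doubler_peak/1000:.1f} kV")
print(f"stacked supply DC:           {v_supply_dc/1000:.1f} kV")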
My puzzle is how to separate the input magnetizing current from the input
current actually supplying power to the TC. My first instinct was to measure
the total VA with the TC running, then measure the VA with no load on the
power supply; the difference would be the power going to the TC. But that
turns out not to be so.
Here are some measurements. The numbers are AC volts into the MOTs, no-load
magnetizing (mag) VA, total VA while running the TC with the RSG at a break
rate of about 240, and the difference between the two VA readings:
in=  90 VAC, mag=  90 VA, others not measured
in= 100 VAC, mag= 200 VA, total= 550 VA, diff= 350 VA
in= 110 VAC, mag= 572 VA, total= 814 VA, diff= 242 VA
in= 115 VAC, mag= 858 VA, total= 986 VA, diff= 116 VA
No PFC caps were used. The MOT cores seem to start saturating around 100
VAC with no load.
As the voltage increases, the VA delivered to the TC should go up, not down.
But the differences above show the opposite trend, so obviously my
understanding of transformers is lacking.
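Part of it may be that the two currents don't simply add. Here is a minimal
sketch with made-up numbers (treating the magnetizing current as purely
reactive, which a saturating core certainly is not) showing that the VA
readings don't subtract arithmetically either:

import math

# The magnetizing current is mostly reactive (roughly 90 degrees out of
# phase with the voltage), so the currents add as vectors.  The load
# current below is invented for illustration; the 5.2 A is just my
# 572 VA / 110 V no-load reading treated as purely reactive.
V_IN   = 110.0    # volts RMS at the MOT primaries
I_MAG  = 5.2      # assumed purely reactive magnetizing current, A RMS
I_LOAD = 6.0      # assumed in-phase (real) load current, A RMS

va_no_load = V_IN * I_MAG                # what the no-load VA reading sees
i_total    = math.hypot(I_LOAD, I_MAG)   # vector sum of the two currents
va_loaded  = V_IN * i_total              # what the loaded VA reading sees
p_load     = V_IN * I_LOAD               # actual real power to the load

print(f"no-load VA:      {va_no_load:6.0f}")               # 572
print(f"loaded VA:       {va_loaded:6.0f}")                # ~873
print(f"VA difference:   {va_loaded - va_no_load:6.0f}")   # ~301
print(f"real load power: {p_load:6.0f} W")                 # 660

# The VA difference comes out well below the real load power, and that is
# before the saturating core's magnetizing current itself changes under load.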
It appears that as the MOTs are loaded, they come out of saturation and
become more efficient. So it seems the only way I will know the power going
into the TC is to measure the (DC) watts being drawn out of the power supply.
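If I do end up measuring on the DC side, I assume the power into the TC is
just the average of v(t)*i(t) over a few RSG breaks. A sketch, assuming I
can log simultaneously sampled supply voltage and current (the sample values
below are made up):

def dc_watts(volts, amps):
    """Average power from simultaneously sampled DC-bus voltage and current."""
    return sum(v * a for v, a in zip(volts, amps)) / len(volts)

# Made-up samples: an ~11 kV bus sagging slightly while ~0.1 A average flows.
volts = [11000, 10800, 10900, 11000]
amps  = [0.08, 0.12, 0.10, 0.09]
print(f"average DC power: {dc_watts(volts, amps):.0f} W")   # ~1064 W here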
Can someone please shed some light on what goes on with a semi-saturated
transformer as the load goes from zero to the rated transformer load?
Thanks,
--Steve