
Re: Transformer Ballasting





---------- Forwarded message ----------
Date: Thu, 16 Oct 1997 15:33:25 -0700
From: Jim Lux <jimlux-at-earthlink-dot-net>
To: Tesla List <tesla-at-pupman-dot-com>
Subject: Re: Transformer Ballasting 

> I have tried searching the list archives for anything on the subject of
> current limiting power transformers, but I did not find exactly what I
> was looking for. The problem seems to be limiting the short out current
> without reducing the output voltage of the transformer. Obviously any
> resistance in circuit with the xfrm primary is going to develop a volt
> drop across it, and so the primary voltage will be lower than the rated
> output. The use of an inductive ballast, is going to have a similar
> effect. I have been experimenting with various arrangements, but have
> not yet achieved a satisfactory solution.
> 

An inductive ballast adds impedance in series, but it draws only reactive power, not active 
power, so it doesn't consume any "wall plug power" the way a series resistor does.  Of 
course, there are still I^2*R losses in the winding resistance which you can't get away from.

It is important to note, though, that for a given load impedance, the ONLY way to reduce 
the current (i.e. limit it) is to reduce the voltage. This is Ohm's law in action. It 
happens that for certain interesting cases (arcs and glow discharges), the resistance 
isn't constant but highly nonlinear.  For instance, in a neon bulb or tube, the 
voltage is essentially constant, so the total power consumed by the load (i.e. the 
bulb), and hence the brightness, is proportional to the current.  The ideal source for 
such a load is therefore a constant current.
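A minimal Python sketch of that constant-voltage-load behavior. The 90 V tube drop is an assumed illustrative figure, not from the post; real running voltages depend on tube length and gas fill.

```python
# Neon tube idealized as a constant-voltage load: power (and brightness)
# tracks the current directly.
V_TUBE = 90.0  # volts -- ASSUMED illustrative running voltage

for i_ma in (5, 10, 20):
    p_watts = V_TUBE * i_ma / 1000.0
    print(f"{i_ma:2d} mA -> {p_watts:.2f} W")
```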

The classic way to get a reasonably constant current is to start with a very high voltage 
and use a high value resistor in series with the load.  As the load resistance changes, 
it doesn't change the overall circuit resistance much, so the current tends to remain 
constant.  For instance, if I wanted to have 1 Amp through a load that might vary from 
zero to 10 ohms, and I wanted it controlled within 1%, I could start with a 1000 Volt 
power supply and a 1000 Ohm resistor.  With the load at zero ohms, you'll get an amp. 
With the load at 10 ohms, you'll get 990 mA, or within the 1% tolerance spec.
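The arithmetic in that paragraph can be checked with a short Python sketch, using the same 1000 V supply and 1000 Ohm ballast:

```python
# Worked numbers for the resistive-ballast example: a 1000 Ohm ballast
# dominates a 0-10 Ohm load, so the current barely moves.
def ballast_current(v_supply, r_ballast, r_load):
    """Series circuit: I = V / (R_ballast + R_load)."""
    return v_supply / (r_ballast + r_load)

i_short = ballast_current(1000.0, 1000.0, 0.0)    # load at 0 ohms
i_full = ballast_current(1000.0, 1000.0, 10.0)    # load at 10 ohms
p_ballast = i_short**2 * 1000.0                   # power burned in the ballast

print(f"{i_short:.3f} A, {i_full*1000:.0f} mA, {p_ballast:.0f} W in ballast")
```

With the load at zero ohms this gives 1.000 A; at 10 ohms, about 990 mA, i.e. within the 1% regulation figure, at the cost of roughly 1000 W in the ballast.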

Of course, for a load power dissipation of 0 to 10 Watts, I am going to dissipate about 
1000 Watts in that 1000 Ohm resistor.  This is pretty inefficient, but it does work, and 
in some applications, it works great. A typical application of the resistive ballast is 
in small gas lasers (i.e. your 1 mW HeNe).

The other thing to do is get the same 1000 Volt drop by using a big inductor.  Ideal 
inductors don't dissipate any power, but they do have a voltage drop from their reactance.  
Now I use the same 1000 Volt power supply, put a 1000 Ohm REACTOR in series, and my load 
still has an amp flowing through it.  1000 Ohms at 60 Hz is about 2.7 Henrys, which is 
a fairly large inductor, but not unreasonable.
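The 2.7 Henry figure follows directly from X_L = 2*pi*f*L; a one-line Python check:

```python
import math

# Inductance needed to present a given reactance at line frequency:
# L = X_L / (2*pi*f)
def inductance_for_reactance(x_ohms, f_hz=60.0):
    return x_ohms / (2.0 * math.pi * f_hz)

print(f"{inductance_for_reactance(1000.0):.2f} H")  # about 2.65 H
```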

Another approach is to put a variable resistor in series, and adjust it automatically to 
maintain the current as the load impedance varies.  For the 0 to 10 Ohm scenario above, 
the variable resistor would need to vary from 10 Ohms down to 0 Ohms, and the power 
supply would only need to be 10 Volts. 10 Volts at 1 Amp is only 10 Watts, and the 
"resistor" would dissipate anywhere from 0 to 10 Watts, which is a whole lot better than 
1000 Watts in the resistive ballast approach. In reality, the variable resistor would be 
a transistor, which is how a constant current (or current limiting) DC power supply works.
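A toy Python sketch of that regulation scheme, computing the series ("pass element") resistance needed to hold 1 A from the 10 Volt supply as the load swings over its range:

```python
# Active series regulation: the pass element presents R = V/I - R_load,
# so the current stays at the setpoint and the pass element dissipates
# at most 10 W over the whole 0-10 ohm load range.
V_SUPPLY = 10.0  # volts
I_SET = 1.0      # amps

def series_resistance_for(r_load, v_supply=V_SUPPLY, i_set=I_SET):
    """Resistance the pass transistor must present for a given load."""
    return max(v_supply / i_set - r_load, 0.0)

for r_load in (0.0, 5.0, 10.0):
    r_pass = series_resistance_for(r_load)
    i = V_SUPPLY / (r_pass + r_load)
    p_pass = i**2 * r_pass
    print(f"load {r_load:4.1f} ohm -> pass {r_pass:4.1f} ohm, "
          f"{i:.2f} A, {p_pass:4.1f} W in pass element")
```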

Finally, you could hook a nearly lossless variable resistor in series.  This is a PWM 
regulator, which is essentially a switch that you turn on and off very fast, changing the 
duty cycle to adapt to the load impedance.  Switches typically have very low loss when 
either fully on or fully off; it is the in-between, half-on state that dissipates the 
power.  Of course, you now have a ripple problem, but that is what filters are for.
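A rough Python sketch of the duty-cycle idea, assuming an ideal filtered PWM stage where the average output voltage is duty * supply voltage (the 10 V / 1 A numbers are carried over from the variable-resistor scenario above):

```python
# Idealized PWM regulation: average voltage = duty * V_SUPPLY, so the
# duty cycle needed to hold the setpoint current into a varying load is
# duty = I * R_load / V_SUPPLY.  Switching losses and ripple are ignored.
V_SUPPLY = 10.0  # volts
I_SET = 1.0      # amps

def duty_for_load(r_load, v_supply=V_SUPPLY, i_set=I_SET):
    return i_set * r_load / v_supply

for r_load in (2.0, 5.0, 10.0):
    print(f"{r_load:4.1f} ohm load -> duty {duty_for_load(r_load):.0%}")
```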