Why Is A Transformer Rated In VA, Not Watts?


1 Answer

Anonymous answered
Watts measure real power. For DC, power is simply voltage times current (P = V × I); for AC, where voltage and current swing from zero to a maximum and back, the quoted values are usually RMS (Root Mean Square) values. RMS is found by squaring the instantaneous values over a cycle, averaging them, and taking the square root; the squaring is needed because the positive and negative half-cycles would otherwise average to zero, and the square of -0.5 equals the square of +0.5. For a sine wave, the RMS value works out to 0.707 times the peak value. For AC, real power also depends on the phase angle between voltage and current: P = Vrms × Irms × cos(φ), where cos(φ) is the power factor of the load. Here is the problem for a transformer: the manufacturer has no idea what power factor the eventual load will have. The transformer's heating does not care about power factor. Its copper losses depend on the current flowing, and its core losses depend on the applied voltage, regardless of the phase between them, so a highly reactive load can draw full rated current while delivering very few watts of real power. If the transformer were rated in watts, such a load could stay within the wattage rating and still burn the transformer out. The rating is therefore given as RMS voltage times RMS current, called volt-amperes (VA), which limits the heating no matter what the load's power factor is.
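A small sketch may make the distinction concrete (not from the answer above; the 230 V / 10 A supply and the 60-degree phase angle are assumed example values). It shows that a reactive load can draw the transformer's full rated current while delivering only half the real power:

```python
import math

def rms_from_peak(peak):
    """RMS of a sine wave is peak / sqrt(2), i.e. about 0.707 * peak."""
    return peak / math.sqrt(2)

def apparent_power(v_rms, i_rms):
    """Apparent power in volt-amperes: S = Vrms * Irms."""
    return v_rms * i_rms

def real_power(v_rms, i_rms, phase_deg):
    """Real power in watts: P = Vrms * Irms * cos(phi)."""
    return v_rms * i_rms * math.cos(math.radians(phase_deg))

# Assumed example: 230 V RMS supply, 10 A RMS load current.
v_rms, i_rms = 230.0, 10.0

s = apparent_power(v_rms, i_rms)    # what actually heats the transformer
p = real_power(v_rms, i_rms, 60.0)  # power factor cos(60 deg) = 0.5

print(f"Apparent power: {s:.0f} VA")  # → Apparent power: 2300 VA
print(f"Real power:     {p:.0f} W")   # → Real power:     1150 W
# The transformer carries the full 10 A either way,
# so its rating must be the 2300 VA figure, not the 1150 W one.
```

Either way the windings carry the same 10 A and the core sees the same 230 V, which is why the VA figure, not the wattage, sets the safe limit.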
