For the same supply voltage, the frequency in the US is 60 Hz, so all the equipment there runs at (60-50)/50 × 100 = 20% higher speed compared to India (synchronous speed N_s = 120f/P). Now, since ω = 2πf, a higher frequency means the generated emf would be more; i.e., for the same supply but a different (higher) frequency, the generated e.m.f. would be more.
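As a quick sanity check of the 20% speed figure and the EMF claim — a minimal sketch, assuming a hypothetical 4-pole machine and arbitrary example values (100 turns, 0.01 Wb) for the transformer EMF equation E = 4.44·f·N·Φ_max:

```python
# Illustrative check of the claims above; the 4-pole machine, turns count,
# and flux value are assumed example numbers, not data from the thread.

def synchronous_speed_rpm(freq_hz, poles):
    """Synchronous speed N_s = 120 * f / P, in rpm."""
    return 120 * freq_hz / poles

def transformer_emf(freq_hz, turns, flux_max_wb):
    """EMF equation of a transformer: E = 4.44 * f * N * phi_max."""
    return 4.44 * freq_hz * turns * flux_max_wb

n50 = synchronous_speed_rpm(50, 4)   # 1500 rpm
n60 = synchronous_speed_rpm(60, 4)   # 1800 rpm
print(f"speed increase: {(n60 - n50) / n50 * 100:.0f}%")  # 20%

e50 = transformer_emf(50, 100, 0.01)
e60 = transformer_emf(60, 100, 0.01)
print(f"EMF ratio 60 Hz / 50 Hz: {e60 / e50:.2f}")  # 1.20
```

Both ratios come out to exactly 60/50 = 1.2, since speed and EMF each scale linearly with frequency when everything else is held fixed.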
If the frequency is high, then the losses are more: as we know, the hysteresis loss is P_h = k_h · B_max^1.6 · f, and the eddy-current loss is P_e = k_e · B_max^2 · f^2 · t^2 (t being the lamination thickness).
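To see how the two loss terms above scale between 50 Hz and 60 Hz at the same peak flux density — a sketch with made-up illustrative coefficients (k_h, k_e, B, and t below are not real material data):

```python
# Core-loss scaling with frequency at constant peak flux density B.
# All coefficients are assumed illustrative values.

def hysteresis_loss(f, b_max, k_h=0.02):
    """Steinmetz form: P_h = k_h * B^1.6 * f (grows linearly with f)."""
    return k_h * b_max**1.6 * f

def eddy_loss(f, b_max, t=0.35e-3, k_e=1.0e4):
    """P_e = k_e * B^2 * f^2 * t^2 (grows with the square of f)."""
    return k_e * b_max**2 * f**2 * t**2

for f in (50, 60):
    print(f"{f} Hz: P_h = {hysteresis_loss(f, 1.5):.3f}, "
          f"P_e = {eddy_loss(f, 1.5):.4f}")
```

Going from 50 Hz to 60 Hz at the same B, hysteresis loss rises by the factor 60/50 = 1.2, while eddy-current loss rises by (60/50)^2 = 1.44.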
The transformers produced in India (designed for 50 Hz) can be used there too, but transformers designed there (for 60 Hz) cannot be used here: since Φ_max ∝ V/f, running a 60 Hz transformer on a 50 Hz supply at the same voltage increases the core flux and can drive the core into saturation.
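The saturation argument follows directly from the transformer EMF equation, Φ_max = V / (4.44·f·N) — a minimal check (230 V and 100 turns are arbitrary example values):

```python
# Why a 60 Hz transformer saturates on a 50 Hz supply: at the same voltage,
# the peak core flux scales with 1/f. Voltage and turns are assumed examples.

def flux_max(v_rms, f, turns):
    """Peak core flux from the EMF equation: phi = V / (4.44 * f * N)."""
    return v_rms / (4.44 * f * turns)

phi_60_design = flux_max(230, 60, 100)  # flux the core was sized for
phi_on_50hz   = flux_max(230, 50, 100)  # flux it actually sees at 50 Hz
print(f"flux rises by {(phi_on_50hz / phi_60_design - 1) * 100:.0f}%")  # 20%
```

A 20% flux increase is enough to push a tightly designed core past the knee of its B-H curve, which is why the transfer only works safely in one direction (50 Hz design → 60 Hz supply).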
I think there is no technical reason behind the generation of power at 50 Hz in India and 60 Hz in America; it is just each country's standardization. America first designed its equipment for 60 Hz, and after that India and other countries designed their equipment for 50 Hz.
Well, for the safety of consumers in the US, they decided to keep the voltage level at 110 V a.c. at the consumer end; and to avoid distribution losses and overheating of the conductors, they kept the frequency of the AC supply at 60 Hz to reduce those losses.
What is the method to calculate the voltage drop over a given number of meters? Example: with 250 V input at one end, if the cable length is 50 meters, what would the drop be? And after 100 meters, how much? I need the formula.
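The drop cannot be computed from voltage and length alone — it depends on the load current and the conductor cross-section. The usual method is V_drop = I·R, with R = ρ·L/A and the length doubled for the return conductor in a single-phase run. A sketch assuming a 10 A load on 2.5 mm² copper (these figures are illustrative, since the question does not give them):

```python
# Voltage drop over a copper cable run (single-phase, two conductors):
#   V_drop = 2 * I * rho * L / A
# Load current and conductor size below are assumed example values.

RHO_CU = 1.72e-8  # resistivity of copper, ohm*m (approx., at 20 C)

def voltage_drop(current_a, length_m, area_mm2, rho=RHO_CU):
    """Drop across both conductors of a single-phase run, in volts."""
    area_m2 = area_mm2 * 1e-6
    return 2 * current_a * rho * length_m / area_m2

for length in (50, 100):
    vd = voltage_drop(current_a=10, length_m=length, area_mm2=2.5)
    print(f"{length} m: drop = {vd:.2f} V")  # 50 m: 6.88 V, 100 m: 13.76 V
```

Since R is proportional to L, the drop at 100 m is exactly double the drop at 50 m; percentage drop is then V_drop / 250 × 100.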