You may wonder why there is a 120-volt standard in North America and a 230-volt standard in Europe. The precise reasons are unknown; however, there are a few plausible theories.
The first and most commonly cited explanation traces back to Thomas Edison. Although Edison was not the first to invent the light bulb, he is remembered as its inventor because he produced not merely a working bulb but a superior one.
He also created an entire system to power his light bulb. In some ways, the bulb itself was not the most essential item: the generator, main feeder, distribution lines, home wiring, and light bulbs were all parts of a complete package.
He worked tirelessly to find the ideal filament material, one that would burn for countless hours. What voltage was optimal for his light bulb? The magic number was 110 volts. Edison's electricity installations in the USA were entirely 110-volt direct current (DC). (It was the DC component that proved to be Edison's undoing.) At such low voltages, direct current could not travel long distances efficiently, and there was no practical way at the time to step DC up to the higher voltages needed for transmission. The "war of the currents" was won by George Westinghouse and others who championed alternating current (AC).
110 > 115 > 120
The electric power network in the United States began at 110 volts. By the 1930s, the voltage had risen to 115 volts. By the 1950s, about 75% of incandescent bulbs sold in the United States were rated at 120 volts. In 1984, 120 volts became the official standard voltage in the United States. US utilities are now required to deliver power within 5% of the nominal 120 volts, which means the actual voltage should stay between 114 and 126 volts.
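As a quick sanity check on that arithmetic, here is a minimal sketch in Python (the in_spec helper is purely illustrative, not part of any standard):

```python
# Nominal US service voltage and the +/-5% tolerance described above.
NOMINAL_VOLTS = 120
TOLERANCE = 0.05

low = NOMINAL_VOLTS * (1 - TOLERANCE)   # 114.0 volts
high = NOMINAL_VOLTS * (1 + TOLERANCE)  # 126.0 volts

def in_spec(measured_volts: float) -> bool:
    """Return True if a measured voltage falls inside the allowed band."""
    return low <= measured_volts <= high

print(in_spec(117.5))  # True
print(in_spec(110.0))  # False: below the 114-volt floor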
That oddity dates back to the era when two rival enterprises from different nations adopted two distinct technical standards, as we saw with Japan's 100-volt 50 Hz/60 Hz system above. That is why the USA and other North American countries still use 120 volts while European countries have used 220 (now 230) volts since the 1890s: replacing outdated equipment, bulbs, cabling, generators, and machines would have been expensive and complicated. In the 1950s, the United States, which began with 110-volt electricity, considered converting to more efficient 220-volt residential power.
After studying the American experience, Europe settled on its 220-volt (now 230-volt) network before any major infrastructure upgrades became necessary. However, several areas of Europe initially used 120 volts.
The Edison Electric Light Company's steam-powered station at Holborn Viaduct in London, which commenced operation in January 1882, was the world's first large-scale central power plant. Holborn generated direct current at 110 volts. The London station served as a model for the considerably bigger Pearl Street Station in Manhattan, the world's first permanent commercial central power station. But in the war of the currents, Edison's direct current (DC) was losing to Nikola Tesla's alternating current (AC).
AC electricity was first introduced in Europe by the Germans. The International Electrotechnical Exhibition held in Frankfurt, Germany, in 1891 proved to be a watershed moment in the evolution of electric power.
The primary reason Europe chose 220 volts (now 230) was cost. Contrary to popular belief, higher voltages allow the use of thinner wire, which meant less copper in the early days of power lines. By using 220 volts instead of 110 volts, power providers could save money on wiring.
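To see where the savings come from, here is a rough sketch of the arithmetic in Python, assuming a purely resistive load and a hypothetical 1,100-watt household load (real distribution systems involve many more factors):

```python
# Rough comparison of the current drawn by the same load at 110 V vs 220 V.
LOAD_WATTS = 1100  # hypothetical household load, for illustration only

for volts in (110, 220):
    amps = LOAD_WATTS / volts   # P = V * I  =>  I = P / V
    relative_loss = amps ** 2   # resistive loss scales with I^2 * R
    print(f"{volts} V -> {amps:.1f} A (relative I^2 loss: {relative_loss:.0f})")

# Output:
# 110 V -> 10.0 A (relative I^2 loss: 100)
# 220 V -> 5.0 A (relative I^2 loss: 25)
```

Doubling the voltage halves the current for the same load, and since resistive loss scales with the square of the current, the same conductor dissipates only a quarter of the loss. Equivalently, a thinner and cheaper conductor can carry the same power at the same loss, which is where the copper savings come from.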