My $0.02
Power = V * I
Ohm's Law: V = R * I
Power (resistive dissipation) = I^2 * R
Higher voltages are more efficient because, for the same delivered power, they reduce the current and therefore the I^2 * R loss to heating in the transmission lines. The common-sense reason we use lower voltages at the point of use is that it's easier (and safer) to design for them.
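Here's a quick sketch of that trade-off in Python. The line resistance, load power, and voltages are made-up illustrative numbers, not real grid figures:

```python
# Why high-voltage transmission wastes less energy:
# for a fixed delivered power, P = V * I means higher V -> lower I,
# and the resistive loss goes as I^2 * R.

def line_loss(power_delivered_w, line_voltage_v, line_resistance_ohm):
    """Resistive loss in the line: I = P / V, loss = I^2 * R."""
    current = power_delivered_w / line_voltage_v
    return current ** 2 * line_resistance_ohm

R_LINE = 10.0          # ohms of line resistance (hypothetical)
P_LOAD = 1_000_000.0   # 1 MW delivered to the load (hypothetical)

low = line_loss(P_LOAD, 10_000.0, R_LINE)    # 10 kV line
high = line_loss(P_LOAD, 765_000.0, R_LINE)  # 765 kV line

print(f"Loss at  10 kV: {low:,.0f} W")   # 100,000 W lost as heat
print(f"Loss at 765 kV: {high:,.1f} W")  # about 17 W lost as heat
```

Same power delivered, same wire, and the higher-voltage line loses thousands of times less energy to heating.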
FYI: 100 milliamps across your chest is enough to cause the heart to fibrillate (which basically means it quivers chaotically and stops pumping blood... very bad).
100 milliamps = 0.1 amps!
A COMMON LIGHT BULB DRAWS ROUGHLY 1 AMP!
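You can sanity-check that figure with I = P / V. Assuming a 120 W incandescent bulb on a 120 V US circuit (illustrative values, not from the post above):

```python
# Sanity check: current drawn by an ordinary bulb, via I = P / V.
# Assumes a 120 W incandescent bulb on a 120 V US circuit (hypothetical numbers).

def current_draw(power_w, voltage_v):
    return power_w / voltage_v

bulb_amps = current_draw(120.0, 120.0)  # -> 1.0 A, the "common light bulb"
fibrillation_amps = 0.1                 # the 100 mA danger threshold

print(f"Bulb current: {bulb_amps} A")
print(f"Fibrillation threshold is {fibrillation_amps / bulb_amps:.0%} of that")
```

So the current that can stop a heart is only a tenth of what an everyday bulb draws.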
That's all I've got.
As for the original question of an international standard: the United States can't even maintain a reliable transmission network across the country. The crisis of deregulation and independent power generation causes a lot of problems. Added to this is the fact that the transmission grid is becoming antiquated as we speak. This is a billion-dollar industry, and the industry fears it could be nationalized. (Which I think might be a good thing... but that's just me.)
To give you an idea of how screwed up things are: to transfer power from west of the Mississippi eastward, the AC power, say at 765 kV, must first be converted to DC. This is because the generators on the two sides are not synchronous. The AC waves would cancel at certain points along the grid.
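To see why non-synchronous AC is a problem, here's a toy illustration of the worst case: two equal 60 Hz sources exactly 180 degrees out of phase sum to zero everywhere. (Real interconnections drift in frequency and phase; the amplitudes and frequency here are just illustrative.)

```python
import math

# Two AC sources of equal amplitude and frequency, 180 degrees out of phase,
# cancel completely where their waves combine. This is the extreme case of
# what happens when generators on a grid are not synchronized.

def wave(amplitude, freq_hz, phase_rad, t):
    return amplitude * math.sin(2 * math.pi * freq_hz * t + phase_rad)

samples = [t / 1000.0 for t in range(17)]  # roughly one 60 Hz cycle
combined = [wave(1.0, 60.0, 0.0, t) + wave(1.0, 60.0, math.pi, t)
            for t in samples]

print(max(abs(v) for v in combined))  # sum stays at ~0 the whole cycle
```

Converting to DC for the tie, then back to AC on the far side, sidesteps the phase problem entirely.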
And finally... here is a link for the power industry. It's a preliminary report on what happened last August in Ohio and the Midwest.
https://reports.energy.gov/BlackoutReport-1.pdf