MadMan
I think the variations in voltage came down to how the electrical infrastructure in the USA was built out at the time, and how the power stations supplied it. Rural areas didn't get as solid a supply as cities, since the population density was lower and keeping the voltage up just wasn't as much of a priority.
I found a random site that kind of explains it a little bit:
https://www.eng-tips.com/viewthread.cfm?qid=335333
"Edison originally planned everything for 100 V DC -generators,light bulbs.Then after putting up his first line in Manhattan, he found the customers at the end of line complaining of poor illumination, obviously due to voltage drop. As bulbs were already made, he could not derate them for 90 V.The easiest thing he could do was to increase the generator voltage to 110V increasing the field excitation. Thus 110 V became standard,leading to 220V ,440V etc and later to 11,22,66,110,220 kV."
You can see this on a smaller scale inside older homes that still have their original wiring: go around the outlets with a voltmeter while the circuit is carrying a load, and the reading will drop off the farther you get from the panel, because of the length of the run, the age of the wire, and the gauge it was pulled in. The drop is just Ohm's law, the wire's resistance times the current flowing through it; a rough sketch of the math is below.
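If anyone wants to put rough numbers on it, here's a quick Python sketch of that Ohm's-law drop for a single branch circuit. The resistances are approximate values per 1000 ft of solid copper from a standard AWG table; the load current and run lengths are made-up examples, not measurements from my house.

```python
# Approximate round-trip voltage drop on a single-phase branch circuit.
# Resistances are nominal ohms per 1000 ft of solid copper at room temp
# (approximate values from a standard AWG table).
OHMS_PER_1000FT = {
    14: 2.525,   # typical 15 A circuit
    12: 1.588,   # typical 20 A circuit
    10: 0.999,
}

def voltage_drop(awg: int, one_way_feet: float, load_amps: float) -> float:
    """V = I * R, using the round-trip (out and back) length of the run."""
    resistance = OHMS_PER_1000FT[awg] * (2 * one_way_feet) / 1000.0
    return load_amps * resistance

if __name__ == "__main__":
    # Hypothetical example: a 12 A vacuum cleaner on a 14 AWG circuit.
    for feet in (5, 25, 50, 75):
        drop = voltage_drop(awg=14, one_way_feet=feet, load_amps=12.0)
        print(f"{feet:3d} ft run: {drop:4.1f} V drop -> ~{120 - drop:5.1f} V at the outlet")
```

Even a modest run adds up: roughly 4-5V lost over a 75 ft run of 14 AWG at 12A, before you account for corroded connections or undersized old wire.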
In my house, the outlet my PC is plugged into reads 118V, while the outlet mounted right at the panel in the basement, on its own circuit with maybe a foot of wire, reads 127V. That's a 9V difference between opposite ends of the house, and this is in a half-modernized 1950 house that got sporadic wiring additions in 1979 along with a circuit breaker panel from the same era. Imagine how much bigger the difference would be in a 1920s craftsman with two floors and a basement, thin knob-and-tube wiring, and screw-in fuses. If your house didn't have the juice to run the vacuum, you'd be stuck using the old broom and dustpan. lol