It is absolutely stupid to talk about this as Edison's revenge. If Tesla had had the modern high-power transistors needed to get high-voltage DC out of the AC produced by a spinning turbine, he would have been all for high-voltage DC too. Tesla understood that high voltage was needed for efficient long-range transmission. He also understood that transformers were the only remotely efficient way to climb up to and down from these high voltages. And transformers only work with AC. So he designed an AC system, and even designed some better transformers for it.
If there had been anything like a high-power transistor back then, he would have used it. High-power transistors robust enough to handle the grid were designed only recently, over 100 years after the Tesla/Edison AC/DC argument.
It's just a fun title, you are overthinking it
Also, if anything would have been Edison's revenge it would have been HVDC, where they're sending power long distances with DC. (But as you said, even there it wouldn't make a ton of sense, since they were arguing in a different era).
Agreed, for the IEEE to go down this route is more than a little weird.
Tesla also designed the modern induction motor, which needs AC. Though these days we often run them from a phase generator, which has a DC step.
DC power has been an option for datacenter equipment since I was a young lad racking and stacking hardware. Cisco, Dell, HPE, IBM, and countless others all had DC supply options. Same with PDUs. What’s old is new again.
See e.g. https://www.dell.com/support/kbdoc/en-us/000221234/wiring-in...
48VDC was common in phone exchanges. They filled the basement with lead-acid batteries and could run without the grid for a couple of weeks. In turn, the phone network was 99.999% reliable for decades.
Not to be _that_ guy, but it was technically -48V DC.
Honestly, that was pretty surprising to me when I had to work with some telco equipment a couple of decades ago. To this day, I don't think I've encountered anything else that requires negative voltage relative to ground.
Yes, and that tiny little difference can cost you a lot of expensive gear if you run it off the battery and plug in a serial port or something like that. You'll also learn first hand what arc welding looks like without welding glass.
Is that something other than a labelling convention? Is ground actually connected to an earth stake?
Cathodic protection against corrosion was the goal of using -48V, in the telcos' case.
Positive ground used to be standard in all cars. When they went from 6 volts to 12, the disadvantages became apparent fast, and so everyone went negative ground (mid-1950s). I am not clear on why positive ground was bad (maybe corrosion?)
Check out older English cars.
Yeah I always heard that the phone lines carried their own power, and in Florida the phones did keep working when the power went out, but I never knew why.
So the grid was always charging up the lead acid batteries, and the phone lines were always draining them? Or was there some kind of power switching going on where when the grid was available the batteries would just get "topped off" occasionally and were only drained when the power went out?
Grid charging the batteries, phones draining them, as I understand it. Of course there were switches all over the US, so I can't make blanket claims, but from what I hear that was normal.
Interesting, so this is why the phone line still worked when power was out across the whole town.
I still have a bunch of 48VDC comms gear in my power plant.
I've been hearing this line for over a decade, now. "Immersion cooling will make data centers scale!" "Converting to DC at the perimeter increases density!"
Yes, of course both of those things are true, and yes, some data centers do engage in those processes for their unique advantages. The issue is that aside from specialty kit designed for that use (like the AWS Outposts with their DC conversion), the rank-and-file kit is still predominantly AC-driven, and that doesn't seem to be changing just yet.
While I'd love to see more DC-flavored kit accessible to the mainstream, it's a chicken-and-egg problem that neither the power vendors (APC, Eaton, etc) or the kit makers (Dell, Cisco, HP, Supermicro, etc) seem to want to take the plunge on first. Until then, this remains a niche-feature for niche-users deal, I wager.
Those vendors all have DC power supply options, to my knowledge. It’s hardly new; early telco datacenters had DC power rails, since Western Electric switching equipment ran on 48VDC.
https://www.nokia.com/bell-labs/publications-and-media/publi...
That’s just it though, telco DCs != Compute DCs. Telcos had a vested interest in DC adoption because their wireline networks used it anyway, and the fewer conversions being done the more efficient their deployments were.
Every single DC I’ve worked in, from two racks to hundreds, has been AC-driven. It’s just cheaper to go after inefficiencies in consumption first with standard kit than to optimize for AC-DC conversion loss. I’m not saying DC isn’t the future so much as I’ve been hearing it’s the future for about as long as Elmo’s promised FSD is coming “next year”.
I think the real reason is because battery power didn't have to be converted twice to be able to run the gear in case of an outage, so you'd get longer runtime in case of a power failure, and it saves a bunch of money on supplies and inverters because you effectively only need a single giant supply for all of the gear and those tend to be more efficient (and easier to keep cool) than a whole raft of smaller ones.
At least for servers, power supplies are highly modular. It just takes one moderately sized customer to commit to buying them, and a DC module will appear.
Looking at the manual for the first server line that came to mind: you can buy a Dell PowerEdge R730 today with a first-party supported DC power supply.
Surely if it makes sense for the big players, they will do it, and then the benefits will trickle down to the rest? Like how Formula 1 technology will end up in consumer vehicles.
I stg if I see the kids talk about Westinghouse being batterymogged I'm leaving the Internet
Waiting for home DC.
It is silly to have AC-to-DC converters in all of my wall-connected electronics (LED bulbs, home controller, computer equipment, etc.).
Well, having spent some time operating a 12VDC system last year when I moved into some shacks, I will say that I find it a lot more convenient to run 120VAC.
I end up converting stuff anyhow, because all my loads run at different voltages. Even though I had my lights, vent fan, and heater fans running on 12V, I still ended up having to change voltages for most of the loads I wanted to run, or generate AC to charge my computer and run a rice cooker.
Not to mention that running anything that draws any real power quickly needs a much thicker wire at 12V. So you either need to run higher-voltage DC than all your loads for distribution and then lower the voltage when it gets to the device, or you simply can't draw much power.
Not that you can't have higher-voltage DC; with my newer system the line from my solar panels to my charge controller is around 350VDC and I can use 10AWG for that... but none of the loads I own that draw much power (saws, Instant Pot, rice cooker, Hammond organ, tube guitar amp) take DC :D
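A quick back-of-envelope sketch of the wire-thickness point (Python; my illustrative numbers, not a code-compliant sizing):

    # Copper cross-section needed to deliver 1.5 kW over a 10 m run
    # with a 3% voltage drop, at a few distribution voltages.
    RHO_CU = 1.7e-8  # resistivity of copper, ohm*m

    def wire_area_mm2(power_w, volts, run_m, drop_frac=0.03):
        current = power_w / volts
        max_resistance = volts * drop_frac / current   # R = V_drop / I
        length = 2 * run_m                             # out and back
        return RHO_CU * length / max_resistance * 1e6  # A = rho*L/R, in mm^2

    for v in (12, 48, 120):
        print(f"{v:>3} V: {wire_area_mm2(1500, v, 10):7.1f} mm^2")
    # 12 V: ~118 mm^2; 48 V: ~7.4 mm^2; 120 V: ~1.2 mm^2. Copper area
    # scales as 1/V^2, since both the current and the allowed drop
    # voltage improve with higher voltage.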
Not going to happen, for the same reason that the US never converted to a higher domestic voltage even though there are many practical advantages. The transition from one system to another at the consumer level would be terrible. Even if there were some advantage (and I'm not sure the one you list is even valid; you'd get DC-DC converters instead, because your consumers typically use a lower voltage than the house distribution network powering your sockets), it would be offset by the cost of maintaining two systems side by side for decades.
You could wire your house for 12, 24 or 48V DC tomorrow and some off-grid dwellers have done just that. But since inverters have become cheap enough such installations are becoming more and more rare. The only place where you still see that is in cars, trucks and vessels.
And if you thought boiling water in a camper on an inverter was tricky, wait until you start running things like washing machines and other large appliances off low-voltage DC. You'll be using massive cables, the cost of which will outweigh any savings.
The lesser-known instance of this is RV power. When you're running off small batteries and solar, you want to make the best use of the watt-hours you have, and that means avoiding the DC-to-AC-to-DC loop wherever possible. So you run 12V (or in newer models, higher voltage) versions of everything, upconverting as necessary.
Our houses should be DC. So wasteful to have all these bricks to change AC to DC.
Sure, maybe?
If your house gets 800V DC, you're still gonna need "bricks" to convert that to the 5VDC or 12VDC (or maybe 19VDC) that most of the things that currently have "bricks" need.
And if your house gets lower-voltage DC, you're gonna have the problem of worth-stealing-sized wiring to run your stove, water heater, or car charger.
I reckon it'd be nice to have USB-C PD ports everywhere I have a 220VAC power point, but 5 years ago that'd have been a USB Type-A port, and even now those'd be getting close to useless. We use a Type I (AS/NZS 3112) power point plug here, and that hasn't needed to change in probably a century. I doubt there's ever been a low-voltage DC plug/socket standard that's lasted in use for anything like that long; probably the old "car cigarette lighter" 12VDC thing? I'm glad I don't have a house full of those.
Something to consider, and something I got a vivid demonstration of while playing with solar panels: DC arcs aren't self-extinguishing, unlike AC arcs. At one point I stuck a voltage probe in, and the arc stuck with it as I pulled the probe away. It also vaporized the metal tip of the probe.
My understanding is that DC breakers are somewhat prone to fires for this reason, too.
Heh - I vaporised a fairly large soldering iron tip (probably 4mm cylindrical copper bar?) when I fucked up soldering a connector to a big 7-cell ~6000mAh LiPo battery and shorted the terminals. Quite how I didn't end up blind or in hospital I don't know. It reinforced just how much respect you need to pay to even low-ish voltage DC when the available current was likely able to exceed 700A by a fair margin momentarily. I think those cells were rated at 60C continuous and 120C for 5 seconds.
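For what it's worth, that ~700A figure checks out against the pack's own ratings: a C rating is discharge current as a multiple of capacity in Ah. A quick sanity check in Python:

    # C rating * capacity (Ah) = rated discharge current (A).
    capacity_ah = 6.0           # ~6000 mAh pack
    c_cont, c_burst = 60, 120   # ratings quoted above

    print(f"continuous: {capacity_ah * c_cont:.0f} A")   # 360 A
    print(f"5 s burst:  {capacity_ah * c_burst:.0f} A")  # 720 A
    # A hard short is limited only by internal resistance and wiring,
    # so momentarily exceeding 700 A is entirely plausible.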
heh man, I'm glad you got out of that easy, I definitely wore safety glasses 100% of the time after my experience. I think a lifetime of experience with dangerous wall outlets and harmless little 1.5V/9V DC cells teaches us the wrong lessons about DC safety. I've since heard stories of wrenches exploding when they fall across EV high voltage battery terminals. Wrenches aren't supposed to be explosive.
The electricians I was working with also told me stories about how with the really big breakers, you don't stand in front of it when you throw it, because sometimes it can turn into a cloud of molten metal vapor. And that's just using them as intended.
A bunch of those big breakers require two people: one person in a flash suit, and another standing by with a 2m-long pole hooked around the first person. That way, if an arc flash happens, the second person can yank the first person to safety without also getting hurt.
Those harmless 9V DC cells can do a lot of damage if you use them right.
Amps - the old 48VDC telco data centers vaporized wrenches once in a while.
You got super lucky.
That’s actually a recent phenomenon. Before the age of electronics most household appliances either worked with AC or DC equally well (like incandescent bulbs) or worked well with AC only given the technology at the time (think anything with a motor, fans, HVAC compressors etc).
Taking it to an extreme, the house I lived in while in grad school had wall lamp fixtures that doubled as electric and gas lamps. At some point I imagine it would have been possible to choose between using electric or gas by either flipping the switch or turning a valve. They said "Edison Patent" on them. We could have lit the house on AC, DC, or gas.
Thinking about the failure modes gave me the heebie jeebies, but the gas had been disconnected ages prior.
There are niches where DC makes sense - low-voltage lighting, USB/LED ecosystems.
Once you get into higher power (laptops and up), switching and distribution get harder, so the advantages fade.
For bigger appliances (fridge, etc), AC is fine + practical.
Your modern fridge is probably going to have an inverter-driven motor, so you're right back to using DC.
I've discussed this with people familiar with the matter, and they convinced me it's really not worth it for many reasons, the main one being safety: DC arcs are self-sustaining. AC voltage constantly goes to zero, so if an arc forms, it gets extinguished automatically when the voltage drops. With DC this never happens, meaning every switch or plug socket can create these nice long arcs and is a potential fire hazard.
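A trivial sketch of why the zero crossing matters: mains current gets two chances per cycle for the arc to die; DC gets none.

    # Zero crossings per second for common mains frequencies.
    for f_hz in (50, 60):
        crossings = 2 * f_hz  # two per cycle
        print(f"{f_hz} Hz: {crossings}/s, one every {1000/crossings:.1f} ms")
    # 50 Hz: 100/s (every 10 ms); 60 Hz: 120/s (every 8.3 ms). A DC
    # breaker instead has to stretch, split, or cool the arc until it
    # can no longer sustain itself.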
The 'what is safer' question for DC and AC at the same effective current and power has a mixed set of answers depending on conditions. For instance, DC is more likely to cause your muscles to contract and not let go (bad), but AC is more likely to send your heart into ventricular fibrillation (also bad).
AC arcs are easier to extinguish than DC arcs, but DC will creep much easier than AC and so on.
From a personal point of view: I've worked enough with both, up to about 1 kV at appreciable power levels and much higher than that at reduced power. Up to 50V or so I'd rather work with DC than AC, but they're not much different. From there up to 400V or so I'd much rather have AC, and above 400V the answer is 'neither', because you're in some kind of gray zone where creep is still low, so you won't know something is amiss until it is too late. And above 1 kV in normal settings (say, picture tubes in old small B&W TVs, and higher up when they're color and larger) it will throw you right across the room, but you'll likely live because the currents are low.
HF HV... now that's a different matter and I'm very respectful of anything in that domain, and still have a burn from a Tronser trimmer more than 45 years after it happened. Note to self: keep eye on SWR meter/Spectrum analyzer and finger position while trimming large end stages.
Really depends on what we're talking about. A lot of electrical safety equipment has a DC rating, usually something like 90VDC/300VAC. Also, most DC equipment just isn't going to have the stored energy to generate a big arc. Well, except batteries, and we're already piling them all around us.
I mean, it depends, but dual-rated stuff has both a voltage and a current limit, both of which are way lower. Typically a 230V/20A AC switch can switch 24VDC/2A. And the energy is not in the equipment, it's in the mains (or batteries like you said, or PV panels).
I've worked overseas a lot and one thing that's really different from 2 decades ago is that I simply don't need a step-down transformer anymore because every single thing I plug in converts to DC (or otherwise accepts dual-voltage) anyways. So I have a giant collection of physical plug adapters because every device I use just needs to fit into the socket and takes care of it from there.
(My stand mixer is the lone sad exception)
Having a single big DC converter at a home would help a lot with power factor (LED lamps connected directly to AC have terrible power factor).
Modern bricks really aren’t that inefficient though. An Apple charger is like 90% efficient. A DC to DC converter is about that efficient or worse.
I don't understand why new houses don't just have one high quality AC/DC converter so you can just use LED lighting without every bulb needing its own AC/DC converter. I imagine the light bulb cartel wouldn't really like that.
With modern technologies, that's power over ethernet or USB-C. Other comments in this thread point out that the telephone service also routinely used 48V for the ring signal.
However, higher DC voltage is riskier, and it's not at all standard for electrical and building code reasons. In particular, breaking DC circuits is more difficult because there's no zero-crossing point to naturally extinguish an arc, and 170V (US/120VAC) or 340V (Europe/240VAC) is enough to start a substantial arc under the right circumstances.
Unfortunately for your lighting, it's also both simple and efficient to stack enough LEDs together that their combined forward voltage drop is approximately the rectified peak (i.e. targeting that 170/340V peak). That means that the bulb needs only one series string of LEDs without parallel balancing, making the rest of the circuitry (including voltage regulation, which would still be necessary in a DC world) simpler.
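A quick check of those numbers, plus a rough series-string length; the ~3V forward drop is my assumption for a typical white LED:

    import math

    V_FWD = 3.0  # assumed forward drop per white LED, volts

    for v_rms in (120, 240):
        v_peak = v_rms * math.sqrt(2)   # rectified peak
        n = v_peak / V_FWD              # LEDs in one series string
        print(f"{v_rms} VAC -> {v_peak:.0f} V peak -> ~{n:.0f} LEDs in series")
    # 120 VAC -> 170 V peak -> ~57 LEDs; 240 VAC -> 339 V -> ~113 LEDs.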
> I don't understand why new houses don't just have one high quality AC/DC converter so you can just use LED lighting without every bulb needing its own AC/DC converter.
IEEE 802.3bt can deliver up to 71W at the destination: just pull Cat 5/6 everywhere.
* https://en.wikipedia.org/wiki/Power_over_Ethernet#Standard_i...
* https://www.usailighting.com/poe-lighting
LED light bulbs exist exclusively for compatibility with Edison sockets. Every LED fixture I have seen had a single transformer for the entire fixture; and that transformer was reasonably separate from the LEDs themselves.
It wouldn't work. LEDs need low voltages, meaning massive wires. You can do the voltage conversion from either AC or DC; AC just needs a few capacitors to smooth the wave out.
Do you want your house to burn down? Lower voltages for LED lights mean higher current.
That's traded off against the increased efficiency of LED lighting, at least compared to incandescent lighting. An LED "equivalent replacement" for a typical incandescent globe is down around 1/10th of the power; a 7-watt LED bulb is typically marketed as "60W equivalent". If that's configured as a bunch of LEDs in series (or series/parallel) that need 12VDC, it's right about the same current draw as the 120V 60W incandescent equivalent. (Or perhaps double the current for those of us who get 220VAC out of our walls.)
(Am I just showing my age here? How many of you have ever bought incandescent globes for house lighting? I vaguely recall it may be illegal to sell them here in .au these days. I really like quartz halogen globes, and use them in 4 or 5 desk lamps I have, but these days I need to get globes for em out of China instead of being able to pick them up from the supermarket like I could 10 or 20 years ago.)
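The arithmetic behind the "right about the same current draw" claim, for anyone who wants to see it:

    # I = P / V for the three cases discussed above.
    cases = [
        ("60 W incandescent @ 120 V", 60, 120),
        ("7 W LED @ 120 V",            7, 120),
        ("7 W LED @ 12 V",             7,  12),
    ]
    for name, watts, volts in cases:
        print(f"{name:27s} -> {watts / volts:.2f} A")
    # 0.50 A vs 0.06 A vs 0.58 A: the ~10x efficiency gain of the LED
    # is roughly cancelled by the 10x lower distribution voltage.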
Because shorts and voltage loss are a real issue at that scale.
This article seems to imply that 800V DC is high-voltage DC, but that seems quite low.
I think there's a regulatory "Low Voltage" definition of "below 50V", which has implications around whether you need to be a licensed electrician to install it or not. Anything above that is, for at least some purposes, considered "High Voltage".
Other people, of course, have other definitions of high voltage:
"This resonant tower is known as a Tesla coil. This particular one is just over 17 feet tall and it can generate about a million volts at 60,000 cycles per second."
and:
"This pulse forming network can deliver a shaped pulse of over 50,000 amps with a total energy of about 1,057 times the tower primary energy"
https://www.youtube.com/watch?v=RoGbrgOhPes
Quite low compared to a power utility's HVDC, but quite high compared to the 5/12/24 V output of most AC/DC converters used for electronics.
I’ve always wondered about these new High-Voltage DC (HVDC) transmission lines.
I always thought AC’s primary benefit was its transmission efficiency??
Would love to learn if anyone knows more about this
AC is less efficient than DC at a given voltage. The advantage of AC is that voltage conversion is cheap, easy, and efficient; converting DC voltage is way harder, more expensive, and less efficient. However, the conversion costs are O(1) and the transmission losses are O(n), so past some distance (currently somewhere around 500 km) it's worth paying the conversion cost to get super-high-voltage DC. The big thing that's changed in the last ~30 years is a ton of research into high-voltage transistors, and fast enough computers to do computer-controlled MHz switching of giant high-power transistors. These new super fancy switching technologies brought the conversion costs down from ludicrous to merely annoyingly high.
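A toy model of that O(1)-vs-O(n) trade-off; the cost numbers are placeholders picked to land near the quoted ~500 km, not real project figures:

    # Fixed converter-station cost vs per-km savings of DC lines.
    CONVERTERS = 400.0   # assumed one-time HVDC converter cost (M$)
    DC_PER_KM = 0.35     # assumed DC line + loss cost per km (M$)
    AC_PER_KM = 1.15     # assumed AC line + loss cost per km (M$)

    break_even_km = CONVERTERS / (AC_PER_KM - DC_PER_KM)
    print(f"break-even: {break_even_km:.0f} km")  # 500 km with these numbers
    # Shorter than this, AC's cheap transformers win; longer, the
    # per-km savings of DC pay back the converter stations.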
> AC is less efficient than DC at a given voltage
To expand on this, a given power line can only take a set maximum current and voltage before it becomes a problem. DC can stay at this maximum voltage constantly, while AC spends time going to zero voltage and back, so it's delivering less power on the same line.
The primary benefit of AC is that it's really easy to change the voltage up or down.
The transmission efficiency of AC comes from the fact that you can pretty trivially make a 1 megavolt AC line. The higher the voltage, the lower the current has to be to provide the same amount of power. And lower current means less power in line loss due to how electricity be.
But that really is the only advantage of AC. DC at the same voltage as AC will ultimately be more efficient, especially if it's humid or the line is underwater. Due to how electricity be, a change in the current of a line will induce a current in nearby conductive materials. A portion of AC power is being drained simply by the fact that the current on the line is constantly alternating. DC doesn't alternate, so it never loses power that way.
Another key benefit of DC is that it can bridge grids. The problem with interconnecting grids is entirely due to the nature of AC power. AC has a frequency and a phase. If two grids don't share a frequency (happens in the EU) or a phase (happens everywhere, particularly the grids in the US), they cannot be connected directly; otherwise the power generators end up fighting each other rather than providing power to a load.
In short, AC won because it was cheap and easy to make high-voltage AC. DC is coming back because it's only somewhat recently become affordable to make similar transformations on DC, from high to low and low to high voltages. And DC carries further benefits that AC does not.
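To put the line-loss argument above in numbers (assumed 10-ohm line carrying 100 MW; illustrative only):

    # P_loss = I^2 * R with I = P / V: loss falls with the square of voltage.
    POWER_W = 100e6  # 100 MW delivered
    R_LINE = 10.0    # assumed total line resistance, ohms

    for kv in (100, 400, 1000):
        i = POWER_W / (kv * 1e3)
        loss_w = i**2 * R_LINE
        print(f"{kv:>4} kV: {i:6.0f} A, loss {loss_w/1e6:5.2f} MW "
              f"({100 * loss_w / POWER_W:.2f}%)")
    # 100 kV loses 10 MW (10%); 1000 kV loses 0.1 MW (0.1%).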
An important factor is that AC at a given nominal voltage V swings between +1.41V and -1.41V, so it requires, let's say, 40% better/thicker insulation than the equivalent V-volt DC line. This is OK for overhead lines (just space the wires more) but is a pain for buried or undersea transmission lines; for that reason, those tend to use DC nowadays.
BTW, megavolt DC-DC converters are a sight to behold: https://en.wikipedia.org/wiki/File:Pole_2_Thyristor_Valve.jp...
https://en.wikipedia.org/wiki/High-voltage_direct_current
How is DC better than a three phase delta 800Vrms, at 400Hz?
- Three conductors vs two, but they can be the next gauge up since the current flows on three conductors
- no significant skin effect at 400Hz -> use speaker wire, lol.
- large voltage/current DC breakers are.. gnarly, and expensive. DC does not like to stop flowing
- The 400Hz distribution industry is massive; the entire aerospace industry runs on it. No need for niche or custom parts.
- 3 phase @ 400Hz is x6 = 2.4kHz. Six diodes will rectify it with almost no relevant ripple (Vmin is 87% of Vmax; checked in the sketch after this list) and very small caps will smooth it.
As an aside, with three (or more) phases you can use multi-tap transformers and get an arbitrary number of phases. 7 phases at 400Hz -> 5.6kHz. Your PSU is now 14 diodes and a ceramic cap.
- you still get to use step up/down transformers, but at 400Hz they're very small.
- merging power sources is a lot easier (but for the phase angle)
- DC-DC converters are great, but you're not going to beat a transformer in efficiency or reliability
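The ripple figure in the list above is easy to verify: an ideal p-pulse rectifier's output rides the tops of the phase sinusoids and dips to cos(pi/p) of the peak where adjacent phases cross.

    import math

    # Minimum output of an ideal p-pulse rectifier, as a fraction of peak.
    for phases, pulses in ((3, 6), (7, 14)):
        v_min = math.cos(math.pi / pulses)
        print(f"{phases}-phase ({pulses}-pulse): Vmin = {100*v_min:.1f}% of Vmax")
    # 3-phase, 6-pulse: 86.6% (the "87%" above);
    # 7-phase, 14-pulse: 97.5%, hence the tiny smoothing caps.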
> no significant skin effect at 400Hz -> use speaker wire, lol
now run that unshielded wire 50 meters past racks of GPUs and enjoy your EMI
> The 400Hz distribution industry is massive; the entire aerospace industry runs on it
nothing in that catalog is rated for 100kW–1MW rack loads at 800Vrms
> 3 phase @ 400Hz is x6 = 2.4kHz... Your PSU is now 14 diodes and a ceramic cap
you still need an inverter-based UPS upstream, which is the exact conversion stage DC eliminates
> large voltage/current DC breakers are.. gnarly, and expensive. DC does not like to stop flowing
SiC solid-state DC breakers are shipping today from every major vendor
> DC-DC converters are great, but you're not going to beat a transformer in efficiency or reliability
wide-bandgap converters are at 95%+ with no moving parts
"now run that unshielded wire 50 meters past racks of GPUs and enjoy your EMI"
Multipole fields fall off faster than 1/r^2.
Also, I'm not in the field (clearly), but GPUs can't handle 2.4 kHz? The quarter wavelength is 30 km.
"nothing in that catalog is rated for 100kW–1MW rack loads at 800Vrms"
Current-wise, the catalog covers this range just fine. As for the voltages, well, that's the whole point of AC! The voltage you need is but a few loops of wire away.
"you still need an inverter-based UPS upstream, which is the exact conversion stage DC eliminates"
So keep it? To clarify, this is the "we're too good for plebeian power, so we'll transform it AC->DC->AC", right?
"SiC solid-state DC breakers are shipping today from every major vendor"
Of course they are. They're also pricey and have limited current capability (both capital costs, and therefore irrelevant when the industry is awash with GCC money), and they have higher conduction losses, and therefore more heat.
They're really nice though.
"wide-bandgap converters are at 95%+ with no moving parts"
Transformers have no moving parts either. Loaded, they can do 97%+ efficiency; that extra 2% is 2MW of heat eliminated in a 100MW center.
An advanced AI rack might use 100kW = 800V x 125A, requiring 2AWG wire, a quarter inch in diameter: this isn't your lol speaker wire. Actually, I apologize; I realized I may be talking to a serious audiophile, and didn't mean to disrespect your Monster cables.
The skin depth, by the way, is sqrt(2 * 1.7e-8 ohm*m / (2 * pi * 400Hz * mu0)) =~ 3mm for copper: OK for a single rack, but it starts to be significant for the type of bus bars that an aisle of racks might want.
As for efficiency, both 400Hz transformers and fancy DC-DC converters are around 95% efficient, except that AC requires electronics to rectify it to DC, losing another few percent, so the slight advantage actually goes to DC.
As for merging power, remember that a DC-DC converter uses an internal AC stage, so it's the same: you can have multiple primary windings, just like for plain AC.
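The skin-depth estimate above checks out numerically; the same formula, delta = sqrt(rho / (pi * f * mu0)), evaluated at a few frequencies:

    import math

    RHO_CU = 1.7e-8        # copper resistivity, ohm*m
    MU_0 = 4e-7 * math.pi  # vacuum permeability, H/m

    def skin_depth_mm(f_hz):
        return math.sqrt(RHO_CU / (math.pi * f_hz * MU_0)) * 1e3

    for f in (60, 400, 2400):
        print(f"{f:>5} Hz: {skin_depth_mm(f):4.1f} mm")
    # ~8.5 mm at 60 Hz, ~3.3 mm at 400 Hz, ~1.3 mm at 2.4 kHz: fine for
    # a rack feeder, marginal for a thick aisle bus bar.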
> I realized I may be talking to a serious audiophile, didn't mean to disrespect your Monster cables.
I am a recovering audiophool.
I do own a pair of 2m long Monster Cable speaker cables (with locking gold plated banana plugs). I am fairly certain I've used welders with smaller cables.
(In my defence, I bought those as a teenager in the late 80s. I am not so easily marketed to with snake oil these days. I hope.)
(On the other hand, I really like the idea of a reliably stable plus and minus 70V or maybe 100V DC power supply to my house. That'd make audio power amplifiers much easier and lighter...)
>- no significant skin effect at 400Hz -> use speaker wire, lol.
What are you talking about? There's a very significant skin effect at 400Hz. Skin effect goes up with frequency. These datacenters use copper busbars, not cable, so skin effect is an important consideration.
At 100 000 A for a 100 MW data center at 1000 V, speaker wire is a joke.
You obviously need at least a dozen strands in parallel!!
Clearly skin effect scales with frequency, but 400 Hz is still low: the skin depth only shrinks by sqrt(400/60) =~ 2.5x versus line frequency, so it's about 3mm. 3mm on each side makes for a pretty hefty rectangular cross-section.
If you could get that 100,000 amps flowing through your speaker wire, the vaporised copper and the plasma channel would probably keep your 100MW flowing, at least until your building caught fire.
Even your monster cable? ;)