Superconductivity is not enough for a CPU that doesn't generate heat; you would also need a CPU built from reversible logic gates. Thermodynamics requires that destroying a bit of information generates at least 2.9×10^-21 J of heat (at room temperature). This is Landauer's principle.
My understanding is that modern CMOS circuits dissipate around 1 pJ (10^-12 J) per bit, so even if we are limited by Landauer's principle, that would still be roughly nine orders of magnitude of improvement. I would surely love a CPU that uses microwatts instead of hundreds of watts.
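A quick sanity check on those numbers (a sketch; the 1 pJ CMOS figure is the rough value assumed above, not a measurement):

```python
import math

K_B = 1.380649e-23  # Boltzmann constant, J/K
T = 300.0           # room temperature, K

# Landauer limit: minimum heat released per bit erased
landauer_j = K_B * T * math.log(2)   # ~2.87e-21 J

# Rough figure for modern CMOS switching energy (assumed above)
cmos_j = 1e-12  # 1 pJ per bit

headroom = cmos_j / landauer_j
print(f"Landauer limit at {T:.0f} K: {landauer_j:.3e} J/bit")
print(f"CMOS / Landauer ratio: {headroom:.2e} "
      f"(~{math.log10(headroom):.1f} orders of magnitude)")
```

So the headroom is closer to eight and a half orders of magnitude than nine, but the conclusion stands either way.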
Yup, the current dominating factor is resistance, not the Landauer limit.
Biological systems supposedly operate at about one order of magnitude above the Landauer limit [0], i.e. this is completely practical and has been happening since long before we made computers... we probably wouldn't exist without being this efficient. Imagine how much energy our cells would need to consume and emit as heat if they were as inefficient as a modern CPU!
I apologize, I couldn't resist the occasion to be technically correct. I should also have stated that a superconductor-based CPU would be incredibly cool, both figuratively and physically!
While a room-temperature superconductor would potentially have some uses in building a CPU… it's not as relevant as you might think, because semiconductor physics is what makes the transistors work: the electrical switching, and everything stemming from that underlying electrical unit, depends on semiconductors.
You could absolutely use superconductors for all the connecting wires, and with a little engineering they might also be good for conducting heat away from the chip die as part of the larger CPU package… they wouldn't be a good interface material (pins for connecting to the motherboard), as there you want oxidisation resistance and a degree of ductility and malleability to help make a very good electrical connection between the surfaces when they are mechanically pressed together. But overall it will have some uses; it's just not an immediately applicable technology for the silicon chips themselves. There's lots of potential in circuit designs, and I'm sure that if someone invented a way to lay this material onto a PCB as easily as we can print copper traces today, that inventor would get pretty rich… it's just not likely to change the silicon chip die itself much, due to the need for semiconductors to do the transistor switching.
Of course, someone might have invented a superconducting transistor that I haven't heard of, and if that's the case, disregard most of what I've just written haha.
I thought there was something out there like this, but I wasn't sure if it had actually yielded practical devices yet; cool to see where the state of the art has gone since I was last deeply involved in the ASIC & VLSI world. From what I read there, I'd expect significant developments based on a number of the technologies outlined in that article, if we do indeed have a room-temperature superconductor we can build miniature circuits with… But one issue I don't think they mentioned is grain size: our transistors are getting very, very small with current lithography technology, and it may be difficult to get equivalently small features (to use the industry term for the wires and bits we build silicon-chip transistors out of) using superconducting materials, where we have to control grain sizes and multi-element mixtures. There will be lots of work on it, so I'm sure it may change, but there will be a fundamental difference in how small we can make the conductor if it involves multiple elements and crystal grains in specific structural arrangements, unlike a pure metallic wire, which can be as thin as a few atoms (at this size your limits depend entirely on how tolerant you are of electrons accidentally tunnelling to other nearby conductors).
> but there will be a fundamental difference in how small we can make the conductor [...]
Considering how much faster superconducting logic circuits have been demonstrated to run [0], it might be worth the trade even with a higher fundamental limit on feature size.
A superconducting chip with far less logic could easily beat CMOS in: (1) power efficiency, (2) single-threaded performance (where CMOS has stalled). Interestingly, it could still compete in total throughput if the raw frequency is high enough, i.e. even though there may be far less logic available compared to the latest and greatest CMOS lithography techniques, if it runs so much faster, fewer or simpler cores could potentially match or beat the throughput of the more parallel and specialised but slower logic available in CMOS dies.
In short, it could be like taking a step back in time to the simpler, smaller CPU days, but a huge step forward in fundamental frequency. That actually sounds like a nice trade regardless, CPUs are so insanely complex these days.
I work in research, so I understand that it is very important to keep in mind the limits imposed by nature itself. Heck, I had more than one argument with idiot bosses who wanted to break the laws of physics or maths...
Say we use this (or any other) superconductor to build an AND gate. Assume one input is zero, so the information of the other input is lost.
I assume the energy lost as heat occurs because the "current" going into the AND gate has "nowhere else to go" other than to dissipate as heat?
If that's the case, simply redesigning our logic gates to have as many outputs as they have inputs, with some of these outputs feeding indirectly back to the power source without being read, seems feasible.
It's a thermodynamic argument. Suppose you have a system of n bits, each in an arbitrary state, so it has 2^n microstates. You set one bit to 0, regardless of what it was originally. It now has 2^(n-1) microstates. Using the entropy S for a given number of microstates N, S = k_B ln N, we get dS = k_B * (ln 2^(n-1) - ln 2^n) = k_B * ln 2 * (n - 1 - n) = -k_B ln 2, i.e. the system's entropy has decreased by a constant amount. But by the second law of thermodynamics, total entropy cannot decrease, so that entropy must be exported to the environment, and the way it's exported is as heat. How much heat is released? dE = T*dS, so dE >= k_B * T * ln 2. Note the dependence on temperature: you can reduce how much heat is released by operating at a lower temperature. Even at room temperature, however, this is about a billion times less than existing heat dissipation, so there's lots of room for improvement before we start hitting the Landauer limit.
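The same derivation in runnable form (a sketch; k_B is the Boltzmann constant, the register width n and the two temperatures are just illustrative choices):

```python
import math

K_B = 1.380649e-23  # Boltzmann constant, J/K

def entropy(n_microstates):
    """S = k_B ln N for a system with N equally likely microstates."""
    return K_B * math.log(n_microstates)

n = 64  # any width works; the entropy change is independent of n
dS = entropy(2 ** (n - 1)) - entropy(2 ** n)  # erase one bit
assert math.isclose(dS, -K_B * math.log(2))

# Minimum heat released when the environment absorbs that entropy:
for temp in (300.0, 4.0):  # room temperature vs liquid-helium territory
    print(f"T = {temp:5.1f} K: dE >= {temp * -dS:.3e} J per erased bit")
```

At 4 K the bound is almost two orders of magnitude lower than at room temperature, which is the temperature-dependence point above.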
This can be worked around by introducing ancilla bits to maintain the number of states in the system, but the instant you destroy the ancilla bits (e.g. by feeding them back to the power source), you dissipate energy. The exact mechanics of this would depend on the implementation of the device you're talking about, but you'd inevitably encounter it and be unable to overcome it.
> I assume the energy lost as heat occurs because the "current" going into the AND gate has "nowhere else to go" other than to dissipate as heat?
Not quite; it's a thermodynamic principle that applies to any way you could possibly compute AND. Basically, the laws of physics are reversible, so your computation must be reversible too. There are 4 possible inputs to an AND gate, so to be reversible there must be 4 possible outputs, one for each input. But we only want one output for the rest of our computation, so the others get dumped into the environment somehow.
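The standard reversible-computing construction for this (not something the thread spells out, but it's the textbook example) is the Toffoli gate, which embeds AND in a 3-bit-in, 3-bit-out bijection; a minimal sketch:

```python
from itertools import product

def toffoli(a, b, c):
    """Toffoli (CCNOT) gate: flips c iff a AND b. With the ancilla
    c initialised to 0, the output c carries a AND b."""
    return a, b, c ^ (a & b)

# Reversibility: the 8 input triples map to 8 distinct output triples,
# and applying the gate twice is the identity.
outputs = {toffoli(*bits) for bits in product((0, 1), repeat=3)}
assert len(outputs) == 8
for bits in product((0, 1), repeat=3):
    assert toffoli(*toffoli(*bits)) == bits

# Using it as an AND gate: set ancilla c = 0, read the result from c.
for a, b in product((0, 1), repeat=2):
    _, _, result = toffoli(a, b, 0)
    assert result == (a & b)
print("Toffoli is a bijection and computes AND with a zero ancilla")
```

The catch is exactly the one discussed below: the two pass-through outputs (and eventually the ancilla) still have to be kept or erased somewhere.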
I think we're talking about the same thing, but I explained it absolutely terribly.
If we "dump the other output" back into the power source, such as the battery, does that solve the problem of not implicitly dumping it into the environment? Or is it still destroying information?
How will you choose which output is the one with the AND result? You'd need some logic to pick which output was the right one, I assume. Then you've got the same problem again.
I think that you could move those bits to destroy them somewhere else, but what I got from pezezin's post is that there are many orders of magnitude of improvement to achieve before that loss becomes significant enough to warrant the incredible complexity of shipping wasted bits to the heat sink.
This is the first time I'm hearing it phrased this way and I wonder why it hadn't occurred to me before. Thanks so much, in any case, you have just increased my understanding quite a bit. :) *slaps head*
I don't think this actually affects the high-level opcodes, i.e. it's transparent: as long as the circuit implementing them somehow performs charge recovery, the high-level programming can still appear to be irreversible (I don't want to think about the potential side-channel attacks that causes when you want to zero some bits!).
Reversible logic is fun, but the memory requirements get intense. You need enough storage to retain every intermediate value used in a computation. If you have a 1 GHz 64-bit processor and it does an hour-long computation, you need to store its entire ~29 TB history of intermediates... and then spend another hour unwinding it!
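A back-of-the-envelope check of that figure, assuming one 64-bit intermediate is logged per cycle:

```python
# History size for an hour-long reversible computation,
# logging one 64-bit word per clock cycle.
clock_hz = 1_000_000_000   # 1 GHz
word_bits = 64
seconds = 3600             # one hour

history_bytes = clock_hz * seconds * word_bits // 8
print(f"{history_bytes / 1e12:.1f} TB of intermediates")  # → 28.8 TB of intermediates
```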
But currently, adiabatic chips have a bigger issue with getting to zero: their control circuitry is a bank of AWGs (arbitrary waveform generators), each burning probably hundreds of watts at room temperature. Ideally they don't produce heat in the cold zone, which is great for the cryogenic system, but if we have room-temperature superconductors, that advantage is suddenly moot.