Dec 12, 2018
It's pretty much just simple energy loss that causes heat build-up in electronics. That ostensibly innocuous warming, though, causes a two-fold problem.

Firstly, the loss of energy, manifested as heat, reduces the machine's computational power: much of the purposefully generated, high-power energy disappears into thin air instead of crunching numbers. Secondly, as data center managers know, to add insult to injury, it costs money to cool away all that waste heat.

For both of those reasons (and others, such as ecological concerns and equipment longevity, since the hardware breaks down with temperature), there's an increasing effort underway to build computers in such a way that heat is eliminated completely. Transistors, superconductors, and chip design are three areas where major conceptual breakthroughs were announced in 2018. They're significant developments, and consequently it might not be too long before we see the ultimate in efficiency: the cold-running computer.
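The cooling-cost point can be made concrete with a back-of-the-envelope estimate. The sketch below uses the standard power-usage-effectiveness (PUE) ratio; the specific load, PUE, and electricity-price figures are illustrative assumptions, not numbers from the article.

```python
# Back-of-the-envelope data center overhead (cooling) cost.
# All input figures below are hypothetical, for illustration only.

def annual_cooling_cost(it_load_kw: float, pue: float, price_per_kwh: float) -> float:
    """Estimate the yearly cost of the non-IT (cooling etc.) energy.

    PUE = total facility power / IT equipment power, so the overhead
    drawn on top of the IT load is it_load_kw * (pue - 1).
    """
    overhead_kw = it_load_kw * (pue - 1)
    hours_per_year = 24 * 365
    return overhead_kw * hours_per_year * price_per_kwh

# A hypothetical 1 MW IT load, PUE of 1.5, and $0.10/kWh electricity:
cost = annual_cooling_cost(1000, 1.5, 0.10)
print(f"${cost:,.0f} per year in overhead energy alone")
```

With those assumed figures the overhead works out to roughly $438,000 a year, which is why even modest reductions in waste heat translate directly into operating savings.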