22

I've been reading Tanenbaum's Structured Computer Organization and he says one of the major bottlenecks for increasing CPU clock speed is heat. So I started thinking: Is it possible to remove the heatsink altogether and use that heat to generate more electricity? I've been searching on this and found these thermoelectric materials and this thermoelectric generator:

Thermoelectric generator concept found on Wikipedia

I read in that Wikipedia article that "Silicon-germanium alloys are currently the best thermoelectric materials around 1000 °C (...)", and I know CPUs normally operate around 30–40 °C. So, getting to 1000 °C would require more CPUs.

So I thought: What about putting a lot of CPUs in parallel without their heatsinks to gather more heat? We can also overclock these CPUs a whole lot and see how much heat they can generate.

But I'm stuck. I don't know what to think next. I don't even know if it's a good line of thought.

My question is: why not develop some sort of heatsink that generates electricity from the CPU's heat? I know somebody must already have thought about that and thought of a reason not to do it, but I can't figure out what it is.

So, why is it not possible?


EDIT for clarification: I do not want CPUs to work at 1000 °C. I'll list my reasoning steps (not necessarily correct), which were roughly:

  1. CPU clock speed is limited by working temperature (T).
  2. CPUs generate heat. Heat makes T rise.
  3. Heatsinks take care of that heat in order to maintain T=40°C.
  4. Replace the heatsink with a thermoelectric generator (built from SiGe or a similar material).
  5. Put a lot of CPUs side by side to increase heat generation.
  6. Heat flows out of the CPUs into the TEG, so the CPUs stay at T=40°C.
  7. Is this possible?
  8. How to build such a TEG? Which material to use?
  9. Why doesn't such a device exist already?
  10. Asked this question.

EDIT2: I see that my idea is fundamentally wrong and bad. Thanks for all the answers and comments. Sorry about any misunderstandings.

Nayuki
Enzo Ferber
  • 13
    How do you propose your CPUs to work at 1000°C ? – PlasmaHH Jun 07 '17 at 12:29
  • @PlasmaHH I thought about a whole lot of overclocked CPUs in parallel. But the 1000C is the optimal operating temperature for silicon-germanium. I think they work at lower temps too, but less efficiently. – Enzo Ferber Jun 07 '17 at 12:32
  • 36
    Two CPUs at 50° each is not the same as one CPU at 100°. – Hearth Jun 07 '17 at 12:35
  • 3
    @EnzoFerber: so you want them to just generate heat and not work as a CPU doing calculations at all? Why not just use resistive ceramic heating elements in that case? – PlasmaHH Jun 07 '17 at 12:38
  • @PlasmaHH No, I want them to keep being normal CPUs. I just want to reuse the heat for something useful... – Enzo Ferber Jun 07 '17 at 12:39
  • 4
    Also, a major obstacle here is that thermoelectric cells are not able to move heat nearly as quickly as a proper heatsink. They would be much less effective at keeping the CPU from melting. – Hearth Jun 07 '17 at 12:40
  • @Felthry I imagined that. But I don't know how exactly two heat sources add up. – Enzo Ferber Jun 07 '17 at 12:40
  • 12
    They don't. Think of it this way: if the east side of your room is 20°C, and the west side of your room is 20°C, your room on the whole is 20°C, not 40°C or anything like that. – Hearth Jun 07 '17 at 12:42
  • 2
    @EnzoFerber: Then again, how are they going to work at 1000°C? Even if their thermal protection would not kick in, that's yellow glowing temperature, the majority of materials in a computer will melt or even evaporate. No CPU known to mankind is able to operate at those temperatures. So how will yours? – PlasmaHH Jun 07 '17 at 12:42
  • @PlasmaHH I know. My line of thought was: hundreds of CPUs lined up, calculating AND generating heat. A single device gathering heat from all those CPUs would cool them AND generate electricity. As I said, I'm curious and wondering about what I thought was an interesting idea... I don't know if it's even possible. – Enzo Ferber Jun 07 '17 at 12:44
  • 11
    @EnzoFerber: ok, I give up, you know that the CPU will be destroyed by it glowing yellow hot, but at the same time you want to make it glow yellow hot and work. Maybe the guys at scifi and fantasy SE have some magic that works for you. – PlasmaHH Jun 07 '17 at 12:46
  • 1
    Possible at regular temperatures, yes; practical / cost-effective, NO, not at this time. – Trevor_G Jun 07 '17 at 12:46
  • 1
    @PlasmaHH No. That's not what I want. I don't want them to be yellow hot. What I want is a device capable of keeping them cool AND generating electricity at the same time. No sci-fi. Just a question. – Enzo Ferber Jun 07 '17 at 12:49
  • 2
    @EnzoFerber: So you want them to be cool but at the same time be at 1000°C? – PlasmaHH Jun 07 '17 at 12:55
  • @PlasmaHH No no no. I see the confusion. I don't want them to be at 1000C. Never. The optimal operating temperature of silicon-germanium is 1000C, but does it work at lower temperatures? My idea was that a lot of CPUs working at 50~80C would add up (which I've already found it won't) – Enzo Ferber Jun 07 '17 at 12:59
  • What kinds of CPUs are made of SiGe? And even beyond that, where is your "optimal operating temperature" figure coming from? That sounds horrifically unreasonable. – Shamtam Jun 07 '17 at 15:30
  • 7
    I noticed no-one replied with what I think is the real solution, so I'm adding my opinion. To produce energy you cannot use heat; you need a heat DIFFERENTIAL. Since the CPU needs to stay at a fixed temperature (over 100°C it will behave badly), the only way to extract energy is to make the heatsink cooler. But the energy required to cool the heatsink is higher than the energy you can extract. You can extract X energy, but only by providing it Y > X energy. So... No power generation, sorry... – frarugi87 Jun 07 '17 at 16:03
  • 1
    @Shamtam The 1000C optimal working temperature for SiGe comes from the Wiki article I linked. I did not say CPUs are made of it or should be made of it. I was wondering if a device could be built from it or similar materials in order to convert heat to energy. – Enzo Ferber Jun 07 '17 at 16:37
  • @frarugi87 Thanks for the comment! Clarified some more things to me. – Enzo Ferber Jun 07 '17 at 16:38
  • 1
    @EnzoFerber Keep in mind that the "optimal working temperature" listed in that article is for thermoelectric effects, _not_ semiconducting effects. SiGe at that temperature would not work as a reasonable semiconductor (i.e.: you would not be able to make a useful CPU out of it). – Shamtam Jun 07 '17 at 17:42
  • 2
    It would probably be easier to use a filament lightbulb to power itself. It can handle high temperatures, and it produces a lot of heat. Now you just need to make a perpetual motion machine, but that should be no problem at all. If that is too complex, you could make a flashlight that has a solar cell to power itself. Car with bigger wheels at the rear so it is always rolling downhill... So many possibilities! –  Jun 07 '17 at 18:02
  • 4
    Your ability to generate power from a heat source depends absolutely on finding a large quantity of cooler temperature to exchange your heat into. The heat engine runs on the differential. That's why nuclear power plants have the big honkin' cooling towers. Since your probable "ultimate heat sink" is the room at 25C, raising the CPU to 55C doesn't give much headroom for power generation. – Harper - Reinstate Monica Jun 07 '17 at 18:07
  • The upshot is, instead of arguing with us about your ill-informed fantasy, **actually get informed** about power generation thermodynamics. Your disinterest in same suggests you dislike knowledge or the gaining of it. True, 90% of the time it'll be a dry hole, but sometimes you'll find something really cool and workable. And just get smarter all around. – Harper - Reinstate Monica Jun 07 '17 at 18:13
  • 1
    @Harper Sure, that's why I'm asking! I'm not arguing trying to convince you guys that I'm right. I'm trying to show you all my reasoning steps. I already see that I'm wrong... – Enzo Ferber Jun 07 '17 at 18:14
  • Dig into power generation. Also dig into the gas laws, as they intersect rather critically - the trick to exploiting latent heat of vaporization at oddball temperatures is freons, or any other fluid whose characteristics you can live with. (for instance foam cups are almost universally blown with *pentane*, because it is ideal... and the factories just cope with the flammability.) We are far, far afield of computer science at this point, but hey - atoms are the new bits. – Harper - Reinstate Monica Jun 07 '17 at 18:20
  • 3
    @Harper _"Your disinterest in same suggests you dislike knowledge or the gaining of it. "_ Look, I was reading a book about computer architecture. I had a question, I thought about it, I played around with it, I came up with an idea, I searched that idea, couldn't find much with my current knowledge. So I asked a question in order to get informed... If not here, then where should I ask and talk about it? I'm just trying to clarify everything in my head. – Enzo Ferber Jun 07 '17 at 18:21
  • 2
    @EnzoFerber sorry, that was excessive. – Harper - Reinstate Monica Jun 07 '17 at 18:22
  • 5
    This question is fantastic, but I'd suggest a slight change. Instead of generating electricity from the heat, how about instead just using the heat? For example, I use my Mac power supply as a coffee warmer. It is just the right size and shape to rest my coffee on it and it's always hot. If you had a server room and could pipe some liquid that you wanted to warm up to 50c across all the CPUs, then you'd get the heating for free. Well, pumping costs at least. – Anton Codes Jun 07 '17 at 20:29
  • 3
    @nocomprende, The idea of making a lightbulb that powers itself is, of course, ridiculous; but a camp-stove that uses waste heat to generate enough electricity to recharge your cell phone is [a product you can buy](https://www.rei.com/product/115523/biolite-wood-burning-campstove-2-with-flexlight). The key is, the amount of power that charges the phone is only a tiny fraction of the waste heat which in turn, is only a small fraction of the heat that's cooking your dinner. – Solomon Slow Jun 07 '17 at 20:54
  • 3
    Re, "remove the heatsink altogether." The "heat sink" is wherever the heat goes. There is always a heat sink even if there is no purpose-built, metal heatsink part. With no heat sink, (i.e., nowhere for the heat to go), the temperature would rise very quickly until the chip melted. A thermoelectric generator does not turn _heat_ into electricity. It generates electricity from the _flow_ of heat from a source to a sink. – Solomon Slow Jun 07 '17 at 22:00
  • 1
    @jameslarge Thanks for the clarification. What I meant was removing the purpose-built metal [heatsink](https://en.wikipedia.org/wiki/Heat_sink) and replacing it with the "miracle-device" I was thinking about. – Enzo Ferber Jun 07 '17 at 22:07
  • @wontonimo To some extent that happens. When heating your house, your computer will (slightly) reduce how much your heater has to work, by assisting in heating the room. When designing HVAC/controls systems for large offices for example, often times there will be a year-round cooling in the middle areas if large enough. All the heat generated from people and devices actually adds net heat to areas where it isn't surrounded by cold. You still need to heat around the perimeter in winter, as the heat lost to outside is greater than the people/devices in the building generate. – JMac Jun 07 '17 at 22:47
  • It always seemed annoying to me that we have to expend energy to cool in the summer and heat in the winter. It is like having to walk uphill both ways! Why isn't there some desirable process that just works? Why can't we store the cold all winter and use it in the summer? We used to, a hundred years ago. Ten feet underground it stays 50 degrees all year. Too bad that isn't a temperature we want. They should have made Hawaii *way way bigger*. –  Jun 08 '17 at 00:24
  • 1
    @wontonimo You should check this out https://www.qarnot.com/. It's a distributed data center implemented in ... heaters. Very good idea IMO ! – Etsitpab Nioliv Jun 08 '17 at 08:26
  • 1
    @nocomprende - We still do - its called a [Geothermal Heat Pump](https://en.wikipedia.org/wiki/Geothermal_heat_pump). – brhans Jun 08 '17 at 11:50
  • @brhans I was thinking of barns full of ice from frozen lakes, packed in sawdust. Heatpumps require expending energy to move heat around, I was saying why can't there be something that just works the way we would like? –  Jun 08 '17 at 12:00
  • @EtsitpabNioliv, Whoa, they stole the Irish students' bitcoin-mining radiator idea from 6 years ago! When everyone had a computer mining under their bed. – Drag and Drop Jun 08 '17 at 15:05
  • Another point not mentioned is the [maximum efficiency of a heat engine](https://en.wikipedia.org/wiki/Heat_engine#Efficiency), which is `n=1-(Tcold/Thot)`. For a CPU at 70C in a room at 20C you get a *maximum* (ie: impossible theoretical limit) efficiency of about 14.5%. For a cooler CPU at even 50C that drops to about 10%. So, assuming a 100W CPU where all the energy turns to heat you will never be able to generate more than maybe 10W of power, even with the best heat engine that could be made. In reality, you would get some fraction of that, and even less when the CPU was cooler or idle. – J... Jun 09 '17 at 14:12
  • @frarugi87 Your comment "No power generation, sorry..." is incorrect. You certainly can generate power, though because of traditional losses in transport and conversion, the amount generated will be less than the amount input. Generated power could be enough to run, say, a CPU cooler without using external power. Or some cool LED lighting. Or charge a battery. The concept is sound. If Columbus had listened to comments like yours, the New World might still be undiscovered. – Suncat2000 Jun 09 '17 at 15:06
  • @Suncat2000 please try to read my comment instead of blocking at the very beginning. You say "the amount generated will be less than the amount input". I wrote "the energy required to cool the heatsink is higher than the one you can extract". Isn't this the same? And a system where you have to input more energy than it supplies is not a generator. So.. No generators. – frarugi87 Jun 09 '17 at 17:07
  • @frarugi87 In your terms, there could never be any kind of generator because all "generators" lose some portion of all the energy in the system. You may know what you're trying to say but your lack of communication skills isn't getting it across. A generator is converting some form of energy into electrical energy, regardless of how efficient or inefficient it is. I encourage the original poster to make a contribution by thinking of new and better applications. – Suncat2000 Jul 20 '17 at 20:16

8 Answers

33

The issue with thermoelectric generators is they are horrendously inefficient.

For a CPU, you HAVE to get rid of the heat it produces or it melts down.

You could hook up a Peltier module and extract a small amount of electricity from it, but you would still need to dissipate the remainder of the heat via a classical heat-exchange method. The amount of electricity generated would likely not be significant enough to warrant the cost of the setup.
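
For a rough sense of scale, here is a hypothetical back-of-envelope sketch in Python. The 70 °C die, 25 °C room, 50 W heat flow and "5% of the Carnot limit" module figure are all assumed round numbers, not measured values:

```python
# Hypothetical back-of-envelope estimate of how much electricity a TEG
# bolted to a CPU could recover. All figures below are assumptions.

T_HOT = 70 + 273.15    # CPU-side temperature in kelvin (assumed 70 degC)
T_COLD = 25 + 273.15   # room temperature in kelvin (assumed 25 degC)
HEAT_FLOW_W = 50.0     # heat the CPU dumps, in watts (assumed)

carnot_limit = 1.0 - T_COLD / T_HOT          # ideal efficiency, about 13%
module_efficiency = 0.05 * carnot_limit      # assume a real module reaches ~5% of Carnot
recovered_w = HEAT_FLOW_W * module_efficiency

print(f"Carnot limit: {carnot_limit:.1%}")
print(f"Recovered:    {recovered_w:.2f} W out of {HEAT_FLOW_W:.0f} W")
# Roughly 0.3 W recovered; the remaining ~49.7 W still has to be
# dissipated by a conventional heatsink behind the TEG.
```

Even if the module were several times better than assumed here, the recovered power stays well under a watt or two.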

You CAN also use Peltiers as coolers. However, you need to ADD power to pump out the heat. That power then needs to be dissipated, along with the heat you are removing, via the heat exchanger. In the end, the latter needs to be larger, so your net effect is worse.

Heat to power is a "holy grail" idea and is up there with cold fusion as a theoretical dream.

EDITED FOR CLARITY

Efficient DIRECT conversion from heat to electricity is a "holy grail" idea and is up there with cold fusion as a theoretical dream.

Trevor_G
  • 7
    Heat-to-power is not just a theoretical dream. Every internal combustion engine, every steam turbine, every jet engine is doing exactly that. It just doesn't make any sense at the temperature that CPUs operate at. Also, the OP needs to learn the difference between heat and temperature. – Dave Tweed Jun 07 '17 at 12:50
  • 1
    Probably best said as "*waste* heat to power is a dream" - most power systems reject heat, but there's no way to efficiently make use of it due to Carnot and other limits. – pjc50 Jun 07 '17 at 12:51
  • 1
    @DaveTweed I'd argue most engines convert pressure to kinetic or electric energy and produce a lot of heat as a by-product which is their efficiency number... but I upped your comment anyway since you can argue it that way :) – Trevor_G Jun 07 '17 at 12:52
  • 5
    The heat content of the output fluid is always less than the heat content of the input fluid, which is why all of the devices I listed are generically classified as "heat engines", and their overall efficiency is limited by well-known laws of thermodynamics. A Peltier device is subject to those same laws, but it is notoriously inefficient to begin with. – Dave Tweed Jun 07 '17 at 12:59
  • 3
    @Trevor pressure is a *result* of the application of heat energy. Essentially pressure is their engineering means of *accessing* heat energy. Temperature is defined as the average kinetic energy, so in a way you have the right idea, but you are wrong on the cause vs. effect as long as you are talking about an engine, and not a compressor. – Chris Stratton Jun 07 '17 at 15:14
  • 1
    @ChrisStratton sigh.. yes I guess I should have said DIRECT conversion from heat to electricity.... – Trevor_G Jun 07 '17 at 15:17
  • 1
    It depends on your waste heat. CPU waste heat at a little more than room temperature is hopeless and always will be. Engine waste heat, for example is [the subject of much research](http://www.sciencedirect.com/science/article/pii/S135943111501128X) (a review article from last year). Other realistic proposals include power station waste heat and waste heat from photovoltaics – Chris H Jun 07 '17 at 15:17
  • 10
    It may be hard to generate useful electrical or mechanical energy from, but "CPU waste heat at a little more than room temperature" can keep you warm in winter - ie, the "data furnace" idea. – Chris Stratton Jun 07 '17 at 15:25
  • @ChrisStratton LOL not just in winter... brr it's cold outside today... I thought it was June? – Trevor_G Jun 07 '17 at 15:26
  • 1
    Given thermodynamics as we understand them, cold fusion is a far more promising line of research than direct heat-to-electricity. Cold fusion at least doesn't have a fundamental theoretical basis to say it's completely impossible. – Mark Jun 07 '17 at 21:10
  • 2
    @Christoph: Well, in large datacenters you have exactly this situation. Heat pumps (Air conditioners) are used to actively pump heat out of the datacenter to make the datacenter easier to cool, and nobody cares about the enormous power draw. – mic_e Jun 08 '17 at 14:14
  • 1
    @Christoph "Of course the increase in energy usage is almost never justified." That was my point, you just move the issue elsewhere. It will keep the surface of the CPU cooler, but more heat will emanate from the box in total. It does have it's uses for close quarters and badly ventilated applications however. – Trevor_G Jun 08 '17 at 14:19
19

For generating electricity, you want the hot side (processor) to be as hot as possible for maximum efficiency. The thermal generator slows down the movement of heat as it extracts energy from it.

For doing computation, you want the processor to be as cold as possible. Higher temperatures increase the electrical resistance of the silicon. This is why you have highly-conductive heatsinks, fans etc: to move heat away as fast as possible.

These requirements directly contradict one another.

pjc50
  • 6
    Or, to put it another way, you'd have to make the CPU work significantly worse to extract even a trivial amount of power. That's a losing proposition. If you can tolerate the CPU working worse, you'd be better off just providing it less power in the first place rather than providing lots of extra just to make it hot so that you can recover a tiny fraction of that. – David Schwartz Jun 07 '17 at 22:20
  • 1
    Actually Silicon is the opposite of Metal - [Resistance decreases as temperature increases](https://www.quora.com/Why-does-resistivity-of-semiconductors-decrease-with-increase-in-temperature). However high temperatures cause noise and low resistance causes other issues. Both cause CPU errors. – Tom Leys Jun 08 '17 at 02:01
  • Since this question is primarily a thought experiment; would a server farm on Pluto benefit from insulation keeping it at a balmy North Pole like climate? If the insulation is a little too effective, could you replace some with thermo-electric materials? – gmatht Jun 08 '17 at 05:17
  • 2
    @gmatht There's already experiments with data centres deep in the oceans. It looks quite promising for cloud clusters - cooling even huge server farms is almost trivial at those ambient temperatures, and the water can carry away lots of heat easily. I suspect Pluto would be rather impractical, even if we were only concerned with the temperature and not the other practical difficulties :) – Luaan Jun 08 '17 at 11:12
  • 2
    @TomLeys that's an oversimplification. With undoped semiconductors resistance goes down with temperature. With doped semiconductors it can go either way. – Peter Green Jun 08 '17 at 16:22
  • 1
    @gmatht A datacenter on Pluto would have to contend with the fact that there is next to zero atmosphere on Pluto, so heat dissipation can only happen by radiation, which is very inefficient compared to other methods. Or maybe you meant Pluto, Mickey Mouse's dog? :) In that case, I guess it would have to contend with the insulating effects of dog fur, which are considerable! – user Jun 09 '17 at 12:21
  • @MichaelKjörling I'd assume you'd build the plutonian datacentre with a partial atmosphere - nitrogen condensing on the walls and boiling off the equipment. – pjc50 Jun 09 '17 at 13:42
  • @Suncat2000 that was in relation to "heat transfer vs radiation", not the unrealistic prospect of generation – pjc50 Jun 09 '17 at 15:19
  • @pjc50 That's a faulty assumption. You want a heat differential. That's exactly how a normal heatsink works: moving energy from the hotter side to the cooler side. If you stick a thermal generator in the middle, that can generate electricity from that flow of energy and make use of some of the waste. Maybe not a lot, but it's certainly not impossible. – Suncat2000 Jun 09 '17 at 15:22
  • @Suncat2000 I can't work out what assumption you're referring to and calling "faulty"? I was saying that rather than a vacuum you'd want an atmosphere to remove heat from the "cold side", regardless of whether you have a TEG in there or not. – pjc50 Jun 09 '17 at 15:32
19

Surprised that nobody else has mentioned this:

Generating electricity from the waste heat of some process that burns fuel can make sense. Generating electricity from the waste heat of a system that is powered by electricity in the first place? That makes no sense. If it's possible for you to save energy by doing that, then it's possible for you to save even more energy by building a system that uses electricity more efficiently in the first place.

Solomon Slow
  • 3
    Exactly. If the CPU can tolerate the extraction of energy from its heat, it's operating very inefficiently and you'd do better exploiting that inefficiency to make it use less power in the first place rather than trying to extract a tiny fraction of it. – David Schwartz Jun 07 '17 at 22:21
  • I think this answer should be higher up. There is no device which can recover electrical energy back efficiently without losses. The processor itself should be optimised instead. – Max Payne Jun 08 '17 at 07:50
  • 1
    The same argument can be applied to engines which burn fuel: optimizing a thermal engine yields more than trying to collect the waste heat. – Dmitry Grigoryev Jun 08 '17 at 12:18
  • @DmitryGrigoryev which is why we don't see car engines and power plants covered in Peltier elements – Christian Jun 08 '17 at 12:55
  • 1
    It's pretty common for power plants to use the "waste heat" from gas turbines to run steam engines. – Peter Green Jun 08 '17 at 16:23
  • @Christian, actually, auto manufacturers would _like_ to use thermoelectrics instead of a less-reliable, belt-driven alternator to power a car or truck's electrical system, but so far, the manufacturing cost has kept it from getting past the prototype stage. https://en.wikipedia.org/wiki/Automotive_thermoelectric_generator#History – Solomon Slow Jun 08 '17 at 17:05
  • 4
    @DmitryGrigoryev: with one caveat: cogeneration. Collecting the waste heat and using it to heat other stuff is fantastically effective. – whatsisname Jun 09 '17 at 04:27
  • @whatsisname like a car's AC heating system – Mark Segal Jun 09 '17 at 10:09
  • 3
    Meta-comment: Probably nobody has thought of giving this answer before because it is not part of the question. The fact is that CPUs generate heat. The OP states that fact for completeness sake or to set up the context of the question. The OP does *not* ask how/whether this can be avoided. The question is whether the heat, which is a given, can be used to create electricity. Hence it makes no sense to propose to avoid heat (in the context of this question). – AnoE Jun 09 '17 at 11:48
  • This is a very interesting way of thinking about it! – Enzo Ferber Jun 09 '17 at 14:15
  • Further to @whatsisname's comment, Cray Research heated their Chippewa Falls, Wisconsin facility mainly using heat generated from building and testing supercomputers. That avoided conversion inefficiencies - the computers produced warm air, and warm air was exactly what was needed in Wisconsin winters. – Patricia Shanahan Jun 09 '17 at 15:00
  • That's as nonsensical as electrically-powered heaters. Who'd be stupid enough to try that. – Suncat2000 Jun 09 '17 at 15:17
15

tl;dr Yes, you can extract a small amount of power from a CPU's waste heat, but the more power you want to extract, the bigger your heat sink must be.

explanation There is no machine which converts heat into power, only machines which convert a heat difference into power. In your case, that difference is the one between the CPU temperature and the environment temperature. The maximum theoretical efficiency for this process is (1 - T_cold / T_hot), where the temperatures are absolute temperatures in kelvins. So for an environment temperature of 25 deg C, a CPU temperature of 40 deg C and a heat flow of 50W, you could generate 2.4 watts of electricity with an ideal converter. If you allow the CPU to reach 60 deg C, you can get up to 5 watts, and if you allow 100 deg C, you can get up to 10 watts. Real-life heat-to-power converters are far less efficient, especially thermoelectric elements. I'd recommend a Stirling engine, which is closer to the ideal efficiency.
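
A minimal Python sketch of the same Carnot-limit arithmetic, using the 25 deg C environment and 50 W heat flow assumed above:

```python
# Carnot-limited power extractable from the CPU's heat flow.
# Temperatures must be absolute (kelvin) for the formula to work.

def max_power_w(t_cpu_c, t_env_c=25.0, heat_flow_w=50.0):
    t_hot = t_cpu_c + 273.15
    t_cold = t_env_c + 273.15
    carnot_efficiency = 1.0 - t_cold / t_hot
    return heat_flow_w * carnot_efficiency

for t_cpu in (40, 60, 100):
    print(f"CPU at {t_cpu:3d} degC -> at most {max_power_w(t_cpu):4.1f} W")
# CPU at  40 degC -> at most  2.4 W
# CPU at  60 degC -> at most  5.3 W
# CPU at 100 degC -> at most 10.0 W
```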

This is how heat flows with a passive heatsink:

[CPU] --> [Environment]

The CPU-to-Environment junction has a thermal resistance, measured in Kelvins/Watt, directly equivalent to how electrical resistance is measured in Volts/Ampere. You might have encountered Kelvin/Watt values in some datasheets. An ideal heatsink has zero resistance, so the temperature difference is 0 and the CPU operates at environment temperature (25 deg C). With a real-life heatsink of 0.5K/W and a heat flow of 50W (the CPU generates 50W of heat), the temperature difference is 25K and the CPU is at 50 deg C.

This is how heat flows with your proposed machine:

[CPU] --> [Hot end of machine] --> [Cold end of machine] --> [Environment]

There are thermal resistances, i.e. temperature differences, at all three points. Let us assume that the connection between CPU and hot end of machine is ideal, i.e. they are at the same temperature. The thermal resistance inside the machine is used to generate electricity. The thermal resistance between the cold end and the environment is given by the cold-end heat sink.

Say the heat sink at the cold end is the same one we used for the CPU, with 0.5K/W, and we want the CPU to be at 50 deg C. Then the cold end of the machine is already at 50 deg C, and there can be no temperature difference over the machine, i.e. it can generate no power. If we use a heat sink twice as big (0.25K/W), then the cold end will be at 37.5 deg C and the temperature difference over the machine is 12.5 deg C, so it can generate a little bit of power.

Any machine that extracts power from a temperature difference poses a thermal resistance equal to (temperature difference)/(Heat flow). The thermal resistance of the machine is added to the thermal resistance of the heatsink, so the CPU temperature will always be hotter if there is a machine in between.
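
The same point as a tiny Python sketch. The 0.5 K/W heatsink, 25 deg C environment and 50 W heat flow are the values used above; the 0.25 K/W for the machine itself is an assumed illustrative figure:

```python
# Series thermal resistances: dT = heat_flow * R_thermal, just like Ohm's law.

HEAT_FLOW_W = 50.0   # heat produced by the CPU (value used above)
T_ENV_C = 25.0       # environment temperature (value used above)
R_HEATSINK = 0.5     # K/W, cold-end heatsink (value used above)
R_MACHINE = 0.25     # K/W, assumed resistance of the heat-to-power machine

t_cpu_heatsink_only = T_ENV_C + HEAT_FLOW_W * R_HEATSINK               # 50.0 degC
t_cpu_with_machine = T_ENV_C + HEAT_FLOW_W * (R_HEATSINK + R_MACHINE)  # 62.5 degC

print(f"Heatsink only:        CPU at {t_cpu_heatsink_only:.1f} degC")
print(f"Heatsink + generator: CPU at {t_cpu_with_machine:.1f} degC")
```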

BTW Some overclockers go the opposite way: They add a thermoelectric element which runs in reverse, using electric power to pump heat from the CPU to the heatsink, creating a negative temperature difference. The CPU is at the cold end, and the heatsink is at the hot end.

BTW This is why nuclear power plants have enormous cooling towers, which work as the cold-end heat sink.

mic_e
  • 3
    +1 the only answer so far addressing the actual issue instead of focusing on side effects. – Agent_L Jun 08 '17 at 10:41
  • 1
    I have heard a steam boiler is a pretty good device for extracting energy from heat alone... Naturally you need to go beyond boiling temperature to generate steam that would be useful, at which point your semiconductor cooks. I guess theoretically you could use a low-pressure system to bring the boiling point down. Hardly worth it for a couple dozen watts though. WRT nuke plants, you definitely could use the waste heat in the cooling cycle to provide e.g. residential heating. Those bad atoms jump from the cooling water to the heating water, as everyone knows, though. – Barleyman Jun 08 '17 at 12:09
  • @nocomprende: You are right, of course. I have clarified. – mic_e Jun 08 '17 at 12:23
  • @Barleyman: No. There can be no device which, as its only effect, turns heat into work. If there were such a system, the construction of a perpetual-motion machine would be trivial: Take pools A and B, both at the same temperature. Use the machine to cool down A and use the work to heat up B. Use a thermoelectric element between A and B to turn the heat difference into electricity. – mic_e Jun 08 '17 at 12:38
  • 1
    @Barleyman: Residential heating is a clever heatsink, because you can charge money for its use. But it's unreliable because your customers won't sink your heat during summer, so you will need towers as backup. Also, residential heating will require at least 60 deg C, so it won't be able to cool the cold end below 60 deg C. Remember: The lower the temperature of the cold end, the higher the efficiency. – mic_e Jun 08 '17 at 12:51
  • @mic_e It's hardly perpetual motion, you're bleeding heat out of the system by radiating even if you had perfect insulation. If your Ta was the same to start with, you wouldn't obviously create any more pressure so in that sense saying temperature difference is creating the force is correct. Or in more basic terms, you're increasing the thermal motion of the atoms to create more pressure. – Barleyman Jun 08 '17 at 13:03
  • @Barleyman: Perfect insulation means no radiation losses either. You can easily avoid radiation or other thermal losses by having your two reservoirs at 2.73K, in equilibrium with the rest of the universe. – mic_e Jun 08 '17 at 13:11
  • @mic_e There's a much higher temperature in the secondary circulation, >250 degrees depending on design. So you can build decent heating system if it's designed in. Because of the evil atom most proposals look at the tertiary circulation that's just not hot enough however. Anyways you need hot water around the year, depending where you live this is actually piped in, not heated in a boiler. – Barleyman Jun 08 '17 at 13:12
  • @mic_e You're not insulating anything if both sides are in equilibrium. Besides, you don't "insulate" radiation, you reflect it. – Barleyman Jun 08 '17 at 13:29
  • Let us [continue this discussion in chat](http://chat.stackexchange.com/rooms/60125/discussion-between-mic-e-and-barleyman). – mic_e Jun 08 '17 at 13:38
  • 1
    +1 for being the answer to end all other answers. :) A shame that another answer (which is OK but with much less detail) has been accepted. – AnoE Jun 09 '17 at 11:52
  • @Barleyman The cold side of a steam boiler is the part where you turn the steam back into water so you can boil it again. I guess you can release the steam into the atmosphere and take in fresh water from a river, but you still have a cold side, it's just the entire planet. – user253751 Jul 26 '18 at 23:11
2

Funny thinking, but no. Your CPU is not just a chip; there are bonding wires and a casing involved, which would not exactly stand a chance at 1000°C.

That aside, there are still some laws of thermodynamics to be considered. You still have to put a huge amount of energy into the system to get very little out. The Peltier element you are referring to needs a big dT (difference between the cold and hot sides), so simply removing the heat sink will bring the "cold" side up to the same temperature as the hot side, leaving no more energy to be gained; you'll need to cool the cold side, which will ruin the efficiency even more. On the other hand, those Peltier elements can be used to generate a temperature difference, as in cooling the CPU.

wiebel
2

The laws of thermodynamics tell us that putting together two heat sources at the same temperature doesn't give you a higher temperature. For example, pouring a cup of hot water into another cup of hot water doesn't make the combination any hotter than the separate cups.

Heat is also one of the lowest forms of energy in that there is very little you can do with it. Electricity can run circuits, wind can create mechanical motion, but heat can't do much beyond putting more energy into a fluid or solid.

That said, the most feasible method of getting energy from heat is boiling a fluid (water, for instance) to turn a turbine. Putting multiple heat sinks together, attached to a tub, might make water boil if the CPUs are all above 100 °C. But, as you can probably infer, this is a terrible idea.

Mr. Cheezits
  • Getting usable energy from a heat *gradient* is easy enough - but the efficiency increases as the difference gets wider. That's how e.g. combustion engines work, and that's why a thermodynamic engine tries to get as hot as practical, while keeping the other side as cold as practical. The gradient between a 50 °C CPU and its 25 °C environment doesn't give you much opportunity to extract useful energy - indeed, keeping the CPU cool enough is a challenge, and a heat engine would only make that worse. – Luaan Jun 08 '17 at 11:16
  • The point made was not about efficiency, but practicality. Boiling water with the waste heat of a CPU is impractical regardless of the temperature gradient. – Mr. Cheezits Jun 08 '17 at 12:42
  • 2
    Boiling water at room pressure, sure. But nobody says it has to be water, and that it has to be room pressure - there's plenty of stuff that would have a convenient boiling point. We're using a lot of different coolants depending on the conditions - including the now-popular heat-pipes that are actually used for cooling CPUs, using low-pressure water evaporative coolant vastly outperforming the heat conduction of the casing. Efficiency and cost is all that matters - extracting even a small part of the energy in such a tiny gradient is impractically expensive. – Luaan Jun 08 '17 at 15:08
2

In theory, it is possible. All you need is some "substance" that generates electricity when one of its surfaces is at 40 °C and the other is at 20 °C.
Currently, there are thermocouples that do exactly this (convert heat to electricity), but at much higher temperatures.

Guill
0

It is possible, but entirely useless. In the context of energy generation, it depends entirely on using the CPU as a heater, while for all other intents and purposes you do not want the CPU to behave as a good heater. It's an exclusive choice: either a good CPU or a good heater. You cannot have both.

And while operating a CPU, your main problem is not a lack of electrical power, but efficient dissipation of the waste heat. Squeezing out a few extra watts of power gives virtually no benefit whatsoever because wattage is not the limiting factor: if your computer needed more power, it could easily draw another thousand watts from your electrical outlet before the circuit breaker intervened. In reality, the reverse situation sometimes happens: sacrificing much more electric power to operate an active cooling setup, just to squeeze a bit of extra performance out of the CPU.

And these few "free" extra watts generated aren't really "free" either: the price you pay is having a CPU with a crippled heat sink that will severely shorten its lifetime, or provide subpar performance because of thermal throttling.

Chad Branzdon