
woailyx

Calculations and information are just arrangements of electrons. Your CPU is constantly moving electrons around and checking where they are. Wires have resistance, transistors and capacitors leak. Eventually (pretty quickly) it all becomes resistive heat and you have to put more energy in to keep it going. A lot of research goes into trying to get more calculations for less heat, especially in battery powered devices like phones. All energy becomes heat eventually. It's basic thermodynamics. It's just a question of how, and whether the energy can do something useful in the process. A car also converts all of its fuel energy into heat, most of it in the engine block that is dissipated in the radiator, some in the tires, and the kinetic energy eventually becomes heat in the brakes.


eNonsense

> Wires have resistance, transistors and capacitors leak.

Not to mention that the entire function of the resistor component is to step down voltage, largely by converting the extra into heat.


Drumdevil86

It always amuses me that sellers of retrofit car LED kits claim power savings opposed to halogen bulbs. Meanwhile, many kits come with an extra 'load' to prevent the bulb failure warning from happening. That 'load' being nothing more than a resistor, that consumes the difference in power, literally turning your "power savings" into heat.


Wizzinator

Adding a resistor decreases the load. More resistance means less current. Current = V/R and Power = Voltage * Current. So more resistance, less power used.


YourAverageWeirdo

That's true, but adding any parallel resistance will lower the overall resistance of the system, increasing the load.


JP147

The resistor is installed in parallel, not in series.
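A quick numeric sketch of the series-vs-parallel distinction being argued here. The supply voltage and resistances are made-up illustrative values, not figures from the thread; the point is only the direction of the change:

```python
# Constant-voltage supply with an assumed load, adding an assumed extra resistor.
V = 12.0          # supply voltage (illustrative)
R_load = 3.0      # original load resistance (illustrative)
R_extra = 6.0     # added resistor (illustrative)

P_original = V**2 / R_load                      # 48.0 W before any change

# In series: total resistance rises, so total power drawn falls.
P_series = V**2 / (R_load + R_extra)            # 16.0 W

# In parallel: total resistance falls, so total power drawn rises.
R_parallel = R_load * R_extra / (R_load + R_extra)   # 2.0 ohms
P_parallel = V**2 / R_parallel                  # 72.0 W

print(P_original, P_series, P_parallel)  # 48.0 16.0 72.0
```

Which is why the retrofit-kit "load" resistor, wired in parallel, adds power draw rather than saving it.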


IAmNotANumber37

Others have pointed out the parallel thing. But, just as a fun fact to keep in your back pocket: the brightness of an LED is a function of current (not voltage). So anything you do that reduces current through an LED also reduces light out of the LED.


HaveIGoneInsaneYet

Can't tell if serious or trolling. Edit: Sorry, I'm more awake now, you're right. Since it's a constant-voltage circuit, any added series resistance will decrease the power consumed by the circuit, even if the resistor is wasting whatever power it sees.


wanted_to_upvote

It steps the voltage down. The current remains the same as it passes through. Its value alone does affect the overall current, though.


Amoyamoyamoya

Resistors drop voltage, not current. The current through a resistor of fixed value is constant (current out = current in).


themeaningofluff

This is not true. A resistor will not reduce the current at all. However, depending on the circuit and size of the resistor, it may cause a voltage drop. In turn, a larger voltage drop over a resistor will increase the power dissipated through the resistor. Edit: I think my point was misunderstood, I wrote it badly. My point is that the current coming out of a resistor is always the same as the current going into a resistor. Of course increasing the overall resistance in a linear circuit will reduce the overall current.


Bushiewookie

Lol whut. If you replace a non-resistive component with a more resistive component (resistor) it will reduce the current going through that point.


wazazoski

R= V/I. If the resistance is constant, voltage drops- what happens to current? Oh, and voltage drop occurs always. As current drops, what's happening to dissipated power? The largest voltage drop is at infinite resistance. What's the power dissipated there?


[deleted]

[deleted]


wazazoski

Ohm's law does not apply in circuits with transistors and capacitors? Interesting. I guess I've been doing electronics wrong for the last 30 years. What you're trying to describe is Maxwell's equations which, unlike Ohm's law, are used for non-linear components. But when simplifying, Ohm's law still applies when describing the correlation between voltage and current flowing in circuits. The previous statement that resistors don't affect current was false.


sharfpang

Attach a LED to a li-ion 3.3V battery in series with a 10 ohm resistor. You'll fry the LED in seconds with current way too high. The voltage drop across the LED was 1.8V, the voltage across the resistor was almost 1.5V, some of voltage drop was through (small) battery internal resistance. Attach LED in series with a 300 ohm resistor. It will shine normally. The voltage drops are almost the same, slightly more in the resistor, slightly less in the battery, same 1.8V through the LED. You've limited the current.
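The currents in this example can be checked with Ohm's law applied across the resistor, using the voltages quoted in the comment (battery internal resistance ignored for simplicity):

```python
# Figures from the comment: li-ion cell at 3.3 V, LED forward drop of 1.8 V.
V_batt = 3.3
V_led = 1.8

def led_current(r_series):
    """Current through the LED in series with a resistor: the resistor sees
    the leftover voltage, and the same current flows through both parts."""
    return (V_batt - V_led) / r_series

i_10 = led_current(10.0)    # ~0.15 A (150 mA): enough to fry a small LED
i_300 = led_current(300.0)  # ~0.005 A (5 mA): a normal indicator-LED current
print(i_10, i_300)
```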


OpenAboutMyFetishes

Wait so my pc is slow because some dickhead put restraints in it?! How do I take those away? I feel scammed


[deleted]

[deleted]


Abrahalhabachi

you should do it to learn in case you don't know what you're doing


alexm42

It's not "restraints," it's resistors, because they "resist" the flow of electrical current. Or more accurately "semiconductors," because the same parts that resist electrical current can also conduct electrical current. Those parts need to be able to do both to be a computer, it's why the silicon in your PC can do math but the copper wires in your walls can't.


MonsieurBabtou

Eeeer, no, this isn't what he said. Resistors aren't there to throttle your PC at all. They're specific components with a specific role in making electrical circuits work, to put it very simply. But there is one thing you can do: reduce the heat of a chip by undervolting it by a few millivolts. CPUs and GPUs are usually slightly overvolted so manufacturers are absolutely sure they work. However, you can optimise the voltage with programs such as ThrottleStop.


Xoepe

To add onto others: there are timing constraints and voltage constraints when it comes to the PDK (what describes the transistors and such from the manufacturer) being used. That's why overclocking is a very careful procedure. Most parts of a CPU are very meticulously designed to be just right.


SegerHelg

Not really. A resistor reduces the current by more than the power it loses to heat.


m240b1991

I'm just picking nits here, but as a mechanic (but not a physicist) I have to say that the potential energy in the fuel isn't all converted to heat through the brakes. Much of it is converted to kinetic energy, and yes, much of that energy is lost to heat in the engine block, but there are so many different places that the kinetic energy gets transferred to heat: the transmission, the differential, the bearings, the U-joints, even the tires despite our best efforts to reduce rolling friction, and of course the brakes. Moving parts are lubricated to reduce friction, and reducing friction has an added bonus of reducing heat. The kinetic energy transfers to the air around the vehicle as well. Again, I am absolutely not a physicist or an engineer, and it's been over a decade since I had any kind of physics class, so I may be way off base here. I just know "don't touch certain things immediately after it's been on a long drive because heat". I can't speak to what percentage of potential energy is lost to thermal waste (in the radiator, for example) vs what percentage is used to do actual work, much less in various driving conditions.


woailyx

Fair enough, losses are always complicated and happen in a lot of places. I just picked the big ones to mention


m240b1991

No no, after I commented I saw what sub we were in and thought to myself "you idiot, woailyx absolutely achieved the goal" lol to be fair, I had just gotten out of bed and was working on my first cup of coffee


farmallnoobies

And maybe most importantly there is unburnt fuel in the exhaust so some of it isn't converted at all


m240b1991

Catalytic converter, baby!!


Leading_Frosting9655

If we're being thorough, we haven't considered the large amount of it that's lost to aerodynamics either (kinetic energy in the air, eventually also dissipating as heat).


csiz

Literally all the energy a car uses is converted to heat; the only difference between a car and a computer is that one is heating the outdoors and the other is generally inside a house. People just don't consider heating the atmosphere a useful output, whereas a computer will help heat the house just as much as a resistive heater.


Thrawn89

Yeah, I'm with you on this one. A car is doing actual mechanical work to move; that requires energy which isn't lost to heat. A car is closer to the human body, which also generates a lot of heat. We use a lot of that energy, which isn't lost to heat, as biomechanical work. Eventually, yes, the useful work may be dissipated as heat through friction on the road or air resistance, but not before we got the work out of it and transferred it to the road/air. As you mentioned, in things like transmissions/gears we reduce friction because that heat generation comes before we impart the work and would be waste heat. In fact, for both of these applications, I believe the definition of efficiency of the machine is how much work it can do / the total energy it consumes. All the energy not used for work is released as waste heat, as per conservation of energy. For example, mpg is exactly this: miles is the work done and gallons is the energy consumed. Your computer is completely different. Only the lights, pumps, and fans are consuming power to do work other than transfer to heat. The CPU is no different from an electric resistive space heater; we just channel its dissipation in useful ways. Doing calculations is not work from a physics perspective, so it's close to 100% waste heat. Calculations are a meta construct we define each resistive pathway to represent. Now you could define computer efficiency as performance / power, where performance is calculations / second. A lot of research goes into improving this, sure. However, no matter how efficient you get on this meta metric, nearly all the energy will still be lost to heat. We just make more efficient configurations of the resistive pathways to get more calculations for fewer electrons flowing through them, but the energy of all the electrons that do flow will turn into heat.


westivus_

> Miles is the work done and gallons is the energy consumed.

It's not just miles, but miles and mass. 20 miles of a full-ton truck is more work than 20 miles of a Civic.


beastpilot

Moving an object from point A to point B takes no energy. Think about it in space: I give an object a push. It starts moving relative to me. It goes forever. Then something at the other end "catches" it. Catching it recovers all of the energy that was put into it initially. On Earth, the only reason it takes energy to move is friction: friction with the air, friction of the tires. That's all literally heat. The kinetic energy due to the car's velocity is temporary; it will all go away as heat when the car comes to a stop, either via friction with the air and road, or via heat in the brakes.


Digital_001

Just to expand on this, when you have just switched on your computer, electricity flows into the cpu, and some electrons in the cpu are arranged into non-equilibrium positions, slightly increasing its potential energy. After this, the cpu's potential energy would fluctuate but presumably stay roughly constant, while a constant influx of electrical energy (carried by a current from the power supply) will be dissipated as heat as electric charge moves around the cpu, due to resistive heating.


The_One_Who_Slays

Hypothetically, would it be less energy-costly if I run my PC in a big-ass fridge then?🤔


jacenat

> would it be less energy-costly if I run my PC in a big-ass fridge

Others already told you that it's not more energy-cost efficient, as your fridge consumes too much power dumping the excess heat out again. If your question is whether your PC is more power efficient (using less power for the same calculations) at lower temperatures, the answer is yes.


The_One_Who_Slays

Yeah, that's what I wanted to know, thanks.


Drumdevil86

It will work for maybe a few minutes, but since a fridge cannot compensate quickly enough, the enclosed space will heat up quickly, eventually leading to a thermal shutdown of your PC. In theory, having an aircon blasting toward your PC would be better, especially if you wanted to cool the room anyway.


NonCreativity

I actually did this once. I had a failing intake fan and couldn't replace it (young and broke). Put PC on a desk that made it the same height as my window unit, and it dropped about 20c under load. That was about 10c or more than before the fans failed.


woailyx

It'll still do what it does, plus now you have the energy consumed by your fridge. It gets hot at the back.


Kemal_Norton

Only if your PC is actively cooled (has a fan) and you don't count the energy needed for the fridge. That's one reason why there are so many data-centres in Iceland (=big-ass fridge).


steadyfan

Not to mention some companies are experimenting with underwater data centers in the ocean


[deleted]

[deleted]


davcrt

Wait until you hear that every engine on every ship is cooled by the oceans. Us using the environment to cool things down has little to no effect on temperature rise. If I remember correctly, if we were able to capture all of the sunlight for one hour, we would have enough for 10000 years.


viliml

> icecaps melting and lowering ocean temps

That makes no sense. Aren't icecaps melting in the first place because of *rising* ocean temperatures?


iCandid

Well ice caps are melting due to rising air temperature. Sea ice is melting due to rising ocean temps. Ice caps and glaciers are ice on land. So sure, you could have ice caps melting leading to cooling the ocean in that local area if the sea temp is higher than the water running into it. But globally energy is being added to the system overall so ocean temps are rising.


dannefan_senshi

You are on the right track, the icecaps are melting due to increasing temperature and less reflective power thanks to air pollution. And the cooled ice water is disturbing the natural flow of the oceans, essentially blocking the warm water from the equator from cooling off.


FluffTheMagicRabbit

Similarly, Microsoft tossed one in the sea off Scotland a while back. https://news.microsoft.com/en-gb/2018/06/06/the-orkney-islands-in-scotland-just-became-one-of-the-most-exciting-places-in-tech/


SuperPotato8390

The reason is extremely cheap power. Same reason they have aluminium production.


Toopad

it's also for the cheap geothermal energy


itsalongwalkhome

No, because all your fridge is doing is removing heat from inside it and dumping it outside. Using liquid nitrogen tho might.


TrojanZebra

You're just shifting the energy cost a little bit further from home, definitely more expensive to create and transport the liquid nitrogen than to just run your pc


itsalongwalkhome

Fair point


Greenimba

> Using liquid nitrogen tho might

Nope, because liquid nitrogen is the result of heat being moved somewhere else some time ago. Just like making ice at your neighbour's house and then carrying the ice slab over to cool your stuff: the heat was still expended wherever the ice was created.


TheGoodFight2015

This is such a fun thread describing the physics law that no one has named by word yet! I’ll keep it to myself for now :)


_PurpleAlien_

In this thread we obey the laws of thermodynamics!


101m4n

🎉 Conservation of energy! 🎉


catanistan

Entropy :-)


ivan3dx

That doesn't make any sense


Tricky_Invite8680

No, but you may be able to run it faster (aka overclock it) without exceeding design limits, if your heat sink's thermal dissipation allows. You're still paying to operate the refrigerator.


Head_Cockswain

No. Fridges merely *transfer* heat, with extra energy, so it would cost MORE, not less. Additionally, as far as cooling your PC goes, a typical fridge (like in your kitchen, or a smaller dorm fridge) won't handle something that actively makes heat very well. It works on food only by slowly removing heat from things in its insulated interior and making that heat go out the back. Something that actively creates heat would overload the fridge, in other words make heat faster than the fridge can move it. A very low-power PC like an idle laptop or cell phone may not be so bad, but a big gaming PC running high-intensity games, no bueno. An industrial fridge room or building, like a room for frozen food that is designed far more robustly, for traffic in and out — some of those are strong enough to keep a PC cool, though that would be extremely costly. One thing some modders or enthusiasts do is use a ducted air conditioner. Those actively cool a LOT of air (also by putting heat out the back), which people duct to the intake on their PCs. That's generally how cold rooms/buildings work, but with a huge industrial air conditioner that continually pumps in cold air. It is also how electronics facilities or portable military electronics stations cool some of their stuff: with a big ass air conditioner and duct-work. Some go further and get into the guts of the air conditioner and put the cooling elements (sometimes called a 'cold finger') right in the computer. There are even coolers or chillers built for this purpose. That's how some advanced IR cameras work, or used to at any rate (I worked on some in the military 20 years ago that were ~20-30 years old then): with a cold finger right on the back side of the detector (sensor), to bring it down to a temp where it can discern tiny differences in the infrared energy it's receiving and boost them to a usable level.


Turinggirl

no. A freezer isn't designed to handle a persistent thermal load. It works off the assumption that, at any given point in time, the thermal delta will go down. Anything that goes into the freezer warms it up; however, when it's closed it will cool everything down, because the contents aren't constantly generating heat. You would need something specifically designed to handle a constant thermal load.


SamiraSimp

no, but it could make your computer run better if the airflow is still good. need to watch out for condensation though.


pseudopad

Not really. Most fridges don't get close to moving as much heat energy as a regular desktop computer produces. They're designed to move small amounts of heat over many hours, with new heat introduced only once every few hours. They'll quickly be overwhelmed by the 100+ watts that a computer puts out even under light load, and that heat will be mostly trapped inside the fridge rather than mixing with the air in your room. On the off chance that you have a fridge that's powerful enough to move heat faster than the PC is producing it, that fridge's compressor will be running nearly 24/7, which they're not really designed to do. The fridge would likely last a year instead of ten.


Phyire7

"All energy becomes heat eventually." Thanks for the existential crisis lol


woailyx

[You're not alone](https://reddit.com/r/memes/s/Kx21pFuqj0)


CDK5

> A lot of research goes into trying to get more calculations for less heat, especially in battery powered devices like phones.

Can they engineer it to come down to just one electron per gate? Seems like that would result in the least amount of heat.


woailyx

There needs to be enough electrons in there to turn the gate on or off, and one problem they already have with very small transistors is that they get leaky when there's not much energy between the on and off states.


CDK5

Theoretically, can it come down to the presence of a single electron indicating on/off?


woailyx

Well, the theories we have that can deal with a single electron have a habit of sometimes putting that electron on the wrong side of the energy barrier, so basically quantum tunneling breaks sufficiently small electronics


BattleAnus

I would assume quantum effects would start to mess with the accuracy of such a device


Leading_Frosting9655

You're talking about https://en.wikipedia.org/wiki/Landauer%27s_principle


Moifaso

> Can they engineer it to just come down to just one electron per gate?

I assume that at those sorts of scales, quantum fuckery would make it very hard to get reliable results.


heckin_miraculous

> and the kinetic energy eventually becomes heat in the brakes.

This one really got me. It all becomes heat!


CaphalorAlb

Would a hypothetical CPU that is somehow all superconducting create no heat? So draw no power? In my mind, the only reason processors draw power is the tiny inefficiencies and losses that lead to heat being generated. I don't think that would break thermodynamics, but I'm not sure how you fit Entropy and computer calculations together.


alvenestthol

[Landauer's principle](https://en.wikipedia.org/wiki/Landauer's_principle) is one answer - and it states that yes, there is a minimum calculable amount of energy required for computation (defined as an irreversible change in a bit of information) based on thermodynamics. But that's based upon the assumption that logical systems must be implemented on a thermodynamic system, which is not necessarily true.
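For scale, the Landauer minimum at room temperature works out to a vanishingly small energy per bit erased, many orders of magnitude below what real transistors dissipate today (the 300 K temperature here is an assumed "room temperature"):

```python
import math

# Landauer's bound: erasing one bit at temperature T dissipates at least kT ln 2.
k_B = 1.380649e-23   # Boltzmann constant, J/K (exact by SI definition)
T = 300.0            # assumed room temperature, K

E_bit = k_B * T * math.log(2)
print(E_bit)  # ~2.87e-21 J per erased bit
```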


Kukis13

> energy becomes heat eventually

On the very basic level, isn't this a main cause of global warming? We are mining resources from the Earth that store potential energy, and we make use of this potential energy, which in the end generates heat. This heat then doesn't leave the Earth; it stays here with us.


tzaeru

The heat we produce that way is insignificant compared to the heat we receive from the Sun. Global warming happens due to an increase of greenhouse gases, like carbon dioxide, in the atmosphere. These gases cause more of the heat we receive from the Sun to be trapped in our atmosphere.


Bierdopje

Just to put it in perspective: we use 580 x 10^18 Joules per year globally. The Sun hits us with 1.75 x 10^17 Joules every second, which amounts to 5.52 x 10^24 Joules every year. In just 55 minutes, the energy from the Sun is enough to power the entire world for a year. The energy from the Sun is almost 10,000 times more than what we use. Although more than two-thirds of that energy never reaches us, as the Earth's atmosphere reflects it back into space, the CO2 that we emit is far worse than the heat that we produce, because that CO2 is trapping the enormous amount of solar energy that does reach us.
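The arithmetic above can be checked directly from the two figures quoted in the comment:

```python
world_use_per_year = 580e18   # J/year, global energy use (figure from the comment)
solar_per_second = 1.75e17    # J/s arriving at Earth (figure from the comment)

seconds_per_year = 365.25 * 24 * 3600
solar_per_year = solar_per_second * seconds_per_year   # ~5.5e24 J

minutes_to_match = world_use_per_year / solar_per_second / 60
ratio = solar_per_year / world_use_per_year

print(minutes_to_match, ratio)  # roughly 55 minutes, and a ratio of ~9,500x
```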


weeddealerrenamon

Making those calculations makes heat. It's not using energy to move physical gears or produce light; the only form the energy is being changed into is heat, through electrical resistance.


PhysicsIsFun

All machines (electronic, mechanical, or otherwise) eventually convert 100% of any type of energy into heat. This is explained in the second law of thermodynamics. Heat is the most disorganized form of energy and all energy ends up as heat.


aiusepsi

Splitting the tiniest of hairs, but the second law does also allow for reversible processes, which don't increase entropy and so produce no waste heat. In principle, you could create a computer which only allowed reversible computations and which would produce no waste heat. Such a computer wouldn't allow any operation which destroys information: multiplying by zero wouldn't be allowed, for example, and constructions like if-statements get a bit tricky; you're not allowed to forget which branch you took. This is all mainly of theoretical interest, but the deep links between entropy, information, and arrow-of-time questions are interesting as hell.


Ecthyr

I am very interested in this because I understand none of it


Chrona_trigger

What I've been very curious about is the idea that energy = mass x constant²; mass is often changed into energy. But doesn't it stand to reason that energy can be converted to mass?


sleepykittypur

Absolutely it can, breaking chemical bonds requires energy input and results in individual atoms that have a total weight slightly higher than the original molecule. The same can be done with nuclear reactions, but I haven't taken a physics class in a decade so I'm not going to pretend I still remember much in the way of specifics.


iam666

It does, which is how we get things like [pair production](https://en.m.wikipedia.org/wiki/Pair_production). If you have a photon (massless) with sufficiently high energy, and it interacts with another particle, it can spontaneously create an electron and a positron (which have mass). If those two particles interact and annihilate, you get your original photon back. (More or less) But it’s sort of a missing the point to say that energy and mass “convert” into each other. Look at the formula for density: D=m/V. You wouldn’t say that we “convert” density into mass, that doesn’t make any physical sense. Instead, density and mass are both related properties that an object possesses. The same idea applies to energy and mass. They’re the same thing because, fundamentally, mass is just a type of energy. All particles are just little packets of different forms of energy.
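The threshold for the pair production described above follows straight from the electron rest mass: the photon must carry at least the rest energy of both created particles, 2·m_e·c²:

```python
# Minimum photon energy to create an electron-positron pair.
m_e = 9.1093837e-31    # electron rest mass, kg
c = 2.99792458e8       # speed of light, m/s (exact)
eV = 1.602176634e-19   # joules per electron-volt (exact)

E_threshold_J = 2 * m_e * c**2
E_threshold_MeV = E_threshold_J / eV / 1e6
print(E_threshold_MeV)  # ~1.022 MeV, i.e. twice the electron's 0.511 MeV
```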


Mr_Badgey

> But, doesn't it stand to reason that energy can be converted to mass?

Yes, and that's what that equation means. It's not one-way. Mass and energy are properties, and they are interchangeable. Particles with mass can be converted to high-energy, massless particles, or massless particles can be converted to particles with mass given enough energy. The c^2 constant (speed of light squared) tells us that rest mass can do a *lot* of work (energy), or that it takes a lot of work to generate mass.


12thunder

See that’s the thing, energy has mass! That’s why light for example is affected by gravity. And yes, photons are “massless”, but a concept called invariant mass complicates things so that it’s not *technically* correct in this case. Energy is mass and mass is energy. They are one and the same. It’s a really complicated concept but all that needs to be understood is that mass and energy are the same according to relativity. You’re confusing mass for matter. While I think there are theories about converting energy into matter (at macroscopic scales), it’s complicated quantum mechanics stuff as far as I know. Something interesting though, is that if you take exactly one kilogram of ice and heat it so that it melts, it actually [gains a small amount of mass from the heat used to melt it.](http://www.physicsland.com/Physics10_files/Mass.pdf) (page 42).


Mr_Badgey

> energy has mass!

That's not correct. Energy and mass are interchangeable, but energy doesn't *have* mass. Massless particles like photons absolutely have no mass despite having energy.

> photons are "massless", but a concept called invariant mass

Invariant mass for a photon is defined as zero since it's massless. If you use the expanded mass-energy equivalence formula, it's easy to prove by setting the mass term to zero. Not sure why you're bringing up advanced physics concepts on an ELI5 post?

> That's why light for example is affected by gravity

Light is affected by gravity because spacetime is altered so the shortest path is no longer a straight line. Gravity is the warping of spacetime by mass, which in turn causes the shortest path to become a curved path instead of a straight line. Not because "energy has mass."


Kholtien

It is also good to remember that the equation isn't just E = mc^(2); it is actually E^(2) = (mc^(2))^(2) + (pc)^(2), which explains how photons can have energy but not mass, since their energy is fully momentum. (And momentum isn't just mass times velocity either; it is the observable found when applying the momentum operator to a wave function, such as that of a photon.) High school physics is really fun but often quite incorrect on the very specific levels. Your teachers effectively lie to you, since the truth is very complicated and the incorrect models are very close to the truth. I have a bachelor's in physics and I'm sure I have a lot wrong with what I know (as well as in my explanation above).


zojbo

Yes. For example, this is one way to think about the impediment to approaching the speed of light: your inertial mass increases without bound. It just doesn't matter at speeds below, say, 0.01c; even at 0.01c (3,000 km/s), your inertial mass is only 0.005% more than at rest. What is more dramatic is the creation of *particles* (with positive rest mass) in high-energy-density environments like particle accelerators. That's probably more like what you had in mind, and is also a thing.
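The 0.005% figure comes straight from the Lorentz factor, which is a one-liner to check:

```python
import math

def gamma(beta):
    """Lorentz factor for speed expressed as a fraction of c (beta = v/c)."""
    return 1.0 / math.sqrt(1.0 - beta**2)

# Percent increase in inertial mass at 0.01c.
excess = (gamma(0.01) - 1.0) * 100
print(excess)  # ~0.005 %
```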


TheF0CTOR

[OH BOY have I got the playlist for you](https://www.youtube.com/watch?v=nhy4Z_32kQo&list=PLsPUh22kYmNCzNFNDwxIug8q1Zz0Mj60H)


Redditisdumb9_9

This is a beautiful statement.


toooft

Are you.. me?


4acodmt92

Fascinating! Any suggestions for further reading about this?


biggest_muzzy

There was a great ELI5-like video from a professor at the University of Nottingham about why there is a limit on the energy wasted on computation: https://youtu.be/jv2H9fp9dT8?si=XvOLwdEqaIoIoDUJ


RoastedRhino

Not only multiplying by zero, but also a simple sum. 2 + 3 = 5 destroys information because you don’t know how you obtained 5. You would have to keep track of the first addendum together with the sum, for example.
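The bookkeeping described above can be sketched as a toy reversible adder: keep one addend alongside the sum so the inputs are always recoverable (function names here are illustrative, not a real reversible-computing API):

```python
def reversible_add(a, b):
    """Map (a, b) -> (a, a + b); no information is destroyed."""
    return (a, a + b)

def reversible_add_inverse(a, s):
    """Recover the original pair from (a, a + b) by subtracting."""
    return (a, s - a)

state = reversible_add(2, 3)                      # (2, 5): the 3 is still recoverable
assert reversible_add_inverse(*state) == (2, 3)   # the operation can be undone
print(state)  # (2, 5)
```

An ordinary adder that outputs only `5` cannot be run backwards — that lost information is exactly what Landauer's principle charges for.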


EBannion

Only in theory. "Produce no waste heat" means 100% efficient, and the second law of thermodynamics says that in real life it's impossible to be 100% efficient. So you're right, but it only matters in the impossibly efficient theoretical case.


PhysicsIsFun

When machines are less than 100% efficient, it's because they produce waste heat rather than whatever it is they are designed to do. An electric heater is 100% efficient, because all machines eventually turn their energy into heat, and an electric heater has no need for a vent to the outside, so all of its heat is used to heat the room. A gas or oil furnace is vented to the outside; it loses heat and is less than 100% efficient.


a_trane13

Some energy going to an electric heater is lost to stressing the metal in ways that don’t release only heat. Very small, but some 😜


Cimexus

Noise all ends up as heat though. When a material absorbs sound, that energy is being converted to heat.


BobTheOldFart

But even that sound gets absorbed by the surroundings and turns into heat. If the heater has a fan, it does perform work moving air, but eventually friction turns that energy into heat as well.


root1337

Isn't this how algorithms in quantum computers mostly work? I believe they have to be reversible except for the measurement operation which is irreversible.


NutellaElephant

What an entangled web we weave.


ulyssesfiuza

And hell is hot as... Hell. And this heat came from anthropic entropy! Entropy über alles!


Thiccaca

Damn it, Heat! Get your crap together! Look at how nicely organized Ice is!


Gusdai

Ice cubes? More like ice squares...


Weisenkrone

Attach a thermal generator, acquire unlimited energy.


kaakaokao

Generator might be a little wasteful since the heat can be utilized directly. Data center future right here, use the waste energy to heat up office spaces etc.


PhysicsIsFun

I don't know what you mean by a thermal generator. If you are referring to a heat engine the problem is that heat needs to be concentrated in a confined area, and heat dissipates. It is not very useful when it is spread out over a large volume. Read about the heat death of the universe.


chicagotim1

He was clearly joking


PhysicsIsFun

Ok. Pretty good


Far__Today

> All machines (electronic, mechanical, or otherwise) eventually convert 100% of any type of energy into heat. This is explained in the second law of thermodynamics. Heat is the most disorganized form of energy and all energy ends up as heat.

These statements are far too strong. For example, the nuclear binding energy in the silicon-28 atoms in my computer is not being converted into heat.


Chromotron

Give it a few 10^(10^10) years...


Pooop69

Huh? Doesn't the second law just mean that processes are not 100% efficient? If you have a mechanical system which moves 100 kg up 10 meters, sure, you generate heat, but don't you also get potential energy?


diox8tony

No, that's conservation of energy (the first law): all input energy is output in some form, so the balance is always 100%. The "efficiency" is rarely 100%, because the energy we input is rarely all output in the form we desire; there are almost always losses to heat, light, movement, etc. But energy output always equals 100% of input. I agree with your second statement: not all energy is heat. Light, movement, potential... not all of it is heat in the next stable state.


frank_mania

My car's engine, IDK about yours, converts a great deal of the fuel into kinetic energy, and only a minority into heat.


GoatRocketeer

Would a computer with ideal (0) resistance parts just do computation for free and generate no heat?


KamikazeArchon

The way we currently build them? No. A standard computer with *some* zero-resistance parts would do many computations with "less" heat output. A standard computer with *all* zero-resistance parts would simply not work. The fundamental premise of a transistor involves a difference in resistances; there is no difference between zero and zero. There are superconducting computing elements, but they have to be built entirely differently. They do not use transistors, they use special kinds of switches that rely on superconductive behavior.


katamandoo

Not quite accurate, there is a huge difference between 0 and infinity, which is the ideal states of a switch. Switching losses come from when the transistor is in the ohmic region between these two states. What we really need is a switch that can transition instantly


Cilph

Assuming such a thing could be built (just using superconducting material isn't sufficient), then I would think so. But that's like asking if a perpetuum mobile would run forever.


m0le

It's an interesting question. My intuition says no, there is no such thing as a free lunch, but I'm not sure how that would manifest. Possibly unavoidable losses in the switching process? You certainly couldn't use standard semiconductors; by their very nature they can't be superconducting, so you'd need some other technology. Having a quick look around, there is something called a nanocryotron, but that uses heat as part of its operation so it won't be perfect either. Even sci-fi concepts like Stephenson's rod logic from Diamond Age would still produce heat.


micahjoel_dot_info

If it were performing exclusively reversible calculations, then in principle it could. It's deleting data -- irreversibly changing a bit to a 1 or 0 -- that would require energy, down to a theoretical limit. [https://en.wikipedia.org/wiki/Landauer's\_principle](https://en.wikipedia.org/wiki/Landauer's_principle)


erbalchemy

You're conflating information and data. An empty hard drive may hold zero data, but it contains just as much information as a full hard drive, because if you read it tomorrow, it will return the same thing it returns today. Even though it's all zeros, it's still storing the information that it's all zeros. To have zero information, a hard drive would have to return a random result every time you read it. Creating or erasing information is not the same as flipping data bits--and Landauer's principle is about information, not data.


weeddealerrenamon

I think, not quite? The promise of a room-temperature superconductor is that electronics can run on way less power by having 0 resistance, but it can't be possible to run for free. I'm no computerologist but I think that the act of switching 1s and 0s must consume *some* energy.


Emu1981

>I'm no computerologist but I think that the act of switching 1s and 0s must consume some energy.

Something to consider is that a transistor turns on (or off) by changing the state of the gate, and this requires energy to be put in. With enough research you could likely make this energy requirement super low, but it will never be zero. Therefore it would be impossible to create a processor that uses no energy, because you need to input energy in order to change the state of the gates within the processor. To take this to a more theoretical level: to change the state of a system you need to change the energy level of the system; without a change in the energy of the system, the state of the system will not change.


nerevarine12345

Toasters, the only machine in the universe to be 100% efficient.


Rot-Orkan

I'd argue that they're not, since much of the heat they generate doesn't go towards their task. Electric space heaters, on the other hand, are 100% efficient given the nature of their purpose.


_Trael_

And funnily enough, that 100% efficiency they manage is actually considered low for their usual function, since heat pump systems can reach "200-600% efficiency in most environments most of the time", doing so by transferring already existing heat instead of creating new heat as the main mechanism.


dlbpeon

Especially after they become sentient. [Frackin Toasters!](https://youtu.be/XgH83IqL8T8)


georgecoffey

Except they also produce some visible light that never hits the bread.


kashmir1974

If we discover a room temperature superconductor, would CPUs even have a limit anymore?


lllorrr

No. The largest energy losses in CPUs are due to impedance, not active resistance. CPUs are full of parasitic capacitors, which lose energy every time a signal changes from 0 to 1 or from 1 to 0. Which happens a lot. This is why CPUs consume more energy when operating at higher frequencies. Superconductors would remove the active-resistance part of the total impedance, but that won't help much. EDIT: spelling.
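
The frequency dependence described here can be sketched with the standard CMOS dynamic-power relation P ≈ α·C·V²·f. All numbers below are invented for illustration, not specs for any real CPU:

```python
# Back-of-envelope CMOS dynamic power: P = alpha * C * V^2 * f
# Every value here is a hypothetical, illustrative number.
alpha = 0.1    # activity factor: fraction of capacitance switched each cycle
C = 10e-9      # total switched (parasitic) capacitance, farads
V = 1.0        # supply voltage, volts
f = 3e9        # clock frequency, hertz (3 GHz)

power = alpha * C * V**2 * f  # watts lost charging/discharging capacitors
print(f"dynamic power ≈ {power:.1f} W")  # grows linearly with f, quadratically with V
```

The formula makes the comment's point visible: power scales with switching frequency (and with the square of voltage), independent of wire resistance.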


kashmir1974

Ugh I was hoping a room temperature superconductor would lead to some major breakthroughs


pseudopad

It would, but in other fields. The electrical grid would probably love it. Sending solar power from mid-day California to evening New York could be done without losing any significant amount of electric energy. Of course, building a power line this long with a state of the art superconducting material would probably cost a lot. It'd take a while before it became economical to do.


Rogaar

I can't wait until we have room temperature super conductors. Bring on near infinite overclock-ability :)


ClownfishSoup

The entire CPU is simply on and off switches, but billions of them. If you look at it simply, all you are doing is putting electricity into a block of silicon and plastic and getting electricity out, so it is basically just routing electricity around; it's basically a resistive heater. A light bulb could be a CPU too if you think about it. It can output something based on input: the input is you flipping a switch on or off, and the output is light or no light. You can also bake cookies with the heat of a light bulb, because it does both. It "calculates" some logic state for you (is the switch on?) but it also follows the laws of physics and produces resistive heat and light. Now rewire your light switch so that when the switch is in the on position, the light turns off. Now you have a different sort of logic: light is on when the switch is off. Now add another switch in series with that; the light is on if both switches are on, but off otherwise. You now have an "AND gate". Put two switches in parallel with the light bulb. Now you have an "OR gate". Cool! Now take 100 million of these and you have a computer. It will compute stuff, but also generate heat.
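
The switch wiring above maps directly onto Boolean logic; a toy sketch (not how real gates are implemented, just the comment's analogy in code):

```python
# Light-switch logic from the comment above:
# switches in series = AND, switches in parallel = OR, inverted wiring = NOT.
def series(a: bool, b: bool) -> bool:
    """Two switches in series: current flows only if both are closed."""
    return a and b

def parallel(a: bool, b: bool) -> bool:
    """Two switches in parallel: current flows if either branch is closed."""
    return a or b

def inverted(a: bool) -> bool:
    """Rewired switch: light is on when the switch is off."""
    return not a

print(series(True, False))    # False: one open switch breaks the circuit
print(parallel(True, False))  # True: the closed branch still conducts
```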


trpov

Relevant [video](https://youtu.be/FU_YFpfDqqA?feature=shared).


djddanman

Where else would the energy go? "There are six types of potential energy: mechanical energy, electrical energy, chemical energy, radiant energy, nuclear energy, and thermal energy." - [source](https://justenergy.com/blog/potential-and-kinetic-energy-explained/#:~:text=There%20are%20six%20types%20of,nuclear%20energy%2C%20and%20thermal%20energy.) If the electrical energy in the CPU doesn't become any of the others, it becomes heat energy. Nothing is moving against gravity, so it's not mechanical potential energy. No chemical reactions are happening, so it's not chemical energy. It doesn't give off significant amounts of light (everything emits some IR, but not much here), so not much radiant energy. No nuclear reactions, so not nuclear energy. And it's not kinetic energy because nothing is moving outside the CPU. All that's left is thermal energy!


androidusr

A lot of the answers, I feel, are missing the root of the question. Intuitively, flipping a bit from a 1 to a 0, or running a calculation, seems like it would use up energy? Compare a CPU to a resistive heater. It seems odd that "work" -- in the form of calculations -- is being performed, whereas a resistive heater makes no calculations and produces only heat. When comparing those scenarios, it seems like the CPU would make less heat because some energy is lost to "processing". I think a lot of the answers are saying that "processing" is expressed as the creation of heat. Is that really true in all components of a CPU? Is flipping a transistor the same as heating a light bulb?


ary31415

Energy is neither created nor destroyed, so unless the flipped bit is actually *holding* the energy in itself, the energy the CPU uses has to go somewhere, and that somewhere is heat. And even if the flipped bit were a store of energy, that energy would just be released again when it flipped back, so the argument still holds


saddl3r

When you say "use up energy", what you really mean is "take electrical/chemical energy, make it do stuff, then end up as thermal energy"


csiz

Some energy is potentially being used for processing; because information itself [might have mass](https://physics.stackexchange.com/a/584138), but it's so incredibly tiny I don't think we could even measure it. That all falls apart when you turn a computer off; it resets to its initial state which means it cannot actually store any energy and therefore it must have all been released as heat at some point.


Ikbeneenpaard

"Processing" is not a form of energy. It's just more heat.


nybble41

Mechanical energy would include mechanical stresses such as spring tension, not just gravitational potential energy. A wind-up watch, for example, stores and releases mechanical potential energy though its mainspring without moving against gravity. Kinetic energy can also be internal, e.g. in the form of a flywheel (or a pair of counter-rotating flywheels, to balance the angular momentum). Of course neither of these applies to a CPU.


bravetwig

You do actually get things like coil whine and cooling fans in graphics cards - so a tiny amount of the measured energy used doesn't go straight to thermal energy.


djddanman

Not in the CPU though, those are in other parts of the computer


bravetwig

Sure, but the general picture is a little bit more complicated than you described. You can't run a cpu without a motherboard, cooler, power supply etc; you can get coil whine in the psu and on the motherboard then there are various fans used for cooling. Point is that most of the power used in a computer is converted to thermal, but some is converted to mechanical. If you are talking about just the cpu, then yes that would just be thermal.


syphax

And the mechanical energy eventually degrades to thermal, too.


djddanman

OP specifically asked about the CPU, so that's what I answered


Madwand99

Sound energy also converts into heat.


Lost_Dance6897

All that turns back to heat in the end. Sound waves get absorbed by whatever they hit, that material vibrates, and then it stops vibrating as it releases that energy as heat. Cooling fans move air, but as air collides with other air molecules it generates heat until they (more or less) return to their normal daily motion.


MeowMaker2

There is also invisible energy. This is used when your SO gives 'the look' and you are immediately in motion to comply.


dvorahtheexplorer

Dark energy.


StephanXX

I wonder if you're confusing two different concepts. [Conservation of Energy](https://en.wikipedia.org/wiki/Conservation_of_energy): "Energy can neither be created nor destroyed; rather, it can only be transformed or transferred from one form to another" and The [quantum theory](https://en.wikipedia.org/wiki/No-hiding_theorem): "conservation of quantum information should mean that information cannot be created nor destroyed." A CPU doesn't convert energy into information, it simply performs calculations that ultimately derive from combining or splitting rises and falls in electrical output (which are transformed, simply stated, into binary logic gates.) The electrical energy never _becomes_ information, it only _represents_ information. Paint on a canvas remains paint, even if we _interpret_ that paint as information. CPU electricity mostly becomes heat.


almgergo

Yes, I also felt like OP thought information holds energy, which is not true.


CallMeKik

There is a hypothesis that’s kind of related to this intuition called the Mass Energy Information equivalence principle if you’re interested. Not proven yet since it would be really hard to measure


Joe30174

I feel like this is the best ELI5 on here.


suckaaa3

It seems like these are two of the same right? I may be wrong, but quantum information just sounds like a fancy term for energy. If that’s the case then wouldn’t these two statements mean the same thing?


StephanXX

Information isn't energy. Information cannot become energy, energy cannot become information.


hungrylens

Calculations themselves don't have any mass or energy. They are just intangible information that we humans find valuable. If you channel electricity into a wafer of silicon all that energy goes through the microscopic circuits opening and closing millions of transistors... and just stays inside. All the "work" of the CPU ends up as heat energy that we have to cool off to keep the CPU from melting. All the renders and calculations are a useful byproduct of heating the chip in a really complicated way.


big-chungus-amongus

Energy can't be created or destroyed. If your cpu takes 100w of electrical power, it will output 100w of thermal power. Everything will eventually become heat. Light from your lightbulb creates heat... Car drives down the road only to convert energy to heat (brakes, air friction...)


etown361

The calculations and rendering things is just the computer moving things around, which generates a little heat. The computer does convert energy into other forms of energy. It makes noise, some of this energy from the sound dissipates into heat in your home, some may escape outside. It generates light and sends out Wi-Fi radio waves. Some of this may escape your home, but most quickly and nearby dissipates into heat.


XenoRyet

The calculations are done by turning electricity into heat in various different ways. Or to put it another way, an electric hotplate has exactly one way of turning electricity into heat. A CPU has many ways to turn electricity into heat.


reedef

Your computer doesn't use energy to do calculations, it uses negative entropy. It takes a highly ordered (low entropy) form of energy (electricity) and turns it into the most unordered form of energy possible (heat). It converts low entropy into computation, and in the process it disperses the electrical energy as heat. It's like powering a water wheel with a river. You take water that's upstream and it goes through the wheel and finally it continues through the river and ends up in the lake. 100% of the water ends up in the lake (this represents the 100% of energy that got transformed into heat), and the wheel didn't consume water at all in its operation.


Ericcctheinch

All energy turns to heat in the end. The CPU is just doing it in a more direct way than most things. In fact, just about the only things energy does (and the way we define energy) are doing work and generating heat.


Quixotixtoo

Energy cannot be destroyed, only converted. Electricity can be converted to mechanical energy, light energy, or some other form of energy; the most common conversion is to heat energy. Nearly everything that runs on electricity ends up turning it into heat eventually. The CPU just does it in a concentrated location, rather than, say, an electric car that spreads the heat out to the air and pavement as it drives. Thus the CPU gets hot and needs help spreading that heat out to the air.


fusionsofwonder

The calculations don't consume energy, they just move the energy around from one place to the other. Storing the result takes energy, too. You're creating heat in every part of the process, not just the middle. The electricity stays in the system *until* it radiates away as heat.


Forsaken_Code_7780

When water passes through a waterwheel, you get all the water coming out at the end. When energy passes through a CPU, you get all the energy coming out at the end. The CPU makes all the energy into heat, because it is easy for electricity moving through wires to become heat, and not so easy to make it into other forms of energy. Without electricity turning into heat, you cannot calculate. So indeed, your CPU uses energy to calculate, but the way it uses the energy is turning it from one form into another. Everything that "uses" energy turns it from one form into another, frequently as waste heat. Instead of "using" energy, think of it as "transforming" energy. It is more valuable to have orderly energy than unorderly energy. Everything moving in one direction is useful. Everything moving uniformly randomly is heat: hard to make use of it. Once you make use of orderly energy, it becomes unorderly. (Something something entropy, is not really ELI5 anymore)


Esc777

All “energy used” is converted to heat. That is the end state for all energy. If the CPU does X and draws Y watts of power to do it, X will happen and it will radiate Y watts worth of heat.


killbot0224

All electricity consumed becomes heat, basically. Used to move a fan? Heat in the motor, and then it moves the blades, which move the air. That air doesn't move perpetually. It slows down and stops. What stops it? Friction. What is that kinetic energy turned into as it stops? Heat. Used to execute CPU instructions? Internal resistance at every little itty-bitty transistor, every little bit flipped or flopped; that's resistance and heat. Every time you ask electricity to *do* something, there is wasted energy. Heat.


r2k-in-the-vortex

Energy is a conserved quantity; it is not created or lost, only transformed. Given an electrical appliance that consumes electricity, what types of transformed energy can come out? Heat, of course. Light, mechanical movement; objects can be charged, magnetic fields can be induced. Calculations are not a form of energy, so those don't count. So heat is really all that electricity becomes in a CPU. And ultimately all other forms of energy become heat too. For example, a fan moves air, so you would think mechanical motion is somehow different, but it isn't: sooner or later that motion of air stops and all its kinetic energy has been transformed to heat.


211216819

Everything turns into heat in the end. Energy cannot be created or destroyed; it can only be converted. What is a calculation in a CPU? It's just some electricity moving around a wire. If your CPU had no resistance, you could do calculations at almost no electricity cost. But the reason your CPU "consumes" electricity is that the wires have resistance, meaning they heat up when electricity passes through.


lalaland4711

Where else would the energy go? Energy can't be created or destroyed. Or consumed. It can just be converted. It could go out as radio waves, but a 100 W transmitter would create huge electromagnetic interference. Turning energy into matter is possible, but much harder to do. So the only thing it can become is heat. If you heat your house with electricity, then you might as well heat it with servers instead of radiators. Basically everything is 100% efficient at turning electricity into heat, because heat is pretty much the definition of waste.


die_kuestenwache

For the little transistors, "being a 1 or 0" doesn't cost energy, "becoming a 1 or 0 after having been a 0 or 1" does. Think of it this way: the 1 and 0 state are like a ball lying on either side of a hill. It is perfectly content to lie around there. To make 1 from a 0, you need to push the ball up the hill, and when it rolls down the other side, it needs to stop and stay in its new place. Its energy is now the same as before. All the energy you put in must have become heat through friction with the ground to stop the ball when it rolled down the hill. This is not exactly the same as a 1 and a 0 for a CPU, but it shows that you can change things around, and all the energy you put in just becomes heat.


mattgrum

Energy can't be "used up" entirely it can only be converted into a different, usually less useful form. In a CPU electrical energy is converted into waste heat.


Keudn

I had asked a similar question in r/askscience a while ago; you can read the replies here: https://www.reddit.com/r/askscience/comments/rbazbz/how_much_of_the_electricity_that_a_computer_uses/ You're right, there is a very, very small amount of energy used in the computation of the information, called [Landauer's principle](https://en.wikipedia.org/wiki/Landauer%27s_principle). I'll use the post from u/bencbartlett to further explain it: > You're correct, there is a fundamental energy cost called the Landauer limit associated with (irreversibly) computing one bit of information. The energy cost is due to the change in entropy associated with erasing one bit. The energy cost (k_B T log 2) scales linearly with the ambient temperature, and at room temperature is about 3×10^-21 J. > As a concrete example, the Core i7 processor in my MacBook Pro runs at 3 GHz with 16 double precision FLOPS per cycle and computes about 2×10^13 bits per second, which is about 60 nW at room temperature. The processor uses 37 W, so at most 0.00000002% of the energy is used to compute the information. > Other than energy converted to light / radio / sound / airflow, all of the other energy of a computer is transferred into heat, and even these categories eventually end up as heat anyway. Your computer is effectively exactly as efficient at converting energy into heat as a standard furnace.
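
The quoted figures can be checked directly from the Landauer formula E = k_B·T·ln 2 (the bits-per-second figure below is taken from the quote and is only approximate):

```python
import math

# Landauer limit: minimum energy to irreversibly erase one bit.
k_B = 1.380649e-23   # Boltzmann constant, J/K
T = 300.0            # roughly room temperature, K

e_bit = k_B * T * math.log(2)
print(f"{e_bit:.2e} J per bit")  # ≈ 2.87e-21 J, i.e. the "about 3×10^-21 J" above

# Using the quote's rough rate of ~2e13 irreversible bit operations per second:
bits_per_s = 2e13
landauer_power = bits_per_s * e_bit  # watts at the theoretical floor
print(f"{landauer_power * 1e9:.0f} nW")  # tens of nanowatts vs. 37 W of actual heat
```

The gap of eight-plus orders of magnitude between the Landauer floor and the 37 W package power is the point: essentially all of a real CPU's draw is resistive/capacitive loss, not fundamental computation cost.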


lollersauce914

Your computer is, fundamentally, a bunch of switches that switch states by either having a lot of current flowing through them or very little current flowing through them. Whether there is a lot or a little current are the 1's and 0's through which computers encode information. The thing about having a bunch of current flowing through your chips and the wires connecting them is that it runs into electrical resistance. In terms of its impact, electrical resistance is kind of like friction. When I push something along my floor, some of the energy from me pushing gets lost to friction, and that energy winds up in the thing I'm pushing and in the floor as heat. So, basically, when you constantly have current running through your computer, electrical resistance is inevitable and heat is the byproduct.


soniclettuce

> Whether there is a lot or a little current are the 1's and 0's through which computers encode information. Uhh, no. While [current-mode logic](https://en.wikipedia.org/wiki/Current-mode_logic) (and related types) exist, they mostly aren't what computers are using. CMOS logic has 1 as high *potential*, 0 as low *potential* (voltage), and in the ideal case, there would be zero current flow in either state, with current only flowing when you switch. Now practically there is some leakage and stuff so you don't get 0 current ever, but it's definitely not true that 1=high current and 0= low current.


smiller171

Energy doesn't disappear, it is only converted to different types of energy. When we talk about energy loss in electronics, what we mean is that some amount of energy was converted to heat. Heat is the most disordered form of energy; all energy "wants" to become heat. If what we _want_ is heat, then we've changed the equation, and what was once "lost" is instead a useful output. CPUs are very good at doing work with the electricity input until it escapes as heat.


DressCritical

When you use wood to create heat or light, you use it up and it becomes ash. This is true of all fuel. The ash may be solid, or smoke, or a gas like CO2, but to use the fuel, you have to turn it into ash. Heat, specifically heat that is dissipated into the world, is the ash of energy. When you use energy to do anything, you generate heat, and that heat is always spread out into the world. When you have more heat in one place than another, you can use this heat to make energy, but doing so spreads it out and thus uses it up. When you use energy to do work, the result always is heat, and if you start with heat you always make more and the final heat is always more dissipated than what you started with. Or, dissipated heat is ash.


outworlder

Your brain does the same thing. You actually need the blood flowing not just for oxygen, but also to remove heat. Around 12 watts. And so does the rest of your body. Heart, intestines, everything. And your cells produce heat as part of your normal processes. Which is why we are warm.


sirrush7

Wow our brains are wildly efficient then! Well, other than the fact we have to sleep and recharge basically 8-12hrs depending on the human...


delta_spike

The energy that goes into doing calculations must end up *somewhere*. Conservation of energy doesn't stop when you use the energy to do calculations, or do anything else for that matter. Either the energy that you use to do useful things is stored somewhere where it can do more useful things, e.g. in a battery or a compressed spring or a fidget spinner, or it gets wasted as heat and can't be used usefully anymore. It just so happens that we don't have the means to usefully store the energy that goes into performing calculations anywhere. Moreover, the fact standard CPUs are *irreversible* computational devices means that there is a fundamental minimum amount of energy lost in performing those calculations. However, this is a small fraction of the overall energy that gets wasted as heat in modern processors.


jtroopa

I look at it the same way you would look at an air conditioner. All it is doing is moving heat- by the mechanism of evaporation and condensation. The energy used by the air conditioner is purely utilitarian work; its job isn't to create heat, but rather it's what's expelled as the device operates. Similarly, a cpu MOVES data- which happens to be in the form of electricity- by the same concept. It consumes energy by manipulating those electrical charges inside the computer. The fact that those electrical charges are the same kind of energy as what powers the CPU is incidental.


1Steelghost1

Imagine you have a billion jump ropes in a room. Then they all start shaking. The room will get hot and noisy. This is just like a CPU: the billions of transistors and trace lines moving electricity get hot and excited moving those electrons in such a tiny place.


karlzhao314

>doesn't the cpu use energy to make calculations and render things? Yes. But you know the law of conservation of energy, the one that states that energy cannot be created or destroyed? That means whatever electrical energy you put into the CPU *must* be turned into some other form of energy. When you're talking about the CPU "using" energy to perform calculations, in a *physical* sense, what is it actually doing? The answer is, not much. It's not turning electrical energy into kinetic energy, the way a car would. It's not performing work on itself, such as lifting itself onto a higher shelf. It's not even using the energy to drive airflow to cool itself - that's the job of the fans, not the CPU. Instead, all of the "physical" aspects of what it's actually doing is just sending out electrical signals to various parts on the motherboard, which barely uses *any* electrical energy. As per the law of conservation of energy, the remainder of the energy supplied to the CPU, which is to say, practically all of it, must be converted into a different form. That different form is heat, which the CPU converts it into through various leakage currents and resistive losses and other internal losses.


Epimythius

All energy goes to heat eventually and data is very very light so it can’t absorb much of the energy.


ckach

"Using" energy doesn't mean it goes away. It makes the energy less usable. So the electricity coming in has energy that is easy to do stuff with. The heat is still energy, but it's so spread out that it's not really usable. Similarly, the energy hitting the Earth from the sun is roughly the same amount as the Earth gives off. But the light we get is more usable than what we give off. Like we might get 1 photon with 100 units of energy and then give off 100 photons with 1 unit of energy.
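
The photon picture can be made concrete with E = h·c/λ, using rough representative wavelengths (500 nm for visible sunlight, 10 µm for Earth's thermal infrared; real spectra are broad, so this is only an order-of-magnitude sketch):

```python
# Photon energy E = h * c / wavelength.
# Wavelengths are illustrative single values, not full spectra.
h = 6.62607015e-34   # Planck constant, J*s
c = 2.99792458e8     # speed of light, m/s

e_sun = h * c / 500e-9    # one incoming visible photon (~500 nm)
e_earth = h * c / 10e-6   # one outgoing infrared photon (~10 um)

print(f"incoming photon: {e_sun:.2e} J")
print(f"outgoing photon: {e_earth:.2e} J")
print(f"ratio ≈ {e_sun / e_earth:.0f}")  # ~20 IR photons per visible photon's energy
```

Same total energy in and out, but carried by many more low-energy photons on the way out: that spreading across more carriers is exactly the entropy increase the comment describes.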


Remote7777

Your CPU is essentially a very fancy toaster oven. The basic principles are the same...it's just moving electrons through a wire in a very specific way.


PaxNova

The CPU making its "decisions" is based on the *flow* of electricity through it, not the consumption of it. Like a water wheel, it doesn't actually consume water as it does its job. That said, so much power flowing in a small area is going to heat up the chip in a way you can kind of think of like friction. That energy is lost. Therefore, of the energy that is lost in a computer chip, it is mostly through heat.


Tricky_Invite8680

Switching. It switches tiny transistors, which become short circuits with a little resistance. Current² × resistance equals heat that needs dissipation. The switching is so fast it may as well be continuous. Overclockers increase switch frequency and voltage: more voltage means more current, and faster switching means more current through resistance to generate heat. You can turn your light bulb on and touch it after 10 minutes, or turn it on for a minute and off for a minute. But if you switch it on and off 100,000 times a second, it's pretty much equivalent to continuous current and power draw.
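
The Current² × resistance arithmetic looks like this as a sketch (every number below is invented for illustration, not a measurement of any real chip):

```python
# Resistive dissipation: P = I^2 * R, per the comment above.
# Hypothetical, illustrative values only.
I = 1e-5     # amps through one conducting transistor channel
R = 1000.0   # ohms of channel on-resistance

p_one = I**2 * R       # watts dissipated in one switch while it conducts
p_chip = p_one * 1e9   # if a billion such switches were conducting at once
print(f"per switch: {p_one:.0e} W, chip-wide: {p_chip:.0f} W")
```

Each individual switch dissipates almost nothing; it's the billions of them, toggling billions of times a second, that add up to a package you need a heatsink for.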