Whether you’re working from home all day, gaming hard after hours, or both, your computer adds a measurable amount of heat to your home. Here’s why and how to calculate exactly how much it’s heating the place up.
Computers Are Surprisingly Efficient Heaters
Certainly, everybody who uses a computer knows they generate heat. If you put a laptop on your actual lap, it warms things up quite quickly. Anybody who has gone on a gaming bender with a desktop PC knows the room slowly gets warmer as the session goes on.
So the idea that a computer adds some heat to the room it is in while running isn’t necessarily shocking to most people. What is surprising to a lot of folks, however, is just how efficient computers are at converting electricity into heat.
Every single bit of electricity a computer uses (as well as all the electricity used by the peripherals like monitors, printers, and so on) is eventually released as heat.
In fact, assuming you set a space heater to use the same energy as the computer uses, there would be no ultimate difference in the temperature of the room between running the space heater and the computer. Both use electricity to operate and both “shed” the waste heat into the room in the end.
You could run the test yourself, but if you’d prefer just to read the results of someone else running a computer vs. space heater showdown, you can rest easy knowing it’s been done. Back in 2013, Puget Systems, a custom PC building company, ran a test for fun to see if a computer really would function exactly like a space heater under equivalent conditions.
They loaded up a PC with enough GPUs and hardware to match the output of the basic little 1000W space heater they’d purchased for the experiment and tested them in a room isolated from the building’s HVAC system. The end result? Running the gaming PC under load to match the output of the 1000W space heater as closely as possible yielded an equivalent increase in ambient temperature.
We’re sure this is zero surprise to any physics students reading along at home. Electrical energy put into a system has to go somewhere, and it goes into the room as heat. Whether the source is an electric motor on a fan, a computer, a space heater, or even a toaster, the heat eventually makes its way into the room.
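If you want to put numbers on that idea, here’s a minimal sketch (the wattage figures are just examples) converting a device’s electrical draw into heat output, including the BTU-per-hour units HVAC equipment is rated in:

```python
# Any device drawing P watts dumps P joules of heat into the room every
# second, no matter what "useful work" it does along the way.

def joules_per_hour(watts: float) -> float:
    # 1 watt = 1 joule per second, and there are 3600 seconds in an hour
    return watts * 3600

def watts_to_btu_per_hour(watts: float) -> float:
    # 1 watt sustained equals roughly 3.412 BTU per hour
    return watts * 3.412

# A 1000W space heater and a PC drawing 1000W under load heat the room
# identically: about 3,412 BTU/hr either way.
print(watts_to_btu_per_hour(1000))
```

The only difference between the two devices is what the electricity does on its way to becoming heat.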
As an aside, we’d argue that computers are—in a philosophical sense, not a strictly physical sense—even more efficient than a space heater. A space heater turns 100% of the electrical input into heat, and a computer turns 100% of the electrical input into heat, but a space heater is limited to simply heating or not heating.
A computer, on the other hand, actually does all sorts of useful and interesting things for you while making the room a bit toastier. You can run Doom on a lot of things, after all, but you can’t run it on your space heater.
How to Calculate How Much Heat Your Computer Generates
It’s one thing to know that the electricity your computer is using will eventually end up as heat. It’s another thing to drill down to exactly how much heat it’s actually pumping into your home.
There’s a wrong way and a right way to get to the bottom of the issue, though, so let’s dig in.
Don’t Use the Power Supply Rating to Estimate
The first thing you should avoid is looking at the rating of the power supply as an indicator of how much heat your computer generates.
The Power Supply Unit (PSU) on your desktop PC might be rated for 800W or the fine print on the bottom of your laptop’s power brick might indicate it’s rated for 75W.
But those numbers don’t indicate the actual operating load of the computer. They simply indicate the maximum upper threshold. An 800W PSU doesn’t suck down 800W every second it is in operation—that’s the peak load it can safely provide.
To further complicate things, computers don’t have a steady state when it comes to power consumption. If you have a space heater with a low, medium, and high setting of 300, 500, and 800 watts, respectively, then you know exactly how much energy is being consumed at each setting level.
With a computer, however, there is a whole curve of power consumption beyond something as simple as High/Low. This curve includes everything from the tiny amount of power a computer needs to stay in sleep mode, to the modest amount of power it uses for simple daily tasks like browsing the web and reading emails, all the way up to the higher amount of power required to run a high-end GPU while playing a demanding game.
You can’t simply look at a power label and calculate anything based on that, other than calculating the absolute maximum amount of energy the device might use.
Do Use a Tool to Measure Actual Wattage
Instead of estimating based on the label, you need to actually measure. To measure accurately, you need a tool that reports the watt consumption of your computer and peripherals. If you have a UPS unit with an external display that shows the current load (or software that lets you check the load stats via a USB uplink), you can use that.
We’d consider a UPS a crucial piece of hardware for everything from your desktop PC to your router, so if you don’t have one, now is a great time to pick one up.
If you don’t have a UPS (or your model doesn’t report energy use), you can also use a stand-alone power meter like the Kill A Watt. We love the Kill A Watt meter, and you’ll see us using it frequently, such as when showing you how to measure your power consumption or answering questions like how much it costs to charge a battery.
You just plug the Kill A Watt into the wall, plug your computer’s power strip into the device (so you can measure both the computer and the peripherals), and then check the readout. Easy peasy.
If you actually measure, you’ll quickly see that the rating of the power supply differs from the actual power consumption, often by a wide margin.
Here’s a real-world example: I monitored the power consumption of my desktop computer with both the meter built into the UPS and a Kill A Watt meter just to double-check the UPS read-out was accurate.
The PSU in this machine is rated for 750W. But when powered on and idling (or doing very basic tasks like writing this article or reading the news) the power consumption hovers around 270W. Playing relatively lightweight games pushed it up into the 300W range.
When put under load either by playing more demanding games or running a stress-test type benchmark app like 3DMark that taxes the processor and GPU, the power consumption rises to around 490W. Despite a few moments flickering slightly above 500W, at no point did the PC come even close to hitting the 750W PSU rating.
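Measured wattages like these translate directly into heat pumped into the room over a session. Here’s a rough sketch using the example figures above (the session length is an assumption; plug in your own measurements):

```python
# Example figures from the measurements discussed above; your numbers
# will differ, which is exactly why you measure.

def heat_output_kwh(watts: float, hours: float) -> float:
    """Electrical energy drawn over the session, all eventually shed as heat."""
    return watts * hours / 1000

idle_watts = 270      # measured while idling / light desktop work
gaming_watts = 490    # measured under gaming / benchmark load
session_hours = 3     # assumed length of an evening session

print(heat_output_kwh(idle_watts, session_hours))    # 0.81 kWh
print(heat_output_kwh(gaming_watts, session_hours))  # 1.47 kWh
```

Every one of those kilowatt-hours ends up warming your room, the same as if a small space heater had been running.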
This is just an example, of course, and your setup might consume more or less power than mine, which is exactly why you have to measure it to get to the bottom of things.
What to Do With That Information
Unfortunately, we can’t tell you “OK, so your computer adds 500 watts of heat to your room, so it will raise the temperature of the room 5 degrees Fahrenheit over 1 hour,” or any such thing.
There are simply too many variables at play. Maybe your home is a super-insulated concrete structure with triple-pane windows and an R-value insulation rating on par with a YETI cooler. Or maybe you live in an old farmhouse with non-existent insulation, a steady draft, and single-pane windows.
The time of year also plays a role. When the sun is beating down on your home in the summer that extra bit of heat radiating off your gaming PC might make an otherwise bearable room unbearably warm. But in the winter it might, instead, feel quite cozy.
So while that 500 watts of heat (or whatever the value may be for your setup) will enter the space regardless, because all the electricity eventually becomes waste heat, what that waste heat means for your comfort and the temperature of the room is quite variable. If you want to see the actual degrees-Fahrenheit change right before your eyes, put a tabletop thermometer in the room; a model that logs readings to your phone is handy for both at-a-glance info and tracking the data over time.
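To see why no simple answer exists, consider the idealized ceiling: a perfectly sealed, perfectly insulated room where the computer heats only the air. This sketch (room size and wattage are assumptions for illustration) shows how extreme that upper bound is:

```python
# Idealized upper bound ONLY: assumes a sealed, perfectly insulated room
# and heats just the air. Real walls, furniture, drafts, and HVAC absorb
# most of the heat, so actual temperature rises are far smaller.

AIR_DENSITY = 1.2          # kg per cubic meter, near sea level
AIR_SPECIFIC_HEAT = 1005   # joules per (kg * kelvin)

def ideal_temp_rise_c(watts: float, hours: float, room_volume_m3: float) -> float:
    energy_joules = watts * hours * 3600
    air_mass_kg = room_volume_m3 * AIR_DENSITY
    return energy_joules / (air_mass_kg * AIR_SPECIFIC_HEAT)

# 500 W for one hour in an assumed 4 m x 4 m x 2.5 m room (40 cubic meters):
print(round(ideal_temp_rise_c(500, 1, 40), 1))  # 37.3
```

A 37-degree-Celsius rise obviously never happens in practice, which is a good illustration of how much heat the room itself (and everything in it) soaks up.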
Overall, though, whether you put a thermometer on the desk next to your gaming rig or not, you’ll have to assess how much power use (and subsequent heat) you’re willing to tolerate based on your computer setup, your home setup, and what kind of cooling options are available to you.
Further, you might want to consider shifting your use based on your needs and the weather. For example, if you’re actually doing some serious need-my-GPU gaming, then you might need to fire up your desktop PC to get the experience you want.
Responding to emails or just doing some light office work? Maybe fire up the laptop instead and drop the heat energy getting pumped into the room from 300W to 50W or less. Lots of “light” games run fine on a laptop too, so you don’t always need to turn on the desktop rig to game.
Just dinking around on Reddit or reading the news? Maybe skip the desktop or laptop altogether and do those activities on your phone or tablet. At that point, you’ve dropped the energy expenditure from hundreds of watts to a few watts—and kept your living space significantly cooler in the process.
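The device-shifting idea above is easy to quantify. This sketch uses rough, assumed wattages (measure your own devices for real numbers) to compare the heat each choice dumps into the room over a casual evening:

```python
# Assumed ballpark wattages for illustration only; measure your own gear.
devices_watts = {
    "desktop PC + monitor": 300,
    "laptop": 50,
    "tablet or phone": 5,
}

hours = 4  # an assumed evening of light browsing

for name, watts in devices_watts.items():
    kwh = watts * hours / 1000
    print(f"{name}: {kwh:.2f} kWh of heat into the room")
```

Swapping the desktop for a phone for light tasks cuts the heat added to the room by well over 95 percent in this example.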
But hey, if you don’t want to give up all those hours of gaming (nor do you want to add heat to your home and get sweaty in the process) you could always use a window air conditioner in your gaming room of choice to both stay comfortable and extract the extra heat your gaming rig introduces.
Source: www.howtogeek.com