i wonder how a computer's energy usage scales when you account for waste heat that needs to be removed


@kirakira my autistic brain is having a bit of trouble today, but I would just mention a line in a presentation I liked:

"When it comes to computing, power is literally power"

That is, the watts drawn from the wall are the total power, and the only question is how many MIPS/FLOPS you get out of it as it all becomes heat

Worth noting is a fundamental relation for the power it takes to "switch" a CMOS gate (which is what is carved into the silicon): it scales linearly with switching frequency but with the *square* of the supply voltage, and since voltage has to rise with clock speed, power grows superlinearly with frequency
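A quick back-of-envelope sketch of the classic dynamic CMOS power model, P ≈ αCV²f. All the numbers here are made up, order-of-magnitude values for a hypothetical chip, just to show why pushing the clock up costs more than proportionally more power:

```python
# Classic dynamic CMOS switching power model: P = alpha * C * V^2 * f.
# The square is on the supply voltage, but voltage must rise along with
# frequency, so power grows superlinearly with clock speed.
# All values below are illustrative assumptions, not real chip specs.

def dynamic_power(alpha, c_farads, v_volts, f_hz):
    """Dynamic switching power of a CMOS circuit, in watts."""
    return alpha * c_farads * v_volts**2 * f_hz

# Hypothetical chip: activity factor 0.1, 1 nF effective switched capacitance.
base = dynamic_power(0.1, 1e-9, 1.0, 3e9)    # 3 GHz at 1.0 V
fast = dynamic_power(0.1, 1e-9, 1.2, 4.5e9)  # 1.5x the clock needs 1.2 V

print(f"base: {base:.2f} W, fast: {fast:.2f} W, ratio: {fast/base:.2f}x")
```

A 1.5x clock bump ends up costing roughly 2.2x the power in this toy model, which is the kind of scaling that capped clock rates.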


@kirakira electronics convert almost all of the electricity they draw into heat (minus a little for fans, displays, LEDs, speakers, antennas). so an increase in power usage of 100 W also increases cooling requirements by 100 W


@jhwgh1968 @kirakira This was something I didn’t know (or had long forgotten). No wonder clock rates have topped out. I always thought it was due more to something like RF effects than to power


@kirakira in the winter time, our computers help keep the house heated.


@LanceJZ @kirakira

If I run a compute-intensive game in the summer, my air conditioner has to work harder.

So yeah, energy usage definitely goes up.
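To put a rough number on "works harder": an air conditioner moves more heat than the electricity it consumes, and the ratio is its coefficient of performance (COP). The COP of 3 below is a typical assumed value, not a figure from the thread:

```python
# Rough sketch: heat a computer dumps indoors must be pumped back out by
# the AC. An AC with COP ~3 uses 1 W of electricity to move ~3 W of heat.
# The COP value is an assumption for illustration.

def total_wall_power(pc_watts, ac_cop=3.0):
    """Total extra wall draw: the PC itself plus the AC work to remove its heat."""
    return pc_watts + pc_watts / ac_cop

print(total_wall_power(300))  # a 300 W gaming session costs ~400 W total
```

So in summer, every watt the game burns costs roughly a third of a watt extra at the meter on top of the watt itself.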

I guess if you want a data center, you should put it on a really really cold planet. Either it will keep your data center cold indefinitely, or you'll raise the planet's temperature such that it's suitable for human life. Win-win!


@LanceJZ @kirakira

Mars would be a good candidate for that planet, by the way. Mean surface temperature is -60°C. And unlike moons and asteroids, Mars has an atmosphere, so you can do air cooling.

Now we just need warp drive with which to get there in a timely fashion.


@kirakira TDP pretty much literally equals power draw, which is why it's measured in watts even though it's a measure of how much heat a component emits


@LanceJZ yeah though on a warming planet i would think the net would be negative
