Slashdot wrote: A new paper from Microsoft Research (PDF) suggests a radical but slightly mad scheme for dealing with some of the more basic problems of the data centre. Rather than build server farms that produce a lot of waste heat, why not have distributed Data Furnaces that heat homes and offices at the same time as providing cloud computing? This is a serious suggestion and they provide facts and figures to make it all seem viable. So when it gets cold all you have to do is turn up the number crunching ...
This I actually find really interesting - I've noticed that my PC plays its part in keeping my room relatively warm, and that's at an extremely small scale. It makes some sense: instead of just venting our PCs' heat to the atmosphere, how can we reuse it? Would love to get input from the Folding@Home crowd, or some Bitcoin miners.
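As a rough sketch of why this works at all: essentially every watt a PC draws from the wall ends up as heat in the room, so watt for watt it heats like a resistive space heater. The figures below (300 W draw, 8 hours a day) are my own illustrative assumptions, not numbers from the paper.

# Rough sketch: a PC's electrical draw ends up almost entirely as room heat,
# so it heats like a resistive space heater of the same wattage.
# All numbers here are illustrative assumptions, not the paper's figures.

pc_draw_watts = 300    # assumed average draw under load (e.g. while folding)
hours_per_day = 8      # assumed daily runtime

heat_kwh_per_day = pc_draw_watts * hours_per_day / 1000
print(f"Heat delivered to the room: {heat_kwh_per_day:.1f} kWh/day")  # 2.4 kWh

# A 300 W space heater running the same 8 hours delivers the same 2.4 kWh,
# so the PC's heat is a free bonus only if the computation itself is useful.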
There's also an interesting point in the original article's comments: we treat heating and cooling devices as individual units instead of integrating them for efficiency. An example: the Debonairs just down the road runs a heater for its staff in winter, yet vents the heat generated by its pizza ovens straight outside...
Techno people and geeks going green... hmmm... this could help decrease your carbon footprint...
During peak times they ask us to switch off our geysers (water heaters) and other appliances... why not computers?
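That's already scriptable for the distributed-computing crowd mentioned above. A minimal sketch, assuming BOINC's standard command-line tool boinccmd is installed, and taking 17:00-20:00 as the peak window (my assumption; substitute your utility's actual hours):

# Sketch: pause BOINC work during an assumed evening peak, resume after.
# Assumes boinccmd (BOINC's stock CLI) is installed and on the PATH.
import subprocess
from datetime import datetime

PEAK_START, PEAK_END = 17, 20  # assumed peak-demand hours (17:00-20:00)

def set_compute(on: bool) -> None:
    # --set_run_mode {always|never} is part of the standard BOINC CLI
    mode = "always" if on else "never"
    subprocess.run(["boinccmd", "--set_run_mode", mode], check=True)

# Shed the crunching load during the peak window, resume outside it.
hour = datetime.now().hour
set_compute(not (PEAK_START <= hour < PEAK_END))

Run it from cron every hour and the PC sheds load much like a geyser on a load-control relay.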
"This eBook is displayed using 100% recycled electrons."
It is a good idea. I remember the hassle they had at Barclays when trying to calibrate the air conditioning: 450-plus computers on the floor heat the place up quite a bit and threw their calculations off.
I already use my PC to keep my room warm in winter, and my laptop also runs 24/7 doing Folding@Home.
The only issue is that in summer my room gets very hot, so I've now got a USB fan keeping the air circulating, which helps keep me cool.
Is it financially viable, though? Looking at CPUs over the past two decades, they ran hotter as clock frequencies climbed, and then heat output took a dive. I remember the Tom's Hardware video where both AMD CPUs fried and died, while the Intel P3 crashed at a fairly low temperature and the P4 slowed to a crawl but didn't die or crash. I'd guess my i7 is cut from that newer crop and won't overheat unless pushed beyond its operating limits.
GPUs, on the other hand, went the opposite way, with the GTX 580 being a melter. Even the Radeon 5xxx series had overheating issues.
Unless we start living more integrated with our machines, I don't see our PCs keeping us that warm through cold winters. And come summer we'd face a new problem: having to cool both ourselves and our PCs.
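On the "keeping us warm in winter" point, the scale is easy to sketch; the 6 kW heating load and 250 W per server below are my own rough assumptions, not the paper's figures:

# Sketch: how many always-on servers would it take to replace a home's
# winter heating? Both numbers below are rough assumptions of mine.
home_heat_demand_watts = 6000   # assumed cold-day heating load for a house
server_dissipation_watts = 250  # assumed per-server draw, nearly all heat

servers_needed = home_heat_demand_watts / server_dissipation_watts
print(f"Servers needed: {servers_needed:.0f}")  # ~24 under these assumptions

# A couple of dozen servers is a small cabinet, not one desktop -- which
# backs up the point above: a single PC won't carry a cold winter, but a
# purpose-built "data furnace" in the basement plausibly could.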
I used to work in a factory where we proposed that excess heat off the furnaces be used to heat water for the showers, etc.
The payback period was too long and maintenance of the system was an issue; the verdict was something along the lines of "it might interfere with production and production maintenance".
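For anyone curious why these proposals die, the killer is usually the simple-payback sum; every figure below is invented purely for illustration:

# Sketch of a simple payback calculation; all figures here are invented
# purely to illustrate why such proposals tend to get rejected.
capital_cost = 80_000        # assumed cost of the heat-recovery plumbing
annual_savings = 9_000       # assumed yearly saving on water heating
annual_maintenance = 3_000   # assumed upkeep of the extra system

payback_years = capital_cost / (annual_savings - annual_maintenance)
print(f"Simple payback: {payback_years:.1f} years")  # ~13.3 years here

# Plants commonly want payback inside two or three years, so a number
# like this (plus the risk to production) makes it an easy "no".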
A surprising amount of modern pseudoscience is coming out of the environmental sector. Perhaps it should not be so surprising given that environmentalism is political rather than scientific.
Timothy Casey