
Sidney 2nd July 2005 08:49

Hear My Bull Shit
 
Just finished a dumb movie, "Blade Trinity", and I couldn't help staring down at the MBM temperature: 33°C, with room temperature at a comfortable ~23°C. This is the same Venice 3000+ clocked at 2.7GHz at a bit over 1.6V that I wrote an article about last month. I am still amazed that AMD managed to produce this processor, with lower thermal output/loss, within six months of the Winchester core, while Intel has not managed to reduce the heat loss or raise the speed of the Prescott in more than two years.

While so many people believe processor thermal loss can only increase with speed, AMD is doing the opposite. Browsing THG, I read an interview with BFG, the video card company only several miles east of me: "With form factors of PCs getting smaller we have to come up with new cooling designs to get to the performance level the chip is able to deliver. This is especially the case when people are overclocking their systems," Herkelman said. While he does not believe that mainstream cards will require a cooling method other than a fan, he said that high-end cards may go a different way: "The most promising approach to cool a graphics card is water cooling." Here we have a company following Intel's path, believing that high thermal "loss" is the way technology is heading.

Loss is waste. We have managed to produce refrigerators, air conditioners, and microwave ovens with less energy loss and better efficiency; we call that improvement. Cars are made faster and use less fuel. Why must we consider higher heat loss in computers and graphics cards the normal way of growth? Thank God that AMD thinks differently. I hope Nvidia and ATI are thinking the same, as I have given up hope on Intel.

What if we stopped buying processors and graphics cards that generate high thermal loss? What would the chip makers be thinking then? What if the cost of electricity went up ten times?

I don't care if Dell, HP, Gateway, Toshiba, Sony or Apple don't buy Intel. I do care if consumers don't know about the higher electric bill they will be paying and the extra cooling cost associated with Intel. I like the low thermal loss of the Venice. It runs quieter and cooler, with great overclocking ability. I said the same of Intel when they switched from the Willamette core to Northwood.
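To put rough numbers on that electric bill, here is a minimal back-of-envelope sketch; the wattage figures, usage hours and electricity price below are my own assumptions for illustration, not measured values:

[code]
# Back-of-envelope: yearly electricity cost difference between two CPUs.
# All figures below are illustrative assumptions, not measured values.

HOURS_PER_DAY = 8          # assumed hours of use per day
DAYS_PER_YEAR = 365
PRICE_PER_KWH = 0.10       # assumed electricity price (USD per kWh)

def yearly_cost(watts):
    """Yearly electricity cost of a constant load drawing `watts`."""
    kwh = watts / 1000 * HOURS_PER_DAY * DAYS_PER_YEAR
    return kwh * PRICE_PER_KWH

venice_watts = 67          # assumed draw for a Venice-class load
prescott_watts = 115       # assumed draw for a Prescott-class load

extra = yearly_cost(prescott_watts) - yearly_cost(venice_watts)
print(f"Extra cost per year: ${extra:.2f}")
print(f"At 10x electricity prices: ${extra * 10:.2f}")
[/code]

With those assumed numbers the difference is only a dozen or so dollars a year per machine, but multiply it by ten-times-higher electricity prices, or by millions of machines, and the "loss" stops looking free.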

Unless we want water cooling and huge heatsinks to become status symbols, why would we wish for new processors to run so hot that most people would give up the three-year warranty in exchange for that status? Similarly, a 14-dBA 160mm fan will soon become a necessity just to maintain that status.

Then there are the few who enjoy a 2.4C or AXP-M with a smile on their faces, a few hundred dollars extra in their wallets, and only a few seconds behind on the tasks they must do; or the Venice/San Diego owners gaining a few seconds ahead without the worry of pump failure, which reminds me of how the fish died a couple of years ago.

jmke 2nd July 2005 09:33

"I hope Nvidia and ATI are thinking the same as I have given up hope on Intel"

the newly released 7800GTX uses less power than the 6800 series :)

Sidney 2nd July 2005 19:23

Oh yes, I forgot about that. So, if the next ATI card is less power hungry than the previous one, Intel will be very lonely. Unless they start making water cooling alongside processors and chipsets. :)

