Taking compute performance to the next level with cold computing

December 12, 2018 / Craig Hampel

When it comes to computing, hardware manufacturers are always looking for new ways to keep their chips running at lower temperatures. Lower temperatures let the same chips deliver more performance while consuming less energy and producing less heat.

While keeping a laptop or desktop cool is no great challenge, cooling larger systems such as data centres and supercomputers is generally far more difficult. TechRadar Pro spoke with Rambus’ Chief Scientist Craig Hampel to learn more about cold computing and how researchers are using this technique today.

Generally speaking, cold computing is the idea of decreasing the operating temperature of a computing system to increase its computational efficiency, energy efficiency or density. The most significant impact occurs when computing systems run at cryogenic temperatures.
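To give a rough sense of why cryogenic temperatures matter so much for energy efficiency, the sketch below is a toy illustration (not from the article) of one well-known effect: subthreshold leakage current in CMOS transistors falls off roughly exponentially as the thermal voltage kT/q shrinks with temperature. The threshold voltage `vth` and ideality factor `n` here are illustrative placeholder values, and the model ignores many real-device effects, so treat the numbers as order-of-magnitude only.

```python
import math

# Boltzmann constant (J/K) and elementary charge (C)
K_B = 1.380649e-23
Q_E = 1.602176634e-19

def thermal_voltage(temp_k: float) -> float:
    """Thermal voltage kT/q in volts at the given temperature (K)."""
    return K_B * temp_k / Q_E

def relative_leakage(temp_k: float, vth: float = 0.3, n: float = 1.5,
                     ref_temp: float = 300.0) -> float:
    """Toy subthreshold-leakage ratio: leakage at temp_k divided by
    leakage at ref_temp, using I_leak ∝ exp(-vth / (n * kT/q)).
    vth and n are illustrative, not measured device parameters."""
    def leak(t: float) -> float:
        return math.exp(-vth / (n * thermal_voltage(t)))
    return leak(temp_k) / leak(ref_temp)

# Compare room temperature (300 K) with liquid-nitrogen temperature (77 K)
print(f"Leakage at 77 K relative to 300 K: {relative_leakage(77.0):.2e}")
```

In this simplified model the leakage ratio at 77 K drops by many orders of magnitude relative to 300 K, which is one reason the article notes that the most significant efficiency gains appear at cryogenic temperatures.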