Engineers at the University of California, Berkeley, have carried out the first experimental confirmation of the “Landauer limit”, a 1961 prediction of the minimum energy that any computing operation must dissipate under the laws of physics. Using magnetic chips operating near this limit, the research shows that the amount of energy used in computers could be cut by a factor of a million.
Called a “breakthrough for energy-efficient computing”, the technology is still a long way from practical production. But we now know it is possible to dramatically reduce the power used by modern processors like those in computers and smartphones.
The technology demonstrated by UC Berkeley researchers uses “magnetic computing”, a newly developed approach in which tiny magnetic bars take the place of conventional transistors, which rely on the movement of electrons to switch between zeros and ones.
In conventional computing it takes a lot of energy to keep the two states clearly differentiated, with one at a much higher energy than the other. In magnetic computing, by contrast, the bits are distinguished by their direction, and it takes just as much power to point a magnetic bit right as it does to point it left.
Study author Jeffrey Bokor, a professor of electrical engineering and computer sciences and a faculty scientist at UC Berkeley, explained that because these are two states of equal energy, the research team does not throw energy away by creating a high state and a low state.
He said that he and his colleagues wanted to know how far they could reduce the energy required for computing, given that shrinking energy consumption is the biggest challenge in designing computers and all electronics today.
For many years, research relied on Moore’s law to pack ever faster and smaller transistors onto chips, but the focus has recently shifted towards energy efficiency. Bokor pointed out that making transistors faster requires more energy, which causes them to run extremely hot and eventually melt.
What’s behind the breakthrough?
By testing the magnetic bits, the team was able to confirm the Landauer limit, a principle named after IBM Research Lab’s Rolf Landauer, who showed in 1961 that every bit operation in any computer must dissipate a certain minimum amount of energy.
The principle is based on the second law of thermodynamics. As any physical system is transformed, going from a more ordered state to a less ordered one, its disorder, known as entropy, increases, and that lost order is given off as waste heat.
Landauer’s formula calculates this minimum energy required per computer operation, and the result depends on the machine’s operating temperature.
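As a worked illustration (the article itself does not quote the formula), Landauer’s limit for erasing one bit at temperature $T$ can be written and evaluated at room temperature:

$$
E_{\min} = k_B T \ln 2
$$

$$
E_{\min} \approx (1.38 \times 10^{-23}\,\mathrm{J/K}) \times (300\,\mathrm{K}) \times 0.693 \approx 2.9 \times 10^{-21}\,\mathrm{J}
$$

Here $k_B$ is the Boltzmann constant; the result, a few zeptojoules per bit operation, is roughly a million times less than what transistors in today’s processors dissipate, consistent with the savings the researchers describe.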
The discovery is especially significant for mobile devices, which need powerful processors that can run for a day or more on small batteries, the researchers say.
“The significance of this result is that today’s computers are far from the fundamental limit and that future dramatic reductions in power consumption are possible”, study authors wrote.
Source: Daily Mail