Moore's Law? It Crashed.

True, it's 50 years old. Or 40. Or maybe a little less. It ultimately depends on which version you consider. While the original statement dates from 1965, Gordon Moore, one of the founders of Intel, revisited his famous law in 1975 to correct its interpretation. Then a third version, apocryphal but inspired by the famous law, eventually took over. Moore's Law? It crashed.

The initial statement held that the number of transistors on an integrated circuit doubles every year, at minimal cost. This first version was based on an observation Gordon Moore had made at the time; it was revised in 1975, with the number of transistors now doubling every two years. A popular version of these first laws was subsequently retained, which argues that computing power doubles every eighteen months.
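To get a feel for what such a doubling rate implies, here is a small illustrative calculation in Python, taking Intel's first microprocessor, the 4004 of 1971 with its roughly 2,300 transistors, as a starting point and applying the 1975 formulation of doubling every two years. The printed figures are projections of the trend, not the transistor counts of any actual later chip.

```python
# Project transistor counts under "doubling every two years",
# starting from the Intel 4004 (1971, ~2,300 transistors).
# Purely illustrative of the exponential trend, not real chip data.
start_year, start_count = 1971, 2_300

for year in range(1971, 2016, 10):
    doublings = (year - start_year) / 2
    projected = start_count * 2 ** doublings
    print(f"{year}: ~{projected:,.0f} transistors")
```

Even from such a modest starting point, the projection reaches the billions within four decades, which is why the law could only hold as long as manufacturing kept pace.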
 
So we should really speak of Moore's laws, in the plural, and not all of them are equal. The 1965 version of the law, the one whose fiftieth anniversary we are celebrating today, has not held since 1975. The updated version is now at the end of its cycle: chipmakers can certainly keep shrinking the process node, but the processors that benefit are no longer the cheapest on the market. As for the popular version, it has slowed since 2004: manufacturers now face heat-dissipation limits and are turning to multi-core designs to work around the problem. Moore's laws have thus hit a few snags, but while these principles may now fall short of the term "law", they were never binding in that sense: Moore's Law is not a physical law but an empirical one. In practice, it mainly served as a guide for the entire industry.

In fact, and this is no coincidence, Gordon Moore was one of the founders of Intel, today the leader of the semiconductor market, which has sought to stick as closely as possible to the schedule laid down by these famous laws. For manufacturers, it was the goose that laid the golden eggs: for 30 years, the exponential evolution of processors, and therefore of software, ensured that users renewed their equipment every three to five years. That guaranteed renewal helped justify the heavy investments required to develop new process technologies: a self-sustaining cycle that turned for 30 years, much to Intel's delight.

But after 50 years of loyal service, Moore's Law is running out of steam. Manufacturers are beginning to hit physical limits. On the one hand, the industry has abandoned the economic component of the law, namely that the number of transistors doubles at regular intervals for entry-level processors. On the other hand, it faces heat dissipation and rising power consumption, which have held back progress. Finally, on a longer horizon, it is hard to see the process node moving much below 5 nm: at that scale, matter no longer behaves the same way and costs explode.

Yet ways out are emerging. The early 2000s saw the arrival of multi-core processors, which work around the power-consumption and overheating problems manufacturers were facing. There are new ideas to find and new avenues to explore. Awareness of processor power consumption has grown. Likewise, the scale of values is changing: people today pay much less attention to clock speed and more to the other components of their machines. In the longer term, new technologies such as memristors or quantum computers are beginning to appear and could transform performance.

But the challenge is not only in the hands of the chipmakers; it is also in the hands of IT specialists and developers, who need to adapt their programs to take better advantage of these new processor architectures. So yes, Moore's laws are playing their swan song and could well become, in the next few years, the symbol of a blessed era when Intel reigned supreme. But as Gordon Moore admitted in the interview he gave for the fiftieth anniversary of his law, he himself never really expected such longevity.
 
This is not the end of the world. The technology sector has already begun to try different ways of mitigating the slowdown in Moore's Law, which has been evident in recent years. But the latest warning from Intel highlights a set of forces that will have a profound impact on the processor industry, on the technology sector as a whole and on the rest of the world. The von Neumann architecture is suddenly out of its depth.

The von Neumann architecture is a computer architecture based on a design described in 1945 by the mathematician and physicist John von Neumann and others. It consists of a processing unit containing an arithmetic logic unit and processor registers, a control unit containing an instruction register and a program counter, a single memory that stores both data and instructions, external mass storage, and input and output mechanisms.
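To make the picture concrete, here is a small, purely illustrative sketch in Python of a stored-program machine: a single memory holds both the program and its data, and a program counter drives the fetch-decode-execute loop. The tiny instruction set (LOAD, ADD, STORE, HALT) and the memory layout are invented for the example.

```python
# A toy von Neumann machine: one memory holds both the program and the data,
# a program counter (pc) fetches instructions, and an accumulator plays the
# role of a register fed by the arithmetic logic unit.

def run(memory):
    pc = 0          # program counter
    acc = 0         # accumulator register
    while True:
        op, arg = memory[pc]          # fetch the instruction at address pc
        pc += 1                       # advance to the next instruction
        if op == "LOAD":              # decode and execute
            acc = memory[arg]
        elif op == "ADD":
            acc += memory[arg]
        elif op == "STORE":
            memory[arg] = acc
        elif op == "HALT":
            return memory

# Addresses 0-3 hold the program, addresses 4-6 hold the data.
program_and_data = [
    ("LOAD", 4),    # acc = memory[4]
    ("ADD", 5),     # acc += memory[5]
    ("STORE", 6),   # memory[6] = acc
    ("HALT", None),
    2, 3, 0,        # data: two operands and a slot for the result
]

print(run(program_and_data)[6])   # prints 5
```

The important point is the shared memory and the single instruction stream: every value and every instruction has to pass through the same narrow channel between memory and processor, which is exactly the bottleneck the industry is now running into.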

One response has been to design processors optimized for specific tasks. The main markets developing around this idea are graphics processing units (GPUs), which split data-intensive tasks into many small pieces that can be processed in parallel, and field-programmable gate arrays (FPGAs), chips that can be reprogrammed for specific purposes.
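As a rough illustration of that data-parallel style, the sketch below applies the same operation to every element of a large array at once. NumPy running on the CPU is only a stand-in here; on a GPU, each element would be handled by one of thousands of lightweight hardware threads. The workload itself (brightening pixel values) is invented for the example.

```python
import numpy as np

# One million pixel brightness values; the same scaling and clamping is
# applied to every element independently, which is exactly the kind of
# workload a GPU spreads across thousands of cores.
pixels = np.random.rand(1_000_000)
brightened = np.clip(pixels * 1.2, 0.0, 1.0)

print(brightened[:5])
```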

A second response has been to concentrate research and manufacturing in a small group of players able to keep pace with the science and with the climbing equipment costs of each new generation. Only Intel and IBM, whose work feeds alliances with Samsung, TSMC and Global Foundries, remain at the heart of the research.

Ironically, the harder it becomes to keep pace with Moore's Law, the more successful Intel's strategy becomes. Intel has always counted on being first to each new generation of processors. It just needs to stay ahead: even when the overall race slows down, the others will struggle to compete.

Meanwhile, the rest of the technology world has adapted. A shortage of software engineers able to take full advantage of new processor models has been partly responsible for the slowdown. But as processor analyst Patrick Moorhead explains: "Adding people to the programming army is easier than changing the laws of physics."

Companies like Google, at the forefront of IT, are working to optimize their data centers to improve digital productivity. However, the slowdown will be felt sooner or later; Moore believes that within ten years the change of pace will be palpable. Scientific advances may offer some hope. IBM has trumpeted its research on materials like graphene, and quantum processors hold the theoretical promise of virtually unlimited computing power. New models are also appearing, such as IBM's SyNAPSE chip, a processor that mimics the human brain. But the practical value of these experiments is impossible to predict.

High-performance computing techniques and networks, developed since the mid-1980s, are referred to as HPCN (High-Performance Computing and Networking). For some, parallel computing seems the best way to prepare while the doomsday bells ring. It involves the simultaneous and optimal use of multiple hardware and software resources (processors, disks, memory, communication lines). With this technology we can deliver results that are more accurate, obtained faster and that better reflect real phenomena, the objective being real-time processing of data for the user.
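As a minimal illustration of the idea, the sketch below splits one computation across several processor cores using Python's standard multiprocessing tools. The workload (summing squares over chunks of a range) is invented for the example; what matters is the pattern of dividing the work, running the parts simultaneously and combining the results.

```python
from concurrent.futures import ProcessPoolExecutor

def sum_of_squares(bounds):
    """Compute the sum of i*i over one chunk of the range."""
    start, stop = bounds
    return sum(i * i for i in range(start, stop))

if __name__ == "__main__":
    n = 10_000_000
    workers = 4
    # Split [0, n) into one chunk per worker.
    step = n // workers
    chunks = [(k * step, n if k == workers - 1 else (k + 1) * step)
              for k in range(workers)]

    # Each chunk is processed simultaneously on its own core,
    # then the partial results are combined.
    with ProcessPoolExecutor(max_workers=workers) as pool:
        total = sum(pool.map(sum_of_squares, chunks))

    print(total)
```

Processes rather than threads are used here because CPython's global interpreter lock prevents pure-Python threads from running on several cores at once; the same divide-and-combine pattern applies either way.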

To work around the limit on how many more transistors an individual processor can usefully be packed with, engineers have begun increasing computing power by designing multi-core processors, systems of chips that perform calculations in parallel. This moderates the heat problem, because the clock can be slowed down. Imagine that every time the processor's clock ticks, the transistors fire. Instead of trying to speed up the clock to make all those transistors fire faster, you can slow the clock down and run activity on all the cores in parallel, as the rough calculation below suggests. And because parallelization is the key to complexity, in a certain way multi-core processors make computers work a little more like the human brain. The era of the computer, it seems, is coming to an end, paving the way for the era of the supercomputer.
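A back-of-the-envelope calculation shows why slowing the clock helps with heat. A common first-order model puts a chip's dynamic power at roughly P ≈ C · V² · f, and lowering the frequency usually allows the supply voltage to drop as well. The voltages, frequencies and capacitance constant below are made-up illustrative values, not measurements of any real processor.

```python
# Rough first-order model of dynamic power: P ~ C * V^2 * f.
# All numbers are illustrative, not data for any real chip.

def dynamic_power(capacitance, voltage, frequency_ghz):
    return capacitance * voltage ** 2 * frequency_ghz

C = 1.0  # arbitrary capacitance constant (same chip design in both cases)

# One core at 3 GHz and 1.2 V versus four cores at 1.5 GHz and 0.9 V.
single_core = dynamic_power(C, 1.2, 3.0)
quad_core = 4 * dynamic_power(C, 0.9, 1.5)

# The quad-core setup offers 4 * 1.5 = 6 GHz of aggregate clock cycles,
# twice the single core's 3 GHz, for roughly comparable power.
print(f"single core: {single_core:.2f}")  # ~4.32
print(f"quad core:   {quad_core:.2f}")    # ~4.86
```

Under these assumed numbers, four slower cores deliver about twice the aggregate cycles for a similar power budget, provided the software can actually spread its work across them, which brings us back to the developers.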
