Moore’s Law: Past, Present and Future – Part 1

How familiar are you with “Moore’s Law,” the key observation about the complexity and density of integrated circuits that has shaped the landscape of computing for the past fifty years? Many of our readers who follow or work in the computer industry likely know it well. For those who may be less familiar, this post provides a brief background as we begin a series on the history of Moore’s Law and its implications for the future of computing.

In the mid-1960s, Intel co-founder Gordon Moore observed that the number of transistors on an integrated circuit was effectively doubling each year. He also predicted that chip manufacturing costs would drop correspondingly, as silicon real estate was used more efficiently and more components could be packed onto a single chip.

On its website, Intel Corporation describes the price and economic impact that Moore’s observation about the chip industry foretold:

Performance—aka power—and cost are two key drivers of technological development. As more transistors fit into smaller spaces, processing power increased and energy efficiency improved, all at a lower cost for the end user. This development not only enhanced existing industries and increased productivity, but it has spawned whole new industries empowered by cheap and powerful computing.

However, it is important to point out that the name “Moore’s Law” is a little misleading: it is not a law proven by years of rigorous scientific research, but an expert observation and prediction that Gordon Moore made decades ago. Nor has Moore’s Law ever been perfectly accurate; in the mid-1970s, for example, the doubling period for components was revised from one year to two years to reflect industry trends and technological advances.
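
To make the arithmetic concrete, here is a minimal sketch in Python comparing the original one-year doubling cadence with the revised two-year cadence. The starting count of roughly 2,300 transistors (about that of the Intel 4004 from 1971) is chosen purely for illustration:

    # Moore's Law as simple exponential growth: N(t) = N0 * 2**(t / T),
    # where N0 is the starting transistor count, t is years elapsed, and
    # T is the doubling period (1 year originally, 2 years after the
    # mid-1970s revision).

    def transistors(n0: int, years: float, doubling_period: float) -> int:
        """Projected transistor count after `years` years."""
        return round(n0 * 2 ** (years / doubling_period))

    n0 = 2_300  # illustrative starting point, roughly the Intel 4004
    for years in (2, 10, 20):
        one_year = transistors(n0, years, 1)
        two_year = transistors(n0, years, 2)
        print(f"after {years:2d} years: "
              f"1-year cadence -> {one_year:>13,}; "
              f"2-year cadence -> {two_year:>13,}")

Running this shows how quickly the two cadences diverge: after twenty years, annual doubling implies a transistor count about a thousand times larger than biennial doubling, which is why getting the period right mattered so much for long-range forecasts.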

Ars Technica provides a good summary of some of the key misconceptions about Moore’s Law:

Gordon Moore’s observation was not driven by any particular scientific or engineering necessity. It was a reflection on just how things happened to turn out. The silicon chip industry took note and started using it not merely as a descriptive, predictive observation, but as a prescriptive, positive law: a target that the entire industry should hit.

As noted above, the chip-making industry embraced Moore’s Law in spite of its flaws and gravitated toward using it to set innovation and design targets. Over the years, industry groups such as the Semiconductor Industry Association, home to the world’s leading chipmakers, took Moore’s Law into account when preparing their yearly roadmaps, since it kept the industry moving forward in a more or less synchronized fashion.

In our next post on Moore’s Law, we will look at recent news reports questioning whether Moore’s Law has reached its end and, if so, what predictions we can make for the chip industry (and the tech industry at large) as a result.

Thanks for reading today’s Tech Blog! Care to share your thoughts about Moore’s Law or related topics? Drop us a line via social media or our Contact Us form and let us know, along with what you might like to see in future posts!

About AMI

AMI is Firmware Reimagined for modern computing. As a global leader in Dynamic Firmware for security, orchestration, and manageability solutions, AMI enables the world’s compute platforms from on-premises to the cloud to the edge. AMI’s industry-leading foundational technology and unwavering customer support have generated lasting partnerships and spurred innovation for some of the most prominent brands in the high-tech industry. 