There’s no doubt high density interconnect (HDI) technology has allowed for smaller, sleeker, more powerful devices. More than that, it has transformed how circuit boards are routed and how they function. To fully appreciate how significant HDI is, it helps to understand the history of this technology and where it may be going in the future. From the Atomic Age to the new millennium, electronics have been driven by the dual needs of advanced functionality and smaller package sizes. In short, how can components be mounted more efficiently and effectively? It was this question that gave rise to high density interconnect technology, and with it, more powerful and efficient electronic devices. It is a question that continues to be asked today.
To understand these developments, you may begin with British scientist Geoffrey W.A. Dummer, who presented the concept of the integrated circuit at a symposium in Washington, D.C. in 1952. This conceptualization inspired Jack Kilby, a Texas Instruments engineer who, in 1958, became the first person to successfully build and demonstrate a working integrated circuit.
By 1965, Gordon Moore, co-founder of Intel, made the prediction that would become known as Moore’s Law: namely, that the number of transistors in a dense integrated circuit would double every two years. Hewlett-Packard took Moore’s Law to heart, introducing the world’s first 32-bit microprocessor at the IEEE International Solid-State Circuits Conference in 1981. The FOCUS chip, as it was called, contained 500,000 transistors.
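The doubling rule behind Moore’s Law is easy to express as a formula. As a minimal sketch (the starting figure of 500,000 transistors in 1981 is taken from the FOCUS chip above, and the projection is purely illustrative):

```python
def moores_law(start_count, start_year, target_year, doubling_period=2):
    """Project a transistor count forward, assuming a doubling
    every `doubling_period` years (Moore's original prediction)."""
    doublings = (target_year - start_year) / doubling_period
    return start_count * 2 ** doublings

# Projecting from the FOCUS chip's 1981 figure to the mid-1990s:
# 7 doublings over 14 years, i.e. a 128x increase.
print(f"{moores_law(500_000, 1981, 1995):,.0f}")  # prints 64,000,000
```

Actual transistor counts in any given year depend on far more than this simple exponential, but the rule held remarkably well for decades.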
From there, engineering experts in Germany, Japan and elsewhere continually innovated, creating and utilizing HDI boards for everything from mainframe computers to small desktop computers. However, it wasn’t until 1994 that Motorola, using HDI boards provided by HP, adopted microvias on its MicroTAC mobile phone to replace controlled-depth drilling. From there, HDI boards became part and parcel of mobile phone design.
Today, printed circuit board (PCB) manufacturers like Sierra Circuits are able to use HDI laser-drilled via-in-pad technologies that deliver unparalleled reliability, signal strength, and computing power.
In 2015, IBM, alongside GlobalFoundries, Samsung, and the State University of New York, unveiled the world’s first 7 nanometer chip with functioning transistors. While these chips have not yet been put into commercial production, they represent a major leap forward, if only for crossing the sub-10nm line. IBM’s 7nm chips have a surface area nearly 50 percent smaller than the most advanced chips available at the time.
Once again, it appears Moore’s Law was more prescient than even Moore himself may have thought possible.