According to Moore’s Law, the number of transistors on a computer chip doubles roughly every two years. The Moore in question is Gordon Moore, co-founder of technology giant Intel, who first made the observation in 1965. A great many years have passed since then, and the microchip has developed so far that for several years Moore himself has been predicting an end to his law.
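To get a feel for what exponential doubling means in practice, here is a minimal sketch (not from the article; the function name and starting figure are illustrative) that projects transistor counts under a two-year doubling period:

```python
# Illustrative sketch: projecting transistor counts under Moore's Law,
# assuming a doubling period of two years (the figure cited in the article).

def moores_law(initial_count, years, doubling_period=2):
    """Return the projected transistor count after `years` years."""
    return initial_count * 2 ** (years / doubling_period)

# Example: a chip with 1 million transistors, projected 10 years out.
# Ten years at a two-year doubling period means five doublings.
projected = moores_law(1_000_000, 10)
print(f"{projected:,.0f} transistors")
```

Five doublings multiply the count 32-fold, which is exactly the kind of runaway growth Moore warns about in the quote below.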
In Moore’s own words: “Any physical quantity that’s growing exponentially predicts a disaster. You simply can’t go beyond certain major limits.”
This is proving all too true for silicon-based microchips, which are rapidly approaching the point at which they cease to be viable and become a liability. Suman Datta, a researcher at Pennsylvania State University, recently declared that silicon chips have four more years before they become obsolete. At that point they will have shrunk so much that they will begin to leak current and ultimately lose their ability to retain digital information.
Apparently experts have been aware of this problem for years; they’ve just been too busy working around it to find concrete solutions. That doesn’t mean no effort has been made. Carbon nanotubes stand to replace the old silicon faithful: only as wide as a protein molecule and made of pure carbon, these little tubes are superb conductors of electricity, making them ideal for ever-shrinking microchips. Unfortunately, at roughly $500 a gram, they are not yet economically viable.
Other possible solutions include superconductors, which carry current with zero electrical resistance, and quantum bits, or qubits, which can occupy multiple states at once and so promise a dramatic boost in computing power. Researchers will demonstrate just how qubits achieve this feat at an upcoming physics conference at Leeds University, where they will also address the viability of building quantum computers.
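One common way to see where that promised boost comes from is to note that a register of n qubits is described by 2^n complex amplitudes. The sketch below illustrates this (it is an assumption-laden toy, not anything from the conference; the function name is invented for illustration):

```python
# Illustrative sketch: an n-qubit register is described by 2**n complex
# amplitudes, so the state space grows exponentially with qubit count.
import numpy as np

def uniform_superposition(n_qubits):
    """State vector with equal amplitude on all 2**n basis states."""
    dim = 2 ** n_qubits
    return np.full(dim, 1 / np.sqrt(dim), dtype=complex)

state = uniform_superposition(3)   # 3 qubits -> 8 amplitudes at once
print(len(state))
print(float(np.sum(np.abs(state) ** 2)))  # probabilities sum to 1
```

Three classical bits hold one of eight values at a time; three qubits carry amplitude on all eight simultaneously, which is the intuition behind the computing-power claim.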
Intel has made its own breakthrough with hafnium, which it uses in high-k dielectric gate insulators in place of the silicon-based insulators it considers “leaky” and poorly suited to further scaling. Intel believes that hafnium will increase the speed and efficiency of microchips while allowing the circuitry to be scaled down to 45 nanometers and below.
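The reason a high-k material helps with leakage can be seen from the standard equivalent-oxide-thickness relation. The sketch below uses textbook ballpark permittivity values (assumptions on my part, not figures from Intel): a high-k layer can be physically thicker than silicon dioxide while delivering the same gate capacitance, and the extra thickness suppresses tunnelling leakage.

```python
# Illustrative sketch: equivalent oxide thickness (EOT) for a high-k gate
# dielectric. Permittivity values are rough textbook figures (assumptions).
#   EOT = t_physical * (k_SiO2 / k_dielectric)

K_SIO2 = 3.9      # relative permittivity of silicon dioxide
K_HAFNIUM = 25.0  # rough value for a hafnium-based dielectric (assumption)

def equivalent_oxide_thickness(t_physical_nm, k_dielectric):
    """SiO2 thickness giving the same capacitance as the given dielectric."""
    return t_physical_nm * K_SIO2 / k_dielectric

# A 3 nm hafnium-based layer behaves, capacitively, like sub-0.5 nm SiO2,
# yet is thick enough to choke off tunnelling current.
print(round(equivalent_oxide_thickness(3.0, K_HAFNIUM), 3))
```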
According to dir.salon.com, a nanometer is a billionth of a meter, which is still a fairly meaningless measure when you try to picture it. So take it a step further: imagine a hair from your head, and split it into a thousand equal parts. One part would be roughly 100 nanometers in diameter; to get to 45 nanometers you would have to halve it again, roughly. In short, it’s very small.
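The arithmetic behind that hair analogy is simple enough to check directly. The sketch below assumes a hair width of 100 micrometers, a common ballpark figure rather than a number from the article:

```python
# Illustrative arithmetic for the hair analogy. The hair width is a
# common ballpark figure (an assumption, not from the article).

HAIR_WIDTH_M = 100e-6   # ~100 micrometers, a typical human hair
NM_PER_M = 1e9          # a nanometer is a billionth of a meter

slice_nm = (HAIR_WIDTH_M / 1000) * NM_PER_M  # hair split into 1000 parts
print(slice_nm)      # each slice is about 100 nanometers across
print(slice_nm / 2)  # halve it again and you are near the 45 nm mark
```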