Computer technology is the backbone of the modern world. Ever since Charles Babbage began work on his Difference Engine in 1822 and Konrad Zuse completed the Z3 in 1941, computing has improved by leaps and bounds. A simple illustration: some of the earliest electronic computers, such as ENIAC, weighed around 30 tons, while today's most advanced processors are manufactured on a 3-nanometer process node. For reference, a single human hair is roughly 100,000 nanometers thick.
Much of the advancement in the modern computing era has come through hardware innovation. Semiconductor chips, which have become some of the most prized products in the world, power devices such as smartphones and laptops. They owe their origins to a flurry of developments in the United States, from the invention of the transistor in the late 1940s to the first microprocessors of the early 1970s.