For the past 40 years, chip designers have chased a trend known as Moore's Law: to remain competitive, they must fit roughly the same number of transistors into half the space that they could 2 years before. A doubling every 2 years (a 100% increase) compounds to just under 1% every 10 days, which may sound a little more reasonable. Still, how are chips made more efficient week after week, even when the underlying technologies do not change? And if old technology can be used to make chips faster, why don't companies improve transistor density by 5 or 10% every 10 days? Why is doubling every 2 years the magic number?
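The compounding arithmetic behind "doubling every 2 years is about 1% per 10 days" is easy to check; here is a quick sketch (treating 2 years as roughly 730 days):

```python
# Doubling every 2 years, compounded over 10-day intervals.
# 2 years ~ 730 days -> 73 ten-day periods.
periods = 730 / 10
rate = 2 ** (1 / periods) - 1  # per-period growth rate that doubles after 73 periods
print(f"{rate * 100:.2f}% per 10 days")  # prints "0.95% per 10 days"
```

Equivalently, 73 consecutive improvements of about 0.95% multiply out to a factor of 2.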