Question:

How do computers keep getting smaller and faster?


Scientists and researchers follow a trend - Moore's Law - that has paved the way for the past 40 years. Essentially, to remain competitive, computer designers must be able to fit the same number of transistors in half the space they could 2 years before. Doubling density every 2 years works out to roughly 1% every 10 days, which may sound a little more reasonable. Still, how are chips made more efficient week after week, even when the underlying technologies do not change? If we can use old technology to make chips faster, why don't companies increase transistor density 5 or 10% every 10 days? Why is doubling every 2 years the magical number?
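For reference, here is the compounding arithmetic behind that claim, as a minimal sketch in Python (the smooth 730-day doubling period is the question's round figure, not an exact one):

    # Compounding arithmetic for Moore's Law as stated in the question:
    # density doubles every ~2 years (taken here as 730 days).
    DOUBLING_PERIOD_DAYS = 730

    def growth_over(days: float) -> float:
        """Fractional growth over `days`, assuming smooth exponential doubling."""
        return 2 ** (days / DOUBLING_PERIOD_DAYS) - 1

    print(f"growth per 10 days: {growth_over(10):.2%}")   # ~0.95%, i.e. about 1%
    print(f"growth per 2 years: {growth_over(730):.2%}")  # 100.00%, i.e. a doubling

The point is that a small steady rate and a dramatic biennial doubling are the same exponential, just measured over different windows.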

3 ANSWERS


  1. It is not a magic ratio but it seems to be roughly correct so far.

    I would expect it to level off as successive development costs rise.


  2. Smaller channel-width transistor technologies.

  3. Why do you think the technology hasn't changed?

    Sometimes the process technology has changed, and sometimes the circuit technology has changed. We could not make today's chips with the technology of the '80s. In the '80s it was believed that we would have hit the limits of the basic methods long before now. IBM invested hundreds of millions of dollars in X-ray lithography because they thought optical lithography had reached its limits. Optical lithography is still in use today (I believe).
