Picosecond Ultrasonics; January 1998; Scientific American Magazine; by Maris; 4 Page(s)
During the past three decades, humans have developed a remarkable ability to manufacture small objects. A prime example is the computer chip, made from a silicon wafer with strategically placed impurities that form transistors. On top of each chip is a sequence of metal films and insulating layers that electrically connect the transistors to their neighbors. The films may be as thin as a millionth of a centimeter. Their thickness and uniformity determine the efficiency of the chip and, ultimately, the computer it is in.
A film may be anywhere from 50 angstroms to a few microns thick (one angstrom equals 10^-8 centimeter, whereas one micron equals 10^-4 centimeter). For the finest films, the thickness has to be controlled to an accuracy of one angstrom--less than the size of an atom. Measuring the thickness is exceedingly difficult. The most efficient means currently available is destructive: take a chip, cut it and look at it from the side. Most manufacturers deal with the problem of ensuring consistent thickness by minutely controlling every aspect of the production process--such as temperature, humidity and pressure--and by checking the dimensions of the few chips that are sacrificed.
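The unit relationships above can be sketched in a short, purely illustrative calculation (the conversion constants come from the article; the helper function and the 2-micron figure are my own, chosen only to show the range involved):

```python
# Conversions stated in the article:
# 1 angstrom = 10^-8 cm, 1 micron = 10^-4 cm.
ANGSTROM_CM = 1e-8
MICRON_CM = 1e-4

def angstroms_to_cm(thickness_angstroms):
    """Convert a film thickness from angstroms to centimeters."""
    return thickness_angstroms * ANGSTROM_CM

# The thinnest films mentioned (50 angstroms), in centimeters:
thin_film_cm = angstroms_to_cm(50)

# A hypothetical thick film of 2 microns, expressed in angstroms,
# to show the span from the thinnest to the thickest films:
thick_film_angstroms = 2 * MICRON_CM / ANGSTROM_CM

print(thin_film_cm)          # about 5e-7 cm
print(thick_film_angstroms)  # about 20,000 angstroms
```

The roughly four-hundredfold spread between these two values is why a single measurement technique covering the whole range, to one-angstrom accuracy, is so hard to come by.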