
6.033--Computer System Engineering

Suggestions for classroom discussion


Topic: Disk recording strategies

By J. H. Saltzer, April 18, 1997.


Here is a small but intrinsic problem that designers of rotating storage media must face: writing data at different radii from the center of rotation. The physical space available near the center of a disk platter is less than that available near the edge. For example, in a 3.5-inch floppy disk, the writing radius varies from 20 mm to 40 mm, a factor of two. On a CD the range is from 23 to 58 mm, a factor of 2.5.

Depending on one's goals, different strategies are used.

Simplicity: For the available recording technology, establish the maximum number of bits that can reliably be written and read back on the innermost track of the disk. Write that same number of bits on every track, no matter what the radius. With this approach the outer tracks, being longer, are less densely filled; they hold far fewer bits than the technology allows. The simplicity advantage is that data flows to and from the disk at a uniform clock rate, no matter which track is being read or written. Having a uniform clock rate simplifies the hardware design, especially when writing. Most--but not all--floppy disks are formatted this way.

Maximize storage: On each track write the maximum number of bits that the recording technology allows on that track. For the floppy disk, the outermost track will hold twice as many bits as the innermost track, and the total capacity will increase by 50% compared with the Simplicity approach. But now the number of bits read or written in one revolution of the disk varies from track to track, so the clock rate of the reading/writing circuitry must adjust to the track being read or written. This technique is used on Macintosh 1.2 Mbyte floppy disks. (Macintosh 800 KByte disks have the same maximum recording density, but use the Simplicity strategy.)
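The 50% figure follows from the geometry: with tracks evenly spaced between the two radii, the average track radius is 30 mm, or 1.5 times the innermost radius. A short sketch makes the comparison concrete (the 20 mm and 40 mm radii are from the text; the track count and innermost-track bit count are assumptions chosen only for illustration):

```python
# Compare total capacity under the Simplicity and Maximize-Storage strategies
# for a disk whose usable radii run from 20 mm to 40 mm (the floppy of the text).

INNER_MM = 20.0       # innermost writable radius (from the text)
OUTER_MM = 40.0       # outermost writable radius (from the text)
N_TRACKS = 80         # number of tracks -- an assumption for illustration
BITS_INNER = 100_000  # max reliable bits on the innermost track -- assumed

def track_radius(i):
    """Radius of track i, with tracks evenly spaced from inner to outer."""
    return INNER_MM + (OUTER_MM - INNER_MM) * i / (N_TRACKS - 1)

# Simplicity: every track carries the innermost track's bit count.
simplicity = N_TRACKS * BITS_INNER

# Maximize storage: bits per track scale with track length, i.e. with radius.
maximize = sum(BITS_INNER * track_radius(i) / INNER_MM for i in range(N_TRACKS))

print(maximize / simplicity)   # ~1.5, the 50% gain described above
```

Because the radii form an arithmetic progression averaging 30 mm, the ratio comes out to exactly 1.5 regardless of the track count or bit count chosen.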

This strategy leads to a surprise (an emergent property) in the reliability area--disks formatted this way are quite a bit less robust. One of the key contributors to disk unreliability is noise, and every clock in a computer system generates some noise. If the clock rate of a disk happens to be the same as some other clock in the system, there may be trouble: either one can cause interference with the other. Since the disk is operating near the maximum rate its technology allows (meaning the read head can barely distinguish the ones from the zeros), it is more likely to be the victim of noise from another clock than the source.

Consider now what happens when you read an entire disk from one end to the other: the clock rate gradually rises (or drops) by a factor of two as you move across the disk. This gives the disk circuitry an opportunity to be subjected to noise interference from every other clock in the system. So there is a good chance that somewhere across that spectrum the noise level will be high enough to cause trouble.

This problem is most likely to show up in a laptop, where things are crowded together, leading to increased chance of electrical interference. In fact, this was a standard problem on early Mac powerbooks--the display clock caused read errors about 2/3 of the way across any floppy that was not written with very strong magnetization. The problem was corrected by increasing the shielding around the floppy reader.

Note that modern hard disks use an in-between strategy. The disk is divided into zones, each of which has the same number of sectors (and thus bits) per track: the maximum number of sectors that can be placed on the innermost track of that zone. Thus the data rate from any one zone is uniform, but outer zones run at a higher data rate than inner zones.
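The zoned scheme can be sketched as follows. All of the numbers here (linear density, sector size, rotation rate, zone boundaries) are illustrative assumptions, not data from any real drive; the point is only that the sector count is fixed within a zone and the data rate climbs from zone to zone:

```python
import math

BITS_PER_MM = 5_000       # linear recording density, bits per mm of track -- assumed
BYTES_PER_SECTOR = 512
RPM = 7200                # rotation rate -- assumed

def sectors_per_track(radius_mm):
    """Largest whole number of sectors that fit on a track of this radius."""
    track_bits = 2 * math.pi * radius_mm * BITS_PER_MM
    return int(track_bits // (BYTES_PER_SECTOR * 8))

# Each zone uses the sector count of its own innermost track, so the data
# rate is uniform within a zone but higher in outer zones.
zones = [(20, 30), (30, 40), (40, 50)]    # (inner, outer) radii in mm -- assumed
for inner, outer in zones:
    spt = sectors_per_track(inner)
    rate = spt * BYTES_PER_SECTOR * RPM / 60.0   # bytes per second
    print(f"zone {inner}-{outer} mm: {spt} sectors/track, {rate / 1e6:.1f} MB/s")
```

Within a zone the electronics see a constant clock rate, recovering most of the Simplicity advantage, while the capacity lost to under-filled tracks is limited to the slack inside each zone.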

Maximize storage but keep constant bit rate: Music CDs had this requirement. There was a Japanese requirement that it be possible to cram Beethoven's Ninth Symphony (nominally 74 minutes of music) on a single CD, and the only way to meet that requirement with the available recording technology was to use the Maximize Storage strategy. (CDs don't actually have multiple tracks; the bits are written in a continuous spiral, but that detail doesn't affect this story.) As a consequence, it was necessary to operate at maximum bit density at every radius. On the other hand, digital-to-analog conversion required that the bit rate coming off the disk be uniform. So the designers arranged to vary the rotation rate of the disk according to the position of the read head.
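Keeping the bit rate constant while the bit density is uniform means keeping the linear velocity under the read head constant, so the rotation rate must vary inversely with radius. A minimal sketch, using the 23 mm and 58 mm radii from the text and the nominal ~1.2 m/s linear velocity of an audio CD:

```python
import math

V_LINEAR = 1.2     # m/s along the spiral -- nominal audio-CD value

def rpm_at(radius_mm):
    """Rotation rate needed to keep the linear velocity V_LINEAR at this radius."""
    circumference_m = 2 * math.pi * radius_mm / 1000.0
    return 60.0 * V_LINEAR / circumference_m

print(rpm_at(23))   # ~498 rpm at the innermost radius
print(rpm_at(58))   # ~198 rpm at the outermost radius
```

The factor of 2.5 between the two rotation rates is exactly the radius ratio cited at the start of this note.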

The downside of this strategy shows up if you then decide to use the same CD mechanism as a storage system for a computer. Every time you move from one radius to another you must speed up or slow down the disk rotation. Since changing rotation rate involves motors and the inertia of rotating objects, it takes a long time (or a lot of energy). This is the main reason why CD-ROMs are so slow. (Note that at the time the music CD was designed, using RAM buffers for speed matching was much too expensive to consider. Today, one might explore designs that keep the disk rotation rate constant and use RAM buffers to deliver a constant data rate to the D/A converters.)


Comments and suggestions: Saltzer@mit.edu