May / June 2007
The development and establishment of modern standard units of measurement, including monetary currencies and their equivalence to weights in gold, began early in civilization, motivated principally by the need to regulate commerce between individuals as well as trade across borders. This culminated in the last two centuries with the development and implementation of numerous standards and norms for the legal (or agreed) sizes of myriad objects and artifacts, and especially with the institutionalization of standard units of measurement for length, mass or force, and time. Nonetheless, besides meters or inches and kilograms or pounds, a plethora of other units remains in widespread use throughout the world, units that change not only from country to country but also within one and the same region, even in metric Europe. International travelers are certainly aware of this problem when they try to buy shoes or clothing abroad, the numerical sizes of which vary not only between the U.S. and, say, Europe, but also across the various countries within the European Union.
Shoes, for example, have myriad numbering schemes. Legend has it that shoe sizes originated in the fourteenth century, when King Edward II of England ordered shoes for his child and provided the shoemaker with the length of the child's foot measured in barleycorns.
Presumably, royal protocol denied the lowly shoemaker access to the blue-blooded prince, so an indirect podiatric measurement was required. Today, the meaning of shoe sizes in the U.S. is still a mystery to most, even to those who make a living by selling shoes, and for reasons now lost to history, there exist different sizes for men, women, and children, although children do use the same number for any given size, whether boy or girl. This discrepancy between the sexes is especially inconvenient with sneakers or sport boots, of which men and women use basically the same type even though their numerical sizes do not agree.
Shoe sizes are generally related to the length of the “last,” which is a foot-shaped template used for shoe fabrication. The American size of the shoe is three times the heel-to-toe length of the foot, measured in inches, minus a constant (what for?), so each half-size increment is 1/6 of an inch (4.23 mm). The subtractive constant is 22 for men, 21 for women (or is it 20.5?), and 9.75 for children (9.67?), but the latter only up to size 13 1/2, after which another constant is used! For instance, a man’s foot that is 10.5” long requires an American shoe size of 3 x 10.5 – 22 = 9.5. By contrast, most countries in continental Europe and in Latin America follow some version of the French rule, which specifies a shoe size that is 1.5 times the length of the last measured in centimeters, irrespective of gender or age, so each step (or Paris point) is 2/3 cm (6.67 mm) long. Thus, a foot 26.7 cm in length, which for comfort demands a last that is some 2 cm longer than the foot, would correspond to a French shoe size of 1.5 x (26.7 + 2) ≈ 43. Shoe sizes in the U.K. follow a similar but not identical rule to those in the U.S., and (apparently) they do not differ between men and women. Fortunately, and despite regional variations, shoe sizes have remained consistent over the years, so a shoe of a given number from a generation ago is as large as a contemporary shoe of that same number.
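As a quick check of the arithmetic above, the two rules can be sketched in a few lines of Python. The function names and the 2 cm last allowance are illustrative only; the subtractive constants are the ones quoted in the text, which themselves vary by source.

```python
# Sketch of the two sizing rules described above. The constants
# (22 for US men, factor 1.5 for the French rule, ~2 cm last
# allowance) are taken from the text and vary between sources.

def us_mens_size(foot_length_in):
    """US men's size: 3 x foot length in inches, minus 22."""
    return 3 * foot_length_in - 22

def french_size(foot_length_cm, last_allowance_cm=2.0):
    """French (Paris point) size: 1.5 x last length in cm,
    where the last is roughly 2 cm longer than the foot."""
    return 1.5 * (foot_length_cm + last_allowance_cm)

print(us_mens_size(10.5))            # 9.5
print(round(french_size(26.7), 2))   # 43.05, i.e., about a 43
```

Note that the two scales cannot be converted by a single offset: the US scale steps in sixths of an inch while the French scale steps in thirds of a centimeter, so the conversion depends on the actual foot length.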
Regrettably, this consistency has not held for sizes of clothing, at least not for women’s clothing in the U.S.A. As you may perhaps have noticed, in the course of recent years, women’s sizes in America have suffered considerable deflation, especially in upscale and expensive boutiques, but also in discount stores.
Thus, a woman’s dress size 7 today would have corresponded to a size 11 a generation ago, a phenomenon that is referred to euphemistically as vanity sizing.
Presumably, as America became more rotund over the years, the industry adapted to impress upon buyers the good feeling that they were as lightweight as ever. The problem is that, with vanity dimensioning, women’s sizes have largely ceased to be meaningful, at least in the U.S., inasmuch as the numbers now change substantially not only from store to store, but also across brands sold at any one store. Interestingly, no comparable vanity sizes have developed for men in the U.S., even though they too have increased in bulge, and this is because men’s sizes were regulated early on by the government, to satisfy the need for uniform men’s sizes in the military.
Another source of confusion is lumber sizes, again at least in the U.S. As everybody knows, a 2 x 4 stock is not a piece of lumber 2 by 4 inches in cross-section, but one that is instead 1 1/2 x 3 1/2 inches. In larger lumber sizes, the discrepancy between nominal and actual sizes is even bigger. When asked why the difference, people in lumber yards state that the nominal size is that of the rough, unfinished lumber, and that finishing reduces it to the actual size, but it seems rather wasteful that fully 34% of the wood in a two-by-four should be lost to sawing, planing, and finishing. Perhaps part of the explanation may be that the wood is cut to rough sizes in a wet condition and shrinks after kiln drying, but then again, nothing would impede the lumber industry from starting with appropriately larger rough sizes so as to attain truthful finished sizes. Thus, this anecdotal explanation is just a modern red herring, one of those truths that people, and the industry, come to accept merely by virtue of their repetition.
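The 34% figure quoted above is easy to verify from the cross-sectional areas; a two-line check in Python:

```python
# Fraction of a nominal 2" x 4" cross-section lost when the piece
# is finished to its actual 1 1/2" x 3 1/2" dimensions.
nominal_area = 2.0 * 4.0   # 8.0 square inches
actual_area = 1.5 * 3.5    # 5.25 square inches
loss = 1 - actual_area / nominal_area
print(f"{loss:.1%}")       # 34.4%
```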
The actual reason is more pragmatic: a good number of decades ago, sizes were gradually reduced by mills as a way to increase profits and absorb cost increases without changing the wording of then-existing construction norms and regulations.
Indeed, if you were a carpenter and did some remodeling of houses older than some 50 years, you would find that the studs in many of these older houses are indeed 2 by 4 inches, a measurement that has not shrunk in half a century of dry conditions, so at that time at least, nominal and actual sizes did coincide. In later years, the width of 2 inches shrank first to 1 3/4, then to 1 5/8, and finally to 1 1/2. Moreover, until fairly recently, when you bought 1/2-inch plywood board, that was the thickness you actually got, but today, so-called 1/2-inch plywood is only 15/32 inch thick, again a move by the industry to save on material.
A similar shrinking over time has taken place with coffee cans, at least as far as the contents are concerned. The standard coffee can used to contain one pound (16 ounces) of ground coffee. Today, however, while the size of the can has remained exactly the same, most modern cans contain only 11 or 12 ounces. Peculiarly, when you open one of these cans, it is still filled to the rim with coffee. How can that be possible? This has to do with the way that modern coffee is ground. By appropriate grinding methods, you can make the powder occupy more space, so the coffee now has more air in between its particles. This is what mathematicians refer to as the packing problem, of which the producers of coffee seem to be making very good use.
A curious case is also that of sheet metal. In the U.S., metallic sheets are sold not by thickness (as done elsewhere in the world) but instead by gauge (or gage) numbers, which range anywhere from 0 to 39, and the original definitions corresponded very roughly to the reciprocal of the thickness in inches. In the late nineteenth century, sheet gauges were related to the weight per square foot of the sheet, presumably because of the costs of the material and of its transport: weight, and not size of the sheets, was of the essence. In addition, it was easier for the government to assess taxes by weight of metal than by size or thickness of sheet, especially because this allowed taxing at similar rates both flat and corrugated metal sheets, which occupy rather different volumes and weights for the same thickness.
At the present time, however, changes in gauge number do not translate into proportional changes in thickness (which decreases as the gauge number increases), and the actual thickness also depends on the material of which the sheet is made.
For example, gauge 3 corresponds to 0.2391 inches for sheet steel, 0.2294 inches for aluminum, and 0.2500 for stainless steel (i.e., differences of less than 10%), but at gauge 39, the thicknesses for these three metals are 0.0060, 0.0040, and 0.0062, a dramatic difference for aluminum. Rather peculiarly, if you were to plot modern sheet gauges against thickness for any given material, you would observe not a smooth, monotonically decreasing curve, but one with obvious discontinuities in slope. These must have resulted at later points in time as more gauge numbers for thinner and thicker sheets were added to the standard. Thus, by now the gauge is just an arbitrary number used for trade that says nothing about either average thickness or weight. Instead, these must now be read from standard gauge tables, which may even differ among manufacturers.
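Since gauge alone no longer determines thickness, any computation has to go through a lookup keyed by both material and gauge number. A minimal sketch in Python, populated only with the gauge-3 values quoted above (a real application would load the full standard table for each material):

```python
# Minimal gauge-table lookup. Only the gauge-3 values quoted in the
# text are included here; full tables run to dozens of gauges and,
# as noted above, may even differ among manufacturers.
GAUGE_THICKNESS_IN = {
    "sheet steel":     {3: 0.2391},
    "aluminum":        {3: 0.2294},
    "stainless steel": {3: 0.2500},
}

def thickness_in(material: str, gauge: int) -> float:
    """Thickness in inches: the material matters, not just the gauge."""
    return GAUGE_THICKNESS_IN[material][gauge]

print(thickness_in("aluminum", 3))  # 0.2294
```

The point of the table structure is precisely the article's complaint: there is no formula to fall back on, so every material needs its own explicit column of numbers.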
As can be seen, despite the significant advances that civilization has made by creating and implementing logical and easy-to-use measurements and sizes, much progress remains before sanity prevails in the standard dimensioning of objects and artifacts. But then again, commerce may well have vested interests in maintaining the status quo and the reigning confusion: it makes shopping by price comparison so much more difficult. Caveat emptor!