Symbol: in or ″ (the double prime)[1]
Unit system: British Imperial / United States customary
Quantity: Length
1 inch in SI units: 25.4 mm
1 inch in imperial/US units: 1/36 yd or 1/12 ft
The inch (symbol: in or ″) is a unit of length in the British Imperial and the United States customary systems of measurement. It is equal to 1/36 yard or 1/12 of a foot. Derived from the Roman uncia ("twelfth"), the word inch is also sometimes used to translate similar units in other measurement systems, usually understood as deriving from the width of the human thumb.
Standards for the exact length of an inch have varied in the past, but since the adoption of the international yard during the 1950s and 1960s, the inch has been based on the metric system and defined as exactly 25.4 mm.
The English word "inch" (Old English: ynce) was an early borrowing from Latin uncia ("one-twelfth; Roman inch; Roman ounce").[2] The vowel change from Latin /u/ to Old English /y/ (which became Modern English /ɪ/) is known as umlaut. The consonant change from the Latin /k/ (spelled c) to English /tʃ/ is palatalisation. Both were features of Old English phonology.
"Inch" is cognate with "ounce" (English, Old (ca.450-1100);: ynse), whose separate pronunciation and spelling reflect its reborrowing in Middle English from Anglo-Norman unce and ounce.[3]
In many other European languages, the word for "inch" is the same as or derived from the word for "thumb", as a man's thumb is about an inch wide (and this was even sometimes used to define the inch[4]). In Dutch, one term for the inch is Engelse duim ("English thumb").[5] [6] Examples include Catalan polzada ("inch") and polze ("thumb"); Czech palec ("thumb"); Danish and Norwegian tomme ("inch") and Norwegian tommel ("thumb"); Dutch duim (whence Afrikaans duim and Russian дюйм); French pouce; Georgian დუიმი; Hungarian hüvelyk; Italian pollice; Portuguese polegada ("inch") and polegar ("thumb"); Slovak palec ("thumb"); Spanish pulgada ("inch") and pulgar ("thumb"); and Swedish tum ("inch") and tumme ("thumb").
The inch is a commonly used customary unit of length in the United States,[7] Canada,[8] [9] and the United Kingdom. For the United Kingdom, guidance on public sector use states that, since 1 October 1995, without time limit, the inch (along with the foot) is to be used as a primary unit for road signs and related measurements of distance (with the possible exception of clearance heights and widths)[10] and may continue to be used as a secondary or supplementary indication following a metric measurement for other purposes.[11]
Inches are used for display screens (e.g. televisions and computer monitors) worldwide. The inch is the official Japanese standard for electronic parts, especially display screens, and is the industry standard throughout continental Europe for display screens (Germany being one of the few countries to supplement it with centimetres in most stores[12]).
Inches are commonly used to specify the diameter of vehicle wheel rims, and the corresponding inner diameter of tyres in tyre codes.
Both inch-based and millimetre-based hex keys are widely available for sale in Europe.[13] [14]
The international standard symbol for inch is in (see ISO 31-1, Annex A), but traditionally the inch is denoted by a double prime (″), which is often approximated by a double quote symbol, and the foot by a prime (′), which is often approximated by an apostrophe. For example, three feet, two inches can be written as 3′ 2″. (This is akin to how the first and second "cuts" of the hour, and likewise of the degree, are indicated by prime and double prime symbols.)
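As a toy illustration of this notation, the following Python snippet renders a whole-inch length in prime/double-prime form (a minimal sketch; the function name and the choice of Unicode escapes are illustrative, not any established convention):

    def format_feet_inches(total_inches: int) -> str:
        """Render a whole-inch length in traditional prime/double-prime notation."""
        feet, inches = divmod(total_inches, 12)   # 12 inches to the foot
        return f"{feet}\u2032 {inches}\u2033"     # U+2032 prime, U+2033 double prime

    print(format_feet_inches(38))  # prints: 3′ 2″, i.e. three feet two inches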
Subdivisions of an inch are typically written using dyadic fractions with odd number numerators; for example, two and three eighths of an inch would be written as 2 3/8″ and not as 2.375″ nor as 2 6/16″. However, for engineering purposes fractions are commonly given to three or four places of decimals and have been for many years.[15] [16]
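The reduction to lowest terms that produces these odd numerators can be checked with Python's standard fractions module (a minimal sketch; the variable names are illustrative):

    from fractions import Fraction

    # 6/16 in reduces automatically to the conventional odd-numerator form 3/8
    subdivision = Fraction(6, 16)
    print(subdivision)           # 3/8
    print(float(subdivision))    # 0.375, the decimal form preferred in engineering

    # two and three eighths of an inch as a single fraction
    total = 2 + Fraction(3, 8)
    print(total)                 # 19/8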
1 international inch is equal to:
- 1,000 thou (mils)
- 72 points or 6 picas (in desktop-publishing usage)
- 1/12 foot, or 1/36 yard
- exactly 25.4 mm (2.54 cm)
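A sketch of these exact conversions in code, using only the factors listed above (the constant and function names are illustrative, not from any particular library):

    # Exact conversion factors for the international inch (from the list above)
    MM_PER_INCH = 25.4          # exact by definition
    THOU_PER_INCH = 1_000       # 1 thou (mil) = 0.001 in
    POINTS_PER_INCH = 72        # desktop-publishing points; 6 picas per inch

    def inches_to_mm(inches: float) -> float:
        """Convert a length in inches to millimetres."""
        return inches * MM_PER_INCH

    print(inches_to_mm(1))       # 25.4
    print(inches_to_mm(2.375))   # 60.325, i.e. 2 3/8 in in millimetres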
The earliest known reference to the inch in England is from the Laws of Æthelberht dating to the early 7th century, surviving in a single manuscript, the Textus Roffensis from 1120.[17] Paragraph LXVII sets out the fine for wounds of various depths: one inch, one shilling; two inches, two shillings, etc.
An Anglo-Saxon unit of length was the barleycorn. After 1066, 1 inch was equal to 3 barleycorns, which continued to be its legal definition for several centuries, with the barleycorn as the base unit.[18] One of the earliest such definitions is that of 1324, when the legal definition of the inch was set out in a statute of Edward II of England, defining it as "three grains of barley, dry and round, placed end to end, lengthwise".
Similar definitions are recorded in both English and Welsh medieval law tracts.[19] One, dating from the first half of the 10th century, is contained in the Laws of Hywel Dda which superseded those of Dyfnwal, an even earlier definition of the inch in Wales. Both definitions, as recorded in Ancient Laws and Institutes of Wales (vol i., pp. 184, 187, 189), are that "three lengths of a barleycorn is the inch".[20]
King David I of Scotland in his Assize of Weights and Measures (c. 1150) is said to have defined the Scottish inch as the width of an average man's thumb at the base of the nail, even including the requirement to calculate the average of a small, a medium, and a large man's measures.[21] However, the oldest surviving manuscripts date from the early 14th century and appear to have been altered with the inclusion of newer material.[22]
In 1814, Charles Butler, a mathematics teacher at Cheam School, recorded the old legal definition of the inch to be "three grains of sound ripe barley being taken out of the middle of the ear, well dried, and laid end to end in a row", and placed the barleycorn, not the inch, as the base unit of the English Long Measure system, from which all other units were derived.[23] John Bouvier similarly recorded in his 1843 law dictionary that the barleycorn was the fundamental measure.[24] Butler observed, however, that "[a]s the length of the barley-corn cannot be fixed, so the inch according to this method will be uncertain", noting that a standard inch measure was now [i.e. by 1843] kept in the Exchequer chamber, Guildhall, and that this was the legal definition of the inch.
This was a point also made by George Long in his 1842 Penny Cyclopædia, observing that standard measures had by then superseded the barleycorn definition of the inch, and that to recover the inch from its original definition, in case the standard measure were destroyed, would involve measuring large numbers of barleycorns and taking their average length. He noted that this process would not perfectly recover the standard, since it might introduce errors of anywhere between one hundredth and one tenth of an inch in the definition of a yard.[25]
Before the adoption of the international yard and pound, various definitions were in use. In the United Kingdom and most countries of the British Commonwealth, the inch was defined in terms of the Imperial Standard Yard. The United States adopted the conversion factor 1 metre = 39.37 inches by an act in 1866.[26] In 1893, Mendenhall ordered the physical realization of the inch to be based on the international prototype metres numbers 21 and 27, which had been received from the CGPM, together with the previously adopted conversion factor.[27]
As a result of the definitions above, the U.S. inch was effectively defined as 25.4000508 mm (with a reference temperature of 68 degrees Fahrenheit) and the UK inch as 25.399977 mm (with a reference temperature of 62 degrees Fahrenheit). When Carl Edvard Johansson started manufacturing gauge blocks in inch sizes in 1912, his compromise was to manufacture blocks with a nominal size of 25.4 mm, at a reference temperature of 20 degrees Celsius, accurate to within a few parts per million of both official definitions. Because Johansson's blocks were so popular, they became the de facto standard for manufacturers internationally,[28] [29] with other manufacturers of gauge blocks following Johansson's definition by producing blocks designed to be equivalent to his.[30]
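To see why this was an acceptable compromise, one can compute how far the 25.4 mm block sits from each of the two official definitions quoted above, in parts per million (a minimal sketch using only the figures given in this paragraph):

    # Deviation of Johansson's 25.4 mm compromise from the contemporaneous
    # official inch definitions, in parts per million (ppm)
    JOHANSSON_MM = 25.4
    OFFICIAL_MM = {"US": 25.4000508, "UK": 25.399977}

    for name, mm in OFFICIAL_MM.items():
        ppm = abs(JOHANSSON_MM - mm) / mm * 1e6
        print(f"{name}: {ppm:.1f} ppm")   # US: 2.0 ppm, UK: 0.9 ppm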
In 1930, the British Standards Institution adopted an inch of exactly 25.4 mm. The American Standards Association followed suit in 1933. By 1935, industry in 16 countries had adopted the "industrial inch" as it came to be known,[31] [32] effectively endorsing Johansson's pragmatic choice of conversion ratio.
In 1946, the Commonwealth Science Congress recommended a yard of exactly 0.9144 metres for adoption throughout the British Commonwealth. This was adopted by Canada in 1951;[33] [34] the United States on 1 July 1959;[35] [36] [37] Australia in 1961,[38] effective 1 January 1964;[39] and the United Kingdom in 1963,[40] effective on 1 January 1964.[41] The new standards gave an inch of exactly 25.4 mm, 1.7 millionths of an inch longer than the old imperial inch and 2 millionths of an inch shorter than the old US inch.[42]
The United States retained the 1/39.37-metre definition of the inch for surveying, producing a 2 millionth part difference between the standard and the US survey inch.[43] This is approximately 1/8 inch per mile; 12.7 kilometres is exactly 500,000 standard inches and exactly 499,999 survey inches. This difference is substantial when doing calculations in State Plane Coordinate Systems with coordinate values in the hundreds of thousands or millions of feet.
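The 12.7 km figure can be verified with exact rational arithmetic: the international inch is exactly 0.0254 m, while the survey inch follows from the 1866 relation 1 m = 39.37 in (a minimal sketch; the constant names are illustrative):

    from fractions import Fraction

    INTL_INCH_M = Fraction(254, 10_000)    # international inch: 0.0254 m exactly
    SURVEY_INCH_M = Fraction(100, 3_937)   # survey inch: from 1 m = 39.37 in

    distance_m = Fraction(12_700)          # 12.7 kilometres
    print(distance_m / INTL_INCH_M)        # 500000 standard inches, exactly
    print(distance_m / SURVEY_INCH_M)      # 499999 survey inches, exactly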
In 2020, the National Institute of Standards and Technology announced that the U.S. survey foot would "be phased out" on 1 January 2023 and be superseded by the international foot (thereafter known simply as the foot), equal to 0.3048 metres exactly, for all further applications.[44] This implies that the survey inch was likewise replaced by the international inch.
See main article: Roman inch. Before the adoption of the metric system, several European countries had customary units whose name translates into "inch". The French pouce measured roughly 27.0 mm, at least when applied to describe the calibre of artillery pieces. The Amsterdam foot (voet) consisted of 11 Amsterdam inches (duim) and is about 8% shorter than an English foot.[45]
The now obsolete Scottish inch (Scottish Gaelic: òirleach), 1/12 of a Scottish foot, was about 1.0016 imperial inches (about 25.44 mm).[46]