Inches, meters, furlongs, miles, rods and cubits — how did we arrive at standard measurements?
How the “meter” came to be exactly one meter long
BigThink
Throughout human history, civilizations have looked to the natural world for measurement standards everyone could agree on, whether for length, mass, or duration of time. The idea of a non-arbitrary standard length got a boost in the early days of physics, with the search for a length that anyone on Earth could reproduce.
Early distance standards, like the “cubit” or the “foot,” were based on body parts, and a single “pace” of roughly a yard or meter was also common. The idea of a “standard meter,” though, came from pendulum observations. A swinging pendulum’s period is determined by two factors: its length and the local strength of gravity. A pendulum whose half-swing lasts exactly one second turns out to be almost exactly one meter long.
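A minimal sketch of that pendulum fact, using the standard small-angle pendulum formula (the value of `g` is standard gravity; real local gravity varies slightly by location):

```python
import math

# Small-angle pendulum: period T = 2*pi*sqrt(L/g).
# A "seconds pendulum" beats once per half-swing, so its full period is T = 2 s.
# Solving for length: L = g * (T / (2*pi))**2.

g = 9.80665  # standard gravity, m/s^2 (local values differ slightly)
T = 2.0      # full period in seconds (two one-second half-swings)

L = g * (T / (2 * math.pi)) ** 2
print(f"Seconds-pendulum length: {L:.4f} m")  # about 0.994 m, very close to 1 m
```

The roughly 0.6% shortfall from a true meter, plus the fact that `g` varies over the Earth's surface, is why the pendulum definition was eventually passed over in favor of the meridian-based one.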
In 1790, the meter was defined as 1/10,000,000th of the distance from the North Pole to the equator, measured along the meridian through Paris. That length was then cast as a platinum bar. Despite an early surveying error of about 0.2 millimeters, such bars served as the distance standard for decades.
As measurement techniques improved, small inconsistencies kept appearing, until in the late 20th century atomic physics, and eventually light itself, led to the current definition of “one meter.” In 1983, a new standard was adopted: one meter is the distance light travels in 1/299,792,458th of a second.
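A quick sketch of what the 1983 definition means in practice: the speed of light is now an exact defined constant, so the meter is derived from it rather than the other way around.

```python
c = 299_792_458  # speed of light in m/s, exact by definition since 1983

# Time for light to travel exactly one meter:
t_one_meter = 1 / c
print(f"Light crosses one meter in {t_one_meter * 1e9:.3f} ns")

# Equivalently, in one full second light covers c meters, about 299,792 km:
print(f"Light travels {c / 1000:,.0f} km per second")
```

Because `c` is fixed by definition, any lab that can measure time precisely (via atomic clocks) can realize the meter, with no physical artifact bar required.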
Moar details (and graphics) at link.