Geek: the ampere, redefined... Most of the fundamental units involved in electronics have long since been defined in terms of observable physical phenomena that are readily repeatable. That's a crucial requirement for metrology – the science of measurement that underpins the entire electronics industry, mainly through applied metrology in the form of the calibration of electronic instruments.
But one of the most fundamental electrical units – the ampere, the unit of electrical current – has always been defined in much more abstract terms: formally, it's the constant current that would produce a force of exactly 2 × 10⁻⁷ newtons per metre of length between two infinitely long parallel conductors placed one metre apart in a vacuum. That's a thought experiment, not something you can actually set up on a bench. It looks like that may be about to change, though.
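To make the change concrete: the proposal on the table fixes the numerical value of the elementary charge e, so that the ampere becomes a counted flow of charges per second rather than a force between idealized wires. Here's a minimal sketch of the arithmetic – the constant is the exact value of e chosen for the revised SI, and the function name is just for illustration:

```python
# Sketch of the proposed redefinition: fix the elementary charge e at an
# exact value, so one ampere becomes a counted flow of charges per second.
# 1.602176634e-19 C is the exact value chosen for the revised SI.

E_CHARGE = 1.602176634e-19  # coulombs per elementary charge (exact by definition)

def charges_per_second(current_amperes: float) -> float:
    """How many elementary charges pass per second at a given current."""
    return current_amperes / E_CHARGE

print(f"1 A = {charges_per_second(1.0):.4e} elementary charges per second")
# prints: 1 A = 6.2415e+18 elementary charges per second
```

In other words, one ampere works out to roughly 6.24 × 10¹⁸ elementary charges flowing past a point each second – a definition tied to a constant of nature rather than to an artifact or an unrealizable geometry.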
My last assignment in the US Navy was to our shipboard metrology lab, where I first ran into the whole idea of calibration – far more interesting than I had imagined. We had a few requirements on the ship for very high precision measurements, mainly related to the nuclear reactors and some of the weapons systems on board. The two most challenging things for us to calibrate were precision ammeters (current-measuring devices) and precision calorimeters (you could think of them as fancy, very precise thermometers). In both cases we needed references whose calibration was traceable to NIST in no more than two steps.

I don't know how hard that is today, but back then (mid-'70s) getting those references generally involved a shipment from the NIST labs in Maryland or Colorado, and the references had to be used within a few days – otherwise their values might drift. The reason we had to go to those extremes was the lack of a definition in terms of observable physical phenomena: that forced the entire calibration chain to depend on reference standards held at the NIST labs...