The Crookes radiometer, also known as the light mill, consists of an airtight glass bulb containing a partial vacuum. Inside is a set of vanes mounted on a spindle. The vanes rotate when exposed to light, with faster rotation for more intense light, providing a quantitative measurement of electromagnetic radiation intensity. The reason for the rotation was for many years a subject of scientific debate.
It was invented in 1873 by the chemist Sir William Crookes as a by-product of chemical research. In the course of very accurate quantitative work, he was weighing samples in a partially evacuated chamber to reduce the effect of air currents, and noticed that the weighings were disturbed when sunlight shone on the balance. Investigating this effect, he created the device named after him. It is still manufactured and sold as a novelty item.
In the summer of 1952, Denmark was afflicted by a serious polio epidemic that killed hundreds of children, mostly from respiratory failure.
Dr Poul Astrup and anesthesiologist Bjørn Ibsen discovered that monitoring the pH status of a patient's blood could guide ventilation and help prevent death from respiratory failure. They contacted Radiometer, which at the time had a device for measuring the acidity of liquids.
Could Radiometer develop an instrument that would do the same with blood? In a matter of days, Radiometer’s engineers developed what became known as the world’s first blood gas instrument.
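The principle behind such an instrument can be illustrated with the Henderson-Hasselbalch equation, which relates blood pH to bicarbonate concentration and the partial pressure of CO2; it is a standard relationship in blood gas analysis, not a description of Radiometer's actual 1952 design. A minimal sketch, where the function name is illustrative and the constants 6.1 (pKa) and 0.0301 mmol/L/mmHg (CO2 solubility in plasma) are the conventional textbook values:

```python
import math

def blood_ph(hco3_mmol_per_l: float, pco2_mmhg: float) -> float:
    """Estimate blood pH via the Henderson-Hasselbalch equation.

    pH = pKa + log10([HCO3-] / (s * pCO2)), with pKa = 6.1 and
    s = 0.0301 mmol/L/mmHg (CO2 solubility in plasma).
    """
    return 6.1 + math.log10(hco3_mmol_per_l / (0.0301 * pco2_mmhg))

# Typical arterial values: HCO3- = 24 mmol/L, pCO2 = 40 mmHg
# give the familiar normal pH of about 7.40.
print(round(blood_ph(24, 40), 2))
```

Rising pCO2 at constant bicarbonate lowers the computed pH (respiratory acidosis), which is exactly the condition Astrup and Ibsen needed to detect in ventilated polio patients.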
Much has changed since 1952. However, our commitment to using advanced technology and insight into hospital processes to simplify and improve acute care testing remains the same.