THE USE OF CALORIMETRY IN NUCLEAR MATERIALS MANAGEMENT

Year
1970
Author(s)
G.T. Nutter - United States Enrichment Corporation
W.W. Rodenburg - Monsanto Research Corporation-Mound
F.A. O'Hara - The Ohio State University
Abstract
A calorimeter is a device to measure evolved or absorbed heat. For our purposes, the heat measured is that associated with radioactive decay, and the unit of measurement is the watt. Each time an atom decays, energy is released and absorbed by the surroundings, generating heat. For each isotope, this heat output is a constant determined by the energy of the decay particles and the half-life of the isotope. A point which is often overlooked is that calorimetry is one of the oldest techniques known for measuring radioactivity. In 1903, Pierre Curie and A. Laborde used a twin microcalorimeter to determine that one gram of radium generates about 100 calories per hour. Several months later, Curie and Dewar used liquid oxygen and liquid hydrogen to show that the amount of energy developed by radium and other radioactive elements did not depend on temperature. At that time, this observation was extremely important: it indicated that radioactivity was entirely different in nature from, and could not be compared with, any known phenomenon. In all other thermal processes known in physics and chemistry, the rate at which heat is developed changes with temperature.
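
The relation between decay energy, half-life, and heat output can be made concrete with a short calculation. The sketch below is illustrative only and is not taken from the paper; it estimates the specific power (watts per gram) of a pure isotope from its half-life and the energy released per decay, using nominal handbook values for Pu-238 as an assumed example.

    from math import log

    AVOGADRO = 6.02214e23        # atoms per mole
    MEV_TO_JOULE = 1.60218e-13   # joules per MeV
    SECONDS_PER_YEAR = 3.1557e7  # seconds in a Julian year

    def specific_power_w_per_g(half_life_years, energy_per_decay_mev, atomic_mass_g_per_mol):
        # Decay constant (per second) from the half-life.
        decay_constant = log(2) / (half_life_years * SECONDS_PER_YEAR)
        # Number of atoms in one gram of the pure isotope.
        atoms_per_gram = AVOGADRO / atomic_mass_g_per_mol
        # Activity (decays per second per gram) times energy per decay gives watts per gram,
        # assuming all of the decay energy is deposited as heat in the calorimeter.
        return decay_constant * atoms_per_gram * energy_per_decay_mev * MEV_TO_JOULE

    # Nominal Pu-238 values, assumed here for illustration: half-life about 87.7 years,
    # about 5.59 MeV released per alpha decay.
    print(specific_power_w_per_g(87.7, 5.59, 238.05))   # roughly 0.57 W/g

Applied to radium-226 alone, the same formula gives roughly 0.03 W/g, or about 25 calories per hour per gram; the larger figure measured by Curie and Laborde largely reflects the additional heat contributed by radium's short-lived decay daughters.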