In nuclear power technology, burnup (also known as fuel utilization) is a measure of how much energy is extracted from a primary nuclear fuel source. It is measured either as the fraction of fuel atoms that underwent fission, in %FIMA (fissions per initial metal atom)[1] or %FIFA (fissions per initial fissile atom)[2], or, preferably, as the actual energy released per mass of initial fuel, in gigawatt-days per metric ton of heavy metal (GWd/tHM) or similar units.

Expressed as a percentage: if 5% of the initial heavy metal atoms have undergone fission, the burnup is 5%FIMA. If these 5% were all of the 235U present in the fuel at the beginning, the burnup is 100%FIFA (since 235U is fissile and the remaining 95% of heavy metal atoms, such as 238U, are not). In reactor operations, this percentage is difficult to measure, so the energy-based definition is preferred. It is computed by multiplying the thermal power of the plant by the time of operation and dividing by the mass of the initial fuel loading. For example, if a 3000 MW thermal plant (equivalent to 1000 MW electric at 30% efficiency, which is typical of US LWRs) uses 24 tonnes of enriched uranium (tU) and operates at full power for 1 year, the average burnup of the fuel is (3000 MW × 365 d)/24 metric tonnes = 45.63 GWd/t, or 45,625 MWd/tHM (where HM stands for heavy metal, meaning actinides like thorium, uranium, plutonium, etc.).
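As a rough check of such figures, the average burnup can be computed directly from the thermal power, the time at full power, and the initial heavy-metal loading. The short sketch below (a minimal illustration; the function name and variable names are chosen here for clarity, and the inputs are the example values from the text rather than data for any specific plant) carries out this arithmetic:

    def average_burnup_gwd_per_thm(thermal_power_mw, days_at_full_power, initial_heavy_metal_t):
        """Average burnup in gigawatt-days per tonne of heavy metal (GWd/tHM)."""
        energy_mwd = thermal_power_mw * days_at_full_power      # thermal energy released, MW·d
        burnup_mwd_per_t = energy_mwd / initial_heavy_metal_t   # MWd/tHM
        return burnup_mwd_per_t / 1000.0                        # convert MWd/tHM to GWd/tHM

    # Example from the text: 3000 MWth for 1 year on 24 t of heavy metal
    print(average_burnup_gwd_per_thm(3000, 365, 24))  # about 45.6 GWd/tHM

Note that this gives the core-average burnup; individual fuel assemblies discharged from a reactor can have burnups above or below this average depending on their position and residence time in the core.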
