PhD abstract

Reactor dosimetry is used to determine the neutron fluence received during an irradiation and to characterize its spectrum (neutron energy distribution). The technique is based on measuring the activity of irradiated dosimeters made of pure metals or alloys. This activity measurement is performed by gamma and/or X-ray spectrometry and currently relies on specific standard dosimeters validated for the measurement conditions. The goal of this thesis is to avoid this calibration step and to measure the activity of the sample directly. The study focused on niobium and rhodium dosimeters, which are used to characterize neutrons in the energy range around 1 MeV. Their activation produces 93mNb and 103mRh, respectively. These two radionuclides decay through an isomeric gamma transition, emitting mainly K X-rays with energies around 20 keV, on which the spectrometric activity measurement is based. Owing to their low energy, however, these X-rays are particularly difficult to measure accurately. The various parameters required to determine the activity of the dosimeters with a relative standard uncertainty of around 2% were studied in detail.

The work initially focused on the calibration of high-purity germanium (HPGe) detectors in the energy range between 11 keV and 150 keV. This is a crucial step in determining the activity of a radionuclide sample and is difficult to achieve in the energy range of interest. The experimental approach, using standard point sources, was coupled with semi-empirical modelling and with simulations of radiation-matter interactions by Monte Carlo methods (PENELOPE and GEANT4). These methods made it possible to study in detail photon scattering at low energy, around 20 keV, which interferes with the full-energy peaks in the spectra and complicates their analysis. In a second step, Monte Carlo simulations were used to calculate the correction factors needed to derive the dosimeter activity: self-absorption of photons in the dosimeter material and the change of geometry between the calibration conditions (point source) and the measurement conditions (solid metal sample). The fluorescence induced by impurities (present in the dosimeter material or created during irradiation in the reactor) was studied and the corresponding correction factors were established.

Radioactive decay data, particularly photon emission intensities, are the main contributors to the uncertainty of the dosimeter activity results. X-ray emission intensities are rarely measured experimentally; most often, their values are calculated from fundamental parameters, i.e. internal conversion coefficients and fluorescence yields, together with a balanced decay scheme of the nuclide. Several experiments were therefore designed to provide new experimental data. The mass attenuation coefficients and K fluorescence yields of niobium and rhodium were determined using a monochromatic photon beam at the SOLEIL synchrotron facility. The photon emission intensities of 103mRh were measured using two approaches, one based on rhodium activated in the ISIS reactor and the other on a palladium-103 solution. All these new values are compared with previously published data and the decay scheme of 103mRh is discussed.
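As a simplified illustration of the quantities involved (the symbols and conventions below are chosen here for clarity and are not quoted from the thesis), the K X-ray emission intensity per decay of a pure isomeric transition and the activity derived from a spectrometric measurement can be sketched as

\[
  P_{K\mathrm{X}} \;\approx\; \frac{\alpha_K}{1+\alpha_T}\,\omega_K ,
  \qquad
  A \;\approx\; \frac{N_{\mathrm{peak}}}{t\,\varepsilon_{\mathrm{FEP}}\,P_{K\mathrm{X}}}\;C_{\mathrm{self}}\,C_{\mathrm{geom}} ,
\]

where \(\alpha_K\) and \(\alpha_T\) are the K-shell and total internal conversion coefficients, \(\omega_K\) the K fluorescence yield, \(N_{\mathrm{peak}}\) the net counts in the K X-ray full-energy peak acquired over the counting time \(t\), \(\varepsilon_{\mathrm{FEP}}\) the full-energy-peak efficiency from the point-source calibration, and \(C_{\mathrm{self}}\) and \(C_{\mathrm{geom}}\) the self-absorption and geometry-transfer correction factors mentioned above (secondary vacancy creation and coincidence effects are neglected in this sketch).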

Key words

radioactivity measurement, X- and gamma-ray spectrometry, reactor dosimetry, niobium, rhodium, Monte Carlo simulation