Relationship between Core Impurity, Fuel and Thermal Diffusivities in L- and H-mode JET Plasmas
Impurity densities are a key factor in determining reactor performance. While the viability of a radiation shield, generated at the edge by impurity seeding to dissipate convective losses, is still a matter of debate, the presence of impurities in the core is clearly undesirable because of the deleterious consequences of radiation losses and plasma dilution on fusion reactivity. In particular, the accumulation of impurities, with the associated risk of disruptions, cannot be tolerated. A figure of merit in this context is the ratio of the transport parameters of the impurities to those of the main fuel and of energy. In the published literature Dimp/χe, or more often τimp/τE, is reported to vary widely, depending on parameters such as elongation, triangularity and q95, and is theoretically predicted to be ~1 in the case of electrostatic turbulent transport. In this paper we analyse the ratios Dimp/χe and, to a lesser extent, Dimp/DD in a group of JET discharges devised to probe heat and particle (main gas and impurity) diffusivities by means of transient perturbations, namely electron heating modulation, shallow pellet injection and laser ablation of nickel targets, respectively.
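As a minimal sketch of the transient-perturbation idea, the snippet below fits an exponential decay to a core impurity signal, of the kind obtained after laser ablation of a nickel target, to extract an impurity confinement time τimp, which can then be compared with τE. All numerical values are illustrative assumptions, not JET data, and the single-exponential model is a simplification of the actual transport analysis.

```python
import numpy as np

def fit_decay_time(t, signal):
    """Least-squares fit of log(signal) versus t; returns the e-folding time.

    Assumes the post-injection decay is approximately exponential,
    n_imp(t) ~ exp(-t / tau_imp), so log(signal) is linear in t.
    """
    slope, _intercept = np.polyfit(t, np.log(signal), 1)
    return -1.0 / slope

# Synthetic decay trace with an assumed tau_imp = 0.20 s plus 1% noise
# (stand-in for a soft X-ray or VUV nickel-line brightness signal).
rng = np.random.default_rng(0)
t = np.linspace(0.0, 0.6, 120)                      # time after ablation [s]
signal = np.exp(-t / 0.20) * (1.0 + 0.01 * rng.standard_normal(t.size))

tau_imp = fit_decay_time(t, signal)
tau_E = 0.25                                        # illustrative value [s]
print(f"tau_imp = {tau_imp:.3f} s, tau_imp/tau_E = {tau_imp / tau_E:.2f}")
```

The same log-linear fit applies, with the appropriate signals, to the decay phases of pellet-induced density perturbations; heating modulation is instead usually analysed in terms of the amplitude and phase of the propagating heat wave.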