Recent studies linking radiation exposure from pediatric computed tomography (CT) to increased risks of leukemia and brain tumors lacked data to control for cancer susceptibility syndromes (CSS). These syndromes may be confounders because they are associated with an increased cancer risk and may increase the likelihood of pediatric CT scans. We identify CSS predisposing to leukemia and brain tumors through a systematic literature search and summarize their prevalence and associated risks. Since published empirical evidence on patterns of CT use is lacking for most types of CSS, we estimate the confounding bias of relative risks (RR) for categories of radiation exposure based on expert opinion about patterns of CT scanning among CSS patients. We estimate that radiation-related RRs for leukemia are not meaningfully confounded by Down syndrome, Noonan syndrome, or other CSS. Likewise, tuberous sclerosis complex, von Hippel-Lindau disease, neurofibromatosis type 1, and other CSS do not meaningfully confound RRs for brain tumors. Empirical data on the use of CT scans among CSS patients are urgently needed. Our assessment indicates that the associations between radiation exposure from pediatric CT scans and leukemia or brain tumors reported in previous studies are unlikely to be substantially confounded by unmeasured CSS.
Monte Carlo calculations were used to investigate the efficiency of radiation protection equipment in reducing eye and whole body doses during fluoroscopically guided interventional procedures. Eye lens doses were determined considering different models of eyewear with various shapes, sizes and lead thickness. The origin of scattered radiation reaching the eyes was also assessed to explain the variation in the protection efficiency of the different eyewear models with exposure conditions. The work also investigates the variation of eye and whole body doses with ceiling-suspended shields of various shapes and positioning. For all simulations, a broad spectrum of configurations typical for most interventional procedures was considered. Calculations showed that 'wrap around' glasses are the most efficient eyewear models reducing, on average, the dose by 74% and 21% for the left and right eyes respectively. The air gap between the glasses and the eyes was found to be the primary source of scattered radiation reaching the eyes. The ceiling-suspended screens were more efficient when positioned close to the patient's skin and to the x-ray field. With the use of such shields, the Hp(10) values recorded at the collar, chest and waist level and the Hp(3) values for both eyes were reduced on average by 47%, 37%, 20% and 56% respectively. Finally, simulations showed that beam quality and lead thickness have little influence on eye dose while beam projection, the position and head orientation of the operator as well as the distance between the image detector and the patient are key parameters affecting eye and whole body doses.
Date City in Fukushima Prefecture has conducted a population-wide individual dose monitoring program since the Fukushima Daiichi Nuclear Power Plant Accident, which provides a unique and comprehensive data set of the individual doses of citizens. The purpose of this paper, the first in the series, is to establish a method for estimating effective doses based on the available ambient dose rate survey data. We thus examined the relationship between the individual external doses and the corresponding ambient doses assessed from airborne surveys. The results show that the individual doses were about 0.15 times the ambient doses throughout the period of the airborne surveys used, the coefficient of 0.15 being a factor of 4 smaller than the value employed by the Japanese government. The method obtained in this study could aid in the prediction of individual doses in the early phase of future radiological accidents involving large-scale contamination.
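The conversion described above reduces to a single multiplicative coefficient applied to the ambient dose. A minimal sketch, with function name and example values our own rather than from the study:

```python
# Estimate individual external dose from an ambient dose (e.g. from an
# airborne survey) using a reduction coefficient. 0.15 is the empirical
# coefficient reported in the study; 0.6 (a factor of 4 larger) corresponds
# to the value employed by the Japanese government.

def individual_dose(ambient_dose_mSv: float, coefficient: float = 0.15) -> float:
    """Estimated individual external dose (mSv) from ambient dose (mSv)."""
    return coefficient * ambient_dose_mSv

ambient = 2.0  # mSv, a hypothetical annual ambient dose
print(individual_dose(ambient))        # empirical coefficient
print(individual_dose(ambient, 0.6))   # government coefficient
```

The practical point is the factor-of-4 gap between the two coefficients: predictions based on the government value would overestimate individual doses substantially.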
We developed computational methods and tools to assess organ doses for pediatric and adult patients undergoing computed tomography (CT) examinations. We used the International Commission on Radiological Protection (ICRP) reference pediatric and adult phantoms, combined with Monte Carlo simulation of a reference CT scanner, to establish comprehensive organ dose coefficients (DC), defined as organ absorbed dose per unit volumetric CT dose index (CTDIvol) (mGy/mGy). We also developed methods to estimate organ doses with tube current modulation techniques and size-specific dose estimates. A graphical user interface was designed to obtain user input of patient- and scan-specific parameters, and to calculate and display organ doses. A batch calculation routine was also integrated into the program to automatically calculate organ doses for a large number of patients. We named the computer program the National Cancer Institute dosimetry system for CT (NCICT). We compared our dose coefficients with those from CT-Expo, and evaluated the performance of our program using CT patient data. Our pediatric DCs show good agreement with those from CT-Expo except for the thyroid. Our results suggest that the adult phantom in CT-Expo represents a pediatric individual between 10 and 15 years of age rather than an adult. The comparison of CTDIvol values between NCICT and dose pages from 10 selected CT scans shows good agreement, with differences of less than 12% except for two cases (up to 20%). The organ dose comparison between mean and modulated mAs shows that mean mAs-based calculation significantly overestimates dose (up to 2.4-fold) to the organs in close proximity to the lungs in chest and chest-abdomen-pelvis scans. Our program provides more realistic anatomy based on the ICRP reference phantoms, higher age resolution, the most up-to-date bone marrow dosimetry, and several convenient features compared with previous tools. The NCICT will be available for research purposes in the near future.
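The dose-coefficient approach underlying this kind of tool can be illustrated with a minimal sketch; the coefficient values and function name below are placeholders for illustration, not actual NCICT data:

```python
# Organ absorbed dose is estimated as DC (mGy per mGy of CTDIvol) multiplied
# by the scan's CTDIvol. Real tools tabulate DCs per phantom, age, sex, organ
# and scan range; the values here are hypothetical.

ORGAN_DC = {"lung": 1.5, "thyroid": 2.0}  # hypothetical DCs, mGy/mGy

def organ_dose(organ: str, ctdi_vol_mGy: float) -> float:
    """Organ absorbed dose (mGy) = DC * CTDIvol (mGy)."""
    return ORGAN_DC[organ] * ctdi_vol_mGy

print(organ_dose("lung", 10.0))  # dose for a scan with CTDIvol of 10 mGy
```

This separation of scanner output (CTDIvol) from patient anatomy (DC) is what lets a single table of coefficients serve many scan protocols.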
A thorough literature review of the current situation on the implementation of eye lens monitoring has been performed in order to provide recommendations regarding dosemeter types, calibration procedures and practical aspects of eye lens monitoring for interventional radiology personnel. The most relevant data and recommendations from about 100 papers have been analysed and classified under the following topics: current challenges in eye lens monitoring; conversion coefficients, phantoms and calibration procedures for eye lens dose evaluation; correction factors and dosemeters for eye lens dose measurements; dosemeter position and influence of protective devices. The major findings of the review can be summarised as follows: the recommended operational quantity for eye lens monitoring is Hp(3). At present, several dosemeters are available for eye lens monitoring and calibration procedures are being developed. However, in practice, alternative methods are very often used to assess the dose to the eye lens. A summary of correction factors found in the literature for the assessment of the eye lens dose is provided. These factors can give an estimate of the eye lens dose when alternative methods, such as the use of a whole body dosemeter, are used. A wide range of values is found, indicating the large uncertainty associated with these simplified methods. Reduction factors for the most common protective devices, obtained experimentally and using Monte Carlo calculations, are presented. The paper concludes that a dosemeter placed at collar level outside the lead apron can provide a useful first estimate of the eye lens exposure. However, for workplaces where the estimated annual equivalent dose to the eye lens is close to the dose limit, specific eye lens monitoring should be performed. Finally, training of the involved medical staff on the risks of ionising radiation for the eye lens and on the correct use of protective systems is strongly recommended.
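The first-estimate approach from the conclusion can be sketched as a simple scaling of the collar dosemeter reading; the correction factor used below is a placeholder standing in for the wide range of published factors, not a recommended value:

```python
# First estimate of eye lens dose Hp(3) from a whole-body dosemeter worn at
# collar level outside the lead apron: Hp(3) ≈ correction_factor * Hp(10).
# The factor 0.75 is purely illustrative — the review reports a wide range
# of published factors, reflecting the large uncertainty of this method.

def eye_lens_estimate(collar_hp10_mSv: float, correction_factor: float = 0.75) -> float:
    """Hp(3) first estimate (mSv) from a collar Hp(10) reading (mSv)."""
    return correction_factor * collar_hp10_mSv

print(eye_lens_estimate(8.0))  # estimate from an 8 mSv annual collar reading
```

As the review stresses, when such an estimate approaches the annual eye lens dose limit, a dedicated Hp(3) dosemeter should replace the simplified method.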
The recently published NCRP Commentary No. 27 evaluated new information from epidemiologic studies as to its degree of support for applying the linear nonthreshold (LNT) model of carcinogenic effects for radiation protection purposes (NCRP 2018 Implications of Recent Epidemiologic Studies for the Linear Nonthreshold Model and Radiation Protection, Commentary No. 27 (Bethesda, MD: National Council on Radiation Protection and Measurements)). The aim was to determine whether recent epidemiologic studies of low-LET radiation, particularly those at low doses and/or low dose rates (LD/LDR), broadly support the LNT model of carcinogenic risk or, on the contrary, demonstrate sufficient evidence that the LNT model is inappropriate for the purposes of radiation protection. An updated review was needed because a considerable number of reports of radiation epidemiologic studies based on new or updated data have been published since other major reviews were conducted by national and international scientific committees. The Commentary provides a critical review of the LD/LDR studies that are most directly applicable to current occupational, environmental and medical radiation exposure circumstances. This Memorandum summarises several of the more important LD/LDR studies reviewed in Commentary No. 27 that incorporate radiation dose responses for solid cancer and leukemia. In addition, an overview is provided of radiation studies of breast and thyroid cancers, and of cancer after childhood exposures. Non-cancer effects, such as ischemic heart disease, cataracts, and heritable genetic effects, are briefly touched upon.
To assess the applicability and utility of the LNT model for radiation protection, the Commentary evaluated 29 epidemiologic studies or groups of studies, primarily of total solid cancer, in terms of strengths and weaknesses in their epidemiologic methods, dosimetry approaches, and statistical modelling, and the degree to which they supported an LNT model for continued use in radiation protection. Recommendations for how to make epidemiologic radiation studies more informative are outlined. The NCRP Committee recognises that the risks from LD/LDR exposures are small and uncertain. The Committee judged that the available epidemiologic data were broadly supportive of the LNT model and that, at this time, no alternative dose-response relationship appears more pragmatic or prudent for radiation protection purposes.
Following the Fukushima accident, the International Commission on Radiological Protection (ICRP) convened a task group to compile lessons learned from the nuclear reactor accident at the Fukushima Daiichi nuclear power plant in Japan, with respect to the ICRP system of radiological protection. In this memorandum the members of the task group express their personal views on issues arising during and after the accident, without explicit endorsement of or approval by the ICRP. While the affected people were largely protected against radiation exposure and no one incurred a lethal dose of radiation (or a dose sufficiently large to cause radiation sickness), many radiological protection questions were raised. The following issues were identified: inferring radiation risks (and the misunderstanding of nominal risk coefficients); attributing radiation effects from low dose exposures; quantifying radiation exposure; assessing the importance of internal exposures; managing emergency crises; protecting rescuers and volunteers; responding with medical aid; justifying necessary but disruptive protective actions; transiting from an emergency to an existing situation; rehabilitating evacuated areas; restricting individual doses of members of the public; caring for infants and children; categorising public exposures due to an accident; considering pregnant women and their foetuses and embryos; monitoring public protection; dealing with 'contamination' of territories, rubble and residues and consumer products; recognising the importance of psychological consequences; and fostering the sharing of information. Relevant ICRP Recommendations were scrutinised, lessons were collected and suggestions were compiled. It was concluded that the radiological protection community has an ethical duty to learn from the lessons of Fukushima and resolve any identified challenges.
Before another large accident occurs, it should be ensured, inter alia, that: radiation risk coefficients of potential health effects are properly interpreted; the limitations of epidemiological studies for attributing radiation effects following low exposures are understood; any confusion on protection quantities and units is resolved; the potential hazard from the intake of radionuclides into the body is elucidated; rescuers and volunteers are protected with an ad hoc system; clear recommendations on crisis management and medical care and on recovery and rehabilitation are available; recommendations on public protection levels (including infants, children, and pregnant women and their expected offspring) and associated issues are consistent and understandable; updated recommendations on public monitoring policy are available; acceptable (or tolerable) 'contamination' levels are clearly stated and defined; strategies for mitigating the serious psychological consequences arising from radiological accidents are sought; and, last but not least, failures in fostering information sharing on radiological protection policy after an accident are addressed with recommendations to minimise such lapses in communication.
The time-integrated absorbed dose to the thyroid gland in the years after a fallout event can indicate the potential excess number of thyroid cancers among young individuals after a radionuclide release. Typical mean values of the absorbed dose to the thyroid have been calculated previously using reported data on radioiodine obtained from air sampling and dairy milk surveys in Sweden after the Chernobyl fallout, not including the contribution from Cs-134 and Cs-137. We have developed a model for Swedish conditions taking these additional dose contributions into account. Our estimate of the average time-integrated absorbed dose to the thyroid, Dth,tot, during the first 5 years after fallout ranged from 0.5-4.1 mGy for infants and from 0.3-3.3 mGy for adults. The contribution to Dth,tot from I-131 through inhalation and milk consumption varied considerably among different regions of Sweden, ranging from 9%-79% in infants, and from 4%-58% in adults. External irradiation and exposure from the ingestion of Cs-134,137 in foodstuffs accounted for the remaining contributions to Dth,tot (i.e. up to 96% for adults). These large variations can be explained by the highly diverse conditions in the regions studied, such as different degrees of fractionation between wet and dry deposition, different grazing restrictions on dairy cattle, and differences in Cs-134,137 transfers through food resulting from differences in the local fallout. It is our conclusion that the main contribution to Dth,tot from nuclear power plant fallout in areas subjected to predominantly wet deposition will be from external exposure from ground deposition, followed by internal exposure from contaminated food containing the long-lived fission product Cs-137 and the neutron-activation product Cs-134. The contribution from Cs-134,137 to the thyroid absorbed dose should thus be taken into account in future epidemiological studies.
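The pathway breakdown above amounts to expressing each contribution as a share of the total thyroid dose Dth,tot. An illustrative sketch, with hypothetical per-pathway doses chosen within the ranges reported for infants:

```python
# Share of each exposure pathway in the total time-integrated thyroid dose.
# Pathway names follow the abstract; the mGy values are hypothetical
# illustrations, not results from the study.

pathways_mGy = {
    "I-131 (inhalation + milk)": 1.2,
    "external (ground deposition)": 1.8,
    "Cs-134,137 ingestion": 0.6,
}

total = sum(pathways_mGy.values())  # Dth,tot for this illustration
shares_pct = {name: round(100 * dose / total) for name, dose in pathways_mGy.items()}
print(shares_pct)
```

With these example numbers the I-131 share lands at about a third, inside the 9%-79% regional range the study reports for infants.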
Computed tomography (CT) has great clinical utility and its usage has increased dramatically over the years. Concerns have been raised, however, about the health impacts of ionising radiation exposure from CTs, particularly in children, who have a higher risk for some radiation-induced diseases. Direct estimation of the health impact of these exposures is needed, but the conduct of epidemiological studies of paediatric CT populations poses a number of challenges which, if not addressed, could invalidate the results. The aim of the present paper is to review the main challenges of a study on the health impact of paediatric CTs and how the protocol of the European collaborative study EPI-CT, coordinated by the International Agency for Research on Cancer (IARC), is designed to address them. The study, based on a common protocol, is being conducted in Belgium, Denmark, France, Germany, the Netherlands, Norway, Spain, Sweden and the United Kingdom and has recruited over one million patients suitable for long-term prospective follow-up. Cohort accrual relies on records of participating hospital radiology departments. Basic demographic information and technical data on the CT procedure needed to estimate organ doses are being abstracted, and passive follow-up is being conducted by linkage to population-based cancer and mortality registries. The main issues which may affect the validity of study results include missing doses from other radiological procedures, missing CTs, confounding by CT indication and by socioeconomic status, and dose reconstruction. Sub-studies are underway to evaluate their potential impact. By focusing on the issues which challenge the validity of risk estimates from CT exposures, EPI-CT will be able to address limitations of previous CT studies, thus providing reliable estimates of the risk of solid tumours and leukaemia from paediatric CT exposures and a scientific basis for the optimisation of paediatric CT protocols and patient protection.
This study is the first to explore the risks of secondary cancer in the healthy organs of Chinese pediatric patients with brain tumors after boron neutron capture therapy (BNCT). Three neutron beam irradiation geometries (RLAT, TOP, and PA) were adopted for treating patients with brain tumors under the clinical conditions of BNCT. The organs of concern were those with high cancer morbidity in China (e.g., lung, liver, and stomach). The equivalent doses for these organs were calculated using Monte Carlo simulation and anthropomorphic pediatric phantoms with Chinese physiological features. The secondary cancer risk, characterized by the lifetime attributable risk (LAR) factor described in the BEIR VII report, was compared among the three irradiation geometries. Results showed that the LAR was lower with the PA irradiation geometry than with the other two geometries when a 2 cm-diameter tumor was at a depth of 6 cm in the right brain. Under the PA irradiation geometry, the LAR in the organs increased with increasing tumor volume and depth because of the longer irradiation time. As patient age increased from 10 to 15 years, the LAR decreased, which was related to increased patient height and shorter remaining life expectancy. Female patients had a relatively higher secondary cancer risk than male patients in this study, possibly because of their thinner bodies and the consequently weaker shielding of internal organs. In conclusion, the risks of secondary cancer in organs were related to irradiation geometry, sex, and age, indicating that secondary cancer risk is a personalized parameter that should be evaluated before administering BNCT, especially in patients with large or deep tumors.
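Aggregating secondary-cancer risk across organs can be sketched as a dose-weighted sum; both the doses and the per-unit-dose risk factors below are placeholders for illustration, since actual BEIR VII LAR factors are tabulated by sex, age at exposure, and organ:

```python
# Hedged sketch: total secondary-cancer risk aggregated over organs as
# sum(equivalent_dose * risk_per_unit_dose). All numbers are hypothetical,
# not BEIR VII values or results from the study.

organ_doses_Sv = {"lung": 0.10, "liver": 0.05, "stomach": 0.08}   # hypothetical
risk_per_Sv = {"lung": 0.010, "liver": 0.003, "stomach": 0.005}   # hypothetical

total_lar = sum(organ_doses_Sv[o] * risk_per_Sv[o] for o in organ_doses_Sv)
print(total_lar)
```

The structure makes the paper's conclusion concrete: changing the irradiation geometry changes the organ doses, and with them every term in the sum, so the risk is inherently patient- and plan-specific.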
Over the past decades, the International Commission on Radiological Protection (ICRP) has used radiation detriment, a multidimensional concept that quantifies the overall harm to health from stochastic effects of low-level radiation exposure of different parts of the body. Each tissue-specific detriment is determined from the nominal tissue-specific risk coefficient, weighted by the severity of the disease in terms of lethality, impact on quality of life, and years of life lost. Total detriment is the sum of the detriments for the separate tissues and organs. Tissue weighting factors for the calculation of effective dose are based on the relative contribution of each tissue to the total detriment. Calculating radiation detriment is a complex process that requires information from various sources and judgements on how to perform the calculations, so it is important to document the calculation methodology. To improve the traceability of the calculations and form a solid basis for future recommendations, ICRP Task Group 102 on detriment calculation methodology was established in 2016. As part of its mission, the history of radiation detriment was reviewed and the process of detriment calculation was detailed. This article summarizes that work, aiming to clarify the methodology of detriment calculation currently used by ICRP.
Following the Fukushima incident, doses from external exposure accounted for the majority of the total doses. Although countermeasures aimed at reducing radiation exposure are being implemented, little information is available on the impact of decontamination on individual doses among residents of radioactively contaminated areas. To evaluate the effectiveness of decontamination in reducing individual doses, and to examine the influence of the timing of decontamination and of the district, we analyzed data for 18,392 adults and 3,650 children in Minamisoma City, Fukushima, who participated in voluntary screening programs using individual radiation dosimeters (Glass Badge) between June 2013 and September 2016. Annual dose reduction rates (DRR) for 2013, 2014, and 2015 were calculated and compared between areas with and without decontamination. Using a regression approach and Monte Carlo simulation, the dose reduction rate attributable to decontamination (DRRd') was also estimated. The annual DRR in areas with decontamination, for both adults and children, were significantly higher than in areas without decontamination, depending on the timing of decontamination: 31-36% for 2013-2014 for adults and 33-35% for children in decontamination areas, compared with 12-23% and 13-23%, respectively, in areas without decontamination. There was a positive correlation between DRRd' and individual doses, and DRRd' was estimated at 30-40% for adults and children with doses of 3 mSv/y in 2013 and 2014. This study demonstrates that decontamination lowered individual doses from external exposure: the higher the dose at the start of decontamination, the greater the dose reduction rate, regardless of the timing of decontamination. Our study confirms that decontamination was useful for high-dose areas in the later phases of the incident.
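The annual dose reduction rate compared above is a simple relative difference between successive years. A minimal sketch, with our own variable names (the study's actual DRRd' estimation used regression and Monte Carlo simulation on the Glass Badge data):

```python
# Annual dose reduction rate: the percent drop in an individual's annual
# external dose between two consecutive years.

def dose_reduction_rate(dose_year1_mSv: float, dose_year2_mSv: float) -> float:
    """DRR (%) = 100 * (D_year1 - D_year2) / D_year1."""
    return 100.0 * (dose_year1_mSv - dose_year2_mSv) / dose_year1_mSv

# e.g. an adult whose annual dose fell from 3.0 to 2.0 mSv
print(round(dose_reduction_rate(3.0, 2.0), 1))
```

A drop from 3.0 to 2.0 mSv gives a DRR of about 33%, comparable to the 31-36% observed in decontaminated areas.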
Monitoring and protection of occupational eye doses in interventional radiology (IR) are important matters. The DOSIRIS™ dosimeter is a practical solution for estimating the 3 mm dose equivalent, Hp(3): it can be worn behind lead glasses and, being adjustable along three axes, can be placed close to the eye and in contact with the skin, making it a suitable eye lens dosimeter. However, the fundamental characteristics of the DOSIRIS™ in the diagnostic X-ray energy domain (including that of IR X-ray systems) remain unclear. Here, we evaluated the performance of the dosimeter in that energy range.
The results showed that the DOSIRIS™ has good fundamental characteristics (batch uniformity, dose linearity, energy dependence, and angular dependence) in the diagnostic X-ray energy domain.
We conclude that the DOSIRIS™ has satisfactory basic performance for occupational eye dosimetry in diagnostic X-ray energy settings (including IR X-ray systems).
General X-ray images have a lower probability of nodule detection than other modalities. In children especially, the probability of nodule detection is likely to drop because of the poor image quality that results from using a low radiation dose. This study aimed to demonstrate the effectiveness of a fast non-local means (FNLM) filter in increasing the probability of nodule detection in pediatric chest X-ray images and in reducing radiation dose while maintaining image quality. Quantitative assessment of the normalized noise power spectrum (NNPS), coefficient of variation (COV), and contrast-to-noise ratio (CNR) was performed after applying four filters (median, Wiener, total variation, and FNLM) to images of a 1-year-old child phantom into which a 3D-printed nodule phantom had been inserted. Assessment was performed on AP and LAT view images acquired with the tube voltage reduced to 38% and 27%, and the tube current reduced to 84% and 61%, respectively. The results showed the lowest NNPS and COV values and the highest CNR value when the FNLM filter was applied. Moreover, the AP view results showed a 37% decrease in COV and a 30% increase in CNR in FNLM-filtered images exposed with the tube voltage and current reduced to 29% and 50%, respectively. The LAT view results showed a 5% decrease in COV and a 36% increase in CNR in FNLM-filtered images exposed with the tube current reduced by 27%. By applying the FNLM filter, the probability of nodule detection could be increased through denoising and contrast enhancement. Moreover, using the FNLM filter could reduce cancer risk in pediatric patients by reducing radiation dose by about 30% to 44%.
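The COV and CNR metrics used above can be sketched on synthetic region-of-interest data; the ROI values below are hypothetical, and the FNLM filter itself is available in common libraries (e.g. scikit-image's `denoise_nl_means` with `fast_mode=True`), not reimplemented here:

```python
import numpy as np

# Image-quality metrics on two regions of interest (ROIs):
#   COV = sigma / mean over a background ROI (lower is less noisy)
#   CNR = |mean_nodule - mean_background| / sigma_background (higher is better)
# The synthetic ROIs below stand in for phantom image patches.

rng = np.random.default_rng(0)
background = rng.normal(100.0, 10.0, size=(64, 64))  # noisy background ROI
nodule = rng.normal(140.0, 10.0, size=(16, 16))      # simulated nodule ROI

cov = background.std() / background.mean()
cnr = abs(nodule.mean() - background.mean()) / background.std()
print(round(cov, 3), round(cnr, 2))
```

Denoising lowers the background standard deviation, which drives COV down and CNR up simultaneously, which is the mechanism behind the improvements the study reports.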
This paper presents an overview of the radiation protection response to the plutonium intake accident that occurred at the Plutonium Fuel Facility of the Oarai Research and Development Center of the Japan Atomic Energy Agency on June 6, 2017. Five workers were checking a storage container of fast reactor nuclear fuel material in the hood of the analysis room at the Plutonium Fuel Facility. Around 11:15 a.m., vinyl bags containing plutonium and enriched uranium inside the fuel material container burst during the inspection work. All the workers heard the bang, and misty dust leaked from the container. This event caused significant skin and nasal α-contamination in three workers and skin α-contamination alone in one worker. Decontamination was conducted in the shower room, and the five workers were then transferred to the Nuclear Fuel Cycle Engineering Laboratory to evaluate the inhalation intake of plutonium and other nuclides in the lungs. Maximum values of 2.2×10^4 Bq for 239Pu and 2.2×10^2 Bq for 241Am were estimated by the lung monitor. Based on these results, a chelating agent was injected to promote prompt excretion of the plutonium. The next morning, the five workers were transferred to the National Institute of Radiological Sciences for treatment, including decontamination of their skin and measurement by a lung monitor; at that time, no obvious energy peak was confirmed for plutonium. The Japan Health Physics Society launched an ad hoc working group on the plutonium intake accident around the middle of June to survey issues and extract lessons for radiological protection. The authors, members of the ad hoc working group, report here on the activities of the working group.
Statistically significant increases in heart disease (HD) mortality with cumulative recorded occupational radiation dose from external sources were observed among 174 541 subjects in the UK National Registry for Radiation Workers (NRRW) cohort, who were predominantly exposed to protracted low doses over a number of years and were followed up until the end of 2011. Among the subtypes of HD, increasing trends with cumulative dose were observed for ischaemic heart disease (IHD) and other HD (which includes pulmonary HD, valve disorders, cardiomyopathy, cardiac dysrhythmias, carditis, conduction disorders and ill-defined HD). For IHD, the increased mortality appears at least 20 years after first exposure, and the excess risk peaked between 30 and 40 years after first exposure. There was no evidence of excess risk of IHD mortality for cumulative radiation doses below 0.1 Sv. A categorical analysis also showed that the risk falls below the value expected from a linear trend for cumulative doses greater than 0.4 Sv; this smaller risk appears to be primarily associated with workers who started employment at a younger age and who were employed for longer than 30 years, reflecting a possible healthy worker survivor effect. This analysis provides further evidence that low doses of radiation exposure may be associated with an increased risk of IHD. For other HD, the data suggest an increased risk starting around 40 years after first exposure; the risk was statistically significantly raised only for cumulative doses above 0.4 Sv. However, the number of deaths in this group was small and the results need to be interpreted with caution.
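The linear trend against which the categorical risks are compared can be written as a simple excess-relative-risk model; the slope below is a placeholder for illustration, not the NRRW estimate:

```python
# Illustrative linear excess-relative-risk (ERR) model of the kind fitted in
# such cohort analyses: RR(D) = 1 + beta * D, with D the cumulative dose in
# Sv. beta here is hypothetical — the paper's fitted values are not
# reproduced.

def relative_risk(dose_Sv: float, beta_per_Sv: float = 0.3) -> float:
    """Relative risk under a linear no-threshold trend."""
    return 1.0 + beta_per_Sv * dose_Sv

print(relative_risk(0.1))  # RR near the 0.1 Sv threshold of observed excess
print(relative_risk(0.4))  # RR at the dose above which the data fall below the line
```

A categorical analysis compares the observed risk in each dose band against RR(D) from this line; the abstract's finding is that above 0.4 Sv the observed values sit below it.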
The potential for adverse health effects from internal exposure to plutonium has been recognised since its discovery in the 1940s. However, in the absence of specific information, potential risks from plutonium exposure have largely been controlled through knowledge of radiation exposure risks in general, much of which comes from external radiation exposures. To obtain more direct estimates of potential internal exposure risks, epidemiological studies of plutonium workers need to be conducted. Such epidemiological analyses require individual plutonium exposure estimates that are as accurate and unbiased as possible. The UK Sellafield workforce includes one of the world's largest cohorts of plutonium workers, which constitutes, by a considerable margin, the group of workers most comprehensively monitored for internal exposure to this alpha-particle emitter. However, for several hundred workers employed at the start of plutonium work at the facility, during the period from 1952 to 1963, the available historical urinalysis results cannot provide the sufficiently accurate and unbiased exposure assessments needed for epidemiological studies. Consequently, these early workers have had to be excluded from epidemiological analyses, which has significantly reduced the power of these studies. A promising quantitative methodology for overcoming missing or deficient exposure data is to use exposure data from other sources to estimate the average exposure a 'typical worker' would have received, and to collate this information by occupation and year. This approach is called a job-exposure matrix (JEM). A pilot study to construct a population-specific quantitative JEM for the early plutonium workers at Sellafield during 1952-1963, for whom reliable urinalysis results do not exist, has shown the potential of the JEM approach to produce more reliable and useful exposure estimates for epidemiological research.
In heavy water reactors, radionuclides are generated and then removed and treated by ion exchange resin. The disposal cost of spent resin is expected to increase because the existing storage capacity is nearing saturation. In this study, a spent resin treatment process using microwaves is proposed, and a radiological safety assessment and cost evaluation of the process are performed. A dose assessment was conducted using established exposure scenarios and the RESRAD-Build software. Because spent resin contains various radionuclides, a sensitivity analysis was conducted to identify the main radionuclide contributing to the dose for each exposure pathway. The main exposure pathway was identified, and sensitivity analysis was applied to the working time and the radioactivity concentrations of C-14, Co-60 and Cs-137 to confirm their effect on the dose. Finally, an optimal shielding system for a safe working environment was proposed. The disposal cost of the spent resin is reduced by lowering its radioactivity level via the microwave treatment process, which reduces radioactivity through the desorption of C-14 and also allows the C-14 nuclide to be recycled. These characteristics offer great economic advantages from the viewpoint of the entire nuclear energy cycle. Thus, this study evaluates the radiological safety of the spent resin treatment process for actual application in a heavy water reactor power plant.
After the 2011 Fukushima Daiichi nuclear power plant accident, evidence on the real-life conditions of returnees to areas once designated as legal no-go zones, including their radiation dose levels, is scarce. In the present study, using a radiation dosimeter and a lifestyle survey, we evaluated the lifestyle characteristics and 2017 dose levels from external exposure among those who returned to the no-go zones after the evacuation orders were lifted. A total of 112 returnees to Odaka district, Minamisoma City, Fukushima Prefecture, were considered and compared with 266 non-returnees. The proportion of participants with annual additional doses from external exposure above 1 mSv was 7.0% for returnees, and 7.3% and 4.2% for non-returnees living in other districts or outside the city, respectively. Although caution is required given the very small sample sizes, this implies that, as of 2017, doses from external exposure among returnees in Odaka were very low and, by scientific consensus, would be associated with a very low likelihood of physical effects. We also found that while returnees were older on average than non-returnees, they had similar living conditions (i.e. occupation and time spent outdoors). It should be particularly emphasised that the expected lifetime doses from the incident, in addition to the natural background dose, are very small among returnees. This study contributes to societal debate and risk communication regarding how government can provide returnees with the support they need, improve their understanding of radiation doses, and continue to improve crucial infrastructure in former no-go zones so that communities can be rebuilt.