Here is a selection of my scholarly activities, in no particular order:
1. Quality assurance for the detection of radioisotopes in materials
1.1 Intro
The main contemporary methods of quality assurance with regard to the detection and measurement of radioisotopes in materials are the so-called inter-comparison / inter-calibration exercises. In the framework of such exercises the following are evaluated: (a) the quality of the reference standards employed in a Laboratory, (b) the calibration of the instrumentation and (c) the methods and the metrology techniques applied. An inter-comparison / inter-calibration exercise can be set up with the participation of two or more Laboratories, which compare their results against reference values when applying various measurement techniques to the same or similar samples. The reference values for such exercises are derived from participating Laboratories that are certified for their precision and accuracy regarding the measurement techniques under evaluation. The Nuclear Engineering Laboratory of NTUA (NEL-NTUA) has long experience with such exercises. Four of these exercises, which took place during or after 2006, are referenced below. For each of these exercises I was responsible for delivering some of the requested measurement results.
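In proficiency-testing practice, participant results in such exercises are commonly scored against the reference value with a z-score (cf. ISO 13528 usage). The following minimal sketch is illustrative only and is not taken from the exercises described here:

```python
def z_score(lab_value, reference_value, sigma):
    """Proficiency-testing z-score: |z| <= 2 is usually considered
    satisfactory, 2 < |z| < 3 questionable, |z| >= 3 unsatisfactory.
    sigma is the standard deviation set for proficiency assessment."""
    return (lab_value - reference_value) / sigma

# Hypothetical example: a lab reports 105 Bq/kg against a certified
# reference of 100 Bq/kg, with sigma = 5 Bq/kg.
z = z_score(105.0, 100.0, 5.0)
assert z == 1.0  # within |z| <= 2, i.e. satisfactory
```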
1.2 Exercise "1" (Work Group: A. Efstathopoulos, N. Petropoulos)
In 2006 NEL-NTUA participated in a European Inter-calibration / Inter-comparison Exercise for the detection and measurement of radioisotopes in samples containing certified amounts of radioactivity. The Exercise was organized by the National Physical Laboratory of the UK. NEL-NTUA provided measured concentration results of high quality with regard to the detected isotopes. My contribution to the results was the successful reporting of Sr-90, Sr-89 and Cl-36 concentrations using the liquid scintillation measurement method.
1.3 Exercise "2" (Work Group: M. Savva, D. Karangelos, N. Petropoulos)
In 2010 NEL-NTUA participated in a European Inter-calibration / Inter-comparison Exercise for the detection and measurement of radioisotopes in samples containing certified amounts of radioactivity. The Exercise was organized by the EU Joint Research Center IRMM, which is located in Belgium. NEL-NTUA provided measured concentration results of high quality with regard to the detected isotopes. My contribution to the results was the relatively successful reporting of Sr-90 concentrations using radio-chemical separation and the liquid scintillation measurement method.
1.4 Exercise "3" (Work Group: M. Savva, D. Karangelos, N. Petropoulos)
In 2011 NEL-NTUA participated in an International Inter-calibration / Inter-comparison Exercise for the detection and measurement of radioisotopes in samples containing certified amounts of radioactivity. The Exercise was organized by the ALMERA Network. ALMERA is an international network of radiation measurement laboratories, comprising 126 laboratories from 78 countries. The network has been organized and is maintained by the International Atomic Energy Agency (IAEA). NEL-NTUA is an active member of this network. My contribution to the results was the relatively successful reporting of Ra-226 concentrations in water, and also the relatively successful reporting of total-alpha and total-beta activities in various samples using the liquid scintillation method.
1.5 Exercise "4"
In 2012 NEL-NTUA participated in an International Inter-calibration / Inter-comparison Exercise for the detection and measurement of radioisotopes in samples containing certified amounts of radioactivity. The Exercise was organized by the ALMERA Network (see above). My contribution to the results was the reporting of H-3 concentrations for water samples using the liquid scintillation method. Such results were reported for the first time by NEL-NTUA in the framework of an Inter-comparison Exercise. The evaluation of the reported data showed a systematic overestimation by a factor of about 5 in comparison to the certified H-3 concentration. This unacceptably high discrepancy is most probably due to an erroneous initial estimation of the calibration factor of the measurement method; this, in turn, is attributed to the fact that the calibration source activity was too high compared to the activity of the measured water samples. NEL-NTUA will purchase a more suitable H-3 calibration source of lower activity in due time.
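The way a calibration-factor error propagates directly, and multiplicatively, into every reported concentration can be sketched as follows. The numbers and function names are hypothetical illustrations, not the actual NEL-NTUA procedure:

```python
def counting_efficiency(net_cpm_standard, standard_activity_dpm):
    """Counting efficiency = net count rate of the calibration
    standard / its known disintegration rate."""
    return net_cpm_standard / standard_activity_dpm

def activity_concentration(net_cpm_sample, efficiency, volume_l):
    """Reported activity concentration (dpm per litre) of a sample."""
    return net_cpm_sample / (efficiency * volume_l)

# Correct efficiency, e.g. 0.25 counts per disintegration:
true_eff = counting_efficiency(2500.0, 10000.0)
# An efficiency underestimated by a factor of 5 (e.g. because the
# calibration source was ill-suited to the sample activity level)
# inflates every reported concentration by the same factor of 5:
wrong_eff = true_eff / 5.0
c_true = activity_concentration(50.0, true_eff, 0.01)
c_wrong = activity_concentration(50.0, wrong_eff, 0.01)
assert abs(c_wrong / c_true - 5.0) < 1e-9
```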
2. New computer servers @ NEL-NTUA (Work Group: D. Karangelos, N. Petropoulos, S. Simopoulos)
2.1 Intro
It had been well established as early as 2004 that, at the time, the computer servers of NEL-NTUA could not meet the modern needs of collecting and processing experimental data, coming mainly from the gamma spectroscopic analysis instrumentation and the set-ups for radio-environmental surveillance. In addition, the servers could not meet the needs for providing commodity network services, like e-mail, web pages etc. The problems were due to the age of those computers (the oldest of which was commissioned in 1987), which led to: (a) insufficient maintenance of the hardware, (b) poor maintenance of the software and impossible upgrades to contemporary releases and distributions, and (c) irregularly kept backups. Furthermore, over the years, an important part of the experimental set-ups and data loggers of NEL-NTUA had been replaced with equipment using different communication protocols, difficult to couple with vintage computer hardware. In addition, the server users had increased in number and their needs had multiplied in terms of disk and memory quota. After September 2006, I supervised the gradual replacement of three old computer servers of NEL-NTUA, acting both on my own initiative and in collaboration with the other two members of the work group. The following tasks were seen through:
(1) hardware selection, (2) hardware procurement, (3) selection of the operating system, (4) systems arrangement and networking, (5) software installation and application of internet safety and security firewall rules, (6) initial set-up of the users' environment according to minimum common needs, (7) migration of old software applications to the new systems and relevant operational performance checks, (8) optimum hardware coupling of existing experimental set-ups to the new systems, and finally (9) system commissioning and treatment of initial bugs.
All work was accomplished without the help of any IT specialist. After the successful completion of the systems replacement, it can be said that significant new know-how has been transferred to NEL-NTUA, that the new servers can be maintained easily, and that they can be readily upgraded or replaced with newer systems bearing even more contemporary hardware and software, without major changes and interventions. Below, the characteristics of the three new systems developed in this process are reported in chronological order.
2.2 2006: 1st System: E-mail and Web Server.
Hardware and OS replaced: Silicon Graphics Incorporated (SGI) Indy Workstation with Irix 5.3 OS, in operation since 1995.
New System: Intel Pentium Workstation with LINUX Debian OS, in operation since 2006.
New system special requirements: Among other tasks, this system records, over an internal network, radio-environmental data collected by another server (the 3rd system, as per the description below). These data are produced by experimental set-ups developed specifically at NEL-NTUA. The server feeds these data at regular time intervals to specially designed web pages, to be used for scientific and public information purposes. Credits for the current appearance of NEL-NTUA's web pages are due to Dr. D. Karangelos.
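The periodic publication of the latest readings as a web-page fragment can be sketched minimally as below. The reading names and the rendering are hypothetical and do not reflect the actual NEL-NTUA pages:

```python
def render_snapshot(readings):
    """Render the latest radio-environmental readings as a small
    HTML table fragment, suitable for periodic regeneration of a
    static web page (field names are illustrative only)."""
    rows = "".join(
        f"<tr><td>{name}</td><td>{value}</td></tr>"
        for name, value in readings.items()
    )
    return f"<table>{rows}</table>"

# Hypothetical snapshot of collected environmental data:
html = render_snapshot({"dose rate (uSv/h)": 0.09, "temperature (C)": 21.4})
assert "<table>" in html and "0.09" in html
```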
2.3 2007-2008: 2nd System: Server for the Gamma Spectroscopic Analysis Experimental Set-ups.
Hardware and OS replaced: Hewlett Packard 9000/370 Super mini computer Workstation with HPUX 7.x (UNIX System V) OS, in operation since 1987.
New System: Intel Pentium Workstation with LINUX Fedora OS, in operation since 2008.
New system special requirements: This server is connected on-line to a set of multichannel analyzers, which collect data originating from five (5) different Germanium detectors suitable for the qualitative and quantitative detection of radioisotopes in materials using gamma spectroscopic analysis. To serve its purpose the server needs:
(a) to be connected to the multichannel analyzer(s) by serial, USB or network protocols; each protocol has -of course- its own configuration problems that had to be solved, (b) to be connected to the existing large-paper-format dot-matrix printers by serial, parallel or network protocols, (c) to be connected to legacy -yet still efficiently working- ASCII terminals (for this particular connection, specific terminal software libraries had to be modified), (d) to ensure impeccable operation of SPUNAL, an in-house gamma spectroscopic analysis software developed by Prof. S.E. Simopoulos, and finally (e) to ensure flawless performance of the gamma spectroscopic analyses data base, and of the interactive user environment for this data base, which again was originally designed by Prof. Simopoulos. It has to be noted, however, that my contribution to items (d) and (e) was rather limited. Most of this work was carried out by Dr. D. Karangelos and Prof. S.E. Simopoulos.
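Coupling a multichannel analyzer to the server ultimately means parsing its readout stream into a channel-by-channel spectrum. The line framing below ("channel:counts") is a hypothetical stand-in, since the actual framing is vendor-specific:

```python
def parse_mca_line(line):
    """Parse one hypothetical 'channel:counts' readout line into a
    (channel, counts) pair of integers."""
    channel_str, counts_str = line.strip().split(":")
    return int(channel_str), int(counts_str)

def spectrum_from_lines(lines):
    """Accumulate readout lines into a channel -> counts mapping,
    i.e. a raw gamma spectrum ready for analysis software."""
    return dict(parse_mca_line(line) for line in lines)

# Simulated readout from a (hypothetical) serial-attached analyzer:
spec = spectrum_from_lines(["0:12", "1:30", "2:7"])
assert spec == {0: 12, 1: 30, 2: 7}
```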
2.4 2009: 3rd System: Server for Radio-Environmental Data Collection Using NEL-NTUA's Suitable Experimental Set-up.
Hardware and OS replaced: Hewlett Packard 9000/370 Super mini computer Workstation with HPUX 7.x (UNIX System V) OS, in operation since 1987.
New System: Intel Pentium Workstation with LINUX Fedora OS, in operation since 2009.
New system special requirements: This server is connected on-line to a NaI radioactivity detector of suitable volume with adequate electronics. The detector is directly exposed outdoors in order to monitor environmental air for the possible presence of radioisotopes originating from fission. The detector system as arranged communicates data to the server and accepts sequential commands through a serial protocol. Originally, the special software for this communication was written for the replaced hardware and OS. This software was modified under my supervision, in the course of a Diploma Dissertation, in order to secure its operation with the new hardware and LINUX OS. The new server is also connected on-line and in real time to certain transducers through an analogue-to-digital converter, in order to collect environmental data such as outdoor temperature, atmospheric pressure, relative humidity, etc. Data collection is enabled by the comedi software platform, through a short interface developed by Dr. D.J. Karangelos. As already noted, server "3" operates in collaboration with server "1" for the communication of the collected data through the internet.
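Converting a raw ADC reading from such a transducer into physical units is a linear range mapping, of the same kind comedilib performs with comedi_to_phys. The sketch below is a generic illustration; the bit depth and channel range are assumptions, not the actual NEL-NTUA configuration:

```python
def adc_to_physical(raw, bits, phys_min, phys_max):
    """Linearly map a raw ADC reading onto the physical range of
    the channel (the conversion comedilib's comedi_to_phys does,
    here written out explicitly for illustration)."""
    full_scale = (1 << bits) - 1  # e.g. 4095 for a 12-bit converter
    return phys_min + (raw / full_scale) * (phys_max - phys_min)

# A 12-bit reading near mid-scale on an assumed -50..+50 C channel
# corresponds to a temperature near 0 C:
t = adc_to_physical(2047, 12, -50.0, 50.0)
assert abs(t) < 0.05
```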
3. Industrial radiography installation development
Industrial Radiography is one of the most important Non-Destructive Techniques. Providing education on this method gives NTUA's Mechanical Engineering graduates the capability to apply it safely and correctly. For this purpose, since Academic Year 2005-2006, NED-NTUA has introduced in the 7th undergraduate semester a course entitled "Industrial Applications of Nuclear Engineering" to cover, among others, subjects related to Industrial Radiography. Furthermore, a similar course was introduced in approximately the same time frame in the framework of the MSc Program in Applied Mechanics. I have been teaching both these courses since 2006-2007. In order to add Laboratory Work to these courses, it was decided that an Industrial Radiography installation should be set up and run at NEL-NTUA. It took me about 36 months of part-time design and construction to personally complete and commission the entire installation. The task was divided into several discrete parts as follows: (a) equipment selection, (b) equipment procurement, (c) spatial arrangements, and (d) laboratory building infrastructure development and enhancement to accommodate the added equipment. The whole investment for this particular set-up, including procurement and building facilities enhancements, is estimated at over 80,000 EUR in 2009 prices. In addition, it should be noted that, due to the lack of specialized technical personnel at NEL-NTUA, all necessary labor regarding the task was also undertaken by myself.
This labor consisted mainly of: (1) laying out electrical and networking cabling of total length > 1 km, using PVC piping and metallic cable racks of various formats, (2) connecting the electrical cabling to mains power distribution panels, to ensure feeding the installation with single-phase power directly from the grid, and also from the building's generator and UPS, (3) developing tap water piping of total length ~50 m, and (4) connecting a water drainage system of ~15 m in length to the main drainage piping. This installation work has been thoroughly documented within a series of Diploma Dissertations and one MSc Dissertation. Today (2011) the installation is fully operational and has the following main components: (i) an X-RAY generator by GE Inspection Technologies GmbH, 200 kV (42 mm steel penetration capability), (ii) an AGFA type NOVA film processor, (iii) a film viewer, (iv) a densitometer, (v) a collection of Image Quality Indicators following EN and ASTM standards and (vi) the most necessary paraphernalia and consumables. The Installation was set up in a heavily shielded underground space, which was especially modified and arranged for hosting ionizing radiation equipment. The X-RAY generator remote control is operated from a specially arranged control room outside and away from the shielded space. Within the shielded space, a dark room has also been devoted to the operation of the film processor. The Installation satisfies the most stringent radiation protection rules and has been licensed for Educational Industrial Radiography Laboratory Work by the Greek Atomic Energy Committee. Following the applicable law and regulations, I have also been appointed as the Radiation Safety Officer for this specific Installation. In this Installation our students get the opportunity to work hands-on in small teams, under my supervision, on the radiographic detection of flaws in metallic objects.
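A standard piece of radiographic practice covered in such laboratory work is the inverse-square rescaling of exposure time with source-to-film distance. The sketch below states that textbook relation only; the numbers are illustrative and not taken from the NEL-NTUA exercises:

```python
def scaled_exposure_time(t1, d1, d2):
    """Inverse-square scaling of radiographic exposure time when the
    source-to-film distance changes from d1 to d2, with all other
    exposure parameters (kV, mA, film, screens) kept constant."""
    return t1 * (d2 / d1) ** 2

# Doubling the distance quadruples the required exposure time:
assert scaled_exposure_time(60.0, 1.0, 2.0) == 240.0
```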
Apart from education, the Installation is used for research, in order to produce radiographic and tomographic imaging of specimens with standard or non-standard flaws. A small but adequate machine shop within the NEL-NTUA premises has been affiliated to the Installation, so as to allow for the in-house construction of most tested specimens and also for needed small repairs. This machine shop has also been set up, and is maintained, by myself; it consists mainly of a refurbished vintage Raglan Little John Mk.2 bench lathe (distance between centers ~60 cm) and an Optimum GmbH Opti F40E drilling-milling machine, along with most of the standard related tooling. The tool collection is gradually being updated and enriched, as funding for such purposes is limited within each fiscal year.
4. The application of CFD codes in Nuclear Engineering problems
Recently (in essence since 2005 and onwards), there has been a sturdy debate in the Nuclear Engineering research community regarding the potential and cost-effective usage of CFD codes for solving existing operational problems of nuclear reactors. Until approximately 2005 such a debate had no point, given that (a) a nuclear reactor truly is a very big water/steam domain to be represented with the finite time and spatial steps of a CFD code, (b) the computational time and the computing hardware needed for such a purpose were in most cases far beyond an acceptable cost, and (c) the so-called "reactor codes", e.g. RELAP, COBRA etc., were satisfactory enough -with regard to the required computing facilities and calculation accuracy- for the macroscopic prediction of fluid dynamics phenomena for most known reactor types. It has to be noted, at this point, that "reactor codes" are not really based on the computational solution of the fluid dynamics and thermal transport equation systems, but mainly on the application of so-called "engineering correlations". After the commercial availability of multi-core processors in the retail market around 2005, the computing hardware and computing time for the solution of a fluid dynamics problem in a nuclear reactor could actually be dealt with, at a reasonable technical and human resources cost, using computing systems of rather limited size. In addition, within the same time frame, nuclear engineers started dealing heavily with specific detailed problems, the solution of which could prove essential for the marginal -and yet invaluable- increase of the efficiency and safety of an operating nuclear reactor, or for the detailed design of the new reactor types within Generation 3+ or Generation IV.
The currently employed "reactor codes", despite their quality and advantages for macroscopic calculations of the behavior of the reactor fluid and steam domain, cannot handle these detailed problems with accuracy; in fact, in some cases they deviate from experimental data by 25 or even 30%. On the other hand, it is estimated that CFD codes might perform much better for detailed problems, since they are capable of acting as a "magnifying lens" on phenomena which are ignored as unimportant by "reactor codes". Yet, CFD codes nowadays (2011) are quite many, and the turbulence and thermal mixing models within them are quite diverse. This multi-parameter complexity causes concern and poses problems for reactor calculations, as per the following summary:
For a given problem:
(a) which is the suitable model?
(b) which CFD code employing this particular model, solves the problem more efficiently?
(c) which grid is suitable in terms of type and spatial resolution? and
(d) which time step is suitable, for transient problems?
Even if one can provide some answers to this initial pool of questions, yet more questions are raised:
(1) Which computing hardware is required for applying the best answers for questions (a) to (d)? Is it cost-effective?
(2) How much computing time is needed? Is it reasonable, so that the nuclear engineer can get an answer within an available time frame with some acceptable accuracy? and
(3) What is the sensitivity of the chosen CFD model with regard to boundary conditions, grid, thermophysical properties of substances and a series of other influencing parameters?
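For question (d) in particular, the standard first check on a candidate time step is the Courant (CFL) number. The sketch below states only that textbook criterion, with illustrative numbers not tied to any specific reactor problem:

```python
def courant_number(velocity, dt, dx):
    """Courant (CFL) number u * dt / dx for a cell of size dx crossed
    at velocity u; explicit transient schemes typically require values
    at or below about 1 for stability, and accuracy in transient CFD
    often calls for comparable or smaller values."""
    return velocity * dt / dx

# Example: 2 m/s flow through 1 mm cells with a 0.5 ms time step
# sits right at the CFL = 1 limit:
assert abs(courant_number(2.0, 5e-4, 1e-3) - 1.0) < 1e-12
```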
It is understandable that the right compilation of answers to the above questions is not unique, and that the full application of CFD codes to all related nuclear engineering problems is a yet unpaved road, which, furthermore, has to be walked cautiously, despite the optimism that CFD codes will soon be capable of providing, cost- and time-effectively, a detailed representation of the whole reactor fluid domain. It is not to be forgotten that the introduction of CFD code calculations into the nuclear reactor calculation cycle should be strictly verified and regularly reviewed and certified, due to the specific nature of this kind of engineering. Today (2011) it seems that quite a number of related issues are still open for investigation before a large-scale implementation of CFD codes in Nuclear Engineering. A tool for investigating such issues is the so-called CFD benchmark. In the framework of a benchmark, CFD codes are internationally tested and critically evaluated over the solution of a specific given detailed problem, for which enough valid experimental results have been collected and are readily available. Such benchmarks in the Nuclear Engineering field are organized every two years.
Following my participation (oral paper presentation) in the 11th NURETH Conference, Avignon, France, October 2005, I was invited to take part in the benchmark organized by OECD/NEA under the title "OECD/NEA Sponsored CFD Benchmark Exercise: Thermal Fatigue in a T-Junction". This benchmark dealt with the computational representation of the turbulent thermal mixing of water flows of different temperatures and velocities in a t-junction geometry. I chose to participate, because the benchmark (a) gave me the opportunity to use a CFD code for an actual Nuclear Engineering problem, and (b) helped me understand the way thermophysical properties are involved in the solution. For this benchmark I used the ANSYS CFX 12.0 CFD code, as adequately licensed to NTUA for academic purposes. My results were based on the SAS-SST model and were duly submitted before the deadline in Spring 2010. The international participants who managed to provide results in time were just 28 in total. I had the opportunity to discuss these results with some of the other participants during the CFD-Network meeting & follow-up of the OECD/NEA T-Junction Benchmark, organized by the Gesellschaft fuer Anlagen- und Reaktorsicherheit (GRS) at the Technische Universitaet Muenchen, Muenchen, Germany, on March 23-24, 2011. In terms of precision and accuracy, my submitted results do not successfully reproduce the experimental data set. Nevertheless, this seems to be due not to a poor problem set-up, but mainly to the poor behavior of the applied SAS-SST turbulence model in confined spaces; such behavior was confirmed also by other meeting participants. Following my participation in the "OECD/NEA Sponsored CFD Benchmark Exercise: Thermal Fatigue in a T-Junction", I was further invited, and accepted, to participate in the "OECD/NEA Sponsored CFD Benchmark Exercise: Turbulent Flow in a Rod Bundle with Spacers", the results for which are due to be submitted in Spring 2012.