The ANSWERS Seminar, 22nd-24th May 2012

The 2012 ANSWERS Seminar was held at the Wessex Hotel in Bournemouth, a location close to the superb Dorset coast and to the town centre.

Over one hundred delegates attended all or part of the three-day meeting, with the programme covering the shielding, reactor physics and criticality technical areas. Programme topics included ANSWERS code developments, code applications, related general interest issues and software demonstrations.

Day 1 - Radiation Shielding

The 2012 ANSWERS Seminar started on Tuesday 22nd May with Radiation Shielding. The host for the day was Pat Cowan, ANSWERS Applications Area Manager for Radiation Shielding. Pat introduced a variety of presentations on recent developments in the ANSWERS Shielding codes and supporting tools, and on a number of applications using the ANSWERS codes.

The morning started with a joint presentation given by Ray Perry and George Wright (Serco) entitled "Nuclear Data for Shielding". George started by explaining the nuclear data requirements for a shielding calculation, including source, particle tracking and scoring. Ray went on to explain the sources of nuclear data, the international nuclear data projects and how evaluated nuclear data libraries are used in the shielding codes. He described the BINGO collision processor and libraries that will be available in MCBEND version 11A. The next presentation was given by David Picton (Serco) on "Calculation Methodology for Gamma Dose Rates outside a Waste Storage Facility". David used MCBEND to calculate gamma dose-rates outside a waste store with thick walls but no roof. External dose-rates were predominantly from skyshine, although direct dose-rates were also evaluated. David described the various two-stage calculation methodologies he had tried, using intermediate leakage files, and explained the pitfalls in one of the methods. The first session ended with an interesting presentation given by Chris Baker (Imperial College) entitled "Uncertainty Quantification Methods for Deterministic Neutron Transport Codes within RADIANT".

Following the morning break, Nigel Davies (Serco) gave a presentation on "Improvements to MCBEND to enable MONK-MCBEND-TH code coupling & the use of MONK Calculated Isotopic Compositions in MCBEND". He explained some new developments that have been carried out to enable the use of MCBEND for gamma heating in a MONK-MCBEND-TH coupled burn-up calculation and the transfer of material compositions from MONK to MCBEND. This was followed by a presentation from Greg Black (University of Manchester) on "Reactor simulations and inventory modelling of UK irradiated graphite waste", which is the subject of his postgraduate project. The objective of this work is to inform future decommissioning, treatment and disposal options for graphite waste, through developing an understanding of the origin, production pathways and location of radionuclides and by characterising the end-of-life radionuclide inventory of the graphite waste. Greg explained that this is being achieved by means of experimental characterisation of both virgin and irradiated graphite samples and by simulation and modelling of activation and radionuclide inventory. The final presentation in the morning session was by Adam Bird and Tim Fry (Serco) on "Visual Workshop". This presentation included an update on the current status of Visual Workshop 2A and a preview of new features planned for Visual Workshop 3A. A demonstration of some new features was given.

The afternoon session started with Chloe Newton (Atkins) giving a presentation entitled "The Advantages of Dose Rate Contour Plotting". Chloe gave her views on how dose-contour plotting has helped her in shield design. She showed examples of dose contours through the bulk shield and through wall penetrations of a shielded facility containing fixed and mobile sources. The contour plots helped her iterate to a design solution more quickly. The next presentation was given by Geoff Dobson (Serco) entitled "Investigations into Contributon Flux using MCBEND". Contributon flux gives a measure of the flux of those particles that contribute to the response and is defined as the product of the angular flux and the angular adjoint flux (see the expression below). Geoff presented work from recent investigations he has carried out to see if this can be evaluated using existing MCBEND facilities. The next presentation was given by Simon Shaw (EDF Energy) entitled "A MCBEND Core Octant Model of the Hartlepool / Heysham 1 AGR Reactors". Simon described recent work that has been carried out to produce a full 3D core octant model from original engineering drawings. Visual Workshop proved very useful to track down modelling errors during the course of the work.
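For readers unfamiliar with the quantity, the contributon flux Geoff discussed can be written compactly as the point-wise product of the forward and adjoint angular fluxes; the expression below is the standard textbook definition, quoted for context rather than as a statement of the MCBEND implementation.

\[
  \Psi_c(\mathbf{r},E,\boldsymbol{\Omega}) \;=\; \phi(\mathbf{r},E,\boldsymbol{\Omega})\,\phi^{\dagger}(\mathbf{r},E,\boldsymbol{\Omega})
\]

Here φ is the forward angular flux, φ† is the adjoint (importance) flux for the detector response of interest, and integrating Ψc over energy and angle gives a scalar map of where the particles that ultimately contribute to the response travel.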

Following the afternoon break a presentation was given by Liz Holland (Sellafield Ltd) on "Streaming Gamma Calculations using MCBEND and ATTILA". Liz presented gamma dose-rates through a complex penetration, calculated using MCBEND and ATTILA. Liz tried a number of acceleration techniques with MCBEND: forced flight, REVISE and MERGE, and RECURSIVE. ATTILA was used as a cross-check. Finally, Pat Cowan ended the day with a talk entitled "MCBEND Status and Future Plans". Pat reminded everyone of the new features that will be available in MCBEND 11A, which is in the final stages of testing and due to be released in Q3 of 2012. Pat also described a selection of work plans for the ANSWERS Shielding Area.

Day 2 - Reactor Physics

The second day of the 2012 ANSWERS Seminar on Wednesday 23rd May was devoted to Reactor Physics. The host for the day was Tim Newton, ANSWERS Technical Director and Area Manager for Deterministic Codes. Tim introduced a variety of presentations on recent developments in the ANSWERS Reactor Physics codes and supporting tools, and on a number of application studies in the Reactor Physics area.

The first presentation was from Paul Bryce (EDF Energy) on a comparison between MCNP and WIMS calculations for an AGR lattice. The objective of the study was to examine the effect of graphite weight loss on the prediction of reactivity, steam worth, rod worth and other safety parameters. Paul's conclusions were that there was excellent agreement between WIMS and MCNP on reactivity, that the control rod worth results were extremely consistent, and that there was a small bias on steam worth which is accounted for in the overall uncertainty. This was followed by a talk by Pavel Mikoláš (Škoda) on the calculation of characteristics of the Dukovany VVER with WIMS10. Four code/library variants were considered: WIMS9 with the JEF2.2 library; WIMS10 with JEF2.2 nuclear data equivalent to that used in WIMS9; WIMS10 with JEF2.2 data with a correction to the Pu239 cross sections; and WIMS10 with JEFF3.1 data. Predictions of reactivity and power profiles were compared. Pavel's conclusion was that the JEF2.2 libraries gave similar results but that moving to JEFF3.1 gave significantly different results; the reason for the differences should be investigated further.

The first session ended with a presentation by Jim Gulliford (OECD) on the activities of the OECD-NEA in the area of code development and validation. Jim gave an overview of the work of the NEA and introduced the IDAT tool, which enables the user to extract experiments from the IRPHE database that are similar to a system being considered. Jim then presented facilities in the IDAT tool for data plotting and uncertainty analysis. Following the morning break Jimmy Sudjana (Tractebel Engineering) gave a talk on LWRWIMS/PANTHER modelling of the Tihange-2 PWR in Belgium using the new microscopic depletion feature in PANTHER. Jimmy studied both single-assembly benchmarks and core-scale calculations. Jimmy concluded that the results were in agreement with the standard macroscopic depletion route and that transient results showed an improvement due to the impact of modelling the short-lifetime nuclides.

This was followed by a presentation by Nigel Davies (Serco) on inventory modelling which has been carried out for the IAEA on a number of different reactor types. Nigel described the route for running coupled WIMS-FISPIN calculations and noted that the variety of reactors that had been modelled showed the flexibility of the WIMS suite of codes.

The last talk of the morning was given by Paul Bryce (EDF Energy). Paul presented results for analysis of the Sizewell B PWR using WIMS and PANTHER. A number of new modelling approaches have been tested, including: changing from LWRWIMS to WIMS10 as the lattice code; microscopic depletion; buckling-dependent cross sections; and use of the embedded supercell corrections. An incremental approach to including these changes was adopted. Paul's conclusion was that, while some of these changes taken individually can worsen agreement with measurement, taken together they gave closer agreement with measurement.

The afternoon session began with a presentation by Dave Powney (Serco) on WIMS heating calculations. Dave discussed facilities within the EDIT module that can be used to estimate heat deposition at the point of fission. Calculations to allow more accurate estimates of heat deposition, through the use of the PHODAT and GAM modules, were also discussed. GAM uses a Monte Carlo solver to model heat deposition from neutrons and photons. It employs photon source data generated by the PHODAT module and neutron sources calculated via any of the neutron flux solvers available in WIMS. Examples of PHODAT/GAM input data were given, as were example output data from GAM. This approach to calculating heat deposition is an alternative to MCBEND calculations.

Glynn Hosking (Serco) presented a summary of new language features in WIMS. These new features include enhanced parameter and equation facilities, infinitely-nested DO loops, IF statements, improved QA tools and define data blocks. The last feature can help to simplify input files where sets of data (e.g. geometry specifications) are used more than once. A data block containing the set of repeated data can be fully defined once at the start of an input file and then referenced via a single keyword command throughout the rest of the input file. Recent WIMS-related developments of Visual Workshop were described by Tim Fry and Adam Bird (Serco). The Visual Workshop code can now interpret WIMS input files. This enables it to display CACTUS geometries in a similar manner to CACTUSEdit. Visual Workshop can also display CACTUS3D geometries in its wire-frame and ray-trace views, with the ray-trace view also able to display sub-mesh configurations. It is also possible to overlay calculated flux distributions onto a CACTUS3D geometry display.

The next presentation was given by Robert Mills (NNL). Robert had performed WIMS-TRAIL-FISPIN calculations for an NDA-funded benchmark of decay heat from BWR and PWR spent fuel assemblies (this benchmark contained measurements from 68 PWR assemblies and 66 BWR assemblies). For the PWR assemblies there was good agreement between measurement and calculation with both JEF-2.2 and JEFF-3.1 data, there being a 0% to 2% difference between using JEF-2.2 and JEFF-3.1, with the latter predicting decay heat slightly better. It was noted that the benchmark contains no precise estimates of clad cobalt content for the PWR cases with stainless steel clad. Scoping calculations indicated that the C/E values varied significantly with cobalt content. Robert suggested that this effect should be captured through the use of appropriate uncertainty factors. Significant under-predictions in decay heat, of up to 15%, were calculated for some of the BWR assemblies. Sensitivity analysis indicated that the under-estimates were unlikely to be a consequence of inappropriate modelling of coolant void fractions. Robert proposed that the under-estimates were due to the inadequacy of the FISPIN point model for predicting effects in the detailed axial structure of BWR assemblies.

Tim Newton (Serco) presented a novel method for applying perturbation theory to transport calculations. In the new method the transport component of a perturbation is calculated using currents derived from forward/adjoint solutions of the unperturbed/perturbed problems. The method was demonstrated for a method of characteristics solution of a slab model where a perturbation to the boundary conditions was made. The values of perturbation components estimated from the theory were in exact agreement with those calculated by direct k differencing, thus indicating that the method works well.
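For context only, the familiar first-order perturbation theory expression (quoted here as background, not as a description of the new method) for the reactivity change due to operator perturbations δA and δF is

\[
  \delta\rho \;\approx\; \frac{\left\langle \phi^{\dagger},\,\bigl(\tfrac{1}{k}\,\delta F - \delta A\bigr)\,\phi \right\rangle}{\left\langle \phi^{\dagger},\,\tfrac{1}{k}\,F\,\phi \right\rangle}
\]

where A and F are the unperturbed transport (loss) and fission (production) operators, φ and φ† the unperturbed forward and adjoint fluxes, and k the unperturbed eigenvalue. The novelty in Tim's approach lies in evaluating the transport component from currents derived from the forward and adjoint solutions of the unperturbed and perturbed problems, as described above.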

The last presentation of the day was also given by Tim. This covered the current status of WIMS10 and summarised potential developments in future versions of the code. Customers were invited to submit their suggestions/requirements for the future development of WIMS.

Day 3 - Criticality

The final day of the 2012 ANSWERS Seminar focused on Nuclear Criticality. The first presentation was given by Nigel Davies (Serco), who described the progress on a novel parallelisation scheme based on Woodcock tracking and the use of separate computers to perform each of the functions normally found in a Monte Carlo burnup code. He described the relevant physics required by such a system and showed how he handled the microscopic cross-sections, modelled scattering, and implemented Doppler broadening. The second presentation was from Malcolm Armishaw (Serco), giving a brief history of burnup in MONK and then describing progress on the burnup developments in MONK. These developments included the use of BINGO for tracking, overlaid meshes for depletion and thermal hydraulics calculations, an ARCHIVE file for easier data management, a thermal hydraulics link, a gamma heating link using MCBEND, parallelisation and 64-bit compatibility: the latter two features being required to enable large calculations to be run in a reasonable time.
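For readers unfamiliar with Woodcock (delta) tracking, the fragment below is a minimal one-dimensional, one-group sketch of the rejection idea behind it; the function names and cross-section values are illustrative assumptions, not MONK code or data.

import math
import random

def woodcock_flight(x, direction, sigma_t_at, sigma_majorant):
    """Advance a particle to its next real collision using Woodcock (delta)
    tracking in 1-D: sigma_t_at(x) returns the local total cross-section and
    sigma_majorant must bound it everywhere in the problem."""
    while True:
        # Sample a flight distance with the constant majorant cross-section,
        # so no material boundaries need to be handled explicitly in flight.
        x += direction * (-math.log(1.0 - random.random()) / sigma_majorant)
        # Accept the collision as real with probability sigma_t/sigma_majorant;
        # otherwise treat it as a virtual collision and keep flying.
        if random.random() < sigma_t_at(x) / sigma_majorant:
            return x  # position of the real collision

# Illustrative use: two regions with different (hypothetical) cross-sections.
sigma_t_at = lambda x: 0.5 if x < 10.0 else 1.5   # cm^-1
collision_site = woodcock_flight(0.0, +1.0, sigma_t_at, sigma_majorant=1.5)

One attraction of delta tracking is that the flight sampling never consults the geometry directly, which fits the kind of functional split across separate computers that Nigel described.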

Then, Nigel Davies described a series of comparisons between the new MONK burnup method and the WIMS burnup method for a range of reactor types. The reactor types included PWR, CANDU, AGR, MAGNOX, and a number of Pu and research reactors. The data compared included k-infinity, Nd-148, U-238, Pu-239, Xe-135fp for each burnup cycle, and the conclusion was that the MONK results compared well with the WIMS results. The final talk of the first session was given by Ray Perry (Serco), who gave an overview of the International Nuclear Data Projects, in particular the status of JEFF. He continued by describing the BINGO, WIMS and DICE libraries intended to be released with MONK Version 10A. In the case of BINGO this included the bound scattering nuclides, and the fission products, additional actinides and absorbers that form part of the burnup/decay process. Ray concluded his talk by looking at a set of validation results for uranium systems using the new BINGO libraries and showing the general improvement in agreement.

The second session began with a talk from Geoff Dobson (Serco) who described recent developments to BINGO to include the MONK Sensitivity option, and the status of the Doppler broadening capability. He began by reviewing the sensitivity method used in MONK and how this could be applied to generate the new sensitivity of k-effective to nubar. He then described the new covariance module in MONK and how it links into the sensitivity option (the standard combination of sensitivities and covariances is sketched below). Geoff concluded his talk by giving an update on the Doppler broadening capability in BINGO, including additional keywords to modify the degree of processing required, and the automatic avoidance of the capability when within 0.5°C of a library temperature. Albrecht Kyrieleis (Serco) then gave a presentation covering the implementation of CAD import into MONK, and the XML/HDF5 results export capability. Albrecht described the CAD import options, gave examples of MONK inputs showing the flexibility gained by mixing FG and multiple CAD instances, and described three cases comparing CAD with FG, all showing good agreement. The second part of his talk covered the results export capability. He described the XML export capability, and how the method implemented included the flexibility for future support of HDF5. He described the structures of XML and HDF5, and the HDF5 converter written for MONK to convert XML into HDF5.
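The way sensitivities and covariance data are normally combined is the standard "sandwich" formula below, quoted for context rather than as the specific MONK implementation:

\[
  \operatorname{var}(k)/k^2 \;\approx\; \mathbf{S}^{\mathsf{T}}\,\mathbf{C}\,\mathbf{S}
\]

where S is the vector of relative sensitivities of k-effective to the nuclear data parameters and C is the corresponding relative covariance matrix, giving the relative variance (and hence the nuclear-data uncertainty) on k-effective.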

The third talk was by Mark Goffin (Imperial College) who described goal-based adaptivity methods using finite elements to calculate k-effective. He described the finite element method, then showed how mesh adaptivity can be used to minimise the error by producing the optimum mesh for the required target(s). He explained the use of global and local error estimates, the latter giving the error estimate on a given quantity and requiring the solution of the adjoint equation. This led on to goal-based adaptivity based on local error estimators, which gave an optimum mesh for a given quantity (the standard form of such an estimate is shown below). Several examples were shown that demonstrated the effect of the adaptivity on the meshes and the reduction in the error estimate. He concluded by highlighting future work on angular adaptivity, then simultaneous space and angle adaptivity. The final presentation of the morning session was given by Adam Bird (Serco) who described the latest developments to Visual Workshop, including the option to calculate volumes using a standalone version of the Fortran engine used by Visual Workshop. He described the capabilities for displaying UT meshes, new FG bodies and CAD import files. Embedded files and looping are fully supported, and WIMS geometries can now be displayed in Visual Workshop. The results display has been enhanced to indicate graphically the standard deviation on the results. He then showed a number of slides of screenshots, and concluded by giving a live demonstration of the new capabilities in Visual Workshop.
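The local, goal-based error estimate Mark described is typically of the dual-weighted-residual form shown below; this is the generic textbook expression, given for context rather than as the exact estimator used in the work.

\[
  \delta J \;\approx\; \left\langle \phi^{\dagger},\, R(\phi_h) \right\rangle
\]

Here J is the target quantity (for example k-effective or a local reaction rate), φh is the discrete forward solution, R(φh) its transport residual and φ† the adjoint solution associated with J; the mesh is then refined where the contribution to this integral is largest.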

The afternoon session began with a presentation by Deborah Hill (NNL) on the activities of the WPC group. She described the wide range of activities and interactions with regulators and operators, good/best practice, professional development, and collaborations both national and international. She highlighted topics covered in the last few months, ranging from inspection feedback, the CPD workshop and the competence framework through to international activities such as the ICNC, technical presentations, and the code users forum. She described a number of items to be covered in the next few months, and concluded by reminding us how we can contribute and get involved with the WPC to benefit the criticality community.

The next presentation was given by James Dyrda (AWE) who described a series of WPNCS sensitivity benchmark studies to which he has contributed, and how the MONK sensitivity module compares with other codes. He gave a brief overview of the MONK sensitivity method, then described three benchmark problems designed to study different aspects of the models. He showed the MONK9A results in some detail and presented a series of detailed tables of sensitivities for selected parameters calculated by each of the participants. For the second phase he showed plots of critical heights and the sensitivity of k-effective to a range of fissile and moderator nuclides.
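The sensitivity coefficients compared across codes in such benchmarks are conventionally defined as the fractional change in k-effective per fractional change in a nuclear data parameter; the standard definition is given here for context.

\[
  S_{k,\sigma} \;=\; \frac{\sigma}{k}\,\frac{\partial k}{\partial \sigma}
\]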

The first presentation of the final session was by Mark Henderson (EDF Energy) who gave a talk on the effects of modelling support components in fuel elements. He described AGR and PWR fuel elements and how not modelling their internal grids and braces is considered conservative by increasing the amount of moderator when the fuel elements are stored fully flooded. He described a change in the methodology due to the soluble boron content in the water, and how it was necessary to justify continuing to assume the supports need not be modelled. Mark described a series of models developed to test the validity of the assumption, including a tipped AGR skip. He demonstrated that the assumption is valid for most credible accident scenarios; however, the conservatism decreases as the boron concentration increases.

The next presentation was given by Paul Smith (Serco) who described the CRITEX code used to model the transient criticality of fissile solutions. He described the physical phenomena to be considered and the transient progression both with no gas production and with additional gas production. The equations describing the treatment of the point kinetics, energy balance and radiolytic gas production were described (the standard point kinetics form is given below). The revised input using RKARD was shown, and a series of test case results was described, showing good agreement. The final two talks were given by Malcolm Armishaw. The first described the features planned for the next release of MONK. These included input syntax enhancements, CAD import, additional HOLE geometries, extension to FG rotations, a range of BINGO libraries, sensitivity of k to BINGO nuclear data (including nubar), Shannon entropy, a Unified Tally capability and the new UT mesh-based burnup with MPI and 64-bit addressing for handling large calculations. The second talk was a round-up of the status of MONK and some of the plans for future work after the next release of MONK.
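For background, the point kinetics part of a solution-transient model such as this is conventionally written as the coupled equations below for the neutron population n(t) and the delayed neutron precursor concentrations ci(t); this is the textbook form, quoted for context rather than as the exact CRITEX formulation.

\[
  \frac{dn}{dt} \;=\; \frac{\rho(t)-\beta}{\Lambda}\,n(t) \;+\; \sum_i \lambda_i\,c_i(t),
  \qquad
  \frac{dc_i}{dt} \;=\; \frac{\beta_i}{\Lambda}\,n(t) \;-\; \lambda_i\,c_i(t)
\]

where ρ(t) is the reactivity (including feedback from temperature and, where present, radiolytic gas voidage), β = Σ βi is the total delayed neutron fraction, Λ the prompt neutron generation time and λi the precursor decay constants; the energy balance and gas production equations are coupled to these through ρ(t).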

Malcolm closed the day by thanking all those who had attended the seminar, in particular those who had taken the time and effort to prepare and present talks.

 

