Abstract
The acceptance of materials for long-term, safety-critical power generation applications requires multiple testing stages and extensive data generation. Initial screening involves short-term exposures under simplified, constant atmospheres and temperatures; such tests can eliminate unsuitable materials but cannot discriminate among those with broadly acceptable properties. Subsequent pilot plant testing, costing over £100K for month-long exposures, is typically required. An intermediate laboratory testing step that better replicates in-service conditions would offer a cost-effective approach to material selection and lifetime prediction. For steam oxidation degradation, key experimental parameters such as water chemistry, pressure, steam delivery method, and flow rate must be tailored to produce oxide scale morphologies similar to those observed under actual plant conditions. This study examines the effects of these parameters through steam exposure tests on ferritic (P92), austenitic (Esshete 1250), and superalloy (IN740) materials. Results indicate that oxidation rates vary with the dissolved oxygen level of the feed water, increasing for the austenitic material and decreasing for the ferritic material; dissolved oxygen also influences spallation tendencies. Steam pressure and delivery method likewise affect oxidation rates and scale morphology. Comparison with service-exposed materials revealed that conventional isothermal exposures did not adequately replicate service-grown oxide scale morphologies, whereas cyclic oxidation tests provided a closer match.