As a manufacturer of precision metal tubing, we regard alloy hardness testing as essential to our operation. We measure the hardness of our high-quality tubes as part of our quality control procedures; these tests are quick to perform, relatively inexpensive and often non-destructive to the components being evaluated.
Hardness is the ability of a material to resist surface indentation or scratching. It is not a fundamental material property, so its value varies according to the test method used. Consequently, we employ several different testing methods depending on the alloys used and the kind of tubing being manufactured.
The basic principle is that hardness is measured from an indentation produced by applying a constant load to a specified indenter held in contact with the component surface for a specified time. When testing thin-wall or small-diameter tubes, commonly used methods such as the Rockwell test are not ideal: the relatively high load (90 kg) can distort the surface or even punch holes through the tube wall, and the ball impression can also slip because of the curvature of the surface. In this context, Vickers hardness testing is regarded as more accurate. However, when using Vickers tests, some form of equivalence with the Rockwell scale is often desired. This correlation is not linear, and published comparison scales are partly subjective and vary between different metals.
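To illustrate the indentation principle, the Vickers number is derived directly from the test load and the size of the impression left by the diamond pyramid indenter, using the standard relation HV ≈ 1.8544 × F / d² (F in kgf, d the mean impression diagonal in mm). Below is a minimal sketch of that calculation; the function name and example figures are ours, for illustration only, not measured data.

```python
def vickers_hardness(load_kgf: float, diagonal_mm: float) -> float:
    """Vickers hardness from test load and mean indentation diagonal.

    HV = 2 * F * sin(136 deg / 2) / d^2  ~=  1.8544 * F / d^2,
    where F is the load in kgf and d is the mean diagonal of the
    square impression in mm (standard Vickers indenter geometry).
    """
    return 1.8544 * load_kgf / (diagonal_mm ** 2)

# Example: a 10 kgf load leaving a 0.25 mm mean diagonal gives
# roughly HV 297 (illustrative figures only).
print(f"HV {vickers_hardness(10, 0.25):.0f}")
```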
Fine Tubes has performed detailed research with the UK's National Physical Laboratory to establish a meaningful comparison between hardness scales derived from different tests. Below we've compiled an outline of the most common hardness measurement methods and how they compare.
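To give a feel for what such a correlation involves, the sketch below interpolates between tabulated Vickers/Rockwell C pairs. The anchor values shown are rough illustrative figures only, to be replaced with values from the conversion chart below or from a published standard such as ASTM E140; the piecewise-linear interpolation is itself only an approximation of a non-linear, alloy-dependent relationship.

```python
import bisect

# Illustrative (HV, HRC) anchor points, broadly in line with
# published conversion tables; replace with values from the
# conversion chart or ASTM E140 for real use.
HV_HRC_TABLE = [(238, 20), (302, 30), (392, 40), (513, 50), (697, 60)]

def hv_to_hrc_approx(hv: float) -> float:
    """Approximate HRC for a Vickers value by linear interpolation.

    Only valid inside the tabulated range; the true correlation is
    non-linear and varies between metals, so this is a sketch, not
    a substitute for a proper conversion table.
    """
    hv_points = [p[0] for p in HV_HRC_TABLE]
    if not hv_points[0] <= hv <= hv_points[-1]:
        raise ValueError("HV value outside tabulated range")
    i = bisect.bisect_left(hv_points, hv)
    if hv_points[i] == hv:
        return HV_HRC_TABLE[i][1]
    (hv0, hrc0), (hv1, hrc1) = HV_HRC_TABLE[i - 1], HV_HRC_TABLE[i]
    return hrc0 + (hrc1 - hrc0) * (hv - hv0) / (hv1 - hv0)

print(f"HV 450 ~= HRC {hv_to_hrc_approx(450):.1f}")  # illustrative only
```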
Hardness Conversion Chart
Approximate comparison of hardness scales and approximate tensile stress equivalents (maximum values), in imperial and metric units.
Click here to view the chart