Internal Standard Conc. Calculation
Posted: Thu Mar 05, 2026 5:00 pm
Hi all,
I have a question regarding the calculation of concentrations in calibrants for an internal standard method of calibration. I hadn’t really thought about how to go about this beyond what I’ve always done, until someone challenged me about it yesterday because they were doing it differently. For context, the concentration units for the analysis are % (w/w).
I would normally do it as follows. If I added, say, 1g of component A to 98g of matrix solution along with 1g of internal standard, I’d say that the concentration of component A and the internal standard are both 1% (w/w). The total mass of the solution is 100g (1+1+98) and both component and ISTD are 1g each. Therefore, (1/100)*100 = 1%. So far so logical, to me anyway!
But a colleague put it to me that the concentration of the analyte in this calibration solution should actually be 1.01% (w/w), on the basis that the internal standard isn’t actually in the samples being measured. It’s only added by the chemist. Therefore, the calculation should be (1/99)*100 = 1.01%.
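For what it’s worth, the two conventions can be written out explicitly. A quick sketch using the masses from the example above (variable names are mine, just for illustration):

```python
# Masses from the worked example above (all in grams)
mass_analyte = 1.0   # component A
mass_istd = 1.0      # internal standard
mass_matrix = 98.0   # matrix solution

# Convention 1: concentration over the total prepared mass, ISTD included
conc_total = mass_analyte / (mass_analyte + mass_istd + mass_matrix) * 100

# Convention 2: concentration over the ISTD-free mass, as the colleague argues
conc_istd_free = mass_analyte / (mass_analyte + mass_matrix) * 100

print(f"ISTD included in total:  {conc_total:.4f} % (w/w)")   # 1.0000
print(f"ISTD excluded from total: {conc_istd_free:.4f} % (w/w)")  # 1.0101
```

So the discrepancy between the two conventions here is about 1% relative, and it grows as the ISTD mass becomes a larger fraction of the total.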
This seems illogical to me. And, ultimately, you might argue that only a pedant would care about the difference between 1 and 1.01 (guilty as charged!!) but I can foresee issues further down the line unless I box this off in my head.
I was wondering what your thoughts were regarding this?