I am struggling with detector sensitivity changes over time. The instrument seems to gradually lose sensitivity between annual preventive maintenance (PM) services. It was just serviced, and now my analyte of interest (AOI) is quantifying at almost double the expected concentration (a bigger shift than I've ever observed after a PM). This kind of fluctuation makes monitoring degradation of the AOI over a food product's shelf life very frustrating.

I run headspace samples, so they are incredibly clean. I swap liners and septa often, trim or replace columns when needed, and otherwise maintain the instrument/consumables.

My application is monitoring decomposition of allyl isothiocyanate in prepared horseradish throughout the product shelf life.

Ideally there would be something I could do to prevent the fluctuation, but at this point I almost need a way to correct for (or deliberately knock down) the sensitivity so the results make any sense at all.
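For what it's worth, the stopgap I've been considering is bracketing each batch with a check standard of known concentration and rescaling that batch's results by the standard's recovery. A minimal sketch of the arithmetic (the function names and concentrations below are made up for illustration, not from my actual method):

```python
# Hypothetical post-hoc drift correction using a check standard.
# Assumption: detector response is linear over the working range, so a
# single multiplicative factor per batch is enough.

def correction_factor(known_conc: float, measured_conc: float) -> float:
    """Ratio that rescales results so the check standard reads true."""
    return known_conc / measured_conc

def correct_batch(results: list[float], factor: float) -> list[float]:
    """Apply the same batch-level correction to every sample result."""
    return [r * factor for r in results]

# Example: a 50 ppm check standard reads 95 ppm after the PM visit,
# so every sample in that batch gets scaled down by 50/95.
factor = correction_factor(50.0, 95.0)
corrected = correct_batch([190.0, 120.0], factor)
```

This at least keeps pre- and post-service data comparable, even if it doesn't fix the underlying sensitivity jump.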

Agilent has been no help. On multiple service visits their tinkering created side issues that I've only mitigated by replacing the column. The fact remains that I now have inflated (or perhaps newly accurate?) results that don't agree with the data collected before the service visits. Chromatography is hard. :cry: