I met Tim Garret at CPSA’s in-person conferences several times in the pre-COVID era. Tim and his graduate students have long worked at the cutting edge of LC-MS. His latest article, “Small molecule biomarker discovery: Proposed workflow for LC-MS-based clinical research projects,” is a collaboration with several global researchers. With a growing need for biomarkers that can identify patients with different diseases and monitor the impact of new therapies, robust practices are essential. The article describes five integrated phases that yield robust, reliable data and new biomarker identification. It starts with a sound quality control process that oversees, and is built into, the other phases:
1) careful planning and implementation of the clinical trial (including the sample collection, processing and storage)
2) planning the LC-MS analyses with proper controls to ensure integrity, followed by sample extraction and then targeted, semi-targeted and non-targeted LC-MS
3) data integration and processing to ensure that observed peaks can be reliably identified and quantified (or relatively quantified, in the case of non-targeted analysis)
4) data analysis and interpretation in the context of the clinical trial and disease.
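To make the shape of the workflow concrete, here is a minimal sketch of it as a data pipeline, with quality control applied between the other phases rather than once at the end. All function names, mock intensity values, and thresholds are my own illustrative assumptions, not taken from the article.

```python
# Toy pipeline mirroring the phases above; QC is woven between the steps.
# All names, mock data, and thresholds are illustrative assumptions.

def collect_samples():
    # Phase 1: clinical sample collection (mock peak intensities per sample)
    return {"s1": [1200.0, 950.0, 80.0],
            "s2": [15.0, 10.0, 5.0],      # a degraded, low-signal sample
            "s3": [1100.0, 870.0, 95.0]}

def acquire_lcms(samples, blank=10.0):
    # Phase 2: LC-MS acquisition with a simple blank-subtraction control
    return {sid: [max(v - blank, 0.0) for v in peaks]
            for sid, peaks in samples.items()}

def qc_filter(samples, min_total=500.0):
    # Overarching QC: drop samples whose total signal is too low to trust
    return {sid: peaks for sid, peaks in samples.items()
            if sum(peaks) >= min_total}

def relative_quantify(samples):
    # Phase 3: relative quantification, scaling each feature to its
    # median across the samples that survived QC
    n = len(next(iter(samples.values())))
    medians = [sorted(p[i] for p in samples.values())[len(samples) // 2]
               for i in range(n)]
    return {sid: [v / m if m else 0.0 for v, m in zip(peaks, medians)]
            for sid, peaks in samples.items()}

result = relative_quantify(qc_filter(acquire_lcms(collect_samples())))
print(sorted(result))  # → ['s1', 's3']; the low-signal sample is removed by QC
```

Phase 4 (interpretation against the trial and the disease) is deliberately omitted here; the point is only that QC sits between every stage, not downstream of them.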
Many of the activities in phases 3 and 4 are now supported by a growing range of machine learning (ML) algorithms and artificial intelligence (AI) applications, a surprising number of which are referenced in the article; surprising for their sheer number, not for their inclusion.
The article does not delve deeply into the intricate art of targeted or non-targeted LC-MS; it focuses on the overall process and, as such, serves as a useful tool for anyone who works in new therapy development and needs to understand how to successfully integrate biomarker discovery into a clinical trial.