ctDNA Dynamics and Mechanisms of Acquired Resistance in Patients Treated with Osimertinib with or without Bevacizumab from the Randomized Phase II ETOP-BOOSTER Trial

Clinical Cancer Research (2024)

Department of Haematology-Oncology

Abstract
PURPOSE: The ETOP 10-16 BOOSTER study was a randomized phase II trial of osimertinib and bevacizumab therapy versus osimertinib therapy in patients with an acquired EGFR T790M mutation. The mechanisms of acquired resistance to osimertinib and bevacizumab have not been described previously.

EXPERIMENTAL DESIGN: Next-generation sequencing (Guardant360) was conducted in serial plasma samples. The association between ctDNA and efficacy outcomes was explored, and molecular alterations at progression were described.

RESULTS: A total of 136 patients (88% of 155 randomized) had plasma samples at baseline (68 per arm), 110 (71%) at week 9, and 65 (42%) at progression. In a multivariable model for progression-free survival (PFS), the treatment effect was found to differ by smoking status (interaction P = 0.046), with the effect of smoking also differing by baseline EGFR T790M (interaction P = 0.033), whereas both TP53 at baseline and the tissue EGFR exon 21 L858R mutation were significantly associated with worse PFS outcome. Smokers (current/former) without baseline EGFR T790M showed a significant improvement in PFS under combination treatment, albeit with small numbers (P = 0.015). Week-9 EGFR T790M clearance was associated with improved PFS in the osimertinib arm (P = 0.0097). Acquired EGFR C797S mutations were detected in 22% and 13% of patients in the combination and osimertinib arms, respectively.

CONCLUSIONS: The differential effect of treatment by smoking was not explained by TP53 mutations or other molecular alterations examined. Molecular mechanisms of acquired resistance were detected, but no novel molecular alterations were identified in the combination arm.
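To illustrate the type of multivariable PFS analysis described above, the following is a minimal, hypothetical sketch of a Cox proportional-hazards model with treatment-by-smoking and smoking-by-T790M interaction terms, using the Python lifelines package. All column names and the synthetic data are invented for illustration; this is not the trial's actual statistical code or dataset.

import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

# Synthetic stand-in for the per-patient analysis table (names are assumptions).
rng = np.random.default_rng(0)
n = 136  # patients with baseline plasma samples, per the abstract
df = pd.DataFrame({
    "pfs_months": rng.exponential(scale=12, size=n),  # synthetic PFS times
    "progressed": rng.integers(0, 2, size=n),         # 1 = progression/death event
    "combination": rng.integers(0, 2, size=n),        # 1 = osimertinib + bevacizumab arm
    "smoker": rng.integers(0, 2, size=n),             # 1 = current/former smoker
    "t790m_baseline": rng.integers(0, 2, size=n),     # baseline plasma EGFR T790M detected
    "tp53_baseline": rng.integers(0, 2, size=n),      # baseline TP53 mutation detected
    "l858r_tissue": rng.integers(0, 2, size=n),       # tissue EGFR exon 21 L858R
})

cph = CoxPHFitter()
cph.fit(
    df,
    duration_col="pfs_months",
    event_col="progressed",
    # Interaction terms mirror those reported in the abstract:
    # treatment x smoking and smoking x baseline T790M, plus TP53 and L858R as covariates.
    formula="combination * smoker + smoker * t790m_baseline + tp53_baseline + l858r_tissue",
)
cph.print_summary()  # hazard ratios and p-values, including the interaction terms

With real trial data, the p-values on the "combination:smoker" and "smoker:t790m_baseline" coefficients would correspond to the interaction tests reported in the abstract.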