Determinants of 5-Year Survival in Patients with Advanced NSCLC with PD-L1≥50% Treated with First-Line Pembrolizumab Outside of Clinical Trials: Results from the Pembro-real 5Y Global Registry
Journal for ImmunoTherapy of Cancer (2025), SCI Q2
Fondazione Policlinico Universitario Campus Bio-Medico
Abstract
Background: Pembrolizumab monotherapy is an established front-line treatment for advanced non-small cell lung cancer (NSCLC) with a programmed cell death-ligand 1 (PD-L1) tumor proportion score (TPS) ≥50%. However, real-world data on its long-term efficacy remain sparse.
Methods: This study assessed 5-year outcomes of first-line pembrolizumab monotherapy in a large, multicenter, real-world cohort of patients with advanced NSCLC and PD-L1 TPS ≥50%, referred to as Pembro-real 5Y. Individual patient-level data (IPD) from the experimental arm of the KEYNOTE-024 trial were extracted (KN024 IPD cohort) to compare long-term outcomes between the two cohorts. To further assess the reproducibility of clinical trial results, we reconstructed a "KN024 look-alike" cohort by excluding patients with an Eastern Cooperative Oncology Group performance status (ECOG-PS) ≥2, those requiring corticosteroids at doses ≥10 mg of prednisolone or equivalent, patients with a positive or unknown epidermal growth factor receptor/anaplastic lymphoma kinase genotype, and those with pre-existing autoimmune disease. We additionally provided a hierarchical organization of determinants of long-term benefit through a conditional inference tree analysis.
Results: The study included 1050 patients from 61 institutions across 14 countries, with a median follow-up of 70.3 months. The 5-year survival rate was 26.9% (95% CI: 23.8% to 30.2%) and median overall survival (OS) was 21.8 months (95% CI: 19.1 to 25.7), while 32 patients (3.0%) who achieved a complete response remained progression-free at the data cut-off. The KN024 look-alike cohort had a 5-year survival rate of 29.3% (95% CI: 25.5% to 33.6%) and a median OS of 27.5 months (95% CI: 22.8 to 31.3). Neither the overall study population nor the KN024 look-alike cohort exhibited significantly different OS compared with the KN024 IPD cohort. By the data cut-off, 1015 patients (96.7%) had permanently discontinued treatment: 659 (64.9%) due to progressive disease, 156 (15.4%) due to toxicity, 77 (7.6%) due to treatment completion, and 106 (10.4%) due to other reasons. Overall, 222 participants (21.1%) were treated for a minimum of 24 months; among them, the 5-year survival rates were 31.7%, 72.7%, 78.6%, and 84.2% for patients who discontinued treatment due to progressive disease, toxicity, treatment completion, and other reasons, respectively.
Conclusion: This study provides valuable real-world evidence confirming the long-term efficacy of pembrolizumab outside of clinical trials. The hierarchical analysis identifies ECOG-PS, age, and PD-L1 TPS as the most important predictors of 5-year survival, potentially informing clinical practice.
Key words
Immunotherapy,Immune Checkpoint Inhibitor,Lung Cancer
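The abstract describes two analytic steps: Kaplan-Meier estimation of median OS and the 5-year survival rate, and a conditional inference tree that ranks determinants of long-term benefit (ECOG-PS, age, PD-L1 TPS). The sketch below illustrates both steps on synthetic data; the column names (os_months, death_observed, ecog_ps, age, pdl1_tps), the use of lifelines and scikit-learn, and the CART classifier standing in for the conditional inference tree (canonically partykit::ctree in R) are all assumptions for illustration, not details taken from the registry analysis.

```python
import numpy as np
import pandas as pd
from lifelines import KaplanMeierFitter
from sklearn.tree import DecisionTreeClassifier, export_text

# --- Hypothetical registry-style data (all column names and values are illustrative) ---
rng = np.random.default_rng(42)
n = 1000
df = pd.DataFrame({
    "os_months": rng.exponential(scale=28, size=n).round(1),
    "death_observed": rng.integers(0, 2, size=n),   # 1 = death observed, 0 = censored
    "ecog_ps": rng.integers(0, 3, size=n),          # 0, 1, or >=2
    "age": rng.integers(45, 90, size=n),
    "pdl1_tps": rng.integers(50, 101, size=n),      # TPS >= 50% by inclusion criteria
})

# 1) Kaplan-Meier estimate of median OS and the 5-year (60-month) survival rate.
kmf = KaplanMeierFitter()
kmf.fit(durations=df["os_months"], event_observed=df["death_observed"])
print(f"Median OS (months): {kmf.median_survival_time_:.1f}")
print(f"5-year survival rate: {float(kmf.survival_function_at_times(60.0).iloc[0]):.1%}")

# 2) Tree-based ranking of determinants of 5-year survival.
#    The paper used a conditional inference tree; a plain CART classifier is a rough
#    stand-in here. Censoring is handled crudely by keeping only patients whose
#    5-year status is known (died at any time, or followed for at least 60 months).
known = (df["death_observed"] == 1) | (df["os_months"] >= 60)
sub = df[known]
y = (sub["os_months"] >= 60).astype(int)            # 1 = alive at 5 years
X = sub[["ecog_ps", "age", "pdl1_tps"]]
tree = DecisionTreeClassifier(max_depth=3, min_samples_leaf=30, random_state=0).fit(X, y)
print(export_text(tree, feature_names=list(X.columns)))
```

In the actual study, censoring is handled by the Kaplan-Meier estimator throughout and the hierarchy is derived from permutation-test-based splits of a conditional inference tree; the complete-case labeling above is only a simplification for the sketch.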