
Harnessing a T1 Phage-Derived Spanin for Phage-Based Antimicrobial Development

BioDesign Research (2024)

National Institute of Infectious Diseases

Abstract
The global increase in the prevalence of drug-resistant bacteria has necessitated the development of alternative treatments that do not rely on conventional antimicrobial agents. Using bacteriophage-derived lytic enzymes in antibacterial therapy shows promise; however, a thorough comparison and evaluation of their bactericidal efficacy are lacking. This study aimed to compare and investigate the bactericidal activity and spectrum of such lytic enzymes, with the goal of harnessing them for antibacterial therapy. First, we examined the bactericidal activity of spanins, endolysins, and holins derived from two Escherichia coli model phages, T1 and T7. Among these, T1-spanin exhibited the highest bactericidal activity against E. coli. Subsequently, we expressed T1-spanin within bacterial cells and assessed its bactericidal activity. T1-spanin showed potent bactericidal activity against all clinical isolates tested, including 111 E. coli strains, 2 Acinetobacter spp., 3 Klebsiella spp., and 3 Pseudomonas aeruginosa strains. In contrast, T1 phage-derived endolysin showed bactericidal activity against E. coli and P. aeruginosa, yet its efficacy against other bacteria was inferior to that of T1-spanin. Finally, we developed a phage-based technology to introduce the T1-spanin gene into target bacteria. The synthesized non-proliferative phage exhibited strong antibacterial activity against the targeted bacteria. The potent bactericidal activity exhibited by spanins, combined with this novel synthetic phage technology, holds promise for the development of innovative antimicrobial agents.
Key words
Antimicrobial Resistance