Cloning, Prokaryotic Expression and Optimization of Expression Conditions of the Human TRAP1 Gene
Biotechnology Bulletin(2011)
Biomedicine Research & Development Center
Abstract
Using RT-PCR, the cDNA of the tumor necrosis factor receptor-associated protein 1 (TRAP1) gene was amplified from the human multidrug-resistant leukemia cell line K562/ADR. NdeⅠ and XhoⅠ were chosen as the restriction sites for the forward and reverse primers, respectively, and the TRAP1 gene was cloned into the pET-28a(+) vector carrying a 6×His tag. The recombinant plasmid was transformed into Escherichia coli DH5α and plated on LB agar containing kanamycin; positive clones verified by double restriction digestion were sent for sequencing. The sequence-confirmed recombinant plasmid pET28a(+)-TRAP1 was then transformed into E. coli BL21(DE3), and the recombinant TRAP1 protein was successfully expressed under IPTG induction. By varying the induction temperature, the timing of induction, the IPTG concentration, and the induction time, the conditions giving the highest expression of recombinant TRAP1 were identified. The results showed that expression of the target protein was highest when cultures at OD600 = 0.8 were induced at 39℃ with IPTG at a final concentration of 0.1 mmol/L for 6 h. This work lays the foundation for purifying the TRAP1 protein and for further study of its structure and function.
Key words
TRAP1, Hsp75, Cloning, Prokaryotic expression, Optimization, pET-28a(+), BL21
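The primer design described in the abstract places an NdeⅠ site (CATATG) on the forward primer and an XhoⅠ site (CTCGAG) on the reverse primer so the amplicon can be ligated into pET-28a(+). As a minimal illustrative sketch (the sequence below is a made-up toy fragment, not the actual TRAP1 cDNA), one can scan an insert for these recognition sites before cloning:

```python
# Sketch: locate the NdeI and XhoI recognition sites used for cloning
# into pET-28a(+). The toy insert is hypothetical, not the TRAP1 cDNA.

SITES = {"NdeI": "CATATG", "XhoI": "CTCGAG"}

def find_sites(seq: str) -> dict:
    """Return 0-based positions of each recognition site found in seq."""
    seq = seq.upper()
    hits = {}
    for name, motif in SITES.items():
        positions = []
        start = seq.find(motif)
        while start != -1:
            positions.append(start)
            start = seq.find(motif, start + 1)
        hits[name] = positions
    return hits

# Toy insert flanked by NdeI (5') and XhoI (3') sites, mimicking the
# primer-added sites described in the abstract.
toy_insert = "CATATG" + "GCTAGCAAGGAGATA" + "CTCGAG"
print(find_sites(toy_insert))  # NdeI at [0], XhoI at [21]
```

A scan like this also flags internal occurrences of either site, which would cut the insert itself during double digestion and force a different enzyme choice.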