
Restoring Sweat Gland Function in Mice Using Regenerative Sweat Gland Cells Derived from Chemically Reprogrammed Human Epidermal Keratinocytes

SCIENCE BULLETIN (2024)

Abstract
The regeneration of sweat glands (SwGs) plays a pivotal role in the functional recovery of extensive skin wounds. Recent research has shown that human epidermal keratinocytes (HEKs) can be reprogrammed into induced SwG cells through the ectopic expression of ectodysplasin A; however, the clinical application of this genetic manipulation approach is inherently limited. Here, we demonstrate that a combination of six compounds can efficiently and rapidly reprogram cultured HEKs into fully functional SwG cells. These chemically induced SwG-like cells (ciSGCs) closely resemble human primary SwG ductal cells in morphology, phenotype, and functional properties. Furthermore, ciSGCs can be stimulated to differentiate into mature SwG cell types in vitro, and in a 3D culture system they can generate SwG organoids with structural and biological features akin to native SwGs. Upon transplantation into scalded mouse paw skin, ciSGCs significantly expedited cutaneous wound healing and completely restored SwG structure and function. In conclusion, small-molecule cocktail-directed SwG reprogramming offers a non-transgenic and controllable strategy for producing high-quality, clinical-grade SwG cells, with immense potential for the treatment of burn patients.
Key words
Human epidermal keratinocytes, Sweat gland, Regeneration, Chemical reprogramming