Small-diameter Vascular Graft Composing of Core-Shell Structured Micro-Nanofibers Loaded with Heparin and VEGF for Endothelialization and Prevention of Neointimal Hyperplasia
Biomaterials (2024)
Soonchunhyang University
Abstract
Despite significant progress in recent years, clinical issues with small-diameter vascular grafts related to low mechanical strength, thrombosis, intimal hyperplasia, and insufficient endothelialization remain unresolved. This study aims to design and fabricate a core-shell fibrous small-diameter vascular graft by a co-axial electrospinning process that mechanically and biologically meets the benchmarks for blood vessel replacement. The presented graft (PGHV) comprised polycaprolactone/gelatin loaded with heparin-VEGF (shell) and polycaprolactone (core). This study hypothesized that the shell structure of the fibers would allow rapid degradation to release heparin-VEGF, while the core would provide mechanical strength for long-term application. Physico-mechanical evaluation, in vitro biocompatibility, and hemocompatibility assays were performed to ensure safe in vivo application. After 25 days, the PGHV group released 79.47 ± 1.54% of heparin and 86.25 ± 1.19% of VEGF; degradation of the shell was observed, but the core remained pristine. Both the control (PG) and PGHV groups demonstrated robust mechanical properties. The PGHV group showed excellent biocompatibility and hemocompatibility compared to the PG group. After four months of rat aorta implantation, PGHV exhibited smooth muscle cell regeneration and complete endothelialization with a patency rate of 100%. The novel core-shell structured graft could be pivotal in vascular tissue regeneration applications.
Key words
Small-diameter vascular graft, Vascular tissue engineering, Co-axial electrospinning, Endothelialization, Intimal hyperplasia