
Biochemical and Anthropometric Parameters for the Early Recognition of the Intrauterine Growth Restriction and Preterm Neonates at Risk of Impaired Neurodevelopment.

INTERNATIONAL JOURNAL OF MOLECULAR SCIENCES (2023)

University of Perugia

Abstract
Background: S100B and Tau are implicated in both brain growth and injury. Their urinary levels were measured in 30-to-40-day-old full-term, preterm, IUGR, and preterm-IUGR subjects to investigate a possible relationship with later delayed neurodevelopment. Methods: Values were related to the neuro-behavioral outcome at two years of age, as well as to brain volumes and urinary NGF assessed at the same postnatal time point. Results: Cognitive and motor performances were determined with the Griffiths III test to establish subgroups characterized by either normal or impaired neuro-behavior. The latter comprised preterm, IUGR, and preterm-IUGR individuals, who exhibited significantly higher S100B and significantly lower Tau levels, along with markedly reduced cerebral volumes and urinary NGF, as previously demonstrated. Unlike NGF, however, Tau and S100B displayed only a weak correlation with brain volumes. Conclusions: Delayed cognitive and motor performances observed in two-year-old preterm- and IUGR-born individuals were also associated with anomalous urinary levels of S100B and Tau assessed at 30–40 days of the postnatal period, and these changes did not correlate with brain growth. Our data therefore suggest that, in addition to cerebral volumes and NGF, urinary S100B and Tau can be considered valuable parameters for the early detection of future neurodevelopmental abnormalities.
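The abstract reports correlations between urinary biomarkers and cerebral volumes without detailing the statistical procedure. The following minimal Python sketch only illustrates how such an analysis might look; it assumes Spearman rank correlation, and every variable, value, and sample size below is a hypothetical placeholder, not data or methodology taken from the study.

import numpy as np
from scipy.stats import spearmanr

rng = np.random.default_rng(seed=0)
n = 40  # hypothetical cohort size, not the study's actual sample

# Hypothetical urinary biomarker levels measured at 30-40 days postnatal.
s100b = rng.lognormal(mean=0.0, sigma=0.5, size=n)
tau = rng.lognormal(mean=0.0, sigma=0.5, size=n)
ngf = rng.lognormal(mean=0.0, sigma=0.5, size=n)

# Hypothetical total cerebral volumes (cm^3) assessed at the same time point.
brain_volume = rng.normal(loc=450.0, scale=40.0, size=n)

# Rank correlation of each biomarker against brain volume.
for name, marker in (("S100B", s100b), ("Tau", tau), ("NGF", ngf)):
    rho, p = spearmanr(marker, brain_volume)
    print(f"{name} vs cerebral volume: rho = {rho:+.2f}, p = {p:.3f}")

A weak biomarker, as reported for S100B and Tau, would show a rho near zero in such an analysis, whereas NGF, which the authors link to brain growth, would be expected to correlate more strongly.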
Key words
impaired neurodevelopment, IUGR, preterm, S100B, Tau, NGF, cerebral volumes