Anomalous Neutral Hydrogen Column Densities in Local Interstellar Medium Clouds
The Astrophysical Journal (2025)
JILA
Abstract
Analysis of high-resolution spectra of the hydrogen and deuterium Lyα lines provides measurements of interstellar neutral hydrogen column densities N(H I) to 113 stars within 50 pc of the Sun. Plots of N(H I) versus distance through the Local Interstellar Cloud (LIC), the G cloud, the Mic cloud, and other clouds in the local interstellar medium (LISM) show very interesting properties. For the LIC and G clouds, nearly all of the observed neutral hydrogen occurs within 3 or 4 pc of the Sun, with no significant additional neutral hydrogen at larger distances out to 50 pc, except for several sight lines with anomalously high N(H I). Scatter about the mean hydrogen column density in the LIC is several times larger than the measurement errors. We evaluate several possible sources of the high-N(H I) sight lines. Seven sight lines with anomalously high N(H I) are aligned perpendicular to the line connecting the centers of the LIC and G clouds, where the two clouds merge to create a mixed-cloud region. The high N(H I) in these seven sight lines can be explained by their paths through an irregularly shaped mixed-cloud region where the neutral hydrogen number density is the sum of the densities of both clouds. However, the most recent nearby supernova explosion created a shell that is seen in nearly the same direction, perpendicular to the LIC/G axis, and may also explain these seven high-N(H I) sight lines. Other possible explanations for the high-N(H I) sight lines include interstellar shocks, wakes produced by stars moving rapidly through the LISM, and regions where other clouds may overlap.
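The mixed-cloud explanation rests on simple column-density arithmetic: N(H I) along a sight line is the number density integrated over the path, so where two clouds overlap their densities add and the column roughly doubles for the same path length. The sketch below illustrates this with assumed round numbers (n ≈ 0.2 cm⁻³, path ≈ 3 pc); these values are illustrative, not taken from the paper.

```python
# Illustrative sketch of column-density arithmetic for a uniform cloud:
# N(H I) = n(H I) * L, with n in cm^-3 and path length L converted to cm.
PC_IN_CM = 3.086e18  # centimeters per parsec

def column_density(n_cm3: float, path_pc: float) -> float:
    """Return N(H I) in cm^-2 for density n_cm3 over path_pc parsecs."""
    return n_cm3 * path_pc * PC_IN_CM

# Single cloud: assumed n ~ 0.2 cm^-3 over ~3 pc of path
single = column_density(0.2, 3.0)

# Mixed LIC/G region: the two densities add along the same path,
# so the column density is roughly double
mixed = column_density(0.2 + 0.2, 3.0)

print(f"single cloud: N(H I) ~ {single:.2e} cm^-2")
print(f"mixed region: N(H I) ~ {mixed:.2e} cm^-2")
```

With these assumed numbers the single-cloud column is ~2 × 10¹⁸ cm⁻², in the range typical of LISM clouds, and an overlap region doubles it, which is the sense in which a merged LIC/G region could produce anomalously high N(H I) without a longer neutral path.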
Key words
Stellar-interstellar interactions, Interstellar clouds, Interstellar medium wind, Heliosphere, Warm neutral medium, Ultraviolet sources