Mn0.6Ni1.4Co2Oy Micro-Nano Tower Structure with Tunable Spectral Selectivity Interface for Infrared Stealth and Solar Selective Coating Application
Progress in Organic Coatings (2023)
Nanjing University of Aeronautics and Astronautics
Abstract
To make full use of solar energy and resolve the functional incompatibility between the solar and infrared spectral bands, this work designs a multifunctional Mn0.6Ni1.4Co2Oy (MNC) interface. The Mn0.6Ni1.4Co2Oy coating was prepared by a simple vacuum spraying method. Coatings with an appropriate MNC/resin ratio (5:5) exhibited high solar absorptance (0.915) and low infrared emittance (0.245) at ambient temperature, together with excellent heat-collecting performance, indicating that the structure of the Mn0.6Ni1.4Co2Oy is largely preserved. Notably, DSC results confirm that introducing Mn0.6Ni1.4Co2Oy into the resin strengthens the thermal stability of the coating, mainly because Mn0.6Ni1.4Co2Oy effectively hinders the segmental motion of the resin chains. Furthermore, the addition of low-emissivity aluminum powder further reduced the emittance of the coating to 0.142 and increased the absorptance/emittance ratio to 6.22 at room temperature. The coating retained remarkable thermal stability and infrared stealth performance even at 300 °C, meeting the basic requirements of solar selective absorption. This work provides a low-cost, simple spraying route for fabricating Mn0.6Ni1.4Co2Oy solar selective absorption coatings with great potential in high-temperature infrared stealth applications.
Key words
Tunable spectral interface, micro-nano tower layer, solar selective absorption coating, infrared stealth, spraying method
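Note: the absorptance/emittance ratio quoted in the abstract is the standard spectral-selectivity figure of merit for solar selective absorbers. A minimal sketch of its definition is given below, assuming the customary solar-band and thermal-band weighting (the integration limits are conventional values, not stated in the abstract):

\[
  \alpha_s = \frac{\int_{0.3\,\mu\mathrm{m}}^{2.5\,\mu\mathrm{m}} \alpha(\lambda)\, I_{\mathrm{sol}}(\lambda)\, \mathrm{d}\lambda}
                  {\int_{0.3\,\mu\mathrm{m}}^{2.5\,\mu\mathrm{m}} I_{\mathrm{sol}}(\lambda)\, \mathrm{d}\lambda},
  \qquad
  \varepsilon_T = \frac{\int_{2.5\,\mu\mathrm{m}}^{25\,\mu\mathrm{m}} \varepsilon(\lambda)\, I_{b}(\lambda, T)\, \mathrm{d}\lambda}
                       {\int_{2.5\,\mu\mathrm{m}}^{25\,\mu\mathrm{m}} I_{b}(\lambda, T)\, \mathrm{d}\lambda},
  \qquad
  S = \frac{\alpha_s}{\varepsilon_T},
\]

where \(I_{\mathrm{sol}}(\lambda)\) is the solar spectral irradiance and \(I_{b}(\lambda, T)\) is the blackbody spectral radiance at the operating temperature. If the reported emittance of 0.142 and ratio of 6.22 refer to the same aluminum-loaded coating, they imply a solar absorptance of roughly 0.142 × 6.22 ≈ 0.88, marginally below the 0.915 of the MNC/resin coating; this is a back-of-the-envelope inference, not a value reported in the abstract.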