Constructing a Lithiophilic and Mixed Conductive Interphase Layer in Electrolyte with Dual-Anion Solvation Sheath for Stable Lithium Metal Anode
SSRN Electronic Journal (2022)
University of Chinese Academy of Sciences
Abstract
The application of lithium metal batteries (LMBs) is greatly hindered by the uncontrollable growth of Li dendrites and the consequent safety hazards. Herein, we introduce AgSO3CF3 and LiNO3 into the electrolyte to form a stable mixed conductive interphase (MCI) layer on Li metal via an in-situ surface reaction. During this reaction, ultrafine Ag nanoparticles form uniformly on the Li surface and effectively seed dendrite-free lithium deposition. In addition, the SO3CF3- and NO3- anions in the Li+ solvation shell are reduced to produce an Ag/LiF-Li3N-rich interface layer. The resulting uniform and robust layer yields a smooth Li morphology and fast interfacial kinetics. Consequently, the Li||Li symmetric cell exhibits a long lifespan of over 2000 hours with an ultralow overpotential, and the Li||LiFePO4 full cell maintains ~85% capacity retention after 1100 cycles. The strategy proposed here opens a new avenue for addressing dendrite issues and makes the practical application of lithium metal batteries feasible.
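For context, the in-situ surface chemistry implied by the abstract can be sketched with the following reactions. This is a schematic only, assuming the standard galvanic displacement of Ag+ by Li metal and the well-known LiNO3/triflate SEI-forming pathways; the stoichiometries and byproducts shown are illustrative assumptions, not taken from the paper.

```latex
% Schematic interfacial reactions (illustrative assumptions, not
% reported verbatim in the abstract):

% Galvanic displacement: Ag+ (from AgSO3CF3) is reduced by Li metal,
% depositing ultrafine Ag seeds on the anode surface.
\[ \mathrm{Li + AgSO_3CF_3 \longrightarrow Ag\!\downarrow + LiSO_3CF_3} \]

% Nitrate reduction, the well-known LiNO3 route to a Li3N-rich SEI.
\[ \mathrm{NO_3^- + 9\,Li^+ + 8\,e^- \longrightarrow Li_3N + 3\,Li_2O} \]

% Triflate decomposition contributing LiF; other C-, S-, and
% O-containing species also form (stoichiometry left schematic).
\[ \mathrm{CF_3SO_3^- + Li^+ + e^- \longrightarrow LiF + \ldots} \]
```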
Key words
lithium metal anode, dendrite-free, mixed conductive interphase, solvation structure, silver trifluoromethane sulfonate