Electronic Tuning of CO2 Interaction by Oriented Coordination of N-Rich Auxiliary in Porphyrin Metal-Organic Frameworks for Light-Assisted CO2 Electroreduction
Advanced Science (2023)
Anhui Univ Technol
Abstract
Efficient CO2 electroreduction into high-value products largely relies on the CO2 adsorption/activation and electron-transfer properties of the electrocatalyst, so site-specific functionalization methods that strengthen these interactions are highly desirable. Here, an oriented coordination strategy is reported that introduces an N-rich auxiliary (hexamethylenetetramine, HMTA) into metalloporphyrin metal-organic frameworks (MOFs) to synthesize a series of site-specifically functionalized electrocatalysts (HMTA@MOF-545-M, M = Fe, Co, and Ni), which are successfully applied to light-assisted CO2 electroreduction. Notably, the resulting HMTA@MOF-545-Co exhibits an approximately twofold increase in CO2 adsorption enthalpy and electrochemically active surface area, together with a markedly reduced impedance after modification, yielding nearly twice the CO2 electroreduction performance of its unmodified counterpart. Its performance is further improved under light illumination, delivering a near-unity CO Faradaic efficiency (FE_CO ≈ 100%), a high CO generation rate (≈5.11 mol m⁻² h⁻¹ at -1.1 V), and an energy efficiency of ≈70% at -0.7 V. Theoretical calculations verify that the oriented coordination of HMTA increases the charge density at the active sites, nearly doubles the CO2 adsorption energy, and substantially lowers the energy barrier of the rate-determining step, accounting for the observed performance enhancement. This work may promote the development of modifiable porous crystalline electrocatalysts for high-efficiency CO2 electroreduction.
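As a point of reference for the headline figures above, the sketch below shows how FE_CO and the half-cell energy efficiency are commonly estimated from the measured charge and product amount. The abstract does not state the authors' exact definitions, so the electron count, equilibrium potentials, and formulas here are standard textbook assumptions, not the paper's reported procedure.

```python
# Illustrative sketch (assumed standard definitions, not the authors' exact method).
F = 96485.0          # Faraday constant, C mol^-1
N_ELECTRONS = 2      # electrons transferred per CO2 -> CO conversion
E_ANODE = 1.23       # V vs RHE, O2/H2O equilibrium potential (assumed anode reference)
E_CO = -0.11         # V vs RHE, CO2/CO equilibrium potential

def faradaic_efficiency_co(mol_co: float, total_charge_c: float) -> float:
    """FE_CO = charge consumed to produce CO / total charge passed."""
    return N_ELECTRONS * F * mol_co / total_charge_c

def cathodic_energy_efficiency(fe_co: float, applied_potential_v: float) -> float:
    """Half-cell energy efficiency at a given cathode potential (V vs RHE)."""
    return fe_co * (E_ANODE - E_CO) / (E_ANODE - applied_potential_v)

# With FE_CO close to 1.0 at -0.7 V vs RHE, this definition gives
# (1.23 + 0.11) / (1.23 + 0.7) ~ 0.69, consistent with the ~70% quoted above.
print(cathodic_energy_efficiency(1.0, -0.7))   # ~0.694
```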
Key words
CO2 electroreduction, CO2 interaction, hexamethylenetetramine, metal-organic framework, oriented coordination