Charge Ordering and π–d Interaction in the Electron-Doped 3/4-Filling Molecular System α″-(BEDT-TTF)2Rb2xCo(SCN)4 (x = 0.6)
Journal of the Physical Society of Japan (2021)
Tohoku University
Abstract
We have investigated a newly found α″-phase bis(ethylenedithio)tetrathiafulvalene (BEDT-TTF) molecular arrangement system, namely, α″-(BEDT-TTF)2Rb2xCo(SCN)4 (α″-Rb2xCo), with localized S = 3/2 Co2+ spins. From X-ray structural analyses, we found that, owing to the nonstoichiometric ratio of Rb ions (x = 0.6), this compound takes an intermediate value of π-electron band filling between those of α″-(BEDT-TTF)2CsHg(SCN)4 and α″-(BEDT-TTF)2K1.4Co(SCN)4. α″-Rb2xCo (x = 0.6) is a paramagnetic metal at room temperature and exhibits a first-order transition to an insulating state at 100 K, the lowest transition temperature among the three compounds. The two-dimensional optical conductivity spectra and the peak splitting of the charge-sensitive ν27 molecular vibrational mode indicate a phase transition to a state with a gap of 600 cm⁻¹, which is attributed to charge ordering consisting of at least four differently charged BEDT-TTF molecules. The π-spin susceptibility suddenly decreases at the transition temperature, and another anomaly is seen at about 40 K. If we assume that the π spins disappear at 100 K (i.e., that a spin-singlet state forms), it is difficult to explain the other low-temperature physical properties in a systematic way. However, by simply assuming the presence of π spins, we can account for all the observed results without contradiction. We also discuss anomalies in the dielectricity of π electrons under magnetic fields mediated by the π–d interaction and the spin–charge coupling.
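As a reading aid, the charge-counting arithmetic behind the band-filling statement above can be sketched as follows. It assumes the usual formal valences Rb+, Co2+, and SCN−, and complete charge transfer from the BEDT-TTF layer to the anion layer; this is an illustration based on the stoichiometry quoted in the abstract, not a calculation taken from the paper. Writing ρ for the average hole count per BEDT-TTF molecule, charge neutrality of (BEDT-TTF)2Rb2xCo(SCN)4 gives
\[
2\rho + 2x\,(+1) + (+2) + 4\,(-1) = 0 \quad\Longrightarrow\quad \rho = 1 - x ,
\]
so the HOMO-band filling (electrons per orbital, out of a maximum of 2) is
\[
f = \frac{2 - \rho}{2} = \frac{1 + x}{2} .
\]
For x = 0.6 this yields ρ = +0.4e per molecule and f = 0.80, which lies between f = 0.75 for α″-(BEDT-TTF)2CsHg(SCN)4 (monovalent anion, ρ = +0.5e, the "3/4 filling" of the title) and f = 0.85 for α″-(BEDT-TTF)2K1.4Co(SCN)4 (ρ = +0.3e). In this sense the Rb salt is electron-doped relative to the 3/4-filled case.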