Final Technical Report: Askaryan Calorimeter Experiment
University of Hawaii at Manoa (2023)
Abstract
The important next step was the development of large-area (1 m × 1 m) GEM planes. We also investigated opportunities to apply this technology to precision tracking detectors, both to significantly improve the performance of the Range Stack detector for CP-violation experiments and to provide an amplification layer for the liquid-argon time projection chamber in the LBNE experiment. We jointly developed 33 cm × 100 cm large GEM foils with the CERN gas detector development group to construct 33 cm × 100 cm unit chambers. Three of these unit chambers will be put together to form a 1 m × 1 m detector plane. Following characterization of one 33 cm × 100 cm unit-chamber prototype, a total of five 1 m × 1 m planes will be constructed and inserted into an existing 1 m³ RPC DHCAL stack to test the performance of the new GEM DHCAL in particle beams. The large-area GEM detector we planned to develop in this proposal not only gives DHCAL an important option for future collider experiments, but also has the potential to expand into Intensity Frontier and Cosmic Frontier experiments as high-efficiency, high-amplification anode planes for liquid-argon time projection chambers. Finally, thanks to its sensitivity to X-rays and other neutral radiation and its light weight, the large-area GEM has great potential for use in medical imaging and homeland security, as well as in satellite-based astronomy experiments.
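As a quick sanity check of the stated geometry (our arithmetic, not part of the original abstract, assuming the three unit chambers are tiled side by side along their 33 cm dimension):

$$3 \times 33\,\mathrm{cm} = 99\,\mathrm{cm} \approx 1\,\mathrm{m},$$

so three 33 cm × 100 cm unit chambers span essentially the full 1 m × 1 m plane, and five such planes instrument the existing 1 m³ stack.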
Keywords
Time Projection Chambers, Detector Performance