Analytical Solution and Experimental Verification for the Buckling Failure of Additively Manufactured Octagonal Honeycombs
Composite Structures (2023)
Abstract
In this study, an analytical closed-form expression for the buckling strength of additively manufactured octagonal honeycombs, with and without manufacturing-induced defects, was developed based on beam-column theory, and an analytical expression for the critical relative density of the octagonal honeycomb was also proposed. Plateau borders and wavy cell walls were considered in the studied structure. Finite element simulations and experiments on the buckling strength of octagonal honeycombs with and without defects were performed, and the results were compared with the theoretical predictions. The good agreement between the results validated the presented analytical solution for the buckling strength, and the critical relative density accurately predicted the failure modes of the honeycomb structures. The buckling resistance of the octagonal honeycomb was better than that of the traditional hexagonal honeycomb. For the octagonal honeycomb with defects, the analytical solution remained in good agreement with the experimental and simulation results. The buckling stress distribution, buckling mode, and buckling strength of the octagonal honeycomb were significantly affected by the presence of defects.
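The sketch below is not the paper's closed-form octagonal result; it only illustrates the general idea behind a critical relative density, namely the wall slenderness at which the elastic buckling strength and the plastic collapse strength of a honeycomb coincide, so that the governing failure mode switches. The coefficients C_EL, C_PL, and C_RHO are stand-ins taken from the classical Gibson-Ashby scalings for a regular hexagonal honeycomb, and the material properties are assumed illustrative values; the octagonal coefficients derived in the paper via beam-column theory would take their place.

```python
# Minimal sketch (assumptions only): failure-mode transition in a honeycomb.
#   elastic buckling:  sigma_el ~ C_EL * E_s * (t/l)^3
#   plastic collapse:  sigma_pl ~ C_PL * sigma_ys * (t/l)^2
#   relative density:  rho_rel ~ C_RHO * (t/l)
# Coefficients below are the regular-hexagon values, not the paper's octagonal ones.

E_s = 2.0e9        # solid Young's modulus [Pa]  -- assumed
sigma_ys = 40.0e6  # solid yield strength [Pa]   -- assumed
C_EL, C_PL, C_RHO = 0.22, 0.5, 2.0 / 3**0.5

def strengths(t_over_l):
    """Return (elastic buckling stress, plastic collapse stress) in Pa."""
    return C_EL * E_s * t_over_l**3, C_PL * sigma_ys * t_over_l**2

# Critical slenderness: the two strengths are equal, so the failure mode switches.
t_over_l_crit = C_PL * sigma_ys / (C_EL * E_s)
rho_rel_crit = C_RHO * t_over_l_crit
print(f"critical t/l = {t_over_l_crit:.4f}, critical relative density = {rho_rel_crit:.4f}")

for t_over_l in (0.02, 0.08):
    s_el, s_pl = strengths(t_over_l)
    mode = "elastic buckling" if s_el < s_pl else "plastic collapse"
    print(f"t/l = {t_over_l}: buckling {s_el/1e6:.2f} MPa, "
          f"plastic {s_pl/1e6:.2f} MPa -> failure by {mode}")
```

Below the critical relative density the buckling stress is the lower of the two and governs failure; above it, plastic collapse governs, which is how the paper's critical relative density is used to predict the failure mode.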
Key words
Octagonal honeycomb, Buckling, Defects, Failure mode, Additive manufacturing