
Flow Control Optimization Using Genetic Algorithms with Reduced Order Modeling

AIAA SCITECH 2024 FORUM(2024)

Ohio State University

Abstract
The theory of active flow control has been extensively developed and applied to mitigate undesirable flow features of aerodynamic systems. However, active control is often difficult to implement for applications described by complex, nonlinear physics. Genetic algorithms (GAs) offer an attractive alternative by mimicking natural selection to converge on an optimal control input for a given objective function. The principal benefit of the GA for our purposes is that it is data-driven, i.e., agnostic to the governing equations of the flow, and thus does not incur the simplifications typically adopted in traditional control approaches. In this work, a real-coded GA is first validated using a two-dimensional, algebraic test function as a surrogate fitness function; this exercise guides the choice of mutation, selection, and crossover parameters for rapid convergence to the optimal solution. The GA is then applied to the problem of a supersonic planar impinging jet to mitigate noise from aeroacoustic resonance modes. The formulation uses a dynamic mode decomposition-based reduced-order model (DMD-ROM) built from a large eddy simulation (LES) database to provide a very economical fitness function evaluation. This allows various combinations of control input forcing variables of notional actuators near the nozzle (amplitude, frequency, and phase) to be tested; such a sweep would be cost-prohibitive without the ROM. Results on the efficiency of the GA in finding the optimal energy forcing gain relative to a brute-force parametric sweep are discussed. Ongoing work will demonstrate how the GA converges on a control scheme to reduce jet noise with more complex fitness functions.
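As a rough illustration of the validation step described in the abstract, the sketch below runs a real-coded GA on a two-dimensional algebraic test function. The paper does not specify its test function or operators, so everything here is an assumption for illustration: the Rastrigin function as the surrogate fitness, tournament selection, BLX-alpha crossover, Gaussian mutation, and elitism are common real-coded GA choices, not necessarily those used by the authors.

```python
import random
import math

def rastrigin(x, y):
    """Assumed 2-D algebraic test function; global minimum of 0 at (0, 0)."""
    return 20 + x*x - 10*math.cos(2*math.pi*x) + y*y - 10*math.cos(2*math.pi*y)

def tournament(pop, fits, k=3):
    """Return the fittest (lowest-cost) of k randomly sampled individuals."""
    idx = min(random.sample(range(len(pop)), k), key=lambda i: fits[i])
    return pop[idx]

def blx_crossover(p1, p2, alpha=0.5):
    """BLX-alpha crossover: each child gene is drawn from an interval
    extended by alpha beyond the parents' gene values."""
    child = []
    for a, b in zip(p1, p2):
        lo, hi = min(a, b), max(a, b)
        span = hi - lo
        child.append(random.uniform(lo - alpha*span, hi + alpha*span))
    return child

def mutate(ind, rate=0.1, sigma=0.3, bound=5.12):
    """Per-gene Gaussian mutation, clipped to the search domain."""
    return [max(-bound, min(bound, g + random.gauss(0, sigma)))
            if random.random() < rate else g
            for g in ind]

def run_ga(pop_size=60, generations=100, seed=0):
    random.seed(seed)
    pop = [[random.uniform(-5.12, 5.12) for _ in range(2)]
           for _ in range(pop_size)]
    best = min(pop, key=lambda p: rastrigin(*p))
    for _ in range(generations):
        fits = [rastrigin(*p) for p in pop]
        elite = pop[fits.index(min(fits))]  # elitism: carry the best forward
        children = [elite]
        while len(children) < pop_size:
            c = blx_crossover(tournament(pop, fits), tournament(pop, fits))
            children.append(mutate(c))
        pop = children
        gen_best = min(pop, key=lambda p: rastrigin(*p))
        if rastrigin(*gen_best) < rastrigin(*best):
            best = gen_best
    return best, rastrigin(*best)

best, fitness = run_ga()
print(best, fitness)
```

In the paper's actual application, the cheap algebraic fitness above would be replaced by an evaluation of the DMD-ROM for a candidate set of actuator forcing parameters (amplitude, frequency, and phase), which is what makes the many fitness calls of a GA affordable relative to running the full LES.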
Key words
Control, Probabilistic Design Optimization, Fluid Dynamics