AB-UPT: Scaling Neural CFD Surrogates for High-Fidelity Automotive Aerodynamics Simulations via Anchored-Branched Universal Physics Transformers

  • Benedikt Alkin
  • Maurits Bleeker
  • Richard Kurle
  • Tobias Kronlachner
  • Reinhard Sonnleitner
  • Matthias Dorfer
  • Johannes Brandstetter*

*Corresponding author for this work

Research output: Working paper and reports › Preprint

Abstract

Recent advancements in neural operator learning are paving the way for transformative innovations in fields such as automotive aerodynamics. However, key challenges must be overcome before neural network-based simulation surrogates can be implemented at an industry scale. First, surrogates must become scalable to large surface and volume meshes, especially when using raw geometry inputs only, i.e., without relying on the simulation mesh. Second, surrogates must be trainable with a limited number of high-fidelity numerical simulation samples while still reaching the required performance levels. To this end, we introduce Geometry-preserving Universal Physics Transformer (GP-UPT), which separates geometry encoding and physics predictions, ensuring flexibility with respect to geometry representations and surface sampling strategies. GP-UPT enables independent scaling of the respective parts of the model according to practical requirements, offering scalable solutions to open challenges. GP-UPT circumvents the creation of high-quality simulation meshes, enables accurate 3D velocity field predictions at 20 million mesh cells, and excels in transfer learning from low-fidelity to high-fidelity simulation datasets, requiring less than half of the high-fidelity data to match the performance of models trained from scratch.
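The abstract's core design idea, separating geometry encoding from physics prediction so each part can be scaled independently and queried at arbitrary locations, can be illustrated with a minimal sketch. This is not the authors' architecture; the functions `encode_geometry` and `predict_velocity` are hypothetical stand-ins (random projections and a soft attention pool instead of learned networks), meant only to show the interface: raw surface points go in, a fixed-size latent set comes out, and velocities are decoded at arbitrary query points independent of any simulation mesh.

```python
import numpy as np

rng = np.random.default_rng(0)

def encode_geometry(surface_points, latent_dim=16, n_tokens=8):
    """Hypothetical geometry encoder: compress raw surface points into a
    fixed number of latent tokens, independent of the input point count."""
    # Random projection stands in for a learned point-feature network.
    W = rng.standard_normal((surface_points.shape[1], latent_dim))
    feats = np.tanh(surface_points @ W)
    # Pool groups of points into tokens (stand-in for learned pooling).
    groups = np.array_split(feats, n_tokens)
    return np.stack([g.mean(axis=0) for g in groups])  # (n_tokens, latent_dim)

def predict_velocity(query_points, latent_tokens):
    """Hypothetical physics decoder: attend each query location to the
    latent tokens and emit a 3D velocity vector per query."""
    Wq = rng.standard_normal((query_points.shape[1], latent_tokens.shape[1]))
    q = query_points @ Wq                        # (n_query, latent_dim)
    attn = q @ latent_tokens.T                   # (n_query, n_tokens)
    attn = np.exp(attn - attn.max(axis=1, keepdims=True))
    attn /= attn.sum(axis=1, keepdims=True)      # softmax over tokens
    ctx = attn @ latent_tokens                   # (n_query, latent_dim)
    Wo = rng.standard_normal((latent_tokens.shape[1], 3))
    return ctx @ Wo                              # (n_query, 3) velocities

surface = rng.standard_normal((10_000, 3))  # raw geometry, no simulation mesh
queries = rng.standard_normal((500, 3))     # arbitrary volume query locations
tokens = encode_geometry(surface)
velocity = predict_velocity(queries, tokens)
print(velocity.shape)  # (500, 3)
```

The key property this interface illustrates is that the decoder's cost scales with the number of query points, not the surface resolution, which is what allows predictions at tens of millions of mesh cells without encoding a mesh of that size.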
Original language: English
Number of pages: 43
DOIs
Publication status: Published - 13 Feb 2025

Publication series

Name: arXiv.org
No.: 2502.09692

Fields of science

  • 102003 Image processing
  • 101019 Stochastics
  • 103029 Statistical physics
  • 101018 Statistics
  • 102001 Artificial intelligence
  • 101017 Game theory
  • 202017 Embedded systems
  • 101016 Optimisation
  • 101015 Operations research
  • 101014 Numerical mathematics
  • 101029 Mathematical statistics
  • 101028 Mathematical modelling
  • 101026 Time series analysis
  • 101024 Probability theory
  • 102032 Computational intelligence
  • 102004 Bioinformatics
  • 102013 Human-computer interaction
  • 101027 Dynamical systems
  • 305907 Medical statistics
  • 101004 Biomathematics
  • 305905 Medical informatics
  • 101031 Approximation theory
  • 102033 Data mining
  • 102 Computer Sciences
  • 305901 Computer-aided diagnosis and therapy
  • 102019 Machine learning
  • 106007 Biostatistics
  • 102018 Artificial neural networks
  • 106005 Bioinformatics
  • 202037 Signal processing
  • 202036 Sensor systems
  • 202035 Robotics

JKU Focus areas

  • Digital Transformation
