Parallel Natural Convection Simulation on GPU Using Lattice Boltzmann Method
Main Authors: Widyaparaga, Adhika; Pranowo
Format: Proceeding (Peer Reviewed)
Online Access:
- http://e-journal.uajy.ac.id/13802/1/C7_04_IJJSS_2012.pdf
- http://e-journal.uajy.ac.id/13802/2/Peer_Review_C7_04_IJJSS_2012.pdf
- http://e-journal.uajy.ac.id/13802/3/Cek_Turnitin_C7_04_IJJSS_2012.pdf
- http://e-journal.uajy.ac.id/13802/
Contents:
- In this paper, we propose the implementation of a parallel lattice Boltzmann method (LBM) algorithm for the simulation of two-dimensional natural convection heat transfer problems. The LBM code is written in NVIDIA's CUDA C and runs on the graphics processing unit (GPU) of an NVIDIA GeForce 9600 card. The behaviour of the convective flows, which are driven by buoyancy forces, was studied numerically for Rayleigh numbers (Ra) from 1×10³ to 1×10⁶. Our numerical results show good agreement with the experimental and numerical results reported in the literature. A performance comparison shows that GPU-accelerated processing is significantly faster than CPU-only processing.
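The paper itself does not reproduce its GPU kernels, so the sketch below only illustrates the general approach the abstract describes: a D2Q9 BGK collision kernel with a simple Boussinesq-type buoyancy term and a pull-style streaming kernel, each mapping one CUDA thread to one lattice node. The grid size, relaxation time `tau`, buoyancy coefficient `gBeta`, reference temperature `T0`, and the periodic boundaries are illustrative assumptions, not values from the paper; a natural-convection cavity would instead use bounce-back walls and fixed-temperature boundaries, and a full thermal LBM would also evolve the temperature field with its own distribution functions.

```cuda
// Minimal sketch of a GPU lattice Boltzmann step (D2Q9, BGK) with a
// Boussinesq buoyancy term. All parameters are illustrative, not the paper's.
#include <cstdio>
#include <cuda_runtime.h>

#define NX 128   // lattice width  (assumed size)
#define NY 128   // lattice height (assumed size)
#define Q  9     // D2Q9 velocity set

__constant__ float w[Q]  = {4.f/9, 1.f/9, 1.f/9, 1.f/9, 1.f/9,
                            1.f/36, 1.f/36, 1.f/36, 1.f/36};
__constant__ int   cx[Q] = {0, 1, 0, -1, 0, 1, -1, -1, 1};
__constant__ int   cy[Q] = {0, 0, 1, 0, -1, 1, 1, -1, -1};

__host__ __device__ inline int idx(int i, int x, int y) {
    return i * NX * NY + y * NX + x;
}

// BGK collision: relax toward equilibrium and add a simple buoyancy force
// along +y, proportional to the local temperature deviation T - T0.
__global__ void collide(float *f, const float *T,
                        float tau, float gBeta, float T0) {
    int x = blockIdx.x * blockDim.x + threadIdx.x;
    int y = blockIdx.y * blockDim.y + threadIdx.y;
    if (x >= NX || y >= NY) return;

    float rho = 0.f, ux = 0.f, uy = 0.f, fi[Q];
    for (int i = 0; i < Q; ++i) {              // macroscopic density, velocity
        fi[i] = f[idx(i, x, y)];
        rho += fi[i];
        ux  += fi[i] * cx[i];
        uy  += fi[i] * cy[i];
    }
    ux /= rho;  uy /= rho;

    float force = gBeta * (T[y * NX + x] - T0);
    float usq   = ux * ux + uy * uy;
    for (int i = 0; i < Q; ++i) {
        float cu  = cx[i] * ux + cy[i] * uy;
        float feq = w[i] * rho * (1.f + 3.f * cu + 4.5f * cu * cu - 1.5f * usq);
        f[idx(i, x, y)] = fi[i] - (fi[i] - feq) / tau
                        + 3.f * w[i] * rho * cy[i] * force;  // first-order forcing
    }
}

// Pull-style streaming with periodic wrapping (a cavity simulation would
// apply bounce-back at the walls instead).
__global__ void stream(const float *fsrc, float *fdst) {
    int x = blockIdx.x * blockDim.x + threadIdx.x;
    int y = blockIdx.y * blockDim.y + threadIdx.y;
    if (x >= NX || y >= NY) return;
    for (int i = 0; i < Q; ++i) {
        int xs = (x - cx[i] + NX) % NX;
        int ys = (y - cy[i] + NY) % NY;
        fdst[idx(i, x, y)] = fsrc[idx(i, xs, ys)];
    }
}

int main() {
    size_t nf = (size_t)Q * NX * NY, nT = (size_t)NX * NY;
    float *f, *ftmp, *T;
    cudaMallocManaged(&f,    nf * sizeof(float));
    cudaMallocManaged(&ftmp, nf * sizeof(float));
    cudaMallocManaged(&T,    nT * sizeof(float));

    // Start from rest at unit density with a fixed horizontal temperature gradient.
    float wh[Q] = {4.f/9, 1.f/9, 1.f/9, 1.f/9, 1.f/9,
                   1.f/36, 1.f/36, 1.f/36, 1.f/36};
    for (int y = 0; y < NY; ++y)
        for (int x = 0; x < NX; ++x) {
            T[y * NX + x] = (float)x / (NX - 1);   // hot left, cold right
            for (int i = 0; i < Q; ++i) f[idx(i, x, y)] = wh[i];
        }

    dim3 block(16, 16), grid((NX + 15) / 16, (NY + 15) / 16);
    for (int step = 0; step < 1000; ++step) {
        collide<<<grid, block>>>(f, T, 0.6f, 0.0001f, 0.5f);
        stream<<<grid, block>>>(f, ftmp);
        float *swap = f; f = ftmp; ftmp = swap;    // ping-pong buffers
    }
    cudaDeviceSynchronize();
    printf("f0 at centre after 1000 steps: %f\n", f[idx(0, NX / 2, NY / 2)]);
    cudaFree(f); cudaFree(ftmp); cudaFree(T);
    return 0;
}
```

Assigning one thread per lattice node lets neighbouring threads read and write contiguous distribution values, which is the usual reason a GPU LBM code of this kind outperforms an equivalent serial CPU loop over the same lattice.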