1. AstroBEAR and Research
Ablative Rayleigh-Taylor (RT) Problem
Implementation in the code
- Important Equations and Parameters: bliu01092014 and bliu01142014
- Stencils for the diffusion solver: ThermalConduction
- Testing of the code: #309, #331, #345; blog
- References: Thermal Conduction Test
- Transfer the initial data to a coarse grid with AstroBEAR so it can take advantage of AMR: bliu08282014
2D Case
- Growth rate (dispersion-relation sketch below the table)
200 Extended Zones | movie |
5 Extended Zones | movie |
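For reference, these growth rates are typically compared against the classical Rayleigh-Taylor rate and a Takabe-type ablative rate. This is general background rather than content from the linked movies; A is the Atwood number, k the perturbation wavenumber, g the acceleration, V_a the ablation velocity, and alpha (about 0.9) and beta (of order a few) are fitted constants:
{{{
\gamma_{\rm classical} = \sqrt{A k g}, \qquad
\gamma_{\rm ablative} \simeq \alpha \sqrt{k g} - \beta k V_a
}}}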
3D Case
- Initial Profile from LLE
Original Doc | parameters; gravity = 0.4*acceleration0 |
Original Data for thick Target | density; pressure; temperature; Vx; Vy; Vz |
Original Data for thin Target | density; pressure; Vz |
Thick Target Data files in text (normalized using the parameters above, initial profile along the z direction; loading sketch below the table) | density; pressure; temperature; Vx; Vy; Vz |
Thin Target Data files in text (normalized using the parameters above, initial profile along the z direction) | density; pressure; Vz |
Thick Target: Initial Profile along center line | |
Thick Target Results from AstroBEAR (fixed grid) | movie1; movie2 |
Thin Target: Initial Profile along center line | |
Thin Target Results from AstroBEAR (fixed grid) | movie |
Thin Target Results from AstroBEAR (coarse grid and AMR) | 1-level AMR; 3-level AMR; 3-level AMR with mesh |
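A minimal sketch of loading the normalized text profiles listed above with numpy, assuming whitespace-separated ASCII columns and placeholder file names (the actual layout of the linked files may differ):
{{{
#!python
import numpy as np

# Placeholder file names; the real profiles are linked in the table above
density  = np.loadtxt("density.txt")
pressure = np.loadtxt("pressure.txt")
vz       = np.loadtxt("Vz.txt")

# The initial profile runs along the z direction, so each array is indexed by z
print(density.shape, pressure.shape, vz.shape)
}}}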
2. Teaching, Presentations, Posters and Other Useful Documents
Configure File for AstroBEAR | Notes on using Autotools to generate configure files for the AstroBEAR code. 2014.10 |
CUDA | CIRC Summer School 2014.8 |
Scaling Test of AstroBEAR on BlueHive2 and Stampede | CIRC All-Staff 2014.5 |
Introduction to AstroBEAR2.0 | CIRC All-Staff 2013.8 |
AstroBEAR2.0 and its Performance on BlueGene/Q | Annual CIRC Poster Session 2013 & 2014.5 |
Comparison of Benchmarks Across CIRC Supercomputing Architectures | Annual CIRC Poster Session 2012.5 |
3. Scripts
A collection of useful shell, gnuplot and Python scripts and codes
Python function with multiple return values | multiple return function | test; one of the returned values is a list (sketch below)
Python script to read a CSV file | read csv file; example csv file | The example CSV file comes from DDiMAP (sketch below)
Python scripts to read, write and parse FASTA files | read; write; parse; fasta file to a list; example fasta file | Biopython needed; the example FASTA file comes from DDiMAP (sketch below)
Python script to save a plot to a curve file in VisIt | Save to curve | Save a line plot to a curve/text file
Python script to rotate and save a window in VisIt | Rotate&Save | Tested with VisIt 2.6.2 (sketch below)
Relink all executable files in all subdirectories | Relink astrobear for all subfolders | For Bruce, on alfalfa
Convert GIF files to movie files on a Mac | Movie Convert | Uses ffmpeg
Interpolate from fine grids to coarse grids with AstroBEAR | blog:bliu08282014 | Set Info%ErrFlag to 1; read in the data and set it as a level-1 AMR grid instead of level 0
Cherry-pick the values for specific time steps | transfer gravity | Pick out the gravity values around a given time step
HDF5 data to text | HDF5 to ASCII; ASCII to TXT | The HDF5 files need to come from single-core runs (sketch below)
Read in Chombo files with multiple processors and rewrite the same frame using a single core; useful when converting HDF5 files to text files | scrambler.f90; subst for run directory; script | Replace the scrambler.f90 file with the new one; the frame number in the script needs to be updated. 8.19.2014
Bash and gnuplot script: plot multiple data files to multiple EPS files | plot.sh; gnuplot template | Bash arithmetic expressions; LaTeX in gnuplot; using gnuplot variables in titles; string parsing in bash; and more
Scripts to convert HDF4 files to text on BlueHive2 | hdf4_to_ascii; combine_pos_var; combine_pos_var_v2 | HDF4 module needed
Extend 2D data to 3D (set precisions!) | SI along y; SI along z; cgs along z | Generate 3D initial data for testing the ablative RT 3D module code; the initial profile is along the y/z direction and everything along z is the same. 2D data in 2Ddata/, generated 3D data in 3Ddata/ (sketch below)
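A minimal sketch of the multiple-return-value pattern from the scripts table above (function and variable names are illustrative, not the linked script):
{{{
#!python
def min_max_evens(values):
    """Return several results at once; the last one is a list."""
    lo = min(values)
    hi = max(values)
    evens = [v for v in values if v % 2 == 0]
    return lo, hi, evens            # packed into a tuple on return

# Tuple unpacking at the call site
lo, hi, evens = min_max_evens([3, 8, 1, 6])
print(lo, hi, evens)                # 1 8 [8, 6]
}}}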
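A minimal sketch of reading a CSV file with the standard library, assuming a header row and a placeholder file name (the linked script and DDiMAP example file may differ):
{{{
#!python
import csv

with open("example.csv", newline="") as f:
    reader = csv.DictReader(f)       # first row is treated as the header
    rows = list(reader)              # each row becomes a dict keyed by column name

print(len(rows), "rows")
if rows:
    print("columns:", list(rows[0].keys()))
}}}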
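A minimal sketch of the FASTA read/parse/write operations with Biopython's SeqIO; file names are placeholders and the linked scripts may differ in detail:
{{{
#!python
from Bio import SeqIO

# Parse a FASTA file into a list of SeqRecord objects
records = list(SeqIO.parse("example.fasta", "fasta"))
for rec in records:
    print(rec.id, len(rec.seq))

# Write the records back out to a new FASTA file
SeqIO.write(records, "copy.fasta", "fasta")
}}}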
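A minimal sketch of the rotate-and-save idea using VisIt's CLI Python interface; the database name, variable name, and rotation path are assumptions, and the linked script (tested with VisIt 2.6.2) remains the reference:
{{{
#!python
# Run from VisIt's CLI, e.g. visit -cli -s rotate_save.py
import math

OpenDatabase("run.chombo.hdf5")          # placeholder database name
AddPlot("Pseudocolor", "rho")            # placeholder variable name
DrawPlots()

swatts = SaveWindowAttributes()
swatts.format = swatts.PNG
swatts.family = 0
SetSaveWindowAttributes(swatts)

for i in range(36):                      # sweep the camera in 10-degree steps
    angle = math.radians(10.0 * i)
    v = GetView3D()
    v.viewNormal = (math.cos(angle), 0.0, math.sin(angle))
    SetView3D(v)
    swatts.fileName = "frame%04d" % i
    SetSaveWindowAttributes(swatts)
    SaveWindow()
}}}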
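A minimal sketch of dumping a dataset from an HDF5 file to plain text with h5py and numpy; the file and dataset names are placeholders, and the internal layout of AstroBEAR/Chombo files is not reproduced here:
{{{
#!python
import h5py
import numpy as np

with h5py.File("data.hdf5", "r") as f:
    print(list(f.keys()))                    # inspect the top-level groups/datasets
    data = f["density"][...]                 # placeholder dataset name

# savetxt wants a 1-D or 2-D array, so flatten any trailing dimensions
np.savetxt("density.txt", data.reshape(data.shape[0], -1))
}}}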
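A minimal sketch of extending a 2D initial profile to 3D by copying it along the new axis, as in the last row of the table; file names, array layout, and the output precision are assumptions:
{{{
#!python
import numpy as np

data2d = np.loadtxt("2Ddata/density.txt")    # assumed shape (ny, nx)
nz = 64                                      # number of identical copies along z

# Repeat the 2D plane so every slice along the new z axis is the same;
# the result has shape (nz, ny, nx)
data3d = np.repeat(data2d[np.newaxis, :, :], nz, axis=0)

# Write with an explicit format so the 3D files keep full double precision
np.savetxt("3Ddata/density.txt", data3d.reshape(nz, -1), fmt="%.15e")
}}}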
Tests