08/26/2013 Zhuo
- I updated my page. The solution using the HLLC scheme is here: https://astrobear.pas.rochester.edu/trac/astrobear/wiki/u/zchen/HLLC
- I have written a Roe-scheme Riemann solver with an entropy fix. I will post its solution on my page tonight.
NLUF stuff
Draft of the initial condition setup (shock speed = 80 um/ns, coming from the left).
In the rad-hydro simulation, the radiation precursor would hit the target first, followed by the thick ablator front:
Here's a cut of density using Spitzer resistivity at 60 ns. The shock comes from the left at a speed of 80 um/ns (about Mach 60):
One problem is that the post-shock temperature I'm getting in this simulation is about 600 eV (which is correct for the adiabatic jump condition). This is very different from Peter's earlier rad-hydro simulation (only ~10 eV), so radiative cooling needs to be included in the simulation. Our in-development radiative transfer could be used to generate the initial condition:
Marshak shock
Order of accuracy
Hi all, here is a plot showing the error I computed for my Poisson solver.
The equation it is solving is:
Laplacian(u) = -pi*exp(x)*[pi*exp(x)*cos(pi*exp(x)) + sin(pi*exp(x))]
The analytic solution is u = cos(pi*exp(x))
I calculated the error for different dx, using E = u_approx - u_exact,
where u_approx is the numerical solution at x = 0 and u_exact is the exact solution at x = 0 (= -1)
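As a check, differentiating the analytic solution twice recovers the right-hand side: u = cos(pi*exp(x)), u' = -pi*exp(x)*sin(pi*exp(x)), u'' = -pi*exp(x)*[pi*exp(x)*cos(pi*exp(x)) + sin(pi*exp(x))], and at x = 0 this gives u(0) = cos(pi) = -1.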
The error is supposed to go like:
E ~ (dx)^n
where n is the order of accuracy.
After computing E for the different dx, I plotted it against what a 1st, 2nd, 3rd order accurate scheme would produce for the given step-size dx:
This seems wonky. The trend starts out near 2nd order accurate, but as I reduce dx it begins to diverge from that. This is the opposite of what I would expect to happen…
I increased the zones in the sequence (25, 50, 100, 200, 400, 800), and computed the error at a fixed x = 0. I wonder if changing x would give better results… either a different fixed position along u, or a varying position where the tolerance criterion was minimized…
*UPDATE*
Okay, so I checked in 3 different norms whether this trend still happens, and I see it still… Basically it seems to be 2nd order until dx gets small, where the error begins to increase…
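For reference, here is a minimal standalone sketch (with placeholder error values rather than my actual measurements) of how the observed order can be computed from errors at successive resolutions; for E ~ (dx)^n, halving dx gives n = log(E(dx)/E(dx/2))/log(2):

! Minimal sketch: estimate the observed order of accuracy n from errors
! at successive resolutions (dx halved each time). The err values below
! are placeholders, not actual measurements.
program observed_order
  implicit none
  integer, parameter :: nres = 6
  real(8) :: err(nres), n
  integer :: i
  ! errors at 25, 50, 100, 200, 400, 800 zones (placeholder values)
  err = (/ 1.0d-2, 2.5d-3, 6.3d-4, 1.6d-4, 4.0d-5, 1.0d-5 /)
  do i = 1, nres-1
     n = log(err(i)/err(i+1))/log(2d0)
     print *, 'observed order between', 25*2**(i-1), 'and', 25*2**i, 'zones:', n
  end do
end program observed_order

A cleanly 2nd-order scheme would give n close to 2 for every pair; a drop in n at the finest resolutions is the usual signature of another error source (e.g. roundoff or a boundary/evaluation-point error) taking over.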
PN Stuff
In the PN stuff, the wind and jet temperatures are << the shock temperature, which can cause feedback at the spherical boundary… Here is an image of the same run with an outflow and a jet in which the temperatures were changed from 1,000 to 10,000 K. The shock temperatures are ~100,000 K.
And here's a lemon
Radiation interface
To use radiation you need to
- Add configuration options to global.data
!============================================================================================
! Variables used for radiative transfer module
!============================================================================================
&RadTransferData
solver = 2          ! Which solver to use for the linear system solve? StructPCG=1, StructGMRes=2, StructSMG=3 [1]
tolerance = 1e-4    ! Solver convergence tolerance [1e-6]
printlevel = 0      ! Sets print level for the hypre solver. Increasing this makes the solvers more verbose. [0]
MaxIters = 10000    ! Maximum number of iterations to attempt convergence. [1000]
hverbosity = 0      ! Anything > 0 will cause linear systems to be output as matrix, rhs, lhs files. [0]
mthbc = 6 2 2 1 2 2 ! What type of boundary conditions to use for the radiation field
                    ! format: (x1, y1, z1, x2, y2, z2)
                    ! 0-Free Streaming, 1-Extrapolated, 2-Periodic, 3-Reflecting,
                    ! 4-User defined boundary, 5-User defined radiative flux
RadFlux = 824931.169876 ! User defined radiative flux used for mthbc 5
Crankiness = 1      ! Adjusts from backward Euler (0) to Crank-Nicolson (1) time stepping.
                    ! (2 is forward Euler, but nothing over 1 should be used; 0 is more stable but less accurate.)
cfl_rad = .01
- Set iRadiation in physics.data
iRadiation = 1 ! 0-no radiation, 1-implicit terms only (heating/cooling), 2-implicit and explicit (includes pressure forces and advection)
iLimiter = 1   ! LP_FLUX_LIMITER=0, DIFFUSION_ONLY=1, NO_FLUX_LIMITER=2
- Set opacity related variables
!================================================================================
! Opacity related section
!================================================================================
PlanckOpacityType = 1        ! [1]-Specific Opacity, 2-Power law opacity
PlanckSpecificOpacity = 1000 ! computational units
RoselandOpacityType = 2      ! [1]-Specific Opacity, 2-Power law opacity
RoselandSpecificOpacity = 1  ! computational units
Roseland_dpow = 2
Roseland_Tpow = -3
- Make sure to initialize the radiation field. The equilibrium value is given by scalerad*T^4. Not all objects will support this.
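For the power-law option, here is one plausible reading of the opacity parameters above (an assumption based on the parameter names, not taken from the radiation module itself): kappa = SpecificOpacity * rho^dpow * T^Tpow in computational units. A minimal sketch:

! Sketch only: assumes kappa = RoselandSpecificOpacity * rho**dpow * T**Tpow
! in computational units; the actual expression in the code may differ.
function rosseland_kappa(rho, T) result(kappa)
  implicit none
  real(8), intent(in) :: rho, T
  real(8) :: kappa
  real(8), parameter :: RoselandSpecificOpacity = 1d0
  real(8), parameter :: dpow = 2d0, Tpow = -3d0  ! Roseland_dpow, Roseland_Tpow
  kappa = RoselandSpecificOpacity * rho**dpow * T**Tpow
end function rosseland_kappa

Similarly, the Crankiness parameter chi presumably blends the time levels in the implicit solve as Ebar = (1 - chi/2)*E^(n+1) + (chi/2)*E^n, which reproduces backward Euler at chi = 0, Crank-Nicolson at chi = 1, and forward Euler at chi = 2 (again an inference from the comment in global.data, not from the source).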
Journal Club 0820 Agenda
- Review operator splitting method (papers attached).
- Test problems for Federrath and Krumholz accretion algorithms.
- Ruka's PN simulations.
- Introduction of radiation transport interface.
- Incorporating devs from outside of our group.
Meeting update
I had so much stuff floating around in my head, I wanted to record it all before I lost it… Here is a page on most of what I learned from the Poisson solver problem: https://astrobear.pas.rochester.edu/trac/astrobear/wiki/u/erica/PoissonSolver
I would still like to add my references and such to it (for future reference should I need them), and try to answer the questions down at the bottom I posted.
I also chewed over how to add the solver to my 1D fluid code and check the Jeans instability growth rate with it… I am a bit stuck here, although I have some ideas. Will share with the group in the meeting shortly.
Meeting Update 0819
- Triggered Star Formation
A comparison between Krumholz and Federrath accretion algorithms:
movie
We should be able to finish all the runs needed for the first paper of this project by the end of August. Two things to think about: (1) what to say in the paper, and (2) we should start accumulating low-resolution simulations (on Bluestreak, since it's free) for the next step we want to take (magnetic fields?).
- Ablative RT
The following is a movie showing the evolution of density and temperature over 5 instability growth times (assuming a single mode, t = 1/sqrt(k*g), where k = 2*pi/l and l is the width of the box). 5 growth times = 1 sound crossing time after fixing the scaling.
movie
The next step is to continue checking the input: (1) the boundary condition I've been using is slightly different from Riccardo's, which may explain the fluctuation at the boundary; (2) the effect of varying g; (3) using the "homebrew" initial condition.
- NLUF
Draft of the initial condition setup (shock speed = 80 km/s, coming from the left).
In the rad-hydro simulation, the radiation precursor would hit the target first, followed by the thick ablator front:
I have not decided how to mimic this setup; the simplest thing to do is to run with a single shock. The shocked runs should take a few hours each. I also checked the stability of the setup over the simulation time (6 ns, as used in the rad-hydro simulations), since the setup is not in pressure equilibrium:
density movie
temperature movie
Another thing that we may put in is a constant pressure on the boundary to simulate the incoming laser pulse.
Meeting Update 08/19/2013 - Eddie
Pulsed Jets
I ran into a couple of errors with the 2.5D pulsed jet runs last week. After some troubleshooting, I finally figured out that they were memory issues due to the projections (emission maps). Jonathan has some ideas for improving the code, but I think the best thing for me to do right now is to lower the number of levels for the projections.
So my simulations will be refined on cooling time at 42 x 420 + 7 levels, but the projections will be at 42 x 420 + 4 levels (or 672 x 6720, since projections are actually fixed-grid images).
I did a quick test to make sure this would work. Previously, my runs were stalling at the first frame dump, so I couldn't get any output. By decreasing the plot level for the emission maps, I got 4 frames in less than 5 minutes on 1080 cores. So it looks like these runs will work now.
1D Radiative Shocks
I exchanged a bunch of emails with Pat today. There are some discrepancies between our results, but they follow the same trends, and we are working on sorting the discrepancies out. I sent him the fully ionized model that he requested, and I am also going to rerun all of the original 18 models and send him the data tomorrow.
Lab Jets
Francisco got back in touch with me and referred me to a paper on plasma simulations from Spain. Those simulations use Kr and Xe plasmas, while Francisco's experiments involve an Al plasma jet entering an Ar medium. There are definitely some parallels here, so I will read the paper more closely and see if any of it might be useful for AstroBEAR.
Jets etc...
Thought about updating the outflow module to be more transparent and to help launch a spherical outflow with an overlaid collimated or conical outflow…. See the attached PowerPoint slide for discussion…
Also updated Ruka's module to launch jets in 2.5D (not using multiple outflow objects, though)
Update 8/19
- The "memory error" bug seems to have been fixed *knock on wood*
- The low-res (~25 cells/radius) grid on the PN page should be fleshed out again by the end of tonight
- Will queue up the higher-res (64 cells/radius) runs overnight; if they work, we can probably close the ticket I opened.
- Baowei helped me set up accounts on Bluestreak and Stampede
- I probably don't need them anymore… but it can't hurt?
meeting update Aug/19/2013 Zhuo
- HLLC using an approximate solver for P_star and u_star is working and correct. I will update my page later. The next goal is the Roe solver.
- Reading books and papers (all previously mentioned).
- I am trying to develop a more general method to initialize the boundary condition for co-rotating binaries (with a prescribed elliptical orbit and arbitrary wind and orbital velocities) using an interpolation method. I roughly have the idea and am confident it should work.
Meeting Update 08/19/2013 -- Baowei
- Tickets
- new: none
- closed: #303 (Changing of gamma in global.data causes oddities)
- Users
- New user from Chalmers University of Technology in Sweden (AGB and pre-PNe outflow modeling)
- Josh of Clemson's r-theta polar coordinate code
- Thanks everybody for your help during Keira's visit
- Resources
- Archived Shule's data on /bamboodata/: 5TB available
- Martin's Teragrid allocation: progress report for renewal
fixed length scale in the RT data files
After fixing a length-scaling problem in the original data files, I was able to get a better equilibrium. The original parameters still need to be tuned (kappa needs to be reduced by a factor of 5). The following is a movie showing the evolution of density and temperature over 5 instability growth times (assuming a single mode, t = 1/sqrt(k*g), where k = 2*pi/l and l is the width of the box).
movie
Teragrid Allocation Policy
- EXTENSIONS
At the end of each 12-month allocation period, any unused compute SUs will be forfeited.
- Extensions of allocation periods beyond the normal 12-month duration
- Reasons for extensions: problems in consuming the allocation, for example unexpected staffing changes
- Length of extension: 1-6 months
- Procedure: a brief note for the reason through POPS via the XSEDE User Portal.
- RENEWAL
If a PI wishes to continue computing after the expiration of the current allocation period, he/she should submit a Renewal request.
- In most cases, they should submit this request approximately one year after their initial request submission, so that it can be reviewed and awarded without any interruption. (July 15th for Martin's allocation and April 15th for Adam's allocation.)
- Procedure: Progress Report (3 pages)
resolution needs for my 2.5D jet sims
The run information for my original 2.5D runs was purged from my Kraken directories, so I don't have any run-time estimates from those.
I did a 32 x 320 + 3 run on Grass today. It took 42 minutes on 8 procs (5.6 cpu hrs). This is with the new cooling-time refinement criterion.
Kraken processors are similar to Grass processors in terms of speed (2.6 GHz vs 2.5 GHz). The resolution that I think I need is 42 x 420 + 7. This should take 441 times as long, thus requiring about 2374.62 cpu hrs.
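The factor of 441 is the square of the per-dimension increase in effective resolution: (42 x 2^7)/(32 x 2^3) = 5376/256 = 21, and 21^2 = 441, i.e. the estimate assumes the cost scales with the total number of cells in the 2D grid.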
Journal Club 0813 Agenda
- Erica's 2D fluid code.
- Eddie's jet simulations.
- Adam's paper review.
Meeting update for 07/08/2013
So Marshak boundary conditions are mixed
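For reference, the standard Marshak condition at a left edge with incident flux F_inc and opacity kappa (the exact coefficients in our implementation may differ) is

(c/4)*E - (c/(6*kappa))*dE/dx = F_inc, i.e. E - (2/(3*kappa))*dE/dx = (4/c)*F_inc,

which mixes E and its gradient at the boundary.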
One could instead do
or
so that
In any event, if we discretize this mixed equation we get (E_g is the ghost zone, E_i is the internal zone at the left edge)
which we can solve for
as
and we have
If we then plug this into the time evolution equation ignoring portions that are unchanged we get:
where
we get
where
is the time-averaged energy and depends on the time stepping (Crank-Nicolson or forward Euler), and finally we arrive at
This is very similar to what we normally get, i.e.
and noting that
for the limiter, the flux is just
Note that
for
and that if
we revert to what we would expect if there were no radiation energy in the ghost zone.
Meeting Update 0812
- Ablative RT: after experimenting with various parameters, it seems the conductivity we were using previously was too high. The conductivity that keeps the ablation front in the same place should be about 2e-17 (the original parameter was about 3e-14). Here is a comparison between different conductivities (red: 2e-17, green: 3e-17, blue: 5e-17, white: 7e-17):
http://www.pas.rochester.edu/~shuleli/rtnew/diff4.gif
The temperature and density evolution with kappa = 2e-17:
http://www.pas.rochester.edu/~shuleli/rtnew/rt2e17.gif
Comparison between kappa = 2e-17 and the non-conductive case:
http://www.pas.rochester.edu/~shuleli/rtnew/3comp.gif
Comparison between kappa = 2e-17, flux = 5e+21 (blue) and kappa = 2e-17, flux = 5e+18 (red):
http://www.pas.rochester.edu/~shuleli/rtnew/2flux.gif
- The triggered star formation simulation: the non-rotating case has finished:
http://www.pas.rochester.edu/~shuleli/rtnew/tsfnr.gif
The parallel-rotation case with K = 0.1 is at frame 78 of 200 and should finish this week.
http://www.pas.rochester.edu/~shuleli/rtnew/tsfrot1.gif
Meeting update 8/12
- My runs had been sitting in the queue for about a week, and once I was able to run the hi-res MHD, I got lots of memory errors, for which I submitted ticket #306.
- I re-ran all of the MHD cases at hi-res, and they all crash at a resolution of 64 cells/radius. At 32 cells/radius, the stratified diverging wind case and the constant-ambient jet case finish, with the rest giving the same error. At ~25 cells/radius, they all finish.
Meeting Update 08/12/2013 - Eddie
- I got updates to my pulsed jet code (thank you Jonathan and Baowei), and now I will test these changes on Kraken. I'm also still working on the paper.
- Regarding the 2.5D runs… the most recent resolution that I ran these at was 32 x 320 + 4. This gave an effective resolution of 3.92 AU/cell, or 85.33 cells per jet radius. According to a plot that I made a while ago (https://astrobear.pas.rochester.edu/trac/astrobear/blog/ehansen04232013), the minimum cooling length for typical internal shocks in this setup should be approximately 4 AU. Therefore, to ensure 10 cells per cooling length we would need 10 times the aforementioned resolution: the effective grid goes from 32 x 2^4 = 512 to 40 x 2^7 = 5120 cells across the short dimension, exactly a factor of 10. This would be 40 x 400 + 7.
- Anna compared some of our 1D radiative shock models, and this is a plot that she sent me.
The behavior of our results looks similar, which is good. One thing to note is that my code, for whatever reason, always gives slightly longer cooling distances. I think this makes sense if the code that Anna is using has more emission lines and therefore stronger cooling.
Also, she said that our emission line fluxes were off by an order of magnitude, so she's looking into why.
Meeting update
- 2D code results and page updated - https://astrobear.pas.rochester.edu/trac/astrobear/wiki/u/erica/2D_Godunov
- Read up on elliptic equations and numerical methods for solving them
- Am planning on finishing the Poisson solver this week and learning about Hypre
Meeting Update 08/12/2013 -- Baowei
- Tickets
- new: #306 (2.5D MHD diverging wind running out of memory)
- closed: none
- Users
- Keira's visit
- New users asked for the code: Open University, UK (educational use in connection with the undergraduate course S383, The Relativistic Universe)
- Resources
- Grass needs a new 1 TB hard drive
- New Kraken allocation: 86% | 2,954,282 SUs remaining (used by Shule & Baowei), Old Kraken allocation (45% | 516,370 SUs remaining)
- Archiving data?
- Worked on
- Pulsed Jets: tried to find the best production-run setup (resolution vs. processor count). 96X480X96 + 5 AMR levels runs slowly on 1200 cores of Kraken, so I changed the resolution to 192X960X192 + 3 AMR levels and got the first several frames for both the MHD (beta=5) and hydro runs. Movies will be attached soon. The highest refinement level covers the whole jet for the first several frames, which makes the run extremely slow at the beginning, but hopefully it will run much faster later on, once the highest refinement goes to the bow shock only.
meeting update 08/12/2013
- Read chapters 8, 9, 10, and 11. I roughly understand their purposes and methods. There are some typos in the book, and there are some abbreviated derivations I have not figured out. I would like to reread them and work through those derivations later (perhaps after a long time), since they do not hinder my understanding now.
- Updated my page.
- My father is coming to NYC next Tuesday; may I have 3 days off?
Ablative RT tests
I ported conduction to AstroBEAR 3.0; here is the result of the verified 1D conduction front test:
click for animation
The ablative RT problem does work better in 3.0 than before. It can hold up for half of an instability growth time (t = 1/sqrt(A*k*g)), compared to less than 1/20 of a growth time previously, and there is no spiking anymore. Here I use Rui's value of the conductivity, 7e+5 cm^2/s, at the bottom. The heat flux is about 5.876e+21 erg/(cm^2*s).
click for animation
click for animation
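For reference, this growth time is the inverse of the classical single-mode RT growth rate at an interface: sigma_RT = sqrt(A*k*g), with Atwood number A = (rho_2 - rho_1)/(rho_2 + rho_1) and k = 2*pi/lambda, so t = 1/sigma_RT.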
The major problem now seems to be that the flux on the boundary is not treated correctly. An earlier version, AstroBEAR 1.0, did maintain the temperature gradient at the lower boundary, so I'm thinking about looking at how that version treated the boundary flux.
The thermal conduction module is turned on by setting the following parameters in physics.data:
iDiffusion = 2   ! Turns on implicit diffusion
kappa1 = 3.e-13  ! conductivity (cgs)
flb = 17.628e+21 ! boundary flux (cgs)
ndiff = 2.5      ! diffusion index
diffcfl = 0.03   ! diffusion cfl
If you are getting a Hypre solver error, try turning down diffcfl or increasing the number of protection cycles (currently set to 100), which is in elliptic/diffusion/diffusion.f90:
dt_protection = dt_remaining/100.0
This protection can be parameterized in the future official release.
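For context, here is a hypothetical sketch of what such protection subcycling looks like (the routine names and the actual loop in diffusion.f90 will differ):

! Hypothetical sketch of protection subcycling: the remaining diffusion
! time is split into smaller steps so each implicit solve is easier.
dt_protection = dt_remaining/100.0   ! 100 protection cycles
DO WHILE (dt_remaining > 0d0)
   dt = MIN(dt_protection, dt_remaining)
   CALL ImplicitDiffusionStep(dt)    ! hypothetical name for the implicit Hypre solve
   dt_remaining = dt_remaining - dt
END DO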
The diffusion scales are treated in physics/physics_control.f90:
ScaleDiff = lScale*SQRT(pScale*rScale)/(TempScale**ndiff)
kappa1 = kappa1/ScaleDiff
ScaleFlux = TempScale*SQRT(pScale*rScale)
The hydrostatic bottom boundary condition is treated in problem.f90:
IF (Info%xBounds(2,1)==GxBounds(2,1) .AND. Gmthbc(2,1)==1) THEN
  DO i=1,mx
    ! state in the first interior zone
    rho1=Info%q(i,1,1,1); vy1=Info%q(i,1,1,3)/rho1; vx1=Info%q(i,1,1,2)/rho1
    p1=(Info%q(i,1,1,iE)-0.5*(SUM(Info%q(i,1,1,2:3)**2))/Info%q(i,1,1,1))*(gamma-1d0)
    t1=p1/rho1
    ! solve for the ghost-zone temperature t0 that carries the prescribed heat flux
    CALL NonLinearsolver(t0,heatflux,t1,ndiff)
    !t0=3519.10130483276
    vx0=vx1
    ! momentum balance (with gravity gx) plus mass-flux continuity give a
    ! quadratic aa*rho0**2 + bb*rho0 + cc = 0 for the ghost-zone density rho0
    aa=t0+gx; bb=gx*rho1-p1-rho1*vy1**2; cc=rho1**2*vy1**2
    root1=-0.5*bb/aa+0.5*sqrt(bb**2-4d0*aa*cc)/aa
    root2=-0.5*bb/aa-0.5*sqrt(bb**2-4d0*aa*cc)/aa
    rho0=max(root1,root2)
    vy0=rho1*vy1/rho0          ! conserve the mass flux rho*vy across the boundary
    p0=rho0*t0                 ! ideal gas: p = rho*T in computational units
    Info%q(i,0,1,iTemp)=p0/rho0
    Info%q(i,0,1,1)=rho0; Info%q(i,0,1,2)=rho0*vx0; Info%q(i,0,1,3)=rho0*vy0
    Info%q(i,0,1,iE)=p0/(gamma-1d0)+0.5*rho0*(vx0**2+vy0**2)
  END DO
END IF
Getting in touch
Hi, are you all on Skype? My username is erica.kaminski1012
Meeting Update 0805
- TSF new images.
Federrath accretion + rotation (K = 0.1):
animation: click
Krumholz accretion + no rotation, 512 zones per radius:
animation: click
Krumholz accretion + no rotation, 512 zones per radius, Zoom-in:
animation: click
- Meeting with LLE folks this Wednesday, will hand them the code.
Meeting Update-08/05/13 - Erini
- Running a large polytrope run atm (Gmx = 128^3, gxbounds = -2-22222, t_final = .04d0, with timescale ~ 7.6*10^6)
- lineout at 2.5 sound crossing times
No explosion per se, but definitely some odd boundary effects. This is the longest run I have done at this resolution.
- Check out u/Polytropes, which has been updated.
Meeting update- 2D code progress
I have code that now:
1) sets up a 2d mesh with the cylindrical explosion initial conditions
2) prints the initial conditions to a file so they can then be visualized (easily…) with Mathematica:
Contour plot of initial density, the density (and pressure) are higher inside of the circle:
Here it is in a 3D plot:
and the initial velocity (both in x and y) is zero everywhere on grid:
3) determines the CFL condition and dt for the time step, considering the max speed of waves traveling in either x or y
4) sweeps the grid in the x-direction and solves the local Riemann problem along intercell boundaries in x, taking into account the presence of shears
What I am stuck on now is writing these routines only once, so that they can be called for either the x or y sweep with the variables swapped… I could write the routines the brute-force way, individualized for the x and y directions (e.g. Call Sweep in X, Call Sweep in Y), but this is silly, since the equations have the same form in x and y…
I am trying to figure out how to write the routine just once, but as of now my routines involve nested do-loops that cycle through the mesh. Nesting the loops like this seems to define a preferred direction for the Riemann problem… one that isn't easily swapped when I switch from x to y.
So that is where I am: trying to write these routines the proper way, one time, and have them called with the variables swapped for the x and y directions.
I also spent some time trying to figure out how to visualize the 2D data. I played with importing the 2D array data into a spreadsheet, but the data set was too large for the program to handle easily. This is what it looked like in the spreadsheet initially:
The spreadsheet's cells correspond to points in the mesh, and their values are the density at that mesh point. This is why we see the circle in the center of the spreadsheet: the values of these cells have fewer digits than the surrounding ones, so they trace out a circle. If the widths of the rows and columns were the same, we would see an actual circle rather than an oval.
I thought if only I could assign colors to the values in the cells, like VisIt does on the mesh, it would be like a 2D pseudocolor slice. I tried getting the data into a form VisIt liked, but this was time-consuming to figure out, and there is not much information about it on the web.
Then I resorted to Mathematica, which was much easier once the output was formatted correctly (as a list of (x, y, rho) values: (x1, y1, rho1), (x2, y2, rho2), etc.). These are the contour and 3D plots above.
Update
The fix seems to be to make the problem 1D. That is, in the x- or y-sweep, make 1D arrays corresponding to the row (or column) you are sweeping along, and feed each 1D array into a routine that 1) solves the local Riemann problem at the intercell boundaries, and 2) updates the cells along that array using the Godunov conservative formula. By making 1D arrays and isolating them from the rest of the mesh, the array has no preferred orientation relative to the mesh. I will work on implementing this fix in my code.
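Here is a minimal sketch of that idea (hypothetical routine and variable names, not my actual code): a single 1D update routine is reused for both directions, and for the y-sweep the x- and y-momentum components are swapped before and after the call, so the pencil always sees its own direction as "normal":

subroutine sweep(q, nvar, nx, ny, dir, dt, dh)
  implicit none
  integer, intent(in) :: nvar, nx, ny, dir    ! dir=1: x-sweep, dir=2: y-sweep
  real(8), intent(in) :: dt, dh
  real(8), intent(inout) :: q(nvar, nx, ny)   ! conserved vars (rho, rho*vx, rho*vy, E)
  real(8), allocatable :: pencil(:,:)
  integer :: i, j
  if (dir == 1) then
     allocate(pencil(nvar, nx))
     do j = 1, ny
        pencil = q(:, :, j)
        call update1d(pencil, nvar, nx, dt, dh)  ! hypothetical: local Riemann solves
        q(:, :, j) = pencil                      ! + Godunov update along the pencil
     end do
  else
     allocate(pencil(nvar, ny))
     do i = 1, nx
        pencil = q(:, i, :)
        pencil([2,3], :) = pencil([3,2], :)      ! swap normal/transverse momenta
        call update1d(pencil, nvar, ny, dt, dh)
        pencil([2,3], :) = pencil([3,2], :)      ! swap back
        q(:, i, :) = pencil
     end do
  end if
end subroutine sweep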
Meeting Update 08/05/2013 -- Baowei
- Tickets
- new: none
- closed: none
- Users
- Visitor's schedule: http://www.pas.rochester.edu/~bliu/Visitors/ScheduleforKeiraBrooks.pdf
- New users got the code: a student from East China University of Science and Technology ("to study the transport of charge carriers in solar cells") and a student from the University of Costa Rica ("run simulations for my thesis project on helmet streamers on the sun").
- Resources
- Archived Martin's data for magnetic tower
- Worked on
- Ticket #302 (cooling-length refinement; the run on Kraken with 96X480X96 + 5 AMR levels goes slow)