Posts by author madams

Location of wire simulations

As I start graduate school, I have moved all of my simulations off of bluehive2 and bluestreak to /grassdata/. You can now find them at:


where I changed the group setting to orda. Hopefully anyone in the group can access them.

Under ProductionRuns/, I have things split between the two machines. Under bh2/ you can find the level = 1 hydro simulation, along with CDMs. Then under bs/ you'll find the continuation of the level = 2 hydro simulation, along with the MHD simulations and some code Jonathan was using to test solvers.

Column Density Maps (CDM) of Wire Simulations

This is the first completed run. Hydro case, where I switched from MaxLevel = 2 to 1 at frame 62.

Here is a basic model I made, to make the axes more clear:

down x


down z


The integrated mass in the down-x CDM is predominantly between 37 and 1000. However, looking at x_0=1.5 to x=3 in the down-z CDM, it makes sense that the down-x view is very high in mass.

For the CDM down-x, I replicated the BOV three times down each axis. For down-z, I replicated it three times in the Y-direction to be consistent with a 3x3 wire mesh.
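The replication step can be sketched in plain Python (the 2x2 array below is a made-up stand-in for a column density map, tiled 3x3 the way the BOV data was):

```python
def tile_3x3(grid):
    """Tile a 2D array (list of rows) 3 times in each direction,
    emulating replicating a column density map for a 3x3 wire mesh."""
    tiled_rows = [row * 3 for row in grid]  # repeat along the row direction
    return tiled_rows * 3                   # repeat along the column direction

cdm = [[1, 2],
       [3, 4]]
print(tile_3x3(cdm))  # a 6x6 grid of the repeated 2x2 pattern
```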

Update (07/28/2015) -- Marissa


  • Shape is now installed on all local machines! (Grass, Clover, Bamboo). It is installed in /opt/Shape/. In order for it to work, all the machines now have Java installed as well.
  • Shape has a difficult time rendering large ASCII files.

Recall how Martin's data looks
3D Module Render Module

Martin's ASCII file has 4,803 lines. Note that the 3D Module illustrates whether you have properly imported the simulation into Shape, whereas the Render Module shows the actual emission map of the simulation. In the latter module, one has to go through a rendering process. This generally took a few seconds for Martin's data.

Baowei & Bruce's Data

Here is how it looks in VisIt:

In the 3D Module in Shape:
Screen Shot 1

Now rotated to get another perspective:
Screen Shot 2


  • The Right view in Screen Shot 1 looks like what might be the face of a box.
  • However from the other views in the 3D Module, we can see that the box is not filled in. Wouldn't we expect to see something that does not look so "flat"?
  • For instance, Martin's data in the 3D Module sort of looks like the general shape that we would expect it to look like in emission map form.
  • Perhaps the ambient is too strong?
  • Perhaps Shape has a hard time loading in all of the rows?

The fact that we see *something* in the 3D Module, beyond the mesh object that the data is imported onto, is a good sign that I have imported the data correctly into Shape.

So at this point I would go on to the Render Module to get the emission map. However, when I fix my parameters and am ready to visualize the simulation, Shape has a hard time rendering such a large ASCII file. So I cannot verify whether there is a view of the simulation comparable to how it looks in VisIt. If Shape were able to render this simulation, I would rotate it to look for such a view, but as it stands I cannot.

The ASCII file that Baowei gave me is in the format px,py,pz,n,vx,vy,vz,T, i.e. position components, density, velocity components, and temperature. In Martin's simulation I did not use temperature. However, whether I import the file with or without the temperature column, it looks the same, so I do not think there is anything wrong with how I am importing it into Shape.
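For reference, dropping the temperature column from such an 8-column file takes only a few lines of Python (a sketch; the sample row below is made up):

```python
def strip_temperature(lines):
    """Drop the last (temperature) column from rows formatted as
    px,py,pz,n,vx,vy,vz,T -- keeping position, density, and velocity."""
    out = []
    for line in lines:
        cols = line.strip().split(",")
        out.append(",".join(cols[:7]))
    return out

rows = ["0.0,0.0,0.0,1.5,10.0,0.0,0.0,5000.0"]
print(strip_temperature(rows))  # ['0.0,0.0,0.0,1.5,10.0,0.0,0.0']
```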

So now let's take a look at the parameters of the simulation (thanks to Baowei):

nDim     = 3
GmX      = 64,64,64
MaxLevel = 0
LastStaticLevel = -1
GxBounds = -32d0,0d0,-32d0,32d0,64d0,32d0

So we have a 3-dimensional simulation, which is good. MaxLevel = 0 will help. However, the GmX and GxBounds seem kind of big. We'd have 64^3 = 262,144 cells, which is close to the number of lines we have in the file: 254,727. Keeping the right proportions with a smaller GmX, say 16^3 = 4,096 cells, seems more reasonable.
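As a quick sanity check, the cell counts above can be reproduced in a few lines:

```python
# Cell counts implied by GmX (no AMR, since MaxLevel = 0).
def cell_count(nx, ny, nz):
    return nx * ny * nz

full = cell_count(64, 64, 64)
small = cell_count(16, 16, 16)
print(full)           # 262144 -- close to the 254,727 lines in the file
print(small)          # 4096 -- a more manageable size for Shape
print(full // small)  # 64x reduction in point count
```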


  • Visualizing on BH2 for the wire simulations seems to be a roadblock.
    • Jonathan suggested using vnc_start -t 480 --mem 60gb, however my job waits in the queue for a long time. I have never successfully gotten onto a node. Maybe other people are using it?
    • debug queue on BH2 ('interactive -p debug -N 32 -t 60'): Cannot do contour plots of any frames; VisIt flags the object you want to visualize in red text. For that same frame, I can take a slice and visualize it fine. But if I then try to visualize the next frame as a slice, that frame also gets the dreaded red text of death. If instead I submit a new job and visualize that frame from the get-go, it works, and then the same story repeats for the next frame, and so on.
    • VisIt on Clover, Grass, or Bamboo: none of the local machines seems to work either.
    • The largest AMR=2 chombo that I am trying to visualize is ~50 GB; for AMR=1 it is 13 GB. The same visualization woes hold for both levels of AMR…
    • I don't think the simulation is corrupt; if it were, I couldn't visualize any frame after submitting a new visualization job.

Some python packages that are useful for HDF5

An interface for HDF5, using numpy: h5py
For managing a large database: pytables; see their FAQ.

Tasks going forward (07-20-2015) -- Marissa

Wire Simulations

  • Transfer files
    • 2D Simulations on BH2 —> grassdata
    • Level 1 Hydro Run —> BH2
      • Visualize Level 1 Hydro Run
  • Keep submitting jobs for Level 2 Hydro Run
  • Use code made with different solver for level 2/1 MHD Run
    • Blog post comparing the different solvers


  • Visualize Baowei's output from his code
  • Spherical shell problem.f90
    • Testing with Baowei's code
    • Visualize output in Shape
  • Documentation of Shape
    • Making movies
    • HDF5 —> ascii (pipeline of Baowei's code)

Weekly Update (07-13-2015) -- Marissa

Mainly finished this page in my Shape guide : Importing External Data to Shape

Still working on other parts of my guide. Will post things for Jonathan's stuff tomorrow. Got my soft limit bumped to my hard limit as well.

Testing visualization in shape with pseudo-data

Wrote this python script that generates points on a sphere in Cartesian coordinates. It then writes these coordinates, together with random values for velocity (vx, vy, vz) and density (n), to a tab-delimited ascii file. You can choose the radius of the sphere, and N, the number of points on the sphere. Forwarded from Martin: a stackoverflow discussion. Also note I am using python3.

import random
from math import pi,sin,cos

#Creating sphere data
def createSphere(r=5, N=100):
    lst = []
    for phi in [(pi*i)/(N-1) for i in range(N)]:
        M = int(sin(phi)*(N-1))+1
        for theta in [(2*pi*i)/M for i in range(M)]:
            x = r * sin(phi) * cos(theta)
            y = r * sin(phi) * sin(theta)
            z = r * cos(phi)
            lst.append((x, y, z))
    return lst

#Opens/creates new ascii file
outfile = open("test.dat", "w")

#Writes the data to the file
for x,y,z in createSphere():
    rho = random.random()*1000000
    vx = random.random()*10
    vy = random.random()*100
    vz = random.random()
    print("{0}\t{1}\t{2}\t{3}\t{4}\t{5}\t{6}".format(x,y,z,vx,vy,vz,rho), file=outfile)

#Closes the file
outfile.close()

So the pseudo-sphere data seems to visualize just fine in Shape:

3D Module
3d module in shape
Render Module
render module in shape

Aside: I am starting to document how to use Shape, here.

Visualization w. Shape -- Martin's Data

Column Density Map (m2) PV Diagram (km/s vs. m)

The simulation I am trying to visualize in Shape is probably Martin's old poloidal collimation project. Above is a visualization of frame 22 (chombo files cannot be found, so I cannot compare in VisIt), so the beginning of the simulation. Toward the end of the simulation, it should supposedly look like this:
Shape (probably not frame 212)


  • Still think the PV Diagram could use improvement. Not sure if I used the appropriate parameters for the velocity axis. For the above visualization, the axis goes from -500 to 500 km/s; with this arrangement I got something to show in the PV-diagram part of the Render Module, but I think I could refine it further.
  • Document how to do this on the wiki!
  • Regrid Bruce's simulations so they are in ascii and can thus be fed into Shape.
  • Visualize Bruce's simulations.
  • Show Bruce how to use Shape.
  • Pretty up the output?

Weekly Update (05-26-2015) -- Marissa


I can visualize the data and have it imported into the 3D Module, and I can rotate the simulation around. However, I have yet to get it viewable in the Render Module. Martin and I have used the same set of data, yet you'll notice his looks a bit different from mine. I think it depends on how many lines you import; there also seem to be 'two' ways to import it. I want to meet with Martin once more and neaten this up.

Marissa's Shape

Martin's Shape

  • Would anyone be interested in me giving a tutorial on how to use SHAPE?
  • Once I figure this out I'll write a tutorial here on the wiki.
  • Need to get a hold of Bruce's data and regrid it to ascii.
  • Any chance SHAPE can take HDF5 files yet?

Wire Simulations

Made an executable that makes CDMs. Here is a preliminary taste from the first submission of the jobs. It goes to frame 22.

CDM down-x CDM down-y CDM down-z
  • Visualizing the actual hdf5 files on Clover and BH2 using the contour operator kills VisIt. Not sure what to do. Clover's VisIt didn't even like visualizing the bov files, so I did these on Bamboo, and even Bamboo had to wait between each image it spit out.
  • Updated the Run Statistics page. Currently have 114,520 sus left.
  • hydro is at frame 54, and mhd is at frame 36.
  • Can no longer write to BH2. Should I delete the 2D simulations from before?

Weekly Update (05-11-2015) -- Marissa

Wire Simulations

  • Thanks to Baowei we got the code to make on Stampede. Submitting jobs there now for the run directories Jonathan forwarded to me.
  • Made a stampede statistics page so we can pay attention to our resources.
  • Writing up a study on the CPU usage needed for these; you can find it there soon. Going to also do a write-up on what is different about the production runs on the studies page.


  • Met with Martin on Friday. Showed him what I've been doing. I forwarded him the data I've been using. He downloaded Shape and is going to play with it himself. We plan to meet again tomorrow.
  • Martin says that in the past the software usually gets rid of the mesh when your data has been properly imported into Shape. This means there might be something weird with how I am importing it, or how I am visualizing it in the 3D Module.
  • Scroll to the bottom of the .dat file to see how many lines there are. One can set this as the lines per entry.
  • Find the maximum and minimum position values for the whole data set, take the average of these, and use it for the "center."
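Martin's centering suggestion can be sketched in Python (assuming, as an illustration, a whitespace-delimited file with x, y, z in the first three columns):

```python
def find_center(lines):
    """Average the min and max of each position column to get a 'center',
    as Martin suggested for Shape's input-properties dialog."""
    xs, ys, zs = [], [], []
    for line in lines:
        cols = line.split()
        xs.append(float(cols[0]))
        ys.append(float(cols[1]))
        zs.append(float(cols[2]))
    return tuple((max(c) + min(c)) / 2.0 for c in (xs, ys, zs))

data = ["-4.0 0.0 1.0", "2.0 6.0 3.0", "0.0 -2.0 5.0"]
print(find_center(data))  # (-1.0, 2.0, 3.0)
```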

SHAPE Questions

In order to understand the context of the discussion I am going to lay out here, please view the following video made by the makers of SHAPE v. 5 Shape Version 5 - External Data Visualization. Back in March I met with Marcus at RIT, and we worked on this together. What I will be detailing here are comparable results to what we saw. However we were really too excited that we saw something, and thus we weren't super skeptical of the results.

Attached to this blog post I have included two .dat files, or ascii files, from Martin's directory — in case anyone wants to download Shape, follow the video, and try to figure it out. They are titled SHAPE00000.dat and SHAPE00022.dat.

Video Summary

1. Open Shape. Go to the 3D Module. Insert a "mesh" by clicking either sphere, torus, etc. The point of this is to "hold" the external data on some shape.

2. Rename your Shape, and choose what file format you want. We are using ascii.

3. Input properties dialog.

Questions & Concerns:

  1. What is dNum? The narrator in the video says that it takes a "slice" of the data. Does this mean that you only consider a certain number of rows in your data? Why 256, why 1? Especially when you are only inputting a single frame of your data. There is also this parameter n in the input properties dialog. Wouldn't these be the same thing, based on what the narrator has said?
  2. The narrator says that n is the "lines per entry" in the data. However wouldn't we just have 1 row, based on how people typically use file I/O? When you expand the input parameters dialog box, the data you can see will adjust with the box.
  3. For any data set that I obtain I will need to know the format string, or the label for all of the columns in the data. I didn't ask Martin what his labels are; however, Marcus and I made an educated guess about what they could be.
  4. The narrator says that one needs to put a dummy variable at the end of their string. Why do we need a dummy variable if we know all of the columns?
  5. The Center… how do you know how your data is centered? Then there are the extents. Marcus and I sort of played around with this. You can also adjust the radius of the mesh object from Step 1 above to see what makes sense for how your data looks.

4. Delete the modifiers already there in the 3D Module (Density, Temperature).

5. Add the emissions species by accessing the Physics Module.

Questions & Concerns:

  1. In doing this I have always followed the instructions given by the narrator in the tutorial. I don't think that this is part of the problem when trying to visualize external data in Shape. I say this because we are actually able to "see" something, which I'll illustrate at the end of the blog post.
  2. When do you know what species to use? One will probably need to know if their simulation requires a particular emission.

6. Apply the species modifier.

7. Now go to the Render module.

This is what the data looks like originally if you use comparable parameters from the video.

However if you change the width of the Image, it starts to look like a sphere…

And if you change the center (in the input parameters dialog) you simply just move around the mesh.

Update (05-04-2015)

A quick update. Expect more later.

  • Colliding Flows: Edit 3 plots for their final version. Redid the inverse beta maps and b vs. n plots, however I will have to quickly revisit the B vs. n plots.
  • Turbulence Simulations: Jonathan sent me a new problem module that utilizes a "fractal" mesh. I am going to begin to set up production runs and make the code.

-Jonathan tells me to keep an eye on cpu time for the MHD run in particular in case we need to scale down resolution. I think I am going to make a page, comparable to the one Erica made for the Colliding Flows, that tracks the statistics for each restart.

-I am assuming I will have to start running these on and XSEDE machine now. Going to test Jonathan's new problem module on BH2 first. Then work with Baowei to get code made on Stampede.

-I plan to update the page I have made of the 2D runs after I finish setting this up.

  • Shape: Adam e-mailed Martin. Before I meet with Martin I am going to write a blogpost detailing my questions. Bruce also let me know he has generated some models of stellar outflows on alfalfa. Once I figure out things with Martin I'll have a go at that. Will have to regrid frames we prefer so they are accessible to Shape.

Increase in MinDensity seems to be a fix & status of wire turbulence jobs

Here is the page. Increasing the minimum density helps with the issue I brought up in the previous post. See for yourself:

Here is a side-by-side of the same parameters (beta = 10, Ma = 3, Thickness = 0.01) but with different minimum densities at frame 40. On the left you have a smaller minimum density of 1d-10, while on the right we have 1d-2.

Here are two side-by-side gifs. Sorry if it slows down the webpage.

MinDensity = 1d-10 MinDensity = 1d-2

Now after doing the restarts for mindensity = 1d-3, the first one to complete with that value was the beta=10, ma=20, thickness=0.06 case.

Here is a GIF of the new frames with that mindensity. With the old mindensity of 1d-2, there would be this wave that would propagate out immediately from the restart.

Job Update

Hydro/MHD   Ma   Thickness   Frame (Restart)
Hydro       20   0.01        200, completed
"           "    0.03        200, completed
"           "    0.06        200, completed
Beta=0.1    20   0.01        200, completed
"           "    0.03        45*
Beta=10     20   0.01        200, completed
"           "    0.03        87*
"           "    0.06        121*
"           3    0.01        76
"           "    0.03        Restart from 0
"           "    0.06        Restart from 0
  • Beta=0.1, Ma = 20, Thickness = 0.03 is queued up.
  • Beta=10, Ma = 20, Thickness = 0.03 is queued up.
  • Beta=10, Ma = 20, Thickness = 0.06 is queued up.
  • Beta=10, Ma = 3, Thickness = 0.01 is running currently (7 hrs at this post). Expected to end in 11.9 hrs at frame 76.1 !
  • Beta=10, Ma = 3, Thickness = 0.03 is queued up.
  • Beta=10, Ma = 3, Thickness = 0.06 is queued up.
  • All Ma = 3 runs were done with 1d-2 to go faster. As you can see from the previous blog post, there were some errors, so I decided to restart them with 1d-10 (the original mindensity).
  • * indicates runs changed to a mindensity of 1d-3, which had been 1d-2 a few weeks prior. However the visualization, as we saw, looks bad, so I need to restart them from the last good frame.
  • 5 jobs are completed.

Job Update (Week of 05-04-2015)

Hydro/MHD   Ma   Thickness   Frame (Restart)
Hydro       20   0.01        200, completed
"           "    0.03        200, completed
"           "    0.06        200, completed
Beta=0.1    20   0.01        200, completed
"           "    0.03        50
Beta=10     20   0.01        200, completed
"           "    0.03        100
"           "    0.06        135
"           3    0.01        200, completed
"           "    0.03        49
"           "    0.06        71

Update on Wire Turbulence Study (04-22-2015)

I have been updating the Wire Turbulence Studies page I made daily.

These runs have been taking a while; the dt in the time step is very small. Jonathan asked me to try increasing the mindensity in the file.

MinDensity       = .01 !1d-10                ! Minimum computational density before protection is triggered [1d-10]

I noticed there is something weird with the runs for which I did that, either from the beginning of the simulation, or at the point where I introduced the new mindensity during the restart.

Run B10-T.03-M20 B10-T.03-M3 B10-T.01-M3 B.1-T.03-M20

It looks like a boundary-condition problem; however, I have not changed the Gmthbc for any of these runs:

Gmthbc   = 1,2,2,1,2,2

Page for wire turbulence study

Find it here: WireTurbulenceStudies

Meeting Update 03-31

Colliding Flows

  • High Resolution (~800 ppi for color, ~300 ppi for b&w) Figures made: CDMs (0, 15, 30, 60), InvBeta (0, 15, 30, 60), B vs. n (0, 15, 30, 60), Spectra (0, 15, 30, 60), Mass History (0), Model of Flows
  • HR Figures to be made: Mass History (15, 30, 60), Histograms for sinks (We are thinking about these three figures I made: 1, 2, 3)
  • Querying for the mass history is a pain in the ass in VisIt and could take over a week. I am just going to make a post-processing executable and run quick jobs on the simulations to pull the rho information for the box and the collision region out of the simulation.
  • In the past few weeks I have learned a significant amount about what goes into preparing figures for publication. Not a trivial task. VisIt exports pseudo-vectorized images: exporting as an EPS is an option in VisIt, but when you zoom in on the printed figure there is no scalability. So I saved visualizations done in VisIt as high-resolution PNGs and converted them to PDF (ImageMagick rasterizes, but that won't matter if we save as a PNG anyway). Some other software can export actual vectorized images; those you don't want to touch with ImageMagick (use ghostscript instead). So we have a bunch of really heavy figures. I wrote a script on how to compress them, see here. Erica wants me to document all of the stuff I learned.


Meeting Marcus Freeman, an RIT graduate student, during the week of the 6th to work on Shape. Will post about the outcome of that meeting afterward. Before then I should probably work on regridding the simulation so it has the necessary variables we would like.

Turbulence Simulations

Met with Jonathan last week. He handed over the problem module and .data files. Suggested I make a 2D simulation first, so I did this over the weekend. Looks as expected: clump and wind. Hot desaturated really makes the shock stand out. There is a weak magnetic field (beta = 100) and MHD is turned on.

Space is minimal on Bamboo and Grass, so I ran this simulation using 4 processors on Clover (which is only about 56% utilized). It took all weekend and has 6 more frames to go. I am going to start moving over to a more powerful machine, like BS or BH2. The configure file is having issues on BS (see Baowei's e-mail).

Here are some of the parameters we'll be considering for the study:

spacing=.2           ! Lattice constant
thickness=.03        ! Radius of wire
beta=100             ! magnetic beta
mach=20              ! mach number
screen_x=.25         ! location of screen in x
rho_wire=1000d0      ! peak density of screen
rho_wind=1d0         ! density of wind
rho_amb=.01          ! ambient density

We could vary the beta and mach, or the spacing and thickness of the wires. I suppose we'll see once I get some jobs set up and the code made on BH2.
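Laying out such a sweep ahead of time is easy to script; here is a minimal sketch (the parameter values and run-label naming below are illustrative, not the actual study values):

```python
from itertools import product

# Illustrative values for a (beta, mach) sweep; the real study values
# would come from the run setup above.
betas = [0.1, 10, 100]
machs = [3, 20]

runs = ["B{0}-M{1}".format(b, m) for b, m in product(betas, machs)]
print(runs)
# ['B0.1-M3', 'B0.1-M20', 'B10-M3', 'B10-M20', 'B100-M3', 'B100-M20']
```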

Making an animation in Shape by example

FYI: I made my own Library (copying Erica). I wrote an outline about the 2007 Shape paper, here. In case anyone is interested about the point of Shape.

Since our group is trying to utilize the Shape software for our simulations, I figured I would start to document helpful components of visualizing and modeling in Shape. Here I will briefly explain how one can make a movie, or animation, in Shape. Toward the end of some days, I am going to go through the templates on their website. With Shape, however, I have two main objectives:

  1. Learn how to convert our hdf5 simulation data to ASCII, so that it can be fed into Shape.
  2. Learn how to visualize this external data in Shape. What limits are there, and what can Shape tell us about our data compared to other visualization software?

So if the template I check out is useful, I'll make a blog post explaining the objective of the template. Eventually I'll use these to make a series of wikipages documenting Shape for our purposes. However I have yet to become a "Shape Master"; otherwise I would have done 1 & 2 already. This template is quite simple: how do you make a movie in Shape?


(Make sure you check out the .gifs! They are really pretty.)

Image 1

Image 2
Red/Blue GIF

Image 3

Image 4

This data is from one of the Shape templates, titled Animation 1 - Rotation. It is a bipolar nebula rotating around all three x-y-z Cartesian axes. Image 1 is a 2D projection, comparable to our column density map movies. However the object is not evolving in time. Image 2 is the line profile for the velocity vs. pixel intensity as the object rotates. Image 3 is the mesh used for the object. Image 4 is its PV-Diagram.


Image 5

This screen-shot was taken post-rendering. Click the big button with the Shape-S in the top, left-hand corner to view the data as seen here. Prior to rendering, the PV-diagrams will be black. Note that all of the data parameters are already in the template. You can adjust the colors if you want too. For instance I have also made a movie color coded to indicate Doppler-shifting, i.e. Red/Blue (see above). Other options include grey scale, rainbow (color), red/blue, gradient, and spectrum. Click the movie-film looking button in the same row as the renderer to get to the animation module.

Image 6

Image 6 is what the animation module looks like. Instead of clicking the rendering button, you'll click the animation button (which is circled and marked as the second step). First you want to adjust the format of your output. Note the timeline at the bottom of the GUI. As Shape makes your animation, it will tell you how far along it is in the process. On the right-hand side of the Animation module is the variable tree. Below the variable tree, in the table, is what is referred to as the variable stack. Essentially the functionality of this part of the animation module in v. 5 seems not much different from v. 4.

Here is a table of screen shots of all the parameters listed in the animation module.

Image 7. General

Image 8. Variable

Image 9. Output

Image 7 (in order from top to bottom):

  • Name of your file output
  • Number of frames (where you want them to start and end in the simulation)
  • Current frame tells you which frame Shape is currently writing out as .png output, relative to the simulation (there is also a timeline at the bottom of the GUI that runs from the first frame to the last).
  • Start and end times of your simulation for some given time units (years, days, hours, minutes, seconds).
  • Like current frame, it tells you the current time at which your simulation is being animated, in your chosen units.
  • Distribute & Fields: Using particles for rendering or output velocity vector information may require Shape to redistribute the particles in every frame. This is done by default. If you are not using particles, then disabling the distribution can save processing time.
  • Render: Render while you animate. This is essentially required.

Image 8. (Currently investigating)

Image 9. Above in the results section of this post, you can see all of the options visualized that are listed here: 2D Maps, PV Diagrams, 3D Mesh, Hydro, Plots (Images) (Note: the numbers to the right indicate LxW size of the image, so these plots are 512x512), Plots (Ascii), Math Variables, and Time Units. You can denote the image type, and indicate where you want the images to be saved (a working directory).

Completed mass query for theta = 0 case

Figure. A mass versus time plot for the four sinks that form in the case of CollidingFlows, along with M_Simulation (the total mass in the simulation box), and M_Collision (the total mass in the collision region within the box).

  • Time interval is for ~13 Myr.
  • The mass axis is logarithmic in order to better illustrate the trend. A linear axis makes the sink masses look close to zero in comparison to the mass in the box and the cylindrical collision region.

Followed a similar procedure as to what we did for the E vs. t line plots, however did a query for rho instead.

I did the query using both VNC and the interactive queue on both Erica's and my own account on BH2. For the first 200 frames of the M_Simulation query, Clover collected most of the data based on the chombos stored locally. Surprisingly, Clover was faster than both VNC and the interactive queue. Ultimately both of these remote visualization settings are really unreliable and require a lot of babysitting. It took me about 3 days to collect all the data for the box and the cylindrical collision region in VisIt. Here are some issues & interesting things I encountered:

  • Want to use a GPU? Need to use VNC which has a GUI. GUIs are super flaky and prone to time out errors it seems. Here is the command to run visit with a GPU though:
    module load virtualgl visit/2.7.3 && vglrun visit
  • Attached is the script I used for VNC. The CIRC website also has some material on remote visualization (using VNC). It might be faster and nicer to use on data that isn't too demanding on memory.
  • Wanted to query using interactive -p standard for, say, -t 300. Whenever I tried this, VisIt would time out after collecting data for a few frames; it seems like there were memory issues. So I just stuck to an hour in the debug queue and monitored the memory/CPU percentage on the node I was on. Here is a website explaining how to do that. This implies I can only query approximately 10 frames per interactive job.
  • Apparently querying data that utilizes the cylindrical clip operator requires more memory than just querying for the total mass in the box. VisIt is probably doing extra work. Just an FYI.

Jonathan suggested making a post processing element in astrobear that'll just spit out all the data into a curve file during a batch job. Think if we want these for the three other runs I will just do that…

Moral of the story: using query in VisIt on large data sets is finicky, so be careful!
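Since only about 10 frames fit in one interactive job, the frame list has to be split into batches; a small helper (a sketch for bookkeeping only, not tied to any actual submission script) makes the split explicit:

```python
def frame_batches(n_frames, per_job=10):
    """Split frame indices 0..n_frames-1 into chunks small enough
    for one interactive job (~10 frames each, per the memory limits above)."""
    return [list(range(i, min(i + per_job, n_frames)))
            for i in range(0, n_frames, per_job)]

batches = frame_batches(25, per_job=10)
print(len(batches))  # 3 jobs
print(batches[-1])   # [20, 21, 22, 23, 24]
```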

VNC Script


  • Sign into your machine/local machine.
  • emacs -nw
  • Paste script in.
  • Make sure it is executable and run it.
  • It should prompt you to sign into BH2, and for how long you want your session to be, along with the size of the GUI, etc.
  • Hit enter when it says a password is found. (FYI) You'll also have to make an extra password for your account when it establishes the tunnel.
#!/bin/bash -i

# NOTE: $via (the ssh gateway host) and $vncviewer (your local VNC client)
# must be set before running this script.

read -p "Please enter your netid: " user
read -p "Do you need to start a VNC server? [y]:" vnc_start
read -p "Set a timeout for your VNC server [60]:" vnc_timeout
read -p "Choose a resolution [1280x1024]:" vnc_resolution

if [[ -z "$vnc_timeout" ]]; then
  vnc_timeout=60
fi

if [[ -z "$vnc_resolution" ]]; then
  vnc_resolution=1280x1024
fi

if [[ $vnc_start =~ ^[Yy]$ ]] || [[ -z "$vnc_start" ]]; then
  echo "Now connecting to bluehive and starting the "
  echo "VNC server."
  ssh $user@$via "vnc_start -t $vnc_timeout -g $vnc_resolution" # | grep "vncserver running on " |  cut -d " " -f 4
fi

read -p "Please enter server (ie bhx0101:1) " server
host=`echo $server | awk -F ':' '{print $1}'`
display=`echo $server | awk -F ':' '{print $2}'`
port=$(expr 5900 + $display)

echo "Establishing ssh tunnel"
TMPSOCK=`mktemp -u XXXXXX.control`
ssh -fN -o ExitOnForwardFailure=yes -M -S /tmp/$TMPSOCK -L $port:$host:$port $user@$via
echo "Launching VNC viewer"
$vncviewer localhost:$port
echo "Disconnecting ssh tunnel"
ssh -S /tmp/$TMPSOCK -O exit $user@$via

Core mass script documentation

Toward the end of last week I worked on writing a script that would open, read and write the mass data in the sinks_*.okc files to a new .csv file (In particular for the CollidingFlows problem, see our data here: CollidingFlowsFigures). The purpose of this was to gain all of the sink data over time, so that we could visualize it, as you can see by the results I have posted below. These charts will allow us to observe how the sinks accumulate mass over the time of the simulation. Here I will document the development of my code, and discuss what else I would like to do with it.


Editing code:

  • Write information of the density/mass of the cores to a .csv file
  • Convert the information from the .csv into solar masses and Myr.
  • Take converted .csv file and make graphs using matplotlib, or excel. Excel is quick, but a code that can do all of this and generate plots in one swoop would be super efficient for later uses.
  • Document and post concise and general form under the VisIt page as a script tab.

So far I have completed the first bullet. (02-09-2015)
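The second bullet, converting the raw values to solar masses and Myr, comes down to a pair of scale factors; here is a sketch (the factors shown are placeholders, the real ones come from the simulation's unit scaling):

```python
# Placeholder scale factors -- in practice these come from the
# simulation's unit scaling (computational units -> Msun and Myr).
MASS_SCALE = 2.0   # solar masses per computational mass unit (hypothetical)
TIME_SCALE = 0.5   # Myr per computational time unit (hypothetical)

def to_physical(mass_code, time_code):
    """Convert computational-unit (mass, time) pairs to (Msun, Myr)."""
    return mass_code * MASS_SCALE, time_code * TIME_SCALE

m_sun, t_myr = to_physical(3.0, 10.0)
print(m_sun, t_myr)  # 6.0 5.0
```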

Editing charts:

  • Convert mass to solar masses.
  • Convert time to Myr.
  • Crop the x-axis to when the first sink forms. Get rid of the 0 values.
  • Fix the x- and y-axes to be the same for all the runs. v. 1

Screen capture of code Screen capture of .okc file
okc to csv v. 1 screencap screen cap of okc

The code reads the first line of the .okc file and splits the numbers into a list of ints. It then uses those numbers to access the data below, which starts at the 34th line. The only hard-coded parts are the headers for the columns of the csv file (L17) and the number of lines of data to read into the .csv (L27). Essentially you change the latter to match the number of sinks that have formed by the end of the simulation. You can check this by counting the number of lines of data spit out by the final frame of the simulation.
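A minimal sketch of the conversion described above, assuming the .okc layout from the screenshots (column counts on the first line, data block starting at a fixed line); the function and column names here are illustrative, not the actual script:

```python
import csv

def okc_to_csv(okc_path, csv_path, n_sinks, data_start=34):
    """Convert a sinks_*.okc file to CSV, following the recipe above.

    n_sinks: number of data lines to read (count the lines written by
    the final frame of the simulation, as described in the post).
    data_start: 1-indexed line where the data block begins (34 here).
    """
    with open(okc_path) as f:
        lines = f.readlines()
    # First line holds the file layout as whitespace-separated ints.
    header_ints = [int(x) for x in lines[0].split()]
    # Grab n_sinks data lines starting at data_start (1-indexed).
    rows = [ln.split() for ln in lines[data_start - 1:data_start - 1 + n_sinks]]
    with open(csv_path, "w", newline="") as out:
        writer = csv.writer(out)
        writer.writerow(["x", "y", "z", "mass"])  # hard-coded headers (hypothetical names)
        writer.writerows(rows)
    return header_ints, rows
```

In this sketch the headers and the `n_sinks` count are the only things you would edit per run, mirroring the two hard-coded lines (L17, L27) mentioned above.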

Results 02-09-2015

Shear 0 chart of unconverted sink masses (s0)
Shear 15 chart of unconverted sink masses (s15)
Shear 30 chart of unconverted sink masses (s30)
Shear 60 chart of unconverted sink masses (s60)

CollidingFlowsFigures page is updated

New beta, CDM and spectra plots for the shear 0 case were done:


Also did some slight reformatting and added some links.

E vs. t line plots for the colliding flows runs

Check them out: E vs. t line plots

Notes from last week's meeting on colliding flows

Download them here. Updated the page to include them as well see here.

Updated CollidingFlowsFigures page





-Currently working on E/t plots to post today

-Figures for the Shear15 case

Colliding Flows Paper Figures

Here is the first round of figures. Essentially this is the reason for lack of blogposts the past few weeks:


Please check them out! Keep in mind this is the first round, so some of the text on the figures might be weird (dates and such).

Solution to streamlines issues.

Turns out we are calling the wrong position in the array. The expressions should be:

By_downx = array_decompose(projections,0)

Bz_downx = array_decompose(projections,1)

Bz_downy = array_decompose(projections,0)

Bx_downy = array_decompose(projections,1)

Byz_downx = {<By_downx>, <Bz_downx>}

Bzx_downy = {<Bz_downy>, <Bx_downy>}

This yields the following:

down the barrel

Which makes much more sense.

Potential problems with projected streamlines?

So I am attempting to plot the streamlines for our colliding flows problem. Here is an example of the shear-0 case at 10.1 Myr (or frame 101) (also I did these under Erica's account on BH2 - hence the username haha).

The first image is down the barrel of the two flows (otherwise projecting the mass down the x-axis) thus the vertical axis is z and the horizontal is y.

The second image is a projection down the y-axis. Thus the vertical axis is x and the horizontal is z. This makes sense given that we've defined GxBounds = 0d0,0d0,0d0,62.5d0,75d0,75d0. The two flows are colliding along x, so in the second image, they are coming in from top and bottom.

In both images I've plotted the column density maps for min = 60 and max = 1000. I did similarly for the min/max of the streamlines which are plotted on top of the column density maps. I also checked that they are scaled by magnitude. Now after talking with Erica we are not sure if these streamlines make any physical sense if we have defined a magnetic field along the flow axis (i.e. x). Ignore the visit axis labels as they are generic and don't define the dimensions of our problem.

down the barrel

down y

In our problem.f90 we have defined the projections for streamlines like so:

    !For 'projected' streamlines plot of the data down x:                                                                                                           
    CALL CreateProjection(projection)

    !For 'projected' streamlines plot of the data down y:                                                                                                           
    CALL CreateProjection(projection)

So in Visit I defined a few expressions to be able to plot the streamlines. For down the x-axis (which correspond to the mass1 CDMs):

By_downx = array_decompose(projections, 1)

Bz_downx = array_decompose(projections, 2)

which you can use to create the vector expression Byz_downx = {<By_downx>, <Bz_downx>} and plot the streamlines like I have above. The first component should correspond to the horizontal axis if that axis is truly y, and the second component to the vertical axis if that is truly z. So I think I have these lined up correctly? For projections down the y-axis (corresponding to mass2 CDMs):

Bz_downy = array_decompose(projections, 1)

Bx_downy = array_decompose(projections, 2)

you can create Bzx_downy = {<Bz_downy>, <Bx_downy>}. Clearly from the size of the box we know the horizontal component is z, and thus the first parameter in our vector should be Bz. Similarly for the second being x. However the streamlines don't seem right? Not sure what is going on.

Beta 10 Shear Magnetic Field vs. Density plots

Shear Angle Final Frame/2 (137)* Final Frame (273)*
15 *

Table. Magnetic field vs. density plots for all shear cases of the "weak-field" simulations (beta 10). The sloped straight line is the 10 K temperature, the top horizontal line is the ram pressure, and the lower horizontal line the magnetic pressure. The curved function is the density versus pressure line. Here we have also included a "grey" area, which is the mass-weighted density versus pressure from post-processing.

* The Beta 10 Shear 15 case does not have a final frame of 273 yet. We are still waiting for the untarring of the rest of the chombos from Stampede. Thus for now I've included 217 and 109 for the middle and end visuals. Will update the table once we get up to 273.

Work tasks for rest of November

November 17-21:

  • Write a script to automate jobs.
    • Update (11/18): Completed.
  • Run astrobear in post-processing mode on all of the Beta 10 runs (Shear 0, 15, 30, 60), as well as the convergence tests for Beta 10 Shear 15 (levels 2, 4, 6) for the B vs. n plots. Done twice for the mid-way and end of the simulation. Exception: Beta 10, Shear 60, where we have frames out to 328, and the convergence tests which are at frame 50. This yields 12 simulations in total
    • Update (11/18): Finished shear 0, and 30, and 2/3 of Shear 60. Need to get other chombos for Shear 15.
  • Make B vs. n plots for the different runs (convergence tests not necessary), publish on wiki.
  • Make B vs. n figures (for the paper).
  • Run code for beta maps with projected streamlines, 3 times (10.1, 20.1, 27.2 Myr) for each production run, the final frame of Beta 10 Shear 60 (i.e. frame 328), and all of the convergence tests for Shear 15 at frame 15.
    • Update (11/18): Reservation for tomorrow.
    • Update (11/19): Submitted jobs, waiting in queue for noon to roll around.
  • Make plots of the beta maps and the projected streamlines, publish on wiki.
  • Make figures for the beta maps.

Waiting on Erica before doing the following:

  • Run code for energy spectra on all the production runs.
  • Make plots of spectra, publish on wiki.
  • Make figures for paper out of best spectra.

November 24-28:

  • Run code for density and velocity spectra on the different runs.
  • Make plots of spectra, publish on wiki.
  • Make figures for paper out of best spectra.
  • Run filament analysis on the 3 or 4 best filaments of each of the runs at end time

Fly Through Movies to Make:

  • Movie of camera circling around the collision region for one of the colliding flows cases.
  • Make a movie that travels down the barrel of the flow, reaches collision region and circles around.
  • Make a movie that travels down the barrel of the flow, reaches the collision region, circles around and then zooms into a smaller region of interest to observe its evolution
  • Make a movie that travels down the barrel of the flow, reaches the collision region, circles around, zooms into a smaller region of interest and circles around to watch its evolution, then pans out to circle around the collision region. Do this for each run.

Work Tasks (11-10 -- 11-14)

  • Write script to automate analysis of the high resolution colliding flows runs, which will be useful for spectral analysis (see: Erica's blog post)
  • Run astrobear in post-processing mode on all of the runs (B10 S60, 30, 15, 0) for the B vs. n plots.
  • Make B vs. n figures for the paper of the different runs.
  • Make column density figures for the paper of each of the different runs stated above including B1S60 and convergence tests.
  • Make a movie that travels down the barrel of the flow, reaches the collision region and circles around (i.e. implement the fly through cameras for our colliding flows data)

New rotating clump without wind fly throughs

Took the wind out of the problem module and added "res" and "temperature," which can be edited for each simulation. Temperature is added to the clump and the res is added to the "Projection Movie." Both simulations have res=300 even though they have 1 level of AMR. You can see the effect by adding the mesh in visit (the image with the mesh is from the second visualization):

movie file for this run:

&CameraData

Here we have 5 cameras with a starting position of pos=-10,4,-10 that is viewing the clump in the box with an up vector in the y-direction until frame 25 where it switches perspective and moves along the x direction toward positive 10. So the simulation looks like it is moving around while we move in the x-direction over time.

movie for this run:


Here I am using 4 cameras. In this simulation I attempted to rotate the clump while zooming toward and away from it. The visualization is also continuous as the first and last camera have the same parameters. I think this is quite nice and I prefer this visualization to the first one.

Focus and up vectors

Today I ran seven simulations that play with the CameraData in the file. I focused on the up vector as well as the focus of the camera. For two cameras I played with a focus of {1, 0, 0} and {3, 0, 0}, as the previously done simulations have a focus of {2, 0, 0} for all cameras. The simulations I ran with three cameras have the second camera switch up vectors, as well as have the cameras step through the three focuses we know already (1, 2 and 3) in ascending or descending order.

Two Cameras

Run description Image Final Frame No. of Frames CameraData File Link to movie
2 cameras, both up vectors in y, with focus of 1 in x 50 camera movie
2 cameras, both up vectors in y, with focus of 3 in x 50 camera movie

Three Cameras

Run description Image Final Frame No. of Frames CameraData File Link to movie
3 cameras; first and last camera have up vectors in y; second camera has an up vector in x 100 camera movie
3 cameras; all cameras have up vectors in y 100 camera movie
3 cameras; first and last camera have up vectors in y; second camera has an up vector in z 100 camera movie
3 cameras; all up vectors in y with ascending focus from 1-3 in x 100 camera movie
3 cameras; all up vectors in y with descending focus from 3-1 in x 100 camera movie

A brief comparison considering frame and final times with camera sims

Below I have a table of visualized simulations I ran of a clump and a wind from last week's blog posts:


For each of these little tests I changed the final time and frame number in the file. There is also a time for the camera in the file, so I am interested in whether or not that time has to equal t-final, and how the simulation is affected if it is changed. No matter the number of frames, the simulation will play out to the same point; essentially the number of frames per t-final acts as a frame rate, or temporal resolution. You can see that the 0.1 final-time simulations have a wind that hasn't even hit the clump yet by the end of the run. For the 0.25 simulations, the wind hits it by the end, and in the 0.5 simulations the wind envelopes the clump. The number of files, bovs or chombos has no bearing on what is "physically happening."
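The frame-rate interpretation can be checked with a quick sketch: for a fixed final time, the frame count only changes the output cadence, not the end state. This is an illustrative toy, not AstroBEAR's actual output scheduler:

```python
def frame_times(t_final, n_frames):
    """Output times for each frame: evenly spaced from 0 to t_final."""
    return [t_final * i / n_frames for i in range(n_frames + 1)]

# Same final time, different frame counts: identical end state,
# just finer temporal resolution in the second list.
coarse = frame_times(0.25, 10)
fine = frame_times(0.25, 50)
# coarse[-1] == fine[-1] == 0.25, so both runs end at the same point
```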

The "camera movies" are made in VisIt like how we would make a visualized bov movie. The camera movies are projections; however, they are not necessarily "down the barrel," like what we are used to.

In the chombo simulations I took a slice down the y-axis in VisIt. We know that for the camera movies the camera moves along the x-axis, which it does in Visit as well. It is a projection of a 3D simulation, so we do not need a "z" axis.

In each simulation there are two cameras. One is positioned at pos=-10,2,-10 and the other at pos=8,2,-10. So it is changing in the x-direction, but keeps focused on the same object as it moves away from it. Both cameras have a focus of 2 in the x-direction.

So the camera is positioned outside of the box and integrates the density through the domain - in essence it is a column density map.
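What the camera computes can be mimicked with a toy sketch: summing density along the line of sight, weighted by the cell size, gives a column density map. This is purely illustrative and not the actual camera code:

```python
def column_density(rho, dx):
    """Integrate a density cube along its first axis: N = sum(rho) * dx.

    rho is a nested list indexed as rho[x][y][z]; projecting down x
    sums over the first index, leaving a 2D y-z map.
    """
    ny, nz = len(rho[0]), len(rho[0][0])
    return [[sum(rho[i][j][k] for i in range(len(rho))) * dx
             for k in range(nz)] for j in range(ny)]

# Uniform toy cube: 4 cells deep with rho = 1 and dx = 0.5,
# so every pixel of the map is 4 * 1.0 * 0.5 = 2.0.
rho = [[[1.0] * 8 for _ in range(8)] for _ in range(4)]
cdm = column_density(rho, dx=0.5)
```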

I did this because there is a time for each camera in the file: for the nth camera, should its time equal the final computational time in the file?

0.1 Final Time 0.25 Final Time 0.50 Final Time
10 Frames
camera movie, 10 bovs camera movie, 10 bovs camera movie, 10 bovs
chombo movie, 10 chombos chombo movie, 10 chombos chombo movie, 10 chombos
25 Frames
camera movie, 25 bovs camera movie, 25 bovs camera movie, 50 bovs
chombo movie, 25 chombos chombo movie, 25 chombos chombo movie, 25 chombos
50 Frames
camera movie, 50 bovs camera movie, 50 bovs camera movie, 50 bovs
chombo movie, 50 chombos chombo movie, 50 chombos chombo movie, 50 chombos

Fly through test 2

Yesterday I got a new code and checked out development. I then made a new problem directory in my code to run Jonathan's problem module and .data files from BH2. The code eventually made after fixing the HYPRE path and CPP errors. It seems we didn't see the clump in the first test because I ran a 2D simulation, so this time I set nDim = 3 and Gmx = 32, 16, 16. I ran the executable twice, for 0 and 2 levels of AMR (see results below). I ran both out for 100 frames, so the camera seems to move from one position to the next at a slower rate compared to Jonathan's simulation, which only has a few frames.

Currently I am trying to understand how the camera works and run some simulations with a different camera position, focus and up vector. I'll post those results once they are done.

0 Levels of AMR

Chombo movie

bov movie

2 Levels of AMR

2 levels of amr chombo frame 0

Chombo movie

fly through projection

bov movie

Fly through test 1

Currently trying to replicate Jonathan's simulation johannjc09292014 so that I can apply his method of setting up the cameras to our problem module for CollidingFlows. In Jonathan's problem module he sets up the cameras, makes an ambient to fill the grid, puts a clump in the ambient, and then puts a wind in the box.

Last week I made the code incorrectly due to my own lack of understanding. This time I created a new problem directory and threw in his .data files and problem.f90 to make a new executable. I did this using scrambler_3.0 that I copied from Erica's directories. I ran into an issue when I tried to make the code:

modules/Problem/problem.o: In function `problem_mp_problemmoduleinit_':
modules/Problem/problem.f90:(.text+0x3ef): undefined reference to `updateprojection_'
make: *** [astrobear] Error 1

So I went into Jonathan's problem.f90 and commented out

CALL UpdateProjection(Projection)

and it made the code without any errors. The results from the simulation are below. We still see rotation in the frame; however, it still does not look like Jonathan's simulation. It rotates into the wall and then zooms up on it. I only included ¾ of the simulation since I did this on my personal laptop and it was taking too much time in visit while ssh-ing into Bamboo.

So now I have gotten a new astrobear and am trying to do the same process again (this time hopefully not needing to comment out the call to update projections).


Here is what the chombo files look like:


Essentially there is just a wind coming in.

Meeting Update 10/27/2014

Visualized my fly-through data. It isn't right; seems like it just follows the corner of a box. Didn't implement the tracers last time since it was just a test of the code, so I figured it wouldn't look like our problem. That is the next step.

Update with 1024cores

I ran the Beta 10 No Shear case on Stampede with 1024 cores. Here is the result (see Table below):

So if we're at frame 246, we have 154 frames left. So dividing 154 by our rate, we have 9.4 days (225.6 hrs) to run this simulation out. Thus, 225.6 * 1024 = 231,014.4 cpu hrs. Multiply this by 4, as we have 4 runs, and that yields approximately 924,057.6 cpu hrs total. This is not much different than the total result from last week. It does not seem economical to run these on 1024 cores; in my opinion we might as well just run these on 2048 cores as they'll be faster but have little to no shift in cpu hrs.

Perhaps we should choose just a few cases on 2048 cores?

If on 2048 cores we estimate 34.85 frames a day (average of rates from last blog post) with approximately 164 frames left (average from last blog post) that implies that we have approximately 5 days to run a simulation, or 113 hours. This is approximately 231,304 cpu hrs. With 3 runs, that is 693,911 cpu hrs. With 2 runs that is 462,607 cpu hrs.

Perhaps we could split the runs between machines? However we aimed to use Stampede because it is so fast in comparison to the likes of BlueStreak.

Run (Current Frame) Hours Sampled Time (mins) Avg. time (mins)
b10s0 (246) 23:30 - 00:50 80
10:27 - 11:58 91
19:44 - 21:16 92

Table. The Beta 10 Shear 0 run with current frame for which the hours were sampled and averaged to do the brief calculations above.

Update 10/16/2014

I'm studying for the physics GRE so I might be locked away in a library for the next week and a half. Wish me luck.

Made some VisIt documentation! Weeeeeeeeeee

Working on fly-throughs for the CollidingFlows problem by referencing Jonathan's blog post. So far I've implemented his problem.f90 and got the code to make properly. I've got some results to visualize, so I'll post those later when I get around to it.

Managing some jobs on Stampede, however things are pretty expensive.

cpu hrs for CF runs on Stampede

Running the CollidingFlows problem out from frame 200 to 400 to double the time and see if we can observe any more sink formation. Given that this run is really computationally intensive, I've done a quick calculation for cpu hrs based on some current runs I am doing on Stampede. All runs are in the normal queue for 24 hrs on 2048 cores. The table below provides the current frame number at which I collected this data. We can see that the average time for our code to spit out a frame is (underscores correspond to the run):

Given that we have 1,440 minutes in a day, this implies we'd spit out the following frames per day:

Considering that the differences between the current frame and the last frame (400) for beta10 shear 0, 15, 30 and 60 are respectively 179, 182, 159, and 136, we're looking at running these out for approximately 5-6 days on 2048 cores. Specifically for b10s0: 5.5 days, b10s15: 5.8 days, b10s30: 5.2 days, and b10s60: 3 days. Using this number of days, 24 hours in a day, and 2048 cores, this puts us at a total of: 957,973 cpu hrs. THAT IS INSANE.
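The arithmetic above can be packaged into a small helper (an illustrative sketch using round numbers, not the measured ones in the table):

```python
def cpu_hour_estimate(avg_min_per_frame, frames_left, cores):
    """Estimate days of wall-clock time and total cpu-hrs to finish a run.

    frames/day = 1440 / avg_min_per_frame
    days       = frames_left / (frames/day)
    cpu-hrs    = days * 24 * cores
    """
    frames_per_day = 1440.0 / avg_min_per_frame
    days = frames_left / frames_per_day
    cpu_hours = days * 24 * cores
    return days, cpu_hours

# e.g. 45 min/frame with 180 frames left on 2048 cores:
days, cpu_hrs = cpu_hour_estimate(45, 180, 2048)
# 32 frames/day -> 5.625 days -> 276,480 cpu hrs
```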

After a quick discussion with Erica and Baowei I've come up with the following short term plan: Once these jobs stop later today, I'll submit 1 job to the normal queue on 1,000 cores. For this run I'll make the same calculation and see if it is more economical when multiplied by 4. Baowei has also suggested to throw runs on Gordon, another machine owned by the Texans. We have a lot of SUs there, so he is currently setting me up. We currently only have 1,551,296 SUs available on Stampede — so running our jobs for this problem there could be quite precarious.

Run (Current Frame) Hours Sampled Time (mins) Avg. time (mins)
b10s0 (221) 16:07 - 16:55 48
02:58 - 03:38 40
07:13 - 07:58 45
b10s15 (218) 18:03 - 18:48 45
02:19 - 03:03 44
11:05 - 11:54 49
b10s30 (241) 17:57 - 18:40 43
00:26 - 01:23 57
07:03 - 07:44 41
b10s60 (264) 17:40 - 18:07 27
00:04 - 00:38 34
07:43 - 08:18 35

Table. Each run with current frame for which the hours were sampled and averaged to do the brief calculations above.

Meeting Update 09/29/2014

Tasks for this week:

  • Particular CDM movies.
  • Post Processing for convergence tests on Bluestreak. Encountered the following error on Erica's account:
    2014-09-27 05:11:02.120 (FATAL) [0x40001069280] 22137:ibm.runjob.client.Job: could not start job: job failed to start
    2014-09-27 05:11:02.126 (FATAL) [0x40001069280] 22137:ibm.runjob.client.Job: Load failed on R00-ID-J06: Application executable ELF header contains\
     invalid value, errno 8 Exec format error
  • Reading Shapes and Shaping of Planetary Nebulas (2002) by Bruce and Adam.
  • Attempting to visualize these ascii files in SHAPE.

CF High Resolution Runs Visualized

Meeting Update 09/22/2014

Last week was spent attempting to construct a quick-and-dirty table of the high res colliding flows runs… however I ran into a plethora of issues regarding visit on BH2. Using x2go is fine and works well. Apparently my account is unable to use public key authentication on some particular nodes. Talked with Jonathan, and this is getting sorted with CIRC.

Visualizing from my own computer while ssh-ing into a local machine is painstaking. It takes half a day just to get a movie of a .bov. So now I am working on Erica's computer and it is going quite swimmingly. I also have these simulations with sink particles. There are still some issues regarding particular files, however, that are putting a damper on getting the data out there.

As I've tried to go about visualizing the runs, I noticed some issues with the files other than data endians. A few are missing or were lost/deleted during our move around.

Last week I accomplished the following:

  • Using sed I changed the data endians and variable names.
  • Wrote a script to change the name of the files so they are all the same and we can efficiently access them with the "smart" grouping in visit. Did this for both BS and local sets of the data.
  • Finished the NoShear data set (now we have all hist and pdfs — transferred between local and BS).
  • Cherry picking sections of missing data or areas where visit doesn't like the files. Still working on this actually…
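The sed edits in the first bullet can be sketched in Python as well, assuming the standard VisIt .bov header keywords (DATA_ENDIAN, VARIABLE); the old/new values below are purely illustrative, not the actual edits:

```python
import glob

def fix_bov_headers(pattern, replacements):
    """Rewrite .bov header text in place, like the sed edits above.

    replacements: dict of old -> new header text, e.g.
    {"DATA_ENDIAN: BIG": "DATA_ENDIAN: LITTLE"} (illustrative values).
    """
    for path in glob.glob(pattern):
        with open(path) as f:
            text = f.read()
        for old, new in replacements.items():
            text = text.replace(old, new)
        with open(path, "w") as f:
            f.write(text)
```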

Meeting Update 09/08/2014

Over the past few weeks I've spent my time shuffling around data and organizing things to make visualization not… impossible? So now we have all of the data together in their respective directories, shared between bamboodata and grassdata. Although one can use Erica's directories in grassdata as "home base" as I made sym-links to the bamboodata directories. So everything seems like it is stored with its applicable files on grassdata.

Current Disk Usage:

/bamboodata/ 86%
/grassdata/ 89%

Now we're doing post processing on the data, along with resolution testing. We have updated our wikipage (see: CFRunStatsHighRes).

I am also studying for the physics GRE, and given my study/work schedule I might not show up for group meetings. I'll still post updates.

Running Jobs

  • Currently:
    1. Bluestreak (BS):
      • Post processing (PP) on our chombos (reservation, for B1S60 and B10S15 256 nodes). Once those two runs are done, we'll do the other two. Eventually we'll need to throw the NoShear case into the mix (see 2.).
      • Convergence testing at AMR level 3 (pending in Standard queue for both shear 60 cases). Once these jobs are done, we'll do the other two if necessary.
    2. Stampede:
      • Running the NoShear case on there. Have done approximately 3 restarts. The current restart began at frame 187, so this should be the last restart there. After each restart completes and I start the next one, I transfer files back to our local directories. Once we have all the files, I'll transfer them to BS to do PP.
  • Future:
    1. PP on BS of NoShear data.
    2. PP for B10S30 and B10S60 on BS.
    3. Convergence Tests for the rest of the data sets.

Transferring Files

  • Once all PP on BS is done, transfer everything back to local directories.
  • Transferring NoShear to BS and local machines.


  • Erica submitted the proposal to the Vista Collaboratory Contest this past Friday.
  • Visualization on BS using x2go.

Meeting Update 08/25/2014: I made a shape!

So in an attempt to visualize the pn data that Baowei set aside for me in SHAPE, I need to learn the ropes of the software first. So that is what I've been up to. The SHAPE guys have a youtube channel, so I am going through all their videos, copying what they do on my own computer with the software (see: SHAPE 3D Interactive Astrophysics)

So I made a torus. I understand the 3D view ports in SHAPE now; it is pretty intuitive once you know where things are located. Turns out you can also add observational photos to the background, which might be useful as a reference when you're creating detailed objects. In the SHAPE tutorials, they have a Jet Template Project in their Hydrodynamics module. My short term goal is to recreate that, and perhaps a few other templates, until I can convert the data we have (see: SHAPE Templates).

Another issue is converting hdf5 to ascii, which Martin informed me is the necessary format to feed into SHAPE. Baowei has a script (see: Baowei's Page) where one first needs to rewrite each chombo with a single core, and then convert to ascii.
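The hdf5 side depends on the chombo layout, but the ascii half is simple. Here is a sketch that dumps cell positions and density as whitespace-separated columns; the x y z rho column order is my guess at what SHAPE expects, not a confirmed format:

```python
def write_ascii(path, cells):
    """Write (x, y, z, rho) tuples as whitespace-separated ascii lines,
    one cell per line, in the style SHAPE ingests (assumed column order)."""
    with open(path, "w") as f:
        for x, y, z, rho in cells:
            f.write(f"{x:.6e} {y:.6e} {z:.6e} {rho:.6e}\n")
```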

For fun I threw the chombo files into VisIt to see what they looked like:


Both sets have the same max and min.

Going forward:

  • Attempting to become a shape aficionado, try to visualize the data we have in SHAPE… and understand it. Dig on some literature about p/pns.
  • Working on doing the post processing for Erica's runs. They should all be complete now.
  • Once I'm done making some of the typical movies of the colliding flows bovs, I'll work on doing the movie fly-throughs. Martin said he needed some pretty graphics, so I am going to try to get as much of this done as I can soon. Also for the VISTA contest in September 5th.

Meeting Update 08/18

No science this week for me. Still working on Erica's runs, and starting to delve into more development related topics.

On High Res Runs

  • Transferring materials currently to Stampede, about to start submitting jobs today (beta 1, shear 60 & beta 10, shear 15). The other two will run on Bluestreak.
  • Submitting jobs to Bluestreak. Waiting on Carl to respond to renew reservation, having beta 10 shear 60 running in standard queue at 128 nodes.
  • Aiming to update the CF Production Runs Page. Some of our runs are almost done. For instance beta 10 shear 60 is nearly to chombo 190.

Computing Resources

See previous post.


Had a meeting with Jonathan last Friday to start working on development projects.

  • Once I am done working with Erica's data I am going to focus on creating a knapsack algorithm (which Eddie kindly gave to me :) ). Jonathan and I took a look at the current algorithm we use. I found a thesis by Pierre Schaus, and am planning to contact both Pierre Schaus (nurse-patient algorithm) and Mark Krumholz (about load balancing in the accretion module for Orion, an astrophysical AMR code). We also came across this Zoltan code, which might be useful to look at?
  • Cleaning up unused variables in the code.
  • Implementing hydrostatic equilibrium in the disk module. For which I assume I'll have to consider:

We know hydrostatic equilibrium is defined generally to be

which I'll define for a cylindrical coordinate system (with r the distance from the disk axis, and z the height above the plane). Then balancing with the ideal gas law and isothermal sound speed, we can integrate our equation, yielding:

If anyone wants to go over the derivation with me, that is fine; I just included some highlights here. It is a pretty standard derivation, and I think it can be found in some books. Given that this would be implemented for the disk module, I just assumed a central mass and an infinitesimal chunk of mass at some (r, z) in a cylindrical coordinate system.
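For reference, here is a hedged sketch of those highlights (the standard isothermal thin-disk result around a central point mass M; this notation is mine, not necessarily what the disk module will use):

```latex
% Vertical hydrostatic balance, with gravity from the central mass M:
\frac{dP}{dz} = -\rho\, g_z, \qquad
g_z = \frac{G M z}{(r^2 + z^2)^{3/2}} \approx \frac{G M z}{r^3} \quad (z \ll r)

% With the isothermal ideal gas law P = \rho c_s^2, integrating in z gives
\rho(r, z) = \rho_0(r)\, \exp\!\left(-\frac{z^2}{2 H^2}\right),
\qquad H \equiv \frac{c_s}{\Omega} = c_s \sqrt{\frac{r^3}{G M}}
```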

However for now I'll be focusing primarily on the knapsack problem since it'll be more beneficial for the code.


Downloaded the software. It looks pretty neat. Haven't had any creative ideas of what to do with it yet in order to understand its capabilities.

Computing Resources & Local Machines

I noticed while scp-ing data from the CollidingFlows runs that some of the typical machines I use are starting to become full. I also do visualizations on Clover or my own machine; however, they can be quite hard on my own graphics card. This has led me to talk to Baowei and others about potentially looking into developing a new machine quote (see:

Given that the quote is 6 months old, the price is likely not the same anymore. It has been suggested to me that someone in the group should e-mail Sean Kesavan <> and ask him to give us an updated quote or change it to meet our requirements (referencing the quote number on the quotation). I also think these are the specs for Bamboo (

In my opinion it might be a good idea to look into more disk space/storage and also get a new machine for visualization. In the mean time I took a look at our local machines and who is using them.

Commands I am using:

  • du -sh * 2>&1 | grep -v "du: cannot" — prints the disk space used by each user, redirecting standard error to standard out so permission errors can be filtered from the listing.
  • df -h — run inside the directory to see the disk space used (along with %) and the space that is available.


The Disk Space Used

DS Used User / Directory
63G bliu
13G ckrau
4.6G ehansen
227G erica
422G johannjc
16K lost+found
1.1T martinhe
4.0K modulecalls
20M nordhaus
1.3T shuleli
300M test.dd
3.5G trac_backup

The Disk Space Available

Filesystem Size Used Avail Use% Mounted on
aufsroot 11T 4.7T 6.3T 43% /
udev 7.9G 4.0K 7.9G 1% /dev
tmpfs 1.6G 1.3M 1.6G 1% /run
none 4.0K 0 4.0K 0% /sys/fs/cgroup
none 5.0M 0 5.0M 0% /run/lock
none 7.9G 108K 7.9G 1% /run/shm
none 100M 160K 100M 1% /run/user
/dev/md127 5.5T 3.3T 1.9T 64% /media/grassdata
11T 4.7T 6.3T 43% /home/ehansen
11T 4.7T 6.3T 43% /home/shuleli
5.3T 4.3T 802G 85% /mnt/net/alfalfadata


The Disk Space Used

DS Used User / Directory
1.2T bliu
4.0K brookskj
6.0G chaig
8.0K ehansen
580G erica
1.1T johannjc
16K lost+found
2.6T madams
1.2T martinhe
21M rsarkis
4.9G ScalingTest
3.6T shuleli

The Disk Space Available

Filesystem Size Used Avail Use% Mounted on
aufsroot 11T 4.7T 6.3T 43% /
udev 7.9G 4.0K 7.9G 1% /dev
tmpfs 1.6G 1.3M 1.6G 1% /run
none 4.0K 0 4.0K 0% /sys/fs/cgroup
none 5.0M 0 5.0M 0% /run/lock
none 7.9G 108K 7.9G 1% /run/shm
none 100M 160K 100M 1% /run/user
/dev/md127 5.5T 3.3T 1.9T 64% /media/grassdata
11T 4.7T 6.3T 43% /home/ehansen
11T 4.7T 6.3T 43% /home/shuleli
5.3T 4.3T 802G 85% /mnt/net/alfalfadata
13T 11T 1.2T 91% /mnt/net/bamboodata


The Disk Space Used

DS Used User / Directory
4.0K bliu
673G johannjc
904K orda
3.4G repositories
4.0K repositories_backup.s
11G scrambler_tests
2.6G tests
75G trac
4.0K trac_backup.s
7.2G trac_dev
4.0K trac.wsgi
1.8M www

The Disk Space Available

Filesystem Size Used Avail Use% Mounted on
aufsroot 11T 4.7T 6.3T 43% /
udev 7.9G 4.0K 7.9G 1% /dev
tmpfs 1.6G 1.3M 1.6G 1% /run
none 4.0K 0 4.0K 0% /sys/fs/cgroup
none 5.0M 0 5.0M 0% /run/lock
none 7.9G 108K 7.9G 1% /run/shm
none 100M 160K 100M 1% /run/user
/dev/md127 5.5T 3.3T 1.9T 64% /media/grassdata
11T 4.7T 6.3T 43% /home/ehansen
11T 4.7T 6.3T 43% /home/shuleli
5.3T 4.3T 802G 85% /mnt/net/alfalfadata
13T 11T 1.2T 91% /mnt/net/bamboodata
890G 838G 6.3G 100% /mnt/net/botwindata


The Disk Space Used

DS Used User / Directory
28M afrank
368K ameer
72K aquillen
4.0K array.f90
4.0K awhitbe2
20K BasicDisk
137G bcc
31M blin
9.7M bliu
1.7G bobbylc
3.3G chaig
2.6M clover
37M cruggier
0 data
851M DB
4.0K devsrc
107M edmonpp
72G ehansen
301G erica
1019M iminchev
448K jnp1
3.0T johannjc
14M langao
154M laskimr
2.9M lijoimc
3.9G local
16K lost+found
221M madams
13G martinhe
8.0K MegaSAS.log
124K mendygral
37M MHDWaves
13M mitran
648M moortgat
852M munson
3.0G noyesma
20K odat1
4.0K old_accounts
45G pvarni
292K raid
3.3G repositories
5.8G repositories_backup
140M rge21
2.9G rsarkis
192K ryan
126G scrambler_tests
2.1T shuleli
16M spearssj
0 test
54M test3
840M tests
355M tests130
231M tests_old
452K tkneen
14G trac
27G trac_backup
60K wma
1.8M www

The Disk Space Available

Filesystem Size Used Avail Use% Mounted on
aufsroot 11T 4.7T 6.3T 43% /
udev 7.9G 4.0K 7.9G 1% /dev
tmpfs 1.6G 1.3M 1.6G 1% /run
none 4.0K 0 4.0K 0% /sys/fs/cgroup
none 5.0M 0 5.0M 0% /run/lock
none 7.9G 108K 7.9G 1% /run/shm
none 100M 160K 100M 1% /run/user
/dev/md127  5.5T  3.3T  1.9T   64%  /media/grassdata
            11T   4.7T  6.3T   43%  /home/ehansen
            11T   4.7T  6.3T   43%  /home/shuleli
            5.3T  4.3T  802G   85%  /mnt/net/alfalfadata
            11T   5.8T  4.5T   57%  /mnt/net/cloverdata


The Disk Space Used

DS Used User / Directory
184M afrank
4.0K bshroyer
711M bshroyerdata
8.5M chaig
4.0K cruggier
27M data4out.out
127M data5out.out
4.0K ehansen
1.4T erica
4.0K eschroe3
2.1G ferreira
3.2G fnauman
33G grass_swapfile
29G jnp1
119G johannjc
4.0K johannjc:~
4.0K langao
4.0K laskimr
16K lost+found
310G madams
656G martinhe
12M munson
1.4G noyesma
112M root
762G shuleli
1.4G slucchin
4.0K tdennis
1001M test.dd
4.0K testlog
403M tkneen
46M xguan

The Disk Space Available

Filesystem Size Used Avail Use% Mounted on
aufsroot 11T 4.7T 6.3T 43% /
udev 7.9G 4.0K 7.9G 1% /dev
tmpfs 1.6G 1.3M 1.6G 1% /run
none 4.0K 0 4.0K 0% /sys/fs/cgroup
none 5.0M 0 5.0M 0% /run/lock
none 7.9G 108K 7.9G 1% /run/shm
none 100M 160K 100M 1% /run/user
/dev/md127  5.5T  3.3T  1.9T   64%  /media/grassdata
            11T   4.7T  6.3T   43%  /home/ehansen
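For reference, a capacity snapshot like the listings above can be pulled programmatically as well; a minimal sketch using the standard library (the "/" mount is a stand-in here for /media/grassdata or whichever mount you care about):

```python
import shutil

# Capacity snapshot for one mount point; mirrors the df -h columns above.
# "/" is used as a stand-in -- substitute e.g. "/media/grassdata".
total, used, free = shutil.disk_usage("/")
GB = 1024**3
print(f"size={total/GB:.1f}G used={used/GB:.1f}G "
      f"avail={free/GB:.1f}G use%={used/total:.0%}")
```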

Meeting Update 08-05-2014

File Transfer from BS & Run Updates

  • Went through all of the chombos in VisIt. All look great, no issues.
  • Deleted all of the chombos on BS, as I was over the soft limit and could no longer write files.
  • Been updating Erica's Production Runs page.
  • Reservation starting Thursday at 8AM for 5 days. Plan on running B1S60 at 128 nodes, B10S15 at 256 nodes, and B10S30 at 128 nodes. Currently have B10S60 running in the standard queue, already at chombo/sink 157.

Post-Processing & Local Machines

  • All of the files I have run on BS have been transferred over to either grassdata or bamboodata (some B10S30 chombos, 143-153). Check the ProductionRuns directories under madams on grassdata/bamboodata.
  • We have the following chombos: 144 (B10S15), 153 (B10S30), 153 (B10S60), and 130 (B1S60).
  • We have about ¾ of all the runs done.
  • Post-processing is up to date for B1S60 (completed last night); currently doing post-processing on bamboo and grass (bamboodata) for B10S60 and B10S30 respectively.
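A quick way to confirm a transfer directory really has every frame before post-processing is to scan for gaps; a sketch (the five-digit `.bov` naming pattern is an assumption, so adjust the regex to the actual output names):

```python
import os
import re

def missing_frames(directory, last_frame, pattern=r"(\d{5})\.bov$"):
    """Return frame numbers in 0..last_frame with no non-empty file on disk."""
    found = set()
    for name in os.listdir(directory):
        m = re.search(pattern, name)
        # Treat zero-size files as missing: a truncated scp leaves empty files.
        if m and os.path.getsize(os.path.join(directory, name)) > 0:
            found.add(int(m.group(1)))
    return sorted(set(range(last_frame + 1)) - found)
```

Running something like `missing_frames("/grassdata/madams/ProductionRuns/...", 200)` would list exactly which frames need to be re-transferred or re-post-processed.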


Going Forward

  • Reservations on BS. E-mailed Carl to see if he could take off the soft limit for Erica's account as well as my own.
  • Getting onto Stampede so we can split the jobs between the two machines.
  • Need more space on our local machines. Many of our files are scattered between our directories. It would be smart to manage them in the same directories.

Meeting Update 07/28

Now in full control of Erica's CollidingFlows data on Bluestreak. Completed first reservation (5 days) today. Can see our progress here (CFRunStatsHighRes).

  • E-mailed Carl about the soonest I can get another reservation (4-7 days on BS). (Update 14:47: Got our reservation extended to next Monday at 8am).
  • Erica is working on what is necessary to get us up on Stampede.


  • scp-ing files from BS to /grassdata/madams/.
  • Post-processing on Bamboo.
  • Running B10S30 on our extended reservation on BS at 512 nodes for 48 hours (07-28-2014 14:49).
  • Submitted B10S60 to the standard queue on BS at 128 nodes for 24 hours… pending, but we'll see what happens.

Some highlights from CFRunStatsHighRes Page:

Beta | Shear | Machine, Path | Frames Completed / Left | Filling Fracs | Walltime Taken / Left | (1-2) Info, (3) Message allocations | Framerate, No. cores | Notes
1 | 60 | BS, /scratch/madams15/CollidingFlows/Beta1/Shear60/Restart/ | 122/200 | 0.106, 0.559, 0.631, 0.321, 0.467 | 16.8 days / 0 days | 282.5 GB, 197.6 MB, 64.0 MB | ~5 hr/frame, 2048 cores | Reservation. Stayed with 128 nodes. Started with 48 hours left on reservation.
10 | 15 | BS, /scratch/madams15/CollidingFlows/Beta10/Shear15/Restart | 139/200 | 0.386, 0.698, 0.616, 0.496, 0.468 | 22.1 days / 0 days | 422.4 GB, 296.5 MB, 64.0 MB | 7 hr/frame, 2048 cores | Reservation. Restarted with 128 nodes on 07/26.
10 | 30 | BS, /scratch/madams15/CollidingFlows/Beta10/Shear30 | 128/200 | 0.136, 0.587, 0.601, 0.465, 0.418 | 1.2 mo / 0 days | 377.0 GB, 252.4 MB, 64.0 MB | 9 hr/frame, 2048 cores | Reservation. Started Jul 24 08:46, cancelled to test other runs, started again Jul 26 16:41.
10 | 60 | BS, /scratch/madams15/CollidingFlows/Beta10/Shear60/Restart/ | 151/200 | 0.180, 0.547, 0.544, 0.318, 0.269 | 6.7 days / 0 days | 305.3 GB, 211.8 MB, 64.0 MB | ~4 hr/frame, 2048 cores | Reservation.

Going Forward:

No. Nodes  Case    frames/hr  frames/day  Chombos Left  Days to Complete  Weeks to Complete
128        B1S60   0.2        4.8         78            16                2.29
128        B10S15  0.143      3.5         61            17                2.43
128        B10S30  0.111      2.67        72            27                3.9
128        B10S60  0.25       6           49            8                 1.14

256        B1S60   0.4        9.6         78            8.1               1.16
256        B10S15  0.29       6.96        61            8.7               1.24
256        B10S30  0.22       5.28        72            13                1.86
256        B10S60  0.5        12          49            4                 0.57

512        B1S60   0.8        19.2        78            4                 0.57
512        B10S15  0.57       13.68       61            4.5               0.643
512        B10S30  0.44       10.56       72            7                 1
512        B10S60  1          24          49            2                 0.29
  • Depending on how quickly we can get onto Stampede, I am estimating that we'll be done with these in 2 weeks if we do this most efficiently.
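The tables above are straightforward rate arithmetic; a sketch of the calculation, using the B1S60 numbers from the 128-node rows:

```python
def completion_estimate(frames_per_hr, chombos_left):
    """From an observed framerate, estimate days and weeks left for a run."""
    frames_per_day = frames_per_hr * 24
    days = chombos_left / frames_per_day
    return frames_per_day, days, days / 7

# B1S60 at 128 nodes: 0.2 frames/hr with 78 chombos left
# (about 4.8 frames/day, ~16 days).
per_day, days, weeks = completion_estimate(0.2, 78)
print(per_day, round(days), round(weeks, 2))
```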

Meeting Update 07-21


  • I know it is about time, but I am now attempting to make low-res fly-throughs of the .bov files that we already have for our colliding flows visualizations. The plan is that once we get all of the high-res data from our production runs, I'll make some nice fly-throughs of those.
    • cp'd the code (several files) from /bamboodata/erica/CollidingFlows/CollidingFlows/MHD_3D/ to /bamboodata/madams/CollidingFlows/FlyThru/.
    • Referencing ProjectionObjects wikipage.
    • Need problem.f90.
  • Continuing to make .bov gifs for the files as we get them.
    • Bluestreak is currently down until Thursday this week at 8am, in time for my reservation to pick up on Erica's restarts.
    • Erica, Baowei and myself think it will be good to get some code up on Stampede, and run some jobs there. Erica says it'll probably be even faster. I'll have to familiarize myself with how to run jobs there.
    • Having computer issues with Clover and my own personal macbook. Making 3D Box CDM plots is very computationally intensive for these machines…

Mass-1 Column Density Maps

Beta 1


Beta 10




So what is different about these compared to those posted in my last blog post? I found the coloring to be too weak on the ones I previously posted, and I had not considered the ratio between GmX and GxBounds when determining the maximum and minimum. So after going through and visualizing all of the mass-1 .bov sets for Beta 1 and 10 for our production runs, I have decided max = 1000 and min = 190 will be appropriate. Given the size of the box (which I looked up), this implies that for all beta 10 cases, the mass2 and mass3 maps will have max = 639 and min = 121.4. I think Beta 1 has a different box size, but I need to double-check.
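The rescaled limits are just the original limits multiplied by a column-depth ratio; the 0.639 factor below is inferred from the 639/1000 pair quoted above, not read out of the problem files:

```python
# Rescale CDM color limits by the ratio of integration (column) depths
# between map directions. depth_ratio = 0.639 is inferred from 639/1000.
depth_ratio = 0.639

cdm_max, cdm_min = 1000, 190
print(cdm_max * depth_ratio, cdm_min * depth_ratio)
```

This reproduces the mass2/mass3 limits quoted above (639 and 121.4) from the mass-1 limits.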

Again, it seems like I'll only be able to focus on fly-throughs, enhancement, some things I noticed previously about the .bovs, and getting on Stampede until I can ssh to BGQ.

Meeting Update 07-17-2014


  • Transferring files from Bluestreak to grassdata as we get them.
  • Some of the more recent .bov files have been corrupted. Moving the chombos for those to my scratch space so we can do post processing again. Here is what we have so far:




  • PREP Visit: I was invited by PREP (Pre-College Experience in Physics for Girls) to have lunch with the students and give a talk on what I had been working on this summer. I titled the talk "Your Computer: The unexpected frontier in star formation." It covered what we do know about star formation and how colliding flows and AstroBEAR contribute to the field. We derived the Jeans mass as done in Derek Ward-Thompson's book (i.e. it was not intimidating, just some variable manipulation), and I gave them a problem where they can plug in some numbers and determine whether a proto-star will collapse to a star. I promised them brownies if I got 10 answers by Friday this week. I showed them some simulations Erica and I had done and how we use computers to run code. They seemed most blown away by the terminal, and by the idea that your computer doesn't need to be used just for writing essays for AP English and using Facebook. A lot of good questions, very smart students. Someone from the group should visit them every year.
  • Horizons: Erica and I shared the lesson last week (Thursday, July 10th). We had 5 students: one kindergartener, one third grader, and three middle schoolers. Erica tells me that we only had three this past Tuesday. Today I am hosting the session in collaboration with some graduate students in Optics. We're going to talk about the Sun and telescopes. If the weather permits (sadly it is expected to be partly cloudy around that time), we'll use a sun-scope and check out some sunspots. Optics is bringing some pinhole cameras and telescope demos, as they get a lot of this equipment via donation from Edmund Optics.
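The plug-in-numbers exercise was the usual Jeans-mass estimate; a sketch of that arithmetic in SI units (the T = 10 K, n = 100 cm⁻³ cloud below is an illustrative choice of mine, not the numbers from the handout):

```python
import math

# Physical constants (SI)
k_B = 1.380649e-23   # Boltzmann constant, J/K
G   = 6.674e-11      # gravitational constant, m^3 kg^-1 s^-2
m_H = 1.6726e-27     # hydrogen mass, kg
M_sun = 1.989e30     # solar mass, kg

def jeans_mass(T, n, mu=2.33):
    """M_J = (5 k_B T / (G mu m_H))^(3/2) * (3 / (4 pi rho))^(1/2),
    with T in K and number density n in m^-3 (mu = 2.33 for molecular gas)."""
    rho = n * mu * m_H  # mass density, kg/m^3
    return (5 * k_B * T / (G * mu * m_H))**1.5 * math.sqrt(3 / (4 * math.pi * rho))

# A 10 K cloud at 100 particles/cm^3 (= 1e8 m^-3): tens of solar masses.
print(jeans_mass(10, 1e8) / M_sun)
```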

High Resolution CDM for Beta 10 cases 15, 60(ish)

Our high resolution colliding flows simulations are starting to come in. Erica went through post-processing, and here are some CDMs. Grass ended up crashing while I was making the Beta 10 Shear 60 gif, so I only have frames up to 64.



Resolution Test for Erica's Grid

In the previous group meeting, we discussed the comparison plots I made with the different box sizes. It turns out that by changing the GmX-to-box-size ratio you're changing the resolution. So given the original cell size of dx = 1.5625, I took Erica's box size of L_x = 200 and applied a GmX of 128, yielding the same cell size. Here is the result.
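The bookkeeping here is just dx = L_x / GmX, so keeping the resolution fixed while changing the box means scaling GmX with L_x:

```python
def cell_size(box_length, n_cells):
    """Base-grid cell size: dx = L / GmX."""
    return box_length / n_cells

# Erica's box: L_x = 200 with GmX = 128 reproduces the original dx.
print(cell_size(200, 128))  # -> 1.5625
```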


Biggest Box Ever...

Here we go, as I promised my last post:


v/c_{s} for hydro shear 60, x = 200, y = z = 64

The Hydro Shear 60, x = 200, y = z = 64 data set has the smallest height and width, so we're checking whether the flow leaving the box is subsonic or supersonic, as this case would be thought to be the most problematic (due to it being the most smooshed). To do this we consider the ratio between the magnitude of the velocity vector v, with components (v_x, v_y, v_z), and the sound speed, c_s.

So we defined

  v/c_s = sqrt(v_x^2 + v_y^2 + v_z^2) / c_s.

Note that P is the Hydro/3DPressure expression in VisIt, i.e.

  P = (gamma - 1) (E - (1/2) rho |v|^2),

and that c_s = sqrt(gamma P / rho).
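As a numerical sanity check of this ratio (assuming the ideal-gas sound speed with gamma = 5/3; the code-unit values below are illustrative, not taken from the run):

```python
import math

GAMMA = 5.0 / 3.0  # adiabatic index (assumed)

def sound_speed(P, rho, gamma=GAMMA):
    """c_s = sqrt(gamma * P / rho)."""
    return math.sqrt(gamma * P / rho)

def mach(vx, vy, vz, P, rho, gamma=GAMMA):
    """|v| / c_s: > 1 is supersonic, < 1 subsonic."""
    return math.sqrt(vx**2 + vy**2 + vz**2) / sound_speed(P, rho, gamma)

# In code units with P = rho = 1, c_s ~ 1.29, so a (1, 1, 1)
# velocity is mildly supersonic.
print(sound_speed(1.0, 1.0), mach(1.0, 1.0, 1.0, 1.0, 1.0))
```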


Given that fluid is sonic at a ratio of 1, we can see that anything orange or red is sonic or supersonic; otherwise it is subsonic. So a lot of the fluid leaving the box is subsonic; however, the clumps we see being flung off from the interface of the colliding flows are supersonic. This might be because we are taking the magnitude of the full velocity here, not just the z-component.


Above is the same plot for the z-component alone, v_z/c_s; we still see a lot of subsonic fluid.

Objective: Going to make a 200x200x200 run, so that by the 200th chombo the effects won't reach the boundary. Then I'll check the ratio of run times for the different box sizes and determine how economical it'll be to have a box size that huge, and to what extent we should shrink it.
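Before the timing test, the cost of the big box can be roughed out from cell counts at fixed dx (per-step cost scales roughly with the number of base-grid cells; this ignores AMR refinement, so it is only a rough estimate):

```python
def cell_count_ratio(box_a, box_b):
    """Ratio of base-grid cell counts for two boxes at the same cell size dx."""
    vol = lambda b: b[0] * b[1] * b[2]
    return vol(box_a) / vol(box_b)

# 200x200x200 versus the current 200x64x64 box:
# roughly 9.8x more cells per step.
print(cell_count_ratio((200, 200, 200), (200, 64, 64)))
```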

New Box Comparison Plots [1, 200]

Now I've included chombos 150 and 200.

Why no sinks?!: Contour Plots

Continuing from my previous post titled "Why no sinks?!", today I played around with some contour plots in VisIt to see if they could be at all illuminating. The first picture is the 3D contour, looking down the barrel, comparable to the .bov file I attached in the previous post. The second image is a 2D slice from the same perspective.



So I think that this might actually be a physical density. Going to make some longer plots as soon as I can.

More Box Size Comparisons

Here I've made another visual illustrating box comparisons (min: 0.5, max: 55) for chombo files 0, 50, and 100. Now you can compare side-by-side more clearly.

Erica and I are hoping to decide what sizing we think works for the high res runs we're going to do. Similarly for fly-bys.

Why no sinks?!

Consider this image that Erica posted on her Low Res MHD Shear Flows page. Note how the Beta 1, Shear 60 simulation becomes densest at its semi-major axes before any of the other sets.


The other two Beta 1 cases (shear 15 and 30) see sinks begin to form at around 16.5 and 17.17 Myr respectively; however, the shear 60 case does not see sinks form at all. My question is: why don't we see sinks forming at those two regions?

Case   Sinks Form (~Myr)  ~Min   ~Max
B1S15  16.5               29     3168
B1S30  17.17              29.54  5053

As you can see in the picture above, by the end of the simulation the maximum is above 6000 and the minimum is ~24. The maximum is in the ballpark for sinks to potentially form; however, the minimum is not. Around 17 Myr, the minimum is ~28 and the max is ~3000.

  • Perhaps the density we observe here is physical. Maybe running this simulation longer would result in sinks popping up; perhaps it just takes them longer to form with a higher shear angle?
  • Perhaps the density we observe is not actually physical, but a result of the nature of the .bov projection. The higher the shear angle, the more elliptical the projected interface, and the more material there is for the projection to pick up at the semi-major axes. This might just be an area density the .bov is picking up on.
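The second bullet can be made quantitative: if the column integrates through a sheet tilted by the shear angle, the path length through it grows like 1/cos(theta), so a higher shear angle piles up more column at the semi-major axes even at the same physical density. A geometric sketch (a flat tilted slab, not the actual run geometry):

```python
import math

def path_length(thickness, shear_angle_deg):
    """Chord length through a slab of given thickness tilted by the shear angle."""
    return thickness / math.cos(math.radians(shear_angle_deg))

# Relative column through the interface layer at each shear angle:
# shear 60 doubles the path length compared to a face-on sheet.
for angle in (15, 30, 60):
    print(angle, round(path_length(1.0, angle), 3))
```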

Box Comparisons

Erica asked me to do some tests of the box sizes with the CollidingFlows problem. I did some side-by-sides for the ~/Hydro_control/Shear60/ set (x = 62.5, y = z = 75), as well as Erica's ~/Testing_Grid_Domain/Shear60 (beta = 0.0, and x = 200, y = z = 64). I also figured maybe we're missing some effects in z, so I increased Erica's test box to x = 200, y = z = 80. Here are the comparisons done in VisIt:

So the first two images seem to illustrate the same sort of physics; perhaps we can just see more of what is happening with y = z = 80. The last image (x = 62.5) has the characteristic "hydro" wisps and fingers that we would expect, whereas we don't really see these features in the others. Beta = 0 for all of them, and the physical boundary conditions are the same… not sure where these differences are coming from. Perhaps I'll play around more with some smaller sizes. Also, a fly-through will be coming to all of you soon~

New CDM Plots with Sink Particles

Here are some CDMs for the beta 10 data that I got done today, for all shear angles with the field in x. You'll notice that sink particles form toward the end of each gif (when we'd expect star formation).





  • CDM plots with sink particles for beta 1: 15, 30 in x, as well as hydro.
  • CDM plots for the other two shear angles to help complete the page Erica is creating.
  • Fly through movie for AstroBEAR YT channel with some of these x-field plots.

Update 2: Some Column Density Maps

Here are some plots for the CollidingFlows shear 15 data (beta 1, 10: x, y, z fields, and Hydro). Each is a column density map of a .bov file along its respective axis.

Beta 1, Shear 15




Beta 10, Shear 15




Hydro, Shear 15


I did the best I could to orient them appropriately without the triad VisIt produces (which is misleading, as it does not indicate the actual direction of the flows). The min/max is kept the same for each plot.
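For reference, the maps themselves are just sums of the density cube along one axis; a minimal numpy sketch (the cube below is random stand-in data, and a real column density would also need the physical cell size dx as the integration weight):

```python
import numpy as np

def column_density_map(rho, axis, dx=1.0):
    """Integrate a 3D density array along one axis: N = sum(rho) * dx."""
    return rho.sum(axis=axis) * dx

rng = np.random.default_rng(0)
rho = rng.random((8, 8, 8))          # stand-in for a chombo/bov density cube
cdm_x = column_density_map(rho, 0)   # map "down x"
print(cdm_x.shape)  # -> (8, 8)
```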

First Post (Update 1): Some Colliding Flows

Hi!~ so here is a link to my page

Previously I had experimented with making visualizations of Erica's CollidingFlows data on Bamboo during the semester. I created some slices of the Shear 15 data for Beta 1, 10, and Hydro. The most notable effect is that the stronger the magnetic field, the more it confines the fluid. This is expected, but as I post more in the next few days it'll become clearer for these data sets.

Here I will post some density (rho) plots pertaining to the Beta 1, Shear 15 set with a magnetic field in the x-direction. Each of these movies is a slice in the yz, yx, or xz plane.

min: 0.65, max: 250





  • Attempted to make a three-slice visualization with these slices. VisIt started crashing on hay. Need to talk to Rich sometime soon about VisIt issues.


  • Visualizations for density (Beta 10, and Hydro).
  • Make some velocity plots for these three data sets (Beta 1, 10 and Hydro).