I don't know how the Barnes-Hut data structure is meant to evolve over the course of the n-body simulation. My guess is that the positions and velocities will change, so the data structure would not be static. Depending on how advanced the implementation is, the whole thing might even be thrown away for each simulation step rather than attempting to exploit spatial and/or temporal coherence. So I don't know whether process exit is a useful leak-mitigation strategy or not.
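To make the lifetime question concrete, here is a minimal sketch of the rebuild-per-step pattern I have in mind. Everything in it is my own guesswork rather than anything from the talk, and the "tree" is degenerate (a single root holding the total mass and center of mass) just to keep it short:

    #include <stdlib.h>
    #include <stdio.h>

    /* Pared-down sketch of a rebuild-every-step lifetime; all names
     * are made up for illustration. The "tree" is degenerate: one
     * root node holding total mass and center of mass. */
    struct body { double x, y, vx, vy, m; };

    struct node {
        double cx, cy;  /* center of mass of this cell */
        double m;       /* total mass of this cell */
        /* a real octree would have child pointers here */
    };

    static struct node *build_tree(const struct body *b, int n)
    {
        struct node *root = calloc(1, sizeof(*root));
        if (!root)
            abort();
        for (int i = 0; i < n; i++) {
            root->cx += b[i].m * b[i].x;
            root->cy += b[i].m * b[i].y;
            root->m  += b[i].m;
        }
        root->cx /= root->m;
        root->cy /= root->m;
        return root;
    }

    static void free_tree(struct node *root)
    {
        free(root);  /* a real octree would recurse over children */
    }

    int main(void)
    {
        struct body bodies[2] = {
            { .x = 0.0, .y = 0.0, .m = 1.0 },
            { .x = 1.0, .y = 0.0, .m = 3.0 },
        };

        for (int step = 0; step < 10; step++) {
            struct node *tree = build_tree(bodies, 2);
            /* ...compute forces from the tree, integrate... */
            bodies[0].x += 0.01;  /* stand-in for integration */
            free_tree(tree);      /* thrown away every step */
        }
        printf("final x: %f\n", bodies[0].x);
        return 0;
    }

If the tree really is rebuilt and freed like this, a leak would accumulate once per step, so deferring cleanup to process exit would not help; it only helps if a single tree (or arena) lives for the whole run.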

re: Distribution: Fair enough. Once you have a specific distribution in mind, specialization is a valid approach to gaining better accuracy and performance. Yours is also a much better solution than my idea of breaking up the mass into constituent bodies! However, the goal would still be to produce something resembling the "worst case". My wild guess is that the uniform distribution is the best case in terms of the errors introduced by the approximate data structures, and I was trying to come up with something resembling the worst case that a general n-body solver with approximate data structures could be handed.
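As a sketch of what I mean (entirely my own construction, not from the talk, and the counts and jitter width are arbitrary): a uniform box versus a few tight clusters, on the guess that far-away clusters get lumped into a single center of mass and so concentrate the approximation error:

    #include <stdlib.h>
    #include <stdio.h>

    /* Sketch: generate a uniform box versus a few tight clusters,
     * as candidate best-case / worst-case inputs for an approximate
     * n-body solver. Purely illustrative guesswork. */

    static double uniform(double lo, double hi)
    {
        return lo + (hi - lo) * ((double)rand() / RAND_MAX);
    }

    int main(void)
    {
        enum { N = 1000, CLUSTERS = 4 };
        static double ux[N], uy[N];  /* uniform positions */
        static double cx[N], cy[N];  /* clustered positions */
        double kx[CLUSTERS], ky[CLUSTERS];

        srand(42);

        for (int i = 0; i < N; i++) {
            ux[i] = uniform(0.0, 1.0);
            uy[i] = uniform(0.0, 1.0);
        }

        /* A few widely separated cluster centers, each with a
         * tight blob of bodies around it. */
        for (int c = 0; c < CLUSTERS; c++) {
            kx[c] = uniform(0.0, 1.0);
            ky[c] = uniform(0.0, 1.0);
        }
        for (int i = 0; i < N; i++) {
            int c = i % CLUSTERS;
            cx[i] = kx[c] + uniform(-0.001, 0.001);
            cy[i] = ky[c] + uniform(-0.001, 0.001);
        }

        printf("uniform[0] = (%f, %f), clustered[0] = (%f, %f)\n",
               ux[0], uy[0], cx[0], cy[0]);
        return 0;
    }

The idea would then be to feed both sets to the solver and compare each against a direct O(n^2) sum to see which distribution produces the larger error.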

You mentioned the uniform n-body issue came up during the talk, so I'm curious how it was addressed there.

The xchg() idea is interesting. I believe I've seen it in the kernel before, but it's been so long that I only remember the technique and not where it was used. Performance-wise, I imagine it's a bit like the CAS solution, only without the overhead of a branch.
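To make the comparison concrete, here is a minimal sketch using C11 atomics rather than the kernel's xchg()/cmpxchg() macros; the list is just a stand-in workload, not anything from our discussion. (For the record, the kernel's llist_del_all() detaches a whole lock-free list with a single xchg(), while llist_add() is the CAS-loop counterpart, so that may be where I saw it.)

    #include <stdatomic.h>
    #include <stddef.h>
    #include <stdio.h>

    struct node { struct node *next; int val; };

    static _Atomic(struct node *) head;

    /* CAS-style push: retry loop, so there is a branch on the
     * success of each attempt (mirrors the shape of the kernel's
     * llist_add()). */
    static void push_cas(struct node *n)
    {
        struct node *old = atomic_load(&head);
        do {
            n->next = old;
        } while (!atomic_compare_exchange_weak(&head, &old, n));
    }

    /* xchg-style operation: detach the whole list in one
     * unconditional swap, with no retry loop and no
     * success/failure branch (mirrors the shape of the kernel's
     * llist_del_all()). */
    static struct node *take_all_xchg(void)
    {
        return atomic_exchange(&head, NULL);
    }

    int main(void)
    {
        struct node a = { .val = 1 }, b = { .val = 2 };

        push_cas(&a);
        push_cas(&b);

        for (struct node *n = take_all_xchg(); n; n = n->next)
            printf("%d\n", n->val);  /* prints 2 then 1 (LIFO) */
        return 0;
    }

On x86 both end up as lock-prefixed read-modify-write instructions, so the difference is indeed mostly that the CAS version has to loop (and branch) until it wins.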

