A common algorithm used in N-body gravitational calculations with large numbers of bodies (stars in a galaxy, galaxies in the universe, etc.) is the Barnes-Hut algorithm, which assembles the particles into an octree so that the gravitational forces exerted by distant regions can be approximated by bulk calculations. The complexity of this algorithm is O(N log N), compared with the O(N^2) of an exact direct calculation between all pairs of particles.

I'm trying to fully understand where the N log N comes from. I understand that, once the octree is assembled, the force calculation is O(N log N): you have to loop over each of the N particles, and each particle "sees" only about log N other particles or cells, because the octree lets distant groups of particles be treated as single aggregated masses, which cuts down the number of interactions that have to be computed.
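Concretely, the traversal I have in mind looks something like the sketch below (the Cell class, the force_on function and the opening angle theta = 0.5 are just illustrative assumptions of mine, not any particular implementation): a cell that looks small enough from the particle's position is absorbed in a single interaction instead of being opened.

    import math

    class Cell:
        """Illustrative cell: total mass, centre of mass, cube side, optional children."""
        def __init__(self, mass, com, size, children=None):
            self.mass, self.com, self.size, self.children = mass, com, size, children

    def force_on(p, node, theta=0.5):
        # Vector from the particle at p to the cell's centre of mass.
        dx, dy, dz = (node.com[i] - p[i] for i in range(3))
        dist = math.sqrt(dx * dx + dy * dy + dz * dz) or 1e-12
        if node.children is None or node.size / dist < theta:
            # Leaf, or a cell that looks small from here: one interaction in total.
            f = node.mass / dist ** 3        # G and the test particle's mass omitted
            return (f * dx, f * dy, f * dz)
        # Otherwise open the cell and recurse into its occupied children.
        parts = [force_on(p, c, theta) for c in node.children if c.mass > 0]
        return tuple(sum(axis) for axis in zip(*parts))

    # Two clumps: the nearby one is resolved, the distant one costs a single interaction.
    near = Cell(1.0, (0.1, 0.1, 0.1), 0.2)
    far = Cell(50.0, (9.0, 9.0, 9.0), 0.2)
    root = Cell(51.0, (8.83, 8.83, 8.83), 10.0, children=[near, far])
    print(force_on((0.0, 0.0, 0.0), root))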

What I'm still trying to understand is the complexity of assembling the octree in the first place. Is it also N log N? N because you need to insert each particle, and log N because that's roughly how deep you have to descend (up to a constant factor) before the particle reaches an isolated leaf of the tree?
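For the assembly step, I'm picturing something like this minimal insertion sketch (again just illustrative; the Node class and the unit-cube domain are assumptions of mine): every insertion descends one level per step until the new particle ends up in a leaf of its own, and for reasonably spread-out particles that descent is roughly log N levels deep.

    import random

    class Node:
        def __init__(self, center, half):
            self.center = center      # (x, y, z) of the cube's centre
            self.half = half          # half the cube's side length
            self.point = None         # a leaf stores at most one particle
            self.children = None      # 8 sub-cubes once the node is split

        def child_index(self, p):
            # Pick one of the 8 octants depending on which side of the centre p lies.
            return ((p[0] > self.center[0]) | ((p[1] > self.center[1]) << 1)
                    | ((p[2] > self.center[2]) << 2))

        def split(self):
            h = self.half / 2
            self.children = []
            for i in range(8):
                c = tuple(self.center[k] + (h if (i >> k) & 1 else -h) for k in range(3))
                self.children.append(Node(c, h))

        def insert(self, p):
            if self.children is None and self.point is None:
                self.point = p                    # empty leaf: store the particle here
                return
            if self.children is None:             # occupied leaf: split and push down
                self.split()
                old, self.point = self.point, None
                self.children[self.child_index(old)].insert(old)
            self.children[self.child_index(p)].insert(p)

    root = Node((0.5, 0.5, 0.5), 0.5)
    for _ in range(10_000):
        root.insert((random.random(), random.random(), random.random()))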

1 Answer

Yes, creating a tree is typically O(n log n). This holds for all kinds of trees, regardless of the number of child nodes per node, as long as they are somewhat balanced. What changes is the base of the logarithm, e.g. log2 for a binary tree versus log8 for an octree, but since log8 n = log2 n / 3 the base only changes the result by a constant factor and can be ignored for asymptotic complexity.
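For a quick feel for why the base doesn't matter, here is a small comparison (plain Python, purely illustrative): the expected depth of a balanced octree is just the balanced-binary-tree depth divided by three.

    import math

    # Expected depth of a balanced tree over n items: log8(n) for an octree is
    # exactly log2(n) / 3, i.e. the same growth up to a constant factor.
    for n in (1_000, 1_000_000, 1_000_000_000):
        print(n, round(math.log2(n), 1), round(math.log(n, 8), 1))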

Some trees use tree rotations upon insertion to maintain balance. Alternatively, we can insert the nodes in an order that guarantees balance (e.g. by sorting all nodes and combining them bottom-up), or shuffle all nodes so that they are inserted in random order (often good enough in practice). Note that sorting is itself an O(n log n) operation, so it adds nothing to the asymptotic cost of building the tree.
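The sort-and-combine idea looks roughly like this for a plain binary search tree (an illustrative sketch, not specific to Barnes-Hut): the middle element of each sorted slice becomes the root of its subtree, so the result is balanced by construction.

    # "Sort, then combine bottom-up": the middle element of each sorted slice
    # becomes the root of that subtree, so the tree is balanced by construction
    # and the whole build is O(n log n) including the sort.
    def build_balanced(sorted_keys):
        if not sorted_keys:
            return None
        mid = len(sorted_keys) // 2
        return {
            "key": sorted_keys[mid],
            "left": build_balanced(sorted_keys[:mid]),
            "right": build_balanced(sorted_keys[mid + 1:]),
        }

    tree = build_balanced(sorted([5, 3, 8, 1, 9, 2, 7]))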
