Tags: artificial-intelligence, path-finding, a-star

A* Pathfinding, Calculating G cost


I'm having some difficulty understanding how to consistently calculate a correct G cost for the implementation of A* Pathfinding. As I understand it, it's the cost of moving from the starting node to the current node, but what I don't fully understand is how to find the value that is used to increment the G cost. I've seen examples that use numbers like 10 and 14, but are these arbitrary and does it depend upon the implementation?

When I was developing a 2D game, it seemed like I almost had to find a "sweet spot" for the G cost (which I should note seemed to be close to the width of the floor tiles that were being used as nodes) before I was consistently finding the shortest paths.

Any clarification on the matter would be great.


Solution

  • You define them. The amount you add to G when you "walk a step" is how you tell the algorithm which paths you actually prefer (H is only an admissible approximation of the sum of those G-increments; its job is just to help the search find what you wanted faster). Using 10 and 14 is an integer approximation of 1 and sqrt(2): roughly what Euclidean distance becomes when you are constrained to the Moore neighborhood at every step. This is sometimes called "diagonal distance" or "octile distance" (though, strictly, the latter term is reserved for using an accurate sqrt(2)). So the choice does not come out of nowhere; the first sketch below shows how these constants are typically used.

    But it is up to you: choosing different costs makes A* prefer (or avoid) certain paths. For example, if you make the diagonal cost equal to the "straight" cost, it will strongly favor diagonal moves and will not necessarily avoid zigzagging back and forth, because the zigzag is free as long as each zig heads the right way (a path shaped like /\ is the same length as --). Using a high diagonal cost (more than twice the straight cost) makes it find paths that mostly look like you used a von Neumann neighborhood, except that in "emergencies" it can still move diagonally. Between 1x and 2x the straight cost the difference is much less pronounced, but it shows up around obstacles sometimes. The second sketch below makes the diagonal cost a parameter so you can experiment with this.

    So diagonal costs lower than sqrt(2) tend to produce "strange" paths that needlessly zigzag, while diagonal costs higher than sqrt(2) tend to produce "dumb" paths that fail to take diagonal shortcuts. But you may want that, especially if it matches the "actual cost" (if you have one), such as the walking time taken by your units. Then again, in one of my own games I intentionally made the diagonal cost higher than the walking time alone would suggest, to make paths look more natural (they were too zig-zaggy otherwise).

    [Image: paths with different diagonal costs]

    Yellow is "explored". The bottom path takes a detour with diagonal cost 10 purely due to implementation details: neighbors are added clockwise starting at NW, and the open list uses stable insertion instead of something clever like a heap, so nodes with equal F are tie-broken by whichever was added first. The last sketch below shows one way to make that tie-breaking explicit.
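To make the 10/14 business concrete, here is a minimal Python sketch of the step cost and a matching admissible heuristic. The names (`STRAIGHT_COST`, `diagonal_distance`, and so on) are my own, not from any particular engine:

```python
# Integer step costs commonly used on 8-connected grids:
# 10 for a straight move, 14 for a diagonal move (~10 * sqrt(2) = 14.14).
STRAIGHT_COST = 10
DIAGONAL_COST = 14

def step_cost(a, b):
    """G-increment for moving between two adjacent cells (x, y)."""
    dx, dy = abs(a[0] - b[0]), abs(a[1] - b[1])
    return DIAGONAL_COST if dx == 1 and dy == 1 else STRAIGHT_COST

def diagonal_distance(a, b):
    """Admissible H for these step costs ("diagonal" distance):
    take as many diagonal steps as possible, then go straight."""
    dx, dy = abs(a[0] - b[0]), abs(a[1] - b[1])
    return DIAGONAL_COST * min(dx, dy) + STRAIGHT_COST * abs(dx - dy)
```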
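And here is a rough, self-contained A* sketch (an illustration only, with function and variable names of my own choosing, not how your game must do it) where the diagonal cost is a parameter, so you can rerun the same map with 10, 14 or 25 and watch the path shape change as described above:

```python
import heapq

def astar(grid, start, goal, straight_cost=10, diagonal_cost=14):
    """A* on an 8-connected grid; grid[y][x] truthy means blocked."""
    rows, cols = len(grid), len(grid[0])

    def heuristic(a, b):
        # Cap the diagonal term at two straight steps so H stays
        # admissible even when diagonal_cost > 2 * straight_cost.
        dx, dy = abs(a[0] - b[0]), abs(a[1] - b[1])
        diag = min(diagonal_cost, 2 * straight_cost)
        return diag * min(dx, dy) + straight_cost * abs(dx - dy)

    open_heap = [(heuristic(start, goal), 0, start)]  # (F, G, node)
    came_from = {}
    best_g = {start: 0}

    while open_heap:
        f, g, node = heapq.heappop(open_heap)
        if node == goal:
            path = [node]
            while node in came_from:
                node = came_from[node]
                path.append(node)
            return path[::-1]
        if g > best_g.get(node, float("inf")):
            continue  # stale heap entry
        x, y = node
        for dx in (-1, 0, 1):
            for dy in (-1, 0, 1):
                if dx == 0 and dy == 0:
                    continue
                nx, ny = x + dx, y + dy
                if not (0 <= nx < cols and 0 <= ny < rows) or grid[ny][nx]:
                    continue
                ng = g + (diagonal_cost if dx and dy else straight_cost)
                if ng < best_g.get((nx, ny), float("inf")):
                    best_g[(nx, ny)] = ng
                    came_from[(nx, ny)] = node
                    heapq.heappush(
                        open_heap,
                        (ng + heuristic((nx, ny), goal), ng, (nx, ny)))
    return None

# Same empty map, different diagonal costs -> different-looking paths.
grid = [[0] * 10 for _ in range(10)]
for cost in (10, 14, 25):
    print(cost, astar(grid, (0, 0), (9, 5), diagonal_cost=cost))
```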
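Finally, on the tie-breaking: if you replace the stable open list with a heap, you can reproduce (or deliberately change) the "first added wins" behaviour among equal-F nodes by putting an insertion counter into the heap entries. A tiny sketch, with a made-up `push` helper just for illustration:

```python
import heapq
import itertools

counter = itertools.count()
open_heap = []

def push(f, node):
    # (F, counter, node): ties on F break on insertion order, which is
    # exactly what a stable insert into a plain list does implicitly.
    heapq.heappush(open_heap, (f, next(counter), node))

push(24, "A")
push(24, "B")                    # same F, added later
print(heapq.heappop(open_heap))  # (24, 0, 'A') -- first one added wins
```

Negating the counter (or breaking ties on lower H instead) changes which of the equally short paths the search happens to return, which is why two correct A* implementations can draw visibly different routes on the same map.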