Tags: algorithm, graph, breadth-first-search

Why is the knight's shortest path algorithm's time complexity O(m*n) with breadth-first search?


Question: You are given the positions of two knight pieces on an infinite chessboard. Write a function that returns the minimum number of turns required before one of the knights can capture the other, assuming the knights are working together toward this goal. The position of each knight is given as (x, y). A knight can make 8 possible moves: (-2, 1), (-1, 2), (1, 2), (2, 1), (2, -1), (1, -2), (-1, -2), (-2, -1). A knight captures the other knight by moving onto the square currently occupied by it.

Algorithm: We basically treat this as a graph problem and apply breadth-first search. Using a queue, we start at the initial position, add its eight possible moves to the queue, and mark them as visited. In subsequent steps we do the same: dequeue an element and add its unvisited possible moves to the queue. The time complexity stated on a third-party site is O(m*n).

So this algorithm supposedly runs in O(mn), where m is the horizontal distance between the target and starting positions and n is the vertical distance. I understand that breadth-first search will find the shortest path. What I don't understand is the time complexity: I really don't see why it should be O(mn). On a bounded board it would be easy to see, because at worst we would visit every square. But on an infinite board, aren't there many moves far away from the target position whose eight potential moves we keep adding to the queue? I'd think that would be substantial, but apparently I am wrong. Can you explain why, on an infinite chessboard, checking all possible moves gives a time complexity of O(m*n) rather than O(8^s), where s is the minimum number of steps required to reach the target? I know that in graph problems the time complexity is O(V + E), but aren't we ignoring a really big group of vertices and their edges?

The algorithm works fine I just don't understand the time complexity.

function knightPath(knightA, knightB) {
  const possibleMoves = [
    [-2, 1],
    [-1, 2],
    [1, 2],
    [2, 1],
    [2, -1],
    [1, -2],
    [-1, -2],
    [-2, -1],
  ];

  const queue = [[knightA[0], knightA[1], 0]];
  // Wrap the string in an array: Set(iterable) would otherwise store each character
  const visited = new Set([positionToString(knightA)]);

  while(true) {
    const currentPosition = queue.shift();

    if (currentPosition[0] === knightB[0] && currentPosition[1] === knightB[1]) {
      return Math.ceil(currentPosition[2] / 2);
    }

    for (const possibleMove of possibleMoves) {
      const position = [currentPosition[0] + possibleMove[0], currentPosition[1] + possibleMove[1]];
      const positionString = positionToString(position);
      if (!visited.has(positionString)) {
        position.push(currentPosition[2] + 1);
        queue.push(position);
        visited.add(positionString);
      }
    }
  }
}

function positionToString(position) {
  return position.join(',');
}
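For experimentation, here is a condensed, self-contained sketch of the same approach (single-knight BFS, with the result halved because both knights move toward each other, as the question describes). The index-based dequeue is a small substitution for `shift()` so each dequeue is O(1):

```javascript
// Condensed single-knight BFS; the two cooperating knights split the distance.
function knightPath(knightA, knightB) {
  const moves = [[-2, 1], [-1, 2], [1, 2], [2, 1], [2, -1], [1, -2], [-1, -2], [-2, -1]];
  const queue = [[knightA[0], knightA[1], 0]];
  const visited = new Set([knightA.join(',')]); // the string goes inside an array

  for (let i = 0; i < queue.length; i++) {      // index-based dequeue, O(1) per pop
    const [x, y, steps] = queue[i];
    if (x === knightB[0] && y === knightB[1]) return Math.ceil(steps / 2);
    for (const [dx, dy] of moves) {
      const key = `${x + dx},${y + dy}`;
      if (!visited.has(key)) {
        visited.add(key);
        queue.push([x + dx, y + dy, steps + 1]);
      }
    }
  }
}

console.log(knightPath([0, 0], [4, 4])); // single-knight distance 4 → 2 turns
console.log(knightPath([0, 0], [0, 0])); // already on the same square → 0
```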

Solution

  • The correct time complexity for the BFS search is O(m^2 + n^2), or equivalently O(max(m, n)^2).

    One way to prove this is to first show that the shortest path has O(m + n) steps. It takes roughly m/2 + n/2 steps to get within a 3x3 square of the target by moving "directly", and then only a constant number of steps to cover the last few squares.

    So your BFS will find the target after examining all paths of length O(m + n). Let f(m, n) be the actual length. All nodes visited by paths of length <= f(m, n) are within a distance of 2·f(m, n) of the start, since each move changes each coordinate by at most 2.

    There are only O(f(m, n)^2) squares within that distance. Since the BFS takes time proportional to the number of squares visited, the complexity is O(f(m, n)^2) = O((m + n)^2) = O(m^2 + n^2).
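As an empirical illustration of that bound (not a proof), a hypothetical helper `bfsStats` can count how many squares a single-knight BFS actually enqueues before reaching the target. Every enqueued square lies within Chebyshev distance 2·(steps + 1) of the start, so the count should never exceed (4·steps + 5)^2:

```javascript
// Single-knight BFS from (0, 0) to (tx, ty); returns the path length and
// how many squares were enqueued (i.e. inserted into the visited set).
function bfsStats(tx, ty) {
  const moves = [[-2, 1], [-1, 2], [1, 2], [2, 1], [2, -1], [1, -2], [-1, -2], [-2, -1]];
  const queue = [[0, 0, 0]];
  const visited = new Set(['0,0']);

  for (let i = 0; i < queue.length; i++) {
    const [x, y, steps] = queue[i];
    if (x === tx && y === ty) return { steps, visitedCount: visited.size };
    for (const [dx, dy] of moves) {
      const key = `${x + dx},${y + dy}`;
      if (!visited.has(key)) {
        visited.add(key);
        queue.push([x + dx, y + dy, steps + 1]);
      }
    }
  }
}

// Compare the actual visited count against the quadratic bound (4*steps + 5)^2;
// an O(8^s) search would blow past any such polynomial envelope.
for (const [tx, ty] of [[4, 4], [10, 3], [20, 20], [40, 7]]) {
  const { steps, visitedCount } = bfsStats(tx, ty);
  console.log(tx, ty, steps, visitedCount, (4 * steps + 5) ** 2);
}
```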