I came up with a programming problem that goes something like this:
suppose you have a NxN binary grid. “Tapping” a square toggles it and all edge-adjacent squares. Find the shortest sequence of “taps” that converts the grid to all 1s.
The easiest algorithm to see is a BFS: start at the grid of all 1s, and repeatedly visit each board that is one "tap" away from the current board, taking care never to visit the same board twice. However, since there are 2^(N^2) board states, it's a pretty inefficient algorithm.
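For concreteness, here is a minimal sketch of that BFS (boards are stored as tuples of tuples so they can be used as dictionary keys; only usable for small N):

```python
from collections import deque

def bfs_solve(matrix):
    """Return the shortest list of (row, col) taps that turns the board
    into all 1s, or None if the board is unsolvable.
    Visits up to 2**(n*n) states, so this is only feasible for small n."""
    n = len(matrix)
    start = tuple(tuple(row) for row in matrix)
    goal = tuple(tuple(1 for _ in range(n)) for _ in range(n))
    previous = {start: None}  # board -> (parent board, tap that led here)
    queue = deque([start])
    while queue:
        board = queue.popleft()
        if board == goal:
            # Walk back to the start to reconstruct the tap sequence
            taps = []
            while previous[board] is not None:
                board, tap = previous[board]
                taps.append(tap)
            return taps[::-1]
        for r in range(n):
            for c in range(n):
                # Tap (r, c): toggle it and its edge neighbors
                nxt = [list(row) for row in board]
                for rr, cc in ((r, c), (r-1, c), (r+1, c), (r, c-1), (r, c+1)):
                    if 0 <= rr < n and 0 <= cc < n:
                        nxt[rr][cc] ^= 1
                nxt = tuple(tuple(row) for row in nxt)
                if nxt not in previous:
                    previous[nxt] = (board, (r, c))
                    queue.append(nxt)
    return None
```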
I was able to find a polynomial time (O(n^2)) algorithm for N <= 3, but it breaks down for N>=4, since it relies on the fact that each board is solvable.
Is there a polynomial time algorithm for N>=4?
Thanks!
Edit: Here is an outline of my algorithm for N = 3 (it also applies to smaller N):
1. Let A->B denote tapping every 1-cell of board B onto board A, and let 1 denote the NxN grid of all 1s.
2. Note that 1->(1->(1->(1->A))) = A for all boards A with N <= 3. To my knowledge, there's no intuitive proof of this; you just have to expand it out using XOR and it eventually resolves.
3. Let B = 1->(1->(1->A)). By (2), 1->B = A; tapping B onto both sides gives 1->B->B = A->B, and the left side is 1 because tapping the same cells twice cancels out. So A->B = 1, i.e. B solves A.
4. Since every board is reachable, and there are exactly as many board states as tap sets, B is the unique solution, so it is optimal.
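The identity 1->(1->(1->(1->A))) = A, and the fact that B = 1->(1->(1->A)) solves A, can both be checked by brute force over all 512 boards for N = 3 (a quick sketch, using an `arrow(a, b)` helper for the A->B operation described above):

```python
from itertools import product

N = 3

def tap(board, r, c):
    # Toggle (r, c) and its edge neighbors; boards are tuples of tuples.
    b = [list(row) for row in board]
    for rr, cc in ((r, c), (r-1, c), (r+1, c), (r, c-1), (r, c+1)):
        if 0 <= rr < N and 0 <= cc < N:
            b[rr][cc] ^= 1
    return tuple(tuple(row) for row in b)

def arrow(a, b):
    # "A->B": tap every 1-cell of b onto board a.
    for r in range(N):
        for c in range(N):
            if b[r][c]:
                a = tap(a, r, c)
    return a

ones = tuple((1,) * N for _ in range(N))

for bits in product((0, 1), repeat=N * N):
    a = tuple(bits[r * N:(r + 1) * N] for r in range(N))
    # The identity: four applications of 1->(...) give back A
    assert arrow(ones, arrow(ones, arrow(ones, arrow(ones, a)))) == a
    # B = 1->(1->(1->A)) indeed turns A into all 1s
    b = arrow(ones, arrow(ones, arrow(ones, a)))
    assert arrow(a, b) == ones
print("verified")
```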
It should work with BFS. A proper BFS algorithm does not rely on there being a solution: when there is none, it simply exhausts its main loop without returning a result.
Some observations:
It is never necessary to tap the same cell more than once: tapping a cell twice is the same as not tapping it at all, since each tap is an XOR operation.
The order of the taps is not important: tapping the same cells in a different order has the same effect.
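Both observations are easy to confirm on a tiny example (a throwaway sketch with a hypothetical `tap` helper, not part of the solver below):

```python
def tap(board, r, c):
    # Toggle (r, c) and its edge neighbors, returning a new board.
    n = len(board)
    b = [row[:] for row in board]
    for rr, cc in ((r, c), (r-1, c), (r+1, c), (r, c-1), (r, c+1)):
        if 0 <= rr < n and 0 <= cc < n:
            b[rr][cc] ^= 1
    return b

board = [[0, 1, 0], [1, 1, 0], [0, 0, 1]]
# Tapping the same cell twice is a no-op (XOR):
assert tap(tap(board, 1, 1), 1, 1) == board
# The order of taps does not matter:
assert tap(tap(board, 0, 0), 2, 2) == tap(tap(board, 2, 2), 0, 0)
```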
You can improve a bit on BFS by implementing the A* search algorithm. For the realised cost you could count 5 per performed tap (to represent the maximum number of flipped cells), and for an admissible heuristic you could count the number of cells that are still 0. Note that an A* search uses a priority queue.
Working with bit representations and using bit operations like AND, OR and XOR, may speed up an implementation, although it doesn't influence the time complexity.
When there is a 0 cell in the matrix, it is certain that this cell or one of its edge-neighbors needs to be tapped. So it makes sense to select a single one of the zeroes and consider its (at most) 5 candidate taps as the "next" moves.
Here is an implementation in Python with some comments to explain details:
# Import heapq as the priority queue implementation
from heapq import heappush, heappop

# Define conversion functions between matrix and bitmap representations

def matrix_to_int(matrix):
    n = len(matrix)
    # Foresee an extra column:
    width = n + 1
    result = 0
    # The board mask represents the bits that are not in the extra column
    boardmask = 0
    rowmask = (1 << n) - 1  # A mask with n times a 1-bit
    for row in matrix:
        # Add an extra bit after each row: a boundary
        result = result * 2
        boardmask = (boardmask << width) + rowmask
        for bit in row:
            # Store the opposite bit (0 <--> 1)
            result = result * 2 + (1 - bit)
    return boardmask, result

def int_to_matrix(n, pattern):
    width = n + 1
    bit = 1 << (width * n - 2)
    matrix = []
    for _ in range(n):
        row = []
        for _ in range(n):
            row.append(int(pattern & bit > 0))
            bit >>= 1
        bit >>= 1  # Skip boundary bit
        matrix.append(row)
    return matrix

def solve(matrix):
    n = len(matrix)
    width = n + 1
    # Define a bit pattern of a cross shape (+), having 5 1-bits:
    crosspattern = 2 | (7 << width) | (2 << (width * 2))
    # This cross has its center location at (1, 1) instead of (0, 0),
    # so define a shift
    crossshift = width + 1
    # Convert the input to a single integer (bit pattern) where
    # 1 and 0 are swapped, so we now look to clear(!) this pattern
    boardmask, pattern = matrix_to_int(matrix)
    availabletaps = boardmask
    # The admissible heuristic is the number of 1 bits (that we want as 0)
    # (int.bit_count requires Python 3.10+)
    distance = pattern.bit_count()
    # Each heap entry will have 4 elements:
    # - estimated total cost
    # - realised cost (by previous taps)
    # - the current state of the matrix (as a bit pattern)
    # - the set of positions that can be tapped (as a bit pattern)
    heap = [(distance, 0, pattern, availabletaps)]
    while heap:
        allcost, pastcost, pattern, availabletaps = heappop(heap)
        if pattern == 0:  # Solved!
            return int_to_matrix(n, availabletaps ^ boardmask)
        # Foresee the cost of the next tap:
        # Use a cost of 5 to make sure the heuristic is admissible.
        pastcost += 5
        # Get the first 1 bit in the pattern
        firstbit = pattern & -pattern
        # Get the taps that will set this firstbit to 0
        for dir in (-width, -1, 0, 1, width):
            tapbit = firstbit << dir if dir >= 0 else firstbit >> -dir
            if tapbit & availabletaps == 0:
                continue  # Cannot/shouldn't tap here
            # Tap at tapbit: move the cross pattern to this bit...
            crossbits = (tapbit * crosspattern) >> crossshift
            # ...and apply it to the board, discarding the bits outside it
            nextpattern = (pattern ^ crossbits) & boardmask
            # Heuristic for future cost: number of 1 bits in the pattern
            distance = nextpattern.bit_count()
            heappush(heap, (pastcost + distance, pastcost,
                            nextpattern, availabletaps - tapbit))
Here is an example run on a 10x10 matrix:
matrix = [
[1, 0, 0, 0, 0, 1, 0, 1, 1, 1],
[1, 0, 0, 1, 1, 1, 0, 0, 1, 1],
[0, 1, 0, 0, 1, 0, 1, 1, 0, 0],
[0, 1, 1, 0, 0, 0, 1, 0, 1, 1],
[1, 1, 1, 1, 0, 0, 1, 1, 0, 0],
[1, 0, 0, 0, 1, 0, 1, 0, 1, 1],
[0, 0, 0, 0, 0, 0, 1, 0, 0, 1],
[1, 0, 1, 0, 1, 1, 1, 0, 1, 0],
[1, 0, 0, 0, 1, 0, 0, 1, 1, 0],
[0, 0, 0, 0, 1, 0, 1, 0, 0, 1]
]
taps = solve(matrix)
The taps matrix will represent the cells that need to be tapped (in any order):
[
[0, 0, 0, 1, 0, 0, 0, 0, 0, 0],
[0, 1, 0, 0, 0, 0, 1, 0, 0, 0],
[1, 0, 0, 1, 0, 1, 0, 0, 0, 0],
[0, 0, 0, 0, 0, 0, 0, 0, 1, 1],
[0, 0, 0, 0, 1, 0, 0, 0, 0, 0],
[0, 0, 0, 1, 0, 0, 0, 0, 0, 0],
[0, 1, 0, 0, 0, 1, 0, 1, 0, 0],
[0, 0, 0, 0, 0, 0, 0, 0, 0, 0],
[0, 0, 0, 1, 0, 1, 0, 0, 0, 1],
[0, 1, 0, 0, 0, 0, 0, 0, 1, 0]
]
As the heuristic function is admissible, the returned solution is guaranteed to use the minimum number of taps.
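To sanity-check a returned taps matrix, one can apply the marked taps to the input and confirm the board ends up all 1s. A small helper (hypothetical, not part of the solver above):

```python
def apply_taps(matrix, taps):
    """Apply every tap marked with a 1 in `taps` and return the resulting board."""
    n = len(matrix)
    board = [row[:] for row in matrix]
    for r in range(n):
        for c in range(n):
            if taps[r][c]:
                # Toggle the tapped cell and its edge neighbors
                for rr, cc in ((r, c), (r-1, c), (r+1, c), (r, c-1), (r, c+1)):
                    if 0 <= rr < n and 0 <= cc < n:
                        board[rr][cc] ^= 1
    return board

# A 2x2 example: tapping (0, 0) on [[0, 0], [0, 1]] yields all 1s.
assert apply_taps([[0, 0], [0, 1]], [[1, 0], [0, 0]]) == [[1, 1], [1, 1]]
```

For the 10x10 example above, `apply_taps(matrix, taps)` should return the all-1s matrix.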