I have a grid as a tuple of tuples of integers (1/0), plus a row number and a column number identifying a cell. I have to count how many of that cell's neighbouring cells contain a 1.
It's a task from www.checkio.org, an interesting site for learning Python.
Here is my code:
def count_neighbours(grid, row, col):
    grid = ()
    count = 0
    for pos in ((row - 1, col), (row + 1, col), (row, col - 1), (row, col + 1),
                (row - 1, col - 1), (row - 1, col + 1), (row + 1, col - 1), (row + 1, col + 1)):
        if pos == 1:
            count += 1
    return count
The system tells me that there are no neighbours near the chosen cell. Please explain to me what's wrong, and thank you for your attention!
I see two obvious mistakes:

- You replaced the grid with an empty tuple (grid = ()).
- Your code doesn't reference the grid variable at all; you just add 1 to count if pos is equal to 1. pos will never be equal to 1, because you are setting it to one of a series of tuples.

Ergo, your function will always return 0 as long as row and col are numeric (and raises an exception otherwise).
You need to actually reference the grid that is passed in:
def count_neighbours(grid, row, col):
    count = 0
    for pos in (
            (row - 1, col), (row + 1, col), (row, col - 1),
            (row, col + 1), (row - 1, col - 1), (row - 1, col + 1),
            (row + 1, col - 1), (row + 1, col + 1)):
        if grid[pos[0]][pos[1]] == 1:
            count += 1
    return count
I'm assuming here that the grid is a sequence of sequences (here, a tuple of tuples) representing rows of cells.
Next, you'll have to handle your positions going out of bounds; there are no neighbours above the first row, for example. The lower-bound check matters too: in Python, a negative index like grid[-1] silently wraps around to the last row instead of raising an error, which would give you wrong counts at the top and left edges.
def count_neighbours(grid, row, col):
    count = 0
    for x, y in (
            (row - 1, col), (row + 1, col), (row, col - 1),
            (row, col + 1), (row - 1, col - 1), (row - 1, col + 1),
            (row + 1, col - 1), (row + 1, col + 1)):
        if not (0 <= x < len(grid) and 0 <= y < len(grid[x])):
            # out of bounds
            continue
        if grid[x][y] == 1:
            count += 1
    return count
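As a quick sanity check, here is the final function run against a small sample grid (the 3×3 grid below is my own example, not from the task; the function is repeated so the snippet runs on its own):

```python
def count_neighbours(grid, row, col):
    count = 0
    for x, y in (
            (row - 1, col), (row + 1, col), (row, col - 1),
            (row, col + 1), (row - 1, col - 1), (row - 1, col + 1),
            (row + 1, col - 1), (row + 1, col + 1)):
        if not (0 <= x < len(grid) and 0 <= y < len(grid[x])):
            continue  # out of bounds
        if grid[x][y] == 1:
            count += 1
    return count

grid = (
    (1, 0, 1),
    (0, 1, 0),
    (1, 0, 1),
)

print(count_neighbours(grid, 1, 1))  # centre cell: the four diagonal 1s -> 4
print(count_neighbours(grid, 0, 0))  # corner cell: only (1, 1) holds a 1 -> 1
```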