As a basic check on hundreds of ASCII files, I'd like to verify that each has the correct number of rows and columns. The first 6 lines of each file are header lines and aren't part of the 23x23 matrix that each contains. I've tried various ways of reading the matrix size (turned into comment lines below), but now I'm thinking I need a way to read the arrays other than arcpy's ListTables. I'd also be open to using pandas. Any ideas? Thanks.
import arcpy, numpy
from arcpy import env

env.workspace = r"C:\VMshared\small_example_valley5\SDepth1"
for file in arcpy.ListTables():
    #numpy.loadtxt(file,dtype = float, "#", delimiter = ' ', "#", skiprows = '6')
    # skiprows must be an int, not the string '6'
    outfile = numpy.loadtxt(file, skiprows=6)
    print numpy.shape(outfile)
    #print enumerate(file)
    #print len(file) + len(file.T)
    #print len(file) + map(len,file)
I think numpy alone can do this:
for file in arcpy.ListTables():
    # skip the 6 header lines; the default delimiter splits on any whitespace,
    # which is safer than delimiter=" " if columns are separated by multiple spaces
    outfile = numpy.loadtxt(file, skiprows=6)
    if outfile.shape != (23, 23):
        print file + " has an incorrect number of rows or columns"
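Since you mentioned pandas, here's a minimal sketch that skips arcpy entirely and globs the files from disk instead. The folder path and the `*.txt` extension are assumptions; adjust them to match your workspace.

```python
import glob
import os

import pandas as pd

# Assumed location and extension of the ASCII files -- change as needed
folder = r"C:\VMshared\small_example_valley5\SDepth1"

for path in glob.glob(os.path.join(folder, "*.txt")):
    # skiprows=6 drops the header; sep=r"\s+" splits on any run of whitespace
    df = pd.read_csv(path, skiprows=6, sep=r"\s+", header=None)
    if df.shape != (23, 23):
        print(path + " has an incorrect number of rows or columns")
```

`df.shape` gives (rows, columns) just like `numpy.shape`, so the check is the same; the only difference is how the files are found and parsed.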