In bash, I'd like to test a number of directories for filenames that contain characters outside of what I have whitelisted:
[a-z]
[A-Z]
[0-9]
[+-_ßäöüÄÖÜ.,]
[ ]
(i.e. one space is OK, two or more in a row would not be).
Trying
$ ls my/dir/ |grep --color=always -v [a][b][c][d][e][f][g][h][i][j][k][l][m][n][o][p][q][r][s][t][u][v][w][x][y][z][A][B][C][D][E][F][G][H][I][J][K][L][M][N][O][P][Q][R][S][T][U][V][W][X][Y][Z][0][1][2][3][4][5][6][7][8][9]
yields all files, even the one containing a # character for testing.
So does the "shorthand" of
$ ls my/dir/ |grep --color=always -v [a-zA-Z0-9]
(and yes, I know I'd have to include the [+-_ßäöüÄÖÜ., ] characters in this later).
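If I understand the problem, the pattern needs to be quoted (so the shell can't expand it) and the class negated inside grep rather than inverted with -v. A rough sketch of what I mean (the class contents are assumed from my whitelist above, with - moved to the end so it isn't taken as a range, plus an alternative for two consecutive spaces):
$ ls my/dir/ | grep --color=always -E '[^a-zA-Z0-9+_ßäöüÄÖÜ., -]|  '
But I'm not sure this is the right way either.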
I have tried using diff for this:
$ diff -y <(ls -1 my/dir/) <(ls -1 my/dir/|tr -cd '[a-zA-Z0-9\n\r \-\.\,]')|grep --color=never '|'|cut -d "|" -f 1
which outputs the filenames that fall outside of my tr list:
A File Containing Some Rainbo#ws.wav
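Since I actually need to run this over several directories, I end up wrapping it like this (the directory names are just placeholders):
for d in my/dir some/other/dir; do
  echo "== $d =="
  # same diff/tr comparison as above, once per directory
  diff -y <(ls -1 "$d") <(ls -1 "$d" | tr -cd '[a-zA-Z0-9\n\r \-\.\,]') \
    | grep --color=never '|' | cut -d '|' -f 1
done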
But this seems rather clumsy... Any ideas for something, well, nicer?
$ find -name '*[^a-zA-Z0-9+\-_ßäöüÄÖÜ., ]*' -o -name '*  *'
See man find. The -regex and -maxdepth options might also be interesting to you.
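For example, something along these lines (a rough sketch, assuming GNU find and the whitelist from the question; -regex matches the whole path, so / has to be allowed in the class, and -maxdepth 1 keeps find from descending into subdirectories):
$ find my/dir/ -maxdepth 1 -regextype posix-extended \
    \( -regex '.*[^a-zA-Z0-9+_ßäöüÄÖÜ., /-].*' -o -name '*  *' \)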