I'd like to evaluate whether a picture is blurred or not. Using ImageMagick 6.7.7-10, I've seen that the identify command gives some potentially useful information, and the command:
identify -format "%[filename] sd:%[standard-deviation]\n" floue1.jpg
answer> floue1.jpg sd:7876.9
seems to be a good start. However, if I use:
identify -verbose floue1.jpg | grep stan
standard deviation: 34.8189 (0.136545)
standard deviation: 26.7428 (0.104874)
standard deviation: 29.8434 (0.117033)
standard deviation: 30.6494 (0.120194)
I see four standard deviation values, one overall and one for each color channel, and these differ from the single value I got with the first command.
What do these values mean, where do they come from, and how can I select (print) any of them?
It works like this:
identify -verbose floue1.jpg | grep stan
standard deviation: 39.0047 (0.15296)
standard deviation: 36.45 (0.142941)
standard deviation: 37.9805 (0.148943)
standard deviation: 37.8263 (0.148339)
The first number on each line is scaled to the range 0-255 and the second (in parentheses) is normalised to the range 0-1.
The first row is the Red channel's standard deviation, the second is Green and the third is Blue, while the last is the overall value across all channels.
The single figure your first command printed is that same overall statistic expressed on your build's quantum range (0-65535 on a Q16 build): 0.120194 × 65535 ≈ 7876.9.
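As a quick sanity check of those scalings, you can convert the normalised overall value from your own output back to the other two ranges with plain awk (a sketch, no ImageMagick needed; the 65535 assumes a Q16 build):

```shell
# Overall normalised standard deviation as reported by identify -verbose
sd_norm=0.120194

# Scale to 0-255 (the first number on each verbose line)
awk -v s="$sd_norm" 'BEGIN { printf "%.4f\n", s * 255 }'    # 30.6495

# Scale to the quantum range of a Q16 build, which is what the
# plain %[standard-deviation] escape prints
awk -v s="$sd_norm" 'BEGIN { printf "%.1f\n", s * 65535 }'  # 7876.9
```

On a Q8 build the quantum range is 0-255 instead, so the first and third forms coincide.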
If you want to access them individually, use:
convert floue1.jpg -format "%[filename] sd:%[fx:standard_deviation.blue]\n" info:
floue1.jpg sd:0.148943
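If you prefer to avoid fx escapes, you can also select one of the four grepped lines positionally. Here the verbose output is simulated with a here-document so the selection logic is shown on its own; in practice you would pipe the real grep output in instead:

```shell
# Line 3 of the matches is the Blue channel; field 3 is the 0-255
# value, field 4 would be the normalised one in parentheses
sed -n '3p' <<'EOF' | awk '{print $3}'
standard deviation: 39.0047 (0.15296)
standard deviation: 36.45 (0.142941)
standard deviation: 37.9805 (0.148943)
standard deviation: 37.8263 (0.148339)
EOF
```

This prints 37.9805; against a real image the equivalent pipeline would be `identify -verbose floue1.jpg | grep "standard deviation" | sed -n '3p' | awk '{print $3}'`.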
Or, you can select a different one and scale it too:
convert floue1.jpg -format "%[filename] sd:%[fx:int(standard_deviation.green*255)]\n" info:
floue1.jpg sd:36
If you want them all in one go and you are using bash, you can do:
read sdr sdg sdb <<< $(convert floue1.jpg -format "%[fx:int(standard_deviation.red*255)] %[fx:int(standard_deviation.green*255)] %[fx:int(standard_deviation.blue*255)]" info: )
echo $sdr, $sdg, $sdb
39, 36, 37
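The `<<<` here-string is a bash-only feature; stripped of the ImageMagick part, the splitting behaviour is just ordinary word splitting of the expansion:

```shell
#!/bin/bash
# read splits the here-string on whitespace into the three variables
read sdr sdg sdb <<< "39 36 37"
echo "$sdr, $sdg, $sdb"    # 39, 36, 37
```

In plain POSIX sh, `set -- $(convert ...)` followed by `$1 $2 $3` would achieve the same thing.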