The following reads a PNG into an array:
import matplotlib.pyplot as plt
import matplotlib.image as mpimg
import numpy as np
img = mpimg.imread('example.png')
The result is the array img, which is e.g. a 1024 x 1024 array of tuples (see http://matplotlib.org/1.3.1/users/image_tutorial.html).
How can I ensure that the result is an n x n array (of tuples) instead of a 1024 x 1024 array (n < 1024)? I need to explicitly set the dimensions of that array (e.g. 400 x 400).
Thanks in advance
I recommend installing Pillow (preferably using Anaconda). It makes image manipulation easy, and is usually easier than treating the image as a raw ndarray.
Once you have Pillow installed, this answer should help: How do I resize an image using PIL and maintain its aspect ratio?
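If you go the Pillow route, resizing and converting back to an array might look like the sketch below. The random array stands in for the imread result, and the 400 x 400 target size is taken from the question; everything else is an assumption:

```python
from PIL import Image
import numpy as np

# Stand-in for mpimg.imread('example.png'): a random 1024 x 1024 RGBA image.
rgba = (np.random.rand(1024, 1024, 4) * 255).astype(np.uint8)
img = Image.fromarray(rgba, mode='RGBA')

# Resize to 400 x 400 with bilinear resampling. Note that resize() ignores
# the aspect ratio; use Image.thumbnail if you want to preserve it.
img_small = img.resize((400, 400), Image.BILINEAR)

# Convert back to an ndarray if you need one; shape is (400, 400, 4).
arr = np.asarray(img_small)
```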
If you really want to keep it as an array, then you could use scipy.misc.imresize.
Edit to add the thing that actually worked, in case others miss it:

import scipy.misc
# Note: scipy.misc.imresize was deprecated in SciPy 1.0 and removed in 1.3.
img_rescaled = scipy.misc.imresize(img, size=[400, 400], interp='bilinear')
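On a SciPy where scipy.misc.imresize is no longer available, scipy.ndimage.zoom is one possible replacement. A minimal sketch, assuming the same 400 x 400 target size and using a random array as a stand-in for the imread result:

```python
import numpy as np
from scipy import ndimage

# Stand-in for the imread result: a float RGBA image of shape (1024, 1024, 4).
img = np.random.rand(1024, 1024, 4)

# Per-axis zoom factors: shrink rows and columns, leave the channel axis alone.
factors = (400 / img.shape[0], 400 / img.shape[1], 1)

# order=1 gives (bi)linear interpolation, matching interp='bilinear' above.
img_rescaled = ndimage.zoom(img, factors, order=1)
```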