I am having some issues detecting faces in an image browsed from the gallery. I know the issue is that I do not know how to apply the face detection code I am testing to an imported image; the example code I started from was written for an image stored locally as a drawable resource. I believe I am close, but can you help me out?
First, I created a gallery() method:
protected void gallery() {
    Intent intent = new Intent();
    intent.setType("image/*");
    intent.setAction("android.intent.action.GET_CONTENT");
    startActivityForResult(Intent.createChooser(intent, "Choose An Image"), 1);
}
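(As an aside, I think that action string is the same as the Intent.ACTION_GET_CONTENT constant, so presumably the method could also be written like this, though I am not certain it makes any difference:)
protected void gallery() {
    // same picker intent, just using the constant instead of the raw action string
    Intent intent = new Intent(Intent.ACTION_GET_CONTENT);
    intent.setType("image/*");
    startActivityForResult(Intent.createChooser(intent, "Choose An Image"), 1);
}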
I am still learning about intents, but as far as I understand, I need the intent to open Android's gallery, and because I set the action to GET_CONTENT, the chooser hands the selected image back to me through the intent delivered to onActivityResult(). With that in mind, I tried to pull the selected image out of that returned intent as a Uri. So this is what I did next.
protected void onActivityResult(int requestCode, int resultCode, Intent intent) {
    super.onActivityResult(requestCode, resultCode, intent);
    if (requestCode == 1 && resultCode == RESULT_OK) {
        Uri uri = intent.getData();
        try {
            InputStream is = getContentResolver().openInputStream(uri);
            Bitmap bitmap = BitmapFactory.decodeStream(is);
            ImageView image = (ImageView) findViewById(R.id.img_view);
            image.setImageBitmap(bitmap);
        } catch (Exception e) {
            e.printStackTrace();
        }
    }
}
So here is the confusing part for me. I guess the InputStream holds the image data? I tried to apply the face detection code inside this same try-catch, figuring that right after image.setImageBitmap(bitmap) completes is the time to run face detection. Here is the face detection code.
protected void onActivityResult(int requestCode, int resultCode, Intent intent) {
    super.onActivityResult(requestCode, resultCode, intent);
    if (requestCode == 1 && resultCode == RESULT_OK) {
        Uri uri = intent.getData();
        try {
            InputStream is = getContentResolver().openInputStream(uri);
            Bitmap bitmap = BitmapFactory.decodeStream(is);
            ImageView image = (ImageView) findViewById(R.id.img_view);
            image.setImageBitmap(bitmap);
            BitmapFactory.Options options = new BitmapFactory.Options();
            options.inPreferredConfig = Bitmap.Config.RGB_565;
            // this is the line I do not know how to change -- it was written for a drawable resource
            bitmap = BitmapFactory.decodeResource(getResources(), R.id.img_view, options);
            imageWidth = bitmap.getWidth();
            imageHeight = bitmap.getHeight();
            detectedFaces = new FaceDetector.Face[NUM_FACES];
            faceDetector = new FaceDetector(imageWidth, imageHeight, NUM_FACES);
            NUM_FACE_DETECTED = faceDetector.findFaces(bitmap, detectedFaces);
            mIL.invalidate();
        } catch (Exception e) {
            e.printStackTrace();
        }
    }
}
I do not know how to change "mFaceBitmap = BitmapFactory.decodeResource(getResources(), R.drawable.smilingfaces, options);", which is written for a local drawable, so that it uses the image I think is stored in the InputStream (or is it? Where does the selected image actually live?). My workaround was to point decodeResource at the ImageView from the layout instead, since the image ends up there, but I do not really understand how it all fits together. Anyway, that snippet is supposed to detect the faces, and then onDraw() draws squares around the detected faces. I am not sure where onDraw() belongs, but I placed it outside of onActivityResult().
protected void onDraw(Canvas canvas) {
    Paint myPaint = new Paint();
    myPaint.setColor(Color.RED);
    myPaint.setStyle(Paint.Style.STROKE);
    myPaint.setStrokeWidth(3);
    myPaint.setDither(true);
    for (int count = 0; count < NUM_FACE_DETECTED; count++) {
        Face face = detectedFaces[count];
        PointF midPoint = new PointF();
        face.getMidPoint(midPoint);
        eyeDistance = face.eyesDistance();
        canvas.drawRect(midPoint.x - eyeDistance, midPoint.y - eyeDistance,
                midPoint.x + eyeDistance, midPoint.y + eyeDistance, myPaint);
    }
}
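For what it's worth, here is my best guess at how the middle of that try block should change so that FaceDetector runs on the picked image instead of a drawable. It is untested; the RGB_565 config and the even-width check are just my assumptions from reading the FaceDetector docs, and this would sit inside the same try-catch:
// my best guess (untested): decode the picked Uri straight into an RGB_565 bitmap
Uri uri = intent.getData();
BitmapFactory.Options options = new BitmapFactory.Options();
options.inPreferredConfig = Bitmap.Config.RGB_565; // FaceDetector wants a 565 bitmap
InputStream is = getContentResolver().openInputStream(uri);
Bitmap faceBitmap = BitmapFactory.decodeStream(is, null, options);

// I read that FaceDetector needs an even image width, so maybe crop off a column if needed?
if (faceBitmap.getWidth() % 2 != 0) {
    faceBitmap = Bitmap.createBitmap(faceBitmap, 0, 0,
            faceBitmap.getWidth() - 1, faceBitmap.getHeight());
}

detectedFaces = new FaceDetector.Face[NUM_FACES];
faceDetector = new FaceDetector(faceBitmap.getWidth(), faceBitmap.getHeight(), NUM_FACES);
NUM_FACE_DETECTED = faceDetector.findFaces(faceBitmap, detectedFaces);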
Any advice? I am very close to getting this to work!
I understand what you actually want. I will write out the complete code for you; just follow along.
In this code I am using a custom view in the layout and two classes: one activity class and one custom view class.
I will create two image buttons: one to select an image from the gallery and display it (for face detection), and a second to detect faces in the selected image.
First, the layout, activity_main.xml:
<?xml version="1.0" encoding="utf-8"?>
<FrameLayout xmlns:android="http://schemas.android.com/apk/res/android"
    android:layout_width="fill_parent"
    android:layout_height="fill_parent" >

    <com.simpleapps.facedetection.MyView
        android:id="@+id/faceview"
        android:layout_width="fill_parent"
        android:layout_height="fill_parent" />

    <LinearLayout
        android:orientation="horizontal"
        android:layout_width="fill_parent"
        android:layout_height="fill_parent"
        android:layout_gravity="top" >

        <ImageView
            android:id="@+id/gallery"
            android:layout_width="wrap_content"
            android:layout_height="wrap_content"
            android:layout_marginRight="10dp"
            android:layout_weight="1"
            android:background="@drawable/gallery" />

        <ImageView
            android:id="@+id/detectf"
            android:layout_width="wrap_content"
            android:layout_height="wrap_content"
            android:layout_marginRight="10dp"
            android:layout_weight="1"
            android:background="@drawable/detect" />
    </LinearLayout>

</FrameLayout>
Now the activity class, MainActivity.java:
public class MainActivity extends Activity {

    public MyView faceview;
    public ImageView gallery, detectf;
    public Uri imageURI;
    public static Bitmap defaultBitmap;
    public static int screenWidth, screenHeight; // read by MyView when scaling the bitmap

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        requestWindowFeature(Window.FEATURE_NO_TITLE);
        getWindow().setFlags(WindowManager.LayoutParams.FLAG_FULLSCREEN,
                WindowManager.LayoutParams.FLAG_FULLSCREEN);
        setRequestedOrientation(ActivityInfo.SCREEN_ORIENTATION_PORTRAIT);
        setContentView(R.layout.activity_main);

        DisplayMetrics displaymetrics = new DisplayMetrics();
        getWindowManager().getDefaultDisplay().getMetrics(displaymetrics);
        screenHeight = displaymetrics.heightPixels;
        screenWidth = displaymetrics.widthPixels;

        faceview = (MyView) findViewById(R.id.faceview);
        gallery = (ImageView) findViewById(R.id.gallery);
        detectf = (ImageView) findViewById(R.id.detectf);

        // show a default image (R.drawable.face) until the user picks one
        BitmapFactory.Options bitmapFactoryOptions = new BitmapFactory.Options();
        bitmapFactoryOptions.inPreferredConfig = Bitmap.Config.RGB_565;
        defaultBitmap = BitmapFactory.decodeResource(getResources(), R.drawable.face, bitmapFactoryOptions);
        faceview.setImage(defaultBitmap);

        // first button: open the gallery picker
        gallery.setOnClickListener(new OnClickListener() {
            public void onClick(View v) {
                Intent intent = new Intent(Intent.ACTION_GET_CONTENT);
                intent.setType("image/*");
                startActivityForResult(intent, 0);
            }
        });

        // second button: run face detection on the currently displayed image
        detectf.setOnClickListener(new OnClickListener() {
            public void onClick(View v) {
                faceview.facedetect();
            }
        });
    }

    @Override
    public void onActivityResult(int requestCode, int resultCode, Intent data) {
        super.onActivityResult(requestCode, resultCode, data);
        if (resultCode == Activity.RESULT_OK) {
            if (requestCode == 0) {
                imageURI = data.getData();
                try {
                    // decode the picked image as RGB_565, which is what FaceDetector expects
                    BitmapFactory.Options bitmapFactoryOptions = new BitmapFactory.Options();
                    bitmapFactoryOptions.inPreferredConfig = Bitmap.Config.RGB_565;
                    Bitmap b = BitmapFactory.decodeStream(
                            getContentResolver().openInputStream(imageURI), null, bitmapFactoryOptions);
                    faceview.myBitmap = b;
                } catch (IOException e) {
                    e.printStackTrace();
                }
                faceview.invalidate();
            }
        } else {
            // the user cancelled the picker; just log it instead of killing the app
            Log.e("result", "BAD");
        }
    }
}
Now the view class, MyView.java:
public class MyView extends View {

    private FaceDetector.Face[] detectedFaces;
    private FaceDetector.Face face1;
    private int NUMBER_OF_FACES = 10;
    private FaceDetector faceDetector;
    private int NUMBER_OF_FACE_DETECTED;
    private float eyeDistance;
    public Paint myPaint;
    public Bitmap resultBmp;
    public Bitmap myBitmap;
    public PointF midPoint1;
    private int w, h;
    private float x, y;

    public MyView(Context context, AttributeSet attrs) {
        super(context, attrs);
    }

    public void setImage(Bitmap bitmap) {
        myBitmap = bitmap;
        invalidate();
    }

    public void facedetect() {
        if (resultBmp == null) {
            // nothing has been drawn yet, so there is nothing to detect on
            return;
        }
        myPaint = new Paint();
        myPaint.setColor(Color.GREEN);
        myPaint.setStyle(Paint.Style.STROKE);
        myPaint.setStrokeWidth(3);

        // run the detector on the scaled bitmap that is actually drawn on screen
        detectedFaces = new FaceDetector.Face[NUMBER_OF_FACES];
        faceDetector = new FaceDetector(resultBmp.getWidth(), resultBmp.getHeight(), NUMBER_OF_FACES);
        NUMBER_OF_FACE_DETECTED = faceDetector.findFaces(resultBmp, detectedFaces);
        System.out.println("faces detected are " + NUMBER_OF_FACE_DETECTED);

        // keep the first face's midpoint and eye distance so onDraw() can mark it
        for (int count = 0; count < NUMBER_OF_FACE_DETECTED; count++) {
            if (count == 0) {
                face1 = detectedFaces[count];
                midPoint1 = new PointF();
                face1.getMidPoint(midPoint1);
                eyeDistance = face1.eyesDistance();
            }
        }
        invalidate();

        if (NUMBER_OF_FACE_DETECTED == 0) {
            Toast.makeText(getContext(), "no faces detected", Toast.LENGTH_LONG).show();
        } else {
            Toast.makeText(getContext(), "faces detected " + NUMBER_OF_FACE_DETECTED, Toast.LENGTH_LONG).show();
        }
    }

    protected void onDraw(Canvas canvas) {
        if (myBitmap != null) {
            w = myBitmap.getWidth();
            h = myBitmap.getHeight();
            // scale the bitmap to the screen width and centre it vertically
            int widthofBitMap = MainActivity.screenWidth;
            int heightofBitMap = widthofBitMap * h / w;
            resultBmp = Bitmap.createScaledBitmap(myBitmap, widthofBitMap, heightofBitMap, true);
            float left = (MainActivity.screenWidth - widthofBitMap) / 2f;
            float top = (MainActivity.screenHeight - heightofBitMap) / 2f;
            canvas.drawBitmap(resultBmp, left, top, null);

            // draw a square around the first detected face, shifted by the same offset as the bitmap
            if (midPoint1 != null && myPaint != null) {
                canvas.drawRect(midPoint1.x - eyeDistance + left, midPoint1.y - eyeDistance + top,
                        midPoint1.x + eyeDistance + left, midPoint1.y + eyeDistance + top, myPaint);
            }
        }
    }

    @Override
    public boolean onTouchEvent(MotionEvent event) {
        int action = event.getAction();
        switch (action) {
            case MotionEvent.ACTION_MOVE:
            case MotionEvent.ACTION_DOWN:
                x = event.getX();
                y = event.getY();
                break;
            case MotionEvent.ACTION_UP:
            default:
                break;
        }
        invalidate();
        return true;
    }
}
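One caveat worth mentioning: as far as I remember from the FaceDetector documentation, findFaces() expects an RGB_565 bitmap whose width is even, so if you ever hit an IllegalArgumentException you could normalise the bitmap before detecting. This is just a sketch, and faceBmp is only a local name I am using for illustration:
// sketch: make sure the bitmap FaceDetector sees is RGB_565 and has an even width
Bitmap faceBmp = resultBmp;
if (faceBmp.getConfig() != Bitmap.Config.RGB_565) {
    faceBmp = faceBmp.copy(Bitmap.Config.RGB_565, false);
}
if (faceBmp.getWidth() % 2 != 0) {
    faceBmp = Bitmap.createScaledBitmap(faceBmp, faceBmp.getWidth() - 1, faceBmp.getHeight(), true);
}
You would then pass faceBmp to findFaces() instead of resultBmp.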
I took some time to write this code; I hope it helps. If you get any errors, just ask.