iphone ios uiimageview uitouch cgrect

touch locations, frames. CGRectContainsPoint not returning true


I have multiple instances of a UIImageView subclass that handle touches like this:

-(void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    for (UITouch *touch in touches) {
        CGPoint _location = [touch locationInView:touch.view];
        [self processTouch:_location];
    }
}





-(void)processTouch:(CGPoint)location {

    DLog(@"location: %f .. bounds: %f : %f", location.x, [twoThirdOfTheWayCoordinateX floatValue], [oneThirdOfTheWayCoordinateX floatValue]);

    if (CGRectContainsPoint(self.frame, location)) {
        DLog(@"1");
        if (location.x > [twoThirdOfTheWayCoordinateX floatValue]) {
            [self oneStar];
        } else if (location.x < [oneThirdOfTheWayCoordinateX floatValue]) {
            [self zeroStars];
        } else if ([twoThirdOfTheWayCoordinateX floatValue] >= location.x && location.x >= [oneThirdOfTheWayCoordinateX floatValue]) {
            [self halfStars];
        }
    }
}

The DLog(@"1"); never gets called. The UIImageView subclass instance is in the middle of the iPhone screen; when it is in the top-left corner, this code works.

NSLog: -[Star processTouch:] location: 44.000000 .. bounds: 33.333332 : 16.666666

I assume the problem is that

if (CGRectContainsPoint(self.frame, location)) {

never evaluates to true.

Why is that so? The first DLog gets called.

Code I use to initialize the instance:

self.instance = [[Class alloc] initWithFrame:CGRectMake(10, 10, 50, 50)];
self.instance.userInteractionEnabled = YES;
self.instance.image = [UIImage imageNamed:@"0-0.png"];
[self.view addSubview:self.instance];
[self.instance setNeedsDisplay];

Solution

  • Replacing self.frame with self.bounds fixed the issue; see the sketch below.

    Kudos to august on #iphonedev on freenode.
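
A minimal sketch of the corrected check, assuming the same threshold ivars and star helpers as in the question. [touch locationInView:touch.view] returns a point in the view's own coordinate space, so it has to be tested against self.bounds (whose origin is always 0,0) rather than self.frame (whose origin is expressed in the superview's coordinates):

-(void)processTouch:(CGPoint)location {
    // self.bounds has a zero origin, so it matches points expressed in this
    // view's own coordinate space, which is what locationInView: returns.
    if (CGRectContainsPoint(self.bounds, location)) {
        if (location.x > [twoThirdOfTheWayCoordinateX floatValue]) {
            [self oneStar];
        } else if (location.x < [oneThirdOfTheWayCoordinateX floatValue]) {
            [self zeroStars];
        } else {
            [self halfStars];
        }
    }
}

With the 50-point-wide view from the initialization code, self.bounds is (0, 0, 50, 50) no matter where the view sits on screen, so the location 44.000000 from the log above now falls inside the rect.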