I'm having trouble assigning an instance of a class to the following UITableViewController subclass:
@interface MyTableViewController : UITableViewController </*usual protocols*/>
@property (nonatomic, retain) MyClass *myClass;
@end
I'm currently assigning a non-null instance of MyClass to an instance of MyTableViewController like this:
MyTableViewController *myTableViewController = [[MyTableViewController alloc] init];
MyClass *nonNullInstanceOfMyClass = [[MyClass alloc] init];
myTableViewController.myClass = nonNullInstanceOfMyClass;
[self.navigationController pushViewController:myTableViewController animated:YES];
[nonNullInstanceOfMyClass release];
[myTableViewController release];
The problem is that myClass is nil in MyTableViewController's viewDidLoad. Why is this happening?
Edit #1: I verified that nonNullInstanceOfMyClass is not nil by logging it with NSLog.
Edit #2: Provided more code. Also, viewDidLoad seems to be called before I push the view controller (which could cause the problem, although it seems odd).
Edit #3: Fixed by moving self.tableView.delegate = nil; and self.tableView.dataSource = nil; from init to viewDidLoad.
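For reference, a minimal sketch of the change described in Edit #3. The method bodies are assumptions reconstructed from the edits above, not the asker's actual code:

// Before (problem): accessing self.tableView inside init forces the
// controller to load its view immediately, so viewDidLoad runs during
// init, before myClass has been assigned.
- (id)init
{
    self = [super init];
    if (self) {
        self.tableView.delegate = nil;    // loads the view as a side effect
        self.tableView.dataSource = nil;
    }
    return self;
}

// After (fix): defer any access to the view until viewDidLoad, which only
// runs once the view is actually needed, i.e. after myClass has been set
// and the controller has been pushed.
- (void)viewDidLoad
{
    [super viewDidLoad];
    self.tableView.delegate = nil;
    self.tableView.dataSource = nil;
    NSLog(@"%@", self.myClass);    // non-nil here now
}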
Are you doing anything between instantiating the view controller and assigning to the property? viewDidLoad doesn't necessarily get called when you push the view controller onto the navigation stack; it gets called whenever the controller needs to load its view. So, for example, NSLog(@"%@", myTableViewController.view); would trigger viewDidLoad, and if you did that before assigning to the property, it would explain this behaviour.
Could you post actual code instead of an approximation please? There could be important details you are leaving out.
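To make the ordering described above concrete, here is a hedged sketch using the names from the question; the NSLog on .view is hypothetical and is only there to show what forces the view to load early:

MyTableViewController *myTableViewController = [[MyTableViewController alloc] init];

// Anything that touches the view property loads the view right away,
// so viewDidLoad runs here and sees myClass == nil.
NSLog(@"%@", myTableViewController.view);

MyClass *nonNullInstanceOfMyClass = [[MyClass alloc] init];
myTableViewController.myClass = nonNullInstanceOfMyClass;   // assigned after viewDidLoad already ran

[self.navigationController pushViewController:myTableViewController animated:YES];
[nonNullInstanceOfMyClass release];
[myTableViewController release];

In the asker's case the early view access happened inside init itself (see Edit #3), which has the same effect.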