I'm missing something here, and feeling like an idiot about it.
I'm using a UIPickerView in my app, and I need to assign the selected row number to a 32-bit integer attribute on a Core Data object. To do that, I'm using this delegate method:
-(void)pickerView:(UIPickerView *)pickerView didSelectRow:(NSInteger)row inComponent:(NSInteger)component
{
    object.integerValue = row;
}
This is giving me a warning:
warning: passing argument 1 of 'setIntegerValue:' makes pointer from integer without a cast
What am I mixing up here?
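For context, I believe the Core Data-generated accessor for that attribute looks something like this (MyObject is just a stand-in name for my entity class; the point, as I understand it, is that the 32-bit integer attribute comes through as an NSNumber object rather than a plain int):

#import <CoreData/CoreData.h>

// My understanding of what Xcode generates for the entity -- the 32-bit
// integer attribute is exposed as an NSNumber pointer, not a scalar.
@interface MyObject : NSManagedObject
@property (nonatomic, retain) NSNumber *integerValue;
@end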
--Edit 1--
Ok, so I can get rid of the warning by changing the method to do the following:
NSNumber *number = [NSNumber numberWithInteger:row];
object.integerValue = number;
However, I still get a value of 0 for object.integerValue when I print it with NSLog. object.integerValue has a maximum value of 5, so I print out number instead, and then I get a number above 62,000,000, which doesn't seem right to me, since there are only 5 rows. If I NSLog the row variable, I get a number between 0 and 5. So why do I end up with a completely different number after wrapping the row in an NSNumber?
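In case it matters, my logging looks something like this (the %d format specifier is my best guess at what I have; maybe that's part of the problem):

// The attribute on the Core Data object -- this is where I see 0
NSLog(@"stored: %d", object.integerValue);
// The wrapped row -- this is where I see the number above 62,000,000
NSLog(@"number: %d", number);
// The raw row from the picker -- this prints 0 through 5 as expected
NSLog(@"row: %d", row);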
--Edit 2--
Ok, so I'm realizing that there is some fundamental idea I don't understand. I now see that the 60-million-plus number can be converted back to the correct 0-5 value by calling integerValue on it. So it seems my question is: how can I save an integer between 0 and 5 to the attribute when the NSNumber I get back is over 60 million? Do I need to be using a different data type?
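For completeness, here is the whole delegate method as it stands after Edit 1 (object is my NSManagedObject instance, and integerValue is the 32-bit integer attribute):

-(void)pickerView:(UIPickerView *)pickerView didSelectRow:(NSInteger)row inComponent:(NSInteger)component
{
    // row is 0-5, which is the value I actually want to end up in the attribute
    NSNumber *number = [NSNumber numberWithInteger:row];

    // integerValue is declared as NSNumber * by the generated accessors,
    // so this assigns the wrapper object itself
    object.integerValue = number;
}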