Search Results

Search found 1125 results on 45 pages for 'uiview'.

Page 42 of 45

  • How to know when StatusBar size changed (iPhone)

    - by JOM
    I have a UITabBarController with 2 tabs. One resizes just fine when the StatusBar size changes (the simulator's "Toggle In-Call Status Bar" menu item). The other one doesn't. The problematic tab item contains a static view, which dynamically loads one or another view depending on certain things. While getting this setup working I discovered that the main tab view did NOT automagically send e.g. viewWillAppear and viewWillDisappear messages to my dynamic subviews. Apple docs explained this was because dynamically added views are not recognized by the system.

      @interface MyTabViewController : UIViewController {
          UIView *mainView;
          FirstViewController *aController;
          SecondViewController *bController;
      }
      ...
      if (index == 0) {
          self.aController = [[FirstViewController alloc] initWithNibName:@"FirstViewController" bundle:nil];
          [self.mainView addSubview:aController.view];
          [self.aController viewWillAppear:YES];
      }

    How can I get the StatusBar size changed event into my dynamic subviews? The "didChangeStatusBarFrame" approach doesn't work, as documented elsewhere.
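    One approach worth sketching here (my own suggestion, not from the original post): since dynamically added subviews miss the UIViewController plumbing, the child controller can observe UIApplicationDidChangeStatusBarFrameNotification itself and re-lay itself out. FirstViewController is the poster's class; the layout done in the handler is an assumption.

      // Sketch: let the dynamically loaded controller listen for the status bar frame change itself.
      - (void)viewDidLoad {
          [super viewDidLoad];
          [[NSNotificationCenter defaultCenter] addObserver:self
                                                   selector:@selector(statusBarFrameChanged:)
                                                       name:UIApplicationDidChangeStatusBarFrameNotification
                                                     object:nil];
      }

      - (void)statusBarFrameChanged:(NSNotification *)notification {
          // Resize to fill the superview now that the status bar height has changed.
          self.view.frame = self.view.superview.bounds;
      }

      - (void)dealloc {
          [[NSNotificationCenter defaultCenter] removeObserver:self];
          [super dealloc];
      }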

  • presentModalViewController does not want to work when called from a protocol method

    - by johnbdh
    I have a subview that, when double tapped, calls a protocol method on the subview's parent view controller, like this:

      - (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event {
          UITouch *theTouch = [touches anyObject];
          if (theTouch.tapCount == 1) {
          } else if (theTouch.tapCount == 2) {
              if ([self.delegate respondsToSelector:@selector(editEvent:)]) {
                  [self.delegate editEvent:dictionary];
              }
          }
      }

    Here is the protocol method, with the dictionary-consuming code removed:

      - (void)editEvent:(NSDictionary *)dictionary {
          EventEditViewController *eventEditViewController = [[EventEditViewController alloc] initWithNibName:@"EventEditViewController" bundle:nil];
          eventEditViewController.delegate = self;
          navigationController = [[UINavigationController alloc] initWithRootViewController:eventEditViewController];
          [self presentModalViewController:navigationController animated:YES];
          [eventEditViewController release];
      }

    The protocol method is called and runs without any errors, but the modal view does not present itself. I temporarily copied the protocol method's code to an IBAction method for one of the parent view's buttons to isolate it from the subview. When I tap this button, the modal view works fine. Can anyone tell me what I am doing wrong? Why does it work when executed from a button on the parent view, and not from a protocol method called from a subview? Here is what I have tried so far to work around the problem:

    - Restarted Xcode and the simulator
    - Ran on the device (iPod touch)
    - Presenting eventEditViewController instead of navigationController
    - Using push instead of presentModal
    - Delaying the call with performSelector: directly to the protocol method, to another method in the subview which calls the protocol method, and from the protocol method to another method containing the presentModal calls
    - Using a timer

    I have it currently set up so that the protocol method calls a known working method that presents a different view. Before calling presentModalViewController it pops up a UIAlertView, which works every time, but the modal view refuses to display when called via the protocol method. I'm stumped. Perhaps it has something to do with the fact that I am calling the protocol method from a UIView class instead of a UIViewController class. Maybe I need to create a UIViewController for the subview? Thanks, John

  • Pin animatesDrop mapView-iOS

    - by user1724168
    I have implemented the code seen below. I would like to add a dropping animation for the pin. However, when I type pinView.animatesDrop it is not recognized! I have not been able to figure out what I am doing wrong.

      - (MKAnnotationView *)mapView:(MKMapView *)mV viewForAnnotation:(id <MKAnnotation>)annotation {
          MKAnnotationView *pinView = nil;
          if (![annotation isKindOfClass:[Annotation class]]) // Don't mess with the user location
              return nil;
          static NSString *defaultPinID = @"StandardIdentifier";
          pinView = (MKAnnotationView *)[self.mapView dequeueReusableAnnotationViewWithIdentifier:defaultPinID];
          if (pinView == nil) {
              pinView = [[MKAnnotationView alloc] initWithAnnotation:annotation reuseIdentifier:defaultPinID];
          }
          // Build our annotation
          if ([annotation isKindOfClass:[Annotation class]]) {
              Annotation *a = (Annotation *)annotation;
              pinView.image = [ZSPinAnnotation pinAnnotationWithColor:a.color]; // ZSPinAnnotation being used
              pinView.annotation = a;
              pinView.enabled = YES;
              pinView.centerOffset = CGPointMake(6.5, -16);
              pinView.calloutOffset = CGPointMake(-11, 0);
              //pinView.animatesDrop = YES;
          }
          pinView.canShowCallout = YES;
          UIButton *rightButton = [UIButton buttonWithType:UIButtonTypeDetailDisclosure];
          [rightButton setTitle:annotation.title forState:UIControlStateNormal];
          [pinView setRightCalloutAccessoryView:rightButton];
          pinView.leftCalloutAccessoryView = [[UIView alloc] init];
          pinView.leftCalloutAccessoryView = nil;
          /*UIButton *leftButton = [UIButton buttonWithType:UIButtonTypeInfoLight];
          [leftButton setTitle:annotation.title forState:UIControlStateNormal];
          [pinView setLeftCalloutAccessoryView:leftButton];*/
          return pinView;
      }
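    Worth noting as an aside (not part of the original question): animatesDrop is declared on MKPinAnnotationView, not on MKAnnotationView, which is why the compiler does not recognize it here. A minimal sketch of the pin-view variant is below; note that MKPinAnnotationView draws the standard pin, so the custom ZSPinAnnotation image would no longer apply.

      // Sketch: use MKPinAnnotationView when the built-in drop animation is wanted.
      MKPinAnnotationView *pinView = (MKPinAnnotationView *)[mV dequeueReusableAnnotationViewWithIdentifier:defaultPinID];
      if (pinView == nil) {
          pinView = [[[MKPinAnnotationView alloc] initWithAnnotation:annotation
                                                     reuseIdentifier:defaultPinID] autorelease];
      }
      pinView.animatesDrop = YES;   // recognized here, because it is an MKPinAnnotationView property
      pinView.canShowCallout = YES;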

  • TableView Cells unresponsive

    - by John Donovan
    I have a TableView and I want to be able to press several cells one after the other and have messages appear. At the moment the cells are often unresponsive. I found some code for a similar problem that someone was kind enough to post; however, although he claimed it worked 100%, it doesn't work for me. The app won't even build. Here's the code:

      - (UIView *)hitTest:(CGPoint)point withEvent:(UIEvent *)event {
          // check to see if the hit is in this table view
          if ([self pointInside:point withEvent:event]) {
              UITableViewCell *newCell = nil;
              // hit is in this table view, find out
              // which cell it is in (if any)
              for (UITableViewCell *aCell in self.visibleCells) {
                  if ([aCell pointInside:[self convertPoint:point toView:aCell] withEvent:nil]) {
                      newCell = aCell;
                      break;
                  }
              }
              // if it touched a different cell, tell the previous cell to resign
              // this gives it a chance to hide the keyboard or date picker or whatever
              if (newCell != activeCell) {
                  [activeCell resignFirstResponder];
                  self.activeCell = newCell; // may be nil
              }
          }
          // return the super's hitTest result
          return [super hitTest:point withEvent:event];
      }

    With this code I get a warning that my view controller may not respond to pointInside:withEvent: (it's a UITableViewController). I also get some errors: "request for member 'visibleCells' in something not a structure or a union", "incompatible type for argument 1 of pointInside:withEvent:", "expression does not have a valid object type", and similar. I must admit I'm not so good at reading other people's code, but I was wondering whether the problems here are obvious; if so, a pointer would be greatly appreciated.
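    A likely source of those compile errors (my reading, not confirmed in the thread): hitTest:withEvent:, pointInside:withEvent: and visibleCells all belong to UIView/UITableView, so the snippet has to live in a UITableView subclass rather than in the UITableViewController. A minimal sketch of that placement, with a hypothetical subclass name and activeCell property:

      // TouchTrackingTableView.h - the posted hitTest:withEvent: override goes into this class unchanged.
      @interface TouchTrackingTableView : UITableView {
          UITableViewCell *activeCell;
      }
      @property (nonatomic, assign) UITableViewCell *activeCell;
      @end

    The table view's class then gets set to TouchTrackingTableView in the nib (or created in loadView), while the UITableViewController keeps its usual data source and delegate duties.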

  • iPhone app crashes on start-up; the stack trace contains only messages from built-in frameworks

    - by Aleksejs
    My app sometimes crashes at start-up. The stack trace contains only messages from built-in frameworks. An excerpt from a crash log:

      OS Version:      iPhone OS 3.1.3 (7E18)
      Report Version:  104

      Exception Type:  EXC_BAD_ACCESS (SIGBUS)
      Exception Codes: KERN_PROTECTION_FAILURE at 0x000e6000
      Crashed Thread:  0

      Thread 0 Crashed:
      0   CoreGraphics       0x339305d8 argb32_image_mark_RGB32 + 704
      1   CoreGraphics       0x338dbcd4 argb32_image + 1640
      2   libRIP.A.dylib     0x320d99f0 ripl_Mark
      3   libRIP.A.dylib     0x320db3ac ripl_BltImage
      4   libRIP.A.dylib     0x320cc2a0 ripc_RenderImage
      5   libRIP.A.dylib     0x320d5238 ripc_DrawImage
      6   CoreGraphics       0x338d7da4 CGContextDelegateDrawImage + 80
      7   CoreGraphics       0x338d7d14 CGContextDrawImage + 364
      8   UIKit              0x324ee68c compositeCGImageRefInRect
      9   UIKit              0x324ee564 -[UIImage(UIImageDeprecated) compositeToRect:fromRect:operation:fraction:]
      10  UIKit              0x32556f44 -[UINavigationBar drawBackButtonBackgroundInRect:withStyle:pressed:]
      11  UIKit              0x32556b00 -[UINavigationItemButtonView drawRect:]
      12  UIKit              0x324ecbc4 -[UIView(CALayerDelegate) drawLayer:inContext:]
      13  QuartzCore         0x311cacfc -[CALayer drawInContext:]
      14  QuartzCore         0x311cab00 backing_callback
      15  QuartzCore         0x311ca388 CABackingStoreUpdate
      16  QuartzCore         0x311c978c -[CALayer _display]
      17  QuartzCore         0x311c941c -[CALayer display]
      18  QuartzCore         0x311c9368 CALayerDisplayIfNeeded
      19  QuartzCore         0x311c8848 CA::Context::commit_transaction(CA::Transaction*)
      20  QuartzCore         0x311c846c CA::Transaction::commit()
      21  QuartzCore         0x311c8318 +[CATransaction flush]
      22  UIKit              0x324f5e94 -[UIApplication _reportAppLaunchFinished]
      23  UIKit              0x324a7a80 -[UIApplication _runWithURL:sourceBundleID:]
      24  UIKit              0x324f8df8 -[UIApplication handleEvent:withNewEvent:]
      25  UIKit              0x324f8634 -[UIApplication sendEvent:]
      26  UIKit              0x324f808c _UIApplicationHandleEvent
      27  GraphicsServices   0x335067dc PurpleEventCallback
      28  CoreFoundation     0x323f5524 CFRunLoopRunSpecific
      29  CoreFoundation     0x323f4c18 CFRunLoopRunInMode
      30  UIKit              0x324a6c00 -[UIApplication _run]
      31  UIKit              0x324a5228 UIApplicationMain
      32  Journaler          0x000029ac main (main.m:14)
      33  Journaler          0x00002948 start + 44

    The file main.m is as simple as possible:

      #import <UIKit/UIKit.h>

      int main(int argc, char *argv[]) {
          NSAutoreleasePool *pool = [[NSAutoreleasePool alloc] init];
          int retVal = UIApplicationMain(argc, argv, nil, nil); // line 14
          [pool release];
          return retVal;
      }

    What might cause the app to crash?

  • iPad started in landscape only receives touches within 768x768

    - by user1307179
    It works perfectly when starting in portrait, and also works when you rotate from portrait to landscape and back. It does not work when starting in landscape, but it then works after you rotate from landscape to portrait and back. When started in landscape, the screen does not respond to any touch whose x coordinate is greater than 768. What happens in the code is that I use the status bar orientation to determine the original orientation and rotate each view manually. The views display correctly but do not receive touches properly. My root view controller is then called when the iPad starts rotating with:

      - (void)willRotateToInterfaceOrientation:(UIInterfaceOrientation)toInterfaceOrientation duration:(NSTimeInterval)duration

    which rotates every subview. Root controller:

      - (void)loadView {
          self.view = [[UIView alloc] init];
          // initialize child views
          [self willRotateToInterfaceOrientation:0 duration:0];
      }

      - (void)willRotateToInterfaceOrientation:(UIInterfaceOrientation)toInterfaceOrientation duration:(NSTimeInterval)duration {
          if ([model isLandscape]) {
              self.view.frame = CGRectMake(0, 0, 1024, 768 - 80);
          } else {
              self.view.frame = CGRectMake(0, 0, 768, 1024 - 80);
          }
          // rotate child views
      }

    My [model isLandscape] code works, so I don't need to provide details on how it works, but here is the code anyway:

      - (bool)isLandscape {
          if (orientation == UIInterfaceOrientationLandscapeLeft || orientation == UIInterfaceOrientationLandscapeRight)
              return true;
          else
              return false;
      }

      - (id)init {
          [[NSNotificationCenter defaultCenter] addObserver:self
                                                   selector:@selector(orientationChanged:)
                                                       name:UIDeviceOrientationDidChangeNotification
                                                     object:nil];
      }

      - (void)orientationChanged:(NSNotification *)notification {
          UIInterfaceOrientation curOrientation = [[UIDevice currentDevice] orientation];
          if (curOrientation == UIDeviceOrientationPortrait || curOrientation == UIDeviceOrientationPortraitUpsideDown ||
              curOrientation == UIDeviceOrientationLandscapeLeft || curOrientation == UIDeviceOrientationLandscapeRight) {
              orientation = curOrientation;
              ((AppDelegate *)([UIApplication sharedApplication].delegate)).savedOrientationForRestart = orientation;
              NSLog(@"changed");
          }
      }

      - (void)validateOrientation {
          // first time when initializing orientation
          UIInterfaceOrientation curOrientation = [[UIDevice currentDevice] orientation];
          if (curOrientation != UIDeviceOrientationPortrait && curOrientation != UIDeviceOrientationPortraitUpsideDown &&
              curOrientation != UIDeviceOrientationLandscapeLeft && curOrientation != UIDeviceOrientationLandscapeRight) {
              orientation = [[UIApplication sharedApplication] statusBarOrientation];
          }
      }
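    For what it's worth, a sketch of the standard arrangement (not a verified fix for this exact report): when the root view controller reports its supported orientations and UIKit applies the rotation itself, the window's hit-testing area follows automatically, which avoids a 768-pixel-wide dead zone; the launch orientation can additionally be declared with the UIInterfaceOrientation key in Info.plist.

      // In the root view controller: let UIKit rotate the view hierarchy (and its touch coordinate space).
      - (BOOL)shouldAutorotateToInterfaceOrientation:(UIInterfaceOrientation)interfaceOrientation {
          return YES;   // support all orientations instead of rotating each subview by hand
      }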

  • Mingling C++ classes with Objective C classes

    - by Joey
    I am using the iPhone SDK and coding primarily in C++ while using parts of the SDK in Objective-C. Is it possible to designate a C++ class in situations where an Objective-C class is needed? For instance:

    1) When setting delegates of Objective-C objects. I cannot make a C++ class derive from a delegate protocol, so this (and possibly other reasons) prevents me from making my C++ class a delegate for various Objective-C objects. What I do as a solution is create an Objective-C adapter class that contains a pointer to the C++ class and is used as the delegate (notifying the C++ class when it is called). It feels cumbersome to write these every time I need to get delegate notifications into a C++ class.

    2) When setting selectors. This goes hand in hand with item 1. Say I want to set a callback to fire when something is done, like a button press or a setAnimationDidStopSelector in the UIView animation functionality. It would be nice to be able to designate a C++ function along with the relevant delegate for setAnimationDelegate. Well, I suspect this isn't readily possible, but if anyone has suggestions on how to do it if it is, or on how to write such things more easily, I would love to hear them. Thanks.
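    For reference, a minimal sketch of the adapter pattern the post describes (all names are hypothetical), written in an Objective-C++ (.mm) file so the adapter can hold a pointer to the C++ object directly:

      // CppAnimationListener is a hypothetical C++ interface with an onAnimationDone() callback.
      class CppAnimationListener {
      public:
          virtual ~CppAnimationListener() {}
          virtual void onAnimationDone() = 0;
      };

      // Objective-C adapter usable wherever a target/selector or delegate is required.
      @interface AnimationDelegateAdapter : NSObject {
          CppAnimationListener *listener;   // not owned
      }
      - (id)initWithListener:(CppAnimationListener *)aListener;
      - (void)animationDidStop:(NSString *)animationID finished:(NSNumber *)finished context:(void *)context;
      @end

      @implementation AnimationDelegateAdapter
      - (id)initWithListener:(CppAnimationListener *)aListener {
          if ((self = [super init])) {
              listener = aListener;
          }
          return self;
      }
      - (void)animationDidStop:(NSString *)animationID finished:(NSNumber *)finished context:(void *)context {
          listener->onAnimationDone();   // forward the Objective-C callback into C++
      }
      @end

    It would then be wired up with [UIView setAnimationDelegate:adapter] and [UIView setAnimationDidStopSelector:@selector(animationDidStop:finished:context:)], with the adapter kept alive for the duration of the animation.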

  • iPhone multitouch - Some touches dispatch touchesBegan: but not touchesMoved:

    - by zkarcher
    I'm developing a multitouch application. One touch is expected to move, and I need to track its position. For all other touches, I need to track their beginnings and endings, but their movement is less critical. Sometimes, when 3 or more touches are active, my UIView does not receive touchesMoved: events for the moving touch. The problem is intermittent and can always be reproduced after a few attempts:

    1. Touch the screen with 2 fingers.
    2. Touch the screen with another finger, and move this finger around.

    The moving finger always dispatches touchesBegan: and touchesEnded:, but sometimes does not dispatch any touchesMoved: events. Whenever the moving touch does not dispatch touchesMoved: events, I can force it to dispatch touchesMoved: by moving one of the other touches. This seems to "force" every touch to recheck its position, and I successfully receive a touchesMoved: event. However, this is clumsy. This bug is reproducible on both the iPhone 2G and 3GS models. My question is: how do I ensure that my moving touch dispatches touchesMoved: events? Does anyone have any experience with this issue? I've spent a few fruitless days searching the web for answers. I found a post describing how to sync touch events with the VBL: http://www.71squared.com/2009/04/maingameloop-changes/ . However, this has not solved the problem. I really don't know how to proceed. Any help is appreciated!

  • Cannot hide a UIButton - Please help!

    - by Neurofluxation
    Hey again, I have the following code:

      visitSite.hidden = YES;

    For some reason, when I tap a UIButton and call this piece of code, the visitSite button does not hide. The code is within this block:

      - (IBAction)welcomeButtonPressed:(id)sender {
          [UIButton beginAnimations:@"welcomeAnimation" context:NULL];
          [UIButton setAnimationDuration:1.5];
          [UIButton SetAnimationDidStopSelector:@selector(nowHideThisSiteButton:finished:context:)];
          [UIButton setAnimationTransition:UIViewAnimationTransitionCurlUp forView:self.view cache:YES];
          ((UIView *)sender).hidden = YES;
          [UIButton commitAnimations];
      }

    and the stop selector below:

      - (void)nowHideThisSiteButton:(NSString *)animationID finished:(BOOL *)finished context:(void *)context {
          visitSite.hidden = YES;
      }

    I've also tried [visitSite setHidden:YES]; and that fails as well. ALSO, I've noticed that setAnimationDidStopSelector does not get called at all. visitSite (when NSLogged) equals:

      <UIButton: 0x1290f0; frame = (0 0; 320 460); opaque = NO; autoresize = RM+BM; layer = <CALayer: 0x1290f0>>

    and visitSite.hidden (when NSLogged) equals NULL. Any more ideas? :(
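    Two details stand out to me here (observations, not a confirmed answer from the thread): the block never sets an animation delegate, and without one the did-stop selector is ignored; and the message is spelled SetAnimationDidStopSelector rather than setAnimationDidStopSelector. A sketch of the conventional block, with the class messages sent to UIView:

      [UIView beginAnimations:@"welcomeAnimation" context:NULL];
      [UIView setAnimationDuration:1.5];
      [UIView setAnimationDelegate:self];   // without a delegate, the did-stop selector is never called
      [UIView setAnimationDidStopSelector:@selector(nowHideThisSiteButton:finished:context:)];
      [UIView setAnimationTransition:UIViewAnimationTransitionCurlUp forView:self.view cache:YES];
      ((UIView *)sender).hidden = YES;
      [UIView commitAnimations];

    The did-stop selector's finished: argument is also conventionally typed NSNumber * rather than BOOL *.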

  • Removing a view from its superview causes memory error - why?

    - by mystify
    Xcode is throwing an error at me:

      malloc: *** error for object 0x103f000: pointer being freed was not allocated
      *** set a breakpoint in malloc_error_break to debug

    I tracked down the code to a line where I do this:

      - (void)inputValueCommitted:(NSString *)animationID finished:(BOOL)finished context:(void *)context {
          // retainCount of myView is 2! (one for the retain property, one for being a subview)
          [self.myView removeFromSuperview]; // ERROR LINE !!
          self.myView = nil;
      }

    When I remove that line, the error is gone. So in conclusion: I can't get rid of my view! It's a UIImageView with nothing else inside, just showing an image. What I do is this: I create a UIView animation block, create the UIImageView, assign it to a retain property with self.myView = ..., and after the animation is done I just want to get rid of that view. So I remove it from its superview and then set my property to nil, which should let it go away - in theory. Did anyone else encounter such issues? iPhone SDK 3.0.
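    For reference, the ownership pattern that usually keeps this balanced (a sketch under my own assumptions; it is not a confirmed diagnosis of the malloc error): exactly one release to balance the alloc once the retain property has taken ownership, and nothing more.

      // Creating and installing the view (the image name is hypothetical):
      UIImageView *iv = [[UIImageView alloc] initWithImage:[UIImage imageNamed:@"overlay.png"]];
      self.myView = iv;        // retain property takes ownership
      [iv release];            // balances the alloc; the property keeps the view alive
      [self.view addSubview:self.myView];

      // When the animation finishes:
      [self.myView removeFromSuperview];
      self.myView = nil;       // releases the last reference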

  • iPhone: TabView + TableView

    - by OneTrickPonySoft
    I think I'm missing something simple, but I can't figure out exactly what it is. I'm trying to set up an app with a UITabBarController, and one of the tabs will have a UITableView and a UISearchBar (but no navigation controller). I set up the UITabBarController with all the tabs in Interface Builder, and the views are in their own xib files. The xib file for the tab with the UITableView is set up and connected as follows.

    Stuff in the nib browser:
      File's Owner (class is my custom class, a child of UITableViewController)
        view -> View
      View (class UIView, referenced as the view outlet of File's Owner), containing:
        UITableView (if I try to set references to the data source / delegate, the app breaks)
        UISearchBar (unconfigured at the moment)

    This setup displays all the items and doesn't lock up, but I can't assign a data source without it crashing when I try to load the tab with the UITableView. What should I do to get data into this table, either in IB or in code? My ideas are as follows:

    1. Implement a custom UITableView class and hook it up to the table view in IB or to the custom table view controller in Xcode.
    2. Pound head or laptop against the wall until it works.

    Update: here's the error the simulator pushes to the console when I connect the table view's data source and delegate to File's Owner (whose class is my custom table view controller):

      2/14/09 6:59:12 PM TabBarWillbeRight[33172] *** Terminating app due to uncaught exception 'NSInvalidArgumentException',
      reason: '*** -[UIViewController tableView:numberOfRowsInSection:]: unrecognized selector sent to instance 0x523760'
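    The exception names a plain UIViewController as the receiver, which usually means the object wired up as data source is not an instance of the custom UITableViewController subclass at runtime (for example, the class field on the tab's view controller in the main nib was left at its default). That is my reading, not a confirmed diagnosis. Once the custom class is actually the one instantiated, it needs at least the two required UITableViewDataSource methods; a minimal sketch with a hypothetical items array:

      // Minimal data source implementation expected in the custom table view controller.
      - (NSInteger)tableView:(UITableView *)tableView numberOfRowsInSection:(NSInteger)section {
          return [items count];
      }

      - (UITableViewCell *)tableView:(UITableView *)tableView cellForRowAtIndexPath:(NSIndexPath *)indexPath {
          static NSString *cellId = @"Cell";
          UITableViewCell *cell = [tableView dequeueReusableCellWithIdentifier:cellId];
          if (cell == nil) {
              cell = [[[UITableViewCell alloc] initWithStyle:UITableViewCellStyleDefault
                                             reuseIdentifier:cellId] autorelease];
          }
          cell.textLabel.text = [items objectAtIndex:indexPath.row];
          return cell;
      }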

  • iPhone - how to track touches and allow button taps at the same time?

    - by Jonathan Cohen
    I'm wondering how to track touches anywhere on the iPhone screen and still have UIButtons respond to taps. I subclassed a UIView, made it full screen and the highest view in the hierarchy, and overrode its pointInside:withEvent: method. If I return YES, I'm able to track touches anywhere on the screen but the buttons don't respond (likely because the view is instructed to handle and terminate the touch). If I return NO, the touch passes through the view and the buttons respond, but I'm not able to track touches. Do I need to subclass UIButton, or is this possible through the responder chain? What am I doing wrong?

      - (BOOL)pointInside:(CGPoint)point withEvent:(UIEvent *)event {
          return NO;
      }

      // only works if pointInside:withEvent: returns YES.
      - (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
          NSLog(@"began");
          [self.nextResponder touchesBegan:touches withEvent:event];
      }

      // only works if pointInside:withEvent: returns YES.
      - (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event {
          NSLog(@"end");
          [self.nextResponder touchesEnded:touches withEvent:event];
      }
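    One approach commonly used for this (a sketch on my own initiative, not from the thread): instead of an overlay view, subclass UIWindow and override sendEvent:, which lets the app observe every touch and still deliver it normally, so the buttons keep working.

      // Sketch: a UIWindow subclass that observes every touch without swallowing it.
      @interface TouchObservingWindow : UIWindow
      @end

      @implementation TouchObservingWindow
      - (void)sendEvent:(UIEvent *)event {
          for (UITouch *touch in [event allTouches]) {
              NSLog(@"touch phase %d at %@", (int)touch.phase,
                    NSStringFromCGPoint([touch locationInView:self]));
          }
          [super sendEvent:event];   // normal delivery, so the buttons still receive their taps
      }
      @end

    The window's class can be set to TouchObservingWindow in the main nib, or the window can be created with that class in the app delegate.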

  • Obtaining the touch location for a UIScrollView touch

    - by LOSnively
    I have a UIScrollView as an element of a view controller, along with other view objects. The image scrolls and zooms as expected when the scroll view is the top subview. However, I also need to get the screen location of the touch, in particular when there is no scroll action. (I understand the location may change during a scroll, but that's not important.) I haven't found a way to do that. In the view controller implementation I have customized all of the standard methods that should do this: touchesShouldBegin:..., touchesBegan:..., touchesEnded:..., and so on. As far as I can tell, none of these are called during a touch event when the scroll view is the top subview. I've tried setting the delaysContentTouches property to both YES and NO, and that doesn't seem to make a difference. As an alternative, I've tried putting a UIView as the top subview and then passing the touches to the now-underlying scroll view. In this configuration the standard methods are called and I can get the touch location, but I haven't found a mechanism for the touches to be passed on to the scroll view so that scrolling occurs. Sending the touch messages to the specific scroll view, to super, or to nextResponder doesn't do it. It seems I can make the scrolling work or find the location of the touch, but not both, depending on what the top subview is. I suspect this is trivial, but after two weeks of struggling it's time to eat my embarrassment for not being able to do this seemingly simplest of things. I've read all of the related questions here on Stack Overflow, tried most if not all of the suggestions, and so far nothing has worked. I've looked through the various links and references suggested by the answers, including Apple's documentation, but none have pointed out the gap in my understanding. Any ideas would be appreciated.
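    If the deployment target allows iOS 3.2 or later (an assumption on my part), a UITapGestureRecognizer attached to the scroll view coexists with scrolling and reports the touch location directly; a minimal sketch with hypothetical names:

      // In the view controller that owns the scroll view (e.g. in viewDidLoad):
      UITapGestureRecognizer *tap = [[UITapGestureRecognizer alloc]
          initWithTarget:self action:@selector(scrollViewTapped:)];
      [scrollView addGestureRecognizer:tap];   // coexists with the scroll view's own pan handling
      [tap release];

      // Target method: reports where the tap landed.
      - (void)scrollViewTapped:(UITapGestureRecognizer *)recognizer {
          CGPoint inScrollView = [recognizer locationInView:recognizer.view];
          CGPoint onScreen = [recognizer.view convertPoint:inScrollView toView:nil];   // window coordinates
          NSLog(@"tap at %@ (screen %@)", NSStringFromCGPoint(inScrollView), NSStringFromCGPoint(onScreen));
      }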

  • Switch between multiple views while respecting orientation

    - by zoul
    Hello! I have an MVC application with a single model and several views (something like skins). I want the user to be able to switch the views, and I can't get it working with interface orientation. The simplest approach looks like this:

      - (void)switchToADifferentView:(UIView *)newView {
          // self is a descendant of UIViewController
          self.view = newView;
      }

    This does not work because the incoming view does not get rotated according to the current orientation (until the next orientation change, test case). Is there a way to force the orientation on a view? It looks like the system is trying really hard to keep the interface controls for itself. (Or is it as simple as setting the right transform by hand?) I figured I'd better not switch the views directly and switch controllers instead. This makes sense, as it makes the initial code simpler. But how do I switch controllers that have no "navigation relation" between them? I guess I could use presentModalViewController:, but that seems like a hack. Same goes for a navigation controller. If I exchange the controllers by hand, I get the wrong orientation again:

      - (void)switchToAController:(id)incoming {
          [currentController.view removeFromSuperview];
          [window addSubview:incoming.view];   // does not respect current orientation
      }

    Now how the heck do I simply exchange the current controller for another one? Again, the controllers are something like "skins" operating above a shared model, so it really makes no sense to pretend that skin A is a "modal" dialog above skin B or that they're part of a navigation stack.
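    One arrangement that sidesteps the problem (a sketch under my own assumptions, not a confirmed answer): keep a single root view controller attached to the window for the whole lifetime and swap the skins as subviews of its view, so UIKit keeps applying the rotation transform at the root and every skin inherits it. currentSkinView is a hypothetical bookkeeping ivar.

      // In the root view controller:
      - (void)switchToSkinView:(UIView *)newSkinView {
          [currentSkinView removeFromSuperview];
          currentSkinView = newSkinView;
          newSkinView.frame = self.view.bounds;   // picks up the already-rotated bounds
          newSkinView.autoresizingMask = UIViewAutoresizingFlexibleWidth | UIViewAutoresizingFlexibleHeight;
          [self.view addSubview:newSkinView];
      }

      - (BOOL)shouldAutorotateToInterfaceOrientation:(UIInterfaceOrientation)interfaceOrientation {
          return YES;   // the root controller handles rotation on behalf of every skin
      }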

  • iOS: saving a UILabel and UIImageView as a UIImage

    - by Ashraf Hussein
    I'm trying to add text to an image by adding a UILabel as a subview of a UIImageView. I already did that, but now I want to save the result as a single image. I'm using renderInContext: but it's not working. Here's my code:

      UIImage *img = [UIImage imageNamed:@"IMG_1650.JPG"];
      float x = (img.size.width / imageView.frame.size.width) * touchPoint.x;
      float y = (img.size.height / imageView.frame.size.height) * touchPoint.y;
      CGPoint tpoint = CGPointMake(x, y);
      UIFont *font = [UIFont boldSystemFontOfSize:30];
      context = UIGraphicsGetCurrentContext();
      UIGraphicsBeginImageContextWithOptions(img.size, YES, 0.0);
      [[UIColor redColor] set];
      for (UIView *view in [imageView subviews]) {
          [view removeFromSuperview];
      }
      UILabel *lbl = [[UILabel alloc] init];
      [lbl setText:txt];
      [lbl setBackgroundColor:[UIColor clearColor]];
      CGSize sz = [txt sizeWithFont:lbl.font];
      [lbl setFrame:CGRectMake(touchPoint.x, touchPoint.y, sz.width, sz.height)];
      lbl.transform = CGAffineTransformMakeRotation(-M_PI / 4);
      [imageView addSubview:lbl];
      [imageView bringSubviewToFront:lbl];
      [imageView setImage:img];
      [imageView.layer renderInContext:UIGraphicsGetCurrentContext()];
      [lbl.layer renderInContext:UIGraphicsGetCurrentContext()];
      UIImage *nImg = UIGraphicsGetImageFromCurrentImageContext();
      UIGraphicsEndImageContext();
      UIImageWriteToSavedPhotosAlbum(nImg, nil, nil, nil);

    Thanks.
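    A likely culprit (my reading of the snippet, not a confirmed fix): UIGraphicsGetCurrentContext() is captured before UIGraphicsBeginImageContextWithOptions(), and renderInContext: only produces output once the image context is the current one. A sketch of the usual ordering; rendering the image view's layer also renders its sublayers, so the label does not need a second pass:

      UIGraphicsBeginImageContextWithOptions(img.size, YES, 0.0);   // make the image context current first
      CGContextRef ctx = UIGraphicsGetCurrentContext();             // now this is the image context

      [imageView.layer renderInContext:ctx];                        // image plus its subviews (the label)
      UIImage *flattened = UIGraphicsGetImageFromCurrentImageContext();
      UIGraphicsEndImageContext();

      UIImageWriteToSavedPhotosAlbum(flattened, nil, nil, nil);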

  • iPhone UI layout debugging

    - by Cruinh
    I have a chronic issue with iPhone UI development where views sometimes seem to appear on the screen in a location different from what is reported by their frame property. Here is what I am doing to try to debug the issue:

      UIView *currentView = self.view;
      while (currentView != nil) {
          NSLog(@"frame: %f,%f,%f,%f",
                currentView.frame.origin.x, currentView.frame.origin.y,
                currentView.frame.size.width, currentView.frame.size.height);
          currentView = currentView.superview;
      }

    I expect this to show me the coordinates and size of each element up the hierarchy from the given view to the app's root UIWindow, with the coordinates of each element relative to its parent. However, that does not seem to be the case. In my current situation, I have a UI I'm trying to debug where every other time I rotate the device the whole UI shifts up or down 20 pixels, yet the code block above reports exactly the same numbers every time. I tried calling the above code after as much as a second's delay, but the numbers still come out the same each time. Does anyone know a better way to inspect the screen coordinates of UI elements? If I can detect when one is wrong, I can compensate for the problem when it appears.
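    For comparing against what is actually on screen, converting the view's bounds into window coordinates is often more telling than walking frames by hand, since it accounts for every ancestor's transform and offset in one step (a sketch, not from the post):

      // Where the view really sits, expressed in the window's coordinate space.
      CGRect onScreen = [self.view convertRect:self.view.bounds toView:nil];   // nil means the window
      NSLog(@"on-screen rect: %@", NSStringFromCGRect(onScreen));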

  • Drawing a continuous line

    - by japs
    Hi all, how do I draw a continuous line in a UIView? I have used the code below and it works fine, but after drawing a straight line, when I draw another line the first one gets cleared; if I do not clear it, the two lines join each other. Please suggest a solution.

      - (void)drawRect:(CGRect)rect {
          CGContextRef context = UIGraphicsGetCurrentContext();
          //for (Line *eachLine in lineArray)
          //    [eachLine drawInContext:context];
          CGContextSetLineWidth(context, 2.0);
          CGContextSetStrokeColorWithColor(context, [UIColor redColor].CGColor);
          CGContextSetFillColorWithColor(context, [UIColor redColor].CGColor);
          if (firstTouch.x != 0.0 && firstTouch.y != 0.0) {
              CGRect dotRect = CGRectMake(firstTouch.x - 3, firstTouch.y - 3.0, 5.0, 5.0);
              CGContextAddEllipseInRect(context, dotRect);
              CGContextDrawPath(context, kCGPathFillStroke);
              CGContextMoveToPoint(context, firstTouch.x, firstTouch.y);
              for (NSString *onePointString in points) {
                  CGPoint nextPoint = CGPointFromString(onePointString);
                  CGContextAddLineToPoint(context, nextPoint.x, nextPoint.y);
              }
              CGContextStrokePath(context);
          } else {
              CGContextSetFillColorWithColor(context, self.backgroundColor.CGColor);
              CGContextAddRect(context, self.bounds);
              CGContextFillPath(context);
          }
      }
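    The usual pattern for this (a sketch along the lines the snippet already hints at with its points collection): append every point the user draws to the array in touchesMoved: and let drawRect: replay the whole array each time, so earlier strokes are never lost.

      // In the drawing view (points is the same NSMutableArray the drawRect: above reads from):
      - (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
          CGPoint p = [[touches anyObject] locationInView:self];
          [points addObject:NSStringFromCGPoint(p)];
          [self setNeedsDisplay];   // drawRect: replays every stored point, so earlier strokes survive
      }

    To keep several separate strokes visible, the same idea extends to an array of point arrays, one per stroke, with a new inner array started in touchesBegan:.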

  • Quartz 2D or OpenGL ES? Pros and cons in the long term, possibility of migration to other platforms.

    - by fspirit
    Hi all! I'm having a hard time deciding whether to go with Quartz 2D or OpenGL ES for an iPad game. It will be mostly 2D, but effect-intense (simultaneous lighting effects for 10-30 objects, 10-20 simultaneous animations on the screen). So far, assuming I know both technologies equally little and would have to learn them from the ground up, I have come to this list. (I've read several topics here on SO with names like "Quartz or OpenGL", but I'm still left with some questions.)

    Quartz:
    - Better time-to-market, because of ready-to-use abstractions like UIView, UIImageView, and the Core Animation abstractions.

    OpenGL ES:
    - Closer to the hardware, thus better performance.
    - An app implemented with OpenGL ES can be migrated more easily to Android, MeeGo, Windows Phone, etc.

    My questions are:

    1. How long would it take to rewrite a Quartz 2D app to use OpenGL? Let's say it took me 2 man-months to write the Quartz app; how much time would I need to rewrite it? (Please, just some subjective opinions; I'll try to summarize them somehow.)
    2. Regarding the ease of migration to other platforms when using OpenGL, is it really so? Or would the effort of migrating a Quartz app from iPhone OS to Android not be much bigger compared to migrating an OpenGL app? (Ease of migration is quite an important criterion.)
    3. Regarding OpenGL, should I go with OpenGL ES 1.1 or 2.0 with migration in mind? (Android supports 2.0 through the NDK, but I don't know whether using the NDK will increase or decrease the migration effort.)

  • Using a UITableViewController with a small-sized table?

    - by rpj
    When using a UITableViewController, the initWithStyle: method automatically creates the underlying UITableView with - according to the documentation - "the correct dimensions". My problem is that these "correct dimensions" seem to be 320x460 (the iPhone's screen size), but I'm pushing this table view / controller pair onto a UINavigationController which is itself contained in a UIView that is about half the height of the screen. No frame or bounds wrangling I can come up with seems to correctly reset the table's size, and as such it's "too long", meaning a collection of rows is pushed off the bottom of the screen and is neither visible nor reachable by scrolling. So my question comes down to: what is the proper way to tell a UITableViewController to resize its UITableView to a specified rectangle? Thanks!

    Update: I've tried all the techniques suggested here to no avail, but I did find one interesting thing: if I eschew the UINavigationController altogether (which I'm not yet willing to do for production, but as an experiment) and add the table view as a direct subview of the enclosing view I mentioned, the frame size given is respected. The moment I re-introduce the UINavigationController into the mix - no matter whether it is added as a subview before or after the table view, and no matter whether I alloc/init it before or after the table view is added as a subview - the result is the same as before. I'm beginning to suspect UINavigationController isn't much of a team player...

    Update 2: The suggestion to check the frame size after the table view is on screen was a good one: it turns out that the navigation controller is in fact resizing it some time between load and display. My solution, hacky at best, has been to cache the frame given on load and to reset it, if changed, at the beginning of tableView:cellForRowAtIndexPath:. Why there, you ask? Because it's the one place I found that worked, that's why! I don't consider this a proper solution, as it's obviously improper, but for the benefit of anyone else reading, it does seem to work.
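    For completeness, another place people commonly pin the frame back is viewWillAppear:, after the navigation controller has done a round of its own layout; whether it runs late enough for this particular case is something I cannot confirm, so treat it as a hedged alternative to the cellForRowAtIndexPath: hack.

      - (void)viewWillAppear:(BOOL)animated {
          [super viewWillAppear:animated];
          // desiredFrame is whatever rectangle the half-height container dictates (hypothetical value).
          self.tableView.frame = desiredFrame;
      }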

  • iPhone crash log with dSYM not loading debug information

    - by AngeDeLaMort
    Hello, I was trying to see why my application crashed on the device (iPhone) using the dSYM generated alongside the executable (in an ad hoc build), but there isn't any useful information in it. The Organizer seems able to find the appropriate dSYM and translate some of the data into a more readable form, but when it comes to my own application's frames, I just get an address. Since I know how to reproduce the crash, I've tried to set up my build so it can help me in the future. I've checked that all the proper flags are set in the project build properties and everything seems fine. After doing some research, it seems that all the information is stripped at link time and the dSYM appears completely useless. I've played with some flags, but nothing changed. So, is there something special to do in order to make the crash file human readable? Or is it impossible with the ad hoc settings? The closest I have come is building a debug version and looking up the address in it; at least that seems to point to the right file. So, I made a sample app and here is what I have (the line I want is #4):

      Thread 0 Crashed:
      0   libobjc.A.dylib    0x00003ebc objc_msgSend + 20
      1   UIKit              0x0005c970 -[UIView dealloc] + 60
      2   UIKit              0x0005c840 -[UIImageView dealloc] + 76
      3   CoreFoundation     0x0003963a -[NSObject release] + 28
      4   MyApplication      0x000046a6 0x1000 + 13990
      5   UIKit              0x00069750 -[UIViewController view] + 44
      6   MyApplication      0x000053fa 0x1000 + 17402

    The crash is produced using two successive releases on an object. Thanks in advance.

  • How should child views of UIScrollView report their bounds for contentSize?

    - by Mike
    I'm looking more for advice on the correct design for a view. What I have is a UIScrollView that contains one or more custom views I have created. My problem is: who reports to the scroll view what its contentSize should be? I have the following:

      UIView
      +- UIScrollView
         +- CustomView 1 with dynamic height depending on data
         +- CustomView 2 with dynamic height depending on data

    The UIViewController creates new instances of the custom views with data and then adds them as subviews of the UIScrollView. The problem I'm having is how to set the value of the scroll view's contentSize. Right now I'm not doing that, and the contents of the scroll view are clipped with no scrolling possible. Should the custom view call [parent setContentSize:] in its drawRect:? Should the UIViewController query the custom view after creation to get its bounds and then call setContentSize:? Should I subclass UIScrollView and override addSubview: to query each subview's height? Is there something else I'm missing? I hope I explained that properly. I'm new to this and still getting a handle on things.
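    A sketch of the arrangement that is usually recommended here (my wording, one of several workable designs): let the view controller, which already knows both children, stack them and then set contentSize from the total height. customView1, customView2 and scrollView stand for the poster's objects.

      // In the view controller, after the custom views have been created and sized to their content:
      CGFloat totalHeight = 0.0;
      for (UIView *child in [NSArray arrayWithObjects:customView1, customView2, nil]) {
          CGRect f = child.frame;
          f.origin.y = totalHeight;   // stack the children vertically
          child.frame = f;
          [scrollView addSubview:child];
          totalHeight += f.size.height;
      }
      scrollView.contentSize = CGSizeMake(scrollView.bounds.size.width, totalHeight);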

  • How to use generic (NSObject) controller with subviews of a UIViewController?

    - by wanderlust
    I have a UIViewController that loads several subviews at different times based on user interaction. I originally built all of these subviews in code, with no nib files. Now I am moving to nib files with custom UIView subclasses. Some of these subviews display static data, and I am using loadNibNamed:owner:options: to load them into the view controller. Others contain controls that I need to access. I (sort of) understand the reasons Apple says to use one view controller per screen of content, using generic controller objects (NSObjects) to manage subsections of a screen. So I need a view controller, a generic controller, a view class and a nib. How do I put this all together? My working assumptions and subsequent questions:

    1. I will associate the view class with the nib in the class identity drop-down in IB.
    2. The view controller will coordinate overall screen interactions. When necessary, it will create an instance of the generic controller. Does the generic controller load the nib? How?
    3. Do I define the outlets and actions in the view class, or should they be in the generic controller?
    4. How do I pass messages between the view controller and the generic controller?

    If anyone can point me to some sample code using a controller in this way, it will go a long way toward helping me understand. None of the books or Stack Overflow posts I've read have quite hit the spot yet.
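    As a concrete illustration of the pattern (a sketch under my own naming, not Apple's sample code): the generic controller is the nib's File's Owner, so the outlets and actions live on it, it loads the nib itself, and it hands its top-level view to whoever asked.

      // A plain NSObject that owns one subsection of the screen (all names are hypothetical).
      @interface SectionController : NSObject {
          UIView *view;
          UIButton *actionButton;
      }
      @property (nonatomic, retain) IBOutlet UIView *view;            // top-level view in SectionView.xib
      @property (nonatomic, retain) IBOutlet UIButton *actionButton;
      - (IBAction)buttonTapped:(id)sender;
      @end

      @implementation SectionController
      @synthesize view, actionButton;

      - (id)init {
          if ((self = [super init])) {
              // File's Owner of SectionView.xib is SectionController, so the outlets connect to self here.
              [[NSBundle mainBundle] loadNibNamed:@"SectionView" owner:self options:nil];
          }
          return self;
      }

      - (IBAction)buttonTapped:(id)sender {
          // Forward to the owning view controller, e.g. through a delegate or an NSNotification.
      }

      - (void)dealloc {
          [actionButton release];
          [view release];
          [super dealloc];
      }
      @end

    The screen's UIViewController then creates a SectionController, adds sectionController.view as a subview, and keeps the SectionController in an ivar so it lives as long as its outlets are needed.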

  • Optimizing headers for UITableView?

    - by Pask
    I have an optimization problem with the headers of a table in plain style. If I use the standard view for the headers (the classic gray with titles set by titleForHeaderInSection:), everything is fine and scrolling is smooth and immediate. When, instead, I use this code to set my own view:

      - (UIView *)tableView:(UITableView *)tableView viewForHeaderInSection:(NSInteger)section {
          return [self headerPerTitolo:[titoliSezioni objectAtIndex:section]];
      }

      - (UIImageView *)headerPerTitolo:(NSString *)titolo {
          UIImageView *headerView = [[[UIImageView alloc] initWithFrame:CGRectMake(10.0, 0.0, 320.0, 44.0)] autorelease];
          headerView.image = [UIImage imageNamed:kNomeImmagineHeader];
          headerView.alpha = kAlphaSezioniTablePlain;
          UILabel *headerLabel = [[[UILabel alloc] initWithFrame:CGRectZero] autorelease];
          headerLabel.backgroundColor = [UIColor clearColor];
          headerLabel.opaque = NO;
          headerLabel.textColor = [UIColor whiteColor];
          headerLabel.font = [UIFont boldSystemFontOfSize:16];
          headerLabel.frame = CGRectMake(10.0, -11.0, 320.0, 44.0);
          headerLabel.textAlignment = UITextAlignmentLeft;
          headerLabel.text = titolo;
          [headerView addSubview:headerLabel];
          return headerView;
      }

    scrolling is jerky and not immediate (sliding a finger on the screen does not produce an immediate shift of the table). I do not know what causes this problem; maybe the fact that every time viewForHeaderInSection: is called, the code creates a new UIImageView. I have tried many ways to solve it, such as creating an array of all the necessary views: apart from more time spent loading at startup, the low responsiveness of the table remains. I also tried reducing the header image from about 66 KB to 4 KB: besides a loss in color quality (which distorts the original graphics a bit), the problem persists! Perhaps you have suggestions, or know some obscure techniques that would let me optimize this aspect of my application. I apologize for my English; I used Google for translation.

  • Why is there no autorelease pool when I do performSelectorInBackground: ?

    - by Thanks
    I am calling a method that runs on a background thread:

      [self performSelectorInBackground:@selector(loadViewControllerWithIndex:)
                             withObject:[NSNumber numberWithInt:viewControllerIndex]];

    Then I have this method implementation that gets called by the selector:

      - (void)loadViewControllerWithIndex:(NSNumber *)indexNumberObj {
          NSAutoreleasePool *arPool = [[NSAutoreleasePool alloc] init];
          NSInteger vcIndex = [indexNumberObj intValue];
          Class c;
          UIViewController *controller = [viewControllers objectAtIndex:vcIndex];
          switch (vcIndex) {
              case 0:
                  c = [MyFirstViewController class];
                  break;
              case 1:
                  c = [MySecondViewController class];
                  break;
              default:
                  NSLog(@"unknown index for loading view controller: %d", vcIndex); // error
                  break;
          }
          if ((NSNull *)controller == [NSNull null]) {
              controller = [[c alloc] initWithNib];
              [viewControllers replaceObjectAtIndex:vcIndex withObject:controller];
              [controller release];
          }
          if (controller.view.superview == nil) {
              UIView *placeholderView = [viewControllerPlaceholderViews objectAtIndex:vcIndex];
              [placeholderView addSubview:controller.view];
          }
          [arPool release];
      }

    Although I do create an autorelease pool there for that thread, I always get this error:

      2009-05-30 12:03:09.910 Demo[1827:3f03] *** _NSAutoreleaseNoPool(): Object 0x523e50 of class NSCFNumber
      autoreleased with no pool in place - just leaking
      Stack: (0x95c83f0f 0x95b90442 0x28d3 0x2d42 0x95b96e0d 0x95b969b4 0x93a00155 0x93a00012)

    If I take away the autorelease pool, I get a whole bunch of messages like these. I also tried to create an autorelease pool around the call to performSelectorInBackground:, but that doesn't help. I suspect the parameter, but I don't know why the runtime complains about an NSCFNumber. Am I missing something? My instance variables are all "nonatomic". Can that be a problem?

    UPDATE: I also suspect that some variable has been added to an autorelease pool of the main thread (maybe an ivar), and now it tries to release it inside the wrong autorelease pool. If so, how could I fix that? (Damn, this threading stuff is complex ;) )
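    Separate from the pool question, the body of loadViewControllerWithIndex: creates view controllers and adds subviews, and UIKit generally expects that kind of work on the main thread. A hedged sketch of keeping the heavy lifting in the background but hopping back to the main thread for the view work (installViewControllerWithIndex: is a hypothetical helper):

      - (void)loadViewControllerWithIndex:(NSNumber *)indexNumberObj {
          NSAutoreleasePool *arPool = [[NSAutoreleasePool alloc] init];

          // ... background-safe work only here (no UIKit calls) ...

          // Hand the UIKit part back to the main thread.
          [self performSelectorOnMainThread:@selector(installViewControllerWithIndex:)
                                 withObject:indexNumberObj
                              waitUntilDone:NO];

          [arPool release];
      }

      // Runs on the main thread; safe to create the controller and add its view to the placeholder here,
      // using the same logic as in the original method.
      - (void)installViewControllerWithIndex:(NSNumber *)indexNumberObj {
          // create the controller if needed and add controller.view to the placeholder view
      }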

  • Three20 TTSectionedDataSource row height

    - by Ward
    Hey there, I'm using Three20 to create a table with several text fields for user registration. I've found two possible approaches with Three20. The first uses TTSectionedDataSource's tableDidLoadModel method to manually add UI components, and the second adds custom items that contain pre-formatted UI components. The second option seems far more complex, and I'm having a difficult time accessing the individual fields. So if one field is a text field for the username, I need to access that field to submit the username, and there doesn't seem to be an easy answer. The first option gives me a lot of flexibility, but I can't figure out how to set the individual row heights. One row may have a label above a text field, another may have an image, etc. Is there a method in TTSectionedDataSource that will allow me to set the height for each row? So far, I'm using the first approach and creating UIViews that hold a label field and a text field. I've tried changing the frame of the UIView before it is added to the items array, but it has no effect. Any ideas?
