Search Results

Search found 18447 results on 738 pages for 'iphone and ipod touch'.


  • register device at run time

    - by user177893
    In the App ID section of the Program Portal, locate the App ID you wish to use with the Apple Push Notification service. Only App IDs with a specific bundle ID can be used with the APNs. You cannot use a “wild-card” application ID. You must see “Available” under the Apple Push Notification service column to register this App ID and configure a certificate for this App ID. Click the ‘Configure’ link next to your desired App ID. In the Configure App ID page, check the Enable Push Notification Services box and click the Configure button. Clicking this button launches the APNs Assistant, which guides you through the next series of steps that create your App ID specific Client SSL certificate. Download the Client SSL certificate file to your download location. Navigate to that location and double-click the certificate file (which has an extension of cer) to install it in your keychain. When you are finished, click Done in the APNS Assistant. Double-clicking the file launches Keychain Access. Make sure you install the certificate in your login keychain on the computer you are using for provider development. The APNs SSL certificate should be installed on your notification server. When you finish these steps you are returned to the Configure App ID page of the iPhone Dev Center portal. The certificate should be badged with a green circle and the label “Enabled”. To complete the APNs set-up process, you will need to create a new provisioning profile containing your APNs-enabled App ID. Is it possible to do these steps through code?
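
    For reference, the portal and certificate steps above are one-time developer-side setup and cannot be driven from the app itself; what an app can do at run time is ask APNs for a device token and forward it to your provider. A minimal sketch of that device-side registration (these are the standard UIApplicationDelegate callbacks; posting the token to your own server is left as a placeholder):

        // Ask APNs for a device token at launch.
        - (void)applicationDidFinishLaunching:(UIApplication *)application {
            [application registerForRemoteNotificationTypes:
                (UIRemoteNotificationTypeAlert |
                 UIRemoteNotificationTypeBadge |
                 UIRemoteNotificationTypeSound)];
        }

        // APNs hands back the token; send it to your own notification server.
        - (void)application:(UIApplication *)application
            didRegisterForRemoteNotificationsWithDeviceToken:(NSData *)deviceToken {
            NSLog(@"Device token: %@", deviceToken);
            // POST deviceToken to your provider here (endpoint is app-specific).
        }

        - (void)application:(UIApplication *)application
            didFailToRegisterForRemoteNotificationsWithError:(NSError *)error {
            NSLog(@"APNs registration failed: %@", error);
        }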

    Read the article

  • How to setup OpenGL camera for a racing game

    - by vian
    I need the view to show the road polygon (a rectangle 3.f * 100.f) with a vanishing point for a road being at 3/4 height of the viewport and the nearest road edge as a viewport's bottom side. See Crazy Taxi game for an example of what I wish to do. I'm using iPhone SDK 3.1.2 default OpenGL ES project template. I setup the projection matrix as follows: glMatrixMode(GL_PROJECTION); glLoadIdentity(); glFrustumf(-2.25f, 2.25f, -1.5f, 1.5f, 0.1f, 1000.0f); Then I use glRotatef to adjust for landscape mode and setup camera. glMatrixMode(GL_MODELVIEW); glLoadIdentity(); glRotatef(-90, 0.0f, 0.0f, 1.0f); const float cameraAngle = 45.0f * M_PI / 180.0f; gluLookAt(0.0f, 2.0f, 0.0f, 0.0f, 0.0f, 100.0f, 0.0f, cos(cameraAngle), sin(cameraAngle)); My road polygon triangle strip is like this: static const GLfloat roadVertices[] = { -1.5f, 0.0f, 0.0f, 1.5f, 0.0f, 0.0f, -1.5f, 0.0f, 100.0f, 1.5f, 0.0f, 100.0f, }; And I can't seem to find the right parameters for gluLookAt. My vanishing point is always at the center of the screen.
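
    One technique worth trying for an off-center vanishing point is an asymmetric frustum rather than gluLookAt alone: the view axis projects to wherever y = 0 falls between the frustum's bottom and top. A sketch, assuming the projection's y axis corresponds to the on-screen vertical (the landscape glRotatef in the modelview will change which frustum axis that actually is):

        glMatrixMode(GL_PROJECTION);
        glLoadIdentity();
        // bottom = -2.25, top = 0.75: the view axis (y = 0 on the near plane) maps to
        // (0 - bottom) / (top - bottom) = 0.75, i.e. 3/4 of the viewport height.
        glFrustumf(-2.25f, 2.25f, -2.25f, 0.75f, 0.1f, 1000.0f);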

    Read the article

  • Core Data @sum aggregate

    - by nasim
    I am getting an exception when I try to get @sum on a column in an iPhone Core Data application. My two models are the following - Task model: @interface Task : NSManagedObject { } @property (nonatomic, retain) NSString * taskName; @property (nonatomic, retain) NSSet* completion; @end @interface Task (CoreDataGeneratedAccessors) - (void)addCompletionObject:(NSManagedObject *)value; - (void)removeCompletionObject:(NSManagedObject *)value; - (void)addCompletion:(NSSet *)value; - (void)removeCompletion:(NSSet *)value; @end Completion model: @interface Completion : NSManagedObject { } @property (nonatomic, retain) NSNumber * percentage; @property (nonatomic, retain) NSDate * time; @property (nonatomic, retain) Task * task; @end And here is the fetch: NSFetchRequest *request = [[NSFetchRequest alloc] init]; request.entity = [NSEntityDescription entityForName:@"Task" inManagedObjectContext:context]; NSSortDescriptor *sortDescriptor = [[NSSortDescriptor alloc] initWithKey:@"taskName" ascending:YES]; request.sortDescriptors = [NSArray arrayWithObject:sortDescriptor]; NSError *error; NSArray *results = [context executeFetchRequest:request error:&error]; NSArray *parents = [results valueForKeyPath:@"taskName"]; NSArray *children = [results valueForKeyPath:@"completion.@sum.percentage"]; NSLog(@"%@ %@", parents, children); [request release]; [sortDescriptor release]; The exception is thrown at the fourth line from bottom. The thrown exception is: *** -[NSCFSet decimalValue]: unrecognized selector sent to instance 0x3b25a30 *** Terminating app due to uncaught exception 'NSInvalidArgumentException', reason: '*** -[NSCFSet decimalValue]: unrecognized selector sent to instance 0x3b25a30' I would very much appreciate any kind of help. Thanks.
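
    For comparison, a sketch that applies the collection operator to each task's completion set directly; with the key path above, @sum ends up being evaluated over NSSet objects (one per Task), which is where -[NSCFSet decimalValue] comes from:

        for (Task *task in results) {
            // Sum the percentage attribute across the to-many relationship of one task.
            NSNumber *total = [task.completion valueForKeyPath:@"@sum.percentage"];
            NSLog(@"%@: %@", task.taskName, total);
        }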

    Read the article

  • Drawing only part of a texture

    - by Ben Reeves
    Continued on from my previous question, I have a 320*480 RGB565 framebuffer which I wish to draw using OpenGL ES 1.0 on the iPhone. - (void)setupView { glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR); glTexParameteriv(GL_TEXTURE_2D, GL_TEXTURE_CROP_RECT_OES, (int[4]){0, 0, 480, 320}); glEnable(GL_TEXTURE_2D); } // Updates the OpenGL view when the timer fires - (void)drawView { // Make sure that you are drawing to the current context [EAGLContext setCurrentContext:context]; //Get the 320*480 buffer const int8_t * frameBuf = [source getNextBuffer]; //Create enough storage for a 512x512 power of 2 texture int8_t lBuf[2*512*512]; memcpy (lBuf, frameBuf, 320*480*2); //Upload the texture glTexImage2D(GL_TEXTURE_2D, 0, GL_RGB, 512, 512, 0, GL_RGB, GL_UNSIGNED_SHORT_5_6_5, lBuf); //Draw it glDrawTexiOES(0, 0, 1, 480, 320); [context presentRenderbuffer:GL_RENDERBUFFER_OES]; } If I produce the original texture in 512*512 the output is cropped incorrectly but other than that looks fine. However, using the required output size of 320*480, everything is distorted and messed up. I'm pretty sure it's the way I'm copying the framebuffer into the new 512*512 buffer. I have tried this routine int8_t lBuf[512][512][2]; const char * frameDataP = frameData; for (int ii = 0; ii < 480; ++ii) { memcpy(lBuf[ii], frameDataP, 320); frameDataP += 320; } That is better, but the width appears to be stretched and the height is messed up. Any help appreciated.
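
    The routine above copies 320 bytes per row, but RGB565 is 2 bytes per pixel, so a row is wider than that, and each destination row in the 512-wide texture is 512*2 bytes apart. A sketch of the stride arithmetic, assuming the source is 480 pixels wide by 320 rows (matching the crop rect and the glDrawTexiOES call):

        int8_t lBuf[512 * 512 * 2];              // 512x512 RGB565 destination
        const int8_t *src = frameBuf;            // 480x320 RGB565 source
        const int srcRowBytes = 480 * 2;         // bytes per source row
        const int dstRowBytes = 512 * 2;         // bytes per destination row
        for (int row = 0; row < 320; ++row) {
            memcpy(lBuf + row * dstRowBytes, src + row * srcRowBytes, srcRowBytes);
        }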

    Read the article

  • How to modulate every texture unit in OpenGL ES 1.1?

    - by Jesse Beder
    I have two textures and a "blend factor", and I'd like to mix them, modulated by the current color; in effect, I want to use the following shader: gl_FragColor = gl_Color * mix(tex0, tex1, blendFactor); I'm using OpenGL ES 1.1, targeting all versions of the iPhone, so I can't use shaders, and I have two texture units. My best attempt is: // texture 0 glActiveTexture(GL_TEXTURE0); image1.Bind(); glTexEnvi(GL_TEXTURE_ENV, GL_TEXTURE_ENV_MODE, GL_MODULATE); // texture 1 glActiveTexture(GL_TEXTURE1); image2.Bind(); glTexEnvi(GL_TEXTURE_ENV, GL_TEXTURE_ENV_MODE, GL_COMBINE); glTexEnvi(GL_TEXTURE_ENV, GL_COMBINE_RGB, GL_INTERPOLATE); glTexEnvi(GL_TEXTURE_ENV, GL_SRC0_RGB, GL_PREVIOUS); glTexEnvi(GL_TEXTURE_ENV, GL_OPERAND0_RGB, GL_SRC_COLOR); glTexEnvi(GL_TEXTURE_ENV, GL_SRC1_RGB, GL_TEXTURE); glTexEnvi(GL_TEXTURE_ENV, GL_OPERAND1_RGB, GL_SRC_COLOR); glTexEnvi(GL_TEXTURE_ENV, GL_SRC2_RGB, GL_CONSTANT); glTexEnvi(GL_TEXTURE_ENV, GL_OPERAND2_RGB, GL_SRC_ALPHA); glTexEnvi(GL_TEXTURE_ENV, GL_COMBINE_ALPHA, GL_INTERPOLATE); glTexEnvi(GL_TEXTURE_ENV, GL_SRC0_ALPHA, GL_PREVIOUS); glTexEnvi(GL_TEXTURE_ENV, GL_OPERAND0_ALPHA, GL_SRC_ALPHA); glTexEnvi(GL_TEXTURE_ENV, GL_SRC1_ALPHA, GL_TEXTURE); glTexEnvi(GL_TEXTURE_ENV, GL_OPERAND1_ALPHA, GL_SRC_ALPHA); glTexEnvi(GL_TEXTURE_ENV, GL_SRC2_ALPHA, GL_CONSTANT); glTexEnvi(GL_TEXTURE_ENV, GL_OPERAND2_ALPHA, GL_SRC_ALPHA); const float factor[] = { 0, 0, 0, blendFactor }; glTexEnvfv(GL_TEXTURE_ENV, GL_TEXTURE_ENV_COLOR, factor); This has the effect of the shader: gl_FragColor = mix(gl_Color * tex0, tex1, blendFactor); but I don't see how to module texture 1 by the color. Is there any way to have the color provided by a texture unit automatically modulated by the incoming primary color? Or any other way to do what I want that I'm missing? Multiple passes are definitely allowed, but they have to have the proper blend effect; I have glBlend(GL_ONE, GL_ONE_MINUS_SRC_ALPHA); enabled, so it can be tricky to get right with multiple passes.

    Read the article

  • Connection

    - by pepersview
    Hello, I would like to ask you about NSURLConnection in objective-c for iPhone. I have one app that needs to connect to one webservice to receive data (about YouTube videos), Then I have all the things that I need to connect (Similar to Hello_Soap sample code in the web). But now, my problem is that I create a class (inherits from NSObject) named Connection and I have implemented the methods: didReceiveResponse, didReceiveData, didFailWithError and connectionDidFinishLoading. Also the method: -(void)Connect:(NSString *) soapMessage{ NSLog(soapMessage); NSURL *url = [NSURL URLWithString:@"http://....."]; NSMutableURLRequest *theRequest = [NSMutableURLRequest requestWithURL:url]; NSString *msgLength = [NSString stringWithFormat:@"%d", [soapMessage length]]; [theRequest addValue: @"text/xml; charset=utf-8" forHTTPHeaderField:@"Content-Type"]; [theRequest addValue: msgLength forHTTPHeaderField:@"Content-Length"]; [theRequest setHTTPMethod:@"POST"]; [theRequest setHTTPBody: [soapMessage dataUsingEncoding:NSUTF8StringEncoding]]; NSURLConnection *theConnection = [[NSURLConnection alloc] initWithRequest:theRequest delegate:self]; if( theConnection ) { webData = [[NSMutableData data] retain]; } else { NSLog(@"theConnection is NULL"); } } But when from my AppDelegate I create one Connection object: Connection * connect = [[Connection alloc] Init:num]; //It's only one param to test. [connect Connect:method.soapMessage]; And I call this method, when this finishes, it doesn't continue calling didReceiveResponse, didReceiveData, didFailWithError or connectionDidFinishLoading. I'm trying to do this but I can't for the moment. The thing I would like to do is to be able to call this class "Connection" each time that I want to receive data (after that to be parsed and displayed in UITableViews). Thank you.
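
    One thing worth checking is the exact delegate method signatures: NSURLConnection calls connection:didReceiveData: and friends, not bare didReceiveData. A sketch of the signatures the Connection class (passed as delegate:self) would need, assuming webData is the NSMutableData ivar shown above:

        - (void)connection:(NSURLConnection *)connection didReceiveResponse:(NSURLResponse *)response {
            [webData setLength:0];
        }

        - (void)connection:(NSURLConnection *)connection didReceiveData:(NSData *)data {
            [webData appendData:data];
        }

        - (void)connection:(NSURLConnection *)connection didFailWithError:(NSError *)error {
            NSLog(@"Connection failed: %@", error);
        }

        - (void)connectionDidFinishLoading:(NSURLConnection *)connection {
            NSLog(@"Received %u bytes", (unsigned)[webData length]);
        }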

    Read the article

  • Presenting an image cropping interface

    - by wkw
    I'm trying to engineer a UI for cropping images in iphone OS and suspect I'm going about things the hard way. My goal is pretty much what the Tapbots duo have done with Pastebot. In that app, they dim the source image but provide a movable and resizable cropping view and the image you're cropping is in a zoomable scrollview; when you resize or move the underlying image, the cropping view adjusts appropriately. I mocked up a composite image which will give a sense of the design I'm after, along with how I presently have my view hierarchy setup, viewable here The approach I've started with is the following: UIImageView with the image to crop is in a scrollview, a plain UIView with black fill and suitable transparency/alpha setting is added in front of the imageview. I then use a custom UIView which is a sibling to the scrollview at a higher level, which implements the drawRect: method and for the most part calls CGImageCreateWithImageInRect to get the portion of the image's bitmap that matches the position of the cropping view and draws that to the CGContext. in the viewcontroller I'm using the UIScrollViewDelegate methods to track scrolling and passing those changes to the custom cropping UIView so it stays in sync with the scroll contentOffset. That much is finally working. But trying to keep in sync as the scrollview zoomScale changes is when I figured I should ask for help. Looking for suggestions or guidance. My initial approach just seems like more work than is required. Could this be done with a masking layer in the ImageView? And if so, how would I setup the tracking for moving and resizing the cropping rect? My experience working with layers is non-nil, but very limited thus far.
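
    One alternative to redrawing through CGImageCreateWithImageInRect is to keep the crop view purely visual and, when a crop is needed, convert its frame into image coordinates from the scroll view's contentOffset and zoomScale. A sketch, assuming the crop view and the scroll view share the same superview coordinate space (names follow the setup described above):

        - (CGRect)cropRectInImageCoordinates {
            CGFloat scale = scrollView.zoomScale;
            CGRect r = cropView.frame;
            return CGRectMake((r.origin.x + scrollView.contentOffset.x) / scale,
                              (r.origin.y + scrollView.contentOffset.y) / scale,
                              r.size.width  / scale,
                              r.size.height / scale);
        }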

    Read the article

  • Testing Async downloads with ASIHTTPRequest

    - by Baishampayan Ghose
    I am writing a simple library using ASIHTTPRequest where I am fetching URLs in an async manner. My problem is that the main function that I have written to test my lib exits before the async calls are finished. I am very new to Obj C and iPhone development, can anyone suggest a good way to wait before all the requests are finished in the main function? Currently, my main function looks like this - int main(int argc, char *argv[]) { NSAutoreleasePool *pool = [[NSAutoreleasePool alloc] init]; IBGApp *ibgapp = [[IBGApp alloc] init]; IBGLib *ibgl = [[IBGLib alloc] initWithUsername:@"joe" andPassword:@"xxx"]; // The two method calls below download URLs async. [ibgl downloadURL:@"http://yahoo.com/" withRequestDelegate:ibgapp andRequestSelector:@selector(logData:)]; [ibgl downloadURL:@"http://google.com/" withRequestDelegate:ibgapp andRequestSelector:@selector(logData:)]; [pool release]; return 0; // I reach here before the async calls are done. } So what is the best way to wait till the async calls are done? I tried putting sleep, but obviously doesn't work.
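
    Since ASIHTTPRequest delivers its asynchronous callbacks via the run loop, one way to keep a command-line style main from exiting early is to spin the run loop until a completion flag flips. A sketch (the isDone accessor on IBGApp is hypothetical and would be set from logData: once both downloads have reported in):

        // Keep servicing the run loop until the async requests have completed.
        while (![ibgapp isDone]) {
            [[NSRunLoop currentRunLoop] runMode:NSDefaultRunLoopMode
                                     beforeDate:[NSDate dateWithTimeIntervalSinceNow:0.1]];
        }
        [pool release];
        return 0;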

    Read the article

  • Can Core Data be used for objects with variable schemas?

    - by glenc
    I'm implementing a new iPhone app and am relatively new to Cocoa development overall. I am at the stage of choosing how the persistence layer of this app will work, and it looks like I'm basically choosing between Core Data and sqlite3. The persisted models in this app are intended to have a schema that is loaded at runtime (from some kind of defn file, probably XML). By which I mean, this app is intended to have objects that are user-definable to some extent, e.g. the Customer type (which has certain built-in fields like "name" and "email") can be modified to have extra fields based on the user's specific needs (e.g. a user might want to add a "favourite fruit" field to their Customer type). Having said that, will Core Data work for an app with a non-baked-in data model like this? I've just started playing around with the Core Data object designer thing in XCode and it seems like this thing wants to work with objects that have fixed fields that are compiled in. I'm definitely trying to take the path of least resistance here, and I can see the benefits of using an Apple-supplied data framework, but don't want to start down that path if it's going to lock me into a data model that's defined at compile time.
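
    Core Data does not strictly require a compiled .xcdatamodel: a managed object model can also be assembled in code at launch, which is one way to fold in user-defined fields read from an XML definition. A rough sketch (the attribute names are just examples):

        NSManagedObjectModel *model = [[NSManagedObjectModel alloc] init];

        NSEntityDescription *customer = [[[NSEntityDescription alloc] init] autorelease];
        [customer setName:@"Customer"];
        [customer setManagedObjectClassName:@"NSManagedObject"];

        NSAttributeDescription *name = [[[NSAttributeDescription alloc] init] autorelease];
        [name setName:@"name"];
        [name setAttributeType:NSStringAttributeType];

        // A user-defined field loaded from the schema definition at runtime.
        NSAttributeDescription *favouriteFruit = [[[NSAttributeDescription alloc] init] autorelease];
        [favouriteFruit setName:@"favouriteFruit"];
        [favouriteFruit setAttributeType:NSStringAttributeType];
        [favouriteFruit setOptional:YES];

        [customer setProperties:[NSArray arrayWithObjects:name, favouriteFruit, nil]];
        [model setEntities:[NSArray arrayWithObject:customer]];

    One caveat: once a persistent store has been created with a given model, opening it later with a changed model requires migration, so a runtime-defined schema still needs a versioning story.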

    Read the article

  • Is NSManagedObjectContext autosaved or am I looking at NSFetchedResultsController's cache?

    - by Andreas
    I'm developing an iPhone app where I use a NSFetchedResultsController in the main table view controller. I create it like this in the viewDidload of the main table view controller: NSSortDescriptor *sortDescriptorDate = [[NSSortDescriptor alloc] initWithKey:@"date" ascending:YES]; NSSortDescriptor *sortDescriptorTime = [[NSSortDescriptor alloc] initWithKey:@"start" ascending:YES]; NSArray *sortDescriptors = [[NSArray alloc] initWithObjects:sortDescriptorDate,sortDescriptorTime, nil]; [fetchRequest setSortDescriptors:sortDescriptors]; [sortDescriptorDate release]; [sortDescriptorTime release]; [sortDescriptors release]; controller = [[NSFetchedResultsController alloc] initWithFetchRequest:fetchRequest managedObjectContext:context sectionNameKeyPath:@"date" cacheName:nil]; [fetchRequest release]; NSError *error; BOOL success = [controller performFetch:&error]; Then, in a subsequent view, I create a new object on the context: TestObject *testObject = [NSEntityDescription insertNewObjectForEntityForName:@"TestObject" inManagedObjectContext:context]; The TestObject has several related object which I create in the same way and add to the testObject using the provided add...Objects methods. Then, if before saving the context, I press cancel and go back to the main table view, nothing is shown as expected. However, if I restart the app, the object I created on the context shows in the main table view. How come? At first, I thought it was because the NSFetchedResultsController was reading from the cache, but as you can see I set this to nil just to test. Also, [context hasChanges] returns true after I restart. What am I missing here?

    Read the article

  • Problem with writeToFile with array of NSDictionary objects

    - by Ken
    I'm trying to write an array of NSDictionary objects to a .plist file on the iPhone (OS 3.0). (They are actually NSCFDictionary objects when I call the [object class] method). My problem is that it won't write to file. If I set the array to "nil" it at least creates the empty plist file but won't do it if I have these objects in the array. My array is a parsed response from a JSON HTTP request and looks like this: { "title" = "A Movie"; "time_length" = "3:22"; }, { "title" = "Another Movie"; "time_length" = "1:40"; }, { "title" = "A Third Movie"; "time_length" = "2:10"; } The code to create the file is: [array writeToFile:[self dataFilePath] atomically:YES]; - (NSString *)dataFilePath { NSArray *paths = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES); NSString *documentsDirectory = [paths objectAtIndex:0]; return [documentsDirectory stringByAppendingPathComponent:@"data.plist"]; } Could the NSCFDictionary class of the objects in my array be preventing me from writing to file? Thanks for your help.
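
    writeToFile:atomically: quietly returns NO if anything in the graph is not a property-list type; JSON parsers commonly insert NSNull for null values, which is enough to break it. One way to narrow this down is to ask NSPropertyListSerialization whether the array is valid before writing, a sketch:

        BOOL valid = [NSPropertyListSerialization propertyList:array
                                              isValidForFormat:NSPropertyListXMLFormat_v1_0];
        if (!valid) {
            NSLog(@"Array contains non-property-list objects (e.g. NSNull or custom classes)");
        }
        BOOL wrote = [array writeToFile:[self dataFilePath] atomically:YES];
        NSLog(@"writeToFile succeeded: %d", wrote);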

    Read the article

  • Convert UIImage into string

    - by Warrior
    I am new to iPhone development. Is there any way to convert a UIImage into a string and then back into an image again? - (void)imagePickerController:(UIImagePickerController *)picker didFinishPickingImage:(UIImage *)img1 editingInfo:(NSDictionary *)editInfo { [[picker parentViewController] dismissModalViewControllerAnimated:YES]; NSData *data = UIImagePNGRepresentation(img1); NSString *str1; str1 = [[NSString alloc] initWithData:data encoding:NSASCIIStringEncoding]; MyAppAppDelegate *appDelegate = (MyAppAppDelegate *) [[UIApplication sharedApplication] delegate]; [appDelegate setCurrentLink:str1]; EmailPictureViewController *email = [[EmailPictureViewController alloc] initWithNibName:@"EmailPictureViewController" bundle:nil]; [self.navigationController pushViewController:email animated:YES]; } This way I can use delegate methods to transfer the image from one view to another, then convert the string back to an image and display it in the other view. In the other view: - (void)viewDidLoad { MyAppAppDelegate *appDelegate =(MyAppAppDelegate *) [[UIApplication sharedApplication] delegate]; str1 = [appDelegate getCurrentLink]; NSLog(@"The String %@",str1); NSData *aData; aData = [str1 dataUsingEncoding: NSASCIIStringEncoding]; NSLog(@"The String Data %@",aData); NSLog(@"Inside Didload3"); [imgview setImage:[UIImage imageWithData:aData]]; } But this doesn't work for me. Where do I go wrong? Is there any way to solve it? Please help me out. Thanks.
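
    PNG bytes are not valid ASCII, so round-tripping through NSASCIIStringEncoding loses data (initWithData:encoding: can even return nil). A simpler sketch is to hand the NSData, or the UIImage itself, to the next view; the setCurrentImageData:/currentImageData accessors on the app delegate below are hypothetical stand-ins for setCurrentLink:/getCurrentLink:

        NSData *data = UIImagePNGRepresentation(img1);
        MyAppAppDelegate *appDelegate = (MyAppAppDelegate *)[[UIApplication sharedApplication] delegate];
        [appDelegate setCurrentImageData:data];   // keep the raw bytes, not a lossy string

        // In the second view controller's viewDidLoad:
        NSData *aData = [appDelegate currentImageData];
        [imgview setImage:[UIImage imageWithData:aData]];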

    Read the article

  • Can I load a UIImage from a URL?

    - by progrmr
    I have a URL for an image (got it from UIImagePickerController) but I no longer have the image in memory (the URL was saved from a previous run of the app). Can I reload the UIImage from the URL again? I see that UIImage has a imageWithContentsOfFile: but I have a URL. Can I use NSData's dataWithContentsOfURL: to read the URL? EDIT based on @Daniel's answer I tried the following code but it doesn't work... NSLog(@"%s %@", __PRETTY_FUNCTION__, photoURL); if (photoURL) { NSURL* aURL = [NSURL URLWithString:photoURL]; NSData* data = [[NSData alloc] initWithContentsOfURL:aURL]; self.photoImage = [UIImage imageWithData:data]; [data release]; } When I ran it the console shows: -[PhotoBox willMoveToWindow:] file://localhost/Users/gary/Library/Application%20Support/iPhone%20Simulator/3.2/Media/DCIM/100APPLE/IMG_0004.JPG *** -[NSURL length]: unrecognized selector sent to instance 0x536fbe0 *** Terminating app due to uncaught exception 'NSInvalidArgumentException', reason: '*** -[NSURL length]: unrecognized selector sent to instance 0x536fbe0' Looking at the call stack, I'm calling URLWithString, which calls URLWithString:relativeToURL:, then initWithString:relativeToURL:, then _CFStringIsLegalURLString, then CFStringGetLength, then forwarding_prep_0, then forwarding, then -[NSObject doesNotRecognizeSelector]. Any ideas why my NSString (photoURL's address is 0x536fbe0) doesn't respond to length? Why does it say it doesn't respond to -[NSURL length]? Doesn't it know that param is an NSString, not a NSURL?
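
    The -[NSURL length] message suggests photoURL is already an NSURL rather than an NSString, so URLWithString: is being handed the wrong type. A hedged sketch of a version that tolerates either:

        if (photoURL) {
            NSURL *aURL;
            if ([photoURL isKindOfClass:[NSURL class]]) {
                aURL = (NSURL *)photoURL;                        // already a URL
            } else {
                aURL = [NSURL URLWithString:(NSString *)photoURL];
            }
            NSData *data = [[NSData alloc] initWithContentsOfURL:aURL];
            self.photoImage = [UIImage imageWithData:data];
            [data release];
        }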

    Read the article

  • UITableViewCell imageView images loading small even when they are the correct size!

    - by Alex Barlow
    I'm having an issue while loading images into a UITableViewCell after an asynchronous download and placement into a UIImage variable. The images appear smaller than they actually are! But when I scroll down and back up to the image, or the whole table is reloaded, they appear at the correct size... Here is a code excerpt... - (void)reviewImageDidLoad:(NSIndexPath *)indexPath { ThumbDownloader *thumbDownloader = [imageDownloadsInProgress objectForKey:indexPath]; if (thumbDownloader != nil) { UITableViewCell *cell = [self.tableView cellForRowAtIndexPath:thumbDownloader.indexPathInTableView]; [UIView beginAnimations:nil context:nil]; [UIView setAnimationDuration:0.4]; [self.tableView cellForRowAtIndexPath:indexPath].imageView.alpha = 0.0; [UIView commitAnimations]; cell.imageView.image = thumbDownloader.review.thumb; [UIView beginAnimations:nil context:nil]; [UIView setAnimationDuration:0.4]; [self.tableView cellForRowAtIndexPath:indexPath].imageView.alpha = 1.0; [UIView commitAnimations]; } } Here is an image of the app just after calling this method: http://www.flickr.com/photos/arbarlow/5288563627/ After calling tableView reloadData or scrolling around they appear correctly (go to the next Flickr image to see the normal result, but I'm sure you can guess that). Does anyone have any ideas on how to make the images appear correctly? I'm absolutely stumped! Regards, Alex (iPhone noob)
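
    One small thing that often explains this: if the image arrives after the cell has already been laid out, the imageView keeps its old (empty) size until the next layout pass, and reloading or scrolling is what finally triggers that pass. Asking the cell to lay out again right after assigning the image is a cheap thing to try, a sketch:

        cell.imageView.image = thumbDownloader.review.thumb;
        [cell setNeedsLayout];   // size the imageView for the newly assigned image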

    Read the article

  • Trouble Running Leaks Instrument

    - by TheGeoff
    I'm having trouble running the Leaks Instrument since installing the 3.0 SDK. An NDA disclaimer here I don't think this is a 3.0 SDK issue, just a configuration problem. So I'm looking for advice on configuring the tools in question not the 3.0 SDK per se. Here’s the breakdown of the behavior I am seeing. My Application is compiled to OS version 2.2. I can run it out of XCode in debug mode on the Simulator and Device running 2.2, 2.2.1, 3.0. If I start it with Performance Tools - Leaks, I get an error message from the OS, “The application xxxx quit unexpectedly”, “Ignore, Report, Relaunch.” If I click “Ignore” one of two things will happen, either Leaks tells me it couldn’t attach, or Leaks stop responding to input and I have to Force Quit. Interesting thing is the Simulator starts in 3.0 OS. If I start Instruments Manually and attach to a running 2.2 Simulator it shows the same behavior. If I attach Leaks to an iPhone Device it works. It seems that once I launch Leaks my app won't run in the simulator until I do a new build. Any ideas for getting my Simulator/Leaks/Xcode synced back up? Thanks, Geoff

    Read the article

  • Making a SQLite file persist between runs of the program

    - by Cocorico
    Hi! I'm having a problem with some sqlite code for an iPhone program in Xcode. I was opening my database like this: int result = sqlite3_open("stealtown.db", &database); Which is how they had it in a book I was looking at while I type the program. But then, that way of opening a database it only works when you run in simulator, not on device. So I finally figure out I need to do this: NSString *file = [[NSBundle mainBundle] pathForResource:@"stealtown" ofType:@"db"]; int result = sqlite3_open([file UTF8String], &database); And that works on device, EXCEPT one thing: Each time you launch the program, it starts as if you had never created the database, and when you stick an entry in the table, it's the ONLY entry in that table. When I used the first code on the simulator, I could open my program 6 times, each time adding 1 entry to a table, and at the end, I had 6 entries in that table. With the second code, I do exact same thing but each time there is only 1 entry in that table. Am I explaining this okay, I hope so, it's hard sometimes for me. Anyone maybe know why this would be?
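
    The path from pathForResource: points into the app bundle, which is read-only on the device, so inserts never survive a relaunch there. The usual pattern is to copy the bundled database into the Documents directory on first launch and always open that writable copy; a sketch:

        NSArray *paths = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES);
        NSString *docPath = [[paths objectAtIndex:0] stringByAppendingPathComponent:@"stealtown.db"];

        NSFileManager *fm = [NSFileManager defaultManager];
        if (![fm fileExistsAtPath:docPath]) {
            // First run: seed the writable copy from the read-only bundle copy.
            NSString *bundlePath = [[NSBundle mainBundle] pathForResource:@"stealtown" ofType:@"db"];
            [fm copyItemAtPath:bundlePath toPath:docPath error:NULL];
        }

        int result = sqlite3_open([docPath UTF8String], &database);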

    Read the article

  • Change classes instantiated with loadNibNamed

    - by Nick H247
    I am trying to change the class of objects created with a nib with the iPhone SDK. The reason for this is; i dont know until runtime what the class is that i want the nib object to be (though they will have the same UIView based super class), and i dont want to create a different nib for every eventuality - as the .nib will be the same for each, apart from the class of one object. I have been successful, with a couple of methods, but either have some knock on effects or am unsure of how safe the methods I have used are: Method 1: Override alloc, on the super class and set a c variable to the class I require: + (id) alloc { if (theClassIWant) { id object = [theClassIWant allocWithZone:NSDefaultMallocZone()]; theClassIWant = nil; return object; } return [BaseClass allocWithZone:NSDefaultMallocZone()]; } this works well, and i assume is 'reasonably' safe, though if I alloc a subclass myself (without setting 'theClassIWant') - an object of the base class is created. I also dont really like the idea of overriding alloc... Method 2: use object_setClass(self,theClassIWant) in initWithCoder (before calling initWithCoder on the super class): - (id) initWithCoder:(NSCoder *)aDecoder { if (theClassIWant) { // the framework doesn't like this: //[self release]; //self = [theClassIWant alloc]; // whoa now! object_setClass(self,theClassIWant); theClassIWant = nil; return [self initWithCoder:aDecoder]; } if (self = [super initWithCoder:aDecoder]) { ... this also works well, but not all the subclasses are necessarily going to be the same size as the super class, so this could be very unsafe! To combat this i tried releasing and re-allocing to the correct type within initWithCoder, but i got the following error from the framework: "This coder requires that replaced objects be returned from initWithCoder:" dont quite get what this means! i am replacing an object in initWithCoder... Any comments on the validity of these methods, or suggestions of improvements or alternatives welcome!

    Read the article

  • How to append a row to a TableViewSection in Titanium?

    - by Mike Trpcic
    I'm developing an iPhone application in Titanium, and need to append a row to a particular TableViewSection. I can't do this on page load, as it's done dynamically by the user throughout the lifecycle of the application. The documentation says that the TableViewSection has an add method which takes two arguments, but I can't make it work. Here's my existing code: for(var i = 0; i <= product_count; i++){ productsTableViewSection.add( Ti.UI.createTableViewRow({ title:'Testing...' }) ); } That is just passing one argument in, and that causes Titanium to die with an uncaught exception: 2010-04-26 16:57:18.056 MyApplication[72765:207] *** Terminating app due to uncaught exception 'NSInternalInconsistencyException', reason: 'Invalid update: invalid number of rows in section 2. The number of rows contained in an existing section after the update (2) must be equal to the number of rows contained in that section before the update (1), plus or minus the number of rows inserted or deleted from that section (0 inserted, 0 deleted).' 2010-04-26 16:57:18.056 MyApplication[72765:207] Stack: ( The exception looks like it did add the row, but it's not allowed to for some reason. Since the documentation says that TableViewSection takes in "view" and "row", I tried the following: for(var i = 0; i <= product_count; i++){ productsTableViewSection.add( Ti.UI.createView({}), Ti.UI.createTableViewRow({ title:'Testing...' }) ); } The above code doesn't throw the exception, but it gives a [WARN]: [WARN] Invalid type passed to function. expected: TiUIViewProxy, was: TiUITableViewRowProxy in -[TiUITableViewSectionProxy add:] (TiUITableViewSectionProxy.m:62) TableViewSections don't seem to support any methods like appendRow, or insertRow, so I don't know where else to go with this. I've looked through the KitchenSink app, but there are no examples that I could find of adding a row to a TableViewSection. Any help is appreciated.

    Read the article

  • Modal view becomes partly transparent when dismissing?

    - by Jaanus
    A completely ordinary setup: UIViewController where I push another UIVC: BlahVc *blah = [[BlahVc alloc] initWithNibName:@"Blah" bundle:nil]; UINavigationController *nav = [[UINavigationController alloc] initWithRootViewController:blah]; blah.delegate = self; [self presentModalViewController:nav animated:YES]; [nav release]; [blah release]; Details about Blah: to support both landscape and portrait with least effort, I built Blah.xib so that inside Blah's main view, call it view A, there is another view B, with width fixed to 320px, that positions itself in the centre of the screen. With portrait iPhone it fills up the whole screen, with landscape there are margins on the side. So far, so good. Autorotate etc works well. Now, to dismiss blah, I use the recommended setup: inside Blah, I do: [self.delegate blahDidCancel:self]; And in the parent VC, I have: - (void)blahDidCancel:(Blah *)blah { [self dismissModalViewControllerAnimated:YES]; } Both view A's and B's backgrounds are opaque white. Problem: as soon as it hits the dismissModalViewControllerAnimated line, view A seems to become transparent, while view B remains white. This is not a problem in portrait since view B still fills up the screen. But in landscape, the result is that view B is still opaque, but has see-through transparent margins on the side (where view A used to be that has now mysteriously become transparent), from where the parent view contents comes through during the dismissing animation. Why does it seem like view A becomes transparent upon dismissing the modal VC?

    Read the article

  • How do you make a static sprite be a child of another sprite in cocos2D while using SpaceManager

    - by JJBigThoughts
    I have two static (STATIC_MASS) SpaceManager sprites. One is a child of the other - by which I mean that one sort of builds up the other one, but although the child's image shows up in the right place, the child doesn't seem to exist in the Chipmunk physics engine, as I would expect. In my case, I have a backboard (rectangular sprite) and a hoop (a circular sprite). Since I might want to move the backboard, I'd like to attach the hoop to the backboard so that the hoop automatically moves right along with the backboard. Here, we see a rotating backboard with attached hoop. It looks OK on the screen, but other objects bounce off the backboard and pass right through the hoop (in a bad sense of the term). Why doesn't my child sprite seem to exist in the physics engine? // Add Backboard cpShape *shapeRect = [smgr addRectAt:cpvWinCenter mass:STATIC_MASS width:200 height:10 rotation:0.0f ];// We're upgrading this cpCCSprite * cccrsRect = [cpCCSprite spriteWithShape:shapeRect file:@"rect_200x10.png"]; [self addChild:cccrsRect]; // Spin the static backboard: http://stackoverflow.com/questions/2691589/how-do-you-make-a-sprite-rotate-in-cocos2d-while-using-spacemanager // Make static object update moves in chipmunk // Since Backboard is static, and since we're going to move it, it needs to know about spacemanager so its position gets updated inside chipmunk. // Setting this would make the smgr recalculate all static shapes positions every step // cccrsRect.integrationDt = smgr.constantDt; // cccrsRect.spaceManager = smgr; // Alternative method: smgr.rehashStaticEveryStep = YES; smgr.rehashStaticEveryStep = YES; // Spin the backboard [cccrsRect runAction:[CCRepeatForever actionWithAction: [CCSequence actions: [CCRotateTo actionWithDuration:2 angle:180], [CCRotateTo actionWithDuration:2 angle:360], nil] ]]; // Add the hoop cpShape *shapeHoop = [smgr addCircleAt:ccp(100,-45) mass:STATIC_MASS radius: 50 ]; cpCCSprite * cccrsHoop = [cpCCSprite spriteWithShape:shapeHoop file:@"hoop_100x100.png"]; [cccrsRect addChild:cccrsHoop]; This is only half working for me. Note: SpaceManager is a toolkit for working with cocos2D-iphone

    Read the article

  • ObjectiveFlickr - how to call getSizes on a photo ID not known until the response?

    - by Jonathan Cohen
    Hi guys, I'm building a free music instrument iPhone app with the Flickr API and ObjectiveFlickr.A random photo from the interestingness list is displayed in the background, but I can't center it without knowing its size. (so I can reset the UIWebView frame) I've been researching this for awhile, and if the answer is super easy, please have some mercy on a noob - it's my first time playing with a web service API. =) Since I don't know the photo ID until after I receive the response from the interestingness feed, how would I call flickr.photo.getSizes on the response? This is what I have so far: - (void)flickrAPIRequest:(OFFlickrAPIRequest *)inRequest didCompleteWithResponse:(NSDictionary *)inResponseDictionary{ int randomResponse = arc4random() % 49; photoDict = [[inResponseDictionary valueForKeyPath:@"photos.photo"] objectAtIndex:randomResponse]; NSString *photoID = [photoDict valueForKeyPath:@"id"]; NSLog(@"%@",photoID); NSURL *photoURL = [flickrContext photoSourceURLFromDictionary:photoDict size:OFFlickrMediumSize]; NSString *htmlSource = [NSString stringWithFormat: @"<html>" @"<head>" @" <style>body { margin: 0; padding: 0; } </style>" @"</head>" @"<body>" @"<img src=\"%@\" />" @"</body>" @"</html>" , photoURL]; [webView loadHTMLString:htmlSource baseURL:nil]; }

    Read the article

  • MKAnnotations are being made successfully, however they sometimes fail to render on MKMapView

    - by jtkendall
    I'm working on an iPhone app using the 3.1.3 SDK, my app finds the users current location, displays it on a MKMapView and then finds nearby locations and renders them as MKAnnotations. My code is working, however sometimes the nearby annotations do not appear on the map. They are still being made as I see the correct data in the console (from NSLog which runs just after the annotations are made). When it fails is completely random, it could be the 5th time I've hit "Build and Run" for the day, or the 500th it doesn't appear to have any pattern and doesn't throw any type of error, it just doesn't add the annotations to the MapView. This is the method called for each nearby location to add the MKAnnotation. - (void)addPinsWithLocation:(NSDictionary *)spot { CLLocationCoordinate2D location; location.longitude = [[spot objectForKey:@"spot_longitude"] doubleValue]; location.latitude = [[spot objectForKey:@"spot_latitude"] doubleValue]; PlaceMarks *placemark = [[PlaceMarks alloc] initWithCoordinate:location title:[spot objectForKey:@"spot_name"] subtitle:@""]; NSLog(@"Adding Pin for Location: '%@' at %f, %f", [spot objectForKey:@"spot_name"], location.latitude, location.longitude); [mapView addAnnotation:placemark]; } Any ideas on how to get MKAnnotations to always show?
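
    If the nearby spots come back on a background thread (a connection callback or an NSOperation, for instance), addAnnotation: can intermittently fail to draw because MKMapView, like the rest of UIKit, expects to be touched on the main thread. A hedged sketch of pushing the add onto the main thread:

        - (void)addPinOnMainThread:(PlaceMarks *)placemark {
            [mapView addAnnotation:placemark];
        }

        // ...and at the end of addPinsWithLocation:, instead of calling addAnnotation: directly:
        [self performSelectorOnMainThread:@selector(addPinOnMainThread:)
                               withObject:placemark
                            waitUntilDone:NO];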

    Read the article

  • Is there a way to toggle bluetooth and/or wifi on and off programatically in iOS?

    - by Andy W
    I am looking for an easy way to toggle both bluetooth and wifi between on and off states on iOS 4.x devices (iPhone and iPad). I am constantly toggling these functions as I move between different locations and usage scenarios, and right now it takes multiple taps and visits to the Settings App. I am looking to create a simple App, that lives on Springboard, that I can just tap and it will turn off the wifi if it's on, and vice versa, then immediately quit. Similarly with an App for toggling bluetooth’s state. I have the developer SDK, and am comfortable in Xcode and with iOS development, so am happy to write the required Xcode to create the App. I am just at a loss as to which API, private or not, has the required functionality to simply toggle the state of these facilities. Because this is scratching a very personal itch, I have no intent to try and sell the App or get it up on the App store, so conforming with App guidelines on API usage is a non-issue. What I don’t want to do is jailbreak the devices, as I want to keep the core software as shipped. Can anyone point me at some sample code or more info on achieving this goal, as my Google-fu is letting me down, and if the information is out there for 4.x devices I just can’t find it.

    Read the article

  • How do I send text to a UITextView?

    - by Lee
    I'm new to iPhone development. I'm a VB programmer who is trying to convert a VB application to an iPad app. I need some help with sending text to a UITextView. I want to first have a UIPickerView and then once the user hits a UIButton, a UITextView appears and the text is then generated by my source code, line by line. The code would constantly be concatenating strings to the text. It would sort of go like this: 1) User makes selections with UIPickerView. 2) User then hits UIButton. 3) UIPickerView is replaced on the screen with a UITextView. 4) The code does stuff. 5) The code adds the 1st line of text into the UITextView. 6) The code does more stuff. 7) The code then adds the 2nd line of text into the UITextView, retaining the 1st line that was already there. Steps 6 and 7 are repeated until the code is done. Does anyone know of any examples of this that I could look at? I am mostly interested in finding something like a YouTube video, a webpage that explains the code or even a good book that covers this particular topic. I am finding that the sample code that Apple has on its site goes over my head. In fact, I could probably benefit from a good book. But I am looking for one that I would already know covers this particular topic, since it is so essential to the app that I am trying to build.
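
    The appending itself is just string concatenation on the text view's text property: each new line is tacked onto whatever is already displayed. A small sketch of steps 5 and 7 (myTextView is a hypothetical outlet name):

        // Step 5: first line of output.
        myTextView.text = @"First line of results\n";

        // Step 7: append another line, keeping what is already shown.
        myTextView.text = [myTextView.text stringByAppendingString:@"Second line of results\n"];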

    Read the article

  • How could I create a FBO with stencil buffer in OpenGL ES 2.0?

    - by Alphones
    I need a stencil buffer on the 3GS to render planar shadows; polygon offset doesn't work perfectly and still has z-fighting problems. So I use a stencil buffer to make the shadow correct. It works on the Win32 GLES2 emulator, but not on the iPhone. After I added a post effect to the whole scene, the stencil buffer won't work even on the Win32 GLES2 emulator. And I tried to attach a stencil buffer to the FBO, but the screen turns black. Here's my code, glGenRenderbuffers(1, &dbo); // depth buffer glBindRenderbuffer(GL_RENDERBUFFER, dbo); glRenderbufferStorage(GL_RENDERBUFFER, GL_DEPTH_COMPONENT24_OES, widthGL, heightGL); glGenRenderbuffers(1, &sbo); // stencil buffer glBindRenderbuffer(GL_RENDERBUFFER, sbo); glRenderbufferStorage(GL_RENDERBUFFER, GL_STENCIL_INDEX8, widthGL, heightGL); glGenFramebuffers(1, &fbo); glBindFramebuffer(GL_FRAMEBUFFER, fbo); glFramebufferTexture2D(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0, GL_TEXTURE_2D, tex, 0); glFramebufferRenderbuffer(GL_FRAMEBUFFER, GL_DEPTH_ATTACHMENT, GL_RENDERBUFFER, dbo); glFramebufferRenderbuffer(GL_FRAMEBUFFER, GL_STENCIL_ATTACHMENT, GL_RENDERBUFFER, sbo); // this make the whole screen black. The EGL context is created with STENCIL_SIZE=8; it works without an RTT. I tried to change the RenderbufferStorage for both the depth buffer and the stencil buffer, but none of them works. Is there anything I have missed? Does the stencil buffer pack with the depth buffer? (I cannot find things like GL_DEPTH24_STENCIL8 ...)
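
    On the iPhone's GPU, depth and stencil generally have to come from a single packed renderbuffer exposed by the GL_OES_packed_depth_stencil extension, rather than a separate GL_STENCIL_INDEX8 buffer. A sketch of attaching one GL_DEPTH24_STENCIL8_OES buffer to both attachment points (worth confirming the extension is present on the target device first):

        GLuint dsbo;
        glGenRenderbuffers(1, &dsbo);
        glBindRenderbuffer(GL_RENDERBUFFER, dsbo);
        // One packed buffer carries both depth (24 bits) and stencil (8 bits).
        glRenderbufferStorage(GL_RENDERBUFFER, GL_DEPTH24_STENCIL8_OES, widthGL, heightGL);

        glBindFramebuffer(GL_FRAMEBUFFER, fbo);
        glFramebufferRenderbuffer(GL_FRAMEBUFFER, GL_DEPTH_ATTACHMENT,   GL_RENDERBUFFER, dsbo);
        glFramebufferRenderbuffer(GL_FRAMEBUFFER, GL_STENCIL_ATTACHMENT, GL_RENDERBUFFER, dsbo);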

    Read the article
