Search Results

Search found 4848 results on 194 pages for 'cocoa matters'.


  • UIDatePicker - Problem Localizing

    - by Smorpheus
    Hello, I've created a UIDatePicker in my app and I also have support for several languages. My UIDatePicker is created in Interface Builder, and I have created a separate localization XIB so I can customize my UIDatePicker. Setting the "Locale" option in IB appears to do nothing. Attempting to change my date picker programmatically with NSLocale and NSCalendar also does nothing, via the following code:

        NSLocale *locale = [[NSLocale alloc] initWithLocaleIdentifier:@"es_ES"];
        datePicker.locale = locale;
        datePicker.calendar = [locale objectForKey:NSLocaleCalendar];

    This results in an English picker. Here's the really weird thing, though: the word for "Today" is translated, as seen in the attached screenshot. (OK, I'm not allowed to post images. But imagine a Date & Time picker with "May" in English and "Today" written as "Aujourd'hui".) Based on what I've read, adding the UIDatePicker programmatically doesn't seem to help much.
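    For reference, a minimal sketch of creating the picker entirely in code with the SDK spellings (calendar, NSLocaleCalendar); it illustrates the API rather than promising a different result, since the month names may still follow the device's language setting:

        // Sketch: a UIDatePicker configured for a Spanish locale and its calendar.
        NSLocale *spanishLocale = [[[NSLocale alloc] initWithLocaleIdentifier:@"es_ES"] autorelease];
        UIDatePicker *picker = [[[UIDatePicker alloc] initWithFrame:CGRectZero] autorelease];
        picker.datePickerMode = UIDatePickerModeDateAndTime;
        picker.locale = spanishLocale;
        picker.calendar = [spanishLocale objectForKey:NSLocaleCalendar];
        picker.timeZone = [NSTimeZone localTimeZone];
        [self.view addSubview:picker];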

    Read the article

  • Location of New Window after old window closed

    - by John Brayton
    I have an app that allows multiple windows. I have a strange bug where, if I repeatedly open and close windows, new windows are positioned lower and lower on the screen. I would expect this if I were keeping the windows open, but it seems that the OS X window tiling mechanism is unaware that my windows are closing. Potentially relevant notes: I am using garbage collection; this is not a document-based app; and when I close a window, the corresponding menu item is removed from the "Window" menu. Any hints as to what I might be doing wrong would be appreciated. Thanks!
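    A minimal sketch of managing the cascade point explicitly, in case the automatic cascading is what keeps drifting downward; the controller class, the nextWindowPoint ivar, and the openControllers array are illustrative assumptions:

        // Hypothetical ivars: NSPoint nextWindowPoint (reset to NSZeroPoint when the
        // last window closes) and NSMutableArray *openControllers (keeps controllers alive).
        - (void)showNewWindow {
            MyWindowController *controller = [[MyWindowController alloc] initWithWindowNibName:@"MyWindow"];
            [controller setShouldCascadeWindows:NO];          // take over positioning ourselves
            nextWindowPoint = [[controller window] cascadeTopLeftFromPoint:nextWindowPoint];
            [controller showWindow:self];
            [openControllers addObject:controller];
        }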

    Read the article

  • Sound not working in iPhone Simulator?

    - by pix0r
    Somehow my iPhone Simulator is unable to play sounds. First, an app I'm working on that uses AudioServicesPlaySystemSound() stopped working. I spent a while debugging this, but sound still works when I run the app on the device. I get the same results with other iPhone apps, such as the sample Crash Landing app. I can't find a sound setting anywhere in the Simulator or Xcode preferences. I've tried resetting the Simulator through the "Reset Content and Settings" menu item, to no avail.

    Read the article

  • Missing AVFoundation.framework

    - by Alex
    Hi, AVFoundation.framework is not where the documentation says it should be. I have iPhone SDK 2.2 installed (I never had previous SDK versions installed) and I can't find that framework under /System/Library/Frameworks. I did find it under /Developer/Platforms/iPhoneOS.platform/Developer/SDKs/iPhoneOS2.2.sdk/System/Library/Frameworks/, but if I add it from that location, the compiler can't find the header files. I tried copying the entire AVFoundation.framework folder to /System/Library/Frameworks, but it still can't find the header files. How can I use the AVFoundation classes? Thanks, Alex

    Read the article

  • Building a complex view with Three20 - resources?

    - by psychotik
    I'm using Three20 for most of my iPhone app. One of the views I need to create is relatively complex. It needs a top bar (under the nav bar) with some controls and a label, an image view below this bar (which occupies most of the body), and another bottom bar with more controls and labels (above the tab bar). I don't have much UI experience - my only experience with anything UI is laying things out with CSS on websites. Apple's online documentation seems to assume that the reader knows a fair amount about rectangles, layouts, frames, etc., or is using Interface Builder. And Three20 isn't too well documented either. So my questions are: Is it possible to design something like what I describe in IB and then still have a Three20-based app use it? If so, any tips/pointers would be much appreciated. Can you point me to some documentation that explains how views and controls are laid out and rendered? I'm pretty sure I can figure it out if I find a decent explanation or tutorial.
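    A minimal sketch of laying out the three regions described above with plain frames, ignoring Three20 specifics; the bar height and view names are illustrative assumptions, not values from the question:

        - (void)viewDidLoad {
            [super viewDidLoad];
            CGRect bounds = self.view.bounds;      // area between the nav bar and the tab bar
            CGFloat barHeight = 44.0f;             // assumed height for the two bars

            UIView *topBar = [[[UIView alloc] initWithFrame:
                CGRectMake(0, 0, bounds.size.width, barHeight)] autorelease];
            UIImageView *body = [[[UIImageView alloc] initWithFrame:
                CGRectMake(0, barHeight, bounds.size.width, bounds.size.height - 2 * barHeight)] autorelease];
            UIView *bottomBar = [[[UIView alloc] initWithFrame:
                CGRectMake(0, bounds.size.height - barHeight, bounds.size.width, barHeight)] autorelease];

            // Keep the bars pinned to the top and bottom edges if the view resizes.
            topBar.autoresizingMask = UIViewAutoresizingFlexibleWidth | UIViewAutoresizingFlexibleBottomMargin;
            body.autoresizingMask = UIViewAutoresizingFlexibleWidth | UIViewAutoresizingFlexibleHeight;
            bottomBar.autoresizingMask = UIViewAutoresizingFlexibleWidth | UIViewAutoresizingFlexibleTopMargin;

            [self.view addSubview:topBar];
            [self.view addSubview:body];
            [self.view addSubview:bottomBar];
        }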

    Read the article

  • How to limit NSTextField text length and keep it always upper case?

    - by carlosb
    Need to have an NSTextField with a text limit of 4 characters maximum that always shows in upper case, but I can't figure out a good way of achieving that. I've tried to do it through a binding with a validation method, but the validation only gets called when the control loses first responder status, and that's no good. Temporarily I made it work by observing the NSControlTextDidChangeNotification notification on the text field and having it call this method:

        - (void)textDidChange:(NSNotification *)notification {
            NSTextField *textField = [notification object];
            NSString *value = [textField stringValue];
            if ([value length] > 4) {
                [textField setStringValue:[[value uppercaseString] substringWithRange:NSMakeRange(0, 4)]];
            } else {
                [textField setStringValue:[value uppercaseString]];
            }
        }

    But this surely isn't the best way of doing it. Any better suggestions?
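    A sketch of the usual alternative, an NSFormatter subclass attached to the field; it illustrates the isPartialStringValid: hook rather than claiming to be a drop-in replacement for the code above:

        @interface UppercaseLimitFormatter : NSFormatter
        @end

        @implementation UppercaseLimitFormatter

        - (NSString *)stringForObjectValue:(id)obj {
            return [obj isKindOfClass:[NSString class]] ? obj : @"";
        }

        - (BOOL)getObjectValue:(id *)obj forString:(NSString *)string errorDescription:(NSString **)error {
            if (obj) *obj = string;
            return YES;
        }

        // Called on every edit; reject anything longer than 4 characters and
        // force the proposed text to upper case.
        - (BOOL)isPartialStringValid:(NSString **)partialStringPtr
               proposedSelectedRange:(NSRangePointer)proposedSelRangePtr
                      originalString:(NSString *)origString
               originalSelectedRange:(NSRange)origSelRange
                    errorDescription:(NSString **)error {
            NSString *upper = [*partialStringPtr uppercaseString];
            if ([upper length] > 4) {
                return NO;                    // keep the previous contents
            }
            if (![upper isEqualToString:*partialStringPtr]) {
                *partialStringPtr = upper;    // returning NO makes the control use this corrected string
                return NO;
            }
            return YES;
        }

        @end

    The formatter would then be attached with [textField setFormatter:[[[UppercaseLimitFormatter alloc] init] autorelease]].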

    Read the article

  • How do I draw a proper parallelogram that can be animated on iPhone?

    - by Robert Kosara
    I'm trying to do something very simple: I need some parallelograms in my program. These are attached to other objects, all of which are UIViews. It's important that I be able to animate these, since the objects they are attached to can also be animated. I've figured out how to use the transform in UIView/CALayer to do this, but the problem is that these sheared UIViews don't look very nice: there is no anti-aliasing of the edges. Is there some other way to do this? I would like to use UIViews, since I also use them for user interaction and animation is so much easier than drawing by hand. I don't want to use OpenGL for this.
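    A minimal sketch of the shear-transform approach mentioned above; the shear factor is an illustrative value:

        // Shear the view horizontally (x' = x + shear * y); the transform can be
        // animated like any other UIView property.
        CGFloat shear = 0.25f;                               // illustrative amount of slant
        CGAffineTransform slant = CGAffineTransformMake(1, 0, shear, 1, 0, 0);
        [UIView beginAnimations:nil context:NULL];
        parallelogramView.transform = slant;                 // hypothetical UIView
        [UIView commitAnimations];

    One commonly suggested workaround for the jagged edges is to draw the view's content with a one-pixel transparent border so the rasterizer has something to blend against, but whether that is acceptable here depends on the design.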

    Read the article

  • Analyzing bitmaps produced by NSAffineTransform and CILineOverlay filters

    - by Adam
    I am trying to manipulate an image using a chain of CIFilters and then examine each byte of the resulting bitmap. Long term, I do not need to display the resulting image; I just need to analyze it in memory. But near-term I am displaying it on screen to help with debugging.

    I have some "bitmap examination" code that works as expected when examining the NSImage (bitmap representation) I use as my input (loaded from a JPG file into an NSImage). And it SOMETIMES works as expected when I use it on the outputBitmap produced by the code below. More specifically, when I use an NSAffineTransform filter to create outputBitmap, then outputBitmap contains the data I would expect. But if I use a CILineOverlay filter to create the outputBitmap, none of the bytes in the bitmap have any data in them.

    I believe both of these filters are working as expected, because when I display their results on screen (via outputImageView), they look "correct." Yet when I examine the outputBitmaps, the one created from the CILineOverlay filter is "empty" while the one created from NSAffineTransform contains data. Furthermore, if I chain the two filters together, the final resulting bitmap only seems to contain data if I run the affine transform last. This seems very strange to me.

    My understanding (from reading the Core Image programming guide) is that a CIImage should be considered an "image recipe" rather than an actual image, because the image isn't actually created until it is "drawn." Given that, it would make sense that the CIImage bitmap doesn't have data -- but I don't understand why it has data after I run the NSAffineTransform yet doesn't have data after running the CILineOverlay filter.

    Basically, I am trying to determine whether creating the NSCIImageRep (ir in the code below) from the CIImage (myResult) is equivalent to "drawing" the CIImage -- in other words, whether that should force the bitmap to be populated. If someone knows the answer to this please let me know -- it will save me a few hours of trial-and-error experimenting!

    Finally, if the answer is "you must draw to a graphics context," then I have another question: would I need to do something along the lines of what is described in the Quartz 2D Programming Guide: Graphics Contexts, listings 2-7 and 2-8 (drawing to a bitmap graphics context)? That is the path down which I am about to head, but it seems like a lot of code just to force the bitmap data to be dumped into an array where I can get at it. So if there is an easier or better way, please let me know. I just want to take the data (that should be) in myResult and put it into a bitmap array where I can access it at the byte level. And since I already have code that works with an NSBitmapImageRep, unless doing it that way is a bad idea for some reason that is not readily apparent to me, I would prefer to "convert" myResult into an NSBitmapImageRep.

        CIImage *myResult = [transform valueForKey:@"outputImage"];
        NSCIImageRep *ir = [NSCIImageRep imageRepWithCIImage:myResult];
        NSImage *outputImage = [[[NSImage alloc] initWithSize:
            NSMakeSize(inputImage.size.width, inputImage.size.height)] autorelease];
        [outputImage addRepresentation:ir];
        [outputImageView setImage:outputImage];
        NSBitmapImageRep *outputBitmap = [[NSBitmapImageRep alloc] initWithCIImage:myResult];

    Thanks, Adam
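    One way to force the "recipe" to actually be rendered is to ask a CIContext for a CGImage and wrap that in an NSBitmapImageRep; this is a sketch (assuming the filter output has a finite extent), not necessarily the approach the author settled on:

        // Render the CIImage explicitly, then examine the bytes of the result.
        CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
        CGRect extent = [myResult extent];
        CGContextRef cgContext = CGBitmapContextCreate(NULL,
            (size_t)extent.size.width, (size_t)extent.size.height,
            8, 0, colorSpace, kCGImageAlphaPremultipliedLast);
        CIContext *ciContext = [CIContext contextWithCGContext:cgContext options:nil];

        CGImageRef rendered = [ciContext createCGImage:myResult fromRect:extent];
        NSBitmapImageRep *bitmap = [[[NSBitmapImageRep alloc] initWithCGImage:rendered] autorelease];
        unsigned char *bytes = [bitmap bitmapData];   // byte-level access for the analysis code
        NSLog(@"first byte: %u", bytes[0]);

        CGImageRelease(rendered);
        CGContextRelease(cgContext);
        CGColorSpaceRelease(colorSpace);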

    Read the article

  • Caching the struct Object

    - by PRamod
    How do I create a cache for a struct pointer object in Objective-C? Is there any third-party component for caching objects, as Java and .NET have? I have the following struct:

        typedef struct _news {
            char *headline;
            char *story_url;
        } news;

    I have a double pointer to the above struct in an interface class. I would like to cache it for some time using Objective-C.
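    A sketch of one way to do this with plain Foundation, boxing each struct pointer in an NSValue and keeping it in an NSCache (available from Mac OS X 10.6 / iOS 4); the key and sample data are illustrative:

        // NSCache evicts entries automatically under memory pressure.
        NSCache *newsCache = [[NSCache alloc] init];

        news *item = malloc(sizeof(news));
        item->headline = strdup("Example headline");
        item->story_url = strdup("http://example.com/story");

        // NSValue only boxes the pointer; the caller still owns the memory
        // and must free it when the entry is no longer needed.
        [newsCache setObject:[NSValue valueWithPointer:item] forKey:@"topStory"];

        news *cached = [[newsCache objectForKey:@"topStory"] pointerValue];
        if (cached != NULL) {
            NSLog(@"%s", cached->headline);
        }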

    Read the article

  • Objective-C respondsToSelector question.

    - by Holli
    From what I have learned so far: in Objective-C you can send any message to any object. If the object implements the right method it will be executed; otherwise nothing will happen. This is because, before the message is sent, Objective-C will perform respondsToSelector:. I hope I am right so far. I did a little program for testing where an action is invoked every time a slider is moved. Also for testing, I declared the sender as NSButton, but in fact it is an NSSlider. Now I asked the object whether it responds to setAlternateTitle. An NSButton will, and an NSSlider will not. If I run the code and do the respondsToSelector: check myself, it tells me the object will not respond to that selector. If I test something else, like intValue, it will respond. So my code is fine so far.

        - (IBAction)sliderDidMove:(id)sender {
            NSButton *slider = sender;
            BOOL responds = [slider respondsToSelector:@selector(setAlternateTitle)];
            if (responds == YES) {
                NSLog(@"YES");
            } else {
                NSLog(@"NO");
            }
            [slider setAlternateTitle:@"Hello World"];
        }

    But when I actually send the setAlternateTitle message, the program crashes and I am not exactly sure why. Shouldn't it do a respondsToSelector check before sending the message?
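    A sketch of guarding the send explicitly; note the trailing colon, since the selector for a one-argument setter is setAlternateTitle: rather than setAlternateTitle, and the runtime does not skip unimplemented messages for you:

        - (IBAction)sliderDidMove:(id)sender {
            SEL setter = @selector(setAlternateTitle:);     // note the colon
            if ([sender respondsToSelector:setter]) {
                [sender performSelector:setter withObject:@"Hello World"];
            } else {
                NSLog(@"%@ does not respond to setAlternateTitle:", [sender class]);
            }
        }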

    Read the article

  • Cocos2D, UIScrollView, and initial placement of a scene

    - by diatrevolo
    Hello: I am using a UIScrollView to forward touches to Cocos2D as outlined in http://getsetgames.com/2009/08/21/cocos2d-and-uiscrollview/ Everything works great after a few days of working with it, except one thing: when the initial view appears on the screen, the background appears to be scrolled to the center. As soon as I try to scroll around, the image jumps to 0,0, and everything works as normal, except the touches are offset by half the width and height of the background image. Am I overlooking something basic? I can't think of a useful portion of the code that illustrates the issue, as I can't track it down, but would be happy to post code if anyone has any ideas. Thanks in advance, -Roberto

    Read the article

  • MGTwitterEngine - Using getImageAtURL on iPhone

    - by Andrew Malchow
    Essentially, I'm working on asynchronously downloading images and adding them to specific UITableView cells (Twitter profile images, using MGTwitterEngine from Matt Gemmell). I've looked at general asynchronous download code and must admit I'm still too much of a noob to understand it well enough to adapt it to my purposes. Instead, I'm attempting to use Gemmell's included getImageAtURL: method to add the images. I have it working to the point that -imageReceived: receives the images for visible cells; however, I'm stuck on how to include them in the appropriate cells at that point.

        - (void)imageReceived:(UIImage *)image forRequest:(NSString *)identifier {
            NSLog(@"Got an image: %@", image);
            // What goes here? Or elsewhere?
        }

    This method is in my main view controller; I also have a custom cell controller where I'm drawing the cell content using Loren Brichter's fast-scrolling code. Any help with this MGTwitterEngine method in particular, or with dynamically adding these images to my table cells, would be greatly appreciated.
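    A sketch of one way to route each image back to its row, assuming the controller records the identifier returned by getImageAtURL: against the row it was requested for, exposes its table view as self.tableView, and keeps images in a cache that the cell-drawing code reads from (imageRequests and imageCache are hypothetical mutable dictionaries):

        - (void)imageReceived:(UIImage *)image forRequest:(NSString *)identifier {
            NSNumber *row = [imageRequests objectForKey:identifier];
            if (row == nil) {
                return;                                   // a request we no longer care about
            }
            NSIndexPath *path = [NSIndexPath indexPathForRow:[row integerValue] inSection:0];

            // Keep the image around, then redraw just that cell.
            [imageCache setObject:image forKey:path];
            [self.tableView reloadRowsAtIndexPaths:[NSArray arrayWithObject:path]
                                  withRowAnimation:UITableViewRowAnimationNone];
            [imageRequests removeObjectForKey:identifier];
        }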

    Read the article

  • setIncludesSubentities: in an NSFetchRequest is broken for entities across multiple persistent stores

    - by SG
    Prior art which doesn't quite address this: http://stackoverflow.com/questions/1774359/core-data-migration-error-message-model-does-not-contain-configuration-xyz

    I have narrowed this down to a specific issue. It takes a minute to set up, though; please bear with me. The gist of the issue is that a persistentStoreCoordinator (apparently) cannot preserve the part of an object graph where a managedObject is marked as a subentity of another when they are stored in different files. Here goes...

    1) I have 2 xcdatamodel files, each containing a single entity. At runtime, when the managed object model is constructed, I manually define one entity as a subentity of another using setSubentities:. This is because defining subentities across multiple files in the editor is not supported yet. I then return the complete model with modelByMergingModels:.

        // Works!
        [mainEntity setSubentities:canvasEntities];
        NSLog(@"confirm %@ is super for %@",
              [[[canvasEntities lastObject] superentity] name],
              [[canvasEntities lastObject] name]);
        // Output: "confirm Note is super for Browser"

    2) I have modified the persistentStoreCoordinator method so that it sets up a different store for each entity. Technically, it uses configurations, and each entity has one and only one configuration defined.

        // Also works!
        for (NSString *configName in [[HACanvasPluginManager shared].registeredCanvasTypes valueForKey:@"viewControllerClassName"]) {
            storeUrl = [NSURL fileURLWithPath:[[self applicationDocumentsDirectory]
                stringByAppendingPathComponent:[configName stringByAppendingPathExtension:@"sqlite"]]];
            // NSLog(@"entities for configuration '%@': %@", configName,
            //       [[[self managedObjectModel] entitiesForConfiguration:configName] valueForKey:@"name"]);
            // Output: "entities for configuration 'HATextCanvasController': (Note)"
            // Output: "entities for configuration 'HAWebCanvasController': (Browser)"
            if (![persistentStoreCoordinator addPersistentStoreWithType:NSSQLiteStoreType
                                                          configuration:configName
                                                                    URL:storeUrl
                                                                options:options
                                                                  error:&error])
                // etc.

    3) I have a fetch request set up for the parent entity, with setIncludesSubentities: and setAffectedStores: just to be sure both 1) and 2) are covered. When inserting objects of either entity, both are added to the context, and both are fetched by the fetchedResultsController and displayed in the tableView as expected.

        // Create the fetch request for the entity.
        NSFetchRequest *fetchRequest = [[NSFetchRequest alloc] init];
        [fetchRequest setEntity:entity];
        [fetchRequest setIncludesSubentities:YES];   // NECESSARY to fetch all canvas types
        [fetchRequest setSortDescriptors:sortDescriptors];
        [fetchRequest setFetchBatchSize:20];         // Set the batch size to a suitable number.
        [fetchRequest setAffectedStores:[[managedObjectContext persistentStoreCoordinator] persistentStores]];
        [fetchRequest setReturnsObjectsAsFaults:NO];

    Here is where it starts misbehaving: after closing and relaunching the app, ONLY THE PARENT ENTITY is fetched. If I change the entity of the request using setEntity: to the entity for 'Note', all notes are fetched. If I change it to the entity for 'Browser', all the browsers are fetched. Let me reiterate that during the run in which an object is first inserted into the context, it will appear in the list. It is only after save and relaunch that a fetch request fails to traverse the hierarchy. Therefore, I can only conclude that it is the storage of the inheritance that is the problem. Let's recap why:

    - Both entities can be created, inserted into the context, and viewed, so the model is working.
    - Both entities can be fetched with a single request, so the inheritance is working.
    - I can confirm that the files are being stored separately and objects are going into their appropriate stores, so saving is working.
    - Launching the app with either entity set for the request works, so retrieval from the store is working.
    - This also means that traversing different stores with the request is working.
    - By using a single store instead of multiple stores, the problem goes away completely, so creating, storing, fetching, viewing, etc. all work correctly.

    This leaves only one culprit (to my mind): the inheritance I'm setting with setSubentities: is effective only for objects created during the session. Either objects/entities are being stored stripped of the inheritance info, or entity inheritance as defined programmatically only applies to new instances, or both. Either of these is unacceptable. Either it's a bug or I am way, way off course. I have been at this every which way for two days; any insight is greatly appreciated. The current workaround - just using a single store - works completely, except that it won't be future-proof in the event that I remove one of the models from the app, etc. It also boggles the mind, because I can't see why you would have all this infrastructure for storing across multiple stores and for setting affected stores in fetch requests if it by core definition (of setSubentities:) doesn't work.
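    For what it's worth, a sketch of re-establishing the hierarchy on the merged model itself rather than on the source models, since merging copies the entity descriptions; the model variables are hypothetical, and whether this changes the on-disk behaviour described above is an open assumption:

        NSManagedObjectModel *merged = [NSManagedObjectModel modelByMergingModels:
            [NSArray arrayWithObjects:noteModel, browserModel, nil]];   // hypothetical source models

        // Look the entities up in the merged model; setSubentities: has to be called
        // before the model is handed to a persistent store coordinator, after which
        // the model becomes immutable.
        NSEntityDescription *note = [[merged entitiesByName] objectForKey:@"Note"];
        NSEntityDescription *browser = [[merged entitiesByName] objectForKey:@"Browser"];
        [note setSubentities:[NSArray arrayWithObject:browser]];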

    Read the article

  • NSTimer as Alarm

    - by huntaub
    Is the best practice for setting an alarm on OS X to create an NSTimer scheduled for the number of seconds between the current time and the desired alarm time, or is there an alternative to that method?
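    One alternative to computing the seconds yourself is to give NSTimer an explicit fire date; a sketch, with an illustrative date and a hypothetical alarmFired: callback:

        NSDate *alarmDate = [NSDate dateWithTimeIntervalSinceNow:60.0];   // illustrative alarm time
        NSTimer *alarm = [[[NSTimer alloc] initWithFireDate:alarmDate
                                                   interval:0
                                                     target:self
                                                   selector:@selector(alarmFired:)
                                                   userInfo:nil
                                                    repeats:NO] autorelease];
        // A timer created with -initWithFireDate:... is not scheduled automatically.
        [[NSRunLoop currentRunLoop] addTimer:alarm forMode:NSDefaultRunLoopMode];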

    Read the article

  • Working with images (CGImage), exif data, and file icons

    - by Nick
    What I am trying to do (under 10.6): I have an image (JPEG) that includes an icon in the image file (that is, you see an icon based on the image in the file, as opposed to a generic JPEG icon, in file-open dialogs in a program). I wish to edit the EXIF metadata and save it back to the image in a new file. Ideally I would like to save this back to an exact copy of the file (i.e. preserving any custom embedded icons created, etc.); however, in my hands the icon is lost. My code (some bits removed for ease of reading):

        // set up source ref -- I THINK THE PROBLEM IS HERE - NOT GRABBING THE INITIAL DATA
        CGImageSourceRef source = CGImageSourceCreateWithURL((CFURLRef)URL, NULL);

        // snag metadata
        NSDictionary *metadata = (NSDictionary *)CGImageSourceCopyPropertiesAtIndex(source, 0, NULL);

        // make metadata mutable
        NSMutableDictionary *metadataAsMutable = [[metadata mutableCopy] autorelease];

        // grab exif
        NSMutableDictionary *EXIFDictionary = [[[metadata objectForKey:(NSString *)kCGImagePropertyExifDictionary] mutableCopy] autorelease];

        << edit exif >>

        // add back edited exif
        [metadataAsMutable setObject:EXIFDictionary forKey:(NSString *)kCGImagePropertyExifDictionary];

        // get source type
        CFStringRef UTI = CGImageSourceGetType(source);

        // set up write data
        NSMutableData *data = [NSMutableData data];
        CGImageDestinationRef destination = CGImageDestinationCreateWithData((CFMutableDataRef)data, UTI, 1, NULL);

        // add the image plus modified metadata -- PROBLEM HERE? NOT ADDING THE ICON
        CGImageDestinationAddImageFromSource(destination, source, 0, (CFDictionaryRef)metadataAsMutable);

        // write to data
        BOOL success = NO;
        success = CGImageDestinationFinalize(destination);

        // save data to disk
        [data writeToURL:saveURL atomically:YES];

        // cleanup
        CFRelease(destination);
        CFRelease(source);

    I don't know if this is really a question of image handling, file handling, post-save processing (I could use sips), or me just being thick (I suspect the last). Nick

    Read the article

  • OpenGL ES view and UITableView

    - by Mel
    I have written an application using an OpenGL ES view. Now I want to add a UITableView to display on the screen. However, I need some guidance on how the OpenGL ES view and the other views play together nicely. For example, I have read some things that would lead me to think I need to pause the OpenGL view when the table is displayed. Can anyone point me to a tutorial on how to make these things work together, or just point me to the Stack Overflow question that I can't find, where some guy asked the same exact thing and got an answer? :-)

    Read the article

  • Can't find applicationSupportDirectory?

    - by Frost Li
    There is always a pre-written method in the AppDelegate:

        - (NSString *)applicationSupportDirectory {
            NSArray *paths = NSSearchPathForDirectoriesInDomains(NSApplicationSupportDirectory, NSUserDomainMask, YES);
            NSString *basePath = ([paths count] > 0) ? [paths objectAtIndex:0] : NSTemporaryDirectory();
            return [basePath stringByAppendingPathComponent:@"SyncFile"];
        }

    However, I can't call this method from outside that class:

        id _appDelegate = (SyncFile_AppDelegate *)[[NSApplication sharedApplication] delegate];
        NSLog(@"%@", [_appDelegate applicationSupportDirectory]);

    The compiler warns me that it can't find the method applicationSupportDirectory... Does anyone know what's wrong with my code? Thank you very much!
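    A sketch of the usual cause of that warning: the calling file has to see a declaration of the method, and the variable needs a concrete type so the compiler knows which class to check (this assumes applicationSupportDirectory is declared in SyncFile_AppDelegate.h):

        #import "SyncFile_AppDelegate.h"   // brings the method declaration into view

        SyncFile_AppDelegate *appDelegate =
            (SyncFile_AppDelegate *)[[NSApplication sharedApplication] delegate];
        NSLog(@"%@", [appDelegate applicationSupportDirectory]);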

    Read the article

  • Problem with playing a sound when a button is clicked.

    - by iSharreth
    In my code:

        #import "MyViewController.h"

        - (IBAction)playSound {
            AVAudioPlayer *myExampleSound;
            NSString *myExamplePath = [[NSBundle mainBundle] pathForResource:@"myaudiofile" ofType:@"caf"];
            myExampleSound = [[AVAudioPlayer alloc] initWithContentsOfURL:[NSURL fileURLWithPath:myExamplePath] error:NULL];
            myExampleSound.delegate = self;
            [myExampleSound play];
        }

    But it is showing a warning that class MyViewController does not implement the AVAudioPlayerDelegate protocol. Can anyone please help? I have included AVFoundation.framework.
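    The warning is about the class's declared conformance rather than the call site; a minimal sketch of the header change it is asking for (the superclass and the rest of the interface are assumptions):

        #import <UIKit/UIKit.h>
        #import <AVFoundation/AVFoundation.h>

        // Declaring conformance to AVAudioPlayerDelegate satisfies the warning
        // triggered by the myExampleSound.delegate = self assignment.
        @interface MyViewController : UIViewController <AVAudioPlayerDelegate>
        - (IBAction)playSound;
        @end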

    Read the article

  • Can I disable UIPickerView scroll sound?

    - by cocoaholic
    Hi, I want to disable the annoying clicks that the UIPickerView generates when scrolling up and down. Is there a way to do this? I want to play a short sound for each item that the picker view lands on, and that gets ruined by the built-in sound. I understand that the picker sounds can be turned off globally by switching off the keyboard sounds in the iPhone/iPod settings, but is there a way to do this programmatically? Any help will be much appreciated! Thanks

    Read the article
