Search Results

Search found 24890 results on 996 pages for 'iphone device'.


  • Best method of achieving bi-directional communication between Apple iPad "clients" and a Windows Server

    - by user361910
    We are currently starting to build a client-server system in which 10 or more Apple iPad client devices will communicate with a central Windows server over a wireless LAN. We wanted to use some existing plumbing (.NET Remoting/WCF/web services/etc.) that would let us implement a reliable, secure solution without having to start at a low level (e.g. sockets) and reinvent the wheel. One major requirement complicates this scenario: unlike a traditional web service, the Windows server needs to be able to arbitrarily notify the clients whenever certain events occur, so it is not a simple request/response exchange like the web. Initially we were going to use Windows clients, so our plan was to use the full-duplex mode of .NET WCF over HTTP|TCP. But now, using the iPad, we don't have any of the WCF infrastructure. So my question is: what is the best way to allow an iPad and a Windows server to (securely) communicate over a LAN, with each device able to initiate communication to the other? Am I stuck writing low-level socket code? Thanks!
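
    One low-level option, shown purely as a sketch: the iPad opens a persistent TCP connection to the server, and either side can then write on it whenever an event occurs. The host and port below are placeholders, and all error handling is omitted.

        // Sketch only: persistent stream pair from the iPad to the server.
        // "192.168.1.10" and 8080 are placeholder values.
        CFReadStreamRef readStream;
        CFWriteStreamRef writeStream;
        CFStreamCreatePairWithSocketToHost(kCFAllocatorDefault,
                                           CFSTR("192.168.1.10"), 8080,
                                           &readStream, &writeStream);
        NSInputStream *input = (NSInputStream *)readStream;
        NSOutputStream *output = (NSOutputStream *)writeStream;
        input.delegate = self;   // an NSStreamDelegate that handles incoming server pushes
        output.delegate = self;
        [input scheduleInRunLoop:[NSRunLoop currentRunLoop] forMode:NSDefaultRunLoopMode];
        [output scheduleInRunLoop:[NSRunLoop currentRunLoop] forMode:NSDefaultRunLoopMode];
        [input open];
        [output open];

    Wrapping this behind a small message protocol (length-prefixed JSON or XML, say) gives the arbitrary server-to-client notification the question asks for; higher-level alternatives such as plain HTTP polling trade simplicity for latency.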

    Read the article

  • Reachability sometimes fails, even when we do have an internet connection

    - by stoutyhk
    Hi I've searched but can't see a similar question. I've added a method to check for an internet connection per the Reachability example. It works most of the time, but when installed on the iPhone, it quite often fails even when I do have internet connectivity (only when on 3G/EDGE - WiFi is OK). Basically the code below returns NO. If I switch to another app, say Mail or Safari, and connect, then switch back to the app, then the code says the internet is reachable. Kinda seems like it needs a 'nudge'. Anyone seen this before? Any ideas? Many thanks James

        + (BOOL)doWeHaveInternetConnection {
            BOOL success;
            // google should always be up right?!
            const char *host_name = [@"google.com" cStringUsingEncoding:NSASCIIStringEncoding];
            SCNetworkReachabilityRef reachability = SCNetworkReachabilityCreateWithName(NULL, host_name);
            SCNetworkReachabilityFlags flags;
            success = SCNetworkReachabilityGetFlags(reachability, &flags);
            BOOL isAvailable = success && (flags & kSCNetworkFlagsReachable) && !(flags & kSCNetworkFlagsConnectionRequired);
            if (isAvailable) {
                NSLog(@"Google is reachable: %d", flags);
            } else {
                NSLog(@"Google is unreachable");
            }
            return isAvailable;
        }
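
    A likely explanation, offered as an assumption rather than from the original post: on 3G/EDGE the radio may be idle, so reachability reports kSCNetworkFlagsConnectionRequired even though a connection would be brought up automatically on demand. A sketch that also treats the "on demand" case as online, using the flag names from SCNetworkReachability.h on recent SDKs (the original code also never releases the SCNetworkReachabilityRef; a CFRelease(reachability) before returning would avoid that leak):

        // Sketch: treat "reachable, but a connection will be established automatically"
        // as online, which is the usual cellular case.
        BOOL reachable        = (flags & kSCNetworkReachabilityFlagsReachable) != 0;
        BOOL needsConnection  = (flags & kSCNetworkReachabilityFlagsConnectionRequired) != 0;
        BOOL connectsOnDemand = (flags & (kSCNetworkReachabilityFlagsConnectionOnDemand |
                                          kSCNetworkReachabilityFlagsConnectionOnTraffic)) != 0
                                && (flags & kSCNetworkReachabilityFlagsInterventionRequired) == 0;
        BOOL isAvailable = reachable && (!needsConnection || connectsOnDemand);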

    Read the article

  • How do I add a button to my navigationController's right side after pushing another view controller

    - by bobobobo
    So, immediately after pushing a view controller from my table view:

        // Override to support row selection in the table view.
        - (void)tableView:(UITableView *)tableView didSelectRowAtIndexPath:(NSIndexPath *)indexPath {
            // Navigation logic may go here -- for example, create and push another view controller.
            AnotherViewController *anotherViewController = [[AnotherViewController alloc]
                initWithNibName:@"AnotherView" bundle:nil];
            [self.navigationController pushViewController:anotherViewController animated:YES];

    Ok, so that makes another view slide on, and you can go back to the previous view ("pop" the current view) by clicking the button that automatically appears in the top left corner of the navigation bar. Ok, so SAY I want to populate the RIGHT SIDE of the navigation bar with a DONE button, like in the "Notes" app that comes with the iPhone. How would I do that? I tried code like this:

        UIBarButtonItem *doneButton = [[UIBarButtonItem alloc]
            initWithBarButtonSystemItem:UIBarButtonSystemItemDone
                                 target:self
                                 action:@selector(doneFunc)];
        self.navigationController.navigationItem.rightBarButtonItem = doneButton; // not it..
        [doneButton release];

    doneFunc is defined and everything, just the button never appears on the right side..
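
    A sketch of the usual fix, offered as an assumption based on the code above: the right bar button has to be set on the navigationItem of the view controller that is currently on top, not on the navigation controller's own navigationItem. For the Notes-style Done button, that means the pushed controller sets it on itself, for example in its viewDidLoad:

        // Sketch: inside AnotherViewController (the pushed controller).
        // doneTapped: is a hypothetical action method.
        - (void)viewDidLoad {
            [super viewDidLoad];
            UIBarButtonItem *doneButton = [[UIBarButtonItem alloc]
                initWithBarButtonSystemItem:UIBarButtonSystemItemDone
                                     target:self
                                     action:@selector(doneTapped:)];
            self.navigationItem.rightBarButtonItem = doneButton;
            [doneButton release];
        }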

    Read the article

  • Problem using AVAudioRecorder.

    - by tek3
    Hi all, I am facing a strange problem with AVAudioRecorder. In my application I need to record audio and play it back. I am creating my recorder like this:

        if (recorder) {
            if (recorder.recording)
                [recorder stop];
            [recorder release];
            recorder = nil;
        }
        NSString *filePath = [NSHomeDirectory() stringByAppendingPathComponent:
                                 [NSString stringWithFormat:@"Documents/%@.caf", songTitle]];
        NSDictionary *recordSettings = [[NSDictionary alloc] initWithObjectsAndKeys:
            [NSNumber numberWithFloat:44100.0],             AVSampleRateKey,
            [NSNumber numberWithInt:kAudioFormatAppleIMA4], AVFormatIDKey,
            [NSNumber numberWithInt:1],                     AVNumberOfChannelsKey,
            [NSNumber numberWithInt:AVAudioQualityMax],     AVEncoderAudioQualityKey,
            nil];
        recorder = [[AVAudioRecorder alloc] initWithURL:[NSURL fileURLWithPath:filePath]
                                               settings:recordSettings
                                                  error:nil];
        recorder.delegate = self;
        if ([recorder prepareToRecord] == YES) {
            [recorder record];

    I am releasing and re-creating the recorder every time I press the record button. The problem is that AVAudioRecorder takes some time before it starts recording, so if I press the record button multiple times in quick succession, my application freezes for a while. The same code works fine without any problem when headphones are connected to the device: there is no delay in recording, and the app doesn't freeze even if I press the record button multiple times. Any help in this regard will be highly appreciated. Thanx in advance.
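
    One possible cause, offered only as an assumption: without headphones the audio route has to be reconfigured each time recording starts. Configuring a play-and-record session once, up front, often removes most of the start-up latency; calling prepareToRecord ahead of time (rather than right before record) can also move the remaining delay off the button tap. A minimal sketch:

        // Sketch: set up the audio session once (e.g. at app launch), not per recording.
        NSError *sessionError = nil;
        AVAudioSession *session = [AVAudioSession sharedInstance];
        [session setCategory:AVAudioSessionCategoryPlayAndRecord error:&sessionError];
        [session setActive:YES error:&sessionError];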

    Read the article

  • Problem with header files of ARToolKitPlus after making the ARToolKitPlus dylib

    - by MNassar
    I'm writing this augmented reality app for the iPhone and I decided to use ARToolKitPlus for it. Using QMake, I created the Xcode project file and subsequently libArToolKitPlus.dylib. I tried to compile and run the sample projects "simple" and "multi", which worked well. Now every attempt to create another project and use the library has failed because header files are not found. If I drag the "include" folder into the Xcode project I get 8 errors instead of just one: the error for the main include not being found is solved, but it in turn includes 8 other headers that cannot be found (although they are in the same folder).

        #include "ARToolKitPlus/TrackerSingleMarkerImpl.h"

    gives:

        error: ARToolKitPlus/TrackerSingleMarkerImpl.h: No such file or directory

    If I drag the include folder in, this is some of what I get:

        error: ARToolKitPlus/TrackerSingleMarker.h: No such file or directory
        error: ../src/TrackerSingleMarkerImpl.cpp: No such file or directory
        error: expected class-name before ',' token
        class TrackerSingleMarkerImpl : public TrackerSingleMarker, protected TrackerImpl<__PATTERN_SIZE_X, __PATTERN_SIZE_Y, __PATTERN_SAMPLE_NUM, __MAX_LOAD_PATTERNS, __MAX_IMAGE_PATTERNS

    Having the dylib doesn't make a difference as far as I can tell. What do you think I should do? Would creating a framework help?

    Read the article

  • Can AVAudioSession do full duplex?

    - by Eric Christensen
    It would seem like it should be able to, but the following breakout test code can't do both:

        // play a file:
        NSArray *pathsArray = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES);
        NSString *documentsDirectory = [pathsArray objectAtIndex:0];
        NSString *playFilePath = [documentsDirectory stringByAppendingPathComponent:@"testplayfile.wav"];
        AVAudioPlayer *tempplayer = [[AVAudioPlayer alloc]
            initWithContentsOfURL:[NSURL fileURLWithPath:playFilePath] error:nil];
        [tempplayer prepareToPlay];
        [tempplayer play];

        // and record a file:
        NSString *recFilePath = [documentsDirectory stringByAppendingPathComponent:@"testrecordfile.wav"];
        AVAudioRecorder *soundrecording = [[AVAudioRecorder alloc]
            initWithURL:[NSURL fileURLWithPath:recFilePath] settings:nil error:nil];
        [soundrecording prepareToRecord];
        [soundrecording record];

    This is the minimum I can think of to individually play one file and record another, and it works just fine in the simulator: I can play back a file and record at the same time. But it doesn't work on the iPhone itself. If I comment out either part, the other performs fine. Playback works either alone or with both, if it's first; if I comment out the playback, the recording records fine. (There's additional code to stop the recording, not shown here.) So each works fine, but not together. I know Audio Queue has a setting to allow both, but I don't see an analogue for AVAudioSession. Any idea if it's possible, and if so, what I need to add? Thanks!
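
    A hedged answer: AVFoundation can play and record simultaneously on the device, but only when the audio session category allows it; the simulator is far more permissive about audio routes, which would explain the difference described above. A sketch of the usual fix, run before creating the player and recorder:

        // Sketch: the default session category will not record while playing;
        // PlayAndRecord enables full duplex on the device.
        NSError *err = nil;
        [[AVAudioSession sharedInstance] setCategory:AVAudioSessionCategoryPlayAndRecord error:&err];
        [[AVAudioSession sharedInstance] setActive:YES error:&err];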

    Read the article

  • Switching between iScroll and standard WebView Scrolling functionality

    - by Jonathan
    I'm using iScroll 4 for my rather heavy iPhone web/PhoneGap app and trying to find a way to switch between iScroll's scrolling functionality and standard "native" webview scrolling, basically on click. Why I want this is described below. My app has several different subpages all in one file. Some subpages have input fields, some don't. As we all know, iScroll + input fields = you're out of luck. I've wrapped iScroll's wrapper div (and all its functionality) around the one subpage where scrolling is crucial and where there are no input fields. The other sections I've simply placed outside this div, which gives them no scrolling functionality at all. I've of course tried wrapping everything inside the wrapper and enabling/disabling scrolling (shown below), but that didn't work for me at all:

        myScroll.disable()
        myScroll.enable()

    By placing some subpages outside the main scrolling area / iScroll div, I've disabled both iScroll's and the standard webview scrolling (the latter, which I guess iScroll does), which leaves me with only basic scrolling, hence basically no scrolling at all. One can move around vertically, but once you let go of the screen the "scrolling" stops. Quite natural, but alas so nasty. Therefore, I'm searching for a way to enable standard webview scrolling on the subpages placed outside of iScroll's wrapper div. I've tried different approaches, such as the one above, and by using:

        document.removeEventListener('touchmove', preventDefault, false);
        document.removeEventListener('touchmove', preventDefault, true);

    But with no success. Sorry for not providing you guys with any hard code or demos to test out for yourselves; it's simply too much code and it would be presented so out of its context, nobody would be able to debug it. So, is there a way in JavaScript to do this, switching between iScroll scrolling functionality and standard "native" webview scrolling? I would rather not rebuild the entire DOM framework, so a solution like the one described above would be preferable.

    Read the article

  • Extracting images from a PDF

    - by sagar
    My query: I want to extract only images from a PDF document, using Objective-C in an iPhone application.

    My efforts: I have gone through the info on this link, which has details regarding different operators on PDF documents. I also studied this document from Apple about PDF parsing with Quartz, and went through the entire PDF reference document from the Adobe site. According to that document, for each image there are the following operators: q, Q, BI, EI. I have created a table to get the image:

        myTable = CGPDFOperatorTableCreate();
        CGPDFOperatorTableSetCallback(myTable, "q", arrayCallback2);
        CGPDFOperatorTableSetCallback(myTable, "TJ", arrayCallback);
        CGPDFOperatorTableSetCallback(myTable, "Tj", stringCallback);

    I use this method to get the image:

        void arrayCallback2(CGPDFScannerRef inScanner, void *userInfo)
        {
            // THIS DOESN'T WORK
            // CGPDFStreamRef stream; // represents a sequence of bytes
            // if (CGPDFDictionaryGetStream(d, "BI", &stream)) {
            //     CGPDFDataFormat t = CGPDFDataFormatJPEG2000;
            //     CFDataRef data = CGPDFStreamCopyData(stream, &t);
            // }
        }

    This method is called for the operator "q", but I don't know how to extract an image from it. What should be the solution for extracting the images from the PDF documents? Thanks in advance for your kind help.
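
    Worth noting as an assumption: most images in real-world PDFs are not inline BI/EI images but image XObjects stored in each page's resource dictionary, so a different route usually works better. A sketch under that assumption (error handling omitted; JPEG data can be handed to UIImage directly, other formats need decoding using the stream's dictionary):

        // Sketch: walk the page's XObject resources and copy out anything whose
        // Subtype is /Image. `page` is an already-obtained CGPDFPageRef.
        static void extractImageXObject(const char *key, CGPDFObjectRef object, void *info) {
            CGPDFStreamRef stream;
            if (!CGPDFObjectGetValue(object, kCGPDFObjectTypeStream, &stream))
                return;
            CGPDFDictionaryRef dict = CGPDFStreamGetDictionary(stream);
            const char *subtype;
            if (CGPDFDictionaryGetName(dict, "Subtype", &subtype) && strcmp(subtype, "Image") == 0) {
                CGPDFDataFormat format;
                CFDataRef data = CGPDFStreamCopyData(stream, &format);
                // ... create a CGImage/UIImage from `data` here, depending on `format` ...
                if (data) CFRelease(data);
            }
        }

        CGPDFDictionaryRef pageDict = CGPDFPageGetDictionary(page);
        CGPDFDictionaryRef resources, xObjects;
        if (CGPDFDictionaryGetDictionary(pageDict, "Resources", &resources) &&
            CGPDFDictionaryGetDictionary(resources, "XObject", &xObjects)) {
            CGPDFDictionaryApplyFunction(xObjects, extractImageXObject, NULL);
        }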

    Read the article

  • Setting corelocation results to NSNumber object parameters

    - by Dan Ray
    This is a weird one, y'all.

        - (void)locationManager:(CLLocationManager *)manager
            didUpdateToLocation:(CLLocation *)newLocation
                   fromLocation:(CLLocation *)oldLocation
        {
            CLLocationCoordinate2D coordinate = newLocation.coordinate;
            self.mark.longitude = [NSNumber numberWithDouble:coordinate.longitude];
            self.mark.latitude = [NSNumber numberWithDouble:coordinate.latitude];
            NSLog(@"Got %f %f, set %f %f",
                  coordinate.latitude, coordinate.longitude,
                  self.mark.latitude, self.mark.longitude);
            [manager stopUpdatingLocation];
            manager.delegate = nil;
            if (self.waitingForLocation) {
                [self completeUpload];
            }
        }

    The latitude and longitude on that "mark" object are synthesized properties backed by NSNumber ivars. In the simulator, my NSLog output for that line in the middle reads:

        2010-05-28 15:08:46.938 EverWondr[8375:207] Got 37.331689 -122.030731, set 0.000000 -44213283338325225829852024986561881455984640.000000

    That's a WHOLE lot further east than 1 Infinite Loop! The numbers are different on the device, but similar -- lat is still zero and long is a very unlikely large negative number. Elsewhere in the controller I'm accepting a button press and uploading a file (an image I just took with the camera) with its geocoding info attached, and I need that self.waitingForLocation to inform the CLLocationManager delegate that I already hit that button, so once it has done its deal it should go ahead and fire off the upload. Thing is, up in the button-click-receiving method I test whether Core Location is finished by testing self.mark.latitude, which seems to be getting set to zero...
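
    A likely explanation, stated as an assumption from the code above: the values are being stored correctly and the log is just reading them wrong. %f expects a double, but self.mark.latitude and self.mark.longitude are NSNumber objects (pointers), so the format string reinterprets pointer bits as floating point garbage. A sketch of a log line that matches the types:

        // Sketch: log the NSNumber objects with %@ (or unbox them with doubleValue).
        NSLog(@"Got %f %f, set %@ %@",
              coordinate.latitude, coordinate.longitude,
              self.mark.latitude, self.mark.longitude);

    The same mismatch would make any %f-style formatting of self.mark.latitude misleading; testing [self.mark.latitude doubleValue] avoids it.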

    Read the article

  • [SOLVED] How to create an FBO with stencil buffer in OpenGL ES 2.0?

    - by Alphones
    I need a stencil buffer on the 3GS to render planar shadows, and polygon offset doesn't work perfectly: it still has z-fighting problems. So I use the stencil buffer to make the shadow correct. It works on the win32 GLES2 emulator, but not on the iPhone. After I added a post effect to the whole scene, the stencil buffer stopped working even on the win32 GLES2 emulator. And when I tried to attach a stencil buffer to the FBO, the screen turned black. Here's my code:

        glGenRenderbuffers(1, &dbo); // depth buffer
        glBindRenderbuffer(GL_RENDERBUFFER, dbo);
        glRenderbufferStorage(GL_RENDERBUFFER, GL_DEPTH_COMPONENT24_OES, widthGL, heightGL);

        glGenRenderbuffers(1, &sbo); // stencil buffer
        glBindRenderbuffer(GL_RENDERBUFFER, sbo);
        glRenderbufferStorage(GL_RENDERBUFFER, GL_STENCIL_INDEX8, widthGL, heightGL);

        glGenFramebuffers(1, &fbo);
        glBindFramebuffer(GL_FRAMEBUFFER, fbo);
        glFramebufferTexture2D(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0, GL_TEXTURE_2D, tex, 0);
        glFramebufferRenderbuffer(GL_FRAMEBUFFER, GL_DEPTH_ATTACHMENT, GL_RENDERBUFFER, dbo);
        glFramebufferRenderbuffer(GL_FRAMEBUFFER, GL_STENCIL_ATTACHMENT, GL_RENDERBUFFER, sbo); // this makes the whole screen black

    The EGL context is created with STENCIL_SIZE=8, and it works without render-to-texture. I tried changing the RenderbufferStorage formats for both the depth buffer and the stencil buffer, but none of them works. Is there anything I have missed? Is the stencil buffer packed with the depth buffer? (I cannot find things like GL_DEPTH24_STENCIL8 ...)
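
    A hedged pointer, based on common reports for the PowerVR SGX parts: the stencil buffer in a render-to-texture FBO is only available packed together with the depth buffer, through the OES_packed_depth_stencil extension, so separate GL_STENCIL_INDEX8 storage fails. A sketch of the packed attachment, assuming the extension is present (it is on the 3GS):

        // Sketch: one renderbuffer with packed depth+stencil storage, attached to
        // both the depth and the stencil attachment points of the FBO.
        GLuint depthStencil;
        glGenRenderbuffers(1, &depthStencil);
        glBindRenderbuffer(GL_RENDERBUFFER, depthStencil);
        glRenderbufferStorage(GL_RENDERBUFFER, GL_DEPTH24_STENCIL8_OES, widthGL, heightGL);
        glFramebufferRenderbuffer(GL_FRAMEBUFFER, GL_DEPTH_ATTACHMENT,   GL_RENDERBUFFER, depthStencil);
        glFramebufferRenderbuffer(GL_FRAMEBUFFER, GL_STENCIL_ATTACHMENT, GL_RENDERBUFFER, depthStencil);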

    Read the article

  • Hiding a UINavigationController's UIToolbar during viewWillDisappear:

    - by Nathan de Vries
    I've got an iPhone application with a UITableView menu. When a row in the table is selected, the appropriate view controller is pushed onto the application's UINavigationController stack. My issue is that the MenuViewController does not need a toolbar, but the UIViewControllers which are pushed onto the stack do. Each UIViewController that gets pushed calls setToolbarHidden:animated: in viewDidAppear:. To hide the toolbar, I call setToolbarHidden:animated: in viewWillDisappear:. Showing the toolbar works: when the pushed view appears, the toolbar slides up and the view resizes correctly. However, when the back button is pressed, the toolbar slides down but the view does not resize. This means that there's a black strip along the bottom of the view as the other view transitions in. I've tried adding the toolbar's height to the height of the view prior to hiding the toolbar, but this causes the view to be animated during the transition, so there's still a black bar. I realise I could manage my own UIToolbar, but I'd like to use UINavigationController's built-in UIToolbar for convenience. This forum post mentions the same issue, but no workaround is given.
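
    One workaround to try, offered as an assumption rather than a known fix: let each controller declare its own toolbar state as it appears, instead of hiding the toolbar from the controller that is disappearing; the navigation controller then animates the change together with the pop transition and lays the incoming view out correctly. A sketch:

        // Sketch: in MenuViewController (no toolbar wanted).
        - (void)viewWillAppear:(BOOL)animated {
            [super viewWillAppear:animated];
            [self.navigationController setToolbarHidden:YES animated:animated];
        }

        // Sketch: in each pushed controller (toolbar wanted).
        - (void)viewWillAppear:(BOOL)animated {
            [super viewWillAppear:animated];
            [self.navigationController setToolbarHidden:NO animated:animated];
        }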

    Read the article

  • pushViewController Not Displaying UIView/Nib with tabbar and nav bar

    - by james
    I'm relatively new to Objective-C but not to programming, and am stuck with my iPhone app. I created a nav-based app with both a navigation bar and a tab bar controller, and set the tab bar as the root controller. I'm able to switch between each tab without any issues to various UIViews and UITableViews. My issue is that in one of the UITableViews I reach from the tab bar controller, the didSelectRowAtIndexPath: method is supposed to display a new UIView. The code below does not give any errors and runs fine, but does not show the new nib:

        if (newViewController == nil) {
            NSLog(@"yes nil");
            BookViewController *aNewViewController = [[BookViewController alloc]
                initWithNibName:@"BookOptionView" bundle:nil];
            self.newViewController = aNewViewController;
            [aNewViewController release];
        }
        BookAppDelegate *delegate = (BookAppDelegate *)[[UIApplication sharedApplication] delegate];
        [delegate.appNavBar pushViewController:newViewController animated:YES];

    Now when I do the below it works fine, but it gets rid of the nav bar and tab bar, which I assume is because it's a modal call instead of pushing the view controller:

        BookViewController *screen = [[BookViewController alloc]
            initWithNibName:@"BookOptionView" bundle:[NSBundle mainBundle]];
        screen.modalTransitionStyle = UIModalTransitionStyleCoverVertical;
        [self presentModalViewController:screen animated:YES];
        [screen release];

    Any ideas why I can't get the view controller to push correctly? In my application delegate file I declared an AppNavBarController object (inheriting from UINavigationController) called appNavBar. Any help would be appreciated!
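
    A sketch of the most common fix, stated as an assumption based on the description: if the table view controller is already embedded in the navigation controller that lives inside the selected tab, it should push onto its own self.navigationController rather than onto a navigation controller held by the app delegate; a navigation controller that is not part of the visible view hierarchy will accept the push but never display anything.

        // Sketch: inside didSelectRowAtIndexPath: of the table view controller.
        BookViewController *bookVC = [[BookViewController alloc]
            initWithNibName:@"BookOptionView" bundle:nil];
        [self.navigationController pushViewController:bookVC animated:YES];
        [bookVC release];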

    Read the article

  • Fuzzy Date algorithm in Objective-C

    - by Brock Woolf
    I would like to write a fuzzy date method for calculating relative dates in Objective-C for the iPhone. There is a popular explanation here: http://stackoverflow.com/questions/11/how-do-i-calculate-relative-time However, it is written in C# and leaves some pieces (such as ts and delta) undefined. How could this be used in Objective-C? Thanks.

        const int SECOND = 1;
        const int MINUTE = 60 * SECOND;
        const int HOUR = 60 * MINUTE;
        const int DAY = 24 * HOUR;
        const int MONTH = 30 * DAY;

        if (delta < 1 * MINUTE) {
            return ts.Seconds == 1 ? "one second ago" : ts.Seconds + " seconds ago";
        }
        if (delta < 2 * MINUTE) {
            return "a minute ago";
        }
        if (delta < 45 * MINUTE) {
            return ts.Minutes + " minutes ago";
        }
        if (delta < 90 * MINUTE) {
            return "an hour ago";
        }
        if (delta < 24 * HOUR) {
            return ts.Hours + " hours ago";
        }
        if (delta < 48 * HOUR) {
            return "yesterday";
        }
        if (delta < 30 * DAY) {
            return ts.Days + " days ago";
        }
        if (delta < 12 * MONTH) {
            int months = Convert.ToInt32(Math.Floor((double)ts.Days / 30));
            return months <= 1 ? "one month ago" : months + " months ago";
        } else {
            int years = Convert.ToInt32(Math.Floor((double)ts.Days / 365));
            return years <= 1 ? "one year ago" : years + " years ago";
        }
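
    A rough Objective-C translation, offered as a sketch (NSDate-based; the thresholds and rounding mirror the C# snippet above):

        - (NSString *)fuzzyTimeSinceDate:(NSDate *)date {
            const int MINUTE = 60, HOUR = 60 * MINUTE, DAY = 24 * HOUR, MONTH = 30 * DAY;
            NSTimeInterval delta = [[NSDate date] timeIntervalSinceDate:date];
            int seconds = (int)delta;
            int minutes = seconds / MINUTE;
            int hours   = seconds / HOUR;
            int days    = seconds / DAY;

            if (delta < 1 * MINUTE)
                return seconds <= 1 ? @"one second ago"
                                    : [NSString stringWithFormat:@"%d seconds ago", seconds];
            if (delta < 2 * MINUTE)  return @"a minute ago";
            if (delta < 45 * MINUTE) return [NSString stringWithFormat:@"%d minutes ago", minutes];
            if (delta < 90 * MINUTE) return @"an hour ago";
            if (delta < 24 * HOUR)   return [NSString stringWithFormat:@"%d hours ago", hours];
            if (delta < 48 * HOUR)   return @"yesterday";
            if (delta < 30 * DAY)    return [NSString stringWithFormat:@"%d days ago", days];
            if (delta < 12 * MONTH) {
                int months = days / 30;
                return months <= 1 ? @"one month ago"
                                   : [NSString stringWithFormat:@"%d months ago", months];
            }
            int years = days / 365;
            return years <= 1 ? @"one year ago"
                              : [NSString stringWithFormat:@"%d years ago", years];
        }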

    Read the article

  • Blackberry how to display message in app if device got no internet connection?

    - by Johannes
    Hello, I've just started programming for the BlackBerry device. I'm using version 5 of the API. I'm building a very simple application which is just a BrowserField. So far it's all working great: I can display my BrowserField with the content I need. The problem I'm having now is that if the device doesn't have an active internet connection, I get the ugly "Error requesting content for" message. I need to display my own message if the device doesn't have an active connection, something like "You need to have an active internet connection to use this application" with an Exit button which closes the app. I've tried to find this for hours but with no luck. Hopefully it's something relatively easy, so I can get help here. Here's my code so far:

        package com.mycompany.webview;

        import net.rim.device.api.browser.field2.*;
        import net.rim.device.api.ui.*;
        import net.rim.device.api.ui.container.*;

        public class webview extends UiApplication {
            public static void main(String[] args) {
                webview app = new webview();
                app.enterEventDispatcher();
            }

            public webview() {
                pushScreen(new webviewScreen());
            }
        }

        class webviewScreen extends MainScreen {
            public webviewScreen() {
                BrowserField myBrowserField = new BrowserField();
                add(myBrowserField);
                myBrowserField.requestContent("http://www.google.com");
            }
        }

    Would really appreciate some help please. Thanks

    Read the article

  • Are there more Cocoa and Cocoa Touch videos which are worth looking at?

    - by dontWatchMyProfile
    To gain a better understanding, I think it would be a good idea to watch every Cocoa video available on the net. I tend to find session videos from conferences or good podcast videos only by accident, so maybe someone has a handy list of links to great resources. I already know all the WWDC stuff and the material from Stanford, but a lot of universities around the world publish session videos as well, in local languages. Also, there are thousands of conferences around the world with great session videos. This list should compensate for all those who can't afford being at WWDC. Therefore, guys, let's create a handy list to fill the gaps for everyone! This is community wiki, so just list them all! I'll start with:

    English
      - 360 Conferences (360iDev) videos
      - Oredev, with some good iPhone dev session videos

    German
      - Macoun 2009, with some interesting session videos, if you can speak German

    Please don't hesitate to post links to videos in languages other than English. Many of us speak more languages, so go ahead! We'll be excited!

    Read the article

  • UISegmentedControl Best Practice

    - by Neal L
    Hi all, I'm trying to work out the "best" way to use a UISegmentedControl for an iPhone application. I've read a few posts here on Stack Overflow and seen a few people's ideas, but I can't quite sort out the best way to do this. The posts I'm referring to are: http://stackoverflow.com/questions/1559794/changing-views-from-uisegmentedcontrol and http://stackoverflow.com/questions/1047114/how-do-i-use-a-uisegmentedcontrol-to-switch-views It would seem that the options are:

      - Add each of the views in IB, lay them out on top of each other, then show/hide them
      - Create each of the subviews separately in IB, then create a container in the main view to populate with the subview that you need
      - Set up one really tall or really wide UIView and animate it left/right or up/down depending on the selected segment
      - Use a UITabBarController to swap out the subviews - seems silly
      - For tables, reload the table and, in cellForRowAtIndexPath:, populate it from different data sources or sections based on the segment selected (not the case for my app)

    So which approach is best for subview/non-table cases? Which is the easiest to implement? Could you share some sample code for the approach? Thanks!
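
    A sketch of the container approach (the second option above), under the assumption of a containerView and two child view controllers owned by the parent; those property names are illustrative, not from the question:

        // Sketch: swap the visible subview when the segment changes.
        - (IBAction)segmentChanged:(UISegmentedControl *)sender {
            UIView *incoming = (sender.selectedSegmentIndex == 0)
                ? self.firstViewController.view
                : self.secondViewController.view;
            [[self.containerView.subviews lastObject] removeFromSuperview];
            incoming.frame = self.containerView.bounds;
            [self.containerView addSubview:incoming];
        }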

    Read the article

  • Launching a modal UINavigationController

    - by Alexi Groove
    I'd like to launch a modal view controller the way one does with ABPeoplePickerNavigationController, that is, without having to create a navigation controller containing the view controller. Doing something similar yields a blank screen with no title for the navigation bar, and there's no associated nib file loaded for the view even though I am invoking initWithNibName: when init is called. My controller looks like:

        @interface MyViewController : UINavigationController

        @implementation MyViewController

        - (id)init {
            NSLog(@"MyViewController init invoked");
            if (self = [super initWithNibName:@"DetailView" bundle:nil]) {
                self.title = @"All Things";
            }
            return self;
        }

        - (void)viewDidLoad {
            [super viewDidLoad];
            self.title = @"All Things - 2";
        }

        @end

    When using the AB controller, all you do is:

        ABPeoplePickerNavigationController *picker = [[ABPeoplePickerNavigationController alloc] init];
        picker.peoplePickerDelegate = self;
        [self presentModalViewController:picker animated:YES];
        [picker release];

    ABPeoplePickerNavigationController is declared as:

        @interface ABPeoplePickerNavigationController : UINavigationController

    The other way to create a modal view, as suggested in Apple's 'View Controller Programming Guide for iPhone OS':

        // Create a regular view controller.
        MyViewController *modalViewController = [[[MyViewController alloc] initWithNibName:nil bundle:nil] autorelease];

        // Create a navigation controller containing the view controller.
        UINavigationController *secondNavigationController = [[UINavigationController alloc] initWithRootViewController:modalViewController];

        // Present the navigation controller as a modal view controller on top of an existing navigation controller.
        [self presentModalViewController:secondNavigationController animated:YES];

    I can create it this way fine (as long as I change MyViewController to inherit from UIViewController instead of UINavigationController). What else should I be doing to MyViewController to launch it the same way as ABPeoplePickerNavigationController?
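
    One way to get the ABPeoplePickerNavigationController-style call site, offered only as a sketch: keep the nib-backed content controller a plain UIViewController (as in the working case described above), and have callers alloc/init a UINavigationController subclass that wraps it in its own init. MyNavigationController below is a hypothetical name.

        @interface MyNavigationController : UINavigationController
        @end

        @implementation MyNavigationController

        - (id)init {
            // Sketch: build the nib-backed content controller, then let the
            // superclass wrap it, so callers can simply [[... alloc] init] and present.
            MyViewController *root = [[[MyViewController alloc]
                initWithNibName:@"DetailView" bundle:nil] autorelease];
            root.title = @"All Things";
            return [super initWithRootViewController:root];
        }

        @end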

    Read the article

  • NULL value when using NSDateFormatter after setting NSDate property via XML parsing

    - by David A Gibson
    Hello, I am using the following code to try to display a time in a table cell:

        TimeSlot *timeSlot = [timeSlots objectAtIndex:indexPath.row];
        NSDateFormatter *timeFormat = [[NSDateFormatter alloc] init];
        [timeFormat setDateFormat:@"HH:mm:ss"];
        NSLog(@"Time: %@", timeSlot.time);
        NSDate *mydate = timeSlot.time;
        NSLog(@"Time: %@", mydate);
        NSString *theTime = [timeFormat stringFromDate:mydate];
        NSLog(@"Time: %@", theTime);

    The log output is this:

        2010-04-14 10:23:54.626 MyApp[1080:207] Time: 2010-04-14T10:23:54
        2010-04-14 10:23:54.627 MyApp[1080:207] Time: 2010-04-14T10:23:54
        2010-04-14 10:23:54.627 MyApp[1080:207] Time: (null)

    I am new to developing for the iPhone, and as it all compiles with no errors or warnings I am at a loss as to why I am getting NULL in the log. Is there anything wrong with this code? Thanks

    Further info: I used the code exactly from your answer, lugte098, just to check, and I was getting dates, which leads me to believe that my TimeSlot class can't have a date correctly set in its NSDate property. So my question becomes: how, from XML, do I set an NSDate property? I have this code (abbreviated):

        - (void)parser:(NSXMLParser *)parser foundCharacters:(NSString *)string {
            if ([currentElement isEqualToString:@"Time"]) {
                currentTimeSlot.time = string
            }
        }

    Thanks
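
    A likely diagnosis, offered as an assumption: the parser callback assigns the raw NSString from the XML to the time property, so timeSlot.time is really a string, which is why it logs as "2010-04-14T10:23:54" but stringFromDate: returns nil. Parsing it into an NSDate first would fix it; the format string below matches the value shown in the log. (foundCharacters: can be called more than once per element, so accumulating the characters and converting in didEndElement: is more robust.)

        // Sketch: convert the ISO-style string from the XML into an NSDate.
        - (void)parser:(NSXMLParser *)parser foundCharacters:(NSString *)string {
            if ([currentElement isEqualToString:@"Time"]) {
                NSDateFormatter *parseFormat = [[NSDateFormatter alloc] init];
                [parseFormat setDateFormat:@"yyyy-MM-dd'T'HH:mm:ss"];
                currentTimeSlot.time = [parseFormat dateFromString:string];
                [parseFormat release];
            }
        }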

    Read the article

  • Fetch request error: no entity? Probably easy, but help!

    - by cksubs
    Hi, I'm going through the Stanford iPhone course. I'm on the second Paparazzi assignment. I'm getting this error:

        * Terminating app due to uncaught exception 'NSInvalidArgumentException', reason: 'executeFetchRequest:error: A fetch request must have an entity.'

    I've copied a lot of the code from this walkthrough site. Running his source code works perfectly, and at this point my code is pretty much exactly the same as his. But it throws that error. See below for the relevant bits:

        // Create the fetch request
        NSFetchRequest *request = [[NSFetchRequest alloc] init];
        NSEntityDescription *entity = [NSEntityDescription entityForName:@"Person"
                                                  inManagedObjectContext:context];
        [request setEntity:entity];

        // Set the sort descriptor
        NSSortDescriptor *sortDescriptor = [[NSSortDescriptor alloc] initWithKey:@"personName" ascending:NO];
        NSArray *sortDescriptors = [[NSArray alloc] initWithObjects:sortDescriptor, nil];
        [request setSortDescriptors:sortDescriptors];
        [sortDescriptors release];
        [sortDescriptor release];

        // Set up an NSFetchedResultsController to hold the fetch
        NSFetchedResultsController *frc = [[NSFetchedResultsController alloc]
            initWithFetchRequest:request
            managedObjectContext:context
              sectionNameKeyPath:nil
                       cacheName:@"personCache"];

        // Execute the fetch
        NSError *error;
        NSLog(@"Prints Here");
        [frc performFetch:&error];
        NSLog(@"Doesn't Print Here");

    I'm clearly setting the entity with [request setEntity:entity], so the error has me stumped. Is there something I'm missing? Maybe in another file? I don't know. I'm still so confused with Objective-C.... Thanks.
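
    A common cause, stated as an assumption: if the managed object context is nil (for example, because it was never passed into this controller), entityForName:inManagedObjectContext: returns nil, setEntity:nil is silently accepted, and the fetch then raises exactly this exception. A quick sanity check before building the request:

        // Sketch: fail loudly if the context or the entity lookup is nil.
        NSAssert(context != nil, @"managedObjectContext was never set on this controller");
        NSEntityDescription *entity = [NSEntityDescription entityForName:@"Person"
                                                  inManagedObjectContext:context];
        NSAssert(entity != nil, @"no entity named 'Person' in the managed object model");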

    Read the article

  • Problem with UIScrollView

    - by leon
    Hi, sorry for the long-winded post. I am trying to understand UIScrollView and am running into a very simple problem:

      - I create a scroll view
      - I make the view 1.5 times larger than its normal size
      - Using UIScrollView, I expect to see some edge elements of the view out of bounds, but I should be able to pan the view, bringing the missing elements back into the visible area

    However, I am seeing that I can't just pan/scroll the view any way I want; instead, the view always wants to scroll up as soon as I move my finger away from the screen (touch-end event). I am not handling any touches, etc. I just want to understand why the scaled view does not stay put where I scroll it.

        CGRect viewFrame = self.view.frame;
        viewFrame.size.width *= 1.5;
        viewFrame.size.height *= 1.5;
        CGSize mySize = viewFrame.size;
        [(UIScrollView *)self.view setContentSize:mySize];
        self.view.transform = CGAffineTransformMakeScale(1.5, 1.5);

    What I am really trying to accomplish is something similar to Numbers on the iPad (the same code will work on iPhone):

      - There is a view with lots of controls on it (an order entry form)
      - The user can zoom into the entire form so all elements look bigger
      - The user can pan the form, bringing various elements into the visible area of the screen

    It seems that UIScrollView should be able to handle zoom and pan actions (for now I am using an affine transform to zoom into the order entry form on the iPad). Thanks
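
    A sketch of the more conventional setup, under the assumption that the form lives in its own content view inside the scroll view rather than the scroll view itself being transformed: UIScrollView then handles both the zooming and the panning, and the content stays where it is scrolled.

        // Sketch, e.g. in viewDidLoad; contentView is the full-size order entry form.
        scrollView.contentSize = contentView.bounds.size;
        scrollView.minimumZoomScale = 1.0;
        scrollView.maximumZoomScale = 2.0;
        scrollView.delegate = self;
        [scrollView addSubview:contentView];

        // UIScrollViewDelegate: tell the scroll view which subview to scale when zooming.
        - (UIView *)viewForZoomingInScrollView:(UIScrollView *)sv {
            return contentView;
        }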

    Read the article

  • Forcing CLLocationManager updates - does it help or hurt?

    - by Steve N
    I've been trying to find any way to optimize the performance of my location-based iPhone application and have seen some people mention that you can force location updates by starting and stopping your CLLocationManager. I need the best accuracy I can get, and the user in my case would probably like to see updates every few seconds (say, 10 seconds) as they walk around. I've set the filters accordingly, but I notice that sometimes I don't get any updates on the device for quite some time. I'm testing the following approach, which forces an update when a fixed time interval passes (I'm using 20 seconds). My gut tells me this really won't help me provide more accurate updates to the user, and that just leaving CLLocationManager running all the time is probably the best approach.

        - (void)forceLocationUpdate {
            [[LocationManager locationManager] stopUpdates];
            [[LocationManager locationManager] startUpdates];
            [self performSelector:@selector(forceLocationUpdate) withObject:nil afterDelay:20.0];
        }

    My question is: does forcing updates from CLLocationManager actually improve Core Location performance? Does it hurt performance? If I'm outside in an open field with good GPS reception, will this help then? Does anyone have experience trying this? Thanks in advance, Steve
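
    For comparison, a sketch of the "leave it running" configuration; the property names are from CLLocationManager, but the 10 m filter is an illustrative value. The delegate is only called when the fix moves by at least the filter distance, which at walking speed already spaces updates a few seconds apart.

        // Sketch: continuous updates with an explicit accuracy and distance filter.
        locationManager.desiredAccuracy = kCLLocationAccuracyBest;
        locationManager.distanceFilter = 10.0; // metres - illustrative value
        [locationManager startUpdatingLocation];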

    Read the article

  • Objective-C memory model

    - by TofuBeer
    I am attempting to wrap my head around one part of the Objective-C memory model (specifically on the iPhone, so no GC). My background is C/C++/Java, and I am having an issue with the following bit of code (also wondering if I am doing this in an "Objective-C way" or not):

        - (NSSet *)retrieve {
            NSMutableSet *set;
            set = [NSMutableSet new];
            // would normally fill the set in here with some data
            return ([set autorelease]);
        }

        - (void)test {
            NSSet *setA;
            NSSet *setB;
            setA = [self retrieve];
            setB = [[self retrieve] retain];
            [setA release];
            [setB release];
        }

    EDIT: Based on comments below, the updated retrieve method:

        - (NSSet *)retrieve {
            NSMutableSet *set;
            set = [[[NSMutableSet alloc] initWithCapacity:100] autorelease];
            // would normally fill the set in here with some data
            return (set);
        }

    The above code gives a warning for [setA release]: "Incorrect decrement of the reference count of an object is not owned at this point by the caller". I thought that "new" set the reference count to 1. Then the "retain" call would add 1, and the "release" call would drop it by 1. Given that, wouldn't setA have a reference count of 0 at the end and setB have a reference count of 1 at the end? From what I have figured out by trial and error, setB is correct and there is no memory leak, but I'd like to understand why that is the case (what is wrong with my understanding of "new", "autorelease", "retain", and "release").
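
    An annotated sketch of the ownership the analyzer is applying (the Cocoa naming convention: only alloc/new/copy/mutableCopy, or an explicit retain, give the caller a reference it must release; autorelease hands ownership to the pool):

        NSSet *setA = [self retrieve];            // autoreleased - the caller owns nothing
        NSSet *setB = [[self retrieve] retain];   // retain - the caller now owns one reference
        // [setA release];                        // wrong: releases a reference that was never owned
        [setB release];                           // correct: balances the retain above

    What matters is not the final retain count but that retrieve has already balanced its own new with autorelease, so the caller's only obligation is to balance any retain it adds itself.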

    Read the article

  • Autoresize UIScrollView

    - by Sebastian
    I made a UIViewController which programmatically generates a UIScrollView. Everything's fine, but when I rotate the device, the UIScrollView should resize so it takes the complete width of my view. Is there a way to do that without rebuilding the complete UIScrollView? Thx a lot! Sebastian

    This is called in my viewDidLoad:

        - (void)buildmyScroller {
            myScroller = [[UIScrollView alloc] initWithFrame:CGRectMake(0, 800, 768, 100)];
            // ...adding some subviews to myScroller
            thumbScroller.contentSize = CGSizeMake(3000, 100);
            [[self view] addSubview:myScroller];
        }

    Then I tried to resize myScroller with this:

        - (void)changemyScroller {
            UIInterfaceOrientation interfaceOrientation = self.interfaceOrientation;
            if (interfaceOrientation == UIInterfaceOrientationPortrait) {
                [thumbScroller setFrame:CGRectMake(0, 805, 768, 150)];
            } else if (interfaceOrientation == UIInterfaceOrientationPortraitUpsideDown) {
                thumbScroller.frame = CGRectMake(0, 805, 768, 150);
            } else if (interfaceOrientation == UIInterfaceOrientationLandscapeLeft) {
                thumbScroller.frame = CGRectMake(0, 549, 1024, 150);
            } else if (interfaceOrientation == UIInterfaceOrientationLandscapeRight) {
                thumbScroller.frame = CGRectMake(0, 549, 1024, 150);
            }
        }

    And I called the method in didAnimateFirstHalf... because I'm not sure where else to call it. Thx a lot again!!
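
    A sketch of the usual way to get this without rebuilding anything: size the scroller relative to the view's bounds and give it a flexible-width (and, if wanted, flexible-top-margin) autoresizing mask, so UIKit resizes it during rotation. The 100-point height comes from the question; the mask choice is an assumption.

        // Sketch: let autoresizing handle rotation instead of per-orientation frames.
        myScroller = [[UIScrollView alloc] initWithFrame:
            CGRectMake(0, self.view.bounds.size.height - 100,
                       self.view.bounds.size.width, 100)];
        myScroller.autoresizingMask = UIViewAutoresizingFlexibleWidth |
                                      UIViewAutoresizingFlexibleTopMargin;
        [self.view addSubview:myScroller];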

    Read the article

  • presentModalViewController does not want to work when called from a protocol method

    - by johnbdh
    I have a subview that, when double tapped, calls a protocol method on the subview's parent view controller like this:

        - (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event {
            UITouch *theTouch = [touches anyObject];
            if (theTouch.tapCount == 1) {
            } else if (theTouch.tapCount == 2) {
                if ([self.delegate respondsToSelector:@selector(editEvent:)]) {
                    [self.delegate editEvent:dictionary];
                }
            }
        }

    Here is the protocol method, with the dictionary-consuming code removed:

        - (void)editEvent:(NSDictionary){
            EventEditViewController *eventEditViewController = [[EventEditViewController alloc]
                initWithNibName:@"EventEditViewController" bundle:nil];
            eventEditViewController.delegate = self;
            navigationController = [[UINavigationController alloc]
                initWithRootViewController:eventEditViewController];
            [self presentModalViewController:navigationController animated:YES];
            [eventEditViewController release];
        }

    The protocol method is called and runs without any errors, but the modal view does not present itself. I temporarily copied the protocol method's code into an IBAction method for one of the parent view's buttons to isolate it from the subview. When I tap this button, the modal view works fine. Can anyone tell me what I am doing wrong? Why does it work when executed from a button on the parent view and not from a protocol method called from a subview? Here is what I have tried so far to work around the problem:

      - Restarted Xcode and the simulator
      - Ran on the device (iPod touch)
      - Presenting eventEditViewController instead of navigationController
      - Using push instead of presentModal
      - Delaying the call to the protocol with performSelector: directly to the protocol method, to another method in the subview which calls the protocol method, and from the protocol method to another method with the presentModal calls
      - Using a timer

    I have it currently set up so that the protocol method calls a known working method that presents a different view. Before calling presentModalViewController: it pops up a UIAlertView, which works every time, but the modal view refuses to display when called via the protocol method. I'm stumped. Perhaps it has something to do with the fact that I am calling the protocol method from a UIView class instead of a UIViewController class. Maybe I need to create a UIViewController for the subview?? Thanks, John

    Read the article

  • Adapting existing HTML/Javascript model to Titanium's latest release (v 0.9)

    - by Alan Neal
    In pre-0.9 versions of Titanium, one could simply specify an .html file (local or remote) in the tiapp.xml file and interact with it in the same manner as one would on a website. As of version 0.9 that is no longer the case: one creates the entire app dynamically. Unfortunately, this broke my previous implementation and, other than an updated Kitchen Sink, much of the new model and the API calls are not covered in the documentation (e.g., createLabel). So my question is this: what are the simplest steps for re-creating the previous behaviour (knowingly forgoing some of the advantages of Titanium's latest approach if necessary)? My previous implementation worked exactly as it functions on the website. The website has a single index.html file with no content other than links to JavaScript and style files. The document body's onload event called the first JavaScript function (located in the main script) and, from that point forth, the entire content was dynamically created. How can I set up the latest version of Titanium so that I am poised to do the exact same thing? BTW: whereas I previously had the choice to keep the files local or remote, I don't believe that remote access (e.g., simply using the webView widget to point to the website) is viable. That's because pages displayed via the webView do not have access to most of the API. Since the iPhone and Safari browsers do not support the file input type, the only means for uploading files (something my app requires) is calling Titanium's function. Thanks in advance.

    Read the article
