Search Results

Search found 13860 results on 555 pages for 'core graphics'.

  • Updating a deallocated UIWebView from a background thread

    - by Dan Ray
    As you can see from the title, I've programmed myself into a corner and I've got several things working against me. This is happening in a UIViewController subclass that manages a large and complex view. One part of it is a UIWebView that contains output from a web request that I had to build and execute, and manually assemble HTML from. Since it takes a second or two to run, I dropped the work into the background by calling performSelectorInBackground:withObject:. Then, from the method I call there, I use performSelectorOnMainThread: to get back to the surface of the thread stack and update the UIWebView with what I just got. Like this (which I've cut down to show only the relevant issues):

        -(void)locationManager:(CLLocationManager *)manager
            didUpdateToLocation:(CLLocation *)newLocation
                   fromLocation:(CLLocation *)oldLocation {
            // then get MapQuest directions
            NSLog(@"Got called to handle new location!");
            [manager stopUpdatingLocation];
            [self performSelectorInBackground:@selector(getDirectionsFromHere:)
                                   withObject:newLocation];
        }

        - (void)getDirectionsFromHere:(CLLocation *)newLocation {
            NSAutoreleasePool *pool = [[NSAutoreleasePool alloc] init];
            CLLocationCoordinate2D here = newLocation.coordinate;
            // assemble a call to the MapQuest directions API in NSString *dirURL
            // ...cut for brevity
            NSLog(@"Query is %@", dirURL);
            NSString *response = [NSString stringWithContentsOfURL:[NSURL URLWithString:dirURL]
                                                          encoding:NSUTF8StringEncoding
                                                             error:NULL];
            NSMutableString *directionsOutput = [[NSMutableString alloc] init];
            // assemble the response into an HTML table in NSMutableString *directionsOutput
            // ...cut for brevity
            [self performSelectorOnMainThread:@selector(updateDirectionsWithHtml:)
                                   withObject:directionsOutput
                                waitUntilDone:NO];
            [directionsOutput release];
            [pool drain];   // drain releases the pool under reference counting
        }

        - (void)updateDirectionsWithHtml:(NSString *)directionsOutput {
            [self.directionsWebView loadHTMLString:directionsOutput baseURL:nil];
        }

    This all works totally great, UNLESS I've backed out of this view controller before CLLocationManager hits its delegate method. If that happens after I've already left the view, I get:

        2010-06-07 16:38:08.508 EverWondr[180:760b] bool _WebTryThreadLock(bool), 0x1b6830:
        Tried to obtain the web lock from a thread other than the main thread or the web thread.
        This may be a result of calling to UIKit from a secondary thread. Crashing now...

    Despite what this says, I can repeatably cause this crash by backing out too early, and I'm not at all convinced that attempting a UI update from a background thread is really the issue; I think it's that my UIWebView has been deallocated. I suspect the fact that I was just IN a background thread makes the runtime assume that's the problem, but I feel fairly sure it's not. So how do I tell CLLocationManager not to worry about it when I'm backing out of that view? I tried [self.locationManager stopUpdatingLocation] inside my viewWillDisappear method, but that didn't do it. (Incidentally, MapQuest's APIs are FANTASTIC. Way WAY better than anything Google provides. I can't recommend them highly enough.)
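
    One common way to defuse this kind of crash (a sketch only, not necessarily the asker's final fix) builds on the viewWillDisappear attempt already mentioned: besides stopping updates, detach the controller as the location manager's delegate so no late callbacks arrive, and guard the main-thread update against a web view that is no longer on screen:

        - (void)viewWillDisappear:(BOOL)animated {
            [super viewWillDisappear:animated];
            // Stop updates and detach the delegate so no further callbacks
            // reach this controller after it leaves the screen.
            [self.locationManager stopUpdatingLocation];
            self.locationManager.delegate = nil;
        }

        - (void)updateDirectionsWithHtml:(NSString *)directionsOutput {
            // If the web view is no longer attached to a window, skip the update.
            if (self.directionsWebView.window == nil) {
                return;
            }
            [self.directionsWebView loadHTMLString:directionsOutput baseURL:nil];
        }

    The background selector still runs to completion; it just no longer touches UIKit once the view hierarchy is gone.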

  • NSPredicate acting strange in NSFetchedResultsController

    - by Scott Langendyk
    I feel as if this should be very simple, but it's behaving strangely. I have three entities, related as follows:

        Entity A <-- Entity B <<-- Entity C

    I have an NSFetchedResultsController and I'm trying to filter the results of Entity A using the following predicate:

        [NSPredicate predicateWithFormat:@"NONE entityB.entityC == %@", self.entityC];

    When I try to run the app, the output shows no results. If I alter the predicate slightly to:

        [NSPredicate predicateWithFormat:@"ANY entityB.entityC == %@", self.entityC];

    it shows me only the results that I want it to filter out. Why is this happening?
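
    For what it's worth, NONE is documented as shorthand for NOT ANY, but it has a reputation for misbehaving against Core Data's SQLite store. A commonly suggested workaround (a sketch, assuming the to-many relationship from Entity A is literally named entityB) is to express the same condition with a SUBQUERY:

        NSPredicate *predicate =
            [NSPredicate predicateWithFormat:
                @"SUBQUERY(entityB, $b, $b.entityC == %@).@count == 0",
                self.entityC];

    This asks for Entity A objects that have zero related Entity B objects pointing at the given Entity C, which is the set the NONE predicate was meant to return.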

  • Easy Flood Fill

    - by Jimmy
    Some advice, please. I'm just starting out in C#. I've managed to get some shapes created on a Windows form, and now I'd like to fill them with color. In the old C++ I studied years ago, there was a floodfill function that was really easy. It has been an unpleasant realization to find there's no similar method available in regular old C#. Does anyone have advice for me, or some code, so I can implement filling without understanding GDI+, DirectX, or the rest of the avalanche of acronyms I've run into while researching this on the web? I need to fill irregular shapes, bounded by a certain color. Gradient and transparency control would be nice, but I'd settle for plain old solid fill right now, just to get a modicum of control over this. Any help, code or advice would be really appreciated.

  • How well are SVG filter elements defined?

    - by Peter Becker
    We are considering using SVG filters as part of our toolchain, serving the SVG to browsers capable of supporting it, while serving pre-rendered PNGs to others. One problem we noticed is that the rendering of filter chains seems to be very inconsistent across renderers. When looking at the "filters01" example from the SVG specification, the rendering looks very different across the tools we tried. Chrome (5.0.307.11) failed to render the image, while other tools (Firefox 3.6, Opera 10.10, Inkscape 0.47, GIMP 2.6.7) render something vaguely similar in style to the picture in the specification, but no two are truly the same. Is that an issue of under-specification, or are the tools just not there yet? If we were to use SVG with filter effects: is there a reference tool that can give us a rendering the way it is intended by the spec?

  • How do I prevent jagged edges alongside the surfaces of my 3d model?

    - by badcodenotreat
    Let's say I've implemented, in OpenGL, a crude model viewer with shading which renders a series of blocks, such that I have something that looks like this: http://i.imgur.com/TsF7K.jpg Whenever I rotate my model to the side, it causes an unwanted jagged effect along any surface with a steep viewing angle: http://i.imgur.com/Bgl9o.jpg I'm pretty sure this is due to the polygon offset I used to prevent z-fighting between the model and the wireframe, but I'm not able to find the factor/units parameters for glPolygonOffset that prevent this unwanted effect. What are the best values of factor and units for glPolygonOffset to avoid this? Would implementing anti-aliasing alleviate the problem, and is the trade-off in performance trivial or significant? Or is this perhaps a shading issue, and should I try a solution along that line of thought?

  • How to find the state location from iPhone GPS?

    - by Silent
    Hello all. I'm trying to add a feature to my program that locates the person via GPS and sets a variable to the state that person is in. For example: GPS locates the person from his/her iPhone and returns the state they are in; say it's California, then the state variable gets set to the string "California". Would someone have an example? Any help is appreciated, thanks!
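
    One approach from that SDK era (a sketch, not a complete answer: MKReverseGeocoder comes from MapKit, and the class below is assumed to adopt MKReverseGeocoderDelegate) is to reverse-geocode the coordinate and read the placemark's administrativeArea, which carries the state name:

        - (void)findStateForLocation:(CLLocation *)newLocation {
            MKReverseGeocoder *geocoder =
                [[MKReverseGeocoder alloc] initWithCoordinate:newLocation.coordinate];
            geocoder.delegate = self;
            [geocoder start];
        }

        - (void)reverseGeocoder:(MKReverseGeocoder *)geocoder
               didFindPlacemark:(MKPlacemark *)placemark {
            // e.g. @"California" for a coordinate inside that state
            NSString *state = placemark.administrativeArea;
            NSLog(@"State: %@", state);
            [geocoder release];
        }

        - (void)reverseGeocoder:(MKReverseGeocoder *)geocoder
               didFailWithError:(NSError *)error {
            NSLog(@"Reverse geocoding failed: %@", error);
            [geocoder release];
        }

    The lookup is asynchronous, so the state string only becomes available in the didFindPlacemark: callback.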

  • How do I draw a proper parallelogram that can be animated on iPhone?

    - by Robert Kosara
    I'm trying to do something very simple: I need some parallelograms in my program. These are attached to other objects, all of which are UIViews. It's important that I be able to animate these, since the objects they are attached to can also be animated. I've figured out how to use the transform in UIView/CALayer to do this, but the problem is that these sheared UIViews don't look very nice: there is no anti-aliasing of the edges. Is there some other way to do this? I would like to use UIViews, since I also use them for user interaction and animation is so much easier than drawing by hand. I don't want to use OpenGL for this.
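
    One alternative worth sketching (my own assumption, not something from the question): instead of shearing a rectangular UIView with a transform, give the view a custom drawRect: that fills the parallelogram as a Core Graphics path. Path edges are anti-aliased when drawn, and the view itself can still be animated like any other UIView:

        // In a UIView subclass with a clear backgroundColor; "shear" is a
        // hypothetical constant controlling how far the top edge is offset.
        - (void)drawRect:(CGRect)rect {
            CGContextRef ctx = UIGraphicsGetCurrentContext();
            CGRect b = self.bounds;
            CGFloat shear = 20.0f;
            CGContextSetFillColorWithColor(ctx, [UIColor blueColor].CGColor);
            CGContextMoveToPoint(ctx, shear, 0.0f);
            CGContextAddLineToPoint(ctx, b.size.width, 0.0f);
            CGContextAddLineToPoint(ctx, b.size.width - shear, b.size.height);
            CGContextAddLineToPoint(ctx, 0.0f, b.size.height);
            CGContextClosePath(ctx);
            CGContextFillPath(ctx);
        }

    Another frequently mentioned trick for transformed views is to leave a one-pixel transparent border around the content so the sheared edge gets blended, but the drawRect: route keeps the geometry exact.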

  • A dynamic array of class "landmark", inside another single class "landmarks"

    - by pinnacler
    I'm working on a robot localization simulator and I created a class called "landmark". The end result is going to be a robot that is always centered and always faces the top of the screen. As it turns, the bird's-eye view map will rotate around the robot. To accomplish this, I'm assuming I can rotate one class and have all elements inside rotate as well. So, the landmark class has properties x, y, label, and radius. This is supposed to simulate a tree location in a forest. To test everything, I need "forest data," and I wrote a script to generate 100 trees in a 100m x 100m area. The script automatically generates values within an acceptable range for x, y, and radius. The generated data is stored in an object called tempForest and is 100x3. Ideally, I want to create a class called "landmarks" (plural) that has 100 landmark instances inside. How would I instantiate 100 instances of landmark in one instance of landmarks using that randomly generated data? Ideally, I'd just type treeBeacons = landmarks(); and it would randomly populate 100 (user definable, set in a config file) instances with x, y, radius data. I'm not sure how to deal with a dynamic array of class "landmark" inside another single class "landmarks". Any ideas?

  • How to get objects after a Core Data context merge

    - by Emmettoc
    Hi, I tried to save data and merge contexts with Core Data and multiple threads in an iPhone app, but I can't get the managed objects in the main thread after merging. I wrote code just like this:

        [managedObjectContext performSelectorOnMainThread:@selector(mergeChangesFromContextDidSaveNotification:)
                                               withObject:notification
                                            waitUntilDone:YES];
        [self performSelectorOnMainThread:@selector(didMerged:)
                               withObject:objectIds
                            waitUntilDone:YES];

    So I tried to pass object IDs in order to get, on the main thread, the NSManagedObject instances that were generated in another thread. At first I tried the objectWithID: method, but it returned fault objects. Then I tried the existingObjectWithID: method, but it returned only some of the objects; the others were nil, with the following error:

        [Error] Error Domain=NSCocoaErrorDomain Code=133000
        "Operation could not be completed. (Cocoa error 133000.)"

    What is wrong? Is there any way to retrieve all the objects by their object IDs after merging from another thread? Thank you.
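
    One thing worth checking (a hedged sketch, using hypothetical backgroundContext and newObjects names): Cocoa error 133000 corresponds to NSManagedObjectReferentialIntegrityError, a fault pointing at an object the store cannot find, and one common cause is handing temporary object IDs to another context. Asking the background context for permanent IDs before it saves, and passing those across, avoids that:

        NSError *error = nil;
        // Convert temporary IDs into permanent, store-backed IDs before
        // handing them to the main thread.
        if (![backgroundContext obtainPermanentIDsForObjects:newObjects error:&error]) {
            NSLog(@"Could not obtain permanent object IDs: %@", error);
        }
        if (![backgroundContext save:&error]) {
            NSLog(@"Save failed: %@", error);
        }
        NSMutableArray *objectIds = [NSMutableArray array];
        for (NSManagedObject *object in newObjects) {
            [objectIds addObject:[object objectID]];
        }

    With permanent IDs, existingObjectWithID: on the main-thread context should resolve every object after the merge.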

  • drawing images and lines over UIScrollView

    - by Jorge
    I'm programming an app in which one of the view controllers shows a UIScrollView displaying an image. I'd like to load an image (a pushpin in PNG format) and draw it (and later delete it) at certain points over the UIScrollView's image. I'd also like to draw bezier paths on that image (and be able to delete them). I've programmed several apps, but this is the first time I've faced graphics programming and I don't know where to start. Any suggestions? Thanks!
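
    A simple way to get started (a sketch under my own assumptions: a contentView that holds the photo inside the scroll view, and a tapPoint already expressed in that content view's coordinates) is to skip custom drawing for the pins entirely and add them as subviews, which also makes removing them trivial:

        // Drop a pushpin image on top of the photo; it scrolls and zooms
        // together with the scroll view's content.
        UIImageView *pin = [[UIImageView alloc]
            initWithImage:[UIImage imageNamed:@"pushpin.png"]];
        pin.center = tapPoint;          // point in the content view's coordinates
        pin.tag = 1001;                 // hypothetical tag so the pin can be found later
        [self.contentView addSubview:pin];
        [pin release];                  // manual retain/release, pre-ARC

        // Later, to delete it:
        [[self.contentView viewWithTag:1001] removeFromSuperview];

    For the bezier paths, the usual pattern is a transparent UIView overlay on the same content view whose drawRect: strokes the paths with Core Graphics; calling setNeedsDisplay after adding or removing a path redraws the overlay.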

  • Is there a way to pause a CABasicAnimation?

    - by mclaughlinj
    I have a basic spinning animation on the iPhone. Is there any way that I can "pause" the animation so that the position of the view will be maintained? I guess one way of doing this would be to cause the animation to "complete" instead of calling "remove" on it; how would I do that?

        CABasicAnimation *rotationAnimation;
        rotationAnimation = [CABasicAnimation animationWithKeyPath:@"transform.rotation.z"];
        rotationAnimation.toValue = [NSNumber numberWithFloat:M_PI * 2];
        rotationAnimation.duration = 100;
        rotationAnimation.cumulative = YES;
        rotationAnimation.repeatCount = HUGE_VALF;
        rotationAnimation.removedOnCompletion = NO;
        rotationAnimation.fillMode = kCAFillModeForwards;
        [myView.layer addAnimation:rotationAnimation forKey:@"rotationAnimation"];
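
    The usual trick (a sketch of the widely used CAMediaTiming approach, not something from the question itself) is to leave the animation alone and freeze the layer's local time by setting its speed to zero; restoring the speed and adjusting beginTime resumes from the same spot:

        - (void)pauseLayer:(CALayer *)layer {
            // Record where the layer is in its own timeline, then stop its clock.
            CFTimeInterval pausedTime =
                [layer convertTime:CACurrentMediaTime() fromLayer:nil];
            layer.speed = 0.0;
            layer.timeOffset = pausedTime;
        }

        - (void)resumeLayer:(CALayer *)layer {
            CFTimeInterval pausedTime = layer.timeOffset;
            layer.speed = 1.0;
            layer.timeOffset = 0.0;
            layer.beginTime = 0.0;
            // Shift beginTime so the layer picks up exactly where it was paused.
            CFTimeInterval timeSincePause =
                [layer convertTime:CACurrentMediaTime() fromLayer:nil] - pausedTime;
            layer.beginTime = timeSincePause;
        }

    Called as [self pauseLayer:myView.layer] and [self resumeLayer:myView.layer], this keeps the view visually frozen at its current rotation without removing the animation.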

  • Auto scrolling or shifting a bitmap in .NET

    - by mikej
    I have a .NET GDI+ bitmap object (or, if it makes the problem easier, a WPF bitmap object) and what I want to do is shift the whole lot by dx, dy (whole pixels). I would ideally like to do it using .NET, but API calls are OK. It has to be efficient because it's going to be called 10,000 times, say, with moderately large bitmaps. I have implemented a solution using DrawImage, but it's slow and it halts the application for minutes while the GC cleans up the temp objects that have been used. I have also started to work on a version using ScrollDC, but so far have had no luck getting it to work on the DC of the bitmap (I can make it work by creating an API bitmap from the bitmap handle, then creating a compatible DC and calling ScrollDC, but then I have to put it back into the bitmap object). There has to be an "in place" way of shifting a bitmap. mikej

  • Error loading my managedObjectModel

    - by niklassaers
    Hi guys, when I call [myAppDelegate managedObjectModel], my application crashes at the retain line below (iPhone SDK v3.1.3):

        - (NSManagedObjectModel *)managedObjectModel {
            if (managedObjectModel != nil) {
                return managedObjectModel;
            }
            managedObjectModel = [[NSManagedObjectModel mergedModelFromBundles:nil] retain];
            return managedObjectModel;
        }

    Here is my crash trace:

        #0 0x905c44e6 in objc_exception_throw
        #1 0x01e78c3b in +[NSException raise:format:arguments:]
        #2 0x01e78b9a in +[NSException raise:format:]
        #3 0x000af99b in _NSArrayRaiseInsertNilException
        #4 0x0001c360 in -[NSCFArray insertObject:atIndex:]
        #5 0x0001c274 in -[NSCFArray addObject:]
        #6 0x01c16a7e in +[NSManagedObjectModel mergedModelFromBundles:]
        #7 0x00002432 in -[myAppDelegate managedObjectModel] at myAppDelegate.m:102

    What is going on here? This is template code that I haven't seen fail before. Cheers, Nik
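
    The trace shows mergedModelFromBundles: trying to insert nil into an array, which suggests something it expected to find came back nil, often a sign that the compiled model (.mom/.momd) is missing from the bundle or from the target. A sketch of a more explicit alternative (with "MyModel" as a placeholder file name): load the model directly, or at least name the bundle, so the failure point is obvious:

        NSString *modelPath = [[NSBundle mainBundle] pathForResource:@"MyModel"
                                                              ofType:@"momd"];   // or @"mom"
        if (modelPath != nil) {
            managedObjectModel = [[NSManagedObjectModel alloc]
                initWithContentsOfURL:[NSURL fileURLWithPath:modelPath]];
        } else {
            // Fall back to merging, but pass the bundle explicitly instead of nil.
            managedObjectModel = [[NSManagedObjectModel
                mergedModelFromBundles:[NSArray arrayWithObject:[NSBundle mainBundle]]] retain];
        }

    If the explicit path comes back nil, the model file itself is the thing to chase, not the accessor code.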

  • How to access pixels of an NSBitmapImageRep?

    - by Paperflyer
    I have an NSBitmapImageRep that is created like this:

        NSBitmapImageRep *imageRep =
            [[NSBitmapImageRep alloc] initWithBitmapDataPlanes:NULL
                                                    pixelsWide:waveformSize.width
                                                    pixelsHigh:waveformSize.height
                                                 bitsPerSample:8
                                               samplesPerPixel:4
                                                      hasAlpha:YES
                                                      isPlanar:YES
                                                colorSpaceName:NSCalibratedRGBColorSpace
                                                   bytesPerRow:0
                                                  bitsPerPixel:0];

    Now I want to access the pixel data, so I get a pointer to the pixel planes using:

        unsigned char *bitmapData;
        [imageRep getBitmapDataPlanes:&bitmapData];

    According to the documentation this returns a C array of five character pointers. But how can it do that? Since the type of the argument is unsigned char **, it can only return an array of chars, not an array of char pointers. So this leaves me wondering how to access the individual pixels. Do you have an idea how to do that? (I know there is the method -setColor:atX:y:, but it seems to be pretty slow if invoked for every single pixel of a big bitmap.)
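
    The parameter is indeed unsigned char **, but it is expected to point at the first element of an array of plane pointers that you provide; the method fills that array in. A sketch (the pixel coordinates are arbitrary):

        // Five slots: NSBitmapImageRep fills in up to five plane pointers.
        unsigned char *planes[5] = { NULL, NULL, NULL, NULL, NULL };
        [imageRep getBitmapDataPlanes:planes];

        NSInteger bytesPerRow = [imageRep bytesPerRow];  // per plane when isPlanar is YES
        NSInteger x = 10, y = 20;

        // With isPlanar:YES and 8 bits per sample, each plane holds one component.
        unsigned char red   = planes[0][y * bytesPerRow + x];
        unsigned char green = planes[1][y * bytesPerRow + x];
        unsigned char blue  = planes[2][y * bytesPerRow + x];
        unsigned char alpha = planes[3][y * bytesPerRow + x];

    Writing works the same way: assign into planes[n][...] and the bitmap's backing store is updated directly, which is far faster than calling -setColor:atX:y: per pixel.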

  • change fillColor of selected CAShapeLayer

    - by Frank
    I'm trying to change the fillColor of a CAShapeLayer when the layer it's contained in is touched. I'm able to change the background color of the tapped layer like this:

        - (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
            // Touch location in the view's coordinate space
            CGPoint point = [[touches anyObject] locationInView:self.view];
            CALayer *layer = [(CALayer *)self.view.layer.presentationLayer hitTest:point];
            layer = layer.modelLayer;
            layer.backgroundColor = [UIColor blueColor].CGColor;
        }

    This turns the background of "layer" blue as expected. My problem is: how do I change the color of the CAShapeLayer inside "layer"? Thanks!
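
    Since hitTest: hands back whatever layer is under the touch, one way (a sketch, assuming the shape layer is a direct sublayer of the tapped layer) is to walk the tapped layer's sublayers, find the CAShapeLayer instances, and set fillColor there; fillColor takes a CGColorRef just like backgroundColor:

        for (CALayer *sublayer in layer.sublayers) {
            if ([sublayer isKindOfClass:[CAShapeLayer class]]) {
                ((CAShapeLayer *)sublayer).fillColor = [UIColor blueColor].CGColor;
            }
        }

    If the CAShapeLayer itself is what gets hit-tested, the returned layer can simply be checked with isKindOfClass: and cast directly instead of iterating.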

  • How do I arbitrarily distort a textured polygon?

    - by Archagon
    I'd like to write a program that lets me arbitrarily distort a textured polygon by dragging its vertices. I want the texture to distort fluidly and without overlap, assuming the new polygon doesn't intersect itself. I should also be able to repeat the process with the new shape, and with a minimum amount of loss. Are there any algorithms for doing this?

  • imageWithCGImage and memory

    - by Adam Ernst
    If I use [UIImage imageWithCGImage:], passing in a CGImageRef, do I then release the CGImageRef or does UIImage take care of this itself when it is deallocated? The documentation isn't entirely clear. It says "This method does not cache the image object." Originally I called CGImageRelease on the CGImageRef after passing it to imageWithCGImage:, but that caused a malloc_error_break warning in the Simulator claiming a double-free was occurring.
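
    The usual ownership reading (a sketch of the Core Foundation naming convention, with a hypothetical context and someOtherImage): UIImage keeps whatever reference it needs, so you release a CGImageRef you obtained from a Create or Copy function once you have handed it over; a double-free warning usually means the CGImageRef was not actually owned by you in the first place:

        CGImageRef cgImage = CGBitmapContextCreateImage(context);  // "Create" => you own it
        UIImage *image = [UIImage imageWithCGImage:cgImage];
        CGImageRelease(cgImage);  // balances the Create above; the UIImage is unaffected

        // By contrast, do NOT release a reference you merely borrowed:
        CGImageRef borrowed = someOtherImage.CGImage;  // no Create/Copy, so not yours to release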

  • Are we DELPHI, VCL or Pascal programmers?

    - by José Eduardo
    I've been a Delphi database programmer since D2. Now I'm facing some digital imaging and 3D challenges that are making me start studying OpenGL, DirectX, color spaces and so on. I'm really trying, but nobody seems to use Delphi for this kind of stuff, just the good-old-paycheck database programming. OK, I know that we have some very smart guys behind some clever components, some of them open source. Is there any PhotoShop, Blender, Maya, Office, Sonar, StarCraft, or Call of Duty written in Delphi? Do I have to learn C++ to have access to the zillions of books about that kind of stuff? What is the fuss/hype behind this: int *varName = &anotherThing? Why do pointers seem to be the holy grail for these apps? I've downloaded MSVC++ Express and started to learn some WPF and Qt integration, and I think: "Man, Delphi does this kind of stuff with less code and fewer headaches, since the wheels were invented." This leads my mind to the following... Have you ever tried to write a simple notepad program using just Notepad and dcc32 in Pascal/Delphi? If so, Embarcadero could make our beloved Pascal compiler free and sell just the IDE, the VCL, the customer support... and back to the question: are we Delphi, VCL or Pascal programmers?

  • QPainter paints garbage

    - by DSblizzard
    A fragment of the program code:

        def add_link(Item0Num, Item1Num):
            global Mw, View  # Mw - MainWindow
            if Item0Num != Item1Num and not link_exists(Item0Num, Item1Num):
                append( links_to(Item1Num), Item0Num )
                append( links_from(Item0Num), Item1Num )
                LinkGi = TLinkGi()
                Mw.Scene.addItem(LinkGi)
                LinkGi.setZValue(200)
                LinkGi.scale(1 / View.Scale, 1 / View.Scale)
                LinkGi.Item0Num = Item0Num
                LinkGi.Item1Num = Item1Num

        class TLinkGi(QGraphicsItem):
            def paint(self, Painter, StyleOptionGraphicsItem, Widget):
                global Mw, View
                Pen = QPen(Qt.black, 1)
                Painter.setPen(Pen)
                X0, Y0 = task_center(self.Item0Num)
                self.setPos(X0, Y0)
                X1, Y1 = task_center(self.Item1Num)
                X, Y = int( (X1 - X0) * View.Scale ), int( (Y1 - Y0) * View.Scale )
                Painter.drawLine(0, 0, X, Y)
                #Mw.Scene.update(0, 0, Plan.Size, Plan.Size)  # (1)
                #Mw.gvMain.repaint()                          # (2)

            def boundingRect(self):
                global View
                Rect = QRectF(0, 0, Plan.Size, Plan.Size)
                return Rect

    This paints garbage like this: http://img697.imageshack.us/content_round.php?page=done&l=img697/5395/qpaintergarbage1.jpg When lines (1) and (2) are uncommented, things don't get much better: http://img63.imageshack.us/content_round.php?page=done&l=img63/9693/qpaintergarbage0.jpg Please help me solve this problem.

  • How to program a real-time accurate audio sequencer on the iPhone?

    - by Walchy
    Hi... I want to program a simple audio sequencer on the iPhone, but I can't get accurate timing. Over the last few days I tried all the possible audio techniques on the iPhone, starting from AudioServicesPlaySystemSound and AVAudioPlayer and OpenAL to Audio Queues. In my last attempt I tried the CocosDenshion sound engine, which uses OpenAL and allows you to load sounds into multiple buffers and then play them whenever needed. Here is the basic code.

    Init:

        int channelGroups[1];
        channelGroups[0] = 8;
        soundEngine = [[CDSoundEngine alloc] init:channelGroups channelGroupTotal:1];

        int i = 0;
        for (NSString *soundName in [NSArray arrayWithObjects:@"base1", @"snare1", @"hihat1", @"dit", @"snare", nil]) {
            [soundEngine loadBuffer:i fileName:soundName fileType:@"wav"];
            i++;
        }

        [NSTimer scheduledTimerWithTimeInterval:0.14
                                         target:self
                                       selector:@selector(drumLoop:)
                                       userInfo:nil
                                        repeats:YES];

    In the initialisation I create the sound engine, load some sounds into different buffers and then establish the sequencer loop with NSTimer.

    Audio loop:

        - (void)drumLoop:(NSTimer *)timer {
            for (int track = 0; track < 4; track++) {
                unsigned char note = pattern[track][step];
                if (note)
                    [soundEngine playSound:note-1
                            channelGroupId:0
                                     pitch:1.0f
                                       pan:.5
                                      gain:1.0
                                      loop:NO];
            }
            if (++step >= 16)
                step = 0;
        }

    That's it, and it works as it should, BUT the timing is shaky and unstable. As soon as something else happens (e.g. drawing in a view) it goes out of sync. As I understand the sound engine and OpenAL, the buffers are loaded (in the init code) and are then ready to start immediately with alSourcePlay(source) - so the problem may be with NSTimer? Now there are dozens of sound sequencer apps in the App Store and they have accurate timing; e.g. "idrum" keeps a perfectly stable beat even at 180 bpm while zooming and drawing are going on. So there must be a solution. Does anybody have any idea? Thanks for any help in advance! Best regards, Walchy

  • Rendering to a single Bitmap object from multiple threads

    - by Lee Treveil
    What I'm doing is rendering a number of bitmaps to a single bitmap. There could be hundreds of images, and the bitmap being rendered to could be over 1000x1000 pixels. I'm hoping to speed up this process by using multiple threads, but since the Bitmap object is not thread-safe it can't be rendered to directly and concurrently. What I'm thinking is to split the large bitmap into sections, one per CPU, render them separately, then join them back together at the end. I haven't done this yet in case you guys/girls have any better suggestions. Any ideas? Thanks

  • Video Synthesis - Making waves, pattern, gradients...

    - by Nathan
    I'm writing a program to generate some trippy visuals. My code paints each pixel with a random blue value, looping at 0.04-second intervals:

        for (y = 0; y < 5.5; y += 0.2) {
            for (x = 0; x < 7.5; x += 0.2) {
                b = rand() / ((double) RAND_MAX);
                setPixelColor(x, y, r, g, b);
            }
        }

    I'd like to do more than just make blue noise... but my maths is a bit rusty, and Google isn't helping me much today, so it would be great if you could share anything you know about making waves, patterns, gradient animations, etc., or links to such material.
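
    As one starting point beyond noise (a sketch built on the question's own setPixelColor helper and r/g variables; frame is a hypothetical per-redraw counter), the classic "plasma" look comes from summing a few sine waves of the coordinates and time, then mapping the result into [0, 1]:

        #include <math.h>

        double t = frame * 0.05;   // advance each redraw to animate the pattern
        for (double y = 0; y < 5.5; y += 0.2) {
            for (double x = 0; x < 7.5; x += 0.2) {
                double v = sin(x * 2.0 + t)
                         + sin((y * 2.0 + t) * 0.7)
                         + sin((x + y + t) * 1.3)
                         + sin(sqrt(x * x + y * y) * 2.0 - t);
                double b = (v + 4.0) / 8.0;   // four sines span [-4, 4]
                setPixelColor(x, y, r, g, b);
            }
        }

    Swapping which color channel receives the value, or feeding v through cos() for the other channels, gives gradients and interference patterns instead of noise.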
