Search Results

Search found 13860 results on 555 pages for 'core graphics'.

Page 24 of 555

  • How to access core data objects from Javascript?

    - by Eli
    How can I gain access to Core Data objects from JavaScript/WebKit on Mac OS X? I've made custom subclasses of NSManagedObject for each of my tables, with accessors defined using @property/@dynamic for each attribute, but neither isSelectorExcludedFromWebScript: nor isKeyExcludedFromWebScript: is called for any of them, so JavaScript just stops when I try to access any of the attributes. It returns 'undefined' if I access an attribute as a property (e.g. business.name), and JavaScript execution stops if I access it as a function (e.g. business.name()).
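
    For reference, here is a minimal sketch of the WebScripting overrides the asker expects WebKit to call (the Business class and its name attribute come from the question; placing the overrides in a category is the editor's assumption):

        #import <WebKit/WebKit.h>
        #import "Business.h"

        @implementation Business (WebScripting)

        // WebKit consults these class methods before exposing a native
        // object to JavaScript; returning NO exposes everything.
        + (BOOL)isSelectorExcludedFromWebScript:(SEL)selector
        {
            return NO;
        }

        + (BOOL)isKeyExcludedFromWebScript:(const char *)name
        {
            return NO; // includes KVC keys backed by @dynamic accessors
        }

        @end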

    Read the article

  • Computer graphics: programmatically create duotone (or separations)

    - by TarGz
    There is a special kind of image called a "duotone", which has just two channels. It is mostly used when you want to achieve higher-quality reproduction on a printing press with two colors (black, gray). My question is: I have a normal grayscale image; how do I convert it to duotone? I know I can tweak the curves in Photoshop, but that is not what I'm asking; rather, how do I do it programmatically? Perhaps there is a library that can do just that? What about "dot gain compensation"? "Total ink coverage"?
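
    The core of a separation is pushing each gray value through one transfer curve per ink. A rough sketch of the idea (the curves below are invented placeholders; a real workflow would use measured curves and apply dot-gain compensation and a total-ink limit on top):

        #include <math.h>
        #include <stddef.h>

        // Map one 8-bit grayscale plane to two ink planes, one transfer
        // curve per ink. In the ink planes, 0 = no ink, 255 = full ink.
        static void grayToDuotone(const unsigned char *gray,
                                  unsigned char *blackInk,
                                  unsigned char *grayInk,
                                  size_t pixelCount)
        {
            for (size_t i = 0; i < pixelCount; i++) {
                double v = gray[i] / 255.0;        // 0 = black, 1 = white
                double k = pow(1.0 - v, 1.4);      // placeholder black-ink curve
                double g = (1.0 - v) * (1.0 - k);  // placeholder gray-ink curve
                blackInk[i] = (unsigned char)(k * 255.0 + 0.5);
                grayInk[i]  = (unsigned char)(g * 255.0 + 0.5);
            }
        }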

    Read the article

  • question about working with System.Drawing.Graphics

    - by backdoor
    Hi all. I have a System.Drawing.Point[] filled with some System.Drawing.Point values. When I draw these points as a polygon in a System.Windows.Forms form, the drawn polygon does not fit entirely on the screen, or is sometimes very small (shown as 2-3 pixels). I wonder if there is a library to which I can just pass the Point[] and have it scale and translate the points so the polygon is drawn with all points visible and scaled to fit the screen (I mean small objects shown at 2-3 pixels scale up to fill the screen). Thanks all, and sorry for my bad English.

    Read the article

  • Core Location and speed measurements

    - by Krumelur
    Does anyone know if Core Location in the iPhone OS uses anything but simple vector math to calculate speed? I've read that the GPS system can provide speed measurements that can be accurate even when position is not (I believe using the Doppler shift of the signals). I've tried and failed to see if the iPhone does this. The question is basically: does the reported speed carry independent information, or is it just a convenience value computed from (filtered?) location data?
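
    For anyone comparing, CLLocation does expose a speed reading directly (in m/s; a negative value means invalid). A sketch that logs it next to a naive position-derived estimate, using the delegate callback of that era:

        - (void)locationManager:(CLLocationManager *)manager
            didUpdateToLocation:(CLLocation *)newLocation
                   fromLocation:(CLLocation *)oldLocation
        {
            if (newLocation.speed >= 0) {
                NSLog(@"Reported speed: %.2f m/s", newLocation.speed);
            }

            // Naive vector-math estimate: distance over elapsed time.
            NSTimeInterval dt = [newLocation.timestamp
                                    timeIntervalSinceDate:oldLocation.timestamp];
            if (oldLocation && dt > 0) {
                CLLocationDistance d =
                    [newLocation distanceFromLocation:oldLocation];
                NSLog(@"Derived speed:  %.2f m/s", d / dt);
            }
        }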

    Read the article

  • Core Data Change property value when another property changes

    - by user320587
    Hi, I have a Core Data entity with three properties: startDate, endDate and duration. All three are persistent properties. I would like to know how I can calculate and update the duration property whenever the value of startDate or endDate changes. BTW, I can't make duration a transient property, since I have to sort on it in my table view. Any help is greatly appreciated. Thanks, Javid
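
    One common pattern is to override the date setters on the NSManagedObject subclass and recompute duration there. A minimal sketch (assuming the subclass is called Event and duration is stored as an NSNumber of seconds; both names are illustrative):

        // In the Event (NSManagedObject subclass) implementation:
        - (void)setStartDate:(NSDate *)date
        {
            [self willChangeValueForKey:@"startDate"];
            [self setPrimitiveValue:date forKey:@"startDate"];
            [self didChangeValueForKey:@"startDate"];
            [self updateDuration];
        }

        - (void)setEndDate:(NSDate *)date
        {
            [self willChangeValueForKey:@"endDate"];
            [self setPrimitiveValue:date forKey:@"endDate"];
            [self didChangeValueForKey:@"endDate"];
            [self updateDuration];
        }

        - (void)updateDuration
        {
            if (self.startDate && self.endDate) {
                NSTimeInterval secs =
                    [self.endDate timeIntervalSinceDate:self.startDate];
                self.duration = [NSNumber numberWithDouble:secs];
            }
        }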

    Read the article

  • iOS Core Data migration: moving something from an entity into a file

    - by Tim Sullivan
    I have a scenario where I'm moving the contents of a blob stored in a Core Data entity into a file. I need a way to export that data during a migration: given the entity being converted, save the blob to a file and write the location of that file into the converted entity's appropriate attribute. I can't seem to find a way to do this. The docs on the three-stage migration seem to indicate it can be done, but I'm not sure where, or how exactly, to define things.
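
    The usual hook for this is a custom NSEntityMigrationPolicy assigned to the entity mapping in the mapping model. A sketch of the shape (the blobData/dataPath attribute names and the file location are invented for illustration):

        @interface BlobToFileMigrationPolicy : NSEntityMigrationPolicy
        @end

        @implementation BlobToFileMigrationPolicy

        - (BOOL)createDestinationInstancesForSourceInstance:(NSManagedObject *)source
                                              entityMapping:(NSEntityMapping *)mapping
                                                    manager:(NSMigrationManager *)manager
                                                      error:(NSError **)error
        {
            NSManagedObject *dest = [NSEntityDescription
                insertNewObjectForEntityForName:[mapping destinationEntityName]
                         inManagedObjectContext:[manager destinationContext]];

            // Write the blob out and store the file's path instead.
            NSData *blob = [source valueForKey:@"blobData"];
            NSString *path = [NSTemporaryDirectory() stringByAppendingPathComponent:
                [[NSProcessInfo processInfo] globallyUniqueString]];
            if (blob && ![blob writeToFile:path options:NSDataWritingAtomic error:error]) {
                return NO;
            }
            [dest setValue:path forKey:@"dataPath"];

            [manager associateSourceInstance:source
                     withDestinationInstance:dest
                            forEntityMapping:mapping];
            return YES;
        }

        @end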

    Read the article

  • UIDatePicker Not storing time in core data

    - by Derek
    Hi, I am using a UIDatePicker in "Time" mode. I save the time in an NSDate object, but when I try to store the object in Core Data I get an error saying it's not an NSDate type...

        NSDate *tt = [pickerTime date];
        [myObject setValue:tt forKey:@"time"];

    Thanks,
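
    A sketch of the expected flow, assuming the time attribute is declared with type Date in the model editor; an attribute left as Undefined (or String) is a common cause of exactly this kind of type complaint. Here context stands for the asker's managed object context:

        NSDate *tt = [pickerTime date];        // pickerTime from the question
        [myObject setValue:tt forKey:@"time"]; // "time" must be a Date attribute

        NSError *error = nil;
        if (![context save:&error]) {
            NSLog(@"Save failed: %@", error);
        }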

    Read the article

  • Core Plot: x-axis labels not plotted when using scaleToFitPlots

    - by AlexR
    Problem: I can't get Core Plot (1.1) to plot automatic labels for my x-axis when using autoscaling ([plotSpace scaleToFitPlots:[graph allPlots]]). What I have tried: I changed the values for the offsets and paddings, but this did not change the result. However, when I turn autoscaling off (not using [plotSpace scaleToFitPlots:[graph allPlots]]) and set the y scale myself, the automatic labeling of the x-axis works. Question: Is there a bug in Core Plot, or what did I do wrong? I would appreciate any help! Thank you! This is how I have set up my chart:

        CPTBarPlot *barPlot = [CPTBarPlot tubularBarPlotWithColor:[CPTColor blueColor]
                                                   horizontalBars:NO];
        barPlot.baseValue = CPTDecimalFromInt(0);
        barPlot.barOffset = CPTDecimalFromFloat(0.0f); // CPTDecimalFromFloat(0.5f);
        barPlot.barWidth = CPTDecimalFromFloat(0.4f);
        barPlot.barCornerRadius = 4;
        barPlot.labelOffset = 5;
        barPlot.dataSource = self;
        barPlot.delegate = self;

        graph = [[CPTXYGraph alloc] initWithFrame:self.view.bounds];
        self.hostView.hostedGraph = graph;
        graph.paddingLeft = 40.0f;
        graph.paddingTop = 30.0f;
        graph.paddingRight = 30.0f;
        graph.paddingBottom = 50.0f;
        [graph addPlot:barPlot];
        graph.plotAreaFrame.masksToBorder = NO;
        graph.plotAreaFrame.cornerRadius = 0.0f;
        graph.plotAreaFrame.borderLineStyle = borderLineStyle;

        double xAxisStart = 0;
        CPTXYAxisSet *xyAxisSet = (CPTXYAxisSet *)graph.axisSet;
        CPTXYAxis *xAxis = xyAxisSet.xAxis;
        CPTMutableLineStyle *lineStyle = [xAxis.axisLineStyle mutableCopy];
        lineStyle.lineCap = kCGLineCapButt;
        xAxis.axisLineStyle = lineStyle;
        xAxis.majorTickLength = 10;
        xAxis.orthogonalCoordinateDecimal = CPTDecimalFromDouble(yAxisStart);
        xAxis.paddingBottom = 5;
        xyAxisSet.delegate = self;
        xAxis.delegate = self;
        xAxis.labelOffset = 0;
        xAxis.labelingPolicy = CPTAxisLabelingPolicyAutomatic;

        [plotSpace scaleToFitPlots:[graph allPlots]];
        CPTMutablePlotRange *yRange = plotSpace.yRange.mutableCopy;
        [yRange expandRangeByFactor:CPTDecimalFromDouble(1.3)];
        plotSpace.yRange = yRange;
        NSInteger xLength = CPTDecimalIntegerValue(plotSpace.xRange.length) + 1;
        plotSpace.xRange = [CPTPlotRange plotRangeWithLocation:CPTDecimalFromDouble(xAxisStart)
                                                        length:CPTDecimalFromDouble(xLength)];

    Read the article

  • Core Animation Unwanted Text Sharpening.

    - by dave-gennel
    Whenever I add a layer for Core Animation, either from the nib or programmatically, the NSTextFields (labels) in my interface get messed up. Here's a screenshot from Apple's BasicCocoaAnimations example. (Look at the text fields on the left; somehow they're drawn sharper than normal.) Note that if I add a layer in IB, then it also gets messed up in IB itself, before I even run my app.

    Read the article

  • Store NSArray In Core Data Sample Code?

    - by Stunner
    Hey guys, I have been searching for some sample code on how to store an NSArray in Core Data for a while now, but haven't had any luck. Would anyone mind pointing me to a tutorial or example, or better yet, writing a simple sample as an answer to this question? I have read this, but it doesn't show an example of how to implement a transformable attribute that is an NSArray. Thanks in advance!
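
    A minimal sketch of the transformable-attribute route, assuming an entity Item with an attribute tags whose type is set to Transformable in the model editor (with the default keyed-archiver transformer, the array's contents must conform to NSCoding; Item, tags and context are illustrative names):

        NSManagedObject *item =
            [NSEntityDescription insertNewObjectForEntityForName:@"Item"
                                          inManagedObjectContext:context];

        NSArray *tags = [NSArray arrayWithObjects:@"red", @"green", @"blue", nil];
        [item setValue:tags forKey:@"tags"]; // archived to NSData on save

        NSError *error = nil;
        if (![context save:&error]) {
            NSLog(@"Save failed: %@", error);
        }

        // Reading the attribute back yields an NSArray again:
        NSArray *restored = [item valueForKey:@"tags"];
        NSLog(@"Restored: %@", restored);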

    Read the article

  • Core data fetch only returns unique managed objects

    - by JK
    I execute a Core Data fetch which specifies a predicate as follows: NSPredicate *predicate = [NSPredicate predicateWithFormat:@"identifier IN %@", favoritesIDs]; When there are duplicate items in the favoritesIDs array, the fetch request returns only one managed object per identifier. How can I ensure that more than one instance is fetched? Thank you.
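
    A fetch returns each matching object at most once, no matter how many times its identifier appears in the IN list, so duplicates have to be reintroduced afterwards. One sketch (request and context refer to the asker's fetch setup):

        NSError *error = nil;
        NSArray *fetched = [context executeFetchRequest:request error:&error];

        // Index the unique results by identifier.
        NSMutableDictionary *byID = [NSMutableDictionary dictionary];
        for (NSManagedObject *obj in fetched) {
            [byID setObject:obj forKey:[obj valueForKey:@"identifier"]];
        }

        // Rebuild the array by walking favoritesIDs, duplicates included.
        NSMutableArray *favorites = [NSMutableArray array];
        for (id identifier in favoritesIDs) {
            NSManagedObject *obj = [byID objectForKey:identifier];
            if (obj) [favorites addObject:obj]; // once per occurrence
        }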

    Read the article

  • Clear Graphics Object to Black or other RGB

    - by Jimmy
    I am a rank beginner in C#. I am currently using this code: objGraphics.Clear(SystemColors.Control); What I want to do is clear this object to black (or some other RGB color), and I'm stumped as to how to replace SystemColors.Control with, preferably, an RGB color spec. I'd probably want to clear to black most of the time. Any help will be much appreciated!

    Read the article

  • Core Data: Mass updates possible?

    - by wgpubs
    Is it possible to do mass updates on a given entity in Core Data? Given a Person entity, for example, can I do something like this: Person.update(@"set displayOrder = displayOrder + 1 where displayOrder > 5") Or is my only option to fetch all the entities needed and then loop through and update them individually? Thanks
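
    Core Data offers no SQL-style mass update here; the standard pattern is the fetch-and-loop the asker suspects, scoped by a predicate so only the affected objects are materialized. A sketch (manual reference counting; context is the managed object context):

        NSFetchRequest *request = [[NSFetchRequest alloc] init];
        request.entity = [NSEntityDescription entityForName:@"Person"
                                     inManagedObjectContext:context];
        request.predicate = [NSPredicate predicateWithFormat:@"displayOrder > 5"];

        NSError *error = nil;
        NSArray *people = [context executeFetchRequest:request error:&error];
        for (NSManagedObject *person in people) {
            NSInteger order = [[person valueForKey:@"displayOrder"] integerValue];
            [person setValue:[NSNumber numberWithInteger:order + 1]
                      forKey:@"displayOrder"];
        }
        [context save:&error];
        [request release];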

    Read the article

  • Core Data - Best way to save a "number of items"

    - by Daniel Granger
    The user will have a static list of items to choose from. Using a picker view, they will choose one of the items and then select how many of them they want. What's the best way to save this in Core Data? A struct?

        struct order {
            NSInteger item;
            NSInteger numberOf;
        };

    Or some sort of relationship? Many thanks
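
    Core Data can't persist a C struct directly; attributes and relationships are its native tools. One plausible shape (all entity and variable names invented for illustration): an OrderItem entity with an integer numberOf attribute and a to-one relationship item to the entity backing the static list.

        NSManagedObject *orderItem =
            [NSEntityDescription insertNewObjectForEntityForName:@"OrderItem"
                                          inManagedObjectContext:context];
        [orderItem setValue:selectedItem forKey:@"item"]; // chosen in the picker
        [orderItem setValue:[NSNumber numberWithInteger:quantity]
                     forKey:@"numberOf"];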

    Read the article

  • How to convert from one co-ordinate system to another (graphics)

    - by Dororo
    I've been having issues with this for a little while now. I feel like I should know this, but I can't for the life of me remember. How can I map the screen pixels to their respective 'graphical' x,y positions? The coordinate systems have been configured to start at the bottom left (0,0) and increase to the top right. I want to be able to zoom, so I know that I need to factor the zoom distance into the answer. [ASCII sketch in the original: a quad of size Qx by Qy sits at distance Z from a screen of size Sx by Sy.] I want to know which pixels on my screen will have the quad on them. Obviously as Z decreases, the quad will occupy more of the screen, and as Z increases it will occupy less, but how exactly is this calculated? Thanks for any help.
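
    The usual answer is a pinhole (similar-triangles) projection. A sketch under assumed conventions (eye at the origin looking toward the quad; the projected origin shifted so (0,0) is the bottom-left pixel; focal derived from a chosen vertical field of view):

        #include <math.h>

        typedef struct { double x, y; } ScreenPoint;

        static ScreenPoint projectPoint(double wx, double wy, double Z,
                                        double fovY,          // radians
                                        double screenW, double screenH)
        {
            // Distance at which one world unit spans one pixel row:
            double focal = (screenH / 2.0) / tan(fovY / 2.0);

            ScreenPoint p;
            // Perspective divide, then shift to a bottom-left origin:
            p.x = (wx * focal / Z) + screenW / 2.0;
            p.y = (wy * focal / Z) + screenH / 2.0;
            return p;
        }

        // The quad's on-screen width is then Qx * focal / Z pixels: it
        // grows as Z shrinks and shrinks as Z grows, as described above.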

    Read the article

  • 32-core (each physical core) 2.2 GHz or 12-core (6 physical cores) 3.0 GHz?

    - by Tejaswi Rana
    I am working on a multithreaded application (a Forex trading app built in C#) and had the client upgrade from a 12-core 3.0 GHz machine (Intel) to a 32-core 2.2 GHz machine (AMD). The PassMark benchmark results were significantly higher when using multiple cores for integer, floating-point and other calculations, while for a single-core calculation it was a bit slower than the pack (others compared with a config similar to the 12-core one). It also comes with 64 GB of RAM (4 times as much as the other one) and a much faster SSD. So after configuring and running the application on that machine, not only did it not perform as well, it was significantly slower. We're talking about 30 seconds to 1 minute slower on an app that usually completes processing within 5-20 seconds. The application uses MaxDegreeOfParallelism (TPL), which I've tried setting to the number of cores and also to half of that. I've also tried running single-threaded and without setting any limits on parallel threading. While the hardware may have some issues, I am wondering if the CPU clock speed is the issue. I can overclock to 3.0 GHz. But is that even a good idea?

    Server info:
    - AMD: http://www.passmark.com/forum/showthread.php?4013-AMD-Dual-6272-performance-is-60-lower-than-benchmarks (seems that benchmark was wrong to start with - officially)
    - Intel: i7 3930k
    - OS (same on both): Windows 7 Professional 64-bit

    Read the article

  • How do I take some RAM and use it towards Dedicated video memory for my Nvidia graphics card?

    - by Noah Rainey
    I have an Nvidia GeForce 6150SE nForce 430 graphics card (so it's quite old); it only gets 64 MB of dedicated memory by default. I went into the BIOS to see if I could increase it, but it wouldn't let me. However, in the Nvidia control panel I see I have up to 1071 MB of total available graphics memory. I'm not sure what that means, and I'm not sure how I can harness this memory and use some RAM for my graphics card. Can someone explain if this is possible and, if so, how?

    Read the article

  • Marrying Core Animation with OpenGL ES

    - by Ole Begemann
    Edit: I suppose instead of the long explanation below I might also ask: sending -setNeedsDisplay to an instance of CAEAGLLayer does not cause the layer to redraw (i.e., -drawInContext: is not called). Instead, I get this console message:

        <GLLayer: 0x4d500b0>: calling -display has no effect.

    Is there a way around this issue? Can I invoke -drawInContext: when -setNeedsDisplay is called? Long explanation below:

    I have an OpenGL scene that I would like to animate using Core Animation animations. Following the standard approach to animate custom properties in a CALayer, I created a subclass of CAEAGLLayer and defined a property sceneCenterPoint in it whose value should be animated. My layer also holds a reference to the OpenGL renderer:

        #import <UIKit/UIKit.h>
        #import <QuartzCore/QuartzCore.h>
        #import "ES2Renderer.h"

        @interface GLLayer : CAEAGLLayer {
            ES2Renderer *renderer;
        }

        @property (nonatomic, retain) ES2Renderer *renderer;
        @property (nonatomic, assign) CGPoint sceneCenterPoint;

    I then declare the property @dynamic to let CA create the accessors, override +needsDisplayForKey: and implement -drawInContext: to pass the current value of the sceneCenterPoint property to the renderer and ask it to render the scene:

        #import "GLLayer.h"

        @implementation GLLayer

        @synthesize renderer;
        @dynamic sceneCenterPoint;

        + (BOOL)needsDisplayForKey:(NSString *)key
        {
            if ([key isEqualToString:@"sceneCenterPoint"]) {
                return YES;
            } else {
                return [super needsDisplayForKey:key];
            }
        }

        - (void)drawInContext:(CGContextRef)ctx
        {
            self.renderer.centerPoint = self.sceneCenterPoint;
            [self.renderer render];
        }

        ...

    (If you have access to the WWDC 2009 session videos, you can review this technique in session 303, "Animated Drawing".)

    Now, when I create an explicit animation for the layer on the key path @"sceneCenterPoint", Core Animation should calculate the interpolated values for the custom property and call -drawInContext: for each step of the animation:

        - (IBAction)animateButtonTapped:(id)sender
        {
            CABasicAnimation *animation =
                [CABasicAnimation animationWithKeyPath:@"sceneCenterPoint"];
            animation.duration = 1.0;
            animation.fromValue = [NSValue valueWithCGPoint:CGPointZero];
            animation.toValue = [NSValue valueWithCGPoint:CGPointMake(1.0f, 1.0f)];
            [self.glView.layer addAnimation:animation forKey:nil];
        }

    At least that is what would happen for a normal CALayer subclass. When I subclass CAEAGLLayer, I get this output on the console for each step of the animation:

        2010-12-21 13:59:22.180 CoreAnimationOpenGL[7496:207] <GLLayer: 0x4e0be20>: calling -display has no effect.
        2010-12-21 13:59:22.198 CoreAnimationOpenGL[7496:207] <GLLayer: 0x4e0be20>: calling -display has no effect.
        2010-12-21 13:59:22.216 CoreAnimationOpenGL[7496:207] <GLLayer: 0x4e0be20>: calling -display has no effect.
        2010-12-21 13:59:22.233 CoreAnimationOpenGL[7496:207] <GLLayer: 0x4e0be20>: calling -display has no effect.
        ...

    So it seems that, possibly for performance reasons, -drawInContext: is not getting called for OpenGL layers, because these layers do not use the standard -display method to draw themselves. Can anybody confirm that? Is there a way around it? Or can I not use the technique I laid out above? That would mean I would have to implement the animations manually in the OpenGL renderer (which is possible, but not as elegant IMO).
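
    One workaround worth noting (an editor's sketch, not from the question): even when -drawInContext: is never invoked, the layer's presentationLayer still carries the interpolated value of the animating property, so the GL renderer can be driven from a CADisplayLink that samples it each frame:

        - (void)startRenderLoop
        {
            CADisplayLink *link =
                [CADisplayLink displayLinkWithTarget:self
                                            selector:@selector(render:)];
            [link addToRunLoop:[NSRunLoop mainRunLoop]
                       forMode:NSDefaultRunLoopMode];
        }

        - (void)render:(CADisplayLink *)link
        {
            GLLayer *glLayer = (GLLayer *)self.glView.layer;
            // The presentation layer holds the in-flight animated value.
            GLLayer *presentation = (GLLayer *)[glLayer presentationLayer];
            glLayer.renderer.centerPoint = presentation.sceneCenterPoint;
            [glLayer.renderer render];
        }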

    Read the article

  • Core Data Model Design Question - Changing "Live" Objects also Changes Saved Objects

    - by mwt
    I'm working on my first Core Data project (on iPhone) and am really liking it. Core Data is cool stuff. I am, however, running into a design difficulty that I'm not sure how to solve, although I imagine it's a fairly common situation. It concerns the data model.

    For the sake of clarity, I'll use an imaginary football game app as an example to illustrate my question. Say that there are NSMO's called Downs and Plays. Plays function like templates to be used by Downs. The user creates Plays (for example, Bootleg, Button Hook, Slant Route, Sweep, etc.) and fills in the various properties. Plays have a to-many relationship with Downs. For each Down, the user decides which Play to use. When the Down is executed, it uses the Play as its template. After each down is run, it is stored in history. The program remembers all the Downs ever played. So far, so good. This is all working fine.

    The question I have concerns what happens when the user wants to change the details of a Play. Let's say it originally involved a pass to the left, but the user now wants it to be a pass to the right. Making that change, however, not only affects all the future executions of that Play, but also changes the details of the Plays stored in history. The record of Downs gets "polluted," in effect, because the Play template has been changed.

    I have been rolling around several possible fixes to this situation, but I imagine the geniuses of SO know much more about how to handle this than I do. Still, the potential fixes I've come up with are:

    1) "Versioning" of Plays. Each change to a Play template actually creates a new, separate Play object with the same name (as far as the user can tell). Underneath the hood, however, it is actually a different Play. This would work, AFAICT, but seems like it could potentially lead to a wild proliferation of Play objects, esp. if the user keeps switching back and forth between several versions of the same Play (creating object after object each time the user switches). Yes, the app could check for pre-existing, identical Plays, but... it just seems like a mess.

    2) Have Downs, upon saving, record the details of the Play they used, but not as a Play object. This just seems ridiculous, given that the Play object is there to hold just those details.

    3) Recognize that Play objects are actually fulfilling 2 functions: one to be a template for a Down, and the other to record what template was used. These 2 functions have a different relationship with a Down. The first (template) has a to-many relationship. But the second (record) has a one-to-one relationship. This would mean creating a second object, something like "Play-Template", which would retain the to-many relationship with Downs. Play objects would get reconfigured to have a one-to-one relationship with Downs. A Down would use a Play-Template object for execution, but use the new kind of Play object to store what template was used. It is this change from a to-many relationship to a one-to-one relationship that represents the crux of the problem.

    Even writing this question out has helped me get clearer. I think something like solution 3 is the answer. However, if anyone has a better idea, or even just a confirmation that I'm on the right track, that would be helpful. (Remember, I'm not really making a football game, it's just faster/easier to use a metaphor everyone understands.) Thanks.
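
    For what it's worth, a sketch of how option 3 might look at execution time (all entity, attribute and variable names invented for illustration): the Down keeps a to-one relationship to a frozen Play record whose attributes are copied from the template, so later template edits can't rewrite history.

        // Executing a Down: snapshot the template into a per-Down record.
        NSManagedObject *record =
            [NSEntityDescription insertNewObjectForEntityForName:@"Play"
                                          inManagedObjectContext:context];
        [record setValue:[template valueForKey:@"name"]      forKey:@"name"];
        [record setValue:[template valueForKey:@"direction"] forKey:@"direction"];

        [down setValue:record   forKey:@"play"];     // one-to-one frozen record
        [down setValue:template forKey:@"template"]; // to-many template link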

    Read the article

  • Core Plot never stops asking for data, hangs device

    - by Ben Collins
    I'm trying to set up a Core Plot graph that looks somewhat like the AAPLot example on the core-plot wiki. I have set up my plot like this:

        - (void)initGraph:(CPXYGraph *)graph forDays:(NSUInteger)numDays
        {
            self.cplhView.hostedLayer = graph;
            graph.paddingLeft = 30.0;
            graph.paddingTop = 20.0;
            graph.paddingRight = 30.0;
            graph.paddingBottom = 20.0;

            CPXYPlotSpace *plotSpace = (CPXYPlotSpace *)graph.defaultPlotSpace;
            plotSpace.xRange = [CPPlotRange plotRangeWithLocation:CPDecimalFromFloat(0)
                                                           length:CPDecimalFromFloat(numDays)];
            plotSpace.yRange = [CPPlotRange plotRangeWithLocation:CPDecimalFromFloat(0)
                                                           length:CPDecimalFromFloat(1)];

            CPLineStyle *lineStyle = [CPLineStyle lineStyle];
            lineStyle.lineColor = [CPColor blackColor];
            lineStyle.lineWidth = 2.0f;

            // Axes
            NSLog(@"Setting up axes");
            CPXYAxisSet *xyAxisSet = (id)graph.axisSet;
            CPXYAxis *xAxis = xyAxisSet.xAxis;
            // xAxis.majorIntervalLength = CPDecimalFromFloat(7);
            // xAxis.minorTicksPerInterval = 7;
            CPXYAxis *yAxis = xyAxisSet.yAxis;
            // yAxis.majorIntervalLength = CPDecimalFromFloat(0.1);

            // Line plot with gradient fill
            NSLog(@"Setting up line plot");
            CPScatterPlot *dataSourceLinePlot =
                [[[CPScatterPlot alloc] initWithFrame:graph.bounds] autorelease];
            dataSourceLinePlot.identifier = @"Data Source Plot";
            dataSourceLinePlot.dataLineStyle = nil;
            dataSourceLinePlot.dataSource = self;
            [graph addPlot:dataSourceLinePlot];

            CPColor *areaColor = [CPColor colorWithComponentRed:1.0 green:1.0 blue:1.0 alpha:0.6];
            CPGradient *areaGradient = [CPGradient gradientWithBeginningColor:areaColor
                                                                  endingColor:[CPColor clearColor]];
            areaGradient.angle = -90.0f;
            CPFill *areaGradientFill = [CPFill fillWithGradient:areaGradient];
            dataSourceLinePlot.areaFill = areaGradientFill;
            dataSourceLinePlot.areaBaseValue = CPDecimalFromString(@"320.0");

            // OHLC plot
            NSLog(@"OHLC Plot");
            CPLineStyle *whiteLineStyle = [CPLineStyle lineStyle];
            whiteLineStyle.lineColor = [CPColor whiteColor];
            whiteLineStyle.lineWidth = 1.0f;
            CPTradingRangePlot *ohlcPlot =
                [[[CPTradingRangePlot alloc] initWithFrame:graph.bounds] autorelease];
            ohlcPlot.identifier = @"OHLC";
            ohlcPlot.lineStyle = whiteLineStyle;
            ohlcPlot.stickLength = 2.0f;
            ohlcPlot.plotStyle = CPTradingRangePlotStyleOHLC;
            ohlcPlot.dataSource = self;
            NSLog(@"Data source set, adding plot");
            [graph addPlot:ohlcPlot];
        }

    And my data source methods like this:

        #pragma mark -
        #pragma mark CPPlotDataSource Methods

        - (NSUInteger)numberOfRecordsForPlot:(CPPlot *)plot
        {
            NSUInteger maxCount = 0;
            NSLog(@"Getting number of records.");
            if (self.data1 && [self.data1 count] > maxCount) {
                maxCount = [self.data1 count];
            }
            if (self.data2 && [self.data2 count] > maxCount) {
                maxCount = [self.data2 count];
            }
            if (self.data3 && [self.data3 count] > maxCount) {
                maxCount = [self.data3 count];
            }
            NSLog(@"%u records", maxCount);
            return maxCount;
        }

        - (NSNumber *)numberForPlot:(CPPlot *)plot
                              field:(NSUInteger)fieldEnum
                        recordIndex:(NSUInteger)index
        {
            NSLog(@"Getting record @ idx %u", index);
            return [NSNumber numberWithInt:index];
        }

    All the code above is in the view controller for the view hosting the plot, and when initGraph is called, numDays is 30. I realize, of course, that this plot, if it even worked, would look nothing like the AAPLot example. The problem I'm having is that the view is never shown. It has finished loading, because viewDidLoad is the method that calls initGraph above, and the NSLog statements indicate that initGraph finishes. What's strange is that I return a value of 54 from numberOfRecordsForPlot, but the plot asks for more than 54 data points; in fact, it never stops asking. The NSLog statement in numberForPlot:field:recordIndex: prints forever, going from 0 to 54 and then looping back around and continuing. What's going on? Why won't the plot stop asking for data and draw itself?
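
    For context (an editor's sketch, not from the question): a Core Plot data source is asked once per field of each record, and each field should get a meaningful value; returning the record index for every field of every plot, as above, gives both plots degenerate data. A sketch of the expected shape for the scatter plot, assuming data1 holds NSNumber y-values:

        - (NSNumber *)numberForPlot:(CPPlot *)plot
                              field:(NSUInteger)fieldEnum
                        recordIndex:(NSUInteger)index
        {
            if ([(NSString *)plot.identifier isEqualToString:@"Data Source Plot"]) {
                if (fieldEnum == CPScatterPlotFieldX) {
                    return [NSNumber numberWithUnsignedInteger:index]; // x = day
                }
                return [self.data1 objectAtIndex:index];               // y = value
            }
            // ... a similar switch for the OHLC plot's open/high/low/close fields
            return [NSNumber numberWithUnsignedInteger:index];
        }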

    Read the article
