Search Results

Search found 6486 results on 260 pages for 'cocoa touch'.

Page 31/260

  • Installing the Wacom Bamboo Pen & Touch

    - by federico
    I would like to use the Wacom Bamboo "Pen & Touch" with Ubuntu Maverick, but I have no idea how to do this. In addition, whenever I see "change or add kernel" I get really scared. :-) I saw answers for the Wacom Bamboo "Pen", but I don't know whether the installation instructions are the same or whether some different additions need to be made to my system. I would really appreciate your help. Thanks in advance.

    Read the article

  • Dell Synaptics touch pad's middle mouse button gets mapped as normal click

    - by Henrik
    How do I make the touch pad's middle button work? Running

        xinput --test 11

    yields

        button press 1
        button press 1

    for presses of both the left and the middle button. I have tried xinput set-button-map 11 1 4 2 and so on, but since --test shows that button 1 is being reported either way, the issue is probably at a lower level than X11's perception of which mouse buttons I'm pressing (likewise, assigning button-map 11 1 2 3 and clicking the right button in Firefox doesn't trigger the middle-click on a link).

    Read the article

  • Rotation angle based on touch move

    - by Siddharth
    I want to rotate my stick based on the movement of a touch on the screen. With my calculation I am not able to find the correct angle in degrees, so please provide guidance. My code snippet is below:

        if (pSceneTouchEvent.isActionMove()) {
            pValueX = pSceneTouchEvent.getX();
            pValueY = CAMERA_HEIGHT - pSceneTouchEvent.getY();
            rotationAngle = (float) Math.atan2(pValueX, pValueY);
            stick.setRotation((float) MathUtils.radToDeg(rotationAngle));
        }
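
    For comparison, a minimal sketch of the usual atan2 calculation in a Cocoa Touch handler, assuming the stick rotates around its own center; stickView is a hypothetical UIView standing in for the question's stick:

        // Sketch only: rotate a view to track the current touch.
        // atan2 takes (dy, dx); the pivot is the view's own center.
        - (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
            CGPoint p = [[touches anyObject] locationInView:self.view];
            CGPoint c = stickView.center;                // hypothetical UIView ivar
            CGFloat angle = atan2(p.y - c.y, p.x - c.x); // radians, 0 = pointing right
            stickView.transform = CGAffineTransformMakeRotation(angle);
        }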

    Read the article

  • Ubuntu open to greater touch

    The Register: "You'll want to touch Ubuntu in personal places - like in your kitchen or in your car. At least that's what Canonical hopes, as it works on architectural changes and business deals to put the Linux distro on more embedded systems."

    Read the article

  • Benefits and Advantages of Touch Screen Tills

    Touch screen technology is mostly used in mobile phones, where it helps work get done fast. With the advance of the digital age, the screen is now used in many different electronic systems, in ... [Author: Alan Wisdom - Computers and Internet - April 05, 2010]

    Read the article

  • Trouble moving a UIView.

    - by Joshua
    I have been trying to move a UIView by following a user's touch. I have almost got it to work, except for one thing: the UIView keeps flicking between two places. Here's the code I have been using:

        - (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
            NSLog(@"touchDown");
            UITouch *touch = [touches anyObject];
            firstTouch = [touch locationInView:self.view];
            lastTouch = [touch locationInView:self.view];
            [self.view setNeedsDisplay];
        }

        - (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
            InSightViewController *contentView = [[InSightViewController alloc]
                initWithNibName:@"SubView" bundle:[NSBundle mainBundle]];
            [contentView loadView];
            UITouch *touch = [touches anyObject];
            currentTouch = [touch locationInView:self.view];
            if (CGRectContainsPoint(contentView.view.bounds, firstTouch)) {
                NSLog(@"touch in subView/contentView");
                sub.frame = CGRectMake(currentTouch.x - 50.0, currentTouch.y, 130.0, 21.0);
            }
            NSLog(@"touch moved");
            lastTouch = currentTouch;
            [self.view setNeedsDisplay];
        }

    And here's what's been happening: http://cl.ly/Sjx
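
    A minimal sketch of the common drag pattern that avoids this kind of jumping: record the offset between the touch and the view's origin once in touchesBegan, then reapply it on every move. The dragOffset ivar is hypothetical; sub is the question's own view:

        // Sketch: drag `sub` smoothly by preserving the initial touch offset.
        - (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
            CGPoint p = [[touches anyObject] locationInView:self.view];
            dragOffset = CGPointMake(p.x - sub.frame.origin.x,  // hypothetical CGPoint ivar
                                     p.y - sub.frame.origin.y);
        }

        - (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
            CGPoint p = [[touches anyObject] locationInView:self.view];
            CGRect f = sub.frame;
            f.origin = CGPointMake(p.x - dragOffset.x, p.y - dragOffset.y);
            sub.frame = f;  // one consistent position per move, so no flicker
        }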

    Read the article

  • How can I differentiate two different touches on a layer?

    - by srikanth rongali
    I am writing an app in cocos2d. I have a sprite and a text in my scene, with two separate classes for the sprite and the text, and I added both of them to another class. In the sprite class I have written:

        - (void)ccTouchesEnded:(NSSet *)touches withEvent:(UIEvent *)event

    And in the text class I have written:

        -(void) registerWithTouchDispatcher {
            [[CCTouchDispatcher sharedDispatcher] addTargetedDelegate:self priority:0 swallowsTouches:YES];
        }

        -(BOOL) ccTouchBegan:(UITouch *)touch withEvent:(UIEvent *)event {
            return YES;
        }

        -(void) ccTouchEnded:(UITouch *)touch withEvent:(UIEvent *)event {
            NSLog(@"Recognized tOuches in Instructions");
            CGSize windowSize = [[CCDirector sharedDirector] winSize];
            CCNode *node = [self getChildByTag:kTagNode];
            [node setPosition: ccp(text1.contentSize.width/2, text1.contentSize.height/2 - windowSize.height)];
        }

        -(void) ccTouchMoved:(UITouch *)touch withEvent:(UIEvent *)event {
            CGPoint touchLocation = [touch locationInView: [touch view]];
            CGPoint prevLocation = [touch previousLocationInView: [touch view]];
            touchLocation = [[CCDirector sharedDirector] convertToGL: touchLocation];
            prevLocation = [[CCDirector sharedDirector] convertToGL: prevLocation];
            CGPoint diff = ccpSub(touchLocation, prevLocation);
            CCNode *node = [self getChildByTag:kTagNode];
            CGPoint currentPos = [node position];
            [node setPosition: ccpAdd(currentPos, diff)];
        }

    But only touches on the text are recognized; a touch on the sprite is not. How can I differentiate the two touches?
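
    One hedged sketch of how each targeted delegate can claim only its own touches: return YES from ccTouchBegan: only when the touch actually lands inside the node (this assumes the node's boundingBox is meaningful, which holds for sprites):

        // Sketch: claim a touch only if it falls inside this node's bounding box.
        -(BOOL) ccTouchBegan:(UITouch *)touch withEvent:(UIEvent *)event {
            CGPoint p = [[CCDirector sharedDirector] convertToGL:
                            [touch locationInView:[touch view]]];
            if (CGRectContainsPoint([self boundingBox], p)) {
                return YES;  // ours; with swallowsTouches:YES no other delegate sees it
            }
            return NO;       // not ours; the dispatcher offers it to the next delegate
        }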

    Read the article

  • How to use UILongPressGestureRecognizer with sprite drag & wait?

    - by ganesh
    Maybe this has been asked before, but I couldn't find any good answer. How can this be implemented with UILongPressGestureRecognizer? A user drags a sprite from location X to location Y, waits at Y (the touch has not ended yet) for 1 or 2 seconds, and then releases the touch, i.e. the touch ends. In this case, shouldn't the following states be triggered, in this order, for UILongPressGestureRecognizer: UIGestureRecognizerStateBegan, UIGestureRecognizerStateChanged, UIGestureRecognizerStateEnded? My problem is that if a UIPanGestureRecognizer is also installed to handle drags, the long-press gesture is never triggered, even after long waits. Any thoughts?
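
    A minimal sketch of one standard way to let a pan and a long press coexist, using the real UIGestureRecognizerDelegate callback; spriteView and the handler selectors are hypothetical:

        // Sketch: attach both recognizers and let them recognize together.
        - (void)setupGestures {                       // hypothetical setup method
            UIPanGestureRecognizer *pan = [[UIPanGestureRecognizer alloc]
                initWithTarget:self action:@selector(handlePan:)];
            UILongPressGestureRecognizer *press = [[UILongPressGestureRecognizer alloc]
                initWithTarget:self action:@selector(handlePress:)];
            press.minimumPressDuration = 1.0;         // seconds before StateBegan fires
            pan.delegate = self;                      // self adopts UIGestureRecognizerDelegate
            press.delegate = self;
            [spriteView addGestureRecognizer:pan];    // hypothetical view hosting the sprite
            [spriteView addGestureRecognizer:press];
        }

        // Without this, whichever recognizer begins first excludes the other.
        - (BOOL)gestureRecognizer:(UIGestureRecognizer *)g
                shouldRecognizeSimultaneouslyWithGestureRecognizer:(UIGestureRecognizer *)other {
            return YES;
        }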

    Read the article

  • Is there a way to detect non-movement (touch events)?

    - by hyn
    Is there a way to detect a finger's non-movement by using a combination of UITouch events? The event methods touchesEnded and touchesCancelled are only fired when the event is cancelled or the finger is lifted. I would like to know when a touch has stopped moving, even while it is still touching the screen.
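
    There is no dedicated callback for this, so a common workaround is a delayed check that touchesMoved keeps pushing back; once no move arrives within the chosen interval, the finger is treated as stationary. The selector name and the 0.5 s threshold below are hypothetical:

        // Sketch: treat the touch as "stopped" if no touchesMoved arrives for 0.5 s.
        - (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
            [NSObject cancelPreviousPerformRequestsWithTarget:self
                                                     selector:@selector(touchStoppedMoving)
                                                       object:nil];
            [self performSelector:@selector(touchStoppedMoving)
                       withObject:nil
                       afterDelay:0.5];
        }

        - (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event {
            [NSObject cancelPreviousPerformRequestsWithTarget:self
                                                     selector:@selector(touchStoppedMoving)
                                                       object:nil];
        }

        - (void)touchStoppedMoving {
            NSLog(@"finger is still down but no longer moving");
        }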

    Read the article

  • How can we detect a touch on a sprite?

    - by srikanth rongali
    I have two sprites in my app. Both should have touch enabled, and their touches should be independent of one another. If I touch the screen (not on a sprite), that should count as a different touch again. In other words, all three areas - sprite1, sprite2, and the remaining screen - should receive independent touches, but my program treats all the touches as the same. How can I make them behave the way I need? Thank you.
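
    A minimal sketch of one way to separate the three cases from a single layer-level handler, assuming the layer has touch enabled and sprite1/sprite2 are the question's sprites:

        // Sketch: route one incoming touch to sprite1, sprite2, or the background.
        // Assumes self.isTouchEnabled = YES on this CCLayer.
        - (void)ccTouchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
            UITouch *touch = [touches anyObject];
            CGPoint p = [[CCDirector sharedDirector] convertToGL:
                            [touch locationInView:[touch view]]];
            if (CGRectContainsPoint([sprite1 boundingBox], p)) {
                // handle sprite1's touch
            } else if (CGRectContainsPoint([sprite2 boundingBox], p)) {
                // handle sprite2's touch
            } else {
                // touch on the remaining screen
            }
        }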

    Read the article

  • How do I properly embed third-party frameworks in my Cocoa application?

    - by Jordan Kay
    I am writing a Cocoa application that makes use of the ParseKit framework (http://www.parsekit.com/). I've included the framework in the proper folder, added a Copy Files build phase, and added the framework to that phase. I can build and launch the application on my Mac. However, when I try to run it on another Mac, it crashes. The Console shows the following error message:

        dyld: Library not loaded: /Users/Jordan/Files/ParseKit/build/Debug/ParseKit.framework/Versions/A/ParseKit

    It looks like when the app launches, it is looking for the framework on my local drive. However, the framework is in the Copy Files build phase, so it has been copied into the application's Contents/Frameworks folder. If the application were looking in this folder, it would be able to load the framework just fine, but for some reason it's looking for it on my local drive on the original Mac (which obviously doesn't exist on the other Mac). What am I doing wrong?

    Read the article

  • I don't understand how to use delegates in Cocoa but I know what they are.

    - by lampShade
    Like many people I'm interested in Objective-C and Cocoa programming. I know conceptually what a delegate is, but I don't understand how or when to use one. Here is some example code:

        #import "AppControler.h"

        @implementation AppControler

        - (id)init
        {
            [super init];
            NSLog(@"init");
            speechSynth = [[NSSpeechSynthesizer alloc] initWithVoice:nil];
            // [speechSynth setDelegate:self];
            voiceList = [[NSSpeechSynthesizer availableVoices] retain];
            return self;
        }

    I'm setting the AppControler to be the delegate of the speech synthesizer, which means that the speech synthesizer is telling the AppControler what to do. But I don't understand this line:

        [speechSynth setDelegate:self];
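
    For illustration, a minimal sketch of what being the delegate buys you here: speechSynthesizer:didFinishSpeaking: is a real NSSpeechSynthesizer delegate callback, and the synthesizer invokes it on whatever object was passed to setDelegate:.

        // Sketch: because AppControler is the synthesizer's delegate, the
        // synthesizer calls back into it when it finishes speaking.
        - (void)speechSynthesizer:(NSSpeechSynthesizer *)sender
                 didFinishSpeaking:(BOOL)finishedSpeaking {
            NSLog(@"Done speaking (completed normally: %d)", finishedSpeaking);
            // e.g. re-enable a Speak button here
        }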

    Read the article

  • Any good way to set the exit status of a Cocoa application?

    - by buglesareking
    I have a Cocoa app which interacts with a server and displays a GUI. If there is a fatal error, I display an alert and exit. I'd like to set the exit status to a non-zero value to reflect that an error occurred, for ease of interaction with some other UNIX-based tools. Unfortunately I've been unable to find a good way to do so - NSApplication doesn't seem to have any way to set an exit status. At the moment, I've subclassed NSApplication and added an exitStatus ivar (which I set in my app delegate when necessary), then overridden -terminate: so that it calls exit(exitStatus). This works fine, but it seems a bit grungy to me, not to mention that I may be missing something important that the standard -terminate: is doing behind the scenes. I can't call [super terminate:sender] in my subclassed method, because that exit()s without giving me a chance to set the status. Am I missing something obvious?
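
    For concreteness, a minimal sketch of the subclassing workaround the question describes; the class name is hypothetical, and the trade-off noted in the question (skipping NSApplication's own cleanup) still applies:

        #import <Cocoa/Cocoa.h>

        // Sketch of the workaround described above: an NSApplication subclass
        // that remembers a status code and hands it to exit() in terminate:.
        @interface StatusApplication : NSApplication {   // hypothetical class name
            int exitStatus;
        }
        - (void)setExitStatus:(int)status;
        @end

        @implementation StatusApplication
        - (void)setExitStatus:(int)status { exitStatus = status; }

        - (void)terminate:(id)sender {
            // Skips whatever cleanup NSApplication's own terminate: performs.
            exit(exitStatus);
        }
        @end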

    Read the article

  • Touch Screen Running Windows CE

    - by Jed
    I'm starting my first project that runs on a 7-inch touch screen running Windows CE 6.0 (and .NET CF 3.5). The touch screen doesn't respond too well when I use my finger; the only way for me to navigate around is by using a stylus (or similar). Since I've never worked with Windows CE or a resistive touch screen, I'm not sure whether I should expect to be able to use my finger, whether the stylus method is essentially the only way to navigate effectively, or whether, maybe, I have a touch screen that simply isn't that good. If you have experience with WinCE running on a touch screen, do you find that a stylus is the only way to go?

    Read the article

  • Where should document-related actions for a Cocoa app be implemented?

    - by Adam Preble
    I'm writing a document-based Cocoa app that's basically a graphical editing program. I want the user to be able to show/hide non-modal windows (such as an inspector window). Since these windows would be shown/hidden from menu items, where is the "best" place to implement the actions, such as - (IBAction)toggleInspector:(id)sender? I've seen that in the Sketch example code these are implemented in the app delegate, and the window controller instances are kept there as well, but that feels like more of a convenient place to put them than the most "graceful" place. Additionally, since this inspector would only be relevant when a document is open, it feels like it should be associated more with the document's main NSWindowController than with the app.
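
    A hedged sketch of the responder-chain alternative: put the action on the document's window controller and wire the menu item to First Responder, so it only resolves while a document window is key. The inspectorController ivar and class name are hypothetical:

        // Sketch: the action lives on the document's NSWindowController subclass.
        // A menu item targeting First Responder reaches it through the responder
        // chain whenever one of this document's windows is key.
        @implementation MyDocumentWindowController           // hypothetical subclass

        - (IBAction)toggleInspector:(id)sender {
            if ([[inspectorController window] isVisible]) {  // hypothetical ivar
                [[inspectorController window] orderOut:sender];
            } else {
                [inspectorController showWindow:sender];
            }
        }

        @end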

    Read the article

  • Is there any method in Cocoa that helps iterating through all NSColor values (R,G,B only, alpha not taken into account)?

    - by krasnyk
    You may ask why I need it: I have to detect all white objects in a B/W image. I'm coloring each object and taking its rect. In order to fill each object with a different color, I would be very happy to have a function that, for a given color, gives me the next one. In Qt there is a nice one called nextColor (or similar) that returns an integer representing the next color (which can easily be translated to RGB). Does Cocoa have something like that?
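
    Nothing built in does exactly this, as far as I know, but a minimal sketch of the Qt-style trick is short: keep a 24-bit counter and unpack it into an NSColor. The helper name is hypothetical:

        // Sketch: treat a 24-bit integer as 0xRRGGBB and step it for distinct colors.
        static NSColor *nextLabelColor(void) {       // hypothetical helper
            static uint32_t counter = 0;
            counter = (counter + 1) & 0xFFFFFF;      // wraps after 16,777,215 colors
            CGFloat r = ((counter >> 16) & 0xFF) / 255.0;
            CGFloat g = ((counter >> 8)  & 0xFF) / 255.0;
            CGFloat b = ( counter        & 0xFF) / 255.0;
            return [NSColor colorWithCalibratedRed:r green:g blue:b alpha:1.0];
        }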

    Read the article

  • Is it possible to run a program compiled with Xcode on Mac OS X in FreeBSD? (Objective-C/Cocoa)

    - by Eonil
    Hi. I plan to build a website that runs CGI made with Cocoa. My goal is to develop on Mac OS X and run on FreeBSD. Is this possible? As far as I know, there is a free implementation of some NeXTSTEP classes, GNUstep. The website is built almost entirely with strings, and from reading the GNUstep documents, its classes are enough. The DB connection will be made with C interfaces. The biggest problem concerning me is linking and binary compatibility. I'm currently configuring FreeBSD on VirtualBox, but I want to hear from experts about whether this is feasible at all. This is not a production server, just a trial. Please feel free to say anything.

    Read the article

  • How do you keep Cocoa controllers from getting too big?

    - by zoul
    Hello! Do you have tricks or techniques for breaking Cocoa controller classes into smaller chunks? I find that whatever I do, the controllers end up being among the more complicated classes in my design. The basic stuff is simple, but once I have several popovers or action sheets running, things get uncomfortably complex. It's not that bad, but I would still like to refactor the code into several standalone chunks. I thought about categories, but the code is not that independent (a lot of the time it needs to tap into viewWillAppear, for example) and I find that I spend a long time fighting the compiler. I also thought about adding functionality in layers using inheritance, but that feels like a hack.

    Read the article

  • In Cocoa (or maybe GUI development in general) how do you specify an arbitrary number of things tiled in a grid?

    - by RankWeis
    I'm new to creating GUIs; everything I've done up to this point has used the command line. As an experiment, I'm trying to create a port of Minesweeper for the Macintosh. I've got the CLI version working, but I'm running into walls everywhere with the GUI. The first thing it seems I have to do is tile n x m 'boxes' for the grid, and I'm not sure how to do that. The information is ready to be handed over, but I don't know where to do it, or how. Also, if anyone has any recommendations for sites/Cocoa development books, feel free to drop them in here... Thanks!
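
    A minimal sketch of one straightforward way to tile n x m subviews, assuming manual reference counting; boardView, the grid size, and the cell size are all hypothetical:

        // Sketch: lay out rows x cols cell views inside boardView (an NSView).
        const NSInteger rows = 8, cols = 8;          // hypothetical grid size
        const CGFloat cellSize = 24.0;               // hypothetical cell size in points
        for (NSInteger row = 0; row < rows; row++) {
            for (NSInteger col = 0; col < cols; col++) {
                NSRect frame = NSMakeRect(col * cellSize, row * cellSize, cellSize, cellSize);
                NSButton *box = [[[NSButton alloc] initWithFrame:frame] autorelease];
                [box setTag:row * cols + col];       // so a click maps back to the grid
                [boardView addSubview:box];
            }
        }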

    Read the article

  • Detect multitouch (two-finger touch) on a sprite to apply pinch-zoom behaviour

    - by Tahreem
    I am using AndEngine and want to move, zoom and rotate multiple sprites individually on a scene. I have achieved "move", but for pinch zoom I am unable to get the event for a two-finger touch. Below is the code:

        public class Main extends SimpleBaseGameActivity {
            private Camera camera;
            private BitmapTextureAtlas mBitmapTextureAtlas;
            private ITextureRegion mFaceTextureRegion;
            private ITextureRegion mFaceTextureRegion2;
            Sprite face2;
            private static final int CAMERA_WIDTH = 800;
            private static final int CAMERA_HEIGHT = 480;

            @Override
            public EngineOptions onCreateEngineOptions() {
                camera = new Camera(0, 0, CAMERA_WIDTH, CAMERA_HEIGHT);
                EngineOptions engineOptions = new EngineOptions(true,
                        ScreenOrientation.LANDSCAPE_FIXED,
                        new RatioResolutionPolicy(CAMERA_WIDTH, CAMERA_HEIGHT), camera);
                return engineOptions;
            }

            @Override
            protected void onCreateResources() {
                BitmapTextureAtlasTextureRegionFactory.setAssetBasePath("gfx/");
                this.mBitmapTextureAtlas = new BitmapTextureAtlas(
                        this.getTextureManager(), 1024, 1600, TextureOptions.NEAREST);
                // BitmapTextureAtlasTextureRegionFactory.createTiledFromAsset(this.mBitmapTextureAtlas,
                //         this, "ui_ball_1.png", 0, 0, 1, 1),
                //         this.getVertexBufferObjectManager());
                this.mFaceTextureRegion = BitmapTextureAtlasTextureRegionFactory
                        .createFromAsset(this.mBitmapTextureAtlas, this, "ui_ball_1.png", 0, 0);
                this.mFaceTextureRegion2 = BitmapTextureAtlasTextureRegionFactory
                        .createFromAsset(this.mBitmapTextureAtlas, this, "ui_ball_1.png", 0, 0);
                this.mBitmapTextureAtlas.load();
                this.mEngine.getTextureManager().loadTexture(this.mBitmapTextureAtlas);
            }

            @Override
            protected Scene onCreateScene() {
                this.mEngine.registerUpdateHandler(new FPSLogger());
                final Scene scene = new Scene();
                scene.setBackground(new Background(0.09804f, 0.6274f, 0.8784f));
                final float centerX = (CAMERA_WIDTH - this.mFaceTextureRegion.getWidth()) / 2;
                final float centerY = (CAMERA_HEIGHT - this.mFaceTextureRegion.getHeight()) / 2;
                final Sprite face = new Sprite(centerX, centerY,
                        this.mFaceTextureRegion, this.getVertexBufferObjectManager()) {
                    @Override
                    public boolean onAreaTouched(final TouchEvent pSceneTouchEvent,
                            final float pTouchAreaLocalX, final float pTouchAreaLocalY) {
                        this.setPosition(pSceneTouchEvent.getX() - this.getWidth() / 2,
                                pSceneTouchEvent.getY() - this.getHeight() / 2);
                        return true;
                    }
                };
                face.setScale(2);
                scene.attachChild(face);
                scene.registerTouchArea(face);

                face2 = new Sprite(200, 200, this.mFaceTextureRegion2,
                        this.getVertexBufferObjectManager()) {
                    @Override
                    public boolean onAreaTouched(final TouchEvent pSceneTouchEvent,
                            final float pTouchAreaLocalX, final float pTouchAreaLocalY) {
                        switch (pSceneTouchEvent.getAction()) {
                        case TouchEvent.ACTION_DOWN:
                            int count = pSceneTouchEvent.getMotionEvent().getPointerCount();
                            for (int i = 0; i < count; i++) {
                                int id = pSceneTouchEvent.getMotionEvent().getPointerId(i);
                            }
                            break;
                        case TouchEvent.ACTION_MOVE:
                            this.setPosition(pSceneTouchEvent.getX() - this.getWidth() / 2,
                                    pSceneTouchEvent.getY() - this.getHeight() / 2);
                            break;
                        case TouchEvent.ACTION_UP:
                            break;
                        }
                        return true;
                    }
                };
                face2.setScale(2);
                scene.attachChild(face2);
                scene.registerTouchArea(face2);
                scene.setTouchAreaBindingOnActionDownEnabled(true);
                return scene;
            }
        }

    The line

        int count = pSceneTouchEvent.getMotionEvent().getPointerCount();

    should set count to 2 if I touch the sprite with two fingers; then I can apply zooming functionality (the setScale method) to the sprite by getting the distance between the coordinates of the two fingers. Can anyone help me? Why does it not detect two fingers? And without this, how can I zoom the sprite on a two-finger pinch? I am very new to game development; any help would be appreciated. Thanks in advance.

    Read the article

  • Help trying to get two-finger scrolling to work on Asus UL80VT

    - by Dan2k3k4
    Multi-touch works fine on Windows 7: two-finger scrolling vertically and horizontally, two-finger tap for middle click, and three-finger tap for right click. With Ubuntu, however, I've never been able to get multi-touch to "save" and work; I got it working a few times, but after restarting it would just reset. I have two-finger scrolling enabled under Mouse and Touchpad -> Touchpad: "Two-finger scrolling" (selected) and "Enable horizontal scrolling" (ticked). The cursor stops moving when I try to scroll with two fingers, but it doesn't actually scroll the page. When I run xinput list, I get:

        Virtual core pointer                      id=2   [master pointer (3)]
          ↳ Virtual core XTEST pointer            id=4   [slave pointer (2)]
          ↳ ETPS/2 Elantech ETF0401               id=13  [slave pointer (2)]

    I've tried installing a 'synaptics-dkms' bug-fix (from a few years back), but that didn't work, so I removed it. I've tried installing 'uTouch', but that didn't seem to do anything, so I removed that too. Here's what I have installed now, from dpkg --get-selections > installed-software and grep 'touch\|mouse\|track\|synapt' installed-software:

        libsoundtouch0                      install
        libutouch-evemu1                    install
        libutouch-frame1                    install
        libutouch-geis1                     install
        libutouch-grail1                    install
        printer-driver-ptouch               install
        ptouch-driver                       install
        xserver-xorg-input-multitouch       install
        xserver-xorg-input-mouse            install
        xserver-xorg-input-vmmouse          install
        libnetfilter-conntrack3             install
        libxatracker1                       install
        xserver-xorg-input-synaptics        install

    So, I'll start again: what should I do now to get two-finger scrolling to work and ensure it still works after restarting? Also, running

        synclient TapButton1=1 TapButton2=2 TapButton3=3

    works but doesn't survive a restart, while

        synclient VertTwoFingerScroll=1 HorizTwoFingerScroll=1

    does NOT fix the two-finger scrolling at all. Output of cat /var/log/Xorg.0.log | grep -i synaptics:

        [ 4.576] (II) LoadModule: "synaptics"
        [ 4.577] (II) Loading /usr/lib/xorg/modules/input/synaptics_drv.so
        [ 4.577] (II) Module synaptics: vendor="X.Org Foundation"
        [ 4.577] (II) Using input driver 'synaptics' for 'ETPS/2 Elantech ETF0401'
        [ 4.577] (II) Loading /usr/lib/xorg/modules/input/synaptics_drv.so
        [ 4.584] (--) synaptics: ETPS/2 Elantech ETF0401: x-axis range 0 - 1088
        [ 4.584] (--) synaptics: ETPS/2 Elantech ETF0401: y-axis range 0 - 704
        [ 4.584] (--) synaptics: ETPS/2 Elantech ETF0401: pressure range 0 - 255
        [ 4.584] (--) synaptics: ETPS/2 Elantech ETF0401: finger width range 0 - 16
        [ 4.584] (--) synaptics: ETPS/2 Elantech ETF0401: buttons: left right middle double triple scroll-buttons
        [ 4.584] (--) synaptics: ETPS/2 Elantech ETF0401: Vendor 0x2 Product 0xe
        [ 4.584] (--) synaptics: ETPS/2 Elantech ETF0401: touchpad found
        [ 4.588] (**) synaptics: ETPS/2 Elantech ETF0401: (accel) MinSpeed is now constant deceleration 2.5
        [ 4.588] (**) synaptics: ETPS/2 Elantech ETF0401: MaxSpeed is now 1.75
        [ 4.588] (**) synaptics: ETPS/2 Elantech ETF0401: AccelFactor is now 0.154
        [ 4.589] (--) synaptics: ETPS/2 Elantech ETF0401: touchpad found

    I tried installing synaptiks, but that didn't seem to work either, so I removed it.

    Temporary fix (works until I restart): running

        modprobe -r psmouse
        modprobe psmouse proto=imps

    works, but now xinput list shows:

        Virtual core pointer                      id=2   [master pointer (3)]
          ↳ Virtual core XTEST pointer            id=4   [slave pointer (2)]
          ↳ ImPS/2 Generic Wheel Mouse            id=13  [slave pointer (2)]

    i.e. a generic mouse instead of the Elantech device, and it resets when I reboot.

    Solution (not ideal for most people): I ended up reinstalling a fresh 12.04 after indirectly playing around with burg and plymouth, then removing plymouth, which removed 50+ packages (I saw the warnings but was way too tired and assumed I could just 'reinstall' them all afterwards, except that didn't work). Right now xinput list shows:

        Virtual core pointer                      id=2   [master pointer (3)]
          ↳ Virtual core XTEST pointer            id=4   [slave pointer (2)]
          ↳ ETPS/2 Elantech Touchpad              id=13  [slave pointer (2)]

    grep 'touch\|mouse\|track\|synapt' installed-software now gives:

        libnetfilter-conntrack3             install
        libsoundtouch0                      install
        libutouch-evemu1                    install
        libutouch-frame1                    install
        libutouch-geis1                     install
        libutouch-grail1                    install
        libxatracker1                       install
        mousetweaks                         install
        printer-driver-ptouch               install
        xserver-xorg-input-mouse            install
        xserver-xorg-input-synaptics        install
        xserver-xorg-input-vmmouse          install

    And cat /var/log/Xorg.0.log | grep -i synaptics:

        [ 4.890] (II) LoadModule: "synaptics"
        [ 4.891] (II) Loading /usr/lib/xorg/modules/input/synaptics_drv.so
        [ 4.892] (II) Module synaptics: vendor="X.Org Foundation"
        [ 4.892] (II) Using input driver 'synaptics' for 'ETPS/2 Elantech Touchpad'
        [ 4.892] (II) Loading /usr/lib/xorg/modules/input/synaptics_drv.so
        [ 4.956] (II) synaptics: ETPS/2 Elantech Touchpad: ignoring touch events for semi-multitouch device
        [ 4.956] (--) synaptics: ETPS/2 Elantech Touchpad: x-axis range 0 - 1088
        [ 4.956] (--) synaptics: ETPS/2 Elantech Touchpad: y-axis range 0 - 704
        [ 4.956] (--) synaptics: ETPS/2 Elantech Touchpad: pressure range 0 - 255
        [ 4.956] (--) synaptics: ETPS/2 Elantech Touchpad: finger width range 0 - 15
        [ 4.956] (--) synaptics: ETPS/2 Elantech Touchpad: buttons: left right double triple
        [ 4.956] (--) synaptics: ETPS/2 Elantech Touchpad: Vendor 0x2 Product 0xe
        [ 4.956] (--) synaptics: ETPS/2 Elantech Touchpad: touchpad found
        [ 4.980] (**) synaptics: ETPS/2 Elantech Touchpad: (accel) MinSpeed is now constant deceleration 2.5
        [ 4.980] (**) synaptics: ETPS/2 Elantech Touchpad: MaxSpeed is now 1.75
        [ 4.980] (**) synaptics: ETPS/2 Elantech Touchpad: AccelFactor is now 0.154
        [ 4.980] (--) synaptics: ETPS/2 Elantech Touchpad: touchpad found

    So, if all else fails, reinstall Linux :/

    Read the article

  • Extract known pattern substring from NSString (without regex)

    - by d11wtq
    I'm really tempted to drop RegexKit (or my own libpcre wrapper) into my project in order to do this, but before I do that I want to know how Cocoa developers manage to do half of this basic stuff without really convoluted code, and without linking against RegexKit or another regular expression library. I find it gobsmacking that Cocoa does not include any regular expression matching features. I'm so accustomed to using regular expressions for all kinds of things that I'm lost without them. I can do what I need without them, but the code would be rather convoluted. So, Cocoa devs, I ask you: what's the "Cocoa way" to do this? The problem is an everyday one in programming as far as I'm concerned, so Cocoa must have ways of handling it with the built-in features. Note that the position of the elements I want to match changes, and sometimes quotes are present. Whitespace is variable. Take the following strings:

        Content-Type: application/xml; charset=utf-8
        Content-Type: text/html; charset="iso-8859-1"
        Content-Type: text/plain; charset=us-ascii
        Content-Type: text/plain; name="example.txt"; charset=utf-8

    From all of these strings, how would you go about determining the mime type (e.g. text/plain) and the charset (e.g. utf-8) using just the built-in Cocoa classes? I'd end up performing a series of -rangeOfString: and substring calls, with conditional checks to deal with the optional quotes etc. Is there a way to do this with NSScanner? The NSScanner class seems to have a pretty naive API to me. Something like C's sscanf() that works on NSString objects would be an ideal fit. Most of my string parsing needs are as simple as this example, so maybe regular expressions, while I'm accustomed to them, are overkill?
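
    A minimal NSScanner sketch of one way this could go, assuming the header line is already isolated; the handling of quotes and whitespace is deliberately simplified:

        // Sketch: pull the mime type and charset out of a Content-Type header.
        NSString *header = @"Content-Type: text/html; charset=\"iso-8859-1\"";
        NSScanner *scanner = [NSScanner scannerWithString:header];
        NSString *mimeType = nil, *charset = nil;

        [scanner scanString:@"Content-Type:" intoString:NULL];
        [scanner scanUpToString:@";" intoString:&mimeType];      // "text/html"

        [scanner scanUpToString:@"charset=" intoString:NULL];    // skip other parameters
        if ([scanner scanString:@"charset=" intoString:NULL]) {
            [scanner scanString:@"\"" intoString:NULL];          // optional opening quote
            NSCharacterSet *stop =
                [NSCharacterSet characterSetWithCharactersInString:@"\"; "];
            [scanner scanUpToCharactersFromSet:stop intoString:&charset];
        }
        NSLog(@"type=%@ charset=%@", mimeType, charset);         // text/html, iso-8859-1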

    Read the article

  • How do I correctly modify a custom Cocoa framework?

    - by Septih
    Hello, I'm working with the very useful ID3 framework in my Cocoa project. There's one tiny thing I'd like to modify in it, but I can't seem to get my changes to apply to the built framework. The source code provided with the framework comes with an Xcode project, so I've opened that up and, for testing's sake, put an NSLog(@"hello"); in. It's definitely in a place where it will be called, and there are other NSLog() calls in the framework that show up, so it's not just console output being suppressed. To build the modified framework I've first cleaned the build folder, made sure it actually removed the files, and then built it. Then, in the Xcode project that uses the framework, I've deleted the old reference and added a new one to the freshly built framework. Running my project with the newly built framework doesn't call the modified framework code. I've tried both the Development and Deployment builds that are part of the framework's Xcode project. My gut instinct is that the executable the framework code is compiled into is being cached somehow, but as I'm fairly unfamiliar with the workings of frameworks, I'm not really sure where to look.

    Read the article

  • Checking if a touch is within a UIButton's bounds.

    - by Joshua
    I am trying to write an if statement that checks whether the user's touch is within a UIButton's bounds. I thought this would be an easy affair, as UIButton is a subclass of UIView; however, my code doesn't seem to work. This is the code I have been using:

        - (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
            NSArray *array = [touches allObjects];
            UITouch *specificTouch = [array objectAtIndex:0];
            currentTouch = [specificTouch locationInView:self.view];
            if (CGRectContainsPoint(but.bounds, currentTouch)) {
                // Do something: touch is in bounds.
            }
            // Else do nothing.
        }
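
    A hedged note on the likely mismatch: currentTouch is expressed in self.view's coordinate system, while bounds is in the button's own coordinate system. Two sketches that keep the point and the rect in the same space, assuming but is a subview of self.view:

        // Option 1: compare in the superview's coordinates (frame, not bounds).
        CGPoint touchPoint = [specificTouch locationInView:self.view];
        if (CGRectContainsPoint(but.frame, touchPoint)) {
            // touch is over the button
        }

        // Option 2: compare in the button's own coordinates.
        CGPoint inButton = [specificTouch locationInView:but];
        if (CGRectContainsPoint(but.bounds, inButton)) {
            // touch is over the button
        }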

    Read the article
