Search Results

Search found 558 results on 23 pages for 'touches'.

Page 4 of 23

  • How to detect touch over a UIView when touched over a UIButton?

    - by jglievano
    I'm using (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event and (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event to handle some dragging on a UIView. This UIView, however, has some UIButtons as subviews, and when the user's touch lands on one of the UIButtons, the touch methods aren't called. I need the touch methods on the UIView to be called at all times while still keeping the UIButtons working. How can I achieve this?
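
    One common workaround (a sketch, not from the question) is to subclass UIButton so it forwards its touch events up to the superview while still calling super, which keeps the normal button behaviour intact:

        @interface ForwardingButton : UIButton
        @end

        @implementation ForwardingButton
        // Pass each event to the superview (the dragged UIView), then to UIButton itself.
        - (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
            [self.superview touchesBegan:touches withEvent:event];
            [super touchesBegan:touches withEvent:event];
        }
        - (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
            [self.superview touchesMoved:touches withEvent:event];
            [super touchesMoved:touches withEvent:event];
        }
        - (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event {
            [self.superview touchesEnded:touches withEvent:event];
            [super touchesEnded:touches withEvent:event];
        }
        @end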

    Read the article

  • touchesEnded not being called??? or randomly being called

    - by Rob
    If I lift my finger up off the first touch, then it will recognize the next touch just fine. It's only when I hold my first touch down continuously and then try to touch a different area with a different finger at the same time. It will then incorrectly register that second touch as being from the first touch again. Update: It has something to do with touchesEnded not being called until the very LAST touch has ended (it doesn't care if you already had 5 other touches end before you finally let go of the last one... it calls them all to end once the very last touch ends). - (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event { UITouch* touch = [touches anyObject]; NSString* filename = [listOfStuff objectAtIndex:[touch view].tag]; // do something with the filename now } - (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event { UITouch* touch = [touches anyObject]; NSString* buttonPressed = [listOfStuff objectAtIndex:[touch view].tag]; // do something with this info now }
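
    A sketch of one likely fix (assuming multipleTouchEnabled is YES on the views involved): iterate over every touch in the set instead of pulling an arbitrary one out with anyObject, so each ended touch is matched to its own view:

        - (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event {
            for (UITouch *touch in touches) {
                // Each ended touch still carries the view it began in.
                NSString *buttonPressed = [listOfStuff objectAtIndex:[touch view].tag];
                // handle buttonPressed for this touch only
            }
        }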

    Read the article

  • touchesBegan and other touch events not getting detected in UINavigationController

    - by SaltyNuts
    In short, I want to detect a touch on the navigation controller's title bar, but I'm having trouble catching any touches at all! Everything is done without IB, if that makes a difference. My app delegate's .m file contains: MyViewController *viewController = [[MyViewController alloc] init]; navigationController = [[UINavigationController alloc] initWithRootViewController:viewController]; [window addSubview:navigationController.view]; There are a few other subviews added to this window in a way that overlays the navigationController's view, leaving only the navigation bar visible. MyViewController is a subclass of UIViewController and its .m file contains: - (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event { for (UITouch *touch in touches) { NSLog(@"ended\n"); } } -(void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event { for (UITouch *touch in touches) { NSLog(@"began\n"); } } I also tried putting these methods directly into the app delegate's .m file, but the console remains blank. What am I doing wrong?
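
    One alternative worth sketching (assuming iOS 3.2+, where gesture recognizers are available): since other subviews cover everything except the bar, attach a recognizer directly to the navigation bar, the layer actually receiving the touches:

        // e.g. in the app delegate, after the views are set up:
        UITapGestureRecognizer *tap = [[UITapGestureRecognizer alloc]
            initWithTarget:self action:@selector(titleBarTapped:)];
        [navigationController.navigationBar addGestureRecognizer:tap];
        [tap release];   // the bar retains it

        - (void)titleBarTapped:(UITapGestureRecognizer *)recognizer {
            NSLog(@"title bar tapped");
        }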

    Read the article

  • What class should manage/control the CALayers in my view using proper MVC?

    - by wanderlust
    I have a ViewController with a view (UIView). I need to handle touches, run some logic, check against model data, and add and remove sublayers to the view based on those touches. Then I need to update the model based on the results. Should I have: ViewController - manage touches, get/set model data, add/remove sublayers UIView CALayer Sublayers or Controller (NSObject) - get/set data ViewController - manage touches, add/remove sublayers UIView CALayer Sublayers or Controller (NSObject) - get/set data CustomView - manage touches, add/remove sublayers CALayer Sublayers Or is it something else altogether? No matter what I try, it "feels" awkward. SVN is my friend. Can you guys help a girl (with architectural issues) out?
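
    For what it's worth, a minimal sketch of the first arrangement, where the view controller owns touch handling, model access, and layer management (the model methods here are hypothetical):

        - (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
            CGPoint p = [[touches anyObject] locationInView:self.view];
            if ([self.model canPlaceItemAtPoint:p]) {      // check model data
                CALayer *dot = [CALayer layer];
                dot.frame = CGRectMake(p.x - 20, p.y - 20, 40, 40);
                [self.view.layer addSublayer:dot];         // add/remove sublayers
                [self.model addItemAtPoint:p];             // update the model
            }
        }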

    Read the article

  • How to resize the UIView when CGAffineTransformIdentity

    - by Gowtham
    I am doing an app which has a feature to rotate and resize a view. I have implemented this feature, but I face an issue. My problem: the view is resized by dragging its four corners, and after resizing I can rotate the view in both directions. Once the rotation is done, if I try to resize the view again by dragging a corner, the view's size goes to unpredictable values and it moves all around the screen. I googled a lot and finally found the following explanation: The frame property is undefined when transform != CGAffineTransformIdentity, as per the docs on UIView. I saw one app that implements exactly the feature I wish to implement. How can I resize the UIView after it has been rotated? My code for resizing the view: Touches Began - (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event{ UITouch *touch = [[event allTouches] anyObject]; NSLog(@"[touch view]:::%@",[touch view]); touchStart = [[touches anyObject] locationInView:testVw]; isResizingLR = (testVw.bounds.size.width - touchStart.x < kResizeThumbSize && testVw.bounds.size.height - touchStart.y < kResizeThumbSize); isResizingUL = (touchStart.x < kResizeThumbSize && touchStart.y < kResizeThumbSize); isResizingUR = (testVw.bounds.size.width - touchStart.x < kResizeThumbSize && touchStart.y < kResizeThumbSize); isResizingLL = (touchStart.x < kResizeThumbSize && testVw.bounds.size.height - touchStart.y < kResizeThumbSize); } Touches Moved - (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event{ CGPoint touchPoint = [[touches anyObject] locationInView:testVw]; CGPoint previous = [[touches anyObject] previousLocationInView:testVw]; float deltaWidth = touchPoint.x - previous.x; float deltaHeight = touchPoint.y - previous.y; NSLog(@"CVTM:%@",NSStringFromCGRect(testVw.frame)); if (isResizingLR) { testVw.frame = CGRectMake(testVw.frame.origin.x, testVw.frame.origin.y, touchPoint.x + deltaWidth, touchPoint.y + deltaHeight); } if (isResizingUL) { testVw.frame = CGRectMake(testVw.frame.origin.x + deltaWidth, testVw.frame.origin.y + deltaHeight, testVw.frame.size.width - deltaWidth, testVw.frame.size.height - deltaHeight); } if (isResizingUR) { testVw.frame = CGRectMake(testVw.frame.origin.x, testVw.frame.origin.y + deltaHeight, testVw.frame.size.width + deltaWidth, testVw.frame.size.height - deltaHeight); } if (isResizingLL) { testVw.frame = CGRectMake(testVw.frame.origin.x + deltaWidth, testVw.frame.origin.y, testVw.frame.size.width - deltaWidth, testVw.frame.size.height + deltaHeight); } if (!isResizingUL && !isResizingLR && !isResizingUR && !isResizingLL) { testVw.center = CGPointMake(testVw.center.x + touchPoint.x - touchStart.x, testVw.center.y + touchPoint.y - touchStart.y); } }
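
    Since frame is undefined under a non-identity transform, the usual workaround is to resize through bounds and center, which remain valid. A sketch for the lower-right corner, using the same deltas computed in touchesMoved (once a rotation is applied, the center offset itself may also need to be run through the transform to keep the opposite corner pinned):

        if (isResizingLR) {
            CGRect b = testVw.bounds;
            b.size.width  += deltaWidth;
            b.size.height += deltaHeight;
            testVw.bounds = b;     // bounds and center are safe under a transform
            testVw.center = CGPointMake(testVw.center.x + deltaWidth / 2.0,
                                        testVw.center.y + deltaHeight / 2.0);
        }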

    Read the article

  • Implementing touch-based rotation in cocoa touch

    - by ewoo
    I am wondering what is the best way to implement rotation-based dragging movements in my iPhone application. I have a UIView that I wish to rotate around its centre when the user's finger is touching the view and moving. Think of it like a dial that needs to be adjusted with the finger. The basic question comes down to: 1) Should I remember the initial angle and transform when touchesBegan is called, and then every time touchesMoved is called apply a new transform to the view based on the current position of the finger, e.g., something like: - (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event { UITouch *touch = [touches anyObject]; CGPoint currentPoint = [touch locationInView:self]; //current position of touch if (([touch view] == self) && [Utility getDistance:currentPoint toPoint:self.middle] <= ROTATE_RADIUS //middle is centre of view && [Utility getDistance:currentPoint toPoint:self.middle] >= MOVE_RADIUS) { //will be rotation gesture //remember state of view at beginning of touch CGPoint top = CGPointMake(self.middle.x, 0); self.initialTouch = currentPoint; self.initialAngle = angleBetweenLines(self.middle, top, self.middle, currentPoint); self.initialTransform = self.transform; } } - (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event{ UITouch *touch = [touches anyObject]; CGPoint currentPoint = [touch locationInView:self]; //current position of touch if (([touch view] == self) && [Utility getDistance:currentPoint toPoint:self.middle] <= ROTATE_RADIUS && [Utility getDistance:currentPoint toPoint:self.middle] >= MOVE_RADIUS) { //a rotation gesture //rotate tile float newAngle = angleBetweenLines(self.middle, CGPointMake(self.middle.x, 0), self.middle, currentPoint); //touch angle float angleDif = newAngle - self.initialAngle; //work out dif between angle at beginning of touch and now. CGAffineTransform newTransform = CGAffineTransformRotate(self.initialTransform, angleDif); //create new transform self.transform = newTransform; //apply transform. } } OR 2) Should I simply remember the last known position/angle, and rotate the view based on the difference in angle between that and now, e.g.: - (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event { UITouch *touch = [touches anyObject]; CGPoint currentPoint = [touch locationInView:self]; //current position of touch if (([touch view] == self) && [Utility getDistance:currentPoint toPoint:self.middle] <= ROTATE_RADIUS && [Utility getDistance:currentPoint toPoint:self.middle] >= MOVE_RADIUS) { //will be rotation gesture //remember state of view at beginning of touch CGPoint top = CGPointMake(self.middle.x, 0); self.lastTouch = currentPoint; self.lastAngle = angleBetweenLines(self.middle, top, self.middle, currentPoint); } } - (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event{ UITouch *touch = [touches anyObject]; CGPoint currentPoint = [touch locationInView:self]; //current position of touch if (([touch view] == self) && [Utility getDistance:currentPoint toPoint:middle] <= ROTATE_RADIUS && [Utility getDistance:currentPoint toPoint:middle] >= MOVE_RADIUS) { //a rotation gesture //rotate tile float newAngle = angleBetweenLines(self.middle, CGPointMake(self.middle.x, 0), self.middle, currentPoint); //touch angle float angleDif = newAngle - self.lastAngle; //work out dif between angle at beginning of touch and now. CGAffineTransform newTransform = CGAffineTransformRotate(self.transform, angleDif); //create new transform self.transform = newTransform; //apply transform.
    self.lastTouch = currentPoint; self.lastAngle = newAngle; } } The second option makes more sense to me, but it is not giving very pleasing results (jaggy updates and non-smooth rotations). Which way is best (if any), in terms of performance? Cheers!
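
    One thing worth checking (an assumption about the jagginess, not a confirmed diagnosis): both options measure the touch with locationInView:self while self is the view being rotated, so every transform change moves the coordinate space mid-gesture. A sketch of option 1 measured in the superview's untransformed space:

        - (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
            CGPoint p = [[touches anyObject] locationInView:self.superview];
            CGPoint c = self.center;   // center is expressed in superview coordinates
            float newAngle = atan2f(p.y - c.y, p.x - c.x);
            float angleDif = newAngle - self.initialAngle;   // initialAngle captured the same way in touchesBegan
            self.transform = CGAffineTransformRotate(self.initialTransform, angleDif);
        }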

    Read the article

  • UIScrollView without paging but with touchesMoved

    - by BittenApple
    I have a UIScrollView that I use to display PDF pages in. I don't want to use paging (yet). I want to display content based on the touchesMoved event (i.e., after a horizontal swipe). This works (sort of), but instead of catching a single swipe and showing one page, the swipe seems to get broken into hundreds of pieces, and one swipe acts as if you're moving a slider! I have no clue what I'm doing wrong. Here's the experimental "display next page" code, which works on single taps: - (void)nacrtajNovuStranicu:(CGContextRef)myContext { CGContextTranslateCTM(myContext, 0.0, self.bounds.size.height); CGContextScaleCTM(myContext, 1.0, -1.0); CGContextSetRGBFillColor(myContext, 255, 255, 255, 1); CGContextFillRect(myContext, CGRectMake(0, 0, 320, 412)); size_t brojStranica = CGPDFDocumentGetNumberOfPages(pdfFajl); if(pageNumber < brojStranica){ pageNumber ++; } else{ // kraj PDF fajla, ne listaj dalje. } CGPDFPageRef page = CGPDFDocumentGetPage(pdfFajl, pageNumber); CGContextSaveGState(myContext); CGAffineTransform pdfTransform = CGPDFPageGetDrawingTransform(page, kCGPDFCropBox, self.bounds, 0, true); CGContextConcatCTM(myContext, pdfTransform); CGContextDrawPDFPage(myContext, page); CGContextRestoreGState(myContext); //osvjezi displej [self setNeedsDisplay]; } Here's the swiping code: - (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event { [super touchesBegan:touches withEvent:event]; UITouch *touch = [touches anyObject]; gestureStartPoint = [touch locationInView:self]; } - (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event { UITouch *touch = [touches anyObject]; CGPoint currentPosition = [touch locationInView:self]; CGFloat deltaX = fabsf(gestureStartPoint.x - currentPosition.x); CGFloat deltaY = fabsf(gestureStartPoint.y - currentPosition.y); if (deltaX >= kMinimumGestureLength && deltaY <= kMaximumVariance) { [self nacrtajNovuStranicu:(CGContextRef)UIGraphicsGetCurrentContext()]; } } The code sits in the UIView that displays the PDF content; perhaps I should place it into the UIScrollView, or is the "display next page" code wrong?
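
    A sketch of one likely fix: touchesMoved fires dozens of times during a single swipe, and each call advances the page and draws into whatever context happens to be current. Deciding once, in touchesEnded, and drawing only from drawRect: should turn one swipe into one page:

        - (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event {
            CGPoint endPoint = [[touches anyObject] locationInView:self];
            CGFloat deltaX = fabsf(gestureStartPoint.x - endPoint.x);
            CGFloat deltaY = fabsf(gestureStartPoint.y - endPoint.y);
            if (deltaX >= kMinimumGestureLength && deltaY <= kMaximumVariance) {
                pageNumber++;              // advance exactly one page per swipe
                [self setNeedsDisplay];    // let drawRect: render it with a valid context
            }
        }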

    Read the article

  • Cancel UITouch Events When View Covered By Modal UIViewController

    - by kkrizka
    Hi there, I am writing an application where the user has to move some stuff on the screen using his fingers and drop them. To do this, I am using the touchesBegan, touchesEnded, etc. methods of each view that has to be moved. The problem is that sometimes the views are covered by a view displayed using the [UIViewController presentModalViewController] method. As soon as that happens, the UIView that I was moving stops receiving the touch events, since it was covered up. But there is no event telling me that it stopped receiving the events, so that I can reset the state of the moved view. The following is an example that demonstrates this. The methods are part of a UIView that is shown in the main window. It listens to touch events, and when I drag the finger for some distance, it presents a modal view that covers everything. In the Run Log, it prints what touch events are received. - (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event { NSLog(@"touchesBegan"); touchStart=[[touches anyObject] locationInView:self]; } - (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event { CGPoint touchAt=[[touches anyObject] locationInView:self]; float xx=(touchAt.x-touchStart.x)*(touchAt.x-touchStart.x); float yy=(touchAt.y-touchStart.y)*(touchAt.y-touchStart.y); float rr=xx+yy; NSLog(@"touchesMoved %f",rr); if(rr > 100) { NSLog(@"Show modal"); [viewController presentModalViewController:[UIViewController new] animated:NO]; } } - (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event { NSLog(@"touchesEnded"); } - (void)touchesCancelled:(NSSet *)touches withEvent:(UIEvent *)event { NSLog(@"touchesCancelled"); } But when I test the application and trigger the modal dialog to be displayed, the following is the output in the Run Log. [Session started at 2010-03-27 16:17:14 -0700.] 2010-03-27 16:17:18.831 modelTouchCancel[2594:207] touchesBegan 2010-03-27 16:17:19.485 modelTouchCancel[2594:207] touchesMoved 2.000000 2010-03-27 16:17:19.504 modelTouchCancel[2594:207] touchesMoved 4.000000 2010-03-27 16:17:19.523 modelTouchCancel[2594:207] touchesMoved 16.000000 2010-03-27 16:17:19.538 modelTouchCancel[2594:207] touchesMoved 26.000000 2010-03-27 16:17:19.596 modelTouchCancel[2594:207] touchesMoved 68.000000 2010-03-27 16:17:19.624 modelTouchCancel[2594:207] touchesMoved 85.000000 2010-03-27 16:17:19.640 modelTouchCancel[2594:207] touchesMoved 125.000000 2010-03-27 16:17:19.641 modelTouchCancel[2594:207] Show modal Any suggestions on how to reset the state of a UIView when its touch events are interrupted by a modal view?
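
    Since no touchesCancelled arrives once the modal view covers everything, one option is a manual reset just before presenting it; a sketch (resetDragState is a hypothetical helper that puts the moved view back into a consistent state):

        if (rr > 100) {
            NSLog(@"Show modal");
            [self resetDragState];   // hypothetical: restore the dragged view ourselves
            [viewController presentModalViewController:[UIViewController new] animated:NO];
        }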

    Read the article

  • Odd values/movement with UITouch and CGPoint.

    - by Joshua
    I'm getting odd numbers from UITouch and CGPoint, and one is different; I also think this may be causing a flickering effect in my app when I try to move something by following a touch. This is the code I'm using: - (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event { NSLog(@"touchDown"); UITouch *touch = [touches anyObject]; firstTouch = [touch locationInView:self.view]; if (CGRectContainsPoint(but.frame, firstTouch)) { butContains = YES; NSLog(@"butContains = %d", butContains); } } - (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event { UITouch *touch = [touches anyObject]; currentTouch = [touch locationInView:self.view]; NSInteger x = currentTouch.x; NSInteger y = currentTouch.y; CGFloat CGX = (CGFloat)x; CGFloat CGY = (CGFloat)y; if (butContains == YES) { NSLog(@"touch in subView/contentView"); sub.frame = CGRectMake(CGX, CGY, 130.0, 21.0); } NSLog(@"touch moved"); } - (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event { UITouch *touch = [touches anyObject]; currentTouch = [touch locationInView:self.view]; NSLog(@"User tapped at %@", NSStringFromCGPoint(currentTouch)); NSLog(@"Point %a, %a", currentTouch.x, currentTouch.y); NSInteger x = currentTouch.x; NSInteger y = currentTouch.y; NSLog(@"Point %a, %a", y, x); CGFloat CGX = (CGFloat)x; CGFloat CGY = (CGFloat)y; NSLog(@"Point %g, %g", CGX, CGY); if (butContains == YES) { NSLog(@"touch in subView/contentView"); sub.frame = CGRectMake(CGX, CGY, 130.0, 21.0); } butContains = NO; NSLog(@"touch ended"); } - (IBAction)add:(id)sender{ InSightViewController *contentView = [[InSightViewController alloc] initWithNibName:@"SubView" bundle:[NSBundle mainBundle]]; [contentView loadView]; [self.view insertSubview:contentView.view atIndex:0]; } This is what I get from the touchesEnded method in the Debugger. 2010-04-20 20:06:13.045 InSight[25042:207] User tapped at {50, 78} 2010-04-20 20:06:13.047 InSight[25042:207] Point 0x1.9p+5, 0x1.38p+6 2010-04-20 20:06:13.048 InSight[25042:207] Point 0x1.900000027p-1037, 0x1.38p+6 2010-04-20 20:06:13.048 InSight[25042:207] Point 50, 78 And this is what's happening in the Simulator. fwdr.org/file:y8bd Since this is a complicated problem, here is the source code of my Xcode project as well. http://cl.ly/Qjj
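
    Part of the odd output looks like a format-specifier problem (a diagnosis from the log, so treat it as an inference): %a prints hexadecimal floating point, and passing an NSInteger where NSLog expects a double is undefined behaviour, which is where 0x1.900000027p-1037 comes from. Matching specifiers print the expected values:

        NSLog(@"Point %g, %g", currentTouch.x, currentTouch.y);   // CGFloat -> %g
        NSLog(@"Point %d, %d", (int)x, (int)y);                   // integers -> %d

    The flicker is probably separate: the code pins the subview's top-left corner to the finger, so the view jumps on each move; subtracting the initial grab offset from the touch point would keep it steady.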

    Read the article

  • touchesBegan / Ended incorrectly identifying second, third, etc. touch

    - by Rob
    I have an issue where touchesBegan and touchesEnded are incorrectly identifying my second, third, etc. touch if I continue to hold down my first touch. If I lift my finger up off the first touch, then it will recognize the next touch just fine. It's only when I hold my first touch down continuously and then try to touch a different area with a different finger at the same time. It will then incorrectly register that second touch as being from the first touch again. Any insights into how I can fix this? - (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event { UITouch* touch = [touches anyObject]; NSString* filename = [listOfStuff objectAtIndex:[touch view].tag]; // do something with the filename now } - (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event { UITouch* touch = [touches anyObject]; NSString* buttonPressed = [listOfStuff objectAtIndex:[touch view].tag]; // do something with this info now }
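
    One assumption worth checking: multi-touch delivery is off by default, so a second finger is never reported as an independent touch unless the views opt in. A minimal sketch:

        // In setup code -- multipleTouchEnabled defaults to NO:
        self.view.multipleTouchEnabled = YES;
        for (UIView *subview in self.view.subviews) {
            subview.multipleTouchEnabled = YES;
        }

    With that set, iterating the touches set (rather than calling [touches anyObject]) matches each began/ended touch to its own view.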

    Read the article

  • Why my own UIViewController can't detect touch?

    - by Tattat
    I have my OwnViewController, whose viewDidLoad is like this: - (void)viewDidLoad { [super viewDidLoad]; UIImage *img = [UIImage imageWithContentsOfFile: [[NSBundle mainBundle] pathForResource:@"myImg" ofType:@"png"]]; CGRect cropRect = CGRectMake(175, 0, 175, 175); CGImageRef imageRef = CGImageCreateWithImageInRect([img CGImage], cropRect); UIImageView *imageView = [[UIImageView alloc] initWithFrame:CGRectMake(0, 0, 175, 175)]; imageView.image = [UIImage imageWithCGImage:imageRef]; self.view = imageView; CGImageRelease(imageRef); } It works, and I have a touch-detection method like this: - (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event { NSLog(@"touchesBegan"); } But my UIView can't detect any touches. My OwnViewController is a subclass of UIViewController. It is a little square view in IB; why can't it detect touches? Thank you.
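
    A likely culprit (a guess from the posted code): UIImageView has userInteractionEnabled set to NO by default, and here it is installed as the controller's entire view, so touches are discarded before they reach the responder chain. A one-line sketch of the fix in viewDidLoad:

        imageView.userInteractionEnabled = YES;   // UIImageView defaults to NO
        self.view = imageView;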

    Read the article

  • Maintaining last state when rotating through CATransform3DIdentity

    - by Mikhail Naimy
    Hi. I am rotating an image view with the following code. It rotates fine, but when I rotate again, the image view jumps back to the previous angle (its initial state) and then rotates. Can anyone help with this? rotationTransform has been declared as CABasicAnimation* rotationAnimation in the .h file. - (void) touchesBegan:(NSSet*)touches withEvent:(UIEvent*)event { UITouch *touch = [touches anyObject]; startTouchPosition = [touch locationInView:self.view]; } - (void) touchesMoved:(NSSet*)touches withEvent:(UIEvent*)event { UITouch *touch = [[event allTouches] anyObject]; location = [touch locationInView:self.view]; currentLocationRadians = atan2f(location.y - self.view.frame.size.height/2, location.x - self.view.frame.size.width/2); lastLocationRadians = atan2f(startTouchPosition.y - self.view.frame.size.height/2, startTouchPosition.x - self.view.frame.size.width/2); rotationTransform = CATransform3DIdentity; rotationTransform = CATransform3DRotate(rotationTransform, currentLocationRadians-lastLocationRadians + rad, 0.0, 0.0, 1.0); _imgview.layer.transform = rotationTransform; } - (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event { rotationTransform = _imgview.layer.transform; }
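
    A sketch of one likely fix: touchesMoved rebuilds the rotation from CATransform3DIdentity on every move, so each new gesture starts from zero. Capturing the layer's current transform in touchesBegan and rotating from there keeps the previous angle (this treats rotationTransform as a CATransform3D):

        - (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
            startTouchPosition = [[touches anyObject] locationInView:self.view];
            rotationTransform = _imgview.layer.transform;   // continue from where we left off
        }

        - (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
            CGPoint loc = [[touches anyObject] locationInView:self.view];
            float current = atan2f(loc.y - self.view.frame.size.height/2,
                                   loc.x - self.view.frame.size.width/2);
            float last = atan2f(startTouchPosition.y - self.view.frame.size.height/2,
                                startTouchPosition.x - self.view.frame.size.width/2);
            _imgview.layer.transform =
                CATransform3DRotate(rotationTransform, current - last, 0.0, 0.0, 1.0);
        }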

    Read the article

  • How can I detect if

    - by Suzie
    Is there a way to detect just 2 distinct touches? I just want to track the touches for my two buttons, which are sprites, but whenever there is another touch beyond my first two touches, it interferes with the touches on my buttons. Is there a way to get rid of that third touch? I hope you can help me with this problem. Thank you!
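
    A sketch of one way to do this (firstTouch and secondTouch are hypothetical UITouch ivars): remember the first two UITouch objects and simply ignore any touch that is neither of them:

        - (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
            for (UITouch *touch in touches) {
                if (firstTouch == nil)        firstTouch = touch;
                else if (secondTouch == nil)  secondTouch = touch;
                // a third touch is never stored, so it cannot affect the buttons
            }
        }

        - (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event {
            for (UITouch *touch in touches) {
                if (touch == firstTouch)  firstTouch = nil;
                if (touch == secondTouch) secondTouch = nil;
            }
        }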

    Read the article

  • Basic drawing with Quartz 2D on iPhone

    - by wwrob
    My goal is to make a program that will draw points whenever the screen is touched. This is what I have so far: The header file: #import <UIKit/UIKit.h> @interface ElSimView : UIView { CGPoint firstTouch; CGPoint lastTouch; UIColor *pointColor; CGRect *points; int npoints; } @property CGPoint firstTouch; @property CGPoint lastTouch; @property (nonatomic, retain) UIColor *pointColor; @property CGRect *points; @property int npoints; @end The implementation file: //@synths etc. - (id)initWithFrame:(CGRect)frame { return self; } - (id)initWithCoder:(NSCoder *)coder { if(self = [super initWithCoder:coder]) { self.npoints = 0; } return self; } - (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event { UITouch *touch = [touches anyObject]; firstTouch = [touch locationInView:self]; lastTouch = [touch locationInView:self]; } - (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event { UITouch *touch = [touches anyObject]; lastTouch = [touch locationInView:self]; points = (CGRect *)malloc(sizeof(CGRect) * ++npoints); points[npoints-1] = CGRectMake(lastTouch.x-15, lastTouch.y-15,30,30); [self setNeedsDisplay]; } - (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event { UITouch *touch = [touches anyObject]; lastTouch = [touch locationInView:self]; [self setNeedsDisplay]; } - (void)drawRect:(CGRect)rect { CGContextRef context = UIGraphicsGetCurrentContext(); CGContextSetLineWidth(context, 2.0); CGContextSetStrokeColorWithColor(context, [UIColor blackColor].CGColor); CGContextSetFillColorWithColor(context, pointColor.CGColor); for(int i=0; i<npoints; i++) CGContextAddEllipseInRect(context, points[i]); CGContextDrawPath(context, kCGPathFillStroke); } - (void)dealloc { free(points); [super dealloc]; } @end When I load this and click some points, it draws the first point normally, but the next points are drawn along with random ellipses (not even circles). Also, another question: when exactly is drawRect: executed?
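
    The random ellipses look like uninitialised memory (an inference from the code): each touch mallocs a brand-new buffer, so only the newest point is ever written and the previous buffer leaks. A sketch using realloc, which grows the buffer while keeping the existing points:

        - (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event {
            UITouch *touch = [touches anyObject];
            lastTouch = [touch locationInView:self];
            points = (CGRect *)realloc(points, sizeof(CGRect) * ++npoints);  // keeps old points
            points[npoints-1] = CGRectMake(lastTouch.x-15, lastTouch.y-15, 30, 30);
            [self setNeedsDisplay];
        }

    As for the second question: drawRect: is never called directly; after setNeedsDisplay the system invokes it on the next pass through the run loop's drawing cycle.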

    Read the article

  • Disposing a dialog on touch devices in LWUIT

    - by MANISH
    I am displaying a dialog when a user touches the screen, and I want the dialog to dispose when the user touches anywhere outside the dialog. I have set setDisposeWhenPointerOutOfBounds() to true, though it is true by default, and have written the following code in the pointerReleased() event. But whenever the user touches the screen outside of the dialog, the dialog disposes, though not without executing the code that should run only when x,y are within the dialog. Please help me out, anyone. public void pointerReleased(int x, int y) { dispose(); if (contains(x, y)) { actionCommand((cmds[l.getSelectedIndex()])); } }

    Read the article

  • CGAffineTransformMakeRotation goes the other way after 180 degrees (-3.14)

    - by TheKillerDev
    So, I am trying to do a very simple 2D disc rotation that follows the user's touch, just like a DJ deck or something. It is working, but there is a problem: after a certain amount of rotation, it starts going backwards. This happens after 180 degrees or, as I saw while logging the angle, at -3.14 (pi). I was wondering how I can achieve continuous rotation, i.e., the user can keep rotating to either side just by sliding a finger? Also, a second question: is there any way to speed up the rotation? Here is my code right now: #import <UIKit/UIKit.h> @interface Draggable : UIImageView { CGPoint firstLoc; UILabel * fred; double angle; } @property (assign) CGPoint firstLoc; @property (retain) UILabel * fred; @end @implementation Draggable @synthesize fred, firstLoc; - (id)initWithFrame:(CGRect)frame { self = [super initWithFrame:frame]; angle = 0; if (self) { // Initialization code } return self; } -(void)handleObject:(NSSet *)touches withEvent:(UIEvent *)event isLast:(BOOL)lst { UITouch *touch =[[[event allTouches] allObjects] lastObject]; CGPoint curLoc = [touch locationInView:self]; float fromAngle = atan2( firstLoc.y-self.center.y, firstLoc.x-self.center.x ); float toAngle = atan2( curLoc.y-(self.center.y+10), curLoc.x-(self.center.x+10)); float newAngle = angle + (toAngle - fromAngle); NSLog(@"%f",newAngle); CGAffineTransform cgaRotate = CGAffineTransformMakeRotation(newAngle); self.transform = cgaRotate; if (lst) angle = newAngle; } -(void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event { UITouch *touch =[[[event allTouches] allObjects] lastObject]; firstLoc = [touch locationInView:self]; }; -(void) touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event { [self handleObject:touches withEvent:event isLast:NO]; }; -(void) touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event { [self handleObject:touches withEvent:event isLast:YES]; } @end And in the ViewController: UIImage *tmpImage = [UIImage imageNamed:@"theDisc.png"]; CGRect cellRectangle; cellRectangle = CGRectMake(-1,self.view.frame.size.height,tmpImage.size.width ,tmpImage.size.height ); dragger = [[Draggable alloc] initWithFrame:cellRectangle]; [dragger setImage:tmpImage]; [dragger setUserInteractionEnabled:YES]; dragger.layer.anchorPoint = CGPointMake(.5,.5); [self.view addSubview:dragger]; I am open to new/cleaner/more correct ways of doing this too. Thanks in advance.
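
    A sketch of one way around the wrap: atan2 jumps between pi and -pi, so the raw difference can leap by about 2*pi when the finger crosses that line. Working with the per-move delta (via previousLocationInView:), unwrapping it, and accumulating lets the total angle grow without bound, and scaling the delta answers the speed question (angle is the existing ivar):

        - (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
            UITouch *touch = [[[event allTouches] allObjects] lastObject];
            CGPoint cur  = [touch locationInView:self.superview];
            CGPoint prev = [touch previousLocationInView:self.superview];
            float toAngle   = atan2f(cur.y - self.center.y,  cur.x - self.center.x);
            float fromAngle = atan2f(prev.y - self.center.y, prev.x - self.center.x);
            float delta = toAngle - fromAngle;
            if (delta >  M_PI) delta -= 2.0f * M_PI;   // unwrap the atan2 discontinuity
            if (delta < -M_PI) delta += 2.0f * M_PI;
            angle += delta * 1.5f;   // factor > 1 speeds the rotation up
            self.transform = CGAffineTransformMakeRotation(angle);
        }

    Because each per-move delta is tiny, unwrapping it is safe, and the accumulated angle can keep growing in either direction indefinitely.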

    Read the article

  • Gesture recognizer for mouse down and up in iPhone SDK

    - by user545201
    I want to catch both mouse down and mouse up using a gesture recognizer. However, when the mouse down is caught, the mouse up is never caught. Here's what I did: First, create a custom MouseGestureRecognizer: @implementation MouseGestureRecognizer -(void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event { [super touchesBegan:touches withEvent:event]; self.state = UIGestureRecognizerStateRecognized; } -(void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event { [super touchesEnded:touches withEvent:event]; self.state = UIGestureRecognizerStateRecognized; } @end Then bind the recognizer to a view in the view controller: UIGestureRecognizer *recognizer = [[MouseGestureRecognizer alloc] initWithTarget:self action:@selector(handleGesture:)]; [self.view addGestureRecognizer:recognizer]; When I click the mouse in the view, touchesBegan is called, but touchesEnded is never called. Is it because of the UIGestureRecognizerStateRecognized? Any advice will be appreciated! Thanks!
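
    It does look like the state is the problem (as far as I can tell): UIGestureRecognizerStateRecognized is the terminal state of a discrete recognizer, so once touchesBegan sets it the recognizer is finished for that gesture and touchesEnded no longer matters. A sketch using the continuous states instead, so the action fires on both ends of the press:

        - (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
            [super touchesBegan:touches withEvent:event];
            self.state = UIGestureRecognizerStateBegan;   // handleGesture: fires ("mouse down")
        }

        - (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event {
            [super touchesEnded:touches withEvent:event];
            self.state = UIGestureRecognizerStateEnded;   // handleGesture: fires again ("mouse up")
        }

    The handler can then check recognizer.state to tell the two apart. A stock UILongPressGestureRecognizer with minimumPressDuration set to 0 behaves much the same way without subclassing.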

    Read the article

  • How to drag only one image with the iPhone SDK

    - by loka
    Hi! I want to create a little app that shows two images, and I want to make only the top image draggable. After research, I found this solution: -(void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event { UITouch *touch = [[event allTouches] anyObject]; image.alpha = 0.7; if([touch view] == image){ CGPoint location = [touch locationInView:self.view]; image.center = location; } } It works, but the problem is that the image is draggable from its center, and I don't want that. So I found another solution: - (void) touchesBegan:(NSSet*)touches withEvent:(UIEvent*)event { // Retrieve the touch point CGPoint pt = [[touches anyObject] locationInView:self.view]; startLocation = pt; [[self view] bringSubviewToFront:self.view]; } - (void) touchesMoved:(NSSet*)touches withEvent:(UIEvent*)event { // Move relative to the original touch point CGPoint pt = [[touches anyObject] locationInView:self.view]; frame = [self.view frame]; frame.origin.x += pt.x - startLocation.x; frame.origin.y += pt.y - startLocation.y; [self.view setFrame:frame]; } It works very well, but when I add another image, all the images in the view are draggable at the same time. I'm a beginner in iPhone programming and I have no idea how to make only the top image draggable. Thank you in advance for your help!
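
    A sketch of one approach (assuming each draggable picture is a UIImageView with userInteractionEnabled set to YES, so [touch view] identifies it): move only the view the touch started in, offset by where the finger first grabbed it, instead of moving self.view:

        - (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
            UITouch *touch = [touches anyObject];
            if ([touch view] == image) {
                startLocation = [touch locationInView:image];   // grab offset inside the image
            }
        }

        - (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
            UITouch *touch = [touches anyObject];
            if ([touch view] == image) {
                CGPoint pt = [touch locationInView:self.view];
                image.frame = CGRectMake(pt.x - startLocation.x,
                                         pt.y - startLocation.y,
                                         image.frame.size.width,
                                         image.frame.size.height);
            }
        }

    Because each handler checks [touch view], a second image gets its own check (or its own subclass) and the two no longer move together.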

    Read the article

  • How to draw a line on a touch event?

    - by AJPatel
    Hey, I'm a beginner at Objective-C. Please help me: I wrote the following code, but it does not work. -(void) touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event { UITouch *touch = [touches anyObject]; if ([touch view] == self.view) { CGPoint location = [touch locationInView:self.view]; loc1 = location; CGContextMoveToPoint(context, location.x, location.y); NSLog(@"x:%f y:%f At Touch Began", loc1.x, loc1.y); } } - (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event { UITouch *touch = [touches anyObject]; if ([touch view] == self.view) { CGPoint location = [touch locationInView:self.view]; CGContextMoveToPoint(context, loc1.x, loc1.y); NSLog(@"x:%f y:%f At Touch Move", loc1.x, loc1.y); CGContextAddLineToPoint(context, location.x, location.y); NSLog(@"x:%f y:%f", location.x, location.y); } }
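
    The main problem (an assumption from the snippet): context is not a valid current context inside the touch handlers, so the CGContext calls are lost. The usual structure is to record the points in the touch methods and do all drawing inside drawRect: of a UIView subclass, where UIGraphicsGetCurrentContext() is valid. A sketch, with loc1 and loc2 as CGPoint ivars:

        - (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
            loc1 = [[touches anyObject] locationInView:self];
        }

        - (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
            loc2 = [[touches anyObject] locationInView:self];
            [self setNeedsDisplay];   // schedule drawRect:
        }

        - (void)drawRect:(CGRect)rect {
            CGContextRef ctx = UIGraphicsGetCurrentContext();   // valid only here
            CGContextSetLineWidth(ctx, 2.0);
            CGContextSetStrokeColorWithColor(ctx, [UIColor blackColor].CGColor);
            CGContextMoveToPoint(ctx, loc1.x, loc1.y);
            CGContextAddLineToPoint(ctx, loc2.x, loc2.y);
            CGContextStrokePath(ctx);
        }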

    Read the article

  • Increase animation speed according to the swipe speed in unity for Android

    - by rohit
    I have an animation done in Maya and brought the FBX file into Unity. Here is my code to calculate the speed of the swipe: Vector2 speedMeasuredInScreenWidthsPerSecond = (Input.touches[0].deltaPosition / Screen.width) * Input.touches[0].deltaTime; Now I wanted to take speedMeasuredInScreenWidthsPerSecond and use it to increase the animation speed accordingly, like this: animation["gmeChaAnimMiddle"].speed = Mathf.Round(speedMeasuredInScreenWidthsPerSecond); However, this results in an error saying I need to convert Vector2 to float. How do I overcome it?

    Read the article

  • Visual Studio 2010 Best Practices

    - by Etienne Tremblay
    I’d like to thank Packt for providing me with a review version of Visual Studio 2010 Best Practices eBook. In fairness I also know the author Peter having seen him speak at DevTeach on many occasions.  I started by looking at the table of content to see what this book was about, knowing that “best practices” is a real misnomer I wanted to see what they were.  I really like the fact that he starts the book by really saying they are not really best practices but actually recommend practices.  As a Team Foundation Server user I found that chapter 2 was more for the open source crowd and I really skimmed it.  The portion on Branching was well documented, although I’m not a fan of the testing branch myself, but the rest was right on. The section on merge remote changes (bring the outside to you) paradigm is really important and was touched on. Chapter 3 has good solid practices on low level constructs like generics and exceptions. Chapter 4 dives into architectural practices like decoupling, distributed architecture and data based architecture.  DTOs and ORMs are touched on briefly as is NoSQL. Chapter 5 is about deployment and is really a great primer on all the “packaging” technologies like Visual Studio Setup and Deployment (depreciated in 2012), Click Once and WIX the major player outside of commercial solutions.  This is a nice section on how to move from VSSD to WIX this is going to be important in the coming years due to the fact that VS 2012 doesn’t support VSSD. In chapter 6 we dive into automated testing practices, including test coverage, mocking, TDD, SpecDD and Continuous Testing.  Peter covers all those concepts really nicely albeit succinctly. Being a book on recommended practices I find this is really good. I really enjoyed chapter 7 that gave me a lot of great tips to enhance my Visual Studio “experience”.  Tips on organizing projects where good.  Also even though I knew about configurations I like that he put that in there so you can move all your settings to another machine, a lot of people don’t know about that. Quick find and Resharper are also briefly covered.  He touches on macros (depreciated in 2012).  Finally he touches on Continuous Integration a very important concept in today’s ALM landscape. Chapter 8 is all about Parallelization, threads, Async, division of labor, reactive extensions.  All those concepts are touched on and again generalized approaches to those modern problems are giving.       Chapter 9 goes into distributed apps, the most used and accepted practice in the industry for .NET projects the chapter tackles concepts like Scalability, Messaging and Cloud (the flavor of the month of distributed apps, although I think this will stick ;-)).  He also looks a protocols TCP/UDP and how to debug distributed apps.  He touches on logging and health monitoring. Chapter 10 tackles recommended practices for web services starting with implementing WCF services, which goes into all sort of goodness like how to host in IIS or self-host.  How to manual test WCF services, also a section on authentication and authorization.  ASP.NET Web services are also touched on in that chapter All in all a good read, nice tips and accepted practices.  I like the conciseness of the subjects and Peter touches on a lot of things in this book and uses a lot of the current technologies flavors to explain the concepts.   Cheers, ET

    Read the article

  • touchesBegin & touchesMove Xcode Obj C Question

    - by AndrewDK
    So I have my app working well when you press and drag along. I also have the UIButtons set to Touch Down in Interface Builder. As it stands, when you drag you need to start the drag from outside the UIButton; you cannot press on one UIButton and drag to the other. TOUCHES MOVED: Code: -(void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event { UITouch *touch = [[event touchesForView:self.view] anyObject]; CGPoint location = [touch locationInView:touch.view]; if(CGRectContainsPoint(oneButton.frame, location)) { if (!oneButton.isHighlighted){ [self oneFunction]; [oneButton setHighlighted:YES]; } }else { [oneButton setHighlighted:NO]; } // if(CGRectContainsPoint(twoButton.frame, location)) { if (!twoButton.isHighlighted){ [self twoFunction]; [twoButton setHighlighted:YES]; } }else { [twoButton setHighlighted:NO]; } } TOUCHES BEGAN: Code: - (void) touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event { UITouch *touch = [[event touchesForView:self.view] anyObject]; CGPoint location = [touch locationInView:touch.view]; if(CGRectContainsPoint(oneButton.frame, location)) { [self oneFunction]; [oneButton setHighlighted:YES]; } if(CGRectContainsPoint(twoButton.frame, location)) { [self twoFunction]; [twoButton setHighlighted:YES]; } } I want to be able to press any of the buttons to fire its function, and also be able to drag from one button onto the other and fire that function. So basically, just being able to press a button and slide your finger over to activate the other button, without having to press and slide from outside of the button. I think I'm close, just need a bit of help. Hope that's clear enough. Thanks.
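
    A sketch of one approach (a workaround, not the only option): a touch that starts on a real UIButton is swallowed by the button, so the view never sees it. Turning the buttons into passive images by disabling their interaction lets the parent view's existing hit-testing drive everything:

        // In viewDidLoad: the parent view now receives every touch,
        // including ones that start on top of a button.
        oneButton.userInteractionEnabled = NO;
        twoButton.userInteractionEnabled = NO;

    With that, the existing touchesBegan/touchesMoved code fires for touches that begin on a button, and the CGRectContainsPoint checks already decide which button is active as the finger slides.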

    Read the article

  • OpenGL iPhone SDK: How to tell if you're touching an object on screen?

    - by TheGambler
    First is my touchesBegan method, and then the struct that stores the values for my object. I have an array of these objects, and I'm trying to figure out, when I touch the screen, whether I'm touching an object on the screen. I don't know if I need to do this by iterating through all my objects and figuring out that way if I'm touching one, or maybe there is an easier, more efficient way. How is this usually handled? -(void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event{ [super touchesBegan:touches withEvent:event]; UITouch* touch = ([touches count] == 1 ? [touches anyObject] : nil); CGRect bounds = [self bounds]; CGPoint location = [touch locationInView:self]; location.y = bounds.size.height - location.y; float xTouched = location.x/20 - 8 + ((int)location.x % 20)/20; float yTouched = location.y/20 - 12 + ((int)location.y % 20)/20; } typedef struct object_tag // Create A Structure Called Object { int tex; // Integer Used To Select Our Texture float x; // X Position float y; // Y Position float z; // Z Position float yi; // Y Increase Speed (Fall Speed) float spinz; // Z Axis Spin float spinzi; // Z Axis Spin Speed float flap; // Flapping Triangles :) float fi; // Flap Direction (Increase Value) } object;
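
    For a handful of objects, iterating is the usual approach and is plenty fast; fancier spatial indexing only pays off with many objects. A sketch using a squared-distance test in the same world units as xTouched/yTouched (the objects array, numObjects, and kHitRadius are hypothetical):

        object *hit = NULL;
        for (int i = 0; i < numObjects; i++) {
            float dx = objects[i].x - xTouched;
            float dy = objects[i].y - yTouched;
            if (dx*dx + dy*dy < kHitRadius * kHitRadius) {   // squared distance: no sqrt needed
                hit = &objects[i];
                break;
            }
        }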

    Read the article

  • Can iPad/iPhone Touch Points be Wrong Due to Calibration?

    - by Kristopher Johnson
    I have an iPad application that uses the whole screen (that is, UIStatusBarHidden is set true in the Info.plist file). The main window's frame is set to (0, 0, 768, 1024), as is the main view in that frame. The main view has multitouch enabled. The view has code to handle touches: - (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event { for (UITouch *touch in touches) { CGPoint location = [touch locationInView:nil]; NSLog(@"touchesMoved at location %@", NSStringFromCGPoint(location)); } } When I run the app in the simulator, it works pretty much as expected. As I move the mouse from one edge of the screen to the other, reported X values go from 0 to 767. Reported Y values go from 20 to 1023, but it is a known issue that the simulator doesn't report touches in the top 20 pixels of the screen, even when there is no status bar. Here's what's weird: When I run the app on an actual iPad, the X values go from 0 to 767 as expected, but reported Y values go from -6 to 1017. The fact that it seems to work properly on the simulator leads me to suspect that real devices' touchscreens are not perfectly calibrated, and mine is simply reporting values six pixels too low. Can anyone verify that this is the case? Otherwise, is there anything else that could account for the Y values being six pixels off from what I expect? (In a few days, I should have a second iPad, so I can test this with another device and compare the results.)

    Read the article

  • How to add images while moving finger across UIView?

    - by Luke Irvin
    I am having issues adding an image across a view while moving a finger across the screen. Currently it adds the image multiple times, but squeezes them too close together and doesn't really follow my touch. Here is what I am trying: -(void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event { UITouch * touch = [touches anyObject]; touchPoint = [touch locationInView:imageView]; if (touchPoint.x > 0 && touchPoint.y > 0) { _aImageView = [[UIImageView alloc] initWithImage:aImage]; _aImageView.multipleTouchEnabled = YES; _aImageView.userInteractionEnabled = YES; [_aImageView setFrame:CGRectMake(touchPoint.x, touchPoint.y, 80.0, 80.0)]; [imageView addSubview:_aImageView]; [_aImageView release]; } } -(void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event { [self touchesBegan:touches withEvent:event]; } Any suggestions are much appreciated. EDIT: What I want: After taking or choosing an image, the user can then select another image from a list. I want the user to touch and drag a finger across the view, and the selected image should appear along the drag path, without overlapping, at each location.
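
    A sketch of one fix (lastStampPoint is a hypothetical CGPoint ivar; the 80-point spacing matches the image size): touchesMoved fires many times per point of movement, so only stamp a new image once the finger has moved a full image-width from the previous stamp, and centre each stamp on the touch:

        - (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
            CGPoint p = [[touches anyObject] locationInView:imageView];
            CGFloat dx = p.x - lastStampPoint.x;
            CGFloat dy = p.y - lastStampPoint.y;
            if (dx*dx + dy*dy < 80.0 * 80.0) return;   // too close to the last stamp
            UIImageView *stamp = [[UIImageView alloc] initWithImage:aImage];
            stamp.frame = CGRectMake(p.x - 40.0, p.y - 40.0, 80.0, 80.0);  // centred on the finger
            [imageView addSubview:stamp];
            [stamp release];
            lastStampPoint = p;
        }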

    Read the article
