Search Results

Search found 558 results on 23 pages for 'touches'.

Page 8/23 | < Previous Page | 4 5 6 7 8 9 10 11 12 13 14 15  | Next Page >

  • C# Programming Tips and Tricks

    Volume 2 of tips and tricks, touching on some of the new features of C# 4.0 along with other useful techniques. It also mentions some tools that are worth knowing.

    Read the article

  • How to cleanly add after-the-fact commits from the same feature into git tree

    - by Dennis
    I am one of two developers on a system. I make most of the commits at this time period. My current git workflow is as such: there is master branch only (no develop/release) I make a new branch when I want to do a feature, do lots of commits, and then when I'm done, I merge that branch back into master, and usually push it to remote. ...except, I am usually not done. I often come back to alter one thing or another and every time I think it is done, but it can be 3-4 commits before I am really done and move onto something else. Problem The problem I have now is that .. my feature branch tree is merged and pushed into master and remote master, and then I realize that I am not really done with that feature, as in I have finishing touches I want to add, where finishing touches may be cosmetic only, or may be significant, but they still belong to that one feature I just worked on. What I do now Currently, when I have extra after-the-fact commits like this, I solve this problem by rolling back my merge, and re-merging my feature branch into master with my new commits, and I do that so that git tree looks clean. One clean feature branch branched out of master and merged back into it. I then push --force my changes to origin, since my origin doesn't see much traffic at the moment, so I can almost count that things will be safe, or I can even talk to other dev if I have to coordinate. But I know it is not a good way to do this in general, as it rewrites what others may have already pulled, causing potential issues. And it did happen even with my dev, where git had to do an extra weird merge when our trees diverged. Other ways to solve this which I deem to be not so great Next best way is to just make those extra commits to the master branch directly, be it fast-forward merge, or not. It doesn't make the tree look as pretty as in my current way I'm solving this, but then it's not rewriting history. Yet another way is to wait. Maybe wait 24 hours and not push things to origin. That way I can rewrite things as I see fit. The con of this approach is time wasted waiting, when people may be waiting for a fix now. Yet another way is to make a "new" feature branch every time I realize I need to fix something extra. I may end up with things like feature-branch feature-branch-html-fix, feature-branch-checkbox-fix, and so on, kind of polluting the git tree somewhat. Is there a way to manage what I am trying to do without the drawbacks I described? I'm going for clean-looking history here, but maybe I need to drop this goal, if technically it is not a possibility.

    Read the article

  • How to attach an object to a rotating circle in box2d cocos2d?

    - by armands
    I am trying to make an object attach, at the collision point, to a circle that is rotating, but it needs to attach at a fixed point on the player. For example, the player is moving back and forth, and when the user touches the screen the player jumps up; what I need is that when the player collides with the circle, it attaches its legs to it and continues rotating with the circle. How can I make this kind of collision joint in cocos2d with Box2D?
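
    A minimal sketch (not from the original question) of one common approach: create a b2RevoluteJoint anchored at the contact point. Box2D does not allow creating joints inside a collision callback, so the contact point is remembered and the joint is created after the world step. The method and variable names below (world, playerBody, circleBody, pendingAnchor, jointPending) are assumptions for illustration.

        // In the scene's .mm file (Objective-C++), assuming ivars:
        //   b2World *world; b2Body *playerBody; b2Body *circleBody;
        //   b2Vec2 pendingAnchor; BOOL jointPending;   // hypothetical names

        // Called from the contact listener when the player touches the circle.
        // Only record the contact point here; joints must not be created
        // while the world is stepping.
        - (void)playerDidTouchCircleAt:(b2Vec2)contactPoint {
            pendingAnchor = contactPoint;   // world coordinates
            jointPending = YES;
        }

        // Called once per frame, after world->Step(...).
        - (void)createPendingJoint {
            if (!jointPending) return;
            jointPending = NO;

            b2RevoluteJointDef jd;
            // Pins both bodies together at the contact point (the player's
            // "legs"), so the player keeps rotating with the circle.
            jd.Initialize(circleBody, playerBody, pendingAnchor);
            world->CreateJoint(&jd);
        }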

    Read the article

  • How to Smooth the drawing Stroke?

    - by user1852420
    I am creating a drawing app. I can undo and apply colors, but when I draw with my finger the stroke is not smooth and has hard edge lines. Here is my code, with which I can paint on a view, undo, and change the color and the opacity.

    stroke.h

        #import <UIKit/UIKit.h>

        @interface stroke : UIView {
            NSMutableArray *strokeArray;
            UIColor *strokeColor;
            int strokeSize;
            float strokeAlpha;
            int strokeAlpha2;
            IBOutlet UISlider *slides;
            float red;
            float green;
            float blue;
            CGPoint mid1;
            CGPoint mid2;
            CGPoint endingPoint, previousPoint1, previousPoint2;
            CGPoint currentTouch;
        }

        @property (nonatomic, retain) UIColor *strokeColor;
        @property (nonatomic) int strokeSize;
        @property (nonatomic, retain) NSMutableArray *strokeArray;

        - (IBAction)changeAlphaValue;
        - (void)loadSLider;
        - (void)blueColor;
        - (void)darkvioletColor;
        - (void)violetColor;
        - (void)pinkColor;
        - (void)darkbrownColor;
        - (void)redColor;
        - (void)magentaRedColor;
        - (void)lightBrownColor;
        - (void)lightOrangeColor;
        - (void)OrangeColor;
        - (void)YellowColor;
        - (void)greenColor;
        - (void)lightYellowColor;
        - (void)darkGreenColor;
        - (void)TurquioseColor;
        - (void)PaleTurquioseColor;
        - (void)skyBlueColor;
        - (void)whiteColor;
        - (void)DirtyWhiteColor;
        - (void)SilverColor;
        - (void)LightGrayColor;
        - (void)GrayColor;
        - (void)LightBlackColor;
        - (void)BlackColor;

        @end

    stroke.m

        #import "stroke.h"

        @implementation stroke

        @synthesize strokeColor;
        @synthesize strokeSize;
        @synthesize strokeArray;

        - (void)awakeFromNib {
            self.strokeArray = [[NSMutableArray alloc] init];
            self.strokeColor = [UIColor colorWithRed:0 green:0 blue:232 alpha:1];
            self.strokeSize = 3;
        }

        - (void)drawRect:(CGRect)rect {
            NSMutableArray *stroke;
            for (stroke in strokeArray) {
                CGContextRef contextRef = UIGraphicsGetCurrentContext();
                CGContextSetLineWidth(contextRef, [[stroke objectAtIndex:1] intValue]);
                CGFloat *color = CGColorGetComponents([[stroke objectAtIndex:2] CGColor]);
                CGContextSetRGBStrokeColor(contextRef, color[0], color[1], color[2], color[3]);
                CGContextBeginPath(contextRef);
                CGPoint points[[stroke count]];
                for (NSUInteger i = 3; i < [stroke count]; i++) {
                    points[i-3] = [[stroke objectAtIndex:i] CGPointValue];
                }
                CGContextAddLines(contextRef, points, [stroke count]-3);
                CGContextStrokePath(contextRef);
            }
        }

        - (void)loadSLider {
        }

        - (IBAction)changeAlphaValue {
            strokeAlpha2 = ((int)slides.value);
        }

        - (void)blueColor          { red = 0/255.0;   green = 0/255.0;   blue = 255/255.0; }
        - (void)darkvioletColor    { red = 75/255.0;  green = 0/255.0;   blue = 130/255.0; }
        - (void)violetColor        { red = 128/255.0; green = 0/255.0;   blue = 128/255.0; }
        - (void)pinkColor          { red = 255/255.0; green = 0/255.0;   blue = 255/255.0; }
        - (void)darkbrownColor     { red = 0.200; green = 0.0;   blue = 0.0; }
        - (void)redColor           { red = 255/255.0; green = 0/255.0; blue = 0/255.0; }
        - (void)magentaRedColor    { red = 0.350; green = 0.0;   blue = 0.0; }
        - (void)lightBrownColor    { red = 0.480; green = 0.0;   blue = 0.0; }
        - (void)lightOrangeColor   { red = 0.600; green = 0.200; blue = 0.0; }
        - (void)OrangeColor        { red = 1.0;   green = 0.300; blue = 0.0; }
        - (void)YellowColor        { red = 0.950; green = 0.450; blue = 0.0; }
        - (void)greenColor         { red = 0.0;   green = 1.0;   blue = 0.0; }
        - (void)lightYellowColor   { red = 1.0;   green = 1.0;   blue = 0.0; }
        - (void)darkGreenColor     { red = 0.0;   green = 0.500; blue = 0.0; }
        - (void)TurquioseColor     { red = 0.0;   green = 0.700; blue = 0.200; }
        - (void)PaleTurquioseColor { red = 0.0;   green = 0.700; blue = 0.600; }
        - (void)skyBlueColor       { red = 0.0;   green = 0.400; blue = 0.800; }
        - (void)whiteColor         { red = 1.0;   green = 1.0;   blue = 1.0; }
        - (void)DirtyWhiteColor    { red = 0.800; green = 0.800; blue = 0.800; }
        - (void)SilverColor        { red = 0.600; green = 0.600; blue = 0.600; }
        - (void)LightGrayColor     { red = 0.500; green = 0.500; blue = 0.500; }
        - (void)GrayColor          { red = 0.300; green = 0.300; blue = 0.300; }
        - (void)LightBlackColor    { red = 0.150; green = 0.150; blue = 0.150; }
        - (void)BlackColor         { red = 0.0;   green = 0.0;   blue = 0.0; }

        - (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
            UITouch *touch;
            NSEnumerator *counter = [touches objectEnumerator];
            while ((touch = (UITouch *)[counter nextObject])) {
                switch (strokeAlpha2) {
                    case 1:  strokeAlpha = .1; break;
                    case 2:  strokeAlpha = .2; break;
                    case 3:  strokeAlpha = .3; break;
                    case 4:  strokeAlpha = .4; break;
                    case 5:  strokeAlpha = .5; break;
                    case 6:  strokeAlpha = .6; break;
                    case 7:  strokeAlpha = .7; break;
                    case 8:  strokeAlpha = .8; break;
                    case 9:  strokeAlpha = .9; break;
                    case 10: strokeAlpha = 1;  break;
                    default: strokeAlpha = 1;  break;
                }
                self.strokeColor = [UIColor colorWithRed:red green:green blue:blue alpha:strokeAlpha];
                NSValue *touchPos = [NSValue valueWithCGPoint:[touch locationInView:self]];
                UIColor *color = [UIColor colorWithCGColor:strokeColor.CGColor];
                NSNumber *size = [NSNumber numberWithInt:strokeSize];
                NSMutableArray *stroke = [NSMutableArray arrayWithObjects:touch, size, color, touchPos, nil];
                [strokeArray addObject:stroke];
            }
        }

        - (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
            UITouch *touch;
            NSEnumerator *counter = [touches objectEnumerator];
            while ((touch = (UITouch *)[counter nextObject])) {
                NSMutableArray *stroke;
                for (stroke in strokeArray) {
                    if ([stroke objectAtIndex:0] == touch) {
                        [stroke addObject:[NSValue valueWithCGPoint:[touch locationInView:self]]];
                    }
                    [self setNeedsDisplay];
                }
            }
        }

        @end
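
    A minimal sketch (not from the original question) of the usual smoothing technique for this kind of point-by-point stroke: instead of connecting raw touch points with straight segments (CGContextAddLines), draw quadratic Bezier curves through the midpoints of successive touch locations, which rounds off the visible corners. The helper below is hypothetical and only shows the path-building step.

        // Hypothetical helper: builds a smoothed path from the recorded stroke
        // points (the CGPoint values stored from index 3 onward in the
        // question's stroke array).
        static void AddSmoothedStroke(CGContextRef ctx, CGPoint *pts, NSUInteger count) {
            if (count < 3) {
                CGContextAddLines(ctx, pts, count);
                return;
            }
            CGContextMoveToPoint(ctx, pts[0].x, pts[0].y);
            for (NSUInteger i = 1; i + 1 < count; i++) {
                // Midpoint between this point and the next one.
                CGPoint mid = CGPointMake((pts[i].x + pts[i+1].x) / 2.0,
                                          (pts[i].y + pts[i+1].y) / 2.0);
                // Curve toward the midpoint, using the recorded point as the
                // control point; consecutive curves join with matching
                // tangents, so the hard corners disappear.
                CGContextAddQuadCurveToPoint(ctx, pts[i].x, pts[i].y, mid.x, mid.y);
            }
            CGContextAddLineToPoint(ctx, pts[count-1].x, pts[count-1].y);
        }

        // In drawRect:, this would replace the CGContextAddLines(...) call:
        //   AddSmoothedStroke(contextRef, points, [stroke count] - 3);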

    Read the article

  • touchend event doesn't work on Android

    - by Protos
    Hi, I've just started looking at some basic mobile web development on Android and am writing a test script to investigate the touch events. I've run the following code in the Android emulator, and the touchend event never gets fired. Can anyone tell me why? I've tried it in three versions of the emulator (1.6, 2.1 and 2.2) and all three behave the same way. Thanks in advance for any help you can give me. Cheers, Colm. EDIT - I've also tried this using the XUI framework and have the same problem, so I'm guessing I have a fundamental misunderstanding of how this stuff works.

        Map Test
        <meta name="description" content="" />
        <meta name="keywords" content="" />
        <meta name="language" content="english" />
        <meta name="viewport" content="minimum-scale=1.0, width=device-width, height=device-height, user-scalable=no">
        <script type="text/javascript">
            window.onload = function() {
                document.body.appendChild(
                    document.createTextNode("w: " + screen.width + " x " + "h : " + screen.height) );
                attachTouchEvents();
            }

            function attachTouchEvents() {
                console = document.getElementById("console");
                var map = document.getElementById("map");

                map.addEventListener('touchstart', function (event) {
                    event.preventDefault();
                    var touch = event.touches[0];
                    document.getElementById("touchCoord").innerHTML = "S : " + touch.pageX + " " + touch.pageY;
                    document.getElementById("touchEvent").innerHTML = "Touch Start";
                }, false);

                map.addEventListener('touchmove', function (event) {
                    event.preventDefault();
                    var touch = event.touches[0];
                    document.getElementById("touchCoord").innerHTML = "M : " + touch.pageX + " " + touch.pageY;
                    document.getElementById("touchEvent").innerHTML = "Touch Move";
                }, false);

                map.addEventListener('touchend', function (event) {
                    var touch = event.touches[0];
                    document.getElementById("touchCoord").innerHTML = "E : " + touch.pageX + " " + touch.pageY;
                    document.getElementById("touchEvent").innerHTML = "Touch End";
                    event.preventDefault();
                }, false);

                console.innerHTML = "event attached";
            }
        </script>
        <style type="text/css">
            html, body { height:100%; width:100%; margin: 0; background-color:red; }
            #map { height: 300px; width: 300px; background-color:yellow; }
        </style>
        </head>
        <body>
            <div id="map"></div>
            <div id="touchCoord">Touch Coords</div>
            <div id="touchEvent">Touch Evnt</div>
            <div id="console">Console</div>
        </body>

    Read the article

  • Bounding Box Collision Glitching Problem (Pygame)

    - by Ericson Willians
    So far the "Bounding Box" method is the only one that I know. It's efficient enough to deal with simple games. Nevertheless, the game I'm developing is not that simple anymore, and for that reason I've made a simplified example of the problem. (It's worth noticing that I don't have rotating sprites in my game or anything like that. After showing the code, I'll explain better.) Here's the whole code:

        from pygame import *

        DONE = False
        screen = display.set_mode((1024,768))

        class Thing():
            def __init__(self,x,y,w,h,s,c):
                self.x = x
                self.y = y
                self.w = w
                self.h = h
                self.s = s
                self.sur = Surface((64,48))
                draw.rect(self.sur,c,(self.x,self.y,w,h),1)
                self.sur.fill(c)
            def draw(self):
                screen.blit(self.sur,(self.x,self.y))
            def move(self,x):
                if key.get_pressed()[K_w] or key.get_pressed()[K_UP]:
                    if x == 1:
                        self.y -= self.s
                    else:
                        self.y += self.s
                if key.get_pressed()[K_s] or key.get_pressed()[K_DOWN]:
                    if x == 1:
                        self.y += self.s
                    else:
                        self.y -= self.s
                if key.get_pressed()[K_a] or key.get_pressed()[K_LEFT]:
                    if x == 1:
                        self.x -= self.s
                    else:
                        self.x += self.s
                if key.get_pressed()[K_d] or key.get_pressed()[K_RIGHT]:
                    if x == 1:
                        self.x += self.s
                    else:
                        self.x -= self.s
            def warp(self):
                if self.y < -48:
                    self.y = 768
                if self.y > 768 + 48:
                    self.y = 0
                if self.x < -64:
                    self.x = 1024 + 64
                if self.x > 1024 + 64:
                    self.x = -64

        r1 = Thing(0,0,64,48,1,(0,255,0))
        r2 = Thing(6*64,6*48,64,48,1,(255,0,0))

        while not DONE:
            screen.fill((0,0,0))
            r2.draw()
            r1.draw()
            # If not intersecting, then moves, else, it moves in the opposite direction.
            if not ((((r1.x + r1.w) > (r2.x - r1.s)) and (r1.x < ((r2.x + r2.w) + r1.s))) and (((r1.y + r1.h) > (r2.y - r1.s)) and (r1.y < ((r2.y + r2.h) + r1.s)))):
                r1.move(1)
            else:
                r1.move(0)
            r1.warp()
            if key.get_pressed()[K_ESCAPE]:
                DONE = True
            for ev in event.get():
                if ev.type == QUIT:
                    DONE = True
            display.update()

        quit()

    The problem: in my actual game the grid is fixed and each tile is 64 by 48 pixels. I know how to deal with collision perfectly if I move by that step size; but obviously then the player moves really fast. In the example, the collision is detected pretty well (just as in many examples throughout the internet). The problem is that if I let the player move only WHEN NOT intersecting, then, once it touches the obstacle, it cannot move anymore. Given that problem, I began switching the directions, but then, when it touches the obstacle and I press the opposite key, it "glitches through". My actual game has many walls, the player will touch them many times, and I can't afford letting the player pass through them. The code problem illustrated: when the player goes towards the wall, it's fine; when the player goes towards the wall and I press the opposite direction, it glitches through. Here is the logic I designed before implementing it: I don't know any other method, and I really just want to have walls fixed in a grid, move by 1 or 2 or 3 pixels (slowly), and have perfect collision without glitching possibilities. What do you suggest?

    Read the article

  • iPhone SDK UIScrollView doesn't get touch events after moving it

    - by newbie
    Hi! I'm subclassing UIScrollView, and at the start I fill this ShowsScrollView with some items. After filling it, I set the frame and contentSize of the ShowsScrollView. Everything works fine so far: I get touch events and scrolling works. But after rotation to landscape, I change the x and y coordinates of the ShowsScrollView frame to move it from the bottom to the top right corner. Then I resize it (change the width and height of the frame) and reorder the items in the scroll view. At the end I set a new contentSize. Now I get touch events only on the first 1/4 of the scroll view, and scrolling also works only on 1/4 of the scroll view, although it scrolls all items in the scroll view. After all these actions I write a log:

        NSLog(@"ViewController: setLandscape finished: size: %f, %f content: %f,%f",
              scrollView.frame.size.width, scrollView.frame.size.height,
              scrollView.contentSize.width, scrollView.contentSize.height);

    The values are correct: ViewController: setLandscape finished: size: 390.000000, 723.000000 content: 390.000000,950.000000. On rotating back to portrait, I move and resize everything back and it works fine. Please help!

    Read the article

  • How do I check to see if my subview is being touched?

    - by Amy
    I went through this tutorial about how to animate sprites: http://icodeblog.com/2009/07/24/iphone-programming-tutorial-animating-a-game-sprite/ I've been attempting to expand on the tutorial by trying to make Ryu animate only when he is touched. However, the touch is not even being registered, and I believe it has something to do with it being a subview. Here is my code:

        - (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
            UITouch *touch = [touches anyObject];
            if ([touch view] == ryuView) {
                NSLog(@"Touch");
            } else {
                NSLog(@"No touch");
            }
        }

        - (void)ryuAnims {
            NSArray *imageArray = [[NSArray alloc] initWithObjects:
                [UIImage imageNamed:@"1.png"], [UIImage imageNamed:@"2.png"], [UIImage imageNamed:@"3.png"],
                [UIImage imageNamed:@"4.png"], [UIImage imageNamed:@"5.png"], [UIImage imageNamed:@"6.png"],
                [UIImage imageNamed:@"7.png"], [UIImage imageNamed:@"8.png"], [UIImage imageNamed:@"9.png"],
                [UIImage imageNamed:@"10.png"], [UIImage imageNamed:@"11.png"], [UIImage imageNamed:@"12.png"],
                nil];
            ryuView.animationImages = imageArray;
            ryuView.animationDuration = 1.1;
            [ryuView startAnimating];
        }

        - (void)viewDidLoad {
            [super viewDidLoad];
            UIImageView *image = [[UIImageView alloc] initWithFrame:CGRectMake(100, 125, 150, 130)];
            ryuView = image;
            ryuView.image = [UIImage imageNamed:@"1.png"];
            ryuView.contentMode = UIViewContentModeBottomLeft;
            [self.view addSubview:ryuView];
            [image release];
        }

    This code compiles fine; however, when touching or clicking on Ryu, nothing happens. I've also tried if ([touch view] == ryuView.image), but that gives me this error: "Comparison of distinct Objective-C type 'struct UIImage *' and 'struct UIView *' lacks a cast." What am I doing wrong?
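
    A minimal sketch (not from the original question) of two things commonly checked in this situation: UIImageView has userInteractionEnabled set to NO by default, so touches pass through it, and the touch can be hit-tested against the image view's frame instead of relying on [touch view].

        // In viewDidLoad, after creating the image view (ryuView as in the question):
        ryuView.userInteractionEnabled = YES;   // UIImageView defaults to NO

        // In the view controller's touch handler:
        - (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
            UITouch *touch = [touches anyObject];
            // Test against the subview's frame in the parent view's coordinates,
            // which works even when [touch view] reports the parent view.
            CGPoint point = [touch locationInView:self.view];
            if (CGRectContainsPoint(ryuView.frame, point)) {
                [self ryuAnims];   // start the animation only when Ryu is tapped
            }
        }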

    Read the article

  • UIWebView in custom UITableViewCell

    - by mlecho
    I am not sure if this is a good route to go... I have some data I receive from the web which, in turn, populates a table view. The problem is that the text is HTML (p tags, etc). My first thought was to create a UIWebView in the cell and populate it with loadHTMLString. The fact is, it kinda works, but then the cell is no longer the recipient of touches. So, before we get too deep into code: is there a better way to populate the cells than using a UIWebView? It feels like a hack, and I fear that even if it works, Apple would turn it away.

        // from my custom UITableViewCell class:
        - (id)initWithFrame:(CGRect)frame reuseIdentifier:(NSString *)reuseIdentifier {
            if (self = [super initWithFrame:frame reuseIdentifier:reuseIdentifier]) {
                [self setFrame:frame];
                webcell = [[UIWebView alloc] initWithFrame:CGRectMake(0, 0, frame.size.width-20, frame.size.height)];
                [self.contentView addSubview:webcell];
                // block the webview from touches
                UIView *cover = [[UIView alloc] initWithFrame:webcell.frame];
                [self.contentView addSubview:cover];
                [cover release];
            }
            return self;
        }

        - (void)setLabelData:(FeedItem *)feedItem {
            link = feedItem.link;
            NSMutableString *htmlstring = [NSMutableString string];
            [htmlstring appendString:@"<html><head><link rel='stylesheet' href='style.css'/></head><body>"];
            [htmlstring appendFormat:@"<p>%@</p>", feedItem.title];
            [htmlstring appendFormat:@"<p>%@</p>", feedItem.description];
            [htmlstring appendString:@"</body></html>"];
            [webcell loadHTMLString:htmlstring baseURL:[NSURL fileURLWithPath:[[NSBundle mainBundle] bundlePath]]];
        }

    Thanks.
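
    A minimal sketch (not from the original question) of one alternative: since the markup here is only simple p tags, the tags can be stripped and the text shown in the cell's ordinary labels, which keeps the cell fully touchable. The helper name and the regex-based stripping are assumptions about how simple the incoming HTML is.

        // Strip simple tags from an HTML fragment so it can go into a UILabel.
        // Hypothetical helper; assumes simple markup (p tags etc.), not arbitrary documents.
        - (NSString *)plainTextFromHTML:(NSString *)html {
            NSString *stripped = [html stringByReplacingOccurrencesOfString:@"<[^>]+>"
                                                                 withString:@""
                                                                    options:NSRegularExpressionSearch
                                                                      range:NSMakeRange(0, [html length])];
            return [stripped stringByTrimmingCharactersInSet:
                       [NSCharacterSet whitespaceAndNewlineCharacterSet]];
        }

        // In tableView:cellForRowAtIndexPath: the cell's built-in labels stay tappable:
        //   cell.textLabel.text       = [self plainTextFromHTML:feedItem.title];
        //   cell.detailTextLabel.text = [self plainTextFromHTML:feedItem.description];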

    Read the article

  • NSUndoManager grouping problem?

    - by anonymous
    I'm working on a barebones drawing app. I'm attempting to implement undo/redo capability, so I tell the view's undoManager to save the current image before updating the display. This works perfectly (yes, I understand that redrawing/saving the entire view is not incredibly efficient, but I want to solve this problem before attempting to optimize the code). However, as expected, when I undo or redo, only the minute change is reflected. My goal is to have the whole finger stroke undone/redone. To do that, I told the undoManager to beginUndoGrouping in the touchesBegan method and to endUndoGrouping in touchesEnded. That works for a bit, but after drawing a few strokes the app crashes, and gdb exits with EXC_BAD_ACCESS. I'm very grateful for any insight you can give me.

        - (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
            mouseDragged = YES;
            currentPoint = [[touches anyObject] locationInView:self];

            UIGraphicsBeginImageContext(drawingImageView.bounds.size);
            [drawingImageView.image drawInRect:drawingImageView.bounds];
            CGContextRef ctx = UIGraphicsGetCurrentContext();
            CGContextSetLineCap(ctx, kCGLineCapRound);
            CGContextSetLineWidth(ctx, drawingWidth);
            [drawingColor setStroke];
            CGContextBeginPath(ctx);
            CGContextMoveToPoint(ctx, previousPoint.x, previousPoint.y);
            CGContextAddLineToPoint(ctx, currentPoint.x, currentPoint.y);
            CGContextStrokePath(ctx);

            [self.undoManager registerUndoWithTarget:drawingImageView
                                            selector:@selector(setImage:)
                                              object:drawingImageView.image];

            drawingImageView.image = UIGraphicsGetImageFromCurrentImageContext();
            UIGraphicsEndImageContext();
            previousPoint = currentPoint;
        }
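
    A minimal sketch (not from the original question) of an alternative that avoids manual grouping entirely: capture the image once in touchesBegan and register a single undo action for the whole stroke in touchesEnded, so each undo step restores the pre-stroke image. The ivar name imageBeforeStroke is an assumption; the other names are those used in the question.

        // Assumed ivar: UIImage *imageBeforeStroke;

        - (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
            // Remember the canvas as it was before this stroke started.
            imageBeforeStroke = [drawingImageView.image retain];
            previousPoint = [[touches anyObject] locationInView:self];
        }

        - (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event {
            // One registration per finger stroke: undoing rolls the whole
            // stroke back in a single step, no begin/endUndoGrouping needed.
            [self.undoManager registerUndoWithTarget:drawingImageView
                                            selector:@selector(setImage:)
                                              object:imageBeforeStroke];
            [imageBeforeStroke release];
            imageBeforeStroke = nil;
        }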

    Read the article

  • Why I cannot get correct class of a custom class through isKindOfClass?

    - by Anthony Chan
    Hi, I've created a custom class AnimalView, which is a subclass of UIView containing a UILabel and a UIImageView.

        @interface AnimalView : UIView {
            UILabel *nameLabel;
            UIImageView *picture;
        }

    Then I added several AnimalViews onto the ViewController.view. In the touchesBegan:withEvent: method, I wanted to detect whether the touched object is an AnimalView or not. Here is the code for the view controller:

        @implementation AppViewController

        - (void)viewDidLoad {
            UIScrollView *scrollView = [[UIScrollView alloc] initWithFrame:...
            [self.view addSubview:scrollView];
            for (int i = 0; i < 10; i++) {
                AnimalView *newAnimal = [[AnimalView alloc] init];
                // customization of newAnimal
                [scrollView addSubview:newAnimal];
            }
        }

        - (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
            UITouch *touch = [touches anyObject];
            UIView *hitView = touch.view;
            if ([hitView isKindOfClass:[AnimalView class]]) {
                AnimalView *animal = (AnimalView *)hitView;
                [animal doSomething];
            }
        }

    However, nothing happened when I clicked on the animal. When I checked the class of hitView with NSLog(@"%@", [hitView class]), it always shows UIView instead of AnimalView. Is it true that the AnimalView changed to a UIView when it was added onto the ViewController? Is there any way I can get back the original class of a custom class?
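
    A minimal sketch (not from the original question) of one common way to handle this: touch.view may report a subview of the AnimalView (the label or image view) or an enclosing view rather than the AnimalView itself, so walking up the superview chain until an AnimalView is found is a typical check.

        - (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
            UITouch *touch = [touches anyObject];

            // touch.view can be the UILabel/UIImageView inside the AnimalView,
            // so climb the view hierarchy until an AnimalView (or nothing) is found.
            UIView *candidate = touch.view;
            while (candidate != nil && ![candidate isKindOfClass:[AnimalView class]]) {
                candidate = candidate.superview;
            }

            if (candidate != nil) {
                AnimalView *animal = (AnimalView *)candidate;
                [animal doSomething];   // doSomething is the method named in the question
            }
        }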

    Read the article

  • presentModalViewController does not want to work when called from a protocol method

    - by johnbdh
    I have a subview that, when double tapped, calls a protocol method on the subview's parent view controller like this...

        - (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event {
            UITouch *theTouch = [touches anyObject];
            if (theTouch.tapCount == 1) {
            } else if (theTouch.tapCount == 2) {
                if ([self.delegate respondsToSelector:@selector(editEvent:)]) {
                    [self.delegate editEvent:dictionary];
                }
            }
        }

    Here is the protocol method, with the dictionary-consuming code removed...

        - (void)editEvent:(NSDictionary *)dictionary {
            EventEditViewController *eventEditViewController = [[EventEditViewController alloc] initWithNibName:@"EventEditViewController" bundle:nil];
            eventEditViewController.delegate = self;
            navigationController = [[UINavigationController alloc] initWithRootViewController:eventEditViewController];
            [self presentModalViewController:navigationController animated:YES];
            [eventEditViewController release];
        }

    The protocol method is called and runs without any errors, but the modal view does not present itself. I temporarily copied the protocol method's code to an IBAction method for one of the parent view's buttons to isolate it from the subview. When I tap this button, the modal view works fine. Can anyone tell me what I am doing wrong? Why does it work when executed from a button on the parent view, and not from a protocol method called from a subview? Here is what I have tried so far to work around the problem: restarted Xcode and the simulator; ran on the device (iPod touch); presenting eventEditViewController instead of navigationController; using push instead of presentModal; delaying the call to the protocol with performSelector (directly to the protocol method, to another method in the subview which calls the protocol method, and from the protocol method to another method with the presentModal calls); using a timer. I currently have it set up so that the protocol method calls a known working method that presents a different view. Before calling presentModalViewController it pops a UIAlertView, which works every time, but the modal view refuses to display when called via the protocol method. I'm stumped. Perhaps it has something to do with the fact that I am calling the protocol method from a UIView class instead of a UIViewController class. Maybe I need to create a UIViewController for the subview? Thanks, John

    Read the article

  • Touch coordinates in iPhone landscape mode app

    - by gok
    I am trying to make this landscape-only iPhone app. I use only this code for that purpose:

        - (BOOL)shouldAutorotateToInterfaceOrientation:(UIInterfaceOrientation)interfaceOrientation {
            return (interfaceOrientation == UIInterfaceOrientationLandscapeRight);
        }

    However, when I check the clip subviews checkbox in Interface Builder, the view is clipped from the middle. Obviously I also don't receive any touch events from outside the view bounds.

        - (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
            CGPoint fingerPos = [[touches anyObject] locationInView:self.view];
            NSLog(@"%f %f", fingerPos.x, fingerPos.y);
        }

    only prints coordinates between 20 and 320 for X, but Y works fine. When I try to modify the bounds by hand, everything almost works: the view is positioned and shown correctly, the printed coordinates are correct, and I receive touches from all of the screen except between 0 and 20 for X. So the left side of the screen is unresponsive to touch events for only 20 pixels. The code I use to modify the bounds:

        self.view.bounds = CGRectMake(-180.0f, 0.0f, 680.0f, 480.0f);

    What might be causing this? Weird!

    Read the article

  • Dragging an UIView inside UIScrollView

    - by Sergey Mikhanov
    Hello community! I am trying to solve a basic problem with drag and drop on the iPhone. Here's my setup: I have a UIScrollView which has one large content subview (I'm able to scroll and zoom it), and the content subview has several small tiles as subviews that should be dragged around inside it. My UIScrollView subclass has this method:

        - (UIView *)hitTest:(CGPoint)point withEvent:(UIEvent *)event {
            UIView *tile = [contentView pointInsideTiles:[self convertPoint:point toView:contentView] withEvent:event];
            if (tile) {
                return tile;
            } else {
                return [super hitTest:point withEvent:event];
            }
        }

    The content subview has this method:

        - (UIView *)pointInsideTiles:(CGPoint)point withEvent:(UIEvent *)event {
            for (TileView *tile in tiles) {
                if ([tile pointInside:[self convertPoint:point toView:tile] withEvent:event])
                    return tile;
            }
            return nil;
        }

    And the tile view has this method:

        - (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
            UITouch *touch = [touches anyObject];
            CGPoint location = [touch locationInView:self.superview];
            self.center = location;
        }

    This works, but not fully correctly: the tile sometimes "falls down" during the drag. More precisely, it stops receiving touchesMoved: invocations and the scroll view starts scrolling instead. I noticed that this depends on the drag speed: the faster I drag, the sooner the tile "falls". Any ideas on how to keep the tile glued to the dragging finger? Thanks in advance!
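
    A minimal sketch (not from the original question) of the UIScrollView hook usually involved here: a scroll view cancels a subview's touch sequence once the finger has moved far enough, which matches the "falls faster when dragging faster" symptom. Refusing that cancellation for tile views in the UIScrollView subclass is one way to keep the drag alive; TileView is the tile class from the question.

        // In the UIScrollView subclass from the question.
        - (BOOL)touchesShouldCancelInContentView:(UIView *)view {
            // Returning NO stops the scroll view from stealing the touch
            // sequence (and starting to scroll) while a tile is being dragged.
            if ([view isKindOfClass:[TileView class]]) {
                return NO;
            }
            return [super touchesShouldCancelInContentView:view];
        }

        // Related: the canCancelContentTouches property. Leaving it YES lets the
        // scroll view keep scrolling for touches that start outside a tile, while
        // the override above protects drags that start on a tile.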

    Read the article

  • Obtaining touch location for a uiscrollview touch

    - by LOSnively
    I have a UIScrollView as an element of a scroll view controller, along with other view objects. The image scrolls and zooms as expected when the scrollView is the top subview. However, I also need to get the screen location of the touch, in particular when there is no scroll action. (I understand the location may change during a scroll, but that's not important.) I haven't found a way to do that. In the scroll view controller implementation I have customized all of the standard methods that should do this: touchesShouldBegin..., touchesBegan:..., touchesEnded:..., and so on. As far as I can tell, none of these are called during a touch event when the scrollView is the top subview. I've tried setting the delaysContentTouches property to both YES and NO, and that doesn't seem to make a difference.

    As an alternative, I've tried putting a UIView as the top subview and then passing the touches to the now-underlying scrollView. In this configuration the standard methods are called and I can get the touch location, but I haven't found a mechanism for the touches to be passed on to the scrollView so that scrolling occurs. Sending the touch messages to the specific scrollView, or to super, or just sending them to nextResponder doesn't do it. It seems I can make the scrolling work, or find the location of the touch, but not both, depending on what the top subview is. I suspect this is trivial, but after two weeks of struggling, it's time to eat my embarrassment for not being able to do this seemingly simplest of things. I've read all of the related questions here on Stack Overflow, tried most if not all of the suggestions, and so far nothing has worked. I've looked through the various links and references suggested by the answers, including Apple's documentation, but none have pointed out the gap in my understanding. Any ideas would be appreciated.
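
    Not part of the original question, but a minimal sketch of one approach that avoids overriding touch methods entirely: attaching a UITapGestureRecognizer (available since iOS 3.2) to the scroll view, which reports the tap location without interfering with scrolling or zooming. The handler name is an assumption.

        // Somewhere in the view controller, after the scroll view exists:
        UITapGestureRecognizer *tap =
            [[UITapGestureRecognizer alloc] initWithTarget:self
                                                    action:@selector(scrollViewTapped:)];
        [scrollView addGestureRecognizer:tap];
        [tap release];   // assuming manual reference counting, as in the question's era

        // Called on a tap; scrolling and zooming keep working as before.
        - (void)scrollViewTapped:(UITapGestureRecognizer *)recognizer {
            CGPoint inScrollView = [recognizer locationInView:scrollView];
            CGPoint onScreen     = [recognizer locationInView:nil];   // window coordinates
            NSLog(@"tap at %@ (screen %@)",
                  NSStringFromCGPoint(inScrollView), NSStringFromCGPoint(onScreen));
        }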

    Read the article

  • Detect if certain UIView was touched amongst other UIViews

    - by Rudiger
    Hi guys, sorry if this has been answered elsewhere, but I can't seem to get it to work. I have 3 UIViews layered on top of one large UIView. I want to know if the user touches the top one, and not care about the other ones. I will have a couple of buttons in the second UIView and a UITableView in the 3rd UIView. The problem is, I turn userInteractionEnabled on for the first view and that works, but all the other views respond in the same way even if I turn it off for them. If I disable userInteractionEnabled on self.view, none of them respond. I also can't detect which view was touched in the touchesBegan delegate method. My code:

        UIView *aView = [[UIView alloc] initWithFrame:CGRectMake(0, 0, 320, 150)];
        aView.userInteractionEnabled = YES;
        [self.view addSubview:aView];

        UIView *bView = [[UIView alloc] initWithFrame:CGRectMake(0, 150, 320, 50)];
        bView.userInteractionEnabled = NO;
        [self.view addSubview:bView];

        - (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
            // This gets called for a touch anywhere
        }

    Thanks for any help.

    Read the article

  • Problem drawing a line with Quartz 2D when the alpha property is < 1.0 on iPhone

    - by The Khanh
    Hello everybody! This is the code I use to draw in my app. My problem: if I draw with the alpha property = 1 it looks very good, but if I change the alpha property to 0.2 then my painting is not good. How do I make it better with alpha property = 0.2? http://www.flickr.com/photos/9601621@N05/page1/ Drawing with alpha = 1: it is good. Drawing with alpha = 0.2: it is bad.

        - (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
            if ([self.view superview] && (headerView.frame.origin.y == -30)) {
                mouseSwiped = YES;
                UITouch *touch = [touches anyObject];
                CGPoint currentPoint = [touch locationInView:self.view];
                currentPoint.y -= 20;

                UIGraphicsBeginImageContext(self.view.frame.size);
                CGContextRef context = UIGraphicsGetCurrentContext();
                [drawImage.image drawInRect:CGRectMake(0, 0, self.view.frame.size.width, self.view.frame.size.height)];
                CGContextSetLineCap(context, kCGLineCapRound);
                CGContextSetLineWidth(context, currentBrushProperty.brushSize);
                CGContextSetRGBStrokeColor(context, [self red], [self green], [self blue], currentBrushProperty.brushTransparency);
                CGContextSetRGBStrokeColor(context, 1.0, 0.0, 0.0, 1.0);
                CGContextBeginPath(context);
                CGContextMoveToPoint(context, lastPoint.x, lastPoint.y);
                CGContextAddLineToPoint(context, currentPoint.x, currentPoint.y);
                CGContextStrokePath(context);
                drawImage.image = UIGraphicsGetImageFromCurrentImageContext();
                UIGraphicsEndImageContext();

                lastPoint = currentPoint;
            }
        }

    Help me, please.

    Read the article

  • How to tell if you touched a CCLabel? (Cocos2d Question)

    - by RexOnRoids
    How do I tell if a CCLabel was touched? The following code obviously does not work well enough because it only tests for point equality. Naturally, the touch point will not necessarily be equal to the position property of the CCLabel (CCNode). How do I tell if a touch point has fallen within the "rectangle" of the CCLabel?

        - (void)ccTouchesEnded:(NSSet *)touches withEvent:(UIEvent *)event {
            for (UITouch *touch in touches) {
                CGPoint location = [touch locationInView:[touch view]];
                location = [[CCDirector sharedDirector] convertToGL:location];
                self.myGraphManager.isSliding = NO;
                if (CGPointEqualToPoint(location, label1.position)) {
                    NSLog(@"Label 1 Touched");
                } else if (CGPointEqualToPoint(location, label2.position)) {
                    NSLog(@"Label 2 Touched");
                } else if (CGPointEqualToPoint(location, label3.position)) {
                    NSLog(@"Label 3 Touched");
                } else if (CGPointEqualToPoint(location, label4.position)) {
                    NSLog(@"Label 4 Touched");
                } else if (CGPointEqualToPoint(location, label5.position)) {
                    NSLog(@"Label 5 Touched");
                } else if (CGPointEqualToPoint(location, label6.position)) {
                    NSLog(@"Label 6 Touched");
                }
                NSLog(@"Touch Made!");
            }
        }
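
    A minimal sketch (not from the original question) of a rectangle test instead of point equality: in cocos2d a node's boundingBox is a CGRect expressed in its parent's coordinate space, so CGRectContainsPoint can be used once the touch is converted to GL coordinates. This assumes the labels are direct children of the layer handling the touch.

        - (void)ccTouchesEnded:(NSSet *)touches withEvent:(UIEvent *)event {
            for (UITouch *touch in touches) {
                CGPoint location = [touch locationInView:[touch view]];
                location = [[CCDirector sharedDirector] convertToGL:location];

                // boundingBox is the node's rect in its parent's coordinates,
                // so this works when the labels are children of this layer.
                if (CGRectContainsPoint([label1 boundingBox], location)) {
                    NSLog(@"Label 1 Touched");
                } else if (CGRectContainsPoint([label2 boundingBox], location)) {
                    NSLog(@"Label 2 Touched");
                }
                // ...same CGRectContainsPoint test for the remaining labels
            }
        }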

    Read the article

  • iPhone Objective-C

    - by Sea
    I am trying to determine if a UILabel was touched and, if so, do something. Given...

        UILabel *site = [[UILabel alloc] initWithFrame:CGRectMake(0, 185, 320, 30)];
        site.text = [retriever.plistDict valueForKey:@"url"];
        site.textAlignment = UITextAlignmentCenter;
        site.backgroundColor = [UIColor clearColor];
        site.textColor = [UIColor whiteColor];
        site.userInteractionEnabled = YES;
        [theBgView addSubview:site];
        [site release];

    ...then I write the callback:

        - (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event {
            retriever = [PListRetriever sharedInstance];
            CGPoint pt = [[touches anyObject] locationInView:self];
            NSURL *target = [[NSURL alloc] initWithString:[retriever.plistDict valueForKey:@"url"]];
            [[UIApplication sharedApplication] openURL:target];
        }

    The problem is that right now, no matter where I touch in the view, the URL is opened. How do I determine whether only my label was touched?
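
    A minimal sketch (not from the original question) of one way to restrict the action to the label: hit-test the touch against the label's frame in its superview's coordinate space. This assumes the label is kept in an ivar (here called siteLabel, a hypothetical name) instead of being released right after it is added.

        // Assumed ivar: UILabel *siteLabel;   (the label created in viewDidLoad)

        - (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event {
            UITouch *touch = [touches anyObject];
            // Compare in the coordinate space of the label's superview (theBgView).
            CGPoint pt = [touch locationInView:siteLabel.superview];

            if (CGRectContainsPoint(siteLabel.frame, pt)) {
                NSURL *target = [NSURL URLWithString:[retriever.plistDict valueForKey:@"url"]];
                [[UIApplication sharedApplication] openURL:target];
            }
        }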

    Read the article

  • UITouch Events and Table Views

    - by Andy
    I'm working on a navigation-based, iPhone-only app that serves two main purposes: one, to present data in a hierarchical view, allowing users to drill down and eventually edit said data; and two, to allow users to perform a default action when a table view cell is tapped. I now need to offer a small set of options tied to the same data; however, both the didSelectRowAtIndexPath: and accessoryButtonTappedForRowWithIndexPath: methods are obviously taken. So my options seem to be to implement a double-tap method, wherein the small list of additional options would be presented after (you guessed it) a double tap on said table row, or, preferably, a tap-and-hold method. From what I can tell, tap-and-hold seems like the way to go in SDK 4.0, which does me no good right this red-hot minute. I decided to go with the double-tap option, but I'm having a little trouble. First and foremost, the touchesBegan:withEvent: method does not seem to be getting called at all: a breakpoint placed within the method is never hit while the application runs, and the table view responds exactly as it did before I inserted the method (which is to say, it performs the default action):

        - (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
            UITouch *aTouch = [touches anyObject];
            if (aTouch.tapCount == 2) {
                [NSObject cancelPreviousPerformRequestsWithTarget:self];
            }
        }

    Second, I don't really need to handle a single tap: the didSelectRowAtIndexPath: method handles the single tap just fine. The double tap is the funky one I want to handle. I suspect the answer is going to contain the phrase, "You can't have the table view handle the single-tap and the touchesBegan: method handle the double-tap. The touch handling methods have to handle all of them." I would really appreciate some guidance from those of you who've dealt with this issue. Thanks in advance.

    Read the article

  • changing sounds when objects meet

    - by blacksheep
    I'd like to change sounds when tapping on an image while dragging another image over several others. It is not working properly: the "MoveImage" is not draggable, the tapping also works outside the "TouchImage", and the sounds do not change when tapping on it.

        - (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
            UITouch *touch = [[event allTouches] anyObject];
            CGPoint location = [touch locationInView:touch.view];
            if (((CGRectContainsPoint(myTouchImage.frame, location)),
                 (CGRectIntersectsRect(myMoveImage.frame, myImage_1.frame)))) {
                [sound_E play];
            }
            if (((CGRectContainsPoint(myTouchImage.frame, location)),
                 (CGRectIntersectsRect(myMoveImage.frame, myImage_2.frame)))) {
                [sound_F play];
            }
            if (((CGRectContainsPoint(myTouchImage.frame, location)),
                 (CGRectIntersectsRect(myMoveImage.frame, myImage_3.frame)))) {
                [sound_D play];
            }
            if (((CGRectContainsPoint(myTouchImage.frame, location)),
                 (CGRectIntersectsRect(myMoveImage.frame, myImage_4.frame)))) {
                [sound_Dis play];
            }
        }

        - (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
            UITouch *touch = [[event allTouches] anyObject];
            CGPoint location = [touch locationInView:touch.view];
            if (CGRectContainsPoint(myMoveImage.frame, location)) {
                CGPoint yLocation = CGPointMake(myMoveImage.center.x, location.y);
                myMoveImage.center = yLocation;
            }
            if (CGRectIntersectsRect(myMoveImage.frame, myImage_1.frame)) {
                E_NOTE.text = @"E";
            }
            if (CGRectIntersectsRect(myMoveImage.frame, myImage_2.frame)) {
                F_NOTE.text = @"F";
            }
            if (CGRectIntersectsRect(myMoveImage.frame, myImage_3.frame)) {
                D_NOTE.text = @"D";
            }
            if (CGRectIntersectsRect(myMoveImage.frame, myImage_4.frame)) {
                Dis_NOTE.text = @"D#";
            }
        }
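
    A minimal sketch (not from the original question) of how the two tests are usually combined: in a condition written as ((a), (b)) the comma operator evaluates both expressions but only b decides the branch, so requiring both the touch hit and the intersection needs a logical AND. The names are those from the question.

        - (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
            UITouch *touch = [[event allTouches] anyObject];
            CGPoint location = [touch locationInView:touch.view];

            // Both conditions must hold: the tap landed on the touch image
            // AND the moving image currently overlaps the target image.
            if (CGRectContainsPoint(myTouchImage.frame, location) &&
                CGRectIntersectsRect(myMoveImage.frame, myImage_1.frame)) {
                [sound_E play];
            }
            // ...repeat with myImage_2/sound_F, myImage_3/sound_D, myImage_4/sound_Dis
        }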

    Read the article

  • Alpha animation bug on button

    - by RaiderJ
    I have animations that fade in a button (alpha from 0 to 1) and fade out a button (alpha from 1 to 0). This part is all working fine. Button A triggers the fade-in of Button B; Button B triggers the fade-out of itself. Button B totally covers Button A. The idea is that Button B contains an image that is used like an information popup: Button A is touched and Button B fades in on top; when Button B is touched, it fades itself out again.

    Initially, Button B's visibility is set to INVISIBLE, and when the fade-in animation is complete it is set to VISIBLE. When Button B is clicked it fades out, and I then set its visibility to INVISIBLE. The problem is that after Button B has faded out and been set INVISIBLE, it is still clickable even though it is not visible, and touches are not received by Button A. I have tried removing Button B from the parent and re-adding it after the animation completes; this allows touches to reach Button A, but only once, and after that Button B is no longer touchable.

    Read the article
