Search Results

Search found 4898 results on 196 pages for 'ipod touch'.

Page 28/196 | < Previous Page | 24 25 26 27 28 29 30 31 32 33 34 35  | Next Page >

  • How can I tell when something outside my UITableViewCell has been touched?

    - by kk6yb
    Similar to this question, I have a custom subclass of UITableViewCell that has a UITextField. It's working fine, except the keyboard doesn't go away when the user touches a different table view cell or something outside the table. I'm trying to figure out the best place to find out when something outside the cell is touched, so that I can call resignFirstResponder on the text field. If the UITableViewCell could receive touch events for touches outside of its view, it could just resignFirstResponder itself, but I don't see any way to get those events in the cell. The solution I'm considering is to add a touchesBegan:withEvent: method to the view controller. There I could send resignFirstResponder to every visible table view cell except the one the touch was in (letting it receive the touch event and handle it itself). Maybe something like this pseudocode:

        - (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
            CGPoint touchPoint = // TBD - may need to translate to the cell's coordinates
            for (UITableViewCell *aCell in [theTableView visibleCells]) {
                if (![aCell pointInside:touchPoint withEvent:event]) {
                    [aCell resignFirstResponder];
                }
            }
        }

    I'm not sure this is the best way to go about it. There doesn't seem to be any way for the table view cell itself to receive event notifications for events outside its view.
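
    One way to fill in the coordinate-translation step the pseudocode leaves open is to ask the UITouch itself for its location in each cell. A minimal sketch, assuming the controller's view actually receives the touch (theTableView is the placeholder name from the question); a touch that lands on another cell is delivered to that cell instead, which is the limitation the question describes:

        - (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
            UITouch *touch = [touches anyObject];
            for (UITableViewCell *aCell in [theTableView visibleCells]) {
                // Convert the touch into each cell's coordinate space before hit-testing.
                CGPoint pointInCell = [touch locationInView:aCell];
                if (![aCell pointInside:pointInCell withEvent:event]) {
                    [aCell resignFirstResponder];
                }
            }
            [super touchesBegan:touches withEvent:event];
        }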

    Read the article

  • Does fast typing influence fast programming? [closed]

    - by Lukasz Lew
    Many young programmers think that their bottleneck is typing speed. After some experience one realizes that this is not the case: you have to think much more than you type. At some point my room-mate forced me to turn off the light (he sleeps during the night). I had to learn to touch type, and I experienced an actual improvement in programming skill. The most surprising part was that the improvement was not due to sheer typing speed, but to a change in mindset. I'm less afraid now to try new things and refactor them later if they work well. It's like having a new tool in the bag. Has anyone of you had a similar experience? I have now trained touch typing a little with KTouch. I find the auto-generated lessons the best. I can use this program to create new lessons out of text files, but that is only verbatim training, not auto-generated based on a language model. Do you know any touch typing program that allows creation of custom, but randomized, lessons?

    Read the article

  • How can I practice with a full set of characters in KTouch?

    - by Josh
    I originally used KTouch to learn Qwerty touch typing, but found it too stressful on the hands. Hearing about Dvorak, I decided I'd switch to that (still using a Qwerty keyboard, but with the keys mapped differently). Since the keyboard is physically displaying a Qwerty layout, I cannot look at the keyboard for hints, making it very hard to type characters that I am unfamiliar with. Unfortunately, KTouch only covers letters, not punctuation and other symbols. Where can I find a lecture that covers all, or most of, the characters on a keyboard?

    Read the article

  • How to change drag & drop behaviour in Windows 7's explorer?

    - by Pekka
    I have a new touch screen, and am playing around with its functionality. The most productive use for me is organizing files (literally) by hand. It's fun working through a list of files, dragging and dropping them to the right locations using your index finger. It feels better on the wrist than mouse-clicking, too. The only problem is that when I drag & drop files across drives in Windows 7, the default behaviour is to copy the file instead of moving it. I know I can influence this using right click, but that is of course no option in my situation. How can I change the default drag & drop behaviour in Windows 7's explorer?

    Read the article

  • Are you supposed to type '6' with the left hand or the right hand?

    - by Joey Adams
    A few weeks ago, I did a Google Images search for keyboard finger charts to see which fingers I'm supposed to be using to type which keys. According to the charts, '6' is supposed to be typed with the right hand: (as shown on en.wikipedia.org/wiki/Typing) However, today I spotted a split keyboard in a store with the '6' on the left side of the split. Indeed, an image search for split keyboards indicates that this is the norm: (as shown on en.wikipedia.org/wiki/Microsoft_Natural_keyboard) When doing touch typing "correctly", should I go with the finger charts (type 6 with my right hand), or should I go with the split keyboards (type 6 with my left hand)? <troll> Is this just another example of Microsoft not following the standards? </troll>

    Read the article

  • Which touch event should I use to slide an image?

    - by hemant
    I am using the following function to move a ball from one location to another, wherever the user touches the screen. Right now I don't have an iPhone to test my application on, and I am new to iPhone application programming, so I wanted to know: will this event also make the ball slide from one point to another when the user maintains the touch?

        -(void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
            UITouch *touch = [[event allTouches] anyObject];
            CGPoint location = [touch locationInView:touch.view];
            fball.center = location;
        }
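
    touchesBegan:withEvent: fires only once, when the finger first goes down, so on its own it jumps the ball to the tap point rather than sliding it. A minimal sketch of tracking the finger while it stays on the screen, reusing the fball view from the question:

        - (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
            UITouch *touch = [[event allTouches] anyObject];
            // Follow the finger for as long as it keeps moving on the screen.
            fball.center = [touch locationInView:touch.view];
        }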

    Read the article

  • What is the best free software to learn touch-typing?

    - by gojira
    What is the best free software to learn touch-typing? Features it needs to have: it should NOT display the keyboard layout on the screen; it should give detailed statistics which actually measure progress (for which key do I have the highest error rate, graphs showing how typing speed improved over time, etc.); it should enable me to actually learn touch-typing in about one long weekend where I don't do much else than learn to touch-type. It would also be very good if it were possible to load a text file and have the program use words from that file for the exercises. My goals are: at least the same typing speed as I have now, but with touch-typing, and being able to look only at the screen while typing. P.S.: EDIT: I forgot to mention, I'm using Win 7. And I know what the Dvorak and Colemak keyboard layouts are, but I'm not interested in them. My question was with respect to the standard US keyboard layout.

    Read the article

  • How can I tell which laptop touch-screens work well with a stylus (for drawing/taking notes)?

    - by BlueRaja
    I'm looking for a laptop with a touch-screen and stylus for drawing/note-taking. I've read the difference between the different kinds of styluses, but that's only half the story - what about the touch-screen? How do I know if the touch-screen supports "palm-rejection"? Or if the included stylus is a capacitive stylus or a "Wacom digitizer"? Or if the screen will even support Wacom? How can I tell how accurate the touch-screen is (from my testing, some definitely seem to have higher "resolution" than others)? Is there anything else I should be looking at? I don't see any of this information on, for instance, the Newegg specs page for a laptop.

    Read the article

  • iPhone basic game loop

    - by Mrigank Gupta
    I am a little confused with my OpenGL game loop, as I am sometimes skipping touch events. My game loop is something like this: I have a ScreenManager which draws and updates whichever game screen currently has control. In update I also check the input for every screen; if the input state has changed, whichever screen has control consumes the touches. Roughly:

        EaglView
          |-- draw   -> ScreenManager.draw   (draws the stack of screens)
          |-- update -> ScreenManager.update (handles input for the stack of screens)

    The problem: I change the input state as the touches-began and touches-ended methods are called, but sometimes both methods get called between two updates and I miss events. I guess the approach is not good. Can you guys share your approach to this problem?
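
    One common pattern for a loop like this is to queue touch events as they arrive and drain the queue once per update, so nothing that happens between two updates is lost. A sketch under that assumption, with the touch handlers living in the game's view; pendingTouches, screenManager and handleTouchAtPoint: are illustrative names, not from the question:

        NSMutableArray *pendingTouches; // created once, e.g. in init

        - (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
            // Buffer the touch instead of changing the input state immediately.
            for (UITouch *touch in touches) {
                [pendingTouches addObject:[NSValue valueWithCGPoint:[touch locationInView:self]]];
            }
        }

        - (void)update {
            // Consume everything that arrived since the last update, then clear the queue.
            for (NSValue *value in pendingTouches) {
                [screenManager handleTouchAtPoint:[value CGPointValue]];
            }
            [pendingTouches removeAllObjects];
        }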

    Read the article

  • Unable to forward UITouch events to my view controller

    - by hyn
    I have a UISplitViewController set up with a custom view added as a subview of the split view controller's view (UILayoutContainerView). I am trying to forward touch events from my custom view controller to the master and detail views, but the following (which was suggested here in another thread) seems to have no effect:

        - (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
            UITouch *touch = [touches anyObject];
            // Do something
            [self.nextResponder touchesBegan:touches withEvent:event];
        }

    As a result my custom view controller swallows the events, and all the UI underneath never gets a chance to do anything. How can I get my master and detail view controllers to receive the events?
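
    If the overlay is only decorative, or only parts of it are interactive, one common alternative to forwarding touches by hand is to let them fall through in the first place by overriding hitTest:withEvent: in the overlay view. A sketch, assuming the overlay is a custom UIView subclass:

        // Claim only touches that land on our own interactive subviews; anything that
        // would hit the overlay itself falls through to the split view underneath.
        - (UIView *)hitTest:(CGPoint)point withEvent:(UIEvent *)event {
            UIView *hit = [super hitTest:point withEvent:event];
            return (hit == self) ? nil : hit;
        }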

    Read the article

  • Touch event handler for UIImageView

    - by madmik3
    I am just getting started with iPhone development and can't seem to find the answer to what I want to do. It seems like I should be able to programmatically create a UIImageView and then set up an event handler for its touch events. In C# I would have something that looks like:

        Button b = new Button();
        b.Click += /* my handler code */;

    Right now I have this:

        CGRect myImageRect = CGRectMake(0.0f, 0.0f, 141.0f, 151.0f);
        UIImageView *myImage = [[UIImageView alloc] initWithFrame:myImageRect];
        myImage.userInteractionEnabled = YES;
        [myImage setImage:[UIImage imageNamed:@"myImage.png"]];
        myImage.opaque = YES; // explicitly opaque for performance
        [self.view addSubview:myImage];
        [myImage release];

    What do I need to do to override the touch events? Thanks
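
    The usual UIKit equivalent of a per-view click handler is to subclass UIImageView and override the touch methods (the userInteractionEnabled = YES line above is what lets the view receive them at all). A minimal sketch; TappableImageView is an illustrative class name:

        @interface TappableImageView : UIImageView
        @end

        @implementation TappableImageView
        - (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event {
            // React to the tap here, or notify a delegate/target of your choosing.
            NSLog(@"image view tapped");
        }
        @end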

    Read the article

  • Removing views from UIScrollView

    - by mohan
    I have two UIScrollViews that I move images between; the user can drag images back and forth between the scroll views. I am trying to animate the movement of the image from one scroll view to the other. In -touchesMoved (handled in my UIViewController, which has two custom UIScrollViews that intercept touches and send them to my UIViewController), I am trying to set the center of the UIImageView that is being moved. As it is moved, the image gets hidden behind the UIScrollView and is not visible. How do I make it appear on top of the UIScrollView? I am able to handle -touchesEnded properly by animating into the destination scroll view. I am also confused about the -convertPoint:fromView: usage in the iPhone Programming Guide (see Chapter 3, Event Handling). If the touch coordinates are in window coordinates, then to change the center of my view (a UIImageView inside a UIScrollView, which is inside a UIView, inside the window), don't I have to use -convertPoint:toView:, i.e.

        imageView.center = [self.view.window convertPoint:currentTouchPosition toView:imageView];

    What am I missing?
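
    One common way to keep the dragged image visible is to reparent it onto the controller's top-level view for the duration of the drag, so it is no longer clipped by (or layered under) either scroll view; a view's center is always expressed in its superview's coordinate space, which is also where the convertPoint confusion comes from. A sketch under those assumptions (imageView, scrollView1 and touch are placeholder names):

        // When the drag starts: lift the image out of its scroll view.
        CGPoint centerInTopView = [self.view convertPoint:imageView.center fromView:scrollView1];
        [imageView removeFromSuperview];
        [self.view addSubview:imageView]; // added last, so it draws above both scroll views
        imageView.center = centerInTopView;

        // While dragging (touchesMoved): the new superview is self.view, so use its coordinates.
        imageView.center = [touch locationInView:self.view];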

    Read the article

  • Can I detect if higher subview has been touched?

    - by Kevin Beimers
    I've got a big UIView that responds to touches, and it's covered with lots of little UIViews that respond differently to touches. Is it possible to touch anywhere on the screen and slide around, and have each view know when it's being touched? For example, I put my finger down on the upper left and slide toward the lower right. The touchesBegan/touchesMoved events are collected by the baseView. As I pass over itemView1, itemView2, and itemView3, control passes to them. If I lift my finger while over itemView2, it performs itemView2's touchesEnded method. If I lift my finger over none of the items, it performs baseView's touchesEnded. At the moment, if I touch down on baseView, touchesEnded always goes to baseView and the higher itemViews are ignored. Any ideas?
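
    UIKit delivers the whole touch sequence to the view where the touch began, so the item views will never hear about a touch that started on baseView. One workaround is to have baseView hit-test its subviews itself in touchesEnded and notify whichever item is under the finger. A sketch, assuming a hypothetical ItemView class with a hypothetical itemTouched method:

        - (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event {
            CGPoint point = [[touches anyObject] locationInView:self];
            for (UIView *subview in self.subviews) {
                if ([subview isKindOfClass:[ItemView class]] &&
                    CGRectContainsPoint(subview.frame, point)) {
                    [(ItemView *)subview itemTouched]; // illustrative callback
                    return;
                }
            }
            // No item under the finger: handle the lift in baseView itself.
        }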

    Read the article

  • How to reset the value of an NSTimeInterval?

    - by srikanth rongali
    I am writing a small game. After the completion of the game we return to the previous screen, so we can play again. The problem is that the time interval (NSTimeInterval) started the first time the game ran is still continuing. For example, the time interval of my touch when I first started the game is 3.0 seconds; when I play again (without quitting the application) the time interval is 3.0 seconds plus the interval up to the present touch. How can I make it reset? Please help me with a solution for my problem.
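
    An NSTimeInterval is just a number, so the usual fix is to record a fresh start timestamp every time a new game begins and measure each touch relative to that. A sketch, assuming an illustrative gameStartTime instance variable and a method that runs whenever a new game starts:

        NSTimeInterval gameStartTime; // instance variable, illustrative name

        - (void)startNewGame {
            // Reset the reference point each time the player starts over.
            gameStartTime = [NSDate timeIntervalSinceReferenceDate];
        }

        - (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
            NSTimeInterval sinceStart = [NSDate timeIntervalSinceReferenceDate] - gameStartTime;
            NSLog(@"touch at %.1f seconds into this game", sinceStart);
        }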

    Read the article

  • How do I detect whether a browser supports mouseover events?

    - by Damovisa
    Let's assume I have a web page which has some onmouseover JavaScript behaviour to drop down a menu (or something similar). Obviously, this isn't going to work on a touch device like the iPad or smartphones. How can I detect whether the browser supports hover events like onmouseover or onmouseout and the :hover pseudo-class in CSS? Note: I know that if I'm concerned about this I should write it a different way, but I'm curious as to whether detection can be done. Edit: When I say "supports hover events", I really mean "does the browser have a meaningful representation of hover events". If the hardware supports it but the software doesn't (or vice versa), there's no meaningful representation. With the exception of some upcoming tech, I don't think any touch devices have a meaningful representation of a hover event.

    Read the article

  • Global NSMutableArray doesn't seem to be holding values

    - by diatrevolo
    I have a Cocos2D iPhone application that requires a set of CGRects overlaid on an image to detect touches within them. "Data" below is a class that holds values parsed from an XML file. "delegateEntries" is an NSMutableArray that contains several "Data" objects, pulled from another NSMutableArray called "entries" that lives in the application delegate. For some strange reason, I can get at these values without problems in the init function, but further down the class in question I try to get at these values and the application crashes without an error message (as an example, I include the ccTouchBegan method, which accesses this data through the populateFieldsForTouchedItem method). Can anyone see why these values would not be accessible from other methods? No objects get released until dealloc. Thanks in advance!

        @synthesize clicked, delegate, data, image, blurImage, normalImage, arrayOfRects, delegateEntries;

        - (id)initWithTexture:(CCTexture2D *)aTexture {
            if ((self = [super initWithTexture:aTexture])) {
                arrayOfRects = [[NSMutableArray alloc] init];
                delegateEntries = [[NSMutableArray alloc] init];
                delegate = (InteractivePIAppDelegate *)[[UIApplication sharedApplication] delegate];
                delegateEntries = [delegate entries];
                data = [delegateEntries objectAtIndex:0];
                NSLog(@"Assigning %@", [[delegateEntries objectAtIndex:0] backgroundImage]);
                NSLog(@"%@ is the string", [[data sections] objectAtIndex:0]);
                //CGRect rect;
                NSLog(@"Count of array is %i", [delegateEntries count]);
                // collect as many items as there are XML entries
                for (int i = 0; i < [delegateEntries count]; i++) {
                    if ([[delegateEntries objectAtIndex:i] xPos]) {
                        NSLog(@"Found %i items", i+1);
                        [arrayOfRects addObject:[NSValue valueWithCGRect:CGRectMake(
                            [[[delegateEntries objectAtIndex:i] xPos] floatValue],
                            [[[delegateEntries objectAtIndex:i] yPos] floatValue],
                            [[[delegateEntries objectAtIndex:i] xBounds] floatValue],
                            [[[delegateEntries objectAtIndex:i] yBounds] floatValue])]];
                    } else {
                        NSLog(@"Nothing");
                    }
                }
                // remove the following once the NSMutableArray from above works (legacy)
                blurImage = [[NSString alloc] initWithString:[data backgroundBlur]];
                NSLog(@"5");
                normalImage = [[NSString alloc] initWithString:[data backgroundImage]];
                clicked = NO;
            }
            return self;
        }

    And then:

        - (void)populateFieldsForTouchedItem:(TouchedRect)touchInfo {
            Data *touchDatum = [[Data alloc] init];
            touchDatum = [[self delegateEntries] objectAtIndex:touchInfo.recordNumber];
            NSLog(@"Assigning %@", [[[self delegateEntries] objectAtIndex:touchInfo.recordNumber] backgroundImage]);
            rect = [[arrayOfRects objectAtIndex:touchInfo.recordNumber] CGRectValue];
            image = [[NSString alloc] initWithString:[[touchDatum sections] objectAtIndex:0]];
            [touchDatum release];
        }

        - (BOOL)ccTouchBegan:(UITouch *)touch withEvent:(UIEvent *)event {
            TouchedRect touchInfo = [self containsTouchLocation:touch];
            NSLog(@"Information pertains to %i", touchInfo.recordNumber);
            if (!touchInfo.touched && !clicked) {
                // needed since the touch location changes when zoomed
                NSLog(@"NOPE");
                return NO;
            }
            [self populateFieldsForTouchedItem:touchInfo];
            NSLog(@"YEP");
            return YES;
        }
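
    A likely suspect under manual reference counting is ownership: delegateEntries is allocated and then immediately overwritten with the delegate's array without retaining it, data is an unretained element of that array, and populateFieldsForTouchedItem: leaks a freshly allocated Data object and then releases one it never owned. A sketch of the retain pattern this would need, keeping the property names from the question:

        // In init: take ownership of the shared array and of the first entry.
        delegateEntries = [[delegate entries] retain]; // instead of alloc'ing and overwriting
        data = [[delegateEntries objectAtIndex:0] retain];

        // In populateFieldsForTouchedItem: don't alloc a throwaway object or release one you don't own.
        Data *touchDatum = [[self delegateEntries] objectAtIndex:touchInfo.recordNumber];
        rect = [[arrayOfRects objectAtIndex:touchInfo.recordNumber] CGRectValue];
        image = [[NSString alloc] initWithString:[[touchDatum sections] objectAtIndex:0]];
        // (no [touchDatum release] here -- the array still owns that object)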

    Read the article

  • How can I dismiss the view that appears after touching the Add item in a UINavigationController?

    - by srikanth rongali
    I have added an add (+) button to my navigation controller. When I tap it, a view appears from the bottom. I added a navigation bar with two buttons to it, one Save and one Cancel, and the view has one text field. After editing I can save or cancel. If I touch Cancel, I need the view to disappear by sliding back down. I think all iPhone and iPod touch users know this pattern: when they touch an Add item, a view slides up from the bottom, and when they cancel it slides back down. How can I do this in my application?
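
    The slide-up-and-slide-down behaviour described here is what UIKit's modal presentation provides. A sketch, assuming an AddItemViewController class (an illustrative name) for the editing screen:

        // In the + button's action: present the editor, which slides up from the bottom.
        AddItemViewController *addController = [[AddItemViewController alloc] init];
        UINavigationController *nav =
            [[UINavigationController alloc] initWithRootViewController:addController];
        [self presentModalViewController:nav animated:YES];
        [nav release];
        [addController release];

        // In the Cancel (or Save) button's action inside the editor: slide it back down.
        - (void)cancel:(id)sender {
            [self dismissModalViewControllerAnimated:YES];
        }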

    Read the article

  • iPhone SDK UIScrollView doesn't get touch events after moving it

    - by newbie
    Hi! I'm subclassing UIScrollView, and at the start I fill this ShowsScrollView with some items. After filling it, I set the frame and contentSize of the ShowsScrollView. Everything works fine so far: I get touch events and scrolling works. But after rotating to landscape, I change the x and y coordinates of the ShowsScrollView frame to move it from the bottom to the top right corner. Then I resize it (change the width and height of the frame) and reorder the items in the scroll view. At the end I set a new contentSize. Now I get touch events only on the first 1/4 of the scroll view; scrolling also works only on that 1/4, although it does scroll all the items. After all these actions I write a log:

        NSLog(@"ViewController: setLandscape finished: size: %f, %f content: %f,%f",
              scrollView.frame.size.width, scrollView.frame.size.height,
              scrollView.contentSize.width, scrollView.contentSize.height);

    The values are correct:

        ViewController: setLandscape finished: size: 390.000000, 723.000000 content: 390.000000,950.000000

    On rotating back to portrait, I move and resize everything back and it all works fine. Please help!
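
    Touches working in only part of a view is the classic symptom of the view sticking out past its superview's bounds after the move: hit-testing stops at the superview's frame even when drawing does not. A sketch of a quick check (and one possible fix) after the landscape layout runs, assuming scrollView and its direct superview are the views involved:

        // After repositioning for landscape, confirm the parent still contains the scroll view.
        if (!CGRectContainsRect(scrollView.superview.bounds, scrollView.frame)) {
            NSLog(@"scroll view %@ sticks out of superview bounds %@",
                  NSStringFromCGRect(scrollView.frame),
                  NSStringFromCGRect(scrollView.superview.bounds));
            // Illustrative fix: grow the superview so the whole scroll view stays hit-testable.
            CGRect parentFrame = scrollView.superview.frame;
            parentFrame.size = CGSizeMake(CGRectGetMaxX(scrollView.frame),
                                          CGRectGetMaxY(scrollView.frame));
            scrollView.superview.frame = parentFrame;
        }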

    Read the article

  • Intercepting/Hijacking iPhone Touch Events for MKMapView

    - by Shawn
    Is there a bug in the 3.0 SDK that disables real-time zooming and intercepting the zoom-in gesture for MKMapView? I have some really simple code so I can detect tap events, but there are two problems: the zoom-in gesture is always interpreted as a zoom-out, and none of the zoom gestures update the map's view in real time. In hitTest, if I return the "map" view, the MKMapView functionality works great, but I don't get the opportunity to intercept the events. Any ideas? MyMapView.h:

        @interface MyMapView : MKMapView {
            UIView *map;
        }

    MyMapView.m:

        - (id)initWithFrame:(CGRect)frame {
            if (![super initWithFrame:frame])
                return nil;
            self.multipleTouchEnabled = true;
            return self;
        }

        - (UIView *)hitTest:(CGPoint)point withEvent:(UIEvent *)event {
            NSLog(@"Hit Test");
            map = [super hitTest:point withEvent:event];
            return self;
        }

        - (void)touchesCancelled:(NSSet *)touches withEvent:(UIEvent *)event {
            NSLog(@"%s", __FUNCTION__);
            [map touchesCancelled:touches withEvent:event];
        }

        - (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
            NSLog(@"%s", __FUNCTION__);
            [map touchesBegan:touches withEvent:event];
        }

        - (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
            NSLog(@"%s, %x", __FUNCTION__, mViewTouched);
            [map touchesMoved:touches withEvent:event];
        }

        - (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event {
            NSLog(@"%s, %x", __FUNCTION__, mViewTouched);
            [map touchesEnded:touches withEvent:event];
        }
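
    Returning self from hitTest makes the subclass, not the map's internal view hierarchy, the target of every touch, which is what breaks the built-in pinch handling. One way to observe taps without touching hitTest at all is to watch the event stream higher up, for example in a UIWindow subclass used as the app's main window and overriding sendEvent:. A sketch; ObservingWindow and the notification name are illustrative:

        @interface ObservingWindow : UIWindow
        @end

        @implementation ObservingWindow
        - (void)sendEvent:(UIEvent *)event {
            // Peek at every touch, then let UIKit deliver the event normally so the
            // map keeps its real-time zoom behaviour.
            for (UITouch *touch in [event allTouches]) {
                if (touch.phase == UITouchPhaseEnded) {
                    [[NSNotificationCenter defaultCenter] postNotificationName:@"MapTouchEnded"
                                                                        object:touch];
                }
            }
            [super sendEvent:event];
        }
        @end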

    Read the article

  • UISegmentedControl tint color on touch

    - by gotye
    Hey everyone, I have a UISegmentedControl in my app (see code below):

        // --------------- SETTING NAVIGATION BAR RIGHT BUTTONS
        NSArray *segControlItems = [NSArray arrayWithObjects:
                                       [UIImage imageNamed:@"up.png"],
                                       [UIImage imageNamed:@"down.png"], nil];
        segControl = [[UISegmentedControl alloc] initWithItems:segControlItems];
        segControl.segmentedControlStyle = UISegmentedControlStyleBar;
        segControl.momentary = YES;
        segControl.frame = CGRectMake(25.0, 7, 65.0, 30.0);
        segControl.tintColor = [UIColor blackColor];
        [segControl addTarget:self action:@selector(segAction:) forControlEvents:UIControlEventValueChanged];
        if (current == 0)
            [segControl setEnabled:NO forSegmentAtIndex:0];
        if (current == ([news count]-1))
            [segControl setEnabled:NO forSegmentAtIndex:1];
        // ---------------

    But I can't make it show anything when you tap on it. It works perfectly in terms of functionality, but I would like it to tint gray while you tap it (only during the tap). Would that be possible? Thank you, Gotye.
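
    A momentary segmented control only reports UIControlEventValueChanged, so one possible approach is to flash the tint colour in that handler and restore it a moment later. A sketch, reusing the segAction: target from the question (restoreTint: is an illustrative helper):

        - (void)segAction:(UISegmentedControl *)sender {
            // Brief visual feedback for the momentary tap.
            sender.tintColor = [UIColor grayColor];
            [self performSelector:@selector(restoreTint:) withObject:sender afterDelay:0.15];
            // ... handle the up/down action here ...
        }

        - (void)restoreTint:(UISegmentedControl *)sender {
            sender.tintColor = [UIColor blackColor];
        }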

    Read the article

  • How to crop an image using a rectangle overlay and touch on iPhone

    - by Amir
    Hey everyone, I am looking for a good tutorial or sample code that shows how to crop an image taken from the iPhone camera, something along the lines of http://www.defusion.org.uk/code/javascript-image-cropper-ui-using-prototype-scriptaculous/ but where you control the corners with your fingers. Any tip would be greatly appreciated, as I am new to iPhone dev. Thanks, Amir
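
    However the corner handles end up being implemented, the crop itself usually comes down to one Core Graphics call once the selection rectangle is known. A minimal sketch; sourceImage and cropRect are placeholders, and cropRect must be in the image's pixel coordinates rather than the overlay view's points:

        CGImageRef croppedRef = CGImageCreateWithImageInRect(sourceImage.CGImage, cropRect);
        UIImage *croppedImage = [UIImage imageWithCGImage:croppedRef];
        CGImageRelease(croppedRef);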

    Read the article
