Search Results

Search found 4898 results on 196 pages for 'ipod touch'.

Page 17/196

  • Playing sound files on the iPhone/iPod touch

    - by user272769
    What are the recommended formats for playing sound files on iPhone/iPod touch devices? I am developing an application that should be able to play long sound files on the device. Are there any limitations on the size of the sound file, and which format would be the best and most optimized to play on the iPhone/iPod touch?
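
    A minimal playback sketch, assuming a compressed file bundled with the app ("long_track.m4a" is a placeholder name). AVAudioPlayer plays files of arbitrary length from a file URL, and formats with hardware-assisted decoding (AAC, ALAC, MP3) are generally the cheapest choice for long tracks:

        #import <AVFoundation/AVFoundation.h>

        // Sketch: play a long, bundled sound file. Keep a reference to the player
        // (for example an ivar) so it can be stopped later and is not released mid-playback.
        NSString *path = [[NSBundle mainBundle] pathForResource:@"long_track" ofType:@"m4a"];
        NSError *error = nil;
        AVAudioPlayer *player = [[AVAudioPlayer alloc] initWithContentsOfURL:[NSURL fileURLWithPath:path]
                                                                       error:&error];
        if (player == nil) {
            NSLog(@"Could not create player: %@", error);
        } else {
            [player prepareToPlay];   // pre-buffers so playback starts promptly
            [player play];
        }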

    Read the article

  • How to recognize the touch of a non-rectangular sprite image?

    - by srikanth rongali
    I have a sprite, and a touch on it should be recognized. I used coordinates to do this: I took the coordinates (min x, min y, max x, max y) of the sprite image. But the sprite image is not a rectangular shape, so even if I touch coordinates outside the sprite but inside its rectangular bounds, the sprite is recognized. For my application, only touches on the sprite itself should count. So I need to test only the sprite's own shape, but it is not a regular shape. I am using CCSprite in my program. What can I do so that only the sprite is selected, and which classes should I use for this? Thank you.
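
    cocos2d itself only tracks the rectangular bounds, so a common compromise is a cheap rectangle test followed by a finer test against a shape that approximates the visible pixels. A rough sketch of a circular variant (the helper name and the radius are assumptions to tune per sprite); a per-pixel alpha test would be the exact but more expensive alternative:

        // Rough sketch: rectangle test first, then a circular test that approximates
        // the visible (non-rectangular) part of the sprite.
        - (BOOL)isTouchOnSprite:(UITouch *)touch sprite:(CCSprite *)sprite {
            CGPoint location = [[CCDirector sharedDirector] convertToGL:
                                   [touch locationInView:[touch view]]];
            CGPoint local = [sprite convertToNodeSpace:location];

            CGRect bounds = CGRectMake(0, 0,
                                       sprite.contentSize.width,
                                       sprite.contentSize.height);
            if (!CGRectContainsPoint(bounds, local)) {
                return NO;                                       // outside even the rectangle
            }

            CGPoint center = ccp(sprite.contentSize.width / 2,
                                 sprite.contentSize.height / 2);
            CGFloat hitRadius = sprite.contentSize.width / 2;    // assumed approximation
            return ccpDistance(local, center) <= hitRadius;      // inside the circle?
        }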

    Read the article

  • How do I get raw 'mouse' events with touch screens on Windows Vista/7?

    - by Emil
    Does anyone have a clue how to completely disable the touch/tablet 'magic' introduced in Windows Vista? When I follow the steps on http://msdn.microsoft.com/en-us/library/bb969148(VS.85).aspx (both the SetProp disable and the WM_TABLET_QUERYSYSTEMGESTURESTATUS override), I succeed in stopping Windows from treating press-and-hold as a right-click (it correctly gives me a WM_LBUTTONDOWN), but it also gives me a premature WM_LBUTTONUP (before I really let go of the screen). And there is another problem: a click followed by a drag (down, up, down, move) is treated as a double-click (down, up, down, up, move). These issues occur with two very different touch screens (so it is not a hardware problem), and this never used to happen with Windows XP. It really bugs me. I would much rather have the raw input events like you have for normal mouse clicks. Any ideas?

    Read the article

  • Dialing on iPhone/iPod touch not working with documented procedures

    - by dave
    I'm trying to set up an iPhone app to dial the phone numbers of various sports stores using the tel: URL scheme. I am developing on an iPod touch; usually on the touch you see the error message "Unsupported URL - This URL wasn't loaded tel://99887766" when you try to dial a number. I can't get this message to appear on the simulator or the iPod touch. Do I need to do some sort of fancy signing before the app will dial properly? I am using this code:

        [[UIApplication sharedApplication] openURL:[NSURL URLWithString:[NSString stringWithFormat:@"tel:%@", [selectedBar phoneNumber]]]];

    and I've tried adding the slashes:

        [[UIApplication sharedApplication] openURL:[NSURL URLWithString:[NSString stringWithFormat:@"tel://%@", [selectedBar phoneNumber]]]];

    but neither works. I have also tried this way:

        [[UIApplication application] openURL:[NSURL URLWithString:@"tel://99887766"]];

    and this way:

        NSMutableString *phone = [[@"+ 12 34 567 89 01" mutableCopy] autorelease];
        [phone replaceOccurrencesOfString:@" " withString:@"" options:NSLiteralSearch range:NSMakeRange(0, [phone length])];
        [phone replaceOccurrencesOfString:@"(" withString:@"" options:NSLiteralSearch range:NSMakeRange(0, [phone length])];
        [phone replaceOccurrencesOfString:@")" withString:@"" options:NSLiteralSearch range:NSMakeRange(0, [phone length])];
        NSURL *url = [NSURL URLWithString:[NSString stringWithFormat:@"tel:%@", phone]];
        [[UIApplication sharedApplication] openURL:url];

    No matter what I do, I can't get any response from the simulator/iPod touch to show that it is dealing with a phone number. When I press the button associated with this code, it doesn't crash; it's as though it processed the tap and decided not to do anything. I even put an NSLog(@"button called"); just before the code to confirm the button was working, which it is.
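
    Worth noting: the iPod touch and the simulator have no Phone application, so tel: URLs are quietly ignored there; a -canOpenURL: guard at least makes that visible. A small sketch reusing the question's selectedBar accessor:

        // Sketch: check whether the device can actually place calls before dialing.
        // On an iPod touch or the simulator, canOpenURL: should return NO for tel: URLs.
        // NB: the number string must be URL-safe (no spaces), or URLWithString: returns nil.
        NSString *digits = [selectedBar phoneNumber];   // model object from the question
        NSURL *telURL = [NSURL URLWithString:[NSString stringWithFormat:@"tel:%@", digits]];
        if ([[UIApplication sharedApplication] canOpenURL:telURL]) {
            [[UIApplication sharedApplication] openURL:telURL];
        } else {
            NSLog(@"This device cannot place phone calls (tel: is unsupported).");
        }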

    Read the article

  • How to calculate the touch location with convertToWorldSpace?

    - by Paul
    I would like to convert the touch location to a world coordinate in my tile game. With this code, I clicked on the right of the screen (so that my character walks in the tiled game and the background slowly moves to the left):

        - (void)ccTouchesEnded:(NSSet *)touches withEvent:(UIEvent *)event
        {
            for (UITouch *touch in touches) {
                CGPoint location = [touch locationInView:[touch view]];
                location = [[CCDirector sharedDirector] convertToGL:location];
                CGPoint test = [self convertToWorldSpace:location];
                CCLOG(@"test : %.2g", test.x);

    The test log gives me: 50, 72, 1e+02, 2.6e+02, 4.2e+02, (and then goes down) 3.2e+02, 9.5, -1.9e+02, etc. Does anyone know why? I would like to calculate the "real" coordinate of the touch, so that I know whether the character has to keep going (a tap to the right of its current position) or turn and go backwards (a tap to the left of its current position). Thanks for your help.
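
    Two things worth checking, sketched below: the scientific notation almost certainly comes from the %.2g format specifier (two significant digits, so 102 prints as 1e+02), and if the goal is to compare the tap with the character's position inside the scrolling layer, convertToNodeSpace: is usually the conversion to use (world space goes the other way). A small logging sketch, meant to sit inside the ccTouchesEnded: loop from the question:

        // Sketch: log with a fixed-point format so 102 does not print as "1e+02",
        // and compare world-space vs. node-space coordinates of the same touch.
        CGPoint location = [[CCDirector sharedDirector] convertToGL:
                               [touch locationInView:[touch view]]];
        CGPoint world = [self convertToWorldSpace:location];
        CGPoint node  = [self convertToNodeSpace:location];   // relative to this (moving) layer
        CCLOG(@"screen x: %.0f  world x: %.0f  node x: %.0f", location.x, world.x, node.x);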

    Read the article

  • What cable would be used for a touch screen

    - by George Bailey
    I was told that any monitor could be turned into a touch screen if you have the right software. This has got to be old news, or even a myth. Please shed some light on this if you can. Am I wrong? My primary question is which cable would normally be used to hook up a touch screen. Would you need 2 cables for this (one for the monitor, one to receive touch events?), or is a single cable going to have data in both directions (perhaps an HDMI?)

    Read the article

  • ViewSonic LCD Monitor Touch-input

    - by Synetech inc.
    I bought a used 15" ViewSonic LCD monitor (VP150m) and noticed that it has a 3.5mm connector on the back labeled “touch i/o”. I’m trying to figure out how to use the touch function but am having trouble finding anything useful. First, I cannot find any information on what kind of cable it uses (TS, TRS, TRRS, etc.), or how to connect it to the computer. Second, I cannot find touch drivers for it—though I can find a page that mentions how easy it is to install them. Does anyone have any information on using the touchscreen function of the VP150m? Thanks a lot.

    Read the article

  • Is there a way to touch-enable scrolling in a WPF ScrollViewer?

    - by Brian Sullivan
    I'm trying to create a form in a WPF application that will allow the user to use iPhone-like gestures to scroll through the available fields. So, I've put all my form controls inside a StackPanel inside a ScrollViewer, and the scrollbar shows up as expected when there are too many elements to be shown on the screen. However, when I try to test this on my touch-enabled device, a panning gesture (placing a finger down on the surface and dragging it upward) does not move the viewable area down as I would expect. When I simply put a number of elements inside a ListView, the touch gestures work just fine. Is there any way to enable the same kind of behavior in a ScrollViewer? My window is structured like this:

        <Window x:Class="TestTouchScrolling.MainWindow"
                xmlns="http://schemas.microsoft.com/winfx/2006/xaml/presentation"
                xmlns:x="http://schemas.microsoft.com/winfx/2006/xaml"
                Title="MainWindow" Height="350" Width="525" Loaded="Window_Loaded">
            <Grid>
                <ScrollViewer Name="viewer" VerticalScrollBarVisibility="Auto">
                    <StackPanel Name="panel">
                        <StackPanel Orientation="Horizontal">
                            <Label>Label 1:</Label>
                            <TextBox Name="TextBox1"></TextBox>
                        </StackPanel>
                        <StackPanel Orientation="Horizontal">
                            <Label>Label 2:</Label>
                            <TextBox Name="TextBox2"></TextBox>
                        </StackPanel>
                        <StackPanel Orientation="Horizontal">
                            <Label>Label 3:</Label>
                            <TextBox Name="TextBox3"></TextBox>
                        </StackPanel>
                        <!-- Lots more like these -->
                    </StackPanel>
                </ScrollViewer>
            </Grid>

    Read the article

  • Your Most Popular Sites Screen in IE 10 - Icons not appearing

    - by MJWadmin
    We use the following code to add icons for favicons, tablets, smartphones, Windows 8 tiles and the like:

        <link rel="apple-touch-icon" href="apple-touch-icon.png">
        <link rel="shortcut icon" type="image/x-icon" href="/favicon.ico"/>
        <link rel="apple-touch-icon-precomposed" sizes="144x144" href="apple-touch-icon-144x144-precomposed.png">
        <link rel="apple-touch-icon-precomposed" sizes="114x114" href="apple-touch-icon-114x114-precomposed.png">
        <link rel="apple-touch-icon-precomposed" sizes="72x72" href="apple-touch-icon-72x72-precomposed.png">
        <link rel="apple-touch-icon-precomposed" href="apple-touch-icon-precomposed.png">
        <meta name="msapplication-TileImage" content="apple-touch-icon-144x144-precomposed.png"/>
        <meta name="msapplication-TileColor" content="#17151a"/>

    Unfortunately this doesn't seem to work for IE9 and IE10's 'your most popular sites' screen, and Google searches have been unenlightening. Stack uses

        <link rel="apple-touch-icon" href="apple-touch-icon.png">

    which seems to work for it, but not for us. Any clues to a solution appreciated.

    Read the article

  • How to report a crash bug with nothing on screen

    - by winniemiel05
    I would like to report an installation bug, but I can't launch ubuntu-bug. On my tablet (LDLC Janus, an Intel-based tablet without an OS installed by default), I have a problem when I install Ubuntu 11.04 or 11.10. Once the installation is finished and I reboot, there is ABSOLUTELY nothing on screen, but Ubuntu boots correctly anyway: I chose auto-login mode and I can hear the login sound. Even if I press Ctrl+Alt+F1 or F2, there is nothing. But the live CD works fine (even the touch screen, though without multi-touch). I tried installing 10.10 and it works, but as soon as I upgrade the same problem occurs. I installed Fedora 15 and then Fedora 16 (because of other problems) and had no problems with them. I would like to report this bug on Launchpad, but I don't know how to give the devs the log files and other useful files which could help debug it. Thanks a lot for your answer; I would really prefer using my tablet with Ubuntu (I miss U1, USC, Unity).

    Read the article

  • Touching a CGRect

    - by Coder404
    In my cocos2d app I am trying to determine when a CCSprite is touched. Here is what I have:

        - (BOOL)ccTouchBegan:(UITouch *)touch withEvent:(UIEvent *)event
        {
            NSMutableArray *targetsToDelete = [[NSMutableArray alloc] init];
            for (CCSprite *target in _targets) {
                CGRect targetRect = CGRectMake(target.position.x - (target.contentSize.width/2),
                                               target.position.y - (target.contentSize.height/2),
                                               27, 40);
                CGPoint touchLocation = [self convertTouchToNodeSpace:touch];
                if (CGRectContainsPoint(targetRect, touchLocation)) {
                    NSLog(@"Moo cheese!");
                }
            }
            return YES;
        }

    For some reason it does not work. Can someone help me?
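
    A slightly tighter variant of the same test, using each sprite's boundingBox (expressed in its parent's coordinates) instead of the hard-coded 27 x 40 rectangle; this sketch assumes the targets in _targets are children of the node handling the touch:

        // Sketch: hit-test each target with its boundingBox instead of fixed dimensions.
        - (BOOL)ccTouchBegan:(UITouch *)touch withEvent:(UIEvent *)event
        {
            CGPoint touchLocation = [self convertTouchToNodeSpace:touch];
            for (CCSprite *target in _targets) {
                if (CGRectContainsPoint([target boundingBox], touchLocation)) {
                    NSLog(@"Touched target at (%.0f, %.0f)", target.position.x, target.position.y);
                }
            }
            return YES;
        }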

    Read the article

  • How can I differentiate two different touches on a layer ?

    - by srikanth rongali
    I am writing an app in cocos2d. I have a sprite and a text in my scene, and I have written separate classes for the sprite and the text, both added to another class. In the sprite class I have written:

        - (void)ccTouchesEnded:(NSSet *)touches withEvent:(UIEvent *)event

    And in the text class I have written:

        - (void)registerWithTouchDispatcher
        {
            [[CCTouchDispatcher sharedDispatcher] addTargetedDelegate:self priority:0 swallowsTouches:YES];
        }

        - (BOOL)ccTouchBegan:(UITouch *)touch withEvent:(UIEvent *)event
        {
            return YES;
        }

        - (void)ccTouchEnded:(UITouch *)touch withEvent:(UIEvent *)event
        {
            NSLog(@"Recognized tOuches in Instructions");
            CGSize windowSize = [[CCDirector sharedDirector] winSize];
            CCNode *node = [self getChildByTag:kTagNode];
            [node setPosition:ccp(text1.contentSize.width/2, text1.contentSize.height/2 - windowSize.height)];
        }

        - (void)ccTouchMoved:(UITouch *)touch withEvent:(UIEvent *)event
        {
            CGPoint touchLocation = [touch locationInView:[touch view]];
            CGPoint prevLocation = [touch previousLocationInView:[touch view]];
            touchLocation = [[CCDirector sharedDirector] convertToGL:touchLocation];
            prevLocation = [[CCDirector sharedDirector] convertToGL:prevLocation];
            CGPoint diff = ccpSub(touchLocation, prevLocation);
            CCNode *node = [self getChildByTag:kTagNode];
            CGPoint currentPos = [node position];
            [node setPosition:ccpAdd(currentPos, diff)];
        }

    But only touches on the text are recognized, and touches on the sprite are not. How can I differentiate the two touches?
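
    One common pattern with the targeted dispatcher is for each class to claim a touch (return YES) only when it actually lands on its own content, so the dispatcher can offer the touch to the other delegate otherwise. A rough sketch, assuming each class is a CCNode subclass whose contentSize covers its visible area:

        // Sketch: each layer claims only the touches that land on its own content,
        // so both the sprite class and the text class can register with the dispatcher.
        - (void)registerWithTouchDispatcher
        {
            [[CCTouchDispatcher sharedDispatcher] addTargetedDelegate:self
                                                             priority:0
                                                      swallowsTouches:YES];
        }

        - (BOOL)ccTouchBegan:(UITouch *)touch withEvent:(UIEvent *)event
        {
            CGPoint location = [self convertTouchToNodeSpace:touch];
            CGRect bounds = CGRectMake(0, 0, self.contentSize.width, self.contentSize.height);
            // Claim (and swallow) the touch only when it is inside this node's content;
            // otherwise the dispatcher passes it on to the next delegate.
            return CGRectContainsPoint(bounds, location);
        }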

    Read the article

  • How To Detect "Touch Down" in superview of UIScrollView?

    - by wgpubs
    I have a UIView that contains a UIScrollView and I want to be able to capture the "Touch Down" event in the UIView any time the user taps on the UIScrollView. I've tried including all the touchesBegan/Ended/Cancelled handlers in my UIViewController but none of them get fired when tapping inside the UIScrollView contained in the main UIView. What is the best way to accomplish this?
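
    One workaround often suggested (sketched here, not guaranteed on every iOS version, since the scroll view's own touch handling and delaysContentTouches can interfere) is a UIScrollView subclass that passes touches up the responder chain in addition to handling them itself; the class name is illustrative:

        // Sketch: a UIScrollView subclass that forwards touch events to the next responder
        // (normally the containing view / view controller) while keeping its own behaviour.
        @interface ForwardingScrollView : UIScrollView
        @end

        @implementation ForwardingScrollView

        - (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
            [self.nextResponder touchesBegan:touches withEvent:event];
            [super touchesBegan:touches withEvent:event];
        }

        - (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event {
            [self.nextResponder touchesEnded:touches withEvent:event];
            [super touchesEnded:touches withEvent:event];
        }

        @end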

    Read the article

  • Cocoa-Touch framework for speaking to a TCP socket?

    - by Coocoo4Cocoa
    I have a daemon running on a server, listening on a TCP/IP port. I'm looking to see whether there is any existing iPhone/Cocoa Touch framework that gives a nice OO wrapper for speaking to the daemon over an IP socket. I need to be able to interactively query the daemon with commands and retrieve information back. If there isn't an OO wrapper for such a task, what's the next best bet?
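
    There is no high-level Cocoa Touch class dedicated to raw TCP, but CFStream with its NSStream toll-free bridge gives a thin OO layer over a socket. A bare-bones sketch; the host, port, and "STATUS\n" command are placeholders:

        #import <CoreFoundation/CoreFoundation.h>

        // Sketch: open a TCP connection with CFStream and talk to it via NSStream.
        CFReadStreamRef readStream = NULL;
        CFWriteStreamRef writeStream = NULL;
        CFStreamCreatePairWithSocketToHost(kCFAllocatorDefault,
                                           CFSTR("daemon.example.com"), 4040,
                                           &readStream, &writeStream);

        NSInputStream *input   = (NSInputStream *)readStream;    // toll-free bridged
        NSOutputStream *output = (NSOutputStream *)writeStream;
        [input setDelegate:self];   // self should implement -stream:handleEvent: to read replies
        [input scheduleInRunLoop:[NSRunLoop currentRunLoop] forMode:NSDefaultRunLoopMode];
        [input open];
        [output open];

        // Send a command to the daemon (placeholder protocol).
        NSData *command = [@"STATUS\n" dataUsingEncoding:NSUTF8StringEncoding];
        [output write:(const uint8_t *)[command bytes] maxLength:[command length]];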

    Read the article

  • Is there a way to detect non-movement (touch events)?

    - by hyn
    Is there a way to detect a finger's non-movement by using a combination of UITouch events? The event methods touchesEnded and touchesCancelled are only fired when the event is cancelled or the finger lifted. I would like to know when a touch has stopped moving, even while it is still touching the screen.
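
    UIKit has no "stopped moving" callback, so the usual trick is a one-shot timer restarted from touchesMoved:; when the timer finally fires, the finger has been still for the chosen interval. A rough sketch, where stillTimer is an assumed NSTimer ivar and 0.5 s is an arbitrary threshold (touchesBegan: could start the timer the same way):

        // Sketch: detect a stationary finger by restarting a timer on every move.
        - (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
            [stillTimer invalidate];
            stillTimer = [NSTimer scheduledTimerWithTimeInterval:0.5
                                                          target:self
                                                        selector:@selector(touchStoppedMoving:)
                                                        userInfo:nil
                                                         repeats:NO];
        }

        - (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event {
            [stillTimer invalidate];
            stillTimer = nil;
        }

        - (void)touchStoppedMoving:(NSTimer *)timer {
            NSLog(@"Finger is down but has not moved for 0.5 s");
        }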

    Read the article

  • How can we detect a touch of a sprite?

    - by srikanth rongali
    I have two sprites in my app. Both should have touches enabled, and the two touches should be independent of one another. If I touch the screen (not on the sprites), that should count as a different touch again. My problem is that all three (sprite1, sprite2, and the remaining screen) should have independent touches, but my program treats all the touches as the same. How can I make them behave the way I need? Thank you.

    Read the article

  • How to detect touch over a UIView when touched over a UIButton?

    - by jglievano
    I'm using -(void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event and -(void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event to handle some dragging on a UIView. This UIView, however, has some UIButtons as subviews, and when the user touches over a UIButton (which is also over the UIView) the touches methods aren't called. I need the touch methods in the UIView to be called at all times and still have the UIButtons working. How can I achieve this?
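
    One workaround that keeps the buttons functional is a UIButton subclass that hands its touches to the superview as well as to super; a sketch with an illustrative class name:

        // Sketch: a UIButton subclass that also forwards its touches to the superview,
        // so the containing UIView's dragging code still sees them.
        @interface PassThroughButton : UIButton
        @end

        @implementation PassThroughButton

        - (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
            [self.superview touchesBegan:touches withEvent:event];
            [super touchesBegan:touches withEvent:event];   // keep normal button behaviour
        }

        - (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
            [self.superview touchesMoved:touches withEvent:event];
            [super touchesMoved:touches withEvent:event];
        }

        - (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event {
            [self.superview touchesEnded:touches withEvent:event];
            [super touchesEnded:touches withEvent:event];
        }

        @end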

    Read the article

  • Best Practices for persisting iPod Playlist (MPMediaItemCollection) across sessions

    - by coneybeare
    When using in-app audio in the iPhone SDK, it is possible to let users select items from their iPod library and create an in-app local playlist. If I want to persist this choice, it is easy to serialize the data, write it to a file, and recover it later. Doing it in that vanilla way, however, leads me to think something is going to go wrong. For example, what if the user syncs and removes songs? I could loop over them all and query the iPod database at setup time, but with lists that could be 50,000 items long, this could take some time. How are other people doing this, and what are some gotchas that I haven't thought about?
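
    One approach is to persist only each item's MPMediaItemPropertyPersistentID and rebuild the collection lazily with an MPMediaQuery, so tracks the user has since removed simply drop out without a full-library scan. A sketch, where collection is the MPMediaItemCollection the user picked and the defaults key is a placeholder:

        #import <MediaPlayer/MediaPlayer.h>

        // Saving: store only the persistent IDs (NSNumber values, property-list safe).
        NSMutableArray *ids = [NSMutableArray array];
        for (MPMediaItem *item in [collection items]) {
            [ids addObject:[item valueForProperty:MPMediaItemPropertyPersistentID]];
        }
        [[NSUserDefaults standardUserDefaults] setObject:ids forKey:@"SavedPlaylistIDs"];

        // Restoring: re-query each ID; items removed from the library come back empty.
        NSMutableArray *restored = [NSMutableArray array];
        for (NSNumber *persistentID in
                [[NSUserDefaults standardUserDefaults] arrayForKey:@"SavedPlaylistIDs"]) {
            MPMediaQuery *query = [[[MPMediaQuery alloc] init] autorelease];
            [query addFilterPredicate:
                [MPMediaPropertyPredicate predicateWithValue:persistentID
                                                 forProperty:MPMediaItemPropertyPersistentID]];
            [restored addObjectsFromArray:[query items]];   // empty if the track is gone
        }
        MPMediaItemCollection *playlist = nil;
        if ([restored count] > 0) {
            playlist = [MPMediaItemCollection collectionWithItems:restored];
        }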

    Read the article
