Search Results

Search found 119 results on 5 pages for 'multitouch'.

Page 2/5 | < Previous Page | 1 2 3 4 5  | Next Page >

  • Windows 8.1 touchpad three-finger-multitouch gesture gone

    - by THEVAN3D
    A couple of days ago I installed Windows 8.1 Pro WMC, and all of a sudden a touchpad gesture that worked like a charm back in the Win 8.1 preview, Win 8 and even Win 7 just stopped working. I am talking about the "Three Finger Flick" feature, which is used to go back and forward in folders and/or internet browsers. At first I didn't even have the left edge pull feature, but I Googled it, added some values under this registry key, and now it works: [HKEY_CURRENT_USER\Software\Synaptics\SynTPEnh\ZoneConfig\TouchPadPS2\Left Edge Pull]. However, I don't know which key or values to add to get the three finger flick back as well. I have installed various driver versions, from roughly version 15 up to 17.0.8.0 (the one I currently have), and even though all the other gestures, taps and swipes work, there is no multitouch gestures section in the device settings. That's why I assume I have to add something in regedit, but again, I don't know what to write or where. Please help me if you can.

    Read the article

  • Odd Android touch event problem

    - by user22241
    Overview: When testing my game I came across a bizarre problem with my touch controls. Note this isn't related to multi-touch, as I completely removed my ACTION_POINTER_UP and ACTION_POINTER_DOWN handling along with my ACTION_MOVE code. So I'm simply working with ACTION_UP and ACTION_DOWN now and still get the problem.

    The problem: I have a left and right button on the left of the screen and a jump button on the right. Everything works as it should, but if I touch a large area of my hand (the fleshy part at the base of the thumb, for instance) onto the screen, then release it and then press one of my arrows, the sprite moves in that direction for a few seconds, and then ACTION_UP is mysteriously triggered. The sprite stops, and if I release my finger and re-apply it to an arrow, the same thing happens. This goes on and on and eventually (randomly?) stops, and everything works OK again.

    Test device & OS: Google Nexus 10 tablet running Jelly Bean 4.2.2

    Code:

        //Action upon which to switch
        actionMask = event.getActionMasked();
        //Pointer Index of the currently touching pointer
        pointerIndex = event.getActionIndex();
        //Number of pointers (for multi-touch)
        pointerCount = event.getPointerCount();
        //ID of the pointer currently being processed (Multitouch)
        pointerID = event.getPointerId(pointerIndex);

        switch (actionMask){
            //Primary pointer down
            case MotionEvent.ACTION_DOWN: {
                //if pressing left button then set moving left
                if (isLeftPressed(event.getX(), event.getY())){
                    renderer.setSpriteLeft();
                }
                //if pressing right button then set moving right
                else if (isRightPressed(event.getX(), event.getY())){
                    renderer.setSpriteRight();
                }
                //if pressing jump button then set sprite jumping
                else if (isJumpPressed(event.getX(),event.getY())){
                    renderer.setSpriteState('j', true);
                }
                break;
            }//End of case

            //Primary pointer up
            case MotionEvent.ACTION_UP:{
                //When finger leaves the screen, stop sprite's horizontal movement
                renderer.setSpriteStopped();
                break;
            }
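    One possible explanation (not confirmed by the post) is that the large palm contact gets cancelled by the system, or that the handler reacts to a pointer other than the one actually holding the button. Below is a minimal sketch of per-pointer tracking that also treats ACTION_CANCEL like a release; the renderer calls and the isLeftPressed/isRightPressed/isJumpPressed helpers are taken from the question, everything else is assumed, and the method is meant to drop into the question's existing handler:

        // Track which pointer ID is holding a movement button, so a stray pointer
        // lifting (or a palm-rejection ACTION_CANCEL) is not mistaken for the button finger.
        private int movePointerId = MotionEvent.INVALID_POINTER_ID;

        @Override
        public boolean onTouchEvent(MotionEvent event) {
            int action = event.getActionMasked();
            int index = event.getActionIndex();
            int id = event.getPointerId(index);
            float x = event.getX(index);
            float y = event.getY(index);

            switch (action) {
                case MotionEvent.ACTION_DOWN:
                case MotionEvent.ACTION_POINTER_DOWN:
                    if (isLeftPressed(x, y)) {
                        movePointerId = id;
                        renderer.setSpriteLeft();
                    } else if (isRightPressed(x, y)) {
                        movePointerId = id;
                        renderer.setSpriteRight();
                    } else if (isJumpPressed(x, y)) {
                        renderer.setSpriteState('j', true);
                    }
                    break;

                case MotionEvent.ACTION_UP:
                case MotionEvent.ACTION_POINTER_UP:
                case MotionEvent.ACTION_CANCEL:
                    // Stop only when the pointer that pressed the button goes away,
                    // or when the whole gesture is cancelled by the system.
                    if (id == movePointerId || action == MotionEvent.ACTION_CANCEL) {
                        movePointerId = MotionEvent.INVALID_POINTER_ID;
                        renderer.setSpriteStopped();
                    }
                    break;
            }
            return true;
        }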

    Read the article

  • iPad Simulator Multitouch Cursors Don't Show Up When Window is Scaled 100%

    - by Joel
    I have the iPhone SDK 3.2 installed and have been working on an iPad application. However, the iPad simulator doesn't show the two gray multitouch "cursors" when I hold down the ALT/OPTION key and move the mouse around. This only happens when the simulator scale is set to 100%. If I have it set to 50% they show up. When I have it set to be an iPhone, they show up. It's only the iPad at 100% size. The multitouch still works fine, I just can't see where I'm "touching". I've tried closing the simulator completely, changing it to the iPhone and back again, resizing, all sorts of stuff. Has anyone else seen this problem? Anyone have any suggestions for fixing this? I've Googled and searched SO for anyone else having this problem, but I kind of wonder if it's just me. If it makes a difference, I have a Mac Mini 1.83 GHz Intel Core 2 Duo with Snow Leopard 10.6.3 installed. Thanks.

    Read the article

  • cocos2d-x and handling touch events

    - by Jason
    I have my sprites on screen and I have a vector that stores each sprite. Can a CCSprite* handle a touch event, or just the CCLayer*? What is the best way to decide which sprite was touched? Should I store the coordinates of each sprite (in the sprite class) and, when I get the event, check whether the user touched where a sprite is by looking through the vector and getting each sprite's current coordinates?

    UPDATE: I subclass CCSprite:

        class Field : public cocos2d::CCSprite, public cocos2d::CCTargetedTouchDelegate

    and I implement these functions:

        cocos2d::CCRect rect();
        virtual void onEnter();
        virtual void onExit();
        bool containsTouchLocation(cocos2d::CCTouch* touch);
        virtual bool ccTouchBegan(cocos2d::CCTouch* touch, cocos2d::CCEvent* event);
        virtual void ccTouchMoved(cocos2d::CCTouch* touch, cocos2d::CCEvent* event);
        virtual void ccTouchEnded(cocos2d::CCTouch* touch, cocos2d::CCEvent* event);
        virtual void touchDelegateRetain();
        virtual void touchDelegateRelease();

    I put CCLOG statements in each one and I don't hit them! When I touch the CCLayer this sprite is on, though, I do hit the handlers in the class that implements the layer and puts these sprites on it.

    Read the article

  • Multiple buttons on screen

    - by Rajas
    I am not a game developer; I am a Java developer. However, I need some information and I am hoping you can help me. I want to build an app that has 3 buttons on the touch screen: 1, 2 and 3. Depending on the screen input, the user has to press or release button 3, but (s)he will have to keep either 1 or 2 depressed all the time. For example: keep 1 pressed while operating button 3, or keep 2 pressed while operating button 3. Please let me know if this is even doable. Best Regards, \|/
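    Whether this is doable depends on the platform, but any multi-touch capable device reports each finger as a separate pointer, so one button can stay held while another is pressed and released. Here is a minimal sketch, assuming Android (nothing in the question says so) and three hypothetical hit-test helpers whose coordinates are made up:

        import android.view.MotionEvent;

        public class ButtonTracker {
            private boolean button1Held, button2Held, button3Held;

            // Hypothetical hit tests: true if (x, y) falls inside that on-screen button.
            private boolean isButton1(float x, float y) { return x < 200 && y > 800; }
            private boolean isButton2(float x, float y) { return x >= 200 && x < 400 && y > 800; }
            private boolean isButton3(float x, float y) { return x > 900 && y > 800; }

            // Call this from a View's onTouchEvent().
            public boolean onTouchEvent(MotionEvent event) {
                // Recompute the held state from every pointer still on the screen.
                button1Held = button2Held = button3Held = false;
                int upIndex = -1;
                int action = event.getActionMasked();
                if (action == MotionEvent.ACTION_UP || action == MotionEvent.ACTION_POINTER_UP) {
                    upIndex = event.getActionIndex(); // this pointer is lifting, so ignore it
                }
                for (int i = 0; i < event.getPointerCount(); i++) {
                    if (i == upIndex) continue;
                    float x = event.getX(i), y = event.getY(i);
                    if (isButton1(x, y)) button1Held = true;
                    else if (isButton2(x, y)) button2Held = true;
                    else if (isButton3(x, y)) button3Held = true;
                }
                // App logic can now react to button3Held while button1Held or button2Held is true.
                return true;
            }
        }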

    Read the article

  • Would a multitouch-capable PC allow me to do Android development, simulating the touch UI, without an Android device?

    - by Scott Davies
    Hi, I recently purchased a Samsung Galaxy Tab as a reference implementation of Android 2.x (phone and first-gen Android tablet) for app development. I have noticed a slew of Android 3.0 slates being talked about at CES 2011 (Motorola XOOM, etc.). If I had a multitouch PC with the Android SDK/emulator on it, would this allow me to more closely approximate device simulation by allowing user input via the multitouch screen? Would it work via touch just like Windows 7 recognizes touch as mouse-style input? Has anyone done this? Thanks, Scott

    Read the article

  • Testcase with multitouch on Android?

    - by makke
    The TouchUtils class in the Android documentation has functions like drag() [http://developer.android.com/intl/de/reference/android/test/TouchUtils.html#drag(android.test.InstrumentationTestCase,%20float,%20float,%20float,%20float,%20int)], but they do not support multi-touch gestures, like a two-finger swipe. Looking at the MotionEvent.obtain() methods, there does not seem to be any way of invoking a "virtual" multi-touch event from a test case. Has anyone got it working?
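    For what it's worth, newer API levels (14+) do have a MotionEvent.obtain() overload that takes per-pointer properties and coordinates, and an instrumentation test can push the resulting events with Instrumentation.sendPointerSync(). A rough sketch of injecting the start of a two-finger gesture; the coordinates, class name and timing are placeholders:

        import android.app.Instrumentation;
        import android.os.SystemClock;
        import android.view.InputDevice;
        import android.view.MotionEvent;

        public class MultiTouchInjector {
            // Injects "first finger down" followed by "second finger down".
            public static void injectTwoFingerDown(Instrumentation inst,
                                                   float x1, float y1, float x2, float y2) {
                long downTime = SystemClock.uptimeMillis();

                MotionEvent.PointerProperties[] props = new MotionEvent.PointerProperties[2];
                MotionEvent.PointerCoords[] coords = new MotionEvent.PointerCoords[2];
                for (int i = 0; i < 2; i++) {
                    props[i] = new MotionEvent.PointerProperties();
                    props[i].id = i;
                    props[i].toolType = MotionEvent.TOOL_TYPE_FINGER;
                    coords[i] = new MotionEvent.PointerCoords();
                    coords[i].pressure = 1f;
                    coords[i].size = 1f;
                }
                coords[0].x = x1; coords[0].y = y1;
                coords[1].x = x2; coords[1].y = y2;

                MotionEvent first = MotionEvent.obtain(downTime, downTime,
                        MotionEvent.ACTION_DOWN, 1, props, coords,
                        0, 0, 1f, 1f, 0, 0, InputDevice.SOURCE_TOUCHSCREEN, 0);
                int pointerDown = MotionEvent.ACTION_POINTER_DOWN
                        | (1 << MotionEvent.ACTION_POINTER_INDEX_SHIFT);
                MotionEvent second = MotionEvent.obtain(downTime, SystemClock.uptimeMillis(),
                        pointerDown, 2, props, coords,
                        0, 0, 1f, 1f, 0, 0, InputDevice.SOURCE_TOUCHSCREEN, 0);

                // Must be called from the instrumentation (non-UI) thread, e.g. inside a test method.
                inst.sendPointerSync(first);
                inst.sendPointerSync(second);
                first.recycle();
                second.recycle();
            }
        }

    Matching ACTION_MOVE and ACTION_POINTER_UP / ACTION_UP events, built the same way with updated coordinates, would complete the two-finger swipe.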

    Read the article

  • Does the Wacom Bamboo Pen & Touch work out of the box?

    - by Emilien
    Is there any tweaking involved in Ubuntu 10.10 to make the Wacom Bamboo Pen & Touch work? And is this hardware getting some love from the new multitouch framework? If there's no multitouch support for it, then I'd fall back on the simpler (and cheaper) Wacom Bamboo Pen (to draw, no multitouch)... ENAC's general list of Linux multitouch devices states the following regarding Wacom: "The 'wacom' kernel driver handles these, and is undergoing work to make it compliant with the kernel multitouch protocol." But is this also compatible with Ubuntu's multitouch protocol (which I understand is a different effort than the kernel's)?

    Read the article

  • What cross-platform, free/open source frameworks are available to create touch-based web apps/sites using HTML/CSS/JS?

    - by Jitendra Vyas
    What cross-platform, cross-browser, license-free/open source frameworks are available to create touch/multitouch-based web apps/sites using HTML/CSS/JS for mobile devices, especially for the latest versions of Android, BlackBerry, Windows 7, iPhone and iPad? For desktop websites I'm a jQuery lover. I know Sencha, but I think it's not free. I know jQTouch, but it's only for iPhone, and I also know jQuery Mobile, but I'm not sure whether it is as powerful as Sencha. It's not necessary for me to go with jQuery Mobile; if there is a better framework available, I want it to be compatible with Android, BlackBerry and Windows 7 as well, not only iPhone and iPad.

    Read the article

  • How can I get multitouch enabled on my Sentelic touchpad (MSI x350 notebook)?

    - by Jon
    I understand my MSI x350 notebook comes with a Sentelic trackpad, which supports multi-touch (according to the MSI website). Is there a way to enable multitouch on Ubuntu? I've been having difficulty finding info about this on Google, and since it's not a Synaptics touchpad I haven't been able to find much in the Ubuntu docs. My mouse preferences don't have a trackpad tab like they do on, say, a MacBook. Running "xinput list" returns:

        FSPPS/2 Sentelic FingerSensingPad    id=11

    And in my Xorg.0.log:

        [ 17.481] (II) config/udev: Adding input device FSPPS/2 Sentelic FingerSensingPad (/dev/input/event6)
        [ 17.481] (**) FSPPS/2 Sentelic FingerSensingPad: Applying InputClass "evdev pointer catchall"
        [ 17.481] (**) FSPPS/2 Sentelic FingerSensingPad: always reports core events
        [ 17.481] (**) FSPPS/2 Sentelic FingerSensingPad: Device: "/dev/input/event6"
        [ 17.500] (II) FSPPS/2 Sentelic FingerSensingPad: Found 11 mouse buttons
        [ 17.500] (II) FSPPS/2 Sentelic FingerSensingPad: Found scroll wheel(s)
        [ 17.500] (II) FSPPS/2 Sentelic FingerSensingPad: Found relative axes
        [ 17.500] (II) FSPPS/2 Sentelic FingerSensingPad: Found x and y relative axes
        [ 17.500] (II) FSPPS/2 Sentelic FingerSensingPad: Configuring as mouse
        [ 17.500] (**) FSPPS/2 Sentelic FingerSensingPad: YAxisMapping: buttons 4 and 5
        [ 17.500] (**) FSPPS/2 Sentelic FingerSensingPad: EmulateWheelButton: 4, EmulateWheelInertia: 10, EmulateWheelTimeout: 200
        [ 17.500] (II) XINPUT: Adding extended input device "FSPPS/2 Sentelic FingerSensingPad" (type: MOUSE)
        [ 17.500] (II) FSPPS/2 Sentelic FingerSensingPad: initialized for relative axes.
        [ 17.500] (II) config/udev: Adding input device FSPPS/2 Sentelic FingerSensingPad (/dev/input/mouse0)

    Read the article

  • Advice for Architecture Design Logic for software application

    - by Prasad
    Hi, I have a framework of basic to complex objects/classes (C++), running to around 500 of them. With some rules and regulations, all these objects can communicate with each other and hence can cover most of the common queries in the domain.

    My dream: I want to provide these objects as icons/glyphs (a term I learnt recently) on a workspace. All of these objects can be dragged and dropped into the workspace. They communicate only through their methods (interfaces), plus a few iterative and conditional statements, and are finally arranged to execute a protocol/workflow/dataflow/process. After drawing the flow, the user clicks the Execute/Run button. All user interaction should be multi-touch enabled. The best way to show my dream is Jeff Han's multitouch video: imagine Jeff playing with my objects instead of Google Maps. :-) It should feel like playing a jigsaw puzzle.

    Objective: how can I achieve the following while working on this final product:
    a) the development should be flexible enough to provide for web services
    b) the development should enable easy web application development
    c) the development should enable a client-server architecture
    d) it should also enable a mouse-based drag/drop desktop application, like Adobe programs, etc.
    In other words, I want to economize on investment.

    My design efforts so far:
    a) created an editor (VB) where the user writes the object/class code manually
    b) on Run/Execute, the code is copied into a main() function and passed to an interpreter
    c) the output is caught and shown in the console
    The interpreter could be separated out to become a server, and the editor could become the client, but that needs a lot of standard client-server architecture work, and somehow I am not comfortable with the tightness of this system. Is there a faster, better embeddable solution that avoids the interpreter, other than writing a special compiler for these objects? A friend recently suggested that Axis C++ could help me. Is that the way to go?

    Here are my questions (please consider me a self-taught programmer, and this is not my domain):
    a) From the stage of C++ objects to a multi-touch product, how can I make sure I develop the parallel product/service models as well? What architecture aspects should I consider?
    b) What technologies are best suited for this?
    c) If I am thinking of moving to cloud computing, how difficult/redundant/unnecessary will my efforts be?
    d) How many months would it take to get the first beta?
    If any of the experts here are interested in this project, please email me: [email protected] Thank you for any help. Looking forward.

    Read the article

  • Do Android/webOS devices support multi-touch Javascript events?

    - by Rufo Sanchez
    On iPhone, iPod touch and (presumably) iPad, Apple has multi-touch event handling available via JavaScript in Mobile Safari. I know the Nexus One recently added multi-touch support via an update, and I believe webOS is also multi-touch enabled. Do Android 2.1 and/or webOS have access to multi-touch in the browser, or is this currently exclusive to Apple devices?

    Read the article

  • Differentiating Between UITouch Objects On The iPhone

    - by Jasarien
    Hey guys, I'm trying to differentiate between two (or more) UITouch objects on the iPhone. Specifically, I'd like to know the order in which the touches occurred. For instance, in my -touchesBegan:withEvent: method I get an NSSet of UITouch objects. Now I can find out how many touches there are, but which object represents which finger? I notice the timestamp property on UITouch - is this what I'm looking for? I see how that would be useful for obtaining the last or first touch - provided the touches don't mutate... Therein lies my problem. I can use the timestamp to single out the latest touch, but then the touch that occurred first moves, and IT becomes the latest touch... At the end of this exercise, I'd like to be able to implement the "pinch" gesture to zoom in or out, etc. Any help would be greatly appreciated, thanks.

    Read the article

  • Using the iPhone's Multi-Touch Keyboard Search Button

    - by senfo
    I have a regular text field on a view and I'd like to make use of the search button on the iPhone's keyboard. For the life of me, I can't figure out how to do this. There doesn't seem to be any event exposed that I can wire up that specifically relates to the search button on the keyboard. I've Googled around, but I also haven't found anything related to this subject.

    Read the article

  • jQuery - draggable images on iPad / iPhone - how to integrate event.preventDefault();?

    - by Tim
    Hello! I use jQuery, jQuery UI and jQuery Mobile to build a web application for iPhone / iPad. Now I create images and they should be draggable, so I did this:

        <!DOCTYPE html>
        <html>
        <head>
            <meta http-equiv="Content-type" content="text/html; charset=utf-8">
            <title>Drag - Test</title>
            <link rel="stylesheet" href="http://code.jquery.com/mobile/1.0a2/jquery.mobile-1.0a2.min.css" />
            <script src="http://code.jquery.com/jquery-1.4.4.min.js"></script>
            <script src="http://code.jquery.com/mobile/1.0a2/jquery.mobile-1.0a2.min.js"></script>
            <script src="https://ajax.googleapis.com/ajax/libs/jqueryui/1.8.7/jquery-ui.min.js"></script>
        </head>
        <body>
            <div>
                <div style="width:500px;height:500px;border:1px solid red;">
                    <img src="http://upload.wikimedia.org/wikipedia/en/thumb/9/9e/JQuery_logo.svg/200px-JQuery_logo.svg.png" class="draggable" alt="jQuery logo" />
                    <img src="http://upload.wikimedia.org/wikipedia/en/a/ab/Apple-logo.png" class="draggable" alt="Apple Inc. logo" />
                </div>
            </div>
        </body>
        <script type="text/javascript">
            $(document).ready(function() {
                $(".draggable").draggable();
            });
        </script>
        </html>

    Here you can see the live example: http://jsbin.com/igena4/ The problem is that the whole page wants to scroll. I searched Apple's HTML5 examples and found this to prevent the page from panning, so that the image is draggable:

        ...
        onDragStart: function(event) {
            // stop page from panning on iPhone/iPad - we're moving a note, not the page
            event.preventDefault();
            ...
        }

    But my problem is: how can I include this in my jQuery? Where do I get event from? Best Regards.

    Read the article

  • iOS 4.2: touchesBegan does not draw more than one circle per sensed touch

    - by Christian
    Hi all, a quick question (which might be a no-brainer for most here) :) My code below should draw a circle for every touch that is recognised, but although more than one touch is sensed, only one circle is drawn at a time. Can anyone see any obvious issues? This method sits in the XYZViewControler.m class. TouchPoint.m is the class that defines the circle. Thanks a bundle for your help and redirects. Chris

        - (void) touchesBegan: (NSSet *) touches withEvent: (UIEvent *)event {
            NSSet * allTouches = [event allTouches]; // get all events
            for (UITouch * touch in touches) {
                TouchPoint * touchPoint = [[TouchPoint alloc] initWithFrame:CGRectMake(0, 0, circleWidth, circleWidth)];
                touchPoint.center = [touch locationInView:[self view]];
                touchPoint.color = [UIColor redColor];
                touchPoint.backgroundColor = [UIColor whiteColor];
                [[self view] addSubview: touchPoint];
                [touchPoint release];
                CFDictionarySetValue(touchMap, touch, touchPoint);
            }
            [[self view] setNeedsDisplay];
        }

    Read the article

  • Android multi-touch support

    - by Zdenek F
    Hi, I wonder if the Android multi-touch support is reliable? I've read that it suffers from some problems. I also wonder how I can define custom multi-touch gestures, like: 3 fingers rotate, or 3 fingers stay static and a fourth is moving. I've come across some resources (Gestures or MotionEvent on developer.android.com) but nothing states it clearly. Regards, Zdenek
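    As far as I know, the framework does not ship a recognizer for arbitrary multi-finger gestures, so custom gestures are usually detected by hand from the pointer data in onTouchEvent(). Below is a minimal sketch, assuming a custom View, that checks for "three fingers static while a fourth moves"; the slop threshold and callback name are made up for illustration:

        import android.content.Context;
        import android.view.MotionEvent;
        import android.view.View;

        public class FourFingerGestureView extends View {
            private static final float STATIC_SLOP = 30f; // max movement in px to still count as "static"
            private final float[] startX = new float[10];
            private final float[] startY = new float[10];

            public FourFingerGestureView(Context context) {
                super(context);
            }

            @Override
            public boolean onTouchEvent(MotionEvent event) {
                int count = event.getPointerCount();
                int action = event.getActionMasked();

                if (action == MotionEvent.ACTION_DOWN || action == MotionEvent.ACTION_POINTER_DOWN) {
                    // Remember where each pointer started, keyed by pointer ID.
                    for (int i = 0; i < count; i++) {
                        int id = event.getPointerId(i);
                        if (id < startX.length) {
                            startX[id] = event.getX(i);
                            startY[id] = event.getY(i);
                        }
                    }
                } else if (action == MotionEvent.ACTION_MOVE && count == 4) {
                    int movingFingers = 0;
                    for (int i = 0; i < count; i++) {
                        int id = event.getPointerId(i);
                        if (id >= startX.length) continue;
                        double moved = Math.hypot(event.getX(i) - startX[id],
                                                  event.getY(i) - startY[id]);
                        if (moved > STATIC_SLOP) movingFingers++;
                    }
                    if (movingFingers == 1) {
                        onThreeStaticOneMoving(); // hypothetical callback
                    }
                }
                return true;
            }

            private void onThreeStaticOneMoving() {
                // React to the custom gesture here.
            }
        }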

    Read the article
