Search Results

Search found 6791 results on 272 pages for 'touch typing'.

Page 4/272 | < Previous Page | 1 2 3 4 5 6 7 8 9 10 11 12  | Next Page >

  • What is the best free software to learn touch-typing?

    - by gojira
    What is the best free software to learn touch-typing? Features it needs to have:
    - It should NOT display the keyboard layout on the screen!
    - It should give detailed statistics which actually measure progress (for which key do I have the highest error rate, graphs showing how typing speed improved over time, etc.).
    - It should enable me to actually learn touch-typing in about one long weekend where I don't do much else than learn to touch-type.
    - It would be very good if it were possible to load a text file and have the program use words from that file for the exercises as well.
    My goals are at least the same typing speed as I have now, but with touch-typing, and to be able to look only at the screen when typing. P.S./EDIT: I forgot to mention, I'm using Win 7. And I know what the Dvorak and Colemak keyboard layouts are, but I'm not interested in them; my question was with respect to the standard US keyboard layout.

    Read the article

  • Qt for Symbian - Detecting touch/non-touch devices...

    - by Nikos
    I'm porting a game to Symbian which supports both a touch and a non-touch UI. I need to be able to tell whether the device has a touch screen on start-up so I can enable the appropriate mode. After googling for hours and going through the Qt docs I found QSysInfo, but this merely provides the version of the Symbian device. Is there a way to get the actual capabilities of the device? There must be a way to tell if the device has a touch screen...! I'm using the latest QtCreator with the Nokia SDK. Thank you in advance, Nikos.
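
    One possible approach, sketched below on the assumption that the game is built with the Symbian tool chain and can link against Avkon (avkon.lib): the native AknLayoutUtils::PenEnabled() call reports whether pen/touch input is available, and it can be wrapped behind Q_OS_SYMBIAN so the same code still compiles on other platforms.

        #ifdef Q_OS_SYMBIAN
        #include <aknutils.h>   // AknLayoutUtils::PenEnabled(); assumes avkon.lib is linked
        #endif

        // Returns true when the device reports a pen/touch-capable display.
        // Sketch only: assumes the Avkon call is present in the target SDK;
        // non-Symbian builds simply report "no touch screen".
        bool deviceHasTouchScreen()
        {
        #ifdef Q_OS_SYMBIAN
            return AknLayoutUtils::PenEnabled();
        #else
            return false;
        #endif
        }

    Calling this once at start-up is then enough to pick the touch or key-driven mode.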

    Read the article

  • Optional structural typing possibility in C++ or any other language?

    - by ambhai
    In C++, how do I tell the compiler that Ogre::Vector3 IS_SAME_AS SomeOtherLIB::Vector3? Languages like C++ are not structurally typed, but there are cases where structural typing makes sense. Normally, as a game developer working with 4+ libraries that each provide some sort of their own Vector3 implementation, the code ends up littered with ToOgre, ToThis, ToThat conversion functions. That's a lot of Float3 copying around which should not happen in the first place. Is there a way, in C++ or any other language, where we don't have to convert (copy) from one type to another which is essentially the same thing? A C++ solution is preferred though, as most of the good gamedev libs are for C/C++.
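
    C++ will not let you declare two unrelated class types identical, but templates give a compile-time flavour of structural typing: generic code that accepts anything with public x, y, z members, plus one generic cast for the cases where a copy really is unavoidable. A minimal sketch (Vec3 is only a stand-in for a third-party vector type, and vector_cast assumes the destination has an (x, y, z) constructor, as Ogre::Vector3 does):

        #include <iostream>

        // Stand-in for a third-party vector type with public x, y, z members.
        struct Vec3 { float x, y, z; };

        // Works for any "Vector3-shaped" type: Ogre, Bullet, a plain struct, ...
        template <typename V3>
        float lengthSquared(const V3& v)
        {
            return v.x * v.x + v.y * v.y + v.z * v.z;
        }

        // Where a conversion really is needed, keep it in one generic place
        // instead of scattering ToOgre()/ToThat() helpers around the code base.
        template <typename Dst, typename Src>
        Dst vector_cast(const Src& v)
        {
            return Dst(v.x, v.y, v.z);   // assumes Dst has an (x, y, z) constructor
        }

        int main()
        {
            Vec3 a{1.0f, 2.0f, 3.0f};
            std::cout << lengthSquared(a) << "\n";   // prints 14
            return 0;
        }

    Reinterpreting one vector type's memory as another is also sometimes done when the layouts are known to match, but that relies on layout guarantees the libraries may not make.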

    Read the article

  • Detecting browser capabilities and selective events for mouse and touch

    - by skidding
    I started using touch events a while ago, but I just stumbled upon quite a problem. Until now, I checked whether touch capabilities are supported and applied selective events based on that, like this:

        if (document.ontouchmove === undefined) {
            // apply mouse events
        } else {
            // apply touch events
        }

    However, my scripts stopped working in Chrome 5 (which is currently beta) on my computer. I researched it a bit and, as I expected, in Chrome 5 (as opposed to older Chrome, Firefox, IE, etc.) document.ontouchmove is no longer undefined but null. At first I wanted to submit a bug report, but then I realized: there are devices that have both mouse and touch capabilities, so that might be natural; maybe Chrome now defines it because my OS might support both types of events. So the solution seems easy: apply BOTH event types, right? Well, the problem now shows up on mobile. In order to be backward compatible and support scripts that only use mouse events, mobile browsers might fire them as well (on touch). So with both mouse and touch events set, a certain handler might be called twice every time. What is the way to approach this? Is there a better way to check and apply selective events, or must I ignore the problems that might occur if browsers fire both touch and mouse events at times?

    Read the article

  • How can I re-enable the typing break in 11.10?

    - by Hamish Downer
    I've just upgraded to beta 2 of Oneiric/11.10 and the typing break has gone. I've gone into the system settings and looked in "Keyboard Layout" and "Keyboard" and can't find anything. Has it just been dropped? Is there some hidden way to re-enable it?
    Update: Thought I'd write an update based on some stuff that has happened since this question (and the two answers) were written. Workrave has now been reinstated in oneiric-backports and for 12.04 (how to enable backports). It works fine, though if you want to put it in your systray then you need to allow it in there. The easy/lazy command-line way to allow Workrave into the notification area is to do something like:

        gsettings set com.canonical.Unity.Panel systray-whitelist "['all']"

    But read this question if you want a more detailed explanation of what you're doing here. Meanwhile the GNOME typing break has been split out into an app called DrWright; however, it has not (at the time of writing) been packaged for 11.10 (or later). And as mentioned in the other answer, another option is RSIBreak. It is a KDE app but works fine in Unity.

    Read the article

  • What's an example of duck typing in Java?

    - by Cuga
    I just recently heard of duck typing and I read the Wikipedia article about it, but I'm having a hard time translating the examples into Java, which would really help my understanding. Would anyone be able to give a clear example of duck typing in Java and how I might possibly use it?

    Read the article

  • Touch Typing Software recommendations

    - by Mike
    Since the keyboard is the interface we use to the computer, I've always thought touch typing should be something I should learn, but I've always been, well, lazy is the word. So, can anyone recommend any good touch typing software? It's easy enough to google, but I'd like to hear recommendations.

    Read the article

  • Most Up-To-Date C# Duck-Typing Library

    - by Anton Gogolev
    The title says it all, basically. What is the current state of the art on duck typing for C# below version 4.0? I know about the Duck Typing Project, and I know that BLToolkit has something to that end, but I'd like to know if I'm missing something really wicked apart from DLR languages and C# 4.0. The inevitable:

    Read the article

  • HTC Diamond Touch sync problem

    - by Anders
    I have an HTC Diamond Touch with all my contacts etc. on it. However, I did not use it for six months while being abroad. When I started the phone up again I realized that the touch screen had stopped working. I have tried restarting, soft resetting, shutting it off, etc., but the touch screen just won't follow commands. However, I can operate the phone with the buttons, so it's not frozen. Hence I can get into the phone and view contacts but not use it to call, etc. The problem is, how do I get my 300 contacts out of the thing? When I plug in the phone, it lets me choose between "Sync with Outlook" and "Use as storage device". It automatically selects "Use as storage device", and I cannot choose the sync option with the buttons. I cannot change this option afterwards either. In short, I have a phone with all of my contact data and am completely unable to get it out. Any tips/help/suggestions? If possible, preferably one that does not include sending the phone to a hardware workshop for three weeks in order to get it fixed :)

    Read the article

  • Intercepting touch events on activity and button on Android

    - by hgpc
    I have an Android activity with an ImageButton. I would like to execute some logic when the button is clicked and show a different image for the pressed state, but also receive the touch event on the activity. By default only the button receives the touch event. If I set the clickable attribute of the button to false then only the activity receives the touch event. What's the best way to receive the touch event in both the activity and the button?

    Read the article

  • How to cancel a touch sequence

    - by Alex
    I have a UIImageView that responds to touch events. I want to cancel the touch sequence if the touch goes outside of certain bounds. How can I do that? I know that I can inspect the coordinates of the touch object; what I don't know is how to cancel the sequence. I don't see any event in the API that allows for that.

    Read the article

  • Problems with CGPoint in touches event

    - by Jason
    I'm having some problems with storing variables from my touch events. The warning I get when I run this is that coord and icoord are unused, but I use them in the viewDidLoad implementation. Is there a reason why this does not work? Any suggestions?

        - (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
            UITouch *touch = [[event allTouches] anyObject];
            CGPoint icoord = [touch locationInView:touch.view];
        }

        - (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
            UITouch *touch = [[event allTouches] anyObject];
            CGPoint coord = [touch locationInView:touch.view];
        }

        - (void)viewDidLoad {
            if (coord.x > icoord.x) {
                player.center = CGPointMake(player.center.x + 5, player.center.y);
            }
        }

    Thanks.

    Read the article

  • Has anyone got Ubuntu Touch working on Nexus 5?

    - by user1628
    I have been debating whether to get a Nexus 5 phone since it came out. My only fear is that I won't like Android. I love Ubuntu, and I know that I'd love Ubuntu Touch. So I have a few questions related to Ubuntu Touch:
    - Is it easy or even possible to switch between Ubuntu Touch and Android? Would I have to keep hacking the phone? Can I dual boot them? Would I lose my data every time I switch?
    - The Nexus 5 isn't mentioned here: https://wiki.ubuntu.com/Touch/Devices Does that mean it simply won't work if I follow the porting instructions? Would I have to do a bit of hacking?
    - Has anyone got it working? Will it eventually be supported?

    Read the article

  • How will the launcher/button work on a touch panel?

    - by burli
    I'm not sure if Unity has a design problem. If the launcher is hidden you can bring it to the front by moving the mouse over the home button or hitting the Super key. So far, so good. But what about tablet devices with a touch panel? Intuitively I would "click" in the corner to show the launcher, but a click will open the Dash. How should that work on a touch device? Do I have to "drag" my finger into the corner? Will touch devices have a "menu button"? Will there be a gesture to show the launcher?

    Read the article

  • How to reinstall latest Ubuntu Touch on Nexus 4?

    - by Galen Gruman
    I've followed the instructions on https://wiki.ubuntu.com/Touch/Install, first doing the steps that lead to phablet-flash -b and then the manual ones. In both cases, I get stuck at the Google boot screen; it does not boot into Touch. There were no errors during the manual install, and adb devices shows the device, but I get the following with phablet-flash or phablet-flash -b (on the second and subsequent runs, not the first):

        Device detected as /system/bin/sh: getprop: not found
        Unsupported device, autodetect fails device
        When working on flipped images, detection does not work and would require -d

    It is not clear to me what all of that means. The Nexus 4 had the initial Touch dev preview on it, FYI. I saw no separate instructions for upgrading from that.

    Read the article

  • Building a touch event driven UI from scratch: what algorithms or data types?

    - by user1717079
    I have a touch display. As input I can receive the coordinates and how many touch points are in use; basically I just get an (X, Y) pair for every touch event/activated point, at a customizable rate. I need to start from this and build my own callback system to achieve something like Object.onUp().doSomething(), meaning that I would like to abstract just the detection of some particular movements and not have to deal with raw data. What algorithms can be useful in this case? What data structures? Is there some C++ library that I can dissect to get some useful info? Would you suggest the use of a heuristic algorithm?
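
    One way to structure this, sketched below with illustrative names and thresholds (nothing here is taken from a particular library): a small class consumes the raw per-frame samples, keeps the little state it needs (was the point down last frame, where did it start), and fires std::function callbacks when it recognises "up" or "tap".

        #include <cmath>
        #include <functional>
        #include <utility>

        // Minimal sketch of the abstraction asked about: raw (x, y) samples go
        // in, higher-level callbacks come out. Names and the 10-pixel tap
        // threshold are illustrative only.
        class TouchRegion
        {
        public:
            using Callback = std::function<void(float x, float y)>;

            TouchRegion& onUp(Callback cb)  { onUpCb_ = std::move(cb); return *this; }
            TouchRegion& onTap(Callback cb) { onTapCb_ = std::move(cb); return *this; }

            // Feed one raw sample per frame; down == true while the point is active.
            void feed(bool down, float x, float y)
            {
                if (down && !wasDown_) { startX_ = x; startY_ = y; }      // touch began
                if (!down && wasDown_)                                    // touch ended
                {
                    if (onUpCb_) onUpCb_(x, y);
                    const float dx = x - startX_, dy = y - startY_;
                    if (std::sqrt(dx * dx + dy * dy) < 10.0f && onTapCb_) // barely moved: a tap
                        onTapCb_(x, y);
                }
                wasDown_ = down;
            }

        private:
            Callback onUpCb_, onTapCb_;
            bool  wasDown_ = false;
            float startX_ = 0.0f, startY_ = 0.0f;
        };

        // Usage: region.onUp([](float x, float y) { /* do something */ });
        //        then call region.feed(...) from the raw touch polling loop.

    More elaborate gestures (swipes, pinches) follow the same pattern: keep per-point state, compare it against thresholds, and fire a callback when a pattern completes.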

    Read the article

  • Unable to start a basic application using Sencha Touch 2. Library files are not loaded

    - by Gendaful
    I am a newbie to Sencha Touch 2. I wanted to run a basic application using Sencha Touch but am unable to load the application. Here is what I have done. I have downloaded the NotesApp from miamicoder and I am trying to run the first chapter. I have attached the folder structure in the screenshot; please have a look to understand the folder structure. Here is my index.html:

        <!DOCTYPE html>
        <html>
        <head>
            <title>My Notes</title>
            <link href="sencha-touch.css" rel="stylesheet" type="text/css" />
            <script src="sencha-touch-debug.js" type="text/javascript"></script>
            <script src="app.js" type="text/javascript"></script>
        </head>
        <body>
        </body>
        </html>

    I have downloaded Sencha SDK 2.1, took sencha-touch-debug.js and sencha-touch.css, placed them in the root of the folder and referenced them from index.html as shown above. I used to do the same thing in Sencha Touch 1 and it worked, but I get the errors below if I try to do the same with Sencha Touch 2:

        Failed to load resource file:///path/NotesApp-Book-Code-Ch1/src/event/Dispatcher.js?_dc=1354982532236
        Failed to load resource file:///path/NotesApp-Book-Code-Ch1/src/event/publisher/Dom.js?_dc=1354982532238
        Uncaught Error: [Ext.Loader] Failed loading 'file:///path/ebook-building-a-sencha-touch-2-app%20(1)/NotesApp-Book-Code-Ch1/src/event/Dispatcher.js', please verify that the file exists sencha-touch-debug.js:8324
        Uncaught Error: [Ext.Loader] Failed loading 'file:///path/ebook-building-a-sencha-touch-2-app%20(1)/NotesApp-Book-Code-Ch1/src/event/publisher/Dom.js', please verify that the file exists

    Is it necessary to use the Sencha tools and generate the folder structure? Will simply copying the two lib files (sencha-touch-debug.js and sencha-touch.css) and referring to them from index.html not work with Sencha Touch 2? Please help. Thank you.

    Read the article

  • iPhone: Tracking/Identifying individual touches

    - by FlorianZ
    I have a quick question regarding tracking touches on the iPhone, and I can't seem to come to a conclusion on this, so any suggestions/ideas are greatly appreciated: I want to be able to track and identify touches on the iPhone, i.e. basically every touch has a starting position and a current/moved position. Touches are stored in a std::vector and they shall be removed from the container once they have ended. Their position shall be updated once they move, but I still want to keep track of where they initially started (gesture recognition). I am getting the touches from [event allTouches]; thing is, the NSSet is unsorted and I don't seem to be able to match the touches already stored in the std::vector to the touches in the NSSet (so I know which ones ended and shall be removed, or have been moved, etc.). Here is my code, which works perfectly with only one finger on the touch screen, of course, but with more than one I get unpredictable results...

        - (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
            [self handleTouches:[event allTouches]];
        }

        - (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event {
            [self handleTouches:[event allTouches]];
        }

        - (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
            [self handleTouches:[event allTouches]];
        }

        - (void)touchesCancelled:(NSSet *)touches withEvent:(UIEvent *)event {
            [self handleTouches:[event allTouches]];
        }

        - (void)handleTouches:(NSSet *)allTouches {
            for (int i = 0; i < (int)[allTouches count]; ++i) {
                UITouch *touch = [[allTouches allObjects] objectAtIndex:i];
                NSTimeInterval timestamp = [touch timestamp];
                CGPoint currentLocation = [touch locationInView:self];
                CGPoint previousLocation = [touch previousLocationInView:self];

                if ([touch phase] == UITouchPhaseBegan) {
                    Finger finger;
                    finger.start.x = currentLocation.x;
                    finger.start.y = currentLocation.y;
                    finger.end = finger.start;
                    finger.hasMoved = false;
                    finger.hasEnded = false;
                    touchScreen->AddFinger(finger);
                } else if ([touch phase] == UITouchPhaseEnded || [touch phase] == UITouchPhaseCancelled) {
                    Finger& finger = touchScreen->GetFingerHandle(i);
                    finger.hasEnded = true;
                } else if ([touch phase] == UITouchPhaseMoved) {
                    Finger& finger = touchScreen->GetFingerHandle(i);
                    finger.end.x = currentLocation.x;
                    finger.end.y = currentLocation.y;
                    finger.hasMoved = true;
                }
            }
            touchScreen->RemoveEnded();
        }

    Thanks!
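
    A common way to keep each finger record matched to the touch that created it is to key the records on a stable per-touch identifier rather than on an index into an unordered set; on iOS a UITouch object persists for the whole touch, so its address can serve as that key. A rough, platform-neutral sketch of that bookkeeping (FingerTracker and TrackedFinger are illustrative names, not part of the question's TouchScreen class):

        #include <map>

        // Sketch: per-touch state keyed by an opaque, stable identifier.
        // The const void* key stands in for whatever per-touch handle the
        // platform provides (e.g. the UITouch pointer on iOS).
        struct TrackedFinger
        {
            float startX = 0, startY = 0;   // where the touch began
            float endX = 0, endY = 0;       // most recent position
            bool  hasMoved = false;
        };

        class FingerTracker
        {
        public:
            void began(const void* id, float x, float y)
            {
                TrackedFinger f;
                f.startX = f.endX = x;
                f.startY = f.endY = y;
                fingers_[id] = f;
            }

            void moved(const void* id, float x, float y)
            {
                auto it = fingers_.find(id);
                if (it == fingers_.end()) return;   // unknown touch: ignore
                it->second.endX = x;
                it->second.endY = y;
                it->second.hasMoved = true;
            }

            void ended(const void* id)
            {
                fingers_.erase(id);                 // drop the record outright
            }

        private:
            std::map<const void*, TrackedFinger> fingers_;
        };

    Each touchesBegan/Moved/Ended callback would then forward only the touches it actually received, using the per-touch identifier, instead of re-walking [event allTouches] by index.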

    Read the article

  • Multiple apps in one "icon" on the iPod touch [closed]

    - by Jerry
    I have researched but can't find any discussion about moving several apps into one "icon" on the iPod Touch screen. I have moved 5 apps (all the same category) into what is normally one app icon. The title on the icon reads "games" - all are games. With all the apps jiggling, I drag one game app on top of another game app; they move to be side by side and the title automatically reads "games". Is this "OK" to do? You can have nine apps in each of the 16 spaces available on each screen. Will this hurt the Touch? As long as you have space (GBs), is this OK? Has anyone done or heard of this? Any help is appreciated.

    Read the article

  • iPod Touch G4 disconnects from Belkin N+ Router at random intervals

    - by leeand00
    I have an iPod Touch G4 and a Belkin N+ Router F5D8235-4 v2, and for some reason the iPod Touch disconnects from the router at random intervals. Checking the settings on the iPod, it reads that it is still connected to the router, but before I can access the internet again I have to turn Airplane mode on and then off again to get any program to work with the Internet. I've tried upgrading the firmware in the router, but that doesn't seem to help either. I'm using the wireless mode 802.11b & 802.11g & 802.11n in the 20/40MHz frequency. Is there any way of fixing this issue? It doesn't happen with any of the other devices connected to the router. This post has been cross-posted here.

    Read the article

  • Stream music from Laptop to iPod Touch over Wi-Fi

    - by codeulike
    Rather than running a long cable from my Laptop to my stereo, I've been thinking for a while about getting one of those wireless devices that lets you transmit audio from your PC to a receiver plugged into the stereo. Then I realised I already have hardware that can potentially do this - I have an iPod Touch, and a Wi-Fi network in the house. So, is there some iPod Touch App that will let the iPod act as a Wi-Fi music receiver? And presumably, a corresponding piece of broadcast software for my (Windows Vista) Laptop?

    Read the article
